Real analysis

Broadly speaking, real analysis is the study of the real numbers, their topological properties, sequences and series of real numbers, and properties of real-valued functions. Topics typically studied include the construction of the real numbers, convergence of sequences, subsets of the plane as metric spaces, limits, notions of continuity, differentiation, and integration.

A common description of real analysis courses is that real analysis is the rigorous study of single-variable calculus, with proofs. This view has merit, because most (if not all) of the theorems typically presented in a single-variable calculus course are proven rigorously in real analysis. However, courses in real analysis also spend a considerable amount of time on pathological examples with little concern for applications, and they aim to generalize and prove results rather than to apply them to compute numerical answers to exercises, as one typically does in a calculus course.

Construction of the real numbers

The entirety of real analysis is built upon the real numbers, particularly with the notion of completeness in mind. Intuitively, completeness means that the real numbers $\mathbb R$ have no "holes", unlike the rational numbers $\mathbb Q$ (for instance, the set $\{x \in \mathbb{Q} \mid x^2 < 2\}$ is bounded above but has no least upper bound in the rational numbers). This property of the real numbers is known as the least upper bound property.

Two well-known constructions of the real numbers are via Cauchy sequences and via Dedekind cuts, both of which start from $\mathbb{Q}$ and construct $\mathbb{R}$ as a completion of $\mathbb{Q}$.
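
For instance, in the Dedekind cut construction, each real number is identified with a set of rationals; the real number $\sqrt{2}$ corresponds to the cut $\{x \in \mathbb{Q} \mid x \le 0 \text{ or } x^2 < 2\}$. In the Cauchy sequence construction, $\sqrt{2}$ is instead represented by an equivalence class of Cauchy sequences of rationals, such as the sequence of decimal truncations $(1, 1.4, 1.41, 1.414, \ldots)$, where two Cauchy sequences are identified when their difference converges to $0$.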

Sequences of real numbers

A sequence is a function $f:\mathbb{N}\to\mathbb{R}$. Sequences are conventionally denoted by $(s_n)_{n = k}^{\infty} = (s_k,s_{k + 1},\ldots)$, where $f(n)$ is denoted by $s_n$. In the case where $k = 1$, we may write $(s_n)_{n = k}^{\infty}$ as $(s_n)_{n \in \mathbb{N}}$.

In real analysis, particular attention is paid to the convergence and divergence of sequences. Intuitively, convergence is captured by the idea that the sequence "approaches" some value as $n$ becomes arbitrarily large. Also important is the notion of a Cauchy sequence, which intuitively is a sequence whose terms become arbitrarily close to each other as $n$ becomes arbitrarily large.
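
For example, the sequence $(1/n)_{n\in\mathbb{N}} = (1, \tfrac12, \tfrac13, \ldots)$ converges to $0$, while the sequence $((-1)^n)_{n\in\mathbb{N}} = (-1, 1, -1, \ldots)$ diverges because its terms oscillate between $-1$ and $1$. A fundamental fact about $\mathbb{R}$ is that a sequence of real numbers converges if and only if it is Cauchy; this is one formulation of the completeness of $\mathbb{R}$.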

Limits

A problem with the intuitive notion of a sequence converging to some value is that "approaching" is vague and lacks mathematical precision. Limits solve this problem by precisely defining the notion of convergence of a sequence.

Definition: Let $(s_n)$ be a sequence of real numbers. The sequence $(s_n)$ converges to the limit $s\in\mathbb{R}$ provided that for every $\varepsilon > 0$, there exists $N\in\mathbb{N}$ such that for every $n \ge N$, we have $|s_n - s| < \varepsilon.$ If $(s_n)$ converges to $s$, then we say that $\lim_{n\to\infty}s_n = s$ or $s_n \rightarrow s$ as $n \to \infty$.
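
To illustrate the definition, consider the sequence $s_n = 1/n$. Given $\varepsilon > 0$, by the Archimedean property there exists $N\in\mathbb{N}$ with $N > 1/\varepsilon$. Then for every $n \ge N$, \[|s_n - 0| = \frac{1}{n} \le \frac{1}{N} < \varepsilon,\] so $\lim_{n\to\infty} 1/n = 0$.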

An analogous $\varepsilon$-$\delta$ definition, likely more familiar from calculus, exists for the limit of a function. The two notions of limit are linked by the sequential characterization of function limits: $\lim_{x\to x_0}f(x) = L$ if and only if $f(x_n) \rightarrow L$ for every sequence $(x_n)$ in the domain of $f$ with $x_n \neq x_0$ and $x_n \rightarrow x_0$.

Definition: Let $S\subseteq\mathbb{R}$. The limit of the function $f:S\to\mathbb{R}$ as $x$ approaches $x_0$ is $L$ provided that for every $\varepsilon > 0$, there exists a $\delta > 0$ such that for every $x \in S$ with $0<|x-x_0|<\delta$, we have $|f(x)-L|<\varepsilon$. Notationally, we write $\lim_{x\to x_0}f(x)=L$ or $f(x) \rightarrow L$ as $x \rightarrow x_0$.
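
As a simple illustration, $\lim_{x\to 2}(3x + 1) = 7$: given $\varepsilon > 0$, take $\delta = \varepsilon/3$. Then for every $x$ with $0 < |x - 2| < \delta$, \[|(3x+1) - 7| = 3|x - 2| < 3\delta = \varepsilon.\]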

Limits are a key tool in the definitions of continuity and the derivative, as well as in any result in real analysis that relies upon sequences.

Continuity

A common analogy used in calculus classes for continuity is a function whose graph can be drawn without lifting one's pencil; that is, the graph has no breaks or jumps. While intuitive, this notion of continuity can be misleading: for example, $f(x) = 1/x$ is continuous at every point of its domain, yet its graph is "visually discontinuous" at $x = 0$, a point not in its domain. This calls for a more precise notion of continuity.

Definition: Let $S\subseteq\mathbb{R}$. The function $f:S\to\mathbb R$ is continuous at $x_0\in S$ provided that for every sequence $(x_n)$ in $S$ converging to $x_0$, we have $f(x_n)\rightarrow f(x_0)$ as $n\rightarrow\infty$. In other words, $f$ preserves convergence.
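
For example, the function $f:\mathbb{R}\to\mathbb{R}$ defined by $f(x) = 1$ for $x \ge 0$ and $f(x) = 0$ for $x < 0$ is not continuous at $0$: the sequence $x_n = -1/n$ converges to $0$, but $f(x_n) = 0$ for every $n$, so $f(x_n) \rightarrow 0 \neq 1 = f(0)$.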

This definition of continuity is equivalent to the more familiar $\varepsilon$-$\delta$ definition from calculus, given below.

Definition: Let $S\subseteq\mathbb{R}$. The function $f:S\to\mathbb R$ is continuous at $x_0\in S$ provided that for every $\varepsilon > 0$, there exists a $\delta > 0$ such that for every $x \in S$ with $|x - x_0|<\delta$, we have $|f(x)-f(x_0)|<\varepsilon$.
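
To illustrate this definition, consider $f(x) = x^2$ at an arbitrary point $x_0\in\mathbb{R}$. Given $\varepsilon > 0$, take $\delta = \min\{1, \varepsilon/(2|x_0| + 1)\}$. If $|x - x_0| < \delta$, then $|x + x_0| \le |x - x_0| + 2|x_0| < 2|x_0| + 1$, so \[|x^2 - x_0^2| = |x - x_0|\,|x + x_0| < \delta(2|x_0| + 1) \le \varepsilon.\] Hence $f$ is continuous at every point of $\mathbb{R}$.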

Continuity can also be generalized to topological spaces and described in terms of preimages of open sets. Furthermore, various other notions of continuity such as Lipschitz continuity, uniform continuity, and absolute continuity are studied in real analysis.

Differentiation

The derivative is a central notion in calculus and in various fields such as the sciences and economics, where it can be interpreted as the instantaneous rate of change at a point.

Definition: The function $f:(a,b) \rightarrow \mathbb{R}$ is differentiable at $x_0\in(a,b)$ if the limit \[f'(x_0) = \lim_{x\to x_0}\frac{f(x)-f(x_0)}{x-x_0}\] exists.
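
For example, the derivative of $f(x) = x^2$ at $x_0$ can be computed directly from this definition: for $x \neq x_0$, \[\frac{f(x) - f(x_0)}{x - x_0} = \frac{x^2 - x_0^2}{x - x_0} = x + x_0,\] which tends to $2x_0$ as $x \to x_0$, so $f'(x_0) = 2x_0$.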

In the context of real analysis, differentiability at a point guarantees continuity at that point, but not the other way around. In particular, there exist functions that are continuous on $\mathbb{R}$ but nowhere differentiable (for instance, the Weierstrass function). One can also define higher-order derivatives by inductively differentiating the derivative of a function.
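
A simpler example of a continuous function that fails to be differentiable at a point is $f(x) = |x|$ at $x_0 = 0$: the difference quotient $\frac{|x| - 0}{x - 0}$ equals $1$ for $x > 0$ and $-1$ for $x < 0$, so the limit as $x \to 0$ does not exist.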

Integration

In calculus, the integral is introduced as the approximation of the area under the curve of a continuous function by rectangles as the number of rectangles becomes arbitrarily large. This construction, known as the Riemann integral, is widely used in many fields of knowledge, including mathematics itself. In real analysis, Riemann integrals of non-continuous functions are also considered, and the notion of Lebesgue integration is developed.

In some texts, the Darboux integral is introduced first and is then shown to be equivalent to the Riemann integral.
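
Concretely, for a bounded function $f:[a,b]\rightarrow\mathbb{R}$ and a partition $P$ of $[a,b]$ given by points $a = x_0 < x_1 < \cdots < x_n = b$, the upper and lower Darboux sums are \[U(f,P) = \sum_{i=1}^{n} \Big(\sup_{[x_{i-1},x_i]} f\Big)(x_i - x_{i-1}) \quad \text{and} \quad L(f,P) = \sum_{i=1}^{n} \Big(\inf_{[x_{i-1},x_i]} f\Big)(x_i - x_{i-1}),\] and $f$ is Darboux-integrable when the infimum of the upper sums over all partitions equals the supremum of the lower sums.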

Let $f:[a,b]\rightarrow\mathbb{R}$ be bounded.

Definition: Let $P$ be a tagged partition of $[a,b]$, consisting of points $a = x_0 < x_1 < \cdots < x_n = b$ together with tags $t_i \in [x_{i-1},x_i]$ for $1 \le i \le n$. Then the Riemann sum corresponding to $f$ and $P$ is \[R(f,P) = \sum_{i=1}^{n} f(t_i)(x_{i} - x_{i-1}).\] Furthermore, we define the mesh $|P|$ of the partition $P$ to be the length of the largest subinterval $[x_{i-1},x_i]$.

Definition: The real number $I$ is the Riemann integral of $f$ over $[a,b]$ provided that for every $\varepsilon > 0$, there exists some $\delta>0$ such that $|R(f,P) - I| < \varepsilon$ for every tagged partition $P$ of $[a,b]$ with $|P| < \delta$. If such an $I$ exists, then $f$ is said to be Riemann-integrable and we write $\int_a^b f(x)\,dx = I$.
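
For example, let $f(x) = x$ on $[0,1]$ and let $P_n$ be the tagged partition with points $x_i = i/n$ and tags $t_i = i/n$. Then \[R(f,P_n) = \sum_{i=1}^{n} \frac{i}{n}\cdot\frac{1}{n} = \frac{n(n+1)}{2n^2} = \frac{1}{2} + \frac{1}{2n},\] and $|P_n| = 1/n \rightarrow 0$, so these Riemann sums approach $\tfrac12$; indeed, one can show that $f$ is Riemann-integrable with $\int_0^1 x\,dx = \tfrac12$.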

Riemann integration has various shortcomings. A famous example is the Dirichlet function, which is not Riemann-integrable on any nondegenerate interval of $\mathbb{R}$. Furthermore, the monotone convergence theorem fails in the context of Riemann integration, which motivates the Lebesgue integral.
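
Concretely, the Dirichlet function is given by $f(x) = 1$ if $x\in\mathbb{Q}$ and $f(x) = 0$ otherwise. Since every nondegenerate subinterval $[x_{i-1},x_i]$ contains both rational and irrational numbers, the tags of a partition of $[a,b]$ can be chosen to make the Riemann sum equal to $b - a$ or equal to $0$, no matter how small the mesh is; hence no single number $I$ satisfies the definition above, and $f$ is not Riemann-integrable. The Lebesgue integral of $f$, by contrast, exists and equals $0$, since the rationals form a set of Lebesgue measure zero.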
