Euler-Mascheroni constant


The Euler-Mascheroni constant $\gamma$ is defined by the limit \[\gamma = \lim_{n \rightarrow \infty} \left( \left( \sum_{k=1}^n \frac{1}{k} \right) - \ln(n) \right).\] Its value is approximately \[\gamma = 0.577215 \dots .\]
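
To see the convergence numerically, one can compute the partial sums directly. The following is a minimal Python sketch (not part of the original argument; the function name `harmonic` and the sample values of $n$ are purely illustrative):

```python
import math

def harmonic(n):
    """Return the n-th harmonic number H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

# H_n - ln(n) should approach gamma = 0.577215... as n grows.
for n in (10, 100, 10000, 1000000):
    print(n, harmonic(n) - math.log(n))
```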

Whether $\gamma$ is rational or irrational and (if irrational) algebraic or transcendental is an open question.

Proof of existence

Alternate formulation of the limit

The tangent-line approximation (first-degree Taylor polynomial) of $\ln(x)$ about $x = k$, evaluated at $x = k + 1$, gives \[\ln(k + 1) = \ln(k) + ((k + 1) - k)\ln'(k) + E_k\] for some error term $E_k$. Using $\ln'(x) = \frac{1}{x}$ and simplifying, \[\ln(k + 1) = \ln(k) + \frac{1}{k} + E_k.\] Applying the tangent-line formula recursively for all $k$ descending from $n - 1$ to $1$,

\begin{align*} \ln(n) &= \ln(n-1) + \frac{1}{n-1} + E_{n-1} \\ &= \left( \ln (n-2) + \frac{1}{n-2} + E_{n-2} \right) + \frac{1}{n - 1} + E_{n-1} \\ &= \dots \\ &= \ln(1) + \left( \sum_{k=1}^{n-1} \frac{1}{k} \right) + \left( \sum_{k=1}^{n-1} E_k \right) . \end{align*}
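
To make the telescoping concrete, the identity $\ln(n) = \ln(1) + \sum_{k=1}^{n-1} \frac{1}{k} + \sum_{k=1}^{n-1} E_k$ can be checked numerically, using the explicit form $E_k = \ln(k+1) - \ln(k) - \frac{1}{k}$ obtained by rearranging the formula above. A small Python sketch (the choice $n = 6$ is arbitrary):

```python
import math

n = 6  # any integer n >= 2 works here

# E_k is the remainder in ln(k+1) = ln(k) + 1/k + E_k.
E = [math.log(k + 1) - math.log(k) - 1.0 / k for k in range(1, n)]

lhs = math.log(n)
rhs = math.log(1) + sum(1.0 / k for k in range(1, n)) + sum(E)
print(lhs, rhs)  # the two values agree up to floating-point rounding
```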

Because $\ln(1) = 0$, we may rearrange to \[\left( \sum_{k=1}^{n-1} \frac{1}{k} \right) - \ln(n) = -\sum_{k=1}^{n-1} E_k.\] Adding $\frac{1}{n}$ to both sides yields \[\left( \sum_{k=1}^{n} \frac{1}{k} \right) - \ln(n) = \left( -\sum_{k=1}^{n-1} E_k \right) + \frac{1}{n}.\] Taking the limit as $n$ goes to infinity of both sides,

\begin{align*} \lim_{n \rightarrow \infty} \left( \left( \sum_{k=1}^n \frac{1}{k} \right) - \ln(n) \right) &= \lim_{n \rightarrow \infty} \left( \left( -\sum_{k=1}^{n-1} E_k \right) + \frac{1}{n} \right) \\ &= - \lim_{n \rightarrow \infty} \left( \sum_{k=1}^{n-1} E_k \right) + \lim_{n \rightarrow \infty} \frac{1}{n} \\ &= - \lim_{n \rightarrow \infty} \left( \sum_{k=1}^{n-1} E_k \right) \\ &= - \sum_{k=1}^{\infty} E_k. \end{align*}

Thus, $\gamma = - \sum_{k=1}^{\infty} E_k$, provided this sum converges; the next section shows that it does.
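
As a numerical sanity check (a Python sketch, again using the explicit form $E_k = \ln(k+1) - \ln(k) - \frac{1}{k}$ from the rearranged tangent-line formula), the partial sums of $-E_k$ do approach $0.577215 \dots$:

```python
import math

def neg_error_sum(n):
    """Return -(E_1 + ... + E_n), where E_k = ln(k+1) - ln(k) - 1/k."""
    return -sum(math.log(k + 1) - math.log(k) - 1.0 / k
                for k in range(1, n + 1))

# The partial sums approach gamma = 0.577215...
for n in (10, 1000, 100000):
    print(n, neg_error_sum(n))
```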

Convergence of the sum of error terms

We have $\ln''(x) = \left(\frac{1}{x} \right)' = -\frac{1}{x^2}$. For $k \geq 1$, the maximum absolute value of $-\frac{1}{x^2}$ for $x \in [k, k+1]$ is $\frac{1}{k^2}$. Therefore, by the Lagrange Error Bound, \[|E_k| \leq \left| \frac{1^2 \left( \frac{1}{k^2} \right) }{2!} \right| = \frac{1}{2k^2}.\]
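
This bound is easy to spot-check numerically; a Python sketch (the range of $k$ tested is arbitrary):

```python
import math

# Check |E_k| <= 1/(2k^2), where E_k = ln(k+1) - ln(k) - 1/k.
for k in range(1, 10001):
    E_k = math.log(k + 1) - math.log(k) - 1.0 / k
    assert abs(E_k) <= 1.0 / (2 * k * k)
print("Bound verified for k = 1, ..., 10000")
```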

The series $\sum_{k=1}^{\infty} \frac{1}{k^2}$ famously converges to $\frac{\pi^2}{6}$ (the Basel problem), so $\sum_{k=1}^{\infty} -\frac{1}{2k^2}$ converges to $-\frac{\pi^2}{12}$ and $\sum_{k=1}^{\infty} \frac{1}{2k^2}$ converges to $\frac{\pi^2}{12}$.
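
For reference, the partial sums of $\frac{1}{2k^2}$ do approach $\frac{\pi^2}{12} \approx 0.8225$; a short Python check (the cutoff $10^5$ is arbitrary):

```python
import math

# Partial sum of 1/(2k^2) up to k = 100000, compared with pi^2/12.
partial = sum(1.0 / (2 * k * k) for k in range(1, 100001))
print(partial, math.pi ** 2 / 12)  # 0.822462... vs 0.822467...
```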

Because $|E_k| \leq \frac{1}{2k^2}$ for all $k$ and $\sum_{k=1}^{\infty} \frac{1}{2k^2}$ converges, the Series Comparison Test shows that $\sum_{k=1}^{\infty} |E_k|$ converges. Hence $\sum_{k=1}^{\infty} E_k$ converges absolutely, and since $\left| \sum_{k=1}^{\infty} E_k \right| \leq \sum_{k=1}^{\infty} |E_k| \leq \frac{\pi^2}{12}$, its value lies in $\left[-\frac{\pi^2}{12}, \frac{\pi^2}{12} \right]$.
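
Numerically, the partial sums of $|E_k|$ indeed stay well below $\frac{\pi^2}{12}$ (a Python sketch; the cutoff $10^5$ is arbitrary):

```python
import math

# Partial sums of |E_k| are bounded above by sum 1/(2k^2) = pi^2/12.
total = 0.0
for k in range(1, 100001):
    total += abs(math.log(k + 1) - math.log(k) - 1.0 / k)
print(total, math.pi ** 2 / 12)  # the partial sum stays below the bound
```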

Hence, the limit defining $\gamma$ exists, and $\gamma = - \sum_{k=1}^{\infty} E_k$ is a well-defined constant.

See also