Proofs of AM-GM

This page lists some proofs of the weighted AM-GM Inequality. The inequality's statement is as follows: for all nonnegative reals $a_1, \dotsc, a_n$ and nonnegative reals $\lambda_1, \dotsc, \lambda_n$ such that $\sum_{i=1}^n \lambda_i = 1$, we have \[\sum_{i=1}^n \lambda_i a_i \ge \prod_{i=1}^n a_i^{\lambda_i},\] with equality if and only if $a_i = a_j$ for all $i,j$ such that $\lambda_i, \lambda_j \neq 0$.
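As a quick sanity check (not part of the proof), the following Python sketch evaluates both sides of the inequality for sample values of $a_i$ and $\lambda_i$ chosen only for illustration:

```python
import math

# Sample data: nonnegative a_i with nonnegative weights lambda_i summing to 1.
a = [4.0, 9.0, 16.0]
lam = [0.2, 0.3, 0.5]
assert abs(sum(lam) - 1.0) < 1e-12

weighted_am = sum(l * x for l, x in zip(lam, a))          # sum_i lambda_i a_i
weighted_gm = math.prod(x ** l for l, x in zip(lam, a))   # prod_i a_i^{lambda_i}

print(weighted_am, weighted_gm)   # 11.5 and about 10.20
assert weighted_am >= weighted_gm
```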

We first note that we may disregard any $a_i$ for which $\lambda_i = 0$, as they contribute to neither side of the desired inequality. We also note that if $a_i = 0$ and $\lambda_i \neq 0$ for some $i$, then the right-hand side of the inequality is zero and the left-hand side is greater than or equal to zero, with equality if and only if $a_j = 0 = a_i$ whenever $\lambda_j \neq 0$. Thus we may henceforth assume that all $a_i$ and $\lambda_i$ are strictly positive.

Complete Proofs

Proof by Convexity

We note that the function $x \mapsto \ln x$ is strictly concave. Then by Jensen's Inequality, \[\ln \sum_i \lambda_i a_i \ge \sum_i \lambda_i \ln a_i = \ln \prod_i a_i^{\lambda_i} ,\] with equality if and only if all the $a_i$ are equal. Since $x \mapsto \ln x$ is a strictly increasing function, it then follows that \[\sum_i \lambda_i a_i \ge \prod_i a_i^{\lambda_i},\] with equality if and only if all the $a_i$ are equal, as desired. $\blacksquare$
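A small numerical illustration of the Jensen step (the data below are hypothetical, chosen only for demonstration):

```python
import math

# Hypothetical sample data: strictly positive a_i, weights summing to 1.
a = [2.0, 5.0, 7.0]
lam = [0.5, 0.25, 0.25]

lhs = math.log(sum(l * x for l, x in zip(lam, a)))   # ln( sum_i lambda_i a_i )
rhs = sum(l * math.log(x) for l, x in zip(lam, a))   # sum_i lambda_i ln a_i

assert lhs >= rhs - 1e-12
# Exponentiating (ln is strictly increasing) recovers the weighted AM-GM inequality:
assert math.exp(lhs) >= math.exp(rhs) - 1e-12
```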

Alternate Proof by Convexity

This proof is due to G. Pólya.

Note that the function $f:x \mapsto e^x$ is strictly convex. Let $g(x)$ be the line tangent to $f$ at $(0,1)$; then $g(x) = x+1$. Since $f$ is strictly convex and differentiable, its graph lies above each of its tangent lines except at the point of tangency, so $f(x) \ge g(x)$ for all $x$, with equality exactly when $x=0$, i.e., \[1+x \le e^x,\] with equality exactly when $x=0$.
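A quick check of the tangent-line bound on a few sample points (a sketch, not part of the argument):

```python
import math

# Check 1 + x <= e^x on a few sample points; equality only at x = 0.
for x in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    assert 1 + x <= math.exp(x) + 1e-12
    if x != 0.0:
        assert 1 + x < math.exp(x)    # strict away from the point of tangency
```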

Now, set \[r_i = a_i/\biggl( \sum_{j=1}^n \lambda_j a_j \biggr) - 1,\] for all integers $1\le i \le n$. Our earlier bound tells us that \[r_i +1 \le \exp(r_i),\] so, since $r_i + 1 > 0$ and $\lambda_i > 0$, \[(r_i +1)^{\lambda_i} \le \exp(\lambda_i r_i) .\] Multiplying $n$ such inequalities gives us

\[\prod_{i=1}^n (r_i + 1)^{\lambda_{i}} \le \prod_{i=1}^n \exp \lambda_i r_i\]

Evaluating the left-hand side:

\[\prod_{i=1}^n (r_i + 1)^{\lambda_{i}} = \frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \bigl( \sum_{j=1}^n \lambda_j a_j \bigr)^{\sum_{i=1}^n \lambda_i} } =  \frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \sum_{j=1}^n \lambda_j a_j } ,\]

since

\[\sum_{i=1}^n \lambda_i = 1 .\]

Evaluating the right-hand side:

\[\prod_{i=1}^n \exp \lambda_i r_i =  \prod_{i=1}^n \exp \left(\frac{\lambda_i a_i}{ \sum_{j=1}^n \lambda_j a_j} -  \lambda_i\right) = \exp \left(\frac{\sum_{i=1}^n \lambda_i a_i}{ \sum_{j=1}^n \lambda_j a_j} - \sum_{i=1}^n \lambda_i\right) = \exp (1 - 1) = \exp 0 = 1 .\]

Substituting the results for the left and right sides:

\[\frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \sum_{j=1}^n \lambda_j a_j } \le 1 ,\]

so, multiplying both sides by $\sum_{j=1}^n \lambda_j a_j$,

\[\prod_{i=1}^n a_i^{\lambda_i} \le  \sum_{j=1}^n \lambda_j a_j =  \sum_{i=1}^n \lambda_i a_i ,\]

as desired. $\blacksquare$
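The whole construction can be sanity-checked numerically. In the sketch below the data are hypothetical; it verifies that the right-hand side collapses to $1$ and that the left-hand side is at most $1$:

```python
import math

# Hypothetical sample data for the construction above.
a = [3.0, 8.0, 1.0, 6.0]
lam = [0.1, 0.4, 0.2, 0.3]
s = sum(l * x for l, x in zip(lam, a))        # sum_j lambda_j a_j

r = [x / s - 1 for x in a]                    # r_i = a_i / (sum_j lambda_j a_j) - 1

lhs = math.prod((ri + 1) ** li for ri, li in zip(r, lam))   # prod_i (r_i + 1)^{lambda_i}
rhs = math.prod(math.exp(li * ri) for ri, li in zip(r, lam))

assert abs(rhs - 1.0) < 1e-9     # the right-hand side collapses to exp(0) = 1
assert lhs <= rhs + 1e-9         # hence prod_i a_i^{lambda_i} <= sum_i lambda_i a_i
```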

Proofs of Unweighted AM-GM

These proofs use the assumption that $\lambda_i = 1/n$, for all integers $1 \le i \le n$.

Proof by Rearrangement

Define the $n$ sequences $\{ r_{i,j}\}_{i=1}^{n}$, one for each $j$, by $r_{i,j} = \sqrt[n]{a_i}$ for all integers $1 \le i,j \le n$. Evidently these sequences are similarly sorted. Then by the Rearrangement Inequality, \[\sum_i a_i = \sum_i \prod_j r_{i,j} \ge \sum_i \prod_j r_{i+j,j} = n \prod_i \sqrt[n]{a_i} ,\] where we take our indices modulo $n$, with equality exactly when all the $r_{i,j}$, and therefore all the $a_i$, are equal. Dividing both sides by $n$ gives the desired inequality. $\blacksquare$
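For concreteness, the following sketch (with made-up values of $a_i$) checks the two sums appearing above, taking indices modulo $n$ as in the proof:

```python
import math

# Hypothetical sample values; indices are 0-based here and taken mod n.
a = [2.0, 7.0, 3.0, 5.0]
n = len(a)
r = [[a[i] ** (1.0 / n) for j in range(n)] for i in range(n)]   # r[i][j] = a_i^{1/n}

same_sorted = sum(math.prod(r[i][j] for j in range(n)) for i in range(n))
shifted = sum(math.prod(r[(i + j) % n][j] for j in range(n)) for i in range(n))
gm = math.prod(x ** (1.0 / n) for x in a)

assert abs(same_sorted - sum(a)) < 1e-9   # sum_i prod_j r_{i,j} = sum_i a_i
assert abs(shifted - n * gm) < 1e-9       # sum_i prod_j r_{i+j,j} = n * GM
assert same_sorted >= shifted - 1e-9      # the Rearrangement Inequality step
```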

Proof by Cauchy Induction

We first prove that the inequality holds for two variables. Note that $(\sqrt{a}, \sqrt{b}), (\sqrt{a}, \sqrt{b})$ are similarly sorted sequences. Then by the Rearrangement Inequality, \[\sqrt{ab} = \frac{ \sqrt{a} \sqrt{b} + \sqrt{b} \sqrt{a}}{2} \le \frac{ \sqrt{a} \sqrt{a} + \sqrt{b} \sqrt{b}}{2} = \frac{a+b}{2} ,\] with equality exactly when $a=b$.

We next prove that if the inequality holds for $n$ variables (with equality exactly when all are equal), then it holds for $2n$ variables (with equality exactly when all are equal). Indeed, suppose the inequality holds for $n$ variables. Let $A_n, A'_n, A_{2n}$ denote the arithmetic means of $(a_1, \dotsc, a_n)$, $(a_{n+1}, \dotsc, a_{2n})$, $(a_1, \dotsc, a_{2n})$, respectively; let $G_n, G'_n, G_{2n}$ denote their respective geometric means. Then \[G_{2n} = \sqrt{G_n G'_n} \le \frac{G_n + G'_n}{2} \le \frac{A_n + A'_n}{2} = A_{2n},\] where the first inequality is the two-variable case and the second follows from the $n$-variable case applied to each half, with equality exactly when all the numbers $a_1, \dotsc, a_{2n}$ are equal, as desired.
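A numerical illustration of the doubling step, with hypothetical data split into two halves of size $n$:

```python
import math

def am(xs):
    return sum(xs) / len(xs)

def gm(xs):
    return math.prod(xs) ** (1.0 / len(xs))

# Hypothetical data: two halves of size n = 3.
first, second = [1.0, 4.0, 9.0], [2.0, 8.0, 5.0]
both = first + second

assert abs(gm(both) - math.sqrt(gm(first) * gm(second))) < 1e-9   # G_{2n} = sqrt(G_n G'_n)
assert abs(am(both) - (am(first) + am(second)) / 2) < 1e-9        # A_{2n} = (A_n + A'_n)/2

assert gm(both) <= (gm(first) + gm(second)) / 2 + 1e-9            # two-variable AM-GM
assert (gm(first) + gm(second)) / 2 <= (am(first) + am(second)) / 2 + 1e-9  # n-variable case
```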

These two results show by induction that the theorem holds for $n=2^k$, for all positive integers $k$. In particular, for every positive integer $n$, there is an integer $N > n$ such that the theorem holds for $N$ variables.

Finally, we show that if $N>n$ and the theorem holds for $N$ variables, then it holds for $n$ variables $a_1, \dotsc, a_n$ with arithmetic mean $A_n$ and geometric mean $G_n$. Indeed, set $b_k = a_k$ for $1\le k \le n$, and let $b_k = A_n$ for $n < k \le N$. Then \[G_n^{n/N} \cdot A_n^{(N-n)/N} = \prod_{i=1}^N b_i^{1/N} \le \sum_{i=1}^N b_i/N = A_n .\] Dividing both sides by $A_n^{(N-n)/N}$ gives $G_n^{n/N} \le A_n^{n/N}$, or $G_n \le A_n$, with equality exactly when all the $a_i$ are equal to $A_n$.
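A small numerical check of this padding argument (the values of $a_i$ and the choice $N = 4$ are hypothetical):

```python
import math

# Hypothetical values: n = 3 original variables padded up to N = 4.
a = [2.0, 3.0, 10.0]
n, N = len(a), 4
A_n = sum(a) / n
G_n = math.prod(a) ** (1.0 / n)

b = a + [A_n] * (N - n)                       # b_k = a_k for k <= n, b_k = A_n otherwise

lhs = G_n ** (n / N) * A_n ** ((N - n) / N)   # G_n^{n/N} * A_n^{(N-n)/N}
assert abs(lhs - math.prod(x ** (1.0 / N) for x in b)) < 1e-9   # equals prod_i b_i^{1/N}
assert abs(sum(b) / N - A_n) < 1e-9           # the padded arithmetic mean is still A_n

# The N-variable theorem gives lhs <= A_n; dividing by A_n^{(N-n)/N} yields G_n <= A_n.
assert lhs <= A_n + 1e-9
assert G_n <= A_n + 1e-9
```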

The last two results show that for all positive integers $n$, the theorem holds for $n$ variables. Therefore the theorem is true. $\blacksquare$