Proofs of AM-GM
This page lists some proofs of the weighted [[AM-GM]] Inequality. The inequality's statement is as follows: for all nonnegative reals <math>a_1, \dotsc, a_n</math> and nonnegative reals <math>\lambda_1, \dotsc, \lambda_n</math> such that <math>\sum_{i=1}^n \lambda_i = 1</math>, we have
 
<cmath> \sum_{i=1}^n \lambda_i a_i \ge \prod_{i=1}^n a_i^{\lambda_i}, </cmath>
 
 
with equality if and only if <math>a_i = a_j</math> for all <math>i,j</math> such that <math>\lambda_i, \lambda_j \neq 0</math>.
 
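For example, with <math>n = 2</math>, weights <math>\lambda_1 = \tfrac{1}{3}, \lambda_2 = \tfrac{2}{3}</math>, and values <math>a_1 = 8, a_2 = 1</math>, the statement asserts that

<cmath> \frac{1}{3} \cdot 8 + \frac{2}{3} \cdot 1 = \frac{10}{3} \ge 8^{1/3} \cdot 1^{2/3} = 2, </cmath>

which holds strictly, as <math>a_1 \neq a_2</math>.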
  
We first note that we may disregard any <math>a_i</math> for which <math>\lambda_i = 0</math>, as they contribute to neither side of the desired inequality. We also note that if <math>a_i = 0</math> and <math>\lambda_i \neq 0</math> for some <math>i</math>, then the right-hand side of the inequality is zero and the left-hand side is greater than or equal to zero, with equality if and only if <math>a_j = 0 = a_i</math> whenever <math>\lambda_j \neq 0</math>. Thus we may henceforth assume that all <math>a_i</math> and <math>\lambda_i</math> are ''strictly positive.''
  
== Proofs of Unweighted AM-GM ==
  
These proofs use the assumption that <math>\lambda_i = 1/n</math>, for all integers <math>1 \le i \le n</math>.
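
In this case, the weighted statement specializes to the familiar unweighted form

<cmath> \frac{a_1 + a_2 + \cdots + a_n}{n} \ge \sqrt[n]{a_1 a_2 \cdots a_n}, </cmath>

with equality if and only if <math>a_1 = a_2 = \cdots = a_n</math>.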
  
=== Proof by Cauchy Induction ===

We use [[Cauchy Induction]], a variant of induction in which one proves a result for <math>2</math>, then for all powers of <math>2</math>, and then finishes with a backward step in which the case of <math>n</math> variables implies the case of <math>n-1</math>.
 
'''Base Case''': The smallest nontrivial case of AM-GM is in two variables. By the properties of perfect squares (or by the [[Trivial Inequality]]), <math>(x-y)^2 \geq 0</math>, with equality if and only if <math>x-y=0</math>, that is, <math>x=y</math>. Then because <math>x</math> and <math>y</math> are nonnegative, we can perform the following manipulations: <cmath>x^2 - 2xy + y^2 \geq 0</cmath> <cmath>x^2 + 2xy + y^2 \geq 4xy</cmath> <cmath>\frac{(x+y)^2}{4} \geq xy</cmath> <cmath>\frac{x+y}{2} \geq \sqrt{xy},</cmath> with equality if and only if <math>x=y</math>. This completes the proof of the base case.
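
As a quick check, taking <math>x = 1</math> and <math>y = 9</math> gives

<cmath> \frac{1+9}{2} = 5 \ge \sqrt{1 \cdot 9} = 3, </cmath>

with strict inequality because <math>x \neq y</math>.
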
'''Powers of Two''': We use induction. Suppose that AM-GM is true for <math>n</math> variables; we will then prove that the inequality is true for <math>2n</math> variables. Let <math>x_1, x_2, \ldots, x_{2n}</math> be any list of nonnegative reals. Then, because the two lists <math>x_1, x_2, \ldots, x_n</math> and <math>x_{n+1}, x_{n+2}, \ldots, x_{2n}</math> have <math>n</math> variables each, <cmath>\frac{x_1 + x_2 + \cdots + x_n}{n} \geq \sqrt[n]{x_1 x_2 \cdots x_n} \textrm{  and  } \frac{x_{n+1} + x_{n+2} + \cdots + x_{2n}}{n} \geq \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}.</cmath> Adding these two inequalities together and dividing by two yields <cmath>\frac{x_1 + x_2 + \cdots + x_{2n}}{2n} \geq \frac{\sqrt[n]{x_1 x_2 \cdots x_n} + \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}}{2}.</cmath> Then by AM-GM in two variables, <cmath>\frac{\sqrt[n]{x_1 x_2 \cdots x_n} + \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}}{2} \geq \sqrt[2n]{x_1 x_2 \cdots x_{2n}}.</cmath> Combining this inequality with the previous one yields unweighted AM-GM in <math>2n</math> variables, except for the equality condition.

For equality, note that every inequality mentioned must hold with equality as well; thus, equality holds if and only if all the numbers in <math>x_1, x_2, \ldots, x_n</math> are the same, all the numbers in <math>x_{n+1}, x_{n+2}, \ldots, x_{2n}</math> are the same, and <math>\sqrt[n]{x_1 x_2 \cdots x_n} = \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}</math>. From here, it is straightforward to show that these conditions together imply <math>x_1 = x_2 = \cdots = x_{2n}</math>, which is the equality condition for unweighted AM-GM in <math>2n</math> variables.

This completes the induction and implies that unweighted AM-GM holds for all powers of two.
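
To see the doubling step concretely, take <math>n = 2</math> and the list <math>1, 1, 4, 4</math>. The two halves satisfy <math>\frac{1+1}{2} \ge \sqrt{1 \cdot 1}</math> and <math>\frac{4+4}{2} \ge \sqrt{4 \cdot 4}</math> (both with equality), and averaging these and then applying two-variable AM-GM gives

<cmath> \frac{1+1+4+4}{4} = \frac{1+4}{2} \ge \sqrt{1 \cdot 4} = 2 = \sqrt[4]{1 \cdot 1 \cdot 4 \cdot 4}. </cmath>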
  
'''Backward Step''': Assume that AM-GM holds for <math>n</math> variables. Letting <math>x_n = \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}</math>, we have that <cmath>\frac{x_1 + x_2 + \cdots + x_{n-1} + \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}}{n} \geq \sqrt[n]{x_1 x_2 \cdots x_{n-1}\left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right)}.</cmath> Equality holds if and only if <math>x_1 = x_2 = \cdots = x_{n-1} = \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}</math>. However, the last equality is automatic whenever all the numbers <math>x_1, x_2, \ldots, x_{n-1}</math> are the same; thus, equality holds if and only if <math>x_1 = x_2 = \cdots = x_{n-1}</math>.

We then simplify the left-hand side. Multiplying the numerator and denominator of the fraction by <math>n-1</math> and combining like terms, we get that <cmath>\frac{x_1 + x_2 + \cdots + x_{n-1} + \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}}{n} = \frac{nx_1 + nx_2 + \cdots + nx_{n-1}}{n(n-1)} = \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}.</cmath> Thus, <cmath>\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1} \geq \sqrt[n]{x_1 x_2 \cdots x_{n-1}\left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right)}.</cmath> Raising both sides to the <math>n</math>th power yields <cmath>\left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right)^n \geq x_1 x_2 \cdots x_{n-1}\left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right).</cmath> From here, we divide both sides by <math>\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}</math> and take the <math>(n-1)</math>th root to get <cmath>\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1} \geq \sqrt[n-1]{x_1 x_2 \cdots x_{n-1}}.</cmath> Every step we took preserves the earlier equality condition, so the inequality holds for <math>n-1</math> variables with equality if and only if <math>x_1 = x_2 = \cdots = x_{n-1}</math>. This completes the backward induction, which proves the unweighted AM-GM Inequality, as required. <math>\square</math>
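
For instance, to pass from four variables to three with <math>x_1 = 1</math>, <math>x_2 = 2</math>, <math>x_3 = 6</math>: here <math>x_4 = \frac{1+2+6}{3} = 3</math>, the four-variable inequality reads

<cmath> 3 = \frac{1+2+6+3}{4} \ge \sqrt[4]{1 \cdot 2 \cdot 6 \cdot 3} = \sqrt[4]{36}, </cmath>

and raising both sides to the fourth power, dividing by <math>3</math>, and taking cube roots recovers the three-variable inequality <math>3 \ge \sqrt[3]{12}</math>.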
 
 
=== Proof by Rearrangement ===
 
Define the <math>n</math> sequences <math>\{ r_{i,j}\}_{i=1}^{n}</math> as <math>r_{i,j} = \sqrt[n]{a_i}</math>, for all integers <math>1 \le i,j \le n</math>. Evidently these sequences are similarly sorted. Then by the [[Rearrangement Inequality]],
 
<cmath> \sum_i a_i = \sum_i \prod_j r_{i,j} \ge \sum_i \prod_j r_{i+j,j} = n \prod_i \sqrt[n]{a_i} , </cmath>
 
where we take our indices modulo <math>n</math>, with equality exactly when all the <math>r_{i,j}</math>, and therefore all the <math>a_i</math>, are equal. Dividing both sides by <math>n</math> gives the desired inequality. <math>\blacksquare</math>
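
For <math>n = 2</math>, this argument reduces to the familiar two-variable inequality: the similarly sorted products sum to <math>a_1 + a_2</math>, while the cyclically shifted products sum to

<cmath> \sqrt{a_1}\sqrt{a_2} + \sqrt{a_2}\sqrt{a_1} = 2\sqrt{a_1 a_2}, </cmath>

so <math>a_1 + a_2 \ge 2\sqrt{a_1 a_2}</math>.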
 
 
=== Proof by Calculus ===

We start the proof by considering the function <math>f(x) = \ln x - x</math> and finding its maximum, which we can do simply using calculus. To find the critical points of <math>f(x)</math>, we compute <math>f'(x)</math> and set it equal to <math>0</math>. Using the linearity of the derivative, <math>f'(x) = \frac{1}{x} - 1</math>, and we need

<cmath> f'(x) = \frac{1}{x} - 1 = 0 \iff \frac{1}{x} = 1 \iff x = 1. </cmath>

Note that this is the only critical point of <math>f(x)</math>. We can confirm it is a maximum by checking that the second derivative is negative there: <math>f''(x) = -\frac{1}{x^2}</math>, so <math>f''(1) = -1 < 0</math>, and <math>f(1)</math> is a maximum. Since <math>f(1) = \ln 1 - 1 = -1</math> is the maximum value of <math>f(x)</math>, we can safely say that <math>\ln x - x \leq -1</math> for all <math>x > 0</math>, or in other words <math>\ln x \leq x - 1</math>.

We now define a few more things and do some manipulations with them. Let <math>A = \frac{\sum_{i=1}^{n} a_i}{n}</math>, and notice that <math>nA = \sum_{i=1}^{n} a_i</math>; this fact will come into play later. Letting <math>x = \frac{a_i}{A}</math> in <math>\ln x \leq x - 1</math> for each <math>i</math>, we get

<cmath> \ln\left(\frac{a_1}{A}\right) \leq \frac{a_1}{A} - 1, \qquad \ln\left(\frac{a_2}{A}\right) \leq \frac{a_2}{A} - 1, \qquad \dotsc, \qquad \ln\left(\frac{a_n}{A}\right) \leq \frac{a_n}{A} - 1. </cmath>

Adding all these results together, we get

<cmath> \ln\left(\frac{\prod_{i=1}^{n} a_i}{A^n}\right) \leq \frac{\sum_{i=1}^{n}a_i}{A} - n = \frac{nA}{A} - n = 0. </cmath>

Exponentiating both sides yields

<cmath> \frac{\prod_{i=1}^{n} a_i}{A^n} \leq 1, \quad \text{i.e.,} \quad \prod_{i=1}^{n} a_i \leq A^n. </cmath>

Finally, taking the <math>n</math>th root of both sides gives

<cmath> \sqrt[n]{\prod_{i=1}^{n} a_i} \leq A = \frac{\sum_{i=1}^{n} a_i}{n}. </cmath>

This proves the AM-GM Inequality. <math>\blacksquare</math>
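
As a quick check of the key bound <math>\ln x \leq x - 1</math>: at <math>x = 2</math> we have <math>\ln 2 \approx 0.69 < 1</math>; at <math>x = \tfrac{1}{2}</math> we have <math>\ln \tfrac{1}{2} \approx -0.69 < -\tfrac{1}{2}</math>; and at <math>x = 1</math> both sides equal <math>0</math>.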
 
=== Alternate Proof by Induction ===

Because <math>(a-b)^2</math> is a square, <math>(a-b)^2 \geq 0</math>. Now, through algebra:

<cmath>a^2-2ab+b^2 \geq 0</cmath> <cmath>a^2+2ab+b^2 \geq 4ab</cmath> <cmath>\frac{(a+b)^2}4 \geq ab</cmath> <cmath>\frac{a+b}2 \geq \sqrt{ab}.</cmath>

From here, we proceed with induction as in the ''Powers of Two'' and ''Backward Step'' portions of the Proof by Cauchy Induction above to show the theorem is true. <math>\blacksquare</math>
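
As a concrete instance, with <math>a = 2</math> and <math>b = 8</math> this chain gives

<cmath> \frac{2+8}{2} = 5 \ge \sqrt{2 \cdot 8} = 4. </cmath>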

== Proof of Weighted AM-GM ==

=== Proof by Convexity ===

We note that the [[function]] <math>x \mapsto \ln x</math> is strictly [[concave]]. Then by [[Jensen's Inequality]],

<cmath> \ln \sum_i \lambda_i a_i \ge \sum_i \lambda_i \ln a_i = \ln \prod_i a_i^{\lambda_i} , </cmath>

with equality if and only if all the <math>a_i</math> are equal.

Since <math>x \mapsto \ln x</math> is a strictly increasing function, it then follows that

<cmath> \sum_i \lambda_i a_i \ge \prod_i a_i^{\lambda_i}, </cmath>

with equality if and only if all the <math>a_i</math> are equal, as desired. <math>\blacksquare</math>
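
With the data from the earlier example, <math>\lambda = (\tfrac{1}{3}, \tfrac{2}{3})</math> and <math>a = (8, 1)</math>, Jensen's Inequality reads

<cmath> \ln \frac{10}{3} \approx 1.20 \ge \frac{1}{3}\ln 8 + \frac{2}{3}\ln 1 = \ln 2 \approx 0.69, </cmath>

and exponentiating recovers <math>\frac{10}{3} \ge 2</math>.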

=== Alternate Proof by Convexity ===

''This proof is due to [[G. Pólya]].''

Note that the function <math>f:x \mapsto e^x</math> is strictly convex. Let <math>g(x)</math> be the line tangent to <math>f</math> at <math>(0,1)</math>; then <math>g(x) = x+1</math>. Since <math>f</math> is strictly convex, its graph lies above its tangent lines, so <math>f(x) \ge g(x)</math> for all <math>x</math>, with equality exactly when <math>x=0</math>, i.e.,

<cmath> 1+x \le e^x, </cmath>

with equality exactly when <math>x=0</math>.

Now, set

<cmath> r_i = a_i/\biggl( \sum_{j=1}^n \lambda_j a_j \biggr) - 1, </cmath>

for all integers <math>1\le i \le n</math>. Our earlier bound tells us that

<cmath> r_i +1 \le \exp(r_i), </cmath>

so

<cmath> (r_i +1)^{\lambda_i} \le \exp(\lambda_i r_i) .</cmath>

Multiplying <math>n</math> such inequalities gives us

<cmath>\prod_{i=1}^n (r_i + 1)^{\lambda_{i}} \le \prod_{i=1}^n \exp \lambda_i r_i .</cmath>

Evaluating the left-hand side:

<cmath>\prod_{i=1}^n (r_i + 1)^{\lambda_{i}} = \frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \bigl(\sum_{j=1}^n \lambda_j a_j\bigr)^{\sum_{i=1}^n \lambda_i} } = \frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \sum_{j=1}^n \lambda_j a_j } ,</cmath>

since

<cmath> \sum_{i=1}^n \lambda_i = 1 .</cmath>

Evaluating the right-hand side:

<cmath> \prod_{i=1}^n \exp \lambda_i r_i = \prod_{i=1}^n \exp \left(\frac{a_i \lambda_i}{ \sum_{j=1}^n \lambda_j a_j} - \lambda_i\right) = \exp \left(\frac{\sum_{i=1}^n a_i \lambda_i}{ \sum_{j=1}^n \lambda_j a_j} - \sum_{i=1}^n \lambda_i\right) = \exp 0 = 1 .</cmath>

Substituting the results for the left and right sides:

<cmath> \frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \sum_{j=1}^n \lambda_j a_j } \le 1 ,</cmath>

so

<cmath> \prod_{i=1}^n a_i^{\lambda_i} \le \sum_{j=1}^n \lambda_j a_j = \sum_{i=1}^n \lambda_i a_i , </cmath>

as desired. <math>\blacksquare</math>
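
As a sanity check with <math>\lambda = (\tfrac{1}{3}, \tfrac{2}{3})</math> and <math>a = (8, 1)</math>: here <math>\sum_{j} \lambda_j a_j = \frac{10}{3}</math>, so <math>r_1 = \frac{8}{10/3} - 1 = \frac{7}{5}</math> and <math>r_2 = \frac{1}{10/3} - 1 = -\frac{7}{10}</math>, and indeed

<cmath> \sum_{i=1}^2 \lambda_i r_i = \frac{1}{3} \cdot \frac{7}{5} + \frac{2}{3} \cdot \left(-\frac{7}{10}\right) = 0, </cmath>

which is exactly why the right-hand side collapses to <math>\exp 0 = 1</math>.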
  
 
[[Category:Inequality]]
 
