Proofs of AM-GM

This page lists some proofs of the weighted AM-GM Inequality. The inequality's statement is as follows: for all nonnegative reals $a_1, \dotsc, a_n$ and nonnegative reals $\lambda_1, \dotsc, \lambda_n$ such that $\sum_{i=1}^n \lambda_i = 1$, we have \[\sum_{i=1}^n \lambda_i a_i \geq \prod_{i=1}^n a_i^{\lambda_i},\] with equality if and only if $a_i = a_j$ for all $i,j$ such that $\lambda_i, \lambda_j \neq 0$.
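
For example, taking $n = 2$ with equal weights $\lambda_1 = \lambda_2 = \tfrac{1}{2}$, the statement specializes to the familiar two-variable form \[\frac{a_1 + a_2}{2} \geq a_1^{1/2} a_2^{1/2} = \sqrt{a_1 a_2}.\]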

We first note that we may disregard any $a_i$ for which $\lambda_i = 0$, as they contribute to neither side of the desired inequality. We also note that if $a_i = 0$ and $\lambda_i \neq 0$ for some $i$, then the right-hand side of the inequality is zero and the left-hand side is greater than or equal to zero, with equality if and only if $a_j = 0 = a_i$ whenever $\lambda_j \neq 0$. Thus we may henceforth assume that all $a_i$ and $\lambda_i$ are strictly positive.

Proofs of Unweighted AM-GM

These proofs use the assumption that $\lambda_i = 1/n$, for all integers $1 \le i \le n$.
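
Under this assumption, the inequality takes the unweighted form \[\frac{a_1 + a_2 + \cdots + a_n}{n} \geq \sqrt[n]{a_1 a_2 \cdots a_n},\] with equality if and only if $a_1 = a_2 = \cdots = a_n$; this is the version proved in this section.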

Proof by Cauchy Induction

We use Cauchy Induction, a variant of induction in which one proves a result for $2$, all powers of $2$, and then that $n$ implies $n-1$.

Base Case: The smallest nontrivial case of AM-GM is in two variables. By the properties of perfect squares (or by the Trivial Inequality), $(x-y)^2 \geq 0,$ with equality if and only if $x-y=0$, or $x=y$. Then because $x$ and $y$ are nonnegative, we can perform the following manipulations: \[x^2 - 2xy + y^2 \geq 0\] \[x^2 + 2xy + y^2 \geq 4xy\] \[(x+y)^2 \geq 4xy\] \[\frac{(x+y)^2}{4} \geq xy\] \[\frac{x+y}{2} \geq \sqrt{xy},\] with equality if and only if $x=y$, just as before. This completes the proof of the base case.
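
As a quick numerical check, taking $x = 1$ and $y = 9$ gives $\frac{1+9}{2} = 5 \geq 3 = \sqrt{1 \cdot 9}$, while taking $x = y = 4$ gives $4$ on both sides, matching the equality condition.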

Powers of Two: We use induction. Suppose that AM-GM is true for $n$ variables; we will then prove that the inequality is true for $2n$. Let $x_1, x_2, \ldots, x_{2n}$ be any list of nonnegative reals. Then, because the two lists $x_1, x_2, \ldots, x_n$ and $x_{n+1}, x_{n+2}, \ldots, x_{2n}$ each have $n$ variables, \[\frac{x_1 + x_2 + \cdots + x_n}{n} \geq \sqrt[n]{x_1 x_2 \cdots x_n} \textrm{  and  } \frac{x_{n+1} + x_{n+2} + \cdots + x_{2n}}{n} \geq \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}.\] Adding these two inequalities together and dividing by $2$ yields \[\frac{x_1 + x_2 + \cdots + x_{2n}}{2n} \geq \frac{\sqrt[n]{x_1 x_2 \cdots x_n} + \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}}{2}.\] From here, we perform AM-GM in two variables on $\sqrt[n]{x_1 x_2 \cdots x_n}$ and $\sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}$ to get \[\frac{\sqrt[n]{x_1 x_2 \cdots x_n} + \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}}{2} \geq \sqrt[2n]{x_1 x_2 \cdots x_{2n}}.\] Combining this inequality with the previous one yields AM-GM in $2n$ variables, with one exception: equality.
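
For instance, starting from the two-variable base case ($n = 2$), this step produces the four-variable inequality: \[\frac{x_1 + x_2 + x_3 + x_4}{4} \geq \frac{\sqrt{x_1 x_2} + \sqrt{x_3 x_4}}{2} \geq \sqrt{\sqrt{x_1 x_2}\,\sqrt{x_3 x_4}} = \sqrt[4]{x_1 x_2 x_3 x_4}.\]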

For equality, note that every AM-GM application mentioned must have equality as well; thus, equality holds if and only if all the numbers in $x_1, x_2, \ldots, x_n$ are the same, all the numbers in $x_{n+1}, x_{n+2}, \ldots, x_{2n}$ are the same, and $\sqrt[n]{x_1 x_2 \cdots x_n} = \sqrt[n]{x_{n+1} x_{n+2} \cdots x_{2n}}$. From here, it is trivial to show that this implies $x_1 = x_2 = \cdots = x_{2n}$, which is the equality condition for AM-GM in $2n$ variables.

This completes the induction and proves that the inequality holds for all powers of $2$.

Backward Step: Assume that AM-GM holds for $n$ variables. We will then use a substitution to derive AM-GM for $n-1$ variables. Letting $x_n = \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}$, we have that \[\frac{x_1 + x_2 + \cdots + x_{n-1} + \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}}{n} \geq \sqrt[n]{x_1 x_2 \cdots x_{n-1} \left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right)}.\] Because we assumed AM-GM in $n$ variables, equality holds if and only if $x_1 = x_2 = \cdots = x_{n-1} = \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}$. However, note that the last equality is automatic once all of the numbers $x_1, x_2, \ldots, x_{n-1}$ are the same; thus, equality holds if and only if $x_1 = x_2 = \cdots = x_{n-1}$.

We now simplify the left-hand side. Multiplying the numerator and denominator of the fraction by $n-1$ and combining like terms, we get that \[\frac{x_1 + x_2 + \cdots + x_{n-1} + \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}}{n} = \frac{nx_1 + nx_2 + \cdots + nx_{n-1}}{n(n-1)} = \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}.\] Plugging this into the earlier inequality yields \[\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1} \geq \sqrt[n]{x_1 x_2 \cdots x_{n-1} \left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1} \right)}.\] Raising both sides to the $n$th power yields \[\left( \frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right)^n \geq x_1 x_2 \cdots x_{n-1}\left(\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}\right).\] From here, we divide by $\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1}$ and take the $(n-1)^{\textrm{th}}$ root to get that \[\frac{x_1 + x_2 + \cdots + x_{n-1}}{n-1} \geq \sqrt[n-1]{x_1 x_2 \cdots x_{n-1}}.\] This is the inequality in $n-1$ variables. Note that every step taken also preserves the equality condition, which completes the backward step. Then by Cauchy Induction, the unweighted AM-GM Inequality holds. $\square$
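
As a concrete instance, applying the backward step with $n = 4$ and the substitution $x_4 = \frac{x_1 + x_2 + x_3}{3}$ turns the four-variable inequality into \[\frac{x_1 + x_2 + x_3}{3} \geq \sqrt[3]{x_1 x_2 x_3},\] the three-variable case. Since every integer lies below some power of $2$, repeated backward steps from the powers of $2$ reach every $n$.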

Proof by Rearrangement

Define the $n$ sequences $\{ r_{i,j}\}_{i=1}^{n}$, for $1 \le j \le n$, by $r_{i,j} = \sqrt[n]{a_i}$ for all integers $1 \le i,j \le n$. Evidently, these sequences are similarly sorted. Then by the Rearrangement Inequality, \[\sum_i a_i = \sum_i \prod_j r_{i,j} \ge \sum_i \prod_j r_{i+j,j} = n \prod_i \sqrt[n]{a_i} ,\] where we take our indices modulo $n$, with equality exactly when all the $r_{i,j}$, and therefore all the $a_i$, are equal. Dividing both sides by $n$ gives the desired inequality. $\blacksquare$
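
To see the construction concretely for $n = 3$: each $r_{i,j} = \sqrt[3]{a_i}$, so $\prod_j r_{i,j} = a_i$, while the shifted products give \[\sum_i \prod_j r_{i+j,j} = \sqrt[3]{a_2 a_3 a_1} + \sqrt[3]{a_3 a_1 a_2} + \sqrt[3]{a_1 a_2 a_3} = 3\sqrt[3]{a_1 a_2 a_3},\] so the displayed inequality reads $a_1 + a_2 + a_3 \ge 3\sqrt[3]{a_1 a_2 a_3}$.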

Proof by Calculus

We will start the proof by considering the function $f(x) = \ln x - x$, whose maximum we now find using calculus. We need the critical points of $f(x)$, which we find by setting $f'(x)$ equal to $0$. Using the linearity of the derivative, $f'(x) = \frac{1}{x} - 1$, so we need \[f'(x) = \frac{1}{x} - 1 = 0\] \[\frac{1}{x} = 1\] \[x = 1.\] Note that this is the only critical point of $f(x)$. We can confirm it is a maximum by checking that the second derivative is negative there: $f''(x) = -\frac{1}{x^2}$, so $f''(1) = -1 < 0$. Since $f''(1) < 0$, $f(1)$ is a maximum, and $f(1) = \ln 1 - 1 = -1$. Now that we know $f(1) = -1$ is the maximum of $f(x)$, we can safely say that $\ln x - x \leq -1$, or in other words $\ln x \leq x - 1$, with equality only at $x = 1$.

We now define a few more quantities and manipulate them. Let $A = \frac{\sum_{i=1}^{n} a_i}{n}$, and notice that $nA = \sum_{i=1}^{n} a_i$; this fact will come into play later. Letting $x = \frac{a_i}{A}$ for each $i$ and plugging into $\ln x \leq x - 1$, we get \[\ln\left(\frac{a_1}{A}\right) \leq \frac{a_1}{A} - 1\] \[\ln\left(\frac{a_2}{A}\right) \leq \frac{a_2}{A} - 1\] \[\vdots\] \[\ln\left(\frac{a_n}{A}\right) \leq \frac{a_n}{A} - 1\] Adding all these results together, we get \[\ln\left(\frac{a_1}{A}\right) + \ln\left(\frac{a_2}{A}\right) + \dots + \ln\left(\frac{a_n}{A}\right) \leq \frac{a_1}{A} - 1 + \frac{a_2}{A} - 1 + \dots + \frac{a_n}{A} - 1\] \[\ln\left(\frac{\prod_{i=1}^{n} a_i}{A^n}\right) \leq \frac{\sum_{i=1}^{n}a_i}{A} - n\] \[\ln\left(\frac{\prod_{i=1}^{n} a_i}{A^n}\right) \leq \frac{nA}{A} - n\] \[\ln\left(\frac{\prod_{i=1}^{n} a_i}{A^n}\right) \leq n - n\] \[\ln\left(\frac{\prod_{i=1}^{n} a_i}{A^n}\right) \leq 0\] Now exponentiating both sides, we get \[e^{\ln\left(\frac{\prod_{i=1}^{n} a_i}{A^n}\right)} \leq e^{0}\] \[\frac{\prod_{i=1}^{n} a_i}{A^n} \leq 1\] \[\prod_{i=1}^{n} a_i \leq A^n\] \[\sqrt[n]{\prod_{i=1}^{n} a_i} \leq \sqrt[n]{A^n}\] \[\sqrt[n]{\prod_{i=1}^{n} a_i} \leq A\] \[\sqrt[n]{\prod_{i=1}^{n} a_i} \leq \frac{\sum_{i=1}^{n} a_i}{n}\] This proves the AM-GM inequality. $\blacksquare$
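
As a sanity check of the key bound $\ln x \leq x - 1$: at $x = 2$ it reads $\ln 2 \approx 0.693 \leq 1$; at $x = \tfrac{1}{2}$ it reads $\ln\tfrac{1}{2} \approx -0.693 \leq -\tfrac{1}{2}$; and at $x = 1$ both sides equal $0$, the unique equality case.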

Proof of Weighted AM-GM

Proof by Convexity

We note that the function $x \mapsto \ln x$ is strictly concave. Then by Jensen's Inequality, \[\ln \sum_i \lambda_i a_i \ge \sum_i \lambda_i \ln a_i = \ln \prod_i a_i^{\lambda_i} ,\] with equality if and only if all the $a_i$ are equal. Since $x \mapsto \ln x$ is a strictly increasing function, it then follows that \[\sum_i \lambda_i a_i \ge \prod_i a_i^{\lambda_i},\] with equality if and only if all the $a_i$ are equal, as desired. $\blacksquare$
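
In the simplest case $n = 2$ and $\lambda_1 = \lambda_2 = \tfrac{1}{2}$, this application of Jensen's Inequality reads \[\ln\frac{a_1 + a_2}{2} \geq \frac{\ln a_1 + \ln a_2}{2} = \ln\sqrt{a_1 a_2},\] which exponentiates to the two-variable AM-GM.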

Alternate Proof by Convexity

This proof is due to G. Pólya.

Note that the function $f:x \mapsto e^x$ is strictly convex. Let $g(x)$ be the line tangent to $f$ at $(0,1)$; then $g(x) = x+1$. Since $f$ is strictly convex, its graph lies above every tangent line except at the point of tangency, so $f(x) \ge g(x)$ for all $x$, with equality exactly when $x=0$, i.e., \[1+x \le e^x,\] with equality exactly when $x=0$.
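
For example, at $x = 1$ the bound reads $2 \leq e \approx 2.718$, and at $x = -\tfrac{1}{2}$ it reads $\tfrac{1}{2} \leq e^{-1/2} \approx 0.607$; only at $x = 0$ do the two sides agree.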

Now, set \[r_i = a_i/\biggl( \sum_{j=1}^n \lambda_j a_j \biggr) - 1,\] for all integers $1\le i \le n$. Our earlier bound tells us that \[r_i +1 \le \exp(r_i),\] and since both sides are nonnegative and each $\lambda_i$ is positive, raising both sides to the power $\lambda_i$ preserves the inequality, so \[(r_i +1)^{\lambda_i} \le \exp(\lambda_i r_i) .\] Multiplying $n$ such inequalities gives us

\[\prod_{i=1}^n (r_i + 1)^{\lambda_{i}} \le \prod_{i=1}^n \exp \lambda_i r_i\]

Evaluating the left-hand side:

\[\prod_{i=1}^n (r_i + 1)^{\lambda_{i}} = \frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \bigl(\sum_{j=1}^n \lambda_j a_j\bigr)^{\sum_{i=1}^n \lambda_i} } = \frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \sum_{j=1}^n \lambda_j a_j } ,\]

for

\[\sum_{i=1}^n \lambda_i = 1 .\]

Evaluating the right-hand side:

\[\prod_{i=1}^n \exp \lambda_i r_i = \prod_{i=1}^n \exp \left(\frac{a_i \lambda_i}{ \sum_{j=1}^n \lambda_j a_j} - \lambda_i\right) = \exp \left(\frac{\sum_{i=1}^n a_i \lambda_i}{ \sum_{j=1}^n \lambda_j a_j} - \sum_{i=1}^n \lambda_i\right) = \exp 0 = 1 .\]

Substituting the results for the left and right sides:

\[\frac{\prod_{i=1}^n a_i^{\lambda_i} }{ \sum_{j=1}^n \lambda_j a_j } \le 1\]

\[\prod_{i=1}^n a_i^{\lambda_i} \le \sum_{j=1}^n \lambda_j a_j = \sum_{i=1}^n \lambda_i a_i ,\]

as desired. $\blacksquare$