Mock AIME I 2012 Problems/Problem 12

Latest revision as of 00:56, 25 November 2016

Problem

Let $P(x)$ be a polynomial of degree 10 satisfying $P(x^2) = P(x)P(x-1)$. Find the maximum possible sum of the coefficients of $P(x)$.

Solution

Notice that if $a$ is a root of $P$, then setting $x = a$ shows that $a^2$ is a root of $P$, and setting $x = a + 1$ shows that $(a + 1)^2$ is a root of $P$. Continuing this, $a^{2^n}$ and $(a + 1)^{2^n}$ must be roots of $P$ for every $n \ge 1$. Since a polynomial has only finitely many roots, $a$ and $a + 1$ must both be roots of unity so that these two sets contain finitely many elements. (Neither can be $0$: if $0$ or $-1$ were a root, then $1$ would be a root, and iterating $x \mapsto (x+1)^2$ from $1$ would produce the infinitely many roots $4, 25, 676, \dots$.) The only points on the unit circle whose real parts differ by $1$ are $a = -1/2 \pm i\sqrt{3}/2$ and $a + 1 = 1/2 \pm i\sqrt{3}/2$, so $a$ must be a primitive cube root of unity. The roots generated above then all lie in $\{-1/2 + i\sqrt{3}/2, -1/2 - i\sqrt{3}/2\}$, whose minimal polynomial is $x^2 + x + 1$. Since $(x^2 + x + 1)(x^2 - x + 1) = x^4 + x^2 + 1$, any power of this base polynomial satisfies the given equation, and degree $10$ forces $P(x) = (x^2 + x + 1)^5$. The sum of the coefficients is $P(1) = 3^5 = \boxed{243}$.
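As a sanity check, the identity $P(x^2) = P(x)P(x-1)$ for $P(x) = (x^2+x+1)^5$ can be verified directly by polynomial arithmetic. The sketch below uses plain Python with coefficient lists (the helper names are my own, not from any library):

```python
# Check that P(x) = (x^2 + x + 1)^5 satisfies P(x^2) = P(x) * P(x - 1),
# and compute its sum of coefficients P(1).
# Polynomials are coefficient lists, lowest degree first.

def poly_mul(p, q):
    """Product of two coefficient lists."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def poly_pow(p, n):
    """p raised to the n-th power (n >= 0)."""
    out = [1]
    for _ in range(n):
        out = poly_mul(out, p)
    return out

def poly_shift(p):
    """Coefficients of p(x - 1), by expanding each a_k * (x - 1)^k."""
    out = [0] * len(p)
    for k, a in enumerate(p):
        for j, b in enumerate(poly_pow([-1, 1], k)):
            out[j] += a * b
    return out

P = poly_pow([1, 1, 1], 5)        # (1 + x + x^2)^5, degree 10

# Substitute x -> x^2: the coefficient of x^i moves to x^(2i).
P_of_x2 = [0] * (2 * len(P) - 1)
for i, a in enumerate(P):
    P_of_x2[2 * i] = a

assert P_of_x2 == poly_mul(P, poly_shift(P))   # P(x^2) = P(x) P(x - 1)
print(sum(P))                                  # sum of coefficients = P(1) -> 243
```

The check succeeds because $P(x)P(x-1) = \left[(x^2+x+1)(x^2-x+1)\right]^5 = (x^4+x^2+1)^5 = P(x^2)$.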