2013 AMC 10B Problems/Problem 19


Problem

The real numbers $c,b,a$ form an arithmetic sequence with $a\ge b\ge c\ge 0$. The quadratic $ax^2+bx+c$ has exactly one root. What is this root?

$\textbf{(A)}\ -7-4\sqrt{3}\qquad\textbf{(B)}\ -2-\sqrt{3}\qquad\textbf{(C)}\ -1\qquad\textbf{(D)}\ -2+\sqrt{3}\qquad\textbf{(E)}\ -7+4\sqrt{3}$


Solutions

Solution 1

Since $ax^2+bx+c=0$ has exactly one real root, the discriminant is zero, so $b^2=4ac$. Because $c, b, a$ form an arithmetic sequence, $b-c=a-b$, or $b=\frac{a+c}{2}$. The unique root is $-\frac{b}{2a}$ (again because the discriminant is $0$), and from $b^2=4ac$ we get $-\frac{b}{2a}=-\frac{2c}{b}$. Ignoring the negative sign for now and substituting $b=\frac{a+c}{2}$, \[\frac{2c}{b}=\frac{2c}{\frac{a+c}{2}}=\frac{4c}{a+c}=\frac{1}{\frac{a+c}{4c}}=\frac{1}{\frac{a}{4c}+\frac{1}{4}}.\] It remains to find $\frac{a}{c}$. Plugging $b=\frac{a+c}{2}$ into $b^2=4ac$ gives $a^2+2ac+c^2=16ac$, or $a^2-14ac+c^2=0$; dividing by $c^2$ yields $\left(\frac{a}{c}\right)^2-14\left(\frac{a}{c}\right)+1=0$, so $\frac{a}{c}=\frac{14\pm\sqrt{192}}{2}=7\pm 4\sqrt{3}$. Since $7-4\sqrt{3}<1$ would violate the condition $a\ge c$, we must have $\frac{a}{c}=7+4\sqrt{3}$. Plugging this in, $\frac{1}{\frac{a}{4c}+\frac{1}{4}}=\frac{1}{2+\sqrt{3}}=2-\sqrt{3}$, and the root is the negative of this, namely $-2+\sqrt{3}$, so the answer is $\boxed{\textbf{(D)}}.$
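As a quick sanity check (a concrete instance, not part of the argument above), take $c=1$, so that $a=7+4\sqrt{3}$ and $b=\frac{a+c}{2}=4+2\sqrt{3}$. Then $b^2=28+16\sqrt{3}=4ac$, so the discriminant is indeed zero, and \[-\frac{b}{2a}=-\frac{4+2\sqrt{3}}{14+8\sqrt{3}}=-\frac{2+\sqrt{3}}{7+4\sqrt{3}}=-(2+\sqrt{3})(7-4\sqrt{3})=-2+\sqrt{3},\] matching the answer.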

Solution 2

Note that we can divide the polynomial by $a$ to make the leading coefficient $1$, since dividing does not change the roots or the fact that the coefficients form an arithmetic sequence. Because there is exactly one root, the resulting monic quadratic must be of the form $(x-r)^2 = x^2 - 2rx + r^2$, where $1 \ge -2r \ge r^2 \ge 0$. Now we use the fact that the coefficients are in arithmetic progression: in any three-term arithmetic sequence, the middle term is the average of the outer two. Thus $-2r=\frac{r^2+1}{2}$, so $r^2+4r+1=0$ and $r = -2 \pm \sqrt{3}$. Since $r^2 \le 1$, we must have $|r| \le 1$, which eliminates $-2 - \sqrt{3}$ and leaves $\boxed{\textbf{(D)} -2 + \sqrt{3}}$ as the answer.
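As a check (the explicit numbers are not stated in the solution above), for $r=-2+\sqrt{3}$ the coefficients of $(x-r)^2$ are \[1,\qquad -2r = 4-2\sqrt{3},\qquad r^2 = 7-4\sqrt{3},\] and both consecutive differences equal $3-2\sqrt{3}$, so they do form an arithmetic sequence satisfying $1 \ge -2r \ge r^2 \ge 0$.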

Solution 3

Given that $ax^2+bx+c=0$ has only one real root, the discriminant must equal $0$, so $b^2=4ac$, and the root of the quadratic is $r=\frac{-b}{2a}$. We are also given that the coefficients of the quadratic are in arithmetic progression with $a \ge b \ge c \ge 0$. Letting the common difference be $d$, we have $a=b+d$ and $c=b-d$. Plugging these into $b^2=4ac$ gives $b^2=4(b+d)(b-d)=4b^2-4d^2$, which yields $3b^2=4d^2$. Solving for $d$ (taking the nonnegative root, since $a\ge b$), we have $d=\frac{b\sqrt{3}}{2}$. Substituting this into $a=b+d$ gives $a=b+\frac{b\sqrt{3}}{2}=b\left(1+\frac{\sqrt{3}}{2}\right)$. Finally, substituting into $r=\frac{-b}{2a}$, we have $r=\frac{-b}{2b\left(1+\frac{\sqrt{3}}{2}\right)}=\frac{-1}{2+\sqrt{3}}=-2+\sqrt{3}$. The answer is: \[\boxed{\textbf{(D)}}.\]
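For completeness (the rationalization is skipped above), the last equality follows from \[\frac{-1}{2+\sqrt{3}}=\frac{-1}{2+\sqrt{3}}\cdot\frac{2-\sqrt{3}}{2-\sqrt{3}}=\frac{-(2-\sqrt{3})}{4-3}=-2+\sqrt{3}.\]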

Solution 4

Let the double root be $r$. Then by the arithmetic progression and Vieta's,\begin{align*}a-b & =b-c\\ 1-\frac{b}a & =\frac{b}a-\frac{c}a\\ 1+2r & =-2r-r^2\\ r^2+4r+1 & =0\\ r & =-2\pm\sqrt{3}\end{align*}
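Here Vieta's formulas for the double root $r$ of the monic quadratic $x^2+\frac{b}{a}x+\frac{c}{a}=(x-r)^2$ give \[\frac{b}{a}=-2r,\qquad \frac{c}{a}=r^2,\] which is the substitution made between the second and third lines above.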

We see that $0\le b\le a$ gives $0\le \frac{b}{a}\le 1$, so we want $0\le -2r\le 1$. Since $-2(-2-\sqrt{3})=4+2\sqrt{3}>1$ while $0 \le -2(-2+\sqrt{3})=4-2\sqrt{3}\le 1$, we conclude that $r=-2+\sqrt{3}$, so the answer is: \[\boxed{\textbf{(D)}}.\]

See also

2013 AMC 10B (Problems • Answer Key • Resources)
Preceded by Problem 18
Followed by Problem 20
All AMC 10 Problems and Solutions

The problems on this page are copyrighted by the Mathematical Association of America's American Mathematics Competitions.