== Contributions ==
 
 
*  Created [[Chain Rule]] article
 
*  Created [[Fundamental Theorem of Calculus]] article
 
 
 
 
Revision as of 16:58, 26 March 2011

== Random Math Problems ==

The following problems are part of a discussion I'm having with someone I know. Please don't comment on them, and most importantly please don't correct any errors or give any hints about better solutions.


Suppose you have a fair six-sided die and you're going to roll the die again and again indefinitely. What's the expected number of rolls until a $1$ comes up on the die?


The probability that it will take one roll is $\frac{1}{6}$.

The probability that it will take two rolls is $\left(\frac56 \right)\left(\frac16 \right)$.

The probability that it will take three rolls is $\left(\frac56 \right)^2 \left(\frac16 \right)$.

The probability that it will take four rolls is $\left(\frac56 \right)^3 \left(\frac16 \right)$.

And so on.

So, the expected number of rolls that it will take to get a $1$ is:

$1\cdot \frac{1}{6} + 2\cdot \left(\frac56 \right)\left(\frac16 \right) + 3\cdot \left(\frac56 \right)^2 \left(\frac16 \right) + 4 \cdot \left(\frac56 \right)^3 \left(\frac16 \right) + \cdots$.

What's the sum of this infinite series? It looks kind of like an infinite geometric series, but not exactly. Factoring out a $\frac16$ makes it look a bit more like an infinite geometric series:

$\left(\frac16 \right) \left(1 + 2\cdot \left(\frac56 \right) + 3\cdot \left(\frac56 \right)^2 + 4 \cdot \left(\frac56 \right)^3 + \cdots \right)$

This is similar to a geometric series, which we know how to sum. But we have those annoying factors of $2$, $3$, $4$, etc. to worry about. Maybe we could play around with the formula for a geometric series to get a formula for this series. The formula for a geometric series is:

$1 + r + r^2 + r^3 + r^4 + \cdots = \frac{1}{1-r}$ for $|r| < 1$.

Differentiating both sides of this with respect to $r$, we find:

$1 + 2r + 3r^2 + 4r^3 + \cdots = -(1-r)^{-2}(-1) = \frac{1}{(1-r)^2}$.
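As a quick numeric sanity check (not part of the original argument), a few lines of Python comparing a large partial sum of $1 + 2r + 3r^2 + \cdots$ against $\frac{1}{(1-r)^2}$ at $r = \frac56$:

```python
def arithmetico_geometric_sum(r, terms=10_000):
    """Partial sum of sum_{n>=1} n * r**(n-1), which should approach
    1/(1-r)**2 when |r| < 1."""
    return sum(n * r ** (n - 1) for n in range((1), terms + 1))

r = 5 / 6
approx = arithmetico_geometric_sum(r)
exact = 1 / (1 - r) ** 2  # = 36 when r = 5/6
```

With 10,000 terms the remaining tail is negligibly small, so the partial sum agrees with the closed form to many decimal places.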

So, we find that

$\left(\frac16 \right) \left(1 + 2\cdot \left(\frac56 \right) + 3\cdot \left(\frac56 \right)^2 + 4 \cdot \left(\frac56 \right)^3 + \cdots \right) = \frac16 \frac{1}{(1-\frac56)^2} = \frac16 (36) = 6$.

This seems like a very reasonable answer, given that the die has six sides.



What's the expected number of rolls it will take in order to get all six numbers at least once?



As a first step, let's find out the expected number of rolls it will take in order to get both $1$ and $2$ at least once.

Let $n \ge 2$ be an integer. What's the probability that it will take $n$ rolls to get a $1$ and a $2$?

Consider two different events: Event $OneFirst$ is the event that the $n^{th}$ roll is a $2$, and the first $n-1$ rolls give at least one $1$ and also no $2$'s. Event $TwoFirst$ is the event that the $n^{th}$ roll is a $1$, and the first $n-1$ rolls give at least one $2$ and also no $1$'s. The probability that it will take $n$ rolls to get a $1$ and a $2$ is $P(OneFirst)+ P(TwoFirst)$.

What is $P(OneFirst)$?

Let $A$ be the event that the $n^{th}$ roll is a $2$. Let $B$ be the event that the first $n-1$ rolls give at least one $1$. Let $C$ be the event that the first $n-1$ rolls give no $2$'s.

$P(OneFirst)=P(A \cap B \cap C)$.

Event $A$ is independent of the event $B \cap C$, so:

$P(A \cap (B \cap C)) = P(A)P(B \cap C)$.

$P(A)$ is $\frac16$, but what is $P(B \cap C)$?

Events $B$ and $C$ aren't independent. But we can say this:

$P(B \cap C)= P(C)P(B|C)= P(C)\left(1-P(B^c|C)\right)$. (Here $B^c$ is the complement of the event $B$. I have used the fact that $P(X|Y)+P(X^c|Y)=1$.)

$P(C) = \left(\frac56 \right)^{n-1}$.

$P(B^c|C) = \frac{P(B^c \cap C)}{P(C)}= \frac{(\frac46)^{n-1}}{(\frac56)^{n-1}}$.

Thus, $P(B \cap C)= \left(\frac56 \right)^{n-1} \left(1- \frac{ (\frac46)^{n-1} }{ (\frac56)^{n-1} } \right) = \left(\frac56 \right)^{n-1} - \left(\frac46 \right)^{n-1}$.

So $P(A \cap (B \cap C) ) = \left( \frac16 \right) \left( \left(\frac56 \right)^{n-1} - \left(\frac46 \right)^{n-1} \right)$.

We have found $P(OneFirst)$. We're close to finding the probability that it will take $n$ rolls to get both a $1$ and a $2$.

$P(TwoFirst) = P(OneFirst)$. Thus the probability that it will take $n$ rolls to get both a $1$ and a $2$ is $P(OneFirst)+ P(TwoFirst) = \left( \frac13 \right) \left( \left(\frac56 \right)^{n-1} - \left(\frac46 \right)^{n-1} \right)$.
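One quick consistency check on this formula (an addition of mine, not in the original): the probabilities $\left( \frac13 \right) \left( \left(\frac56 \right)^{n-1} - \left(\frac46 \right)^{n-1} \right)$ for $n = 2, 3, \ldots$ should sum to $1$:

```python
def p_both_in_n(n):
    """Derived probability that the roll completing {1, 2} is roll n (n >= 2)."""
    return (1 / 3) * ((5 / 6) ** (n - 1) - (4 / 6) ** (n - 1))

# Sum enough terms that the geometric tails are negligible.
total = sum(p_both_in_n(n) for n in range(2, 2000))
# `total` should be essentially 1, as a probability distribution must be
```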

Okay.....

So what's the expected number of rolls it will take to get both a $1$ and a $2$?

It is:

${2 \cdot \left( \frac13 \right) \left( \left(\frac56 \right)^1 - \left(\frac46 \right)^1 \right) + 3 \cdot \left( \frac13 \right) \left( \left(\frac56 \right)^2 - \left(\frac46 \right)^2 \right) + 4 \cdot \left( \frac13 \right) \left( \left(\frac56 \right)^3 - \left(\frac46 \right)^3 \right) + \cdots}$

$= \left( \frac13 \right) \left( 2 \left(\frac56 \right)^1 + 3 \left(\frac56 \right)^2 + 4 \left(\frac56 \right)^3 + \cdots \right) - \left( \frac13 \right) \left( 2\left(\frac46 \right)^1 + 3 \left(\frac46 \right)^2 + 4 \left(\frac46 \right)^3 + \cdots \right)$

$= \left( \frac13 \right) \left( -1 + 1 + 2 \left(\frac56 \right)^1 + 3 \left(\frac56 \right)^2 + 4 \left(\frac56 \right)^3 + \cdots \right) - \left( \frac13 \right) \left(-1 + 1 +  2\left(\frac46 \right)^1 + 3 \left(\frac46 \right)^2 + 4 \left(\frac46 \right)^3 + \cdots \right)$

$= \left( \frac13 \right) \left( -1 + \frac{1}{(1-\frac56)^2} \right) - \left( \frac13 \right)  \left( -1 + \frac{1}{(1-\frac46)^2} \right)$

$= \left( \frac13 \right) ( -1 + 36 + 1 - 9 ) = \left( \frac13 \right) (27) = 9$.

The expected number of rolls is $9$.
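The value $9$ can likewise be checked by simulation. A minimal sketch in Python (my addition; the helper name is hypothetical):

```python
import random

def rolls_until_all(targets, rng):
    """Roll a fair six-sided die until every face in `targets` has
    appeared at least once; return the number of rolls."""
    seen = set()
    count = 0
    while not seen >= targets:
        count += 1
        face = rng.randint(1, 6)
        if face in targets:
            seen.add(face)
    return count

rng = random.Random(1)  # fixed seed for reproducibility
trials = 100_000
average = sum(rolls_until_all({1, 2}, rng) for _ in range(trials)) / trials
# `average` should land close to the exact answer, 9
```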


Using a similar (but slightly more complicated) method, I get that the expected number of rolls until all six numbers appear is $14.7$.

I also found that the expected number of rolls until $1$, $2$, and $3$ appear is $11$.
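Both figures can be reproduced by inclusion-exclusion on the maximum of the individual waiting times (one way to do it; not necessarily the method used above). For a target set of $m$ faces, the expected number of rolls is $\sum_{k=1}^{m} (-1)^{k+1} \binom{m}{k} \frac{6}{k}$:

```python
from math import comb

def expected_rolls(m, faces=6):
    """Expected rolls of a fair `faces`-sided die until m specified faces
    have each appeared at least once, via inclusion-exclusion on the
    maximum of the m individual (geometric) waiting times."""
    return sum((-1) ** (k + 1) * comb(m, k) * faces / k
               for k in range(1, m + 1))

# expected_rolls(2) gives 9, expected_rolls(3) gives 11,
# and expected_rolls(6) gives 14.7, matching the values above.
```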