== Personal info ==

Name: Daniel O'Connor

(full name: Daniel Verity O'Connor)

Location: Los Angeles

== Contributions ==

* Created [[Chain Rule]] article
* Created [[Fundamental Theorem of Calculus]] article

== Random Math Problems ==

The following problems are part of a discussion I'm having with someone I know. Please don't comment about them, and most importantly please don't correct any errors or give any hints about better solutions.

----
Suppose you have a fair six-sided die and you're going to roll the die again and again indefinitely. What's the expected number of rolls until a <math>1</math> comes up on the die?
----

The probability that it will take one roll is <math>\frac{1}{6}</math>.

The probability that it will take two rolls is <math>\left(\frac56 \right)\left(\frac16 \right)</math>.

The probability that it will take three rolls is <math>\left(\frac56 \right)^2 \left(\frac16 \right)</math>.

The probability that it will take four rolls is <math>\left(\frac56 \right)^3 \left(\frac16 \right)</math>.

And so on.

So, the expected number of rolls that it will take to get a <math>1</math> is:

<math>1\cdot \frac{1}{6} + 2\cdot \left(\frac56 \right)\left(\frac16 \right) + 3\cdot \left(\frac56 \right)^2 \left(\frac16 \right) + 4 \cdot \left(\frac56 \right)^3 \left(\frac16 \right) + \cdots</math>.

What's the sum of this infinite series? It looks kind of like an infinite geometric series, but not exactly. Factoring out a <math>\frac16</math> makes it look a bit more like an infinite geometric series:

<math>\left(\frac16 \right) \left(1 + 2\cdot \left(\frac56 \right) + 3\cdot \left(\frac56 \right)^2 + 4 \cdot \left(\frac56 \right)^3 + \cdots \right)</math>

This is similar to a geometric series, which we know how to sum. But we have those annoying factors of <math>2</math>, <math>3</math>, <math>4</math>, etc. to worry about. Maybe we could play around with the formula for a geometric series to get a formula for this series. The formula for a geometric series is:

<math>1 + r + r^2 + r^3 + r^4 + \cdots = \frac{1}{1-r}</math>, which is valid for <math>|r|<1</math>.

Differentiating both sides of this with respect to <math>r</math>, we find:

<math>1 + 2r + 3r^2 + 4r^3 + \cdots = -(1-r)^{-2}(-1) = \frac{1}{(1-r)^2}</math>.

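As a quick numerical sanity check of that last identity (just a rough sketch on my part, not part of the argument; the variable names are arbitrary), the partial sums of the left-hand side can be compared against the closed form:

<syntaxhighlight lang="python">
# Check numerically that 1 + 2r + 3r^2 + 4r^3 + ... approaches 1/(1-r)^2.
r = 5 / 6

partial_sum = sum((k + 1) * r**k for k in range(1000))  # first 1000 terms
closed_form = 1 / (1 - r) ** 2

print(partial_sum)  # approximately 36.0
print(closed_form)  # 36.0
</syntaxhighlight>
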
So, we find that

<math>\left(\frac16 \right) \left(1 + 2\cdot \left(\frac56 \right) + 3\cdot \left(\frac56 \right)^2 + 4 \cdot \left(\frac56 \right)^3 + \cdots \right) = \frac16 \cdot \frac{1}{(1-\frac56)^2} = \frac16 (36) = 6</math>,

which seems like a very reasonable answer, given that the die has six sides.

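The answer of <math>6</math> can also be checked by simulation. Here's a rough Monte Carlo sketch (the function name, seed, and trial count are arbitrary choices of mine):

<syntaxhighlight lang="python">
import random

def rolls_until_one(rng):
    """Roll a fair six-sided die until a 1 appears; return how many rolls it took."""
    count = 0
    while True:
        count += 1
        if rng.randint(1, 6) == 1:
            return count

rng = random.Random(0)
trials = 100_000
average = sum(rolls_until_one(rng) for _ in range(trials)) / trials
print(average)  # should come out close to 6
</syntaxhighlight>
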
----
What's the expected number of rolls it will take in order to get all six numbers at least once?
----

As a first step, let's find out the expected number of rolls it will take in order to get both <math>1</math> and <math>2</math> at least once.

Let <math>n \ge 2</math> be an integer. What's the probability that it will take exactly <math>n</math> rolls to get a <math>1</math> and a <math>2</math>?

Consider two different events: Event <math>OneFirst</math> is the event that the <math>n^{th}</math> roll is a <math>2</math>, and the first <math>n-1</math> rolls give at least one <math>1</math> and also no <math>2</math>'s. Event <math>TwoFirst</math> is the event that the <math>n^{th}</math> roll is a <math>1</math>, and the first <math>n-1</math> rolls give at least one <math>2</math> and also no <math>1</math>'s. The probability that it will take <math>n</math> rolls to get a <math>1</math> and a <math>2</math> is <math>P(OneFirst)+ P(TwoFirst)</math>.

What is <math>P(OneFirst)</math>?

Let <math>A</math> be the event that the <math>n^{th}</math> roll is a <math>2</math>. Let <math>B</math> be the event that the first <math>n-1</math> rolls give at least one <math>1</math>. Let <math>C</math> be the event that the first <math>n-1</math> rolls give no <math>2</math>'s.

<math>P(OneFirst)=P(A \cap B \cap C)</math>.

Event <math>A</math> is independent of the event <math>B \cap C</math>, so:

<math>P(A \cap (B \cap C)) = P(A)P(B \cap C)</math>.

<math>P(A)</math> is <math>\frac16</math>, but what is <math>P(B \cap C)</math>?

Events <math>B</math> and <math>C</math> aren't independent. But we can say this:

<math>P(B \cap C)= P(C)P(B|C)= P(C)\left(1-P(B^c|C)\right)</math>. (Here <math>B^c</math> is the complement of the event <math>B</math>. I have used the fact that <math>P(X|Y)+P(X^c|Y)=1</math>.)

<math>P(C) = \left(\frac56 \right)^{n-1}</math>.

<math>P(B^c|C) = \frac{P(B^c \cap C)}{P(C)}= \frac{(\frac46)^{n-1}}{(\frac56)^{n-1}}</math>. (The event <math>B^c \cap C</math> says that the first <math>n-1</math> rolls contain no <math>1</math>'s and no <math>2</math>'s, which has probability <math>\left(\frac46 \right)^{n-1}</math>.)

Thus, <math>P(B \cap C)= \left(\frac56 \right)^{n-1} \left(1- \frac{ (\frac46)^{n-1} }{ (\frac56)^{n-1} } \right) = \left(\frac56 \right)^{n-1} - \left(\frac46 \right)^{n-1}</math>.

So <math>P(A \cap (B \cap C) ) = \left( \frac16 \right) \left( \left(\frac56 \right)^{n-1} - \left(\frac46 \right)^{n-1} \right)</math>.

We have found <math>P(OneFirst)</math>. We're close to finding the probability that it will take <math>n</math> rolls to get both a <math>1</math> and a <math>2</math>.

By the symmetry between <math>1</math> and <math>2</math>, <math>P(TwoFirst) = P(OneFirst)</math>. Thus the probability that it will take <math>n</math> rolls to get both a <math>1</math> and a <math>2</math> is <math>P(OneFirst)+ P(TwoFirst) = \left( \frac13 \right) \left( \left(\frac56 \right)^{n-1} - \left(\frac46 \right)^{n-1} \right)</math>.

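As a sanity check on this formula (a rough sketch only; the helper names are mine and the trial count is arbitrary), these probabilities should total <math>1</math> and should roughly match the frequencies seen in a simulation:

<syntaxhighlight lang="python">
import random

def p_n(n):
    """P(it takes exactly n rolls to see both a 1 and a 2), per the formula above."""
    return (1 / 3) * ((5 / 6) ** (n - 1) - (4 / 6) ** (n - 1))

# The probabilities over n = 2, 3, 4, ... should total (very nearly) 1.
print(sum(p_n(n) for n in range(2, 500)))

def rolls_until_1_and_2(rng):
    """Roll a fair six-sided die until both a 1 and a 2 have appeared."""
    seen = set()
    count = 0
    while not {1, 2} <= seen:
        count += 1
        seen.add(rng.randint(1, 6))
    return count

rng = random.Random(0)
trials = 100_000
results = [rolls_until_1_and_2(rng) for _ in range(trials)]
for n in (2, 3, 4, 5):
    print(n, p_n(n), results.count(n) / trials)  # formula vs. observed frequency
</syntaxhighlight>
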
Okay...

So what's the expected number of rolls it will take to get both a <math>1</math> and a <math>2</math>?

It is:

<math>{2 \cdot \left( \frac13 \right) \left( \left(\frac56 \right)^1 - \left(\frac46 \right)^1 \right) + 3 \cdot \left( \frac13 \right) \left( \left(\frac56 \right)^2 - \left(\frac46 \right)^2 \right) + 4 \cdot \left( \frac13 \right) \left( \left(\frac56 \right)^3 - \left(\frac46 \right)^3 \right) + \cdots}</math>

<math> = \left( \frac13 \right) \left( 2 \left(\frac56 \right)^1 + 3 \left(\frac56 \right)^2 + 4 \left(\frac56 \right)^3 + \cdots \right) - \left( \frac13 \right) \left( 2\left(\frac46 \right)^1 + 3 \left(\frac46 \right)^2 + 4 \left(\frac46 \right)^3 + \cdots \right) </math>

<math>= \left( \frac13 \right) \left( -1 + 1 + 2 \left(\frac56 \right)^1 + 3 \left(\frac56 \right)^2 + 4 \left(\frac56 \right)^3 + \cdots \right) - \left( \frac13 \right) \left(-1 + 1 + 2\left(\frac46 \right)^1 + 3 \left(\frac46 \right)^2 + 4 \left(\frac46 \right)^3 + \cdots \right)</math>

<math>= \left( \frac13 \right) \left( -1 + \frac{1}{(1-\frac56)^2} \right) - \left( \frac13 \right) \left( -1 + \frac{1}{(1-\frac46)^2} \right) </math>

<math>= \left( \frac13 \right) ( -1 + 36 + 1 - 9 ) = \left( \frac13 \right) (27) = 9</math>.

The expected number of rolls is <math>9</math>.

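As a quick check of that arithmetic (again just a sketch of mine, not part of the argument above), a partial sum of the series gets very close to <math>9</math>:

<syntaxhighlight lang="python">
# Partial sum of  sum over n >= 2  of  n * (1/3) * ((5/6)^(n-1) - (4/6)^(n-1)).
expected = sum(
    n * (1 / 3) * ((5 / 6) ** (n - 1) - (4 / 6) ** (n - 1))
    for n in range(2, 500)
)
print(expected)  # approximately 9.0
</syntaxhighlight>
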
Using a similar (but a little more complicated) method, I get that the expected number of rolls until all six numbers appear is <math>14.7</math>.

I also found that the expected number of rolls until <math>1</math>, <math>2</math>, and <math>3</math> appear is <math>11</math>.

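Both of these figures can be checked by simulation. Here's a rough Monte Carlo sketch (the function name, seed, and trial count are arbitrary choices of mine):

<syntaxhighlight lang="python">
import random

def rolls_until_all_seen(targets, rng):
    """Roll a fair six-sided die until every face in `targets` has appeared at least once."""
    seen = set()
    count = 0
    while not targets <= seen:
        count += 1
        seen.add(rng.randint(1, 6))
    return count

rng = random.Random(0)
trials = 100_000
for targets in ({1, 2, 3}, {1, 2, 3, 4, 5, 6}):
    average = sum(rolls_until_all_seen(targets, rng) for _ in range(trials)) / trials
    print(sorted(targets), average)  # should be close to 11 and 14.7 respectively
</syntaxhighlight>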