Expected value


Given an event with a variety of possible outcomes, the expected value is the average outcome one should expect if the event were repeated many times. Note that this is not the same as the "most likely outcome."

For example, flipping a fair coin has two possible outcomes, heads (denoted here by $H$) or tails ($T$). If we flip a fair coin repeatedly, we expect to get about the same number of heads as tails, each about half of the total number of flips. Thus, the average outcome is $\frac 12 H + \frac 12 T$. Note that not only is this not the most likely outcome, it is not even a possible outcome for a single flip.
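As another illustration, consider rolling a fair six-sided die. Each face $1, 2, \ldots, 6$ appears with probability $\frac 16$, so the average outcome over many rolls is $\frac{1+2+3+4+5+6}{6} = \frac{21}{6} = 3.5$, which, like the coin example, is not itself a possible outcome of a single roll.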


More formally, we can define expected value as follows: if a random variable $Z$ has a discrete probability distribution, its expected value is $E(Z) = \sum_z P(z) \cdot z$, where the sum runs over all possible outcomes $z$ and $P(z)$ is the probability of the outcome $z$. If $Z$ has a continuous probability distribution with density function $f$, then $E(Z) = \int f(z)\cdot z\ dz$.
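For instance, applying the discrete formula to the coin flip above (assigning the value $1$ to heads and $0$ to tails, one common numerical encoding) gives $E(Z) = \frac 12 \cdot 1 + \frac 12 \cdot 0 = \frac 12$. For a continuous example, if $Z$ is uniformly distributed on $[0,1]$, so that $f(z) = 1$ on that interval, then $E(Z) = \int_0^1 z\ dz = \frac 12$.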

Example Problems


This article is a stub. Help us out by expanding it.