The maximal variance.

by 3ch03s, Aug 11, 2015, 9:23 PM

Earlier this year I found something interesting: this post on AoPS.

What about a random variable with support in [0,1]?

Well, there are many distributions supported on [0,1]: some well known (the Beta family, Bernoulli trials, and a long etcetera), some very strange (like the rm distribution in my first blog entry), each with its own mean and variance. Can we find a distribution with an arbitrary prescribed mean and variance? The answer is no. In this entry we find several interesting facts about the mean and variance of any distribution on [0,1].

First, why isn't every mean-variance pair possible for this kind of distribution?

The answer is pretty simple. For a random variable $X$ distributed on [0,1], all moments exist (and so does its mgf); in particular the first and second moments, which determine the mean and variance.

Since $x^q \leq x^p$ for $0 \leq x \leq 1$ and naturals $p < q$, it is easy to show that $E[X^q] \leq E[X^p]$ for this kind of distribution; in particular, setting $p=1$ and $q=2$ gives $E[X^2] \leq E[X]$. The variance is defined by $V[X]=E[X^2]-E[X]^2$, so how do we use the preceding fact? Just subtract $E[X]^2$ from both sides:
$E[X^2] \leq E[X]$
$E[X^2]-E[X]^2 \leq E[X]-E[X]^2$
hence $V[X] \leq E[X](1-E[X])$.
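
As a quick numerical sanity check (not a proof; the particular distributions below are just convenient choices of mine), a few lines of Python with NumPy estimate the mean and variance of several distributions on [0,1] and compare the variance against $\mu(1-\mu)$:

```python
# Sanity check of V[X] <= E[X](1 - E[X]) for a few distributions on [0, 1].
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

samples = {
    "uniform":        rng.uniform(0, 1, n),
    "beta(2, 5)":     rng.beta(2, 5, n),
    "beta(0.3, 0.3)": rng.beta(0.3, 0.3, n),
    "bernoulli(0.7)": rng.binomial(1, 0.7, n).astype(float),
}

for name, x in samples.items():
    mu, var = x.mean(), x.var()
    bound = mu * (1 - mu)          # the parabola mu(1 - mu)
    print(f"{name:15s} mean={mu:.4f}  var={var:.4f}  bound={bound:.4f}  ok={var <= bound + 1e-6}")
```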

This simple fact shows that for a given mean $\mu$ in [0,1], the variance of any distribution $X$ on [0,1] is bounded by $\mu(1-\mu)$ (as a function of $\mu$, this bound is a parabola). Most importantly, since $\mu(1-\mu) \leq 1/4$ (a well-known inequality), the variance can never exceed $1/4$: this is the maximal variance. So a distribution with an arbitrary mean-variance pair is not possible unless the pair lies in the feasible set
$\Phi=\{(\mu,\sigma^2) \in [0,1]^2 : \sigma^2 \leq \mu(1-\mu)\}.$
More interestingly, if we look at the pair (mean, standard deviation), the feasible set is the half-disc of radius 1/2 centered at $(1/2,0)$: indeed $\sigma^2 \leq \mu(1-\mu)$ is the same as $(\mu-1/2)^2+\sigma^2 \leq (1/2)^2$.
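
Here is a small sketch of that geometric picture, assuming the closed-form Beta moments $\mu = a/(a+b)$ and $\sigma^2 = \mu(1-\mu)/(a+b+1)$; it checks that a few (mean, standard deviation) pairs lie within distance 1/2 of the center $(1/2,0)$, with a Bernoulli pair landing exactly on the boundary circle:

```python
# (mean, std) pairs must satisfy (mu - 1/2)^2 + sigma^2 <= (1/2)^2,
# i.e. lie in the upper half-disc of radius 1/2 centred at (1/2, 0).
import numpy as np

def beta_pair(a, b):
    mu = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))   # = mu(1-mu)/(a+b+1)
    return mu, np.sqrt(var)

def bernoulli_pair(mu):
    return mu, np.sqrt(mu * (1 - mu))            # variance exactly mu(1-mu)

for mu, sigma in [beta_pair(2, 5), beta_pair(0.2, 0.2), bernoulli_pair(0.3)]:
    radius = np.hypot(mu - 0.5, sigma)           # distance to the centre (1/2, 0)
    print(f"mu={mu:.3f}  sigma={sigma:.3f}  distance to (1/2, 0) = {radius:.3f}  (<= 0.5)")
```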

The main questions are: what happens with the distribution for a given pair in the feasible set? Does the proximity of the pair to the boundary of this set tell us something about the behavior of the distribution? Which distribution attains the maximal variance, and is it unique? What about the higher moments? How does this carry over to more general distributions with bounded support? And what happens when the variance is very close to the maximal curve?

We know that the distribution attaining the maximal variance is the Bernoulli distribution. It is often regarded as a "discrete" distribution, but from the point of view of measure theory it can also be regarded as a distribution concentrated at the points 0 and 1, in proportions given by the mean $\mu$ in [0,1]: mass $\mu$ at the point 1 and $1-\mu$ at the point 0 (that is, a simple trial giving 1 with probability $\mu$). Its variance is exactly $\mu(1-\mu)$, so it sits on the boundary of the feasible set. The uniqueness of the maximal-variance distribution can be proved with a Chebyshev-type inequality. If $|x-1/2|\leq h$ for a fixed $h<1/2$, a bit of algebra gives $x(1-x)=\frac 14-(x-1/2)^2\geq \frac 14-h^2$, so applying the Chebyshev/Markov argument to the nonnegative function $g(x)=x(1-x)$ we have:
$P[|X-1/2|\leq h]\leq \frac{E[X(1-X)]}{\frac 14-h^2}$
But $E[X(1-X)]=E[X]-E[X^2]=\mu(1-\mu)-\sigma^2$, so
$P[|X-1/2|\leq h]\leq \frac{\mu(1-\mu)-\sigma^2}{\frac 14-h^2}.$
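
A Monte Carlo check of this bound, using a Beta(0.1, 0.1) variable only because its variance is close to the maximum (otherwise the right-hand side exceeds 1 and says nothing):

```python
# Check P[|X - 1/2| <= h] <= (mu(1-mu) - sigma^2) / (1/4 - h^2) by simulation.
import numpy as np

rng = np.random.default_rng(1)
x = rng.beta(0.1, 0.1, 2_000_000)      # bimodal, variance near the maximum

mu, var = x.mean(), x.var()
for h in (0.1, 0.25, 0.4):
    lhs = np.mean(np.abs(x - 0.5) <= h)                # empirical probability
    rhs = (mu * (1 - mu) - var) / (0.25 - h * h)       # the bound
    print(f"h={h:.2f}  P[|X-1/2|<=h]={lhs:.4f}  bound={rhs:.4f}")
```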
The preceding inequality tells us that for a Bernoulli distribution, $P[|X-1/2|\leq h]=0$ for every $h<1/2$ (in fact this is an equivalence). So if there were another distribution, different from a Bernoulli, there would exist an $h<1/2$ such that $P[|X-1/2|\leq h]>0$; but if that distribution also attained the maximal variance $\sigma^2=\mu(1-\mu)$, the bound would force $P[|X-1/2|\leq h]=0$ for that same $h$, which is a contradiction.
Again, this inequality tells us something more about how the distribution behaves, because the probability of $X$ taking values around $1/2$ (or, more generally, in any measurable set $A$ at positive distance from the endpoints of $]0,1[$) is bounded by a quantity depending on the difference between the maximal variance and the actual variance, and on the maximal distance of the set from $x=1/2$. So as the variance gets closer to its maximum, this probability gets smaller and the mass becomes more concentrated at the extremes: the closer the variance is to its maximum, the more the distribution concentrates on 0 and 1.
In the limit, it converges to a Bernoulli distribution.
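
A last sketch of this concentration effect, using the symmetric family Beta(a, a), whose variance $1/(4(2a+1))$ approaches the maximal value $1/4$ as $a \to 0$; the mass left away from the endpoints shrinks along with the gap:

```python
# As the variance of Beta(a, a) approaches 1/4, the mass in the interior of
# [0, 1] vanishes and the law approaches a Bernoulli(1/2) on {0, 1}.
import numpy as np

rng = np.random.default_rng(2)
for a in (1.0, 0.3, 0.1, 0.03, 0.01):
    x = rng.beta(a, a, 1_000_000)
    var = x.var()                                 # tends to 0.25 as a -> 0
    middle = np.mean((x > 0.01) & (x < 0.99))     # mass away from the endpoints
    print(f"a={a:<5} var={var:.4f} (max 0.25)  P[0.01 < X < 0.99] = {middle:.4f}")
```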

To be continued...
