Hi addisonbr!!
I will start the answer by showing one important property of the Expected
Value of a random variable, E(X):
E(aX+b) = a*E(X) + b , where a and b are constants.
See "Properties of Expectation" at:
http://www-stat.stanford.edu/~susan/courses/s116/node100.html
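If it helps to see that property numerically, here is a small Python sketch
(the die example, the values of a and b, and the sample size are my own
choices for illustration, not something taken from the reference above):

    import random

    random.seed(0)
    a, b = 3.0, 5.0
    # X is a simple discrete random variable: the result of a fair die roll
    samples = [random.randint(1, 6) for _ in range(100000)]

    mean_x = sum(samples) / len(samples)                        # estimate of E(X)
    mean_axb = sum(a * x + b for x in samples) / len(samples)   # estimate of E(aX+b)

    print(mean_axb)          # roughly 15.5 for a fair die
    print(a * mean_x + b)    # the same number, i.e. a*E(X) + b

Both printed values agree, which is exactly what E(aX+b) = a*E(X) + b says.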
Regarding random variables, see the following simple definition:
"Random Variable:
The outcome of an experiment need not be a number, for example, the
outcome when a coin is tossed can be 'heads' or 'tails'. However, we
often want to represent outcomes as numbers. A random variable is a
function that associates a unique numerical value with every outcome
of an experiment. The value of the random variable will vary from
trial to trial as the experiment is repeated.
There are two types of random variable - discrete and continuous.
A random variable has either an associated probability distribution
(discrete random variable) or probability density function (continuous
random variable)."
From "Random variables and probability distributions":
http://www.stats.gla.ac.uk/steps/glossary/probability_distributions.html
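As a concrete illustration of that definition, here is a tiny Python sketch
(the 0/1 coding of 'heads'/'tails' is just one common choice of mine, not
something from the quoted page):

    import random

    def coin_toss():
        # A random variable: associates a number with each outcome
        outcome = random.choice(['heads', 'tails'])
        return 1 if outcome == 'heads' else 0

    print([coin_toss() for _ in range(10)])   # the value varies from trial to trial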
The above definition tells us that when you talk about a random
variable there is an associated distribution or density function, so
for a given random variable the expected value is a well-defined number,
denoted E(X) and usually called Mu (the Greek letter).
What I want to tell you with all of this is that in the demonstration
the term E(X) can be treated as a constant number: Mu.
Now it is easy:
1) Why does 2E(xE(x)) = 2(E(x))^2 ?
2*E(x*E(x)) = 2*E(x*Mu) =      [E(x) is the constant Mu]
            = 2*Mu*E(x) =      [using the property above with a = Mu and b = 0]
            = 2*E(x)*E(x) =
            = 2*(E(x))^2
2) Why does E((E(x))^2) = (E(x))^2 ?
E((E(x))^2) = E(Mu^2) =      [again, E(x) is the constant Mu]
            = Mu^2 =         [the expected value of a constant is that constant]
            = (E(x))^2
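If you would like a numerical sanity check of steps 1) and 2), this short
Python sketch (again just a simulation, with a die example and sample size
of my own choosing) computes both sides of each identity:

    import random

    random.seed(1)
    samples = [random.randint(1, 6) for _ in range(100000)]
    n = len(samples)
    mu = sum(samples) / n                     # E(x), a fixed number (Mu)

    # Step 1: 2*E(x*E(x)) versus 2*(E(x))^2
    print(2 * sum(x * mu for x in samples) / n, 2 * mu ** 2)

    # Step 2: E((E(x))^2) versus (E(x))^2 -- the average of a constant is the constant
    print(sum(mu ** 2 for _ in samples) / n, mu ** 2)

Each pair of numbers agrees (up to floating-point rounding), because inside
the expectations Mu really is just a constant.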
-------------------------------------------------------
Search strategy:
I did this based mainly on my own knowledge of these topics, but for
the online references I used the following search keys at Google.com:
random variable
expected value
variance
I hope that this helps you. Feel free to request a clarification
if you need it.
Regards,
livioflores-ga