(a): "only if" part: I'll assume X and Y are continuous here. The
discrete version is similar.
Suppose X and Y are independent with pdfs given by r(x) and s(y)
respectively and joint pdf p(x, y), and let f and g be functions from
R to R.
Then LHS = Int( Int(p(x, y).f(x).g(y) dy) dx) where Int() = integral over R.
= Int( Int(r(x).s(y).f(x).g(y) dy) dx) since X and Y are independent.
= Int(r(x).f(x) . Int(s(y).g(y) dy) dx) since r(x) and f(x) do not
depend on y.
= Int(r(x).f(x) . E[g(Y)] dx)
= Int(r(x).f(x) dx) . E[g(Y)] since E[g(Y)] does not depend on x.
= E[f(X)]. E[g(Y)] = RHS.
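As a quick sanity check of this direction (not part of the proof), here
is a small Monte Carlo illustration in Python; the particular
distributions and the choices of f and g below are arbitrary,
illustrative picks of mine:

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(size=n)        # X ~ N(0, 1)
y = rng.exponential(size=n)   # Y ~ Exp(1), drawn independently of X

f = np.cos                    # any f: R -> R will do
g = np.square                 # any g: R -> R will do

lhs = np.mean(f(x) * g(y))            # Monte Carlo estimate of E[f(X)g(Y)]
rhs = np.mean(f(x)) * np.mean(g(y))   # Monte Carlo estimate of E[f(X)].E[g(Y)]
print(lhs, rhs)                       # the two agree up to sampling error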
"if" part: Suppose the given condition is true. For any a and b in R, define
f(x) = {1, x <= a; 0, x > a}
g(y) = {1, y <= b; 0, y > b}
The condition says
E[f(X)g(Y)] = E[f(X)].E[g(Y)]
Substituting this f and g gives the LHS as the double integral of
p(x, y) as x goes from -oo to a and y goes from -oo to b, and the RHS
as the integral from -oo to a of r(x) multiplied by the integral from
-oo to b of s(y).
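In LaTeX notation (a direct transcription of the sentence above, since
this is hard to write legibly in ASCII), that is

E[f(X)g(Y)] = \int_{-\infty}^{a} \int_{-\infty}^{b} p(x, y) \, dy \, dx ,
\qquad
E[f(X)] \, E[g(Y)] = \int_{-\infty}^{a} r(x) \, dx \cdot \int_{-\infty}^{b} s(y) \, dy .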
By definition, the LHS is then the joint CDF of X and Y at (a, b) and
the RHS is the product of the CDF of X at a and the CDF of Y at b;
equality between these for all a and b is the definition of
independence, so we are done.
(b): Follows from (a). Take arbitrary functions f and g from R to R
and let a, b be arbitrary elements of R. Define f'(x) = {1, f(x) <= a;
0, f(x) > a} and g'(y) similarly. (Note that here ' is just a
distinguishing mark and has nothing to do with derivatives!) Since X
and Y are independent, (a) tells us E[f'(X)g'(Y)] = E[f'(X)].E[g'(Y)].
Writing these expectations out as integrals, the left-hand side is the
joint CDF of f(X) and g(Y) at (a, b) and the right-hand side is the
product of their individual CDFs; as a and b were arbitrary, f(X) and
g(Y) are independent.
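In the same LaTeX notation, the chain of equalities is just

P(f(X) \le a, \, g(Y) \le b) = E[f'(X) g'(Y)] = E[f'(X)] \, E[g'(Y)]
= P(f(X) \le a) \, P(g(Y) \le b) .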
(c): An easy example is to define X and Y both on {-1, 0, 1} with
joint pmf as follows: f(x, y) = {0.2 if |x| = |y|, 0 otherwise}.
You should be able to verify for yourself that this is a valid pmf,
that Cov(X, Y) = 0, and that Cov(|X|, |Y|) = 0.16.
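If you would rather not do that arithmetic by hand, here is a short
Python check of those three claims (my own verification sketch, not
part of the answer):

from itertools import product

support = [-1, 0, 1]
# joint pmf from the example: 0.2 when |x| = |y|, 0 otherwise
pmf = {(x, y): (0.2 if abs(x) == abs(y) else 0.0)
       for x, y in product(support, repeat=2)}

def E(h):
    # expectation of h(x, y) under the joint pmf
    return sum(p * h(x, y) for (x, y), p in pmf.items())

total   = sum(pmf.values())                                      # should be 1
cov_xy  = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
cov_abs = (E(lambda x, y: abs(x * y))
           - E(lambda x, y: abs(x)) * E(lambda x, y: abs(y)))
print(total, cov_xy, cov_abs)   # ~1.0, 0.0, ~0.16 (up to float rounding)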