
## Basics of Expectation and Variance

1. If the random variable $X$ takes values in the non-negative integers, prove that:

$E[X] = \sum_{t=0}^\infty \Pr(X > t)$
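Before attempting a proof, it can help to see the identity hold numerically. The following sketch checks it for one hypothetical distribution (X uniform on {0, …, 5}); the choice of distribution is an assumption for illustration only.

```python
from fractions import Fraction

# Hypothetical example: X uniform on {0, 1, ..., 5}.
pmf = {k: Fraction(1, 6) for k in range(6)}

# Direct expectation: sum of k * Pr(X = k).
expectation = sum(k * p for k, p in pmf.items())

# Tail sum: sum_{t >= 0} Pr(X > t); terms vanish once t reaches max support.
tail_sum = sum(
    sum(p for k, p in pmf.items() if k > t)
    for t in range(max(pmf))
)

print(expectation, tail_sum)  # both equal 5/2
```

Exact fractions are used so the two sides match exactly rather than up to floating-point error.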

2. Prove that if $X_1$ and $X_2$ are independent random variables, then $E[X_1 \cdot X_2] = E[X_1] E[X_2]$. Is the converse true?
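As a sanity check of the forward direction (not a proof), one can enumerate a small independent pair exactly. The distribution below, two independent uniform picks from {1, 2}, is an assumption chosen for illustration.

```python
from fractions import Fraction
from itertools import product

# Hypothetical example: X1, X2 independent, each uniform on {1, 2}.
support = [1, 2]
p = Fraction(1, 2)

e_x1 = sum(x * p for x in support)  # E[X1] = 3/2
e_x2 = e_x1                         # same distribution, so E[X2] = 3/2

# Independence: the joint pmf factors as p * p on each pair.
e_product = sum(x1 * x2 * p * p for x1, x2 in product(support, support))

print(e_product, e_x1 * e_x2)  # both equal 9/4
```

Whether the converse holds is left as the exercise asks; no counterexample is given here.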

3. Let $X$ be a random variable taking nonnegative integer values. Let $E[X]$ be its expectation, and let $E[X^2]$ denote the expectation of its square. Prove that

$\Pr(X>0) \geq \frac{E[X]^2}{E[X^2]}$

Hint: use the Cauchy-Schwarz inequality.
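A quick numerical check can also reveal when the bound is tight. The distribution below (X equal to 0 or 2, each with probability 1/2) is a hypothetical choice; it happens to attain the inequality with equality.

```python
from fractions import Fraction

# Hypothetical example: X = 0 or 2, each with probability 1/2.
pmf = {0: Fraction(1, 2), 2: Fraction(1, 2)}

e_x = sum(k * p for k, p in pmf.items())              # E[X] = 1
e_x2 = sum(k * k * p for k, p in pmf.items())          # E[X^2] = 2
pr_positive = sum(p for k, p in pmf.items() if k > 0)  # Pr(X > 0) = 1/2

# The bound Pr(X > 0) >= E[X]^2 / E[X^2]; equality holds for this distribution.
print(pr_positive, e_x ** 2 / e_x2)  # both equal 1/2
```

Trying a few skewed distributions (e.g. a small chance of a large value) shows the bound can be far from tight, which hints at why the second moment enters.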

Source: folklore
