What is Linearity of Expectation?
Linearity of Expectation is a fundamental principle in probability theory stating that the expected value of the sum of random variables is equal to the sum of their individual expected values, regardless of whether those variables are independent.
History and Background
The concept evolved gradually within the development of probability theory. While not attributable to a single individual, its formalization became prominent with the rise of modern probability and statistics in the 20th century. Early applications were seen in fields like statistical mechanics and actuarial science.
Key Principles
- Additivity: The expected value of a sum is the sum of the expected values: $E[X + Y] = E[X] + E[Y]$.
- Scaling: The expected value of a constant times a random variable is the constant times the expected value: $E[aX] = aE[X]$, where $a$ is a constant.
- Independence not required: Linearity of expectation holds whether the random variables are independent or dependent.
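The independence point can be checked numerically. The sketch below is a Python illustration of my own (not part of the original answer): it pairs a fair die $X$ with a perfectly dependent $Y = 7 - X$, and the simulated means still satisfy $E[X + Y] = E[X] + E[Y]$.

```python
import random

# Sketch: linearity of expectation holds even under total dependence.
# X is a fair six-sided die; Y = 7 - X is fully determined by X.
random.seed(0)
n = 100_000
sum_x = sum_y = sum_xy = 0
for _ in range(n):
    x = random.randint(1, 6)
    y = 7 - x            # Y depends entirely on X
    sum_x += x
    sum_y += y
    sum_xy += x + y

print(sum_x / n)   # close to E[X] = 3.5
print(sum_y / n)   # close to E[Y] = 3.5
print(sum_xy / n)  # exactly 7.0 here, matching E[X] + E[Y]
```

Note that $X + Y = 7$ on every trial, so the dependence is as strong as possible, yet the additivity rule is unaffected.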
Real-world Examples
Example 1: Rolling Dice
Suppose you roll two dice. Let $X$ be the number on the first die and $Y$ be the number on the second die. What is the expected value of the sum of the numbers on the two dice?
For a fair die, $E[X] = \frac{1+2+3+4+5+6}{6} = 3.5$, and likewise $E[Y] = 3.5$.
Using linearity of expectation:
$E[X + Y] = E[X] + E[Y] = 3.5 + 3.5 = 7$.
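The same answer can be confirmed by brute force. This short Python check (my own sketch, not from the original answer) enumerates all 36 equally likely outcomes of two dice:

```python
from fractions import Fraction

# Sketch: verify E[X + Y] = 7 by enumerating all 36 equally likely outcomes.
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]
e_sum = Fraction(sum(a + b for a, b in outcomes), len(outcomes))
print(e_sum)  # 7
```

Linearity gives the result in one line, whereas the direct enumeration grows with the number of dice.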
Example 2: Counting Aces
Imagine you draw 5 cards from a standard deck. What is the expected number of aces you will draw?
Let $X_i$ be an indicator random variable that is 1 if the $i$-th card is an ace, and 0 otherwise. By symmetry, each position in the shuffled deck is equally likely to hold any card, so $E[X_i] = P(\text{the } i\text{-th card is an ace}) = \frac{4}{52} = \frac{1}{13}$, even though the draws are dependent.
The total number of aces is $X = X_1 + X_2 + X_3 + X_4 + X_5$.
Using linearity of expectation:
$E[X] = E[X_1] + E[X_2] + E[X_3] + E[X_4] + E[X_5] = 5 \cdot \frac{1}{13} = \frac{5}{13} \approx 0.385$.
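The indicator argument can be sanity-checked with a simulation. The following Python sketch (my own illustration; the deck encoding is a hypothetical simplification) compares the exact indicator-based value with a Monte Carlo estimate that draws without replacement, so the $X_i$ really are dependent:

```python
import random
from fractions import Fraction

# Indicator sum: 5 draws, each with marginal P(ace) = 4/52.
exact = 5 * Fraction(4, 52)
print(exact)  # 5/13

# Monte Carlo check, drawing without replacement (so the X_i are dependent).
random.seed(1)
deck = ["A"] * 4 + ["x"] * 48
trials = 50_000
count = sum(random.sample(deck, 5).count("A") for _ in range(trials))
print(count / trials)  # close to 5/13, about 0.385
```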
Example 3: Coin Flips
Suppose you flip a coin $n$ times. What is the expected number of heads?
Let $X_i$ be an indicator random variable that is 1 if the $i$-th flip is heads, and 0 otherwise. So, $E[X_i] = P(\text{the i-th flip is heads}) = \frac{1}{2}$.
The total number of heads is $X = X_1 + X_2 + ... + X_n$.
Using linearity of expectation:
$E[X] = E[X_1] + E[X_2] + ... + E[X_n] = n \cdot \frac{1}{2} = \frac{n}{2}$.
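For contrast, computing the same expectation directly from the binomial distribution requires summing over every possible count of heads. This Python sketch (my own, not from the original answer) shows the direct computation agreeing with the one-line $\frac{n}{2}$ answer:

```python
from math import comb

# Sketch: direct binomial computation E[X] = sum over k of k * C(n, k) / 2^n.
n = 20
e_direct = sum(k * comb(n, k) for k in range(n + 1)) / 2 ** n
print(e_direct)  # 10.0, i.e. n / 2
```

Linearity of expectation sidesteps the entire distribution: no combinatorics are needed.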
Conclusion
Linearity of Expectation provides a powerful and straightforward method to compute expected values, even when dealing with complex scenarios involving sums of random variables. Its applicability regardless of independence makes it an indispensable tool in probability and statistics.