Overview
Expected value and variance are the two most important properties of a random variable, describing its center and spread.
Expected Value (Mean)
The expected value is the long-run average value of a random variable.
Discrete Random Variable
$$E(X) = \mu = \sum x_i \times P(X = x_i)$$
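The discrete formula translates directly into code. A minimal sketch in Python, using a made-up payout distribution as the input:

```python
# E(X) = sum of x * P(X = x) over the support of X.
def expected_value(pmf):
    return sum(x * p for x, p in pmf.items())

# Hypothetical payout: win 10 with probability 0.2, else 0.
payout = {0: 0.8, 10: 0.2}
print(expected_value(payout))  # 2.0
```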
Continuous Random Variable
$$E(X) = \mu = \int x \times f(x) \, dx$$
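The integral can be approximated numerically. A sketch using a midpoint Riemann sum, with the Uniform(0, 1) density chosen as a simple example (its mean is 0.5):

```python
# Midpoint-rule approximation of E(X) = integral of x * f(x) dx on [a, b].
def expected_value_cont(f, a, b, n=10_000):
    dx = (b - a) / n
    mids = (a + (i + 0.5) * dx for i in range(n))
    return sum(m * f(m) * dx for m in mids)

uniform_pdf = lambda x: 1.0  # density of Uniform(0, 1)
print(expected_value_cont(uniform_pdf, 0.0, 1.0))  # ≈ 0.5
```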
Properties of Expected Value
$$E(c) = c \quad \text{(constant)}$$
$$E(cX) = c \times E(X) \quad \text{(scalar multiplication)}$$
$$E(X + Y) = E(X) + E(Y) \quad \text{(addition, always true)}$$
$$E(X - Y) = E(X) - E(Y) \quad \text{(subtraction)}$$
$$E(aX + b) = a \times E(X) + b \quad \text{(linear transformation)}$$
For independent $X$ and $Y$:

$$E(XY) = E(X) \times E(Y)$$
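The product rule for independent variables can be checked by simulation. A sketch with two independent fair dice, where $E(X) \times E(Y) = 3.5 \times 3.5 = 12.25$:

```python
import random

random.seed(0)
n = 200_000
# Two independent fair dice per trial.
xs = [random.randint(1, 6) for _ in range(n)]
ys = [random.randint(1, 6) for _ in range(n)]

e_xy = sum(x * y for x, y in zip(xs, ys)) / n
print(e_xy)  # ≈ E(X) * E(Y) = 12.25
```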
Variance
Variance measures the spread around the mean.
Definition
$$\text{Var}(X) = \sigma^2 = E[(X - \mu)^2]$$
Computational Formula
$$\text{Var}(X) = E(X^2) - [E(X)]^2$$
Discrete Random Variable
$$\text{Var}(X) = \sum (x_i - \mu)^2 \times P(X = x_i)$$
or equivalently:
$$\text{Var}(X) = \sum x_i^2 \times P(X = x_i) - \mu^2$$
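Both forms can be sketched in Python and should agree; the Bernoulli pmf below is an illustrative example (its variance is $p(1-p)$):

```python
# Discrete variance two equivalent ways: the definition and the shortcut.
def variance(pmf):
    mu = sum(x * p for x, p in pmf.items())
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

def variance_shortcut(pmf):
    mu = sum(x * p for x, p in pmf.items())
    return sum(x * x * p for x, p in pmf.items()) - mu ** 2

bern = {0: 0.7, 1: 0.3}  # Bernoulli(p = 0.3): Var = p(1 - p) = 0.21
print(variance(bern), variance_shortcut(bern))  # both ≈ 0.21
```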
Standard Deviation
$$\sigma = SD(X) = \sqrt{\text{Var}(X)}$$
Properties of Variance
$$\text{Var}(c) = 0 \quad \text{(constant has no variance)}$$
$$\text{Var}(cX) = c^2 \times \text{Var}(X) \quad \text{(scalar squared)}$$
$$\text{Var}(X + c) = \text{Var}(X) \quad \text{(shifting doesn't change spread)}$$
$$\text{Var}(aX + b) = a^2 \times \text{Var}(X) \quad \text{(linear transformation)}$$
For independent $X$ and $Y$:

$$\text{Var}(X + Y) = \text{Var}(X) + \text{Var}(Y)$$
$$\text{Var}(X - Y) = \text{Var}(X) + \text{Var}(Y) \quad \text{(note: still addition!)}$$
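The "still addition" rule is easy to confirm by simulation. A sketch with two independent normals whose variances are 9 and 16, so the difference should have variance about 25:

```python
import random

random.seed(1)
n = 200_000
xs = [random.gauss(0, 3) for _ in range(n)]  # Var(X) = 9
ys = [random.gauss(0, 4) for _ in range(n)]  # Var(Y) = 16
diffs = [x - y for x, y in zip(xs, ys)]

m = sum(diffs) / n
var_diff = sum((d - m) ** 2 for d in diffs) / n
print(var_diff)  # ≈ 9 + 16 = 25: variances still ADD under subtraction
```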
Standardization
Converting $X$ to a standard form with mean 0 and SD 1:

$$Z = \frac{X - \mu}{\sigma}$$
$$E(Z) = 0, \quad \text{Var}(Z) = 1$$
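A quick numerical sketch of standardization, using a simulated sample from a hypothetical N(50, 4) population; after the transform the sample mean is 0 and the variance 1 by construction:

```python
import random

random.seed(2)
data = [random.gauss(50, 4) for _ in range(100_000)]
mu = sum(data) / len(data)
sigma = (sum((x - mu) ** 2 for x in data) / len(data)) ** 0.5

# Standardize: Z = (X - mu) / sigma.
z = [(x - mu) / sigma for x in data]
z_mean = sum(z) / len(z)
z_var = sum(v * v for v in z) / len(z)
print(z_mean, z_var)  # mean ~ 0, variance ~ 1 (up to float rounding)
```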
Covariance and Correlation
Covariance
$$\text{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E(XY) - E(X)E(Y)$$
Correlation
$$\rho = \text{Corr}(X, Y) = \frac{\text{Cov}(X, Y)}{\sigma_X \times \sigma_Y}$$

where $-1 \leq \rho \leq 1$.
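Both definitions fit in a few lines of Python. The data below are a made-up example with a perfect linear relation, so the correlation comes out at the upper bound $\rho = 1$:

```python
# Covariance and correlation (population-style, dividing by n).
def cov(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def corr(xs, ys):
    return cov(xs, ys) / (cov(xs, xs) ** 0.5 * cov(ys, ys) ** 0.5)

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]  # perfect linear relation: y = 2x
print(cov(xs, ys), corr(xs, ys))  # 4.0 and (up to rounding) 1.0
```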
For Linear Combinations
$$E(aX + bY) = a \times E(X) + b \times E(Y)$$
$$\text{Var}(aX + bY) = a^2 \times \text{Var}(X) + b^2 \times \text{Var}(Y) + 2ab \times \text{Cov}(X,Y)$$
If $X$ and $Y$ are independent:

$$\text{Var}(aX + bY) = a^2 \times \text{Var}(X) + b^2 \times \text{Var}(Y)$$
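The full formula, including the covariance term, can be checked by simulation. In this sketch the variables are deliberately dependent: $X \sim N(0, \text{sd}=2)$ so $\text{Var}(X) = 4$, and $Y = X + \text{noise}$ so $\text{Var}(Y) = 5$ and $\text{Cov}(X, Y) = 4$; the formula then predicts $4 \cdot 4 + 9 \cdot 5 + 2 \cdot 2 \cdot 3 \cdot 4 = 109$:

```python
import random

random.seed(3)
n = 200_000
xs = [random.gauss(0, 2) for _ in range(n)]    # Var(X) = 4
ys = [x + random.gauss(0, 1) for x in xs]      # Var(Y) = 5, Cov(X, Y) = 4

a, b = 2.0, 3.0
combo = [a * x + b * y for x, y in zip(xs, ys)]
m = sum(combo) / n
emp_var = sum((c - m) ** 2 for c in combo) / n

# Predicted: a^2 * 4 + b^2 * 5 + 2ab * 4 = 16 + 45 + 48 = 109
print(emp_var)  # ≈ 109
```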
Examples
Example 1: Expected Value
For a fair six-sided die roll $X$:

$$E(X) = 1\left(\frac{1}{6}\right) + 2\left(\frac{1}{6}\right) + 3\left(\frac{1}{6}\right) + 4\left(\frac{1}{6}\right) + 5\left(\frac{1}{6}\right) + 6\left(\frac{1}{6}\right)$$
$$E(X) = \frac{21}{6} = 3.5$$
Example 2: Variance
For the die:
$$E(X^2) = 1^2\left(\frac{1}{6}\right) + 2^2\left(\frac{1}{6}\right) + 3^2\left(\frac{1}{6}\right) + 4^2\left(\frac{1}{6}\right) + 5^2\left(\frac{1}{6}\right) + 6^2\left(\frac{1}{6}\right)$$
$$E(X^2) = \frac{91}{6} \approx 15.17$$
$$\text{Var}(X) = \frac{91}{6} - (3.5)^2 = \frac{35}{12} \approx 2.92$$
$$\sigma = \sqrt{35/12} \approx 1.71$$
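The die calculations in Examples 1 and 2 can be reproduced exactly with rational arithmetic, avoiding the rounding in the decimal values above:

```python
from fractions import Fraction

# Exact die statistics with rational arithmetic.
p = Fraction(1, 6)
mu = sum(x * p for x in range(1, 7))       # 7/2 = 3.5
ex2 = sum(x * x * p for x in range(1, 7))  # 91/6
var = ex2 - mu ** 2                        # 35/12 ≈ 2.92
sd = float(var) ** 0.5                     # ≈ 1.71
print(mu, ex2, var, round(sd, 2))
```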
Example 3: Linear Transformation
If $E(X) = 50$ and $\text{Var}(X) = 16$, find $E(Y)$ and $\text{Var}(Y)$ where $Y = 3X + 10$:

$$E(Y) = 3 \times 50 + 10 = 160$$
$$\text{Var}(Y) = 3^2 \times 16 = 144$$
$$SD(Y) = 12$$
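The same steps in code, with the example's (hypothetical) mean and variance as inputs:

```python
# Given E(X) = 50 and Var(X) = 16, transform Y = 3X + 10.
mu_x, var_x = 50.0, 16.0
a, b = 3.0, 10.0

mu_y = a * mu_x + b     # E(aX + b) = a * E(X) + b
var_y = a ** 2 * var_x  # Var(aX + b) = a^2 * Var(X); the shift b drops out
sd_y = var_y ** 0.5
print(mu_y, var_y, sd_y)  # 160.0 144.0 12.0
```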
Example 4: Sum of Independent Variables
If $X$ has mean 100, SD 15 and $Y$ has mean 80, SD 10 (independent):

$$E(X + Y) = 100 + 80 = 180$$
$$\text{Var}(X + Y) = 225 + 100 = 325$$
$$SD(X + Y) = \sqrt{325} \approx 18.03$$
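The same calculation in code; note that the variances add but the SDs do not ($15 + 10 = 25 \neq 18.03$), which is exactly why the detour through variance is needed:

```python
# Independent X (mean 100, SD 15) and Y (mean 80, SD 10).
mu_x, sd_x = 100.0, 15.0
mu_y, sd_y = 80.0, 10.0

mu_sum = mu_x + mu_y             # 180.0
var_sum = sd_x ** 2 + sd_y ** 2  # 225 + 100 = 325 (SDs do NOT add)
sd_sum = var_sum ** 0.5
print(mu_sum, var_sum, round(sd_sum, 2))  # 180.0 325.0 18.03
```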