Discussion 10A

Expectation

E[X] = \sum_x x \Pr[X = x] = \sum_x x \Pr_X[x]
X is a random variable and E[X] is a scalar.
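As a quick sanity check, the weighted sum can be computed directly from a pmf; the fair six-sided die below is an illustrative choice, not from the notes.

```python
# Expectation as a weighted sum over outcomes.
# Illustrative pmf: a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}        # Pr[X = x] for each face x

E_X = sum(x * p for x, p in pmf.items())     # E[X] = sum_x x * Pr[X = x]
# E_X is a scalar (3.5 for a fair die)
```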

Physical Interpretation

If X consists of masses on a number line, where the mass at position x is \Pr[X=x], then E[X] is the center of mass of these masses. The value x \Pr[X=x] can be thought of as the torque acting on the number line with a fulcrum at 0. If the center of mass is at 0 (E[X]=0), then the net torque \sum_x x \Pr[X=x] is 0.
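The torque picture can be checked numerically: with the fulcrum moved to E[X], the signed torques cancel. The three-point mass distribution below is a made-up example.

```python
# Arbitrary masses (probabilities) at positions on a number line.
pmf = {0: 0.2, 1: 0.5, 4: 0.3}

E_X = sum(x * p for x, p in pmf.items())              # center of mass
# Net torque about a fulcrum at E[X]: sum_x (x - E[X]) * Pr[X = x]
net_torque = sum((x - E_X) * p for x, p in pmf.items())
# net_torque is 0 up to floating-point error
```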

Variance

The variance of a random variable is a measure of its spread.
\mathrm{Var}(X) = E[(X - E[X])^2] = E[X^2] - E[X]^2
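Both forms of the variance formula can be evaluated on a small pmf to confirm they agree; the distribution below is an arbitrary illustration.

```python
# Arbitrary small pmf for the check.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

E_X  = sum(x * p for x, p in pmf.items())
E_X2 = sum(x**2 * p for x, p in pmf.items())

var_def      = sum((x - E_X)**2 * p for x, p in pmf.items())  # E[(X - E[X])^2]
var_shortcut = E_X2 - E_X**2                                  # E[X^2] - E[X]^2
# The two values agree
```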

Geometric Distribution

X \sim \text{Geometric}(p) when X is the number of independent trials up to and including the first success, where each trial succeeds with probability p.

Proof of E[X] = \frac{1}{p}

With probability p the first trial succeeds, in which case X = 1. Otherwise, with probability 1-p, the trial fails and we start over, so X = 1 + X', where X' is an independent copy of X (it has the same distribution). Taking expectations, E[X] = p \cdot 1 + (1-p)(1 + E[X]). Rearranging, p E[X] = 1, so E[X] = \frac{1}{p}.
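A Monte Carlo sketch of this result (the success probability and sample size below are arbitrary choices): simulate trials until the first success and average the counts.

```python
import random

random.seed(0)
p = 0.25  # arbitrary success probability

def geometric_sample(p):
    """Count independent trials up to and including the first success."""
    trials = 1
    while random.random() >= p:  # failure, with probability 1 - p
        trials += 1
    return trials

n = 100_000
mean = sum(geometric_sample(p) for _ in range(n)) / n
# mean should be close to 1/p = 4
```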

Poisson Distribution

If an event occurs at an average rate of \lambda instances per unit (of time, say), then the number of times this event occurs over 1 unit follows a Poisson distribution.
X \sim \text{Poisson}(\lambda)
\Pr[X=k] = \frac{\lambda^k}{k!} e^{-\lambda} \quad \text{for } k = 0, 1, 2, \ldots
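A small numeric check of this pmf (the rate λ = 2.5 is an arbitrary choice): the probabilities should sum to 1 and the mean should come out to λ.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    # Pr[X = k] = lam^k / k! * e^(-lam)
    return lam**k / factorial(k) * exp(-lam)

lam = 2.5  # illustrative rate
# Truncating at k = 60 leaves a negligible tail for this lam.
total = sum(poisson_pmf(k, lam) for k in range(60))      # ≈ 1
mean  = sum(k * poisson_pmf(k, lam) for k in range(60))  # ≈ lam
```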

The sum of two independent Poisson random variables X \sim \text{Poisson}(\lambda) and Y \sim \text{Poisson}(\mu) satisfies X+Y \sim \text{Poisson}(\lambda + \mu).
Ex. If an average of 5.4 people walk through one door per hour and an average of 3 people walk through the other door per hour, then the average number of people who walk through either door per hour is 5.4 + 3 = 8.4.
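This closure under sums can be verified by convolving the two pmfs, using the door rates from the example (the choice of k below is arbitrary):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    return lam**k / factorial(k) * exp(-lam)

lam, mu = 5.4, 3.0  # the two door rates from the example
k = 7               # arbitrary count to check

# Pr[X + Y = k] by convolution, multiplying pmfs by independence.
conv = sum(poisson_pmf(j, lam) * poisson_pmf(k - j, mu) for j in range(k + 1))
direct = poisson_pmf(k, lam + mu)  # Poisson(lam + mu) pmf at k
# conv and direct agree
```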

Summary

X \sim \text{Bernoulli}(p)
E[X] = p
\mathrm{Var}(X) = p(1-p)
X \sim \text{Geometric}(p)
E[X] = \frac{1}{p}
\mathrm{Var}(X) = \frac{1-p}{p^2}
X \sim \text{Binomial}(n, p)
E[X] = np
\mathrm{Var}(X) = np(1-p)
X \sim \text{Poisson}(\lambda)
E[X] = \lambda
\mathrm{Var}(X) = \lambda
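One way to spot-check a row of this summary, here Binomial(n, p) with illustrative parameters, is to compute the mean and variance directly from the pmf:

```python
from math import comb

n, p = 10, 0.3  # arbitrary illustrative parameters
# Binomial pmf: Pr[X = k] = C(n, k) p^k (1-p)^(n-k)
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * q for k, q in pmf.items())               # should be n*p = 3
var  = sum(k**2 * q for k, q in pmf.items()) - mean**2  # should be n*p*(1-p) = 2.1
```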

Tips and Tricks

If X \sim \text{Bernoulli}(p) then X has the same distribution as X^2, X^3, \cdots (since 0^k = 0 and 1^k = 1).
If Y \sim \text{Bernoulli}(q) then XY \sim \text{Bernoulli}(\Pr[X=1, Y=1]).
If X and Y are independent, then the parameter is \Pr[X=1, Y=1] = pq.
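A simulation sketch of the independent case (the parameters below are arbitrary): the product XY should equal 1 with frequency near pq.

```python
import random

random.seed(1)
p, q = 0.6, 0.3   # arbitrary Bernoulli parameters
n = 200_000

# X*Y = 1 exactly when X = 1 and Y = 1; for independent draws
# this happens with probability p*q.
hits = sum((random.random() < p) and (random.random() < q) for _ in range(n))
freq = hits / n
# freq should be close to p*q = 0.18
```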

If X = \sum_{i=1}^n X_i, then X^2 = \sum_{i, j} X_i X_j = \sum_{i=1}^n X_i^2 + \sum_{i \neq j} X_i X_j. If the X_i have the same distribution (and likewise each pair X_i, X_j), then E[X^2] = n E[X_i^2] + n(n-1) E[X_i X_j].
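For i.i.d. Bernoulli(p) indicators this identity can be checked by simulation, since X_i^2 = X_i and, by independence, E[X_i X_j] = p^2 (the parameters below are arbitrary):

```python
import random

random.seed(2)
n, p = 8, 0.4        # arbitrary: X is a sum of n i.i.d. Bernoulli(p)
trials = 100_000

# Empirical E[X^2]: square the sum of indicators and average.
emp_EX2 = sum(sum(random.random() < p for _ in range(n)) ** 2
              for _ in range(trials)) / trials

# Formula: n E[Xi^2] + n(n-1) E[Xi Xj] = n*p + n*(n-1)*p^2
formula = n * p + n * (n - 1) * p ** 2
# emp_EX2 should be close to formula (12.16 here)
```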