X is a random variable and E[X] is a scalar.
Physical Interpretation
If X consists of masses on a number line, where the mass at position x is Pr[X=x], then E[X] is the center of mass of these masses. The value x·Pr[X=x] can be thought of as the torque acting on the number line with a fulcrum at 0. If the center of mass is at 0 (i.e., E[X] = 0), then the net torque ∑_x x·Pr[X=x] is 0.
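The center-of-mass picture can be sketched numerically; the distribution below is an assumed example chosen so the masses balance at 0:

```python
# Point masses Pr[X=x] at positions x (assumed example distribution).
dist = {-2: 0.25, 0: 0.25, 1: 0.5}

# E[X] is the center of mass: each term x * Pr[X=x] is a torque about 0.
expectation = sum(x * p for x, p in dist.items())
print(expectation)  # 0.0: the torques cancel, so the fulcrum at 0 balances
```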
Variance
The variance of a random variable is a measure of its spread.
Var(X) = E[(X − E[X])^2] = E[X^2] − (E[X])^2
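Both forms of the variance can be checked against each other on a small assumed distribution:

```python
# Assumed example distribution: Pr[X=x] for each x.
dist = {0: 0.2, 1: 0.5, 3: 0.3}

mean = sum(x * p for x, p in dist.items())
# Definition: expected squared deviation from the mean.
var_def = sum((x - mean) ** 2 * p for x, p in dist.items())
# Shortcut: E[X^2] - (E[X])^2.
var_alt = sum(x * x * p for x, p in dist.items()) - mean ** 2
assert abs(var_def - var_alt) < 1e-9
```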
Geometric Distribution
X∼Geometric(p) when X is the number of independent trials until the first success, where each trial succeeds with probability p.
Proof of E[X] = 1/p
With probability p our first trial succeeds, so X = 1. Otherwise, with probability 1 − p, we fail and must start again, so X = 1 + X′, where X′ is a copy of X (it has the same distribution). Then E[X] = (p)(1) + (1 − p)(1 + E[X]). Rearranging gives E[X] = 1/p.
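The same answer falls out of summing the series ∑_{k≥1} k(1 − p)^(k−1) p directly; p = 0.3 below is an assumed example value:

```python
# Truncated series for E[X] when X ~ Geometric(p); the tail beyond k = 1000
# is negligibly small for p = 0.3.
p = 0.3
approx = sum(k * (1 - p) ** (k - 1) * p for k in range(1, 1000))
assert abs(approx - 1 / p) < 1e-9  # matches E[X] = 1/p
```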
Poisson Distribution
If an event occurs with some average rate λ instances per unit, then the number of times this event occurs over 1 unit follows a Poisson distribution.
X∼Poisson(λ)
Pr[X=k] = λ^k e^(−λ) / k!
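A quick sanity check of the pmf: the probabilities sum to 1 and the mean comes out to λ (λ = 2.5 is an assumed example rate):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    # Pr[X = k] for X ~ Poisson(lam).
    return lam ** k * exp(-lam) / factorial(k)

lam = 2.5
probs = [poisson_pmf(k, lam) for k in range(100)]  # tail beyond 100 is ~0
assert abs(sum(probs) - 1) < 1e-9                        # pmf sums to 1
assert abs(sum(k * p for k, p in enumerate(probs)) - lam) < 1e-9  # E[X] = lam
```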
The sum of two independent Poisson random variables X∼Poisson(λ) and Y∼Poisson(μ) is X+Y∼Poisson(λ+μ).
Ex. If an average of 5.4 people walk through one door per hour and an average of 3 people walk through the other door per hour, then the average number of people who walk through either door per hour is 5.4 + 3 = 8.4.
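Using the rates from the door example, the sum property can be verified by convolving the two pmfs and comparing against Poisson(λ + μ):

```python
from math import exp, factorial

def pmf(k, lam):
    # Pr[X = k] for X ~ Poisson(lam).
    return lam ** k * exp(-lam) / factorial(k)

lam, mu = 5.4, 3.0  # rates from the example above
k = 7               # assumed example count
# Pr[X + Y = k] = sum over ways to split k between the two doors.
conv = sum(pmf(j, lam) * pmf(k - j, mu) for j in range(k + 1))
assert abs(conv - pmf(k, lam + mu)) < 1e-9
```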
Summary
X∼Bernoulli(p)
E[X]=p
Var(X)=p(1−p)
X∼Geometric(p)
E[X] = 1/p
Var(X) = (1 − p)/p^2
X∼Binomial(n,p)
E[X]=np
Var(X)=np(1−p)
X∼Poisson(λ)
E[X]=λ
Var(X)=λ
Tips and Tricks
If X∼Bernoulli(p) then X has the same distribution as X^2, X^3, ⋯.
If Y∼Bernoulli(q) then XY∼Bernoulli(Pr[X=1,Y=1]).
If they are independent then the parameter is Pr[X=1,Y=1]=pq.
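For the independent case, enumerating the four outcomes confirms Pr[XY = 1] = pq; p and q below are assumed example values:

```python
# X ~ Bernoulli(p), Y ~ Bernoulli(q), independent (assumed example values).
p, q = 0.3, 0.6
# XY = 1 only when both X = 1 and Y = 1.
pr_product_one = sum(
    (p if x else 1 - p) * (q if y else 1 - q)
    for x in (0, 1)
    for y in (0, 1)
    if x * y == 1
)
assert abs(pr_product_one - p * q) < 1e-12  # XY ~ Bernoulli(pq)
```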
If X = ∑_{i=1}^n X_i, then X^2 = ∑_{i,j} X_i X_j = ∑_{i=1}^n X_i^2 + ∑_{i≠j} X_i X_j. If the X_i have the same distribution, then E[X^2] = n·E[X_i^2] + n(n−1)·E[X_i X_j].
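This decomposition can be checked for X a sum of n i.i.d. Bernoulli(p) variables (so X ∼ Binomial(n, p)), where E[X_i^2] = p and, by independence, E[X_i X_j] = p^2; n = 10 and p = 0.3 are assumed example values:

```python
from math import comb

n, p = 10, 0.3
# E[X^2] computed directly from the Binomial(n, p) pmf.
e_x2 = sum(
    k * k * comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)
)
# Decomposition: n * E[X_i^2] + n(n-1) * E[X_i X_j] = n*p + n(n-1)*p^2.
assert abs(e_x2 - (n * p + n * (n - 1) * p ** 2)) < 1e-9
```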