
Normal, $N(\mu, \sigma^2)$

PDF: $f(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}$

MGF: $M(t) = e^{\mu t + \sigma^2 t^2/2}$
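As a quick sanity check, the MGF can be derived from the PDF symbolically, and its first two derivatives at $t = 0$ recover $\mu$ and $E[X^2]$. A minimal sympy sketch (my own verification, not part of the original sheet):

import sympy as sp

x, t = sp.symbols('x t', real=True)
mu = sp.Symbol('mu', real=True)
sigma = sp.Symbol('sigma', positive=True)

# E[e^{tX}] computed directly from the normal PDF
pdf = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sp.sqrt(2 * sp.pi))
M = sp.simplify(sp.integrate(sp.exp(t * x) * pdf, (x, -sp.oo, sp.oo)))
print(M)                                     # exp(mu*t + sigma**2*t**2/2)

# Moments from the MGF: M'(0) = mu, M''(0) = E[X^2] = mu^2 + sigma^2
print(sp.diff(M, t).subs(t, 0))
print(sp.simplify(sp.diff(M, t, 2).subs(t, 0)))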

Integration formulas

$\displaystyle\int p(x)\,e^{ax}\,dx = \frac{1}{a}\,p(x)e^{ax} - \frac{1}{a^2}\,p'(x)e^{ax} + \frac{1}{a^3}\,p''(x)e^{ax} - \cdots$

$\displaystyle\int_a^{\infty} x\,\frac{1}{\theta}\,e^{-x/\theta}\,dx = (a+\theta)\,e^{-a/\theta}$

$\displaystyle\int_a^{\infty} x^2\,\frac{1}{\theta}\,e^{-x/\theta}\,dx = \left((a+\theta)^2 + \theta^2\right)e^{-a/\theta}$
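Both tail integrals follow from the first formula with exponent coefficient $-1/\theta$. A sympy check of the stated closed forms (my own verification, not from the sheet):

import sympy as sp

x = sp.Symbol('x', positive=True)
a, theta = sp.symbols('a theta', positive=True)

first = sp.integrate(x / theta * sp.exp(-x / theta), (x, a, sp.oo))
second = sp.integrate(x**2 / theta * sp.exp(-x / theta), (x, a, sp.oo))

# Both differences simplify to 0, confirming the closed forms
print(sp.simplify(first - (a + theta) * sp.exp(-a / theta)))
print(sp.simplify(second - ((a + theta)**2 + theta**2) * sp.exp(-a / theta)))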
Other Useful Facts

• $\sigma^2 = E[(X - \mu)^2] = E[X^2] - \mu^2 = M''(0) - M'(0)^2$

• $\mathrm{Cov}(X, Y) = E[(X - \mu_x)(Y - \mu_y)] = E[XY] - \mu_x \mu_y$

• $\mathrm{Cov}(X, Y) = \sigma_{xy} = \rho\,\sigma_x \sigma_y$ and $\rho = \dfrac{\sigma_{xy}}{\sigma_x \sigma_y}$

• Least squares regression line: $y = \mu_y + \rho\,\dfrac{\sigma_y}{\sigma_x}\,(x - \mu_x)$
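The correlation and regression-line facts are easy to confirm numerically. A small numpy sketch on simulated data (the data-generating model here is my own, purely for illustration):

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 2.0 * x + rng.normal(size=100_000)       # a linear model with noise

cov_xy = np.cov(x, y)[0, 1]                  # sigma_xy
rho = np.corrcoef(x, y)[0, 1]                # rho = sigma_xy / (sigma_x * sigma_y)
slope = rho * y.std() / x.std()              # rho * sigma_y / sigma_x, about 2
intercept = y.mean() - slope * x.mean()      # mu_y - slope * mu_x, about 0
print(cov_xy, rho, slope, intercept)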
• When the variables $X_1, X_2, \ldots, X_n$ are not pairwise independent, then
$\mathrm{Var}\!\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} \sigma_i^2 + 2\sum_{i<j} \sigma_{ij}$
and
$\mathrm{Var}\!\left(\sum_{i=1}^{n} a_i X_i\right) = \sum_{i=1}^{n} a_i^2 \sigma_i^2 + 2\sum_{i<j} a_i a_j\,\sigma_{ij}$,
where $\sigma_{ij}$ is the covariance of $X_i$ and $X_j$.
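In matrix form these two identities say $\mathrm{Var}(a^{\mathsf T}X) = a^{\mathsf T}\Sigma a$ for the covariance matrix $\Sigma$. A numpy simulation check (the covariance values below are my own, chosen for illustration):

import numpy as np

rng = np.random.default_rng(1)
Sigma = np.array([[1.0, 0.3, 0.1],
                  [0.3, 2.0, 0.4],
                  [0.1, 0.4, 1.5]])          # sigma_i^2 on the diagonal, sigma_ij off it
X = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)
a = np.array([1.0, -2.0, 0.5])

# sum a_i^2 sigma_i^2 + 2 * sum_{i<j} a_i a_j sigma_ij
i, j = np.triu_indices(3, k=1)
formula = (a**2 * np.diag(Sigma)).sum() + 2 * (a[i] * a[j] * Sigma[i, j]).sum()
print((X @ a).var(), formula, a @ Sigma @ a)  # all agree up to sampling error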

• When X depends upon Y, $E[X] = E[E[X \mid Y]]$.

• When X depends upon Y, $\mathrm{Var}(X) = E[\mathrm{Var}(X \mid Y)] + \mathrm{Var}(E[X \mid Y])$. (Called the total variance of X.)
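A simulation makes the two conditioning laws concrete. In the two-stage model below (my own illustrative choice), $E[X \mid Y] = Y$ and $\mathrm{Var}(X \mid Y) = 1$, so the laws predict $E[X] = E[Y]$ and $\mathrm{Var}(X) = 1 + \mathrm{Var}(Y)$:

import numpy as np

rng = np.random.default_rng(2)
y = rng.exponential(scale=2.0, size=1_000_000)   # Y with E[Y] = 2, Var(Y) = 4
x = rng.normal(loc=y, scale=1.0)                 # X | Y ~ N(Y, 1)

# Law of total expectation: E[X] = E[E[X|Y]] = E[Y] = 2
# Law of total variance:    Var(X) = E[Var(X|Y)] + Var(E[X|Y]) = 1 + 4 = 5
print(x.mean(), x.var())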

• Chebyshev's Inequality: For a random variable X having any distribution with finite mean $\mu$ and variance $\sigma^2$, $P(|X - \mu| \ge k\sigma) \le \dfrac{1}{k^2}$.
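The bound holds for any distribution with finite variance, though it is usually loose. A quick empirical check on an exponential distribution (my own example, not from the sheet):

import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0, size=1_000_000)   # mu = 1, sigma = 1
for k in (2, 3, 4):
    p = np.mean(np.abs(x - 1.0) >= k)            # P(|X - mu| >= k*sigma)
    print(k, p, 1 / k**2)                        # empirical tail vs. the 1/k^2 bound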

• For the variables X and Y having the joint PMF/PDF f(x, y), the moment generating function for this distribution is
$M(t_1, t_2) = E[e^{t_1 X + t_2 Y}] = E[e^{t_1 X} e^{t_2 Y}] = \sum_x \sum_y e^{t_1 x}\, e^{t_2 y}\, f(x, y)$
(with the double sum replaced by a double integral when f is a PDF).

$\mu_x = M_{t_1}(0, 0)$ and $\mu_y = M_{t_2}(0, 0)$ (these are the first partial derivatives).
$E[X^2] = M_{t_1 t_1}(0, 0)$ and $E[Y^2] = M_{t_2 t_2}(0, 0)$ (these are the pure second partial derivatives).
$E[XY] = M_{t_1 t_2}(0, 0) = M_{t_2 t_1}(0, 0)$ (these are the mixed second partial derivatives).
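For a concrete case, the joint MGF of a small discrete PMF can be built and differentiated with sympy; the PMF values below are my own, purely illustrative:

import sympy as sp

t1, t2 = sp.symbols('t1 t2')
pmf = {(0, 0): sp.Rational(2, 5), (1, 0): sp.Rational(1, 5),
       (0, 1): sp.Rational(1, 10), (1, 1): sp.Rational(3, 10)}

# M(t1, t2) = sum over (x, y) of e^{t1 x} e^{t2 y} f(x, y)
M = sum(p * sp.exp(t1 * x + t2 * y) for (x, y), p in pmf.items())
at0 = {t1: 0, t2: 0}
print(sp.diff(M, t1).subs(at0))        # mu_x = E[X] = 1/2
print(sp.diff(M, t2).subs(at0))        # mu_y = E[Y] = 2/5
print(sp.diff(M, t1, t2).subs(at0))    # E[XY] = 3/10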

• Central Limit Theorem: As the sample size n grows, the distribution of the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ approaches $N(\mu, \sigma^2/n)$; equivalently, $\dfrac{\bar{X} - \mu}{\sigma/\sqrt{n}}$ approaches the standard normal $N(0, 1)$.
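A simulation shows the standardization at work: sample means of a skewed distribution, standardized by $\sigma/\sqrt{n}$, look increasingly standard normal as n grows (my own illustrative check):

import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 1.0, 1.0                   # Exponential(1) has mu = sigma = 1
for n in (5, 50, 500):
    means = rng.exponential(size=(100_000, n)).mean(axis=1)
    z = (means - mu) / (sigma / np.sqrt(n))
    print(n, np.mean(z <= 1.0))        # approaches Phi(1) ~ 0.8413 as n grows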