Sl.No. | Discrete | Continuous
1 | \sum_i p(x_i) = 1 | \int_{-\infty}^{\infty} f(x) dx = 1
2 | F(x) = P[X \le x] | F(x) = P[X \le x] = \int_{-\infty}^{x} f(x) dx
3 | Mean = E[X] = \sum_i x_i p(x_i) | Mean = E[X] = \int_{-\infty}^{\infty} x f(x) dx
4 | E[X^2] = \sum_i x_i^2 p(x_i) | E[X^2] = \int_{-\infty}^{\infty} x^2 f(x) dx
5 | Var(X) = E[X^2] - [E(X)]^2 | Var(X) = E[X^2] - [E(X)]^2
6 | r-th Moment = E[X^r] = \sum_i x_i^r p(x_i) | r-th Moment = E[X^r] = \int_{-\infty}^{\infty} x^r f(x) dx
7 | M.G.F: M_X(t) = E[e^{tX}] = \sum_x e^{tx} p(x) | M.G.F: M_X(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f(x) dx
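The discrete-side formulas above can be spot-checked numerically. A minimal sketch, assuming a fair six-sided die as the example distribution (the die is an illustrative choice, not from the text):

```python
from fractions import Fraction

# Hypothetical example: a fair six-sided die as a discrete distribution.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

total = sum(pmf.values())                       # total probability, should be 1
mean = sum(x * p for x, p in pmf.items())       # E[X] = sum x_i p(x_i)
second = sum(x**2 * p for x, p in pmf.items())  # E[X^2] = sum x_i^2 p(x_i)
variance = second - mean**2                     # Var(X) = E[X^2] - [E(X)]^2

print(total, mean, variance)  # 1, 7/2, 35/12
```

Using exact `Fraction` arithmetic makes the identities hold exactly rather than to floating-point tolerance.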
4) E(aX + b) = aE(X) + b
5) Var(aX + b) = a^2 Var(X)
6) Var(aX \pm bY) = a^2 Var(X) + b^2 Var(Y)
7) Standard Deviation = \sqrt{Var(X)}
8) f(x) = F'(x)
9) P(X > a) = 1 - P(X \le a)
10) P(A / B) = \frac{P(A \cap B)}{P(B)}, P(B) \ne 0
11) If A and B are independent, then P(A \cap B) = P(A) P(B).
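The linearity and variance rules 4)-6) can be verified by exact enumeration. A small sketch, assuming two independent fair dice X and Y (an illustrative setup, not from the text):

```python
from itertools import product
from fractions import Fraction

# Hypothetical check of properties 4)-6) with two independent fair dice.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)  # each (x, y) pair is equally likely

def E(g):
    """Expectation of g(X, Y) under the joint uniform pmf."""
    return sum(g(x, y) * p for x, y in outcomes)

def Var(g):
    return E(lambda x, y: g(x, y) ** 2) - E(g) ** 2

a, b = 3, 5
# 4) E(aX + b) = aE(X) + b
assert E(lambda x, y: a * x + b) == a * E(lambda x, y: x) + b
# 5) Var(aX + b) = a^2 Var(X)
assert Var(lambda x, y: a * x + b) == a**2 * Var(lambda x, y: x)
# 6) Var(aX - bY) = a^2 Var(X) + b^2 Var(Y), since X and Y are independent
assert Var(lambda x, y: a * x - b * y) == \
    a**2 * Var(lambda x, y: x) + b**2 * Var(lambda x, y: y)
print("linearity identities verified")
```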
12) 1st Moment about origin = E[X] = \left[ M_X'(t) \right]_{t=0} (Mean)
    2nd Moment about origin = E[X^2] = \left[ M_X''(t) \right]_{t=0}
    The co-efficient of \frac{t^r}{r!} in M_X(t) = E[X^r] (r-th Moment about the origin)
13) Limitations of M.G.F:
i) A random variable X may have no moments although its M.G.F exists.
ii) A random variable X can have its M.G.F and some or all moments, yet the
M.G.F does not generate the moments.
iii) A random variable X can have all or some moments, but the M.G.F does not
exist except perhaps at one point.
14) Properties of M.G.F:
i) If Y = aX + b, then M_Y(t) = e^{bt} M_X(at).
ii) M_{cX}(t) = M_X(ct), where c is a constant.
iii) If X and Y are two independent random variables, then M_{X+Y}(t) = M_X(t) M_Y(t).
15) P.D.F, M.G.F, Mean and Variance of all the distributions:

Sl.No. | Distribution | P.D.F (P(X = x)) | M.G.F | Mean | Variance
1 | Binomial | nC_x p^x q^{n-x} | (q + pe^t)^n | np | npq
2 | Poisson | \frac{e^{-\lambda} \lambda^x}{x!} | e^{\lambda(e^t - 1)} | \lambda | \lambda
3 | Geometric | q^{x-1} p (or) q^x p | \frac{pe^t}{1 - qe^t} | \frac{1}{p} | \frac{q}{p^2}
4 | Uniform | f(x) = \frac{1}{b-a}, a < x < b; 0, otherwise | \frac{e^{bt} - e^{at}}{(b-a)t} | \frac{a+b}{2} | \frac{(b-a)^2}{12}
5 | Exponential | f(x) = \lambda e^{-\lambda x}, x > 0, \lambda > 0; 0, otherwise | \frac{\lambda}{\lambda - t} | \frac{1}{\lambda} | \frac{1}{\lambda^2}
6 | Gamma | f(x) = \frac{e^{-x} x^{\lambda - 1}}{\Gamma(\lambda)}, 0 < x < \infty, \lambda > 0 | \frac{1}{(1-t)^{\lambda}} | \lambda | \lambda
7 | Normal | f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2} | e^{\mu t + \frac{\sigma^2 t^2}{2}} | \mu | \sigma^2
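Any row of the table can be checked by direct summation of the P.D.F. A sketch for the Binomial row (the values n = 10, p = 0.3 are arbitrary illustrative choices):

```python
from math import comb

# Hypothetical check of the Binomial row: pmf nC_x p^x q^(n-x)
# should have total 1, mean np and variance npq.
n, p = 10, 0.3
q = 1 - p
pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

mean = sum(x * px for x, px in enumerate(pmf))
var = sum(x * x * px for x, px in enumerate(pmf)) - mean**2
print(round(sum(pmf), 6), round(mean, 6), round(var, 6))  # 1.0 3.0 2.1
```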
16) Memoryless property of exponential distribution:
    P(X > S + t / X > S) = P(X > t).
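A minimal numeric sketch of memorylessness, using the exponential survival function P(X > x) = e^{-\lambda x} (the rate and times below are arbitrary illustrative values):

```python
from math import exp

# Hypothetical check: for X ~ Exp(lam), P(X > s + t | X > s) = P(X > t).
lam, s, t = 0.7, 2.0, 3.0
surv = lambda x: exp(-lam * x)   # survival function P(X > x)

lhs = surv(s + t) / surv(s)      # conditional probability P(X > s+t | X > s)
rhs = surv(t)                    # P(X > t)
print(abs(lhs - rhs) < 1e-12)    # True
```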
17) Function of a random variable: f_Y(y) = f_X(x) \left| \frac{dx}{dy} \right|
UNIT-II (TWO DIMENSIONAL RANDOM VARIABLES)
1) \sum_i \sum_j p_{ij} = 1
   Marginal density function of Y, f(y) = f_Y(y) = \int_{-\infty}^{\infty} f(x, y) dx
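For a discrete joint distribution, the marginal of Y is obtained by summing p_{ij} over x. A sketch with a small hypothetical joint pmf (the table entries are invented for illustration):

```python
from fractions import Fraction

# Hypothetical discrete joint distribution p_ij on a 2x3 grid.
F = Fraction
joint = {(0, 0): F(1, 6), (0, 1): F(1, 6), (0, 2): F(1, 6),
         (1, 0): F(1, 12), (1, 1): F(1, 4), (1, 2): F(1, 6)}

total = sum(joint.values())  # sum_i sum_j p_ij, must be 1

# Marginal of Y: sum the joint pmf over x (the discrete analogue of
# integrating f(x, y) with respect to x).
marg_y = {}
for (x, y), p in joint.items():
    marg_y[y] = marg_y.get(y, F(0)) + p

print(total, marg_y)
```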
7) P(X + Y \ge 1) = 1 - P(X + Y < 1)
8) Correlation co-efficient (Discrete): \rho(X, Y) = \frac{Cov(X, Y)}{\sigma_X \sigma_Y}
   Cov(X, Y) = \frac{1}{n} \sum XY - \bar{X}\bar{Y},
   \sigma_X^2 = \frac{1}{n} \sum X^2 - \bar{X}^2,
   \sigma_Y^2 = \frac{1}{n} \sum Y^2 - \bar{Y}^2
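The discrete correlation coefficient of item 8 can be computed directly with the "1/n sum minus product of means" forms. A sketch with an assumed paired sample (the data are illustrative, not from the text):

```python
from math import sqrt

# Hypothetical paired sample; rho = Cov(X, Y) / (sigma_X * sigma_Y).
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(X)

xbar, ybar = sum(X) / n, sum(Y) / n
cov = sum(x * y for x, y in zip(X, Y)) / n - xbar * ybar   # (1/n)sum XY - XbarYbar
sx = sqrt(sum(x * x for x in X) / n - xbar**2)             # sigma_X
sy = sqrt(sum(y * y for y in Y) / n - ybar**2)             # sigma_Y
rho = cov / (sx * sy)
print(round(rho, 4))  # 0.7746
```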
9) Correlation co-efficient (Continuous): \rho(X, Y) = \frac{Cov(X, Y)}{\sigma_X \sigma_Y}
   Cov(X, Y) = E(X, Y) - E(X) E(Y), \sigma_X = \sqrt{Var(X)}, \sigma_Y = \sqrt{Var(Y)}
10) If X and Y are uncorrelated random variables, then Cov(X, Y) = 0.
11) E(X) = \int_{-\infty}^{\infty} x f(x) dx,  E(Y) = \int_{-\infty}^{\infty} y f(y) dy,
    E(X, Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy f(x, y) dx dy.
12) Regression for Discrete random variable:
    Regression line X on Y is x - \bar{x} = b_{xy} (y - \bar{y}),
    b_{xy} = \frac{\sum (x - \bar{x})(y - \bar{y})}{\sum (y - \bar{y})^2}
    Regression line Y on X is y - \bar{y} = b_{yx} (x - \bar{x}),
    b_{yx} = \frac{\sum (x - \bar{x})(y - \bar{y})}{\sum (x - \bar{x})^2}
    Correlation through the regression: r = \pm \sqrt{b_{xy} \cdot b_{yx}}.  Note: \rho(x, y) = r(x, y)
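The two regression slopes recover the correlation coefficient via r^2 = b_xy * b_yx. A sketch with an assumed sample (data invented for illustration; the sign of r is taken from the common sign of the slopes):

```python
from math import sqrt, copysign

# Hypothetical check that the regression slopes reproduce r.
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 5]
n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n

sxy = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y))
b_xy = sxy / sum((y - ybar) ** 2 for y in Y)  # slope of X on Y
b_yx = sxy / sum((x - xbar) ** 2 for x in X)  # slope of Y on X

r = copysign(sqrt(b_xy * b_yx), b_yx)  # r = +/- sqrt(b_xy * b_yx)
print(round(r, 4))  # 0.7746
```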
13) Regression for Continuous random variable:
    Regression line X on Y is x - E(x) = b_{xy} (y - E(y)),  b_{xy} = r \frac{\sigma_x}{\sigma_y}
    Regression line Y on X is y - E(y) = b_{yx} (x - E(x)),  b_{yx} = r \frac{\sigma_y}{\sigma_x}
    Regression curve X on Y is x = E(x / y) = \int_{-\infty}^{\infty} x f(x / y) dx
    Regression curve Y on X is y = E(y / x) = \int_{-\infty}^{\infty} y f(y / x) dy
14) Transformation of Random Variables:
    f_Y(y) = f_X(x) \left| \frac{dx}{dy} \right| (One dimensional random variable)
    f_{UV}(u, v) = f_{XY}(x, y) \left| \begin{matrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{matrix} \right| (Two dimensional random variable)
15) Central limit theorem (Liapounoff's form):
    If X_1, X_2, ..., X_n is a sequence of independent R.V.s with E[X_i] = \mu_i and
    Var(X_i) = \sigma_i^2, i = 1, 2, ..., n, and if S_n = X_1 + X_2 + ... + X_n, then under
    certain general conditions S_n follows a normal distribution with mean
    \mu = \sum_{i=1}^{n} \mu_i and variance \sigma^2 = \sum_{i=1}^{n} \sigma_i^2 as n \to \infty.
16) Central limit theorem (Lindeberg-Levy's form):
    If X_1, X_2, ..., X_n is a sequence of independent, identically distributed R.V.s with
    E[X_i] = \mu and Var(X_i) = \sigma^2, i = 1, 2, ..., n, and if S_n = X_1 + X_2 + ... + X_n,
    then under certain general conditions S_n follows a normal distribution with mean
    n\mu and variance n\sigma^2 as n \to \infty.
    Note: z = \frac{S_n - n\mu}{\sigma \sqrt{n}} (for n variables),  z = \frac{\bar{X} - \mu}{\sigma / \sqrt{n}} (for the sample mean).
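The two standardization forms of z give the same value, one written for the sum S_n and one for the mean. A sketch, assuming fair-die parameters (mu = 3.5, sigma^2 = 35/12) and an invented observed sum:

```python
from math import sqrt

# Hypothetical illustration of the two z-forms: n = 48 die rolls with
# observed sum s = 180 (both values chosen for illustration).
mu, sigma2, n, s = 3.5, 35 / 12, 48, 180
sigma = sqrt(sigma2)

z_sum = (s - n * mu) / (sigma * sqrt(n))    # z for the sum S_n
z_mean = (s / n - mu) / (sigma / sqrt(n))   # z for the sample mean
print(round(z_sum, 4), round(z_mean, 4))    # the two forms agree
```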
Note: \lim_{T \to \infty} Var(\bar{X}_T) = 0, where \bar{X}_T is the time average of X(t).
9) Auto covariance function:
   C_{XX}(\tau) = R_{XX}(\tau) - E[X(t)] E[X(t + \tau)]
10) Mean and variance of time average:
    Mean: E[\bar{X}_T] = \frac{1}{T} \int_{0}^{T} E[X(t)] dt
    Variance: Var(\bar{X}_T) = \frac{1}{2T} \int_{-2T}^{2T} \left( 1 - \frac{|\tau|}{2T} \right) C_{XX}(\tau) d\tau, where C_{XX}(\tau) = R_{XX}(\tau) - \{E[X(t)]\}^2
11) Markov process:
    A random process in which the future value depends only on the present value
    and not on the past values is called a Markov process. It is symbolically
    represented by
    P[X(t_{n+1}) \le x_{n+1} / X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, ..., X(t_0) = x_0]
    = P[X(t_{n+1}) \le x_{n+1} / X(t_n) = x_n]
    where t_0 \le t_1 \le t_2 \le ... \le t_n \le t_{n+1}.
12) Markov Chain:
    If, for all n,
    P[X_n = a_n / X_{n-1} = a_{n-1}, X_{n-2} = a_{n-2}, ..., X_0 = a_0]
    = P[X_n = a_n / X_{n-1} = a_{n-1}]
    then the process \{X_n\}, n = 0, 1, 2, ..., is called a Markov chain, where
    a_0, a_1, a_2, ..., a_n, ... are called the states of the Markov chain.
13) Transition Probability Matrix (tpm):
    When the Markov chain is homogeneous, the one-step transition probability is
    denoted by P_{ij}. The matrix P = \{P_{ij}\} is called the transition probability matrix.
14) Chapman-Kolmogorov theorem:
    If P is the tpm of a homogeneous Markov chain, then the n-step tpm P^{(n)} is
    equal to P^n, i.e. \left[ P_{ij}^{(n)} \right] = \left[ P_{ij} \right]^n.
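The theorem can be illustrated on a small chain: the 2-step transition probabilities are the entries of the matrix square of the 1-step tpm. A sketch with an assumed 2-state tpm:

```python
# Hypothetical 2-state homogeneous chain: the 2-step tpm equals P^2.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def matmul(A, B):
    """Plain nested-loop matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P2 = matmul(P, P)
# Direct 2-step probability of 0 -> 1: pass through state 0 or state 1.
direct = P[0][0] * P[0][1] + P[0][1] * P[1][1]
print(P2[0][1], direct)  # both 0.39 (up to float rounding)
```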
15) Markov Chain property: If \Pi = \left( \Pi_1, \Pi_2, \Pi_3 \right), then \Pi P = \Pi and
    \Pi_1 + \Pi_2 + \Pi_3 = 1.
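The stationary distribution \Pi can be found numerically by repeatedly multiplying an initial distribution by the tpm. A sketch, assuming a 3-state tpm invented for illustration:

```python
# Hypothetical 3-state tpm; power iteration toward Pi with Pi P = Pi.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.3, 0.6]]

pi = [1 / 3, 1 / 3, 1 / 3]  # any starting distribution works for a regular chain
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# Residual of the fixed-point equation Pi P = Pi.
residual = max(abs(sum(pi[i] * P[i][j] for i in range(3)) - pi[j])
               for j in range(3))
print([round(x, 4) for x in pi], residual < 1e-12)
```

Solving \Pi P = \Pi together with \Pi_1 + \Pi_2 + \Pi_3 = 1 as a linear system gives the same answer; power iteration is simply the shortest code.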
16) Poisson process:
    If X(t) represents the number of occurrences of a certain event in (0, t), then
    the discrete random process \{X(t)\} is called a Poisson process, provided the
    following postulates are satisfied.
    (i) P[1 occurrence in (t, t + \Delta t)] = \lambda \Delta t + O(\Delta t)
    (ii) P[0 occurrences in (t, t + \Delta t)] = 1 - \lambda \Delta t + O(\Delta t)
    (iii) P[2 or more occurrences in (t, t + \Delta t)] = O(\Delta t)
    (iv) X(t) is independent of the number of occurrences of the event in any
    interval.
17) Probability law of Poisson process: P\{X(t) = x\} = \frac{e^{-\lambda t} (\lambda t)^x}{x!}, x = 0, 1, 2, ...
    Mean E[X(t)] = \lambda t,  E[X^2(t)] = \lambda^2 t^2 + \lambda t,  Var[X(t)] = \lambda t.
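Summing the probability law numerically confirms that mean and variance both equal \lambda t. A sketch, assuming \lambda t = 2.5 as an arbitrary value:

```python
from math import exp, factorial

# Hypothetical check of item 17: pmf of X(t) with lambda*t = 2.5.
lt = 2.5  # lambda * t
pmf = [exp(-lt) * lt**x / factorial(x) for x in range(60)]  # tail is negligible

mean = sum(x * p for x, p in enumerate(pmf))
second = sum(x * x * p for x, p in enumerate(pmf))
var = second - mean**2
print(round(mean, 6), round(var, 6))  # both ≈ 2.5 = lambda*t
```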
UNIT-IV (CORRELATION AND SPECTRAL DENSITY)
R_{XX}(\tau) - Auto correlation function
S_{XX}(\omega) - Power spectral density (or) Spectral density
R_{XY}(\tau) - Cross correlation function
S_{XY}(\omega) - Cross power spectral density
1) Auto correlation to Power spectral density (spectral density):
   S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau) e^{-i\omega\tau} d\tau
2) Power spectral density to Auto correlation:
   R_{XX}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XX}(\omega) e^{i\omega\tau} d\omega
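The transform pair can be checked numerically on a standard example: for R_XX(\tau) = e^{-a|\tau|} the spectral density is S_XX(\omega) = 2a/(a^2 + \omega^2). A sketch (the values of a and \omega, the truncation length and the step are all assumed for illustration):

```python
from math import exp, cos

# Hypothetical numeric check: S_XX(w) for R_XX(tau) = e^{-a|tau|}.
a, w = 1.5, 2.0

# R is even, so the imaginary part of the integral cancels and
# S_XX(w) = 2 * Integral_0^inf e^{-a tau} cos(w tau) dtau (trapezoidal rule).
T, n = 30.0, 200_000
h = T / n
f = lambda tau: 2 * exp(-a * tau) * cos(w * tau)
s = (f(0) / 2 + sum(f(k * h) for k in range(1, n)) + f(T) / 2) * h

print(round(s, 6))  # ≈ 2a/(a^2 + w^2) = 0.48
```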
3) Condition for X(t) and X(t + \tau) to be uncorrelated random processes:
   C_{XX}(\tau) = R_{XX}(\tau) - E[X(t)] E[X(t + \tau)] = 0
4) Cross power spectrum to Cross correlation:
   R_{XY}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XY}(\omega) e^{i\omega\tau} d\omega
5) General formulas:
   i) \int e^{ax} \cos bx\, dx = \frac{e^{ax}}{a^2 + b^2} \left( a \cos bx + b \sin bx \right)
   ii) \int e^{ax} \sin bx\, dx = \frac{e^{ax}}{a^2 + b^2} \left( a \sin bx - b \cos bx \right)
   iii) x^2 + ax = \left( x + \frac{a}{2} \right)^2 - \frac{a^2}{4}
   iv) \sin\theta = \frac{e^{i\theta} - e^{-i\theta}}{2i}
   v) \cos\theta = \frac{e^{i\theta} + e^{-i\theta}}{2}
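Formula i) can be spot-checked by differentiating the antiderivative numerically and comparing with the integrand (the values of a, b and the evaluation point are arbitrary illustrative choices):

```python
from math import exp, cos, sin

# Hypothetical spot-check of formula i): d/dx of
# F(x) = e^{ax}(a cos bx + b sin bx)/(a^2 + b^2) should equal e^{ax} cos bx.
a, b, x = 1.2, 3.0, 0.7

def F(x):
    return exp(a * x) * (a * cos(b * x) + b * sin(b * x)) / (a**2 + b**2)

h = 1e-6
deriv = (F(x + h) - F(x - h)) / (2 * h)  # central difference
print(abs(deriv - exp(a * x) * cos(b * x)) < 1e-6)  # True
```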
UNIT-V (LINEAR SYSTEMS WITH RANDOM INPUTS)
1) Linear system:
   f is called a linear system if it satisfies
   f[a_1 X_1(t) + a_2 X_2(t)] = a_1 f[X_1(t)] + a_2 f[X_2(t)]
2) Time invariant system:
   Let Y(t) = f[X(t)]. If Y(t + h) = f[X(t + h)], then f is called a time
   invariant system.
3) Relation between input X(t) and output Y(t):
   Y(t) = \int_{-\infty}^{\infty} h(u) X(t - u) du
   where h(u) is the system weighting function.
4) Relation between the power spectrum of X(t) and the output Y(t):
   S_{YY}(\omega) = S_{XX}(\omega) \left| H(\omega) \right|^2
   If H(\omega) is not given, use the formula H(\omega) = \int_{-\infty}^{\infty} h(t) e^{-j\omega t} dt
5) Contour integral:
   \int_{-\infty}^{\infty} \frac{e^{imx}}{a^2 + x^2} dx = \frac{\pi}{a} e^{-ma} (one of the results)
6) F^{-1} \left\{ \frac{1}{a^2 + \omega^2} \right\} = \frac{e^{-a|x|}}{2a} (from the Fourier transform)
---- All the Best ----