
ISyE 3232

Z. She, YL. Chang

Stochastic Manufacturing and Service Models

Fall 2015

Homework 6 - Solutions
October 2, 2015
1. Suppose each morning a factory posts the number of days worked in a row without any injuries. Assume that each day is injury-free with probability 99/100. Let X0 = 0 be the number posted on the morning the factory first opened. Let Xn be the number posted on the morning after n full days of work. Is X0, X1, . . . a Markov chain? If so, give its state space, initial distribution, and transition matrix P. If not, show that it is not a Markov chain.

Note that Xn is the number of consecutive injury-free days posted on the morning after n full days of work. Let us first work out a table of the possible transitions of Xn. Let In = 1 if the nth day is injury-free and In = 0 otherwise.
Xn     In+1 (injury-free on the (n+1)st day?)     Xn+1
0      1 (Yes)                                    1
0      0 (No)                                     0
1      1 (Yes)                                    2
1      0 (No)                                     0
2      1 (Yes)                                    3
2      0 (No)                                     0
...    ...                                        ...
So from the above table, you can see that Xn+1 is a function of Xn and In+1 as follows:

Xn+1 = Xn + 1, if In+1 = 1,
Xn+1 = 0,      if In+1 = 0.

Since Xn+1 depends on the history only through Xn and the independent injury indicator In+1, this proves that {Xn} is a DTMC.
State space is S = {0, 1, 2, . . .}.
The initial distribution is given by
a(0) = (1, 0, 0, . . .)
because it is given that X0 = 0.
Now we find the transition matrix. Note that
P (Xn+1 = j + 1|Xn = j) = P (In+1 = 1) = 0.99 and
P (Xn+1 = 0|Xn = j) = P (In+1 = 0) = 0.01.
Therefore the transition probability matrix is

P =
  0.01  0.99  0     0     0     ...
  0.01  0     0.99  0     0     ...
  0.01  0     0     0.99  0     ...
  ...   ...   ...   ...   ...   ...
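As a quick numerical sanity check, here is a minimal Python/NumPy sketch (the helper name build_P and the truncation level N are my own choices, not part of the problem) that builds a truncated version of this infinite matrix and verifies that every row sums to one:

import numpy as np

def build_P(N=10, p=0.99):
    # States 0, 1, ..., N of the injury-free-days counter, truncated at N.
    # From state j the chain resets to 0 with probability 1 - p (an injury)
    # and moves to j + 1 with probability p (capped at N by the truncation).
    P = np.zeros((N + 1, N + 1))
    for j in range(N + 1):
        P[j, 0] = 1 - p
        P[j, min(j + 1, N)] += p
    return P

P = build_P()
print(P[:3, :4])                      # matches the upper-left corner above
assert np.allclose(P.sum(axis=1), 1)  # each row is a probability distribution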

2. Let X0 , X1 , . . . be a Markov chain with state space {0, 1, 2}, initial distribution a = (?, 0.4, 0.5), and
transition matrix

P =
  .4  ?   .5
  .3  ?   .1
  ?   .2  .5
Fill in the entries for P and a, and compute the following:

(a) Pr {X1 = 2 | X0 = 2},


(b) Pr {X8 = 2, X10 = 1 | X7 = 0},
(c) Pr {X1 = 0},
(d) Pr {X0 = 2 | X1 = 2}
(e) E[X2² | X0 = 2],
(f) E[X2²]
(g) Pr {X0 = 2 | X0 = 0}
(h) Pr {X2 = 1 | X0 = 2}
(i) Pr {X2 = 1, X0 = 2}
(j) Pr {X1 = 1 | X0 = 0, X2 = 2}
(k) Pr {X0 X1 X2 = 1}
Since the initial probability vector must sum up to 1, we have that:
a(0) = (0.1, 0.4, 0.5)

Moreover, if the probability of moving from i to j in one time step is Pr(Xn+1 = j | Xn = i) = pij, then each row of the transition matrix must sum to one, i.e.

Σ_j pij = 1.
Hence, we get that:


p01 = 0.1, p11 = 0.6, p20 = 0.3.

P =
  .4  .1  .5
  .3  .6  .1
  .3  .2  .5

P² =
  .34  .20  .46
  .33  .41  .26
  .33  .25  .42
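These entries are easy to verify numerically; a minimal NumPy sketch (the array names are mine):

import numpy as np

P = np.array([[0.4, 0.1, 0.5],
              [0.3, 0.6, 0.1],
              [0.3, 0.2, 0.5]])

P2 = np.linalg.matrix_power(P, 2)    # two-step transition matrix P^2
print(P2)                            # reproduces the matrix above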

a) Pr(X1 = 2|X0 = 2) = p22 = 0.5


b)
Pr(X8 = 2, X10 = 1 | X7 = 0)
  = Pr(X10 = 1 | X8 = 2, X7 = 0) Pr(X8 = 2 | X7 = 0)
  = Pr(X10 = 1 | X8 = 2) Pr(X8 = 2 | X7 = 0)
  = p21^(2) p02 = 0.25 × 0.5 = 0.125.

Note that p21^(2) is the (2, 1)th element of P², and thus from above p21^(2) = 0.25.
c)
Pr(X1 = 0)
  = Σ_{i=0}^{2} Pr(X1 = 0, X0 = i)
  = Σ_{i=0}^{2} Pr(X1 = 0 | X0 = i) Pr(X0 = i)
  = p00 a0 + p10 a1 + p20 a2 = 0.4 × 0.1 + 0.3 × 0.4 + 0.3 × 0.5 = 0.31.
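Equivalently, the entire distribution of X1 is the row vector a(0) P, so Pr(X1 = 0) can be read off a single vector-matrix product; a short NumPy check (using the same P as above):

import numpy as np

a0 = np.array([0.1, 0.4, 0.5])            # initial distribution a(0)
P = np.array([[0.4, 0.1, 0.5],
              [0.3, 0.6, 0.1],
              [0.3, 0.2, 0.5]])

a1 = a0 @ P                               # distribution of X_1
print(a1)                                 # [0.31 0.35 0.34]; Pr(X1 = 0) = 0.31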

d)
Pr(X0 = 2 | X1 = 2)
  = Pr(X0 = 2, X1 = 2) / Pr(X1 = 2)
  = Pr(X1 = 2 | X0 = 2) Pr(X0 = 2) / Pr(X1 = 2)
  = p22 a2 / Pr(X1 = 2)
  = (0.5 × 0.5) / 0.34 ≈ 0.735,

where

Pr(X1 = 2)
  = Σ_{i=0}^{2} Pr(X1 = 2, X0 = i)
  = Σ_{i=0}^{2} Pr(X1 = 2 | X0 = i) Pr(X0 = i)
  = p02 a0 + p12 a1 + p22 a2 = 0.5 × 0.1 + 0.1 × 0.4 + 0.5 × 0.5 = 0.34.


e) E[X2² | X0 = 2] = Σ_{i=0}^{2} i² Pr(X2 = i | X0 = 2) = Σ_{i=0}^{2} i² p2i^(2) = 0 × 0.33 + 1 × 0.25 + 4 × 0.42 = 1.93.

f) E[X2²] = Σ_{i=0}^{2} E[X2² | X0 = i] Pr(X0 = i) = 2.04 × 0.1 + 1.45 × 0.4 + 1.93 × 0.5 = 1.749, where

E[X2² | X0 = 0] = Σ_{i=0}^{2} i² Pr(X2 = i | X0 = 0) = Σ_{i=0}^{2} i² p0i^(2) = 0 × 0.34 + 1 × 0.2 + 4 × 0.46 = 2.04,
E[X2² | X0 = 1] = Σ_{i=0}^{2} i² Pr(X2 = i | X0 = 1) = Σ_{i=0}^{2} i² p1i^(2) = 0 × 0.33 + 1 × 0.41 + 4 × 0.26 = 1.45,

and from part (e), E[X2² | X0 = 2] = 1.93.
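The conditional second moments in parts (e) and (f) can also be read directly off P²; a minimal NumPy sketch (the variable names are mine):

import numpy as np

a0 = np.array([0.1, 0.4, 0.5])
P = np.array([[0.4, 0.1, 0.5],
              [0.3, 0.6, 0.1],
              [0.3, 0.2, 0.5]])
P2 = np.linalg.matrix_power(P, 2)

i_squared = np.array([0, 1, 4])       # i^2 for the states i = 0, 1, 2
cond = P2 @ i_squared                 # E[X2^2 | X0 = i] for i = 0, 1, 2
print(cond)                           # [2.04 1.45 1.93]
print(a0 @ cond)                      # E[X2^2] = 1.749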
g) Pr(X0 = 2|X0 = 0) = 0, since the Markov chain cannot be at two different states at the same time.
h)
Pr(X2 = 1 | X0 = 2)
  = Σ_{i=0}^{2} Pr(X2 = 1, X1 = i | X0 = 2)
  = Σ_{i=0}^{2} Pr(X2 = 1 | X1 = i, X0 = 2) Pr(X1 = i | X0 = 2)
  = Σ_{i=0}^{2} Pr(X2 = 1 | X1 = i) Pr(X1 = i | X0 = 2)
  = Σ_{i=0}^{2} pi1 p2i
  = p01 p20 + p11 p21 + p21 p22
  = 0.1 × 0.3 + 0.6 × 0.2 + 0.2 × 0.5 = 0.25.

Note that Pr(X2 = 1 | X1 = i, X0 = 2) = Pr(X2 = 1 | X1 = i) because only the most recent information matters, by the Markov property.
i) Pr(X2 = 1, X0 = 2) = Pr(X2 = 1 | X0 = 2) Pr(X0 = 2) = 0.25 × 0.5 = 0.125.
j)
Pr(X1 = 1 | X0 = 0, X2 = 2)
  = Pr(X1 = 1, X0 = 0, X2 = 2) / Pr(X0 = 0, X2 = 2)
  = Pr(X2 = 2 | X1 = 1, X0 = 0) Pr(X1 = 1, X0 = 0) / [Pr(X2 = 2 | X0 = 0) Pr(X0 = 0)]
  = Pr(X2 = 2 | X1 = 1) Pr(X1 = 1 | X0 = 0) Pr(X0 = 0) / [Pr(X2 = 2 | X0 = 0) Pr(X0 = 0)]
  = p12 p01 a0 / (p02^(2) a0)
  = (0.1 × 0.1) / 0.46 = 1/46 ≈ 0.0217.

k) The only possible combination of (X0, X1, X2) whose product equals 1 is (1, 1, 1). Thus,

Pr(X0 X1 X2 = 1)
  = Pr(X2 = 1 | X1 = 1, X0 = 1) Pr(X1 = 1, X0 = 1)
  = Pr(X2 = 1 | X1 = 1) Pr(X1 = 1 | X0 = 1) Pr(X0 = 1)
  = p11 p11 a1 = 0.6 × 0.6 × 0.4 = 0.144.
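As a cross-check on parts (a)-(k), the chain is easy to simulate; the sketch below (the function simulate and the number of sample paths are arbitrary choices of mine) estimates Pr(X0 X1 X2 = 1) by Monte Carlo and should land close to 0.144:

import numpy as np

rng = np.random.default_rng(0)
a0 = np.array([0.1, 0.4, 0.5])
P = np.array([[0.4, 0.1, 0.5],
              [0.3, 0.6, 0.1],
              [0.3, 0.2, 0.5]])

def simulate(n_steps, n_paths):
    # Simulate n_paths independent trajectories X_0, ..., X_{n_steps}.
    X = np.empty((n_paths, n_steps + 1), dtype=int)
    X[:, 0] = rng.choice(3, size=n_paths, p=a0)
    for t in range(n_steps):
        for s in range(3):
            idx = np.where(X[:, t] == s)[0]
            X[idx, t + 1] = rng.choice(3, size=idx.size, p=P[s])
    return X

X = simulate(2, 200_000)
print(np.mean(X.prod(axis=1) == 1))   # Monte Carlo estimate of Pr(X0*X1*X2 = 1)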

3. A six-sided die is rolled repeatedly. After each roll n = 1, 2, ..., let Xn be the largest number rolled in the first n rolls. Is {Xn, n ≥ 1} a discrete-time Markov chain? If it is not, show that it is not. If it is, what are the state space and the transition probabilities of the Markov chain?
The following table shows how Xn changes.

Xn     Un+1 (the outcome of the (n+1)st roll)     Xn+1
1      1                                          1
1      2                                          2
1      3                                          3
1      4                                          4
1      5                                          5
1      6                                          6
2      1                                          2
2      2                                          2
2      3                                          3
2      4                                          4
2      5                                          5
2      6                                          6
3      1                                          3
3      2                                          3
3      3                                          3
3      4                                          4
3      5                                          5
3      6                                          6
...    ...                                        ...

Since Xn+1 = max{Xn, Un+1}, where {Un : n ≥ 1} is an i.i.d. sequence uniformly distributed on {1, 2, 3, 4, 5, 6}, {Xn : n ≥ 1} is a Markov chain with state space S = {1, 2, . . . , 6} and transition probabilities

Pr{Xn+1 = i | Xn = i} = Pr{Un+1 ≤ i} = i/6,
Pr{Xn+1 = j | Xn = i} = 0,                    for j < i,
Pr{Xn+1 = j | Xn = i} = Pr{Un+1 = j} = 1/6,   for j > i.

The transition matrix is

P =
  1/6  1/6  1/6  1/6  1/6  1/6
  0    2/6  1/6  1/6  1/6  1/6
  0    0    3/6  1/6  1/6  1/6
  0    0    0    4/6  1/6  1/6
  0    0    0    0    5/6  1/6
  0    0    0    0    0    6/6
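The same matrix can be generated directly from the rule Xn+1 = max{Xn, Un+1}; a small Python sketch (purely illustrative):

import numpy as np

# Build the 6x6 transition matrix of the running maximum of fair die rolls:
# from state i, the next roll u sends the chain to max(i, u) with prob 1/6.
P = np.zeros((6, 6))
for i in range(1, 7):
    for u in range(1, 7):
        P[i - 1, max(i, u) - 1] += 1 / 6

assert np.allclose(P.sum(axis=1), 1)
print(P[2])   # row for state 3: [0, 0, 3/6, 1/6, 1/6, 1/6]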

4. In a study of the effect of advertising on brand shifting in the U.S. brewing industry, a Markov chain
model was estimated to represent the chance of consumers shifting preferences among Anheuser-Busch
(state 1), Miller (state 2) and other (state 3) beers from 1978 to 1979. The one-step transition
matrix is

P =
  0.9940  0.0046  0.0014
  0.0000  0.9981  0.0019
  0.0084  0.0526  0.9390

Clearly brand loyalty is quite strong since the diagonal elements are all near 1. Suppose that there
were 6 million beer drinkers in 1978, divided equally among the three brands. Answer the following
questions:
If the transition matrix does not change over time (typically it does) and the number of beer drinkers is
unchanged (certainly not true), how many consumers are expected to prefer Anheuser-Busch products
in the long-run?
Solve πP = π together with π1 + π2 + π3 = 1:

π1 = 0.9940 π1 + 0.0084 π3,
π2 = 0.0046 π1 + 0.9981 π2 + 0.0526 π3,
π3 = 0.0014 π1 + 0.0019 π2 + 0.9390 π3,
π1 + π2 + π3 = 1.

Then we get (π1, π2, π3) = (0.0418, 0.9283, 0.0299).
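The same stationary distribution can be obtained numerically by solving the balance equations with the normalization constraint; a minimal NumPy sketch (the variable names are mine):

import numpy as np

P = np.array([[0.9940, 0.0046, 0.0014],
              [0.0000, 0.9981, 0.0019],
              [0.0084, 0.0526, 0.9390]])

# Replace one balance equation of pi P = pi with the constraint sum(pi) = 1.
A = np.vstack([(P.T - np.eye(3))[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print(pi)          # approximately [0.0418, 0.9283, 0.0299]
print(6 * pi[0])   # about 0.25 million long-run Anheuser-Busch drinkers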


If you are interested in the long run or limiting distribution of the process, you can play with this code
on WolframAlpha (http://www.wolframalpha.com/) by replacing n with a large enough integer.
MatrixPower[{{0.994, 0.0046, 0.0014}, {0, 0.9981, 0.0019}, {0.0084,
0.0526, 0.939}}, n]
You will get something similar to

  0.0418  0.9283  0.0299
  0.0418  0.9283  0.0299
  0.0418  0.9283  0.0299

which means that, in the long run, 6 × 0.0418 = 0.2508 million customers prefer Anheuser-Busch.
