
Sampling Distribution

Observation The estimator $\hat{\theta}_n = g(X_1, X_2, \ldots, X_n)$ is a function of RVs; hence, it is an RV itself ⇒ it has its own distribution.

Thought Experiment Imagine hypothetically collecting many different sets of data.
 →
(1)
{x1 , ...
(1)
, xn } →
(1)
θbn 


 


 (2) (2) (2) 

 → {x1 , ... , xn } → θbn 
f (x; θ) → (3) (3) (3) distribution of θbn


 → {x1 , ... , xn } → θbn 



 .. .. 


 

. .

Remark The thought experiment above can easily be simulated on a computer.




Computer Simulation

Pseudo Code

for b = 1 to B
    data = generate(f, size=n)
    theta[b] = estimate(data)
end for
histogram(theta[1:B])
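
A minimal runnable version of this loop, sketched in Python under illustrative assumptions (the model f is taken to be N(5, 1) and the estimator is the sample mean; neither choice comes from the slide):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
B, n = 1000, 30          # number of replications, sample size
theta_true = 5.0         # assumed ground truth (illustrative only)

theta_hat = np.empty(B)
for b in range(B):
    data = rng.normal(theta_true, 1.0, size=n)  # data = generate(f, size=n)
    theta_hat[b] = data.mean()                  # theta[b] = estimate(data)

plt.hist(theta_hat, bins=30)                    # histogram(theta[1:B])
plt.xlabel("theta_hat")
plt.show()

The histogram approximates the sampling distribution of $\hat{\theta}_n$; it should be centered near the ground truth and become narrower as n grows.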

Exercise First, choose a probability model and fix its parameters (at the ground truth). Then, come up with a few different (either intuitively sensible or totally arbitrary) formulae for estimating one of the parameters, and use simulation to argue why one particular formula is better than the others.
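
One possible instance of this exercise, sketched in Python (all choices here are illustrative assumptions, not from the slide: the model is N(5, 1), the parameter of interest is the mean, and the competing formulae are the sample mean, the sample median, and the midrange):

import numpy as np

rng = np.random.default_rng(1)
B, n, theta_true = 5000, 25, 5.0   # replications, sample size, ground truth

formulae = {
    "mean":     np.mean,
    "median":   np.median,
    "midrange": lambda x: (x.min() + x.max()) / 2,
}

for name, formula in formulae.items():
    theta_hat = np.array([formula(rng.normal(theta_true, 1.0, n))
                          for _ in range(B)])
    mse = np.mean((theta_hat - theta_true) ** 2)
    print(f"{name:8s} MSE = {mse:.4f}")

For normal data the sample mean should report the smallest MSE, which is exactly the kind of simulation-based argument the exercise asks for.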




James-Stein Estimator

$X_i \sim N(\theta_i, \sigma^2)$, with $X_1, X_2, \ldots, X_n$ all independent and $\sigma^2$ known

$$
\hat{\theta}_i^{(\mathrm{js})} = \bar{X} + \left(1 - \frac{(n-3)\,\sigma^2}{\sum_{i=1}^{n} (X_i - \bar{X})^2}\right)(X_i - \bar{X})
$$

Remark The obvious estimator is $\hat{\theta}_i = X_i$ (which is also unbiased), but the James-Stein estimator has smaller total MSE. It is counter-intuitive that the estimate of "my" ability should depend on that of "others". This was an important milestone in statistics; we will come back to it later. A simulation check is sketched below.
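
A quick simulation check of this remark, sketched in Python (the setup is an illustrative assumption, not from the slide: n = 10 true means drawn once from N(0, 1), σ = 1, and MSE measured as total squared error summed over all coordinates):

import numpy as np

rng = np.random.default_rng(2)
B, n, sigma = 10000, 10, 1.0
theta = rng.normal(0.0, 1.0, n)      # fixed ground-truth means theta_1..theta_n

se_obvious = np.empty(B)
se_js = np.empty(B)
for b in range(B):
    x = rng.normal(theta, sigma)     # one X_i ~ N(theta_i, sigma^2) per mean
    xbar = x.mean()
    shrink = 1 - (n - 3) * sigma**2 / np.sum((x - xbar) ** 2)
    js = xbar + shrink * (x - xbar)  # the James-Stein formula above
    se_obvious[b] = np.sum((x - theta) ** 2)
    se_js[b] = np.sum((js - theta) ** 2)

print("total MSE, obvious:", se_obvious.mean())
print("total MSE, JS:     ", se_js.mean())

The James-Stein line should come out smaller, illustrating that shrinking each $X_i$ toward the overall mean reduces total MSE even though each $\theta_i$ is unrelated to the others.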

