Outline
Johann Carl Friedrich Gauss
Facts about Gauss
Attended Brunswick College in 1792, where he independently discovered many important theorems before reaching them in his studies
Computed a square root in two different ways to fifty decimal places by ingenious expansions and interpolations
Constructed a regular 17-sided polygon, the first advance in this matter in two millennia; he was only 18 when he made the discovery
Ideas of Gauss
Intellectual Personality and Controversy
Those who knew Gauss best found him to be cold and
uncommunicative.
Formal Arrival of Least Squares
Gauss
Published Theory of the Motion of Heavenly Bodies in 1809.
He gave a probabilistic justification of the method, which was based on the assumption of a normal distribution of errors.
Gauss himself later abandoned the use of the normal error function.
Published Theory of the Combination of Observations Least Subject to Errors in the 1820s. He substituted the root mean square error for Laplace's mean absolute error.
Treatment of Errors
Random error
Error Assumptions
All errors within these limits are possible, but not necessarily
with equal likelihood
Density Function
We define the function φ(x) with the same meaning as a density function, with the following properties.
Positive and negative errors of the same magnitude are equally likely: φ(x) = φ(−x)
Mean and Variance
If k = 0, then the variance will equal m²
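This relation can be checked in one line. A brief derivation, writing m² = E(e²) for the mean square error and k = E(e) for the mean error of an error variable e:

```latex
\operatorname{Var}(e) = E\!\left[(e-k)^2\right]
  = E(e^2) - 2k\,E(e) + k^2
  = m^2 - 2k^2 + k^2
  = m^2 - k^2 .
```

Setting k = 0 recovers the statement that the variance equals m².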
Reasons for m²
More on Variance
If k ≠ 0, then the variance equals m² − k².
Suppose we have independent random variables {e, e′, e″, …}
with standard deviation 1 and expected value 0. A
linear function of the total errors is given by E = λe + λ′e′ + λ″e″ + …
Now the variance of E is given as Var(E) = λ² + λ′² + λ″² + … = Σᵢ λᵢ²
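This variance formula is easy to sanity-check numerically. A minimal Monte Carlo sketch, assuming standard-normal errors and made-up coefficients λ = (0.5, −1.2, 2.0):

```python
import random

# Monte Carlo check that Var(E) equals the sum of the lambda_i^2 when
# E = lam*e + lam'*e' + lam''*e'' and the e's are independent with
# mean 0 and standard deviation 1.  The coefficients are made up.
lams = [0.5, -1.2, 2.0]
theory = sum(l * l for l in lams)  # 0.25 + 1.44 + 4.00 = 5.69

random.seed(0)
n = 200_000
samples = [sum(l * random.gauss(0.0, 1.0) for l in lams) for _ in range(n)]
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n

print(theory)  # theoretical variance
print(var)     # empirical variance, close to the theoretical one
```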
Gauss Derivation of the Method of Least Squares
Problem:
We want to estimate V, V′, V″, … by taking independent observations L, L′, L″, …,
where V, V′, V″, … are functions of unknowns x, y, z, …:
V = f₁(x, y, z, …)
V′ = f₂(x, y, z, …)
V″ = f₃(x, y, z, …)
Let the errors in the observations be:
v := (V − L)/p,  v′ := (V′ − L′)/p′,  …
where the p's are the weights of the 'mean errors of the observations'.
(Note: We scaled the errors so they have the same variance.)
Solve an optimization problem:
min κ² + κ′² + κ″² + …
where κ, κ′, κ″, … are the coefficients of v, v′, v″, …,
s.t. κv + κ′v′ + κ″v″ + etc. = x − k
for some constant k independent of x, y, z, ….
We can state the problem as:
We are looking for a linear mapping G(v, v′, v″, …) from Rⁿ to Rᵖ such that:
1. G ∘ F is the identity on Rᵖ.
2. G satisfies an optimality condition, described below:
Suppose x = g(v, v′, v″, …) is the first component of G. Then
x = g(v, v′, v″, …) = κv + κ′v′ + κ″v″ + … + k.
Solutions:
κ² + κ′² + κ″² + etc. = α² + α′² + α″² + etc. + (κ − α)² + (κ′ − α′)² + (κ″ − α″)² + etc.
where the α's denote the coefficients we derived by elimination
of the system. From this it is obvious that the sum κ² + κ′² + κ″² + …
attains its minimum when κ = α, κ′ = α′, κ″ = α″, etc.
It's still not obvious:
How do these results relate to the least-squares estimation?
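One way to see the connection is a one-parameter sketch (all numbers hypothetical): for observations v_i = a_i·x + e_i and a linear estimator x̂ = Σ κᵢvᵢ, unbiasedness forces Σ κᵢaᵢ = 1, and among all such coefficient vectors the least-squares choice κᵢ = aᵢ / Σ aⱼ² minimizes Σ κᵢ²:

```python
# One-parameter sketch (hypothetical a_i): observations v_i = a_i*x + e_i,
# linear estimator xhat = sum(kappa_i * v_i).  Unbiasedness forces
# sum(kappa_i * a_i) = 1; minimizing sum(kappa_i^2) under that constraint
# gives the least-squares coefficients kappa_i = a_i / sum(a_j^2).
a = [1.0, 2.0, 3.0]

s = sum(ai * ai for ai in a)       # sum of a_j^2 = 14
kappa_ls = [ai / s for ai in a]    # least-squares coefficients

# The least-squares coefficients satisfy the unbiasedness constraint.
assert abs(sum(k * ai for k, ai in zip(kappa_ls, a)) - 1.0) < 1e-12

# Any competitor kappa_ls + t*delta with delta . a = 0 satisfies the
# same constraint but has a strictly larger sum of squares.
delta = [2.0, -1.0, 0.0]           # delta . a = 2*1 - 1*2 + 0*3 = 0
competitor = [k + 0.1 * d for k, d in zip(kappa_ls, delta)]
assert abs(sum(c * ai for c, ai in zip(competitor, a)) - 1.0) < 1e-12

ss_ls = sum(k * k for k in kappa_ls)
ss_comp = sum(c * c for c in competitor)
print(ss_ls < ss_comp)  # True
```

The identity on the slide is the same fact: the cross term between the elimination coefficients α and any feasible perturbation vanishes, so the α's achieve the minimum.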
Gauss derivation by modern matrix notation:
Assume that observable quantities V₁, V₂, …, Vₙ are linear
functions of parameters x₁, x₂, …, xₚ, such that
Vᵢ = bᵢ₁x₁ + … + bᵢₚxₚ + cᵢ,  where bᵢⱼ, cᵢ ∈ R,
and we know the values of all the bᵢⱼ and cᵢ.
We measure the Vᵢ in an attempt to infer the values of the xᵢ.
Assume Lᵢ is an observation of Vᵢ.
Switch to a new coordinate system by setting:
vᵢ = (Vᵢ − Lᵢ)/pᵢ
The system becomes:
v = Ax + l
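In this notation, least squares picks the x minimizing vᵀv, which leads to the normal equations AᵀA x = −Aᵀl. A small self-contained sketch with hypothetical data (two parameters, six observations, the 2×2 normal equations solved by Cramer's rule):

```python
import random

# Hypothetical instance of the system v = A x + l, with two parameters
# and six observations.  The constants l are built from a chosen "true"
# x plus small noise, so the estimate should land near x_true.
random.seed(0)
x_true = (2.0, -1.0)
A = [(random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)) for _ in range(6)]
l = [-(a0 * x_true[0] + a1 * x_true[1]) + 0.01 * random.gauss(0.0, 1.0)
     for a0, a1 in A]

# Least squares minimizes v^T v with v = A x + l, which gives the
# normal equations A^T A x = -A^T l; solve the 2x2 case by Cramer's rule.
s00 = sum(a0 * a0 for a0, _ in A)
s01 = sum(a0 * a1 for a0, a1 in A)
s11 = sum(a1 * a1 for _, a1 in A)
b0 = -sum(a0 * li for (a0, _), li in zip(A, l))
b1 = -sum(a1 * li for (_, a1), li in zip(A, l))
det = s00 * s11 - s01 * s01
x_hat = ((s11 * b0 - s01 * b1) / det, (s00 * b1 - s01 * b0) / det)
print(x_hat)  # close to x_true = (2.0, -1.0)
```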
In a linear model
Y = Ax + ε,
where A is an n × p matrix with rank p, x is an unknown vector, and ε is the
error vector: if E(ε) = 0 and Var(ε) = σ²I, then for any linear unbiased estimator
θ̃ of Cᵀx, the least-squares estimator θ̂_LS = Cᵀx̂_LS satisfies
E(θ̂_LS) = Cᵀx and Var(θ̂_LS) ≤ Var(θ̃).
In other words, when the ε's have the same variance and are uncorrelated, the least-
squares estimator is the best linear unbiased estimator, the one with the smallest variance.
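This Gauss–Markov conclusion can be illustrated by simulation. A sketch with a made-up one-parameter model y_i = a_i·x + ε_i, comparing the least-squares estimator against another linear unbiased estimator (the average of the y_i/a_i):

```python
import random

# Monte Carlo illustration of the Gauss-Markov conclusion in a made-up
# one-parameter model y_i = a_i * x + e_i with uncorrelated errors of
# equal variance.  Both estimators below are linear and unbiased; the
# least-squares one should have the smaller variance.
random.seed(1)
a = [1.0, 2.0, 3.0, 4.0]
x = 5.0
reps = 20_000
saa = sum(ai * ai for ai in a)

def var(xs):
    m = sum(xs) / len(xs)
    return sum((v - m) ** 2 for v in xs) / len(xs)

ls_est, alt_est = [], []
for _ in range(reps):
    y = [ai * x + random.gauss(0.0, 1.0) for ai in a]
    ls_est.append(sum(ai * yi for ai, yi in zip(a, y)) / saa)      # least squares
    alt_est.append(sum(yi / ai for ai, yi in zip(a, y)) / len(a))  # competitor

print(var(ls_est))   # near 1/30, the least-squares variance
print(var(alt_est))  # noticeably larger
```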
Limitation of the Method of Least Squares
Nothing is perfect.
References
Gauss, Carl Friedrich, translated by G. W. Stewart. 1995. Theory of the
Combination of Observations Least Subject to Errors: Part One, Part
Two, Supplement. Philadelphia: Society for Industrial and Applied
Mathematics.
Plackett, R. L. 1949. A Historical Note on the Method of Least Squares.
Biometrika 36:458–460.
Stigler, Stephen M. 1981. Gauss and the Invention of Least Squares. The
Annals of Statistics 9(3):465–474.
Plackett, Robin L. 1972. The Discovery of the Method of Least Squares.
Brand, Belinda B. 2003. Gauss' Method of Least Squares: A Historically-Based
Introduction.
http://www.infoplease.com/ce6/people/A0820346.html
http://www.stetson.edu/~efriedma/periodictable/html/Ga.html