
Proof of the Gauss-Markov Theorem

Copyright © 2012 Dan Nettleton (Iowa State University), Statistics 511


The Gauss-Markov Theorem

Under the Gauss-Markov Linear Model, the OLS estimator c′β̂ of an estimable linear function c′β is the unique Best Linear Unbiased Estimator (BLUE) in the sense that Var(c′β̂) is strictly less than the variance of any other linear unbiased estimator of c′β.
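As a quick numerical illustration of the theorem (a sketch added here, not part of the original slides), the following Python/numpy snippet simulates the Gauss-Markov model y = Xβ + e with Var(e) = σ²I and compares the Monte Carlo behavior of the OLS estimator ℓ′y of c′β against a competing linear unbiased estimator d′y. The design matrix, β, σ, and c are arbitrary choices made only for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Gauss-Markov model y = X beta + e with E(e) = 0 and Var(e) = sigma^2 I.
    # X, beta, sigma, and c below are arbitrary illustrative choices.
    n, p = 12, 3
    X = rng.normal(size=(n, p))
    beta = np.array([1.0, -2.0, 0.5])
    sigma = 1.5
    c = np.array([1.0, 1.0, 0.0])            # estimable: X has full column rank

    # OLS coefficient vector l (so that l'y = c'beta_hat) and a competing
    # linear unbiased estimator d'y with d'X = c' but d != l.
    l = X @ np.linalg.pinv(X.T @ X) @ c
    Q, _ = np.linalg.qr(X, mode="complete")
    d = l + 0.3 * Q[:, p]                    # Q[:, p] is orthogonal to col(X)

    # Monte Carlo: simulate many responses and compare the two estimators.
    reps = 20000
    ys = X @ beta + sigma * rng.normal(size=(reps, n))    # each row is one y
    ols_est, other_est = ys @ l, ys @ d

    print("target c'beta:", c @ beta)
    print("means:", ols_est.mean(), other_est.mean())     # both near c'beta
    print("variances:", ols_est.var(), other_est.var())   # OLS variance is smaller

Both sample means should sit near c′β, while the sample variance of the competitor exceeds that of the OLS estimator, as the theorem predicts.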


Unbiased Linear Estimators of c′β

If a is a fixed vector, then a′y is a linear function of y.

An estimator that is a linear function of y is said to be a linear estimator.

A linear estimator a′y is an unbiased estimator of c′β if and only if

E(a′y) = c′β ∀ β ∈ ℝᵖ ⇐⇒ a′E(y) = c′β ∀ β ∈ ℝᵖ
                      ⇐⇒ a′Xβ = c′β ∀ β ∈ ℝᵖ
                      ⇐⇒ a′X = c′.
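A small numeric check of this equivalence (an added sketch using numpy; the design matrix and vectors are arbitrary): if c′ is defined as a′X, then a′Xβ = c′β for every β, which is exactly the unbiasedness condition E(a′y) = c′β.

    import numpy as np

    rng = np.random.default_rng(1)

    # Arbitrary design matrix X and coefficient vector a, for illustration only.
    n, p = 6, 3
    X = rng.normal(size=(n, p))
    a = rng.normal(size=n)
    c = X.T @ a                              # c' = a'X

    # E(a'y) = a'X beta, so with c' = a'X the estimator a'y is unbiased for c'beta.
    for _ in range(5):
        beta = rng.normal(size=p)
        assert np.isclose(a @ X @ beta, c @ beta)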


The OLS Estimator of c′β is a Linear Estimator

We have previously defined the Ordinary Least Squares (OLS) estimator of an estimable c′β by c′β̂, where β̂ is any solution to the normal equations X′Xb = X′y.

We have previously shown that c′β̂ is the same for any β̂ that is a solution to the normal equations.

We have previously shown that (X′X)⁻X′y is a solution to the normal equations for any generalized inverse of X′X, denoted by (X′X)⁻.

Thus, c′β̂ = c′(X′X)⁻X′y = ℓ′y (where ℓ′ = c′(X′X)⁻X′), so that c′β̂ is a linear estimator.
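The following numpy sketch (an added illustration, not from the slides) forms β̂ = (X′X)⁻X′y with the Moore-Penrose inverse playing the role of one particular generalized inverse, builds ℓ′ = c′(X′X)⁻X′, and checks that c′β̂ = ℓ′y even when X is rank deficient; the specific X, y, and a are arbitrary.

    import numpy as np

    rng = np.random.default_rng(2)

    # Rank-deficient design: the third column repeats the first, so X'X is singular.
    n = 8
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    X = np.column_stack([x1, x2, x1])
    y = rng.normal(size=n)

    # An estimable c'beta: c' = a'X for some vector a.
    a = rng.normal(size=n)
    c = X.T @ a

    # One solution of the normal equations X'X b = X'y, using the Moore-Penrose
    # inverse as the generalized inverse (X'X)^-.
    XtX_ginv = np.linalg.pinv(X.T @ X)
    beta_hat = XtX_ginv @ X.T @ y

    # l' = c'(X'X)^- X', so c'beta_hat = l'y is a linear function of y.
    l = X @ XtX_ginv.T @ c
    assert np.isclose(c @ beta_hat, l @ y)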


c′β̂ is an Unbiased Estimator of an Estimable c′β

By definition, c′β is estimable if and only if there exists a linear unbiased estimator of c′β.

It follows from slide 3 that c′β is estimable if and only if c′ = a′X for some vector a.

If c′β is estimable, then

ℓ′X = c′(X′X)⁻X′X = a′X(X′X)⁻X′X = a′P_X X = a′X = c′,

where P_X = X(X′X)⁻X′ is the orthogonal projection onto the column space of X, so P_X X = X.

Thus, by slide 3, c′β̂ = ℓ′y is an unbiased estimator of c′β whenever c′β is estimable.
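As a quick check of this chain (an added numpy sketch; the rank-deficient X is an arbitrary example), ℓ′X should reproduce c′ whenever c′ = a′X.

    import numpy as np

    rng = np.random.default_rng(3)

    # Rank-deficient X (third column is twice the first), arbitrary values.
    n = 7
    x1 = rng.normal(size=n)
    X = np.column_stack([x1, rng.normal(size=n), 2.0 * x1])

    # Estimable c'beta: c' = a'X.
    a = rng.normal(size=n)
    c = X.T @ a

    # l' = c'(X'X)^- X'; the chain above gives l'X = c', hence unbiasedness.
    l_row = c @ np.linalg.pinv(X.T @ X) @ X.T
    assert np.allclose(l_row @ X, c)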


Proof of the Gauss-Markov Theorem

Suppose d′y is any linear unbiased estimator other than the OLS estimator c′β̂ = ℓ′y.

Then we know the following:

1. d ≠ ℓ ⇐⇒ ‖d − ℓ‖² = (d − ℓ)′(d − ℓ) > 0, and

2. d′X = ℓ′X = c′ =⇒ d′X − ℓ′X = 0′ =⇒ (d − ℓ)′X = 0′.

We need to show Var(d′y) > Var(c′β̂).
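To make (1) and (2) concrete, here is an added numpy sketch (not part of the slides) that builds a competing unbiased coefficient vector d by adding to ℓ a nonzero vector orthogonal to the columns of X, and then verifies both facts; the design matrix and a are arbitrary.

    import numpy as np

    rng = np.random.default_rng(4)

    # Arbitrary full-column-rank design and an estimable c' = a'X.
    n, p = 9, 3
    X = rng.normal(size=(n, p))
    a = rng.normal(size=n)
    c = X.T @ a

    # OLS coefficient vector l with l' = c'(X'X)^- X'.
    l = X @ np.linalg.pinv(X.T @ X) @ c

    # Any nonzero v with v'X = 0' gives a different unbiased d = l + v.
    Q, _ = np.linalg.qr(X, mode="complete")
    v = Q[:, p]
    d = l + v

    assert np.allclose(d @ X, c)                 # (2): d'X = l'X = c'
    assert (d - l) @ (d - l) > 0                 # (1): ||d - l||^2 > 0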


Proof of the Gauss-Markov Theorem

Var(d′y) = Var(d′y − c′β̂ + c′β̂)
         = Var(d′y − c′β̂) + Var(c′β̂) + 2Cov(d′y − c′β̂, c′β̂).

Var(d′y − c′β̂) = Var(d′y − ℓ′y) = Var((d′ − ℓ′)y) = Var((d − ℓ)′y)
               = (d − ℓ)′Var(y)(d − ℓ) = (d − ℓ)′(σ²I)(d − ℓ)
               = σ²(d − ℓ)′I(d − ℓ) = σ²(d − ℓ)′(d − ℓ) > 0 by (1).

Cov(d′y − c′β̂, c′β̂) = Cov(d′y − ℓ′y, ℓ′y) = Cov((d − ℓ)′y, ℓ′y)
                     = (d − ℓ)′Var(y)ℓ = σ²(d − ℓ)′ℓ
                     = σ²(d − ℓ)′X[(X′X)⁻]′c = 0 by (2).
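A short numeric confirmation of the covariance step (an added numpy sketch using the same style of construction as above): since ℓ lies in the column space of X and (d − ℓ)′X = 0′, the cross term σ²(d − ℓ)′ℓ vanishes.

    import numpy as np

    rng = np.random.default_rng(5)

    n, p = 10, 3
    X = rng.normal(size=(n, p))
    c = X.T @ rng.normal(size=n)                  # estimable c' = a'X
    sigma2 = 1.7                                  # any positive error variance

    l = X @ np.linalg.pinv(X.T @ X) @ c           # l' = c'(X'X)^- X'
    Q, _ = np.linalg.qr(X, mode="complete")
    d = l + Q[:, p]                               # competing unbiased coefficients

    # Cov(d'y - l'y, l'y) = sigma^2 (d - l)'l, which is (numerically) zero.
    assert np.isclose(sigma2 * (d - l) @ l, 0.0)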


Proof of the Gauss-Markov Theorem

It follows that

Var(d′y) = Var(d′y − c′β̂) + Var(c′β̂) > Var(c′β̂).  □
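Finally, an added numpy sketch (under the same arbitrary setup as the earlier sketches) that evaluates both variances exactly from Var(y) = σ²I and confirms the decomposition and the strict inequality above.

    import numpy as np

    rng = np.random.default_rng(6)

    n, p = 10, 3
    X = rng.normal(size=(n, p))
    c = X.T @ rng.normal(size=n)                  # estimable c' = a'X
    sigma2 = 2.5                                  # any positive error variance

    # OLS coefficient vector l and a competing unbiased d = l + v with v'X = 0'.
    l = X @ np.linalg.pinv(X.T @ X) @ c
    Q, _ = np.linalg.qr(X, mode="complete")
    d = l + Q[:, p]

    # Under Var(y) = sigma^2 I: Var(c'beta_hat) = sigma^2 l'l, Var(d'y) = sigma^2 d'd.
    var_ols = sigma2 * l @ l
    var_d = sigma2 * d @ d

    # Var(d'y) = Var(d'y - c'beta_hat) + Var(c'beta_hat) > Var(c'beta_hat).
    assert np.isclose(var_d, sigma2 * (d - l) @ (d - l) + var_ols)
    assert var_d > var_ols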
