Department of Statistical and Actuarial Sciences
Statistical Sciences 3859a

More Quiz 1 Practice Solutions

1. Consider the model
\[ y_i = \beta_1 x_i + \varepsilon_i, \qquad i = 1, 2, \ldots, n, \]
where $\varepsilon_1, \ldots, \varepsilon_n$ are independent normal random variables with mean $0$ and variance $\sigma^2$.

(a) Derive the least-squares estimator for $\beta_1$, $\hat{\beta}_1$.

The least-squares estimator for $\beta_1$ minimizes
\[ L = \sum (y_i - \beta_1 x_i)^2. \]
Setting the derivative $\partial L / \partial \beta_1 = 0$ gives
\[ \sum x_i y_i = \beta_1 \sum x_i^2, \]
from which it follows that
\[ \hat{\beta}_1 = \frac{\sum x_i y_i}{\sum x_i^2}. \]
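
As a quick numerical sanity check (not part of the original solution), the sketch below fits the no-intercept model to some invented data and confirms that the closed-form estimator agrees with a generic least-squares solver; the data values and variable names are illustrative only.

    import numpy as np

    # Invented data, for illustration only
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

    # Closed-form estimator for the no-intercept model: sum(x*y) / sum(x^2)
    beta1_hat = np.sum(x * y) / np.sum(x ** 2)

    # Generic least-squares solve of y ~ beta1 * x (single-column design matrix)
    beta1_lstsq = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]

    print(beta1_hat, beta1_lstsq)  # the two values agree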

(b) Show that the estimator obtained above is unbiased.

\[ E[\hat{\beta}_1] = \frac{\sum x_i E[y_i]}{\sum x_i^2} = \frac{\sum x_i \beta_1 x_i}{\sum x_i^2} = \beta_1. \]

Therefore, $\hat{\beta}_1$ is an unbiased estimator of $\beta_1$.
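
This claim can be spot-checked by simulation (a rough sketch with arbitrarily chosen true parameter values; averaging only approximates the expectation):

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # fixed design, arbitrary values
    beta1, sigma = 2.0, 1.0                    # hypothetical true parameters

    # Average the estimator over many simulated datasets
    estimates = []
    for _ in range(20000):
        y = beta1 * x + rng.normal(0.0, sigma, size=x.size)
        estimates.append(np.sum(x * y) / np.sum(x ** 2))

    print(np.mean(estimates))  # close to the true beta1 = 2.0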

(c) Show that the variance of the estimator obtained above is $\sigma^2 / \sum x_i^2$.

\[ \operatorname{Var}(\hat{\beta}_1) = \frac{\sum x_i^2 \operatorname{Var}(y_i)}{\left(\sum x_i^2\right)^2} = \frac{\sigma^2 \sum x_i^2}{\left(\sum x_i^2\right)^2} = \frac{\sigma^2}{\sum x_i^2}. \]
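
A similar simulation (again with made-up parameter values) checks the variance formula; the empirical variance of the simulated estimates should be close to $\sigma^2 / \sum x_i^2$:

    import numpy as np

    rng = np.random.default_rng(1)
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # fixed design, arbitrary values
    beta1, sigma = 2.0, 1.0                    # hypothetical true parameters

    # Simulate 20000 datasets at once; each row is one replication
    y = beta1 * x + rng.normal(0.0, sigma, size=(20000, x.size))
    beta1_hats = y @ x / np.sum(x ** 2)

    print(np.var(beta1_hats), sigma ** 2 / np.sum(x ** 2))  # close to each other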

(d) Write down an unbiased estimator for $\sigma^2$.

\[ \hat{\sigma}^2 = \frac{\sum e_i^2}{n - 1}, \qquad \text{where } e_i = y_i - \hat{\beta}_1 x_i. \]

The $n - 1$ appearing in the denominator follows from the fact that only one parameter has been estimated.
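
The unbiasedness of $\hat{\sigma}^2$ can likewise be checked by simulation (arbitrary parameter values; only an approximation to the expectation):

    import numpy as np

    rng = np.random.default_rng(2)
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # fixed design, arbitrary values
    beta1, sigma = 2.0, 1.0                    # hypothetical true parameters

    sigma2_hats = []
    for _ in range(20000):
        y = beta1 * x + rng.normal(0.0, sigma, size=x.size)
        b1 = np.sum(x * y) / np.sum(x ** 2)
        e = y - b1 * x                          # residuals from the no-intercept fit
        sigma2_hats.append(np.sum(e ** 2) / (x.size - 1))

    print(np.mean(sigma2_hats))  # close to sigma^2 = 1.0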

(e) Derive a formula to compute a $(1 - \alpha)100\%$ confidence interval for $\beta_1$.

\[ \hat{\beta}_1 \pm t_{n-1,\,\alpha/2}\, \frac{\hat{\sigma}}{\sqrt{\sum x_i^2}} \]
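
Putting parts (d) and (e) together, one way such an interval might be computed is sketched below; the data, the 95% level, and the use of scipy for the t quantile are all illustrative choices, not part of the original solution.

    import numpy as np
    from scipy import stats

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # invented data
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
    n, alpha = x.size, 0.05

    beta1_hat = np.sum(x * y) / np.sum(x ** 2)
    e = y - beta1_hat * x                       # residuals
    sigma2_hat = np.sum(e ** 2) / (n - 1)       # unbiased estimator from part (d)

    t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)
    half_width = t_crit * np.sqrt(sigma2_hat / np.sum(x ** 2))
    print(beta1_hat - half_width, beta1_hat + half_width)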

(f) Suppose a new response $y_0$ is to be observed at the predictor value $x_0$, using the point predictor $\hat{y}_0 = \hat{\beta}_1 x_0$. Show that the variance of $y_0$ is $\sigma^2\left(x_0^2 / \sum x_i^2 + 1\right)$. (The wording is loose here; what is wanted is the variance of the prediction error $y_0 - \hat{y}_0$ at $x_0$.)

The new response is $y_0 = \beta_1 x_0 + \varepsilon_0$, which is independent of $\hat{y}_0 = \hat{\beta}_1 x_0$, so
\[ \operatorname{Var}(y_0 - \hat{y}_0) = x_0^2 \operatorname{Var}(\hat{\beta}_1) + \operatorname{Var}(\varepsilon_0) = \frac{\sigma^2 x_0^2}{\sum x_i^2} + \sigma^2. \]

(g) Derive a formula to compute a $(1 - \alpha)100\%$ prediction interval for $y_0$.

Our best prediction of the new response is $\hat{y}_0$ (i.e. our estimate of its expected value). Thus, a prediction interval is given by
\[ \hat{y}_0 \pm t_{n-1,\,\alpha/2} \sqrt{\hat{\sigma}^2 \left( \frac{x_0^2}{\sum x_i^2} + 1 \right)}. \]
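
A sketch of the corresponding computation, continuing the invented data above and taking x0 = 6 purely as an example:

    import numpy as np
    from scipy import stats

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # invented data
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
    n, alpha, x0 = x.size, 0.05, 6.0           # x0 chosen arbitrarily

    beta1_hat = np.sum(x * y) / np.sum(x ** 2)
    sigma2_hat = np.sum((y - beta1_hat * x) ** 2) / (n - 1)

    y0_hat = beta1_hat * x0                     # point prediction at x0
    t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)
    half_width = t_crit * np.sqrt(sigma2_hat * (x0 ** 2 / np.sum(x ** 2) + 1.0))
    print(y0_hat - half_width, y0_hat + half_width)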

2. Consider the model
\[ y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \qquad i = 1, 2, \ldots, n, \]
where $\varepsilon_1, \ldots, \varepsilon_n$ are independent normal random variables with mean $0$ and variance $\sigma^2$.


(a) Write down the least-squares estimator for $\beta_0$.

\[ \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}, \qquad \text{where } \hat{\beta}_1 = S_{xy} / S_{xx}. \]
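
A quick check (with invented data) that these closed forms match a standard straight-line fit:

    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # invented data
    y = np.array([1.8, 4.1, 5.9, 8.2, 9.9])

    Sxy = np.sum((x - x.mean()) * (y - y.mean()))
    Sxx = np.sum((x - x.mean()) ** 2)
    beta1_hat = Sxy / Sxx
    beta0_hat = y.mean() - beta1_hat * x.mean()

    # np.polyfit returns coefficients highest degree first: (slope, intercept)
    slope, intercept = np.polyfit(x, y, 1)
    print(beta1_hat, slope)        # agree
    print(beta0_hat, intercept)    # agree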

(b) Show that the above estimator is unbiased.

\begin{align*}
E[\hat{\beta}_0] &= E[\bar{y}] - \bar{x}\, E[\hat{\beta}_1] \\
&= \beta_0 + \beta_1 \bar{x} - \bar{x} \beta_1 \qquad (\text{since } \hat{\beta}_1 \text{ is unbiased}) \\
&= \beta_0.
\end{align*}

Therefore, the estimator is unbiased for $\beta_0$.

(c) Derive the variance formula for the above estimator.

\[ \hat{\beta}_0 = \frac{1}{n} \sum y_i - \frac{\bar{x}}{S_{xx}} \sum (x_i - \bar{x}) y_i = \sum \left( \frac{1}{n} - \frac{\bar{x}(x_i - \bar{x})}{S_{xx}} \right) y_i. \]

Since the $y_i$'s are independent with variance $\sigma^2$, we have
\[ \operatorname{Var}(\hat{\beta}_0) = \sum \left( \frac{1}{n} - \frac{\bar{x}(x_i - \bar{x})}{S_{xx}} \right)^2 \sigma^2. \]

Expanding the quadratic term, and noting that $\sum (x_i - \bar{x}) = 0$, gives
\[ \operatorname{Var}(\hat{\beta}_0) = \sigma^2 \left( \frac{1}{n} + \frac{\bar{x}^2}{S_{xx}} \right). \]
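
The algebra in the last step can be spot-checked numerically: for any fixed predictor values, the sum of squared coefficients equals $1/n + \bar{x}^2 / S_{xx}$ up to rounding error. The x values below are arbitrary.

    import numpy as np

    x = np.array([0.3, 1.7, 2.2, 4.8, 6.1, 7.5])     # arbitrary predictor values
    n = x.size
    Sxx = np.sum((x - x.mean()) ** 2)

    coeffs = 1.0 / n - x.mean() * (x - x.mean()) / Sxx   # weights on the y_i's
    lhs = np.sum(coeffs ** 2)
    rhs = 1.0 / n + x.mean() ** 2 / Sxx
    print(lhs, rhs)   # equal up to floating-point error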

(d) Give a formula to compute a $(1 - \alpha)100\%$ confidence interval for $\beta_0$.

\[ \hat{\beta}_0 \pm t_{n-2,\,\alpha/2} \sqrt{\mathrm{MSE} \left( \frac{1}{n} + \frac{\bar{x}^2}{S_{xx}} \right)} \]

(e) Find the expected value and variance of $\dfrac{y_1 - \bar{y}}{x_1 - \bar{x}}$. Is it unbiased for $\beta_1$? Is there an estimator with smaller variance? Explain briefly.

\[ E\left[ \frac{y_1 - \bar{y}}{x_1 - \bar{x}} \right] = \frac{1}{x_1 - \bar{x}} \left( \beta_0 + \beta_1 x_1 - \beta_0 - \beta_1 \bar{x} \right) = \beta_1. \]

Therefore, this estimator is unbiased for $\beta_1$. For the variance, $\operatorname{Var}(y_1 - \bar{y}) = \operatorname{Var}(y_1) + \operatorname{Var}(\bar{y}) - 2 \operatorname{Cov}(y_1, \bar{y}) = \sigma^2 + \sigma^2/n - 2\sigma^2/n = \sigma^2(1 - 1/n)$, so the estimator has variance $\sigma^2(1 - 1/n) / (x_1 - \bar{x})^2$. The estimator is also linear in the responses. By the Gauss-Markov theorem, the least-squares estimator for $\beta_1$ has the smallest variance among all linear unbiased estimators, so $\hat{\beta}_1 = S_{xy}/S_{xx}$ is a linear unbiased estimator whose variance is no larger than that of the estimator given here.