Hypothesis Testing

1
Hypothesis Testing

2
Hypothesis Testing
Greene: App. C:892-897
Statistical Test: Divide parameter space (Ω) into two disjoint sets: Ω0, Ω1
Ω0 ∩ Ω1 = ∅ and Ω0 ∪ Ω1 = Ω
Based on sample evidence, does the estimated parameter (Θ*), and therefore the true parameter, fall into one of these sets? We answer this question using a statistical test.

3
Hypothesis Testing
y1,y2,…,yT is a random sample providing information on the (K x 1) parameter vector Θ, where Θ ∈ Ω
R(Θ)=[R1(Θ), R2(Θ),…,RJ(Θ)] is a (J x 1) vector of restrictions (e.g., hypotheses) on the K parameters, Θ.
For this class: R(Θ) = 0, Θ ∈ Ω
Ω0 = {Θ | Θ ∈ Ω, R(Θ) = 0}
Ω1 = {Θ | Θ ∈ Ω, R(Θ) ≠ 0}

4
Hypothesis Testing
Null Hypothesis: Θ ∈ Ω0 (H0)
Alternate Hypothesis: Θ ∈ Ω1 (H1)
Hypothesis Testing: Divide the sample space into two portions pertaining to H0 and H1
The region where we reject H0 is referred to as the critical region of the test

5
Hypothesis Testing
Test of whether Θ* ∈ Ω0 or Ω1 (Θ* an estimate of Θ) is based on a test statistic with a known distribution under H0 and some other distribution if H1 is true
Transform Θ* into the test statistic
The critical region of the hypothesis test is the set of values for which H0 would be rejected (e.g., values of the test statistic unlikely to occur if H0 is true)
If the test statistic falls into the critical region → evidence that H0 is not true

6
Hypothesis Testing
General Test Procedure
Develop a null hypothesis (H0) that will be maintained until there is evidence to the contrary
Develop an alternate hypothesis (H1) that will be adopted if H0 is not accepted
Estimate the appropriate test statistic
Identify the desired critical region
Compare the calculated test statistic to the critical region
Reject H0 if the test statistic falls in the critical region
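The steps above can be sketched numerically. A minimal sketch, assuming a two-sided z-test on a mean with known σ; the data values, the helper name `z_test`, and the 1.96 critical value (α = 0.05) are illustrative assumptions, not from the slides:

```python
import math

# Two-sided z-test of H0: mu = mu0 vs. H1: mu != mu0, assuming the
# sample mean is normally distributed with known sigma (hypothetical helper).
def z_test(sample_mean, mu0, sigma, T, crit=1.96):    # crit = 1.96 for alpha = 0.05
    z = (sample_mean - mu0) / (sigma / math.sqrt(T))  # standardized test statistic
    reject_h0 = abs(z) > crit                         # critical region: |z| > crit
    return z, reject_h0

# Hypothetical sample mean of 1.5 from T = 10 draws with known sigma^2 = 10
z, reject = z_test(sample_mean=1.5, mu0=1.0, sigma=math.sqrt(10), T=10)
```

Here z = 0.5 falls inside the do-not-reject region, so H0 is maintained.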

7
Hypothesis Testing
Definition of Rejection Region
P(cvL ≤ Θ* ≤ cvU) = 1 − Pr(Type I Error)
[Figure: density f(Θ*|H0) with lower and upper critical values cvL and cvU; do not reject H0 between them, reject H0 in either tail. The tail area is the probability of rejecting H0 even though it is true.]

8
Hypothesis Testing
Defining the Critical Region
Select a region that identifies parameter values that are unlikely to occur if the null hypothesis is true
Pr(Type I Error) = Pr(Rejecting H0 | H0 true)
Pr(Type II Error) = Pr(Accepting H0 | H1 true)
Never know with certainty whether you are correct → positive Pr(Type I Error)
Example of Standard Normal

9
Hypothesis Testing
[Figure: standard normal distribution. P(−1.96 ≤ z ≤ 1.96) = 0.95; α = 0.05 = P(Type I Error), with 0.025 in each tail.]

10
Hypothesis Testing
Example of mean testing
Assume the RV is normally distributed: yt ~ N(µ, σ²)
H0: µ = 1   H1: µ ≠ 1
What is the distribution of the mean under H0?
β̂ ~ N(1, σ²/T)
Assume σ² = 10, T = 10 → β̂ ~ N(1, 1) if H0 true

11
Hypothesis Testing
[Figure: distribution of β̂ ~ N(1, 1) if H0 true. P(−0.96 ≤ β̂ ≤ 2.96) = 0.95, with α = 0.05 and 0.025 in each tail. Equivalently, P(−1.96 ≤ z ≤ 1.96) = 0.95 (e.g., transform the distribution of β̂ into an RV with a standard normal distribution).]
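The do-not-reject region on the β̂ scale can be checked directly. A sketch using the slide's values (σ² = 10, T = 10, so the standard error of the mean is 1):

```python
import math

mu0, sigma2, T = 1.0, 10.0, 10   # H0 value; variance assumed known (slide's values)
se = math.sqrt(sigma2 / T)       # std. dev. of the sample mean = 1.0
crit = 1.96                      # two-sided critical value for alpha = 0.05
lo, hi = mu0 - crit * se, mu0 + crit * se
# do-not-reject region: -0.96 <= beta_hat <= 2.96, matching the figure
```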

12
Hypothesis Testing
[Figure: standard normal distribution. P(−1.96 ≤ z ≤ 1.96) = 0.95; α = 0.05 = P(Type I Error), with 0.025 in each tail.]
z = (β̂ − β0)/(σ/√T) ~ N(0, 1) if β̂ normally distributed and H0 true (e.g., β = β0)

13
Hypothesis Testing

14
Hypothesis Testing
Again, this assumes we know σ
t = (β̂ − β0)/(σ̂/√T) ~ t(T−1) if β̂ normally distributed and H0 true (e.g., β = β0)
P(−t(T−1),α/2 ≤ t ≤ t(T−1),α/2) = 1 − α
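With σ estimated, the same test uses the t distribution. A minimal sketch; the data and the t(9) critical value 2.262 (two-sided, α = 0.05) are illustrative assumptions:

```python
import math
import statistics

def t_test(sample, mu0, crit):
    T = len(sample)
    xbar = statistics.mean(sample)
    s = statistics.stdev(sample)           # sample std. dev., T-1 divisor
    t = (xbar - mu0) / (s / math.sqrt(T))  # ~ t(T-1) under H0
    return t, abs(t) > crit

# Hypothetical sample of T = 10 draws; test H0: mu = 1
data = [0.4, 1.2, 2.1, 0.8, 1.5, -0.3, 1.9, 0.7, 1.1, 1.6]
t, reject = t_test(data, mu0=1.0, crit=2.262)  # t(9) critical value, alpha = 0.05
```

With this sample t is well inside the critical values, so H0 is not rejected.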

15
Hypothesis Testing

16
Hypothesis Testing
Likelihood Ratio Test:

17
Hypothesis Testing
Likelihood Ratio Test: Compare the value of the likelihood function, l(∙), under the null hypothesis [l(Ω0)] vs. the value with unrestricted parameter choice [l*(Ω)]
The null hypothesis could reduce the set of admissible parameter values
What does this do to the maximum likelihood function value?
If the two resulting maximum LF values are close enough → cannot reject H0

18
Hypothesis Testing
Is this difference in likelihood function values large?
Likelihood ratio (λ):
λ = l*(Ω)/l(Ω0)
λ is a random variable since it depends on the yt's
What are the possible values of λ?

19
Hypothesis Testing
Likelihood Ratio Principle
The null hypothesis defining Ω0 is rejected if λ > 1 (Why 1?)
Need to establish a critical level of λ, λC, that is unlikely to occur under H0 (e.g., is 1.1 far enough away from 1.0?)
Reject H0 if the estimated value of λ is greater than λC
λ = 1 → the null hypothesis does not significantly reduce the parameter space
H0 not rejected
Result is conditional on the sample

20
Hypothesis Testing
General Likelihood Ratio Test Procedure
Choose the probability of a Type I error, α (e.g., the test significance level)
Given α, find the value of λC that satisfies: P(λ > λC | H0 is true) = α
Evaluate the test statistic, λ = l*(Ω)/l(Ω0), based on sample information
Reject (fail to reject) the null hypothesis if λ > λC (λ < λC)

21
Hypothesis Testing
LR test of the mean of a Normal Distribution (µ) with σ² not known
This implies the following test procedures: F-Test, t-Test
LR test of a hypothesized value of σ² (on class website)

22
Asymptotic Tests
Previous tests are based on finite samples
Use asymptotic tests when an appropriate finite sample test statistic is unavailable
Three tests commonly used: Asymptotic Likelihood Ratio, Wald Test, Lagrangian Multiplier (Score) Test
Greene, pp. 484–492; Buse article (on website)

23
Asymptotic Tests
Asymptotic Likelihood Ratio Test
y1,…,yT are iid, E(yt) = β, var(yt) = σ²
T^(1/2)(β* − β) converges in distribution to N(0, σ²)
As T → ∞, use the normal pdf to generate the LF
λ ≡ l*(Ω)/l(Ω0) or l(Θl)/l(Θ0)
l*(Ω) = Max[l(Θ|y1,…,yT): Θ ∈ Ω]
l(Ω0) = Max[l(Θ|y1,…,yT): Θ ∈ Ω0], the restricted LF given H0

24
Asymptotic Tests
Asymptotic Likelihood Ratio (LR)
LR ≡ 2ln(λ) = 2[L*(Ω) − L(Ω0)], where L(∙) = ln l(∙)
LR ~ χ²J asymptotically, where J is the number of joint null hypotheses (restrictions)

25
Asymptotic Tests
Asymptotic Likelihood Ratio Test
[Figure: log-likelihood function L(Θ) with unrestricted maximum at Θl and restricted value at Θ0; the vertical distance between L(Θl) and L(Θ0) is 0.5LR.]
LR ≡ 2ln(λ) = 2[L(Θl) − L(Θ0)]; LR ~ χ²J asymptotically (p. 851, Greene)
L(∙) evaluated at both Θl and Θ0
L ≡ log-likelihood function
Θl generates the unrestricted maximum of L(∙); L(Θ0) is the value obtained under H0

26
Asymptotic Tests
Asymptotic Likelihood Ratio Test
Greene defines LR as −2[L(Θ0) − L(Θl)]; the result is the same (Buse, p. 153; Greene, pp. 484–486)
Given H0 true, LR has an approximate χ² distribution with J DF (the number of joint hypotheses)
Reject H0 when LR > χ²c, where χ²c is the predefined critical value of the distribution given J DF

27
Asymptotic Tests: Impact of Curvature on LR (Shows Need for Wald Test)
Suppose Θ consists of 1 element
Have 2 samples generating different estimates of the LF, with the same value of Θ that maximizes the LF
0.5LR will depend on:
the distance between Θl and Θ0 (+)
the curvature of the LF (+)
C(Θ) represents LF curvature:
C(Θ) = −d²L(Θ)/dΘ² (don't forget the "−" sign)
Its expectation is the Information Matrix, I(Θ)

28
Asymptotic Tests: Impact of Curvature on LR (Shows Need for Wald Test)
[Figure: two samples' log-likelihood functions with their maxima at the same Θl but different curvature; the distances 0.5LR0 and 0.5LR1 between L(Θl) and L(Θ0) differ across the two samples.]
H0: Θ = Θ0
W = (Θl − Θ0)² C(Θ|Θ=Θl) = (Θl − Θ0)² I(Θ|Θ=Θl)
W ~ χ²J asymptotically
Note: evaluated at Θl

29
Asymptotic Tests: Impact of Curvature on LR (Shows Need for Wald Test)
The above weights the squared distance, (Θl − Θ0)², by the curvature of the LF instead of using the difference in LF values as in the LR test
Two sets of data may produce the same (Θl − Θ0)² value but give different LR values because of curvature
The more curvature, the more likely H0 is not true (e.g., the test statistic is larger)
Greene, pp. 486–487 gives an alternative motivation (careful of notation)
Buse, pp. 153–154

30
Asymptotic Tests: Impact of Curvature on LR (Shows Need for Wald Test)
Extending this to J simultaneous hypotheses and K parameters:
W = R(Θ)′ [d(Θ) I(Θ)⁻¹ d(Θ)′]⁻¹ R(Θ)
    (1 x J) [(J x K)(K x K)(K x J)] (J x 1)
where d(Θ) ≡ ∂R(Θ)/∂Θ′|Θ=Θl; W ~ χ²J asymptotically
Note that R(∙), d(∙) and I(∙) are evaluated at Θl
When Rj(Θ) is of the form Θj = Θj0, j = 1,…,K: d(Θ) = IK and W = (Θl − Θ0)′ I(Θ|Θ=Θl) (Θl − Θ0)
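In the single-parameter, single-restriction case the Wald statistic needs only the unrestricted estimate and the information evaluated there. A sketch for the normal mean with known σ², where I(µ) = T/σ²; the data and H0 are illustrative assumptions:

```python
data = [0.0, 2.0, 1.0, 3.0]     # hypothetical sample, T = 4
sigma2, mu0 = 1.0, 0.0          # known variance; H0: mu = 0
T = len(data)
mu_hat = sum(data) / T          # unrestricted MLE
info = T / sigma2               # I(mu): curvature of L at the maximum
W = (mu_hat - mu0) ** 2 * info  # W = (theta_l - theta_0)^2 I(theta|theta=theta_l)
reject = W > 3.84               # chi-square(1) critical value, alpha = 0.05
```

Because the normal log-likelihood is quadratic in µ, W here equals the LR statistic for the same data, anticipating the equivalence noted in the summary slides.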

31
Asymptotic Tests
Summary of Lagrange Multiplier (Score) Test
Based on the curvature of the log-likelihood function (L)
S(Θ) ≡ dL(Θ)/dΘ is the score of the likelihood function
At the unrestricted maximum: S(Θl) = 0

32
Asymptotic Tests: Summary of Lagrange Multiplier (Score) Test
How much does S(Θ) depart from 0 when evaluated at the hypothesized value, Θ0?
Weight the squared slope by the curvature
The greater the curvature, the closer Θ0 will be to the maximizing value
Weight by C(Θ)⁻¹ → the more curvature, the smaller the test statistic
Small values of the test statistic, LM, will be generated if the value of L(Θ0) is close to the maximum value, L(Θl), e.g., the slope is closer to 0

33
Asymptotic Tests: Summary of Lagrange Multiplier (Score) Test
[Figure: two samples' log-likelihood functions LA and LB with the same maximizing value of Θ; the slope S(Θ0) at the hypothesized value differs with curvature.]
S(Θ) ≡ dL/dΘ; S(Θ0) = dL/dΘ|Θ=Θ0
I(Θ) = −d²L/dΘ²|Θ=Θ0
LM = S(Θ0)² I(Θ0)⁻¹
S(Θl) = 0 at the maximum
LM ~ χ²J asymptotically

34
Asymptotic Tests: Summary of Lagrange Multiplier (Score) Test
Small values of the test statistic, LM, should be generated when:
L(∙) has greater curvature when evaluated at Θ0
Θ0 is nearer the value that generates the maximum LF value (e.g., S(Θ0) is closer to zero)

35
Asymptotic Tests: Summary of Lagrange Multiplier (Score) Test
Extending this to multiple parameters:
LM = S(Θ0)′ I(Θ0)⁻¹ S(Θ0)
     (1 x K)(K x K)(K x 1)
LM ~ χ²J
Buse, pp. 154–155
Greene, pp. 489–490
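The LM statistic needs only quantities evaluated at the restricted value Θ0. For the normal mean with known σ², S(µ0) = Σ(yt − µ0)/σ² and I(µ0) = T/σ². A sketch; the data and H0 are illustrative assumptions:

```python
data = [0.0, 2.0, 1.0, 3.0]                  # hypothetical sample, T = 4
sigma2, mu0 = 1.0, 0.0                       # known variance; H0: mu = 0
T = len(data)
score = sum(y - mu0 for y in data) / sigma2  # S(mu0) = dL/dmu at mu0
info = T / sigma2                            # I(mu0) = -d2L/dmu2 at mu0
LM = score ** 2 / info                       # LM = S(mu0)^2 I(mu0)^-1
reject = LM > 3.84                           # chi-square(1) critical value, alpha = 0.05
```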

36
Asymptotic Tests Summary
LR, W, LM differ in type of information required
LR requires both restricted and unrestricted parameter estimates W requires only unrestricted
estimates LM requires only restricted
estimates If log-likelihood quadratic with respect to the 3 tests result in same numerical values for large samples

37
Asymptotic Tests Summary
All test statistics are distributed asymptotically χ² with J d.f. (the number of joint hypotheses)
In finite samples, W ≥ LR ≥ LM
This implies W is the more conservative test
Example: With σ² known, a test of a parameter value (e.g., β = β0) results in:
LR = W = LM = (β̂ − β0)²/(σ²/T)
One case where LR = W = LM in finite samples
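This equality can be verified numerically for the known-σ² normal mean. A sketch with illustrative data and H0: β = 0; all three constructions collapse to T(β̂ − β0)²/σ² because the log-likelihood is exactly quadratic in the mean:

```python
data = [0.0, 2.0, 1.0, 3.0]   # hypothetical sample, T = 4
sigma2, mu0 = 1.0, 0.0        # known variance; H0: beta = 0
T = len(data)
beta_hat = sum(data) / T      # unrestricted MLE

common = (beta_hat - mu0) ** 2 / (sigma2 / T)  # T(beta_hat - beta0)^2 / sigma^2

# the three separate constructions
LR = sum((y - mu0) ** 2 - (y - beta_hat) ** 2 for y in data) / sigma2
W = (beta_hat - mu0) ** 2 * (T / sigma2)
LM = (sum(y - mu0 for y in data) / sigma2) ** 2 / (T / sigma2)
```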

38
Asymptotic Tests Summary
Example of asymptotic tests
Buse (pp. 155–156): same example, but assumes σ² = 1