Martingales - UNIL

Martingales

1 Discrete time - definition

Let (Ω, F, P) be a probability space. Let F0 ⊆ F1 ⊆ ··· ⊆ F and σ(⋃_{n=0}^∞ Fn) ⊆ F. Fn contains all the events that are known up to time n.

Example: Consider the binomial model with 3 tosses: F0 ⊆ F1 ⊆ F2 ⊆ F3.

Let A be the event that the number of H in the first toss is 1. Then A ∈ F1.

Definition 1.1. A discrete time stochastic process Xn is adapted to Fn if Xn is Fn-measurable.

That means that by time n the value of Xn is known.

Definition 1.2. A discrete time stochastic process Xn on a probability space (Ω, F, P) is a martingale with respect to a filtration Fn if

1. Xn is adapted to Fn.

2. E[|Xn|] <∞

3. E[Xn+1|Fn] = Xn

When the equality in 3 is replaced by ≤ we have a supermartingale, and when it is replaced by ≥ we have a submartingale.

Conclusion: For m < n, E[Xn|Fm] = E[E[Xn|Fn−1]|Fm] = E[Xn−1|Fm] = ··· = Xm. If Xn is a supermartingale then E[Xn|Fm] ≤ Xm.

Examples:

1. Let Xi be i.i.d. with expectation µ. Then Sn = ∑_{i=1}^n Xi − nµ is a martingale.

2. Branching process: Consider a population. At time n the number of individuals is Zn. Individual (n, i) gives birth to X_{n,i} individuals, where the X_{n,i} are i.i.d. with mean µ.

E[Zn+1|Z0, ···, Zn] = E[∑_{i=1}^{Zn} X_{n,i} | Zn] = µZn

Clearly Zn is a martingale if µ = 1; otherwise Zn/µ^n is a martingale.
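The identity E[Zn/µ^n] = Z0 can be checked by simulation. The sketch below is a minimal illustration; the offspring distribution on {0, 1, 2} with mean µ = 1.2 is a hypothetical choice:

```python
import random

random.seed(0)

# Hypothetical offspring distribution on {0, 1, 2} with mean mu = 1.2
OFFSPRING, WEIGHTS = [0, 1, 2], [0.2, 0.4, 0.4]
MU = sum(o * w for o, w in zip(OFFSPRING, WEIGHTS))  # = 1.2

def next_generation(z):
    """Z_{n+1}: total offspring of the z current individuals (i.i.d. counts)."""
    return sum(random.choices(OFFSPRING, WEIGHTS)[0] for _ in range(z))

n_gens, runs = 5, 20000
total = 0
for _ in range(runs):
    z = 1                      # Z_0 = 1
    for _ in range(n_gens):
        z = next_generation(z)
    total += z
avg_normalized = (total / runs) / MU ** n_gens  # estimates E[Z_n / mu^n] = Z_0 = 1
print(round(avg_normalized, 2))
```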


3. Consider a casino game where at each game you win 1 with probability p and lose 1 with probability q = 1 − p. Let Yi = 1 if you win 1 at the ith game and Yi = −1 if you lose 1 at the ith game. Let Sn = S0 + ∑_{i=1}^n Yi. Consider Xn = z^{Sn}; then

E[Xn+1|Fn] = Xn(pz + q/z)

Take z = q/p; then pz + q/z = q + p = 1 and we have a martingale.
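The algebra for the choice z = q/p can be verified exactly; a small sketch with a hypothetical win probability p = 1/3:

```python
from fractions import Fraction

p = Fraction(1, 3)   # hypothetical win probability
q = 1 - p
z = q / p

# one-step factor: E[z^{Y_{n+1}}] = p*z + q/z; a martingale needs this to equal 1
factor = p * z + q / z
print(factor)
```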

4. Let Xi be i.i.d. with moment generating function ϕ(θ), and let Sn = ∑_{i=1}^n Xi. Then Mn = e^{θSn}/ϕ(θ)^n is a martingale.
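A Monte Carlo sanity check of E[Mn] = M0 = 1, sketched for the special case Xi = ±1 with probability 1/2 each, where ϕ(θ) = cosh(θ); the value θ = 0.7 is an arbitrary choice:

```python
import math
import random

random.seed(1)

theta = 0.7
phi = math.cosh(theta)   # mgf of X = ±1 with prob 1/2, evaluated at theta

n, runs = 10, 50000
total = 0.0
for _ in range(runs):
    s = sum(random.choice((-1, 1)) for _ in range(n))   # S_n
    total += math.exp(theta * s) / phi ** n             # M_n
est = total / runs   # estimates E[M_n], which should equal M_0 = 1
print(round(est, 2))
```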

Martingales and convex functions

Convex function: g(αx+ (1− α)y) ≤ αg(x) + (1− α)g(y)

Jensen's inequality: E[g(X)|Fn] ≥ g(E[X|Fn])

Proposition 1.1. Let Xn be a martingale and g a convex function with E[|g(Xn)|] < ∞. Then g(Xn) is a submartingale.

Proof

E[g(Xn+1)|Fn] ≥ g(E[Xn+1|Fn]) = g(Xn)

Let Mn be a martingale and let ξj = Mj − Mj−1; then E[ξn+1|Fn] = 0 (HW). Moreover, assume that E[Mn²] < ∞. Then

1. E[Mj²] < ∞ for j ≤ n.

2. E[Mn²] = E[M0²] + ∑_{j=1}^n E[ξj²]

proof

1. For k ≤ n, E[Mn²] = E[E[Mn²|Fk]] ≥ E[(E[Mn|Fk])²] = E[Mk²] by the conditional Jensen inequality.

2.

E[(Mn − M0)²] = E[(∑_{j=1}^n ξj)²] = ∑_{j=1}^n E[ξj²] + 2∑_{i<j} E[ξi ξj]

For i < j, E[ξi ξj] = E[E[ξi ξj|Fi]] = E[ξi E[ξj|Fi]] = 0,

since E[ξj|Fi] = E[Mj − Mj−1|Fi] = 0. The same argument gives E[M0 ξj] = 0, so E[(Mn − M0)²] = E[Mn²] − E[M0²] and the claim follows.
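For the simple ±1 random walk (M0 = 0, ξj = ±1 with probability 1/2) the identity in 2 can be checked exactly by enumerating all increment paths; a short sketch for n = 4:

```python
from itertools import product

n = 4
paths = list(product((-1, 1), repeat=n))   # all 2^n equally likely increment paths

# E[M_n^2] over all paths, with M_n = xi_1 + ... + xi_n and M_0 = 0
second_moment = sum(sum(path) ** 2 for path in paths) / len(paths)
# E[M_0^2] + sum_j E[xi_j^2] = 0 + n, computed the same way
increment_side = sum(sum(x ** 2 for x in path) for path in paths) / len(paths)

print(second_moment, increment_side)
assert second_moment == increment_side == n
```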

Proposition 1.2. Let H0, H1, ··· be a bounded adapted process and Mn a martingale. Then

Zn = ∑_{i=0}^{n−1} Hi(Mi+1 − Mi)

is a martingale.


Proof: E[|Zn|] ≤ ∑_{i=0}^{n−1} E[|Hi|(|Mi+1| + |Mi|)] < ∞. Zn is adapted, and

E[Zn+1|Fn] = ∑_{i=0}^{n−1} Hi(Mi+1 − Mi) + E[Hn(Mn+1 − Mn)|Fn]

= ∑_{i=0}^{n−1} Hi(Mi+1 − Mi) + Hn E[Mn+1 − Mn|Fn] = Zn
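Proposition 1.2 says you cannot beat a fair game with a bounded bet based on the past; a numerical illustration, where the doubling-style bet below is a hypothetical choice of H that depends only on the past of M:

```python
import random

random.seed(2)

def transform_run(n):
    """One run of Z_n = sum_i H_i (M_{i+1} - M_i) for a fair +/-1 walk M,
    with the adapted bet H_i = 2 if M_i < 0 else 1 (known at time i)."""
    m, z = 0, 0
    for _ in range(n):
        h = 2 if m < 0 else 1        # depends only on the past of M
        step = random.choice((-1, 1))
        z += h * step
        m += step
    return z

runs, n = 50000, 20
est = sum(transform_run(n) for _ in range(runs)) / runs  # estimates E[Z_n] = Z_0 = 0
print(round(est, 2))
```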

Stopping time:

Definition 1.3. Let (Ω, F, P) be a probability space with filtration F0 ⊆ F1 ⊆ ··· ⊆ F. A non-negative integer-valued random variable τ is a stopping time if (τ ≤ n) ∈ Fn.

Properties of stopping times: Let S and T be stopping times with respect to the probability space (Ω, F, P) and filtration F0 ⊆ F1 ⊆ ··· ⊆ F. Then

1. S + T is a stopping time.

2. S ∧ T is a stopping time.

3. S ∨ T is a stopping time.

4. An integer constant is a stopping time.

Examples of stopping times: Consider a random walk as in example 3. Let τ be the first time that Sn ∉ (0, b), where b is an integer. Why is τ a stopping time?

(τ > n) = {S0 ∈ (0, b), S1 ∈ (0, b), ···, Sn ∈ (0, b)} ∈ Fn

Is τ − 2 a stopping time? Is τ + 2 a stopping time?

Question:

We saw that for martingales E[Xn] = E[E[Xn|Fn−1]] = E[Xn−1] = ··· = E[X0]. Is it true that

E[Xτ] = E[X0]?

Answer: Not always. Consider a simple random walk, and consider the first time the process reaches 1 starting at 0.

The stopped process

Let T be a stopping time and Xn, n ≥ 1 be a stochastic process. Then Xn∧T is called the stopped process.


Proposition 1.3. If Xn, n ≥ 1 is a martingale defined on (Ω, F, P) with respect to a filtration (Fn), and T is a stopping time with respect to the same filtration, then the stopped process Xn∧T is a martingale.

Proof

Let

Hi = 1 if T > i, and Hi = 0 if T ≤ i.

Then

Xn∧T = X0 + ∑_{i=0}^{n−1} Hi(Xi+1 − Xi)

Since (T > i) = (T ≤ i)^c ∈ Fi, the process H is adapted and bounded, so H satisfies the assumptions of Proposition 1.2.

Similarly, if X is a submartingale (supermartingale) then the stopped process is a submartingale (supermartingale).

Proposition 1.4. If Xn, n ≥ 1 is a super (sub) martingale defined on (Ω, F, P) with respect to a filtration (Fn), and T is a stopping time with respect to the same filtration, then the stopped process Xn∧T is a super (sub) martingale.

Observation 1: E[XT∧n] = E[E[XT∧n|Fn−1]] = E[XT∧(n−1)] = ··· = E[X0]

Observation 2 If P(T <∞) = 1 then as n→∞, XT∧n → XT with probability 1

Question: Is it always true that E[XT] = E[X0]? Not always:

Example: Consider a random walk with p = 1/2. In this case Sn is a recurrent Markov chain, and thus with probability 1, T1 = inf{n : Sn = 1} < ∞. E[ST1∧n] = 0 while E[ST1] = 1.
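The counterexample can be seen in simulation: the stopped mean E[ST1∧n] stays at 0 even though almost every path eventually reaches 1. A sketch (the cutoff n = 200 is arbitrary):

```python
import random

random.seed(3)

def stopped_value(n_max):
    """Symmetric +/-1 walk from 0, stopped at the first hit of 1 or at n_max."""
    s = 0
    for _ in range(n_max):
        if s == 1:
            return s
        s += random.choice((-1, 1))
    return s

runs, n_max = 50000, 200
vals = [stopped_value(n_max) for _ in range(runs)]
mean_stopped = sum(vals) / runs               # estimates E[S_{T1 ∧ n}] = 0
frac_hit = sum(v == 1 for v in vals) / runs   # P(T1 <= n), close to 1 for large n
print(round(mean_stopped, 2), round(frac_hit, 2))
```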

Doob’s Optional Stopping Theorem:

Theorem 1.1. Let T be a stopping time. Let X be a martingale. Then

E[XT ] = E[X0]

in each of the following situations:

(i) T is bounded. (For some N, T(ω) ≤ N for all ω.)

(ii) X is bounded (for some K > 0, |Xn(ω)| ≤ K for all n and all ω) and T is finite with probability 1.

(ii') T is finite with probability 1, and |Xn∧T(ω)| ≤ K for some K > 0.


(iii) E[T] < ∞, and for some K > 0,

|Xn(ω) − Xn−1(ω)| ≤ K

for all n ≥ 1 and all ω.

Proof.

(i) E[XT] = E[XT∧N] = E[X0] by Observation 1.

(ii) Since P(T < ∞) = 1, XT∧n → XT, and the result follows from the bounded convergence theorem.

(ii') Since T is finite, XT∧n → XT and |XT| ≤ K. Then |E[X0] − E[XT]| = |E[XT∧n] − E[XT]| ≤ 2K P(T > n) → 0.

(iii) XT = X0 + ∑_{i=1}^T (Xi − Xi−1), so |XT∧n| ≤ |X0| + ∑_{i=1}^T |Xi − Xi−1|. Since E[∑_{i=1}^T |Xi − Xi−1|] ≤ K E[T] < ∞, the result follows from the DCT.

Example 1: Let Xi be i.i.d. random variables with E[|Xi|] < ∞ and E[Xi] = µ, and let T be a stopping time (with respect to the filtration Fn generated by (X1, ···, Xn)). Assume that E[T] < ∞. Then:

1. Sn = ∑_{i=1}^n Xi − nµ is a martingale.

2. E[∑_{i=1}^T Xi] = E[T]µ (Wald's identity).

Proof

First consider non-negative random variables: E[∑_{i=1}^{T∧n} Xi] = E[T ∧ n]µ, and the result follows from the MCT. For general Xi, since E|Xi| < ∞ we have |∑_{i=1}^{T∧n} Xi| ≤ ∑_{i=1}^T |Xi|, which is integrable by the non-negative case, so the result follows from the DCT.
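Wald's identity E[∑_{i=1}^T Xi] = µE[T] can be checked by simulation. In the sketch below the Xi are uniform(0, 1) (so µ = 1/2) and T is the hypothetical stopping time "first i with Xi > 0.9", so T is geometric with E[T] = 10 and both sides should be near 5:

```python
import random

random.seed(4)

def wald_run():
    """Sum uniform(0,1) draws until one exceeds 0.9; that index T is a stopping time."""
    total, t = 0.0, 0
    while True:
        x = random.random()
        total += x
        t += 1
        if x > 0.9:
            return total, t

runs = 100000
results = [wald_run() for _ in range(runs)]
lhs = sum(s for s, _ in results) / runs          # estimates E[sum_{i<=T} X_i]
rhs = 0.5 * sum(t for _, t in results) / runs    # estimates mu * E[T]
print(round(lhs, 1), round(rhs, 1))
```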

Example 2. Gambler's ruin problem

Consider a gambler who wins 1 with probability p and loses 1 with probability q = 1 − p ≠ 1/2. Assume that she starts with fortune x and each time bets 1$. Let T be the first time the fortune is either 0 or b: T = inf{n : Sn ∉ (0, b)}, where Sn = x + ∑_{i=1}^n Yi and Yi is 1 w.p. p and −1 w.p. 1 − p.

Solution

Define Mn = (q/p)^{Sn}; then Mn is a martingale. T is finite with probability 1: clearly there is a positive probability ε of exiting within b steps from any state, so the probability of exiting within nb steps is at least 1 − (1 − ε)^n → 1. Also MT∧n ≤ max((q/p)^b, 1), so applying (ii'), E[MT] = (q/p)^x. Let Tb be the first time Sn hits b and T0 the first time that it hits 0.


(q/p)^x = E[(q/p)^{ST}] = P(Tb < T0)(q/p)^b + (1 − P(Tb < T0))

P(Tb < T0) = ((q/p)^x − 1)/((q/p)^b − 1)

In the symmetric case Sn is a martingale. Applying similar arguments: |Sn∧T| ≤ b, so applying (ii'),

E[ST] = x = b·P(Tb < T0) + 0·(1 − P(Tb < T0))

P(Tb < T0) = x/b
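Both formulas can be checked against a direct simulation; the parameters x = 3, b = 10, p = 0.45 below are illustrative:

```python
import random

random.seed(5)

def hits_b_first(x, b, p):
    """Walk from x with steps +1 (prob p) / -1 (prob 1-p) until it leaves (0, b)."""
    s = x
    while 0 < s < b:
        s += 1 if random.random() < p else -1
    return s == b

x, b, p, runs = 3, 10, 0.45, 20000
q = 1 - p
emp = sum(hits_b_first(x, b, p) for _ in range(runs)) / runs
theory = ((q / p) ** x - 1) / ((q / p) ** b - 1)
print(round(emp, 3), round(theory, 3))

# symmetric case: P(T_b < T_0) = x/b
emp_sym = sum(hits_b_first(x, b, 0.5) for _ in range(runs)) / runs
print(round(emp_sym, 2), x / b)
```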

Example 3. In the last example, for the symmetric random walk started at S0 = 0, let us obtain the expected time to exit (−a, b).

Observation: Var(Yi) = 1 and Var(Sn) = E[(Sn)²] = n; also (Sn∧T)² ≤ max(a², b²) (Why?)

Sn² − n is a martingale. Assume that we can apply the optional stopping theorem; then:

0 = E[(ST)² − T] = E[(ST)²] − E[T]

thus, E[T] = E[(ST)²] = (a/(a + b))·b² + (b/(a + b))·a² = ab

Justification:

E[(ST∧n)² − T ∧ n] = 0

thus

E[(ST∧n)²] = E[T ∧ n]

Apply the DCT to the left hand side (it is bounded by max(a², b²)) and the MCT to the right hand side.
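The answer E[T] = ab is easy to check by simulation; a = 4 and b = 6 below are illustrative choices, so the estimate should be close to 24:

```python
import random

random.seed(6)

def exit_time(a, b):
    """Number of steps of a symmetric +/-1 walk from 0 until it leaves (-a, b)."""
    s, t = 0, 0
    while -a < s < b:
        s += random.choice((-1, 1))
        t += 1
    return t

a, b, runs = 4, 6, 20000
est = sum(exit_time(a, b) for _ in range(runs)) / runs  # should approximate a*b
print(round(est, 1))
```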

Example 4. Consider again a random walk with p < 1/2. Let S0 = x > 0. Consider T, the first time that the process hits 0. First note that Mn = Sn + n(1 − 2p) is a martingale, since each step has mean 2p − 1 = −(1 − 2p).

Thus E[Sn∧T + (n ∧ T)(1 − 2p)] = x

E[(n ∧ T)(1 − 2p)] = x − E[Sn∧T] ≤ x

(Why? Sn∧T ≥ 0.) Hence

E[n ∧ T] ≤ x/(1 − 2p)

and by the MCT, E[T] ≤ x/(1 − 2p).

Example 5. Ruin probability: Consider an insurance company. The probability of a claim in any time period is q = 1 − p, the claim size is 2, and the premium is 1. Assume initial capital x. What is the ruin probability? Assume the net profit condition 1 > 2(1 − p), i.e. p > 1/2. Notice that each period the reserve Sn either increases by 1 with probability p or decreases by 1 with probability q = 1 − p, and (q/p)^{Sn} is a martingale. Let T be the first time Sn = 0. Thus

(q/p)^x = E[(q/p)^{Sn∧T}] = P(T ≤ n) + E[(q/p)^{Sn} 1_{T>n}] (Why?)

P(T ≤ n) → P(T < ∞)

By the law of large numbers Sn → ∞ with probability 1 on (T = ∞), so (q/p)^{Sn} 1_{T>n} → 0 a.s.; since it is bounded by 1, E[(q/p)^{Sn} 1_{T>n}] → 0.

Thus P(T < ∞) = (q/p)^x.
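The ruin probability (q/p)^x can be checked by simulation; x = 3 and p = 0.6 below are illustrative, and the finite horizon is a truncation (ruin after the horizon is negligibly likely under the positive drift):

```python
import random

random.seed(7)

def ruined(x, p, horizon=400):
    """Reserve from x: +1 w.p. p, -1 w.p. 1-p. True if it hits 0 within the horizon."""
    s = x
    for _ in range(horizon):
        if s == 0:
            return True
        s += 1 if random.random() < p else -1
    return False

x, p, runs = 3, 0.6, 10000
q = 1 - p
emp = sum(ruined(x, p) for _ in range(runs)) / runs
theory = (q / p) ** x   # = (2/3)^3
print(round(emp, 3), round(theory, 3))
```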

2 Continuous time martingales

Consider a process Xt, t ∈ [0, ∞). The sample paths are either continuous functions from [0, ∞) to R, or functions continuous from the right that have left limits. Let Ft, t ∈ [0, ∞) be a family of sub-sigma-algebras with the property that Fs ⊆ Ft whenever s ≤ t. Xt is a martingale if:

1. Xt is Ft measurable.

2. E|Xt| < ∞.

3. E[Xt|Fs] = Xs for all s ≤ t.

Definition 2.1. τ is a Ft stopping time if (τ ≤ t) ∈ Ft

Definition 2.2. F+t = ∩ε>0 Ft+ε.

The filtration is right continuous if Ft = F+t.

Problem with continuous martingales: when Xt is right continuous, inf{t : Xt ≥ u} is a stopping time. However, inf{t : Xt > u} is not always a stopping time. When Ft is right continuous, both are stopping times.

In our case, when speaking about continuous time martingales (or stochastic processes) we assume that:


1. If A ⊂ B and B ∈ F , and P (B) = 0 then A ∈ F and P (A) = 0.

2. F0 contains all the P-null sets.

3. Ft is right continuous.

Under the above conditions the optional sampling theorem holds also for continuous martingales.
