18.175: Lecture 19. Even more on martingales.
math.mit.edu/~sheffield/2016175/Lecture19.pdf

18.175: Lecture 19

Even more on martingales

Scott Sheffield

MIT

Outline

Recollections

More martingale theorems

Recall: conditional expectation

- Say we're given a probability space (Ω, F_0, P), a σ-field F ⊂ F_0, and a random variable X measurable w.r.t. F_0 with E|X| < ∞. The conditional expectation of X given F is a new random variable, which we can denote by Y = E(X|F).
- We require that Y is F-measurable and that for all A in F we have

  ∫_A X dP = ∫_A Y dP.

- Any Y satisfying these properties is called a version of E(X|F).
- Theorem: Up to redefinition on a measure zero set, the random variable E(X|F) exists and is unique.
- This follows from the Radon-Nikodym theorem.
- Theorem: E(X|F_i) is a martingale if F_i is an increasing sequence of σ-algebras and E(|X|) < ∞.

Martingales

- Let F_n be an increasing sequence of σ-fields (called a filtration).
- A sequence X_n is adapted to F_n if X_n ∈ F_n for all n. If X_n is an adapted sequence (with E|X_n| < ∞) then it is called a martingale if

  E(X_{n+1}|F_n) = X_n

  for all n. It's a supermartingale (resp., submartingale) if the same holds with = replaced by ≤ (resp., ≥).
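The simplest example is a ±1 simple random walk, which is a martingale because each increment has conditional mean zero. The Python sketch below is an illustration of the definition (not from the lecture): it averages X_{n+1} − X_n over many independent paths and checks that the average is near zero at every step.

```python
import random

random.seed(0)

def walk(n):
    """Simple symmetric random walk: X_0 = 0, i.i.d. +/-1 steps."""
    x, path = 0, [0]
    for _ in range(n):
        x += random.choice((-1, 1))
        path.append(x)
    return path

paths = [walk(10) for _ in range(20000)]

# Martingale property E(X_{n+1} | F_n) = X_n: increments have
# conditional (hence also unconditional) mean zero at every step.
for n in range(10):
    avg_increment = sum(p[n + 1] - p[n] for p in paths) / len(paths)
    assert abs(avg_increment) < 0.05
```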

Two big results

- Optional stopping theorem: One can't make money in expectation by timing the sale of an asset whose price is a non-negative martingale.
- Proof: Just a special case of the statement about (H · X)_n when the stopping time is bounded.
- Martingale convergence: A non-negative martingale almost surely has a limit.
- Idea of proof: Count upcrossings (times the martingale crosses a fixed interval) and devise a gambling strategy that makes lots of money if the number of these is not a.s. finite. Basically, you buy every time the price gets below the interval and sell each time it gets above.
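To make the optional stopping statement concrete, here is a hedged numerical sketch (the parameters are ours, not the lecture's): a fair ±1 walk is sold either when it first hits +3 or at time 50, whichever comes first. That is a bounded stopping time, so the expected sale price equals the starting price 0, i.e., timing the sale does not help.

```python
import random

random.seed(1)

def stopped_price(target=3, horizon=50):
    """Fair +/-1 walk from 0, sold when it first hits `target` or at
    time `horizon` (a bounded stopping time), whichever comes first."""
    x = 0
    for _ in range(horizon):
        x += random.choice((-1, 1))
        if x == target:
            break
    return x

n = 100000
avg_price = sum(stopped_price() for _ in range(n)) / n
# Optional stopping: E(X_T) = X_0 = 0 for this bounded stopping time.
assert abs(avg_price) < 0.1
```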

Problems

- Assume a betting market's prices are continuous martingales. (Forget about bid-ask spreads, possible longshot bias, bizarre arbitrage opportunities, discontinuities brought about by sudden spurts of information, etc.)
- How many primary candidates does one expect to ever exceed 20 percent on a primary nomination market? (Asked by Aldous.)
- Compute the probability that a martingale price reaches a before b if martingale prices vary continuously.
- Polya's urn: Start with r red and g green balls. Repeatedly sample randomly and add an extra ball of the sampled color. The fraction of red balls is a martingale, hence a.s. converges to a limit.
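The Polya's urn convergence can be watched numerically. In this sketch (our own illustration, with r = g = 1) the red fraction, a bounded martingale, visibly settles down along a single run even though its limit is random.

```python
import random

random.seed(2)

def red_fraction_path(r=1, g=1, steps=5000):
    """Polya's urn: draw a ball uniformly, return it with one extra
    ball of the same color.  Returns the trajectory of the red fraction."""
    path = []
    for _ in range(steps):
        if random.random() < r / (r + g):
            r += 1
        else:
            g += 1
        path.append(r / (r + g))
    return path

path = red_fraction_path()
# The red fraction is a bounded martingale, so it converges a.s.;
# late-time fluctuations along one run are tiny, even though the limit
# itself is random (uniform on [0, 1] when r = g = 1).
assert abs(path[-1] - path[-1000]) < 0.05
```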

Outline

Recollections

More martingale theorems

Theorems (proofs discussed on subsequent slides)

- L^p convergence theorem: If X_n is a martingale with sup E|X_n|^p < ∞ where p > 1, then X_n → X a.s. and in L^p.
- Orthogonal increment theorem: Let X_n be a martingale with E X_n^2 < ∞ for all n. If m ≤ n and Y ∈ F_m with E Y^2 < ∞, then E((X_n − X_m)Y) = 0.
- Conditional variance theorem: If X_n is a martingale with E X_n^2 < ∞ for all n, then E((X_n − X_m)^2 | F_m) = E(X_n^2 | F_m) − X_m^2.
- "Accumulated variance" theorems: Consider a martingale X_n with E X_n^2 < ∞ for all n. By Doob's decomposition, we can write X_n^2 = M_n + A_n where M_n is a martingale, and

  A_n = Σ_{m=1}^n [E(X_m^2 | F_{m-1}) − X_{m-1}^2] = Σ_{m=1}^n E((X_m − X_{m-1})^2 | F_{m-1}).

  Then E(sup_m |X_m|^2) ≤ 4 E A_∞, and lim_{n→∞} X_n exists and is finite a.s. on {A_∞ < ∞}.

Lp convergence theorem

- Theorem: If X_n is a martingale with sup E|X_n|^p < ∞ where p > 1, then X_n → X a.s. and in L^p.
- Proof idea: We have (E X_n^+)^p ≤ (E|X_n|)^p ≤ E|X_n|^p, so by the martingale convergence theorem X_n → X a.s. Use the L^p maximal inequality to get L^p convergence.

Orthogonality of martingale increments

- Theorem: Let X_n be a martingale with E X_n^2 < ∞ for all n. If m ≤ n and Y ∈ F_m with E Y^2 < ∞, then E((X_n − X_m)Y) = 0.
- Proof idea: E((X_n − X_m)Y) = E[E((X_n − X_m)Y | F_m)] = E[Y E((X_n − X_m) | F_m)] = 0.
- Conditional variance theorem: If X_n is a martingale with E X_n^2 < ∞ for all n, then E((X_n − X_m)^2 | F_m) = E(X_n^2 | F_m) − X_m^2.
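A quick empirical sanity check of the orthogonality statement (an illustrative sketch, with m, n, and Y chosen by us): for a ±1 random walk take m = 5, n = 20, and Y = X_m, which is certainly F_m-measurable; the sample average of (X_n − X_m)Y should vanish.

```python
import random

random.seed(5)

m, n, n_paths = 5, 20, 200000
total = 0.0
for _ in range(n_paths):
    x = xm = 0
    for step in range(1, n + 1):
        x += random.choice((-1, 1))
        if step == m:
            xm = x            # Y = X_m is F_m-measurable
    total += (x - xm) * xm    # one sample of (X_n - X_m) Y

# Orthogonality of martingale increments: E((X_n - X_m) Y) = 0.
assert abs(total / n_paths) < 0.1
```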

Square integrable martingales

- Suppose we have a martingale X_n with E X_n^2 < ∞ for all n.
- We know X_n^2 is a submartingale. By Doob's decomposition, we can write X_n^2 = M_n + A_n where M_n is a martingale, and

  A_n = Σ_{m=1}^n [E(X_m^2 | F_{m-1}) − X_{m-1}^2] = Σ_{m=1}^n E((X_m − X_{m-1})^2 | F_{m-1}).

- A_n in some sense measures the total accumulated variance by time n.
- Theorem: E(sup_m |X_m|^2) ≤ 4 E A_∞.
- Proof idea: The L^2 maximal inequality gives E(sup_{0≤m≤n} |X_m|^2) ≤ 4 E X_n^2 = 4 E A_n. Use monotone convergence.
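For the ±1 random walk this decomposition is explicit: every increment satisfies E((X_m − X_{m-1})^2 | F_{m-1}) = 1, so A_n = n and M_n = X_n^2 − n. The sketch below (our illustration) checks on simulated paths that M_n has mean zero and that the maximal inequality E(sup_m |X_m|^2) ≤ 4 E A_n holds.

```python
import random

random.seed(3)

n_steps, n_paths = 100, 20000
m_final, sup_sq = [], []
for _ in range(n_paths):
    x, max_abs = 0, 0
    for _ in range(n_steps):
        x += random.choice((-1, 1))
        max_abs = max(max_abs, abs(x))
    # For +/-1 increments A_n = n exactly, so M_n = X_n^2 - n.
    m_final.append(x * x - n_steps)
    sup_sq.append(max_abs ** 2)

assert abs(sum(m_final) / n_paths) < 5       # E(M_n) = 0 since X_0 = 0
assert sum(sup_sq) / n_paths <= 4 * n_steps  # E(sup_m |X_m|^2) <= 4 E A_n
```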

Square integrable martingales

- Suppose we have a martingale X_n with E X_n^2 < ∞ for all n.
- Theorem: lim_{n→∞} X_n exists and is finite a.s. on {A_∞ < ∞}.
- Proof idea: Try fixing a and truncating at time N = inf{n : A_{n+1} > a^2}; use the L^2 convergence theorem.

Uniform integrability

- Say X_i, i ∈ I, are uniformly integrable if

  lim_{M→∞} (sup_{i∈I} E(|X_i|; |X_i| > M)) = 0.

- Example: Given (Ω, F_0, P) and X ∈ L^1, a uniformly integrable family is given by E(X|F) (where F ranges over all σ-algebras contained in F_0).
- Theorem: If X_n → X in probability then the following are equivalent:
  - X_n are uniformly integrable
  - X_n → X in L^1
  - E|X_n| → E|X| < ∞
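The definition is easiest to feel through the classic non-example (our illustration, computed exactly rather than simulated): X_n = n on an event of probability 1/n and 0 otherwise. Every X_n has E|X_n| = 1 and X_n → 0 in probability, yet the family is not uniformly integrable, and indeed convergence in L^1 fails, consistent with the theorem.

```python
def tail_expectation(n, M):
    """E(|X_n|; |X_n| > M) for X_n = n * 1_{A_n} with P(A_n) = 1/n:
    when n > M the entire mass n * (1/n) = 1 lies above level M."""
    return 1.0 if n > M else 0.0

for M in (10, 100, 1000):
    sup_tail = max(tail_expectation(n, M) for n in range(1, 10 * M))
    # The sup over the family never decays as M grows: not UI.
    assert sup_tail == 1.0

# And indeed E|X_n| = 1 for all n while the limit X = 0 has E|X| = 0,
# so X_n -> X in probability but not in L^1.
```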

Submartingale convergence

- The following are equivalent for a submartingale:
  - It's uniformly integrable.
  - It converges a.s. and in L^1.
  - It converges in L^1.

Martingale convergence

- Martingale convergence theorem: The following are equivalent for a martingale:
  - It's uniformly integrable.
  - It converges a.s. and in L^1.
  - It converges in L^1.
  - There is an integrable random variable X so that X_n = E(X|F_n).
- This implies that every uniformly integrable martingale can be interpreted as a "revised expectation given latest information" sequence.
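The last equivalence is the "revised expectation" picture. In the sketch below (our own construction) X is the fraction of heads among N fair flips, F_n is generated by the first n flips, and X_n = E(X | F_n) has the closed form (H_n + (N − n)/2)/N: observed heads plus the expected heads among flips not yet revealed. The sequence of revised expectations ends exactly at X.

```python
import random

random.seed(4)

N = 1000
flips = [random.randint(0, 1) for _ in range(N)]
X = sum(flips) / N                 # the limiting random variable

heads, path = 0, []
for n, f in enumerate(flips, start=1):
    heads += f
    # X_n = E(X | F_n): observed heads plus expected heads among the
    # N - n flips not yet revealed.
    path.append((heads + (N - n) / 2) / N)

assert path[-1] == X               # X_N = X: all information revealed
assert abs(path[900] - X) <= 0.05  # only ~100 flips of uncertainty left
```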

18.175 Lecture 19

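As an illustration (my choice of example, not the lecture's), a Pólya urn gives a martingale bounded in [0, 1], hence uniformly integrable, and its paths visibly settle down as the theorem promises:

```python
import random

random.seed(0)

def polya_fraction_path(n_draws):
    """Polya urn: start with 1 red and 1 black ball; each draw adds one ball
    of the drawn color.  The fraction of red balls is a martingale taking
    values in [0, 1], hence uniformly integrable, so by the martingale
    convergence theorem it converges a.s. and in L^1."""
    red, total = 1, 2
    path = []
    for _ in range(n_draws):
        if random.random() < red / total:
            red += 1
        total += 1
        path.append(red / total)
    return path

path = polya_fraction_path(20_000)

# Late in the path the fraction moves by at most ~1/total per draw, so the
# last stretch of the trajectory has essentially settled at its limit.
late = path[-1000:]
print(max(late) - min(late))
```
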

Pages 55-57: Backwards martingales

I Suppose E(Xn+1|Fn) = Xn with n ≤ 0 (and Fn increasing as n increases).

I Theorem: X−∞ = limn→−∞ Xn exists a.s. and in L1.

I Proof idea: Use the upcrossing inequality to show that the expected number of upcrossings of any interval is finite. Since Xn = E(X0|Fn), the Xn are uniformly integrable, and we can deduce convergence in L1.

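One standard application, sketched here as background (it is not spelled out on the slide): for i.i.d. variables the running averages form a backwards martingale with respect to the exchangeable σ-fields, and the backwards martingale convergence theorem then delivers the strong law of large numbers.

```python
import random

random.seed(2)

# For i.i.d. draws X_1, X_2, ..., the averages A_n = (X_1 + ... + X_n)/n are
# a backwards martingale, so they converge a.s. and in L^1; a zero-one
# argument identifies the limit as E[X_1].  This simulation just watches the
# averages settle at the mean of Uniform(0, 1), namely 1/2.
xs = [random.uniform(0, 1) for _ in range(200_000)]

total = 0.0
averages = []
for n, x in enumerate(xs, start=1):
    total += x
    averages.append(total / n)

print(abs(averages[-1] - 0.5))   # small: A_n has settled near E[X_1] = 1/2
```
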

Pages 58-61: General optional stopping theorem

I Let Xn be a uniformly integrable submartingale.

I Theorem: For any stopping time N, XN∧n is uniformly integrable.

I Theorem: If E|XN| < ∞ and Xn1(N>n) is uniformly integrable, then XN∧n is uniformly integrable.

I Theorem: For any stopping time N ≤ ∞, we have EX0 ≤ EXN ≤ EX∞, where X∞ = lim Xn.

I Fairly general form of the optional stopping theorem: If L ≤ M are stopping times and YM∧n is a uniformly integrable submartingale, then EYL ≤ EYM and YL ≤ E(YM|FL).

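A minimal numerical check of the EX0 = EXN identity in the martingale case, using an illustrative example of my choosing: a simple symmetric random walk stopped on exiting an interval, where the stopped process is bounded and hence uniformly integrable.

```python
import random

random.seed(3)

def exit_point(a, b):
    """Simple symmetric random walk started at 0, stopped when it first hits
    -a or b.  The stopped walk X_{N∧n} is bounded, hence uniformly
    integrable, so the optional stopping theorem gives E[X_N] = E[X_0] = 0."""
    x = 0
    while -a < x < b:
        x += 1 if random.random() < 0.5 else -1
    return x

a, b = 3, 7
hits = [exit_point(a, b) for _ in range(20_000)]
p_hit_b = sum(h == b for h in hits) / len(hits)

# E[X_N] = 0 forces b * P(hit b) - a * (1 - P(hit b)) = 0, i.e.
# P(hit b) = a / (a + b) = 0.3 for these values.
print(p_hit_b)
```
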

Pages 62-67: Problems

I Classic brainteaser: 52 cards (half red, half black) are shuffled and placed face down. I turn them over one at a time. At some point (before the last card is turned over) you say "stop". If the subsequent card is red, you get one dollar. How do you time your stop to maximize your probability of winning?

I Classic observation: if rn denotes the fraction of face-down cards that are red after n have been turned over, then rn is a martingale.

I The optional stopping theorem implies that it doesn't matter when you say stop. All strategies yield the same expected payoff.

I The odds of winning are the same for monkey and genius.

I Unless you cheat.

I Classic question: Is this also true of the stock market?

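A quick simulation of the brainteaser (the strategy names are mine, for illustration): stopping immediately and waiting until the face-down deck is red-rich win equally often, as the martingale argument predicts.

```python
import random

random.seed(4)

def play(strategy, trials=10_000):
    """Estimate the win probability of `strategy` in the red-card game.
    `strategy` sees the cards turned over so far and returns True to say
    "stop"; the player wins iff the next card is red.  A stop is forced
    when only one card remains face down."""
    wins = 0
    for _ in range(trials):
        deck = ['R'] * 26 + ['B'] * 26
        random.shuffle(deck)
        for i in range(52):
            if i == 51 or strategy(deck[:i]):
                wins += deck[i] == 'R'
                break
    return wins / trials

def stop_now(seen):
    return True          # stop before the very first card is turned

def wait_for_black(seen):
    # stop once more black than red cards have appeared, i.e. when the
    # face-down deck holds a majority of red cards
    return seen.count('B') > seen.count('R')

p_stop_now, p_wait = play(stop_now), play(wait_for_black)
# r_n, the red fraction of the face-down deck, is a martingale, so optional
# stopping gives every legal strategy win probability exactly 26/52 = 1/2.
print(p_stop_now, p_wait)
```
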

Pages 68-79: Martingales as real-time subjective probability updates

I Ivan sees an email from his girlfriend with subject "possibly serious news" and thinks there's a 20 percent chance she'll break up with him by the email's end. He revises the number after each line:

I Oh Ivan, I've missed you so much! 12

I But there's something I have to tell you 23

I and please don't take this the wrong way. 29

I I've been spending lots of time with a guy named Robert, 47

I a visiting database consultant on my project 34

I who seems very impressed by my work 23

I and wants me to join his startup in Palo Alto. 38

I Said I'd absolutely have to talk to you first, 19

I that you are my first priority. 7

I But I'm just so confused on so many levels. 15

I Please call me! I love you! Alice 0


Pages 80-83: Continuous martingales

I Cassandra is a rational person. She updates her subjective probability estimates in real time, so fast that they can be viewed as continuous martingales.

I She uses the phrase "I think X" in a precise way: it means that P(X) > 1/2.

I Cassandra thinks she will win her tennis match today. However, she thinks that she will at some point think she won't win. She does not think that she will ever think that she won't at some point think she will win.

I What's the probability that Cassandra will win? (Give the full range of possibilities.)
