Source: ocw.nctu.edu.tw/upload/classbfs1209013347185177.pdf
CHAPTER 7
Continuous-Time Martingales
7.1. Stochastic processes
Let (Ω,F , P) be a probability space and let I ⊂ [0,∞) be an interval.
Definition 7.1. A real-valued stochastic process X = (Xt)t∈I is a family of random
variables (Xt : t ∈ I) on (Ω,F).
Remark 7.2. (1) We may regard the stochastic process X as a function of two
variables:
X : I × Ω → ℝ, (t, ω) ↦ Xt(ω).
(2) For fixed ω ∈ Ω, the map t ↦ Xt(ω) is a function I → ℝ, which is called a
path of X.
(3) For fixed t ∈ I, the map ω ↦ Xt(ω) is a function Ω → ℝ, which is a random
variable.
Definition 7.3. (1) A filtration F = (Ft)t∈I is a family of σ-algebras satisfying
Fs ⊆ Ft
for all s, t ∈ I with s ≤ t.
(2) A filtration is called right-continuous if
Ft = Ft+ := ⋂_{s>t} Fs for all t ∈ I.
(3) The σ-algebra F is complete if A ∈ F with P(A) = 0 and B ⊂ A implies B ∈ F.
(4) The filtration F is complete if F is complete and every null set in F is contained
in Ft for all t ∈ I (or contained in F0).2
(5) F is said to satisfy the usual condition if F is right-continuous and complete.
In continuous time, questions of measurability are more delicate than in discrete time.
Definition 7.4. (1) The stochastic process X is adapted to F if, for each t ≥ 0,
Xt is Ft-measurable.
2 That the filtration F is complete means in particular that every null set of F lies in each Ft. Moreover, we have
F is complete =⇒ Ft is complete for all t ∈ I.
(2) The stochastic process X is called measurable if the mapping
([0,∞) × Ω, B([0,∞)) ⊗ F) → (ℝ, B(ℝ)), (t, ω) ↦ Xt(ω)
is measurable, i.e., for each A ∈ B(ℝ), the set
{(t, ω) : Xt(ω) ∈ A} ∈ B([0,∞)) ⊗ F.
(3) The stochastic process X is called progressively measurable with respect to F if
([0, t] × Ω, B([0, t]) ⊗ Ft) → (ℝ, B(ℝ)), (s, ω) ↦ Xs(ω)
is measurable for all t ≥ 0, i.e., for each t ≥ 0 and A ∈ B(ℝ), the set
{(s, ω) : 0 ≤ s ≤ t, Xs(ω) ∈ A} ∈ B([0, t]) ⊗ Ft.
Remark 7.5. (1) Any progressively measurable process is measurable and adapted.
(2) If X is measurable and adapted, then it has a progressively measurable modification3.
(3) If the stochastic process X is F-adapted and every sample path is right-continuous
or else every sample path is left-continuous, then X is progressively measurable
with respect to F.
Definition 7.6. A random variable T : Ω −→ I ∪ {∞} is called a stopping time with
respect to F if
{T ≤ t} ∈ Ft, for all t ∈ I.
3 A modification Y of the stochastic process X is a stochastic process on the same probability space,
with the same parameter set I, such that
P(Xt = Yt) = 1 for all t ∈ I.
Note that stopping times in continuous time differ from those in discrete time. In
discrete time, a stopping time can equivalently be defined by
{T = t} ∈ Ft for all t ∈ I.
In continuous time this equivalence fails, as the following remark shows.
Remark 7.7. If T is a stopping time, then {T < t} ∈ Ft for all t ∈ I. But the
converse does not hold in general.
Proof. (1) For all t ∈ I,
{T < t} = ⋃_{n∈ℕ} {T ≤ t − 1/n} ∈ Ft,
since each {T ≤ t − 1/n} ∈ F_{t−1/n} ⊆ Ft.
(2) Consider the filtration
Ft = {∅, Ω} for t ≤ 1, and Ft = the power set of Ω for t > 1,
and a random time T given by
T(ω) = 1 if ω ∈ A, and T(ω) = 2 otherwise,
where A is a nontrivial subset of Ω. Then
• for t ≤ 1, {T < t} = ∅ ∈ Ft,
• for 1 < t ≤ 2, {T < t} = A ∈ Ft,
• for t > 2, {T < t} = Ω ∈ Ft.
This implies that {T < t} ∈ Ft for all t ≥ 0. However, {T ≤ 1} = A ∉ F1. This means
that T is not a stopping time. □
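The counterexample can be verified mechanically on a small sample space. The sketch below uses the hypothetical concrete choice Ω = {0, 1} and A = {0} (any nontrivial A works) and enumerates the events {T < t} and {T ≤ t}:

```python
# Brute-force check of the counterexample in Remark 7.7,
# with the illustrative choice Omega = {0, 1}, A = {0}.
from itertools import chain, combinations

Omega = frozenset({0, 1})
A = frozenset({0})

def F(t):
    """Filtration: trivial sigma-algebra for t <= 1, power set for t > 1."""
    if t <= 1:
        return {frozenset(), Omega}
    subsets = chain.from_iterable(combinations(Omega, r) for r in range(3))
    return {frozenset(s) for s in subsets}

def T(w):
    """Random time: 1 on A, 2 otherwise."""
    return 1 if w in A else 2

def event_lt(t):   # the event {T < t}
    return frozenset(w for w in Omega if T(w) < t)

def event_le(t):   # the event {T <= t}
    return frozenset(w for w in Omega if T(w) <= t)

# {T < t} lies in F_t for every t tested ...
assert all(event_lt(t) in F(t) for t in [0.5, 1, 1.5, 2, 3])
# ... but {T <= 1} = A is not in F_1, so T is not a stopping time.
assert event_le(1) == A and A not in F(1)
```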
Under what additional assumption does the converse hold?
Lemma 7.8. Suppose that F is right-continuous. Then T is a stopping time if and
only if {T < t} ∈ Ft, for all t ∈ I.
Proof. “=⇒”: By Remark 7.7.
“⇐=”: Since
{T ≤ t} = ⋂_{n≥m} {T < t + 1/n} ∈ F_{t+1/m}
for all m ∈ ℕ (each {T < t + 1/n} ∈ F_{t+1/n} ⊆ F_{t+1/m}), we have
{T ≤ t} ∈ ⋂_{m∈ℕ} F_{t+1/m} = Ft+ = Ft
due to the right-continuity of F. □
Example 7.9. Consider a set A ⊆ ℝ (or A ⊆ ℝⁿ). Define the first hitting time of A by
TA := inf{t ≥ 0 : Xt ∈ A}.
The one-dimensional and two-dimensional first hitting times are illustrated in Figure 7.1 and Figure 7.2.
Figure 7.1. One-dimensional first hitting time
162 7. CONTINUOUS-TIME MARTINGALES
Figure 7.2. Two-dimensional first hitting time
(1) If F is right-continuous, X is adapted and right-continuous, A is open, then TA
is a stopping time.
(2) If X is adapted and continuous, A is closed, then TA is a stopping time.
Proof. For every t ∈ I, since X is right-continuous and A is open,
{TA < t} = ⋃_{s<t} {Xs ∈ A} = ⋃_{r<t, r∈ℚ} {Xr ∈ A} ∈ Ft,
since each {Xr ∈ A} ∈ Fr ⊆ Ft.
By Lemma 7.8, we see that TA is a stopping time. □
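In simulation, the first hitting time of a discretized path is simply the first index at which the path lands in A. A minimal Python sketch (the random-walk path, the target set A = [3, ∞), and the helper `first_hitting_time` are all illustrative choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

def first_hitting_time(path, in_A):
    """First index t with path[t] in the target set A (np.inf if never hit);
    in_A is a vectorized predicate describing A."""
    hits = np.nonzero(in_A(path))[0]
    return int(hits[0]) if hits.size else np.inf

# Simple random-walk path started at X_0 = 0, target set A = [3, infinity).
steps = rng.choice([-1, 1], size=1000)
path = np.concatenate([[0], np.cumsum(steps)])
T_A = first_hitting_time(path, lambda x: x >= 3)

# Before T_A the path stays outside A, and X_{T_A} lies in A (when A is hit).
if T_A != np.inf:
    assert np.all(path[:T_A] < 3) and path[T_A] >= 3
```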
Definition 7.10. For stopping times S and T with S ≤ T, the stochastic interval
((S, T]] is defined by
((S, T]] := {(t, ω) ∈ I × Ω : S(ω) < t ≤ T(ω)}.
[[S, T]], ((S, T)), and [[S, T)) are defined similarly.
Definition 7.11. Let T be a stopping time. The collection
FT := {A ∈ F : A ∩ {T ≤ t} ∈ Ft for all t ∈ I}
Figure 7.3. Ω = ℝ; illustration of the stochastic interval [[S, T]]
is called the σ-algebra of events determined prior to the stopping time T .
This definition is consistent with the one in discrete time.
Lemma 7.12. Suppose (Xt) is an adapted and right-continuous stochastic process and
T < ∞ is a stopping time. Then XT, defined by XT(ω) := X_{T(ω)}(ω), is FT-measurable.
Exercise
(1) For the given sample space and probability measure, find the smallest complete
σ-algebra.
(a) Ω = {1, 2, 3, 4, 5, 6}, P({1, 2}) = P({3, 4}) = P({5, 6}) = 1/3.
(b) Ω = {1, 2, 3, 4, 5, 6}, P({1, 2}) = 0, P({3, 4}) = P({5, 6}) = 1/2.
(c) Ω = ℝ, P([n − 1, n)) = 1/2ⁿ for all n ∈ ℕ.
7.2. Uniform integrability
Definition 7.13. A family of random variables (Yα)α∈Λ is called uniformly integrable
(u.i.) if
lim_{c→∞} sup_{α∈Λ} ∫_{|Yα|>c} |Yα| dP = 0.
Theorem 7.14. (Yα)α∈Λ is uniformly integrable if and only if it satisfies the following
two conditions:
(i) sup_{α∈Λ} E|Yα| < ∞;
(ii) for all ε > 0, there exists δ = δ(ε) > 0 such that for all E ∈ F with P(E) < δ,
∫_E |Yα| dP < ε for all α ∈ Λ.
Example 7.15. (1) If |Yα| ≤ Z for all α ∈ Λ and for a random variable Z ∈ L1(P), then (Yα)α∈Λ is uniformly integrable.
Proof. Since |Yα| ≤ Z, we have
{|Yα| > c} ⊆ {Z > c}.
Hence,
0 ≤ lim_{c→∞} sup_{α∈Λ} ∫_{|Yα|>c} |Yα| dP ≤ lim_{c→∞} sup_{α∈Λ} ∫_{Z>c} |Yα| dP
≤ lim_{c→∞} sup_{α∈Λ} ∫_{Z>c} Z dP = lim_{c→∞} ∫_{Z>c} Z dP
= lim_{c→∞} ∫ Z I_{Z>c} dP = ∫ (lim_{c→∞} Z I_{Z>c}) dP = 0
due to the dominated convergence theorem. Hence, we get that
lim_{c→∞} sup_{α∈Λ} ∫_{|Yα|>c} |Yα| dP = 0. □
(2) If
sup_{α∈Λ} E[|Yα|^p] < ∞
for some p > 1, then (Yα)α∈Λ is uniformly integrable.
(Note: by Theorem 7.14, the corresponding condition with p = 1 is not sufficient; see (4) below.)
Proof. Since
sup_{α∈Λ} ∫_{|Yα|>c} |Yα| dP = sup_{α∈Λ} ∫_{|Yα|>c} (|Yα|^p / |Yα|^{p−1}) dP
≤ sup_{α∈Λ} (1/c^{p−1}) ∫ |Yα|^p dP = (1/c^{p−1}) sup_{α∈Λ} E[|Yα|^p],
which approaches 0 as c → ∞, we get that (Yα)α∈Λ is uniformly integrable. □
(3) If Z ∈ L1(P), then the collection of random variables
{E[Z|G] : G ⊆ F is a σ-algebra }
is uniformly integrable.
(4) Let Yn = n I_{(0,1/n)} and let P be the Lebesgue measure on (0, 1). For any c > 0 and any n > c,
∫_{|Yn|>c} |Yn| dP = ∫₀^{1/n} n dx = 1.
This implies that
sup_{n∈ℕ} ∫_{|Yn|>c} |Yn| dP = 1 for all c.
This means
lim_{c→∞} sup_{n∈ℕ} ∫_{|Yn|>c} |Yn| dP = 1.
Thus, (Yn) is not uniformly integrable.
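The tail integrals in (4) have an exact closed form, so the failure of uniform integrability can be checked without any sampling (the helper name below is illustrative):

```python
# Y_n = n * I_{(0, 1/n)} under Lebesgue measure on (0, 1).  The event
# {|Y_n| > c} is the interval (0, 1/n) when n > c and empty otherwise,
# so the tail integral is n * (1/n) = 1 for every n > c.
def tail_integral(n, c):
    """Exact value of the integral of |Y_n| over {|Y_n| > c}."""
    return 1.0 if n > c else 0.0

# sup_n of the tail integral equals 1 for every c: it never shrinks as
# c grows, so (Y_n) is not uniformly integrable.
for c in [1, 10, 100]:
    assert max(tail_integral(n, c) for n in range(1, 1000)) == 1.0
```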
(5) Consider a sequence of i.i.d. random variables (ξi) with
P(ξi = 1) = P(ξi = −1) = 1/2.
Let X0 = 0 and, for n ≥ 1,
Xn = ξ1 + ξ2 + · · · + ξn.
Then (Xn)n≥1 is not uniformly integrable, since
lim_{n→∞} E|Xn|/√n = √(2/π),
which implies that sup_{n∈ℕ} E|Xn| = ∞.
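The limit in (5) can be checked by Monte Carlo; the seed and sample sizes below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 10_000, 100_000

# X_n = xi_1 + ... + xi_n with P(xi = +-1) = 1/2; equivalently
# X_n = 2 * Binomial(n, 1/2) - n.  Estimate E|X_n| / sqrt(n).
Xn = 2 * rng.binomial(n, 0.5, size=trials) - n
estimate = np.abs(Xn).mean() / np.sqrt(n)

# The limit is sqrt(2/pi) ~ 0.7979, so E|X_n| ~ sqrt(2n/pi) -> infinity.
assert abs(estimate - np.sqrt(2 / np.pi)) < 0.01
```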
7.3. Martingale theory in continuous-time
Let I = [0,∞).
Definition 7.16. (1) A stochastic process X = (Xt)t≥0 is called a martingale
(with respect to P and F) if
(i) Xt ∈ L1(P) for all t ≥ 0;
(ii) X is adapted;
(iii) for 0 ≤ s ≤ t < ∞,
E[Xt|Fs] = Xs, P-a.s.
(2) X is called a submartingale if it satisfies (i), (ii), and
(iiia) for 0 ≤ s ≤ t < ∞,
E[Xt|Fs] ≥ Xs, P-a.s.
(3) X is called a supermartingale if it satisfies (i), (ii), and
(iiib) for 0 ≤ s ≤ t < ∞,
E[Xt|Fs] ≤ Xs, P-a.s.
Example 7.17. Let Z ∈ L1(P). Then the process (Xt) defined by
Xt = E[Z|Ft]
is a martingale. In fact, (Xt) is a uniformly integrable martingale (cf. Example 7.15(3)).
Theorem 7.18 (Optional Stopping Theorem). (1) Let X = (Xt) be a right-continuous,
uniformly integrable martingale and let S and T be stopping times with S ≤ T.
Then XS, XT ∈ L1(P) and
E[XT|FS] = XS, P-a.s.
(2) Let X = (Xt) be a right-continuous, uniformly integrable supermartingale and let
S and T be stopping times with S ≤ T. Then XS, XT ∈ L1(P) and
E[XT|FS] ≤ XS, P-a.s.
Definition 7.19. For a stochastic process X = (Xt)t≥0 and a stopping time T, the
stopped process X^T := (X^T_t)t≥0 is defined by
X^T_t := Xt∧T, i.e., X^T_t = Xt if t ≤ T, and X^T_t = XT if t > T.
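In a discrete-time sketch, freezing a path at a stopping time T amounts to indexing the path with t ∧ T (the sample path and the value of T below are illustrative):

```python
import numpy as np

def stopped_path(path, T):
    """Discrete-time stopped process X^T: (X^T)_t = X_{t ∧ T}."""
    t = np.arange(len(path))
    return path[np.minimum(t, T)]

path = np.array([0, 1, 0, -1, -2, -1, 0])
T = 3   # illustrative stopping time for this path

# The stopped path follows X up to time 3 and stays frozen afterwards.
assert stopped_path(path, T).tolist() == [0, 1, 0, -1, -1, -1, -1]
```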
Corollary 7.20. (1) If X = (Xt) is a right-continuous, uniformly integrable supermartingale and T is a stopping time, then X^T is a right-continuous, uniformly
integrable supermartingale.
(2) If X = (Xt) is a right-continuous, uniformly integrable martingale and T is a
stopping time, then XT is a right-continuous, uniformly integrable martingale.
Remark 7.21. The condition “uniformly integrable” is necessary. For example, let (Xt)t≥0 be
a random walk and let
T := inf{t ≥ 0 : Xt = 1}.
Then X^T is not a uniformly integrable martingale, since X^T_∞ = 1 P-a.s., so that
E[X^T_∞] = 1 ≠ 0 = E[X0]. More precisely, X is given by
Xt = Σ_{i=1}^t ξi,
where (ξi) is a sequence of i.i.d. random variables with
P(ξi = 1) = P(ξi = −1) = 1/2.
By Example 7.15(5), we see that X is not uniformly integrable, so the uniform
integrability assumption in Corollary 7.20 cannot be dropped.
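This failure is visible numerically: at every finite horizon n the stopped walk still has mean E[X^T_n] = E[X0] = 0, even though almost every path eventually freezes at the value 1 (the horizon, trial count, and seed below are arbitrary simulation choices):

```python
import numpy as np

rng = np.random.default_rng(2)
trials, horizon = 20_000, 2_000

# Simple random-walk paths X_1, ..., X_horizon with +-1 steps.
steps = 2 * rng.integers(0, 2, size=(trials, horizon), dtype=np.int8) - 1
paths = np.cumsum(steps, axis=1, dtype=np.int32)

# T = inf{n : X_n = 1}; value of the stopped walk X^T at the horizon.
hit = paths == 1
T = np.where(hit.any(axis=1), hit.argmax(axis=1), horizon)
stopped = np.where(T < horizon, 1, paths[:, -1])

assert abs(stopped.mean()) < 0.3     # E[X^T_n] = 0 at every finite n
assert (T < horizon).mean() > 0.9    # yet most paths have already frozen at 1
```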
Proposition 7.22. Let X = (Xt)0≤t≤∞ be adapted and right-continuous, and suppose that
XT ∈ L1(P) and E[XT] = E[X0]
for every stopping time T. Then X is a uniformly integrable martingale.
Remark 7.23. An alternative version of the optional sampling theorem:
(1) Suppose X = (Xt)0≤t≤∞ is a right-continuous martingale with last element X∞,
and S and T are stopping times with S ≤ T. Then
E[XT|FS] = XS, P-a.s.
(2) Suppose X = (Xt)0≤t≤∞ is a right-continuous supermartingale with last element
X∞, and S and T are stopping times with S ≤ T. Then
E[XT|FS] ≤ XS, P-a.s.
7.4. Local martingales
Definition 7.24. An adapted, right-continuous stochastic process X = (Xt)t≥0 is
called a local martingale if there exists a sequence of stopping times (Tn) with Tn ↑ ∞
P-a.s. such that the stopped process
X^{Tn} I_{Tn>0} = (Xt∧Tn I_{Tn>0})t≥0
is a (uniformly integrable) martingale with respect to (Ft).
Notation 7.25. M^loc := the collection of all local martingales;
M^loc_0 := {X ∈ M^loc : X0 = 0 P-a.s.}.
Remark 7.26. Every martingale is a local martingale.
Proof. Let Tn = n; then (Xt∧n, Ft) is a martingale. This implies that (Xt) is a local
martingale. □
Remark 7.27. A local martingale need not be a martingale; cf. Karatzas and Shreve
[19], p. 168.
Remark 7.28. A local martingale with
sup_{0≤r≤t} |Xr| ∈ L1(P) for all t ≥ 0
is a martingale.
Proof. Assume X0 = 0, and let (Tn) be a sequence of stopping times with Tn ↑ ∞ such
that X^{Tn} is a martingale for all n. For 0 ≤ s ≤ t, we have
E[Xt∧Tn|Fs] = Xs∧Tn.
As n → ∞, Tn → ∞, so
Xt∧Tn → Xt P-a.s. and Xs∧Tn → Xs P-a.s.
Since
sup_n |Xt∧Tn| ≤ sup_{0≤r≤t} |Xr| ∈ L1(P),
the dominated (Lebesgue) convergence theorem for conditional expectations yields
Xs = lim_{n→∞} Xs∧Tn = lim_{n→∞} E[Xt∧Tn|Fs] = E[lim_{n→∞} Xt∧Tn|Fs] = E[Xt|Fs].
Hence (Xt) is a martingale. □
Proposition 7.29. Every nonnegative local martingale is a supermartingale.
Proof. Let (Tn) be a sequence of stopping times with Tn ↑ ∞ such that X^{Tn} is a martingale
for all n. Then for 0 ≤ s < t,
Xs = lim_{n→∞} Xs∧Tn = lim_{n→∞} E[Xt∧Tn|Fs] ≥ E[lim inf_{n→∞} Xt∧Tn|Fs] = E[Xt|Fs]
due to (the conditional version of) Fatou's lemma. □
7.5. Doob-Meyer decomposition
Recall: (Doob decomposition)
Suppose that (Xn,Fn) is a supermartingale, then
Xn = Yn − Zn,
where (Yn) is a martingale and (Zn) is an increasing previsible process.
This decomposition is unique.
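In discrete time the decomposition can be computed explicitly. As a hypothetical worked example (not from the text), take the supermartingale Xn = −Sn² built from a simple random walk S: since E[Xn+1 − Xn | Fn] = −1, the increasing previsible part is Zn = n and the martingale part is Yn = Xn + n:

```python
import numpy as np

rng = np.random.default_rng(3)
trials, n = 100_000, 20

# Simple random walk S_k = xi_1 + ... + xi_k with P(xi = +-1) = 1/2.
xi = rng.choice([-1, 1], size=(trials, n))
S = np.cumsum(xi, axis=1)

# X_k = -S_k^2 is a supermartingale: E[X_{k+1} - X_k | F_k] = -1.
# Doob decomposition: Z_k = -sum_{j<k} E[X_{j+1} - X_j | F_j] = k is the
# increasing previsible part, so Y_k = X_k + k should be a martingale.
X = -S**2
Y = X + np.arange(1, n + 1)

# Sanity check on the martingale part: E[Y_k] = 0 for every k
# (E[S_k^2] = k exactly), up to Monte Carlo error.
assert np.all(np.abs(Y.mean(axis=0)) < 0.5)
```

A constant expectation is of course only a necessary condition for the martingale property; here the identity Yk+1 − Yk = −2Skξk+1 shows directly that the increments have zero conditional mean.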
How about the continuous-time case? We first need a suitable notion of “previsible”.
In discrete time, previsible means that Xn is Fn−1-measurable; in continuous time there
is no “previous time point”, so a different definition is required.
Definition 7.30. Let Ω̄ := Ω × (0,∞).
(1) The previsible σ-algebra P on Ω̄ is the σ-algebra generated by all left-continuous,
adapted processes on Ω̄.
(2) A stochastic process X is called previsible if X is measurable with respect to the
previsible σ-algebra P.
Remark 7.31. Intuitively, the value of a previsible process at time t can be predicted
from the information available strictly before t. Why generate P by left-continuous
processes? If a stochastic process is left-continuous, its value at time t is determined by
its values at earlier times.
Note: a previsible process need not be left-continuous.
Remark 7.32. (1) Every previsible process is adapted.
(2) Every continuous, increasing process is previsible.
(3) If F satisfies the usual condition, every previsible process is adapted to (Ft−).
For more on previsible processes, see Ph. Protter [24]. Before stating the Doob-Meyer
decomposition, we fix some notation.
Notation 7.33. M^2 := the collection of all càdlàg5 martingales (Mt) with
sup_{t≥0} E[Mt²] < ∞; (7.1)
M^2_0 := {M ∈ M^2 : M0 = 0};
M^{2,c}_0 := {M ∈ M^2_0 : M is continuous in t}.
Theorem 7.34 (Doob-Meyer decomposition). Let X = (Xt)t≥0 be a right-continuous
supermartingale such that the collection of random variables
{XT : T is a stopping time with P(T < ∞) = 1}
is uniformly integrable. Then X admits a unique decomposition
Xt = X0 + Mt − At,
where M is a right-continuous, uniformly integrable martingale with M0 = 0 and A is an
increasing, right-continuous, previsible process with A0 = 0.
Corollary 7.35. Let M ∈ M^2 be right-continuous. Then there exists a unique increasing, right-continuous, previsible process 〈M〉 = (〈M〉t)t≥0 with 〈M〉0 = 0 such that the process M² − 〈M〉 is a martingale.
5 Continu à droite, limite à gauche: right-continuous with left limits, also written RCLL.
Note that for M ∈ M^2 the martingale convergence theorem applies; in particular, the
last element M∞ exists and
Mt = E[M∞|Ft].
Proof. By Jensen's inequality,
E[Mt²|Fs] ≥ (E[Mt|Fs])² = Ms²,
so (Mt²) is a submartingale. It remains to check that (Mt²) is uniformly integrable;
since M ∈ M^2, M∞ exists and is integrable, which gives the uniform integrability of
(Mt²). Applying the Doob-Meyer decomposition, we obtain the desired result. □
Definition 7.36. 〈M〉 is called the quadratic variation of M .
Can the quadratic variation still be defined when (7.1) fails but E[Mt²] < ∞ for every
fixed t ∈ [0,∞)? In general this is handled by localization. Indeed, for fixed N,
sup_{0≤t≤N} E[Mt²] = E[MN²] < ∞
and
Mt = E[MN|Ft] for t ≤ N,
so on each interval [0, N] the martingale satisfies (7.1) and 〈M〉 is defined there;
letting N → ∞ then defines 〈M〉 on [0,∞). The following lemma explains why 〈M〉 is
called the quadratic variation.
Lemma 7.37. Let M ∈ M^{2,c}. For a partition Π : 0 = t0 < t1 < · · · < tm = t of [0, t], set
‖Π‖ := max_{1≤k≤m} |tk − tk−1|.
Then
lim_{‖Π‖→0} Σ_{k=1}^m |Mtk − Mtk−1|² = 〈M〉t in probability,
i.e., for any ε > 0 and η > 0, there exists δ > 0 such that
‖Π‖ < δ =⇒ P(|Σ_{k=1}^m |Mtk − Mtk−1|² − 〈M〉t| > ε) < η.
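Lemma 7.37 can be illustrated with standard Brownian motion, the basic example of an element of M^{2,c} on bounded intervals, for which 〈M〉t = t: the sum of squared increments over a fine partition concentrates at t (the partition size and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
t, m = 1.0, 100_000   # horizon and number of partition intervals

# Brownian increments over the uniform partition 0 = t_0 < ... < t_m = t;
# each increment is N(0, t/m).
dM = rng.normal(0.0, np.sqrt(t / m), size=m)
qv = np.sum(dM**2)    # sum of squared increments

assert abs(qv - t) < 0.05   # concentrates at <M>_t = t as the mesh shrinks
```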
Definition 7.38. Let M, N ∈ M^2. Then the process
〈M,N〉 := (1/4)(〈M + N〉 − 〈M − N〉)
is called the cross variation (or quadratic covariation) of M and N.
Remark 7.39. (1) 〈M,M〉t = 〈M〉t.
(2) MN − 〈M,N〉 is a martingale. Moreover, if M, N are right-continuous, 〈M,N〉
is the unique right-continuous, previsible process B of bounded variation (note:
B itself need not be nonincreasing or nondecreasing) with B0 = 0 such that
MN − B is a martingale.
(3) If M, N ∈ M^{2,loc} are right-continuous, then there exist a unique increasing,
right-continuous, previsible process 〈M〉 and a unique right-continuous, previsible
process 〈M,N〉 of bounded variation with 〈M〉0 = 〈M,N〉0 = 0 such that
M² − 〈M〉 and MN − 〈M,N〉 are local martingales.
(4) lim_{‖Π‖→0} Σ_{k=1}^m (Mtk − Mtk−1)(Ntk − Ntk−1) = 〈M,N〉t in probability.
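The polarization formula of Definition 7.38 and the realized cross variation of Remark 7.39(4) can be checked against each other on simulated correlated Brownian increments (the correlation ρ, partition size, and construction of N are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)
t, m, rho = 1.0, 200_000, 0.5

# Two correlated Brownian motions: M = B and N = rho*B + sqrt(1-rho^2)*W
# with B, W independent, so that <M, N>_t = rho * t.
dB = rng.normal(0.0, np.sqrt(t / m), size=m)
dW = rng.normal(0.0, np.sqrt(t / m), size=m)
dM, dN = dB, rho * dB + np.sqrt(1 - rho**2) * dW

cross = np.sum(dM * dN)                                        # Remark 7.39(4)
polar = 0.25 * (np.sum((dM + dN)**2) - np.sum((dM - dN)**2))   # Definition 7.38

assert abs(cross - rho * t) < 0.02
assert abs(polar - cross) < 1e-9   # polarization identity holds termwise
```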
Exercise
(1) Let M,N ∈ M2,c0 be independent stochastic processes on a filtered probability
space (Ω,F , F = (Ft)t≥0, P) with quadratic variation 〈M〉t = 2t and 〈N〉t = 4t,
respectively.
(a) Find the cross variation 〈M, N〉t.
(b) Find the quadratic variation of M + N .
(c) Find the quadratic variation of M − N .
(2) Let M, N ∈ M^{2,c}_0 be stochastic processes on a filtered probability space
(Ω, F, F = (Ft)t≥0, P) with quadratic variations 〈M〉t = 4t and 〈N〉t = 6t,
respectively. Moreover, their cross variation 〈M,N〉t is given by 2t.
(a) Find the quadratic variation of M + N .
(b) Find the quadratic variation of M − N .
7.6. Semimartingales
In this section we introduce a class of processes that contains both the martingales and the local martingales.
Definition 7.40. A stochastic process X = (Xt)t≥0 is called a semimartingale
if X is an adapted process with the decomposition
Xt = X0 + Mt + At, (7.2)
where (Mt) is a local martingale with M0 = 0 and (At) is an adapted, càdlàg process of
bounded variation, i.e., there exist increasing, adapted processes A⁺, A⁻ such that
At = A⁺t − A⁻t.
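For a discrete finite-variation path, the decomposition At = A⁺t − A⁻t can be computed by splitting the increments into their positive and negative parts (a sketch; the helper name is illustrative):

```python
import numpy as np

def jordan_decomposition(A):
    """Split a discrete finite-variation path A into A = A[0] + A_plus - A_minus,
    where A_plus and A_minus are nondecreasing and start at 0."""
    dA = np.diff(A)
    A_plus = np.concatenate([[0.0], np.cumsum(np.maximum(dA, 0.0))])
    A_minus = np.concatenate([[0.0], np.cumsum(np.maximum(-dA, 0.0))])
    return A_plus, A_minus

A = np.array([0.0, 1.0, 0.5, 2.0, 1.0])
A_plus, A_minus = jordan_decomposition(A)

assert np.allclose(A[0] + A_plus - A_minus, A)
assert np.all(np.diff(A_plus) >= 0) and np.all(np.diff(A_minus) >= 0)
```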
Remark 7.41. In general, the decomposition (7.2) is not unique. But if X is continuous, the decomposition is unique.
Lemma 7.42. A continuous local martingale of bounded variation is constant P-a.s.
Remark 7.43. A continuous non-constant local martingale is not of bounded variation.