
Solutions to the Exercises in Stochastic Analysis

Lecturer: Xue-Mei Li

1 Problem Sheet 1

In these solutions I avoid using conditional expectations. But do try to give alternative proofs once we have learnt conditional expectations.

Exercise 1 For $x, y \in \mathbb{R}^d$ define $p_t(x,y) = (2\pi t)^{-d/2} e^{-\frac{|x-y|^2}{2t}}$. Prove that $P_t(x,dy) = p_t(x,y)\,dy$ satisfies the Chapman-Kolmogorov equation: for $\Gamma$ a Borel subset of $\mathbb{R}^d$,
\[
P_{t+s}(x,\Gamma) = \int_{\mathbb{R}^d} P_s(y,\Gamma)\,P_t(x,dy).
\]

Solution: On the one hand,
\[
P_{t+s}(x,\Gamma) = (2\pi(t+s))^{-d/2}\int_\Gamma e^{-\frac{|x-z|^2}{2(t+s)}}\,dz.
\]
On the other hand,
\[
\int_{\mathbb{R}^d} P_s(y,\Gamma)\,P_t(x,dy) = (2\pi t)^{-d/2}(2\pi s)^{-d/2}\int_{\mathbb{R}^d}\int_\Gamma e^{-\frac{|y-z|^2}{2s}}\,e^{-\frac{|y-x|^2}{2t}}\,dz\,dy.
\]
We now complete the square in $y$:
\[
-\frac{|y-z|^2}{2s} - \frac{|y-x|^2}{2t} = -\frac{1}{2}\,\frac{t+s}{st}\left|y - \frac{tz+sx}{t+s}\right|^2 - \frac{1}{2}\,\frac{|x-z|^2}{t+s}.
\]

Stochastic Analysis (2014). Lecturer: Xue-Mei Li. Support Class: Andris Gerasimovics.

Next we change the variable $y - \frac{tz+sx}{t+s}$ to $y$; then
\[
\int_{\mathbb{R}^d} P_s(y,\Gamma)\,P_t(x,dy)
= (2\pi t)^{-d/2}(2\pi s)^{-d/2}\int_{\mathbb{R}^d}\int_\Gamma e^{-\frac{1}{2}\frac{t+s}{st}|y|^2}\,e^{-\frac{1}{2}\frac{|x-z|^2}{t+s}}\,dz\,dy
\]
\[
= (2\pi t)^{-d/2}(2\pi s)^{-d/2}\int_{\mathbb{R}^d} e^{-\frac{1}{2}\frac{t+s}{st}|y|^2}\,dy\int_\Gamma e^{-\frac{1}{2}\frac{|x-z|^2}{t+s}}\,dz
= (2\pi(t+s))^{-d/2}\int_\Gamma e^{-\frac{|x-z|^2}{2(t+s)}}\,dz,
\]
since $\int_{\mathbb{R}^d} e^{-\frac{1}{2}\frac{t+s}{st}|y|^2}\,dy = \big(2\pi\frac{st}{t+s}\big)^{d/2}$. This agrees with $P_{t+s}(x,\Gamma)$, proving the Chapman-Kolmogorov equation.
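The Chapman-Kolmogorov equation can be checked numerically in dimension one: the following sketch (function names are ours, not from the notes) compares a Riemann-sum approximation of the convolution $\int p_t(x,y)p_s(y,z)\,dy$ with $p_{t+s}(x,z)$ directly.

```python
import math

def heat_kernel(t, x, y):
    # one dimensional heat kernel p_t(x, y) = (2*pi*t)^(-1/2) exp(-(x-y)^2 / (2t))
    return math.exp(-(x - y) ** 2 / (2 * t)) / math.sqrt(2 * math.pi * t)

def convolved_kernel(t, s, x, z, half_width=20.0, n=4000):
    # midpoint-rule approximation of  integral p_t(x, y) p_s(y, z) dy
    h = 2 * half_width / n
    total = 0.0
    for i in range(n):
        y = x - half_width + (i + 0.5) * h
        total += heat_kernel(t, x, y) * heat_kernel(s, y, z) * h
    return total

# Chapman-Kolmogorov: the convolution of p_t and p_s is p_{t+s}
lhs = heat_kernel(0.7 + 0.3, 0.5, -0.2)          # p_{t+s}(x, z)
rhs = convolved_kernel(0.7, 0.3, 0.5, -0.2)
```

For Gaussian integrands the midpoint rule converges very quickly, so `lhs` and `rhs` agree to high accuracy.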

Exercise 2 Let $(X_t, t \ge 0)$ be a Markov process with $X_0 = 0$ and transition function $p_t(x,y)\,dy$ where $p_t(x,y)$ is the heat kernel. Prove the following statements.

(1) For any $s < t$, $X_t - X_s \sim P_{t-s}(0,dy)$;

(2) Prove that $(X_t)$ has independent increments.

(3) For every number $p > 0$ there exists a constant $c(p)$ such that
\[
E|X_t - X_s|^p = c(p)|t-s|^{p/2}.
\]

(4) State Kolmogorov's continuity theorem and conclude that for almost all $\omega$, $X_t(\omega)$ is locally Hölder continuous with exponent $\alpha$ for any number $\alpha < 1/2$.

(5) Prove that this is a Brownian motion on $\mathbb{R}^d$.

Solution: Let $f$ be a bounded measurable function.
(1) Since $(x_s, x_t) \sim P_s(0,dx)P_{t-s}(x,dy)$,
\[
E f(x_t - x_s) = \int_{\mathbb{R}^d}\int_{\mathbb{R}^d} f(y-x)\,p_s(0,x)\,p_{t-s}(x,y)\,dx\,dy
= \int_{\mathbb{R}^d}\int_{\mathbb{R}^d} f(z)\,p_s(0,x)\,p_{t-s}(0,z)\,dx\,dz
\]
\[
= \int_{\mathbb{R}^d} f(z)\,p_{t-s}(0,z)\,dz\int_{\mathbb{R}^d} p_s(0,x)\,dx = \int_{\mathbb{R}^d} f(z)\,p_{t-s}(0,z)\,dz.
\]
Hence $x_t - x_s \sim P_{t-s}(0,dz)$.


(2) Let us fix $0 = t_0 < t_1 < t_2 < \dots < t_n$ and Borel sets $A_i \in \mathcal{B}(\mathbb{R})$, $i = 1,\dots,n$. Let $f_i(x) = 1_{x\in A_i}$. Then we obtain
\[
P(X_{t_1}\in A_1,\dots,X_{t_n}-X_{t_{n-1}}\in A_n)
= \int_{\mathbb{R}}\dots\int_{\mathbb{R}} f_1(x_1)f_2(x_2-x_1)\dots f_n(x_n-x_{n-1})\,
p_{t_1}(0,x_1)p_{t_2-t_1}(x_1,x_2)\dots p_{t_n-t_{n-1}}(x_{n-1},x_n)\,dx_n\dots dx_1,
\]
where in the last line we have used the Markov property. Introducing new variables $y_1 = x_1$, $y_2 = x_2 - x_1$, \dots, $y_n = x_n - x_{n-1}$, we obtain
\[
P(X_{t_1}\in A_1,\dots,X_{t_n}-X_{t_{n-1}}\in A_n)
= \int_{\mathbb{R}}\dots\int_{\mathbb{R}} f_1(y_1)f_2(y_2)\dots f_n(y_n)\,
p_{t_1}(0,y_1)p_{t_2-t_1}(0,y_2)\dots p_{t_n-t_{n-1}}(0,y_n)\,dy_n\dots dy_1
= \prod_{i=1}^n\int_{\mathbb{R}} f_i(y_i)\,p_{t_i-t_{i-1}}(0,y_i)\,dy_i.
\]
This means that $X_{t_i}-X_{t_{i-1}}$, $i = 1,\dots,n$, are independent random variables.

(3)

\[
E|x_t-x_s|^p = \frac{1}{(2\pi(t-s))^{d/2}}\int_{\mathbb{R}^d}|z|^p e^{-\frac{|z|^2}{2(t-s)}}\,dz
= \frac{1}{(2\pi(t-s))^{d/2}}\int_{\mathbb{R}^d}(t-s)^{p/2}|y|^p e^{-\frac{|y|^2}{2}}(t-s)^{d/2}\,dy
= (t-s)^{p/2}\,\frac{1}{(2\pi)^{d/2}}\int_{\mathbb{R}^d}|y|^p e^{-\frac{|y|^2}{2}}\,dy,
\]
using the change of variable $z = \sqrt{t-s}\,y$. The integral is finite. Let
\[
c(p) = \int_{\mathbb{R}^d}|y|^p\,p_1(0,y)\,dy,
\]
which is the $p$th moment of an $N(0, I_{d\times d})$ distributed variable.

(4) and (5) are straightforward applications of Kolmogorov's theorem.
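The scaling identity in (3) is easy to test by Monte Carlo in $d = 1$: for $p = 4$ the fourth moment of $N(0,1)$ is $c(4) = 3$, so $E|B_t - B_s|^4 = 3(t-s)^2$. A small sketch (seed and sample size are our choices):

```python
import math
import random

def mc_increment_moment(t, s, p, n_samples=200_000, seed=1):
    # B_t - B_s ~ N(0, t-s); estimate E|B_t - B_s|^p by Monte Carlo
    rng = random.Random(seed)
    sd = math.sqrt(t - s)
    return sum(abs(rng.gauss(0.0, sd)) ** p for _ in range(n_samples)) / n_samples

est = mc_increment_moment(t=1.5, s=0.5, p=4)
exact = 3 * (1.5 - 0.5) ** 2   # c(4) |t-s|^{4/2} = 3
```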

Exercise 3 If $(B_t)$ is a Brownian motion, prove that $(B_t)$ is a Markov process with transition function $p_t(x,y)\,dy$.


Solution: Fix $0 < t_1 < \dots < t_n$ and Borel sets $A_1,\dots,A_n$, and denote for simplicity $f_i(x) = 1_{x\in A_i}$. Furthermore, we define the random variables $Y_i = X_{t_i} - X_{t_{i-1}}$ for $i = 1,\dots,n$, where we have postulated $t_0 = 0$. From the properties of Brownian motion we obtain that the random variables $Y_i$ are independent, and moreover $Y_i \sim N(0, t_i - t_{i-1})$. Thus we have
\[
P[X_{t_1}\in A_1,\dots,X_{t_n}\in A_n] = E\prod_{i=1}^n f_i(X_{t_i})
= E[f_1(Y_1)f_2(Y_2+Y_1)\dots f_n(Y_n+\dots+Y_1)]
\]
\[
= \int_{\mathbb{R}}\dots\int_{\mathbb{R}} f_1(y_1)f_2(y_2+y_1)\dots f_n(y_n+\dots+y_1)\,
p_{t_1}(0,y_1)p_{t_2-t_1}(0,y_2)\dots p_{t_n-t_{n-1}}(0,y_n)\,dy_n\dots dy_1.
\]
Now we introduce new variables $x_1 = y_1$, $x_2 = y_2+y_1$, \dots, $x_n = y_n+\dots+y_1$, and obtain that the last integral equals
\[
\int_{\mathbb{R}}\dots\int_{\mathbb{R}} f_1(x_1)f_2(x_2)\dots f_n(x_n)\,
p_{t_1}(0,x_1)p_{t_2-t_1}(0,x_2-x_1)\dots p_{t_n-t_{n-1}}(0,x_n-x_{n-1})\,dx_n\dots dx_1.
\]
Noticing that $p_t(0,y-x) = p_t(x,y)$ and recalling the definition of the functions $f_i$, the finite dimensional distributions agree with those of the Markov process determined by the heat kernels. The two processes must agree.

Exercise 4 Let $(X_t, t\ge0)$ be a continuous real-valued stochastic process with $X_0 = 0$ and let $p_t(x,y)$ be the heat kernel on $\mathbb{R}$. Prove that the following statements are equivalent:

(i) $(X_t, t\ge0)$ is a one dimensional Brownian motion.

(ii) For any number $n\in\mathbb{N}$, any sets $A_i\in\mathcal{B}(\mathbb{R})$, $i = 1,\dots,n$, and any $0 < t_1 < t_2 < \dots < t_n$,
\[
P[X_{t_1}\in A_1,\dots,X_{t_n}\in A_n]
= \int_{A_1}\dots\int_{A_n} p_{t_1}(0,y_1)\,p_{t_2-t_1}(y_1,y_2)\dots p_{t_n-t_{n-1}}(y_{n-1},y_n)\,dy_n\dots dy_1.
\]

Solution: This follows from the previous exercises.


Exercise 5 A zero mean Gaussian process $B^H_t$ is a fractional Brownian motion of Hurst parameter $H$, $H\in(0,1)$, if its covariance is
\[
E(B^H_t B^H_s) = \frac{1}{2}\big(t^{2H} + s^{2H} - |t-s|^{2H}\big).
\]
Then $E|B^H_t - B^H_s|^p = C|t-s|^{pH}$. If $H = 1/2$ this is Brownian motion (otherwise this process is not even a semi-martingale). Show that $(B^H_t)$ has a continuous modification whose sample paths are Hölder continuous of order $\alpha < H$.

Solution: Since $E|B^H_t - B^H_s|^p = C|t-s|^{pH} = C|t-s|^{1+(pH-1)}$, for $p > 1/H$ we can apply the Kolmogorov continuity criterion to obtain that $B^H_t$ has a modification whose sample paths are Hölder continuous of order $\alpha < (pH-1)/p$. For any $\alpha < H$ we can take $p$ large enough to have $\alpha < (pH-1)/p$. This finishes the proof.
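The moment identity used above, in the case $p = 2$, follows from the covariance by a two-line expansion: $E|B^H_t - B^H_s|^2 = t^{2H} + s^{2H} - 2\,E(B^H_tB^H_s) = |t-s|^{2H}$. A quick deterministic check of this algebra (function names are ours):

```python
def fbm_cov(H, t, s):
    # covariance E[B^H_t B^H_s] of fractional Brownian motion
    return 0.5 * (t ** (2 * H) + s ** (2 * H) - abs(t - s) ** (2 * H))

def increment_second_moment(H, t, s):
    # E|B^H_t - B^H_s|^2 expanded via the covariance
    return fbm_cov(H, t, t) - 2 * fbm_cov(H, t, s) + fbm_cov(H, s, s)

lhs = increment_second_moment(0.3, 2.0, 0.5)
rhs = abs(2.0 - 0.5) ** (2 * 0.3)   # |t-s|^{2H}
```

For $H = 1/2$ the same computation reduces to $E|B_t - B_s|^2 = |t-s|$, the Brownian case.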

Exercise 6 Let $(B_t)$ be a Brownian motion on $\mathbb{R}^d$. Let $T$ be a positive number. For $t\in[0,T]$ set $Y_t = B_t - \frac{t}{T}B_T$. Compute the probability distribution of $Y_t$.

Solution:
\[
E f\Big(B_t - \frac{t}{T}B_T\Big)
= \int_{\mathbb{R}^d}\int_{\mathbb{R}^d} f\Big(x - \frac{t}{T}y\Big)\,(2\pi t)^{-d/2}e^{-\frac{|x|^2}{2t}}\,(2\pi(T-t))^{-d/2}e^{-\frac{|y-x|^2}{2(T-t)}}\,dx\,dy.
\]

We observe that
\[
|x|^2 = \Big|x - \frac{t}{T}y\Big|^2 + \frac{2t}{T}\langle x,y\rangle - \frac{t^2}{T^2}|y|^2
\]
and that
\[
|x-y|^2 = \Big|x - \frac{t}{T}y\Big|^2 - \frac{2(T-t)}{T}\langle x,y\rangle + \frac{2t(T-t)}{T^2}|y|^2 + \frac{(T-t)^2}{T^2}|y|^2.
\]
Thus,
\[
\frac{|x|^2}{t} + \frac{|y-x|^2}{T-t}
= \frac{1}{t}\Big|x - \frac{t}{T}y\Big|^2 + \frac{1}{T-t}\Big|x - \frac{t}{T}y\Big|^2 + \frac{1}{T}|y|^2.
\]


Finally,
\[
E f\Big(B_t - \frac{t}{T}B_T\Big)
= (2\pi t)^{-d/2}(2\pi(T-t))^{-d/2}\int_{\mathbb{R}^d}\int_{\mathbb{R}^d} f\Big(x - \frac{t}{T}y\Big)\,
e^{-\frac{1}{2t}|x-\frac{t}{T}y|^2}\,e^{-\frac{1}{2(T-t)}|x-\frac{t}{T}y|^2}\,e^{-\frac{|y|^2}{2T}}\,dx\,dy
\]
\[
= (2\pi t)^{-d/2}(2\pi(T-t))^{-d/2}\int_{\mathbb{R}^d} f(z)\,e^{-\frac{1}{2t}|z|^2}\,e^{-\frac{1}{2(T-t)}|z|^2}\,dz
\int_{\mathbb{R}^d} e^{-\frac{|y|^2}{2T}}\,dy
= \int_{\mathbb{R}^d} f(z)\,p_t(0,z)\,p_{T-t}(z,0)\,dz\cdot(2\pi T)^{d/2}.
\]
Finally we see
\[
B_t - \frac{t}{T}B_T \sim \frac{p_t(0,z)\,p_{T-t}(z,0)}{p_T(0,0)}\,dz.
\]
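In $d = 1$, expanding the exponents in the density above shows that $Y_t = B_t - \frac{t}{T}B_T$ is $N\big(0, \frac{t(T-t)}{T}\big)$, the Brownian bridge marginal. A Monte Carlo check of this variance (sampling scheme and names are ours):

```python
import math
import random

def bridge_variance_mc(t, T, n_samples=200_000, seed=7):
    # Y_t = B_t - (t/T) B_T; sample B_t and the increment B_T - B_t independently
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n_samples):
        bt = rng.gauss(0.0, math.sqrt(t))
        bT = bt + rng.gauss(0.0, math.sqrt(T - t))
        y = bt - (t / T) * bT
        acc += y * y
    return acc / n_samples

est = bridge_variance_mc(t=0.3, T=1.0)
exact = 0.3 * (1.0 - 0.3) / 1.0   # t(T-t)/T
```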


2 Brownian motion, conditional expectation, and uniform integrability

Exercise 7 Let $(B_t)$ be a standard Brownian motion. Prove that

(a) (i) for any $t > 0$, $E[B_t] = 0$ and $E[B_t^2] = t$;
(ii) for any $s, t \ge 0$, $E[B_sB_t] = s\wedge t$, where $s\wedge t = \min(s,t)$.

(b) (Scaling invariance) For any $a > 0$, $\frac{1}{\sqrt a}B_{at}$ is a Brownian motion;

(c) (Translation invariance) For any $t_0 \ge 0$, $B_{t_0+t} - B_{t_0}$ is a standard Brownian motion;

(d) If $X_t = B_t - tB_1$, $0\le t\le 1$, then $E(X_sX_t) = s(1-t)$ for $s\le t$. Compute the probability distribution of $X_t$.

Hint: break $X_t$ down as the sum of two independent Gaussian random variables, then compute its characteristic function.

Solution: (a) (i) Since the distribution of $B_t$ is $N(0,t)$, we have $E[B_t] = 0$ and $E[B_t^2] = t$.
(a) (ii) We fix any $t\ge s\ge0$. Then we have
\[
E[B_sB_t] = E[(B_t-B_s)B_s] + E[B_s^2].
\]
Since Brownian motion has independent increments, the random variables $B_t-B_s$ and $B_s$ are independent and we have
\[
E[(B_t-B_s)B_s] = E[B_t-B_s]\,E[B_s] = 0.
\]
Furthermore, from (i) we know that $E[B_s^2] = s$. Hence we conclude $E[B_sB_t] = s$, which is the required identity.
(b) Let us denote $W_t = \frac{1}{\sqrt a}B_{at}$. Then $W$ has continuous sample paths and independent increments, which follows from the same properties of $B$. Furthermore, for any $t > s\ge0$, we have
\[
W_t - W_s = \frac{1}{\sqrt a}(B_{at}-B_{as}) \sim N\Big(0,\frac{a(t-s)}{a}\Big) = N(0, t-s),
\]
which finishes the proof.


(c) If we denote $W_t = B_{t_0+t} - B_{t_0}$, then $W$ has continuous sample paths and independent increments, which follows from the same properties of $B$. Moreover, for any $t > s\ge0$, we have
\[
W_t - W_s = B_{t_0+t} - B_{t_0+s} \sim N\big(0, (t_0+t)-(t_0+s)\big) = N(0, t-s),
\]
which finishes the proof.
(d) For $t = 0$ or $t = 1$ we have $X_t = 0$. For $1 > t \ge s > 0$ we have
\[
E[X_tX_s] = E[(B_t - tB_1)(B_s - sB_1)]
= E[B_tB_s] - sE[B_tB_1] - tE[B_1B_s] + stE[B_1B_1]
= s - st - st + st = s(1-t).
\]
Let us take $t\in(0,1)$. Then we have $X_t = (1-t)B_t - t(B_1-B_t)$. Since the random variables $B_t$ and $B_1-B_t$ are independent, the distribution of $X_t$ is $N\big(0, (1-t)^2t + t^2(1-t)\big) = N(0, t(1-t))$.
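The covariance formula $E[B_sB_t] = s\wedge t$ from (a)(ii) is also easy to confirm by simulation, sampling $(B_s, B_t)$ through independent increments (seed and parameters are our choices):

```python
import math
import random

def bm_cov_mc(s, t, n_samples=200_000, seed=3):
    # sample (B_s, B_t) via independent increments and estimate E[B_s B_t]
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n_samples):
        bs = rng.gauss(0.0, math.sqrt(s))
        bt = bs + rng.gauss(0.0, math.sqrt(t - s))
        acc += bs * bt
    return acc / n_samples

est = bm_cov_mc(0.4, 1.3)
exact = min(0.4, 1.3)   # s ∧ t
```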

Exercise 8 Let $X\in L^1(\Omega,\mathcal{F},P)$. Prove that the family of random variables $\{E\{X|\mathcal{G}\} : \mathcal{G}\subset\mathcal{F}\}$ is $L^1$ bounded, i.e. $\sup_{\mathcal{G}\subset\mathcal{F}} E(|E\{X|\mathcal{G}\}|) < \infty$.

Solution: Let us take any sub-$\sigma$-algebra $\mathcal{G}\subset\mathcal{F}$. Then, using Jensen's inequality, we have
\[
E(|E\{X|\mathcal{G}\}|) \le E(E\{|X|\,\big|\,\mathcal{G}\}) = E|X| < \infty,
\]
which proves the claim.

Exercise 9 Let $X\in L^1(\Omega;\mathbb{R})$. Prove that the family of functions
\[
\{E\{X|\mathcal{G}\} : \mathcal{G} \text{ is a sub-}\sigma\text{-algebra of } \mathcal{F}\}
\]
is uniformly integrable.

Solution: We note that if a family of measurable sets $A_C$ satisfies $\lim_{C\to\infty}P[A_C] = 0$, then $\lim_{C\to\infty}E[1_{A_C}|X|] = 0$ (since $X\in L^1$; dominated convergence). Let us take any $\mathcal{G}\subset\mathcal{F}$ and consider the family of events
\[
A(C,\mathcal{G}) = \{\omega : |E[X|\mathcal{G}](\omega)| \ge C\}.
\]
Applying Markov's and Jensen's inequalities we obtain
\[
P(A(C,\mathcal{G})) \le C^{-1}E(|E[X|\mathcal{G}]|) \le C^{-1}E[E[|X|\,\big|\,\mathcal{G}]] = C^{-1}E|X| \to 0
\]
as $C\to\infty$ (since $E|X| < \infty$).


For any $\varepsilon > 0$, there exists a $\delta > 0$ such that if $P[A] < \delta$, then $E(1_A|X|) < \varepsilon$. For this $\delta$, take $C > \frac{E|X|}{\delta}$; then $P[A(C,\mathcal{G})] < \delta$ for any $\mathcal{G}\subset\mathcal{F}$, which implies
\[
\sup_{\mathcal{G}\subset\mathcal{F}} E[1_{A(C,\mathcal{G})}|X|] < \varepsilon.
\]
Finally, we conclude
\[
\sup_{\mathcal{G}\subset\mathcal{F}} E\big[1_{A(C,\mathcal{G})}|E[X|\mathcal{G}]|\big] \le \sup_{\mathcal{G}\subset\mathcal{F}} E[1_{A(C,\mathcal{G})}|X|] < \varepsilon,
\]
which proves the claim.

Exercise 10 Let $(\mathcal{G}_t, t\ge0)$, $(\mathcal{F}_t, t\ge0)$ be filtrations with the property that $\mathcal{G}_t\subset\mathcal{F}_t$ for each $t\ge0$. Suppose that $(X_t)$ is adapted to $(\mathcal{G}_t)$. If $(X_t)$ is an $(\mathcal{F}_t)$-martingale, prove that $(X_t)$ is a $(\mathcal{G}_t)$-martingale.

Solution: That $(X_t)$ is $(\mathcal{G}_t)$-adapted is part of the hypothesis. Furthermore, for any $t > s\ge0$, using the tower property of conditional expectation (and $\mathcal{G}_s\subset\mathcal{F}_s$), we obtain
\[
E[X_t|\mathcal{G}_s] = E[E[X_t|\mathcal{F}_s]|\mathcal{G}_s] = E[X_s|\mathcal{G}_s] = X_s.
\]
This shows that $(X_t)$ is a $(\mathcal{G}_t)$-martingale.

Exercise 11 (Elementary processes) Let $0 = t_0 < \dots < t_n < t_{n+1}$, let $H_i$ be bounded $\mathcal{F}_{t_i}$-measurable functions, and
\[
H_t(\omega) = H_0(\omega)1_{\{0\}}(t) + \sum_{i=1}^n H_i(\omega)1_{(t_i,t_{i+1}]}(t). \tag{1}
\]
Prove that $H : \mathbb{R}_+\times\Omega\to\mathbb{R}$ is Borel measurable. Define the stochastic integral
\[
I_t \equiv \int_0^t H_s\,dM_s \equiv \sum_{i=1}^n H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})
\]
and prove that
\[
E\Big[\int_0^t H_s\,dB_s\Big] = 0, \qquad
E\Big[\int_0^t H_s\,dB_s\Big]^2 = E\Big[\int_0^t (H_s)^2\,ds\Big]. \tag{2}
\]


Solution: First, we will prove that the function (1) is Borel measurable. To this end, we take any Borel set $A\in\mathcal{B}(\mathbb{R})$ and we have to show that the set $\{(t,\omega) : H(t,\omega)\in A\}$ is measurable in the product space $(\mathbb{R}_+,\mathcal{B}(\mathbb{R}_+))\times(\Omega,\mathcal{F})$. We can rewrite this set in the following way:
\[
\{(t,\omega): H(t,\omega)\in A\} = \big(\{0\}\times\{\omega: H_0(\omega)\in A\}\big)\cup\bigcup_{i=1}^n\big((t_i,t_{i+1}]\times\{\omega: H_i(\omega)\in A\}\big).
\]
Since the sets $\{0\}$ and $(t_i,t_{i+1}]$, $i = 1,\dots,n$, belong to $\mathcal{B}(\mathbb{R}_+)$, and $\{\omega: H_i(\omega)\in A\}\in\mathcal{F}_{t_i}\subset\mathcal{F}$, the claim now follows from the fact that the product of two measurable sets is measurable in the product space.

Next, we will show the identities (2). Let us denote $I_t = \int_0^t H_s\,dB_s$. Then we have
\[
E[I_t] = \sum_{i=1}^n E\big[H_i(B_{t_{i+1}\wedge t}-B_{t_i\wedge t})\big]
= \sum_{i=1}^n E[H_i]\,E\big[B_{t_{i+1}\wedge t}-B_{t_i\wedge t}\big] = 0,
\]
where in the second equality we have used the independence of $B_{t_{i+1}\wedge t}-B_{t_i\wedge t}$ from $\mathcal{F}_{t_i}$, which follows from the properties of Brownian motion and the fact that $B_{t_{i+1}\wedge t}-B_{t_i\wedge t} = 0$ if $t_i\ge t$. Furthermore, in the last equality we have used $E[B_{t_{i+1}\wedge t}-B_{t_i\wedge t}] = 0$.

For the variance of the stochastic integral we have
\[
E[I_t^2] = \sum_{i=1}^n E\big[H_i^2(B_{t_{i+1}\wedge t}-B_{t_i\wedge t})^2\big]
+ \sum_{i\ne j} E\big[H_iH_j(B_{t_{i+1}\wedge t}-B_{t_i\wedge t})(B_{t_{j+1}\wedge t}-B_{t_j\wedge t})\big].
\]
In the same way as before, using the independence of the increments of Brownian motion, we obtain that the second sum is $0$. Thus, we have
\[
E[I_t^2] = \sum_{i=1}^n E\big[H_i^2\big]\,E\big[(B_{t_{i+1}\wedge t}-B_{t_i\wedge t})^2\big]
= \sum_{i=1}^n E\big[H_i^2\big](t_{i+1}\wedge t - t_i\wedge t)
= E\Big[\int_0^t (H_s)^2\,ds\Big],
\]
where in the second line we have used the fact that $H_i^2$ and $(B_{t_{i+1}\wedge t}-B_{t_i\wedge t})^2$ are independent.
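The identities (2) can be illustrated by Monte Carlo for one concrete elementary integrand of our own choosing, $H_i = \operatorname{sign}(B_{t_i})$, which is bounded and $\mathcal{F}_{t_i}$-measurable with $H_i^2 = 1$, so the isometry predicts $E[I_t] = 0$ and $E[I_t^2] = t$:

```python
import math
import random

def elementary_integral_moments(times, n_samples=100_000, seed=5):
    # H_i = sign(B_{t_i}) is decided at the left endpoint of each interval,
    # then multiplied by the Brownian increment over that interval
    rng = random.Random(seed)
    m1 = m2 = 0.0
    for _ in range(n_samples):
        b = 0.0
        integral = 0.0
        for t0, t1 in zip(times, times[1:]):
            h = 1.0 if b >= 0 else -1.0          # H_i, known at time t0
            db = rng.gauss(0.0, math.sqrt(t1 - t0))
            integral += h * db
            b += db
        m1 += integral
        m2 += integral * integral
    return m1 / n_samples, m2 / n_samples

mean, second = elementary_integral_moments([0.0, 0.25, 0.5, 0.75, 1.0])
# isometry predicts: mean close to 0, second moment close to 1.0
```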


3 Martingales and Conditional Expectations

A given non-specific filtration Ft is used unless otherwise stated.

Exercise 12 Let $\mu$ be a probability measure. If $\{X_n\}$ is u.i. and $X_n\to X$ in measure, prove that $\{X_n\}$ is $L^1$ bounded and $X\in L^1$.

Solution: By the uniform integrability, there exists a number $C$ such that $\sup_n\int_{\{|X_n|\ge C\}}|X_n|\,d\mu \le 1$. Then
\[
\int|X_n|\,d\mu \le \int_{\{|X_n|\ge C\}}|X_n|\,d\mu + \int_{\{|X_n|< C\}}|X_n|\,d\mu \le 1 + C.
\]
Taking an almost surely convergent sub-sequence if necessary, we may and will assume that $X_n\to X$ almost surely. Then, by Fatou's lemma,
\[
E[|X|] = E[\liminf_{n\to\infty}|X_n|] \le \liminf_{n\to\infty}E|X_n| \le \sup_n E|X_n|,
\]
which is finite.

Exercise 13 If $X$ is an $L^1$ function, prove that $X_t := E\{X|\mathcal{F}_t\}$ is an $\mathcal{F}_t$-martingale.

Solution: By the definition of conditional expectation, each $X_t\in L^1$, and by definition $X_t$ is $\mathcal{F}_t$-measurable for each $t$. By the tower property, if $s < t$, $E(X_t|\mathcal{F}_s) = E(E(X|\mathcal{F}_t)|\mathcal{F}_s) = E(X|\mathcal{F}_s) = X_s$.

Exercise 14 (Discrete martingales) If $(M_n)$, $n\in\mathbb{N}$, is a martingale, its quadratic variation is the unique process $\langle M\rangle_n$ (discrete time Doob-Meyer decomposition theorem) such that $\langle M\rangle_0 = 0$ and $M_n^2 - \langle M\rangle_n$ is a martingale. Let $(X_i)$ be a family of i.i.d. random variables with $EX_i = 0$ and $E(X_i^2) = 1$. Then $S_n = \sum_{i=1}^n X_i$ is a martingale w.r.t. its natural filtration. This is the random walk.

Prove that $S_n$ is a martingale and that its quadratic variation is $\langle S\rangle_n = n$.

Solution: Let $\mathcal{F}_n$ denote the natural filtration of $(X_n)$. Then
\[
E\{S_n|\mathcal{F}_{n-1}\} = E\{S_{n-1}|\mathcal{F}_{n-1}\} + E\{X_n|\mathcal{F}_{n-1}\} = S_{n-1} + 0.
\]
We used the fact that $X_n$ is independent of $\mathcal{F}_{n-1}$. Similarly,
\[
E\{(S_n)^2 - n|\mathcal{F}_{n-1}\}
= E\{(S_{n-1})^2|\mathcal{F}_{n-1}\} + 2E\{S_{n-1}X_n|\mathcal{F}_{n-1}\} + E\{(X_n)^2|\mathcal{F}_{n-1}\} - n
= (S_{n-1})^2 + 2S_{n-1}E\{X_n|\mathcal{F}_{n-1}\} + E[(X_n)^2] - n
= (S_{n-1})^2 - (n-1).
\]
So $(S_n)^2 - n$ is an $\mathcal{F}_n$-martingale, and hence $\langle S\rangle_n = n$.
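Since $S_n^2 - \langle S\rangle_n$ is a mean-zero martingale, taking expectations gives $E[S_n] = 0$ and $E[S_n^2] = n$. A quick Monte Carlo check with fair $\pm1$ steps (a concrete choice of the i.i.d. $X_i$; names ours):

```python
import random

def random_walk_moments(n_steps, n_samples=50_000, seed=9):
    # i.i.d. steps with E X_i = 0, E X_i^2 = 1 (fair +/-1 coin flips)
    rng = random.Random(seed)
    m1 = m2 = 0.0
    for _ in range(n_samples):
        s = sum(rng.choice((-1, 1)) for _ in range(n_steps))
        m1 += s
        m2 += s * s
    return m1 / n_samples, m2 / n_samples

mean, second = random_walk_moments(20)
# expect mean close to 0 and second moment close to n = 20
```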


Exercise 15 Let $\phi : \mathbb{R}^d\to\mathbb{R}$ be a convex function. Show that

(a) If $(X_t)$ is a sub-martingale and $\phi$ is increasing s.t. $\phi(X_t)\in L^1$, then $(\phi(X_t))$ is a sub-martingale.

(b) If $(X_t)$ is a martingale and $\phi(X_t)\in L^1$, then $\phi(X_t)$ is a sub-martingale.

(c) If $(X_t)$ is an $L^p$ integrable martingale, for a number $p\ge1$, prove that $(|X_t|^p)$ is a sub-martingale.

(d) If $(X_t)$ is a real valued martingale, prove that $(X_t\vee 0)$ is a sub-martingale.

Solution: Since $\phi$ is convex, it is continuous and hence measurable. This means that in what follows the process $\phi(X_t)$ is adapted.

(a) For $t > s$, using the conditional Jensen inequality we obtain
\[
E[\phi(X_t)|\mathcal{F}_s] \ge \phi(E[X_t|\mathcal{F}_s]) \ge \phi(X_s),
\]
where in the last inequality we have used $E[X_t|\mathcal{F}_s]\ge X_s$ and the fact that $\phi$ is increasing.

(b) The claim can be shown in the same way as (a), but now $\phi(E[X_t|\mathcal{F}_s]) = \phi(X_s)$.

(c), (d) The claims follow from (b) and the fact that the maps $x\mapsto|x|^p$ and $x\mapsto x\vee0$ are convex.

Exercise 16 (Limit of martingales) Let $(M_n(t), t\in[0,1])$, $n\in\mathbb{N}$, be a family of martingales. Suppose that for each $t$, $\lim_{n\to\infty}M_n(t) = M_t$ almost surely and $\{M_n(t), n\in\mathbb{N}\}$ is uniformly integrable for each $t$. Prove that $M(t)$ is a martingale.

Solution: For any $t\in[0,1]$ and any $A\in\mathcal{F}$ fixed, the family $\{M_n(t)1_A, n\in\mathbb{N}\}$ is uniformly integrable. Indeed,
\[
\lim_{C\to\infty}\sup_n\int_{\{|M_n(t)|1_A\ge C\}}|M_n(t)|1_A\,dP
\le \lim_{C\to\infty}\sup_n\int_{\{|M_n(t)|\ge C\}}|M_n(t)|\,dP = 0,
\]
where the inequality follows from $|M_n(t)|1_A\le|M_n(t)|$ and the last equality follows from the fact that $\{M_n(t), n\in\mathbb{N}\}$ is uniformly integrable.

Since $\lim_{n\to\infty}M_n(t)1_A = M_t1_A$ almost surely, the uniform integrability implies convergence in $L^1$. Taking $A = \Omega$, we see that $M_t$ is integrable. For any $0\le s < t\le1$ and any $A\in\mathcal{F}_s$ we have
\[
E[M(t)1_A] = \lim_{n\to\infty}E[M_n(t)1_A] = \lim_{n\to\infty}E[M_n(s)1_A] = E[M(s)1_A],
\]
where the second equality holds because $M_n$ is a martingale. This implies that
\[
E[M(t)|\mathcal{F}_s] = M(s),
\]
which finishes the proof.

Exercise 17 Let $a > 0$ be a real number. For $0 = t_1 < \dots < t_n < t_{n+1} < \dots$ and $i = 1,\dots,n,\dots$ let $H_i$ be bounded $\mathcal{F}_{t_i}$-measurable functions and let $H_0$ be a bounded $\mathcal{F}_0$-measurable function. Define
\[
H_t(\omega) = H_0(\omega)1_{\{0\}}(t) + \sum_{i=1}^\infty H_i(\omega)1_{(t_i,t_{i+1}]}(t).
\]
Let $(M_t, t\le a)$ be a martingale with $M_0 = 0$, and $(H_t)$ an elementary process. Define the elementary integral
\[
I_t \equiv \int_0^t H_s\,dM_s \equiv \sum_{i=1}^\infty H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}).
\]
Prove that $(I_t, 0\le t\le a)$ is an $\mathcal{F}_t$-martingale.

Solution: Note: the sum is always a finite sum. Please refer also to Exercise 11. It is clear that $M_{t_{i+1}\wedge t}-M_{t_i\wedge t}\in\mathcal{F}_t$. There exists an $n$ s.t. $t < t_{n+1}$. Then
\[
M_{t_{i+1}\wedge t}-M_{t_i\wedge t} = 0 \quad \forall i\ge n+1, \qquad
\int_0^t H_s\,dM_s \equiv \sum_{i=1}^n H_i(M_{t_{i+1}\wedge t}-M_{t_i\wedge t}),
\]
and for $1\le i\le n$, $H_i\in\mathcal{F}_{t_i}\subset\mathcal{F}_t$. In particular, $I_t\in\mathcal{F}_t$ for each $t$ and the process $(I_t)$ is $(\mathcal{F}_t)$-adapted.

The finitely many bounded random variables $|H_i|$, $i = 1,\dots,n$, are bounded by a common constant $C$. Furthermore, since for each $s$, $M_s$ is integrable,
\[
E\Big|\int_0^t H_s\,dM_s\Big| \le \sum_{i=1}^n E\big(|H_i||M_{t_{i+1}\wedge t}-M_{t_i\wedge t}|\big)
\le C\sum_{i=1}^n\big(E|M_{t_{i+1}\wedge t}|+E|M_{t_i\wedge t}|\big) < \infty,
\]
proving that for each $t$, $I_t$ is integrable.

Finally, we prove the martingale property. First note that if $(M_s, s\ge0)$ is a martingale then
\[
E\{M_t|\mathcal{F}_s\} = M_{s\wedge t}, \quad \forall s,t\ge0. \tag{3}
\]

Take $t\ge s\ge0$ and assume that $s\in[t_k,t_{k+1})$ for some $k\in\{0,\dots,n\}$.

Explanation: we only need to consider two cases: (1) $i\le k$, in which case $H_i\in\mathcal{F}_s$ and we can take $H_i$ out of the conditional expectation, and (2) $i\ge k+1$, in which case $s < t_i$ and we may use the tower property to condition in addition w.r.t. $\mathcal{F}_{t_i}$ and take $H_i$ out.

Then we have
\[
E[I_t|\mathcal{F}_s] = \sum_{i=1}^n E[H_i(M_{t_{i+1}\wedge t}-M_{t_i\wedge t})|\mathcal{F}_s]
= \sum_{i=1}^k E[H_i(M_{t_{i+1}\wedge t}-M_{t_i\wedge t})|\mathcal{F}_s]
+ \sum_{i=k+1}^n E[H_i(M_{t_{i+1}\wedge t}-M_{t_i\wedge t})|\mathcal{F}_s].
\]
For $i\le k$ we have $t_i\le s$, and so the random variable $H_i\in\mathcal{F}_s$; by (3),
\[
\sum_{i=1}^k E[H_i(M_{t_{i+1}\wedge t}-M_{t_i\wedge t})|\mathcal{F}_s]
= \sum_{i=1}^k H_i\,E[M_{t_{i+1}\wedge t}-M_{t_i\wedge t}|\mathcal{F}_s]
= \sum_{i=1}^k H_i(M_{t_{i+1}\wedge s}-M_{t_i\wedge s})
= \sum_{i=1}^\infty H_i(M_{t_{i+1}\wedge s}-M_{t_i\wedge s}) = I_s.
\]
If $i\ge k+1$ then $s\le t_i$, and we may use the tower property of conditional expectation:
\[
\sum_{i=k+1}^n E[H_i(M_{t_{i+1}\wedge t}-M_{t_i\wedge t})|\mathcal{F}_s]
= \sum_{i=k+1}^n E\big[E[H_i(M_{t_{i+1}\wedge t}-M_{t_i\wedge t})|\mathcal{F}_{t_i}]\,\big|\,\mathcal{F}_s\big]
= \sum_{i=k+1}^n E\big[H_i\,E[M_{t_{i+1}\wedge t}-M_{t_i\wedge t}|\mathcal{F}_{t_i}]\,\big|\,\mathcal{F}_s\big]
= \sum_{i=k+1}^n E\big[H_i\underbrace{(M_{t_{i+1}\wedge t\wedge t_i}-M_{t_i\wedge t\wedge t_i})}_{=0}\,\big|\,\mathcal{F}_s\big] = 0.
\]
Combining all these equalities we conclude
\[
E[I_t|\mathcal{F}_s] = I_s,
\]
and $(I_t)$ is a martingale.


4 Stopping times, progressive measurability

The filtration Ft satisfies the usual conditions.

Exercise 18 Let $S, T$ be stopping times. Prove that

(1) $S\wedge T$, $S\vee T$, and $aS$ where $a > 1$, are stopping times.

(2) If $T$ is a stopping time, then there exists a sequence of stopping times $T_n$ such that each $T_n$ takes only a finite number of values and $T_n$ decreases to $T$.

Solution: (1) For any $t\ge0$ we have
\[
\{S\wedge T\le t\} = \{S\le t\}\cup\{T\le t\}\in\mathcal{F}_t, \qquad
\{S\vee T\le t\} = \{S\le t\}\cap\{T\le t\}\in\mathcal{F}_t, \qquad
\{aS\le t\} = \{S\le t/a\}\in\mathcal{F}_{t/a}\subset\mathcal{F}_t,
\]
which proves the claim.
(2) For any $n\in\mathbb{N}$ we define the stopping time
\[
T_n = \begin{cases} i2^{-n}, & \text{if } (i-1)2^{-n}\le T < i2^{-n} \text{ for some } i\le n2^n,\\ +\infty, & \text{if } T\ge n. \end{cases}
\]
These stopping times satisfy the required conditions.
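The dyadic approximation in (2) is concrete enough to compute directly: for a fixed value of $T(\omega)$, each $T_n$ rounds up to the next dyadic level $i2^{-n}$, so $T_n\ge T$, the sequence is non-increasing, and $T_n - T\le 2^{-n}$. A sketch (names ours):

```python
import math

def dyadic_approx(T, n):
    # T_n = i 2^{-n} on {(i-1) 2^{-n} <= T < i 2^{-n}}, and +infinity when T >= n
    if T >= n:
        return math.inf
    i = math.floor(T * 2 ** n) + 1
    return i / 2 ** n

T = 0.713
approxs = [dyadic_approx(T, n) for n in range(1, 12)]
# each T_n >= T, the sequence is non-increasing, and T_n - T <= 2^{-n}
```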

Exercise 19 Prove that $T$ is a stopping time iff $\{T < t\}\in\mathcal{F}_t$ for any $t > 0$.

Solution: If $T$ is a stopping time, then $\{T < t\} = \cup_{n\ge1}\{T\le t-\frac1n\}\in\mathcal{F}_t$, because $\{T\le t-\frac1n\}\in\mathcal{F}_{t-1/n}\subset\mathcal{F}_t$.

Conversely, if $\{T < t\}\in\mathcal{F}_t$ for any $t > 0$, then
\[
\{T\le t\} = \cap_{n\ge1}\{T < t+\tfrac1n\}\in\mathcal{F}_{t+} = \mathcal{F}_t,
\]
because the filtration is right-continuous.

Exercise 20 Let $(M_t, t\in I)$ be an $(\mathcal{F}_t)$-martingale. Let $\tau$ be a bounded stopping time that takes countably many values. Let $Y$ be a bounded $\mathcal{F}_\tau$-measurable random variable. Let $N_t = Y(M_t - M_{t\wedge\tau})$. Prove that $(N_t)$ is a martingale.


Solution: Since $Y$ is bounded, $N_t$ is an integrable process. Furthermore, it is adapted since, for any $t\ge0$ and any Borel set $B$, we have
\[
\{N_t\in B\} = \big(\{Y(M_t-M_{t\wedge\tau})\in B\}\cap\{\tau\le t\}\big)\cup\big(\{0\in B\}\cap\{\tau>t\}\big)\in\mathcal{F}_t.
\]
Let us now take any $0\le s < t$. Then we have
\[
E[N_t|\mathcal{F}_s] = E[Y(M_t-M_{t\wedge\tau})1_{\{\tau>s\}}|\mathcal{F}_s] + E[Y(M_t-M_{t\wedge\tau})1_{\{\tau\le s\}}|\mathcal{F}_s] = I_1 + I_2.
\]
For the first term we have, by the optional stopping theorem,
\[
I_1 = E\big[E[Y(M_t-M_{t\wedge\tau})1_{\{\tau>s\}}|\mathcal{F}_\tau]\,\big|\,\mathcal{F}_s\big]
= E\big[Y\underbrace{E[M_t-M_{t\wedge\tau}|\mathcal{F}_\tau]}_{=0}1_{\{\tau>s\}}\,\big|\,\mathcal{F}_s\big] = 0.
\]
For the second term we can get, again by the optional stopping theorem,
\[
I_2 = E[Y(M_t-M_\tau)1_{\{\tau\le s\}}|\mathcal{F}_s] = Y1_{\{\tau\le s\}}E[M_t-M_\tau|\mathcal{F}_s]
= 1_{\{\tau\le s\}}Y(M_s-M_{\tau\wedge s}) = N_s.
\]
This finishes the proof.

Exercise 21 Show that for $s < t$ and $A\in\mathcal{F}_s$, $\tau = s1_A + t1_{A^c}$ is a stopping time.

Solution: Indeed, for any $r\ge0$ we have
\[
\{\tau\le r\} = \begin{cases}\emptyset, & \text{if } r < s\\ A, & \text{if } s\le r < t\\ \Omega, & \text{if } r\ge t\end{cases} \in\mathcal{F}_r.
\]

Exercise 22 Let $0 = t_1 < \dots < t_{n+1} < \dots$ with $\lim_{n\to\infty}t_n = \infty$. For each $i = 1, 2,\dots$, let $H_i$ be a real valued $\mathcal{F}_{t_i}$-measurable random variable and $H_0$ an $\mathcal{F}_0$-measurable random variable. For $t > 0$, we define
\[
X_t(\omega) = H_0(\omega)1_{\{0\}}(t) + \sum_{i=1}^\infty H_i(\omega)1_{(t_i,t_{i+1}]}(t).
\]
Prove that $(X_t, t\ge0)$ is progressively measurable.


Solution: We take any Borel set $A\in\mathcal{B}(\mathbb{R})$ and have to show that, for any $t\ge0$, the set $\{(s,\omega): X(s,\omega)\in A\}$ is measurable in the product space $([0,t],\mathcal{B}([0,t]))\times(\Omega,\mathcal{F}_t)$. We can rewrite this set in the following way:
\[
\{(s,\omega)\in[0,t]\times\Omega: X(s,\omega)\in A\}
= \big(\{0\}\times\{\omega: H_0(\omega)\in A\}\big)\cup\bigcup_{i:\,t_i\le t}\big((t_i\wedge t, t_{i+1}\wedge t]\times\{\omega: H_i(\omega)\in A\}\big).
\]
Since the sets $\{0\}$ and $(t_i\wedge t, t_{i+1}\wedge t]$ belong to $\mathcal{B}([0,t])$, and $\{\omega: H_i(\omega)\in A\}\in\mathcal{F}_{t_i}\subset\mathcal{F}_t$ for $t_i\le t$, the claim now follows from the fact that the product of two measurable sets is measurable in the product space.

Exercise 23 Let $s < t\le u < v$ and let $(H_s)$ and $(K_s)$ be two elementary processes. We define $\int_s^t H_r\,dB_r = \int_0^t H_r\,dB_r - \int_0^s H_r\,dB_r$. Prove that
\[
E\Big[\int_s^t H_r\,dB_r\int_u^v K_r\,dB_r\Big] = 0.
\]

Solution: Recall that the stochastic process $(\int_0^t H_r\,dB_r, t\ge0)$ is $\mathcal{F}_t$-adapted and is a martingale; in particular $\int_s^t H_r\,dB_r$ is $\mathcal{F}_u$-measurable since $t\le u$. We use the tower property to obtain the following:
\[
E\Big[\int_s^t H_r\,dB_r\int_u^v K_r\,dB_r\Big]
= E\Big[E\Big[\int_s^t H_r\,dB_r\int_u^v K_r\,dB_r\,\Big|\,\mathcal{F}_u\Big]\Big]
= E\Big[\int_s^t H_r\,dB_r\,E\Big[\int_u^v K_r\,dB_r\,\Big|\,\mathcal{F}_u\Big]\Big] = 0,
\]
because the stochastic integral is a martingale, and hence the inner conditional expectation vanishes.

Exercise 24 Let $f : \mathbb{R}_+\to\mathbb{R}$ be a differentiable function with $f'\in L^1([0,1])$. Let $\Delta_n : 0 = t_0^{(n)} < t_1^{(n)} < \dots < t_{N_n}^{(n)} = t$ be a sequence of partitions of $[0,t]$ with $\lim_{n\to\infty}|\Delta_n| = 0$. Prove that
\[
\lim_{n\to\infty}\sum_{j=1}^{N_n}\big(f(t_j^{(n)})-f(t_{j-1}^{(n)})\big)^2 = 0.
\]
Hint: $f$ is uniformly continuous on $[0,t]$ and $f(t)-f(s) = \int_s^t f'(r)\,dr$.

Solution: We have the simple estimate
\[
\sum_{j=1}^{N_n}\big(f(t_j^{(n)})-f(t_{j-1}^{(n)})\big)^2
\le \max_j\big|f(t_j^{(n)})-f(t_{j-1}^{(n)})\big|\sum_{j=1}^{N_n}\big|f(t_j^{(n)})-f(t_{j-1}^{(n)})\big|.
\]
Firstly, we have $\max_j|f(t_j^{(n)})-f(t_{j-1}^{(n)})|\to0$ as $n\to\infty$, because of the uniform continuity of $f$ on $[0,t]$. Secondly, we can estimate
\[
\sum_{j=1}^{N_n}\big|f(t_j^{(n)})-f(t_{j-1}^{(n)})\big|
= \sum_{j=1}^{N_n}\Big|\int_{t_{j-1}^{(n)}}^{t_j^{(n)}}f'(r)\,dr\Big|
\le \sum_{j=1}^{N_n}\int_{t_{j-1}^{(n)}}^{t_j^{(n)}}|f'(r)|\,dr
= \int_0^t|f'(r)|\,dr < \infty,
\]
which follows from the properties of $f$. Thus, from these facts the claim follows.
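The conclusion is easy to see numerically: for a smooth function the sum of squared increments shrinks like the mesh of the partition. A small sketch using $f = \sin$ on $[0,1]$ (our choice of test function):

```python
import math

def quadratic_variation(f, t, n):
    # sum of squared increments of f over the uniform partition of [0, t] with n pieces
    pts = [t * j / n for j in range(n + 1)]
    return sum((f(b) - f(a)) ** 2 for a, b in zip(pts, pts[1:]))

qv_coarse = quadratic_variation(math.sin, 1.0, 10)
qv_fine = quadratic_variation(math.sin, 1.0, 10_000)
# for a C^1 function the quadratic variation along refining partitions tends to 0
```

This is in sharp contrast to the Brownian paths of Exercise 32, whose squared increments sum to the interval length.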


5 Martingales and Optional Stopping Theorem

Let $(\mathcal{F}_t)$ be a filtration satisfying the usual assumptions, unless otherwise stated.

Exercise 25 Use the super-martingale convergence theorem to prove the following statement. Let $(X_n)$ be a sub-martingale sequence with $\sup_n E(X_n^+) < \infty$. Then $\lim_{n\to\infty}X_n$ exists almost surely.

Solution: The process $Y_n = -X_n$ is a super-martingale. Furthermore, $\sup_n E(Y_n^-) = \sup_n E(X_n^+) < \infty$. Thus, from the super-martingale convergence theorem, the limit $\lim_{n\to\infty}Y_n = -\lim_{n\to\infty}X_n$ exists almost surely.

Exercise 26 Let $(M_t, 0\le t\le t_0)$ be a continuous local martingale with $M_0 = 0$. Prove that $(M_t, 0\le t\le t_0)$ is a martingale if $\sup_{t\le t_0}M_t\in L^1$.

Solution: Since $M_0 = 0$, there is a sequence of stopping times $T_n\uparrow\infty$ almost surely such that $(M_t^{T_n})$ is a martingale. Hence, for any $t > s$, we have
\[
E[M_t^{T_n}|\mathcal{F}_s] = M_s^{T_n}\to M_s
\]
almost surely, as $n\to\infty$. Observing that $M_t^{T_n}\le\sup_{t\le t_0}M_t$, the condition that $\sup_{t\le t_0}M_t\in L^1$ allows us to apply the dominated convergence theorem (for conditional expectations) to obtain
\[
E[M_t^{T_n}|\mathcal{F}_s]\to E[M_t|\mathcal{F}_s]
\]
almost surely, as $n\to\infty$. This shows that $E[M_t|\mathcal{F}_s] = M_s$, which means that $(M_t)$ is a martingale.

Exercise 27 Let $\{B_t\}_{t\ge0}$ be a one dimensional Brownian motion starting at $x$ ($B_0 = x$), and let $a < x < b$. In this question we are going to find the probability of the Brownian motion hitting $b$ before $a$, using the Optional Stopping Theorem (OST). Set
\[
T_a = \inf\{t: B_t = a\}, \quad T_b = \inf\{t: B_t = b\}, \quad\text{and}\quad T = T_a\wedge T_b.
\]

(a) Give an easy argument why $T_a$, $T_b$ and $T$ are all stopping times with respect to the natural filtration of the Brownian motion.

(b) One would like to compute $E[B_T]$ using the OST, but $(B_t, t\ge0)$ is not a uniformly integrable martingale, and applying the OST would require $T$ to be a bounded stopping time. Instead we use a limiting argument.

(b1) For $n\in\mathbb{N}$, use the OST to prove that $E[B_{T\wedge n}] = x$.

(b2) Conclude that $E[B_T] = x$.

(c) Compute $P(T_a > T_b)$.

Solution: (a) $T_a$ is a stopping time since it is the hitting time by $(B_t)$ of the closed set $\{a\}$, and the same can be said for $T_b$. That $T = T_a\wedge T_b$ is a stopping time follows from Exercise 18.
(b1) Note that $0\le T\wedge n$ and both $0$ and $T\wedge n$ are bounded stopping times. Hence by the OST $E[B_{T\wedge n}|B_0] = B_0 = x$ almost surely. Taking the expectation on both sides and using the tower law we get the result.
(b2) It is clear that pointwise $B_{T\wedge n}\to B_T$ as $n\to\infty$. Moreover $B_{T\wedge n}$ is bounded between $a$ and $b$, hence we can use the dominated convergence theorem to conclude:
\[
E[B_T] = \lim_{n\to\infty}E[B_{T\wedge n}] = x.
\]
(c) Denote $P(T_a > T_b)$ by $p$; then:
\[
x = E[B_T] = E[B_{T_b}1_{\{T_a>T_b\}} + B_{T_a}1_{\{T_b>T_a\}}] = E[b1_{\{T_a>T_b\}} + a1_{\{T_b>T_a\}}] = bp + a(1-p).
\]
From here we deduce that $P(T_a > T_b) = \frac{x-a}{b-a}$.
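The hitting probability $\frac{x-a}{b-a}$ can be estimated with a crude Euler walk: step the path by independent $N(0,\Delta t)$ increments until it leaves $(a,b)$. The discretization overshoots the boundary slightly, so only rough agreement is expected (parameters and names are ours):

```python
import math
import random

def hit_b_before_a_mc(x, a, b, dt=1e-3, n_samples=8_000, seed=13):
    # crude Euler walk until the path exits (a, b); returns the fraction hitting b first
    rng = random.Random(seed)
    hits_b = 0
    for _ in range(n_samples):
        pos = x
        while a < pos < b:
            pos += rng.gauss(0.0, math.sqrt(dt))
        if pos >= b:
            hits_b += 1
    return hits_b / n_samples

est = hit_b_before_a_mc(x=0.3, a=0.0, b=1.0)
exact = (0.3 - 0.0) / (1.0 - 0.0)   # (x-a)/(b-a)
```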

Exercise 28 If $(M_t, t\in I)$ is a martingale and $S$ and $T$ are two stopping times with $T$ bounded, prove that
\[
M_{S\wedge T} = E\{M_T|\mathcal{F}_S\}.
\]

Solution: We can write
\[
E\{M_T|\mathcal{F}_S\} = E\{1_{\{T<S\}}M_T|\mathcal{F}_S\} + E\{1_{\{T\ge S\}}M_T|\mathcal{F}_S\}.
\]
Note that $1_{\{T<S\}}M_T$ is $\mathcal{F}_S$-measurable. Thus, for the first term we have
\[
E\{1_{\{T<S\}}M_T|\mathcal{F}_S\} = 1_{\{T<S\}}M_T = 1_{\{T<S\}}M_{S\wedge T}.
\]
One can see that $\{T\ge S\}\in\mathcal{F}_{S\wedge T}$. Indeed, for any $t\ge0$ one has
\[
\{T\ge S\}\cap\{T\wedge S\le t\} = \{S\le T\wedge t\} = \{S\le T\le t\}\cup\big(\{S\le t\}\cap\{T>t\}\big)\in\mathcal{F}_t,
\]
because every set belongs to $\mathcal{F}_t$. Thus, we conclude
\[
E\{1_{\{T\ge S\}}M_T|\mathcal{F}_S\} = E\{1_{\{T\ge S\}}M_T|\mathcal{F}_{S\wedge T}\}
= 1_{\{T\ge S\}}E\{M_T|\mathcal{F}_{S\wedge T}\} = 1_{\{T\ge S\}}M_{S\wedge T},
\]
where we have used Doob's optional stopping theorem. Combining all these equalities together, we obtain the claim.


Exercise 29 Prove the following: (1) A positive right continuous local martingale $(M_t, t\ge0)$ with $M_0 = 1$ is a super-martingale. (2) A positive right continuous local martingale $(M_t, t\ge0)$ is a martingale if $E|M_0| < \infty$ and $EM_t = EM_0$ for all $t > 0$.

Solution: (1) Since $E|M_0| < \infty$, there is an increasing sequence of stopping times $T_n$ such that the process $M_t^{T_n}$ is a martingale. In particular, $M_t^{T_n}$ is integrable. For $s < t$, we use (conditional) Fatou's lemma to obtain
\[
E[M_t|\mathcal{F}_s] = E[\liminf_{n\to\infty}M_{t\wedge T_n}|\mathcal{F}_s]
\le \liminf_{n\to\infty}E[M_{t\wedge T_n}|\mathcal{F}_s] = \liminf_{n\to\infty}M_{s\wedge T_n} = M_s.
\]
(2) Let $S\le T$ be two stopping times, bounded by a constant $K$. Then, from (1), we have
\[
EM_0\ge EM_S\ge EM_T\ge EM_K = EM_0,
\]
which implies $EM_T = EM_S$. We know that $EM_T = EM_S$ for any two bounded stopping times $S\le T$ implies that $(M_t)$ is a martingale, completing the proof.

Exercise 30 Let $(M_t, t\ge0)$ and $(N_t, t\ge0)$ be continuous local martingales with $M_0 = 0$ and $N_0 = 0$.

(1) Let $(A_t)$ and $(A'_t)$ be two continuous stochastic processes of finite variation with initial values $0$ and such that $(M_tN_t - A_t)$ and $(M_tN_t - A'_t)$ are local martingales. Prove that $(A_t)$ and $(A'_t)$ are indistinguishable.

(2) Prove that $\langle M,N\rangle_t$ is symmetric in $(M_t)$ and $(N_t)$ and is bilinear.

(3) Prove that $\langle M,N\rangle_t = \frac14\big(\langle M+N,M+N\rangle_t - \langle M-N,M-N\rangle_t\big)$.

(4) Prove that $\langle M-M_0, N-N_0\rangle_t = \langle M,N\rangle_t$.

(5) Let $T$ be a stopping time; prove that
\[
\langle M^T, N^T\rangle = \langle M,N\rangle^T = \langle M,N^T\rangle.
\]

Solution: (1) We use the following theorem: if $(M_t, 0\le t\le T)$ is a continuous local martingale with $M_0 = 0$ which has finite total variation, then $M_t = M_0$ for any $0\le t\le T$.

The process $A'_t - A_t = (M_tN_t - A_t) - (M_tN_t - A'_t)$ is a continuous local martingale, and at the same time a process of finite total variation. Then $A'_t - A_t = A'_0 - A_0 = 0$.

(2) The symmetry equality $\langle M,N\rangle = \langle N,M\rangle$ comes from that of the product: $MN = NM$. Next, we will prove
\[
\langle M^1+M^2, N\rangle = \langle M^1,N\rangle + \langle M^2,N\rangle.
\]
The process
\[
M^1N + M^2N - \langle M^1,N\rangle - \langle M^2,N\rangle
\]
is a local martingale. Thus, the claim follows from the uniqueness of the bracket process. Similarly, for $k\in\mathbb{R}$, $kMN - k\langle M,N\rangle$ is a local martingale and $\langle kM,N\rangle = k\langle M,N\rangle$.

(3) is a consequence of the bilinearity, proved earlier.
(4) Since $(M_tN_0, t\ge0)$ is a local martingale, the bracket process $\langle M,N_0\rangle$ vanishes. By the same reason we have $\langle M_0,N\rangle = \langle M_0,N_0\rangle = 0$. Hence, the claim follows from the bilinearity.

(5) By the definition, $M^TN^T - \langle M^T,N^T\rangle$ and $(MN)^T - \langle M,N\rangle^T$ are local martingales. This implies $\langle M^T,N^T\rangle = \langle M,N\rangle^T$, from the uniqueness of the bracket process.

Furthermore, both $M^TN^T - \langle M^T,N^T\rangle$ and $(M-M^T)N^T$ are local martingales, hence their sum $MN^T - \langle M^T,N^T\rangle$ is a local martingale as well. This implies $\langle M,N^T\rangle = \langle M^T,N^T\rangle$, from the uniqueness of the bracket process.

Exercise 31 Let $(M_t, t\in[0,1])$ and $(N_t, t\in[0,1])$ be bounded continuous martingales with $M_0 = N_0 = 0$. If $(M_t)$ and $(N_t)$ are furthermore independent, prove that their bracket process $(\langle M,N\rangle_t)$ vanishes.

Solution: Let us take a partition $0 = t_0 < t_1 < \dots < t_n = 1$. Using first the independence of $M$ and $N$, then the orthogonality of martingale increments, we have
\[
E\Big[\sum_{i=1}^n(M_{t_i}-M_{t_{i-1}})(N_{t_i}-N_{t_{i-1}})\Big]^2
= \sum_{i,j=1}^n E\big[(M_{t_i}-M_{t_{i-1}})(M_{t_j}-M_{t_{j-1}})\big]\,E\big[(N_{t_i}-N_{t_{i-1}})(N_{t_j}-N_{t_{j-1}})\big]
\]
\[
= \sum_{i=1}^n E\big[(M_{t_i}-M_{t_{i-1}})^2\big]\,E\big[(N_{t_i}-N_{t_{i-1}})^2\big]
= \sum_{i=1}^n E\big[M_{t_i}^2 - M_{t_{i-1}}^2\big]\,E\big[N_{t_i}^2 - N_{t_{i-1}}^2\big]
\]
\[
\le \sum_{i=1}^n E\big[M_{t_i}^2 - M_{t_{i-1}}^2\big]\,\sup_j E\big[N_{t_j}^2 - N_{t_{j-1}}^2\big]
\le \sum_{i=1}^n E\big[M_{t_i}^2 - M_{t_{i-1}}^2\big]\;E\sup_j\big[N_{t_j}^2 - N_{t_{j-1}}^2\big],
\]
and the first factor telescopes to $E[M_1^2]$, which is bounded.

Because $t\mapsto N_t^2$ is uniformly continuous, $\sup_j|N_{t_j}^2 - N_{t_{j-1}}^2|\to0$ almost surely as the mesh of the partition tends to $0$, and since $N_t^2$ is bounded, $E\sup_j[N_{t_j}^2 - N_{t_{j-1}}^2]\to0$. Since the bracket process is the limit in probability of $\sum_{i=1}^n(M_{t_i}-M_{t_{i-1}})(N_{t_i}-N_{t_{i-1}})$, this implies $\langle M,N\rangle_t = 0$.

Exercise 32 Prove that (1) for almost all $\omega$, the Brownian path $t\mapsto B_t(\omega)$ has infinite total variation on any interval $[a,b]$ with $a<b$; (2) for almost all $\omega$, the path $t\mapsto B_t(\omega)$ is not Hölder continuous of order $\alpha > \frac12$.

Solution: (1) We know that the quadratic variation of $(B_t)$ on $[a,b]$ is $b-a$. We can choose a sequence of partitions $\{t_j^{(n)}\}_{j=0,\dots,M_n}$ of $[a,b]$ with mesh tending to $0$ such that for almost all $\omega$,
\[
\sum_{j=1}^{M_n} \big(B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big)^2 \to (b-a), \tag{4}
\]
as $n\to\infty$. Also, for every $\omega$,
\[
\sum_{j=1}^{M_n} \big(B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big)^2
\le \max_j \big|B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big| \sum_{j=1}^{M_n} \big|B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big|
\le \max_j \big|B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big|\; \mathrm{TV}_{[a,b]}(B(\omega)).
\]
If $\mathrm{TV}_{[a,b]}(B(\omega))$ is finite then, since $t\mapsto B_t(\omega)$ is uniformly continuous on $[a,b]$, $\max_j |B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)| \to 0$ and hence the right-hand side tends to $0$. But by (4) the left-hand side tends to $b-a > 0$ for almost all $\omega$, a contradiction. Hence $\mathrm{TV}_{[a,b]}(B(\omega)) = \infty$ for almost all $\omega$.

(2) Suppose that for some $\alpha > \frac12$ and some constant $C$, both possibly depending on $\omega$,
\[
|B_t(\omega) - B_s(\omega)| \le C(\omega)\, |t-s|^\alpha .
\]
Take the partitions $\{t_j^{(n)}\}_{j=0,\dots,M_n}$ as in (1), with mesh $|\Delta_n|$. For any such $\omega$ we have
\[
\sum_{j=1}^{M_n} \big(B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big)^2
\le C^2 \sum_{j=1}^{M_n} \big|t_j^{(n)} - t_{j-1}^{(n)}\big|^{2\alpha}
\le C^2 |\Delta_n|^{2\alpha - 1} \sum_{j=1}^{M_n} \big|t_j^{(n)} - t_{j-1}^{(n)}\big|
= C^2 (b-a) |\Delta_n|^{2\alpha - 1} \to 0,
\]
which contradicts (4), unless $b-a = 0$. Hence for almost all $\omega$, the Brownian path is not Hölder continuous of order $\alpha > 1/2$ on any interval $[a,b]$.
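The two contrasting limits in this proof can be watched numerically. A minimal sketch (illustration, not proof): along refining partitions of $[0,1]$, the sum of squared increments stabilises near $b - a = 1$, while the sum of absolute increments, a lower bound for the total variation, blows up.

```python
import numpy as np

# Numerical illustration: along refining partitions of [0, 1], the sum of
# squared Brownian increments stabilises near b - a = 1, while the sum of
# absolute increments (a lower bound for the total variation) grows without
# bound, of order sqrt(2 n / pi).
rng = np.random.default_rng(0)

def increment_sums(n_steps):
    """Simulate one Brownian path on [0, 1] with n_steps increments and
    return (sum of squared increments, sum of absolute increments)."""
    dB = rng.normal(0.0, np.sqrt(1.0 / n_steps), size=n_steps)
    return np.sum(dB**2), np.sum(np.abs(dB))

qv_coarse, tv_coarse = increment_sums(1_000)
qv_fine, tv_fine = increment_sums(100_000)

print(qv_coarse, qv_fine)  # both near 1, the quadratic variation on [0, 1]
print(tv_coarse, tv_fine)  # the finer partition gives a much larger sum
```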


6 Local Martingales and Stochastic Integration

If $M\in\mathcal H^2$, $L^2(M)$ denotes the $L^2$ space of progressively measurable processes $f$ with the $L^2$ norm
\[
|f|_{L^2(M)}^2 := \mathbb{E}\Big[\int_0^\infty (f_s)^2\, d\langle M,M\rangle_s\Big] < \infty.
\]
If $(M_t)$ is a continuous local martingale, we denote by $L^2_{loc}(M)$ the space of progressively measurable processes $(K_s)$ with
\[
\int_0^t (K_s)^2\, d\langle M,M\rangle_s < \infty, \qquad \forall t.
\]

Exercise 33 Let $(B_t)$ be a standard Brownian motion and let $f_s, g_s \in L^2_{loc}(B)$. Compute the indicated bracket processes; the final expression should not involve stochastic integration.

1. $\big\langle \int_0^\cdot f_s\, dB_s, \int_0^\cdot g_s\, dB_s\big\rangle_t$.

2. $\big\langle \int_0^\cdot B_s\, dB_s, \int_0^\cdot B_s^3\, dB_s\big\rangle_t$.

Solution: (1) By the definition of the bracket of stochastic integrals,
\[
\Big\langle \int_0^\cdot f_s\, dB_s, \int_0^\cdot g_s\, dB_s\Big\rangle_t = \int_0^t f_s g_s\, d\langle B\rangle_s = \int_0^t f_s g_s\, ds .
\]
(2) By the same formula,
\[
\Big\langle \int_0^\cdot B_s\, dB_s, \int_0^\cdot B_s^3\, dB_s\Big\rangle_t = \int_0^t B_s^4\, d\langle B\rangle_s = \int_0^t B_s^4\, ds .
\]

Exercise 34 We say that a stochastic process belongs to $C^{\alpha-}$ if its sample paths are locally Hölder continuous of order $\gamma$ for every $\gamma < \alpha$. Let $(H_t, t\le 1)$ be an adapted, continuous and $L^p$ bounded stochastic process, where $p>2$, with $H\in L^2([0,1]\times\Omega)$. Let $(B_t)$ be a one dimensional Brownian motion and set $M_t = \int_0^t H_s\, dB_s$.

(a) Prove that $(M_t)$ belongs to $C^{(\frac12 - \frac1p)-}$.

(b) If $(H_t)$ is a bounded process, i.e. $\sup_{(t,\omega)} |H_t(\omega)| < \infty$, prove that $(M_t)$ belongs to $C^{\frac12-}$.

Proof: First note that the bracket process of $\int_s^t H_r\, dB_r$ is $\int_s^t (H_r)^2\, dr$.

(a) Since $H$ is $L^p$ bounded, there exists $C$ such that $\sup_{s\in[0,1]} \mathbb{E}(|H_s|^p) < C$. By the Burkholder-Davis-Gundy inequality and Jensen's inequality,
\[
\mathbb{E}|M_t - M_s|^p \le c_p\, \mathbb{E}\Big(\int_s^t (H_r)^2\, dr\Big)^{\frac p2}
\le c_p (t-s)^{\frac p2}\, \frac{1}{t-s}\, \mathbb{E}\int_s^t |H_r|^p\, dr
\le C c_p (t-s)^{\frac p2}.
\]
By Kolmogorov's continuity criterion, $(M_t)$ has a modification which is Hölder continuous of any order $\gamma < \frac1p\big(\frac p2 - 1\big) = \frac12 - \frac1p$.

(b) Similarly, for any $p \ge 2$,
\[
\mathbb{E}|M_t - M_s|^p \le c_p\, \mathbb{E}\Big(\int_s^t (H_r)^2\, dr\Big)^{\frac p2}
\le c_p (t-s)^{\frac p2} \sup_{r\in[0,1],\,\omega\in\Omega} |H_r(\omega)|^p,
\]
so that $(M_t)$ has a modification which is Hölder continuous of any order $\gamma < \frac12 - \frac1p$, for every $p \ge 2$. The second assertion follows, again by Kolmogorov's continuity theorem.

Exercise 35 Let $T>0$. Let $f$ be a left continuous and adapted process such that $\mathbb{E}\int_0^T (f_s)^2\, ds < \infty$. Prove that $\big(\int_0^t f_s\, dB_s,\ 0\le t\le T\big)$ is a martingale.

Solution: We know that $\big(\int_0^t f_s\, dB_s,\ 0\le t\le T\big)$ is a local martingale. Furthermore, by the Burkholder-Davis-Gundy inequality,
\[
\mathbb{E}\Big(\int_0^t f_s\, dB_s\Big)^2 \le c_2\, \mathbb{E}\int_0^t (f_s)^2\, ds \le c_2\, \mathbb{E}\int_0^T (f_s)^2\, ds < \infty.
\]
It is therefore a local martingale bounded in $L^2$, and so a martingale.

Exercise 36 Let $M\in\mathcal H^2$ and $H\in L^2(M)$. Let $\tau$ be a stopping time. Prove that
\[
\int_0^{t\wedge\tau} H_s\, dM_s = \int_0^t 1_{s\le\tau} H_s\, dM_s = \int_0^t H_s\, dM^\tau_s .
\]

Solution: For any $N\in\mathcal H^2$, we use Exercise 30 to obtain the identities
\[
\int_0^t H_s\, d\langle M^\tau, N\rangle_s = \int_0^t H_s\, d\langle M,N\rangle_{\tau\wedge s}
= \int_0^t 1_{s\le\tau} H_s\, d\langle M,N\rangle_s = \int_0^{\tau\wedge t} H_s\, d\langle M,N\rangle_s .
\]
The claim now follows from the fact that the stochastic integral $I_t = \int_0^t H_s\, dM^\tau_s$ is determined uniquely by the identities
\[
\langle I, N\rangle_t = \int_0^t H_s\, d\langle M^\tau, N\rangle_s, \qquad t\ge 0,
\]
for all $N\in\mathcal H^2$.

Exercise 37 Let $M\in\mathcal H^2$ and $K\in\mathcal E$. Prove that the elementary integral $I_t := \int_0^t K_s\, dM_s$ satisfies
\[
\langle I, N\rangle_t = \int_0^t K_s\, d\langle M,N\rangle_s, \qquad \forall t\ge 0,
\]
for any $N\in\mathcal H^2$.

Solution: See lecture notes.

Exercise 38 Let $B_t = (B^1_t, \dots, B^n_t)$ be an $n$-dimensional Brownian motion. Prove that
\[
|B_t|^2 = 2\sum_{i=1}^n \int_0^t B^i_s\, dB^i_s + nt.
\]

Solution: Either use $|B_t|^2 = \sum_{i=1}^n |B^i_t|^2$, apply the product formula to each component $(B^i_t)$ and sum up, or use the multi-dimensional version of Itô's formula. Let us consider the function $f:\mathbb{R}^n\to\mathbb{R}$, $f(x) = |x|^2$. Its derivatives are given by
\[
\partial_i f(x) = 2x_i, \qquad \partial^2_{ij} f(x) = 2\delta_{i,j}.
\]
Applying Itô's formula, we obtain
\[
|B_t|^2 = f(B_t) = f(B_0) + \sum_{i=1}^n \int_0^t \partial_i f(B_s)\, dB^i_s + \frac12 \sum_{i,j=1}^n \int_0^t \partial^2_{ij} f(B_s)\, d\langle B^i, B^j\rangle_s
= 0 + 2\sum_{i=1}^n \int_0^t B^i_s\, dB^i_s + \sum_{i=1}^n \int_0^t ds = 2\sum_{i=1}^n \int_0^t B^i_s\, dB^i_s + nt,
\]
which is the required identity.
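A quick numerical sanity check of this identity (an illustration only; the dimension and step count are arbitrary), with each Itô integral approximated by a left-point Riemann sum:

```python
import numpy as np

# Numerical check of |B_t|^2 = 2 * sum_i int_0^t B^i dB^i + n t, with the Ito
# integrals approximated by left-point Riemann sums.
rng = np.random.default_rng(2)
n_dim, n_steps, t_final = 3, 100_000, 1.0
dt = t_final / n_steps

dB = rng.normal(0.0, np.sqrt(dt), size=(n_dim, n_steps))
B = np.cumsum(dB, axis=1)                              # B at right endpoints
B_left = np.hstack([np.zeros((n_dim, 1)), B[:, :-1]])  # B at left endpoints

ito_sum = np.sum(B_left * dB)       # approximates sum_i int_0^t B^i dB^i
lhs = np.sum(B[:, -1] ** 2)         # |B_t|^2
rhs = 2.0 * ito_sum + n_dim * t_final
print(lhs, rhs)                     # the two sides should nearly agree
```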


Exercise 39 Give an expression for $\int_0^t s\, dB_s$ that does not involve stochastic integration.

Solution: Applying the integration by parts formula, we obtain
\[
\int_0^t s\, dB_s = tB_t - \int_0^t B_s\, ds.
\]
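This integration by parts formula is easy to check numerically with left-point Riemann sums (illustration only):

```python
import numpy as np

# Numerical check of int_0^t s dB_s = t B_t - int_0^t B_s ds.
rng = np.random.default_rng(3)
n_steps, t_final = 100_000, 1.0
dt = t_final / n_steps
times = np.linspace(0.0, t_final, n_steps + 1)

dB = rng.normal(0.0, np.sqrt(dt), size=n_steps)
B = np.concatenate([[0.0], np.cumsum(dB)])

lhs = np.sum(times[:-1] * dB)                # left-point sum for int s dB_s
rhs = t_final * B[-1] - np.sum(B[:-1] * dt)  # t B_t minus a Riemann sum for int B ds
print(lhs, rhs)                              # should nearly agree
```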

Exercise 40 Write $\int_0^t (2B_s+1)\, d\big(\int_0^s B_r\, d(B_r+r)\big)$ as a function of $(B_t)$, not involving stochastic integrals.

Solution: Firstly,
\[
I_t := \int_0^t (2B_s+1)\, d\Big(\int_0^s B_r\, d(B_r+r)\Big) = \int_0^t (2B_s+1) B_s\, d(B_s+s)
= \int_0^t 2B_s^2\, dB_s + \int_0^t B_s\, dB_s + \int_0^t (2B_s^2 + B_s)\, ds.
\]
From Itô's formula for the function $x^3$ we obtain
\[
\int_0^t B_s^2\, dB_s = \frac13 B_t^3 - \int_0^t B_s\, ds,
\]
and Itô's formula for the function $x^2$ gives
\[
\int_0^t B_s\, dB_s = \frac12 \big(B_t^2 - t\big).
\]
Substituting these stochastic integrals into the first identity, we get
\[
I_t = \frac23 B_t^3 + \frac12 \big(B_t^2 - t\big) + 2\int_0^t B_s^2\, ds - \int_0^t B_s\, ds.
\]
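A numerical sanity check of this closed form (illustration only), with the Itô integral on the left approximated by left-point sums:

```python
import numpy as np

# Compare a left-point discretisation of I_t = int_0^t (2B_s + 1) B_s d(B_s + s)
# with the closed form (2/3) B_t^3 + (1/2)(B_t^2 - t) + 2 int B^2 ds - int B ds.
rng = np.random.default_rng(4)
n_steps, t_final = 100_000, 1.0
dt = t_final / n_steps

dB = rng.normal(0.0, np.sqrt(dt), size=n_steps)
B = np.concatenate([[0.0], np.cumsum(dB)])
B_left, B_t = B[:-1], B[-1]

ito = np.sum((2 * B_left + 1) * B_left * (dB + dt))     # discretised I_t
closed = (2 / 3) * B_t**3 + 0.5 * (B_t**2 - t_final) \
    + 2 * np.sum(B_left**2) * dt - np.sum(B_left) * dt
print(ito, closed)   # should nearly agree
```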

Exercise 41 If $(H_t)$ and $(K_t)$ are continuous martingales, prove that
\[
\langle HK, B\rangle_t = \int_0^t H_r\, d\langle K,B\rangle_r + \int_0^t K_r\, d\langle H,B\rangle_r.
\]

Solution: The product formula gives
\[
H_t K_t = H_0 K_0 + \int_0^t H_r\, dK_r + \int_0^t K_r\, dH_r + \langle H,K\rangle_t .
\]
Since $\langle H,K\rangle_t$ is of finite variation, it does not contribute to the bracket with $B$. Hence, by the bracket formula for stochastic integrals, we obtain
\[
\langle HK, B\rangle_t = \Big\langle \int_0^\cdot H_r\, dK_r + \int_0^\cdot K_r\, dH_r,\ B\Big\rangle_t
= \int_0^t H_r\, d\langle K,B\rangle_r + \int_0^t K_r\, d\langle H,B\rangle_r ,
\]
which is the required identity.


7 Itô's Formula

Exercise 42 If $(N_t)$ is a continuous local martingale with $N_0 = 0$, show that $\big(e^{N_t - \frac12\langle N,N\rangle_t}\big)$ is a local martingale, and $\mathbb{E}\big(e^{N_t - \frac12\langle N,N\rangle_t}\big) \le 1$.

Solution: Let us denote $X_t = e^{N_t - \frac12\langle N,N\rangle_t}$. The process $(N_t - \frac12\langle N,N\rangle_t)$ is a semi-martingale, and applying Itô's formula to the function $e^x$ gives
\[
X_t = 1 + \int_0^t X_s\, dN_s .
\]
Since the stochastic integral is a local martingale, we conclude that $(X_t)$ is a local martingale.

Moreover, let $(T_n)$ be an increasing sequence of stopping times with $T_n\to\infty$ such that each $(X^{T_n}_t)$ is a uniformly integrable martingale. Since $X\ge 0$, applying Fatou's lemma we get
\[
\mathbb{E}[X_t] = \mathbb{E}\big[\lim_{n\to\infty} X^{T_n}_t\big] \le \liminf_{n\to\infty} \mathbb{E}\big[X^{T_n}_t\big] = \mathbb{E}[X_0] = 1,
\]
which is the required bound.
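For $N_t = B_t$ the bound can be checked by Monte Carlo: here $\langle N,N\rangle_t = t$ and $e^{B_t - t/2}$ is in fact a true martingale, so its mean is exactly $1$ (a sketch with the illustrative choice $t = 1$):

```python
import numpy as np

# Monte Carlo illustration: for N_t = B_t, the exponential local martingale
# e^{B_t - t/2} is a true martingale, so its mean is 1 (consistent with <= 1).
rng = np.random.default_rng(5)
t = 1.0
Z = rng.normal(0.0, 1.0, size=200_000)        # B_t ~ N(0, t) with t = 1
samples = np.exp(np.sqrt(t) * Z - 0.5 * t)
print(samples.mean())                         # close to 1
```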

Exercise 43 Show that a positive continuous local martingale $(N_t)$ with $N_0 = 1$ can be written in the form $N_t = \exp\big(M_t - \frac12\langle M,M\rangle_t\big)$, where $(M_t)$ is a continuous local martingale.

Solution: Let us define $M_t = \int_0^t N_s^{-1}\, dN_s$. Applying Itô's formula to the function $\log x$, we obtain
\[
\log N_t = \log N_0 + \int_0^t N_s^{-1}\, dN_s - \frac12 \int_0^t N_s^{-2}\, d\langle N\rangle_s = M_t - \frac12\langle M\rangle_t ,
\]
which implies the claim.

Exercise 44 Let $(B_t)$ be a one dimensional Brownian motion on $(\Omega,\mathcal F,\mathcal F_t, P)$ and let $g:\mathbb{R}\to\mathbb{R}$ be a bounded Borel measurable function. Find a solution to
\[
x_t = x_0 + \int_0^t x_s\, g(B_s)\, dB_s .
\]

Solution: We define $M_t = \int_0^t g(B_s)\, dB_s$. The equation reads $dx_t = x_t\, dM_t$, whose solution is $x_0$ times the exponential martingale
\[
e^{M_t - \frac12\langle M\rangle_t}.
\]
Indeed, let $y_t = M_t - \frac12\langle M\rangle_t$. Applying Itô's formula to $e^x$ and the semi-martingale $(y_t)$, we see that
\[
e^{y_t} = 1 + \int_0^t e^{y_s}\, dy_s + \frac12 \int_0^t e^{y_s}\, d\langle M\rangle_s = 1 + \int_0^t e^{y_s}\, dM_s,
\]
proving the claim.

Exercise 45 1. Let $f, g:\mathbb{R}\to\mathbb{R}$ be $C^2$ functions. Prove that
\[
\langle f(B), g(B)\rangle_t = \int_0^t f'(B_s)\, g'(B_s)\, ds.
\]
2. Compute $\big\langle \exp\big(M_t - \frac12\langle M\rangle_t\big),\ \exp\big(N_t - \frac12\langle N\rangle_t\big)\big\rangle$, where $(M_t)$ and $(N_t)$ are continuous local martingales.

Solution: (1) By Itô's formula we have
\[
f(B_t) = f(0) + \int_0^t f'(B_s)\, dB_s + \frac12 \int_0^t f''(B_s)\, ds, \qquad
g(B_t) = g(0) + \int_0^t g'(B_s)\, dB_s + \frac12 \int_0^t g''(B_s)\, ds.
\]
Since the finite variation parts do not contribute to the bracket, we obtain
\[
\langle f(B), g(B)\rangle_t = \Big\langle \int_0^\cdot f'(B_s)\, dB_s, \int_0^\cdot g'(B_s)\, dB_s\Big\rangle_t = \int_0^t f'(B_s)\, g'(B_s)\, ds,
\]
which is the required identity.

(2) Let us denote $X_t = \exp\big(M_t - \frac12\langle M\rangle_t\big)$ and $Y_t = \exp\big(N_t - \frac12\langle N\rangle_t\big)$. Then, applying Itô's formula to the function $e^x$ as in Exercise 42, we obtain
\[
X_t = \exp(M_0) + \int_0^t X_s\, dM_s, \qquad Y_t = \exp(N_0) + \int_0^t Y_s\, dN_s.
\]
Hence, we conclude
\[
\langle X, Y\rangle_t = \Big\langle \int_0^\cdot X_s\, dM_s, \int_0^\cdot Y_s\, dN_s\Big\rangle_t = \int_0^t X_s Y_s\, d\langle M,N\rangle_s .
\]


Exercise 46 Let $B_t = (B^1_t, \dots, B^n_t)$ be an $n$-dimensional Brownian motion on $(\Omega,\mathcal F,\mathcal F_t, P)$, with $n>1$. We know that for any $t>0$, with probability one $r_s \ne 0$ for all $0 < s \le t$. Prove that the process $r_t = |B_t|$ is a solution of the equation
\[
r_t = r_0 + \beta_t + \int_0^t \frac{n-1}{2 r_s}\, ds, \qquad r_0 = 0,
\]
where $(\beta_t)$ is a one dimensional Brownian motion.

Solution: Let us take the function $f(x) = |x|$, for $x\in\mathbb{R}^n$. For $x\ne 0$ its derivatives are given by
\[
\frac{\partial f}{\partial x_i}(x) = \frac{x_i}{|x|}, \qquad \frac{\partial^2 f}{\partial x_i^2}(x) = \frac{|x|^2 - x_i^2}{|x|^3}.
\]
Note that $f$ is $C^2$ on $\mathbb{R}^n\setminus\{0\}$. Let $\tau$ be the first time that $X_t := x_0 + B_t$ hits $0$. Since $n>1$, $\tau = \infty$ almost surely, so $X^\tau_t = X_t$ almost surely. Applying Itô's formula to $f$ and $X^\tau$, we obtain
\[
r_t = r_0 + \sum_{i=1}^n \int_0^t \frac{X^i_s}{|X_s|}\, dB^i_s + \frac12 \sum_{i=1}^n \int_0^t \frac{|X_s|^2 - (X^i_s)^2}{|X_s|^3}\, ds
= r_0 + \sum_{i=1}^n \int_0^t \frac{X^i_s}{|X_s|}\, dB^i_s + \frac12 \int_0^t \frac{n|X_s|^2 - |X_s|^2}{|X_s|^3}\, ds.
\]
To finish the proof, we have to show that $\beta_t := \sum_{i=1}^n \int_0^t \frac{X^i_s}{|X_s|}\, dB^i_s$ is a Brownian motion. Firstly, it is a continuous local martingale starting at $0$. It has quadratic variation
\[
\langle \beta, \beta\rangle_t = \sum_{i=1}^n \int_0^t \frac{(X^i_s)^2}{|X_s|^2}\, ds = t.
\]
Hence, the claim follows from the Lévy characterisation theorem.
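The key step, that $\beta_t$ has quadratic variation $t$, can be watched numerically. The sketch below (illustration only; it starts the Brownian motion at $x_0 = (1,0,0)$ to stay away from the origin) discretises $\beta$ and computes its quadratic variation:

```python
import numpy as np

# Discretise beta_t = sum_i int_0^t X^i/|X| dB^i for a 3-dimensional Brownian
# motion started away from 0, and check that its quadratic variation is close
# to t, as the Levy characterisation step requires.
rng = np.random.default_rng(6)
n_dim, n_steps, t_final = 3, 50_000, 1.0
dt = t_final / n_steps

x0 = np.array([1.0, 0.0, 0.0])                  # start away from the origin
dB = rng.normal(0.0, np.sqrt(dt), size=(n_steps, n_dim))
X = x0 + np.vstack([np.zeros(n_dim), np.cumsum(dB, axis=0)])
X_left = X[:-1]
norms = np.linalg.norm(X_left, axis=1)

d_beta = np.sum(X_left / norms[:, None] * dB, axis=1)   # increments of beta
qv = np.sum(d_beta**2)
print(qv)   # close to t_final = 1
```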

Exercise 47 Let $f:\mathbb{R}\to\mathbb{R}$ be $C^2$. Suppose that there is a constant $C>0$ such that $f \ge C$. Let
\[
h(x) = \int_0^x \frac{1}{f(y)}\, dy.
\]
Suppose that $\lim_{x\to+\infty} h(x) = \infty$, and denote by $h^{-1}$ its inverse. Prove that $h^{-1}\big(h(x_0) + B_t\big)$ satisfies:
\[
x_t = x_0 + \int_0^t f(x_s)\, dB_s + \frac12 \int_0^t f(x_s)\, f'(x_s)\, ds.
\]

Solution: Let us denote $g(x) = h^{-1}\big(h(x_0) + x\big)$. Then its derivatives are given by
\[
g'(x) = f(g(x)), \qquad g''(x) = f'(g(x))\, f(g(x)).
\]
The claim now follows by applying Itô's formula to $g(B_t)$.
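The derivative identities for $g$ can be verified numerically for a concrete choice of $f$. Below we use the illustrative choice $f(y) = \sqrt{1+y^2} \ge 1$, so $h = \operatorname{arcsinh}$ and $h^{-1} = \sinh$ (an assumption for this sketch, not part of the exercise), and check $g' = f(g)$ and $g'' = f'(g) f(g)$ by finite differences:

```python
import numpy as np

# Deterministic check, with the illustrative choice f(y) = sqrt(1 + y^2)
# (so h = arcsinh, h^{-1} = sinh), that g(x) = h^{-1}(h(x0) + x) satisfies
# g' = f(g) and g'' = f'(g) f(g), via central finite differences.
f = lambda y: np.sqrt(1.0 + y**2)
fp = lambda y: y / np.sqrt(1.0 + y**2)          # f'
x0 = 0.7
g = lambda x: np.sinh(np.arcsinh(x0) + x)

x, eps = 0.3, 1e-5
g1 = (g(x + eps) - g(x - eps)) / (2 * eps)          # numerical g'
g2 = (g(x + eps) - 2 * g(x) + g(x - eps)) / eps**2  # numerical g''
print(g1, f(g(x)))              # should agree
print(g2, fp(g(x)) * f(g(x)))   # should agree
```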

Exercise 48 Let $(B_t)$ be a Brownian motion and $\tau$ its hitting time of the set $[2,\infty)$. Is the stochastic process $\big(\frac{1}{B_t - 2},\ t < \tau\big)$ a solution to a stochastic differential equation? (The time $\tau$ is the natural life time of the process $\frac{1}{B_t - 2}$.)

Solution: Note that $\tau = \inf\{t: B_t \in [2,\infty)\}$. Let $(\tau_n)$ be a sequence of stopping times increasing to $\tau$, for example the hitting times of $[2 - \frac1n, \infty)$. Let $f(x) = (x-2)^{-1}$. Where $f$ is differentiable, $f'(x) = -(x-2)^{-2} = -f^2(x)$ and $f''(x) = 2(x-2)^{-3} = 2f^3(x)$. Applying Itô's formula to $f$ and the stopped process $B^{\tau_n}$, and writing $X_t := (B_t - 2)^{-1}$ (so $X_0 = -\frac12$), we obtain
\[
X^{\tau_n}_t \equiv f(B^{\tau_n}_t) = -\frac12 - \int_0^{t\wedge\tau_n} (X_s)^2\, dB_s + \int_0^{t\wedge\tau_n} (X_s)^3\, ds.
\]
We see that on the set $\{t < \tau_n\}$,
\[
X_t = -\frac12 - \int_0^t X_s^2\, dB_s + \int_0^t X_s^3\, ds,
\]
which is the required equation. Since $\{t < \tau\} = \cup_n \{t < \tau_n\}$, the equation holds on $\{t < \tau\}$.

Exercise 49 Let $(X_t)$ be a continuous local martingale starting from $0$. Prove that $\langle X^2, X\rangle_t = 2\int_0^t X_s\, d\langle X,X\rangle_s$.

Solution: Note that $X_t = \int_0^t dX_s$ and, by Itô's formula,
\[
X_t^2 = 2\int_0^t X_s\, dX_s + \langle X,X\rangle_t .
\]
Since $\langle X,X\rangle_t$ is of finite variation, it does not contribute to the bracket. Consequently,
\[
\langle X^2, X\rangle_t = \Big\langle 2\int_0^\cdot X_s\, dX_s,\ \int_0^\cdot dX_s\Big\rangle_t = 2\int_0^t X_s\, d\langle X,X\rangle_s .
\]


8 Lévy's Characterisation Theorem, SDEs

Exercise 50 Let $\sigma_k = (\sigma^1_k, \dots, \sigma^d_k)$, $k = 1,\dots,m$, and $b = (b^1,\dots,b^d)$ be $C^2$ functions from $\mathbb{R}^d$ to $\mathbb{R}^d$. Let
\[
L = \frac12 \sum_{i,j=1}^d \Big(\sum_{k=1}^m \sigma^i_k \sigma^j_k\Big) \frac{\partial^2}{\partial x_i \partial x_j} + \sum_{l=1}^d b^l \frac{\partial}{\partial x_l}.
\]
Suppose that for some $c>0$,
\[
|\sigma(x)|^2 \le c(1+|x|^2), \qquad \langle b(x), x\rangle \le c(1+|x|^2).
\]

(a) Let $f(x) = |x|^2 + 1$. Prove that $Lf \le af$ for some constant $a$.

(b) If $x_t = (x^1_t, \dots, x^d_t)$ is a stochastic process with values in $\mathbb{R}^d$, satisfying the relations
\[
x^i_t = x^i_0 + \sum_{k=1}^m \int_0^t \sigma^i_k(x_s)\, dB^k_s + \int_0^t b^i(x_s)\, ds,
\]
and $f:\mathbb{R}^d\to\mathbb{R}$ is $C^2$, prove that $f(x_t) - f(x_0) - \int_0^t Lf(x_s)\, ds$ is a local martingale and give the semi-martingale decomposition of $f(x_t)$.

Solution: (a) The partial derivatives of $f$ are given by
\[
\frac{\partial f}{\partial x_i}(x) = 2x_i, \qquad \frac{\partial^2 f}{\partial x_i \partial x_j}(x) = 2\delta_{i,j}.
\]
Hence, we obtain the following bound:
\[
Lf(x) = \sum_{i=1}^d \sum_{k=1}^m (\sigma^i_k)^2 + 2\sum_{i=1}^d b^i x_i = |\sigma(x)|^2 + 2\langle b(x), x\rangle \le c f(x) + 2c f(x) = 3c f(x),
\]
which is the required bound.

(b) Firstly, $\langle B^i, B^j\rangle_t = t\,\delta_{i,j}$. Hence,
\[
\langle x^i, x^j\rangle_t = \Big\langle \sum_{k=1}^m \int_0^\cdot \sigma^i_k(x_s)\, dB^k_s,\ \sum_{l=1}^m \int_0^\cdot \sigma^j_l(x_s)\, dB^l_s\Big\rangle_t
= \sum_{k=1}^m \int_0^t \sigma^i_k(x_s)\, \sigma^j_k(x_s)\, ds.
\]
By Itô's formula,
\begin{align*}
f(x_t) &= f(x_0) + \sum_{i=1}^d \int_0^t \frac{\partial f}{\partial x_i}(x_s)\, dx^i_s + \frac12 \sum_{i,j=1}^d \int_0^t \frac{\partial^2 f}{\partial x_i \partial x_j}(x_s)\, d\langle x^i, x^j\rangle_s \\
&= f(x_0) + \sum_{k=1}^m \int_0^t \sum_{i=1}^d \frac{\partial f}{\partial x_i}(x_s)\, \sigma^i_k(x_s)\, dB^k_s \\
&\quad + \int_0^t \Big[\frac12 \sum_{i,j=1}^d \Big(\sum_{k=1}^m \sigma^i_k(x_s)\, \sigma^j_k(x_s)\Big) \frac{\partial^2 f}{\partial x_i \partial x_j}(x_s) + \sum_{i=1}^d \frac{\partial f}{\partial x_i}(x_s)\, b^i(x_s)\Big]\, ds.
\end{align*}
Thus $f(x_t) - f(x_0) - \int_0^t Lf(x_s)\, ds = \sum_{k=1}^m \int_0^t \sum_{i=1}^d \frac{\partial f}{\partial x_i}(x_s)\, \sigma^i_k(x_s)\, dB^k_s$ is a local martingale, and the semi-martingale decomposition of $f(x_t)$ is as given in the identity above.

Exercise 51 Let $T$ be a bounded stopping time and $(B_t)$ a one dimensional Brownian motion. Prove that $(B_{T+s} - B_T,\ s\ge 0)$ is a Brownian motion.

Solution: Let us denote $W_s = B_{T+s} - B_T$. It is obvious that $W_0 = 0$ and that $(W_t)$ has continuous sample paths. Let $\mathcal G_t = \mathcal F_{T+t}$. By the optional stopping theorem we have, for any $s<t$,
\[
\mathbb{E}[W_t \,|\, \mathcal G_s] = \mathbb{E}[B_{T+t} \,|\, \mathcal F_{T+s}] - B_T = B_{T+s} - B_T = W_s,
\]
so $(W_t)$ is a $(\mathcal G_t)$-martingale. Next, we take $s<t$ and, using the fact that $(B_{T+t})^2 - (T+t)$ is a martingale, derive
\begin{align*}
\mathbb{E}[W_t^2 \,|\, \mathcal G_s] &= \mathbb{E}[B_{T+t}^2 + B_T^2 - 2 B_T B_{T+t} \,|\, \mathcal F_{T+s}] \\
&= \mathbb{E}[B_{T+t}^2 - (T+t) \,|\, \mathcal F_{T+s}] + (T+t) + B_T^2 - 2 B_T B_{T+s} \\
&= \big(B_{T+s}^2 - (T+s)\big) + (T+t) + B_T^2 - 2 B_T B_{T+s} \\
&= W_s^2 + t - s.
\end{align*}
This implies that $W_t^2 - t$ is a $(\mathcal G_t)$-martingale, and hence $\langle W, W\rangle_t = t$. Using the Lévy characterisation theorem, we conclude that $(W_t)$ is a $(\mathcal G_t)$-Brownian motion.

Exercise 52 Let $B_t, W^1_t, W^2_t$ be independent one dimensional Brownian motions.

1. Let $(x_s)$ be an adapted continuous stochastic process. Prove that $(W_t)$ defined below is a Brownian motion:
\[
W_t = \int_0^t \cos(x_s)\, dW^1_s + \int_0^t \sin(x_s)\, dW^2_s .
\]

2. Let $\mathrm{sgn}(x) = 1$ if $x>0$ and $\mathrm{sgn}(x) = -1$ if $x\le 0$. Prove that $(W_t)$ is a Brownian motion if it satisfies the relation
\[
W_t = \int_0^t \mathrm{sgn}(W_s)\, dB_s .
\]

3. Prove that the process $(X_t, Y_t)$ is a two dimensional Brownian motion if they satisfy the relations
\[
X_t = \int_0^t 1_{X_s > Y_s}\, dW^1_s + \int_0^t 1_{X_s \le Y_s}\, dW^2_s, \qquad
Y_t = \int_0^t 1_{X_s \le Y_s}\, dW^1_s + \int_0^t 1_{X_s > Y_s}\, dW^2_s .
\]

Solution: (1) The process $(W_t)$ is a continuous martingale with quadratic variation
\[
\langle W\rangle_t = \int_0^t \big(\cos^2(x_s) + \sin^2(x_s)\big)\, ds = t .
\]
Hence, by the Lévy characterisation theorem, we conclude that $(W_t)$ is a Brownian motion.

(2) This can be shown in the same way, since $(\mathrm{sgn}(W_s))^2 = 1$.

(3) The processes $(X_t)$ and $(Y_t)$ are continuous martingales, and their quadratic variations are
\[
\langle X\rangle_t = \int_0^t \big(1^2_{X_s > Y_s} + 1^2_{X_s \le Y_s}\big)\, ds = t, \qquad
\langle Y\rangle_t = \int_0^t \big(1^2_{X_s \le Y_s} + 1^2_{X_s > Y_s}\big)\, ds = t .
\]
The bracket process is equal to
\[
\langle X, Y\rangle_t = \int_0^t 1_{X_s > Y_s}\, 1_{X_s \le Y_s}\, ds + \int_0^t 1_{X_s \le Y_s}\, 1_{X_s > Y_s}\, ds = 0,
\]
where we have used the fact that $1_{X_s > Y_s}\, 1_{X_s \le Y_s} = 0$. The claim now follows from the Lévy characterisation theorem.


Exercise 53 Let $t\in[0,1)$. Define $x_0 = 0$ and
\[
x_t = (1-t) \int_0^t \frac{1}{1-s}\, dB_s .
\]

(1) Prove that
\[
\mathbb{E}\Big[\int_0^t \frac{dB_s}{1-s}\Big]^2 = \frac{t}{1-t}.
\]

(2) Prove that
\[
\Big\langle \int_0^\cdot \frac{dB_s}{1-s}\Big\rangle_t = \frac{t}{1-t}.
\]
Define $A(t) = \frac{t}{1-t}$. Then $A^{-1}(r) = \frac{r}{r+1}$.

(3) Define
\[
W_r = \int_0^{A^{-1}(r)} \frac{dB_s}{1-s}, \qquad 0 \le r < \infty.
\]
Let $\mathcal G_r = \mathcal F_{A^{-1}(r)}$. Prove that $(W_r)$ is a $(\mathcal G_r)$-martingale.

(4) For a standard one dimensional Brownian motion $B_t$, $\lim_{r\to 0} r B_{1/r} = 0$. Use this to prove that $\lim_{t\to 1} x_t = 0$.

(5) Prove that $x_t$ solves
\[
x_t = B_t - \int_0^t \frac{x_s}{1-s}\, ds .
\]

(6) Prove that $\int_0^t \frac{x_s}{1-s}\, ds$ is of finite total variation on $[0,1]$. Hint: it is sufficient to prove that
\[
\int_0^1 \frac{|x_s|}{1-s}\, ds < \infty
\]
almost surely.

Solution: (1) By the Itô isometry we obtain
\[
\mathbb{E}\Big[\int_0^t \frac{dB_s}{1-s}\Big]^2 = \int_0^t \frac{ds}{(1-s)^2} = \frac{t}{1-t}.
\]

(2) We have
\[
\Big\langle \int_0^\cdot \frac{dB_s}{1-s}\Big\rangle_t = \int_0^t \frac{d\langle B\rangle_s}{(1-s)^2} = \frac{t}{1-t}.
\]

(3) It is obvious that $(W_r)$ is integrable and adapted to $(\mathcal G_r)$. Moreover, for any $t>s$ we have
\[
\mathbb{E}[W_t \,|\, \mathcal G_s] = \mathbb{E}\Big[\int_0^{t/(t+1)} \frac{dB_r}{1-r}\,\Big|\, \mathcal F_{s/(s+1)}\Big] = \int_0^{s/(s+1)} \frac{dB_r}{1-r} = W_s,
\]
because $s/(s+1) < t/(t+1)$ and the Itô integral $\int_0^\cdot \frac{dB_r}{1-r}$ is a martingale with respect to $(\mathcal F_t)$, $t<1$. This shows that $(W_r)$ is a martingale with respect to $(\mathcal G_r)$.

(4) By (2), we can calculate
\[
\langle W\rangle_t = \frac{A^{-1}(t)}{1 - A^{-1}(t)} = t.
\]
By the Lévy characterisation theorem, $(W_t)$ is a Brownian motion, and therefore $\lim_{r\to 0} r W_{1/r} = 0$. Writing $x_t = (1-t) W_{A(t)}$ and substituting $t = r/(r+1)$, so that $A(t) = r$ and $1-t = 1/(r+1)$, we obtain
\[
\lim_{t\to 1} x_t = \lim_{r\to\infty} \frac{W_r}{r+1} = \lim_{u\to 0} \frac{u\, W_{1/u}}{1+u} = 0.
\]

(5) Since the function $\frac{1}{1-s}$ has finite total variation on $[0,c]$, for every $c<1$, the integral $\int_0^t B_s\, d\frac{1}{1-s}$, for $t\le c$, can be defined in the Stieltjes sense; using integration by parts, we see that the process $W_{A(t)}$ is defined in the Stieltjes sense as well. We can use classical analysis to derive the following equalities, for $t\le c$:
\begin{align*}
\int_0^t \frac{x_s}{1-s}\, ds &= \int_0^t W_{A(s)}\, ds = s\, W_{A(s)}\Big|_0^t - \int_0^t s\, dW_{A(s)} \\
&= t \int_0^t \frac{dB_s}{1-s} - \int_0^t \frac{s}{1-s}\, dB_s
= t \int_0^t \frac{dB_s}{1-s} - \int_0^t \frac{dB_s}{1-s} + \int_0^t dB_s \\
&= -x_t + B_t ,
\end{align*}
where we used $dW_{A(s)} = \frac{dB_s}{1-s}$ and $\frac{s}{1-s} = \frac{1}{1-s} - 1$. This is the claimed equality for any $t\le c$; letting $c\to 1$, we conclude that the equality holds for every $t\in[0,1)$.

(6) Since $x_s$ is $(1-s)$ times an integral of a deterministic integrand with respect to Brownian motion, it has the normal distribution, whose parameters are easily seen to be $0$ and $(1-s)^2 \frac{s}{1-s} = s(1-s)$ (the latter follows from (1)). Thus, we apply the Fubini theorem to derive
\[
\mathbb{E}\int_0^1 \frac{|x_s|}{1-s}\, ds = \int_0^1 \frac{\mathbb{E}|x_s|}{1-s}\, ds = C \int_0^1 \frac{\sqrt{s(1-s)}}{1-s}\, ds < \infty,
\]
for some constant $C>0$. This implies that $\int_0^1 \frac{|x_s|}{1-s}\, ds < \infty$ almost surely. Defining the process $Y_t = \int_0^t \frac{x_s}{1-s}\, ds$ and taking any partition $0 = t_0 < t_1 < \dots < t_n = 1$, we derive
\[
\sum_{i=0}^{n-1} |Y_{t_{i+1}} - Y_{t_i}| = \sum_{i=0}^{n-1} \Big|\int_{t_i}^{t_{i+1}} \frac{x_s}{1-s}\, ds\Big|
\le \sum_{i=0}^{n-1} \int_{t_i}^{t_{i+1}} \frac{|x_s|}{1-s}\, ds = \int_0^1 \frac{|x_s|}{1-s}\, ds < \infty,
\]
from which the claim follows.
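The process $x_t$ is the Brownian bridge, and the computations above can be watched by simulation (a sketch, illustration only): the sample variance at a fixed time should be close to $t(1-t)$, and the paths should return near $0$ as $t\to 1$.

```python
import numpy as np

# Monte Carlo illustration: x_t = (1 - t) * int_0^t dB_s / (1 - s) is Gaussian
# with mean 0 and variance t(1 - t) (a Brownian bridge), and each path
# returns to 0 as t -> 1.
rng = np.random.default_rng(7)
n_paths, n_steps = 5_000, 2_000
dt = 1.0 / n_steps
s_left = np.arange(n_steps) * dt                 # left endpoints s_0, ..., s_{n-1}

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
integral = np.cumsum(dB / (1.0 - s_left), axis=1)    # int_0^t dB_s / (1 - s)
t_grid = s_left + dt
x = (1.0 - t_grid) * integral

k = n_steps // 2                                 # index of t = 0.5
print(x[:, k].var(), 0.5 * (1 - 0.5))            # sample variance vs t(1 - t)
print(np.abs(x[:, -2]).mean())                   # small: paths pinned near 0 at t ~ 1
```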


Problem Sheet 9: SDEs and Girsanov

Exercise 54 Let us consider an SDE $dx_t = \sum_{k=1}^m \sigma_k(x_t)\, dB^k_t + \sigma_0(x_t)\, dt$ on $\mathbb{R}^d$ with infinitesimal generator $L$. Let $D$ be a bounded open subset of $\mathbb{R}^d$ with closure $\bar D$ and $C^2$ boundary $\partial D$. Let $x_0 \in D$. Suppose that there is a global solution to the SDE with initial value $x_0$, and denote by $\tau$ its first exit time from $D$. Suppose that $\tau < \infty$ almost surely. Let $g:\bar D\to\mathbb{R}$ be a continuous function, and $u:\bar D\to\mathbb{R}$ be a solution to the Dirichlet problem $Lu = -g$ on $D$ and $u = 0$ on the boundary.

Prove that $u(x_0) = \mathbb{E}\int_0^\tau g(x_s)\, ds$. [Hint: Use a sequence of increasing stopping times $\tau_n$ with limit $\tau$.]

Solution: Let $(D_n)$ be an increasing sequence of open sets with $\bar D_n \subset D_{n+1} \subset D$ and $\cup_n D_n = D$, and let $\tau_n$ be the first exit time of $x_t$ from $D_n$. By Itô's formula,
\[
u(x_{t\wedge\tau_n}) = u(x_0) + \int_0^{t\wedge\tau_n} Lu(x_s)\, ds + \sum_{k=1}^m \int_0^{t\wedge\tau_n} du\big(\sigma_k(x_s)\big)\, dB^k_s
= u(x_0) - \int_0^{t\wedge\tau_n} g(x_s)\, ds + \sum_{k=1}^m \int_0^{t\wedge\tau_n} du\big(\sigma_k(x_s)\big)\, dB^k_s .
\]
Since $u$ and $\sigma_k$ are continuous on the compact set $\bar D_n$, the stochastic integral is a martingale. We have
\[
\mathbb{E}[u(x_{t\wedge\tau_n})] = u(x_0) - \mathbb{E}\int_0^{t\wedge\tau_n} g(x_s)\, ds.
\]
Since $\tau_n \le \tau < \infty$, $\lim_{t\uparrow\infty}(t\wedge\tau_n) = \tau_n$ almost surely. We take $t\to\infty$ followed by $n\to\infty$, and use the dominated convergence theorem ($u$ and $g$ are bounded on $\bar D$) to take the limits inside the expectations; we see that
\[
\lim_{n\to\infty}\lim_{t\to\infty} \mathbb{E}[u(x_{t\wedge\tau_n})] = \mathbb{E}[u(x_\tau)] = 0,
\]
since $u$ vanishes on $\partial D$, and
\[
\lim_{n\to\infty}\lim_{t\to\infty} \mathbb{E}\int_0^{t\wedge\tau_n} g(x_s)\, ds = \mathbb{E}\int_0^\tau g(x_s)\, ds.
\]
This completes the proof.

Exercise 55 A sample continuous Markov process on $\mathbb{R}^2$ is a Brownian motion on the hyperbolic space (the upper half plane model) if its infinitesimal generator is $\frac12 y^2\big(\frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2}\big)$. Prove that the solution to the following equations is a hyperbolic Brownian motion:
\[
dx_t = y_t\, dB^1_t, \qquad dy_t = y_t\, dB^2_t .
\]
[Hint: Show that if $y_0 > 0$ then $y_t$ is positive, and hence the SDE can be considered to be defined on the upper half plane. Compute its infinitesimal generator $L$.]

Solution: Just observe that $y_t = y_0\, e^{B^2_t - t/2}$, so $y_t > 0$ if $y_0 > 0$. The generator is computed by Itô's formula: since $\langle x\rangle_t = \langle y\rangle_t = \int_0^t y_s^2\, ds$ and $\langle x, y\rangle_t = 0$ (the driving Brownian motions are independent), for $f\in C^2$,
\[
df(x_t, y_t) = f_x\, y_t\, dB^1_t + f_y\, y_t\, dB^2_t + \frac12 y_t^2\, (f_{xx} + f_{yy})\, dt,
\]
so $L = \frac12 y^2\big(\frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2}\big)$, as required.

Exercise 56 Discuss the existence and uniqueness problem for the SDE
\[
dx_t = \sin(x_t)\, dB^1_t + \cos(x_t)\, dB^2_t .
\]

Solution: The functions $\sin(x)$ and $\cos(x)$ are $C^1$ with bounded derivatives, and hence globally Lipschitz continuous. For each initial value there is a unique global strong solution.

Exercise 57 Let $(x_t)$ and $(y_t)$ be solutions to the following respective SDEs (in Stratonovich form),
\[
x_t = x_0 - \int_0^t y_s \circ dB_s, \qquad y_t = y_0 + \int_0^t x_s \circ dB_s .
\]
Show that $x_t^2 + y_t^2$ is independent of $t$.

Solution: Let us rewrite the SDEs in Itô form:
\[
x_t = x_0 - \int_0^t y_s\, dB_s - \frac12 \langle y, B\rangle_t, \qquad
y_t = y_0 + \int_0^t x_s\, dB_s + \frac12 \langle x, B\rangle_t .
\]
We can calculate the bracket processes in these expressions:
\[
\langle y, B\rangle_t = \Big\langle \int_0^\cdot x_s\, dB_s,\ B\Big\rangle_t = \int_0^t x_s\, ds, \qquad
\langle x, B\rangle_t = -\Big\langle \int_0^\cdot y_s\, dB_s,\ B\Big\rangle_t = -\int_0^t y_s\, ds .
\]
Moreover, the quadratic variations of $(x_t)$ and $(y_t)$ are
\[
\langle x\rangle_t = \int_0^t y_s^2\, ds, \qquad \langle y\rangle_t = \int_0^t x_s^2\, ds .
\]
Applying now Itô's formula to the function $f(x,y) = x^2 + y^2$, we obtain
\begin{align*}
f(x_t, y_t) &= f(x_0, y_0) + 2\int_0^t x_s\, dx_s + 2\int_0^t y_s\, dy_s + \langle x\rangle_t + \langle y\rangle_t \\
&= f(x_0, y_0) - 2\int_0^t x_s y_s\, dB_s - \int_0^t x_s\, d\langle y, B\rangle_s + 2\int_0^t x_s y_s\, dB_s + \int_0^t y_s\, d\langle x, B\rangle_s + \langle x\rangle_t + \langle y\rangle_t \\
&= f(x_0, y_0) - \int_0^t x_s^2\, ds - \int_0^t y_s^2\, ds + \langle x\rangle_t + \langle y\rangle_t = f(x_0, y_0),
\end{align*}
which is independent of $t$.
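The conservation law can be observed numerically by integrating the Stratonovich system with the Heun predictor-corrector scheme, which is consistent with Stratonovich integrals (a sketch with illustrative parameters):

```python
import numpy as np

# Integrate the Stratonovich system dx = -y o dB, dy = x o dB with the Heun
# (predictor-corrector) scheme, and watch x^2 + y^2 stay constant.
rng = np.random.default_rng(8)
n_steps, t_final = 20_000, 1.0
dt = t_final / n_steps

x, y = 1.0, 0.0
for dB in rng.normal(0.0, np.sqrt(dt), size=n_steps):
    # predictor (Euler) step
    xp, yp = x - y * dB, y + x * dB
    # corrector: average the diffusion coefficient at both ends of the step
    x, y = x - 0.5 * (y + yp) * dB, y + 0.5 * (x + xp) * dB

print(x**2 + y**2)   # close to x_0^2 + y_0^2 = 1
```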

Exercise 58
1. Let $(h_t)$ be a deterministic real valued process with $\int_0^\infty (h_s)^2\, ds < \infty$. Let $(B_t)$ be an $\mathcal F_t$-Brownian motion on $(\Omega,\mathcal F,\mathcal F_t, P)$. Prove that
\[
\exp\Big(\int_0^t h_s\, dB_s - \frac12 \int_0^t (h_s)^2\, ds\Big)
\]
is a martingale.

2. Let $(h_t)$ be a bounded, continuous and adapted real valued stochastic process. Prove that
\[
\exp\Big(\int_0^t h_s\, dB_s - \frac12 \int_0^t (h_s)^2\, ds\Big)
\]
is a martingale.

3. Let $Q, P$ be two equivalent probability measures on $\mathcal F_t$ with
\[
\frac{dQ}{dP} = \exp\Big(\int_0^t h_s\, dB_s - \frac12 \int_0^t (h_s)^2\, ds\Big).
\]
Define $\tilde B_t = B_t - \int_0^t h_s\, ds$ for $(h_s)$ given in part (1). Prove that $(\tilde B_t)$ is an $(\mathcal F_t)$-Brownian motion with respect to $Q$.

Solution: Let
\[
M_t := \exp\Big(\int_0^t h_s\, dB_s - \frac12 \int_0^t (h_s)^2\, ds\Big).
\]
By Itô's formula, $M_t$ satisfies the equation $dx_t = h_t x_t\, dB_t$ with initial value $1$, and so $(M_t)$ is a local martingale. That it is a martingale follows from the following arguments.

First proof. We use Novikov's condition: if $\mathbb{E}\exp\big(\frac12 \langle N,N\rangle_t\big)$ is finite for each $t$, then $\exp\big(N_t - \frac12 \langle N,N\rangle_t\big)$ is a martingale. In our case
\[
\mathbb{E}\exp\Big(\frac12 \Big\langle \int_0^\cdot h_s\, dB_s\Big\rangle_t\Big) = \mathbb{E}\exp\Big(\frac12 \int_0^t (h_s)^2\, ds\Big) < \infty
\]
in either case (1) or (2).

Second proof for part (1). Note that
\[
\langle M, M\rangle_t = \int_0^t (h_s)^2 (M_s)^2\, ds.
\]
Since $M_t - 1$ is a local martingale starting from $0$, we apply the Burkholder-Davis-Gundy inequality,
\[
\mathbb{E}(M_t)^2 \le 2\,\mathbb{E}(M_t - 1)^2 + 2 \le c\, \mathbb{E}\langle M,M\rangle_t + 2.
\]
Thus
\[
\mathbb{E}\langle M,M\rangle_t \le 2\int_0^t (h_s)^2\, ds + c\int_0^t (h_s)^2\, \mathbb{E}\langle M,M\rangle_s\, ds.
\]
By a version of Gronwall's inequality,
\[
\mathbb{E}\langle M,M\rangle_t \le 2\int_0^t (h_s)^2\, ds\; \exp\Big(c\int_0^t (h_r)^2\, dr\Big).
\]
Hence $(M_t)$ is a local martingale bounded in $L^2$ on compact intervals, and so a martingale. This proves (1).

Second proof for part (2). From $M_t = 1 + \int_0^t h_s M_s\, dB_s$,
\[
\mathbb{E}(M_t)^2 \le 2 + 2\int_0^t \mathbb{E}(h_s M_s)^2\, ds.
\]
If $(h_s)$ is bounded, Gronwall's inequality gives $\mathbb{E}(M_t)^2 \le 2 e^{2|h|_\infty^2 t}$, and this proves that $(M_t)$ is an $L^2$ bounded martingale in case (2).

To prove part (3), let us define $N_t = \int_0^t h_s\, dB_s$. Then
\[
\frac{dQ}{dP} = \exp\Big(N_t - \frac12 \langle N\rangle_t\Big),
\]
and $\tilde B_t = B_t - \langle B, N\rangle_t$. First observe that the exponential martingale of $(N_t)$ is a martingale, c.f. part (1), and $Q$ is a probability measure, equivalent to $P$. By the Girsanov theorem, $(\tilde B_t)$ is an $\mathcal F_t$ local martingale with respect to $Q$. Since $\langle \tilde B\rangle_t = \langle B\rangle_t = t$, it follows from the Lévy characterisation theorem that $(\tilde B_t)$ is a Brownian motion with respect to $Q$.