Solutions to HW: 1 Course: Theory of Probability II Page: 1 of 6
University of Texas at Austin
Solutions to HW Assignment 1
Problem 1.1. Let (Ω, F, {F_n}_{n≥0}, P) be a filtered probability space and T a stopping time.
(1) Show that
F_T := {A ∈ F : A ∩ {T = n} ∈ F_n for all n}
is a sigma-field.
(2) If {X_n}_n is an adapted process, then the process {X_{n∧T}}_n is also adapted, and the random variable X_T 1_{T<∞} is F_T-measurable.
Solution:
(1) This item is an easy application of the definition of a sigma-algebra.
(2) Let B be any Borel measurable subset of R (or, in general, any measurable subset of the state space). Then we have
{X_{n∧T} ∈ B} = ∪_{k=0}^{n−1} ({T = k} ∩ {X_k ∈ B}) ∪ ({T ≥ n} ∩ {X_n ∈ B}) ∈ F_n.
In addition,
{X_T 1_{T<∞} ∈ B} ∩ {T = n} = {X_n ∈ B} ∩ {T = n} ∈ F_n
for each n, so
{X_T 1_{T<∞} ∈ B} ∈ F_T
by the definition of F_T.
Problem 1.2. Let 0 ≤ S ≤ T be two stopping times with respect to the discrete-time filtration (F_n)_{n≥0}. Let A ∈ F_S. Show that the random time T′ defined by
T′ = S 1_A + T 1_{A^c}
is also a stopping time.
Solution: We have that
{T′ = n} = ({S = n} ∩ A) ∪ ({S ≤ n} ∩ {T = n} ∩ A^c) ∈ F_n,
since A ∈ F_S gives {S = n} ∩ A ∈ F_n, and {S ≤ n} ∩ A^c = {S ≤ n} \ (A ∩ {S ≤ n}) ∈ F_n. (Note that {T = n} ⊂ {S ≤ n} because S ≤ T.)
Problem 1.3. (1) (Doob’s decomposition). Let {X_n, F_n} be a submartingale. Show that it can be decomposed uniquely as
X_n = M_n + A_n, n = 0, 1, . . . ,
where {M_n, F_n} is a martingale and the process {A_n}_n is increasing, predictable with respect to the filtration {F_n}_n, and A_0 = 0. Find the processes M and A in terms of the process X.
(2) Let X_n = ∑_{k=1}^n 1_{B_k} for B_k ∈ F_k. What is the Doob decomposition of X_n?
Solution: 1. Use the decomposition relation for n+1 and for n and subtract term by term to get
X_{n+1} − X_n = M_{n+1} − M_n + (A_{n+1} − A_n).
Taking the conditional expectation with respect to F_n and using the martingale property of M and the fact that A is predictable, we obtain
E[X_{n+1} − X_n | F_n] = A_{n+1} − A_n.
Now
A_n = A_0 + (A_1 − A_0) + · · · + (A_n − A_{n−1}) = E[X_1 − X_0 | F_0] + · · · + E[X_n − X_{n−1} | F_{n−1}].
Instructor: Mihai Sırbu Semester: Spring 2016
We can also see that
M_{n+1} − M_n = X_{n+1} − X_n − (A_{n+1} − A_n) = X_{n+1} − E[X_{n+1} | F_n],
and since M_0 = X_0 we have
M_n = X_0 + (X_1 − E[X_1 | F_0]) + · · · + (X_n − E[X_n | F_{n−1}]).
So far we have only shown uniqueness, but the existence of the decomposition is easily proven by taking these relations as the definitions of M and A.
2. Since X_n − X_{n−1} = 1_{B_n} and E[1_{B_n} | F_{n−1}] = P(B_n | F_{n−1}), we conclude that
A_n = ∑_{i=1}^n P(B_i | F_{i−1})
and
M_n = X_n − A_n = ∑_{i=1}^n (1_{B_i} − P(B_i | F_{i−1})).
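As a sanity check (not part of the assignment), the decomposition in part 2 can be simulated. The sketch below makes the hypothetical choice B_k = {k-th coin flip shows heads}, so B_k is independent of F_{k−1}, P(B_k | F_{k−1}) = p, A_n = np, and M_n = X_n − np; the sample average of M_n should then stay near E[M_n] = 0 for every n:

```python
import random

random.seed(0)

# Doob decomposition of X_n = sum_{k<=n} 1_{B_k}.
# Hypothetical choice: B_k = {k-th flip is heads}, P(heads) = p, so
# B_k is independent of F_{k-1}, P(B_k | F_{k-1}) = p, and the
# compensator is A_n = n*p, giving the martingale M_n = X_n - n*p.
p = 0.3
n_steps, n_paths = 50, 20000

mean_M = [0.0] * (n_steps + 1)
for _ in range(n_paths):
    X = 0
    for k in range(1, n_steps + 1):
        X += 1 if random.random() < p else 0
        mean_M[k] += (X - k * p) / n_paths   # average of M_k over paths

# For a martingale started at M_0 = 0, E[M_n] = 0 for all n.
print(max(abs(m) for m in mean_M))
```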
Problem 1.4. Let Y_1, Y_2, . . . be non-negative i.i.d. random variables with E[Y_n] = 1 and P(Y_n = 1) < 1.
(1) Show that X_n = ∏_{k=1}^n Y_k is a martingale.
(2) Use the martingale convergence theorem to conclude that X_n → 0 a.s.
(3) Use the SLLN to show that
log(X_n)/n → c < 0, a.s.
Solution: 1. E[X_{n+1} | F_n] = E[X_n Y_{n+1} | F_n] = X_n E[Y_{n+1} | F_n] = X_n, since Y_{n+1} is independent of the sigma-algebra F_n generated by X_1, . . . , X_n.
3. We have that log(X_n) = log(Y_1) + · · · + log(Y_n). The hypotheses ensure that (log Y)^+ ∈ L^1 (by Jensen) and E[log Y] < 0 (it CAN be −∞). If the expectation is finite, then we can apply the SLLN directly. If the expectation is −∞, we can TRUNCATE log Y_n at a negative level −C and then let C → ∞ to conclude that
(log(Y_1) + · · · + log(Y_n))/n → E[log Y] ∈ [−∞, 0).
2. Since log(X_n)/n → c < 0, we get X_n → 0.
One can also prove this part directly, using the martingale convergence theorem for the non-negative martingale X_n, concluding that X_n → X_∞ ≥ 0. Now, if X_∞ > 0, then Y_n = X_n/X_{n−1} → 1, and we can obtain a contradiction.
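The conclusions of parts 2 and 3 can be observed numerically. The following sketch uses the hypothetical choice Y uniform on {0.5, 1.5} (so E[Y] = 1 and c = E[log Y] = (1/2) log(3/4) < 0) and checks that log(X_n)/n concentrates near c, which forces X_n → 0:

```python
import math
import random

random.seed(1)

# Y uniform on {0.5, 1.5}: E[Y] = 1, P(Y = 1) = 0 < 1, and
# c = E[log Y] = 0.5*(log 0.5 + log 1.5) = 0.5*log(0.75) < 0.
c = 0.5 * math.log(0.75)

n, paths = 2000, 500
slopes = []
for _ in range(paths):
    log_X = sum(math.log(random.choice((0.5, 1.5))) for _ in range(n))
    slopes.append(log_X / n)      # SLLN: log(X_n)/n -> c a.s.

avg_slope = sum(slopes) / paths
print(avg_slope, c)               # the two numbers are close
# Since c < 0, X_n = exp(n * (log(X_n)/n)) -> 0 a.s.
```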
Problem 1.5. (An example of a uniformly integrable martingale which is not in H^1). Consider Ω = N, F = 2^N and the probability measure P defined by
P[{k}] = 2^{−k}, k = 1, 2, . . . .
Consider now the random variable K such that K(k) = k, k = 1, 2, . . . .
(1) Find an explicit example of a random variable Y : Ω → [1, ∞) such that E[Y] < ∞ and E[Y K] = ∞.
(2) Consider the filtration
F_n = σ({1}, {2}, . . . , {n−1}, {n, n+1, . . . }).
Find an expression for X_n = E[Y | F_n].
(3) Show that the martingale (X_n)_n is not in H^1 (note that the martingale is UI by the way it is defined).
Solution:
(1) We can define Y(k) = 2^k · (1/k^2).
(2) It is clear (though the formal proof is not very short!) that
X_n(k) = Y(k), k = 1, 2, . . . , n−1,
and
X_n(n) = X_n(n+1) = · · · = 2^{−1}Y(n) + 2^{−2}Y(n+1) + · · · .
(3) We see that X*(n) ≥ X_n(n), so
E[X*] ≥ ∑_{n=1}^∞ 2^{−n} (∑_{k=n}^∞ 2^{n−k−1} 2^k (1/k^2)) = (1/2) ∑_{n=1}^∞ ∑_{k=n}^∞ 1/k^2 = (1/2) ∑_{k=1}^∞ k · (1/k^2) = (1/2) ∑_{k=1}^∞ 1/k = ∞.
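The two series behind the choice Y(k) = 2^k/k^2 can be checked with partial sums: E[Y] = ∑_k 1/k^2 converges (to π^2/6), while E[Y K] = ∑_k 1/k is the harmonic series. A quick numerical sketch:

```python
import math

# Under P[{k}] = 2^{-k} with Y(k) = 2^k / k^2:
#   E[Y]   = sum_k 2^{-k} * (2^k/k^2)     = sum_k 1/k^2 = pi^2/6 < inf,
#   E[Y K] = sum_k 2^{-k} * (2^k/k^2) * k = sum_k 1/k   = inf.
def partial_EY(n):
    return sum(1.0 / k**2 for k in range(1, n + 1))

def partial_EYK(n):
    return sum(1.0 / k for k in range(1, n + 1))

print(partial_EY(10**6))    # approaches pi^2/6 = 1.6449...
print(partial_EYK(10**6))   # grows like log n: about 14.39 here
```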
Problem 1.6. (1) Suppose (X_n)_n is a submartingale such that E[|X_{n+1} − X_n| | F_n] ≤ B a.s. for each n, where B is a constant. Show that if N is a stopping time such that E[N] < ∞, then (X_{n∧N})_n is u.i., so, according to the optional sampling theorem, E[X_0] ≤ E[X_N].
(2) (Wald I): Let ξ_1, ξ_2, . . . be i.i.d. such that ξ_i ∈ L^1. If S_n = ξ_1 + · · · + ξ_n and N is a stopping time such that E[N] < ∞, then
E[S_N] = E[N] E[ξ].
Please note that this implies that the first hitting time of level one for the symmetric random walk has INFINITE expectation.
Solution: 1. sup_n |X_{n∧N}| ≤ |X_0| + |X_1 − X_0| + · · · + |X_N − X_{N−1}| = |X_0| + ∑_{n=0}^∞ |X_{n+1} − X_n| 1_{N>n}.
Taking into account that {N > n} ∈ F_n, we can condition each term on F_n to conclude that
E[sup_n |X_{n∧N}|] ≤ E[|X_0|] + ∑_{n=0}^∞ E[E[|X_{n+1} − X_n| | F_n] 1_{N>n}] ≤ E[|X_0|] + B ∑_{n=0}^∞ P(N > n) = E[|X_0|] + B E[N] < ∞.
This means that the sequence (X_{n∧N})_n is dominated by an integrable random variable, hence u.i.
2. Define the process M_n = S_n − n E[ξ]. We have M_{n+1} − M_n = ξ_{n+1} − E[ξ_{n+1}], so E[M_{n+1} − M_n | F_n] = 0, and
E[|M_{n+1} − M_n| | F_n] ≤ E[|ξ − E[ξ]|].
This means that M is a martingale and we can apply Part 1 (to both M and −M, so the inequality becomes an equality) to get E[M_N] = 0, which means E[S_N] = E[N] E[ξ].
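Wald’s identity can be illustrated by simulation. A hypothetical minimal setup: ξ_i i.i.d. Bernoulli(p) and N the first index with ξ_N = 1, so N is a stopping time with E[N] = 1/p and S_N = 1 on every path; Wald then reads 1 = E[N]·E[ξ] = (1/p)·p:

```python
import random

random.seed(2)

# xi_i i.i.d. Bernoulli(p); N = inf{n : xi_n = 1} is a stopping time
# (geometric, E[N] = 1/p), and S_N = 1 on every path, so Wald's
# identity E[S_N] = E[N] * E[xi] becomes 1 = (1/p) * p.
p = 0.25
trials = 100_000
total_N = 0
for _ in range(trials):
    n = 1
    while random.random() >= p:   # keep drawing until the first success
        n += 1
    total_N += n

EN = total_N / trials
print(EN * p)   # estimate of E[N] * E[xi]; should be close to E[S_N] = 1
```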
Problem 1.7. (Quadratic variation of square integrable martingales). Consider a square integrable martingale, by which we mean that X_0 = 0 and X_n ∈ L^2 for each n. We can define the square bracket (or the quadratic variation) of the martingale by
[X,X]_n = ∑_{k=1}^n (X_k − X_{k−1})^2.
At the same time, we define the angle bracket, or the predictable quadratic variation process ⟨X,X⟩, as the predictable increasing process in the Doob decomposition
X^2 = M + ⟨X,X⟩.
Show that
(1) ⟨X,X⟩_n = ∑_{k=1}^n E[(X_k − X_{k−1})^2 | F_{k−1}], and the predictable process ⟨X,X⟩ is the compensator in the Doob decomposition of [X,X].
(2) Let T be a stopping time (possibly unbounded and infinite). Show that
E[⟨X,X⟩_T] = E[[X,X]_T].
(Note that the random variables above are actually well defined.)
(3) Show that
E[sup_{0≤k≤T} |X_k|^2] ≤ 4 E[⟨X,X⟩_T] = 4 E[[X,X]_T].
(4) Assume that
E[⟨X,X⟩_T] < ∞.
Then the processes
X^2_{n∧T} − [X,X]_{n∧T}, n = 0, 1, 2, . . . ,
X^2_{n∧T} − ⟨X,X⟩_{n∧T}, n = 0, 1, 2, . . . ,
are uniformly integrable martingales. In particular, this means that X_∞ is well defined on the set {T = ∞} for such a stopping time T, and
E[X^2_T] = E[⟨X,X⟩_T] = E[[X,X]_T].
(5) Show that X_∞ = lim_n X_n exists and is finite on {⟨X,X⟩_∞ < ∞}.
Solution:
(1) Using the “fundamental property of martingales”, we can easily check that X^2 − [X,X] is a martingale. Therefore, the predictable compensator of X^2 (in the Doob decomposition) is the same as the predictable compensator of [X,X]. Writing now the explicit representation for the predictable compensator of [X,X], we get the answer.
(2) According to the item above, [X,X] − ⟨X,X⟩ is a martingale. In particular, for each fixed n we have
E[⟨X,X⟩_{n∧T}] = E[[X,X]_{n∧T}].
We can now let n → ∞ and use Monotone Convergence.
(3) We apply the Maximal Inequality to the stopped process X^T up to time n, and use the martingale property of the process X^2 − [X,X] stopped at T between times 0 and n. After that we let n go to infinity.
(4) Using the previous item, we see that both random variables in the difference are bounded by something integrable (as in class for continuous martingales).
(5) Let T_k = inf{n : ⟨X,X⟩_{n+1} ≥ k}. Because ⟨X,X⟩ is predictable, T_k is a stopping time. We also have that
⟨X,X⟩_{T_k} ≤ k and {⟨X,X⟩_∞ < k} ⊂ {T_k = ∞}.
According to part 4, X_∞ = lim_n X_n is well defined on {T_k = ∞}, so the conclusion follows by letting k → ∞.
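Part (2) can be checked by simulation with a martingale whose square and angle brackets differ pathwise. A hypothetical choice: i.i.d. increments D ∈ {−1, +2} with P(D = −1) = 2/3 (mean 0, E[D^2] = 2), so [X,X]_n = ∑ D_k^2 is random while ⟨X,X⟩_n = 2n is predictable, and T the first time |X_n| ≥ 10:

```python
import random

random.seed(3)

# Martingale with i.i.d. increments D in {-1, +2}, P(D=-1) = 2/3:
# E[D] = 0, E[D^2] = 2.  Then [X,X]_n = sum D_k^2 (random) while
# <X,X>_n = 2n (predictable).  With T = inf{n : |X_n| >= 10}, part (2)
# asserts E[<X,X>_T] = E[[X,X]_T]; compare the two Monte Carlo means.
trials = 20_000
sum_square = sum_angle = 0
for _ in range(trials):
    X = qv = n = 0
    while abs(X) < 10:
        D = -1 if random.random() < 2 / 3 else 2
        X += D
        qv += D * D
        n += 1
    sum_square += qv        # [X,X]_T on this path
    sum_angle += 2 * n      # <X,X>_T on this path

print(sum_square / trials, sum_angle / trials)   # nearly equal
```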
Problem 1.8. (Wald II) Use the notation of exercise 9.4.2 and assume that E[ξ] = 0 and E[ξ^2] = σ^2 < ∞. Use exercise 9.5.3 to show that if N is a stopping time such that E[N] < ∞, then E[S^2_N] = σ^2 E[N].
Solution: Since E[ξ] = 0, it is an easy exercise to show that (S_n)_n is a martingale. Denote Y_n = S^2_n − nσ^2. We have that
Y_{n+1} − Y_n = S^2_{n+1} − S^2_n − σ^2 = (S_n + ξ_{n+1})^2 − S^2_n − σ^2 = 2 S_n ξ_{n+1} + ξ^2_{n+1} − σ^2,
so
E[Y_{n+1} − Y_n | F_n] = 2 S_n E[ξ_{n+1}] + E[ξ^2_{n+1}] − σ^2 = 0,
which means that Y is a martingale. This shows that the process A_n = nσ^2 is the predictable quadratic variation of the martingale S. We can now apply the previous problem to finish the proof.
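The identity E[S^2_N] = σ^2 E[N] can also be observed numerically. A hypothetical test case: the symmetric ±1 walk (σ^2 = 1) stopped at the exit time N of the interval (−a, b), for which both E[N] and E[S^2_N] classically equal a·b:

```python
import random

random.seed(4)

# Symmetric +-1 walk (sigma^2 = 1), N = first exit time of (-a, b).
# Wald II gives E[S_N^2] = E[N]; classically both equal a*b.
a, b = 4, 6
trials = 20_000
sum_N = sum_S2 = 0
for _ in range(trials):
    S = n = 0
    while -a < S < b:
        S += 1 if random.random() < 0.5 else -1
        n += 1
    sum_N += n
    sum_S2 += S * S

print(sum_N / trials, sum_S2 / trials)   # both close to a*b = 24
```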
Problem 1.9. (de la Vallée Poussin criterion) A set of random variables (X_i)_{i∈I} is u.i. if and only if there exists a function ψ : [0, ∞) → [0, ∞) such that lim_{x→∞} ψ(x)/x = ∞ and
sup_{i∈I} E[ψ(|X_i|)] < ∞.
The function ψ can be chosen to be convex.
Solution: One implication is quite easy. More precisely, if there exists such a function ψ, then for each ε > 0 there exists M(ε) such that
ψ(x)/x ≥ 1/ε, x ≥ M(ε).
Now, for any M ≥ M(ε) we have
E[|X_i| 1_{|X_i|≥M}] ≤ ε E[ψ(|X_i|) 1_{|X_i|≥M}] ≤ ε sup_{i∈I} E[ψ(|X_i|)].
For the other implication, let us recall that, if ψ is (piecewise) C^1 and ψ(0) = 0, then
E[ψ(|X|)] = ∫_0^∞ ψ′(x) P(|X| ≥ x) dx.
We also have that
E[|X| 1_{|X|≥M}] = ∫_0^∞ P(|X| 1_{|X|≥M} ≥ x) dx ≥ ∫_M^∞ P(|X| ≥ x) dx.
Using the definition of uniform integrability, we can therefore choose an increasing sequence a_n ↗ ∞ (with a_n increasing very slowly) such that
∑_n a_n sup_i ∫_n^∞ P(|X_i| ≥ x) dx < ∞.
We can now define the piecewise-C^1 function ψ by ψ(0) = 0 and
ψ′(x) = a_n, x ∈ (n, n+1),
and finish the proof.
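The criterion can be illustrated with ψ(x) = x^2 on a family that is not u.i. A hypothetical example: X_i = i·1_{U < 1/i} with U uniform on (0, 1). Every X_i has mean 1, but the tail expectations do not vanish uniformly, and correspondingly E[ψ(X_i)] = i is unbounded, consistent with the "only if" direction. The expectations are exact, so no simulation is needed:

```python
from fractions import Fraction

# X_i = i * 1_{U < 1/i}, U uniform on (0, 1):
# P(X_i = i) = 1/i and P(X_i = 0) = 1 - 1/i.
def E_X(i):
    # E[X_i] = i * (1/i) = 1 for every i: the family is bounded in L^1.
    return i * Fraction(1, i)

def E_tail(i, M):
    # E[X_i 1_{X_i >= M}] = 1 whenever i >= M: the tails do NOT vanish
    # uniformly in i, so the family is not u.i.
    return i * Fraction(1, i) if i >= M else Fraction(0)

def E_psi(i):
    # With psi(x) = x^2: E[psi(X_i)] = i^2 * (1/i) = i, unbounded in i,
    # as the de la Vallee Poussin criterion predicts for a non-u.i. family.
    return i * i * Fraction(1, i)

print(E_X(10), E_tail(100, 50), E_psi(1000))   # 1 1 1000
```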
Problem 1.10. (Backward martingales) Let (Ω, F, P) be a probability space and {F_n}_{n≥0} a “decreasing filtration”, which means that F_{n+1} ⊂ F_n. Let {X_n, F_n}_n be a backward martingale, which means that X_n ∈ L^1(Ω, F_n, P), n ≥ 0, and X_{n+1} = E[X_n | F_{n+1}]. Show that X_n converges a.s. and in L^1 to some variable X_∞ and that X_∞ = E[X_n | F_∞], where F_∞ = ∩_n F_n.
Note: you already know from Probability I that backward martingales converge a.s., so the point is to see that the convergence holds in L^1 as well.
Solution: As already seen in Probability I (and mentioned above), we know that
X_n → X_∞ ∈ F_∞, a.s.
Since X_n = E[X_0 | F_n], we know (for example from HW 1) that (X_n)_n is a UI sequence. This implies that X_∞ is integrable and that X_n → X_∞ in L^1.
Now, let A ∈ F_∞, and let k ≥ n. Since X_k = E[X_n | F_k], we have
E[X_n 1_A] = E[X_k 1_A].
Letting k → ∞ we obtain
E[X_n 1_A] = E[X_∞ 1_A], (∀) A ∈ F_∞,
or
X_∞ = E[X_n | F_∞].
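A classical instance (a hypothetical illustration, not required by the problem) is the sequence of sample means: for i.i.d. integrable ξ_1, ξ_2, . . . , the averages S_n/n form a backward martingale with respect to F_n = σ(S_n, ξ_{n+1}, ξ_{n+2}, . . . ), and the a.s. and L^1 limit is E[ξ]. A quick simulation:

```python
import random

random.seed(5)

# Backward martingale example: X_n = S_n / n for i.i.d. xi ~ uniform(0,1),
# with respect to the decreasing filtration F_n = sigma(S_n, xi_{n+1}, ...).
# The limit X_infty should be the constant E[xi] = 0.5.
paths, n = 1000, 2000
errs = []
for _ in range(paths):
    S = sum(random.random() for _ in range(n))
    errs.append(abs(S / n - 0.5))   # distance of X_n from the limit

print(sum(errs) / paths)   # small: X_n is already close to E[xi]
```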
Problem 1.11. (Backward submartingales) Let (Ω, F, P) be a probability space and {F_n}_{n≥0} a “decreasing filtration” as above. Let {X_n, F_n}_n be a backward submartingale, which means that X_n ∈ L^1(Ω, F_n, P), n ≥ 0, and X_{n+1} ≤ E[X_n | F_{n+1}]. Show that {X_n}_n is u.i. if and only if inf_n E[X_n] > −∞.
Solution: We can use a (backwards) upcrossing inequality to conclude that there exists an F_∞-measurable X_∞ such that
X_n → X_∞ a.s.
The positive part X^+_n is UI, so (splitting each X_n into X^+_n − X^−_n) it is enough to show that
E[X_n] ↘ E[X_∞].
Actually, just using Fatou for the negative part (since the positive part converges in L^1), we have that E[X_∞] ≥ lim_n E[X_n]. Now, for k ≥ n we have X_k ≤ E[X_n | F_k]. Letting k → ∞ and using the previous problem, we have that
X_∞ ≤ E[X_n | F_∞],
so E[X_∞] ≤ E[X_n], finishing the proof.