
• Foundations of Financial Economics Introduction to stochastic processes

Paulo Brito

pbrito@iseg.ulisboa.pt, University of Lisbon

May 15, 2020

• Information set
▶ The information set is given by a probability space together with a filtration,

(Ω, F, F, P)

▶ where F is the filtration,

F ≡ {F_t}_{t=0}^{T} = {F_0, F_1, . . . , F_T},

an ordered sequence of collections of subsets of Ω such that:
▶ F_0 = {Ω},
▶ F_T = F,
▶ and F_t ⊂ F_{t+1}, meaning "more information" as time passes
▶ Then we can consider a sequence of events up until time t,

W^t = {W_0, W_1, . . . , W_t}, where W_τ ∈ F_τ

• Filtration: example
Binomial information tree: for T = 3 and Ω = {ω1, . . . , ω8}

[Figure: binomial information tree. The root event {ω1, . . . , ω8} splits at t = 1 into {ω1, . . . , ω4} and {ω5, . . . , ω8}; at t = 2 into {ω1, ω2}, {ω3, ω4}, {ω5, ω6}, {ω7, ω8}; and at t = 3 into the singletons {ω1}, . . . , {ω8}]

Observation: as time passes, some events are progressively eliminated
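The refining structure of the tree above can be sketched in Python (an illustration, not part of the slides): each F_t is represented by a partition of Ω into 2^t equally sized blocks, which become finer as t grows.

```python
# Illustrative sketch (not from the slides): the binomial filtration for
# T = 3 represented as a sequence of partitions of Omega = {w1, ..., w8}.
omega = [f"w{i}" for i in range(1, 9)]

def binomial_partition(omega, t):
    """Partition omega into 2**t equal blocks: the events observable at time t."""
    n_blocks = 2 ** t
    size = len(omega) // n_blocks
    return [omega[i * size:(i + 1) * size] for i in range(n_blocks)]

# F_0 is the single block {w1,...,w8}; F_3 consists of the eight singletons
filtration = [binomial_partition(omega, t) for t in range(4)]
for t, partition in enumerate(filtration):
    print(t, partition)
```

Each block at time t is the union of exactly two blocks at time t + 1, which is the refinement property F_t ⊂ F_{t+1}.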

• Filtration: example
Sequence of events: {W_0, W_1, W_2, W_3}, where W_1 = {w_{1,1}, w_{1,2}}

[Figure: event tree. W_0 branches into w_{1,1} and w_{1,2}; w_{1,1} branches into w_{2,1} and w_{2,2}, and w_{1,2} into w_{2,3} and w_{2,4}; each time-2 event branches into two of the time-3 events w_{3,1}, . . . , w_{3,8}]

• Stochastic processes
Adapted stochastic processes
▶ Definition: the sequence of random variables

X^t = {X_0, . . . , X_t}, t ∈ T

▶ is a stochastic process adapted to the filtration F if

X_t = X(W_t), W_t ∈ F_t,

that is, if each X_t is a random variable defined on an event W_t ∈ F_t
▶ Intuition: the information available at t has the same structure as F_t, in the sense that some potential sequences are eliminated across time.

• Stochastic processes
Histories
▶ Let N = {N_t}_{t=0}^{T}, with N_0 = 1, be the sequence of the numbers of possible events at each date (equal to the numbers of nodes of an information tree representing F)
▶ We can represent an adapted stochastic process as a sequence of possible realizations: for every t ∈ {0, . . . , T},

X_t = X(W_t) = (x_{t,1}, . . . , x_{t,N_t}) ∈ R^{N_t},

where N_t is the number of possible realizations of the process at time t
▶ History: a particular realization of the process up until time t, X^t = x^t, where

x^t = {x_0, . . . , x_t}, t ∈ T

▶ The set of all histories is X^t = X(W^t)

• Probabilities
▶ Consider a particular history up until time t, X^t = x^t, with

x^t = {x_0, . . . , x_t}, t ∈ T

▶ We call unconditional probability of the history x^t

P(x^t) = P(X_t = x_t, X_{t−1} = x_{t−1}, . . . , X_0 = x_0)

▶ Then we have a sequence of unconditional probability distributions

{P_0, P_1, . . . , P_t}, where P_t = P_t(X^t), X^t ranging over all histories until time t, and

P_t = (π_{t,1}, . . . , π_{t,N_t}),

N_t being the number of nodes of the information tree at t
▶ Each distribution is normalized:

∑_{s=1}^{N_t} π_{t,s} = 1, for every t

• A binomial stochastic process

[Figure: binomial tree of realizations x_0 → {x_{1,1}, x_{1,2}} → {x_{2,1}, . . . , x_{2,4}} → {x_{3,1}, . . . , x_{3,8}}, with branch probabilities π_{1,1}, π_{1,2}, π_{2,1}, . . . , π_{3,8}]

▶ The process {X_0, X_1, X_2, X_3} has 8 possible histories
▶ The sequence of unconditional probability distributions is {1, P_1, P_2, P_3}, where

∑_{s=1}^{2} π_{1,s} = ∑_{s=1}^{4} π_{2,s} = ∑_{s=1}^{8} π_{3,s} = 1
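The counting above can be checked numerically. A minimal sketch (with assumed, possibly time-varying branch probabilities, not taken from the slides) that enumerates the 2^t histories and verifies that each unconditional distribution sums to one:

```python
# Sketch with assumed branch probabilities: enumerate the histories of a
# binomial process and check the normalization of each distribution P_t.
from itertools import product

pi_up = {1: 0.5, 2: 0.6, 3: 0.7}  # assumed P(up move) at each step

def history_prob(hist):
    """Unconditional probability of a history: product of branch probabilities."""
    p = 1.0
    for step, move in enumerate(hist, start=1):
        p *= pi_up[step] if move == "u" else 1.0 - pi_up[step]
    return p

for t in range(1, 4):
    hists = list(product("ud", repeat=t))
    total = sum(history_prob(h) for h in hists)
    print(t, len(hists), total)  # 2**t histories; probabilities sum to 1
```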

• Transition probabilities
▶ The conditional probability of x_{t+1} given a particular history x^t is

P(x_{t+1} | x^t) = P(X_{t+1} = x_{t+1} | X_t = x_t, X_{t−1} = x_{t−1}, . . . , X_0 = x_0)
= P(X_{t+1} = x_{t+1}, X_t = x_t, . . . , X_0 = x_0) / P(X_t = x_t, X_{t−1} = x_{t−1}, . . . , X_0 = x_0)   (1)

▶ Definition: we call transition probability for X_{t+h} = x_{t+h}, given the information at time t,

P_t(x_{t+h}) = P(X_{t+h} = x_{t+h} | X_t = x_t),

denoted P_{t+h|t} = P_t(x_{t+h}), where

P_{t+h|t} = (π_{t+h|t,1}, . . . , π_{t+h|t,N_{t+h|t}})

and N_{t+h|t} is the number of nodes at t + h that are reachable from the node x_{t,s}
▶ Each transition distribution is normalized:

∑_{s=1}^{N_{t+h|t}} π_{t+h|t,s} = 1, for every t
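Formula (1) can be illustrated numerically. A sketch (all numbers assumed, i.i.d. branches for simplicity) computing a one-step conditional probability as a ratio of history probabilities:

```python
# Sketch (assumed i.i.d. branch probabilities): the conditional probability
# of the next move as a ratio of history probabilities, as in formula (1).
pi = 0.6  # assumed probability of an "up" move

def history_prob(hist):
    """P(history) = product of the branch probabilities along the path."""
    p = 1.0
    for move in hist:
        p *= pi if move == "u" else 1.0 - pi
    return p

hist = ("u", "d")
# P(next move is up | history) = P(history followed by up) / P(history)
cond = history_prob(hist + ("u",)) / history_prob(hist)
print(cond)
```

With independent branches the ratio collapses back to pi, as expected; with history-dependent branch probabilities the same ratio would give a genuinely path-dependent transition probability.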

• A binomial stochastic process, after a t = 1 realization

[Figure: remaining subtree after x_{1,1} is realized: x_0 → x_{1,1} → {x_{2,1}, x_{2,2}} → {x_{3,1}, . . . , x_{3,4}}, with conditional branch probabilities π_{2,s|1,1} and π_{3,s|1,1}]

Conditional probabilities satisfy: ∑_{s=1}^{2} π_{2,s|1,1} = ∑_{s=1}^{4} π_{3,s|1,1} = 1

• A binomial stochastic process, after t = 1 and t = 2 realizations

[Figure: remaining subtree after x_{1,1} and x_{2,2} are realized: x_0 → x_{1,1} → x_{2,2} → {x_{3,3}, x_{3,4}}, with conditional branch probabilities π_{3,3|2,2;1,1} and π_{3,4|2,2;1,1}]

Conditional probabilities satisfy: π_{3,3|2,2;1,1} + π_{3,4|2,2;1,1} = 1

• Markovian processes
▶ Definition: a stochastic process has the Markov property if the probability conditional on a history is the same as the probability conditional on the last realization,

P(X_{t+h} = x_{t+h} | X^t = x^t) = P(X_{t+h} = x_{t+h} | X_t = x_t)

▶ In other words: the transition probability from X_t = x_t is equal to the probability conditional on the whole history up until time t,

P_{t+h|t} = P_t(x_{t+h}) ≡ P(X_{t+h} = x_{t+h} | X_t = x_t)

▶ Observe that a general property of adapted processes is that the unconditional probability of X_t = x_t is equal to the probability of the history x^t, i.e.,

P_t = P_0(x_t) = P(X_t = x_t | X_0 = x_0) = P(x^t)

▶ Then Markov processes verify the following relationship between conditional and unconditional probabilities:

P_{t+1} = P_{t+1|t} ◦ P_t
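The relationship P_{t+1} = P_{t+1|t} ◦ P_t can be checked on a binomial tree. A sketch with an assumed constant up-probability p: each node's probability at t is split into its two successors' probabilities at t + 1.

```python
# Sketch (assumed Markov binomial tree): build each unconditional
# distribution P_{t+1} by composing the one-step transition probabilities
# with P_t: a node with probability q splits into p*q (up) and (1-p)*q (down).
p = 0.6  # assumed up-probability

P = [[1.0]]  # P_0: the root has probability 1
for t in range(3):
    P.append([branch for q in P[-1] for branch in (p * q, (1.0 - p) * q)])

for t, dist in enumerate(P):
    print(t, len(dist), sum(dist))  # N_t = 2**t nodes; each P_t sums to 1
```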

• A Markovian binomial process

[Figure: binomial tree of realizations x_0 → {x_{1,1}, x_{1,2}} → {x_{2,1}, . . . , x_{2,4}} → {x_{3,1}, . . . , x_{3,8}}, with constant branch probabilities π (up) and 1 − π (down); the history probabilities at t = 3 are the products π³, π²(1 − π), π(1 − π)², . . . , (1 − π)³]

• A Markovian binomial process after a t = 1 realization

[Figure: remaining subtree x_0 → x_{1,1} → {x_{2,1}, x_{2,2}} → {x_{3,1}, . . . , x_{3,4}}, with branch probabilities π and 1 − π; the conditional history probabilities at t = 3 are π², π(1 − π), (1 − π)π, (1 − π)²]

• A Markovian binomial process after t = 1 and t = 2 realizations

[Figure: remaining subtree x_0 → x_{1,1} → x_{2,2} → {x_{3,3}, x_{3,4}}, with branch probabilities π and 1 − π]

• Mathematical expectation for stochastic processes
▶ The unconditional mathematical expectation of X_t is a number

E_0[x_t] = E[x_t | x_0] = ∑_{s=1}^{N_t} P_0(x_{t,s}) x_{t,s} = ∑_{s=1}^{N_t} π_{t,s} x_{t,s}

▶ The unconditional variance of X_t is

V_0[x_t] = V[x_t | x_0] = E_0[(x_t − E_0(x_t))²] = ∑_{s=1}^{N_t} π_{t,s} (x_{t,s} − E_0(x_t))².

▶ The conditional mathematical expectation

E_τ[x_t] = E[x_t | x^τ]

is an adapted stochastic process, because it takes one value per time-τ node,

E_τ[x_t] = (E_{τ,1}(x_t), . . . , E_{τ,N_τ}(x_t)), where

E_{τ,i}[x_t] = ∑_{j=1}^{N_{t|τ,i}} P(X_t = x_{t,j} | x_τ) x_{t,j} = ∑_{j=1}^{N_{t|τ,i}} π_{t|τ,j} x_{t,j}, i = 1, . . . , N_τ,

and ∑_{i=1}^{N_τ} N_{t|τ,i} = N_t
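The unconditional moments can be computed directly from a probability vector. A sketch with assumed realizations x_{t,s} and probabilities π_{t,s} (numbers are illustrative, not from the slides):

```python
# Sketch (assumed numbers): unconditional expectation and variance of X_t
# from the realizations x_{t,s} and the probabilities pi_{t,s}.
x = [110.0, 95.0, 90.0, 80.0]   # assumed realizations at time t
pi = [0.25, 0.25, 0.30, 0.20]   # assumed probabilities (sum to 1)

mean = sum(p * v for p, v in zip(pi, x))               # E_0[x_t]
var = sum(p * (v - mean) ** 2 for p, v in zip(pi, x))  # V_0[x_t]
print(mean, var)
```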

• Properties of conditional mathematical expectation: E_t
▶ if A is a constant,

E_t[A] = A

▶ if X^t = {x_τ}_{τ=0}^{t} is an adapted process,

E_t[x_t] = x_t

▶ law of iterated expectations:

E_{t−s}[E_t[x_{t+r}]] = E_{t−s}[x_{t+r}], s > 0, r > 0

this is a very important property: the expectation is effectively taken at the time at which we have the least information
▶ if Y_t is a predictable process (i.e., y_{t+1} is known given F_t),

E_t[y_{t+1}] = y_{t+1}
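The law of iterated expectations can be verified on a small tree. A sketch with an assumed two-step binomial tree (all numbers illustrative):

```python
# Sketch (assumed two-step binomial tree): check E_0[E_1[x_2]] = E_0[x_2].
p = 0.6  # assumed up-probability at each step
x2 = {"uu": 121.0, "ud": 99.0, "du": 99.0, "dd": 81.0}  # assumed realizations

# E_1[x_2] at each time-1 node (a random variable at t = 0, hence adapted)
e1_up = p * x2["uu"] + (1 - p) * x2["ud"]
e1_down = p * x2["du"] + (1 - p) * x2["dd"]

lhs = p * e1_up + (1 - p) * e1_down                       # E_0[E_1[x_2]]
rhs = (p * p * x2["uu"] + p * (1 - p) * x2["ud"]
       + (1 - p) * p * x2["du"] + (1 - p) ** 2 * x2["dd"])  # E_0[x_2] directly
print(lhs, rhs)
```

Averaging the time-1 conditional expectations under the time-0 probabilities reproduces the direct time-0 expectation, exactly as the iterated-expectations property requires.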

• Martingales

▶ Definition: a process X = {X_τ}_{τ=0}^{T} has the martingale property if

E_t[x_{t+r}] = x_t, r > 0

▶ Definition: it is a super-martingale if

E_t[x_{t+r}] ≤ x_t, r > 0

▶ Definition: it is a sub-martingale if

E_t[x_{t+r}] ≥ x_t, r > 0

• Example
▶ Let

x_{t+1} = (u × x_t, d × x_t) = (u, d) x_t,

where d and u are known constants such that 0 < d < u
▶ and assume that

P_{t+1|t} = (P(x_{t+1} = u × x_t | x_t), P(x_{t+1} = d × x_t | x_t)) = (p, 1 − p), for 0 < p < 1

▶ Then the conditional mathematical expectation is

E_t[x_{t+1}] = (p u + (1 − p) d) x_t.

▶ If p u + (1 − p) d = 1, then E_t[x_{t+1}] = x_t, that is, X is a martingale.
▶ Intuition: the martingale property depends jointly on the possible realisations of the stochastic process and on its probability sequence.
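A numerical version of the example (p and u assumed; d is then chosen so that the martingale condition holds):

```python
# Sketch: choose d so that p*u + (1-p)*d = 1, making the process a martingale.
p, u = 0.5, 1.2                 # assumed probability and up-factor
d = (1.0 - p * u) / (1.0 - p)   # solves p*u + (1-p)*d = 1
x_t = 100.0                     # assumed current realization

e_next = (p * u + (1 - p) * d) * x_t  # E_t[x_{t+1}]
print(d, e_next)  # e_next equals x_t: the martingale property holds
```

Note that 0 < d < u is preserved here, so the example stays within the assumptions of the slide.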

• Wiener process (or standard Brownian motion)

▶ The process X = {X_t, t ∈ [0, T)} is a Wiener process if:

x_0 = 0, E_0[X_t] = 0, V_0[X_t − X_τ] = t − τ

for any pair t, τ ∈ [0, T) with t > τ
▶ in particular: V_0[X_t − X_{t−1}] = 1
▶ observe that the process has asymptotically infinite unconditional variance: lim_{t→∞} V_0[X_t − X_τ] = ∞ for any finite τ ≥ 0
▶ the increments of the process thus follow a stationary standard normal distribution,

ΔX_t = X_{t+1} − X_t ∼ N(0, 1)

• Wiener process

[Figure: 10 replications of a Wiener process]
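Such replications can be generated with a discrete-time approximation: a random walk with i.i.d. N(0, 1) increments, started at zero (a sketch using Python's standard library).

```python
# Sketch: discrete-time approximation of a Wiener process as a random walk
# with i.i.d. standard normal increments, started at x_0 = 0.
import random

random.seed(42)  # for reproducibility
T = 10_000
increments = [random.gauss(0.0, 1.0) for _ in range(T)]

path = [0.0]  # x_0 = 0
for dx in increments:
    path.append(path[-1] + dx)

# By construction E_0[X_t] = 0 and V_0[X_t - X_{t-1}] = 1
print(path[0], len(path) - 1, path[-1])
```

Running this in a loop (10 times with different seeds) and plotting the paths would reproduce a figure of the kind shown on the slide.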