Generalized Semi-Markov Processes (GSMP)
Summary
Some Definitions
Markov and Semi-Markov Processes
The Poisson Process
Properties of the Poisson Process:
  Interevent times
  Memoryless property and the residual lifetime paradox
  Superposition of Poisson processes
Random Process
Let (Ω, F, P) be a probability space. A stochastic (or random) process {X(t)} is a collection of random variables defined on (Ω, F, P), indexed by t ∈ T (where t usually represents time). X(t) is the state of the process at time t.
Continuous-Time and Discrete-Time stochastic processes
If the set T is finite or countable, then {X(t)} is called a discrete-time process. In this case t ∈ {0, 1, 2, …} and we may refer to it as a stochastic sequence, also written {Xk}, k = 0, 1, 2, …
Otherwise, the process is called a continuous-time process.
Continuous-State and Discrete-State stochastic processes
If {X(t)} takes values in a countable set, then the process is discrete-state, also referred to as a chain. Otherwise, the process is continuous-state.
Classification of Random Processes
Joint cdf of the random variables X(t0),…,X(tn)
Independent Process
Let X0, …, Xn be a sequence of independent random variables; then the joint cdf factors:
  F_X(x0, …, xn; t0, …, tn) = Pr{X(t0) ≤ x0, …, X(tn) ≤ xn}
                             = F_{X0}(x0; t0) ··· F_{Xn}(xn; tn)
Stationary Process (strict-sense stationarity)
The sequence {Xn} is stationary if and only if, for any τ ∈ R,
  F_X(x0, …, xn; t0 + τ, …, tn + τ) = F_X(x0, …, xn; t0, …, tn)
Classification of Random Processes
Wide-sense Stationarity
Let C be a constant and g(τ) a function of τ but not of t; then a process is wide-sense stationary if and only if
  E[X(t)] = C  and  E[X(t) X(t + τ)] = g(τ)

Markov Process
The future of a process does not depend on its past, only on its present:
  Pr{X(t_{k+1}) ≤ x_{k+1} | X(t_k) = x_k, …, X(t_0) = x_0} = Pr{X(t_{k+1}) ≤ x_{k+1} | X(t_k) = x_k}
Also referred to as the Markov property.
Markov and Semi-Markov Property
The Markov Property requires that the process has no memory of the past. This memoryless property has two aspects:
All past state information is irrelevant in determining the future (no state memory).
How long the process has been in the current state is also irrelevant (no state age memory).
The latter implies that the lifetimes between subsequent events (interevent times) must also have the memoryless property (i.e., be exponentially distributed).
Semi-Markov Processes
For this class of processes, the requirement that the state age is irrelevant is relaxed; therefore, the interevent time is no longer required to be exponentially distributed.
Example
Consider the process
  X_{k+1} = X_k + X_{k-1}
with Pr{X0 = 0} = Pr{X0 = 1} = 0.5 and Pr{X1 = 0} = Pr{X1 = 1} = 0.5.
Is this a Markov process? NO.
Is it possible to make the process Markov?
Define Y_k = X_{k-1} and form the vector Z_k = [X_k, Y_k]^T; then we can write
  Z_{k+1} = [X_{k+1}; Y_{k+1}] = [1 1; 1 0] [X_k; Y_k] = [1 1; 1 0] Z_k
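The state-augmentation trick above can be sketched in a few lines of Python (an illustrative simulation, not part of the slides):

```python
import random

# Illustrative sketch: the scalar process X_{k+1} = X_k + X_{k-1} is not
# Markov, but the augmented vector Z_k = [X_k, Y_k] with Y_k = X_{k-1}
# evolves by a map that depends on Z_k alone.
random.seed(0)

x0 = random.randint(0, 1)   # Pr{X0=0} = Pr{X0=1} = 0.5
x1 = random.randint(0, 1)   # Pr{X1=0} = Pr{X1=1} = 0.5

xs = [x0, x1]               # scalar trajectory, kept for comparison
z = (x1, x0)                # Z_1 = [X_1, Y_1] = [X_1, X_0]
for _ in range(10):
    x, y = z
    z = (x + y, x)          # Z_{k+1} = [[1,1],[1,0]] Z_k  (depends on Z_k only)
    xs.append(z[0])

# the augmented (Markov) recursion reproduces the original scalar one
assert all(xs[k + 1] == xs[k] + xs[k - 1] for k in range(1, len(xs) - 1))
print(xs)
```

Because the next augmented state is a function of the current one only, {Z_k} satisfies the Markov property even though {X_k} does not.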
Renewal Process
A renewal process is a chain {N(t)} with state space {0,1,2,…} whose purpose is to count state transitions. The time intervals between state transitions are assumed iid from an arbitrary distribution. Therefore, for any 0 ≤ t1 ≤ … ≤ tk ≤ …
  0 = N(0) ≤ N(t1) ≤ N(t2) ≤ … ≤ N(tk) ≤ …
Generalized Semi-Markov Processes (GSMP)
A GSMP is a stochastic process {X(t)} with state space X generated by a stochastic timed automaton
  (X, E, Γ, f, p0, G)
where:
X is the countable state space
E is the countable event set
Γ(x) is the feasible event set at state x
f(x, e) is the state transition function
p0 is the probability mass function of the initial state
G is a vector with the cdfs of all events
The semi-Markov property of GSMP is due to the fact that at the state transition instant, the next state is determined by the current state and the event that just occurred.
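A minimal simulator for such a stochastic timed automaton might look as follows. The single-server queue, the event names 'a'/'d', and the exponential lifetimes are all illustrative assumptions, not part of the slides:

```python
import random

# Illustrative sketch of a stochastic timed automaton (X, E, Gamma, f, p0, G)
# driving a GSMP.  Example system: a single-server queue where x counts
# customers, 'a' is an arrival and 'd' a departure.
random.seed(1)

gamma = lambda x: ['a'] if x == 0 else ['a', 'd']   # feasible event set at x
f = lambda x, e: x + 1 if e == 'a' else x - 1        # state transition function
G = {'a': lambda: random.expovariate(1.0),           # event lifetime samplers
     'd': lambda: random.expovariate(1.5)}           # (exponential here, but any cdf works)
x = 0                                                # initial state (p0 puts mass 1 at 0)

clocks = {e: G[e]() for e in gamma(x)}               # fresh lifetimes for feasible events
t = 0.0
for _ in range(10):
    # the next event is the feasible event with the smallest residual clock
    e = min(gamma(x), key=lambda ev: clocks[ev])
    dt = clocks[e]
    t += dt
    old_feasible = set(gamma(x))
    x = f(x, e)
    # GSMP clock update: surviving events keep their residual lifetimes,
    # the triggering event and newly enabled events draw fresh samples
    clocks = {ev: clocks[ev] - dt if ev in old_feasible and ev != e else G[ev]()
              for ev in gamma(x)}
    print(f"t={t:.3f}  event={e}  state={x}")
```

Note that only the surviving clocks carry memory between transitions; the next state itself is determined by the current state and the triggering event, which is exactly the semi-Markov structure described above.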
The Poisson Counting Process
Let {N(t)} be the process which counts the number of events that have occurred in the interval [0, t). For any 0 ≤ t1 ≤ … ≤ tk ≤ …
  0 = N(0) ≤ N(t1) ≤ N(t2) ≤ … ≤ N(tk) ≤ …
[Figure: staircase sample path of N(t) with jumps at t1, t2, t3, …, tk]
Define N(t_{k-1}, t_k) = N(t_k) − N(t_{k-1}).
Process with independent increments: The random variables N(t1), N(t1,t2),…, N(tk-1,tk),… are mutually independent.
Process with stationary independent increments: The random variable N(t_{k-1}, t_k) does not depend on t_{k-1} and t_k individually but only on the difference t_k − t_{k-1}.
The Poisson Counting Process
Assumptions:
At most one event can occur at any time instant (no two or more events can occur at the same time).
The process has stationary independent increments:
  Pr{N(t_{k-1}, t_k) = n} = Pr{N(t_k − t_{k-1}) = n}
Given that a process satisfies the above assumptions, find
  Pn(t) = Pr{N(t) = n},  n = 0, 1, 2, …
The Poisson Process
Step 1: Determine P0(t) = Pr{N(t) = 0}.
Starting from
  Pr{N(t + s) = 0} = Pr{N(t) = 0 and N(t, t + s) = 0}
                   = Pr{N(t) = 0} Pr{N(s) = 0}   (stationary independent increments)
so P0(t + s) = P0(t) P0(s).
Lemma: Let g(t) be a differentiable function for all t ≥ 0 such that g(0) = 1 and g(t) ≤ 1 for all t > 0. Then for any t, s ≥ 0,
  g(t + s) = g(t) g(s)  implies  g(t) = e^{−λt} for some λ > 0.
The Poisson Process
Therefore
  P0(t) = Pr{N(t) = 0} = e^{−λt}
Step 2: Determine P0(Δt) for a small Δt.
  P0(Δt) = e^{−λΔt} = 1 − λΔt + (λΔt)²/2! − (λΔt)³/3! + … = 1 − λΔt + o(Δt)
Step 3: Determine Pn(Δt) for a small Δt. For n = 2, 3, …, since by assumption no two events can occur at the same time,
  Pn(Δt) = Pr{N(Δt) = n} = o(Δt)
As a result, for n = 1,
  P1(Δt) = Pr{N(Δt) = 1} = λΔt + o(Δt)
The Poisson Process
Step 4: Determine Pn(t + Δt) for any n.
  Pn(t + Δt) = Pr{N(t + Δt) = n} = Σ_{k=0}^{n} P_{n−k}(t) Pk(Δt)
             = Pn(t) P0(Δt) + P_{n−1}(t) P1(Δt) + o(Δt)
             = Pn(t)(1 − λΔt) + P_{n−1}(t) λΔt + o(Δt)
Moving terms between sides and dividing by Δt,
  [Pn(t + Δt) − Pn(t)] / Δt = −λ Pn(t) + λ P_{n−1}(t) + o(Δt)/Δt
Taking the limit as Δt → 0,
  dPn(t)/dt = −λ Pn(t) + λ P_{n−1}(t)
The Poisson Process
Step 5: Solve the differential equation to obtain
  Pn(t) = Pr{N(t) = n} = (λt)^n / n! · e^{−λt},  t ≥ 0, n = 0, 1, 2, …
This expression is known as the Poisson distribution and it fully characterizes the stochastic process {N(t)} in [0, t) under the assumptions that
No two events can occur at exactly the same time, and
The process has stationary independent increments.
You should verify that
  E[N(t)] = λt  and  var(N(t)) = λt
Parameter λ has the interpretation of the “rate” at which events arrive.
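As a sanity check (illustrative, not from the slides), one can simulate the process by summing iid exponential interevent times and compare the empirical counts against the Poisson formula:

```python
import math
import random

# Illustrative Monte Carlo check: counting events of a rate-lambda process
# with exponential interevent times in [0, t) should follow
# Pn(t) = (lambda*t)^n / n! * exp(-lambda*t), with mean = variance = lambda*t.
random.seed(0)
lam, t, runs = 2.0, 3.0, 20000

counts = []
for _ in range(runs):
    s, n = 0.0, 0
    while True:
        s += random.expovariate(lam)   # iid exponential interevent times
        if s >= t:
            break
        n += 1
    counts.append(n)

mean = sum(counts) / runs
var = sum((c - mean) ** 2 for c in counts) / runs
print(f"empirical mean {mean:.2f} vs lambda*t = {lam * t:.2f}")
print(f"empirical var  {var:.2f} vs lambda*t = {lam * t:.2f}")

# compare the empirical pmf to the Poisson formula for a few n
for n in range(5):
    emp = counts.count(n) / runs
    thy = (lam * t) ** n / math.factorial(n) * math.exp(-lam * t)
    print(f"n={n}: empirical {emp:.3f}, Poisson {thy:.3f}")
```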
Properties of the Poisson Process: Interevent Times
Let t_{k−1} be the time when the (k−1)th event occurred and let V_k denote the (random) interevent time between the (k−1)th and kth events.
What is the cdf of V_k, G_k(t)?
  G_k(t) = Pr{V_k ≤ t} = 1 − Pr{V_k > t}
         = 1 − Pr{0 arrivals in the interval [t_{k−1}, t_{k−1} + t)}
         = 1 − Pr{N(t) = 0}   (stationary independent increments)
         = 1 − e^{−λt}
G(t) = 1 − e^{−λt} is the Exponential distribution.
[Figure: interval from t_{k−1} to t_{k−1} + t; V_k > t exactly when N(t_{k−1}, t_{k−1} + t) = 0]
Properties of the Poisson Process: Exponential Interevent Times
The process {V_k}, k = 1, 2, …, that corresponds to the interevent times of a Poisson process is an iid stochastic sequence with cdf
  G(t) = Pr{V_k ≤ t} = 1 − e^{−λt}
Therefore, the Poisson process is also a renewal process.
The corresponding pdf is
  g(t) = λ e^{−λt},  t ≥ 0
One can easily show that
  E[V_k] = 1/λ  and  var(V_k) = 1/λ²
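These moments are easy to confirm numerically (an illustrative check, not part of the slides):

```python
import random

# Illustrative check that exponential interevent times have
# E[V] = 1/lambda and var(V) = 1/lambda^2.
random.seed(0)
lam, n = 2.0, 100000
v = [random.expovariate(lam) for _ in range(n)]
mean = sum(v) / n
var = sum((x - mean) ** 2 for x in v) / n
print(mean, 1 / lam)       # empirical mean vs 1/lambda
print(var, 1 / lam ** 2)   # empirical variance vs 1/lambda^2
```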
Properties of the Poisson Process: Memoryless Property
Let t_k be the time when the previous event occurred and let V denote the time until the next event.
Assuming that we have been at the current state for z time units, let Y = V − z be the remaining time until the next event.
What is the cdf of Y?
  F_Y(t) = Pr{Y ≤ t} = Pr{V − z ≤ t | V > z}
         = Pr{z < V ≤ z + t} / Pr{V > z}
         = [G(z + t) − G(z)] / [1 − G(z)]
         = (e^{−λz} − e^{−λ(z + t)}) / e^{−λz}
         = 1 − e^{−λt} = G(t)
[Figure: timeline showing t_k, t_k + z, the lifetime V, and the residual Y = V − z]
Memoryless! It does not matter that we have already spent z time units at the current state.
Memoryless Property
This is a unique property of the exponential distribution. If a process has the memoryless property, then it must be exponential, i.e.,
  Pr{V − z ≤ t | V > z} = Pr{V ≤ t} = 1 − e^{−λt}
[Diagram: Poisson Process (rate λ) ⟷ Exponential Interevent Times, G(t) = 1 − e^{−λt} ⟷ Memoryless Property]
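The memoryless property can be checked empirically (an illustrative sketch, not from the slides): conditioning on survival past z should not change the distribution of the remaining time:

```python
import random

# Illustrative check of memorylessness: among exponential lifetimes that
# survive past z, the remaining time Y = V - z has the same distribution
# as V itself, Pr{Y <= t | V > z} = Pr{V <= t} = 1 - exp(-lambda*t).
random.seed(0)
lam, z, t, n = 1.0, 2.0, 1.0, 200000

v = [random.expovariate(lam) for _ in range(n)]
survivors = [x for x in v if x > z]

p_cond = sum(1 for x in survivors if x - z <= t) / len(survivors)
p_uncond = sum(1 for x in v if x <= t) / n
print(p_cond, p_uncond)   # both should be close to 1 - e^{-1}
```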
Superposition of Poisson Processes
Consider a DES with m events, each modeled as a Poisson process with rate λi, i = 1, …, m. What is the resulting process?
Suppose at time t_k we observe event 1. Let Y1 be the time until the next event 1. Its cdf is G1(t) = 1 − e^{−λ1 t}.
Let Y2, …, Ym denote the residual times until the next occurrence of the corresponding events. By the memoryless property (Y_j = V_j − z_j), their cdfs are
  G_i(t) = 1 − e^{−λi t}
Let Y* be the time until the next event (of any type):
  Y* = min_i {Y_i}
Therefore, we need to find
  G_{Y*}(t) = Pr{Y* ≤ t}
Superposition of Poisson Processes
The superposition of m Poisson processes is also a Poisson process with rate equal to the sum of the rates of the individual processes:
  G_{Y*}(t) = Pr{Y* ≤ t} = Pr{min_i Y_i ≤ t}
            = 1 − Pr{min_i Y_i > t}
            = 1 − Pr{Y1 > t, …, Ym > t}
            = 1 − Π_{i=1}^{m} Pr{Y_i > t}   (independence)
            = 1 − e^{−(Σ_{i=1}^{m} λi) t}
Therefore
  G_{Y*}(t) = 1 − e^{−Λt},  where Λ = Σ_{i=1}^{m} λi
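This superposition result can be checked by merging simulated streams (illustrative, not from the slides):

```python
import random

# Illustrative check: merging m independent Poisson streams yields a
# Poisson stream with rate Lambda = sum(lambda_i), so the merged
# interevent times should have mean 1/Lambda.
random.seed(0)
rates = [0.5, 1.0, 1.5]           # the lambda_i (assumed values)
Lam = sum(rates)                  # Lambda = 3.0
horizon = 10000.0

merged = []
for lam in rates:
    s = 0.0
    while True:
        s += random.expovariate(lam)
        if s >= horizon:
            break
        merged.append(s)
merged.sort()                     # superpose the event times

gaps = [b - a for a, b in zip(merged, merged[1:])]
mean_gap = sum(gaps) / len(gaps)
print(mean_gap, 1 / Lam)          # empirical mean gap vs 1/Lambda
```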
Superposition of Poisson Processes
Suppose that at time t_k an event has occurred. What is the probability that the next event to occur is event j?
  Pr{next event is j} = λj / Λ,  where Λ = Σ_{i=1}^{m} λi
Without loss of generality, let j = 1 and define Y' = min{Y_i : i = 2, …, m}. By the superposition result, Y' has cdf 1 − exp{−(Σ_{i=2}^{m} λi) t}. Then
  Pr{Y1 < Y'} = ∫_0^∞ Pr{Y' > y} λ1 e^{−λ1 y} dy
              = ∫_0^∞ e^{−(Λ − λ1) y} λ1 e^{−λ1 y} dy
              = λ1 ∫_0^∞ e^{−Λy} dy
              = λ1 / Λ
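The race between residual lifetimes can be simulated directly (an illustrative sketch, not from the slides):

```python
import random

# Illustrative check: the next event is of type j with probability
# lambda_j / Lambda -- the winner of a race between independent
# exponential residual times.
random.seed(0)
rates = [0.5, 1.0, 1.5]           # assumed lambda_i
Lam = sum(rates)
n = 100000

wins = [0] * len(rates)
for _ in range(n):
    samples = [random.expovariate(lam) for lam in rates]
    wins[samples.index(min(samples))] += 1   # smallest residual fires first

for j, lam in enumerate(rates):
    print(f"event {j}: empirical {wins[j] / n:.3f}, lambda_j/Lambda {lam / Lam:.3f}")
```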
Residual Lifetime Paradox
Suppose that buses pass by the bus station according to a Poisson process with rate λ. A passenger arrives at the bus station at some random point in time.
How long does the passenger have to wait?
[Figure: bus passages at b_k and b_{k+1}; V is the interval between buses, Z the time since the last bus, Y the passenger's wait, V = Z + Y]
Solution 1: E[V] = 1/λ. Therefore, since the passenger will (on average) arrive in the middle of the interval, he has to wait for E[Y] = E[V]/2 = 1/(2λ). But using the memoryless property, the time until the next bus is exponentially distributed with rate λ, therefore E[Y] = 1/λ, not 1/(2λ)!
Solution 2: Using the memoryless property, the time until the next bus is exponentially distributed with rate λ, therefore E[Y] = 1/λ. But note that E[Z] = 1/λ, therefore E[V] = E[Z] + E[Y] = 2/λ, not 1/λ!