4-1 Random Process 1 - mmlab.kaist.ac.kr/menu2/popup/2018EE528_spring/data/4-1...
Random Process – Part 1
A random process $X(t, \zeta)$ is a signal or waveform in time.
- $t$: time
- $\zeta$: outcome in the sample space

Each time we repeat the experiment, a new waveform is generated.
We will adopt $X(t)$ for short.
Time samples of a random process $X(t)$ constitute a sequence of random variables:
$$X_{t_1}, X_{t_2}, X_{t_3}, \ldots \quad \text{or} \quad X(t_1), X(t_2), X(t_3), \ldots$$
Statistical Relationship among Random Samples
In general, a random process is completely characterized by the joint cdf
$$F_{X_{t_1}, X_{t_2}, \ldots, X_{t_k}}(x_1, x_2, \ldots, x_k) = P[X_{t_1} \le x_1, X_{t_2} \le x_2, \ldots, X_{t_k} \le x_k]$$
for any choice of $k$ and for any $t_1 < t_2 < \cdots < t_k$.
However, we may need a simpler model that is
- close to the real world, and
- tractable for analysis.
Frequently used assumptions are:
Stationary and wide-sense stationary random processes for analyzing signals
Markov processes for analyzing queues
Classes of Random Processes
Stationary Random Process: time-shift-invariant
$X(t)$ is a stationary random process if
$$F_{X_{t_1}, X_{t_2}, \ldots, X_{t_k}}(x_1, x_2, \ldots, x_k) = F_{X_{t_1+\tau}, X_{t_2+\tau}, \ldots, X_{t_k+\tau}}(x_1, x_2, \ldots, x_k)$$
for any $\tau$, for any choice of $k$, and for any $t_1 < t_2 < \cdots < t_k$.
Wide-Sense Stationary Random Process
$X(t)$ is a wide-sense stationary (WSS) random process if
- Mean: $m_X(t) = E[X_t] = m$ for all $t$
- Autocorrelation: $R_X(t_1, t_2) = E[X_{t_1} X_{t_2}] = R_X(\tau)$, $\tau = t_2 - t_1$, for any $t_1$ and $t_2$
Markov process
Memoryless:
$$F_{X_{t_{k+1}} \mid X_{t_k}, \ldots, X_{t_1}}(x_{k+1} \mid x_k, \ldots, x_1) = F_{X_{t_{k+1}} \mid X_{t_k}}(x_{k+1} \mid x_k)$$
for any $k$ and for any choice of $t_1 < t_2 < \cdots < t_k < t_{k+1}$.
Moments of a Random Process
Mean
$$m_X(t) = E[X(t)]$$
Variance
$$\sigma_X^2(t) = E\left[\{X(t) - m_X(t)\}^2\right]$$
Auto-correlation
$$R_X(t_1, t_2) = E[X_{t_1} X_{t_2}]$$
Auto-covariance
$$C_X(t_1, t_2) = E\left[\{X_{t_1} - m_X(t_1)\}\{X_{t_2} - m_X(t_2)\}\right] = R_X(t_1, t_2) - m_X(t_1)\, m_X(t_2)$$
$\mathrm{cov}(X_{t_1}, X_{t_2})$ is an alternative notation.
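These moments translate directly into ensemble averages over sample paths. A minimal Python/NumPy sketch (the helper name `ensemble_moments` and the iid-Gaussian toy ensemble are illustrative assumptions, not from the notes):

```python
import numpy as np

def ensemble_moments(paths, i, j):
    """Estimate m_X(t_i), R_X(t_i, t_j), and C_X(t_i, t_j) from an
    ensemble of sample paths (one row per realization)."""
    m_i = paths[:, i].mean()                   # m_X(t_i)
    m_j = paths[:, j].mean()                   # m_X(t_j)
    r_ij = (paths[:, i] * paths[:, j]).mean()  # R_X(t_i, t_j) = E[X_{t_i} X_{t_j}]
    c_ij = r_ij - m_i * m_j                    # C_X = R_X - m_X(t_i) m_X(t_j)
    return m_i, r_ij, c_ij

# Toy ensemble: 100,000 realizations observed at 5 time points (iid N(0,1),
# so the true moments are m_X = 0 and R_X(t_i, t_j) = 0 for i != j).
rng = np.random.default_rng(0)
paths = rng.normal(size=(100_000, 5))
m, r, c = ensemble_moments(paths, 1, 3)
```

For this toy ensemble all three estimates should be near zero, and $C_X$ differs from $R_X$ only by the tiny product of sample means.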
Examples of Random Processes
Random Phase Signal
$$X(t) = \cos(\omega t + \Theta),$$
where $\Theta$ is a random variable, uniform in the interval $(0, 2\pi)$.
The mean of the random process is
$$
\begin{aligned}
m_X(t) &= E[X(t)] = E[\cos(\omega t + \Theta)] \\
&= \int_{-\infty}^{\infty} \cos(\omega t + \theta)\, f_\Theta(\theta)\, d\theta \\
&= \frac{1}{2\pi} \int_0^{2\pi} \cos(\omega t + \theta)\, d\theta \\
&= 0 \quad \text{for all } t.
\end{aligned}
$$
The auto-correlation is
$$
\begin{aligned}
R_X(t_1, t_2) &= E[X(t_1)\, X(t_2)] = E[\cos(\omega t_1 + \Theta)\cos(\omega t_2 + \Theta)] \\
&= \frac{1}{2\pi} \int_0^{2\pi} \cos(\omega t_1 + \theta)\cos(\omega t_2 + \theta)\, d\theta \\
&= \frac{1}{2\pi} \int_0^{2\pi} \frac{\cos(\omega(t_1 + t_2) + 2\theta) + \cos(\omega(t_2 - t_1))}{2}\, d\theta \\
&= 0 + \frac{1}{2}\cos\big(\omega(t_2 - t_1)\big) \\
&= \frac{1}{2}\cos \omega\tau.
\end{aligned}
$$
$R_X(t_1, t_2)$ is a function of $\tau = t_2 - t_1$.
The random phase signal is a wide-sense stationary random process.
Ref. $\cos A \cos B = \dfrac{\cos(A+B) + \cos(A-B)}{2}$, $\quad \cos(A+B) = \cos A \cos B - \sin A \sin B$
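The derivation above can be checked by Monte Carlo: draw many phases $\Theta$, form $X(t_1)$ and $X(t_2)$ across the ensemble, and compare the sample moments with $m_X = 0$ and $R_X = \frac{1}{2}\cos\omega\tau$. A sketch, with $\omega$ and the two time points chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(1)
omega = 2.0                                    # ω, chosen arbitrarily
n = 200_000
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)  # Θ ~ Uniform(0, 2π)

t1, t2 = 0.7, 1.9
x1 = np.cos(omega * t1 + theta)                # ensemble of X(t1) values
x2 = np.cos(omega * t2 + theta)                # ensemble of X(t2) values

m_hat = x1.mean()                              # should be close to m_X(t1) = 0
r_hat = (x1 * x2).mean()                       # estimate of R_X(t1, t2)
r_theory = 0.5 * np.cos(omega * (t2 - t1))     # (1/2) cos(ωτ), τ = t2 - t1
```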
Random Telegraph Signal
$X(t)$ takes on one of two values for any $t$.
For simplicity, assume the signal level is either $+1$ or $-1$ at any time.
The time between successive transitions is exponentially distributed with mean $\dfrac{1}{\alpha}$.
The process begins at time $t = 0$:
$$X(0) = \pm 1 \quad \text{with equal probabilities } 0.5.$$
We claim $P[X(t) = 1] = P[X(t) = -1] = 0.5$ for any $t$.
$$
\begin{aligned}
P[X(t) = 1] &= P[X(0) = 1]\, P[X(t) = 1 \mid X(0) = 1] + P[X(0) = -1]\, P[X(t) = 1 \mid X(0) = -1] \\
&= \tfrac{1}{2}\, P[\text{an even number of transitions}] + \tfrac{1}{2}\, P[\text{an odd number of transitions}] \\
&= \tfrac{1}{2}\, P[\text{any number of transitions}] \\
&= \tfrac{1}{2}.
\end{aligned}
$$
By the way, the number of transitions during an interval of length $t$ is Poisson with mean $\alpha t$, so
$$
\begin{aligned}
P[\text{an even number of transitions during } t] &= \sum_{k=0}^{\infty} \frac{(\alpha t)^{2k} e^{-\alpha t}}{(2k)!} = e^{-\alpha t} \sum_{k=0}^{\infty} \frac{(\alpha t)^{2k}}{(2k)!} \\
&= e^{-\alpha t} \cdot \frac{e^{\alpha t} + e^{-\alpha t}}{2} \\
&= \frac{1 + e^{-2\alpha t}}{2},
\end{aligned}
$$
$$
P[\text{an odd number of transitions during } t] = e^{-\alpha t} \cdot \frac{e^{\alpha t} - e^{-\alpha t}}{2} = \frac{1 - e^{-2\alpha t}}{2}.
$$
Naturally $m_X(t) = 0$ and $\sigma_X(t) = 1$ for all $t$.
We claim $R_X(t_1, t_2) = e^{-2\alpha|\tau|}$ where $\tau = t_2 - t_1$.
$$
\begin{aligned}
R_X(t_1, t_2) &= E[X(t_1)\, X(t_2)] \\
&= 1 \cdot P[X(t_2) = X(t_1)] + (-1) \cdot P[X(t_2) \ne X(t_1)] \\
&= P[\text{an even number of transitions during } |t_2 - t_1|] - P[\text{an odd number of transitions during } |t_2 - t_1|] \\
&= \frac{1 + e^{-2\alpha|t_2 - t_1|}}{2} - \frac{1 - e^{-2\alpha|t_2 - t_1|}}{2} \\
&= e^{-2\alpha|t_2 - t_1|} \\
&= e^{-2\alpha|\tau|}, \quad \text{where } \tau = t_2 - t_1.
\end{aligned}
$$
The autocorrelation decays with $\tau$.
The random telegraph signal is a wide-sense stationary random process.
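Both claims, the equal-probability marginal and $R_X(\tau) = e^{-2\alpha|\tau|}$, can be checked by simulating the telegraph signal: flip the initial $\pm 1$ level once per transition, with Poisson-distributed transition counts. A sketch with arbitrarily chosen $\alpha$, $t_1$, $t_2$:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha = 0.8                    # transition rate; mean time between flips is 1/α
n = 200_000
t1, t2 = 1.0, 1.6

x0 = rng.choice([-1, 1], size=n)              # X(0) = ±1 with equal probability
n1 = rng.poisson(alpha * t1, size=n)          # number of transitions in (0, t1]
n12 = rng.poisson(alpha * (t2 - t1), size=n)  # number of transitions in (t1, t2]
x_t1 = x0 * (-1) ** n1                        # each transition flips the sign
x_t2 = x_t1 * (-1) ** n12

m_hat = x_t1.mean()                           # should be near m_X = 0
r_hat = (x_t1 * x_t2).mean()                  # estimate of R_X(t1, t2)
r_theory = np.exp(-2.0 * alpha * (t2 - t1))   # e^{-2α|τ|}
```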
Wiener Process and Brownian Motion
Initially $X(0) = 0$.
The process makes a transition every $\Delta$ in time, up or down by $h$ with equal probabilities.
We claim that $m_X(t) = 0$ for all $t$, and $\sigma_X^2(t) \to \alpha t$ as $\Delta \to 0$.
Model the successive transitions as iid random variables $D_1, D_2, D_3, \ldots$ Then
$$E[D_j] = 0 \quad \text{and} \quad \sigma_{D_j}^2 = E[D_j^2] = h^2 = \alpha\Delta.$$
How many transitions occur during $(0, t)$? Ans. $\dfrac{t}{\Delta}$.
Therefore
$$X(t) = D_1 + D_2 + \cdots + D_{t/\Delta}$$
and
$$m_X(t) = 0, \qquad \sigma_X^2(t) = \frac{t}{\Delta} \cdot \alpha\Delta \cong \alpha t.$$
According to the central limit theorem, as $\Delta \to 0$,
$X(t)$ becomes a Gaussian random variable $N(0, \alpha t)$.
We claim that $R_X(t_1, t_2) = \alpha \min(t_1, t_2)$.
Suppose $t_1 < t_2$. Then
$$
\begin{aligned}
R_X(t_1, t_2) &= E[X(t_1)\, X(t_2)] \\
&= E\left[(D_1 + \cdots + D_{t_1/\Delta})(D_1 + \cdots + D_{t_1/\Delta} + \cdots + D_{t_2/\Delta})\right] \\
&= E\left[D_1^2 + D_2^2 + \cdots + D_{t_1/\Delta}^2\right] \quad \text{noting } E[D_i D_j] = 0 \text{ for } i \ne j \\
&= \frac{t_1}{\Delta} \cdot \alpha\Delta \\
&\cong \alpha t_1 \\
&= \alpha \cdot \min(t_1, t_2).
\end{aligned}
$$
The Wiener process is not a wide-sense stationary random process.
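The random-walk construction above can be simulated directly to check $\sigma_X^2(t) \cong \alpha t$ and $R_X(t_1, t_2) \cong \alpha \min(t_1, t_2)$. A sketch ($\Delta$, $\alpha$, and the path count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
alpha = 2.0
delta = 0.004                      # step interval Δ
h = np.sqrt(alpha * delta)         # step size h, so that h² = αΔ
n_paths, n_steps = 10_000, 500     # time horizon: n_steps · Δ = 2.0

steps = rng.choice([-h, h], size=(n_paths, n_steps))  # iid ±h, equal probability
paths = np.cumsum(steps, axis=1)   # X(kΔ) = D_1 + ... + D_k

i1, i2 = 200, 500                  # t1 = 0.8, t2 = 2.0
t1, t2 = i1 * delta, i2 * delta
x1, x2 = paths[:, i1 - 1], paths[:, i2 - 1]

var_hat = (x2 ** 2).mean()         # estimate of σ_X²(t2); theory: α·t2
r_hat = (x1 * x2).mean()           # estimate of R_X(t1, t2); theory: α·min(t1, t2)
```

Since $R_X$ depends on $t_1$ and $t_2$ separately (through the minimum), not only on $t_2 - t_1$, the simulation also illustrates why the process is not wide-sense stationary.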
Examples of Stationary Random Processes
To prove $X(t)$ is a stationary random process, we must show the time-shift-invariance:
$$F_{X_{t_1}, X_{t_2}, \ldots, X_{t_k}}(x_1, x_2, \ldots, x_k) = F_{X_{t_1+\tau}, X_{t_2+\tau}, \ldots, X_{t_k+\tau}}(x_1, x_2, \ldots, x_k)$$
for any $\tau$, any choice of $k$, and for any $t_1 < t_2 < \cdots < t_k$.
A random telegraph signal is stationary
We have shown
$$P[X(t) = 1] = P[X(t) = -1] = 0.5 \quad \text{for any } t,$$
$$p_e(\theta) = P[\text{an even number of transitions during a time interval } \theta] = \frac{1 + e^{-2\alpha\theta}}{2},$$
$$p_o(\theta) = P[\text{an odd number of transitions during a time interval } \theta] = \frac{1 - e^{-2\alpha\theta}}{2}.$$
Let $k$ be the number of samples.
For $k = 1$,
$$X(t_1) = \pm 1 \quad \text{with prob } 0.5, \text{ for any } t_1.$$
That is equivalent to saying
$$f_{X_{t_1}}(x) = f_{X_{t_1+\tau}}(x) \quad \text{for any } \tau.$$
For $k = 2$,
$$
f_{X_{t_1+\tau}, X_{t_2+\tau}}(x_1, x_2) =
\begin{cases}
0.5\, p_e(t_2 - t_1) & (x_1, x_2) = (1, 1) \\
0.5\, p_o(t_2 - t_1) & (x_1, x_2) = (1, -1) \\
0.5\, p_o(t_2 - t_1) & (x_1, x_2) = (-1, 1) \\
0.5\, p_e(t_2 - t_1) & (x_1, x_2) = (-1, -1)
\end{cases}
\quad \text{for any } \tau.
$$
$f_{X_{t_1+\tau}, X_{t_2+\tau}}(x_1, x_2)$ does not depend on $\tau$.
$$f_{X_{t_1+\tau}, X_{t_2+\tau}}(x_1, x_2) = f_{X_{t_1}, X_{t_2}}(x_1, x_2) \quad \text{for any } \tau.$$
For $k = 3$,
$$
f_{X_{t_1+\tau}, X_{t_2+\tau}, X_{t_3+\tau}}(x_1, x_2, x_3) =
\begin{cases}
0.5\, p_e(t_2 - t_1)\, p_e(t_3 - t_2) & (x_1, x_2, x_3) = (1, 1, 1) \\
0.5\, p_e(t_2 - t_1)\, p_o(t_3 - t_2) & (1, 1, -1) \\
0.5\, p_o(t_2 - t_1)\, p_o(t_3 - t_2) & (1, -1, 1) \\
0.5\, p_o(t_2 - t_1)\, p_e(t_3 - t_2) & (1, -1, -1) \\
0.5\, p_o(t_2 - t_1)\, p_e(t_3 - t_2) & (-1, 1, 1) \\
0.5\, p_o(t_2 - t_1)\, p_o(t_3 - t_2) & (-1, 1, -1) \\
0.5\, p_e(t_2 - t_1)\, p_o(t_3 - t_2) & (-1, -1, 1) \\
0.5\, p_e(t_2 - t_1)\, p_e(t_3 - t_2) & (-1, -1, -1)
\end{cases}
\quad \text{for any } \tau.
$$
$f_{X_{t_1+\tau}, X_{t_2+\tau}, X_{t_3+\tau}}(x_1, x_2, x_3)$ does not depend on $\tau$.
$$f_{X_{t_1+\tau}, X_{t_2+\tau}, X_{t_3+\tau}}(x_1, x_2, x_3) = f_{X_{t_1}, X_{t_2}, X_{t_3}}(x_1, x_2, x_3) \quad \text{for any } \tau.$$
In general, we can see that $f_{X_{t_1+\tau}, \ldots, X_{t_k+\tau}}(x_1, \ldots, x_k)$ does not depend on $\tau$, for any $k$ and for any sampling instants $t_1, \ldots, t_k$.
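The shift-invariance of the joint pmf can be spot-checked by simulation: estimate $P[X(t_1) = 1, X(t_2) = 1]$ before and after a common shift $\tau$, and compare both with $0.5\, p_e(t_2 - t_1)$. A sketch (the helper name `joint_pmf_11` and the parameter values are illustrative):

```python
import numpy as np

def joint_pmf_11(rng, alpha, t1, t2, n):
    """Empirical P[X(t1) = 1, X(t2) = 1] for the random telegraph signal."""
    x0 = rng.choice([-1, 1], size=n)                          # X(0) = ±1, equal probability
    x1 = x0 * (-1) ** rng.poisson(alpha * t1, size=n)         # flips in (0, t1]
    x2 = x1 * (-1) ** rng.poisson(alpha * (t2 - t1), size=n)  # flips in (t1, t2]
    return np.mean((x1 == 1) & (x2 == 1))

rng = np.random.default_rng(4)
alpha, t1, t2, tau = 1.0, 0.5, 1.2, 3.0
p = joint_pmf_11(rng, alpha, t1, t2, 200_000)
p_shift = joint_pmf_11(rng, alpha, t1 + tau, t2 + tau, 200_000)
p_theory = 0.5 * (1.0 + np.exp(-2.0 * alpha * (t2 - t1))) / 2.0  # 0.5 · p_e(t2 - t1)
```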
A random phase signal is stationary
$X(t) = \cos(\omega t + \Theta)$, where $\Theta$ is uniform over $(0, 2\pi)$, is stationary.
In fact, one can show that any periodic signal whose random phase is uniformly distributed over the period is stationary [ref. Wozencraft and Jacobs, Principles of Communication Engineering, p. 137].
Wide-sense Stationary Random Process
Theorem. A stationary random process is a wide-sense stationary random process.
Suppose $X_t$ is stationary. Then
$$f_{X_{t_1}, X_{t_2}}(x_1, x_2) = f_{X_{t_1+s}, X_{t_2+s}}(x_1, x_2) \quad \text{for any } t_1, t_2, \text{ and } s.$$
However,
$$
\begin{aligned}
R_X(t_1, t_2) &= E[X_{t_1} X_{t_2}] \\
&= \iint x_1 x_2\, f_{X_{t_1}, X_{t_2}}(x_1, x_2)\, dx_1\, dx_2 \\
&= \iint x_1 x_2\, f_{X_0, X_{t_2 - t_1}}(x_1, x_2)\, dx_1\, dx_2 \quad \text{applying time shift of } s = -t_1 \\
&= R_X(\tau), \quad \text{where } \tau = t_2 - t_1.
\end{aligned}
$$
Some properties of wide-sense stationary random processes:
1. Average power does not vary with the time:
$$E[X_t^2] = E[X_t X_{t+0}] = R_X(0), \quad \text{which does not vary with } t.$$
2. Autocorrelation is an even function: $R_X(-\tau) = R_X(\tau)$.
$$R_X(-\tau) = E[X_t X_{t-\tau}] = E[X_{t-\tau} X_t] \quad \text{(interchanging the order)} = R_X(\tau).$$
(Figure: the set of stationary processes is contained within the set of wide-sense stationary processes.)
3. Max value: $R_X(0) \ge |R_X(\tau)|$ for any $\tau$.
From the Cauchy-Schwarz inequality, for any rvs $X$ and $Y$,
$$E[X^2]\, E[Y^2] \ge \left(E[XY]\right)^2.$$
Therefore
$$E[X_t^2]\, E[X_{t+\tau}^2] \ge \left(E[X_t X_{t+\tau}]\right)^2 \quad \text{for any } \tau.$$
However,
$$R_X(0) = E[X_t^2] = E[X_{t+\tau}^2], \qquad R_X(\tau) = E[X_t X_{t+\tau}].$$
That says
$$R_X(0)^2 \ge R_X(\tau)^2 \quad \text{for any } \tau.$$
Since $R_X(0) \ge 0$, $R_X(0) \ge |R_X(\tau)|$.
Also $R_X(0) \ge R_X(\tau)$.
4. Bound on the rate of change:
$$P\left[|X_{t+\tau} - X_t| > \varepsilon\right] \le \frac{2\left(R_X(0) - R_X(\tau)\right)}{\varepsilon^2}.$$
Markov inequality: for any non-negative random variable $X$,
$$P[X > \varepsilon] \le \frac{E[X]}{\varepsilon}.$$
$$
\begin{aligned}
P\left[|X_{t+\tau} - X_t| > \varepsilon\right] &= P\left[(X_{t+\tau} - X_t)^2 > \varepsilon^2\right] \\
&\le \frac{E\left[(X_{t+\tau} - X_t)^2\right]}{\varepsilon^2} \quad \text{from the Markov inequality} \\
&= \frac{2\left(R_X(0) - R_X(\tau)\right)}{\varepsilon^2}.
\end{aligned}
$$
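For the random telegraph signal, the bound can be evaluated in closed form, since $|X_{t+\tau} - X_t| = 2$ exactly when an odd number of transitions occurs during $\tau$. A quick check ($\alpha$, $\tau$, $\varepsilon$ chosen arbitrarily):

```python
import math

alpha, tau, eps = 0.5, 0.4, 1.0
r0 = 1.0                              # R_X(0) = E[X_t²] = 1 for the telegraph signal
r_tau = math.exp(-2.0 * alpha * tau)  # R_X(τ) = e^{-2ατ}

# |X_{t+τ} - X_t| = 2 > ε exactly when an odd number of transitions occurs:
p_exact = (1.0 - math.exp(-2.0 * alpha * tau)) / 2.0
bound = 2.0 * (r0 - r_tau) / eps ** 2  # the Markov-inequality bound above
```

Here the bound equals $4\, p_{\text{exact}}$, so it holds with a factor-of-4 margin for this process.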
5. If $R_X(0) = R_X(d)$, then $R_X(\tau)$ is periodic with period $d$, and $E\left[(X_{t+d} - X_t)^2\right] = 0$.
Using the Cauchy-Schwarz inequality,
$$\left(E\left[(X_{t+\tau+d} - X_{t+\tau})\, X_t\right]\right)^2 \le E\left[(X_{t+\tau+d} - X_{t+\tau})^2\right] E[X_t^2] \quad \text{for any } \tau,$$
$$\left(R_X(\tau + d) - R_X(\tau)\right)^2 \le \left\{2 R_X(0) - 2 R_X(d)\right\} R_X(0).$$
If $R_X(0) = R_X(d)$, then $R_X(\tau + d) - R_X(\tau) = 0$.
Therefore $R_X(\tau + d) = R_X(\tau)$ for any $\tau$.
Also
$$E\left[(X_{t+d} - X_t)^2\right] = E[X_{t+d}^2] + E[X_t^2] - 2 E[X_{t+d} X_t] = 2 R_X(0) - 2 R_X(d) = 0.$$
6. If $X(t) = m + N(t)$ with $E[N(t)] = 0$ and $\lim_{\tau \to \infty} R_N(\tau) = 0$, then $\lim_{\tau \to \infty} R_X(\tau) = m^2$.
$$
\begin{aligned}
R_X(\tau) &= E[X_t X_{t+\tau}] \\
&= E\left[(m + N_t)(m + N_{t+\tau})\right] \\
&= m^2 + m\, E[N_t] + m\, E[N_{t+\tau}] + E[N_t N_{t+\tau}] \\
&= m^2 + R_N(\tau).
\end{aligned}
$$
Therefore $\lim_{\tau \to \infty} R_X(\tau) = m^2$.
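Property 6 can be illustrated with a telegraph-type noise term, for which $R_N(\tau) = e^{-2\alpha|\tau|} \to 0$. A tiny sketch ($m$ and $\alpha$ are arbitrary illustrative values):

```python
import math

m, alpha = 3.0, 1.0   # DC level m and telegraph transition rate α

def r_x(tau):
    """R_X(τ) = m² + R_N(τ) for X(t) = m + N(t), N a random telegraph signal."""
    return m ** 2 + math.exp(-2.0 * alpha * abs(tau))

# R_X(τ) decays monotonically from m² + 1 toward m² = 9 as τ grows.
values = [r_x(tau) for tau in (0.0, 1.0, 5.0, 20.0)]
```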
See LG (3rd Edition), p. 524, Figure 9.13 for some plots of autocorrelation.