© Stanley Chan 2019. All Rights Reserved.
ECE 302: Chapter 08: Random Processes
Fall 2019
Prof Stanley Chan
School of Electrical and Computer Engineering
Purdue University
What is a Random Process?
Definition (Random Process)
A random process X(t, ξ) is a function of t indexed by a random index ξ.
Figure: The sample space of a random process X(t, ξ) contains many functions. Therefore, each random realization is a function.
A Tale of Two Cities
Statistical View: Fix time t. We can look at the two-dimensional function X(t, ξ) "vertically" as

X(t, ξ1), X(t, ξ2), ..., X(t, ξN).

This is a sequence of random variables because ξ1, ..., ξN are realizations of the random variable ξ.
Temporal View: Fix the random index ξ. We can look at X(t, ξ) "horizontally" as

X(t1, ξ), X(t2, ξ), ..., X(tK, ξ).

This is a deterministic time series evaluated at time points t1, ..., tK.
Example 1
Random Process. Let A ∼ Uniform[0, 1]. Define X(t, ξ) = A(ξ) cos(2πt).
Statistical View: Fix t (for example t = 10). In this case, we have

X(t, ξ) = A(ξ) cos(2π(10)) = A(ξ) cos(20π),

which is a random variable because cos(20π) is a constant. The randomness of X comes from the fact that A(ξ) ∼ Uniform[0, 1].
Temporal View: Fix ξ (for example A(ξ) = 0.7). In this case, we have

X(t, ξ) = 0.7 cos(2πt),

which is a deterministic function in t.
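The two views can be made concrete in a few lines of code. Below is a minimal sketch in Python/NumPy (the function name `X`, the seed, and the sample sizes are illustrative choices, not part of the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def X(t, A):
    # The random process of Example 1: X(t, xi) = A(xi) * cos(2*pi*t).
    return A * np.cos(2 * np.pi * t)

# Statistical view: fix t = 10 and draw several realizations of A.
A_samples = rng.uniform(0.0, 1.0, size=5)
vertical = X(10.0, A_samples)   # a sequence of random variables
# cos(20*pi) = 1, so each entry is just the corresponding A(xi).

# Temporal view: fix A(xi) = 0.7 and sweep t.
t = np.linspace(0.0, 2.0, 201)
horizontal = X(t, 0.7)          # a deterministic function of t
```

Fixing t collapses the process to a random variable (`vertical`), while fixing ξ collapses it to an ordinary function of time (`horizontal`).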
Example 2
Random Process. Let A be a discrete random variable with PMF

P(A = +1) = 1/2  and  P(A = −1) = 1/2.

Define X(n, ξ) = A(ξ)(−1)^n.
Statistical View: Fix n, say n = 10. Then,

X(n, ξ) = X(10, ξ) =
    (−1)^10 = +1,  with prob 1/2,
    (−1)^11 = −1,  with prob 1/2,

which is a random variable.
Temporal View: Fix ξ. Then,

X(n, ξ) =
    (−1)^n,      if A = +1,
    (−1)^(n+1),  if A = −1,

which is a time series.
Figure: Left: Realizations of the random process X(t, ξ). Right: Realizations of the random process X(n, ξ).
Mean function
The mean function µX(t) of a random process X(t) is

µX(t) = E[X(t)].
Example: Let A ∼ Uniform[0, 1], and let X(t) = A cos(2πt). Find µX(0) and µX(t).

µX(0) = E[X(0)] = E[A cos(0)] = E[A] = 1/2,

µX(t) = E[X(t)] = E[A cos(2πt)] = cos(2πt) E[A] = (1/2) cos(2πt).
Example: Let Θ ∼ Uniform[−π, π], and let X(t) = cos(ωt + Θ). Find µX(t).

µX(t) = E[cos(ωt + Θ)] = ∫_{−π}^{π} cos(ωt + θ) · (1/(2π)) dθ = 0.
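Both mean functions are easy to sanity-check by Monte Carlo simulation. A hedged sketch (the sample size, seed, and test points t, ω are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

A = rng.uniform(0.0, 1.0, size=n)            # A ~ Uniform[0, 1]
Theta = rng.uniform(-np.pi, np.pi, size=n)   # Theta ~ Uniform[-pi, pi]

t, omega = 0.3, 2.0

# Empirical means at a fixed t; these should approach the formulas above.
mu1 = np.mean(A * np.cos(2 * np.pi * t))     # target: (1/2) cos(2*pi*t)
mu2 = np.mean(np.cos(omega * t + Theta))     # target: 0
```

With 200,000 samples the empirical means land within about 0.01 of the analytical values (1/2) cos(2πt) and 0.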
Variance function
The variance function of a random process X(t) is

Var[X(t)] = E[(X(t) − µX(t))²].

Note: Both µX(t) and Var[X(t)] are functions of t.
The autocorrelation function of a random process X(t) is

RX(t1, t2) = E[X(t1)X(t2)].

The autocorrelation function takes two time instants t1 and t2. Since X(t1) and X(t2) are two random variables, RX(t1, t2) = E[X(t1)X(t2)] measures the correlation of these two random variables.
Autocovariance Function
The autocovariance function of a random process X (t) is
CX(t1, t2) = E[(X(t1) − µX(t1))(X(t2) − µX(t2))].

Two useful properties:
1. CX(t1, t2) = RX(t1, t2) − µX(t1)µX(t2)
2. CX(t, t) = Var[X(t)]
Proof. Expanding the product and using linearity of expectation,

CX(t1, t2) = E[X(t1)X(t2)] − µX(t1)E[X(t2)] − µX(t2)E[X(t1)] + µX(t1)µX(t2) = RX(t1, t2) − µX(t1)µX(t2).

Setting t1 = t2 = t then gives CX(t, t) = E[(X(t) − µX(t))²] = Var[X(t)].
Cross Correlation
The cross-correlation function of X(t) and Y(t) is

RX,Y(t1, t2) = E[X(t1)Y(t2)].

The cross-covariance function of X(t) and Y(t) is

CX,Y(t1, t2) = E[(X(t1) − µX(t1))(Y(t2) − µY(t2))].

Remark: CX,Y(t1, t2) = RX,Y(t1, t2) = E[X(t1)Y(t2)] if µX(t1) = µY(t2) = 0.
Example 1
Problem. Let A ∼ Uniform[0, 1], X(t) = A cos(2πt). Find µX(t), RX(t1, t2), CX(t1, t2).
Solution:

µX(t) = E[A] cos(2πt) = (1/2) cos(2πt).

RX(t1, t2) = E[A²] cos(2πt1) cos(2πt2) = (1/3) cos(2πt1) cos(2πt2), since E[A²] = 1/3 for A ∼ Uniform[0, 1].

CX(t1, t2) = RX(t1, t2) − µX(t1)µX(t2) = (1/3 − 1/4) cos(2πt1) cos(2πt2) = (1/12) cos(2πt1) cos(2πt2).
Example 2
Problem. Let Θ ∼ Uniform[−π, π], X(t) = cos(ωt + Θ). Find µX(t), RX(t1, t2), CX(t1, t2).
Solution:

µX(t) = 0, as computed earlier.

RX(t1, t2) = E[cos(ωt1 + Θ) cos(ωt2 + Θ)] = (1/2) E[cos(ω(t1 − t2)) + cos(ω(t1 + t2) + 2Θ)] = (1/2) cos(ω(t1 − t2)), because the term cos(ω(t1 + t2) + 2Θ) averages to zero over Θ ∼ Uniform[−π, π].

CX(t1, t2) = RX(t1, t2) = (1/2) cos(ω(t1 − t2)), since µX(t) = 0.
Wide Sense Stationary
A random process X(t) is wide sense stationary (W.S.S.) if:

1. µX(t) = µX for all t,
2. CX(t1, t2) = CX(t1 − t2) for all t1, t2.

Notation: When X(t) is W.S.S., we define τ = t2 − t1 and write

CX(t1, t2) = CX(τ),
RX(t1, t2) = RX(τ).
Auto-correlation Function RX(τ)
Why study RX(τ)?

If the random process is W.S.S., then everything is characterized by RX(τ).

We seldom worry about µX(t), because µX(t) is a constant.

RX(τ) tells you how much correlation you have with a sample at distance τ away from you.

If RX(τ) is small, you and that sample have weak correlation.

If RX(τ) is a delta function, you are correlated only with yourself and no one else. This happens when X(t) is white noise (i.i.d. Gaussian at every t).
Physical Interpretation of RX (τ)
Why should we care about this?

We seldom know RX(τ), just like we seldom know the expectation of a random variable.

We need some method to estimate RX(τ).

Candidate: Consider a W.S.S. process X(t) and the function

R̂X(τ) def= (1/(2T)) ∫_{−T}^{T} X(t + τ)X(t) dt.    (1)

Is it good? It will be good if E[R̂X(τ)] = RX(τ).
Physical Interpretation of RX (τ)
Well, it is indeed the case!

Let R̂X(τ) def= (1/(2T)) ∫_{−T}^{T} X(t + τ)X(t) dt. Then,

E[R̂X(τ)] = RX(τ).    (2)

Proof.

E[R̂X(τ)] = (1/(2T)) ∫_{−T}^{T} E[X(t + τ)X(t)] dt
         = (1/(2T)) ∫_{−T}^{T} RX(τ) dt        (W.S.S.: the integrand does not depend on t)
         = RX(τ) · (1/(2T)) ∫_{−T}^{T} dt
         = RX(τ).
Physical Interpretation of RX (τ)
So what? If the signal X(t) is long enough, we can estimate RX(τ) by computing

R̂X(τ) = (1/(2T)) ∫_{−T}^{T} X(t + τ)X(t) dt.

What is R̂X(τ)? Compare:

Convolution: Y(τ) = ∫_{−T}^{T} X(t − τ)X(t) dt,

Correlation: Y(τ) = ∫_{−T}^{T} X(t + τ)X(t) dt.
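In discrete time the correlation estimator is just an inner product between the signal and a shifted copy of itself. A small sketch for a white-noise path (the noise level σ, the path length, and the lags are arbitrary; normalizing by the number of overlapping samples is one common convention, and the function name `rhat` is illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 4096
sigma = 0.5
x = sigma * rng.standard_normal(T)   # one sample path of white Gaussian noise

def rhat(x, tau):
    # Discrete analogue of (1/2T) * integral of X(t + tau) X(t) dt,
    # normalized by the number of overlapping samples.
    tau = abs(tau)
    n = len(x)
    return float(np.dot(x[tau:], x[:n - tau]) / (n - tau))

r0 = rhat(x, 0)   # near sigma^2 = 0.25: full correlation with yourself
r5 = rhat(x, 5)   # near 0: white noise is uncorrelated across lags
```

This matches the delta-function picture from the earlier slide: for white noise, R̂X(τ) is large only at τ = 0.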
Numerical Example
Generate 1000 sample paths
N = 1000; % number of sample paths
T = 1000; % number of time stamps
X = 0.1*randn(N,T);
xc = zeros(N,2*T-1);
plot(X(1,:),’b:’, ’LineWidth’, 2); hold on;
plot(X(2,:),’k:’, ’LineWidth’, 2); hold off;
Figure: Two noise sample paths plotted over time indices 0 to 1000; the amplitudes stay roughly within ±0.4.
Numerical Example
Visualize auto-correlation.
for i=1:N
xc(i,:) = xcorr(X(i,:));
end
plot(xc(1,:),’b:’, ’LineWidth’, 2); hold on;
plot(xc(2,:),’k:’, ’LineWidth’, 2);
plot(mean(xc),’r’, ’LineWidth’, 2); hold off;
Figure: Correlation of sample 1 and correlation of sample 2 (dotted), together with the averaged auto-correlation function (solid), plotted over lag indices 0 to 2000.
Power Spectral Density
Theorem (Einstein-Wiener-Khinchin Theorem)
The power spectral density SX(ω) of a W.S.S. process is

SX(ω) = ∫_{−∞}^{∞} RX(τ) e^{−jωτ} dτ = F(RX(τ)).

That is, SX(ω) is the Fourier transform of the autocorrelation function.
Proof: See Chapter 10. Not required for this course.
Examples
Example 1. Let RX(τ) = e^{−2α|τ|}. Find SX(ω).

Solution:

SX(ω) = F{RX(τ)} = 4α / (4α² + ω²).
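This transform pair can be checked numerically: approximate the Fourier integral of RX(τ) over a long window and compare with 4α/(4α² + ω²). A sketch (α, ω, and the grid are arbitrary choices; since RX(τ) is even, only the cosine part of e^{−jωτ} survives):

```python
import numpy as np

alpha, omega = 1.0, 3.0

# Riemann-sum approximation of the Fourier integral at one frequency.
tau = np.linspace(-50.0, 50.0, 200_001)
dtau = tau[1] - tau[0]
R = np.exp(-2.0 * alpha * np.abs(tau))

# R is even, so the e^{-j*omega*tau} kernel reduces to cos(omega * tau).
S_num = np.sum(R * np.cos(omega * tau)) * dtau
S_ana = 4.0 * alpha / (4.0 * alpha**2 + omega**2)
```

The window ±50 is wide enough that the e^{−2α|τ|} tail is negligible, so the numerical and analytical values agree closely.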
Example 2. Given that SX(ω) = (N0/2) rect(ω/(2W)), find RX(τ).

Solution:

RX(τ) = (N0/2) · (W/π) · sinc(Wτ).
Examples
Example 3. Let X(t) = a cos(ω0t + Θ), Θ ∼ Uniform[0, 2π]. Find RX(τ) and SX(ω).

Solution:

RX(τ) = (a²/2) cos(ω0τ) = (a²/2) · (e^{jω0τ} + e^{−jω0τ})/2.

Then, by taking the Fourier transform of both sides, we have

SX(ω) = (a²/2) · [2πδ(ω − ω0) + 2πδ(ω + ω0)]/2 = (πa²/2) [δ(ω − ω0) + δ(ω + ω0)].
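The claim RX(τ) = (a²/2) cos(ω0τ) can be verified by averaging over the random phase. A Monte Carlo sketch (the constants a, ω0, the seed, and the time points are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
a, omega0 = 2.0, 5.0
n = 400_000
Theta = rng.uniform(0.0, 2.0 * np.pi, size=n)   # Theta ~ Uniform[0, 2*pi]

t1, t2 = 0.7, 0.2
tau = t1 - t2

# Empirical E[X(t1) X(t2)] over realizations of the phase.
R_mc = np.mean(a * np.cos(omega0 * t1 + Theta) *
               a * np.cos(omega0 * t2 + Theta))
R_ana = (a**2 / 2.0) * np.cos(omega0 * tau)   # depends only on tau: W.S.S.
```

That the result depends on t1 and t2 only through τ = t1 − t2 is exactly the W.S.S. property of the random-phase sinusoid.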