
ELE 774 - Adaptive Signal Processing, Week 2

STOCHASTIC PROCESSES AND MODELS


EigenAnalysis

Read Appendices E, B, C and others!

Let R be the MxM autocorrelation matrix of the Mx1 observation vector u(n).

The eigenvalue problem is

Rq = λq,

where λ is an eigenvalue of R and q is the eigenvector of R corresponding to the eigenvalue λ.

There are M eigenvalues and M eigenvectors of R. Rewriting,

(R - λI)q = 0,

or

det(R - λI) = 0 → eigenvalues {λi}


EigenAnalysis

Property 1: Let the eigenvalues of R be λ1, λ2, ..., λM. Then the eigenvalues of R^k are λ1^k, λ2^k, ..., λM^k, and the eigenvalues of R^-1 are λ1^-1, λ2^-1, ..., λM^-1.

Proof: Rq = λq, so R²q = λ(Rq) = λ²q, etc.; similarly, Rq = λq implies R^-1 q = λ^-1 q.

Property 2: The M eigenvectors {qi} of R are linearly independent.

Proof: Suppose, to the contrary, that

v1 q1 + v2 q2 + ... + vM qM = 0

with at least one non-zero vi. Multiplying this equation by R, R², R³, ..., R^(M-1) and using Property 1 gives M equations, which can be collected as

[v1 q1  v2 q2  ...  vM qM] S = 0,

where S is the Vandermonde matrix formed from the (distinct) eigenvalues λ1, ..., λM. S is always non-singular, so [v1 q1  v2 q2  ...  vM qM] must be an all-zero matrix, i.e. every vi = 0.

!!! CONTRADICTION !!!


EigenAnalysis

Property 3: The eigenvalues {λi} are real and non-negative.

Proof: q^H R q = λ q^H q; since R is non-negative definite and q^H q > 0, λ = (q^H R q) / (q^H q) is real and ≥ 0.

Property 4: If the eigenvalues {λi} are distinct, then the eigenvectors {qi} are orthogonal to each other.

Proof: qj^H R qi = λi qj^H qi, and also qj^H R qi = (R qj)^H qi = λj qj^H qi (R is Hermitian, λj is real). Subtracting, (λi - λj) qj^H qi = 0; since λi - λj ≠ 0, it follows that qj^H qi = 0.


EigenAnalysis

Property 5: Diagonalisation of R. Let Q = [q1 q2 ... qM] (orthonormal eigenvectors) and Λ = diag(λ1, λ2, ..., λM). Since R qi = λi qi for i = 1, ..., M, then by stacking and multiplying by Q^H from the left we obtain

Q^H R Q = Λ,

or

R = Q Λ Q^H.
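A minimal numerical sketch of Properties 1, 3 and 5 (assuming NumPy; the Hermitian Toeplitz matrix below is purely illustrative, not taken from the course):

```python
import numpy as np

# Illustrative 3x3 Hermitian Toeplitz autocorrelation matrix built from r(0), r(1), r(2)
r = np.array([1.0, 0.5 + 0.2j, 0.1 - 0.1j])
R = np.array([[r[0],          r[1],          r[2]],
              [np.conj(r[1]), r[0],          r[1]],
              [np.conj(r[2]), np.conj(r[1]), r[0]]])

lam, Q = np.linalg.eigh(R)                       # eigenvalues are real (Property 3)

# Property 5: Q^H R Q = Lambda (diagonalisation)
print(np.allclose(Q.conj().T @ R @ Q, np.diag(lam)))                      # True

# Property 1: the eigenvalues of R^2 are lam^2
print(np.allclose(np.sort(np.linalg.eigvalsh(R @ R)), np.sort(lam**2)))   # True
```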


EigenAnalysis

Property 6: Spectral Theorem. R can be expanded in terms of its eigenvalues and eigenvectors as

R = Σ_{i=1}^{M} λi qi qi^H.

Property 7:

Property 8: Condition number, χ(R) = λmax / λmin.

A small value (~1) is good; a large value (→∞) means R is ill-conditioned, which is bad.

(For Aw = d, i.e. w = A^-1 d, a small perturbation in A or d results in a large perturbation in the solution w when A is ill-conditioned.)
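A small sketch of why a large condition number is harmful (assuming NumPy; the matrices are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_solution_change(A, d, eps=1e-6):
    """Solve Aw = d, then re-solve after a tiny perturbation of d."""
    w = np.linalg.solve(A, d)
    w_pert = np.linalg.solve(A, d + eps * rng.standard_normal(d.shape))
    return np.linalg.norm(w - w_pert) / np.linalg.norm(w)

d = np.array([1.0, 1.0])
A_good = np.array([[2.0, 0.1], [0.1, 2.0]])      # condition number close to 1
A_bad  = np.array([[1.0, 0.999], [0.999, 1.0]])  # nearly singular, huge condition number

for A in (A_good, A_bad):
    print(np.linalg.cond(A), relative_solution_change(A, d))
# The relative change in w is amplified roughly in proportion to the condition number.
```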


Stochastic Processes

Definition: The term stochastic process (random process) is used to describe the time evolution of a statistical phenomenon according to probabilistic laws. Examples: computer data, radar signals, measurement data.

A stochastic process is not just a single function of time; it represents an infinite number of different realizations. One particular realization is called a time series.

u(n), u(n-1), ... , u(n-M)

[Figure: one realization (time series) of the process u(n) plotted against time n.]


Stochastic Processes

A stochastic process is strictly stationary if its statistical properties are invariant to a time shift, i.e. the joint pdf of {u(n), u(n-1), ... , u(n-M)} remains the same regardless of n.

The joint pdf is not easy to obtain, so the first and second moments are used frequently instead.


Mean and Covariance

Mean (expected) value of u(n) (1st order):

μ(n) = E[u(n)]

Autocorrelation function of u(n) (2nd order):

r(n, n-k) = E[u(n) u*(n-k)]

Autocovariance function of u(n) (2nd order):

c(n, n-k) = E[(u(n) - μ(n))(u(n-k) - μ(n-k))*]

u(n): zero mean → r(n, n-k) = c(n, n-k)


Mean and Covariance

For stationary (strictly / w.s.s.) processes these quantities depend only on the lag k:

μ(n) = μ for all n,   r(n, n-k) = r(k),   c(n, n-k) = c(k) = r(k) - |μ|².

The lag k = 0 is important:

r(0) = E[|u(n)|²] (mean-square value),   c(0) = σ² (variance).

μ = 0 → r(0) = c(0) = σ², i.e. the mean-square value equals the variance.


[Figure: several realizations of the process; the ensemble average is taken across the realizations at a fixed time, while the sample (time) average is taken along a single realization.]

Ergodic Processes

Ensemble averages (expectations) are taken across the process → they can be obtained analytically (actual expectation).

Sample (time) averages are taken along the process → they can be obtained empirically, from realizations of the process (estimated expectation).


Ergodic Processes

If the sample averages converge to the ensemble averages, e.g. in the mean-square-error sense, we call the process u(n) ergodic.

We can estimate the mean value of the process from one realization by the time average

μ̂(N) = (1/N) Σ_{n=0}^{N-1} u(n).

If the process is mean ergodic, i.e.

lim_{N→∞} E[ |μ - μ̂(N)|² ] = 0,

then the time average converges to the true mean μ.


Ergodic Processes

For a w.s.s. process, the autocorrelation can be estimated from a single realization by the time average

r̂(k, N) = (1/N) Σ_{n=k}^{N-1} u(n) u*(n-k),

if the process is correlation ergodic, e.g. in the mean-square-error sense.

We will generally assume that the processes we deal with are ergodic.
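A minimal sketch of these time-average estimators (assuming NumPy/SciPy; the AR(1)-style test signal below is purely illustrative):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(1)

def sample_mean(u):
    """Time-average estimate of the mean (mean-ergodic assumption)."""
    return np.mean(u)

def sample_autocorr(u, k):
    """Time-average estimate of r(k) = E[u(n) u*(n-k)] (correlation-ergodic assumption)."""
    N = len(u)
    return np.sum(u[k:] * np.conj(u[:N - k])) / N

# Illustrative realization: white noise through the recursion u(n) = 0.8 u(n-1) + v(n)
v = rng.standard_normal(100_000)
u = lfilter([1.0], [1.0, -0.8], v)

print(sample_mean(u))                              # close to 0
print([sample_autocorr(u, k) for k in range(3)])   # approx. 0.8^k / (1 - 0.8^2) = 2.78 * 0.8^k
```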


Correlation Matrix

Let u(n) be the Mx1 observation vector

u(n) = [u(n), u(n-1), ..., u(n-M+1)]^T.

Define the MxM correlation matrix as

R = E[u(n) u^H(n)].
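A small sketch of estimating this matrix from data by replacing the expectation with a time average (assuming NumPy; the helper function name is mine, not from the course):

```python
import numpy as np

def correlation_matrix(u, M):
    """Time-average estimate of R = E[u(n) u(n)^H] for the M-tap observation
    vector u(n) = [u(n), u(n-1), ..., u(n-M+1)]^T (ergodicity assumed)."""
    N = len(u)
    R = np.zeros((M, M), dtype=complex)
    for n in range(M - 1, N):
        un = u[n - M + 1:n + 1][::-1]      # [u(n), u(n-1), ..., u(n-M+1)]
        R += np.outer(un, np.conj(un))     # u(n) u(n)^H
    return R / (N - M + 1)
```

For a stationary process the estimate is (approximately) Hermitian and Toeplitz, which is what Properties 1 and 2 below state.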


Properties of the Correlation Matrix

Property 1: The correlation matrix of a stationary discrete-time stochastic process is Hermitian symmetric,

R^H = R,

or, in terms of the autocorrelation function, r(-k) = r*(k).


Properties of the Correlation Matrix

Property 2: The correlation matrix of a stationary discrete-time stochastic process is Toeplitz.


Properties of the Correlation Matrix

Property 3: The correlation matrix of a discrete-time stochastic process is always non-negative definite and almost always positive definite.

Proof: Let a be an arbitrary (non-zero) Mx1 vector and let y = a^H u(n). Then

E[|y|²] = E[a^H u(n) u^H(n) a] = a^H R a.

Since E[|y|²] ≥ 0, we have a^H R a ≥ 0 for every a, i.e. R is non-negative definite.


Properties of the Correlation Matrix

Property 4: The correlation matrix of a w.s.s. process is nonsingular due to the unavoidable presence of additive noise: none of the eigenvalues of R is zero, i.e. det(R) ≠ 0.

Property 5: If the order of the elements of the vector u(n) is (time)-reversed, the effect is the transposition of the autocorrelation matrix.


Properties of the Correlation Matrix

Property 6: The correlation matrices RM and RM+1 of a stationary discrete-time stochastic process, pertaining to M and M+1 observations of the process, respectively, are related by

RM+1 = [ r(0)  r^H
         r     RM  ],

or equivalently

RM+1 = [ RM     r^B*
         r^BT   r(0) ],

where r(0) is the autocorrelation of u(n) for lag zero, r = [r(-1), r(-2), ..., r(-M)]^T, and r^B = [r(-M), r(-M+1), ..., r(-1)]^T is r with its elements arranged backwards.


Correlation Matrix of a Sine Wave + Noise

Let

u(n) = α exp(jωn) + v(n),

where v(n) is zero-mean additive white noise with variance σv², uncorrelated with the exponential term. Then

r(k) = E[u(n) u*(n-k)] = |α|² exp(jωk) + σv² δ(k),

and the MxM correlation matrix is R = |α|² e e^H + σv² I, with e = [1, exp(-jω), ..., exp(-j(M-1)ω)]^T.


Correlation Matrix of a Sine Wave + Noise

Given the noise power σv², the power |α|² of the exponential can be obtained from r(0) = |α|² + σv².

Given |α|², the angular frequency ω can be obtained from r(k) = |α|² exp(jωk), k > 0.

So the magnitude and the angular frequency can be found from the autocorrelation function.

The autocorrelation, however, lacks the phase information: the phase of α cannot be recovered from r(k).
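A short sketch of recovering |α|² and ω from an estimated autocorrelation (assuming NumPy; α, ω and σv below are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative complex exponential in circular white noise
N, alpha, omega, sigma_v = 10_000, 2.0 * np.exp(1j * 0.7), 1.2, 0.5
n = np.arange(N)
v = sigma_v / np.sqrt(2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
u = alpha * np.exp(1j * omega * n) + v

def r_hat(u, k):
    return np.mean(u[k:] * np.conj(u[:len(u) - k]))

alpha_sq_hat = r_hat(u, 0).real - sigma_v**2   # r(0) = |alpha|^2 + sigma_v^2
omega_hat = np.angle(r_hat(u, 1))              # r(1) = |alpha|^2 exp(j omega)
print(alpha_sq_hat, omega_hat)                 # close to |alpha|^2 = 4 and omega = 1.2
# The phase of alpha (0.7 here) never appears in r(k): it is lost, as stated above.
```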


Stochastic Models

v(n): zero-mean, white random process with variance σv². Generally v(n) is assumed to be Gaussian, i.e. zero-mean Additive White Gaussian Noise (AWGN) with variance σv², N(0, σv²).

Input-output relation of the linear (transversal) filter:

u(n) + a1 u(n-1) + ... + aM u(n-M) = v(n) + b1 v(n-1) + ... + bK v(n-K),

where, on the left-hand side, u(n) is the present value of the model output and the remaining terms are a linear combination of past values of the model output, while the right-hand side is a linear combination of the present and past values of the model input.

A stochastic process described in this way is called a linear process.


Stochastic Models

1. AR (AutoRegressive) Model: only the past values of the model output and the present value of the model input are used.

2. MA (Moving-Average) Model: only the present and past values of the model input are used; no past values of the model output.

3. ARMA (AutoRegressive Moving-Average) Model: the past values of both the model input and the model output are used.

(These are causal models; we are interested in present and past values, not future ones.)


AR Model

Input-output relation of the M-th order AR model:

u(n) + a1 u(n-1) + a2 u(n-2) + ... + aM u(n-M) = v(n),

where a1, a2, ..., aM are the AR parameters.

Another perspective: the left-hand side is the convolution of u(n) with the sequence {1, a1, ..., aM}; convolution in the time domain is multiplication in the z-domain, so

A(z) U(z) = V(z),   with A(z) = 1 + a1 z^-1 + a2 z^-2 + ... + aM z^-M.


AR Model

Two interpretations of the model (assuming {ak} is given):

Process generator: given the white process (noise) v(n), the stochastic process u(n) is generated as

u(n) = v(n) - a1 u(n-1) - ... - aM u(n-M),

i.e. v(n) is passed through the all-pole filter 1/A(z), which has an IIR impulse response.

AR Generator (all-pole filter)


AR Model

Process analyser: given the stochastic process u(n), the white process v(n) is produced as

v(n) = u(n) + a1 u(n-1) + ... + aM u(n-M),

i.e. u(n) is passed through the filter A(z), which has an FIR impulse response.

AR Analyser (all-zero filter)
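A small sketch of this generator/analyser pair for an AR(2) model (assuming NumPy/SciPy; the coefficients are illustrative and chosen so that the poles are inside the unit circle):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)

# Illustrative AR(2) model: u(n) - 0.75 u(n-1) + 0.5 u(n-2) = v(n)
a = np.array([1.0, -0.75, 0.5])        # A(z) = 1 - 0.75 z^-1 + 0.5 z^-2
v = rng.standard_normal(50_000)        # white noise, sigma_v^2 = 1

# Process generator: all-pole (IIR) filter 1/A(z) driven by white noise
u = lfilter([1.0], a, v)

# Process analyser: all-zero (FIR) filter A(z) recovers the white input
v_rec = lfilter(a, [1.0], u)
print(np.allclose(v, v_rec))           # True (up to numerical precision)
```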


MA Model

Input-output relation of the M-th order MA model:

u(n) = v(n) + b1 v(n-1) + b2 v(n-2) + ... + bM v(n-M),

where b1, b2, ..., bM are the MA parameters.

Two interpretations of the model (assuming {bk} is given):

Process generator: given the white process (noise) v(n), the stochastic process u(n) is generated by passing v(n) through the all-zero filter B(z) = 1 + b1 z^-1 + ... + bM z^-M, which has an FIR impulse response.

MA Generator (all-zero filter)


MA Model

Process analyser: given the stochastic process u(n), the white process v(n) is produced by passing u(n) through the inverse (all-pole) filter 1/B(z), which has an IIR impulse response.
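A matching sketch for the MA model (assuming NumPy/SciPy; the coefficients are illustrative and B(z) is minimum phase so that the inverse filter is stable):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(4)

# Illustrative MA(2) model: u(n) = v(n) + 0.6 v(n-1) + 0.2 v(n-2)
b = np.array([1.0, 0.6, 0.2])          # B(z), zeros inside the unit circle
v = rng.standard_normal(50_000)

u = lfilter(b, [1.0], v)               # MA generator: all-zero (FIR) filter B(z)
v_rec = lfilter([1.0], b, u)           # MA analyser: all-pole (IIR) inverse filter 1/B(z)
print(np.allclose(v, v_rec))           # True
```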


ARMA Model

Input-output relation of the (M, K)-th order ARMA model:

u(n) + a1 u(n-1) + ... + aM u(n-M) = v(n) + b1 v(n-1) + ... + bK v(n-K),

where a1, ..., aM are the AR parameters and b1, ..., bK are the MA parameters.

Process generator: given the white process (noise) v(n), the stochastic process u(n) is generated by passing v(n) through the pole-zero filter B(z)/A(z).

Process analyser: given the stochastic process u(n), the white process v(n) is produced by passing u(n) through the inverse pole-zero filter A(z)/B(z). Both filters have IIR impulse responses in general.


ARMA Model

ARMA generator of order (M, K), assuming that M > K.
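A brief sketch of an ARMA(2, 1) generator and analyser (assuming NumPy/SciPy; coefficients are illustrative):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(5)

# Illustrative ARMA(2, 1): u(n) - 0.75 u(n-1) + 0.5 u(n-2) = v(n) + 0.4 v(n-1)
a = np.array([1.0, -0.75, 0.5])        # A(z), poles inside the unit circle
b = np.array([1.0, 0.4])               # B(z), zero inside the unit circle
v = rng.standard_normal(50_000)

u = lfilter(b, a, v)                   # ARMA generator: pole-zero filter B(z)/A(z)
v_rec = lfilter(a, b, u)               # ARMA analyser: inverse filter A(z)/B(z)
print(np.allclose(v, v_rec))           # True
```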


Wold Decomposition

Theorem: Any stationary discrete-time stochastic process x(n) may be decomposed into the sum of a general linear process u(n) and a predictable process s(n),

x(n) = u(n) + s(n),

where

1. u(n) and s(n) are uncorrelated processes,

2. u(n) is a general linear process represented by an MA model (possibly of infinite order) with white noise v(n) as input,

3. s(n) can be predicted from its own past values with zero prediction variance.


Stationarity of AR Processes

For asymptotic stationarity of the AR process, all the poles of the (all-pole) filter 1/A(z) in the AR model must lie inside the unit circle in the z-plane.

Assuming stationarity, an important recursive relation holds for the autocorrelation of an AR process. Observe that E[v(n) u*(n-l)] = 0 for l > 0; then multiply both sides of the AR equation by u*(n-l) and take the expectation to get

r(l) + a1 r(l-1) + ... + aM r(l-M) = 0,   l > 0,

i.e. r(l) = -Σ_{k=1}^{M} ak r(l-k) for l > 0.

The autocorrelation function must satisfy the characteristic (difference) equation of the AR model.
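A quick sketch of the pole check for asymptotic stationarity (assuming NumPy; the coefficient sets are illustrative):

```python
import numpy as np

def is_asymptotically_stationary(a):
    """a = [1, a1, ..., aM]: the AR process is asymptotically stationary
    iff all poles of 1/A(z), i.e. all roots of z^M A(z), lie inside the unit circle."""
    poles = np.roots(a)
    return bool(np.all(np.abs(poles) < 1.0))

print(is_asymptotically_stationary([1.0, -0.75, 0.5]))   # True: poles at radius ~0.71
print(is_asymptotically_stationary([1.0, -2.0, 0.5]))    # False: one pole outside the unit circle
```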


Yule-Walker Equations

To specify an AR model we need the model coefficients a1, a2, ..., aM and the variance σv² of the input noise. Given the autocorrelation function of the AR process, how can we find these parameters?

Rewrite the autocorrelation recursion with wk = -ak:

r(l) = Σ_{k=1}^{M} wk r(l-k),   l = 1, 2, ..., M.

Stacking the equations for each lag l, we write the Yule-Walker equations

R w = r,   with w = [w1, w2, ..., wM]^T and r = [r(1), r(2), ..., r(M)]^T.


Yule-Walker Equations

Assuming R is non-singular, the coefficients {wk} (hence {ak}) are given by

w = R^-1 r.

Noise variance: for l = 0, we have (slide 28)

r(0) + Σ_{k=1}^{M} ak r(-k) = σv².

Therefore

σv² = r(0) + Σ_{k=1}^{M} ak r*(k) = r(0) - Σ_{k=1}^{M} wk r*(k).

Section 1.9: Computer Experiment: Autoregressive Process of Order Two
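In the same spirit, a minimal sketch of estimating a real-valued AR(2) model from data via the Yule-Walker equations (assuming NumPy/SciPy; the true coefficients are the illustrative ones used earlier):

```python
import numpy as np
from scipy.signal import lfilter
from scipy.linalg import toeplitz

rng = np.random.default_rng(6)

# Generate an illustrative real AR(2) process: u(n) - 0.75 u(n-1) + 0.5 u(n-2) = v(n)
a_true = np.array([1.0, -0.75, 0.5])
u = lfilter([1.0], a_true, rng.standard_normal(200_000))     # sigma_v^2 = 1

M = 2
r = np.array([np.mean(u[k:] * u[:len(u) - k]) for k in range(M + 1)])   # r(0), ..., r(M)

# Yule-Walker equations: R w = [r(1), ..., r(M)]^T, with w_k = -a_k
R = toeplitz(r[:M])
w = np.linalg.solve(R, r[1:])
a_hat = np.concatenate(([1.0], -w))

sigma_v2_hat = r[0] - w @ r[1:]       # sigma_v^2 = r(0) - sum_k w_k r(k) (real-valued case)

print(a_hat)                          # close to [1, -0.75, 0.5]
print(sigma_v2_hat)                   # close to 1.0
```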


Complex Gaussian Processes

(Complex-valued) Gaussian processes are frequently encountered. Consider a complex Gaussian process u(n) consisting of N samples.

First-order statistics (zero mean): E[u(n)] = 0 for all n.

Second-order statistics (autocorrelation function): r(k) = E[u(n) u*(n-k)].

The NxN autocorrelation matrix R of u(n) can be constructed from r(k). With this definition of the first- and second-order statistics, the process is w.s.s.

The shorthand notation for this process is N(0, R).
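A small sketch of drawing one realization of such a zero-mean complex Gaussian vector with a prescribed correlation matrix (assuming NumPy; the r(k) values are illustrative and circular symmetry is assumed):

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative Hermitian Toeplitz correlation matrix built from r(0), r(1), r(2)
r = np.array([1.0, 0.6 + 0.2j, 0.2 - 0.1j])
N = len(r)
R = np.array([[r[j - i] if i <= j else np.conj(r[i - j]) for j in range(N)]
              for i in range(N)])

# Draw u ~ N(0, R): colour circular white complex Gaussian noise with a Cholesky factor
L = np.linalg.cholesky(R)                       # R = L L^H
w = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)   # E[w w^H] = I
u = L @ w                                       # E[u u^H] = L L^H = R
```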


Complex Gaussian Processes

The probability density function (pdf) of u = [u(1), u(2), ..., u(N)]^T is totally defined by the mean and the correlation matrix:

f(u) = (1 / (π^N det(R))) exp(-u^H R^-1 u),

where R is the NxN autocorrelation matrix defined above.

Hence, the process u(n) is

1. strictly stationary (totally defined by the w.s.s. parameters),

2. circularly complex, i.e. E[u(n) u(k)] = 0 for all n, k,

3.


Power Spectral Density

The autocorrelation function is a time-domain description of the second-order statistics of a process, and the power spectral density (PSD) is the corresponding frequency-domain description.

PSD analysis requires w.s.s. processes (non-stationary processes can be analysed by, e.g., the wavelet transform).

Let uN(n) be a window of u(n) with N samples: uN(n) = u(n) for n = 0, 1, ..., N-1 and uN(n) = 0 otherwise.

Take the DTFT of uN(n):

UN(ω) = Σ_{n=0}^{N-1} u(n) exp(-jωn).

Calculate the squared magnitude of UN(ω) and take the expectation:

E[|UN(ω)|²] = Σ_{n=0}^{N-1} Σ_{k=0}^{N-1} E[u(n) u*(k)] exp(-jω(n-k)).


Power Spectral Density

But we know that

E[u(n) u*(k)] = r(n-k),

then

E[|UN(ω)|²] = Σ_{n=0}^{N-1} Σ_{k=0}^{N-1} r(n-k) exp(-jω(n-k)).

Let l = n-k; after rearranging,

(1/N) E[|UN(ω)|²] = Σ_{l=-N+1}^{N-1} (1 - |l|/N) r(l) exp(-jωl).

Finally, the PSD is defined as

S(ω) = lim_{N→∞} (1/N) E[|UN(ω)|²] = Σ_{l=-∞}^{∞} r(l) exp(-jωl).
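A numerical sketch of this definition: averaging |UN(ω)|²/N over many independent realizations approaches the DTFT of r(l) (assuming NumPy/SciPy; the AR(2) process is the illustrative one used earlier, for which the analytic PSD is σv²/|A(e^jω)|²):

```python
import numpy as np
from scipy.signal import lfilter, freqz

rng = np.random.default_rng(8)

a = np.array([1.0, -0.75, 0.5])          # illustrative AR(2), sigma_v^2 = 1
N, trials, nfft = 256, 2000, 512

# Average (1/N) |U_N(omega)|^2 over many independent realizations
S_est = np.zeros(nfft)
for _ in range(trials):
    u = lfilter([1.0], a, rng.standard_normal(N))
    S_est += np.abs(np.fft.fft(u, nfft)) ** 2 / N
S_est /= trials

# Analytic PSD of the AR process on the same frequency grid
w, H = freqz([1.0], a, worN=nfft, whole=True)    # H(e^jw) = 1 / A(e^jw)
S_true = np.abs(H) ** 2

print(np.max(np.abs(S_est - S_true) / S_true))   # shrinks as N and the number of trials grow
```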


Properties of PSD

Property 1: The autocorrelation function and power spectral density of a w.s.s. stochastic process form a (discrete-time) Fourier transform pair:

S(ω) = Σ_{l=-∞}^{∞} r(l) exp(-jωl),   r(l) = (1/2π) ∫_{-π}^{π} S(ω) exp(jωl) dω.

Property 2: The frequency support of the PSD is the Nyquist interval -π < ω ≤ π; outside this interval the PSD simply repeats periodically (with period 2π).


Properties of PSD

Property 3: The PSD of a stationary discrete-time stochastic process is real.

Property 4: The PSD of a real-valued stationary discrete-time stochastic process is even; if the process is complex-valued, it is not necessarily even.

u(n) real → r(-l) = r(l) → S(-ω) = S(ω)

u(n) complex → r(-l) = r*(l) → S(-ω) ≠ S(ω) in general


Properties of PSD

Property 5: The mean-square value (power; variance for a zero-mean process) of a stationary discrete-time stochastic process equals, up to the scaling factor 1/2π, the area under the S(ω) curve:

r(0) = (1/2π) ∫_{-π}^{π} S(ω) dω.

Property 6: The PSD of a stationary discrete-time stochastic process is non-negative.

Property 7: Let {λi} be the eigenvalues of the MxM autocorrelation matrix R of a stationary discrete-time stochastic process, and S(ω) be the corresponding PSD. Then

min_ω S(ω) ≤ λi ≤ max_ω S(ω),   i = 1, 2, ..., M.


Transmission of a Stationary Process Through a Linear Filter

The filter is LTI and the input process is stationary.

If the impulse response of the filter is h(n), then its output is the convolution

y(n) = Σ_k h(k) u(n-k),

and the autocorrelation of the output is

ry(l) = Σ_i Σ_k h(i) h*(k) ru(l - i + k).

Taking the DTFT, we obtain the PSD of the output:

Sy(ω) = |H(ω)|² Su(ω).
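A quick check of this input/output PSD relation in the same averaged-periodogram style as before (assuming NumPy/SciPy; the FIR impulse response h is illustrative and the input is unit-variance white noise, so Su(ω) = 1):

```python
import numpy as np
from scipy.signal import lfilter, freqz

rng = np.random.default_rng(9)

h = np.array([0.5, 1.0, 0.5])            # illustrative FIR impulse response h(n)
N, trials, nfft = 256, 2000, 512

S_y_est = np.zeros(nfft)
for _ in range(trials):
    u = rng.standard_normal(N)           # white input, S_u(omega) = 1
    y = lfilter(h, [1.0], u)
    S_y_est += np.abs(np.fft.fft(y, nfft)) ** 2 / N
S_y_est /= trials

w, H = freqz(h, [1.0], worN=nfft, whole=True)
S_y_true = np.abs(H) ** 2                # S_y(omega) = |H(omega)|^2 * S_u(omega)

print(np.max(np.abs(S_y_est - S_y_true)))   # small; shrinks as N and the number of trials grow
```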


Power Spectrum Analyser

Take a window of bandwidth 2Δω around ωc, which can be obtained with a band-pass filter (BPF).

With 2Δω « ωc, we can assume that S(ω) is approximately constant over the band. Then, using Property 5, the power in this window is

P ≈ (1/2π) ∫_{ωc-Δω}^{ωc+Δω} S(ω) dω ≈ (Δω/π) S(ωc).

Or, equivalently,

S(ωc) ≈ π P / Δω,

so the PSD at ωc can be read off from the measured output power of the analyser.