# Sect. 1.5: Probability Distribution for Large N


We've found that, for the one-dimensional Random Walk Problem, the probability distribution is the Binomial Distribution:

W_N(n1) = [N!/(n1! n2!)] p^n1 q^n2

Here, q = 1 − p and n2 = N − n1. The mean value is ⟨n1⟩ = Np and the width is Δ*n1 = √(Npq), so the relative width is

(Δ*n1)/⟨n1⟩ = √q/√(pN)

As N increases, the mean value increases ∝ N, and the relative width decreases ∝ N^(−½).

Figure: the Binomial Distribution W_N(n1) for N = 20, p = q = ½.

Now, imagine N getting larger and larger. Based on what we just said, the relative width of W_N(n1) gets smaller and smaller while the mean value gets larger and larger. If N is VERY, VERY large, we can treat W(n1) as a continuous function of a continuous variable n1. For large N, it's convenient to work with the natural log ln[W(n1)] rather than with W(n1) itself. Now, do a Taylor series expansion of ln[W(n1)] about the value of n1 where W(n1) has a maximum. Detailed math (in the text) shows that this value of n1 is its average value ⟨n1⟩ = Np, and that the width is Δ*n1 = √(Npq). To evaluate ln[W(n1)], use Stirling's Approximation (Appendix A-6) for logs of large factorials.

Stirling's Approximation: if n is a large integer, the natural log of its factorial is approximately

ln[n!] ≈ n[ln(n) − 1]
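Stirling's approximation is easy to check numerically. A minimal sketch (the values of n are illustrative), using `math.lgamma(n + 1)` to get ln(n!) without overflow:

```python
import math

# Compare ln(n!) with Stirling's approximation n*[ln(n) - 1] for growing n.
for n in (10, 100, 1000):
    exact = math.lgamma(n + 1)           # ln(n!) computed without overflow
    stirling = n * (math.log(n) - 1)     # Stirling's approximation
    rel_err = abs(exact - stirling) / exact
    print(f"n={n:5d}  ln(n!)={exact:11.3f}  Stirling={stirling:11.3f}  rel. error={rel_err:.2%}")
```

The relative error shrinks as n grows, which is why the approximation is safe in the large-N limit used above.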

In this large-N, large-n1 limit, the Binomial Distribution W(n1) becomes (shown in the text):

W(n1) = [2πσ²]^(−½) exp[−(n1 − ⟨n1⟩)²/(2σ²)]

Here, σ² ≡ ⟨(Δn1)²⟩. This is called the Gaussian Distribution or the Normal Distribution. We've found that ⟨n1⟩ = Np, σ² = Npq. The reasoning which led to this result for the large-N, continuous-n1 limit started with the Binomial Distribution. However, this is a very general result: if one starts with ANY discrete probability distribution and takes the limit of LARGE N, one obtains a Gaussian or Normal Distribution. This is called The Central Limit Theorem or The Law of Large Numbers.

Sect. 1.6: Gaussian Probability Distributions

In the limit of a large number of steps in the random walk, N >> 1, the Binomial Distribution becomes a Gaussian Distribution:

W(n1) = [2πσ²]^(−½) exp[−(n1 − ⟨n1⟩)²/(2σ²)], with ⟨n1⟩ = Np, σ² = Npq
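How good is the Gaussian already at modest N? A minimal numerical sketch (N = 20, p = q = ½, matching the earlier figure) comparing the exact binomial with its Gaussian approximation near the mean:

```python
import math

# Binomial W(n1) vs. its Gaussian approximation for N = 20, p = q = 1/2.
N, p = 20, 0.5
q = 1 - p
mean, var = N * p, N * p * q     # <n1> = Np, dispersion = Npq

def binomial(n1):
    return math.comb(N, n1) * p**n1 * q**(N - n1)

def gaussian(n1):
    return math.exp(-(n1 - mean)**2 / (2 * var)) / math.sqrt(2 * math.pi * var)

for n1 in (8, 10, 12):
    print(f"n1={n1:2d}: binomial={binomial(n1):.4f}  gaussian={gaussian(n1):.4f}")
```

Even at N = 20 the two agree to a few parts in a thousand near the peak; the agreement improves rapidly as N grows.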

Recall that n1 = ½(N + m), where the displacement is x = mℓ, and that ⟨m⟩ = N(p − q). We can use this to convert to the probability distribution for the displacement m, in the large-N limit (after algebra):

P(m) = [2π⟨(Δm)²⟩]^(−½) exp[−(m − ⟨m⟩)²/(2⟨(Δm)²⟩)], with ⟨m⟩ = N(p − q), ⟨(Δm)²⟩ = 4Npq

Explicitly,

P(m) = [8πNpq]^(−½) exp[−(m − N(p − q))²/(8Npq)]

We can express this in terms of x = mℓ. For N >> 1, x can be treated as continuous; in this case |P(m + 2) − P(m)| << P(m). After some math, we obtain the standard Gaussian Distribution form:

P(x)dx = (2πσ²)^(−½) exp[−(x − μ)²/(2σ²)] dx

Here, μ ≡ ℓN(p − q) is the mean value of x, and σ ≡ 2ℓ√(Npq) is the width of the distribution. NOTE: the generality of the arguments we've used is such that a Gaussian distribution occurs in the limit of large numbers for any discrete distribution.


Note: to deal with Gaussian distributions, you need to get used to doing integrals with them! Many of these are tabulated!! Is P(x) properly normalized? That is, does ∫P(x)dx = 1 (limits −∞ < x < ∞)?

∫P(x)dx = (2πσ²)^(−½) ∫exp[−(x − μ)²/(2σ²)]dx
= (2πσ²)^(−½) ∫exp[−y²/(2σ²)]dy   (y ≡ x − μ)
= (2πσ²)^(−½) [√(2πσ²)]   (from a table)

So ∫P(x)dx = 1.
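The normalization can also be checked numerically. A minimal sketch using simple trapezoidal integration, with arbitrary illustrative values of μ and σ:

```python
import math

# Numerically verify that P(x) = (2*pi*sigma^2)^(-1/2) exp(-(x-mu)^2/(2 sigma^2))
# integrates to 1. The values of mu and sigma are arbitrary.
mu, sigma = 1.5, 2.0

def P(x):
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / math.sqrt(2 * math.pi * sigma**2)

# Trapezoidal rule over mu +/- 10 sigma, which captures essentially all the area.
a, b, n = mu - 10 * sigma, mu + 10 * sigma, 100_000
h = (b - a) / n
total = (0.5 * (P(a) + P(b)) + sum(P(a + i * h) for i in range(1, n))) * h
print(f"integral of P(x) dx = {total:.6f}")
```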

Compute the mean value of x:

⟨x⟩ = ∫x P(x)dx   (limits −∞ < x < ∞)
= (2πσ²)^(−½) ∫x exp[−(x − μ)²/(2σ²)]dx
= (2πσ²)^(−½) ∫(y + μ) exp[−y²/(2σ²)]dy   (y ≡ x − μ)
= (2πσ²)^(−½) ∫y exp[−y²/(2σ²)]dy + μ(2πσ²)^(−½) ∫exp[−y²/(2σ²)]dy

Now ∫y exp[−y²/(2σ²)]dy = 0 (odd function times even function), and ∫exp[−y²/(2σ²)]dy = √(2πσ²) (from a table). So ⟨x⟩ = μ = ℓN(p − q).

Compute the dispersion in x:

⟨(x − μ)²⟩ = ∫(x − μ)² P(x)dx   (limits −∞ < x < ∞)
= (2πσ²)^(−½) ∫(x − μ)² exp[−(x − μ)²/(2σ²)]dx
= (2πσ²)^(−½) ∫y² exp[−y²/(2σ²)]dy   (y ≡ x − μ)
= (2πσ²)^(−½) (½)(√π)(2σ²)^1.5   (from a table)
= σ²

So ⟨(x − μ)²⟩ = σ² = 4Npqℓ².
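These two results can be confirmed by sampling. A minimal sketch, with illustrative values of ℓ, N, p (the step length `l`, `N`, and `p` below are assumptions, not values from the text):

```python
import math
import random
import statistics

# Sample from a Gaussian with mu = l*N*(p - q) and sigma = 2*l*sqrt(Npq),
# then confirm that the sample mean and variance match these parameters.
random.seed(42)                         # fixed seed for reproducibility
l, N, p = 1.0, 10_000, 0.6              # illustrative values
q = 1 - p
mu = l * N * (p - q)                    # <x> = l N (p - q) = 2000
sigma = 2 * l * math.sqrt(N * p * q)    # 2 l sqrt(Npq) ~ 98.0

samples = [random.gauss(mu, sigma) for _ in range(200_000)]
print(f"sample mean     = {statistics.fmean(samples):10.2f}  (expected {mu})")
print(f"sample variance = {statistics.variance(samples):10.2f}  (expected {sigma**2:.2f})")
```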

Comparison of Binomial & Gaussian Distributions

Figure: dots show the Binomial distribution; the curve shows the Gaussian with the same mean and the same width.

Figure: the width of a Gaussian; it falls to 1/e of its maximum at x = μ ± √2 σ, so this width is 2√2 σ.

Figure: areas under portions of a Gaussian.

Sect. 1.7: Probability Distributions Involving Several Variables

Consider a statistical description of a situation with more than one variable. For example, take 2 variables u, v:
- The possible values of u are u1, u2, u3, …, uM.
- The possible values of v are v1, v2, v3, …, vN.

Let P(ui,vj) ≡ the probability that u = ui and v = vj simultaneously. We must have Σ_{i=1..M} Σ_{j=1..N} P(ui,vj) = 1.

Pu(ui) ≡ the probability that u = ui, independent of the value v = vj: Pu(ui) ≡ Σ_{j=1..N} P(ui,vj).
Pv(vj) ≡ the probability that v = vj, independent of the value u = ui: Pv(vj) ≡ Σ_{i=1..M} P(ui,vj).

Of course, Σ_{i=1..M} Pu(ui) = 1 and Σ_{j=1..N} Pv(vj) = 1.

In the special case that u and v are Statistically Independent (Uncorrelated), then and only then: P(ui,vj) ≡ Pu(ui)Pv(vj).
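The marginal sums and the independence condition can be sketched for a small made-up table of P(ui, vj) (the numbers below are illustrative, chosen to be normalized and independent):

```python
# A small joint distribution P(u_i, v_j) stored as a table; compute the
# marginals Pu, Pv and test the factorization P(u,v) = Pu(u) * Pv(v).
P = {
    ("u1", "v1"): 0.12, ("u1", "v2"): 0.28,
    ("u2", "v1"): 0.18, ("u2", "v2"): 0.42,
}
assert abs(sum(P.values()) - 1.0) < 1e-12   # normalization

Pu, Pv = {}, {}
for (u, v), prob in P.items():
    Pu[u] = Pu.get(u, 0.0) + prob   # Pu(u_i) = sum over j of P(u_i, v_j)
    Pv[v] = Pv.get(v, 0.0) + prob   # Pv(v_j) = sum over i of P(u_i, v_j)

# u and v are independent iff P(u_i, v_j) = Pu(u_i) * Pv(v_j) for every pair.
independent = all(abs(P[(u, v)] - Pu[u] * Pv[v]) < 1e-12 for (u, v) in P)
print("marginal Pu:", Pu)
print("marginal Pv:", Pv)
print("statistically independent?", independent)
```

Changing any single entry (while re-normalizing) breaks the factorization, which is the "then and only then" above.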

General Discussion of Mean Values: if F(u,v) is any function of u and v, its mean value is given by:

⟨F⟩ = Σ_{i=1..M} Σ_{j=1..N} P(ui,vj) F(ui,vj)

If F(u,v) and G(u,v) are any 2 functions of u, v, one can easily show: ⟨F + G⟩ = ⟨F⟩ + ⟨G⟩. If f(u) is any function of u and g(v) is any function of v, in general ⟨f(u)g(v)⟩ ≠ ⟨f(u)⟩⟨g(v)⟩. The only case in which this becomes an equality is when u and v are statistically independent.

Sect. 1.8: Comments on Continuous Probability Distributions

Everything we've discussed for discrete distributions generalizes in obvious ways. Let u be a continuous random variable in the range a1 ≤ u ≤ a2. The probability of finding u in the range u to u + du is ≡ P(u)du, where P(u) ≡ the probability density of the distribution. Normalization: ∫P(u)du = 1 (limits a1 ≤ u ≤ a2). Mean values: ⟨F(u)⟩ ≡ ∫F(u)P(u)du.

Consider two continuous random variables: u, continuous in the range a1 ≤ u ≤ a2, and v, continuous in the range b1 ≤ v ≤ b2.

The probability of finding u in the range u to u + du AND v in the range v to v + dv is ≡ P(u,v)dudv, where P(u,v) ≡ the probability density of the distribution. Normalization: ∫∫P(u,v)dudv = 1 (limits a1 ≤ u ≤ a2, b1 ≤ v ≤ b2). Mean values: ⟨G(u,v)⟩ ≡ ∫∫G(u,v)P(u,v)dudv.

Functions of Random Variables

An important, often-occurring problem: consider a random variable u, and suppose φ(u) is any continuous function of u. Question: if P(u)du ≡ the probability of finding u in the range u to u + du, what is the probability W(φ)dφ of finding φ in the range φ to φ + dφ?

Answer: use essentially the chain rule of differentiation, but take the absolute value to make sure that W ≥ 0:

W(φ)dφ ≡ P(u)|du/dφ|dφ

Caution!! φ(u) may not be a single-valued function of u!

Example: a 2-dimensional vector B of constant magnitude |B| = B is EQUALLY LIKELY to point in any direction θ in the x-y plane. The probability of finding θ between θ and θ + dθ is P(θ)dθ ≡ dθ/(2π). Question: what is the probability W(Bx)dBx that the x component of B lies between Bx and Bx + dBx? Clearly, we must have −B ≤ Bx ≤ B. Also, each value of dBx corresponds to 2 possible values of dθ, and dBx = |B sinθ|dθ.

So, we have:

W(Bx)dBx = 2P(θ)|dθ/dBx|dBx = (π)^(−1) dBx/|B sinθ|

Note also that |sinθ| = [1 − cos²θ]^½ = [1 − (Bx)²/B²]^½, so finally

W(Bx)dBx = (πB)^(−1)[1 − (Bx)²/B²]^(−½)dBx for −B ≤ Bx ≤ B; W(Bx) = 0 otherwise.

W not only has a maximum at Bx = ±B, it diverges there! It has a minimum at Bx = 0. So, it looks like W blows up at Bx = ±B, but it can be shown that its integral is finite, so W(Bx) is a proper probability: ∫W(Bx)dBx = 1 (limits −B ≤ Bx ≤ B).
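This change of variables is easy to check by Monte Carlo: draw θ uniformly on [0, 2π), form Bx = B cosθ, and compare the fraction of samples near a given Bx with the density W(Bx). A minimal sketch (bin placement and sample count are illustrative):

```python
import math
import random

# Monte Carlo check of W(Bx) = 1/(pi*B*sqrt(1 - (Bx/B)^2)) for Bx = B*cos(theta),
# theta uniform on [0, 2*pi).
random.seed(1)
B = 1.0
n = 500_000
samples = [B * math.cos(random.uniform(0.0, 2.0 * math.pi)) for _ in range(n)]

# Fraction of samples in a small bin around Bx = 0 vs. the predicted probability.
half_width = 0.05
frac = sum(1 for bx in samples if abs(bx) < half_width) / n
predicted = (2 * half_width) / (math.pi * B)   # W(0) = 1/(pi*B), times bin width
print(f"observed fraction near Bx=0: {frac:.4f}, predicted: {predicted:.4f}")
```

A histogram of `samples` would also show the piling up near Bx = ±B, where W diverges.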

The Poisson Probability Distribution

In "Researches on the Probability of Criminal and Civil Verdicts" (1837), Poisson looked at the form of the binomial distribution when the number of trials is large. He derived the cumulative Poisson distribution as the limiting case of the binomial when the chance of success tends to zero.

Simeon Denis Poisson

(Incidentally, "poisson" is French for "fish"!)

Another Useful Probability Distribution: The Poisson Distribution

The Poisson Distribution is an approximation to the binomial distribution for the special case when the average number of successes is very much smaller than the possible number, i.e. when λ = np << n (n large, p small). The Poisson Distribution models counts: if events happen at a constant rate over time, the Poisson distribution gives the probability of X events occurring in a time T. This distribution tells us the probability of all possible numbers of counts, from 0 to infinity. If X ≡ the number of counts per second, then the Poisson probability that X = k (a particular count) is:

P(X = k) = λ^k e^(−λ) / k!

Here, λ ≡ the average number of counts per second.
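The limiting relationship to the binomial can be checked directly. A minimal sketch (the values of n and p are illustrative, chosen so that λ = np = 3):

```python
import math

# Poisson probability P(X = k) = lam^k * exp(-lam) / k!, compared with the
# binomial for large n, small p, with lam = n*p held fixed.
def poisson(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

n, p = 10_000, 0.0003   # large n, small p (illustrative)
lam = n * p             # lam = 3
for k in range(6):
    binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"k={k}: binomial={binom:.5f}  poisson={poisson(k, lam):.5f}")
```

The two columns agree to about four decimal places, exactly as the limiting argument predicts.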

Mean and Variance for the Poisson Distribution

It's easy to show that the mean is ⟨X⟩ = λ, and that the variance and standard deviation are ⟨(ΔX)²⟩ = λ, ΔX = √λ. For a Poisson Distribution, the variance and the mean are the same!
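This can be verified by summing the exact probabilities; a minimal sketch with an illustrative λ = 4:

```python
import math

# For a Poisson distribution the mean equals the variance. Verify with the
# exact pmf, summed over enough terms to capture essentially all probability.
lam = 4.0
ks = range(60)
probs = [lam**k * math.exp(-lam) / math.factorial(k) for k in ks]

mean = sum(k * p for k, p in zip(ks, probs))
var = sum((k - mean)**2 * p for k, p in zip(ks, probs))
print(f"mean = {mean:.6f}, variance = {var:.6f}")
```

Both numbers come out equal to λ (here 4) to within the truncation error, which is negligible by k = 60.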

Terminology: a Poisson Process. The Poisson parameter λ can be given as the mean number of events that occur in a defined time period OR, equivalently, as a rate (the mean number of events per unit time) multiplied by the length of that time period.
