Modelling data networks: stochastic processes and Markov chains


  • Modelling data networks: stochastic processes and Markov chains

    [Figure on title slide.]

    Richard G. Clegg (richard@richardclegg.org), December 2011.
    Available online at http://www.richardclegg.org/lectures; the accompanying printed notes provide a full bibliography.
    (Prepared using LaTeX and beamer.)

  • Introduction to stochastic processes and Markov chains

    Stochastic processes

    A stochastic process describes how a system behaves over time; an arrival process describes how things arrive to a system.

    Markov chains

    Markov chains describe the evolution of a system in time; in particular they are useful for queuing theory. (A Markov chain is a stochastic process.)

  • Stochastic processes

    Stochastic process

    Let X(t) be some value (or vector of values) which varies in time t. Think of the stochastic process as the rules for how X(t) changes with t. Note: t may be discrete (t = 0, 1, 2, ...) or continuous.

    Poisson process

    A process where the change in X(t) from time t1 to t2 follows a Poisson distribution, that is, X(t2) - X(t1) follows a Poisson distribution.

  • A simple stochastic process: the drunkard's walk

    Random walk or drunkard's walk

    A man walks home from the pub. He starts at a distance X(0) from some point. At every step he (randomly) gets either one unit closer (probability p) or one unit further away.

    X(t + 1) = X(t) + 1  with probability p,
    X(t + 1) = X(t) - 1  with probability 1 - p.

    We can answer questions like: where, on average, will he be at time t?
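    A quick way to get a feel for the walk is to simulate it. The following is a minimal Python sketch (the function name drunkards_walk and the parameter values are ours, not from the notes):

      import random

      def drunkards_walk(steps, p=0.5, x0=0):
          """Return one simulated path X(0), X(1), ..., X(steps)."""
          path = [x0]
          for _ in range(steps):
              # one unit closer with probability p, otherwise one unit further away
              step = 1 if random.random() < p else -1
              path.append(path[-1] + step)
          return path

      print(drunkards_walk(10))   # e.g. [0, 1, 0, -1, 0, 1, 2, 1, 2, 3, 2]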

  • Drunkard's walk, p = 0.5, X(0) = 0 (figure slide)

  • Drunkard's walk, p = 0.5, X(0) = 0

    What is the expected value of X(t), that is, E[X(t)]?

    E[X(t)] = 0.5(X(t-1) + 1) + 0.5(X(t-1) - 1)
            = 0.5(X(t-1) + X(t-1)) + 0.5(1 - 1) = X(t-1).

    Therefore E[X(t)] = X(0) = 0: the poor drunk makes no progress towards his house (on average).

    E[X(t)^2] = 0.5(X(t-1) + 1)^2 + 0.5(X(t-1) - 1)^2 = X(t-1)^2 + 1.

    Therefore E[X(t)^2] = t: on average the drunk does get further from the starting pub.

    This silly example has many uses in physics and chemistry (Brownian motion), not to mention gambling (coin tosses).
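    Both results are easy to check numerically. A small Monte Carlo sketch (the sample size and the choice t = 100 are arbitrary):

      import random

      def final_position(t, p=0.5):
          """X(t) for one realisation of the walk started at X(0) = 0."""
          return sum(1 if random.random() < p else -1 for _ in range(t))

      t, runs = 100, 50_000
      samples = [final_position(t) for _ in range(runs)]

      print(sum(samples) / runs)                 # ~ 0   (E[X(t)])
      print(sum(x * x for x in samples) / runs)  # ~ 100 (E[X(t)^2] = t)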

  • The Poisson process

    The Poisson process

    Let X(t), for t ≥ 0 and X(0) = 0, be a Poisson process with rate λ. Let t2, t1 be two times such that t2 > t1. Let τ = t2 - t1.

    P[X(t2) - X(t1) = n] = exp(-λτ) (λτ)^n / n!,

    for n = 0, 1, 2, ...

    In other words, the number of arrivals in some time period follows a Poisson distribution with mean λτ.
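    As a concrete illustration, the probability of seeing n arrivals in a window of length τ can be evaluated directly from this formula. A small Python sketch (the rate and window length below are arbitrary example values):

      from math import exp, factorial

      def prob_n_arrivals(n, lam, tau):
          """P[X(t2) - X(t1) = n] for a Poisson process of rate lam over a window of length tau."""
          return exp(-lam * tau) * (lam * tau) ** n / factorial(n)

      lam, tau = 1.5, 2.0            # e.g. 1.5 arrivals per second, 2-second window
      for n in range(5):
          print(n, round(prob_n_arrivals(n, lam, tau), 4))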

  • The special nature of the Poisson process

    The Poisson process is in many ways the simplest stochastic process of all.

    This is why the Poisson process is so commonly used.

    Imagine your system has the following properties:

    The number of arrivals does not depend on the number of arrivals so far.
    No two arrivals occur at exactly the same instant in time.
    The number of arrivals in a time period τ depends only on the length of τ.

    The Poisson process is the only process satisfying these conditions (see notes for proof).

  • Some remarkable things about Poisson processes

    The mean number of arrivals in a period τ is λτ (see notes).

    If two Poisson processes arrive together, with rates λ1 and λ2, the combined arrival process is a Poisson process with rate λ1 + λ2.

    In fact this is a general result for n Poisson processes.

    If you randomly sample a Poisson process, e.g. pick arrivals with probability p, the sampled process is Poisson with rate pλ.

    This makes Poisson processes easy to deal with.

    Many things in computer networks really are Poisson processes (e.g. people logging onto a computer or requesting web pages).

    The Poisson process is also memoryless, as the next section explains.
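    The merging and sampling results above are easy to see empirically. A rough Python sketch, using made-up rates λ1 = 1.0 and λ2 = 0.5 and sampling probability p = 0.2:

      import random

      def poisson_arrival_times(lam, horizon):
          """Arrival times of a Poisson process of rate lam on [0, horizon]."""
          t, times = 0.0, []
          while True:
              t += random.expovariate(lam)   # exponential interarrival gaps
              if t > horizon:
                  return times
              times.append(t)

      horizon = 100_000.0
      a = poisson_arrival_times(1.0, horizon)
      b = poisson_arrival_times(0.5, horizon)

      merged = sorted(a + b)                                   # superposition of the two processes
      thinned = [t for t in merged if random.random() < 0.2]   # random sampling with p = 0.2

      print(len(merged) / horizon)    # ~ 1.5 = lambda1 + lambda2
      print(len(thinned) / horizon)   # ~ 0.3 = p * (lambda1 + lambda2)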

  • The interarrival time: the exponential distribution

    The exponential distribution

    An exponential distribution for a variable T takes this form:

    P[T ≤ t] = 1 - exp(-λt)   for t ≥ 0,
    P[T ≤ t] = 0              for t < 0.

    The time between packets is called the interarrival time: the time between arrivals.
    For a Poisson process this follows the exponential distribution (above).
    This is easily shown: the probability of an arrival occurring before time t is one minus the probability of no arrivals occurring up until time t.
    The probability of no arrivals occurring during a time period t is (λt)^0 exp(-λt)/0! = exp(-λt).
    The mean interarrival time is 1/λ.
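    A quick numerical check of both claims (the rate λ = 2 below is an arbitrary example value):

      import random
      from math import exp

      lam = 2.0
      gaps = [random.expovariate(lam) for _ in range(200_000)]   # interarrival times

      print(sum(gaps) / len(gaps))                    # ~ 1/lam = 0.5 (mean interarrival time)
      print(sum(g <= 1.0 for g in gaps) / len(gaps))  # ~ P[T <= 1] = 1 - exp(-lam * 1.0)
      print(1 - exp(-lam * 1.0))                      # = 0.8646...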

  • The memoryless nature of the Poisson process

    There is something strange to be noticed here: the distribution of our interarrival time T was given by P[T ≤ t] = 1 - exp(-λt) for t ≥ 0. However, if we look at the Poisson process at any instant and ask "how long must we wait for the next arrival?", the answer is just the same: 1/λ.

    Exactly the same argument can be made for any arrival time. The probability of no arrivals in the next t seconds does not change because an arrival has just happened.

    The expected waiting time for the next arrival does not change if you have been waiting for just one second, or for an hour, or for many years: the average time to the next arrival is still the same, 1/λ.
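    This memoryless property can be checked by simulation: the probability of waiting more than a further t, given you have already waited s, matches the unconditional probability of waiting more than t. A sketch with illustrative values s = 20, t = 10 minutes and λ = 1/30 per minute:

      import random

      lam, s, t = 1.0 / 30.0, 20.0, 10.0
      waits = [random.expovariate(lam) for _ in range(500_000)]

      p_unconditional = sum(w > t for w in waits) / len(waits)
      already_waited = [w for w in waits if w > s]
      p_conditional = sum(w > s + t for w in already_waited) / len(already_waited)

      print(p_unconditional, p_conditional)   # both ~ exp(-t/30) = 0.716...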

  • The Poisson bus dilemma

    Consider you arrive at the bus stop at a random time.

    Buses arrive as a Poisson process with a given rate λ.

    Buses are (on average) 30 minutes apart: 1/λ = 30 minutes.

    How long do you wait for the bus on average?

    Bus passenger 1: Obviously 15 minutes; the buses are 30 minutes apart, and on average I arrive half way through that period.

    Bus passenger 2: Obviously 30 minutes; the buses are a Poisson process and memoryless. The average waiting time is 30 minutes no matter when the last bus was or when I arrive.

    So, who is correct?
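    One way to settle the argument is empirical: generate Poisson bus arrivals, drop a passenger in at a uniformly random time, and measure the wait. A rough sketch (the horizon and sample sizes are arbitrary choices of ours):

      import bisect
      import random

      lam, horizon = 1.0 / 30.0, 1_000_000.0   # buses ~30 minutes apart; times in minutes

      t, buses = 0.0, []
      while t < horizon:
          t += random.expovariate(lam)
          buses.append(t)

      waits = []
      for _ in range(100_000):
          arrive = random.uniform(0.0, buses[-2])                # random passenger arrival time
          next_bus = buses[bisect.bisect_right(buses, arrive)]   # first bus after the passenger arrives
          waits.append(next_bus - arrive)

      print(sum(waits) / len(waits))   # average wait in minutes: which passenger was right?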
