Statistics for Business and Economics, 7/e


Unit 5

Continuous Random Variables and Probability Distributions

Linear Functions of Random Variables

Let W = a + bX, where X has mean μX and variance σX², and a and b are constants.

Then the mean of W is

μW = E[a + bX] = a + b·μX

the variance is

σW² = Var[a + bX] = b²·σX²

and the standard deviation of W is

σW = |b|·σX
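A quick numerical sanity check of these three results (a sketch; the values a = 3, b = -2, μX = 10, σX = 4 and the normal distribution for X are illustrative choices, not from the slides):

import numpy as np

rng = np.random.default_rng(0)

a, b = 3.0, -2.0
mu_x, sigma_x = 10.0, 4.0

# Simulate X and the linear function W = a + bX
x = rng.normal(mu_x, sigma_x, 1_000_000)
w = a + b * x

print(w.mean())   # close to a + b*mu_x = -17
print(w.var())    # close to b**2 * sigma_x**2 = 64
print(w.std())    # close to |b| * sigma_x = 8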

Example

An important special case of the result for a linear function of a random variable is the standardized random variable

Z = (X - μX) / σX

Find the mean and standard deviation of Z.
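Z is itself a linear function of X, with a = -μX/σX and b = 1/σX, so the results above give

μZ = -μX/σX + (1/σX)·μX = 0

σZ = (1/σX)·σX = 1

Every standardized random variable therefore has mean 0 and standard deviation 1.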

Jointly Distributed Continuous Random Variables

Let X and Y be continuous random variables

Their joint cumulative distribution function is defined as

F(x0, y0) = P(X ≤ x0 ∩ Y ≤ y0) = ∫_{-∞}^{y0} ∫_{-∞}^{x0} f(x, y) dx dy

where f(x, y) is the joint probability density function.

Jointly Distributed Continuous Random Variables

The cumulative distribution functions F(x) and F(y) of the individual random variables are called their marginal distribution functions.

The corresponding marginal probability density functions are obtained by integrating the joint density over the other variable:

f(x) = ∫_{-∞}^{∞} f(x, y) dy

f(y) = ∫_{-∞}^{∞} f(x, y) dx

Jointly Distributed Continuous Random Variables

X and Y are independent if and only if

F(x, y) = F(x)·F(y)

for every pair of values x and y.

Covariance

Let X and Y be jointly distributed continuous random

variables, with means μx and μy

The expected value of (X - μx)(Y - μy) is called the

covariance between X and Y

Cov(X, Y) = E[(X - μX)(Y - μY)] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} (x - μX)(y - μY) f(x, y) dx dy

Covariance

If X and Y are independent, then the covariance

between them is 0.

However, the converse is not true in general: zero covariance does not imply independence, except in the special case where X and Y are jointly normally distributed.

Covariance measures only the linear dependence between X and Y.
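A standard counterexample (added here for illustration, not from the slides): let X be a standard normal random variable and let Y = X². Then Cov(X, Y) = E[X³] - E[X]·E[X²] = 0 - 0·1 = 0, yet Y is completely determined by X, so the two variables are strongly dependent, just not linearly.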

Correlation

Let X and Y be jointly distributed continuous random variables with standard deviations σX and σY.

The correlation between X and Y is

ρ = Corr(X, Y) = Cov(X, Y) / (σX·σY)

Example – Bivariate Uniform

f(x, y) = 1,  0 ≤ x ≤ 1,  0 ≤ y ≤ 1
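A minimal Monte Carlo sketch (added for illustration; the sample size and the evaluation point (0.5, 0.5) are arbitrary choices) checking the key properties of this density: the marginals are Uniform(0, 1), X and Y are independent, and the covariance and correlation are both approximately 0.

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Sampling (x, y) uniformly on the unit square is exactly sampling from f(x, y) = 1.
x = rng.uniform(0.0, 1.0, n)
y = rng.uniform(0.0, 1.0, n)

# Empirical joint CDF at (0.5, 0.5) versus the product of the marginal CDFs.
print(np.mean((x <= 0.5) & (y <= 0.5)))   # close to 0.25 = F(0.5) * F(0.5)

# Sample covariance and correlation are both close to 0.
print(np.cov(x, y)[0, 1])
print(np.corrcoef(x, y)[0, 1])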

Linear Combinations of Random Variables

A linear combination of two random variables, X and Y (where a and b are constants), is

W = aX + bY

The mean of W is

μW = E[W] = E[aX + bY] = a·μX + b·μY

Linear Combinations of Random Variables

The variance of W is

σW² = a²·σX² + b²·σY² + 2ab·Cov(X, Y)

Or, using the correlation,

σW² = a²·σX² + b²·σY² + 2ab·Corr(X, Y)·σX·σY

If X and Y are jointly normally distributed random variables, then the linear combination W is also normally distributed.
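A short numeric sketch of these formulas (all values here are illustrative, not from the slides):

import math

# Illustrative inputs: W = aX + bY
a, b = 2.0, 3.0
mu_x, mu_y = 10.0, 5.0
sigma_x, sigma_y = 2.0, 1.0
rho = 0.4                                  # Corr(X, Y)

cov_xy = rho * sigma_x * sigma_y           # Cov(X, Y) from the correlation

mu_w = a * mu_x + b * mu_y
var_w = a**2 * sigma_x**2 + b**2 * sigma_y**2 + 2 * a * b * cov_xy
sigma_w = math.sqrt(var_w)

print(mu_w, var_w, sigma_w)                # 35.0, 34.6, about 5.88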

Example

Two tasks must be performed by the same worker.

X = minutes to complete task 1; μx = 20, σx = 5

Y = minutes to complete task 2; μy = 20, σy = 5

X and Y are normally distributed and independent

What is the mean and standard deviation of the time to

complete both tasks?
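Worked solution (applying the linear-combination formulas above): the total time is W = X + Y, so

μW = μX + μY = 20 + 20 = 40 minutes

σW² = σX² + σY² = 25 + 25 = 50, since Cov(X, Y) = 0 for independent X and Y

σW = √50 ≈ 7.07 minutes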

Example

Two tasks must be performed by the same worker.

X = minutes to complete task 1; μx = 20, σx = 5

Y = minutes to complete task 2; μy = 20, σy = 5

X and Y are normally distributed and independent

What is the mean and standard deviation of the difference

in time to complete both tasks?
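Worked solution: the difference is W = X - Y (a = 1, b = -1), so

μW = μX - μY = 20 - 20 = 0 minutes

σW² = σX² + σY² = 25 + 25 = 50; the variances still add because the covariance term is 0

σW = √50 ≈ 7.07 minutes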

Example

Two tasks must be performed by the same worker.

X = minutes to complete task 1; μx = 20, σx = 5

Y = minutes to complete task 2; μy = 20, σy = 5

X and Y are normally distributed and independent

Find the probability that task 1 takes longer than task 2.
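Worked solution, with a short SciPy check (the code is a sketch added here, not part of the slides): task 1 takes longer than task 2 exactly when X - Y > 0, and from the previous slide X - Y is normal with mean 0 and standard deviation √50, so P(X > Y) = 0.5.

from math import sqrt
from scipy.stats import norm

mu_d = 20 - 20                 # mean of X - Y
sigma_d = sqrt(5**2 + 5**2)    # standard deviation of X - Y (independent X, Y)

# P(X > Y) = P(X - Y > 0) = 1 - F(0) for the normal difference
p = 1 - norm.cdf(0, loc=mu_d, scale=sigma_d)
print(p)                       # 0.5, because the difference has mean 0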

Jointly Distributed Continuous Random Variables

Let X1, X2, . . ., Xk be continuous random variables

Their joint cumulative distribution function,

F(x1, x2, . . ., xk)

defines the probability that simultaneously X1 is less

than x1, X2 is less than x2, and so on; that is

F(x1, x2, . . ., xk) = P(X1 ≤ x1 ∩ X2 ≤ x2 ∩ · · · ∩ Xk ≤ xk)

Jointly Distributed Continuous Random Variables

The cumulative distribution functions

F(x1), F(x2), . . ., F(xk)

of the individual random variables are called their

marginal distribution functions

The random variables are independent if and only if

F(x1, x2, . . ., xk) = F(x1)·F(x2) · · · F(xk)

Sums of Random Variables

Let X1, X2, . . ., Xk be random variables with means μ1, μ2, . . ., μk, where μi = E[Xi] for i = 1, . . ., k.

Let Y = a1X1 + a2X2 + · · · + akXk for constants a1, a2, . . ., ak.

Then the mean of Y is

μY = E[Y] = a1·μ1 + a2·μ2 + · · · + ak·μk

Sums of Random Variables

Let X1, X2, . . ., Xk be random variables with variances σ1², σ2², . . ., σk², where σi² = Var(Xi) for i = 1, . . ., k.

Let Y = a1X1 + a2X2 + · · · + akXk for constants a1, a2, . . ., ak.

If the covariance between every pair of these random variables is 0, then the variance of Y is

σY² = a1²·σ1² + a2²·σ2² + · · · + ak²·σk²

In particular, the variance of a sum of uncorrelated random variables is the sum of their variances.

Sums of Random Variables

Let X1, X2, . . ., Xk be random variables with variances σ1², σ2², . . ., σk².

Let Y = a1X1 + a2X2 + · · · + akXk for constants a1, a2, . . ., ak.

Then, in general, the variance of Y is

σY² = a1²·σ1² + a2²·σ2² + · · · + ak²·σk² + 2·Σ_{i<j} ai·aj·Cov(Xi, Xj)
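The double sum of covariances is just the off-diagonal part of the quadratic form a'Σa, where Σ is the covariance matrix. A small NumPy sketch (the weights, standard deviations, and correlations below are illustrative values, not from the slides) showing that the quadratic form and the expanded formula agree:

import numpy as np

# Illustrative values: Y = a1*X1 + a2*X2 + a3*X3
a = np.array([1.0, 2.0, -1.0])
sigma = np.array([2.0, 1.0, 3.0])
corr = np.array([[1.0, 0.3, 0.0],
                 [0.3, 1.0, -0.2],
                 [0.0, -0.2, 1.0]])

# Covariance matrix: Cov(Xi, Xj) = Corr(Xi, Xj) * sigma_i * sigma_j
cov = corr * np.outer(sigma, sigma)

# Variance of Y as the quadratic form a' Cov a
var_quadratic = a @ cov @ a

# Same variance from the expanded formula above
k = len(a)
var_expanded = np.sum(a**2 * sigma**2) + 2 * sum(
    a[i] * a[j] * cov[i, j] for i in range(k) for j in range(i + 1, k))

print(var_quadratic, var_expanded)   # both 21.8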

Example

It is estimated that in normal highway driving the

number of miles that can be covered by

automobiles of a particular model on 1 gallon of

gasoline is a normally distributed random variable

with mean 28 mpg and standard deviation 2.4 mpg.

Four of these cars, each with 1 gallon of gasoline,

are driven independently under highway conditions.

Find the probability that the average mpg of these

four cars exceeds 30 mpg.
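Worked solution, with a short SciPy check (the code is a sketch added here, not part of the slides): the average of the four independent mpg values is normally distributed with mean 28 and standard deviation 2.4/√4 = 1.2, so P(average > 30) = P(Z > (30 - 28)/1.2) = P(Z > 1.67) ≈ 0.048.

from math import sqrt
from scipy.stats import norm

mu, sigma, n = 28.0, 2.4, 4

# The average of n independent N(mu, sigma^2) variables is N(mu, sigma^2 / n);
# this is the sum formula with every a_i = 1/n.
sd_avg = sigma / sqrt(n)                   # 1.2

p = 1 - norm.cdf(30, loc=mu, scale=sd_avg)
print(p)                                   # about 0.0478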