# Introduction to Bayesian Methods

Posted: 19-Aug-2014
Category: Education


Description: A guest lecture given in advanced biostatistics (BIOL597) at McGill University. EDIT: t=Principle.

### Transcript of Introduction to Bayesian Methods

- Introduction to Bayesian Methods: Theory, Computation, Inference and Prediction. Corey Chivers, PhD Candidate, Department of Biology, McGill University
- A script to run the examples in these slides can be found here: bit.ly/Wnmb2W. The slides themselves are here: bit.ly/P9Xa9G
- Corey Chivers, 2012
- The Likelihood Principle: all of the information contained in data $x$, with respect to inference about the value of $\theta$, is contained in the likelihood function: $L(\theta \mid x) \propto P(X = x \mid \theta)$
- The Likelihood Principle: L.J. Savage, R.A. Fisher
- The Likelihood Function: $L(\theta \mid x) \propto P(X = x \mid \theta)$, i.e. $L(\theta \mid x) = f(x \mid \theta)$, where $\theta$ is(are) our parameter(s) of interest, e.g. attack rate, fitness, mean body mass, mortality, etc.
- The Ecologist's Quarter: lands tails (caribou up) 60% of the time
- The Ecologist's Quarter: 1) What is the probability that I will flip tails, given that I am flipping an ecologist's quarter ($P(\text{tail}) = 0.6$)? $P(x \mid \theta = 0.6)$. 2) What is the likelihood that I am flipping an ecologist's quarter, given the flip(s) that I have observed? $L(\theta = 0.6 \mid x)$
- The Ecologist's Quarter: $L(\theta \mid x) = \prod_{t=1}^{n_T} \theta \prod_{h=1}^{n_H} (1 - \theta)$, so $L(\theta = 0.6 \mid x = \mathrm{HTTHT}) = 0.6^3 \times 0.4^2 = 0.03456$
- The Ecologist's Quarter: $L(\theta = 0.6 \mid x = \mathrm{HTTHT}) = 0.6^3 \times 0.4^2 = 0.03456$. But what does this mean? $0.03456 \neq P(\theta \mid x)$!!!!
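The likelihood calculation above can be checked in a few lines of Python (a minimal sketch; the deck's companion script at bit.ly/Wnmb2W is the authoritative version):

```python
# Likelihood of theta = P(tails) = 0.6 given the observed flip
# sequence H T T H T (3 tails, 2 heads): theta^3 * (1 - theta)^2.
def likelihood(theta, n_tails, n_heads):
    return theta ** n_tails * (1 - theta) ** n_heads

L = likelihood(0.6, n_tails=3, n_heads=2)
print(round(L, 5))  # 0.03456
```

As the slide stresses, this number is a likelihood of the hypothesis, not the posterior probability $P(\theta \mid x)$.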
- How do we ask statistical questions? A Frequentist asks: what is the probability of having observed data at least as extreme as my data if the null hypothesis is true? $P(\text{data} \mid H_0) = ?$ (note: $P = 1$ does not mean $P(H_0) = 1$). A Bayesian asks: what is the probability of hypotheses given that I have observed my data? $P(H \mid \text{data}) = ?$ (note: here $H$ denotes the space of all possible hypotheses)
- $P(\text{data} \mid H_0) \neq P(H \mid \text{data})$. But we both want to make inferences about our hypotheses, not the data.
- Bayes' Theorem: the posterior probability of $\theta$, given our observation $x$, is proportional to the likelihood times the prior probability of $\theta$: $P(\theta \mid x) = \dfrac{P(x \mid \theta)\, P(\theta)}{P(x)}$
- The Ecologist's Quarter Redux: lands tails (caribou up) 60% of the time
- The Ecologist's Quarter: $L(\theta \mid x) = \prod_{t=1}^{n_T} \theta \prod_{h=1}^{n_H} (1 - \theta)$, so $L(\theta = 0.6 \mid x = \mathrm{HTTHT}) = 0.6^3 \times 0.4^2 = 0.03456$
- Likelihood of data given hypothesis: $P(x \mid \theta)$. But we want to know $P(\theta \mid x)$
- How can we make inferences about our ecologist's quarter using Bayes? $P(\theta \mid x) = \dfrac{P(x \mid \theta)\, P(\theta)}{P(x)}$, where $P(x \mid \theta)$ is the likelihood, $P(\theta)$ is the prior, and $P(\theta \mid x)$ is the posterior
- The denominator is $P(x) = \int P(x \mid \theta)\, P(\theta)\, d\theta$. A closed-form solution is not always possible!!
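When the normalizing integral $P(x)$ has no closed form, it can be approximated numerically. Here is a sketch for the quarter example (3 tails, 2 heads, assuming a flat prior on $\theta$; the grid-sum approach is mine, not the deck's):

```python
import numpy as np

# Grid approximation of P(x) = integral of P(x|theta) P(theta) dtheta
# for the quarter data (3 tails, 2 heads) under a flat prior.
theta = np.linspace(0.0, 1.0, 10001)
d_theta = theta[1] - theta[0]
prior = np.ones_like(theta)              # flat prior: P(theta) = 1
lik = theta**3 * (1 - theta)**2          # P(x | theta)
p_x = np.sum(lik * prior) * d_theta      # approximates the integral
posterior = lik * prior / p_x            # P(theta | x), integrates to ~1
print(round(p_x, 5))                     # ~ 1/60, the exact value here
```

For this small example the integral is actually available in closed form (a Beta function), which is what makes the grid answer easy to check; the point of the slides that follow is what to do when it is not.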
- Randomization to Solve Difficult Problems: $\int f(\theta)\, d\theta$ (Feynman, Ulam & Von Neumann)
- Monte Carlo (Feynman, Ulam & Von Neumann): throw darts at random. $P(\text{blue}) = ?$ Exactly, $P(\text{blue}) = 1/2$; from the darts thrown, $P(\text{blue}) \approx 7/15 \approx 1/2$ [figure: darts on a region with marked points (0,1), (0.5,0), (1,0)]
- Your turn... Let's use Monte Carlo to estimate $\pi$: generate random x and y values using the number sheet, and plot those points on your graph. How many of the points fall within the circle?
- Your turn... Estimate $\pi$ using the formula: $\pi \approx 4 \times \dfrac{\#\,\text{in circle}}{\text{total}}$
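The same hands-on exercise in code (a sketch; the dart counts and seed are mine, not from the deck):

```python
import random

# Monte Carlo estimate of pi: drop points uniformly in the unit square
# and count the fraction landing inside the quarter circle x^2 + y^2 <= 1.
random.seed(1)
n = 1_000_000
in_circle = sum(
    1 for _ in range(n)
    if random.random() ** 2 + random.random() ** 2 <= 1.0
)
pi_hat = 4 * in_circle / n               # 4 * (# in circle / total)
print(pi_hat)                            # close to 3.14159 for large n
```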
- Now using a more powerful computer!
- Posterior Integration via Markov Chain Monte Carlo: a Markov Chain is a mathematical construct where, given the present, the past and the future are independent. "Where I decide to go next depends not on where I have been, or where I may go in the future, but only on where I am right now." -Andrey Markov (maybe)
- Metropolis-Hastings Algorithm (The Markovian Explorer!): 1. Pick a starting location at random. 2. Choose a new location in your vicinity. 3. Go to the new location with probability $p = \min\left(1, \dfrac{\pi(x_{\text{proposal}})}{\pi(x_{\text{current}})}\right)$. 4. Otherwise stay where you are. 5. Repeat.
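The five steps above can be sketched in Python for the quarter's posterior (3 tails, 2 heads, assuming a flat prior; variable names and tuning are mine, not the deck's). Because the acceptance ratio divides one posterior value by another, the intractable $P(x)$ cancels, which is the whole trick:

```python
import random

# Unnormalized posterior for theta = P(tails): likelihood x flat prior.
# P(x) is never needed because it cancels in the acceptance ratio.
def unnorm_post(theta):
    if not 0.0 < theta < 1.0:
        return 0.0
    return theta**3 * (1 - theta)**2

random.seed(42)
theta = 0.5                                  # 1. pick a starting location
samples = []
for _ in range(50_000):
    proposal = theta + random.gauss(0, 0.1)  # 2. new location nearby
    # 3. accept with probability min(1, post(proposal) / post(current))
    if random.random() < unnorm_post(proposal) / unnorm_post(theta):
        theta = proposal                     # move there
    samples.append(theta)                    # 4. otherwise stay put; 5. repeat

burned = samples[10_000:]                    # discard burn-in
theta_mean = sum(burned) / len(burned)
print(round(theta_mean, 3))                  # near the exact posterior mean 4/7
```

The chain's visited locations, after burn-in, are draws from $P(\theta \mid x)$, so summaries like the mean fall out of simple averages.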
- MCMC in Action!Corey Chivers, 2012
- We've solved our integration problem! $P(\theta \mid x) = \dfrac{P(x \mid \theta)\, P(\theta)}{P(x)} \propto P(x \mid \theta)\, P(\theta)$
- Ex: Bayesian Regression. Regression coefficients are traditionally estimated via maximum likelihood. To obtain full posterior distributions, we can view the regression problem from a Bayesian perspective.
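A hedged sketch of what that looks like, using simulated data, flat priors, a known noise standard deviation, and the Metropolis-Hastings sampler from earlier (all choices here are mine; the deck's own regression example lives in its companion script):

```python
import numpy as np

# Bayesian simple linear regression y = a + b*x + noise, sampling the
# posterior of (a, b) with Metropolis-Hastings. Assumes noise sd = 1
# and flat priors, so the log-posterior is just the log-likelihood.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.7 * x + rng.normal(0, 1, size=x.size)   # simulated data

def log_post(a, b):
    resid = y - (a + b * x)
    return -0.5 * np.sum(resid**2)       # Gaussian log-likelihood, sd = 1

a, b = 0.0, 0.0
lp = log_post(a, b)
samples = []
for _ in range(30_000):
    a_new = a + rng.normal(0, 0.1)       # random-walk proposals
    b_new = b + rng.normal(0, 0.02)
    lp_new = log_post(a_new, b_new)
    if np.log(rng.random()) < lp_new - lp:   # MH acceptance, in log space
        a, b, lp = a_new, b_new, lp_new
    samples.append((a, b))

a_s, b_s = np.array(samples[10_000:]).T  # discard burn-in
print(a_s.mean(), b_s.mean())            # posterior means, near a=2, b=0.7
```

Unlike maximum likelihood, the output is a cloud of `(a, b)` draws, so credible intervals for the coefficients come directly from quantiles of `a_s` and `b_s`.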
