How to Write and Present Class 6: Results


Bayesian Statistics Without Tears: Prelude
Eric-Jan Wagenmakers
Three Schools of Statistical Inference
- Neyman-Pearson: α-level, power calculations, two hypotheses, guide for action (i.e., what to do).
- Fisher: p-values, one hypothesis (i.e., H0), quantifies evidence against H0.
- Bayes: prior and posterior distributions, attaches probabilities to parameters and hypotheses.
A Freudian Analogy
- Neyman-Pearson: The Superego.
- Fisher: The Ego.
- Bayes: The Id.
Claim: What the Id really wants is to attach probabilities to hypotheses and parameters. This wish is suppressed by the Superego and the Ego. The result is unconscious internal conflict.
Internal Conflict Causes Misinterpretations
- p < .05 means that H0 is unlikely to be true, and can be rejected.
- p > .10 means that H0 is likely to be true.
- For a given parameter μ, a 95% confidence interval from, say, a to b means that there is a 95% chance that μ lies in between a and b.
Two Ways to Resolve the Internal Conflict
1. Strengthen the Superego and the Ego by teaching the standard statistical methodology more rigorously. Suppress the Id even more!
2. Give the Id what it wants.
What is Bayesian Inference? Why be Bayesian?
Eric-Jan Wagenmakers
What is Bayesian Inference?
- “Common sense expressed in numbers”
- “The means by which rational agents draw optimal conclusions in an uncertain environment”
- “The only statistical procedure that is coherent, meaning that it avoids statements that are internally inconsistent.”
- “A method for rational updating of beliefs about the world”
- “The only good statistics”
Outline
- Bayes in a Nutshell
- The Inevitability of Probability
- Bayesian Revolutions
- This Course
Bayesian Inference in a Nutshell
- In Bayesian inference, uncertainty or degree of belief is quantified by probability.
- Prior beliefs are updated by means of the data to yield posterior beliefs.
Bayesian Parameter Estimation: Example
- We prepare for you a series of 10 factual true/false questions of equal difficulty.
- You answer 9 out of 10 questions correctly.
- What is your latent probability θ of answering any one question correctly?
Bayesian Parameter Estimation: Example
- We start with a prior distribution for θ. This reflects all we know about θ prior to the experiment. Here we make a standard choice and assume that all values of θ are equally likely a priori.
Bayesian Parameter Estimation: Example
- We then update the prior distribution by means of the data (technically, the likelihood) to arrive at a posterior distribution.
- The posterior distribution is a compromise between what we knew before the experiment and what we have learned from the experiment. The posterior distribution reflects all that we know about θ.
[Figure: posterior distribution for θ. Mode = 0.9; 95% credible interval: (0.59, 0.98). NB: we do not have to use the uniform prior!]
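For concreteness, here is a minimal R sketch of the analytical calculation behind this figure, assuming the uniform Beta(1, 1) prior and binomial likelihood described above (variable names are my own):

# Posterior for θ after 9 correct answers out of 10, with a uniform Beta(1, 1) prior.
# Conjugacy gives a Beta(1 + 9, 1 + 1) = Beta(10, 2) posterior.
a <- 1 + 9                                 # prior alpha + number correct
b <- 1 + 1                                 # prior beta + number incorrect
post_mode <- (a - 1) / (a + b - 2)         # 0.9
post_ci   <- qbeta(c(0.025, 0.975), a, b)  # roughly (0.59, 0.98)
curve(dbeta(x, a, b), from = 0, to = 1,
      xlab = expression(theta), ylab = "Posterior density")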
The Inevitability of Probability
- Why would one measure “degree of belief” by means of probability? Couldn’t we choose something else that makes sense?
- Yes, perhaps we can, but the choice of probability is anything but ad hoc.
The Inevitability of Probability
- Assume “degree of belief” can be measured by a single number.
- Assume you are rational, that is, not self-contradictory or “obviously silly”.
- Then degree of belief can be shown to follow the same rules as the probability calculus.
The Inevitability of Probability
- For instance, a rational agent would not hold intransitive beliefs, such as:
Bel(A) > Bel(B), Bel(B) > Bel(C), Bel(C) > Bel(A).
The Inevitability of Probability
- When you use a single number to measure uncertainty or quantify evidence, and these numbers do not follow the rules of probability calculus, you can (almost certainly?) be shown to be silly or incoherent.
- One of the theoretical attractions of the Bayesian paradigm is that it ensures coherence right from the start.
Coherence I
- Coherence is also key in de Finetti’s conceptualization of probability.
Coherence II
- One aspect of coherence is that “today’s posterior is tomorrow’s prior”.
- Suppose we have exchangeable (iid) data x = {x1, x2}. We can update our prior using x all at once, using first x1 and then x2, or using first x2 and then x1.
- All of these procedures result in exactly the same posterior distribution.
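A small numerical check of this claim, sketched in R under a Beta(1, 1) prior with two illustrative Bernoulli observations (x1 = 1, x2 = 0); the data values and function name are my own, not from the slides:

# Conjugate Bernoulli update: add x to alpha, (1 - x) to beta.
update_beta <- function(a, b, x) c(a + x, b + (1 - x))

p1 <- update_beta(1, 1, 1)           # x1 first -> Beta(2, 1)
p1 <- update_beta(p1[1], p1[2], 0)   # then x2  -> Beta(2, 2)

p2 <- update_beta(1, 1, 0)           # x2 first
p2 <- update_beta(p2[1], p2[2], 1)   # then x1

p3 <- c(1 + 1, 1 + 1)                # both observations at once -> Beta(2, 2)

rbind(x1_then_x2 = p1, x2_then_x1 = p2, all_at_once = p3)  # identical posteriors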
Coherence III
- Assume we have three models: M1, M2, M3.
- After seeing the data, suppose that M1 is 3 times more plausible than M2, and M2 is 4 times more plausible than M3.
- By transitivity, M1 is 3 × 4 = 12 times more plausible than M3.
The Bayesian Revolution
- Until about 1990, Bayesian statistics could only be applied to a select subset of very simple models.
- Only recently has Bayesian statistics undergone a transformation: with current numerical techniques, Bayesian models are “limited only by the user’s imagination.”
The Bayesian Revolution in Statistics
The Bayesian Revolution in Psychology?
Are Psychologists Inconsistent?
The content of Psych Review shows that:
- Psychologists are happy to develop Bayesian models for human cognition and human behavior, based on the assumption that agents or people process noisy information in a rational or optimal way;
- But psychologists do not use Bayesian models to analyze their own data statistically!
Why Bayes is Now Popular
Markov chain Monte Carlo!
Markov Chain Monte Carlo
- Instead of calculating the posterior analytically, numerical techniques such as MCMC approximate the posterior by drawing samples from it.
- Consider again our earlier example…
[Figure: MCMC approximation to the posterior for θ. Mode = 0.89; 95% credible interval: (0.59, 0.98). With 9000 samples, almost identical to the analytical result.]
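As an illustration only (the slides do not show code), here is a minimal random-walk Metropolis sampler for this example in R; WinBUGS, introduced below, automates this kind of sampling:

# Log posterior for θ: binomial likelihood for 9 correct out of 10;
# the uniform prior only adds a constant, so it can be dropped.
set.seed(123)
log_post <- function(theta) {
  if (theta <= 0 || theta >= 1) return(-Inf)
  dbinom(9, size = 10, prob = theta, log = TRUE)
}
n_iter <- 9000
theta  <- numeric(n_iter)
theta[1] <- 0.5
for (i in 2:n_iter) {
  proposal <- theta[i - 1] + rnorm(1, sd = 0.1)   # random-walk proposal
  accept   <- log(runif(1)) < log_post(proposal) - log_post(theta[i - 1])
  theta[i] <- if (accept) proposal else theta[i - 1]
}
quantile(theta, c(0.025, 0.975))   # close to the analytical (0.59, 0.98)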
Want to Know More About MCMC?
MCMC
- With MCMC, the models you can build and estimate are said to be “limited only by the user’s imagination”.
- But how do you get MCMC to work?
  - Option 1: write the code yourself.
  - Option 2: use WinBUGS!
A Workshop in Bayesian Modeling for Cognitive Science
Eric-Jan Wagenmakers
The Bayesian Book
- …is a course book used at UvA and UCI.
- …is still regularly updated.
- …is freely available at my homepage, at http://www.ejwagenmakers.com/BayesCourse/BayesBook.html
- …greatly benefits from your suggestions for improvement! [e.g., typos, awkward sentences, etc.]
Contributors
- Michael Lee (http://www.socsci.uci.edu/~mdlee/)
- Dora Matzke
- Ruud Wetzels (http://www.ruudwetzels.com/)
Why We Like Graphical Bayesian Modeling
- It is fun.
- It is cool.
- It is easy.
- It is principled.
- It is superior.
- It is useful.
- It is flexible.
Our Goals These Weeks Are…
- For you to experience some of the possibilities that WinBUGS has to offer.
- For you to get some hands-on training by trying out some programs.
- For you to work at your own pace.
- For you to get answers to questions when you get stuck.
Our Goals These Weeks Are NOT…
- For you to become a Bayesian graphical modeling expert in one week.
- For you to gain deep insight into the statistical foundations of Bayesian inference.
- For you to get frustrated when the programs do not work or you do not understand the materials (please ask questions).
Want to Know More About Bayes?
WinBUGS
Bayesian inference Using Gibbs Sampling
You want to have this installed (plus the registration key).
WinBUGS
- Knows many probability distributions (likelihoods);
- Allows you to specify a model;
- Allows you to specify priors;
- Will then automatically run the MCMC sampling routines and produce output.
Want to Know More About WinBUGS?
WinBUGS & R
- WinBUGS produces MCMC samples.
- We want to analyze the output in a nice program, such as R.
- This can be accomplished using the R package “R2WinBUGS”.
R: “Here’s the data and a bunch of commands.”
WinBUGS: “OK, I did what you wanted; here’s the samples you asked for.”
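A rough sketch of what this exchange looks like in practice for the rate example, assuming WinBUGS and the R2WinBUGS package are installed; the model file name and settings below are illustrative, not taken from the course materials:

library(R2WinBUGS)

# The BUGS model: uniform prior on θ, binomial likelihood for k correct out of n.
model_string <- "
model {
  theta ~ dbeta(1, 1)
  k ~ dbin(theta, n)
}
"
writeLines(model_string, "rate_model.txt")

data  <- list(k = 9, n = 10)
inits <- function() list(theta = runif(1))

# You may also need bugs.directory = ... pointing at your WinBUGS installation.
samples <- bugs(data, inits, parameters.to.save = "theta",
                model.file = "rate_model.txt",
                n.chains = 3, n.iter = 2000)
print(samples)   # posterior summary for theta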
Getting Started
- Work through some of the exercises in the book.
- Most of you will want to get started with the chapter “Getting Started”.
Running the R Programs
- The R scripts have the extension .R. You can use “File” -> “Open Script” to read these.
- You can run these scripts by copying and pasting them into the R console.
Course Webpage
- Check out http://www.ejwagenmakers.com/BayesCourse/BayesCourse.html for lectures and a pdf file with answers to the exercises!
Questions?
- Feel free to ask questions when you are really stuck.