Doing Bayesian Data Analysis: A Tutorial Introduction with R and BUGS

✍ Scribed by John K. Kruschke


Publisher
Academic Press
Year
2010
Tongue
English
Leaves
542
Edition
1
Category
Library


✦ Synopsis


There is an explosion of interest in Bayesian statistics, primarily because recently developed computational methods have finally made Bayesian analysis accessible to a wide audience. Doing Bayesian Data Analysis: A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. The text delivers comprehensive coverage of all the scenarios addressed by non-Bayesian textbooks: t-tests, analysis of variance (ANOVA) and comparisons in ANOVA, multiple regression, and chi-square (contingency table analysis). The book is intended for first-year graduate students or advanced undergraduates. It provides a bridge between undergraduate training and modern Bayesian methods for data analysis, which are becoming the accepted research standard. The only prerequisite is knowledge of algebra and basic calculus. The free software now includes programs in JAGS, which runs on Macintosh, Linux, and Windows. Author website: http://www.indiana.edu/~kruschke/DoingBayesianDataAnalysis/

Key features:
- Accessible, including the basics of essential concepts of probability and random sampling
- Examples with the R programming language and BUGS software
- Comprehensive coverage of all scenarios addressed by non-Bayesian textbooks: t-tests, analysis of variance (ANOVA) and comparisons in ANOVA, multiple regression, and chi-square (contingency table analysis)
- Coverage of experiment planning
- R and BUGS computer programming code on website
- Exercises have explicit purposes and guidelines for accomplishment
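As a taste of the kind of analysis the book opens with (Chapter 5 derives the posterior for a coin's bias exactly), here is a minimal sketch of the Beta-Bernoulli conjugate update in Python rather than the book's own R. The data values (11 heads in 14 flips) and the uniform Beta(1, 1) prior are illustrative assumptions, not taken from the book.

```python
# Conjugate update for a coin's bias theta: with a Beta(a, b) prior and
# z heads observed in N flips, the posterior is Beta(a + z, b + N - z).
def beta_posterior(z, N, a=1.0, b=1.0):
    """Return the (a, b) parameters of the posterior Beta distribution."""
    return a + z, b + (N - z)

# Hypothetical data: 11 heads in 14 flips, uniform Beta(1, 1) prior.
a_post, b_post = beta_posterior(z=11, N=14)
post_mean = a_post / (a_post + b_post)  # mean of Beta(12, 4) is 12/16
print(a_post, b_post, post_mean)  # → 12.0 4.0 0.75
```

The same one-line update underlies the book's grid and MCMC approximations, which matter precisely when no such closed-form posterior exists.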

✦ Table of Contents


Cover......Page 1
Foreword......Page 2
Contents......Page 4
1.1 Real people can read this book......Page 14
1.2 Prerequisites......Page 15
1.3 The organization of this book......Page 16
1.3.2 Where’s the equivalent of traditional test X in this book?......Page 17
1.5 Acknowledgments......Page 18
I The Basics: Parameters, Probability, Bayes’ Rule, and R......Page 20
2 Introduction: Models we believe in......Page 22
2.1 Models of observations and models of beliefs......Page 23
2.1.1 Models have parameters......Page 24
2.2.1 Estimation of parameter values......Page 26
2.2.3 Model comparison......Page 27
2.3.2 Invoking R and using the command line......Page 28
2.3.3 A simple example of R in action......Page 29
2.3.4 Getting help in R......Page 30
2.3.5.2 Variable names in R......Page 31
2.4 Exercises......Page 32
3 What is this stuff called probability?......Page 34
3.1.1 Coin flips: Why you should care......Page 35
3.2.1.1 Simulating a long-run relative frequency......Page 36
3.2.1.2 Deriving a long-run relative frequency......Page 37
3.2.2.1 Calibrating a subjective belief by preferences......Page 38
3.3 Probability distributions......Page 39
3.3.2 Continuous distributions: Rendezvous with density......Page 40
3.3.2.1 Properties of probability density functions......Page 42
3.3.2.2 The normal probability density function......Page 43
3.3.3 Mean and variance of a distribution......Page 45
3.3.3.1 Mean as minimized variance......Page 46
3.3.5 Highest density interval (HDI)......Page 47
3.4 Two-way distributions......Page 48
3.4.1 Marginal probability......Page 49
3.4.2 Conditional probability......Page 51
3.4.3 Independence of attributes......Page 52
3.5.1 R code for Figure 3.1......Page 53
3.6 Exercises......Page 54
4 Bayes’ Rule......Page 56
4.1.1 Derived from definitions of conditional probability......Page 57
4.1.2 Intuited from a two-way discrete table......Page 58
4.2 Applied to models and data......Page 60
4.2.1 Data order invariance......Page 62
4.2.2 An example with coin flipping......Page 63
4.3.2 Prediction of data values......Page 65
4.3.3 Model comparison......Page 66
4.3.5.1 Holmesian deduction......Page 69
4.4.1 R code for Figure 4.1......Page 70
4.5 Exercises......Page 72
II All the Fundamentals Applied to Inferring a Binomial Proportion......Page 76
5 Inferring a Binomial Proportion via Exact Mathematical Analysis......Page 78
5.1 The likelihood function: Bernoulli distribution......Page 79
5.2 A description of beliefs: The beta distribution......Page 80
5.2.1 Specifying a beta prior......Page 81
5.2.2 The posterior beta......Page 83
5.3.1 Estimating the binomial proportion......Page 84
5.3.2 Predicting data......Page 85
5.3.3 Model comparison......Page 86
5.4 Summary: How to do Bayesian inference......Page 88
5.5.1 R code for Figure 5.2......Page 89
5.6 Exercises......Page 92
6 Inferring a Binomial Proportion via Grid Approximation......Page 96
6.2 Discretizing a continuous prior density......Page 97
6.2.1 Examples using discretized priors......Page 98
6.3 Estimation......Page 100
6.4 Prediction of subsequent data......Page 101
6.6 Summary......Page 102
6.7.1 R code for Figure 6.2 etc......Page 103
6.8 Exercises......Page 105
7 Inferring a Binomial Proportion via the Metropolis Algorithm......Page 110
7.1 A simple case of the Metropolis algorithm......Page 111
7.1.1 A politician stumbles upon the Metropolis algorithm......Page 112
7.1.3 General properties of a random walk......Page 114
7.1.5 Why it works......Page 117
7.2 The Metropolis algorithm more generally......Page 121
7.2.2 Terminology: Markov chain Monte Carlo......Page 122
7.3 From the sampled posterior to the three goals......Page 123
7.3.1.1 Highest density intervals from random samples......Page 124
7.3.1.2 Using a sample to estimate an integral......Page 125
7.3.3 Model comparison: Estimation of p(D)......Page 126
7.4 MCMC in BUGS......Page 128
7.4.1 Parameter estimation with BUGS......Page 129
7.4.2 BUGS for prediction......Page 131
7.4.3 BUGS for model comparison......Page 132
7.5 Conclusion......Page 133
7.6.1 R code for a home-grown Metropolis......Page 134
7.7 Exercises......Page 136
8 Inferring Two Binomial Proportions via Gibbs Sampling......Page 140
8.1 Prior, likelihood and posterior for two proportions......Page 142
8.2 The posterior via exact formal analysis......Page 143
8.3 The posterior via grid approximation......Page 146
8.4 The posterior via Markov chain Monte Carlo......Page 147
8.4.1 Metropolis algorithm......Page 148
8.4.2 Gibbs sampling......Page 149
8.4.2.1 Disadvantages of Gibbs sampling......Page 152
8.5 Doing it with BUGS......Page 153
8.5.1 Sampling the prior in BUGS......Page 154
8.6 How different are the underlying biases?......Page 155
8.7 Summary......Page 156
8.8.1 R code for grid approximation (Figures 8.1 and 8.2)......Page 157
8.8.2 R code for Metropolis sampler (Figure 8.3)......Page 159
8.8.3 R code for BUGS sampler (Figure 8.6)......Page 162
8.8.4 R code for plotting a posterior histogram......Page 164
8.9 Exercises......Page 166
9 Bernoulli Likelihood with Hierarchical Prior......Page 170
9.1 A single coin from a single mint......Page 171
9.1.1 Posterior via grid approximation......Page 174
9.2 Multiple coins from a single mint......Page 177
9.2.1 Posterior via grid approximation......Page 179
9.2.2 Posterior via Monte Carlo sampling......Page 182
9.2.2.1 Doing it with BUGS......Page 184
9.2.3 Outliers and shrinkage of individual estimates......Page 188
9.2.4 Case study: Therapeutic touch......Page 190
9.3.1 Independent mints......Page 191
9.3.2 Dependent mints......Page 195
9.3.3 Individual differences and meta-analysis......Page 197
9.5.1 Code for analysis of therapeutic-touch experiment......Page 198
9.5.2 Code for analysis of filtration-condensation experiment......Page 201
9.6 Exercises......Page 204
10.1 Model comparison as hierarchical modeling......Page 208
10.2.1 A simple example......Page 210
10.2.2 A realistic example with β€œpseudopriors”......Page 212
10.2.3 Some practical advice when using transdimensional MCMC with pseudopriors......Page 217
10.3 Model comparison and nested models......Page 219
10.4.1 Comparing methods for MCMC model comparison......Page 221
10.4.2 Summary and caveats......Page 222
10.5 Exercises......Page 223
11 Null Hypothesis Significance Testing......Page 228
11.1.1 When the experimenter intends to fix N......Page 230
11.1.2 When the experimenter intends to fix z......Page 232
11.1.3 Soul searching......Page 233
11.2 Prior knowledge about the coin......Page 235
11.2.2.1 Priors are overt and should influence......Page 236
11.3.1 NHST confidence interval......Page 237
11.4 Multiple comparisons......Page 240
11.4.1 NHST correction for experimentwise error......Page 241
11.4.2 Just one Bayesian posterior no matter how you look at it......Page 243
11.5.1 Planning an experiment......Page 244
11.5.2 Exploring model predictions (posterior predictive check)......Page 245
11.6 Exercises......Page 246
12 Bayesian Approaches to Testing a Point (β€œNull”) Hypothesis......Page 252
12.1.1 Is a null value of a parameter among the credible values?......Page 253
12.1.2 Is a null value of a difference among the credible values?......Page 254
12.1.2.1 Differences of correlated parameters......Page 255
12.1.3 Region of Practical Equivalence (ROPE)......Page 257
12.2 The model-comparison (two-prior) approach......Page 258
12.2.1 Are the biases of two coins equal or not?......Page 259
12.2.1.1 Formal analytical solution......Page 260
12.2.1.2 Example application......Page 261
12.2.2 Are different groups equal or not?......Page 262
12.3.2 Recommendations......Page 264
12.4.1 R code for Figure 12.5......Page 265
12.5 Exercises......Page 268
13 Goals, Power, and Sample Size......Page 272
13.1.1 Goals and Obstacles......Page 273
13.1.2 Power......Page 274
13.1.3 Sample Size......Page 275
13.2 Sample size for a single coin......Page 277
13.2.1 When the goal is to exclude a null value......Page 278
13.2.2 When the goal is precision......Page 279
13.3 Sample size for multiple mints......Page 280
13.4 Power: prospective, retrospective, and replication......Page 282
13.4.1 Power analysis requires verisimilitude of simulated data......Page 283
13.5 The importance of planning......Page 284
13.6.1 Sample size for a single coin......Page 285
13.6.2 Power and sample size for multiple mints......Page 287
13.7 Exercises......Page 294
III The Generalized Linear Model......Page 302
14 Overview of the Generalized Linear Model......Page 304
14.1.1 Predictor and predicted variables......Page 305
14.1.2 Scale types: metric, ordinal, nominal......Page 306
14.1.3 Linear function of a single metric predictor......Page 307
14.1.4 Additive combination of metric predictors......Page 309
14.1.5 Nonadditive interaction of metric predictors......Page 311
14.1.6.1 Linear model for a single nominal predictor......Page 313
14.1.6.2 Additive combination of nominal predictors......Page 315
14.1.6.3 Nonadditive interaction of nominal predictors......Page 316
14.1.7 Linking combined predictors to the predicted......Page 317
14.1.7.1 The sigmoid (a.k.a. logistic) function......Page 318
14.1.7.2 The cumulative normal (a.k.a. Phi) function......Page 320
14.1.9 Formal expression of the GLM......Page 321
14.2 Cases of the GLM......Page 324
14.2.1 Two or more nominal variables predicting frequency......Page 326
14.3 Exercises......Page 328
15 Metric Predicted Variable on a Single Group......Page 330
15.1.1 Solution by mathematical analysis......Page 331
15.1.2 Approximation by MCMC in BUGS......Page 335
15.1.3 Outliers and robust estimation: The t distribution......Page 336
15.1.4 When the data are non-normal: Transformations......Page 339
15.2 Repeated measures and individual differences......Page 341
15.2.1 Hierarchical model......Page 343
15.2.2 Implementation in BUGS......Page 344
15.4.1 Estimating the mean and precision of a normal likelihood......Page 346
15.4.2 Repeated measures: Normal across and normal within......Page 348
15.5 Exercises......Page 351
16 Metric Predicted Variable with One Metric Predictor......Page 356
16.1 Simple linear regression......Page 357
16.1.1 The hierarchical model and BUGS code......Page 359
16.1.1.1 Standardizing the data for MCMC sampling......Page 360
16.1.1.2 Initializing the chains......Page 361
16.1.2 The posterior: How big is the slope?......Page 362
16.1.3 Posterior prediction......Page 363
16.2 Outliers and robust regression......Page 365
16.3 Simple linear regression with repeated measures......Page 367
16.4 Summary......Page 370
16.5.1 Data generator for height and weight......Page 371
16.5.2 BRugs: Robust linear regression......Page 372
16.5.3 BRugs: Simple linear regression with repeated measures......Page 375
16.6 Exercises......Page 379
17 Metric Predicted Variable with Multiple Metric Predictors......Page 384
17.1.1 The perils of correlated predictors......Page 385
17.1.2 The model and BUGS program......Page 388
17.1.3 The posterior: How big are the slopes?......Page 389
17.1.4 Posterior prediction......Page 391
17.2 Hyperpriors and shrinkage of regression coefficients......Page 392
17.2.1 Informative priors, sparse data, and correlated predictors......Page 394
17.3 Multiplicative interaction of metric predictors......Page 396
17.3.1 The hierarchical model and BUGS code......Page 397
17.3.2 Interpreting the posterior......Page 398
17.4 Which predictors should be included?......Page 401
17.5.1 Multiple linear regression......Page 403
17.5.2 Multiple linear regression with hyperprior on coefficients......Page 407
17.6 Exercises......Page 412
18 Metric Predicted Variable with One Nominal Predictor......Page 414
18.1 Bayesian oneway ANOVA......Page 415
18.1.1 The hierarchical prior......Page 416
18.1.2 Doing it with R and BUGS......Page 417
18.1.3 A worked example......Page 419
18.1.3.1 Contrasts and complex comparisons......Page 420
18.1.3.2 Is there a difference?......Page 421
18.2 Multiple comparisons......Page 422
18.3 Two group Bayesian ANOVA and the NHST t test......Page 425
18.4.1 Bayesian oneway ANOVA......Page 426
18.5 Exercises......Page 430
19 Metric Predicted Variable with Multiple Nominal Predictors......Page 434
19.1 Bayesian multi-factor ANOVA......Page 435
19.1.1 Interaction of nominal predictors......Page 436
19.1.2 The hierarchical prior......Page 437
19.1.3 An example in R and BUGS......Page 438
19.1.4.1 Metric predictors and ANCOVA......Page 441
19.1.4.2 Interaction contrasts......Page 442
19.1.5 Non-crossover interactions, rescaling, and homogeneous variances......Page 443
19.2 Repeated measures, a.k.a. within-subject designs......Page 445
19.2.1 Why use a within-subject design? And why not?......Page 447
19.3.1 Bayesian two-factor ANOVA......Page 448
19.4 Exercises......Page 457
20 Dichotomous Predicted Variable......Page 462
20.1 Logistic regression......Page 463
20.1.2 Doing it in R and BUGS......Page 464
20.1.3 Interpreting the posterior......Page 465
20.1.6 Hyperprior across regression coefficients......Page 467
20.2 Interaction of predictors in logistic regression......Page 468
20.3 Logistic ANOVA......Page 469
20.4 Summary......Page 471
20.5.1 Logistic regression code......Page 472
20.5.2 Logistic ANOVA code......Page 476
20.6 Exercises......Page 481
21 Ordinal Predicted Variable......Page 484
21.1.2 The mapping from metric x to ordinal y......Page 485
21.1.3 The parameters and their priors......Page 487
21.1.5 Posterior prediction......Page 488
21.2 Some examples......Page 489
21.2.1 Why are some thresholds outside the data?......Page 491
21.3 Interaction......Page 493
21.5 R code......Page 494
21.6 Exercises......Page 499
22 Contingency Table Analysis......Page 502
22.1.2 The exponential link function......Page 503
22.1.3 The Poisson likelihood......Page 506
22.1.4 The parameters and the hierarchical prior......Page 507
22.2.1 Credible intervals on cell probabilities......Page 508
22.3 Log linear models for contingency tables......Page 509
22.4 R code for Poisson exponential model......Page 510
22.5 Exercises......Page 517
23 Tools in the Trunk......Page 520
23.1.1 Essential points......Page 521
23.1.3 Helpful points......Page 522
23.2 MCMC burn-in and thinning......Page 523
23.3.2 R code for computing HDI of a MCMC sample......Page 526
23.3.3 R code for computing HDI of a function......Page 528
23.4.1 Examples......Page 529
23.4.2 Reparameterization of two parameters......Page 530
References......Page 532
Index......Page 540
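Chapter 7's "home-grown Metropolis" (Section 7.6.1) is the conceptual hinge of the book's MCMC material. The following is a compact Python sketch of that idea applied to the same coin-bias problem, not the book's own R code; the data (11 heads in 14 flips), the uniform prior, and the proposal width are all assumed for illustration.

```python
import math
import random

def metropolis_bernoulli(z, N, n_steps=20000, seed=0):
    """Sample the posterior of a coin's bias theta, given z heads in N
    flips and a uniform prior, via random-walk Metropolis sampling."""
    rng = random.Random(seed)

    def log_post(theta):
        # Unnormalized log posterior: Bernoulli likelihood x flat prior.
        if not 0.0 < theta < 1.0:
            return float("-inf")
        return z * math.log(theta) + (N - z) * math.log(1.0 - theta)

    theta, chain = 0.5, []
    for _ in range(n_steps):
        proposal = theta + rng.gauss(0.0, 0.1)  # symmetric random-walk step
        # Accept with probability min(1, posterior ratio), in log space.
        if math.log(rng.random()) < log_post(proposal) - log_post(theta):
            theta = proposal
        chain.append(theta)
    return chain[n_steps // 2:]  # discard the first half as burn-in

chain = metropolis_bernoulli(z=11, N=14)
print(sum(chain) / len(chain))  # close to the analytic posterior mean 0.75
```

Because the acceptance rule only needs a posterior *ratio*, the normalizing constant p(D) never has to be computed, which is exactly why the method scales to the hierarchical models of Part III.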


πŸ“œ SIMILAR VOLUMES


Doing Bayesian Data Analysis: A Tutorial
✍ John K. Kruschke πŸ“‚ Library πŸ“… 2010 πŸ› Academic Press 🌐 English

There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis obtainable to a wide audience. Doing Bayesian Data Analysis, A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data

Doing Bayesian Data Analysis: A Tutorial
✍ John K. Kruschke πŸ“‚ Library πŸ“… 2010 πŸ› Academic Press 🌐 English

There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis tractable and accessible to a wide audience. Doing Bayesian Data Analysis, A Tutorial Introduction with R and BUGS, is for first year graduate stu

Doing Bayesian Data Analysis: A Tutorial
✍ John K. Kruschke πŸ“‚ Library πŸ“… 2014 πŸ› Academic Press 🌐 English

Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan, Second Edition provides an accessible approach for conducting Bayesian data analysis, as material is explained clearly with concrete examples. Included are step-by-step instructions on how to carry out Bayesian data analyses

Doing Bayesian Data Analysis: A Tutorial
✍ John K. Kruschke πŸ“‚ Library πŸ“… 2015 πŸ› Academic Press 🌐 English

There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis obtainable to a wide audience. "Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan" provides an accessible approach to Bayesian data ana