Likelihood and Bayesian Inference (Statistics for Biology and Health)
✍ Written by Held
- Publisher: Springer
- Year: 2020
- Language: English
- Pages: 409
- Category: Library
✦ Synopsis
This richly illustrated textbook covers modern statistical methods with applications in medicine, epidemiology and biology. The first part discusses the importance of statistical models in applied quantitative research and the central role of the likelihood function, describing likelihood-based inference from a frequentist viewpoint and exploring the properties of the maximum likelihood estimate, the score function, the likelihood ratio and the Wald statistic. In the second part of the book, likelihood is combined with prior information to perform Bayesian inference. Topics include Bayesian updating, conjugate and reference priors, Bayesian point and interval estimates, Bayesian asymptotics and empirical Bayes methods. A separate chapter covers modern numerical techniques for Bayesian inference, and advanced topics such as model choice and prediction are addressed from both frequentist and Bayesian perspectives. This revised edition of the book "Applied Statistical Inference" has been expanded with new material on Markov models for time series analysis. It also features a comprehensive appendix covering the prerequisites in probability theory, matrix algebra, mathematical calculus and numerical analysis, and each chapter is complemented by exercises. The text is primarily intended for graduate statistics and biostatistics students with an interest in applications.
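To make the synopsis concrete, here is a minimal sketch (not taken from the book) of the kind of problem it opens with, inference for a binomial proportion: the maximum likelihood estimate, a Wald confidence interval based on the estimated standard error, and the conjugate Bayesian update of a Beta prior. All function names and the example data (7 successes in 20 trials) are illustrative assumptions.

```python
# Illustrative sketch: likelihood and Bayesian inference for a
# binomial proportion (x successes in n trials). Not code from the book.
import math

def binomial_log_likelihood(p, x, n):
    """Log-likelihood of x successes in n trials, up to an additive constant."""
    return x * math.log(p) + (n - x) * math.log(1 - p)

def mle(x, n):
    """Maximum likelihood estimate of the success probability."""
    return x / n

def wald_ci(x, n, z=1.96):
    """Approximate 95% Wald confidence interval, using the standard error
    derived from the observed Fisher information."""
    p = mle(x, n)
    se = math.sqrt(p * (1 - p) / n)
    return (p - z * se, p + z * se)

def beta_posterior(x, n, a=1.0, b=1.0):
    """Conjugate Bayesian update: a Beta(a, b) prior combined with a
    binomial likelihood yields a Beta(a + x, b + n - x) posterior."""
    return (a + x, b + n - x)

x, n = 7, 20                            # hypothetical data
p_hat = mle(x, n)                       # 0.35
lo, hi = wald_ci(x, n)                  # Wald interval around p_hat
a_post, b_post = beta_posterior(x, n)   # Beta(8, 14) under a uniform prior
posterior_mean = a_post / (a_post + b_post)
```

The contrast between `wald_ci` (a frequentist interval built from the curvature of the log-likelihood) and `beta_posterior` (a prior updated by the same likelihood) mirrors the two-part structure of the book described above.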
✦ Table of Contents
Likelihood and Bayesian Inference
Preface
Contents
Chapter 1: Introduction
1.1 Examples
1.1.1 Inference for a Proportion
1.1.2 Comparison of Proportions
1.1.3 The Capture-Recapture Method
1.1.4 Hardy-Weinberg Equilibrium
1.1.5 Estimation of Diagnostic Test Characteristics
1.1.6 Quantifying Disease Risk from Cancer Registry Data
1.1.7 Predicting Blood Alcohol Concentration
1.1.8 Analysis of Survival Times
1.2 Statistical Models
1.3 Contents and Notation of the Book
1.4 References
Chapter 2: Likelihood
2.1 Likelihood and Log-Likelihood Function
2.1.1 Maximum Likelihood Estimate
2.1.2 Relative Likelihood
2.1.3 Invariance of the Likelihood
2.1.4 Generalised Likelihood
2.2 Score Function and Fisher Information
2.3 Numerical Computation of the Maximum Likelihood Estimate
2.3.1 Numerical Optimisation
2.3.2 The EM Algorithm
2.4 Quadratic Approximation of the Log-Likelihood Function
2.5 Sufficiency
2.5.1 Minimal Sufficiency
2.5.2 The Likelihood Principle
2.6 Exercises
2.7 References
Chapter 3: Elements of Frequentist Inference
3.1 Unbiasedness and Consistency
3.2 Standard Error and Confidence Interval
3.2.1 Standard Error
3.2.2 Confidence Interval
3.2.3 Pivots
3.2.4 The Delta Method
3.2.5 The Bootstrap
3.3 Significance Tests and P-Values
3.4 Exercises
3.5 References
Chapter 4: Frequentist Properties of the Likelihood
4.1 The Expected Fisher Information and the Score Statistic
4.1.1 The Expected Fisher Information
4.1.2 Properties of the Expected Fisher Information
4.1.3 The Score Statistic
4.1.4 The Score Test
4.1.5 Score Confidence Intervals
4.2 The Distribution of the ML Estimator and the Wald Statistic
4.2.1 Cramér-Rao Lower Bound
4.2.2 Consistency of the ML Estimator
4.2.3 The Distribution of the ML Estimator
4.2.4 The Wald Statistic
4.3 Variance-Stabilising Transformations
4.4 The Likelihood Ratio Statistic
4.4.1 The Likelihood Ratio Test
4.4.2 Likelihood Ratio Confidence Intervals
4.5 The p* Formula
4.6 A Comparison of Likelihood-Based Confidence Intervals
4.7 Exercises
4.8 References
Chapter 5: Likelihood Inference in Multiparameter Models
5.1 Score Vector and Fisher Information Matrix
5.2 Standard Error and Wald Confidence Interval
5.3 Profile Likelihood
5.4 Frequentist Properties of the Multiparameter Likelihood
5.4.1 The Score Statistic
5.4.2 The Wald Statistic
5.4.3 The Multivariate Delta Method
5.4.4 The Likelihood Ratio Statistic
5.5 The Generalised Likelihood Ratio Statistic
5.6 Conditional Likelihood
5.7 Exercises
5.8 References
Chapter 6: Bayesian Inference
6.1 Bayes' Theorem
6.2 Posterior Distribution
6.3 Choice of the Prior Distribution
6.3.1 Conjugate Prior Distributions
6.3.2 Improper Prior Distributions
6.3.3 Jeffreys' Prior Distributions
6.4 Properties of Bayesian Point and Interval Estimates
6.4.1 Loss Function and Bayes Estimates
6.4.2 Compatible and Invariant Bayes Estimates
6.5 Bayesian Inference in Multiparameter Models
6.5.1 Conjugate Prior Distributions
6.5.2 Jeffreys' and Reference Prior Distributions
6.5.3 Elimination of Nuisance Parameters
6.5.4 Compatibility of Uni- and Multivariate Point Estimates
6.6 Some Results from Bayesian Asymptotics
6.6.1 Discrete Asymptotics
6.6.2 Continuous Asymptotics
6.7 Empirical Bayes Methods
6.8 Exercises
6.9 References
Chapter 7: Model Selection
7.1 Likelihood-Based Model Selection
7.1.1 Akaike's Information Criterion
7.1.2 Cross Validation and AIC
7.1.3 Bayesian Information Criterion
7.2 Bayesian Model Selection
7.2.1 Marginal Likelihood and Bayes Factor
7.2.2 Marginal Likelihood and BIC
7.2.3 Deviance Information Criterion
7.2.4 Model Averaging
7.3 Exercises
7.4 References
Chapter 8: Numerical Methods for Bayesian Inference
8.1 Standard Numerical Techniques
8.2 Laplace Approximation
8.3 Monte Carlo Methods
8.3.1 Monte Carlo Integration
8.3.2 Importance Sampling
8.3.3 Rejection Sampling
8.4 Markov Chain Monte Carlo
8.5 Numerical Calculation of the Marginal Likelihood
8.5.1 Calculation Through Numerical Integration
8.5.2 Monte Carlo Estimation of the Marginal Likelihood
8.6 Exercises
8.7 References
Chapter 9: Prediction
9.1 Plug-in Prediction
9.2 Likelihood Prediction
9.2.1 Predictive Likelihood
9.2.2 Bootstrap Prediction
9.3 Bayesian Prediction
9.3.1 Posterior Predictive Distribution
9.3.2 Computation of the Posterior Predictive Distribution
9.3.3 Model Averaging
9.4 Assessment of Predictions
9.4.1 Discrimination and Calibration
9.4.2 Scoring Rules
9.5 Exercises
9.6 References
Chapter 10: Markov Models for Time Series Analysis
10.1 The Markov Property
10.2 Observation-Driven Models for Categorical Data
10.2.1 Maximum Likelihood Inference
10.2.2 Prediction
10.2.3 Inclusion of Covariates
10.3 Observation-Driven Models for Continuous Data
10.3.1 The First-Order Autoregressive Model
10.3.2 Maximum Likelihood Inference
10.3.3 Inclusion of Covariates
10.3.4 Prediction
10.4 Parameter-Driven Models
10.4.1 The Likelihood Function
10.4.2 The Posterior Distribution
10.5 Hidden Markov Models
10.5.1 The Viterbi Algorithm
10.5.2 Bayesian Inference for Hidden Markov Models
10.6 State Space Models
10.7 Exercises
10.8 References
Appendix A: Probabilities, Random Variables and Distributions
A.1 Events and Probabilities
A.1.1 Conditional Probabilities and Independence
A.1.2 Bayes' Theorem
A.2 Random Variables
A.2.1 Discrete Random Variables
A.2.2 Continuous Random Variables
A.2.3 The Change-of-Variables Formula
A.2.4 Multivariate Normal Distributions
A.3 Expectation, Variance and Covariance
A.3.1 Expectation
A.3.2 Variance
A.3.3 Moments
A.3.4 Conditional Expectation and Variance
A.3.5 Covariance
A.3.6 Correlation
A.3.7 Jensen's Inequality
A.3.8 Kullback-Leibler Discrepancy and Information Inequality
A.4 Convergence of Random Variables
A.4.1 Modes of Convergence
A.4.2 Continuous Mapping and Slutsky's Theorem
A.4.3 Law of Large Numbers
A.4.4 Central Limit Theorem
A.4.5 Delta Method
A.5 Probability Distributions
A.5.1 Univariate Discrete Distributions
A.5.2 Univariate Continuous Distributions
A.5.3 Multivariate Distributions
Appendix B: Some Results from Matrix Algebra and Calculus
B.1 Some Matrix Algebra
B.1.1 Trace, Determinant and Inverse
B.1.2 Cholesky Decomposition
B.1.3 Inversion of Block Matrices
B.1.4 Sherman-Morrison Formula
B.1.5 Combining Quadratic Forms
B.2 Some Results from Mathematical Calculus
B.2.1 The Gamma and Beta Functions
B.2.2 Multivariate Derivatives
B.2.3 Taylor Approximation
B.2.4 Leibniz Integral Rule
B.2.5 Lagrange Multipliers
B.2.6 Landau Notation
Appendix C: Some Numerical Techniques
C.1 Optimisation and Root Finding Algorithms
C.1.1 Motivation
C.1.2 Bisection Method
C.1.3 Newton-Raphson Method
C.1.4 Secant Method
C.2 Integration
C.2.1 Newton-Cotes Formulas
C.2.2 Laplace Approximation
Notation
References
Index
📜 SIMILAR VOLUMES
Filling a gap in current Bayesian theory, Statistical Inference: An Integrated Bayesian/Likelihood Approach presents a unified Bayesian treatment of parameter inference and model comparisons that can be used with simple diffuse prior specifications. This novel approach provides new solutions to dif…
This book covers modern statistical inference based on likelihood with applications in medicine, epidemiology and biology. Two introductory chapters discuss the importance of statistical models in applied quantitative research and the central role of the likelihood function. The rest of the bo…
This text concentrates on what can be achieved using the likelihood/Fisherian methods of taking into account uncertainty when studying a statistical problem. It takes the concept of the likelihood as the best method for unifying the demands of statistical modeling and theory of inference. Every like…