𝔖 Scriptorium
✦   LIBER   ✦

Probability and Statistical Inference: From Basic Principles to Advanced Models (Chapman & Hall/CRC Texts in Statistical Science)

✍ Scribed by Miltiadis C. Mavrakakis, Jeremy Penzer


Publisher: Chapman and Hall/CRC
Year: 2021
Tongue: English
Leaves: 444
Edition: 1
Category: Library

⬇  Acquire This Volume

No coin nor oath required. For personal study only.

✦ Synopsis


Probability and Statistical Inference: From Basic Principles to Advanced Models covers aspects of probability, distribution theory, and inference that are fundamental to a proper understanding of data analysis and statistical modelling. It presents these topics in an accessible manner without sacrificing mathematical rigour, bridging the gap between the many excellent introductory books and the more advanced, graduate-level texts. The book introduces and explores techniques that are relevant to modern practitioners, while being respectful to the history of statistical inference. It seeks to provide a thorough grounding in both the theory and application of statistics, with even the more abstract parts placed in the context of a practical setting.

Features:

• Complete introduction to mathematical probability, random variables, and distribution theory.
• Concise but broad account of statistical modelling, covering topics such as generalised linear models, survival analysis, time series, and random processes.
• Extensive discussion of the key concepts in classical statistics (point estimation, interval estimation, hypothesis testing) and the main techniques in likelihood-based inference.
• Detailed introduction to Bayesian statistics and associated topics.
• Practical illustration of some of the main computational methods used in modern statistical inference (simulation, bootstrap, MCMC).

This book is for students who have already completed a first course in probability and statistics, and now wish to deepen and broaden their understanding of the subject. It can serve as a foundation for advanced undergraduate or postgraduate courses. Our aim is to challenge and excite the more mathematically able students, while providing explanations of statistical concepts that are more detailed and approachable than those in advanced texts. This book is also useful for data scientists, researchers, and other applied practitioners who want to understand the theory behind the statistical methods used in their fields.
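As a small taste of the computational methods the synopsis lists, here is a minimal sketch of the nonparametric bootstrap in Python. It is not taken from the book; the sample data, function name, and settings are illustrative assumptions.

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric bootstrap percentile interval for a sample statistic."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(data)
    # Resample with replacement n_boot times, recomputing the statistic each time.
    reps = sorted(stat([data[rng.randrange(n)] for _ in range(n)])
                  for _ in range(n_boot))
    # Percentile interval: take the empirical alpha/2 and 1 - alpha/2 quantiles.
    low = reps[int((alpha / 2) * n_boot)]
    high = reps[int((1 - alpha / 2) * n_boot) - 1]
    return low, high

# Hypothetical sample of ten measurements.
sample = [4.9, 5.1, 5.6, 4.7, 5.3, 5.0, 5.8, 4.6, 5.2, 5.4]
low, high = bootstrap_ci(sample)
print(f"95% bootstrap CI for the mean: ({low:.2f}, {high:.2f})")
```

The percentile interval shown here is the simplest variant; texts such as this one also cover refinements (e.g. bias-corrected intervals) that behave better for skewed statistics.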

✦ Table of Contents


Cover
Half Title
Series Page
Title Page
Copyright Page
Dedication
Contents
Preface
About the authors
1. Introduction
2. Probability
2.1. Intuitive probability
2.2. Mathematical probability
2.2.1. Measure
2.2.2. Probability measure
2.3. Methods for counting outcomes
2.3.1. Permutations and combinations
2.3.2. Number of combinations and multinomial coefficients
2.4. Conditional probability and independence
2.4.1. Conditional probability
2.4.2. Law of total probability and Bayes' theorem
2.4.3. Independence
2.5. Further exercises
2.6. Chapter summary
3. Random variables and univariate distributions
3.1. Mapping outcomes to real numbers
3.2. Cumulative distribution functions
3.3. Discrete and continuous random variables
3.3.1. Discrete random variables and mass functions
3.3.2. Continuous random variables and density functions
3.3.3. Parameters and families of distributions
3.4. Expectation, variance, and higher moments
3.4.1. Mean of a random variable
3.4.2. Expectation operator
3.4.3. Variance of a random variable
3.4.4. Inequalities involving expectation
3.4.5. Moments
3.5. Generating functions
3.5.1. Moment-generating functions
3.5.2. Cumulant-generating functions and cumulants
3.6. Functions of random variables
3.6.1. Distribution and mass/density for g(X)
3.6.2. Monotone functions of random variables
3.7. Sequences of random variables and convergence
3.8. A more thorough treatment of random variables
3.9. Further exercises
3.10. Chapter summary
4. Multivariate distributions
4.1. Joint and marginal distributions
4.2. Joint mass and joint density
4.2.1. Mass for discrete distributions
4.2.2. Density for continuous distributions
4.3. Expectation and joint moments
4.3.1. Expectation of a function of several variables
4.3.2. Covariance and correlation
4.3.3. Joint moments
4.3.4. Joint moment-generating functions
4.4. Independent random variables
4.4.1. Independence for pairs of random variables
4.4.2. Mutual independence
4.4.3. Identical distributions
4.5. Random vectors and random matrices
4.6. Transformations of continuous random variables
4.6.1. Bivariate transformations
4.6.2. Multivariate transformations
4.7. Sums of random variables
4.7.1. Sum of two random variables
4.7.2. Sum of n independent random variables
4.8. Multivariate normal distribution
4.8.1. Bivariate case
4.8.2. n-dimensional multivariate case
4.9. Further exercises
4.10. Chapter summary
5. Conditional distributions
5.1. Discrete conditional distributions
5.2. Continuous conditional distributions
5.3. Relationship between joint, marginal, and conditional
5.4. Conditional expectation and conditional moments
5.4.1. Conditional expectation
5.4.2. Conditional moments
5.4.3. Conditional moment-generating functions
5.5. Hierarchies and mixtures
5.6. Random sums
5.7. Conditioning for random vectors
5.8. Further exercises
5.9. Chapter summary
6. Statistical models
6.1. Modelling terminology, conventions, and assumptions
6.1.1. Sample, observed sample, and parameters
6.1.2. Structural and distributional assumptions
6.2. Independent and identically distributed sequences
6.2.1. Random sample
6.2.2. Error sequences
6.3. Linear models
6.3.1. Simple linear regression
6.3.2. Multiple linear regression
6.3.3. Applications
6.4. Generalised linear models
6.4.1. Motivation
6.4.2. Link function
6.5. Time-to-event models
6.5.1. Survival function and hazard function
6.5.2. Censoring of time-to-event data
6.5.3. Covariates in time-to-event models
6.6. Time series models
6.6.1. Autoregressive models
6.6.2. Moving-average models
6.6.3. Autocovariance, autocorrelation, and stationarity
6.7. Poisson processes
6.7.1. Stochastic processes and counting processes
6.7.2. Definitions of the Poisson process
6.7.3. Thinning and superposition
6.7.4. Arrival and interarrival times
6.7.5. Compound Poisson process
6.7.6. Non-homogeneous Poisson process
6.8. Markov chains
6.8.1. Classification of states and chains
6.8.2. Absorption
6.8.3. Periodicity
6.8.4. Limiting distribution
6.8.5. Recurrence and transience
6.8.6. Continuous-time Markov chains
6.9. Further exercises
6.10. Chapter summary
7. Sample moments and quantiles
7.1. Sample mean
7.1.1. Mean and variance of the sample mean
7.1.2. Central limit theorem
7.2. Higher-order sample moments
7.2.1. Sample variance
7.2.2. Joint sample moments
7.3. Sample mean and variance for a normal population
7.4. Sample quantiles and order statistics
7.4.1. Sample minimum and sample maximum
7.4.2. Distribution of ith order statistic
7.5. Further exercises
7.6. Chapter summary
8. Estimation, testing, and prediction
8.1. Functions of a sample
8.1.1. Statistics
8.1.2. Pivotal functions
8.2. Point estimation
8.2.1. Bias, variance, and mean squared error
8.2.2. Consistency
8.2.3. The method of moments
8.2.4. Ordinary least squares
8.3. Interval estimation
8.3.1. Coverage probability and length
8.3.2. Constructing interval estimators using pivotal functions
8.3.3. Constructing interval estimators using order statistics
8.3.4. Confidence sets
8.4. Hypothesis testing
8.4.1. Statistical hypotheses
8.4.2. Decision rules
8.4.3. Types of error and the power function
8.4.4. Basic ideas in constructing tests
8.4.5. Conclusions and p-values from tests
8.5. Prediction
8.6. Further exercises
8.7. Chapter summary
9. Likelihood-based inference
9.1. Likelihood function and log-likelihood function
9.2. Score and information
9.3. Maximum-likelihood estimation
9.3.1. Properties of maximum-likelihood estimates
9.3.2. Numerical maximisation of likelihood
9.3.3. EM algorithm
9.4. Likelihood-ratio test
9.4.1. Testing in the presence of nuisance parameters
9.4.2. Properties of the likelihood ratio
9.4.3. Approximate tests
9.5. Further exercises
9.6. Chapter summary
10. Inferential theory
10.1. Sufficiency
10.1.1. Sufficient statistics and the sufficiency principle
10.1.2. Factorisation theorem
10.1.3. Minimal sufficiency
10.1.4. Application of sufficiency in point estimation
10.2. Variance of unbiased estimators
10.3. Most powerful tests
10.4. Further exercises
10.5. Chapter summary
11. Bayesian inference
11.1. Prior and posterior distributions
11.2. Choosing a prior
11.2.1. Constructing reference priors
11.2.2. Conjugate priors
11.3. Bayesian estimation
11.3.1. Point estimators
11.3.2. Absolute loss
11.3.3. 0-1 loss
11.3.4. Interval estimates
11.4. Hierarchical models and empirical Bayes
11.4.1. Hierarchical models
11.4.2. Empirical Bayes
11.4.3. Predictive inference
11.5. Further exercises
11.6. Chapter summary
12. Simulation methods
12.1. Simulating independent values from a distribution
12.1.1. Table lookup
12.1.2. Probability integral
12.1.3. Box-Muller method
12.1.4. Accept/reject method
12.1.5. Composition
12.1.6. Simulating model structure and the bootstrap
12.2. Monte Carlo integration
12.2.1. Averaging over simulated instances
12.2.2. Univariate vs. multivariate integrals
12.2.3. Importance sampling
12.2.4. Antithetic variates
12.3. Markov chain Monte Carlo
12.3.1. Discrete Metropolis
12.3.2. Continuous Metropolis
12.3.3. Metropolis-Hastings algorithm
12.3.4. Gibbs sampler
12.4. Further exercises
12.5. Chapter summary
A. Proof of Proposition 5.7.2
Index


📜 SIMILAR VOLUMES


Probability and Statistical Inference (Chapman & Hall/CRC Texts in Statistical Science)
✍ Miltiadis C. Mavrakakis; Jeremy Penzer 📂 Library 📅 2021 🏛 CRC Press 🌐 English

Probability and Statistical Inference: From Basic Principles to Advanced Models covers aspects of probability, distribution theory, and inference that are fundamental to a proper understanding of data analysis and statistical modelling. It presents these topics in an accessible manner without sacrificing…

Probability and Statistical Inference: From Basic Principles to Advanced Models
✍ Miltiadis C. Mavrakakis, Jeremy Penzer 📂 Library 📅 2021 🏛 Chapman and Hall/CRC 🌐 English

Probability and Statistical Inference: From Basic Principles to Advanced Models covers aspects of probability, distribution theory, and inference that are fundamental to a proper understanding of data analysis and statistical modelling. It presents these topics in an accessible…

Statistical Inference (Chapman & Hall/CRC Texts in Statistical Science)
✍ George Casella, Roger Berger 📂 Library 📅 2024 🏛 Chapman and Hall/CRC 🌐 English

This classic textbook builds theoretical statistics from the first principles of probability theory. Starting from the basics of probability, the authors develop the theory of statistical inference using techniques, definitions, and concepts that are statistical and natural extensions…

Introduction to Probability (Chapman & Hall/CRC Texts in Statistical Science)
✍ Joseph K. Blitzstein, Jessica Hwang 📂 Library 📅 2019 🏛 CRC Press 🌐 English

"Introduction to Probability is a very nice text for a calculus-based first course in probability. … The exercises are truly impressive. There are about 600 and some of them are very interesting and new to me. … The website has R code, the previously mentioned solutions, and many videos from the authors…"

Time Series: Modeling, Computation, and Inference
✍ Raquel Prado, Marco A. R. Ferreira, Mike West 📂 Library 📅 2021 🏛 Chapman and Hall/CRC 🌐 English

Focusing on Bayesian approaches and computations using analytic and simulation-based methods for inference, Time Series: Modeling, Computation, and Inference, Second Edition integrates mainstream approaches for time series modeling with significant recent developments…