Statistical Theory: A Concise Introduction

✍ Scribed by Felix Abramovich, Ya'acov Ritov


Publisher
CRC Press/Chapman & Hall
Year
2022
Tongue
English
Leaves
237
Series
Chapman & Hall/CRC Texts in Statistical Science
Edition
2
Category
Library


✦ Synopsis


Designed for a one-semester advanced undergraduate or graduate statistical theory course, Statistical Theory: A Concise Introduction, Second Edition clearly explains the underlying ideas, mathematics, and principles of major statistical concepts, including parameter estimation, confidence intervals, hypothesis testing, asymptotic analysis, Bayesian inference, linear models, nonparametric statistics, and elements of decision theory. It introduces these topics on a clear intuitive level using illustrative examples in addition to the formal definitions, theorems, and proofs.

Based on the authors’ lecture notes, the book is self-contained and maintains a proper balance between clarity and rigor of exposition. In a few cases, the authors present a "sketched" version of a proof, explaining its main ideas rather than giving detailed technical mathematical and probabilistic arguments.

Features:

    • Second edition has been updated with a new chapter on Nonparametric Estimation; a significant update to the chapter on Statistical Decision Theory; and other updates throughout
    • No requirement for heavy calculus, and simple questions throughout the text help students check their understanding of the material
    • Each chapter also includes a set of exercises that range in level of difficulty
    • Self-contained, so students can use it to learn the theory on their own
    • Chapters and sections marked by asterisks contain more advanced topics and may be omitted
    • Special chapters on linear models and nonparametric statistics show how the main theoretical concepts can be applied to well-known and frequently used statistical tools

    The primary audience for the book is students who want to understand the theoretical basis of mathematical statistics―either advanced undergraduate or graduate students. It will also be an excellent reference for researchers from statistics and other quantitative disciplines.

    ✦ Table of Contents


    Cover
    Half Title
    Series Page
    Title Page
    Copyright Page
    Contents
    List of Figures
    List of Tables
    Preface to the Second Edition
    1. Introduction
    1.1. Preamble
    1.2. Likelihood
    1.3. Sufficiency
    1.4. Minimal sufficiency
    1.5. Completeness
    1.6. Exponential family of distributions
    1.7. Exercises
    2. Point Estimation
    2.1. Introduction
    2.2. Maximum likelihood estimation
    2.3. Method of moments
    2.4. Method of least squares
    2.5. M-estimators
    2.6. Goodness-of-estimation: mean squared error
    2.7. Unbiased estimation
    2.7.1. Definition and main properties
    2.7.2. Uniformly minimum variance unbiased estimators: The Cramér–Rao lower bound
    2.7.3. The Cramér–Rao lower bound for the multiparameter case
    2.7.4. Rao–Blackwell theorem
    2.7.5. Lehmann–Scheffé theorem
    2.8. Exercises
    3. Confidence Intervals, Bounds, and Regions
    3.1. Introduction
    3.2. Quoting the estimation error
    3.3. Confidence intervals
    3.4. Confidence bounds
    3.5. Confidence regions
    3.6. Exercises
    4. Hypothesis Testing
    4.1. Introduction
    4.2. Simple hypotheses
    4.2.1. Type I and Type II errors
    4.2.2. Choice of a critical value
    4.2.3. The p-value
    4.2.4. Maximal power tests. Neyman–Pearson lemma
    4.3. Composite hypotheses
    4.3.1. Power function
    4.3.2. Uniformly most powerful tests
    4.3.3. Generalized likelihood ratio tests
    4.4. Duality between hypothesis testing and confidence intervals (regions)
    4.5. Sequential testing
    4.6. Multiple testing
    4.6.1. Family-wise error
    4.6.2. False discovery rate
    4.7. Exercises
    5. Asymptotic Analysis
    5.1. Introduction
    5.2. Convergence and consistency in MSE
    5.3. Convergence and consistency in probability
    5.4. Convergence in distribution
    5.5. The central limit theorem
    5.6. Asymptotically normal consistency
    5.7. Asymptotic confidence intervals
    5.8. Asymptotic properties of MLEs, Wald confidence intervals, and tests
    5.9. Multiparameter case
    5.10. Asymptotic properties of M-estimators
    5.11. Score (Rao) asymptotic tests and confidence regions
    5.12. Asymptotic distribution of the GLRT, Wilks’ theorem
    5.13. Exercises
    6. Bayesian Inference
    6.1. Introduction
    6.2. Choice of priors
    6.2.1. Conjugate priors
    6.2.2. Noninformative (objective) priors
    6.3. Point estimation
    6.4. Interval estimation: Credible sets
    6.5. Hypothesis testing
    6.5.1. Simple hypotheses
    6.5.2. Composite hypotheses
    6.5.3. Testing a point null hypothesis
    6.6. Asymptotic properties of the posterior distribution
    6.7. Exercises
    7. Elements of Statistical Decision Theory
    7.1. Introduction and notations
    7.2. Risk function and admissibility
    7.3. Minimax risk and minimax rules
    7.4. Bayes risk and Bayes rules
    7.5. Posterior expected loss and Bayes actions
    7.6. Admissibility of Bayes rules
    7.7. Minimaxity and Bayes rules
    7.8. Exercises
    8. Linear Models
    8.1. Introduction
    8.2. Definition and examples
    8.3. Estimation of regression coefficients
    8.4. Residuals. Estimation of the variance
    8.5. Examples
    8.5.1. Estimation of a normal mean
    8.5.2. Comparison between the means of two independent normal samples with a common variance
    8.5.3. Simple linear regression
    8.6. Goodness-of-fit: Multiple correlation coefficient
    8.7. Confidence intervals and regions for the coefficients
    8.8. Hypothesis testing in linear models
    8.8.1. Testing significance of a single predictor
    8.8.2. Testing significance of a group of predictors
    8.8.3. Testing a general linear hypothesis
    8.9. Predictions
    8.10. Analysis of variance
    8.10.1. One-way ANOVA
    8.10.2. Two-way ANOVA and beyond
    9. *Nonparametric Estimation
    9.1. Introduction
    9.2. The empirical distribution function and the histogram
    9.3. Kernel density estimation
    9.4. The minimax rate
    9.5. Nonparametric kernel regression
    9.6. Nonparametric estimation by orthonormal series
    9.6.1. Orthonormal series
    9.6.2. Cosine series
    9.6.3. Nonparametric density estimation by cosine series
    9.6.4. Nonparametric regression and orthonormal cosine series
    9.7. Spline smoothing
    9.8. Choice of the smoothing parameter
    A. Probabilistic Review
    A.1. Introduction
    A.2. Basic probabilistic laws
    A.3. Random variables
    A.3.1. Expected value and the variance
    A.3.2. Chebyshev’s and Markov’s inequalities
    A.3.3. Expectation of functions and Jensen’s inequality
    A.3.4. Joint distribution
    A.3.5. Covariance, correlation, and the Cauchy–Schwarz inequality
    A.3.6. Expectation and variance of a sum of random variables
    A.3.7. Conditional distribution and Bayes’ theorem
    A.3.8. Distributions of functions of random variables
    A.3.9. Random vectors
    A.4. Special families of distributions
    A.4.1. Bernoulli and binomial distributions
    A.4.2. Geometric and negative binomial distributions
    A.4.3. Hypergeometric distribution
    A.4.4. Poisson distribution
    A.4.5. Uniform distribution
    A.4.6. Exponential distribution
    A.4.7. Weibull distribution
    A.4.8. Gamma distribution
    A.4.9. Beta distribution
    A.4.10. Cauchy distribution
    A.4.11. Normal distribution
    A.4.12. Log-normal distribution
    A.4.13. χ² distribution
    A.4.14. t-distribution
    A.4.15. F-distribution
    A.4.16. Multinormal distribution
    A.4.16.1. Definition and main properties
    A.4.16.2. Projections of normal vectors
    B. Solutions of Selected Exercises
    B.1. Chapter 1
    B.2. Chapter 2
    B.3. Chapter 3
    B.4. Chapter 4
    B.5. Chapter 5
    B.6. Chapter 6
    B.7. Chapter 7
    Bibliography
    Index


    📜 SIMILAR VOLUMES


    Statistical Theory: A Concise Introduction
    ✍ Abramovich, Felix; Ritov, Ya'acov 📂 Library 📅 2013 🏛 CRC Press 🌐 English

    Introduction (Preamble; Likelihood; Sufficiency; Minimal sufficiency; Completeness; Exponential family of distributions); Point Estimation (Introduction; Maximum likelihood estimation; Method of moments; Method of least squares; Goodness-of-estimation: mean squared error; Unbiased estimation); Confidence Intervals,

    Theory of Spatial Statistics: A Concise Introduction
    ✍ M.N.M. van Lieshout 📂 Library 📅 2019 🏛 CRC Press 🌐 English

    Theory of Spatial Statistics: A Concise Introduction presents the most important models used in spatial statistics, including random fields and point processes, from a rigorous mathematical point of view and shows how to carry out statistical inference. It contains full proofs, real-life examples a

    A Concise Introduction to Statistical Inference
    ✍ Jacco Thijssen 📂 Library 📅 2016 🏛 Chapman and Hall/CRC 🌐 English

    This short book introduces the main ideas of statistical inference in a way that is both user friendly and mathematically sound. Particular emphasis is placed on the common foundation of many models used in practice. In addition, the book focuses on the formulation of appropriate statistical mode

    Optimization Theory: A Concise Introduction
    ✍ Jiongmin Yong 📂 Library 📅 2018 🏛 World Scientific Publishing Company 🌐 English

    Mathematically, most of the interesting optimization problems can be formulated to optimize some objective function, subject to some equality and/or inequality constraints. This book introduces some classical and basic results of optimization theory, including nonlinear programming with Lagrange mul
