Time Series: Modeling, Computation, and Inference (Chapman & Hall/CRC Texts in Statistical Science)
by Raquel Prado, Marco A. R. Ferreira, Mike West
- Publisher: Chapman and Hall/CRC
- Year: 2021
- Language: English
- Pages: 473
- Edition: 2
- Category: Library
Synopsis
Focusing on Bayesian approaches and computations using analytic and simulation-based methods for inference, Time Series: Modeling, Computation, and Inference, Second Edition integrates mainstream approaches to time series modeling with significant recent developments in methodology and applications of time series analysis. It provides a graduate-level account of Bayesian time series modeling, analysis, and forecasting; a broad range of references to state-of-the-art approaches to univariate and multivariate time series analysis; and connections to research frontiers in multivariate time series modeling and forecasting.
It presents overviews of several classes of models and related methodology for inference, statistical computation for model fitting and assessment, and forecasting. It explores the connections between time- and frequency-domain approaches and develops various models and analyses using Bayesian formulations and computation, including computations based on Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) methods. It illustrates the models and methods with examples and case studies from a variety of fields, including signal processing, biomedicine, environmental science, and finance.
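To give a flavor of the MCMC computations the book develops, here is a minimal, self-contained sketch (our illustration, not code from the book) of a random-walk Metropolis-Hastings sampler for the coefficient of an AR(1) model with known noise variance; all parameter values are arbitrary and the flat prior ignores stationarity constraints for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) series: y_t = phi * y_{t-1} + e_t, e_t ~ N(0, v).
phi_true, v, T = 0.8, 1.0, 500
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + rng.normal(scale=np.sqrt(v))

def loglik(phi):
    # Conditional log-likelihood given y[0] (additive constants dropped).
    resid = y[1:] - phi * y[:-1]
    return -0.5 * np.sum(resid**2) / v

# Random-walk Metropolis-Hastings under a flat prior on phi.
n_iter, step = 5000, 0.05
draws = np.empty(n_iter)
phi, ll = 0.0, loglik(0.0)
for i in range(n_iter):
    prop = phi + step * rng.normal()     # propose a local move
    ll_prop = loglik(prop)
    if np.log(rng.uniform()) < ll_prop - ll:  # accept/reject step
        phi, ll = prop, ll_prop
    draws[i] = phi

print("posterior mean of phi ~", draws[1000:].mean())  # discard burn-in
```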
Along with core models and methods, the book presents state-of-the-art approaches to analysis and forecasting in challenging time series problems. It also demonstrates the growth of time series analysis into new application areas in recent years, and connects with recent and relevant modeling developments and research challenges.
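The sequential Monte Carlo methods covered in Chapter 6 can likewise be illustrated with a bootstrap particle filter on a toy local-level state-space model. This is a hedged sketch under simple Gaussian assumptions, not the book's own code; the variances q and r and the particle count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Local-level state-space model:
#   x_t = x_{t-1} + w_t,  w_t ~ N(0, q)   (state evolution)
#   y_t = x_t + v_t,      v_t ~ N(0, r)   (observation)
q, r, T = 0.1, 1.0, 200
x = np.cumsum(rng.normal(scale=np.sqrt(q), size=T))
y = x + rng.normal(scale=np.sqrt(r), size=T)

# Bootstrap particle filter: sequential importance sampling with resampling.
N = 2000
particles = rng.normal(scale=1.0, size=N)   # prior draws for x_0
filt_mean = np.empty(T)
for t in range(T):
    # Propagate particles through the state equation (bootstrap proposal).
    particles = particles + rng.normal(scale=np.sqrt(q), size=N)
    # Weight by the observation density N(y_t | x_t, r).
    logw = -0.5 * (y[t] - particles) ** 2 / r
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filt_mean[t] = np.sum(w * particles)    # filtered posterior mean
    # Multinomial resampling to combat weight degeneracy.
    particles = particles[rng.choice(N, size=N, p=w)]

print("RMSE of filtered mean:", np.sqrt(np.mean((filt_mean - x) ** 2)))
```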
New in the second edition:
- Expanded coverage of core model theory and methodology.
- Multiple new examples and exercises.
- Detailed development of dynamic factor models.
- Updated discussion and connections with recent and current research frontiers.
Table of Contents
Cover
Half Title
Series Page
Title Page
Copyright Page
Contents
Preface
Authors
1. Notation, definitions, and basic inference
1.1. Problem Areas and Objectives
1.2. Stochastic Processes and Stationarity
1.3. Autocorrelation and Cross-correlation
1.4. Smoothing and Differencing
1.5. A Primer on Likelihood and Bayesian Inference
1.5.1. ML, MAP, and LS Estimation
1.5.2. Traditional Least Squares
1.5.3. Full Bayesian Analysis
1.5.3.1. Reference Bayesian Analysis
1.5.3.2. Conjugate Bayesian Analysis
1.5.4. Nonconjugate Bayesian Analysis
1.5.5. Posterior Sampling
1.5.5.1. The Metropolis-Hastings Algorithm
1.5.5.2. Gibbs Sampling
1.5.5.3. Convergence
1.6. Appendix
1.6.1. The Uniform Distribution
1.6.2. The Univariate Normal Distribution
1.6.3. The Multivariate Normal Distribution
1.6.4. The Gamma and Inverse-gamma Distributions
1.6.5. The Exponential Distribution
1.6.6. The Chi-square Distribution
1.6.7. The Inverse Chi-square Distribution
1.6.8. The Univariate Student-t Distribution
1.6.9. The Multivariate Student-t Distribution
1.7. Problems
2. Traditional time domain models
2.1. Structure of Autoregressions
2.1.1. Stationarity in AR Processes
2.1.2. State-Space Representation of an AR(p)
2.1.3. Characterization of AR(2) Processes
2.1.4. Autocorrelation Structure of an AR(p)
2.1.5. The Partial Autocorrelation Function
2.2. Forecasting
2.3. Estimation in AR Models
2.3.1. Yule-Walker and Maximum Likelihood
2.3.2. Basic Bayesian Inference for AR Models
2.3.3. Simulation of Posterior Distributions
2.3.4. Order Assessment
2.3.5. Initial Values and Missing Data
2.3.6. Imputing Initial Values via Simulation
2.4. Further Issues in Bayesian Inference for AR Models
2.4.1. Sensitivity to the Choice of Prior Distributions
2.4.1.1. Analysis Based on Normal Priors
2.4.1.2. Discrete Normal Mixture Prior and Subset Models
2.4.2. Alternative Prior Distributions
2.4.2.1. Scale-mixtures and Smoothness Priors
2.4.2.2. Priors Based on AR Latent Structure
2.5. Autoregressive Moving Average Models (ARMA)
2.5.1. Structure of ARMA Models
2.5.2. Autocorrelation and Partial Autocorrelation Functions
2.5.3. Inversion of AR Components
2.5.4. Forecasting and Estimation of ARMA Processes
2.5.4.1. Forecasting ARMA Models
2.5.4.2. MLE and Least Squares Estimation
2.5.4.3. State-space Representation
2.5.4.4. Bayesian Estimation of ARMA Processes
2.6. Other Models
2.7. Appendix
2.7.1. The Reversible Jump MCMC Algorithm
2.7.2. The Binomial Distribution
2.7.3. The Beta Distribution
2.7.4. The Dirichlet Distribution
2.7.5. The Beta-binomial Distribution
2.8. Problems
3. The frequency domain
3.1. Harmonic Regression
3.1.1. The One-component Model
3.1.1.1. Reference Analysis
3.1.2. The Periodogram
3.1.3. Some Data Analyses
3.1.4. Several Uncertain Frequency Components
3.1.5. Harmonic Component Models of Known Period
3.1.6. The Periodogram (revisited)
3.2. Some Spectral Theory
3.2.1. Spectral Representation of a Time Series Process
3.2.2. Representation of Autocorrelation Functions
3.2.3. Other Facts and Examples
3.2.4. Traditional Nonparametric Spectral Analysis
3.3. Discussion and Extensions
3.3.1. Long Memory Time Series Models
3.4. Appendix
3.4.1. The F Distribution
3.4.2. Distributions of Quadratic Forms
3.4.3. Orthogonality of Harmonics
3.4.4. Complex Valued Random Variables
3.4.5. Orthogonal Increments Processes
3.4.5.1. Real-valued Orthogonal Increments Processes
3.4.5.2. Complex-valued Orthogonal Increments Processes
3.5. Problems
4. Dynamic linear models
4.1. General Linear Model Structures
4.2. Forecast Functions and Model Forms
4.2.1. Superposition of Models
4.2.2. Time Series Models
4.3. Inference in DLMs: Basic Normal Theory
4.3.1. Sequential Updating: Filtering
4.3.2. Learning a Constant Observation Variance
4.3.3. Missing and Unequally Spaced Data
4.3.4. Forecasting
4.3.5. Retrospective Updating: Smoothing
4.3.6. Discounting for DLM State Evolution Variances
4.3.7. Stochastic Variances and Discount Learning
4.3.7.1. References and additional comments
4.3.8. Intervention, Monitoring, and Model Performance
4.3.8.1. Intervention
4.3.8.2. Model monitoring and performance
4.4. Extensions: Non-Gaussian and Nonlinear Models
4.5. Posterior Simulation: MCMC Algorithms
4.5.1. Examples
4.6. Problems
5. State-space TVAR models
5.1. Time-Varying Autoregressions and Decompositions
5.1.1. Basic DLM Decomposition
5.1.2. Latent Structure in TVAR Models
5.1.2.1. Decompositions for standard autoregressions
5.1.2.2. Decompositions in the TVAR case
5.1.3. Interpreting Latent TVAR Structure
5.2. TVAR Model Specification and Posterior Inference
5.3. Extensions
5.4. Problems
6. SMC methods for state-space models
6.1. General State-Space Models
6.2. Posterior Simulation: Sequential Monte Carlo
6.2.1. Sequential Importance Sampling and Resampling
6.2.2. The Auxiliary Particle Filter
6.2.3. SMC for Combined State and Parameter Estimation
6.2.3.1. Algorithm of Liu and West
6.2.3.2. Storvik's algorithm
6.2.3.3. Practical filtering
6.2.3.4. Particle learning methods
6.2.4. Smoothing
6.2.5. Examples
6.3. Problems
7. Mixture models in time series
7.1. Markov Switching Models
7.1.1. Parameter Estimation
7.1.2. Other Models
7.2. Multiprocess Models
7.2.1. Definitions and Examples
7.2.2. Posterior Inference
7.2.2.1. Posterior inference in class I models
7.2.2.2. Posterior inference in class II models
7.3. Mixtures of General State-Space Models
7.4. Case Study: Detecting Fatigue from EEGs
7.4.1. Structured Priors in Multi-AR Models
7.4.2. Posterior Inference
7.5. Univariate Stochastic Volatility Models
7.5.1. Zero-Mean AR(1) SV Model
7.5.2. Normal Mixture Approximation
7.5.3. Centered Parameterization
7.5.4. MCMC Analysis
7.5.5. Further Comments
7.6. Problems
8. Topics and examples in multiple time series
8.1. Multichannel Modeling of EEG Data
8.1.1. Multiple Univariate TVAR Models
8.1.2. A Simple Factor Model
8.2. Some Spectral Theory
8.2.1. The Cross-Spectrum and Cross-Periodogram
8.3. Dynamic Lag/Lead Models
8.4. Other Approaches
8.5. Problems
9. Vector AR and ARMA models
9.1. Vector Autoregressive Models
9.1.1. State-Space Representation of a VAR Process
9.1.2. The Moving Average Representation of a VAR Process
9.1.3. VAR Time Series Decompositions
9.2. Vector ARMA Models
9.2.1. Autocovariances and Cross-covariances
9.2.2. Partial Autoregression Matrix Function
9.2.3. VAR(1) and DLM Representations
9.3. Estimation in VARMA
9.3.1. Identifiability
9.3.2. Least Squares Estimation
9.3.3. Maximum Likelihood Estimation
9.3.3.1. Conditional likelihood
9.3.3.2. Exact likelihood
9.4. Bayesian VAR, TV-VAR, and DDNMs
9.5. Mixtures of VAR Processes
9.6. PARCOR Representations and Spectral Analysis
9.6.1. Spectral Matrix of VAR and VARMA Processes
9.7. Problems
10. General classes of multivariate dynamic models
10.1. Theory of Multivariate and Matrix Normal DLMs
10.1.1. Multivariate Normal DLMs
10.1.2. Matrix Normal DLMs and Exchangeable Time Series
10.2. Multivariate DLMs and Exchangeable Time Series
10.2.1. Sequential Updating
10.2.2. Forecasting and Retrospective Smoothing
10.3. Learning Cross-Series Covariances
10.3.1. Sequential Updating
10.3.2. Forecasting and Retrospective Smoothing
10.4. Time-Varying Covariance Matrices
10.4.1. Introductory Discussion
10.4.2. Wishart Matrix Discounting Models
10.4.3. Matrix Beta Evolution Model
10.4.4. DLM Extension and Sequential Updating
10.4.5. Retrospective Analysis
10.4.6. Financial Time Series Volatility Example
10.4.6.1. Data and model
10.4.6.2. Trajectories of multivariate stochastic volatility
10.4.6.3. Time-varying principal components analysis
10.4.6.4. Latent components in multivariate volatility
10.4.7. Short-term Forecasting for Portfolio Decisions
10.4.7.1. Additional comments and extensions
10.4.8. Beta-Bartlett Wishart Models for Stochastic Volatility
10.4.8.1. Discount model variants
10.4.8.2. Additional comments and current research areas
10.5. Multivariate Dynamic Graphical Models
10.5.1. Gaussian Graphical Models
10.5.2. Dynamic Graphical Models
10.6. Selected Recent Developments
10.6.1. Simultaneous Graphical Dynamic Models
10.6.2. Models for Multivariate Time Series of Counts
10.6.3. Models for Flows on Dynamic Networks
10.6.4. Dynamic Multiscale Models
10.7. Appendix
10.7.1. The Matrix Normal Distribution
10.7.2. The Wishart Distribution
10.7.3. The Inverse Wishart Distribution
10.7.3.1. Point estimates of variance matrices
10.7.4. The Normal, Inverse Wishart Distribution
10.7.5. The Matrix Normal, Inverse Wishart Distribution
10.7.6. Hyper-Inverse Wishart Distributions
10.7.6.1. Decomposable graphical models
10.7.6.2. The hyper-inverse Wishart distribution
10.7.6.3. Prior and posterior HIW distributions
10.7.6.4. Normal, hyper-inverse Wishart distributions
10.8. Problems
11. Latent factor models
11.1. Introduction
11.2. Static Factor Models
11.2.1. 1-Factor Case
11.2.2. MCMC for Factor Models with One Factor
11.2.3. Example: A 1-Factor Model for Temperature
11.2.4. Factor Models with Multiple Factors
11.2.5. MCMC for the k-Factor Model
11.2.6. Selection of Number of Factors
11.2.7. Example: A k-Factor Model for Temperature
11.3. Multivariate Dynamic Latent Factor Models
11.3.1. Example: A Dynamic 3-Factor Model for Temperature
11.4. Factor Stochastic Volatility
11.4.1. Computations
11.4.2. Factor Stochastic Volatility Model for Exchange Rates
11.5. Spatiotemporal Dynamic Factor Models
11.5.1. Example: Temperature Over the Eastern USA
11.6. Other Extensions and Recent Developments
11.7. Problems
Bibliography
Author Index
Subject Index
Similar Volumes
- This classic textbook builds theoretical statistics from the first principles of probability theory. Starting from the basics of probability, the authors develop the theory of statistical inference using techniques, definitions, and concepts that are statistical and natural extensions…
- The goals of this text are to develop the skills and an appreciation for the richness and versatility of modern time series analysis as a tool for analyzing dependent data. A useful feature of the presentation is the inclusion of nontrivial data sets illustrating the richness of potential applica…
- Probability and Statistical Inference: From Basic Principles to Advanced Models covers aspects of probability, distribution theory, and inference that are fundamental to a proper understanding of data analysis and statistical modelling. It presents these topics in an accessible manner without sacrif…