High-Dimensional Covariance Estimation: With High-Dimensional Data
By Mohsen Pourahmadi
- Publisher: Wiley
- Year: 2013
- Language: English
- Pages: 206
- Edition: 1
- Category: Library
Synopsis
Methods for estimating sparse and large covariance matrices
Covariance and correlation matrices play fundamental roles in every aspect of the analysis of multivariate data collected from a variety of fields including business and economics, health care, engineering, and environmental and physical sciences. High-Dimensional Covariance Estimation provides accessible and comprehensive coverage of the classical and modern approaches for estimating covariance matrices as well as their applications to the rapidly developing areas lying at the intersection of statistics and machine learning.
Recently, the classical sample covariance methodologies have been modified and improved upon to meet the needs of statisticians and researchers dealing with large correlated datasets. High-Dimensional Covariance Estimation focuses on the methodologies based on shrinkage, thresholding, and penalized likelihood with applications to Gaussian graphical models, prediction, and mean-variance portfolio management. The book relies heavily on regression-based ideas and interpretations to connect and unify many existing methods and algorithms for the task.
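To make the shrinkage and thresholding ideas mentioned above concrete, here is a minimal numpy sketch of two regularized covariance estimators. The data, the tuning constants `t` and `alpha`, and the function names are illustrative assumptions, not taken from the book:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n = 50 observations of p = 20 variables whose true covariance
# is sparse (identity plus one off-diagonal pair).
p, n = 20, 50
Sigma = np.eye(p)
Sigma[0, 1] = Sigma[1, 0] = 0.5
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

S = np.cov(X, rowvar=False)  # p x p sample covariance matrix

def hard_threshold(S, t):
    """Zero out off-diagonal entries with |s_ij| <= t; keep the diagonal."""
    T = np.where(np.abs(S) > t, S, 0.0)
    np.fill_diagonal(T, np.diag(S))
    return T

def linear_shrinkage(S, alpha):
    """Shrink S toward a scaled identity target (Ledoit-Wolf-style),
    with alpha fixed by hand here purely for illustration."""
    mu = np.trace(S) / S.shape[0]
    return (1 - alpha) * S + alpha * mu * np.eye(S.shape[0])

S_thr = hard_threshold(S, t=0.3)
S_shr = linear_shrinkage(S, alpha=0.2)
```

Note that linear shrinkage preserves the trace of `S` exactly, while thresholding leaves the diagonal untouched and sparsifies only the off-diagonal entries.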
High-Dimensional Covariance Estimation features chapters on:
- Data, Sparsity, and Regularization
- Regularizing the Eigenstructure
- Banding, Tapering, and Thresholding
- Covariance Matrices
- Sparse Gaussian Graphical Models
- Multivariate Regression
The book is an ideal resource for researchers in statistics, mathematics, business and economics, computer sciences, and engineering, as well as a useful text or supplement for graduate-level courses in multivariate analysis, covariance estimation, statistical learning, and high-dimensional data analysis.
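The banding and tapering estimators treated in the "Banding, Tapering, and Thresholding" chapter can be sketched in a few lines of numpy. This is a hedged illustration under assumed toy data; the function names, bandwidth `k`, and the linear taper weights are my own choices, not the book's:

```python
import numpy as np

def band(S, k):
    """Banding estimator: keep entries within k of the main diagonal,
    zero the rest (Bickel-Levina-style regularization)."""
    p = S.shape[0]
    i, j = np.indices((p, p))
    return np.where(np.abs(i - j) <= k, S, 0.0)

def taper(S, k):
    """Linear tapering: downweight entries smoothly with distance from
    the diagonal instead of truncating them abruptly."""
    p = S.shape[0]
    i, j = np.indices((p, p))
    w = np.clip(1.0 - np.abs(i - j) / (k + 1.0), 0.0, 1.0)
    return w * S

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 8))     # 40 observations of 8 variables
S = np.cov(X, rowvar=False)
S_band = band(S, k=1)                # tridiagonal estimate
S_taper = taper(S, k=2)              # smoothly decaying off-diagonals
```

Both estimators exploit an assumed ordering of the variables (e.g., in time or space), which is what distinguishes them from the order-invariant thresholding approach.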
Table of Contents
HIGH-DIMENSIONAL COVARIANCE ESTIMATION
CONTENTS
PREFACE
I MOTIVATION AND THE BASICS
1 INTRODUCTION
1.1 Least Squares and Regularized Regression
1.2 Lasso: Survival of the Bigger
1.3 Thresholding the Sample Covariance Matrix
1.4 Sparse PCA and Regression
1.6 Cholesky Decomposition and Regression
1.7 The Bigger Picture: Latent Factor Models
1.8 Further Reading
2 DATA, SPARSITY, AND REGULARIZATION
2.1 Data Matrix: Examples
2.2 Shrinking the Sample Covariance Matrix
2.3 Distribution of the Sample Eigenvalues
2.4 Regularizing Covariances Like a Mean
2.5 The Lasso Regression
2.6 Lasso: Variable Selection and Prediction
2.7 Lasso: Degrees of Freedom and BIC
2.8 Some Alternatives to the Lasso Penalty
3 COVARIANCE MATRICES
3.1 Definition and Basic Properties
3.2 The Spectral Decomposition
3.3 Structured Covariance Matrices
3.4 Functions of a Covariance Matrix
3.5 PCA: The Maximum Variance Property
3.6 Modified Cholesky Decomposition
3.7 Latent Factor Models
3.8 GLM for Covariance Matrices
3.9 GLM via the Cholesky Decomposition
3.10.1 The Incoherency Problem in Incomplete Longitudinal Data
3.10.2 The Incomplete Data and the EM Algorithm
3.11 A Data Example: Fruit Fly Mortality Rate
3.12 Simulating Random Correlation Matrices
3.13 Bayesian Analysis of Covariance Matrices
II COVARIANCE ESTIMATION: REGULARIZATION
4 REGULARIZING THE EIGENSTRUCTURE
4.1 Shrinking the Eigenvalues
4.2 Regularizing the Eigenvectors
4.3 A Duality between PCA and SVD
4.4 Implementing Sparse PCA: A Data Example
4.5 Sparse Singular Value Decomposition (SSVD)
4.6 Consistency of PCA
4.7 Principal Subspace Estimation
4.8 Further Reading
5 SPARSE GAUSSIAN GRAPHICAL MODELS
5.1 Covariance Selection Models: Two Examples
5.2 Regression Interpretation of Entries of Σ⁻¹
5.3 Penalized Likelihood and Graphical Lasso
5.4 Penalized Quasi-Likelihood Formulation
5.5 Penalizing the Cholesky Factor
5.6 Consistency and Sparsistency
5.7 Joint Graphical Models
5.8 Further Reading
6 BANDING, TAPERING, AND THRESHOLDING
6.1 Banding the Sample Covariance Matrix
6.2 Tapering the Sample Covariance Matrix
6.3 Thresholding the Sample Covariance Matrix
6.4 Low-Rank Plus Sparse Covariance Matrices
6.5 Further Reading
7 MULTIVARIATE REGRESSION: ACCOUNTING FOR CORRELATION
7.1 Multivariate Regression and LS Estimators
7.2 Reduced Rank Regressions (RRR)
7.3 Regularized Estimation of B
7.4 Joint Regularization of (B, Ω)
7.5.1 Intraday Electricity Prices
7.5.2 Predicting Asset Returns
7.6 Further Reading
BIBLIOGRAPHY
INDEX
WILEY SERIES IN PROBABILITY AND STATISTICS
SIMILAR VOLUMES
This book presents covariance matrix estimation and related aspects of random matrix theory. It focuses on the sample covariance matrix estimator and provides a holistic description of its properties under two asymptotic regimes: the traditional one, and the high-dimensional regime that better fits
Ordinary differential equations (ODEs), differential-algebraic equations (DAEs) and partial differential equations (PDEs) are among the forms of mathematics most widely used in science and engineering. Each of these equation types is a focal point for international collaboration and research. This b
"Geometric Structure of High-Dimensional Data and Dimensionality Reduction" adopts data geometry as a framework to address various methods of dimensionality reduction. In addition to the introduction to well-known linear methods, the book moreover stresses the recently developed nonlinear meth