<p><b>A thoroughly updated guide to matrix algebra and its uses in statistical analysis, featuring SAS®, MATLAB®, and R throughout</b></p> <p>This <i>Second Edition</i> addresses matrix algebra that is useful in the statistical analysis of data as well as within statistics as a whole. The material
Matrix Algebra Useful for Statistics
Authors: Shayle R. Searle, André I. Khuri
- Publisher: Wiley
- Year: 2017
- Language: English
- Pages: 514
- Series: Wiley Series in Probability and Statistics
- Edition: 2nd
- Category: Library
No payment or registration required. For personal study only.
Table of Contents
Cover
Contents
Preface
Preface to the First Edition
Introduction
The Introduction of Matrices Into Statistics
References
About the Companion Website
Part 1 Definitions, Basic Concepts, and Matrix Operations
1 Vector Spaces, Subspaces, and Linear Transformations
1.1 Vector Spaces
1.1.1 Euclidean Space
1.2 Basis of a Vector Space
1.3 Linear Transformations
1.3.1 The Range and Null Spaces of a Linear Transformation
Reference
Exercises
2 Matrix Notation and Terminology
2.1 Plotting of a Matrix
2.2 Vectors and Scalars
2.3 General Notation
Exercises
3 Determinants
3.1 Expansion by Minors
3.1.1 First- and Second-Order Determinants
3.1.2 Third-Order Determinants
3.1.3 n-Order Determinants
3.2 Formal Definition
3.3 Basic Properties
3.3.1 Determinant of a Transpose
3.3.2 Two Rows the Same
3.3.3 Cofactors
3.3.4 Adding Multiples of a Row (Column) to a Row (Column)
3.3.5 Products
3.4 Elementary Row Operations
3.4.1 Factorization
3.4.2 A Row (Column) of Zeros
3.4.3 Interchanging Rows (Columns)
3.4.4 Adding a Row to a Multiple of a Row
3.5 Examples
3.6 Diagonal Expansion
3.7 The Laplace Expansion
3.8 Sums and Differences of Determinants
3.9 A Graphical Representation of a Determinant
References
Exercises
4 Matrix Operations
4.1 The Transpose of a Matrix
4.1.1 A Reflexive Operation
4.1.2 Vectors
4.2 Partitioned Matrices
4.2.1 Example
4.2.2 General Specification
4.2.3 Transposing a Partitioned Matrix
4.2.4 Partitioning Into Vectors
4.3 The Trace of a Matrix
4.4 Addition
4.5 Scalar Multiplication
4.6 Equality and the Null Matrix
4.7 Multiplication
4.7.1 The Inner Product of Two Vectors
4.7.2 A Matrix–Vector Product
4.7.3 A Product of Two Matrices
4.7.4 Existence of Matrix Products
4.7.5 Products With Vectors
4.7.6 Products With Scalars
4.7.7 Products With Null Matrices
4.7.8 Products With Diagonal Matrices
4.7.9 Identity Matrices
4.7.10 The Transpose of a Product
4.7.11 The Trace of a Product
4.7.12 Powers of a Matrix
4.7.13 Partitioned Matrices
4.7.14 Hadamard Products
4.8 The Laws of Algebra
4.8.1 Associative Laws
4.8.2 The Distributive Law
4.8.3 Commutative Laws
4.9 Contrasts With Scalar Algebra
4.10 Direct Sum of Matrices
4.11 Direct Product of Matrices
4.12 The Inverse of a Matrix
4.13 Rank of a Matrix: Some Preliminary Results
4.14 The Number of LIN Rows and Columns in a Matrix
4.15 Determination of the Rank of a Matrix
4.16 Rank and Inverse Matrices
4.17 Permutation Matrices
4.18 Full-Rank Factorization
4.18.1 Basic Development
4.18.2 The General Case
4.18.3 Matrices of Full Row (Column) Rank
References
Exercises
5 Special Matrices
5.1 Symmetric Matrices
5.1.1 Products of Symmetric Matrices
5.1.2 Properties of AA′ and A′A
5.1.3 Products of Vectors
5.1.4 Sums of Outer Products
5.1.5 Elementary Vectors
5.1.6 Skew-Symmetric Matrices
5.2 Matrices Having All Elements Equal
5.3 Idempotent Matrices
5.4 Orthogonal Matrices
5.4.1 Special Cases
5.5 Parameterization of Orthogonal Matrices
5.6 Quadratic Forms
5.7 Positive Definite Matrices
References
Exercises
6 Eigenvalues and Eigenvectors
6.1 Derivation of Eigenvalues
6.1.1 Plotting Eigenvalues
6.2 Elementary Properties of Eigenvalues
6.2.1 Eigenvalues of Powers of a Matrix
6.2.2 Eigenvalues of a Scalar-by-Matrix Product
6.2.3 Eigenvalues of Polynomials
6.2.4 The Sum and Product of Eigenvalues
6.3 Calculating Eigenvectors
6.3.1 Simple Roots
6.3.2 Multiple Roots
6.4 The Similar Canonical Form
6.4.1 Derivation
6.4.2 Uses
6.5 Symmetric Matrices
6.5.1 Eigenvalues All Real
6.5.2 Symmetric Matrices Are Diagonable
6.5.3 Eigenvectors Are Orthogonal
6.5.4 Rank Equals Number of Nonzero Eigenvalues for a Symmetric Matrix
6.6 Eigenvalues of Orthogonal and Idempotent Matrices
6.6.1 Eigenvalues of Symmetric Positive Definite and Positive Semidefinite Matrices
6.7 Eigenvalues of Direct Products and Direct Sums of Matrices
6.8 Nonzero Eigenvalues of AB and BA
References
Exercises
7 Diagonalization of Matrices
7.1 Proving the Diagonability Theorem
7.1.1 The Number of Nonzero Eigenvalues Never Exceeds Rank
7.1.2 A Lower Bound on r(A − λₖI)
7.1.3 Proof of the Diagonability Theorem
7.1.4 All Symmetric Matrices Are Diagonable
7.2 Other Results for Symmetric Matrices
7.2.1 Non-Negative Definite (n.n.d.)
7.2.2 Simultaneous Diagonalization of Two Symmetric Matrices
7.3 The Cayley–Hamilton Theorem
7.4 The Singular-Value Decomposition
References
Exercises
8 Generalized Inverses
8.1 The Moore–Penrose Inverse
8.2 Generalized Inverses
8.2.1 Derivation Using the Singular-Value Decomposition
8.2.2 Derivation Based on Knowing the Rank
8.3 Other Names and Symbols
8.4 Symmetric Matrices
8.4.1 A General Algorithm
8.4.2 The Matrix XโฒX
References
Exercises
9 Matrix Calculus
9.1 Matrix Functions
9.1.1 Function of Matrices
9.1.2 Matrices of Functions
9.2 Iterative Solution of Nonlinear Equations
9.3 Vectors of Differential Operators
9.3.1 Scalars
9.3.2 Vectors
9.3.3 Quadratic Forms
9.4 Vec and Vech Operators
9.4.1 Definitions
9.4.2 Properties of Vec
9.4.3 Vec-Permutation Matrices
9.4.4 Relationships Between Vec and Vech
9.5 Other Calculus Results
9.5.1 Differentiating Inverses
9.5.2 Differentiating Traces
9.5.3 Derivative of a Matrix with Respect to Another Matrix
9.5.4 Differentiating Determinants
9.5.5 Jacobians
9.5.6 Aitken's Integral
9.5.7 Hessians
9.6 Matrices With Elements That Are Complex Numbers
9.7 Matrix Inequalities
References
Exercises
Part 2 Applications of Matrices in Statistics
10 Multivariate Distributions and Quadratic Forms
10.1 Variance-Covariance Matrices
10.2 Correlation Matrices
10.3 Matrices of Sums of Squares and Cross-Products
10.3.1 Data Matrices
10.3.2 Uncorrected Sums of Squares and Products
10.3.3 Means, and the Centering Matrix
10.3.4 Corrected Sums of Squares and Products
10.4 The Multivariate Normal Distribution
10.5 Quadratic Forms and χ²-Distributions
10.5.1 Distribution of Quadratic Forms
10.5.2 Independence of Quadratic Forms
10.5.3 Independence and Chi-Squaredness of Several Quadratic Forms
10.5.4 The Moment and Cumulant Generating Functions for a Quadratic Form
10.6 Computing the Cumulative Distribution Function of a Quadratic Form
10.6.1 Ratios of Quadratic Forms
References
Exercises
11 Matrix Algebra of Full-Rank Linear Models
11.1 Estimation of β by the Method of Least Squares
11.1.1 Estimating the Mean Response and the Prediction Equation
11.1.2 Partitioning of Total Variation Corrected for the Mean
11.2 Statistical Properties of the Least-Squares Estimator
11.2.1 Unbiasedness and Variances
11.2.2 Estimating the Error Variance
11.3 Multiple Correlation Coefficient
11.4 Statistical Properties Under the Normality Assumption
11.5 Analysis of Variance
11.6 The GaussโMarkov Theorem
11.6.1 Generalized Least-Squares Estimation
11.7 Testing Linear Hypotheses
11.7.1 The Use of the Likelihood Ratio Principle in Hypothesis Testing
11.7.2 Confidence Regions and Confidence Intervals
11.8 Fitting Subsets of the x-Variables
11.9 The Use of the R(·|·) Notation in Hypothesis Testing
References
Exercises
12 Less-Than-Full-Rank Linear Models
12.1 General Description
12.2 The Normal Equations
12.2.1 A General Form
12.2.2 Many Solutions
12.3 Solving the Normal Equations
12.3.1 Generalized Inverses of X′X
12.3.2 Solutions
12.4 Expected Values and Variances
12.5 Predicted y-Values
12.6 Estimating the Error Variance
12.6.1 Error Sum of Squares
12.6.2 Expected Value
12.6.3 Estimation
12.7 Partitioning the Total Sum of Squares
12.8 Analysis of Variance
12.9 The R(·|·) Notation
12.10 Estimable Linear Functions
12.10.1 Properties of Estimable Functions
12.10.2 Testable Hypotheses
12.10.3 Development of a Test Statistic for H0
12.11 Confidence Intervals
12.12 Some Particular Models
12.12.1 The One-Way Classification
12.12.2 Two-Way Classification, No Interactions, Balanced Data
12.12.3 Two-Way Classification, No Interactions, Unbalanced Data
12.13 The R(·|·) Notation (Continued)
12.14 Reparameterization to a Full-Rank Model
References
Exercises
13 Analysis of Balanced Linear Models Using Direct Products of Matrices
13.1 General Notation for Balanced Linear Models
13.2 Properties Associated with Balanced Linear Models
13.3 Analysis of Balanced Linear Models
13.3.1 Distributional Properties of Sums of Squares
13.3.2 Estimates of Estimable Linear Functions of the Fixed Effects
References
Exercises
14 Multiresponse Models
14.1 Multiresponse Estimation of Parameters
14.2 Linear Multiresponse Models
14.3 Lack of Fit of a Linear Multiresponse Model
14.3.1 The Multivariate Lack of Fit Test
References
Exercises
Part 3 Matrix Computations and Related Software
15 SAS/IML
15.1 Getting Started
15.2 Defining a Matrix
15.3 Creating a Matrix
15.4 Matrix Operations
15.5 Explanations of SAS Statements Used Earlier in the Text
References
Exercises
16 Use of MATLAB in Matrix Computations
16.1 Arithmetic Operators
16.2 Mathematical Functions
16.3 Construction of Matrices
16.3.1 Submatrices
16.4 Two- and Three-Dimensional Plots
16.4.1 Three-Dimensional Plots
References
Exercises
17 Use of R in Matrix Computations
17.1 Two- and Three-Dimensional Plots
17.1.1 Two-Dimensional Plots
17.1.2 Three-Dimensional Plots
References
Exercises
Appendix Solutions to Exercises
Chapter 1
Chapter 2
Chapter 3
Chapter 4
Chapter 5
Chapter 6
Chapter 7
Chapter 8
Chapter 9
Chapter 10
Chapter 11
Chapter 12
Chapter 13
Chapter 14
Chapter 15
Chapter 16
Chapter 17
Index
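Among the topics listed above, Chapter 3 develops determinants by expansion by minors (Section 3.1). As a minimal illustrative sketch of that idea, not taken from the book, here is a recursive cofactor expansion along the first row in plain Python (O(n!) time, suitable only for small matrices):

```python
def det(m):
    """Determinant via Laplace (cofactor) expansion along the first row.

    A teaching sketch only; practical code would use LU factorization.
    """
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        # Minor: the submatrix with row 0 and column j deleted.
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        # Cofactor sign alternates: (-1)^(0+j).
        total += (-1) ** j * m[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                    # -2
print(det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))   # 24
```

For a diagonal matrix the expansion reduces to the product of the diagonal entries, matching the diagonal-expansion result covered in Section 3.6.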
SIMILAR VOLUMES
WILEY-INTERSCIENCE PAPERBACK SERIES
The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these
"Linear algebra and the study of matrix algorithms have become fundamental to the development of statistical models. Using a vector-space approach, this book provides an understanding of the major concepts that underlie linear algebra and matrix analysis. Each chapter introduces a key topic, such as
<P><EM>A Thorough Guide to Elementary Matrix Algebra and Implementation in R</EM></P> <P><STRONG>Basics of Matrix Algebra for Statistics with R</STRONG> provides a guide to elementary matrix algebra sufficient for undertaking specialized courses, such as multivariate data analysis and linear models.