
Applied Regression Analysis and Generalized Linear Models

✍ Scribed by John Fox


Publisher: SAGE Publ.
Year: 2015
Tongue: English
Leaves: 817
Edition: 3
Category: Library


✦ Synopsis


Combining a modern, data-analytic perspective with a focus on applications in the social sciences, the Third Edition of Applied Regression Analysis and Generalized Linear Models provides in-depth coverage of regression analysis, generalized linear models, and closely related methods, such as bootstrapping and missing data. Updated throughout, this Third Edition includes new chapters on mixed-effects models for hierarchical and longitudinal data. Although the text is largely accessible to readers with a modest background in statistics and mathematics, author John Fox also presents more advanced material in optional sections and chapters throughout the book.

✦ Table of Contents


Cover
Dedication Page
Title Page
Copyright Page
Brief Contents
Table of Contents
Preface
About the Author
1. Statistical Models and Social Science
1.1 Statistical Models and Social Reality
1.2 Observation and Experiment
1.3 Populations and Samples
Exercises
Summary
Recommended Reading
I. DATA CRAFT
2. What Is Regression Analysis?
2.1 Preliminaries
2.2 Naive Nonparametric Regression
2.3 Local Averaging
Exercises
Summary
3. Examining Data
3.1 Univariate Displays
3.1.1 Histograms
3.1.2 Nonparametric Density Estimation
3.1.3 Quantile-Comparison Plots
3.1.4 Boxplots
3.2 Plotting Bivariate Data
3.3 Plotting Multivariate Data
3.3.1 Scatterplot Matrices
3.3.2 Coded Scatterplots
3.3.3 Three-Dimensional Scatterplots
3.3.4 Conditioning Plots
Exercises
Summary
Recommended Reading
4. Transforming Data
4.1 The Family of Powers and Roots
4.2 Transforming Skewness
4.3 Transforming Nonlinearity
4.4 Transforming Nonconstant Spread
4.5 Transforming Proportions
4.6 Estimating Transformations as Parameters
Exercises
Summary
Recommended Reading
II. LINEAR MODELS AND LEAST SQUARES
5. Linear Least-Squares Regression
5.1 Simple Regression
5.1.1 Least-Squares Fit
5.1.2 Simple Correlation
5.2 Multiple Regression
5.2.1 Two Explanatory Variables
5.2.2 Several Explanatory Variables
5.2.3 Multiple Correlation
5.2.4 Standardized Regression Coefficients
Exercises
Summary
6. Statistical Inference for Regression
6.1 Simple Regression
6.1.1 The Simple-Regression Model
6.1.2 Properties of the Least-Squares Estimator
6.1.3 Confidence Intervals and Hypothesis Tests
6.2 Multiple Regression
6.2.1 The Multiple-Regression Model
6.2.2 Confidence Intervals and Hypothesis Tests
6.3 Empirical Versus Structural Relations
6.4 Measurement Error in Explanatory Variables
Exercises
Summary
7. Dummy-Variable Regression
7.1 A Dichotomous Factor
7.2 Polytomous Factors
7.2.1 Coefficient Quasi-Variances
7.3 Modeling Interactions
7.3.1 Constructing Interaction Regressors
7.3.2 The Principle of Marginality
7.3.3 Interactions With Polytomous Factors
7.3.4 Interpreting Dummy-Regression Models With Interactions
7.3.5 Hypothesis Tests for Main Effects and Interactions
7.4 A Caution Concerning Standardized Coefficients
Exercises
Summary
8. Analysis of Variance
8.1 One-Way Analysis of Variance
8.1.1 Example: Duncan's Data on Occupational Prestige
8.1.2 The One-Way ANOVA Model
8.2 Two-Way Analysis of Variance
8.2.1 Patterns of Means in the Two-Way Classification
8.2.2 Two-Way ANOVA by Dummy Regression
8.2.3 The Two-Way ANOVA Model
8.2.4 Fitting the Two-Way ANOVA Model to Data
8.2.5 Testing Hypotheses in Two-Way ANOVA
8.2.6 Equal Cell Frequencies
8.2.7 Some Cautionary Remarks
8.3 Higher-Way Analysis of Variance
8.3.1 The Three-Way Classification
8.3.2 Higher-Order Classifications
8.3.3 Empty Cells in ANOVA
8.4 Analysis of Covariance
8.5 Linear Contrasts of Means
Exercises
Summary
9. Statistical Theory for Linear Models
9.1 Linear Models in Matrix Form
9.1.1 Dummy Regression and Analysis of Variance
9.1.2 Linear Contrasts
9.2 Least-Squares Fit
9.2.1 Deficient-Rank Parametrization of Linear Models
9.3 Properties of the Least-Squares Estimator
9.3.1 The Distribution of the Least-Squares Estimator
9.3.2 The Gauss-Markov Theorem
9.3.3 Maximum-Likelihood Estimation
9.4 Statistical Inference for Linear Models
9.4.1 Inference for Individual Coefficients
9.4.2 Inference for Several Coefficients
9.4.3 General Linear Hypotheses
9.4.4 Joint Confidence Regions
9.5 Multivariate Linear Models
9.6 Random Regressors
9.7 Specification Error
9.8 Instrumental Variables and Two-Stage Least Squares
9.8.1 Instrumental-Variables Estimation in Simple Regression
9.8.2 Instrumental-Variables Estimation in Multiple Regression
9.8.3 Two-Stage Least Squares
Exercises
Summary
Recommended Reading
10. The Vector Geometry of Linear Models
10.1 Simple Regression
10.1.1 Variables in Mean Deviation Form
10.1.2 Degrees of Freedom
10.2 Multiple Regression
10.3 Estimating the Error Variance
10.4 Analysis-of-Variance Models
Exercises
Summary
Recommended Reading
III. LINEAR-MODEL DIAGNOSTICS
11. Unusual and Influential Data
11.1 Outliers, Leverage, and Influence
11.2 Assessing Leverage: Hat-Values
11.3 Detecting Outliers: Studentized Residuals
11.3.1 Testing for Outliers in Linear Models
11.3.2 Anscombe's Insurance Analogy
11.4 Measuring Influence
11.4.1 Influence on Standard Errors
11.4.2 Influence on Collinearity
11.5 Numerical Cutoffs for Diagnostic Statistics
11.5.1 Hat-Values
11.5.2 Studentized Residuals
11.5.3 Measures of Influence
11.6 Joint Influence
11.6.1 Added-Variable Plots
11.6.2 Forward Search
11.7 Should Unusual Data Be Discarded?
11.8 Some Statistical Details
11.8.1 Hat-Values and the Hat-Matrix
11.8.2 The Distribution of the Least-Squares Residuals
11.8.3 Deletion Diagnostics
11.8.4 Added-Variable Plots and Leverage Plots
Exercises
Summary
Recommended Reading
12. Diagnosing Non-Normality, Nonconstant Error Variance, and Nonlinearity
12.1 Non-Normally Distributed Errors
12.1.1 Confidence Envelopes by Simulated Sampling
12.2 Nonconstant Error Variance
12.2.1 Residual Plots
12.2.2 Weighted-Least-Squares Estimation
12.2.3 Correcting OLS Standard Errors for Nonconstant Variance
12.2.4 How Nonconstant Error Variance Affects the OLS Estimator
12.3 Nonlinearity
12.3.1 Component-Plus-Residual Plots
12.3.2 Component-Plus-Residual Plots for Models With Interactions
12.3.3 When Do Component-Plus-Residual Plots Work?
12.4 Discrete Data
12.4.1 Testing for Nonlinearity ("Lack of Fit")
12.4.2 Testing for Nonconstant Error Variance
12.5 Maximum-Likelihood Methods
12.5.1 Box-Cox Transformation of Y
12.5.2 Box-Tidwell Transformation of the Xs
12.5.3 Nonconstant Error Variance Revisited
12.6 Structural Dimension
Exercises
Summary
Recommended Reading
13. Collinearity and Its Purported Remedies
13.1 Detecting Collinearity
13.1.1 Principal Components
13.1.2 Generalized Variance Inflation
13.2 Coping With Collinearity: No Quick Fix
13.2.1 Model Respecification
13.2.2 Variable Selection
13.2.3 Biased Estimation
13.2.4 Prior Information About the Regression Coefficients
13.2.5 Some Comparisons
Exercises
Summary
IV. GENERALIZED LINEAR MODELS
14. Logit and Probit Models for Categorical Response Variables
14.1 Models for Dichotomous Data
14.1.1 The Linear-Probability Model
14.1.2 Transformations of p: Logit and Probit Models
14.1.3 An Unobserved-Variable Formulation
14.1.4 Logit and Probit Models for Multiple Regression
14.1.5 Estimating the Linear Logit Model
14.2 Models for Polytomous Data
14.2.1 The Polytomous Logit Model
14.2.2 Nested Dichotomies
14.2.3 Ordered Logit and Probit Models
14.2.4 Comparison of the Three Approaches
14.3 Discrete Explanatory Variables and Contingency Tables
14.3.1 The Binomial Logit Model
Exercises
Summary
Recommended Reading
15. Generalized Linear Models
15.1 The Structure of Generalized Linear Models
15.1.1 Estimating and Testing GLMs
15.2 Generalized Linear Models for Counts
15.2.1 Models for Overdispersed Count Data
15.2.2 Loglinear Models for Contingency Tables
15.3 Statistical Theory for Generalized Linear Models
15.3.1 Exponential Families
15.3.2 Maximum-Likelihood Estimation of Generalized Linear Models
15.3.3 Hypothesis Tests
15.3.4 Effect Displays
15.4 Diagnostics for Generalized Linear Models
15.4.1 Outlier, Leverage, and Influence Diagnostics
15.4.2 Nonlinearity Diagnostics
15.4.3 Collinearity Diagnostics
15.5 Analyzing Data From Complex Sample Surveys
Exercises
Summary
Recommended Reading
V. EXTENDING LINEAR AND GENERALIZED LINEAR MODELS
16. Time-Series Regression and Generalized Least Squares
16.1 Generalized Least-Squares Estimation
16.2 Serially Correlated Errors
16.2.1 The First-Order Autoregressive Process
16.2.2 Higher-Order Autoregressive Processes
16.2.3 Moving-Average and Autoregressive-Moving-Average Processes
16.2.4 Partial Autocorrelations
16.3 GLS Estimation With Autocorrelated Errors
16.3.1 Empirical GLS Estimation
16.3.2 Maximum-Likelihood Estimation
16.4 Correcting OLS Inference for Autocorrelated Errors
16.5 Diagnosing Serially Correlated Errors
16.6 Concluding Remarks
Exercises
Summary
Recommended Reading
17. Nonlinear Regression
17.1 Polynomial Regression
17.1.1 A Closer Look at Quadratic Surfaces
17.2 Piece-wise Polynomials and Regression Splines
17.3 Transformable Nonlinearity
17.4 Nonlinear Least Squares
17.4.1 Minimizing the Residual Sum of Squares
17.4.2 An Illustration: U.S. Population Growth
Exercises
Summary
Recommended Reading
18. Nonparametric Regression
18.1 Nonparametric Simple Regression: Scatterplot Smoothing
18.1.1 Kernel Regression
18.1.2 Local-Polynomial Regression
18.1.3 Smoothing Splines
18.2 Nonparametric Multiple Regression
18.2.1 Local-Polynomial Multiple Regression
18.2.2 Additive Regression Models
18.3 Generalized Nonparametric Regression
18.3.1 Local Likelihood Estimation
18.3.2 Generalized Additive Models
Exercises
Summary
Recommended Reading
19. Robust Regression
19.1 M Estimation
19.1.1 Estimating Location
19.1.2 M Estimation in Regression
19.2 Bounded-Influence Regression
19.3 Quantile Regression
19.4 Robust Estimation of Generalized Linear Models
19.5 Concluding Remarks
Exercises
Summary
Recommended Reading
20. Missing Data in Regression Models
20.1 Missing Data Basics
20.1.1 An Illustration
20.2 Traditional Approaches to Missing Data
20.3 Maximum-Likelihood Estimation for Data Missing at Random
20.3.1 The EM Algorithm
20.4 Bayesian Multiple Imputation
20.4.1 Inference for Individual Coefficients
20.4.2 Inference for Several Coefficients
20.4.3 Practical Considerations
20.4.4 Example: A Regression Model for Infant Mortality
20.5 Selection Bias and Censoring
20.5.1 Truncated- and Censored-Normal Distributions
20.5.2 Heckman's Selection-Regression Model
20.5.3 Censored-Regression Models
Exercises
Summary
Recommended Reading
21. Bootstrapping Regression Models
21.1 Bootstrapping Basics
21.2 Bootstrap Confidence Intervals
21.2.1 Normal-Theory Intervals
21.2.2 Percentile Intervals
21.2.3 Improved Bootstrap Intervals
21.3 Bootstrapping Regression Models
21.4 Bootstrap Hypothesis Tests
21.5 Bootstrapping Complex Sampling Designs
21.6 Concluding Remarks
Exercises
Summary
Recommended Reading
22. Model Selection, Averaging, and Validation
22.1 Model Selection
22.1.1 Model Selection Criteria
22.1.2 An Illustration: Baseball Salaries
22.1.3 Comments on Model Selection
22.2 Model Averaging
22.2.1 Application to the Baseball Salary Data
22.2.2 Comments on Model Averaging
22.3 Model Validation
22.3.1 An Illustration: Refugee Appeals
22.3.2 Comments on Model Validation
Exercises
Summary
Recommended Reading
VI. MIXED-EFFECTS MODELS
23. Linear Mixed-Effects Models for Hierarchical and Longitudinal Data
23.1 Hierarchical and Longitudinal Data
23.2 The Linear Mixed-Effects Model
23.3 Modeling Hierarchical Data
23.3.1 Formulating a Mixed Model
23.3.2 Random-Effects One-Way Analysis of Variance
23.3.3 Random-Coefficients Regression Model
23.3.4 Coefficients-as-Outcomes Model
23.4 Modeling Longitudinal Data
23.5 Wald Tests for Fixed Effects
23.6 Likelihood-Ratio Tests of Variance and Covariance Components
23.7 Centering Explanatory Variables, Contextual Effects, and Fixed-Effects Models
23.7.1 Fixed Versus Random Effects
23.8 BLUPs
23.9 Statistical Details
23.9.1 The Laird-Ware Model in Matrix Form
23.9.2 Wald Tests Revisited
Exercises
Summary
Recommended Reading
24. Generalized Linear and Nonlinear Mixed-Effects Models
24.1 Generalized Linear Mixed Models
24.1.1 Example: Migraine Headaches
24.1.2 Statistical Details
24.2 Nonlinear Mixed Models
24.2.1 Example: Recovery From Coma
24.2.2 Estimating Nonlinear Mixed Models
Exercises
Summary
Recommended Reading
Appendix A
References
Author Index
Subject Index
Data Set Index


📜 SIMILAR VOLUMES


Beyond Multiple Linear Regression: Applied Generalized Linear Models and Multilevel Models in R
✍ Paul Roback, Julie Legler 📂 Library 📅 2020 🏛 Chapman and Hall/CRC 🌐 English

Beyond Multiple Linear Regression: Applied Generalized Linear Models and Multilevel Models in R is designed for undergraduate students who have successfully completed a multiple linear regression course, helping them develop an expanded modeling toolkit that includes non-normal responses and correlated …
