Probability and Statistics (2nd Edition)
✍ By Morris H. DeGroot
- Publisher
- Addison-Wesley
- Year
- 1986
- Language
- English
- Pages
- 736
- Edition
- Subsequent
- Category
- Library
✦ Synopsis
This revision of the well-respected text presents a balance of classical and Bayesian methods. The theoretical and practical sides of both probability and statistics are considered. New content areas include the Borel–Kolmogorov Paradox, Confidence Bands for the Regression Line, the Correction for Continuity, and the Delta Method.
✦ Table of Contents
PROBABILITY AND STATISTICS, 2ND ED.
Title Page
Copyright Page
Dedication
Preface
Contents
Chapter 1. Introduction to Probability
1.1 The History of Probability
References
1.2 Interpretations of Probability
The Frequency Interpretation of Probability
The Classical Interpretation of Probability
The Subjective Interpretation of Probability
1.3 Experiments and Events
Types of Experiments
The Mathematical Theory of Probability
1.4 Set Theory
The Sample Space
Relations of Set Theory
The Empty Set
Operations of Set Theory
Exercises
1.5 The Definition of Probability
Axioms and Basic Theorems
Further Properties of Probability
Exercises
1.6 Finite Sample Spaces
Requirements of Probabilities
Simple Sample Spaces
Exercises
1.7 Counting Methods
Multiplication Rule
Permutations
The Birthday Problem
Exercises
1.8 Combinatorial Methods
Combinations
Binomial Coefficients
The Tennis Tournament
Exercises
1.9 Multinomial Coefficients
Exercises
1.10 The Probability of a Union of Events
The Union of Three Events
The Union of a Finite Number of Events
The Matching Problem
Exercises
1.11 Independent Events
Independence of Two Events
Independence of Several Events
The Collector's Problem
Exercises
1.12 Statistical Swindles
Misleading Use of Statistics
Perfect Forecasts
Guaranteed Winners
1.13 Supplementary Exercises
Chapter 2. Conditional Probability
2.1 The Definition of Conditional Probability
Conditional Probability for Independent Events
The Multiplication Rule for Conditional Probabilities
The Game of Craps
Exercises
2.2 Bayes' Theorem
Probability and Partitions
Statement and Proof of Bayes' Theorem
Prior and Posterior Probabilities
Computation of Posterior Probabilities in More Than One Stage
Combining a Prior Probability with an Observation
Exercises
2.3 Markov Chains
Stochastic Processes
Markov Chains
The Transition Matrix
The Initial Probability Vector
Exercises
2.4 The Gambler's Ruin Problem
Statement of the Problem
Solution of the Problem
Exercises
2.5 Choosing the Best
Optimal Selection
The Form of the Best Procedure
The Best Procedure
The Simple but Interesting Limiting Value
Parlor Games
2.6 Supplementary Exercises
Chapter 3. Random Variables and Distributions
3.1 Random Variables and Discrete Distributions
Definition of a Random Variable
The Distribution of a Random Variable
Discrete Distributions
The Uniform Distribution on Integers
The Binomial Distribution
Exercises
3.2 Continuous Distributions
The Probability Density Function
Nonuniqueness of the p.d.f.
The Uniform Distribution on an Interval
Mixed Distributions
Exercises
3.3 The Distribution Function
Definition and Basic Properties
Determining Probabilities from the Distribution Function
The d.f. of a Discrete Distribution
The d.f. of a Continuous Distribution
Exercises
3.4 Bivariate Distributions
Discrete Joint Distributions
Continuous Joint Distributions
Mixed Bivariate Distributions
Bivariate Distribution Functions
Exercises
3.5 Marginal Distributions
Deriving a Marginal p.f. or a Marginal p.d.f.
Independent Random Variables
Exercises
3.6 Conditional Distributions
Discrete Conditional Distributions
Continuous Conditional Distributions
Construction of the Joint Distribution
Exercises
3.7 Multivariate Distributions
Joint Distributions
Marginal Distributions
Conditional Distributions
Exercises
3.8 Functions of a Random Variable
Variable with a Discrete Distribution
Variable with a Continuous Distribution
Direct Derivation of the Probability Density Function
The Probability Integral Transformation
Tables of Random Digits
Exercises
3.9 Functions of Two or More Random Variables
Variables with a Discrete Joint Distribution
Variables with a Continuous Joint Distribution
Transformation of a Multivariate Probability Density Function
Linear Transformations
The Sum of Two Random Variables
The Range
Exercises
3.10 The Borel–Kolmogorov Paradox
Conditioning on a Particular Value
Conditioning on the Equality of Two Random Variables
3.11 Supplementary Exercises
Chapter 4. Expectation
4.1 The Expectation of a Random Variable
Expectation for a Discrete Distribution
Expectation for a Continuous Distribution
Interpretation of the Expectation
The Expectation of a Function
Exercises
4.2 Properties of Expectations
Basic Theorems
The Mean of a Binomial Distribution
Expected Number of Matches
Expectation of a Product
Expectation for Nonnegative Discrete Distributions
Exercises
4.3 Variance
Definitions of the Variance and the Standard Deviation
Properties of the Variance
The Variance of the Binomial Distribution
Exercises
4.4 Moments
Existence of Moments
Moment Generating Functions
Properties of Moment Generating Functions
Exercises
4.5 The Mean and the Median
The Median
Comparison of the Mean and the Median
Minimizing the Mean Squared Error
Minimizing the Mean Absolute Error
Exercises
4.6 Covariance and Correlation
Covariance
Correlation
Properties of Covariance and Correlation
Exercises
4.7 Conditional Expectation
Definition and Basic Properties
Prediction
Exercises
4.8 The Sample Mean
The Markov and Chebyshev Inequalities
Properties of the Sample Mean
The Law of Large Numbers
Exercises
4.9 Utility
Utility Functions
Examples of Utility Functions
Selling a Lottery Ticket
Exercises
4.10 Supplementary Exercises
Chapter 5. Special Distributions
5.1 Introduction
5.2 The Bernoulli and Binomial Distributions
The Bernoulli Distribution
Bernoulli Trials
The Binomial Distribution
Exercises
5.3 The Hypergeometric Distribution
Definition of the Hypergeometric Distribution
Extending the Definition of Binomial Coefficients
The Mean and Variance for a Hypergeometric Distribution
Comparison of Sampling Methods
Exercises
5.4 The Poisson Distribution
Definition and Properties of the Poisson Distribution
The Poisson Process
The Poisson Approximation to the Binomial Distribution
Exercises
5.5 The Negative Binomial Distribution
Definition of the Negative Binomial Distribution
The Geometric Distribution
Other Properties of Negative Binomial and Geometric Distributions
Exercises
5.6 The Normal Distribution
Importance of the Normal Distribution
Properties of the Normal Distribution
The Standard Normal Distribution
Comparisons of Normal Distributions
Linear Combinations of Normally Distributed Variables
Exercises
5.7 The Central Limit Theorem
Statement of the Theorem
Convergence in Distribution
Exercises
5.8 The Correction for Continuity
Approximating a Discrete Distribution by a Continuous Distribution
Approximating a Histogram
Exercises
5.9 The Gamma Distribution
The Gamma Function
The Gamma Distribution
The Exponential Distribution
Life Tests
Exercises
5.10 The Beta Distribution
Definition of the Beta Distribution
Moments of the Beta Distribution
Exercises
5.11 The Multinomial Distribution
Definition of the Multinomial Distribution
Relation Between the Multinomial and Binomial Distributions
Means, Variances, and Covariances
Exercises
5.12 The Bivariate Normal Distribution
Definition of the Bivariate Normal Distribution
Marginal and Conditional Distributions
Linear Combinations
Exercises
5.13 Supplementary Exercises
Chapter 6. Estimation
6.1 Statistical Inference
Nature of Statistical Inference
Parameters
Statistical Decision Problems
Experimental Design
References
6.2 Prior and Posterior Distributions
The Prior Distribution
The Posterior Distribution
The Likelihood Function
Sequential Observations
Exercises
6.3 Conjugate Prior Distributions
Sampling from a Bernoulli Distribution
Sampling from a Poisson Distribution
Sampling from a Normal Distribution
Sampling from an Exponential Distribution
Exercises
6.4 Bayes Estimators
Nature of an Estimation Problem
Loss Functions
Definition of a Bayes Estimator
Different Loss Functions
The Bayes Estimate for Large Samples
Exercises
6.5 Maximum Likelihood Estimators
Limitations of Bayes Estimators
Definition of a Maximum Likelihood Estimator
Examples of Maximum Likelihood Estimators
Exercises
6.6 Properties of Maximum Likelihood Estimators
Invariance
Numerical Computation
Consistency
Sampling Plans
The Likelihood Principle
Exercises
6.7 Sufficient Statistics
Definition of a Statistic
Definition of a Sufficient Statistic
The Factorization Criterion
Exercises
6.8 Jointly Sufficient Statistics
Definition of Jointly Sufficient Statistics
Minimal Sufficient Statistics
Maximum Likelihood Estimators and Bayes Estimators as Sufficient Statistics
Exercises
6.9 Improving an Estimator
The Mean Squared Error of an Estimator
Conditional Expectation When a Sufficient Statistic Is Known
Limitation of the Use of Sufficient Statistics
Exercises
6.10 Supplementary Exercises
Chapter 7. Sampling Distributions of Estimators
7.1 The Sampling Distribution of a Statistic
Statistics and Estimators
Purpose of the Sampling Distribution
Exercises
7.2 The Chi-Square Distribution
Definition of the Distribution
Properties of the Distribution
Exercises
7.3 Joint Distribution of the Sample Mean and Sample Variance
Independence of the Sample Mean and Sample Variance
Orthogonal Matrices
Proof of the Independence of the Sample Mean and Sample Variance
Estimation of the Mean and Variance
Exercises
7.4 The t Distribution
Definition of the Distribution
Moments of the t Distribution
Relation to Random Samples from a Normal Distribution
Exercises
7.5 Confidence Intervals
Confidence Intervals for the Mean of a Normal Distribution
Confidence Intervals for an Arbitrary Parameter
Shortcoming of Confidence Intervals
Exercises
7.6 Bayesian Analysis of Samples from a Normal Distribution
The Precision of a Normal Distribution
A Conjugate Family of Prior Distributions
The Marginal Distribution of the Mean
A Numerical Example
Exercises
7.7 Unbiased Estimators
Definition of an Unbiased Estimator
Unbiased Estimation of the Variance
Estimation of the Variance of a Normal Distribution
Discussion of the Concept of Unbiased Estimation
Exercises
7.8 Fisher Information
Definition and Properties of Fisher Information
The Information Inequality
Efficient Estimators
Unbiased Estimators with Minimum Variance
Properties of Maximum Likelihood Estimators for Large Samples
The Delta Method
Exercises
7.9 Supplementary Exercises
Chapter 8. Testing Hypotheses
8.1 Problems of Testing Hypotheses
The Null and Alternative Hypotheses
The Critical Region
The Power Function
Simple and Composite Hypotheses
Exercises
8.2 Testing Simple Hypotheses
Two Types of Errors
Optimal Tests
Choosing a Level of Significance
Bayes Test Procedures
Exercises
8.3 Multidecision Problems
Finite Number of Parameter Values and Finite Number of Decisions
Bayes Decision Procedures
Exercises
8.4 Uniformly Most Powerful Tests
Definition of a Uniformly Most Powerful Test
Monotone Likelihood Ratio
One-Sided Alternatives
Two-Sided Alternatives
Exercises
8.5 Selecting a Test Procedure
General Form of the Procedure
Selection of the Test Procedure
Composite Null Hypothesis
Unbiased Tests
Equivalence of Confidence Sets and Tests
Exercises
8.6 The t Test
Testing Hypotheses About the Mean of a Normal Distribution When the Variance Is Unknown
Derivation of the t Test
Properties of the t Test
Testing with a Two-Sided Alternative
Confidence Intervals from the t Test
Exercises
8.7 Discussion of the Methodology of Testing Hypotheses
Tail Areas
Tail Areas for a Two-Sided Alternative Hypothesis
Statistically Significant Results
The Bayesian Approach
Exercises
8.8 The F Distribution
Definition of the F Distribution
Properties of the F Distribution
Comparing the Variances of Two Normal Distributions
Derivation of the F Test
Properties of the F Test
Exercises
8.9 Comparing the Means of Two Normal Distributions
Derivation of the Two-Sample t Test
Properties of the Two-Sample t Test
Two-Sided Alternatives and Unequal Variances
Exercises
8.10 Supplementary Exercises
Chapter 9. Categorical Data and Nonparametric Methods
9.1 Tests of Goodness-of-Fit
Description of Nonparametric Problems
Categorical Data
Testing Hypotheses About a Proportion
Testing Hypotheses About a Continuous Distribution
Discussion of the Test Procedure
Exercises
9.2 Goodness-of-Fit for Composite Hypotheses
Composite Null Hypotheses
Determining the Maximum Likelihood Estimates
Testing Whether a Distribution Is Normal
Testing Composite Hypotheses About an Arbitrary Distribution
Exercises
9.3 Contingency Tables
Independence in Contingency Tables
Exercises
9.4 Tests of Homogeneity
Samples from Several Populations
Comparing Two or More Proportions
Correlated 2 × 2 Tables
Exercises
9.5 Simpson's Paradox
Comparing Treatments
Aggregation and Disaggregation
The Paradox Explained
Exercises
9.6 The Kolmogorov–Smirnov Tests
The Sample Distribution Function
The Kolmogorov–Smirnov Test of a Simple Hypothesis
The Kolmogorov–Smirnov Test for Two Samples
Exercises
9.7 Inferences about the Median and Other Quantiles
Confidence Intervals and Tests for the Median
Confidence Intervals and Tests for Quantiles
Exercises
9.8 Robust Estimation
Estimating the Median
Trimmed Means
Comparison of the Estimators
Large-Sample Properties of the Sample Median
Exercises
9.9 Paired Observations
Comparative Experiments and Matched Pairs
The Sign Test
The Wilcoxon Signed-Ranks Test
Ties
Exercises
9.10 Ranks for Two Samples
Comparing Two Distributions
The Wilcoxon–Mann–Whitney Ranks Test
Ties
Exercises
9.11 Supplementary Exercises
Chapter 10. Linear Statistical Models
10.1 The Method of Least Squares
Fitting a Straight Line
The Least-Squares Line
Fitting a Polynomial by the Method of Least Squares
Fitting a Linear Function of Several Variables
Exercises
10.2 Regression
Regression Functions
Simple Linear Regression
The Distribution of the Least-Squares Estimators
The Gauss–Markov Theorem for Simple Linear Regression
Design of the Experiment
Prediction
Exercises
10.3 Tests of Hypotheses and Confidence Intervals in Simple Linear Regression
Joint Distribution of the Estimators
Tests of Hypotheses about the Regression Coefficients
Confidence Intervals and Confidence Sets
The Analysis of Residuals
Exercises
10.4 The Regression Fallacy
Use of the Term "Regression"
The Normal Distribution
10.5 Multiple Regression
The General Linear Model
Maximum Likelihood Estimators
Explicit Form of the Estimators
Mean Vector and Covariance Matrix
The Gauss–Markov Theorem for the General Linear Model
The Joint Distribution of the Estimators
Testing Hypotheses
Multiple Linear Regression
Screening Regression Equations
Exercises
10.6 Analysis of Variance
The One-Way Layout
Partitioning a Sum of Squares
Testing Hypotheses
Exercises
10.7 The Two-Way Layout
The Two-Way Layout with One Observation in Each Cell
Estimating the Parameters
Partitioning the Sum of Squares
Testing Hypotheses
Exercises
10.8 The Two-Way Layout with Replications
The Two-Way Layout with K Observations in Each Cell
Partitioning the Sum of Squares
Testing Hypotheses
The Two-Way Layout with Unequal Numbers of Observations in the Cells
Exercises
10.9 Supplementary Exercises
References
Tables
Answers to Even-Numbered Exercises
Index
📜 SIMILAR VOLUMES
The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér–Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, e
These exercises are designed to show the power and uses of probability and statistical methods. Over 550 problems illustrate applications in mathematics, economics, industry, biology, and physics. Answers are included for those working the problems on their own.