Estimation theory is a product of need and technology. As a result, it is an integral part of many branches of science and engineering. To help readers differentiate among the rich collection of estimation methods and algorithms, this book describes many of the important estimation methods in detail.
Lessons in Estimation Theory for Signal Processing, Communications, and Control, Second Edition
By Jerry M. Mendel, Department of Electrical Engineering, University of Southern California
- Publisher: Prentice Hall
- Year: 1995
- Language: English
- Pages: 582
- Series: Prentice Hall Signal Processing Series
- Edition: 2nd edition
- Category: Library
Synopsis
This book covers key topics in parameter estimation and state estimation, with supplemental lessons on sufficient statistics and statistical estimation of parameters, higher-order statistics, and a review of state-variable models. It also links computations to MATLAB and its associated toolboxes. A small number of important estimation M-files, which do not presently appear in any MathWorks toolbox, are included in an appendix.
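To give a flavor of the algorithms the book treats, recursive least squares in its covariance form (Lesson 5) fits a parameter vector one measurement at a time instead of batch-inverting the normal equations. The sketch below is a generic illustration assuming unit measurement-noise variance and a diffuse initial covariance; it is not one of the book's M-files, and all names in it are our own.

```python
import numpy as np

def rls(H, z, p0=1e6):
    """Recursive least squares, covariance form, for z = H @ theta + noise.

    Each row of H and entry of z is processed as one measurement.
    p0 sets the (large) initial covariance, acting as a diffuse prior.
    """
    n = H.shape[1]
    theta = np.zeros((n, 1))          # running parameter estimate
    P = p0 * np.eye(n)                # estimate covariance
    for hk, zk in zip(H, z):
        h = hk.reshape(n, 1)
        # Gain vector: how strongly this measurement corrects the estimate
        K = P @ h / (1.0 + (h.T @ P @ h).item())
        # Correct the estimate by the gain times the innovation
        theta = theta + K * (zk - (h.T @ theta).item())
        # Shrink the covariance along the measured direction
        P = (np.eye(n) - K @ h.T) @ P
    return theta.ravel()

# Usage: with noiseless data the recursion converges to the batch
# least-squares solution, here the true parameters themselves.
rng = np.random.default_rng(0)
H = rng.standard_normal((200, 2))
theta_true = np.array([2.0, -1.0])
z = H @ theta_true
theta_hat = rls(H, z)
```

The covariance form avoids any matrix inversion beyond a scalar division per step, which is exactly why the book contrasts it with the information form (which propagates the inverse covariance instead).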
Table of Contents
Cover......Page 1
Contents......Page 8
Preface......Page 18
Summary......Page 22
Introduction......Page 23
Coverage......Page 24
Philosophy......Page 27
Computation......Page 28
Summary Questions......Page 29
Introduction......Page 30
Examples......Page 31
Notational Preliminaries......Page 39
Computation......Page 41
Supplementary Material: Convolutional Model in Reflection Seismology......Page 42
Summary Questions......Page 44
Problems......Page 45
Introduction......Page 48
Objective Function and Problem Statement......Page 50
Derivation of Estimator......Page 51
Scale Changes and Normalization of Data......Page 57
Computation......Page 58
Supplementary Material: Least Squares, Total Least Squares, and Constrained Total Least Squares......Page 59
Summary Questions......Page 60
Problems......Page 61
Introduction......Page 65
Singular-value Decomposition......Page 66
Using SVD to Calculate θ_LS......Page 70
Supplementary Material: Pseudoinverse......Page 72
Summary Questions......Page 74
Problems......Page 75
Introduction......Page 79
Recursive Least Squares: Information Form......Page 80
Matrix Inversion Lemma......Page 83
Recursive Least Squares: Covariance Form......Page 84
Which Form to Use......Page 85
Computation......Page 87
Supplementary Material: Derivation of Start-up Conditions for Recursive Algorithms......Page 88
Summary Questions......Page 90
Problems......Page 91
Summary......Page 95
Introduction......Page 96
Unbiasedness......Page 97
Efficiency......Page 98
Supplementary Material: Proof of Theorem 6-4......Page 107
Summary Questions......Page 108
Problems......Page 109
Summary......Page 112
Stochastic Convergence......Page 113
Asymptotic Unbiasedness......Page 115
Consistency......Page 116
Asymptotic Distributions......Page 120
Asymptotic Normality......Page 122
Asymptotic Efficiency......Page 124
Summary Questions......Page 125
Problems......Page 126
Summary......Page 129
Small-sample Properties of Least-squares Estimators......Page 130
Large-sample Properties of Least-squares Estimators......Page 136
Summary Questions......Page 138
Problems......Page 139
Summary......Page 142
Problem Statement and Objective Function......Page 143
Derivation of Estimator......Page 145
Comparison of θ_BLU and θ_WLS......Page 146
Some Properties of θ_BLU......Page 147
Recursive BLUEs......Page 151
Supplementary Material: Lagrange's Method......Page 152
Summary Questions......Page 153
Problems......Page 154
Likelihood Defined......Page 158
Likelihood Ratio......Page 161
Multiple Hypotheses......Page 162
Decision Making Using Likelihood Ratio......Page 163
Supplementary Material: Transformation of Variables and Probability......Page 165
Problems......Page 166
Summary......Page 168
Maximum-likelihood Method and Estimates......Page 169
Properties of Maximum-likelihood Estimates......Page 172
The Linear Model [H(k) Deterministic]......Page 173
A Log-likelihood Function for an Important Dynamical System......Page 175
Computation......Page 177
Summary Questions......Page 178
Problems......Page 179
Introduction......Page 185
Jointly Gaussian Random Vectors......Page 186
The Conditional Density Function......Page 187
Properties of Multivariate Gaussian Random Variables......Page 189
Properties of Conditional Mean......Page 190
Problems......Page 192
Summary......Page 194
Objective Function and Problem Statement......Page 195
Derivation of Estimator......Page 196
Properties of Mean-squared Estimators when θ and Z(k) Are Jointly Gaussian......Page 199
Mean-squared Estimator for the Generic Linear and Gaussian Model......Page 200
Best Linear Unbiased Estimation, Revisited......Page 202
Supplementary Material: The Conditional Mean Estimator......Page 205
A Nonlinear Estimator......Page 206
Summary Questions......Page 209
Problems......Page 210
Introduction......Page 213
General Results......Page 214
The Generic Linear and Gaussian Model......Page 216
Computation......Page 220
Supplementary Material: Elements of Binary Detection Theory......Page 221
Summary Questions......Page 225
Problems......Page 226
Summary......Page 232
Definition and Properties of Discrete-time Gauss–Markov Random Sequences......Page 233
The Basic State-variable Model......Page 236
Properties of the Basic State-variable Model......Page 237
Signal-to-Noise Ratio......Page 241
Computation......Page 243
Summary Questions......Page 244
Problems......Page 245
Summary......Page 248
Single-stage Predictor......Page 249
A General State Predictor......Page 250
The Innovations Process......Page 254
Supplementary Material: Linear Prediction......Page 256
Summary Questions......Page 259
Problems......Page 260
Summary......Page 263
Introduction......Page 264
A Preliminary Result......Page 266
The Kalman Filter......Page 267
Observations about the Kalman Filter......Page 269
Supplementary Material: MAP Derivation of the Kalman Filter......Page 274
Summary Questions......Page 276
Problems......Page 277
Summary......Page 280
Examples......Page 281
Supplementary Material: Applications of Kalman Filtering......Page 292
Summary Questions......Page 297
Problems......Page 298
Summary......Page 300
Steady-state Kalman Filter......Page 301
Single-channel Steady-state Kalman Filter......Page 303
Relationships between the Steady-state Kalman Filter and a Finite Impulse Response Digital Wiener Filter......Page 307
Comparisons of Kalman and Wiener Filters......Page 314
Supplementary Material: Some Linear System Concepts......Page 315
The Levinson Algorithm......Page 316
Summary Questions......Page 321
Problems......Page 322
Summary......Page 325
Three Types of Smoothers......Page 326
Single-stage Smoother......Page 327
Double-stage Smoother......Page 330
Single- and Double-stage Smoothers as General Smoothers......Page 333
Summary Questions......Page 335
Problems......Page 336
Summary......Page 338
Fixed-interval Smoothing......Page 339
Fixed-point Smoothing......Page 344
Fixed-lag Smoothing......Page 346
Supplementary Material: Second-order Gauss–Markov Random Sequences......Page 349
Minimum-variance Deconvolution (MVD)......Page 350
Steady-state MVD Filter......Page 353
Relationship between Steady-state MVD Filter and an Infinite Impulse Response Digital Wiener Deconvolution Filter......Page 359
Maximum-likelihood Deconvolution......Page 361
Summary Questions......Page 362
Problems......Page 363
Summary......Page 366
Biases......Page 367
Correlated Noises......Page 368
Colored Noises......Page 371
Perfect Measurements: Reduced-order Estimators......Page 375
Computation......Page 378
Supplementary Material: Derivation of Equation (22-14)......Page 380
Summary Questions......Page 381
Problems......Page 382
Summary......Page 385
A Dynamical Model......Page 386
Linear Perturbation Equations......Page 388
Discretization of a Linear Time-varying State-variable Model......Page 392
Computation......Page 395
Supplementary Material: Proof of Theorem 23-1......Page 396
Summary Questions......Page 397
Problems......Page 398
Summary......Page 405
Iterated Least Squares......Page 406
Extended Kalman Filter......Page 407
Application to Parameter Estimation......Page 412
Computation......Page 413
Supplementary Material: EKF for a Nonlinear Discrete-time System......Page 414
Problems......Page 415
Summary......Page 418
A Log-likelihood Function for the Basic State-variable Model......Page 419
On Computing θ_ML......Page 421
A Steady-state Approximation......Page 425
Computation......Page 429
Summary Questions......Page 430
Problems......Page 431
Introduction......Page 434
System Description......Page 435
Statistics of the State Vector......Page 436
Notation and Problem Statement......Page 437
The KalmanβBucy Filter......Page 438
Derivation of KBF Using a Formal Limiting Procedure......Page 439
Steady-state KBF......Page 442
An Important Application for the KBF......Page 444
Computation......Page 447
Supplementary Material: Proof of Theorem 26-1......Page 448
Derivation of the KBF when the Structure of the Filter Is Prespecified......Page 449
Summary Questions......Page 452
Problems......Page 453
Summary......Page 457
Concept of Sufficient Statistics......Page 458
Exponential Families of Distributions......Page 460
Exponential Families and Maximum-likelihood Estimation......Page 462
Sufficient Statistics and Uniformly Minimum-variance Unbiased Estimation......Page 465
Summary Questions......Page 469
Problems......Page 470
Summary......Page 471
Introduction......Page 472
Definitions of Higher-order Statistics......Page 473
Properties of Cumulants......Page 485
Proof of Theorem B-3......Page 487
Problems......Page 490
Summary......Page 494
Estimation of Cumulants......Page 495
Estimation of Bispectrum......Page 497
Applications of Higher-order Statistics to Linear Systems......Page 499
Computation......Page 511
Summary Questions......Page 512
Problems......Page 513
Summary......Page 520
Notions of State, State Variables, and State Space......Page 521
Constructing State-variable Representations......Page 524
Solutions of State Equations for Time-invariant Systems......Page 530
Miscellaneous Properties......Page 533
Summary Questions......Page 535
Problems......Page 536
APPENDIX A: Glossary of Major Results......Page 539
Introduction......Page 545
Recursive Weighted Least-squares Estimator......Page 546
Kalman Filter......Page 547
Kalman Predictor......Page 550
Suboptimal Filter......Page 553
Suboptimal Predictor......Page 555
Fixed-interval Smoother......Page 557
APPENDIX C: Answers to Summary Questions......Page 560
References......Page 563
Index......Page 574
Similar Books
Over the past decade, interest in computational or non-symbolic artificial intelligence has grown. The algorithms involved have the ability to learn from past experience, and therefore have significant potential in the adaptive control of signals and systems.
In the late 1980s, revolutionary advances in digital halftoning enabled inkjet printers to achieve much higher image fidelity. The rapid rate of progress has resulted in numerous breakthroughs scattered throughout the literature, rendering old technologies obsolete.
- Paperback reprint of one of the most respected classics in the history of engineering publication
- Together with the reprint of Part I and the new Part IV, this will be the most complete treatment of the subject available
Miller and Childers have focused on creating a clear presentation of foundational concepts with specific applications to signal processing and communications, clearly the two areas of most interest to students and instructors in this course. It is aimed at graduate students as well as practicing engineers.
Nonparametric kernel estimators apply to the statistical analysis of independent or dependent sequences of random variables and for samples of continuous or discrete processes. The optimization of these procedures is based on the choice of a bandwidth that minimizes an estimation error.