Kalman Filtering Under Information Theoretic Criteria
Written by Badong Chen, Lujuan Dang, Nanning Zheng, Jose C. Principe
- Publisher
- Springer
- Year
- 2023
- Language
- English
- Pages
- 304
- Category
- Library
No payment or registration required. For personal study only.
Synopsis
This book presents several efficient Kalman filters (linear and nonlinear) derived under information theoretic criteria. These filters achieve excellent performance under complicated non-Gaussian noise with low computational complexity, and have great practical application potential. The book gathers these perspectives and results into a single resource for students and practitioners in the relevant application fields. Each chapter begins with a brief review of fundamentals, presents the material with a focus on the most important properties, and comparatively evaluates the models, discussing the free parameters and their effect on the results. Proofs are provided at the end of each chapter. The book is geared to senior undergraduates with a basic understanding of linear algebra, signal processing, and statistics, as well as graduate students and practitioners with experience in Kalman filtering.
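To give a flavor of the MCC-based filtering the synopsis describes, here is a minimal scalar sketch in Python. It weights the Kalman innovation with a correntropy (Gaussian-kernel) factor so that gross outlier measurements are largely ignored. The function names and the simple one-shot weighting are illustrative assumptions only; the book's MCKF (Chapter 4) is derived from a batch regression model solved by a fixed-point iteration.

```python
import math

def correntropy_weight(e, sigma):
    # Gaussian kernel of the normalized innovation: close to 1 for small
    # errors, close to 0 for outliers. This down-weighting is the source
    # of MCC robustness to heavy-tailed noise.
    return math.exp(-e * e / (2.0 * sigma * sigma))

def mcc_kf_step(x, P, z, F, H, Q, R, sigma=2.0):
    """One predict/update cycle of a scalar Kalman filter whose measurement
    noise is re-weighted by a correntropy kernel. A simplified, hypothetical
    illustration of the MCKF idea, not the book's exact fixed-point algorithm."""
    # Prediction step (standard KF)
    x_pred = F * x
    P_pred = F * P * F + Q
    # Innovation and its correntropy weight
    innov = z - H * x_pred
    w = correntropy_weight(innov / math.sqrt(R), sigma)
    # An outlier gets w ~ 0, which inflates the effective measurement
    # noise R / w and shrinks the Kalman gain toward zero.
    R_eff = R / max(w, 1e-12)
    K = P_pred * H / (H * P_pred * H + R_eff)
    x_new = x_pred + K * innov
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new
```

With a nominal measurement the update behaves almost like a standard Kalman filter; with a gross outlier the kernel weight collapses and the state estimate barely moves.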
Table of Contents
Preface
Contents
Acronyms
1 Introduction
1.1 Estimation
1.2 Kalman Filtering
1.3 Robust Kalman Filtering
1.4 Robust Kalman Filtering Under Information Theoretic Criteria
1.4.1 Information Theoretic Criteria
1.4.1.1 Maximum Correntropy Criterion (MCC)
1.4.1.2 Minimum Error Entropy (MEE)
1.4.2 Application to Kalman Filtering
1.5 Organization of the Book
2 Kalman Filtering
2.1 Linear Kalman Filters
2.1.1 Bayesian Estimation (BE)
2.1.2 Maximum a Posteriori Estimation (MAP)
2.1.3 Linear Minimum Variance Estimation (LMV)
2.1.4 Minimum Variance Estimation (MV)
2.1.5 Weighted Least Squares (WLS)
2.1.6 Batch-Mode Regression
2.2 Nonlinear Kalman Filters
2.2.1 Extended Kalman Filter (EKF)
2.2.2 Unscented Kalman Filter (UKF)
2.2.3 Cubature Kalman Filter (CKF)
2.2.4 Others
2.3 Robust Kalman Filters
2.3.1 H-Infinity Filter (HIF)
2.3.2 Adaptive Kalman Filter (AKF)
2.3.3 Student's t-Based Kalman Filter (SKF)
2.3.4 Particle Filter (PF)
2.3.5 Huber-Based Kalman Filter (HKF)
2.4 Conclusion
3 Information Theoretic Criteria
3.1 Maximum Correntropy Criterion (MCC)
3.1.1 Correntropy
3.1.1.1 Definition
3.1.2 Maximum Correntropy Criterion
3.1.2.1 MCC Estimation
3.1.2.2 Fixed-Point Solution
3.1.2.3 Robustness Analysis
3.1.2.4 Extended Versions of MCC
3.2 Minimum Error Entropy (MEE)
3.2.1 Renyi's Entropy
3.2.2 Minimum Error Entropy
3.2.2.1 MEE Estimation
3.2.2.2 Fixed-Point Solution
3.2.2.3 Robustness Analysis
3.3 Minimum Error Entropy with Fiducial Points (MEEF)
3.3.1 MEEF Estimation
3.3.2 Fixed-Point Solution
3.4 Appendices
3.4.1 Appendix 3.A
3.4.2 Appendix 3.B
3.4.3 Appendix 3.C
3.4.4 Appendix 3.D
3.4.5 Appendix 3.E
3.4.6 Appendix 3.F
3.4.7 Appendix 3.G
3.4.8 Appendix 3.H
4 Kalman Filtering Under Information Theoretic Criteria
4.1 Kalman Filter (KF)
4.1.1 Prediction
4.1.2 Update
4.2 Maximum Correntropy Kalman Filter (MCKF)
4.2.1 MCKF Algorithm
4.2.2 Computational Complexity
4.2.3 Convergence Issue
4.2.4 Illustrative Example
4.3 Other Approaches for MCC-Based Kalman Filtering
4.3.1 Correntropy Filter (C-Filter)
4.3.2 Modified Correntropy Filter (MC-Filter)
4.3.3 Maximum Correntropy Criterion Kalman Filter (MCC-KF)
4.3.4 Measurement-Specific Correntropy Filter (MSCF)
4.4 Generalized Maximum Correntropy Kalman Filter (GMCKF)
4.4.1 GMCKF Algorithm
4.4.2 Computational Complexity
4.4.3 Parameter Selection
4.4.4 Illustrative Example
4.5 Minimum Error Entropy Kalman Filter (MEE-KF)
4.5.1 MEE-KF Algorithm
4.5.2 Computational Complexity
4.5.3 Convergence Issue
4.5.4 Illustrative Example
4.6 Conclusion
4.7 Appendices
4.7.1 Appendix 4.A
4.7.2 Appendix 4.B
5 Extended Kalman Filtering Under Information Theoretic Criteria
5.1 Extended Kalman Filter (EKF)
5.1.1 Prediction
5.1.2 Update
5.2 Maximum Correntropy Extended Kalman Filter (MCEKF)
5.2.1 Linear Regression MCEKF
5.2.2 Nonlinear Regression MCEKF
5.2.3 Illustrative Example
5.3 Minimum Error Entropy Extended Kalman Filter (MEE-EKF)
5.3.1 MEE-EKF Algorithm
5.3.2 Illustrative Example
5.4 Conclusion
5.5 Appendices
5.5.1 Appendix 5.A
5.5.2 Appendix 5.B
6 Unscented Kalman Filter Under Information Theoretic Criteria
6.1 Unscented Kalman Filter (UKF)
6.1.1 Prediction
6.1.2 Update
6.2 Maximum Correntropy Unscented Filter (MCUF)
6.2.1 MCUF Algorithm
6.2.2 Illustrative Example
6.3 Maximum Correntropy Unscented Kalman Filter (MCUKF)
6.3.1 MCUKF Algorithm
6.3.2 Illustrative Example
6.4 Unscented Kalman Filter with Generalized Correntropy Loss (GCL-UKF)
6.4.1 GCL-UKF Algorithm
6.4.2 Enhanced GCL-UKF (EnGCL-UKF) Algorithm
6.4.3 Application to Power System
6.4.4 Illustrative Example
6.4.4.1 Case 1: Non-Gaussian Noise with Outliers in Measurements
6.4.4.2 Case 2: Bad Data Condition
6.4.4.3 Case 3: Sudden Load Change Condition
6.5 Minimum Error Entropy Unscented Kalman Filter (MEE-UKF)
6.5.1 MEE-UKF Algorithm
6.5.2 Illustrative Example
6.5.2.1 Gaussian Noise with Random Outliers in Process and Measurement
6.5.2.2 Bimodal Gaussian Mixture Noise with Random Outliers in Process and Measurement
6.5.2.3 Sudden State Change and Bad Measurement Data
6.5.2.4 Different Parameters of MEE-UKF
6.5.2.5 Computational Complexity
6.6 Conclusion
7 Cubature Kalman Filtering Under Information Theoretic Criteria
7.1 Cubature Kalman Filter (CKF)
7.1.1 Prediction
7.1.2 Update
7.2 Maximum Correntropy Cubature Kalman Filter (MCCKF)
7.3 Maximum Correntropy Square-Root Cubature Kalman Filter (MCSCKF)
7.3.1 MCSCKF Algorithm
7.3.2 Illustrative Example
7.4 Cubature Kalman Filter Under Minimum Error Entropy with Fiducial Points (MEEF-CKF)
7.4.1 MEEF-CKF Algorithm
7.4.2 Determination of the Parameters
7.4.3 Convergence Issue
7.4.4 Illustrative Example
7.4.4.1 Case 1: Gaussian Noise with Outliers
7.4.4.2 Case 2: Bimodal Gaussian Mixture Noises
7.4.4.3 Case 3: Bimodal Gaussian Mixture Noises with Outliers
7.4.4.4 Case 4: Real-World Noise
7.5 Conclusion
8 Additional Topics in Kalman Filtering Under Information Theoretic Criteria
8.1 Maximum Correntropy Kalman Filter with State Constraints (MCKF-SC)
8.1.1 Linear Constraint
8.1.1.1 Projection Estimation
8.1.1.2 Probability Density Function Truncation
8.1.2 Nonlinear Constraint
8.1.2.1 Linear Approximation
8.1.2.2 Second-Order Approximation
8.1.3 Illustrative Example
8.2 Correntropy-Based Divided Difference Filter (CDD)
8.2.1 First-Order Divided Difference Filter (DD1)
8.2.2 Second-Order Divided Difference Filter (DD2)
8.2.3 Correntropy-Based DD1 and DD2
8.2.4 Illustrative Example
8.2.4.1 State and Measurement Models
8.2.4.2 Simulation Results
8.3 Dual Extended Kalman Filter Under Minimum Error Entropy with Fiducial Points (MEEF-DEKF)
8.3.1 State-Space Model
8.3.2 Dual Extended Kalman Filter
8.3.3 MEEF-DEKF
8.3.3.1 Batch Regression Model
8.3.3.2 MEEF-DEKF
8.3.4 Illustrative Examples
8.3.4.1 Time-Varying Channel Tracking and Equalization
8.3.4.2 Nonstationary Signal Prediction
8.4 Kernel Kalman Filtering with Conditional Embedding Operator and Maximum Correntropy Criterion (KKF-CEO-MCC)
8.4.1 Kernel Methods
8.4.2 Statistical Embeddings in Reproducing Kernel Hilbert Space
8.4.2.1 Marginal Distribution Embedding
8.4.2.2 Joint Distribution Embedding
8.4.2.3 Conditional Distribution Embedding
8.4.3 Kernel Kalman Filtering with Conditional Embedding Operator (KKF-CEO)
8.4.4 KKF-CEO-MCC
8.4.4.1 KKF-CEO-MCC in RKHS
8.4.4.2 1-Step Predictor of KKF-CEO-MCC
8.4.4.3 t-Step Predictor of KKF-CEO-MCC
8.4.5 Simplified Versions of KKF-CEO-MCC
8.4.5.1 KKF-CEO-MCC-O
8.4.5.2 KKF-CEO-MCC-NA
8.4.6 Computational Complexity
8.4.7 Illustrative Examples
8.4.7.1 Noisy IKEDA Chaotic Time-Series Estimation
8.4.7.2 Noisy Lorenz Time-Series Prediction
8.4.7.3 Noisy Sunspot Time-Series Prediction
8.5 Conclusion
8.6 Appendices
8.6.1 Appendix 8.A
8.6.2 Appendix 8.B
8.6.3 Appendix 8.C
8.6.4 Appendix 8.D
References
Index