Advancements in Bayesian Methods and Implementation

✍ Edited by Alastair G. Young, Arni S. R. Srinivasa Rao, and C. R. Rao


Publisher
Academic Press
Year
2022
Tongue
English
Leaves
322
Series
Handbook of Statistics, 47
Category
Library


✦ Synopsis


Advancements in Bayesian Methods and Implementation, Volume 47 in the Handbook of Statistics series, highlights new advances in the field. This volume presents chapters on a variety of timely topics, including Fisher information, the Cramér–Rao bound and the Bayesian paradigm; compound beta-binomial distribution functions; MCMC for GLMMs; Bayesian signal processing; the mathematical theory of Bayesian statistics where all models are wrong; machine learning and Bayes; nonparametric Bayes; Bayesian testing; data analysis with humans; variational inference; the functional horseshoe; and generalized Bayes.

✦ Table of Contents


Front Cover
Advancements in Bayesian Methods and Implementation
Copyright
Contents
Contributors
Preface
Chapter 1: Direct Gibbs posterior inference on risk minimizers: Construction, concentration, and calibration
1. Introduction
2. Gibbs posterior distributions
2.1. Problem setup
2.2. Definition
2.3. FAQs
2.3.1. Why a learning rate? Is it really needed?
2.3.2. Difference between Gibbs and misspecified Bayes?
2.3.3. Principles behind the Gibbs posterior construction?
2.4. Illustrations
2.4.1. Quantiles
2.4.2. Minimum clinically important difference
3. Asymptotic theory
3.1. Objectives and general strategies
3.2. Consistency
3.3. Concentration rates
3.4. Distributional approximations
4. Learning rate selection
5. Numerical examples
5.1. Quantile regression
5.2. Classification
5.3. Nonlinear regression
6. Further details
6.1. Things we did not discuss
6.2. Open problems
7. Conclusion
Acknowledgments
Appendix
A.1. Proofs
A.1.1. Proof of Theorem 1
A.1.2. Proof of Theorem 2
References
Chapter 2: Bayesian selective inference
1. Introduction
2. Bayes and selection
2.1. Fixed and random parameters
3. Noninformative priors for selective inference
3.1. Noninformative priors for exponential families
3.1.1. Inference for a selected normal mean with an unknown variance
3.1.2. Inference for the winner
4. Discussion
References
Chapter 3: Dependent Bayesian multiple hypothesis testing
1. Introduction
2. Bayesian multiple hypothesis testing
2.1. Preliminaries and setup
2.2. The decision problem
3. Dependent multiple testing
3.1. New error based criterion
3.2. Choice of G1, …, Gm
4. Simulation study
4.1. The postulated Bayesian model
4.2. Comparison criteria
4.3. Comparison of the results
5. Discussion
References
Chapter 4: A new look at Bayesian uncertainty
1. Introduction
2. Missing data
3. Parametric martingale sequences
3.1. Langevin posterior
4. Nonparametric martingale distributions
5. Illustrations
5.1. Parametric case
5.2. Nonparametric case
6. Mathematical theory
7. Discussion
Acknowledgments
References
Chapter 5: 50 shades of Bayesian testing of hypotheses
1. Introduction
2. Bayesian hypothesis testing
3. Improper priors united against hypothesis testing
4. The Jeffreys–Lindley paradox
5. Posterior predictive p-values
6. A modest proposal
7. Conclusion
Acknowledgments
Abbreviations
References
Chapter 6: Inference approach to ground states of quantum systems
1. Introduction
2. The Jaynes maximum entropy methodology: Brief resume
3. The quantum maximum entropy approach
3.1. Preliminaries
4. Properties of SQ that make our approximate maximum entropy approach wave functions reasonable ones
4.1. SQ is a true Shannon's ignorance function
4.2. Subject to the known quantities bk, the maximum value of SQ is unique
4.3. The entropy SQ obeys an H-theorem
4.4. Our SQ-ground-state wave functions respect the virial theorem
4.5. The SQ-ground-state wave functions respect hypervirial theorems
4.6. Saturation
4.7. Speculation
5. Coulomb potential
5.1. Harmonic oscillator
5.2. Morse potential
5.3. Ground state of the quartic oscillator
5.4. A possible MEM extension
6. Noncommuting observables
7. Other entropic or information measures
8. Conclusions
References
Chapter 7: MCMC for GLMMs
1. Introduction
2. Likelihood function for GLMMs
3. Conditional simulation for GLMMs
3.1. MALA for GLMMs
3.2. HMC for GLMMs
3.3. Data augmentation for GLMMs
3.3.1. Data augmentation for probit mixed models
3.3.2. Data augmentation for logistic mixed models
4. MCMC for Bayesian GLMMs
4.1. MALA and HMC for Bayesian GLMMs
4.2. Data augmentation for Bayesian GLMMs
4.2.1. Data augmentation for Bayesian probit mixed models
4.2.2. Data augmentation for Bayesian logistic mixed models
5. A numerical example
6. Discussion
References
Chapter 8: Sparsity-aware Bayesian inference and its applications
1. Introduction
1.1. Quick summary of existing methods for sparse signal recovery
1.2. Bayesian approaches: Motivation and related literature
2. The hierarchical Bayesian framework
2.1. Gaussian scale mixtures and sparse Bayesian learning
2.2. SBL framework
2.2.1. Likelihood, prior, and posterior in SBL
2.2.2. Expectation maximization
2.3. Case study: Wireless channel estimation and SBL
3. Joint-sparse signal recovery
3.1. The MSBL algorithm
3.2. Expectation maximization in MSBL
3.3. An interesting interpretation of the MSBL cost function
3.4. A covariance-matching framework for sparse support recovery using MMVs
3.5. Examples of covariance-matching algorithms for sparse support recovery
3.5.1. The MSBL algorithm
3.5.2. Covariance matching using Rényi divergence
3.5.3. Co-LASSO
4. Exploiting intervector correlation
4.1. Intervector correlation: The Kalman SBL algorithm
4.2. Online sparse vector recovery
4.3. Case study (continued): Wireless channel estimation and the KSBL algorithm
5. Intravector correlations: The nested SBL algorithm
5.1. Nested SBL (B IB)
6. Quantized sparse signal recovery
7. Other extensions
7.1. Decentralized SBL
7.2. Dictionary learning
7.3. Relationship with robust principal component analysis and sparse + low-rank decomposition
7.3.1. SBL-based algorithms in the presence of colored noise
7.3.2. Covariance matching-based algorithms
7.4. Deep unfolded SBL
8. Discussion and future outlook
References
Chapter 9: Mathematical theory of Bayesian statistics where all models are wrong
1. Introduction
2. Mathematical theory of Bayesian statistics
2.1. DGP, model, and prior
2.2. Generalization loss and free energy
2.3. Regular theory
2.4. Singular theory
2.4.1. Real log canonical threshold (RLCT)
2.4.2. Geometrical property of RLCT
2.5. Phase transitions
2.5.1. Phase transition with sample size
2.5.2. Phase transitions with hyperparameter
3. Applications to statistics and machine learning
3.1. Model evaluation
3.1.1. Estimation of generalization loss
3.1.2. Estimation of free energy
3.2. Prior evaluation
3.2.1. Generalization loss and prior
3.2.2. Free energy and prior
3.3. Not i.i.d. cases
3.3.1. Exchangeable cases
3.3.2. Conditional independent cases
3.3.3. Time sequences
4. Conclusion
Abbreviations
References
Chapter 10: Geometry in sampling methods: A review on manifold MCMC and particle-based variational inference methods
1. Geometry consideration in sampling: Why bother?
2. Manifold and related concepts
2.1. Manifold
2.2. Tangent vector and vector field
2.3. Cotangent vector and differential form
2.4. Riemannian manifold
2.5. Measure
2.6. Divergence and Laplacian
2.7. Manifold embedding
3. Markov chain Monte Carlo on Riemannian manifolds
3.1. Technical description of general MCMC dynamics
3.2. Riemannian MCMC in coordinate space
3.2.1. Langevin dynamics
3.2.2. Hamiltonian dynamics
3.2.3. Stochastic gradient Hamiltonian dynamics
3.3. Riemannian MCMC in embedded space
4. Particle-based variational inference methods
4.1. Stein variational gradient descent
4.2. The Wasserstein space
4.3. Geometric view of particle-based variational inference methods
4.3.1. View from the Wasserstein space
4.3.2. View from the Stein geometry
4.4. Geometric view of MCMC dynamics and relation to ParVI methods
4.4.1. Langevin dynamics
4.4.2. Hamiltonian dynamics
4.4.3. General MCMC dynamics
4.4.3.1. Fiber-Riemannian structure and fiber-gradient flow
4.4.3.2. Poisson structure and Hamiltonian flow
4.4.3.3. fRP structure and fGH flow
4.4.3.4. Inspiration for more general ParVI methods
4.5. Variants and techniques inspired by the geometric view
4.5.1. Other methods for Wasserstein gradient flow simulation
4.5.2. Riemannian-manifold support space
4.5.2.1. Riemannian SVGD
4.5.2.2. Mirrored SVGD
4.5.3. Accelerated gradient flow
4.5.4. Treatment of the kernel
5. Conclusion
Acknowledgments
References
Index
Back Cover


📜 SIMILAR VOLUMES


Advances In Biometrics: Modern Methods A
✍ G.R. Sinha 📂 Library 📅 2019 🏛 Springer 🌐 English

This book provides a framework for robust and novel biometric techniques, along with implementation and design strategies. The theory, principles, pragmatic and modern methods, and future directions of biometrics are presented, along with in-depth coverage of biometric applications in driverless car


Bayesian implementation
✍ Thomas R. Palfrey, Sanjay Srivastava 📂 Library 📅 1993; 2018 🏛 CRC Press, Routledge 🌐 English

The first part of Bayesian Implementation presents a basic model of the Bayesian implementation problem and summarizes and explains recent developments in this branch of implementation theory. Substantive problems of interest such as public goods provision, auctions and bargaining are special cases

Advances in Bayesian Networks
✍ Alireza Daneshkhah, Jim Q. Smith (auth.), Dr. José A. Gámez, Professor Serafín 📂 Library 📅 2004 🏛 Springer-Verlag Berlin Heidelberg 🌐 English

In recent years probabilistic graphical models, especially Bayesian networks and decision graphs, have experienced significant theoretical development within areas such as Artificial Intelligence and Statistics. This carefully edited monograph is a compendium of the most recent advances in th

Bayesian Methods in Finance
✍ Svetlozar T. Rachev, John S. J. Hsu, Biliana S. Bagasheva, Frank J. Fabozzi CFA 📂 Library 📅 2008 🏛 Wiley 🌐 English

Bayesian Methods in Finance provides a detailed overview of the theory of Bayesian methods and explains their real-world applications to financial modeling. While the principles and concepts explained throughout the book can be used in financial modeling and decision making in general, the authors