Probabilistic Graphical Models for Computer Vision

Author: Qiang Ji
Publisher: Elsevier
Year: 2020
Language: English
Pages: 184
Category: Library


✦ Table of Contents


Contents......Page 6
List of Figures......Page 8
List of Tables......Page 16
1.1 Introduction......Page 17
1.2 Objectives and key features of this book......Page 20
1.3 PGM introduction......Page 21
1.3.1 PGM issues......Page 23
References......Page 24
2.2.1 Random variable and probability......Page 26
2.2.2 Basic probability rules......Page 27
2.2.3 Independencies and conditional independencies......Page 28
2.2.4 Mean, covariance, correlation, and independence......Page 29
2.2.5 Probability inequalities......Page 31
2.2.6.1 Discrete probability distributions......Page 32
2.2.6.1.2 Integer probability distributions......Page 33
2.2.6.1.3 Multivariate integer probability distributions......Page 34
2.2.6.2 Continuous probability distributions......Page 35
2.3.1 Maximum likelihood......Page 36
2.3.1.2 Maximum conditional likelihood estimation......Page 37
2.3.2 Bayesian estimation......Page 38
2.4.1 Continuous optimization......Page 39
2.4.2 Discrete optimization......Page 41
2.5.2 Sample estimation......Page 42
References......Page 44
3.2.1 BN representation......Page 45
3.2.2.1 Markov condition......Page 47
3.2.2.3 D-separation......Page 48
3.2.2.4 Faithfulness......Page 49
3.2.3.1 Discrete BNs......Page 50
3.2.3.2 Continuous BNs......Page 52
3.2.3.4 Naive BNs......Page 53
3.2.3.5 Regression BNs......Page 55
3.3 BN inference......Page 56
3.3.1.1 Variable elimination......Page 58
3.3.1.2 Belief propagation in singly connected Bayesian networks......Page 61
3.3.1.3.1 Clustering and conditioning methods......Page 66
3.3.1.3.2 Junction tree method......Page 67
3.3.2.2 Monte Carlo sampling......Page 72
3.3.2.2.1 Logic sampling......Page 73
3.3.2.2.2 MCMC sampling......Page 74
3.3.2.2.3 Metropolis-Hastings sampling......Page 76
3.3.2.3 Variational inference......Page 77
3.3.4 Bayesian inference......Page 82
3.4 BN learning under complete data......Page 84
3.4.1.1 Maximum likelihood estimation of BN parameters......Page 85
3.4.1.2 Bayesian estimation of BN parameters......Page 89
3.4.2 Structure learning......Page 92
3.4.2.1.1 Score-based approach......Page 93
3.4.2.1.2 Independence-test-based approach......Page 98
3.5.1 Parameter learning......Page 99
3.5.1.1.1 Direct method......Page 100
3.5.1.1.2 Expectation maximization method......Page 102
3.5.1.2 Bayesian parameter estimation......Page 106
3.5.2 Structure learning......Page 107
3.7.1 Introduction......Page 109
3.7.2 Learning and inference......Page 111
3.7.2.1 DBN learning......Page 112
3.7.2.2 DBN inference......Page 113
3.7.3 Special DBNs......Page 114
3.7.3.1.1 HMM topology and parameterization......Page 115
3.7.3.1.2 HMM inference......Page 116
3.7.3.1.3 HMM learning......Page 119
3.7.3.1.4 Variants of HMMs......Page 121
3.7.3.2 Linear Dynamic System (LDS)......Page 125
3.8.1 Hierarchical Bayes models......Page 127
3.8.2 Hierarchical deep models......Page 132
3.8.3 Hybrid hierarchical models......Page 135
3.9.2 Proof of Gaussian Bayesian network......Page 137
3.9.3 Laplace approximation......Page 139
References......Page 140
4.1.1.1 Definitions......Page 144
4.1.1.2 Properties......Page 146
4.1.1.3 I-map......Page 147
4.2.1 Discrete pairwise Markov networks......Page 148
4.2.2 Label-observation Markov networks......Page 149
4.2.3 Gaussian Markov networks......Page 151
4.2.4 Restricted Boltzmann machines......Page 152
4.3 Conditional random fields......Page 153
4.4 High-order and long-range Markov networks......Page 155
4.5.1.2 Belief propagation method......Page 157
4.5.1.3 Junction tree method......Page 158
4.5.1.4 Graph cuts method......Page 159
4.5.2.1 Iterated conditional modes......Page 160
4.5.2.2 Gibbs sampling......Page 161
4.5.2.3 Loopy belief propagation......Page 162
4.5.3 Other MN inference methods......Page 163
4.6 Markov network learning......Page 164
4.6.1.2 Maximum likelihood estimation......Page 165
4.6.1.2.1 Contrastive divergence method......Page 167
4.6.1.2.2 Pseudolikelihood method......Page 168
4.6.1.2.3 Variational method......Page 169
4.6.1.3 Bayesian estimation of MN parameters......Page 170
4.6.1.4 Discriminative learning......Page 171
4.6.1.6 MN parameter learning under incomplete data......Page 172
4.6.2.1 Score-based approach......Page 173
4.7 Markov networks versus Bayesian networks......Page 174
References......Page 176
Index......Page 178


πŸ“œ SIMILAR VOLUMES


Advances in Probabilistic Graphical Models
✍ Ildikó Flesch, Peter J.F. Lucas (auth.), Peter Lucas Dr., José A. Gámez Dr., Ant… 📂 Library 📅 2007 🏛 Springer-Verlag Berlin Heidelberg 🌐 English

In recent years considerable progress has been made in the area of probabilistic graphical models, in particular Bayesian networks and influence diagrams. Probabilistic graphical models have become mainstream in the area of uncertainty in artificial intelligence; contributions to the area…

Machine Learning and Probabilistic Graphical Models for Decision Support Systems
✍ Kim Phuc Tran πŸ“‚ Library πŸ“… 2022 πŸ› CRC Press 🌐 English

This book presents recent advancements in research, a review of new methods and techniques, and applications in decision support systems (DSS) with Machine Learning and Probabilistic Graphical Models, which are very effective techniques in gaining knowledge from Big Data and in interpreting…