Building Probabilistic Graphical Models with Python
by Kiran R Karkera
- Publisher: Packt Publishing
- Year: 2014
- Language: English
- Pages: 173
- Category: Library
Synopsis
This is a short, practical guide that helps data scientists understand the concepts of graphical models and try them out in small Python code snippets, without becoming too mathematically involved. If you are a data scientist who knows machine learning and wants to deepen your knowledge of graphical models, such as Bayes networks, in order to solve real-world problems with Python libraries, this book is for you. It is intended for readers who have some Python and machine learning experience, or who are exploring the machine learning field.
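The snippets the synopsis describes are in this spirit: a few lines of plain Python applying a probability rule. As a minimal illustrative sketch (not taken from the book, and with made-up numbers), here is Bayes' rule applied to a hypothetical diagnostic test:

```python
def bayes_rule(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' rule.

    prior: P(disease); sensitivity: P(positive | disease);
    false_positive_rate: P(positive | no disease).
    """
    # Total probability of a positive test, summed over both hypotheses
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    # Bayes' rule: posterior = likelihood * prior / evidence
    return sensitivity * prior / p_positive

# Hypothetical numbers: 1% prevalence, 90% sensitivity, 5% false positives
posterior = bayes_rule(prior=0.01, sensitivity=0.9, false_positive_rate=0.05)
print(round(posterior, 3))  # prints 0.154
```

Even with a fairly accurate test, the low prior keeps the posterior around 15% — the kind of counterintuitive result that motivates Chapter 1's treatment of conditional probability and the Bayes rule.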
Table of Contents
Cover
Copyright
Credits
About the Author
About the Reviewers
www.PacktPub.com
Table of Contents
Preface
Chapter 1: Probability
The theory of probability
Goals of probabilistic inference
Conditional probability
The chain rule
The Bayes rule
Interpretations of probability
Random variables
Marginal distribution
Joint distribution
Independence
Conditional independence
Types of queries
Probability queries
MAP queries
Summary
Chapter 2: Directed Graphical Models
Graph terminology
Python digression
Independence and independent parameters
The Bayes network
The chain rule
Reasoning patterns
Causal reasoning
Evidential reasoning
Inter-causal reasoning
D-separation
The D-separation example
Blocking and unblocking a V-structure
Factorization and I-maps
The Naive Bayes model
The Naïve Bayes example
Summary
Chapter 3: Undirected Graphical Models
Pairwise Markov networks
The Gibbs distribution
An induced Markov network
Factorization
Flow of influence
Active trail and separation
Structured prediction
Problem of correlated features
The CRF representation
The CRF example
The factorization-independence tango
Summary
Chapter 4: Structure Learning
The structure learning landscape
Constraint-based structure learning
Part I
Part II
Part III
Summary of constraint-based approaches
Score-based learning
The likelihood score
The Bayesian information criterion score
The Bayesian score
Summary of score-based learning
Summary
Chapter 5: Parameter Learning
The likelihood function
Parameter learning example using MLE
MLE for Bayesian networks
Bayesian parameter learning example using MLE
Data fragmentation
Effect of data fragmentation on parameter estimation
Bayesian parameter estimation
An example of Bayesian methods for parameter learning
Bayesian estimation for the Bayesian network
Example of Bayesian estimation
Summary
Chapter 6: Exact Inference Using Graphical Models
Complexity of inference
Real-world issues
Using the Variable Elimination algorithm
Marginalizing factors that are not relevant
Factor reduction to filter on evidence
Shortcomings of the brute-force approach
Using the Variable Elimination approach
Complexity of Variable Elimination
Graph perspective
Learning the induced width from the graph structure
The tree algorithm
The four stages of the junction tree algorithm
Using the junction tree algorithm for inference
Stage 1.1 β moralization
Stage 1.2 β triangulation
Stage 1.3 β building the join tree
Stage 2 β initializing potentials
Stage 3 β message passing
Summary
Chapter 7: Approximate Inference Methods
The optimization perspective
Belief propagation on general graphs
Creating a cluster graph to run LBP
Message passing in LBP
Steps in the LBP algorithm
Improving the convergence of LBP
Applying LBP to segment an image
Understanding energy-based models
Visualizing unary and pairwise factors on a 3 x 3 grid
Creating the model for image segmentation
Applications of LBP
Sampling-based methods
Forward sampling
The accept-reject sampling method
The Markov Chain Monte Carlo sampling process
The Markov property
The Markov chain
Reaching a steady state
Sampling using a Markov chain
Gibbs sampling
Steps in the Gibbs sampling procedure
An example of Gibbs sampling
Summary
Appendix: References
Index
Similar Volumes
With their increasing prominence in machine learning and data science applications, probabilistic graphical models are a tool that machine learning practitioners can use to discover and analyze structure in complex problems. The variety of tools and algorithms under the PGM framework extends to many domains.
Master probabilistic graphical models by learning through real-world problems and illustrative code examples in Python. Gain in-depth knowledge of probabilistic graphical models and model time-series problems using Dynamic Bayesian Networks.
Probabilistic graphical models are a machine learning technique that uses concepts from graph theory to concisely represent and optimally predict values in data problems. Graphical models give us techniques to find complex patterns in data and are widely used in fields such as speech recognition.