𝔖 Scriptorium
✦   LIBER   ✦

📁

Probability for Machine Learning - Discover How To Harness Uncertainty With Python

โœ Scribed by Jason Brownlee


Year
2020
Tongue
English
Leaves
319
Series
Machine Learning Mastery
Edition
v1.9
Category
Library

⬇  Acquire This Volume

No coin nor oath required. For personal study only.

✦ Synopsis


Probability is the bedrock of machine learning. You cannot develop a deep understanding and application of machine learning without it.

Cut through the equations, Greek letters, and confusion, and discover the topics in probability that you need to know.

Using clear explanations, standard Python libraries, and step-by-step tutorial lessons, you will discover the importance of probability to machine learning, Bayesian probability, entropy, density estimation, maximum likelihood, and much more.
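As a taste of the topics the synopsis names (an illustrative sketch, not an excerpt from the book), Shannon entropy and Bayes' theorem can both be computed with nothing beyond the Python standard library:

```python
# Illustrative sketch only: two topics named in the synopsis, entropy and
# Bayesian probability, implemented with the Python standard library.
from math import log2

def entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def bayes(p_h, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) from the prior P(H) and the two likelihoods."""
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
    return p_e_given_h * p_h / p_e

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))                 # 1.0

# Posterior for a positive test (90% sensitive, 5% false-positive
# rate) when the prior probability of the condition is 1%.
print(round(bayes(0.01, 0.90, 0.05), 4))   # 0.1538
```

The function names and example numbers here are made up for illustration; the book itself works through these ideas step by step with standard libraries such as NumPy and SciPy.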

✦ Table of Contents


Copyright
Contents
Preface
I Introduction
II Background
What is Probability?
Tutorial Overview
Uncertainty is Normal
Probability of an Event
Probability Theory
Two Schools of Probability
Further Reading
Summary
Uncertainty in Machine Learning
Tutorial Overview
Uncertainty in Machine Learning
Noise in Observations
Incomplete Coverage of the Domain
Imperfect Model of the Problem
How to Manage Uncertainty
Further Reading
Summary
Why Learn Probability for Machine Learning
Tutorial Overview
Reasons to NOT Learn Probability
Class Membership Requires Predicting a Probability
Some Algorithms Are Designed Using Probability
Models Are Trained Using a Probabilistic Framework
Models Can Be Tuned With a Probabilistic Framework
Probabilistic Measures Are Used to Evaluate Model Skill
One More Reason
Further Reading
Summary
III Foundations
Joint, Marginal, and Conditional Probability
Tutorial Overview
Probability for One Random Variable
Probability for Multiple Random Variables
Probability for Independence and Exclusivity
Further Reading
Summary
Intuition for Joint, Marginal, and Conditional Probability
Tutorial Overview
Joint, Marginal, and Conditional Probabilities
Probabilities of Rolling Two Dice
Probabilities of Weather in Two Cities
Further Reading
Summary
Advanced Examples of Calculating Probability
Tutorial Overview
Birthday Problem
Boy or Girl Problem
Monty Hall Problem
Further Reading
Summary
IV Distributions
Probability Distributions
Tutorial Overview
Random Variables
Probability Distribution
Discrete Probability Distributions
Continuous Probability Distributions
Further Reading
Summary
Discrete Probability Distributions
Tutorial Overview
Discrete Probability Distributions
Bernoulli Distribution
Binomial Distribution
Multinoulli Distribution
Multinomial Distribution
Further Reading
Summary
Continuous Probability Distributions
Tutorial Overview
Continuous Probability Distributions
Normal Distribution
Exponential Distribution
Pareto Distribution
Further Reading
Summary
Probability Density Estimation
Tutorial Overview
Probability Density
Summarize Density With a Histogram
Parametric Density Estimation
Nonparametric Density Estimation
Further Reading
Summary
V Maximum Likelihood
Maximum Likelihood Estimation
Tutorial Overview
Problem of Probability Density Estimation
Maximum Likelihood Estimation
Relationship to Machine Learning
Further Reading
Summary
Linear Regression With Maximum Likelihood Estimation
Tutorial Overview
Linear Regression
Maximum Likelihood Estimation
Linear Regression as Maximum Likelihood
Least Squares and Maximum Likelihood
Further Reading
Summary
Logistic Regression With Maximum Likelihood Estimation
Tutorial Overview
Logistic Regression
Logistic Regression and Log-Odds
Maximum Likelihood Estimation
Logistic Regression as Maximum Likelihood
Further Reading
Summary
Expectation Maximization (EM Algorithm)
Tutorial Overview
Problem of Latent Variables for Maximum Likelihood
Expectation-Maximization Algorithm
Gaussian Mixture Model and the EM Algorithm
Example of Gaussian Mixture Model
Further Reading
Summary
Probabilistic Model Selection with AIC, BIC, and MDL
Tutorial Overview
The Challenge of Model Selection
Probabilistic Model Selection
Akaike Information Criterion
Bayesian Information Criterion
Minimum Description Length
Worked Example for Linear Regression
Further Reading
Summary
VI Bayesian Probability
Introduction to Bayes Theorem
Tutorial Overview
What is Bayes Theorem?
Naming the Terms in the Theorem
Example: Elderly Fall and Death
Example: Email and Spam Detection
Example: Liars and Lie Detectors
Further Reading
Summary
Bayes Theorem and Machine Learning
Tutorial Overview
Bayes Theorem of Modeling Hypotheses
Density Estimation
Maximum a Posteriori
MAP and Machine Learning
Bayes Optimal Classifier
Further Reading
Summary
How to Develop a Naive Bayes Classifier
Tutorial Overview
Conditional Probability Model of Classification
Simplified or Naive Bayes
How to Calculate the Prior and Conditional Probabilities
Worked Example of Naive Bayes
5 Tips When Using Naive Bayes
Further Reading
Summary
How to Implement Bayesian Optimization
Tutorial Overview
Challenge of Function Optimization
What Is Bayesian Optimization?
How to Perform Bayesian Optimization
Hyperparameter Tuning With Bayesian Optimization
Further Reading
Summary
Bayesian Belief Networks
Tutorial Overview
Challenge of Probabilistic Modeling
Bayesian Belief Network as a Probabilistic Model
How to Develop and Use a Bayesian Network
Example of a Bayesian Network
Bayesian Networks in Python
Further Reading
Summary
VII Information Theory
Information Entropy
Tutorial Overview
What Is Information Theory?
Calculate the Information for an Event
Calculate the Information for a Random Variable
Further Reading
Summary
Divergence Between Probability Distributions
Tutorial Overview
Statistical Distance
Kullback-Leibler Divergence
Jensen-Shannon Divergence
Further Reading
Summary
Cross-Entropy for Machine Learning
Tutorial Overview
What Is Cross-Entropy?
Difference Between Cross-Entropy and KL Divergence
How to Calculate Cross-Entropy
Cross-Entropy as a Loss Function
Difference Between Cross-Entropy and Log Loss
Further Reading
Summary
Information Gain and Mutual Information
Tutorial Overview
What Is Information Gain?
Worked Example of Calculating Information Gain
Examples of Information Gain in Machine Learning
What Is Mutual Information?
How Are Information Gain and Mutual Information Related?
Further Reading
Summary
VIII Classification
How to Develop and Evaluate Naive Classifier Strategies
Tutorial Overview
Naive Classifier
Predict a Random Guess
Predict a Randomly Selected Class
Predict the Majority Class
Naive Classifiers in scikit-learn
Further Reading
Summary
Probability Scoring Metrics
Tutorial Overview
Log Loss Score
Brier Score
ROC AUC Score
Further Reading
Summary
When to Use ROC Curves and Precision-Recall Curves
Tutorial Overview
Predicting Probabilities
What Are ROC Curves?
ROC Curves and AUC in Python
What Are Precision-Recall Curves?
Precision-Recall Curves in Python
When to Use ROC vs. Precision-Recall Curves?
Further Reading
Summary
How to Calibrate Predicted Probabilities
Tutorial Overview
Predicting Probabilities
Calibration of Predictions
How to Calibrate Probabilities in Python
Worked Example of Calibrating SVM Probabilities
Further Reading
Summary
IX Appendix
Getting Help
Probability on Wikipedia
Probability Textbooks
Probability and Machine Learning
Ask Questions About Probability
How to Ask Questions
Contact the Author
How to Setup Python on Your Workstation
Tutorial Overview
Download Anaconda
Install Anaconda
Start and Update Anaconda
Further Reading
Summary
Basic Math Notation
Tutorial Overview
The Frustration with Math Notation
Arithmetic Notation
Greek Alphabet
Sequence Notation
Set Notation
Other Notation
Tips for Getting More Help
Further Reading
Summary
X Conclusions
How Far You Have Come


📜 SIMILAR VOLUMES


Python for Probability, Statistics, and
โœ Josรฉ Unpingco ๐Ÿ“‚ Library ๐Ÿ“… 2016 ๐Ÿ› Springer ๐ŸŒ English

This book, fully updated for Python version 3.6+, covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. All the figures and numerical results are reproducible using the Python codes provided. The author develops key intuitions i

Python for Probability, Statistics, and
โœ Josรฉ Unpingco ๐Ÿ“‚ Library ๐Ÿ“… 2022 ๐Ÿ› Springer ๐ŸŒ English

Using a novel integration of mathematics and Python codes, this book illustrates the fundamental concepts that link probability, statistics, and machine learning, so that the reader can not only employ statistical and machine learning models using modern Python modules, but also understand

Python for Probability, Statistics, and
โœ Josรฉ Unpingco (auth.) ๐Ÿ“‚ Library ๐Ÿ“… 2016 ๐Ÿ› Springer International Publishing ๐ŸŒ English

This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which ar

Python for Probability, Statistics, and
โœ Jose Unpingco ๐Ÿ“‚ Library ๐Ÿ“… 2016 ๐Ÿ› Springer ๐ŸŒ English

This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are

Python for Probability, Statistics, and
โœ Unpingco J. ๐Ÿ“‚ Library ๐ŸŒ English

Springer, 2016. 276 p. ISBN: 3319307150.
Explains how to simulate, conceptualize, and visualize random statistical processes and apply machine learning methods. Connects to key open-source Python communities and corresponding modules focused on the latest developments