𝔖 Scriptorium
✦   LIBER   ✦


Machine Learning: A First Course for Engineers and Scientists

✍ Scribed by Andreas Lindholm, Niklas Wahlström, Fredrik Lindsten, Thomas B. Schön


Publisher: Cambridge University Press
Year: 2022
Tongue: English
Leaves: 352
Edition: New
Category: Library


✦ Synopsis


This book introduces machine learning for readers with some background in basic linear algebra, statistics, probability, and programming. Within a coherent statistical framework it covers a selection of supervised machine learning methods, from the most fundamental (k-NN, decision trees, linear and logistic regression) to more advanced methods (deep neural networks, support vector machines, Gaussian processes, random forests and boosting), plus commonly used unsupervised methods (generative modelling, k-means, PCA, autoencoders and generative adversarial networks). Careful explanations and pseudo-code are presented for all methods. The authors maintain a focus on the fundamentals by drawing connections between methods and discussing general concepts such as loss functions, maximum likelihood, the bias-variance decomposition, ensemble averaging, kernels and the Bayesian approach, along with generally useful tools such as regularisation, cross-validation, evaluation metrics and optimisation methods. The final chapters offer practical advice on solving real-world supervised machine learning problems and discuss the ethical aspects of modern machine learning.
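For a taste of the most fundamental method named above, here is a minimal k-NN classification sketch in Python. It is not taken from the book's pseudo-code; the function name and the toy data are illustrative assumptions, and it uses the standard Euclidean-distance, majority-vote formulation of k-NN.

import numpy as np

def knn_classify(X_train, y_train, x_new, k=3):
    # Classify x_new by majority vote among its k nearest training points
    # (Euclidean distance). A fuller implementation would also handle ties,
    # feature scaling, and batched queries.
    dists = np.linalg.norm(X_train - x_new, axis=1)   # distance to every training point
    nearest = np.argsort(dists)[:k]                   # indices of the k closest points
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]                  # most common label wins

# Toy usage (illustrative data, not from the book)
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_classify(X_train, y_train, np.array([0.2, 0.1])))   # prints 0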

✦ Table of Contents


Cover
Half-title
Title page
Copyright information
Contents
Acknowledgements
Notation
1 Introduction
1.1 Machine Learning Exemplified
1.2 About This Book
1.3 Further Reading
2 Supervised Learning: A First Approach
2.1 Supervised Machine Learning
2.2 A Distance-Based Method: k-NN
2.3 A Rule-Based Method: Decision Trees
2.4 Further Reading
3 Basic Parametric Models and a Statistical Perspective on Learning
3.1 Linear Regression
3.2 Classification and Logistic Regression
3.3 Polynomial Regression and Regularisation
3.4 Generalised Linear Models
3.5 Further Reading
3.A Derivation of the Normal Equations
4 Understanding, Evaluating, and Improving Performance
4.1 Expected New Data Error E_new: Performance in Production
4.2 Estimating E_new
4.3 The Training Error–Generalisation Gap Decomposition of E_new
4.4 The Bias–Variance Decomposition of E_new
4.5 Additional Tools for Evaluating Binary Classifiers
4.6 Further Reading
5 Learning Parametric Models
5.1 Principles of Parametric Modelling
5.2 Loss Functions and Likelihood-Based Models
5.3 Regularisation
5.4 Parameter Optimisation
5.5 Optimisation with Large Datasets
5.6 Hyperparameter Optimisation
5.7 Further Reading
6 Neural Networks and Deep Learning
6.1 The Neural Network Model
6.2 Training a Neural Network
6.3 Convolutional Neural Networks
6.4 Dropout
6.5 Further Reading
6.A Derivation of the Backpropagation Equations
7 Ensemble Methods: Bagging and Boosting
7.1 Bagging
7.2 Random Forests
7.3 Boosting and AdaBoost
7.4 Gradient Boosting
7.5 Further Reading
8 Non-linear Input Transformations and Kernels
8.1 Creating Features by Non-linear Input Transformations
8.2 Kernel Ridge Regression
8.3 Support Vector Regression
8.4 Kernel Theory
8.5 Support Vector Classification
8.6 Further Reading
8.A The Representer Theorem
8.B Derivation of Support Vector Classification
9 The Bayesian Approach and Gaussian Processes
9.1 The Bayesian Idea
9.2 Bayesian Linear Regression
9.3 The Gaussian Process
9.4 Practical Aspects of the Gaussian Process
9.5 Other Bayesian Methods in Machine Learning
9.6 Further Reading
9.A The Multivariate Gaussian Distribution
10 Generative Models and Learning from Unlabelled Data
10.1 The Gaussian Mixture Model and Discriminant Analysis
10.2 Cluster Analysis
10.3 Deep Generative Models
10.4 Representation Learning and Dimensionality Reduction
10.5 Further Reading
11 User Aspects of Machine Learning
11.1 Defining the Machine Learning Problem
11.2 Improving a Machine Learning Model
11.3 What If We Cannot Collect More Data?
11.4 Practical Data Issues
11.5 Can I Trust my Machine Learning Model?
11.6 Further Reading
12 Ethics in Machine Learning
12.1 Fairness and Error Functions
12.2 Misleading Claims about Performance
12.3 Limitations of Training Data
12.4 Further Reading
Bibliography
Index


📜 SIMILAR VOLUMES


MATLAB Essentials: A First Course for Engineers and Scientists
✍ William Bober 📂 Library 📅 2018 🏛 CRC Press 🌐 English

All disciplines of science and engineering use numerical methods for complex problem analysis, due to the highly mathematical nature of the field. Analytical methods alone are unable to solve many complex problems engineering students and professionals confront. Introduction to MATLAB® Programming f…

A First Course in Machine Learning
✍ Simon Rogers, Mark Girolami 📂 Library 📅 2011 🏛 CRC Press 🌐 English

A First Course in Machine Learning covers the core mathematical and statistical techniques needed to understand some of the most popular machine learning algorithms. The algorithms presented span the main problem areas within machine learning: classification, clustering and projection …

A First Course in Machine Learning
✍ Simon Rogers, Mark Girolami 📂 Library 📅 2016 🏛 Chapman and Hall/CRC 🌐 English

<P>"<STRONG>A First Course in Machine Learning </STRONG>by Simon Rogers and Mark Girolami is the best introductory book for ML currently available. It combines rigor and precision with accessibility, starts from a detailed explanation of the basic foundations of Bayesian analysis in the simplest of
