๐”– Scriptorium
โœฆ   LIBER   โœฆ

๐Ÿ“

An Introduction to Pattern Recognition and Machine Learning

โœ Scribed by Paul Fieguth


Publisher: Springer
Year: 2022
Tongue: English
Leaves: 481
Category: Library


Synopsis


The domains of Pattern Recognition and Machine Learning have experienced exceptional interest and growth; however, the overwhelming number of methods and applications can make the fields seem bewildering. This text offers an accessible and conceptually rich introduction, with a solid mathematical development emphasizing simplicity and intuition. Students beginning to explore pattern recognition do not need a suite of mathematically advanced methods or complicated computational libraries to understand and appreciate pattern recognition; rather, the fundamental concepts and insights, eminently teachable at the undergraduate level, motivate this text. The book provides methods of analysis that readers can realistically undertake on their own, supported by real-world examples, case studies, and worked numerical and computational studies.

Table of Contents


Preface
Table of Contents
List of Examples
List of Algorithms
Notation
1 Overview
2 Introduction to Pattern Recognition
2.1 What Is Pattern Recognition?
2.2 Measured Patterns
2.3 Classes
2.4 Classification
2.5 Types of Classification Problems
Case Study 2: Biometrics
Numerical Lab 2: The Iris Dataset
Further Reading
Sample Problems
References
3 Learning
Case Study 3: The Netflix Prize
Numerical Lab 3: Overfitting and Underfitting
Summary
Further Reading
Sample Problems
References
4 Representing Patterns
4.1 Similarity
4.2 Class Shape
4.3 Cluster Synthesis
Case Study 4: Defect Detection
Numerical Lab 4: Working with Random Numbers
Further Reading
Sample Problems
References
5 Feature Extraction and Selection
5.1 Fundamentals of Feature Extraction
5.2 Feature Extraction and Selection
Case Study 5: Image Searching
Numerical Lab 5: Extracting Features and Plotting Classes
Further Reading
Sample Problems
References
6 Distance-Based Classification
6.1 Definitions of Distance
6.2 Class Prototype
6.3 Distance-Based Classification
6.4 Classifier Variations
Case Study 6: Handwriting Recognition
Numerical Lab 6: Distance-Based Classifiers
Further Reading
Sample Problems
References
7 Inferring Class Models
7.1 Parametric Estimation
7.2 Parametric Model Learning
7.3 Nonparametric Model Learning
7.3.1 Histogram Estimation
7.3.2 Kernel-Based Estimation
7.3.3 Neighbourhood-based Estimation
7.4 Distribution Assessment
Case Study 7: Object Recognition
Numerical Lab 7: Parametric and Nonparametric Estimation
Further Reading
Sample Problems
References
8 Statistics-Based Classification
8.1 Non-Bayesian Classification: Maximum Likelihood
8.2 Bayesian Classification: Maximum a Posteriori
8.3 Statistical Classification for Normal Distributions
8.4 Classification Error
8.5 Other Statistical Classifiers
Case Study 8: Medical Assessments
Numerical Lab 8: Statistical and Distance-Based Classifiers
Further Reading
Sample Problems
References
9 Classifier Testing and Validation
9.1 Working with Data
9.2 Classifier Evaluation
9.3 Classifier Validation
Case Study 9: Autonomous Vehicles
Numerical Lab 9: Leave-One-Out Validation
Further Reading
Sample Problems
References
10 Discriminant-Based Classification
10.1 Linear Discriminants
10.2 Discriminant Model Learning
10.3 Nonlinear Discriminants
10.4 Multi-Class Problems
Case Study 10: Digital Communications
Numerical Lab 10: Discriminants
Further Reading
Sample Problems
References
11 Ensemble Classification
11.1 Combining Classifiers
11.2 Resampling Strategies
11.3 Sequential Strategies
11.4 Nonlinear Strategies
11.4.1 Neural Network Learning
11.4.2 Deep Neural Network Classifiers
Case Study 11: Interpretability and Ethics of Large Networks
Numerical Lab 11: Ensemble Classifiers
Further Reading
Sample Problems
References
12 Model-Free Classification
12.1 Unsupervised Learning
12.1.1 K-Means Clustering
12.1.2 Kernel K-Means Clustering
12.1.3 Mean-Shift Clustering
12.1.4 Hierarchical Clustering
12.2 Network-Based Clustering
12.3 Semi-Supervised Learning
Case Study 12: Ancient Text Analysis: Who Wrote What?
Numerical Lab 12: Clustering
Further Reading
Sample Problems
References
13 Conclusions and Directions
Appendices
A Algebra Review
Further Reading
Sample Problems
References
B Random Variables and Random Vectors
B.1 Random Variables
B.2 Expectations
B.3 Conditional Statistics
B.4 Random Vectors and Covariances
B.5 Outliers and Heavy-Tail Distributions
B.6 Sample Statistics
Further Reading
Sample Problems
References
C Introduction to Optimization
C.1 Basic Principles
C.2 One-Dimensional Optimization
C.3 Multi-Dimensional Optimization
C.4 Multi-Objective Optimization
Further Reading
Sample Problems
References
D Mathematical Derivations
Index


SIMILAR VOLUMES


Introduction to Pattern Recognition and Machine Learning
M Narasimha Murty, V Susheela Devi · Library · 2014 · World Scientific Publishing Company · English

This book adopts a detailed and methodological algorithmic approach to explain the concepts of pattern recognition. While the text provides a systematic account of its major topics, such as pattern representation and nearest-neighbour-based classifiers, current topics such as neural networks and support vector machines …

A Concise Introduction to Machine Learning
A. C. Faul · Library · 2019 · Chapman and Hall/CRC · English

The emphasis of the book is on the question of Why: only if it is understood why an algorithm is successful can it be properly applied and the results trusted. Algorithms are often taught side by side without showing the similarities and differences between them. This book addresses the commonalities …

Pattern Recognition and Machine Learning
โœ Christopher M. Bishop ๐Ÿ“‚ Library ๐Ÿ“… 2007 ๐Ÿ› Springer ๐ŸŒ English

This is the first textbook on pattern recognition to present the Bayesian viewpoint. The book presents approximate inference algorithms that permit fast approximate answers in situations where exact answers are not feasible. It uses graphical models to describe probability distributions …

Pattern Recognition and Machine Learning
โœ Christopher Bishop ๐Ÿ“‚ Library ๐Ÿ“… 2007 ๐Ÿ› Springer ๐ŸŒ English

The dramatic growth in practical applications for machine learning over the last ten years has been accompanied by many important developments in the underlying algorithms and techniques. For example, Bayesian methods have grown from a specialist niche to become mainstream, while …

Pattern Recognition and Machine Learning
โœ Christopher M. Bishop ๐Ÿ“‚ Library ๐Ÿ“… 2011 ๐Ÿ› Springer ๐ŸŒ English

Pattern recognition has its origins in engineering, whereas machine learning grew out of computer science. However, these activities can be viewed as two facets of the same field, and together they have undergone substantial development over the past ten years. In particular, Bayesian methods have grown …
