Machine Learning Foundations: Supervised, Unsupervised, and Advanced Learning
By Taeho Jo
- Publisher: Springer
- Year: 2021
- Language: English
- Pages: 400
- Edition: 1st ed. 2021
- Category: Library
Synopsis
This book provides a conceptual understanding of machine learning algorithms through supervised, unsupervised, and advanced learning techniques. The book consists of four parts: foundation, supervised learning, unsupervised learning, and advanced learning. The first part provides the fundamental materials, background, and simple machine learning algorithms as preparation for studying the algorithms that follow. The second and third parts, the core of the book, cover the supervised and unsupervised learning algorithms. The last part presents advanced machine learning algorithms: ensemble learning, semi-supervised learning, temporal learning, and reinforcement learning.
- Provides comprehensive coverage of both learning paradigms: supervised and unsupervised learning;
- Outlines the computation paradigm for solving classification, regression, and clustering;
- Features essential techniques for building a new generation of machine learning.
Table of Contents
Preface
Part I: Foundation
Part II: Supervised Learning
Part III: Unsupervised Learning
Part IV: Advanced Topics
Contents
Part I Foundation
1 Introduction
1.1 Definition of Machine Learning
1.2 Application Areas
1.2.1 Classification
1.2.2 Regression
1.2.3 Clustering
1.2.4 Hybrid Tasks
1.3 Machine Learning Types
1.3.1 Supervised Learning
1.3.2 Unsupervised Learning
1.3.3 Semi-supervised Learning
1.3.4 Reinforcement Learning
1.4 Related Areas
1.4.1 Artificial Intelligence
1.4.2 Neural Networks
1.4.3 Data Mining
1.4.4 Soft Computing
1.5 Summary and Further Discussions
References
2 Numerical Vectors
2.1 Introduction
2.2 Operations on Numerical Vectors
2.2.1 Definition
2.2.2 Basic Operations
2.2.3 Inner Product
2.2.4 Linear Independence
2.3 Operations on Matrices
2.3.1 Definition
2.3.2 Basic Operations
2.3.3 Multiplication
2.3.4 Inverse Matrix
2.4 Vector and Matrix
2.4.1 Determinant
2.4.2 Eigenvalues and Eigenvectors
2.4.3 Singular Value Decomposition
2.4.4 Principal Component Analysis
2.5 Summary and Further Discussions
3 Data Encoding
3.1 Introduction
3.2 Relational Data
3.2.1 Basic Concepts
3.2.2 Relational Database
3.2.3 Encoding Process
3.2.4 Encoding Issues
3.3 Textual Data
3.3.1 Text Indexing
3.3.2 Text Encoding
3.3.3 Dimension Reduction
3.3.4 Encoding Issues
3.4 Image Data
3.4.1 Image File Formats
3.4.2 Image Matrix
3.4.3 Encoding Process
3.4.4 Encoding Issues
3.5 Summary and Further Discussions
References
4 Simple Machine Learning Algorithms
4.1 Introduction
4.2 Classification
4.2.1 Binary Classification
4.2.2 Multiple Classification
4.2.3 Regression
4.2.4 Problem Decomposition
4.3 Simple Classifiers
4.3.1 Threshold Rule
4.3.2 Rectangle
4.3.3 Hyperplane
4.3.4 Matching Algorithm
4.4 Linear Classifiers
4.4.1 Linear Separability
4.4.2 Hyperplane Equation
4.4.3 Linear Classification
4.4.4 Perceptron
4.5 Summary and Further Discussions
References
Part II Supervised Learning
5 Instance Based Learning
5.1 Introduction
5.2 Primitive Instance Based Learning
5.2.1 Look-Up Example
5.2.2 Rule Based Approach
5.2.3 Example Similarity
5.2.4 One Nearest Neighbor
5.3 Classification Process
5.3.1 Notations
5.3.2 Nearest Neighbors
5.3.3 Voting
5.3.4 Attribute Discriminations
5.4 Variants
5.4.1 Dynamic Nearest Neighbor
5.4.2 Concentric Nearest Neighbor
5.4.3 Hierarchical Nearest Neighbor
5.4.4 Hub Examples
5.5 Summary and Further Discussions
References
6 Probabilistic Learning
6.1 Introduction
6.2 Bayes Classifier
6.2.1 Probabilities
6.2.2 Bayes Rule
6.2.3 Gaussian Distribution
6.2.4 Classification
6.3 Naive Bayes
6.3.1 Classification
6.3.2 Learning
6.3.3 Variants
6.3.4 Application to Text Classification
6.4 Bayesian Learning
6.4.1 Bayesian Networks
6.4.2 Causal Relation
6.4.3 Learning Process
6.4.4 Comparisons
6.5 Summary and Further Discussions
References
7 Decision Tree
7.1 Introduction
7.2 Classification Process
7.2.1 Basic Structure
7.2.2 Toy Examples
7.2.3 Text Classification
7.2.4 Rule Extraction
7.3 Learning Process
7.3.1 Preprocessing
7.3.2 Root Node
7.3.3 Interior Nodes
7.3.4 Pruning
7.4 Variants
7.4.1 Regression Version
7.4.2 Decision List
7.4.3 Random Forest
7.4.4 Decision Graph
7.5 Summary and Further Discussions
Reference
8 Support Vector Machine
8.1 Introduction
8.2 Classification Process
8.2.1 Linear Classifier
8.2.2 Kernel Functions
8.2.3 Lagrange Multipliers
8.2.4 Generalization
8.3 Learning Process
8.3.1 Primal Problem
8.3.2 Dual Problem
8.3.3 SMO Algorithm
8.3.4 Other Optimization Schemes
8.4 Variants
8.4.1 Fuzzy SVM
8.4.2 Pairwise SVM
8.4.3 LMS SVM
8.4.4 Sparse SVM
8.5 Summary and Further Discussions
References
Part III Unsupervised Learning
9 Simple Clustering Algorithms
9.1 Introduction
9.2 AHC Algorithm
9.2.1 Cluster Similarity
9.2.2 Initial Version
9.2.3 Fuzzy Clustering
9.2.4 Variants
9.3 Divisive Clustering Algorithm
9.3.1 Binary Clustering
9.3.2 Evolutionary Binary Clustering
9.3.3 Standard Version
9.3.4 Variants
9.4 Online Linear Clustering Algorithm
9.4.1 Representative Selection Scheme
9.4.2 Initial Version
9.4.3 Fuzzy Clustering
9.4.4 Variants
9.5 Summary and Further Discussions
References
10 K Means Algorithm
10.1 Introduction
10.2 Supervised and Unsupervised Learning
10.2.1 Learning Paradigm Transition
10.2.2 Unsupervised KNN
10.2.3 Semi-supervised KNN
10.2.4 Dynamic Data Organization
10.3 Clustering Process
10.3.1 Initialization
10.3.2 Hard Clustering
10.3.3 Fuzzy Clustering
10.3.4 Hierarchical Clustering
10.4 Variants
10.4.1 K Medoid Algorithm
10.4.2 Dynamic K Means Algorithm
10.4.3 Semi-supervised Version
10.4.4 Constraint Clustering
10.5 Summary and Further Discussions
References
11 EM Algorithm
11.1 Introduction
11.2 Cluster Distributions
11.2.1 Uniform Distribution
11.2.2 Gaussian Distribution
11.2.3 Poisson Distribution
11.2.4 Fuzzy Distributions
11.3 Clustering Process
11.3.1 Initialization
11.3.2 E-Step
11.3.3 M-Step
11.3.4 Issues
11.4 Semi-Supervised Learning: Text Classification
11.4.1 Semi-Supervised Learning
11.4.2 Initialization
11.4.3 Likelihood Estimation
11.4.4 Parameter Estimation
11.5 Summary and Further Discussions
References
12 Advanced Clustering
12.1 Introduction
12.2 Cluster Index
12.2.1 Computation Process
12.2.2 Hard Clustering Evaluation
12.2.3 Fuzzy Clustering Evaluation
12.2.4 Hierarchical Clustering Evaluation
12.3 Parameter Tuning
12.3.1 Clustering Index to Unlabeled Items
12.3.2 Simple Clustering Algorithms
12.3.3 K Means Algorithm
12.3.4 Evolutionary Clustering
12.4 Clustering Governance
12.4.1 Cluster Naming
12.4.2 Cluster Maintenance
12.4.3 Multiple Viewed Clustering
12.4.4 Clustering Results Integration
12.5 Summary and Further Discussions
References
Part IV Advanced Topics
13 Ensemble Learning
13.1 Introduction
13.2 Combination Schemes
13.2.1 Voting
13.2.2 Expert Gates
13.2.3 Cascading
13.2.4 Cellular Model
13.3 Meta-learning
13.3.1 Voting
13.3.2 Expert Gates
13.3.3 Cascading
13.3.4 Cellular Model
13.4 Partition
13.4.1 Training Set Partition
13.4.2 Attribute Set Partition
13.4.3 Architecture Partition
13.4.4 Parallel and Distributed Learning
13.5 Summary and Further Discussions
References
14 Semi-supervised Learning
14.1 Introduction
14.2 Kohonen Networks
14.2.1 Initial Version
14.2.2 Learning Vector Quantization
14.2.3 Semi-supervised Version
14.2.4 Kohonen Networks vs. K Means Algorithm
14.3 Combined Learning Algorithms
14.3.1 Combination Paradigms
14.3.2 Simple Learning Algorithms
14.3.3 K Means Algorithm + KNN Algorithm
14.3.4 EM Algorithm + Naive Bayes
14.4 Advanced Supervised Learning
14.4.1 Resampling
14.4.2 Virtual Training Example
14.4.3 Co-Learning
14.4.4 Incremental Learning
14.5 Summary and Further Discussions
References
15 Temporal Learning
15.1 Introduction
15.2 Discrete Markov Model
15.2.1 State Diagram
15.2.2 State Transition Probability
15.2.3 State Path Probability
15.2.4 Application to Time Series Prediction
15.3 Hidden Markov Model
15.3.1 Initial Parameters
15.3.2 Observation Sequence Probability
15.3.3 State Sequence Estimation
15.3.4 HMM Learning
15.4 Text Topic Analysis
15.4.1 Task Specification
15.4.2 Sampling
15.4.3 Learning
15.4.4 Topic Sequence
15.5 Summary and Further Discussions
References
16 Reinforcement Learning
16.1 Introduction
16.2 Simple Reinforcement Learning
16.2.1 Single Example
16.2.2 Classification
16.2.3 Regression
16.2.4 Autonomous Moving
16.3 Q Learning
16.3.1 Q Table
16.3.2 Finite State
16.3.3 Infinite State
16.3.4 Stochastic Reward
16.4 Advanced Reinforcement Learning
16.4.1 Ensemble Reinforcement Learning
16.4.2 Reinforcement + Supervised
16.4.3 Reinforcement + Unsupervised
16.4.4 Environment Prediction
16.5 Summary and Further Discussions
Index
SIMILAR VOLUMES
- This book covers the state of the art in learning algorithms, including semi-supervised methods, to provide a broad scope of clustering and classification solutions for big data applications. Case studies and best practices are included along with theoretical models of learning.
- Concepts of Machine Learning with Practical Approaches. Key Features: includes real-scenario examples to explain the working of Machine Learning algorithms; includes graphical and statistical representation to simplify modeling Machine Learning.