Machine Learning Crash Course for Engineers

✍ Scribed by Eklas Hossain


Publisher
Springer
Year
2024
Tongue
English
Leaves
465
Category
Library


✦ Synopsis


Machine Learning Crash Course for Engineers is a reader-friendly introductory guide to machine learning algorithms and techniques for students, engineers, and other busy technical professionals. The book focuses on the application side of machine learning, progressing systematically from basic theory to advanced topics, with worked-out Python programming examples throughout. It offers highly illustrated, step-by-step demonstrations that allow readers to implement machine learning models to solve real-world problems. This powerful tutorial is an excellent resource for those who need to acquire a solid foundational understanding of machine learning quickly.

✦ Table of Contents


Preface
Why This Book?
Acknowledgments
Contents
About the Author
1 Introduction to Machine Learning
1.1 Introduction
1.2 What Is Machine Learning?
1.2.1 Machine Learning Workflow
1.2.2 What Is Not Machine Learning?
1.2.3 Machine Learning Jargon
1.2.3.1 Features
1.2.3.2 Target Variable
1.2.3.3 Optimization Problem
1.2.3.4 Objective Function
1.2.3.5 Cost Function
1.2.3.6 Loss Function
1.2.3.7 Comparison Between Loss Function, Cost Function, and Objective Function
1.2.3.8 Algorithm, Model/Hypothesis, and Technique
1.2.4 Difference Between Data Science, Machine Learning, Artificial Intelligence, Deep Learning
1.3 Historical Development of Machine Learning
1.4 Why Machine Learning?
1.4.1 Motivation
1.4.2 Purpose
1.4.3 Importance
1.5 Prerequisite Knowledge to Learn Machine Learning
1.5.1 Linear Algebra
1.5.1.1 Linear Equations
1.5.1.2 Tensor and Tensor Rank
1.5.2 Statistics
1.5.2.1 Measures of Central Tendency
1.5.2.2 Standard Deviation
1.5.2.3 Correlation
1.5.2.4 Outliers
1.5.2.5 Histogram
1.5.2.6 Errors
1.5.3 Probability Theory
1.5.3.1 Probability Distribution
1.5.3.2 Gaussian or Normal Distribution
1.5.3.3 Bernoulli Distribution
1.5.3.4 Central Limit Theorem
1.5.4 Calculus
1.5.4.1 Derivative and Slope
1.5.4.2 Partial Derivatives
1.5.4.3 Maxima and Minima
1.5.4.4 Differential Equation
1.5.5 Numerical Analysis
1.5.5.1 Newton–Raphson Method
1.5.5.2 Gauss–Seidel Method
1.5.6 Gradient Descent
1.5.7 Activation Functions
1.5.8 Programming
1.5.8.1 Variables and Constants
1.5.8.2 Data Types
1.5.8.3 Conditional Statements
1.5.8.4 Loops
1.5.8.5 Array
1.5.8.6 Vector
1.5.8.7 Functions
1.6 Programming Languages and Associated Tools
1.6.1 Why Python?
1.6.2 Installation
1.6.3 Creating the Environment
1.6.3.1 Creating the Environment in Windows
1.6.3.2 Creating the Environment in MacOS
1.6.3.3 Installing Necessary Libraries
1.7 Applications of Machine Learning
1.8 Conclusion
1.9 Key Messages from This Chapter
1.10 Exercise
References
2 Evaluation Criteria and Model Selection
2.1 Introduction
2.2 Error Criteria
2.2.1 MSE
2.2.2 RMSE
2.2.3 MAE
2.2.4 MAPE
2.2.5 Huber Loss
2.2.6 Cross-Entropy Loss
2.2.7 Hinge Loss
2.3 Distance Metrics
2.3.1 Euclidean Distance
2.3.2 Cosine Similarity and Cosine Distance
2.3.3 Manhattan Distance
2.3.4 Chebyshev Distance
2.3.5 Minkowski Distance
2.3.6 Hamming Distance
2.3.7 Jaccard Similarity and Jaccard Distance
2.4 Confusion Matrix
2.4.1 Accuracy
2.4.2 Precision and Recall
2.4.2.1 Precision
2.4.2.2 Recall
2.4.3 F1 Score
2.5 Model Parameter and Hyperparameter
2.6 Hyperparameter Space
2.7 Hyperparameter Tuning and Model Optimization
2.7.1 Manual Search
2.7.2 Exhaustive Grid Search
2.7.3 Halving Grid Search
2.7.4 Random Search
2.7.5 Halving Random Search
2.7.6 Bayesian Optimization
2.7.7 Gradient-Based Optimization
2.7.8 Evolutionary Algorithm
2.7.9 Early Stopping
2.7.10 Python Coding Example for Hyperparameter Tuning Techniques
2.7.10.1 Manual Search
2.7.10.2 Grid Search
2.7.10.3 Halving Grid Search
2.7.10.4 Random Search
2.7.10.5 Halving Random Search
2.8 Bias and Variance
2.8.1 Bias–Variance Trade-off
2.9 Overfitting and Underfitting
2.10 Model Selection
2.10.1 Probabilistic Methods
2.10.1.1 Akaike Information Criterion (AIC)
2.10.1.2 Bayesian Information Criterion (BIC)
2.10.1.3 Minimum Description Length (MDL)
2.10.2 Resampling Methods
2.10.2.1 Random Train/Test Splits
2.10.2.2 Cross-Validation
2.10.2.3 Bootstrap
2.11 Conclusion
2.12 Key Messages from This Chapter
2.13 Exercise
References
3 Machine Learning Algorithms
3.1 Introduction
3.2 Datasets
3.2.1 Data Wrangling
3.2.1.1 Preprocessing
3.2.1.2 Missing Data
3.2.1.3 Imputation
3.2.2 Feature Scaling
3.2.2.1 Standardization
3.2.2.2 Normalization
3.2.2.3 Data Augmentation
3.2.3 Data Types
3.2.3.1 Sequential vs. Non-sequential Data Type
3.2.3.2 Stationary vs. Non-stationary Data Type
3.2.4 Data Splitting
3.3 Categorization of Machine Learning Algorithms
3.4 Supervised Learning
3.4.1 Regression
3.4.1.1 Simple Linear Regression
3.4.1.2 LASSO Regression
3.4.1.3 LASSO LARS Regression
3.4.1.4 Ridge Regression
3.4.1.5 Elastic Net Regression
3.4.1.6 Support Vector Regression
3.4.1.7 Decision Tree Regression
3.4.1.8 Random Forest Regression
3.4.1.9 Bayesian Ridge Regression
3.4.1.10 Multiple Linear Regression
3.4.1.11 Polynomial Regression
3.4.2 Classification
3.4.2.1 Logistic Regression
3.4.2.2 k-Nearest Neighbor (KNN)
3.4.2.3 Support Vector Classification
3.4.2.4 Naive Bayes
3.4.2.5 Gaussian Naive Bayes
3.4.2.6 Decision Tree Classification
3.4.2.7 Random Forest Classification
3.5 Deep Learning
3.5.1 What Is a Neuron?
3.5.2 Backpropagation and Gradient Descent
3.5.3 Artificial Neural Network (ANN)
3.5.4 Convolutional Neural Network
3.5.4.1 Convolution Layer
3.5.4.2 Pooling Layer
3.5.4.3 Activation Functions
3.5.4.4 Dropout
3.5.4.5 Batch Normalization
3.5.4.6 Optimizers
3.5.4.7 Fully Connected Layer
3.5.4.8 Why Is CNN So Popular?
3.5.4.9 State-of-the-Art Model Architecture
3.5.5 Recurrent Neural Network (RNN)
3.5.6 Generative Adversarial Network (GAN)
3.5.7 Transfer Learning
3.6 Time Series Forecasting
3.6.1 ARIMA
3.6.1.1 The Auto-regressive Process
3.6.1.2 The Moving Average Process
3.6.1.3 The Differencing Process
3.6.1.4 Determining the Order
3.6.2 Seasonal ARIMA
3.6.3 Long Short-Term Memory (LSTM)
3.7 Unsupervised Learning
3.7.1 Clustering
3.7.1.1 K-Means Clustering
3.7.1.2 Affinity Propagation Clustering
3.7.1.3 Mean-Shift Clustering
3.7.1.4 DBSCAN: Density-Based Spatial Clustering of Applications with Noise
3.7.2 Dimensionality Reduction
3.7.2.1 Principal Component Analysis (PCA)
3.7.2.2 Linear Discriminant Analysis (LDA)
3.7.2.3 Singular Value Decomposition (SVD)
3.7.3 Association Learning
3.7.3.1 Apriori Algorithm
3.7.3.2 ECLAT Algorithm
3.8 Semi-supervised Learning
3.8.1 Semi-supervised GAN (SGAN)
3.8.2 Semi-supervised Classification
3.9 Reinforcement Learning
3.9.1 Multi-armed Bandit Problem
3.9.1.1 The Greedy Strategy
3.9.1.2 The Epsilon (ε)-Greedy Strategy
3.9.1.3 Upper Confidence Bound (UCB)
3.9.1.4 Thompson Sampling
3.9.1.5 Q-Learning
3.10 Conclusion
3.11 Key Messages from This Chapter
3.12 Exercise
References
4 Applications of Machine Learning: Signal Processing
4.1 Introduction
4.2 Signal and Signal Processing
4.3 Image Classification
4.3.1 Image Classification Workflow
4.3.2 Applications of Image Classification
4.3.3 Challenges of Image Classification
4.3.4 Implementation of Image Classification
4.4 Neural Style Transfer (NST)
4.4.1 NST Applications
4.5 Feature Extraction or Dimensionality Reduction
4.6 Anomaly or Outlier Detection
4.6.1 How Does It Work?
4.6.1.1 Standard Deviation
4.6.1.2 Inter-quartile Range (IQR)
4.6.1.3 Isolation Forest
4.6.1.4 Local Outlier Factor (LOF)
4.6.2 Applications of Anomaly Detection
4.6.3 Challenges of Anomaly Detection
4.6.4 Implementation of Anomaly Detection
4.7 Adversarial Input Attack
4.8 Malicious Input Detection
4.9 Natural Language Processing
4.9.1 How Does NLP Work?
4.9.2 Applications of NLP
4.9.3 Challenges of NLP
4.9.4 Implementation of NLP
4.10 Conclusion
4.11 Key Messages from This Chapter
4.12 Exercise
References
5 Applications of Machine Learning: Energy Systems
5.1 Introduction
5.2 Load Forecasting
5.3 Fault/Anomaly Analysis
5.3.1 Different Types of Electrical Faults
5.3.2 Fault Detection
5.3.3 Fault Classification
5.3.4 Partial Discharge Detection
5.4 Future Trend Prediction in Renewable Energy Systems
5.4.1 Solar PV Installed Capacity Prediction
5.4.2 Wind Power Output Prediction
5.5 Reactive Power Control and Power Factor Correction
5.6 Conclusion
5.7 Key Messages from This Chapter
5.8 Exercise
References
6 Applications of Machine Learning: Robotics
6.1 Introduction
6.2 Computer Vision and Machine Vision
6.2.1 Object Tracking
6.2.1.1 The MTCNN Architecture
6.2.1.2 Face Tracking Example Using MTCNN
6.2.2 Object Recognition/Detection
6.2.2.1 Applications of Object Recognition/Detection
6.2.2.2 Self-Driving Car: Traffic Sign
6.2.3 Image Segmentation
6.2.3.1 The U-Net Architecture
6.2.3.2 Aerial Semantic Segmentation Example
6.3 Robot: A Line Follower Data Predictor Using Generative Adversarial Network (GAN)
6.4 Conclusion
6.5 Key Messages
6.6 Exercise
References
7 State of the Art of Machine Learning
7.1 Introduction
7.2 State-of-the-Art Machine Learning
7.2.1 Graph Neural Network
7.2.1.1 Applications of GNN
7.2.2 EfficientNet
7.2.3 Inception v3
7.2.4 YOLO
7.2.4.1 Features of YOLO
7.2.4.2 YOLO Concepts
7.2.4.3 YOLO Variants
7.2.4.4 YOLOv3 Implementation for Object Detection
7.2.5 Facebook Prophet
7.2.5.1 Features of Facebook Prophet
7.2.6 ChatGPT
7.2.6.1 Applications of ChatGPT
7.2.6.2 Limitations of ChatGPT
7.3 AI/ML Security Challenges and Possible Solutions
7.4 AI/ML Hardware Challenges and Future Potential
7.4.1 Quantization
7.4.1.1 Affine Quantization
7.4.1.2 Scale Quantization
7.4.2 Weight Pruning
7.4.3 Implementation of Quantization and Pruning
7.5 Multi-domain Learning
7.5.1 Transfer Learning
7.5.2 Domain Adaptation
7.6 Artificial Intelligence
7.6.1 The Turing Test
7.6.2 Limitations of AI and Solutions
7.6.3 Future Possibilities of AI
7.7 Conclusion
7.8 Key Messages
7.9 Exercise
References
Answer Keys to Chapter Exercises
Chapter 1
Chapter 2
Chapter 3
Chapter 4
Chapter 5
Chapter 6
Chapter 7
Index
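
As a taste of the book's level, the prerequisite chapter covers gradient descent (Sect. 1.5.6) alongside Python basics. The snippet below is our own minimal sketch of that idea, not an excerpt from the book:

```python
# Minimal sketch of gradient descent: repeatedly step
# against the derivative until reaching a minimum.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a 1-D function given its derivative `grad`."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)   # move opposite the slope
    return x

# f(x) = (x - 3)^2 has derivative f'(x) = 2(x - 3); its minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # → 3.0
```

The book develops this idea far beyond the 1-D case, applying it to training neural networks in the deep learning chapters.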


📜 SIMILAR VOLUMES


Programming: 4 Books in 1: Python Progra
✍ James Deep 📂 Library 📅 2020 🌐 English

Have you always wanted to jump into the exciting world of Python programming and Machine Learning but didn't know where to start? If so, then this book collection was made just for you. Python is one of the most used programming languages in…


Data Analysis with Machine Learning for
✍ Chandril Ghosh 📂 Library 📅 2022 🏛 Springer 🌐 English

The power of data drives the digital economy of the 21st century. It has been argued that data is as vital a resource as oil was during the industrial revolution. An upward trend in the number of research publications using machine learning in some of the top journals in combination with an…