Deep Learning: A Practical Introduction
✍ Written by Manel Martinez-Ramon, Meenu Ajith, Aswathy Rajendra Kurup
- Publisher
- Wiley
- Year
- 2024
- Language
- English
- Pages
- 416
- Edition
- 1
- Category
- Library
Free to access; no payment or registration required. For personal study only.
✦ Synopsis
An engaging and accessible introduction to deep learning perfect for students and professionals
In Deep Learning: A Practical Introduction, a team of distinguished researchers delivers a book complete with coverage of the theoretical and practical elements of deep learning. The book includes extensive examples, end-of-chapter exercises, homework, exam material, and a GitHub repository containing code and data for all provided examples.
Combining contemporary deep learning theory with state-of-the-art tools, the chapters are structured to maximize accessibility for both beginning and intermediate students. The authors include coverage of TensorFlow, Keras, and PyTorch. Readers will also find:
- Thorough introductions to deep learning and deep learning tools
- Comprehensive explorations of convolutional neural networks, including discussions of their elements, operation, training, and architectures
- Practical discussions of recurrent neural networks and unsupervised approaches to deep learning
- In-depth treatments of generative adversarial networks as well as deep Bayesian neural networks
Perfect for undergraduate and graduate students studying computer vision, computer science, artificial intelligence, and neural networks, Deep Learning: A Practical Introduction will also benefit practitioners and researchers in the fields of deep learning and machine learning in general.
✦ Table of Contents
Front Matter
Title Page
Copyright
Contents
About the Authors
Foreword
Preface
Acknowledgment
About the Companion Website
Chapter 1
1.1 Introduction
1.2 The Concept of Neuron
1.2.1 The Perceptron
1.2.2 The Perceptron (Training) Rule
1.2.3 The Minimum Mean Square Error Training Criterion
1.2.4 The Least Mean Squares Algorithm
1.3 Structure of a Neural Network
1.3.1 The Multilayer Perceptron
1.3.2 Multidimensional Array Multiplications
1.4 Activations
1.5 Training a Multilayer Perceptron
1.5.1 Maximum Likelihood Criterion
1.5.2 Activations and Likelihood Functions
1.5.2.1 Logistic Activation for Binary Classification
1.5.2.2 Softmax Activation for Multiclass Classification
1.5.2.3 Gaussian Activation in Regression
1.5.3 The Backpropagation Algorithm
1.5.3.1 Gradient with Respect to the Output Weights
1.5.3.2 Gradient with Respect to Hidden Layer Weights
1.5.4 Summary of the BP Algorithm
1.6 Conclusion
Problems
Chapter 2
2.1 Introduction
2.2 Generalization and Overfitting
2.2.1 Basic Weight Initializations
2.2.2 Activation Aware Initializations
2.2.3 MiniBatch Gradient Descent
2.3 Regularization Techniques
2.3.1 L1 and L2 Regularization
2.3.2 Dropout
2.3.3 Early Stopping
2.3.4 Data Augmentation
2.4 Normalization Techniques
2.5 Optimizers
2.5.1 Momentum Optimization
2.5.2 Nesterov‐Accelerated Gradient
2.5.3 AdaGrad
2.5.4 RMSProp
2.5.5 Adam
2.5.6 Adamax
2.6 Conclusion
Problems
Chapter 3
3.1 Python: An Overview
3.1.1 Variables
3.1.2 Statements, Indentation, and Comments
3.1.3 Conditional Statements
3.1.4 Loops
3.1.5 Functions
3.1.6 Objects and Classes
3.2 NumPy
3.2.1 Installation and Importing NumPy Package
3.2.2 NumPy Array
3.2.3 Creating Different Types of Arrays
3.2.4 Manipulating Array Shape
3.2.5 Stacking and Splitting NumPy Arrays
3.2.6 Indexing and Slicing
3.2.7 Arithmetic Operations and Mathematical Functions
3.3 Matplotlib
3.3.1 Plotting
3.3.1.1 Functional Method
3.3.1.2 Object Oriented Method
3.3.2 Customized Plotting
3.3.3 Two‐dimensional Plotting
3.3.3.1 Bar Plot
3.3.3.2 Histogram
3.3.3.3 Pie Plot
3.3.3.4 Scatter Plot
3.3.3.5 Quiver Plot
3.3.3.6 Contour Plot
3.3.3.7 Box Plot
3.3.3.8 Violin Plot
3.3.4 Three‐dimensional Plotting
3.3.4.1 3D Contour
3.3.4.2 3D Surface
3.3.4.3 3D Wireframe
3.4 Scipy
3.4.1 Data Input–Output Using Scipy
3.4.2 Clustering Methods
3.4.3 Constants
3.4.4 Linear Algebra and Integration Routines
3.4.5 Optimization
3.4.6 Interpolation
3.4.7 Image Processing
3.4.8 Special Functions
3.5 Scikit‐Learn
3.5.1 Scikit‐Learn API
3.5.1.1 Estimator Interface
3.5.1.2 Predictor Interface
3.5.1.3 Transformer Interface
3.5.2 Loading Datasets
3.5.3 Data Preprocessing
3.5.4 Feature Selection
3.5.5 Supervised and Unsupervised Learning Models
3.5.6 Model Selection and Evaluation
3.6 Pandas
3.6.1 Pandas Data Structures
3.6.1.1 Series
3.6.1.2 Dataframe
3.6.2 Data Selection
3.6.3 Data Manipulation
3.6.3.1 Sorting
3.6.3.2 Grouping
3.6.4 Handling Missing Data
3.6.5 Input–Output Tools
3.6.6 Data Information Retrieval
3.6.7 Data Operations
3.6.8 Data Visualization
3.7 Seaborn
3.7.1 Seaborn Datasets
3.7.2 Plotting with Seaborn
3.7.2.1 Univariate Plots
3.7.2.2 Bivariate Plots
3.7.2.3 Multivariate Plots
3.7.3 Additional Plotting Functions
3.7.3.1 Correlation Plots
3.7.3.2 Point Plots
3.7.3.3 Cat Plots
3.8 Python Libraries for NLP
3.8.1 Natural Language Toolkit (NLTK)
3.8.2 SpaCy
3.8.3 NLP Techniques
3.8.3.1 Tokenization
3.8.3.2 Stemming
3.8.3.3 Lemmatization
3.8.3.4 Stop Words
3.9 TensorFlow
3.9.1 Introduction
3.9.2 Elements of Tensorflow
3.9.3 TensorFlow Pipeline
3.10 Keras
3.10.1 Introduction
3.10.2 Elements of Keras
3.10.2.1 Models
3.10.2.2 Layers
3.10.2.3 Core Modules
3.10.3 Keras Workflow
3.11 Pytorch
3.11.1 Introduction
3.11.2 Elements of PyTorch
3.11.2.1 PyTorch Tensors
3.11.2.2 PyTorch Variables
3.11.2.3 Dynamic Computational Graphs
3.11.2.4 Modules
3.11.3 Workflow of Pytorch
3.12 Conclusion
Problems
Chapter 4
4.1 Introduction
4.2 Elements of a Convolutional Neural Network
4.2.1 Overall Structure of a CNN
4.2.2 Convolutions
4.2.3 Convolutions in Two Dimensions
4.2.4 Padding
4.2.5 Stride
4.2.6 Pooling
4.3 Training a CNN
4.3.1 Formulation of the Convolution Layer in a CNN
4.3.2 Backpropagation of a Convolution Layer
4.3.3 Forward Step in a CNN
4.3.4 Backpropagation in the Dense Section of a CNN
4.3.5 Backpropagation of the Convolutional Section of a CNN
4.4 Extensions of the CNN
4.4.1 AlexNet
4.4.2 VGG
4.4.3 Inception
4.4.4 ResNet
4.4.5 Xception
4.4.6 MobileNet
4.4.6.1 Depthwise Separable Convolutions
4.4.6.2 Width Multiplier
4.4.6.3 Resolution Multiplier
4.4.7 DenseNet
4.4.8 EfficientNet
4.4.9 Transfer Learning for CNN Extensions
4.4.10 Comparisons Among CNN Extensions
4.5 Conclusion
Problems
Chapter 5
5.1 Introduction
5.2 RNN Architecture
5.2.1 Structure of the Basic RNN
5.2.2 Input–Output Configurations
5.3 Training an RNN
5.3.1 Gradient with Respect to the Output Weights
5.3.2 Gradient with Respect to the Input Weights
5.3.3 Gradient with Respect to the Hidden State Weights
5.3.4 Summary of the Backpropagation Through Time in an RNN
5.4 Long‐Term Dependencies: Vanishing and Exploding Gradients
5.5 Deep RNN
5.6 Bidirectional RNN
5.7 Long Short‐Term Memory Networks
5.7.1 LSTM Gates
5.7.2 LSTM Internal State
5.7.3 Hidden State and Output of the LSTM
5.7.4 LSTM Backpropagation
5.7.5 Machine Translation with LSTM
5.7.6 Beam Search in Sequence to Sequence Translation
5.8 Gated Recurrent Units
5.9 Conclusion
Problems
Chapter 6
6.1 Introduction
6.2 Attention Mechanisms
6.2.1 The Nadaraya–Watson Attention Mechanism
6.2.2 The Bahdanau Attention Mechanism
6.2.3 Attention Pooling
6.2.4 Representation by Self‐Attention
6.2.5 Training the Self‐Attention Parameters
6.2.6 Multi‐head Attention
6.2.7 Positional Encoding
6.3 Transformers
6.4 BERT
6.4.1 BERT Architecture
6.4.2 BERT Pre‐training
6.4.3 BERT Fine‐Tuning
6.4.4 BERT for Different NLP Tasks
6.5 GPT‐2
6.5.1 Language Modeling
6.6 Vision Transformers
6.6.1 Comparison between ViTs and CNNs
6.7 Conclusion
Problems
Chapter 7
7.1 Introduction
7.2 Restricted Boltzmann Machines
7.2.1 Boltzmann Machines
7.2.2 Training a Boltzmann Machine
7.2.3 The Restricted Boltzmann Machine
7.3 Deep Belief Networks
7.3.1 Training a DBN
7.4 Autoencoders
7.4.1 Autoencoder Framework
7.5 Undercomplete Autoencoder
7.6 Sparse Autoencoder
7.7 Denoising Autoencoders
7.7.1 Denoising Autoencoder Algorithm
7.8 Convolutional Autoencoder
7.9 Variational Autoencoders
7.9.1 Latent Variable Inference: Lower Bound Estimation Approach
7.9.2 Reparameterization Trick
7.9.3 Illustration: Variational Autoencoder Implementation
7.10 Conclusion
Problems
Chapter 8
8.1 Introduction
8.2 Elements of GAN
8.2.1 Generator
8.2.2 Discriminator
8.3 Training a GAN
8.4 Wasserstein GAN
8.5 DCGAN
8.5.1 DCGAN Training and Outcomes Highlights
8.6 cGAN
8.6.1 cGAN Training and Outcomes Highlights
8.7 CycleGAN
8.7.1 CycleGAN Training and Outcomes Highlights
8.7.2 Applications of CycleGAN
8.8 StyleGAN
8.8.1 StyleGAN Properties and Outcome Highlights
8.9 StackGAN
8.9.1 StackGAN Training and Outcomes Highlights
8.10 Diffusion Models
8.10.1 Forward Diffusion Process
8.10.2 Reverse Diffusion Process
8.10.3 Diffusion Process Training
8.11 Conclusion
Problems
Chapter 9
9.1 Introduction
9.2 Bayesian Models
9.2.1 The Bayes' Rule
9.2.2 Priors as Regularization Criteria
9.3 Bayesian Inference Methods for Deep Learning
9.3.1 Markov Chain Monte Carlo Methods
9.3.2 Hamiltonian MCMC
9.3.3 Variational Inference
9.3.4 Bayes by Backpropagation
9.4 Conclusion
Problems
Bibliography
Index
📜 SIMILAR VOLUMES
Practical Deep Learning teaches total beginners how to build the datasets and models needed to train neural networks for your own DL projects. If you’ve been curious about machine learning but didn’t know where to start, this is the book you’ve been waiting for. Focusing on the subfield of machine…
This book is for people with no experience with machine learning who are looking for an intuition-based, hands-on introduction to deep learning using Python. Deep Learning for Complete Beginners: A Python-Based Introduction is for complete beginners…