The Deep Learning Workshop: Learn the skills you need to develop your own next-generation deep learning models with TensorFlow and Keras
By Mirza Rahim Baig, Thomas V. Joseph, Nipun Sadvilkar, Mohan Kumar Silaparasetty, Anthony So
- Publisher: Packt Publishing
- Year: 2020
- Language: English
- Pages: 473
- Category: Library
Synopsis
Take a hands-on approach to understanding deep learning and build smart applications that can recognize images and interpret text
Key Features
- Understand how to implement deep learning with TensorFlow and Keras
- Learn the fundamentals of computer vision and image recognition
- Study the architecture of different neural networks
Book Description
Are you fascinated by how deep learning powers intelligent applications such as self-driving cars, virtual assistants, facial recognition devices, and chatbots to process data and solve complex problems? Whether you are familiar with machine learning or are new to this domain, The Deep Learning Workshop will make it easy for you to understand deep learning with the help of interesting examples and exercises throughout.
The book starts by highlighting the relationship between deep learning, machine learning, and artificial intelligence and helps you get comfortable with the TensorFlow 2.0 programming structure using hands-on exercises. You'll understand neural networks, the structure of a perceptron, and how to use TensorFlow to create and train models. The book will then let you explore the fundamentals of computer vision by performing image recognition exercises with convolutional neural networks (CNNs) using Keras. As you advance, you'll be able to make your model more powerful by implementing text embedding and sequencing the data using popular deep learning solutions. Finally, you'll get to grips with bidirectional recurrent neural networks (RNNs) and build generative adversarial networks (GANs) for image synthesis.
By the end of this deep learning book, you'll have learned the skills essential for building deep learning models with TensorFlow and Keras.
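The description above mentions understanding the structure of a perceptron and training models with TensorFlow. As a rough, illustrative sketch (not taken from the book, and using arbitrary hand-picked weights), a single perceptron's forward pass can be written in plain Python:

```python
# A minimal, illustrative perceptron forward pass: weighted sum of
# inputs plus a bias, passed through a step activation function.

def step(net):
    """Step activation: outputs 1 when the net input is non-negative, else 0."""
    return 1 if net >= 0 else 0

def perceptron(inputs, weights, bias):
    """Compute the net input (weighted sum + bias) and apply the activation."""
    net = sum(w * x for w, x in zip(weights, inputs))
    return step(net + bias)

# Example: with these hand-picked weights and bias, the perceptron
# behaves like a logical AND gate on binary inputs.
print(perceptron([1, 1], [0.5, 0.5], -0.7))  # 1
print(perceptron([1, 0], [0.5, 0.5], -0.7))  # 0
```

In practice, the book trains such models with TensorFlow and Keras rather than hand-picking weights; this snippet only illustrates the basic structure an input layer, weights, bias, net input function, and activation form together.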
What you will learn
- Understand how deep learning, machine learning, and artificial intelligence are different
- Develop multilayer deep neural networks with TensorFlow
- Implement deep neural networks for multiclass classification using Keras
- Train CNN models for image recognition
- Handle sequence data and use it in conjunction with RNNs
- Build a GAN to generate high-quality synthesized images
Who this book is for
If you are interested in machine learning and want to create and train deep learning models using TensorFlow and Keras, this workshop is for you. A solid understanding of Python and its packages, along with basic machine learning concepts, will help you to learn the topics quickly.
Table of Contents
- Building Blocks of Deep Learning
- Neural Networks
- Image Classification with Convolutional Neural Networks (CNNs)
- Deep Learning for Text - Embeddings
- Deep Learning for Sequences
- LSTMs, GRUs, and Advanced RNNs
- Generative Adversarial Networks
Detailed Table of Contents
Cover
FM
Copyright
Table of Contents
Preface
Chapter 1: Building Blocks of Deep Learning
Introduction
AI, Machine Learning, and Deep Learning
Machine Learning
Deep Learning
Using Deep Learning to Classify an Image
Pre-Trained Models
The Google Text-to-Speech API
Prerequisite Packages for the Demo
Exercise 1.01: Image and Speech Recognition Demo
Deep Learning Models
The Multi-Layer Perceptron
Convolutional Neural Networks
Recurrent Neural Networks
Generative Adversarial Networks
Introduction to TensorFlow
Constants
Variables
Defining Functions in TensorFlow
Exercise 1.02: Implementing a Mathematical Equation
Linear Algebra with TensorFlow
Exercise 1.03: Matrix Multiplication Using TensorFlow
The reshape Function
Exercise 1.04: Reshaping Matrices Using the reshape() Function in TensorFlow
The argmax Function
Exercise 1.05: Implementing the argmax() Function
Optimizers
Exercise 1.06: Using an Optimizer for a Simple Linear Regression
Activity 1.01: Solving a Quadratic Equation Using an Optimizer
Summary
Chapter 2: Neural Networks
Introduction
Neural Networks and the Structure of Perceptrons
Input Layer
Weights
Bias
Net Input Function
Activation Function (G)
Perceptrons in TensorFlow
Exercise 2.01: Perceptron Implementation
Training a Perceptron
Perceptron Training Process in TensorFlow
Exercise 2.02: Perceptron as a Binary Classifier
Multiclass Classifier
The Softmax Activation Function
Exercise 2.03: Multiclass Classification Using a Perceptron
MNIST Case Study
Exercise 2.04: Classifying Handwritten Digits
Keras as a High-Level API
Exercise 2.05: Binary Classification Using Keras
Multilayer Neural Network or Deep Neural Network
ReLU Activation Function
Exercise 2.06: Multilayer Binary Classifier
Exercise 2.07: Deep Neural Network on MNIST Using Keras
Exploring the Optimizers and Hyperparameters of Neural Networks
Gradient Descent Optimizers
The Vanishing Gradient Problem
Hyperparameter Tuning
Overfitting and Dropout
Activity 2.01: Build a Multilayer Neural Network to Classify Sonar Signals
Summary
Chapter 3: Image Classification with Convolutional Neural Networks (CNNs)
Introduction
Digital Images
Image Processing
Convolution Operations
Exercise 3.01: Implementing a Convolution Operation
Stride
Padding
Convolutional Neural Networks
Pooling Layers
CNNs with TensorFlow and Keras
Exercise 3.02: Recognizing Handwritten Digits (MNIST) with CNN Using Keras
Data Generator
Exercise 3.03: Classifying Cats versus Dogs with Data Generators
Data Augmentation
Horizontal Flipping
Vertical Flipping
Zooming
Horizontal Shifting
Vertical Shifting
Rotating
Shearing
Exercise 3.04: Image Classification (CIFAR-10) with Data Augmentation
Activity 3.01: Building a Multiclass Classifier Based on the Fashion MNIST Dataset
Saving and Restoring Models
Saving the Entire Model
Saving the Architecture Only
Saving the Weights Only
Transfer Learning
Fine-Tuning
Activity 3.02: Fruit Classification with Transfer Learning
Summary
Chapter 4: Deep Learning for Text – Embeddings
Introduction
Deep Learning for Natural Language Processing
Getting Started with Text Data Handling
Text Preprocessing
Tokenization
Normalizing Case
Removing Punctuation
Removing Stop Words
Exercise 4.01: Tokenizing, Case Normalization, Punctuation, and Stop Word Removal
Stemming and Lemmatization
Exercise 4.02: Stemming Our Data
Beyond Stemming and Lemmatization
Downloading Text Corpora Using NLTK
Activity 4.01: Text Preprocessing of the 'Alice in Wonderland' Text
Text Representation Considerations
Classical Approaches to Text Representation
One-Hot Encoding
Exercise 4.03: Creating One-Hot Encoding for Our Data
Term Frequencies
The TF-IDF Method
Exercise 4.04: Document-Term Matrix with TF-IDF
Summarizing the Classical Approaches
Distributed Representation for Text
Word Embeddings and Word Vectors
word2vec
Training Our Own Word Embeddings
Semantic Regularities in Word Embeddings
Exercise 4.05: Vectors for Phrases
Effect of Parameters – "size" of the Vector
Effect of Parameters – "window size"
Skip-gram versus CBOW
Effect of Training Data
Exercise 4.06: Training Word Vectors on Different Datasets
Using Pre-Trained Word Vectors
Bias in Embeddings – A Word of Caution
Other Notable Approaches to Word Embeddings
Activity 4.02: Text Representation for Alice in Wonderland
Summary
Chapter 5: Deep Learning for Sequences
Introduction
Working with Sequences
Time Series Data – Stock Price Prediction
Exercise 5.01: Visualizing Our Time-Series Data
Recurrent Neural Networks
Loops – An Integral Part of RNNs
Exercise 5.02: Implementing the Forward Pass of a Simple RNN Using TensorFlow
The Flexibility and Versatility of RNNs
Preparing the Data for Stock Price Prediction
Parameters in an RNN
Training RNNs
Exercise 5.03: Building Our First Plain RNN Model
Model Training and Performance Evaluation
1D Convolutions for Sequence Processing
Exercise 5.04: Building a 1D Convolution-Based Model
Performance of 1D Convnets
Using 1D Convnets with RNNs
Exercise 5.05: Building a Hybrid (1D Convolution and RNN) Model
Activity 5.01: Using a Plain RNN Model to Predict IBM Stock Prices
Summary
Chapter 6: LSTMs, GRUs, and Advanced RNNs
Introduction
Long-Range Dependence/Influence
The Vanishing Gradient Problem
Sequence Models for Text Classification
Loading Data
Staging and Preprocessing Our Data
The Embedding Layer
Building the Plain RNN Model
Exercise 6.01: Building and Training an RNN Model for Sentiment Classification
Making Predictions on Unseen Data
LSTMs, GRUs, and Other Variants
LSTMs
Parameters in an LSTM
Exercise 6.02: LSTM-Based Sentiment Classification Model
LSTM versus Plain RNNs
Gated Recurrent Units
Exercise 6.03: GRU-Based Sentiment Classification Model
LSTM versus GRU
Bidirectional RNNs
Exercise 6.04: Bidirectional LSTM-Based Sentiment Classification Model
Stacked RNNs
Exercise 6.05: Stacked LSTM-Based Sentiment Classification Model
Summarizing All the Models
Attention Models
More Variants of RNNs
Activity 6.01: Sentiment Analysis of Amazon Product Reviews
Summary
Chapter 7: Generative Adversarial Networks
Introduction
Key Components of Generative Adversarial Networks
Problem Statement – Generating a Distribution Similar to a Given Mathematical Function
Process 1 – Generating Real Data from the Known Function
Exercise 7.01: Generating a Data Distribution from a Known Function
Process 2 – Creating a Basic Generative Network
Building the Generative Network
Sequential()
Kernel Initializers
Dense Layers
Activation Functions
Exercise 7.02: Building a Generative Network
Setting the Stage for the Discriminator Network
Process 3 – Discriminator Network
Implementing the Discriminator Network
Function to Generate Real Samples
Functions to Generate Fake Samples
Building the Discriminator Network
Training the Discriminator Network
Exercise 7.03: Implementing the Discriminator Network
Process 4 – Implementing the GAN
Integrating All the Building Blocks
Process for Building the GAN
The Training Process
Exercise 7.04: Implementing the GAN
Deep Convolutional GANs
Building Blocks of DCGANs
Generating Handwritten Images Using DCGANs
The Training Process
Exercise 7.05: Implementing the DCGAN
Analysis of Sample Plots
Common Problems with GANs
Mode Collapse
Convergence Failure
Activity 7.01: Implementing a DCGAN for the MNIST Fashion Dataset
Summary
Appendix
Index