Generative Deep Learning: Teaching Machines to Paint, Write, Compose, and Play
by David Foster
- Publisher: O'Reilly Media
- Year: 2019
- Language: English
- Pages: 330
- Edition: 1
- Category: Library
Synopsis
Generative modeling is one of the hottest topics in AI. It's now possible to teach a machine to excel at human endeavors such as painting, writing, and composing music. With this practical book, machine learning engineers and data scientists will discover how to re-create some of the most impressive examples of generative deep learning models, such as variational autoencoders, generative adversarial networks (GANs), encoder-decoder models, and world models.
Author David Foster demonstrates the inner workings of each technique, starting with the basics of deep learning before advancing to some of the most cutting-edge algorithms in the field. Through tips and tricks, you'll understand how to make your models learn more efficiently and become more creative.
• Discover how variational autoencoders can change facial expressions in photos
• Build practical GAN examples from scratch, including CycleGAN for style transfer and MuseGAN for music generation
• Create recurrent generative models for text generation and learn how to improve the models using attention
• Understand how generative models can help agents to accomplish tasks within a reinforcement learning setting
• Explore the architecture of the Transformer (BERT, GPT-2) and image generation models such as ProGAN and StyleGAN
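To give a flavor of the simplest kind of model the book starts from (Chapter 1's Naive Bayes example), here is a minimal illustrative sketch, not code from the book: it fits independent per-feature probabilities to binary data, then samples new examples under the Naive Bayes independence assumption.

```python
import random

def fit_naive_bayes(samples):
    """Estimate independent per-feature probabilities p(x_i = 1)
    from binary feature vectors, with Laplace smoothing."""
    n = len(samples)
    d = len(samples[0])
    return [(sum(s[i] for s in samples) + 1) / (n + 2) for i in range(d)]

def generate(probs, rng=random):
    """Sample a new binary vector feature by feature, assuming
    each feature is independent (the Naive Bayes assumption)."""
    return [1 if rng.random() < p else 0 for p in probs]

# Toy dataset: four binary "observations" with three features each
data = [[1, 0, 1], [1, 1, 1], [1, 0, 0], [1, 0, 1]]
probs = fit_naive_bayes(data)       # e.g. [5/6, 1/3, 2/3] for this data
new_sample = generate(probs)        # a freshly generated binary vector
```

The book builds from this kind of explicit probabilistic model up to deep generative models in Keras and TensorFlow, where the density estimation is learned by a neural network instead of counted directly.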
Table of Contents
Cover
Copyright
Table of Contents
Preface
Objective and Approach
Prerequisites
Other Resources
Conventions Used in This Book
Using Code Examples
O'Reilly Online Learning
How to Contact Us
Acknowledgments
Part I. Introduction to Generative Deep Learning
Chapter 1. Generative Modeling
What Is Generative Modeling?
Generative Versus Discriminative Modeling
Advances in Machine Learning
The Rise of Generative Modeling
The Generative Modeling Framework
Probabilistic Generative Models
Hello Wrodl!
Your First Probabilistic Generative Model
Naive Bayes
Hello Wrodl! Continued
The Challenges of Generative Modeling
Representation Learning
Setting Up Your Environment
Summary
Chapter 2. Deep Learning
Structured and Unstructured Data
Deep Neural Networks
Keras and TensorFlow
Your First Deep Neural Network
Loading the Data
Building the Model
Compiling the Model
Training the Model
Evaluating the Model
Improving the Model
Convolutional Layers
Batch Normalization
Dropout Layers
Putting It All Together
Summary
Chapter 3. Variational Autoencoders
The Art Exhibition
Autoencoders
Your First Autoencoder
The Encoder
The Decoder
Joining the Encoder to the Decoder
Analysis of the Autoencoder
The Variational Art Exhibition
Building a Variational Autoencoder
The Encoder
The Loss Function
Analysis of the Variational Autoencoder
Using VAEs to Generate Faces
Training the VAE
Analysis of the VAE
Generating New Faces
Latent Space Arithmetic
Morphing Between Faces
Summary
Chapter 4. Generative Adversarial Networks
Ganimals
Introduction to GANs
Your First GAN
The Discriminator
The Generator
Training the GAN
GAN Challenges
Oscillating Loss
Mode Collapse
Uninformative Loss
Hyperparameters
Tackling the GAN Challenges
Wasserstein GAN
Wasserstein Loss
The Lipschitz Constraint
Weight Clipping
Training the WGAN
Analysis of the WGAN
WGAN-GP
The Gradient Penalty Loss
Analysis of WGAN-GP
Summary
Part II. Teaching Machines to Paint, Write, Compose, and Play
Chapter 5. Paint
Apples and Organges
CycleGAN
Your First CycleGAN
Overview
The Generators (U-Net)
The Discriminators
Compiling the CycleGAN
Training the CycleGAN
Analysis of the CycleGAN
Creating a CycleGAN to Paint Like Monet
The Generators (ResNet)
Analysis of the CycleGAN
Neural Style Transfer
Content Loss
Style Loss
Total Variance Loss
Running the Neural Style Transfer
Analysis of the Neural Style Transfer Model
Summary
Chapter 6. Write
The Literary Society for Troublesome Miscreants
Long Short-Term Memory Networks
Your First LSTM Network
Tokenization
Building the Dataset
The LSTM Architecture
The Embedding Layer
The LSTM Layer
The LSTM Cell
Generating New Text
RNN Extensions
Stacked Recurrent Networks
Gated Recurrent Units
Bidirectional Cells
Encoder–Decoder Models
A Question and Answer Generator
A Question-Answer Dataset
Model Architecture
Inference
Model Results
Summary
Chapter 7. Compose
Preliminaries
Musical Notation
Your First Music-Generating RNN
Attention
Building an Attention Mechanism in Keras
Analysis of the RNN with Attention
Attention in Encoder–Decoder Networks
Generating Polyphonic Music
The Musical Organ
Your First MuseGAN
The MuseGAN Generator
Chords, Style, Melody, and Groove
The Bar Generator
Putting It All Together
The Critic
Analysis of the MuseGAN
Summary
Chapter 8. Play
Reinforcement Learning
OpenAI Gym
World Model Architecture
The Variational Autoencoder
The MDN-RNN
The Controller
Setup
Training Process Overview
Collecting Random Rollout Data
Training the VAE
The VAE Architecture
Exploring the VAE
Collecting Data to Train the RNN
Training the MDN-RNN
The MDN-RNN Architecture
Sampling the Next z and Reward from the MDN-RNN
The MDN-RNN Loss Function
Training the Controller
The Controller Architecture
CMA-ES
Parallelizing CMA-ES
Output from the Controller Training
In-Dream Training
In-Dream Training the Controller
Challenges of In-Dream Training
Summary
Chapter 9. The Future of Generative Modeling
Five Years of Progress
The Transformer
Positional Encoding
Multihead Attention
The Decoder
Analysis of the Transformer
BERT
GPT-2
MuseNet
Advances in Image Generation
ProGAN
Self-Attention GAN (SAGAN)
BigGAN
StyleGAN
Applications of Generative Modeling
AI Art
AI Music
Chapter 10. Conclusion
Index
About the Author
Colophon