
Automated Deep Learning Using Neural Network Intelligence: Develop and Design PyTorch and TensorFlow Models Using Python

✍ Scribed by Ivan Gridin


Publisher: Apress
Year: 2022
Tongue: English
Leaves: 396
Edition: 1
Category: Library


✦ Synopsis


Optimize, develop, and design PyTorch and TensorFlow models for a specific problem using the Microsoft Neural Network Intelligence (NNI) toolkit. This book includes practical examples illustrating automated deep learning approaches and provides techniques to facilitate your deep learning model development.

The first chapters of this book cover the basics of the NNI toolkit and methods for solving hyperparameter optimization tasks. You will understand the black-box function maximization problem using NNI and learn how to prepare a TensorFlow or PyTorch model for hyperparameter tuning, launch an experiment, and interpret the results. The book dives into optimization tuners and the search algorithms they are based on: evolution search, annealing search, and Bayesian optimization.

Neural Architecture Search is also covered, and you will learn how to develop deep learning models from scratch. Multi-trial and one-shot search approaches to automatic neural network design are presented. The book teaches you how to construct a search space and launch an architecture search using state-of-the-art exploration strategies: Efficient Neural Architecture Search (ENAS) and Differentiable Architecture Search (DARTS). You will learn how to automate the construction of a neural network architecture for a particular problem and dataset.

Finally, the book covers model compression and feature engineering methods, which are essential in automated deep learning, and includes performance techniques that allow the creation of large-scale distributed training platforms using NNI.
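
To give a flavor of the search space, tuner, and trial concepts the book builds on, here is a minimal self-contained sketch of random search over an NNI-style search space. It does not use NNI itself; the parameter names, ranges, and toy objective are invented for illustration:

```python
import random

random.seed(0)  # reproducible sampling for this illustration

# NNI-style search space: each entry names a hyperparameter and how to sample it.
search_space = {
    "conv_filters": {"_type": "choice", "_value": [16, 32, 64]},
    "dropout_rate": {"_type": "uniform", "_value": [0.1, 0.5]},
}

def sample(space):
    """Draw one trial configuration from the search space."""
    params = {}
    for name, spec in space.items():
        if spec["_type"] == "choice":
            params[name] = random.choice(spec["_value"])
        elif spec["_type"] == "uniform":
            lo, hi = spec["_value"]
            params[name] = random.uniform(lo, hi)
    return params

def trial(params):
    """Stand-in for training a model and reporting its validation accuracy."""
    # Toy black-box objective: prefers more filters and moderate dropout.
    return params["conv_filters"] / 64 - abs(params["dropout_rate"] - 0.3)

# Random-search "tuner": run trials and keep the best configuration found.
best = max((sample(search_space) for _ in range(50)), key=trial)
print(best)
```

In NNI proper, the trial script would instead call `nni.get_next_parameter()` to receive a configuration and `nni.report_final_result(...)` to return the metric, with the tuner running in the experiment process.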

After reading this book, you will know how to use the full toolkit of automated deep learning methods. The techniques and practical examples presented here will help you take your neural network development to a higher level.


What You Will Learn

  • Know the basic concepts of optimization tuners, search space, and trials
  • Apply different hyper-parameter optimization algorithms to develop effective neural networks
  • Construct new deep learning models from scratch
  • Execute the automated Neural Architecture Search to create state-of-the-art deep learning models
  • Compress the model to eliminate unnecessary deep learning layers
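
The compression idea in the last bullet can be sketched, independently of NNI's pruner classes, as magnitude-based (level) pruning: zero out the smallest-magnitude fraction of weights. The function below is a toy illustration, not NNI's implementation:

```python
def level_prune(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with smallest magnitude."""
    n_prune = int(len(weights) * sparsity)
    # Indices of the weights to remove, ordered by absolute value.
    prune_idx = sorted(range(len(weights)), key=lambda i: abs(weights[i]))[:n_prune]
    pruned = list(weights)
    for i in prune_idx:
        pruned[i] = 0.0
    return pruned

w = [0.9, -0.05, 0.4, 0.01, -0.7, 0.2]
print(level_prune(w, 0.5))  # → [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

NNI's Level Pruner applies the same principle per layer to real tensors and produces a mask, which a speedup step can then use to physically shrink the model.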


Who This Book Is For 
Intermediate to advanced data scientists and machine learning engineers involved in deep learning and practical neural network development

✦ Table of Contents


About the Author
About the Technical Reviewer
Introduction
Chapter 1: Introduction to Neural Network Intelligence
What Is Automated Deep Learning?
No Free Lunch Theorem
Injecting New Deep Learning Techniques into Existing Model
Adjusting Model to a New Dataset
Creating a New Model from Scratch
Reinventing the Wheel
Working with Source Code
Neural Network Intelligence Installation
Install
Docker
Search Space, Tuner, and Trial
Black-Box Function Optimization
Web User Interface
Overview Page
Trials Details Page
NNI Command Line
NNI Experiment Configuration
Embedded NNI
Troubleshooting
TensorFlow and PyTorch
Summary
Chapter 2: Hyperparameter Optimization
What Is Hyperparameter?
Layer Hyperparameter
Training Hyperparameter
Feature Hyperparameter
Design Hyperparameter
Search Space
choice
randomint
uniform
quniform
loguniform
qloguniform
normal
qnormal
lognormal
qlognormal
Tuners
Random Search Tuner
Grid Search Tuner
Organizing Experiment
Optimizing LeNet for MNIST Problem
TensorFlow LeNet Implementation
PyTorch LeNet Implementation
Performing LeNet HPO Experiment
Upgrading LeNet with ReLU and Dropout
TensorFlow LeNet Upgrade Implementation
PyTorch LeNet Upgrade Implementation
Performing LeNet Upgrade HPO Experiment
From LeNet to AlexNet
TensorFlow LeNet Evolution Implementation
PyTorch LeNet Evolution Implementation
Performing LeNet Evolution HPO Experiment
Summary
Chapter 3: Hyperparameter Optimization Under Shell
Tuners
Evolution Tuner
Anneal Tuner
Sequential Model-Based Optimization Tuners
Tree-Structured Parzen Estimator Tuner
Gaussian Process Tuner
Which Tuner to Choose?
Custom Tuner
Tuner Internals
New Evolution Custom Tuner
Early Stopping
Median Stop
Curve Fitting
Risk to Stop a Good Trial
Searching for Optimal Functional Pipeline and Classical AutoML
Problem
Operators
Search Space
Model
Tuner
Experiment
Limits of HPO Applying to Neural Architecture Search
Hyperparameters for Hyperparameter Optimization
Summary
Chapter 4: Multi-trial Neural Architecture Search
Neural Architecture As Data Flow Graph
Neural Architecture Search Using Retiarii (PyTorch)
Introduction to NAS Using Retiarii
Retiarii Framework
Base Model
Mutators
LayerChoice
ValueChoice
InputChoice
Repeat
Labeling
Example
Evaluators
Exploration Strategies
Random Strategy
Grid Search
Regularized Evolution
TPE Strategy
RL Strategy
Experiment
CIFAR-10 LeNet NAS
CIFAR-10 ResNet NAS
Classic Neural Architecture Search (TensorFlow)
Base Model
Mutators
Trial
Search Space
Search Strategy
Experiment
Summary
Chapter 5: One-Shot Neural Architecture Search
One-Shot NAS in Action
Supernet Architecture
One-Shot Algorithms
Efficient Neural Architecture Search (ENAS)
TensorFlow ENAS Implementation
PyTorch ENAS Implementation
Differentiable Architecture Search (DARTS)
GeneralSupernet Solving CIFAR-10
Training GeneralSupernet Using TensorFlow and ENAS
Training GeneralSupernet Using PyTorch and DARTS
HPO vs. Multi-trial NAS vs. One-Shot NAS
Summary
Chapter 6: Model Pruning
What Is Model Pruning?
LeNet Model Pruning
One-Shot Pruners
Pruner Configuration
Level Pruner
FPGM Pruner
L1Norm and L2Norm Pruners
Iterative Pruners
Linear Pruner
AGP Pruner
Iterative Pruner Configuration
Iterative Pruning Scenarios
Best Accuracy Under Size Threshold Scenario
Minimal Size Above Accuracy Threshold Scenario
Summary
Chapter 7: NNI Recipes
Speed Up Trials
Start–Stop–Resume
Continue Finished Experiment
NNI and TensorBoard
Move Experiment to Another Server
Scaling Experiments
Shared Storage
One-Shot NAS with Checkpoints and TensorBoard
Summary
Index


📜 SIMILAR VOLUMES


Deep Learning Projects Using TensorFlow
✍ Vinita Silaparasetty 📂 Library 📅 2020 🏛 Apress 🌐 English

Work through engaging and practical deep learning projects using TensorFlow 2.0. Using a hands-on approach, the projects in this book will lead new programmers through the basics into developing practical deep learning applications. Deep learning is quickly integr


Hands-On Transfer Learning with Python I
✍ Dipanjan Sarkar, Raghav Bali, Tamoghna Ghosh 📂 Library 📅 2018 🏛 Packt Publishing 🌐 English

Deep learning simplified by taking supervised, unsupervised, and reinforcement learning to the next level using the Python ecosystem. Key Features • Build deep learning models with transfer learning principles in Python • Implement transfer learning to solve real-world research problems • Perfo

Python Deep Learning: Exploring deep lea
✍ Ivan Vasilev, Daniel Slater, Gianmario Spacagna, Peter Roelants, Valentino Zocca 📂 Library 📅 2019 🏛 Packt Publishing 🌐 English

Explore advanced state-of-the-art deep learning models and their applications using popular Python libraries like Keras, TensorFlow, and PyTorch. Key Features • A strong foundation in neural networks and deep learning with Python libraries • Explore advanced deep learning techniques and thei