Designing Deep Learning Systems (MEAP V08).
by Chi Wang and Donald Szeto
- Publisher: Manning Publications Co.
- Year: 2023
- Language: English
- Pages: 475
- Edition: MEAP Edition
- Category: Library
Table of Contents
Copyright 2022 Manning Publications
Welcome
1 An introduction to deep learning systems
2 Dataset management service
3 Model training service
4 Distributed training
5 Hyperparameter optimization (HPO) service
6 Model serving design
7 Model serving in practice
8 Metadata and artifact store
9 Workflow orchestration
10 Path to production
Appendix A. A hello world deep learning system
Appendix B. Survey of existing solutions
Appendix C. Creating an HPO service with Kubeflow Katib
SIMILAR BOOKS
Get the big picture and the important details with this end-to-end guide for designing highly effective, reliable machine learning systems. In Machine Learning System Design: With end-to-end examples you will learn the big picture of machine learning system design, and how to analyze a problem space to…
Discover one-of-a-kind AI strategies never before seen outside of academic papers! Learn how the principles of evolutionary computation overcome deep learning's common pitfalls and deliver adaptable model upgrades without constant manual adjustment. Evolutionary Deep Learning is a guide to improv…
Make your deep learning models more generalized and adaptable! These practical regularization techniques improve training efficiency and help avoid overfitting errors. Regularization in Deep Learning teaches you how to improve your model performance with a toolbox of regularization techniques. It…
Accelerate deep learning and other number-intensive tasks with JAX, Google's high-performance numerical computing library. In Deep Learning with JAX you will learn how to: use JAX for numerical calculations; build differentiable models with JAX primitives; run distributed and parallelized computations with JAX; use high-level neural network libraries such as Flax and Haiku; and leverage libraries and modules from…