Deep Learning with R

✍ Abhijit Ghatak


Publisher: Springer
Year: 2019
Language: English
Pages: 259
Edition: 1st edition, 2019
Category: Library


✦ Table of Contents


Artificial Intelligence......Page 6
Evolution of Expert Systems to Machine Learning......Page 7
Applications and Research in Deep Learning......Page 8
Intended Audience......Page 10
Acknowledgements......Page 12
About This Book......Page 13
Contents......Page 14
About the Author......Page 19
1.1 Machine Learning......Page 20
1.1.1 Difference Between Machine Learning and Statistics......Page 21
1.1.2 Difference Between Machine Learning and Deep Learning......Page 22
1.3 Bias–Variance Trade-off in Machine Learning......Page 23
1.4 Addressing Bias and Variance in the Model......Page 24
1.6 Loss Function......Page 25
1.7 Regularization......Page 26
1.8 Gradient Descent......Page 27
1.9 Hyperparameter Tuning......Page 29
1.9.1 Searching for Hyperparameters......Page 30
1.10 Maximum Likelihood Estimation......Page 31
1.11.1 The Cross-Entropy Loss......Page 33
1.11.2 Negative Log-Likelihood......Page 34
1.11.3 Entropy......Page 35
1.11.4 Cross-Entropy......Page 37
1.11.5 Kullback–Leibler Divergence......Page 38
1.12 Conclusion......Page 39
2.1 Introduction......Page 41
2.2.3 Recurrent Neural Networks (RNNs)......Page 43
2.3.1 Notations......Page 44
2.3.2 Input Matrix......Page 45
2.3.3 Bias Matrix......Page 46
2.3.4 Weight Matrix of Layer-1......Page 47
2.3.6 Weights Matrix of Layer-2......Page 48
2.3.7 Activation Function at Layer-2......Page 50
2.3.8 Output Layer......Page 51
2.4 Activation Functions......Page 52
2.4.1 Sigmoid......Page 54
2.4.3 Rectified Linear Unit......Page 55
2.4.4 Leaky Rectified Linear Unit......Page 56
2.4.5 Softmax......Page 57
2.5.1 Derivative of Sigmoid......Page 60
2.5.2 Derivative of tanh......Page 61
2.5.5 Derivative of Softmax......Page 62
2.6 Cross-Entropy Loss......Page 64
2.7.2 Derivative of Cross-Entropy Loss with Softmax......Page 67
2.8 Back Propagation......Page 68
2.8.1 Summary of Backward Propagation......Page 71
2.9 Writing a Simple Neural Network Application......Page 72
2.10 Conclusion......Page 81
3.1 Writing a Deep Neural Network (DNN) Algorithm......Page 82
3.2 Overview of Packages for Deep Learning in R......Page 96
3.3 Introduction to keras......Page 97
3.3.3 Defining a Model......Page 98
3.3.5 Compile and Fit the Model......Page 99
3.4 Conclusion......Page 103
4.1 Initialization......Page 104
4.1.2 Zero Initialization......Page 108
4.1.3 Random Initialization......Page 110
4.1.4 Initialization......Page 112
4.1.5 Initialization......Page 114
4.2.1 Hyperparameters and Weight Initialization......Page 117
4.2.6 Algorithm Related......Page 118
4.3 Conclusion......Page 119
5.1 Introduction......Page 120
5.2.1 Gradient Descent or Batch Gradient Descent......Page 121
5.2.3 Mini-Batch Gradient Descent......Page 122
5.3.2 Momentum Update......Page 124
5.3.3 Nesterov Momentum Update......Page 126
5.3.4 Annealing the Learning Rate......Page 127
5.3.5 Second-Order Methods......Page 128
5.3.6 Per-Parameter Adaptive Learning Rate Methods......Page 129
5.4 Vanishing Gradient......Page 139
5.5 Regularization......Page 143
5.5.1 Dropout Regularization......Page 144
5.5.2 ℓ2 Regularization......Page 145
5.6 Gradient Checking......Page 161
5.7 Conclusion......Page 164
6.1 Revisiting DNNs......Page 165
6.2 Modeling Using keras......Page 172
6.2.1 Adjust Epochs......Page 174
6.2.2 Add Batch Normalization......Page 175
6.2.3 Add Dropout......Page 176
6.2.4 Add Weight Regularization......Page 177
6.2.6 Prediction......Page 179
6.3 Introduction to tensorflow......Page 180
6.3.1 What is Tensor Flow?......Page 181
6.3.3 Installing and Running tensorflow......Page 182
6.4.1 Importing MNIST Data Set from tensorflow......Page 183
6.4.2 Define......Page 184
6.4.4 Instantiating a and Running the Model......Page 185
6.5 Conclusion......Page 186
7.1.1 What is a Convolution Operation?......Page 187
7.1.2 Edge Detection......Page 189
7.1.3 Padding......Page 191
7.1.4 Strided Convolutions......Page 192
7.1.5 Convolutions over Volume......Page 193
7.1.6 Pooling......Page 195
7.2 Single-Layer Convolutional Network......Page 196
7.2.1 Writing a ConvNet Application......Page 197
7.3 Training a ConvNet on a Small DataSet Using keras......Page 202
7.3.1 Data Augmentation......Page 205
7.4.1 LeNet-5......Page 209
7.4.3 VGG-16......Page 210
7.4.5 Transfer Learning or Using Pretrained Models......Page 212
7.4.6 Feature Extraction......Page 214
7.5 What is the ConvNet Learning? A Visualization of Different Layers......Page 216
7.6 Introduction to Neural Style Transfer......Page 219
7.6.3 Generating Art Using Neural Style Transfer......Page 220
7.7 Conclusion......Page 222
8.1 Sequence Models or RNNs......Page 223
8.3 Sequence Model Architectures......Page 225
8.4 Writing the Basic Sequence Model Architecture......Page 226
8.4.1 Backpropagation in Basic RNN......Page 228
8.5.1 The Problem with Sequence Models......Page 231
8.5.2 Walking Through LSTM......Page 232
8.6 Writing the LSTM Architecture......Page 233
8.7.1 Working with Text Data......Page 241
8.7.3 Sampling Strategy and the Importance of Softmax Diversity......Page 242
8.7.4 Implementing LSTM Text Generation......Page 243
8.8.1 Word Embeddings......Page 246
8.8.2 Transfer Learning and Word Embedding......Page 247
8.8.3 Analyzing Word Similarity Using Word Vectors......Page 248
8.8.4 Analyzing Word Analogies Using Word Vectors......Page 249
8.8.5 Debiasing Word Vectors......Page 250
8.9 Conclusion......Page 253
9.1 Gathering Experience and Knowledge......Page 254
9.2 Towards Lifelong Learning......Page 255
9.2.1 Final Words......Page 256


πŸ“œ SIMILAR VOLUMES


Deep Learning with R
πŸ“‚ Library πŸ“… 2022 πŸ› Manning 🌐 English

Deep learning from the ground up using R and the powerful Keras library! In Deep Learning with R, Second Edition you will learn: deep learning from first principles; image classification and image segmentation; time series forecasting; text classification and machine translation; text generation …

Deep Learning with R
✍ FranΓ§ois Chollet with J.J. Allaire πŸ“‚ Library πŸ“… 2017 πŸ› Manning Publications 🌐 English
Deep Learning with R
✍ Abhijit Ghatak πŸ“‚ Library πŸ“… 2019 πŸ› Springer 🌐 English

Deep Learning with R introduces deep learning and neural networks using the R programming language. The book builds on the understanding of the theoretical and mathematical constructs and enables the reader to create applications on computer vision, natural language processing and transfer learning.

Deep Learning with R
✍ François Chollet, J. J. Allaire 📂 Library 📅 2018 🏛 Manning Publications Co. 🌐 English

Summary: Deep Learning with R introduces the world of deep learning using the powerful Keras library and its R language interface. The book builds your understanding of deep learning through intuitive explanations and practical examples. Purchase of the print book includes a free …