๐”– Scriptorium
โœฆ   LIBER   โœฆ


Deep Learning and Physics

โœ Scribed by Akinori Tanaka, Akio Tomiya, Koji Hashimoto


Publisher: Springer
Year: 2021
Tongue: English
Leaves: 211
Series: Mathematical Physics Studies
Category: Library

⬇ Acquire This Volume

No coin nor oath required. For personal study only.

✦ Table of Contents


Preface
Acknowledgments
Contents
1 Forewords: Machine Learning and Physics
1.1 Introduction to Information Theory
1.2 Physics and Information Theory
1.3 Machine Learning and Information Theory
1.4 Machine Learning and Physics
Part I Physical View of Deep Learning
2 Introduction to Machine Learning
2.1 The Purpose of Machine Learning
2.1.1 Mathematical Formulation of Data
2.2 Machine Learning and Occam's Razor
2.2.1 Generalization
2.3 Stochastic Gradient Descent Method
Column: Probability Theory and Information Theory
Joint and Conditional Probabilities
Relative Entropy
3 Basics of Neural Networks
3.1 Error Function from Statistical Mechanics
3.1.1 From Hamiltonian to Neural Network
3.1.2 Deep Neural Network
3.2 Derivation of Backpropagation Method Using Bracket Notation
3.3 Universal Approximation Theorem of Neural Network
Column: Statistical Mechanics and Quantum Mechanics
Canonical Distribution in Statistical Mechanics
Bracket Notation in Quantum Mechanics
4 Advanced Neural Networks
4.1 Convolutional Neural Network
4.1.1 Convolution
4.1.2 Transposed Convolution
4.2 Recurrent Neural Network and Backpropagation
4.3 LSTM
Column: Edge of Chaos and Emergence of Computability
Sorting Algorithm
Implementation Using Recurrent Neural Network
KdV Equation and Box-Ball System
Critical State and Turing Completeness of One-Dimensional Cellular Automata
5 Sampling
5.1 Central Limit Theorem and Its Role in Machine Learning
5.2 Various Sampling Methods
5.2.1 Inverse Transform Sampling
5.2.2 Rejection Sampling
5.2.3 Markov Chain
5.2.4 Master Equation and the Principle of Detailed Balance
5.2.5 Expectation Value Calculation Using Markov Chains, and Importance Sampling
5.3 Sampling Method with the Detailed Balance
5.3.1 Metropolis Method
5.3.2 Heatbath Method
Column: From Ising Model to Hopfield Model
6 Unsupervised Deep Learning
6.1 Unsupervised Learning
6.2 Boltzmann Machine
6.2.1 Restricted Boltzmann Machine
6.3 Generative Adversarial Network
6.3.1 Energy-Based GAN
6.3.2 Wasserstein GAN
6.4 Generalization in Generative Models
Column: Self-Learning Monte Carlo Method
Part II Applications to Physics
7 Inverse Problems in Physics
7.1 Inverse Problems and Learning
7.2 Regularization in Inverse Problems
7.3 Inverse Problems and Physical Machine Learning
Column: Sparse Modeling
8 Detection of Phase Transition by Machines
8.1 What is Phase Transition?
8.2 Detecting Phase Transition by a Neural Network
8.3 What the Neural Network Sees
9 Dynamical Systems and Neural Networks
9.1 Differential Equations and Neural Networks
9.2 Representation of Hamiltonian Dynamical System
10 Spinglass and Neural Networks
10.1 Hopfield Model and Spinglass
10.2 Memory and Attractor
10.3 Synchronization and Layering
11 Quantum Manybody Systems, Tensor Networks and Neural Networks
11.1 Neural Network Wave Function
11.2 Tensor Networks and Neural Networks
11.2.1 Tensor Network
11.2.2 Tensor Network Representation of Restricted Boltzmann Machines
12 Application to Superstring Theory
12.1 Inverse Problems in String Theory
12.1.1 Compactification as an Inverse Problem
12.1.2 The Holographic Principle as an Inverse Problem
12.2 Curved Spacetime Is a Neural Network
12.2.1 Neural Network Representation of Field Theory in Curved Spacetime
12.2.2 How to Choose Input/Output Data
12.3 Emergent Spacetime on Neural Networks
12.3.1 Is AdS Black Hole Spacetime Learned?
12.3.2 Emergent Spacetime from Material Data
12.4 Spacetime Emerging from QCD
Column: Black Holes and Information
13 Epilogue
13.1 Neural Network, Physics and Technological Innovation (Akio Tomiya)
13.2 Why Does Intelligence Exist? (Akinori Tanaka)
13.3 Why Do Physical Laws Exist? (Koji Hashimoto)
Bibliography
Index


📜 SIMILAR VOLUMES


Deep Learning and Physics
โœ Akinori Tanaka; Akio Tomiya; Koji Hashimoto ๐Ÿ“‚ Library ๐Ÿ“… 2021 ๐Ÿ› Springer Singapore ๐ŸŒ English
Deep Learning For Physics Research
โœ Martin Erdmann, Jonas Glombitza, Gregor Kasieczka, Uwe Klemradt ๐Ÿ“‚ Library ๐Ÿ“… 2021 ๐Ÿ› World Scientific Publishing Company ๐ŸŒ English

A core principle of physics is knowledge gained from data. Thus, deep learning has instantly entered physics and may become a new paradigm in basic and applied research. This textbook addresses physics students and physicists who want to understand what deep learning actually means, and what is …

Machine and Deep Learning in Oncology, M
โœ Issam El Naqa, Martin J. Murphy ๐Ÿ“‚ Library ๐Ÿ“… 2022 ๐Ÿ› Springer ๐ŸŒ English

This book, now in an extensively revised and updated second edition, provides a comprehensive overview of both machine learning and deep learning and their role in oncology, medical physics, and radiology. Readers will find thorough coverage of basic theory, methods, and demonstrative appli…

Quantum Computing: Physics, Blockchains,
โœ Melanie Swan, Renato P. dos Santos, Frank Witte ๐Ÿ“‚ Library ๐Ÿ“… 2020 ๐Ÿ› World Scientific ๐ŸŒ English

Quantum information and contemporary smart network domains are so large and complex as to be beyond the reach of current research approaches. Hence, new theories are needed for their understanding and control. Physics is implicated as smart networks are physical systems comprised of particle-many it…