𝔖 Scriptorium
✦   LIBER   ✦


Neuromorphic Engineering: The Scientist’s, Algorithms Designer’s and Computer Architect’s Perspectives on Brain-Inspired Computing

✍ Scribed by Elishai Ezra Tsur


Publisher
CRC Press
Year
2021
Tongue
English
Leaves
330
Edition
1
Category
Library

⬇  Acquire This Volume

No coin nor oath required. For personal study only.

✦ Synopsis


The brain is not a glorified digital computer. It does not store information in registers, and it does not mathematically transform mental representations to establish perception or behavior. The brain cannot be downloaded to a computer to grant immortality, nor can it destroy the world by letting an emergent consciousness loose in cyberspace. However, studying the brain's core computational architecture can inspire scientists, computer architects, and algorithm designers to think fundamentally differently about their craft.

Neuromorphic engineers pursue the ultimate goal of realizing machines with some aspects of cognitive intelligence. They aspire to design computing architectures that surpass the performance of existing digital, von Neumann-based machines. In that sense, brain research bears the promise of a new computing paradigm. As part of a complete cognitive hardware and software ecosystem, neuromorphic engineering opens new frontiers for neuro-robotics, artificial intelligence, and supercomputing applications.

The book presents neuromorphic engineering from three perspectives: the scientist, the computer architect, and the algorithm designer. It zooms in and out of the different disciplines, allowing readers with diverse backgrounds to understand and appreciate the field. Overall, the book covers the basics of neuronal modeling, neuromorphic circuits, neural architectures, event-based communication, and the neural engineering framework.
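To give a flavor of the neuronal modeling the book covers (its Chapter 5 treats the leaky integrate-and-fire model), here is a minimal LIF sketch. All parameter names and values below are illustrative defaults, not taken from the book:

```python
import numpy as np

def simulate_lif(current, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Euler-integrate dV/dt = (-(V - v_rest) + R_m * I) / tau.

    `current` is an array of input currents (amperes), one per time step.
    Returns the membrane-voltage trace and the spike times (seconds).
    """
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(current):
        # Leak toward rest plus input drive, scaled by the membrane time constant.
        v += dt * (-(v - v_rest) + r_m * i_in) / tau
        if v >= v_thresh:
            # Threshold crossing: record a spike, reset the membrane.
            spikes.append(step * dt)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes

# A constant 2 nA step input (0.5 s at dt = 0.1 ms) drives regular firing,
# since the steady-state voltage (rest + 20 mV) sits above threshold.
trace, spikes = simulate_lif(np.full(5000, 2e-9))
```

The same qualitative behavior (integrate, fire, reset) underlies the hardware silicon neurons and NEF-based software of the book's later chapters.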

✦ Table of Contents


Cover
Half Title
Title Page
Copyright Page
Dedication
Contents
About the author
Preface
Foreword: a tale about passion and fear
List of Figures
Before we begin
Glossary
Section I: Introduction and Overview
Chapter 1: Introducing the perspective of the scientist
1.1. FROM THE NEURON DOCTRINE TO EMERGENT BEHAVIOR
1.1.1. The unity and diversity of neurons
1.1.2. Neural coding
1.1.3. Networks and emergent behavior
1.1.4. Neuronal abstractions
1.1.5. Data-driven neuroscience
1.2. BRAIN MODELING
1.2.1. Brain modeling in biological systems
1.2.2. Computational brain modeling
1.2.2.1. Bottom-up modeling
1.2.2.2. Top-down modeling
1.2.3. Brain modeling with neuromorphic hardware
1.3. GLOSSARY
1.4. FURTHER READING
Chapter 2: Introducing the perspective of the computer architect
2.1. LIMITS OF INTEGRATED CIRCUITS
2.1.1. Transistor density
2.1.2. Processing speed
2.1.3. Distributed computing
2.2. EMERGING COMPUTING PARADIGMS
2.2.1. Quantum computing
2.2.2. Molecular computing
2.2.3. DNA computing
2.2.4. Programmable microfluidics
2.3. BRAIN-INSPIRED HARDWARE
2.3.1. The why
2.3.2. The how
2.3.2.1. Probabilistic representation
2.3.2.2. In-memory computing
2.3.3. The what
2.3.4. Neuromorphic frameworks
2.3.4.1. Neuromorphic sensing
2.3.4.2. Neuromorphic interfaces
2.3.4.3. General purpose neuromorphic computers
2.3.4.4. Memristors
2.4. GLOSSARY
2.5. FURTHER READING
Chapter 3: Introducing the perspective of the algorithm designer
3.1. FROM ARTIFICIAL TO SPIKING NEURAL NETWORKS
3.1.1. Network architecture
3.1.2. Neuromorphic applications
3.1.2.1. Neuromorphic learning
3.1.2.2. Neuro-robotics
3.1.2.3. Cognitive models
3.2. NEUROMORPHIC SOFTWARE DEVELOPMENT
3.3. GLOSSARY
3.4. FURTHER READING
Section II: The Scientist’s Perspective
Chapter 4: Biological description of neuronal dynamics
4.1. POTENTIALS AND SPIKES
4.1.1. The resting potential
4.1.2. The action potential
4.1.3. Spike propagation
4.1.4. Synapses
4.2. POWER AND PERFORMANCE ESTIMATES OF THE BRAIN
4.3. GLOSSARY
4.4. FURTHER READING
Chapter 5: Models of point neuronal dynamic
5.1. THE LEAKY INTEGRATE AND FIRE MODEL
5.1.1. Membrane voltage for various input patterns
5.1.2. Flat input
5.1.3. Step current input
5.1.4. Numerical modeling of pulse input
5.1.5. Arbitrary input
5.2. THE IZHIKEVICH NEURON MODEL
5.3. THE HODGKIN-HUXLEY MODEL
5.4. SYNAPSE MODELING
5.5. SIMULATING POINT NEURONS
5.5.1. Biological plausibility vs. computational resources
5.5.2. Large scale simulations of point processes
5.6. CASE STUDY: A SNN FOR PERCEPTUAL FILLING-IN
5.6.1. Perceptual filling-in
5.6.2. Mathematical formulation
5.6.3. Feed-forward SNN for perceptual filling-in
5.6.4. Recurrent SNN for perceptual filling-in
5.6.5. Is it biologically plausible?
5.7. GLOSSARY
5.8. FURTHER READING
Chapter 6: Models of morphologically detailed neurons
6.1. WHY MORPHOLOGICALLY DETAILED MODELING?
6.2. THE CABLE EQUATION
6.2.1. Passive Dendrite
6.2.2. Axon
6.2.3. Simulating the cable equation
6.2.4. Partition length
6.3. THE COMPARTMENTAL MODEL
6.3.1. Reconstructed morphology and dynamic
6.4. CASE STUDY: DIRECTIONAL SELECTIVE SAC
6.5. GLOSSARY
6.6. FURTHER READING
Chapter 7: Models of network dynamic and learning
7.1. NEURAL CIRCUIT TAXONOMY FOR BEHAVIOR
7.2. RECONSTRUCTION AND SIMULATION OF NEURAL NETWORKS
7.2.1. Detailed modeling
7.2.2. Simulating detailed models
7.3. CASE STUDY: SACS’ LATERAL INHIBITION IN DIRECTION SELECTIVITY
7.4. NEUROMORPHIC AND BIOLOGICAL LEARNING
7.4.1. Biological backpropagation-inspired learning
7.4.2. Biological unsupervised learning
7.4.2.1. Hebbian learning
7.4.2.2. Spike timing-dependent plasticity
7.4.2.3. BCM learning
7.4.2.4. Oja’s learning
7.5. GLOSSARY
7.6. FURTHER READING
Section III: The Computer Architect’s Perspective
Chapter 8: Neuromorphic hardware
8.1. TRANSISTORS AND MICRO-POWER CIRCUITRY
8.2. THE SILICON NEURON
8.2.1. The pulse current-source synapse
8.2.2. The reset and discharge synapse
8.2.3. The charge and discharge synapse
8.2.4. The log-domain integrator synapse
8.2.5. The axon-hillock neuron
8.2.6. Voltage-amplifier LIF neuron
8.3. CASE STUDY: HARDWARE AND SOFTWARE CO-SYNTHESIS
8.3.1. Circuit design
8.3.2. Circuit analysis
8.3.2.1. Architectural design
8.3.2.2. Neuron control
8.3.3. NEF-inspired design
8.4. GLOSSARY
8.5. FURTHER READING
Chapter 9: Communication and hybrid circuit design
9.1. COMMUNICATING SILICON NEURONS
9.2. FROM HYBRID TO DIGITAL CIRCUITRY
9.3. GLOSSARY
9.4. FURTHER READING
Chapter 10: In-memory computing with memristors
10.1. FROM TRANSISTORS TO MEMRISTORS
10.2. A NEW FUNDAMENTAL CIRCUIT ELEMENT
10.3. MEMRISTORS FOR NEUROMORPHIC ENGINEERING
10.4. GLOSSARY
10.5. FURTHER READING
Section IV: The Algorithms Designer’s Perspective
Chapter 11: Introduction to neuromorphic programming
11.1. THEORY OF NEUROMORPHIC COMPUTING
11.1.1. Neuromorphic computing as Turing complete
11.1.2. A complexity theory for neuromorphic computing
11.2. UNDERSTANDING NEUROMORPHIC PROGRAMMING
11.3. GLOSSARY
11.4. FURTHER READING
Chapter 12: The Neural Engineering Framework (NEF)
12.1. THE FUNDAMENTAL PRINCIPLES OF NEF
12.1.1. Representation
12.1.1.1. Encoding
12.1.1.2. Decoding
12.1.1.3. Decoder analysis
12.1.1.4. Representation of high dimensional stimulus
12.1.2. Transformation
12.1.2.1. Linear transformation
12.1.2.2. Linear transformations
12.1.2.3. Non-linear transformations
12.1.2.4. Addition
12.1.2.5. Multiplication
12.1.3. Dynamics
12.1.3.1. The recurrent connection
12.1.3.2. Synthesis
12.1.3.3. Neuromorphic integration
12.1.3.4. Neuromorphic oscillators
12.1.3.5. Neuromorphic attractors
12.2. CASE STUDY: MOTION DETECTION IN A SPIKING CAMERA USING OSCILLATION INTERFERENCE
12.2.1. Spiking camera
12.2.2. Gabor functions
12.2.3. Damped oscillators
12.2.4. Motion detection
12.3. GLOSSARY
12.4. FURTHER READING
Chapter 13: Learning spiking neural networks
13.1. INTRODUCTION TO LEARNING SNNS
13.2. LEARNING SPIKING NEURAL NETWORKS WITH NEF
13.2.1. The prescribed error sensitivity rule
13.2.2. PES learning for classical conditioning
13.3. FROM DNN TO DEEP SNN
13.3.1. MNIST classification with deep SNN
13.3.1.1. Convolutional neural networks
13.3.1.2. CNN architecture
13.3.1.3. Results
13.4. GLOSSARY
13.5. FURTHER READING
Bibliography
Index


📜 SIMILAR VOLUMES



Learning in Energy-Efficient Neuromorphic Computing
✍ Nan Zheng; Pinaki Mazumder 📂 Library 📅 2019 🏛 Wiley-IEEE Press 🌐 English

Explains current co-design and co-optimization methodologies for building hardware neural networks and algorithms for machine learning applications. This book focuses on how to build energy-efficient hardware for neural networks with learning capabilities, and provides co-design and…

Neuromorphic Computing for Computer Scie…
✍ Developers, Dynex 📂 Library 📅 2024 🏛 Independently Published 🌐 English

In 2019, Google astounded the world with the revelation that their quantum computer, Sycamore, had conquered an insurmountable problem. Remarkably, Sycamore achieved this feat in less than 200 seconds, a task that conventional computers, even the most potent ones, would require over 10,000 years to…

Neuromorphic Devices for Brain-Inspired Computing: Artificial Intelligence, Perception, and Robotics
✍ Qing Wan, Yi Shi 📂 Library 📅 2022 🏛 Wiley-VCH 🌐 English

Explore the cutting edge of neuromorphic technologies with applications in Artificial Intelligence. In Neuromorphic Devices for Brain-Inspired Computing: Artificial Intelligence, Perception, and Robotics, a team of expert engineers delivers a comprehensive…