
Introduction to Neural Networks

✍ Scribed by Gurney, K.


Publisher: Routledge
Year: 1997
Tongue: English
Leaves: 148
Category: Library


✦ Synopsis


Though mathematical ideas underpin the study of neural networks, the author presents the fundamentals without the full mathematical apparatus. All aspects of the field are tackled, including artificial neurons as models of their real counterparts; the geometry of network action in pattern space; gradient descent methods, including back-propagation; associative memory and Hopfield nets; and self-organization and feature maps. The traditionally difficult topic of adaptive resonance theory is clarified within a hierarchical description of its operation.

The book also includes several real-world examples to provide a concrete focus. This should enhance its appeal to those involved in the design, construction and management of networks in commercial environments who wish to improve their understanding of network simulator packages.

As a comprehensive and highly accessible introduction to one of the most important topics in cognitive and computer science, this volume should interest a wide range of readers, both students and professionals, in cognitive science, psychology, computer science and electrical engineering.
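The artificial neuron at the heart of the book's early chapters is the threshold logic unit (TLU): it fires when the weighted sum of its inputs reaches a threshold. This is a minimal sketch of that standard model, written for this listing rather than taken from the text:

```python
def tlu(inputs, weights, threshold):
    """Threshold logic unit: output 1 if the weighted sum of
    inputs reaches the threshold, otherwise 0."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# With weights (1, 1) and threshold 1.5, a TLU computes logical AND:
print(tlu((1, 1), (1, 1), 1.5))  # → 1 (fires)
print(tlu((1, 0), (1, 1), 1.5))  # → 0 (does not fire)
```

The same unit with a lower threshold (e.g. 0.5) computes OR, which is why the geometry of weight and threshold choices, covered in the book's pattern-space chapters, matters.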

✦ Table of Contents


Book Cover......Page 1
Half-Title......Page 2
Title......Page 3
Copyright......Page 4
Contents......Page 5
Preface......Page 8
1.1 What are neural networks?......Page 10
1.2 Why study neural networks?......Page 12
1.4 Notes......Page 13
2.1 Real neurons: a review......Page 14
2.2 Artificial neurons: the TLU......Page 17
2.3 Resilience to noise and hardware failure......Page 19
2.4 Non-binary signal communication......Page 20
2.5 Introducing time......Page 21
2.6 Summary......Page 23
2.7 Notes......Page 24
3.1.1 Pattern classification and input space......Page 25
3.1.2 The linear separation of classes......Page 26
3.2 Vectors......Page 27
3.2.2 The length of a vector......Page 28
3.2.3 Comparing vectors—the inner product......Page 29
Inner product—algebraic form......Page 30
3.3 TLUs and linear separability revisited......Page 31
3.4 Summary......Page 32
3.5 Notes......Page 33
4.2 Training the threshold as a weight......Page 34
4.3 Adjusting the weight vector......Page 35
4.4 The perceptron......Page 37
4.5.2 Nonlinearly separable classes......Page 38
4.6.1 Making training sets......Page 40
4.6.2 Real and virtual networks......Page 41
4.8 Notes......Page 42
5.1 Finding the minimum of a function: gradient descent......Page 43
5.2 Gradient descent on an error......Page 45
5.3 The delta rule......Page 46
5.4 Watching the delta rule at work......Page 48
5.5 Summary......Page 49
6.1 Training rules for multilayer nets......Page 50
6.2 The backpropagation algorithm......Page 51
6.3 Local versus global minima......Page 52
6.5 Speeding up learning: the momentum term......Page 53
6.6 More complex nets......Page 54
6.7.1 Operation in pattern space......Page 55
6.7.2 Networks and function fitting......Page 57
6.7.3 Hidden nodes as feature extractors......Page 58
6.9 Generalization and overtraining......Page 59
6.10.2 Adequate training set size......Page 61
6.10.4 Constructing topologies......Page 62
6.11.1 Psychiatric patient length of stay......Page 63
6.11.2 Stock market forecasting......Page 64
6.14 Notes......Page 65
7.1 The nature of associative memory......Page 66
7.3 A physical analogy with memory......Page 67
7.4 The Hopfield net......Page 68
7.4.1 Defining an energy for the net......Page 70
7.4.2 Alternative dynamics and basins of attraction......Page 72
7.5.1 The storage prescription......Page 73
7.5.2 The Hebb rule......Page 74
7.7 The analogue Hopfield model......Page 75
7.8 Combinatorial optimization......Page 76
7.9 Feedforward and recurrent associative nets......Page 77
7.11 Notes......Page 78
8.1 Competitive dynamics......Page 79
8.2 Competitive learning......Page 81
8.2.1 Letter and “word” recognition......Page 83
8.3.1 Topographic maps in the visual cortex......Page 84
8.3.2 Developing topographic maps......Page 85
8.3.3 The SOM algorithm......Page 86
8.3.4 A graphic example......Page 87
8.3.5 Maps, distributions and dimensionality......Page 89
8.3.6 SOMs and classification: LVQ......Page 90
Visualizing the input space......Page 92
8.4 Principal component analysis......Page 94
8.5 Further remarks......Page 96
8.7 Notes......Page 97
9.1.4 System integration......Page 98
9.2 A hierarchical description of networks......Page 99
9.3.2 Network architecture......Page 100
9.3.3 Algorithmic level......Page 101
9.3.4 An example......Page 103
9.3.5 A system-level implementation......Page 104
9.3.6 The signal level: some neural mechanisms......Page 106
9.5 Applications......Page 107
9.6 Further remarks......Page 108
9.8 Notes......Page 109
10.1 Synapses revisited......Page 110
10.2 Sigma-pi units......Page 111
10.3 Digital neural networks......Page 112
10.3.1 Boolean functions as artificial neurons......Page 113
Recurrent nets......Page 114
10.3.3 Feedforward nets......Page 115
10.3.5 Expressions for cube activation......Page 116
Memory growth......Page 118
10.4 Radial basis functions......Page 119
10.5.1 Associative reward-penalty training......Page 121
10.5.2 System identification......Page 123
10.6 Summary......Page 124
10.7 Notes......Page 125
11.1.1 Neural net tasks......Page 126
11.1.2 A taxonomy of artificial neurons......Page 127
11.1.3 A taxonomy of network structures and dynamics......Page 128
11.2 Networks and the computational hierarchy......Page 129
11.4.1 The search for artificial intelligence......Page 131
11.4.2 The symbolic paradigm......Page 132
11.4.3 The connectionist paradigm......Page 133
11.4.4 Symbols and neurons—a rapprochement......Page 134
11.5.1 The early years......Page 135
11.7 Notes......Page 136
Appendix A The cosine function......Page 137
References......Page 139
Index......Page 144
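As a taste of the gradient-descent material in Chapter 5, here is a minimal sketch of the delta rule for a single linear unit. The training data and learning rate are illustrative choices made for this listing, not examples from the book:

```python
def delta_rule(samples, lr=0.1, epochs=200):
    """Train a linear unit by gradient descent on squared error:
    each weight is nudged by lr * error * input (the delta rule)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:
            y = w[0] * x[0] + w[1] * x[1] + b   # unit's current output
            err = target - y                     # error drives the update
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Learn the mapping y = 2*x0 - x1 from a handful of points;
# the trained weights approach (2, -1) with bias near 0.
data = [((1, 0), 2), ((0, 1), -1), ((1, 1), 1), ((2, 1), 3)]
w, b = delta_rule(data)
```

For a TLU rather than a linear unit, the book replaces this smooth error surface with the perceptron training rule; the delta rule's advantage is that the gradient is well defined everywhere.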


πŸ“œ SIMILAR VOLUMES


Introduction to Neural Networks
✍ Architecture Technology Corpor. (Auth.) πŸ“‚ Library πŸ“… 1991 πŸ› Elsevier Advanced Technology 🌐 English

Please note this is a Short Discount publication. Neural network technology has been a curiosity since the early days of computing. Research in the area went into a near-dormant state for a number of years, but recently there has been renewed interest in the subject.

An introduction to neural networks
✍ Anderson, J.A. πŸ“‚ Library πŸ“… 1997 πŸ› MIT 🌐 English

An Introduction to Neural Networks falls into a new ecological niche for texts. Based on notes that have been class-tested for more than a decade, it is aimed at cognitive science and neuroscience students who need to understand brain function.

Introduction to Graph Neural Networks
✍ Zhiyuan Liu, Jie Zhou πŸ“‚ Library πŸ“… 2020 πŸ› Morgan & Claypool 🌐 English

Graphs are useful data structures in complex real-life applications such as modeling physical systems, learning molecular fingerprints, controlling traffic networks, and recommending friends in social networks.