
Theoretical Advances in Neural Computation and Learning

✍ Scribed by Vwani Roychowdhury, Kai-Yeung Siu, Alon Orlitsky (eds.)


Publisher
Springer US
Year
1994
Tongue
English
Leaves
481
Edition
1
Category
Library


✦ Synopsis


For any research field to have a lasting impact, there must be a firm theoretical foundation. Neural networks research is no exception. Some of the foundational concepts, established several decades ago, led to the early promise of developing machines exhibiting intelligence. The motivation for studying such machines comes from the fact that the brain is far more efficient at visual processing and speech recognition than existing computers. Undoubtedly, neurobiological systems employ very different computational principles. The study of artificial neural networks aims at understanding these computational principles and applying them to the solution of engineering problems.

Thanks to recent advances in both device technology and computational science, we are currently witnessing explosive growth in the study of neural networks and their applications. It may take many years before we have a complete understanding of the mechanisms of neural systems. Before this ultimate goal can be achieved, answers are needed to important fundamental questions, such as: (a) what can neural networks do that traditional computing techniques cannot? (b) how does the complexity of the network for an application relate to the complexity of that problem? and (c) how much training data is required for the resulting network to learn properly? Everyone working in the field has attempted to answer these questions, but general solutions remain elusive. However, encouraging progress in studying specific neural models has been made by researchers from various disciplines.

✦ Table of Contents


Front Matter....Pages i-xxiv
Front Matter....Pages 1-1
Neural Models and Spectral Methods....Pages 3-36
Depth-Efficient Threshold Circuits for Arithmetic Functions....Pages 37-84
Communication Complexity and Lower Bounds for Threshold Circuits....Pages 85-125
A Comparison of the Computational Power of Sigmoid and Boolean Threshold Circuits....Pages 127-151
Computing on Analog Neural Nets with Arbitrary Real Weights....Pages 153-172
Connectivity Versus Capacity in the Hebb Rule....Pages 173-240
Front Matter....Pages 241-241
Computational Learning Theory and Neural Networks: A Survey of Selected Topics....Pages 243-293
Perspectives of Current Research about the Complexity of Learning on Neural Nets....Pages 295-336
Learning an Intersection of K Halfspaces Over a Uniform Distribution....Pages 337-356
On the Intractability of Loading Neural Networks....Pages 357-389
Learning Boolean Functions via the Fourier Transform....Pages 391-424
LMS and Backpropagation are Minimax Filters....Pages 425-447
Supervised Learning: Can it Escape its Local Minimum?....Pages 449-461
Back Matter....Pages 463-468

✦ Subjects


Data Structures, Cryptology and Information Theory; Artificial Intelligence (incl. Robotics); Statistical Physics, Dynamical Systems and Complexity; Electrical Engineering


📜 SIMILAR VOLUMES


Advances in Neural Networks: Computation…
✍ Simone Bassis, Anna Esposito, Francesco Carlo Morabito (eds.) 📂 Library 📅 2015 🏛 Springer International Publishing 🌐 English

This book collects research works that exploit neural networks and machine learning techniques from a multidisciplinary perspective. Subjects covered include theoretical, methodological and computational topics which are grouped together into chapters devoted to the discussion of novelties and…

Motivation in Learning Contexts: Theoret…
✍ Simone Volet, Sanna Jarvela 📂 Library 📅 2001 🌐 English

This volume provides a platform for discussing theoretical and methodological developments in the field of motivation research related to learning and instruction. The combination of socio-cultural, situative and socio-cognitive epistemological traditions underlying the different contributions enabl…

Neural Network Learning: Theoretical Fou…
✍ Martin Anthony, Peter L. Bartlett 📂 Library 📅 2009 🌐 English

This important work describes recent theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems, and addresses the key statistical and computational questions. Chapters survey research on pattern classification with binary-output…

Advanced Neural Computers
✍ R. Eckmiller 📂 Library 📅 1990 🏛 Elsevier B.V, North Holland 🌐 English

This book is the outcome of the International Symposium on Neural Networks for Sensory and Motor Systems (NSMS) held in March 1990 in the FRG. The NSMS symposium assembled 45 invited experts from Europe, America and Japan representing the fields of Neuroinformatics, Computer Science, Computational N…

Computational Intelligence: Theoretical…
✍ Dinesh C.S. Bisht (editor); Mangey Ram (editor) 📂 Library 📅 2020 🏛 De Gruyter 🌐 English

Computational intelligence (CI) lies at the interface between engineering and computer science; control engineering, where problems are solved using computer-assisted methods. Thus, it can be regarded as an indispensable basis for all artificial intelligence (AI) activities. This book collects su…