Neural Networks. Advances and Applications
Edited by E. Gelenbe
- Publisher
- North-Holland
- Year
- 1992
- Language
- English
- Pages
- 224
- Edition
- 2nd
- Category
- Library
Synopsis
The present volume is a natural follow-up to Neural Networks: Advances and Applications which appeared one year previously. As the title indicates, it combines the presentation of recent methodological results concerning computational models and results inspired by neural networks, and of well-documented applications which illustrate the use of such models in the solution of difficult problems. The volume is balanced with respect to these two orientations: it contains six papers concerning methodological developments and five papers concerning applications and examples illustrating the theoretical developments. Each paper is largely self-contained and includes a complete bibliography.
The methodological part of the book contains two papers on learning, one paper presenting a computational model of intracortical inhibitory effects, one presenting a new development of the random neural network, and two papers on associative memory models. The applications and examples portion contains papers on image compression, associative recall of simple typed images, learning applied to typed images, stereo disparity detection, and combinatorial optimisation.
Table of Contents
Front Matter, Page iii
Copyright, Page iv
PREFACE, Pages v-viii, Erol Gelenbe
Learning in the Recurrent Random Neural Network, Pages 1-12, Erol Gelenbe
Generalization Performance of Feed-Forward Neural Networks, Pages 13-38, Shashi Shekhar, Minesh B. Amin, Prashant Khandelwal
The Nature of Intracortical Inhibitory Effects, Pages 39-81, James A. Reggia, C. Lynne D'Autrechy, Granger Sutton III, Michael Weinrich
Random Neural Networks with Multiple Classes of Signals, Pages 83-93, Jean-Michel Fourneau, Erol Gelenbe
The MicroCircuit Associative Memory Architecture, Pages 95-127, Coe F. Miles, David Rogers
Generalised Associative Memory and the Computation of Membership Functions, Pages 129-140, Erol Gelenbe
Layered Neural Network for Stereo Disparity Detection, Pages 141-153, Eisaku Maeda, Akio Shio, Masashi Okudaira
Storage and Recognition Methods for The Random Neural Network, Pages 155-176, Myriam Mokhtari
Neural Networks for Image Compression, Pages 177-198, Sergio Carrato
Autoassociative Memory with the Random Neural Network using Gelenbe's Learning Algorithm, Pages 199-214, Christine Hubert
Minimum Graph Covering with the Random Neural Network Model, Pages 215-222, Erol Gelenbe, Fréderic Batty
Similar Volumes
Presents the latest advances in complex-valued neural networks by demonstrating the theory in a wide range of applications. Complex-valued neural networks is a rapidly developing neural network framework that utilizes complex arithmetic, exhibiting specific characteristics in its lear…
Publisher: IEEE/John Wiley, 2013, 303 pp. Complex-valued neural networks (CVNNs) have continued to open doors to various new applications. The CVNNs are the neural networks that deal with complex amplitude, i.e. signal having phase and amplitude, which is one of the most…
The Self-Organizing Map (SOM) is one of the most frequently used architectures for unsupervised artificial neural networks. Introduced by Teuvo Kohonen in the 1980s, SOMs have been developed as a very powerful method for visualization and unsupervised classification tasks by an active and innovat…