Information-theoretic aspects of neural networks
Edited by Neelakanta, Perambur S.
- Publisher: CRC Press
- Year: 1999
- Language: English
- Pages: 417
- Category: Library
Synopsis
Information theoretics vis-à-vis neural networks embraces the parametric entities and conceptual bases pertinent to memory and information storage, information-theoretic cost functions, and neurocybernetics and self-organization. Existing studies cover the entropy and cybernetic aspects of neural information only sparsely.
Information-Theoretic Aspects of Neural Networks cohesively explores this burgeoning discipline, covering topics such as:
- Shannon information and information dynamics
- neural complexity as an information processing system
- memory and information storage in the interconnected neural web
- extremum (maximum and minimum) information entropy
- neural network training
- non-conventional, statistical distance-measures for neural network optimization
- symmetric and asymmetric characteristics of information-theoretic error-metrics (see the sketch after this list)
- algorithmic-complexity-based representation of neural information-theoretic parameters
- genetic algorithms versus neural information
- dynamics of neurocybernetics viewed in the information-theoretic plane
- nonlinear, information-theoretic transfer functions of the neural cellular units
- statistical mechanics, neural networks, and information theory
- semiotic framework of neural information processing and neural information flow
- fuzzy information and neural networks
- neural dynamics conceived through fuzzy information parameters
- neural information flow dynamics
- informatics of neural stochastic resonance
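To give a concrete flavor of the error-metrics listed above, here is a minimal sketch, not drawn from the book: it computes Shannon entropy and the Kullback-Leibler divergence (one member of the Csiszár family of distance measures the book treats), together with the simplest symmetrization, J(p, q) = D(p||q) + D(q||p). The function names and example distributions are illustrative assumptions of my own.

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in nats."""
    p = np.clip(np.asarray(p, dtype=float), eps, None)
    p = p / p.sum()  # renormalize after clipping
    return float(-np.sum(p * np.log(p)))

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p||q); note it is asymmetric in p and q."""
    p = np.clip(np.asarray(p, dtype=float), eps, None)
    q = np.clip(np.asarray(q, dtype=float), eps, None)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def j_divergence(p, q):
    """Symmetrized (two-sided) measure: J(p, q) = D(p||q) + D(q||p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

# Hypothetical target distribution vs. a network's normalized output:
target = [0.70, 0.20, 0.10]
output = [0.50, 0.30, 0.20]
print(f"H(target)         = {shannon_entropy(target):.4f} nats")
print(f"D(target||output) = {kl_divergence(target, output):.4f}")
print(f"D(output||target) = {kl_divergence(output, target):.4f}")  # differs: KL is one-sided
print(f"J(target, output) = {j_divergence(target, output):.4f}")   # symmetric in its arguments
```

Minimizing such a divergence between target and output distributions, rather than a squared-error norm, is the basic idea behind the information-theoretic cost-functions and symmetrized error-measures taken up in Chapter 4.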
Information-Theoretic Aspects of Neural Networks acts as an exceptional resource for engineers, scientists, and computer scientists working in the field of artificial neural networks as well as biologists applying the concepts of communication theory and protocols to the functioning of the brain. The information in this book explores new avenues in the field and creates a common platform for analyzing the neural complex as well as artificial neural networks.
Table of Contents
Cover......Page 1
Half Title......Page 2
Title Page......Page 4
Copyright Page......Page 5
Preface......Page 6
Contributors......Page 10
Table of Contents......Page 12
Chapter 1: Introduction......Page 16
1.1 Neuroinformatics......Page 17
1.2 Information-Theoretic Framework of Neural Networks......Page 30
1.3 Entropy, Thermodynamics and Information Theory......Page 35
1.4 Information-Theoretics and Neural Network Training......Page 37
1.5 Dynamics of Neural Learning in the Information-Theoretic Plane......Page 51
1.6 Neural Nonlinear Activity in the Information-Theoretic Plane......Page 53
1.7 Degree of Neural Complexity and Maximum Entropy......Page 54
1.8 Concluding Remarks......Page 56
Bibliography......Page 58
Appendix 1.1......Page 67
Appendix 1.2......Page 76
Appendix 1.3......Page 84
2.1 Introduction......Page 86
2.2 Neural Information Processing: C3! Protocols......Page 87
2.3 Nonlinear Neural Activity......Page 91
2.4 Bernoulli-Riccati Equations......Page 95
2.5 Nonlinear Neural Activity: Practical Considerations......Page 98
2.7 Nonsigmoidal Activation Functions......Page 103
2.8 Definitions and Lemmas on Certain Classes of Nonlinear Functions......Page 104
2.9 Concluding Remarks......Page 106
Bibliography......Page 110
Appendix 2.1......Page 114
Appendix 2.2......Page 120
3.2 What is Fuzzy Activity?......Page 128
3.3 Crisp Sets versus Fuzzy Sets......Page 130
3.4 Membership Attributions to a Fuzzy Set......Page 131
3.5 Fuzzy Neural Activity......Page 134
3.6 Fuzzy Differential Equations......Page 135
3.7 Membership Attributions to Fuzzy Sets via Activation Function......Page 141
3.8 Neural Architecture with a Fuzzy Sigmoid......Page 145
3.9 Fuzzy Considerations, Uncertainty and Information......Page 149
3.10 Information-Theoretics of Crisp and Fuzzy Sets......Page 150
3.11 Fuzzy Neuroinformatics......Page 156
Bibliography......Page 162
4.1 Introduction......Page 166
4.2 Disorganization and Entropy Considerations in Neural Networks......Page 169
4.3 Information-Theoretic Error-Measures......Page 173
4.4 Neural Nonlinear Response versus Optimization Algorithms......Page 185
4.5 Multilayer Perceptron Training with Information-Theoretic Cost-Functions......Page 188
4.6 Results on Neural Network Training with Csiszár's Error-Measures......Page 191
4.7 Generalized Csiszár's Symmetrized Error-Measures......Page 202
4.8 One-Sided Error-Measures and Implementation of Symmetrization......Page 203
Bibliography......Page 244
5.1 Introduction......Page 248
5.2 Stochastical Neural Dynamics......Page 252
5.3 Stochastical Dynamics of the Error-Measure (ε)......Page 253
5.4 Random Walk Paradigm of ε(t) Dynamics......Page 254
5.5 Evolution of ε(t): Representation via the Fokker-Planck Equation......Page 258
5.6 Logistic Growth Model of ε(t)......Page 260
5.7 Convergence Considerations......Page 264
5.8 Further Considerations on the Dynamics of ε(t)......Page 272
5.9 Dynamics of Fuzzy Uncertainty......Page 278
5.10 Concluding Remarks......Page 282
Bibliography......Page 284
6.2 Neural Complexity......Page 286
6.3 Complexity Measure......Page 293
6.4 Neural Networks: Simple and Complex......Page 299
6.5 Neural Complexity versus Neural Entropy......Page 301
6.6 Neural Network Training via Complexity Parameter......Page 304
6.7 Calculation of μn and σn......Page 306
6.8 Perceptron Training: Simulated Results......Page 308
6.9 Concluding Remarks......Page 312
Bibliography......Page 314
Appendix 6.1......Page 316
7.1 Introduction......Page 320
7.2 Inter-Event Interval Histograms (IIH) and Stochastic Resonance......Page 328
7.3 A Neural Network under SR-Based Learning......Page 332
7.4 Simulation Results......Page 336
7.5 Concluding Remarks......Page 339
Bibliography......Page 342
8.1 Entropy, Thermobiodynamics and Bioinformatics......Page 346
8.2 Genetic Code......Page 357
8.3 Search Algorithms......Page 359
8.4 Simple Genetic Algorithm (SGA)......Page 373
8.5 Genetic Algorithms and Neural Networks......Page 378
8.6 Information-Theoretics of Genetic Selection Algorithm......Page 385
8.7 A Test ANN Architecture Deploying GA and IT Concepts......Page 389
8.8 Description of the Algorithm......Page 394
8.9 Experimental Simulations......Page 399
8.10 Concluding Remarks......Page 408
Bibliography......Page 410
Subject Index......Page 412
Similar Volumes
Since the appearance of Vol. 1 of Models of Neural Networks in 1991, the theory of neural nets has focused on two paradigms: information coding through coherent firing of the neurons and functional feedback. Information coding through coherent neuronal firing exploits time as a cardinal degree of …
This important work describes recent theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems and addresses the key statistical and computational questions. Chapters survey research on pattern classification with binary-output …
Publisher: Springer, 1995, 354 pp.
The second volume of the Physics of Neural Networks series.
Models of Neural Networks I (/file/1427677/)