Published by Imperial College Press, 2007, 322 pp.

The area of neural computing that we shall discuss in this book represents a combination of techniques of classical optimization, statistics, and information theory. Neural networks were once widely called artificial neural networks …
Neural Networks and Computing: Learning Algorithms and Applications (Series in Electrical and Computer Engineering)
by Tommy W. S. Chow
- Publisher
- World Scientific Publishing Company
- Year
- 2007
- Language
- English
- Pages
- 322
- Series
- Series in Electrical and Computer Engineering 7
- Category
- Library
Synopsis
This book covers neural networks with special emphasis on advanced learning methodologies and applications. It includes practical issues of weight initialization, stalling of learning, and escape from local minima, which have not been covered by many existing books in this area. Additionally, the book highlights the important feature selection problem, which baffles many neural network practitioners because of the difficulty of handling large datasets. It also contains several interesting IT, engineering, and bioinformatics applications.
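The gradient-descent learning that the synopsis and the early chapters center on can be illustrated with a minimal sketch. This is not code from the book; the data, model, and learning rate are illustrative assumptions, and a single linear unit is used so the least-squares cost has one minimum:

```python
import numpy as np

# Minimal sketch: batch gradient descent on a mean-squared-error cost,
# the setting in which stalling and local-minima issues are discussed.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, 50)
y = 3.0 * X + 0.5              # target mapping: w = 3.0, b = 0.5

w, b, lr = 0.0, 0.0, 0.1       # initial weights and learning rate
for _ in range(500):
    err = (w * X + b) - y
    # gradients of the MSE cost with respect to w and b
    w -= lr * 2.0 * np.mean(err * X)
    b -= lr * 2.0 * np.mean(err)

print(round(w, 3), round(b, 3))  # → 3.0 0.5
```

With a nonlinear multi-layer network the same update rule applies to every weight (via backpropagation), but the cost surface is no longer convex, which is where the book's treatment of weight initialization and escape from local minima comes in.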
Table of Contents
Preface......Page 6
Contents......Page 10
1.1 Background......Page 14
1.2 Neuron Model......Page 15
1.3 Historical Remarks......Page 17
1.4.1 Supervised Neural Networks......Page 19
1.4.1.1 McCulloch and Pitts Model......Page 20
1.4.1.2 The Perceptron Model......Page 24
1.4.1.3 Multi-layer Feedforward Network......Page 27
1.4.1.4 Recurrent Networks......Page 28
1.4.2 Unsupervised Neural Networks......Page 30
1.5.1 Determination of Parameters......Page 32
1.5.2 Gradient Descent Searching Method......Page 38
Exercises......Page 39
2. Learning Performance and Enhancement......Page 44
2.1 Fundamentals of Gradient Descent Optimization......Page 45
2.2 Conventional Backpropagation Algorithm......Page 48
2.3 Convergence Enhancement......Page 55
2.3.1 Extended Backpropagation Algorithm......Page 57
2.3.2 Least Squares Based Training Algorithm......Page 60
2.3.3 Extended Least Squares Based Algorithm......Page 68
2.4 Initialization Consideration......Page 72
2.4.1 Weight Initialization Algorithm I......Page 74
2.4.2 Weight Initialization Algorithm II......Page 77
2.4.3 Weight Initialization Algorithm III......Page 80
2.5 Global Learning Algorithms......Page 82
2.5.1 Simulated Annealing Algorithm......Page 83
2.5.2 Alopex Algorithm......Page 84
2.5.3 Reactive Tabu Search......Page 85
2.5.4 The NOVEL Algorithm......Page 86
2.5.5 The Heuristic Hybrid Global Learning Algorithm......Page 87
2.6.1 Fast Learning Algorithms......Page 95
2.6.2 Weight Initialization Methods......Page 96
2.6.3 Global Learning Algorithms......Page 97
Appendix 2.1......Page 98
Exercises......Page 100
3. Generalization and Performance Enhancement......Page 104
3.1 Cost Function and Performance Surface......Page 106
3.1.1 Maximum Likelihood Estimation......Page 107
3.1.2 The Least-Square Cost Function......Page 108
3.2 Higher-Order Statistic Generalization......Page 111
3.2.1 Definitions and Properties of Higher-Order Statistics......Page 112
3.2.2 The Higher-Order Cumulants based Cost Function......Page 114
3.2.3 Property of the Higher-Order Cumulant Cost Function......Page 118
3.2.4 Learning and Generalization Performance......Page 121
3.2.4.1 Experiment one: Henon Attractor......Page 122
3.2.4.2 Experiment Two: Sunspot time-series......Page 128
3.3 Regularization for Generalization Enhancement......Page 129
3.3.1 Adaptive Regularization Parameter Selection (ARPS) Method......Page 132
3.3.1.1 Stalling Identification Method......Page 133
3.3.1.2 Selection Schemes......Page 134
3.3.2 Synthetic Function Mapping......Page 137
3.4 Concluding Remarks......Page 138
3.4.1 Objective function selection......Page 140
3.4.2 Regularization selection......Page 142
Appendix 3.1 Confidence Upper Bound of Approximation Error.......Page 143
Appendix 3.2 Proof of the Property of the HOC Cost Function......Page 145
Appendix 3.3 The Derivation of the Sufficient Conditions of the Regularization Parameter......Page 148
Exercises......Page 149
4. Basis Function Networks for Classification......Page 152
4.1 Linear Separation and Perceptrons......Page 153
4.2 Basis Function Model for Parametric Smoothing......Page 155
4.3.1 RBF Networks Architecture......Page 157
4.3.2 Universal Approximation......Page 159
4.3.3 Initialization and Clustering......Page 162
4.3.4.1 Linear Weights Optimization.......Page 165
4.3.4.2 Gradient Descent Optimization......Page 167
4.3.4.3 Hybrid of Least Squares and Penalized Optimization......Page 168
4.3.5 Regularization Networks.......Page 170
4.4.1 Support Vector Machine.......Page 172
4.4.2 Wavelet Network......Page 174
4.4.3 Fuzzy RBF Controllers.......Page 177
4.4.4 Probabilistic Neural Networks......Page 180
4.5 Concluding Remarks.......Page 182
Exercises......Page 183
5.1 Introduction......Page 186
5.2 Self-Organizing Maps......Page 189
5.2.1 Learning Algorithm......Page 190
5.3.1 Cell Splitting Grid......Page 194
5.3.2 Growing Hierarchical Self-Organizing Quadtree Map......Page 197
5.4.1 Cellular Probabilistic SOM......Page 199
5.4.2 Probabilistic Regularized SOM......Page 204
5.5 Clustering of SOM......Page 212
5.6 Multi-Layer SOM for Tree-Structured data......Page 215
5.6.1 SOM Input Representation......Page 216
5.6.2 MLSOM Training......Page 218
5.6.3 MLSOM Visualization and Classification......Page 220
Exercises......Page 223
6.1 Introduction......Page 226
6.2 Support Vector Machines (SVM)......Page 230
6.2.1 Support Vector Machine Visualization (SVMV)......Page 232
6.3 Cost Function......Page 236
6.3.1 MSE and MCE Cost Functions......Page 238
6.3.2 Hybrid MCE-MSE Cost Function......Page 240
6.3.3 Implementing MCE-MSE......Page 243
6.4 Feature Selection......Page 247
6.4.1.1 Mutual Information......Page 249
6.4.1.2 Probability density function (pdf) estimation......Page 251
6.4.2 MI Based Forward Feature Selection......Page 253
6.4.2.1 MIFS and MIFS-U......Page 255
6.4.2.2 Using quadratic MI......Page 256
Exercises......Page 261
7.1 Electric Load Forecasting......Page 264
7.1.1 Nonlinear Autoregressive Integrated Neural Network Model......Page 266
7.1.2 Case Studies......Page 270
7.2 Content-based Image Retrieval Using SOM......Page 275
7.2.1.1 Overall Architecture of GHSOQM-Based CBIR System......Page 276
7.2.1.2 Image Segmentation, Feature Extraction and Region-Based Feature Matrices......Page 277
7.2.1.3 Image Distance......Page 278
7.2.1.4 GHSOQM and Relevance Feedback in the CBIR System......Page 279
7.2.2 Performance Evaluation......Page 283
7.3 Feature Selection for cDNA Microarray......Page 286
7.3.1 MI Based Forward Feature Selection Scheme......Page 288
7.3.2 The Supervised Grid Based Redundancy Elimination......Page 289
7.3.3 The Forward Gene Selection Process Using MIIO and MISF......Page 290
7.3.4 Results......Page 291
7.3.4.1 Prostate Cancer Classification Dataset......Page 293
7.3.4.2 Subtype of ALL Classification Dataset......Page 296
7.3.5 Remarks......Page 302
Bibliography......Page 304
Index......Page 318
SIMILAR VOLUMES
This volume of research papers comprises the proceedings of the first International Conference on Mathematics of Neural Networks and Applications (MANNA), which was held at Lady Margaret Hall, Oxford from July 3rd to 7th, 1995 and attended by 116 people. The meeting was strongly supported and …
"This book presents the most recent state-of-the-art theories, methods and techniques in various aspects of mobile computing and communications across engineering, business and organizational perspectives. It gives insights on the recent advancements in various aspects of mobile computing and commun
Neural Networks: Computational Models and Applications presents important theoretical and practical issues in neural networks, including the learning algorithms of feed-forward neural networks, various dynamical properties of recurrent neural networks, winner-take-all networks and their applications …