<p>Providing a broad but in-depth introduction to neural networks and machine learning in a statistical framework, this book serves as a single, comprehensive resource for study and further research. All the major popular neural network models and statistical learning approaches are covered with examples.</p>
Neural Networks and Statistical Learning
✍ Authors: Ke-Lin Du, M. N. S. Swamy
- Publisher
- Springer
- Year
- 2019
- Language
- English
- Pages
- 996
- Edition
- 2
- Category
- Library
Free of charge, no registration required. For personal study only.
✦ Table of Contents
Preface to the Second Edition......Page 6
Preface to the First Edition......Page 8
Contents......Page 11
Abbreviations......Page 25
1.1 Major Events in Machine Learning Research......Page 29
1.2 Neurons......Page 32
1.2.1 McCulloch–Pitts Neuron Model......Page 33
1.2.2 Spiking Neuron Models......Page 35
1.3 Neural Networks......Page 37
1.4 Neural Network Processors......Page 41
1.5 Scope of the Book......Page 44
References......Page 45
2.1 Learning and Inference Methods......Page 48
2.1.1 Scientific Reasoning......Page 49
2.1.2 Supervised, Unsupervised, and Reinforcement Learning......Page 51
2.1.3 Semi-supervised Learning and Active Learning......Page 54
2.1.4 Other Learning Methods......Page 55
2.2 Learning and Generalization......Page 60
2.2.1 Generalization Error......Page 61
2.2.2 Generalization by Stopping Criterion......Page 62
2.2.3 Generalization by Regularization......Page 63
2.2.4 Dropout......Page 64
2.2.5 Fault Tolerance and Generalization......Page 66
2.3 Model Selection......Page 67
2.3.1 Cross-Validation......Page 68
2.3.2 Complexity Criteria......Page 70
2.4 Bias and Variance......Page 72
2.5 Criterion Functions......Page 74
2.6 Robust Learning......Page 76
2.7.1 Boolean Function Approximation......Page 78
2.7.2 Linear Separability and Nonlinear Separability......Page 80
2.7.3 Continuous Function Approximation......Page 82
2.7.4 Winner-Takes-All......Page 83
References......Page 85
3.1 Introduction......Page 91
3.2 Probably Approximately Correct (PAC) Learning......Page 92
3.2.1 Sample Complexity......Page 93
3.3 Vapnik–Chervonenkis Dimension......Page 94
3.4 Rademacher Complexity......Page 96
3.5 Empirical Risk-Minimization Principle......Page 98
3.5.1 Function Approximation, Regularization, and Risk Minimization......Page 100
3.6 Fundamental Theorem of Learning Theory......Page 101
3.7 No-Free-Lunch Theorem......Page 102
References......Page 103
4.1 One-Neuron Perceptron......Page 106
4.2 Single-Layer Perceptron......Page 107
4.3 Perceptron Learning Algorithm......Page 108
4.4 Least Mean Squares (LMS) Algorithm......Page 110
4.5 P-Delta Rule......Page 113
4.6 Other Learning Algorithms......Page 114
References......Page 118
5.1 Introduction......Page 121
5.2 Universal Approximation......Page 122
5.3 Backpropagation Learning Algorithm......Page 123
5.4 Incremental Learning Versus Batch Learning......Page 128
5.5 Activation Functions for the Output Layer......Page 133
5.6.1 Network Pruning Using Sensitivity Analysis......Page 134
5.6.2 Network Pruning Using Regularization......Page 137
5.6.3 Network Growing......Page 139
5.7.1 Eliminating Premature Saturation......Page 141
5.7.2 Adapting Learning Parameters......Page 143
5.7.3 Initializing Weights......Page 147
5.7.4 Adapting Activation Function......Page 148
5.8 Some Improved BP Algorithms......Page 151
5.8.1 BP with Global Descent......Page 152
5.8.2 Robust BP Algorithms......Page 153
5.9 Resilient Propagation (Rprop)......Page 154
5.10 Spiking Neural Network Learning......Page 156
References......Page 159
6.1 Introduction to Second-Order Learning Methods......Page 166
6.2 Newton's Methods......Page 167
6.2.1 Gauss–Newton Method......Page 168
6.2.2 Levenberg–Marquardt Method......Page 169
6.3 Quasi-Newton Methods......Page 172
6.3.1 BFGS Method......Page 173
6.4 Conjugate Gradient Methods......Page 175
6.5 Extended Kalman Filtering Methods......Page 180
6.6 Recursive Least Squares......Page 182
6.7 Natural-Gradient-Descent Method......Page 183
6.8.1 Layerwise Linear Learning......Page 184
6.9 Escaping Local Minima......Page 185
6.10 Complex-Valued MLPs and Their Learning......Page 186
6.10.2 Fully Complex BP......Page 187
References......Page 191
7.1 Hopfield Model......Page 196
7.2 Continuous-Time Hopfield Network......Page 199
7.3 Simulated Annealing......Page 202
7.4 Hopfield Networks for Optimization......Page 205
7.4.1 Combinatorial Optimization Problems......Page 206
7.4.2 Escaping Local Minima......Page 210
7.4.3 Solving Other Optimization Problems......Page 211
7.5.1 Chaos, Bifurcation, and Fractals......Page 212
7.5.2 Chaotic Neural Networks......Page 213
7.6 Multistate Hopfield Networks......Page 216
7.7 Cellular Neural Networks......Page 217
References......Page 220
8.1 Introduction......Page 224
8.2.1 Generalized Hebbian Rule......Page 226
8.2.3 Perceptron-Type Learning Rule......Page 228
8.2.4 Retrieval Stage......Page 229
8.3 Storage Capability of Hopfield Model......Page 230
8.4 Increasing Storage Capacity......Page 235
8.5 Multistate Hopfield Networks as Associative Memories......Page 237
8.6 Multilayer Perceptrons as Associative Memories......Page 238
8.7 Hamming Network......Page 240
8.8 Bidirectional Associative Memories......Page 242
8.9 Cohen–Grossberg Model......Page 243
8.10 Cellular Networks as Associative Memories......Page 244
References......Page 249
9.1 Vector Quantization......Page 253
9.2 Competitive Learning......Page 254
9.3 Self-Organizing Maps......Page 256
9.3.1 Kohonen Network......Page 257
9.3.2 Basic Self-Organizing Maps......Page 258
9.4 Learning Vector Quantization......Page 266
9.5 Nearest Neighbor Algorithms......Page 268
9.6 Neural Gas......Page 271
9.7 ART Networks......Page 274
9.7.1 ART Models......Page 275
9.7.2 ART 1......Page 276
9.8 C-Means Clustering......Page 278
9.9 Subtractive Clustering......Page 281
9.10.1 Fuzzy C-Means Clustering......Page 284
9.10.2 Other Fuzzy Clustering Algorithms......Page 287
References......Page 291
10.1.1 Competitive Learning with Conscience......Page 297
10.1.2 Rival Penalized Competitive Learning......Page 299
10.1.3 Soft-Competitive Learning......Page 301
10.2 Robust Clustering......Page 302
10.2.1 Possibilistic C-Means......Page 304
10.2.2 A Unified Framework for Robust Clustering......Page 305
10.3 Supervised Clustering......Page 306
10.4 Clustering Using Non-Euclidean Distance Measures......Page 307
10.5 Partitional, Hierarchical, and Density-Based Clustering......Page 309
10.6.1 Distance Measures, Cluster Representations, and Dendrograms......Page 310
10.6.2 Minimum Spanning Tree (MST) Clustering......Page 312
10.6.3 BIRCH, CURE, CHAMELEON, and DBSCAN......Page 314
10.6.4 Hybrid Hierarchical/Partitional Clustering......Page 317
10.7 Constructive Clustering Techniques......Page 318
10.8 Cluster Validity......Page 320
10.8.1 Measures Based on Compactness and Separation of Clusters......Page 321
10.8.2 Measures Based on Hypervolume and Density of Clusters......Page 322
10.8.3 Crisp Silhouette and Fuzzy Silhouette......Page 323
10.9 Projected Clustering......Page 325
10.10 Spectral Clustering......Page 326
10.11 Coclustering......Page 327
10.12 Handling Qualitative Data......Page 328
10.13 Bibliographical Notes......Page 329
References......Page 330
11.1 Introduction......Page 337
11.2 RBF Network Architecture......Page 339
11.3 Universal Approximation of RBF Networks......Page 340
11.4 Formulation for RBF Network Learning......Page 341
11.5 Radial Basis Functions......Page 342
11.6 Learning RBF Centers......Page 345
11.7.1 Least Squares Methods for Weights Learning......Page 347
11.8 RBF Network Learning Using Orthogonal Least Squares......Page 349
11.9.1 Supervised Learning for General RBF Networks......Page 351
11.9.2 Supervised Learning for Gaussian RBF Networks......Page 352
11.9.3 Discussion on Supervised Learning......Page 353
11.10 Various Learning Methods......Page 354
11.11 Normalized RBF Networks......Page 356
11.12.1 Constructive Methods......Page 357
11.12.2 Resource-Allocating Networks......Page 359
11.13 Complex RBF Networks......Page 361
11.14 A Comparison of RBF Networks and MLPs......Page 363
References......Page 367
12.1 Introduction......Page 372
12.2 Fully Connected Recurrent Networks......Page 374
12.3 Time-Delay Neural Networks......Page 375
12.4 Backpropagation for Temporal Learning......Page 378
12.6 Some Recurrent Models......Page 381
12.7 Reservoir Computing......Page 383
References......Page 389
13.1 Introduction......Page 393
13.1.1 Hebbian Learning Rule......Page 394
13.1.2 Oja's Learning Rule......Page 395
13.2 PCA: Conception and Model......Page 396
13.3.1 Subspace Learning Algorithms......Page 399
13.3.2 Generalized Hebbian Algorithm......Page 403
13.4 Least Mean Squared Error-Based PCA......Page 405
13.4.1 Other Optimization-Based PCA......Page 409
13.5 Anti-Hebbian Rule-Based PCA......Page 410
13.5.1 APEX Algorithm......Page 411
13.6 Nonlinear PCA......Page 415
13.6.1 Autoassociative Network-Based Nonlinear PCA......Page 416
13.7.1 Extracting the First Minor Component......Page 418
13.7.2 Self-Stabilizing Minor Component Analysis......Page 419
13.7.4 Other Algorithms......Page 420
13.8 Constrained PCA......Page 421
13.8.1 Sparse PCA......Page 422
13.9 Localized PCA, Incremental PCA, and Supervised PCA......Page 423
13.10 Complex-Valued PCA......Page 425
13.11 Two-Dimensional PCA......Page 426
13.12 Generalized Eigenvalue Decomposition......Page 427
13.13.1 Cross-Correlation Asymmetric PCA Networks......Page 429
13.13.2 Extracting Principal Singular Components for Nonsquare Matrices......Page 432
13.13.3 Extracting Multiple Principal Singular Components......Page 433
13.14 Factor Analysis......Page 434
13.15 Canonical Correlation Analysis......Page 435
References......Page 438
14.1 Introduction......Page 446
14.2.1 Multiplicative Update Algorithm and Alternating Nonnegative Least Squares......Page 448
14.3 Other NMF Methods......Page 451
14.3.1 NMF Methods for Clustering......Page 454
14.3.2 Concept Factorization......Page 456
14.4 Nyström Method......Page 457
14.5 CUR Decomposition......Page 459
References......Page 460
15.1 Introduction......Page 465
15.2 ICA Model......Page 466
15.3 Approaches to ICA......Page 467
15.4.1 Infomax ICA......Page 469
15.4.2 EASI, JADE, and Natural Gradient ICA......Page 471
15.4.3 FastICA Algorithm......Page 472
15.5 ICA Networks......Page 477
15.6.2 Constrained ICA......Page 480
15.6.3 Nonnegativity ICA......Page 481
15.6.4 ICA for Convolutive Mixtures......Page 482
15.6.5 Other BSS/ICA Methods......Page 483
15.7 Complex-Valued ICA......Page 486
15.8 Source Separation for Time Series......Page 488
15.9 EEG, MEG, and fMRI......Page 490
References......Page 494
16.1 Linear Discriminant Analysis......Page 501
16.3 Fisherfaces......Page 505
16.4 Regularized LDA......Page 506
16.5 Uncorrelated LDA and Orthogonal LDA......Page 508
16.6 LDA/GSVD and LDA/QR......Page 509
16.7 Incremental LDA......Page 510
16.8 Other Discriminant Methods......Page 511
16.9 Nonlinear Discriminant Analysis......Page 513
16.10 Two-Dimensional Discriminant Analysis......Page 515
References......Page 516
17.1 Introduction......Page 520
17.2 Learning Through Awards......Page 522
17.3 Actor–Critic Model......Page 524
17.4 Model-Free and Model-Based Reinforcement Learning......Page 526
17.5 Learning from Demonstrations......Page 529
17.6 Temporal-Difference Learning......Page 530
17.6.1 TD(λ)......Page 531
17.6.2 Sarsa(λ)......Page 532
17.7 Q-Learning......Page 533
17.8 Multiagent Reinforcement Learning......Page 535
17.8.1 Equilibrium-Based Multiagent Reinforcement Learning......Page 536
17.8.2 Learning Automata......Page 537
References......Page 538
18.1 Introduction......Page 541
18.2 Compressed Sensing......Page 542
18.2.1 Restricted Isometry Property......Page 543
18.2.2 Sparse Recovery......Page 544
18.2.3 Iterative Hard Thresholding......Page 546
18.2.4 Orthogonal Matching Pursuit......Page 548
18.2.5 Restricted Isometry Property for Signal Recovery Methods......Page 549
18.3 Sparse Coding and Dictionary Learning......Page 551
18.4 LASSO......Page 554
18.5 Other Sparse Algorithms......Page 556
References......Page 557
19.1 Introduction......Page 564
19.2 Matrix Completion......Page 565
19.2.1 Minimizing the Nuclear Norm......Page 566
19.2.2 Matrix Factorization-Based Methods......Page 568
19.2.3 Theoretical Guarantees on Exact Matrix Completion......Page 569
19.2.4 Discrete Matrix Completion......Page 571
19.3 Low-Rank Representation......Page 572
19.4 Tensor Factorization and Tensor Completion......Page 573
19.4.1 Tensor Factorization......Page 575
19.4.2 Tensor Completion......Page 576
References......Page 578
20.1 Introduction......Page 584
20.2 Kernel Functions and Representer Theorem......Page 585
20.3 Kernel PCA......Page 587
20.4 Kernel LDA......Page 591
20.5 Kernel Clustering......Page 593
20.6 Kernel Auto-associators, Kernel CCA, and Kernel ICA......Page 594
20.7 Other Kernel Methods......Page 596
20.8 Multiple Kernel Learning......Page 598
References......Page 601
21.1 Introduction......Page 608
21.2 SVM Model......Page 609
21.3 Solving the Quadratic Programming Problem......Page 612
21.3.2 Decomposition......Page 614
21.4 Least Squares SVMs......Page 618
21.5.1 SVM Algorithms with Reduced Kernel Matrix......Page 621
21.5.2 ν-SVM......Page 623
21.5.3 Cutting-Plane Technique......Page 624
21.5.5 Training SVM in the Primal Formulation......Page 625
21.5.6 Clustering-Based SVM......Page 627
21.5.7 Other SVM Methods......Page 628
21.6 Pruning SVMs......Page 630
21.7 Multiclass SVMs......Page 632
21.8 Support Vector Regression......Page 634
21.8.1 Solving Support Vector Regression......Page 636
21.9 Support Vector Clustering......Page 639
21.10 SVMs for One-Class Classification......Page 642
21.11 Incremental SVMs......Page 643
21.12.2 SVMs for Transductive or Semi-supervised Learning......Page 645
21.13 Solving SVM with Indefinite Matrices......Page 648
References......Page 650
22.1 Introduction......Page 660
22.1.1 Classical Versus Bayesian Approach......Page 661
22.1.2 Bayes' Theorem and Bayesian Classifiers......Page 662
22.1.3 Graphical Models......Page 663
22.2 Bayesian Network Model......Page 664
22.3 Learning Bayesian Networks......Page 667
22.3.1 Learning the Structure......Page 668
22.3.2 Learning the Parameters......Page 672
22.3.3 Constraint-Handling......Page 674
22.4.1 Belief Propagation......Page 675
22.4.2 Factor Graphs and Belief Propagation Algorithm......Page 678
22.5 Sampling (Monte Carlo) Methods......Page 681
22.5.1 Gibbs Sampling......Page 682
22.5.3 Particle Filtering......Page 684
22.6 Variational Bayesian Methods......Page 685
22.7 Hidden Markov Models......Page 687
22.8 Dynamic Bayesian Networks......Page 690
22.9 Expectation–Maximization Method......Page 691
22.10 Mixture Models......Page 693
22.11 Bayesian and Probabilistic Approach to Machine Learning......Page 694
22.11.1 Probabilistic PCA......Page 696
22.11.2 Probabilistic Clustering......Page 697
22.11.3 Probabilistic ICA......Page 698
22.11.5 Relevance Vector Machines......Page 700
References......Page 704
23.1 Boltzmann Machines......Page 714
23.1.1 Boltzmann Learning Algorithm......Page 716
23.2 Restricted Boltzmann Machines......Page 718
23.2.1 Universal Approximation......Page 720
23.2.2 Contrastive Divergence Algorithm......Page 721
23.2.3 Related Methods......Page 723
23.3 Mean-Field-Theory Machine......Page 724
23.4 Stochastic Hopfield Networks......Page 726
References......Page 727
24.1 Introduction......Page 731
24.2 Deep Neural Networks......Page 733
24.2.1 Deep Networks Versus Shallow Networks......Page 734
24.3 Deep Belief Networks......Page 735
24.3.1 Training Deep Belief Networks......Page 736
24.4 Deep Autoencoders......Page 737
24.5 Deep Convolutional Neural Networks......Page 738
24.5.1 Solving the Difficulties of Gradient Descent......Page 739
24.5.2 Implementing Deep Convolutional Neural Networks......Page 740
24.6 Deep Reinforcement Learning......Page 743
24.7 Other Deep Neural Network Methods......Page 744
References......Page 746
25.1 Introduction......Page 751
25.1.1 Ensemble Learning Methods......Page 752
25.1.2 Aggregation......Page 753
25.2 Majority Voting......Page 754
25.3 Bagging......Page 755
25.4 Boosting......Page 757
25.4.1 AdaBoost......Page 758
25.4.2 Other Boosting Algorithms......Page 760
25.5 Random Forests......Page 762
25.5.1 AdaBoost Versus Random Forests......Page 764
25.6.1 Ensemble Neural Networks......Page 765
25.6.2 Diversity Versus Ensemble Accuracy......Page 766
25.6.4 Ensembles for Streams......Page 767
25.7.1 One-Against-All Strategy......Page 768
25.7.2 One-Against-One Strategy......Page 769
25.7.3 Error-Correcting Output Codes (ECOCs)......Page 770
25.8 Dempster–Shafer Theory of Evidence......Page 772
References......Page 776
26.1 Introduction......Page 782
26.2 Definitions and Terminologies......Page 783
26.3 Membership Function......Page 789
26.4 Intersection, Union, and Negation......Page 790
26.5 Fuzzy Relation and Aggregation......Page 792
26.6 Fuzzy Implication......Page 794
26.7 Reasoning and Fuzzy Reasoning......Page 795
26.7.1 Modus Ponens and Modus Tollens......Page 796
26.7.2 Generalized Modus Ponens......Page 797
26.7.3 Fuzzy Reasoning Methods......Page 798
26.8 Fuzzy Inference Systems......Page 799
26.8.1 Fuzzy Rules and Fuzzy Inference......Page 800
26.8.2 Fuzzification and Defuzzification......Page 801
26.9.1 Mamdani Model......Page 802
26.9.2 Takagi–Sugeno–Kang Model......Page 803
26.10 Complex Fuzzy Logic......Page 805
26.11 Possibility Theory......Page 806
26.13 Granular Computing and Ontology......Page 808
References......Page 812
27.1 Introduction......Page 815
27.1.1 Interpretability......Page 816
27.2.1 Fuzzy Rules and Multilayer Perceptrons......Page 817
27.2.2 Fuzzy Rules and RBF Networks......Page 818
27.2.3 Rule Extraction from SVMs......Page 819
27.2.4 Rule Generation from Other Neural Networks......Page 820
27.3.1 Rule Generation Based on Fuzzy Partitioning......Page 821
27.3.2 Other Methods......Page 823
27.4 Synergy of Fuzzy Logic and Neural Networks......Page 824
27.5 ANFIS Model......Page 825
27.6 Generic Fuzzy Perceptron......Page 831
27.7 Fuzzy SVMs......Page 833
27.8 Other Neurofuzzy Models......Page 834
References......Page 837
28.1 Introduction......Page 841
28.2 Hardware/Software Codesign......Page 843
28.3 Topics in Digital Circuit Designs......Page 844
28.4.1 Memristor......Page 845
28.4.2 Circuits for MLPs......Page 847
28.4.3 Circuits for RBF Networks......Page 848
28.4.5 Circuits for SVMs......Page 849
28.4.6 Circuits for Other Neural Network Models......Page 850
28.4.7 Circuits for Fuzzy Neural Models......Page 851
28.5 Graphics Processing Unit (GPU) Implementation......Page 852
28.6 Implementation Using Systolic Algorithms......Page 854
28.7 Implementation on Parallel Computers......Page 855
28.7.1 Distributed and Parallel SVMs......Page 857
References......Page 858
29.1 Biometrics......Page 864
29.1.1 Physiological Biometrics and Recognition......Page 865
29.1.2 Behavioral Biometrics and Recognition......Page 868
29.2 Face Detection and Recognition......Page 869
29.2.1 Face Detection......Page 870
29.2.2 Face Recognition......Page 871
29.3 Bioinformatics......Page 873
29.3.1 Microarray Technology......Page 875
29.3.2 Motif Discovery, Sequence Alignment, Protein Folding, and Coclustering......Page 878
References......Page 880
30.1 Introduction......Page 882
30.2 Document Representations for Text Categorization......Page 883
30.3.1 Classification-Based Data Mining......Page 885
30.3.2 Clustering-Based Data Mining......Page 886
30.3.3 Bayesian Network-Based Data Mining......Page 889
30.4 XML Format......Page 890
30.5.1 Affective Computing......Page 892
30.6 Web Usage Mining......Page 893
30.7 Ranking Search Results......Page 894
30.7.1 Surfer Models......Page 895
30.7.2 PageRank Algorithm......Page 896
30.7.3 Hypertext-Induced Topic Search (HITS)......Page 899
30.8 Personalized Search......Page 900
30.9 Data Warehousing......Page 902
30.10 Content-Based Image Retrieval......Page 904
30.11 E-mail Anti-spamming......Page 907
References......Page 908
31.1.1 Introduction to Big Data......Page 915
31.1.2 MapReduce......Page 916
31.1.3 Hadoop Software Stack......Page 920
31.1.4 Other Big Data Tools......Page 921
31.1.5 NoSQL Databases......Page 922
31.2 Cloud Computing......Page 923
31.2.1 Services Models, Pricing, and Standards......Page 924
31.2.2 Virtual Machines, Data Centers, and Intercloud Connections......Page 927
31.2.3 Cloud Infrastructure Requirements......Page 930
31.3.1 Architecture of IoT......Page 932
31.3.2 Cyber-Physical System Versus IoT......Page 934
31.4 Fog/Edge Computing......Page 937
31.5 Blockchain......Page 938
References......Page 940
A.1 Linear Algebra......Page 943
A.2 Data Preprocessing......Page 951
A.3 Stability of Dynamic Systems......Page 953
A.4 Probability Theory and Stochastic Processes......Page 954
A.5 Numerical Optimization Techniques......Page 959
A.6 Classification Measures......Page 962
B.1 Face Databases......Page 966
B.2 UCI Machine Learning Repository......Page 970
B.3 Some Machine Learning Databases......Page 971
B.4 Datasets for Data Mining......Page 976
B.5 Databases and Tools for Speech Recognition and Audio Classification......Page 977
B.6 Datasets for Microarray and for Genome Analysis......Page 978
B.7 Software......Page 980
Index......Page 987
📜 SIMILAR VOLUMES
This book covers Lévy processes and their applications in the contexts of reliability and storage. Special attention is paid to life distributions and the maintenance of devices subject to degradation; estimating the parameters of the degradation process is also discussed.
<p>This book reviews some of the most recent developments in neural networks, with a focus on applications in actuarial sciences and finance. It simultaneously introduces the relevant tools for developing and analyzing neural networks, in a style that is mathematically rigorous yet accessible.</p>