𝔖 Scriptorium
✦   LIBER   ✦

📕

Neural Networks for Pattern Recognition

โœ Scribed by Christopher M. Bishop


Publisher
Oxford University Press, USA
Year
1996
Tongue
English
Leaves
496
Category
Library

⬇  Acquire This Volume

No coin nor oath required. For personal study only.

✦ Synopsis


Dr. Bishop is a world-renowned expert in this field, but his book didn't work for me. Despite the title, it covers the more general topic of classification, not just neural networks, and it does so less well than my favorites (especially Hastie and Tibshirani). For specific discussion of nonlinear classifiers, I preferred Cristianini's treatment of SVMs.

The most positive feature is a detailed discussion of the Kolmogorov theorem, which I found very powerful, though perhaps not in the way the author intended. To paraphrase, the theorem states that a structure as simple as a single-hidden-layer neural network is dense with respect to the space of continuous functions. I interpret that to mean a neural network model can fit anything, completely independent of any underlying relationship. I've always been skeptical of NNs, and this property gives me a fact to support my bias.

Bishop's discussion of the Kolmogorov theorem captures my opinion of the book: he presents the theorem as an aside and does not propose to draw any inferences from it. I prefer authors who have more of a central thesis.

That said, many people really like this book. I bought it based on the near-universal praise, so my dislike may be more a matter of personal taste than a reliable guideline.
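The density property the review paraphrases is usually stated as the universal approximation theorem (a result related to, but distinct from, Kolmogorov's representation theorem; Bishop discusses both). A standard formulation — not a quotation from the book — reads:

```latex
% Universal approximation (density) property, as paraphrased above:
% any continuous target on a compact set can be matched to within
% any tolerance by a single-hidden-layer network with enough units.
\[
\forall f \in C(K),\ \forall \varepsilon > 0,\
\exists M,\, \{w_i, b_i, v_i\}_{i=1}^{M} :
\quad
\sup_{x \in K}
\Bigl|\, f(x) - \sum_{i=1}^{M} v_i\,\sigma\!\bigl(w_i^{\top} x + b_i\bigr) \Bigr|
< \varepsilon ,
\]
where $K \subset \mathbb{R}^d$ is compact and $\sigma$ is a fixed
non-polynomial activation function (e.g. the sigmoid).
```

Note that the theorem guarantees existence of such a network, not that training will find it; the reviewer's reading (it "can fit anything") follows from the existence claim alone.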


📜 SIMILAR VOLUMES


Neural Networks for Pattern Recognition
โœ Christopher M. Bishop ๐Ÿ“‚ Library ๐Ÿ“… 1995 ๐Ÿ› Oxford University Press, USA ๐ŸŒ English

This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. After introducing the basic concepts, the book examines techniques for modelling probability density functions and the properties and merits of the multi-layer perceptron


Neural Networks for Pattern Recognition
โœ Albert Nigrin ๐Ÿ“‚ Library ๐Ÿ“… 1993 ๐Ÿ› The MIT Press ๐ŸŒ English

Neural Networks for Pattern Recognition takes the pioneering work in artificial neural networks by Stephen Grossberg and his colleagues to a new level. In a simple and accessible way it extends embedding field theory into areas of machine intelligence that have not been clearly dealt with before.
