Dr. Bishop is a world-renowned expert in this field, but his book didn't work for me. Despite the title, it covers the more general topic of classification, not just neural networks, and it does so less well than my favorites (especially Hastie and Tibshirani). In terms of specific discussion of nonl…
Neural Networks for Pattern Recognition
By Christopher M. Bishop
- Publisher: Oxford University Press, USA
- Year: 1996
- Language: English
- Pages: 496
- Category: Library
Free of charge; no registration required. For personal study only.
SIMILAR BOOKS
This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. After introducing the basic concepts, the book examines techniques for modelling probability density functions and the properties and merits of the multi-layer perceptron…
Neural Networks for Pattern Recognition takes the pioneering work in artificial neural networks by Stephen Grossberg and his colleagues to a new level. In a simple and accessible way it extends embedding field theory into areas of machine intelligence that have not been clearly dealt with before.