Neural networks have been proposed as solutions to complex pattern recognition problems at which humans excel but for which algorithmic approaches have not been very successful. Examples of such problems include recognizing an object regardless of its viewing angle or perspective in an image and rec…
Transformation invariance in pattern recognition: Tangent distance and propagation
- Authors
- Patrice Y. Simard; Yann A. Le Cun; John S. Denker; Bernard Victorri
- Publisher
- John Wiley and Sons
- Year
- 2000
- Language
- English
- File size
- 356 KB
- Volume
- 11
- Category
- Article
- ISSN
- 0899-9457
Synopsis
In pattern recognition, statistical modeling, or regression, the amount of data is a critical factor affecting the performance. If the amount of data and computational resources are unlimited, even trivial algorithms will converge to the optimal solution. However, in the practical case, given limited data and other resources, satisfactory performance requires sophisticated methods to regularize the problem by introducing a priori knowledge. Invariance of the output with respect to certain transformations of the input is a typical example of such a priori knowledge. We introduce the concept of tangent vectors, which compactly represent the essence of these transformation invariances, and two classes of algorithms, tangent distance and tangent propagation, which make use of these invariances to improve performance.
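The tangent-distance idea summarized above can be sketched in a few lines: given tangent vectors that span the directions in which a pattern moves under small transformations, the distance is measured from a pattern to the tangent plane rather than to the point itself. The sketch below is a minimal one-sided illustration under our own assumptions (the function and variable names are ours, and the single tangent vector is a finite-difference translation derivative), not the paper's implementation:

```python
import numpy as np

def tangent_distance(x, y, tangents):
    """One-sided tangent distance: minimal Euclidean distance between y
    and the tangent plane {x + T @ a} spanned at x by the columns of T.

    x, y     : flattened patterns, shape (d,)
    tangents : matrix T of tangent vectors, shape (d, k)
    """
    # Solve min_a ||(x + T a) - y||^2 as an ordinary least-squares problem.
    a, *_ = np.linalg.lstsq(tangents, y - x, rcond=None)
    residual = (y - x) - tangents @ a
    return np.linalg.norm(residual)

# Toy example: for a 1-D signal, the derivative along the axis is the
# tangent vector of small translations, so a slightly shifted copy of a
# pattern ends up closer in tangent distance than in plain Euclidean
# distance.
t = np.linspace(0.0, 1.0, 50)
x = np.exp(-((t - 0.50) ** 2) / 0.01)   # Gaussian bump
y = np.exp(-((t - 0.52) ** 2) / 0.01)   # slightly shifted bump
tx = np.gradient(x, t).reshape(-1, 1)   # translation tangent vector at x

assert tangent_distance(x, y, tx) < np.linalg.norm(x - y)
```

The same projection applies with several tangent vectors at once (rotation, scaling, thickness, and so on), in which case `tangents` simply gains more columns and the least-squares solve finds the best combination of small transformations.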
SIMILAR VOLUMES
The paper is devoted to the recognition of objects and patterns deformed by imaging geometry as well as by unknown blurring. We introduce a new class of features invariant simultaneously to blurring with a centrosymmetric PSF and to affine transformation. As we prove in the paper, they can be constructed…