Geometric learning algorithm for elementary perceptron and its convergence conditions
✍ Authors: Seiji Miyoshi; Kazushi Ikeda; Kenji Nakayama
- Publisher
- John Wiley and Sons
- Year
- 1999
- Language
- English
- File size
- 258 KB
- Volume
- 82
- Category
- Article
- ISSN
- 1042-0967
✦ Synopsis
In this paper, the geometric learning algorithm (GLA) is proposed for an elementary perceptron, which consists of a single output neuron. The GLA is a modified version of the affine projection algorithm (APA) for adaptive filters. The weight update vector is determined geometrically with respect to the orthogonal complement of the k patterns to be classified, where k is the order of the GLA. In the case of the APA, the target of the coefficient update is a single point, which corresponds to the best identification of the unknown system. In the case of the GLA, on the other hand, the target of the weight update is the area in which all of the given patterns are classified correctly. Thus, their convergence conditions are different. In this paper, the convergence condition of the first-order GLA for two patterns is derived theoretically. The condition is expressed as a relation between the angle θ between the two patterns and the learning rate λ. Next, the new concept of the angle ψ_min of the solution area is introduced for the case of many patterns. Computer simulation results indicate that the relation between ψ_min and λ for convergence can be approximated by the relation between θ and λ for two patterns. Furthermore, it is proved that the first-order GLA always converges, regardless of the number of patterns, when λ = 2. This analysis and proof guarantee the convergence of the first-order GLA and confirm its usefulness as a learning algorithm. The distributions of ψ_min and λ for the convergence of the first-order GLA are approximately determined.
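The first-order update described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the first-order GLA moves the weight vector along a misclassified pattern's direction, so that a learning rate of λ = 2 reflects the weights across that pattern's separating hyperplane. All names (`first_order_gla`, `lam`, `w0`) are illustrative.

```python
def first_order_gla(patterns, labels, w0, lam=2.0, max_epochs=100):
    """Sketch of a first-order GLA-style perceptron update.

    patterns : list of input vectors (lists of floats)
    labels   : desired outputs, each +1 or -1
    w0       : initial weight vector (nonzero)
    lam      : learning rate; lam = 2 reflects w across the
               misclassified pattern's hyperplane (the case the
               abstract proves always converges)
    """
    w = list(w0)
    for _ in range(max_epochs):
        converged = True
        for x, d in zip(patterns, labels):
            z = [d * xi for xi in x]  # signed pattern: want w . z > 0
            proj = sum(wi * zi for wi, zi in zip(w, z))
            if proj <= 0:  # misclassified (or exactly on the boundary)
                norm2 = sum(zi * zi for zi in z)
                # Move w along z so that (w . z) becomes (1 - lam)(w . z);
                # with lam = 2 this is a reflection into the correct half-space.
                w = [wi - lam * proj / norm2 * zi for wi, zi in zip(w, z)]
                converged = False
        if converged:
            break
    return w
```

For a linearly separable set, the loop stops once every pattern satisfies w . z > 0, i.e., the weight vector has entered the solution area mentioned in the abstract.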