𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Morphological bidirectional associative memories

โœ Scribed by G.X. Ritter; J.L. Diaz-de-Leon; P. Sussner


Publisher
Elsevier Science
Year
1999
Tongue
English
Weight
335 KB
Volume
12
Category
Article
ISSN
0893-6080

No coin nor oath required. For personal study only.

✦ Synopsis


The theory of artificial neural networks has been successfully applied to a wide variety of pattern recognition problems. In this theory, the first step in computing the next state of a neuron or in performing the next layer neural network computation involves the linear operation of multiplying neural values by their synaptic strengths and adding the results. Thresholding usually follows the linear operation in order to provide for nonlinearity of the network. In this paper we discuss a novel class of artificial neural networks, called morphological neural networks, in which the operations of multiplication and addition are replaced by addition and maximum (or minimum), respectively. By taking the maximum (or minimum) of sums instead of the sum of products, morphological network computation is nonlinear before thresholding. As a consequence, the properties of morphological neural networks are drastically different from those of traditional neural network models. The main emphasis of the research presented here is on morphological bidirectional associative memories (MBAMs). In particular, we establish a mathematical theory for MBAMs and provide conditions that guarantee perfect bidirectional recall for corrupted patterns. Some examples that illustrate performance differences between the morphological model and the traditional semilinear model are also given.
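To make the contrast concrete, here is a minimal sketch in NumPy of the max-of-sums operation the synopsis describes, applied to a single stored pattern pair. The construction shown (W[i, j] = y[i] - x[j], recall by max-plus product) is one simple instance of a morphological memory; the pattern values and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def maxplus(W, x):
    """Morphological 'product': (W [+] x)_i = max_j (W[i, j] + x[j]).
    A maximum of sums replaces the linear sum of products, so the
    computation is nonlinear even before any thresholding."""
    return (W + x[None, :]).max(axis=1)

# Illustrative pattern pair (values chosen arbitrarily for the demo).
x = np.array([1.0, 4.0, 2.0])
y = np.array([3.0, 0.0, 5.0, 1.0])

# For a single pair, W[i, j] = y[i] - x[j] gives exact recall:
# max_j (y[i] - x[j] + x[j]) = y[i].
W = y[:, None] - x[None, :]
print(maxplus(W, x))  # recovers y exactly

# Recall also survives erosive (downward) corruption of x as long as
# at least one coordinate is left intact, since the added differences
# x_noisy[j] - x[j] are all <= 0 and at least one is 0.
x_noisy = x - np.array([0.0, 2.0, 0.0])
print(maxplus(W, x_noisy))  # still recovers y
```

This kind of tolerance to one-sided noise is what the paper's recall conditions generalize to sets of stored pattern pairs.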


📜 SIMILAR VOLUMES


Fuzzy associative memories
📂 Article 📅 1991 🏛 Elsevier Science 🌐 English ⚖ 80 KB
Complexity preserving increase of the capacity
✍ Burkhard Lenze 📂 Article 📅 1998 🏛 Elsevier Science 🌐 English ⚖ 148 KB

In this paper, we show how to increase the capacity of Kosko-type bidirectional associative memories by introducing dilation and translation parameters in the pattern space. The essential point of this approach is that the increase in the capacity of the networks almost does not affect their complexity.