Connectionist hashed associative memory
By Ronald L. Greene
- Publisher: Elsevier Science
- Year: 1991
- Language: English
- Size: 586 KB
- Volume: 48
- Category: Article
- ISSN: 0004-3702
Synopsis
Greene, R.L., Connectionist hashed associative memory (Research Note), Artificial Intelligence 48 (1991) 87-98. This paper proposes the use of simple connectionist networks as hashing functions for sparse associative, or content-addressable, memory. The robustness of such networks in the presence of noisy inputs, and the property that "similar inputs lead to similar outputs", permit (in a probabilistic sense) faster-than-linear retrieval of the data that best fit the input. The input may be noisy or consist of partially specified feature vectors. Mathematical analysis is presented for the Boolean feature case using a network with randomly selected connection strengths.
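The idea the synopsis describes can be sketched with a random-projection hash: a single layer of units with randomly selected connection strengths thresholds its weighted input sums, and the units' binary outputs address a memory bucket. This is an illustrative sketch of the general technique, not the paper's exact construction; the sizes, threshold, and bucket encoding below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 16   # length of the Boolean feature vector (assumed size)
N_UNITS = 8       # hashing units, giving 2**N_UNITS buckets (assumed size)

# Randomly selected connection strengths, one row per hashing unit.
W = rng.normal(size=(N_UNITS, N_FEATURES))

def hash_bucket(x):
    """Map a Boolean feature vector to a bucket index.

    Each unit thresholds its weighted input sum at zero; the units'
    binary outputs are read together as a bucket address.
    """
    bits = (W @ np.asarray(x, dtype=float)) > 0.0
    return int(np.dot(bits, 2 ** np.arange(N_UNITS)))

# "Similar inputs lead to similar outputs": a noisy copy of a stored
# pattern tends to land in the same or a nearby bucket, so retrieval
# probes a few buckets instead of scanning the whole memory.
x = rng.integers(0, 2, size=N_FEATURES)
noisy = x.copy()
noisy[0] ^= 1  # flip one feature
print(hash_bucket(x), hash_bucket(noisy))
```

Because flipping one feature perturbs each unit's sum only slightly, most output bits are unchanged, which is what makes probing nearby buckets effective in the probabilistic sense the synopsis mentions.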
SIMILAR VOLUMES
A new associative memory system composed of persistent-current bit-cells is developed. Each word of the memory system is associated with a selection and control network which is made up of only seven cryotrons. In the proposed system both reading and writing circuits are derived from three-terminal…
## αΊe present a linguistic extension from a crisp model by using a codification model that allows us to implement a fuzzy system on a discrete decision model. The paper begins with an introduction to the representation of fuzzy information, followed by a discussion of the codification method and t
Neural network models are used to investigate the ways in which bees use landmarks to navigate through space. The snapshot hypothesis, whereby bees remember a position in space by taking an instantaneous snapshot of the configuration of landmarks, is explored using a Hebbian learning rule and a dist…
Bidirectional associative memory (BAM) is a form of heteroassociative memory that can recall and restore patterns. Being a Hebbian learning-based memory, it has the problem of very low capacity. A training paradigm called the Pseudo-Relaxation Learning Algorithm (PRLAB) greatly increases the memor…