
How OWE architectures encode contextual effects in artificial neural networks

✍ Scribed by Nicolas Pican; Frédéric Alexandre


Book ID: 103897778
Publisher: Elsevier Science
Year: 1996
Tongue: English
Weight: 601 KB
Volume: 41
Category: Article
ISSN: 0378-4754


✦ Synopsis


Artificial neural networks (ANNs) are widely used for classification tasks in which both discriminant cues and contextual parameters are proposed as ANN inputs. When the input space is too large for robust, time-limited learning, a classical solution is to design a set of ANNs, one per context domain. We have proposed a new learning algorithm, the lateral contribution learning algorithm (LCLA), based on the backpropagation learning algorithm, which supports such a solution with reduced learning time and better performance thanks to lateral influences between the networks. This attractive but heavy solution has been improved by the orthogonal weight estimator (OWE) technique, an original architectural technique which, under light constraints, merges the set of ANNs into a single ANN whose weights are dynamically estimated, for each example, by other ANNs fed with the context. This architecture yields a very rich and interesting interpretation of the weight landscape. We illustrate this interpretation with two examples: the estimation of a mathematical function and the modelling of a process used in neurocontrol.
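The core OWE idea described above can be sketched in a few lines: the main network's weights are not stored as free parameters but are computed, per example, by an estimator network fed with the context. The sketch below is a minimal illustration of that principle only, not the authors' exact architecture; the layer sizes, the single linear main unit, and all names (`estimate_weights`, `owe_forward`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CUE, N_CTX, N_EST_HID = 3, 2, 8  # cue inputs, context inputs, estimator hidden units

# Estimator network (one hidden layer): maps the context vector to the
# weight vector of the main unit. In a trained OWE system these
# parameters would be learned; here they are random for illustration.
V1 = rng.normal(scale=0.5, size=(N_EST_HID, N_CTX))
b1 = np.zeros(N_EST_HID)
V2 = rng.normal(scale=0.5, size=(N_CUE, N_EST_HID))
b2 = np.zeros(N_CUE)

def estimate_weights(context):
    """Dynamically estimate the main unit's weights from the context."""
    h = np.tanh(V1 @ context + b1)
    return V2 @ h + b2

def owe_forward(cues, context):
    """Main unit: linear in the discriminant cues, with context-dependent
    weights supplied by the estimator network."""
    w = estimate_weights(context)
    return float(w @ cues)
```

Evaluating `estimate_weights` on two different contexts yields two different effective weight vectors for the same main unit, which is what gives the weight landscape its context-dependent interpretation.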