𝔖 Bobbio Scriptorium
✦   LIBER   ✦

Language acquisition from sparse input without error feedback

✍ Scribed by R. F. Hadley; V. C. Cardei


Publisher
Elsevier Science
Year
1999
Tongue
English
Weight
313 KB
Volume
12
Category
Article
ISSN
0893-6080

No coin nor oath required. For personal study only.

✦ Synopsis


A connectionist-inspired, parallel processing network is presented which learns, on the basis of (relevantly) sparse input, to assign meaning interpretations to novel test sentences in both active and passive voice. Training and test sentences are generated from a simple recursive grammar, but once trained, the network successfully processes thousands of sentences containing deeply embedded clauses. All training is unsupervised with regard to error feedback; only Hebbian and self-organizing forms of training are employed. In addition, the active-passive distinction is acquired without any supervised provision of cues or flags (in the output layer) that indicate whether the input sentence is in active or passive voice. In more detail: (1) The model learns on the basis of a corpus of about 1000 sentences, while the set of potential test sentences contains over 100 million sentences. (2) The model generalizes its capacity to interpret active and passive sentences to substantially deeper levels of clausal embedding. (3) After training, the model satisfies criteria for strong syntactic and strong semantic systematicity that humans also satisfy. (4) Symbolic message passing occurs within the model's output layer. This symbolic aspect reflects certain prior language acquisition assumptions.
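The training regime described above uses no error signal: weights change only as a function of co-activation. As a loose illustration of that idea (not the paper's actual architecture, whose layers, dimensions, and learning rates are not specified here), a plain Hebbian update can be sketched as follows; the patterns `x` and `y` and the rate `lr` are hypothetical.

```python
import numpy as np

def hebbian_update(W, x, y, lr=0.1):
    """Plain Hebbian rule: strengthen each weight in proportion to the
    co-activation of its input and output units. No error feedback is
    used -- the update depends only on the activations themselves."""
    return W + lr * np.outer(y, x)

# Toy demo: repeatedly pair one hypothetical input pattern with one
# hypothetical output pattern and watch the co-active weights grow.
x = np.array([1.0, 0.0, 1.0])   # input activations
y = np.array([0.0, 1.0])        # output activations
W = np.zeros((2, 3))            # weights, initially zero
for _ in range(10):
    W = hebbian_update(W, x, y)
# Only the weights joining co-active units (rows/cols where both
# x and y are 1) have been strengthened; all others remain zero.
```

Unlike supervised schemes, nothing here compares the output to a target; this is the sense in which training is "unsupervised with regard to error feedback".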