Connectionist Approaches to Language Learning
By David S. Touretzky (auth.), David S. Touretzky (ed.)
- Publisher: Springer US
- Year: 1991
- Language: English
- Pages: 150
- Series: The Springer International Series in Engineering and Computer Science 154
- Edition: 1
- Category: Library
Synopsis
arise automatically as a result of the recursive structure of the task and the continuous nature of the SRN's state space. Elman also introduces a new graphical technique for studying network behavior based on principal components analysis. He shows that sentences with multiple levels of embedding produce state space trajectories with an intriguing self-similar structure. The development and shape of a recurrent network's state space is the subject of Pollack's paper, the most provocative in this collection. Pollack looks more closely at a connectionist network as a continuous dynamical system. He describes a new type of machine learning phenomenon: induction by phase transition. He then shows that under certain conditions, the state space created by these machines can have a fractal or chaotic structure, with a potentially infinite number of states. This is graphically illustrated using a higher-order recurrent network trained to recognize various regular languages over binary strings. Finally, Pollack suggests that it might be possible to exploit the fractal dynamics of these systems to achieve a generative capacity beyond that of finite-state machines.
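The graphical technique the synopsis attributes to Elman can be sketched in a few lines: run a simple recurrent network over a binary string, collect the hidden-state trajectory, and project it onto its top two principal components. The sketch below is illustrative only (random fixed weights, no training; the network size and input string are arbitrary choices, not taken from the book).

```python
import numpy as np

rng = np.random.default_rng(0)
H = 16                               # hidden units (illustrative size)
W_in = rng.normal(0, 1.0, (H, 1))    # input -> hidden weights
W_rec = rng.normal(0, 0.5, (H, H))   # hidden -> hidden (recurrent) weights

def srn_trajectory(bits):
    """Return the sequence of hidden states visited while reading `bits`."""
    h = np.zeros(H)
    states = []
    for b in bits:
        h = np.tanh(W_rec @ h + W_in[:, 0] * b)
        states.append(h.copy())
    return np.array(states)

# Drive the network with a binary string over {0, 1}.
traj = srn_trajectory([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])

# Principal components analysis of the trajectory: center the states,
# then project them onto the top two directions of variance via SVD.
centered = traj - traj.mean(axis=0)
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
projected = centered @ Vt[:2].T      # 2-D view of the state-space path

print(projected.shape)               # one 2-D point per input symbol
```

Plotting `projected` as a connected path gives the kind of state-space trajectory picture the synopsis describes; with a trained network and embedded sentences, Elman's self-similar structure appears in such plots.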
Table of Contents
Front Matter....Pages i-iv
Introduction....Pages 1-3
Learning Automata from Ordered Examples....Pages 5-34
SLUG: A Connectionist Architecture for Inferring the Structure of Finite-State Environments....Pages 35-56
Graded State Machines: The Representation of Temporal Contingencies in Simple Recurrent Networks....Pages 57-89
Distributed Representations, Simple Recurrent Networks, and Grammatical Structure....Pages 91-121
The Induction of Dynamical Recognizers....Pages 123-148
Back Matter....Pages 149-149
Subjects
Artificial Intelligence (incl. Robotics); Statistical Physics, Dynamical Systems and Complexity; Computer Science, general
SIMILAR VOLUMES
Connectionist accounts of language acquisition, processing, and dissolution proliferate despite attacks from some linguists, cognitive scientists, and engineers. Although the networks of exquisitely interconnected perceptrons postulated by PDP theorists may not be anatomically homologous with actual
This book presents a collection of original research articles that showcase the state of the art of research in corpus and computational linguistic approaches to Chinese language teaching, learning and assessment. It offers a comprehensive set of corpus resources and natural language proces
*Learning Languages, Learning Life Skills* offers an autobiographical reflexive approach to foreign language education. The orientation of the book is practical, containing rich descriptions of language learning situations including authentic language use and student stories. Teac