Connectionist Approaches to Natural Language Processing
Edited by Reilly, Ronan G.; Sharkey, Noel E.
- Publisher
- Routledge
- Year
- 2017
- Language
- English
- Pages
- 489
- Series
- Psychology Library Editions: Cognitive Science.
- Category
- Library
Table of Contents
Pt. I. Semantics
Pt. II. Syntax
Pt. III. Representational adequacy
Pt. IV. Computational psycholinguistics
Cover
Half Title
Title Page
Copyright Page
Original Title Page
Original Copyright Page
Dedication
Table of Contents
List of Contributors
Preface
1 Connectionist Natural Language Processing
Introduction
Overview of chapters
Acknowledgements
References
PART I SEMANTICS
Introduction
2 Distributed Symbol Discovery through Symbol Recirculation: Toward Natural Language Processing in Distributed Connectionist Networks
Introduction
Natural language processing: Constraints from the task domain
Dynamic vs. static symbol representations
Symbol recirculation
Encoding semantic networks in DUAL: A distributed connectionist architecture
Other symbol recirculation methods
Open problems
Variable binding research and symbol formation
Summary and conclusions
Acknowledgements
References
3 Representing Meaning Using Microfeatures
Introduction
Microfeature representations in PARROT
Implementation of the microfeature concept within PARROT
Examples
Discussion and next steps
Acknowledgements
References
Appendix I: Outline of the PARROT system
Appendix II: Example entries from the lexicon
4 Noun Phrase Analysis with Connectionist Networks
Introduction
The domain
Learning level: Learning semantic prepositional relationships
Integration level: Integration of semantic and syntactic constraints
A case study for the disambiguation of noun phrases
Discussion
Conclusion
Acknowledgements
References
5 Parallel Constraint Satisfaction as a Comprehension Mechanism
Introduction
Sentence comprehension
Story comprehension
Conclusions
Acknowledgements
References
Appendix I: Input and output representations
PART II SYNTAX
Introduction
References
6 Self-correcting Connectionist Parsing
Introduction: Constrained chaos
Agreement
Counting
Constituent motion
Missing constituents
Conclusions
Acknowledgements
References
7 A Net-linguistic "Earley" Parser
Introduction
The basic characteristics of the parser
The representation of parse-information
The Earley parse-list algorithm
Our approach
References
Appendix
PART III REPRESENTATIONAL ADEQUACY
Introduction
Reference
8 The Demons and the Beast - Modular and Nodular Kinds of Knowledge
Introduction and summary
Structure and habits - The knowledge and the power
A model that learns some morphology
Evidence for nodes unseen - some models that learn to read aloud
The study of statistically available information
Conclusion - Behavioural strategies and mental structures
Acknowledgements
References
9 Representational Adequacy and the Case for a Hybrid Connectionist/Marker-parsing Model
Introduction
Representational adequacy
Autonomous semantic networks
ASNs and representational adequacy
Discussion
Conclusion
Acknowledgements
References
10 A Step Toward Sub-symbolic Language Models without Linguistic Representations
Introduction
Basic observations about language
An implementation
Subjects
Natural language processing (Computer science); Computational linguistics; Psycholinguistics; COMPUTERS -- General