Deep Learning and Linguistic Representation
by Shalom Lappin
- Publisher: Chapman and Hall/CRC
- Year: 2021
- Language: English
- Pages: 162
- Edition: 1
- Category: Library
Synopsis
The application of deep learning methods to natural language processing has generated significant progress across a wide range of tasks. For some of these applications, deep learning models now approach or surpass human performance. While the success of this approach has transformed the engineering methods of machine learning in artificial intelligence, the significance of these achievements for the modelling of human learning and representation remains unclear.
Deep Learning and Linguistic Representation looks at the application of a variety of deep learning systems to several cognitively interesting NLP tasks. It also considers the extent to which this work illuminates our understanding of the way in which humans acquire and represent linguistic knowledge.
Key Features:
- Combines an introduction to deep learning in AI and NLP with current research on deep neural networks in computational linguistics.
- Is self-contained and suitable for teaching in computer science, AI, and cognitive science courses; it does not assume extensive technical training in these areas.
- Provides a compact guide to work on state-of-the-art systems that are producing a revolution across a range of difficult natural language tasks.
Table of Contents
Cover
Half Title
Series Page
Title Page
Copyright Page
Dedication
Contents
Preface
CHAPTER 1: Introduction: Deep Learning in Natural Language Processing
1.1. OUTLINE OF THE BOOK
1.2. FROM ENGINEERING TO COGNITIVE SCIENCE
1.3. ELEMENTS OF DEEP LEARNING
1.4. TYPES OF DEEP NEURAL NETWORKS
1.5. AN EXAMPLE APPLICATION
1.6. SUMMARY AND CONCLUSIONS
CHAPTER 2: Learning Syntactic Structure with Deep Neural Networks
2.1. SUBJECT-VERB AGREEMENT
2.2. ARCHITECTURE AND EXPERIMENTS
2.3. HIERARCHICAL STRUCTURE
2.4. TREE DNNS
2.5. SUMMARY AND CONCLUSIONS
CHAPTER 3: Machine Learning and the Sentence Acceptability Task
3.1. GRADIENCE IN SENTENCE ACCEPTABILITY
3.2. PREDICTING ACCEPTABILITY WITH MACHINE LEARNING MODELS
3.3. ADDING TAGS AND TREES
3.4. SUMMARY AND CONCLUSIONS
CHAPTER 4: Predicting Human Acceptability Judgements in Context
4.1. ACCEPTABILITY JUDGEMENTS IN CONTEXT
4.2. TWO SETS OF EXPERIMENTS
4.3. THE COMPRESSION EFFECT AND DISCOURSE COHERENCE
4.4. PREDICTING ACCEPTABILITY WITH DIFFERENT DNN MODELS
4.5. SUMMARY AND CONCLUSIONS
CHAPTER 5: Cognitively Viable Computational Models of Linguistic Knowledge
5.1. HOW USEFUL ARE LINGUISTIC THEORIES FOR NLP APPLICATIONS?
5.2. MACHINE LEARNING MODELS VS FORMAL GRAMMAR
5.3. EXPLAINING LANGUAGE ACQUISITION
5.4. DEEP LEARNING AND DISTRIBUTIONAL SEMANTICS
5.5. SUMMARY AND CONCLUSIONS
CHAPTER 6: Conclusions and Future Work
6.1. REPRESENTING SYNTACTIC AND SEMANTIC KNOWLEDGE
6.2. DOMAIN-SPECIFIC LEARNING BIASES AND LANGUAGE ACQUISITION
6.3. DIRECTIONS FOR FUTURE WORK
References
Author Index
Subject Index
SIMILAR VOLUMES
Deep Learning for EEG-Based Brain–Computer Interfaces is an exciting book that describes how emerging deep learning improves the future development of Brain–Computer Interfaces (BCI) in terms of representations, algorithms and applications. BCI bridges humanity's neural world and the physical world…
Springer, 2003, 240 pp. The development of high-performance computers and the corresponding advances in global communications have led to an explosion in data collection, transmission and storage. Large-scale multidimensional databases are being generated to…
This book is nominally about linguistic representation. But, since it is we who do the representing, it is also about us. And, since it is the universe which we represent, it is also about the universe. In the end, then, this book is about everything, which, since it is a philosophy book, is as i…
This book addresses some issues of theorization in linguistics having to do with the systems of representation used in linguistics and the relation between linguistics and cognition. The essays gathered in the first part question the very concept of metalanguage, comparing the metalanguage used in f…