Representation Learning for Natural Language Processing
By Zhiyuan Liu, Yankai Lin, Maosong Sun
- Publisher: Springer Singapore
- Year: 2020
- Language: English
- Pages: 349
- Edition: 1st ed.
- Category: Library
Synopsis
This open access book provides an overview of the recent advances in representation learning theory, algorithms and applications for natural language processing (NLP). It is divided into three parts. Part I presents the representation learning techniques for multiple language entries, including words, phrases, sentences and documents. Part II then introduces the representation techniques for those objects that are closely related to NLP, including entity-based world knowledge, sememe-based linguistic knowledge, networks, and cross-modal entries. Lastly, Part III provides open resource tools for representation learning techniques, and discusses the remaining challenges and future research directions.
The theories and algorithms of representation learning presented here can also benefit other related domains such as machine learning, social network analysis, the Semantic Web, information retrieval, data mining, and computational biology. This book is intended for advanced undergraduate and graduate students, post-doctoral fellows, researchers, lecturers, and industrial engineers, as well as anyone interested in representation learning and natural language processing.

Table of Contents
Front Matter ....Pages i-xxiv
Representation Learning and NLP (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 1-11
Word Representation (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 13-41
Compositional Semantics (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 43-57
Sentence Representation (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 59-89
Document Representation (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 91-123
Sememe Knowledge Representation (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 125-161
World Knowledge Representation (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 163-216
Network Representation (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 217-283
Cross-Modal Representation (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 285-317
Resources (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 319-328
Outlook (Zhiyuan Liu, Yankai Lin, Maosong Sun)....Pages 329-334
Subjects
Computer Science; Computational Linguistics; Data Mining and Knowledge Discovery
Similar Volumes
This book provides an overview of the recent advances in representation learning theory, algorithms, and applications for natural language processing (NLP), ranging from word embeddings to pre-trained language models. It is divided into four parts. Part I presents the representation learning…
Transfer Learning for Natural Language Processing gets you up to speed with the relevant ML concepts before diving into the cutting-edge advances that are defining the future of NLP. Building and training deep learning models from scratch is costly, time-consuming, and requires massive amounts of data…
Build custom NLP models in record time by adapting pre-trained machine learning models to solve specialized problems. Summary: In Transfer Learning for Natural Language Processing you will learn: fine-tuning pretrained models with new domain data; picking the right mod…
Humans do a great job of reading text, identifying key ideas, summarizing, making connections, and other tasks that require comprehension and context. Recent advances in deep learning make it possible for computer systems to achieve similar results. Deep Learning for Nat…