Representation Learning for Natural Language Processing

โœ Scribed by Zhiyuan Liu, Yankai Lin, Maosong Sun


Publisher
Springer Singapore
Year
2020
Tongue
English
Leaves
349
Edition
1st ed.
Category
Library

⬇  Acquire This Volume

No coin nor oath required. For personal study only.

✦ Synopsis


This open access book provides an overview of recent advances in representation learning theory, algorithms, and applications for natural language processing (NLP). It is divided into three parts. Part I presents representation learning techniques for multiple levels of language entries, including words, phrases, sentences, and documents. Part II then introduces representation techniques for objects closely related to NLP, including entity-based world knowledge, sememe-based linguistic knowledge, networks, and cross-modal entries. Lastly, Part III provides open-resource tools for representation learning techniques and discusses the remaining challenges and future research directions.

The theories and algorithms of representation learning presented here can also benefit other related domains such as machine learning, social network analysis, the Semantic Web, information retrieval, data mining, and computational biology. This book is intended for advanced undergraduate and graduate students, post-doctoral fellows, researchers, lecturers, and industrial engineers, as well as anyone interested in representation learning and natural language processing.

✦ Table of Contents


Front Matter (pp. i-xxiv)
Representation Learning and NLP (pp. 1-11)
Word Representation (pp. 13-41)
Compositional Semantics (pp. 43-57)
Sentence Representation (pp. 59-89)
Document Representation (pp. 91-123)
Sememe Knowledge Representation (pp. 125-161)
World Knowledge Representation (pp. 163-216)
Network Representation (pp. 217-283)
Cross-Modal Representation (pp. 285-317)
Resources (pp. 319-328)
Outlook (pp. 329-334)

All chapters are by Zhiyuan Liu, Yankai Lin, and Maosong Sun.

✦ Subjects


Computer Science; Computational Linguistics; Data Mining and Knowledge Discovery


📜 SIMILAR VOLUMES


Representation Learning for Natural Language Processing
✍ Zhiyuan Liu, Yankai Lin, Maosong Sun 📂 Library 📅 2023 🏛 Springer 🌐 English

This book provides an overview of the recent advances in representation learning theory, algorithms, and applications for natural language processing (NLP), ranging from word embeddings to pre-trained language models. It is divided into four parts. Part I presents the representation learning techniques…

Representation Learning for Natural Language Processing
✍ Zhiyuan Liu (editor), Yankai Lin (editor), Maosong Sun (editor) 📂 Library 📅 2023 🏛 Springer 🌐 English

This book provides an overview of the recent advances in representation learning theory, algorithms, and applications for natural language processing (NLP), ranging from word embeddings to pre-trained language models. It is divided into four parts. Part I presents the representation learning…

Transfer Learning for Natural Language Processing
✍ Paul Azunre 📂 Library 📅 2021 🏛 Manning Publications 🌐 English

Transfer Learning for Natural Language Processing gets you up to speed with the relevant ML concepts before diving into the cutting-edge advances that are defining the future of NLP. Building and training deep learning models from scratch is costly, time-consuming, and requires massive amounts of data…

Transfer Learning for Natural Language Processing
✍ Paul Azunre 📂 Library 📅 2021 🏛 Manning 🌐 English

Build custom NLP models in record time by adapting pre-trained machine learning models to solve specialized problems. Summary: In Transfer Learning for Natural Language Processing you will learn: fine-tuning pretrained models with new domain data; picking the right mod…

Deep Learning for Natural Language Processing
✍ Stephan Raaijmakers 📂 Library 📅 2022 🏛 Manning Publications 🌐 English

Humans do a great job of reading text, identifying key ideas, summarizing, making connections, and other tasks that require comprehension and context. Recent advances in deep learning make it possible for computer systems to achieve similar results. Deep Learning for Nat…