𝔖 Scriptorium
✦   LIBER   ✦

📘

Supervised Sequence Labelling with Recurrent Neural Networks

✍ Scribed by Alex Graves (auth.)


Publisher: Springer-Verlag Berlin Heidelberg
Year: 2012
Tongue: English
Leaves: 159
Series: Studies in Computational Intelligence 385
Edition: 1
Category: Library


✦ Synopsis


Supervised sequence labelling is a vital area of machine learning, encompassing tasks such as speech, handwriting and gesture recognition, protein secondary structure prediction and part-of-speech tagging. Recurrent neural networks are powerful sequence learning tools: robust to input noise and distortion, and able to exploit long-range contextual information, they would seem ideally suited to such problems. However, their role in large-scale sequence labelling systems has so far been auxiliary.

The goal of this book is a complete framework for classifying and transcribing sequential data with recurrent neural networks alone. Three main innovations are introduced in order to realise this goal. Firstly, the connectionist temporal classification (CTC) output layer allows the framework to be trained with unsegmented target sequences, such as phoneme-level speech transcriptions; this is in contrast to previous connectionist approaches, which depended on error-prone prior segmentation. Secondly, multidimensional recurrent neural networks extend the framework naturally to data with more than one spatio-temporal dimension, such as images and videos. Thirdly, hierarchical subsampling makes it feasible to apply the framework to very large or high-resolution sequences, such as raw audio or video.
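The CTC idea sketched above can be made concrete with a minimal NumPy implementation of the forward (alpha) recursion, which sums the probabilities of every frame-level alignment that collapses to a given label sequence. This is an illustrative sketch, not Graves's implementation; the function name and the blank-at-index-0 convention are assumptions made here for the example.

```python
import numpy as np

def ctc_forward_prob(log_probs, labels, blank=0):
    """Log-probability of `labels` under the CTC forward (alpha) recursion,
    summing over every frame-level alignment that collapses to `labels`.

    log_probs : (T, K) array of per-frame log-softmax network outputs
    labels    : target label sequence, without blanks
    """
    T, _ = log_probs.shape
    # Extended sequence with blanks between and around the labels
    ext = [blank]
    for label in labels:
        ext += [label, blank]
    S = len(ext)

    alpha = np.full((T, S), -np.inf)
    alpha[0, 0] = log_probs[0, blank]       # start with a blank ...
    if S > 1:
        alpha[0, 1] = log_probs[0, ext[1]]  # ... or with the first label

    for t in range(1, T):
        for s in range(S):
            terms = [alpha[t - 1, s]]                # stay on the same symbol
            if s > 0:
                terms.append(alpha[t - 1, s - 1])    # advance one position
            # Skipping over a blank is allowed when neighbouring labels differ
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                terms.append(alpha[t - 1, s - 2])
            alpha[t, s] = np.logaddexp.reduce(terms) + log_probs[t, ext[s]]

    if S == 1:
        return alpha[T - 1, 0]
    # Valid alignments end on the last label or on the trailing blank
    return np.logaddexp(alpha[T - 1, S - 1], alpha[T - 1, S - 2])

# Two frames over an alphabet {blank, 'a'} with uniform per-frame outputs:
# the alignments "a-", "-a" and "aa" all collapse to "a", each with
# probability 0.25, so the total CTC probability is 0.75.
uniform = np.log(np.full((2, 2), 0.5))
p = np.exp(ctc_forward_prob(uniform, [1]))
```

Note how no per-frame segmentation of the target is needed: the recursion marginalises over all alignments, which is exactly what lets CTC train on unsegmented transcriptions.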

Experimental validation is provided by state-of-the-art results in speech and handwriting recognition.

✦ Table of Contents


Front Matter....Pages 1-11
Introduction....Pages 1-3
Supervised Sequence Labelling....Pages 5-13
Neural Networks....Pages 15-35
Long Short-Term Memory....Pages 37-45
A Comparison of Network Architectures....Pages 47-56
Hidden Markov Model Hybrids....Pages 57-60
Connectionist Temporal Classification....Pages 61-93
Multidimensional Networks....Pages 95-108
Hierarchical Subsampling Networks....Pages 109-131
Back Matter

✦ Subjects


Computational Intelligence; Artificial Intelligence (incl. Robotics)


📜 SIMILAR VOLUMES



Supervised Learning with Complex-valued
✍ Sundaram Suresh, Narasimhan Sundararajan, Ramasamy Savitha (auth.) 📂 Library 📅 2013 🏛 Springer-Verlag Berlin Heidelberg 🌐 English

Recent advancements in the field of telecommunications, medical imaging and signal processing deal with signals that are inherently time varying, nonlinear and complex-valued. The time varying, nonlinear characteristics of these signals can be effectively analyzed using artificial neural netwo

Learning with recurrent neural networks
✍ Barbara Hammer πŸ“‚ Library πŸ“… 2000 πŸ› Springer 🌐 English

Folding networks, a generalisation of recurrent neural networks to tree structured inputs, are investigated as a mechanism to learn regularities on classical symbolic data, for example. The architecture, the training mechanism, and several applications in different areas are explained. Afterwards a

Learning with recurrent neural networks
✍ Barbara Hammer (auth.) 📂 Library 📅 2000 🏛 Springer-Verlag London 🌐 English

Folding networks, a generalisation of recurrent neural networks to tree structured inputs, are investigated as a mechanism to learn regularities on classical symbolic data, for example. The architecture, the training mechanism, and several applications in different areas are explained. Afterwards


Recurrent Neural Networks with Python Qu
✍ Simeon Kostadinov 📂 Library 📅 2018 🏛 Packt Publishing, Limited 🌐 English

Learn how to develop intelligent applications with sequential learning and apply modern methods for language modeling with neural network architectures for deep learning with Python's most popular TensorFlow framework. Key Features Train and deploy Recurrent Neural Networks using the popular TensorF