Incremental Speech Translation (Lecture Notes in Computer Science, 1735)
By Jan W. Amtrup
- Publisher: Springer
- Year: 1999
- Language: English
- Pages: 213
- Category: Library
Synopsis
Human language capabilities are based on mental procedures that are closely linked to the time domain. Listening, understanding, and reacting, on the one hand, as well as planning, formulating, and speaking, on the other, are performed in a highly overlapping manner, thus allowing inter-human communication to proceed in a smooth and fluent way. Although it happens to be the natural mode of human language interaction, incremental processing is still far from becoming a common feature of today's language technology. Instead, it will certainly remain one of the big challenges for research activities in the years to come. Usually considered difficult to a degree that renders it almost intractable for practical purposes, incremental language processing has recently been attracting a steadily growing interest in the spoken language processing community. Its notorious difficulty can be attributed mainly to two reasons:

- Due to the inaccessibility of the right context, global optimization criteria are no longer available. This loss must be compensated for by communicating larger search spaces between system components or by introducing appropriate repair mechanisms. In any case, the complexity of the task can easily grow by an order of magnitude or even more.
- Incrementality is an almost useless feature as long as it remains a local property of individual system components. The advantages of incremental processing can be effective only if all the components of a producer-consumer chain consistently adhere to the same pattern of temporal behavior.
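The producer-consumer chain described above can be sketched as a pipeline of generators, where each stage emits partial results as soon as input fragments arrive rather than waiting for the complete utterance. This is a minimal illustrative sketch only; the stage names are hypothetical and do not reflect the MILC architecture described in the book.

```python
# Minimal sketch of an incremental producer-consumer chain (illustrative;
# component names are hypothetical, not the book's MILC system).
# Every stage consumes partial input and yields partial output immediately,
# so downstream components can start working before the input is complete.

def recognizer(audio_frames):
    """Producer: emits one word hypothesis per incoming audio frame."""
    for frame in audio_frames:
        yield f"word({frame})"

def parser(word_stream):
    """Intermediate stage: extends a partial analysis with each new word."""
    partial = []
    for word in word_stream:
        partial.append(word)
        yield list(partial)  # a partial parse, available early

def translate_incrementally(audio_frames):
    """Final consumer: reacts to each partial parse as it arrives."""
    outputs = []
    for partial_parse in parser(recognizer(audio_frames)):
        # Stand-in for translation work on the partial analysis.
        outputs.append(len(partial_parse))
    return outputs

print(translate_incrementally(["f1", "f2", "f3"]))  # -> [1, 2, 3]
```

The key property, matching the synopsis, is that all three stages share the same temporal pattern: if any single stage buffered its entire input before producing output, the incrementality of the whole chain would be lost.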
Table of Contents
Lecture Notes in Artificial Intelligence
Incremental Speech Translation
Foreword
Preface
Contents
List of Figures
List of Algorithms
List of Tables
Overview
Introduction
1.1 Incremental Natural Language Processing
1.2 Incremental Speech Understanding
1.3 Incremental Architectures and the Architecture of MILC
1.4 Summary
Graph Theory and Natural Language Processing
2.1 General Definitions
2.2 The Use of Word Graphs for Natural Language Processing Systems
2.3 Evaluation of Word Graphs: Size Measures
2.4 Evaluation of Word Graphs: Quality Measures
2.5 Further Operations on Word Graphs
2.5.1 Removing Isolated Silence
2.5.2 Removing Consecutive Silence
2.5.3 Removing All Silence Edges
2.5.4 Merging Mutually Unreachable Vertices
2.6 Hypergraphs
2.6.1 Formal Definition of Hypergraphs
2.6.2 Merging of Hyperedges
2.6.3 Combination of Hyperedges
2.7 Search in Graphs
2.8 Summary
Unification-Based Formalisms for Translation in Natural Language Processing
3.1 Unification-Based Formalisms for Natural Language Processing
3.1.1 Definition of Typed Feature Structures with Appropriateness
3.1.2 Type Lattices
3.1.3 Feature Structures
3.1.4 Functions as Values of Features
3.2 Unification-Based Machine Translation
3.3 Architecture and Implementation of the Formalism
3.3.1 Definition and Implementation of Type Lattices
3.3.2 Definition and Implementation of Feature Structures
3.4 Summary
MILC: Structure and Implementation
4.1 Layered Charts
4.2 Communication Within the Application
4.2.1 Communication Architecture of an Application
4.2.2 Channel Models
4.2.3 Information Service and Synchronization
4.2.4 Termination
4.3 Overview of the Architecture of MILC
4.4 Word Recognition
4.5 Idiom Processing
4.6 Parsing
4.6.1 Derivation of Verbal Complexes
4.6.2 Spontaneous Speech and Word Recognition
4.6.3 Structure and Processing Strategies
4.7 Utterance Integration
4.8 Transfer
4.8.1 Chart-Based Transfer
4.8.2 The Implementation of Transfer for MILC
4.9 Generation
4.10 Visualization
4.11 Extensions
4.11.1 Extension of the Architecture
4.11.2 Anytime Translation
4.12 System Size
4.13 Summary
Experiments and Results
5.1 Hypergraphs
5.2 Translation
5.2.1 Data Material
5.2.2 Linguistic Knowledge Sources
5.2.3 Experiments and System Parameters
5.2.4 Evaluation
5.2.5 Extensions
5.3 Comparison With Non-Incremental Methods
5.4 Summary
Conclusion and Outlook
Bibliography
Glossary
Abbreviations and Acronyms
Symbols
Index