A Note on Batch and Incremental Learnability
by Arun Sharma
- Book ID
- 102585848
- Publisher
- Elsevier Science
- Year
- 1998
- Language
- English
- File size
- 231 KB
- Volume
- 56
- Category
- Article
- ISSN
- 0022-0000
Synopsis
According to Gold's criterion of identification in the limit, a learner, presented with data about a concept, is allowed to make a finite number of incorrect hypotheses before converging to a correct hypothesis. If, on the other hand, the learner is allowed to make only one conjecture, which must be correct, the resulting criterion of success is known as finite identification. Identification in the limit may be viewed as an idealized model for incremental learning, whereas finite identification may be viewed as an idealized model for batch learning. The present paper establishes the surprising fact that the collections of recursively enumerable languages that can be finitely identified (batch learned in the ideal case) from both positive and negative data can also be identified in the limit (incrementally learned in the ideal case) from only positive data.
It is often difficult to extract insights about practical learning systems from abstract theorems in inductive inference. However, this result may be seen as carrying a moral for the design of learning systems, as it yields, in the ideal case of no inaccuracies, an algorithm for converting batch systems that learn from both positive and negative data into incremental systems that learn from only positive data without any loss in learning power. This is achieved by the incremental system simulating the batch system in an incremental fashion and using the heuristic of a "localized closed-world assumption" to generate negative data.
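The conversion described above can be illustrated with a minimal sketch. Everything here is an assumption for illustration, not the paper's construction: the toy concept class (languages of multiples of k), the consistency-based batch learner, and all function names are invented, and the localized closed-world assumption is rendered simply as "every number up to the largest positive example seen, and not yet seen as positive, is tentatively labeled negative."

```python
# Toy concept class (an illustrative assumption; the paper treats
# arbitrary classes of recursively enumerable languages):
# L_k = the multiples of k, for k in 1..5.
CLASS = {k: (lambda k: lambda x: x % k == 0)(k) for k in range(1, 6)}

def batch_learner(labeled):
    """Toy batch learner: given a sample of (number, is_member) pairs,
    return the unique consistent k, or None while ambiguity remains."""
    candidates = [k for k, member in CLASS.items()
                  if all(member(x) == lbl for x, lbl in labeled)]
    return candidates[0] if len(candidates) == 1 else None

def incremental_from_batch(batch_learner, positive_stream):
    """Simulate a batch learner incrementally on positive data only.
    Negative data is generated by a localized closed-world assumption:
    every number up to the current bound that has not appeared as
    positive data is tentatively labeled negative. Early conjectures
    may be wrong; they are revised as more positive data arrives."""
    seen = set()
    for x in positive_stream:
        seen.add(x)
        bound = max(seen)
        labeled = [(y, y in seen) for y in range(bound + 1)]
        yield batch_learner(labeled)

# A positive-data presentation of L_3 (multiples of 3).
stream = [0, 3, 6, 9, 12]
conjectures = list(incremental_from_batch(batch_learner, stream))
# The sequence of conjectures stabilizes on k = 3 once the sample
# rules out every other language in the class.
```

Note that the closed-world labels can be temporarily wrong (e.g. if a small positive example has not yet appeared), which is exactly why the simulation yields identification in the limit rather than finite identification: the incremental system is permitted to revise its conjecture.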