Vacillatory learning of nearly minimal size grammars
✍ Authors: John Case; Sanjay Jain; Arun Sharma
- Publisher: Elsevier Science
- Year: 1994
- Language: English
- File size: 996 KB
- Volume: 49
- Category: Article
- ISSN: 0022-0000
✦ Synopsis
In Gold's influential language learning paradigm, a learning machine converges in the limit to one correct grammar. In an attempt to generalize Gold's paradigm, Case considered the question of whether people might converge to vacillating between up to (some integer) n > 1 distinct, but equivalent, correct grammars. He showed that larger classes of languages can be algorithmically learned (in the limit) by converging to up to n + 1 rather than up to n correct grammars. He also argued that, for "small" n > 1, it is plausible that people might sometimes converge to vacillating between up to n grammars. The insistence on small n was motivated by the consideration that, for "large" n, at least one of the n grammars would be too large to fit in people's heads. Of course, even in Gold's n = 1 case, the single grammar converged to in the limit may be infeasibly large. An interesting complexity restriction to place, then, on the final grammar(s) converged to in the limit is that they all be of small size. In this paper we study some of the trade-offs in learning power involved in making a well-defined version of this restriction. We show, and exploit as a tool, the desirable property that learning power under our size-restricted criteria (for successful learning) is independent of the choice of underlying acceptable programming system. We characterize the power of our size-restricted criteria and use this characterization to prove that some classes of languages which can be learned by converging in the limit to up to n + 1 nearly minimal size correct grammars cannot be learned by converging to up to n unrestricted grammars, even if these latter grammars are allowed a finite number of anomalies (i.e., mistakes) per grammar. We also show that there is
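To make the contrast concrete, below is a toy Python sketch of the two convergence behaviors the synopsis describes: a Gold-style learner whose conjecture stabilizes on a single grammar, and a vacillatory variant that forever cycles among up to n distinct but equivalent correct grammars. The hypothesis class, the enumeration strategy, and all names here are illustrative assumptions, not constructions from the paper (which works over arbitrary acceptable programming systems, not finite hypothesis lists).

```python
# Toy illustration (not from the paper) of convergence in the limit versus
# vacillatory convergence. "Grammars" are named membership predicates over a
# small universe; the whole hypothesis class below is an assumption made for
# the sake of the example.

from itertools import islice

UNIVERSE = range(20)

# Note that evens_a and evens_b are distinct programs for the SAME language,
# so a learner may legitimately vacillate between them forever.
GRAMMARS = [
    ("evens_a", lambda x: x % 2 == 0),
    ("evens_b", lambda x: x % 2 == 0 and x >= 0),  # equivalent to evens_a
    ("multiples_of_3", lambda x: x % 3 == 0),
]

def consistent(pred, positives):
    """A grammar is consistent with the text seen so far if it generates
    every example presented (positive data only, as in Gold's paradigm)."""
    return all(pred(x) for x in positives)

def gold_learner(text):
    """Identification by enumeration: after each datum, conjecture the first
    consistent grammar. On any text for a language in this class, the
    conjecture converges, in the spirit of TxtEx, to one fixed grammar."""
    seen = set()
    for datum in text:
        seen.add(datum)
        for name, pred in GRAMMARS:
            if consistent(pred, seen):
                yield name
                break

def vacillatory_learner(text, n=2):
    """A variant that keeps up to n consistent grammars and cycles among
    them; it converges only to a finite SET of correct grammars, in the
    spirit of vacillatory (TxtFex_n-style) identification."""
    seen, tick = set(), 0
    for datum in text:
        seen.add(datum)
        ok = [name for name, pred in GRAMMARS if consistent(pred, seen)][:n]
        yield ok[tick % len(ok)]  # ok is nonempty on texts for this class
        tick += 1

if __name__ == "__main__":
    # Present a text (a repeating enumeration) for the language of evens.
    text = [x for x in UNIVERSE if x % 2 == 0] * 3
    print(list(islice(gold_learner(text), 12)))         # settles on "evens_a"
    print(list(islice(vacillatory_learner(text), 12)))  # alternates evens_a/evens_b
```

On any text for a language in this tiny class, the first learner's conjecture stabilizes, while the second converges only to a finite set of correct indices. The paper's size-restricted criteria additionally demand that the final grammars be nearly minimal in size, a constraint this sketch does not model.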