Characterizations of Monotonic and Dual Monotonic Language Learning

Authors: Thomas Zeugmann, Steffen Lange and Shyam Kapur

Source: Information and Computation 120(2), 1995, pp. 155-173.

Abstract.

The present paper deals with monotonic and dual monotonic language learning from positive examples as well as from positive and negative examples. The three notions of monotonicity reflect different formalizations of the requirement that the learner produce better and better generalizations as it is fed more and more data about the concept to be learned.
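
For orientation, the three constraints are usually formalized as follows; this is a sketch assuming the standard definitions from the monotonic language learning literature, not a quotation from the paper, and the paper's exact definitions may differ in detail. Here L denotes the target language, h_i and h_j (with i <= j) are hypotheses output by the learner, and L(h) is the language described by hypothesis h.

  % Assumed standard formalizations (not stated in this abstract):
  %   L = target language,  h_i, h_j = hypotheses with i <= j,  L(h) = language of h.
  \begin{align*}
    \text{strong-monotonic:} \quad & L(h_i) \subseteq L(h_j)\\
    \text{monotonic:}        \quad & L(h_i) \cap L \subseteq L(h_j) \cap L\\
    \text{weak-monotonic:}   \quad & L(h_i) \subseteq L(h_j)
      \ \text{provided the data seen up to } h_j \text{ are consistent with } h_i
  \end{align*}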

The three versions of dual monotonicity formalize the requirement that the inference device produce specializations that fit the target language better and better. We characterize strong-monotonic, monotonic, weak-monotonic, dual strong-monotonic, dual monotonic, and monotonic & dual monotonic language learning, as well as finite language learning, from positive data in terms of recursively generable finite sets. These characterizations provide a unifying framework for learning from positive data under the various monotonicity constraints. Moreover, they yield additional insight into the problem of what a natural learning algorithm should look like.
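
Under the same reading, and with \overline{L} denoting the complement of the target language, the dual constraints can be sketched analogously; again, this is an assumed reconstruction of the standard definitions rather than text from the paper.

  % Assumed standard formalizations of the dual constraints (not stated in this abstract).
  \begin{align*}
    \text{dual strong-monotonic:} \quad & L(h_j) \subseteq L(h_i)\\
    \text{dual monotonic:}        \quad & L(h_j) \cap \overline{L} \subseteq L(h_i) \cap \overline{L}\\
    \text{dual weak-monotonic:}   \quad & L(h_j) \subseteq L(h_i)
      \ \text{provided the data seen up to } h_j \text{ are consistent with } h_i
  \end{align*}

Informally, a strong-monotonic learner may only generalize and a dual strong-monotonic learner may only specialize, while the plain and weak variants relax these requirements relative to the target language and to the data seen so far, respectively.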


© 1995, Academic Press, Inc.