Trading Monotonicity Demands versus Mind Changes

Authors: Steffen Lange and Thomas Zeugmann

Source: ``Proc. Computational Learning Theory, Second European Conference, EuroCOLT'95,'' Barcelona, Spain, March 1995 (P. Vitányi, Ed.), Lecture Notes in Artificial Intelligence 904, pp. 125-139, Springer-Verlag 1995.

Abstract. The present paper deals with the learnability of indexed families of uniformly recursive languages from positive data. We study the influence of three monotonicity demands on the efficiency of the learning process, where efficiency is measured by the number of mind changes a learning algorithm is allowed to perform. The three notions of monotonicity reflect different formalizations of the requirement that the learner produce better and better generalizations when fed more and more data on the target concept.
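For orientation, the three demands can be stated as in the authors' related work on monotonic language learning; this restatement is a reader's aid, with $L$ the target language and $L_{h_i}$ the language described by the learner's $i$-th hypothesis:

  strong-monotonic:  $L_{h_i} \subseteq L_{h_{i+1}}$ for all consecutive hypotheses;
  monotonic:         $L_{h_i} \cap L \subseteq L_{h_{i+1}} \cap L$;
  weak-monotonic:    $L_{h_i} \subseteq L_{h_{i+1}}$ whenever the data on which $h_{i+1}$ is output are still consistent with $h_i$.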

We distinguish between exact learnability ($\mathcal{L}$ has to be inferred with respect to $\mathcal{L}$ itself), class preserving learning ($\mathcal{L}$ has to be inferred with respect to some suitably chosen enumeration of all the languages from $\mathcal{L}$), and class comprising inference ($\mathcal{L}$ has to be learned with respect to some suitably chosen enumeration of uniformly recursive languages containing at least all the languages from $\mathcal{L}$), where $\mathcal{L}$ denotes the indexed family to be learned.
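To make the mind-change measure concrete, the following minimal Python sketch (not the authors' construction) shows a learner that exactly identifies a hypothetical indexed family, i.e., conjectures indices from the family's own enumeration, while counting its mind changes. The family $L_j = \{0, \ldots, j\}$, the learner, and the data presentation are all illustrative assumptions.

from typing import Iterable, List, Set

# Hypothetical indexed family: L_j = {0, 1, ..., j}.
def language(j: int) -> Set[int]:
    return set(range(j + 1))

def learner(text: Iterable[int]) -> List[int]:
    """Consistent learner: after each datum, conjecture the least index j
    such that all data seen so far are contained in L_j.  Returns the
    sequence of distinct consecutive hypotheses; len(result) - 1 is the
    number of mind changes."""
    seen: Set[int] = set()
    hypotheses: List[int] = []
    for x in text:
        seen.add(x)
        j = max(seen)  # least covering index, for this particular family
        if not hypotheses or hypotheses[-1] != j:
            hypotheses.append(j)  # every append after the first is a mind change
    return hypotheses

# A positive presentation of the target L_3 = {0, 1, 2, 3}.
hyps = learner([1, 0, 3, 2, 3])
print("hypotheses:", hyps, "mind changes:", len(hyps) - 1)
print("final language:", sorted(language(hyps[-1])))

On this presentation the learner changes its mind once (from index 1 to index 3); since its conjectured languages only grow, it is even strong-monotonic on this toy family.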

In particular, we prove that a relaxation of the relevant monotonicity requirement may result in an arbitrarily large speed-up with respect to the number of mind changes needed.


©Copyright 1995 Springer-Verlag