Confident and Consistent Partial Learning of Recursive Functions

Authors: Ziyuan Gao and Frank Stephan

Source: Algorithmic Learning Theory, 23rd International Conference, ALT 2012, Lyon, France, October 29-31, 2012, Proceedings,
Lecture Notes in Artificial Intelligence 7568, pp. 51–65, Springer, 2012.

Abstract. Partial learning is a criterion where the learner outputs one correct conjecture infinitely often while every other hypothesis is issued only finitely often. This paper addresses two variants of partial learning in the setting of inductive inference of functions: first, confident partial learning requires that the learner, even on those functions which it does not learn, singles out exactly one hypothesis which is output infinitely often; second, essentially class consistent partial learning is partial learning with the additional constraint that on the functions to be learnt, almost all hypotheses issued are consistent with all the data seen so far. The results of the present work are that confident partial learning is more general than explanatory learning, incomparable with behaviourally correct learning and closed under union; essentially class consistent partial learning is more general than behaviourally correct learning and incomparable with confident partial learning. Furthermore, it is investigated which oracles permit the learning of all recursive functions under these criteria: for confident partial learning, some non-high oracles are omniscient; for essentially class consistent partial learning, all PA-complete oracles and all oracles of hyperimmune Turing degree are omniscient.
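The two central criteria can be stated formally; the following is a sketch in standard inductive-inference notation (a learner M, a recursive function f, its initial segments f[n], a class S of recursive functions, and the e-th partial recursive function \varphi_e are assumed), recording the definitions as described in the abstract rather than the paper's exact formulation.

% Partial learning: exactly one index is output infinitely often, and it computes f.
\[
  M \text{ partially learns } f \iff
  \exists!\, e\ \bigl[\, M(f[n]) = e \text{ for infinitely many } n \,\bigr]
  \ \wedge\ \varphi_e = f \text{ for this unique } e.
\]
% Confident partial learning: in addition, on every total function -- learnt or not --
% exactly one hypothesis is output infinitely often.
\[
  M \text{ confidently partially learns } S \iff
  M \text{ partially learns every } f \in S
  \ \wedge\
  \forall g\ \exists!\, e\ \bigl[\, M(g[n]) = e \text{ for infinitely many } n \,\bigr].
\]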

©Copyright 2012, Springer
