Comparing the Power of Probabilistic Learning and Oracle Identification under Monotonicity Constraints

Author: Léa Meyer
Source: Lecture Notes in Artificial Intelligence, Vol. 1501, 1998, pp. 306–320.

Abstract. In the setting of learning indexed families, probabilistic learning under monotonicity constraints is more powerful than deterministic learning under monotonicity constraints, even if the probability is close to 1, provided the learning machines are restricted to proper or class-preserving hypothesis spaces (cf. [19]). In this paper, we investigate the relation between probabilistic learning and oracle identification under monotonicity constraints. In particular, we deal with the question of how much ``additional information'' provided by oracles is necessary to compensate for the additional power of probabilistic learning.
If the oracle machines have access to the oracle K (the halting problem), then they can
completely compensate for the power of monotonic (conservative) probabilistic machines,
provided the probability p is greater than 2/3 (1/2).
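In a schematic notation, assumed here for readability and not necessarily the paper's own (MON_p and CONSERV_p denote the collections of learning problems monotonically, resp. conservatively, learnable with probability p; MON[K] and CONSERV[K] their deterministic counterparts relative to the oracle K), this compensation result can be summarized as:

```latex
\bigcup_{p > 2/3} \mathrm{MON}_p \;\subseteq\; \mathrm{MON}[K],
\qquad
\bigcup_{p > 1/2} \mathrm{CONSERV}_p \;\subseteq\; \mathrm{CONSERV}[K].
```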
Furthermore, we show that for every recursively enumerable oracle A,
there exists a learning problem which is strong-monotonically learnable by an
oracle machine having access to A, but not conservatively or monotonically
learnable with any probability p > 0.
A similar result holds for Peano-complete oracles.
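Writing SMON[A] for the problems strong-monotonically learnable by an oracle machine with access to A, and MON_p, CONSERV_p for those monotonically, resp. conservatively, learnable with probability p (notation assumed here, not taken from the paper), the separation states that for every recursively enumerable oracle A:

```latex
\mathrm{SMON}[A] \;\setminus\; \bigcup_{p > 0} \bigl( \mathrm{CONSERV}_p \,\cup\, \mathrm{MON}_p \bigr) \;\neq\; \emptyset .
```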
However, probabilistic learning under monotonicity constraints is ``rich''
enough to encode every recursively enumerable set in a characteristic learning
problem, i.e., for every recursively enumerable set A and every
p > 2/3, there exists a learning problem which is monotonically learnable
with probability p and which encodes A.
Finally, we show that these probability bounds are strict, i.e., in the case of monotonic probabilistic learning with probability p = 2/3, conservative probabilistic learning with probability p = 1/2, and strong-monotonic probabilistic learning with probability p = 1/2, the oracle K is not sufficient to compensate for the power of probabilistic learning under monotonicity constraints.

© Copyright 1998 Springer