Vacillatory and BC learning on noisy data

Authors: John Case, Sanjay Jain, and Frank Stephan.
Email: sanjay@comp.nus.edu.sg

Source: Theoretical Computer Science, Vol. 241, No. 1-2, 2000, pp. 115-141.

Abstract. The present work employs a model of noise introduced earlier by the third author. In this model the noisy data nonetheless uniquely determine the true data: correct information occurs infinitely often, while incorrect information occurs only finitely often. The present paper considers the effects of this form of noise on vacillatory and behaviorally correct learning of grammars, both from positive data alone and from informant (positive and negative data). For learning from informant, the noise in effect destroys the negative data. Various noisy-data hierarchies are exhibited which, in some cases, are known to collapse when there is no noise. Noisy behaviorally correct learning is shown to obey a very strong "subset principle". It is shown, in many cases, how much power is needed to overcome the effects of noise. For example, the best one can do to simulate, in the presence of noise, the noise-free, no-mind-change case requires infinitely many mind changes. One technical result is proved by a priority argument.
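For concreteness, the following is a sketch of the noise model as described in the abstract, written in our own notation (the paper's exact formalization may differ). It shows how correct items occurring infinitely often and incorrect items occurring only finitely often make the target language recoverable from the noisy data.

  % Sketch of the noise model from the abstract; the notation is ours,
  % not quoted from the paper. L is the target language, chi_L its
  % characteristic function, T an infinite sequence of data items,
  % and I an infinite sequence of labeled pairs.
  \[
    T \text{ is a noisy text for } L \iff
      (\forall x \in L)\,[\,x \text{ occurs infinitely often in } T\,]
      \;\wedge\;
      (\forall x \notin L)\,[\,x \text{ occurs only finitely often in } T\,].
  \]
  \[
    I \text{ is a noisy informant for } L \iff
      (\forall x)\,[\,(x,\chi_L(x)) \text{ occurs infinitely often in } I
      \;\wedge\; (x,1-\chi_L(x)) \text{ occurs only finitely often in } I\,].
  \]

Under these definitions the true language L is uniquely determined by T (respectively I), since the correct items are exactly those occurring infinitely often.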

© Copyright 2000 Elsevier Science