Learning One-Variable Pattern Languages in Linear Average Time

Authors: Rüdiger Reischuk and Thomas Zeugmann

Source: Proc. 11th Annual Conference on Computational Learning Theory (COLT'98), July 24-26, Madison, pp. 198-208, ACM Press, 1998.

Abstract. A new algorithm for learning one-variable pattern languages is proposed and analyzed with respect to its average-case behavior. We consider the total learning time, which takes into account all operations until an algorithm has converged to a correct hypothesis. For the expectation it is shown that, for almost all meaningful distributions defining how the pattern variable is replaced by a string to generate random examples of the target pattern language, this algorithm converges within a constant number of rounds, with a total learning time that is linear in the pattern length. Thus, the algorithm is average-case optimal in a strong sense.
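To make the setting concrete, the following minimal sketch (not the authors' algorithm, and the method name `generate` is hypothetical) shows how a one-variable pattern produces example strings: every occurrence of the variable, written here as the character 'x', is replaced by the same non-empty string.

```java
public class OneVarPattern {
    // Hypothetical helper for illustration: substitute the string s for
    // every occurrence of the variable 'x' in the pattern. All other
    // characters are treated as constants and copied unchanged.
    static String generate(String pattern, String s) {
        StringBuilder out = new StringBuilder();
        for (char c : pattern.toCharArray()) {
            if (c == 'x') {
                out.append(s);   // both occurrences of x get the same s
            } else {
                out.append(c);   // constant symbol of the pattern
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        // The pattern "axbxa" with x -> "01" yields the example "a01b01a".
        System.out.println(generate("axbxa", "01"));
    }
}
```

The language of the pattern is the set of all strings obtainable this way; a learner receives such randomly generated examples and must converge to the pattern itself.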

Though one-variable pattern languages cannot be inferred finitely, our approach can also be considered as probabilistic finite learning with high confidence.

The algorithms presented in this paper have been implemented in Java. You may try them out using our Average Case Optimal 1-Variable Pattern Language Learning page, which is also mirrored in Japan.
©Copyright 1998, ACM