- the total number of examples received until successful learning,
- the total number of positive examples received until successful learning,
- the total number of prediction errors made.

Next, we explain how to use the applet below. The syntax for the input is as follows:

1. Enter the probability (in percent) of creating negative examples for the
target monomial in the appropriate TextField (default value is 50%).
You may choose 0% but *not* 100%,
since nothing can be learned from negative examples alone.

2. Enter the probability (in percent) of creating a '1' in the free entries of
the Boolean example vectors (that is, those positions that, for positive
examples, are not determined by the target monomial) in the appropriate
TextField (default value is 50%). Note that this probability *must*
be *bigger* than 0 and *less* than 100.
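To illustrate how these two probabilities act, here is a minimal sketch (in Python, not the applet's own Java code) of how a positive example might be drawn; the encoding of the target monomial as a dict from variable index to required bit is an assumption made for this sketch:

```python
import random

def draw_positive_example(target, n, p):
    """Draw a Boolean vector that satisfies the target monomial.

    target: hypothetical encoding, a dict {i: bit} with bit = 1 for the
            literal xi and bit = 0 for -xi (1-based variable indices)
    n:      number of variables
    p:      probability of a '1' in the free (undetermined) positions
    """
    return [target[i] if i in target else (1 if random.random() < p else 0)
            for i in range(1, n + 1)]

# For the target x1 -x3 over 5 variables, every draw has position 1
# equal to 1 and position 3 equal to 0; the other entries are free.
vec = draw_positive_example({1: 1, 3: 0}, 5, 0.5)
```

Negative examples would be drawn with the complementary probability entered in step 1; their exact distribution in the applet is not described here, so the sketch covers only the positive case.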

3. Enter the number of variables. This must be a positive integer
bigger than or equal to 1 and less than what your computer can handle.
Also note that if this number becomes *bigger* than 1000, part of the
output is suppressed: instead of seeing the whole history of the learning
that has taken place (as in the case where your input was smaller than
1001), you just get the statistics displayed. It is also not advisable
to activate ``Generate'' as described below, since displaying a target
that has many literals may take a long time.

4. Make your choice between Enter and Generate. If you press Enter, the text field below the Enter button will be activated. Input your target monomial: type xi for the ith positive literal and -xi for the ith negative literal, where i ranges between 1 and the 'number of variables' you have specified. All literals have to be separated by a space. Thus, a typical input may look like: x1 -x5 x33 -x45. The empty input represents the Boolean concept "TRUE", while "FALSE" can be input by entering both xi and -xi for some i.
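The input syntax just described can be checked mechanically. The following Python sketch (hypothetical, not the applet's own code) parses a monomial in this syntax, including the "TRUE" and "FALSE" conventions:

```python
import re

def parse_monomial(text, n):
    """Parse space-separated literals like 'x1 -x5 x33'.

    Returns a dict {i: bit} with bit = 1 for xi and bit = 0 for -xi.
    The empty dict represents "TRUE"; the string 'FALSE' is returned
    when both xi and -xi occur for some i.
    """
    required = {}
    for tok in text.split():
        m = re.fullmatch(r'(-?)x(\d+)', tok)
        if not m:
            raise ValueError("illegal literal: " + tok)
        i = int(m.group(2))
        if not 1 <= i <= n:
            raise ValueError("variable index out of range: " + tok)
        bit = 0 if m.group(1) else 1
        if i in required and required[i] != bit:
            return 'FALSE'          # xi and -xi entered together
        required[i] = bit
    return required
```

For instance, parse_monomial("x1 -x5 x33 -x45", 50) yields {1: 1, 5: 0, 33: 1, 45: 0}, and the empty string yields the empty dict ("TRUE").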

If you press Generate, the applet will randomly generate a target monomial. If you hit ``Generate'' again, a new target monomial will be generated. Assume you have specified the number of variables to be N. Then there are 2*N literals, and every monomial over these literals is equally likely. You can also hit ``Generate'' first and then press Enter; this lets you edit the monomial displayed.
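Under the stated uniform distribution, generating such a target amounts to including each of the 2*N literals independently with probability 1/2. A Python sketch (not the applet's implementation):

```python
import random

def random_monomial(n):
    """Include each of the 2n literals (i, bit) independently with
    probability 1/2, so that all 2**(2n) monomials over these
    literals are equally likely."""
    return {(i, bit) for i in range(1, n + 1) for bit in (0, 1)
            if random.random() < 0.5}
```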

5. Click "Confirm" to start learning the target monomial you entered or generated.

Note that any type of illegal input will not be accepted, but the program may try to recover. You may also try to correct your input; sometimes, however, reloading the applet is the best choice.

Finally, the applet shows you the positive examples it has used to compute its different hypotheses (that is, those positive examples that led to a prediction error). Along with these examples, the intermediate hypotheses until successful learning are shown, except the initial one, since it is always the same (namely x1 -x1 x2 -x2 ... xm -xm).
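The learning process described above can be sketched in Python as follows, based only on the behavior stated here (start with all 2m literals and, on each positive example that causes a prediction error, delete every literal the example contradicts); this is an illustrative rendition of the Wholist idea, not the applet's code:

```python
def wholist_learn(examples, n):
    """examples: sequence of (vector, label) pairs, label 1 for positive
    examples; vectors are 0/1 lists of length n.

    Returns the final hypothesis (a set of literals (i, bit), with
    bit = 1 for xi and bit = 0 for -xi) and the number of prediction
    errors made on positive examples.
    """
    # initial hypothesis: all 2n literals x1 -x1 x2 -x2 ... xn -xn
    hyp = {(i, bit) for i in range(1, n + 1) for bit in (0, 1)}
    errors = 0
    for vec, label in examples:
        predicts_positive = all(vec[i - 1] == bit for (i, bit) in hyp)
        if label == 1 and not predicts_positive:
            errors += 1
            # keep only the literals consistent with the positive example
            hyp = {(i, bit) for (i, bit) in hyp if vec[i - 1] == bit}
    return hyp, errors
```

For the target x1 -x2 over three variables, the positive examples [1,0,0] and [1,0,1] shrink the initial hypothesis to {(1,1), (2,0)} with two prediction errors; negative examples leave the hypothesis unchanged.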

After one cycle (input, learning, display of history) you must hit Reload to start another cycle.

If you would like to run the Wholist algorithm on your own input sequences, please go here.

**Bibliography**

Analyzing the Average-Case Behavior of Conjunctive Learning Algorithms.

Technical Report DOI-TR-153, Department of Informatics, Kyushu University,
Fukuoka, Japan, August 1998.

[Abstract]
[Full paper]

A Complete and Tight Average-Case Analysis of Learning Monomials,
*in* “STACS'99, 16th Annual Symposium on Theoretical Aspects of
Computer Science, Trier, Germany, March 1999, Proceedings,”
(C. Meinel and S. Tison, Eds.),
Lecture Notes in Computer Science 1563, pp. 414-423,
Springer, 1999.

From Learning in the Limit to Stochastic Finite Learning,
*Theoretical Computer Science*, Vol. 364, Issue 1, 2006, pp. 77-97.

(Special Issue Algorithmic Learning Theory (ALT 2003))

This applet has been written by Olaf Trzebin and Thomas Zeugmann. Please report any bug you may find to

Thomas Zeugmann

- "thomas" at "hokudai.ac.jp"

Rüdiger Reischuk and Thomas Zeugmann

Last change: April 14, 2007.