Monotone Conditional Complexity Bounds on Future Prediction Errors

Authors: Alexey Chernov and Marcus Hutter

Source: Algorithmic Learning Theory, 16th International Conference, ALT 2005, Singapore, October 2005, Proceedings (Sanjay Jain, Hans Ulrich Simon, and Etsuji Tomita, Eds.), Lecture Notes in Artificial Intelligence 3734, pp. 414-428, Springer, 2005.

Abstract. We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution μ by the algorithmic complexity of μ. Here we assume that we are at a time t>1 and have already observed x = x_1...x_t. We bound the future prediction performance on x_{t+1} x_{t+2} ... by a new variant of algorithmic complexity of μ given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition in the sense that this complexity can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
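To fix intuitions, the two kinds of bounds can be sketched as follows. The notation is schematic rather than the paper's exact statement: Km(μ|x) stands here for the new monotone conditional complexity and d(x|μ) for the randomness deficiency of x, and constant factors are suppressed. Solomonoff's classical result bounds the total (from time 1) expected squared prediction error of M, commonly stated for binary alphabets as

\sum_{t=1}^{\infty} \mathbf{E}\bigl[\bigl(M(1\mid x_{<t}) - \mu(1\mid x_{<t})\bigr)^2\bigr] \;\le\; \tfrac{\ln 2}{2}\, K(\mu) \;<\; \infty,

whereas the bound studied here concerns only the future errors after a given observation x and has the shape

\sum_{s>t} \mathbf{E}\bigl[\bigl(M(1\mid x_{<s}) - \mu(1\mid x_{<s})\bigr)^2 \,\bigm|\, x_1...x_t = x\bigr] \;\lesssim\; \mathrm{Km}(\mu\mid x) \;+\; K\bigl(d(x\mid\mu)\bigr),

where, per the abstract, the new complexity is monotone in its condition: Km(μ | xy) ≤ Km(μ | x) for any prolongation xy of x.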


*This work was supported by SNF grants 200020-107590/1 (to Jürgen Schmidhuber), 2100-67712 and 200020-107616.
©Copyright 2005, Springer