### Extended Stochastic Complexity and Minimax Relative Loss Analysis

**Author:** Kenji Yamanishi

**Source:** *Lecture Notes in Artificial Intelligence*, Vol. 1720,
1999, pp. 26–38.

We are concerned with the problem of sequential prediction using a
given hypothesis class containing continuously many prediction strategies.
An effective performance measure is the minimax relative cumulative loss (RCL):
the minimum over prediction algorithms of the worst-case difference between
the algorithm's cumulative loss and that of the best strategy in the
hypothesis class.
The purpose of this paper is to evaluate the minimax RCL
for general continuous hypothesis classes under general loss functions.
We first derive asymptotic upper and lower bounds on the minimax RCL
and show that they match (*k*/2*c*)ln *m*
to within *o*(ln *m*),
where *k* is the dimension of the parameter space of the hypothesis class,
*m* is the sample size, and *c* is a constant depending on the loss function.
We thereby show that the cumulative loss attaining the minimax RCL
asymptotically coincides with the extended stochastic
complexity (ESC), which is an extension of Rissanen's stochastic
complexity (SC) into the decision-theoretic scenario.
We further derive non-asymptotic upper bounds on the minimax RCL
for both parametric and nonparametric hypothesis classes.
We apply the analysis to the regression problem to derive
the tightest worst-case cumulative loss bounds to date.
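In the standard formulation of this setting (the notation below is a sketch; the paper's exact definitions may differ in detail), the minimax RCL and the asymptotic evaluation stated above can be written as:

```latex
% Minimax relative cumulative loss (RCL): A ranges over prediction
% algorithms, x^m over outcome sequences of length m, and f over the
% hypothesis class H_k with a k-dimensional parameter space.
\[
  R_m^{*}
  \;=\;
  \min_{A} \max_{x^m}
  \Bigl(
    L_A(x^m) \;-\; \min_{f \in \mathcal{H}_k} L_f(x^m)
  \Bigr)
\]
% The matching upper and lower bounds derived in the paper give
\[
  R_m^{*} \;=\; \frac{k}{2c}\,\ln m \;+\; o(\ln m),
\]
% where c is a constant depending on the loss function (e.g., it
% differs between the logarithmic loss and the square loss).
```

The cumulative loss attaining this minimax value is what the paper identifies asymptotically with the extended stochastic complexity (ESC).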

© Copyright 1999 Springer-Verlag