On learning monotone Boolean functions under the uniform distribution

Authors: Kazuyuki Amano and Akira Maruoka

Source: Theoretical Computer Science Vol. 350, Issue 1, January 2005, pp. 3-12
(Special Issue Algorithmic Learning Theory (ALT 2002)).

Abstract. In this paper, we prove two general theorems on monotone Boolean functions that are useful for constructing learning algorithms for monotone Boolean functions under the uniform distribution.

A monotone Boolean function is called fair if it takes the value 1 on exactly half of its inputs. Our first result is that, among all fair monotone functions, a single-variable function f(x) = x_i has the minimum correlation with the majority function. This proves a conjecture of Blum et al. (1998, Proc. 39th FOCS, pp. 408-415) and improves the performance guarantee of the learning algorithm they proposed, which is the best known for monotone Boolean functions under the uniform distribution.
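As an illustration of the quantity involved (not the paper's proof), the correlation of two Boolean functions under the uniform distribution can be computed by brute-force enumeration; the function names below are our own. For n = 3, the dictator f(x) = x_1 is fair and monotone, and one can check its correlation with majority directly:

```python
from itertools import product

def correlation(f, g, n):
    """Pr[f(x) = g(x)] - Pr[f(x) != g(x)] for x uniform over {0,1}^n."""
    total = sum(1 if f(x) == g(x) else -1 for x in product((0, 1), repeat=n))
    return total / 2 ** n

def majority(x):
    """1 iff strictly more than half of the bits are 1 (n odd here)."""
    return 1 if 2 * sum(x) > len(x) else 0

def dictator(x):
    """The single-variable function f(x) = x_1 (fair and monotone)."""
    return x[0]

print(correlation(dictator, majority, 3))  # 0.5
```

For n = 3, the dictator agrees with majority on 6 of the 8 inputs, giving correlation (6 - 2)/8 = 0.5.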

Our second result concerns the relationship between the influences and the average sensitivity of a monotone Boolean function. The influence of a variable x_i on f is defined as the probability that f(x) differs from f(x⊕e_i), where x is chosen uniformly from {0,1}^n and x⊕e_i denotes x with its ith bit flipped. The average sensitivity of f is defined as the sum of the influences over all variables x_i. We prove the somewhat unintuitive result that if the influence of every variable on a monotone Boolean function is small, i.e., O(1/n^c) for some constant c > 0, then the average sensitivity of the function must be large, i.e., Ω(log n). We also discuss how to apply this result to the construction of a new learning algorithm for monotone Boolean functions.
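The definitions of influence and average sensitivity above can be checked by exhaustive enumeration for small n; the following is a minimal sketch with our own helper names, not part of the paper. For the majority function on 3 variables, flipping bit i changes the output exactly when the other two bits disagree, so each influence is 1/2 and the average sensitivity is 3/2:

```python
from itertools import product

def influence(f, i, n):
    """Pr over uniform x in {0,1}^n that f(x) != f(x with bit i flipped)."""
    count = 0
    for x in product((0, 1), repeat=n):
        y = list(x)
        y[i] ^= 1  # flip the ith bit (x XOR e_i)
        if f(x) != f(tuple(y)):
            count += 1
    return count / 2 ** n

def average_sensitivity(f, n):
    """Sum of the influences of all n variables."""
    return sum(influence(f, i, n) for i in range(n))

def majority(x):
    """1 iff strictly more than half of the bits are 1 (n odd here)."""
    return 1 if 2 * sum(x) > len(x) else 0

print([influence(majority, i, 3) for i in range(3)])  # [0.5, 0.5, 0.5]
print(average_sensitivity(majority, 3))               # 1.5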


Keywords: PAC learning; Monotone Boolean functions; Harmonic analysis; Majority function


© 2005 Elsevier Science B.V.