On the Complexity of Training a Single Perceptron with
Programmable Synaptic Delays
Author: Jiří Šíma.
Source: Lecture Notes in Artificial Intelligence, Vol. 2842,
2003, pp. 221-233.
Abstract.
We consider a single perceptron with programmable synaptic
delays, which generalizes a simplified model of a spiking
neuron: not only is the time that a pulse needs to travel
through a synapse taken into account, but the input firing
rates may also take several different levels. A
synchronization technique is introduced so that the results
concerning the learnability of spiking neurons with binary
delays also apply to those with arbitrary delays. In
particular, the consistency problem for perceptrons with
programmable delays, as well as its approximation version,
proves to be NP-hard. It follows that perceptrons with
programmable synaptic delays are not properly PAC-learnable
and that spiking neurons with arbitrary delays do not allow
robust learning unless RP = NP. In addition, we show that
the representation problem for spiking neurons, which asks
whether an n-variable Boolean function given in DNF (or as
a disjunction of O(n) threshold gates) can be computed by a
spiking neuron, is co-NP-hard.
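To make the model concrete, the following is a minimal sketch (not taken from the paper) of a simplified spiking neuron of the kind the abstract refers to: each input either fires a single pulse or stays silent, the pulse reaches the soma after its programmable integer synaptic delay, contributes its weight for one unit of time, and the neuron fires iff the potential reaches the threshold at some moment. The function name and the unit-length rectangular pulse shape are illustrative assumptions.

```python
# Hedged sketch of a simplified spiking neuron with programmable delays.
# x: 0/1 firing indicators, w: synaptic weights, d: integer delays,
# theta: firing threshold. All names are illustrative, not from the paper.

def spiking_neuron_fires(x, w, d, theta):
    """Return True iff the potential reaches theta at some time t."""
    # Candidate times: only pulse-arrival instants can raise the potential.
    times = {d_i for x_i, d_i in zip(x, d) if x_i}
    if not times:
        return theta <= 0
    for t in times:
        # Potential at time t: weights of pulses arriving exactly then
        # (unit-length pulses with integer delays overlap iff delays match).
        potential = sum(w_i for x_i, w_i, d_i in zip(x, w, d)
                        if x_i and d_i == t)
        if potential >= theta:
            return True
    return False

# With equal delays the pulses coincide, recovering an ordinary perceptron:
print(spiking_neuron_fires([1, 1], [1.0, 1.0], [0, 0], 1.5))  # True
# Distinct delays desynchronize the pulses, so the same inputs fail:
print(spiking_neuron_fires([1, 1], [1.0, 1.0], [0, 5], 1.5))  # False
```

The second call illustrates why training the delays matters: the very same weighted inputs can fall above or below the threshold depending on whether their pulses are synchronized.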
©Copyright 2003 Springer