Constraints on learning in dynamic synapses

Authors: Daniel Amit, Stefano Fusi

Publisher: Informa Healthcare

ISSN: 0954-898X

Source: Network: Computation in Neural Systems, Vol. 3, Iss. 4, November 1992, pp. 443-464



Abstract

Hebbian-type learning is discussed in a network whose synapses are analogue, dynamic variables, whose values have to be periodically refreshed owing to the exponential decay, or other instability, of continuous synaptic efficacies. It is shown that the end product of learning in such networks is very sensitive to the relation between the rate of presentation of patterns and the size of the refresh time interval. In the limit of slow presentation, the network can learn at most O(ln N) patterns in N neurons, and must learn each one in one shot, thus also learning all errors present in a corrupt stimulus presented for retrieval. It is then shown that as the rate of presentation is increased, performance improves rapidly. Another option we investigate is one in which the refresh mechanism acts stochastically. In this case the rate of learning can be slowed down very significantly, but the number of stored patterns cannot surpass √N.
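
The abstract compresses the model into a few sentences. As a rough illustration only (not the authors' actual equations; the decay constant, refresh interval, and set of discrete stable levels below are all assumptions), the following Python sketch shows the interplay the abstract describes: analogue Hebbian efficacies decay exponentially between pattern presentations, and a refresh mechanism periodically snaps them to stable values, so slowly presented early patterns fade while recently presented ones survive.

```python
import numpy as np

# Illustrative sketch: Hebbian learning on analogue, decaying synapses with a
# periodic refresh to discrete stable levels.  Parameter values are assumed
# for illustration and are not taken from the paper.

rng = np.random.default_rng(0)

N = 200              # number of neurons
P = 20               # patterns presented one after another
tau = 5.0            # synaptic decay time constant (assumed)
dt = 1.0             # interval between pattern presentations (assumed)
refresh_every = 4    # presentations between refresh events (assumed)
levels = np.linspace(-1.0, 1.0, 5)   # stable efficacy values after refresh

# Random +/-1 patterns standing in for the stimuli.
xi = rng.choice([-1.0, 1.0], size=(P, N))

J = np.zeros((N, N))
for mu in range(P):
    # Analogue efficacies decay exponentially between presentations;
    # slow presentation (large dt / tau) erases earlier patterns.
    J *= np.exp(-dt / tau)
    # One-shot Hebbian imprint of the current pattern.
    J += np.outer(xi[mu], xi[mu]) / N
    np.fill_diagonal(J, 0.0)
    # Periodic refresh: snap each efficacy to the nearest stable level.
    if (mu + 1) % refresh_every == 0:
        J = levels[np.abs(J[:, :, None] - levels).argmin(axis=-1)]

def overlap_after_one_step(J, pattern, flip_frac=0.1):
    """Present a corrupted cue and measure one-step retrieval overlap."""
    cue = pattern.copy()
    cue[rng.random(N) < flip_frac] *= -1.0
    out = np.sign(J @ cue)
    out[out == 0.0] = 1.0
    return float(np.mean(out * pattern))

# Recently presented patterns are retrieved well; early ones fade.
for mu in (0, P // 2, P - 1):
    print(f"pattern {mu:2d}: overlap = {overlap_after_one_step(J, xi[mu]):.2f}")
```

A stochastic variant of the refresh discussed in the abstract could be mimicked in this sketch by applying the snapping step independently to each synapse with some small probability per presentation, rather than to the whole matrix at fixed intervals.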