

Publisher: Springer Publishing Company
ISSN: 1370-4621
Source: Neural Processing Letters, Vol. 21, Iss. 1, 2005-02, pp. 53-60
Abstract
This paper studies the complete convergence of a class of neural networks with different time scales under the assumption that the activation functions are unsaturated piecewise linear functions. Under this assumption the network can have multiple equilibrium points, so traditional convergence methods, which rely on a unique equilibrium, do not apply. Complete convergence is proved by constructing an energy-like function, and simulations are presented to illustrate the theory.
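
The abstract does not spell out the network equations, so the following is only a minimal sketch of the kind of system the paper discusses: an additive recurrent network in which each unit has its own time constant (giving different time scales) and the activation is the unsaturated piecewise linear function sigma(x) = max(0, x). The weight matrix W, input h, time constants tau, and initial states below are illustrative assumptions, not values from the paper; they are chosen so that the network has several equilibria and trajectories from different starting points settle to different ones.

```python
import numpy as np

def sigma(x):
    """Unsaturated piecewise-linear (linear-threshold) activation."""
    return np.maximum(0.0, x)

def simulate(x0, W, h, tau, dt=1e-3, steps=20000):
    """Euler-integrate  tau_i * dx_i/dt = -x_i + sum_j W_ij * sigma(x_j) + h_i."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * (-x + W @ sigma(x) + h) / tau
        traj.append(x.copy())
    return np.array(traj)

# Illustrative parameters (assumed, not taken from the paper):
# self-excitation below 1 keeps trajectories bounded, strong mutual
# inhibition creates several equilibrium points.
W = np.array([[0.5, -1.0],
              [-1.0, 0.5]])
h = np.array([1.0, 1.0])      # constant external input
tau = np.array([0.1, 1.0])    # two different time scales (fast / slow unit)

# Trajectories from different initial states converge, but possibly to
# different equilibria, which is the multistable behavior the paper addresses.
for x0 in ([2.0, 0.1], [0.1, 2.0], [1.5, 1.4]):
    traj = simulate(x0, W, h, tau)
    print("x0 =", x0, "-> final state ~", np.round(traj[-1], 3))
```

With these assumed parameters the simulation shows the winner-take-all behavior typical of linear-threshold networks with lateral inhibition: each run converges to a constant state, but which equilibrium is reached depends on the initial condition, consistent with complete convergence in the presence of multiple equilibria.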