
A comparative study of two modeling approaches in neural networks. (English) Zbl 1082.68099

Summary: Neuron state modeling and local field modeling provide two fundamental approaches to neural network research; a neural network system built on one or the other is referred to as a static neural network model or a local field neural network model, respectively. These two models are compared theoretically in terms of their trajectory transformation property, equilibrium correspondence property, nontrivial attractive manifold property, global convergence, and stability in many different senses. The comparison reveals an important stability invariance property of the two models: the stability (in any sense) of the static model is equivalent to that of a subsystem deduced from the local field model when restricted to a specific manifold. This stability invariance property lays a sound theoretical foundation for the validity of a useful, cross-fertilization-type stability analysis methodology applicable to various neural network models.
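In the notation commonly used for these two model classes (a plausible reconstruction; the paper's own symbols are not reproduced here), the static (neuron-state) model and the local field model can be written as
\[
\dot{x} = -x + g(Wx + \theta), \qquad \dot{u} = -u + W g(u) + \theta,
\]
where \(g\) is the componentwise activation function, \(W\) the connection weight matrix and \(\theta\) the bias vector. Under the substitution \(u = Wx + \theta\), every trajectory \(x(t)\) of the static model is carried to a trajectory \(u(t) = Wx(t) + \theta\) of the local field model, since \(\dot{u} = W\dot{x} = -(Wx) + W g(u) = -u + W g(u) + \theta\); the image lies on the affine set \(\theta + \operatorname{range}(W)\), which is the kind of manifold on which the equivalence of stability properties described above is established.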

MSC:

68T05 Learning and adaptive systems in artificial intelligence
93A30 Mathematical modelling of systems (MSC2010)
93D99 Stability of control systems
