
Training feedforward neural networks using hybrid particle swarm optimization and gravitational search algorithm. (English) Zbl 1282.90248

Summary: The Gravitational Search Algorithm (GSA) is a heuristic optimization method based on the law of gravity and mass interactions. The algorithm has a demonstrated ability to locate the global optimum, but its search slows considerably in the final iterations. This work proposes a hybrid of Particle Swarm Optimization (PSO) and GSA, denoted PSOGSA, to resolve that problem. GSA and PSOGSA are employed as new training methods for Feedforward Neural Networks (FNNs) in order to investigate how effectively these algorithms reduce the two main drawbacks of current evolutionary learning algorithms: trapping in local minima and slow convergence. The results are compared with a standard PSO-based learning algorithm for FNNs, and the resulting accuracy of FNNs trained with PSO, GSA, and PSOGSA is also examined. The experimental results show that PSOGSA outperforms both PSO and GSA for training FNNs in terms of convergence speed and avoidance of local minima. The results also show that an FNN trained with PSOGSA achieves better accuracy than one trained with GSA.
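The hybrid described above combines GSA's gravitational acceleration with PSO's pull toward the global best: in the velocity update, the cognitive term of standard PSO is replaced by each agent's GSA acceleration, while the social term toward `gbest` is retained. The following is a minimal, simplified sketch of that update, not the authors' implementation: the full algorithm typically sums forces over a shrinking elite (Kbest) rather than all agents, all constants (`G0`, `c1`, `c2`, inertia schedule) are illustrative assumptions, and a plain sphere function stands in for the mean squared error of an FNN whose weights the agents would encode.

```python
import numpy as np

def psogsa_minimize(loss, dim, n_agents=30, iters=200, seed=0):
    """Hybrid PSOGSA sketch: GSA acceleration replaces PSO's cognitive
    term; the social pull toward the global best is kept.
    All parameter values are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1.0, 1.0, (n_agents, dim))  # agent positions (e.g. FNN weights)
    V = np.zeros_like(X)
    gbest, gbest_f = X[0].copy(), np.inf
    G0, c1, c2, eps = 1.0, 0.5, 1.5, 1e-12
    for t in range(iters):
        f = np.array([loss(x) for x in X])
        if f.min() < gbest_f:
            gbest_f, gbest = float(f.min()), X[f.argmin()].copy()
        # Gravitational masses: the best agent is heaviest, the worst has mass 0.
        worst, best = f.max(), f.min()
        m = (worst - f) / (worst - best + eps)
        M = m / (m.sum() + eps)
        G = G0 * np.exp(-20.0 * t / iters)       # decaying gravitational "constant"
        # Acceleration of each agent from randomly weighted pairwise pulls
        # (simplification: summed over all agents, not a Kbest elite).
        acc = np.zeros_like(X)
        for i in range(n_agents):
            diff = X - X[i]                      # vectors toward every other agent
            R = np.linalg.norm(diff, axis=1) + eps
            acc[i] = (G * (M[:, None] / R[:, None]) * diff
                      * rng.random((n_agents, 1))).sum(axis=0)
        w = 0.9 - 0.5 * t / iters                # linearly decreasing inertia weight
        V = (w * V
             + c1 * rng.random((n_agents, dim)) * acc            # GSA term
             + c2 * rng.random((n_agents, dim)) * (gbest - X))   # PSO social term
        X = X + V
    return gbest, gbest_f
```

For FNN training, `loss` would map a flattened weight vector to the network's MSE on the training set; here a sphere objective, e.g. `psogsa_minimize(lambda x: float((x**2).sum()), dim=5)`, suffices to exercise the update.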

MSC:

90C59 Approximation methods and heuristics in mathematical programming

Software:

BGSA; GSA
