
Example-based learning particle swarm optimization for continuous optimization. (English) Zbl 1250.90113

Summary: Particle swarm optimization (PSO) is a heuristic optimization technique based on swarm intelligence and inspired by the flocking behavior of birds. The canonical PSO suffers from premature convergence. Several improved PSO variants maintain the diversity of the particles during the search, but at the expense of convergence speed. This paper proposes an example-based learning PSO (ELPSO) that overcomes these shortcomings by keeping a balance between swarm diversity and convergence speed. Inspired by the social phenomenon that several good examples can guide a crowd to make progress, ELPSO uses an example set of multiple global best particles to update the positions of the particles. The particles in the example set are selected from the best particles and are replaced by better particles in first-in-first-out order at each iteration, so the members of the example set are distinct and typically of high quality with respect to the target objective function. ELPSO achieves better diversity and faster convergence than single-gbest and non-gbest PSO algorithms, as supported by both mathematical analysis and numerical results. Finally, computational experiments on benchmark problems show that ELPSO outperforms all of the tested PSO algorithms in terms of both solution quality and convergence time.
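
The following is a minimal sketch of the update scheme described above, not the authors' implementation. Details the summary does not specify are assumed here: the guide position is taken as the mean of the example set, a best-new-particle-vs-worst-example test triggers the FIFO replacement, and all parameter names and values (w, c1, c2, example_size, bounds) are illustrative choices rather than the paper's settings.

import numpy as np

def elpso(f, dim, n_particles=30, example_size=5, iters=200,
          w=0.729, c1=1.49445, c2=1.49445, bounds=(-100.0, 100.0)):
    # Positions, velocities, and personal bests of the swarm.
    lo, hi = bounds
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_f = np.apply_along_axis(f, 1, x)

    # Example set: a FIFO queue of good particles used in place of a single gbest.
    order = np.argsort(pbest_f)[:example_size]
    examples = [pbest[i].copy() for i in order]      # oldest first
    example_f = [float(pbest_f[i]) for i in order]

    for _ in range(iters):
        # Guide each particle by the mean of the example set (an illustrative choice).
        guide = np.mean(examples, axis=0)
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (guide - x)
        x = np.clip(x + v, lo, hi)

        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = fx[improved]

        # FIFO update: if the best new particle beats the worst example,
        # the oldest example is dropped and the new particle is appended.
        best_i = int(np.argmin(fx))
        if fx[best_i] < max(example_f):
            examples.pop(0)
            example_f.pop(0)
            examples.append(x[best_i].copy())
            example_f.append(float(fx[best_i]))

    best = int(np.argmin(pbest_f))
    return pbest[best], pbest_f[best]

# Usage: minimize the 10-dimensional sphere function.
sol, val = elpso(lambda z: float(np.sum(z * z)), dim=10)

The point of the sketch is the structural change: the single gbest term of canonical PSO is replaced by a guide built from several retained good particles, which is how the method keeps diversity while preserving the familiar velocity update.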

MSC:

90C59 Approximation methods and heuristics in mathematical programming
68T05 Learning and adaptive systems in artificial intelligence
68T20 Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.)
