Abstract
Assessing the reliability of termination conditions for evolutionary algorithms (EAs) is of prime importance. An erroneous or weak stop criterion can negatively affect both the computational effort and the final result. We introduce a statistical framework for assessing whether a termination condition is able to stop an EA at its steady state, the point beyond which its results can no longer be improved. We use a regression model to determine the requirements ensuring that a measure derived from the EA's evolving population is related to the distance to the optimum in decision variable space. Our framework is analyzed across 24 benchmark test functions and two standard termination criteria, based on the fitness value in objective function space and on the population distribution in decision variable space, for the differential evolution (DE) paradigm. Results validate our framework as a powerful tool for determining whether a measure is capable of terminating an EA, and identify the decision variable space distribution as the best suited for accurately terminating DE in real-world applications.
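As a minimal numerical sketch of the idea behind the framework, the snippet below fits the assumed log-linear regression between a hypothetical population-derived measure \(Q_k\) and the distance to the optimum \(Ref_k\) on synthetic data. The exponent 1.2, intercept 0.3, and noise level are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-generation data following an assumed power-law link:
# Ref_k = exp(beta0) * Q_k^beta1 * exp(eps_k), with Gaussian noise eps_k.
Q = np.logspace(0, -4, 50)          # measure decaying as the EA converges
sigma = 0.05                        # assumed model noise level
Ref = np.exp(0.3) * Q**1.2 * np.exp(rng.normal(0.0, sigma, Q.size))

# Log-log linear regression: log(Ref) = beta0 + beta1 * log(Q) + eps.
beta1, beta0 = np.polyfit(np.log(Q), np.log(Ref), 1)
print(round(beta1, 2))  # slope estimate close to the assumed exponent 1.2
```

When the fitted slope \(\beta_1\) is positive and the residual spread \(\sigma\) is small, the measure tracks the distance to the optimum, which is the requirement the framework formalizes.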
Acknowledgments
Work supported by Spanish projects TIN2009-13618, TIN2012-33116, SAF2010-19257 and CSD2007-00018, Fundació La Marató de TV3 (110230) and RecerCaixa 2010ACUP 00378.
Appendix
1.1 Proof of Proposition 1
Proof
The Cauchy condition given by Definition 1 is satisfied if \(\forall \epsilon > 0\), \(\exists k_0\) such that \(\forall k_1,k_2 \ge k_0\), we have that:
\[ |Ref_{k_1}-Ref_{k_2}| \le \epsilon . \]
By the regression model (4), we have that:
for \(\varepsilon _{k_1}\), \(\varepsilon _{k_2}\) following a Gaussian distribution \(N(0,\sigma ^2)\). Given that \(99.7\,\%\) of the values of a Gaussian lie in the interval given by three standard deviations, \([-3 \sigma , 3 \sigma ]\), we have that:
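The three-standard-deviation coverage used in this bound is easy to check empirically (a minimal sketch; the particular value of \(\sigma\) is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.7

# Draw N(0, sigma^2) samples and measure the fraction inside [-3*sigma, 3*sigma].
eps = rng.normal(0.0, sigma, 1_000_000)
coverage = np.mean(np.abs(eps) <= 3 * sigma)
print(round(coverage, 3))  # approximately 0.997
```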
Since the EA has converged to the minimum, \(Ref_k \rightarrow 0\) and, by the linear regression, also \(Q_k \rightarrow 0\). If \(\beta _1>0\), then \(Q_{k_1}^{\beta _1}\) also converges to zero. Therefore, \(T_1\) and \(T_2\) are bounded by:
for \(k_1\), \(k_2\) large enough. Substituting in (16) and taking into account that \(e^{3 \sigma }- e^{-3 \sigma }>0\), we have that:
Therefore \(|Ref_{k_1}-Ref_{k_2}| \le \epsilon \) holds provided that:
Finally, taking logarithms we have that:
The relation between \(\epsilon \) and the model accuracy (\(\sigma \)) proves the Proposition. \(\square \)
We would like to note that Proposition 1 is proven using a chain of inequalities \(\le \), some of which are strict in certain cases; it is therefore a sufficient but not a necessary condition. Any measure satisfying the inequality is Cauchy, but this by no means implies that every Cauchy measure must fulfill it. In this sense, Proposition 1 is a way of verifying that a measure is Cauchy rather than a means of discarding non-Cauchy measures.
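In practice, a Cauchy-style termination rule of this kind can be implemented by monitoring the spread of the measure over a trailing window of generations. The sketch below illustrates this on a synthetic, geometrically converging sequence standing in for \(Ref_k\); the window size, tolerance, and decay rate are hypothetical choices, not the paper's settings:

```python
def cauchy_stop(values, window, eps):
    """Stop when all pairwise gaps within the trailing window are <= eps."""
    if len(values) < window:
        return False
    tail = values[-window:]
    return max(tail) - min(tail) <= eps

# Synthetic monotonically decreasing measure standing in for Ref_k.
refs = [2.0 * 0.8**k for k in range(60)]

# First generation index at which the trailing window is eps-stable.
stop_at = next(k for k in range(len(refs))
               if cauchy_stop(refs[:k + 1], 10, 1e-3))
print(stop_at)
```

Because the criterion is sufficient rather than necessary, a rule like this may stop later than strictly needed, but it will not declare convergence while the monitored measure is still changing by more than \(\epsilon\).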
Cite this article
Gil, D., Roche, D., Borràs, A. et al. Terminating evolutionary algorithms at their steady state. Comput Optim Appl 61, 489–515 (2015). https://doi.org/10.1007/s10589-014-9722-4