Benchmarking Parallel Evolutionary Algorithms

Sima Noghanian1, Abas Sabouni2, Travis Desell3 and Ali Ashtari4



(1)
Department of Electrical Engineering, University of North Dakota, Grand Forks, ND, USA

(2)
Department of Electrical Engineering, Wilkes University, Wilkes-Barre, PA, USA

(3)
Department of Computer Science, University of North Dakota, Grand Forks, ND, USA

(4)
Invenia Technical Computing, Winnipeg, MB, Canada

 



Abstract

This chapter presents a simulation framework used to examine the effectiveness of various asynchronous optimization methods on simulated distributed computing environments. Four benchmark functions were used to evaluate asynchronous versions of differential evolution, genetic algorithms, and particle swarm optimization. On simulated large-scale homogeneous and heterogeneous computing environments, asynchronous optimization is shown to have superior scalability and performance compared to synchronous implementations, scaling to potentially millions of processors.



8.1 Simulating Asynchronous Optimization


To examine the scalability and performance of the asynchronous optimization methods used for FDTD more comprehensively, and on hardware far larger than what was available, a simulation of these systems was developed. Further, because calculating the objective function for the FDTD application is extremely computationally expensive, it is not feasible to use the FDTD objective for large-scale studies. However, there are many computationally inexpensive test functions with challenging search spaces and multiple local minima, for example the Ackley, Griewank, Rastrigin, and Rosenbrock functions found in related works [13].
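To make the simulation idea concrete, the sketch below shows one way such an event-driven simulation can be structured. It is a minimal illustration, not the framework used in this chapter: the function name `simulate_async`, the exponentially distributed evaluation times, and the purely random candidate generation are all assumptions made for the example.

```python
import heapq
import random


def simulate_async(objective, n_params, n_workers, max_evals,
                   bounds=(-100.0, 100.0), seed=0):
    """Event-driven simulation of asynchronous optimization.

    Each simulated worker draws a random evaluation time; whenever a
    worker finishes, its result is reported and it immediately receives
    a new candidate, so no worker ever waits on a synchronized
    generation barrier.
    """
    rng = random.Random(seed)
    lo, hi = bounds

    def random_candidate():
        # Stand-in for a real mutation/recombination operator.
        return [rng.uniform(lo, hi) for _ in range(n_params)]

    # Priority queue of (finish_time, worker_id, candidate) events.
    events = []
    for worker in range(n_workers):
        heapq.heappush(events, (rng.expovariate(1.0), worker, random_candidate()))

    best_x, best_f, evals = None, float("inf"), 0
    while evals < max_evals:
        now, worker, x = heapq.heappop(events)
        f = objective(x)
        evals += 1
        if f < best_f:
            best_x, best_f = x, f
        # Hand the worker its next candidate immediately (asynchrony);
        # a heterogeneous environment would use per-worker time distributions.
        heapq.heappush(events, (now + rng.expovariate(1.0), worker,
                                random_candidate()))

    return best_x, best_f
```

Replacing the random candidate generation with the variation operators of differential evolution, genetic search, or particle swarm optimization yields asynchronous versions of those optimizers, and heterogeneity can be modeled by giving each simulated worker its own evaluation-time distribution.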

The Sphere test function was also used as an easy-to-solve, well-formed optimization problem with a single minimum. The optimization test functions used are as follows (reference implementations are sketched after the list):





  • The Sphere function is a simple test function. It has a single minimum with fitness 0 when all input parameters are 0 (see Fig. 8.1). The range of parameters used for this optimization problem was from −100 to 100.


    
$$\displaystyle{ f_{\mathrm{sphere}}(\mathbf{x}) = \sum_{i=1}^{N} \mathbf{x}_{i}^{2} }$$

    (8.1)



    
$$\displaystyle{ f_{\mathrm{sphere}}(0,\ldots,0) = 0 }$$

    (8.2)




    Fig. 8.1
    The Sphere test function with two input parameters (x, y). This is a simple test function with a single minimum at 0,0




    Fig. 8.2
    The Ackley test function with two input parameters (x, y). This test function has many local minima and a single global minimum at 0,0. Unlike the Griewank and Rosenbrock functions, it converges sharply to the global minimum with a flat external surface




    Fig. 8.3
    The Griewank test function with two input parameters (x, y). This test function has many local minima and a single global minimum at 0,0. This has a shallow curvature with less pronounced peaks and valleys than the Rosenbrock function


  • The Ackley function is a more challenging test function with many local minima. There is a single global minimum with fitness 0 when all input parameters are 0 (see Fig. 8.2). It becomes flat near the edges of the problem space and converges sharply to the global minimum. The range of parameters used for this optimization problem was from −32 to 32.


    
$$\displaystyle{ f_{\mathrm{ackley}}(\mathbf{x}) = 20 + e - 20e^{-0.2\sqrt{\frac{1}{N}\sum_{i=1}^{N}\mathbf{x}_{i}^{2}}} - e^{\frac{1}{N}\sum_{i=1}^{N}\cos(2\pi \mathbf{x}_{i})} }$$

    (8.3)



    
$$\displaystyle{ f_{\mathrm{ackley}}(0,\ldots,0) = 0 }$$

    (8.4)


  • The Griewank function is another challenging test function with many local minima. It has a shallow curvature and a single global minimum with fitness 0 when all input parameters are 0 (see Fig. 8.3). The range of parameters used for this optimization problem was from −600 to 600.


    
$$\displaystyle{ f_{\mathrm{griewank}}(\mathbf{x}) = \frac{1}{4000}\sum_{i=1}^{N}\mathbf{x}_{i}^{2} - \prod_{i=1}^{N}\cos\left(\frac{\mathbf{x}_{i}}{\sqrt{i}}\right) + 1 }$$

    (8.5)



    
$$\displaystyle{ f_{\mathrm{griewank}}(0,\ldots,0) = 0 }$$

    (8.6)


  • The Rastrigin function is similar to the Griewank function; however, it has a steeper curvature with more pronounced peaks and valleys. The global minimum has a fitness of 0 when all input parameters are 0 (see Fig. 8.4). The range of parameters used for this optimization problem was from −2π to 2π.


    
$$\displaystyle{ f_{\mathrm{rastrigin}}(\mathbf{x}) = \sum_{i=1}^{N}[\mathbf{x}_{i}^{2} - 10\cos(2\pi \mathbf{x}_{i}) + 10] }$$

    (8.7)



    
$$\displaystyle{ f_{\mathrm{rastrigin}}(0,\ldots,0) = 0 }$$

    (8.8)


  • The Rosenbrock function has no local minima; however, its single minimum lies at the bottom of a very flat valley, which makes it difficult to locate. The minimum has a fitness of 0 when all input parameters are 1 (see Fig. 8.5). The range of parameters used for this optimization problem was from −30 to 30.


    
$$\displaystyle{ f_{\mathrm{rosenbrock}}(\mathbf{x}) = \sum_{i=1}^{N-1}[(\mathbf{x}_{i+1} - \mathbf{x}_{i}^{2})^{2} + (\mathbf{x}_{i} - 1)^{2}] }$$

    (8.9)



    
$$\displaystyle{ f_{\mathrm{rosenbrock}}(1,\ldots,1) = 0 }$$

    (8.10)




Fig. 8.4
The Rastrigin test function with two input parameters (x, y). This test function has many local minima and a single global minimum at 0,0. It has a steeper curvature with more pronounced peaks and valleys than the Griewank function
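For reference, the five test functions can be implemented directly from Eqs. (8.1) through (8.10). The Python sketch below is a minimal, unoptimized version; note that the Rosenbrock implementation follows Eq. (8.9) as written, without the factor of 100 that scales the first term in some common formulations.

```python
import math


def sphere(x):                       # Eq. (8.1)
    return sum(xi * xi for xi in x)


def ackley(x):                       # Eq. (8.3)
    n = len(x)
    return (20.0 + math.e
            - 20.0 * math.exp(-0.2 * math.sqrt(sum(xi * xi for xi in x) / n))
            - math.exp(sum(math.cos(2.0 * math.pi * xi) for xi in x) / n))


def griewank(x):                     # Eq. (8.5)
    product = 1.0
    for i, xi in enumerate(x, start=1):
        product *= math.cos(xi / math.sqrt(i))
    return sum(xi * xi for xi in x) / 4000.0 - product + 1.0


def rastrigin(x):                    # Eq. (8.7)
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)


def rosenbrock(x):                   # Eq. (8.9), without the factor of 100
    return sum((x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1.0) ** 2
               for i in range(len(x) - 1))
```

The known optima in Eqs. (8.2), (8.4), (8.6), (8.8), and (8.10) provide a quick sanity check:

```python
assert sphere([0.0] * 10) == 0.0      # Eq. (8.2)
assert rosenbrock([1.0] * 10) == 0.0  # Eq. (8.10)
print(ackley([0.0] * 10), griewank([0.0] * 10), rastrigin([0.0] * 10))
# all three print 0.0 (up to floating-point round-off)
```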
