Selective-Candidate Framework with Similarity Selection Rule for Evolutionary Optimization

Reading time: 6 minutes

📝 Original Info

  • Title: Selective-Candidate Framework with Similarity Selection Rule for Evolutionary Optimization
  • ArXiv ID: 1712.06338
  • Date: 2020-05-15
  • Authors: Sheng Xin Zhang, Wing Shing Chan, Zi Kang Peng, Shao Yong Zheng, Kit Sang Tang

📝 Abstract

Achieving good exploitation and exploration capabilities (EEC) has always been an important yet challenging issue in the design of evolutionary optimization algorithms (EOAs). The difficulty lies in obtaining a good balance in EEC, which is determined cooperatively by the operations and parameters of an EOA. When deficiencies in exploitation or exploration are observed, most existing works take a piecemeal approach, either designing new operations or altering the parameters. Unfortunately, such proposals may fail to prevail when different situations are encountered. To address these problems, this paper proposes an explicit EEC control method named the selective-candidate framework with similarity selection rule (SCSS). First, M (M > 1) candidates are generated from each current solution with independent operations and parameters to enrich the search. Then, a similarity selection rule determines the final candidate by considering the fitness ranking of the current solution and its Euclidean distance to each of the M candidates. Superior current solutions prefer the closest candidates for efficient local exploitation, while inferior ones favor the farthest for exploration purposes. In this way, the rule synthesizes exploitation and exploration, making the evolution more effective. When applied to three classic, four state-of-the-art, and four up-to-date EOAs from the branches of differential evolution, evolution strategy, and particle swarm optimization, significant performance enhancements are achieved.
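The selection rule described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: it assumes minimization, and it approximates the rank-based preference with a simple split in which the better-ranked half of the population takes the closest candidate and the worse-ranked half takes the farthest; the paper's actual rule may map ranks to choices differently.

```python
import numpy as np

def scss_select(population, fitness, candidates):
    """Similarity-selection sketch (assumes minimization).

    population: (N, D) current solutions
    fitness:    (N,) fitness values, lower is better
    candidates: (N, M, D) the M candidates generated per solution
    Returns the (N, D) array of selected candidates.
    """
    n = len(population)
    # Fitness ranking: 0 = best, n-1 = worst.
    ranks = np.argsort(np.argsort(fitness))
    # Euclidean distance from each current solution to each of its M candidates.
    dists = np.linalg.norm(candidates - population[:, None, :], axis=2)  # (N, M)
    # Superior solutions -> closest candidate (local exploitation);
    # inferior solutions -> farthest candidate (exploration).
    pick_closest = ranks < n // 2
    idx = np.where(pick_closest, dists.argmin(axis=1), dists.argmax(axis=1))
    return candidates[np.arange(n), idx]
```

With two solutions and M = 2, the better-ranked solution ends up with its nearby candidate and the worse-ranked one with its distant candidate, which is the exploitation/exploration split the rule is after.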

💡 Deep Analysis

Deep Dive into Selective-Candidate Framework with Similarity Selection Rule for Evolutionary Optimization.


📄 Full Content


Selective-Candidate Framework with Similarity Selection Rule for
Evolutionary Optimization

Sheng Xin Zhanga* , Wing Shing Chana, Zi Kang Pengb, Shao Yong Zhengb* , Kit Sang Tanga

aDepartment of Electronic Engineering, City University of Hong Kong, Kowloon, Hong Kong bSchool of Electronics and Information Technology, Sun Yat-sen University, Guangzhou, 510006, China

Abstract Achieving good exploitation and exploration capabilities (EEC) has always been an important yet challenging issue in the design of evolutionary optimization algorithms (EOAs). The difficulty lies in obtaining a good balance in EEC, which is determined cooperatively by the operations and parameters of an EOA. When deficiencies in exploitation or exploration are observed, most existing works take a piecemeal approach, either designing new operations or altering the parameters. Unfortunately, such proposals may fail to prevail when different situations are encountered. To address these problems, this paper proposes an explicit EEC control method named the selective-candidate framework with similarity selection rule (SCSS). First, M (M > 1) candidates are generated from each current solution with independent operations and parameters to enrich the search. Then, a similarity selection rule determines the final candidate by considering the fitness ranking of the current solution and its Euclidean distance to each of the M candidates. Superior current solutions prefer the closest candidates for efficient local exploitation, while inferior ones favor the farthest for exploration purposes. In this way, the rule synthesizes exploitation and exploration, making the evolution more effective. When applied to three classic, four state-of-the-art, and four up-to-date EOAs from the branches of differential evolution, evolution strategy, and particle swarm optimization, significant performance enhancements are achieved.

Keywords: Evolution status, similarity selection, exploitation and exploration, differential evolution (DE), covariance matrix adaptation evolution strategy (CMA-ES), particle swarm optimization (PSO), global optimization.

*Corresponding authors. E-mail addresses: shengxinzhang@gmail.com (S. X. Zhang), zhengshaoy@mail.sysu.edu.cn (S. Y. Zheng).


  1. Introduction

Constructed on a population basis, an evolutionary optimization algorithm (EOA) explores a search space by iteratively performing genetic operations (for evolutionary algorithms, EAs [1, 2]) or social learning processes (for swarm intelligence, SI [3]) to generate new solutions. How these solutions are sampled gives a particular method its character and determines its exploitation and exploration capabilities (EEC). For differential evolution (DE) [4-8] and evolution strategy (ES) [9], the genetic operations are mutation and crossover/recombination, while for particle swarm optimization (PSO) [10], the social learning procedures consist of the velocity and position update equations. Commonly, the EEC of an EOA is controlled by the genetic operations/social learning together with the associated parameters (e.g., mutation and crossover factors in DE, the normal distribution in ES, and acceleration coefficients in PSO), which cooperatively locate the sampled solutions. Since EEC is the cornerstone of evolutionary optimization [11] and has a direct impact on performance, researchers have put much effort into designing appropriate exploitation and exploration schemes [12]. Existing works can be summarized under the following three categories.

(1) EEC controlled by genetic operations/social learning. In general, genetic operations/social learning determine the evolution direction, and research works in this category focus solely on them. Along this line, various types of operators, such as ranking-based [13] and collective-information-based [14] mutation, and multi-objective sorting-based [15] and jumping-genes-based [16] crossover, were designed to favor an exploitation or exploration trend. Fitness diversity was considered in the designs of operations [13-15].

Besides these newly designed operations, EEC was also controlled by an ensemble of multiple DE mutation strategies [17-20], a combination of different types of optimizers [21], and memetic algorithms [22-24]. In the multialgorithm genetically adaptive method (AMALGAM) [21] and the multiple offspring sampling (MOS) [23] hybrid method, the constituents compete for computational resources based on their online performance, which enhances the exploitation capability of the whole. To preserve population diversity, [21] also introduced a diversity mechanism. In [24, 25], multiple search agents were coordinated by considering the fitness distribution among individuals.

(2) EEC controlled by parameter tuning. Parameters control the evolution scale. In this category, researchers pursued ef
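To make the candidate-generation side of the framework concrete, here is a minimal sketch of producing M candidates per solution with the classic DE/rand/1 mutation, each with an independently drawn scale factor F, in the spirit of SCSS's "independent operations and parameters." The value of M and the F range (0.4, 0.9) are illustrative assumptions, not the paper's settings, and crossover and bound handling are omitted.

```python
import numpy as np

def de_rand1_candidates(population, M=2, rng=None):
    """Generate M DE/rand/1 mutant candidates per current solution (sketch).

    population: (N, D) current solutions, with N >= 4
    Returns a (N, M, D) array of candidates.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = population.shape
    cands = np.empty((n, M, d))
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for m in range(M):
            # Three distinct indices r1, r2, r3, all different from i.
            r1, r2, r3 = rng.choice(others, 3, replace=False)
            # Independently drawn scale factor per candidate (illustrative range).
            F = rng.uniform(0.4, 0.9)
            cands[i, m] = population[r1] + F * (population[r2] - population[r3])
    return cands
```

The resulting (N, M, D) array is exactly the shape a similarity-selection step would consume to pick one final candidate per solution.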

…(Full text truncated)…


Reference

This content is AI-processed based on ArXiv data.
