Large Scale Global Optimization by Hybrid Evolutionary Computation
Gutha Jaya Krishna1, 2 and Vadlamani Ravi1
1Center of Excellence in Analytics,
Institute for Development and Research in Banking Technology,
Castle Hills Road #1, Masab Tank, Hyderabad - 500 057, INDIA.
2School of Computer & Information Sciences, University of Hyderabad,
Hyderabad - 500 046, INDIA.
krishna.gutha@gmail.com, padmarav@gmail.com
Abstract. In management, business, economics, science, engineering, and research domains, Large Scale Global Optimization (LSGO) plays a predominant and vital role. Although LSGO arises in many application domains, it is a very difficult and challenging task. The Congress on Evolutionary Computation (CEC) began an LSGO competition to encourage new algorithms on a suite of standard unconstrained LSGO benchmark functions. In this paper, we propose a hybrid meta-heuristic algorithm that combines an Improved and Modified Harmony Search (IMHS) with a Modified Differential Evolution (MDE) employing an alternate selection strategy. Harmony Search (HS) performs the exploration and exploitation, while Differential Evolution perturbs the exploration of IMHS, since harmony search tends to get stuck in the basin of a local optimum. To judge the performance of the proposed algorithm, we compare it with ten state-of-the-art meta-heuristic algorithms on the fifteen LSGO benchmark functions of the CEC 2013 LSGO special session, each with 1000 continuous decision variables. The experimental results show that the proposed hybrid meta-heuristic performs statistically on par with some algorithms on a few problems and turns out to be the best on a couple of problems.

Keywords: Global Optimization; Differential Evolution; Harmony Search; Hybrid Metaheuristic; Large Scale Global Optimization.
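The hybridization idea in the abstract — harmony search handling exploration and exploitation, with a DE-style difference vector acting as a perturbation — can be illustrated with a minimal sketch. The function below is an illustrative assumption, not the authors' exact IMHS/MDE; the parameter names (`hmcr`, `par`, `bw`, `F`) follow standard HS and DE conventions:

```python
import random

def improvise(memory, lb, ub, hmcr=0.9, par=0.3, bw=0.01, F=0.5):
    """One harmony-improvisation step followed by a DE-style perturbation.

    memory: list of candidate vectors (the harmony memory)
    hmcr:   harmony memory considering rate
    par:    pitch adjusting rate
    bw:     bandwidth for pitch adjustment
    F:      DE scale factor used for the perturbation
    """
    dim = len(memory[0])
    new = []
    for j in range(dim):
        if random.random() < hmcr:
            # take the value of variable j from a random memory member
            x = random.choice(memory)[j]
            if random.random() < par:
                # pitch adjustment: small local move (exploitation)
                x += bw * (ub - lb) * random.uniform(-1, 1)
        else:
            # random re-initialization within bounds (exploration)
            x = random.uniform(lb, ub)
        new.append(min(max(x, lb), ub))

    # DE-style perturbation: add a scaled difference of two memory members
    a, b = random.sample(memory, 2)
    return [min(max(v + F * (ai - bi), lb), ub)
            for v, ai, bi in zip(new, a, b)]
```

In standard HS, the improvised harmony replaces the worst memory member if it improves on it; the paper's "alternate selection strategy" presumably modifies that replacement step.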
1. Introduction
Optimization consists of minimizing or maximizing a real-valued objective function of real decision variables within specified bounds, and may or may not include constraints [1,2]. Optimization without constraints is termed unconstrained optimization, and optimization with constraints is termed constrained optimization. Optimization has many sub-areas, which may involve multiple objectives, i.e., multi-objective optimization, or multiple good solutions for the same objective function, i.e., multimodal optimization. Optimization has a wide range of applications in mechanics, economics, finance, electrical engineering, operational research, control engineering, geophysics, molecular modeling, etc. Optimization methods fall into three sub-categories, namely 1) classical optimization, 2) heuristic-based optimization, and 3) metaheuristic-based optimization. Classical optimization techniques apply only to convex, continuous, and differentiable search problems, whereas heuristic and metaheuristic-based optimization can also be applied to non-convex, discontinuous, and non-differentiable search problems. Heuristic-based optimization relies on problem-specific assumptions, but metaheuristic-based optimization makes no problem-specific assumption [3].

Corresponding Author, Phone: +914023294042; FAX: +914023535157
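To make the distinction concrete, here is a minimal, hypothetical example: pure random search needs neither convexity nor gradients, so it can minimize a discontinuous, non-differentiable objective that classical methods cannot handle (all names below are illustrative, not from the paper):

```python
import random

def random_search(f, lb, ub, iters=10000, seed=0):
    """Minimize f on [lb, ub] by pure random sampling.

    Needs no gradient, so it also works on non-convex,
    discontinuous objectives that classical methods cannot handle.
    """
    rng = random.Random(seed)
    best_x = rng.uniform(lb, ub)
    best_f = f(best_x)
    for _ in range(iters):
        x = rng.uniform(lb, ub)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# A discontinuous, non-differentiable objective with its minimum at x = 2
step = lambda x: abs(x - 2) + (1 if x > 3 else 0)
x, fx = random_search(step, -5, 5)
```

Of course, such blind sampling scales poorly; the metaheuristics discussed next add memory and structure to the search.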
Metaheuristic-based optimization techniques are classified into Evolutionary Computing (EC), Swarm Intelligence (SI)-based optimization, stochastic optimization, physics-based optimization, Artificial Immune System (AIS)-based optimization, etc. Metaheuristic algorithms are further classified into population-based and point-based optimization algorithms. Evolutionary computing techniques include the Genetic Algorithm [4], Genetic Programming [5], and Differential Evolution (DE) [6], all of which are single-objective optimization algorithms. However, the Multi-Objective Genetic Algorithm [7], Strength Pareto Evolutionary Algorithm-2 [8], and Nondominated Sorting Genetic Algorithm-II (NSGA-II) [9] are examples of multi-objective evolutionary computing algorithms. Swarm Intelligence-based optimization algorithms include Particle Swarm Optimization (PSO) [10], Ant Colony Optimization (ACO) [11], and Bee Swarm-based Optimization [12]. Stochastic optimization algorithms include Tabu Search [13,14], Stochastic Hill Climbing [15], and Threshold Accepting (TA) [16]. Physics-based optimization includes Simulated Annealing [17] and Harmony Search (HS) [18]. Artificial Immune System-based optimization [19,20] includes Negative Selection, Clonal Selection, etc.
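As one concrete instance, the classic DE/rand/1/bin step of Differential Evolution [6] builds a trial vector from a scaled difference of random population members, followed by binomial crossover with the target. The sketch below follows the textbook scheme, not the paper's modified DE (MDE):

```python
import random

def de_rand_1_bin(pop, i, F=0.5, CR=0.9, rng=random):
    """Build a trial vector for population member i with DE/rand/1/bin.

    pop: list of real-valued vectors
    F:   scale factor applied to the difference vector
    CR:  crossover rate of the binomial crossover
    """
    idxs = [k for k in range(len(pop)) if k != i]
    r1, r2, r3 = rng.sample(idxs, 3)
    target = pop[i]
    dim = len(target)
    jrand = rng.randrange(dim)  # guarantees at least one mutated component
    trial = []
    for j in range(dim):
        if rng.random() < CR or j == jrand:
            # mutant component: base vector plus scaled difference
            trial.append(pop[r1][j] + F * (pop[r2][j] - pop[r3][j]))
        else:
            # inherit the component from the target vector
            trial.append(target[j])
    return trial
```

In standard DE, the trial vector then replaces the target if it has a better objective value, which is the selection step the paper's alternate strategy varies.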
There is also another class of meta-heuristics that combines the power of more than one meta-heuristic. These are called Memetic Algorithms (MAs). MAs have evolved over three generations. First-generation MAs are the hybrid optimization algorithms. These use the power of one cl