A novel mutation operator based on the union of fitness and design spaces information for Differential Evolution
Differential Evolution (DE) is one of the most successful and powerful evolutionary algorithms for global optimization problems. Its most important operator is mutation, in which parents are traditionally selected at random. Recently, numerous papers have tried to make this operator more intelligent by selecting the mutation parents deliberately. Such intelligent selection of mutation vectors is typically performed using either a design-space (also known as decision-space) criterion or a fitness-space criterion; in either case, however, half of the valuable information about the problem space is disregarded. In this article, Universal Differential Evolution (UDE) is proposed, which takes advantage of both the design-space and fitness-space criteria for the intelligent selection of mutation vectors. Experimental analysis of UDE on the CEC2005 benchmarks shows that it significantly improves the performance of differential evolution in comparison with methods that use only one criterion for intelligent selection.
💡 Research Summary
Differential Evolution (DE) is a widely used population‑based optimizer whose performance heavily depends on the mutation operator. Traditional DE selects mutation parents randomly, which guarantees global exploration but ignores problem‑specific information. Recent studies have attempted to make mutation “intelligent” by selecting parents either in the design (decision) space—using geometric or clustering cues—or in the fitness space—favoring individuals with high objective values. Each of these approaches, however, discards half of the available information about the search landscape.
The present paper introduces Universal Differential Evolution (UDE), a novel framework that unifies design‑space and fitness‑space criteria for parent selection. For every individual in the current population, two probability measures are computed: (1) a design‑space probability (p_design) derived from inter‑individual distances and a Gaussian kernel that emphasizes diversity, and (2) a fitness‑space probability (p_fitness) obtained by rank‑based scaling that rewards high‑fitness individuals. These two measures are then combined—either by weighted averaging or multiplicative fusion—to produce a unified selection probability (p_union). Parents for the mutation vector are sampled according to p_union, allowing the algorithm to exploit diversity during early generations while gradually focusing on elite solutions as the search progresses.
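The fusion step above can be sketched in NumPy. This is a minimal illustrative reconstruction, not the paper's exact formulas: the Gaussian-kernel diversity measure, the rank-based fitness scaling, the function name `selection_probabilities`, and the parameters `w` and `sigma` are all assumptions chosen to match the description of p_design, p_fitness, and weighted-average fusion.

```python
import numpy as np

def selection_probabilities(pop, fitness, w=0.5, sigma=1.0):
    """Illustrative sketch of UDE-style probability fusion:
    combine a design-space (diversity) criterion and a fitness-space
    (rank) criterion into one selection distribution p_union."""
    n = len(pop)
    # Design-space probability: mean pairwise distance per individual,
    # passed through a Gaussian kernel so that isolated (diverse)
    # individuals receive higher weight.
    dists = np.linalg.norm(pop[:, None, :] - pop[None, :, :], axis=-1)
    mean_dist = dists.sum(axis=1) / (n - 1)
    p_design = 1.0 - np.exp(-(mean_dist ** 2) / (2 * sigma ** 2))
    p_design /= p_design.sum()
    # Fitness-space probability: rank-based scaling for minimization,
    # so the best individual (lowest objective) gets the largest rank.
    ranks = np.empty(n)
    ranks[np.argsort(fitness)] = np.arange(n, 0, -1)
    p_fitness = ranks / ranks.sum()
    # Weighted-average fusion; multiplicative fusion would be
    # p_design * p_fitness followed by renormalization.
    p_union = w * p_design + (1 - w) * p_fitness
    return p_union / p_union.sum()
```

Shifting `w` toward 1 emphasizes diversity (exploration), while shifting it toward 0 emphasizes elite solutions (exploitation), which mirrors the early-to-late transition the paper describes.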
UDE follows the standard DE loop with three additional steps: (i) compute p_design and p_fitness for the whole population, (ii) derive p_union and sample three mutation parents, and (iii) update the probabilities each generation to maintain dynamic adaptation. The extra computational overhead is modest (≈5 % of total runtime) because probability updates are simple arithmetic operations.
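One generation of that loop might look as follows. This is a hedged sketch of a standard DE/rand/1/bin generation in which the three mutation parents are drawn from a unified distribution `p_union` rather than uniformly; the function name `ude_step` and the defaults `F=0.5`, `CR=0.9` are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def ude_step(pop, fitness, objective, p_union, F=0.5, CR=0.9, rng=None):
    """One DE generation with UDE-style parent sampling (sketch).
    Minimization is assumed; greedy one-to-one survivor selection."""
    rng = rng if rng is not None else np.random.default_rng()
    n, d = pop.shape
    new_pop = pop.copy()
    new_fit = fitness.copy()
    for i in range(n):
        # (ii) Sample three distinct mutation parents from p_union.
        r1, r2, r3 = rng.choice(n, size=3, replace=False, p=p_union)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])
        # Binomial crossover with at least one dimension from the mutant.
        cross = rng.random(d) < CR
        cross[rng.integers(d)] = True
        trial = np.where(cross, mutant, pop[i])
        # Greedy selection: keep the trial only if it is no worse.
        f_trial = objective(trial)
        if f_trial <= fitness[i]:
            new_pop[i], new_fit[i] = trial, f_trial
    return new_pop, new_fit
```

In the full algorithm, steps (i) and (iii) would recompute `p_union` from the updated population before the next call, which is why the per-generation overhead stays small.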
Experimental validation uses the CEC‑2005 benchmark suite (25 functions covering unimodal, multimodal, and hybrid landscapes) at dimensions 30, 50, and 100. Each configuration is run 30 times independently. UDE is compared against classic DE strategies (rand/1, best/1) and recent intelligent mutation variants that rely exclusively on either design‑space (Design‑DE) or fitness‑space (Fitness‑DE) information. Performance metrics include mean best‑of‑run value, standard deviation, and statistical significance assessed via the Wilcoxon signed‑rank test.
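For reference, the two classic baselines can be stated compactly. The sketch below shows the standard DE/rand/1 and DE/best/1 mutation rules (well-known formulas, not code from the paper); the helper names are hypothetical.

```python
import numpy as np

def mutate_rand1(pop, F, rng):
    """DE/rand/1: v = x_r1 + F * (x_r2 - x_r3), parents chosen
    uniformly at random -- the purely exploratory baseline."""
    r1, r2, r3 = rng.choice(len(pop), size=3, replace=False)
    return pop[r1] + F * (pop[r2] - pop[r3])

def mutate_best1(pop, fitness, F, rng):
    """DE/best/1: v = x_best + F * (x_r1 - x_r2), anchored on the
    current best individual -- the purely exploitative baseline."""
    best = pop[np.argmin(fitness)]  # minimization assumed
    r1, r2 = rng.choice(len(pop), size=2, replace=False)
    return best + F * (pop[r1] - pop[r2])
```

Design-DE and Fitness-DE replace the uniform parent choice here with a single-criterion probability, which is exactly the gap UDE's unified distribution is meant to close.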
Results show that UDE consistently outperforms all baselines, achieving average improvements of 12 %–18 % in final objective values. The advantage is especially pronounced on multimodal and high‑dimensional problems, where UDE’s convergence speed is markedly faster. Statistical tests confirm that the superiority is significant in the majority of cases.
The key contributions of the paper are: (1) a unified parent‑selection mechanism that simultaneously leverages geometric diversity and fitness ranking, thereby balancing exploration and exploitation more effectively than single‑criterion methods; (2) a lightweight probability‑fusion scheme that can be tuned to different problem characteristics without incurring substantial computational cost; (3) extensive empirical evidence demonstrating that the unified approach yields superior optimization performance across a broad set of benchmark functions.
Future research directions suggested include adaptive learning of the weighting between design and fitness probabilities, extension of UDE to multi‑objective optimization to improve Pareto front coverage, and hybridization with other meta‑heuristics such as Particle Swarm Optimization or Genetic Algorithms to further enhance search robustness. These extensions would broaden the applicability of UDE to real‑world, large‑scale, and multi‑criteria optimization problems.