NegaBent, No Regrets: Evolving Spectrally Flat Boolean Functions
Negabent Boolean functions are defined by having a flat magnitude spectrum under the nega-Hadamard transform. They exist in both even and odd dimensions, and the subclass of functions that are simultaneously bent and negabent (bent-negabent) has attracted interest due to its combination of optimal periodic and negaperiodic spectral properties. In this work, we investigate how evolutionary algorithms can be used to evolve (bent-)negabent Boolean functions. Our experimental results indicate that evolutionary algorithms, especially genetic programming, are well suited to evolving negabent Boolean functions, and we successfully evolve such functions in all dimensions we consider.
💡 Research Summary
The paper investigates the automatic synthesis of Boolean functions whose spectra are flat under the nega‑Hadamard transform, a class known as negabent functions. While bent functions achieve maximal nonlinearity under the classic Walsh‑Hadamard transform, negabent functions satisfy an analogous flat‑spectrum condition for the nega‑Hadamard transform, which is relevant in quantum information theory and certain cryptographic constructions. In even dimensions, a function can be both bent and negabent (bent‑negabent), i.e., simultaneously optimal for both transforms. In odd dimensions, every affine function is trivially negabent, so the challenge is to find highly nonlinear (ideally maximally nonlinear) negabent functions.
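To make the flat-spectrum condition concrete, here is a small, unoptimised sketch (function names and the numerical tolerance are ours, not the paper's) that computes the unnormalised nega-Hadamard spectrum N_f(u) = Σ_x (−1)^(f(x)⊕u·x) · i^(wt(x)) from a truth table and checks whether every coefficient has magnitude 2^(n/2):

```python
def nega_hadamard(f, n):
    """Unnormalised nega-Hadamard spectrum of an n-variable Boolean function.

    f: truth table as a list of 0/1 values of length 2**n, indexed by the
    integer encoding of the input x.
    """
    spectrum = []
    for u in range(2 ** n):
        acc = 0j
        for x in range(2 ** n):
            dot = bin(u & x).count("1") % 2          # inner product u.x mod 2
            acc += (-1) ** (f[x] ^ dot) * 1j ** bin(x).count("1")
        spectrum.append(acc)
    return spectrum

def is_negabent(f, n):
    """f is negabent iff |N_f(u)| = 2^(n/2) for every u."""
    target = 2 ** (n / 2)
    return all(abs(abs(v) - target) < 1e-9 for v in nega_hadamard(f, n))
```

For example, the constant-zero function (affine) passes the check in any dimension, while the 2-variable AND does not.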
The authors compare two encoding schemes for evolutionary search: (1) a straightforward bit‑string representation of the truth table, and (2) a tree‑based genetic programming (GP) representation using logical primitives (AND, OR, XOR, NOT, IF). The bit‑string approach is simple but suffers from an exponential explosion of the search space as the number of variables grows (2^(2^n) possible truth tables). The GP approach encodes functions as expression trees, allowing structural reuse of sub‑expressions and more nuanced variation operators (various crossover types and subtree mutation).
A key contribution is the design of fitness functions that incorporate both nonlinearity and spectral flatness. For even n, the fitness combines the nonlinearity of the candidate f and of f⊕σ₂ (where σ₂ is the sum of all pairwise products of variables) together with a penalty based on the number of occurrences of the maximal absolute value in the Walsh‑Hadamard spectrum of each function. This encourages the algorithm not only to increase nonlinearity but also to reduce the peak magnitude of the spectrum, moving toward bentness. For odd n, the authors construct an auxiliary (n+1)-variable function f⊕σ₂⊕σ₁·y (σ₁ is the sum of all variables, y is an extra binary variable) and evaluate its nonlinearity and spectral peak count, effectively turning the odd‑dimensional problem into an even‑dimensional bent search.
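The even-n fitness idea can be sketched as runnable code. Note that the exact penalty weighting used in the paper is not reproduced in this summary, so the combination below (nonlinearity plus a normalised peak-count bonus for both f and f⊕σ₂) is illustrative only:

```python
def wht(tt):
    """Fast Walsh-Hadamard transform of a truth table, via the standard
    in-place butterfly on the polarity form (-1)^f."""
    s = [(-1) ** b for b in tt]
    h = 1
    while h < len(s):
        for i in range(0, len(s), 2 * h):
            for j in range(i, i + h):
                s[j], s[j + h] = s[j] + s[j + h], s[j] - s[j + h]
        h *= 2
    return s

def nonlinearity(tt):
    """nl(f) = 2^(n-1) - max|W_f(u)| / 2."""
    return (len(tt) - max(abs(v) for v in wht(tt))) // 2

def sigma2(n):
    """Truth table of sigma_2(x) = sum_{i<j} x_i x_j = C(wt(x), 2) mod 2."""
    return [(bin(x).count("1") * (bin(x).count("1") - 1) // 2) % 2
            for x in range(2 ** n)]

def fitness_even(tt, n):
    """Illustrative fitness for even n: reward the nonlinearity of f and of
    f XOR sigma_2, and add a small bonus for having few occurrences of the
    Walsh-spectrum peak (the relative weighting here is a guess)."""
    g = [a ^ b for a, b in zip(tt, sigma2(n))]
    score = 0.0
    for h in (tt, g):
        spec = [abs(v) for v in wht(h)]
        score += nonlinearity(h) + (1 - spec.count(max(spec)) / len(spec))
    return score
```

For odd n, the same machinery applies after building the auxiliary (n+1)-variable function f⊕σ₂⊕σ₁·y described above, reducing the problem to an even-dimensional bent search.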
Experimental settings include a steady‑state tournament selection (3‑tournament), population size 500, and a budget of 10⁶ fitness evaluations per run, repeated 30 times for each dimension. Variation operators for the bit‑string encoding consist of simple bit flip, substring shuffle, one‑point crossover, and uniform crossover, chosen randomly at each application. For GP, the function set comprises binary operators (OR, XOR, AND), unary NOT, and ternary IF, with crossover variants (size‑fair, one‑point, context‑preserving) and subtree mutation applied randomly.
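Three of the bit-string variation operators listed above can be sketched as follows (a minimal illustration; the parameterisation is ours):

```python
import random

def bit_flip(tt):
    """Simple mutation: flip one randomly chosen truth-table bit."""
    child = tt[:]
    i = random.randrange(len(tt))
    child[i] ^= 1
    return child

def one_point_crossover(a, b):
    """Cut both parents at one random point and splice the halves."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def uniform_crossover(a, b):
    """Pick each child bit independently from one of the two parents."""
    return [random.choice(pair) for pair in zip(a, b)]
```

In a steady-state 3-tournament scheme, each iteration draws three individuals, discards the worst, and replaces it with a mutated crossover of the remaining two, with the operator chosen at random per application.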
Results show a stark contrast between the two encodings. The bit‑string representation discovers bent‑negabent functions only for n=6 and n=7; for larger even dimensions it fails to reach the optimal fitness. In contrast, the GP encoding finds bent‑negabent functions for every even dimension from 6 up to 16, often in every one of the 30 independent runs, with only a few dimensions (10, 14, 16) showing occasional failures. For odd dimensions (7, 9, 11, 13, 15), GP also succeeds in evolving highly nonlinear negabent functions, achieving nonlinearity values close to the known upper bounds. These findings demonstrate that a structural, tree‑based representation provides the necessary expressive power and variation to navigate the rugged fitness landscape defined by combined nonlinearity and spectral flatness.
The authors conclude that evolving negabent functions is feasible and that GP is particularly well‑suited for this task. They note that the computational bottleneck lies in evaluating the Walsh‑Hadamard (and, implicitly, nega‑Hadamard) spectra, especially for larger n, and suggest future work to develop more efficient evaluation methods, incorporate additional cryptographic criteria such as balancedness and algebraic degree, and extend the approach to dimensions beyond 16. Overall, the paper contributes a novel application of evolutionary computation to a previously unexplored class of Boolean functions, providing both methodological insights (fitness design, representation choice) and empirical evidence of success across a wide range of problem sizes.