Extension of Wirtinger Calculus in RKH Spaces and the Complex Kernel LMS


Over the last decade, kernel methods for nonlinear processing have successfully been used in the machine learning community. However, so far, the emphasis has been on batch techniques. It is only recently that online adaptive techniques have been considered in the context of signal processing tasks. To the best of our knowledge, no kernel-based strategy has been developed, so far, that is able to deal with complex-valued signals. In this paper, we take advantage of a technique called complexification of real RKHSs to attack this problem. In order to derive gradients and subgradients of operators that need to be defined on the associated complex RKHSs, we employ the powerful tool of Wirtinger's Calculus, which has recently attracted much attention in the signal processing community. Wirtinger's calculus simplifies computations and offers an elegant tool for treating complex signals. To this end, in this paper, the notion of Wirtinger's calculus is extended, for the first time, to include complex RKHSs and used to derive the Complex Kernel Least-Mean-Square (CKLMS) algorithm. Experiments verify that the CKLMS can be used to derive nonlinear, stable algorithms which offer significant performance improvements over the traditional complex LMS or Widely Linear complex LMS (WL-LMS) algorithms when dealing with nonlinearities.


💡 Research Summary

The paper addresses a notable gap in the literature: the lack of kernel-based adaptive algorithms that can directly handle complex-valued signals. To fill this gap, the authors combine two powerful mathematical tools—complexification of real reproducing kernel Hilbert spaces (RKHSs) and Wirtinger calculus—and use them to derive a novel online learning rule, the Complex Kernel Least-Mean-Square (CKLMS) algorithm.

First, the authors formalize the complexification process. Starting from a real RKHS \(\mathcal{H}\) equipped with a kernel \(k(\cdot,\cdot)\), every element of the complexified space is written as \(f_c = f_1 + i f_2\) with real components \(f_1, f_2 \in \mathcal{H}\). Equipping this space with the inner product \(\langle f_c, g_c\rangle = \langle f_1,g_1\rangle + \langle f_2,g_2\rangle + i(\langle f_1,g_2\rangle - \langle f_2,g_1\rangle)\) turns it into a bona fide complex RKHS. Importantly, the same real kernel can be reused: the complexified feature map is simply \(\hat\Phi(x) = \Phi(x) + i\,\Phi(x)\), where \(\Phi\) is the feature map of \(k\), so no genuinely complex kernel has to be designed. This construction preserves the reproducing property while allowing complex inputs to be mapped into an infinite-dimensional feature space.
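
A quick check, obtained by substituting \(f_1 = f_2 = \Phi(x)\) and \(g_1 = g_2 = \Phi(x')\) into the inner product just defined, shows why the real kernel suffices: kernel evaluations in the complexified space reduce to evaluations of \(k\) (the factor of 2 can be absorbed into the kernel's normalization):

\[
\langle \hat\Phi(x), \hat\Phi(x') \rangle
= \langle \Phi(x), \Phi(x')\rangle + \langle \Phi(x), \Phi(x')\rangle
+ i\bigl(\langle \Phi(x), \Phi(x')\rangle - \langle \Phi(x), \Phi(x')\rangle\bigr)
= 2\,k(x,x').
\]

In practice this means every quantity the algorithm needs can be computed from the original real kernel alone.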

Second, the paper extends Wirtinger calculus, originally defined for functions of finite-dimensional complex variables, to functionals on a complex RKHS. By interpreting the derivatives as Fréchet derivatives in the Hilbert space and treating the variable \(w\) and its conjugate \(w^*\) as formally independent, the authors obtain a compact expression for the gradient of the mean-square-error cost functional \(J(w) = \mathbb{E}\left[|e_n|^2\right]\), where \(e_n = d_n - \langle w, \hat\Phi(z_n)\rangle\) is the a priori estimation error. Replacing the expectation with its instantaneous estimate yields a stochastic-gradient step of the familiar LMS form, \(w_n = w_{n-1} + \mu\, e_n\, \hat\Phi(z_n)\) (up to conjugation conventions), and the reproducing property turns the filter output into a growing expansion of real kernel evaluations: the CKLMS algorithm. As the abstract notes, experiments verify that CKLMS yields stable nonlinear filters that offer significant performance improvements over the complex LMS and the widely linear LMS (WL-LMS) when the underlying task is nonlinear.
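
To make the update concrete, below is a minimal Python sketch of a CKLMS-style filter; this is not the authors' code. As in the real kernel LMS, the weight never has to be formed explicitly: it is represented by a growing list of centers and complex coefficients. The Gaussian kernel width, step size, toy nonlinearity, and the absence of any sparsification are illustrative assumptions, and conjugation/normalization conventions may differ from the paper's exact formulation.

```python
import numpy as np

def gaussian_kernel(z1, z2, sigma=1.0):
    # Real Gaussian kernel on C^d, viewed as R^{2d} by stacking
    # real and imaginary parts of the difference.
    diff = np.atleast_1d(z1 - z2)
    d = np.concatenate([diff.real, diff.imag])
    return np.exp(-np.dot(d, d) / (2.0 * sigma**2))

def cklms(Z, D, mu=0.1, sigma=1.0):
    # The weight w_n = mu * sum_k e_k * Phi_hat(z_k) is kept implicitly
    # as stored centers plus complex coefficients alpha_k = mu * e_k.
    centers, alphas, errors = [], [], []
    for z, d in zip(Z, D):
        # A priori output: kernel expansion over the current dictionary.
        # The factor 2 comes from <Phi_hat(x), Phi_hat(x')> = 2 k(x, x').
        y = sum(a * 2.0 * gaussian_kernel(c, z, sigma)
                for a, c in zip(alphas, centers))
        e = d - y                 # complex a priori error
        centers.append(z)         # grow the dictionary (no sparsification)
        alphas.append(mu * e)     # instantaneous Wirtinger-gradient step
        errors.append(e)
    return centers, alphas, np.asarray(errors)

# Toy usage on a hypothetical mildly nonlinear complex mapping.
rng = np.random.default_rng(0)
Z = rng.standard_normal(500) + 1j * rng.standard_normal(500)
D = Z + 0.2 * Z**2
_, _, err = cklms(Z, D, mu=0.2, sigma=2.0)
print("mean |e| over first 50 samples:", np.abs(err[:50]).mean().round(3))
print("mean |e| over last 50 samples: ", np.abs(err[-50:]).mean().round(3))
```

Note that keeping every sample as a center makes each prediction cost grow linearly with time; practical kernel LMS variants bound the dictionary (e.g., with novelty or coherence criteria), a concern orthogonal to the complexification itself.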

