Complex variational autoencoders admit Kähler structure


It has been discovered that latent-Euclidean variational autoencoders (VAEs) admit, in various capacities, Riemannian structure. We adapt these arguments to complex VAEs with a complex latent stage and show that complex VAEs exhibit, to a corresponding degree, Kähler geometric structure. Our methods are tailored to decoder geometry. We derive the Fisher information metric in the complex case under a latent complex Gaussian with trivial relation matrix. It is well known from statistical information theory that the Fisher information coincides with the Hessian of the Kullback-Leibler (KL) divergence; thus the relation between the metric and the Kähler potential is achieved exactly under relative entropy. We propose a Kähler potential derived from complex Gaussian mixtures that acts as a rough proxy for the Fisher information metric while remaining faithful to the underlying Kähler geometry. Computing the metric from this potential is efficient: because the potential is a valid plurisubharmonic (PSH) function, the large-scale computational burden of automatic differentiation is displaced to small scale. Our methods leverage the law of total covariance to bridge the behavior of our potential and the Fisher metric. We show that the latent space can be regularized with decoder geometry, and that samples can be drawn according to a weighted complex volume element. We demonstrate that these strategies, at the cost of some sample variation, yield consistently smoother representations and fewer semantic outliers.
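The abstract's central computational claim is that the Kähler metric can be recovered cheaply from a scalar potential, since on a Kähler manifold the metric is the mixed Wirtinger Hessian of the potential, g = ∂²K/∂z∂z̄. The sketch below (not from the paper; the potentials and finite-difference scheme are illustrative assumptions) checks this identity numerically for two classical one-dimensional potentials:

```python
import math

def kahler_metric_1d(K, z, h=1e-5):
    """Numerically evaluate g = d²K/dz dz̄ for a scalar complex latent.

    Uses the identity d²/dz dz̄ = ¼ (d²/dx² + d²/dy²) together with
    central finite differences; K must be a real-valued potential of
    one complex variable. Illustrative only, not the paper's method.
    """
    d2x = (K(z + h) - 2.0 * K(z) + K(z - h)) / h**2          # d²K/dx²
    d2y = (K(z + 1j * h) - 2.0 * K(z) + K(z - 1j * h)) / h**2  # d²K/dy²
    return 0.25 * (d2x + d2y)

# Flat potential K = |z|² gives the Euclidean metric g ≡ 1.
flat = lambda z: abs(z) ** 2
print(kahler_metric_1d(flat, 0.3 + 0.7j))  # ≈ 1.0

# Fubini–Study potential K = log(1 + |z|²) gives g = (1 + |z|²)⁻².
fs = lambda z: math.log1p(abs(z) ** 2)
print(kahler_metric_1d(fs, 1.0 + 0.0j))    # ≈ 0.25
```

Because only a scalar potential is differentiated, the automatic-differentiation cost the abstract refers to scales with the potential, not with a full pullback of the decoder Jacobian.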


💡 Research Summary

The paper investigates the geometric structure of variational autoencoders (VAEs) when the latent space is complex‑valued. Building on prior work that showed Euclidean‑latent VAEs inherit a Riemannian metric from the Fisher information, the authors extend the analysis to complex latent variables and demonstrate that a Kähler geometry naturally arises.

First, they assume a complex normal posterior \(z \sim \mathcal{CN}(\mu(x), \Sigma(x))\) with a trivial relation matrix. Using Wirtinger calculus they derive the log-likelihood gradient and Hessian, arriving at a Hermitian Fisher information metric

\[
g_{a\bar b} \;=\; \partial_a \mu^{\mathsf H}\,\Sigma^{-1}\,\partial_{\bar b}\mu \;+\; \operatorname{tr}\!\bigl(\Sigma^{-1}\,\partial_a \Sigma\,\Sigma^{-1}\,\partial_{\bar b}\Sigma\bigr).
\]

