Oscillators Are All You Need: Irregular Time Series Modelling via Damped Harmonic Oscillators with Closed-Form Solutions

Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the original arXiv source.

Transformers excel at time series modelling through attention mechanisms that capture long-term temporal patterns. However, they assume uniform time intervals and therefore struggle with irregular time series. Neural Ordinary Differential Equations (NODEs) handle irregular time series effectively by modelling hidden states as continuously evolving trajectories. ContiFormer (arXiv:2402.10635) combines NODEs with Transformers, but inherits the computational bottleneck of NODEs through its reliance on heavy numerical solvers. This bottleneck could be removed by solving the underlying dynamical system in closed form, but such solutions are intractable for general NODE dynamics. We obviate this by replacing NODEs with a novel linear damped harmonic oscillator analogy, which has a known closed-form solution. We model keys and values as damped, driven oscillators and expand the query in a sinusoidal basis up to a suitable number of modes. This analogy naturally captures the query–key coupling that is fundamental to any Transformer architecture by modelling attention as a resonance phenomenon. Our closed-form solution eliminates the computational overhead of numerical ODE solvers while preserving expressivity. We prove that the oscillator-based parameterisation maintains the universal approximation property of continuous-time attention; specifically, any discrete attention matrix realisable by ContiFormer's continuous keys can be approximated arbitrarily well by our fixed oscillator modes. Our approach delivers both theoretical guarantees and scalability, achieving state-of-the-art performance on irregular time series benchmarks while being orders of magnitude faster.
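To make the abstract's central claim concrete: a linear damped harmonic oscillator can be evaluated at any timestamp directly from its closed-form solution, with no solver stepping. The sketch below (not code from the paper; the function name is our own) implements the underdamped free response of ẍ + 2γẋ + ω²x = 0:

```python
import math

def dho_free_response(t, x0, v0, gamma, omega):
    """Closed-form solution of x'' + 2*gamma*x' + omega**2 * x = 0
    in the underdamped regime (omega > gamma), with x(0) = x0, x'(0) = v0."""
    omega_d = math.sqrt(omega**2 - gamma**2)   # damped frequency
    c1 = x0
    c2 = (v0 + gamma * x0) / omega_d
    return math.exp(-gamma * t) * (c1 * math.cos(omega_d * t)
                                   + c2 * math.sin(omega_d * t))
```

Because the solution is an explicit function of t, irregularly spaced query times cost one evaluation each, which is the source of the speedup over numerical ODE solvers.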


💡 Research Summary

The paper tackles the problem of modelling irregularly sampled time-series data, a setting where conventional Transformers struggle because they assume uniformly spaced timestamps, and where the recent ContiFormer, which augments Transformers with Neural Ordinary Differential Equations (NODEs), suffers from heavy computational overhead due to numerical ODE solvers. The authors propose a fundamentally different approach: replace the NODE dynamics governing the continuous key and value trajectories with a linear damped harmonic oscillator (DHO) system, whose solution is known in closed form.

In the proposed "OsciFormer", each key and each value is generated by a DHO described by the second-order ODE ẍ + 2γẋ + ω²x = F(t). The damping coefficient γ and natural frequency ω are learnable per head and per channel, while the driving term F(t) can be a learned linear function of the input or a fixed forcing. Converting the second-order equation into a first-order system with state z = (x, ẋ) gives ż = Az + BF(t) with A = [[0, 1], [−ω², −2γ]], whose homogeneous part admits the closed-form solution z(t) = e^{At} z(0).
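The first-order reduction described above has an exact state-transition matrix e^{AΔt}, which lets the state jump directly between arbitrarily spaced timestamps. The sketch below is the standard state-space construction for the underdamped case, not code from the paper; names such as `dho_transition` are our own:

```python
import math

def dho_transition(dt, gamma, omega):
    """Exact transition matrix exp(A*dt) for z' = A z,
    A = [[0, 1], [-omega**2, -2*gamma]], underdamped (omega > gamma)."""
    wd = math.sqrt(omega**2 - gamma**2)
    s, c = math.sin(wd * dt), math.cos(wd * dt)
    e = math.exp(-gamma * dt)
    return [[e * (c + gamma / wd * s),  e * s / wd],
            [-e * omega**2 / wd * s,    e * (c - gamma / wd * s)]]

def propagate(z, timestamps, gamma, omega):
    """Advance state z = (x, x') across arbitrarily spaced timestamps
    in one exact matrix-vector product per interval."""
    states = [tuple(z)]
    for t_prev, t_next in zip(timestamps, timestamps[1:]):
        P = dho_transition(t_next - t_prev, gamma, omega)
        z = (P[0][0] * z[0] + P[0][1] * z[1],
             P[1][0] * z[0] + P[1][1] * z[1])
        states.append(z)
    return states
```

Because the transition is exact, propagating over one long gap or over many short ones gives the same state, so the cost per key/value trajectory is linear in the number of observations regardless of their spacing.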

