Multi-timescale time encoding for CNN prediction of Fenna-Matthews-Olson energy-transfer dynamics


Machine learning simulations of open quantum dynamics often rely on recursive predictors that accumulate error. We develop a non-recursive convolutional neural network (CNN) that maps system parameters and a redundant time encoding directly to excitation-energy-transfer populations in the Fenna-Matthews-Olson complex. The encoding, built from modified logistic plus $\tanh$ functions, normalizes time and resolves fast, transitional, and quasi-steady regimes, while physics-informed labels enforce population conservation and inter-site consistency. Trained only on 0–7 ps reference trajectories generated with a Lindblad model in QuTiP, the network accurately predicts 0–100 ps dynamics across a range of reorganization energies, bath rates, and temperatures. Beyond 20 ps, the absolute relative error remains below 0.05, demonstrating stable long-time extrapolation. By avoiding step-by-step recursion, the method suppresses error accumulation and generalizes across timescales. These results show that redundant time encoding enables data-efficient inference of long-time quantum dissipative dynamics in realistic pigment-protein complexes, and may aid the data-driven design of light-harvesting materials.


💡 Research Summary

This paper addresses the long‑standing challenge of accurately simulating open‑quantum‑system dynamics over extended timescales without the error accumulation that plagues recursive machine‑learning approaches. The authors focus on the Fenna‑Matthews‑Olson (FMO) pigment‑protein complex, a prototypical system for studying excitation‑energy‑transfer (EET) in photosynthetic organisms. Traditional numerical methods such as HEOM, path‑integral Monte Carlo, or MCTDH provide high accuracy but become computationally prohibitive for long‑time simulations. Recent data‑driven methods have largely relied on recurrent neural networks (RNNs) or long short‑term memory (LSTM) architectures, which predict each future state from previously predicted states. While conceptually similar to Markovian propagation, these recursive models suffer from error propagation, over‑fitting to short‑time data, and violations of fundamental physical constraints such as trace preservation and positivity of the reduced density matrix.

To overcome these limitations, the authors propose a non-recursive, one-dimensional convolutional neural network (CNN) that directly maps system parameters and a specially designed time encoding to the full set of site populations and two global observables. The key innovation is the "redundant time encoding" module: the physical time variable $t$ (0–100 ps) is transformed into a vector of 100 overlapping basis functions $f_k(t)=\eta_k(t)+g_k(t)$. The $\eta_k$ component is a scaled hyperbolic tangent, while $g_k$ is a logistic-type function; together they produce smooth S-shaped curves that are uniformly normalized to the interval $[0,1]$.
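The encoding described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the evenly spaced centers `c` and the steepness `a` are assumptions, since the summary does not give the exact parameterization of $\eta_k$ and $g_k$.

```python
import numpy as np

def redundant_time_encoding(t, n_basis=100, t_max=100.0):
    """Sketch of a redundant time encoding: map a physical time t (ps)
    to n_basis overlapping S-shaped features f_k(t) = eta_k(t) + g_k(t),
    where eta_k is a scaled hyperbolic tangent and g_k a logistic-type
    function. Centers and steepness below are illustrative assumptions."""
    k = np.arange(n_basis)
    c = t_max * (k + 0.5) / n_basis        # assumed evenly spaced centers
    a = n_basis / t_max                    # assumed common steepness
    eta = 0.5 * (1.0 + np.tanh(a * (t - c)))  # scaled tanh, values in [0, 1]
    g = 1.0 / (1.0 + np.exp(-a * (t - c)))    # logistic, values in (0, 1)
    return 0.5 * (eta + g)                 # sum, renormalized to [0, 1]

# Each feature saturates at a different time, so early, transitional,
# and quasi-steady regimes all remain resolved in the input vector.
features = redundant_time_encoding(5.0)    # encoding of t = 5 ps
```

Because every basis function covers the full time axis with a different switching point, a single scalar time is spread redundantly over many inputs, which is what lets the CNN treat short-time and long-time prediction on an equal footing.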

