The behaviour of information flow near criticality


Recent experiments have indicated that many biological systems self-organise near their critical point, which hints at a common design principle. While it has been suggested that information transmission is optimized near the critical point, it remains unclear how information transmission depends on the dynamics of the input signal, the distance over which the information needs to be transmitted, and the distance to the critical point. Here we employ stochastic simulations of a driven 2D Ising system and study the instantaneous mutual information and the information transmission rate between a driven input spin and an output spin. The instantaneous mutual information varies non-monotonically with the temperature, but increases monotonically with the correlation time of the input signal. In contrast, the information transmission rate exhibits a maximum as a function of the input correlation time. Moreover, there exists an optimal temperature that maximizes this maximum information transmission rate. It arises from a tradeoff between the necessity to respond fast to changes in the input so that more information per unit amount of time can be transmitted, and the need to respond reliably. The optimal temperature lies above the critical point, but moves towards it as the distance between the input and output spin is increased.


💡 Research Summary

The paper investigates how information transmission in a driven two‑dimensional Ising model depends on three key parameters: the temperature (which controls the system’s intrinsic response time), the correlation time of the external input signal, and the spatial distance between the input and output spins. The authors model the input as a single spin that flips according to a stationary random‑telegraph process with mean correlation time τₛ, while the output is another spin located a lattice distance d away. The system evolves under discrete‑time Glauber dynamics; because the input is continuously driven, it relaxes to a non‑equilibrium steady state.
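
To make this setup concrete, below is a minimal sketch of such a driven Glauber simulation, assuming single‑spin‑flip updates on a periodic lattice with the input spin held at a fixed site and switched by a telegraph process; the function names, lattice handling, and parameters are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def glauber_step(spins, beta, J=1.0, frozen=(0, 0)):
    """One attempted single-spin-flip Glauber update on a periodic 2D lattice.
    The driven input spin at site `frozen` is never updated by the thermal dynamics."""
    L = spins.shape[0]
    i, j = rng.integers(L), rng.integers(L)
    if (i, j) == frozen:
        return
    # Sum over the four nearest neighbours with periodic boundary conditions.
    nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
          spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
    dE = 2.0 * J * spins[i, j] * nn                      # energy cost of flipping (i, j)
    if rng.random() < 1.0 / (1.0 + np.exp(beta * dE)):   # Glauber flip probability
        spins[i, j] *= -1

def telegraph_update(s, p_flip):
    """Stationary random-telegraph input: flip the driven spin with probability p_flip per step."""
    return -s if rng.random() < p_flip else s

def drive_and_relax(spins, beta, p_flip, frozen=(0, 0)):
    """One coarse time step: switch the input spin, then attempt one sweep of thermal updates."""
    spins[frozen] = telegraph_update(spins[frozen], p_flip)
    for _ in range(spins.size):
        glauber_step(spins, beta, frozen=frozen)
```

For small p_flip the input correlation time is roughly τₛ ≈ 1/(2·p_flip) sweeps, so scanning p_flip plays the role of scanning τₛ in the paper.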

Two information‑theoretic measures are employed. The instantaneous mutual information I_inst(S;X)=H(S)−H(S|X) quantifies the static correlation between input and output at a single time point. The information transmission rate I_R is defined as the asymptotic increase of mutual information per unit time for long input‑output trajectories, i.e. I_R = lim_{L→∞} I(S^L;X^L)/L. I_R captures both the accuracy of the mapping and the rate at which independent “messages” are sent, thus accounting for auto‑correlations in the signals that reduce the effective information flow.
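
As a concrete illustration of the first quantity, a simple plug‑in (frequency‑count) estimate of I_inst from paired steady‑state samples of the binary input and output spins could look like the sketch below; the paper's own estimates treat sampling bias more carefully, so this is only schematic.

```python
import numpy as np

def instantaneous_mi(s_samples, x_samples):
    """Plug-in estimate of I_inst(S;X) = H(S) - H(S|X) in bits,
    from paired samples of two binary (+1/-1) variables."""
    s = np.asarray(s_samples)
    x = np.asarray(x_samples)

    def entropy(counts):
        p = counts[counts > 0] / counts.sum()
        return -(p * np.log2(p)).sum()

    # Joint histogram over the four possible (s, x) states.
    joint = np.array([[np.sum((s == a) & (x == b)) for b in (-1, 1)]
                      for a in (-1, 1)], dtype=float)
    H_s = entropy(joint.sum(axis=1))   # H(S)
    H_x = entropy(joint.sum(axis=0))   # H(X)
    H_sx = entropy(joint.ravel())      # H(S, X)
    return H_s + H_x - H_sx            # equals H(S) - H(S|X)
```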

Simulation results reveal several non‑trivial behaviors:

  1. Instantaneous Mutual Information (I_inst).

    • I_inst grows monotonically with τₛ, saturating at a plateau equal to the static mutual information obtained when the input is held fixed indefinitely.
    • For a fixed τₛ, I_inst is non‑monotonic in temperature. At low temperatures the system is sluggish (large response time τ_r) and cannot follow rapid input changes, while at high temperatures thermal noise degrades the mapping. Consequently, there exists an optimal temperature T_opt(τₛ) that maximizes I_inst. This optimal temperature shifts downward as τₛ increases because a slower input gives the system more time to respond.
  2. Information Transmission Rate (I_R).

    • For any given temperature, I_R exhibits a maximum at an intermediate τₛ = τ_opt(T). Short τₛ means the input varies faster than the output can react, inflating the conditional entropy H(S^L|X^L). Very long τₛ reduces the entropy of the input stream H(S^L) because fewer distinct messages are generated per unit time. The trade‑off yields a peak in I_R (the input‑entropy side of this trade‑off is spelled out in the short note after this list).
    • I_R_max (the peak value of I_R for a given temperature) itself is non‑monotonic in temperature. Starting from low T, I_R_max rises as τ_r shortens, reaches a maximum at a temperature T_opt^R, and then declines because excessive thermal noise overwhelms the signal. Thus a second optimal temperature exists, distinct from the one that maximizes I_inst.
  3. Effect of Spatial Distance (d).

    • Increasing d reduces I_R_max, reflecting the attenuation of correlations over longer paths. However, the optimal temperature that maximizes I_R_max moves closer to the critical temperature T_c as d grows. Near criticality the correlation length diverges, partially compensating for the larger separation and allowing more efficient long‑range transmission.
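
The input‑entropy side of the trade‑off in point 2 can be made explicit with a standard result for telegraph signals (stated here for illustration; it is not spelled out in the summary above): if the input flips with probability p per update step, so that τₛ ≈ 1/(2p) steps for small p, its entropy rate is h(S) = −p log₂ p − (1−p) log₂(1−p) bits per step, which decreases towards zero as τₛ grows. Slower inputs therefore inject fewer bits per unit time even though each input state is relayed more reliably, which is precisely the tension that produces the peak in I_R.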

The authors interpret these findings in the context of biological information processing. Many living systems—ranging from bacterial quorum sensing to neural circuits and flocking birds—appear to operate near critical points. The study shows that such proximity can simultaneously provide fast response (short τ_r) and high reliability (low thermal noise) when the system’s operating temperature is tuned slightly above T_c. Moreover, the optimal operating point depends on the temporal statistics of the input and the spatial scale of communication, suggesting that biological networks could adaptively shift their “working point” to match environmental demands.

Methodologically, the paper introduces a practical scheme for estimating I_R in high‑dimensional state spaces: trajectories are sampled at a coarse‑grained time interval Δt, the trajectory length is chosen to exceed both τₛ and τ_r, and the trajectory entropies are computed with the Bayesian estimator of Nemenman et al. This allows extrapolation to the limit Δt → δt (the elementary Glauber step) and yields reliable estimates even for modest system sizes (5×5 and 10×10 lattices).
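
A stripped‑down version of such a trajectory‑based estimate might look as follows; here the NSB (Nemenman–Shafee–Bialek) estimator is replaced by a naive plug‑in count over trajectory “words”, and the function names and parameters are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from collections import Counter

def plugin_entropy(counter, n_total):
    """Naive (plug-in) entropy in bits of an empirical word distribution.
    The paper instead uses the NSB Bayesian estimator to control sampling bias."""
    p = np.array(list(counter.values()), dtype=float) / n_total
    return -(p * np.log2(p)).sum()

def information_rate(s_traj, x_traj, word_len, dt):
    """Crude estimate of I_R = I(S^L; X^L) / (L*dt) from two long, time-aligned
    binary trajectories sampled every dt, using sliding words of word_len symbols."""
    n = len(s_traj) - word_len + 1
    c_s, c_x, c_joint = Counter(), Counter(), Counter()
    for i in range(n):
        w_s = tuple(s_traj[i:i + word_len])
        w_x = tuple(x_traj[i:i + word_len])
        c_s[w_s] += 1
        c_x[w_x] += 1
        c_joint[(w_s, w_x)] += 1
    mi = plugin_entropy(c_s, n) + plugin_entropy(c_x, n) - plugin_entropy(c_joint, n)
    return mi / (word_len * dt)
```

In practice one would increase word_len until the estimate stops growing and repeat the calculation for several Δt to extrapolate towards the elementary update step, mirroring the procedure described above; with finite data the plug‑in counts are biased, which is why a bias‑corrected estimator such as NSB is used in the paper.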

In summary, the work demonstrates that information transmission in a driven Ising medium is optimized not at the critical point itself but slightly above it, with the precise optimum governed by a balance between response speed and noise, and that this balance is modulated by the input’s temporal correlation and the distance over which information must travel. These insights provide a quantitative foundation for the hypothesis that biological systems exploit criticality to achieve efficient, robust communication.

