Transmission of Information in Active Networks

Abstract

Shannon’s Capacity Theorem is a central result of the Theory of Communication. It states that a signal can be transmitted with an arbitrarily small probability of error as long as the rate at which it carries information is smaller than the channel capacity of the physical communication medium. The theorem applies to ideal communication channels, in which the transmitted information does not alter the passive characteristics of a channel that essentially tries to reproduce the source of information. For an “active channel”, a network formed by elements that are dynamical systems (such as neurons, or chaotic or periodic oscillators), it is unclear whether the theorem still applies, since an active channel can adapt to the input signal, altering its capacity. To shed light on this matter, we show, among other results, how to calculate the information capacity of an active communication channel. We then show that the channel capacity depends on whether the active channel is self-excitable or not, and that, contrary to a current belief, desynchronization can provide an environment in which large amounts of information can be transmitted through a self-excitable channel. An interesting example of a self-excitable active channel is a network of electrically coupled Hindmarsh-Rose chaotic neurons.


💡 Research Summary

Shannon’s capacity theorem assumes a passive communication channel whose characteristics remain unchanged by the transmitted signal. In many real‑world systems—neuronal assemblies, coupled oscillators, chaotic lasers—the channel itself is an active dynamical medium that can adapt, reconfigure, or even generate its own activity in response to an input. This paper addresses the fundamental question of whether Shannon’s theorem can be extended to such “active channels” and, if so, how to compute their information‑transfer capacity.

The authors introduce a framework based on Lyapunov exponents. For a network of dynamical units they define two spectra: (i) the full set of Lyapunov exponents Λ, which quantifies the total exponential divergence of nearby trajectories (i.e., the intrinsic chaoticity of the whole system); and (ii) the transversal Lyapunov exponents Λ⊥, which measure how quickly perturbations orthogonal to the synchronization manifold decay (or grow, when synchronization is unstable). The average mutual information rate (MIR) that can be conveyed from a sender to a receiver embedded in the network is then expressed as the difference MIR = Λ – Λ⊥. This quantity naturally reduces to Shannon’s capacity when the channel is linear and passive (Λ⊥ = 0).
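
As a minimal illustration, the MIR formula can be evaluated directly once the two spectra are available. The sketch below assumes, as one common reading not stated explicitly above, that Λ and Λ⊥ denote the sums of the positive exponents of each spectrum; the illustrative spectra in the usage line are invented numbers chosen to match the regime reported later in this summary.

```python
import numpy as np

def mutual_information_rate(lyap_full, lyap_transversal):
    """MIR = Lambda - Lambda_perp.

    Lambda and Lambda_perp are taken here as the sums of the positive
    exponents of each spectrum, in nats per unit time; divide by ln 2
    to obtain bits per unit time.
    """
    lam = np.sum(np.maximum(lyap_full, 0.0))
    lam_perp = np.sum(np.maximum(lyap_transversal, 0.0))
    return lam - lam_perp

# Illustrative spectra only (negative exponents are placeholders):
# Lambda ~ 0.8, Lambda_perp ~ 0.1 gives MIR ~ 0.7 nats per unit time.
print(mutual_information_rate([0.8, -0.3, -2.1], [0.1, -0.9, -2.4]))
```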

A crucial distinction is made between self‑excitable and non‑self‑excitable active channels. A self‑excitable channel possesses internal dynamics (e.g., chaotic oscillations) that generate information even in the absence of external driving; consequently Λ remains positive when the input current is zero. In contrast, a non‑self‑excitable channel collapses to a fixed point or periodic orbit without stimulation, yielding Λ ≈ 0 and a negligible capacity.
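
This criterion (Λ staying positive at zero input) can be probed numerically. Below is a minimal sketch for a single HR unit using the Benettin two-trajectory method, assuming the standard HR parameter values (r = 0.006, s = 4, x₀ = −1.6), which may differ from the paper’s; the distinction in the text concerns the whole network, to which the same estimator applies.

```python
import numpy as np

def hr_rhs(u, I):
    """Hindmarsh-Rose vector field (standard parameters assumed)."""
    x, y, z = u
    return np.array([y + 3*x**2 - x**3 - z + I,
                     1 - 5*x**2 - y,
                     0.006 * (4.0 * (x + 1.6) - z)])

def largest_lyapunov(I, dt=0.005, n=1_000_000, d0=1e-8):
    """Benettin two-trajectory estimate of the largest Lyapunov exponent."""
    u = np.array([0.1, 0.2, 0.3])
    v = u + np.array([d0, 0.0, 0.0])
    acc, n_trans = 0.0, 200_000
    for step in range(n_trans + n):
        u = u + dt * hr_rhs(u, I)
        v = v + dt * hr_rhs(v, I)
        d = np.linalg.norm(v - u)
        if step >= n_trans:            # skip the transient
            acc += np.log(d / d0)
        v = u + (v - u) * (d0 / d)     # renormalize the separation
    return acc / (n * dt)

# A positive exponent at I = 0 would indicate a self-excitable unit;
# for a network, apply the same test to the coupled equations.
print(largest_lyapunov(0.0), largest_lyapunov(3.2))
```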

Contrary to the common belief that synchronization maximizes information flow, the analysis shows that desynchronization can actually increase capacity in self‑excitable networks. When units are partially desynchronized, Λ⊥ becomes small while Λ stays large, so the difference Λ – Λ⊥ grows. In fully synchronized states the system behaves effectively as a single degree of freedom, limiting the achievable MIR. Thus, a controlled amount of desynchronization is beneficial for information transmission.

The theoretical results are illustrated with a concrete example: an electrically coupled network of Hindmarsh-Rose (HR) chaotic neurons. Each HR neuron is a three-dimensional chaotic oscillator driven by an external current I, and the neurons are linked with electrical coupling strength g. By numerically integrating the network, computing the Jacobian, and applying QR-based algorithms to estimate the Lyapunov spectra, the authors map the MIR across the (g, I) parameter plane. They find that moderate coupling (g ≈ 0.05–0.1) combined with a suitable external current (I ≈ 3.5–4.0) yields a partially desynchronized regime where Λ ≈ 0.8 and Λ⊥ ≈ 0.1, giving MIR ≈ 0.7 bits per unit time, significantly higher than in the fully synchronized regime (MIR ≈ 0.2). When I is reduced to zero, the network remains self-excitable (Λ stays positive) but the capacity drops, confirming the importance of intrinsic chaos.
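
A minimal sketch of this computation for the simplest case, two bidirectionally coupled HR neurons, is given below. It assumes standard HR parameters, RK4 integration of the synchronized trajectory, and the usual master-stability reduction in which the parallel tangent dynamics evolve under the HR Jacobian J and the transversal dynamics under J − 2gE, with E = diag(1, 0, 0) because the electrical coupling acts on the x variable. The paper’s exact setup and constants may differ; longer runs improve convergence.

```python
import numpy as np

# Standard Hindmarsh-Rose parameters (illustrative assumption).
R_HR, S_HR, X0 = 0.006, 4.0, -1.6

def hr_rhs(u, I):
    """Vector field of a single HR neuron with injected current I."""
    x, y, z = u
    return np.array([y + 3.0*x**2 - x**3 - z + I,
                     1.0 - 5.0*x**2 - y,
                     R_HR * (S_HR * (x - X0) - z)])

def hr_jac(u):
    """Jacobian of the HR vector field at state u."""
    x = u[0]
    return np.array([[6.0*x - 3.0*x**2, 1.0, -1.0],
                     [-10.0*x,         -1.0,  0.0],
                     [R_HR * S_HR,      0.0, -R_HR]])

def rk4(f, v, h):
    k1 = f(v); k2 = f(v + 0.5*h*k1); k3 = f(v + 0.5*h*k2); k4 = f(v + h*k3)
    return v + (h / 6.0) * (k1 + 2.0*k2 + 2.0*k3 + k4)

def lyapunov_spectra(I=3.2, g=0.1, dt=0.01, n_steps=500_000, n_trans=100_000):
    """Parallel and transversal Lyapunov spectra for two electrically
    coupled HR neurons, evaluated on the synchronization manifold."""
    E = np.diag([1.0, 0.0, 0.0])        # coupling acts on x only
    u = np.array([0.1, 0.2, 0.3])       # arbitrary initial condition
    Q_par, Q_perp = np.eye(3), np.eye(3)
    s_par, s_perp = np.zeros(3), np.zeros(3)
    for step in range(n_trans + n_steps):
        J = hr_jac(u)
        u = rk4(lambda v: hr_rhs(v, I), u, dt)
        # Euler step for the tangent dynamics (adequate for small dt),
        # re-orthonormalized by QR at every step.
        Q_par, Rp = np.linalg.qr(Q_par + dt * (J @ Q_par))
        Q_perp, Rt = np.linalg.qr(Q_perp + dt * ((J - 2.0*g*E) @ Q_perp))
        if step >= n_trans:
            s_par += np.log(np.abs(np.diag(Rp)))
            s_perp += np.log(np.abs(np.diag(Rt)))
    T = n_steps * dt
    return s_par / T, s_perp / T

lam_par, lam_perp = lyapunov_spectra()
mir_bits = (lam_par[0] - lam_perp[0]) / np.log(2)   # nats -> bits
print(lam_par, lam_perp, mir_bits)
```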

The paper also provides a practical recipe for evaluating the capacity of any active channel: (1) formulate the network equations and construct the Jacobian; (2) compute the full Lyapunov spectrum and the spectrum restricted to the synchronization manifold; (3) obtain MIR as Λ – Λ⊥; (4) explore the relevant control parameters to locate the region of maximal capacity. This methodology is applicable to a broad class of systems, including laser arrays, memristive circuits, and synthetic biochemical networks.
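
Steps (2)–(4) of this recipe amount to a parameter scan. The fragment below reuses lyapunov_spectra() from the two-neuron sketch above; the grid ranges are illustrative guesses, and the shortened runs trade accuracy for speed.

```python
import numpy as np

g_grid = np.linspace(0.0, 0.2, 9)     # coupling strengths (assumed range)
I_grid = np.linspace(2.5, 4.5, 9)     # external currents (assumed range)
mir = np.zeros((g_grid.size, I_grid.size))

for i, g in enumerate(g_grid):
    for j, I in enumerate(I_grid):
        lp, lt = lyapunov_spectra(I=I, g=g, n_steps=200_000)
        mir[i, j] = (lp[0] - lt[0]) / np.log(2)   # step 3: MIR in bits/time

# Step 4: locate the region of maximal capacity.
i0, j0 = np.unravel_index(np.argmax(mir), mir.shape)
print(f"max MIR = {mir[i0, j0]:.3f} bits/time "
      f"at g = {g_grid[i0]:.3f}, I = {I_grid[j0]:.2f}")
```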

In conclusion, the authors demonstrate that active channels require an extension of Shannon’s theory that incorporates dynamical instability and synchronization properties. Self‑excitable networks can sustain high information rates, and desynchronization—far from being detrimental—can create a fertile environment for large‑scale information transfer. These insights open new avenues for designing high‑capacity communication architectures based on chaotic or neuromorphic substrates, with potential impact on brain‑computer interfaces, secure chaotic communications, and complex‑system information processing.

