Operator-Based Information Theory for Imaging: Entropy, Capacity, and Irreversibility in Physical Measurement Systems


Imaging systems are commonly described using resolution, contrast, and signal-to-noise ratio, but these quantities do not provide a general account of how physical transformations affect the flow of information. This paper introduces an operator-based formulation of information theory for imaging. The approach models the imaging chain as a composition of bounded operators acting on functions, and characterises information redistribution using the spectral properties of these operators. Three measures are developed. Operator entropy quantifies how an operator distributes energy across its singular spectrum. Operator information capacity describes the number of modes that remain recoverable above a noise-dependent threshold. An irreversibility index measures the information lost through suppression or elimination of modes and captures the accumulation of information loss under operator composition. The framework applies to linear, nonlinear, and stochastic operators and does not depend on the specific imaging modality. Analytical examples show how attenuation, blur, and sampling affect entropy, capacity, and irreversibility in different ways. The results provide a general structure for analysing the physical limits of imaging and form the basis for subsequent work on information geometry, spatiotemporal budgets, nonlinear channels, and reconstruction algorithms.


💡 Research Summary

The paper proposes a novel operator‑based information‑theoretic framework for imaging systems, shifting the focus from traditional performance metrics such as resolution, contrast, and signal‑to‑noise ratio to the intrinsic properties of the physical measurement operators themselves. The authors model an imaging chain as a composition of three operators—source (S), propagation (P), and detector (D)—each acting on a Hilbert space of object functions (e.g., L²(Ω)) or its discrete analogue. When each operator is expressed through its singular‑value decomposition (SVD), O = U Σ V*, the singular values σᵢ directly quantify how much energy from each input mode is transmitted to the output. Modes with σᵢ = 0 are completely eliminated; modes with very small σᵢ are effectively suppressed by noise.
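As a minimal numerical sketch of this setup, the chain can be discretised and the singular spectrum of the composed operator computed directly. The specific S, P, and D below (spatially varying illumination, Gaussian blur, two-to-one sampling) are illustrative stand-ins, not the paper's operators:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64

# Hypothetical discrete stand-ins for the three stages of the chain.
S = np.diag(rng.uniform(0.5, 1.0, n))            # source: spatially varying illumination
x = np.arange(n)
P = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 3.0) ** 2)
P /= P.sum(axis=1, keepdims=True)                # propagation: row-normalised Gaussian blur
D = np.eye(n)[::2]                               # detector: keep every other sample (32 x 64)

O = D @ P @ S                                    # composed measurement operator
sigma = np.linalg.svd(O, compute_uv=False)       # singular spectrum of the whole chain
print(sigma[:5])                                 # largest modes pass most energy
```

Because D projects onto 32 of the 64 object modes, at most 32 singular values exist; the remaining modes are eliminated before any reconstruction can begin.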

Three quantitative measures are introduced:

  1. Operator Entropy (H(O)) – The squared singular values are normalized to λᵢ = σᵢ² / Σσⱼ², yielding a probability‑like distribution over modes. Entropy is defined as H(O) = – Σ λᵢ log λᵢ. This captures the diversity of modes preserved by the operator: a high entropy indicates a uniform spread of information across many spatial or spectral components, while low entropy signals concentration in a few modes or outright loss of others. The definition is scale‑invariant because of the normalization.
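This definition translates directly into a few lines of NumPy; the example below also checks the scale-invariance property stated above (natural log is an assumption, since the summary does not fix a base):

```python
import numpy as np

def operator_entropy(sigma):
    """H(O) = -sum_i lambda_i log lambda_i, with lambda_i = sigma_i^2 / sum_j sigma_j^2."""
    lam = sigma**2 / np.sum(sigma**2)
    lam = lam[lam > 0]                  # convention: 0 * log 0 = 0
    return -np.sum(lam * np.log(lam))

# Scale invariance: a global rescaling of the spectrum leaves H(O) unchanged.
s = np.array([1.0, 0.5, 0.25, 0.1])
print(operator_entropy(s), operator_entropy(3.0 * s))
```

A perfectly uniform spectrum over n modes attains the maximum value log n, matching the interpretation of high entropy as an even spread of information across modes.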

  2. Operator Information Capacity (I_ε(O)) – A noise‑dependent threshold ε is set, and the number of singular values exceeding ε constitutes the effective rank (or ε‑rank). The capacity is essentially the logarithm of this effective rank, representing the number of degrees of freedom that can be stably recovered under bounded noise. This links directly to practical reconstruction limits: even if an operator has many non‑zero singular values, those below ε contribute negligibly to recoverable information.
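A sketch of the capacity computation, under the assumptions that the logarithm is natural and that an empty spectrum is assigned capacity zero (the summary fixes only "logarithm of the effective rank"):

```python
import numpy as np

def information_capacity(sigma, eps):
    """log of the eps-rank: number of singular values above the noise threshold eps."""
    rank = int(np.sum(np.asarray(sigma) > eps))
    return np.log(rank) if rank > 0 else 0.0  # convention when no mode survives

sigma = np.array([1.0, 0.3, 0.05, 0.01])      # toy spectrum: 4 non-zero modes
print(information_capacity(sigma, eps=0.1))   # only 2 modes exceed the noise floor
```

The toy spectrum illustrates the point made above: all four singular values are non-zero, but only the two above ε contribute to recoverable information.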

  3. Irreversibility Index (ℐ_ε(O)) – Using a numerical tolerance δ to define “numerically zero” singular values, the index measures the fraction of object‑space modes that are either eliminated (σᵢ ≤ δ) or suppressed below the recoverability threshold (δ < σᵢ < ε). When multiple operators are composed, the irreversibility index accumulates, providing a quantitative description of permanent information loss throughout the imaging pipeline.
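One plausible discretisation of the index follows; the exact counting convention (strict vs. non-strict inequalities, and counting modes that never reach the data as eliminated) is an assumption:

```python
import numpy as np

def irreversibility_index(sigma, n_object, eps, delta):
    """Fraction of the n_object input modes that are eliminated (sigma <= delta)
    or suppressed below recoverability (delta < sigma < eps)."""
    sigma = np.asarray(sigma)
    eliminated = n_object - int(np.sum(sigma > delta))   # includes modes with no singular value at all
    suppressed = int(np.sum((sigma > delta) & (sigma < eps)))
    return (eliminated + suppressed) / n_object

# Two-to-one sampling of 64 object modes: half the modes never reach the data.
sigma_sampling = np.ones(32)                  # the surviving modes pass unattenuated
print(irreversibility_index(sigma_sampling, 64, eps=0.1, delta=1e-12))
```

For the sampling example the index is 0.5: the 32 discarded modes are permanently lost regardless of any downstream processing, which is exactly the kind of loss the index is meant to isolate.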

The framework accommodates three broad classes of operators common in imaging:

  • Linear, non‑unitary operators (attenuation, convolutional blur, sampling) whose singular spectra are directly accessible.
  • Non‑linear operators (dose‑dependent sample response, non‑linear detector behavior) that can be locally linearized via Jacobians, allowing approximate singular‑value analysis.
  • Stochastic operators (noise processes) modeled as conditional expectation operators, which act as contractions on the space of possible signals.
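The local-linearisation idea for the nonlinear class can be sketched with a hypothetical pointwise saturating detector f(x) = x / (1 + x) (this response model is illustrative, not from the paper). Its Jacobian at an operating point x₀ is diagonal, so the local singular values are simply |f′(x₀ᵢ)|:

```python
import numpy as np

def local_singular_values(x0):
    """Singular values of the Jacobian of f(x) = x / (1 + x), applied pointwise."""
    fprime = 1.0 / (1.0 + x0) ** 2       # f'(x) for the saturating response
    return np.sort(np.abs(fprime))[::-1]

x0 = np.linspace(0.0, 4.0, 8)            # operating points across the dynamic range
sigma_local = local_singular_values(x0)
print(sigma_local)                       # deep-saturation modes have small sigma
```

Modes probed in the saturated regime acquire small local singular values, so they are the first to fall below ε: the entropy/capacity machinery applies locally even though the operator itself is nonlinear.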

The authors relate operator entropy to classical Shannon entropy for linear channels: for a convolution operator, the singular values correspond to the magnitude of the transfer function, and the entropy reduces to a frequency‑domain spread measure. They also discuss the effective‑rank literature, showing that exp(H(O)) equals the “effective number of modes” often used in signal processing.
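For circular convolution this correspondence is easy to verify numerically: the SVD modes are Fourier modes, so the singular values are the transfer-function magnitudes read off the DFT of the kernel, and exp(H) gives the effective mode count (the 4-tap kernel below is illustrative):

```python
import numpy as np

n = 32
h = np.zeros(n)
h[:4] = 0.25                             # 4-tap moving-average blur kernel
sigma = np.abs(np.fft.fft(h))            # singular spectrum = |H(f)| for circular convolution

lam = sigma**2 / np.sum(sigma**2)
lam = lam[lam > 1e-15]                   # drop exact nulls of the transfer function
H = -np.sum(lam * np.log(lam))
n_eff = np.exp(H)
print(n_eff)                             # effective number of modes, between 1 and n
```

The blur passes all n frequencies in principle, but exp(H) is well below n because the transfer-function magnitudes decay away from DC, concentrating the energy distribution in a few modes.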

Analytical examples illustrate how specific physical transformations affect the three measures:

  • Uniform attenuation scales all σᵢ equally, leaving entropy unchanged but reducing capacity as more modes fall below ε.
  • Spatially varying attenuation alters the relative λᵢ, potentially changing entropy.
  • Blur suppresses high‑frequency σᵢ, lowering both entropy and capacity.
  • Sampling projects onto a lower‑dimensional subspace, sharply reducing entropy and capacity while raising irreversibility.
  • Scattering can redistribute energy across modes (increasing entropy) or eliminate modes (decreasing entropy), depending on the scattering kernel.
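The contrast between the first three cases can be checked on a toy spectrum (the spectrum, attenuation factor, and blur profile below are illustrative choices, not taken from the paper):

```python
import numpy as np

def entropy(sigma):
    lam = sigma**2 / np.sum(sigma**2)
    lam = lam[lam > 0]
    return -np.sum(lam * np.log(lam))

def eps_rank(sigma, eps):
    return int(np.sum(sigma > eps))

sigma = np.linspace(1.0, 0.05, 20)               # toy singular spectrum
eps = 0.1

attenuated = 0.2 * sigma                          # uniform attenuation scales all modes
blurred = sigma * np.exp(-np.arange(20) / 4.0)    # blur-like suppression of high modes

print(entropy(sigma), entropy(attenuated), entropy(blurred))
print(eps_rank(sigma, eps), eps_rank(attenuated, eps), eps_rank(blurred, eps))
```

As claimed above, uniform attenuation leaves the entropy exactly unchanged (scale invariance) while pushing modes below ε and shrinking the ε-rank, whereas the blur-like spectrum loses both entropy and capacity.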

The paper emphasizes that these operator‑level quantities are independent of any assumed signal priors, reconstruction algorithms, or downstream decision tasks. Consequently, they provide a universal benchmark for the physical limits of any imaging modality that can be expressed as an operator chain, whether X‑ray, optical, ultrasound, or emerging quantum‑based systems.

In the discussion, the authors argue that traditional data‑centric metrics cannot distinguish between information that is fundamentally unrecoverable (due to operator‑induced mode loss) and information that is merely difficult to reconstruct. The operator‑based approach isolates the former, enabling system designers to identify and mitigate irreversible steps (e.g., by optimizing illumination patterns, reducing blur, or employing compressive sampling strategies that preserve more singular values).

Future work outlined includes extending the framework to information geometry (defining distances between operators), constructing spatiotemporal information budgets for dynamic imaging, analyzing fully non‑linear channels, and integrating the measures into regularization and reconstruction algorithm design.

Overall, the paper delivers a rigorous, modality‑agnostic theory that quantifies how physical transformations redistribute, compress, or destroy information, offering a new lens through which to assess, compare, and ultimately improve imaging systems.

