C-AoEI-Aware Cross-Layer Optimization in Satellite IoT Systems: Balancing Data Freshness and Transmission Efficiency

Satellite-based Internet of Things (S-IoT) faces a fundamental trilemma: propagation delay, dynamic fading, and bandwidth scarcity. While Layer-coded Hybrid ARQ (L-HARQ) enhances reliability, its backtracking decoding introduces age ambiguity, undermining the standard Age of Information (AoI) metric and obscuring the critical trade-off between data freshness and transmission efficiency. To bridge this gap, we propose a novel cross-layer optimization framework centered on a new metric, the Cross-layer Age of Error Information (C-AoEI). We derive a closed-form expression for C-AoEI that explicitly links freshness to system parameters, establishing an analytical connection between freshness degradation and channel dynamics. Building on this, we develop a packet-level encoded L-HARQ scheme for multi-GBS scenarios and an adaptive algorithm that jointly optimizes coding and decision thresholds. Extensive simulations demonstrate the effectiveness of our proposed framework: it achieves 31.8% higher transmission efficiency and 17.2% lower C-AoEI than conventional schemes. The framework also proves robust against inter-cell interference and varying channel conditions, providing a foundation for designing efficient, latency-aware next-generation S-IoT protocols.


💡 Research Summary

The paper tackles the fundamental "trilemma" of satellite-based Internet of Things (S-IoT) systems (long propagation delay, dynamic fading, and scarce bandwidth) by introducing a cross-layer performance metric that captures the peculiarities of Layer-coded Hybrid ARQ (L-HARQ). Conventional Age of Information (AoI) assumes that the reception timestamp directly reflects the freshest data, an assumption that breaks down in L-HARQ because successful decoding can occur either at the forward-decoding stage or later during back-tracking decoding. This creates an "age ambiguity" that masks the true freshness of information and leads to sub-optimal design choices when reliability and efficiency are prioritized.

To resolve this, the authors propose the Cross-layer Age of Error Information (C-AoEI), defined as the elapsed time since the most recent successful back-tracking decoding update. By explicitly incorporating the back-tracking decoding delay, C-AoEI gives a physically meaningful measure of how valid the timeline of received data remains under error propagation across layers. The paper derives a closed-form expression for C-AoEI under a shadowed-Rician fading model, linking it to system parameters such as the propagation delay (τ_prop), ACK delay (τ_ACK), maximum number of HARQ cycles (K), transmit power (P_t), packet length (N), average SNR, Nakagami-m fading parameter, and the mixing ratio ρ of new versus retransmitted packets. This analytical relationship enables immediate quantification of how parameter variations affect freshness.
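The closed-form expression itself is not reproduced in this summary, so the sketch below only illustrates the metric's definition quoted above: an event-driven trace of the time elapsed since the most recent successful back-tracking decoding update. The `BacktrackUpdate` record, the sampling routine, and the choice to measure age from the generation timestamp (the usual AoI convention) are assumptions made for illustration, not the authors' model.

```python
from dataclasses import dataclass

@dataclass
class BacktrackUpdate:
    """One successful back-tracking decoding update (hypothetical record)."""
    t_generated: float   # generation time of the status update (s)
    t_decoded: float     # time at which back-tracking decoding succeeded (s)

def c_aoei_samples(updates, t_end, dt=1e-3):
    """Sample an illustrative C-AoEI curve over [0, t_end].

    Age is measured from the generation time of the most recent update whose
    back-tracking decoding has completed by the sampling instant; using the
    generation timestamp is an assumed convention, not taken from the paper.
    """
    updates = sorted(updates, key=lambda u: u.t_decoded)
    samples, last_gen, i, t = [], None, 0, 0.0
    while t <= t_end:
        # Advance to the latest update decoded by time t.
        while i < len(updates) and updates[i].t_decoded <= t:
            last_gen = updates[i].t_generated
            i += 1
        samples.append((t, float("inf") if last_gen is None else t - last_gen))
        t += dt
    return samples

# Example: two updates decoded at 5 ms and 12 ms (values are illustrative only).
trace = c_aoei_samples(
    [BacktrackUpdate(0.001, 0.005), BacktrackUpdate(0.008, 0.012)], t_end=0.02
)
```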

Building on the metric, the authors design a packet‑level encoded L‑HARQ scheme tailored for multi‑ground‑base‑station (GBS) deployments. Unlike conventional block‑wise coding, each sub‑packet can be assigned a variable coding rate and a dynamic mixing ratio ρ, allowing selective retransmission of only the erroneous portions of previously failed packets while simultaneously injecting fresh status updates. The depth of back‑tracking decoding (Θ_z) is adaptively controlled, which reduces the average number of retransmissions and shortens the effective age of information.
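As a rough illustration of how a transmission slot might mix fresh status updates with selective retransmissions under a mixing ratio ρ, the sketch below assumes a fixed number of sub-packet positions per slot and interprets ρ as the fraction of positions carrying new data. The function name, slot layout, and rounding rule are illustrative assumptions, not the paper's packet-level encoder.

```python
import math

def build_slot(failed_subpackets, fresh_updates, n_subpackets, rho):
    """Fill one transmission slot with retransmitted and new sub-packets.

    rho is taken here as the fraction of sub-packets carrying fresh updates
    (an assumed convention); the remainder is budgeted for selective
    retransmission of previously erroneous sub-packets.
    """
    n_retx = min(len(failed_subpackets), math.floor((1.0 - rho) * n_subpackets))
    n_new = n_subpackets - n_retx

    slot = []
    # Selectively retransmit only the erroneous portions of earlier packets.
    slot.extend(("RETX", sp) for sp in failed_subpackets[:n_retx])
    # Fill the remaining positions with fresh status updates.
    slot.extend(("NEW", up) for up in fresh_updates[:n_new])
    return slot

# Example: 8 sub-packets per slot, rho = 0.75 -> at most 2 retransmissions.
print(build_slot(["e1", "e2", "e3"], ["u1", "u2", "u3", "u4", "u5", "u6"], 8, 0.75))
```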

The core of the solution is an adaptive algorithm that jointly optimizes coding parameters (e.g., code rate, ρ) and decoding thresholds (e.g., SINR target, maximum HARQ rounds K). The algorithm uses real-time channel state information to solve a multi-objective optimization problem that balances transmission efficiency η against C-AoEI through a weighted-sum objective. A Lagrangian multiplier method yields closed-form updates, and the algorithm also incorporates coordinated power control to mitigate inter-cell interference among multiple GBSs.
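The paper's closed-form Lagrangian updates are not reproduced here; the sketch below only shows the shape of a weighted-sum trade-off evaluated over a discrete parameter grid. The callables `eta_model` and `aoei_model`, the sign convention (efficiency treated as a reward, C-AoEI as a cost), and the searched parameter sets are all assumptions for illustration.

```python
import itertools

def joint_optimize(eta_model, aoei_model, weight, code_rates, rhos, k_values):
    """Brute-force sweep over (code rate, rho, K) minimizing a weighted-sum cost.

    eta_model(r, rho, k) and aoei_model(r, rho, k) are assumed callables that
    return transmission efficiency and C-AoEI for a candidate configuration;
    the paper instead derives closed-form updates, which are not shown here.
    """
    best, best_cost = None, float("inf")
    for r, rho, k in itertools.product(code_rates, rhos, k_values):
        cost = weight * aoei_model(r, rho, k) - (1.0 - weight) * eta_model(r, rho, k)
        if cost < best_cost:
            best, best_cost = (r, rho, k), cost
    return best, best_cost
```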

Simulation studies consider a Low‑Earth‑Orbit (LEO) satellite with a typical round‑trip time of ~3.3 ms, shadowed‑Rician fading, and four simultaneously served GBSs. Results show that the proposed framework achieves a 31.8 % increase in average transmission efficiency and a 17.2 % reduction in C‑AoEI compared with conventional AoI‑based designs. The performance remains robust across a wide SNR range (5–15 dB) and under inter‑cell interference levels up to 10 dB, confirming that the metric and the adaptive scheme stay within theoretical bounds even in harsh channel conditions.
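For quick reference, the headline simulation parameters reported above can be gathered into a single configuration sketch; parameters not stated in the summary are deliberately omitted rather than guessed.

```python
# Illustrative collection of the simulation parameters stated above;
# anything not mentioned in the summary is intentionally left out.
SIM_CONFIG = {
    "round_trip_time_s": 3.3e-3,           # LEO round-trip time (~3.3 ms)
    "fading_model": "shadowed-Rician",
    "num_gbs": 4,                          # simultaneously served GBSs
    "snr_range_db": (5, 15),
    "intercell_interference_max_db": 10,
}
```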

The authors argue that C-AoEI is especially valuable for mission-critical S-IoT applications such as maritime distress signaling, wildfire monitoring, and Arctic infrastructure inspection, where timely and reliable data delivery is paramount. Moreover, the proposed scheme is compatible with existing HARQ mechanisms and the 5G NR standard, facilitating straightforward integration into next-generation satellite-IoT protocols.

In summary, the paper makes three key contributions: (1) a novel cross‑layer freshness metric (C‑AoEI) with a closed‑form analytical model; (2) a packet‑level encoded L‑HARQ architecture that leverages prior decoding information across layers to reduce error probability; and (3) an adaptive joint optimization algorithm that balances freshness and efficiency while handling inter‑cell interference. The comprehensive evaluation demonstrates that C‑AoEI‑aware cross‑layer optimization can simultaneously improve data freshness, reliability, and spectral efficiency, offering a solid foundation for future satellite‑IoT system design.

