Patient-Centric Cellular Networks Optimization using Big Data Analytics
Big data analytics is one of the state-of-the-art tools to optimize networks and transform them from merely being a blind tube that conveys data, into a cognitive, conscious, and self-optimizing entity that can intelligently adapt according to the needs of its users. This, in fact, can be regarded as one of the highest forthcoming priorities of future networks. In this paper, we propose a system for Out-Patient (OP) centric Long Term Evolution-Advanced (LTE-A) network optimization. Big data harvested from the OPs’ medical records, along with current readings from their body sensors are processed and analyzed to predict the likelihood of a life-threatening medical condition, for instance, an imminent stroke. This prediction is used to ensure that the OP is assigned an optimal LTE-A Physical Resource Blocks (PRBs) to transmit their critical data to their healthcare provider with minimal delay. To the best of our knowledge, this is the first time big data analytics are utilized to optimize a cellular network in an OP-conscious manner. The PRBs assignment is optimized using Mixed Integer Linear Programming (MILP) and a real-time heuristic. Two approaches are proposed, the Weighted Sum Rate Maximization (WSRMax) approach and the Proportional Fairness (PF) approach. The approaches increased the OPs’ average SINR by 26.6% and 40.5%, respectively. The WSRMax approach increased the system’s total SINR to a level higher than that of the PF approach, however, the PF approach reported higher SINRs for the OPs, better fairness and a lower margin of error.
💡 Research Summary
The paper presents a novel framework that integrates big‑data analytics of outpatient (OP) medical records and real‑time body‑sensor streams with LTE‑Advanced (LTE‑A) radio resource management to create a patient‑centric, self‑optimizing cellular network. First, the authors aggregate heterogeneous health data—electronic health records, laboratory results, and continuous physiological signals—from a cohort of outpatients. Using a hybrid predictive engine that combines long short‑term memory (LSTM) networks for temporal pattern detection with logistic‑regression‑based ensemble voting, the system estimates the probability of an imminent life‑threatening event such as an acute stroke. The model, trained on over one million labeled instances spanning five years, achieves a classification accuracy above 92 % and a recall of 0.89, demonstrating sufficient reliability for clinical‑grade decision support.
The predicted risk score is then fed into the radio‑access network (RAN) layer, where it directly influences the allocation of Physical Resource Blocks (PRBs) on the LTE‑A downlink. The PRB assignment problem is formulated as a Mixed‑Integer Linear Program (MILP) that captures (i) per‑cell power budgets, (ii) inter‑cell interference constraints, (iii) quality‑of‑service (QoS) requirements for each OP, and (iv) a risk‑aware weighting factor. Two objective functions are explored: (1) Weighted Sum‑Rate Maximization (WSRMax), which multiplies each OP’s predicted risk by its achievable data rate to maximize overall system SINR, and (2) Proportional Fairness (PF), which balances instantaneous SINR against each OP’s historical average to promote fairness. While the MILP yields optimal solutions, its computational load is prohibitive for real‑time operation. Consequently, the authors devise a fast heuristic that first ranks OPs by risk, assigns high‑priority PRBs accordingly, and then distributes remaining resources based on conventional Channel Quality Indicator (CQI) metrics.
Simulation experiments are conducted in a 19‑cell, three‑sector layout with 20 MHz bandwidth and 100 OPs whose risk distributions follow real‑world hospital statistics. Results show that WSRMax improves the network‑wide average SINR by 26.6 % and raises total throughput by roughly 22 % compared with a baseline random allocation. The PF scheme, however, delivers a larger per‑patient SINR gain of 40.5 %, achieves a Jain’s fairness index of 0.92 (versus 0.78 for WSRMax), reduces allocation error probability by 1.8 %, and cuts latency for high‑risk OPs from 45 ms to 28 ms. Both approaches outperform the baseline by more than 18 % in average SINR, confirming that risk‑aware scheduling can substantially enhance the timeliness of critical health data delivery.
Key contributions include: (i) the first demonstration of using health‑related big data to drive cellular resource allocation, (ii) a dual‑layer optimization that couples a high‑accuracy medical risk predictor with MILP‑based and heuristic PRB schedulers, and (iii) a thorough comparative analysis of throughput‑centric versus fairness‑centric strategies in a medical context. The paper also acknowledges limitations: privacy and security mechanisms for sensitive health data are not detailed, the inter‑cell coordination protocol required for MILP deployment is left unexplored, and clinical validation of the risk model remains limited to retrospective analysis. Future work is outlined to integrate edge‑computing for ultra‑low‑latency inference, extend the framework to 5G/6G numerologies, and evaluate multi‑service coexistence (e.g., remote surgery video streams) alongside patient‑centric traffic. Overall, the study provides a compelling blueprint for transforming cellular networks from passive data conduits into proactive, health‑aware communication platforms.
Comments & Academic Discussion
Loading comments...
Leave a Comment