Architectural Tactics for Big Data Cybersecurity Analytic Systems: A Review
Context: Big Data Cybersecurity Analytics aims to protect networks, computers, and data from unauthorized access by analysing security event data with big data tools and technologies. Although a plethora of Big Data Cybersecurity Analytic Systems have been reported in the literature, the literature lacks a systematic and comprehensive review from an architectural perspective.
Objective: This paper reports a systematic review aimed at identifying the most frequently reported quality attributes and architectural tactics for Big Data Cybersecurity Analytic Systems.
Method: We used the Systematic Literature Review (SLR) method to review 74 primary studies selected using well-defined criteria.
Results: Our findings are twofold: (i) identification of the 12 most frequently reported quality attributes and the justification for their significance for Big Data Cybersecurity Analytic Systems; and (ii) identification and codification of 17 architectural tactics for addressing the quality attributes commonly associated with Big Data Cybersecurity Analytic Systems. The identified tactics comprise six performance tactics, four accuracy tactics, two scalability tactics, three reliability tactics, and one security and one usability tactic.
Conclusion: Our findings reveal that (a) despite the significance of interoperability, modifiability, adaptability, generality, stealthiness, and privacy assurance, these quality attributes lack explicit architectural support in the literature; (b) empirical investigation is required to evaluate the impact of the codified architectural tactics; (c) considerable research effort should be invested in exploring the trade-offs and dependencies among the identified tactics; and (d) there is a general lack of effective collaboration between academia and industry in support of the field of Big Data Cybersecurity Analytic Systems.
💡 Research Summary
The paper presents a systematic literature review (SLR) of architectural tactics used in Big Data Cybersecurity Analytic Systems (BDCAS). Recognizing that the rapid growth of security event data—logs, network flows, threat intelligence—requires specialized big‑data platforms, the authors set out to map the non‑functional quality attributes (QAs) most frequently emphasized in the literature and to identify concrete architectural tactics (ATs) that address those attributes.
Methodology
A comprehensive search was performed across IEEE Xplore, ACM Digital Library, Scopus, and Web of Science for the period 2010‑2023 using keywords such as “big data”, “cybersecurity”, “analytics”, and “architecture”. After title/abstract screening, relevance filtering, and full‑text assessment based on predefined inclusion criteria (focus on security analytics, use of distributed processing frameworks, empirical evaluation), 74 primary studies were retained. The authors extracted reported QAs, classified them, and then coded any described design decisions as architectural tactics.
Findings – Quality Attributes
Twelve QAs emerged as the most recurrent: performance, accuracy, scalability, reliability, security, usability, interoperability, modifiability, adaptability, generality, stealthiness, and privacy assurance. The first six align with classic system concerns, while the latter six are often mentioned only in passing and lack systematic treatment.
Findings – Architectural Tactics
Seventeen distinct tactics were codified and grouped as follows:
Performance (6) – data partitioning/sharding, in‑memory caching (e.g., Redis), pipeline parallelism, asynchronous I/O, compression‑based transport, GPU acceleration.
Accuracy (4) – multi‑model ensembles, feedback‑driven labeling (active learning), statistical profiling for anomaly detection, and precision‑recall balancing.
Scalability (2) – auto‑scaling in cloud environments, micro‑service decomposition with container orchestration (Kubernetes).
Reliability (3) – checkpointing and fault‑recovery, data replication across geographic zones, health‑checks with circuit‑breaker patterns.
Security (1) – end‑to‑end encryption and fine‑grained access control (RBAC/ABAC).
Usability (1) – intuitive dashboards, customizable alerts, and reporting interfaces.
Each tactic is described with its intended QA impact, typical implementation mechanisms, and known trade‑offs (e.g., encryption improves confidentiality but adds latency; aggressive partitioning boosts throughput but complicates consistency).
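To make the partitioning tactic and its trade-off concrete, here is a minimal sketch (not taken from the paper; the key choice, partition count, and event fields are illustrative assumptions). Hash-based sharding routes events with the same key to the same partition, which keeps per-source state local to one worker at the cost of coordination when keys must later be joined across partitions:

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Map an event key (e.g., a source IP) to a stable partition index."""
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions

# Events from the same source always land on the same partition, so
# per-source state (e.g., flow counters) stays local to one worker.
partitions = [partition_for(ip, 8) for ip in ("10.0.0.1", "10.0.0.2", "10.0.0.1")]
```

Because the mapping is deterministic, any worker can recompute a key's partition without a central lookup table; the consistency cost appears only when analyses span multiple keys.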
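The circuit-breaker reliability tactic can likewise be sketched in a few lines (a simplified illustration, not the paper's implementation; the failure threshold and timeout values are arbitrary assumptions). After repeated consecutive failures the breaker "opens" and rejects calls immediately, protecting the pipeline from a failing downstream component until a reset timeout allows a trial call:

```python
import time

class CircuitBreaker:
    """Minimal circuit breaker: open after `max_failures` consecutive
    failures, then allow a trial call after `reset_timeout` seconds."""

    def __init__(self, max_failures: int = 3, reset_timeout: float = 30.0):
        self.max_failures = max_failures
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # monotonic timestamp when the breaker opened

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: skipping call")
            self.opened_at = None  # half-open: permit one trial call
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0  # success closes the breaker again
        return result
```

Combined with periodic health checks, this pattern converts a slow, repeatedly failing dependency into a fast, explicit error that upstream stages can handle gracefully.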
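Finally, the multi-model ensemble accuracy tactic reduces to a voting scheme over independent detectors. The sketch below is a hypothetical illustration (the detectors, event fields, and tie-breaking rule are assumptions, not drawn from the surveyed studies); it also shows the trade-off noted above, since every added detector multiplies per-event compute cost:

```python
from collections import Counter

def ensemble_predict(classifiers, event) -> str:
    """Majority vote across independent detectors; ties are resolved
    conservatively by flagging the event as suspicious."""
    votes = Counter(clf(event) for clf in classifiers)
    return "suspicious" if votes["suspicious"] >= votes["benign"] else "benign"

# Three toy rule-based detectors over a simple event dictionary.
detectors = [
    lambda e: "suspicious" if e["bytes"] > 1_000_000 else "benign",
    lambda e: "suspicious" if e["port"] in {23, 445} else "benign",
    lambda e: "suspicious" if e["failed_logins"] > 5 else "benign",
]
verdict = ensemble_predict(detectors, {"bytes": 2_000_000, "port": 445, "failed_logins": 0})
```

Here two of three detectors fire, so the ensemble flags the event even though the third disagrees; in practice the voters would be trained models rather than hand-written rules.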
Discussion
A critical insight is the mismatch between the importance of certain QAs (interoperability, modifiability, adaptability, generality, stealthiness, privacy) and the scarcity of explicit tactics supporting them. This suggests that current research focuses heavily on performance‑centric solutions while neglecting broader system‑level concerns. Moreover, the review highlights a lack of empirical evaluation of the identified tactics; most primary studies present prototype implementations without rigorous benchmarking or statistical validation. The authors argue that systematic performance and security benchmarking, possibly through a shared evaluation framework, is essential to quantify the benefits and costs of each tactic.
The paper also examines inter‑tactic dependencies. For example, data partitioning (performance) may increase replication overhead (reliability), and multi‑model ensembles (accuracy) can raise computational resource consumption, affecting scalability. Understanding these interactions is crucial for architects who must balance competing objectives.
Finally, the authors note a pronounced gap between academia and industry. Real‑world BDCAS deployments handle petabytes of streaming data, strict regulatory constraints, and continuous threat evolution—contexts that are under‑represented in the surveyed literature. They call for collaborative pilot projects, open datasets, and shared best‑practice repositories to bridge this divide.
Conclusion and Future Work
The review delivers a taxonomy of quality attributes and a catalog of 17 architectural tactics, providing a reference point for designers of big‑data security analytics platforms. It identifies four research directions: (1) development of tactics for the under‑addressed QAs, (2) systematic study of trade‑offs and dependencies among tactics, (3) empirical validation through benchmark suites, and (4) fostering academia‑industry partnerships to test tactics in production environments. By highlighting both the strengths and the blind spots of current research, the paper aims to guide future efforts toward more robust, scalable, and trustworthy BDCAS architectures.