A Comprehensive Survey on Fog Computing: State-of-the-art and Research Challenges
📝 Abstract
Cloud computing with its three key facets (i.e., IaaS, PaaS, and SaaS) and its inherent advantages (e.g., elasticity and scalability) still faces several challenges. The distance between the cloud and the end devices might be an issue for latency-sensitive applications such as disaster management and content delivery applications. Service Level Agreements (SLAs) may also impose processing at locations where the cloud provider does not have data centers. Fog computing is a novel paradigm to address such issues. It enables provisioning resources and services outside the cloud, at the edge of the network, closer to end devices or eventually, at locations stipulated by SLAs. Fog computing is not a substitute for cloud computing but a powerful complement. It enables processing at the edge while still offering the possibility to interact with the cloud. This article presents a comprehensive survey on fog computing. It critically reviews the state of the art in the light of a concise set of evaluation criteria. We cover both the architectures and the algorithms that make fog systems. Challenges and research directions are also introduced. In addition, the lessons learned are reviewed and the prospects are discussed in terms of the key role fog is likely to play in emerging technologies such as Tactile Internet.
📄 Content
This paper has been accepted for publication in IEEE Communications Surveys & Tutorials. The content is final but has NOT been proof-read. This is an author copy for personal record only.
A Comprehensive Survey on Fog Computing: State-of-the-art and Research Challenges
Index Terms — Cloud Computing, Edge Computing, Fog Computing, Internet of Things (IoT), Latency, Tactile Internet.
I. INTRODUCTION
Over the years, computing paradigms have evolved from distributed, parallel, and grid computing to cloud computing. Cloud computing [1][2] comes with several inherent capabilities, such as scalability, on-demand resource allocation, reduced management effort, a flexible pricing model (pay-as-you-go), and easy provisioning of applications and services. It comprises three key service models: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). IaaS provides virtualized resources such as compute, storage, and networking. PaaS provides software environments for the development, deployment, and management of applications. SaaS provides software applications and composite services to end-users and other applications.
Nowadays, cloud computing is widely used. However, it still has some limitations. The fundamental limitation is the connectivity between the cloud and the end devices. Such connectivity is established over the Internet, which is not suitable for a large class of cloud-based applications, such as latency-sensitive ones [3]. Well-known examples include connected vehicles [4], fire detection and firefighting [5], smart grid [4], and content delivery [6]. Furthermore, cloud-based applications are often distributed and made up of multiple components [7]. Consequently, it is not uncommon to deploy application components separately over multiple clouds (e.g., [8] and [9]). This may worsen latency due to the overhead induced by inter-cloud communications. As yet another limitation, regulations may prescribe processing at locations where the cloud provider has no data center [10].
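To make the inter-cloud overhead concrete, the following is a minimal illustrative model (not taken from the survey; the function name and all figures are hypothetical assumptions): end-to-end latency of a component pipeline is the sum of per-component latencies plus one extra round-trip per inter-cloud hop, so splitting a pipeline across clouds can easily double its response time.

```python
# Illustrative model (hypothetical numbers): splitting an application's
# components across multiple clouds adds an inter-cloud round-trip
# between each pair of components hosted in different clouds.
def end_to_end_latency_ms(component_latencies_ms, inter_cloud_rtts_ms):
    """Total latency = per-component processing latencies
    + overhead of every inter-cloud hop between components."""
    return sum(component_latencies_ms) + sum(inter_cloud_rtts_ms)

# Same three components, deployed in one cloud vs. across three clouds:
single_cloud = end_to_end_latency_ms([20, 15, 10], [])        # no inter-cloud hops
multi_cloud = end_to_end_latency_ms([20, 15, 10], [30, 25])   # two inter-cloud hops
```

Under these assumed figures, the multi-cloud deployment (100 ms) is more than twice as slow as the single-cloud one (45 ms), which is the effect the paragraph above describes.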
Fog computing [11] is a computing paradigm introduced to tackle these challenges. It is now being promoted by the OpenFog Consortium, which has recently published a few white papers (e.g., [12]). Fog is "cloud closer to ground": a novel architecture that extends the traditional cloud computing architecture to the edge of the network. With fog, the processing of some application components (e.g., latency-sensitive ones) can take place at the edge of the network, while that of others (e.g., delay-tolerant and computationally intensive components) can happen in the cloud. Compute, storage, and networking services are the building blocks of both the cloud and the fog that extends it. However, the fog provides additional advantages, such as low latency, by allowing processing to take place at the network edge, near the end devices, on so-called fog nodes, as well as the ability to enable processing at specific locations. It also offers densely distributed points for gathering data generated by the end devices, through proxies, access points, and routers positioned at the network edge, near the sources. In the literature (e.g., [11][13]), it is widely acknowledged that cloud computing is not viable for most Internet of Things (IoT) applications and that fog could be used as an alternative. However, it is important to note that the applicability of fog goes beyond IoT and includes areas such as content delivery, as shown later in this paper.
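The fog/cloud split described above can be sketched as a naive placement rule. This is a hypothetical illustration, not an algorithm from the survey; the function, field names, and the 50 ms threshold are all assumptions made for the example.

```python
# Hypothetical sketch: place latency-sensitive components on a fog node
# at the network edge, and delay-tolerant ones in the cloud.
# The latency budget and component fields are illustrative assumptions.
def place(components, latency_budget_ms=50):
    placement = {}
    for c in components:
        if c["max_latency_ms"] <= latency_budget_ms:
            # tight deadline: run near the end devices
            placement[c["name"]] = "fog"
        else:
            # delay-tolerant (often compute-heavy): run in the cloud
            placement[c["name"]] = "cloud"
    return placement

app = [
    {"name": "sensor-filter", "max_latency_ms": 10},
    {"name": "batch-analytics", "max_latency_ms": 5000},
]
# place(app) -> {"sensor-filter": "fog", "batch-analytics": "cloud"}
```

Real fog placement algorithms, surveyed later in the paper, also weigh resource capacity, cost, and mobility rather than a single latency threshold.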
Several surveys and tutorials related to fog computing