“…Based on the issues fog systems were supposed to solve, early research focused on fog systems that aimed to keep latency to a minimum [21,31], increase bandwidth to communicate large amounts of data [32], and improve energy efficiency to preserve edge devices [33], among others. As further requirements became critical for modern networks (such as security and privacy in healthcare [14], reliability in mobility [40], or interoperability in IoT [41]), a growing body of research on fog systems expanded the set of potential characteristics to focus on. Contemporary articles from quadrant A can be organized into three subcategories: first, research examining use-case- or industry-specific fog systems mostly focuses on discussing and improving the characteristics most relevant to their application (e.g., [14,42]).…”
Fog computing adds decentralized computing, storage, and networking capabilities with dedicated nodes as an intermediate layer between cloud data centers and edge devices to solve latency, bandwidth, and resilience issues. However, introducing a fog layer imposes new system design challenges. Fog systems not only exhibit a multitude of key system characteristics (e.g., security, resilience, interoperability) but are also beset with various interdependencies among these characteristics that require developers' attention. Such interdependencies can be either trade-offs, where improving the fog system on one characteristic impairs it on another, or synergies, where improving the system on one characteristic also improves it on another. As system developers face a multifaceted and complex set of potential system design measures, it is challenging for them to oversee all potentially resulting interdependencies, mitigate trade-offs, and foster synergies. Until now, the literature on fog system architecture has analyzed such interdependencies only in isolation for specific characteristics, limiting the applicability and generalizability of the proposed system designs when characteristics other than those considered are critical. We aim to fill this gap by conducting a literature review to (1) synthesize the most relevant characteristics of fog systems and the design measures to achieve them, and (2) derive interdependencies among all key characteristics. From reviewing 147 articles on fog system architectures, we reveal 11 key characteristics and 39 interdependencies. We supplement the key characteristics with a description, the reason for their relevance, and related design measures derived from the literature to deepen the understanding of a fog system's potential and clarify semantic ambiguities.
For the interdependencies, we explain and differentiate each one as positive (synergies) or negative (trade-offs), guiding practitioners and researchers in future design choices to avoid pitfalls and unleash the full potential of fog computing.
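The notion of directed interdependencies classified as synergies or trade-offs can be made concrete with a small data structure. The sketch below is illustrative only: the characteristic names and the two example entries are assumptions for demonstration, not the paper's actual 11 characteristics or 39 interdependencies.

```python
# Minimal sketch of representing characteristic interdependencies as a
# directed mapping classified as synergy or trade-off. The entries below
# are hypothetical examples, not data from the reviewed literature.
from enum import Enum


class Effect(Enum):
    SYNERGY = 1     # improving one characteristic also improves the other
    TRADE_OFF = -1  # improving one characteristic impairs the other


# Directed interdependencies: (source, target) -> effect.
interdependencies = {
    ("security", "latency"): Effect.TRADE_OFF,    # e.g., encryption adds delay
    ("resilience", "reliability"): Effect.SYNERGY,
}


def effects_of_improving(characteristic):
    """List which other characteristics a design measure on
    `characteristic` is expected to influence, and how."""
    return {
        target: effect.name
        for (source, target), effect in interdependencies.items()
        if source == characteristic
    }
```

With such a map, a developer can query `effects_of_improving("security")` before applying a design measure and see the expected trade-offs at a glance.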
“…The result is that migration time and service downtime can be significantly reduced compared with a simpler, more traditional migration of the whole application. To preserve a given level of QoS at all times, [8] deploys a fog server between a supported application and each IoT service it uses, allowing applications to access multiple IoT services more efficiently. In addition, [9] develops a cloud-fog computing architecture for information-centric IoT applications with specific support for job classification and resource scheduling functions.…”
Distributed architectures in which the Internet of Things (IoT) and the cloud are efficiently integrated play an increasingly important role in IoT solutions. Among these architectures, there is growing interest in those that support offloading functionality towards either intermediate fog nodes or IoT end devices. Relevant existing research has so far mainly focused on virtual machine and container migration to intermediate fog nodes and on the migration of very simple functions to IoT endpoints (to preserve their limited available resources). In this paper, we concentrate on the gap associated with benefiting from fog functionality at resource-powerful IoT endpoints, to create a deployment continuum that glues together IoT devices and the cloud. In particular, the paper presents a middleware that manages application deployment and life-cycle by simplifying and optimizing management operations such as device configuration and application constraint satisfaction. The proposed solution particularly fits highly articulated scenarios with large numbers of IoT devices and intermediate fog nodes by supporting the offloading of functionality in a split fashion between IoT endpoints and edge nodes. The reported experimental results confirm the feasibility of our approach in terms of overhead, scalability, and application life-cycle management.
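The split-offloading idea above can be sketched as a constraint-driven placement step: run a component on a resource-powerful IoT endpoint when its requirements fit, fall back to an intermediate fog node, and use the cloud only as a last resort. The node names, capacities, and greedy policy below are assumptions for illustration, not the middleware's actual algorithm.

```python
# Hedged sketch of tier-aware component placement: prefer the tier
# closest to the data source (IoT endpoint > fog node > cloud) subject
# to resource constraints. All names and numbers are hypothetical.
from dataclasses import dataclass


@dataclass
class Node:
    name: str
    tier: str        # "iot", "fog", or "cloud"
    free_cpu: float  # available CPU cores
    free_mem: float  # available memory (MB)


def place(component_cpu, component_mem, nodes):
    """Greedy placement preferring the tier closest to the data source."""
    for tier in ("iot", "fog", "cloud"):
        for node in nodes:
            if (node.tier == tier
                    and node.free_cpu >= component_cpu
                    and node.free_mem >= component_mem):
                node.free_cpu -= component_cpu
                node.free_mem -= component_mem
                return node.name
    return None  # constraint cannot be satisfied on any node


nodes = [
    Node("sensor-gw", "iot", free_cpu=1.0, free_mem=256),
    Node("fog-1", "fog", free_cpu=4.0, free_mem=4096),
    Node("cloud-dc", "cloud", free_cpu=64.0, free_mem=65536),
]
```

For example, a lightweight filter (0.5 cores, 128 MB) lands on the IoT gateway, while a heavier analytics component that exceeds the endpoint's remaining capacity is pushed to the fog node.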
“…On the other hand, fog applications are usually expressed as a directed flow in which each component has inputs and/or outputs (a workflow) [4], [5], [7], [9], [14]. Each component must fulfill a specific function, thereby distributing the application's functionality [7].…”
Section: Introduction
“…Indeed, several works have applied modeling techniques to define fog applications following this data-flow-oriented approach. However, none of them offers tools to address the conception, design, and deployment of this type of application [3], [4], [9]. Moreover, despite standardization attempts, to the best of the authors' knowledge, application design and composition is usually resolved by developing ad-hoc solutions for each situation, which are not applicable across different application domains.…”
The emergence of Industry 4.0 has led to integrations with innovative technologies that enable the development of applications using data collected from the plant floor to optimize industrial processes. Initially, these applications were deployed in the cloud. However, due to latency and security issues, fog computing has emerged as a new paradigm with capabilities similar to the cloud but closer to plant assets. Deploying applications in the fog has been a topic of growing discussion. Nevertheless, although different authors conceive applications according to the same logic (as a directed flow of components), no global solution has been presented. The authors therefore propose a generic model-based approach for defining fog computing applications.
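The directed-flow view of a fog application described above can be sketched as components wired into a directed graph and executed in topological order. The component names and the toy sense/filter/report pipeline below are illustrative assumptions, not part of the cited model-based approach.

```python
# Minimal sketch of a fog application as a directed flow of components:
# each component consumes its predecessor's output. The pipeline below
# is a hypothetical example (sensing -> fog pre-processing -> reporting).
from graphlib import TopologicalSorter


def run_workflow(components, edges, source_value):
    """components: name -> unary function; edges: consumer -> producers."""
    order = TopologicalSorter(edges).static_order()  # producers come first
    results = {}
    for name in order:
        producers = edges.get(name, ())
        if not producers:                      # source component
            results[name] = components[name](source_value)
        else:                                  # single-input chain for brevity
            results[name] = components[name](results[producers[0]])
    return results


components = {
    "sense": lambda x: x,             # read a plant-floor measurement
    "filter": lambda x: round(x, 1),  # pre-process on a fog node
    "report": lambda x: f"value={x}", # publish the result to the cloud
}
edges = {"filter": ("sense",), "report": ("filter",)}
```

Running `run_workflow(components, edges, 21.87)` evaluates the chain in dependency order; distributing the application then amounts to assigning each named component to a plant, fog, or cloud node.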