Abstract: The infrastructure-as-a-service (IaaS) model of cloud computing provides virtual infrastructure functions (VIFs), which allow application developers to flexibly provision suitable virtual machine (VM) types and locations, and even configure the network connection for each VM. Because of its pay-as-you-go business model, IaaS offers an elastic way to operate applications on demand. However, in the current DevOps (software development and operations) lifecycle of cloud applications, the VM provisioning steps mainly…
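The abstract above describes VIFs that let an application describe VM types, locations, and per-VM network configuration programmatically. A minimal sketch of what such an interface could look like is shown below; all names (`VMSpec`, `VirtualInfrastructure`, `provision`) are hypothetical illustrations, not the actual API of the cited framework or of any IaaS provider.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a virtual infrastructure function (VIF) interface:
# the application declares VM type, location, and per-VM network settings,
# and the IaaS layer provisions and releases resources on demand.

@dataclass
class VMSpec:
    name: str
    vm_type: str   # e.g. "t2.small" — flavour chosen by the developer
    location: str  # e.g. "eu-west-1" — data-centre location
    subnet: str    # per-VM network configuration

@dataclass
class VirtualInfrastructure:
    vms: list = field(default_factory=list)

    def provision(self, spec: VMSpec) -> str:
        """Simulate on-demand (pay-as-you-go) provisioning of one VM."""
        self.vms.append(spec)
        return f"{spec.name}@{spec.location} ({spec.vm_type}, subnet {spec.subnet})"

    def release_all(self) -> int:
        """Tear everything down when the application no longer needs it."""
        n = len(self.vms)
        self.vms.clear()
        return n

infra = VirtualInfrastructure()
print(infra.provision(VMSpec("web-1", "t2.small", "eu-west-1", "10.0.1.0/24")))
print(infra.provision(VMSpec("db-1", "m5.large", "eu-west-1", "10.0.2.0/24")))
print("released:", infra.release_all())
```

The point of the sketch is the elasticity the abstract refers to: resources exist only between `provision` and `release_all`, so cost tracks actual use.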
“…There is a need to develop data processing technologies that address the problem by abstracting from (and virtualising) the platform(s) that take care of executing the processing pipeline. Such technologies should go in tandem with optimisation technologies and should provide the data processing designer with fine-grained processing directives and facilitate detailed specification of processing algorithms [35].…”
The development of data processing and analytics tools is heavily driven by applications, resulting in a great variety of software solutions that often address specific needs. It is difficult to imagine a single solution that is universally suitable for all (or even most) application scenarios and contexts. This chapter describes the data analytics framework that has been designed and developed in the ENVRIplus project to be (a) suitable for serving the needs of researchers in several domains, including the environmental sciences, (b) open and extensible both with respect to the algorithms and methods it enables and the computing platforms it relies on to execute them, and (c) open-science-friendly, i.e. capable of incorporating every algorithm and method integrated into the data processing framework, as well as any computation resulting from the exploitation of integrated algorithms, into a "research object" catering for citation, reproducibility, repeatability and provenance.
“…All these works lack an efficient solution for the application to program and control the computing infrastructure in the DevOps lifecycle. Although our previous CloudsStorm framework [7], [8] tackles the resource management issue from the application DevOps perspective, it still mainly focuses on working with Cloud resources, which is insufficient for the more complex Edge computing environment.…”
Over the last decades, Cloud computing has significantly changed how traditional applications are developed and operated, and we have witnessed ever more newly built Clouds and data centers. However, the centralized management mechanism of current Clouds lacks the dispersion needed to satisfy the requirements of emerging collaborative applications, including AI, IoT, and autonomous driving. Edge computing, on the other hand, remains at the conceptual and experimental stage: most organizations construct their own Edge nodes to operate applications, and an efficient incentive mechanism is missing to motivate Edge and micro-Cloud resource providers to join and constitute a more generalized and decentralized ecosystem. To address this issue, we propose ALLSTAR, a blockchain-based architecture that combines all Cloud and Edge resources on an equal footing so that they can be seamlessly leveraged by applications throughout the DevOps (development and operations) lifecycle. The ALLSTAR architecture is a systematic solution for realizing "Cloud+Edge" management and contributes to constructing the corresponding ALLSTAR ecosystem. This paper describes the overall architecture of ALLSTAR, the related key techniques, the detailed application DevOps processes, and the new business model.
“…It bridges the gap between real-time scheduling theory and Xen, and provides a platform for integrating a broad range of real-time and embedded systems. The performance guarantee at the software system level relies on the optimisation between application and infrastructure, e.g., CloudsStorm [17] for service applications and PrEstoCloud [18] for real-time Big Data using an extension of the Fog computing paradigm to the extreme Edge of the network, and real-time container scheduling [19]. However, the engineering of quality-critical applications across heterogeneous infrastructure for DApps is still very challenging, due to diverse programming interfaces and the lack of effective programming methods.…”
Quality-Critical Decentralised Applications (QC-DApps) have high requirements for system performance and service quality, involve heterogeneous infrastructures (Clouds, Fogs, Edges and IoT), and rely on trustworthy collaboration among data-source participants and infrastructure providers to deliver their business value. The development of QC-DApps has to tackle the low-performance challenge of current blockchain technologies, caused by the low collaboration efficiency among distributed peers during consensus. On the other hand, the resilience of the Cloud has enabled significant advances in software-defined storage, networking, and infrastructure, among other technologies; however, these rich programmabilities of the infrastructure (in particular, the new hardware accelerators within it) still cannot be effectively utilised for QC-DApps, due to the lack of a suitable architecture and programming model.