The importance of CO2 capture and storage to the environmental stability of our world should not be underestimated, as emissions of greenhouse gases cause serious problems. It is the only technology that might rid our atmosphere of the main anthropogenic greenhouse gas while allowing the continued use of the fossil fuels that still power today's world. Underground storage of CO2 involves injecting CO2 into suitable geological formations and monitoring the injected plume over time to ensure containment. Over the last two to three decades, considerable attention has been paid to the development of carbon capture and sequestration technology, so it is timely to take stock of the research done so far; a high-level review article is needed to survey the status of carbon capture and sequestration research. This article reviews CO2 storage technologies, covering the essential concepts of storage, the physical processes involved, modeling procedures and simulators, capacity estimation, measurement, monitoring and verification techniques, the risks and challenges involved, and field- and pilot-scale projects. It is hoped that this review will help researchers quickly gain a working knowledge of CO2 sequestration for future research in this field.
Keywords: CO2 storage · Geological formation · Modeling for CO2 storage · Mechanism of CO2 storage · CO2 storage projects
Edited by Yan-Hua Sun
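The capacity estimation mentioned above is often done volumetrically. As a minimal, hedged sketch of the widely used volumetric approach (mass = area × thickness × porosity × CO2 density × storage efficiency), with all input values below being illustrative assumptions rather than field data:

```python
# Hedged sketch of a volumetric CO2 storage-capacity estimate for a
# saline formation: M = A * h * phi * rho_CO2 * E.
# All numbers below are illustrative assumptions, not field data.

def storage_capacity_mt(area_m2, thickness_m, porosity, rho_co2_kg_m3, efficiency):
    """Return the CO2 mass capacity in megatonnes (Mt)."""
    mass_kg = area_m2 * thickness_m * porosity * rho_co2_kg_m3 * efficiency
    return mass_kg / 1e9  # kg -> Mt

# Example: 100 km^2 formation, 50 m net thickness, 20% porosity,
# supercritical CO2 density ~700 kg/m^3, 2% storage efficiency.
capacity = storage_capacity_mt(100e6, 50.0, 0.20, 700.0, 0.02)
print(f"Estimated capacity: {capacity:.1f} Mt CO2")
```

In practice the efficiency factor is the most uncertain term and is usually taken from probabilistic ranges rather than a single value.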
Containers are increasingly used as a means to distribute and run Linux services and applications. In this paper we describe the architectural design and implementation of udocker, a tool that enables the user to execute Linux containers in user mode. We also present a few practical applications, using a range of scientific codes with different requirements: from single-core execution to MPI parallel execution and execution on GPGPUs.
This paper describes the achievements of the H2020 project INDIGO-DataCloud. The project has provided e-infrastructures with tools, applications and cloud framework enhancements to manage the demanding requirements of scientific communities, either locally or through enhanced interfaces. The middleware developed makes it possible to federate hybrid resources and to easily write, port and run scientific applications in the cloud. In particular, we have extended existing PaaS (Platform as a Service) solutions, allowing public and private e-infrastructures, including those provided by EGI, EUDAT, and Helix Nebula, to integrate their existing services and make them available through AAI services compliant with GEANT interfederation policies, thus guaranteeing transparency and trust in the provisioning of such services. Our middleware facilitates the execution of applications using containers on Cloud and Grid based infrastructures, as well as on HPC clusters. Our developments are freely downloadable as open source components, and are already being integrated into many scientific applications.
Reservoir rock typing is a process by which geological facies are characterized by their dynamic behavior. The dynamic behavior of the facies is assessed by studying the rock texture, the diagenetic processes which overprinted the initial fabric, and the interaction between the rock itself and the fluids. Porosity, permeability and pore size distributions characterize the rock texture, while capillary pressure, relative permeability and wettability describe the rock-fluid interaction. Reservoir rock typing is a synergistic process between geology and petrophysics/SCAL. It is therefore a process by which various petrophysical parameters and dynamic measurements obtained from SCAL are integrated in a consistent manner with geological facies (lithofacies) to estimate their flow (dynamic) behavior. The relationships between lithofacies and reservoir rock types (RRTs) are complex because of the interplay between facies, diagenetic processes and the rock-fluid interaction (wettability changes) in the reservoir. Similar lithofacies, deposited in the same depositional environments, may exhibit different petrophysical properties due to diagenesis. Lithofacies deposited under similar geological conditions may therefore experience different diagenetic processes, resulting in different petrophysical groups with distinct porosity-permeability relationships, capillary pressure profiles and water saturation (Sw) for a given height above the Free Water Level (FWL). Conversely, lithofacies deposited in different depositional environments might exhibit similar petrophysical properties and dynamic behavior. The authors emphasize the need for a good understanding of the original facies, depositional environments, subsequent diagenetic processes and rock-fluid interaction (via SCAL) in order to unravel the relationships between lithofacies, petrophysical groups and rock types.
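The saturation-height behavior mentioned above (Sw for a given height above the FWL) can be sketched with a standard capillary-pressure argument: buoyancy sets Pc from height, and a capillary-pressure model is inverted for Sw. The sketch below uses a Brooks-Corey model; the entry pressure, pore-size index and irreducible saturation are assumed values, not SCAL measurements.

```python
# Minimal saturation-height sketch, assuming a Brooks-Corey Pc curve.
# Fluid densities, entry pressure, lambda and Swir are illustrative.

G = 9.81  # gravitational acceleration, m/s^2

def pc_from_height(h_m, rho_water=1050.0, rho_oil=800.0):
    """Capillary pressure (Pa) at height h above the Free Water Level."""
    return (rho_water - rho_oil) * G * h_m

def sw_brooks_corey(pc_pa, pe_pa=5000.0, lam=2.0, swir=0.15):
    """Water saturation from a Brooks-Corey capillary-pressure model."""
    if pc_pa <= pe_pa:
        return 1.0  # below entry pressure: fully water saturated
    se = (pe_pa / pc_pa) ** lam          # effective (normalized) saturation
    return swir + (1.0 - swir) * se

for h in (1.0, 10.0, 50.0):
    print(f"h = {h:5.1f} m above FWL -> Sw = {sw_brooks_corey(pc_from_height(h)):.2f}")
```

Distinct petrophysical groups would each carry their own entry pressure and lambda, which is exactly why similar lithofacies can sit at very different Sw for the same height.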
A workflow for carbonate rock typing that addresses some of the industry pitfalls, together with the differences between lithofacies, petrophysical groups and rock types, is presented in this paper.
Introduction - Nomenclature
Before proceeding to the rock type description and its link with geology and SCAL, it is important to provide a few basic definitions of the common technical terminologies found in the literature, such as lithofacies, facies associations, petrophysical groups, rock types and flow units. In this paper we define lithofacies or lithofacies types as depositional facies, or lithotypes, based on sedimentary texture (Dunham 1962; Embry and Klovan 1971), grain types (skeletal grains, peloids, ooids, etc.), and, optionally, sedimentary structures (cross-bedding, bioturbation, lamination, etc.). Typical lithofacies types are skeletal wackestone, skeletal-peloid packstone or cross-bedded ooid grainstone. Facies associations are groups or bins of lithofacies from the same depositional environment/facies tract with common φ-k relationships/trends. Petrophysical groups are units of rocks (which can consist of multiple lithofacies) with similar petrophysical ...
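One common way to bin rocks into petrophysical groups with common φ-k trends, as described above, is the Flow Zone Indicator (FZI) of Amaefule et al. The sketch below is illustrative only: the core-plug values are hypothetical, and in practice the FZI cut-offs separating groups would come from cluster analysis of actual core data.

```python
# Illustrative sketch: Flow Zone Indicator (FZI) for petrophysical grouping.
# FZI = RQI / phi_z, with RQI = 0.0314 * sqrt(k/phi) (k in mD, phi a fraction)
# and phi_z = phi / (1 - phi). Sample plug values are hypothetical.
import math

def fzi(perm_md, porosity):
    """Flow Zone Indicator (microns): Reservoir Quality Index / normalized porosity."""
    rqi = 0.0314 * math.sqrt(perm_md / porosity)   # Reservoir Quality Index
    phi_z = porosity / (1.0 - porosity)            # pore volume / grain volume
    return rqi / phi_z

# Hypothetical core plugs: (permeability in mD, fractional porosity)
plugs = [(120.0, 0.22), (5.0, 0.18), (300.0, 0.25), (0.8, 0.12)]
for k, phi in plugs:
    print(f"k = {k:7.1f} mD, phi = {phi:.2f} -> FZI = {fzi(k, phi):.2f}")
```

Plugs with similar FZI fall on the same straight line on a log-log RQI versus φz plot, which is the usual graphical check before assigning them to one group.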
In this paper we propose a distributed architecture to provide machine learning practitioners with a set of tools and cloud services that cover the whole machine learning development cycle: ranging from model creation, training, validation and testing to serving models as a service, sharing and publication. In this respect, the DEEP-Hybrid-DataCloud framework allows transparent access to existing e-Infrastructures, effectively exploiting distributed resources for the most compute-intensive tasks of the machine learning development cycle. Moreover, it provides scientists with a set of Cloud-oriented services to make their models publicly available, adopting a serverless architecture and a DevOps approach that make the developed models easy to share, publish and deploy.
Index Terms: Cloud computing, computers and information processing, deep learning, distributed computing, machine learning, serverless architectures.
This paper presents the methodologies adopted to model a tight carbonate reservoir located in Abu Dhabi and to better predict its performance (completed with horizontal wells) under a water-alternating-gas (WAG) process. The model is built by integrating geological, geophysical, petrophysical, geomechanical, and geostatistical information. The large-scale reservoir framework is built by integrating horizontal wells and 3D seismic data. Horizontal well results are used to improve the velocity modeling and depth conversion. The fine-scale reservoir zonation is based on lithostratigraphic correlations derived from the porosity and micro-resistivity logs. Stylolitic intervals are used as stratigraphic markers to guide the reservoir zonation. The porosity model is derived from a high-resolution stochastic seismic inversion, and the permeability model is generated using cloud transforms with P-Fields applied by reservoir rock type. High-pressure mercury injection data are used to define reservoir rock types. Lorenz plots have been applied and found to be a useful technique for capturing the heterogeneity of the reservoir and determining the main flow units. Fracture analysis is conducted using cores and image logs (FMI). A geomechanical study is performed to assess the orientation of the horizontal wells in the field. A discussion of the orientation of the horizontal wells with respect to maximum principal stress versus productivity/injectivity is also included in this paper. A mechanistic compositional flow model is built to perform sensitivity analyses on various WAG schemes (cycle, ratio, etc.). A full-field compositional model is subsequently built to evaluate the field performance under various development scenarios. The field is scheduled to come on stream by December 2005.
Field History
The field was discovered in 1969.
Reflection seismic had defined a number of structural closures at several stratigraphic levels, and the discovery well, W-1, was drilled to test one of these structures. Oil and gas shows were recorded when the well penetrated the main reservoir interval. Subsequent tests proved the commercial viability of the structure. In 1995 a 3D seismic survey was acquired and a new re-interpretation performed. The combined evaluation of open-hole logs, well tests and 3D seismic results provided some encouragement, and in 1999 the field was declared commercial. Between 1994 and 1999 ADCO implemented an Early Production Scheme (EPS) to evaluate well performance with vertical and horizontal wells. Based on the successful results of horizontal well performance, a field development plan was devised consisting of a line drive with 20 horizontal producers and 15 horizontal injectors under a WAG process (Fig. 1). The development plan calls for 20,000 BOPD by the end of 2005, maintaining the plateau for a specified period of time, and achieving a high recovery efficiency. Production forecasts were based on compositional models, which are used to predict and monitor the reservoir performance during Phase-1 and also to assist the subsequent phases of development beyond 2015. The latter was based on detailed static (geological) modeling, the subject of this paper.
Seismic data
Three generations of seismic were acquired between 1962 and 1995, the latest being a 3D survey recorded in 1994 and 1995 by Western Geophysical. This survey covers 1334 sq. km over the field area with a 25 × 25 m bin size, translating into 2.1 million traces and providing a dense grid of information on the subsurface. The original processing was completed in 1997 by Western Geophysical, and the seismic data quality was good enough to carry out accurate structural interpretation.
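The Lorenz-plot heterogeneity measure applied in this study can be sketched as follows: layers are sorted by k/φ, cumulative flow capacity (k·h) is plotted against cumulative storage capacity (φ·h), and the Lorenz coefficient is twice the area between that curve and the diagonal. The layer data below are hypothetical, not the field's.

```python
# Sketch of the Lorenz coefficient for reservoir heterogeneity.
# 0 = perfectly homogeneous; values approaching 1 = highly heterogeneous.
# Layer tuples (k in mD, fractional porosity, thickness in m) are hypothetical.

def lorenz_coefficient(layers):
    """layers: list of (k_mD, phi, h_m). Returns Lc in [0, 1]."""
    ordered = sorted(layers, key=lambda l: l[0] / l[1], reverse=True)
    total_kh = sum(k * h for k, phi, h in ordered)   # total flow capacity
    total_ph = sum(phi * h for k, phi, h in ordered) # total storage capacity
    area = 0.0
    x_prev = y_prev = 0.0
    for k, phi, h in ordered:
        x = x_prev + phi * h / total_ph   # cumulative storage capacity
        y = y_prev + k * h / total_kh     # cumulative flow capacity
        area += 0.5 * (y + y_prev) * (x - x_prev)  # trapezoid under the curve
        x_prev, y_prev = x, y
    return 2.0 * (area - 0.5)  # twice the area between curve and diagonal

layers = [(500.0, 0.25, 2.0), (50.0, 0.20, 5.0), (5.0, 0.15, 3.0)]
print(f"Lorenz coefficient: {lorenz_coefficient(layers):.2f}")
```

Sharp breaks in slope along the sorted curve are what the study uses to pick out the main flow units.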
However, despite favorable surface conditions, the seismic data suffered from stacking velocity dispersion that had a significant impact on amplitude values.
The Certification Authority Coordination Group in the European DataGrid project has created a large-scale Public Key Infrastructure and the policies and procedures to operate it successfully. The infrastructure demonstrates interoperability of multiple certification authorities (CAs) in a novel system of peer-assessment of the roots of trust. Crucial to the assessment is the definition of minimum requirements that all CAs must meet in order to be accepted. The evaluation is aided by software-generated trust matrices. Related work building on this infrastructure is described. The group's policies and experience now form the basis of the new European Policy Management Authority for Grid Authentication in e-Science.