[1] We introduce an improved initialization for the decadal predictions performed for the Mittelfristige Klimaprognosen (MiKlip) project based on the Max-Planck-Institute Earth System Model, and we furthermore test the effect of increased ocean and atmosphere model resolutions. The new initialization includes a more sophisticated oceanic initialization as well as a newly added atmospheric initialization. We compare the performance of retrospective decadal forecasts over the past 50 years with that of the previous system. The new oceanic initialization considerably improves the performance in terms of surface air temperature over the tropical oceans on the 2-5-year time scale, which also improves the predictive skill of global mean surface air temperature on this time scale. The higher model resolution improves the predictive skill of surface air temperature over the tropical Pacific even further. Through the newly introduced atmospheric initialization, the quasi-biennial oscillation exhibits predictive skill of up to 4 years when a sufficiently high vertical atmospheric resolution is used.
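Predictive skill on multi-year time scales is commonly quantified with the anomaly correlation between retrospective forecasts (hindcasts) and observations. A minimal pure-Python sketch of that metric follows; the function name and toy data are illustrative and are not taken from the MiKlip system:

```python
# Hypothetical sketch: anomaly correlation coefficient (ACC) between a
# hindcast series and an observed series, e.g. surface air temperature
# anomalies averaged over a 2-5 year lead time. Toy data only.

def mean(xs):
    return sum(xs) / len(xs)

def anomaly_correlation(hindcast, observed):
    """Pearson correlation of the two series after removing their means."""
    h_anom = [h - mean(hindcast) for h in hindcast]
    o_anom = [o - mean(observed) for o in observed]
    num = sum(h * o for h, o in zip(h_anom, o_anom))
    den = (sum(h * h for h in h_anom) * sum(o * o for o in o_anom)) ** 0.5
    return num / den

# Toy example: a perfectly correlated pair of series gives ACC = 1.0
hc = [0.1, 0.3, 0.2, 0.5, 0.4]
ob = [0.2, 0.6, 0.4, 1.0, 0.8]
print(round(anomaly_correlation(hc, ob), 3))  # -> 1.0
```

An ACC near 1 indicates that the hindcasts reproduce the observed year-to-year variations; skill assessments such as those in the abstract typically compare this value against a reference forecast.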
Applying the Terrestrial Systems Modeling Platform, TSMP, this study provides the first simulated long-term (1996-2018), high-resolution (~12.5 km) terrestrial system climatology over Europe, which comprises variables from groundwater across the land surface to the top of the atmosphere (G2A). The data set offers an unprecedented opportunity to test hypotheses related to short- and long-range feedback processes in space and time between the different interacting compartments of the terrestrial system. The physical consistency of simulated states and fluxes in the terrestrial system constitutes the uniqueness of the data set: while most regional climate models (RCMs) tend to simplify the representation of soil moisture and groundwater, TSMP explicitly simulates full 3D soil and groundwater dynamics, closing the terrestrial water cycle from G2A. As anthropogenic impacts are excluded, the data set may serve as a near-natural reference for global change simulations including human water use and climate change. The data set is available as netCDF files for the pan-European EURO-CORDEX domain.
The advent of emerging technologies such as Web services, service-oriented architecture, and cloud computing has enabled us to perform business services more efficiently and effectively. However, we still suffer from unintended security leakages caused by unauthorized actions in business services, even as such cutting-edge technological growth provides more convenient services to Internet users. Furthermore, designing and managing Web access control policies are often error-prone due to the lack of effective analysis mechanisms and tools. In this paper, we present an innovative policy anomaly analysis approach for Web access control policies. We focus on XACML (eXtensible Access Control Markup Language) policies, since XACML has become the de facto standard for specifying and enforcing access control policies for various Web-based applications and services. We introduce a policy-based segmentation technique to accurately identify policy anomalies and derive effective anomaly resolutions. We also discuss a proof-of-concept implementation of our method called XAnalyzer and demonstrate how efficiently our approach can discover and resolve policy anomalies.
Abstract: Emerging computing technologies such as web services, service-oriented architecture, and cloud computing have enabled us to perform business services more efficiently and effectively. However, we still suffer from unintended security leakages caused by unauthorized actions in business services, even as such cutting-edge technological growth provides more convenient services to Internet users. Furthermore, designing and managing web access control policies are often error-prone due to the lack of effective analysis mechanisms and tools. In this paper, we present an innovative policy anomaly analysis approach for web access control policies, focusing on eXtensible Access Control Markup Language (XACML) policies. We introduce a policy-based segmentation technique to accurately identify policy anomalies and derive effective anomaly resolutions, along with an intuitive visualization of the analysis results. We also discuss a proof-of-concept implementation of our method called XAnalyzer and demonstrate how our approach can efficiently discover and resolve policy anomalies.
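The core idea behind policy-based segmentation is to partition the space of access requests into disjoint segments, where every request in a segment is matched by exactly the same subset of rules; segments whose rules disagree on the effect pinpoint conflicts. The following sketch illustrates this on a tiny, made-up rule format over discrete attributes; it is not actual XACML and the rule set is invented for illustration:

```python
# Hypothetical sketch of policy-based segmentation: group requests from a
# small universe by the exact set of rules that match them. Rule format
# and attribute values are illustrative, not XACML syntax.
from itertools import product

rules = [
    {"id": "r1", "subject": {"alice", "bob"}, "action": {"read"},          "effect": "Permit"},
    {"id": "r2", "subject": {"bob"},          "action": {"read", "write"}, "effect": "Deny"},
    {"id": "r3", "subject": {"alice"},        "action": {"read"},          "effect": "Permit"},
]

def matches(rule, request):
    subject, action = request
    return subject in rule["subject"] and action in rule["action"]

def segment(rules, subjects, actions):
    """Map each non-empty matching-rule set to the requests it covers."""
    segments = {}
    for request in product(sorted(subjects), sorted(actions)):
        key = frozenset(r["id"] for r in rules if matches(r, request))
        if key:
            segments.setdefault(key, []).append(request)
    return segments

segs = segment(rules, {"alice", "bob"}, {"read", "write"})
for rule_ids, requests in segs.items():
    effects = {r["effect"] for r in rules if r["id"] in rule_ids}
    # Segments matched by rules with differing effects are conflicts;
    # same-effect multi-rule segments are overlaps (redundancy candidates).
    kind = "conflict" if len(effects) > 1 else ("overlap" if len(rule_ids) > 1 else "plain")
    print(sorted(rule_ids), kind, requests)
```

Because segments are disjoint, each anomaly is reported exactly once and can be resolved per segment (for example, by prioritizing one effect for a conflicting segment) rather than per overlapping rule pair.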
This work documents the fidelity of the newly developed Indian Institute of Tropical Meteorology climate model simulations and demonstrates its suitability to address the climate variability and change issues relevant to the South Asian monsoon. The Indian Ministry of Earth Sciences and the National Oceanic and Atmospheric Administration (NOAA) entered into a formal agreement to collaborate on the implementation of the National Centers for Environmental Prediction (NCEP) weather and seasonal prediction system in India during 2011. As part of this collaboration, the India Meteorology Department (IMD) and the National Centre for Medium Range Weather Forecasts (NCMRWF) implemented the high-resolution (T574, L64) atmospheric Global Forecast System (GFS) model with three-dimensional variational data assimilation (3DVAR) at IMD for short- and medium-range weather forecasts. Also, the coupled ocean-atmosphere model, Climate Forecast System version 2 (CFSv2), with a high-resolution atmosphere (T382, L64), was implemented for seasonal prediction at the Indian Institute of Tropical Meteorology (IITM). To address the long-term critical need in India for a climate model that would provide reliable future projections of Indian monsoon rainfall, IITM planned on building an Earth system model (ESM) based on the CFSv2 framework. Further, as part of the Monsoon Mission (see www.tropmet.res.in/), India is committed to improving the CFSv2 model to provide more skillful predictions of seasonal monsoon rainfall, which would also benefit the short- and medium-range predictions at IMD. Therefore, the extension of the seasonal prediction model to a long-term climate model would establish a seamless prediction system from weather to seasonal time scales.
Abstract: The advent of emerging computing technologies such as service-oriented architecture and cloud computing has enabled us to perform business services more efficiently and effectively. However, we still suffer from unintended security leakages caused by unauthorized actions in business services. Moreover, designing and managing different types of policies collaboratively in such a computing environment are critical but often error-prone, due to the complex nature of policies as well as the lack of effective analysis mechanisms and corresponding tools. In particular, existing mechanisms and tools for policy management adopt different approaches for different types of policies. In this work, we propose a unified framework to facilitate collaborative policy analysis and management for different types of policies, focusing on policy anomaly detection and resolution. Our generic approach captures the common semantics and structure of different types of access control policies with the notion of a policy ontology. We also discuss a proof-of-concept implementation of our proposed framework and demonstrate how efficiently our approach can discover and resolve anomalies for different types of policies. Index Terms: ontology, policy anomaly analysis, autonomic computing. I. INTRODUCTION: We have witnessed explosive growth of the applications adopting service-oriented architecture (SOA) and cloud computing on the Internet. SOA and cloud computing have brought the concept of multi-tenancy for serving various subscribers through a common pool of resources. In such an environment, it is necessary to have a more flexible and collaborative access control mechanism to prevent unintended access to shared resources and private user data.
Therefore, the use of a policy-based approach has received considerable attention as a way to accommodate the security requirements of such large, open, distributed, and heterogeneous computing environments. A policy, the basic building block of a policy-based system, is a set of rules that control the behaviors of a system. Policy-based computing handles complex system properties by separating policies from the system implementation, and it enables dynamic adaptability of system behaviors by changing policy configurations without reprogramming the systems. Policies in modern systems are growing exponentially in size and complexity. In a typical policy, multiple rules may overlap, which means one access request may match several rules. Furthermore, multiple rules within one policy may conflict, implying that those rules not only overlap each other but also yield different decisions. Conflicts in a policy may lead to both safety problems (e.g., allowing unauthorized access) and availability problems (e.g., denying legitimate access). On the other hand, some rules may be redundant, meaning that an access request matching one rule also matches other rules with the same effect. In such a case, the performance of an access control system might be degraded, since it directly depends on the number of rules...
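The conflict and redundancy notions described above can be made concrete with a small sketch. Under a first-applicable combining strategy, two rules conflict if they match a common request with different effects, and a rule is redundant if removing it never changes any decision. The rule format, combining strategy, and data below are invented for illustration and are not the paper's actual framework:

```python
# Hypothetical sketch: detect conflicting and redundant rules under a
# first-applicable combining algorithm, by brute force over a tiny
# request universe. Rules are (id, subjects, actions, effect) tuples.
from itertools import product

rules = [
    ("r1", {"alice"},        {"read"}, "Permit"),
    ("r2", {"alice", "bob"}, {"read"}, "Permit"),  # broader Permit covering r1
    ("r3", {"alice"},        {"read"}, "Deny"),    # shadowed: never reached
]

REQUESTS = list(product(["alice", "bob"], ["read", "write"]))

def matches(rule, request):
    _, subjects, actions, _ = rule
    return request[0] in subjects and request[1] in actions

def evaluate(rules, request):
    """Effect of the first matching rule, or NotApplicable."""
    for rule in rules:
        if matches(rule, request):
            return rule[3]
    return "NotApplicable"

def conflicts(rules):
    """Rule pairs that match a common request but yield different effects."""
    found = []
    for i, a in enumerate(rules):
        for b in rules[i + 1:]:
            if a[3] != b[3] and any(matches(a, q) and matches(b, q) for q in REQUESTS):
                found.append((a[0], b[0]))
    return found

def redundant(rules):
    """Rules whose removal changes no decision for any request."""
    found = []
    for rule in rules:
        rest = [r for r in rules if r is not rule]
        if all(evaluate(rest, q) == evaluate(rules, q) for q in REQUESTS):
            found.append(rule[0])
    return found

print(conflicts(rules))   # r3 disagrees with r1 and r2 on (alice, read)
print(redundant(rules))   # r1 is covered by r2; r3 is shadowed by r1/r2
```

Note that redundancy is combining-algorithm dependent: r3 here is "redundant" only because first-applicable evaluation never reaches it, which is exactly the kind of masked conflict an anomaly analyzer should surface rather than silently tolerate.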
Abstract. Geoscientific modeling is constantly evolving, with next-generation geoscientific models and applications placing large demands on high-performance computing (HPC) resources. These demands are being met by new developments in HPC architectures, software libraries, and infrastructures. In addition to the challenge of new massively parallel HPC systems, reproducibility of simulation and analysis results is of great concern. This is because next-generation geoscientific models are based on complex model implementations and on profiling, modeling, and data processing workflows. Thus, in order to reduce both the duration and the cost of code migration and to aid in the development of new models or model components, while ensuring reproducibility and sustainability over the complete data life cycle, an automated approach to profiling, porting, and provenance tracking is necessary. We propose a run control framework (RCF) integrated with a workflow engine as a best-practice approach to automate profiling, porting, provenance tracking, and simulation runs. To address these issues, our RCF encompasses all stages of the modeling chain: (1) preprocessing of input, (2) compilation of the code (including code instrumentation with performance analysis tools), (3) the simulation run, and (4) postprocessing and analysis. Within this RCF, the workflow engine is used to create and manage benchmark or simulation parameter combinations, and it performs the documentation and data organization required for reproducibility. In this study, we outline this approach and highlight the subsequent developments, born out of the extensive profiling of ParFlow, that are scheduled for implementation.
We show that using our run control framework makes testing, benchmarking, profiling, and running models less time consuming and more robust than running geoscientific applications in an ad hoc fashion, resulting in more efficient use of HPC resources, more strategic code development, and enhanced data integrity and reproducibility.
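The four-stage modeling chain with provenance tracking can be sketched as a simple pipeline in which each stage transforms an artifact and the framework records timing and a content hash per stage. The stage bodies and record format below are placeholders invented for illustration; they do not represent the actual RCF or ParFlow workflow:

```python
# Hypothetical sketch of a run control framework: the modeling chain as a
# list of named stages, with a provenance record kept for each run.
# Stage contents and the record layout are illustrative placeholders.
import hashlib
import json
import time

def run_chain(params, stages):
    provenance = {"params": params, "stages": []}
    artifact = dict(params)
    for name, stage in stages:
        start = time.time()
        artifact = stage(artifact)
        provenance["stages"].append({
            "stage": name,
            "seconds": round(time.time() - start, 6),
            # Hash of the intermediate artifact, so reruns can be checked
            # for bit-identical stage outputs (reproducibility).
            "sha256": hashlib.sha256(
                json.dumps(artifact, sort_keys=True).encode()
            ).hexdigest(),
        })
    return artifact, provenance

# The four stages of the modeling chain, as trivial placeholder functions.
stages = [
    ("preprocess",  lambda a: {**a, "grid": "prepared"}),
    ("compile",     lambda a: {**a, "binary": "model.exe"}),
    ("simulate",    lambda a: {**a, "output": "fields.nc"}),
    ("postprocess", lambda a: {**a, "summary": "stats.json"}),
]

result, prov = run_chain({"resolution_km": 12.5}, stages)
print([s["stage"] for s in prov["stages"]])
# -> ['preprocess', 'compile', 'simulate', 'postprocess']
```

In a real RCF, the workflow engine would expand `params` into benchmark parameter combinations and persist each provenance record alongside the simulation output, so that any result can be traced back to its exact inputs and code version.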