* The vision, ideas, observations, and recommendations presented in this report are summarized from discussions by the participants during the 'Sustain What?' workshop held in New York in November 2010. The atmosphere was an example of creative collaboration at its best, and the intellectual property herein belongs to the participants as a whole. Agreement with everything in the report by any single author should not be assumed, as there was lively debate and disagreement over details. That said, most major points, including, importantly, the feasibility of a 50-year species inventory, were agreed to by all. The participants willingly set aside minor divergences of opinion in the interest of community building and the creation of a powerful general vision for what can be.
Each year across the US, mesoscale weather events (flash floods, tornadoes, hail, strong winds, lightning, and localized winter storms) cause hundreds of deaths, routinely disrupt transportation and commerce, and lead to economic losses averaging more than US$13 billion.1 Although mitigating the impacts of such events would yield enormous economic and societal benefits, research leading to that goal is hindered by rigid IT frameworks that cannot accommodate the real-time, on-demand, dynamically adaptive needs of mesoscale weather research; its disparate, high-volume data sets and streams; or the tremendous computational demands of its numerical models and data-assimilation systems.

In response to the increasingly urgent need for a comprehensive national cyberinfrastructure in mesoscale meteorology, particularly one that can interoperate with those being developed in other relevant disciplines, the US National Science Foundation (NSF) funded a large information technology research (ITR) grant in 2003, known as Linked Environments for Atmospheric Discovery (LEAD). A multidisciplinary effort involving nine institutions and more than 100 scientists, students, and technical staff in meteorology, computer science, social science, and education, LEAD addresses the fundamental research challenges needed to create an integrated, scalable framework for adaptively analyzing and predicting the atmosphere.

LEAD's foundation is dynamic workflow orchestration and data management in a Web services framework. These capabilities provide for the use of analysis tools, forecast models, and data repositories.
CASA and LEAD establish an interactive closed loop between the forecast analysis and the instruments: the data drive the instruments, which refocus in a repeated cycle to produce more accurate predictions. The "Hypothetical CASA-LEAD Scenario" sidebar provides an example of the unprecedented capabilities these changes afford.

Mesoscale meteorology is the study of smaller-scale weather phenomena such as severe storms, tornadoes, and hurricanes. System-level science in this context involves the responsiveness of the forecast models to the weather at hand as well as to conditions on the network at large and the large-scale computational resources on which forecasts rely. This responsiveness can be broken down into four narrowly defined goals, two of which are:

• Dynamic workflow adaptivity. Forecasts execute in the context of a workflow, or task graph. Workflows should be able to dynamically reconfigure in response to new events (a minimal sketch follows below).
• Dynamic resource allocation. The system should be able to dynamically allocate resources, including radars and remote observing technologies, to optimize data collection.

Sidebar: Two closely linked projects aim to dramatically improve storm forecasting speed and accuracy. CASA is creating a distributed, collaborative, adaptive sensor network of low-power, high-resolution radars that respond to user needs. LEAD offers dynamic workflow orchestration and data management in a Web services framework designed to support on-demand, real-time, dynamically adaptive systems.
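The dynamic workflow adaptivity goal can be made concrete with a small sketch: a forecast workflow expressed as a task graph that is re-planned when a trigger event, such as a detected storm signature, arrives mid-run. This is an illustrative Python sketch under assumed task names; it is not LEAD's orchestration engine or API.

```python
"""Minimal sketch of dynamic workflow adaptivity: a forecast workflow
as a task graph that is re-planned when severe weather is detected.
Task names and the trigger are hypothetical, not LEAD's."""

# Tasks mapped to their prerequisites (a tiny directed acyclic graph).
BASE_WORKFLOW = {
    "ingest_obs": [],
    "assimilate": ["ingest_obs"],
    "coarse_forecast": ["assimilate"],
}

# Extra tasks spliced in when severe weather is detected.
ADAPTIVE_TASKS = {
    "nested_high_res_forecast": ["coarse_forecast"],
    "retask_radars": ["nested_high_res_forecast"],
}


def topological_order(graph: dict[str, list[str]]) -> list[str]:
    """Return tasks so every task follows all of its prerequisites."""
    order, seen = [], set()

    def visit(task: str) -> None:
        if task in seen:
            return
        seen.add(task)
        for dep in graph[task]:
            visit(dep)
        order.append(task)

    for task in graph:
        visit(task)
    return order


def plan(severe_weather_detected: bool) -> list[str]:
    graph = dict(BASE_WORKFLOW)
    if severe_weather_detected:
        graph.update(ADAPTIVE_TASKS)  # reconfigure in response to the event
    return topological_order(graph)


print(plan(severe_weather_detected=False))
print(plan(severe_weather_detected=True))
```

The design point is that adaptivity lives in the planner rather than in the tasks: reconfiguration just means merging extra nodes into the graph and re-deriving a valid execution order.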
LEAD is a large-scale effort to build a service-oriented infrastructure that allows atmospheric science researchers to dynamically and adaptively respond to weather patterns to produce better-than-real-time predictions of tornadoes and other "mesoscale" weather events. In this paper we discuss an architectural framework that is shaping our thinking about adaptability and present early solutions for workflow and monitoring.
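The adaptive behavior both abstracts describe can be pictured as a closed sense-forecast-steer loop: observations feed a forecast, and forecast features steer the radars toward the weather of interest. The sketch below is hypothetical; Radar, run_forecast, and closed_loop_step are invented placeholder names, not the CASA or LEAD interfaces.

```python
"""Toy sketch of a closed sensing/forecast loop in the spirit of
CASA-LEAD: observations drive the forecast, and forecast features
steer the radars. All names here are illustrative placeholders."""

from dataclasses import dataclass


@dataclass
class Radar:
    name: str
    azimuth_deg: float  # current scan center

    def retask(self, azimuth_deg: float) -> None:
        # A real system would issue a scan-strategy command here.
        self.azimuth_deg = azimuth_deg


def run_forecast(observations: list[dict]) -> list[dict]:
    """Stand-in for a numerical forecast; returns predicted storm cells."""
    # Placeholder physics: echo each observed cell shifted slightly east.
    return [{"azimuth_deg": o["azimuth_deg"] + 5.0, "intensity": o["intensity"]}
            for o in observations]


def closed_loop_step(radars: list[Radar], observations: list[dict]) -> None:
    cells = run_forecast(observations)
    if not cells:
        return
    # Steer every radar toward the most intense predicted cell.
    target = max(cells, key=lambda c: c["intensity"])
    for radar in radars:
        radar.retask(target["azimuth_deg"])


radars = [Radar("radar_a", 0.0), Radar("radar_b", 90.0)]
closed_loop_step(radars, [{"azimuth_deg": 40.0, "intensity": 0.8}])
print([(r.name, r.azimuth_deg) for r in radars])  # both now aimed near 45.0
```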
Strong and strategic collaborations among experts from academia, federal operational centers, and industry have been forged to create a U.S. IOOS Coastal and Ocean Modeling Testbed (COMT). The COMT mission is to accelerate the transition of scientific and technical advances from the coastal and ocean modeling research community into improved operational ocean products and services. This is achieved by evaluating existing technology or developing new technology, depending on the technology's maturity within the research community. The initial phase of the COMT has addressed three coastal and ocean prediction challenges of great societal importance: estuarine hypoxia, shelf hypoxia, and coastal inundation. A fourth effort concentrated on providing and refining the cyberinfrastructure and cyber tools needed to support the modeling work and to advance interoperability and community access to the COMT archive. This paper presents an overview of the initiation of the COMT, the findings of each team, and a discussion of the role of the COMT in research-to-operations and its interface with the coastal and ocean modeling community in general. Detailed technical results are presented in the accompanying series of 16 technical papers in this special issue.
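To give a flavor of the evaluation step at the heart of the COMT mission, the sketch below computes two common skill metrics, bias and root-mean-square error, for a paired model-observation series. The station, variable, and numbers are invented for illustration and are not taken from the testbed's results.

```python
"""Illustrative model-skill evaluation of the kind a testbed performs
when comparing a research model against observations. Data and metric
choices are generic examples, not COMT's actual evaluation suite."""

import math

# Hypothetical paired time series: modeled vs. observed bottom dissolved
# oxygen (mg/L) at one estuarine station (e.g., for a hypoxia study).
model = [4.1, 3.2, 2.5, 1.9, 1.4, 1.1]
obs = [3.8, 3.0, 2.1, 1.6, 1.5, 0.9]

n = len(model)
bias = sum(m - o for m, o in zip(model, obs)) / n
rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)

print(f"bias = {bias:+.2f} mg/L, RMSE = {rmse:.2f} mg/L")
```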
In the spring of 2013, NASA conducted a field campaign known as Iowa Flood Studies (IFloodS) as part of the Ground Validation (GV) program for the Global Precipitation Measurement (GPM) mission. The purpose of IFloodS was to enhance the understanding of space-based observations of precipitation processes in flood-related events of the kind that transpire worldwide. NASA used a number of scientific instruments, such as ground-based weather radars, rain and soil moisture gauges, stream gauges, and disdrometers, to monitor rainfall events in Iowa. This article presents the cyberinfrastructure tools and systems that supported the planning, reporting, and management of the field campaign and that allow these data and models to be accessed, evaluated, and shared for research. The authors describe the collaborative informatics tools, suited to network design, that were used to select the locations in which to place the instruments. They also explain how they used information technology tools for instrument monitoring, data acquisition, and visualization after deploying the instruments, and how a different set of tools supported data analysis and modeling after the campaign. All data collected during the campaign are available through the Global Hydrology Resource Center (GHRC), a NASA Distributed Active Archive Center (DAAC).
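As a flavor of the instrument-monitoring tooling described above, the following toy sketch polls the latest report time for each instrument and flags stale sensors. The instrument IDs, report format, and staleness threshold are assumptions for illustration; this is not the IFloodS software.

```python
"""Toy sketch of field-campaign instrument monitoring: check each
instrument's latest report and flag silent or stale sensors.
Names and thresholds are invented for illustration."""

from datetime import datetime, timedelta, timezone

# Hypothetical latest-report timestamps keyed by instrument ID.
last_report = {
    "rain_gauge_07": datetime.now(timezone.utc) - timedelta(minutes=3),
    "disdrometer_02": datetime.now(timezone.utc) - timedelta(hours=2),
    "stream_gauge_11": datetime.now(timezone.utc) - timedelta(minutes=12),
}

STALE_AFTER = timedelta(minutes=30)  # assumed alerting threshold
now = datetime.now(timezone.utc)

for instrument, ts in sorted(last_report.items()):
    status = "OK" if now - ts <= STALE_AFTER else "STALE"
    print(f"{instrument}: last report {now - ts} ago -> {status}")
```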