2020
DOI: 10.1029/2020ms002084

Data‐Driven Super‐Parameterization Using Deep Learning: Experimentation With Multiscale Lorenz 96 Systems and Transfer Learning

Abstract: To make weather and climate models computationally affordable, small-scale processes are usually represented in terms of the large-scale, explicitly resolved processes using physics-based/semi-empirical parameterization schemes. Another approach, computationally more demanding but often more accurate, is super-parameterization (SP). SP involves integrating the equations of small-scale processes on high-resolution grids embedded within the low-resolution grid of large-scale processes. Recently, studies have us…
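For context on the test bed named in the title, below is a minimal sketch of the standard two-scale (multiscale) Lorenz 96 system, in which slow variables X are coupled to fast variables Y. The parameter values (K, J, F, h, c, b), the RK4 integrator, and the time step are common illustrative choices, not necessarily the configuration used in the paper.

```python
import numpy as np

# Two-scale Lorenz 96: K slow variables X, each coupled to J fast variables Y.
# Parameter values are standard illustrative choices (assumption, not the paper's exact setup).
K, J = 8, 32
F, h, c, b = 20.0, 1.0, 10.0, 10.0

def l96_two_scale_tendency(X, Y):
    """Tendencies dX/dt and dY/dt for the coupled two-scale Lorenz 96 system."""
    # Slow variables: advection, damping, forcing, and coupling to the fast scale
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F
          - (h * c / b) * Y.reshape(K, J).sum(axis=1))
    # Fast variables: analogous dynamics on a faster, finer scale, coupled back to X
    dY = (c * b * np.roll(Y, -1) * (np.roll(Y, 1) - np.roll(Y, -2))
          - c * Y
          + (h * c / b) * np.repeat(X, J))
    return dX, dY

def step_rk4(X, Y, dt=0.001):
    """One fourth-order Runge-Kutta step for the coupled system."""
    k1x, k1y = l96_two_scale_tendency(X, Y)
    k2x, k2y = l96_two_scale_tendency(X + 0.5 * dt * k1x, Y + 0.5 * dt * k1y)
    k3x, k3y = l96_two_scale_tendency(X + 0.5 * dt * k2x, Y + 0.5 * dt * k2y)
    k4x, k4y = l96_two_scale_tendency(X + dt * k3x, Y + dt * k3y)
    X_new = X + (dt / 6.0) * (k1x + 2 * k2x + 2 * k3x + k4x)
    Y_new = Y + (dt / 6.0) * (k1y + 2 * k2y + 2 * k3y + k4y)
    return X_new, Y_new

# Example: spin up from small random perturbations around the forcing value
rng = np.random.default_rng(0)
X = F * np.ones(K) + rng.normal(scale=0.1, size=K)
Y = rng.normal(scale=0.1, size=K * J)
for _ in range(1000):
    X, Y = step_rk4(X, Y)
```

In an SP-like configuration, the Y equations are integrated only on small high-resolution domains attached to each X_k; the data-driven variant described in the abstract replaces that inner integration with a learned emulator.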


Cited by 54 publications (51 citation statements). References 71 publications (118 reference statements).
“…The first attempts started with simple FCNN methods, followed by complex networks such as CNN, RNN, and GAN models. With respect to the training set, early works used end-to-end training borrowed from the computer vision area, which requires a large number of annotated labels, while recent works have started to consider unsupervised learning (He et al., 2018) and the combination of DL with a physical model (Chattopadhyay et al., 2020; Wu & McMechan, 2019). In 2020, more works focused on the uncertainty of DL methods (Cao et al., 2020; Grana et al., 2020; Mousavi & Beroza, 2020a).…”
Section: The Development Trends of DL in Geophysics
Mentioning confidence: 99%
“…For instance, on the STanford EArthquake Data set (STEAD), the earthquake detection accuracy is improved to 100%, compared to the 91% accuracy of the traditional STA/LTA (short-time average over long-time average) method (Mousavi, Zhu, Sheng, et al., 2019; Mousavi et al., 2020). DL makes characterizing the earth with high resolution on a large scale possible (Chattopadhyay et al., 2020; Chen et al., 2019; Zhang, Stanev, & Grayek, 2020). DL can even be used for discovering physical concepts (Iten et al., 2020), such as the fact that the solar system is heliocentric.…”
Mentioning confidence: 99%
“…Further opportunities and perspectives for the use of data-driven methods for geosciences may be found in the reviews of [32,36]. Before proceeding, we note that a vast majority of the data-driven developments for geophysical forecasting have involved the use of variants of deep learning methods, for example ResNets [37], CapsuleNets [22], U-Nets [31], long short-term memory networks (LSTMs) [33], convolutional-LSTMs [34], neural ordinary differential equations (ODEs) [38,39] and local or global fully connected deep neural networks [32]. These methods, while exceptionally powerful in learning complex functions, hamper interpretability and require large computational resources for optimization.…”
Section: Related Work
Mentioning confidence: 99%
“…A third approach is a Linear Inverse Modeling framework (Newman et al., 2003; Martinez-Villalobos et al., 2017), where the predictive modes are represented as covariance functions in a reduced space (e.g., functions of PCAs). We can also model the system with reduced complexity and represent higher-complexity processes as AI-driven stochastic processes (Chattopadhyay et al., 2020; Crommelin and Edeling, 2020; Alcala and Timofeyev, 2020; Leinonen et al., 2020). To characterize noise relevant for predicting high-frequency signals, convection-resolving simulations such as the DYAMOND ensemble (Stevens et al., 2019) provide comprehensive data coverage to characterize variability in small-scale processes (Christensen, 2020).…”
Section: (A) The Stochastic Surrogate Models
Mentioning confidence: 99%
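To make the idea in the last excerpt concrete (a reduced-complexity model whose unresolved processes are represented by an AI-driven stochastic term), here is a minimal hypothetical sketch on the single-scale Lorenz 96 system. The linear closure, noise level, and parameter values are placeholders for illustration only; the cited works use trained neural networks or other learned stochastic processes rather than this toy fit.

```python
import numpy as np

# Hypothetical sketch: a reduced (single-scale) Lorenz 96 model in which the
# unresolved subgrid tendency is represented by a data-driven stochastic closure.
# The linear closure and Gaussian noise below are placeholders, not a scheme
# from the cited works.
def resolved_tendency(X, F=20.0):
    """Resolved slow dynamics: advection, damping, and forcing."""
    return np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2)) - X + F

def learned_closure_mean(X, slope=-0.8, intercept=0.0):
    """Mean subgrid coupling as a function of the resolved state (placeholder fit)."""
    return slope * X + intercept

def step_reduced(X, rng, dt=0.005, noise_std=0.3):
    """Euler-Maruyama step: deterministic drift plus stochastic subgrid noise."""
    drift = resolved_tendency(X) + learned_closure_mean(X)
    return X + dt * drift + noise_std * np.sqrt(dt) * rng.normal(size=X.shape)

# Example integration of the reduced stochastic model
rng = np.random.default_rng(0)
X = 20.0 * np.ones(8) + rng.normal(scale=0.1, size=8)
for _ in range(2000):
    X = step_reduced(X, rng)
```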