2018
DOI: 10.1073/pnas.1810286115
Deep learning to represent subgrid processes in climate models

Abstract: Significance: Current climate models are too coarse to resolve many of the atmosphere’s most important processes. Traditionally, these subgrid processes are heuristically approximated in so-called parameterizations. However, imperfections in these parameterizations, especially for clouds, have impeded progress toward more accurate climate predictions for decades. Cloud-resolving models alleviate many of the gravest issues of their coarse counterparts but will remain too computationally demanding for climate chan…
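The paper’s core idea is to train a neural network on output from a cloud-resolving (superparameterized) model and use it in place of a conventional subgrid parameterization. The following is a minimal sketch of that setup, not the authors’ exact architecture: the input/output sizes, layer widths, and stand-in training data below are all illustrative assumptions.

```python
# Minimal sketch of a neural-network parameterization (illustrative only):
# map a coarse-grid column state to subgrid tendencies.
import numpy as np
import tensorflow as tf

N_INPUTS = 94    # assumed: stacked profiles (e.g., T, q) plus surface fields
N_OUTPUTS = 65   # assumed: subgrid heating/moistening tendency profiles

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(N_INPUTS,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(N_OUTPUTS),  # linear output: tendencies can be signed
])
model.compile(optimizer="adam", loss="mse")

# In practice, training pairs come from a cloud-resolving run:
# X = coarse-grained column states, y = the subgrid tendencies the CRM produced.
X = np.random.randn(1024, N_INPUTS).astype("float32")   # stand-in data
y = np.random.randn(1024, N_OUTPUTS).astype("float32")  # stand-in data
model.fit(X, y, epochs=2, batch_size=128, verbose=0)
```

Once trained, such a network is evaluated inside the climate model each time step, which is why its inference cost (a few dense matrix multiplications) matters so much relative to the cloud-resolving model it emulates.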

Cited by 649 publications (742 citation statements)
References 36 publications
“…While the correlation and MSE summary statistics are very similar for both the DNN and Random Forest regression, the Random Forest prediction time is 27% slower than the DNN. Though the model prediction time is not perfectly optimized, these results are consistent with previous work demonstrating rapid computation from DNNs (Rasp et al., 2018) and slower implementations of Random Forests (Keller & Evans). Given the computational challenges many models already face, the lack of process-based information currently available in models of V_d, and the great potential for DNN model portability and retraining (Chollet & Allaire), we believe that the application of a DNN for this purpose is well justified.…”
Section: Results (supporting)
confidence: 87%
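The excerpt’s speed comparison can be reproduced in spirit with a quick wall-clock benchmark of batched predictions. This is a rough sketch under assumed model sizes and synthetic data, not the cited study’s models or measurements:

```python
# Rough inference-time comparison: small MLP vs. random forest (illustrative).
import time
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X_train = rng.standard_normal((5000, 20))
y_train = rng.standard_normal(5000)
X_test = rng.standard_normal((100000, 20))  # large batch, as in model time stepping

rf = RandomForestRegressor(n_estimators=100, n_jobs=-1).fit(X_train, y_train)
nn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=200).fit(X_train, y_train)

for name, m in [("random forest", rf), ("MLP", nn)]:
    t0 = time.perf_counter()
    m.predict(X_test)
    print(f"{name}: {time.perf_counter() - t0:.3f} s")
```

The qualitative outcome usually matches the excerpt: forest prediction walks many trees per sample, while an MLP reduces to a few dense matrix products, so relative timings depend heavily on tree count and layer widths.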
“…Several aspects of our coarse-graining procedure could cause the mean state to drift, which Rasp et al. (2018) did not observe with their SP data. First, unlike in SP, NG-Aqua contains many subgrid-scale sources of momentum, which we neglected altogether.…”
Section: Results (mentioning)
confidence: 99%
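For readers unfamiliar with the term, coarse-graining typically means block-averaging a high-resolution field onto the coarse model grid; the subgrid signal is whatever the average discards. A minimal sketch of that operation follows; it assumes a simple 2-D field and an evenly dividing block size, and is not the cited NG-Aqua workflow:

```python
# Minimal block-average coarse-graining sketch (illustrative assumptions).
import numpy as np

def coarse_grain(field: np.ndarray, factor: int) -> np.ndarray:
    """Average non-overlapping factor x factor blocks of a 2-D field."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0, "grid must divide evenly"
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

hi_res = np.random.randn(512, 1024)       # stand-in high-resolution field
lo_res = coarse_grain(hi_res, factor=8)   # 64 x 128 coarse grid
```

Whatever the averaging discards (here including subgrid momentum transport, which the excerpt notes was neglected) must be supplied by the learned parameterization, or the coarse model’s mean state can drift.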
“…For example, Rasp and Lerch used neural networks (NNs) to successfully improve postprocessing of GCM forecasts to surface stations, while Rodrigues et al. demonstrated the ability of deep NNs to downscale GCM output to higher horizontal resolution. Deep NNs have also been used to identify extreme weather and climate patterns in observed and modeled atmospheric states (Kurth et al.; Lagerquist et al.; Liu et al.), improve parameterizations in GCMs (e.g., Brenowitz & Bretherton; Rasp et al., 2018), and predict uncertainty in weather forecasts (Scher & Messori). Larraondo et al. demonstrated the ability of deep NNs to extract spatial patterns in precipitation from gridded atmospheric fields.…”
Section: Introduction (mentioning)
confidence: 99%