2019
DOI: 10.1029/2019gl083662

Improving Atmospheric River Forecasts With Machine Learning

Abstract: This study tests the utility of convolutional neural networks as a postprocessing framework for improving the National Centers for Environmental Prediction Global Forecast System's integrated vapor transport forecast field in the Eastern Pacific and western United States. Integrated vapor transport is the characteristic field of atmospheric rivers, which provide over 65% of yearly precipitation at some western U.S. locations. The method reduces full‐field root‐mean‐square error (RMSE) at forecast leads from 3…
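The abstract describes a convolutional neural network used to postprocess gridded GFS integrated vapor transport (IVT) forecasts toward an analysis-quality target. Below is a minimal sketch of that kind of setup, assuming a simple residual-correction architecture; the layer sizes, patch dimensions, synthetic data, and the tensorflow.keras API are illustrative assumptions, not the authors' published configuration.

```python
# Minimal sketch of a CNN post-processor for gridded IVT forecasts.
# Assumptions: inputs are (lat, lon, 1) patches of forecast IVT, targets are
# the corresponding analysis IVT fields; architecture and hyperparameters
# are illustrative only.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def build_ivt_postprocessor(ny=64, nx=64):
    """Fully convolutional network mapping a forecast IVT patch to a corrected patch."""
    inputs = layers.Input(shape=(ny, nx, 1))
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    # Predict a correction and add it back to the raw forecast (residual learning).
    correction = layers.Conv2D(1, 3, padding="same")(x)
    outputs = layers.Add()([inputs, correction])
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")  # MSE loss targets the RMSE metric
    return model

# Synthetic example: forecast patches and "analysis" targets.
x_train = np.random.rand(128, 64, 64, 1).astype("float32")
y_train = x_train + 0.05 * np.random.randn(128, 64, 64, 1).astype("float32")
model = build_ivt_postprocessor()
model.fit(x_train, y_train, epochs=2, batch_size=16, verbose=0)
```

Learning an additive correction to the raw forecast, rather than the full field, keeps the network in the postprocessing role the abstract describes, and the mean-squared-error loss directly targets the reported RMSE reduction.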

Cited by 72 publications (66 citation statements)
References 51 publications (50 reference statements)
“…In addition, future work focusing on large ensemble operational hindcast generation would benefit the reliability and robustness of the statistical relationships suggested in this and related published work. Finally, exploiting postprocessing and machine learning techniques applied to S2S hindcast data (e.g., Chapman et al., 2019) has the potential to improve prediction skill of uncalibrated hindcast data from multiple operational centers.…”
Section: Future Directions
confidence: 99%
“…Thus, time-integrated (4 time-steps) daily ARs data has been generated from six-hourly reanalysis datasets using IVT (kg m⁻¹ s⁻¹), normalized IVT (kg m⁻¹ s⁻¹ K⁻¹), IWV (mm) and normalized IWV (mm K⁻¹) from the surface to 750 hPa, 500 hPa and 300 hPa. Temperature normalization is done to understand the change in the thermodynamic component of IVT and IWV using the Clausius-Clapeyron equation (3), which states that the water-vapour content of saturated air, q*, increases nearly exponentially with temperature T (Payne et al., 2020).…”
Section: Integrated Vapor Transport (IVT)
confidence: 99%
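The excerpt above computes IVT and IWV from pressure-level reanalysis fields and normalizes them by temperature via the Clausius-Clapeyron relation. A rough sketch of those column integrals under the standard definitions follows; the level ordering, the NumPy trapezoidal integration, and the simple division by a column-mean temperature are assumptions for illustration, not the cited study's exact procedure.

```python
# Sketch: IVT (kg m^-1 s^-1) and IWV (mm) from pressure-level fields, plus a
# simple temperature normalization (units kg m^-1 s^-1 K^-1 and mm K^-1).
# The column-mean-temperature normalization is an illustrative assumption.
import numpy as np

G = 9.81  # gravitational acceleration, m s^-2

def ivt_iwv(q, u, v, t, p_hpa, p_top_hpa=300.0):
    """q: specific humidity (kg/kg); u, v: winds (m/s); t: temperature (K);
    all shaped (nlev, ny, nx). p_hpa: 1-D pressure levels in hPa."""
    mask = p_hpa >= p_top_hpa                      # keep surface ... p_top
    p_pa = p_hpa[mask] * 100.0
    qm, um, vm, tm = q[mask], u[mask], v[mask], t[mask]
    order = np.argsort(p_pa)                       # integrate with p increasing
    p_pa = p_pa[order]
    qm, um, vm, tm = qm[order], um[order], vm[order], tm[order]
    # Column integrals (1/g) * integral of q*u, q*v, and q over pressure.
    ivt_u = np.trapz(qm * um, p_pa, axis=0) / G
    ivt_v = np.trapz(qm * vm, p_pa, axis=0) / G
    ivt = np.hypot(ivt_u, ivt_v)                   # kg m^-1 s^-1
    iwv = np.trapz(qm, p_pa, axis=0) / G           # kg m^-2, i.e. mm of water
    t_col = tm.mean(axis=0)                        # column-mean temperature (K)
    return ivt, iwv, ivt / t_col, iwv / t_col      # raw and normalized fields
```

The different integration tops mentioned in the excerpt (750, 500 and 300 hPa) correspond to changing the p_top_hpa argument in this sketch.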
“…Data obtained from both satellites and statistical methods have limitations in forecasting the landfall and intensity of ARs well in advance. In recent times machine learning techniques (Chapman et al., 2019; Kashinath et al., 2020) have evolved as other alternatives. However, the average error in estimating the intensity of ARs through IVT is around 40–60 kg m⁻¹ s⁻¹ using different sources of data including data from reanalysis and amounts to 22% of mean observed flux (Chapman et al., 2019; Lavers et al., 2018).…”
confidence: 99%
“…In this section, nine days (11–19 March 2019) of S-NPP CSMs were generated by the FCDN-CSM model to evaluate the model performance by checking the stability and accuracy of the O-M biases. Figure 6 shows the O-M error bars with STDs for the five TEB M-bands, using ACSPO CSM and FCDN-CSM to identify clear-sky pixels.…”
Section: Stability of the FCDN-CSM
confidence: 99%
“…With cutting-edge artificial intelligence (AI) evolving rapidly, deep learning (DL), one of the most popular AI methods, has made a remarkable difference in many science and engineering fields. Its application in remote sensing [13,14] and numerical weather prediction [15–18] is also being explored. Deep learning is constructed using artificial neural networks (ANNs), including more than one hidden layer, with a so-called "deep" neural network distinguished from a "shallow" neural network.…”
Section: Introduction
confidence: 99%