2021
DOI: 10.1016/j.apr.2021.101079
Emulation of an atmospheric gas-phase chemistry solver through deep learning: Case study of Chinese Mainland

Cited by 11 publications (18 citation statements)
References 29 publications
“…The online ML solver embedded within GEOS-Chem performs the chemical integration 5× faster than the reference Super-Fast simulation (single Intel Broadwell CPU core; 2.10 GHz). This speedup is smaller than in Kelp et al. (2020) and Liu et al. (2021) because the Super-Fast mechanism is simpler and because of the overhead in accessing Python code at each time step. Further speedup could be achieved by reading the trained ML solver parameters through text files or by writing them in Fortran.…”
Section: One-Year Simulation Testing of Online ML Solver (citation type: mentioning)
Confidence: 93%
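As a rough illustration of the parameter-export idea in the statement above, the following Python sketch (not from the cited papers; the layer shapes and file names are assumptions) writes a trained network's weights and biases to plain text files that a Fortran host model could read, avoiding the per-time-step Python overhead.

```python
# Minimal sketch (illustrative only): export trained neural-network solver
# parameters to plain text files, one per layer, so a Fortran host model
# could read them directly instead of calling back into Python at each
# chemistry time step. Layer shapes and file names are assumptions.
import numpy as np

def export_weights_to_text(layers, prefix="ml_solver_layer"):
    """Write each layer's weight matrix and bias vector as whitespace-delimited text."""
    for i, (weights, bias) in enumerate(layers):
        np.savetxt(f"{prefix}{i}_W.txt", weights)               # (n_in, n_out)
        np.savetxt(f"{prefix}{i}_b.txt", bias.reshape(1, -1))   # (1, n_out)

# Two hypothetical dense layers: chemical state in, concentration change out.
rng = np.random.default_rng(0)
layers = [
    (rng.normal(size=(80, 128)), rng.normal(size=128)),
    (rng.normal(size=(128, 80)), rng.normal(size=80)),
]
export_weights_to_text(layers)
```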
“…First, random forest algorithms are much slower. Keller and Evans (2019) found that their random forest solver was 85% slower than the reference Rosenbrock solver, while neural networks should be much faster (Kelp et al., 2020; Liu et al., 2021). Second, random forests are not easily amenable to online training because the growing of the architecture to incorporate more trees and branches further slows performance (Lakshminarayanan et al., 2015), whereas online neural network training simply updates parameters.…”
Section: Offline and Online Training (citation type: mentioning)
Confidence: 99%
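The contrast drawn above, that online neural-network training simply updates existing parameters rather than growing the model, can be shown with a minimal sketch; the single linear layer, shapes, and learning rate below are illustrative assumptions, not the solvers from the cited papers.

```python
# Illustrative sketch: "online" training of a neural-network solver is an
# incremental gradient step on each new batch, so the parameter arrays keep a
# fixed size, whereas a random forest would have to grow new trees/branches to
# absorb the same data. All shapes here are assumptions.
import numpy as np

def online_update(W, b, x_batch, y_batch, lr=1e-3):
    """One SGD step for a linear layer y = x @ W + b under 0.5 * mean-squared error."""
    err = x_batch @ W + b - y_batch            # (n_samples, n_out)
    grad_W = x_batch.T @ err / len(x_batch)    # (n_in, n_out)
    grad_b = err.mean(axis=0)                  # (n_out,)
    return W - lr * grad_W, b - lr * grad_b    # parameters updated, no retraining

rng = np.random.default_rng(0)
W, b = rng.normal(size=(80, 80)) * 0.01, np.zeros(80)
x_new = rng.normal(size=(32, 80))              # fresh batch of model chemical states
y_new = rng.normal(size=(32, 80))              # reference-solver targets for that batch
W, b = online_update(W, b, x_new, y_new)
```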
“…Liu et al. (2021) developed a gas-phase neural network solver for the CMAQ regional CTM over China, combining a standard implicit solver for radicals and oxidants with an ML solver for volatile organic compounds (VOCs). They achieved an order of magnitude speedup over a 1-month simulation but with error growth over remote ocean grid cells.…”
Section: Introduction (citation type: mentioning)
Confidence: 99%
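A hedged sketch of the hybrid strategy described in this statement follows; the species split, function names, and emulator interface are assumed for illustration rather than taken from Liu et al. (2021).

```python
# Hedged sketch of the hybrid scheme: radicals and oxidants are advanced with a
# conventional implicit solver, the remaining VOC species with a trained ML
# emulator, and the two partial updates are merged. Indices and interfaces are
# hypothetical.
import numpy as np

N_SPECIES = 80
RADICAL_IDX = np.array([0, 1, 2])          # e.g. OH, HO2, O3 positions (assumed)
VOC_IDX = np.arange(3, N_SPECIES)          # remaining organic species (assumed)

def hybrid_chemistry_step(conc, dt, implicit_solver, ml_emulator):
    """Advance one grid cell's concentration vector by dt with the split scheme."""
    new_conc = conc.copy()
    # stiff radicals/oxidants: standard implicit (e.g. Rosenbrock-type) integration
    new_conc[RADICAL_IDX] = implicit_solver(conc, dt)[RADICAL_IDX]
    # VOCs: ML emulator predicts the concentration change over dt
    new_conc[VOC_IDX] = conc[VOC_IDX] + ml_emulator(conc, dt)
    return new_conc

# Stand-in callables just to show the calling convention.
conc0 = np.full(N_SPECIES, 1e-9)
out = hybrid_chemistry_step(conc0, 600.0,
                            implicit_solver=lambda c, dt: c,
                            ml_emulator=lambda c, dt: np.zeros(VOC_IDX.size))
```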