2023
DOI: 10.1029/2023wr034420

Identifying Structural Priors in a Hybrid Differentiable Model for Stream Water Temperature Modeling

Farshid Rahmani,
Alison Appling,
Dapeng Feng
et al.

Abstract: Although deep learning models for stream temperature (Ts) have recently shown exceptional accuracy, they have limited interpretability and cannot output untrained variables. With hybrid differentiable models, neural networks (NNs) can be connected to physically based equations (called structural priors) to output intermediate variables such as water source fractions (specifying what portion of water is groundwater, subsurface, and surface flow). However, it is unclear if such outputs are physically meaningful …
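The hybrid architecture the abstract describes — an NN feeding a physically based mixing equation so that gradients flow through the physics back to the network — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the layer sizes, forcing dimension, end-member temperatures, and all variable names are assumptions for the sketch.

```python
# Minimal sketch (not the paper's model) of a hybrid differentiable stream
# temperature model: an NN predicts water-source fractions, which feed a
# physically based mixing equation (the "structural prior"). Because the
# physics is written in a differentiable framework, training the end-to-end
# model also trains the interpretable intermediate outputs (the fractions).
import torch
import torch.nn as nn

class HybridStreamTemp(nn.Module):
    def __init__(self, n_forcings: int = 5):
        super().__init__()
        # NN maps meteorological forcings to 3 source-fraction logits
        # (groundwater, shallow subsurface, surface runoff)
        self.net = nn.Sequential(
            nn.Linear(n_forcings, 16),
            nn.ReLU(),
            nn.Linear(16, 3),
        )

    def forward(self, forcings, t_gw, t_ss, t_sf):
        # Softmax enforces the structural prior that fractions are
        # non-negative and sum to 1
        frac = torch.softmax(self.net(forcings), dim=-1)
        # Mixing equation (structural prior): Ts = sum_i frac_i * T_i
        sources = torch.stack([t_gw, t_ss, t_sf], dim=-1)
        ts = (frac * sources).sum(dim=-1)
        return ts, frac  # frac is the untrained-but-interpretable output

model = HybridStreamTemp()
x = torch.randn(4, 5)          # 4 sites, 5 (hypothetical) forcing variables
t_gw = torch.full((4,), 10.0)  # groundwater end-member temperature (deg C)
t_ss = torch.full((4,), 15.0)  # shallow-subsurface end-member
t_sf = torch.full((4,), 22.0)  # surface-runoff end-member
ts, frac = model(x, t_gw, t_ss, t_sf)
ts.sum().backward()            # gradients flow through the physics to the NN
```

The design choice this illustrates: only Ts needs observations for training, yet the softmax-plus-mixing structure forces the NN's internal output to be a valid set of source fractions — whether those fractions are physically meaningful is exactly the question the paper investigates.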

Cited by 3 publications (1 citation statement)
References 103 publications
“…Differentiable models can also extrapolate better in space and time than purely data-driven deep networks (Feng et al., 2023). These methods are broadly applicable (including for estimation of parameters in ecosystem (Aboelyazeed et al., 2023) and stream temperature (Rahmani et al., 2023) modeling) and allow us to flexibly discover variable relationships within the model based on big data, enabling improved transparency compared to standard deep learning models.…”
mentioning
confidence: 99%