2021
DOI: 10.1029/2021gl095729

A Novel Spatial Downscaling Approach for Climate Change Assessment in Regions With Sparse Ground Data Networks

Abstract: Due to the increased climate variability caused by global warming, the number of natural disasters has risen over the past four decades (Brown et al., 2008). Extreme events beyond the historical record and the bounds of natural variability have led to human casualties, property damage, and socioeconomic problems, creating international disagreements (AghaKouchak et al., 2014; Meehl et al., 2000; Oki & Kanae, 2006). It has become increasingly important to consider the changes in extreme events to design f…

Cited by 6 publications (1 citation statement); references 76 publications (101 reference statements).
“…A variety of BC methods with different levels of complexity and performance have been developed and implemented for both global and regional climate simulations (François et al., 2020; Y.-T. Kim et al., 2021; Teutschbein & Seibert, 2012). Generally, their aim is to correct certain features in the target's distribution, such as the simple statistics of the mean (Linear Scaling, LS; Teutschbein & Seibert, 2012) and variance (Variance Scaling, VA; Chen & Dudhia, 2001), or the more advanced quantiles (Quantile Mapping, QM) for adjusting the entire distribution by parametric (PQM) or empirical (EQM) transformation (Gudmundsson et al., 2012; Switanek et al., 2017).…”
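As context for the bias-correction (BC) methods named in this statement, the following is a minimal sketch of Linear Scaling and Empirical Quantile Mapping in Python with NumPy. The function names, the additive form of the correction, and the synthetic data are assumptions for illustration only; they are not the implementations used in the cited papers.

```python
import numpy as np

def linear_scaling(model_hist, obs_hist, model_fut):
    """Linear Scaling (LS): shift model output so its historical mean
    matches the observed mean. Additive form (as commonly used for
    temperature) is an illustrative assumption here."""
    bias = np.mean(obs_hist) - np.mean(model_hist)
    return model_fut + bias

def empirical_quantile_mapping(model_hist, obs_hist, model_fut):
    """Empirical Quantile Mapping (EQM): map each future model value
    through the empirical CDF of the historical model run, then invert
    with the observed empirical quantiles. Unlike LS, this adjusts the
    entire distribution, not just the mean."""
    # Non-exceedance probability of each future value under the
    # historical model distribution.
    sorted_model = np.sort(model_hist)
    probs = np.searchsorted(sorted_model, model_fut) / len(sorted_model)
    probs = np.clip(probs, 0.0, 1.0)
    # Invert through the observed empirical quantile function.
    return np.quantile(obs_hist, probs)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    obs_hist = rng.normal(15.0, 3.0, 1000)    # synthetic "observations"
    model_hist = rng.normal(13.0, 4.0, 1000)  # biased model, historical period
    model_fut = rng.normal(14.0, 4.0, 1000)   # biased model, future period

    ls = linear_scaling(model_hist, obs_hist, model_fut)
    eqm = empirical_quantile_mapping(model_hist, obs_hist, model_fut)
    print(f"raw future mean/std: {model_fut.mean():.2f} / {model_fut.std():.2f}")
    print(f"LS-corrected:        {ls.mean():.2f} / {ls.std():.2f}")
    print(f"EQM-corrected:       {eqm.mean():.2f} / {eqm.std():.2f}")
```

Running the sketch shows the trade-off the statement describes: LS matches the observed mean but leaves the model's variance untouched, while EQM pulls both the mean and the spread toward the observed distribution.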