2017
DOI: 10.5194/npg-2017-52
Preprint

Feature-based data assimilation in geophysics

Abstract: Many applications in science require that computational models and data be combined. In a Bayesian framework, this is usually done by defining likelihoods based on the mismatch of model outputs and data. However, matching model outputs and data in this way can be unnecessary or impossible. For example, using large amounts of steady-state data is unnecessary because these data are redundant, it is numerically difficult to assimilate data in chaotic systems, and it is often impossible to assimilate data …
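The abstract contrasts likelihoods built on a direct mismatch of model outputs and data with likelihoods built on features extracted from both. As a rough illustration of that idea (a minimal sketch, not the paper's own formulation), the snippet below evaluates a Gaussian log-likelihood on a feature map F applied to the data and to the model output; the forward model G, the feature map F, and the noise scale sigma are all illustrative placeholders.

```python
import numpy as np

def feature_log_likelihood(theta, data, G, F, sigma=1.0):
    """Gaussian log-likelihood on features rather than on raw model output.

    theta : model parameters
    data  : observed raw data (e.g. a time series)
    G     : forward model, G(theta) -> simulated output (placeholder)
    F     : feature map, F(raw) -> low-dimensional feature vector (placeholder)
    sigma : assumed noise scale on the features
    """
    f_data = np.atleast_1d(F(data))       # compress the data into features
    f_model = np.atleast_1d(F(G(theta)))  # compress the model output the same way
    resid = f_model - f_data
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Toy usage: the "feature" is the time average, so a thousand redundant
# steady-state samples collapse into a single informative number.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = 2.0 + 0.1 * rng.standard_normal(1000)   # noisy steady-state data
    G = lambda theta: np.full(1000, theta)          # trivial model: constant output
    F = lambda x: np.mean(x)                        # feature: time average
    for theta in (1.5, 2.0, 2.5):
        print(theta, feature_log_likelihood(theta, data, G, F, sigma=0.1))
```

In the toy example the time-average feature discards the redundancy of the steady-state samples while retaining what they say about the parameter, which is the point the abstract makes about large amounts of steady-state data.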

Cited by 11 publications (14 citation statements), published 2018–2022
References 38 publications

“…In parameter estimation problems for chaotic dynamical systems, such as those arising in climate modeling [12,34,65], data may only be available in time-averaged form; or it may be desirable to study time-averaged quantities in order to ameliorate difficulties arising from the complex objective functions, with multiple local minima, which arise from trying to match trajectories [1]. Indeed the idea fits the more general framework of feature-based data assimilation introduced in [51] which, in turn, is closely related to the idea of extracting sufficient statistics from the raw data [21]. The methodology developed in this section underpins similar work conducted for a complex climate model described in the paper [12].…”
Section: Time-averaged Data (mentioning, confidence: 99%)
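The excerpt above describes trading trajectory matching for time-averaged quantities when estimating parameters of chaotic systems. A minimal sketch of that strategy, using the Lorenz-63 system as a stand-in for the climate models in the cited works (the integrator, averaging window, and chosen statistics are illustrative assumptions, not the setups of those papers), could look like this:

```python
import numpy as np

def lorenz63_time_averages(rho, sigma=10.0, beta=8.0 / 3.0,
                           dt=0.005, n_steps=40_000, n_spinup=4_000):
    """Integrate Lorenz-63 (forward Euler; a higher-order scheme would be
    preferable in practice) and return two time-averaged statistics:
    the mean of z and the mean of x**2."""
    x = np.array([1.0, 1.0, 1.0])
    sums = np.zeros(2)
    for k in range(n_steps):
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        x = x + dt * dx
        if k >= n_spinup:                  # discard transient before averaging
            sums += [x[2], x[0] ** 2]
    return sums / (n_steps - n_spinup)

def time_average_misfit(rho, reference_features, noise=1.0):
    """Least-squares misfit between model and reference time averages."""
    f = lorenz63_time_averages(rho)
    return 0.5 * np.sum((f - reference_features) ** 2) / noise ** 2

if __name__ == "__main__":
    ref = lorenz63_time_averages(rho=28.0)        # synthetic "truth"
    for rho in (24.0, 26.0, 28.0, 30.0):
        print(rho, time_average_misfit(rho, ref))
```

Because long-time averages tend to vary much more smoothly with the parameter than pointwise trajectory mismatches do for a chaotic system, an objective of this form tends to be far better behaved, which is the motivation given in the excerpt.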
“…Rather, we extract "features" from the data and find model parameters such that the model produces comparable features. The features are based on PSDs of the Sint-2000, PADM2M and CALS10k.2 data sets as well as the reversal rate, time average VADM and VADM standard deviation (see Morzfeld et al (2018) for a more in depth explanation of feature-based approaches to data assimilation).…”
Section: Feature-based Likelihoods (mentioning, confidence: 99%)
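The features named in this excerpt (PSDs, reversal rate, time-averaged VADM and its standard deviation) can be computed in a few lines. The sketch below applies such a feature extraction to a synthetic signed dipole series rather than to the Sint-2000, PADM2M or CALS10k.2 records; the PSD binning, sampling interval, and synthetic model are illustrative assumptions, not the choices made in the cited papers.

```python
import numpy as np

def dipole_features(dipole, dt_kyr=1.0, n_psd_bins=5):
    """Compress a signed dipole-moment time series into a short feature vector.

    Returns the reversal rate (sign changes per Myr), the time-averaged
    |dipole| (a VADM-like quantity), its standard deviation, and a coarsely
    binned periodogram as a stand-in for PSD-based features.
    """
    signs = np.sign(dipole)
    reversals = np.count_nonzero(np.diff(signs) != 0)
    duration_myr = len(dipole) * dt_kyr / 1000.0
    reversal_rate = reversals / duration_myr

    vadm = np.abs(dipole)
    mean_vadm, std_vadm = vadm.mean(), vadm.std()

    # crude PSD surrogate: periodogram of the signed series, averaged into coarse bins
    psd = np.abs(np.fft.rfft(dipole - dipole.mean())) ** 2 / len(dipole)
    bins = np.array_split(psd[1:], n_psd_bins)
    psd_features = np.log([b.mean() for b in bins])

    return np.concatenate([[reversal_rate, mean_vadm, std_vadm], psd_features])

if __name__ == "__main__":
    # synthetic example: a noisy dipole attracted to +/-1 that occasionally flips
    rng = np.random.default_rng(1)
    x, series = 1.0, []
    for _ in range(20_000):                 # 20 Myr at 1 kyr sampling
        x += -0.05 * (x - np.sign(x)) + 0.12 * rng.standard_normal()
        series.append(x)
    print(dipole_features(np.array(series)))
```

A feature-based likelihood of the kind described in the first excerpt would then compare this feature vector, computed from model output, with the same vector computed from the paleomagnetic records.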
“…Buffett and Matsui (2015) derived an extension of the B13 model to extend it to time scales of thousands of years, by adding a time-correlated noise process. An extension of B13 to represent changes in reversal rates over the past 150 Myrs is considered by Morzfeld et al (2018). Its use for predicting the probability of an imminent reversal of Earth's dipole is described by Morzfeld et al (2017); Buffett and Davis (2018).…”
(mentioning, confidence: 99%)
“…Thus, it is not straightforward to compare KTF17 to LES output. We address this issue by using "feature-based" likelihoods (Maclean et al, 2017;Morzfeld et al, 2018). The basic idea is that compressing the data into suitable features can bridge gaps between drastically simplified models and complex processes.…”
(mentioning, confidence: 99%)