EAGE 2020 Annual Conference & Exhibition Online
DOI: 10.3997/2214-4609.202010770
A Deep-Learning Based Bayesian Approach to Seismic Imaging and Uncertainty Quantification

Cited by 22 publications (15 citation statements); references: 0 publications.
“…Compared to the MAP estimate, the conditional mean, which corresponds to the minimum-variance estimate [85], is less prone to overfitting [86]. This was confirmed empirically for seismic imaging [54,55]. In the experimental sections below, we will provide further evidence of advantages the conditional mean offers compared to MAP estimation.…”
Section: Conditional Mean Estimation (mentioning; confidence: 86%)
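The contrast drawn above between the conditional-mean and MAP estimates comes down to how posterior samples are summarized. The following is a minimal sketch, assuming image samples have already been drawn from the posterior by some sampler; the function name and array shapes are illustrative only.

```python
# Minimal sketch: given samples x_1..x_N from the posterior p(x | y), the
# conditional-mean estimate is their sample average, and the pointwise sample
# standard deviation provides a simple uncertainty map.
import numpy as np

def conditional_mean_and_std(posterior_samples):
    """posterior_samples: array of shape (num_samples, *image_shape)."""
    x_cm = posterior_samples.mean(axis=0)   # conditional-mean (minimum-variance) estimate
    x_std = posterior_samples.std(axis=0)   # pointwise posterior standard deviation
    return x_cm, x_std

# Toy usage with stand-in samples; in practice these would come from an MCMC
# or variational sampler targeting the imaging posterior.
samples = np.random.default_rng(0).normal(size=(200, 64, 64))
x_cm, x_std = conditional_mean_and_std(samples)
```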
“…The need for repeated evaluations of the forward operator, the correlation between consecutive samples [88], and the high dimensionality of the problem are the chief computational challenges for these methods. Despite these difficulties, MCMC methods have been applied successfully in imaging problems including [6,8,9,54,55,89,90].…”
Section: Sampling via Stochastic Gradient Langevin Dynamics (mentioning; confidence: 99%)
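The stochastic gradient Langevin dynamics sampler referenced in this statement admits a compact sketch. The version below assumes a generic negative log-posterior gradient and a fixed step size; the quadratic placeholder stands in for the (expensive) seismic forward and adjoint operators and is not the cited implementation.

```python
# Hedged SGLD sketch: each iterate takes a gradient step on the negative
# log-posterior U(x) and adds Gaussian noise scaled by sqrt(2 * eps), so that
# (for small eps, after burn-in) the iterates approximate posterior samples.
import numpy as np

rng = np.random.default_rng(0)

def grad_neg_log_posterior(x):
    # Placeholder for grad(data misfit + prior); here U(x) = 0.5 * ||x||^2.
    return x

def sgld(x0, eps=1e-2, num_steps=5000, burn_in=1000):
    x, samples = x0.copy(), []
    for k in range(num_steps):
        noise = rng.normal(size=x.shape)
        x = x - eps * grad_neg_log_posterior(x) + np.sqrt(2.0 * eps) * noise
        if k >= burn_in:
            samples.append(x.copy())
    return np.stack(samples)

samples = sgld(np.zeros(16))  # averaging these gives the conditional-mean estimate
```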
“…MCMC methods sample the posterior distribution via a series of random walks in the probability space where the posterior probability density function (PDF) needs to be evaluated or approximated at each step-e.g., via stochastic gradient Langevin dynamics [3]. These sampling methods typically require many steps to traverse the probability space [4][5][6][7][8][9][10][11], which fundamentally limits their applicability to large-scale problems due to costs associated with the forward operator [12][13][14][15]. Alternatively, variational inference methods [16] approximate the posterior distribution with a parametric and easy-to-sample distribution.…”
Section: Introduction (mentioning; confidence: 99%)
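The variational-inference alternative mentioned at the end of this statement replaces sampling with optimization over a parametric family. A minimal sketch, assuming a mean-field Gaussian approximation and a toy log-joint density (both illustrative, not the cited methods):

```python
# Hedged sketch of mean-field variational inference: draw reparameterized
# samples x = mu + sigma * eps from q(x) and form a Monte Carlo estimate of
# the ELBO, which is then maximized over (mu, log_sigma).
import numpy as np

rng = np.random.default_rng(0)

def log_joint(x, y):
    # Placeholder log p(y, x): Gaussian likelihood around y plus a Gaussian prior.
    return -0.5 * np.sum((y - x) ** 2) - 0.5 * np.sum(x ** 2)

def elbo_estimate(mu, log_sigma, y, num_mc=32):
    sigma = np.exp(log_sigma)
    eps = rng.normal(size=(num_mc,) + mu.shape)
    x = mu + sigma * eps                                              # samples from q
    log_q = np.sum(-0.5 * eps**2 - log_sigma - 0.5 * np.log(2 * np.pi), axis=-1)
    log_p = np.array([log_joint(xi, y) for xi in x])
    return np.mean(log_p - log_q)                                     # maximize this

y = rng.normal(size=8)
print(elbo_estimate(np.zeros(8), np.zeros(8), y))
```

In practice the gradient of this objective would be obtained with automatic differentiation and a stochastic optimizer; the sketch only shows the reparameterized sampling and the objective being optimized.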
“…For deep learning algorithms, one way is to embed the existing neural network into a Bayesian framework and to find the posterior distribution [5]. Siahkoohi et al. make use of a deep prior, in the form of a randomly initialized CNN, for seismic imaging and uncertainty quantification [30]. Choi et al. introduce variational dropout as a Bayesian approximation for neural networks and evaluate prediction uncertainty [9].…”
Section: Introduction (mentioning; confidence: 99%)
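The deep-prior idea attributed to Siahkoohi et al. in this statement can be illustrated by reparameterizing the unknown image as the output of an untrained, randomly initialized network with a fixed input; the two-layer toy generator below is a stand-in for a CNN, not the authors' architecture.

```python
# Toy deep-prior reparameterization (illustrative only): write the image as
# x = g(w; z) for a randomly initialized network g with fixed input z, and do
# inference over the weights w rather than over the image directly.
import numpy as np

rng = np.random.default_rng(0)
n_latent, n_hidden, n_pixels = 32, 128, 64 * 64

z = rng.normal(size=n_latent)                       # fixed random input
w1 = 0.1 * rng.normal(size=(n_hidden, n_latent))    # random initial weights
w2 = 0.1 * rng.normal(size=(n_pixels, n_hidden))

def g(w1, w2, z):
    """Tiny stand-in for a CNN generator: two layers with a tanh nonlinearity."""
    return w2 @ np.tanh(w1 @ z)

x = g(w1, w2, z).reshape(64, 64)  # candidate image; a sampler such as SGLD would update w1, w2
```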