2021
DOI: 10.1016/j.ymssp.2021.107986

On spike-and-slab priors for Bayesian equation discovery of nonlinear dynamical systems via sparse linear regression

Cited by 31 publications (9 citation statements); references 57 publications.

“…The Gibbs sampling procedure requires knowledge of the conditional posterior distributions for all the random variables. Nayek et al. (2021) derived analytical expressions for all the conditional posterior distributions, which we briefly summarize here.…”
Section: Model Discovery via Posterior Estimation (mentioning)
Confidence: 99%
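
The conditional posteriors referenced above are derived in full in Nayek et al. (2021). As a rough Python illustration only, under simplified and purely hypothetical assumptions (a fixed slab variance v_slab and a fixed inclusion probability p0 instead of the paper's hyperpriors), a component-wise Gibbs sweep for spike-and-slab linear regression could look like this:

```python
import numpy as np

def spike_slab_gibbs(Phi, y, n_iter=2000, v_slab=10.0, p0=0.1,
                     a0=1e-4, b0=1e-4, seed=0):
    """Toy Gibbs sampler for y = Phi @ theta + e, e ~ N(0, sigma2 * I),
    with a discrete spike-and-slab prior: theta_j = 0 with probability 1 - p0,
    theta_j ~ N(0, v_slab) with probability p0, and an Inverse-Gamma(a0, b0)
    prior on sigma2.  All hyperparameter values are illustrative choices."""
    rng = np.random.default_rng(seed)
    n, p = Phi.shape
    theta = np.zeros(p)
    z = np.zeros(p, dtype=bool)                  # inclusion indicators
    sigma2 = np.var(y)                           # crude initialisation
    theta_draws = np.zeros((n_iter, p))
    z_draws = np.zeros((n_iter, p), dtype=bool)

    for it in range(n_iter):
        for j in rng.permutation(p):
            # residual with column j's current contribution removed
            r = y - Phi @ theta + Phi[:, j] * theta[j]
            phi_j = Phi[:, j]
            # conditional posterior of theta_j given z_j = 1 (Gaussian)
            post_var = 1.0 / (phi_j @ phi_j / sigma2 + 1.0 / v_slab)
            post_mean = post_var * (phi_j @ r) / sigma2
            # log Bayes factor for z_j = 1 vs z_j = 0 (theta_j integrated out)
            log_bf = 0.5 * np.log(post_var / v_slab) + 0.5 * post_mean**2 / post_var
            logit = np.clip(np.log(p0 / (1.0 - p0)) + log_bf, -30.0, 30.0)
            z[j] = rng.random() < 1.0 / (1.0 + np.exp(-logit))
            theta[j] = rng.normal(post_mean, np.sqrt(post_var)) if z[j] else 0.0
        # conditional posterior of the noise variance (Inverse-Gamma)
        resid = y - Phi @ theta
        sigma2 = 1.0 / rng.gamma(a0 + 0.5 * n, 1.0 / (b0 + 0.5 * resid @ resid))
        theta_draws[it], z_draws[it] = theta, z

    return theta_draws, z_draws
```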
“…The physical constraint of linear momentum conservation leads to a residual which defines the likelihood. We use a hierarchical Bayesian model with sparsity-promoting spike-and-slab priors (Nayek et al., 2021) and Monte Carlo sampling to efficiently solve the parsimonious model selection task and discover constitutive models in the form of multivariate, multi-modal posterior probability distributions. Figure 1 summarizes the schematic of the method.…”
Section: Introduction (mentioning)
Confidence: 99%
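
As a toy illustration of the parsimonious model selection step described in that passage (a synthetic sparse-regression problem, not the constitutive-modelling setting of the quoted work), the sampler sketched earlier can be run and its posterior inclusion probabilities read off; terms with probability above 0.5 form the median-probability model:

```python
import numpy as np

# Hypothetical usage of the spike_slab_gibbs sketch above on synthetic data.
rng = np.random.default_rng(1)
n, p = 200, 8
Phi = rng.standard_normal((n, p))                 # candidate feature library
theta_true = np.array([1.5, 0.0, 0.0, -2.0, 0.0, 0.0, 0.0, 0.0])
y = Phi @ theta_true + 0.1 * rng.standard_normal(n)

theta_draws, z_draws = spike_slab_gibbs(Phi, y, n_iter=3000)
burn = 1000
pip = z_draws[burn:].mean(axis=0)                 # posterior inclusion probabilities
selected = np.where(pip > 0.5)[0]                 # median-probability model
print("inclusion probabilities:", np.round(pip, 2))
print("selected terms:", selected)
```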
“…The measurement matrix can be defined as A = ΦΨ, where Φ is the sensing design matrix and Ψ is a proper sparsifying basis. There exist various approaches to solve for the sparse coefficient vector in (1), including greedy-based, convex-based, thresholding-based and sparse Bayesian learning (SBL) algorithms [27–64]. Typically, the performance of CS reconstruction is assessed in terms of the mean-squared reconstruction error.…”
Section: Introduction (mentioning)
Confidence: 99%
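
Of the solver families listed in that passage, a minimal greedy example is Orthogonal Matching Pursuit. The sketch below uses assumed toy dimensions and an identity sparsifying basis, and reports the mean-squared reconstruction error mentioned in the quote:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: a simple greedy solver for y ≈ A @ x
    with at most k non-zero entries in x (illustrative only)."""
    n, p = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(p)
    for _ in range(k):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares refit on the current support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(p)
        x[support] = coef
        residual = y - A @ x
    return x

# toy reconstruction-error check with assumed dimensions
rng = np.random.default_rng(0)
n, p, k = 60, 120, 5
Psi = np.eye(p)                                       # sparsifying basis (identity here)
Phi_sense = rng.standard_normal((n, p)) / np.sqrt(n)  # sensing design matrix
A = Phi_sense @ Psi                                   # measurement matrix
x_true = np.zeros(p)
x_true[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = omp(A, y, k)
print("mean-squared reconstruction error:", np.mean((x_hat - x_true) ** 2))
```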
“…Further, SINDy is computationally efficient and scales well as the dimension of the measured states increases. Bayesian approaches for discovering governing physics from data can also be found in [25,26,27]. These approaches, although robust to noise, are computationally demanding compared with SINDy.…”
Section: Introduction (mentioning)
Confidence: 99%
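
For context on the SINDy comparison in that quote: the core optimizer in SINDy (Brunton et al.) is sequentially thresholded least squares, which repeatedly zeroes out small coefficients and refits. A minimal sketch, with an illustrative threshold and assumed inputs (a candidate library Theta and estimated time derivatives dXdt), is:

```python
import numpy as np

def stlsq(Theta, dXdt, threshold=0.1, n_iter=10):
    """Sequentially thresholded least squares: solve Theta @ Xi ≈ dXdt,
    zero out entries of Xi below `threshold`, and refit the remaining
    terms for each state dimension.  Threshold value is illustrative."""
    Xi, *_ = np.linalg.lstsq(Theta, dXdt, rcond=None)
    for _ in range(n_iter):
        small = np.abs(Xi) < threshold
        Xi[small] = 0.0
        for k in range(dXdt.shape[1]):            # refit each state dimension
            big = ~small[:, k]
            if big.any():
                Xi[big, k], *_ = np.linalg.lstsq(Theta[:, big], dXdt[:, k], rcond=None)
    return Xi
```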