Mixtures of truncated basis functions
2012 · DOI: 10.1016/j.ijar.2011.10.004

Cited by 62 publications (62 citation statements)
References 14 publications
“…An important feature of the technique presented in this paper is that it can be directly applied using frameworks related to MTEs, like the Mixtures of Polynomials (MOPs) [32] and more generally, Mixtures of Truncated Basis Functions (MoTBFs) [19]. This can lead to improvements in inference efficiency, especially using MoTBFs, as they can provide accurate estimations with no need to split the domain of the densities [16].…”
Section: Discussion and Concluding Remarks (mentioning, confidence: 99%)
“…Our goal is to analyse the performance of probabilistic inference in hybrid Bayesian networks in scenarios where data come in streams at high speed, and therefore a quick response is required. Because of that, we will focus our analysis on conditional linear Gaussian (CLG) models [10,11], instead of more expressive alternatives such as mixtures of exponentials [12], mixtures of polynomials [18] and mixtures of truncated basis functions in general [9], as inference in the latter models is in general more time consuming [15].…”
Section: Introduction (mentioning, confidence: 99%)
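The quoted statement contrasts conditional linear Gaussian (CLG) models with the more expressive MoTBF family. As a minimal sketch of what a CLG conditional looks like, the density of a continuous variable X given its continuous parents z is a normal whose mean is linear in z. The function name and parameters below are illustrative, not from the cited papers.

```python
import math

def clg_density(x, z, alpha, beta, sigma):
    """Conditional linear Gaussian density f(x | z):
    X given continuous parents z is normal with mean
    alpha + beta . z and standard deviation sigma."""
    mu = alpha + sum(b * zi for b, zi in zip(beta, z))
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Example: X | Z=z ~ N(1 + 2z, 0.5^2); with z = 1.5 the density
# peaks at x = 1 + 2 * 1.5 = 4, where it equals 1 / (sigma * sqrt(2 pi)).
peak = clg_density(4.0, [1.5], alpha=1.0, beta=[2.0], sigma=0.5)
```

Because every conditional is Gaussian given the discrete configuration, inference in CLG networks reduces to fast closed-form updates, which is why the quoted work prefers them for high-speed data streams.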
“…Mixtures of truncated basis functions (MoTBFs) [2] have recently been proposed as a general framework for handling hybrid Bayesian networks, i.e., Bayesian networks where discrete and continuous variables coexist. Previous hybrid models as the so-called mixtures of truncated exponentials (MTEs) [7] and mixtures of polynomials (MoPs) [10] can be regarded as particular cases of MoTBFs.…”
Section: Introduction (mentioning, confidence: 99%)
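The statement above describes MoTBFs as densities built from a linear combination of basis functions, with MoPs (polynomial basis) and MTEs (exponential basis) as special cases. A minimal sketch of that idea, with illustrative names and a hand-picked polynomial example rather than a fitted model:

```python
def motbf(x, coeffs, basis):
    """Evaluate a mixture of truncated basis functions:
    f(x) = sum_k c_k * psi_k(x), where each psi_k is a basis
    function, e.g. a polynomial (MoP) or an exponential (MTE)."""
    return sum(c * psi(x) for c, psi in zip(coeffs, basis))

# MoP example on the interval [-1, 1]: f(x) = 0.75 * (1 - x^2),
# a valid density there (nonnegative and integrating to 1).
poly_basis = [lambda x: 1.0, lambda x: x, lambda x: x * x]
coeffs = [0.75, 0.0, -0.75]

# Numerical check (midpoint rule) that the density integrates to 1.
n = 10000
h = 2.0 / n
integral = sum(motbf(-1 + (i + 0.5) * h, coeffs, poly_basis) * h for i in range(n))
```

Because the basis is fixed over the whole interval, a MoTBF can approximate a density without partitioning its domain, which is the efficiency advantage the first quoted statement attributes to the framework.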