2015
DOI: 10.1007/978-3-319-20807-7_25

On the Analysis of Probability-Possibility Transformations: Changing Operations and Graphical Models

Cited by 9 publications (6 citation statements)
References 14 publications
“…For possibilistic networks, parameter learning from data basically consists of deriving conditional local possibility distributions from data. There are two main approaches for learning the parameters [Haddad et al, 2015]: i) Transformation-based approach: it consists of first learning probability distributions from data and then transforming them into possibilistic ones using probability-possibility transformations [Benferhat et al, 2015a]. ii) Possibilistic-based approach: such approaches stem from quantitative interpretations of possibility distributions.…”
Section: Fig. 8 Example of a Possibilistic Network (mentioning)
Confidence: 99%
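The transformation-based approach described in the quotation above can be made concrete with a small worked sketch. The code below is ours, not code from any of the cited papers: it estimates a discrete probability distribution from frequency counts and converts it into a possibility distribution with the order-preserving transformation pi(x) = sum of p(x') over all x' with p(x') <= p(x); the data and variable names are illustrative only.

```python
# Minimal sketch (not code from the paper) of the transformation-based
# approach: estimate a probability distribution from frequency counts, then
# map it to a possibility distribution with
#   pi(x) = sum of p(x') over all x' such that p(x') <= p(x).
from collections import Counter

def probability_from_data(samples):
    """Maximum-likelihood (frequency) estimate of a discrete distribution."""
    counts = Counter(samples)
    n = len(samples)
    return {value: c / n for value, c in counts.items()}

def probability_to_possibility(p):
    """Order-preserving probability-to-possibility transformation."""
    return {x: sum(q for q in p.values() if q <= px) for x, px in p.items()}

# Toy observations of a single variable (illustrative only).
samples = ["a", "a", "a", "b", "b", "c"]
p = probability_from_data(samples)       # {'a': 0.5, 'b': 0.33..., 'c': 0.16...}
pi = probability_to_possibility(p)       # {'a': ~1.0, 'b': 0.5, 'c': 0.16...}
print(p)
print(pi)
```

Note that the most frequent value receives possibility 1 and the ordering of the probabilities is preserved, which is the property the citing authors rely on when generating possibilistic parameters from probabilistic ones.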
“…We choose three logical operators to include with the definition: and, or, not. The example in Listing 1 (Representing uncertainty with mUnc) shows how to assert that a sentence ex:S1 is true with a probability of 0.7. For the sake of illustration, we use reification to attach an IRI to the previous sentence, although no preference about metadata representation methods is stated [16].…”
Section: mUnc: a Vocabulary for Uncertainty Theories (mentioning)
Confidence: 99%
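To illustrate the reification pattern mentioned in this quotation, here is a small sketch in Python with rdflib that reifies a sentence as ex:S1 and attaches a probability of 0.7 to it. The triple behind ex:S1 and the property ex:hasProbability are hypothetical placeholders, not terms of the actual mUnc vocabulary; Listing 1 of the cited paper shows the real terms.

```python
# Sketch only: standard RDF reification of a sentence ex:S1, annotated with a
# probability of 0.7. ex:hasProbability is a hypothetical placeholder property,
# not necessarily a term used by the mUnc vocabulary itself.
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

EX = Namespace("http://example.org/")

g = Graph()
g.bind("ex", EX)

# Reify the sentence ex:S1, here standing for the illustrative triple
# (ex:Alice ex:knows ex:Bob).
g.add((EX.S1, RDF.type, RDF.Statement))
g.add((EX.S1, RDF.subject, EX.Alice))
g.add((EX.S1, RDF.predicate, EX.knows))
g.add((EX.S1, RDF.object, EX.Bob))

# Attach the uncertainty annotation: ex:S1 holds with probability 0.7.
g.add((EX.S1, EX.hasProbability, Literal(0.7, datatype=XSD.decimal)))

print(g.serialize(format="turtle"))
```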
“…The latter are a probability distribution of possibility distributions. An interesting analysis of the possibility-probability transformation and its links to graphical models can be found in [1].…”
Section: Translating Uncertainty Between Theories (mentioning)
Confidence: 99%
“…And our intention was to compare the results of possibilistic and probabilistic networks on the same data, which means keeping the ordinal information contained in the probabilistic data. Using a least committed probability-to-possibility preference-preserving transformation at the local level was a natural way of generating such possibilistic counterparts of subjective probabilistic data, even if we are aware that making local probability-to-possibility transforms is, for instance, not equivalent to making probability-to-possibility transforms of the joint probability, as studied in [7]. Note that the same issues occur when trying to learn possibilistic networks from data [18].…”
Section: Model Specification (mentioning)
Confidence: 99%
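The point about local versus joint transforms can be demonstrated with a short sketch of our own (numbers and the product-based recombination are illustrative choices, not taken from [7]): transforming each local factor of p(x, y) = p(x) p(y | x) and recombining the results with a product does not, in general, give the same possibility distribution as transforming the joint distribution directly.

```python
# Illustrative sketch: local probability-to-possibility transforms followed by
# a product chain rule differ from transforming the joint distribution directly.

def probability_to_possibility(p):
    # pi(e) = sum of masses p(e') with p(e') <= p(e)
    return {e: sum(q for q in p.values() if q <= pe) for e, pe in p.items()}

# A small joint distribution factored as p(x, y) = p(x) * p(y | x).
p_x = {"x0": 0.7, "x1": 0.3}
p_y_given_x = {"x0": {"y0": 0.9, "y1": 0.1},
               "x1": {"y0": 0.5, "y1": 0.5}}
joint = {(x, y): p_x[x] * p_y_given_x[x][y]
         for x in p_x for y in ("y0", "y1")}

# Route 1: transform the joint distribution as a whole.
pi_joint = probability_to_possibility(joint)

# Route 2: transform each local factor, then recombine with a product.
pi_x = probability_to_possibility(p_x)
pi_y_given_x = {x: probability_to_possibility(d) for x, d in p_y_given_x.items()}
pi_local = {(x, y): pi_x[x] * pi_y_given_x[x][y]
            for x in p_x for y in ("y0", "y1")}

print(pi_joint)   # e.g. (x1, y0) -> about 0.37
print(pi_local)   # e.g. (x1, y0) -> 0.3, so the two routes disagree
```

The two resulting possibility distributions disagree on most configurations, which is exactly the discrepancy between local and joint transformations that the citing authors acknowledge and that the paper under review studies.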