2015
DOI: 10.1016/j.artint.2015.05.003
Analyzing the computational complexity of abstract dialectical frameworks via approximation fixpoint theory

Abstract: Abstract dialectical frameworks (ADFs) have recently been proposed as a versatile generalization of Dung's abstract argumentation frameworks (AFs). In this paper, we present a comprehensive analysis of the computational complexity of ADFs. Our results show that while ADFs are one level up in the polynomial hierarchy compared to AFs, there is a useful subclass of ADFs which is as complex as AFs while arguably offering more modeling capacities. As a technical vehicle, we employ the approximation fixpoint theory of Deneck…
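As a rough, self-contained illustration of the operator-based semantics the abstract alludes to (this is not code from the paper; the toy ADF and all names are hypothetical), the following Python sketch computes the grounded interpretation of a small ADF by iterating a three-valued characteristic operator from the all-undecided interpretation until a fixpoint is reached:

from itertools import product

# A tiny ADF: each statement has a tuple of parents and an acceptance
# condition, given here as a Boolean function of the parents' values.
# (Hypothetical toy example; the paper works with propositional formulas.)
ADF = {
    "a": ((),         lambda: True),               # a is unconditionally accepted
    "b": (("a",),     lambda a: a),                # b is accepted iff a is
    "c": (("b", "c"), lambda b, c: b and not c),   # c is supported by b but attacks itself
}

UNDEC = None  # three truth values: True, False, or None (undecided)

def gamma(v):
    """One application of the three-valued characteristic operator:
    a statement becomes True (False) only if its acceptance condition
    evaluates to True (False) under *all* two-valued completions of the
    undecided parents; otherwise it stays undecided."""
    new = {}
    for s, (parents, cond) in ADF.items():
        undecided = [p for p in parents if v[p] is UNDEC]
        results = set()
        for bits in product([False, True], repeat=len(undecided)):
            completion = dict(v, **dict(zip(undecided, bits)))
            results.add(cond(*(completion[p] for p in parents)))
        new[s] = results.pop() if len(results) == 1 else UNDEC
    return new

def grounded():
    """Least fixpoint of gamma, starting from the all-undecided interpretation."""
    v = {s: UNDEC for s in ADF}
    while (nxt := gamma(v)) != v:
        v = nxt
    return v

print(grounded())   # {'a': True, 'b': True, 'c': None}

On this toy instance the iteration accepts a and b and leaves the self-attacking c undecided; the paper's complexity analysis concerns reasoning tasks over such operator-based semantics for general acceptance conditions.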

Citations: cited by 37 publications (64 citation statements)
References: 29 publications
“…The computational complexity of reasoning in ADFs is summarised in Table 2. The results were shown by Brewka et al (2013), Strass and Wallner (2014) and Wallner (2014 …”
Section: Definition 25: Let… (supporting)
confidence: 71%
“…One way of arriving at adequate encodings for these decision problems is by making use of a characterisation of grounded interpretations given by Strass and Wallner (2014), for the statement of which we make use of the following definition: The above mentioned characterisation of grounded interpretations is expressed in the following proposition by Strass and Wallner (2014) …”
Section: Complexity Sensitive Encodings for the Grounded and Stable Semantics (mentioning)
confidence: 99%
“…Other data complexity factors such as noise, atypical patterns, overlap, and bad data distribution usually weaken the quality of the training process, too [5]. The curse of dimensionality problem increases the computational complexity and memory requirements, in some cases exponentially [1,2]. Due to the increased number of input data variables, the number of executing parameters usually increases exponentially.…”
Section: Introduction (mentioning)
confidence: 99%
“…Developing multivariate models for industrial or medical applications produces a computational complexity problem [1,2]. The complexity of the input data affects the quality of the neural network training process.…”
Section: Introduction (mentioning)
confidence: 99%