2016
DOI: 10.5430/air.v5n2p40

Parallelization of the next Closure algorithm for generating the minimum set of implication rules

Abstract: This paper addresses the problem of handling dense contexts of high dimensionality in the number of objects, which is still an open problem in formal concept analysis. The generation of a minimal implication basis in contexts with such characteristics is investigated, with the NextClosure algorithm employed to obtain the rules. This work therefore makes use of parallel computing as a means of reducing the prohibitive times observed in scenarios where the input context has high density and high dimensionali…
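For orientation, the sketch below illustrates the lectic next-closure step at the core of NextClosure, applied here to the usual closure operator A'' over a toy formal context. This is only a minimal sketch: the paper applies NextClosure with the closure operator that yields the minimal (Duquenne-Guigues) implication basis, and all names and the toy context in the snippet are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of Ganter's NextClosure enumeration of closed attribute sets.
# The context is a list of object rows, each a set of attribute indices.

def closure(attrs, context, n_attrs):
    """Return A'': all attributes shared by every object that has all of `attrs`."""
    extent = [obj for obj in context if attrs <= obj]
    if not extent:
        return set(range(n_attrs))          # empty extent -> full attribute set
    common = set(range(n_attrs))
    for obj in extent:
        common &= obj
    return common

def next_closure(current, context, n_attrs):
    """Return the lectically next closed set after `current`, or None when done."""
    for i in reversed(range(n_attrs)):
        if i in current:
            current = current - {i}
        else:
            candidate = closure(current | {i}, context, n_attrs)
            # lectic test: no attribute smaller than i may have been added
            if not any(j < i and j not in current for j in candidate):
                return candidate
    return None

# Enumerate every closed set of a toy 3-attribute context.
context = [{0, 1}, {1, 2}, {0, 1, 2}]
closed = closure(set(), context, 3)
while closed is not None:
    print(sorted(closed))
    closed = next_closure(closed, context, 3)
```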

Cited by 6 publications (3 citation statements)
References 28 publications (24 reference statements)
“…Therefore, it is important to use tools that can simulate real data. It is also useful to compare and analyze results among algorithms, as realized in [de Moraes et al., 2016] and [Santos et al., 2018].…”
Section: SCGaz - Synthetic Context Generator (mentioning)
confidence: 99%
“…Finally, draw the corresponding concept lattice in a top‐down, bottom‐up, or enumeration method (Dong et al., 2019; Zhang et al., 2019). Batch lattice construction algorithms can be found in (Andrews, 2009; Andrews, 2017; de Moraes, Dias, Freitas, & Zarate, 2016; Ganter, 2010; Kuznetsov, 1993; Muangprathub, 2014; Outrata & Vychodil, 2012). Incremental algorithms: construct the lattice incrementally by iteratively processing each object/attribute from the formal context, then generate formal concepts and update edges accordingly.…”
Section: Preliminaries (mentioning)
confidence: 99%
“…However, the time needed to extract implications from the contexts considered in our case study was not prohibitive due to the sparsity of the input data. For high-dimensional formal contexts, it is important to design or use a parallel or distributed version of the algorithm (de Moraes et al., 2016).…”
Section: Formal Concept Analysis (mentioning)
confidence: 99%
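As a rough illustration of why parallelization helps in such scenarios, the sketch below farms independent closure computations out to worker processes with Python's multiprocessing. The seed sets and the partitioning strategy are purely illustrative assumptions, not the distribution scheme of de Moraes et al. (2016).

```python
# Sketch: distribute independent A'' closure computations across processes.
# CONTEXT, N_ATTRS, and the single-attribute seeds are illustrative only.
from multiprocessing import Pool

CONTEXT = [{0, 1}, {1, 2}, {0, 1, 2}]   # toy context: rows are attribute sets
N_ATTRS = 3

def closure(attrs):
    """A'' for the toy context above."""
    extent = [obj for obj in CONTEXT if attrs <= obj]
    if not extent:
        return frozenset(range(N_ATTRS))
    common = set(range(N_ATTRS))
    for obj in extent:
        common &= obj
    return frozenset(common)

if __name__ == "__main__":
    # Each seed can be closed independently, so the work shares no state
    # and maps cleanly onto a process pool.
    seeds = [frozenset({i}) for i in range(N_ATTRS)]
    with Pool() as pool:
        closures = pool.map(closure, seeds)
    print(sorted(map(sorted, set(closures))))
```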