DOI: 10.1007/978-3-540-74450-4_4

A More Effective Linear Kernelization for Cluster Editing

Cited by 42 publications (68 citation statements)
References 20 publications
“…As to approximability, the currently best known approximation factor is 2.5 [17]. Considering the parameter k defined as the number of allowed edge modifications, a search tree of size O(1.83^k) [3] has been developed and several studies concerning provably efficient and effective preprocessing by data reduction (which is called problem kernelization in the context of parameterized algorithmics [12]) have been performed [7,8]. Parameterized algorithms have led to several successful experimental studies mainly in the context of biological network analysis [3,4].…”
Section: s-Plex Editing
mentioning
confidence: 99%
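For context on the search-tree bound quoted above: a cluster graph is exactly a graph with no induced P3 (a "conflict triple" u-v-w where the edge uw is missing), and the classic branching resolves one such triple per step, giving an O(3^k) tree; the cited O(1.83^k) bound of [3] comes from more refined case distinctions. The following Python sketch shows only the basic O(3^k) branching under an assumed adjacency-set representation; it is illustrative and not the algorithm of [3].

```python
# Minimal sketch of the classic O(3^k) bounded search tree for Cluster Editing.
# A graph is a dict mapping each vertex to the set of its neighbours (assumed
# representation for this sketch).

def find_conflict_triple(adj):
    """Return an induced P3 (u, v, w) with edges uv, vw and non-edge uw,
    or None if the graph is already a disjoint union of cliques."""
    for v, nbrs in adj.items():
        nbrs = list(nbrs)
        for i in range(len(nbrs)):
            for j in range(i + 1, len(nbrs)):
                u, w = nbrs[i], nbrs[j]
                if w not in adj[u]:
                    return u, v, w
    return None


def cluster_editing(adj, k):
    """Decide whether adj can be turned into a cluster graph with <= k edits.
    Note: the graph is modified in place, so pass a fresh copy per query."""
    if k < 0:
        return False
    triple = find_conflict_triple(adj)
    if triple is None:
        return True                           # no induced P3 left: cluster graph
    u, v, w = triple
    # Branch 1: delete edge uv.
    adj[u].discard(v); adj[v].discard(u)
    if cluster_editing(adj, k - 1):
        return True
    adj[u].add(v); adj[v].add(u)              # undo
    # Branch 2: delete edge vw.
    adj[v].discard(w); adj[w].discard(v)
    if cluster_editing(adj, k - 1):
        return True
    adj[v].add(w); adj[w].add(v)              # undo
    # Branch 3: add edge uw.
    adj[u].add(w); adj[w].add(u)
    if cluster_editing(adj, k - 1):
        return True
    adj[u].discard(w); adj[w].discard(u)      # undo
    return False


def path3():
    # The path a-b-c needs exactly one edit (delete b-c or add a-c).
    return {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}

print(cluster_editing(path3(), 0))            # False
print(cluster_editing(path3(), 1))            # True
```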
“…Since s-Plex Editing is a generalization of Cluster Editing, the first idea coming to mind in order to achieve a linear kernelization is to adapt an approach developed by Guo [8]. However, since the "critical clique" concept used there does not work for s-Plex Editing, we need a more sophisticated strategy; correspondingly, the accompanying mathematical analysis requires new tools.…”
Section: Data Reduction and Kernelization
mentioning
confidence: 99%
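On the "critical clique" concept mentioned above (the basis of Guo's [8] linear kernel for Cluster Editing): a critical clique is a maximal set of vertices that share the same closed neighbourhood, and the reduction rules exploit the fact that such vertices can be kept together in some optimal solution. A minimal Python sketch of computing this partition, under the same assumed adjacency-set representation as above:

```python
# Illustrative helper: partition the vertex set into critical cliques, i.e.
# the equivalence classes of the relation N[u] == N[v] (equal closed
# neighbourhoods). Not taken from the cited papers.

def critical_cliques(adj):
    classes = {}
    for v, nbrs in adj.items():
        key = frozenset(nbrs | {v})      # closed neighbourhood N[v]
        classes.setdefault(key, set()).add(v)
    return list(classes.values())


# Example: in the graph with edges a-b, a-c, b-c, c-d the vertices a and b
# share the closed neighbourhood {a, b, c}, so {a, b} forms one critical
# clique; c and d are singleton classes.
adj = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"}, "d": {"c"}}
print(critical_cliques(adj))             # three classes: {a, b}, {c}, {d}
```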
“…One of our main results is an elegant iterative compression algorithm for weighted Cluster Vertex Deletion using matching techniques, running in O(2^k · k^9 + nm) time. We extend our studies to the (also NP-hard) case where the number of clusters to be generated is limited by a second parameter d. Such studies have also been undertaken for Cluster Editing [19,22,36], but note that for Cluster Editing clearly d ≤ 2k. By way of contrast, since vertex deletion is a stronger operation than edge deletion, in the case of Cluster Vertex Deletion also d > 2k is possible.…”
Section: Introduction
mentioning
confidence: 89%
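The iterative compression technique referred to here processes vertices one at a time while maintaining a solution of size at most k for the part of the graph seen so far; whenever the solution grows to k + 1, a problem-specific compression routine either shrinks it or proves that no size-k solution exists. The sketch below is only the generic skeleton under assumed names; the matching-based compression step behind the cited O(2^k · k^9 + nm) bound is left as a stub.

```python
# Generic iterative-compression skeleton (illustrative only). For Cluster
# Vertex Deletion, a "solution" is a set of vertices whose removal leaves a
# disjoint union of cliques; the cited algorithm realises `compress` with
# matching techniques, which is not reproduced here.

def iterative_compression(vertices, adj, k, compress):
    """Return a solution of size <= k for the graph (vertices, adj), or None.

    `compress(processed, adj, solution)` must take a solution of size k + 1
    for the subgraph induced by `processed` and either return one of size
    <= k or return None if no such solution exists.
    """
    solution = set()
    processed = []
    for v in vertices:
        processed.append(v)
        solution = solution | {v}        # still valid: v itself is deleted
        if len(solution) > k:
            solution = compress(processed, adj, solution)
            if solution is None:         # the induced subgraph, and hence the
                return None              # whole graph, needs more than k deletions
    return solution
```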
“…For a background on fixed-parameter algorithmics we refer to [12,15,29]. Parameterized complexity studies for Cluster Editing were initiated by Gramm et al. [21] and have been further pursued in a series of papers [5,6,10,13,20,22,32,33]. A previously shown bound of O(1.92^k + n^3) for an n-vertex graph [20] can be improved by combining a linear-time problem kernelization algorithm [13] that yields an instance with O(k^2) vertices with the currently best claimed running time of O(1.82^k + n^3) [6] to get an algorithm with running time O(1.82^k + n + m), where m is the number of edges in the graph.…”
Section: Introduction
mentioning
confidence: 99%
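To make explicit why the bounds in the statement above combine additively: the kernelization is run once on the entire input, while the exponential-time search tree only sees the kernel, whose size is bounded by a function of k alone (here O(k^2) vertices); polynomial factors in k are conventionally absorbed into the exponential term. Schematically,

T(n, m, k) = O(n + m) + O(1.82^k) = O(1.82^k + n + m),

where the first term covers the linear-time kernelization and the second the search tree run on the kernelized instance.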