Bidimensionality theory was introduced by [E. D. Demaine et al., J. ACM, 52 (2005), pp. 866--893] as a tool to obtain subexponential time parameterized algorithms on H-minor-free graphs. In [E. D. Demaine and M. Hajiaghayi, Bidimensionality: New connections between FPT algorithms and PTASs, in Proceedings of the 16th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), SIAM, Philadelphia, 2005, pp. 590--601] this theory was extended in order to obtain polynomial time approximation schemes (PTASs) for bidimensional problems. In this work, we establish a third meta-algorithmic direction for bidimensionality theory by relating it to the existence of linear kernels for parameterized problems. In particular, we prove that every minor (resp., contraction) bidimensional problem that satisfies a separation property and is expressible in Countable Monadic Second Order Logic (CMSO) admits a linear kernel for classes of graphs that exclude a fixed graph (resp., an apex graph) H as a minor. Our results imply that a multitude of bidimensional problems admit linear kernels on the corresponding graph classes. For most of these problems no polynomial kernels on H-minor-free graphs were known prior to our work.
Let F be a finite set of graphs. In the F-Deletion problem, we are given an n-vertex graph G and an integer k as input, and asked whether at most k vertices can be deleted from G such that the resulting graph does not contain a graph from F as a minor. F-Deletion is a generic problem, and by selecting different sets of forbidden minors F, one can obtain various fundamental problems such as Vertex Cover, Feedback Vertex Set or Treewidth η-Deletion. In this paper we obtain a number of generic algorithmic results about F-Deletion when F contains at least one planar graph. The highlights of our work are:
• a constant factor approximation algorithm for the optimization version of F-Deletion;
• a linear time and single exponential parameterized algorithm, that is, an algorithm running in time O(2^{O(k)} n), for the parameterized version of F-Deletion where all graphs in F are connected;
• a polynomial kernel for parameterized F-Deletion.
These algorithms unify, generalize, and improve a multitude of results in the literature. Our main results have several direct applications, but the methods we develop along the way also have applicability beyond the scope of this paper. Our results (constant factor approximation, polynomial kernelization, and FPT algorithms) are strung together by a common theme of polynomial time preprocessing.
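As a concrete illustration of how F-Deletion specializes (this sketch is not taken from the paper), consider F = {K_3}: a graph has no K_3 minor exactly when it is a forest, so this instantiation is Feedback Vertex Set. The minimal Python brute-force check below only illustrates the problem definition; it runs in n^{O(k)} time and is unrelated to the paper's algorithms.

from itertools import combinations
import networkx as nx

def f_deletion_k3(G, k):
    # Brute-force check for the F-Deletion instance with F = {K_3}:
    # can at most k vertices be deleted so that the remaining graph has
    # no K_3 minor, i.e. is a forest?  This special case is exactly
    # Feedback Vertex Set.  Illustrative only; runs in n^{O(k)} time.
    for size in range(k + 1):
        for S in combinations(G.nodes, size):
            H = G.copy()
            H.remove_nodes_from(S)
            if H.number_of_nodes() == 0 or nx.is_forest(H):
                return True
    return False

G = nx.cycle_graph(4)       # a 4-cycle has a K_3 minor (contract one edge)
print(f_deletion_k3(G, 0))  # False: the cycle survives
print(f_deletion_k3(G, 1))  # True: deleting any one vertex leaves a path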
In a parameterized problem, every instance I comes with a positive integer k. The problem is said to admit a polynomial kernel if, in polynomial time, one can reduce the instance I to an equivalent instance whose size is polynomial in k, while preserving the answer. In this work we give two meta-theorems on kernelization. The first theorem says that all problems expressible in Counting Monadic Second Order Logic and satisfying a coverability property admit a polynomial kernel on graphs of bounded genus. Our second result is that all problems that have finite integer index and satisfy a weaker coverability property admit a linear kernel on graphs of bounded genus. These theorems unify and extend all previously known kernelization results for planar graph problems.
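As a toy example of such a preprocessing routine (not from the paper, whose results are meta-theorems), here is a Python sketch of the classical Buss kernelization for Vertex Cover; it returns an equivalent instance with at most k^2 edges, or a trivial instance when the answer is already determined.

import networkx as nx

def buss_kernel(G, k):
    # Classical Buss kernelization for Vertex Cover: repeatedly apply two
    # reduction rules, then bound the size of what remains.
    G = G.copy()
    changed = True
    while changed and k >= 0:
        changed = False
        # Rule 1: a vertex of degree > k must be in every vertex cover of
        # size at most k, so take it into the solution and delete it.
        for v in list(G.nodes):
            if G.degree(v) > k:
                G.remove_node(v)
                k -= 1
                changed = True
                break
        # Rule 2: isolated vertices are never needed in a cover.
        isolated = [v for v in G.nodes if G.degree(v) == 0]
        if isolated:
            G.remove_nodes_from(isolated)
            changed = True
    # With maximum degree <= k, a cover of size k covers at most k^2 edges,
    # so a larger instance is a NO-instance; otherwise the reduced instance
    # has at most k^2 edges and 2k^2 vertices.
    if k < 0 or G.number_of_edges() > k * k:
        return nx.complete_graph(2), 0   # trivial NO-instance
    return G, k

For example, on the star K_{1,5} with k = 1, Rule 1 removes the center (degree 5 > 1) and Rule 2 removes the five leaves, leaving an empty graph with k' = 0, a trivial YES-instance.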
We obtain a number of lower bounds on the running time of algorithms solving problems on graphs of bounded treewidth. We prove the results under the Strong Exponential Time Hypothesis of Impagliazzo and Paturi. In particular, assuming that SAT cannot be solved in (2 − ε)^n m^{O(1)} time, we show that for any ε > 0:
• DOMINATING SET cannot be solved in (3 − ε)^{tw(G)} |V(G)|^{O(1)} time.
In parameterized complexity each problem instance comes with a parameter k, and a parameterized problem is said to admit a polynomial kernel if there are polynomial time preprocessing rules that reduce the input instance to an instance with size polynomial in k. Many problems have been shown to admit polynomial kernels, but it is only recently that a framework for showing the non-existence of polynomial kernels for specific problems has been developed by Bodlaender et al. [6] and Fortnow and Santhanam [17]. With few exceptions, all known kernelization lower bounds have been obtained by directly applying this framework. In this paper we show how to combine these results with combinatorial reductions which use colors and IDs in order to prove kernelization lower bounds for a variety of basic problems. Below we give a summary of our main results. All results are under the assumption that the polynomial hierarchy does not collapse to the third level.
• We show that the STEINER TREE problem parameterized by the number of terminals and solution size k, and the CONNECTED VERTEX COVER and CAPACITATED VERTEX COVER problems, do not admit a polynomial kernel. The two latter results are surprising because the closely related VERTEX COVER problem admits a kernel with at most 2k vertices.
• Alon and Gutner obtain a k^{poly(h)} kernel for DOMINATING SET IN H-MINOR-FREE GRAPHS parameterized by h = |H| and solution size k, and ask whether kernels of smaller size exist [3]. We partially resolve this question by showing that DOMINATING SET IN H-MINOR-FREE GRAPHS does not admit a kernel with size polynomial in k + h.
• Harnik and Naor obtain a "compression algorithm" for the SPARSE SUBSET SUM problem [21]. We show that their algorithm is essentially optimal by showing that the instances cannot be compressed further.
• The HITTING SET and SET COVER problems are among the most studied problems in algorithmics. Both problems admit a kernel of size k^{O(d)} when parameterized by solution size k and maximum set size d. We show that neither of them, nor the UNIQUE COVERAGE and BOUNDED RANK DISJOINT SETS problems, admits a polynomial kernel.
The existence of polynomial kernels for several of the problems mentioned above was an open problem explicitly stated in the literature [3, 4, 19, 20, 26]. Many of our results also rule out the existence of compression algorithms, a notion similar to kernelization defined by Harnik and Naor [21], for the problems in question.
The field of exact exponential time algorithms for NP-hard problems has thrived over the last decade. While exhaustive search remains asymptotically the fastest known algorithm for some basic problems, difficult and non-trivial exponential time algorithms have been found for a myriad of problems, including Graph Coloring, Hamiltonian Path, Dominating Set and 3-CNF-Sat. In some instances, improving these algorithms further seems to be out of reach. The CNF-Sat problem is the canonical example of a problem for which the trivial exhaustive search algorithm runs in time O(2^n), where n is the number of variables in the input formula. While there exist non-trivial algorithms for CNF-Sat that run in time o(2^n), no algorithm was able to improve the growth rate 2 to a smaller constant, and hence it is natural to conjecture that 2 is the optimal growth rate. The strong exponential time hypothesis (SETH) by Impagliazzo and Paturi [JCSS 2001] goes a little bit further and asserts that, for every ε < 1, there is a (large) integer k such that k-CNF-Sat cannot be computed in time 2^{εn}. In this paper, we show that, for every ε < 1, the problems Hitting Set, Set Splitting, and NAE-Sat cannot be computed in time O(2^{εn}) unless SETH fails. Here n is the number of elements or variables in the input. For these problems, we actually get an equivalence to SETH in a certain sense. We conjecture that SETH implies a similar statement for Set Cover, and prove that, under this assumption, the fastest known algorithms for Steiner Tree, Connected Vertex Cover, Set Partitioning, and the pseudo-polynomial time algorithm for Subset Sum cannot be significantly improved. Finally, we justify our assumption about the hardness of Set Cover by showing that the parity of the number of solutions to Set Cover cannot be computed in time O(2^{εn}) for any ε < 1 unless SETH fails.
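For context (an illustration, not part of the paper), the pseudo-polynomial time algorithm for Subset Sum referred to above is the standard O(n·t) dynamic program over target values, sketched here in Python; the paper shows that, under its Set Cover assumption, this running time cannot be significantly improved.

def subset_sum(items, t):
    # Textbook pseudo-polynomial dynamic program for Subset Sum:
    # reachable[s] is True iff some subset of the items processed so far
    # sums to exactly s.  Runs in O(n * t) time and O(t) space.
    reachable = [False] * (t + 1)
    reachable[0] = True
    for a in items:
        # Scan sums downwards so that each item is used at most once.
        for s in range(t, a - 1, -1):
            if reachable[s - a]:
                reachable[s] = True
    return reachable[t]

print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True: 4 + 5 = 9
print(subset_sum([3, 34, 4, 12, 5, 2], 30))  # False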