For the vast majority of local graph problems, standard dynamic programming techniques give c^tw · |V|^O(1) algorithms, where tw is the treewidth of the input graph. On the other hand, for problems with a global requirement (usually connectivity) the best-known algorithms were naive dynamic programming schemes running in tw^O(tw) · |V|^O(1) time. We breach this gap by introducing a technique we dubbed Cut&Count that allows us to produce c^tw · |V|^O(1) Monte Carlo algorithms for most connectivity-type problems, including HAMILTONIAN PATH, FEEDBACK VERTEX SET and CONNECTED DOMINATING SET, consequently answering the question raised by Lokshtanov, Marx and Saurabh [SODA'11] in a surprising way. We also show that (under reasonable complexity assumptions) the gap cannot be breached for some problems for which Cut&Count does not work, like CYCLE PACKING. The constant c we obtain is in all cases small (at most 4 for undirected problems and at most 6 for directed ones), and in several cases we are able to show that improving those constants would cause the Strong Exponential Time Hypothesis to fail. Our results have numerous consequences in various fields, such as FPT algorithms, exact and approximate algorithms on planar and H-minor-free graphs, and algorithms on graphs of bounded degree. In all these fields we are able to improve the best-known results for some problems.
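The counting idea behind Cut&Count can be illustrated at toy scale. For a vertex subset X with a fixed vertex v1 ∈ X, the partitions of X into two sides (L, R) with v1 ∈ L and no induced edge crossing number exactly 2^(cc(X)−1), where cc(X) is the number of connected components of the induced subgraph; this is odd precisely when X induces a connected subgraph, so counting (set, cut) pairs modulo 2 isolates connected candidate sets. A minimal Python sketch of this parity argument (the toy graph and the brute-force enumeration are illustrative, not the paper's treewidth-based algorithm):

```python
from itertools import combinations

def num_components(vertices, edges):
    """Connected components of the subgraph induced on `vertices` (union-find)."""
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for u, w in edges:
        if u in parent and w in parent:
            ru, rw = find(u), find(w)
            if ru != rw:
                parent[ru] = rw
    return len({find(v) for v in parent})

def consistent_cuts(X, edges, v1):
    """Count partitions (L, R) of X with v1 in L and no induced edge crossing."""
    others = [v for v in X if v != v1]
    count = 0
    for r in range(len(others) + 1):
        for right in combinations(others, r):
            R = set(right)
            if all(not ((u in X and w in X) and ((u in R) != (w in R)))
                   for u, w in edges):
                count += 1
    return count

# Toy graph: a path 1-2-3 with a pendant vertex 4 attached to 3.
edges = [(1, 2), (2, 3), (3, 4)]
# {1,2,3} induces a connected path -> exactly 2^(1-1) = 1 consistent cut.
assert consistent_cuts({1, 2, 3}, edges, 1) == 1
# {1,3} induces two isolated vertices -> 2^(2-1) = 2 consistent cuts (even).
assert consistent_cuts({1, 3}, edges, 1) == 2

# Summed over all candidate sets containing v1 = 1, disconnected sets contribute
# an even number of pairs, so the parity equals the parity of connected sets.
pairs = sum(consistent_cuts(set(S) | {1}, edges, 1)
            for r in range(4) for S in combinations([2, 3, 4], r))
connected = sum(1 for r in range(4) for S in combinations([2, 3, 4], r)
                if num_components(set(S) | {1}, edges) == 1)
assert pairs % 2 == connected % 2
```

In the actual technique this modulo-2 count is computed by dynamic programming over the tree decomposition, with random weights (the Isolation Lemma) preventing the connected solutions themselves from cancelling out.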
Abstract. It is well known that many local graph problems, like Vertex Cover and Dominating Set, can be solved in 2^O(tw) · n^O(1) time for graphs with a given tree decomposition of width tw. However, for nonlocal problems, like the fundamental class of connectivity problems, for a long time it was unknown how to do this faster than tw^O(tw) · n^O(1). The rank-based approach introduces a new technique to speed up dynamic programming algorithms which is likely to have more applications. The determinant-based approach uses the Matrix Tree Theorem to derive closed formulas for counting versions of connectivity problems; we show how to evaluate those formulas via dynamic programming.
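The Matrix Tree Theorem underlying the determinant-based approach can be checked directly: the number of spanning trees of a graph equals any cofactor of its Laplacian. A small self-contained illustration (the helper names and toy graphs are ours):

```python
def det(m):
    """Integer determinant by cofactor expansion (fine for tiny matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

def spanning_trees(n, edges):
    """Matrix Tree Theorem: delete row/column 0 of the Laplacian, take the determinant."""
    lap = [[0] * n for _ in range(n)]
    for u, v in edges:
        lap[u][u] += 1
        lap[v][v] += 1
        lap[u][v] -= 1
        lap[v][u] -= 1
    minor = [row[1:] for row in lap[1:]]
    return det(minor)

# K4 has 4^(4-2) = 16 spanning trees (Cayley's formula); the 4-cycle has 4.
k4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
c4 = [(0, 1), (1, 2), (2, 3), (3, 0)]
assert spanning_trees(4, k4) == 16
assert spanning_trees(4, c4) == 4
```

The determinant-based approach evaluates such determinant formulas piece by piece along a tree decomposition rather than on the whole graph at once.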
We introduce a new technique for designing fixed-parameter algorithms for cut problems, called randomized contractions. We apply our framework to obtain the first FPT algorithm for the UNIQUE LABEL COVER problem and new FPT algorithms with exponential speed-up for the STEINER CUT and NODE MULTIWAY CUT-UNCUT problems. More precisely, we show the following:
• We prove that the parameterized version of the UNIQUE LABEL COVER problem, which is the base of the UNIQUE GAMES CONJECTURE, can be solved in 2^O(k^2 log |Σ|) · n^4 log n deterministic time (even in the stronger, vertex-deletion variant), where k is the number of unsatisfied edges and |Σ| is the size of the alphabet. As a consequence, we show that one can in polynomial time solve to optimality instances of UNIQUE GAMES where the number of edges allowed not to be satisfied is upper bounded by O(√log n), which improves over the trivial O(1) upper bound.
• We prove that the STEINER CUT problem can be solved in 2^O(k^2 log k) · n^4 log n deterministic time, where k is the size of the cutset. This result improves on the double-exponential running time of the recent work of Kawarabayashi and Thorup (FOCS'11).
• We show how to combine 'cut' and 'uncut' constraints at the same time. More precisely, we define a robust problem NODE MULTIWAY CUT-UNCUT that can serve as an abstraction of introducing uncut constraints, and show that it admits an algorithm running in 2^O(k^2 log k) · n^4 log n deterministic time, where k is the size of the cutset. To the best of our knowledge, the only known way of tackling uncut constraints was via the approach of Marx, O'Sullivan and Razgon (STACS'10, ACM Trans. Alg. 2013), which yields algorithms with double-exponential running time.
An interesting aspect of our algorithms is that they can handle positive real weights.
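For concreteness, in UNIQUE LABEL COVER each edge carries a permutation of the alphabet Σ and is satisfied when the permutation maps the label of one endpoint to the label of the other; the parameter k bounds the number of unsatisfied edges. The trivial |Σ|^n-time baseline that the 2^O(k^2 log |Σ|) · n^4 log n algorithm above improves upon looks like this (the function name and the toy instance are illustrative):

```python
from itertools import product

def min_unsatisfied(n, sigma, constraints):
    """constraints: triples (u, v, pi) over vertices 0..n-1; edge (u, v) is
    satisfied by a labeling iff pi[label[u]] == label[v].
    Brute force over all sigma^n labelings."""
    best = len(constraints)
    for labeling in product(range(sigma), repeat=n):
        bad = sum(1 for u, v, pi in constraints if pi[labeling[u]] != labeling[v])
        best = min(best, bad)
    return best

identity, swap = (0, 1), (1, 0)
# A triangle whose permutations compose to a swap cannot be fully satisfied:
assert min_unsatisfied(3, 2, [(0, 1, identity), (1, 2, identity), (2, 0, swap)]) == 1
# Replacing the swap by the identity makes the instance satisfiable:
assert min_unsatisfied(3, 2, [(0, 1, identity), (1, 2, identity), (2, 0, identity)]) == 0
```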
The field of exact exponential time algorithms for NP-hard problems has thrived over the last decade. While exhaustive search remains asymptotically the fastest known algorithm for some basic problems, difficult and non-trivial exponential time algorithms have been found for a myriad of problems, including Graph Coloring, Hamiltonian Path, Dominating Set and 3-CNF-Sat. In some instances, improving these algorithms further seems to be out of reach. The CNF-Sat problem is the canonical example of a problem for which the trivial exhaustive search algorithm runs in time O(2^n), where n is the number of variables in the input formula. While there exist non-trivial algorithms for CNF-Sat that run in time o(2^n), no algorithm has been able to improve the growth rate 2 to a smaller constant, and hence it is natural to conjecture that 2 is the optimal growth rate. The strong exponential time hypothesis (SETH) of Impagliazzo and Paturi [JCSS 2001] goes a little bit further and asserts that, for every ε < 1, there is a (large) integer k such that k-CNF-Sat cannot be computed in time 2^(εn). In this paper, we show that, for every ε < 1, the problems Hitting Set, Set Splitting, and NAE-Sat cannot be computed in time O(2^(εn)) unless SETH fails. Here n is the number of elements or variables in the input. For these problems, we actually get an equivalence to SETH in a certain sense. We conjecture that SETH implies a similar statement for Set Cover, and prove that, under this assumption, the fastest known algorithms for Steiner Tree, Connected Vertex Cover, Set Partitioning, and the pseudo-polynomial time algorithm for Subset Sum cannot be significantly improved. Finally, we justify our assumption about the hardness of Set Cover by showing that the parity of the number of solutions to Set Cover cannot be computed in time O(2^(εn)) for any ε < 1 unless SETH fails.
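The trivial 2^n exhaustive search that SETH posits is essentially optimal fits in a few lines (DIMACS-style literal encoding; a minimal sketch, not an optimized solver):

```python
from itertools import product

def brute_force_sat(n, clauses):
    """Try all 2^n assignments.  A clause is a list of nonzero ints: literal v > 0
    requires variable v-1 to be True, v < 0 requires variable -v-1 to be False."""
    for assignment in product([False, True], repeat=n):
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return assignment
    return None

# (x1 or x2) and (not x1 or x2) is satisfied by setting x2 = True:
assert brute_force_sat(2, [[1, 2], [-1, 2]]) is not None
# x1 and (not x1) is unsatisfiable:
assert brute_force_sat(1, [[1], [-1]]) is None
```

SETH asserts that, once the clause width k is large enough, no algorithm beats this loop by replacing the base 2 with a smaller constant.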
We introduce a concept of parameterizing a problem above the optimum solution of its natural linear programming relaxation and prove that the node multiway cut problem is fixed-parameter tractable (FPT) in this setting. As a consequence we prove that node multiway cut is FPT when parameterized above the maximum separating cut, resolving an open problem of Razgon. Our results imply O*(4^k) algorithms for vertex cover above maximum matching and for almost 2-SAT, as well as an O*(2^k) algorithm for node multiway cut with the standard parameterization by solution size, improving previous bounds for these problems. ACM Reference Format: Marek Cygan, Marcin Pilipczuk, Michał Pilipczuk, and Jakub Onufry Wojtaszczyk. 2013. On multiway cut parameterized above lower bounds.
In recent years, significant progress has been made in explaining the apparent hardness of improving upon the naive solutions for many fundamental polynomially solvable problems. This progress has come in the form of conditional lower bounds: reductions from a problem assumed to be hard. The hard problems include 3SUM, All-Pairs Shortest Path, SAT, Orthogonal Vectors, and others. In the (min,+)-convolution problem, the goal is to compute the sequence c with c[k] = min_{i+j=k} (a[i] + b[j]) from two input sequences a and b of length n. This can easily be done in O(n^2) time, but no O(n^(2−ε)) algorithm is known for any ε > 0. In this paper, we undertake a systematic study of the (min,+)-convolution problem as a hardness assumption. First, we establish the equivalence of this problem to a group of other problems, including variants of the classic knapsack problem and problems related to subadditive sequences. The (min,+)-convolution problem has been used as a building block in algorithms for many problems, notably problems in stringology. It has also appeared as an ad hoc hardness assumption. Second, we investigate some of these connections and provide new reductions and other results. We also explain why replacing this assumption with the SETH might not be possible for some problems.
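The quadratic baseline that the hardness assumption says cannot be beaten by a polynomial factor is only a few lines (a minimal sketch of the standard definition):

```python
def minplus_convolution(a, b):
    """Naive O(n^2) (min,+)-convolution: c[k] = min over i + j == k of a[i] + b[j]."""
    n = len(a)
    assert len(b) == n
    return [min(a[i] + b[k - i] for i in range(max(0, k - n + 1), min(k, n - 1) + 1))
            for k in range(2 * n - 1)]

# For a = [0, 2, 5], b = [1, 1, 4]:
#   c[2] = min(a[0]+b[2], a[1]+b[1], a[2]+b[0]) = min(4, 3, 6) = 3.
assert minplus_convolution([0, 2, 5], [1, 1, 4]) == [1, 1, 3, 6, 9]
```

Unlike classical (+, ×) convolution, no FFT-style trick is known to break the n^2 barrier here, which is what makes the problem attractive as a hardness assumption.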
In this paper we consider a generalization of the classical k-center problem with capacities. Our goal is to select k centers in a graph and assign each node to a nearby center so that we respect the capacity constraints on centers. The objective is to minimize the maximum distance a node has to travel to reach its assigned center. This problem is NP-hard even when centers have no capacity restrictions, and optimal factor-2 approximation algorithms are known for that case. With capacities, when all centers have identical capacities, a 6-approximation is known, with no better lower bounds than for the infinite-capacity version. While many generalizations and variations of this problem have been studied extensively, no progress had been made on the capacitated version for a general capacity function. We develop the first constant-factor approximation algorithm for this problem. Our algorithm uses an LP-rounding approach, works for the case of non-uniform hard capacities, where multiple copies of a node may not be chosen, and can be extended to the case where there is a hard bound on the number of copies of a node that may be selected. In addition, we establish a lower bound of 7 (respectively 5) on the integrality gap for non-uniform (respectively uniform) hard capacities, and we prove that if there is a (3 − ε)-factor approximation for this problem then P = NP. Finally, for non-uniform soft capacities we present a much simpler 11-approximation algorithm, which we view as one more piece of evidence that hard capacities are much harder to deal with.
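As a reference point for the approximation guarantees above, the exact problem can be stated as a brute force over center choices and assignments (exponential time, tiny instances only; the instance and names are illustrative, not the paper's LP-rounding algorithm):

```python
from itertools import combinations, product

def capacitated_k_center(dist, k, cap):
    """Minimum over all k-center choices and capacity-respecting assignments of
    the maximum node-to-center distance.  dist is a full distance matrix; cap[c]
    bounds how many nodes center c may serve (hard capacities, single copies).
    Returns inf if no feasible assignment exists."""
    n = len(dist)
    best = float("inf")
    for centers in combinations(range(n), k):
        for assign in product(centers, repeat=n):
            loads = {c: 0 for c in centers}
            for c in assign:
                loads[c] += 1
            if all(loads[c] <= cap[c] for c in centers):
                best = min(best, max(dist[v][assign[v]] for v in range(n)))
    return best

# Path 0-1-2-3 with shortest-path distances; two centers of capacity 2 achieve radius 1.
dist = [[abs(i - j) for j in range(4)] for i in range(4)]
assert capacitated_k_center(dist, 2, [2, 2, 2, 2]) == 1
# With capacity 1 each, two centers cannot serve four nodes.
assert capacitated_k_center(dist, 2, [1, 1, 1, 1]) == float("inf")
```

The second assertion shows how hard capacities change the problem: the uncapacitated optimum for the same instance is still 1, but the capacity bound makes the instance infeasible for any choice of two centers.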