2014
DOI: 10.1016/b978-0-12-800160-8.00005-x

Classifying Problems into Complexity Classes

Abstract: A fundamental problem in computer science is, stated informally: given a problem, how hard is it? We measure hardness by looking at the following question: given a set A, what is the fastest algorithm to determine whether "x ∈ A"? We measure the speed of an algorithm by how long it takes to run on inputs of length n, as a function of n. For example, sorting a list of length n can be done in roughly n log n steps. Obtaining a fast algorithm is only half of the problem. Can you prove that there is no better algorithm…
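To make the "roughly n log n steps" claim concrete, here is a minimal sketch (in Python; illustrative, not code from the chapter) that sorts with merge sort while counting comparisons. The function name and the convention of counting one step per comparison are assumptions of this sketch; the observed counts track n log2 n.

```python
# Illustrative sketch (not from the chapter): merge sort with a
# comparison counter, showing that the work grows roughly as n log n.

def merge_sort(items):
    """Sort `items`; return (sorted_list, number_of_comparisons)."""
    if len(items) <= 1:
        return list(items), 0
    mid = len(items) // 2
    left, c_left = merge_sort(items[:mid])
    right, c_right = merge_sort(items[mid:])
    merged, comparisons = [], c_left + c_right
    i = j = 0
    while i < len(left) and j < len(right):
        comparisons += 1          # one comparison per merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])       # one side is exhausted; copy the rest
    merged.extend(right[j:])
    return merged, comparisons

if __name__ == "__main__":
    import math, random
    for n in (1_000, 10_000, 100_000):
        _, c = merge_sort([random.random() for _ in range(n)])
        print(f"n={n:>7}: {c:>10} comparisons, n*log2(n) ≈ {n * math.log2(n):,.0f}")
```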

Cited by 6 publications (2 citation statements)
References 58 publications (50 reference statements)
“…Some researchers might argue that we should evaluate the efficiency of a proposal building on its theoretical time or space complexity, but we do not think that such an approach is realistic because only a few authors have characterised the theoretical complexity of their proposals; furthermore, many of them have characterised an upper bound to the actual theoretical complexity to prove that their proposals are computationally tractable, not their actual complexity; even worse: even if we knew the exact theoretical complexity of every proposal, the relationships amongst most theoretical complexity classes are still open problems in computer science [71].…”
Section: A5. A Note on Implementation-Related Measures
confidence: 99%
“…This dovetails nicely with results from formal language theory regarding the processing complexity of these language classes. Time complexity of the recognition problem for regular languages is linear in length of the input string, while space complexity is constant [12]. In contradistinction, standard parsing algorithms for context-free languages (such as the CYK-algorithm) require cubic time complexity (meaning: the number of steps that a deterministic computer requires to decide whether a given string belongs to a given context-free grammar is bounded by a cubic function of the length of the string) and quadratic space complexity (meaning: the maximal number of memory cells is bounded by a quadratic function of the length of the string) [13].…”
Section: Introduction
confidence: 99%
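As an illustration of the cubic bound mentioned in the quotation above, here is a minimal CYK sketch in Python (an assumption of this note, not code from either cited paper). Three nested loops over substring length, start position, and split point give O(n^3) time; the membership table uses O(n^2) space. The toy grammar for { a^k b^k : k ≥ 1 } is likewise illustrative.

```python
# Illustrative sketch: CYK membership test for a context-free grammar
# in Chomsky normal form (CNF). Three nested loops over string positions
# give O(n^3) time; the table T uses O(n^2) space.

def cyk(word, terminal_rules, binary_rules, start="S"):
    """Return True iff `word` is derivable from `start`.

    terminal_rules: dict terminal -> set of nonterminals (rules A -> a)
    binary_rules:   dict (B, C)  -> set of nonterminals (rules A -> B C)
    """
    n = len(word)
    if n == 0:
        return False  # the CNF grammars here do not derive the empty string
    # T[i][j] = set of nonterminals deriving word[i : i + j + 1]
    T = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        T[i][0] = set(terminal_rules.get(ch, ()))
    for span in range(2, n + 1):               # substring length
        for i in range(n - span + 1):          # substring start
            for split in range(1, span):       # split point
                for (B, C), heads in binary_rules.items():
                    if B in T[i][split - 1] and C in T[i + split][span - split - 1]:
                        T[i][span - 1] |= heads
    return start in T[0][n - 1]

# Toy CNF grammar for { a^k b^k : k >= 1 } (an assumption for demonstration):
# S -> A X | A B,  X -> S B,  A -> a,  B -> b
terminals = {"a": {"A"}, "b": {"B"}}
binaries = {("A", "X"): {"S"}, ("A", "B"): {"S"}, ("S", "B"): {"X"}}
print(cyk("aabb", terminals, binaries))   # True
print(cyk("aab", terminals, binaries))    # False
```

By contrast, recognizing a regular language needs only a single left-to-right pass of a finite automaton, one transition per input symbol, which is the linear-time, constant-space bound the quotation attributes to [12].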