2021
DOI: 10.1007/978-3-030-79416-3_18
Lower Bounds and Hardness Magnification for Sublinear-Time Shrinking Cellular Automata

Cited by 2 publications (4 citation statements)
References 22 publications
“…Our main motivation is to analyze to what extent, if at all, the addition of randomness to the model is able to make up for inherent limitations of it. For instance, models of sublinear-time cellular automata are usually restricted to a local view of their input [16] and are also unable to cope with long unary subwords [15]. As a result of our analysis, we are able to demonstrate yet another connection between randomness and counting, as has been observed in the past in multiple areas of complexity theory (for various examples thereof, see [1,8]).…”
Section: Introduction
confidence: 57%
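To illustrate the locality restriction the quoted passage refers to, here is a minimal sketch (not the model from the paper) of one synchronous step of a one-dimensional cellular automaton: each cell's new state depends only on its radius-1 neighborhood, so no cell ever sees the input globally.

```python
# Illustrative sketch: one synchronous step of a 1D cellular automaton
# with periodic boundaries. Each cell updates from a radius-1 local view
# (left neighbor, itself, right neighbor) only.

def ca_step(cells, rule):
    """Apply one update; `rule` maps (left, self, right) -> new state."""
    n = len(cells)
    return [rule((cells[(i - 1) % n], cells[i], cells[(i + 1) % n]))
            for i in range(n)]

# Example local rule: XOR of the two neighbors (elementary rule 90).
rule90 = lambda neigh: neigh[0] ^ neigh[2]
print(ca_step([0, 1, 0, 0, 1], rule90))  # -> [0, 0, 1, 1, 0]
```

Because every cell applies the same purely local rule, global properties such as the length of a unary subword are invisible to any single cell, which is the limitation the citing paper analyzes.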
“…Theorem 2 indicates that unconditional time-efficient derandomization results for PACA are beyond the reach of current techniques, so perhaps one should consider space-efficient derandomization instead. Indeed, it is straightforward to show that PACA can be simulated by space-efficient machines (e.g., using an adaptation of the algorithm from [15]); hence, it is possible to recast the many constructions of PRGs that fool space-bounded machines (e.g., [9,17]) as PRGs that fool PACA. Nevertheless, we expect better constructions can be obtained from exploiting the locality of PACA (a restriction space-bounded machines do not share).…”
Section: Pseudorandom Generators
confidence: 99%
“…Theorem 6 and the original speedup of Oliveira and Santhanam can be interpreted as hardness magnification theorems. Hardness magnification, developed in a series of recent papers [33,27,31,25,9,10,7,6,8,26,11], is an approach to proving strong complexity lower bounds by reducing them to seemingly much weaker lower bounds; see [6] for a more comprehensive survey. For example, it turns out that in order to prove that functions computable in nondeterministic quasipolynomial time are hard for NC¹, it suffices to show that a parameterized version of the minimum circuit size problem MCSP is hard for AC⁰[2].…”
Section: Learning Speedup
confidence: 99%