2010
DOI: 10.1145/1814370.1814389

Typically-correct derandomization

Abstract: A fundamental question in complexity theory is whether every randomized polynomial-time algorithm can be simulated by a deterministic polynomial-time algorithm (that is, whether BPP = P). A beautiful theory of derandomization was developed in recent years in an attempt to solve this problem. In this article we survey some recent work on relaxed notions of derandomization that allow the deterministic simulation to err on some inputs. We use this opportunity to also provide a brief overview of some results and researc…
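For context, a common way to formalize the relaxed notion described in the abstract is the following; the exact error bound varies across the surveyed results, so the parameters below are only an illustrative sketch, not a definition taken from the survey itself.

```latex
% Illustrative formalization of a "typically-correct" simulation.
% A deterministic algorithm $D$ is an $\epsilon(n)$-typically-correct
% simulation of a function $f$ (computed by a randomized algorithm) if
\[
  \Pr_{x \sim \{0,1\}^n}\bigl[\, D(x) \neq f(x) \,\bigr] \;\le\; \epsilon(n)
  \quad \text{for every input length } n,
\]
% e.g.\ with $\epsilon(n) = 2^{-\Omega(n)}$ in the strongest results,
% whereas standard (everywhere-correct) derandomization requires
% $D(x) = f(x)$ for all inputs $x$.
```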

Cited by 9 publications (5 citation statements, 2012–2022); References 45 publications.
“…[GW02] pioneered the idea that randomized algorithms which are "typically correct" can be derandomized by harvesting randomness from the input itself. This idea has often been used for various kinds of derandomization tasks (see the surveys [Sha10,HW12] or the related work section in [Hoz17]).…”
Section: Harvesting Multiple Access Randomness From the Input: Hardne… (mentioning)
confidence: 99%
“…• The proofs of [Zim07, Sha11] are based on a general framework, due to Goldreich and Wigderson [GW02], of "derandomization by extracting randomness from the input". (See [Sha10] for an excellent survey of this framework.) Both works use extractors within this framework to tame the correlations between the uniform random input (x ∼ {0,1}^n) and the randomness employed by the query algorithm (r ∼ {0,1}^m): Zimand uses exposure resilient extractors, and Shaltiel uses extractors for bit-fixing sources.…”
Section: Theorem 8 ([Zim07]) (mentioning)
confidence: 99%
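To make the cited framework concrete, here is a minimal, hypothetical Python sketch of "derandomization by extracting randomness from the input": instead of tossing coins, the deterministic simulation feeds the input through an extractor and uses the output as the algorithm's random string. The toy algorithm, the toy extractor, and all parameter choices are illustrative assumptions only, not constructions from [GW02], [Zim07], or [Sha11].

```python
import hashlib


def randomized_alg(x: bytes, r: bytes) -> bool:
    """Stand-in for a randomized algorithm A(x, r) that uses m random bits.
    (Hypothetical placeholder; any BPP-style algorithm would go here.)"""
    # Toy decision: parity of the input bytes shifted by one "random" bit.
    return (sum(x) + (r[0] & 1)) % 2 == 0


def toy_extractor(x: bytes, m: int = 16) -> bytes:
    """Toy stand-in for an extractor Ext: {0,1}^n -> {0,1}^m.
    Real typically-correct derandomizations use seedless extractors for
    structured sources (e.g. bit-fixing or exposure-resilient extractors);
    a hash is NOT such an extractor and only illustrates the data flow."""
    return hashlib.sha256(x).digest()[:m]


def deterministic_simulation(x: bytes) -> bool:
    """GW02-style simulation D(x) = A(x, Ext(x)): the input itself supplies
    the "random" bits, so D is deterministic.  It may err on the atypical
    inputs x for which Ext(x) is a bad random string for A on that x,
    which is exactly the typically-correct relaxation."""
    r = toy_extractor(x)
    return randomized_alg(x, r)


if __name__ == "__main__":
    print(deterministic_simulation(b"example input"))
```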
“…Goldreich and Wigderson's idea of using the input as a source of randomness for a typically-correct derandomization [GW02] has been applied and developed by several researchers [AT04, vMS05, KS05, Zim08, Sha11, KvMS12, SW14, Alm19]; see related survey articles by Shaltiel [Sha10] and by Hemaspaandra and Williams [HW12]. Researchers have proven unconditional typically-correct derandomization results for several restricted models, including sublinear-time algorithms [Zim08, Sha11], communication protocols [Sha11, KvMS12], constant-depth circuits [Sha11, KvMS12], and streaming algorithms [Sha11].…”
Section: Related Work (mentioning)
confidence: 99%