2012
DOI: 10.1007/978-3-642-32009-5_42
Efficient Dissection of Composite Problems, with Applications to Cryptanalysis, Knapsacks, and Combinatorial Search Problems

Abstract: In this paper we show that a large class of diverse problems have a bicomposite structure which makes it possible to solve them with a new type of algorithm called dissection, which has much better time/memory tradeoffs than previously known algorithms. A typical example is the problem of finding the key of multiple encryption schemes with r independent n-bit keys. All the previous error-free attacks required time T and memory M satisfying T·M = 2^{rn}, and even if "false negatives" are allowed, no at…
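The T·M = 2^{rn} tradeoff mentioned in the abstract is the classical meet-in-the-middle baseline. A minimal sketch for r = 2, using a toy invertible 8-bit "cipher" (the cipher, key size N, and all function names here are illustrative assumptions, not the paper's construction): storing all 2^n midpoints E_{k1}(p) and matching them against D_{k2}(c) recovers the key pair with time and memory each about 2^n, so T·M = 2^{2n}.

```python
N = 8  # toy key/block size in bits (stands in for n)

def enc(key, block):
    # toy invertible 8-bit "cipher": XOR then an affine step mod 256 (illustrative only)
    return ((block ^ key) * 167 + key) % 256

def dec(key, block):
    # inverse of enc: undo the affine step (167 is odd, hence invertible mod 256), then the XOR
    return ((pow(167, -1, 256) * (block - key)) % 256) ^ key

def mitm_double(plain, cipher_text):
    """Recover all (k1, k2) with E_{k2}(E_{k1}(p)) = c using ~2^N memory and ~2^N time."""
    forward = {}
    for k1 in range(2 ** N):              # store every midpoint E_{k1}(p)
        forward.setdefault(enc(k1, plain), []).append(k1)
    matches = []
    for k2 in range(2 ** N):              # meet in the middle: look up D_{k2}(c)
        for k1 in forward.get(dec(k2, cipher_text), []):
            matches.append((k1, k2))
    return matches

k1, k2 = 0x3A, 0xC5
p = 0x11
c = enc(k2, enc(k1, p))
assert (k1, k2) in mitm_double(p, c)
```

With a single plaintext/ciphertext pair many false candidates survive; in practice a second pair filters them. Dissection algorithms improve on this curve by splitting the r-round composition at several intermediate points rather than one.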

Cited by 62 publications (86 citation statements)
References 14 publications
“…While this may not be a surprise from the point of view of quantum complexity theory (see e.g. the conclusion of [3]), this suggests that the time-space product, a common way of evaluating classical attacks [7], may not be the correct figure of merit to evaluate quantum attacks.…”
Section: Quantum Attacks Against Iterated Block Ciphers
Mentioning; confidence: 99%
“…The problem of merging two large lists with respect to a group-wise Boolean relation has been defined and addressed by Naya-Plasencia in [33,Section 2]. Here, we focus on three algorithms proposed in [33], namely instant matching, gradual matching and an improvement of the parallel matching due to [19]. We provide general and precise formulas for the average time and memory complexities of these three algorithms.…”
Section: Merging the Two Lists Efficiently
Mentioning; confidence: 99%
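The statement above concerns merging two lists under a group-wise relation. A hedged sketch of the idea in its simplest, hash-based form (the function name, the choice of equality on the first u groups as the relation, and the parameters are illustrative assumptions, not the exact formulation of the cited papers): index one list by the constrained groups, then stream the other list through it, so the merge costs time linear in the list sizes plus the number of matches instead of quadratic.

```python
from collections import defaultdict

def merge_lists(L1, L2, u):
    """Return all pairs (x, y) in L1 x L2 that agree on their first u groups."""
    index = defaultdict(list)
    for y in L2:                      # index L2 by its first u groups
        index[tuple(y[:u])].append(y)
    out = []
    for x in L1:                      # one pass over L1: O(|L1| + |L2| + #matches)
        for y in index.get(tuple(x[:u]), []):
            out.append((x, y))
    return out

L1 = [(1, 2, 3), (4, 5, 6), (1, 2, 9)]
L2 = [(1, 2, 7), (4, 0, 0)]
# pairs agreeing on the first two groups:
print(merge_lists(L1, L2, 2))  # → [((1, 2, 3), (1, 2, 7)), ((1, 2, 9), (1, 2, 7))]
```

The algorithms analyzed in the quoted work handle general group-wise Boolean relations, not just equality, which is what makes the instant/gradual/parallel matching distinctions meaningful.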
“…The details are provided by Algo 3. This algorithm applies an idea from [19] to the parallel matching algorithm from [33]: instead of building a big auxiliary list as in the original parallel matching, we here build small ones which do not need any additional memory. In parallel matching, the elements in both lists are decomposed into three parts: the first t 1 groups, the next t 2 groups, and the remaining (t − t 1 − t 2 ) groups.…”
Mentioning; confidence: 99%
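The decomposition described above (first t1 groups, next t2 groups, remainder) can be sketched in a strongly simplified form. This is an assumption-laden illustration, not the memory-optimized variant from the cited papers: it uses plain equality per group instead of a general Boolean relation, and two small indexes whose candidate sets are intersected before the remaining groups are checked, in the spirit of avoiding one large auxiliary list.

```python
from collections import defaultdict

def parallel_match(L1, L2, t1, t2):
    """Pairs (x, y) equal on all groups, matched separately on the first t1
    and the next t2 groups via two indexes, then intersected and verified."""
    idx1, idx2 = defaultdict(set), defaultdict(set)
    for j, y in enumerate(L2):
        idx1[tuple(y[:t1])].add(j)                  # index on the first t1 groups
        idx2[tuple(y[t1:t1 + t2])].add(j)           # index on the next t2 groups
    out = []
    for x in L1:
        cand = idx1.get(tuple(x[:t1]), set()) & idx2.get(tuple(x[t1:t1 + t2]), set())
        for j in cand:
            if L2[j][t1 + t2:] == x[t1 + t2:]:      # verify the remaining groups
                out.append((x, L2[j]))
    return out

pairs = parallel_match([(1, 2, 3, 4)], [(1, 2, 3, 4), (1, 2, 9, 9)], 1, 1)
# only the pair that also agrees on the remaining groups survives the final check
```

The point of the per-element small auxiliary lists in the quoted algorithm is that this two-stage filtering needs no extra memory beyond the two indexes.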
“…The papers [6,9] have the basic idea of guessing one internal state in MITM attacks, but their attacks only succeed in improving memory and data complexities, but not time complexity, of the previous work in [12]. We also note that there is an independent work [8] with similar ideas, but the authors focus on optimizing time-memory trade-offs for composite problems, and their analysis is only applied to the cases where all sub-ciphers have independent keys.…”
Section: Introduction
Mentioning; confidence: 96%