2017
DOI: 10.1007/978-3-662-54365-8_2
Improved Algorithms for the Approximate k-List Problem in Euclidean Norm

Cited by 27 publications (36 citation statements).
References 15 publications.
“…Tuple lattice sieving aims to overcome the main drawback of classical lattice sieving methods by using less memory at the cost of more time, offering a tradeoff between sieving and enumeration. After Bai-Laarhoven-Stehlé [8] made a first step towards analyzing tuple lattice sieving, Herold-Kirshanova [18] significantly improved upon this by (i) proving what the memory requirements are for arbitrary tuple sizes; (ii) analyzing which configurations of tuples one should look for; (iii) showing that tuple sieving can be modified to use much less time; and (iv) showing how a near-neighbor-like method called Configuration Extension can be used to further reduce the asymptotic time complexity. As an example, their optimized triple sieve requires 2^(0.1887d+o(d)) memory and 2^(0.3717d+o(d)) time.…”
Section: SVP Algorithms
confidence: 99%
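The exponents quoted in the excerpt can be put side by side with the classical pairwise-sieve baseline. A minimal numeric sketch; the baseline list-size exponent log2(4/3)/2 ≈ 0.2075 for pairwise sieving is taken from the standard sieving literature, not from this excerpt:

```python
import math

# Pairwise (2-)sieve baseline: list size (4/3)^(d/2) = 2^(0.2075 d + o(d)).
# This exponent is an assumption from the standard sieving literature.
pairwise_mem = math.log2(4 / 3) / 2

# Triple-sieve figures quoted in the excerpt (Herold-Kirshanova [18]):
triple_mem, triple_time = 0.1887, 0.3717

print(f"pairwise sieve memory exponent : {pairwise_mem:.4f}")
print(f"triple sieve memory exponent   : {triple_mem:.4f}")
print(f"triple sieve time exponent     : {triple_time:.4f}")
# The triple sieve saves 2^((0.2075 - 0.1887) d) memory over the pairwise baseline,
# at the price of a larger time exponent.
```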
“…Tuple lattice sieving, originally proposed in [8] and later improved in [18], aims to overcome this large space requirement by combining multiple list vectors for reductions: instead of looking for short vectors x_1 ± x_2, one looks for short combinations x_1 ± ··· ± x_k with all x_i ∈ L. By considering a larger number of more intricate combinations of the list vectors, one hopes to reduce the required list size to make progress. As conjectured in [8] and later proved in [18], this is indeed the case: Lemma 1 (List sizes for tuple sieving [18, Theorem 3]). Let L_k ⊂ S^(d−1) consist of n uniformly random unit vectors.…”
Section: Sieving Complexities
confidence: 99%
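The search for short combinations x_1 ± ··· ± x_k described above can be illustrated for k = 3 with a brute-force toy sketch over random unit vectors. This is only an illustration of the problem statement, not the tuple-sieve algorithm of [18]: a real tuple sieve prunes candidates by their pairwise inner products ("configurations") instead of trying all triples.

```python
import math
import random
from itertools import combinations

def random_unit_vector(d):
    """Sample a uniform point on the sphere S^(d-1) by normalizing a Gaussian vector."""
    v = [random.gauss(0.0, 1.0) for _ in range(d)]
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def short_triples(vectors, target=1.0):
    """Brute-force k = 3 instance of the approximate k-list problem: return all
    triples whose sum has Euclidean norm below `target`. A real tuple sieve
    avoids this O(n^3) scan by filtering on pairwise inner products."""
    found = []
    for a, b, c in combinations(vectors, 3):
        s = [x + y + z for x, y, z in zip(a, b, c)]
        if math.sqrt(sum(t * t for t in s)) < target:
            found.append((a, b, c))
    return found

random.seed(42)
L = [random_unit_vector(8) for _ in range(50)]
triples = short_triples(L, target=1.2)
print(f"{len(triples)} short triples among {len(L)} random unit vectors")
```

For three uniformly random unit vectors the expected squared norm of the sum is 3, so triples with norm below 1 require unusually negative pairwise inner products; that scarcity is exactly why the list sizes in Lemma 1 matter.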
“…We compare ourselves to four of them, namely a baseline implementation [MV10], and three advanced sieve implementations [FBB+15, MLB17, HK17], which represent (to the best of our knowledge) the state of the art in three different directions. This is given in Table 1.…”
Section: Performance Comparison To the Literature
confidence: 99%
“…A long line of work, including [BGJ13, Laa15a, Laa15b, BDGL16], decreases this time complexity down to (3/2)^(n/2+o(n)) at the cost of more memory. Other variants (tuple sieving) are designed to lower the memory complexity [BLS16, HK17].…”
Section: Introduction
confidence: 99%
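To make the quoted time bound concrete, (3/2)^(n/2+o(n)) can be rewritten in base 2. A quick check; the side-by-side with the tuple-sieve exponents from the earlier excerpt is our juxtaposition, not a claim from this citing paper:

```python
import math

# (3/2)^(n/2 + o(n)) = 2^(c*n + o(n)) with c = log2(3/2) / 2 ≈ 0.2925.
c_pairwise_time = math.log2(3 / 2) / 2

# Tuple (triple) sieve exponents quoted in the earlier excerpt [HK17]:
c_tuple_time, c_tuple_mem = 0.3717, 0.1887

print(f"(3/2)^(n/2) = 2^({c_pairwise_time:.4f} n)")
print(f"triple sieve: time 2^({c_tuple_time} n), memory 2^({c_tuple_mem} n)")
# The tuple variants accept a worse time exponent in exchange for the
# smaller memory exponent, matching the trade-off described in the text.
```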