2022
DOI: 10.1088/1367-2630/ac94ef
Polynomial T-depth quantum solvability of noisy binary linear problem: from quantum-sample preparation to main computation

Abstract: The noisy binary linear problem (NBLP) is known to be computationally hard, and it therefore offers primitives for post-quantum cryptography. An efficient quantum NBLP algorithm exhibiting polynomial quantum-sample and time complexities has recently been proposed. However, the algorithm requires a large number of samples to be loaded in a highly entangled state, and it is unclear whether such a precondition for the quantum speedup can be met efficiently. Here, we present a complete analysis of …
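For orientation, the NBLP (also known as learning parity with noise) asks for a secret binary vector s given samples of the form (a, ⟨a, s⟩ ⊕ e mod 2), where a is uniformly random and e is a biased noise bit. A minimal classical sketch of sample generation, not drawn from the paper itself (function name and noise-rate value are illustrative assumptions):

```python
import random

def nblp_sample(secret, noise_rate, rng):
    """Generate one noisy binary linear sample (a, b).

    a       : uniformly random binary vector of the same length as `secret`
    b       : inner product <a, secret> mod 2, flipped with prob. noise_rate
    """
    a = [rng.randrange(2) for _ in secret]
    label = sum(ai & si for ai, si in zip(a, secret)) % 2
    if rng.random() < noise_rate:  # Bernoulli noise bit e
        label ^= 1
    return a, label

rng = random.Random(0)          # fixed seed for reproducibility
secret = [1, 0, 1, 1]
samples = [nblp_sample(secret, 0.125, rng) for _ in range(8)]
```

With noise_rate = 0 the problem reduces to plain Gaussian elimination over GF(2); the noise is what makes recovery of `secret` hard.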

Cited by 3 publications (2 citation statements)
References 36 publications (66 reference statements)
“…the initial-state reflection in our case, other than the data-access. We have used a similar approach in the previous work [33]. Nevertheless, if we consider a local-purpose system for data search only, one may think of an algorithm that uses only the one-hot encoding, for example, by developing a useful method to prepare or process the W-type entangled state.…”
Section: Discussion (mentioning)
confidence: 99%
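The W-type entangled state mentioned in the quote above is the equal superposition of all one-hot computational basis states, which is exactly why it pairs naturally with one-hot encoding. As a hedged illustration (a NumPy state-vector sketch, not a construction from the cited works), its amplitude vector can be written down directly:

```python
import numpy as np

def w_state(n):
    """Amplitude vector of the n-qubit W state:
    (|100...0> + |010...0> + ... + |000...1>) / sqrt(n),
    i.e. an equal superposition over all one-hot basis states."""
    psi = np.zeros(2 ** n)
    for k in range(n):
        psi[1 << k] = 1.0   # basis index with a single 1 at position k
    return psi / np.sqrt(n)
```

Preparing this state on hardware (rather than writing its vector classically) is the nontrivial step the quoted discussion alludes to.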
“…Consequently, this work improves on the NLP-solving algorithm by a linearithmic factor over the GKZ algorithm [4], including Refs. [5, 43, 44].…”
Section: (mentioning)
confidence: 99%