1978
DOI: 10.1109/tit.1978.1055873

On the inherent intractability of certain coding problems (Corresp.)

Abstract: The code C is the set of vectors satisfying XH = 0. In many applications it is desirable to know, for a given weight w, whether C contains any words of weight w, i.e., whether there is a vector of weight w satisfying XH = 0. Again, the best general algorithm known for deciding this requires an exponential search, in this case through all 2^k codewords, and a faster algorithm would be highly desirable.
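The exponential search through all 2^k codewords that the abstract describes can be sketched directly; the function name and the [7,4] Hamming code below are illustrative examples, not taken from the paper:

```python
from itertools import product

def has_codeword_of_weight(G, w):
    """Exhaustively search all 2^k codewords of the binary linear code
    generated by G (k rows of length n over GF(2)) for a word of
    Hamming weight exactly w -- the exponential search the abstract
    describes."""
    n = len(G[0])
    for msg in product([0, 1], repeat=len(G)):
        codeword = [0] * n
        for row, bit in zip(G, msg):
            if bit:
                codeword = [c ^ g for c, g in zip(codeword, row)]
        if any(msg) and sum(codeword) == w:  # skip the all-zero word
            return True
    return False

# Generator matrix of the [7,4] Hamming code (illustrative example)
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
print(has_codeword_of_weight(G, 3))  # minimum distance is 3 -> True
print(has_codeword_of_weight(G, 2))  # no weight-2 codeword -> False
```

The loop body is polynomial, but the outer loop runs 2^k times, which is exactly why a faster general algorithm would be significant.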

Cited by 1,196 publications (876 citation statements)
References 3 publications
“…As second pre-image resistance is strictly weaker than collision resistance, we will only check that the hash function is collision-free and resistant to inversion. We show that inversion and collision finding are related to two problems very close to syndrome decoding, which is a hard problem [3]. We describe them here and show (in the appendix) that they are also NP-complete.…”
Section: Theoretical Security
confidence: 88%
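The syndrome decoding problem invoked here has a simple decision form that can be checked by brute force over error supports; the following is a minimal sketch (the parity-check matrix and function name are illustrative):

```python
from itertools import combinations

def syndrome_decodable(H, s, w):
    """Decision version of syndrome decoding: is there an error vector e
    of Hamming weight <= w with H e^T = s over GF(2)?  H is given as a
    list of r rows of length n; brute force over all supports of size <= w."""
    n = len(H[0])
    for t in range(w + 1):
        for support in combinations(range(n), t):
            # the syndrome of e is the XOR of the columns of H in its support
            syndrome = [0] * len(H)
            for j in support:
                syndrome = [b ^ row[j] for b, row in zip(syndrome, H)]
            if syndrome == list(s):
                return True
    return False

# Parity-check matrix of the [7,4] Hamming code (illustrative)
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]
print(syndrome_decodable(H, [1, 1, 0], 1))  # s is column 0 of H -> True
print(syndrome_decodable(H, [1, 1, 0], 0))  # zero errors cannot give s -> False
```

Inverting a syndrome-based hash amounts to answering this question for a given s, which is why hardness of syndrome decoding underpins the security claim.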
“…The ISD algorithm consists of picking information sets at random until a valid one is found. Checking whether an information set is valid is done in polynomial time, so the exponential nature of the algorithm originates from the exponentially small probability of finding a valid information set: let V(r) be the cost of checking the validity of an information set, and P_w the probability that a random information set is valid; then the complexity of this algorithm is V(r)/P_w.…”
Section: Information Set Decoding
confidence: 99%
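The cost V(r)/P_w can be made concrete for the simplest (Prange-style) variant, in which an information set is valid exactly when it avoids all w error positions; the function below is an illustrative sketch, not code from the cited paper:

```python
from math import comb

def prange_expected_iterations(n, k, w):
    """Expected number of random information sets before a valid one is
    found, for the plain Prange-style variant: a random k-subset of the
    n positions is valid when it misses all w error positions, so
    P_w = C(n-k, w) / C(n, w), and the expected count is 1 / P_w."""
    p_w = comb(n - k, w) / comb(n, w)
    return 1.0 / p_w

# Toy instance: n=10, k=5, w=1 gives P_w = 5/10, so 2 tries on average
print(prange_expected_iterations(10, 5, 1))  # -> 2.0
```

For cryptographic parameters (w a constant fraction of n), 1/P_w grows exponentially in n, matching the quoted claim that the exponential cost comes entirely from the rarity of valid information sets.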
“…We reduce from the Syndrome Decoding problem for linear error-correcting codes, which is NP-complete. The proof for the case q = 2 is found in [2], and an extension to arbitrary fields is sketched in [39], page 1764. Let (n, k, d) be an error-correcting code.…”
Section: MinRank Is NP-Hard
confidence: 99%
“…Still, there are schemes that rely on an NP-hard problem while remaining practical, for example PKP by Shamir [31], CLE by Stern [35], or PPP by Pointcheval [27]. However, in our opinion the most interesting schemes are those related to coding, as the decoding problem(s) have been believed intractable since the 1970s [2]. There were many proposals [34,40,20,16,4], and the best of them is the scheme SD by Stern [34,40].…”
Section: Introduction
confidence: 99%