2008 5th International Symposium on Turbo Codes and Related Topics
DOI: 10.1109/turbocoding.2008.4658669

A separation algorithm for improved LP-decoding of linear block codes

Abstract: Maximum Likelihood (ML) decoding is the optimal decoding algorithm for an arbitrary linear block code and can be written as an Integer Programming (IP) problem. Feldman et al. relaxed this IP problem and presented a Linear Programming (LP)-based decoding algorithm for linear block codes. In this paper, we propose a new IP formulation of the ML decoding problem and solve the IP with generic methods. The formulation uses indicator variables to detect violated parity checks. We derive Gomory cuts from our fo…
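The LP relaxation the abstract refers to is Feldman et al.'s decoder over the fundamental polytope: minimize the channel log-likelihood-ratio cost subject to, for every check node, one inequality per odd-size subset of its neighbourhood. The sketch below illustrates that plain relaxation only, not the paper's separation algorithm or its Gomory cuts; the function name `lp_decode`, the use of `scipy.optimize.linprog`, and the (7,4) Hamming example are illustrative assumptions, not taken from the paper.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def lp_decode(H, llr):
    """LP decoding over the fundamental polytope (Feldman-style relaxation).

    H   : binary parity-check matrix, shape (m, n)
    llr : length-n log-likelihood ratios; positive values favour bit 0
    Returns the LP optimum x in [0, 1]^n (an integral x is an ML certificate).
    """
    m, n = H.shape
    A_ub, b_ub = [], []
    for j in range(m):
        N = np.flatnonzero(H[j])                 # neighbourhood N(j) of check j
        for size in range(1, len(N) + 1, 2):     # all odd-size subsets S of N(j)
            for S in itertools.combinations(N, size):
                row = np.zeros(n)
                row[list(S)] = 1.0
                row[[i for i in N if i not in S]] = -1.0
                # sum_{i in S} x_i - sum_{i in N(j)\S} x_i <= |S| - 1
                A_ub.append(row)
                b_ub.append(len(S) - 1)
    res = linprog(c=llr, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, 1)] * n, method="highs")
    return res.x

# (7,4) Hamming code parity-check matrix
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
# All-positive LLRs: every bit prefers 0, so the all-zero codeword is the LP optimum.
x = lp_decode(H, llr=np.ones(7))
```

The exponential number of odd-subset constraints per check is exactly what motivates separation approaches such as the one in this paper: generate only the violated inequalities on demand instead of enumerating them all up front.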

Cited by 12 publications (33 citation statements). References 22 publications.
“…To demonstrate the improvement offered by the proposed RPC search algorithm, we compare its error-correcting performance to that of LP/ALP decoding, BP decoding (sum-product algorithm with a maximum of 1000 iterations), the Separation Algorithm (SA) [4], and ML decoding for two [8].…”
Section: Numerical Resultsmentioning
confidence: 99%
“…, m of the parity-check matrix, corresponding to a check node in the associated Tanner graph, the linear inequalities used to form the fundamental polytope P are given by (4) where N(j) ⊆ {1, 2, . .…”
Section: A Lp Relaxation Of ML Decodingmentioning
confidence: 99%
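The quote above is truncated before equation (4). For context, the inequalities that define the fundamental polytope are standard in the LP-decoding literature; their usual form (a hedged reconstruction of the standard result, not of the elided equation (4) itself) is:

```latex
\sum_{i \in S} x_i \;-\; \sum_{i \in N(j) \setminus S} x_i \;\le\; |S| - 1,
\qquad \forall\, S \subseteq N(j),\ |S| \text{ odd},
```

where $N(j) \subseteq \{1, 2, \dots, n\}$ is the set of variable nodes adjacent to check node $j$, for $j = 1, \dots, m$.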
“…In practice, the τ_p^{R/I}[i]'s can be chosen to be a small constant or optimized online by lifting them into the objective function. Of course, the ℓ2-norm could also be used in (26), which is optimal for AWGN, at the cost of higher complexity (as convex quadratic programming, usually solved by second-order cone programming methods).…”
Section: Pilot Constraints Under Noisementioning
confidence: 99%