1999 IEEE International Conference on Communications (Cat. No. 99CH36311)
DOI: 10.1109/icc.1999.765550
A recurrent neural decoder for convolutional codes

Cited by 31 publications (19 citation statements)
References 5 publications
“…The proposed hard-decision decoding ANN scheme offers similar or better performance compared with the previously reported ANN decoders given in ; the soft-decision decoding scheme, however, outperforms them all. Although a like-to-like comparison of complexity is difficult to make because of the differences in neural architecture, a comparison of overall decoding complexity and delay can be made.…”
Section: Comparative Study of the Artificial Neural Network and the V…
confidence: 59%
“…The Viterbi algorithm (VA) is an efficient method for decoding convolutional codes using the maximum-likelihood sequence estimation technique. However, the complexity, delay, and memory requirements associated with the VA are relatively high, and hence sub-optimal decoding of convolutional codes using artificial neural networks (ANN) has been intensively studied . The ANN has been suggested as an alternative to the VA because of its low complexity, parallel processing, adaptability, and fault-tolerance capabilities.…”
Section: Introduction
confidence: 99%
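The quoted passage above contrasts ANN decoders with the Viterbi algorithm. As a point of reference, the following is a minimal sketch of hard-decision Viterbi decoding for a rate-1/2, constraint-length-3 convolutional code (generator polynomials 7 and 5 in octal); the code parameters, function names, and the injected single-bit error are illustrative assumptions, not the configuration studied in the cited paper.

# A minimal hard-decision Viterbi decoder for a rate-1/2, constraint-length-3
# convolutional code (generator polynomials 7 and 5 in octal). Illustrative
# only; parameters are assumed, not taken from the cited paper.

G = [0b111, 0b101]   # generator polynomials (octal 7, 5)
K = 3                # constraint length -> 2**(K - 1) = 4 trellis states


def conv_encode(bits):
    """Encode a bit sequence; returns the coded bits as a flat list."""
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state          # newest bit enters the register
        out.extend(bin(reg & g).count("1") % 2 for g in G)
        state = reg >> 1                      # oldest bit is shifted out
    return out


def viterbi_decode(received):
    """Maximum-likelihood sequence estimation over the trellis (Hamming metric)."""
    n_states = 1 << (K - 1)
    metrics = [0] + [float("inf")] * (n_states - 1)   # start in the all-zero state
    paths = [[] for _ in range(n_states)]             # survivor input sequences
    for t in range(0, len(received), len(G)):
        r = received[t:t + len(G)]
        new_metrics = [float("inf")] * n_states
        new_paths = [None] * n_states
        for state in range(n_states):
            if metrics[state] == float("inf"):
                continue
            for b in (0, 1):                           # hypothesised input bit
                reg = (b << (K - 1)) | state
                expected = [bin(reg & g).count("1") % 2 for g in G]
                branch = sum(x != y for x, y in zip(expected, r))
                nxt = reg >> 1
                if metrics[state] + branch < new_metrics[nxt]:
                    new_metrics[nxt] = metrics[state] + branch
                    new_paths[nxt] = paths[state] + [b]
        metrics, paths = new_metrics, new_paths
    best = min(range(n_states), key=lambda s: metrics[s])
    return paths[best]


msg = [1, 0, 1, 1, 0, 0]             # two trailing zeros flush the encoder
coded = conv_encode(msg)
coded[3] ^= 1                        # inject a single channel bit error
assert viterbi_decode(coded) == msg  # the single error is corrected

Each trellis stage keeps only the best path into each of the four encoder states, which is what keeps the complexity linear in the message length rather than exponential.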
“…[1], [2]). With the recent success of deep learning in a wide range of application areas (including natural language processing, image processing, autonomous driving, financial investment, computer games, and many others), neural networks have regained increasing interest in the domain of communication signal processing (e.g.…”
Section: Introduction
confidence: 99%
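In the same spirit as the neural-network decoders discussed above, the sketch below shows how a recurrent network can be set up to map received soft channel values back to information bits, here using a small PyTorch GRU. The architecture, layer sizes, loss, optimiser, and dummy training batch are assumptions chosen for illustration only and do not correspond to the network described in the 1999 paper.

# A minimal recurrent neural decoder sketch in PyTorch. Everything here
# (layer sizes, loss, optimiser, dummy data) is an illustrative assumption,
# not the architecture of the cited 1999 paper.

import torch
import torch.nn as nn


class RecurrentDecoder(nn.Module):
    def __init__(self, symbols_per_bit=2, hidden_size=32):
        super().__init__()
        # Each GRU time step consumes the n coded symbols produced per
        # information bit (n = 2 for a rate-1/2 convolutional code).
        self.rnn = nn.GRU(input_size=symbols_per_bit,
                          hidden_size=hidden_size,
                          batch_first=True)
        self.out = nn.Linear(hidden_size, 1)   # one logit per information bit

    def forward(self, received):
        # received: (batch, message_length, symbols_per_bit) soft channel values
        h, _ = self.rnn(received)
        return self.out(h).squeeze(-1)         # (batch, message_length) logits


decoder = RecurrentDecoder()
loss_fn = nn.BCEWithLogitsLoss()
optimiser = torch.optim.Adam(decoder.parameters(), lr=1e-3)

# Dummy batch standing in for encoded-plus-noise training data.
received = torch.randn(8, 16, 2)
targets = torch.randint(0, 2, (8, 16)).float()

logits = decoder(received)
loss = loss_fn(logits, targets)
loss.backward()
optimiser.step()
decoded_bits = (torch.sigmoid(logits) > 0.5).int()   # hard decisions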