Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000): Neural Computing — New Challenges and Perspectives for the New Millennium, 2000
DOI: 10.1109/ijcnn.2000.861490

Novel use of channel information in a neural convolutional decoder

Year Published

2005
2005
2009
2009

Publication Types

Select...
4
4

Relationship

0
8

Authors

Journals

Cited by 10 publications (6 citation statements) | References 7 publications

Citation statements, ordered by relevance:
“…However, the neural network structure was still used for existing algorithms. In 2000, another convolutional decoder using a recurrent neural network was developed [14]. It showed again that the performance of neural-network decoding approaches that of the Viterbi algorithm (VA), and that it can be easily implemented in hardware.…”
Section: Advances In Artificial Intelligence
confidence: 99%
“…In the past several years, substantial efforts have been made to apply RNNs in error-control coding theory. Initially, these networks were applied to block-code decoding [2,3] and then to convolutional [4-7] and turbo-code decoding [8]. In [5-7] it was shown that the decoding problem could be formulated as a function-minimization problem; the gradient descent algorithm was applied to decode convolutional codes of small code rate, and the developed recurrent artificial neural network (ANN) algorithm did not need any supervision.…”
Section: Introduction
confidence: 99%
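The decoding-as-minimization idea quoted above can be made concrete with a small sketch. This is a hypothetical illustration, not the algorithm of [5-7]: the (7, 5)-octal generators, the squared-error cost, and the plain numerical-gradient loop are all assumptions chosen for brevity. The relaxation trick is that XOR over bits in {0, 1} becomes a product over symbols in [-1, 1], so the codeword constraint is differentiable and gradient descent needs no supervision.

```python
import numpy as np

# Hypothetical rate-1/2 convolutional code: generators (7, 5) octal and
# constraint length K = 3 are illustrative assumptions.
G = (0b111, 0b101)
K = 3

def encode(bits):
    """Hard encoder: map each input bit to two coded symbols in {+1, -1}."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)
        for g in G:
            parity = bin(state & g).count("1") % 2
            out.append(1.0 - 2.0 * parity)
    return np.array(out)

def soft_encode(x):
    """Relaxed encoder: XOR in {0, 1} becomes a product in [-1, 1]."""
    xp = np.concatenate([np.ones(K - 1), x])      # +1 padding = zero bits
    out = []
    for t in range(len(x)):
        window = xp[t:t + K][::-1]                # window[0] = current bit
        for g in G:
            out.append(np.prod([window[k] for k in range(K) if (g >> k) & 1]))
    return np.array(out)

def cost(x, r):
    """Squared distance between the relaxed codeword and the received word."""
    return float(np.sum((soft_encode(x) - r) ** 2))

def decode_gd(r, n_bits, steps=500, lr=0.05, eps=1e-5):
    """Unsupervised decoding by numerical gradient descent on cost()."""
    x = np.zeros(n_bits)
    for _ in range(steps):
        grad = np.zeros(n_bits)
        for i in range(n_bits):
            up, dn = x.copy(), x.copy()
            up[i] += eps
            dn[i] -= eps
            grad[i] = (cost(up, r) - cost(dn, r)) / (2 * eps)
        x = np.clip(x - lr * grad, -1.0, 1.0)
    return (x < 0).astype(int)                    # +1 -> bit 0, -1 -> bit 1
```

On a noiseless received word the cost has its minimum of zero at the true ±1 codeword, so the descent drives x toward the transmitted bits; with channel noise the same loop acts as a soft-decision decoder.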
“…These expressions for the general-case encoder were applied to an example rate-1/2 encoder with constraint length L = 3, which is shown in Figure 1 alongside the unit impulse generator matrix obtained from the general form in (6).…”
Section: Development Of The Soft Decision Neural Network Algorithm
confidence: 99%
“…For the simple encoder analyzed in [6], defined by rate 1/2, constraint length L = 3, and the impulse generator matrix given in Figure 3.…”
Section: Example
confidence: 99%
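The "unit impulse generator matrix" the last two excerpts refer to can be sketched as follows. The (7, 5)-octal generators are an assumption, since the figures from the cited papers are not reproduced here; the point is only that feeding a single 1 bit through the encoder recovers each generator's tap pattern as one row of the matrix.

```python
# Hypothetical rate-1/2, constraint-length-3 encoder; generators (7, 5)
# octal are an assumption (Figure 3 of the cited paper is not shown here).
G = (0b111, 0b101)
K = 3

def encode_bits(bits):
    """Return the pair of output bits (hard, in {0, 1}) per input bit."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)
        out.append(tuple(bin(state & g).count("1") % 2 for g in G))
    return out

def impulse_generator_matrix():
    """Row j = response of output j to the unit impulse 1, 0, 0, ..."""
    response = encode_bits([1] + [0] * (K - 1))
    return [[pair[j] for pair in response] for j in range(len(G))]
```

For these generators the matrix comes out as [[1, 1, 1], [1, 0, 1]]: each row is simply that generator's taps in time order, which is why the matrix fully specifies the encoder.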