2011
DOI: 10.1186/1687-1499-2011-111

Minimum decoding trellis length and truncation depth of wrap-around Viterbi algorithm for TBCC in mobile WiMAX

Abstract: The performance of the wrap-around Viterbi decoding algorithm with finite truncation depth and fixed decoding trellis length is investigated for tail-biting convolutional codes in the mobile WiMAX standard. Upper bounds on the error probabilities induced by finite truncation depth and the uncertainty of the initial state are derived for the AWGN channel. The truncation depth and the decoding trellis length that yield negligible performance loss are obtained for all transmission rates over the Rayleigh channel …
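
To make the quantities in the abstract concrete, here is a minimal hard-decision sketch of the wrap-around idea: the received tail-biting block is extended circularly, one Viterbi pass is run over the extended trellis with all initial state metrics equal (reflecting the uncertainty of the initial state), and only the decisions for the central copy of the block are kept. The extension length plays the role of the truncation depth, and the total number of sections that of the decoding trellis length. The rate-1/2 code with constraint length 7 and generators (171, 133) in octal, as well as every function name below, are illustrative assumptions rather than details taken from the paper.

    K, M = 7, 6                      # constraint length and memory order (assumed)
    G = (0o171, 0o133)               # rate-1/2 generator polynomials, octal (assumed)
    NSTATES = 1 << M

    def branch_output(state, bit):
        # Two code bits produced when `bit` enters memory `state`
        # (bit M-1 of `state` holds the most recent past input).
        reg = (bit << M) | state
        return [bin(reg & g).count("1") & 1 for g in G]

    def next_state(state, bit):
        return (state >> 1) | (bit << (M - 1))

    def wava_decode(rx, n_info, depth):
        # One Viterbi pass over a circularly extended trellis of
        # n_info + 2*depth sections; `depth` acts as the truncation depth,
        # and all states start with equal metrics because the tail-biting
        # initial state is unknown to the decoder.
        L = n_info + 2 * depth
        metric = [0] * NSTATES
        survivors = []                              # (prev_state, bit) per section
        for t in range(L):
            sec = (t - depth) % n_info              # wrap the received block circularly
            r0, r1 = rx[2 * sec], rx[2 * sec + 1]
            new_metric = [None] * NSTATES
            choice = [None] * NSTATES
            for s in range(NSTATES):
                for b in (0, 1):
                    o0, o1 = branch_output(s, b)
                    m = metric[s] + (o0 ^ r0) + (o1 ^ r1)   # Hamming branch metric
                    ns = next_state(s, b)
                    if new_metric[ns] is None or m < new_metric[ns]:
                        new_metric[ns] = m
                        choice[ns] = (s, b)
            metric = new_metric
            survivors.append(choice)
        # Trace back from the best end state; keep only the central copy.
        state = min(range(NSTATES), key=lambda s: metric[s])
        bits = [0] * L
        for t in range(L - 1, -1, -1):
            state, bits[t] = survivors[t][state]
        return bits[depth: depth + n_info]

With the matching tail-biting encoder (see the sketch after the citation statement below), wava_decode(code_bits, len(data), depth=35) should recover the data block on an error-free channel; the question the paper studies is how small the truncation depth and the overall trellis length can be made before the loss becomes noticeable.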

Cited by 1 publication (1 citation statement, published 2013). References 9 publications.
“…Before encoding, the encoder memory is initialized with the last six data bits of the FEC block to be encoded. Thus, the initial state of the code trellis is the same as the end state, and the convolutional code is converted into a short block code called tail-biting convolutional code (TBCC) [16,17]. Each codeword, which is called coded FEC block, is interleaved and constellation-mapped into modulation symbols that in turn are mapped to slots of the data burst.…”
Section: Encoding
Confidence: 99%
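
The encoding step quoted in the citation above can be sketched in a few lines. This is an illustrative sketch under the same assumptions as the decoder sketch near the abstract: the standard rate-1/2, constraint-length-7 convolutional code with generators (171, 133) in octal; the function name and the bit-ordering convention are ours, not the cited paper's.

    K, M = 7, 6                 # constraint length and memory order (assumed)
    G = (0o171, 0o133)          # rate-1/2 generator polynomials, octal (assumed)

    def tbcc_encode(bits):
        # Tail-biting encoding of one FEC block: the shift register is
        # pre-loaded with the last M data bits, so the trellis start state
        # equals the state reached after the final input bit.
        reg = 0
        for b in bits[-M:]:                     # initialize memory with the block tail
            reg = (reg >> 1) | (b << (K - 1))
        out = []
        for b in bits:
            reg = (reg >> 1) | (b << (K - 1))   # shift the current bit in at the top
            for g in G:
                out.append(bin(reg & g).count("1") & 1)   # parity of the tapped bits
        return out                              # 2 * len(bits) code bits

Because the memory both starts and ends loaded with the last M data bits, no tail bits are appended and the rate loss of zero-tail termination is avoided; the unknown start state is precisely what makes a wrap-around decoder, and the truncation-depth analysis above, necessary.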