2012 6th International Conference on Signal Processing and Communication Systems
DOI: 10.1109/icspcs.2012.6508013

Vector equalization based on continuous-time recurrent neural networks

Abstract: The problem of vector equalization based on recurrent neural networks as a suboptimum scheme is considered from the analog signal processing point of view. We distinguish between discrete-time recurrent neural networks (DTRNNs) and continuous-time ones (CTRNNs). In contrast to the CTRNNs, the DTRNNs have been extensively investigated and implemented for the vector equalization task, with good results for channels with little to moderate interference. However, the growing demand for jointly high data rates and pow…
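As a rough illustration of the discrete-time variant mentioned in the abstract, the sketch below iterates a Hopfield-type update for BPSK vector equalization. It assumes a matched-filter front end delivering e = R x + n with a Hermitian, unit-diagonal correlation matrix R; the slope beta, iteration count, and the parallel update schedule are illustrative choices for this sketch, not parameters taken from the paper.

```python
import numpy as np

def dtrnn_equalize(e, R, beta=4.0, n_iter=50):
    """Discrete-time RNN vector equalization for BPSK (illustrative sketch).

    e      : matched-filter output vector, e = R @ x + noise
    R      : Hermitian channel correlation matrix with unit diagonal
    beta   : slope of the tanh activation (illustrative value)
    n_iter : number of parallel update sweeps
    """
    x_hat = np.zeros(len(e))            # soft symbol estimates, start at the origin
    off_diag = R - np.diag(np.diag(R))  # feedback uses only the cross-interference terms
    for _ in range(n_iter):
        # cancel the currently estimated interference, then squash to (-1, +1)
        x_hat = np.tanh(beta * (e - off_diag @ x_hat))
    return np.sign(x_hat)               # hard BPSK decisions

# toy usage: four interfering BPSK symbols over a synthetic correlation matrix
rng = np.random.default_rng(0)
x = rng.choice([-1.0, 1.0], size=4)
A = rng.normal(size=(4, 4))
R = A @ A.T
R /= np.sqrt(np.outer(np.diag(R), np.diag(R)))  # normalize the diagonal to one
e = R @ x + 0.05 * rng.normal(size=4)
print(dtrnn_equalize(e, R), x)
```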

Cited by 3 publications (5 citation statements)
References 19 publications
“…18 Layout of the vector equalizer and pin configuration. … 9, 10, 11, 12, 23, 24, 25, 26), six pins for the weights configuration (pads 13, 14, 18, 19, 20, 21), reset (pad 15), voltage supplies (pads 16, 17, 22) and grounds (square pads). The active area is approximately 0.09 mm², with a transistor count CNT = 171 for four neurons.…”
Section: Measurement Results
confidence: 99%
“…for l = t_equ. The above-stated conditions are valid for BPSK, but they can be generalized by combining the results of [10,18,19].…”
Section: Equalization Based On Continuous-time Recurrent Neural Network
confidence: 99%
“…However, this was limited to the binary phase-shift keying (BPSK) symbol alphabet ψ = {−1, +1}. This has been generalized to complex-valued symbol alphabets in [21] by combining the results of references [20,22,32]. Based thereon, it has been proven that the RNN ends in a local minimum of Eq.…”
Section: A Vector Equalization Based On RNN
confidence: 99%
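For illustration only: one common way to extend such a BPSK equalizer to a separable complex-valued alphabet such as QPSK is to apply the real-valued activation to the in-phase and quadrature components independently. This is an assumption made for the sketch below, not necessarily the construction used in the cited references [20,22,32].

```python
import numpy as np

def qpsk_activation(u, beta=4.0):
    """Illustrative complex-valued activation for a separable QPSK alphabet
    {±1 ± 1j}: the real-valued BPSK nonlinearity is applied to the in-phase
    and quadrature components independently (an assumption of this sketch)."""
    return np.tanh(beta * u.real) + 1j * np.tanh(beta * u.imag)

u = np.array([0.8 - 0.3j, -1.2 + 2.0j])
print(qpsk_activation(u))   # soft estimates drawn toward the QPSK corners
```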
“…In contrast to the common strategy when dealing with neural networks, RNNs have been shown to be able to perform vector equalization without the need for a training phase, in both the discrete-time case [3]–[5] and the continuous-time case [6], [7]. This is due to their Lyapunov stability under specific conditions [8], [9].…”
Section: Introduction
confidence: 99%
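A minimal sketch of the continuous-time case described in the quote above, assuming the same matched-filter model as the earlier discrete-time sketch: the feedback weights are fixed directly by the channel correlation matrix, so no training phase is involved, and Euler integration of the Hopfield-type ODE settles into a stable state. Step size, slope, and stopping time are illustrative values.

```python
import numpy as np

def ctrnn_equalize(e, R, beta=4.0, tau=1.0, dt=0.01, t_end=20.0):
    """Continuous-time RNN vector equalization for BPSK (Euler-integration sketch).

    The feedback weights are determined by the channel correlation matrix R,
    so the network needs no training; the dynamics are simply integrated
    until they settle.
    """
    W = -(R - np.diag(np.diag(R)))           # cancel cross-interference, no self-feedback
    u = np.zeros(len(e))                     # internal neuron states
    for _ in range(int(t_end / dt)):
        x = np.tanh(beta * u)                # instantaneous soft outputs
        u += (dt / tau) * (-u + W @ x + e)   # leaky integration of the ODE
    return np.sign(u)                        # hard BPSK decisions from the settled state
```

With symmetric weights, zero self-feedback, and a monotonically increasing activation, such dynamics admit a Lyapunov function and converge to a stable equilibrium, which is the property the quoted passage refers to.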