2020
DOI: 10.1109/tcomm.2020.3019077

Deep Transfer Learning-Based Downlink Channel Prediction for FDD Massive MIMO Systems

Cited by 144 publications (88 citation statements)
References 30 publications

“…10 shows the energy efficiency results for both full-duplex and half-duplex schemes. Moreover, we consider the utilization of the transfer block architecture to enhance the energy efficiency results by reducing the number of RF chains 11. As proven in Section V, the total achievable rate performance remains the same whether the transfer block architecture is utilized or not.…”
Section: Energy Efficiency (mentioning)
confidence: 99%
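The statement above pairs an unchanged achievable rate with fewer RF chains, so any energy-efficiency gain comes entirely from the power side. Below is a minimal sketch of that arithmetic, assuming a simple linear power model P = P_static + N_RF * P_RF; the model and every number are illustrative assumptions, not values from the citing paper.

```python
# Hypothetical energy-efficiency comparison: same sum rate, fewer RF chains.
# The linear power model P = P_static + N_RF * P_RF and all numbers are
# illustrative assumptions, not the citing paper's simulation parameters.

def energy_efficiency(sum_rate_bps, n_rf_chains, p_rf_w=0.25, p_static_w=1.0):
    """Energy efficiency in bit/Joule under a simple linear power model."""
    total_power_w = p_static_w + n_rf_chains * p_rf_w
    return sum_rate_bps / total_power_w

sum_rate = 1e9  # 1 Gbit/s, assumed identical with and without the transfer block
ee_without = energy_efficiency(sum_rate, n_rf_chains=16)  # no transfer block
ee_with = energy_efficiency(sum_rate, n_rf_chains=8)      # transfer block halves RF chains (assumed)

print(f"EE without transfer block: {ee_without / 1e6:.1f} Mbit/J")
print(f"EE with transfer block:    {ee_with / 1e6:.1f} Mbit/J")
```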
“…Nonetheless, the minimization problem for the current form of $\mathrm{MSE}_j$ depends on the effective SI channel $\bar{\mathbf{H}}_{\mathrm{SI},j}$, which is not available at node $j$. According to the SI channel model given in (11), the SI channel has two parts corresponding to the LoS paths (i.e., the near-field SI channel) and the NLoS paths (i.e., the far-field SI channel). After applying the antenna-separation-based SIC, the NLoS paths become dominant over the LoS paths (i.e., $\bar{\mathbf{H}}_{\mathrm{SI},j} = \mathbf{F}_{r,j} \mathbf{H}_{\mathrm{SI},j} \mathbf{F}_{t,j} \approx \mathbf{F}_{r,j} \mathbf{H}_{\mathrm{NLoS},j} \mathbf{F}_{t,j}$).…”
Citation type: mentioning
confidence: 99%
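A small numerical sketch of the approximation quoted above: once antenna-separation-based SIC suppresses the LoS (near-field) component, the effective SI channel F_r,j H_SI,j F_t,j is close to its NLoS-only counterpart. The antenna counts, the 40 dB suppression figure, the Gaussian channel draws, and the phase-shifter-style F_r and F_t are all illustrative assumptions, not the citing paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
N_rx, N_tx, N_rf = 32, 32, 4        # assumed antenna and RF-chain counts
los_suppression = 10 ** (-40 / 20)  # assumed 40 dB LoS attenuation from antenna separation

# Random placeholders for the near-field (LoS) and far-field (NLoS) SI components.
H_los = (rng.standard_normal((N_rx, N_tx)) + 1j * rng.standard_normal((N_rx, N_tx))) / np.sqrt(2)
H_nlos = (rng.standard_normal((N_rx, N_tx)) + 1j * rng.standard_normal((N_rx, N_tx))) / np.sqrt(2)

# Analog combiner F_r and precoder F_t with unit-modulus (phase-shifter) entries.
F_r = np.exp(1j * rng.uniform(0, 2 * np.pi, (N_rf, N_rx))) / np.sqrt(N_rx)
F_t = np.exp(1j * rng.uniform(0, 2 * np.pi, (N_tx, N_rf))) / np.sqrt(N_tx)

H_si = los_suppression * H_los + H_nlos   # SI channel after antenna-separation-based SIC
H_eff = F_r @ H_si @ F_t                  # effective SI channel seen by the RF chains
H_eff_nlos = F_r @ H_nlos @ F_t           # NLoS-only approximation

rel_err = np.linalg.norm(H_eff - H_eff_nlos) / np.linalg.norm(H_eff)
print(f"Relative error of the NLoS-only approximation: {rel_err:.2%}")
```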
“…We choose to introduce this layer before the deep learning layers as the neural network designs are driven by applications.
[7] | Sleep monitoring | CNN+RNN | Adversarial domain adaptation
Zheng et al 2019 [5] | Gesture recognition | CNN+RNN | HCF
Fhager et al 2019 [8] | Gesture recognition | CNN | Transfer learning
Tamzeed et al 2019 [11] | Gesture recognition | CNN+LSTM | Zero-shot learning
Yang et al 2020 [12] | Gesture recognition | CNN+LSTM | Teacher student network
Wang et al 2019 [13] | Pose estimation | U-Net+attention | Adversarial domain adaptation
Guan et al 2020 [6] | Imaging | CNN | GAN
Yang et al 2019 [14] | Channel prediction | FNN | Transfer learning…”
Section: Application Layer (mentioning)
confidence: 99%
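The last row of the table above pairs channel prediction with an FNN and transfer learning, which matches the topic of the paper this report covers. Below is a hedged sketch of what that combination typically looks like, assuming a generic fully connected network and a freeze-then-fine-tune transfer step; the layer sizes, data shapes, and training loop are placeholders, not the cited paper's architecture or procedure.

```python
import torch
import torch.nn as nn

# Hypothetical FNN mapping uplink CSI features to a downlink CSI estimate.
# Layer sizes, data shapes, and the freeze-then-fine-tune split are assumptions.
class ChannelFNN(nn.Module):
    def __init__(self, in_dim=64, hidden=128, out_dim=64):
        super().__init__()
        self.feature = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, out_dim)

    def forward(self, x):
        return self.head(self.feature(x))

model = ChannelFNN()
# ... pretrain `model` on plentiful channel samples from source environments ...

# Transfer step: freeze the shared feature extractor and adapt only the output
# layer with the few samples available in the new (target) propagation environment.
for p in model.feature.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(model.head.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
x_target = torch.randn(32, 64)  # placeholder target-domain inputs
y_target = torch.randn(32, 64)  # placeholder target-domain labels

for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x_target), y_target)
    loss.backward()
    optimizer.step()
```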
“…Since the OSNR features are similar for higher-order QAM, direct learning may cause negative transfer between different modulation formats. To avoid negative transfer among unrelated tasks, the task is recast from a single-source domain transfer problem into a multi-source domain transfer problem rather than being treated as a direct-transfer problem [21], as shown in Fig. 1(c).…”
Section: Principle of the Cascaded Neural Network (mentioning)
confidence: 99%
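One way to read the quoted strategy is that transfer should not be forced from a single, possibly unrelated source. Below is a minimal sketch of that idea using a simpler source-selection heuristic, not the cited cascaded multi-source scheme itself: each candidate source model is scored on a small target validation set, and only sources that beat a no-transfer baseline are kept. The modulation-format names, the toy data generator, the ridge-regression stand-in for per-format models, and the selection rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
w_true = rng.standard_normal(8)  # shared structure for "related" domains (illustrative)

def make_domain(related, n):
    """Generate a toy feature/label set; only 'related' domains share w_true."""
    X = rng.standard_normal((n, 8))
    y = X @ w_true + 0.1 * rng.standard_normal(n) if related else rng.standard_normal(n)
    return X, y

def fit_ridge(X, y, lam=1e-2):
    """Closed-form ridge regression: a stand-in for a per-format source model."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

# Toy source domains named after modulation formats (labels are illustrative only).
sources = {
    "QPSK": make_domain(related=True, n=200),
    "16QAM": make_domain(related=False, n=200),
    "64QAM": make_domain(related=False, n=200),
}
X_val, y_val = make_domain(related=True, n=20)  # small target-domain validation set

# Keep only sources whose model already beats a no-transfer baseline on the
# target validation set; unrelated sources are dropped to limit negative transfer.
baseline = float(np.mean(y_val ** 2))  # MSE of predicting zero
selected = {}
for name, (Xs, ys) in sources.items():
    w = fit_ridge(Xs, ys)
    if mse(w, X_val, y_val) < baseline:
        selected[name] = w

print("Sources kept for multi-source transfer:", sorted(selected))
```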