2021
DOI: 10.1109/tccn.2021.3084409
Downlink CSI Feedback Algorithm With Deep Transfer Learning for FDD Massive MIMO Systems

Abstract: In this paper, a channel state information (CSI) feedback method based on deep transfer learning (DTL) is proposed. The method addresses the high training cost of the downlink CSI feedback network in frequency division duplexing (FDD) massive multiple-input multiple-output (MIMO) systems. In particular, models for different wireless channel environments are obtained at low training cost by fine-tuning a pre-trained model with a relatively small number of samples. In addition, the effects of di…
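The fine-tuning idea described in the abstract can be illustrated with a toy sketch. This is not the paper's actual network: a linear autoencoder stands in for the CSI feedback encoder/decoder, and the environments, dimensions, learning rates, and step counts below are illustrative assumptions. A model pretrained on abundant samples from one channel environment is adapted to a new environment using only a few samples.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 32, 8  # CSI vector length and codeword (latent) length, chosen arbitrarily


def make_env(rng, d, scale):
    # Toy "channel environment": a mixing matrix that induces correlated CSI.
    return rng.normal(size=(d, d)) * scale / np.sqrt(d)


def sample_csi(A, n, rng):
    # Draw n correlated CSI sample vectors for environment A.
    return rng.normal(size=(n, A.shape[0])) @ A


def nmse(H, H_hat):
    # Normalized mean squared error between true and reconstructed CSI.
    return np.sum((H - H_hat) ** 2) / np.sum(H**2)


def train(W_enc, W_dec, H, steps, lr):
    # Plain gradient descent on the reconstruction MSE of a linear autoencoder.
    n = H.shape[0]
    for _ in range(steps):
        Z = H @ W_enc                      # encode: (n, k)
        E = Z @ W_dec - H                  # reconstruction residual: (n, d)
        g_dec = Z.T @ E / n                # gradient w.r.t. decoder weights
        g_enc = H.T @ (E @ W_dec.T) / n    # gradient w.r.t. encoder weights
        W_enc -= lr * g_enc
        W_dec -= lr * g_dec
    return W_enc, W_dec


# Pretrain on abundant samples from environment A.
W_enc = rng.normal(size=(d, k)) * 0.1
W_dec = rng.normal(size=(k, d)) * 0.1
H_a = sample_csi(make_env(rng, d, scale=1.0), n=2000, rng=rng)
W_enc, W_dec = train(W_enc, W_dec, H_a, steps=500, lr=0.05)

# New environment B: only a handful of samples are available.
H_b = sample_csi(make_env(rng, d, scale=1.5), n=50, rng=rng)
nmse_zero_shot = nmse(H_b, H_b @ W_enc @ W_dec)

# Transfer learning: briefly fine-tune the pretrained weights on the few new samples.
W_enc, W_dec = train(W_enc, W_dec, H_b, steps=200, lr=0.05)
nmse_finetuned = nmse(H_b, H_b @ W_enc @ W_dec)

print("zero-shot NMSE:", nmse_zero_shot)
print("fine-tuned NMSE:", nmse_finetuned)
```

The point of the sketch is the training-cost trade-off the paper studies: fine-tuning reuses the pretrained weights as initialization, so the new environment needs far fewer samples and gradient steps than training from scratch.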

Cited by 56 publications (19 citation statements); References 39 publications.
“…Online training [59], [138]: transfer learning and meta learning are introduced to accelerate online training at the BS; [139]: the feedback NN is trained at the user side, and FL is adopted; [140]: a new encoder is trained at the user side for a specific area without changing the decoder, and gossip learning is applied to the multiuser scenario;…”
Section: Data Collection and Online Training
confidence: 99%
“…Therefore, online training is essential. Inspired by [151], which applies transfer learning to DL-based CSI prediction, several novel online training strategies are introduced to DL-based CSI feedback in [59] and [138] to accelerate training convergence. Once the environment changes, the pretrained NNs are fine-tuned using new CSI samples, which is regarded as transfer learning.…”
Section: Data Collection and Online Training
confidence: 99%
“…These works exploit the conjecture that learning at the uplink frequency can be transferred to the downlink frequency with no further modification, as proposed for the first time in [30] and validated in [36]. Furthermore, this problem has been addressed through transfer learning in [37], where the authors investigate the trade-off between training cost and model performance. Second, deep autoencoders compress the channel into a latent space with fixed dimensionality, determined by the number of neurons in their middle layer.…”
Section: Introduction
confidence: 99%
“…To this end, a few studies are exploring potential solutions. For example, in [133] a MAML-based method is proposed to solve the challenge of the large number of samples required in a wireless channel environment, in order to train a deep neural network (DNN) with good results in terms of Normalized Mean Squared Error (NMSE). Furthermore, the authors in [134] propose a new decoder, namely the Model Independent Neural Decoder (MIND), based on a MAML methodology, achieving satisfactory parameter initialization in the meta-training stage and good accuracy results.…”
confidence: 99%