2019 IEEE 90th Vehicular Technology Conference (VTC2019-Fall)
DOI: 10.1109/vtcfall.2019.8891371
Deep Learning Based Fast Multiuser Detection for Massive Machine-Type Communication

Abstract: Massive machine-type communication (MTC), with sporadically transmitted small packets and low data rates, requires new designs at the PHY and MAC layers with light transmission overhead. Compressive sensing based multiuser detection (CS-MUD) is designed to detect active users through random access with low overhead by exploiting sparsity, i.e., the sporadic nature of transmissions in MTC. However, the high computational complexity of conventional sparse reconstruction algorithms prohibits the implementation of CS-…
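For context, the sparse recovery problem behind CS-MUD is commonly written as y = A x + n (a generic formulation, not quoted from the paper): the columns of the measurement matrix A carry the users' spreading/pilot sequences, the vector x is sparse because only a few devices transmit at any time, and detecting the active users amounts to recovering the support of x.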

Cited by 31 publications (40 citation statements). References 23 publications (19 reference statements).
“…In [33], a new block activation unit with a block size of six was proposed to handle block-sparse vectors in wideband wireless communication systems. Inspired by this, we use two 7 × 7 convolutional layers in CsiNet+ to replace the first 3 × 3 convolutional layer of CsiNet at the encoder.…”
Section: Modification
confidence: 99%
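A minimal sketch of the modification described in this quotation, assuming a PyTorch-style implementation; the channel counts, normalization, and activation are illustrative assumptions rather than the exact CsiNet+ configuration:

```python
import torch
import torch.nn as nn

class CsiNetPlusEncoderHead(nn.Module):
    """Hypothetical sketch of the encoder front end described above: two
    stacked 7x7 convolutions replace the single 3x3 convolution of the
    original CsiNet encoder. Channel counts are illustrative assumptions."""
    def __init__(self, in_channels: int = 2, mid_channels: int = 2):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels, mid_channels, kernel_size=7, padding=3),
            nn.BatchNorm2d(mid_channels),
            nn.LeakyReLU(0.3),
            nn.Conv2d(mid_channels, mid_channels, kernel_size=7, padding=3),
            nn.BatchNorm2d(mid_channels),
            nn.LeakyReLU(0.3),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 2, H, W) real/imaginary parts of the CSI matrix
        return self.conv(x)
```

The larger 7 × 7 kernels widen the receptive field, which is the stated motivation for borrowing the block-wise idea from the reviewed paper.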
“…The work in [55] compares different approaches of the Hierarchical Hard Thresholding Pursuit (HiHTP) algorithm. Machine learning approaches are also suggested, as in [56]–[60].…”
Section: A. Relevant Prior Art
confidence: 99%
“…Machine learning, BRNN [56]: • Consists of a feedforward neural network with interleaved fully connected layers and non-linear transformation layers. • Batch normalization is added for initialization, and residual connections are used to avoid vanishing/exploding gradients.…”
Section: MP-BSBL [52]
confidence: 99%
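A minimal sketch, assuming a PyTorch-style building block, of the residual fully connected structure this quotation describes; layer widths, activation choice, and the detection head are illustrative assumptions, not taken from [56]:

```python
import torch
import torch.nn as nn

class ResidualFCBlock(nn.Module):
    """Fully connected layer + batch normalization + non-linearity, wrapped
    with a residual (skip) connection to help avoid vanishing/exploding
    gradients, as in the quoted summary. Dimensions are illustrative."""
    def __init__(self, dim: int = 256):
        super().__init__()
        self.fc = nn.Linear(dim, dim)
        self.bn = nn.BatchNorm1d(dim)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Skip connection: output = input + transformed input
        return x + self.act(self.bn(self.fc(x)))

class BRNNSketch(nn.Module):
    """Stack of residual FC blocks followed by a head that scores user
    activity per device (hypothetical output layout)."""
    def __init__(self, in_dim: int, hidden: int = 256,
                 num_blocks: int = 4, num_users: int = 100):
        super().__init__()
        self.input = nn.Linear(in_dim, hidden)
        self.blocks = nn.Sequential(*[ResidualFCBlock(hidden)
                                      for _ in range(num_blocks)])
        self.head = nn.Linear(hidden, num_users)

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        # Per-user activity probabilities from the received measurements y
        return torch.sigmoid(self.head(self.blocks(self.input(y))))
```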
“…In particular, it is revealed in Reference 21 that deep learning can achieve better performance in user detection and channel estimation for IoT networks with a new block-restrict neural network (BRNN) structure. The work in Reference 21 shows that, with sufficient training, deep learning has advantages in both performance and computational complexity compared with OMP and other CS methods, and the computation time is reduced by more than an order of magnitude. However, although deep learning shows potential for this problem, the current network structure is simply a fully connected network, which cannot take full advantage of the power of deep neural networks.…”
Section: Introduction
confidence: 99%
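For context on the OMP baseline mentioned in this quotation, a minimal generic orthogonal matching pursuit sketch in NumPy; the variable names and the known-sparsity stopping rule are illustrative assumptions, not the exact baseline used in the cited comparison:

```python
import numpy as np

def omp(A: np.ndarray, y: np.ndarray, sparsity: int) -> np.ndarray:
    """Generic orthogonal matching pursuit: recover a sparse x from y = A x.
    In CS-MUD, the columns of A hold the users' pilot/spreading sequences and
    the recovered support of x identifies the active users. Sketch only."""
    residual = y.copy()
    support = []  # indices of detected active users
    x_hat = np.zeros(A.shape[1], dtype=A.dtype)
    for _ in range(sparsity):
        # Pick the column most correlated with the current residual.
        idx = int(np.argmax(np.abs(A.conj().T @ residual)))
        if idx not in support:
            support.append(idx)
        # Least-squares refit on the selected support, then update residual.
        A_s = A[:, support]
        coeffs, *_ = np.linalg.lstsq(A_s, y, rcond=None)
        residual = y - A_s @ coeffs
    x_hat[support] = coeffs
    return x_hat
```

The greedy loop and repeated least-squares refits are what make conventional CS reconstruction costly, which is the complexity gap the quoted comparison attributes to the learned detector.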