2023
DOI: 10.1109/tit.2023.3243869
Learning Maximum Margin Channel Decoders

Cited by 3 publications
(3 citation statements)
References 46 publications
“…Some exemplary (relatively) recent applications of (squared) Mahalanobis distance can be found e.g. in Xu et al [301], Sun et al [302], Timmermann et al [303], Wauchope et al [304], Weinberger [305], Wen et al [306], Yang et al [307], Zhang et al [308], Burssens et al [309], Choi et al [310], Choi et al [311], Dahlin et al [312], Ebrahimi et al [313], Jeong et al [314], Kim et al [315], Nowakowski et al [316], Qu et al [317], Rabby et al [318], Sarno et al [319], Tang et al [320], Tsvieli & Weinberger [321], Zhang et al [322], Zhou et al [323]. Some exemplary (relatively) recent applications of generally non-separable (ordinary/classical) Bregman distances appear e.g.…”
Section: A Further Divergences and Friends
confidence: 99%
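For reference, the squared Mahalanobis distance mentioned in the excerpt above is d²(x, μ) = (x − μ)ᵀ Σ⁻¹ (x − μ). A minimal sketch (function name and toy values are illustrative, not from the cited works):

```python
import numpy as np

def sq_mahalanobis(x, mu, sigma):
    """Squared Mahalanobis distance (x - mu)^T Sigma^{-1} (x - mu).

    Solving the linear system avoids explicitly inverting Sigma.
    """
    diff = x - mu
    return float(diff @ np.linalg.solve(sigma, diff))

# Toy example: variance 4 along the first axis shrinks that
# coordinate's contribution, so d^2 = 2^2 / 4 = 1.0.
x = np.array([2.0, 0.0])
mu = np.array([0.0, 0.0])
sigma = np.array([[4.0, 0.0],
                  [0.0, 1.0]])
print(sq_mahalanobis(x, mu, sigma))  # 1.0
```

With Σ = I this reduces to the ordinary squared Euclidean distance.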
“…Kurmukova et al. [239]: friendly jamming for improving decoding performance. Tsvieli et al. [240]: investigation of the problem of maximizing the margin of the decoder. Zhong et al. [241], [242]: DL-based decoders for spin-torque transfer magnetic random access memory.…”
Section: Other Approaches
confidence: 99%
“…the generalization properties, remains challenging. The authors in [240] addressed the problem of maximizing the margin of the decoder for an additive noise channel whose noise distribution is unknown, as well as for a nonlinear channel with AWGN. They formulated a maximum margin optimization problem, which is common in support vector machines (SVMs), for the decoder learning problem, and they relaxed it to a regularized loss minimization (RLM) problem by several approximation steps.…”
Section: Other Approaches
confidence: 99%
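The relaxation described above — replacing the max-margin constraint with a regularized loss minimization (RLM) over a hinge-type surrogate — can be sketched as follows. This is a hedged illustration, not the paper's exact formulation: it trains a linear score per codeword by subgradient descent on a regularized multiclass hinge loss, for an additive-noise channel with unknown noise distribution (all names, the codebook, and hyperparameters here are illustrative assumptions).

```python
import numpy as np

def train_rlm_decoder(X, labels, num_classes, lam=0.01, lr=0.1, epochs=200):
    """Regularized loss minimization with a multiclass hinge surrogate.

    Minimizes  lam*||W||^2 + (1/n) * sum_i max(0, 1 - (s_{y_i} - max_{k != y_i} s_k)),
    where s_k = W[k] @ x_i, via per-sample subgradient steps.
    """
    n, d = X.shape
    W = np.zeros((num_classes, d))
    for _ in range(epochs):
        for x, y in zip(X, labels):
            scores = W @ x
            rival = scores.copy()
            rival[y] = -np.inf
            k = int(np.argmax(rival))          # strongest wrong codeword
            grad = 2 * lam * W                 # regularizer subgradient
            if scores[y] - scores[k] < 1:      # hinge active: grow the margin
                grad[y] -= x
                grad[k] += x
            W -= lr * grad
    return W

# Toy usage: two antipodal codewords in R^2 under additive Gaussian noise.
rng = np.random.default_rng(0)
codebook = np.array([[1.0, 1.0], [-1.0, -1.0]])
labels = rng.integers(0, 2, size=200)
X = codebook[labels] + 0.3 * rng.standard_normal((200, 2))

W = train_rlm_decoder(X, labels, num_classes=2)
pred = np.argmax(X @ W.T, axis=1)            # decode by highest score
accuracy = (pred == labels).mean()
```

The key point mirrored from the excerpt is that the decoder is learned purely from noisy received samples, with no model of the noise distribution; the margin objective is what the SVM-style hinge loss approximates.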