Machine Learning for Future Wireless Communications 2019
DOI: 10.1002/9781119562306.ch17

Machine Learning for Digital Front‐End

Abstract: Our proposed method, RESETOX (REdo SEarch if TOXic), addresses the issue of Neural Machine Translation (NMT) generating translation outputs that contain toxic words not present in the input. The objective is to mitigate the introduction of toxic language without the need for re-training. When added toxicity is identified during inference, RESETOX dynamically adjusts the key-value self-attention weights and re-evaluates the beam search hypotheses. Experimental results demonstrate that RESETOX …

Cited by 4 publications (6 citation statements)
References 115 publications (4 reference statements)
“…The development of AI technology for wireless communication and RF systems has become an important challenge in academic and industrial communities. Additionally, AI addresses major challenges in communication networks by optimizing parameters against specifications such as latency, flexibility, and efficiency [147]. Figure 23 depicts the branches of AI, including machine learning and deep learning, with an explanation of the related subsets used to characterize communication channels in the digital domain.…”
Section: Optimization Methods in Communications (mentioning)
confidence: 99%
“…architecture (left) and illustration of the spectra of the complex baseband signal and the RF signal at a given PA output with typical RF impairments (right). Source: [5].…”
Section: A. State-of-the-Art (mentioning)
confidence: 99%
“…We define how to apply the original OLS technique to reduce the polynomial model basis functions and ANN dataset features in MIMO DPD, knowing that it is able to outperform OMP in DPD basis selection and perform as well as its later variants [34]. In polynomial-based DPD, PCA is commonly used in the identification subsystem to reduce the number of basis functions and avoid an ill-conditioned estimation while reducing the complexity of the LS calculation [5]. We propose using PCA to reduce the number of features in large ANN MIMO DPD datasets.…”
Section: B. Contribution and Novelty (mentioning)
confidence: 99%
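
The statement above refers to using PCA to reduce the number of features in a DPD identification dataset. Below is a minimal, illustrative NumPy sketch of that idea; the matrix shapes, the sample counts, and the helper name pca_reduce are assumptions for illustration, not the cited papers' implementation.

```python
# Illustrative sketch (not the cited papers' code): PCA-based feature
# reduction for a DPD dataset, assuming the features are arranged as a
# real-valued matrix X of shape (n_samples, n_features).
import numpy as np

def pca_reduce(X, n_components):
    """Project X onto its first n_components principal components."""
    X_centered = X - X.mean(axis=0)                  # remove the per-feature mean
    # SVD of the centered data: rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:n_components].T                 # shape (n_features, n_components)
    return X_centered @ components, components

# Hypothetical dataset: 10_000 samples of 64 candidate features
X = np.random.default_rng(0).normal(size=(10_000, 64))
X_reduced, components = pca_reduce(X, n_components=16)
print(X_reduced.shape)   # (10000, 16)
```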
“…The proposed NN shown in Figure 3 uses a feedforward fully connected (FC) structure. Based on the interconnection pattern or architecture, we can distinguish between feedforward networks (FNNs) and recurrent (or feedback) networks (RNNs) [27]. The feedforward network is considered since it is the most widely used NN and, according to the universal approximation theorem, it can approximate any non-linear function to any desired error [28].…”
Section: NN Training (mentioning)
confidence: 99%
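
The quoted passage describes a feedforward fully connected network. The following is a minimal NumPy sketch of such a structure with one hidden layer; the layer sizes and the tanh activation are illustrative assumptions and do not reproduce the architecture of Figure 3 in the citing paper.

```python
# Minimal feedforward fully connected network with one hidden layer.
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    # small random weights, zero biases
    return rng.normal(scale=0.1, size=(n_in, n_out)), np.zeros(n_out)

W1, b1 = init_layer(4, 16)    # input layer  -> hidden layer
W2, b2 = init_layer(16, 2)    # hidden layer -> output layer

def forward(x):
    """One forward pass: affine -> tanh -> affine (linear output)."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

y = forward(rng.normal(size=(8, 4)))   # batch of 8 input vectors
print(y.shape)                          # (8, 2)
```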
“…The feedforward network is considered since it is the most widely used NN and, according to the universal approximation theorem, it can approximate any non-linear function to any desired error [28]. An FC structure in a densely populated NN may increase hardware resource requirements, but in many applications the weights of some interconnections can be set to zero without loss of accuracy, which results in sparsely connected layers [27]. The sparse structure is out of the scope of this work.…”
Section: NN Training (mentioning)
confidence: 99%
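
The quoted passage notes that some interconnection weights can be set to zero without loss of accuracy, yielding sparsely connected layers (a topic the citing authors leave out of scope). As a purely illustrative sketch of that idea, the snippet below applies magnitude pruning to a weight matrix; the 50% sparsity level and the thresholding rule are assumptions, not taken from the cited work.

```python
# Illustrative magnitude pruning: zero out the smallest-magnitude weights
# of a layer to obtain a sparsely connected layer.
import numpy as np

def prune_by_magnitude(W, sparsity=0.5):
    """Zero out the fraction `sparsity` of entries of W with the smallest |w|."""
    k = int(sparsity * W.size)
    if k == 0:
        return W.copy()
    threshold = np.partition(np.abs(W).ravel(), k - 1)[k - 1]
    return np.where(np.abs(W) <= threshold, 0.0, W)

W = np.random.default_rng(1).normal(size=(16, 16))
W_sparse = prune_by_magnitude(W, sparsity=0.5)
print(np.mean(W_sparse == 0))   # fraction of zeroed weights, roughly 0.5
```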