2019
DOI: 10.1007/s00034-019-01049-6

Reduced-Complexity Polynomials with Memory Applied to the Linearization of Power Amplifiers with Real-Time Discrete Gain Control

Cited by 5 publications (8 citation statements)
References 37 publications
“…The ascendant method was proposed for memory polynomial models to reduce their number of parameters [12]. In this work, the ascendant method is employed to simplify the TLP implementation.…”
Section: Ascendant Methods (mentioning, confidence: 99%)
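For context on what the ascendant method prunes: a full memory polynomial with K nonlinearity orders and M+1 memory taps carries K*(M+1) coefficients. A minimal numpy sketch of that baseline model (coefficient values below are illustrative, not taken from the paper):

```python
import numpy as np

def memory_polynomial(x, coeffs):
    """Memory polynomial PA model:
    y(n) = sum_{m=0}^{M} sum_{k=1}^{K} a[k-1, m] * x(n-m) * |x(n-m)|**(k-1).
    The full model needs K*(M+1) coefficients; parameter-reduction schemes
    such as the ascendant method prune this set term by term."""
    K, M1 = coeffs.shape
    y = np.zeros_like(x, dtype=complex)
    for m in range(M1):
        # delayed input x(n-m), zero-padded at the start
        xd = np.concatenate([np.zeros(m, dtype=complex), x[:len(x) - m]])
        for k in range(1, K + 1):
            y += coeffs[k - 1, m] * xd * np.abs(xd) ** (k - 1)
    return y

# toy example: K = 3 nonlinearity orders, M = 2 memory taps -> 9 parameters
rng = np.random.default_rng(0)
x = rng.standard_normal(64) + 1j * rng.standard_normal(64)
coeffs = np.array([[1.00, 0.100, 0.050],   # k = 1 terms over taps m = 0..2
                   [0.20, 0.020, 0.010],   # k = 2
                   [0.05, 0.005, 0.001]],  # k = 3
                  dtype=complex)
y = memory_polynomial(x, coeffs)
```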
“…Regarding multimode real-time operation, besides the previously presented characteristics, the discontinuous AM-AM and AM-PM curves are a challenge. Modeling the contribution of each mode with a specific TLP is a possible solution [12]. Thus, the block diagram of the TLP-based model for the multimode PA is shown in Fig.…”
Section: TLP-Based Model for Multimode PAs (mentioning, confidence: 99%)
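One way to read "a specific TLP per mode" is a bank of sub-models indexed by the active gain mode, so the discontinuity between modes never has to be fitted by a single smooth curve. A hypothetical sketch, with simple memoryless polynomials standing in for the TLPs (model structure and coefficients are assumptions for illustration):

```python
import numpy as np

def multimode_model(x, mode_idx, mode_coeffs):
    """Hypothetical per-mode model bank: mode_idx[n] selects which sub-model
    shapes sample n, so discontinuities in the AM-AM/AM-PM curves across
    discrete gain modes are handled by switching models."""
    y = np.zeros_like(x, dtype=complex)
    for mode, coeffs in mode_coeffs.items():
        sel = mode_idx == mode
        xm = x[sel]
        for k, a in enumerate(coeffs, start=1):  # term a_k * x * |x|**(k-1)
            y[sel] += a * xm * np.abs(xm) ** (k - 1)
    return y

x = np.array([0.1 + 0.0j, 0.5 + 0.2j, 0.8 - 0.1j, 0.3 + 0.3j])
mode = np.array([0, 0, 1, 1])              # per-sample active gain mode
coeffs = {0: [1.0, 0.1], 1: [2.0, -0.2]}   # illustrative, not from the paper
out = multimode_model(x, mode, coeffs)
```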
“…As discussed in (ELAD, 2010) and (DAVIES; RILLING; BLUMENSATH, 2012), the performance guarantees of greedy pursuit are typically weaker than those of LASSO, both in terms of variable-selection consistency and accuracy. Note also that CS-based techniques have the additional advantage of reducing the DPD running complexity, whereas techniques that carry out a regression basis change, such as the principal component analysis (PCA) in (LOPEZ-BUENO et al., 2018) and (GILABERT et al., 2013), imply an increase in DPD running cost (SCHUARTZ et al., 2019). Examples in the literature that have applied LASSO to reduce the dimensionality of PA and DPD models are (ZENTENO et al., 2015), (KEKATOS; GIANNAKIS, 2011), and (WISELL; JALDEN; HANDEL, 2008).…”
Section: Sparse Least Squares Estimation (mentioning, confidence: 99%)
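The sparsification idea the passage refers to can be sketched with a generic iterative soft-thresholding (ISTA) solver for the complex-valued LASSO on a synthetic regression problem. This illustrates sparse estimation in general, not the specific estimators of the cited works; the data and penalty value are assumptions:

```python
import numpy as np

def lasso_ista(Phi, y, lam, n_iter=500):
    """Complex-valued LASSO via iterative soft-thresholding (ISTA):
    minimize 0.5*||Phi @ beta - y||^2 + lam * sum |beta_j|."""
    L = np.linalg.norm(Phi, 2) ** 2            # Lipschitz constant of the gradient
    beta = np.zeros(Phi.shape[1], dtype=complex)
    for _ in range(n_iter):
        grad = Phi.conj().T @ (Phi @ beta - y)
        z = beta - grad / L                    # gradient step
        mag = np.abs(z)                        # complex soft-threshold
        beta = z * np.maximum(0.0, 1.0 - (lam / L) / np.maximum(mag, 1e-12))
    return beta

# synthetic regression: only 2 of 10 basis columns are truly active
rng = np.random.default_rng(1)
Phi = rng.standard_normal((200, 10)) + 1j * rng.standard_normal((200, 10))
true = np.zeros(10, dtype=complex)
true[2], true[7] = 1.0, -0.5j
y = Phi @ true + 0.01 * rng.standard_normal(200)
beta = lasso_ista(Phi, y, lam=5.0)
# the inactive coefficients are driven exactly to zero by the soft-threshold
```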
“…Although orthogonalizing the regression matrix Φ simplifies the estimation and model-sizing tasks associated with the model in eq. (6.1), in the context of DPD the transformation of the original, observable regression basis Φ into an orthogonal basis Θ severely impacts the running cost of the model, since a matrix-to-matrix multiplication is required before the DPD filtering of the input signal (SCHUARTZ et al., 2019).…”
Section: Orthogonalization (mentioning, confidence: 99%)
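The trade-off described above can be made concrete with a small numpy sketch, assuming QR factorization as the orthogonalization (the cited works may use a different scheme): estimation in the orthogonal basis becomes trivial, but applying the model to new data then requires the extra matrix multiplication that raises the running cost.

```python
import numpy as np

rng = np.random.default_rng(2)
Phi = rng.standard_normal((100, 5))      # observable regression basis (toy data)
y = rng.standard_normal(100)

# QR factorization: Phi = Theta @ R, with Theta orthonormal
Theta, R = np.linalg.qr(Phi)

# Estimation in the orthogonal basis is trivial, since Theta.T @ Theta = I:
gamma = Theta.T @ y

# ...but running the model on new data first requires mapping the new
# regression matrix into the orthogonal basis -- the extra matrix-to-matrix
# multiplication that raises the DPD running cost:
Phi_new = rng.standard_normal((100, 5))
Theta_new = Phi_new @ np.linalg.inv(R)   # runtime basis transform
y_hat = Theta_new @ gamma

# The same model expressed directly in the original basis avoids that
# runtime transform: y_hat equals Phi_new @ beta with beta = inv(R) @ gamma.
beta = np.linalg.solve(R, gamma)
```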