2017
DOI: 10.1109/tsp.2017.2726991
Widely Linear Complex-Valued Kernel Methods for Regression

Abstract: Usually, complex-valued RKHS are presented as a straightforward application of the real-valued case. In this paper we prove that this procedure yields a limited solution for regression. We show that another kernel, here denoted as the pseudo-kernel, is needed to learn any function in complex-valued fields. Accordingly, we derive a novel RKHS to include it, the widely RKHS (WRKHS). When the pseudo-kernel cancels, WRKHS reduces to the complex-valued RKHS of previous approaches. We address the kernel and pseudo-kernel de…
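The widely linear formulation described in the abstract can be sketched via its composite (real-valued) representation: the complex targets are stacked as [Re y; Im y] and fit with a block kernel whose diagonal blocks differ, which is what makes the pseudo-kernel nonzero. This is a minimal illustrative sketch, not the paper's implementation; the RBF component kernels, function names, and the choice of a zero cross-block are all assumptions.

```python
import numpy as np

def rbf(X, Y, gamma):
    # RBF kernel matrix between row-sample matrices X (n, d) and Y (m, d).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_widely_linear_krr(X, y, gamma=1.0, lam=1e-6):
    """Composite-real kernel ridge regression for complex targets y.
    Different real/imaginary component kernels (k_rr != k_ii) give a
    nonzero pseudo-kernel; the cross-block is set to zero here, i.e. the
    real and imaginary parts are learned independently in this sketch."""
    n = X.shape[0]
    Krr = rbf(X, X, gamma)        # kernel for the real part
    Kii = rbf(X, X, 0.5 * gamma)  # a different width, so pseudo-kernel != 0
    K = np.block([[Krr, np.zeros((n, n))],
                  [np.zeros((n, n)), Kii]])
    t = np.concatenate([y.real, y.imag])  # stacked composite target
    return np.linalg.solve(K + lam * np.eye(2 * n), t)

def predict_widely_linear(Xtr, Xte, alpha, gamma=1.0):
    n, m = Xtr.shape[0], Xte.shape[0]
    K = np.block([[rbf(Xte, Xtr, gamma), np.zeros((m, n))],
                  [np.zeros((m, n)), rbf(Xte, Xtr, 0.5 * gamma)]])
    out = K @ alpha
    return out[:m] + 1j * out[m:]  # reassemble the complex prediction
```

With a small ridge parameter the fit nearly interpolates the complex training targets while keeping distinct smoothness for the real and imaginary channels.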


Cited by 23 publications (45 citation statements)
References 26 publications
“…Notice that this kernel and pseudo-kernel follow the structure introduced in [11]. The first entry of ŷ(i) in (13) yields the proposed generalized complex KLMS (gCKLMS)…”
Section: The Composite KLMS Algorithm
confidence: 99%
“…Therefore, these algorithms use a real-valued kernel k(x, x′) = 2k_rr(x, x′). The condition k_rj(x(i), x(l)) = 0 implies that the real and imaginary parts are not related and that one of them provides no information to learn the other [11].…”
Section: A Kernel Design
confidence: 99%
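The reduction this citation describes can be checked numerically: writing the complex kernel and pseudo-kernel in terms of hypothetical component kernels (the names k_rr, k_ii, k_ri, k_ir below are ours, chosen by analogy with covariance/pseudo-covariance decompositions, not taken from the paper's code), equal diagonal components with zero cross-components collapse the kernel to the real value 2·k_rr and cancel the pseudo-kernel.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # scalar RBF kernel between two feature vectors
    return np.exp(-gamma * np.sum((x - y) ** 2))

x, xp = np.array([0.3, -1.2]), np.array([1.0, 0.4])

# assumed decomposition: k = k_rr + k_ii + j(k_ir - k_ri),
#                        k~ = k_rr - k_ii + j(k_ir + k_ri)
k_rr = k_ii = rbf(x, xp)   # equal real/imaginary components
k_ri = k_ir = 0.0          # decoupled real and imaginary parts

k = k_rr + k_ii + 1j * (k_ir - k_ri)        # complex kernel
k_tilde = k_rr - k_ii + 1j * (k_ir + k_ri)  # pseudo-kernel

assert np.isclose(k, 2 * k_rr) and k.imag == 0.0  # real-valued 2*k_rr
assert np.isclose(k_tilde, 0.0)                   # pseudo-kernel vanishes
```

Under these assumptions the widely linear machinery is unnecessary, which matches the cited condition for previous complex-valued RKHS approaches to suffice.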