Higher Order Kernel Mean Embeddings to Capture Filtrations of Stochastic Processes
Preprint, 2021
DOI: 10.48550/arxiv.2109.03582

Cited by 2 publications (4 citation statements). References: 0 publications.
“…One notable example of a jointly controlled path satisfying Condition 3.6 is the signature kernel, a machine learning tool which has recently seen use in kernel methods in works such as [4, 13, 14, 16–18]. An important result for the computation of the signature kernel is that it satisfies a second-order Goursat PDE in the case where the paths are differentiable [17].…”
Section: Signature Kernels as Jointly Controlled Paths (mentioning)
confidence: 99%
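For reference, the Goursat PDE alluded to in that statement is usually written in the following form (a sketch in standard notation, assuming $x$ and $y$ are continuously differentiable paths and $k_{x,y}(s,t)$ denotes the signature kernel of $x$ restricted to $[0,s]$ and $y$ restricted to $[0,t]$; the conventions in the cited work [17] may differ):

$$\frac{\partial^2 k_{x,y}}{\partial s\,\partial t}(s,t) \;=\; \big\langle \dot{x}_s,\, \dot{y}_t \big\rangle\, k_{x,y}(s,t), \qquad k_{x,y}(0,\cdot) = k_{x,y}(\cdot,0) = 1.$$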
“…The second is the signature kernel, which has seen data science applications in works such as [4, 13, 14, 16–18]. This also gives us an alternative account of giving meaning to the rough integral equation $K_{(s,t),(u,v)}(X, \ldots$…”
Section: Introduction (mentioning)
confidence: 99%
“…These results have been influential in stochastic analysis [14, 15] and have only more recently started to be explored in a statistical and machine learning context. We mention, as representative examples, [16, 17, 18] for inference about laws of stochastic processes; [10, 19, 20] for kernel learning; [21, 22, 23] for Bayesian approaches; [24, 25, 26] for generative modelling; [27, 28, 29] for applications in topology; and [30, 31, 32] for algebraic perspectives.…”
Section: Related Work (mentioning)
confidence: 99%
“…However, for some applications the filtration of a stochastic process matters, and one could ask to extend the scoring rule framework to this setting. A kernel that captures the filtration was introduced in [50], and a kernel algorithm and new applications were given in [19]. To obtain a non-kernel scoring rule, one could try to replace Φ in Definition (4.1) by the higher-rank signature from [50].…”
Section: Scoring Rules for Tracks and Paths (mentioning)
confidence: 99%
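For orientation, a kernel scoring rule built from a feature map $\Phi$ is commonly taken to be of the following form (a sketch of one standard construction; Definition (4.1) of the citing paper is not reproduced here and may differ):

$$S(P, y) \;=\; \mathbb{E}_{X, X' \sim P}\big[\langle \Phi(X), \Phi(X') \rangle\big] \;-\; 2\,\mathbb{E}_{X \sim P}\big[\langle \Phi(X), \Phi(y) \rangle\big],$$

with $X$ and $X'$ independent draws from $P$. The quoted passage suggests substituting the higher-rank signature of [50] for $\Phi$, so that the resulting score is sensitive to the filtration of the process.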