2020
DOI: 10.48550/arxiv.2002.05551
Preprint
PACOH: Bayes-Optimal Meta-Learning with PAC-Guarantees

Cited by 8 publications (23 citation statements)
References 0 publications
“…In the next section, we extend this framework to transfer meta-learning by leveraging the theoretical results in [7]. To this end, we first review non-parametric Bayesian learning via Gaussian Processes (GPs), and then we describe the problem of meta-learning the GP prior proposed in [8]. Notation: We use Roman fonts to indicate random variables, functions, and vectors; while the corresponding regular font denotes fixed realizations.…”
Section: Problem Setting
Confidence: 99%
“…We will also require the evidence, or marginal likelihood, of the output labels,
$$p_\theta(\mathrm{Y}|\mathcal{X}) = \int p_\theta(\mathrm{t}(\mathcal{X}) = t)\, p(\mathrm{Y} \mid \mathrm{t}(\mathcal{X}) = t)\, \mathrm{d}t, \qquad (8)$$
the log of which can be obtained in closed form for the Gaussian likelihood as [18]…”
Section: Gaussian Processes (GPs)
Confidence: 99%
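The closed-form expression alluded to in this excerpt is the standard Gaussian-process log marginal likelihood, $\log p(\mathrm{Y}|\mathcal{X}) = -\tfrac{1}{2} y^\top K_y^{-1} y - \tfrac{1}{2}\log|K_y| - \tfrac{n}{2}\log 2\pi$ with $K_y = K(\mathcal{X},\mathcal{X}) + \sigma^2 I$. A minimal sketch of this computation follows; the RBF kernel, its hyperparameters, and the function names are illustrative assumptions, not taken from the cited works.

```python
# Hedged sketch of the closed-form GP log marginal likelihood for a Gaussian
# likelihood (the quantity Eq. (8) reduces to). Kernel and parameter choices
# here are assumptions for illustration only.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between row-vector inputs."""
    sq_dists = (np.sum(X1**2, axis=1)[:, None]
                + np.sum(X2**2, axis=1)[None, :]
                - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def gp_log_marginal_likelihood(X, y, noise_var=0.1):
    """log p(y|X) = -1/2 y^T K_y^{-1} y - 1/2 log|K_y| - n/2 log(2*pi),
    with K_y = K(X, X) + noise_var * I."""
    n = X.shape[0]
    K_y = rbf_kernel(X, X) + noise_var * np.eye(n)
    L = np.linalg.cholesky(K_y)                 # K_y = L L^T, stable inversion
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # alpha = K_y^{-1} y
    log_det = 2.0 * np.sum(np.log(np.diag(L)))           # log|K_y| via Cholesky
    return -0.5 * y @ alpha - 0.5 * log_det - 0.5 * n * np.log(2.0 * np.pi)

# Toy usage: noisy observations of a smooth function.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=20)
print(gp_log_marginal_likelihood(X, y))
```

Maximizing this quantity over kernel hyperparameters (or, in the meta-learning setting the excerpts describe, over a learned prior) is what makes the closed form useful: it is differentiable and requires only a Cholesky factorization per evaluation.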
See 3 more Smart Citations