2022
DOI: 10.48550/arxiv.2203.00155
Preprint

Performance of Distribution Regression with Doubling Measure under the seek of Closest Point

Abstract: We study the distribution regression problem assuming the distribution of distributions has a doubling measure larger than one. First, we explore the geometry of any distribution that has a doubling measure larger than one and build a small theory around it. Then, we show how to utilize this theory to adaptively find one of the nearest distributions and compute the regression value based on these distributions. Finally, we provide the accuracy of the suggested method and provide the theoretical analysis fo…
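The abstract describes a nearest-neighbor style scheme: measure closeness between distributions, find the closest training distribution(s), and regress from their labels. The sketch below is a hypothetical illustration of that idea only, not the paper's adaptive algorithm; the helper names `wasserstein_1d` and `nearest_distribution_regression` are my own, and the distance is approximated on equal-size 1-D samples.

```python
import numpy as np

def wasserstein_1d(a, b):
    """Approximate the 1-D Wasserstein-1 distance between two equal-size
    samples by averaging absolute differences of their sorted values."""
    return float(np.mean(np.abs(np.sort(a) - np.sort(b))))

def nearest_distribution_regression(train_samples, train_labels, query_sample, k=1):
    """Predict by averaging the labels of the k closest training distributions."""
    dists = [wasserstein_1d(s, query_sample) for s in train_samples]
    nearest = np.argsort(dists)[:k]  # indices of the k nearest distributions
    return float(np.mean([train_labels[i] for i in nearest]))

# Usage: each training "distribution" is a sample from N(mu, 1) labeled by mu.
rng = np.random.default_rng(0)
means = [0.0, 2.0, 4.0]
train = [rng.normal(m, 1.0, 200) for m in means]
query = rng.normal(2.1, 1.0, 200)
pred = nearest_distribution_regression(train, means, query, k=1)  # → 2.0
```

With a query sample centered near 2.1, the closest training distribution is the one labeled 2.0, so the 1-NN prediction recovers that label; the paper's contribution, in contrast, is finding such a nearest distribution adaptively under a doubling-measure assumption rather than by exhaustive comparison.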

Cited by 2 publications (2 citation statements)
References 11 publications
“…Therefore, this result established the near-optimality of nuclear norm minimization when the sampling method is passive. However, in modern data analysis it has been shown that adaptive sensing and sampling methods outperform passive sampling methods (Ramazanli, 2022; Haupt et al., 2011; Warmuth & Kuzmin, 2008). In follow-up work, Krishnamurthy & Singh (2013) proposed an adaptive matrix completion algorithm that can recover the underlying matrix using at most O(µ0 r^{1.5} log(r/ε)) observations.…”
Section: Introduction
confidence: 99%
“…It has been shown that in many problems, adaptive methods outperform passive traditional machine learning methods. Ramazanli (2022b) has shown the power of adaptivity in the distribution regression problem. Paramythis & Loidl-Reisinger (2003) studied adaptive learning environments, and a well-known adaptive learning algorithm was also proposed in Riedmiller & Braun (1992). Specifically for matrix completion, Krishnamurthy & Singh (2013, 2014); Ramazanli; and Balcan & Zhang (2016) showed that adaptivity helps us reach theoretical bounds.…”
Section: Introduction
confidence: 99%