Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2020
DOI: 10.1145/3394486.3403214
RECORD: Resource Constrained Semi-Supervised Learning under Distribution Shift

Cited by 14 publications (15 citation statements). References 17 publications.
“…Sconf-Unbiased, Sconf-ABS and Sconf-NN are short for ERM with risk estimators in Eqs. (6), (8), and (9), respectively.…”
Section: Methods
Confidence: 99%
“…However, the collection of massive data with exact supervision is laborious and expensive in many real-world problems. To overcome this bottleneck, weakly supervised learning [1] has been proposed and explored under various settings, including but not limited to, semi-supervised learning [2,3,4,5,6,7,8], positive-unlabeled learning [9,10,11,12], noisy-label learning [13,14,15,16], partial-label learning [17,18,19,20], complementary-label learning [21,22,23,24,25,26], similarity-unlabeled learning [27], and similarity-dissimilarity learning [28].…”
Section: Introduction
Confidence: 99%
“…There are some SSL studies trying to tackle sub-problems in dynamic environments, such as classifying the new emerging labels [144], [145], adapting to gradually shifting distributions [146], [147]. But these attempts mainly focused on specific small problems and are still a long way from being applied to real-world tasks with dynamic environments.…”
Section: Learning In Dynamic Environments
Confidence: 99%
“…In supervised learning, the annotations of a huge number of instances may not be easily obtained in many practical applications due to the concerns including but not limited to time consumption, expenditure, and privacy preserving. For these reasons, many weakly supervised learning (WSL) frameworks [1] have been studied in various scenarios recently, including semi-supervised learning [2,3,4,5,6,7,8], positive-unlabeled learning [9,10,11,12], unlabeled-unlabeled learning [13,14], noisy-label learning [15,16,17,18], complementary and partial-label learning [19,20,21,22,23,24,25,26,27], similarity-based learning [28,29,30], and positive-confidence learning [31].…”
Section: Introduction
Confidence: 99%