2015 IEEE International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv.2015.473
ML-MG: Multi-label Learning with Missing Labels Using a Mixed Graph

Cited by 78 publications (44 citation statements) · References 38 publications
“…As such, image annotation is treated as a multi-label learning problem, where tag correlations play a key role. The most common tag correlations involve tag-level smoothness [30,32] (i.e., the prediction scores of two semantically similar tags should be similar in the same image), image-level smoothness [13,30,32,20] (i.e., visually similar images have similar tags), the low-rank assumption [2] (i.e., the whole tag space is spanned by a lower-dimensional subspace), and semantic hierarchy [30,25] (i.e., parent tags in a hierarchy are at least as probable as their children).…”
Section: Related Work
confidence: 99%
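For context, the two smoothness terms mentioned in this excerpt are usually written as graph-Laplacian regularizers over a matrix of tag prediction scores. The sketch below is a minimal, generic illustration under that assumption; the names F, W_img, and W_tag and the exact quadratic forms are illustrative choices, not the formulation of any particular cited work.

```python
import numpy as np

def laplacian(W):
    """Unnormalized graph Laplacian L = D - W for a symmetric similarity matrix W."""
    return np.diag(W.sum(axis=1)) - W

def image_level_smoothness(F, W_img):
    """Penalize visually similar images (large W_img[i, j]) having different tag-score
    rows: tr(F^T L F) = 1/2 * sum_ij W_img[i, j] * ||F[i, :] - F[j, :]||^2."""
    return np.trace(F.T @ laplacian(W_img) @ F)

def tag_level_smoothness(F, W_tag):
    """Penalize semantically similar tags (large W_tag[a, b]) getting different
    scores on the same image: tr(F L_tag F^T)."""
    return np.trace(F @ laplacian(W_tag) @ F.T)

# Toy usage with random data: 5 images, 4 tags.
rng = np.random.default_rng(0)
F = rng.random((5, 4))
W_img = rng.random((5, 5)); W_img = (W_img + W_img.T) / 2; np.fill_diagonal(W_img, 0)
W_tag = rng.random((4, 4)); W_tag = (W_tag + W_tag.T) / 2; np.fill_diagonal(W_tag, 0)
print(image_level_smoothness(F, W_img), tag_level_smoothness(F, W_tag))
```

Minimizing either term drives connected scores together on the respective graph, which is the sense in which the excerpt calls them "smoothness" constraints.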
“…Note that most of these methods focus only on positive tag correlations, while negative correlations, such as mutual exclusion [7,3] and diversity, have rarely been explored. The third direction focuses on designing loss functions that encourage certain types of annotation solutions, such as the (weighted) Hamming loss [30,34] or the pairwise ranking loss [1]. The fourth direction handles incomplete tags during training, which has been studied in many recent works [30,34,5,29,31,19].…”
Section: Related Work
confidence: 99%
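As a rough illustration of the loss families this excerpt refers to, the sketch below gives generic forms of a weighted Hamming-style (squared-error) surrogate and a hinge-based pairwise ranking loss over label vectors in {+1, -1, 0}, where 0 marks a missing tag. The function names, weighting scheme, and exact forms are assumptions for illustration, not the specific formulations of [30,34] or [1].

```python
import numpy as np

def weighted_hamming_loss(scores, y, w_pos=1.0, w_neg=1.0):
    """Weighted squared-error surrogate on observed entries only: positive (+1)
    and negative (-1) labels get separate weights; missing labels (0) are ignored."""
    weights = np.where(y == 1, w_pos, np.where(y == -1, w_neg, 0.0))
    return float(np.sum(weights * (scores - y) ** 2))

def pairwise_ranking_loss(scores, y, margin=1.0):
    """Hinge-style pairwise ranking loss: every observed positive tag should
    outscore every observed negative tag by `margin`; missing labels are skipped."""
    pos, neg = scores[y == 1], scores[y == -1]
    if pos.size == 0 or neg.size == 0:
        return 0.0
    return float(np.maximum(margin - (pos[:, None] - neg[None, :]), 0.0).mean())

# Toy usage: one image with tags labelled +1 (present), -1 (absent), 0 (missing).
scores = np.array([0.9, 0.2, 0.6, 0.1])
y = np.array([1, -1, 0, -1])
print(weighted_hamming_loss(scores, y), pairwise_ranking_loss(scores, y))
```

Skipping the zero (missing) entries in both losses is one simple way such objectives accommodate the incomplete-tag training setting mentioned at the end of the excerpt.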