2020 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv45572.2020.9093432

Cited by 97 publications (73 citation statements). References 18 publications.
“…Most of the methods in RS do not consider the hardness of the images in the selected triplets and instead use the random triplet selection strategy mentioned in the introduction [16], [17], [21]. Unlike in RS, in the CV community the use of triplets is more widespread and the importance of hardness is widely studied [22]-[25]. As an example, Xuan et al. propose a triplet selection strategy that selects the closest positive sample (easy positive) and the closest negative sample (hard negative) for each anchor [22].…”
Section: Before After
confidence: 99%
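The easy-positive / hard-negative selection described in the excerpt above can be sketched as follows. This is a minimal NumPy illustration under assumed array shapes and a Euclidean metric, not the implementation of Xuan et al.:

```python
import numpy as np

def select_triplets(embeddings, labels):
    """For each anchor, pick the closest same-class sample (easy positive)
    and the closest different-class sample (hard negative), as described
    for the strategy of Xuan et al. [22]. A sketch, not the authors' code."""
    # Pairwise Euclidean distances between all embeddings.
    dists = np.linalg.norm(embeddings[:, None, :] - embeddings[None, :, :], axis=-1)
    triplets = []
    for a in range(len(labels)):
        same = (labels == labels[a])
        same[a] = False                     # exclude the anchor itself
        diff = (labels != labels[a])
        if not same.any() or not diff.any():
            continue                        # no valid positive or negative exists
        pos = np.where(same)[0][np.argmin(dists[a][same])]  # easy positive
        neg = np.where(diff)[0][np.argmin(dists[a][diff])]  # hard negative
        triplets.append((a, int(pos), int(neg)))
    return triplets
```

With two well-separated classes, each anchor pairs with its nearest class-mate and the nearest sample from the other class.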
“…Hard negative mining selects image pairs $(x_a, x_n)$ that have the most similar embedding vectors yet belong to different classes: $x_n^{*} = \arg\min_{x_n :\, y(x_n) \neq y(x_a)} d\big(f(x_a), f(x_n)\big)$, where $x_n$ is the candidate sample, $f$ is the function that produces the embedding vectors for both $x_a$ and $x_n$, and $d$ is a distance measure [33]. How close two embedding vectors are allowed to be is governed by the margin hyper-parameter.…”
Section: Related Work
confidence: 99%
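The selection rule and the role of the margin can be sketched as follows. These are hypothetical helpers: the Euclidean metric and the function names are assumptions, and $d$ can be any distance measure.

```python
import numpy as np

def hard_negative(anchor_emb, anchor_label, cand_embs, cand_labels):
    """Index of the candidate closest to the anchor in embedding space
    that belongs to a different class, i.e. the minimizer of
    d(f(x_a), f(x_n)) subject to y(x_n) != y(x_a)."""
    d = np.linalg.norm(cand_embs - anchor_emb, axis=1)
    d = np.where(cand_labels == anchor_label, np.inf, d)  # mask same-class candidates
    return int(np.argmin(d))

def triplet_loss(d_ap, d_an, margin=0.2):
    """Standard triplet loss: the negative must sit at least `margin`
    farther from the anchor than the positive before the loss vanishes."""
    return max(d_ap - d_an + margin, 0.0)
```

The margin thus sets the minimum gap between positive and negative distances; triplets already satisfying it contribute zero loss.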
“…Several additional mining techniques include easy positive mining, easy negative mining, and hard positive mining. Details on these techniques are discussed by Xuan et al. (2020) [33].…”
Section: Related Work
confidence: 99%
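The taxonomy of mining variants named above can be summarized in one sketch. This is a hypothetical helper illustrating the easy/hard and positive/negative axes, not code from [33]; it assumes precomputed distances to the anchor and that the anchor itself is excluded from the candidates:

```python
import numpy as np

def mine(dists_to_anchor, labels, anchor_label, kind):
    """Pick a sample index by mining strategy. `kind` is one of
    'easy_positive', 'hard_positive', 'easy_negative', 'hard_negative'.
    Easy positives and hard negatives lie close to the anchor;
    hard positives and easy negatives lie far from it."""
    positive = kind.endswith('positive')
    mask = (labels == anchor_label) if positive else (labels != anchor_label)
    idx = np.where(mask)[0]
    d = dists_to_anchor[idx]
    easy = kind.startswith('easy')
    # Minimize distance for easy positives and hard negatives,
    # maximize it for hard positives and easy negatives.
    want_min = (positive and easy) or (not positive and not easy)
    return int(idx[np.argmin(d) if want_min else np.argmax(d)])
```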
“…This leads to slow training and underfitted models. Training can be improved by carefully mining the training data that produce a large loss (Xuan et al., 2019). In each training iteration, the anchor training data are first chosen at random.…”
Section: Distance Learning
confidence: 99%
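The loop described in this excerpt, random anchors followed by keeping only triplets that produce a large (non-zero) loss, can be sketched as follows. Function and variable names, the Euclidean metric, and the batch shapes are assumptions, not the cited authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def mining_step(embeddings, labels, margin=0.2, n_anchors=4):
    """One mining iteration: anchors are chosen at random, then the
    triplet built from the hardest positive (farthest) and hardest
    negative (closest) is kept only if its loss is non-zero."""
    anchors = rng.choice(len(labels), size=n_anchors, replace=False)
    kept = []
    for a in anchors:
        d = np.linalg.norm(embeddings - embeddings[a], axis=1)
        pos = np.where((labels == labels[a]) & (np.arange(len(labels)) != a))[0]
        neg = np.where(labels != labels[a])[0]
        if len(pos) == 0 or len(neg) == 0:
            continue
        # Hardest positive and hardest negative give the largest triplet loss.
        p, n = pos[np.argmax(d[pos])], neg[np.argmin(d[neg])]
        loss = max(d[p] - d[n] + margin, 0.0)
        if loss > 0:                        # keep only loss-producing triplets
            kept.append((int(a), int(p), int(n), float(loss)))
    return kept
```

Zero-loss triplets contribute no gradient, so discarding them focuses each update on the informative examples, which is what accelerates training.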