2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV) 2022
DOI: 10.1109/wacv51458.2022.00375
Meta-UDA: Unsupervised Domain Adaptive Thermal Object Detection using Meta-Learning

Cited by 22 publications (4 citation statements) | References 25 publications
“…In this experiment, we detect and crop the ATR vehicles from the DSIAC dataset by using the information from the Meta-UDA. 23 The different distances of target vehicles are projected into the canonical distance (2 kilometers) using bi-cubic interpolation. The final target chip size is 68x68x3.…”
Section: Dataset
confidence: 99%
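The chip-extraction step described in the excerpt above (crop the detected vehicle, project it to the canonical 2 km distance, bicubic-resize to a 68×68×3 chip) can be sketched as follows. This is a minimal illustration only: the function name `make_chip`, the bounding-box input, and the distance-scaling heuristic are assumptions, not the cited authors' actual pipeline.

```python
import numpy as np
from PIL import Image

CANONICAL_DISTANCE_KM = 2.0  # canonical range, per the cited excerpt
CHIP_SIZE = 68               # final chip is 68x68x3, per the cited excerpt


def make_chip(frame, bbox, distance_km):
    """Crop a detected target and bicubic-resize it to a 68x68x3 chip.

    frame       : HxWx3 uint8 image array
    bbox        : (x0, y0, x1, y1) detection box (assumed input format)
    distance_km : range to the target for the canonical-distance projection
    """
    x0, y0, x1, y1 = bbox
    crop = Image.fromarray(frame[y0:y1, x0:x1])

    # Assumed projection heuristic: rescale the crop in proportion to its
    # range relative to the canonical 2 km distance before chipping.
    scale = distance_km / CANONICAL_DISTANCE_KM
    w, h = crop.size
    projected = crop.resize(
        (max(1, round(w * scale)), max(1, round(h * scale))),
        Image.BICUBIC,  # bi-cubic interpolation, as in the excerpt
    )

    # Final fixed-size chip, again with bicubic interpolation.
    chip = projected.resize((CHIP_SIZE, CHIP_SIZE), Image.BICUBIC)
    return np.asarray(chip)
```

Both resizes use bicubic interpolation because that is the only interpolation method the excerpt names; the intermediate projection step is one plausible reading of "projected into the canonical distance".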
“…In support of the thermal ATR task, Abraham et al [33] improve the performance of YOLOv5 models on the DSIAC dataset via a novel homotopy-based hyperparameter optimization algorithm. Leveraging the visible imagery of the DSIAC dataset, VS et al [34] propose a meta-learning strategy for unsupervised domain adaptation in thermal ATR.…”
Section: Related Work
confidence: 99%
“…They transfer the visible domain into the thermal domain by image adaptation such as Guo et al [22] and Kieu et al [21] or feature adaptation such as Herrmann [14], Kieu et al [19], Kieu et al [20] and Kim et al [45]. The final one is unsupervised domain adaptation, such as Meta-UDA [46]. Disentanglement Learning.…”
Section: Related Work
confidence: 99%