2022
DOI: 10.1016/j.ins.2022.10.074
Real-time masked face classification and head pose estimation for RGB facial image via knowledge distillation

Cited by 9 publications (3 citation statements)
References 26 publications
“…Face recognition systems have been widely used in a variety of applications, including visual surveillance (Zhang et al, 2018), automated border control (del Rio et al, 2016), education systems (Jadhav et al, 2021) and healthcare (Bargshady et al, 2020). Face recognition technology needs to be more efficient while confronting obstacles such as varying illumination (Koley et al, 2022), low resolution (Zangeneh et al, 2020), different pose (Thai et al, 2022), expression change (Huang et al, 2021) and occlusion (Long et al, 2018; Peng et al, 2023; Zeng et al, 2020).…”
Section: Introduction
confidence: 99%
“…Unlike several other methods, which require a tailor-made solution for every different network structure and even dataset, KD can be applied across network structures and datasets through a single unified framework, giving it wider practical applicability.18-20 KD was originally proposed by Hinton et al.12 The core idea is to exploit the knowledge of a heavy but powerful teacher network to help train a small, lightweight student network, improving the student's performance as much as possible. Kullback-Leibler (KL) divergence is a measure used to quantify the difference between two probability distributions.…”
Section: Introduction
confidence: 99%
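The distillation objective sketched in the statement above can be illustrated with a minimal NumPy example of Hinton-style knowledge distillation: the teacher's and student's logits are softened with a temperature, and the KL divergence between the two resulting distributions is minimized. The function names, the temperature value, and the T² scaling convention here are illustrative assumptions, not details taken from the cited paper.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher T produces a softer distribution.
    z = logits / temperature
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    # KL(p_teacher || p_student) over temperature-softened distributions,
    # scaled by T^2 (a common convention in Hinton-style distillation).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(temperature ** 2 * np.sum(p * np.log(p / q)))

t = np.array([2.0, 1.0, 0.1])
print(kd_loss(t, t))  # 0.0 -- identical logits give zero distillation loss
```

In training, this term is typically combined with the ordinary cross-entropy loss on the hard labels, so the student learns from both the ground truth and the teacher's softened outputs.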
“…Among these methods, KD provides a simple, effective, and universal solution.17 Unlike several other methods, which require a tailor-made solution for every different network structure and even dataset, KD can be applied across network structures and datasets through a single unified framework, giving it wider practical applicability.18-20…”
Section: Introduction
confidence: 99%