2023
DOI: 10.14569/ijacsa.2023.0140643
Semi-Dense U-Net: A Novel U-Net Architecture for Face Detection

Abstract: Face detection and localization have been a major field of study in facial analysis and computer vision. Several convolutional neural network-based architectures have been proposed in the literature, such as cascaded, single-stage, and two-stage architectures. Using image segmentation-based techniques for object/face detection and recognition has recently emerged as an alternative approach. In this paper, we propose detecting faces using U-Net segmentation architectures. Motivated from Dens…

Cited by 1 publication (1 citation statement)
References 16 publications (21 reference statements)
“…The chart-topping U-Net model [26] was developed in 2015 by Olaf Ronneberger and his team for the purpose of biomedical image segmentation. However, researchers have since used the same or modified versions of this model for the detection and segmentation of regions of interest (ROI) as per their requirements [27], [28]. The model gets its name from its distinctive "U"-shaped architecture, as shown in Figure 4.…”
Section: The U-Net Model
confidence: 99%
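The "U" shape described above comes from a contracting encoder path, a bottleneck, and an expanding decoder path whose levels are joined by skip connections. The following is a minimal, framework-free sketch of that data flow, tracing only feature-map shapes (no learned weights); the function name and the depth/channel choices are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical shape trace through a U-shaped encoder-decoder.
# Encoder: halve the spatial resolution and double the channels per level.
# Decoder: double the resolution back and concatenate the matching
# encoder features (the skip connection), which adds their channels.

def unet_shape_trace(size, depth=3, channels=16):
    """Return the (resolution, channels) sequence along the decoder path."""
    skips = []
    res, ch = size, channels
    # Contracting path: store each level's features for the skip connection.
    for _ in range(depth):
        skips.append((res, ch))
        res, ch = res // 2, ch * 2
    trace = [(res, ch)]  # bottleneck at the bottom of the "U"
    # Expanding path: upsample, then concatenate the saved skip features.
    for skip_res, skip_ch in reversed(skips):
        res, ch = res * 2, ch // 2
        trace.append((res, ch + skip_ch))  # channels add on concatenation
    return trace

print(unet_shape_trace(128))
# e.g. a 128x128 input contracts to a 16x16 bottleneck and expands back
```

Running the trace shows why skip connections matter for localization: the decoder recovers the input resolution level by level, and at each level the concatenated encoder features reinject the fine spatial detail lost during downsampling.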