2016
DOI: 10.1016/j.patrec.2015.09.014

On using periocular biometric for gender classification in the wild

Cited by 52 publications (36 citation statements)
References 38 publications
“…The reported results correspond to the 5-fold mean highest accuracy achieved, varying the cost and gamma parameters respectively within the intervals C = [0.5, 5] and gamma = [0.04, 0.15].…”
Section: Results
confidence: 99%
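The excerpt describes a 5-fold cross-validated sweep of an SVM's cost and gamma parameters over the intervals C = [0.5, 5] and gamma = [0.04, 0.15]. A minimal sketch of such a sweep, assuming an RBF kernel, a 10-point grid per parameter, and synthetic placeholder data (none of which are specified in the paper), could look like:

```python
# Hypothetical sketch of the described hyperparameter sweep: a 5-fold
# cross-validated grid search over SVM cost (C) and RBF gamma, restricted
# to C in [0.5, 5] and gamma in [0.04, 0.15]. Grid resolution and the
# synthetic dataset are assumptions for illustration only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Placeholder two-class data standing in for the periocular features.
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

param_grid = {
    "C": np.linspace(0.5, 5, 10),        # cost interval from the excerpt
    "gamma": np.linspace(0.04, 0.15, 10)  # gamma interval from the excerpt
}

# cv=5 gives the 5-fold mean accuracy per (C, gamma) point; best_score_
# is the highest such mean, matching the quantity the excerpt reports.
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_)   # best (C, gamma) on the grid
print(search.best_score_)    # 5-fold mean accuracy at that point
```

The reported number in the excerpt corresponds to `search.best_score_`: the maximum over the grid of the per-configuration 5-fold mean accuracy.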
“…the interest for the GC problem of a proper combination of features and regions of interest. We start from a baseline, given by a state of the art facial based GC system [4], to later explore the fusion with features densely extracted from some specific areas of the inner face [5]. With this concept in mind, we have revisited the analysis of the human visual system for the GC problem using bubbles [13], where the authors argue that both the ocular and the mouth areas are discriminant for this task to the human system.…”
Section: Approach
confidence: 99%
“…10. Wilcoxon matched-pairs signed-ranks tests indicate that Farett-Gender shows the highest learnability 17, and KCAPTCHA shows the lowest 18.…”
Section: Learnability
confidence: 98%
“…There is further evidence for the difference between humans and machines in recognizing gender: those faces that were misclassified [26] or excluded [79] in the studies could easily be classified by humans. On the one hand, some authors report recognition rates in excess of 90% [79] and recent studies show that for some real-world images, similar success rates are possible [17,86].…”
Section: Gender and Age Classification for CAPTCHAs
confidence: 99%