2020
DOI: 10.1007/s11042-020-09286-7
CCRNet: a novel data-driven approach to improve cross-domain Iris recognition

Abstract: In spite of the prominence and robustness of iris recognition systems, iris image acquisition using heterogeneous cameras/sensors is the prime concern in deploying them for wide-scale applications. The textural qualities of iris samples (images) captured through distinct sensors differ substantially due to differences in illumination and the underlying hardware, which yields intra-class variation within the iris dataset. This paper examines three miscellaneous configurations of convolution and residual blo…

Cited by 5 publications (2 citation statements) · References 42 publications
“…Some studies introduce residual connections when designing network structures for IR tasks. [53] uses the residual connections from ResNet, since residual connections allow the network to be built deeper without degrading performance.…”
Section: Conventional Methods
confidence: 99%
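The excerpt's point is that the identity skip path keeps gradients flowing through deep stacks. Below is a minimal sketch of a ResNet-style residual block in PyTorch; the channel sizes and layer choices are illustrative assumptions, not CCRNet's actual block configuration, which the abstract only describes as "configurations of convolution and residual" blocks.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Generic residual block: output = F(x) + x. The identity path lets
    gradients bypass the convolutions, so deeper stacks still train well."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        identity = x                                  # skip connection
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + identity)              # residual addition
```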
“…Residual connections allow the network to be built deeper without degrading performance. The model used in [53] is the Collaborative Convolutional Residual Network (CCRNet). Experimental results show that CCRNet performs well, with EERs of 1.06% and 1.21% on ND-CrossSensor-Iris-2013 and ND-IRIS-0405, respectively.…”
Section: Conventional Methods
confidence: 99%
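For context on the reported metric: the Equal Error Rate (EER) is the operating point where the false accept rate equals the false reject rate. The sketch below shows one common way to estimate it from genuine and impostor similarity scores; this is a generic illustration, not the evaluation code used in [53].

```python
import numpy as np

def compute_eer(genuine_scores: np.ndarray, impostor_scores: np.ndarray) -> float:
    """Estimate the EER: sweep thresholds and return the point where
    the false accept rate (FAR) and false reject rate (FRR) cross."""
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best_gap, eer = np.inf, 1.0
    for t in thresholds:
        far = np.mean(impostor_scores >= t)   # impostors wrongly accepted
        frr = np.mean(genuine_scores < t)     # genuine users wrongly rejected
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return eer

# Example with synthetic scores (higher = more similar):
rng = np.random.default_rng(0)
print(compute_eer(rng.normal(0.8, 0.1, 1000), rng.normal(0.4, 0.1, 1000)))
```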