In recent years, biometric recognition has attracted the attention of many researchers, and the human ear, as a unique and stable biometric trait, offers significant advantages for verifying personal identity. In the Internet era, systems with low computational cost and good real-time performance are preferred, yet most existing ear recognition methods rely on network models with large parameter counts, incurring a large memory footprint and heavy computational overhead. This paper proposes an efficient and lightweight ear recognition method (ELERNet) based on MobileNet V2. Building on the MobileNet V2 model, dynamic convolution decomposition is introduced to enhance the representational ability of ear features. Then, a coordinate attention mechanism is incorporated to aggregate the spatial features of ear images and localize ear features more accurately. We conducted experiments on the AWE and EarVN1.0 ear datasets. Compared with the MobileNet V2 baseline, the recognition accuracy of our method is significantly improved. Using fewer hardware resources, the ELERNet model achieves 83.52% and 96.10% Rank-1 (R1) recognition accuracy on the two datasets, respectively, outperforming the other models compared. Finally, we provide a visual interpretation using Grad-CAM, and the results show that our method learns specific and discriminative features in ear images.
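The coordinate attention mechanism mentioned above factorizes global pooling into two direction-aware pooling operations so the resulting gates encode positional information along height and width. The following is a minimal NumPy sketch of that idea for a single feature map; the random weight matrices stand in for the learned 1x1 convolutions of the actual module and are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def coordinate_attention(x, reduction=8, rng=None):
    """Simplified coordinate-attention sketch on one feature map x of shape (C, H, W)."""
    C, H, W = x.shape
    rng = np.random.default_rng(0) if rng is None else rng
    # Direction-aware pooling: average over width -> (C, H), over height -> (C, W)
    pool_h = x.mean(axis=2)
    pool_w = x.mean(axis=1)
    # Shared channel-reducing transform (stand-in for the learned 1x1 conv)
    Cr = max(C // reduction, 1)
    W1 = rng.standard_normal((Cr, C)) * 0.1
    f = np.concatenate([pool_h, pool_w], axis=1)      # (C, H + W)
    f = np.maximum(W1 @ f, 0)                         # ReLU, (Cr, H + W)
    f_h, f_w = f[:, :H], f[:, H:]
    # Separate transforms back to C channels, squashed to (0, 1) gates
    Wh = rng.standard_normal((C, Cr)) * 0.1
    Ww = rng.standard_normal((C, Cr)) * 0.1
    a_h = 1.0 / (1.0 + np.exp(-(Wh @ f_h)))           # (C, H) height gate
    a_w = 1.0 / (1.0 + np.exp(-(Ww @ f_w)))           # (C, W) width gate
    # Broadcast both gates over the spatial grid and reweight the input
    return x * a_h[:, :, None] * a_w[:, None, :]
```

Because the two gates are indexed by row and column respectively, their outer product lets the network emphasize specific spatial positions, which is why this mechanism helps localize the ear region.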
Cancer is one of the major causes of human disease and death worldwide, and breast cancer is one of the most common cancer types among women today. In this paper, we apply deep learning to the Breast Cancer Histopathological Database (BreakHis), an open dataset, and propose a high-precision classification method for breast cancer histopathology images based on an improved convolutional neural network. We propose three MFSCNet variants that differ in the insertion positions and number of SE modules: MFSCNet A, MFSCNet B, and MFSCNet C. In our experiments on the BreakHis dataset, MFSCNet A achieved the best performance in high-precision breast cancer classification: binary classification accuracy ranged from 99.05% to 99.89%, and multiclass classification accuracy ranged from 94.36% to 98.41%. These results demonstrate that MFSCNet can accurately classify breast histopathology images and has strong application prospects for predicting tumor grade. Code will be made available at http://github.com/xiaoan-maker/MFSCNet.
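The SE (squeeze-and-excitation) modules that distinguish the three MFSCNet variants recalibrate channel responses: global average pooling "squeezes" each channel to a scalar, a small bottleneck then produces per-channel gates. Below is a minimal NumPy sketch of one SE block under those standard assumptions; the random weights replace the block's learned fully connected layers and are purely illustrative, not the MFSCNet code.

```python
import numpy as np

def se_block(x, reduction=4, rng=None):
    """Squeeze-and-excitation sketch on one feature map x of shape (C, H, W)."""
    C = x.shape[0]
    rng = np.random.default_rng(0) if rng is None else rng
    # Squeeze: global average pooling collapses each channel to a scalar
    z = x.mean(axis=(1, 2))                  # (C,)
    # Excitation: bottleneck FC -> ReLU -> FC -> sigmoid yields channel gates
    Cr = max(C // reduction, 1)
    W1 = rng.standard_normal((Cr, C)) * 0.1
    W2 = rng.standard_normal((C, Cr)) * 0.1
    s = np.maximum(W1 @ z, 0)                # (Cr,)
    s = 1.0 / (1.0 + np.exp(-(W2 @ s)))      # (C,) gates in (0, 1)
    # Scale: reweight each channel of the input by its gate
    return x * s[:, None, None]
```

Since the gate vector depends only on channel statistics, inserting SE blocks at different depths (as the A/B/C variants do) changes which stage's channels get recalibrated at very modest parameter cost.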