Event recognition in surveillance video has gained extensive attention from the computer vision community, yet it still faces enormous challenges due to small inter-class variations caused by factors such as severe occlusion and cluttered backgrounds. To address these issues, we propose a spatio-temporal deep residual network with hierarchical attentions (STDRN-HA) for video event recognition. In the first attention layer, the ResNet fully connected feature guides the Faster R-CNN feature to generate object-based attention (O-attention) for target objects. In the second attention layer, the O-attention further guides the ResNet convolutional feature to yield holistic attention (H-attention), which perceives more details of occluded objects and the global background. In the third attention layer, the attention maps are combined with the deep features to obtain attention-enhanced features, which are then fed into a deep residual recurrent network to mine additional event clues from videos. Furthermore, we design an optimized loss function, softmax-RC, which embeds residual-block regularization and center loss to mitigate vanishing gradients in deep networks and enlarge inter-class distances. We also build a temporal branch to exploit long- and short-term motion information. The final results are obtained by fusing the outputs of the spatial and temporal streams. Experiments on four realistic video datasets, CCV, VIRAT 1.0, VIRAT 2.0, and HMDB51, demonstrate that the proposed method performs well and achieves state-of-the-art results.
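The softmax-RC loss described above couples a classification term with a center-loss term that pulls same-class features toward a per-class center, enlarging inter-class distances. The residual-block regularization is specific to the paper, but the softmax-plus-center-loss component can be sketched as follows (function names, the weighting parameter `lam`, and the NumPy formulation are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def softmax_cross_entropy(logits, labels):
    # Numerically stable softmax cross-entropy, averaged over the batch.
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def center_loss(features, labels, centers):
    # Mean half squared distance between each feature and its class center:
    # L_c = (1/2N) * sum_i ||x_i - c_{y_i}||^2
    diffs = features - centers[labels]
    return 0.5 * (diffs ** 2).sum(axis=1).mean()

def softmax_center_loss(logits, features, labels, centers, lam=0.1):
    # Joint objective: cross-entropy plus a weighted center-loss term.
    return softmax_cross_entropy(logits, labels) + lam * center_loss(features, labels, centers)
```

In practice the class centers would be updated during training (e.g. by a running average of each class's features); here they are treated as fixed inputs for clarity.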
Background: Epidemiological data show that comorbidity of cardiovascular disease (CVD) and diabetes in China is very common, yet comprehensive management of such patients is seriously inadequate and lacks standardization. Our study aimed to estimate the economic inequalities among high-CVD-risk patients with or without diabetes in China and to analyze the contributors to these inequalities. Methods: Data were derived from the China Health and Retirement Longitudinal Study (CHARLS) conducted in 2011. The criteria of the American Systolic Blood Pressure Intervention Trial (SPRINT) were used to estimate the prevalence of high CVD risk with or without diabetes in China. The concentration index was calculated to describe the degree of economic-related inequality among high-CVD-risk patients with or without diabetes, and a decomposition method was employed to analyze the causes of these inequalities. Results: The prevalence of high CVD risk with and without diabetes in China was 3.46% and 22.03%, respectively. The corresponding concentration indices were 0.0639 [95% CI: (0.0630, 0.0648)] and −0.0628 [95% CI: (−0.0629, −0.0627)], indicating a pro-rich inequality among high-CVD-risk patients with diabetes and a pro-poor inequality among those without diabetes. Location (rural or urban), age, and BMI were the key factors influencing the pro-rich inequality among high-CVD-risk patients with diabetes; age, socioeconomic status, and education were the key factors influencing the pro-poor inequality among those without diabetes. Conclusion: Our study found pro-rich inequality among high-CVD-risk patients with diabetes but pro-poor inequality among high-CVD-risk patients without diabetes in China. Socioeconomic status and BMI were the main factors associated with economic inequality among high-CVD-risk patients with or without diabetes.
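The concentration index reported above measures how a health variable is distributed across the income distribution: positive values indicate concentration among the rich, negative values among the poor. It can be computed from individual-level data with the standard "convenient covariance" formula, C = 2·cov(y, r)/μ, where r is the fractional income rank and μ the mean of the health variable (the function name and sample data below are illustrative, not from the study):

```python
import numpy as np

def concentration_index(health, income):
    # Rank individuals by income (poorest first) and sort health accordingly.
    order = np.argsort(income)
    y = np.asarray(health, dtype=float)[order]
    n = len(y)
    # Fractional rank: (i - 0.5) / n for the i-th poorest individual.
    rank = (np.arange(1, n + 1) - 0.5) / n
    # Convenient covariance formula: C = 2 * cov(y, rank) / mean(y).
    return 2.0 * np.cov(y, rank, bias=True)[0, 1] / y.mean()
```

A health variable that rises with income yields a positive index (pro-rich concentration), one that falls with income yields a negative index, and a constant health variable yields zero, matching the interpretation of the signs in the Results.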