We present a method for turning a flash selfie taken with a smartphone into a photograph that looks as if it had been taken in a studio setting with uniform lighting. Our method uses a convolutional neural network trained on pairs of photographs acquired in an ad-hoc acquisition campaign. Each pair consists of one photograph of a subject's face taken with the camera flash enabled and another of the same subject in the same pose illuminated by a photographic studio-lighting setup. We show how our method can amend defects introduced by a close-up camera flash, such as specular highlights, shadows, skin shine, and flattened images.
Advanced intelligent surveillance systems can automatically analyze surveillance video without human intervention. These systems enable highly accurate human activity recognition and, in turn, high-level activity evaluation. To provide such features, an intelligent surveillance system requires a background subtraction scheme for human segmentation, which extracts moving humans from a sequence of images by comparison with a reference background image. This paper proposes an alternative approach for human segmentation in videos through the use of a deep convolutional neural network. Two specific datasets were created to train our network, using the shapes of 35 different moving actors arranged on background images of the area where the camera is located, allowing the network to take advantage of the entire site chosen for video surveillance. To assess the proposed approach, we compare our results with an Adobe Photoshop tool called Select Subject, the conditional generative adversarial network Pix2Pix, and the fully-convolutional real-time instance segmentation model Yolact. The results show that the main benefit of our method is the ability to automatically recognize and segment people in videos without constraints on camera or people movements in the scene (video, code, and datasets are available at http://graphics.unibas.it/www/HumanSegmentation/index.md.html).
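The abstract above builds on the classical idea of background subtraction: comparing each frame against a reference background image to isolate moving people. As a minimal illustration of that baseline idea only (not the authors' CNN, whose architecture is not described here), frame-vs-reference differencing with a hypothetical threshold can be sketched as:

```python
import numpy as np

def segment_foreground(frame, background, threshold=30):
    """Classical background subtraction: pixels whose absolute
    difference from the reference background image exceeds a
    threshold are marked as foreground (the moving subject)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

# Toy example: an 8x8 grayscale background with a brighter patch
# standing in for a person entering the scene.
background = np.full((8, 8), 50, dtype=np.uint8)
frame = background.copy()
frame[2:6, 3:5] = 200  # the "moving subject" occupies a 4x2 region

mask = segment_foreground(frame, background)
```

Learning-based methods like the one proposed in the paper aim to overcome the known weaknesses of this simple scheme, such as sensitivity to lighting changes and camera motion.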
The estimated population growth in the coming decades will create severe water scarcity and have a tremendous impact on the natural environment. Both developed and developing countries will face increasing challenges in meeting the growing demand for clean and safe water, sourcing supplies far from residential areas. This situation will be further exacerbated by the effects of climate change which, by increasing the frequency and intensity of extreme events, will reduce the availability and quality of water resources and subject the population to serious and ongoing hazards. In this context, accurate and continuous monitoring of surface waters is a fundamental step toward reducing contamination and planning actions for the sustainable management of this resource. In recent years, the development of advanced methodologies and high-tech equipment that lower the time and cost of field surveys has not been matched by appropriate training of the technical staff of the public and private bodies responsible for monitoring the territory. In most cases, unable to engage highly qualified external personnel due to lack of funding, such bodies tend to reduce their monitoring activities, leaving the areas even more exposed to the risk of disastrous events. This paper proposes an innovative educational tool based on virtual reality to support technical and non-technical workforces in field activities. The tool is a virtual laboratory that trains users in standard techniques for accurately monitoring water discharge in open-channel flows, and it was successfully tested on a sample of people from the private and public water sector. According to the results, its use increased the fieldworkers' ability to move quickly within the river and to manage the measurement equipment and methodology easily and correctly, thus reducing the cost and duration of in-situ surveys.
This paper describes an innovative virtual laboratory for students of Hydraulic Engineering at an Italian university that demonstrates water discharge measurement techniques applied in open-channel flows. This new technology, which supports traditional practical classes, has the potential to increase students' motivation and improve their skills, while simultaneously reducing the costs, time, and possible dangers that continuous field experiments would involve. Thanks to this immersive and interactive experience, which is carried out indoors, students learn to move around a fluvial environment and to work more safely, with a reduced risk of accidents. In addition, the virtual lab can boost learners' interest by combining education with pleasure and making knowledge more fun. Collaboration with a group of students enrolled in the Master's degree course of the Civil and Environmental Engineering program at Basilicata University during the early stages of developing the educational tool led to improvements in its performance and features. A preliminary testing procedure carried out on a student sample also verified the achievement of the students' learning objectives in terms of knowledge and skills. This analysis indicated that students took a more active role in the teaching/learning process and showed greater interest in the topic addressed through the new technology, compared with the involvement observed during traditional lessons in previous years. The architecture and operational modes of the virtual laboratory, as well as the results of the preliminary analysis, are discussed.