In this paper, we propose techniques to assess the objective quality of stereoscopic 3D video content based on motion and depth-map features. An analysis has been carried out to understand what causes visual discomfort in the viewer when watching 3D video. Motion is an important feature of the 3D experience, but it is also often a cause of visual discomfort. After applying the algorithm, guidelines are obtained that quantify the impact on the viewer's experience of common cases such as high-motion sequences, scene changes with abrupt parallax changes, or the complete absence of stereoscopy.
Work partially funded by the project MIPAC-CM (Monitorización por procesado de imagen y ciencia ciudadana para la conservación de materiales del patrimonio cultural — image-processing monitoring and citizen science for the conservation of cultural heritage materials), project code 2018/NMT-4913, in collaboration with the Centro Nacional de Investigaciones Metalúrgicas (CENIM) of the Consejo Superior de Investigaciones Científicas (CSIC).
Colour changes of cultural heritage objects can be related to the degradation of materials, so a proper colour monitoring system can be used to detect conservation problems. To this end, a monitoring methodology for preventive conservation of cultural heritage based on tailored colour reference charts and image analysis is proposed.

Reference colour charts have been designed and tested for use in museums. Charts containing 64 colour patches have been printed using high-stability inks on 4 different substrates: acid-free paper SkyLight, acid-free paper covered with a propylene film, FOREX® and GlassPack. Their stability has been studied by accelerated ageing in a UV chamber, and the harmlessness of the materials by the Oddy test. The final selection of material, laminated paper, balances colour change upon ageing against performance in the Oddy test. Using this material and the proposed design, the colour change of copper and silver coupons has been assessed using images that are adjusted and calibrated by an adaptive calibration framework employing a given set of reference colours, which homogenises the visual information in the supplied images. Thus, regardless of the camera of origin, any processed picture delivers reliable information on the state of the colour of the metal surfaces at the moment it was taken.

Results demonstrate the adequacy of the approach and of the chart design for colour calibration, so these charts can be used to monitor the colour change of sensitive materials (metal coupons) using photographs. As the colour change of reference metals is a consequence of corrosion by environmental factors, it may be used as a measure of air quality in museum environments.
This methodology can be used to design a low-cost preventive conservation tool, in which the colour change of metal coupons (or other reference materials) is followed through image analysis of pictures taken periodically by conservators or visitors, introducing citizen science into the conservation strategy.
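The adaptive calibration framework is only outlined in the abstract. A minimal sketch of the idea, assuming a simple affine (3×4) colour transform fitted by least squares from the 64 chart patches (the actual framework may well be more elaborate), could look like this:

```python
import numpy as np

def fit_colour_correction(observed, reference):
    """Fit an affine colour transform (3x4 matrix) mapping observed
    chart-patch RGB values to their reference values via least squares."""
    obs = np.hstack([observed, np.ones((observed.shape[0], 1))])  # add bias column
    M, *_ = np.linalg.lstsq(obs, reference, rcond=None)           # shape (4, 3)
    return M.T                                                    # shape (3, 4)

def apply_colour_correction(M, pixels):
    """Apply the fitted transform to an (N, 3) array of RGB values."""
    px = np.hstack([pixels, np.ones((pixels.shape[0], 1))])
    return px @ M.T

# Toy demo: patches distorted by a known colour cast are mapped back
# to their reference values (hypothetical data, not from the paper).
rng = np.random.default_rng(0)
reference = rng.uniform(0, 1, size=(64, 3))          # 64-patch chart
cast = np.array([[0.90, 0.05, 0.00],
                 [0.00, 1.10, 0.00],
                 [0.05, 0.00, 0.80]])
observed = reference @ cast.T + 0.02                 # simulated camera cast
M = fit_colour_correction(observed, reference)
corrected = apply_colour_correction(M, observed)
print(np.allclose(corrected, reference, atol=1e-6))  # True
```

Once `M` is fitted per photograph from the chart patches, the same transform is applied to the coupon pixels, which is what makes measurements comparable across cameras.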
In recent years, the use of unmanned aerial vehicles (UAVs) for surveillance tasks has increased considerably. This technology provides a versatile and innovative approach to the field. However, the automation of tasks such as object recognition or change detection usually requires image processing techniques. In this paper we present a system for change detection in video sequences acquired by moving cameras. It is based on the combination of image alignment techniques with a deep learning model based on convolutional neural networks (CNNs). This approach covers two important aspects. First, our system adapts to variations in the UAV flight: in particular, differences in height between flights and slight changes in the camera's position or in the UAV's movement. Such variations can be produced by multiple factors, such as weather conditions (e.g. wind), security requirements or human error. Second, our model detects changes precisely in diverse environments and has been compared with state-of-the-art change detection methods. Performance has been measured on the Change Detection 2014 dataset, which provides a selection of labelled images from different scenarios for training change detection algorithms. We have used images from the dynamic background, intermittent object motion and bad weather sections, selected to test our algorithm's robustness to background changes, as in real flight conditions. Our system provides a precise solution for these scenarios: the mean F-measure score from the image analysis surpasses 97%, and precision in the intermittent object motion category is above 99%.
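The align-then-detect pipeline can be illustrated at toy scale. The following is a minimal sketch assuming purely translational camera motion, estimated by FFT cross-correlation, with simple thresholded differencing standing in for the paper's CNN (the actual system uses richer alignment and a learned detector):

```python
import numpy as np

def estimate_shift(ref, mov):
    """Estimate the integer (dy, dx) shift that aligns `mov` to `ref`,
    via FFT cross-correlation of the mean-subtracted frames."""
    a = ref - ref.mean()
    b = mov - mov.mean()
    corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2:       # map wrapped lags to signed shifts
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx

def change_mask(ref, mov, thresh=0.5):
    """Align `mov` to `ref`, then threshold the absolute difference."""
    dy, dx = estimate_shift(ref, mov)
    aligned = np.roll(mov, (dy, dx), axis=(0, 1))
    return np.abs(ref - aligned) > thresh

# Toy demo: a shifted frame with one new bright object (synthetic data).
rng = np.random.default_rng(1)
ref = rng.uniform(0, 1, (64, 64))             # textured background
mov = np.roll(ref, (3, -5), axis=(0, 1))      # simulated camera motion
mov[20:24, 30:34] = 2.0                       # an appearing object
mask = change_mask(ref, mov)
print(mask.sum())                             # → 16 changed pixels
```

In the real system, alignment must also handle the height and viewpoint variations discussed above (a homography rather than a pure translation), and the CNN replaces the fixed threshold with a learned decision.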
Abstract: Recommendations such as ITU-T P.910 suggest the parameters TI (temporal information) and SI (spatial information) for characterizing video sequences for quality assessment. In this paper, we propose two additional disparity-based parameters, SPI (spatial parallax information) and TPI (temporal parallax information), to characterize 3DTV video sequences for this purpose.
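As a reference point for the proposed parallax parameters, P.910's SI and TI are simple per-frame statistics: SI is the maximum over time of the spatial standard deviation of the Sobel-filtered frame, and TI is the maximum over time of the standard deviation of the inter-frame difference. A minimal NumPy sketch, taking luminance frames as 2-D arrays:

```python
import numpy as np

def sobel_magnitude(frame):
    """Gradient magnitude via 3x3 Sobel kernels (valid region only)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = frame.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(3):                # accumulate the 3x3 correlation
        for j in range(3):
            patch = frame[i:i + h - 2, j:j + w - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy)

def si_ti(frames):
    """P.910: SI = max_n std(Sobel(F_n)), TI = max_n std(F_n - F_{n-1})."""
    si = max(sobel_magnitude(f).std() for f in frames)
    ti = max((frames[n] - frames[n - 1]).std() for n in range(1, len(frames)))
    return si, ti

# Toy demo (synthetic): a bright square moving right gives nonzero SI and TI.
frames = []
for n in range(4):
    f = np.zeros((32, 32))
    f[8:16, 4 + 4 * n: 12 + 4 * n] = 1.0
    frames.append(f)
si, ti = si_ti(frames)
print(si > 0 and ti > 0)  # True
```

By analogy, SPI and TPI would presumably apply the same spatial and temporal statistics to the disparity (parallax) maps rather than the luminance frames; the precise definitions are the paper's and are not reproduced here.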