This study tackles a common yet underrated problem in remote-sensing image analysis: the fact that human interpretation is highly variable among different operators. Despite current technological advancements, human perception and interpretation are still vital components of the map-making process. Consequently, human errors can considerably bias both mapping and modelling results. In our study, we present a web-based tool to quantify operator variability and to identify the human and external factors affecting this variability. Human operators were given a series of images and were asked to hand-digitize different point, line, and polygon objects. The quantification of performance variability was achieved using both thematic and positional accuracy measures. Subsequently, a series of questions related to demographics, experience, and personality were asked, and the answers were also quantified. Correlation and regression analyses were then used to explain the variability in operator performance. From our study, we conclude that: (1) humans were seldom perfect in visual interpretation; (2) some geographic objects were more complex to accurately digitize than others; (3) there was a high degree of variability among image interpreters when hand-digitizing the same objects; and (4) operator performance was mainly determined by demographic, non-cognitive, and cognitive personality factors, whereas external and technical factors influenced operator performance to a lesser extent. Finally, the results also indicated a gradual decline in performance over time, mimicking classical mental fatigue effects.
Wood-plastic composites (WPCs) are increasingly used in decking applications, where exterior exposure can lead to sufficient moisture for fungal deterioration. Standard tests recommended for assessing the fungal durability of WPCs were originally developed for wood or wood-based panels; they are not applied in this study because the similarity in moisture behaviour between wood(-based panels) and WPCs is questionable. The moisture dynamics of commercialised WPCs versus wood-based panels were studied employing different moistening methods. The moisture sorption differences between various WPCs were minimal despite the different wood contents, particle sizes, and plastics employed, but given sufficient time the wood particles in WPCs absorbed enough water to permit fungal decay. To assay the fungal durability of WPCs, immersion of the specimens for at least 1 week in water at 70°C appears to be the most effective pretreatment.
An often undervalued but inevitable component of remote sensing image analysis is human perception and interpretation. Human intervention is a requisite for visual image interpretation, where the interpreter actually performs the analysis. Although image processing has become increasingly automated, human screening and interpretation remain indispensable at certain stages. One particular stage where the operator plays a crucial role is the development of reference maps. This is often done through visual interpretation of an image by an operator. Although the result is crucial for adequately assessing the performance of automated systems, the work of the human operator is rarely questioned. No variability is considered, and the possibility of errors is not mentioned, reflecting an implicit assumption that operator performance approaches perfection and that infrequent errors are randomly distributed across time, operators, and image types. Given that operator variability has been demonstrated in several related domains, for example, the screening of medical images, this assumption may be questioned. This letter brings the issue to the attention of the remote sensing community and introduces a new concept for quantifying operator variability. As the WAVARS project (web-based assessment of operator performance variability within remote sensing image interpretation tasks) will benefit from a large amount of data, we kindly invite interested researchers to access the website http://wavars.ugent.be and take part in the test.