A breakthrough is needed to achieve substantial progress in the field of Content-Based Image Retrieval (CBIR). This breakthrough can be achieved by: 1) optimizing user-system interaction, 2) combining the wealth of techniques from text-based Information Retrieval with CBIR techniques, 3) exploiting human cognitive characteristics, especially human color processing, and 4) conducting benchmarks with users to evaluate new CBIR techniques. In this paper, these guidelines are illustrated by findings from our research over the last five years, which has led to the development of the online Multimedia for Art ReTrieval (M4ART) system: http://www.m4art.org. The M4ART system follows the guidelines on all four issues and is assessed on benchmarks using 5730 queries on a database of 30,000 images. Therefore, M4ART can be considered a first step into a new era of CBIR.
The prototype of an online Multimedia for Art ReTrieval (M4ART) system is introduced, which provides access to the digitized collection of the National Gallery of the Netherlands (the Rijksmuseum). The current online system of the Rijksmuseum is text-based and requires expert knowledge of the work searched for; without such knowledge, retrieval fails. M4ART extends this system by allowing the user to query with an example image, which can be uploaded or selected by browsing the collection. The global color distribution of the example image, and optionally a set of texture features, are then extracted and compared with those of the images in the collection. Thus, the collection can be queried based on text as well as on content-based features. Moreover, the matching process of M4ART can be scrutinized. With this feature, M4ART not only gives expert and lay users equal access to the system, it also lets the user understand the system's inner workings. These qualities make M4ART unique in its ability to let the user access, enhance, and retrieve the knowledge available in digitized art collections.
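The abstract above does not specify how the global color distribution is represented or compared, so the following is a minimal sketch of the general technique: a quantized joint RGB histogram per image, compared by histogram intersection, with the collection ranked by similarity to the query. The function names (`color_histogram`, `rank_collection`), the bin count, and the choice of histogram intersection are illustrative assumptions, not the M4ART implementation.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Quantize an RGB image (H x W x 3, values 0-255) into a
    normalized joint color histogram with `bins` levels per channel."""
    quantized = (image // (256 // bins)).reshape(-1, 3)
    # Flatten the (r, g, b) bin triple into a single histogram index.
    idx = quantized[:, 0] * bins * bins + quantized[:, 1] * bins + quantized[:, 2]
    hist = np.bincount(idx, minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1] between two normalized histograms;
    1.0 means identical color distributions."""
    return np.minimum(h1, h2).sum()

def rank_collection(query, collection, bins=8):
    """Return collection indices ordered from most to least similar."""
    q = color_histogram(query, bins)
    scores = [histogram_intersection(q, color_histogram(img, bins))
              for img in collection]
    return sorted(range(len(collection)), key=lambda i: -scores[i])
```

In a query-by-example setting, the histograms of the collection would be precomputed and stored, so that only the query image needs to be processed at search time.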
Various texture analysis algorithms have been developed over the last decades. However, no computational model has arisen that adequately mimics human texture perception. In 2000, Payne, Hepplewhite, and Stoneham, and in 2005, Van Rikxoort, Van den Broek, and Schouten achieved mappings between humans and artificial classifiers of around 29% and 50%, respectively. In the current research, the work of Van Rikxoort et al. was replicated using the newly developed online card-sorting experimentation platform M-HinTS: http://eidetic.ai.ru.nl/M-HinTS/. In two separate experiments, color and gray-scale versions of 180 textures, drawn from the OuTex and VisTex texture databases, were clustered by 34 subjects. The mutual agreement among these subjects was 51% and 52% for the experiments with color and gray-scale textures, respectively. The average agreement between the k-means algorithm and the participants was 36%, where k-means approximated some participants up to 60%. Since last year's results were not replicated, an additional data analysis was developed, which uses the semantic labels available in the database. This analysis shows that semantics play an important role in human texture clustering and once more illustrates the complexity of texture recognition. The current findings, the introduction of M-HinTS, and the set of analyses discussed are the start of a next phase in unraveling human texture recognition.
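The comparison above rests on two components: clustering texture feature vectors with k-means, and scoring the agreement between two partitions (machine vs. human, or human vs. human). The abstract does not state which texture features or which agreement measure were used, so the sketch below assumes precomputed feature vectors and uses the Rand index (fraction of item pairs on which two clusterings agree) as an illustrative agreement measure. The deterministic first-k seeding is a simplification; practical k-means uses random restarts or k-means++ initialization.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means over rows of X; returns one cluster label per row.
    Deterministic seeding: first k rows serve as the initial centers."""
    centers = X[:k].copy().astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean).
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def rand_agreement(a, b):
    """Rand index: fraction of item pairs that two clusterings treat
    the same way (both grouped together, or both kept apart)."""
    a, b = np.asarray(a), np.asarray(b)
    n = len(a)
    same_a = a[:, None] == a[None, :]
    same_b = b[:, None] == b[None, :]
    mask = np.triu(np.ones((n, n), bool), 1)  # each pair counted once
    return (same_a == same_b)[mask].mean()
```

A pair-counting measure such as this is label-invariant: relabeling the clusters of either partition leaves the score unchanged, which is what a human-machine agreement comparison needs.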