Art therapy assumes that artistic work is related to individual characteristics of its creator. Empirically, this hypothesis has not yet been tested, largely because quantitative methods are rare. The Rating Instrument for Two-dimensional Pictorial Works (RizbA) is designed to address this gap. The construct, pictorial expression, is theoretically defined by seven content areas (representation, color, shape, space, motion, composition, expression), which together constitute the overall construct. Test development is based on art-historical and art-therapeutic theories and is supported empirically. Two online studies are conducted using a sample of nine pictures, which are rated by experts (n1 = 12, n2 = 8). In the first study, an item pool of 113 items is examined on the basis of psychometric characteristics, and a preliminary test version is developed. The second study examines the quality criteria of this preliminary version. Factor analyses are computed for both studies. The preliminary version comprises 26 items. Its capacity to differentiate between pictorial works ranges from .897 (T1) to .766 (T2), and its inter-rater reliability from .525 (T1) to .917 (T2). Test-retest reliability is .919. PCA suggests a four-factor solution, which is largely consistent across studies. As a reliable measurement instrument, RizbA opens new perspectives for fundamental research in art therapy and art psychology.
In empirical art psychology and creativity research, most studies focus on the psychological correlates of art. Only a few go beyond treating artworks as categorical data (e.g., abstract vs. representational) and consider the artworks themselves in detail. This is partly due to the lack of reliable quantitative measurements. The Rating Instrument for Two-dimensional Pictorial Works (RizbA) addresses this gap in current research designs. The present study validates the questionnaire on a representative sample of contemporary visual art, consisting of 318 images depicting 21st-century works by artists from different cultural areas. In a randomized test-retest design, the pictorial material was rated by 506 (T1) and 238 (T2) art experts using RizbA. Statistical quality criteria, such as item difficulty, capacity for differentiation, test-retest reliability, and intraclass correlation, were calculated. Principal component analysis (PCA) and indices of factor similarity were computed. The overall test's capacity for differentiation yields a partial eta-squared of .31 (T1) and .40 (T2). Test-retest reliability is .86. PCA reveals an eight-factor solution, which is largely consistent across both measurement points. Tucker's coefficient of congruence ranges between |.71| and |1.00|. Intraclass correlation coefficients are .86 (T1) and .73 (T2). This study indicates the generalizability of the questionnaire to contemporary artworks. Although no final conclusion on the factor structure can be drawn yet, the results are promising. As the first reliable quantitative tool for formal picture analysis, RizbA allows more detailed examination of visual art and its psychological correlates. This broadens research methodology by giving the artwork itself greater weight in art psychology and creativity research.
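The factor-similarity index reported above, Tucker's coefficient of congruence, compares the loadings of a factor at two measurement points. A minimal sketch of the computation follows; the loading values are hypothetical and serve only to illustrate the formula, not to reproduce the study's data.

```python
import numpy as np

def tucker_congruence(a, b):
    """Tucker's coefficient of congruence between two factor-loading vectors:
    phi = sum(a_i * b_i) / sqrt(sum(a_i^2) * sum(b_i^2))."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b))

# Hypothetical loadings of four items on one factor at T1 and T2.
loadings_t1 = [0.72, 0.65, -0.10, 0.58]
loadings_t2 = [0.70, 0.61, -0.05, 0.63]
phi = tucker_congruence(loadings_t1, loadings_t2)
```

Values of |phi| close to 1 indicate that the factor recovered at T2 is essentially the same as at T1; identical loading vectors yield exactly 1.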
Editormetrics analyses the role of editors of academic journals and their impact on the scientific publication system. However, such analyses would best rely on open, structured, and machine-readable data on editors and editorial boards, which remain rare. To address this shortcoming, the project Open Editors collects data about academic journal editors on a large scale and structures them into a single dataset. It does so by scraping the websites of 6,090 journals from 17 publishers, thereby structuring publicly available information (names, affiliations, editorial roles, etc.) about 478,563 researchers. The project will repeat this web-scraping procedure annually to enable insights into changes of editorial boards over time. All code and data are made available on GitHub, while the result is browsable at a dedicated website (https://openeditors.ooir.org). This dataset carries wide-ranging implications for meta-scientific investigations into the landscape of scholarly publications, including bibliometric analyses, and allows for critical inquiries into the representation of diversity and inclusivity. It also contributes to the goal of expanding linked open data within science to evaluate and reflect on the scholarly publication process.
Editormetrics analyses the role of editors of academic journals and their impact on the scientific publication system. Such analyses would best rely on open, structured, and machine-readable data about editors and editorial boards, which remain rare. To address this shortcoming, the project Open Editors collects data about academic journal editors on a large scale and structures them into a single dataset. It does so by scraping the websites of 7,352 journals from 26 publishers (including predatory ones), thereby structuring publicly available information (names, affiliations, editorial roles, ORCID, etc.) about 594,580 researchers. The dataset shows that journals and publishers are immensely heterogeneous in terms of editorial board sizes, regional diversity, and editorial role labels. All code and data are made available on Zenodo, while the result is browsable at a dedicated website (https://openeditors.ooir.org). This dataset carries implications both for the practical purposes of research evaluation and for meta-scientific investigations into the landscape of scholarly publications, and allows for critical inquiries regarding the representation of diversity and inclusivity across academia.
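The core structuring step described above, turning an editorial-board web page into tabular records of names, affiliations, and roles, can be sketched as follows. The HTML layout and class names below are hypothetical, and each real publisher requires its own parser; the project's actual scraping code is published in its repository.

```python
from html.parser import HTMLParser

# Hypothetical editorial-board page fragment; real publisher markup varies widely.
SAMPLE_HTML = """
<ul class="editorial-board">
  <li><span class="name">Jane Doe</span> - <span class="affiliation">University A</span> - <span class="role">Editor-in-Chief</span></li>
  <li><span class="name">John Roe</span> - <span class="affiliation">Institute B</span> - <span class="role">Associate Editor</span></li>
</ul>
"""

class BoardParser(HTMLParser):
    """Collects one dict per <li>, keyed by the span class names."""

    def __init__(self):
        super().__init__()
        self.records = []      # structured output: list of dicts
        self._field = None     # span class currently being read
        self._current = {}     # record under construction

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "li":
            self._current = {}
        elif tag == "span" and cls in ("name", "affiliation", "role"):
            self._field = cls

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()

    def handle_endtag(self, tag):
        if tag == "span":
            self._field = None
        elif tag == "li" and self._current:
            self.records.append(self._current)

parser = BoardParser()
parser.feed(SAMPLE_HTML)
# parser.records now holds one dict per board member.
```

Aggregating such records across thousands of journals yields the kind of single machine-readable dataset the project describes; in practice, the scraper must also handle pagination, inconsistent role labels, and missing affiliations.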