Divergent thinking has often been used as a proxy measure of creative thinking, but this practice lacks a foundation in modern cognitive psychological theory. This article addresses several issues with the classic divergent-thinking methodology and presents a new theoretical and methodological framework for cognitive divergent-thinking studies. A secondary analysis of a large dataset of divergent-thinking responses is presented. Latent semantic analysis was used to examine the potential changes in semantic distance between responses and the concept represented by the divergent-thinking prompt across successive response iterations. The results of linear growth modeling showed that although there is some linear increase in semantic distance across response iterations, participants high in fluid intelligence tended to give more distant initial responses than those with lower fluid intelligence. Additional analyses showed that the semantic distance of responses significantly predicted the average creativity rating given to the response, with significant variation in average levels of creativity across participants. Finally, semantic distance does not seem to be related to participants' choices of their own most creative responses. Implications for cognitive theories of creativity are discussed, along with the limitations of the methodology and directions for future research.
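As a rough illustration of the semantic-distance measure described in this abstract, the sketch below builds a tiny latent semantic analysis space and scores hypothetical responses against a prompt. The corpus, prompt, and responses are toy placeholders, not the study's materials or scoring pipeline; with a realistic training corpus, 1 − cosine similarity in the reduced space is the conventional LSA-based distance score.

```python
# Minimal sketch of LSA-based semantic distance, assuming a hypothetical toy
# corpus; the original study used a much larger LSA space and real responses.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

prompt = "brick"                      # divergent-thinking prompt concept
responses = ["build a wall", "paperweight", "grind into pigment for paint"]

# Build a small LSA space: TF-IDF followed by truncated SVD.
docs = [prompt] + responses
tfidf = TfidfVectorizer().fit_transform(docs)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Semantic distance = 1 - cosine similarity between prompt and each response.
sims = cosine_similarity(lsa[:1], lsa[1:])[0]
for resp, sim in zip(responses, sims):
    print(f"{resp!r}: semantic distance = {1 - sim:.3f}")
```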
How people perceive their creativity is a growing area of research, but less is known about how people perceive the distinction between inborn (fixed) versus learnable (growth) aspects of their creative competence. This study measured fixed and growth creative mindsets and their relationships to creative self-efficacy and creative identity in a sample of 620 undergraduate students. The data were split into 2 equal-sized samples to perform exploratory factor analysis and confirmatory factor analysis. Exploratory factor analysis results showed that items adapted from Dweck's previous studies, with the word creativity replacing intelligence, did not perform as well as Karwowski's creative mindset items. Confirmatory factor analysis results suggest that the best measurement model for mindsets is one that also includes self-efficacy, but not necessarily creative identity. Fixed mindsets correlated much less with the other factors, and all of the small correlations were in the negative direction, which could be expected given that those with a fixed mindset employ more helpless strategies. Fixed and growth creative mindsets were moderately negatively correlated, suggesting that while the 2 mindsets are related, it is important for future researchers to measure levels of both dimensions. This study also suggests that fixed and growth mindsets are better measured using descriptions that pertain specifically to the creative process. Implications and theoretical considerations are discussed.
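For readers unfamiliar with the split-sample workflow mentioned above, the sketch below illustrates the general idea under stated assumptions: the item responses are simulated placeholders, and only the exploratory step is shown, since confirmatory factor analysis requires dedicated SEM software (e.g., semopy or lavaan).

```python
# Minimal sketch of a split-sample factor-analytic workflow, assuming
# hypothetical item-level data; item names and responses are placeholders.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
items = [f"item_{i}" for i in range(1, 11)]          # 10 hypothetical mindset items
data = pd.DataFrame(rng.integers(1, 6, size=(620, 10)), columns=items)

# Split the sample into two equal halves: one for EFA, one held out for CFA.
efa_half = data.sample(frac=0.5, random_state=0)
cfa_half = data.drop(efa_half.index)

# Exploratory factor analysis on the first half (two factors: fixed vs. growth).
efa = FactorAnalysis(n_components=2, random_state=0).fit(efa_half)
loadings = pd.DataFrame(efa.components_.T, index=items,
                        columns=["factor_1", "factor_2"])
print(loadings.round(2))
# The held-out half (cfa_half) would then be used to fit a confirmatory model
# in an SEM package (not shown here).
```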
This commentary discusses common relevant themes that have been highlighted across contributions in this special issue on "Creativity Assessment: Pitfalls, Solutions, and Standards." We first highlight the challenges of operationalizing creativity through the use of a range of measurement approaches that are simply not tapping into the same aspect of creativity. We then discuss pitfalls and challenges of the three most popular measurement methods employed in the field, namely divergent thinking tasks, product-based assessment using the consensual assessment technique, and self-report methodology. Finally, we point to two imperative standards that emerged across contributions in this collection of articles, namely transparency (the need to accurately define, operationalize, and report on the specific aspect[s] of creativity studied) and homogenization of creativity assessment (identification and consistent use of an optimal "standard" measure for each major aspect of creativity). We conclude by providing directions on how the creativity research community and the field can meet these standards.
Recent studies have highlighted both similarities and differences between the cognitive processing that underpins memory retrieval and that which underpins creative thinking. To date, studies have focused more heavily on the Alternative Uses task, and fewer have investigated the processing underpinning other idea generation tasks. This study examines both Alternative Uses and Consequences idea generation using methods drawn from cognitive psychology, together with a novel method for evaluating the creativity of the responses. Participants were recruited from Amazon Mechanical Turk using a custom interface allowing for the requisite experimental control. Results showed that both Alternative Uses and Consequences generation are well approximated by an exponential cumulative response time model, consistent with studies of memory retrieval. Participants were slower to generate their first consequence than their first alternative use, and inter-response time was negatively related to pairwise similarity on both tasks. Finally, the serial order effect was exhibited in both tasks, with Consequences earning more creative evaluations than Uses. The results have implications for burgeoning neuroscience research on creative thinking, and suggestions are made for future areas of inquiry. In addition, the experimental apparatus described provides an equitable way for researchers to obtain good-quality cognitive data for divergent thinking tasks.
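The exponential cumulative response time model referred to above can be illustrated with a small curve fit. The timestamps below are hypothetical, and the parameterization (asymptotic response count plus exponential rate) is one common form used in memory-retrieval studies, not necessarily the exact specification from this article.

```python
# Minimal sketch of fitting an exponential cumulative response-time model to
# hypothetical response timestamps; parameter values are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def cumulative_exponential(t, n_max, lam):
    """Expected cumulative number of responses produced by time t (seconds)."""
    return n_max * (1.0 - np.exp(-lam * t))

# Hypothetical timestamps (seconds) at which a participant produced responses.
response_times = np.array([4.0, 9.0, 16.0, 26.0, 40.0, 61.0, 95.0])
cumulative_count = np.arange(1, len(response_times) + 1)

params, _ = curve_fit(cumulative_exponential, response_times, cumulative_count,
                      p0=[len(response_times), 0.05])
n_max_hat, lam_hat = params
print(f"Estimated asymptote = {n_max_hat:.2f}, rate = {lam_hat:.3f} per second")
```

The fitted asymptote approximates the total number of ideas a participant would eventually produce, and the rate captures how quickly output slows over time, which is how the exponential form links idea generation to cumulative-recall curves from memory research.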