We present PuPl (Pupillometry Pipeliner), an Octave-compatible library of Matlab functions for processing pupillometry data with an easy-to-use graphical user interface (GUI). PuPl's preprocessing tools include blink correction, data smoothing, and gaze correction. PuPl can also define and sort trials, and segment data to isolate event-related pupil dilation responses. PuPl's flexible tabular export tools enable a wide variety of statistical analyses. Furthermore, PuPl can translate GUI interactions into a Matlab script, enabling easy creation and reconfiguration of reusable data processing pipelines. Finally, PuPl is designed to be extensible, and users can easily contribute functionality as best practices for pupillometry evolve. Here we demonstrate PuPl by replicating published results using publicly available data. PuPl can be downloaded from github.com/kinleyid/pupl.
The evaluation of an idea's creativity constitutes an important step in successfully responding to an unexpected problem with a new solution. Yet distractions compete for cognitive resources with the evaluation process and may change how individuals evaluate ideas. In this paper, we investigate whether attentional demands from these distractions bias creativity evaluations. This question is examined using 1,065 creativity evaluations of 15 alternative uses of everyday objects by 71 study participants. Participants in the distraction group (Treatment) rated the alternative uses as more creative on the novelty dimension, but not the usefulness dimension, than did participants in the baseline group (Control). Psychophysiological measurements (event-related and spectral EEG and pupillometry) confirm that attentional resources in the Treatment group were diverted to a distractor task and that the Control group expended significantly more cognitive resources on the evaluation of the alternative uses. These data provide direct physiological evidence that distractor tasks draw cognitive resources from creative evaluation and that such distractions will bias judgements of creativity.

We rely on behavioral methods, that is, creativity evaluations of alternative uses (AUs), to test the broad hypothesis that distractions bias creativity evaluations. We leverage psychophysiological methods, namely pupillometry and event-related (ERP) and spectral electroencephalography (EEG), to verify that distracted individuals expend fewer cognitive resources on the evaluation task. The three methods are complementary. Established pupillometry, EEG oscillations, and ERP components (e.g., alpha band desynchronization, P300) provide reliable and robust real-time measures of attention and cognitive load13,14 that are not susceptible to retrospective and subjectivity biases15. Multi-method experiments provide support for the existence of a bias.
Biases in creativity

Selecting among alternative ideas is a fundamental challenge for individuals in a number of different organizational settings16,17. To this end, people devote a great deal of time and effort to evaluating ideas. Stevens and Burley18 find that, on average, managers evaluate more than 3,000 raw ideas to identify one that is commercially successful. The challenge of idea evaluation has only grown with the increase of ideation maximization training, platform-based contests, big data, and crowdfunding as means of generating a large number of ideas19,20,21. The challenge faced by decision-makers is often to select the most creative idea from a myriad of competing alternatives and to reduce biases that may have a detrimental effect on this process22. In social sciences, mathematics, and engineering, biases refer to systematic errors impacting performance23. Prior research has identified a number of decision-making errors in relatively simple decision-making tasks24,25. Research has also examined biases present in creativity6,26. Broadly, this research has examined biases ...
Visual analogue scales (VASs) allow survey respondents to specify their answers with a high degree of precision, unlike Likert-type scales in which only a few categorical responses are available. In web-based research, HTML sliders (in which the respondent drags and drops a marker) are sometimes used as a substitute for VASs, but sliders produce lower-quality data than true point-and-click VASs. Here, I introduce a plugin for jsPsych that implements a true VAS, enabling researchers to collect self-report data on a continuum without the drawbacks of sliders.
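A minimal sketch of how such a trial might be declared in a jsPsych timeline. The plugin class name (`jsPsychVas`) and the parameter names below are illustrative assumptions, not the plugin's documented API; consult the plugin's own documentation for the real interface.

```javascript
// Hypothetical jsPsych timeline using a VAS plugin.
// NOTE: the plugin type and parameter names are illustrative
// assumptions, not the plugin's documented interface.
const vasTrial = {
  type: jsPsychVas,                       // assumed plugin class name
  prompt: 'How creative was this idea?',  // question shown above the scale
  labels: ['Not at all', 'Extremely'],    // assumed endpoint anchor labels
  scale_width: 500                        // assumed scale width, in pixels
};

const jsPsych = initJsPsych();
jsPsych.run([vasTrial]);
```

Because respondents click a point on a continuous line rather than dragging a slider handle, the recorded value can be stored as a proportion of the scale's length, giving effectively continuous data.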