Eye behaviour provides valuable information about one's higher cognitive functions and affective state. Although eye tracking is gaining ground in the research community, it is not yet a widespread approach for detecting emotional and cognitive states. In this paper, we review eye- and pupil-tracking metrics (such as gaze, fixations, saccades, blinks, and pupil size variation) used to detect emotional and cognitive processes, focusing on visual attention, emotional arousal, and cognitive workload. In addition, we examine their involvement as well as the computational recognition methods employed for reliable emotional and cognitive assessment. The publicly available datasets used in relevant research efforts are collected, and their specifications and other pertinent details are described. Multimodal approaches that combine eye-tracking features with other modalities (e.g., biosignals), along with artificial intelligence and machine learning techniques, are also surveyed in terms of their recognition/classification accuracy. Limitations, current open research problems, and prospective future research directions are discussed for the use of eye tracking as the primary sensor modality. This study aims to comprehensively present the most robust and significant eye/pupil metrics in the available literature towards the development of a robust emotional or cognitive computational model.
SIGNIFICANCE This article evaluates the standardized Greek version of the International Reading Speed Texts (IReST) set, which enriches interlanguage comparisons and international clinical studies of reading performance. Moreover, it investigates how specific textual and subject-related characteristics modulate the variability of reading speed across texts and readers. PURPOSE The purpose of this study was to develop a standardized Greek version of the IReST set and to investigate how specific textual and subject-related factors modulate the variability of reading speed across texts and readers. METHODS The English IReST texts were translated into Greek and matched for length, content, and linguistic difficulty. The Greek IReSTs were presented at a distance of 40 cm and a size of 1 M to assess the reading speeds of 25 normally sighted native speakers (age range, 18 to 35 years). The participants read the texts aloud while reading time was measured by stopwatch. Reading performance included measurement of reading speed in three units of analysis. Reading efficiency was assessed using a word-level oral reading task. Statistical analysis included evaluation of subject- and text-related variability, as well as correlations between reading speed and specific textual and subject-related factors. RESULTS The average reading speed across texts was 208 ± 24 words/min, 450 ± 24 syllables/min, and 1049 ± 105 characters/min. Differences between readers accounted for 76.6% of the total variability in reading speed, whereas differences across texts accounted for 23.4%. Word length (in syllables per word) and median word frequency made a statistically significant contribution to the variability of reading speed (r = 0.95 and 0.70, respectively). Reading speed was also significantly correlated with word reading efficiency (r = 0.68).
CONCLUSIONS The addition of the Greek version to the IReST language pack is expected to provide a valuable tool for clinical practice and research, enriching interlanguage comparisons and international studies of reading performance.
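The reader-versus-text variance partition reported above (76.6% vs. 23.4%) corresponds to a standard sum-of-squares decomposition for a balanced two-way layout (readers × texts). The following is a minimal sketch of that decomposition on synthetic data; the matrix dimensions, effect sizes, and variable names are illustrative assumptions, not the study's actual data or analysis code.

```python
import numpy as np

# Synthetic reading-speed matrix (words/min): rows = readers, columns = texts.
# Effect sizes are hypothetical, chosen so reader variability dominates.
rng = np.random.default_rng(0)
n_subj, n_text = 25, 10
subj_effect = rng.normal(0, 20, size=(n_subj, 1))   # reader-to-reader spread
text_effect = rng.normal(0, 10, size=(1, n_text))   # text-to-text spread
noise = rng.normal(0, 5, size=(n_subj, n_text))     # residual variation
speeds = 208 + subj_effect + text_effect + noise

# Sum-of-squares decomposition: SS_total = SS_readers + SS_texts + SS_residual.
grand = speeds.mean()
ss_subj = n_text * ((speeds.mean(axis=1) - grand) ** 2).sum()
ss_text = n_subj * ((speeds.mean(axis=0) - grand) ** 2).sum()
ss_total = ((speeds - grand) ** 2).sum()

share_subj = ss_subj / ss_total  # fraction attributable to readers
share_text = ss_text / ss_total  # fraction attributable to texts
print(f"reader share: {share_subj:.1%}, text share: {share_text:.1%}")
```

With effect standard deviations in roughly this ratio, the reader share dominates the text share, mirroring the pattern the study reports.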