2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)
DOI: 10.1109/vr46266.2020.00009
Exploring Eye Gaze Visualization Techniques for Identifying Distracted Students in Educational VR

Cited by 28 publications (21 citation statements). References: 0 publications.
“…Besides, visual clusters of disruptive gaze information displayed during collaboration may also become less prominent. To investigate collaborative gaze systems that are ergonomic and immersive, researchers have explored the use of emerging technologies such as AR (Lee et al, 2017) (Li et al, 2019), VR (Piumsomboon et al, 2017) (Rahman et al, 2020), and MR (Kim et al, 2020a) (Bai et al, 2020).…”
Section: Visual Interfaces of Gaze Cues in Collaboration
confidence: 99%
“…In recent years, the rise of emerging reality-based technologies such as Augmented Reality (AR), Mixed Reality (MR), or Virtual Reality (VR) has enabled novel techniques to overcome the limitations of situated displays. However, the current co-located collaborative gaze visualisation studies conducted using reality-based technologies are often one-directional (single-user gaze indicator) (Erickson et al, 2020) (Li et al, 2019), asynchronous (Rahman et al, 2020) with different knowledge levels towards the task (Erickson et al, 2020), in a virtual task space (Li et al, 2019), or between a human and a virtual collaborator (Erickson et al, 2020) (Li et al, 2019) (Rahman et al, 2020). It is common to represent all gaze behaviours (eye fixations, saccades, blinks, etc.) using the same virtual cue (e.g., a virtual gaze ray), while richer visualisations of different gaze behaviours combining both spatial and temporal information are neglected.…”
Section: Introduction
confidence: 99%
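The citation above distinguishes gaze behaviours (fixations, saccades, blinks) that are often collapsed into a single virtual cue. A standard way to separate the first two is a velocity-threshold (I-VT) classifier over the gaze sample stream; the sketch below is a minimal illustration, with all names and the 30 deg/s threshold chosen as assumptions rather than taken from the cited papers.

```python
# Minimal I-VT (velocity-threshold) classifier separating fixations from
# saccades in a stream of gaze direction samples. The threshold value and
# the small-angle distance approximation are illustrative assumptions.

def classify_gaze(samples, threshold_deg_per_s=30.0):
    """samples: list of (timestamp_s, yaw_deg, pitch_deg) gaze directions.
    Returns one label per inter-sample interval: 'fixation' or 'saccade'."""
    labels = []
    for (t0, y0, p0), (t1, y1, p1) in zip(samples, samples[1:]):
        dt = t1 - t0
        # Small-angle approximation of angular distance, in degrees.
        dist = ((y1 - y0) ** 2 + (p1 - p0) ** 2) ** 0.5
        velocity = dist / dt if dt > 0 else float("inf")
        labels.append("saccade" if velocity > threshold_deg_per_s else "fixation")
    return labels

samples = [(0.00, 0.0, 0.0), (0.01, 0.1, 0.0), (0.02, 5.0, 1.0), (0.03, 5.1, 1.0)]
print(classify_gaze(samples))  # ['fixation', 'saccade', 'fixation']
```

Once intervals are labelled, each behaviour class could be given its own visual cue (e.g., a ray for saccades, a dwell marker for fixations) instead of one undifferentiated gaze ray.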
“…There is a risk that making all data available to users simultaneously, even if possible, will result in more confusion. Hence, guidelines and opportunities to filter data for analysis seem crucial, which is also mentioned by Rahman et al (2020) for larger user groups and in Ugwitz et al (2022). There might also be different needs depending on whether the users are novices or experts.…”
Section: Discussion
confidence: 99%
“…For the 3D model on screen and in VR, the 3D gaze data have been computed via raycasting and visualized as points in the VE together with a playback of the gaze movements. Rahman et al (2020) explore gaze data supporting a VR-based education scenario. Eye-tracking data can help provide real-time information to see if students are confused or distracted by looking at objects not relevant to the educational aim.…”
Section: Gaze in Virtual Environments
confidence: 99%
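The citation above describes computing 3D gaze points via raycasting before rendering them in the virtual environment. As a stand-in for a full scene raycaster, the sketch below intersects a gaze ray with a single plane; the function name and vector math are illustrative assumptions, not the cited study's code.

```python
# Sketch of projecting a gaze ray to a 3D point via raycasting, using a
# plane as a stand-in for scene geometry. Vectors are plain 3-tuples to
# keep the example self-contained; an engine would use its own math types.

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the 3D hit point of the gaze ray on the plane, or None if
    the ray is parallel to the plane or the hit lies behind the eye."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray parallel to the plane
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0:
        return None  # intersection behind the viewer
    return tuple(o + t * d for o, d in zip(origin, direction))

# Eye at the origin looking down -z toward a wall at z = -2.
hit = ray_plane_intersection((0, 0, 0), (0, 0, -1), (0, 0, -2), (0, 0, 1))
print(hit)  # (0.0, 0.0, -2.0)
```

The resulting hit points can then be rendered as markers in the VE and replayed over time, as the quoted passage describes.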
“…Physiological measures can also benefit classical teacher-student scenarios. Rahman et al (2020) present a virtual education environment in which the teacher is provided with a visual representation of the gaze behavior of students. This allows teachers to identify distracted or confused students, which can benefit the transfer of knowledge.…”
Section: Virtual Classrooms
confidence: 99%
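The citations above describe surfacing students' gaze to the teacher so that distracted students can be identified. A minimal sketch of that idea, assuming per-student dwell logs of gaze targets (the data format, names, and the 0.5 ratio threshold are hypothetical, not from Rahman et al, 2020):

```python
# Hypothetical distraction flagging from gaze-target dwell logs: a student
# whose dwell time on task-relevant objects falls below a ratio threshold
# is flagged for the teacher. All names and thresholds are assumptions.

def flag_distracted(gaze_log, relevant, min_relevant_ratio=0.5):
    """gaze_log: {student: [(object_name, dwell_seconds), ...]}.
    relevant: set of object names relevant to the educational aim.
    Returns the students who mostly looked at irrelevant objects."""
    flagged = []
    for student, events in gaze_log.items():
        total = sum(dwell for _, dwell in events)
        on_task = sum(dwell for obj, dwell in events if obj in relevant)
        if total > 0 and on_task / total < min_relevant_ratio:
            flagged.append(student)
    return flagged

log = {
    "alice": [("whiteboard", 8.0), ("window", 2.0)],
    "bob": [("window", 7.0), ("whiteboard", 3.0)],
}
print(flag_distracted(log, relevant={"whiteboard", "slides"}))  # ['bob']
```

In a live classroom the flag would feed the teacher's visualization (e.g., highlighting the student's avatar) rather than being read from a batch log.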