2022
DOI: 10.3389/frvir.2021.727344

Optimizing Performance and Satisfaction in Matching and Movement Tasks in Virtual Reality with Interventions Using the Data Visualization Literacy Framework

Abstract: Virtual reality (VR) has seen increased use for training and instruction. Designers can enable VR users to gain insights into their own performance by visualizing telemetry data from their actions in VR. Our ability to detect patterns and trends visually suggests the use of data visualization as a tool for users to identify strategies for improved performance. Typical tasks in VR training scenarios are manipulation of 3D objects (e.g., for learning how to maintain a jet engine) and navigation (e.g., to learn t…


Cited by 3 publications (5 citation statements)
References 51 publications (43 reference statements)
“…VR offers unique means to explore spatial and abstract data in a unified, immersive, and presence-enhancing environment beyond traditional WIMP interfaces, i.e., windows, icons, menus, pointer (Van Dam, 1997). While exploring 3D reference organs and tissue blocks on a 2D screen can be learned (Bueckle et al., 2021, 2022), many users have issues interacting with 3D objects on a 2D screen. […] visualizations at various viewpoints (Camp et al., 1998), and to create visualizations of intricate molecular structures and biomolecular systems (Chavent et al., 2011; Gill and West, 2014; Trellet et al., 2018; Wiebrands et al., 2018).…”
Section: Human Reference Atlas Data
confidence: 99%
“…The anatomical structures in the ASCT+B tables are linked to anatomical structures in the 3D reference organs via a crosswalk (Bruce W. Herr II, 2022). Using the Registration User Interface (RUI) and the EUI (Bueckle et al., 2021, 2022; Börner et al., 2022), 3D reference organs are used to register 3D tissue blocks from diverse donors into the HRA. The specimen data for each tissue block is tracked to support filter, search, and exploration by donor demographics using the CCF Ontology (Herr et al., 2022).…”
Section: Introduction
confidence: 99%
“…The anatomical structures in the ASCT+B tables are linked to anatomical structures in the 3D reference organs via a crosswalk (Quardokus et al., 2022). Using the Registration User Interface (RUI) and the EUI (Bueckle et al., 2021; Bueckle et al., 2022; Börner et al., 2022), 3D reference organs are used to register 3D tissue blocks from diverse donors into the HRA. The specimen data for each tissue block is tracked to support filter, search, and exploration by donor demographics using the CCF Ontology (Herr et al., 2023).…”
Section: Introduction
confidence: 99%
“…VR offers unique means to explore spatial and abstract data in a unified, immersive, and presence-enhancing environment beyond traditional WIMP interfaces, i.e., windows, icons, menus, pointer (Van Dam, 1997). While exploring 3D reference organs and tissue blocks on a 2D screen can be learned (Bueckle et al., 2021; Bueckle et al., 2022), many users have issues interacting with 3D objects on a 2D screen.…”
Section: Introduction
confidence: 99%