2023
DOI: 10.1111/cgf.14835
Evaluating View Management for Situated Visualization in Web‐based Handheld AR

Abstract: As visualization makes the leap to mobile and situated settings, where data is increasingly integrated with the physical world using mixed reality, there is a corresponding need for effectively managing the immersed user's view of situated visualizations. In this paper we present an analysis of view management techniques for situated 3D visualizations in handheld augmented reality: a shadowbox, a world-in-miniature metaphor, and an interactive tour. We validate these view management solutions through a concret…

Cited by 7 publications (2 citation statements). References 58 publications.
“…RagRug portrays the potential of cross-device connectivity with a visualization pipeline that combines IoT devices, data mediation via MQTT, Node-RED for filtering, and IATK for visual encoding and rendering in MR. Finally, VRIA [12], although predominately designed for VR, also works in AR settings [6], largely thanks to the ongoing development of the open WebXR specification. Our work here follows the same open web technology approach as VRIA.…”
Section: Interaction In Immersive Analytics
confidence: 99%
“…It is written in TypeScript using ReactJS and depends on react-three-fiber (R3F), a React wrapper for three.js. It also depends on the WebXR API for hand pose, the Web Speech API for speech recognition, and the AR.js library for QR code recognition. Wizualization creates the specifications for our grammar, Optomancy, which returns components from its R3F renderer, OptomancyR3F, to be rendered as visualization objects in the XR scene by Wizualization.…”
Section: System Summary
confidence: 99%