2014
DOI: 10.1109/tvcg.2014.2346311
MovExp: A Versatile Visualization Tool for Human-Computer Interaction Studies with 3D Performance and Biomechanical Data

Abstract: In Human-Computer Interaction (HCI), experts seek to evaluate and compare the performance and ergonomics of user interfaces. Recently, a novel cost-efficient method for estimating physical ergonomics and performance has been introduced to HCI. It is based on optical motion capture and biomechanical simulation. It provides a rich source for analyzing human movements summarized in a multidimensional data set. Existing visualization tools do not sufficiently support the HCI experts in analyzing this data. We iden…

Cited by 18 publications (7 citation statements)
References 40 publications
“…We used MATLAB, R and MovExp [18] for exploring the dataset. This section presents the main findings.…”
Section: Results
confidence: 99%
“…This lack of flexibility stems from several factors, including technical challenges [90,102], costs [72] and the lack of theoretical foundations that bridge external representations with internal cognitive processes [47,79,104]. Two other factors are seemingly conflicting.…”
Section: Reported Benefits and Critiques on Interaction
confidence: 99%
“…For this, dimensions of the body part can be extracted from a 3D scanning and, using a 3D design software, produce a 3D printable model for each individual user (Algar & Guldberg, 2013). Visualization tools can be used to understand human movements in terms of muscular loads and directions (Palmas, Bachynskyi, Oulasvirta, Seidel, & Weinkauf, 2014).…”
Section: Modeling and Printing Tangible Scaffolding in 3D
confidence: 99%