2016
DOI: 10.1007/978-3-319-40651-0_26

An Investigation of Leap Motion Based 3D Manipulation Techniques for Use in Egocentric Viewpoint

Cited by 11 publications (9 citation statements)
References 13 publications
“…This scale developed by Brooke [238] is a tool to quickly assign a global scale to that perspective and has been widely adopted over the years [239, 240]. There are several contexts where SUS was used [53, 133, 173, 198, 241]. For example, the study of Nestrov et al [53] assessed the perceived usability of LMC and Kinect 2.0 in a clinical context (control of radiological images during surgery).…”
Section: Discussion (mentioning; confidence: 99%)
“…Coelho and Verbeek [241] assessed pointing tasks in 3D VR space, comparing the LMC and a mouse as input devices and using SUS to compare the usability of the two input systems. Caggianese et al [173] compared two object manipulation methods with the LMC in VR space and used the qualitative data of SUS to compare the perceived usability of the two methods.…”
Section: Discussion (mentioning; confidence: 99%)
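The statements above all rely on the System Usability Scale for their comparisons. Brooke's scoring rule is standard: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to give a 0–100 score. The following is a minimal Python sketch of that computation, assuming ten responses on the usual 1–5 Likert scale; the function name and the example responses are illustrative and not taken from any of the cited studies.

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten Likert
    responses (1 = strongly disagree ... 5 = strongly agree).

    Standard Brooke scoring: odd items contribute (r - 1), even items
    contribute (5 - r); the sum is scaled by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError(f"item {i} out of range: {r}")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5


# Example: one participant's questionnaire for a hypothetical LMC condition.
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # -> 80.0
```

Per-participant scores computed this way are what studies such as those cited above then compare across input devices or manipulation methods.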
“…Leap-Motion has become very common in virtual and augmented reality; its efficiency, mobility and low cost make it accessible to the general public. Many research works have used it; we cite Bacim et al. (2014), Jiang et al. (2018), Jia et al. (2019), Kim and Lee (2016) and Caggianese et al. (2016). Wen et al. (2020) presented a low-cost glove that tracked the motions of human fingers; they combined several triboelectric textile sensors with an appropriate machine learning technique, which has great potential to realize complex gesture recognition with a minimalist-designed glove for comprehensive control in both real and virtual space.…”
Section: Hand Gestures (mentioning; confidence: 99%)
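The glove described by Wen et al. (2020) pairs per-channel sensor readings with a learned classifier to recognize gestures. The exact model is not specified in the statement above, so the sketch below is only a generic stand-in for that kind of pipeline, written with scikit-learn; the feature count, gesture classes, classifier choice and randomly generated data are all hypothetical placeholders, not the cited authors' method.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical dataset: each row is one gesture sample, each column one
# feature extracted from a triboelectric textile sensor channel.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))        # 300 samples, 8 sensor-channel features
y = rng.integers(0, 4, size=300)     # 4 gesture classes (placeholder labels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Scale the sensor features, then classify with an SVM; any classifier could
# stand in here -- the cited work does not prescribe this particular choice.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

In a real glove setup, the random arrays would be replaced by feature vectors computed from the triboelectric sensor signals, with one labeled sample per recorded gesture.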