2015
DOI: 10.3390/machines3030173
Move, Hold and Touch: A Framework for Tangible Gesture Interactive Systems

Abstract: Technology is spreading in our everyday world, and digital interaction beyond the screen, with real objects, allows us to take advantage of our natural manipulative and communicative skills. Tangible gesture interaction leverages these skills by bridging two popular domains in Human-Computer Interaction: tangible interaction and gestural interaction. In this paper, we present the Tangible Gesture Interaction Framework (TGIF) for classifying and guiding works in this field. We propose a classification of g…

Cited by 17 publications (7 citation statements) · References 74 publications
“…Globally speaking, touch-based applications tend to be preferred for long-term strategic management actions. In contrast, a "fully digitalized" MCDU (Multifunction Control and Display Unit), or any other short-term tactical controller, would result in a less-integrated tool [3], where no link (causal, mechanical) can be made to what is going to happen: "We could say for all short term management, for example everything on the MCDU which is in front … in my opinion … it's not acceptable that this is digital […] when you touch [a physical button], if you get the wrong button and you're flying at the same time, you say to yourself: 'something is not right'. We'll make the connection."…”
Section: The Pilot Viewpoint On Touch-based Tools
confidence: 99%
“…However, the first two levels are most commonly involved in past experiments (Veelaert, Du Bois, Moons and Karana, 2020) and in the development of the demonstrator form (Veelaert et al, n.d.), as they are usually approached from a semi-quantitative perspective, in contrast to a more qualitative evaluation of emotions or the observation of performative actions (Angelini et al, 2015; Cutkosky, 1989; Lederman and Klatzky, 2009). However, additional experiential qualities can be interesting for characterization experiments too, as used in previous work focussing on (recycled) plastics (Veelaert et al, n.d.; Veelaert, Du Bois, Moons, De Pelsmacker, et al, 2020).…”
Section: Dependent Variables
confidence: 99%
“…Interaction with a Gesture-Based User Interface (on mobile devices such as smartphones and tablets) is based on simple gestures using one, two or more fingers, which the user drags to create symbols on the touch screen; the interface translates these into commands (Angelini et al 2015). For example, a crossed sign drawn with a finger can be translated as an action to delete something in an application.…”
Section: Physical Gesturing and Interaction
confidence: 99%
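The gesture-to-command translation described in the excerpt above can be sketched minimally as a lookup from a recognized gesture symbol to an application command. This is an illustrative assumption, not code from the cited framework; the gesture names and commands are hypothetical, and a real system would first classify raw touch traces into such symbols.

```python
# Minimal sketch (illustrative only): map recognized touch-gesture symbols
# to application commands, as the excerpt describes. Gesture and command
# names are hypothetical placeholders.
GESTURE_COMMANDS = {
    "cross": "delete",      # a crossed sign drawn with a finger -> delete
    "swipe_left": "back",   # one-finger swipe -> navigate back
    "pinch": "zoom_out",    # two-finger pinch -> zoom out
}

def translate(gesture: str) -> str:
    """Translate a recognized gesture symbol into a command string.

    Unrecognized gestures fall through to "ignore" so the interface
    does nothing rather than raising an error.
    """
    return GESTURE_COMMANDS.get(gesture, "ignore")
```

In practice the recognizer sits in front of this table, so adding a new gesture only requires a new dictionary entry rather than a code change in the dispatch logic.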