Adjunct Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications 2018
DOI: 10.1145/3239092.3265961

Approach for Enhancing the Perception and Prediction of Traffic Dynamics with a Tactile Interface


Cited by 10 publications (20 citation statements)
References 23 publications
“…The data reported here were recorded as part of a larger study. Details on the entire experiment can be found in [34,35].…”
Section: Methods (mentioning; confidence: 99%)
“…13 participants took part in the entire experiment (12 males, mean age 33 [24 to 43]) [34,35]. Nine of the participants were categorized as highly experienced drivers while four were categorized as little-to medium experienced drivers based on the driven kilometers per year.…”
Section: Participants (mentioning; confidence: 99%)
“…On-body computing opens up a wide variety of opportunities for interaction, e.g., leveraging the skin as a platform for interaction [21,63,64], using electrical muscle stimulation to move users' limbs [36] and providing feedback for prosthetic limbs [34]. Vibrotactile interfaces on the body for output are particularly attractive as they are not restricted to body locations that are visible, which leads to their use across a diverse range of body locations, e.g., on the hand [18-20, 33, 42], wrist [7,30,31,35], forearm [37,38,45,47,51,70], upperarm [3,4,58], back [23,41,60], stomach [28], thigh [56] and lower leg [9]. Their usage spans a wide range of interaction scenarios, such as speech communication [46,67,70], affective communication [43], progress monitoring [7], learning gestures [20], spatial guidance [19,33], motion guidance [51,56] and navigation [13,15,24].…”
Section: On-body Vibrotactile Interfaces (mentioning; confidence: 99%)
“…Their usage spans a wide range of interaction scenarios, such as speech communication [46,67,70], affective communication [43], progress monitoring [7], learning gestures [20], spatial guidance [19,33], motion guidance [51,56] and navigation [13,15,24]. Srikulwong and O'Neill [57] Meier et al [40] Konishi et al [27] Israr and Poupyrev [23] Tam et al [59] Cauchard et al [7] Leong et al [34] Lee et al [30] Lee and Starner [31] Liao et al [35] Zhao et al [70] Luzhnica and Veas [38] Luzhnica et al [37] Pfeiffer et al [45] Schönauer et al [51] Reinschluessel et al [47] Stratmann et al [58] Bark et al [4] Alvina et al [3] Spelmezan et al [56] Chen et al [9] Wong et al [67] Dobbelstein et al [13] Karuei et al [25] Cholewiak and Collins [12] Cholewiak et al [11] Schneider et al [49] Krüger et al [28] Spelmezan [55] Ertan et al [16] Aggravi et al [1] Ho et al…”
Section: On-body Vibrotactile Interfaces (mentioning; confidence: 99%)