Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems 2015
DOI: 10.1145/2702123.2702530

Investigating the Information Transfer Efficiency of a 3x3 Watch-back Tactile Display

Abstract: A watch-back tactile display (WBTD) is expected to be a viable supplement to the limited user interface of a smartwatch. However, its design requires that many design parameters, such as tactor types and stimulus patterns, be determined. We conducted a series of experiments to explore the design space of a WBTD consisting of 3×3 tactors. We demonstrated that tactor types and the temporal patterns and locus of a stimulus have statistically significant effects on the efficiency of a WBTD. The experimental r…
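
For context, the information transfer (IT) of a tactile display is typically estimated from the stimulus-response confusion matrix collected in an identification experiment, and IT efficiency then relates the transferred bits to presentation time. A standard maximum-likelihood estimate of IT, given here as the general formulation rather than necessarily the exact one used in this paper, is

IT_est = \sum_{i=1}^{S} \sum_{j=1}^{S} \frac{n_{ij}}{n} \log_2\!\left( \frac{n_{ij}\, n}{n_{i\cdot}\, n_{\cdot j}} \right)

where S is the number of stimuli, n_{ij} is the number of trials in which stimulus i elicited response j, n_{i\cdot} and n_{\cdot j} are the row and column sums of the confusion matrix, and n is the total number of trials.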

Cited by 39 publications (13 citation statements) | References 14 publications
“…On-body computing opens up a wide variety of opportunities for interaction, e.g., leveraging the skin as a platform for interaction [21,63,64], using electrical muscle stimulation to move users' limbs [36] and providing feedback for prosthetic limbs [34]. Vibrotactile interfaces on the body for output are particularly attractive as they are not restricted to body locations that are visible, which leads to their use across a diverse range of body locations, e.g., on the hand [18-20, 33, 42], wrist [7,30,31,35], forearm [37,38,45,47,51,70], upper arm [3,4,58], back [23,41,60], stomach [28], thigh [56] and lower leg [9]. Their usage spans a wide range of interaction scenarios, such as speech communication [46,67,70], affective communication [43], progress monitoring [7], learning gestures [20], spatial guidance [19,33], motion guidance [51,56] and navigation [13,15,24].…”
Section: On-body Vibrotactile Interfaces (mentioning)
confidence: 99%
“…Their usage spans a wide range of interaction scenarios, such as speech communication [46,67,70], affective communication [43], progress monitoring [7], learning gestures [20], spatial guidance [19,33], motion guidance [51,56] and navigation [13,15,24]. Srikulwong and O'Neill [57], Meier et al. [40], Konishi et al. [27], Israr and Poupyrev [23], Tam et al. [59], Cauchard et al. [7], Leong et al. [34], Lee et al. [30], Lee and Starner [31], Liao et al. [35], Zhao et al. [70], Luzhnica and Veas [38], Luzhnica et al. [37], Pfeiffer et al. [45], Schönauer et al. [51], Reinschluessel et al. [47], Stratmann et al. [58], Bark et al. [4], Alvina et al. [3], Spelmezan et al. [56], Chen et al. [9], Wong et al. [67], Dobbelstein et al. [13], Karuei et al. [25], Cholewiak and Collins [12], Cholewiak et al. [11], Schneider et al. [49], Krüger et al. [28], Spelmezan [55], Ertan et al. [16], Aggravi et al. [1], Ho et al.…”
Section: On-body Vibrotactile Interfaces (mentioning)
confidence: 99%
“…The wearer's skin can also be used as an interface for providing input to, or receiving feedback from, wearable devices in a skin touch/feedback UI. Recent projects such as SkinWatch [11], Skin Drag [12], and the 3x3 WBTD [13] fall into this category. Gesture UI is a contactless type of UI operated in mid-air above the wearable device, such as AirTouch [14], and is usually based on gesture and activity recognition, as in zSense [15].…”
Section: B. Wearable UI/UX (mentioning)
confidence: 99%
“…Various studies on tactile sensation have been conducted wherein information is presented through the skin [1]-[3]. Lee [4] and Sawada [5] succeeded in recognizing patterns and letters for absolute-value information. However, these methods took time to present the information, and the user needed to concentrate on recognizing it.…”
Section: Introduction (mentioning)
confidence: 99%