Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology 2011
DOI: 10.1145/2047196.2047279
TapSense

Abstract: We present TapSense, an enhancement to touch interaction that allows conventional surfaces to identify the type of object being used for input. This is achieved by segmenting and classifying sounds resulting from an object's impact. For example, the diverse anatomy of a human finger allows different parts to be recognized - including the tip, pad, nail and knuckle - without having to instrument the user. This opens several new and powerful interaction opportunities for touch input, especially in mobile devices, …
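As a rough illustration of the approach the abstract describes (segmenting impact sounds and classifying them), the sketch below shows one plausible pipeline: detect a tap by an amplitude threshold, compute coarse spectral features, and classify with a nearest-neighbor model. This is not the authors' implementation; the sample rate, window size, threshold, band count, labels, and choice of k-NN are illustrative assumptions.

```python
# Minimal sketch (not the TapSense authors' pipeline): classifying the "type"
# of a tap (e.g., fingertip vs. pad vs. nail vs. knuckle) from its impact sound.
# Assumes mono audio at 44.1 kHz and a set of labeled training taps.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

SAMPLE_RATE = 44100
WINDOW = 2048  # ~46 ms of audio captured around each impact


def segment_taps(audio, threshold=0.1):
    """Return fixed-length windows starting where the signal first exceeds
    an amplitude threshold (a crude stand-in for onset detection)."""
    taps, i = [], 0
    while i < len(audio) - WINDOW:
        if abs(audio[i]) > threshold:
            taps.append(audio[i:i + WINDOW])
            i += WINDOW          # skip past this impact before searching again
        else:
            i += 1
    return taps


def features(window):
    """Log-magnitude spectrum summed into coarse bands: hard objects (nail,
    pen tip) tend to carry more high-frequency energy than soft ones (pad)."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    bands = np.array_split(spectrum, 32)             # 32 coarse frequency bands
    return np.log1p([band.sum() for band in bands])


def train(labeled_windows):
    """labeled_windows: list of (audio_window, label) pairs, with hypothetical
    labels such as "tip", "pad", "nail", "knuckle"."""
    X = [features(w) for w, _ in labeled_windows]
    y = [label for _, label in labeled_windows]
    clf = KNeighborsClassifier(n_neighbors=3)
    clf.fit(X, y)
    return clf


def classify_stream(clf, audio):
    """Segment a recorded stream and label each detected tap."""
    return [clf.predict([features(w)])[0] for w in segment_taps(audio)]
```

In practice the feature set and classifier would need tuning per surface and microphone placement; the point here is only the overall segment-then-classify structure.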

Cited by 180 publications (17 citation statements indexed). References 22 publications. Citing statements span 2012–2022.

Citation statements (ordered by relevance):
“…Optical methods such as frustrated total internal reflection (FTIR) [13] and depth cameras [14,39] are commonly used for large-scale touch screens. Acoustic methods have also been shown in touch-interactive surfaces [25] and on the body [15,16]. Other technologies include resistive methods [17,37], electric field sensing [43], impedance profiling [32], time-domain reflectometry [40] and electric field tomography [41].…”
Section: Touch Sensing
Citation type: mentioning (confidence: 99%)
“…Harrison et al. [25] presented TapSense. It can differentiate interactions made with a pen, a fingertip, or even a fingernail.…”
Section: Related Work
Citation type: mentioning (confidence: 99%)
“…Boring et al [6] explored the extent to which changes in contact point size are purposeful, and how changes in the centroid of a contact point can provide a further parameter for touch input [5]. In addition to using the geometrical properties of a contact point to extend the expressivity of touch input, prior work [20,29] has also demonstrated that the sound made when making contact with the touchscreen can be used to differentiate touches made by different parts of the hand.…”
Section: Related Work
Citation type: mentioning (confidence: 99%)
“…While some of the technologies discussed are aimed at making performance of common discrete commands quicker or more natural [5,6,11,13,14,19,21,24,29], others demonstrate a potential to support more expressive interactions [22,35]. However, the majority of these approaches rely on bespoke or impractical hardware configurations [3,10,27,32,44] and many only extend the expressiveness of touch interaction with a single or small number of new capabilities [12,17,20,26,38,42,46,47]. Moreover, none of them introduce a conceptual model for expressive touch interactions that can guide other researchers interested in this field of research and stimulate new ideas for how such technologies can be used.…”
Section: Related Work
Citation type: mentioning (confidence: 99%)