Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, 2019
DOI: 10.1145/3290607.3312925
Byte.it

Cited by 9 publications (4 citation statements)
References 6 publications
“…Zhong et al and Kuzume et al both used in-ear bone conduction microphones to detect the occurrence of a tooth click [19,48]. Bitey [3] expands upon this work to allow for distinguishing different pairs of teeth clicking, and Byte.it [40] demonstrates that the interaction technique can be implemented with other commodity sensors like an accelerometer or gyroscope. Additionally, Xu et al proposed a system of clench interactions that differentiate different degrees of force when biting down and found that users appreciated the clench interaction as a hands-free technique [45].…”
Section: Teeth Interactions
confidence: 92%
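The statement above notes that Byte.it shows tooth-click input can be sensed with commodity motion sensors such as an accelerometer or gyroscope. As an illustration only, and not the authors' actual detection pipeline, a minimal sketch of spike-based click detection on accelerometer magnitude might look like the following; the sampling rate, threshold, and refractory window are assumed values.

```python
import numpy as np

def detect_tooth_clicks(accel, fs=100.0, threshold=1.5, refractory_s=0.3):
    """Return sample indices of candidate tooth-click events.

    accel        -- (N, 3) array of accelerometer readings in g
    fs           -- sampling rate in Hz (assumed, not from the paper)
    threshold    -- spike threshold on the detrended magnitude, in g (assumed)
    refractory_s -- minimum spacing between detected clicks, in seconds
    """
    # Acceleration magnitude with gravity removed by subtracting the mean
    # (a crude high-pass step; a real system would use a proper filter).
    mag = np.linalg.norm(accel, axis=1)
    mag = mag - mag.mean()

    refractory = int(refractory_s * fs)
    events, last = [], -refractory
    for i, value in enumerate(np.abs(mag)):
        # Treat a short, sharp spike in jaw-borne acceleration as a click,
        # and ignore further spikes inside the refractory window.
        if value > threshold and i - last >= refractory:
            events.append(i)
            last = i
    return events

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 5 s trace at 100 Hz: gravity plus noise, with two injected spikes.
    t = np.arange(0, 5, 0.01)
    accel = rng.normal(0, 0.05, size=(t.size, 3)) + np.array([0.0, 0.0, 1.0])
    accel[120] += 2.0   # simulated click at t = 1.2 s
    accel[300] += 2.0   # simulated click at t = 3.0 s
    print(detect_tooth_clicks(accel))  # -> [120, 300]
```

A thresholded spike detector like this is only the simplest possible baseline; distinguishing which pair of teeth clicked (as in Bitey) or the clench force (as in Xu et al.) would require a learned classifier over richer features.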
“…Many of the final selected gestures involve tongue movements to different areas of the mouth which have been shown to be possible to detect in [9], [29], and [15]. We envision that recent work using sensors around the ear including acoustic sensing [1], electric field sensing [24] and motion sensor [40], are promising avenues to making mouth microgestures more adoptable. Some of the microgestures may be subtle enough to pose a challenge to detect with a single sensing modality.…”
Section: Technological Context
confidence: 99%
“…A custom 3D-printed enclosure houses the components and provides two separate channels for the speaker and pressure sensor. [19,52]. The use of the ear is underexplored, and EarRumble enables interaction in which users can do less and are not impaired in other actions, as well as provide hidden interaction that is undetectable by, and non-disruptive to, others.…”
Section: Subtle and Discreet Interactions
confidence: 99%