CHI Conference on Human Factors in Computing Systems 2022
DOI: 10.1145/3491102.3517698
FaceOri: Tracking Head Position and Orientation Using Ultrasonic Ranging on Earphones

Cited by 21 publications (4 citation statements)
References 45 publications
“…The optimization of the eye-tracking algorithm is needed to make gaze contribute more to the performance. Multi-modal data obtained by earphones and smart glasses can be used to infer head orientation and position [28,29] to facilitate the eye-tracking algorithm.…”
Section: Discussion and Future Work
Mentioning confidence: 99%
“…Recent work has used in-ear sensors for health applications [11][12][13] and activity tracking [39,52]. Prior work has also explored various interaction modalities like ultrasound sensing [68] and on-face interaction [70] for in-ear devices. The closest to our work is Clearbuds [14], which focuses on the task of enhancing the speech of the wearer using synchronized audio signals from two wireless earbuds.…”
Mentioning confidence: 99%
“…Using Channel Impulse Response (CIR) for data representation, UltraGesture achieves a range resolution of 7 mm and recognizes 12 fine-grained hand gestures with an average accuracy of over 99%. In addition to hand gesture recognition, ultrasonic-based HAR systems have also been built for a variety of other applications such as in-air digit recognition [64] and head tracking [65], exhibiting high potential in HAR tasks featuring short-distance interaction and fine-grained movements. In the wearable domain, IMU is the dominant or even the sole sensor for motion sensing in current commercial products.…”
Section: A. Sensor-based Human Activity Recognition
Mentioning confidence: 99%
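
The last statement quotes a range resolution of roughly 7 mm for CIR-based ultrasonic sensing. As a rough illustration only, and not a description of UltraGesture's or FaceOri's actual implementation, the sketch below applies the standard round-trip resolution formula c / (2B); the speed of sound and the ~24 kHz effective bandwidth (e.g., the usable band at a 48 kHz audio sampling rate) are assumed values chosen to show how a figure near 7 mm can arise.

    # Illustrative calculation (assumed parameters, not from the cited papers).
    SPEED_OF_SOUND_M_S = 343.0       # approximate speed of sound in air at 20 C
    EFFECTIVE_BANDWIDTH_HZ = 24_000  # assumed usable ultrasonic bandwidth

    def range_resolution_m(bandwidth_hz: float, c: float = SPEED_OF_SOUND_M_S) -> float:
        """Round-trip range resolution of a wideband ranging signal: c / (2B)."""
        return c / (2.0 * bandwidth_hz)

    if __name__ == "__main__":
        res = range_resolution_m(EFFECTIVE_BANDWIDTH_HZ)
        print(f"Range resolution: {res * 1000:.1f} mm")  # prints ~7.1 mm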