2009
DOI: 10.1007/978-3-642-03658-3_32

How Not to Become a Buffoon in Front of a Shop Window: A Solution Allowing Natural Head Movement for Interaction with a Public Display

Abstract: The user interaction solution described in this paper was developed in the context of an Intelligent Shop Window (ISW), with the aim of offering users an interaction solution in which system responses are triggered by naturally gazing at products. We analyzed the possibility of realizing such an interaction solution with gaze tracking and concluded that remote, calibration-free eye tracking is still a subject of academic research, but that head tracking could be used instead. We argue that convent…
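
As a rough illustration of the head-tracking approach described in the abstract, the sketch below (not from the paper; the coordinate conventions, pose fields, and product regions are assumptions made here for illustration) shows how a tracked head pose could be intersected with the display plane and matched against product regions to decide which item the visitor is facing.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: map a tracked head pose to a point on the shop window.
# The display plane is z = 0, units are metres, and the origin sits at the
# display's bottom-left corner. Fields and regions are illustrative only.

@dataclass
class HeadPose:
    x: float      # head position in display coordinates
    y: float
    z: float      # distance in front of the display (z > 0)
    yaw: float    # rotation about the vertical axis, radians (0 = facing display)
    pitch: float  # rotation about the horizontal axis, radians (0 = level)

def head_ray_hit(pose: HeadPose):
    """Intersect the head's forward ray with the display plane z = 0."""
    dx = math.sin(pose.yaw) * math.cos(pose.pitch)
    dy = math.sin(pose.pitch)
    dz = -math.cos(pose.yaw) * math.cos(pose.pitch)  # negative z points at the display
    if dz >= 0:
        return None  # head is turned away from the window
    t = -pose.z / dz  # ray parameter where z reaches 0
    return pose.x + t * dx, pose.y + t * dy

# Illustrative product regions on the window: name -> (x_min, y_min, x_max, y_max).
PRODUCT_REGIONS = {
    "shoes": (0.2, 0.8, 0.8, 1.4),
    "bag":   (1.0, 0.8, 1.6, 1.4),
}

def product_under_head_pointing(pose: HeadPose):
    """Return the product region the head is currently pointing at, if any."""
    hit = head_ray_hit(pose)
    if hit is None:
        return None
    hx, hy = hit
    for name, (x0, y0, x1, y1) in PRODUCT_REGIONS.items():
        if x0 <= hx <= x1 and y0 <= hy <= y1:
            return name
    return None
```

In practice the hit point would likely be smoothed or gated by a dwell time before triggering content, but those details are implementation choices rather than anything prescribed by the abstract.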

Cited by 13 publications (8 citation statements)
References 17 publications

“…Conversely, gaze attention over wider visual fields is often approximated by head pointing and ignores eye-in-head movement. Various works have used face pose tracking for gaze pointing on large displays [34,35]. Early work on VR explored gaze directed input but solely based on head orientation [30,59].…”
Section: Gaze Tracking Based On Eye Versus Head Movement (mentioning)
confidence: 99%
“…A large body of work provides general insight into how people interact with displays in public settings (e.g., Opinionizer [1], Citywall [15] and Looking glass [12]), which is of importance to inform practical deployment of Sideways. A variety of projects have used head orientation towards large displays in presumed approximation of what people look at [11,13,19]. However, Mubin et al found in an "interactive shop window" study that only few users aligned their heads with their gaze [11].…”
Section: Eye-based Interaction (mentioning)
confidence: 99%
“…A variety of projects have used head orientation towards large displays in presumed approximation of what people look at [11,13,19]. However, Mubin et al found in an "interactive shop window" study that only few users aligned their heads with their gaze [11]. Other work has focused on low-cost extension of public displays for gaze pointing however still requires a calibration phase prior to interaction [16].…”
Section: Eye-based Interaction (mentioning)
confidence: 99%
“…This fits with the very high standard deviation (see Table 1) in gaze estimation for far locations. This is mainly caused by the large variability in head movement propensity [17]. If we also take a look at the gaze estimation between targets with minimal distance (e.g., far-left EM-C T4: 3.39°, T5: 1.14° compared to HO T4: 7.52°, T5: 2.08°), we can see that HO performs much worse than EM.…”
Section: Gaze Estimation (mentioning)
confidence: 99%
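
The angular errors quoted above (for example 3.39° versus 7.52°) are typically computed as the angle between the estimated viewing direction and the true direction to the target. A minimal sketch of that computation, with purely illustrative vectors rather than values from the cited study, is:

```python
import math

def angular_error_deg(estimated, target):
    """Angle in degrees between two 3-D direction vectors (hypothetical helper)."""
    dot = sum(e * t for e, t in zip(estimated, target))
    norm_e = math.sqrt(sum(e * e for e in estimated))
    norm_t = math.sqrt(sum(t * t for t in target))
    cos_angle = max(-1.0, min(1.0, dot / (norm_e * norm_t)))  # clamp for safety
    return math.degrees(math.acos(cos_angle))

# Illustrative values only: a head-orientation estimate versus the true target direction.
estimate = (0.10, 0.02, -1.0)
truth    = (0.05, 0.00, -1.0)
print(f"angular error: {angular_error_deg(estimate, truth):.2f} deg")
```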