Proceedings of the SIGCHI Conference on Human Factors in Computing Systems 2012
DOI: 10.1145/2207676.2207735
The user as a sensor

Cited by 92 publications (5 citation statements)
References 27 publications
“…While the adoption of navigation technology among people with visual impairments suggests that such technologies are helpful, we still have limited understanding of how people choose navigation technologies and use them in their everyday lives. Researchers have developed navigation technologies to support blind users, but most prior research focuses on a singular viewpoint, whether the sensory output method (e.g., sound [16]), navigation scenario (e.g., unfamiliar indoor environment [9]), or suitability of specific technology (e.g., smartphone accelerometers [4]). Absent in this work is the role that individual differences play in choosing and using navigation technology; we should expect that people with vision impairments are individuals, with individual preferences, and not simply an aggregate collection of users.…”
Section: Introduction
confidence: 99%
“…Although widespread solutions are still not a reality indoors, there are several efforts to support indoor localization and blind navigation assistance [17,57]. There is an increasing tendency toward approaches in which users do not require any hardware besides their own devices, for instance by making use of smartphone sensors [16,21]. In addition, camera-based approaches can use the user's (or a specialized) device to guide them to a particular target [4,18,41] or to detect and avoid obstacles [19,39,49,62,66].…”
Section: Indoor Navigation Assistance
confidence: 99%
“…Previous indoor navigation designs can be broadly grouped into three categories: dead-reckoning (e.g., Foxlin 2005; Li et al. 2012); sensor-based (e.g., Hub et al. 2004); and beacon-based (e.g., Chawathe 2008; Buchanan 2010). Each of these solutions has its own trade-offs, as summarised by Fallah et al. (2012, 2013). Most notably, dead-reckoning techniques degrade in accuracy over time as errors accumulate; beacon-based versions often involve large-scale augmentation of the physical environment; and sensor-based approaches require considerable computational power and custom hardware.…”
Section: Indoor Navigation
confidence: 99%
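
The dead-reckoning trade-off noted in the statement above, that position error grows as small per-step errors accumulate, can be illustrated with a minimal sketch. The step length, heading-noise level, straight-line ground truth, and the dead_reckon helper below are illustrative assumptions for demonstration only, not taken from any of the cited systems.

import math
import random

def dead_reckon(steps, step_length=0.7, heading_noise_deg=2.0, seed=0):
    """Integrate noisy step-and-heading estimates into a 2D position.

    Hypothetical setup: the true path runs straight along the x-axis,
    while each heading reading carries a small Gaussian error, so the
    estimated position drifts further from the truth as steps accumulate.
    """
    rng = random.Random(seed)
    est_x, est_y, heading = 0.0, 0.0, 0.0  # estimated pose; true heading is 0
    for step in range(1, steps + 1):
        heading += math.radians(rng.gauss(0.0, heading_noise_deg))
        est_x += step_length * math.cos(heading)
        est_y += step_length * math.sin(heading)
        # Distance between the estimate and the true position after this step.
        error = math.hypot(est_x - step_length * step, est_y)
        if step % 50 == 0:
            print(f"step {step:4d}: position error {error:6.2f} m")

if __name__ == "__main__":
    dead_reckon(steps=300)

Running the sketch shows the reported error growing with the number of steps, which is the accumulation problem that beacon-based and sensor-based alternatives trade away at the cost of environmental augmentation or extra hardware.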