Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility 2013
DOI: 10.1145/2513383.2513440
Exploring the use of speech input by blind people on mobile devices

Cited by 111 publications (70 citation statements); references 20 publications.
“…[3], [7], [8], [11], [20], [26], [29]), or voice input (e.g. [4], [19]), along with accessible forms of output to provide feedback to the user (e.g. audio [7], [8], [16], [26], [29], and/or tactile output [15], [26]), either to the user's hand via the mobile device or via a separate wearable.…”
Section: A. Eyes-free Interaction Techniques (mentioning)
Confidence: 99%
“…screen glare, or interacting with the device under the table [12]), or deliberate obfuscation by the user (e.g., attempting to hide the screen in a bag [18] or a pocket [1] from a shoulder-surfing attack), is not well understood. While eyes-free interactions for different types of users and mobile devices have been studied by researchers in the past [3], [4], [7], [8], [11], [15], [16], [19], [20], [24], [29], studies have yet to investigate performance with common authentication mechanisms when the phone is out-of-view, and user coping strategies to enter passcodes in an eyes-free manner.…”
Section: Introduction (mentioning)
Confidence: 99%
“…Special attention was given to the message editing process to make it more accessible. Indeed, Azenkot and Lee's survey showed that people with visual impairment spend 80% of their time editing when using speech recognition, which can be frustrating (Azenkot and Lee, 2013).…”
Section: Case Study: SMS Application (mentioning)
Confidence: 99%
“…Input modalities for VIPs are classified into eight categories: 1) hand gestures [7-9], 2) upper body movement [10], 3) lower body movement [11, 12], 4) environmental context [13-17], 5) braille [18-20], 6) multi-touch [21-25], 7) speech [26-29], and 8) keyboard/mouse [30, 31]. Feedback from the system (i.e., output) was classified into four broad categories: 1) vibration [7, 9, 10, 13, 14], 2) speech [15, 19, 21, 22, 26, 31], 3) sound [8, 11, 12, 16, 20, 21, 29], and 4) braille display [17, 18, 25, 27].…”
Section: Related Work (mentioning)
Confidence: 99%