Audio-only interfaces, facilitated through text-to-speech screen reading software, have been the primary mode of computer interaction for blind and low-vision computer users for more than four decades. During this time, the advances that have made visual interfaces faster and easier to use, from direct manipulation to skeuomorphic design, have not been paralleled in nonvisual computing environments. The screen reader–dependent community is left with no alternative means of engaging with our rapidly advancing technological infrastructure. In this article, we describe our efforts to understand the problems that exist with audio-only interfaces. Based on observing screen reader use for 4 months at a computer training school for blind and low-vision adults, we identify three problem areas within audio-only interfaces: ephemerality, linear interaction, and unidirectional communication. We then evaluate a multimodal approach to computer interaction called the Tangible Desktop that addresses these problems by moving semantic information from the auditory to the tactile channel. Our evaluation demonstrated that among novice screen reader users, the Tangible Desktop improved task completion times by an average of 6 minutes when compared to traditional audio-only computer systems.
From the white cane to the smartphone, technology has been an effective tool for broadening blind and low-vision participation in a sighted world. Despite this increased participation, individuals with visual impairments remain on the periphery of most sight-first activities. In this paper, we describe a multi-month, public-facing co-design engagement with an organization that supports blind and low-vision outrigger paddling. Using a mixed-ability design team, we developed an inexpensive cooperative outrigger paddling system, called CoOP, that shares control between sighted and visually impaired paddlers. The results suggest that public design, a DIY (do-it-yourself) stance, and attentiveness to shared physical experiences represent key strategies for creating assistive technologies that support shared experiences. CCS Concepts: • Human-centered computing → Accessibility theory, concepts and paradigms; Empirical studies in accessibility; Accessibility design and evaluation methods; Accessibility technologies.
In this work, we apply an activity theory lens to analyze nonvisual computing for blind and low-vision computer users. Our analysis indicates major challenges for users in translating the activities they are working towards into specific tasks to be completed in a manner the system can act on. Specifically, blind and low-vision students learning to use accessible technologies struggled with organizing their activities, tracking the history and status of their operations, and understanding how the system was acting underneath these interactions. We discuss how activity-centered design can be applied to nonvisual interfaces to better match user behavior in a computational system. CCS Concepts: • Human-centered computing → Haptic devices; Empirical studies in accessibility; Accessibility systems and tools; HCI design and evaluation methods; • Social and professional topics → People with disabilities;