Proceedings of the 22nd International ACM SIGACCESS Conference on Computers and Accessibility 2020
DOI: 10.1145/3373625.3418023
PantoGuide: A Haptic and Audio Guidance System To Support Tactile Graphics Exploration

Cited by 7 publications (6 citation statements) | References 24 publications
“…Their generation process was divided into five basic parts: (1) the input data, which are the raw data presented in the chart; (2) the user input, which specifies characteristics that can be set or triggered by the user inside the user interface; (3) the user interface, which provides an interactive SVG file within the preview; (4) the rendering process, which receives the input data from the user interface to generate the output data presented to the user; and (5) the design guidelines, which depend heavily on user input, particularly on the raw data [51]. Chase et al. [52] proposed a system that provides audio guidance and haptic skin-stretch feedback to the dorsum of a blind or visually impaired (BVI) user's hand while the user explores a tactile graphic overlaid on a touchscreen. The system supports two teaching scenarios (synchronous and asynchronous) and two guidance interactions (point-to-point and continuous), demonstrated in two applications: a bar chart and a tactile graphic of a marble rolling down an inclined plane.…”
Section: Currently Available Methods Using Audio
confidence: 99%
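The point-to-point and continuous guidance interactions described in the statement above can be pictured as simple cue computations over touchscreen coordinates. The following is a minimal sketch under illustrative assumptions: the function names (point_to_point_cue, continuous_cue), millimetre coordinates, and the arrival radius are hypothetical and do not come from the published PantoGuide implementation.

```python
import math

def point_to_point_cue(hand_xy, target_xy, arrival_radius_mm=5.0):
    """Point-to-point guidance: a unit direction pulling the hand toward a single
    target, or None once the hand is within the arrival radius."""
    dx, dy = target_xy[0] - hand_xy[0], target_xy[1] - hand_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= arrival_radius_mm:
        return None  # target reached; the system could switch to an audio cue
    return (dx / dist, dy / dist)

def continuous_cue(hand_xy, path):
    """Continuous guidance: a unit direction pulling the hand back toward the
    closest point on a polyline path (e.g., the incline in the marble graphic)."""
    def closest_on_segment(p, a, b):
        abx, aby = b[0] - a[0], b[1] - a[1]
        denom = abx * abx + aby * aby
        t = 0.0 if denom == 0 else max(0.0, min(1.0, ((p[0] - a[0]) * abx + (p[1] - a[1]) * aby) / denom))
        return (a[0] + t * abx, a[1] + t * aby)

    best = min((closest_on_segment(hand_xy, a, b) for a, b in zip(path, path[1:])),
               key=lambda q: math.dist(q, hand_xy))
    dx, dy = best[0] - hand_xy[0], best[1] - hand_xy[1]
    dist = math.hypot(dx, dy)
    return (0.0, 0.0) if dist < 1e-6 else (dx / dist, dy / dist)

# Example: guide the hand from (120, 80) mm to a bar at (40, 200) mm,
# and keep it on an incline drawn from (20, 180) mm to (200, 60) mm.
print(point_to_point_cue((120.0, 80.0), (40.0, 200.0)))
print(continuous_cue((100.0, 150.0), [(20.0, 180.0), (200.0, 60.0)]))
```

In such a system the returned direction would be rendered as skin stretch on the dorsum of the hand, with audio layered on top for confirmation.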
“…School art teachers showed interest in applying the design as an educational instrument in the classroom. The analysis showed that additional audio feedback is an effective way to make tactile graphics easier for BVI individuals to perceive [7,11,51-53,61,64].…”
Section: Currently Available Methods Using Audio
confidence: 99%
“…This approach is limited by the restricted precision of placing markers on the coil array and by the impossibility of scaling the technology to other tactile graphic readers. Another navigation-interface alternative is to use additional hand-wearable technologies that provide haptic feedback during pinpointing interactions [8,36-38]. Some of these technologies have also been extended for sighted people exploring 3D virtual environments [39,40].…”
Section: Tactile Graphic Readers
confidence: 99%
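For the hand-wearable alternatives mentioned in the last statement, one common pattern is to map the distance between the fingertip and the target into actuator intensity during pinpointing. The sketch below is hypothetical: the 0..1 intensity scale, the 150 mm range, and the function name are illustrative assumptions, not the interface of any of the cited systems.

```python
import math

def pinpoint_intensity(hand_xy, target_xy, max_range_mm=150.0):
    """Map fingertip-to-target distance into a 0..1 haptic intensity:
    silent beyond max_range_mm, strongest when directly over the target."""
    dist = math.dist(hand_xy, target_xy)
    return 0.0 if dist >= max_range_mm else 1.0 - dist / max_range_mm

# Example: fingertip at (90, 40) mm, target label at (100, 55) mm.
print(round(pinpoint_intensity((90.0, 40.0), (100.0, 55.0)), 2))  # ~0.88
```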