Advancing access to non-visual graphics : haptic and audio representations of 3D information and data


Abstract
Graphical representations are pervasive in Science, Technology, Engineering, and Math (STEM) content, often serving as cognitive aids for spatial reasoning tasks. Examples include data charts, 3D models and simulations, physics diagrams, and maps. The vast majority of this content is highly visual and often inaccessible to people who are Blind and Visually Impaired (BVI). Alternative representations are mostly text-based (e.g., image descriptions) or static and difficult to update, produced by skilled specialists (e.g., embossed graphics). The inaccessibility gap is growing as more content moves to digital formats that offer sighted users capabilities for interactive manipulation and exploration. This dissertation aims to increase access to non-textual information for BVI people by investigating representations of spatial information that exploit haptic and/or auditory perception to support enhanced understanding, interaction, and manipulation. I explore these accessibility challenges in two activity domains: 3D design (Part 1) and data visualization on the web (Part 2).

In Part 1, I investigate supporting both 3D information consumption and authoring through the development of a novel 2.5D haptic shape display. The display consists of a 12×24 grid of actuated pins that can render shapes at interactive rates. With this system, I conduct studies assessing haptic object recognition and spatial navigation of large-format layouts. Working with BVI co-designers, I develop interaction techniques that support 3D design tasks, allowing users to create and manipulate models. An evaluation demonstrates how the proposed 3D design workflow enables BVI makers to quickly understand 3D design concepts and apply them in creating their own designs.

In Part 2, I investigate increasing the accessibility of data visualizations on the web. To understand BVI users' current experiences when accessing data-driven media on the web, I present findings from two studies: an online survey and a remote contextual inquiry. I outline challenges with existing accessible data representations and discuss the compromises users make between the modalities and technologies they would prefer for access and those practically available to them. To address the need for data representations on the web that are updatable and multimodal, I propose an algorithmic approach to generating data narratives, which combine textual descriptions and sonification (the mapping of data to non-speech sounds). This approach is driven by co-design workshops with BVI users and by principles of auditory perception. An evaluation demonstrates that data narratives support users in better understanding and gaining insights from auditory graphics compared to the standard representation.

Description

Type of resource text
Form electronic resource; remote; computer; online resource
Extent 1 online resource.
Place [Stanford, California]
Publisher [Stanford University]
Copyright date ©2021
Publication date 2021
Issuance monographic
Language English

Creators/Contributors

Author Siu, Alexa Fay
Degree supervisor Follmer, Sean
Thesis advisor Follmer, Sean
Thesis advisor Landay, James A., 1967-
Thesis advisor Okamura, Allison
Degree committee member Landay, James A., 1967-
Degree committee member Okamura, Allison
Associated with Stanford University, Department of Mechanical Engineering

Subjects

Genre Theses
Genre Text

Bibliographic information

Statement of responsibility Alexa F. Siu.
Note Submitted to the Department of Mechanical Engineering.
Thesis Thesis (Ph.D.)--Stanford University, 2021.
Location https://purl.stanford.edu/pz042ng6051

Access conditions

Copyright
© 2021 by Alexa Fay Siu
License
This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported License (CC BY-NC 3.0).
