
nonvisual digital

 
 

What is lost in translation when digital content is interpreted by accessible technology?  

 

 

The metaphors and compositional strategies used in digital design are dependent on vision. Accessible technology provides a window into screen content for blind and visually impaired users. However, it fails to translate the experience and sensibility of popular interfaces. Through research, experimentation, and prototyping, I investigated the need for a more impressionistic visual-to-nonvisual digital translation.

 
 
 

Defining the problem


Modern accessibility tools make it nearly impossible to employ the spatial cognition and sensitivity to graphic hierarchy that are essential to visual reading.


Advances in the development of Braille tablets are on the horizon. However, only about 10% of the legally blind population is able to read Braille.* Further, Braille itself can be problematic because word emphasis is contextual rather than formal. For example, a bold word is indicated only by a standard Braille character; its form does not stand out or demand special attention.

*NFB Literacy Report

 

Empathy Experiences


 
 

To begin my research, I engaged in a variety of empathy activities. These included a blindfolded coffee run, a white cane lesson, a jog around an accessible track, and an attempt to function without the privilege of my own vision correction.

 
 
 

CONTEXTUAL RESEARCH - VISION IMPAIRMENT



I spoke with optometrists, therapists, and patients at nearby rehab centers and community gatherings. I learned about tools and techniques used to support individuals with minimal remaining eyesight. Rather than using a screen-reader, most opted to strain their eyes to see digital content or, in the case of some seniors, to forgo it altogether.

 
 
 
 

CONTEXTUAL RESEARCH - BLINDNESS


The consensus was that while the basic layout of screen elements can be inferred, visual metaphors like that of sliding paper do not translate to sound.



At a National Federation of the Blind gathering, I met with individuals who identify as blind. Equipped with a tactile slideshow, I posed questions about the translation of graphics and text to sound. My models addressed metaphors that sighted people use to think about the movement and organization of digital information.

 

visualizing voiceover


VoiceOver was the most ubiquitous screen-reader amongst the individuals I met. So, with a bit of practice, I learned to operate it myself. The following animations describe two ways to use VoiceOver to navigate through digital content; a short code sketch after the lists shows the kind of structure these gestures rely on.

non-linear

  • VoiceOver reads out items a user is physically touching.

  • A single touch or swipe will cause VoiceOver to announce items.

  • A double tap will activate, open, or edit items.

linear

  • VoiceOver reads screen items aloud one after another in response to sideways swipes.

  • Twisting one's fingers on the screen, as if turning a radio dial, provides options for different types of item containers (e.g. letters, words, headings, hyperlinks, default).

  • An up or down swipe steps through the screen items grouped within the selected container (for example, swiping up and down within "letters" might read "capital V, o, i, c, e").
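
To make the linear mode concrete, here is a minimal SwiftUI sketch of the kind of structure those rotor containers depend on. The view name, text, and URL are placeholders rather than anything from my prototypes; it simply illustrates the idea under those assumptions.

  import SwiftUI

  // A minimal sketch: explicitly declared headings and links are what let
  // VoiceOver's rotor containers ("headings", "links") jump between items,
  // while untagged text is simply announced in layout order.
  struct ArticleSketch: View {
      var body: some View {
          VStack(alignment: .leading, spacing: 12) {
              Text("nonvisual digital")
                  // Marked as a heading so the "headings" container can
                  // land on it during up and down swipes.
                  .accessibilityAddTraits(.isHeader)

              Text("Accessible technology provides a window into screen content.")

              // Links are gathered into the rotor's "links" container automatically.
              Link("NFB Literacy Report", destination: URL(string: "https://example.com/report")!)
          }
          // Without traits like these, every item is read in one flat
          // sequence, flattening the hierarchy a sighted reader would see.
      }
  }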

 

asking experts


I visited blind and visually impaired individuals in their homes and workplaces to learn about their attitudes towards VoiceOver and other screen-readers.


a steep learning curve

Despite her declining vision, Barbara has not learned to use a screen-reader. The transition to this technology requires a significant amount of practice and cognitive adjustment. She described screen-reader voices as "startling and fast."


diversity of sound

Cory listened to the fast, robotic voice of a screen-reader while taking calls on his cellphone. He told me that the diversity of the sounds made them comfortable to listen to at the same time.


a lack of delight

Molly explained that, to the best of her knowledge, there are no websites specifically formatted for more experimental or playful interaction with screen-readers.


visual thinking

Derek explained that he is a visual thinker. Even though his screen-reader reads items off in a linear format, he imagines their arrangement on his screen. This helps him understand their functionality.

 

Insights and observations


VoiceOver does not allow for the initial glance that helps visual readers quickly identify information and prepare a reading strategy.


  • The mismatch between visual screen items and VoiceOver gesture patterns creates a confusing mental model for seniors learning to use the technology.

  • The implied spatial organization of screen items supports an understanding of their functionality for sighted and blind people.

  • Only about 1% of blind individuals have congenital blindness; this means that most have at least a basic idea of what a screen layout might look like.

 

design solutions


 
 

In an effort to simulate the experience of casually scanning an interface before reading its contents, I designed a system in which different types of screen elements might be reflected by different sound qualities. A user could quickly pan over the screen with their thumb to get an overall sense of the type of information before them, how it is arranged, and where the most important content is located.
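
The rough Swift sketch below suggests how such a mapping might be prototyped. The element kinds, pitch and brightness values, and lookup function are placeholder assumptions, not a finished design, and actual audio synthesis (for example with AVAudioEngine) is left out.

  import CoreGraphics

  // Hypothetical classes of screen elements a layout might expose.
  enum ElementKind {
      case heading, bodyText, link, image, control
  }

  // A simple description of a sound, standing in for a real synthesized tone.
  struct SoundQuality {
      let pitchHz: Double     // higher pitch suggests higher placement in the hierarchy
      let brightness: Double  // 0...1, brighter timbre for interactive items
      let duration: Double    // seconds
  }

  // Hypothetical palette: each element type gets a distinct sound quality.
  let palette: [ElementKind: SoundQuality] = [
      .heading:  SoundQuality(pitchHz: 660, brightness: 0.9, duration: 0.20),
      .link:     SoundQuality(pitchHz: 520, brightness: 0.8, duration: 0.15),
      .control:  SoundQuality(pitchHz: 440, brightness: 0.7, duration: 0.15),
      .image:    SoundQuality(pitchHz: 330, brightness: 0.4, duration: 0.25),
      .bodyText: SoundQuality(pitchHz: 220, brightness: 0.2, duration: 0.10),
  ]

  // A screen element with its on-screen frame, as a layout engine might report it.
  struct ScreenElement {
      let kind: ElementKind
      let frame: CGRect
  }

  // As the thumb pans across the screen, find the element under the touch
  // and return the sound that should be triggered.
  func sound(at point: CGPoint, in layout: [ScreenElement]) -> SoundQuality? {
      guard let element = layout.first(where: { $0.frame.contains(point) }) else { return nil }
      return palette[element.kind]
  }

  // Example: a pan over a heading near the top of a hypothetical layout
  // returns the high, bright, short tone from the palette above.
  let layout = [
      ScreenElement(kind: .heading,  frame: CGRect(x: 0, y: 0,  width: 320, height: 60)),
      ScreenElement(kind: .bodyText, frame: CGRect(x: 0, y: 60, width: 320, height: 400)),
  ]
  let headingSound = sound(at: CGPoint(x: 40, y: 30), in: layout)

In a fuller prototype, the returned sound quality would drive a synthesizer as the thumb moves, so that a quick pan sketches out the page the way a glance does for a sighted reader.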

 
 
 
 
 
 
 
 
 
 

new questions


  • What would content feel or sound like if it were not translated, but designed specifically to be experienced non-visually?

  • How might the physical shape of a device affect the way a user interfaces with the information on it?

  • How might the way we interact with information change if the common rectangular format of its presentation were disrupted?