
Key Components of Touch-Screen Proficiency from a Non-Visual Perspective: Acquisition of Gesture and Mental Mapping Skills

Tom Dekker, Founder/Proprietor, iHabilitation Canada

Background
Despite my long history in assistive technology as a rehabilitation instructor, it took me several long and frustrating weeks to learn proper use of the VoiceOver gestures on my first iPhone in late 2010 and early 2011. It took that long to start thinking of VoiceOver as something other than a cumbersome serial interface that required flicking from one screen element to the next, as if I were manipulating a string of beads.

Over time, however, I began to visualize the touch-screen interface in my mind’s eye, even though I couldn’t see it. As the positional relationships of screen elements were added to my mental maps, it became easier and easier to imagine and navigate the four-sided work surface using VoiceOver.

For people with vision, who can also see the gestures being executed, this process occurs naturally and almost instantly.

In my role as a rehabilitation instructor, I discovered that the demand for iOS instruction was growing rapidly. I therefore found myself devoting a great deal of thought to the question: How can I make the learning process less frustrating for my students than it was for me? Thus, iHabilitation Canada was born.

iHabilitation teaches gestures through a combination of physical demonstration and use of the keyboard help mode. It develops mental mapping skills through use of SpeedDots screen protectors in conjunction with our “Feel’n’See Screenshots for iOS” book and its free companion videos.

1. Introduction
This document briefly outlines the iHabilitation process: a set of techniques and resources for teaching gestures and mental mapping skills that build the muscle memory needed for greater proficiency with Apple’s touch-screen interface. These methods can be applied equally to Android and other touch-screen interfaces as they become more accessible.

2. Acquiring Gesture Skill
Reading the description of a gesture is one thing; learning to perform it accurately and consistently is quite another. This is particularly the case for those who have never interacted with a touch screen or been around people who use one. What is a “flick”, and when does it become a “drag”? What is the difference between those gestures? Understanding how to impart this information most effectively within a non-visual context is vital.
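For readers who also develop apps, the flick/drag distinction can be made concrete in code. The following is a minimal Swift sketch, not part of the iHabilitation materials and with a hypothetical class name, showing how iOS itself separates the two gestures: a flick is a discrete swipe that fires once after the finger lifts, while a drag is a continuous pan that reports the finger’s position for as long as it stays on the glass.

    import UIKit

    // Hypothetical demo view controller illustrating the flick/drag split.
    class GestureDemoViewController: UIViewController {

        override func viewDidLoad() {
            super.viewDidLoad()

            // A flick is discrete: one quick directional movement,
            // recognized only after the finger has already lifted.
            let flick = UISwipeGestureRecognizer(target: self,
                                                 action: #selector(handleFlick(_:)))
            flick.direction = .right
            view.addGestureRecognizer(flick)

            // A drag is continuous: the finger stays down and every
            // change in position is reported while it moves.
            let drag = UIPanGestureRecognizer(target: self,
                                              action: #selector(handleDrag(_:)))
            // Give a quick flick the chance to win before a drag begins.
            drag.require(toFail: flick)
            view.addGestureRecognizer(drag)
        }

        @objc private func handleFlick(_ sender: UISwipeGestureRecognizer) {
            print("Flick recognized") // fires exactly once
        }

        @objc private func handleDrag(_ sender: UIPanGestureRecognizer) {
            print("Dragging at \(sender.location(in: view))") // fires repeatedly
        }
    }

The require(toFail:) line is the developer-level echo of the learner’s question: the system waits to see whether the touch was quick enough to count as a flick before treating it as a drag.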

Working with a blind student, whether child or adult, it quickly becomes apparent that traditional “hand-over-hand” or “hand-under-hand” methods do not work for teaching a touch-screen user interface. The potential for extra fingers to touch the screen and create unwanted results is far too high, negatively affecting learning.

Good progress can often be made, however, by asking the student to imagine the palm or back of their hand as the screen, on which the instructor can then execute the gesture. This method has proven very effective in conveying what the touch screen “expects” from the user’s hand.

Once a general idea of the gestures has been developed, one can use the “keyboard help” mode of an iDevice or Apple computer to refine the accuracy and consistency of gesture execution. This works especially well for blind instructors, since the help mode only describes the gesture performed without actually executing it. In this way, the help mode assists both teacher and student by providing 100 percent accurate biofeedback regarding gesture performance. It is possible to toggle between the help and “live” modes, so if a gesture proves difficult, one can quickly switch to help mode for review and practice, then go live again and continue working with the application. The “VO Starter” and “LookTel Tutorial” apps are both very useful for gesture practice and familiarization with controls.
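The value of this “describe, don’t execute” pattern can also be sketched in code. The short Swift example below is an assumption of how such a practice screen might be built, not the actual implementation of keyboard help or of the apps named above: the recognized gesture is spoken back to the user rather than triggering any action.

    import UIKit

    // Hypothetical practice screen: gestures are named, never executed.
    class GesturePracticeViewController: UIViewController {

        override func viewDidLoad() {
            super.viewDidLoad()
            let doubleTap = UITapGestureRecognizer(target: self,
                                                   action: #selector(describeDoubleTap))
            doubleTap.numberOfTapsRequired = 2
            view.addGestureRecognizer(doubleTap)
        }

        @objc private func describeDoubleTap() {
            // Instead of performing the double-tap action, speak its name,
            // giving the learner accurate feedback on what they just did.
            UIAccessibility.post(notification: .announcement,
                                 argument: "Double tap")
        }
    }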

3. Mental Mapping and the Touch Screen
Many blind people are still visual learners. People who are adventitiously blind can easily visualize a screen full of elements if it is described to them. Many who are congenitally blind are nonetheless visual learners by heredity. With proper early intervention and exposure to tactile resources, many in this second group develop their spatial perception skills to a level where they can indeed “see” the full screen in their mind’s eye.

These are the blind people who can genuinely appreciate tactile maps and diagrams. (I was one of them, achieving top marks in geography by getting my hands on maps to learn more whenever possible.)

As I closed my own conceptual gaps during the iDevice learning process, I remembered some excellent tactile Windows diagrams I had seen and thought, “Why shouldn’t these exist for iDevices?” The result was “Feel’n’See Screenshots for iOS”.

These tactile and large-print screenshots show many native iOS app screens, facilitating the development of a visual frame of reference for both blind and low-vision users.

The objective is to cultivate mental mapping processes and strengthen identification of positional relationships: self-to-object (the relationship of icon to finger) and object-to-object (the positional relationships of screen elements to each other). This also helps with orientation: moving precisely in left-right and up-down directions. Students quickly learn that diagonal motions can produce confusing and often surprising results.

After being introduced to the screenshots, students are encouraged to start dragging one finger around the screen, as if reading a kind of “talking braille”. Many can employ this method with one hand while exploring a tactile screenshot with the other. They are amazed to discover the status bar across the top, a grid of at least 16 icons, and a dock containing another four icons at the bottom. Though taken for granted by sighted users, this is the kind of “picture” that can instantly close a major conceptual gap. Closing such gaps greatly accelerates skill acquisition and raises the level of proficiency.

With this process in place, one can move through the screenshots, learning about the various functions: phone, calendar, contacts, etc.

4. “Hand-Over-Shoulder” Technique
In situations where both student and teacher are blind and both are using VoiceOver, trying to speak directions or hold a conversation over the combined VoiceOver speech can be confusing. My solution was to devise a form of non-verbal instruction by designating a screen-shaped area in an upper quadrant of the student’s back, just below the shoulder.

When necessary, coaching occurs by performing the gesture simultaneously on this “back screen” and on my iDevice. The student executes the gesture they felt and, if successful, gets the correct result from their device.

By this method, and with minimal verbal interaction, I am able to pilot a student through the motions of learning a new app while VoiceOver does almost all the talking. This approach has proven effective for both beginners and advanced users, and can be particularly useful when working with a deaf-blind student who uses a braille display.

5. Mental Mapping and Muscle Memory
SpeedDots (www.speeddots.com) produces screen protectors for the various models of iPhone and iPad. These protectors provide raised-dot markers for the most common controls, including send/done, cancel/back, the telephone keypad and the typing keyboard, as well as the five tabs located across the bottom of many app screens. Once a protector is applied to the face of the iDevice, most users find almost immediately that they can locate many controls and orient themselves to the working surface more quickly. As they begin moving about the screen more quickly and accurately with these markers, muscle memory develops, further improving speed and accuracy over time.

6. Supporting Videos
Peer-support podcasting is one of the fastest-growing resources in assistive technology. iHabilitation provides such resources specifically for beginners and those working with them. These podcasts are also available as YouTube videos demonstrating the gestures and showing how to use the tactile guides in conjunction with the iDevice.

High-quality audio narration is provided for blind users, while the video component demonstrates the techniques for those who may be assisting a blind person in their iDevice exploration. All of this material can be found on iHabilitation’s website (www.ihabilitation.ca).

7. Conclusions
Development and application of the methods described here have been underway since late 2011, with direct interaction and observation of approximately 50 students, either via individual instruction or through the Balance For Blind Adults iOS user group (Toronto). Ranging in age from 18 to 75 years, most of these students were able to acquire or increase proficiency in the skills described, namely:

  • gesture performance;
  • using “drag” to replace “flick” as a primary navigation mode;
  • increased spatial awareness, resulting in more rapid adaptation when learning new apps; and
  • increased ability to work within a visual frame of reference when receiving assistance or support from sighted persons.