The use of mobile computers, also known as smartphones, has skyrocketed in the past decade. Although smartphones have been around since the 1990s, in the form of Personal Digital Assistants (PDAs), they began to truly reach the masses in 2007 with the introduction of Apple’s touch screen iPhone. Since then, sales of mobile computers have increased year over year. Smartphones certainly have their benefits: portability, web access, camera and video capabilities, and applications for nearly anything you can think of, including GPS, calendars, and games. For all the conveniences that mobile computers offer, though, an already marginalized subset of the population has been slow to partake in this technological advancement: the disabled.
The Disabled and Mobile Computers
2011 was the first year that smartphone sales outpaced sales of personal computers (PCs). According to market analyst firm Canalys, roughly 488 million smartphones were shipped worldwide that year, compared to 415 million PCs. Understanding how individuals with disabilities, particularly those with visual impairments, can take advantage of flat touch screens with minimal physical buttons seems a tall task.
Thankfully, not all touch screens are created equal. Some use sound waves, others use force sensors. Geoff Walker, a touch industry expert at Walker Mobile, says that there are 18 distinct touch technologies currently available, varying in size, accuracy, reliability, durability, and cost.
On a basic level, placing your finger (or a stylus) on a touch screen changes the state the device is monitoring (e.g., sound or light waves). A finger physically blocks the waves, allowing the mobile system to detect the location of the touch. Touch screens, however, have removed the tactile feedback that Braille provides. Physical keyboards can be used by the blind (many have Braille printed on each key to indicate a letter or character), but virtual keyboards don’t offer that luxury. This makes it seem as though touch screen technology is a greater detriment than benefit to individuals with disabilities.
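The beam-interruption principle can be illustrated with a toy model of an infrared-grid touch screen, one of the simpler optical designs: beams cross the surface in rows and columns, a finger blocks one or more beams in each direction, and the intersection of the blocked beams gives the touch location. This is a minimal sketch for illustration only; the function name and the boolean-grid representation are assumptions, not any real device’s firmware.

```python
# Toy model of an infrared-grid touch screen. Each list holds one boolean
# per beam: True means the beam reached its sensor, False means a finger
# interrupted it. The crossing point of blocked beams locates the touch.

def locate_touch(row_beams, col_beams):
    """Return (col, row) of the touch, or None if no beam is blocked."""
    blocked_rows = [i for i, ok in enumerate(row_beams) if not ok]
    blocked_cols = [j for j, ok in enumerate(col_beams) if not ok]
    if not blocked_rows or not blocked_cols:
        return None  # nothing is touching the screen
    # A fingertip may block several adjacent beams; use the middle one.
    row = blocked_rows[len(blocked_rows) // 2]
    col = blocked_cols[len(blocked_cols) // 2]
    return (col, row)

rows = [True, True, False, True]         # row beam 2 interrupted
cols = [True, False, False, True, True]  # column beams 1 and 2 interrupted
print(locate_touch(rows, cols))          # -> (2, 2)
```

Note that nothing in this scheme depends on sight: the hardware reports *where* the screen was touched, which is exactly the signal assistive software can build on.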
Mobile computers, though, have offered a solution for the visually impaired: built-in screen readers. A screen reader is a form of assistive technology that attempts to interpret what is being displayed on a user’s screen, and then presents the information to the user via text-to-speech, sound icons, or a Braille output device. Screen readers are already widely used by the visually impaired on PCs, and they are now available on most smartphones as well. In this sense, screen readers have reduced the need for Braille (though Braille remains the preferred medium for some individuals) and provided another way to convey information to the visually impaired.
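The core job a screen reader performs can be sketched in a few lines: walk the structure of a page and flatten it into a linear sequence of phrases to be spoken aloud. The sketch below is a toy illustration only; the `ToyScreenReader` class is hypothetical, and real screen readers such as Apple’s VoiceOver or Google’s TalkBack are vastly more sophisticated.

```python
from html.parser import HTMLParser

class ToyScreenReader(HTMLParser):
    """Toy sketch of a screen reader: turn page structure into a
    linear sequence of phrases that a speech engine would read out."""
    def __init__(self):
        super().__init__()
        self.spoken = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img":
            # An image is only "visible" to the reader via its alt text.
            self.spoken.append("Image: " + attrs.get("alt", "unlabeled"))
        elif tag == "a":
            self.spoken.append("Link:")
        elif tag in ("h1", "h2", "h3"):
            self.spoken.append("Heading:")

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.spoken.append(text)

reader = ToyScreenReader()
reader.feed('<h1>News</h1><img alt="A red bus"><a href="/more">Read more</a>')
print(" ".join(reader.spoken))
# -> Heading: News Image: A red bus Link: Read more
```

Even this toy version shows why page structure matters: the reader can only announce what the markup tells it, which is the crux of the accessibility problem discussed next.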
Mobile screen readers, for all they offer, still suffer from the same constraint that PC screen readers do: they only work well when web pages are designed with accessibility in mind. Inaccessible web pages limit the ability to interact with online content as much on smartphones as they do on PCs. Mobile developers and webmasters need to ensure that their products and online content are accessible so that the visually impaired can take advantage of existing assistive technologies. The disabled and visually impaired can and do use smartphones; the question is whether those who create mobile content and applications will allow them to access it.
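One concrete, checkable piece of what “accessible” means is that every image carries alternative text for a screen reader to announce. A minimal audit for that single rule can be written with Python’s standard-library HTML parser; the `AltTextAuditor` name is illustrative, and real accessibility checkers test far more than this.

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Count <img> tags that lack an alt attribute -- one small piece
    of what it means for content to work with screen readers."""
    def __init__(self):
        super().__init__()
        self.missing = 0
        self.total = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.total += 1
            if "alt" not in dict(attrs):
                self.missing += 1  # a screen reader has nothing to say here

auditor = AltTextAuditor()
auditor.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
print(f"{auditor.missing} of {auditor.total} images lack alt text")
# -> 1 of 2 images lack alt text
```

A check like this costs a webmaster minutes to run, which underlines the article’s closing point: the barrier is rarely technical capability, but whether content creators choose to apply it.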