In the last couple of days, Marco Arment and John Gruber have written essays on wearable technology and Google Glass, respectively. Both pieces have inspired me to start considering the accessibility ramifications of wearable technology, smartwatches in particular.
From an accessibility perspective, the first thing that springs to mind when I think of, say, an “iWatch” is a talking watch. Growing up, I knew several other visually impaired kids at school who wore talking watches, and they generally liked them. Talking watches are just that: regular watches that, with a press of a button, speak the time and/or date. But talking watches can’t do much beyond that essential function; they’re surely an aid, but a rather unsophisticated one. A smartwatch, on the other hand, is infinitely more useful: even the creepy-as-all-hell Galaxy Gear ad offers good insight into what such a device is capable of doing. (I think we can all agree that the Galaxy Gear is a dud, but that’s immaterial to the point I’m making here.)
It’s interesting, then, to ponder the possibilities of a smartwatch for someone with disabilities. An iWatch with Siri integration could, like the aforementioned talking watches, tell me the time with a press of a button. Likewise, were my eyes tired from looking at my iPhone’s screen, I could route my calls and/or iMessages to the watch and have Siri read them to me. (And even reply too, but Siri and speech impairments are another topic unto themselves.) One’s initial notion may be to dismiss these tasks as core functions, obvious examples of such a device’s features. But if you look at it from an accessibility point of view, you can see how an iWatch (or whatever) could truly be beneficial to a user with special needs. Pushing a button on the watch may be easier, motor-wise, than navigating the iPhone or iPod Touch, while listening to Siri would save the eye strain caused by looking at a phone’s screen for an extended period. An iWatch could even help the visually impaired navigate their surroundings via iBeacon, and help those with motor impairments by not requiring the user to directly interact with their phone every time a call or text comes in. As someone whose cerebral palsy causes more trouble than most with balance and carrying objects, I oftentimes have trouble responding to important messages in situations (e.g., the grocery store) where my hands are full and I can’t easily access my phone.
The biggest hurdle to the smartwatch’s success, I think, is the display. This is especially true for users like myself. Brightness and sharpness are imperative. A low-resolution, dimly lit display such as the one found on the Pebble absolutely will not work for me. It has to be Retina quality or bust. And, obviously, it can’t be too small either. I’m fortunate insofar as my vision could be much worse than it is, but if I can’t see the watch’s screen, its usefulness to me goes out the window. I’m sure I’m not the only one with prerequisites like these. The display on a smartwatch is crucial, even more so than the displays on other devices.
The ideas I’ve presented only scratch the surface of what’s possible, but I think they’re interesting nonetheless. Whatever happens in the wearable tech space going forward, device makers are going to have to factor in accessibility use cases. Assuming the purported iWatch runs iOS, what (if any) Accessibility features will Apple incorporate? How will they make an iWatch accessible to the visually or otherwise impaired? Will wearable devices be as revolutionary for the accessibility community as the iPhone and iPad have been? These questions remain unanswered for now, but as we enter 2014, they’re ones that anyone and everyone writing about technology should start thinking about.