Apple and the I-Word: Accessibility As Innovation

So often, the tech commentariat --- and a few of my Android-loving friends --- disparage Apple for a perceived "lack" of innovation, arguing that Cupertino has failed to deliver anything truly revolutionary since introducing the original iPad in 2010. Four years of nothing! Apple's lost its touch!, they say. This viewpoint, though, is flawed insofar as it myopically focuses on hardware: the iPod, the iPhone, the iPad, the Mac. To Apple's detractors, innovation is a purely tangible concept --- hence, that Apple has gone four years without releasing a breakthrough product, shiny and new, is surely a telltale sign that its ability to innovate is exhausted.

The problem with exclusively tying innovation to hardware is that it misses, obviously, the innovation that can be made in software. More specifically, I contend that an underrated aspect of Apple's ability to innovate is the strides made by its Accessibility efforts, particularly on iOS. (OS X's Accessibility features are equally impressive, but my experience with them is far less extensive.) The breadth and depth of iOS's Accessibility options are, in my opinion, a sign of true innovation: not only does Apple improve the feature set with each major release of the operating system, which in itself shows unwavering commitment to the accessibility community, but Accessibility also opens usability doors for the disabled and non-disabled alike.

While the case can certainly be made that a wide chasm lies between the quality of Apple's software and that of its hardware, it's my strong opinion that the build quality of the Accessibility features in iOS is on par with the industrial design of the devices on which they run. As a disabled user, I use these features every day, and they empower me to use my iPhone and iPad with as much delight and fluidity as my fully-abled family and friends do.
Though I have written about this sentiment numerous times, it's worth repeating. True innovation, I think, is best expressed through accessibility, because how else could I (and others in similar circumstances) use our devices? Accessibility is what makes using iPhones, iPads, and iPod touches possible for those with special needs, and Apple is to be commended for its relentless pursuit of innovation in this realm. Apple's continued refinement of iOS Accessibility may not be as lust-worthy or headline-grabbing as the diamond-cut chamfers and 64-bit A7 system-on-a-chip of the iPhone 5S, but it's innovation in its own right, and, I would argue, just as important and noteworthy. With this idea in mind, here are several instances in which Apple's accessibility innovations have profound real-world effects on the people who rely on them to operate their devices.

* * *

The Deaf and iOS: FaceTime, Hearing Aids, and More

The vast majority of my writing about iOS Accessibility has to do with vision-related problems and solutions: typography, user interface design, and the like. I think and talk about accessibility in visual terms so often that I sometimes forget there are other facets of accessibility on which I can speak authoritatively. One of these is accessibility for the hearing impaired. As a child of deaf adults, fluent in American Sign Language, I was exposed at an early age to various accessibility solutions that adapted the "hearing world" to the deaf world in which my parents lived: light-up phone ringers and doorbells, as well as standalone closed-captioning boxes. In hindsight, perhaps it was these experiences that laid the foundation for the writing I do today.

Given my background within the deaf community, it warms my heart when accessibility is highlighted in the context of the hearing impaired. Case in point: last year's "FaceTime Every Day" iPhone ad. Among the vignettes in the ad was one of a deaf woman talking to a family member or friend via FaceTime. It was touching --- so much so, in fact, that I interpreted what the woman said. Not to mention it's a great example of Apple's attention to detail, and a great nod to the accessibility community.

The bigger point is how transformative FaceTime is for the hearing impaired. FaceTime is a game-changer in the sense that it makes it conceivable for a deaf or otherwise hearing-impaired user to own an iPhone, because FaceTime makes it possible to place calls visually. (Another method of communication is, of course, SMS and iMessage.) Using FaceTime on an iPhone (or iPod touch) is certainly preferable to using, say, a TDD or a dedicated Video Relay Service (VRS) device. Anecdotally, the deaf friends I've spoken to locally have all expressed strong interest in iOS devices precisely because of FaceTime.
Even I, a hearing person, get goosebumps whenever I use FaceTime because it feels so surreal and futuristic. From what I gather, FaceTime's effect on the hearing impaired is even more awe-inspiring, which proves that FaceTime has accessibility merit all its own.

Related to FaceTime's accessibility is that of hearing aids. In iOS 6, Apple added support for hearing aids as part of its MFi (Made for iPhone) program, yet another initiative to bring the innovations of iOS and the iPhone to disabled users. As Heather Kelly writes for CNN Tech, Apple is working closely with hearing aid manufacturer ReSound to develop and bring to market enhanced hearing aids that work seamlessly with the iPhone. Features such as Live Listen, which allows focusing on a person's voice instead of the ambient noise around them, help users get the most out of not only their iPhone but their hearing aid as well.

There are other accessibility innovations, too. Apple retail stores are hosting accessibility workshops for the hearing impaired that demonstrate how best to use iOS features like Siri and Speak Selection to interact with hearing peers. One example is using Siri to dictate (type) a note of a Starbucks drink order, then showing the note to the barista --- much easier and less frustrating than awkwardly pointing at items or writing things down. The idea that the deaf can use a phone to communicate may seem oxymoronic at first blush, but these anecdotes prove that the iPhone is just as accessible to a deaf person as it is to a hearing person. Apple's innovations in this space make it so, and it's a wonderful thing.

Touching Without Taps: Switch Control

When I worked for the school district, we in the classroom would often help our students learn practical life skills; one activity we did a lot was cooking. Because several of our students suffered from severe motor issues, we would employ an augmentative and alternative communication (AAC) technique using switches. Plug the switch into an appliance, press its button, and the blender (or whatever) would magically turn itself on or off. These switches make kitchenware --- or anything else, really --- accessible to those who have trouble pushing standard buttons. In our case, they ensured that even the lowest-functioning students could participate with the least amount of support possible.

Apple has taken the concept of switches and applied it to iOS devices and Macs. Both iOS and OS X support Switch Control, whereby a motor-impaired user can use a switch --- or even their head --- to control their devices. Depending on how Switch Control is configured, the feature simulates multi-touch events such as taps and swipes via a connected button, acting as if the gesture had been registered on the touchscreen itself. There are videos on YouTube demonstrating this functionality, and in the recent iOS 7.1 update Apple included a feature that allows a device's camera to act as a trigger for switches. This is a very cool feature, and a wonderfully thoughtful use of the Camera API.

From my experience using switches in the classroom, I know firsthand how "magical" they are at elevating otherwise inaccessible devices to a Hey, I can do this! level. In fact, I personally know one person who uses his iPad exclusively via Switch Control, and he loves it. He often tells me that without it, an iPad would be pretty useless, because he's unable to interact directly with the touchscreen.
Thus, Switch Control has opened a door that never would have existed had Apple not included the feature in its operating systems.

Touch ID and iOS 7.1

I've made no secret of my love for the iPhone 5C, but a major factor in my upgrading to the 5S instead was Touch ID. After Apple revealed the 5S last September, I wrote a piece explaining how Touch ID could emerge as a dark-horse accessibility tool because of how it saves a visually and/or motor-impaired user from slaving over the passcode prompt. After using a 5S for the last six months, I can attest that my hypothesis about Touch ID's accessibility potential was spot on. The sheer act of simply resting my thumb on the Home button to unlock my phone is not only more convenient and secure, but has also spared me the anguish of squinting at the keypad and pecking in my passcode. It seems minor, but for someone like me, who feels that stress even for a few seconds, Touch ID has been nothing short of a godsend. It's nice knowing I'm not alone in my sentiments; like Markdown, I've found Touch ID to be a very happy, if unintentional, accessibility coincidence.

As for iOS 7.1, the new update includes some much-needed accessibility features and tweaks to the user interface that make using my devices much easier. The biggest change for me, Button Shapes notwithstanding, has been Bold Text. Enabling it (and rebooting your device) turns every bit of text system-wide --- right down to the status bar and "slide to unlock" prompt --- into boldface. While it's not pretty, Bold Text makes using my iPhone and iPad so much easier: emails and text messages are easier to read, and icons in, say, Mail are much more discernible than before. I really love it, and I can't foresee myself going back to the stock interface. iOS 7 still has plenty of design warts, to be sure, but Bold Text alleviates enough of the usability issues that I'm no longer wanting to throw my phone in a ditch every time I struggle to make out some text or buttons.
Again, Bold Text is likely to make UI designers the world over cringe, but it works. My only quibble is that your device must restart for the change to take effect. That's really weird.

* * *
The moral of this 2,000-word story is that the I-word --- innovation --- is, I feel, alive and well at Apple's headquarters. Accessibility exemplifies not only the interplay between hardware and software, but also Apple's ethos of making its products usable by everyone. Corny and trite though it may be, as a disabled person who deals with my afflictions every second of every day, I surely appreciate the innovations Apple has delivered in terms of Accessibility, and I look forward to seeing what comes in the future. Without such innovations, using my devices wouldn't be nearly as easy or fun. As I said at the outset, Accessibility may not be as glamorous or headline-grabbing as, say, an iWatch, but just because we haven't seen anything that lights up and makes noise in a while doesn't mean the innovation well at Apple HQ has dried up. Far from it, actually. Trust me.