Jason Snell’s First Impressions of His New iMac Pro

Jason posted his first impressions story to Six Colors last week:

It’s a 5K iMac, albeit in a slightly darker shade. I made my transfer of data using Migration Assistant via Thunderbolt, which meant I needed to dig up a Thunderbolt 3 to Thunderbolt 2 adapter. (Turns out I had one of those!) I look forward to compressing more video and denoising more audio. This could be the beginning of a beautiful friendship.

I’ll say it again: I’m not in the market for this machine, but that space gray sure is pretty.

John Gruber’s iPhone X Review

John’s review is quintessential Gruber: incredibly thoughtful and well-written.

This is the key passage to me:

The iPhone X is not the work of an overcautious company. It’s a risk to so fundamentally change the most profitable platform in the world. But Apple is gambling on the taste of the team who lived with the iPhone X during its development. Ossification is a risk with a platform as popular and successful as the iPhone — fear of making unpopular changes can lead a platform vendor to make no significant changes. Another risk, though, is hubris — making changes just for the sake of making changes that show off how clever the folks at Apple still are.

After two months using an iPhone X, I’m convinced Apple succeeded. The iPhone X is a triumph, a delightful conceptual modernization of a ten-year-old platform that, prior to using the iPhone X, I didn’t think needed a modernization. Almost nothing about the iPhone X calls undue attention to its cleverness. It all just seems like the new normal, and it’s a lot of fun.

Also of note, we have similar takeaways about the lack of Face ID on our iPads.

The Moat Between Apple Maps and Google Maps

Justin O’Beirne wrote (another) blog post comparing Apple and Google’s respective mapping apps. In this recent piece, he argues there’s a sizable moat between the two because Google has more AOIs, or “areas of interest.” Namely, buildings.

I don’t dispute Google has collected better data than Apple—and you can argue data is paramount in this context—but there is a flip side to this. In my usage, Apple Maps wins the day over Google. This is because (a) I live in the Bay Area, and as such, the data here is great; and (b) Apple Maps is far more visually accessible. As I tweeted earlier today, it’s a matter of perspective. In other words, data isn’t everything, even to maps. Accessibility, as ever, is a key factor too, so in my case, design and data make the difference.

Bloomberg: Apple Developing EKG Monitor for Watch

Good scoop by Alex Webb at Bloomberg:

Apple Inc. is developing an advanced heart-monitoring feature for future versions of its smartwatch, part of a broader push by the company to turn what was once a luxury fashion accessory into a serious medical device, according to people familiar with the plan.

A version being tested requires users to squeeze the frame of the Apple Watch with two fingers from the hand that’s not wearing the device, one of the people said. It then passes an imperceptible current across the person’s chest to track electrical signals in the heart and detect any abnormalities like irregular heart rates. Such conditions can increase the risk of strokes and heart failure and develop in about one-quarter of people over 40, according to the U.S. Centers for Disease Control and Prevention.

As someone who recently had a bit of a heart scare—my EKG results came back normal, thankfully—I find this incredibly exciting news. Every bit of Apple Watch news is exciting, as Apple is clearly invested in pushing the device’s health and fitness capabilities even further.

‘iMac Pro: The First Shoe Drops’

Jason Snell published a great piece at Six Colors on what the new iMac Pro means for Apple, customers, and developers. Jason puts everything nicely into perspective. As I’ve said on Twitter, I’m not even remotely close to being in the market for this machine, but I’d be all over the space gray accessories if they were available as standalone purchases.

See also: John Gruber, Matthew Panzarino, and Rene Ritchie on Apple’s new desktop.

Review: iPhone X

With the exception of the SE and the 8/8 Plus, I’ve reviewed every new iPhone since the iPhone 6 and 6 Plus debuted three years ago. I’ve concluded every review by saying that, in one way or another, that year’s model is the best, most accessible iPhone to date. However trite, I’ve stood by that assertion because it’s true: each model iterates on the last, bringing many improvements, and each successive model is accordingly more accessible than the one before.

The iPhone X is more than iterative—it’s a massive leap forward.

In a month using iPhone X—a review unit provided to me by Apple—I’ve found the device to be everything Apple proclaims it to be. It truly is the best, most accessible iPhone yet. It’s delightful to hold and use. It feels like the future, today.

That iPhone X is the “most accessible iPhone yet” holds new meaning. As I’ve lived with the phone, a thought that’s persisted in my mind is how much iPhone X is not merely the “future of the smartphone,” as Apple boasts, but how it represents a more accessible smartphone of the future. Between the new form factor, Face ID, and wireless charging, using iPhone X is a whole new experience for a disabled user such as myself. These technologies are bleeding-edge, but they’re so compelling that they make iPhone X the most accessible iPhone Apple’s ever made.

Face ID

I published a piece a few weeks ago in which I delve into the accessibility implications of Face ID, Apple’s new facial recognition system. Without rehashing the entire article here, the Cliff’s Notes version is that Face ID became most useful to me when I realized I had to turn off the Require Attention option.

The reason Require Attention doesn’t work for me is the strabismus in my left eye. Strabismus is a condition in which one or both eyes aren’t set straight, and it seems to wreak havoc on the iPhone X’s TrueDepth camera system. After I restored my phone from an iCloud backup, the Face ID setup process would go smoothly, but then I wasn’t able to log into my phone (getting into 1Password and using Apple Pay was also hard). The problem was the phone couldn’t tell whether I was looking at it, even though I knew I was, due to the strabismus. It was highly frustrating initially, but I learned something: I’m an edge case. For the first time using an Apple product, I felt I had to adapt to the technology rather than have the technology adapt to me.

Since turning off Require Attention, Face ID has worked like a charm. It has even started to recognize me at extreme angles, such as when I lean over the phone as it sits on my kitchen table. The only issue I continue to have is that I’m still not totally accustomed to holding the phone far enough away that it can see me. This is because I instinctively hold the phone close to my face in order to see comfortably. I have yet to develop consistent muscle memory to move my arm farther away, and have to consciously remind myself to do so whenever I get the haptic, can’t-log-you-in buzz on the Lock screen.

Overall, Face ID is terrific, particularly given how it’s a “1.0” version of the feature. For as much as I praised Touch ID on its merits as an accessibility tool, Face ID is markedly better. It’s very liberating going from tactilely authenticating with my thumb to simply looking at my phone. Face ID removes another point of friction, effectively making accessing iPhone X a “hands-free” endeavor. If you’re someone with certain fine-motor limitations, the advent of Face ID is a true revelation.

The side effect here is Face ID instantly makes Touch ID on my 10.5-inch iPad Pro feel downright anachronistic. For things like unlocking my phone or paying for a Lyft ride, Face ID is like performing a magic trick. To me, this is the utmost compliment; for as wonderful as Touch ID was (and still is), Face ID bests it in every meaningful way. And, again, this is a 1.0 take.

Size & Weight

In early 2016, I wrote about switching to the iPhone 6s Plus, saying “the ‘monster’ iPhone is the iPhone I’ve always wanted.” To this day, I maintain moving to the Plus was one of the best technological decisions I’ve ever made. If the iPhone X didn’t exist this year, I certainly would have upgraded to the 8 Plus.

I freely admit, however, loving the Plus for its screen didn’t come without a cost. There’s no getting around the fact it’s a beast physically, and as such, it’s not easily pocketable. Of course I acclimated to the size in hand and in my pocket, but that doesn’t mean it wasn’t a pain in the ass. As I said in my 6s Plus story, the benefits my low vision reaped from the big screen trumped any concerns over ergonomics and portability.

For its part, the iPhone X strikes me as a blend of both traits: it has the ergonomics and pocketability of an iPhone 6/7/8 and it has the (slightly) bigger screen of the Plus models. In practice, using iPhone X feels like using a “regular” iPhone; I’m able to use it one-handed and pocket it with no problems. It’s so great to have the best of both worlds, because I feel I’m not making a compromise to use the device. It’s all good.

Which is important considering the rumor Apple is planning to release a Plus variant of iPhone X in 2018. In all honesty, I don’t know if I’d be willing to make the switch to an iPhone X Plus. I’d surely like to check it out for journalism’s sake, but I really believe the iPhone X as it is right now is ideal. Like the 10.5-inch iPad Pro, I feel like I’m getting the best of both attributes: Small enough to be portable yet big enough to see.

Wireless Charging

Aside from Tap to Wake, wireless charging is actually my favorite aspect of iPhone X. As with Face ID, it’s felt incredibly freeing not having to plug in a cable to charge. All I need to do is literally put my phone down, and it charges.

My review kit from Apple included the Belkin charging pad, which has worked wonderfully for me. I’ve seen other reviewers on Twitter say the Mophie one is better, but Belkin’s has been fine in my usage. Maybe it’s just dumb luck, but I’ve never had an issue with finding the right “spot” to charge my phone.

In terms of accessibility, what makes wireless charging so great is, again, it removes a point of friction. In this case, wireless charging means I needn’t contend with plugging in a cable. Mundane as it is, this is a big deal. Given my low vision and cerebral palsy, plugging in my devices has always been somewhat of an adventure. I have to not only find the port with my eyes, but also use my fingers to plug in the cable. It’s not an easy task if your vision and fine-motor skills are lacking, as mine are. Thus, wireless charging is a lifesaver.

There are people who pooh-pooh wireless charging as not being any better than using Lightning, which is their prerogative, but that view overlooks the accessibility benefits. The bull case for wireless charging is exactly the same as the case for ditching the headphone jack. Losing the headphone jack on iPhone 7 meant I gained AirPods, which have revolutionized the way I listen to audio on iOS. Wireless headphones absolutely beat plugging in EarPods. Likewise, using a wireless charging mat like Belkin’s (or Apple’s forthcoming AirPower accessory) beats plugging in a Lightning cable. Put another way, a wireless charging mat makes charging my phone more accessible in the same way AirPods make listening to music and podcasts more accessible. I can plug in a cable, but I’d rather not. With iPhone X, I don’t have to.

New Home Gestures

The absence of a Home button on iPhone X means unlocking the phone, opening multitasking, and exiting apps are all done via a swipe-up gesture. I’ve had no problems performing the swipe; it became second nature to me after a matter of hours. The only bad thing is my brain goes wonky when I try to swipe on my iPad. It feels “broken” for a second before it dawns on me it’s different. iPhone X is magical not only technologically, but also in the way it makes newish devices feel old and decrepit.

One amusing aspect of iPhone X lacking a Home button is the popularity of using AssistiveTouch to “put back” the button. It’s a hack—a hack that works!—but it’s funny nonetheless. People are spending $1,000 on Apple’s flagship, cutting-edge smartphone only to “hack it” by giving it a pseudo Home button. On the bright side, though, it’s heartening to see more people discovering iOS’s accessibility features. I’ve long championed the idea that accessibility features are not exclusively the domain of users with disabilities. They’re equally beneficial to anyone, regardless of ability. AssistiveTouch and Dynamic Type are two examples of this, and I’m happy people are noticing. Accessibility helps everyone, not only the disabled.

Things I Don’t Like

I have only two complaints about iPhone X, both minor.

First, I dislike that I can’t see the headphone icon (🎧) in the status bar at a glance, as it’s helpful in confirming that audio is piping through my AirPods. Because the sensor housing (aka “the notch”) is in the way, there’s less room up there for information. I find I have to swipe down to invoke Control Center in order to see the icon, which is annoying. (The same goes for the battery percentage, which I like a lot; visually, it’s a far more concrete measure for me than the abstract battery icon alone.) I would rather see the headphone icon than, say, the cellular bars or “AT&T” whenever I’m listening to something.

Second, the home indicator (the horizontal line at the bottom of the screen) gets in the way of content. I understand why it’s there, but I don’t believe it needs to be persistent. I know how to get back to Springboard; I don’t need to see the home indicator all the time, obscuring whatever sits at the bottom of the screen. I’d like to see Apple add an option to hide it.

Bottom Line

The iPhone X is superb. It gives me the same feeling of delightfulness as my AirPods and Apple Pencil do. The phone oozes luxury with its all-glass and stainless steel design, and its OLED display is the best screen I’ve ever seen on any device. I cannot wait to see how Apple will refine iPhone X next fall.

I’ll say it once more: iPhone X is the best, most accessible iPhone yet.

Apple Confirms Shazam Acquisition

Last Friday, TechCrunch reported Apple was in talks to acquire Shazam. This morning, the company issued a statement to BuzzFeed on the deal:

“We are thrilled that Shazam and its talented team will be joining Apple,” Apple spokesperson Tom Neumayr said in a statement to BuzzFeed News. “Since the launch of the App Store, Shazam has consistently ranked as one of the most popular apps for iOS. Today, it’s used by hundreds of millions of people around the world, across multiple platforms.”

“Apple Music and Shazam are a natural fit, sharing a passion for music discovery and delivering great music experiences to our users,” Neumayr continued. “We have exciting plans in store, and we look forward to combining with Shazam upon approval of today’s agreement.”

I wonder if this deal was more about the engineering talent than about the service.

Jony Ive Resumes Daily Oversight of Design Team

Mark Gurman and Alex Webb, reporting for Bloomberg:

Apple Inc.’s Jony Ive, a key executive credited with the look of many of the company’s most popular products, has re-taken direct management of product design teams.

Ive, 50, was named Apple’s chief design officer in 2015 and subsequently handed off some day-to-day management responsibility while the iPhone maker was building its new Apple Park headquarters in Cupertino, California. “With the completion of Apple Park, Apple’s design leaders and teams are again reporting directly to Jony Ive, who remains focused purely on design,” Amy Bessette, a company spokeswoman, said Friday in a statement.

Now that Apple Park is complete, this makes sense.

(via Daring Fireball)

Improving the iPad Pro’s Smart Keyboard

Although I’m fond of my Touch Bar MacBook Pro, the majority of my computing is done on iOS, whether on my iPhone or my iPad. I often use the iPad (the 10.5-inch model) as a “laptop” to write blog posts and stories for my various freelance outlets. When I write on the tablet, my keyboard of choice is Apple’s Smart Keyboard. I like it for a few reasons, not the least of which is that it’s an “all-in-one” solution. Unlike something like Studio Neat’s Canopy, the Smart Keyboard attaches to the iPad—which means it’s also a cover, so I’m effectively carrying a single entity. That matters for accessibility insofar as it’s one thing to carry, as opposed to carrying my iPad plus the Canopy stand and the Magic Keyboard. For me, the Smart Keyboard is less weight and less cumbersome, and the Smart Connector has clear advantages over Bluetooth. Overall, the Smart Keyboard is a winner in my book.

Ergonomics and portability aside, I do like the Smart Keyboard as a keyboard for typing. The keys feel good, and not being a touch typist, I’m not concerned about words-per-minute or key travel. My cerebral palsy makes typing somewhat difficult for me, so the only way I can put “pen to paper,” so to speak, is to use the hunt-and-peck method. (I also do this on iOS’s virtual keyboard. In my almost 5 years as a member of the tech press, I’ve written innumerable articles using only the glass on my iPad.)

Despite my Smart Keyboard fandom, there are two key areas in dire need of improvement. The enhancements I suggest would make the accessory more accessible to me, and I hope the company considers adding them, if possible, in a future version.

The Caps Lock Key Needs an Indicator Light

One huge advantage the Magic Keyboard has over the Smart Keyboard is that its Caps Lock key has a light to indicate state. The Smart Keyboard desperately needs one, and I sorely miss it. It drives me crazy that, at the beginning of a sentence or when I type a proper noun, I can’t tell whether Caps Lock is on or off. I end up with many typos as a result, which disrupts the writing process because I always feel compelled to stop and correct myself.

The lack of a Caps Lock indicator is infuriating not only as a matter of convenience but, more importantly, as a matter of accessibility. The little light on my Magic Keyboard—or on my MacBook Pro’s keyboard, for that matter—is a clear visual cue signaling that Caps Lock is enabled. By contrast, the lack of said light on the Smart Keyboard means users effectively play a guessing game every time they write. You don’t know what state the key is in unless you press it once or twice; this abstraction is both annoying and inaccessible, at least for me. Adding a light would give me (and others) a concrete indication that Caps Lock is on or off. No more ambiguity or frustration.

The Keyboard Needs to Be Backlit

In a similar vein to how an indicator light on the Caps Lock key acts as a visual cue of state, backlit keys would make seeing the Smart Keyboard easier. The keyboard itself is relatively low contrast, as its dark grayish-blue color makes distinguishing individual keys somewhat tricky. I find this especially problematic in low-light environments or at night, because the dark styling of the keyboard can’t offset the ambient conditions. I’m able to manage despite this, but it isn’t an ideal situation.

Adding backlit keys to the Smart Keyboard would help immensely in boosting contrast, because the LED lighting itself would provide the contrast. It’d make typing in low-light conditions exponentially better, particularly for someone like me who’s visually impaired. I don’t know how feasible this would be, engineering-wise, but it would be awesome if Apple could pull it off. I’d be very pleased.

As I mentioned at the outset, the addition of a Caps Lock indicator and backlit keys to the Smart Keyboard would make the product more accessible—and more enjoyable. The Smart Keyboard seems to be a divisive product amongst iPad writers, but for my needs and tolerances, it’s nearly perfect. Enhance it with my suggestions, and I’d likely never use another keyboard again.

Apple Launches Apple Heart Study

Apple PR, in a press release:

Apple today launched the Apple Heart Study app, a first-of-its-kind research study using Apple Watch’s heart rate sensor to collect data on irregular heart rhythms and notify users who may be experiencing atrial fibrillation (AFib).

AFib, the leading cause of stroke, is responsible for approximately 130,000 deaths and 750,000 hospitalizations in the US every year. Many people don’t experience symptoms, so AFib often goes undiagnosed.

To calculate heart rate and rhythm, Apple Watch’s sensor uses green LED lights flashing hundreds of times per second and light-sensitive photodiodes to detect the amount of blood flowing through the wrist. The sensor’s unique optical design gathers signals from four distinct points on the wrist, and when combined with powerful software algorithms, Apple Watch isolates heart rhythms from other noise. The Apple Heart Study app uses this technology to identify an irregular heart rhythm.

The timing of this launch is kinda funny to me—as I write this, I’m wearing an EKG patch after telling my doctor about how my Apple Watch alerted me several times to spikes in my heart rate (>100 BPM) while I was doing nothing but sitting on the couch.

See also: CNBC’s Christina Farr’s interview with Apple COO Jeff Williams on the study.

The Second-Generation iPhone SE

Joe Rossignol reports for MacRumors about a rumor claiming Apple is readying an update to the iPhone SE for “the first half of 2018.” Like Michael Rockwell’s wife, my girlfriend absolutely loves her iPhone SE, and was ecstatic when I told her of this report. She strongly dislikes the larger iPhones, so the SE is a perfect device for her: she gets most of the modern features in a decidedly smaller form factor that she can comfortably hold and carry. Her 2016 model is still going strong, and I’m glad to see Apple committing itself to updating the line. It’s a great product in its own right.

The Touch Bar Makes the Mac More Accessible to Me

Marco Arment, creator of my favorite podcast app and a co-host of one of my favorite podcasts, published a piece yesterday in which he outlines what Apple should do with the next revision of the MacBook Pro. Regarding the Touch Bar, he writes:

Sorry, it’s a flop. It was a solid try at something new, but it didn’t work out. There’s no shame in that — Apple should just recognize this, learn from it, and move on.

The Touch Bar should either be discontinued or made optional for all MacBook Pro sizes and configurations.

Arment’s recommendation that Apple “back away from the Touch Bar” reiterates a popular sentiment in the Apple community: in blunt terms, the Touch Bar sucks. I’ve read many articles and heard many podcasts where prominent members of the community deride the feature and question its future. These criticisms, while legitimate, sting me personally because I like the Touch Bar.

It stings because, in my usage, I find the Touch Bar to be an invaluable tool when I’m using macOS. Where it shines considerably is as an alternative to keyboard shortcuts and the system emoji picker. Tapping a button on the Touch Bar is far more accessible than trying to contort my hands to execute a keyboard shortcut or straining my eyes searching for an emoji. In addition, the Zoom feature—one of the Touch Bar’s many accessibility features—makes seeing controls much easier.

However useful the Touch Bar is to me, I realize I’m only one data point. When I shared my experiences on Twitter, Shelly Brisbin responded by rightfully pointing out how accessibility “is very different for each user” while adding she has “no love” for the Touch Bar. Brisbin, like Arment and numerous others, doesn’t like the feature and has no use for it. Which is the whole problem, I suppose—not enough people are in the “I like it” camp for the Touch Bar. Even Apple itself seems less enthused about the feature, given how High Sierra shipped this fall without much iteration in this regard. Maybe they really are “backing away.”

Whatever happens to the Touch Bar going forward, I’ll continue to use it and like it. While I spend most of my computing time on iOS, my time on the Mac is made better and more accessible by the Touch Bar’s presence. I’m only one person, but that’s my take.

Shelly Brisbin is absolutely correct when she says accessibility differs from person to person. In sharing my experiences with the Touch Bar, though, I want to show the feature isn’t an abject failure. It has real utility, and it’s technically extremely well done. I think a lot of the Touch Bar haters overlook what it offers in terms of accessibility, at least as I experience it.

I’ll be sad if the Touch Bar is abandoned or discontinued, but again, let the record show that it helps me and I’m a big fan.

‘Apple’s Next Laptop Should Run iOS’

Jason Snell, in this week’s More Color column for Macworld:

Would an iOS laptop be for everyone? Absolutely not, but there are lots of people who might want a laptop but don’t need anything more than what iOS offers. Apple will still make MacBooks, and hopefully has plenty of innovation yet to come in that area. And some people really do prefer working with laptops over tablets, all other things being equal—not just writers, but people who watch a lot of video. My daughter is a great example: She loves watching Netflix on her laptop and has refused to consider switching to an iPad.

I’d be interested in such a product; it’s very intriguing.

What Face ID Means for Accessibility

When Apple introduced Touch ID with the iPhone 5s in 2013, I wrote a piece in which I posited how the fingerprint reader would be beneficial in an accessibility context. I wrote, in part:

What I see Touch ID doing is helping people with the aforementioned acuity/motor issues by allowing them to simply use their thumbprint (or other finger) to unlock their phone, password-free. More specifically, Touch ID would free users from the struggle of manually entering in their passcode.

My idea here is not so much of convenience (which is nice) but rather of usability. I know many folks with vision- and motor-related issues who bemoan iOS’s passcode prompt because not only does it take time, but also entering in said code isn’t necessarily an easy task. In fact, more than a few lament this so often that they forego a passcode altogether because it’s time-consuming and a pain (sometimes literally) to enter.

Four years later, the advent of Face ID in the iPhone X represents the next step in biometric security. But it’s something else too—for as great as Touch ID has been in terms of security, convenience, and accessibility, Face ID is even better. In my brief time with iPhone X so far, I have found Apple’s facial recognition technology to best Touch ID in virtually every meaningful way. Not to mention it’s pretty damn cool knowing I have the ability to unlock my phone and buy things with my face.

Living on the bleeding-edge is fun.

Facing My Face ID Conundrum

In my first impressions story, I noted how Face ID on iPhone X has been “the most revelatory aspect” of the device thus far. What’s revelatory about it is how it taught me something about myself: namely, that I’m an edge case. For the first time using an Apple product, I have felt like I’ve been forced to adapt to the technology rather than have the tech adapt to me.

Here’s the thing. I have a condition called strabismus, which means one or both of the eyes are not set straight. For me, it’s the left eye—ironically, my strong eye—and it seems to wreak havoc with the TrueDepth camera system. In my initial attempts at setup, I could not get Face ID to unlock my phone. The setup process went smoothly—Face ID successfully mapped my face—but it was unable to recognize me when unlocking the phone or logging into an app like 1Password. It was highly frustrating.

As Face ID is the marquee feature of iPhone X, this was bad.

After some troubleshooting and testing, however, I found a solution: I determined I’m one of those users for whom requiring “eye contact” with iPhone X just will not work. Hence, I went into Face ID’s settings and turned off the Require Attention feature (Settings > Face ID & Passcode > Require Attention for Face ID). With Require Attention disabled, Face ID works like a charm. Unlocking my phone, logging into 1Password, and paying with Apple Pay are all effortless.

The only caveat is that I’m still not used to holding my phone far enough away for Face ID to read my face. Because of my low vision, I instinctively hold my phone close to my face because I need it close to see. Face ID obviously can’t see me at that distance, so I tend to get the haptic, can’t-log-you-in “head shake” a lot. I’ve had the iPhone X for only about two weeks now, so it’ll take me a bit more time to develop new muscle memory. I can deal with this, though, because I know the technology isn’t faulty, nor did Apple give me a lemon of a review unit, as I initially feared. Everything works as intended, as designed—I just need to learn new habits.

Especially with iPhone X, there’s ten years of iPhone convention to unlearn.

Why Face ID Beats Touch ID

So what makes Face ID even more accessible than Touch ID?

For one thing, setup is far faster and less taxing. Enrolling in Touch ID is by no means difficult, but it is relatively slow and “precise.” iOS prompts you to move your finger this way and that way, and will bug you when you don’t follow directions. If you’re someone with limited fine-motor skills, getting Touch ID set up can be a literal pain along with being a figurative one.

By contrast, setting up Face ID feels more streamlined and less tedious. While moving your head around “like you’re drawing a circle with your face,” as Apple described it to me, can be difficult for individuals with certain gross motor limitations, there is an accessibility option to eliminate that step. (Instead of moving your head around to get the depth map, the system takes a single shot at a fixed angle.) If rolling your head around is impossible or bothersome, Apple has you covered right from within the setup UI. Again, Touch ID is no slouch, but I have found, anecdotally, that setting up Face ID is much simpler and quicker than setting up Touch ID ever was. Surely this is due to Apple having years to study user data and fine-tune BiometricKit.

Beyond setup, another area where Face ID excels is that its presence removes a point of friction (the Touch ID sensor) for many disabled users. However accessible Touch ID may be, the fact remains that reaching and/or pushing that button is problematic for many. Instead of tactilely authenticating for everything, now all someone has to do is literally look at their phone. It’s no doubt convenient as well, but importantly for accessibility, Face ID is freedom. It’s freedom knowing there’s a better way forward technologically, and freedom knowing there’s one less possible barrier.

The way Apple has built Face ID, hardware- and software-wise, into iOS quite literally makes using iPhone a “hands-free” experience in many regards. And that’s without discrete accessibility features like Switch Control or AssistiveTouch. That makes a significant difference to users, myself included, whose physical limitations make even the most mundane tasks (e.g., unlocking one’s device) tricky. As with so many accessibility-related topics, the little things that are taken for granted are always the things that make the biggest difference in shaping a positive experience.

On Face ID and Apple Pay

I’ve gone on the record (here and here) to laud Apple Pay as an accessible way to pay for stuff. I’ve used it every chance I get since its debut in 2014, and still can’t get over how well done it is. It’s a truly magical service.

Face ID on iPhone X takes Apple Pay to the next level. In the handful of times I’ve used Apple Pay on iPhone X (to pay for Lyft rides), Face ID has provided an even more seamless experience. As with unlocking, the advantage of Apple Pay being tied to Face ID is that you confirm the purchase with your face. (You double-press the side button to initiate, but that’s it.) The hands-free nature of it means I needn’t worry about getting my thumb in the right position or spend time waiting for authorization.

Despite how good it is, since I’m an Apple Watch wearer, I don’t use Apple Pay on my iPhone often—it’s even better from my wrist—but I’m glad Apple made the gestures more consistent across devices. For the times when I do use Apple Pay on my iPhone, Face ID makes it quicker, easier, and more accessible.

A Brief Note on the Touch ID API

It’s worth mentioning how much of an impact I believe the public Touch ID/Face ID API has had on accessibility. To me, it’s a sleeper hit.

The reason is that, by giving developers the power to integrate biometrics into their apps, Apple is effectively ensuring third-party apps are more accessible. I continue to agree with Marco Arment that the company should make accessibility a tentpole of the app-vetting process, but as it stands currently, the fact alone that App Store apps have access to these biometric features puts them on solid ground, accessibility-wise. That I have been able to use my thumb (and now my face) to get into 1Password means that app already is pretty accessible, even without critiquing any design details. It sure beats typing a passcode every time.

Of course there’s more developers need to do to ensure their apps are accessible to all, but the API sure puts them and their users ahead. It’s not trivial, and Apple is to be commended for the foresight to realize the benefits here. It was a huge addition to the toolkit.
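For the curious, the API in question is Apple’s LocalAuthentication framework, and part of why it has been such a sleeper hit is how small the integration is. Here’s a minimal sketch of how an app might gate itself behind Touch ID or Face ID—the function name, the reason string, and the fallback behavior are my own illustrative placeholders, not any particular app’s code:

```swift
import Foundation
import LocalAuthentication

/// Ask the system to authenticate the user with whatever biometry the
/// device offers (Touch ID or Face ID). If biometrics aren't available
/// or enrolled, report failure so the app can fall back to a passcode.
func unlockWithBiometrics(reason: String,
                          completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // canEvaluatePolicy reports whether Touch ID / Face ID is set up.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)  // e.g., show the app's own passcode screen
        return
    }

    // One call covers both sensors; the system presents the right UI.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: reason) { success, _ in
        completion(success)
    }
}
```

The notable part, accessibility-wise, is that the app never knows or cares whether the user authenticated with a thumbprint or a face; when Face ID replaced Touch ID, apps built on this API got the more accessible flow essentially for free.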

The Future of the (Accessible) Smartphone

Everyone who has an iPhone X right now is still in the honeymoon phase, so time will tell how feelings about the phone evolve as the device ages. In my usage so far, it’s clear to me Apple built iPhone X in such a way that the so-called “future” of the smartphone is an accessible one.

iPhone X takes many leaps forward, but Face ID is the biggest. It’s markedly better than its predecessor, which is high praise for a feature as beloved as Touch ID. There was some adjustment necessary on my part, but I can’t speak effusively enough about Face ID. It’s delightful, reliable, and accessible.

Apple Pushes Back HomePod Release to ‘Early 2018’

Nicole Nguyen, reporting for BuzzFeed:

In June, Apple announced that it was challenging Amazon's sleeper hit Amazon Echo with its own voice assistant-enabled speaker, called HomePod, and said the product would be released in December 2017. Today, the company released a statement that the speaker will be delayed until 2018: "We can't wait for people to experience HomePod, Apple's breakthrough wireless speaker for the home, but we need a little more time before it's ready for our customers. We'll start shipping in the US, UK, and Australia in early 2018."

The company also said “we need a little more time” when AirPods were delayed last year.