Timothy and I were joined by special guest Rene Ritchie. We discussed Apple's Global Accessibility Awareness Day efforts, my interview with Tim Cook, Apple's stewardship of its accessibility work, and our wishes for WWDC.
Michelle Robertson, writing for SFGate:
Some San Francisco neighborhoods are more averse to cash than others, according to data collected by Square Inc. in March and April. Heavy shopping districts filled with new stores and young residents, like Hayes Valley and SoMa, favor card and digital payments over older neighborhoods with more established businesses, like the Richmond and Sunset districts.
I live in the Inner Richmond and can attest to the neighborhood's preference for actual cash. The majority of businesses I frequent are hole-in-the-wall, mom-and-pop dim sum places and other stores, and nearly all of them are cash-only. The only times I use something like Apple Pay is when I venture to other parts of the city, notably in various parts of the Sunset. Otherwise, I'm prone to ATM visits to get cash because that's how commerce works in my part of town. And I'm okay with that—I don't mind cash and I like supporting small businesses.
I'm very pleased to see my byline again on MacStories. In this piece, I list five features and technologies iOS and macOS can share with each other. I believe these suggestions would boost the usability and accessible design of both platforms, which ultimately makes for a better user experience.
Ben McCarthy yesterday released an all-new version of his camera app, Obscura. On Medium, he wrote a terrific piece on what went into creating the app and the rationale behind some of his design decisions. I love these kinds of "behind the scenes" pieces; they're always fun reads.
Excellent story from BuzzFeed's Mat Honan:
As noon approached, an Orthodox priest, a rabbi, a Zen Center reverend, an imam, an Episcopal bishop, a Catholic archbishop, and a leader from the Brahma Kumaris Meditation Center stood on a stage in downtown San Francisco, clasped hands, and said a prayer: “Bless this magnificent edifice,” they intoned, “the Salesforce Tower.”
It was a bit ridiculous. But the Salesforce Tower itself, which opened to the public on Tuesday, is no joke. For San Francisco, it is a literal monument to the wealth and power of tech, and its grand opening brought together a nexus of powerful forces in modern-day California: the technology industry, Democratic politics, and the housing crisis.
As I said on Twitter, I have close friends and family who work at Salesforce. Upon further reflection, however, the more I think about this, the more upset I am. Not by Honan's story, but by the sheer idea of the Tower. As I also said on Twitter, San Francisco is a city "in real shit shape," as Honan wrote, yet Benioff and other company executives expect people to come and gawk at this architectural monstrosity that cost over a billion dollars to build. Even worse, they get religious figures to come and bless the fucking thing? It's absurd. Then Benioff has the gall to wax on about helping the city by combating homelessness and so forth. Honan was right on: this whole farce is Dickensian indeed.
Chelsea Stark and Samit Sarkar, writing for Polygon:
The world of video games is not particularly welcoming to individuals with disabilities. Game makers and platform holders have made some strides in this area in recent years, but for the most part, they’ve left the hard work to third-party organizations. The Xbox Adaptive Controller is the strongest, clearest expression yet of Microsoft’s commitment to reaching people with disabilities, and it sprang in part out of a controller that’s on the opposite end of the accessibility spectrum.
“We cast a really inclusive map of partners and individuals to help us build this, in a much bigger way than we have normally for our products,” said Kumar. In addition to groups working in the gaming accessibility field, like AbleGamers, SpecialEffect, the veteran-focused charity Warfighter Engaged and accessory manufacturers, Microsoft consulted with the Cerebral Palsy Foundation and Craig Hospital, a Denver-area rehabilitation center for brain and spinal cord injuries.
This new controller from Microsoft is a big deal for the video game industry and for Microsoft. As Dan Moren writes at Six Colors, it's heartening to see the other big players in tech make such a pronounced move in the accessibility space. Apple surely leads here—although they aren't doing anything hardware-wise—but it's great to see others follow their lead in acknowledging and supporting the disabled community. Huge kudos to Microsoft for their efforts here.
Speaking of Global Accessibility Awareness Day, how fortuitous that the first episode of my new (again) podcast about accessibility in tech dropped on the same day. In Accessible's first new episode since 2014, Timothy and I discuss what accessibility is and what it means, the history of Apple's accessibility work, how the first iPhone redefined accessible computing, and more.
Yesterday was Global Accessibility Awareness Day. Apple, as it has done the last few years, was an active participant in promoting the day—a day all about raising awareness for disabled people and the importance of accessible technology. Being the industry leader, Apple plays a big role.
For TechCrunch, I wrote a deep dive story on Apple's activities for this year, tied into the education announcements they made in Chicago in late March. Apple invited me to a small event at the California School for the Deaf in Fremont, which was the highlight of the day. It was at that event where I got an opportunity to briefly interview CEO Tim Cook about GAAD and what accessibility means to the company and to him. The reporting was key, but man, how exhilarating it was for me. It was an amazing experience—one that I won't ever forget.
Rachelle Hampton, writing for Slate:
The fact that analog clocks have managed to stand the test of time in an increasingly digitized world is a bit of a wonder. While there may be nothing quite as charming as the quiet tick of a watch, the fact that teachers are adapting to the fact that a fair number of teenagers can’t read an analog clock isn’t another sign of corruption in The Youth. It means, as Nobel Prize Winner Bob Dylan once wrote, that times are a-changing. It doesn’t make sense for a teacher to waste precious class time to teach teenagers how to read an analog clock just so they can tell time during an exam. And if students can’t read traditional clock faces with relative ease by the time they’re sitting for exams, it means one of two things: Somewhere down the line their teacher prioritized other knowledge over doing countless worksheets on telling time, or they’ve just forgotten something they learned in elementary school once they tested out of that grade.
Kids in America who learn under Common Core standards are required to be taught how to read an analog clock in first or second grade; if kids in the U.K. are taught around the same time, then it wouldn’t be entirely surprising to learn that they forgot how to do so by the time they’re teenagers. With the sheer amount of stuff kids are expected to know once state exams roll around, it’s understandable that they’d get a bit rusty on something they’re not using every day.
One of my saved Apple Watch faces is an analog one. I like to think using an "old" way to tell time on a 21st-century computer on my wrist is a bit whimsical, and cool. I was born in 1981, so I grew up with analog clocks; I know how to tell time. Digital clocks are fine too, but I'm comfortable reading the clock's hands.
One amusing aspect of analog clocks versus digital ones, at least in my experience, is how young people (read: teens and twentysomethings) don't understand "approximate" time. For example, if someone asks me what time it is and it's 3:15, more often than not I'll say it's "quarter after." Likewise, if it's 3:45, I'll say it's "quarter to." When I use these phrases, it seems to throw people off—I think most people nowadays are used to exact numbers. To them, it's not "ten to 4," it's "3:50." Digital timepieces have set the expectation for exact numbers as opposed to abstract (albeit correct) phrases.
Maybe I'm showing my age. My grandma told time this way, so I guess I picked it up subconsciously.
The show art is courtesy of Simon “forgottentowel” Buckmaster, who’s done artwork for Relay FM.
At long last, it returns. I’m excited to announce the resurrected Accessible podcast.
First, a quick history lesson. The show’s original run dates back to 2013–2014, a time when my career in tech journalism was in its infancy. The show’s premise was simple: A (then) weekly podcast about accessibility in tech, mostly seen through an Apple-focused lens. Accessible was part of the now-defunct Constellation FM podcast network, and it had a good but brief run. Apple featured it in iTunes for a while under “New & Noteworthy,” and we even had sponsors! Squarespace was a sponsor at one point; I still have the old ad read as a text file stored in my Dropbox. And there were guests too—Stephen Hackett and Jared Sinclair were two notable ones.
Accessible’s original run marked my first foray into the world of podcasting, and I enjoyed doing it very much. As time went on, however, life got in the way for my co-host and me, and our work on the show waned to the point it ceased to exist. We just stopped, and unfortunately lost touch with each other altogether in the process. In fact, none of the audio exists anywhere online today. Since it ended four years ago, my career has blossomed and I’ve grown a nice audience. Most of my friends and colleagues podcast regularly, and as a podcast lover, I’ve listened to countless hours of their shows over time. It eventually reached a point, coinciding with my burgeoning writing career, where I began to really miss podcasting. More to the point, I increasingly felt I should be podcasting—I should augment my written work on Apple accessibility by talking about it too. So, after much procrastination and preparation, here I am.
Rather, here we are. I’m doing Accessible with my friend Timothy Buck.
Tim is a product manager who, like me, lives in San Francisco. We’ve been friends for several years; we’ve had coffee many times, over which we’ve had many discussions about rebooting Accessible. He’s a great guy and I couldn’t be more excited to be working with him on the show. As one friend recently told me, I “picked the right partner... he’s an amazing dude.”
Round 2 of the show has the exact same premise as before: accessibility in tech, with an Apple bent. This time, though, we’ll be doing the show fortnightly... so listeners will get two episodes per month. And like the old show, we’re planning on having guests—we have an exciting, ever-growing list in Notes of people we’d like to have on the show.
In a broad sense, I think bringing back Accessible is important for the Apple podcasting scene. While there are accessibility-minded podcasts out there, I’m frustrated at the general lack of discussion about it on most shows, including all of the “AAA” shows most Apple nerds are familiar with. To be clear: This isn’t me being critical of the people doing those shows; I know everyone doing them and they’re all smart as hell. The issue is simply that accessibility is too important a topic not to cover, even though it’s admittedly abstract and hard to understand in places. It’s difficult to talk about something you can’t easily relate to. Still, it’s a subject deserving of our attention. Accessibility is a major facet of all Apple products, and it certainly has relevance to the rumor mill as well when ruminating on potential future products.
Put another way: The Apple community needs to talk about accessibility more than it does.
Thus, our hope for Accessible is to bring some of that conversation to the forefront. As a disabled person and as someone who is looked upon as the “expert” on this topic, I’m hoping that, if we do it right, Accessible can be a trusted source of coverage and analysis of Apple-related topics from a different perspective. To diversify the conversation is, I believe, to enrich the conversation. We hope the conversations we have on the podcast help listeners see Apple (and other companies) in a different light than usual.
As of this writing, we’ve recorded an Episode 0 and will be recording the first real episode shortly. We’re excited about the early enthusiasm for the show, and we can’t wait to get to work, get the show off the ground, and see where it goes. We hope you’ll join us on our journey. Any questions or comments can be directed to us on Twitter or via email. We would love to hear from you on how we’re doing!
In celebrating Autism Acceptance Month, this is a terrific project:
Over the last few weeks, participants have been creating new forms of art on iPad Pro. Though some had previous experience with iPad, most artists had not previously worked with Apple Pencil to create art. Through one-on-one sessions at Apple stores, drawing apps, and other support from Apple these artists gained confidence with the new method and learned how to maximize their creative potential. The participants have differing abilities, and are of different genders and ages ranging from 15 to 53.
The Art of Autism is very excited to lead this series for artists of all abilities. We are grateful to Apple for their support and to the artists for sharing their art and insights. The Created on iPad art exhibit highlights the diversity of art and the creativity of many on the autism spectrum in visual art — and how technology can enhance their experience.
Though I am a fan of the Mac, I don't think there's any question that, generally speaking, an iPad is a far more accessible computer than something like a MacBook. This exhibit is one reason why.
For many years, I was a diehard Tweetbot user, the popular indie Twitter client for iOS and the Mac. Everything about the app’s design delighted me: I adored the sounds, the way fonts were rendered, and just the way the user interface looked. It was—and remains—a beautiful app. Tapbots, Tweetbot’s developer, does exquisite work. It’s no wonder the app is so beloved.
Somewhere along the way, however, I moved away from Tweetbot and gravitated towards Twitter’s official first-party app. I don’t exactly know why I made the change, but I soon came to realize that the official client is actually well done in its own right. Despite its proclivity for inserting ads and promoted tweets into the timeline, among other annoyances, I can appreciate the niceties the official app offers such as the Search tab, Moments, threaded replies, and—best of all—the dedicated GIF button.
And one other thing: Accessibility.
Many nerds like to shit on the official Twitter app for being an abomination, mainly due to the algorithmic timeline, but the truth is Twitter’s iOS client is worlds better than Tweetbot for accessibility. The UI design is much higher contrast—Twitter for iOS even acknowledges when you have the system’s Increase Contrast setting enabled, as I do. And, crucially, the official client natively supports alt-text, which allows users to append image descriptions for the blind and low vision before tweeting.
I’ve spent the last several weeks back on Tweetbot, because I still have great fondness and respect for it. More to the point, however, I wanted to revisit the app and see how it compares to the official app. I’ve identified a few enhancements Tapbots could make that would greatly increase the app’s accessibility, such as:
- Higher contrast iconography and support for Increase Contrast
- Support for adding alt-text to images prior to tweeting
- And just for fun, add an integrated GIF button
One thing Tweetbot does extremely well, accessibility-wise, is sound. The sounds you hear when a new tweet is sent or retweeted, or when a mention comes in, are helpful audio cues that the action in question occurred. This creates a bimodal sensory experience where someone like me, who has low vision, can both see and hear my interactions with the app. The sounds are undoubtedly a nod to Tapbots’ robot-y branding, but in practice they also serve as an unintentional assistive tool, and that’s great.
I have no idea where Tapbots is in development on version 5, but wherever they stand, I hope they consider improving the app for accessibility. It’d make it even more delightful for me (and others) who rely on accessibility to get the most from our apps.
TechCrunch’s Matthew Panzarino was given exclusive access by Apple to what the company calls its “Pro Workflows Group.” This is the team working on the redesigned Mac Pro, which Apple confirmed to Panzarino is slated for release next year.
Now, it’s a year later and Apple has created a team inside the building that houses its pro products group. It’s called the Pro Workflow Team, and they haven’t talked about it publicly before today. The group is under John Ternus and works closely with the engineering organization. The bays that I’m taken to later to chat about Final Cut Pro, for instance, are a few doors away from the engineers tasked with making it run great on Apple hardware.
“We said in the meeting last year that the pro community isn’t one thing,” says Ternus. “It’s very diverse. There’s many different types of pros and obviously they go really deep into the hardware and software and are pushing everything to its limit. So one thing you have to do is we need to be engaging with the customers to really understand their needs. Because we want to provide complete pro solutions, not just deliver big hardware, which we’re doing and we did it with iMac Pro. But look at everything holistically.”
To do that, Ternus says, they want their architects sitting with real customers to understand their actual flow and to see what they’re doing in real time. The challenge with that, unfortunately, is that though customers are typically very responsive when Apple comes calling, it’s not always easy to get what they want because they may be using proprietary content. John Powell, for instance, is a long-time Logic user and he’s doing the new Star Wars Han Solo standalone flick. As you can imagine, taking those unreleased and highly secret compositions to Apple to play with on their machines can be a sticking point.
This story is a terrific follow-up to last year’s Mac roundtable one.
Per a press release put out by the American Foundation for the Blind:
The American Foundation for the Blind (AFB), a national nonprofit that creates a world of no limits for people with visual impairments, today announced the election of a new, distinguished trustee to its national board: Sarah Herrlinger, Director, Global Accessibility Policy and Initiatives, at Apple, Inc. Herrlinger was elected during the February 2018 Board of Trustees meeting.
"Sarah Herrlinger is an outstanding addition to AFB's board," said Kirk Adams, AFB President and CEO. "Ms. Herrlinger and Apple's commitment to making technology accessible to everyone reflects AFB’s mission of removing barriers and creating a world of no limits as we approach our 100th year and beyond."
As I wrote on Twitter, Herrlinger acts often as the “public face” of Apple’s accessibility efforts. She’s generally the one who represents Apple at awards ceremonies and appearances, such as her attendance at SXSW last month for the Everyone Can Code program. At Apple, Lisa Jackson oversees Apple’s accessibility initiatives as part of her duties, but it’s really Herrlinger who’s one of the big drivers, internally, of new features, partnerships, and the like pertaining to accessibility across the company.
Mark Gurman and Ian King, reporting for Bloomberg:
Apple Inc. is planning to use its own chips in Mac computers beginning as early as 2020, replacing processors from Intel Corp., according to people familiar with the plans.
The initiative, code named Kalamata, is still in the early developmental stages, but comes as part of a larger strategy to make all of Apple’s devices -- including Macs, iPhones, and iPads -- work more similarly and seamlessly together, said the people, who asked not to be identified discussing private information. The project, which executives have approved, will likely result in a multi-step transition.
For Apple, the change would be a defining moment. Intel chips remain some of the only major processor components designed by others inside Apple’s product portfolio. Currently, all iPhones, iPads, Apple Watches, and Apple TVs use main processors designed by Apple and based on technology from Arm Holdings Plc. Moving to its own chips inside Macs would let Apple release new models on its own timelines, instead of relying on Intel’s processor roadmap.
As Rene Ritchie tweeted, Apple’s likely been working on this project for years.
Farhad Manjoo, writing for the New York Times:
This has been my life for nearly two months. In January, after the breaking-newsiest year in recent memory, I decided to travel back in time. I turned off my digital news notifications, unplugged from Twitter and other social networks, and subscribed to home delivery of three print newspapers — The Times, The Wall Street Journal and my local paper, The San Francisco Chronicle — plus a weekly newsmagazine, The Economist.
I could never go back to reading print. As I tweeted, when I was growing up my dad used to get the paper every day, and I used to steal the sports page from him. Aside from headlines, stories were hard to read because the text was so small; I had to get so close to the paper that the tip of my nose would be covered in ink. Manjoo’s broader point about unplugging and not getting your news from Twitter—which I do, much to the chagrin of my RSS account—is well taken, but for me, technology makes consuming news a more accessible experience that helps more than hinders. I suspect I’m not alone in that sentiment either.
Lori Hawkins, reporting for 512tech:
Ober and fellow students at the Texas School for the Blind and Visually Impaired spent Wednesday morning with a team from Apple Inc. learning to code and then using the code to pilot small drones.
“It’s really cool because a lot of us really good on computers and want to create apps, but there aren’t the tools,” said Ober, 18, who is legally blind. “This technology opens up more possibilities.”
The event was Apple’s first in-school coding session for students who are blind and low vision.
The story also mentions Apple will be represented at next week’s SXSW, where they’ll discuss the Everyone Can Code initiative and how it was built to be accessible to all.
Fascinating piece on Medium by the Academy of Motion Picture Arts and Sciences:
On the evening of May 16, 1929, Academy members and their guests, 270 in total, filled the Blossom Room at the Roosevelt Hotel for a banquet featuring the presentation of the Academy’s Merit Awards. They dined on fillet of sole, half a broiled chicken on toast, string beans and potatoes. And they all knew the winners.
My favorite tidbit from this article: The ceremony lasted only 20 minutes!