Apple talks accessibility at Sight Tech Global (and it’s fascinating stuff)
Marking the International Day of Persons with Disabilities, Apple’s Senior Director of Global Accessibility Policy and Initiatives, Sarah Herrlinger, and accessibility engineer Chris Fleizach joined TechCrunch’s Matthew Panzarino for a fascinating interview at the Sight Tech Global accessibility event. You can (and should) read the whole thing here. What follows are just a few of the many highlights of the conversation.
Apple has worked on accessibility since 1985
Apple sees accessibility as one of its six core corporate values. Herrlinger also shared something I didn’t know:
“And a little factoid– we actually started our first office of disability back in 1985, which was five years before the ADA even became the ADA. And I think really from the start, that goal of making products that are products that everyone can use has permeated the design of our products since day one.”
The relationship is deep
Fleizach was on the macOS VoiceOver team when the iPhone was first introduced. He pointed to the challenges some users might face with a touchscreen device, and he was involved in the work to bring accessibility to it.
That work took three years, until it shipped in iOS 3. This is pretty typical, Herrlinger said: the accessibility teams get brought in quite early on when Apple builds new products. The relationship goes deep and works in both directions, with the aim of changing lives for all users.
What if Face ID didn’t understand diversity?
Fleizach shared a story about Face ID: “We were brought in over a year before it ever shipped to customers. And they asked for our expertise in this. And one of the jobs that we have is to sort of poke and prod. It’s like, who would be impacted here? How could we improve the situation, and find all those edge cases? So, one thing that we pushed hard for at that point was to ensure that facial diversity was included in the training set to make sure that Face ID would work for a wide range of people right out of the gate.”
VoiceOver Recognition is P.R.O.F.O.U.N.D.
Herrlinger spent time discussing iOS 14 and some of its accessibility features, including VoiceOver Recognition. This lets the iPhone recognize objects and deliver contextual information: it doesn’t just name what it sees, it tries to describe what those things are doing (“Cow jumps over fox,” for example). The feature uses the iPhone’s camera to give a running commentary on what’s going on around the user. The only problem, I guess, is the need to wave the iPhone around in public.
The Apple experts confirmed the feature works entirely on the device, no server required. The same machine vision also extends to interpreting websites, apps, and more for VoiceOver users. That last item is quite profound. “We can make this stuff happen now with ML, and on-device technology and the neural engine. You put this all together, and all of a sudden, you can enable people to play games they would never have been able to play before.”
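For the developers among you, a minimal sketch of what on-device recognition looks like in code. This is not VoiceOver Recognition itself, which Apple hasn’t published; it simply uses the Vision framework’s built-in image classifier to show the kind of no-server, on-device ML the interview describes:

```swift
import Vision

// A sketch of on-device image classification using Apple's Vision
// framework. Illustrative only -- this is not how VoiceOver
// Recognition is implemented, just the same class of on-device ML.
func describeImage(at url: URL) {
    let request = VNClassifyImageRequest()                 // built-in classifier; runs on device
    let handler = VNImageRequestHandler(url: url, options: [:])
    do {
        try handler.perform([request])
        let observations = (request.results as? [VNClassificationObservation]) ?? []
        let labels = observations
            .filter { $0.confidence > 0.3 }                // keep only reasonably confident labels
            .map { $0.identifier }
        print("The image may contain: \(labels.joined(separator: ", "))")
    } catch {
        print("Classification failed: \(error)")
    }
}
```

Everything runs locally, which is the point the Apple experts were making: no image ever leaves the device.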
The interview is packed with interesting insight. You really should take a look.
Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.