DESCRIPTION
Learn how Apple’s Accessibility team is harnessing innovations across hardware, software, and machine learning to support users who are blind or low-vision. Hear what’s new in VoiceOver and other vision accessibility tools across Apple devices.
Speakers
Dean Hudson, Accessibility Evangelist, Apple
Areeba Kamal, Accessibility Product Manager, Apple
SESSION TRANSCRIPT
[MUSIC PLAYING]
BRIAN: A deep dive into Apple’s industry-leading screen reader, VoiceOver. Moderator: Matthew Panzarino. Speakers: Dean Hudson and Areeba Kamal.
MATTHEW PANZARINO: Thanks, Brian. OK. I’m so happy to be here at Sight Tech Global to talk about different issues around accessibility and some of the new technological advancements. I’m so happy to have a couple of participants here from Apple. I’d love for you both to introduce yourselves and what you do at Apple. Perhaps, Areeba, we’ll start with you.
AREEBA KAMAL: Yeah, it’s great to be here. My name is Areeba. I am our product manager for accessibility at Apple. And I work really closely with our engineers, our designers, to build vision, hearing, mobility, and cognitive accessibility features across our devices.
MATTHEW PANZARINO: Thank you, Areeba. And Dean?
DEAN HUDSON: Yeah, I’m Dean Hudson. I’ve been at Apple for a little while. I started in engineering and now work in the Policy and Social Initiatives group. I’m an accessibility evangelist here at Apple, working with various teams to make sure our products are accessible.
MATTHEW PANZARINO: Excellent. Thank you. So, so happy to have you both with us. I’m glad you could take the time. There’s lots of places we could start. But I think one good launchpad for us would be to talk about what’s new in vision accessibility this year at Apple. I know we have a lot of new features being launched every year. And just– what’s fresh? What’s the bleeding edge of vision accessibility in the new versions of iOS and new hardware this year?
AREEBA KAMAL: Yeah, Matthew. Like you said, accessibility is something that we are always working on. And there’s always new and exciting updates for our users with disabilities. And this has been an exceptional year for accessibility at Apple. So within vision, I wanted to point out two things today. One, there’s been some fantastic updates to our magnifier app, which is built into every iPhone and iPad for our low vision and our blind users. And this year on iPhone and iPad devices with the LiDAR scanner, we introduced a tool called detection mode. Now detection mode is a handy place for our blind users to find a whole set of helpful utilities. And one of them is called door detection. It’s new this year. And what it does is, if you’re a blind user and you’re getting out of a taxicab to explore a new neighborhood spot, door detection is going to tell you how many doors are in front of you and read the text and symbols around the door so that you can find the right door to your destination. And it can also describe the color and the shape of the door, how far you’re standing from it, whether the door is open or closed, and if it’s closed, how you can open it. And you can combine it with people detection to distance socially and image descriptions to get some more descriptions of your surroundings. And we’re getting some really great feedback from our users who are blind that the tool is helping them just go and explore new places around them. The second thing I wanted to mention for folks is our screen reader VoiceOver has been around for a while, but it keeps getting better. And this was an incredible year for VoiceOver because we actually added over 20 new languages and locales for our users across different geographies and our users globally. And we also added nifty new customization options. So even for languages that we previously supported, there’s a host of new voices, such as the Eloquence voices that are super popular amongst veteran screen reader users. And you can customize VoiceOver even more with the updates we’ve made this year. There’s also a text checker. So if you’re a VoiceOver user on the Mac and you have any duplicative spaces, you have any missing capital letters, the kind of visual typos that might be hard to catch, VoiceOver can now help you catch them. So really a great year all around for vision accessibility.
MATTHEW PANZARINO: Well, thank you for that overview. That’s nice. I mean, I think the door detection, obviously, is a very useful, in-the-moment tool for navigating spaces. And I think VoiceOver and screen readers have come a long way over the past few years in helping folks to navigate dimensional space, right, to a greater degree, rather than just– I think a lot of people would associate it, for instance, with things on the page or on a flat screen. But instead, utilizing the camera and LiDAR and all of the different ML work that Apple has done, now dimensional space is more accessible. And that was one of the driving goals of this, right?
AREEBA KAMAL: Yeah, I think you hit the nail on the head. I think the really magical thing about accessibility at Apple is because we build our own hardware and our own software and because all our machine learning algorithms work on device, we’re able to combine those elements to help our users with disabilities solve really difficult problems in their everyday life. And I think door detection, like you said, is combining input from the camera, from the LiDAR scanner, from machine learning, and it’s keeping all that information on your device. It’s not being sent to any servers. Your data is private and secure on your iPhone and iPad. And at the same time, you’re able to go out and figure out the last few feet to your destination without needing sighted assistance. And it’s something that we’ve heard from the community in the past. And we got this problem, actually, from some users who are blind who are inside Apple. And it was something that we’ve been trying to solve for a while. And bringing together hardware, software, and machine learning is really the magic formula for us.
MATTHEW PANZARINO: Wonderful. Thanks. You mentioned inside Apple– I mean, you know, Apple’s approach to accessibility, the overall approach to accessibility, I think is a unique one. So how would you describe the way Apple’s approach to making accessibility features available to users differs from that of other tech companies?
AREEBA KAMAL: I think at Apple, accessibility is a fundamental human right. As a company, we’ve identified a set of core values that back every single thing we do, every single thing we build, and the way that we work. And accessibility is one of our core values. And when it comes to our software platforms, we’re really adamant that accessibility is not something you download. It’s not something you buy a license for. It is built into every Apple product from the get-go so that a user with a disability can unbox their Apple product and set it up independently just like anybody else. And I think the thing that really sets us apart in many ways is that there are engineers at Apple, there are designers at Apple, there are people in our accessibility team who are building these accessibility features and they are using these accessibility features every single day. And so we’re getting feedback from the community, from our users outside Apple. But we’re also kind of benefiting from the empathy and the lived experience on the accessibility team itself. And all of that is powering the innovation that you see across our accessibility features.
MATTHEW PANZARINO: Thanks.
DEAN HUDSON: Yeah–
MATTHEW PANZARINO: Oh, go ahead, Dean. No, no, it’s all right.
DEAN HUDSON: It’s really true, as someone who’s been a part of that team in developing some of the access technologies within our applications and our products. Just being in the meetings with other engineers and having the back and forth between how to design a specific feature or how to enhance a specific feature is really sort of the proof. When you have those kinds of discussions, the back and forth, you really refine and make the product a lot better for our users.
MATTHEW PANZARINO: Yeah, that makes sense. And Dean, I’ll throw to you since you’re already here. I’m curious– if you wouldn’t mind, give an audience that may not have used it before a little bit of an overview about how VoiceOver works, about the screen reader, kind of what its basic functions are because it’ll kind of set the scene for our next part of our discussion.
DEAN HUDSON: Yeah, yeah. So VoiceOver is our award-winning screen reader. It is built into all of our devices. And the way it works, at a sort of general, top level, is it examines elements that appear as UI on the screen and presents that information to the user in a succinct way through speech output. Now, with typical screen readers, you can navigate and it will speak the element. Sometimes though, if a developer has not properly made an element accessible, VoiceOver has built-in intelligence technology now which can actually determine what that element should be and then read labels for that element so that you don’t miss out on things that a sighted person can see. And so we’re really proud of that. And the fact that it’s built into everything, as Areeba said, makes it very, very liberating for a person who is visually impaired or blind because I don’t need to think about extra cost. I don’t need to think about downloading anything. I can open my product and start using it immediately. And that’s something that, as long as I’ve been here, it’s never gotten old that I can go down to an Apple store and literally use every Apple product in that store. That’s just amazing to me still. So that’s something that we really pride ourselves on. And it’s just kind of how we do– Apple is– it’s easy to use. It behaves like it should. And that’s where we stand in terms of accessibility as well.
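Dean’s point about developers labeling elements maps onto Apple’s accessibility APIs. As a minimal sketch (not from the session; the view and strings are hypothetical), this is roughly how an app developer might label a purely visual control so VoiceOver can announce it, using SwiftUI’s accessibility modifiers:

```swift
import SwiftUI

// Minimal sketch: a custom, image-only control that VoiceOver can announce.
// A purely visual element with no text might otherwise be skipped or read
// ambiguously; these modifiers give it a spoken name, a hint, and a trait.
struct PlayPauseControl: View {
    @State private var isPlaying = false

    var body: some View {
        Image(systemName: isPlaying ? "pause.fill" : "play.fill")
            .onTapGesture { isPlaying.toggle() }
            .accessibilityLabel(isPlaying ? "Pause" : "Play") // what VoiceOver speaks
            .accessibilityHint("Toggles audio playback")      // extra context, spoken after a pause
            .accessibilityAddTraits(.isButton)                // announced and activated as a button
    }
}
```

When a developer hasn’t provided a label like this, the built-in intelligence Dean describes is what steps in, on device, to infer one.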
MATTHEW PANZARINO: And, you know, Dean, you’ve been at Apple for something like 20 years– and correct me if I’m wrong. But you’ve been here since the earliest days of VoiceOver, for sure. And I’d love to hear you talk a little bit about the genesis of it. Where did that come from? Where did that initiative begin?
DEAN HUDSON: Wow. Yeah, not quite 20 years, but fast approaching. Yeah, when I started in 2006, I came on as a QA engineer, and there were a few programs, applications that I had to use, things like Mail and Calendar. And they were accessible. But as I started to perform my tasks and had to do things, and do things at a certain efficiency, I started to make recommendations on how we could improve that experience so that people could work– or at least I could work– in a more effective way. And I’ll give you a small example of that. When you create a new message in Mail, you would expect to go to the To field. Well, we would go to the toolbar. And so I told the engineers, hey, you guys, it would be way faster for me if I just went right to the To field and started typing a recipient. And that’s small, right? That’s a detail inside. But it’s powerful in terms of just efficiency. So we continued down that path through other apps, and even started reaching out further and making really good, powerful accessibility changes. And in 2008– I don’t know how many people remember this. But we introduced speech output on an iPod. And it was the first one we did on a mobile device. And we also made iTunes accessible. And that was huge. I mean, that was a game changer. Fast forward a little bit more, in 2009, Phil Schiller was introducing the iPhone 3GS, and he sort of slipped in there that, yeah, it will be accessible. People with disabilities will be able to use it. And I can tell you that friends that I knew who were blind and other people just found that confusing, like how is that going to–
MATTHEW PANZARINO: Right. I mean, the lack of the physical controls, which obviously was talked about in the larger context because BlackBerry was like, how could you do it without a physical keyboard? But for a user who is blind, that added a whole other connotation, like how will we type on this screen where we cannot touch these buttons?
DEAN HUDSON: Exactly. And what’s interesting, though, is we announced it, but it wasn’t public yet. So I knew because I’d been testing it for a few years now– a couple of years. And I just couldn’t say anything, like, well, you know. It was kind of weird. But–
MATTHEW PANZARINO: You previewed it during the keynote. You kind of had to hold your tongue for a bit.
DEAN HUDSON: Yeah, exactly. But history shows that it actually is very popular. And it turned out to be a very, very good solution. And according to the WebAIM screen reader survey, 72% of blind users use VoiceOver. And that’s amazing.
MATTHEW PANZARINO: Yeah, that’s fantastic. Thank you for that overview. I mean, I think the To field is something that we can all appreciate. So thank you for that suggestion. I think that’s a default industry behavior now.
DEAN HUDSON: Yeah, yeah. It’s a little inside baseball.
MATTHEW PANZARINO: Yeah, it’s good. And then I think ML, Machine Learning, is probably– has been one of the biggest boons overall to the work of accessibility, as a problem solving element of accessibility. You know, AI– general AI– is a whole other bag. And it’s not a panacea. We all know that. But ML specifically as a tool has been great for solving accessibility problems. And so I just want to talk a little bit about that, kind of some specific hurdles that ML helped jump, especially as the co-processors and hardware became more available on iOS devices to support these skill sets– tool sets and learning models.
AREEBA KAMAL: Yeah, I think machine learning has been a really critical tool– and continues to be a really critical tool– for us to solve problems that are difficult to solve and support our users with disabilities. And it’s especially powerful when you consider that our incredible chips let us do this in a privacy-preserving way. And there’s a great synergy between our on-device machine learning and the sensors in our hardware and our existing software tools. And so we’ve talked about door detection, which is combining input from the camera, from the LiDAR scanner, from the on-device machine learning that’s happening on your iPhone and iPad, and it’s giving you this information about how far you’re standing from the door. This is what the label on the door reads. It’s the sign for your favorite new restaurant on the block. And it’s a wheelchair accessible entrance. And all of that information that’s being given to the user– under the hood, we’re using our hardware sensors, our software tools, our on-device machine learning to deliver that. And that’s just one example. Dean just talked about screen recognition for VoiceOver and how that draws upon machine learning to try to distinguish any unlabeled images and any unlabeled UI. We know that there are third-party apps that may have some buttons that are not labeled. Or you might get a photograph from a family member and they’ve forgotten to add alt text. And in those moments, we’re using on-device machine learning to parse that element and describe it to you. And there are countless other examples of machine learning across accessibility. Those examples, of course, you’ll find them in vision accessibility. You’ll also find them in our Live Captions for deaf and hard of hearing and deafblind users. You’ll find those examples in Voice Control for users with acute physical motor disabilities. It’s really a critical tool for all of our cutting edge accessibility tools today.
MATTHEW PANZARINO: Yeah. And I think that the broader consumer– one of the overarching themes that I’ve found through my years of exploring and trying to understand accessibility work is that I think the broader consumer doesn’t realize just how much they benefit from this work. The long tail that goes into providing the most accessible experience for any user has actually a pretty profound effect on their user experience as well. One example being the categorization and analysis of photos to present text in them– and selectable text. I use selectable text so much in my photos, whether it’s copying out a label from something and pasting it into a search bar or even searching for a particular subject like dogs or cats or grass or trees or landscape. And a lot of that work began with the idea of, how do we help a person who is blind to see what is available in their photo archive through the power of ML?
AREEBA KAMAL: Yeah, I think there are always examples of accessibility paving the way for innovation in all sorts of spaces. And I think your example of Live Text is a great one. Now, Live Text is drawing upon on-device machine learning. And it’s super handy for all users. But when you are a blind or low vision user and you’re able to select text in a photo and then use an accessibility tool such as Speak Selection or VoiceOver to hear that text spoken out loud, now that is a whole different level of access that is made possible by the combination of our hardware, our software, and our machine learning. And there are times, Matthew, when we build an accessibility tool and it gains momentum across different spheres of users. I think if you’re familiar with Back Tap, that’s a feature that we built for accessibility users who–
MATTHEW PANZARINO: Describe Back Tap really quickly for people who aren’t familiar with it. I love this.
AREEBA KAMAL: Yeah, yeah. No, so Back Tap is a handy feature. What it does is if you want to set up a shortcut– so you’re someone who uses a flashlight a lot or you’re someone who takes screenshots a lot or you want to launch a specific app really quickly, you can actually customize Back Tap, which you can find in your accessibility settings, so that any time you double tap the back of your iPhone or you triple tap it, it performs that action that you’ve selected. And so we built that originally for our users with physical and motor disabilities. And we said, we know that somebody with a physical or motor disability might have trouble taking a screenshot and making the right claw shape with their fingers. And what if it was as simple as just tapping the back of the device a couple of times lightly? And that is a great example, actually, of hardware sensors and on-device machine learning kind of all coming together to power that feature. And we’ve seen it be really popular amongst able-bodied users, amongst users with disabilities. And they’re always nice surprises there. That said, I think our focus remains on users with disabilities. Our accessibility tools sometimes have utility for all users. But we’re always honed in on the users that are otherwise gated from our amazing products and trying to address those barriers.
MATTHEW PANZARINO: And Dean, obviously, the advent of ML, you have the historical context to see how that has changed the game in accessibility. Would you view that as like a big cliff for availability of building these new features out?
DEAN HUDSON: Oh, yeah. I mean, technology in general– I mean, we live in a very exciting time. And watching Apple improve on our hardware and advance in our software– and our accessibility team just right there with it, just trying to take advantage of every leap and bound that we make here. And ML is a big piece of that. I don’t see this going away soon at all. I think it’s going to continue bringing rich content to people and other things. So I think this is just a beginning of something that’s going to be pretty fantastic.
MATTHEW PANZARINO: Thanks. And we’ve talked a lot about first-party efforts within Apple to build accessibility into their products. But I want to talk a little bit about third-party stuff because you can’t do it all. And Apple’s availability– our ability to kind of touch all of the surfaces of accessibility is limited in some ways. So like, hey, these are the products that we make. And other people make other products for other communities, especially specific communities that have an intense passion around things like gaming and design and other things like that. And so I’d love to talk a little bit about third-party support and how Apple’s accessibility efforts integrate with kind of the broader world of accessibility efforts.
AREEBA KAMAL: Yeah, I’m really glad you asked because our developer community is so important to us. And it’s so critical to the Apple experience that we want for our users. And we know that our developers are always innovating. They’re always building amazing apps. And we want to make it super easy for these apps to be accessible to our users with disabilities. And so we’re always building accessibility tools for developers. And the newest in that long list of tools is our Unity plugin. So we know that Unity is a really popular platform for game developers today, for graphics experts today. And what we wanted to do was make it super easy for these developers to utilize Apple frameworks, including our accessibility framework, our haptics framework, our game controller frameworks. And so what we did is we put in a lot of work and we created this plugin that you can download from Apple’s GitHub. And you have a quick start guide that you run. And within minutes, your Unity game is accessible to users with disabilities using our built-in accessibility features like VoiceOver and Voice Control and Switch Control. And it was really– it was a Herculean effort for the team. But what we wanted to do was put in the work on our end to make it easy for our developers to make their apps more accessible. And that’s been a years-long effort. Of course, if you’re not using Unity, if you’re using SwiftUI, AppKit, UIKit– one of our Apple frameworks– to develop your app, we make every default element accessible just by design. And so with that, it takes almost no additional effort for the app to be accessible if you’re using those default elements. And there’s a host of tools, such as the Accessibility Inspector that’s built into Xcode, which is our development environment on macOS, so that if you are using custom UI, it’s super easy to audit it and understand whether it’s accessible and what you can do to make it even more accessible.
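For the custom UI case Areeba mentions, the same idea applies in UIKit: standard controls are accessible by default, while a fully custom view opts in and describes itself. A rough, hypothetical sketch (the class name and values are made up, not from the session):

```swift
import UIKit

// Hypothetical custom-drawn control that exposes itself to VoiceOver.
// Standard UIKit controls get this for free; fully custom UI opts in explicitly.
final class VolumeDialView: UIView {
    var volume: Float = 0.5 {
        didSet {
            // Keep the spoken value in sync with the visual state.
            accessibilityValue = "\(Int(volume * 100)) percent"
        }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true        // surface this view to VoiceOver
        accessibilityLabel = "Volume"        // spoken name
        accessibilityTraits = .adjustable    // lets users swipe up or down to adjust
        accessibilityValue = "50 percent"
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // VoiceOver calls these when the user swipes up or down on an adjustable element.
    override func accessibilityIncrement() { volume = min(volume + 0.1, 1.0) }
    override func accessibilityDecrement() { volume = max(volume - 0.1, 0.0) }
}
```

The Accessibility Inspector in Xcode can then audit a running app and flag elements that are still missing labels or traits like these.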
MATTHEW PANZARINO: Excellent. Thank you. You know, I think I’d like to take a couple of minutes to talk about what you’re each most excited about for the future of accessibility. I think Apple is at the forefront of a lot of this work, makes a very public and concerted effort to ensure that people view their work as central. And I think that is borne out by the products and, as Dean mentioned, by how many people do use them. So I think people would be interested to hear from each of you: what are you most excited about in the accessibility work ahead?
DEAN HUDSON: I’ll start off. I think what we’re most excited– I’m most excited about is really our users. And as I said earlier, to watch engineering grab hold of some of these technologies and really bring it home to our users is what excites me. And all the time, I’m hearing from people who will say, hey, I didn’t know this feature was in here. And it’s not like, yeah, it was a cool feature I found. It’s like, no, this actually impacted me. This did something to improve my life. So that’s where I go– without being very specific about what we’re going to do next. But it’s how Apple approaches challenges, just grabbing the technology and seeing how we can make it work effectively for folks. That’s what’s most exciting to me about the future.
AREEBA KAMAL: Yeah. I really think that Dean hit the nail right on the head. And I think what excites me the most is going into work and meeting with our incredible accessibility team and getting feedback from all our users with disabilities and trying to understand, what is one more experience that we can make even more accessible? What is an existing barrier, even for a single user with a disability around the world, that we can break down today using our technology? And if you’re familiar with Emergency SOS via satellite, this year we added the option to use your iPhone to text emergency services via a satellite connection when you don’t have cellular or Wi-Fi coverage. And the team realized that the user might need to turn left or right to avoid a blocked signal. And the iPhone is providing visual guidance. And what we did was– we were in the details and we realized that, hey, if a VoiceOver user is using this feature, they might not get to follow along as easily with the visual guidance. And so we’re giving the VoiceOver user sound and haptic feedback to help them avoid that blocked signal. And just that level of detail, and trying to work hard to make every single experience across our devices accessible to more and more users– I think that’s what really excites folks like Dean and myself.
MATTHEW PANZARINO: That’s great. Thank you so much. I appreciate that example, as I do feel that there is a tendency, when launching new technologies, to say, hey, we’re going to launch it. It needs to ship in Q4. It needs to affect our business outcomes to a certain degree. We need to launch our new marquee feature. And accessibility is often relegated to near the end of that list, maybe down with quality assurance too. But I think that viewing it as a priority and saying, hey, this is a very new feature– it’s brand new, it’s implementing a bunch of new technology and a lot of effort across all of our organizations– and accessibility needs to be woven into that from the beginning, I think that’s a beautiful stance. And I hope it does– I hope it’s infectious, right? We deal a lot with startups in our universe. And I hope they take that kind of example with them and build these efforts in from the ground up as well. Right along with innovation, right along with shipping their product, they need to understand that these efforts will bear fruit long-term. So thank you both for the discussion. I really appreciate it. And thanks for taking the time.
DEAN HUDSON: Thank you.
AREEBA KAMAL: Absolutely. Thanks for having us.
[MUSIC PLAYING]