What Waymo learned at the DOT Inclusive Design Challenge
DESCRIPTION: Waymo participated in the US Department of Transportation Inclusive Design Challenge and emerged with numerous accessibility lessons and features that will help Waymo's autonomous rides offer people with disabilities better service. Waymo's team is still processing all they learned.
BRIAN: What Waymo learned at the US Department of Transportation Inclusive Design Challenge. Moderator, Mike May. Speakers, Lauren Schwendimann and Jeffrey Colon.
MIKE MAY: Thank you, Brian, and thank you Sight Tech Global for having this forum for discussing some of the latest and greatest technology. This is Mike May from GoodMaps. I’ve worked on accessible navigation a good portion of my professional career. And when we turned into this century, I think Uber and Lyft and rideshare were some of the greatest things that happened in terms of independence for blind people, not to mention everybody. And the Holy Grail has been, when can I drive a car? And in a Consumer Electronics Show press conference in 2009, they asked Stevie Wonder and I together, what’s your greatest dream? And we both said, being able to drive and control a vehicle on our own. So we’re thrilled to have Waymo here to tell us more about that, and then to have feedback from a couple of users from the San Francisco LightHouse to tell us about what it’s like to be in a vehicle, which I haven’t done yet, but I have lots of questions about how it all works. So let’s do a quick round of introductions and then we’ll dig into it. Lauren?
LAUREN SCHWENDIMANN: Hi, everyone. I’m Lauren Schwendimann. Thank you so much for having me. I lead the Experience Design team for Waymo One at Waymo. So our team is responsible for designing all the interactions that riders have with Waymo’s ride-hailing service, from hailing their car in the app to finding their car, boarding, riding, exiting, and finding their final destination. And I’m excited to be here to talk about our DoT Inclusive Design Challenge.
MIKE MAY: Yeah, thrilled to hear more. And certainly at GoodMaps, we use LiDAR for indoor navigation, and LiDAR outdoors was a big part of, I think, a lot of the technology that allowed this sort of thing to happen. And it facilitated the indoor navigation that we do, so I’m really interested to hear more about that. Jeffrey?
JEFFREY COLON: Hi, everybody, thank you for having me. My name is Jeffrey Colon, Director of Access Technology here at the LightHouse. We provide individual and group training to blind or low-vision users throughout the Bay Area. We also provide user recruiting and corporate training to different organizations such as Waymo. So we partnered with them on this exciting project, recruiting users and working with them to make sure they had the opportunity to participate in a great user research study. Thank you so much for having me here.
MIKE MAY: Yeah. Thank you, Jeffrey. And of course, the LightHouse and GoodMaps are working together and we hope to have a BART station accessible soon.
JEFFREY COLON: Yes, sir. Yes, sir.
MIKE MAY: Shane, how about you?
SHANE MARTIN: Hi, guys. My name is Shane Martin. I’ve been living with sight loss for about six years now. I’m a proud member of the LightHouse family. I’m looking forward to continuing all my experiences and work with the LightHouse, so everybody’s been great. And I’ve enjoyed working on this Waymo project, and it’s been a journey for sure. So I’m looking forward to helping wherever I can, so thanks again.
MIKE MAY: Yeah, thanks for being on here. Lauren, tell us about the Design Challenge, and what is that all about, what was the request in that challenge that you guys decided to meet?
LAUREN SCHWENDIMANN: Yeah, sure, thank you. So the DoT Inclusive Design Challenge was really an opportunity for us at Waymo to pursue and embrace universal design by learning from people with a variety of needs and disabilities to ensure that we’re building a transportation service that works for everyone. So for our particular entry into the challenge, we really looked at the wayfinding user experience. We explored product prototypes geared toward touching a variety of senses that incorporated accessible audio-visual, haptic, and hands-free features, and tools to support riders throughout their journey with Waymo. So we also worked with community organizations, notably LightHouse for the Blind, and Shane who’s here, to test these prototypes and hear directly from people with blindness or low vision how well different solutions meet their needs and can support their independence in ways that really work for them in their everyday lives. So the prototypes we tested were most– like I said, mostly focused on wayfinding, and really sought to answer the question, how can we make it easy, safe, and convenient for all of our riders to know where to meet their car, how to navigate to their car, and to board their car with confidence? And then at the end of their journey, how can we help them get to their final destination safely, confidently, and independently? So that’s just a quick overview. I’m happy to dive into more of the details of the prototypes, but just wanted to give a high-level view of what we were aiming to do.
MIKE MAY: Yeah, I think before we pass it over to Jeffrey, I think a lot of people would just want to know, what is the state of autonomous vehicles? We all hear bits and pieces, particularly if there’s a catastrophe, but I think the technology has been evolving for quite some time. How long has it been in process and where are we in terms of the industry?
LAUREN SCHWENDIMANN: Thank you for that question. We’ve actually had a lot of exciting announcements over the last few weeks at Waymo. So first of all, autonomous vehicles– Waymo autonomous driving has been real for Waymo for the last two years in Chandler, Arizona– real meaning available to the public. But as of late, we’ve announced that LA is our next autonomous ride-hailing city. We’ve begun testing at Sky Harbor, the airport in Phoenix, Arizona, one of the busiest airports in the US. And we’ve opened up our Waymo One ride-hailing service in Downtown Phoenix to the public. And hot off the press, we now have our permit to operate in rider-only mode with members of the public in San Francisco as of today. So we’re really excited about the momentum that we have, and we’re working hard every day to make the promise of AV ride-hail real for people to use in their everyday lives.
MIKE MAY: OK, good. Well, we’ll come back to a little bit more about that, and also what’s next. Jeffrey, having not used the app or the experience myself, I’m just curious– is it like rideshare, where you just launch an app and you put in your pickup location and your destination and a car shows up? How does it all work?
JEFFREY COLON: Yeah, it works like that. And I wanted to give a little context on the experience we had when Waymo approached us about this great user study. Like Lauren said, they wanted to make sure we could provide feedback from blind users on all areas of the experience, from the beginning. When we request the vehicle, like you said, Mike, making sure that a user can find the buttons in the application– so that we can provide feedback from different perspectives. So when Waymo came to us, we decided, OK, let’s approach this the way we approach our user studies. This is very important for our community. So let’s find users with different skill sets in access technology. Let’s find a 65- or 70-year-old man who is just starting out with technology, and let’s see how this person can contribute feedback on finding a vehicle. Let’s find a 25-year-old, very tech-savvy user who can provide technical expertise in those areas– OK, I want to make sure that I understand that I can cancel, that I can change my ride, that I can find different alternatives. Let’s find a 45-year-old woman who is in the middle. So we tried to find users across all different skill sets, and combined that into an opportunity to be the liaison between the community and Waymo, to help them approach the part that Lauren mentioned– finding the car, making sure that it’s safe, that people find this experience accessible. I was even part of the initial pilot, trying the first part– finding the application, making sure the swipes left and right and finding the buttons were accessible. So we had an idea of what to expect when we approached the users, and we found people like Shane to help give the feedback they needed to see how the project was progressing.
MIKE MAY: Yeah. And how many people were in the study?
JEFFREY COLON: We did three runs of the study. One was for the users to find the car, and the other two were on different aspects of the application. So in the first round, we found 12 users.
MIKE MAY: OK. So you mentioned figuring out how you find the car. And this really speaks to one of the biggest problems with rideshare: you have a situation with a sighted driver looking for a blind rider, but at least with that driver, if you text them or call them, you can say, oh, I’m over here, I’ve got a dog, or I’m next to a sign. But it’s always a challenge to connect with that vehicle. How does that work with an autonomous vehicle? You don’t have anybody to call.
JEFFREY COLON: Yeah. So for us it was, OK, obviously the classic parts first– pickup, drop-off. And then it was the experience of the application telling us, OK, now the car will be– it started the wayfinding feature, like Lauren mentioned, trying to direct you with wayfinding cues, like, you will be 200 feet to your right. And then you start walking, and as you get closer to the car, the application gives you an experience like using a maps application, like Google Maps– it will tell you that you are 100 feet away, or 50 feet away. And then when you get closer to the vehicle, you are presented with more options, like the opportunity to sound the honk so you can hear it. And that was very interesting. The users who participated in the study had that opportunity when they got closer to the car. I know Shane had that opportunity to experience what it’s like when you’re closer, because sometimes those last 50 feet are the challenge for us.
MIKE MAY: Yeah. The final frustrating 50 feet, the bane of all accessible navigation. Yeah, I was hoping you could honk the horn, that would be great.
JEFFREY COLON: It was super great having that.
MIKE MAY: It could be a bit crazy with a lot of people after a concert or something. And I suppose there’s probably some way to flash the lights for sighted people.
JEFFREY COLON: Yes, that was also part of the experience.
MIKE MAY: OK. OK, good. Shane, how did you find– literally find a car?
SHANE MARTIN: It was interesting. It was something that I hadn’t expected to be quite as easy. You guys touched on the tones, the signals, which was super helpful. I was lucky enough to have people with me, so I wasn’t too worried about being panicked or not finding it. But it seemed really easy. We crossed the street and walked right up to the car and it went pretty smooth. I was impressed. Thankful.
MIKE MAY: Yeah. Is there any option– this might be getting a little bit fancy, but can you tell the car, come to me? I mean, if it’s on the other side of the street, you may not want to cross. Can you communicate that?
SHANE MARTIN: Right. I don’t think we went over that. Trying to think back, but I don’t remember calling the car. I remember going to the car. I remember ordering the car and going to it. So I was going to the mountain instead of the mountain coming to me. [LAUGHTER]
MIKE MAY: And what about once you got in the car, tell us about that experience.
SHANE MARTIN: Smooth as well. Everything– once you shut the door, everything seemed comfortable. I’m usually– I was always hesitant about getting in the car without a driver, but it seemed pretty easy and the whole experience was comfortable and it seemed like it would have been a great ride. I wish I could have taken a ride in it, but maybe at some point.
MIKE MAY: So you just got in and sat there and checked it out?
SHANE MARTIN: Mm-hmm.
MIKE MAY: Does it provide any sort of progress report or announcements saying one mile to my destination? I’m passing Starbucks or any kind of play-by-play like that?
SHANE MARTIN: That would have been great, but I didn’t get to ride in it. So I got to go to it and get in it and then– yeah. So I didn’t get too much after getting to the car, I didn’t get to experience too much.
MIKE MAY: Yeah. Lauren, is that something that’s available or will be available?
LAUREN SCHWENDIMANN: That is actually a setting that we have available that people can turn on or off– more verbose audio cues to help support understanding of what landmarks you’re passing, what’s going on with the ride, and how the driver is making driving decisions.
MIKE MAY: So once I’m in the vehicle, am I interfacing via the app to know what’s going on and how to maybe change my destination? Or is there a switchover to a big screen in the car and control panel there?
JEFFREY COLON: No, you have the app– I was doing it through the app. And like Lauren said, you have the opportunity to change it through the app, like you can usually change any ride. But also, you have the opportunity to interact– and you hear the car mention different things. If you activate that option, you’ll get the opportunity to hear some of the destinations when you’re passing different areas.
LAUREN SCHWENDIMANN: That’s right. Obviously, as Jeffrey mentioned, you can interact with the app to control every part of the ride. There’s also a screen in the car, should folks want to interact with a screen, that shows visual information about what’s going on in the trip and supports some visualizations of what the car is seeing and responding to in the environment. And that screen has some controls as well, but we really see it as a priority to have every control accessible in the app, so that it can be a primary interface for folks.
MIKE MAY: Yeah. The more any kind of app can parallel the sighted experience with the blind experience, the better, just in terms of updates and sharing user experiences and helping each other out. What had to be adapted for blind users, other than making sure the buttons were accessible– the obvious things? Was there anything else, like the horn honking, that had to be added? That could actually be helpful for a sighted person as well.
LAUREN SCHWENDIMANN: Yes, that’s one of our hero examples of how universal design has really improved the experience for everyone. So our honk-horn feature has been available in the app for some time, and we’ve seen usage of that feature from a variety of folks, blind and low-vision people included, who really rely on it. But we’ve seen everyone really appreciate that feature for helping confirm where their car is when they know it’s nearby but it could be behind a building. And just the ability to leverage that sense of hearing to find your car has been really useful for everyone. There’s also a feature that can minimize walking time, which is important for some people with different abilities or disabilities or mobility challenges. The way our car navigates to a pickup spot is influenced by a variety of factors– the route time, the traffic patterns, et cetera. And so there is a setting where, if it’s really important to you to minimize your walking as much as possible, you can tell the Waymo app, minimize my walking. It might take the car slightly longer to get to you, but if that’s important to you, that setting is available. I’ll also add that one of the features we tested in this research was around better app navigation to find your car. So, as Jeffrey was mentioning– your car is 50 feet straight ahead, or to your right– there’s a VoiceOver experience, and we built this compass-like interface that could give these verbal directions to support blind and low-vision usage. But we’ve also seen that everyone really loves the simplicity of this feature, because it’s a very human way to give directions– versus trying to follow dots or lines on a map, just hearing, or being able to read and see, go 50 feet to your right or 30 feet straight ahead. That feature has also been called out by a lot of our users as super useful.
MIKE MAY: Yeah.
LAUREN SCHWENDIMANN: So that’s another example.
MIKE MAY: That’s certainly my world in terms of accessible navigation, and particularly in indoor settings where we give turn-by-turn directions. You don’t have streets and sidewalks or other features that guide where you’re going, so you need to be really good about giving a slight left or hard left or right, or turn here and there. So that sounds like the same sort of thing in order to navigate to your car. It just crossed my mind in terms of cars honking– most cars just have one kind of horn. I’m just wondering, could I send a flag to the car that says, this is Mike’s honk profile? I could even use Morse code or something.
JEFFREY COLON: Very interesting. I remember now– actually, when we tested, as far as I remember, they even added more profiles. It was more than one sound. So every time we pressed the button, it was a different sound. That’s very important when we talk about universal design, because different users may have problems with certain frequencies and may not be able to detect a particular sound among a crowd of people. We talked about it during the feedback sessions, and they actually implemented it– I remember hearing more than one sound. So there will be more opportunities for people who might have trouble identifying one sound to hear a different one.
MIKE MAY: Right. One of the things I’ve always wondered about– when I’m going along on my journey and something goes wrong, there’s a traffic jam or something breaks down. Now I’m in the middle of the road, I’m a blind guy by myself– I’m sure you’ve talked about this experience. What do I do? Sit tight? I obviously can’t get out and walk to the nearest gas station. What kind of safety parameters are in place to deal with that kind of situation?
LAUREN SCHWENDIMANN: Great question. So we do have rider support available via a button in the app and in the car. And this connects people to a human agent right away who can see where your car is, understand what’s going on in your ride, and be that human-to-human interaction– help you understand what might be going on and/or help you resolve any unforeseen situations. And we’ve found that aspect of our service to be really reassuring and helpful– people know that, OK, I’m in this ride, I love the privacy, I love so many aspects of it, but what happens if something unforeseen happens? And that’s where I think rider support is that great backup– extra help, extra support as needed.
MIKE MAY: And Lauren, is the thinking here with autonomous vehicles– the world of rideshare right now doesn’t really share that much, but the idea of making more use of a car because it picks me up, it drops me off, then it goes and gets somebody else– it seems like that’s a more effective sharing type of approach to using motor vehicles?
LAUREN SCHWENDIMANN: Yes. I mean, a service like Waymo One is accessible by a variety of means, meaning folks don’t need to own a car to use it to get from point A to point B. And so I think it allows us to share resources among the community, leverage the technology, and make it available to everyone. So yeah, I believe the ride-hailing model really does make this exciting, important technology broadly available to the public, and that is the vision we’re working toward.
MIKE MAY: Yeah. Shane, let me pass it back over to you for a second. What are your– what’s your thinking going forward? I’m sure you’re excited about this. Or do you have any concerns? What are you thinking about the future?
SHANE MARTIN: 100%. I want to go back to what somebody said earlier– I just want to say that you don’t have to be 75 to be technically challenged. Believe me. But everything seems to be pretty smooth and good on that end. The one thing that stresses me the most, that makes me most anxious, is getting there before it times out– your driver’s leaving in two minutes, or whatever. That’s always my biggest fear, I guess– missing it. And I don’t think I asked about that during my time with it. So that’s what stresses me the most. Is there a time limit? I’m sure there has to be, but how do you handle that on your end, I guess? Or how do you address that?
LAUREN SCHWENDIMANN: It’s a great question, and that is actually one of the learnings that we had from this research and that we’ve seen. So Shane, you’re not alone, we hear that a lot, that–
SHANE MARTIN: Great.
LAUREN SCHWENDIMANN: –knowing– there’s stress involved in finding your car and getting to it before it drives away. And in particular, for folks with disabilities. And so this is an area where we’re really excited to explore what we could possibly do to mitigate that– or to, again, reassure people that they won’t miss their ride. So that work is in progress, but that feedback has been heard loud and clear.
SHANE MARTIN: Great.
MIKE MAY: Yeah, interesting. With so many technologies, it really takes years from when there’s a proof of concept and you can do cool demos. I think of speech recognition– I worked on a project in 1981 at Bank of California where we were thinking, within a couple of years, you’ll be doing banking by speech, and it took another 25 years or more. Indoor navigation– I started working on that in 1996, and it’s just now becoming viable. Where are we in that evolution of autonomous vehicles? How long have they been going, and what can we expect? I’m sure there’s a window or an estimate, and you have to have lots of caveats, but what does the future look like going forward?
LAUREN SCHWENDIMANN: Yeah, no, I think it’s a question on a lot of people’s minds. So just to answer one of your earlier questions– Waymo, which was formerly the Google Self-Driving Car Project, began in the late aughts. And so we’ve been working in this problem space for, I don’t know, 15-ish years– I don’t know the exact number, don’t quote me. But we’ve been working in this space for a long time. And how real is it? It’s real today. We’re operating publicly in Phoenix, an urban city. Anyone can download the app and take a ride. We’re expanding in LA and San Francisco. And so we really believe the future is here. Again, we work to deploy responsibly and in line with public policy and all of those good things, but we really see this materializing today for real people in their everyday lives. So we’ve been working for a long time to make it real, and we do feel like now is the time when people can use it.
MIKE MAY: The future is now, and Ned and I are going to head to Phoenix and check it out in-person. We’ll go ahead and wrap up, and thank you, Jeffrey and Shane from the San Francisco LightHouse. Wonderful having your thoughts and I’m sure you’ll be contributing even more to Lauren’s work.
SHANE MARTIN: Thank you.
JEFFREY COLON: Thank you so much.
MIKE MAY: Yeah. Thank you, Lauren. And I will hand it back over to Brian. It’s been a pleasure being on here. Mike May from GoodMaps, have a good day.