DESCRIPTION

Did that mechanical voice just say it’s safe to cross the street? It’s a dilemma every blind person faces when they are about to step off the curb. What if the camera on the back of your mobile phone could assess the signals and your path to make a crossing safer? A small team of AI engineers at the startup AYES took on that challenge and created the OKO app, which uses computer-vision-based AI to “read” the signals and suggest when it’s safe to cross. How does the app work and just how safe is it?
Speakers
Brandon Biggs
Michiel Janssen
SESSION TRANSCRIPT
[MUSIC PLAYING]
BRANDON BIGGS: Thank you, Alice. It’s great to be here at Sight Tech Global. I’m here with Michiel Janssen, co-founder of the company AYES, which created the app OKO to help blind pedestrians cross the street. Thank you for being here, Michiel. And I’m curious, what inspired you to make OKO?
MICHIEL JANSSEN: Yeah, thanks for that kind introduction, Brandon. And it’s a great pleasure to be on Sight Tech Global. So to start off with how we got inspired: I’m Belgian, originally born in Antwerp, a beautiful town, but we moved to the States, to New York, about six months ago now. About two and a half years ago, we started in Antwerp because we got inspired by a family friend. He’s actually a friend of my parents. Fun fact: one of my two co-founders is also my brother, and the second co-founder is one of my best friends from high school. The three of us are all AI engineers. We studied computer science and then specialized in artificial intelligence, specifically in computer vision, which of course I’ll talk more about with the OKO app. So two, almost three years ago now, we were talking a lot with our blind family friend named Kenny, and we got inspired by his challenges as a blind person navigating the streets of Antwerp. And we thought, as computer scientists: hey, there are self-driving cars out there, Teslas or any other brand, that use cameras to interpret their visual surroundings and then apply a certain magic to it. So we figured, if a self-driving car is able to drive autonomously on a highway at 60 miles an hour, why can’t we use that same technology, computer vision, and apply it to help blind and low-vision pedestrians navigate sidewalks and crosswalks at 2 or 3 miles an hour? That’s how we got inspired and, of course, how we incorporated our company, AYES.
BRANDON BIGGS: Awesome. Cool. Thank you, Kenny, for inspiring this. And yeah, I definitely want to be using the Tesla technology for the self-driving mode. So why exactly would I want to use OKO as a blind person? What can it do for me?
MICHIEL JANSSEN: Yeah, great question. So maybe a bit of a backstory there as well. We figured out quickly that a lot of cities all over the world don’t have many accessibility features in place at signalized pedestrian intersections. Here in the US, that feature is called an APS, or Accessible Pedestrian Signal. I would say that, on average throughout the world, only 5 to 15% of all intersections are made audible or vibrotactile to tell you whether the walk sign or the don’t walk sign is on. To help people at all of them, at 100% of signalized intersections, we wanted to develop an application, which of course is called OKO. What we do is use your smartphone camera and artificial intelligence to visually interpret whether the walk sign or the don’t walk sign is on. Similarly to an APS, we produce an audible cue and a vibrotactile cue, and we also provide visual feedback, so people with low vision can look at their screen to see whether the walk sign or the don’t walk sign is on. And we don’t only detect whether the walk or don’t walk sign is on. What’s interesting as well, and it’s something we noticed more here in the United States, is that a lot of blind and low-vision pedestrians often veer off into traffic, left or right. So we found that a lot of people also use our app to stay on track while they’re crossing the street or getting oriented towards the pedestrian signal. Imagine your parallel street is on the right: people start scanning away from the intersection, holding their phone at chest level, and while scanning towards that parallel street, at some point the camera will find the light across the street, and the app will immediately produce audible or vibration feedback to let you know this is the direction of the pedestrian signal and what the status of that signal is. The veering part comes in when people continue to hold their phone while crossing the street. Since we’re using the camera to detect the signal, the moment you veer off, the camera no longer sees the light and the feedback automatically drops, which is an indication: hey, I’m veering off, and I should rotate my upper body, or my whole body together with my phone, to get back on track. There’s a good saying we always like to use: if you stop hearing, you’re veering. That makes it quite easy to get back on track.
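To make that veering cue concrete, here is a minimal Swift sketch of the feedback loop Michiel describes, under the assumption of a hypothetical per-frame detector output. The types and the playTick helper are illustrative stand-ins, not OKO’s actual code; the key idea is that feedback is emitted only while the signal is in the camera frame, so silence itself is the veering cue.

```swift
import UIKit

// Possible signal states, mirroring the cues described above.
enum SignalState { case walk, dontWalk, countdown }

// Hypothetical per-frame detector output: nil when no pedestrian
// signal is visible in the camera frame.
struct SignalDetection {
    let state: SignalState
    let confidence: Float
}

final class CrossingFeedback {
    private let haptics = UIImpactFeedbackGenerator(style: .medium)

    // Called once per camera frame with the detector's output.
    func update(with detection: SignalDetection?) {
        guard let detection = detection else {
            // Signal left the frame (e.g. the user veered off): feedback
            // simply stops. "If you stop hearing, you're veering."
            return
        }
        haptics.impactOccurred()        // vibrotactile cue
        playTick(for: detection.state)  // audible cue (stub)
    }

    private func playTick(for state: SignalState) {
        // Illustrative stub: e.g. a fast tick for walk,
        // a slower tick for don't walk.
    }
}
```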
BRANDON BIGGS: That’s awesome. That’s a great catchy phrase there: if you stop hearing, you’re veering. That’s awesome. Well, how can I trust that OKO is telling me the right information? Because I know that machine learning often doesn’t get 100% accuracy on things. How can I trust that it’s giving me the correct information and not a false positive, telling me that the street light is on when it really isn’t?
MICHIEL JANSSEN: Yeah, great question. Maybe a good first side note: our technology, of course, is always an assistive technology. It should always be used in addition to your orientation and mobility skills, or your cane, or your guide dog, or a combination of all of them. We’re not replacing any of those; we’re merely an extra tool in your toolbox that makes you more comfortable or more independent getting across the street. Now, talking more about the safety aspect. In the beginning, it took us, I think, almost nine months of development before we released the first OKO version back in Belgium. What’s necessary for our AI is a lot of data of pedestrian signals, but also negative examples, like car signals, because we don’t support car signals. So we aggregated all those specific images, in different weather and light conditions, during the day, during snow. And it was actually pretty easy for us in the beginning to collect that data ourselves, because as pedestrians we could just travel everywhere. We also encouraged our family back in the day: hey, you’re going to Paris, can you shoot some videos of pedestrian signals in Paris? Because ultimately, pedestrian signals look different from one country to another. Taking all these things into consideration, the biggest necessity for our application was simply data. So it took us a while to reach a certain level of accuracy, let’s say. That was also one of the reasons it took a while to make the jump to the US, because the lights are totally different. Here in the US, you have a white walking person to indicate walk and a red hand to indicate don’t walk, whereas in Belgium, or generally in Europe, you have a green walking man and a red standing man. It’s totally different, and the environments are different too; compare New York to Miami, for example. So capturing data in other cities was also necessary. And across the, what is it now, 1.5 million streets that we’ve already helped people cross with OKO, our AI has never made a mistake. That’s partly because we have a fallback mechanism in place. If you think about AI, an image comes in, you predict a certain thing, and there’s a certain output. But we don’t just immediately provide that output to the end user. We have a majority-voting mechanism in place to ensure that we only push that feedback, whether audible or vibration, when we’re highly confident. But again, of course, you should always use the feedback you’re getting from our app in addition to confirming with both your ears, or your other skills, that you can cross the street.
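In the spirit of the majority-voting fallback Michiel mentions, here is a small Swift sketch, reusing the SignalState enum from the earlier sketch. The window size and vote threshold are made-up numbers for illustration, not OKO’s actual parameters.

```swift
// Sketch of a majority-voting gate over recent per-frame predictions.
// Feedback is surfaced only when the recent window agrees strongly.
struct MajorityVote {
    private var window: [SignalState] = []
    private let windowSize = 10   // consider the last 10 frames (illustrative)
    private let required = 8      // agreeing votes needed before announcing

    // Feed one per-frame prediction; returns a state only when the
    // window is confident enough, otherwise nil (stay silent).
    mutating func submit(_ prediction: SignalState) -> SignalState? {
        window.append(prediction)
        if window.count > windowSize { window.removeFirst() }

        // Count votes for each state in the current window.
        var counts: [SignalState: Int] = [:]
        for state in window { counts[state, default: 0] += 1 }

        // Announce only a state that clears the threshold, if any.
        return counts.first(where: { $0.value >= required })?.key
    }
}
```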
BRANDON BIGGS: Got it. Yeah, that’s awesome. So I would probably use this app most when there are no cars around, you know, that horrible time when there’s no parallel traffic to guide me and I need to cross the street.
MICHIEL JANSSEN: Yeah. Especially in those situations where there’s no surge of traffic, it’s very helpful. But a lot of people also use it even when there is a surge of traffic, as an extra confirmation that you’re good to go. Because of course, a lot of intersections are changing these days. One specific thing is the leading pedestrian interval, where pedestrian signals get a 5-to-7-second head start. A lot of people value that you can use our app to catch that walk sign immediately and take advantage of that 5-to-7-second head start. So that’s just one example. But I agree.
BRANDON BIGGS: Awesome, cool. Yeah, that makes me feel a lot better, knowing that out of 1.5 million streets, you haven’t had any problems. That’s pretty confidence boosting. Awesome. Well, I don’t know about you, but when I was born, I only had two hands. One hand is holding my cane, and the other hand is holding something else. So, can you use OKO without hands, say if I’m going to the grocery store or something? Yeah.
MICHIEL JANSSEN: A lot of people say, I need a third hand, and I agree, of course. But what works pretty well for a lot of people is using a lanyard or a pouch. They come in many formats, given that every iPhone or smartphone is a bit different, so you can quite easily buy whatever suits your needs at any store, so that the phone hangs around your neck, or even around your wrist or your belt. We’ve seen some pretty crazy setups, and some people actually make their own lanyard for their specific situation. So a lot of people use it like that as well. Whenever the phone is hanging around your neck, you can just say, hey Siri, open OKO. And that’s the great thing about our app: the moment you open it, there’s no further action required. It immediately starts looking for those pedestrian signals for you, and that’s what makes the hands-free approach work. But of course, it depends on your preference, whether you’d like to hold it yourself or let the lanyard hold it. And fun fact: just recently, about a week and a half ago, we got our own branded lanyards in. We don’t sell them yet; they’re exclusively for conferences we’re going to. But a lot of people have been asking, wow, this is amazing, I need to get one of these.
BRANDON BIGGS: Yeah, that’s great. I need to get one of those. [LAUGHTER] I mean–
MICHIEL JANSSEN: I’ll send you one.
BRANDON BIGGS: OK, cool. Awesome. Yeah, that’d be great. Cool. So we talked a little bit about this. But how does the AI work? Like give me some of the juicy details on the AI. You mentioned you got a lot of data from different countries on different stoplights. But how does the AI itself work?
MICHIEL JANSSEN: Yeah, good question. So basically, the technology we’re using is object recognition. How that works is you have millions of images, some with a pedestrian signal in them and some without, and you annotate the presence of a pedestrian signal and then, more specifically, whether it’s a walk, don’t walk, or even countdown status, because that’s also something we detect. With all those images and training samples, we can train an algorithm to understand, first, what a pedestrian signal is, and then, second, what a walk sign, a don’t walk sign, and a countdown signal are. Again, it all relies on how good your data is and how varied it is. It comes down to those cities, those daylight conditions, and everything else, so it took us quite a while to reach the level we wanted, because of course it’s such an important thing, assisting people to get across the street. But in a gist, it’s just detecting that signal. And this is a good one: it only focuses on pedestrian signals. We don’t support car lights, bicycle lights, or anything else. Visually, it’s looking for the icons: the white walking person, which is the walk sign, and the red hand, which is the don’t walk sign. And without going into too much detail, there’s a lot going on under the hood. It’s not only looking for colors but also contours, and everything together makes up the algorithm’s definition of “this is a walk sign” and “this is a don’t walk sign.”
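For readers curious what per-frame object recognition looks like in code on iOS, here is a rough Swift sketch using Apple’s Vision framework. The model and the label strings ("walk", "dont_walk", "countdown") are hypothetical placeholders chosen to mirror the classes Michiel names; this is not OKO’s actual model or pipeline.

```swift
import Vision
import CoreML

// Sketch: run an object-recognition model over one camera frame and
// report the most confident pedestrian-signal detection, if any.
func detectSignal(in pixelBuffer: CVPixelBuffer,
                  model: VNCoreMLModel,
                  completion: @escaping (String?, Float) -> Void) {
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNRecognizedObjectObservation],
              let best = results.max(by: { $0.confidence < $1.confidence }),
              let label = best.labels.first else {
            completion(nil, 0)  // no pedestrian signal in this frame
            return
        }
        // label.identifier would be e.g. "walk", "dont_walk", "countdown".
        completion(label.identifier, label.confidence)
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

A per-frame result like this would then feed a confidence gate such as the majority-voting sketch above, rather than being surfaced to the user directly.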
BRANDON BIGGS: Got it. So basically it’s a very dependable object recognition system, it sounds like.
MICHIEL JANSSEN: Yeah. Indeed, yeah.
BRANDON BIGGS: Cool. So I know you’re on iOS. Are you on any other platforms?
MICHIEL JANSSEN: Good question. Right now, we’re indeed only on iOS, and the reason is actually twofold. One, of course, we’re a team of seven, so we’re still a small company. And we figured early on that the majority of the community is on iOS, due to the accessibility that was in place from the early days at Apple. So that’s one part of the story of why we’re not on Android or any other platform. But I think the second and most important reason is that our app, the OKO app, relies on AI to tell you whether you can walk or not, and that algorithm is so computationally heavy that we’ve optimized our AI to run locally on your device. That means you’re not relying on a server connection or a wi-fi connection to talk to some cloud service that then tells you the walk sign is on. It’s such a crucial aspect: you need the feedback instantly to know when you can cross. And because it runs locally and requires a lot of computational power, only Apple, with the iPhone, offers us as developers the tools to enable that AI inference on device. That’s not to say there’s no Android device at all that might support OKO. But as a young team, we don’t have the resources right now to go there, because ultimately we wouldn’t be able to support all Android devices, probably only an upper segment, maybe a Pixel or a Galaxy S series. And the trade-off for us is: if Android is already a minority in terms of percentages, then within the Android community, how many people are on those higher-end devices?
BRANDON BIGGS: Yeah.
MICHIEL JANSSEN: So it’s something we don’t know. And I also don’t want to make any false promises. We’re always willing to go to Android, but there are a lot of technical challenges that keep us away from doing Android as well.
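As a rough idea of what “running locally” means on Apple’s stack, the sketch below loads a compiled Core ML model bundled with the app and lets Core ML schedule it across CPU, GPU, and the Neural Engine, with no network involved. The model file name is a hypothetical placeholder, not an OKO asset.

```swift
import CoreML

// Sketch of loading a bundled Core ML model for fully on-device
// inference: no server or wi-fi connection is ever consulted.
func loadLocalModel() throws -> MLModel {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // CPU, GPU, or Neural Engine, all local

    // "PedestrianSignals_US" is a hypothetical compiled-model name.
    guard let url = Bundle.main.url(forResource: "PedestrianSignals_US",
                                    withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }
    return try MLModel(contentsOf: url, configuration: config)
}
```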
BRANDON BIGGS: Yeah, I see that. That makes it so hard when you don’t know what type of hardware you’re interacting with. Yeah.
MICHIEL JANSSEN: Yeah.
BRANDON BIGGS: That’s tough. That’s the one good thing about iOS devices. You know what you’re getting when you get–
MICHIEL JANSSEN: Yeah, exactly. I mean, we support the iPhone 8 or newer. So that’s, what is it now, 10 to 15 devices that you need to support, and that’s quite manageable. But for Android, you’re probably talking about hundreds if not thousands of devices. So it’s pretty complex to crack Android, especially for us.
BRANDON BIGGS: Yeah. Yeah, for sure. Well, it’s great that you support the iPhone 8 and newer. That’s great. So if I’m in another country, how will I be able to use OKO, or how will I know if I can use OKO?
MICHIEL JANSSEN: Yeah, good question. We actually don’t put it anywhere on the website, but we do always mention it in webinars: there are only four countries where OKO is available as of today. That’s our home country, Belgium, then of course the US, and then Spain and Japan. Why only four countries? It all boils down, again, to having data on country-specific pedestrian signals, which differ from one country to another. There are a lot of resources we need to invest in developing another AI model for each country before we can support it. And the only way to switch, and it’s an iffy way, I would say: if you, for example, as a US citizen go to Belgium, you need to go into your Apple settings and change your country from the US to Belgium, so that our app, the OKO app, understands, hey, you’re in Belgium, so it needs to use the AI that was trained on Belgian traffic lights. And of course, down the line, whenever we open up new countries, we hope to automate that based on GPS, so that the moment you arrive in another country, we immediately pick up the right AI to detect that country’s signals.
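The region-based switch Michiel describes could look something like the following Swift sketch, keying off the device’s region setting. The model names are the same hypothetical placeholders as in the earlier sketch, and a future GPS-based version would simply swap the source of the region code.

```swift
import Foundation

// Sketch of picking a country-specific model from the device's region
// setting, matching the workaround described above. Names are hypothetical.
func modelName(forRegion region: String?) -> String? {
    switch region {
    case "US": return "PedestrianSignals_US"
    case "BE": return "PedestrianSignals_BE"
    case "ES": return "PedestrianSignals_ES"
    case "JP": return "PedestrianSignals_JP"
    default:   return nil  // country not supported yet
    }
}

// Usage: resolve from the current locale today; a GPS lookup could
// replace this once per-country switching is automated.
let activeModel = modelName(forRegion: Locale.current.regionCode)
```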
BRANDON BIGGS: Got it. So it’s pretty much just a matter of repeating the data collection and annotation process you’ve done in the US, Belgium, and the other countries for every new country, basically. Yeah, it’s a lot of work.
MICHIEL JANSSEN: It’s a lot of work. But I think another beautiful thing about our app is that, although it doesn’t use any cellular data to give you the status of the signal, there is, of course, an opportunity for people to contribute by sharing their data. What that means is that every time a user is at an intersection and detects a walk, don’t walk, or countdown status, they can share an image of that specific intersection, wherever that may be here in the US. That enables us to collect so many images of particular cities, again in all light conditions. In a sense, users are contributing to the tools that let us make a better product for themselves, but also for everyone else here in the US and even beyond. So that’s pretty beautiful. In the beginning, as I mentioned, we used to collect our data ourselves, and our family collected it as well. But right now, it’s our user base that’s out there every day. And I think it’s pretty cool: last month, we had 180,000 streets crossed. That also translates into so many data points, which we can then use internally to make a better product.
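An opt-in contribution flow like the one described could be sketched as below. The endpoint URL and header are entirely made up for illustration and say nothing about how OKO actually uploads contributed images.

```swift
import UIKit

// Sketch of an opt-in "share this detection" flow: upload the current
// frame plus its predicted label for future training. The URL and
// header field are hypothetical, not OKO's real API.
func shareDetection(image: UIImage, label: String) {
    guard let jpeg = image.jpegData(compressionQuality: 0.8),
          let url = URL(string: "https://example.com/contribute") else { return }

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")
    request.setValue(label, forHTTPHeaderField: "X-Signal-Label")  // hypothetical

    URLSession.shared.uploadTask(with: request, from: jpeg) { _, _, _ in
        // Fire and forget; contribution is best-effort.
    }.resume()
}
```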
BRANDON BIGGS: That’s awesome. That’s a very, very high number of streets crossed. That’s great. So currently, OKO is completely free to users. What plans do you have to start bringing money into OKO so we don’t lose it? Because it would be really terrible if we stopped having access to OKO.
MICHIEL JANSSEN: Great question. It’s linked to our mission and vision. From the very start, our view was that it’s a basic right to explore and go wherever you want to go, so we never wanted to ask the end consumer for money. Right now, we’re backed by venture capital; that’s how we sustain the business. But down the line, as we evolve the OKO app with many other features to come, we ultimately want to work with businesses and governments to also provide accessibility information for, say, a bar, a restaurant, or a city, so that you can get that through our platform as well. By working with those B2B partners, we can sustain our model and still offer our app for free to the end consumer.
BRANDON BIGGS: That’s awesome. So basically, as a user, I can pretty much expect never to need to pay for OKO. That’s basically your mission, it sounds like.
MICHIEL JANSSEN: Yeah, exactly. And that’s something we’ve discovered, right? So many assistive technologies are pretty expensive, and that’s just due to economies of scale. What we figured out is: hey, we need to find a way to have a great impact, and we feel that’s only feasible through a mainstream product, the smartphone, which a lot of people already own, instead of carrying around X, Y, Z assistive devices that you also need to charge and think about. We just wanted to make it very simple, through a smartphone app, and even more so, make it free, so there’s no barrier at all to trying it, seeing if it’s something for you, and using it whenever you want.
BRANDON BIGGS: That’s awesome. So basically, what you’re saying is everybody who’s blind on this call should download the OKO app. And where can they get it?
MICHIEL JANSSEN: Yeah, yeah. It’s a free app that you can get in the App Store. You’ll find it under the full name OKO - AI Copilot for the Blind. And again, it’s only available in four countries: the United States, Spain, Japan, and Belgium. So unfortunately, if you’re not in one of those countries, you won’t be able to download the app as of yet.
BRANDON BIGGS: Got it. Cool. Well, thank you so much, Michiel. This has been an amazing conversation about OKO and the future of crossing streets. And I definitely have OKO on my phone. So I’ll be using it–
MICHIEL JANSSEN: That’s great. And now you’ll have a lanyard as well, which I’ll send you.
BRANDON BIGGS: Yeah, yeah, that’ll be even better, because there are a lot of these computer vision apps coming out that need the phone. That’s awesome. Thank you so much, Michiel. This has been an amazing conversation. And everybody, go download OKO if you’re in one of those four countries.
MICHIEL JANSSEN: Thanks so much, Brandon.
[MUSIC PLAYING]