DESCRIPTION
Explore how Aira and Haptic are combining tactile feedback technology with human-AI verification to transform navigation for visually impaired users. Learn how this innovative partnership leverages smartphones and smartwatches to provide intuitive, non-visual guidance through vibration motors and verified assistance.
Speakers
- Troy Otillio, CEO, Aira
- Kevin Yoo, CEO and co-founder, Haptic
SESSION TRANSCRIPT
[MUSIC PLAYING]
VOICEOVER: Haptics and Human Touch. Speakers: Troy Otillio, CEO of Aira; Kevin Yoo, CEO and co-founder, Haptic. Moderator: Devin Coldewey, writer and photographer, TechCrunch.
DEVIN COLDEWEY: All right. Well, thank you, and thanks to SightTech Global for facilitating this conversation. Troy and Kevin, thank you for joining me here. Why don't we start out with a little bit of an introduction? I'm sure most of our viewers are familiar with what you do, but maybe not with what you've come up with recently or what you've been working on. Can you give me a short summary of what your company does and what you've been working on lately? Troy, why don't you start us out?
TROY OTILLIO: Well, first off, thanks for having Aira, and me, at SightTech Global. Really excited to be here and share a little bit, and to be here with Kevin, who we've partnered with; that's one of the things we'll talk about. If you haven't heard of Aira, we started as a VC-backed startup 10 years ago. We raised $38 million, and the vision and the motto was that access is a human right. We focused on the blind and low-vision community, building with them, and really created this concept of visual interpreting: remote assistance on demand. We operate a 24/7 service where individuals who are blind or low vision can use our app, our web app, or Meta glasses, different ways to connect, but ultimately you're connecting with a human in the loop. We've augmented that with AI as well. We're a free service, in that individuals can use Aira at access partners. Part of our business strategy is to partner with governments and businesses: states such as Alabama and Colorado, many banks, and a lot of tech companies use Aira as an accommodation. We're in 50 airports and 50 universities. So we're really proud of the progress we've made in providing a free service by partnering with businesses to cover the cost. And because the service is secure and trusted (we're HIPAA compliant, and we really focus on information handling, privacy, and security), we're trusted for all manner of activities: navigation, work at a computer, tasks in the home, reading financial documents, medical documents, COVID tests, pregnancy tests, all the things. With that, I feel like Aira has probably the largest catalog of use cases. We're approaching 10 million sessions.
And with that, you know, we’ve seen a thing or two, to borrow from another tagline, but really glad to be here, share a little bit more about what’s going on at Aira, and happy also to be on the panel with Kevin.
DEVIN COLDEWEY: Yeah. And Kevin, we just met the other day at TechCrunch Disrupt, where I wrote up your company for the Battlefield. I was super excited to have a company focused on accessibility presenting on stage. But why don't you share what you've been working on?
KEVIN YOO: Yeah, I would love to. And thank you, Devin, for that amazing article; that was phenomenal. We've shared it with a lot of our investors and partners, so I really appreciate the great write-up, and I'm happy to be part of this panel with you. We're finally able to showcase our partnerships to the world at a grander scale. Haptic really started as a mission-driven company focused on helping people with technology. At first we were called WearWorks. We created a wearable device called Wayband, and that's what guided the first blind runner through the New York City Marathon, making history. We focus on the other chamber of the senses, which is mainly touch, and the devices we created in the beginning are no longer really necessary, so we converted fully to a B2B SaaS model and rebranded as Haptic, encompassing the entirety of information through touch. Now our haptics technology SDK is being licensed out. It activates the vibration motors of smartphones and Apple Watches, Pixel phones and Pixel Watches, as well as Samsung devices, and we do that in partnership with them to keep developing the future of haptics on the hardware side but also on the software side. So we're kind of just breaking boundaries here, right?
People thought it was impossible to run an entire marathon as a fully blind individual, and that it would be really difficult to do so without a tether and sighted assistance. We were able to break that mold and say, okay, it is possible; what's next? Now we work with Paralympic athletes, blind and sighted, and in the tourism world as well; that's where we see a lot of interest. So I think this conversation is going to get really exciting, because Aira focuses heavily on today's forefront of visual computation and information through agents. What we're able to do with haptic technology is create that more immersive side, so the user can feel calmer with navigation and other types of info; the agent doesn't have to describe everything and overload the audio channel, and we can move toward a more balanced way to communicate through technology.
DEVIN COLDEWEY: Now, the use of haptics to communicate this kind of information is of course familiar to people with visual impairments. Whether it's using a cane or any other kind of touch-based navigation, people have been doing this for a long time, and technology has improved. I've seen various ways of doing this over the years; none has quite caught on, though they've all been really interesting. Why do you think that now is the time and that haptics is the move?
KEVIN YOO: Well, thanks to your article, Devin, the world now knows about the potential of haptic navigation. I started with the mission of really changing people's lives with technology, instead of just a small 10% increase in satisfaction. Through creating prosthetics, I learned a lot in the beginning by visiting blind organizations. Ten years ago, haptics was very rudimentary: you had a buzz for notifications on smartphones, and that was practically it. Over time things have evolved slightly, but still, when you look at a maps app, Google Maps and Apple Maps and so on, these are not accessibility-friendly in a lot of ways. A fully blind and deaf individual would not be able to use Google Maps, for example, to get places safely, or to find ride-share vehicles safely, and so on. So we work a lot with the blind organizations, which are really the core of our customers and our loyal communities, and from there we developed it to fit a universal-language approach, where everybody can use it the exact same way. I think that's part of the difference with us: we really started with a problem, built it out, made history, and then scaled it to a mass-population pipeline, where now anybody with a smartphone can use it right away from our app. HapticNav is a free app worldwide; we now have close to about 100,000 users, organically, and we want to keep it free forever. At the same time, we want to partner with companies like Aira, indoor navigation companies, and the big ride-sharing providers, to create this common ground where we can simply license a technology that works.
We've proven the concept, we've put it to the test, and we've made a lot of movement forward in that field. I think that's what a lot of companies, or maybe individuals, in the past have not done: really put the technology out there, put it to the test, and expand and grow. The last thing I'll mention is that evolving a sense for communication is very difficult. There is a tremendous number of books and articles about sight and audio, from wine tasters all the way to incredible musicians, but when it comes to tactile information there's not a lot written, not a lot of information to learn from. So we're creating that tool base, and also providing the technology in a way that's very easy to integrate; that's part of our major patent that I'm going to talk about in just a moment.
DEVIN COLDEWEY: I've got you. And I want to say that when I tested it out back in the green room, I was really surprised at how natural it felt, to just automatically be like, oh, yeah, now I get it. It literally only took like 10 seconds to figure out.
TROY OTILLIO: I'll add, again, that we have the benefit of tens of thousands of customers who are using all manner of tech. Aira is just another tool in the toolbox; we are often the system of last resort, when all else fails, but it's a tool that gives us great access to both early adopters and the folks who follow. And yes, as you said, there's been a lot of focus on haptic-based interfaces, and what we hear back often is that many of them are distracting. They take some learning and a break-in period, and at the end of the day they require so much cognitive load that they don't really improve the situation. Or there's the situation where the false positives or false negatives aren't easily discerned, so yes, you got the indication, but it's like the classic case with my daughter; I'll pick on her. Left and right are sometimes challenging for her: I'll say, hey, turn left while we're driving, and she'll turn right. In her brain, it's the other left. So I think the haptic interface is really powerful, but you have to get the experience right. Our early results, from not just Aira users but even some of our staff who are blind, were that what they've put together is very quick to learn. I don't even know if I'd call it learning; it's just obvious, and it provides an almost instinctual kind of interface. So my comment, having looked at a lot of haptic interfaces, either through the eyes of my customers or directly: they really have something unique here.
DEVIN COLDEWEY: I certainly thought so. And as you mentioned, the intuitiveness of the experience is very important to people actually using it. Now, you, or rather Aira, recently published a paper. You mentioned that you've integrated AI and vision in lots of ways, and that's fantastic. We all know the capabilities of AI are extremely interesting, but people can also fall victim to a sort of "it seems to work, but it doesn't really work." In this case, you just put out a paper, I think yesterday or very recently, about the need to have a human in the loop with these AI systems, and since you have both, you really can say that with some conviction. Could you explain that?
TROY OTILLIO: Yeah, I'll share a little bit, and a little plug: if you go to our website, you can download it; it's a white paper. I'm always looking for more collaboration in the industry itself, and we're all looking to adopt AI. Every company of significance, whether it's Be My Eyes or Aira, you name it, has integrated the vision-based APIs from all the major vendors; there's no real difference there. I mean, there's some art in how you do the tuning of it, and we've certainly put a lot of effort into that, so that our free AI has the ability to describe an image or a scene and let you interact with it, and we keep working on that. But unique to Aira, we have our agents, so we decided to put in a verification service. The AI is free, right? You download the app for free, you can take pictures, upload images, have them described, interact with them, and for free you can have a real human in the loop, a professional visual interpreter, review that. You're not making a call; that image is just transferred
to an available agent. They take a look at the image and at what the AI produced, and then they correct it if it needs correcting, amend it if it needs amending, or just pass it back as, hey, it looks great. Now, I'm not going to be able to quote all the stats; there's a lot in the paper. But the failure rate, the rate at which the agent feels the need to correct, is pretty high: above 20 to 30 percent. Now, the question is, are all those corrections material? Lots of times, if you're interacting with an image, you have a certain objective, and you don't necessarily declare it. You might ask to have your coffee maker described and take a picture, when what you're really interested in is the water level, or which light is on, but you might not say that initially. So if the AI comes back and says there are two coffee mugs on the left, and it's really not two coffee mugs but two water glasses, is that material? Is that really an error? So there's some nuance in the error rates, and the error rates also vary by use case, by lighting, by all kinds of things. There's a lot more to learn.
As part of our brand, we look to provide a service that is reliable and trusted, and that's why we've added the human in the loop: you get the trust, but you also get the low-cost value of AI delivering visual information. Happy to talk about that more. I don't know that we were surprised, per se, perhaps because we were already looking at the data, but when we put it together in a report (we had a couple of MIT interns working on it, so it was kind of a fun project for them), it was interesting to see the variation in the false-positive rates under different conditions. And that's also going to vary over time, as the AI improves and as different vendors come out, so it's a space worth watching. But I believe, at least from an Aira perspective, relative to the service that we provide, a professional service, a human in the loop is absolutely required. Especially if you're blind or low vision, how else are you going to learn about the quality of the results, except maybe by depending on it and having it send you in the wrong direction or give you the wrong info? So our job is to provide a service, whether that's with humans, with AI, or by combining the two.
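The review loop Troy describes (AI drafts a description, an available agent either passes it through, amends it, or corrects it) can be sketched roughly as follows. This is a minimal illustration, not Aira's actual implementation; the function and field names here are invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class VerifiedDescription:
    ai_text: str       # what the AI originally produced
    final_text: str    # what the user actually receives
    corrected: bool    # whether the agent changed the draft

def verify_description(
    image_id: str,
    ai_describe: Callable[[str], str],
    agent_review: Callable[[str, str], Optional[str]],
) -> VerifiedDescription:
    """Human-in-the-loop verification: the AI describes the image first,
    then an agent reviews the draft. The agent returns None to mean
    "looks great", or a replacement string to correct/amend it."""
    draft = ai_describe(image_id)
    revision = agent_review(image_id, draft)
    if revision is None:
        return VerifiedDescription(draft, draft, corrected=False)
    return VerifiedDescription(draft, revision, corrected=True)
```

Tracking the `corrected` flag across sessions is also what yields a correction rate like the 20-30% figure mentioned above, though deciding which corrections are material (the coffee-mugs-versus-water-glasses problem) still requires human judgment.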
DEVIN COLDEWEY: Yeah. I think what's interesting is that, as you mentioned, of course the AI is going to get better, but the best way for AI to get better is the data that you are now producing. That is an extremely valuable data set: this is a picture or video taken by a person who needs this service, here is exactly what they needed, here is what the AI got wrong, and here is how we corrected it. It's such a valuable thing, and it can only happen with a service that is actually being used by people. So I think that will prove to be an extremely valuable data set.
TROY OTILLIO: I'll say, by the way, just for fun: everyone sees AI as new, and certainly the time is right for it; it's demonstrating great value. But the name Aira stands for Artificial Intelligence Remote Assistance. When we were founded 10 years ago and raised the money we raised, a significant portion of that was put into building our own AI. We failed; the time wasn't right. But we've always had an AI mindset, and we've always believed that over time AI will augment and displace some of the things that humans do. It's really about how you integrate the two in a way that's efficient, useful, and natural, for both the individual seeking the information and the person providing it.
DEVIN COLDEWEY: Totally. Speaking of integration, I understand you two have a partnership going on, some kind of little hookup. What's this partnership about, who's doing what for whom, and what does it look like for users of your apps?
TROY OTILLIO: Yeah, I'll share a little bit of the vision that Kevin and I came up with, and then let him go further; he's obviously the expert on haptics. Today we navigate people all the time, every day. Right now there are probably a great number of folks roaming the planet, indoors and outdoors, doing that with an agent giving verbal instructions. And we train our agents; if anyone's ever taken a flight lesson or knows anything about piloting, there's a language you use with the tower, and we've definitely worked on that. Yet that's not the only information being exchanged during navigation. Often people want to know about their surroundings, about what's present nearby. One of the things that caught my attention early at Aira: someone had started using Aira who worked as a dishwasher, and every two weeks they had to go deposit their check. They had their route; they got on a bus, went to their bank, deposited the check. The first time they did this with Aira, as they were leaving, the agent mentioned there was a Wells Fargo sign right there, just providing that information. And that person got so upset: oh my God, for the last four years I've been spending a half hour to go to a bank, when in fact I could have been going next door. So there's a lot of extra information that people want, yet it's crowded out by the navigation instructions. The vision is to have Haptic integrated such that the agent can either plot routes on their own, with a call back if additional information is needed, or simply make that navigation experience even richer, because the individual is focused on the turn-by-turn through the haptics while the agent is focused on the
larger strategic objective: what the route is, plus the additional visual information they get through the camera. And Kevin can speak to this; there's more beyond haptics, right, Kevin? You have the ability to do some of that turn-by-turn, if you will.
DEVIN COLDEWEY: What’s your side of the story here?
KEVIN YOO: Yeah, I'd love to lean on that. As you were mentioning, Troy, this human component is irreplaceable in a lot of ways, probably for a very, very long time. In terms of how we were received by the blind and vision-impaired community in the beginning, when we first proposed proximity-detection tools, navigation tools, and orientation tools, all using haptic sensations, to NFB and other organizations, the question always comes up: is this going to replace my cane or my guide dog? And the answer is no, it's not. It's an add-on you haven't been able to have, for the sake of confidence and the freedom and independence to move around to new places without concern and without confusion. That's what we want to achieve, and it's the same with our partnership with Aira: we're not trying to get rid of any specific element of interaction, we're just trying to make it more immersive and better. So the topic we're covering here is really that there is a lot of information, and we're constantly bombarded with visual and audio input, and it's only going to get worse. There's going to be tons of noise in cities; they're going to get more booming and bustling at every given moment. With new vehicles, things are changing at a rapid pace, and over time the load on the human mind, with social media and everything else, is just going to keep going up and up. So tactile feedback, for me, is a really important element, because it pierces through a lot of that noise, right through to your core, fundamental understanding, delivering information in a way that doesn't necessarily have to be a distraction. It can be, if it's just constantly a notification. But if you know exactly what that information is, you can compartmentalize it, right?
You can say, "This is not something I want to deal with now; this is something I want to deal with now." So I think of it as a first digestive system for information: people are able to digest it and then decide whether they want to put their cognitive effort into it. For example, say you are a fully blind individual walking around the streets. An agent from Aira is on, describing things, while the navigation information is just fully embedded into the core of your nature. I used to ask the question: how do you feel when you walk somewhere you've gone a million times? You feel it's very natural; there's nothing really on your mind to distract you. It's a calming, very relaxing journey. That's the type of experience we want to provide for somebody going somewhere for the first time ever, and that's what HapticNav does. It's not just turn-by-turn directions, not just left and right turns; it's orientation. You can describe it as hot and cold; you really have to feel it. It's hard to explain in words, but it practically just intensifies when you go in the wrong direction, and then simply gets softer when you go in the right direction, and stops. When I talked to Anil from NFB yesterday, he was saying we've been talking to a lot of the ride-sharing guys, especially Waymo, and these are the discussions that come up: custom honking noises and audio feedback are just not the right answer. HapticNav is going to be one of the major things that changes the game in the industry, identifying obstacles in a really precise and accurate way without that bothersome audio feedback.
Right. So we refined this over the course of many years; it was really just nitpicking at the exact sensation, the range a person will feel confident and comfortable digesting at any given moment, every single time, so that they feel like they are going somewhere as if they've done it a million times. That, really, is the core of HapticNav. And if we can give that to the end user, and at the same time provide only the beautiful parts of the information around them through human-to-human interaction, I believe that will be the best experience a person could have.
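The "hot and cold" behavior Kevin describes (vibration intensifying as you drift off course, softening and stopping as you face the right way) can be sketched as a simple mapping from heading error to vibration strength. This is an illustrative sketch, not HapticNav's actual tuning; the dead-zone width and linear ramp are assumptions made for the example.

```python
def angle_diff(heading_deg: float, bearing_deg: float) -> float:
    """Smallest signed difference between two compass angles, in degrees,
    normalized to the range [-180, 180)."""
    return (bearing_deg - heading_deg + 180.0) % 360.0 - 180.0

def vibration_intensity(heading_deg: float, bearing_deg: float,
                        dead_zone_deg: float = 15.0) -> float:
    """Map heading error to a vibration strength in [0, 1].

    heading_deg: direction the user is currently facing.
    bearing_deg: direction toward the next waypoint.
    Inside the dead zone the motor is silent ("cold", on course);
    outside it, intensity ramps linearly up to 1.0 at 180 degrees
    off course ("hot", walking away from the target).
    """
    err = abs(angle_diff(heading_deg, bearing_deg))
    if err <= dead_zone_deg:
        return 0.0
    return (err - dead_zone_deg) / (180.0 - dead_zone_deg)
```

A real implementation would feed this value into the platform's vibration amplitude control on each heading update; the key design point is that silence, not vibration, signals "you're doing it right", which is what keeps the interface calm rather than a constant notification.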
DEVIN COLDEWEY: Totally, and that goes right into what I wanted to ask about next, which is getting feedback from the community. I know you both are probably connecting with people who use your apps all the time, but it's always interesting to me to hear how that feedback is actually formalized and listened to in a systematic way. When you have thousands of users, and every once in a while someone says, hey, it'd be nice if this thing did this, that's not really systematic feedback, and it's not solicited in a way you can necessarily be constructive with in your own product. So I'm curious: as leaders of your companies, how do you engage with the people that use your apps, and how do you receive that feedback and act on it? Whichever one of you wants to jump on that.
KEVIN YOO: I'll kick it off just briefly, because I have a lot to share about this, in terms of how we've gathered the data: within tech companies; within AARP, for example, which has been a recent investor of ours, with all different age ranges, from children to people 65 and up; and within the blind and vision-impaired community, through NFB and Lighthouses. So the full conglomerate of data we're able to receive is very diverse. For example, we do surveys; we've done about eight, covering indoor navigation in airports, high-congestion zones, event spaces, general navigation, and finding people within these places. What we have learned is that about 87 percent of people who had never used haptic navigation before, of all ages, sighted or unsighted, suggest or prefer vibrational feedback for navigation over any other sense, including vision. That was really great for us; that 87 percent is at the highest rating, five out of five, and the other percentages go down from four, with practically none at the visual baseline. So if we can give people the ability to navigate through touch, that's going to be a very viable product for everybody in the world; that's what we have learned. And specifically with Google, we've done ERG testing with a lot of people there, and that's of course more internal: techies, people who understand the navigation experience very well. They also understand that navigation through Google Maps right now is still lacking in a lot of ways, especially on the accessibility side.
So we're able to get that feedback, plus global feedback from our app itself. That's where I'll start: our information is quite diverse, and the data we're getting is pretty concrete, very solid. That's what we need to showcase to, let's say, the bigger corporations, to say: okay, let's do something big. Let's make a global movement here, where everybody can feel their way and not have to worry so much.
TROY OTILLIO: Boy, we could go a long way here. I'm a veteran of, I guess, Silicon Valley. I grew up as an engineer and product manager at a lot of companies building software; in fact, Aira is my first company whose focus is accessible technology, and I love it. But I, and a lot of the folks at Aira, borrow from companies like Intuit or Google or Apple in how we engage users, and it's a multi-faceted approach: there's high touch, low touch, there's follow-you-home, a lot of different techniques, and we use them all. We have an in-app feedback process that we continually tune, because we want the broadest feedback, not just on the quality of the app but on the quality of the interaction with the agent. The challenge is always getting the full spectrum. It's easy to get feedback from people who are either highly passionate about your product, they love it, or extremely dissatisfied. But how do you reach the majority of folks who just don't have the time or the interest in providing feedback? That's a bit of the art. And I will say this about the blind and low-vision community: they're incredibly patient, and they're incredibly grateful for advances in technology. Sometimes the challenge is getting them to be critical of a product. So we have techniques; it's the art of design research: how do you ask questions that aren't leading, and how do you solicit feedback in a way that gets to the real why, right? The seven questions. So I could go on about the ways we deploy, but you'd find that we use a lot of approaches that larger companies take to collect feedback: in person, observational, through surveys, through the app. I'll say another thing: we get a lot of great feedback when we just go to conferences.
So whether that's NFB or ACB or other conferences like M-Enabling, often there's a concentration of our users, or of people who don't even use our product, and that's often an awesome place to collect feedback, because the mindset at those events is different from the day-to-day and people have more time. So there's just a multitude of ways, but I will say I'm very proud of what we've built, and built with the community. And for anyone out there who's new to the community, if there's one piece of advice I give to companies starting out, it's this: you just have to wade in and ask a million questions. Find the people with the canes and the dogs, walk up to them, make relationships, and learn about their needs, because they're all different, and learn about their mindsets. Even the ability to adopt tech varies, just like in any population, but I think there are some unique challenges here, because part of the ability to wield tech is your comfort with technology, for example with screen readers or VoiceOver in the first place. So I think there's an added degree of difficulty, maybe, in comparison to creating apps or services for sighted people. As you can tell, I'm very passionate about this, and I could talk on about it, but it's key for any company entering this market.
DEVIN COLDEWEY: No, totally. Go ahead, Kevin.
KEVIN YOO: Sorry, Devin. Troy, I'm really excited about this element, and I speak for both of us: we are going to get the first-ever data set of computer vision aligned with haptic navigation. This will be truly the first time people are getting very high-quality live audio feedback, utilizing computer vision, conjoined with the haptic experience. That's the part that I believe is going to be a really amazing data set for both of us, and it will set an example for everybody else in the world.
DEVIN COLDEWEY: Like I mentioned before, I think that’s one of the most exciting things: building out those data sets. It sounds kind of boring, like, oh, we’re building a data set, but it’s so important to have this. Can you tell me a little about that? As a startup, it must be very difficult to get started. You both started a few years back. How do you incentivize founders to focus on innovation and accessibility versus, you know, just getting rich or chasing the latest trend?
TROY OTILLIO: I’ll go first, Kevin, and set this up. It’s going to sound a little pessimistic, but bear with me. I think it’s incredibly hard if you’re talking about someone launching a company in the blind and low-vision space for accessible tech. There are a lot of unforeseen challenges. This is my fourth startup, and there’s just a lot more learning that has to go on, and the funding isn’t the same as for general tech. So you have to be aware of that. And the adoption curve can be slower. It’s harder to market and find people; the self-identification of being blind or low vision isn’t prolific, right? Certainly you can work with organizations like NFB, and they have members, but there are just a lot of hidden challenges. So I always tell people going into the market: go in eyes open. You’re going to win because you have the passion in your heart and because you get engaged; that’s the upside. But it’s going to be a struggle. How do you incentivize? There are obviously financial incentives, but I think the hidden incentive, and I see it from people I hire from outside the industry as well as from people who’ve been in the industry, is that you get a chance to leverage your skills, whatever those might be: marketing, sales, development, even finance, in a way that has an impact on people that isn’t the same as in other industries. That’s certainly what, at the core, powers the staff at Aira, and it’s also what powered us through a lot of the early challenges. There’s one last thing I’ll touch on, though it’s a long conversation.
I think, coming from the VC world, or knowing the VCs, which primarily fund early-stage companies: there’s still a huge gap in awareness of what this market is about and also of what accessibility means. I can tell you that at every startup I was at before Aira, not once did the board or the VCs or anyone ask us, hey, how are you making your product accessible? It’s just not part of that culture yet, and I think there’s a big opportunity to influence that.
KEVIN YOO: Yeah, and there’s a lot to lean on. From my side, I’m learning a lot from guys like Troy, people that have done it multiple times. The most important element I’ve learned is that we also have a lot of people on our advisory boards. We have some of the top people on our side: the former chief business officer of Waymo, for example; Frances West, who spent 37 years at IBM and was its chief accessibility officer; RJ, who was a partner at Deloitte. So we’re able to get a big spread across the different compartments we need: legal elements, ADA compliance, really changing the game in terms of how people visualize, but also feel, the future. And from the startup standpoint, of course, it’s just hard, right? The percentages are there, everything is out in the open, and failure rates are high. If you don’t have the passion and the mission, it’s not going to happen. I do have that time behind me as well: I started a previous company before this focused on sustainability, and then I started this company, which is more tech-focused, that can help people with our tools. We’re mission-driven, and the people that have joined us have been really solid individuals with good hearts, like Troy, people that do it for the right reasons. So I’ve got to say, having the right people on the board, and advisors and mentors, is the main key component to surviving as a startup. You may be a startup founder, but the people around you supporting you are going to be the giants, and by standing on their shoulders and really producing what you set out to do, you can be at the forefront of technology.
I noticed that when I first partnered with Mapbox, which is still a very fantastic partner of ours (we’re actually doing a GloMo Award submission with them right now, and a ton of different partnership projects with map providers), I learned early on that the competitive edge between navigation tools, for example, isn’t that great, right? Obviously Google has all the photos and such that showcase the details of buildings, but when it comes to the actual A-to-B guidance system, it’s practically the same thing: an arrow that points you in a direction, a route that’s based practically on time efficiency, not on safety or noise levels or preferences. It’s just a very straightforward answer to how people feel like they should navigate, and there are a ton of issues, right? Coming out of the subway station going the wrong direction, as we were mentioning before: that still exists in cities. I live in New York, and I still observe that every single day. So when I realized that, going into navigation obviously wasn’t the easiest thing, but to get to the top level of competition: we are using the
same GPS data, we’re still using the same exact sensors for the phone’s orientation, you know, gyroscopes and accelerometers; they’re all identical. All we have to do is provide a different way to communicate that information, through the skin instead of any other method, and layer in different little pieces that make it more accurate. It draws a little more power, but it serves the population that really wants that higher accuracy, that higher orientation. A sighted user may just say, okay, I know the general direction, I’m going to go that way. But somebody who’s blind or visually impaired may really rely on that on a second-by-second basis to know that they’re going along the straight pathway. That’s one thing that Simon, the blind marathon runner, taught me early on: don’t dismiss the capability, the power, of being able to keep somebody in a straight line, especially when they’re running or doing exercises. So that’s why our technology really honed in on that component. But again, to wrap it up: startups are really hard. We talk to Enable Ventures; they’re a fantastic group, and we’re going to have this big meeting at NFB, rounding up a lot of individuals that support this field. Back in the day, VC partners would ask me, how do you measure impact? And that’s a tough question, right? People love it, people use it, and it changes their lives. Is there anything more I have to say, or put into documentation or an Excel sheet, to show you otherwise? I think, Troy, maybe you’ve experienced that similarly as well: impact truly shows in the end users utilizing the technology, telling me in person, this is amazing, right? It changed my life. And I think that’s the part that drives our mission.
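[EDITOR’S NOTE: The straight-line guidance Kevin describes can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not Haptic’s actual implementation; the function names, the 5-degree dead zone, and the left/right intensity mapping are all illustrative assumptions. The idea is simply to take the signed difference between the phone’s compass heading and the target bearing, and map that error to a graded vibration cue on one side or the other.]

```python
def heading_error(current_deg: float, target_deg: float) -> float:
    """Signed smallest angular difference between two compass headings,
    normalized to the range (-180, 180]. Positive means "turn right"."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0

def haptic_cue(current_deg: float, target_deg: float,
               dead_zone_deg: float = 5.0, max_deg: float = 45.0):
    """Map a heading error to a (side, intensity) vibration cue.

    side: "left", "right", or None when on course.
    intensity: 0.0 to 1.0, scaled linearly up to max_deg of error.
    The dead zone keeps the device silent while the user walks straight.
    """
    err = heading_error(current_deg, target_deg)
    if abs(err) <= dead_zone_deg:
        return (None, 0.0)  # on course: no vibration
    intensity = min(abs(err), max_deg) / max_deg
    side = "right" if err > 0 else "left"
    return (side, intensity)
```

A real system would smooth the fused sensor output before cueing; the point here is only the second-by-second mapping from heading error to a directional, graded vibration that keeps a runner or walker on a straight line.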
DEVIN COLDEWEY: Well, you know, we only have a couple of minutes left, but I wanted to ask really quickly: if you’re talking directly to some of your users, or maybe some people who haven’t gotten a chance to use your apps and products yet, is there one thing coming up in the next couple of months that they could look forward to? What would you say that is? Just real quick, so we can end this talk on time. Go ahead.
TROY OTILLIO: Yeah, well, a good CEO will never take the bait of just one, so I’ll give you three. One is, as you might have seen in the press, we’re very proud to be in a US pilot with Walmart. Walmart is obviously a premier brand, but it’s also an access partner prized by our constituents. I mean, it’s a one-stop shop; it’s got everything. So I would encourage anyone, maybe you’re not an Aira user, maybe you are, to go visit Walmart. They’re gauging the value of this solution based on use, so this is your chance to vote with your time. It does include the website, among other things. The other thing I would talk about, which we didn’t get into, is the whole revolution with Meta glasses. I think it’s great to see glasses coming that are affordable, stylish, and have the right performance. I think that’s exciting, and it will also make navigation and other tasks more natural, because you won’t have to hold your phone the way you do today. So those are at least the things I would point to that are coming.
DEVIN COLDEWEY: Nice work. And Kevin, do you want to finish this off here with the one thing that people can look forward to?
KEVIN YOO: One thing is always tough. I think I’m definitely going to go into two, if that’s okay. The first is that we’ve signed a lot of NDAs recently with pretty big companies; I’ll just mention some, like Volkswagen, and we’re now talking with Samsung. The update we’re really excited about, too, is similar to what Troy said with the Meta glasses: we just had a meeting with Meta, and with ride-sharing companies. That’s kind of the core of the approach for the next couple of months, to really adapt our haptic navigation. [INAUDIBLE] Head orientation is perfect for it. It just makes sense.
DEVIN COLDEWEY: I’m looking forward to all this stuff. This is fantastic. Well, it’s been fantastic, too, just speaking with both of you. Thanks for joining me to talk about this stuff. And thanks again to SightTech Global for hosting this conversation. Thank you, SightTech Global. And thank you, Troy.
TROY OTILLIO: Everyone, yeah, thanks, Kevin.
DEVIN COLDEWEY: Take it easy.
[MUSIC PLAYING]