DESCRIPTION
Like Seleste, ARx is another recently released device designed to take advantage of the tech platforms that surround everyday life with a private, minimally visible, head-mounted device. Both the Seleste and ARx leaders will discuss what they've learned in the course of developing and testing their devices.
Speakers
Lucy Greco (Moderator)
Charles Leclercq (CEO and Co-Founder, ARxVision)
SESSION TRANSCRIPT
[MUSIC PLAYING]
BRIAN: Hands On With ARx. Moderator, Lucy Greco. Speaker, Charles Leclercq.
LUCY GRECO: Thank you for that introduction. It’s really nice to be here at Sight Tech Global, and I am very excited today to be interviewing Charles Leclercq from ARx. And he is the co-founder of the company. And Charles, I’d like you to tell us a little bit about yourself.
CHARLES LECLERCQ: Hi, Lucy. Thank you so much for this opportunity. It’s such an honor to be here at Sight Tech Global. So yes, my name is Charles Leclercq. I’m the CEO and Co-Founder of ARxVision. And yeah, it’s a pleasure to be here.
LUCY GRECO: Excellent. Charles, can we get started with you telling us a little bit about why you actually founded ARxVision? What is it about? Why did you do this?
CHARLES LECLERCQ: So my background is actually in video games. I used to work at Ubisoft in research and development where I worked on a lot of motion-based sensors like the Kinect and the Wii U. And then I worked on blockbuster franchises like Assassin’s Creed and Prince of Persia and I thought, in video games, we use lots of cutting-edge technologies and lots of good design thinking. What if we took all that knowledge and brought it to products that are looking to solve real-life problems? And so that took me onto a long journey of going back to school at the RCA and Imperial College to study innovation design engineering. And to make this short, I ended up working at the BBC at the Future Experience Technology in London. And we were– I was leading the team discovering the potential of augmented reality as a public service. And one of the prototypes we built was an audio AR experience that allowed you to place audio stories in your environment and to curate your environment with audio. And I thought, OK, who needs this the most? And I was in touch with the company formerly called Horus Sight, and they’ve hired me as their CEO, and we rebooted this company into ARxVision with the vision, obviously, to augment reality with audio for the blind and visually impaired.
LUCY GRECO: Excellent. So that’s an amazing background, and I thought I detected some Québécois in that accent from Ubisoft. But it’s fascinating to me how much the game industry is actually leading in accessibility. And the fact that you’re coming from the game industry with an innovation for accessibility just proves that point. So I have with me one of your headsets. And I want to explore this. So this is built on a bone conducting headset, which I really like the idea of being able to keep my ears clear when I’m walking around my environments. And it has a camera up front. And when I read through the literature, it said that there’s two cameras actually in this box, is that correct?
CHARLES LECLERCQ: Yeah, that’s correct.
LUCY GRECO: And what you’ve done is you’ve made some really good tactile controls. One of my big gripes with the world nowadays is that there’s not enough tactile controls, things can’t be felt. And these are really well-feelable buttons, and they’re all shaped and have a– whenever you talk about them in your literature and in your tutorials, you actually talk about the shape of the button, which is– I really like that. And then you also have a volume control on this side, which is very, very tactile as well. So nicely done on the infrastructure of it. I’m just going to put the headset on and show people how I would look with it on. So it fits over my ears really nicely, and I have a badly-shaped head for these things, I have to tell you. So you’ll see, it slips on me a bit, but that’s because I don’t have it fitted properly right now. So it sits on my bone in front of my ear, and then the camera is next to my face. I have one comment for you in that while I was playing with this the other day, it got very warm, and this very warm device next to my cheek was a little uncomfortable. But otherwise, the camera can move up and down. So if I’m looking for, say, something I dropped on the floor, I could point down somewhat and not have to tilt my head too much, or have it point straight ahead of me as I’m walking around to try and get my environment. The system you’ve developed with the echo location sound moving back and forth is interesting. I’m wondering if that’s just telling a person that the device is actually active and if that sound could be turned off. Is that possible?
CHARLES LECLERCQ: That’s a very good question. So first on the–
LUCY GRECO: I’m sorry [INAUDIBLE].
CHARLES LECLERCQ: The first purpose of the– so [INAUDIBLE] headsets as well. Here we go.
LUCY GRECO: Yeah.
CHARLES LECLERCQ: So what’s happening is that the first purpose of the sound you hear and you mentioned is to let the user know that the camera is running. But you would have noticed that it goes from left to right.
LUCY GRECO: Yes.
CHARLES LECLERCQ: And to actually make you aware of the stereo capability of the headset. And so when you, for example, detect an object like your hand or a document, it’s basically designed to make it feel like you hear the audio coming from the object, and that’s going to be incredibly useful in the case of searching for objects and things like this because you can just rely on the echo location.
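(For readers curious how a cue like that can be produced, here is a minimal sketch, not ARx's actual code: a detected object's horizontal position in the camera frame is mapped to left/right channel gains with constant-power panning, so the tone appears to come from the object's direction.)

```python
import math

def pan_gains(object_x: float, frame_width: float) -> tuple[float, float]:
    """Map an object's horizontal pixel position to left/right channel gains.

    Uses constant-power panning so perceived loudness stays roughly even
    as the cue sweeps from the left ear to the right ear.
    """
    # Normalize position to [0, 1]: 0 = far left of frame, 1 = far right.
    p = min(max(object_x / frame_width, 0.0), 1.0)
    angle = p * math.pi / 2           # 0 -> full left, pi/2 -> full right
    left_gain = math.cos(angle)
    right_gain = math.sin(angle)
    return left_gain, right_gain

# Example: an object detected slightly right of center in a 1280-px-wide frame.
print(pan_gains(800, 1280))  # left ~0.56, right ~0.83
```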
LUCY GRECO: Narrowing in on it. I like that, I like that a lot. So let’s get to the purpose of our chat today, which is this great product. But mostly what we want to talk about is your process of creating the product. Tell us how that whole process went. Was it a lot of experimentation? Did everything turn out the way you wanted it? Tell us a little bit about the journey to getting an actual finished product.
CHARLES LECLERCQ: I’d love to do this. And it’s been an incredible journey, and that journey has actually been enabled by the community. So whoever is watching who’s been involved in the process, the entire ARx team thanks you. So basically when I joined ARx, the headset was nearly done, but there was no app for it. And so what we decided to do– obviously we designed the headset to be specifically for blind people. So that’s why you have all these physical touch buttons, so that you can know what you’re doing with just the sense of touch, and that’s useful for sighted as well as non-sighted people. Obviously the product doesn’t cover your face. These are all the things we learned from users and what people wanted. And that’s really the key to this process. I think one key feature is the bone conduction. Obviously this is incredibly useful because earbuds would isolate a person. So if you are isolated, you can’t hear your surroundings properly. But if you have a loudspeaker, then everybody can hear the AI, and that’s not really nice in terms of privacy or being in a public environment. And so in terms of the software, really, what we wanted to do was to fail early and learn. So we didn’t try to create a product in a lab and make it perfect. We were like, OK, let’s get it as soon as possible into people’s hands and let’s see if it crashes or if it survives. And so what we did is that we took the headset and we made an app. And the app at the beginning was a clone of Microsoft Seeing AI or Envision AI, because we didn’t want to make anything new. We just wanted to build what people would expect. And so from there, we launched the first version of our product, nearly two years ago. And it was awful. It just didn’t work. I mean, the software worked, but the interface didn’t work, because what we had done is that we based everything on the graphical user interface model. So you could still use your voice and the buttons of the headset, but you had to interact with menus that were nested in other menus. And so because you couldn’t see them, you had to imagine this whole map of menus. And so we took all the learnings from that experience and we went back to the drawing board, and then we redesigned everything and we actually came up with a conversational interface, something similar to what you have with AI assistants like Alexa or Siri. And then we started that process of having beta testers involved and we kept pushing updates. But I would say that the voice user interface was really the big breakthrough that enabled us to stop that loop. And we’ve basically kept learning from people, and we actually keep doing so even today. Feedback is good. We’re looking at how we can make this as perfect as possible.
LUCY GRECO: That’s actually really, really important, the idea that you use there that you built it to fail and then failing was how you learned. Once you brought blind people in, did you have anything that was challenging, that your assumptions were either excellent or really bad? Something happened when you brought real blind people in to start doing your beta testing. Tell us about that experience of working with people.
CHARLES LECLERCQ: Yeah, absolutely. It’s actually a good opportunity to share that experience. It was not easy. At first I had no preconceived idea of how that would work. But what happened is that when we were ready to ship our first units, it was during lockdown. And so I was just unable to meet in person with the users.
LUCY GRECO: [LAUGHS]
CHARLES LECLERCQ: And so what happened–
LUCY GRECO: That happened to a lot of us, didn’t it?
CHARLES LECLERCQ: Exactly. But how do you start and learn from a physical product when you can’t be in-person in the room and you can’t even be on Zoom? So what happened is that the first trials were a nightmare, and lots of emails and phone calls to be like, no, no, you have to press this button, it was complicated. But what we did is that we built– and that is actually coming from my background in gaming, we built an interactive onboarding experience which takes the user– The first time you open the app, you are taken through this audio experience that teaches you how to plug in the headset, how to press the buttons, what they do, the core principles of interaction. And so through that actual challenge of not being able to meet with people, we came up with, I think, a good solution that is really helpful even today.
LUCY GRECO: I really liked your audio tutorial. It really walked me through quite nicely. I don’t remember being told to plug in the camera, but the audio tutorial went through very nicely. It, first of all, told me to press a button, it told me which button to press. Then it explained, once I did that, what that button did and how to use it again. And that was a really nice audio onboarding. I really liked that onboarding a lot. It was really quite amazing.
CHARLES LECLERCQ: Thank you for saying that.
LUCY GRECO: The one thing I did want to point out to you, because you said you are interested in feedback, is that when I plugged the device in for the first time, the sound level was at 0. And I could not hear my phone, I could not hear the camera, I could not hear anything. So something’s got to happen there– I mean, I couldn’t continue the onboarding until I actually got my husband to help with the volume. Once I hit it on screen, he was able to raise the volume for me. But because I could not hear TalkBack, I could not raise the volume of the device, even though I tried to use the volume switch on the headset.
CHARLES LECLERCQ: And you’re touching on a very interesting point. So what’s happening in the– basically being an app and a hardware product, what’s happening is that– so we’re talking about the Android ecosystem.
LUCY GRECO: Yep.
CHARLES LECLERCQ: Which is super awesome and has lots of accessibility features that we love. But basically, we don’t have as much control as we would like on what happens in the operating system. We have control on what happens within the app. That’s like, we can control everything there. But before you’re in the app, that’s the challenge. And so this is why we– actually, we do need to improve on that part, but we created– I don’t know if you received it, but there is a QR code on the packaging that takes you to a link with an onboarding audio description.
LUCY GRECO: So you brought up two points I want to follow up on now. So let me address the QR code. Yes, that was excellent. That worked really, really well. The braille manual was really lovely as well. Very simple, straightforward. You should get someone to proofread your braille a little bit, because you switch between computer braille and literary braille interchangeably. So one time the URL is spelled out in computer braille and another time it’s not, it’s in literary braille. But it’s actually a really nice, durable little manual. It’s a very cute little booklet with very solid pages, so the braille won’t rub off and disappear. So nice job on the manual. But let’s get back to your decision to use the Android ecosystem. I am an oddball in the United States in that I actually am an Android user and I prefer the Android ecosystem. But a lot of blind people here really do prefer iOS. Is there a specific reason why you guys chose Android versus iOS to start with? I mean, I understand you’re going to work on an iOS product, but what was your motivation to start with Android and then move to iOS?
CHARLES LECLERCQ: Yeah, that’s a very good question. So being on iOS is definitely on our roadmap. We know that most of the audience is on iOS, so we want to be there. The reason why we chose Android is because it’s faster to iterate and to deploy and launch applications. Also in terms of hardware, it is a little bit more complex to do it for iOS. And also, we’re unsure about what’s the situation– what the situation is going to be in terms of USB-C versus Lightning. It’s actually a hot topic right now and no one knows what’s going to happen. Is it going to be one or the other or maybe none of it?
LUCY GRECO: Yeah. Come on, Apple, let’s get with the rest of the world and finally go to USB-C, or at least standardize with everybody else. Stop charging us more for your own standard. I agree with you on that, the USB-C versus Lightning. That’s a key point. I mean, how are you going to get your device to connect? And I see you didn’t make the same mistake that other people who’ve tried to do this kind of thing before have made, in that right off the bat you went for a hardwired connection. You did not take on the latency of Bluetooth, which I like.
CHARLES LECLERCQ: Yeah. That’s actually a very important point. And actually, it is a debate, because the customer expectation, for any category of customer, is wireless products. But a wireless wearable means that you have to charge it, so you have to remember every day, OK, I need to charge my device. That’s friction. That’s an opportunity cost in your everyday life. And then the second point is if it’s wireless, you have to pair the device, which still doesn’t work perfectly. And in terms of data, with USB-C 3.1, we have 200 times more bandwidth than with Bluetooth, so we have much higher-quality images.
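(To make the wired-versus-wireless argument concrete, here is a back-of-the-envelope sketch with assumed, illustrative numbers only; the 200x figure above is Charles's, and real throughput depends on the Bluetooth profile and USB generation actually used.)

```python
# Rough, assumed figures: a 720p JPEG frame of ~150 kB, a ~2 Mbit/s
# effective Bluetooth link, and a 5 Gbit/s USB 3.1 Gen 1 link.
FRAME_BYTES = 150_000
BLUETOOTH_BPS = 2_000_000        # bits per second (assumed effective rate)
USB31_GEN1_BPS = 5_000_000_000   # bits per second (nominal signaling rate)

def max_fps(link_bps: int, frame_bytes: int = FRAME_BYTES) -> float:
    """Upper bound on frames per second a link can carry, ignoring overhead."""
    return link_bps / (frame_bytes * 8)

print(f"Bluetooth: ~{max_fps(BLUETOOTH_BPS):.1f} fps")   # roughly 1.7 fps
print(f"USB 3.1:   ~{max_fps(USB31_GEN1_BPS):.0f} fps")  # thousands of fps
```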
LUCY GRECO: Yeah. I mean, that’s really important, because the AI is doing that image recognition, and the better the image, the more likely it is to get that recognition right. Which I find absolutely fascinating, that we’ve come around to that. I mean, I was an early OCR user in the ’80s. And when you did OCR, the more data you had, the worse it was. You never went above 300 DPI. Nowadays, to get better recognition, you actually need to up your resolution. So that’s really fascinating. I had one question for you about the actual image recognition. Is that done on the device or is it all done in the cloud?
CHARLES LECLERCQ: So, also a very good question. Some of it is done on the device, on the smartphone. None of it is done on the wearable. The wearable is an accessory.
LUCY GRECO: It’s a camera. Yeah.
CHARLES LECLERCQ: Exactly. Well, there is a bit more going on because of the stereo compression and so on.
LUCY GRECO: Yeah.
CHARLES LECLERCQ: But yeah. So yes, when we can do it on device, we do it on device, but sometimes we get much better results using the cloud. And so our vision is, if the internet is available, then we’ll use the cloud. If the latency is low, we’ll use the cloud because it’s just limitless what we can do. But if the internet is not available, if there is no connectivity, we still want our users to be able to use ARx. And so [INAUDIBLE] will still be available. I do know that currently our document reader doesn’t work offline, but it’s only a matter of days or weeks before this is fixed.
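(A minimal sketch of the routing logic Charles describes, using hypothetical names and thresholds rather than the actual ARx code: prefer the cloud when the network is reachable and fast enough, otherwise fall back to an on-device model.)

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class NetworkState:
    online: bool
    round_trip_ms: float

def describe_image(
    image: bytes,
    network: NetworkState,
    cloud_model: Callable[[bytes], str],
    local_model: Callable[[bytes], str],
    max_latency_ms: float = 300.0,
) -> str:
    """Route a request to the cloud when it is reachable and fast enough,
    otherwise fall back to the smaller on-device model."""
    if network.online and network.round_trip_ms <= max_latency_ms:
        try:
            return cloud_model(image)
        except Exception:
            pass  # cloud call failed; fall through to the local model
    return local_model(image)

# Example wiring with stand-in models.
offline = NetworkState(online=False, round_trip_ms=0.0)
print(describe_image(b"...", offline,
                     cloud_model=lambda img: "cloud description",
                     local_model=lambda img: "on-device description"))
```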
LUCY GRECO: I have to say, the number of items that you have on the menu, I never got to the end of it. It’s a large, large list of items a person can look for. Compared to the large list of items that’s in Lookout, you’ve got 20 to 30 times more items in your list. I mean, they stop at table, chair, and dinnerware, but you go all the way into cup, plate, table, all sorts of different things on the list, and I never finished it because this list is endless. It was wonderful, the number of items. Do you ever have a dog in there? Can a person look for their dog?
CHARLES LECLERCQ: Yeah, that’s actually a really good one. So Steve Nutt, who works with us very closely, is always asking me, when can I detect dogs specifically? And so what we want to build ultimately is something that’s probably similar to Lookout, but it’s a funnel where we’re able to detect that it’s a dog, and if we know it’s a dog, then we’re going to load a model that’s very good at detecting the breed of the dog. And so we want to be able to do this for everything.
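(The funnel Charles outlines can be read as a two-stage pipeline; the sketch below uses made-up names and stand-in classifiers rather than any real ARx model: a general detector produces a coarse label, and only when a category of interest such as "dog" appears does a fine-grained specialist model get loaded.)

```python
from typing import Callable

# Hypothetical specialist classifiers, loaded lazily per coarse category.
SPECIALISTS: dict[str, Callable[[bytes], str]] = {}

def load_specialist(category: str) -> Callable[[bytes], str]:
    """Load (and cache) a fine-grained model for one coarse category,
    e.g. a dog-breed classifier once the general detector has seen a dog."""
    if category not in SPECIALISTS:
        # Stand-in for loading real model weights from disk or the cloud.
        SPECIALISTS[category] = lambda image: f"some specific kind of {category}"
    return SPECIALISTS[category]

def detect(image: bytes, general_detector: Callable[[bytes], str]) -> str:
    coarse = general_detector(image)          # e.g. "dog", "cup", "document"
    if coarse in {"dog", "cat", "bird"}:      # categories with specialist models
        return load_specialist(coarse)(image)
    return coarse

print(detect(b"...", general_detector=lambda img: "dog"))
# -> "some specific kind of dog"
```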
LUCY GRECO: I like that. I like it a lot. I think Steve and I are on some of the same listservs and we’re both asking the same questions all the time. Sometimes when I’m out at the park, I’d love to be able to find where my own dog is in the dog park.
CHARLES LECLERCQ: That’s actually a good point. We’re working on the ability to learn things, to teach things to ARx. So currently we have all these modes, like the scene mode, document scanner mode, the short text– what you already have in Seeing AI and so on. And where we want to go is to have a modeless experience, or maybe only two modes. The first mode is what we started to build, the scene mode, where you get scene descriptions and text recognition and face recognition all in one mode so you don’t need to switch modes. And then the other aspect of the user experience is the Learning mode. So you’d be teaching faces, teaching a favorite object or location, or your dog, and then when you are in the Explore mode, you’d be able to detect all that.
LUCY GRECO: Well, you can count on me to beta test that portion. I have a question for you. Are you always going to depend on the camera, or can a person use the camera on their phone? Say, if you’re in the kitchen and you want to recognize the can you’re holding, can you just use the phone with the same application, or are you dependent on your camera? And if the answer is no, you have to use the camera, I would say change it.
CHARLES LECLERCQ: So the way this works is, if we see that there is demand and a need for this, then the answer is we’ll probably do it. So we’ll probably open up access to the phone’s camera. However, our mission is to really offer a hands-free experience. And the thing I hear every day from the community is, I don’t know where the text is, I don’t know where the QR code is, so I have to hold my smartphone in front of me and wave it all around. And that’s also true for scanning a barcode on a box of cereal.
LUCY GRECO: Yeah, no, exactly. It’s just that I don’t want to wear it around the house. If I have to do something in a hurry, I want to be able to not have to go and get rigged up with the headset and do all the rest of it from there. But no, I think this is a really well done product. I mean, we’ve had other iterations that do this, and the hands-free is actually key. I think for moving around and navigating, that’s going to be a really interesting experience. And I can’t wait to get home and try this on my new phone and see how it works with a better phone model.
CHARLES LECLERCQ: But yeah, I really look forward to having your feedback.
LUCY GRECO: Yeah, I’m excited. This is– it’s always fun to learn about new products. I really want to get back to the experience of working with people with disabilities, because I can tell you, every time I give a talk or a presentation, I always point out that it’s really important to bring in people with disabilities. Do you have any stories that can express something you learned from some of the people you worked with that you would never have thought of without them?
CHARLES LECLERCQ: Yeah, absolutely. I think Steve has been really important in that development journey. I’ve visited him a few times and I actually just observed him in his home, how he was interacting with objects, how he was trying to read documents. And basically, when I was building the first version of the document reader, I was using my eyes to scan documents. And then I was giving it to Steve and I was like, yeah, just put the document here, it’s easy. I’ll show you. But Steve, at that point, just couldn’t do it, and that was actually completely fair. But what happened is that I learned from Steve that I shouldn’t use my eyes to track the document. Steve told me we need to verbalize everything you do with your vision so that blind people can actually do it in the same way. And Steve now says that our document reader is the best ever because of that work we’ve done together.
LUCY GRECO: That’s really great to hear. That’s absolutely amazing. Normalizing to a person who is your end customer is something really hard for people to do. Even if that person doesn’t have a disability, when you’re trying to think of what your customers are doing, the engineer back at the lab has their own mindset, and learning to think like the end user, who doesn’t have the same knowledge the engineer does, is a really important factor. And normalizing, like you said. So don’t use your eyes when you’re trying to come up with the steps. I mean, I had this very thing happen yesterday. I was trying to deposit a check with my phone, and normally I just hand it to my husband because I just don’t want to bother with it, but I’ve got to get some independence. So he sat there with me and watched me do it, and he’s like, the instructions it’s giving you are absolutely ridiculous. They make no sense. It kept saying turn document 90 degrees. Well, when I turned it 90 degrees, it still said turn document 90 degrees. And so he’s looking at the screen and said, whatever it’s trying to tell you has no basis in reality.
CHARLES LECLERCQ: Yeah. And we’re running out of time, but I would love to speak about this topic for– I mean, we could speak about it for hours. And really, I would emphasize that the community has been so overwhelmingly active with ARx and that has made a big difference.
LUCY GRECO: Yeah. When you have the community involved, you get a better product. Ultimately the product becomes what the community needs. If you try and give us something we don’t need, we shy away from it, and we end up finding that some of the things built into it just don’t work. I just can’t emphasize enough how important it is to work with our community. And not just ask them to volunteer to do things for you for free, but pay them, interact with them, make them a part of your project, and give them credit. If you eventually have product credits, I’m pretty sure you’ll include Steve’s name in those.
CHARLES LECLERCQ: Absolutely. And actually, we’ve created the ARx Academy which enables anyone to request a feature. And we want to name those features by the name of people who’ve suggested them. So if you get ideas, you know where to look. Go on the ARx Academy.
LUCY GRECO: That’s fantastic. Charles, this has been a wonderful conversation. If you had any last words you wanted to say to the community or to other developers like yourself, what would be the one thing that you’d want them to know going forward?
CHARLES LECLERCQ: Well, I want them to know that the goal of ARx is to democratize wearables, and that these are the early days and we need everyone to get involved because this is the only way we can make a good product that’s going to solve the problem for real.
LUCY GRECO: Excellent. Well thank you very much, Charles, I really appreciate you having this conversation with me. And good luck and be prepared to get a bug list from me.
CHARLES LECLERCQ: Thank you so much, Lucy. It was such a pleasure.
LUCY GRECO: It was a pleasure for me. Thank you.
[MUSIC PLAYING]