DESCRIPTION
People with disabilities and accessibility advocates are working to make sure the metaverse is accessible to everyone. This panel will delve into research on the challenges current virtual and augmented reality tools create for people who are blind or have low vision. The panelists will share their experiences using immersive technologies and explore how these tools can be used to enhance employment opportunities in hybrid and remote workplaces – but only if they are built with inclusion in mind.
Speakers
- Bill Curtis-Davidson (moderator)
- Alexa Huth
- Brandon Biggs
- Aaron Gluck
SESSION TRANSCRIPT
[MUSIC PLAYING]
BRIAN: Virtual Reality and Inclusion – What Does Non-visual Access to the Metaverse Mean? Moderator – Bill Curtis-Davidson. Speakers – Alexa Huth, Brandon Biggs, and Aaron Gluck.
BILL CURTIS-DAVIDSON: Hello, everyone. I’m Bill Curtis-Davidson. I’m really excited to be here today. And I want to thank the Sight Tech Global team for allowing us to come talk with you today about the metaverse and accessibility. We’re really excited to talk about some of the ways that people with disabilities and accessibility advocates are working to make sure that technologies like augmented, virtual, and mixed reality are accessible to everyone. I’m joined by some amazing panelists today who will share their experiences using these technologies and discuss the barriers that exist and the possibilities that the technologies create: Alexa Huth, Aaron Gluck, and Brandon Biggs. So I’m going to ask each of them to give a quick introduction right now. Alexa, would you like to start?
ALEXA HUTH: Of course. It’s wonderful to be here today. My name is Alexa Huth, and I am a white woman with teal hair. And I am PEAT’s director of communications – that’s the Partnership on Employment and Accessible Technology. And my connection to the metaverse is growing. I started losing my eyesight about 13 years ago due to a condition known as lattice degeneration, and I rely on my left eye, but I have a lot of visual challenges. So I’m excited to talk about them today and how the metaverse might make for a more inclusive future. Thank you.
BILL CURTIS-DAVIDSON: Thank you, Alexa. Brandon?
BRANDON BIGGS: Yeah. Thank you for having me. My name is Brandon Biggs. And I am a researcher who focuses primarily on nonvisual maps, [? studio ?] graphic maps, and virtual reality. I am blind as well – basically totally blind, though I do have a little tiny bit of vision; I can see shapes. So I use audio as my primary modality. And I am focused on using audio game conventions – audio games are games that can be played completely in audio – to bring accessibility into the VR and spatial information presentation space. I’m also interested in making headphones, which are the most ubiquitous virtual reality headset, a first-class citizen in virtual reality.
BILL CURTIS-DAVIDSON: Right, thank you, Brandon. And Aaron Gluck, would you like to introduce yourself?
AARON GLUCK: Yeah, hi. I’m Aaron Gluck. I’m a fourth-year PhD candidate at Clemson University, where I study accessibility, both in autonomous vehicles and in virtual reality. I started working with VR about seven years ago, and it only took about two years to start asking the question: how do we make VR accessible for those that VR is not currently made for?
BILL CURTIS-DAVIDSON: Great. I’m really pleased to have each of you here today because you each have such interesting and diverse backgrounds. And really, to start off this discussion, I’d like to ask each of you to comment on some of the reasons why you think these newer immersive technologies are exciting to you. What opportunities do they create? And Brandon, I’d like to ask you to start us off.
BRANDON BIGGS: Yeah, so XR technologies, and virtual reality in particular, are a fantastic tool to represent social interactions and information as close to the real world as possible. And I find in the interfaces that I work on that the cognitive load is very, very low if you make the interface well, and the learnability is really, really high. One thing that I’m very excited about going forward is that blind people will, for probably one of the very first times in history, be able to socialize with everybody around them at the same level as sighted individuals. We’ll have access to nonverbal social interactions that we don’t usually have access to in real life. So I think that that’s going to expand inclusion significantly. I’m also extremely excited about the haptic information and technology that is being created as part of this virtual reality expansion into the metaverse. Vibrations are only part of the haptic experience – we’ve got force feedback and textures as well, and I can’t wait. And I’m very excited that sighted individuals are paying more attention to nonvisual experiences as they relate to XR.
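(A minimal sketch of the kind of haptic cue Brandon is describing, for readers who want to experiment. It assumes a WebXR session with the @types/webxr definitions; hapticActuators is not yet in TypeScript’s standard DOM types and is not available on every browser or controller, so this is an illustration rather than a guaranteed API.)

```typescript
// Sketch: encode the distance to a virtual object as vibration strength,
// one small way to surface spatial information nonvisually.
function pulseForDistance(inputSource: XRInputSource, distanceMeters: number): void {
  // hapticActuators comes from the Gamepad Extensions proposal and is missing
  // from standard TS DOM types, hence the loose cast.
  const gamepad = inputSource.gamepad as any;
  const actuator = gamepad?.hapticActuators?.[0];
  if (!actuator) return; // no haptics on this device; fall back to audio cues

  // Closer objects pulse harder: full strength at 0 m, silent beyond 5 m.
  const intensity = Math.max(0, Math.min(1, 1 - distanceMeters / 5));
  actuator.pulse(intensity, 100); // pulse(value, durationMs)
}
```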
BILL CURTIS-DAVIDSON: Oh, thank you so much. That’s really great to hear. And I’d like to ask Alexa to comment next.
ALEXA HUTH: Sure, there are a lot of different reasons that I’m excited about immersive technologies. Personally, I grew up loving PC games – just any sort of experience that took me out of the daily world and put me in something interesting. So when I first started with VR and I tried an art application, it gave me that same feeling I had as a kid when I played Return to Zork. It gave me a sense that I was really entering something new. And I was so excited about it. And I was able to jump into the world of the art that I was creating and really focus on it with my left eye – the left eye central vision is primarily what I use. And that has so many applications. Thinking about it from a workplace perspective: being able to maybe expand a presentation that I’m seeing so that I can really, actually engage with the content, instead of what I generally do, which is pretend I’m engaging with it. So I think there are a lot of ways to close distance gaps, to spark creativity, and to bring a closeness among people – if the technology is built with inclusion in mind, of course.
BILL CURTIS-DAVIDSON: I love that you point out the creativity part, Alexa. As you know, I began my career as a fine artist, so I love the creative aspect of these technologies. You make some great points. Aaron, what’s getting you excited about these technologies?
AARON GLUCK: For me, it’s really the chance to do whatever we want to do in this space. Our movements are no longer defined by physics. Reality doesn’t have to be reality anymore. The question that always comes to me from people who are new to the metaverse, virtual reality, augmented reality is: what can you do in those types of spaces? And I hate to use it, but I always use the response of, there’s nothing you can’t do. So it leads to this open world where anything and everything is possible. Just because it hasn’t been done yet doesn’t mean that there’s not a way to do it. So it really excites me to be able to bring new experiences to individuals who maybe wouldn’t otherwise have them, and also to allow people with different disabilities to access these environments as well.
BILL CURTIS-DAVIDSON: Great, thanks to each of you for sharing what makes you excited about these technologies. I’d like to talk next about what’s attractive to you as a tech challenge. Each of you mentioned so many different aspects of these technologies, from haptics to audio-only or audio-primary experiences. And each of these has an incredible amount of technology and engineering behind it that is moving at lightning speed. So what’s exciting to each of you about what we’re working on in terms of making this set of technologies more inclusive? Aaron, maybe I can ask you to comment first.
AARON GLUCK: Sure, yeah. For me, it really was the first moment I put on a virtual reality headset. I felt like I was the person I imagined myself to be as a kid. I was the hero in the story. I was there. I was experiencing it. The moment I took it off, I was just hit with this sense of, I have to get everyone I know – and don’t know – into VR. They need to experience it. And so I spent a lot of time in those first couple of years just learning and growing, trying new things in that experience. And it wasn’t actually until a couple of years in that I was talking with my now advisor and accessibility was brought up. And I said, wait a minute, VR is really not an accessible medium, especially for those with visual impairments, since VR is a visual medium. So what can we do? How can we address that? And so I set myself a challenge: I’m going to figure out something that allows no visuals to be needed in the virtual environment. What’s nice, at least with most modern-day headsets or VR systems, is that you have head tracking and hand tracking, you’ve got microphones, you’ve got haptic feedback, and you’ve obviously got 3D spatial audio. And so by using those, I was able to actually build my first nonvisual VR experience, a darts prototype. And it’s just built and grown from there for me. That was a nice, slow, exploratory experience. Since then, I’ve started exploring what it looks like when you need to make instantaneous decisions without visuals – what’s the best method for that? And I’ve actually just started work on a prototype exploring how you do the equivalent of a first-person shooter in a virtual environment. So the challenge as a programmer and developer – how do we take these things that have for so long been visual-based information processing and do them without the visuals? – gets me up in the morning and lets me have a fun time exploring different things.
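(The 3D spatial audio Aaron leans on is available in any modern browser through the Web Audio API. Below is a minimal sketch of an audio beacon a player could aim at by ear; the tone and position are illustrative and not taken from Aaron’s actual prototype.)

```typescript
// Sketch: place a looping tone at a 3D position so a player can localize it
// by listening. PannerNode's HRTF model gives convincing results over headphones.
const ctx = new AudioContext();

function createBeacon(x: number, y: number, z: number): PannerNode {
  const osc = ctx.createOscillator(); // a simple tone stands in for a beacon sound
  osc.frequency.value = 440;

  const panner = ctx.createPanner();
  panner.panningModel = 'HRTF';       // head-related transfer function
  panner.distanceModel = 'inverse';   // volume falls off with distance
  panner.positionX.value = x;         // older browsers: panner.setPosition(x, y, z)
  panner.positionY.value = y;
  panner.positionZ.value = z;

  osc.connect(panner).connect(ctx.destination);
  osc.start();
  return panner;
}

// Example: a target 2 m in front of and 1 m above the listener's origin.
const target = createBeacon(0, 1, -2);
```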
BILL CURTIS-DAVIDSON: Well, thank you for sharing those stories and examples. Brandon, do you have anything you might want to pepper in here in terms of the work you’ve been doing and what challenges you’ve been addressing?
BRANDON BIGGS: Yeah, absolutely. So the biggest challenge that I have is that people have been focusing primarily on visuals, and so there is this whole field of audio and haptics that is very unexplored and significantly underfunded. The world is twice as big, three times as big, if you add on these sound and haptic experiences. And I think they can be just as interesting as the visual displays. So my challenge has been: how do we present information in what’s known as a cross-sensory way, so that the information is accessible equally visually, auditorily, and tactilely? Most of my research has been on how we make these modalities independent of one another, so that we can use them on their own and then eventually combine them together to create a really incredible experience where you can choose what modality you want to use. And that’s the real excitement here. The fields of data visualization and visual virtual reality have been going for decades, and a lot of stuff has been solved – there are still many things that haven’t been, but there’s been a lot of work in that space. But there’s hardly any work in nonvisual virtual reality. And that’s even bigger than visual virtual reality. So that’s really super-exciting.
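(One way to read Brandon’s cross-sensory goal in code: model an event once, and keep each modality’s renderer independent so they can be used alone or combined. All names below are illustrative, not an existing library.)

```typescript
// Sketch: one event model, three independent renderers the user can mix.
type Modality = 'visual' | 'auditory' | 'tactile';

interface SceneEvent {
  label: string;                       // e.g. "door opened"
  position: [number, number, number];  // meters, relative to the user
}

interface ModalityRenderer {
  render(event: SceneEvent): void;
}

const renderers: Record<Modality, ModalityRenderer> = {
  visual:   { render: (e) => { /* draw a highlight at e.position */ } },
  auditory: { render: (e) => { /* play a spatialized earcon and speak e.label */ } },
  tactile:  { render: (e) => { /* pulse the controller nearest to e.position */ } },
};

// The user picks any combination; the underlying event model never changes.
function announce(event: SceneEvent, active: Modality[]): void {
  for (const m of active) renderers[m].render(event);
}

announce({ label: 'door opened', position: [1, 0, -3] }, ['auditory', 'tactile']);
```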
BILL CURTIS-DAVIDSON: I love what you’re pointing out there. And I guess I should pause and say to our audience that if you don’t know a huge amount about these technologies, one thing that you’ll learn more and more is that the hardware is becoming more and more capable of sensing the environment and our movement as we’re wearing it, just like our phones are – and our phones are also getting more of these capabilities, as are some other types of devices – with sensors of various kinds: cameras, spatial mapping sensors, audio sensors. So as we were just hearing in some of these lively examples, I think each of these has a tech challenge to it, right? But first, we need to ignite our imagination. And Brandon, I love that you say that this is under-tapped. People are privileging, in a way, the visual domain, but we really need to unpack these other areas. And I think one way that we can and should do that is to understand the barriers that people are having, because then we have problems we can solve for. And I know each of you is exploring that in your own way. Alexa, I would love to ask you to talk about that very topic. What kind of barriers have you felt when you’ve been using these types of spatial or immersive technologies?
ALEXA HUTH: Sure. I think, obviously, the barriers are different for every person. Because I have very limited peripheral vision, I find that a lot of these experiences place things outside of my visual reach just by default. And I think it would be wonderful if there was a way to customize that. Accessibility options alone, though, are never enough – these should be mainstream options as well. Where do you visually focus, if you do visually focus, would be a really important question for me, because it is my left central vision. And when I hop into any sort of an immersive experience, I’m immediately scrambling. Is there a menu bar, perhaps, below my field of vision? So I’m looking everywhere. It’s very overwhelming. In addition to that, I am very light sensitive. This is a very bright room for me that I’m currently in, and that’s just because we’re recording; I usually like to be in a very dark environment. So when I put on a head-mounted display, I am, again, overwhelmed by how bright it is. It’s pretty painful. If that could be customized down for every immersive experience that I do, that would be wonderful. So I think, really, a lot of the barriers are fixable if we are thinking beyond the pretty – not just about the colors and the “oh, should we put something on the virtual wall?” That’s great, but that’s something that’s going to be virtual clutter for me. And with the outside world, the physical world, not being built for me already, I can only control my apartment. And I would love to be able to control my virtual space as well. So I think there are a lot of barriers, but they are fixable for me personally.
BILL CURTIS-DAVIDSON: Yeah, I think we’ll talk a little bit more about design specifically later. But what occurs to me is that the first step in inclusive design is recognizing exclusion, and then using that as a way to find the problems to design for. And when you talked about dimming the display or focusing, these are things that have been done in other areas, like gaming, for example. We have accessibility features in games where you can highlight core actors or turn off extra audio. And there’s a lot that we can learn from other media. But it does require us, first, to recognize exclusion – number one, because we don’t want exclusion, we want inclusion. But in order to do that, we also need to recognize the value for others. Brandon, do you have any barriers that you’d like to talk about, especially for people who are blind?
BRANDON BIGGS: Yeah, yeah, absolutely. Absolutely none of the mainstream virtual reality platforms are usable for me right now. So I’d say that’s probably the biggest problem. And it’s not that we don’t know how to do this. It’s that none of the platforms have really contacted the right people – they haven’t really implemented any of the changes that are needed for this. And part of that problem goes back to the idea that virtual reality is visual. So I’ve worked with people on the WebXR specification. And it was so inherently visual initially, I had to go through and tell them it’s actually not inherently visual – we just don’t necessarily have the correct displays at the moment to show these experiences in different modalities. But headphones are there. They’ve been around for a really long time. And if you talk to blind people about head-mounted displays, they will say: other than the visual aspect of it, I don’t need this, so why do we even need this thing? It’s not useful. Headphones, though – we wear these for 14 hours a day, and they work perfectly. And a lot of them have head tracking now. So they are full VR headsets. So yeah, basically, audio games have suggested many of the conventions that we can use for social interactions, for interacting with the environment, and for creating an immersive virtual world. The visual virtual reality experiences just need to become as immersive as audio games already are.
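(Brandon’s point that head-tracked headphones already make a VR display can be made concrete with the Web Audio listener: feed the head orientation in, and the audio scene stays world-anchored as the head turns. A sketch, assuming a quaternion from some head-tracking source; newer browsers expose the listener orientation as AudioParams, while older ones use setOrientation instead.)

```typescript
// Sketch: drive the Web Audio listener from head orientation so spatialized
// sounds stay fixed in the world as the listener's head moves.
const audioCtx = new AudioContext();

function updateListener(q: { x: number; y: number; z: number; w: number }): void {
  // Rotate the default forward (0,0,-1) and up (0,1,0) vectors by quaternion q
  // (assumes a y-up, right-handed space).
  const fwd = rotate(q, [0, 0, -1]);
  const up = rotate(q, [0, 1, 0]);
  const L = audioCtx.listener;
  L.forwardX.value = fwd[0]; L.forwardY.value = fwd[1]; L.forwardZ.value = fwd[2];
  L.upX.value = up[0];       L.upY.value = up[1];       L.upZ.value = up[2];
}

function rotate(q: { x: number; y: number; z: number; w: number }, v: number[]): number[] {
  // Standard quaternion rotation: v' = v + 2 * cross(q.xyz, cross(q.xyz, v) + w*v)
  const [x, y, z] = v;
  const { x: qx, y: qy, z: qz, w: qw } = q;
  const cx = qy * z - qz * y + qw * x;
  const cy = qz * x - qx * z + qw * y;
  const cz = qx * y - qy * x + qw * z;
  return [
    x + 2 * (qy * cz - qz * cy),
    y + 2 * (qz * cx - qx * cz),
    z + 2 * (qx * cy - qy * cx),
  ];
}
```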
BILL CURTIS-DAVIDSON: Thank you for all of that, Brandon – those are really wonderful points you’re making. And while we’re mentioning gaming, I should say that there’s plenty of adoption of these technologies in gaming, of course. But I think the adoption of the hardware is skewing toward enterprises, who can maybe spend more money on these devices. They’re still not as commonplace as they will be, and they’re changing rapidly. So what we’re seeing is enterprises or workplaces beginning to use these in different ways, and we’ll talk about that more in a minute. But as we’re talking about barriers and how they exist today – and while we’re still working on all of it – there is an emotional or mental impact. Let’s say you do get a new job, and you’re offered a training scenario that utilizes virtual reality. What happens if you’re blind or have low vision or limited vision? What if the company rolls that out and hasn’t really thought that through? I’d like to ask Alexa to comment a little bit on that, because we do so much at PEAT talking to the employer or adopter crowd. Alexa, would you like to expand on that?
ALEXA HUTH: To pull on that thread you started: imagine that I have started a new job, and I chose not to disclose my disability, which is absolutely fine. But then they give me a virtual reality onboarding experience and expect me to complete it within two or three days. That’s not going to happen. I can maybe be in an immersive environment for 20 minutes, tops. So my choices there are either to disclose – which is a forced disclosure, which should not be happening – or to fake it until I make it and hope that they’re not keeping records showing that it’s taking me a week or longer to complete the modules they expected to be done. These are very exhausting challenges. And they are things that people with disabilities are confronted with multiple times a day. And it shouldn’t be happening, especially in the workplace, because that is just such a barrier. So I’m glad we’re talking about the mental and emotional impact of it, because there are so many times when I’m confronted with these things that aren’t made for me. Like I said, I can only control my apartment environment; I can’t control the virtual environment or the physical environment. And we need to think about these very, very carefully before we roll them out to employees – or even just before suggesting them and saying, oh, this will help, this will fix it. Be careful with that language, because a lot of times you don’t know what set of barriers a person has. Unless the technology is designed flexibly and codesigned with people with disabilities – and I know we’re going to get into design later – there is no way that you can imagine my experience. And so that won’t be baked in, and I will still be excluded. So I think it’s a great thing that people are onboarding with exciting new technology. But it does need to be inclusive first.
BILL CURTIS-DAVIDSON: Great points. And as we’re talking about all of this – what can you do, right? There is fast-moving development, as Brandon, Aaron, and Alexa have mentioned. While there’s no perfect solution, there are a lot of things happening to advance this, and I’d like to ask each of you to comment on that. These include areas like human interaction – communicating and interacting with other real humans through these technologies – what’s happening with hardware and fallback options, input options, et cetera, and also the 3D content space. So maybe I can start with Aaron. Let’s see what comments you have in this area.
AARON GLUCK: Sure. So my specific focus is looking at how we make commercial, off-the-shelf virtual reality systems accessible – an Oculus Quest 2, for example. If you go to the store and buy one, you have head tracking and hand tracking. You have controllers that have haptic vibrations. You have glasses spacers to assist with glasses. You have a microphone that can drive a voice user interface. So there are a lot of things built in to these devices and this equipment. The problem that I specifically see is that we’re not using them to help make it accessible. They’re just there to try to add to the visual experience. So, like Brandon was saying, it’s not looking at how we make these the main focus and lower the visual requirement. The pieces, for the most part, are there. It shouldn’t be hard to change the contrast level or the brightness of the display – that would help Alexa out a lot. Those things are there. What it takes is the time and the realization that, hey, we could do this; it doesn’t take that much. And I know a lot of researchers are starting to explore these areas. The trick becomes, how do we take what’s being researched in academic labs or industry labs and get it to the manufacturers and the developers? And I know we’ll talk about that as we go forward. But it’s really that disconnect between those of us who are passionate about making this experience as accessible as possible and those who build it. And I know that’s going to be a step-by-step, feature-by-feature, modification-by-modification process. It’s not, sadly, going to be a snap-your-fingers-and-it’s-done-overnight thing. But we need to spend that time. We need to show that the research is being done, prove that these things can be done – that we can treat immersive environments in such a way that they’re accessible – and get that into the hands of those who are producing them, who are making the games, making the experiences, doing the virtual trainings, things like that. We need to get them on board and get them connected, bring everybody together, and have those conversations.
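(To give a sense of how small some of these fixes can be: for a browser-rendered, inline XR view, the brightness and contrast settings Aaron and Alexa describe fit in a few lines. The settings shape and storage key below are made up for illustration; an in-headset build would apply the same preference as a render pass rather than a CSS filter.)

```typescript
// Sketch: a persisted, user-controlled display preference.
interface DisplaySettings {
  brightness: number; // 0..1, where 1 is the app default
  contrast: number;   // 1 leaves contrast unchanged
}

const KEY = 'xr-display-settings'; // hypothetical storage key

function loadSettings(): DisplaySettings {
  const saved = localStorage.getItem(KEY);
  return saved ? JSON.parse(saved) : { brightness: 1, contrast: 1 };
}

function applySettings(canvas: HTMLCanvasElement, s: DisplaySettings): void {
  // CSS filters only affect 2D page rendering, not an immersive session.
  canvas.style.filter = `brightness(${s.brightness}) contrast(${s.contrast})`;
  localStorage.setItem(KEY, JSON.stringify(s)); // persist across sessions
}

// Example: halve the brightness for a light-sensitive user.
const canvas = document.querySelector('canvas')!;
applySettings(canvas, { ...loadSettings(), brightness: 0.5 });
```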
BILL CURTIS-DAVIDSON: Yeah, those are great points. Having worked in this area in some detail, including in my most recent role at Magic Leap, I know a lot of these things are literally being developed or have been developed in the last few years – things like voice commands to open applications instead of using a launcher that’s visual and requires a pointing device, or a segmented dimmer, now available in the released product Magic Leap 2, to focus on the content you want to focus on. That could be useful. But again, these things are being developed almost literally in real time. So it’s like a moving train, isn’t it? So Brandon, what thoughts do you have? I’m curious about some of the platforms or developments that are happening that will be useful for accessibility.
BRANDON BIGGS: Yeah. I mean, if you’re building one of these tools, the thing that’s going to get you to accessibility the fastest is to find designers or human-computer interaction experts who have the lived experience. They’re very few and far between, but they do exist. And if you can find those people, they’re going to get you there the fastest, because they’re going to have the research and the background to really take the platform to the next step. The slightly slower option – and both people with lived experience and everybody else should do this – is to find other users with the lived experience and test the different interactions with them. As an example of lived experience dictating the interactions that really work: the nonvisual experience is something that blind people have solved for themselves. They’ve been making virtual reality for the last 20 years, and it’s been extremely immersive. I’m in the midst of writing up a really comprehensive paper on all the interactions that take place in these nonvisual virtual reality experiences, and I’ve asked people how they would expand those out to some of the interactions that are in visual virtual reality but aren’t necessarily in the auditory virtual reality space yet. A lot of it has to do with really basic elements – contact me if you want more information – but it’s stuff like naming every single interactable mesh in the area, having collision sounds, having footstep sounds, having some kind of way to scan around the area using a person’s screen reader, having text information, and having menus where you can interact with the different elements in your room. So there’s a bunch of different interactions you can implement to make your virtual reality experience nonvisual. Blind people have been building these for themselves for a really long time, so it’s just a matter of looking at what they’ve already done. Most interaction researchers who are blind themselves will be able to answer this. And I’m sure it’s the same for other people with lived experience of different types of disabilities – low-vision individuals, or people who are hard of hearing or deaf, all those different groups. So yeah, I think this is an area that really needs more highlighting: what have blind people, or people with disabilities generally, already done in this space that hasn’t hit the mainstream stage yet?
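(Two of the conventions Brandon lists – naming every interactable and letting the user scan the room nonvisually – sketched with the browser’s SpeechSynthesis API standing in for a real screen reader. The object model and clock-face announcements are illustrative choices, not a standard.)

```typescript
// Sketch: every interactable object carries a human-readable name, and a
// "scan" speaks nearby objects aloud, nearest first.
interface Interactable {
  name: string;                        // "wooden door", "coffee mug", ...
  position: [number, number, number];  // meters, relative to the user
}

const room: Interactable[] = [
  { name: 'wooden door', position: [0, 0, -3] },
  { name: 'coffee mug', position: [1, 1, -0.5] },
];

function speak(text: string): void {
  speechSynthesis.speak(new SpeechSynthesisUtterance(text));
}

// Announce everything within `radius` meters, with a rough clock-face
// direction so the user can orient without any visuals.
function scan(radius: number): void {
  const within = room
    .map((o) => ({ o, d: Math.hypot(...o.position) }))
    .filter(({ d }) => d <= radius)
    .sort((a, b) => a.d - b.d);
  if (within.length === 0) { speak('nothing nearby'); return; }
  for (const { o, d } of within) {
    speak(`${o.name}, ${d.toFixed(1)} meters, ${clockDirection(o.position)} o'clock`);
  }
}

function clockDirection([x, , z]: [number, number, number]): number {
  // Forward is -z; convert the bearing to a 1..12 clock position.
  const angle = Math.atan2(x, -z); // radians, 0 = straight ahead
  const hour = Math.round((angle / (2 * Math.PI)) * 12);
  return ((hour + 12) % 12) || 12;
}
```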
BILL CURTIS-DAVIDSON: All right, plus one to everything that you just said. To me, what’s so exciting about this is that tapping that lived experience is really a lever for innovation. And I love what you said about the quick way – who doesn’t want to do that, right? You need to have people with these diverse lived experiences on your team, because they’re going to come up with angles that you would never explore on your own. I can speak as someone who considers himself an accessibility and inclusion advocate but is not a person with a disability. So I count myself in that club of people who need to interact with people with lived experience, and I feel like that’s imperative – it’s not an option. And I love some of the work we’re doing on 3D content descriptions with groups like Equal Entry and the XR Association, where we’re literally working with people who are blind or have low vision to figure out: how do you actually navigate inside containers of space, from the high level going down into an environment, into a building, into rooms, into spaces within rooms, and even furniture? How do you actually experience that, number one, in the real, physical world? And what would you expect to have in virtual? It is so productive to work with people who have the lived experience. So I just wanted to emphasize another concrete example of things being done right now. With the time we have left, I want to quickly ask Alexa to talk a little bit about the workplace, because at PEAT, we do quite a bit along that line. Alexa, what are your comments about the future possibilities and imperatives for the workplace?
ALEXA HUTH: I think that there are a lot of interesting applications. As you mentioned, at P-E-A-T-W-O-R-K-S dot O-R-G, we have an inclusive XR and hybrid workplace toolkit. And that will help you get a sense of exactly what you need to do to understand how these technologies can apply to the workplace in inclusive ways – what you should be looking for, how you should be using them. We also have a joint white paper with the XR Association that brought up some really interesting applications for the workplace. The one that I think is exciting is training, especially in the medical industry. If you’re a person like myself, you’re often at the eye doctor, and often they’ll have a training doctor come in and look at my retinas. So I usually go through three or four doctors before I’m done with a typical visit. If there was skills-based, immersive, virtual training, perhaps they could learn that virtually. And that would be great. I would love training like that – not training that replaces lived experience, but because I do have a condition that is not common, if people could experience it in an immersive environment and train with it and learn about it, even just to catch the signs – because mine weren’t caught early enough, since it wasn’t something the person was expecting to see. So training in the health care field, I think, would be a really great application. And it’s already happening in certain hospitals and certain training environments. I think the pandemic really sped that up, because people weren’t able to do the type of hands-on practice that they could before. So that is something that I look forward to – not that I would ever begrudge somebody learning from my situation. I think there is some really great skills-based training and upskilling that can happen in immersive environments.
BILL CURTIS-DAVIDSON: Yes, thank you so much. And I’m glad you mentioned our PEAT inclusive XR and hybrid workplace toolkit, where we go into detail on, for example, meetings – which apply to every other use case, which is why we chose that area. What do you need to think about before you try to engage with other people? What do you think about while you are engaging with them? And it’s broken down into areas like language and communication mode. So if you’re communicating, for example, with sign language, how do you actually incorporate that into VR? If you’re doing a VR meetup like we do with Equal Entry, you may actually have to put a video feed into it and have a sign-language interpreter who is always present. Again, how you do these things is an evolving art. But we need to do it, and it’s important to look at that for inclusion in the workplace. We just have a few minutes left, and I’d like to wrap up. We’ve talked about all of these challenges and opportunities. What I’d like each of you to comment on is: what should designers, developers, and even implementers do now if they want to get more involved in this? We want a big tent here – I think all of us would agree. How do you get involved? Maybe I can start with you, Brandon.
BRANDON BIGGS: Yeah, if you really want to get more into nonvisual experiences of virtual reality, I would contact the research institutions that focus specifically on a particular disability. For example, the Smith-Kettlewell Eye Research Institute is all about vision. And Gallaudet University, I think, has a research institute focused on individuals who are deaf and hard of hearing. So focus on contacting those places. If you want low vision stuff, I’m probably a good person to talk to about that. But also, think about each of these different disabilities as a different challenge: how do you create this experience for those individuals, and what input devices or output devices would they be using? I keep going back to this idea of headphones. And once we do have these experiences, how is that going to increase the accessibility of VR for individuals of all different types of abilities? For example, headphones just happen to be in almost every single household. So that lowers the barrier to adoption for virtual reality, because everybody already has the tools to use it.
BILL CURTIS-DAVIDSON: Yes, thank you very much. Aaron, what are your thoughts?
AARON GLUCK: Yeah, I think we’re at an interesting tipping point: VR, XR, AR is just not as accessible or equitable a technology as it should be, and more and more people are starting to have these conversations. I think businesses and developers are starting to realize that the ADA laws are going to start being enforced in immersive environments, as happened with the web. And so we need to start looking at these – I don’t want to call them problems, but challenges: how do we develop systems that are going to be more accessible? How are we going to develop applications that are going to be more accessible? I think it just leads to a change in mindset: here, we’re going to take an extra two weeks to get this product out, and we’re going to build in X, Y, Z to give more accessibility to what we’re doing. And I think as developers and manufacturers start doing these small things, they’re going to see their sales increase. I mean, if you can make something accessible to a group of people it wasn’t accessible to before – let’s say it’s just 100,000 people with something specific that you can create an accessibility feature for – you now have 100,000 new potential customers. And not that it’s about that bottom dollar, but from a manufacturing and development standpoint, it often is: we have a budget, and we’ve got to go with that. So the manufacturers and designers really, like I mentioned before, need to spend the time: reach out to researchers who are looking into this, reach out to the community of people with the disabilities that they’re interested in designing for, bring them on, bring in a consultant, do codesign, have someone with that lived experience who can help get that through. I think that’s where we’re going to really see an uptick in accessibility features as we go forward. And there is plenty of room for that to be done. We just need to realize that taking an extra couple of weeks to get a brand-new game out is OK – people can live with that. Make it more accessible for everyone.
BILL CURTIS-DAVIDSON: Thank you so much. I totally agree. Alexa, please give us your thoughts really quick.
ALEXA HUTH: Before I do, Brandon had said people could reach out to him. Brandon, could you give an email address or something where they could reach you?
BRANDON BIGGS: Yeah, brandon.biggs@xrnavigation.io. And it should be somewhere in the meeting notes.
ALEXA HUTH: Perfect. Aaron, would you like to give contact details too, before I wrap up with my answer?
AARON GLUCK: Sure. The easiest way to get in touch with me is amgluck, G-L-U-C-K, @clemson.edu.
ALEXA HUTH: Great. I just didn’t want to shout out Peatworks without giving everybody a chance. But I absolutely think that Brandon and Aaron have nailed it. I would just encourage people to think about digital curb cuts – making sure that you’re not thinking about it as just, oh, I’m helping X people. No, it’s better for the entire design. So you need to consider what benefits there will be for everyone. Closed captioning is a great example. Will your product have the next closed captioning? It might, if you design with people with disabilities at every stage. So that’s what I’d like to advocate for, over and over again.
BILL CURTIS-DAVIDSON: Great, thank you so much to all three of you for an excellent discussion. I feel energized every time I have these discussions – and I have a lot of them – but this has been really special, and I really appreciate each of your unique perspectives. And again, please get involved in the community. XR Access is a community at xraccess.org you can get involved in. You can get involved in the XR Association, one of the leading trade associations, at xra.org. You can also get involved with groups like Equal Entry (equalentry.com), which runs a near-monthly accessibility VR meetup that people like Aaron have spoken at in the past; they have an incredible library of recorded sessions as well that are accessible. And finally, check out PEAT – that’s P-E-A-T-W-O-R-K-S dot org. Subscribe to our newsletter. Listen to our podcasts. And just stay informed and get involved. We need everyone to make this as good as possible, and it’s really important for our future. So I’d like to thank the Sight Tech Global team again for having us today. And thank all of you for joining us.
[MUSIC PLAYING]