The 2021 Agenda

See the 2020 Agenda here.

  • Main Stage

    Day 1: Welcome and Introduction by Will Butler, Host of Sight Tech Global

  • Main Stage

    Designing for Everyone: Accessibility and Machine Learning at Apple

    Apple’s iPhone and VoiceOver are among the greatest breakthroughs ever for accessibility, but Apple never rests on its laurels. The next wave of innovation will involve what’s known as “machine learning” (a subset of artificial intelligence), which uses data from sensors on the phone and elsewhere to help make sense of the world around us. The implications for accessibility are just starting to emerge.

  • Main Stage

    Seeing AI: What Happens When You Combine Computer Vision, LIDAR and Audio AR?

    The latest features in Microsoft’s Seeing AI app enable it to recognize things in the world and place them in 3D space. Items are literally announced from their position in the room; in other words, the word “chair” seems to emanate from the chair itself. Users can place virtual audio beacons on objects to track the location of the door, for example, and use the haptic proximity sensor to feel the outline of the room. All of this is made possible by combining the latest advances in AR, computer vision and the iPhone 12 Pro’s LiDAR sensor. And that’s only the start. (A simplified sketch of how a detection becomes a spatial cue follows below.)

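    The spatial announcements described above rest on a simple geometric step: turning a detected object’s position relative to the user into a direction and distance that can be rendered as positional audio or speech. The Python sketch below is illustrative only, not Microsoft’s implementation; the coordinate frame, the function name and the clock-face phrasing are assumptions.

      import math

      def describe_object(label, obj_pos, user_pos, user_heading_deg):
          """Turn a detected object's 3D position into a spoken-style spatial cue.

          obj_pos and user_pos are (x, y, z) in meters in a shared world frame
          (x = east, y = up, z = north); user_heading_deg is the direction the
          user faces, measured clockwise from north.
          """
          dx = obj_pos[0] - user_pos[0]
          dz = obj_pos[2] - user_pos[2]
          distance = math.hypot(dx, dz)

          # Bearing to the object (clockwise from north), then relative to the user.
          bearing = math.degrees(math.atan2(dx, dz)) % 360
          relative = (bearing - user_heading_deg) % 360

          # Map the relative bearing onto a clock face: 12 o'clock is straight ahead.
          hour = round(relative / 30) % 12 or 12
          return f"{label}, {distance:.1f} meters at {hour} o'clock"

      # Example: a chair detected 2 m ahead and 1 m to the right of the user.
      print(describe_object("chair", (1.0, 0.0, 2.0), (0.0, 0.0, 0.0), 0.0))

    In an app like the one described above, such a cue would be rendered as spatial audio anchored at the object’s position rather than read out as a flat sentence.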
  • Main Stage

    W3C ARIA-AT: Screen readers, Interoperability, and a new era of Web accessibility

    Who knew that screen readers, unlike web browsers, are not interoperable? Website developers don’t worry about whether their code will work on Safari, Chrome or any other browser, but if they take accessibility seriously, they have to test for JAWS, VoiceOver, NVDA and the rest. That’s about to change, thanks to the W3C ARIA-AT project. (This session will be followed tomorrow by a live breakout session with King and Fairchild, as well as several other members of the W3C ARIA-AT team.)

  • Breakout

    Perkins Access: Using AI to remove digital barriers for math students

    In the world of accessibility, mathematics has been a long-standing challenge. For students, online math instruction and assessment can pose barriers. More specifically, there is no adequate two-way method for getting math and science information, like charts, graphs, tables, or code for equations, to and from a refreshable braille display, which poses a challenge for online assessment. Thanks to innovation, research, and advances in technology, some of these challenges are being addressed. This session will discuss the work NWEA is leading, in collaboration with the Perkins Access digital accessibility consulting team, to make middle school mathematics assessments more accessible for students with visual disabilities using artificial intelligence (AI). NWEA accessibility research manager Dr. Elizabeth Barker was awarded a generous AI for Accessibility grant from Microsoft to help further these efforts.

    Presenters

    • Geoff Freed, Director of Perkins Access Consulting, Perkins Access
    • Elizabeth RG Barker, Ph.D., Accessibility Research Manager, NWEA
    • Dr. Sarah McManus, Digital Learning Director for the Education Services, Governor Morehead School for the Blind
  • Main Stage

    The “Holy Braille”: The development of a new tactile display combining Braille and graphics in one experience

    Today, instant access to the written word in braille is much less available to someone who is blind than the printed word is for someone who is sighted. Tools such as single-line refreshable braille displays have been available for years, but a single line at a time gives the user a very limited reading experience. This limitation is especially felt when users are reading lengthy documents or when they encounter content such as charts and graphs in a textbook. The American Printing House for the Blind (APH) and HumanWare have teamed up to develop a device capable of rendering multiple lines of braille and tactile graphics on the same tactile surface. Currently referred to as the Dynamic Tactile Device (DTD), this tool aims to provide blind users with a multi-line book reader, a tactile graphics viewer and so much more. (This session will be followed by a live breakout Q&A session with Greg Stilson, head of APH’s Global Technology Innovation team, and HumanWare’s Andrew Flatres, Braille Product Manager.)

    Speakers

    • Greg Stilson, Head of Global Technology Innovation, The American Printing House for the Blind
    • Moderator: Will Butler, Vice President, Be My Eyes
  • Breakout

    Fable: The Future of Screen Readers: Key Ideas That Will Not Serve Us Well

    In this session, Sam Proulx, Accessibility Evangelist at Fable and a 30-year screen reader user, will challenge attendees to think about what the future of screen readers might be like in a rapidly changing technology landscape. In conversation with Lynette Frison, Fable’s Community Manager, many of our core ideas about what screen readers are, what they do, how they work, and how we interact with them will be explored and challenged. What is a screen reader without a screen to read, a text-to-speech voice for output, or a keyboard or touch screen to control it? How can a screen reader work with 3D and other non-linear information? What is the place of AI in screen readers? Surely, it’s about more than just recognizing images and OCRing text! Over the past 30 years, we have lived through the change from text-based DOS interfaces to graphical ones, from keyboard-only interfaces to control schemes that use touch screens, controllers, and mice, and from hardware text-to-speech to having everything done in software. However, most of these changes have been incremental improvements. If accessibility is to thrive in a world of augmented reality, wearable technology, and haptic interfaces, the entire framework of the screen reader may need to be rethought from the ground up. But that can’t happen without people with disabilities and assistive technology users. In this interactive and conversational Zoom panel, your thoughts, questions, and ideas will be welcome as we explore the future together.

  • Main Stage

    Indoor Navigation: Can Inertial Navigation, Computer Vision and Other New Technologies Work Where GPS Can't?

    Thanks to mobile phones, GPS, and navigation apps, people who are blind or visually impaired can get around outdoors independently. Navigating indoors is another matter. For starters, GPS is often not available indoors. Then there are the challenges of knowing where the door is, finding the stairs, or avoiding the couch someone moved. Combining on-phone and in-cloud technologies like inertial navigation (dead reckoning from the phone’s motion sensors, sketched below), audio AR, LiDAR and computer vision may be the foundation for a solution, if product developers can map indoor spaces, provide indoor positioning and deliver an accessible user interface.

    Speakers

    • Mike May, Chief Evangelist, GoodMaps
    • Paul Ruvolo, Associate Professor of Computer Science, Olin College
    • Roberto Manduchi, Professor of Computer Science and Engineering, University of California, Santa Cruz
    • Moderator: Dr. Nicholas Giudice, Founder & Chief Research Scientist, University of Maine
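    On a phone, inertial navigation usually means pedestrian dead reckoning: detecting steps with the accelerometer, taking a heading from the gyroscope and compass, and integrating both into a position estimate that drifts unless it is corrected by other signals such as map matching or visual positioning. The Python sketch below is a minimal illustration under those assumptions; the function name, the fixed step length and the input format are not drawn from any speaker’s system.

      import math

      def dead_reckon(start_xy, step_headings_deg, step_length_m=0.7):
          """Integrate step events into a rough indoor position estimate.

          start_xy is (east, north) in meters; step_headings_deg holds one
          heading per detected step, in degrees clockwise from north.
          """
          x, y = start_xy
          track = [(x, y)]
          for heading_deg in step_headings_deg:
              heading = math.radians(heading_deg)
              x += step_length_m * math.sin(heading)  # east component
              y += step_length_m * math.cos(heading)  # north component
              track.append((x, y))
          return track

      # Example: ten steps heading north, then six steps heading east.
      path = dead_reckon((0.0, 0.0), [0.0] * 10 + [90.0] * 6)
      east, north = path[-1]
      print(f"estimated position: {east:.1f} m east, {north:.1f} m north of the start")

    Error grows with every step, which is why this approach is typically paired with indoor maps, LiDAR and computer vision to re-anchor the estimate.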
  • Breakout

    APH: The “Holy Braille”—The development of a new tactile display combining Braille and graphics in one experience

    The American Printing House for the Blind (APH) and HumanWare have teamed up to develop a device capable of rendering multiple lines of braille and tactile graphics on the same tactile surface. Currently referred to as the Dynamic Tactile Device (DTD), this tool aims to provide blind users with a multi-line book reader, a tactile graphics viewer and so much more. Yesterday on the main stage, APH’s Greg Stilson discussed the new device with Will Butler. In this live breakout session, Stilson returns to discuss the DTD’s product development with Andrew Flatres of HumanWare, which is collaborating with APH on the project. (A simple sketch of how a braille cell maps onto the pins of such a tactile surface follows below.)

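    A multi-line tactile surface like the DTD is ultimately a grid of physical pins, and each braille cell occupies a small block of that grid. Purely as an illustration (this is not APH or HumanWare code, and the helper names are hypothetical), the Python sketch below decodes Unicode braille characters into the pin pattern a cell would raise.

      def cell_to_pins(ch):
          """Decode one Unicode braille character into a 4x2 grid of raised pins.

          Unicode braille patterns store dots 1-8 in the low eight bits of the
          code point offset from U+2800; dots 1, 2, 3, 7 form the left column
          and dots 4, 5, 6, 8 the right column.
          """
          bits = ord(ch) - 0x2800
          left, right = [1, 2, 3, 7], [4, 5, 6, 8]
          return [
              [1 if bits & (1 << (l - 1)) else 0, 1 if bits & (1 << (r - 1)) else 0]
              for l, r in zip(left, right)
          ]

      def render_line(cells):
          """Print a row of braille cells as an ASCII pin map ('o' = raised pin)."""
          grids = [cell_to_pins(c) for c in cells]
          for row in range(4):
              print("  ".join("".join("o" if pin else "." for pin in grid[row]) for grid in grids))

      # '⠏' (dots 1-2-3-4) is the braille letter p; '⠗' (dots 1-2-3-5) is r.
      render_line("⠏⠗")

    A device like the DTD has to do this for whole pages at once and interleave text cells with tactile graphics on the same pin grid.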
  • Main Stage

    Day 2: Welcome and Introduction by Will Butler, Host of Sight Tech Global

  • Main Stage

    Why Amazon’s vision includes talking less to Alexa

    As homes become increasingly technology-driven, inputs from multiple sources (teachable AI, multimodal understanding, sensors, computer vision, and more) will create a truly ambient, surround experience. Already, 1 in every 5 Alexa smart home interactions is initiated by Alexa without any spoken command. As Alexa comes to understand us and our homes well enough to predict our needs and act on our behalf in meaningful ways, what are the implications for accessibility?

  • Breakout

    W3C ARIA-AT: Screen readers, Interoperability, and a new era of Web accessibility

    Who knew that screen readers, unlike web browsers, are not interoperable? Website developers don’t worry about whether their code will work on Safari, Chrome or any other browser, but if they take accessibility seriously, they have to test for JAWS, VoiceOver, NVDA and the rest. That’s about to change, thanks to the W3C ARIA-AT project. This session is a breakout follow-up to the main stage session on the same topic held yesterday.

  • Main Stage

    Inventors Invent: Three New Takes on Assistive Technology

    Inventors have long been inspired to apply their genius to helping blind people. Think of innovators like Mike Shebanek (VoiceOver, Apple) or Jim Fruchterman (Bookshare, Benetech), to name just two. Today, innovators have a nearly miraculous array of affordable technologies to work with, including LIDAR, computer vision, high-speed data networks, and more. As a result, innovation is moving ahead at a dizzying pace. In this session, we will talk to three product innovators at the forefront of turning those core technologies into remarkable new tools for people who are blind or visually impaired.

  • Breakout

    HumanWare: The introduction of HumanWare’s new intelligent braille displays

    For nearly 35 years, HumanWare has been at the forefront of developing unique solutions that allow braille readers to interact with the world around them. Join the HumanWare breakout session, where Peter Tucic and Louis-Philippe Massé will help participants better understand the benefits of a refreshable braille device for screen reader users and showcase how HumanWare has continuously raised expectations of what these devices can do to increase productivity. Attendees will come away with a clear grasp of what a refreshable braille device is, why someone would use one, and how HumanWare’s new intelligent braille displays push the envelope in this space.

    Presenters

    • Louis-Philippe Massé, Vice-president of Product Innovation and Technologies, HumanWare
    • Peter Tucic, Director of Strategic Partnerships, HumanWare
  • Main Stage

    Product Accessibility: How Do You Get it Right? And How Do You Know When You Have?

    Accessibility awareness is on the rise, but even teams with the best of intentions can flounder when it comes to finding the right approaches. One key is to work closely with the appropriate communities of users to get feedback and understand needs. The result is not trade-offs but a better product for everyone. In this session, we’ll hear from experts on the frontline of accessibility in product development.

  • Breakout

    WordPress.com: Accessibility on WordPress.com

    A brief look at the features built into WordPress.com that can help site builders with accessibility, as well as how the platform itself is built accessibly. Includes a short demo of editing a site using screen reader software and a Q&A.

  • Main Stage

    For Most Mobile Phone Users, Accessibility Is Spelled Android

    Nearly three quarters of mobile phone users in the world use phones built on Google’s Android operating system, not Apple’s iOS on the iPhone. For people who are blind or have low vision, the key app is Google’s Lookout, which draws on the vast resources of Google’s AI infrastructure, including its computer vision database and Google Maps. How is Google approaching the huge accessibility opportunity Lookout represents?

  • Breakout

    Vispero: The Next Generation of Assistive Technology User Feedback is Here

    Join Mark Miller as he hosts David O'Neill and Matt Ater to discuss how Vispero brands Freedom Scientific and TPGi collaborated to empower JAWS screen reader users to have a voice in surfacing the issues they face navigating websites. Learn how organizations can now easily integrate JAWS screen reader user testing directly into their accessibility programs to improve user experiences.

    Presenters

    • Mark Miller, Director of Sales, Emerging Accounts, Vispero
    • David O’Neill, General Manager & VP, Enterprise Compliance, Vispero
    • Matt Ater, VP, Business Development, Vispero
  • Main Stage

    Getting Around: Autonomous Vehicles, Ride-Sharing and Those Last Few Feet

    Summoning a ride from a smartphone is a dream come true for many, but when you have difficulty finding that ride, even when it’s a few feet away, the experience can be a nightmare, not to mention dangerous. How are ride-share and autonomous taxi companies working to make those last few feet from rider to car safer and better for blind and low vision riders?

  • Main Stage

    Final Remarks: Sight Tech Global Host Will Butler and Executive Producer Ned Desmond
