Making VR more accessible for the hearing impaired

In his latest project, Myles de Bastion shows how VR could help the hearing-impaired in remote settings.

Virtual tools became a crucial part of work during the pandemic — but for those with hearing impairment, these options made navigating work environments all the more difficult.

In “Virtual Worlds Made Accessible Beyond Sound,” deaf artist, musician, and designer Myles de Bastion reveals how VR could address this problem — and what is still needed for the medium to realize this inclusive potential.

The challenge: There are alternatives to audio meetings, but they aren’t great. Auto-captioning tends to lag and isn’t always accurate (Zoom didn’t even offer auto-captioning until November 2020), while video calls often lack the spatial context needed for communication in sign language.

Key quote: “In a Zoom meeting you don’t have any sense of physical space, so you lose a lot of the ability to communicate (in American Sign Language) fluently, and conversations take much longer. VR solves this by restoring the sense of physical space that is needed for ASL,” Myles de Bastion said in his presentation at Open Signal in Portland.

On display: In the project, de Bastion examines how virtual environments in a VR headset could address these problems through sign language and what he calls “environmental captioning.”

Context: VR hasn’t always been conducive to sign language because it traditionally relied on controllers. Oculus announced hand tracking on its Quest headset in Fall 2019: with this feature, users can express themselves through their hands and fingers.

  • Buuuut: Headset-based hand tracking still isn’t perfect, and it still isn’t available in many applications.

  • Hands are perceived through cameras mounted on the headset itself. When one hand blocks the other from view, for example, meaning can be lost.


Caption this: In the presentation, de Bastion shows how environmental captioning could meet users where they are, rather than forcing them to look in one place for all captions. He shows how dialogue can be linked to VR avatars without obstructing others’ views of them, and he encourages other designers to think about coloring, opacity, and shapes that change relative to users’ proximity.
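De Bastion describes the design ideas rather than an implementation, but the proximity-aware styling he encourages can be sketched in a few lines. The function below is a hypothetical illustration: the distance thresholds and the linear falloff are assumptions for the example, not values from his presentation.

```python
def caption_style(distance_m, near=1.0, far=8.0):
    """Fade and shrink an avatar-linked caption as its speaker moves away.

    The meter thresholds and linear falloff are illustrative assumptions,
    not specifics from de Bastion's talk.
    """
    # Normalize distance into [0, 1]: 0 at `near` or closer, 1 at `far` or beyond.
    t = (distance_m - near) / (far - near)
    t = max(0.0, min(1.0, t))
    return {
        "opacity": 1.0 - 0.7 * t,  # fades with distance, but never fully invisible
        "scale": 1.0 - 0.4 * t,    # shrinks with distance, but stays legible
    }
```

A nearby speaker’s caption renders at full opacity and size, while a distant one fades and shrinks without ever disappearing, so the caption tracks its avatar instead of competing for a single fixed spot on screen.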

Just the beginning: De Bastion advocates for more inclusion in the extended reality (XR) industry, writing that “involving (d)eaf & disabled (people) will increase diversity of ideas in virtual spaces … (and accelerate the) mass adoption of VR.”

New norms: “One of the most interesting things to me about immersive tech is seeing people who actively define the norms and protocols in these new kinds of media that are still basically being invented,” de Bastion’s Mentor Sonya Neunzert said in a statement. “Myles is leading the way in defining accessibility as a norm by developing these tools to create more inclusive virtual worlds.”

We’d love to hear from you! If you have a comment about this article or if you have a tip for a future Freethink story, please email us at tips@freethink.com.
