Tech hacks the nervous system to bring touch to virtual reality

"If you really want to engage in a virtual or alternate reality, you have to feel when you touch something."

The era of virtual reality seems to have finally arrived. VR headsets dominated the convention halls of this year’s CES. Apple’s Vision Pro and Microsoft’s HoloLens aim to blend virtual and real spaces into a single augmented reality. CEOs are waxing poetic about the metaverse’s potential to reimagine work, play, socialization, and just about everything else.

While the technology is impressive, it also focuses almost exclusively on two of our senses: sight and sound. But humans experience the world through a panoply of senses beyond our eyes and ears — some say as many as 20. One that is often neglected or absent entirely from these virtual worlds is touch.

“There is no reality to mixed or augmented reality until there’s touch,” Dustin Tyler, the chief science officer of Afference, told Freethink.

Tyler co-founded Afference with Jacob Segil, its CEO, to deepen the virtual experience by bringing realistic, tactile feedback to it. Their prototype device, called the Phantom, is a neural haptic device that doesn’t stimulate touch through a poke or vibrations. Instead, it hacks your nervous system.

[Touch] is not only a function; it’s an actual emotional connection.

Dustin Tyler

All in your head

Tyler has been developing neurostimulation technology for 20 years as a neural engineer researching prosthetics. He notes that while some advanced prosthetics can return fine motor skills to people who have lost limbs, the experience isn’t quite the same as a natural hand. Even when users see themselves, say, gripping an apple, the action can feel disconnected.

“Every person said something like, ‘I want to hold my wife’s hand.’ It’s emotional, right? It’s not only a function; it’s an actual emotional connection. Connection matters,” Tyler said.

In 2012, Tyler and his research team created an implanted sensory interface to add the sensation of touch to prosthetic fingers. Since then, they’ve continued to work on ways to improve the function and feeling of prosthetics. “People would say, ‘It’s my hand grabbing that. I have my hand back,’” Tyler said. “Touch changes all of that.”

Building on Tyler’s research, Afference’s Phantom prototype aims to bring touch into VR and spatial computing. The device is a palmless, fingerless glove. It basically looks like something Billy Idol would rock during the cyberpunk apocalypse (a feature we place firmly in the “pro” column).

Ryan Tyler, Dustin’s son, wears the Phantom prototype while discussing the technology with a CES 2024 attendee. (Credit: Kevin Dickinson / Freethink)

When worn, a user’s fingers pass through rings that rest on the bottom joint. It’s these rings that stimulate the sense of touch — or what Afference calls “haptic illusions” — by conducting electrical signals through the nerves. These signals convince your nervous system that it feels the object you’re interacting with in the virtual space.

Hence the name Tyler and Segil chose for their company: Afference, from afferent, the term for nerve signals that travel toward the central nervous system.

“We’re directly communicating with the brain,” Tyler says. “This technology allows us a lot more flexibility and opportunity. You can gain information from the virtual world. You’re not holding anything, but you can learn about an object by interrogating it with your tactile experience.”

By altering the complexity of these signals, the technology can subtly change the sensations a user experiences. It can be something as simple as the click of a button or something more complex, such as discerning the ripeness of avocados based on their firmness or squishiness. The Phantom can also render more abstract sensations, such as the beat of a virtual speaker crying “more, more, more.” The pulse of Idol’s “Rebel Yell” will even intensify or weaken based on the user’s proximity to it in the virtual space.
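Afference hasn’t published how its signal parameters are computed, but the proximity effect Tyler describes can be pictured with a small, purely hypothetical sketch: a function that scales a beat-driven pulse by the user’s distance from the virtual speaker. Every name and constant below is illustrative, not Afference’s actual software.

```python
# Hypothetical sketch (not Afference's API): scale a beat-driven haptic pulse
# by how far the user stands from a virtual speaker.
import math

MAX_RANGE_M = 3.0  # assumed distance at which the pulse fades to nothing
BPM = 120          # assumed tempo of the track, for illustration only

def pulse_intensity(distance_m: float, t_seconds: float) -> float:
    """Return a 0.0-1.0 stimulation intensity at a given distance and time."""
    # Closer to the speaker -> stronger pulse, fading linearly to zero at MAX_RANGE_M.
    proximity = max(0.0, 1.0 - distance_m / MAX_RANGE_M)
    # Simple beat envelope: peaks on each beat, decays until the next one.
    beat_phase = (t_seconds * BPM / 60.0) % 1.0
    envelope = math.exp(-6.0 * beat_phase)
    return proximity * envelope

# The same beat feels strong up close and faint across the room.
print(round(pulse_intensity(0.5, 0.0), 2))  # ~0.83
print(round(pulse_intensity(2.5, 0.0), 2))  # ~0.17
```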

Afference shows a demo of how its Phantom prototype works at its booth at CES 2024. (Credit: Kevin Dickinson / Freethink)

Hooked on a feeling

At the moment, the Phantom is in the proof-of-concept phase. Afference demoed the prototype publicly at this year’s CES, where it won the Best of Innovation XR Technologies & Accessories award. The next steps include securing seed funding to build capacity and finding partners to experiment with and develop use cases.

“We think this is a game changer for making spatial computing a reality,” Tyler said. “If you really want to engage in a virtual or alternate reality, you have to feel when you touch something. That’s what we’re bringing, and without it, we don’t think [spatial computing] is going to be very successful.”

He points out that while we often focus on sight and sound, it’s touch that has made technologies more relatable in the past. Clicking a mouse made interacting with computer operating systems feel more intuitive. Vibrations in smartphones make actions like typing or rotating the screen feel natural and engaging. And if virtual and augmented realities are to become standard ways for people to work, play, and socialize, they’ll need to reimagine haptic feedback to match their high-definition worlds.

“What’s amazing to me is that your brain begins to understand that you really need that physical environment,” Tyler added. “Tiny things make all the difference.”
