Mind-reading technology – Are we ready for it? 

Startup founders want to read your brainwaves - but how will they protect your data from tech giants?

Neurable CEO Ramses Alcaide has a mission: bring neurotechnology out of research labs — and surgical suites — and into everyday life.

“Neurable exists to make brain-computer interfaces an everyday thing,” Alcaide says. 

He envisions a future where the neurotechnology that allows Neurable’s products to read brainwaves via electrical signals — without surgical implants — is used to track cognitive capability and mental health, like a Fitbit for the brain.

The company is also building towards a future where brain-computer interfaces (BCI) allow people with disabilities to easily control prosthetics and robotics, and even let people who cannot speak communicate again.

The company's current device, still only available for preorder, is a set of headphones called Enten. The headphones are designed to read electrical signals produced by neurons in your brain, a technique called electroencephalography, or EEG for short. 

In an EEG, tiny electrodes — usually placed directly on the scalp in the lab, but in the ear cups of the Enten headphones — pick up the minute electrical signals produced by your neurons as they fire. 

Neurable claims that the EEG sensors inside the normal-looking headphones' ear cups can detect brain activity, which its proprietary algorithms — named after Pokémon — can read, analyze, and decode. The idea is to provide insights into focus and neural activity from, and for, people in daily life.

“The app might note that a person tends to focus more when listening to certain playlists, for example, and recommend queuing those up right before a deadline,” Freethink’s Kristin Houser previously reported. “If it notices they tend to lose focus about an hour before lunch, it might recommend eating earlier to stave off the slump.”
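Neurable has not published how its algorithms work, but a common, textbook-style proxy for "focus" in consumer EEG is the ratio of beta-band power (associated with active concentration) to alpha-band power (associated with relaxed wakefulness). The sketch below is purely illustrative: the single channel, the 250 Hz sampling rate, and the beta/alpha ratio are all assumptions for the example, not Neurable's actual method.

```python
# Illustrative only: a crude "focus" proxy from one channel of EEG data.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sampling rate in Hz, not a published Enten spec


def band_power(signal, low, high, fs=FS):
    """Average spectral power of `signal` within the [low, high] Hz band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    band = (freqs >= low) & (freqs <= high)
    return psd[band].mean()


def focus_score(eeg_window):
    """Beta/alpha power ratio for a single-channel EEG window."""
    alpha = band_power(eeg_window, 8, 12)   # alpha band, roughly 8-12 Hz
    beta = band_power(eeg_window, 13, 30)   # beta band, roughly 13-30 Hz
    return beta / alpha


# Synthetic noise standing in for 10 seconds of recorded EEG.
fake_eeg = np.random.default_rng(0).normal(size=FS * 10)
print(f"focus score: {focus_score(fake_eeg):.2f}")
```

An app like the one Houser describes could then log scores like this over time and look for patterns, such as a dip an hour before lunch.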

But Alcaide’s goal is broader than helping you optimize your focus; it is mainstreaming BCI technology in the everyday world.

What is BCI? 

At its core, the idea of BCI is to use the brain's electrical signals to communicate directly with an outside device — a machine, an app, or some kind of computer. BCI allows you to move a robot's hand to the left, say, merely by thinking about moving your own hand to the left.
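In software terms, that loop can be sketched simply, even though the decoding step is anything but: read a window of brain signals, classify the user's intent, and translate the result into a device command. Everything below is a toy illustration; the decoder is a placeholder and the RobotHand class is invented for the example, not any real robotics API.

```python
# Toy BCI loop: window of signals -> decoded intent -> device command.
from dataclasses import dataclass

import numpy as np

INTENTS = ["rest", "move_left", "move_right"]


def decode_intent(eeg_window: np.ndarray) -> str:
    """Placeholder decoder; a real BCI would use a trained classifier here."""
    # Stand-in "features": mean absolute amplitude in three slices of the window.
    thirds = np.array_split(np.abs(eeg_window), 3)
    scores = [chunk.mean() for chunk in thirds]
    return INTENTS[int(np.argmax(scores))]


@dataclass
class RobotHand:
    position: float = 0.0  # arbitrary units; negative = left, positive = right

    def apply(self, intent: str) -> None:
        if intent == "move_left":
            self.position -= 0.1
        elif intent == "move_right":
            self.position += 0.1
        # "rest" leaves the hand where it is


# One pass through the loop, with random noise standing in for real EEG.
hand = RobotHand()
window = np.random.default_rng(1).normal(size=250)
hand.apply(decode_intent(window))
print(f"decoded intent moved the hand to position {hand.position:+.1f}")
```

Real systems replace the placeholder decoder with a classifier trained on recordings of the user actually imagining each movement.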

While the tech has a distinctly sci-fi feeling to it, BCI is not only quite real, but has been actively researched for half a century.

However, the brain is extremely complicated, and our current understanding of it is limited — it’s a gray box, at best. And since many forms of BCI still require implanted neurotechnology, it’s hard to imagine many people signing up for surgery in (or even near) the brain for any kind of elective reason. 

Noninvasive neurotechnology, like the Enten, is likely to be an important step to feasibility and marketability.

The history of BCI

BCI research began in earnest in the early 1970s in the lab of UCLA’s Jacques Vidal, according to the Mayo Clinic. Vidal coined the term in a paper published in 1973, and he is broadly considered the founder of the field.

Studies in animals, including non-human primates, continued throughout the 1970s; by the end of the decade, human subjects were able to demonstrate BCI by moving an image of a rocket up and down on a screen. 

Research in 2006 showed that neurotechnology implanted into the motor cortex of a man with a spinal injury allowed him to operate a television and a prosthetic hand, as well as move a robotic arm.

Directly implanting neurotechnology in the brain allows researchers to place electrodes right where they're needed and ensures a strong, constant signal. But implants, by definition, require surgery performed with specialized equipment by highly trained personnel, making them expensive and risky.

BCI that uses neurotechnology positioned on the skull, rather than inside it, opens up the door for wider adoption.

In 2011, researchers demonstrated the potential of noninvasive neurotechnology, using EEG-powered BCI to spell words on a computer screen, and research on both implanted and non-implanted options has continued apace. BCI neurotechnology now allows people to translate thoughts into text and control exoskeletons that help rehabilitate hands, and there are even open-source options.

The state of neurotechnology 

Neurable is not the only company trying to bring neurotechnology to everyday life. Most boisterous is Elon Musk's Neuralink, whose efforts to develop commercially available invasive neurotechnology have met with a measured response from neuroscientists (and controversy from animal rights groups).

Musk’s chip-implanted pigs showed that Neuralink’s tech appears to be where the field has been for decades — the ability to see neurons firing and hear the sounds of a brain at work. But his ability to market cutting-edge tech, as seen with Tesla and SpaceX, makes the idea of commercially available Neuralink implants (someday, for some purpose) hard to write off as complete fantasy.

Closer to Neurable are other EEG products like Muse. Marketed as a meditation, sleep, and focus enhancement device, Muse uses technology similar to Neurable's, albeit in headband form.

As researchers learn more about the brain, and as neurotechnology — including not only the hardware, but also the algorithms needed to make sense of all that brain data — continues to improve, it is possible that Alcaide's vision may indeed come to pass.

For Alcaide and Neurable, “opening that technology to the world is really why we exist.”
