“Blind” robot hand can operate solely by touch

It can manipulate objects in complete darkness.

Columbia University researchers have built a robot hand that can deftly manipulate objects without seeing them — allowing it to work in complete darkness.

The challenge: We’ve designed our environment around the dimensions and capabilities of the human body, so for robots to navigate our homes and workplaces, some believe we need to make the bots as “human” as possible.

One of the biggest challenges in that effort has been replicating the human hand, which has 27 degrees of freedom and remarkable touch sensitivity — capabilities that let us manipulate objects while knowing exactly how much pressure to apply when grasping them.

To give their robots a sense of “touch,” some developers equip them with cameras so that the bot’s brain can analyze footage of an object and tell the hand how to grasp it. But cameras require sufficient lighting, which isn’t always available, and appearances can be deceiving — an object that looks firm might be flexible, for example.

“Robot hands can also be highly dexterous based on touch sensing alone.”

Matei Ciocarlie

What’s new? A team at Columbia University has now unveiled a new robot hand, and while it doesn’t exactly look human (it has five identical fingers sticking up from a flat, pentagon-shaped base), it is nimble and able to feel objects without seeing them — just like our hands.

“In this study, we’ve shown that robot hands can also be highly dexterous based on touch sensing alone,” said co-author Matei Ciocarlie.

A paper detailing the work has been accepted for publication at the 2023 Robotics: Science and Systems Conference in South Korea and is currently available as a preprint on arXiv.

How it works: The new robot hand builds on advanced touch-sensing technology the Columbia researchers unveiled in 2020 as a single robot finger.

That finger featured many small lights shining outward. By measuring how those lights bounced off the inside of a skin covering the finger, the researchers could tell when pressure was being applied to the outside of the digit, allowing them to give the finger a sense of touch.
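The core idea — inferring touch from how internal light readings change when the skin deforms — can be illustrated with a toy sketch. This is not the Columbia team's actual sensing pipeline; the function names, the linear calibration map, and the threshold are all illustrative assumptions.

```python
import numpy as np

def estimate_contact(baseline, reading, calibration, threshold=0.05):
    """Toy contact estimate from a finger's internal light intensities.

    baseline:    light readings with nothing touching the finger
    reading:     current light readings
    calibration: illustrative linear map from intensity changes to pressure
    """
    delta = reading - baseline          # skin deformation redistributes light
    pressure = calibration @ delta      # assumed (learned) linear mapping
    touched = np.abs(delta).max() > threshold
    return touched, float(pressure.sum())

# Made-up numbers: a press dims one sensor and brightens a neighbor
baseline = np.array([1.0, 1.0, 1.0])
reading = np.array([0.8, 1.0, 1.1])
calibration = np.eye(3)
touched, pressure = estimate_contact(baseline, reading, calibration)
```

In a real system the map from light readings to contact location and force would be learned from calibration data rather than hand-written, but the principle — measure light, detect deformation, infer touch — is the same.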

For their new study, the researchers incorporated that tech into five fingers. They then taught the fingers to work together by training algorithms in a computer simulation that replicated the physics of the real world.

In that simulation, the algorithms practiced using a computer model of the hand to rotate objects, learning through trial and error (a technique called “reinforcement learning”). Once trained, the algorithms were given control over the physical robot hand.
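The trial-and-error loop of reinforcement learning can be sketched in miniature. The paper's training setup is far more sophisticated (a physics simulator and a continuous control policy); the toy environment, reward values, and Q-learning algorithm below are assumptions chosen only to show the pattern: try an action, get a reward or penalty, and update toward actions that worked.

```python
import random

random.seed(0)

# Toy environment: the "hand" must alternate HOLD (0) and MOVE (1)
# to keep rotating an object. The state says which action is needed;
# the right action advances the rotation, the wrong one lets it slip.
def step(state, action):
    if action == state:
        return 1 - state, 1.0   # progress: reward, next action flips
    return state, -1.0          # slip: penalty, no progress

Q = {(s, a): 0.0 for s in (0, 1) for a in (0, 1)}
alpha, gamma, eps = 0.5, 0.9, 0.1   # learning rate, discount, exploration

for episode in range(500):
    state = 0
    for t in range(10):
        # Explore occasionally, otherwise exploit the best-known action
        if random.random() < eps:
            action = random.choice((0, 1))
        else:
            action = max((0, 1), key=lambda a: Q[(state, a)])
        nxt, reward = step(state, action)
        best_next = max(Q[(nxt, 0)], Q[(nxt, 1)])
        Q[(state, action)] += alpha * (reward + gamma * best_next
                                       - Q[(state, action)])
        state = nxt

# The learned policy picks the correct action in each state
policy = {s: max((0, 1), key=lambda a: Q[(s, a)]) for s in (0, 1)}
```

Scaled up from two discrete states to the continuous joint angles and touch readings of a five-fingered hand, this trial-and-error update is the essence of how the simulated hand learns to rotate objects before the policy is transferred to the physical robot.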

After the robot hand was trained in simulations (top), it was ready to manipulate objects in the real world (bottom). Credit: Columbia University Roam Lab

To demonstrate the system, the researchers had the robot hand grasp and rotate an unevenly shaped object between its fingers. This seemingly simple task is challenging because the bot must constantly reposition some of its fingers while using others to prevent the object from falling.

Not only did the robot hand ace the test, it did so in the dark, demonstrating that it relies on its sense of touch — not sight.

Looking ahead: In their paper, the researchers note that many real-world situations will likely benefit from bots that rely on both touch and sight, so while their robot hand doesn’t need cameras to manipulate objects, they still plan to give it a vision system. 

“Once we also add visual feedback into the mix along with touch, we hope to be able to achieve even more dexterity, and one day start approaching the replication of the human hand,” said Ciocarlie.

