
Lead image © Prostock-studio / Adobe Stock

The robot brain has evolved and finally come to its senses.

Researchers have created a smart robot brain that works more like a human one by using neuromorphic computing, an approach that mimics the brain's structure and life-like functions. What's more, they've integrated the bot with artificial skin and sensors, giving it the ability to "see" and "feel."

What They Did:

Humans can easily do things like grab their keys or lift a glass to take a sip. But for a robot, these tasks aren't simple. They require an intuitive mind: the smarts to look at an object, feel its shape, and grab and lift it with just the right amount of force.

Currently, though, most smart robots rely on visual cues alone and can't feel the objects they handle.

A team led by Benjamin Tee at the National University of Singapore has given robots the "human touch" by enabling them to take in both visual and tactile information to "learn" new skills. They did this by combining electronic skin, optical sensing, and an AI nervous system that mimics the human brain. The skin, called Asynchronous Coded Electronic Skin (ACES), detects touch 1,000 times faster than the human sensory nervous system, and it can also recognize an object's shape, texture, and hardness.
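To get a sense of what "asynchronous coded" sensing means in practice, here is a minimal sketch in Python (the function, names, and threshold are illustrative assumptions, not the team's actual ACES protocol): rather than scanning every sensor on a fixed schedule, each sensing point reports a timestamped event the moment its pressure reading changes, which is what lets touch signals propagate so quickly.

```python
# A minimal sketch of event-driven tactile encoding -- assumed logic for
# illustration, not the actual ACES hardware protocol.
# Idle sensors stay silent; only sensors whose pressure changes emit events.

def encode_tactile_events(prev_frame, curr_frame, timestamp, threshold=0.05):
    """Emit (timestamp, sensor_id, change) events for every sensor whose
    pressure reading moved by more than `threshold` since the last frame."""
    events = []
    for sensor_id, (old, new) in enumerate(zip(prev_frame, curr_frame)):
        change = new - old
        if abs(change) >= threshold:
            events.append((timestamp, sensor_id, change))
    return events

# Usage: a finger presses sensor 1 on a four-sensor patch. Only that
# sensor fires an event; sensor 3's tiny fluctuation stays below threshold.
before = [0.0, 0.0, 0.0, 0.0]
after = [0.0, 0.3, 0.0, 0.01]
print(encode_tactile_events(before, after, timestamp=0.001))
# -> [(0.001, 1, 0.3)]
```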

"To make robots have greater intelligence and autonomy, they need sensory feedback to provide the ability to perceive and understand the world," said Tee in an email.

Why It Matters:

Taking robotic perception to the next level will significantly expand smart robot capabilities. Tee says that with enhanced perception, a smart robot could grasp concepts and learn: someday, robots could think for themselves.

For example, a perceptive robot working in a factory could identify unfamiliar objects and grab them with the right amount of grip force, adapting to new tasks on its own.

"Humans learn by perceiving the world, and this is the same for robots to be able to learn. This makes them safer and smarter to work alongside humans," said Tee.

Tee's team used neuromorphic computing, which tries to emulate the neural structure and operation of the brain, to process the sensory data. Working with Intel's Loihi neuromorphic chip, they got a smart robot to read Braille with 92% accuracy, and they presented their work at this year's Robotics: Science and Systems conference.
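As a rough illustration of the style of computation involved, the sketch below implements a single leaky integrate-and-fire neuron in Python (the parameters and structure are simplifying assumptions for illustration, not Loihi's actual implementation). The neuron's charge leaks away between events, incoming spikes top it up, and it fires only when a threshold is crossed, so work happens only when sensory events actually arrive.

```python
import math

# A minimal leaky integrate-and-fire neuron -- an illustrative sketch of
# spiking, event-driven computation, not Intel's Loihi implementation.

class LIFNeuron:
    def __init__(self, tau=20.0, threshold=1.0):
        self.tau = tau              # membrane time constant in ms (assumed)
        self.threshold = threshold  # firing threshold (assumed)
        self.potential = 0.0
        self.last_time = 0.0

    def receive_spike(self, t, weight):
        """Integrate one incoming spike at time t; return True if we fire."""
        # Let the potential decay for the time elapsed since the last event,
        # then add the incoming spike's weight.
        self.potential *= math.exp(-(t - self.last_time) / self.tau)
        self.potential += weight
        self.last_time = t
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False

# Usage: feed a stream of (time_ms, weight) tactile events into the neuron.
neuron = LIFNeuron()
for t, w in [(1.0, 0.4), (3.0, 0.5), (4.0, 0.3)]:
    if neuron.receive_spike(t, w):
        print(f"output spike at t={t} ms")  # fires at t=4.0 ms
```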

What's Next:

Other researchers are also striving to take robot perception further with additional senses, like the ability to smell, hear, and understand what's around them.

Intel introduced its Loihi neuromorphic chip in 2017. Each chip simulates over 130,000 neurons, all of which can communicate with thousands of other neurons, and the company has since scaled Loihi-based systems up to over 100 million neurons in total. To top that off, Intel says Loihi can even smell: the chip has learned to recognize the odors of hazardous chemicals.

This could usher in a new era of robots taking on jobs. Tee and his colleagues hope to develop their perceptive robot to work in the food manufacturing and logistics industries, where the pandemic has increased the demand for robotic automation.

Up Next

Will Robots Steal Our Jobs?
Could exoskeletons help us do our jobs? Should we actually be afraid of robots taking our jobs? These are the latest stories from the frontlines of the robotic world.
By Mike Riggs

Criminal Justice
Can This Robot Stop Violence at Traffic Stops?
A Duke robotics PhD student and his partner think they have a way to ease tensions while deep-rooted differences are hashed out.
By Michael O'Shea

Uprising
Robot Bees Could One Day Save Your Life
For the first time, a microbot powered by soft actuators has achieved controlled flight.

Dispatches
Robots Are Mass Producing Mini-Organs
Robots can make hundreds of tiny copies of your organs, allowing doctors to test many different treatments at the same time.

Uprising
The Construction Robots Building Space Colonies
Sending construction robots into outer space will help pave the way for human exploration, but there are some real challenges that lie ahead.
By Tien Nguyen

Uprising
Trash-Talking Robots Get Under Our Skin
Can robots control us? Probably not, but they can influence our actions, as this recent study on human-robot interaction by Carnegie Mellon shows.

The Future Explored
Robots Can Read Your P-P-P-Poker Face
Emotion-detecting technology could be the next frontier of personalization. But what does this mean for privacy?