Robot dogs are hiking the Alps and preparing for space

Four-legged robots could be our best friends in research and work.

This article is an installment of Future Explored, a weekly guide to world-changing technology. You can get stories like this one straight to your inbox every Thursday morning by subscribing here.

A robot dog could be your new hiking buddy — if you can keep up with it.

Back in 2016, scientists at Swiss research university ETH Zurich introduced the world to their autonomous four-legged robot, ANYmal. The robot dog has now been upgraded with a new controller that helps it traverse difficult terrain without any prior training.

To demonstrate it, ANYmal was tasked with hiking an unfamiliar trail in the Alps — and it reached the summit four minutes faster than the average human hiker, without falling or making any missteps.

Why it matters: As fun as a hike with an autonomous robot dog might be, these bots aren’t meant to replace family pets — they’re working dogs, built to help with dull, dirty, or dangerous jobs.

While some robot dogs can actually do things — Boston Dynamics’ famous Spot has an attachable arm that lets it move objects and turn levers — most are primarily designed to walk around and collect data using cameras and sensors. 

This can save human workers from having to conduct routine inspections of dangerous environments, such as chemical plants, or assess damage in disaster zones, such as the site of a nuclear meltdown.


Robot dogs can also work alongside humans. 

They can walk just ahead of a first responder during a search and rescue mission in a collapsed building, for example, helping them find survivors more quickly while scouting areas too unstable for a human to traverse.

Robot dogs could also join soldiers on the battlefield, conducting recon and surveillance missions that could help military members stay alive (the use of robot dogs as actual weapons is controversial and discouraged by many — but not all — developers).

Eventually, we might even send robot dogs to places humans can’t yet go, such as Mars.

Four-legged space explorers have advantages over rolling robots, getting into areas too treacherous for wheels or treads — including Mars’ underground lava tubes, which some experts think could house astronauts, or even harbor evidence of ancient life. 

The challenge: We’re only just getting to the point that we can reliably use a robot dog to inspect a chemical plant with a predetermined layout, and that’s a lot easier than setting one loose in unexplored caves on another planet.

To reach that level, we need to make it easier for robot dogs to analyze the unfamiliar terrain ahead of them and make a plan to traverse it before they take a single step forward.


Roboticists call this “exteroceptive perception,” and it’s something that humans do all the time — if we step outside and see that it snowed the night before, for example, we might walk a little more gingerly on the sidewalk in anticipation of it being icy.

Giving robot dogs this sense has been difficult, though. There's an endless variety of environments, and what a bot can “see” with its cameras and sensors might not paint a complete picture.

It has “remained a grand challenge in robotics,” the ETH Zurich team writes in their new paper, published in Science. “Snow, vegetation, and water visually appear as obstacles on which the robot cannot step — or are missing altogether due to high reflectance.”


If a robot isn’t able to proceed based on its exteroceptive perception, it could use the feedback it gets from actually physically interacting with the environment. 

That’s called “proprioceptive perception,” and it’s also something humans do — imagine how you carefully feel your way through a dark room or stairwell, using feedback from your feet and hands to determine whether you can keep moving forward. 

But just like a person in a dark room, a robot dog has to move more slowly when relying on its proprioceptive perception, and we want the bots to be able to move quickly, especially in situations like search-and-rescue missions, where every minute counts.


What’s new: To help ANYmal rapidly cross unfamiliar terrain, the ETH Zurich team developed a control system that switches back and forth between exteroceptive and proprioceptive systems, based on how confident it is in what it’s “seeing.”

They trained this controller using computer simulations of different terrains, some with incomplete or missing data. After training, the system could predict when ANYmal should trust what it “saw” and when it should lean more heavily on what it could feel.
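The idea can be pictured as a confidence-weighted blend of what the robot sees and what it feels. The sketch below is a minimal illustration of that principle only; the actual ETH Zurich controller is a learned neural-network policy, and the function name and numbers here are hypothetical:

```python
# Toy illustration of confidence-weighted sensor blending (not the real
# ANYmal controller, which is a trained neural network).

def blend_terrain_estimate(seen_height, felt_height, confidence):
    """Blend a camera-based terrain height estimate with one inferred
    from leg contact, weighted by trust in the visual input.

    confidence: 0.0 (vision unreliable, e.g. snow or reflective water)
                to 1.0 (vision fully trusted).
    """
    if not 0.0 <= confidence <= 1.0:
        raise ValueError("confidence must be between 0 and 1")
    return confidence * seen_height + (1.0 - confidence) * felt_height


# On clear ground, the estimate leans on what the robot "sees"...
clear = blend_terrain_estimate(0.30, 0.10, confidence=0.9)   # -> 0.28
# ...but on snow, where cameras misread the surface, it leans on touch.
snowy = blend_terrain_estimate(0.30, 0.10, confidence=0.1)   # -> 0.12
```

In this toy version, a low-confidence reading pulls the estimate toward the proprioceptive value, which mirrors the behavior the researchers describe: trusting vision when it's reliable, and falling back on feel when it isn't.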

“[The controller] allows it to tackle rough terrain faster, more efficiently and, above all, more robustly,” lead researcher Marco Hutter said in a press release.

During real-world testing, ANYmal successfully navigated loose rock, narrow tunnels, and snow-covered stairs.

It also completed the Alps hike without any failures — despite encountering tree roots, steep inclines, and slippery ground — finishing the full loop in the same amount of time it takes the average person and reaching the summit four minutes quicker.

“When I first checked the terrain, I thought it might be too difficult for the robot, but it could just handle all of them,” first author Takahiro Miki told IEEE Spectrum.

Looking ahead: ANYmal’s new navigation ability is impressive, but the bot still isn’t as nimble as a real dog, and that’s the ultimate goal.

“We think the next level would be somewhere which requires precise motion with careful planning such as stepping-stones, or some obstacles that require more dynamic motion, such as jumping over a gap,” Miki told IEEE.

For now, the researchers will continue developing their robot dog, preparing it for future deployment at worksites, in disaster zones, and, maybe one day, on other worlds. 

We’d love to hear from you! If you have a comment about this article or if you have a tip for a future Freethink story, please email us at [email protected].
