Emergency Braking Was Disabled During Self-driving Uber Fatality: Feds
A screen capture from the Uber's dashcam moments before the collision; the darkness is mostly an artifact of the camera settings.

In March, a self-driving Uber struck and killed a pedestrian in Arizona, the first known pedestrian fatality involving a self-driving vehicle. After a two-month investigation, the National Transportation Safety Board (NTSB) has released a four-page preliminary report on the accident. The report confirms that the vehicle's sensors detected the pedestrian and her bicycle a full six seconds before impact, but Uber's software did not automatically brake or notify the driver of the imminent collision. The report adds some depth to the story, but it does not resolve fundamental questions about why the crash happened and who was responsible.

The Computer: Records from the self-driving computer show that the car's radar and LIDAR systems were working fine: everything was functioning, and no errors were detected. The sensors picked up the pedestrian "about six seconds" before impact—more than enough time to avoid a crash. However, the software didn't recognize what the sensors were detecting, first "(classifying) the pedestrian as an unknown object, as a vehicle, and then as a bicycle." The software also struggled to understand that the moving object was simply crossing the road, from left to right, predicting "varying expectations of future travel path."

Nonetheless, at 25 meters away (about 1.3 seconds before the crash), the computer decided that emergency braking was necessary to "mitigate a collision." And here's where it gets weird: "According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator. … The data also showed that all aspects of the self-driving system were operating normally at the time of the crash, and that there were no faults or diagnostic messages."

The vehicle operator is relied on to intervene ... The system is not designed to alert the operator.

NTSB preliminary report
Interior video shows the driver looking down for several seconds before the crash.

The Humans: Unfortunately, the backup driver did not take the wheel until less than a second before the collision. Video from inside the car showed that she was looking down for several seconds before the crash, but she told investigators that she was monitoring the car's computer on the iPad installed in the center console, not looking at her phone. There is no particular reason to doubt this, as Uber says that "the operator is (also) responsible for monitoring diagnostic messages … and tagging events of interest for subsequent review."

The report had nothing new to say about the pedestrian herself: she was crossing outside of a crosswalk and wearing dark clothes. Toxicology tests were positive for methamphetamine and cannabis, but it's not clear that she was under the influence at the time she was crossing the street.

So Wait, What Happened? The preliminary report is barely three and a half pages, and the NTSB won't officially determine the "probable cause" until it completes its investigation. But the little that's in there just adds to the confusion. The NTSB flatly states that Uber's system is not allowed to perform emergency braking maneuvers while it is driving the car, even though the software does determine when emergency braking is needed to avoid a crash. Nor is the computer programmed to warn the backup driver about possible collisions. This is astonishing, if true.

The Car: The car itself (a new Volvo SUV) came already equipped with a "driver assistance program" that warns drivers about likely collisions and automatically brakes if the driver does not take control in time. The NTSB says that Uber disabled that system, too—presumably so that it wouldn't conflict with Uber's own self-driving software—yet that self-driving software apparently lacks the same basic safety features.

Emergency braking maneuvers are not enabled while the vehicle is under computer control.

NTSB preliminary report
NTSB diagram of the self-driving car's data about 1.3 seconds before impact, when the system determined an emergency braking maneuver would be needed to mitigate a collision.

The Math: One conclusion seems inescapable: if the Volvo's system had been engaged, or if the car had hit the brakes when Uber's software decided it should, the pedestrian would likely still be alive. Road tests for this model of Volvo show that its braking power could have slowed the Uber from roughly 40 mph to between 14 and 18 mph in 1.3 seconds. That would have cut the risk of death for a pedestrian hit at those speeds from around 50% to less than 5%.

In fact, it's quite possible that the car would have missed the pedestrian entirely. If it had been braking fully, it wouldn't have reached the pedestrian's position in 1.3 seconds; it would have taken roughly 2.2 seconds. By that point, the pedestrian might have been across the lane, and the car would have been at or near a dead stop. (Feel free to check my math, but I assume it is essentially the same calculation that the car's software must have done to decide it needed to emergency brake at 25 meters.)
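Here is a minimal sanity check of those figures in Python, assuming an initial speed of roughly 40 mph and a constant braking deceleration of about 0.8 g; the deceleration value is an assumption chosen for illustration, not a number from the NTSB report.

# Rough back-of-the-envelope check of the braking figures above.
# Assumptions (not from the NTSB report): initial speed ~40 mph,
# constant deceleration of ~0.8 g, a plausible hard-braking figure
# for a modern SUV on dry pavement.

G = 9.81                      # gravitational acceleration, m/s^2
MPH_TO_MS = 0.44704           # miles per hour to meters per second

v0 = 40 * MPH_TO_MS           # initial speed: ~17.9 m/s
a = 0.8 * G                   # assumed braking deceleration: ~7.8 m/s^2
gap = 25.0                    # distance to the pedestrian when braking was called for, m

# Speed after braking for 1.3 seconds (the time left before impact).
v_after = v0 - a * 1.3
print(f"Speed after 1.3 s of braking: {v_after / MPH_TO_MS:.0f} mph")  # ~17 mph

# Distance needed to stop completely, and the time that takes.
stop_dist = v0 ** 2 / (2 * a)
stop_time = v0 / a
print(f"Stopping distance: {stop_dist:.1f} m (gap was {gap:.0f} m)")   # ~20 m, less than 25 m
print(f"Time to a full stop: {stop_time:.1f} s")                       # ~2.3 s

Under those assumptions, the car slows to roughly 17 mph within 1.3 seconds, needs only about 20 meters to come to a complete stop, and does so in about 2.3 seconds, which is consistent with the 14-18 mph and 2.2-second figures above.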

All aspects of the self-driving system were operating normally at the time of the crash.

NTSB preliminary report

Not a Good Look: So we know: (1) the software did not glitch and was operating normally; (2) the sensors picked up the pedestrian in time; and (3) the computer correctly calculated when emergency braking was necessary. The only puzzle is why Uber turned off its emergency braking system in the first place. Uber's cryptic comment that it was disabled "to reduce the potential for erratic vehicle behavior" is troubling. It implies that Uber's collision avoidance program has so many false positives that its "erratic" stopping prevents it from being used out on the roads. Even harder to understand is why the system wouldn't be designed to alert the backup driver to potential collisions. The way the NTSB describes this "fully autonomous" software makes it sound more primitive than the driver-assistance features that many ordinary cars have had for years.

The Upshot: We still have to wait for the complete investigation, and more detail might help explain what was left unsaid in these preliminary findings. But the bad news for Uber here might be good news for driverless technology as a whole. The NTSB makes it sound like everything from hardware to software operated correctly here, and the only thing that was missing was an alert to the driver—perhaps up to six seconds before the crash, when the sensors first detected some kind of obstruction ahead—and emergency braking at 1.3 seconds before the crash.

In 2018, these seem like simple things to add to a self-driving system, and they're in widespread use in millions of ordinary cars without triggering much "erratic" behavior. Time will tell how safe driverless cars can become, but this crash doesn't look like it raises fundamental concerns about the technology as a whole.
