Tesla and Uber fatalities show the limits of “semi-autonomous” cars

How can we make humans pay attention when a machine is doing our job for us?

Two weeks ago, a self-driving Uber car struck and killed a pedestrian in Arizona, bringing Uber’s driverless vehicle program to a screeching halt.

Five days later, a Tesla Model X slammed into a concrete barrier in California, killing the sole occupant. On Friday, Tesla confirmed that the vehicle’s “Autopilot” system was engaged at the time of the accident.

Both cases highlight the problems with splitting driving responsibilities between humans and machines.

On Autopilot: Officially, Teslas in Autopilot mode are not self-driving cars. Instead, Autopilot is supposed to be an advanced “driver assistance” system (a kind of souped-up cruise control) that merely helps drivers steer, avoid obstacles, park, and maintain or change their speed, following distance, and lane.

But the headline on Tesla’s website boasts “Full Self-Driving Hardware on All Cars,” and its advertising video shows Autopilot navigating complex city streets with no human direction.

Nonetheless, Tesla reminds drivers that “Every driver is responsible for remaining alert and active when using Autopilot, and must be prepared to take action at any time.” Autopilot also sends visual and audible warnings to the human driver if it detects them taking their hands off the wheel or looking away from the road.

The Tesla Crash: Tesla has released some details about the fatal Model X accident last month, saying that, while Autopilot was on, “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.”

A Tesla driver in Chicago tried to recreate the conditions of the accident along a similar stretch of highway, showing how Autopilot can get confused when the left-hand lane line on a highway splits off and becomes the right-hand lane line of the exit ramp, steering the Tesla directly into the concrete divider between the ramp and the highway.

It’s possible that the Model X suffered a similar error and the driver was not alert enough to react in time.

The Uber Crash: In the Uber fatality, the car was in full self-driving mode at the time of the accident, suggesting that something went wrong with its sensors, which should have detected the pedestrian crossing the road.

But Uber’s human backup driver also failed to prevent the crash, and the car’s interior video shows the driver repeatedly getting distracted. Ultimately, she was looking down for several crucial seconds before the collision, while the car drove itself into the pedestrian.

Different Tech, Same Problem: While Uber’s car is technically fully autonomous and Tesla’s is officially semi-autonomous, the difference for their human drivers is small and shrinking. In both cases, the human is supposed to be alert and ready to take over at a moment’s notice while the car drives itself.

The problem is that people have a hard time focusing when they aren’t actively engaged with their surroundings, whether their hands are always supposed to be on the wheel (as in a Tesla) or not (as in Uber’s self-driving car).

Former Uber backup drivers have testified about how difficult it is to stay alert when you rarely have anything to do, and the problem of drivers “zoning out” on long, boring trips has been well-documented even in old-fashioned “dumb” cars.

Upshot: We’re in an awkward, in-between phase for self-driving tech, with fully autonomous cars and “driver assistance” programs beginning to converge, without either quite eliminating the need for people just yet.

Each step forward makes humans less and less necessary — the goal, after all, is to replace us — but as we become more redundant, it becomes harder for us to stay focused or recognize the lapses where we need to intervene.

On the road, the margin for error is often slim, and the difference between a car doing the right thing (staying in the left-hand lane) and the wrong thing (following the shoulder into a concrete barrier) isn’t always easy for a driver to anticipate.

But the benefits of fully self-driving cars are enormous, eliminating human error, making DUI impossible, and giving freedom to the elderly and disabled. As uncomfortable as it sounds, while the technology matures, we might simply have to ride it out, with both hands on the wheel and both eyes on the road — regardless of who or what is driving.
