On Saturday, August 25, a Tesla Model S crashed into a stopped firetruck in San Jose, California. The two occupants of the Tesla sustained minor injuries, and the 37-year-old driver was arrested on suspicion of driving under the influence of alcohol. According to a police report, he told authorities, "I think I had Autopilot on." Tesla has not confirmed the semiautonomous system was in use, but it's at least the third time this year a Tesla has hit a stopped firetruck at highway speeds. We've updated this story, which originally ran on January 25, 2018, to explain why Autopilot and similar systems have trouble detecting stopped vehicles.
Early Saturday morning, a Tesla Model S driving south on the 101 Freeway slammed into the back of a stopped firetruck in San Jose, California, the latest in a series of crashes that highlight the shortcomings of the increasingly common semiautonomous systems that let cars drive themselves in limited conditions. A Tesla spokesperson says the automaker has not yet received data from the vehicle, so it can't confirm whether Autopilot was engaged (retrieving that data typically takes a few days), and that Tesla is "working to establish the facts of the incident."
Whatever the particulars, there's a serious sense of déjà vu here. In January, a Tesla Model S drove into the back of a stopped firetruck on the 405 Freeway in Los Angeles County. The driver apparently told the fire department the car was in Autopilot mode at the time. In May, a Tesla driver in Utah hit a firetruck at highway speeds; she told reporters Autopilot was engaged and that she was looking away from the road at the time.
So this latest, surprisingly non-deadly debacle—the San Jose Tesla driver and his passenger sustained minor injuries—also raises a technical question: How is it possible that one of the most advanced driving systems on the planet doesn't see a freaking firetruck, dead ahead?
The car's manual does warn that the system is ill-equipped to handle this exact sort of situation: “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”
Volvo's semiautonomous system, Pilot Assist, has the same shortcoming. Say the car in front of the Volvo changes lanes or turns off the road, leaving nothing between the Volvo and a stopped car. "Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed," Volvo's manual reads, meaning the cruise speed the driver punched in. "The driver must then intervene and apply the brakes." In other words, your Volvo won't brake to avoid hitting a stopped car that suddenly appears up ahead. It might even accelerate toward it.
The same is true for any car currently equipped with adaptive cruise control or automated emergency braking. It sounds like a glaring flaw, the kind of horrible mistake engineers race to eliminate. Nope. These systems are designed to ignore static obstacles, because otherwise they couldn't work at all.
“You always have to make a balance between braking when it’s not really needed, and not braking when it is needed,” says Erik Coelingh, head of new technologies at Zenuity, a partnership between Volvo and Autoliv formed to develop driver assistance technologies and self-driving cars. He's talking about false positives. On the highway, slamming the brakes for no reason can be as dangerous as not stopping when you need to.
“The only safe scenario would be don’t move,” says Aaron Ames, from Caltech’s Center for Autonomous Systems and Technologies. That doesn't exactly work for driving. “You have to make reasonable assumptions about what you care about and what you don’t.”
Raj Rajkumar, who researches autonomous driving at Carnegie Mellon University, thinks those assumptions concern one of Tesla's key sensors. “The radars they use are apparently meant for detecting moving objects (as typically used in adaptive cruise control systems), and seem to be not very good in detecting stationary objects," he says.
That's not nearly as crazy as it may seem. Radar can measure the speed of any object it sees (via the Doppler shift of the returning signal), and it's simple, cheap, robust, and easy to build into a front bumper. But it also detects lots of things a car rolling down the highway needn't worry about, like overhead highway signs, loose hubcaps, and speed limit signs. To radar, a stopped firetruck looks much like that roadside clutter: another motionless blob of metal. So engineers make a choice, telling the car to ignore stationary returns and keep its eyes on the other cars on the road: They program the system to focus on the stuff that's moving.
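To make that tradeoff concrete, here's a minimal, hypothetical sketch of such a filter. The names, thresholds, and sign conventions are invented for illustration; this is not Tesla's or Volvo's actual code.

```python
# Hypothetical sketch of the stationary-target filter described above.
# All names and numbers are illustrative; not any automaker's real code.
from dataclasses import dataclass

@dataclass
class RadarTarget:
    range_m: float              # distance to the return, in meters
    relative_speed_mps: float   # Doppler speed relative to our car, m/s
                                # (negative means we're closing on it)

def targets_worth_tracking(targets, ego_speed_mps, min_abs_speed_mps=2.0):
    """Keep only returns that are themselves moving over the ground.

    A target's absolute speed is our own speed plus the relative speed
    the radar reports. Anything near zero (overhead signs, guardrails,
    and, fatally, a stopped firetruck) is discarded as roadside clutter.
    """
    kept = []
    for t in targets:
        absolute_speed = ego_speed_mps + t.relative_speed_mps
        if abs(absolute_speed) > min_abs_speed_mps:
            kept.append(t)
    return kept

ego_speed = 29.0  # roughly 65 mph, in m/s
returns = [
    RadarTarget(range_m=80.0, relative_speed_mps=-29.0),   # stopped firetruck
    RadarTarget(range_m=60.0, relative_speed_mps=-5.0),    # slower car ahead
    RadarTarget(range_m=120.0, relative_speed_mps=-29.0),  # overhead sign
]
# Only the moving car survives; the truck is dropped along with the sign.
print(targets_worth_tracking(returns, ego_speed))
```

The design choice is visible in the output: the same rule that correctly throws away the sign gantry throws away the truck, because to a speed-based filter they're indistinguishable.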
This unsettling compromise may be better than nothing, given evidence that these systems prevent other kinds of crashes and save lives. And it wouldn't be much of a problem if every human in a semiautonomous vehicle followed the automakers' explicit, insistent instructions to pay attention at all times and take back control if they see a stationary vehicle up ahead.
The long-term solution is to combine several sensors with different abilities, backed by more computing power. Key among them is lidar. These sensors use lasers to build a precise, detailed map of the world around the car, and can easily distinguish between a hubcap and a cop car. The problem is that, compared with radar, lidar is a young technology. It's still very expensive, and it isn't yet robust enough to survive a life of hitting potholes and getting pelted with rain and snow. Just about everybody working on a fully self-driving system—the kind that doesn't depend on lazy, inattentive humans for support—plans to use lidar, along with radar and cameras.
Except for Elon Musk. The Tesla CEO insists he can make his cars fully autonomous—no supervision necessary—with just radar and cameras. He hasn't proven that claim yet, and no one knows whether he ever will. Lidar's price and reliability problems are less of an issue for a taxi-like service, where a provider can amortize the cost over time and perform regular maintenance. But in today's cars, meant for average or modestly wealthy consumers, it's a no-go.
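For a rough sense of how a second sensor changes the picture, here's another minimal, hypothetical sketch, this time assuming a lidar channel that reports static geometry in the lane. The function and its inputs are invented for this example, not drawn from any automaker's pipeline.

```python
# Hypothetical sketch of a lidar cross-check on the radar filter above.
# Illustrative only; not any real perception stack.
from typing import Optional

def should_brake(radar_moving_obstacle_m: Optional[float],
                 lidar_static_obstacle_m: Optional[float],
                 braking_distance_m: float) -> bool:
    """Brake if either sensor reports an obstacle inside braking distance.

    Radar tracks moving traffic; lidar measures geometry directly, so it
    can flag the stationary objects the radar filter throws away.
    """
    for distance in (radar_moving_obstacle_m, lidar_static_obstacle_m):
        if distance is not None and distance <= braking_distance_m:
            return True
    return False

# A stopped firetruck 40 meters out: radar filtered it, lidar still sees it.
print(should_brake(None, 40.0, braking_distance_m=60.0))  # True
```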
In the meantime, we're stuck with a flawed system, the result of a compromise made to navigate the world at speed. And when even the best systems available can't see a big red firetruck, it's a stark reminder of how long and winding the path to autonomy actually is.