WASHINGTON (SOA) — Automakers are driving into the future, competing for the most advanced technology to make our travels easier, including cars that can do the driving for us.
Tesla, whose vehicles recently ranked among America's top 10 best-selling cars, touts futuristic features like partial self-driving, or what it calls Autopilot.
But there's a potential danger with this developing technology. In more than a dozen unexplained accidents, Tesla cars in Autopilot mode have slammed into emergency vehicles, creating a double emergency and putting first responders at risk.
Spotlight on America National Investigative Correspondent Angie Moreschi reports on the growing concern and calls for federal regulators to step in.
There are an estimated 830,000 Teslas on the road today, capable of using what the company calls "Autopilot." According to Tesla's website, this technology lets your car accelerate and brake automatically, designed to "assist you with the most burdensome parts of driving."
But following a series of crashes into emergency vehicles, Teslas using that technology are now the subject of a federal safety investigation.
The National Highway Traffic Safety Administration, or NHTSA, is investigating 16 crashes between Teslas in Autopilot mode and emergency vehicles from coast to coast, dating back to 2018, that have resulted in at least 15 injuries and one death.
Among the crashes under investigation are: a Tesla that slammed into a Florida Highway Patrol car in Orlando; a Tesla crash that injured five deputy constables in Montgomery County, Texas; and a dramatic crash in Nash County, North Carolina, that forced two law enforcement officers to dive out of the way.
California Highway Patrol Officer Jesse Matias was at the scene of a similar crash in July of 2021. Officers had shut down a San Diego highway after a deadly crash and were working the scene when a Tesla blew through the road closure, even though the road was clearly blocked off with emergency lights flashing.
"It was not slowing down. It was not reacting to any emergency lights on scene," Officer Matias told Spotlight on America. "So, we started yelling, 'Car coming! Car coming!' You know, 'Get out of the way! Car coming!'"
Officers and other first responders were sent running.
The Tesla slammed into the back of a patrol car. Thankfully, first responders were able to escape injury, but a passenger in the Tesla was seriously hurt.
"It's frightening. It's shocking," said Officer Matias. "It makes you worry about your job a lot more when you're out there."
Officer Matias isn't the only one worried.
This past March, four firefighters in Contra Costa County, California were injured when a Tesla failed to stop for their fire truck at another crash scene on the freeway.
"It's unnerving for them to be involved in a near miss that had the potential to take their life," Contra Costa County Deputy Fire Chief Aaron McAlister told Spotlight on America.
Not only did the accident put firefighters in danger, it also put their $2 million truck, a crucial piece of equipment, out of commission for months, requiring that it be shipped across the country for repairs.
With so many Teslas in Autopilot mode failing to avoid emergency vehicles, NHTSA launched an investigation in 2021 to figure out why. That probe has since expanded to include more than 100 crashes with other vehicles, as well.
No Guarantee Systems Will Work
University of South Carolina Professor Bryant Walker-Smith is an international expert on driving automation and its legal implications.
He is trained as both a lawyer and an engineer.
He told Spotlight on America that Advanced Driver Assistance Systems (ADAS) like Autopilot are unregulated, and there is no guarantee they will work. Automakers, including Tesla, maintain it is the driver's responsibility to remain alert and step in if needed.
"The promise is, it will work, unless, and until it doesn't," Walker-Smith told us. "The assumption is that an alert, attentive human driver will be there to monitor, correct, step in to basically do all of the driving."
But Spotlight on America discovered that message isn't always reaching drivers.
In the crashes involving emergency vehicles, Tesla drivers admitted to distractions ranging from watching a movie on a cellphone to checking on a dog in the back seat to simply failing to pay attention before impact.
According to NHTSA's investigation, drivers in several of the incidents would have been able to see the emergency vehicles an average of eight seconds before impact. They were supposed to take over and take action, but most did nothing.
Moreschi: Are drivers getting the wrong impression that these cars are self-driving, that they don't have to pay attention?
Walker-Smith: The answer to that seems to be very clearly yes.
Tesla has three tiers of driver assistance technology: Autopilot, Enhanced Autopilot and Full Self-Driving. But its website says the technology is "intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment."
"These names that Tesla, in particular, use are very misleading," Professor Walker-Smith told us. "It is staggering to me that full self-driving would need a driver. "
It is staggering to me that full self-driving would need a driver. It is dangerous and it is irresponsible," he said.
Last summer, when NHTSA expanded its investigation of Tesla's Autopilot system, it cited "patterns in system performance and associated driver behavior" identified in its preliminary probe. The agency is now conducting what it calls an Engineering Analysis, the final step before NHTSA decides whether the vehicles have a safety defect.
Just this month, NHTSA sent a letter to Tesla, demanding it send more information about the vehicles involved in the crashes or face fines of more than $130,000 a day.
As the federal investigation continues, there are still no safety standards regulating advanced driver assistance systems. Bryant Walker-Smith told Spotlight on America that safety standards often lag behind technology by years. For now, he says agencies like NHTSA are playing "whack-a-mole" with regulations, trying to address new technologies as they pop up.
Earlier this year, Tesla agreed to a recall affecting some of its driver-assistance features and has issued multiple software updates, but NHTSA's investigation is still open.
We asked NHTSA when it anticipates finishing its probe. The agency didn't agree to an interview and told us it generally doesn't comment on open investigations.
Tesla didn't respond to our request for an interview and didn't answer our specific questions. Tesla's website acknowledges that features like Autopilot "do not make the vehicle autonomous."
For firefighters like Aaron McAlister, who told us the freeway is the most dangerous place they work, it's another hazard that threatens their safety.
"We have a lot of ways to be distracted on the highway and, and now we have another one because your car allows you to not pay attention," he said. "That places our firefighters at risk."
-------
NHTSA's investigation into crashes involving emergency vehicles is focused on Tesla drivers using Autopilot. But in 2021, the agency sent letters to 12 other automakers asking for information about the performance of their partially automated driving systems.
To read more about the ongoing NHTSA investigation, and to see which Tesla models are involved, you can find all related documents here.