CULVER CITY, Calif. — A Tesla Model S crashed into a fire truck while driving down a California highway, according to Culver City firefighters.
A tweet by the local firefighters’ union Monday showed a photo of a Tesla Model S with its nose wedged under the back end of a fire truck and its hood badly wrinkled. The car had been traveling at 65 miles an hour, the tweet said.
“The driver reports the vehicle was on Autopilot. Amazingly there were no injuries!” according to the tweet.
In its owner’s manual, Tesla repeatedly warns drivers to pay attention to the road while using the semi-autonomous Autopilot system. Tesla classifies Autopilot as a “driver assistance system,” not “autonomous driving.” In other words, the system is designed only to reduce the driver’s workload by taking over repetitive and mundane tasks like staying in the lane and avoiding other moving cars.
Other automakers, including Mercedes-Benz, Nissan, General Motors and BMW, have similar systems that keep a car in its lane on the highway and that can slow down and speed up automatically to keep up with the flow of traffic.
The Tesla owner’s manual says that Traffic-Aware Cruise Control, which is part of the Autopilot system, “cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”
Traffic-aware cruise control systems from some other car manufacturers also work this way. While Tesla has not yet responded to a request for comment on its system, other manufacturers have said their systems rely on the driver to respond to stationary objects. That’s because the systems are designed to work with flowing traffic and cannot be expected to decide, on their own, how to respond to a stationary obstacle.
In a deadly crash in Florida in 2016, a Tesla Model S driver using Autopilot drove under the side of a semi truck that had turned left in front of it. In that case, the Autopilot system, which has since been updated, apparently did not recognize the white truck trailer against the bright sky. Additionally, the Autopilot system is not designed to be used on roads with crossing traffic.
According to a later report by the National Transportation Safety Board, the driver ignored repeated warnings to keep his hands on the steering wheel. The NTSB concluded that crash was not the result of any defect in the Autopilot system.
Since that incident, Tesla has changed the Autopilot system so that, if a driver repeatedly ignores such warnings, the system will stop functioning and cannot be restarted for the duration of the trip. If the driver never responds, the car will gradually slow to a stop and its flashing hazard lights will come on.