
String of Tesla crashes triggers 23 new NHTSA investigations

The Autopilot driver assistance technology has been criticized for encouraging reckless behavior.

[Image: SEAL BEACH, CA, AUGUST 15 — A damaged Tesla sits on a tow truck after a collision in the HOV lane on... Credit: MediaNews Group/Orange County Register via Getty Images]

The National Highway Traffic Safety Administration (NHTSA) is reportedly investigating 23 crashes involving Tesla vehicles. That news comes from U.S. News & World Report, which says four of the investigations have been completed.

Earlier this week, a Tesla Model Y rear-ended a police car while operating in Autopilot mode, the driver assistance software that Tesla hopes will someday make its cars fully autonomous. In another recent incident, a Tesla collided with a truck, sending two people to a hospital. That vehicle was not operating in Autopilot mode, but the driver has been charged with reckless driving.

Reckless behavior — Tesla warns drivers to keep their eyes on the road when using Autopilot, but the automaker has been criticized over the years for not doing enough to rein in reckless behavior, such as drivers playing video games, watching movies, and even sleeping while their cars are in Autopilot mode. Last year, a driver in Canada was found catching some shuteye while his Model S barreled down a highway at 93 mph. In both 2016 and 2019, Tesla drivers died after their cars drove beneath tractor-trailers.

In March 2018, an Apple employee driving a Model X in Autopilot mode was killed when his car hit a concrete barrier. An investigation concluded that the driver knew from prior experience that Autopilot struggled on the particular stretch of road where the crash occurred. Nonetheless, at the time of the crash, he had Autopilot engaged and was playing a game on his phone.

Real-world testing — A string of 23 crashes wouldn't necessarily prompt an investigation on its own; plenty of Toyota Camrys crash every day without drawing a federal inquiry. But the NHTSA does investigate when new technology like Autopilot may be involved.

Tesla has gone against the grain in introducing self-driving technology, opting for inexpensive cameras and using its network of early-adopting owners as guinea pigs who test the Autopilot software and submit feedback when it goes awry. The hope is that the software will learn from the more than one million Tesla cars already on the road. But Autopilot is far from ready for truly driverless operation, and drivers have repeatedly shown a willingness to ignore its safety warnings. Competitors like Google don't think self-driving technology should be rolled out until it's ready to handle 100 percent of driving.

If Tesla is proven right, it will be able to offer self-driving at much more affordable prices than the competition. But today, Autopilot is a driver assistance technology more than anything else. It may help avoid incidents in some cases, such as preventing drivers from drifting out of their lane, but the driver still needs to watch the road closely. And Tesla needs to do more to emphasize that point. In Germany, the company has been banned from even marketing its technology as "Autopilot" so as not to deceive customers about its capabilities.