Autopilot

Tesla steps on the Autopilot accelerator as gov’t probes fatal crashes

Tesla is increasing its max Autopilot speed to 85 mph as the National Highway Traffic Safety Administration probes dozens of crashes related to the software.

Elon Musk attends the 2022 Met Gala in New York. Jeff Kravitz/FilmMagic/Getty Images

Last week, a fatal crash involving a Tesla using the company’s Autopilot feature resulted in the deaths of three people near Newport Beach, California — a tragedy that prompted the National Highway Traffic Safety Administration (NHTSA) to add the incident to its growing list of over 30 investigations involving the EV company’s “beta” self-driving mode.

Instead of, say, pausing a program that Tesla itself admits still needs a lot of work, the company is stepping on the accelerator — using a new software update this week to raise Autopilot’s maximum speed for owners from 80 to 85 mph. The increase comes nearly a year after Elon Musk’s empire announced its “Tesla Vision” Autopilot program, which moved away from radar data to rely solely on camera-derived information.

All of this likely isn’t going to help things when it comes to Tesla’s little “phantom braking” problem, reports of which have “soared since Tesla transitioned to vision-only Autopilot,” according to outlets like Electrek.

Look both ways before crossing a street, y’all. CHRIS DELMAS/AFP/Getty Images

Big deal despite bigger picture — Technically speaking, 30 incidents reported to the NHTSA are minuscule next to the millions of drivers and thousands of accidents on the road every day. But there was a time when a car company found to be producing faulty steering wheels or seatbelts would immediately issue a recall to address the dangerous defect. Tesla raising Autopilot’s maximum speed to 85 mph right now is like Ford deciding to make the Pinto’s rear bumper out of flint rock after hearing about issues with its fuel tank placement.

All of this to say, it’s clear Musk and Tesla are not taking this issue very seriously. Their eyes are on the supposed bigger picture (and likely bigger profits), even if a few people accidentally die in the process.

Hitting close to home — Should Musk have his way, there’ll be exponentially more Teslas on the road in the coming months and years. One can reasonably assume that the more vehicles on the road using Autopilot, the greater the risk of Autopilot-related crashes — some of which will be deadly. Failing to address issues with Tesla’s AI shows a blatant disregard for public safety. Until there’s real data indicating that Autopilot systems are indeed safer than human drivers, this kind of technology maybe shouldn’t be available to everyone — least of all, people like this.
