
Tesla won't anonymize Full Self-Driving accident footage anymore

Beta testers will need to accept the new stipulation to download the FSD software.

A damaged Tesla and Honda Civic sit on tow trucks after a collision in Seal Beach, CA. MediaNews Group/Orange County Register via Getty Images

Tesla has begun asking Full Self-Driving (FSD) beta testers for permission to record footage of the software being used in case of an accident, Electrek reports. Agreeing to this stipulation is now a necessary part of being accepted into the beta program.

The recording itself isn’t new; Tesla has always collected footage from beta testers of both the inside and outside of the vehicle while FSD is in use. Until now, though, that footage was deliberately anonymized. Tesla is now asking for the footage to be tied to the individual vehicle so it can be investigated in the event of a serious safety incident.

As overbearing as it may sound, the new stipulation is long overdue. Beta testers are quite literally risking their lives to try out FSD, and with anonymized footage, Tesla could never fully investigate what went wrong. That information could come in handy for the National Highway Traffic Safety Administration (NHTSA), too, now that it’s breathing down Tesla’s neck on the matter of autonomous driving-related crashes.

Warning: You might just die in this Tesla — Downloading any beta software is inherently risky. When you’re meant to trust that software to drive you around in a hunk of metal… well, the risk level is very high indeed. Last month, when FSD began rolling out to a wider test group, Tesla updated the warnings presented when downloading the software to be much more explicit about these risks:

Full Self-Driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention on the road. Do not become complacent. When Full Self-Driving is enabled your vehicle will make lane changes off highway, select forks to follow your navigation route, navigate around other vehicles and objects, and make left and right turns. Use Full Self-Driving in limited Beta only if you will pay constant attention to the road, and be prepared to act immediately, especially around blind corners, crossing intersections, and in narrow driving situations.

That warning now includes language about associating recordings with an individual vehicle:

By enabling FSD Beta, I consent to Tesla’s collection of VIN-associated image data from the vehicle’s external cameras and Cabin Camera in the occurrence of a serious safety risk or a safety event like a collision.

Accountability, perhaps — Full Self-Driving has rapidly gone from being Tesla’s promised land to its worst nightmare. After years of Tesla vowing to bring autonomous driving to the masses, the beta version of FSD only began rolling out a matter of months ago.

Recent versions of the FSD beta have been riddled with errors and, at times, genuinely dangerous. Tesla works hard to keep the worst of its testing hidden behind a mountain of NDAs, but videos of that testing still make their way into public view. They’re a consistent reminder that Full Self-Driving just isn’t road-ready yet.

This new recording requirement won’t fix FSD’s many issues. At best, it will hold Tesla a bit more accountable when FSD accidents do occur. And they will continue to happen — Tesla doesn’t seem too fazed by that fact.