The National Transportation Safety Board (NTSB) has concluded that Tesla was at least partly to blame for an Autopilot crash in 2018 that resulted in the death of Walter Huang, a 38-year-old Apple employee. Huang's Model X crashed into a safety barrier while operating in Autopilot mode. In its investigation, the NTSB found that Huang was playing games on his iPhone at the time of the crash.
NTSB Chairman Robert Sumwalt is pointing fingers at Tesla, though, because he says the automaker has ignored safety recommendations about its Autopilot driver assistance program that the agency has been issuing since 2017. “It’s been 881 days since these recommendations were sent to Tesla. We’re still waiting,” he said.
The grey area between manual and fully autonomous — The recommendations concern programs like Autopilot that can drive for long distances without human intervention, but still sometimes need the driver to take over when the systems get confused. Critics say such semi-autonomous programs lull drivers into thinking they can look away from the road, and by the time they take back the wheel to respond to an issue it can be too late to prevent disaster. The NTSB says Tesla and others must incorporate further safeguards that prevent drivers from looking away from the road while using driver assistance programs.
The issue strikes at the core of a philosophical debate surrounding self-driving technology. Proponents of existing driver assistance programs believe they are a net positive so long as drivers keep paying attention to the road, while others think the inevitable complacency of drivers makes semi-autonomous driving too dangerous, and that cars should only operate themselves once they reach Level 5 classification, or "fully self-driving" mode. In that view, either the car is driven manually or it drives itself completely, with nothing in between.
The case in favor of Tesla — At the end of the day, Tesla's cars are not fully autonomous, and the company doesn't claim they are. While Tesla should certainly take some responsibility for ensuring the safety of its customers, the company does tell drivers that Autopilot is a driver assistance feature intended to be used in cooperation with an attentive driver.
Tesla would be at fault if Autopilot had malfunctioned, but that's not what the NTSB found here. Huang's family nonetheless sued Tesla following the accident, and through that case we learned that Huang had previously complained about Autopilot malfunctioning in the same area where the crash occurred.