Tech

A Tesla driver slept while his Model S did 93 mph on the highway

Autopilot is meant to enable this one day, but not yet.

Myung J. Chun/Los Angeles Times/Getty Images

Stories of Tesla drivers taking advantage of their car's Autopilot feature to behave recklessly are nothing new, but this one out of Canada takes the cake. The Royal Canadian Mounted Police say a 20-year-old man was recently caught asleep while his Model S flew down a highway at speeds of up to 93 mph. Not only that, but he had his seat reclined so far you couldn't see him through the window.

"Nobody was looking out the windshield to see where this car was going," an RCMP officer told CBC News.

Although this particular driver made it out alive, other Tesla drivers haven't been so lucky. Despite the name, Autopilot is glorified lane-keeping assistance with cruise control, and it can get confused by edge-case scenarios. For instance, a driver using Autopilot was killed in 2016 when his car's sensors failed to distinguish a large white semi-truck against a bright sky and the car barreled into the truck.

Tesla distinguishes itself by selling electric cars with futuristic new tech, so it's not hard to see why the company has been adamant about offering Autopilot despite its problems. But it has also been forced to rein in the claims it can make about how much autonomous driving the feature enables. Sleeping definitely isn't on the list of intended use cases.

Philosophical debate — Tesla has an eventual goal of making its cars fully autonomous and argues that by releasing Autopilot to customers now it can collect the data needed to advance that goal. It says it requires drivers to constantly supervise their cars, though.

But critics believe that such driver assistance software poses a threat for the very reason we're seeing here: drivers take false comfort when the software performs well and stop watching the road. Even if they don't have the gall of this particular driver to take a siesta, they might still think it's okay to take their eyes off the road (and hands off the wheel) for an extended period, like the Apple engineer who died in 2018 after his car hit a concrete barrier while he played a game on his phone.

In an investigation of that fatality, the NTSB concluded that Tesla was partly responsible because it had not implemented enough of the agency's safety recommendations.

Many in the autonomous driving industry say that cars should either be operated entirely by the driver or be fully autonomous, not something in the middle. The autonomous software needs to be good enough to take over all operations, because humans can't be trusted to stay alert as backup.

Protecting drivers from themselves — Tesla's warnings about Autopilot don't seem to sink in with drivers. Some people have even found workarounds to deliberately trick the Autopilot system into thinking they have their hands on the steering wheel, like using weighted straps. In the process, they're not just putting themselves in danger, but also everyone else around them on the road.

The driver in Canada was charged with speeding and dangerous driving and is due in court in December.