Tesla Owner Proves (Again) That Autonomous Driving Won’t Cure Driver Irresponsibility

While other automakers may have more advanced autonomous driving systems in development, Tesla’s Autopilot feature — available to anyone with a Tesla Model S or Tesla Model X electric car made after October 2014 — is undeniably the most widely used autonomous drive system in the world today.

But while Tesla’s Autopilot system can now handle the majority of driving duties for an average daily commute with only minimal interaction from its owner, it is still officially a beta feature that requires full supervision at all times. Indeed, when activating Tesla Autopilot for the first time, customers must accept a multi-page disclaimer acknowledging that they understand the implications and risks associated with using the software.


Yet as we’ve seen demonstrated multiple times in the past, just because Tesla asks its customers to digitally accept a legal waiver before using Autopilot — and impresses on them the importance of remaining alert and ready to take over at all times — doesn’t mean they always behave as they should behind the wheel.

Even Google’s self-driving cars have people behind the wheel, ready to jump in.

The latest example of such disregard for these warnings comes in the form of a video posted to Reddit yesterday afternoon by a user named isevenx. Entitled “Caught Sleeping In Traffic With Autopilot,” the short clip shows a Tesla Model S owner apparently asleep at the wheel of their car.

We’re adding the ‘apparently’ because, as always with this kind of video, it’s hard to prove its authenticity — especially when the clip is so short, meaning it could (conceivably) be a little more staged than it appears. Either way, however, it demonstrates some rather irresponsible behavior on the part of the person behind the wheel, as they clearly have their eyes closed and their head resting against the seat belt.

Irresponsible, yes, but what about illegal? That depends rather on where you are. While some states and countries require the driver to hold onto the wheel of a moving car at all times, we’ve chatted with local law enforcement officers who tell us that as long as the car is ‘under control’ there’s little they can do to ticket the individual responsible. The definition of ‘under control’ is, of course, subjective, so individual enforcement agencies and officers will interpret the statute differently. One cop may pull over a driver for not holding the wheel, while another may not mind as long as the car is being driven responsibly.

In any case, we’d guess that closing your eyes and taking forty winks while behind the wheel is most certainly grounds for a traffic stop and quite possibly a hefty fine to boot.

This kind of full autonomy is some way away.

Which brings us to the next question: is it actually possible to fall asleep behind the wheel of a Model S while it is driving in Autopilot mode?

That depends on who you ask. According to those who responded to the original post on Reddit, while it is technically possible to let go of the wheel for extended periods of time, Tesla’s Autopilot software will start to nag drivers if it thinks there’s nobody holding the wheel. That length of time seems to vary depending on the road conditions and how fast the traffic is moving, as well as which country you happen to be in.

In the UK, for example, the car will nag you almost immediately if you let go of the wheel, since it’s illegal to not hold the wheel at all times. But, according to the responses posted in the Reddit thread in question — and again, we should point out there’s no way of verifying the authenticity of any of them — the time it takes a Model S or Model X in Autopilot mode to get upset about the lack of human input can range from a few minutes to more than half an hour.

If these warnings are ignored, Tesla’s Autopilot software will eventually bring the car to a safe stop and turn on its hazard warning lights — though we’ve never heard of it happening in real life.

The car may steer, accelerate and brake by itself, but there’s still a human behind the wheel.

All told, the majority of Redditors commenting on this post are like us — a little skeptical that the person behind the wheel was actually asleep — but it does serve as an excellent platform for the discussion we all need to have: at what point is it safe or appropriate for a car to completely take over the driving?

Legally, Tesla places the onus on its drivers to remain alert and ready to take over manual control during Autopilot operation. And in the rare cases where customers have accused Autopilot of failing, the cause can be traced back to an owner either failing to engage Autopilot or failing to understand exactly how it operates.

Not all automakers are the same. Volvo, for example, is so confident in its autonomous driving technology that it has said it will assume full responsibility for any accident involving one of its self-driving cars, provided the autonomous system was in control of the vehicle at the time. For now, however, Volvo’s stance is the exception to the rule: most automakers prefer a Tesla-esque policy of placing the human — not the computer — at the heart of the equation.

And that makes sense, at least until self-driving software and hardware can pass the same driving tests that humans have to pass. We’re already well on the way to achieving that, following the decision by the National Highway Traffic Safety Administration that Google’s self-driving software is good enough to be considered the ‘driver’ of a vehicle under federal law.

Until every automaker and every self-driving car fits into that same legal classification, however, we think the onus has to remain on the licensed person behind the wheel.

Volvo is the only automaker to indemnify its customers against autonomous accidents.

Which brings us to our final point: distraction. Distracted driving is an ever-increasing problem on our roads, and advanced safety features — like Tesla’s Autopilot — do help minimize and mitigate the kind of fender benders often caused by not paying attention to the road ahead. But until systems like Tesla’s Autopilot are out of beta and fully approved to take over complete control of a vehicle — something Volvo has long maintained Tesla’s system is incapable of doing right now — paying attention to the road ahead needs to be mandatory.

Could Tesla and other automakers work to ensure that their drivers really are paying attention? Systems to detect distracted drivers have existed in high-end cars for several years, and there’s no reason why a similar system couldn’t be employed in autonomous cars to ensure that the person behind the wheel isn’t nodding off. If we had to guess, we’d say Tesla is probably already considering such a system for future Autopilot rollouts.

Which means one thing and one thing only: for now, however good Autopilot and other systems are, the driver needs to keep their eyes on the road. Anyone who doesn’t or can’t is asking for trouble and — if we’re completely honest — shouldn’t be behind the wheel in the first place.



Want to keep up with the latest news in evolving transport? Don’t forget to follow Transport Evolved on Twitter, like us on Facebook and G+, and subscribe to our YouTube channel.

Comments

  • Are companies like Tesla expecting that auto-pilot will cause less accidents and decrease driver responsibility? I’m not sure that’s really the way forward, considering the irresponsibility of some of these drivers.

    Ruby from

  • Martin Lacey

    Unbelievable and irresponsible if true. Autopilot is still only in beta stage and whilst it uses hive learning, it still has much to learn for all scenarios.

  • Mark Melocco

    I wonder how many nose to tail fender benders occur because the driver momentarily nods off. We will never know, just like we will never know if this was the Tesla preventing an accident in this case.
    Then there is the decision to make a video rather than perhaps sounding the horn with the hope of waking the dozing driver. Human nature probably won’t change but it’s likely that autonomous technology will advance and reduce accidents.

  • Albemarle

    With human nature, there seems to be one of two choices; ignore the technology, like cruise control where no one seems to use it despite it being pretty universal, and abuse the technology by not even making an attempt to drive. I am coming around to the Google idea: don’t give the idiots any controls if you don’t want chaos. Humans don’t share responsibility well.

  • Wild

    So was this driver actually sleeping or dead? Perhaps Tesla could add a software update to drive directly to the hospital. Or a motel?

Content Copyright (c) 2016 Transport Evolved LLC