To Avoid The Question of Who’s To Blame in Autonomous Vehicle Accidents, Tesla Has a Low-Tech Solution

From designing custom circuit boards to its exact specifications rather than relying on off-the-shelf components from tier-one automotive suppliers, to its innovative mall-based Tesla Stores and post-purchase over-the-air software updates, Tesla Motors is known for thinking outside the box.

Tesla’s solution to keeping autonomous vehicle liabilities clear? Asking the driver to initiate each overtake.

In fact, it’s Tesla Motors’ [NASDAQ:TSLA] software-oriented design process and responsive problem-solving that have helped it quickly adapt to challenges as they present themselves. The result? A constantly evolving, ever-improving product for both existing and future customers, one which is updated as soon as improvements become available, rather than at the next model-year refresh. And when things aren’t to Tesla’s liking — be it a regulatory challenge or a technological one — it does what any good engineer would do: it hacks its environment.

Insurance companies won’t insure cars where driver liability is in question, and automakers won’t be able to afford to produce autonomous cars if each and every car sold could theoretically land them in court.

As The Wall Street Journal detailed last week, that’s exactly what the Californian automaker is doing to solve one of the biggest challenges facing the evolution of autonomous vehicle technology today: who is to blame when an autonomous vehicle crashes?

Tesla’s answer? To force the driver of its luxury Model S electric sedan to initiate any autonomous overtaking maneuver by tapping the turn signal. That way, in the event of an accident, it is the person, not the car, who ultimately deemed the overtaking maneuver safe.

The car will do the hard stuff — but the driver solves the liability issue.

The question of autonomous vehicle crash liability is one that automakers, legislators and insurance specialists have been trying to answer for many years. Indeed, without an answer, autonomous vehicles can’t take off: insurance companies won’t insure cars where driver liability is in question, and automakers won’t be able to afford to produce autonomous cars if each and every car sold could theoretically land them in court.

Later this year, probably some time in Q3, Tesla Motors will push version 7.0 of its vehicular operating system to customers’ cars around the world. That software, Tesla CEO Elon Musk promised earlier this year, will include expanded autonomous driving functionality, including autonomous parking, lane following, and automatic passing on the freeway.

After Tesla rolls out that update, any Tesla Model S made after October last year will instantly become capable of autonomous driving.

Tesla’s solution seems to be watertight — but it could lead to other questions.

Right now, however, the insurance industry and legislators haven’t caught up.

While certain states in the U.S. and some countries around the world now allow autonomous vehicles on public roads under close supervision from fully-trained ‘minders’ employed by each vehicle’s automaker, doing so still requires the company conducting the tests to post impossibly high bonds to ensure that if things go wrong, there’s adequate insurance to put things right. Asking a private vehicle owner, or indeed their insurance company, to meet those levels of coverage is simply not practical.

The Wall Street Journal hints that unnamed insiders at Tesla Motors believe the simple, low-tech solution of having drivers tap the turn signal for each and every overtake makes it possible for the luxury sedan to take over around 90 percent of freeway driving duties without requiring a change in current law.

That’s because instead of being a fully-featured autonomous driving system, Tesla’s autopilot functionality — a name that we note Tesla CEO Elon Musk has used since the very beginning to ensure emphasis is placed on the driver rather than the car — is essentially a driver aid.

It’s not a driver replacement.

Tesla’s autopilot functionality is a driver aid — not a driver replacement.

Instead, Tesla’s autopilot system looks set to be marketed as a logical evolution of existing automotive technology in widespread use around the world. For example, many cars already feature lane-keeping assistance, autonomous emergency braking, and adaptive cruise control.

Tesla’s autopilot system essentially combines all of these, and adds steering control to the mix.
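To make that gating rule concrete: Tesla hasn’t published its control logic, so the short Python sketch below is purely illustrative, and every name in it (LaneChangeState, next_state, and the boolean inputs) is invented for this example. What it captures is the idea the article describes: the system can follow a lane and spot passing opportunities on its own, but the only path into an overtake is a driver input.

```python
from enum import Enum, auto


class LaneChangeState(Enum):
    LANE_FOLLOW = auto()      # system steers and keeps pace within the current lane
    AWAITING_DRIVER = auto()  # a pass looks possible; system waits for the driver
    OVERTAKING = auto()       # driver has signalled; system executes the lane change


def next_state(state: LaneChangeState,
               pass_opportunity: bool,
               turn_signal_tapped: bool) -> LaneChangeState:
    """Advance the lane-change logic one tick.

    The key property: the only transition into OVERTAKING is the
    driver's turn-signal tap, so the human, not the machine, makes
    the judgement call that the maneuver is safe.
    """
    if state is LaneChangeState.LANE_FOLLOW and pass_opportunity:
        return LaneChangeState.AWAITING_DRIVER
    if state is LaneChangeState.AWAITING_DRIVER:
        if turn_signal_tapped:
            return LaneChangeState.OVERTAKING    # driver authorized the pass
        if not pass_opportunity:
            return LaneChangeState.LANE_FOLLOW   # the gap closed; stand down
        return state
    if state is LaneChangeState.OVERTAKING:
        return LaneChangeState.LANE_FOLLOW       # maneuver complete
    return state


# Example: the car spots a slower vehicle, then the driver taps the stalk.
s = LaneChangeState.LANE_FOLLOW
s = next_state(s, pass_opportunity=True, turn_signal_tapped=False)  # AWAITING_DRIVER
s = next_state(s, pass_opportunity=True, turn_signal_tapped=True)   # OVERTAKING
```

Hypothetical as the details are, the shape of the design is what matters for liability: in a scheme like this there is simply no code path by which the machine begins a pass unprompted.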

But while this clever interpretation of the rules means that Tesla should be able to activate its semi-autonomous software in all compatible cars in the near future, it also opens up a completely different can of worms.

For example, while the software won’t require drivers to hold the steering wheel or touch the pedals while activated, drivers will still need to remain alert and be able to take over in the event of an emergency. The logical consequence is that drivers must remain free from distraction while the car is in motion under autopilot operation.

Tesla’s clever interpretation means we could see auto-pilot Model S cars by Q3

If the driver is still ultimately responsible for the car, does that mean any additional distraction, such as operating a mobile telephone or laptop computer, is as illegal while the car is driving itself as it would be when the driver is piloting the vehicle?

We’re sure regulators and automakers will still spend many hours trying to solve that particular conundrum, even if Tesla has solved the matter of liability.

What are your thoughts? Who should be held responsible if there’s an accident? And do you think Tesla’s solution will be enough to convince legislators that self-driving cars are safe?

Leave your thoughts in the Comments below.


______________________________________

Want to keep up with the latest news in evolving transport? Don’t forget to follow Transport Evolved on Twitter, like us on Facebook and G+, and subscribe to our YouTube channel.

You can also support us directly as a monthly supporting member by visiting Patreon.com.

Comments

  • jeffsongster

    Seems a brilliant solution since letting the car take over completely will doom us to complacency and over trusting. Then when the Sontaran Atmos system takes over it will drive us into lakes and such. Bad Idea. Let the dumb humans participate minimally in the driving experience so they can be blamed when it has bugs. Gonna be interesting when the first big bug is introduced via overnight update and all the motorways, autobahns and freeways of the world are paralyzed as autonomous toddlers playing with legos in their self driving cars on their way to kindergarten park and are software bricked until all 2 million of them can be towed back to the factories… frantic parents in tow trucks waving to the waifs they cannot extract from the vehicles until the security systems are rebooted… you get the idea… yikes and wow at the same time. Next decade is gonna be fun?

    • Joe Viocoe

      Once again… an easy solution.
      Air gap the autopilot functions. There is no need for that sub-system to ever be connected to the larger network.
      But even if Tesla does allow over-the-air updates to the autopilot on final release… there should be an extensive testing and validation program that mitigates that risk.

      You cannot compare this to personal computers, because in that industry the risk is different… so the rollout of new features doesn’t have to meet strict regulations.
      For aircraft control systems, the regulatory compliance standards for new software are MUCH stricter. For defense systems, even worse.

      Bottom line… Y2K was a big fear, because people didn’t understand the systems.

  • Cameras embedded in bumper corners should also be a consideration. Their cost is trivial now thanks to smartphone ubiquity and mass production. Such cameras could even double as a replacement for highway patrol by allowing drivers to red-flag erratic driving (speeding with the flow of traffic, or without any traffic on the road, should yield no red flags).

  • D. Harrower

    IMO, Tesla’s autopilot solution is a nightmare scenario. It still requires the driver to be alert and ready to intervene at a moment’s notice, but doesn’t actually have them actively participate in the act of driving. This would be like being forced to watch a video of someone driving down the highway. Your attention would drift in about 30 seconds.