From designing custom circuit boards to meet its exact requirements rather than relying on off-the-shelf components from tier-one automotive suppliers, to its mall-based Tesla Stores and post-purchase over-the-air software updates, Tesla Motors is known for thinking outside the box.
In fact, it’s Tesla Motors’ [NASDAQ:TSLA] software-oriented design process and responsive problem solving that have helped it adapt quickly to challenges as they arise. The result? A constantly evolving, ever-improving product for both existing and future customers, one that is updated as soon as improvements become available rather than at the next model-year refresh. And when things aren’t to Tesla’s liking, be it a regulatory challenge or a technological one, it does what any good engineer would do: it hacks its environment.
As The Wall Street Journal detailed last week, that’s exactly what the Californian automaker is doing to solve one of the biggest challenges facing the evolution of autonomous vehicle technology today. Who is to blame when an autonomous vehicle crashes?
Tesla’s answer? To require the driver of its luxury Model S electric sedan to initiate any autonomous overtaking maneuver by tapping the turn signal. That way, in the event of an accident, it was the person, not the car, who ultimately deemed the overtaking maneuver safe.
The question over autonomous vehicle crash liability is one that automakers, legislators and insurance specialists have been trying to answer for many years. Indeed, without figuring out the answer, autonomous vehicles can’t take off: insurance companies won’t insure cars where driver liability is in question and automakers won’t be able to afford to produce autonomous cars if each and every car sold could theoretically land it in court.
Later this year, probably some time in Q3, Tesla Motors will push version 7.0 of its vehicular operating system to customers’ cars around the world. That software, Tesla CEO Elon Musk promised earlier this year, will include expanded autonomous driving functionality, including autonomous parking, lane following, and automatic passing on the freeway.
After Tesla rolls out that update, any Tesla Model S made after October last year will instantly become capable of autonomous driving.
Right now, however, the insurance industry and legislators haven’t caught up.
While certain U.S. states and some countries around the world now allow autonomous vehicles on public roads under close supervision from fully-trained ‘minders’ employed by each vehicle’s automaker, doing so still requires the company conducting the tests to post impossibly high bonds to ensure that if things go wrong, there’s adequate insurance to put them right. Asking a private vehicle owner, or indeed their insurance company, to meet those high levels of coverage is impossible.
The Wall Street Journal hints that unnamed insiders at Tesla Motors believe the simple, low-tech solution of having drivers tap the turn signal for each and every overtake makes it possible for the luxury sedan to take over around 90 percent of freeway driving duties without requiring a change in current law.
That’s because instead of being a fully-featured autonomous driving system, Tesla’s autopilot functionality (a name we note Tesla CEO Elon Musk has used from the very beginning to keep the emphasis on the driver rather than the car) is essentially a driver aid.
It’s not a driver replacement.
Instead, Tesla’s autopilot system looks set to be marketed as a logical evolution of existing automotive technology already in widespread use around the world. For example, many cars already feature lane-keeping assist systems, autonomous emergency braking, and adaptive cruise control.
Tesla’s autopilot system essentially combines all of these, and adds steering control to the mix.
But while this clever interpretation of the rules means Tesla should be able to activate its semi-autonomous software in all compatible cars in the near future, it also opens up a completely different can of worms.
For example, while the software won’t require drivers to hold the steering wheel or touch the pedals while it is active, drivers will still need to remain alert and able to take over in an emergency. It follows that drivers must remain free from distraction while the car is in motion under autopilot operation.
If the driver is still ultimately responsible for the car, does that mean any additional distraction, such as operating a mobile phone or laptop computer, is as illegal while the car is driving itself as it would be when the driver is piloting the vehicle?
We’re sure regulators and automakers will still spend many hours trying to solve that particular conundrum, even if Tesla has solved the matter of liability.
What are your thoughts? Who should be held responsible if there’s an accident? And do you think Tesla’s solution will be enough to convince legislators that self-driving cars are safe?
Leave your thoughts in the Comments below.
You can also support us directly as a monthly supporting member by visiting Patreon.com.