Following Fatal Tesla Model S Crash, NHTSA Opens Preliminary Investigation into Tesla Autopilot

The U.S. National Highway Traffic Safety Administration confirmed today that it is opening a preliminary evaluation into Tesla Motors’ [NASDAQ:TSLA] famed Autopilot software following a recent fatal crash in which a Tesla Model S failed to notice a tractor-trailer pulling across the road in front of it and collided with the underside of the trailer at speed.

In a sombre blog post published earlier this afternoon, Tesla confirmed that the incident happened on an undisclosed section of divided highway somewhere in the U.S. and was the culmination of a set of “extremely rare circumstances.”

The fatality has cast a cloud over Tesla’s Autopilot software.

“Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred,” Tesla said. “What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

In other words, neither the driver nor the car noticed the trailer until it was too late, and the timing of the collision meant that instead of striking the tractor or the trailer’s numerous wheels, the Model S drove straight under the central part of the trailer, which struck the windscreen at high speed, ultimately killing the driver.

Tesla says customers should hold the wheel at all times. It’s not clear if the driver was in this instance.

“It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations,” Tesla continued.

As we have covered numerous times on this site before, Tesla regularly reminds drivers that its Autopilot software is still very much in beta and should not be used without full attention from the driver at all times. Indeed, Tesla asks customers to read and acknowledge a disclaimer before Autopilot can be activated for the first time, and visual and audio cues are given to ensure the driver keeps their hands on the wheel at all times.

In its blog post, Tesla also reminds us that nationally, there is a fatal crash for every 94 million miles driven. Globally, the rate is worse, at around one fatality for every 60 million miles. Today’s news is, it says, the first known fatality involving Tesla Autopilot in over 130 million miles.

And while we know very little about the circumstances involving this particular collision, we think it’s fair to say from Tesla’s account that we’ve all been there: a bright, sunny day with direct sunlight making the road ahead a little more difficult to see. And most of us have had that panicked last-second moment when we register the fact that there’s something in our path we need to stop for.

In other words, this crash, however unwelcome, sounds like any number of similar tractor-trailer impacts involving human-piloted cars.

We’re curious as to why Tesla’s system didn’t see the truck.

But here’s the issue. While Tesla reiterates the need for drivers to remain at the wheel at all times, alert and ready to take over, there’s still a lot of confusion over what Tesla’s Autopilot features can and can’t do. Many view Tesla’s Autopilot as fully autonomous vehicle technology, but by NHTSA’s own standards it is only a Level 2 autonomous vehicle, with Level 0 being a vehicle with no autonomy at all and Level 4 being a vehicle that can completely drive itself with no human input.

In the coming days, weeks and months, there will be a lot of tough questions asked of Tesla and a lot of scrutiny of its Autopilot system. The first will likely focus on how the crash happened — and why Tesla’s multiple sensors (camera, radar and ultrasonic) failed to see the truck.

Data already suggests that Tesla’s Autopilot system — just like those being developed by rival companies like Volvo, Google, Nissan, BMW, Ford, Mercedes-Benz and Audi — is far safer than a human driver, reducing the incidence of accidents and making the roads safer. But until the investigation is complete, we hope Tesla will temporarily deactivate its Autopilot system as a stop-gap measure, or devise a way of ensuring the driver is alert and ready to take over at all times.

Today’s news sadly reminds us that with autonomous vehicle technology still in its infancy, mistakes like this will be made. But now we as a society have to decide if those mistakes are acceptable on the public highway. We have to decide if statistically reducing the number of fatalities is enough or if we expect our cars to be the perfect drivers we’ll never be.

Choosing the former will mean more accidents like this and more incidents of apparent Autopilot failure, whether caused by environmental conditions or system failure. And if we choose that path, then we’re going to have to make peace with it.

If we choose the latter, then it could set autonomous vehicle technology back a very long way indeed, because no matter how good a system is, there will always be edge cases nobody planned for.

We don’t envy the legislators or the automaker facing that decision.

Our thoughts go out to the family of the Tesla owner who died, and on behalf of our team and our readers, we hope their loss will not be in vain.
