Following Fatal Tesla Model S Crash, NHTSA Opens Preliminary Investigation into Tesla Autopilot

The U.S. National Highway Traffic Safety Administration confirmed today that it is opening a preliminary evaluation into Tesla Motors’ [NASDAQ:TSLA] famed Autopilot software following a recent fatal crash in which a Tesla Model S failed to notice a tractor trailer pulling across the road in front of it, subsequently colliding with the underside of the trailer at speed.

In a sombre blog post made by Tesla earlier this afternoon, the company confirmed that the incident happened on an undisclosed section of divided highway somewhere in the U.S., and was the culmination of a set of “extremely rare circumstances.”

The fatality has cast a cloud over Tesla’s Autopilot software.

“Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred,” Tesla said. “What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

In other words, neither the driver nor the car noticed the trailer until it was too late, and the timing of the collision meant that instead of colliding with the trailer or tractor’s numerous wheels, the Model S drove straight under the central part of the trailer, impacting the windscreen at high velocity and ultimately killing the driver.

Tesla says customers should hold the wheel at all times. It’s not clear whether the driver was doing so in this instance.

“It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations,” Tesla continued.

As we have covered numerous times on this site before, Tesla regularly reminds drivers that its Autopilot software is still very much in beta and should not be used without the driver’s full attention at all times. Indeed, Tesla asks customers to read and acknowledge a disclaimer before Autopilot can be activated for the first time, and visual and audio cues are given to ensure the driver keeps their hands on the wheel.
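
Tesla has not published the exact logic behind those cues, but a hands-on-wheel watchdog of this general shape would escalate from a visual warning to an audible chime before ultimately disengaging. The sketch below is purely illustrative: every name and threshold is our assumption, not Tesla’s actual implementation.

```python
# Hypothetical hands-on-wheel watchdog. The thresholds are invented
# for illustration and are not Tesla's real values.
WARN_AFTER_S = 15        # seconds hands-off before a visual warning
CHIME_AFTER_S = 30       # seconds hands-off before an audible chime
DISENGAGE_AFTER_S = 45   # seconds hands-off before handing back control

def watchdog_step(hands_on_wheel: bool, now: float, last_hands_on: float):
    """Return (alert_level, updated_last_hands_on) for one control-loop tick.

    hands_on_wheel -- True if steering-torque sensing detects the driver
    now, last_hands_on -- timestamps in seconds
    """
    if hands_on_wheel:
        return "ok", now                   # driver detected: reset the timer
    elapsed = now - last_hands_on
    if elapsed >= DISENGAGE_AFTER_S:
        return "disengage", last_hands_on  # alert driver, hand back control
    if elapsed >= CHIME_AFTER_S:
        return "chime", last_hands_on
    if elapsed >= WARN_AFTER_S:
        return "visual_warning", last_hands_on
    return "ok", last_hands_on
```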

In its blog post, Tesla also reminds us that nationally, there is a fatal crash for every 94 million miles driven. Globally, the interval is shorter, at roughly one fatality for every 60 million miles. Today’s news is, it says, the first known fatality involving Tesla Autopilot in over 130 million miles.
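
Put on a common footing, those figures are easier to compare; the obvious caveat is that a single Autopilot fatality makes the last number statistically fragile. A quick back-of-the-envelope conversion:

```python
# Convert "one fatality per N miles" into fatalities per billion miles,
# using the figures quoted in Tesla's blog post.
rates = {
    "U.S. average":    94e6,   # one fatality per 94 million miles
    "Global average":  60e6,   # one per roughly 60 million miles
    "Tesla Autopilot": 130e6,  # one known fatality in 130+ million miles
}

for name, miles_per_fatality in rates.items():
    per_billion = 1e9 / miles_per_fatality
    print(f"{name}: {per_billion:.1f} fatalities per billion miles")

# Prints roughly 10.6, 16.7 and 7.7 respectively, but the Autopilot figure
# rests on a single event and so carries very wide error bars.
```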

And while we know very little about the circumstances involving this particular collision, we think it’s fair to say from Tesla’s account that we’ve all been there: a bright, sunny day with direct sunlight making the road ahead a little more difficult to see. And most of us have had that panicked last-second moment when we register the fact that there’s something in our path we need to stop for.

In other words, this crash, however unwelcome, sounds like any number of similar tractor-trailer impacts involving human-piloted cars.

We’re curious as to why Tesla’s system didn’t see the truck.

But here’s the issue. While Tesla reiterates the need for drivers to remain at the wheel at all times, alert and ready to take over, there’s still a lot of confusion over what Tesla’s Autopilot features can and can’t do. While many view Tesla’s Autopilot as fully autonomous vehicle technology, it is only Level 2 automation by NHTSA’s own standards, with Level 0 being a vehicle with no autonomy at all and Level 4 being a vehicle that can completely drive itself with no human input.
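
For reference, NHTSA’s 2013 preliminary policy defines the full five-level scale. The summary below paraphrases the agency’s definitions rather than quoting them:

```python
# NHTSA's 2013 five-level automation scale (descriptions paraphrased).
NHTSA_LEVELS = {
    0: "No automation: the driver is in complete and sole control at all times.",
    1: "Function-specific automation: individual functions, such as "
       "electronic stability control, assist the driver.",
    2: "Combined-function automation: at least two functions work together, "
       "e.g. adaptive cruise control plus lane centering (Tesla Autopilot).",
    3: "Limited self-driving: the car handles all safety-critical functions "
       "under some conditions, with the driver available to take over.",
    4: "Full self-driving: the vehicle performs the entire trip unaided.",
}
```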

In the coming days, weeks and months, there will be a lot of tough questions asked of Tesla and a lot of scrutiny of its Autopilot system. The first will likely focus on how the crash happened — and why Tesla’s multiple sensors: ultrasonic, visual and sonar, failed to see the truck.

Data already suggests that Tesla’s Autopilot system — just like those being developed by rival companies like Volvo, Google, Nissan, BMW, Ford, Mercedes-Benz and Audi — is far safer than a human driver, reducing the incidence of accidents and making the roads safer. But until the investigation is complete, we hope Tesla temporarily deactivates its Autopilot system as a stop-gap measure, either until more is known or until a way of ensuring the driver is alert and ready to take over at all times is devised.

Today’s news sadly reminds us that, with autonomous vehicle technology still in its infancy, mistakes like this will be made. But now we as a society have to decide if those mistakes are acceptable on the public highway. We have to decide if statistically reducing the number of fatalities is enough or if we expect our cars to be the perfect drivers we’ll never be.

Choosing the former will mean more accidents like this and more incidents of apparent Autopilot failure, whether caused by environmental conditions or by system faults. If that is our choice, we will have to make peace with it.

If we choose the latter, then it could set autonomous vehicle technology back a very long way indeed, because no matter how good a system is, there will always be edge cases nobody planned for.

We don’t envy the legislators or the automaker facing that decision.

Our thoughts go out to the family of the Tesla owner who died, and on behalf of our team and our readers, we hope their loss will not be in vain.


Want to keep up with the latest news in evolving transport? Don’t forget to follow Transport Evolved on Twitter, like us on Facebook and G+, and subscribe to our YouTube channel.

You can also support us directly as a monthly supporting member by visiting


Comments

  • As a human driver who recently had to full-stop on a tight-curve two-lane road that was blocked by an idiot turning around in a long-bed king-cab truck… Did I mention it was a two-lane, no-shoulder, tight corner on a 4% grade? Yeah.

    When you’re driving on the highway, conditions can turn very unfavorable very fast. I hope they work out a better safety protocol so the ‘clothesline’ doesn’t get repeated.

    And just like fog or snow-blindness or getting a mirrored reflection in your eye, the lidar the level 2 cars use can be blinded. Near sun-set can be really dangerous for a driver’s vision, human or robot.

    • Joe Viocoe

      I’d imagine a “Predator”-like system that can switch spectrums on the fly.

    • Electric Bill

      The incident you describe angers me terribly. I assume the jackass that nearly got you killed still has his license and may be out there ready to kill someone else. When these incidents happen, it should be an automatic suspension or revocation of a truck driving license, and their automobile license should also have some restrictions, since they are a danger to traffic even in a car… just with less opportunity for massive carnage.

      I assume that since you avoided the accident due to your quick reflexes, you never found out who nearly killed you, and he never had any repercussions for his idiocy.

      There is a solution to this: every commercial driver should be required to have a GPS that cannot be tampered with, which records exactly where they are at all times while driving. When someone reports such an incident, a log database could spit out which trucker was at the scene when the incident occurred. We need to get anyone off the road who is not attentive enough to realize that in whatever situation, such as the one you were in, they are putting others at risk… they can find some other field of employment, but if they are that spaced out and oblivious to the threat they pose, truck driving should be eliminated from their list of employment options.
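
      As a minimal sketch of the lookup being described, assume each log record is simply (driver, timestamp, latitude, longitude); every name below is hypothetical:

      ```python
      from math import asin, cos, radians, sin, sqrt

      def haversine_miles(lat1, lon1, lat2, lon2):
          """Great-circle distance between two points, in miles."""
          lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
          a = (sin((lat2 - lat1) / 2) ** 2
               + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
          return 2 * 3956 * asin(sqrt(a))

      def drivers_near(log, lat, lon, when, radius_miles=0.5, window_s=600):
          """Return drivers whose logged positions put them near an incident.

          log  -- iterable of (driver_id, unix_time, lat, lon) records
          when -- unix time of the reported incident
          """
          return sorted({
              driver for driver, t, la, lo in log
              if abs(t - when) <= window_s
              and haversine_miles(la, lo, lat, lon) <= radius_miles
          })
      ```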

      Most truckers are responsible. But even tiny numbers of jackass drivers are capable of causing immense grief, such as the one who plowed into the back of a mini van in my area recently, causing two families to burn to death.

      It is against the law here in California to block an intersection… i.e., enter an intersection only if you can pass through it without blocking cross traffic. I assume that is a law that is standard in other states and countries. Far too often I am blocked in exactly that way by zombie drivers… yesterday I saw two lanes of traffic blocked in front of a fire station which was clearly marked DO NOT BLOCK. I frequently see bus drivers… even the super-long, articulated buses… pull right into intersections, blocking them. You can see that even after doing so, they are oblivious to the danger they pose to emergency traffic and to the ordinary flow of cross traffic, causing gridlock. I wish the mayor would tell the police chief to make it a priority to cite such drivers once a month, the same way they do with DUI traffic stops.

  • Ad van der Meer

    I think it would be just wrong to turn off Autopilot altogether. I think the way Tesla goes about developing autonomous driving is the fastest way to safer cars, and yes, there will be casualties along the way.
    First, the truck driver is responsible for the accident. Period!
    We don’t know why the Tesla driver didn’t use his brakes. Was he distracted, asleep, incapacitated?
    If he was distracted, shame on him. You have to keep your eyes on the road. He was an experienced Tesla driver with ample AP experience. He knew the rules of the game.
    If he was asleep or incapacitated I think there is a great way to improve safety. I know at least Mercedes-Benz has the ability to detect when a driver falls asleep. Tesla should add this functionality. This way the car could warn the driver and maybe even bring the car to a stop if the driver does not respond.
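
    The escalation described here (warn first, then bring the car to a controlled stop) might be structured roughly as in the sketch below; it is purely illustrative, neither Tesla’s nor Mercedes-Benz’s actual logic, and all thresholds are invented:

    ```python
    import enum

    class Response(enum.Enum):
        NONE = 0
        VISUAL_WARNING = 1
        AUDIBLE_ALARM = 2
        CONTROLLED_STOP = 3  # hazard lights on, slow gently to a halt

    def escalate(driver_alert: bool, seconds_unresponsive: float) -> Response:
        """Escalating response to a driver who appears asleep or incapacitated.

        driver_alert -- True if drowsiness detection judges the driver attentive
        seconds_unresponsive -- time since the driver last reacted to an alert
        (thresholds are illustrative, not any manufacturer's real values)
        """
        if driver_alert:
            return Response.NONE
        if seconds_unresponsive < 5:
            return Response.VISUAL_WARNING
        if seconds_unresponsive < 15:
            return Response.AUDIBLE_ALARM
        return Response.CONTROLLED_STOP
    ```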

    I guess it wouldn’t hurt to instruct current and future Tesla drivers even more strongly on the limitations of Autopilot.

  • Martin Lacey

    I feel for the family of the deceased.

    Having lost my father in a car crash, I understand the shock and something of the raw emotions they must be going through at this time. What makes it worse is the media interest, not in the fatality, but rather in the dubious honor of being the first Autopilot fatality.

    I hope the media storm soon blows over so they can grieve in peace.

  • Joe Viocoe

    “But until the investigation is complete, we hope Tesla temporarily deactivates its Autopilot system as a stop-gap measure until more is known or a way of ensuring the driver is alert and ready to take over at all times is devised.”

    Worst idea ever.
    First, it sets a precedent that we should react before more information is known.
    More importantly, it stops all learning of the system.
    And what happens when someone is killed by a Model S driver who would have been on Autopilot, but couldn’t be because of this hold?

    • When an airplane autopilot system malfunctions (or a plane crashes) air authorities are usually very cautious about allowing planes of the same type to go back in the air if there’s a suspected system malfunction rather than human error.

      In this case, it is probably a combination of both, but right now, we’re still waiting on NHTSA regulations on autonomous vehicles. Remember though that Tesla’s active safety features work with Autopilot engaged or disengaged. The suggestion to switch off Autopilot (while keeping the active safety systems on) would help ensure that drivers pay more attention rather than relying on the system too much (which it appears too many people do).

      • Joe Viocoe

        Every action of aircraft maintenance and operation is overseen by the FAA, but NHTSA doesn’t get involved with a decision to activate or not activate an optional feature on a car. Especially not after the vehicle is sold.
        Not even mandatory recalls could force a driver to stop using a vehicle.

        I understand that Tesla creates a new circumstance with the ability to dramatically change a vehicle’s function over the air… But there is nothing to indicate that NHTSA has the ability to compel Tesla to remove a feature that isn’t even active unless the driver deliberately enables it.

        Furthermore, it still sets a bad precedent to have Tesla take responsibility away from drivers, and override the decisions of adult customers.

      • Joseph Dubeau

        After all it’s only beta software.

  • Electric Bill

    “…why Tesla’s multiple sensors: ultrasonic, visual and sonar, failed to see the truck.” Sonar IS ultrasonic. You mean ultrasonic, visual and radar.

  • CDspeed

    This is a street view of the intersection where the crash occurred, not the street on the right side, but the one ahead: do you see a well-marked four-way intersection? I use roads like this myself to cross Florida; there are no signs, they’re remote, and people travel them like highways doing 80-plus.

Content Copyright (c) 2016 Transport Evolved LLC