
Ghost In The Shell Or Driver Error? The Difficulty In Deconstructing Hyperbole In Tesla Autopilot Crashes

First it was the Tesla reaction videos, in which owners of the Tesla Model S P85D (and later P90DL) electric car filmed the reactions of unsuspecting passengers as they buried their right foot in the carpet.

Then there was the rush of Tesla Autopilot videos, showing Tesla owners experiencing the company’s semi-autonomous Autopilot functionality for the first time. Shortly afterwards came the Tesla Autopilot Fail videos, showing the occasions when the still-in-beta Autopilot software didn’t do exactly what was expected of it. And as more and more Tesla customers turn Autopilot on, we’ve seen those Autopilot Fail videos shift from demonstrations of not-so-great driving to reports of actual collisions in which Autopilot is accused — at least by the owners of the cars in question — of causing the accident.


It doesn’t look good… Photo: “Puzant” (via IMGUR)

The latest case involves a five-day-old Tesla Model X P90D which, its owner claims, suddenly accelerated while entering a parking stall, allegedly crossing ‘over 39 feet’ of planters before crashing into a building and causing significant damage to both the car and the building.


Do people really understand Autopilot?

But while the clear allegation from forum member Puzant is that Tesla’s Autopilot software is to blame for the sudden and violent crash, Tesla is hitting back, claiming logs from the car indicate the driver pushed the accelerator to the floor, causing the car to react accordingly and ultimately crash into the building.

Both parties believe the other is wrong, and the spectre of intelligent, malevolent cars has been with us ever since Stephen King released the demonic Plymouth Fury on us in 1983 when he published Christine. So who is right — and how do we deconstruct obvious hyperbole from both skeptics and advocates for Tesla’s Autopilot and autonomous cars in general?

First, it’s worth looking at the details of this incident.

According to Puzant’s posting on the Tesla website, the series of events which led to the unfortunate accident unfolded in the following manner:

 “Our 5 day old Tesla X today while entering a parking stall suddenly and unexpectedly accelerated at high speed on its own climbing over 39 feet of planters and crashing into a building.

The airbags deployed and my wife’s arms have burn marks as a consequence.

This could have easily been a fatal accident if the car’s wheels were not turned slightly to the left. If they were straight, it would have gone over the planters and crashed into the store in front of the parking stall and injured or killed the patrons

The acceleration was uncontrollable, seemed maximum and the car only stopped because it hit the building and caused massive damage to the building.

This is a major problem and Tesla should stop deliveries and investigate the cause of this serious accident.

Tesla roadside assistance, who was my only contact, asked us to tow the car to AAA storage facility.”

After being encouraged to do so by other forum members, Puzant uploaded a series of photographs alongside the description, showing the post-crash carnage. They document the damage to the front of the brand-new Model X — its license plate still to arrive — and the hole the car made in the building it crashed into. At first glance, they’re pretty scary and seem to back up Puzant’s claims.


This time, the accident happened while parking. Photo: “Puzant” (via IMGUR)

But in response to the accident, Tesla told Teslarati that, following examination of the associated vehicle logs, the car wasn’t to blame. Instead, it suggested, the driver was the one at fault. Issuing the following statement, the California automaker rejected any allegation that the car was responsible:

“We analyzed the vehicle logs which confirm that this Model X was operating correctly under manual control and was never in Autopilot or cruise control at the time of the incident or in the minutes before.

Data shows that the vehicle was traveling at 6 mph when the accelerator pedal was abruptly increased to 100 percent. Consistent with the driver’s actions, the vehicle applied torque and accelerated as instructed. 

Safety is the top priority at Tesla and we engineer and build our cars with this foremost in mind. We are pleased that the driver is ok and ask our customers to exercise safe behavior when using our vehicles.”

Of course, this isn’t the first time that we’ve heard of cars suddenly accelerating and crashing into things. As our friends at GreenCarReports note, past incidents of sudden unintended acceleration have encompassed a variety of models from various years of production, including models by Audi, Toyota and Toyota’s luxury arm Lexus. Usually, the errors were easily explained: Audi’s issue was a result of poor pedal placement, while the issues affecting Toyota Prii and certain Lexus models were traced back to either a badly-fitted, oversized carpet or driver error in which the accelerator was pressed instead of the brake.

But this example — and the other examples involving Autopilot-enabled Tesla electric cars — marks the first time we’ve had to contend with privately-owned cars that are technically capable of Level 2 automated operation on the National Highway Traffic Safety Administration’s five-level automated vehicle classification scale. Since most people aren’t even aware the scale exists, and fewer still are familiar with the inner workings of autonomous or partially autonomous vehicles, this has led to a lot of confusion, fear and misreporting.


Is Autopilot really to blame? No. Driver error and hyperbole are.

There has been a giant dose of hyperbole from both sides. So, to try to clear things up, we think it’s worth briefly explaining NHTSA’s automated vehicle scale.

At one end (Level 0), a car is fully manual in its operation. There’s no assistance system in place at all, and the driver is fully responsible for every single thing the car does. Level 1, meanwhile, covers things like traction control, ABS and rudimentary cruise control. The car takes over basic responsibility for ensuring maximum grip is available at all times, and can travel at a pre-defined speed without requiring you to put your foot on the throttle. But you’re still responsible for driving the car.


Volvo’s current autonomous vehicles are Level 3.

Level 2 is what Tesla’s Autopilot system officially qualifies as, at least in its current beta-test version. At Level 2, a car can combine two or more safety systems, such as adaptive cruise control and lane assist, to help it stay straight and true in its lane. Sophisticated Level 2 systems like Tesla’s Autopilot can even take over large chunks of your driving duties. But while they may be able to cope with most situations, they are still not truly self-driving: the driver has to retain responsibility at all times, and be ready to jump in at a moment’s notice to take over.

Level 3 takes autonomy further, describing a car that can operate independently for the majority of a trip, giving the driver advance warning when it needs them to assume control of the vehicle. At this level, drivers can cede control of their car, but are still required to be ready to take over given sufficient warning. Volvo’s DriveMe system is currently a Level 3 system. Tesla’s system, since it doesn’t give forewarning of the need for a driver to take over, is Level 2. And it’s only when you reach Level 4 — in which the car takes complete responsibility for itself and its driving choices — that you have a truly autonomous vehicle.
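For readers trying to keep these categories straight, here is a minimal sketch in Python that restates the scale as a simple lookup table. The one-line summaries are our paraphrase of the descriptions above, not NHTSA’s official wording, and the helper function is purely illustrative.

```python
# Illustrative only: a paraphrase of the five-level NHTSA automation scale
# described above, not NHTSA's official wording.
NHTSA_LEVELS = {
    0: "No automation: the driver is responsible for every single thing the car does.",
    1: "Function-specific assistance: traction control, ABS, rudimentary cruise control.",
    2: "Two or more combined systems (e.g. adaptive cruise plus lane assist); "
       "the driver must supervise at all times. Tesla Autopilot sits here.",
    3: "Limited self-driving: the car handles most of the trip and gives the driver "
       "advance warning when it needs them to take over. Volvo's DriveMe sits here.",
    4: "Full autonomy: the car takes complete responsibility for itself and its driving choices.",
}

def requires_constant_supervision(level: int) -> bool:
    """At Level 2 and below, the human driver must monitor the road continuously."""
    return level <= 2
```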

Why is this important? Well, for a start, there’s a great deal of hyperbole in the media and in Tesla’s fan base when it comes to describing the Model S and Model X as ‘self-driving cars’. And while Tesla itself is careful to note that Autopilot is a beta-software feature that should be supervised at all times, the excitement and giddy glee with which we are collectively consuming the promise of fully autonomous cars means that sometimes we’re failing to appropriately account for the dangers present.

Worse still, because we all secretly (or not-so-secretly) want to live in the future, the promise of autonomy means we’re strangely willing to suspend disbelief and go with hype over fact.


Autopilot is still beta software, and only provides Level 2 autonomy.

Which brings us back to this specific case, and some potential hyperbole on the part of the claimants. While we’re not in a position to comment one way or another as to who is to blame since we don’t have all the information, we can make some simple observations based on what we can see from the supplied images — and what we know of Tesla’s Autopilot operation.

The claim alleges that the Model X in question suddenly drove over the corner of the parking space and climbed “over 39 feet of planters” before crashing into the building. Yet the photographs accompanying the claim show a Model X with its nose less than a foot from the building it crashed into. The rear wheels of the Model X P90D have climbed the curbstone between the parking lot and the landscaped planters, but only just: its rear still hangs over the tarmac. And while it’s clear the crashed Model X was reversed slightly post-crash (presumably by either the driver or the emergency services), increasing the distance between car and building, there’s certainly not 39 feet of planters between the parking space and the building.

The Tesla Model X is 16.5 feet in length. Even if we’re feeling generous and allow four feet between the front of the car and the building (and we don’t think it is four feet), the distance between building and parking space cannot be any more than 20 feet at its maximum. A more accurate, if conservative, guess would place it nearer to 16 feet.

That’s barely half of the alleged distance, and while the acceleration may have begun while the car was 39 feet away from the building, basic math suggests that this would have occurred while the car was still outside the parking space (the average U.S. parking space being between 15 and 20 feet in length). This suggests, but does not prove, that the driver’s report isn’t entirely accurate. Then again, accident reports rarely are, thanks to the adrenaline that floods our bodies in post-collision trauma.
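To make that arithmetic explicit, here is a quick back-of-the-envelope sketch in Python. Every figure is an estimate quoted above (the published Model X length, our deliberately generous four-foot gap, typical parking-space depths), not a measurement taken at the scene.

```python
# Rough geometry check using only the estimates quoted above; none of these
# figures are measurements taken at the scene.
model_x_length_ft = 16.5     # Tesla's published Model X length
gap_to_building_ft = 4.0     # deliberately generous; the photos suggest under 1 ft
claimed_distance_ft = 39.0   # the planter distance alleged in the forum post

# The crashed car spans the planter strip with its rear still over the tarmac,
# so the strip can't be much deeper than the car plus the gap in front of it.
max_planter_depth_ft = model_x_length_ft + gap_to_building_ft   # about 20.5 ft

# If the car really covered 39 ft before impact, how far behind the planters
# must the surge have begun?
run_up_before_planters_ft = claimed_distance_ft - max_planter_depth_ft   # about 18.5 ft

# A typical U.S. parking space is 15-20 ft deep, so that run-up is roughly a
# full space length: the surge would have started at or before the space itself.
print(max_planter_depth_ft, run_up_before_planters_ft)
```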

That’s not to say we’re siding with Tesla. We don’t have the information to determine who is at fault, and we’re not accident investigators. We’re merely noting the point as something we should all remember when reading first-hand accident reports. Trauma does strange things to our brains, while computer data doesn’t (usually) lie.

The most likely scenario we can come up with? That the driver went to apply firm brake pressure to stop the car, and accidentally hit the accelerator instead.


Autopilot isn’t the fault here: driver error is the likely culprit.

Is Tesla off the hook then? No. While we think it is more likely the car accelerated due to driver error, the way in which Tesla’s Autopilot system operates still needs improvement in order to ensure that drivers really do know when they — or their cars — are in control. Moreover, there are some questions as to why the safety aspects of Tesla’s Autopilot system didn’t kick in to prevent the car from ramming into the building in question.

Additionally, given some of the reports we’ve seen and heard in recent weeks of Tesla Model S and Model X cars crashing due to confusion over Autopilot operation, we’re reiterating our concern that there’s really far too little education over how and when Autopilot should be used. Moreover, we can’t help but wonder if Tesla should have waited until a formal autonomous car framework was in place to ensure that both driver and other road users are aware of and ready for autonomous vehicles — even if they are only Level 2 — before releasing them on the road.

It is something of a PR nightmare. And unless Tesla or someone else steps up to the plate to rectify this situation quickly, we think it won’t take long for safety agencies acting for governments around the world to step in and essentially set back autonomous driving for years or even decades.

Blaming a mechanical ghost in the shell is an easy option. But at the same time, complacency over what the Autopilot system can really do — and what it should really be used for — is just as dangerous. To avoid that, the automotive world, the legal world, the press and hardened advocates all need to ensure that hyperbole has no place in autonomous vehicle discussions.



Comments

  • There was also a bug in the Toyota software that had two memory managers running the same physical memory, which could lead to a vehicle’s applications overwriting each other, resulting in an infinite acceleration value being programmed, with no user input able to do anything except rebooting the car (turning it off). They’ve since programmed the brakes to override acceleration settings and fixed the memory leak. Which is what you want, unless you’re some sort of rally driver.

    My mother’s car has several automatic functions, and she doesn’t trust it at all. It’s very easy to give the autopilot enough pedal instruction that it thinks the driver is in control when in fact the emergency system should be activating. It’s still driver error, though.

  • Joe Viocoe

    An amazingly well-written, and logical piece of writing. Thank you Nikki.

  • Joe Viocoe

    Even in a maximum effort to idiot proof a vehicle, or any computer system, human capacity will exceed that.

    So, even if Tesla puts a big physical breaker switch in the middle of the dashboard… showing the driver that there is no possible way for the autopilot to be active…. some people will still claim that electrons jump the gap.

    The cold harsh reality that human egos fail to grasp, is that our brains are notoriously poorly designed for accurate memory. It fails often and predictably during dramatic events. And can even be socially engineered by mentalists.

  • Just another victim of a Type 2, right foot pedal error as per the right foot braking epidemic? Right foot pedal errors happen 10,000 to 40,000 times per day to drivers of all ages and gender. Innocent people die. Sad that the authorities refuse to teach the safer left foot braking method and ban right foot braking on automatic cars and EV cars. These time bombs are going off all over America.

  • Andrew Hohmann

    Thank you NIKKI for the well written article. I learned to drive a manual shift transmission first, automatic transmission many years later. I recognize a person who learned to drive a clutch but carried those habits into a car without a clutch pedal, either using both feet to accelerate while backing up, or both feet during a panic stop. Not being attentive (situational awareness) or not being mentally present while having the 2 foot mindset sounds like a recipe for disaster. In my 3 pedal vehicles there’s enough room between accelerator and brake to stretch out my right leg while using cruise control to not bump the brake. I do not like the brake of my 2 pedal vehicles being too close to the accelerator. During QUICK stops my foot sometimes gets caught behind the brake pedal. I believe the problem with 2 foot driving a 2 pedal car is learning bad habits which are misapplied during panic stops. I have to wonder what pedal discipline is being ingrained with the advent of regenerative braking 1 pedal driving.

  • simon blackmore

    The comments and discussion surrounding the Tesla crash where the bloke was hands-free, probably (almost certainly) watching a video, and then failed to see the truck turning in the road, have now opened the door for every chancer who owns a Tesla to say, “It wasn’t me at fault, it was my car.”

    Any insurance claim would be scrutinised to see if the facts as relayed match the recorded, verifiable evidence of the car data logs. Make a claim that is at odds with the facts… then put them on the lie detector. Or look for neighbours who may have surveillance cameras overlooking the scene. If it transpires that a fraudulent claim has been made, then the person should suffer the consequences, even criminal or civil prosecution.

Content Copyright (c) 2016 Transport Evolved LLC