Ever since Tesla enabled Autopilot functionality in hardware-equipped Tesla Model S and Tesla Model X electric cars, we’ve heard of countless instances in which Tesla’s semi-autonomous driver assistance technology has helped keep drivers and passengers from being involved in some pretty nasty accidents.
We’ve also heard reports of serious collisions involving Tesla electric vehicles in which Autopilot was (erroneously) blamed, but where fault was ultimately laid at the feet of the driver.
This week, just such a story surfaced on Reddit: the driver initially blamed his car’s Autopilot technology for crashing into a barrier without warning. But a few days later, footage was released showing that the accident occurred when the driver failed to take control of the car as it entered a construction zone with unclear road markings, something Tesla’s current version of Autopilot is not capable of safely navigating.
Which got us thinking: what do stories like this mean for autonomous vehicles, and how can we ensure that drivers are better educated on the safe use of advanced driver assistance features like Autopilot?