Tesla Sued Over Claims That Outdated Autopilot Software Caused 2018 Fatal Crash


There are reasons so many people still don't trust self-driving cars. The biggest, of course, is that it simply doesn't seem safe to have cars on the road driven by computers instead of people.

Those fears were realized last year in a high-speed fatal crash. A man died while Autopilot was engaged in his Tesla. His family has now filed a lawsuit against the company, claiming the autonomous feature was defective because it was running older software.

Tesla Self-Driving Car Crash

This lawsuit answers some questions about self-driving cars while raising others. One thing is certain: there are reasons to question the safety of autonomous cars.

Wei Lun “Walter” Huang, an Apple engineer, was driving his 2017 Tesla Model X P100D on a California highway on March 23, 2018. He engaged the car's Autopilot feature a total of four times during the 32-minute drive, including one continuous 19-minute stretch immediately before the crash.

He was in the left lane when the lanes split. Once the car no longer detected a vehicle ahead of it, it veered to the left and accelerated to return to the 75mph speed Huang had set. It drove straight into a highway barrier that had been damaged eleven days earlier and left unrepaired. The front end of his Tesla was sheared off, and the vehicle caught fire.

The car's sensors show that he was not holding the steering wheel for the six seconds before impact. The lawsuit charges that Tesla's Autopilot system should have kept the car in its lane, alerted him to an imminent collision, and engaged emergency braking, but none of that appears to have occurred.


The lawsuit claims the 2017 Tesla Model X P100D was operating on old software. Later models introduced improved Autopilot software capable of recognizing highway interchanges and automatically taking exits while traveling on a set navigation route.

But Tesla has maintained since Autopilot's 2015 debut that the driver must be able to take full control of the vehicle at all times.

One year later, there was a crash that Tesla CEO Elon Musk said would have been prevented by later improvements to the system. He indicated customers would be receiving a software update that would prevent crashes into objects.

It is not known whether Huang's car was running the software version released a year after Musk's announcement, and the National Transportation Safety Board is still investigating.

Still Too Many Questions

For many people, these problems are why they wouldn't consider a driverless car, or even a driverless feature on an existing car. It's just too frightening not to be in control of a vehicle speeding down a highway at 75mph.

And at this time, there's no data for comparison. Human-driven cars get into accidents all the time. Do autonomous cars crash more or less often? We just don't know.

Do you think autonomous cars and features will eventually reach a point where they can be trusted, or do you think this lawsuit over Tesla's Autopilot feature is symptomatic of where we're headed? Add your thoughts to the comments below.

One comment

  1. “Tesla has always maintained since the first Autopilot in 2015 that the driver must be able to take full control of the vehicle at all times.”
    If the driver must be able to take full control at all times, then there is no purpose or need for Autopilot other than to make more money for Tesla.
