Sep 12, 2017, 4:49 PM ET

Tesla's semiautonomous system contributed to fatal crash: Feds


Federal investigators announced Tuesday that the design of Tesla's semiautonomous driving system allowed the driver of a Tesla Model S in a fatal 2016 crash with a semi-truck to rely too heavily on the car's automation.


"Tesla allowed the driver to use the system outside of the environment for which it was designed," said National Transportation Safety Board Chairman Robert Sumwalt. "The system gave far too much leeway to the driver to divert his attention."

The board's report declares the primary probable cause of the collision as the truck driver's failure to yield, as well as the Tesla driver's overreliance on his car's automation — or Autopilot, as Tesla calls the system. Tesla's system design was declared a contributing factor.

In May 2016, Joshua Brown was driving his Tesla on a Florida highway when the vehicle collided with the side of a truck making a left turn from an oncoming lane. Investigators said they do not know if the truck driver saw the approaching car, because the driver refused requests to be interviewed. Brown was killed in the crash.

An NTSB analysis of a toxicology test found the truck driver had used marijuana before the crash, but NTSB investigators could not determine his level of impairment, if any.

The NTSB said Brown's vehicle performed as designed but could be improved to deter drivers from diverting their attention from the road.

"While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles," Sumwalt said.

Tesla made updates to its Autopilot design after the crash, warning drivers earlier after they remove their hands from the steering wheel.

A Tesla spokesperson provided a statement to ABC News that read, "We appreciate the NTSB's analysis of last year's tragic accident, and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times."

According to The Associated Press, members of Brown's family said on Monday that they do not blame the car or the Autopilot system for his death.

A National Highway Traffic Safety Administration report on the crash can be found here. The NTSB has not yet published its full report; a synopsis of it can be found here.


Related Posts


  • JuPMod

    Gee, Brown and others are stupid to think these auto systems will prevent crashes, etc. That's not always the case. Taking your eyes off the road for just a few seconds can be fatal. Given that he took his hands off the wheel and likely was not paying attention, Brown is at fault for his own death here. Even if my car had such an 'autopilot', I would never take my hands off the wheel or my eyes off the road. You never know what may happen.


  • Randy Rjjd

    Those who wish to have self-driving cars should take public transportation and leave driving to the skilled. The roadways would be safer and less congested.

    The truth is too many people view driving as a right, not a privilege. In my opinion, everyone should have to take a skills test every 4 years to prove they should be allowed to drive.

  • j penske

    I will never have a self-driving car. I know there are a lot of incompetent idiots out there, especially in the town where I live, but I know my own skills and trust them far more than the decisions a self-driving car would make.

  • dharper08

    This technology isn't perfect and faces a long time of improvement. In the meantime many people will die as the technology is continuously improved (I may even be one of them as I have a Tesla Model 3 on order and intend to get the autopilot option). However, this technology has the potential to ultimately save a lot of human lives. Those developing and deploying this technology are at risk given our litigious society but without their efforts this potential life-saving capability may never exist. A hundred years ago flight was a new technology and a lot of people died during its development and improvement. Given that it was allowed to develop, today we enjoy one of the safest and fastest forms of travel as a result. It is important for those using this new technology to recognize the fact that, regardless of the ultimate promise of autonomous vehicles, for the foreseeable future the human driver will be completely responsible for the operation of the vehicle, just as is the captain of any seagoing vessel.

  • snake

    Gotta go with the Tesla autonomous system over the stoned truck driver every time.

  • Sheila Moore

    So people with regular cars take their hands off the wheel and check their cell phones, and it's their fault. Give them self-driving technology, and they take their hands off the wheel and check their cell phones, and it's the car's fault. I have an idea: when a person is warned about their behavior, the radio should play at full blast even if they are on a call. That would do it. It would work for people leaving babies in their cars as well.

  • Cherious

    Autopilot is already much safer on the road than all those maniacs you see zooming by every day. Unfortunately, it's those aggressive drivers who will be the last to surrender their "hands-on," adrenaline-rush-inducing experience to the technology.