A driver dies in an accident in his Tesla Model S while using the semi-autonomous Autopilot

A few months ago we asked a question which, unfortunately, has turned out to be a tragic prediction: what would happen if a Tesla Model S suffered a serious, or even fatal, accident while a customer was using the Autopilot function? Stephen Boulter, Project Manager of the Jaguar XF, stated in an interview with Mashable that this possibility was among his greatest fears: an accident which, by its impact, could deal a major blow to the progress required for the introduction of the autonomous car. Unfortunately, that time has come. Joshua Brown, the owner of a Tesla Model S, died when his car collided with the trailer of a truck on a Florida highway while driving semi-autonomously with the Autopilot function engaged. It does not appear that either Joshua or Tesla's Autopilot was at fault. But this accident raises many questions about the autonomous car and its development.

Note: the image illustrating this entry, which shows a crashed Tesla Model S, does not depict the accident discussed in this article, but another one suffered by a group of young people in Germany a few weeks ago.

Tesla Motors acknowledged the accident in an entry on its own blog, under the headline “A Tragic Loss”. It also confirmed that the NHTSA – the agency that oversees the safety of U.S. roads – has opened an investigation to analyze the performance of Autopilot and determine whether the system worked properly at the time of the accident. It was Tesla itself that reported the incident to the NHTSA, and the company is already providing all of the event data needed to establish what happened.

Recall that, thanks to its technology, the Tesla Model S logs large amounts of data, which will very probably allow investigators to reconstruct the scene of the accident.


How did the accident occur?

According to Tesla, the Model S was driving in Autopilot mode on a multi-lane divided highway, with the two directions of travel separated. That is essentially the environment for which Autopilot was created, and where it should offer the strongest guarantees. The big problem arose when a truck with a trailer crossed the roadway perpendicularly, so that neither the driver had time to react, take the controls, and try to avoid the accident, nor was Autopilot able to detect the obstacle and stop the car automatically by its own means.

Tesla acknowledged that this was one of the problems: a side-on collision against a surface lacking elements that reflect light, which its systems were not able to detect. Tesla maintains that if the impact had occurred against the front or the rear of the truck, the accident would have been avoided, or at least its consequences would have been mitigated.

Of course, the fault lay neither with Autopilot nor with the driver of the Tesla Model S. But you will agree with me that the consequences of this accident can do a lot of damage to the autonomous car. This accident has made clear one of the biggest problems facing the technology: even with autonomous cars capable of offering absolute reliability and safety, we have to assume that they will still be sharing the roads with human drivers, who are imperfect and prone to errors and imprecision.

For now, these remain open questions. What conclusions will the NHTSA draw from this accident? Could it have been avoided if the systems of the Tesla Model S had been prepared to detect the truck from the side as well? What will the consequences be for the progress of Tesla and Autopilot? How will it affect the development of the autonomous car?

Source: Tesla