5 unanswered questions about the autonomous car following the fatal crash of a Tesla Model S on Autopilot

Last week we learned of a piece of tragic news that has generated no little controversy in recent days. A driver died in Florida in an accident in a Tesla Model S while driving in Autopilot mode. Remember that Autopilot is a semi-autonomous driving mode, in which Tesla's technology takes over the controls to drive the vehicle under certain conditions: maintaining speed, steering, and even performing the occasional overtaking manoeuvre. Although everything points to the blame for the loss falling on the driver of the other vehicle, a human, this accident has done nothing but increase drivers' unease at seeing machines relegate the driver to the mere role of passenger, and the concern about autonomous cars reaching our roads.

1. How will fatal accidents affect the autonomous car?

This is probably the million-dollar question. A few months ago we learned, in an interview with Mashable, that Stephen Boulter, Project Manager of the Jaguar XF, had a great fear. And his fear was none other than that a premature arrival of the autonomous car, without the relevant testing, would result in some major accident which, even through no fault of the technology, would bring the advance of the autonomous car to a halt.

Many of us fear that accidents like this one will increase the distrust the autonomous car generates. One need only look at the comments provoked by the Tesla crash, or the conflicting opinions we see under any article on the autonomous car. On this very topic, my colleague Sergio published an extremely interesting piece on why the autonomous car, or even flying, inspires such overwhelming fear. Why does a means of transport as safe as the plane – and the figures back it up – generate so many fears? Why are we so afraid of self-driving cars conquering the road, even knowing that they could put an end to the majority of traffic accidents, which, by the way, are caused by human carelessness or irresponsibility?

The automotive industry has to accept that every accident involving an autonomous car, whether or not the car's own technology is at fault, will further increase customers' distrust of this technology.


2. What figures, and what accident rate, are low enough to justify the autonomous car?

Tesla provided a very illustrative figure. This is the first fatal accident on Autopilot in more than 210 million kilometres travelled using this semi-autonomous driving mode since its launch. On the roads of the United States there is, on average, one traffic fatality every 151 million kilometres, a figure that drops to 96 million kilometres worldwide. Tesla is trying to remind us that Autopilot has, in some way, shown itself to be safer than human drivers. Although, in my view, such a comparison would require a more representative sample of kilometres travelled in autonomous mode.
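To make Tesla's comparison concrete, the figures above can be turned into fatality rates per distance travelled. This is only a back-of-the-envelope sketch using the numbers quoted in this article, not an official methodology:

```python
# Rough comparison of fatality rates, using the figures quoted above:
# one fatal accident per N kilometres travelled.
KM_PER_FATALITY = {
    "Tesla Autopilot (so far)": 210e6,  # >210 million km, 1 fatal accident
    "US average": 151e6,                # 1 fatality per 151 million km
    "Worldwide average": 96e6,          # 1 fatality per 96 million km
}

for label, km in KM_PER_FATALITY.items():
    rate = 1e9 / km  # fatalities per billion km travelled
    print(f"{label}: {rate:.2f} fatalities per billion km")
```

Note that the Autopilot rate rests on a single event, so its statistical uncertainty is enormous; that is precisely why a larger sample of autonomous kilometres is needed before drawing conclusions.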

If we bear in mind that the majority of traffic accidents are due to a driver's carelessness or a momentary slip, it is safe to assume that these figures will continue to favour the autonomous vehicle, and that its rate of accidents, and of fatal accidents, will be lower.

What's more, some already predict that the popularization of the autonomous car could lead to the reverse situation, in which we can no longer afford to let drivers do the driving themselves, given the potential risk of an accident should they get distracted or make a mistake.


3. To what extent will semi-autonomous driving modes be supervised by the driver?

Although Tesla has sold Autopilot as an automatic pilot with autonomous-car capabilities, the brand insists that the driver must always be aware that this is a technology in development, in beta phase, and that the driver must not only keep their attention on the road but also keep their hands on the wheel at all times, ready to respond to any eventuality. This double discourse is very problematic, especially if we take into account that the implications of an accident in a car in semi-autonomous mode, and its legal consequences, are no small problem.

My surprise was great when I read that the driver of the truck involved in the accident, who was apparently at fault, claimed that the "driver-passenger" of the Tesla Model S was watching a Harry Potter movie (Jalopnik).

The severity of a fatal accident leads us to conclude that autonomous-car technology can only be present on the street if manufacturers guarantee the total reliability of their systems, and if it is not left to insurers to answer for the "acts" of this technology.


4. How will autonomous cars and human drivers coexist?

This is probably one of the most critical points, and the big problem for the autonomous car: accepting that there will have to be coexistence between human drivers and autonomous cars. This will greatly complicate the development of the autonomous car, since human drivers are often not predictable. And that is probably the most interesting conclusion Google has drawn from analysing the accidents of its autonomous cars during tests in the United States.

In the vast majority of the accidents suffered by Google's autonomous cars, other vehicles driven by humans were involved, and, by the way, the fault lay with those humans. In recent weeks we have seen how everyday driving tasks, such as honking the horn to avoid a collision, pose many challenges for the developers of autonomous cars.

Although it is not normal for a truck to cut perpendicularly across a highway, as in the accident suffered by the Tesla Model S, the autonomous car must be prepared to face any eventuality a driver may generate, however remote and incredible it may seem.


5. How reliable are the sensors and image-recognition systems?

The accident suffered by a Tesla Model S these days, and other incidents we have seen on YouTube since Autopilot was introduced, lead us to question the accuracy and effectiveness of sensors and image-recognition systems. According to Tesla, Autopilot was not able to detect the truck that crossed into the road because its white side blended with the brightness of the sky on a sunny day. Even if that was not the cause of the accident, and the system might not have been able to prevent it, to what extent can the technology fitted to the Tesla Model S, or to the first autonomous prototypes, guarantee maximum accuracy and effectiveness in image recognition?

Bear in mind that the technology has to deal with situations that are often unpredictable and changing. And that on the road we may encounter not only another car, but also domestic or wild animals straying onto the roadway, a fallen tree, a rockslide…