In the wake of the deadly accident on May 7 in Florida (United States) involving a Tesla Model S with Autopilot engaged, authorities are looking at Tesla with suspicion. Another example comes from Germany, where transport authorities are evaluating the effectiveness of the Californian manufacturer's driving assistant.
According to Der Spiegel, an internal report by the German Ministry of Transport describes Autopilot as "a considerable risk to traffic", because the system's effectiveness is not exactly 100%. To be precise, one would have to say that, at present, no system is completely autonomous.
Tesla's Autopilot is a semi-autonomous driving system of the "hands-off" type: the driver may let go of the steering wheel temporarily, but must remain attentive to traffic at all times. As an assistance system, responsibility for the driving always rests with the human.
The driver has to put their hands on the steering wheel every so often, or the system switches off. With versions prior to 8.0, the wheel could be released for several kilometres in a row
For example, the report finds that if the driving assistant is unable to resolve a particular situation, the owner receives no information about it. That said, human drivers can take control at any time. Furthermore, it should…
Another point the report criticizes is the limited perception of the sensors, and the effectiveness of the automatic braking in certain circumstances. Tesla has never claimed otherwise: every system has its limitations, and Tesla's is no exception in that respect.
Tesla's Autopilot is a system designed to operate under very specific traffic conditions: divided roads with one-way traffic and several lanes, no cross junctions, proper signage, and so on. In other words, it is suitable for expressways, although owners use it… whenever they want. Therein lies the problem.
The sensors on the Tesla Model S and Model X also have a limited range, as in any other equivalent system.
The German authorities are still evaluating the Autopilot system, which already has approval at the European level. The American company defends itself by arguing that Autopilot has become much safer and makes it harder for the driver to wash their hands of the driving.
It is a mistake to treat Autopilot as a fully autonomous driving system. The same can be said of the autopilot in aviation: its functionality is limited, so is its perception of reality, and trained, qualified pilots are still needed to fly the aircraft.
Problems arise when the two get confused. If drivers stop paying attention to the road, believing they have an autonomous car, that is when they put themselves at risk. Conversely, if the driver does not take their eyes off the road, Autopilot adds no uncertainty to the driving; on the contrary, it is one more defensive barrier.
Autopilot activated. The dashboard shows, in simplified form, what the car perceives in its surroundings
Anyone who says, therefore, that the Autopilot system is a danger to road safety does not know what they are talking about. What is a danger to road safety is people who delegate driving to a machine that is not prepared to handle every scenario of real-world driving, and then play cards, watch a movie, or fall asleep at the wheel.
YouTube is full of examples of irresponsible people who have tested Autopilot far beyond the purposes for which it was designed. The victim of the Florida accident was not paying attention to the road, so he did not see the truck his car crashed into, which the sensors had not detected either.
On the contrary, Tesla's Autopilot is one of the most advanced active safety systems on the market, but it has to be used correctly. Tesla has never said its system can dispense with the driver altogether; this technology remains experimental and has been tested daily for years. For now, the final decision belongs to the human driver.