Last week saw the first accident involving an autonomous car in which there was a fatality. Elaine Herzberg, age 49, was hit by a Volvo XC90 performing autonomous driving tests for Uber. Behind the wheel of the XC90, as always, was a driver supervising the process; in this case, Rafaela Vasquez.
While the investigation is still ongoing, the indications are that the accident could not have been avoided even if the same situation had occurred with a driver fully concentrated on the task of driving, although its consequences would likely have been somewhat less severe.
Let’s take a look.
1) The car was travelling within the speed limit
The XC90 was travelling along a four-lane road at 40 mph (65 km/h), below the signposted limit of 45 mph (73 km/h) on that stretch. The autonomous mode always tries to respect speed limits as one of its operating rules.
However, 65 km/h is a very high speed at which to hit a pedestrian, since the chances of survival are practically nil. That is why urban areas typically have limits of 50 km/h, or 35 mph (56 km/h).
2) The victim crossed where crossing is not permitted
Analyzing the video recorded during the tests, it is evident that the victim crossed at a spot that is not a designated crossing and is poorly lit. According to the Tempe (Arizona) chief of police, Ms. Herzberg appeared "out of nowhere", so there was barely time to react, not even for a machine. Apparently the signage at that intersection is confusing, according to one Twitter user.
Crossing where it is not permitted is already dangerous; doing so without at least checking whether it is safe to cross is even more so. However, that circumstance is not enough to exempt Uber from responsibility, according to Janette Sadik-Khan, former commissioner of the New York City Department of Transportation and a consultant on mobility.
3) The safety system was switched off
As standard, the Volvo XC90 is equipped with a safety system called City Safety. According to the manufacturer, if it detects a danger it first warns the driver and then applies the brakes, and it can avoid the collision entirely at speeds of up to 45 km/h. At higher speeds it cannot prevent the pedestrian from being struck, but it reduces the severity of the impact.
Zach Peterson, a spokesman for Aptiv, the Delphi spin-off that manufactures the safety system, has clarified that the system was switched off at the time of the accident. Uber is developing its own automatic braking technology, so keeping the factory system active could have caused conflicts. In any case, it was Uber's system that failed, not the factory one.
4) The test driver was distracted, but…
Rafaela Vasquez was behind the wheel at the time of the crash, not driving but monitoring. Initially it was reported that the driver was a man, Rafael. The media have looked into her past and found that she served time for attempted armed robbery, but her record since has been completely clean and that circumstance is irrelevant here.
In the video recorded by the camera pointing at the driver, you can see that Rafaela did not have her eyes on the road at that moment, which can be explained by the monotony of the job and the apparent lack of danger on that stretch.
In any case, even had she been completely attentive, with so little margin for maneuver (the victim only appears about a second before impact) it is impossible to react. The standard reaction time is 0.75 seconds, and a car does not stop instantly.
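To put some rough numbers on that (assuming a dry-asphalt deceleration of about 7 m/s², a typical figure rather than one reported in the investigation): at 65 km/h the car covers roughly 18 metres every second. During the 0.75 s of reaction time alone it travels some 13–14 m, and braking from 18 m/s at 7 m/s² adds about 23 m more, for a total stopping distance on the order of 35–40 m. A pedestrian who becomes visible only a second before impact is roughly 18 m away, so not even an attentive human driver could have stopped in time; at best, braking would have reduced the impact speed.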
5) For now, this cannot be called an unsafe technology
The much-publicized accident two years ago involving a Tesla Model S with Autopilot engaged, which caused the death of its driver, Joshua Brown, turned out to be down to simple carelessness on his part. On the one hand, Tesla's Autopilot is a semi-autonomous system that requires eyes on the road; on the other, it was not designed for roads with at-grade intersections. The victim was reportedly watching a movie.
In March 2017 there was a minor accident involving an Uber Volvo XC90. The investigation showed that the fault lay not with the autonomous system but with another driver. There was also another minor accident in February, but fault could not be established for either party.
Therefore, the data are clear: in relation to the distance traveled, and excluding the mistakes of others, this is a safe technology, however immature it may be according to certain pundits and other "experts". Human drivers do no better.