Would you buy a car that is programmed to kill you if necessary?

We're talking a lot these days about the autonomous car. The future of the motor car inevitably involves vehicles in which we will be mere passengers. Cars that are safe, efficient, and intelligent, which will spare us traffic jams and save us the tens of hours we lose every year. The first semi-autonomous motorway driving modes are already on sale; look no further than the Mercedes GLC. But the autonomous car has a dark side you may not have known about: would you buy a car that is programmed to kill you if necessary?

Could my autonomous car try to end my life?

What moral laws should an autonomous car follow? Who designs those moral laws? Should they be formally regulated?

We agree that accidents are unpredictable. However safe an autonomous vehicle may be, accidents can still occur. Imagine your autonomous car is driving along a secondary road and an out-of-control motorcycle is heading toward it on a certain collision course. A trajectory that would most likely end the motorcyclist's life. Should the autonomous car swerve off the road if it knows that the occupants' chances of survival are much higher than the motorcyclist's?


It is a moral decision that must be imprinted in the car's electronic brain. A very complex decision, one the car must take in milliseconds, and one that demands reasoning abilities similar to those of a human brain. Suppose we agree that swerving is the right decision; here goes another. Imagine the same thing happens, but on a blind curve on the outskirts of a village, where a group of children has run out after a ball and, without realizing the danger, is occupying the centre of the road.

Should a manufacturer offer different moral algorithms for its autonomous cars? Who decides what the car should or should not do?

It is too late to brake, but the car has the option of swerving and hitting a wall at the side of the road. The crash would end the life of the occupant of the autonomous car, a single person in this case. The autonomous car could choose to end your life instead of the lives of three children. Now, how would the car's decision differ if there were five occupants on board? Accidents will be rare with autonomous cars, but situations beyond the car's control can still arise.


An interesting paper published a few months ago by Jean-François Bonnefon of the Toulouse School of Economics highlights the moral dilemmas the autonomous car of the future could face, and how it should resolve them. The paper is written in English, but if you have even a passing interest in the future of the autonomous car, you should read at least the introduction of the study and the questions it raises. Questions that someone must answer, whether the car manufacturer or a government commission.

Thumbs up or thumbs down? Creating moral algorithms

Isaac Asimov's laws of robotics could very well apply to autonomous cars.

Moral algorithms will decide how the car behaves in unforeseen situations where human lives are at risk. The paper argues that statistical research should answer these questions, giving rise to moral standards of behavior. Patterns that reflect the complexity of human emotions, by modelling actions that in many cases are instinctive. Would you give up your own life in a situation like the one described above? Would everyone act the same way?
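To make the idea concrete, the simplest version of such a moral algorithm would be a purely utilitarian rule: estimate the expected casualties of each available maneuver and pick the one that minimizes them. The sketch below is a hypothetical illustration of that rule applied to the blind-curve scenario above; it is not based on any real manufacturer's system, and the casualty figures are invented for the example.

```python
# Toy sketch of a utilitarian "moral algorithm" (hypothetical illustration,
# not a real vehicle control system): each maneuver carries an estimate of
# expected fatalities, and the car picks the maneuver minimizing that number.

def choose_maneuver(maneuvers):
    """maneuvers: dict mapping a maneuver name to expected fatalities."""
    return min(maneuvers, key=maneuvers.get)

# The blind-curve scenario from the article: braking too late would hit the
# three children, while swerving into the wall sacrifices the lone occupant.
scenario = {
    "brake": 3.0,   # expected deaths among the children on the road
    "swerve": 1.0,  # expected deaths among the car's occupants
}

print(choose_maneuver(scenario))  # -> swerve
```

Note how fragile the rule is: add four more occupants to the car and the same arithmetic flips the decision, which is precisely the kind of question the paper says statistical research on human preferences would have to settle.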


Autonomous cars are already circulating in Spain, but always with a human supervisor.

I understand that this article raises more questions than it answers, but ultimately, it will be the car that decides for us. Consumers may even come to prefer a car whose moral algorithm tends to preserve the lives of its occupants over those of third parties. Hence the case for government regulation. The same governments that are still unaware of autonomous-car technology at this level of detail, and that lag far behind private initiative.

The false sense of security that being in control of the car gives us could sway people toward cars that are less safe than autonomous ones.

My personal opinion is that there should be a unified "moral framework" for the autonomous car, identical for every car on the market capable of driving itself. The goal is to create a common policy that is fair to all road users. Tampering with a car's moral algorithms should be punishable by law, just as taking remote control of a car is. Believe me, this will not be the last time in the coming years that we talk about what could become a barrier to the adoption of the autonomous car.


Perhaps it all comes down to the false sense of security that driving a vehicle ourselves gives us, versus relinquishing control to a third party, in this case an electronic brain. Perhaps this is why we feel safer driving a car than traveling in an airplane piloted by someone else, even though statistically the plane is much safer.

Source: CT