
Moral machine for self-driving cars


Who should die: the pedestrian or the driver, children or the elderly? The moral dilemma machine for ‘training’ self-driving cars

Autonomous cars, i.e. cars that drive themselves, are already a reality, although they are still in the testing phase. Before they can be put on the road, they have to be trained, and for this purpose software such as ‘Moral Machine’ is used, a platform that compiles human perspectives on the ethical decisions these intelligent machines will have to make.

These vehicles take on the tasks normally performed by a driver thanks to different technologies, such as sensors, cameras, radars and artificial intelligence (AI). This also means that they must decide for themselves what to do in dangerous situations, for example in an unavoidable accident where they must choose between saving the passengers or the pedestrians. ‘Moral Machine’ is a platform that can be used to prepare them for these complicated scenarios, and it does so with the help of its users, who must decide what they would do if they were behind the wheel.

This website is not a new tool: it emerged as an experiment by MIT (Massachusetts Institute of Technology) in 2018 with the aim of analyzing and establishing ethical principles for intelligent machines. Now, with the rise of autonomous cars, the platform has gone viral again, as it offers a way of teaching these vehicles to make the best decision when faced with accidents that cannot be avoided.

There is no doubt that, as soon as autonomous cars circulate normally on the roads, they will have to make difficult decisions, in some cases even life-or-death ones. ‘Moral Machine’ anticipates these scenarios with a platform in which the user must judge the most acceptable outcome in up to 13 different situations; you can also create your own levels and even explore scenarios generated by other users. It poses cases such as choosing between killing a pregnant woman who is crossing against a red light and a girl who is crossing legally to the other sidewalk.

In short, ‘Moral Machine’ is a tool that uses anonymously collected data for research purposes and can be of great use in training the AI of autonomous cars. Although these cars have been in testing for years, they still have many problems to solve before they can be considered safe. For example, even models as advanced as Tesla’s have shown various faults in the US, such as the autopilot switching itself off two seconds before an accident.

And who would you kill? Take the test here.
