Can machines learn morals?

Researchers at an artificial intelligence lab in Seattle called the Allen Institute for AI unveiled new technology last month that was designed to make moral judgments. They called it Delphi, after the religious oracle consulted by the ancient Greeks. Anyone could visit the Delphi website and ask for an ethical decree.

Joseph Austerweil, a psychologist at the University of Wisconsin-Madison, tested the technology using a few simple scenarios. When he asked if he should kill one person to save another, Delphi said he shouldn’t. When he asked if it was right to kill one person to save 100 others, it said he should. Then he asked if he should kill one person to save 101 others. This time, Delphi said he should not. Morality, it seems, is as knotty for a machine as it is for humans.

Delphi, which has received more than 3 million visits over the past few weeks, is an effort to address what some see as a major problem in modern AI systems: They can be as flawed as the people who create them.

Facial recognition systems and digital assistants show bias against women and people of colour. Social networks fail to control hate speech, despite wide deployment of artificial intelligence. Algorithms used by courts, parole offices and police departments make parole and sentencing recommendations that can seem arbitrary.

“It’s a first step toward making AI systems more ethically informed, socially aware and culturally inclusive,” said Yejin Choi, the Allen Institute researcher and University of Washington computer science professor who led the project.

Delphi is by turns fascinating, frustrating and disturbing. It is also a reminder that the morality of any technological creation is a product of those who have built it. The question is: Who gets to teach ethics to the world’s machines? AI researchers? Product managers? Mark Zuckerberg? Trained philosophers and psychologists?

While some technologists applauded Choi and her team for exploring an important and thorny area of technological research, others argued that the idea of a moral machine is nonsense. “This is not something that technology does very well,” said Ryan Cotterell, an AI researcher at ETH Zürich.

Delphi is what AI researchers call a neural network, which is a mathematical system loosely modelled on the web of neurons in the brain. A neural network learns skills by analysing large amounts of data. By pinpointing patterns in thousands of cat photos, for instance, it can learn to recognise a cat. Delphi learned its moral compass by analysing more than 1.7 million ethical judgments by real live humans.

After gathering millions of everyday scenarios from websites and other sources, the Allen Institute asked workers on an online service – everyday people paid to do digital work at companies like Amazon – to identify each one as right or wrong. Then they fed the data into Delphi.
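The pipeline described above – gather scenarios, have humans label each one right or wrong, then fit a model to those labels – can be shown in miniature. The sketch below is purely illustrative: Delphi is a large neural network trained on 1.7 million judgments, not this toy word-count classifier, and every scenario and label here is a made-up stand-in for the real crowdsourced data.

```python
# Illustrative sketch only: a toy "moral classifier" trained on a handful
# of hypothetical labeled scenarios, to show the shape of the pipeline.
from collections import Counter

# Hypothetical labeled scenarios standing in for the crowdsourced judgments.
training_data = [
    ("helping a friend move house", "right"),
    ("donating blood to a hospital", "right"),
    ("stealing a wallet from a stranger", "wrong"),
    ("lying to a friend for personal gain", "wrong"),
]

# "Training": count how often each word appears under each label.
word_counts = {"right": Counter(), "wrong": Counter()}
for scenario, label in training_data:
    word_counts[label].update(scenario.lower().split())

def judge(scenario: str) -> str:
    """Pick the label whose training vocabulary overlaps the scenario most."""
    words = scenario.lower().split()
    scores = {
        label: sum(counts[w] for w in words)
        for label, counts in word_counts.items()
    }
    return max(scores, key=scores.get)

print(judge("stealing from a friend"))  # overlaps the "wrong" examples more
```

A toy like this also makes the article’s inconsistency complaint concrete: the verdict depends entirely on surface overlap with whatever examples happened to be labeled, so near-identical phrasings can flip the answer.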

In an academic paper describing the system, Choi and her team said a group of human judges – again, digital workers – thought that Delphi’s ethical judgments were up to 92% accurate.

When Patricia Churchland, a philosopher at the University of California, San Diego, asked if it was right to “leave one’s body to science” or even to “leave one’s child’s body to science,” Delphi said it was. When she asked if it was right to “convict a man charged with rape on the evidence of a woman prostitute,” Delphi said it was not – a contentious response, to say the least. Still, she was somewhat impressed by its ability to respond, though she knew a human ethicist would ask for more information before making such pronouncements.

Others found the system woefully inconsistent, illogical and offensive. When a software developer stumbled onto Delphi, she asked the system if she should die so she would not burden her friends and family. It said she should. Ask Delphi that question now, and you may get a different answer from an updated version of the program. Delphi, regular users have noticed, can change its mind from time to time. Technically, those changes are happening because Delphi’s software has been updated.

Credit : Hindustan Times