Scientists: “AI Santa” could destroy the world

The emergence of a “real” artificial intelligence capable of thinking like a human being could have devastating consequences for the entire world. Even a highly advanced AI programmed for good deeds could pose a threat, researchers at Australia’s Sunshine Coast University have warned.

The scale of the danger was shown using the example of the hypothetical algorithm SantaNet. He was put in the place of Santa Claus, who needs to deliver all the New Year’s gifts in one night.

As the thought experiment showed, the threat arises already at the stage of compiling a list of good and naughty children. Taking the task literally and without malicious intent, the neural network would have to create a large-scale system for covert surveillance around the world. AI Santa will be guided by its own moral principles in determining what behavior is “good” all year round, which can lead to discrimination, widespread inequality and human rights abuses.

There are about two billion children under the age of 14 living in the world, so the computer mind will need to create an army of similar AI workers. In turn, this will lead to an increase in unemployment among people.

What’s more, to provide everyone with a gift, SantaNet can turn the Earth into a giant toy factory. As a result, the planet’s resources will be depleted in a very short time, scientists warn.

The first such idea – the “paper clip problem” – was presented by the Swedish philosopher Nick Bostrom in 2003. He suggested that AI, with the goal of maximizing paperclip production, would take its task literally and resist any attempts to stop it.

“SantaNet may sound far-fetched, but this idea helps to highlight the risks of more realistic systems of strong artificial intelligence,” the researchers conclude. “Well-intentioned, such systems can pose huge challenges simply by optimizing ways to achieve their narrow goals and gathering resources.”

Notify of
Inline Feedbacks
View all comments
Would love your thoughts, please comment.x