Fears of artificial intelligence (AI) gone wrong prompted more than a thousand scholars and public figures - including theoretical physicist Stephen Hawking, SpaceX founder Elon Musk and Apple co-founder Steve Wozniak - to sign an open letter, cautioning that the autonomous weapons race is “a bad idea” and presents a major threat to humanity.
The letter, presented Monday at the International Joint Conference on AI in Buenos Aires by the Future of Life Institute, warns that modern robotics and AI have reached a point at which autonomous weapons will be feasible within years, and that "a global arms race is virtually inevitable."
"This technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow," the letter states.

Opening Pandora's box, or letting the genie out of the bottle?
"Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc."
While AI may prove truly beneficial to humanity in many ways, it has to be kept under strict control, and perhaps even banned, the letter suggests, warning that lethal autonomous weapons systems — or, more simply, killer robots — which engage targets without human intervention, are on par with other weapons of mass destruction…
Kalashnikovs of Tomorrow: Musk, Hawking, Wozniak Fear Killer Robot Armies