AI Faces Its Oppenheimer Moment Amid Killer Robot Arms Race
Regulators who want to control an emerging generation of artificially intelligent killing machines may not have much time left to do so, governments warned Monday.
As autonomous weapons systems rapidly proliferate, including on the battlefields of Ukraine and Gaza, algorithms and unmanned aerial vehicles are already helping military planners decide whether to hit targets. Soon, that decision could be left entirely to machines.
“This is the Oppenheimer moment of our generation,” said Austrian Foreign Minister Alexander Schallenberg, referring to J. Robert Oppenheimer, who helped invent the atomic bomb in 1945 before advocating for the control of the proliferation of nuclear weapons.
Civilian, military and technology leaders from more than 100 countries gathered in Vienna on Monday to discuss how their governments can control the merger of AI with military technologies – two sectors that have recently energized investors, helping push stock market valuations to historic highs.
The spread of global conflicts, combined with financial incentives for companies to promote AI, adds to the challenge of controlling killer robots, according to Jaan Tallinn, an early investor in the machine-learning company DeepMind.
“Silicon Valley’s incentives may not be aligned with those of the rest of humanity,” Tallinn said.
Governments around the world have taken steps to collaborate with companies integrating AI tools into defense, and the Pentagon is investing millions of dollars in AI startups.
Tel Aviv-based +972 Magazine reported this month that Israel was using an artificial intelligence program called “Lavender” to suggest assassination targets. After the story – which Israel disputed – United Nations Secretary-General Antonio Guterres said he was “deeply troubled” by reports of AI being used in the military campaign in Gaza, and that no part of life-and-death decisions should be delegated to the cold calculation of algorithms.
“The future of slaughterbots is here,” said Anthony Aguirre, a physicist who predicted the technology’s trajectory in a 2017 short film seen by more than 1.6 million viewers. “We need an arms-control treaty negotiated at the United Nations General Assembly.”
But advocates of diplomatic solutions are likely to be frustrated, at least in the short term, according to Alexander Kmentt, Austria’s top disarmament official and the architect of this week’s conference.
“A classical approach to arms control does not work because we are not talking about a single weapons system but a combination of dual-use technologies,” Kmentt said in an interview.
Rather than concluding a new “magnum opus” treaty, Kmentt suggested that countries might be forced to make do with the legal tools already at their disposal. Enforcing export controls and humanitarian laws could help contain the spread of AI weapons systems, he said.
Longer term, once technology becomes accessible to non-state actors and potentially terrorists, countries will be forced to write new rules, predicted Arnoldo André Tinoco, Costa Rica’s foreign minister.
“Ease of access to autonomous weapons removes the limitations that ensured only a few could participate in the arms race,” he said. “Now students with a 3D printer and basic programming skills can build drones capable of causing mass casualties. Autonomous weapon systems have forever changed the concept of international stability.”
To contact the reporter on this story:
Jonathan Tirone-bsp-person> in Vienna at jtirone@bloomberg.net
To contact the editors responsible for this story:
Marton Eder-bsp-person> at meder4@bloomberg.net
Jessica Loudis
© 2024 Bloomberg LP All rights reserved. Used with permission.