U.N. Assembly Calls for Guidelines on AI-Driven Weaponry; Divergent Votes from Global Powers
In a landmark move, the United Nations General Assembly on Friday adopted its first-ever resolution on lethal autonomous weapons systems, or LAWS, aimed at establishing international guidelines for the use of AI in weapons capable of autonomously identifying and eliminating human targets.
The resolution, initiated by Austria, received overwhelming support from 152 nations, including major G7 countries like Japan and the United States. However, it faced opposition from four countries, notably Russia and India, both of which are advancing their military AI capabilities. Meanwhile, 11 countries, including China, Israel, and Iran, chose to abstain from the vote.
The urgency of this resolution stems from the rapid advancement of AI in warfare, raising concerns about the potential misuse of AI in conflict zones. These autonomous weapons, capable of processing vast amounts of data and making independent attack decisions, could inadvertently target civilians or spiral out of control.
Recent conflicts, such as in Ukraine, have highlighted the rapid progression of AI weaponry, bringing the issue to the forefront of international security discussions. The resolution emphasizes the applicability of the U.N. Charter and international humanitarian law to LAWS. It also expresses concern over the potential for an arms race, increased conflict, and the risk of such technology falling into the hands of terrorists.
U.N. Secretary-General António Guterres has been tasked with compiling member states' perspectives on lethal autonomous weapons systems, with a report expected at the next U.N. session in September 2024.
While the United States and China, both leaders in AI technology, acknowledge the need for regulation, Russia remains opposed to such measures.