Standards, Frameworks, Guidelines

The US DoD Updates Directive 3000.09 “Autonomy in Weapon Systems”

The United States Department of Defense (DoD) has recently revised a key directive, DoD Directive 3000.09. The directive, "Autonomy in Weapon Systems," governs the development, testing, and deployment of autonomous and semi-autonomous weapon systems.

The central tenets of the directive remain unchanged. The update was prompted by technological advances, changes in the Department's organizational structure, and shifts in the broader security environment.

The original DoD Directive 3000.09 took effect on November 21, 2012. Before approving any weapon system, the DoD requires extensive testing, rigorous reviews, and thorough management oversight. Directive 3000.09 adds an extra layer of scrutiny: a review by senior officials is required before the development, and again before the fielding, of any autonomous weapon system that does not fall within specific exemptions.

DoD underscores that the update serves primarily as a clarification and should not be viewed as a significant change. The directive continues to focus on ensuring that military commanders and operators can exercise appropriate levels of human judgment over the use of force. It also continues to stress that commanders and operators who authorize the use of, direct the use of, or operate autonomous and semi-autonomous weapon systems must do so with appropriate care and in accordance with the law of war, applicable treaties, weapon system safety rules, and applicable rules of engagement.

More about Directive 3000.09 on Autonomy in Weapon Systems

The US Department of Defense (DoD) Directive 3000.09 on Autonomy in Weapon Systems [PDF] establishes policies and guidelines for the development and use of autonomous and semi-autonomous functions in weapon systems. The directive was issued by the Office of the Under Secretary of Defense for Policy and became effective on January 25, 2023. It replaces the original version of the directive issued on November 21, 2012.

The purpose of this directive is to minimize the probability and consequences of unintended engagements by establishing policies and assigning responsibilities for developing and using autonomous and semi-autonomous functions in weapon systems, including armed platforms that are remotely operated or operated by onboard personnel.

The directive applies to all components of the Department of Defense, including military departments, defense agencies, combatant commands, and other organizations within the Department. It also applies to contractors who develop or provide autonomous or semi-autonomous weapon systems to the Department.

The directive assigns responsibilities to various offices within the Department of Defense. The Under Secretary of Defense for Policy is responsible for overall policy guidance on autonomy in weapon systems. The Under Secretary of Defense for Acquisition and Sustainment is responsible for ensuring that legal reviews are conducted in accordance with applicable law when acquiring or modifying autonomous or semi-autonomous weapon systems.

The Autonomous Weapon Systems Working Group is responsible for coordinating efforts across various offices within the Department to ensure consistency with this directive. The working group is also responsible for developing guidance on testing, evaluation, and certification requirements for autonomous and semi-autonomous weapon systems.

Legal reviews are required when acquiring or modifying autonomous or semi-autonomous weapon systems. These reviews must be conducted in accordance with DoDD 5000.01 (The Defense Acquisition System), DoDD 2311.01 (DoD Law of War Program), and DoDD 3000.03E (DoD Executive Agent for Non-Lethal Weapons and NLW Policy). Legal reviews must address consistency with all applicable domestic and international law and, in particular, the law of war.

The directive emphasizes the importance of technical feasibility when considering support for autonomous and semi-autonomous weapon systems. Only those systems that are technically feasible, consistent with applicable law, and consistent with the standards in this directive should be considered for support.

The directive also seeks to balance potential benefits and drawbacks when considering support for autonomous and semi-autonomous weapon systems. The benefits of using these systems include increased precision, reduced risk to personnel, and improved situational awareness. However, there are also potential drawbacks, such as the risk of unintended engagements, the potential for loss of human control, and ethical and legal concerns.

To address these concerns, the directive establishes several requirements for autonomous and semi-autonomous weapon systems. These systems must function as anticipated in realistic operational environments against adaptive adversaries taking realistic and practicable countermeasures. They must also complete engagements within a timeframe and geographic area, as well as other relevant environmental and operational constraints, consistent with commander and operator intentions. If unable to do so, the systems will terminate the engagement or obtain additional operator input before continuing the engagement. Additionally, these systems must be sufficiently robust to minimize the probability and consequences of failures.
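To make these behavioral requirements concrete, the following is a minimal, purely conceptual sketch in Python of how an engagement-constraint check might be expressed in software. It is not drawn from the directive or any real system; the names (EngagementConstraints, evaluate_engagement, the communications-degradation flag) and the specific checks are hypothetical illustrations of the "terminate the engagement or obtain additional operator input" requirement.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum, auto


class EngagementDecision(Enum):
    CONTINUE = auto()                 # constraints still satisfied
    TERMINATE = auto()                # cannot complete within constraints
    REQUEST_OPERATOR_INPUT = auto()   # escalate to a human before continuing


@dataclass
class EngagementConstraints:
    """Commander/operator-specified bounds for an engagement (hypothetical model)."""
    deadline: datetime                # engagement must complete before this time
    area_id: str                      # authorized geographic area
    degraded_comms_requires_human: bool = True


def evaluate_engagement(now: datetime,
                        current_area_id: str,
                        comms_degraded: bool,
                        constraints: EngagementConstraints) -> EngagementDecision:
    """Decide whether the system may continue, must terminate, or must ask a human.

    This mirrors the directive's requirement in spirit only: if the engagement
    cannot be completed within the specified timeframe and geographic area, the
    system terminates it or obtains additional operator input before continuing.
    """
    if now > constraints.deadline or current_area_id != constraints.area_id:
        return EngagementDecision.TERMINATE
    if comms_degraded and constraints.degraded_comms_requires_human:
        return EngagementDecision.REQUEST_OPERATOR_INPUT
    return EngagementDecision.CONTINUE
```

The design point the sketch tries to capture is that the constraints are set by the commander or operator in advance, and the system's only autonomous options when those constraints cannot be met are to stop or to ask a human.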

The directive also requires that autonomous and semi-autonomous weapon systems be designed with appropriate levels of human oversight and control. This includes ensuring that humans are able to monitor system performance, intervene if necessary, and make decisions about whether to engage targets.
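As a companion sketch, and again purely for illustration, the snippet below shows one common way "human in the loop" oversight is modeled in software: an engagement proceeds only on an explicit, affirmative operator decision, while a hold or abort stops it. The OperatorAction enum and request_engagement_authorization function are hypothetical; the directive specifies policy outcomes, not an interface or implementation.

```python
from enum import Enum, auto


class OperatorAction(Enum):
    APPROVE = auto()   # human authorizes the engagement
    ABORT = auto()     # human intervenes and stops it
    HOLD = auto()      # human defers; the system must not proceed


def request_engagement_authorization(status_report: dict,
                                     get_operator_action) -> bool:
    """Gate an engagement on an explicit human decision (illustrative only).

    status_report       -- telemetry the operator monitors (hypothetical structure)
    get_operator_action -- callable returning an OperatorAction; stands in for
                           whatever interface a real system would expose
    """
    action = get_operator_action(status_report)
    if action is OperatorAction.APPROVE:
        return True    # proceed only on an affirmative human decision
    return False       # ABORT or HOLD: do not engage


# Example: a stand-in operator that always holds, so nothing proceeds.
if __name__ == "__main__":
    authorized = request_engagement_authorization(
        {"target_confidence": 0.4}, lambda report: OperatorAction.HOLD
    )
    print("Engagement authorized:", authorized)
```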

Finally, the directive emphasizes the importance of transparency in developing and using autonomous and semi-autonomous weapon systems. The Department of Defense is required to provide information about these systems to other governments, international organizations, non-governmental organizations (NGOs), academia, industry partners, and other stakeholders as appropriate.

