Directive 3000.09 Archives | DefenseScoop
https://defensescoop.com/tag/directive-3000-09/

Congress to require rundown from DOD on approvals and deployments of autonomous weapons
DefenseScoop | Dec. 9, 2024
https://defensescoop.com/2024/12/09/dod-autonomous-weapon-system-approvals-deployments-congress-ndaa-2025/

Pentagon officials have been tight-lipped publicly about specific systems that have undergone review under DOD Directive 3000.09.

The annual defense authorization bill released Saturday would require Defense Department officials to provide annual updates to lawmakers with details on approvals and deployments of lethal autonomous weapon systems by the U.S. military.

Last year, the Pentagon updated DOD Directive 3000.09, “Autonomy in Weapon Systems,” which provides guidance for officials who will be responsible for overseeing the design, development, acquisition, testing, fielding and employment of these types of capabilities — and created a new working group to facilitate senior-level reviews of the technology.

The department defines an autonomous weapon system as “a weapon system that, once activated, can select and engage targets without further intervention by an operator. This includes, but is not limited to, operator-supervised autonomous weapon systems that are designed to allow operators to override operation of the weapon system, but can select and engage targets without further operator input after activation.”

A semi-autonomous weapon system is defined as “a weapon system that, once activated, is intended to only engage individual targets or specific target groups that have been selected by an operator.” So-called “fire and forget” or lock-on-after-launch homing munitions are some examples.

According to the guidance, with some exceptions, before autonomous weapons can enter formal development they must be approved by the undersecretary of defense for policy, undersecretary of defense for research and engineering, and the vice chairman of the Joint Chiefs of Staff. Additionally, the officials in those roles must sign off again before they can be fielded.

Since the directive was updated, however, Pentagon officials have been publicly tight-lipped about which specific platforms have gone through these types of reviews and signoffs, and about the outcomes of those reviews.

The conferenced version of the fiscal 2025 National Defense Authorization Act would require the secretary of defense, no later than Dec. 31, 2025, and annually after that, to provide the congressional defense committees a “comprehensive report” on the approval and deployment of lethal autonomous weapon systems by Uncle Sam.

The documents must include “a comprehensive list of any lethal autonomous weapon systems that have been approved by senior defense officials for use by the United States military under Department of Defense Directive 3000.09, or any successor document, and the dates of such approvals”; any systems that have received a waiver of the requirement for review by senior DOD officials, and the dates such waivers were issued; and systems that are undergoing review.

Additionally, it would require the Pentagon to provide a comprehensive list of any lethal autonomous weapon systems that were not approved after review.

Notably for public transparency, each report would have to be submitted in unclassified form. However, they could also include a classified annex, according to the bill.

The reporting requirements would terminate on Dec. 31, 2029, unless lawmakers extend them.

The annual NDAA, which is often described as “must-pass” legislation, has to be approved by both chambers of Congress and signed by the president before it’s enacted.

How DOD will help agencies comply with the White House’s new rules for AI in national security
DefenseScoop | Oct. 31, 2024
https://defensescoop.com/2024/10/31/how-pentagon-help-agencies-follow-white-house-rules-ai-national-security/

The acting deputy CDAO for policy, John Turner, shared new details with DefenseScoop in an interview this week.

Chief Digital and Artificial Intelligence Office technologists who produced the Pentagon’s first “Responsible AI” toolkits are set to help steer multiple activities that will contribute to the early implementation of the new White House memo guiding federal agencies’ adoption of trustworthy algorithms and models for national security purposes, according to defense officials involved. 

Acting Deputy CDAO for Policy John Turner briefed DefenseScoop on those plans and more — including near-term implications for the Defense Department’s autonomous weapons-governing rules — in an exclusive interview at the RAI in Defense Forum on Tuesday. 

“I actually co-led the meeting this morning with the Office of the Under Secretary of Defense for Policy” about the National Security Memorandum (NSM) on AI issued by President Biden last week, Turner explained.

By directive, that undersecretariat is the department’s interface with the White House National Security Council.

“But given the substance and the content of what the NSM is driving, we are co-leading the coordination of all of the deliverables from the NSM together with [the Office of the Under Secretary of Defense for Policy]. So, we held the kickoff meeting with representatives from across the whole department to level-set on our approach, how we’re going to communicate. And given the 66 different actions that are from the framework, the NSM, and its classified annex, we want to make sure that we’re organized and we have the right people on the right tasks,” Turner told DefenseScoop.

As he highlighted, the new NSM and framework lay out a path ahead and dozens of directions for how the U.S. government should appropriately harness AI-powered technologies and associated models “especially in the context of national security systems (NSS), while protecting human rights, civil rights, civil liberties, privacy, and safety in AI-enabled national security activities.”

A number of the memo’s provisions are geared toward the DOD and the intelligence community.

During an onstage demo and presentation on the CDAO’s toolkits at Tuesday’s forum, the office’s RAI Chief Matthew Johnson said his team is now working on interagency processes and digital mechanisms that government entities can use to demonstrate how their AI capabilities meet requirements set by the new NSM and related executive orders.

Turner later told DefenseScoop that Johnson was referring to work that’s being steered by one of four working groups under the Federal Chief AI Officer Council, which is led by the White House Office of Science and Technology Policy and the Office of Management and Budget.

“One of those working groups is defining the minimum risk practices that the whole of the federal government will implement, and [the DOD] is very pleased that Dr. Johnson was asked to help lead that group,” Turner said.

Part of this work will involve generating new technical tools and principles that the CDAO hopes to package in a way that officials view as most useful for real-world AI developers.

“There’s plenty of resources that developers can access in order to help identify bias, or understand how their dataset may be useful for their objective or outcome that they’re working towards. But how all of those resources get combined in a way that is intuitive, and then outputs a body of evidence that then those developers can take to their approving authorities and say, ‘Indeed, this is responsible capability’ — that is something that the RAI team within DOD has worked really hard on,” Turner noted. 

“So, it’s been that work that is now starting to find additional footing in additional circles, like at the federal level” and by international partners, he added. 

The toolkit and the variety of resources that Johnson’s team created and continues to refine essentially provide interactive guidance and mapping that translates DOD’s ethical principles for AI adoption into “specific technical resources that can be employed as a model that’s being developed and along the model’s so-called life-cycle, which is not intuitively obvious to everyone at every level,” Turner said.

Without sharing many details, Turner confirmed to DefenseScoop that the Pentagon will be revisiting its recently updated policy for “Autonomy in Weapon Systems,” Directive 3000.09, to fully ensure it complies with the administration’s recent AI directions.

“I would just offer that the NSM and the [OMB memo on AI, M-24-10] largely focus on risk practices and associated reporting for ‘covered AI’ [that’s] generally rights- and safety-impacting AI and high-impact use cases. And so it is underneath that umbrella that we would see a smaller subset of these use cases and of these technologies. And lethal autonomous weapon systems would be one of those smaller categories underneath that umbrella,” he said.

Turner subsequently issued the following statement to DefenseScoop: “DoD Directive 3000.09 ‘Autonomy in Weapon Systems,’ was updated in January 2023 and provides policy and responsibilities for developing and using autonomous and semi-autonomous functions in weapon systems. The approval processes based on this Directive are well positioned to incorporate the minimum risk practices outlined in the AI National Security Memorandum.”

The artificial intelligence policy chief also expressed his view that the new NSM’s spotlight on advanced models and the uncertain and emerging field of generative AI is already driving “a more focused conversation around resourcing” across DOD components. 

“There is certainly leadership attention and resourcing now that I think is in a much stronger place than it was before the NSM, largely in part due to the tremendous interagency work and articulation around the moment that we’re in and why we need to rise to this occasion,” Turner said.

Updated on Nov. 5, 2024, at 12:05 PM: This story has been updated to include an additional statement from John Turner about DOD Directive 3000.09 that was provided on Nov. 5.

CDAO shapes new tools to inform Pentagon’s autonomous weapon reviews
DefenseScoop | April 4, 2024
https://defensescoop.com/2024/04/04/cdao-new-tools-inform-pentagon-autonomous-weapon-reviews/

DefenseScoop recently discussed these new resources with the CDAO’s acting chief of responsible artificial intelligence.

The Chief Digital and Artificial Intelligence Office team behind the Pentagon’s nascent Responsible AI Toolkit is producing new, associated materials to help defense officials determine if capabilities adhere to certain mandates in the latest 3000.09 policy directive that governs the military’s development and adoption of lethal autonomous weapons.

“Obviously, the 3000.09 process is not optional. But in terms of how you demonstrate that you are meeting those requirements — we wanted to provide a resource [to help],” Matthew Johnson, the CDAO’s acting Responsible AI chief, told DefenseScoop in a recent interview. 

The overarching toolkit Johnson and his colleagues have developed — and will continue to expand — marks a major deliverable of the Defense Department’s RAI Strategy and Implementation Pathway, which Deputy Secretary Kathleen Hicks signed in June 2022. That framework was conceptualized to help defense personnel confront known and unknown risks posed by still-emerging AI technologies, without completely stifling innovation.

Ultimately, the RAI Toolkit is designed to offer a centralized process for tracking and aligning projects to the DOD’s AI Ethical Principles and other guidance on related best practices.

Building on early success and widespread use of that original RAI toolkit, Johnson and his team are now generating what he told DefenseScoop are “different versions of the toolkit for different parties, or personas, or use cases” — such as one explicitly for defense acquisition professionals.

“It’s not to say that these different versions that kind of come out of the foundational one are all going to be publicly released,” Johnson said. “There will be versions that have to live at higher classification levels.”

One of those in-the-works versions that will likely be classified once completed, he confirmed, will pertain to DOD Directive 3000.09.

In January 2023, the first-ever update to the department’s long-standing official policy for “Autonomy in Weapon Systems” went into effect. Broadly, the directive assigns senior defense officials specific responsibilities to oversee and review the development, acquisition, testing and fielding of autonomous weapon platforms built to engage military targets without troops intervening.

“So, that came out as the official policy. This isn’t like the official toolkit that operationalizes it. This is a kind of voluntary, optional resource that my team [is moving to offer],” Johnson said. 

The directive’s sixth requirement mandates that staff have plans in place to ensure consistency with the DOD AI Ethical Principles and the Responsible AI Strategy and Implementation Pathway for weapons systems incorporating AI capabilities — and incorporate them in pre-development and pre-fielding reviews. 

“We’re just providing a kind of resource or toolkit that enables you to demonstrate how you have met that requirement for either of those two reviews,” Johnson said. 

“Basically what we’re developing is something very similar to what you see in the public version of the toolkit — where, basically, you have assessments and checklists and those route you to certain tools to engage with, and then those can be basically pulled forward and rolled up into a package that can either show how you’re meeting requirement 6, or actually how you’re meeting all of the requirements,” he explained. 

Recognizing that “there’s certainly some overlap that can happen between the requirements,” Johnson said his team also wants “to provide basically an optional resource you can use to either show how you’re meeting requirement 6, or how you’re meeting all the requirements — through a process that basically eliminates, as much as possible, some of those redundancies in your answers.” 

These assets are envisioned to particularly support officials who are packaging 3000.09-aligned pre-development and pre-fielding reviews.

“This is the first kind of policy that has a review process, that has a requirement to be able to demonstrate alignment or consistency with the DOD AI ethical principles — and so, what we’re really interested in here is kind of collecting lessons learned [about] what having a requirement like this does for overall mission success and what using the toolkit to meet a requirement like this does for mission success. And we’re hoping to basically acquire some really good data for this that will help us refine the toolkit and help us understand basically, like, is this a good requirement for future policies and what future policies should have a requirement like this?” Johnson told DefenseScoop.

After 3000.09 update, DOD stays quiet on lethal autonomous weapon reviews
DefenseScoop | Oct. 27, 2023
https://defensescoop.com/2023/10/27/after-3000-09-update-dod-stays-quiet-on-lethal-autonomous-weapon-reviews/

Pentagon officials have emphasized the department’s intent to operate in a trustworthy manner — but repeatedly declined to share any information about specific weapons in the context of the review process.

Pentagon officials appear to be operating under a de facto standard of declining to discuss when or which weapon systems have been, are, or will be subject to the freshly updated 3000.09 review process, which governs the examination and approval of lethal autonomous platforms that can engage military targets without troops intervening, DefenseScoop has learned.

The first updated version of the Defense Department’s long-standing official policy — Directive 3000.09, “Autonomy in Weapon Systems” — went into effect in January. 

That revamped guiding document defines such systems as those that “once activated, can select and engage targets without further intervention by an operator.” 

Outlining some exceptions, the policy broadly assigns certain senior defense officials responsibilities for overseeing the development, procurement, testing, fielding and ultimate use of autonomous weapons. It also directs the Pentagon to set up a new working group of officials charged with heading senior-level reviews of these still-emerging technologies.

In responses to multiple questions from DefenseScoop over recent months regarding how DOD is executing on this refreshed 3000.09 review and whether any autonomous systems have been approved for use by the U.S. military, Pentagon officials have emphasized the department’s intent to operate in a trustworthy manner — but repeatedly declined to share any information about specific weapons in the context of the review process.

“DOD’s policy [for] autonomy in weapon systems, Directive 3000.09, is designed to ensure that, if DOD needs to develop and field autonomous weapon systems, it can do so safely, responsibly, and in accordance with the law. DOD is always evaluating potential systems to see what kind of reviews are required prior to their development and fielding, including but not limited to the review process in DOD Directive 3000.09. We cannot comment on the DOD Directive 3000.09 review process with regard to any particular weapon system, however,” a Pentagon spokesperson told DefenseScoop via email in September.

Earlier this summer, responding to questions from DefenseScoop at a tech summit, Michael Horowitz, director of the Pentagon’s emerging capabilities policy office, struck a similar tone.

Noting that one new element of the latest version of 3000.09 is the creation of that “autonomous weapon system working group that’s designed to be essentially a clearing house in the department” for people “who have questions about systems they’re developing and want to understand whether they would fall under the review process,” Horowitz confirmed that “there are lots of conversations that are always going on about different systems” that folks across DOD components may aim to apply or develop.

“[But] we can’t comment on any particular weapon systems,” he said.

At a separate defense technology conference, later on in August, a senior DOD expert focusing on autonomy and artificial intelligence told DefenseScoop: “We have started that [3000.09] process. I don’t know how much I could talk about that there — because it’s not unclassified. But I will say that the rigor that’s laid out in the new version that you’ve seen is being put into practice.”

These and other comments contrast with the answers DOD officials provided, prior to the 3000.09 update, to questions about what was undergoing the review process. In March 2019, for instance, a Pentagon spokesperson stated that “no weapon has been required to undergo the Senior Review in accordance with DOD Directive 3000.09” up to that point.

The remarks also come at a time when the U.S. military is moving to deploy AI, unmanned platforms and other capabilities that could result in weapon systems with much more autonomy in conflict zones in the years to come.

Deputy Defense Secretary Kathleen Hicks (who notably signed off on the 3000.09 update) recently launched the new Replicator initiative to enable attritable drones to be fielded by the U.S. military at a scale of multiple thousands in multiple domains within the next two years. 

Though Hicks has clarified that systems associated with Replicator are “not synonymous” with autonomous weapon systems, she has also emphasized that the department has been integrating them into pursuits “for decades, from Aegis destroyers to ship- and ground-based Phalanx defense systems” — and that the military has “continually gotten better at it.”

DefenseScoop recently asked the Pentagon spokesperson again whether any autonomous weapon systems have been assessed and if it is DOD’s official policy not to comment on or confirm whether existing or envisioned capabilities have or will be put through the review.

“As noted previously, we cannot comment on the DOD Directive 3000.09 review process with regard to any particular weapon system. I do not have anything to add regarding systems that have gone through or are currently going through the 3000.09 review process,” the official said.

They did not share explicit reasons for why the Pentagon is declining to comment at all on the matter.

Editor’s note: Jon Harper contributed to this article. 

No, ChatGPT didn’t write DOD’s latest autonomous weapons policy — but similar tech might be used in the future
DefenseScoop | Feb. 22, 2023
https://defensescoop.com/2023/02/22/no-chatgpt-didnt-write-dods-latest-autonomous-weapons-policy-but-similar-tech-might-be-used-in-the-future/

The department’s “always looking at ways to take advantage of” emerging tech, a tech policy chief said.

Officials did not apply ChatGPT or any other generative artificial intelligence capabilities to write their recent update to the Pentagon’s overarching policy governing the use of autonomy in weapon systems. But that certainly doesn’t mean the Defense Department’s Emerging Capabilities Policy Office Director Michael Horowitz is ruling out the use of such technologies to inform his team’s future policymaking activities.

“There’s, I think, broad agreement — especially in light of what’s become more public over the last few years, and not just ChatGPT but lots of advances in AI — that there will be capabilities coming online, mostly driven by the commercial sector that we think could have a substantial impact. And so, the department’s always looking at ways to take advantage of those,” Horowitz said during a virtual event hosted by the Institute for Security and Technology on Wednesday.

Launched in late 2022, ChatGPT gained quick popularity online as a chatbot built by research firm OpenAI that can interact with humans and perform tasks (with some accuracy) in a conversational manner. It is a product of generative AI, a subfield that broadly develops and refines large language models capable of generating audio, code, images, text, videos and other media when humans direct them to.

The viral interest ChatGPT sparked on the internet in recent months has already caught the attention of some federal entities. The Central Intelligence Agency is preparing to explore the potential generative AI has to impact its operations, for example — and the Defense Information Systems Agency also announced recent plans to add the emerging technology to its forthcoming mid-year fiscal 2023 tech watch list.

When asked on Wednesday whether such tools were used by officials who crafted the recent update to DOD Directive 3000.09, “Autonomy in Weapon Systems,” Horowitz playfully responded: “if there’s any part of the directive that you think is unclear — that was definitely written by ChatGPT and not by a person.” 

“Jokes aside, the answer to that question is no,” he quickly added. 

Horowitz then shared his own personal opinion — not the DOD’s, he emphasized — on applying this tech. It generally doesn’t make sense, he said, to run most Pentagon policies or official documents through generative AI tools immediately prior to publication.

Instead, ChatGPT-like capabilities could possibly bring some value to the drafting process.

A utility “could be in feeding it paragraphs and asking it to summarize” the information, and then “seeing whether ChatGPT thinks it says what you think it says. But that is not something [DOD] did with the directive,” Horowitz said. 

“That is definitively an unofficial position. And also, please follow the law. If anybody here works for the government, please follow all the regulations,” he added.

Pentagon updates guidance for development, fielding and employment of autonomous weapon systems
DefenseScoop | Jan. 25, 2023
https://defensescoop.com/2023/01/25/pentagon-updates-guidance-for-development-fielding-and-employment-of-autonomous-weapon-systems/

The updated DOD Directive 3000.09, “Autonomy in Weapon Systems,” went into effect Wednesday.

The Pentagon has provided updated guidance for Defense officials who will be responsible for overseeing the design, development, acquisition, testing, fielding and employment of autonomous weapon systems — and created a new working group to facilitate senior-level reviews of the technology.

The move comes as the U.S. military is embracing artificial intelligence, unmanned platforms and other tech that could give weapon systems much more autonomy than those of previous eras.

The updated DOD Directive 3000.09, “Autonomy in Weapon Systems,” was signed by Deputy Defense Secretary Kathleen Hicks and went into effect Wednesday. It’s the first major update since 2012.

An autonomous weapon system is “a weapon system that, once activated, can select and engage targets without further intervention by an operator. This includes, but is not limited to, operator-supervised autonomous weapon systems that are designed to allow operators to override operation of the weapon system, but can select and engage targets without further operator input after activation,” according to the Pentagon’s definition.

A semi-autonomous weapon system is defined as “a weapon system that, once activated, is intended to only engage individual targets or specific target groups that have been selected by an operator.” So-called “fire and forget” or lock-on-after-launch homing munitions are some examples.

According to the updated directive, these kinds of technologies must be designed to allow commanders and operators to exercise “appropriate levels of human judgment” over the use of force. And they will be put through “rigorous” hardware and software verification and validation as well as “realistic” operational test and evaluation.

With some exceptions, before autonomous weapons can enter formal development they must be approved by the undersecretary of defense for policy, undersecretary of defense for research and engineering, and the vice chairman of the Joint Chiefs of Staff. Additionally, the undersecretary of defense for policy, undersecretary of defense for acquisition and sustainment and the vice chairman must sign off again before they can be fielded.

Notably, capabilities that are modifications of an existing non-autonomous weapon system — or modified versions of previously approved autonomous weapon systems “whose system algorithms, intended mission sets, intended operational environments, intended target sets, or expected adversarial countermeasures substantially differ from those applicable to the previously approved weapon systems so as to fall outside the scope of what was previously approved in the senior review” — will require a new senior-level review and sign-off before entering formal development and again before they can be fielded.

Capabilities that won’t be required to undergo the same high-level review include semi-autonomous weapon systems used to apply lethal or non-lethal force “without any modes of operation in which they are intended to function as an autonomous weapon system.”

Other autonomous technologies that are more defensive in nature and not subject to the same senior-level review include: operator-supervised autonomous weapon systems used to select and engage materiel targets to intercept “attempted time-critical or saturation attacks” against installations and their personnel; onboard and/or networked defense of platforms with onboard personnel; operator-supervised autonomous weapon systems used to select and engage materiel targets for defending operationally deployed drones and robotic vehicles; and autonomous weapon systems used to apply non-lethal, “non-kinetic” force against materiel targets in accordance with DOD Directive 3000.03E.
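For illustration only, the review triggers and carve-outs described above can be summarized as a simple decision rule. The sketch below is an informal reading of this article’s description of the directive, not an official DOD artifact; the category labels and function names are invented for the example.

```python
# Illustrative sketch (not an official DOD tool) of the senior-review trigger
# logic described in this article for DOD Directive 3000.09. Category names
# are informal labels invented for this example, not directive text.

# Categories the article lists as exempt from the senior-level review.
EXEMPT_CATEGORIES = {
    "semi_autonomous_no_autonomous_mode",        # semi-autonomous systems with no autonomous modes
    "supervised_defense_time_critical_attacks",  # defending installations from time-critical/saturation attacks
    "onboard_platform_defense",                  # defense of platforms with onboard personnel
    "supervised_defense_unmanned_vehicles",      # defending deployed drones and robotic vehicles
    "non_lethal_non_kinetic_vs_materiel",        # non-lethal, non-kinetic force per DODD 3000.03E
}

def requires_senior_review(category: str,
                           previously_approved: bool = False,
                           substantially_modified: bool = False) -> bool:
    """Return True if a system in this (informal) category would need senior
    sign-off before formal development and again before fielding."""
    if category in EXEMPT_CATEGORIES:
        return False
    # Per the article, a previously approved autonomous weapon system needs a
    # new review only when its algorithms, mission sets, operational
    # environments, target sets, or expected adversarial countermeasures
    # substantially differ from what was originally approved.
    if previously_approved and not substantially_modified:
        return False
    return True

print(requires_senior_review("autonomous_lethal"))                  # True
print(requires_senior_review("semi_autonomous_no_autonomous_mode")) # False
```

A modified version of a previously approved system that falls outside the scope of its earlier approval (`substantially_modified=True` in this sketch) would flip back to requiring a new review, matching the directive’s language quoted above.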

The guidance also calls for the establishment of training, doctrine, tactics, techniques and procedures that are applicable to capabilities that are developed and fielded.

“Persons who authorize the use of, direct the use of, or operate autonomous and semiautonomous weapon systems will do so with appropriate care and in accordance with the law of war, applicable treaties, weapon system safety rules, and applicable rules of engagement,” it said. “The use of AI capabilities in autonomous or semi-autonomous weapons systems will be consistent with the DoD AI Ethical Principles.”

The oversight protocols are intended to ensure that autonomous weapons will function as intended, be robust against enemy countermeasures, and minimize the likelihood and negative consequences of system failures.

Autonomous and semi-autonomous capabilities will be expected to “complete engagements within a timeframe and geographic area, as well as other relevant environmental and operational constraints, consistent with commander and operator intentions. If unable to do so, the systems will terminate the engagement or obtain additional operator input before continuing the engagement,” the directive said.

To reduce the risks of an “unintended engagement” or operational interference by adversaries or others, hardware and software are expected to be designed with system safety, anti-tamper mechanisms, and cybersecurity solutions in accordance with DOD Instruction 8500.01 and Military Standard 882E; human-machine interfaces and controls; and technologies and data sources that are transparent to, auditable by, and “explainable” by relevant personnel.

The 20-page updated guidance is far-reaching and must be adhered to by the Office of the Secretary of Defense, military departments, Office of the Chairman of the Joint Chiefs of Staff, Joint Staff, combatant commands, Office of Inspector General, Defense agencies, DOD field activities, and other organizations within the department.

The Pentagon noted that the directive does not apply to unguided munitions; munitions that are remotely guided by an operator; mines; unexploded explosive ordnance; unarmed platforms; or autonomous or semi-autonomous systems that are not considered "weapon systems."

Notably, autonomous or semi-autonomous cyberspace capabilities are not subject to Directive 3000.09.

“The cyber systems were excluded, I believe, from the original directive published in 2012. And this policy change and this update to the directive … does not change that. There are a number of different policies that govern the development and deployment and use of cyber systems. And the department did not believe that adding a cyber requirement to the autonomous weapon system directive was necessary at this time,” Michael Horowitz, director of the Pentagon’s emerging capabilities policy office, told DefenseScoop during a call with reporters on Wednesday to discuss Directive 3000.09.

To help implement the guidance, the Pentagon is establishing a new “Autonomous Weapon Systems Working Group” to support senior leaders in “considering the full range of relevant DoD interests during the review of autonomous weapon systems” before they move into a formal development pipeline and before they are fielded.

The working group will consist of representatives from the OSD policy, A&S and R&E directorates; Chief Digital and AI Office (CDAO); Office of the Director of Operational Test and Evaluation; Joint Staff; and Office of General Counsel.

The group will also advise leaders of the military departments, U.S. Special Operations Command, and directors of Defense agencies or DOD field activities on whether a given weapon system requires senior-level approval under Directive 3000.09. The panel will also help identify and provide recommendations on "addressing potential issues presented by a given weapon system during a potential senior-level review."

Horowitz said the new working group is meant to advance “good governance.”

“What the autonomous weapons working group does is facilitate aggregating the information that senior leaders would need … to be able to effectively make decisions; to, you know, essentially put the paper package together to be able to have an effective review process to ensure that either prior to development or … prior to fielding that a proposed autonomous weapon system fit with the requirements laid out” in the directive, he said.

The panel won’t be a decision-making body, he suggested.

“The point of the working group is essentially to facilitate the aggregation of the information that senior leaders would need to make responsible decisions about potential autonomous weapon systems,” Horowitz said.

The updated guidance isn’t a radical departure from the previous iteration. It was meant to provide “clarifications and refinements” and comply with requirements that the guidance be updated every 10 years or so, according to Horowitz.

“There are essentially a lot of things that … maybe were not laid out explicitly in the original directive that may have contributed to some of the maybe perceptions of confusion” about DOD policy, he told DefenseScoop. “We wanted to clean as much of that up as possible.”

Updated on Jan. 25, 2023: This story has been updated to note the types of autonomous and semi-autonomous weapon systems that aren’t subject to the same requirements for senior-level review as other autonomous or semi-autonomous weapon systems.
