intelligence community (IC) Archives | DefenseScoop
https://defensescoop.com/tag/intelligence-community-ic/

Winston Beauchamp retires from federal service after 29 years at Air Force, IC
DefenseScoop | Tue, 08 Jul 2025
https://defensescoop.com/2025/07/08/winston-beauchamp-retires-from-federal-service-air-force-ic/

Throughout his nearly three-decade career in federal government, Beauchamp has been at the forefront of several pivotal moments at the Pentagon — from the boom of commercial space-based imagery to the creation of the Space Force.

After nearly three decades of working for the U.S. government, Winston Beauchamp announced on July 4 that he’s departing from his role within the Department of the Air Force and leaving active federal service. 

Beauchamp began working for the department in 2015, and most recently served as the director of security, special program oversight and information protection within the Office of the Secretary of the Air Force. In that role, he oversaw the Air and Space Forces’ highly classified special access programs (SAPs) and worked on insider threat mitigation.

But Beauchamp’s 29-year career spanned multiple positions at the Department of the Air Force, the National Geospatial-Intelligence Agency (NGA) and the Office of the Director of National Intelligence (ODNI). He led or was involved in several critical events in the national security space — so much so that a colleague once described him as “the Forrest Gump of the national security world.”

“He goes, ‘You were kind of there in all the big happenings of your time of your career. You were right in the middle of all these things that were the big developments. Sometimes you were there in the background of the scene, and sometimes you were there front and center doing the thing,’” Beauchamp told DefenseScoop in an interview on July 3, his last day at the Pentagon, recalling how a colleague described his tenure.

After graduating from Lehigh University in 1992, Beauchamp was hired as a systems engineer for General Electric Aerospace’s programs with the National Reconnaissance Office (NRO). He eventually moved to the National Imagery and Mapping Agency (NIMA) — the precursor to the NGA — after it was founded in 1996, working as an operations analyst supporting the collection of imagery and targeting data in the Balkans during the Yugoslav Wars.

In 2000, Beauchamp became NIMA’s senior technical advisor for studies and analysis when he was 29 years old, making him the youngest person to be hired for a senior executive position within the agency since it was founded. Almost immediately, he was tasked with developing a congressionally mandated strategy that would convince the government to purchase imagery from commercial vendors.

At the time, the IC held a monopoly over space-based imagery and data, and the industry market was only just beginning to take hold. Beauchamp described the assignment as “trying to sell milk to people with their own cows.”

“Why would the NRO want to encourage the government to buy commercial imagery? They’re the judge to build and operate imagery satellites,” he said. “So I figured out what it would take in terms of investment to get industry to buy and build satellites sufficient to meet the government’s demands, because the national satellites were not meeting all of the government’s demand for mapping data.”

But after developing a business case for the strategy, Beauchamp said the government was largely opposed to implementing it. He decided to shelve it after one final unsuccessful meeting held on Sept. 10, 2001, he said.

“On the 11th of September, [Congress] called me up,” he said. “I’m in my office, we’re watching pictures of the [Twin Towers] smoking, and my phone rings and it’s the congressional staff saying, ‘You’ve got your money. Could you spend more?’”

Beauchamp’s $830 million plan was funded by one of Congress’ post-9/11 supplemental packages and created ClearView — the first program that allowed commercial companies to provide satellite imagery to the IC. Once U.S. forces had entered Afghanistan, Beauchamp also moved to purchase all of the overhead imagery of the country, he said.

“What we really wanted to do was make sure that this imagery that was being collected wasn’t being used by the Taliban to target our forces,” he said. “So I basically stitched a camouflage net made out of $100 bills over the country of Afghanistan in order to keep our forces safe.”

Today, commercially derived imagery is one of the fastest-growing markets in the world. Companies like Maxar, BlackSky and Planet Labs all have several lucrative contracts with the federal government to provide space-based data for national security, weather and other needs.

“So this industry, would it exist? Maybe. But would it have blown up the way it did? Probably not, if we hadn’t done this,” Beauchamp said.

The next several years of Beauchamp’s career were spent at the NGA in various roles focused on strategy and acquisition. In 2012, he began a joint duty assignment as the ODNI’s director of mission integration under then-Director of National Intelligence Gen. James Clapper — a job he noted was one of the highlights of his career. On his second day on the job, U.S. government facilities in Benghazi, Libya, were targeted by militant groups, leading to the deaths of four Americans.

Once Beauchamp’s team finished its assessment of the attack, he was immediately thrust into the fallout from the classified document leaks by Edward Snowden in 2013. The work he oversaw led to a massive reform of the IC’s compartmented access programs and yet another overhaul of the government’s policy on commercial imagery.

“All of a sudden, now I’m convening people on the analytics side [and the] collection side, trying to figure out how to make up for the losses and capability that Snowden revealed,” he said. “And part of that is doing a reform of the IC’s compartmented programs, because they had way too many of them in overlap.”

Toward the end of his three-year assignment, Beauchamp started working with former Deputy Secretary of Defense Bob Work on a “side project” focused on standing up a new organization to pivot the Defense Department away from counterterrorism operations in the Middle East and towards great power competition, he said.

Beauchamp’s time in the intelligence community came to an end in 2015, when he was picked to be the Department of the Air Force’s deputy undersecretary for space and director of the principal DOD space advisor. There, he had two critical tasks, he noted.

“One, I’m working with all the international relationships with other countries who want to cooperate with us in space,” Beauchamp said. “At the same time, I’m trying to convince the Americans to shift from space as a sanctuary from which you provide services, to space as a domain for warfighting.”

At the time, the Pentagon was reluctant to expand operations in space out of fear of being the first to weaponize the domain. But Beauchamp argued that the idea wasn’t about weaponization but about protecting critical space-based capabilities.

“It’s almost like before then, we were deliberately not protecting them so as you didn’t look like you wanted to start something,” he said. “And I was like, ‘This is not an option anymore.’ The Chinese had already demonstrated they could shoot down their own satellites, what’s to stop them from doing the same thing to us?”

Part of Beauchamp’s work was developing plans for making the Pentagon’s space systems more resilient — plans that, he noted, have become central to the Space Force’s operations. And when the first Trump administration decided to stand up the Space Force, Beauchamp was at the forefront of the effort to convince officials to approve the new military service.

Beauchamp would then transition to the Department of the Air Force’s office of the CIO, first as its director of enterprise IT in 2018 and later as the deputy CIO in 2020. His main focus was preparing the DAF for transitioning to telework operations as the COVID-19 pandemic spread across the globe, as well as consolidating the department’s enterprise licenses and creating a plan for modernizing base-level infrastructure, he noted.

“The overall trend line was eliminating the county option of uniqueness that was taking place at every base, and replacing it with a core set of enterprise services that were provided centrally,” Beauchamp said. “Big things like moving to zero trust — you can’t do those things if every base and every two-letter has their own architecture independent of everybody else’s.”

Today, the DAF has a strong path forward on modernizing its IT infrastructure, but Beauchamp said the true challenge will be convincing the department’s major programs to rely on enterprise services instead of building their own networks.

“It’s going to allow them to consolidate and collapse multiple redundant networks and really reduce the amount of money we’re spending on sustaining all this infrastructure,” he said. “When you modernize those networks, you also improve your cybersecurity, because the more deviation you have, the more gaps are created between the different baselines and different versions of software.”

Moving forward, Beauchamp said he will be taking time off but is open to other opportunities in the future.

“I’m excited for whatever the next challenge might be,” he said. “I’m interested in talking to folks who do exciting things, and to see who needs somebody like me to solve big problems.”

OpenAI’s GPT-4o gets green light for top secret use in Microsoft’s Azure cloud
DefenseScoop | Thu, 16 Jan 2025
https://defensescoop.com/2025/01/16/openais-gpt-4o-gets-green-light-for-top-secret-use-in-microsofts-azure-cloud/

Agencies across the intelligence community and the Defense Department can now use OpenAI’s GPT-4o for the government’s most classified mission sets.

Federal agencies with top-secret workloads can now use OpenAI’s GPT-4o through Microsoft’s Azure Government Top Secret cloud.

Microsoft announced Thursday it received authorization for 26 additional products in its top-secret cloud environment, meeting Intelligence Community Directive (ICD) 503 standards and allowing agencies — particularly those in the intelligence community and Defense Department — to use them for the government’s most classified information. Those added tools include Azure OpenAI Service — which provides Azure customers access to OpenAI’s generative AI large language models — and Azure Machine Learning, among others.

Douglas Phillips, a Microsoft corporate vice president, wrote in a blog post announcing the news that Azure OpenAI Service “allows agencies and authorized partners operating in Microsoft’s Azure Government Top Secret cloud to benefit from multimodal generative AI models, such as GPT-4o, while meeting the rigorous security and compliance requirements necessary for the nation’s most sensitive data. Authorized users can easily access and integrate Azure OpenAI Service and further ground it on their data for more specialized and accurate intelligence.”

GPT-4o is an OpenAI model that can be used for natural language understanding and processing, text summarization and classification, sentiment analysis, question answering, conversational agents and more. It is among the flagship models that power OpenAI’s popular commercial generative AI tool, ChatGPT.
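For developers, requests to the accredited service are expected to follow the standard Azure OpenAI chat-completions shape. As a rough sketch (the deployment name, prompt, and parameters below are illustrative assumptions, not details from Microsoft’s announcement), a text-summarization request might be assembled like this:

```python
# Hypothetical sketch: assembling an Azure OpenAI chat-completions request
# for document summarization. The deployment name, prompt, and temperature
# are illustrative assumptions, not values from Microsoft's announcement.

def build_summarization_request(document: str, max_words: int = 100) -> dict:
    """Build the payload for a chat-completions summarization call."""
    return {
        "model": "gpt-4o",  # in Azure, this is the customer's deployment name
        "messages": [
            {"role": "system",
             "content": f"Summarize the user's document in at most {max_words} words."},
            {"role": "user", "content": document},
        ],
        "temperature": 0.2,  # low temperature favors consistent summaries
    }

request = build_summarization_request("Example report text ...")
print(request["model"])          # gpt-4o
print(len(request["messages"]))  # 2
```

Sending the request inside an accredited tenant would typically go through the `AzureOpenAI` client in OpenAI’s Python SDK (`client.chat.completions.create(**request)`), with the endpoint and credentials issued within the classified environment.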

The announcement comes after Azure OpenAI received FedRAMP High authorization last August. 

Last May, William Chappell, Microsoft’s chief technology officer for strategic missions and technologies, told DefenseScoop that the company had deployed OpenAI’s GPT-4 to an isolated, air-gapped Azure Government Top Secret cloud for use by the Department of Defense for testing. However, the model wasn’t accredited for wider use at the time. The accreditation announced Thursday would now make that possible.

Chappell told DefenseScoop the availability of GPT-4 in the top-secret environment would help DOD officials deal with vast amounts of data.

It’s about “making sure you have the right information at the right time,” he said. “So whether it’s geospatial or any amount of data, we’re swimming in data, we’ve got sensors everywhere. How do you actually make sense of the information that is within your organization? Whether that’s proposals or all sorts of different types of paperwork that we all have to do — how do you simplify and how do you sort through that … data that’s mission focused or data that’s more back office and human resource-focused?”

Report highlights how secure data-sharing platforms can support the Intelligence Community’s IT roadmap
DefenseScoop | Tue, 17 Dec 2024
https://defensescoop.com/2024/12/17/report-highlights-how-secure-data-sharing-platforms-can-support-the-intelligence-communitys-it-roadmap/

GDIT’s DeepSky, Mission Partner Environments, Raven, data fabric, and digital accelerator programs illustrate how field-tested technologies can boost IC efforts to share data and promote cross-agency collaboration.

As the U.S. Intelligence Community (IC) grapples with a dynamic threat landscape and demands for faster, more secure data sharing, a new report from GDIT offers a practical guide for achieving a variety of the IC’s critical modernization goals.

The report, “Navigating the Intelligence Community IT Roadmap,” analyzes key challenges facing the IC and outlines how existing and tested technology capabilities can help IC components gain a strategic advantage over adversaries.

Download the full report.

The report’s timely release aligns with the IC’s five-year IT roadmap, which seeks to advance intelligence operations by promoting seamless collaboration, enhanced data sharing and management and the ability to deploy the newest tech innovations rapidly.

The report highlights a variety of currently available technical capabilities developed by GDIT as part of its long-standing work to support the U.S. defense and intelligence agencies, including:

  • DeepSky — a private, multi-cloud, on-prem data center environment developed and maintained by GDIT that facilitates the testing of emerging technology and security capabilities from multiple providers in collaboration with government agencies and their partners. “It’s really difficult to ingest massive amounts of data from a bunch of tools and make it usable for an engineer, an analyst or an executive. So DeepSky helps make those tools work together,” says Ryan Deslauriers, director of cybersecurity at GDIT.
  • Mission Partner Environments — a new generation of interoperable networking and data exchange environments. Originally designed to allow military units to exchange data with specific partners, these expanded information-sharing environments enable the selective yet secure sharing of sensitive and classified information with trusted military and coalition partners. MPEs make it possible to take a “full report, break out what can and can’t be released, and push it to the appropriate network virtually and automatically so that information gets to relevant users where they are in a timely fashion,” explains Jennifer Krischer, a former U.S. Air Force intelligence officer who now serves as vice president for defense intelligence at GDIT.
  • Raven — a mobile command center tech suite developed by GDIT that fits in the back of a truck. It extends and deploys the data mesh concept to mobile environments. It can be utilized for disaster relief, special forces operations, or disconnected environments, enabling operators to collect and disseminate data from the tactical edge directly to users on the ground and back to the enterprise. Raven is an example of how GDIT “enables teams to conduct their mission without having to develop, build, maintain, and operate the services internally,” notes Nicholas Townsend, senior director at GDIT.
  • Federated Data Fabric — creates a unified data environment through a centralized service platform designed to streamline data curation, management, and dissemination and enable seamless access to data independent of its source or security level. It allows users on the network’s edge to discover, request, publish and subscribe to information within a federated network environment.
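The publish/subscribe, releasability-aware pattern these environments describe can be sketched in miniature. The toy broker below (the topic names and two-tier clearance model are invented for illustration, not GDIT’s actual design) delivers each published item only to subscribers whose clearance meets the item’s releasability level:

```python
from collections import defaultdict

# Toy illustration of releasability-aware publish/subscribe. Topic names
# and clearance levels are invented for illustration only.
LEVELS = {"UNCLASS": 0, "SECRET": 1}

class Broker:
    def __init__(self):
        # topic -> list of (subscriber clearance, subscriber inbox)
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, clearance):
        inbox = []
        self.subscribers[topic].append((LEVELS[clearance], inbox))
        return inbox

    def publish(self, topic, item, releasability):
        # Deliver only to subscribers cleared at or above the item's level.
        level = LEVELS[releasability]
        for clearance, inbox in self.subscribers[topic]:
            if clearance >= level:
                inbox.append(item)

broker = Broker()
coalition = broker.subscribe("imagery", "UNCLASS")
national = broker.subscribe("imagery", "SECRET")

broker.publish("imagery", "releasable summary", "UNCLASS")
broker.publish("imagery", "full report", "SECRET")

print(coalition)  # ['releasable summary']
print(national)   # ['releasable summary', 'full report']
```

A production data fabric would add discovery, request/approval workflows, and cross-domain guards, but the core idea is the same: the releasability decision is made once, at publication, rather than by every recipient.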

Workforce commitment

The report also highlights GDIT’s distinctive approach to hiring and training professionals with extensive defense, IC, and technical experience who uniquely understand the needs of the government’s mission.

“Our workforce two to five years from now will need to be different from what it is today and prepared to take advantage of new technology,” notes Chaz Mason, mission engineering and delivery lead at GDIT. Recognizing this, GDIT doubled its investment in tuition and technical training programs in 2023. More than 20,000 employees have taken at least one of its cyber, AI and cloud upskilling programs, he said.

GDIT’s staff currently numbers 30,000 professionals supporting customers in more than 400 locations across 30 countries; over 25% of the workforce are veterans.

Read more about how GDIT’s vendor-agnostic technology and decades of government customer experience can help achieve the Intelligence Community’s data-sharing vision.

This article was produced by Scoop News Group for FedScoop and DefenseScoop and sponsored by GDIT.

From Building 213 to the Pentagon: John Sherman reflects on his legacy in government
DefenseScoop | Fri, 28 Jun 2024
https://defensescoop.com/2024/06/28/john-sherman-defense-department-cio-exit-interview/

As he departs from his role as Pentagon CIO, John Sherman spoke with DefenseScoop about his career in government and what challenges DOD faces in the future.

If there was one thing John Sherman wasn’t afraid to do during his time as the Pentagon’s chief information officer, it was advocating for new ideas in a bureaucracy that is infamously resistant to change.

He entered the role in December 2021, a tumultuous era marked by controversy over the Joint Enterprise Defense Infrastructure (JEDI) cloud effort. In the midst of the fallout, Sherman recognized that the department needed to pivot.

“I truly felt we were figuratively fighting and dying on a hill not worth fighting and dying on,” Sherman told DefenseScoop. “All this litigation that we were stuck in and back-and-forth between the several cloud service providers, I felt we were all expending energy against the wrong goals.”

Six months into his tenure as DOD CIO, he made the recommendation to cancel JEDI — a program that sought a single vendor for the Pentagon’s first enterprise cloud capability — and pivot to a multi-vendor acquisition process under what is now known as the Joint Warfighting Cloud Capability (JWCC).

“That, to me, has been the flagship or one of the top achievements I’ve had as CIO,” Sherman said.

Sherman announced June 6 that he would be departing as Pentagon CIO by the end of the month, moving into a new role at Texas A&M University, his alma mater, as the Dean of the Bush School of Government and Public Service.

During an exit interview with DefenseScoop on Monday, Sherman reflected on his nearly three-decade career in government where he often campaigned for novel approaches and technologies to accomplish missions.

“Anytime you’re doing something new, you’re gonna break some glass doing it,” he said.

A ‘digitally focused’ IC

After serving in the Army as an air defense officer in the 24th Infantry Division, Sherman said he was interested in working in the intelligence community and initially applied to be an all-source analyst at the Central Intelligence Agency.

But when he received his interview package, he was sent to Building 213 in Washington, D.C.’s Navy Yard where the DOD was standing up the new National Imagery and Mapping Agency — now known as the National Geospatial-Intelligence Agency (NGA). Sherman was hired as an imagery analyst in 1997, investigating and distributing geospatial intelligence on the Iraqi Republican Guard.

“Working that Republican Guard account for several years will, and continues to be, one of my fondest memories in the IC — working with some amazing teammates in Building 213 supporting U.S. Central Command and other entities with what I thought was insightful analysis during the no-fly-zone days, and then moving to the start of Operation Iraqi Freedom and onward,” Sherman said.

He would spend the next 23 years in the intelligence community, including as the CIA duty officer in the White House Situation Room, an all-source analyst on the National Intelligence Council and a role at the NGA Office of the Americas.

Notably, Sherman was part of the small team that was present in the White House Situation Room on the morning of the September 11 attacks on the World Trade Center.

“It was a sobering experience, but also we were honored to be there to support crisis operations on that day,” he said.

In 2014, the CIA was looking to become more “digitally focused,” and Sherman became one of two deputy directors of the CIA’s Open Source Enterprise (OSE) managing the tradecraft of open source intelligence. He led the Middle East and Asia portfolios, as well as the portfolio for emerging technologies where he first began experimenting with commercial cloud capabilities, he noted.

While at OSE, Sherman helped stand up a low-side cloud capability called the Open Source Data Layer and Services (OSPLS). The effort leveraged Amazon Web Services and other capabilities provided by the IC’s Commercial Cloud Services (C2S) program to provide a cloud-based environment for less sensitive and non-critical information.

He detailed how he also took part in the Eyesight Mission Users Group. Although the group’s focus is classified, Sherman said the experience taught him critical lessons on data standards and exactly how cloud technology works.

“What I was able to do was, as one of the initiative leaders, use open-source gathered information to feed into NSA’s gov cloud — which was their part of the classified capability — to then run the compute against this open-source information and find new things that we would not have been able to discover otherwise,” he said.

Sherman was later tapped to serve as the intelligence community’s CIO in 2017, and during his time he initiated several innovative changes that allowed the IC’s IT enterprise to evolve. 

One of those was shifting the focus of a program known as the Common IC Desktop Enterprise, which initially aimed to create a unified architecture that would allow analysts and officers to move between agencies without the hassle of transferring their data. Despite all of the money and time the IC had already invested in the effort, Sherman said he recognized it wasn’t working.

“It was never going to scale out to being this IC-level capability that it was envisioned to be, and so we pivoted to a federated architecture where we would have standards and then be able to accomplish some of the same interfaces — but not with this unified overall architecture that we were first going along,” he said. 

Another accomplishment as IC CIO was the creation of the Commercial Cloud Enterprise (C2E) program. The intelligence community had been using a single-vendor approach under C2S since 2014, and Sherman initiated the follow-on C2E effort to bring a multi-vendor, multi-cloud capability to the IC in 2020, with Amazon Web Services, Microsoft, Google, Oracle and IBM serving as vendors.

“I’ll also admit this freely — C2E was the model for what became JWCC at DOD,” he said.

Leaning into hard decisions

Sherman was brought into the Defense Department as the principal deputy CIO in 2020, and later replaced then-CIO Dana Deasy when Deasy left the position in 2021. Although the department was grappling with many problems in its IT enterprise then, the next CIO will still face a number of other issues, he said.

“I don’t know what the next hard decision is going to be, but be ready to lean into that,” he said. 

Still, Sherman touted the accomplishments he made during his time at the Pentagon, especially related to the department’s pivot to JWCC and the awards made to Google, Oracle, Amazon Web Services and Microsoft for the program at the end of 2022.

He noted that over $700 million worth of task orders across all three security classifications have been awarded through JWCC to date, with organizations like the F-35 Joint Program Office, defense agencies and combatant commands all on board with the program.

JWCC’s growth has also initiated the Pentagon’s new Joint Operational Edge (JOE) initiative to provide cloud capabilities at the tactical edge — a concept he calls the “lily pad.” One JOE cloud has already been installed at Joint Base Pearl Harbor-Hickam in Hawaii, another is coming online next in Japan, and the Pentagon is currently looking at sites for a third one in Europe, he said.

“One of the big things that we talk about a lot with cloud tradecraft is procuring cloud is not the end of the story. You have to learn how to use it, you have to learn how to apply it to your mission,” Sherman noted.

As it prepares for the next phase of the program, dubbed JWCC 2.0, Sherman has directed the CIO’s team to conduct an after-action review of the entire effort. 

“While I’m a huge fan of it, I know it’s not perfect. Because like with C2E, we’re kind of figuring out how to walk and chew gum in a multi-vendor environment,” he said. “What can we do better for JWCC 2.0? Are there things we can put into place to make [software-as-a-service] offerings easier to manage?”

Along with cloud modernization, Sherman has led efforts to improve user experience at the department by creating a UX portfolio management office at the CIO, to fix the lengthy authority to operate (ATO) process in response to complaints from industry, and to move the Pentagon toward adopting a zero-trust cybersecurity framework by 2027.

In a statement to DefenseScoop, Deputy Secretary of Defense Kathleen Hicks praised Sherman for positioning the department for success while he served as CIO.

“John tackled some of the most complex challenges in the Department during his tenure, advancing the Department’s information advantage and improving our decision superiority, from the combatant commander down to the platoon leader,” Hicks said. “His leadership on ground-breaking initiatives such as the Joint Warfighting Cloud Capability, Zero Trust Architecture, and the Emerging Mid-Band Spectrum Sharing assessment materially strengthened US national security.”

A key challenge for the department moving forward will be to ensure it is modernizing at the pace it needs to, all while leveraging industry capabilities when it can, he said.

“As we talk big thoughts about edge cloud and transport and zero trust, never forget that it comes down to a service member’s ability or civilian’s ability to do their job — not only at the Pentagon, but out at Osan Air Base in Korea, or onboard a ship in the Red Sea, or at a special forces detachment in Africa,” Sherman emphasized.

Another will be tackling the Pentagon’s growing tech debt, he added. Warfighters are still using a lot of outdated technology from previous conflicts in the Middle East, and Sherman noted that understanding that priority and leveraging the entire enterprise to address it quickly is crucial for the department.

“We’ve got to pay the piper on this because in the digital battlefield that we’ve seen in places like Ukraine and what we could have to face in the western Pacific, these digital IT capabilities are war-winning technologies,” Sherman said. “It’s not just blinky lights and data centers, this is the difference for decision capability for our commanders.”

When asked what advice he would give to the next DOD CIO, Sherman emphasized the importance of working as a team with all of the departments and components at the Pentagon, as well as collaborating with industry as much as possible.

Leslie Beavers, DOD’s principal deputy CIO, will serve as acting CIO after Sherman departs, until the department decides on a full-time replacement.

He also pointed to the importance of strong leadership when making hard decisions and setting a clear north star for some of the departments where change might be a heavy lift.

“This has been the greatest opportunity I’ve had professionally, but also I’d be lying if I didn’t say it’s the most challenging,” Sherman said. “So that would be my advice to the next CIO: Buckle your chin strap and get ready, because this is going to be a heck of a ride.”

IC, DOD want to get better at contracting for commercial space-based data and analytic services
DefenseScoop | Tue, 22 Aug 2023
https://defensescoop.com/2023/08/22/ic-dod-want-to-get-better-at-contracting-for-commercial-space-based-data-and-analytic-services/

A new study directed by ODNI seeks to “examine ways to overcome barriers to use of commercial remote sensing/space-based data and analytic services in the Intelligence Community and Department of Defense.”

Amid growing interest from intelligence agencies and the Pentagon in buying remote sensing data and analytic services from commercial providers, the Office of the Director of National Intelligence is probing industry on what hindrances vendors face when contracting with the U.S. government.

A new study directed by ODNI seeks to “examine ways to overcome barriers to use of commercial remote sensing/space-based data and analytic services in the Intelligence Community and Department of Defense,” according to a request for information posted to Sam.gov on Monday.

Responses to the RFI are intended to help the IC and Pentagon identify those barriers, and they could assist the organizations in developing solutions and inform future funding decisions, the document added.

The office is specifically interested in two capabilities: commercial overhead data and commercial overhead analytic services.

The RFI defines commercial overhead data as “unprocessed and/or processed signals or images” purchased from a space-based provider in industry, including radio frequency data, communications intelligence, electronic intelligence and geolocation.

Overhead analytics services are the “products, analytics, or services derived using space-based commercial remote sensing capabilities,” which could include geospatial information derived or not derived from images, as well as finished analytics and products, according to the document.

In recent years, the Defense Department has worked to break down bureaucratic barriers in order to take advantage of capabilities available in a burgeoning commercial space industry. The Space Force in June broadened its Commercial Services Office to maximize opportunities for partnering with vendors, and it’s collaborating closely with the National Reconnaissance Office and the National Geospatial-Intelligence Agency, according to the Space Force.

At the same time, members of the intelligence community run a Commercial Space Council focused on leveraging commercial satellite data, analysis and services. 

“Although it is the policy of the United States to eliminate impediments to the timely delivery of space capabilities and accelerate the use of commercial capabilities, frequently commercial industry encounters challenges to working with the U.S. government and spending on commercial analytic products remains relatively small compared to spending on commercial satellite data,” the RFI said. 

The office is asking industry to respond to a survey of 13 questions by Sept. 22. The queries are intended to solicit details about the specific challenges respondents have encountered while working with the Defense Department and intelligence community, how cybersecurity requirements influence contracting, what artificial intelligence and machine learning tools respondents are using, and more.

ODNI may hold an invitation-only focus group to discuss the topic more in depth based on the number of responses, according to the notice. 

Senate’s intelligence authorization bill questions ‘reverse engineering’ of government-recovered UAPs https://defensescoop.com/2023/06/27/senates-intelligence-authorization-bill-questions-reverse-engineering-of-government-recovered-uaps/ https://defensescoop.com/2023/06/27/senates-intelligence-authorization-bill-questions-reverse-engineering-of-government-recovered-uaps/#respond Tue, 27 Jun 2023 15:14:44 +0000 https://defensescoop.com/?p=70748 Unidentified anomalous phenomena refers to the government’s modern term for multi-domain UFOs.

The post Senate’s intelligence authorization bill questions ‘reverse engineering’ of government-recovered UAPs appeared first on DefenseScoop.

Buried in the Senate’s approved text of the Intelligence Authorization Act (IAA) for fiscal 2024 are inclusions that would direct deeper transparency regarding government encounters with unidentified anomalous phenomena and any associated attempts made to date to inspect or reverse engineer recovered, unexplainable craft or materials. 

The proposed legislative language in the annual authorization bill comes just after reports emerged from a former Pentagon official-turned-whistleblower alleging that the U.S. has, or once had, what could be spacecraft of non-human origin in its UAP research arsenal. So far, lawmakers have not responded to those claims, which have not been substantiated by official records or evidence to date.

But in the latest version of the IAA introduced in the Senate last week, lawmakers incorporated a mandate for any person currently or formerly under contract with the federal government that “has in their possession material or information provided by or derived from the” government relating to UAP — “that formerly or currently is protected by any form of special access or restricted access” — to notify Dr. Sean Kirkpatrick, director of the Pentagon’s new All-domain Anomaly Resolution Office (AARO), within 60 days of the bill’s enactment.

No later than 180 days after the IAA’s passage, the officials would also need to make “all such material and information” and “a comprehensive list of all non-earth origin or exotic [UAP] material” available to AARO for “assessment, analysis, and inspection.”

Restricted and special access programs involve sensitive information at classified or higher security levels. 

Further, the text of Sec. 1104 of this version of the IAA states that “no amount authorized to be appropriated or appropriated by this act or any other act may be obligated or expended, directly or indirectly, in part or in whole, for, on, in relation to, or in support of activities involving [UAP] protected under any form of special access or restricted access limitations” that have not been “formally, officially, explicitly, and specifically described, explained, and justified” to the AARO director, and congressional leadership.

The legislation notes that it applies to “any activities relating to the following”:

  1. Recruiting, employing, training, equipping, and operations of, and providing security for, government or contractor personnel with a primary, secondary, or contingency mission of capturing, recovering, and securing unidentified anomalous phenomena craft or pieces and components of such craft.
  2. Analyzing such craft or pieces or components thereof, including for the purpose of determining properties, material composition, method of manufacture, origin, characteristics, usage and application, performance, operational modalities, or reverse engineering of such craft or component technology.
  3. Managing and providing security for protecting activities and information relating to unidentified anomalous phenomena from disclosure or compromise.
  4. Actions relating to reverse engineering or replicating unidentified anomalous phenomena technology or performance based on analysis of materials or sensor and observational information associated with unidentified anomalous phenomena.
  5. The development of propulsion technology, or aerospace craft that uses propulsion technology, systems, or subsystems, that is based on or derived from or inspired by inspection, analysis, or reverse engineering of recovered unidentified anomalous phenomena craft or materials.
  6. Any aerospace craft that uses propulsion technology other than chemical propellants, solar power, or electric ion thrust.

In recent years, intelligence and defense authorization bills have been used as mechanisms to pass UAP-related legislation. According to an executive summary released last week, the Senate’s National Defense Authorization Act for fiscal 2024 requests additional funding for AARO.

Building security resilience across global missions with next-gen firewalls https://fedscoop.com/building-security-resilience-across-global-missions-with-next-gen-firewalls/ https://fedscoop.com/building-security-resilience-across-global-missions-with-next-gen-firewalls/#respond Mon, 05 Jun 2023 19:32:08 +0000 https://defensescoop.com/?p=69497 Reducing security complexity doesn’t require sacrificing information security for defense and intelligence community organizations, says a new report.

The post Building security resilience across global missions with next-gen firewalls appeared first on DefenseScoop.

Reducing security complexity doesn’t require sacrificing information security for defense and intelligence community organizations, says a new report.

Intelligence agencies confronting challenges with multi-cloud environments https://defensescoop.com/2023/05/30/intelligence-agencies-confronting-challenges-with-multi-cloud-environments/ https://defensescoop.com/2023/05/30/intelligence-agencies-confronting-challenges-with-multi-cloud-environments/#respond Tue, 30 May 2023 18:35:30 +0000 https://defensescoop.com/?p=69114 The IC does not currently have an overarching cloud governance model. 

The post Intelligence agencies confronting challenges with multi-cloud environments appeared first on DefenseScoop.

While intelligence agencies are making progress generating modern cloud environments that underpin secure IT services and reliable access to their secretive data workloads, they’re also confronting unique challenges associated with operating in multi- and hybrid-cloud constructs, according to senior officials.

Broadly, multi-cloud computing models involve two or more public cloud options, and hybrid cloud computing refers to environments with a mix of private (or enterprise-hosted) and public cloud services.

Google, Oracle, Amazon Web Services and Microsoft are competing for task orders via the Defense Department’s enterprise cloud initiative, the Joint Warfighting Cloud Capability (JWCC). The intelligence community’s multi-cloud construct, Commercial Cloud Enterprise (C2E), is similar to JWCC and incorporates the same vendors, as well as IBM.

Awarded in 2020, C2E is a 15-year contract.

At this point, though, U.S. intel organizations “don’t have a multi-cloud/hybrid architecture at the IC level that would allow us to freely be able to exchange information with one another — and we don’t have a catalog … for [sharing] datasets,” Fred Ingham said last week during a panel at the annual GEOINT Symposium. 

Ingham is a CIA employee who’s currently on detail as the deputy chief information officer at the National Reconnaissance Office.

“In the old days, if I were to create a system that needed to take data from a spaceborne asset and write it very quickly to memory, process that data, do analysis on that data, eventually come up with some intelligence and perhaps store it in a repository — what I might build is I might create a very high-speed network” and a storage area network, he said. He added that he’d also buy “purpose-built servers” and a database for processing, among other assets.

The government would approve that system for storing information only after “I knew precisely how all of those bits and pieces work together,” Ingham explained.

“Now, let’s fast forward into a multi-cloud construct” with that same system — “completely contrived,” he said — offering a hypothetical to demonstrate current challenges. 

“So we’re downloading the same bits and I’m going to choose to put that into Google, because I like their multicast capability, so we’re going to write those bits very quickly into Google. And then I’m going to process them. And let’s just say I’ve got my processing already in AWS, I’ve got heavy GPUs there. So, I want to process that in AWS. And I happen to like Microsoft’s [machine learning] algorithms, so I’m going to do the analysis there, inside of Azure. And this intelligence that I accrue, I’m going to go store this in an Oracle database. I didn’t leave out IBM, it’s just IBM is on the high side. Alright, so I want to do that — [but] I can’t do it,” Ingham said. 
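Ingham’s hypothetical can be summarized as a four-stage pipeline with each stage pinned to a different provider. The sketch below is purely illustrative — the stage names, provider mapping and hop-counting logic are assumptions drawn from his quoted example, not a real IC workflow:

```python
# Illustrative sketch of the cross-cloud pipeline Ingham describes.
# Every boundary between stages on different providers implies an
# inter-cloud transfer that, as he notes, the IC cannot yet make
# frictionless (no shared architecture, exchange, or governance model).

PIPELINE = [
    ("ingest",  "Google", "high-speed multicast write of downlinked bits"),
    ("process", "AWS",    "GPU-heavy processing of the raw data"),
    ("analyze", "Azure",  "machine-learning analysis"),
    ("store",   "Oracle", "resulting intelligence stored in a database"),
]

def cross_cloud_hops(pipeline):
    """Count the inter-cloud transfers the pipeline would require."""
    providers = [provider for _, provider, _ in pipeline]
    return sum(1 for a, b in zip(providers, providers[1:]) if a != b)

if __name__ == "__main__":
    for stage, provider, note in PIPELINE:
        print(f"{stage:8} -> {provider:7} ({note})")
    print("inter-cloud transfers required:", cross_cloud_hops(PIPELINE))
```

Even in this toy version, three of the four stage boundaries cross a provider seam — each one a point where networking, identity management and data-movement tooling would have to interoperate.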

He spotlighted reasons why officials can’t yet make this move-across-a-multi-cloud vision a reality.

“Number one, [the IC] acquired five cloud vendors, and we didn’t have a strategy or an architecture about how all of those things would fit together and work with one another,” Ingham said. 

Beyond that, the intel community does not currently have an overarching cloud governance model. 

Ingham noted at the conference he spoke to a representative from IBM, who told him about a commercial “cloud exchange, where each of those cloud providers are sitting in that same data center, and therefore they have the same type of networking capabilities — and so transport between the clouds are equal.”

“We don’t have that in the IC today,” he pointed out.

He highlighted a present lack of capacity to deterministically understand the performance of each cloud, onboarding tools, operational support, identity management, how data moves and comprehensive situational awareness across the cloud service providers, among other issues. 

“What I like to think about is frictionless computing, that’s not frictionless — and until we solve those issues, I don’t see us being able to use the multi-cloud in the manner that I just described,” Ingham said. 

On the panel, leaders from other intelligence agencies also reflected on the benefits and obstacles of their unfolding, government cloud deployments.

“The government has to do a better job in defining requirements — functional requirements — and more importantly, as you go towards a potential conflict with China, the operational requirements, or the operational scenarios in which you’re expected to run and deliver solutions [via the cloud]. I think we in the government have not done an appropriate job of that to our IT solution providers,” the Defense Intelligence Agency’s Deputy Chief Information Officer E.P. Mathew said.

Meanwhile, the National Security Agency is “already very far along on its multi-cloud journey,” according to NSA’s Deputy Chief Information Officer Jennifer Kron. Officials there “truly believe in finding the right computing solution for each mission” and purpose, she said, and so they are leveraging services from multiple providers.

The National Geospatial-Intelligence Agency started moving “everything” to the cloud in 2015. But by 2016, officials “very quickly found out” that moving all the workloads “wasn’t really the smart thing to do,” NGA’s Director for Chief Information Officer and IT Services Mark Chatelain said. Now, the agency is using the C2E contract to diversify its cloud holdings, he noted, with aims to “figure out how to smartly use the multi-cloud” over the next few years.

Recently, NGA has been requesting that industry provide “something like a single-pane-of-glass view of a multi-cloud” ecosystem, Chatelain said — “so, you don’t have to go to a Google window or an Oracle window, you basically have a single-pane-of-glass window that you can manage all of the clouds.”

NGA also wants more affordable applications to move data and capabilities, as well as direct connections between the clouds to expedite information transfer.

“Imagery, as you know, consumes a huge amount of data. NGA brings in about 15 terabytes per day of imagery into their facilities, today. And that’s predicted to grow probably about 1,000% in the next coming six or seven years. So we’ve got to have the connectivity between the clouds to be able to share that information,” Chatelain noted.
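As a rough back-of-the-envelope check on Chatelain’s figures — assuming “grow probably about 1,000%” means roughly a tenfold increase, which the article does not make precise — the projected intake works out to something on the order of 150 terabytes per day:

```python
# Back-of-the-envelope projection of NGA's daily imagery intake,
# assuming "grow about 1,000%" means roughly a tenfold increase
# over the next six to seven years (an interpretation, not a stated fact).
current_tb_per_day = 15   # terabytes of imagery NGA ingests today
growth_factor = 10        # ~1,000% growth

projected_tb_per_day = current_tb_per_day * growth_factor
print(f"projected intake: ~{projected_tb_per_day} TB/day")
```

A volume of that size per day, moved between multiple clouds, is the scale driving the call for direct inter-cloud connectivity.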

He and other officials suggested that cloud providers should recommend an architecture and appropriate path forward. They were hopeful that could soon be in the pipeline.

“I had the opportunity to be with all of the cloud vendors yesterday and today — and without exception, every one of them is very much in favor of exactly that. They know they bring something to the fight that nobody else does, and they know that their competitors bring something to the fight that they can’t bring,” Chatelain said.

Intelligence community working with private sector to understand impacts of generative AI https://defensescoop.com/2023/05/04/intelligence-community-generative-ai/ https://defensescoop.com/2023/05/04/intelligence-community-generative-ai/#respond Thu, 04 May 2023 20:24:22 +0000 https://defensescoop.com/?p=67516 The United States’ intelligence community is looking to engage with the private sector to help them assess the technology, U.S. Director of National Intelligence Avril Haines told lawmakers Thursday.

The post Intelligence community working with private sector to understand impacts of generative AI appeared first on DefenseScoop.

As the possibilities and potential threats of generative artificial intelligence grow, the U.S. intelligence community is looking to engage with the private sector to help it assess the technology, according to U.S. Director of National Intelligence Avril Haines.

“We’ve been writing some analysis to try to look at what the potential impact is on society in a variety of different realms, and obviously we see some impact in intelligence activities,” Haines told the Senate Armed Services Committee on Thursday. “What we also recognize is that we do not yet have our hands around what the potential is.”

Generative AI is an emerging subfield of artificial intelligence that uses large language models to turn prompts from humans into AI-generated audio, code, text, images, videos and other types of media. Platforms that leverage the technology — like ChatGPT, BingAI and DALL-E 2 — have gone viral in recent months.

While leaders within the U.S. government have acknowledged the technology’s usefulness to assist workers, some have expressed concern over how generative AI will affect daily life and have called for better collaboration with industry to help understand its capabilities.

Within the intelligence community, Haines said that many organizations have assembled task forces made of experts in the field of artificial intelligence to comprehend the technology’s impact.

“We have been trying to facilitate groups of experts in the IC to essentially connect with those in the private sector who are on the cutting edge of some of these developments, so that we can make sure that we understand what they see as potential uses and developments in this area,” she added. 

Haines’ comments came on the same day as a meeting between Vice President Kamala Harris and the CEOs of Alphabet, Anthropic, Microsoft and OpenAI — four pioneers of generative AI. The meeting is intended “to underscore this responsibility and emphasize the importance of driving responsible, trustworthy, and ethical innovation with safeguards that mitigate risks and potential harms to individuals and our society,” according to a White House fact sheet on new administration actions to promote responsible AI innovation.

In addition, Anthropic, Google, Hugging Face, Microsoft, NVIDIA, OpenAI and Stability AI have announced that they will open their large language models to red-teaming at the upcoming DEF CON hacking conference in Las Vegas as part of the White House initiative. The event will be the first public assessment of large language models, a senior administration official told reporters on condition of anonymity.

“Of course, what we’re drawing on here, red-teaming has been really helpful and very successful in cybersecurity for identifying vulnerabilities,” the official said. “That’s what we’re now working to adapt for large language models.”

Haines is not the first U.S. government leader to call for closer collaboration with industry on generative AI. For example, Defense Information Systems Agency Director Lt. Gen. Robert Skinner pleaded with industry on Tuesday to help the Defense Department understand how it could leverage the technology better than adversaries. 

“Generative AI, I would offer, is probably one of the most disruptive technologies and initiatives in a very long, long time,” Skinner said during a keynote speech at AFCEA’s TechNet Cyber conference in Baltimore. “Those who harness that [and] that can understand how to best leverage it … are going to be the ones that have the high ground.”

As the technology continues to evolve at a rapid pace, some experts in industry have advocated for a temporary ban on generative AI so that vendors and regulators can have time to create standards for its use. But others at the Pentagon have argued against such a moratorium.

“Some have argued for a six-month pause — which I personally don’t advocate towards, because if we stop, guess who’s not going to stop: potential adversaries overseas. We’ve got to keep moving,” Department of Defense Chief Information Officer John Sherman said Wednesday at the AFCEA TechNet Cyber conference in Baltimore.

NGA picks 13 companies to compete through $900M intelligence support contracting vehicle https://defensescoop.com/2023/05/02/nga-picks-13-companies-to-compete-through-900m-intelligence-support-contracting-vehicle/ https://defensescoop.com/2023/05/02/nga-picks-13-companies-to-compete-through-900m-intelligence-support-contracting-vehicle/#respond Tue, 02 May 2023 20:26:04 +0000 https://defensescoop.com/?p=67341 The agency is pretty tight-lipped about the work GEO-SPI B will fundamentally enable, but a spokesperson shared some details with DefenseScoop on Tuesday.

The post NGA picks 13 companies to compete through $900M intelligence support contracting vehicle appeared first on DefenseScoop.

The National Geospatial-Intelligence Agency has officially tapped 13 companies to compete to supply a range of technologies and mission support services via its major multiple-award indefinite-delivery/indefinite-quantity (IDIQ) contracting vehicle for national security-aligned intelligence capabilities.

A list of the entities that landed spots on the GEOINT Enterprise Operations Service and Solutions Program with Industry, Core Mission Operations (GEO-SPI B) contract was included in a federal contracting award notice posted online on Monday afternoon.

“The companies selected for the IDIQ will compete for individual task orders across the seven-year ordering period, collectively worth up to $900 million,” an NGA spokesperson told DefenseScoop in an email on Tuesday.

The spy agency has been pretty tight-lipped about the work GEO-SPI B will fundamentally enable. Contracting materials that would shed light on that are accessible only to individuals and businesses approved by the U.S. government to use the Intelligence Community Acquisition Research Center. 

“This suite of IDIQ contracts provide NGA’s core contracted intelligence and foundational analysis encompassing the tasking, collection, processing, exploitation, and dissemination (TCPED) functions that underpin GEOINT work,” according to the agency’s spokesperson.

The 13 approved contractors include:

  • 3GIMBALS
  • BAE Systems
  • Booz Allen Hamilton
  • Castalia Systems
  • Continental Mapping Consultants
  • Geo Owl
  • Leidos
  • ManTech
  • Novetta 
  • ProCleared
  • Royce Geospatial Consultants
  • Solis Applied Science
  • Thomas & Herbert Consulting 

Once selections are made under this IDIQ, associated work will be performed at NGA sites and in partner facilities.
