IBM Archives | DefenseScoop

Former Pentagon CDAO Radha Plumb takes AI transformation role at IBM

As part of her role, Plumb will be IBM's "Client Zero," meaning she will internally operationalize AI technologies and concepts to test them before deploying to clients.

After stepping down in January from leading the Department of Defense’s Chief Digital and Artificial Intelligence Office at the end of the Biden administration, Radha Plumb has taken a role at IBM, leading what the firm calls “AI-first transformation.”

As vice president of AI-first transformation, Plumb will spearhead IBM’s Next-Generation Transformation Strategy and work across the company’s core business lines to foster adoption of AI, automation and hybrid cloud computing throughout the global organization and with its clients and partners.

Plumb started in the role July 14.

A key part of her job, Plumb told DefenseScoop, will be serving as IBM’s “Client Zero,” meaning she will internally operationalize AI technologies and concepts to test them before deploying to clients.

“The approach is really taking AI solutions and embedding them in the company’s own processes and then using that to prove out how AI solves problems, drives agility, creates efficiencies, which IBM then can use to help demonstrate that value for its customers, right? So, this is an internal transformation role, but with an eye towards building out concrete examples of execution for external consumption,” Plumb told DefenseScoop.

That’s not so dissimilar from her role leading the CDAO, which serves as a central hub for accelerating and spreading the adoption of AI, data and analytics capabilities across the U.S. military. She likened it to the work of CDAO’s Rapid Capabilities Cell, which has been responsible for ushering in major contracts with frontier AI models.

Likewise, IBM is very focused on “scaled adoption at the enterprise level,” Plumb said.

“So how can you get AI tools into the hands of your workforce, and do it in a way that, rather than AI as a substitute for all the humans, you team AI with the humans to drive efficiency and productivity?” she said.

Plumb explained: “IBM’s big bet is … how can we do this as an enterprise transformation and really kind of drive the AI transformation vision in concrete ways through businesses.”

In particular, she sees an opportunity for IBM in working with her former employer, the Pentagon, and the federal government at large on the business side with applications, for example, managing supply chains, logistics, contracting and more.

“That’s where I think there’s a lot of potential for rapid movement of things we find that work in IBM and applications to the federal sector,” Plumb said.

After Plumb’s departure from the CDAO in January, the office was led by Margie Palmieri, the deputy CDAO, until DOD leadership named Douglas Matty as its new chief in April. Matty previously founded the Army AI Integration Center under Army Futures Command, which he led between 2020 and 2022. Last week, DefenseScoop reported that Palmieri, one of the CDAO’s longest-tenured leaders, is the latest in a raft of officials to depart the organization.

Inside IBM’s complex effort to create and scale next-gen quantum supercomputers

Researchers and engineers at IBM are hustling to build out a large-scale, fault-tolerant quantum supercomputer by 2029.

Researchers and engineers at IBM are hustling to build out a large-scale, fault-tolerant quantum supercomputer by 2029, according to a senior executive steering that work.

In the process, the team is generating cutting-edge devices that can no longer be modeled with classical computing architectures. And the company recognizes the U.S. government’s need for quantum-safe security measures.

“The next [four to five years are] probably going to be the most exciting time in quantum computing. You’ve gone from it being a scientific exploration into the hardware, to how do we actually start using these quantum computers to do real things and scale them up,” said Jay Gambetta, IBM’s vice president for quantum. 

A leading quantum computing expert with more than two decades of professional experience, Gambetta joined IBM in 2011 — about seven years after he moved to the U.S. from Australia. 

“I was the first to pull [an IBM-built] quantum computer onto the cloud” back in 2016, he said.

Gambetta recently hosted a small group of journalists at the company’s Washington, D.C., office for a roundtable to discuss some of IBM’s recent quantum computing advancements, and his team’s roadmap and vision for what’s to come.

“At a high level, the U.S. has always been a leader in quantum computing, and as we go into this new, next phase, we have to remain a leader in both continuing to build the best quantum computers and also being the first to use them for useful applications,” he said.

Quantum computing is considered an alternative computational paradigm and, broadly, applies certain laws of physics to digital information processing. It’s part of the emerging and disruptive technology field of quantum information science (QIS) that uses complex phenomena happening at atomic and subatomic levels to store, transmit, manipulate, process and measure information.
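For readers new to the model, the basic objects are small enough to write down. The sketch below (Python with numpy, an illustrative toy rather than anything IBM-specific) represents one qubit as a two-component complex vector and applies a Hadamard gate to put it into superposition:

```python
import numpy as np

# A qubit is a unit vector of two complex amplitudes; |0> = (1, 0).
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate, a unitary matrix, puts the qubit into an equal
# superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5], a fair coin until measured
```

Each added qubit doubles the number of amplitudes in that vector, which is what eventually puts real quantum devices beyond classical simulation.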

The Defense Department and other federal agencies have been increasingly prioritizing quantum-enabling activities and investments as proponents largely predict it will lead to revolutionary breakthroughs — like an unhackable internet — in the not-so-distant future.

IBM is in a small cadre of major U.S. businesses, including Google and Microsoft, vying to be the first to generate a quantum system that can outperform all the classical computers that came before.

“The whole purpose of quantum computing, and the whole thesis of it, is that there are algorithms that exist that no classical computer can ever simulate. Most famous of it is Shor’s algorithm, but more practical in terms of commercial applications is simulating chemistry, simulating materials, simulating biology, simulating finance, doing some risk calculation, some optimization. They are the reason we’re all doing it,” he explained.

On the heels of recent innovations, IBM aims to start experimenting with using a quantum computer as a scientific tool to discover new algorithms. 

Gambetta noted that the field of AI optimization is based on heuristic algorithms, or those that use approximate solutions to puzzle out answers to complex problems.
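As a concrete example of that heuristic style, the sketch below (Python, illustrative only and not tied to any IBM system) uses simulated annealing, a classic heuristic that trades guaranteed optimality for good approximate answers on hard problems:

```python
import math
import random

# Minimal simulated annealing: approximately minimize f by accepting
# worse moves with a probability that decays as the "temperature" cools.
def anneal(f, x, steps=10_000, temp=1.0, cooling=0.999):
    best_x, best_val = x, f(x)
    val = best_val
    for _ in range(steps):
        candidate = x + random.uniform(-0.5, 0.5)
        cand_val = f(candidate)
        # Always accept improvements; sometimes accept regressions,
        # which lets the search escape local minima.
        if cand_val < val or random.random() < math.exp((val - cand_val) / temp):
            x, val = candidate, cand_val
            if val < best_val:
                best_x, best_val = x, val
        temp *= cooling
    return best_x, best_val

# Example: a bumpy 1-D function with many local minima.
print(anneal(lambda x: x * x + 3 * math.sin(5 * x), x=4.0))
```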

“We, as a society, have benefited from algorithm discovery, and we continue to benefit from it. Now we have something that we’ve got to add to it. And I expect that in the next year or two, we will see legitimate demonstrations of quantum advantage or supremacy. I prefer the word ‘advantage,’ but you call it whatever you want. [But it means] where we’ll actually see a quantum computer do something cheaper, faster, or more cost-effective than the classical computer alignment,” he said. 

There’s been rising hype over the last decade around possibilities for the future of quantum. Gambetta acknowledged that, as well as critics’ responses, but said he’s confident IBM is on the cusp of some next-level breakthroughs.

“In some sense, to me, we’re in this most exciting time. We’re now getting devices that you can no longer simulate with classical computing,” he said.

Early government focus and investments from defense and intelligence research labs zoned in on proving the existence of quantum bits, or qubits, the basic unit of information used to encode data in quantum computing. 

The congressionally mandated National Quantum Initiative (NQI) enacted in 2018 accelerated U.S. progress when momentum was already swiftly building, Gambetta noted. And alongside new innovations in cloud access, it meant the government’s focus shifted from fundamental research science to puzzling out and pursuing more practical applications. 

“Now we have 100-qubit machines that are beyond what we can simulate. So, you can build whatever classical computer you want but you can’t simulate it to do the pen-and-paper algorithms that we all want to do for quantum computing,” he explained.
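Some quick arithmetic shows why. A brute-force classical simulation stores one complex amplitude per basis state, and the number of basis states doubles with each added qubit. The sketch below (Python, a back-of-the-envelope illustration only) computes the memory a full state vector would require:

```python
# A full state vector needs 2**n complex amplitudes,
# 16 bytes each at complex128 precision.
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 50, 100):
    print(f"{n} qubits: {state_vector_bytes(n):.3e} bytes")
# 30 qubits fits in ~17 GB; 50 needs ~18 PB; 100 needs ~2e31 bytes,
# far beyond any conceivable classical machine.
```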

Leading experts are also now more serious than ever about ensuring the in-the-works advanced quantum systems are being built in a way that they can be quickly and safely linked up and scaled as soon as they’re fully realized.

“Quantum communication, quantum computing, quantum sensing — they were all kind of almost distinctly different [in the past]. But the future is going to actually be quantum computers connected to quantum sensors connected with quantum communication,” Gambetta said.

Responding to DefenseScoop’s questions at the roundtable, the quantum chief discussed some of the high-priority challenges that anticipated quantum innovations pose for the DOD and U.S. military. 

“One of the algorithms that quantum computers will break is RSA, which [underpins] a lot of our security,” Gambetta explained, referring to the widely used Rivest–Shamir–Adleman public-key cryptosystem for securing data transmissions.

“The quantum computers we’ve built today don’t break [it] because they’ve got to be bigger to do that. But we know, given our commercial roadmap, that will happen,” he said. “That’s why quantum computers obviously drive a lot of national security concerns.”
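To see what is at stake, consider a toy version of the period-finding trick at the heart of Shor's algorithm. The sketch below (Python, illustrative only; a real attack needs a quantum computer to find the period for cryptographically sized moduli) factors a deliberately tiny RSA-style modulus:

```python
from math import gcd

# RSA's security rests on the hardness of factoring N. Shor's algorithm
# finds the period r of a**x mod N efficiently on a quantum computer;
# here we find it by brute force, feasible only for a tiny N like this.
N, a = 15, 7
r = 1
while pow(a, r, N) != 1:
    r += 1

# For even r, gcd(a**(r//2) +/- 1, N) reveals N's factors.
# (The choice of a matters; Shor retries with random a when r is odd.)
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(r, p, q)  # r=4, factors 3 and 5
```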

Led by the National Institute of Standards and Technology, government agencies are working to transition to more protected security architectures and uncover new quantum-safe algorithms that won’t be broken by the future supercomputers.

Quantum computing further holds promise to expand DOD’s ability to solve complex logistics problems. 

Gambetta noted the technology also “changes the equation for simulating materials,” a task on which the Pentagon expends significant high-performance computing resources.

“We know this future [of IBM-run fault-tolerant quantum computers] is coming. I think how the commercialization of this gets done is a complicated question, and it’s why we’re talking so much with the people in the government,” he said.

What IBM’s new AI-enabling, brain-inspired computer chips could mean for DOD

DefenseScoop was briefed on the latest developments for the NorthPole project.

Technologists at the Air Force Research Lab, Sandia National Laboratory and other government research hubs are developing real-world use cases to inform the maturation of a new IBM-produced computer chip prototype — with a cutting-edge, brain-inspired architecture — that could dramatically change where and how the military harnesses the power of artificial intelligence, DefenseScoop has learned.

When they were approached in 2021 about funding the in-the-works chip and IBM’s underpinning NorthPole project, Assistant Secretary of Defense for Critical Technologies Maynard Holliday and his team immediately “recognized the utility and potential — I don’t want to say revolutionary, but game-changing, leap-ahead capability that this represented,” he said in an interview on Wednesday.

Dharmendra Modha, IBM Research’s NorthPole lead, has been working on what would evolve into NorthPole since around 2004. On Thursday, he published results of his team’s research in Science, arguing that this new AI-enabling semiconductor prototype has proven to outperform all the latest prevalent chip architectures on the market, including the most advanced. 

A major element of its promise is that all the memory for the device is stored directly on the chip — as opposed to elsewhere, like the cloud or company-owned servers.

“I would say NorthPole is the faint reflection of the brain in the middle of a silicon wafer that points to a new direction in computer architecture … What that means, the world and we will together figure it out over the years to come,” Modha noted.

In separate interviews on Wednesday, IBM’s Modha and DOD’s Holliday briefed DefenseScoop on the making and uniqueness of this new prototype, and where the U.S. military and defense researchers might take it from here.

“What this enables is that compute to be able to be done at the point of interaction. And so you can think about how that’s really valuable for the soldier and/or the platform when you’re in a GPS-[strained] or contested electromagnetic environment and the compute is being able to be done locally,” Holliday said.

‘Just scratching the surface’

According to Modha, the phrase “the architecture of the brain” refers to how the organ stores, moves and schedules information. And “technology of Mother Nature,” he explained, encompasses how the chemicals and the organic substrate of the brain implement that architecture.

“What we have taken is the architecture of the brain — the blueprint of how it computes, how it stores, how it connects, how it schedules, how it communicates and how it interacts with the world — those principles, we have extracted and implemented in an off-the-shelf silicon technology process. So it does not have any of the organic technologies of the brain, such as the chemical signaling, or anything like that. It’s taking the architecture and implementing it in today’s technology — that’s the essence of the innovation,” Modha told DefenseScoop.

One of the government’s top engineering and robotics experts and technology innovation leaders, Holliday has helped push forward a range of crucial national security-aligned capabilities. 

“This always blows my mind when I think of it. So, our brains have 10 to the 14th or 100 trillion synapses, and we have [billions of] neurons. Humans have created civilization, and our brain consumes the power of a light bulb, right? And it occupies the volume of a large soda bottle. So, you know, it’s amazing,” he said. 

Modha “is right about this being a slight reflection of the brain,” Holliday added, in that the architectures are in many ways similar. However, the NorthPole prototype chip currently has 22 billion transistors — compared with the roughly 100 trillion synapses in the human brain.

“We’re just scratching the surface with respect to architecture. But nature is, a lot of times, the best guide for … what works, because we’re still alive and thriving with our brain architecture. And so, to the extent that we can mimic that in machinery, it’s showing that it’s as powerful or more powerful than this von Neumann architecture — which is memory and central processing [unit]-separate. And you have this classic, what we call ‘von Neumann’ bottleneck — is a testament to how efficient a neuromorphic architecture is,” Holliday told DefenseScoop.

Today’s computer architecture is still largely dominated by what’s known as the von Neumann architecture, Modha explained. That term has roots that trace back to around 1945 from a description by John von Neumann, a leading defense scientist and mathematician who notably worked on the Manhattan Project to build the world’s first atomic bombs. In such architectures, memory and CPUs that process information are separated. The von Neumann bottleneck refers to the phenomenon where the time and power it takes to shuffle data between memory, processing, and any other devices within a chip can limit or affect how the corresponding system performs.
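A rough experiment makes the bottleneck tangible. In the Python sketch below (illustrative only; absolute numbers depend entirely on the host machine), an elementwise addition performs roughly one operation per element moved from memory, while a matrix multiply reuses each operand thousands of times, so the second achieves a far higher rate of useful arithmetic on the very same data:

```python
import time
import numpy as np

n = 4096
a = np.random.rand(n, n)
b = np.random.rand(n, n)

t0 = time.perf_counter()
_ = a + b        # ~n*n ops over ~3*n*n*8 bytes of memory traffic
t1 = time.perf_counter()
_ = a @ b        # ~2*n**3 ops over the same data footprint
t2 = time.perf_counter()

add_flops = n * n / (t1 - t0)
mm_flops = 2 * n ** 3 / (t2 - t1)
print(f"add: {add_flops:.2e} FLOP/s, matmul: {mm_flops:.2e} FLOP/s")
# The matmul's much higher useful FLOP/s rate shows the memory wall,
# not the arithmetic units, throttling data-movement-heavy workloads.
```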

The NorthPole prototype is designed to set a completely different path from the von Neumann architecture by colocating memory and processing on the same chip. That also enables the nascent chip to carry out AI inferencing — the process of running live data through a trained deep learning network — considerably faster than others currently on the market.

“If you look at any computer chip, broadly, it has five dimensions: computation, memory, communication, control and input-output. Along all these five dimensions, NorthPole breaks with the past,” Modha said.

NorthPole prototype (IBM photo)

NorthPole was fabricated with a 12-nm node process. According to Modha’s new research published in Science, his team tested the prototype using the ResNet-50 model, the well-known convolutional neural net for image recognition that benchmarks the performance of AI chips. 

The researchers found that NorthPole can recognize images faster than common 12-nm GPUs and 14-nm CPUs — like those now widely used and made by Nvidia and other major players — and is 25 times more energy efficient, measured in frames interpreted per joule of power consumed.

It also outperformed competitors in space and time efficiency.
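Frames per joule is simply throughput divided by power draw. The sketch below (Python, with hypothetical placeholder numbers chosen only to illustrate the ratio; the paper's measured figures are not reproduced here) shows how such a comparison is computed:

```python
# Energy efficiency = throughput / power draw;
# frames/s divided by J/s leaves frames per joule.
def frames_per_joule(frames_per_second: float, watts: float) -> float:
    return frames_per_second / watts

chip_a = frames_per_joule(frames_per_second=2500.0, watts=50.0)   # hypothetical
chip_b = frames_per_joule(frames_per_second=1000.0, watts=500.0)  # hypothetical
print(f"relative efficiency: {chip_a / chip_b:.0f}x")  # 25x with these inputs
```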

Pointing to those results and why they matter for the defense enterprise, Holliday spotlighted NorthPole’s power to ingest multiple sensor modalities, including audio, electro-optical, infrared and sonar signals; wide-area motion imagery; synthetic aperture radar; and lidar.

Autonomous vehicles can use lidar to characterize everything in their ambient environments, he noted, and electro-optical and infrared sensors can essentially use heat signatures to classify helicopters and other platforms.

“You can think about acoustic sensors that could be on a dismounted soldier or on a platform, being able to discern audio feedback from targets or just to discern, you know, signatures of weapons fires. So then you can say, ‘Alright, well, that’s, you know, this kind of weapon or that kind of weapon being fired’ … because you know what the audio signature is,” Holliday said.

“The important thing to note is it’s low, what we call ‘SWaP’ — size, weight and power,” he added regarding the AI chip prototype.

His hope is that such systems could offer longer endurance and be more densely configured to provide more compute power.

“You can think about it on the dismounted soldier, on the [unmanned aerial vehicle], or on the [unmanned underwater vehicle]. So, all of these platforms where we want to exploit man-machine teaming, where we want to do swarming. To be able to, again, have compute at the tactical edge — that makes all of these systems smarter and more capable,” Holliday told DefenseScoop.

Next moves

Modha has been working at IBM since the early 2000s on what would become NorthPole, with support from the Defense Advanced Research Projects Agency, the Air Force and others along the way.

When Holliday returned to the Pentagon in 2021 to serve in his latest post, he and his team were approached about funding IBM’s NorthPole pursuit. “Unfortunately, it had been canceled by the previous administration,” he told DefenseScoop.

“They came to me to say, ‘We are close. This is a capability we absolutely need — because it provides AI compute at the edge,’” Holliday recalled. The Office of the Secretary of Defense then opted to fund them to finish the job so that the chips could be fabricated in low numbers in the U.S. and be tested and evaluated by the government and the industrial base.

Now, that’s happening across federal, defense and other research organizations.

Each prototype AI chip has to be mounted on a field-programmable gate array before it can be integrated into more complex systems.

Looking to ultimately operationalize the prototype, Holliday confirmed that IBM hosted a transition workshop this summer to teach government lab insiders and other partners how to program those arrays so that they can puzzle out and run new use cases on the chips.

“IBM has proved that this is a leap-ahead capability. And we need to bridge one of these ‘valleys of death,’ which is getting it out of the prototype stage, and doing test and evaluation, and after that point, giving feedback to IBM and having them iterate one or two more times. And then, this goes into volume production for platforms that support both commercial, as well as for DOD applications,” Holliday said of the vision.

NorthPole has been funded most recently via OSD’s microelectronics program. For the next iteration, he’s encouraging IBM to go after investments via the CHIPS and Science Act.

“They’ve been in the lab for almost two decades and it’s ready to be ‘fab’ now — and so that’s one of the things that the CHIPS Act was designed to address is to get capabilities like this to commercial scale. So they fab it in low quantities at GlobalFoundries in New York, but to get to commercial scale they need to get it to a commercial-scale [fabrication facility]. There are some in the U.S. but not at the state-of-the-art technology nodes. And so TSMC and Intel have announced, and actually broken ground on three different fabs … but it’s going to be years before they’re online,” Holliday explained.

Modha confirmed that — beyond the exploration of next steps, even smaller architectures, and new directions for this research — his team has already “started the process of redesigning the circuit board, so as to be less vulnerable to some of the supply chain constraints.”

Intelligence agencies confronting challenges with multi-cloud environments

The IC does not currently have an overarching cloud governance model.

While intelligence agencies are making progress generating modern cloud environments that underpin secure IT services and reliable access to their secretive data workloads, they’re also confronting unique challenges associated with operating in multi- and hybrid-cloud constructs, according to senior officials.

Broadly, multi-cloud computing models involve two or more public cloud options, and hybrid cloud computing refers to environments with a mix of private (or enterprise-hosted) and public cloud services.

Google, Oracle, Amazon Web Services and Microsoft are competing for task orders via the Defense Department’s enterprise cloud initiative, the Joint Warfighting Cloud Capability (JWCC). The intelligence community’s multi-cloud construct, Commercial Cloud Enterprise (C2E), is similar to JWCC and incorporates the same vendors, as well as IBM.

Awarded in 2020, C2E is a 15-year contract.

At this point, though, U.S. intel organizations “don’t have a multi-cloud/hybrid architecture at the IC level that would allow us to freely be able to exchange information with one another — and we don’t have a catalog … for [sharing] datasets,” Fred Ingham said last week during a panel at the annual GEOINT Symposium. 

Ingham is a CIA employee who’s currently on detail as the deputy chief information officer at the National Reconnaissance Office.

“In the old days, if I were to create a system that needed to take data from a spaceborne asset and write it very quickly to memory, process that data, do analysis on that data, eventually come up with some intelligence and perhaps store it in a repository — what I might build is I might create a very high-speed network” and a storage area network, he said. He added that he’d also buy “purpose-built servers” and a database for processing, among other assets.

The government would approve that system for storing information only after “I knew precisely how all of those bits and pieces work together,” Ingham explained.

“Now, let’s fast forward into a multi-cloud construct” with that same system — “completely contrived,” he said — offering a hypothetical to demonstrate current challenges. 

“So we’re downloading the same bits and I’m going to choose to put that into Google, because I like their multicast capability, so we’re going to write those bits very quickly into Google. And then I’m going to process them. And let’s just say I’ve got my processing already in AWS, I’ve got heavy GPUs there. So, I want to process that in AWS. And I happen to like Microsoft’s [machine learning] algorithms, so I’m going to do the analysis there, inside of Azure. And this intelligence that I accrue, I’m going to go store this in an Oracle database. I didn’t leave out IBM, it’s just IBM is on the high side. Alright, so I want to do that — [but] I can’t do it,” Ingham said. 
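The friction Ingham describes shows up even in the simplest possible version of that pipeline's first hop. The sketch below (Python; bucket and object names are hypothetical, and it assumes the google-cloud-storage and boto3 libraries with credentials already configured) copies one downlinked object from Google Cloud Storage into Amazon S3, a transfer that crosses provider networks, identity systems and egress billing in a single step:

```python
import boto3
from google.cloud import storage

# Pull the object down from Google Cloud Storage...
gcs = storage.Client()
blob = gcs.bucket("downlink-bucket").blob("pass-0001/raw.bin")  # hypothetical names
payload = blob.download_as_bytes()

# ...and push it into AWS for GPU-heavy processing. Each hop like this
# means separate credentials, separate networks and separate egress costs.
s3 = boto3.client("s3")
s3.put_object(Bucket="processing-bucket", Key="pass-0001/raw.bin", Body=payload)
```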

He spotlighted reasons why officials can’t yet make this move-across-a-multi-cloud vision a reality.

“Number one, [the IC] acquired five cloud vendors, and we didn’t have a strategy or an architecture about how all of those things would fit together and work with one another,” Ingham said. 

The intel community does not currently have an overarching cloud governance model. 

Ingham noted that at the conference he spoke to a representative from IBM, who told him about a commercial “cloud exchange, where each of those cloud providers are sitting in that same data center, and therefore they have the same type of networking capabilities — and so transport between the clouds are equal.”

“We don’t have that in the IC today,” he pointed out.

He also highlighted a current inability to deterministically understand each cloud’s performance, along with gaps in onboarding tools, operational support, identity management, insight into how data moves, and comprehensive situational awareness across the cloud service providers, among other issues.

“What I like to think about is frictionless computing, that’s not frictionless — and until we solve those issues, I don’t see us being able to use the multi-cloud in the manner that I just described,” Ingham said. 

On the panel, leaders from other intelligence agencies also reflected on the benefits and obstacles of their unfolding, government cloud deployments.

“The government has to do a better job in defining requirements — functional requirements — and more importantly, as you go towards a potential conflict with China, the operational requirements, or the operational scenarios in which you’re expected to run and deliver solutions [via the cloud]. I think we in the government have not done an appropriate job of that to our IT solution providers,” the Defense Intelligence Agency’s Deputy Chief Information Officer E.P. Mathew said.

Meanwhile, the National Security Agency is “already very far along on its multi-cloud journey,” according to NSA’s Deputy Chief Information Officer Jennifer Kron. Officials there “truly believe in finding the right computing solution for each mission” and purpose, she said, and so they are leveraging services from multiple providers.

The National Geospatial-Intelligence Agency started moving “everything” to the cloud in 2015. But by 2016, officials “very quickly found out” that moving all the workloads “wasn’t really the smart thing to do,” NGA’s Director for Chief Information Officer and IT Services Mark Chatelain said. Now, the agency is using the C2E contract to diversify its cloud holdings, he noted, with aims to “figure out how to smartly use the multi-cloud” over the next few years.

Recently, NGA has been requesting that industry provide “something like a single-pane-of-glass view of a multi-cloud” ecosystem, Chatelain said — “so, you don’t have to go to Google window or an Oracle window, you basically have a single-pane-of-glass window that you can manage all of the clouds.”

NGA also wants more affordable applications to move data and capabilities, as well as direct connections between the clouds to expedite information transfer.

“Imagery, as you know, consumes a huge amount of data. NGA brings in about 15 terabytes per day of imagery into their facilities, today. And that’s predicted to grow probably about 1,000% in the next coming six or seven years. So we’ve got to have the connectivity between the clouds to be able to share that information,” Chatelain noted.
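Some rough arithmetic (illustrative only, and assuming “1,000%” growth means roughly a tenfold increase) shows the sustained bandwidth that daily volume implies:

```python
TB = 10 ** 12
today_bytes = 15 * TB            # ~15 TB/day of imagery today
future_bytes = today_bytes * 10  # ~150 TB/day after roughly 10x growth

for label, daily in (("today", today_bytes), ("projected", future_bytes)):
    gbps = daily * 8 / 86_400 / 10 ** 9  # bits per second, in Gbit/s
    print(f"{label}: ~{gbps:.1f} Gbit/s sustained")
# today: ~1.4 Gbit/s; projected: ~13.9 Gbit/s, and that is for a single
# copy, before any replication across five providers.
```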

He and other officials suggested that cloud providers should recommend an architecture and appropriate path forward. They were hopeful that could soon be in the pipeline.

“I had the opportunity to be with all of the cloud vendors yesterday and today — and without exception, every one of them is very much in favor of exactly that. They know they bring something to the fight that nobody else does, and they know that their competitors bring something to the fight that they can’t bring,” Chatelain said.
