Intelligence agencies confronting challenges with multi-cloud environments


While intelligence agencies are making progress building modern cloud environments that underpin secure IT services and reliable access to their sensitive data workloads, they’re also confronting unique challenges associated with operating in multi- and hybrid-cloud constructs, according to senior officials.

Broadly, multi-cloud computing models involve two or more public cloud options, and hybrid cloud computing refers to environments with a mix of private (or enterprise-hosted) and public cloud services.

Google, Oracle, Amazon Web Services and Microsoft are competing for task orders via the Defense Department’s enterprise cloud initiative, the Joint Warfighting Cloud Capability (JWCC). The intelligence community’s multi-cloud construct, Commercial Cloud Enterprise (C2E), is similar to JWCC and incorporates the same vendors, as well as IBM.

Awarded in 2020, C2E is a 15-year contract.

At this point, though, U.S. intel organizations “don’t have a multi-cloud/hybrid architecture at the IC level that would allow us to freely be able to exchange information with one another — and we don’t have a catalog … for [sharing] datasets,” Fred Ingham said last week during a panel at the annual GEOINT Symposium. 

Ingham is a CIA employee who’s currently on detail as the deputy chief information officer at the National Reconnaissance Office.

“In the old days, if I were to create a system that needed to take data from a spaceborne asset and write it very quickly to memory, process that data, do analysis on that data, eventually come up with some intelligence and perhaps store it in a repository — what I might build is I might create a very high-speed network” and a storage area network, he said. He added that he’d also buy “purpose-built servers” and a database for processing, among other assets.

The government would approve that system for storing information only after “I knew precisely how all of those bits and pieces work together,” Ingham explained.

“Now, let’s fast forward into a multi-cloud construct” with that same system — “completely contrived,” he said — offering a hypothetical to demonstrate current challenges. 

“So we’re downloading the same bits and I’m going to choose to put that into Google, because I like their multicast capability, so we’re going to write those bits very quickly into Google. And then I’m going to process them. And let’s just say I’ve got my processing already in AWS, I’ve got heavy GPUs there. So, I want to process that in AWS. And I happen to like Microsoft’s [machine learning] algorithms, so I’m going to do the analysis there, inside of Azure. And this intelligence that I accrue, I’m going to go store this in an Oracle database. I didn’t leave out IBM, it’s just IBM is on the high side. Alright, so I want to do that — [but] I can’t do it,” Ingham said. 

He spotlighted reasons why officials can’t yet make this vision of moving workloads freely across multiple clouds a reality.

“Number one, [the IC] acquired five cloud vendors, and we didn’t have a strategy or an architecture about how all of those things would fit together and work with one another,” Ingham said. 

The intel community does not currently have an overarching cloud governance model. 

Ingham noted at the conference he spoke to a representative from IBM, who told him about a commercial “cloud exchange, where each of those cloud providers are sitting in that same data center, and therefore they have the same type of networking capabilities — and so transport between the clouds are equal.”

“We don’t have that in the IC today,” he pointed out.

He also pointed to gaps in the IC’s current ability to deterministically gauge each cloud’s performance, as well as shortfalls in onboarding tools, operational support, identity management, data movement and comprehensive situational awareness across the cloud service providers, among other issues.

“What I like to think about is frictionless computing, that’s not frictionless — and until we solve those issues, I don’t see us being able to use the multi-cloud in the manner that I just described,” Ingham said. 

On the panel, leaders from other intelligence agencies also reflected on the benefits and obstacles of their unfolding government cloud deployments.

“The government has to do a better job in defining requirements — functional requirements — and more importantly, as you go towards a potential conflict with China, the operational requirements, or the operational scenarios in which you’re expected to run and deliver solutions [via the cloud]. I think we in the government have not done an appropriate job of that to our IT solution providers,” the Defense Intelligence Agency’s Deputy Chief Information Officer E.P. Mathew said.

Meanwhile, the National Security Agency is “already very far along on its multi-cloud journey,” according to NSA’s Deputy Chief Information Officer Jennifer Kron. Officials there “truly believe in finding the right computing solution for each mission” and purpose, she said, and so they are leveraging services from multiple providers.

The National Geospatial-Intelligence Agency started moving “everything” to the cloud in 2015. But by 2016, officials “very quickly found out” that moving all the workloads “wasn’t really the smart thing to do,” NGA’s Director for Chief Information Officer and IT Services Mark Chatelain said. Now, the agency is using the C2E contract to diversify its cloud holdings, he noted, with aims to “figure out how to smartly use the multi-cloud” over the next few years.

Recently, NGA has been requesting that industry provide “something like a single-pane-of-glass view of a multi-cloud” ecosystem, Chatelain said — “so, you don’t have to go to [a] Google window or an Oracle window, you basically have a single-pane-of-glass window that you can manage all of the clouds.”

NGA also wants more affordable applications to move data and capabilities, as well as direct connections between the clouds to expedite information transfer.

“Imagery, as you know, consumes a huge amount of data. NGA brings in about 15 terabytes per day of imagery into their facilities, today. And that’s predicted to grow probably about 1,000% in the next coming six or seven years. So we’ve got to have the connectivity between the clouds to be able to share that information,” Chatelain noted.
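To put Chatelain’s figures in rough perspective, a back-of-the-envelope projection is possible from the numbers he cited — about 15 terabytes per day today, predicted to grow about 1,000% over the next six or seven years. Note that “1,000% growth” is ambiguous; treating it as an elevenfold total (today’s volume plus ten times more) versus a tenfold total is an assumption, so both readings are shown:

```python
# Rough projection of NGA's daily imagery intake from the figures
# cited by Chatelain: ~15 TB/day today, ~1,000% growth in 6-7 years.
# Both common readings of "1,000% growth" are computed; which one
# the speaker intended is an assumption.

current_tb_per_day = 15

# "Grows by 1,000%": final = current * (1 + 10) -> 165 TB/day
grow_by = current_tb_per_day * (1 + 1000 / 100)

# "Grows to 10x today's volume": final = current * 10 -> 150 TB/day
grow_to = current_tb_per_day * 10

print(f"Projected intake in ~6-7 years: {grow_to:.0f}-{grow_by:.0f} TB/day")
```

Either reading lands in the range of roughly 150 to 165 terabytes of imagery per day — the scale motivating the inter-cloud connectivity Chatelain describes.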

He and other officials suggested that cloud providers should recommend an architecture and appropriate path forward. They were hopeful that could soon be in the pipeline.

“I had the opportunity to be with all of the cloud vendors yesterday and today — and without exception, every one of them is very much in favor of exactly that. They know they bring something to the fight that nobody else does, and they know that their competitors bring something to the fight that they can’t bring,” Chatelain said.
