From data silos to strategic insights: The interoperability imperative for defense analytics

How advances in AI and data fabrics are bridging critical gaps in information sharing and accelerating the promise of near real-time, actionable intelligence.

The increasing use of drone warfare in global conflicts, as seen in the Russia-Ukraine conflict, presents a new imperative for military forces to quickly identify and intercept hostile unmanned weapon systems when every minute counts. It also reveals a more profound need to gather and analyze multiple data streams and respond faster than humans can process manually.

However, recent technological advances are demonstrating new promise in solving that challenge — and more broadly, one of the military’s Achilles heels: turning massive amounts of data into actionable intelligence in near real time.

From air, sea and ground reconnaissance telemetry to intricate logistics data, the sheer volume of information generated across the defense landscape is staggering. Yet transforming this data into actionable intelligence with the speed and precision required for decision advantage on the battlefield and in cyberspace has remained a critical stumbling block.

The solution isn’t necessarily more modern data collection tools, according to experts, but rather a concerted effort to make existing and future systems work seamlessly together — and move data up the value chain from collection to near real-time insights where and when needed.

“The greatest challenge isn’t around collection — there are plenty of capabilities out there for that — but it’s in connecting that data to enable insights,” says Joe McMahon, senior director of the Mission Software portfolio at GDIT. “Data fragmentation across both functional silos and classification levels, along with inconsistent standards, latency issues, and the sheer volume of data, creates interoperability gaps that prevent the timely movement of information from sensor to decision maker,” he explains.

The good news is that maturing technology capabilities are beginning to bridge these gaps, fostering greater data interoperability across diverse defense environments.

The rise of open data fabrics

One of the most promising strategies to counter data fragmentation, according to McMahon, is the adoption of federated data fabrics that rely on open data standards and advanced analytics platforms like Open DAGIR, Advana, Army Advantage and Navy Jupiter. These architectural approaches leverage governance and industry incentives to align enterprise applications, data, and infrastructure holistically to achieve greater interoperability, speed and flexibility at scale.

What sets them apart from traditional, proprietary architectures is how they function. Rather than users having to collect data, the systems operate on the principle of “leaving the data with the data creators — or data origin — but exposing it in a way that others can find it, and gain value from it,” according to McMahon.

These systems are gaining traction due to the expanding use of open standards and specifications, such as Open Geospatial Consortium web services, Open Sensor Hub and OpenAPI, which allow disparate systems to interact and exchange data more seamlessly.
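To make the value of such open standards concrete, the sketch below parses a GeoJSON FeatureCollection of the kind an OGC API - Features endpoint returns. The payload, track IDs and field names are hypothetical illustrations, not a real feed; the point is that any standards-compliant producer can be consumed with the same small amount of code.

```python
import json

# Hypothetical GeoJSON FeatureCollection, shaped like an
# OGC API - Features response (contents are illustrative).
payload = """
{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [30.52, 50.45]},
     "properties": {"track_id": "UAS-001", "source": "radar"}},
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [30.61, 50.40]},
     "properties": {"track_id": "UAS-002", "source": "acoustic"}}
  ]
}
"""

def extract_tracks(geojson_text: str) -> list[dict]:
    """Flatten each GeoJSON feature into a simple track record."""
    collection = json.loads(geojson_text)
    return [
        {
            "track_id": f["properties"]["track_id"],
            "source": f["properties"]["source"],
            "lon": f["geometry"]["coordinates"][0],
            "lat": f["geometry"]["coordinates"][1],
        }
        for f in collection["features"]
    ]

tracks = extract_tracks(payload)
```

Because the structure is standardized, the same `extract_tracks` works whether the features come from a radar gateway or a coalition partner's service.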

High-velocity dataflow

Adding momentum to these expanding data-sharing capabilities is the rise of high-throughput, low-latency distributed streaming platforms that allow for near real-time data processing. Apache platforms like Kafka and Pulsar can ingest, store and process massive amounts of fast-moving data in motion from thousands of data sources simultaneously.
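The publish/subscribe model these platforms share can be illustrated with a toy in-memory log. This is not Kafka or Pulsar themselves (a real deployment involves brokers, replication and persistence), just a sketch of the core idea: producers append records to keyed partitions, and each consumer advances its own offset independently.

```python
from collections import defaultdict

class MiniLog:
    """Toy append-only, partitioned log illustrating the Kafka/Pulsar
    model: producers append to topic partitions; consumers track
    their own offsets and poll independently."""

    def __init__(self, partitions: int = 3):
        self.partitions = [[] for _ in range(partitions)]
        self.offsets = defaultdict(int)  # (consumer, partition) -> offset

    def produce(self, key: str, value: dict) -> int:
        # Key-based partitioning keeps records from one sensor in order.
        p = hash(key) % len(self.partitions)
        self.partitions[p].append(value)
        return p

    def poll(self, consumer: str, partition: int, max_records: int = 10):
        start = self.offsets[(consumer, partition)]
        batch = self.partitions[partition][start:start + max_records]
        self.offsets[(consumer, partition)] = start + len(batch)
        return batch

log = MiniLog()
p = log.produce("radar-07", {"track": "UAS-001", "alt_m": 120})
first = log.poll("fusion-cell", p)   # returns the new record
again = log.poll("fusion-cell", p)   # empty: offset already advanced
```

Because consumers manage their own offsets, a fusion cell, an archival job and an AI pipeline can all read the same stream at their own pace without interfering with one another.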

That ability is enhanced by dataflow management systems like NiFi, which automate data flow between systems. Created by the U.S. National Security Agency (and now part of the Apache Software Foundation), NiFi makes it possible to connect a massively distributed set of components that weren’t designed to work together. It can fuse data created in different formats and subject to different policies while reducing latency, all of which dramatically streamlines analysis.
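The kind of format fusion a NiFi flow performs can be sketched in a few lines: two feeds, one JSON and one CSV, are normalized into a single common schema. The feed contents and field names here are invented for illustration; a production flow would add provenance tracking and policy enforcement on top.

```python
import csv
import io
import json

def normalize_json(text: str) -> dict:
    """Normalize a JSON telemetry record (hypothetical field names)."""
    rec = json.loads(text)
    return {"track_id": rec["id"], "lat": float(rec["lat"]), "lon": float(rec["lon"])}

def normalize_csv(text: str) -> dict:
    """Normalize a legacy CSV export (hypothetical column names)."""
    row = next(csv.DictReader(io.StringIO(text)))
    return {"track_id": row["TrackId"], "lat": float(row["Latitude"]), "lon": float(row["Longitude"])}

# Two sources, two formats, one common schema downstream.
fused = [
    normalize_json('{"id": "UAS-001", "lat": "50.45", "lon": "30.52"}'),
    normalize_csv("TrackId,Latitude,Longitude\nUAS-002,50.40,30.61"),
]
```

Once records share a schema, downstream analytics never need to know which system produced them, which is exactly the decoupling that makes the data fabric approach scale.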

“The defense sector’s data interoperability challenge requires more than good intentions — it demands infrastructure purpose-built for it,” says Fazal Mohammed, senior manager and solutions architect at Amazon Web Services. “Modern cloud architectures can support a federated data fabric approach, where security data can be centralized across domains while maintaining fine-grained access control. Advanced streaming technologies can handle the high-velocity dataflows that military operations demand. When combined with specialized government and classified cloud environments, this enables the vision of ‘data stays at origin but becomes discoverable’ — with the governance, speed and security that defense missions require.”

Zero-trust: Bridging domains

A third factor accelerating real-time information sharing is the increasing adoption of zero-trust architectures and their emphasis on continuous authentication and least privilege access, which are pivotal in enabling more fluid and robust cross-domain data sharing.

Zero-trust architectures are starting to chip away at data ownership issues and inherent resistance to sharing, particularly by those who contend, “If the data is restricted, you can’t access it,” says McMahon. Instead, he suggests that zero-trust architectures provide mechanisms that enable robust secure sharing capabilities.
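The "restricted means inaccessible" objection dissolves when every request is evaluated on its own merits. The sketch below shows a per-request authorization check in the spirit of zero trust; the clearance levels, subject names and policy shape are simplified assumptions, not any particular DoD implementation.

```python
from dataclasses import dataclass

# Simplified classification lattice (illustrative).
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

@dataclass(frozen=True)
class Request:
    subject: str          # who is asking
    clearance: str        # their verified clearance
    resource_level: str   # classification of the data product
    action: str           # what they want to do

def authorize(req: Request, allowed_actions: set[str]) -> bool:
    """Evaluate every request, every time: nothing is implicitly
    trusted. Access requires BOTH clearance dominance and an
    explicit grant for the specific action (least privilege)."""
    dominates = LEVELS[req.clearance] >= LEVELS[req.resource_level]
    return dominates and req.action in allowed_actions

ok = authorize(Request("analyst7", "secret", "confidential", "read"), {"read"})
denied = authorize(Request("analyst7", "secret", "top_secret", "read"), {"read"})
```

Because the decision is made per request rather than per network, data owners can expose restricted products confident that each access is individually checked and auditable.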

AI-enhanced analytics

What’s unleashing the potential of all of these developments is integrating artificial intelligence (AI) and machine learning (ML) more deeply into the data fabrics and mission platforms, rather than confining them to siloed pilots. Moreover, these models are being trained on cross-domain data products and deployed closer to the tactical edge.

That’s proving to be a game-changer in managing the military’s growing data complexity and accelerating insights, as evidenced by the growing use of these tools to thwart drone attacks in hostile conditions.

This increasing integration allows the military to shift its focus from simply assessing the accuracy of AI and ML tools to a deeper understanding of whether the AI output can be trusted, whether it’s explainable and how effectively the tools perform in contested environments.

McMahon points to three use cases as examples of how these capabilities are expected to accelerate decision-making.

  • Fusing multiple sensors for a common operating picture: By integrating radar tracks, acoustic sensors and electro-optical or infrared feeds, commanders can identify hostile drones earlier and act faster. This analytics-driven fusion has reduced decision cycles from minutes down to seconds.
  • Predictive maintenance: Applying machine learning to telemetry and maintenance logs from aircraft and other equipment, like the F-35, helps identify early indications of parts that are likely to fail. This proactive approach extends aircraft availability and prevents unplanned downtime by catching issues before they occur.
  • Optimizing logistics and supply chains: Integrating data on shipping and fuel consumption, inventories, etc., allows planners to model and predict bottlenecks in the supply chain. This enables rerouting supplies and sustaining tempo in contested logistic environments, preventing delays that could otherwise stall efforts.
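The first use case above, fusing independent sensors to raise confidence in a detection, can be sketched with a standard probability identity: if each sensor independently detects a target with probability p, the chance that at least one is correct is 1 minus the product of the miss probabilities. The sensor names and confidence values are illustrative, and the independence assumption is a deliberate simplification.

```python
import math

def fused_confidence(detections: dict[str, float]) -> float:
    """Combine independent per-sensor detection confidences:
    P(at least one correct) = 1 - prod(1 - p_i).
    Assumes sensor errors are independent (a simplification)."""
    miss = math.prod(1.0 - p for p in detections.values())
    return 1.0 - miss

# Three modest sensors combine into a high-confidence track.
conf = fused_confidence({"radar": 0.70, "acoustic": 0.50, "eo_ir": 0.60})
```

No single sensor here exceeds 70 percent confidence, yet the fused estimate is about 94 percent, which is why cross-sensor integration compresses decision cycles the way the bullet describes.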

In each instance, these AI applications are intended to augment, not replace, human decision-making, making operators better, faster and stronger at their jobs. “There still needs to be somebody driving,” says McMahon. These tools let seasoned analysts move through more data more quickly and flag critical, time-sensitive items, rather than sifting through massive data stores.

Looking ahead, McMahon suggests reframing the concept of weaponizing data in the defense sector. Instead, “insights are the weapons, and data is really the building blocks of those weapons.” Collecting and connecting data needs to be easy and quick so technical experts and analysts can focus on analyzing, deriving insights, and making rapid decisions.

While classification and sensitivity remain valid concerns, the inability to share data only serves to disadvantage the military. Ultimately, for the defense sector to unlock the full potential of its extensive data reserves, the future of defense analytics depends on treating data as a shared resource with policies that promote interoperability, open standards, and partnerships with experienced data-sharing experts.

Learn more about how GDIT can help your organization develop flexible platforms for mission advantage.
