
Navy’s new ‘Project OpenShip’ aims to swiftly apply AI to data captured by vessels at sea

Members of Task Force Hopper recently launched a new program to build “low lift, high impact” artificial intelligence and machine learning applications. DefenseScoop was briefed on this effort.
SAN DIEGO (Jan. 7, 2021) An aerial photo of Naval Base San Diego in San Diego, CA. The photo was taken from a U.S. Navy MH-60S Seahawk assigned to the Helicopter Sea Combat Squadron 14 (HSC-14). (U.S. Navy photo by Mass Communication Specialist 2nd Class Austin Haist/Released.)

SAN DIEGO, Calif. — As they work to streamline information sources and operationalize artificial intelligence and machine learning capabilities across the Navy’s entire surface fleet, members of Task Force Hopper recently launched a new program to build “low lift, high impact” AI and ML applications that can generate value from the heaps of data captured on ships every day, according to Capt. Pete Kim.

Kim has led the task force as its director since Hopper was first launched in 2021 to accelerate AI across Naval Surface Force organizations — a massive enterprise that performs many different duties to prepare, maintain, equip and staff U.S. warships before they are deployed to their respective fleet commands for specific military missions. 

“I’d like to talk about Project OpenShip — it’s something we’re pretty excited about,” Kim told DefenseScoop last week in an exclusive interview at the facility near San Diego where Hopper is headquartered.

“It’s an effort to kind of integrate datasets to build better decision tools for sailors at sea,” he explained.


Kim, who also steers the Navy’s Surface Analytics Group (SAG) there, noted that when the service “started this journey about two years ago, there was really no roadmap when it came to data management, digital infrastructure requirements, talent management, supporting analytics and AI development.”

Early on, his team faced complex organizational challenges, as the Navy at the time was not “structured to get after those things,” he said. They also quickly realized there was no real vehicle to seamlessly integrate and collaborate with data across the sprawling organization. 

“There’s stovepipes out there,” Kim noted, adding, “you would find several teams working on projects that are very adjacent,” but leaning on different products, tools, or programming languages — as well as data that wasn’t high quality or easily shareable. 

Since then, SAG and the task force have helped the Navy “come a long way in addressing those common challenges” as an enterprise, in Kim’s view. 

With some roots tracing back to those accomplishments, Project OpenShip emerged over the last six or so months as commercial technologies were evolving and a number of AI and machine learning projects within the Navy were starting to mature.  


The project marks a new “surface enterprise-wide effort to develop and deploy software in novel ways to create decision tools for operators by integrating disparate sensor and information data available on ships today,” Kim told DefenseScoop.

Various Navy program offices have been fielding a data architecture to support this, which Kim referred to as “a common sensor software stack.” 

The platform is envisioned to give officials the ability to collect the terabytes of labeled training data that Navy ships already generate every hour at sea, while, from onshore, Task Force Hopper can supply critical inputs to inform new use cases.
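As a rough illustration of what labeled training data of that kind could look like in practice, the sketch below defines a hypothetical record type for a single shipboard sensor capture. The field names, the record structure and the idea of an onshore analyst adding labels after the fact are assumptions made for illustration, not details of the Navy's actual common sensor software stack.

```python
# Hypothetical sketch only: fields and structure are illustrative assumptions,
# not the schema of any fielded Navy system.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class LabeledSensorRecord:
    """One labeled observation captured aboard a ship for later model training."""
    ship_id: str                              # hull identifier (hypothetical)
    sensor: str                               # feed name, e.g. "radar" or "ais" (assumed)
    captured_at: datetime                     # UTC capture time
    payload: dict                             # raw reading; format varies by sensor
    label: Optional[str] = None               # annotation added onshore, e.g. contact class
    tags: list = field(default_factory=list)  # free-form curation tags

record = LabeledSensorRecord(
    ship_id="DDG-000",
    sensor="radar",
    captured_at=datetime.now(timezone.utc),
    payload={"range_nm": 12.4, "bearing_deg": 47.0},
)
record.label = "merchant_vessel"  # a label an onshore analyst might apply later
```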

“They’re essentially applications for our operators to use on ships,” Kim said. “And we want to do this within months and not years. This is an important effort, because I see it as a vehicle for many of the different program offices we’re talking to — owners of these systems and data — to work towards a common objective,” he added.

When asked to elaborate on this intent to create and unleash software “in novel ways,” Kim pointed to some of the Navy’s legacy ships. Those systems’ hardware and IT infrastructure have historically been limiting factors for innovation, but program offices have now been able to field and support that common software stack on them.


“A good example would be for contact management on the bridge — to be able to integrate various sensor and information feeds there, so that you can automate a lot of the processes for the bridge watchstanders to free up their time,” Kim said.
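As a toy illustration of the kind of automation Kim describes, the sketch below correlates two hypothetical feeds — radar tracks and AIS reports — into a single deduplicated contact list, the sort of tedious matching a bridge watchstander might otherwise do by hand. The data formats, field names and matching threshold are assumptions, not details of any fielded Navy system.

```python
# Illustrative sketch only: pair each radar track with the nearest AIS report
# within a distance threshold, so the watch team sees one fused contact picture.
from math import hypot

def fuse_contacts(radar_tracks, ais_reports, max_sep_nm=0.5):
    """Return a fused contact list from hypothetical radar and AIS feeds."""
    fused, unmatched_ais = [], list(ais_reports)
    for track in radar_tracks:
        best, best_dist = None, max_sep_nm
        for report in unmatched_ais:
            dist = hypot(track["x_nm"] - report["x_nm"],
                         track["y_nm"] - report["y_nm"])
            if dist <= best_dist:
                best, best_dist = report, dist
        if best is not None:
            unmatched_ais.remove(best)
            fused.append({**track, "vessel_name": best["name"], "source": "radar+ais"})
        else:
            fused.append({**track, "vessel_name": "UNKNOWN", "source": "radar-only"})
    # AIS reports with no radar return are still worth surfacing to the watch team.
    for report in unmatched_ais:
        fused.append({"vessel_name": report["name"], "x_nm": report["x_nm"],
                      "y_nm": report["y_nm"], "source": "ais-only"})
    return fused

radar = [{"track_id": 101, "x_nm": 2.0, "y_nm": 3.1}]
ais = [{"name": "MV EXAMPLE", "x_nm": 2.1, "y_nm": 3.0}]
print(fuse_contacts(radar, ais))
```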

The work will fuse and connect distinct teams across the surface fleet, he said, via such “complementary use cases.”

“If we didn’t have a pilot like this, there would be many different AI tools being built in stovepipes in a very proprietary manner,” Kim noted.

The Department of the Navy’s Chief Information Office and the Pentagon’s Chief Digital and AI Office (CDAO) have been “really supportive,” he added, in accelerating his team’s efforts to modernize and migrate some of their priority data systems into the cloud.  

Looking forward, Kim said there will be room for the Defense Department to continue to help alleviate obstacles hindering the military’s AI/ML deployments — particularly with Project OpenShip. 


“Traditionally, a good example is our combat systems and like our [command, control, communications, computers, and intelligence, or C4I] data — there’s those enclaves that have never talked to each other intentionally, but we’ve got use cases where we’ve got to integrate those datasets. So, working through the policy and then the technical hurdles to be able to integrate those disparate datasets, that’s something that we’ll be working on and focusing on,” he told DefenseScoop.

His teams have also been making progress in other efforts to enable the scaling of artificial intelligence and machine learning technologies. 

They’ve recently “been able to set a target of 75 mission-capable ships” that the force needs at all times, through an ongoing analytics effort to measure and track surface ship readiness, Kim confirmed. In the near term, they will also be prioritizing the automation of data streams to support application programming interfaces (APIs) and other tools to underpin AI and ML applications. 
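As a minimal sketch of how an automated data stream might back such an API, the hypothetical example below serves a mission-capable count against that 75-ship target. The endpoint path, record format and use of Flask are illustrative assumptions; only the 75-ship figure comes from Kim's remarks.

```python
# Minimal sketch, assuming a readiness feed has already been automated into a
# queryable store that this API layer can summarize for downstream AI/ML tools.
from flask import Flask, jsonify

app = Flask(__name__)

MISSION_CAPABLE_TARGET = 75  # target cited by the surface readiness analytics effort

# Stand-in for an automated data stream landing in a database or message queue.
SHIP_STATUS = [
    {"hull": "DDG-000", "mission_capable": True},
    {"hull": "LCS-000", "mission_capable": False},
]

@app.route("/readiness/summary")
def readiness_summary():
    """Return the current mission-capable count against the 75-ship target."""
    capable = sum(1 for ship in SHIP_STATUS if ship["mission_capable"])
    return jsonify({
        "mission_capable": capable,
        "target": MISSION_CAPABLE_TARGET,
        "shortfall": max(0, MISSION_CAPABLE_TARGET - capable),
    })

if __name__ == "__main__":
    app.run(port=8080)  # consumers would poll this endpoint instead of manual reports
```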

At this point, Project OpenShip is “probably the most exciting” effort underway, Kim said, “because it’s really going to be a blueprint for common data engineering and machine learning models, like that distribution infrastructure for our edge platforms, both current and future.”

“And in the future, we’re talking about unmanned vehicles and our integrated combat systems. So, the goal here is to support those program offices and that system architecture, the documentation, and really get at lessons learned to be able to scale this as a program of record across our ships,” Kim said.
