Kendall: Generative AI tech like ChatGPT currently has limited military utility

The Air Force secretary directed the service's scientific advisory board to examine these types of capabilities and their potential military applications.
Secretary of the Air Force Frank Kendall delivers an address to Air Force Academy cadets during their graduation ceremony at Falcon Stadium on May 25, 2022 in Colorado Springs, Colorado. (Photo by Michael Ciaglo/Getty Images)

Air Force Secretary Frank Kendall is keen on his department acquiring artificial intelligence capabilities, but he’s not so gung-ho on some of the generative AI technology that’s commercially available now.

Tools that can generate content — such as text, audio and images — based on prompts and the data they’re trained on have gone viral in recent months.

“There are certain forms of AI, which are advancing dramatically. And the one that’s getting an awful lot of attention right now is generative AI. ChatGPT, for example, DALL-E — the programs that can appear to be creative,” Kendall said Thursday at an event hosted by the Center for a New American Security. “I find limited utility in that type of AI for the military, so far. I’m looking and we’re all looking, right? But having it write documents for you … is not reliable in terms of the truthfulness of what it produces.”

Executives at OpenAI, the maker of ChatGPT, have acknowledged that these types of technologies sometimes produce so-called hallucinations or misinformation.

Kendall referenced a recent example in which a lawyer used ChatGPT to write a legal brief, but the technology made up cases that weren’t real and a judge noticed the errors.

“We’ve got a ways to go yet before we can rely on tools like that to do operation orders, for example,” Kendall said.

He has tasked the Air Force Scientific Advisory Board with examining these types of capabilities.

“I’ve asked my scientific advisory board to do two things related to AI. One was to take a look at the generative AI technologies like ChatGPT and think about the military applications of them. Put a small team together to do that fairly quickly. But also to put together a more permanent, AI-focused group that will look at the [broader] collection of AI technologies … and help us understand them and figure out how to bring them in as quickly as possible,” he noted.

The advisory board was scheduled to brief its parent board last week on the results of its fiscal 2023 study of generative artificial intelligence, according to a notice in the Federal Register.

Kendall isn’t the only high-level official at the Department of Defense to recently express concerns about the reliability of the generative AI that’s commercially available today.

“We are not going to use ChatGPT in its present instantiation,” Maynard Holliday, the Pentagon’s deputy CTO for critical technologies, said last week at Defense One’s annual Tech Summit.

However, he said generative AI models could have “a lot of utility” for the DOD if they’re pursued in the right way and jointly developed by the Pentagon and its industry partners.

“We will use these large language models, these generative AI models, based on our data. So they will be tailored with Defense Department data, trained on our data, and then also on our … compute in the cloud and/or on-prem, so that it’s encrypted and we’re able to essentially … analyze its, you know, feedback,” he explained.

The U.S. military needs “multimodal” generative AI capabilities, not just large language models, he told DefenseScoop on the sidelines of the conference.

“We’ve got to get a handle on our data and … we’ve got to label all of our data. So, the electronic signatures of different platforms, all the vision and all the full-motion video we get — it’s got to get labeled. And then once that’s labeled, then we can put it into generative [AI models]. Because large language models are just one modality. How it’s going to work for us — it’s going to be multimodal. So, it’s gonna have to be language, it’s gonna have to be vision, it’s gonna have to be signals. And that’s all gonna have to be melded for us to use it,” he said.

The Defense Department was slated to host a conference in McLean, Virginia, this week focused on generative AI. About 250 people from government, industry and academia were expected to attend, according to Holliday.

Speaking more broadly about artificial intelligence capabilities, Kendall said Thursday that there are military applications where machines with high processing speeds can digest vast amounts of data.

Tools like neural networks and machine learning can be “very helpful” to warfighters for assisting with intelligence-related tasks like pattern recognition and targeting, he noted during the CNAS event, adding that the Pentagon has been working on that type of technology “for quite a long time.”

The Air Force is also pursuing new unmanned platforms such as “collaborative combat aircraft” that will be enabled by AI algorithms.

“It has revolutionary potential in terms of the capability that it provides,” Kendall said.

However, the Pentagon must make sure it applies AI tools in an ethical and responsible way, especially when it comes to tasks like targeting and applying lethal force in situations where civilians could accidentally be killed, he noted.

The DOD recently updated its autonomous weapons policy, which regulates how the U.S. military will develop and field those types of systems.

“All of those things are going to be coming in, they’re going to be used, they’re going to give us more capability. Humans are still going to have a role and be responsible for what actions are taken, and they have to be in the decision-making process so they can do that in a way which is also operationally efficient, right? And we’re gonna have to do some work on figuring out how to test and have reliable performance from these technologies,” Kendall said.

The requirements process also needs to keep up.

“We have to be very smart about how we write requirements so that we … provide the opportunity for those technologies to get on board, and then find paths for them that address all the issues I just talked about. But get them in the hands of warfighters where they can give them an advantage,” Kendall said.
