
AI task force for Navy surface fleet devising comprehensive data catalog

It’s one part of a broad, federated model the Naval Surface Force is applying to accelerate AI adoption.
(Source: Getty Images)

A Navy task force formed to operationalize artificial intelligence and machine learning across the surface fleet is steering efforts to fix the organization's data and lay the foundation for emerging technology applications in the near future.

Effective AI depends on data that is "clean," or cohesive enough to build algorithms on. Naval Surface Force components, which equip and staff warships ahead of their deployments to respective fleet commands for military operations, create and use heaps of data — but currently, it's all pretty messy from an organizational perspective.

“Our data landscape is so vast and complex. There’s no common data ecosystem, no data catalog, and not enough clean data,” Task Force Hopper Director Capt. Pete Kim told FedScoop in an interview July 29.

Kim has led the task force since it was launched last summer to drive AI capabilities across the surface force. He and his team have made progress shaping a nascent approach, strategy and implementation plan to guide that wide-ranging effort. As those documents are now being prepared for release, the task force is also working to engineer and refine a digital and conceptual hub that makes sense of the organizations’ multitudes of data and helps personnel better analyze and apply it for AI and ML.


“As you can imagine, it’s pretty challenging to get this infrastructure right,” Kim said. “I think it’s because of the nature of our different security classifications, roles and environments. It’s not as easy as, like, getting an app on your iPhone and doing the updates quickly.”

‘Cracking that code’

Kim now heads both the Surface Analytics Group (SAG) and Task Force Hopper. The experience has been eye-opening.

Capt. Pete Kim, then commanding officer of Ticonderoga-class guided-missile cruiser USS Princeton (CG 59), uses the ship’s general announcing system to speak to the crew, Aug. 14, 2020. (U.S. Navy photo by Mass Communication Specialist 2nd Class Logan C. Kellums)

“I’ve always been in the operational fleet — so, the one providing the data — and I didn’t realize how much data we have in the Navy that is not exploited,” he explained. “I think we tend to look for the data that we need to answer the mail and things like that, but until the last several years, I don’t think we’ve had the capability to really process big data. And we’re doing that now. So, that’s probably the coolest part.”


Still, there’s a long way to go before the service’s ambitious aims of applying AI and ML on a large scale are completely realized.

An initial priority for Task Force Hopper is to help the surface force pinpoint and clean data, so the separate parts of the sprawling enterprise can collectively take full advantage of it. To help demonstrate the vast scope of information, Kim noted that the SAG concentrates on readiness-related data.

“The spectrum is having data from authoritative databases, where it’s very structured data and we’re pulling in all the unique datasets that we need — all the way down to being on someone’s desktop or on a shared drive hidden away somewhere that you’ve got to find the right person or the manager to get to that data,” he said.

In addition to challenges around data availability, quality and governance, Kim noted that technology-centered work in the Navy has traditionally been organized and structured based on platforms and supporting program offices. But AI development cuts across many different stovepipes and organizations. 

“That’s why this federated model is so key in cracking that code,” he explained. 


That nascent approach he alluded to was recently conceptualized by the task force and will be detailed in a soon-to-be-released data and AI strategy and implementation plan. The overarching idea is to have more centralized data governance and a one-stop data catalog — combined with “decentralized analytic and AI development nodes at different places in the enterprise,” where personnel know and use data best.

Each node will focus on certain categories associated with artificial intelligence and machine learning, like maintenance or lethality. The SAG, for instance, is considered an AI node focused on readiness.
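The federated pattern described here — one centrally governed catalog, with decentralized nodes each pulling the datasets relevant to their own problem area — can be sketched in miniature. The class names, fields, and example entries below are illustrative assumptions, not drawn from any actual Navy system.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    """One row in the shared catalog: metadata only, not the data itself."""
    name: str
    domain: str   # e.g. "readiness", "maintenance", "lethality"
    owner: str    # point of contact for access requests
    location: str # where the data actually lives

@dataclass
class DataCatalog:
    """Centralized governance: a single place to register and discover datasets."""
    entries: list = field(default_factory=list)

    def register(self, entry: DatasetEntry) -> None:
        self.entries.append(entry)

    def find(self, domain: str) -> list:
        """A decentralized node queries the catalog for its problem area."""
        return [e for e in self.entries if e.domain == domain]

catalog = DataCatalog()
catalog.register(DatasetEntry("ship_readiness_reports", "readiness",
                              "SAG", "authoritative-db"))
catalog.register(DatasetEntry("engine_vibration_logs", "maintenance",
                              "shore-node", "shared-drive"))

# A readiness-focused node (like the SAG) sees only its own slice.
readiness_data = catalog.find("readiness")
```

The key design point is that the catalog holds metadata and ownership, while the nodes — closest to the data and the use case — do the actual analytics and model development.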

“I think every node is going to be a little bit different, and that really depends on the problem set, the use case. And then, again, what’s the state of the data?” Kim said.

If nodes have high-quality, mature datasets, they’ll likely be developing AI models pretty quickly. But if they start near or from square one, they’ll probably have to spend more time on data collection, cleaning and labeling in the first portion of the journey.

“I know that data management is not the sexiest topic, but we do believe this is one of the significant leaps to accelerating AI and ML in a large organization,” Kim added.


Entering a new era

Task Force Hopper is named in homage to the trailblazing computing pioneer and former naval officer Grace Hopper, who reached the rank of rear admiral (lower half) before her retirement.

A group of key AI and data stakeholders across the surface force — one of the Navy’s largest enterprises — has been meeting on a biweekly basis over the past year or so. Kim said they’ve kicked off “that data governance process” and are identifying many datasets for their respective realms to prioritize. 

Crafting clearly defined use cases for the surface force's many sources of data is also presently top of mind for Task Force Hopper.

“When it comes to analytics and AI, we’re kind of entering a new era where you have to have the operator, the warfighter, or the maintainer involved in every step of the development,” Kim said. “I think this is a departure from the past where we just give requirements to some contractor and then they come back in two years with the product.” 


In his view, the task force and SAG are seeing success from “having the right subject matter experts sitting side by side with data scientists, with AI model developers to produce really valuable products.”

Task Force Hopper has also made headway in working with the Navy’s office of the chief information officer, according to Kim, to apply a platform called Advana-Jupiter as its common development environment. 

“It’s got data-warehousing tools and all the applications you need to visualize the data and create AI models,” he explained. “We’re using that platform as a place to have a single catalog so that if folks are working on a project and they’re looking for certain datasets to move forward, they’re not stalled because they can’t find it or it’s unavailable.”

As one evolving piece of Advana, the Pentagon's bigger enterprise data hub, Jupiter will enable surface force members to seamlessly access data — and then build AI and ML algorithms informed by it.

“On the readiness side, we’re looking towards predictive and prescriptive maintenance to sustain our ships and increase reliability at sea,” Kim said.


Another readiness node priority area is condition-based maintenance. “As we start employing unmanned surface vehicles, we’re going to need those types of CBM models to support those vessels at sea, since they won’t have maintenance personnel onboard,” Kim noted.
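At its simplest, condition-based maintenance replaces fixed service intervals with alerts driven by sensor data. The toy check below flags a component when the rolling mean of a sensor reading drifts past a threshold; the window size, threshold, and sensor values are invented for illustration and say nothing about the models the Navy is actually building.

```python
from collections import deque

def cbm_alert(readings, window=5, threshold=80.0):
    """Return True if the rolling mean of the last `window` readings
    exceeds `threshold` (e.g., a hypothetical bearing temperature in C)."""
    recent = deque(maxlen=window)
    for r in readings:
        recent.append(r)
        # Only evaluate once a full window of readings is available.
        if len(recent) == window and sum(recent) / window > threshold:
            return True
    return False

# A stable sensor trace stays quiet; a steadily degrading one trips the alert.
healthy = cbm_alert([70, 71, 72, 71, 70, 72])    # False
degrading = cbm_alert([75, 78, 81, 84, 87, 90])  # True
```

Production CBM models fuse many sensor streams and learned failure signatures, but the core idea — monitor condition, act only when it degrades — is what makes the approach suited to crewless vessels.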

He added that while Jupiter does not need to host every single dataset, “that’s where we want to catalog it so that if someone’s working on a project, it’s like a menu” where they can see the point of contact and details on the data.

“We’re going to use Advana-Jupiter as that platform where we can kind of integrate different datasets, because as we start building more advanced AI models, it’s not just going to be one sensor data source, it’s going to be multiple things,” Kim said.

A key goal for the task force is to help the surface force become AI-ready by 2025. 

“I think with new technology, you always feel like you’re behind. That’s why we’re putting so much brainpower behind this. But as you know, having that high-quality dataset, the tools, the right people for the project — I mean that’s like 80% of the journey. So, if we get that infrastructure part right, the last 10% of producing this widget or what have you is the easy part,” Kim said. “And we can really partner with industry to really leverage the tech that’s out there and develop these unique tools that we need.”
