As the U.S. adjusts to new great-power developments, the Army has revised its data strategy to reflect the need to move quickly and securely to leverage its information assets against adversaries.
Army CIO Lt. Gen. Bruce Crawford walked through the new strategy — essentially an updated version of the strategy the service issued in 2016 under Lt. Gen. Robert Ferrell — at a recent Association of the U.S. Army event.
It is crucial to be “able to enable our warfighter, enable the users whether they’re sitting in the Pentagon at the enterprise or they’re sitting in a TOC [tactical operations center] somewhere down at the tactical edge, to be able to orient, decide and act faster than peer adversaries,” Crawford said of his office’s work to modernize the Army’s IT.
At the heart of that is what he called the Army's “data problem.”
“Right now we’ve got structured data, we’ve got unstructured data, there are various states and it kind of is where it is,” he said. “But if we want to achieve outcomes, and that ultimately is our ability to properly leverage data, we’ve got to fix the data problem, and the first part starts with the actual strategy.”
The strategy, like its predecessor, lays out principles of making data visible, accessible, understandable, trusted and interoperable. The new version also adds a principle for securing it. Each principle comes with its own standards for accomplishing what the Army wants.
“Just because I put out a memo that says to make your data visible … that’s not going to solve the problem,” Crawford said.
Interoperable and trustworthy data is becoming a bigger requirement for the modern Army.
Things like deepfake videos “cast just enough doubt to cause you not to trust what you’re looking at,” Crawford said. “There is a growing mistrust in what used to be norms. You combine that with untrusted data, meaning I’m looking at it but I’m not 100% sure that this is correct — then now you’re starting to impact operations.”
And as the Army looks to modernize its networks, relying more on cloud environments while still depending on traditional data centers in some capacity, data must be interoperable, he said.
“I’m not necessarily looking to move it. But what we’re trying to say is we’re going to put standards in place,” Crawford said. “Some of it will end up in a cloud-hosted environment, but I’ve talked to no expert yet that says that the future will not involve data centers. So there will be some of our data that resides in data centers. But the question is regardless of where it is, it’s got to be interoperable. And then ultimately it’s got to be secure.”
A strategy alone cannot solve the data problem, though. Crawford said culture is the biggest barrier.
“Overnight, because I put out a data strategy, does that stop all the disparate nature and isolated nature of everything?” It obviously doesn’t, he said.
Culture is what “drove me to tell the leadership that among the hardest things we’re going to do in the next 10 years is implementing the data strategy,” Crawford said. “It’s going to require a culture change. It’s going to require trust in places that don’t exist right now.”