Cloud computing has given federal agencies a newfound ability to run large-scale modeling and high-performance computing workloads that not long ago required coveted time slots on supercomputers. But moving massive data sets from agency data centers to the cloud still involves a lot of work. Data extraction charges can also be costly, and until recently, latency was a significant issue.
That’s changing thanks to new capabilities available from a software solution that many federal agencies are already using on premises to help them manage those workloads, says Nic Perez, chief technology officer at ThunderCat Technology, a value-added reseller that provides IT services to U.S. federal and state governments and Fortune 500 companies.
“I think the key when you think about moving [large volumes of] data to the cloud is I want to reduce risk. I want to make it simple. And I want to use tools that I have on-premise,” says Perez in a new podcast produced by FedScoop and underwritten by ThunderCat Technology, AWS and NetApp.
“Your teams [are already] moving data between data centers, between office locations, between disaster recovery locations, and they’ve been using tools like NetApp’s SnapMirror,” which replicates business-critical data at high speeds over local or wide area networks.
NetApp, known for its storage and data management solutions, has translated those tools to work fluidly in almost all of the leading cloud service providers. Consequently, system administrators who already rely on NetApp to replicate and migrate data between agency-run sites can readily transfer data to and from the cloud.
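For administrators familiar with SnapMirror on premises, extending replication to a cloud endpoint follows the same pattern. The sketch below uses standard ONTAP CLI commands; the cluster, SVM, and volume names are illustrative assumptions, not details from the article, and the exact steps vary by environment and ONTAP version.

```shell
# Hedged sketch: replicate an on-premises volume to a Cloud Volumes
# ONTAP destination with SnapMirror. Names below are hypothetical.

# Create a SnapMirror relationship from the on-prem source volume
# to the cloud destination volume (assumes cluster/SVM peering is
# already in place)
snapmirror create -source-path onprem-svm:vol_data \
    -destination-path cloud-svm:vol_data_dr \
    -type XDP -policy MirrorAllSnapshots

# Perform the baseline transfer
snapmirror initialize -destination-path cloud-svm:vol_data_dr

# Check replication status afterward
snapmirror show -destination-path cloud-svm:vol_data_dr
```

Because the workflow mirrors what teams already run between data centers, the main change is the destination path, not the tooling.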
“The beauty of it right now is that those tools are available to customers in the cloud,” says Perez.
Capitalizing on the cloud
During the podcast, Perez points to the success a large federal financial research organization had migrating 350 terabytes of historical data to the cloud, using NetApp SnapMirror and ThunderCat’s expertise.
“Because of the amount of investment and the amount of work that NetApp had done to support this system running inside the cloud,” the agency discovered the cloud delivered even greater performance than the agency’s high-end, on-premises systems, Perez says.
The ability to migrate and analyze large volumes of data in the cloud is especially useful to federal agencies that rely on video surveillance, where users need to review footage and layer on AI and machine learning, for instance for video transcription.
“If you place those videos in the cloud, and [users] had their client on their desktop, you’re moving that data over the network. This data extraction costs money when you move it over the network,” Perez explains. So in one instance, “We were able to use a NetApp solution, where we stored [surveillance data] on high-performance storage next to a [virtual desktop infrastructure] solution inside the cloud.” The resulting arrangement saved significant amounts of money and delivered extremely high performance, according to Perez.
One of the other key benefits agencies realize using NetApp's suite of solutions is the reduced amount of training involved in migrating and managing datasets as they move back and forth among cloud providers.
“I think the hardest thing [about tackling big data migrations] is the retraining and the resources you …need in the cloud. A lot of the technologies are the same [internally], but very different when you’re moving from inside a secure data center into a services-based solution on a third party host,” he says.
Listen to the full podcast conversation on “IT Modernization in Government” on FedScoop.com and on FedScoop’s radio channels on Apple Podcasts, Spotify, Google Play, Stitcher and TuneIn.
Nic Perez has an extensive career in technology, having worked for Northrop Grumman, American Red Cross, AOL, Crowd Adopter and Booz Allen Hamilton before joining ThunderCat Technology.
This podcast was produced by FedScoop and underwritten by ThunderCat Technology, AWS and NetApp.