Leveraging innovations that transform data

Pure Storage’s John Karagozian explains how innovations in flash technologies are making it easier for federal agencies to transition to next generation tools for managing data storage.

As government agencies gather more data, they are facing a mounting challenge of how to effectively manage and analyze massive datasets.

Agencies understand the benefits: If they can tap into the full potential of their data collection, they can put it to smarter use, analyzing information to improve road traffic control, to spot population health patterns or even to design infrastructure projects.

The difficulty is in assembling and organizing the data in ways that can be interpreted quickly and economically. In recent years, agencies have come to rely on big data tools, such as SAP HANA, that bring computing power to the data instead of importing the datasets into a central computing platform. That helps agencies reduce their data center requirements, improve computing performance and begin to look at their data in real time.

But agencies are also beginning to benefit from the growing capabilities of flash storage, which, together with big data analytics tools, is facilitating the adoption of AI and machine learning, according to John Karagozian, Global SAP Lead for Pure Storage.

“Applications like SAP’s HANA [are] churning out big datasets and allowing you to do customized reporting, to do high-speed analytics, to do real-time analytics, and really kind of put that data to work for you,” explains Karagozian in a FedScoop podcast underwritten by Pure Storage.

The current trend in big data management is to move away from traditional relational databases — such as Oracle, SQL Server and DB2 — and toward next-generation technologies that rely on columnar databases. The columnar approach allows information to be retrieved much faster and supports real-time queries, which improves compute capability and performance.
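The difference between the two layouts can be sketched in a few lines. This is an illustrative example only, not SAP HANA code; the table, field names and figures are hypothetical:

```python
# Illustrative sketch (not HANA code): the same table stored two ways.

# Row-oriented layout: each record is kept together,
# as in a traditional relational database.
rows = [
    {"agency": "DOT", "year": 2021, "spend": 120},
    {"agency": "HHS", "year": 2021, "spend": 340},
    {"agency": "DOT", "year": 2022, "spend": 150},
]

# Column-oriented layout: each attribute is stored contiguously,
# the approach used by columnar databases.
columns = {
    "agency": ["DOT", "HHS", "DOT"],
    "year":   [2021, 2021, 2022],
    "spend":  [120, 340, 150],
}

# An analytic query such as "total spend" scans one contiguous
# column in the columnar layout, rather than every field of every row.
total_row_store = sum(r["spend"] for r in rows)
total_col_store = sum(columns["spend"])
assert total_row_store == total_col_store == 610
```

In the row layout, an aggregate must step over every field of every record; in the columnar layout, it reads one tightly packed array, which is why analytic queries run faster on columnar stores.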

SAP HANA, a database management and application development platform, adds analytical horsepower to the process by doing the computing work in-memory instead of on disk.
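The in-memory idea can be sketched with standard tooling. The example below uses SQLite as a stand-in for HANA, which is an assumption for illustration only; the table and figures are hypothetical:

```python
import sqlite3

# Sketch of in-memory computing (SQLite standing in for HANA):
# the working set lives in RAM, so queries avoid disk I/O entirely.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE spend (agency TEXT, amount INTEGER)")
con.executemany(
    "INSERT INTO spend VALUES (?, ?)",
    [("DOT", 120), ("HHS", 340), ("DOT", 150)],
)

# Analytics run directly against the in-memory table.
(total,) = con.execute("SELECT SUM(amount) FROM spend").fetchone()
print(total)  # 610
```

Because the data already sits in memory, repeated analytical queries pay no disk-read penalty — the core trade-off an in-memory platform makes in exchange for larger RAM requirements.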

“In the current environment, everything new that SAP is building is going into the HANA platform. If you are still running in an [SAP] Oracle-based system, you are not getting all features or functionalities, and especially the things tied around big data,” Karagozian says.

SAP has announced that support for its more popular database management systems, which were compatible with other vendors' databases, will end by 2024. Agencies that use SAP's systems will eventually have to move to the HANA platform.

“A lot of people ask me: If you are moving to an innovative columnar-based, memory-based platform like HANA, it’s running on the server, so why does storage matter?” Karagozian says. “The reality is, there is a whole lot of storage interaction … so even just loading a database into HANA can be very time consuming.”

Innovations in flash drive technologies are an important part of the equation. When an organization is managing backup and recovery at the storage platform level, flash-to-flash is a high-speed method that can make the process both time and cost efficient.

Read more about big data storage and solutions agencies can take to improve the management of their systems.

This podcast was produced by FedScoop and sponsored by Pure Storage.
