Cyber, big data and cloud spark dialogue at FedScoop’s Tech Shoot-Out

Panelists discussed cloud, cyber and big data at FedScoop’s 4th Annual Tech Shoot-Out on May 7, 2013. (Photo: FedScoop)

Leaving convention at the door, executives from government and industry got real on cybersecurity, cloud computing and big data at FedScoop’s 4th Annual Tech Shoot-Out.

Held May 7 at the Newseum in Washington, D.C., the event drew hundreds of attendees from both sectors to partake in “shoot-outs” that zeroed in on the most pressing issues for the IT community today.

Round One: Cloud Shoot-Out

The first panel, focused on cloud, was moderated by Casey Coleman, chief information officer at the General Services Administration, and featured Mike Giesler, vice president of systems engineering for NetApp’s U.S. public sector; Kevin Fiftal, civilian director at Intel; Dan Juengst of OpenShift PaaS product marketing in Red Hat’s cloud business unit; and Michael Donovan, chief technologist at HP Enterprise Services.

Coleman kicked off the discussion by asking panelists about the current state of cloud computing.

“[W]e’re at cloud-first,” Donovan said, referring to the policy that calls on agencies to first evaluate cloud options before making new IT investments. However, there is still a large difference between the public and private cloud — and a long way to go before getting to an integrated environment, he added.

Though the initial interest in cloud computing was economic, Red Hat found customers are moving beyond that and using cloud as an innovation platform for their applications to serve new mobile needs, according to Juengst.

“[C]loud computing is synonymous with innovation,” Coleman said, then asked the panelists what innovations they see emerging that will be impactful to the mission of the federal government.

A shrink-wrapping of services, Fiftal responded. Amazon offers 120 government-specific features in its cloud, “shrink-wrapping functionalities” so they fit the agencies that use them.

Giesler brought up leveraging both public and private cloud as an innovation affecting government.

With the private cloud, “you inherently get better security because you have greater control,” he said. “Public cloud has still been a little bit at arm’s length.”

More and more though, public cloud is being used to leverage services or for the computing power it can offer. “We’re going to see that happen more — private cloud for data storage and public cloud for services and computing,” Giesler concluded.

The panel concluded with a final round of what to do, and what not to do, with cloud. Donovan highlighted the importance of sharing, and doing more of it: the natural inclination with new technology is to hold onto it, but agencies should learn how to share data within government and with industry, he said.

“In terms of things to do less of … when we are doing something for the first time, we have a tendency to want to transfer the risk, so take a risk,” he said.

Juengst added a different dimension, focused on the customer: “You’ve got to go ugly, early,” he said. “Do more thinking about applications and application architectures you want in your cloud environment … that’s really what IT is all about, providing applications for end users.”

Echoing Juengst’s sentiments, Fiftal said, “Do less, go ugly early” and “take a risk, but an informed one … we’re in the federal government; when you think cloud, think compliance from the start.”

Giesler’s advice: “Think more about the services and less about the technology… think about the data.”

Round Two: Cyber Shoot-Out

Thomas Bayer, CIO at the Securities and Exchange Commission, moderated the second round of the shoot-out for panelists Jennifer Nowell, director of federal healthcare and government solutions for Symantec’s Americas public sector; Dr. Phyllis Schneck, vice president and chief technology officer for McAfee’s global public sector; and Al Kinney, director of cybersecurity capabilities for the U.S. public sector at HP Enterprise Services.

Focusing on an issue “near and dear” to Bayer, the panel launched with a discussion about the ever-evolving nature of the threat landscape and how industry is adapting in response.

“The cyber adversary is fast and loose,” Schneck said. “This is about turning networks into ecosystems, looking at how to make networks as agile as our adversaries are… Resilience means run while you’re under attack.”

According to Schneck, the days of the signature are long gone; industry and government today should focus on behaviors, rather than signatures, when trying to identify the cyber adversary. New threats are also emerging: in what is sometimes called a watering hole attack, hackers sit and wait for the right targets to visit a website, then launch an attack, Nowell said.

New strategies must be developed to deal with these threats and, according to Nowell, one solution may be to use a Trojan horse to attach to the malware.

“The sky continues to fall, it’s true… but the thing we have to realize is that we still have to fly on bad weather days,” Kinney said.

Part of the solution must come from information sharing. “Our president said ‘cyber’ in a State of the Union address,” Schneck exclaimed. The cyber issue has been elevated at the national level, which she believes is a good thing because it will likely result in more cyber funding for small and medium-sized businesses.

“Small and medium business may be building the next jet engine, but they may not be able to protect it,” Schneck said.

The problem with information sharing, according to Kinney, is that “it’s hard enough to share with your friends sometimes,” and protecting critical infrastructure quickly runs into private-sector concerns.

Round Three: Big Data Shoot-Out

The last shoot-out, on big data, was moderated by Wolf Tombe, CTO at U.S. Customs and Border Protection. Panelists included Dante Ricci, senior director of SAP’s public sector; Marshall Presser, field CTO at Pivotal; Scott Pearson, director of big data solutions at Brocade; and Mark Ryland, chief solutions architect on the Worldwide Public Sector team at Amazon Web Services.

To begin the discussion, panelists gave their definitions of big data. Ryland astutely described big data as the “variety, volume and velocity of data.”

“It’s an umbrella terminology,” Ricci said. “It’s really about taking the data and making sense of it.”

The panelists then shared their big-data strategies, all agreeing on one best practice: bringing business people together with IT people. According to Ricci, a big-data strategy should include getting a holistic view of the data, prioritizing it and then making sure the priorities are politically and otherwise feasible. “We can find out all kinds of things, but if it’s not feasible it’s not usable,” he said.

Identifying where the problem lies is also key, Presser said, because “people get caught up in ‘I don’t know what to do because I don’t know anything about it.’” For this issue, he suggested one-week big-data courses publicly available online as a starting point. Presser went on to suggest meetup.com big-data groups to help professionals figure out what big data is, what the tools are and what other people are doing about it.

Pearson also plans to contribute to the big-data community through education. “What I’m looking into is working in collaboration with partners to provide a big-data 101 orientation,” he said.
