NSF awards $20 million for two experimental cloud projects

Located at the University of Utah, Apt is a precursor to CloudLab and will be built around the hardware cluster. (Credit: Chris Coleman, School of Computing, University of Utah)

The National Science Foundation awarded two $10 million contracts last week to create cloud computing testbeds for academics to explore new cloud architectures and work with novel applications inside of them.

The projects, called Chameleon and CloudLab, are part of the NSFCloud program within the agency’s Directorate for Computer and Information Science and Engineering and are meant to be complementary to typically industry-driven cloud development, similar to NSF’s involvement during the genesis of the Internet.

“Just as NSFNet laid some of the foundations for the current Internet, we expect that the NSFCloud program will revolutionize the science and engineering for cloud computing,” said Suzi Iacono, acting head of NSF’s Directorate for Computer and Information Science and Engineering, in a statement. “We are proud to announce support for these two new projects, which build upon existing NSF investments in the Global Environment for Network Innovations (GENI) testbed and promise to provide unique and compelling research opportunities that would otherwise not be available to the academic community.”

Chameleon, as the name would suggest, will test adaptability to a range of experimental needs. The large-scale, reconfigurable environment will consist of hundreds of nodes and five petabytes of storage configured into custom clouds to test for efficiency and usability in a variety of operations, ranging from “machine learning and adaptive operating systems to climate simulations and flood prediction,” a release said. Chameleon will be co-located at the University of Chicago and The University of Texas at Austin.

Kate Keahey, a scientist at the University of Chicago and principal investigator for Chameleon, said the testbed will allow scientists to experiment “on a large scale, critical for big data and big compute research. But we also want to go beyond the facility and create a community where researchers will be able to discuss new ideas, share solutions that others can build on or contribute traces and workloads representative of real life cloud usage.”

At the University of Utah, Clemson University and the University of Wisconsin, researchers will develop CloudLab, another large-scale infrastructure on which they will construct several types of clouds.

“Today’s clouds are designed with a specific set of technologies ‘baked in’, meaning some kinds of applications work well in the cloud, and some don’t,” Robert Ricci, a research assistant professor of computer science at the University of Utah and an investigator of CloudLab, said in a statement. “CloudLab will be a facility where researchers can build their own clouds and experiment with new ideas with complete control, visibility and scientific fidelity. CloudLab will help researchers develop clouds that enable new applications with direct benefit to the public in areas of national priority such as real-time disaster response or the security of private data like medical records.”

In all, the NSFCloud projects aim to make broad advances in cloud computing. After the first development phase of NSFCloud, each project will become fully staffed and operational, launching as an academic testbed for cloud exploration. Both projects will roll out in stages, with operations beginning this fall and continuing through late 2016.
