US seeks to tap cloud computing for climate change, more
US researchers are studying cloud computing’s potential to provide a cost-effective and energy-efficient way to speed up discoveries and innovation in climate change, physics and biology.
The programme is being funded by the American Recovery and Reinvestment Act through the US Department of Energy (DOE).
Cloud computing provides on-demand access to a network of computing resources that can be easily provisioned as needed. A similar model could deliver economies of scale to help scientists solve problems while still providing computing capacity for individual tasks.
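The on-demand model described above can be sketched in a few lines of code. This is a purely illustrative toy (the class and method names are invented, not any real cloud API): nodes are allocated only when a task arrives and released when it finishes, so capacity follows demand rather than sitting idle.

```python
class CloudPool:
    """Toy elastic pool: grows when tasks arrive, shrinks when they finish."""

    def __init__(self):
        self.active_nodes = 0   # nodes currently provisioned
        self.peak_nodes = 0     # high-water mark of concurrent usage

    def provision(self, nodes_needed):
        self.active_nodes += nodes_needed
        self.peak_nodes = max(self.peak_nodes, self.active_nodes)

    def release(self, nodes):
        self.active_nodes -= nodes

    def run_task(self, nodes_needed, work):
        # Acquire capacity just for this task, then hand it back.
        self.provision(nodes_needed)
        try:
            return work()
        finally:
            self.release(nodes_needed)


pool = CloudPool()
results = [pool.run_task(n, lambda n=n: n * n) for n in (2, 4, 8)]
```

After the three tasks run, the pool is back to zero active nodes; the peak reflects only the largest single request, which is the efficiency the shared model aims for.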
“Cloud computing has the potential to accelerate discoveries and enhance collaborations in everything from optimising energy storage to analysing data from climate research, while conserving energy and lowering operational costs,” said Pete Beckman, director of the Argonne Leadership Computing Facility and project lead. “We know that the model works well for business applications, and we are working to make it equally effective for science.”
To test cloud computing for scientific capability, Argonne and the National Energy Research Scientific Computing Center (NERSC) will install similar mid-range computing hardware, but will offer different computing environments. The combined set of systems will create a cloud testbed that scientists can use for their computations while also testing the effectiveness of cloud computing for their particular research problems.
Because the project is exploratory, it has been named Magellan in honour of the Portuguese explorer who led the first expedition to sail around the globe and for whom the “clouds of Magellan” — two small galaxies in the southern sky — were named.
One of the goals of the Magellan project is to explore whether cloud computing can help meet the overwhelming demand for scientific computing. Although computation is an increasingly important tool for scientific discovery, and DOE operates some of the world’s most powerful supercomputers, not all research applications require such massive computing power. The number of scientists who would benefit from mid-range computing far exceeds the supply of available resources.
“As one of the world’s leading providers of computing resources to advance science, the Department of Energy has a vested interest in exploring new options for meeting the overwhelming demand for computing time,” said Michael Strayer, associate director of DOE’s Office of Advanced Scientific Computing Research. “Both (test centres) have proven track records in deploying innovative new systems and providing essential support services to the scientists who use those systems, so we think the results of this project will be quite valuable as we chart future courses.”
“Our goal is to get a global picture of Magellan’s workload so we can determine how much of DOE’s mid-range computing needs could and should run in a cloud environment and what hardware and software features are needed for science clouds,” said Kathy Yelick, director of the National Energy Research Scientific Computing Center. “NERSC’s users will play a key role in this evaluation as they will bring a very broad scientific workload into the equation and help us learn which features are important to the scientific community.”
The two test facilities will be linked by a groundbreaking 100 gigabit-per-second network developed by DOE’s ESnet (another DOE initiative funded by the Recovery Act). Such high bandwidth will enable rapid transfer of data between geographically dispersed clouds and let scientists use available computing resources regardless of location.
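To put the 100 gigabit-per-second figure in perspective, a back-of-the-envelope calculation (not from the article, and ignoring protocol overhead) shows what that bandwidth means for a large dataset:

```python
# Illustrative arithmetic only: time to move a 1-terabyte dataset
# over an ideal 100 Gb/s link, using decimal units (1 TB = 8e12 bits).
link_gbps = 100                        # link capacity in gigabits per second
dataset_tb = 1                         # dataset size in terabytes
bits = dataset_tb * 8 * 10**12         # total bits to transfer
seconds = bits / (link_gbps * 10**9)   # ideal transfer time in seconds
```

Under these idealised assumptions, a terabyte of climate data would cross the link in about 80 seconds — the kind of speed that makes geographically dispersed clouds practical to use together.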
“It is clear that cloud computing will have a leading role in future scientific discovery,” said Beckman. “In the end, we will know which scientific application domains demonstrate the best performance and what software and processes are necessary for those applications to take advantage of cloud services.”