Building Better Public/Private HPC Partnerships Is Focus of Pending NCSA Study
The most powerful computer systems in the world reside in the public sector – principally, at federally funded supercomputer centers. Yet some of the most demanding workload requirements reside in the private sector. Bridging the gap between the two – finding better ways to get advanced scale computing into the hands of commercial organizations – is the focus of a $300,000 grant awarded by the National Science Foundation (NSF) to the National Center for Supercomputing Applications (NCSA), based at the University of Illinois at Urbana-Champaign, a facility that, in part, provides HPC resources to business.
The project’s goal is to evaluate current HPC-centered partnership programs, determine best practices and create a report that serves as a guidepost for other partnerships. The project comes on the heels of the National Strategic Computing Initiative (NSCI), announced by President Obama last year to support U.S. competitive leadership in HPC.
“The premise here is: What can we replicate?” Merle Giles, director of private sector programs and economic impact at NCSA, told EnterpriseTech. “It’s rooted in this notion in the NSCI strategy that there ought to be some economic impact as we do HPC in the nation. And we don’t have a good sense as to what industry wants. What is the impact for them, do we understand that, and what are those activities?”
Last week, EnterpriseTech reported on the ACCEL program at Oak Ridge National Laboratory that grants commercial organizations compute time on ORNL’s Titan, the world’s second most powerful supercomputer, for extreme-scale simulation, modeling and data analytics workloads. Similar programs exist at other HPC centers, such as Argonne National Laboratory’s Argonne Leadership Computing Facility.
Now NCSA and industry analyst firm IDC will examine the elements required for successful HPC partnerships. Giles is something of a scholar on the topic of public-private HPC partnerships, having co-authored a book on the subject.
“NSF has a high interest in understanding how industry engages with HPC centers,” Giles said. “So our funding is to tell the story and gather data about these most effective and least effective—maybe both ends of the spectrum—to gather stories around effective practices with industry’s engagement with HPC centers.”
Giles said the study will take about six months to complete and will be global in reach. He cited the Fortissimo Program, a partnership coordinated by the University of Edinburgh and funded by the European Commission, involving manufacturing, application development, IT solutions and HPC cloud services companies from 14 countries; and Los Altos, CA-based Ubercloud, which provides cloud-based HPC resources and has a membership of more than 2,500 companies and individuals, among them 60+ cloud resource providers, 80+ software providers and several hundred consulting firms.
Giles cited industrial users’ need for supercomputing capabilities, given the diminishing amount of basic research conducted within commercial organizations.
“We don’t have large industrial research centers like we had once upon a time,” he said. “Few of the big companies still have research centers, fewer than there once were. Companies today do more applied research than basic research. Universities do the basic research. Corporate practice in basic research has changed rather dramatically, and now that HPC is so mainstream in several industry sectors, the question is: how do we keep it going? Are there things that could be done that would really touch both sides, both the research and the industrial communities, that would be mutually beneficial?”