
New IBM Data Analytics Software Targets Workload Management, Spark Adoption 

IBM announced additions today to the infrastructure layer of its high performance data analytics software portfolio, including “cognitive features,” such as scheduling and resource management, and capabilities aimed at easing adoption of Spark.

The new products, collectively called IBM Spectrum Computing, take a software-defined infrastructure approach, in which the data center is managed, provisioned and automated by software regardless of the underlying compute, storage or network components. They are intended to reduce the complexity of performance-intensive data analytics and machine learning implementations, and they replace IBM’s earlier “Platform” software portfolio.

Bernard Spang, vice president of IBM Software Defined Infrastructure, told EnterpriseTech that the products are a “blending of what has been traditionally HPC and the need for that technology for enterprise computing, data analytics, Internet of Things,” and other data-intensive workloads.

He said the Spectrum Computing platform offers new resource-aware scheduling policies that will increase compute resource utilization and predictability across multiple workloads, control costs, and accelerate results for new-generation applications and open source frameworks such as Hadoop and Apache Spark. It will also assist with consolidating data center infrastructure and sharing resources across on-premises, cloud or hybrid environments.

IBM's Bernard Spang

“We believe this is the industry’s first aggregated software-defined infrastructure offering that includes both software-defined compute and software-defined storage for these new generation scale-out infrastructures,” Spang said. “By combining these capabilities together and building in the intelligence and the automation, we can support a very high-performance and cost efficient infrastructure that can run many of these new generation scale-out workloads.”

The products include:

  • Spectrum Conductor: Designed to speed complex data analytics, it works with cloud applications and open source frameworks, enabling applications to share resources while protecting and managing data throughout its lifecycle.
  • Spectrum Conductor with Spark: Aimed at simplifying adoption of Apache Spark; according to IBM, it delivers 60 percent faster analytical results (a brief Spark sketch follows this list).
  • Spectrum LSF: Workload management software with interfaces designed to facilitate research and design, and control costs through resource sharing and improved utilization.
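To give a sense of the workloads involved, here is a minimal PySpark sketch of the kind of analytics job a platform like Spectrum Conductor with Spark would schedule and host. The input path and column names are hypothetical, and nothing in the code is specific to IBM's tooling:

    # Minimal PySpark job: aggregate a (hypothetical) transactions table by region.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("region-rollup").getOrCreate()

    # transactions.csv and its columns ("region", "amount") are illustrative only.
    df = spark.read.csv("hdfs:///data/transactions.csv",
                        header=True, inferSchema=True)

    # Sum revenue per region and rank regions by total.
    totals = (df.groupBy("region")
                .agg(F.sum("amount").alias("total_amount"))
                .orderBy(F.desc("total_amount")))
    totals.show()

    spark.stop()

In a Conductor-style deployment, many such jobs from different tenants would run against a shared resource pool rather than each claiming a dedicated cluster.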

“The new IBM Spectrum LSF version adds important capabilities to one of the world's leading workload managers for HPC,” said Steve Conway, research vice president in IDC's High Performance Computing group, “a market we forecast will grow from $21 billion in 2015 to nearly $30 billion in 2020. The new and enhanced functions are designed to boost productivity and ROI on HPC systems, along with ease of use and mobile operation. But what's most impressive is the integration of LSF into a coherent IBM Spectrum product family that supports HPC, advanced analytics, cloud computing, and other activities that are increasingly important for existing HPC users and for the growing number of commercial firms that are adopting HPC for their most daunting big data challenges.”
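For readers unfamiliar with LSF, batch work is typically handed to the scheduler through its bsub submission command. The sketch below, kept in Python for consistency with the example above, simply wraps a bsub call; the queue name, core count, and executable path are hypothetical:

    import subprocess

    # Submit a hypothetical 4-core batch job to LSF via bsub, its standard
    # submission command. The scheduler picks where and when the job runs.
    result = subprocess.run(
        ["bsub",
         "-n", "4",            # request 4 cores
         "-q", "normal",       # target queue (hypothetical name)
         "-o", "job.%J.out",   # write output to a file tagged with the job ID
         "./analyze"],         # the workload itself (hypothetical)
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)  # e.g. "Job <12345> is submitted to queue <normal>."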

While the software runs on OpenPOWER servers, Spang said it also runs on x86 servers, including those from systems vendor Supermicro, which partnered with IBM to optimize its hardware for the new Spectrum Computing products.

“Working with IBM, we have integrated our latest server, storage and networking solutions with IBM Spectrum Conductor with Spark and IBM Spectrum LSF to accelerate deployment of scalable, high-performance analytics infrastructure,” said Charlie Wu, General Manager, Rack Solutions at Supermicro. “Our collaborative efforts enable extraction of more predictable results and insight across hybrid cloud environments.”

EnterpriseAI