FPGAs, ASICs Seen Driving Machine Learning 

As machine learning makes steady progress in enterprise datacenters, industry observers are forecasting that a new wave of machine-learning chips will emerge in the coming year, challenging a market currently dominated by graphics processors.

In its annual technology forecast, Deloitte Global predicted that field-programmable gate arrays (FPGAs) and application-specific integrated circuits (ASICs) would make major inroads in 2018 for processing machine-learning workloads. It estimates that the two chip architectures will be used to accelerate more than 25 percent of machine-learning workloads in datacenters by the end of 2018.

That would represent a major shift in how machine-learning models are handled, Deloitte noted, since nearly all machine-learning jobs in large datacenters are currently run by GPUs and CPUs.

Deloitte projects enterprise datacenters will consume about 800,000 machine-learning chips next year. GPUs will continue to dominate, accounting for more than 60 percent of global demand, with FPGAs taking about one-quarter (200,000 units) and ASICs another 10 percent (100,000 units).
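
As a rough check on those figures (a sketch, not from the forecast: the GPU unit count below is implied by the stated totals rather than quoted), the breakdown works out as follows:

```python
# Back-of-the-envelope check of Deloitte's projected 2018 unit mix.
# The FPGA and ASIC counts are from the forecast; the GPU count is
# inferred from the 800,000-chip total (an assumption, not a quote).
total_chips = 800_000
fpga_units = 200_000   # "about one-quarter of demand"
asic_units = 100_000   # "another 10 percent" (12.5% of 800k, loosely rounded)
gpu_units = total_chips - fpga_units - asic_units

print(f"Implied GPU units: {gpu_units:,} "
      f"({gpu_units / total_chips:.1%} of the total)")
# Implied GPU units: 500,000 (62.5% of the total), consistent with
# Deloitte's "more than 60 percent" figure.
```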

FPGAs and ASICs "should increase dramatically the use of [machine learning], enabling applications to consume less power and at the same time become more responsive, flexible and capable, which is likely to expand the addressable market," Deloitte said in a global technology forecast released this week.

The transition to FPGAs and ASICs would be significant, Deloitte stressed, since nearly all machine-learning applications based on the "artificial neural network" approach currently rely on a combination of GPUs and CPUs.

The market analyst estimates as many as 200,000 GPUs were sold in 2016 to run machine-learning jobs. Deloitte did not estimate the value of the GPU machine-learning market, but did cite other forecasts pegging the machine-learning acceleration market at between $4.5 billion and $9.1 billion by 2022.

The survey breaks machine learning based on artificial neural networks down into two primary tasks: training and inference. The earliest adopters of FPGA and ASIC accelerators for machine learning used them for inference; going forward, adopters are expected to use a mix of chips for both inference and training of machine-learning models.
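
To make the distinction concrete, here is a minimal sketch (not from the article) of the two phases on a toy model: training loops over the data many times to fit the weights, while inference is a single forward pass with the weights frozen, which is part of why accelerators were adopted for inference first.

```python
# Toy illustration of training vs. inference on a tiny logistic-
# regression model; real datacenter workloads run far larger
# neural networks, but the work divides into the same two phases.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 2-D points labeled by which side of a line they fall on.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# --- Training: repeated forward and backward passes update the weights. ---
w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # forward pass
    grad_w = X.T @ (p - y) / len(y)         # backward pass: gradients
    grad_b = np.mean(p - y)
    w -= lr * grad_w                        # weight update
    b -= lr * grad_b

# --- Inference: a single forward pass with the trained weights frozen. ---
x_new = np.array([0.5, -0.2])
prob = 1.0 / (1.0 + np.exp(-(x_new @ w + b)))
print(f"P(class 1 | x_new) = {prob:.3f}")
```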

Late last year, Amazon Web Services (NASDAQ: AMZN) announced a new FPGA instance along with a separate GPU service. Deloitte noted that Microsoft (NASDAQ: MSFT) also is using FPGAs for inference as part of its hosted machine learning services on its Azure public cloud.

Earlier this year, Chinese online retailer Alibaba (NYSE: BABA) unveiled a partnership with Intel to demonstrate the chipmaker's Xeon-FPGA platform for accelerating cloud-based applications. Intel (NASDAQ: INTC) also has been promoting FPGAs as a way of tuning cloud platforms to accelerate diverse workloads running in the datacenter, including machine learning, video and data encryption.

Meanwhile, other players such as Google (NASDAQ: GOOGL) are promoting ASICs for machine learning, including its Tensor Processing Unit (TPU), a chip tailored to Google's TensorFlow machine learning software.

TPUs could make a dent in large server farms running power-constrained applications such as inference, Deloitte said. Second-generation TPUs from Google may handle model training as well, the market watcher said, adding that it remains unclear "whether the relative performance advantage of TPUs over GPUs for certain inference tasks will be comparable for [machine learning model] training tasks."

The forecast also notes that the transition to a combination of CPUs and GPUs led to an explosion of machine learning applications. "If the various FPGA and ASIC solutions offer similar order-of-magnitude improvements in processing speed, efficiency [and] price," Deloitte concluded, "a similar explosion in utility and adoption seems probable."

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).
