Could the Human Brain Inspire High Efficiency Computing? 

Consider the grandest examples of energy-efficient computing. One might think of Google’s data centers, which are optimized to direct cooling air at the servers rather than over them. One might also consider newer facilities such as the Massachusetts Green High Performance Computing Center (MGHPCC) or the National Renewable Energy Laboratory’s (NREL) Energy Systems Integration Facility, which take advantage of cooling methods such as outside air or chilled liquid.

At just under 2,500 megaflops per watt, the Appro Beacon machine at the University of Tennessee sits atop the Green 500 list. Yet underlying all of this are the minds that designed these facilities. Put more directly, the brain may be the most efficient computing system of all, and thanks to new research from IBM, it could inspire and inform a new class of high-efficiency computing that would potentially supplant silicon chips.
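
As a back-of-the-envelope sketch of what that efficiency rating means in practice (the performance targets below are illustrative, not from the Green 500 list itself), a machine at Beacon-like efficiency would need roughly 400 kilowatts to sustain a petaflop, and roughly 400 megawatts to sustain an exaflop:

```python
# Back-of-the-envelope: what a Green 500 rating implies at scale.
# 2,499 megaflops per watt approximates Beacon's listed efficiency;
# the performance targets below are illustrative.

MFLOPS_PER_WATT = 2_499

def watts_needed(perf_mflops: float) -> float:
    """Watts required to sustain perf_mflops at Beacon-like efficiency."""
    return perf_mflops / MFLOPS_PER_WATT

print(f"1 petaflop: {watts_needed(1e9) / 1e3:,.0f} kW")   # ~400 kW
print(f"1 exaflop:  {watts_needed(1e12) / 1e6:,.0f} MW")  # ~400 MW
```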

“This is an alternative to a slowdown in Moore’s Law,” said IBM Research Fellow Stuart Parkin in discussing a new computing substance that would hypothetically avert the physical limitations of silicon. “Our inspiration is the brain and how it operates. It is full of liquids and ionic currents. We could build more brain-like devices.”

Some may take issue with calling a brain a computing system. After all, the logic under which computers operate is significantly different from that of the brain. Measuring a brain’s capacity in floating point operations per second (or floating point operations per watt) would be pointless since, for the most part, the human brain cannot perform a floating point operation.

However, in a similar but not entirely analogous way to how computers flip electronic bits to make decisions, a brain thinks by sending electrical signals across its neurons. While the actual computing power of the brain is unclear (some studies suggest it is 60 bits per second, although the researcher noted that is not necessarily the upper bound), the brain is thought to consume only about 20 watts of power on average.
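
For a sense of scale, and of why such comparisons are apples to oranges, the two figures above can be combined into a crude energy-per-bit estimate; a minimal sketch, taking the cited 60 bits-per-second and 20-watt numbers at face value:

```python
# Crude energy-per-bit figure from the numbers cited above.
# The 60 bits/s estimate concerns conscious information throughput,
# not raw neural computation, so treat this as illustration only.

BRAIN_POWER_WATTS = 20.0
BRAIN_BITS_PER_SECOND = 60.0

joules_per_bit = BRAIN_POWER_WATTS / BRAIN_BITS_PER_SECOND
print(f"{joules_per_bit:.2f} joules per bit")  # ~0.33 J/bit
```

Taken literally the number looks unimpressive next to silicon, which is precisely the point: the brain’s efficiency lies in what those 20 watts buy at the level of perception and memory, not in any bit-rate metric.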

One of the main reasons the brain requires so little power to operate is that once memories are created, additional power does not have to course through them for them to continue to exist. Parkin’s team used these same liquid ionic properties to construct a transistor.

“Unlike today’s transistors, the devices can be switched ‘on’ and ‘off’ permanently without the need for any power to maintain these states,” Parkin said. “This could be used to create highly energy-efficient memory and logic devices of the future.”

Going into more detail: in the device, a liquid ionic solution filters onto an underlying oxide surface. When voltage is applied to the solution on top, the oxide goes from insulating to conducting. Depending on the voltage and the configuration of the underlying oxide surface, bits are essentially flipped, creating an analog to electronic memory. When the voltage is removed, the memory remains intact.
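
To make that switching behavior concrete, here is a minimal toy model of a non-volatile, voltage-gated cell of the kind described; the class name and threshold are hypothetical, and the real device physics is far richer:

```python
# Toy model of the non-volatile, voltage-gated memory cell described
# above: a voltage pulse flips the oxide between insulating and
# conducting states, and the state persists with zero holding power.
# The threshold value and names here are hypothetical.

class IonicMemoryCell:
    SET_THRESHOLD_V = 1.0    # hypothetical pulse needed to flip state

    def __init__(self):
        self.conducting = False   # starts insulating ("off")

    def apply_voltage(self, volts: float) -> None:
        """A sufficiently large pulse toggles the oxide's state;
        the sign of the pulse selects conducting vs. insulating."""
        if abs(volts) >= self.SET_THRESHOLD_V:
            self.conducting = volts > 0

    def read(self) -> int:
        """Reading disturbs nothing; no refresh power is ever needed."""
        return 1 if self.conducting else 0

cell = IonicMemoryCell()
cell.apply_voltage(+1.5)      # write a 1
print(cell.read())            # -> 1
# ... power can now be removed entirely; the bit survives.
cell.apply_voltage(-1.5)      # write a 0
print(cell.read())            # -> 0
```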

“We are using tiny currents of ions of atoms generated by these electrical signals to change the state of matter of this oxide material,” Parkin said. “It is a means to build low-energy, highly efficient devices by turning on and off their conducting state. We turn this material into a metal and maintain it without any need to supply power.”

IBM has been on board the green computing train for a few years now, producing research last year on “zero-emission data centers.” A significant portion of green data center research focuses on innovative cooling technologies, such as using outside air in cool climates or using more efficient, non-conductive liquids to carry away excess heat. These ideas end up, either incidentally or purposefully, driving down power usage effectiveness (PUE) by reducing a facility’s non-IT power consumption.
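
PUE itself is a simple ratio, total facility power divided by IT equipment power, so every watt shaved off cooling and power delivery moves it toward the ideal of 1.0. A minimal illustration, with made-up facility numbers:

```python
# PUE = total facility power / IT equipment power (ideal value: 1.0).
# The figures below are illustrative, not from any specific facility.

def pue(it_kw: float, cooling_kw: float, other_kw: float = 0.0) -> float:
    """Power usage effectiveness for a facility."""
    return (it_kw + cooling_kw + other_kw) / it_kw

print(pue(it_kw=1000, cooling_kw=600))   # 1.6: heavy mechanical chilling
print(pue(it_kw=1000, cooling_kw=120))   # 1.12: free-air or liquid cooling
```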

Of course, producing efficient IT equipment is also important in the long run as high performance computers begin approaching exascale.

It is important to note that while this research makes highly efficient, brain-inspired computing hypothetically possible, it has yet to produce an actual computing system. Douglas Natelson, a physicist at Rice University, warns that the optimism might be misplaced until real applications are seen.

“This is a really nice piece of science but you have to wait and see how much impact it’s going to have on computing,” Natelson said. “Silicon has a lot of legs left, I think.” Indeed, while Moore’s Law is expected to grind to a halt eventually, that halt is likely still a couple of decades away.

Parkin noted that it will be two to four years before circuits based on this technology are possible. Still, from a green computing perspective, it is worth researching the capabilities and limitations of a substance that could offer power-free storage. Further, this structure could make it possible to tie together a computing system’s memory and logic subsystems, something generally thought impossible without a significant uptick in power.

“We want to build devices, architecturally, which are quite different from silicon-based devices,” Parkin argued. “Here, memory and logic are fully integrated.” While the technology is in a prototype testing stage, it is worth keeping an eye on.
