Real-Time AI at the Edge: The Coming Age of ‘Connected Intelligence’
The world is undergoing digital transformation, and it's all coming together: sensor data, unstructured data, machine data, data produced by humans. Every image, item, person, activity, task and process will be pixelated and digitized, and it won't be long before it is all analyzed and utilized, increasingly in real time at the edge.
OK, this isn’t exactly new news. But at the GlobalFoundries Technology Conference in Santa Clara yesterday, CEO Sanjay Jha painted a particularly compelling picture of our digitized, real-time, AI future. For Jha, it adds up to a coming golden age for the semiconductor business, which is his business. For the rest of us - consumers, business users or IT strategists - it means a world transformed.
Right now, this transformation is at a relatively primitive stage (that’s always the case with technology, compared with what’s coming) in which things, in the form of data, remain mostly isolated, inaccessible, removed from real-time use. But soon enough data will be the motive force of life, the air we breathe.
Connecting various technology dots, Jha outlined a future of increasingly “connected intelligence” – among people, between people and machines, and among machines. As an indicator of the shrinking, more tightly connected digital world to come, he held up the example of the Facebook community, whose growth never ceases to amaze:
“There are 2 billion Facebook users,” Jha said. “There are now more people in the Facebook community than (there are people in) any country [China has 1.4 billion – ed.], it’s the largest community in the world. Only four years ago there were 5.3 hops between people on Facebook. That has come down to 3.7. Remember ‘Six Degrees of Separation’? That number is 3.7 degrees of separation for 2 billion people in the world. That's important to understand.”
Reducing degrees of data separation, key to creating this new digital world, will be accomplished only through immense effort and innovation over the next 10 to 20 years.
“The amount of data that's being generated is increasing thousands-fold over the next 10 years,” Jha said. “And I think that it's really difficult to see how that amount of data can be consumed by the network with feedback provided in real time. So what that leads to is intelligence at the edge; there are mission-critical elements that need to take real-time action.”
This in turn means that datacenter analytics capabilities will increase dramatically.
“You’ve already seen improvements Google has announced with their Tensor core, you're seeing they are now making APIs available so that all of us can have access to intelligently analyze our data,” said Jha. “That trend will continue, but the trend that really drives our business is real-time action, mission critical action at the edge, making sense of the context that the device finds itself in.”
Jha pointed out that Google Maps requires about 5 MB/hour of data, and “we think that it will go to about 25 MB/hour to support self-driving cars.”
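Those hourly figures translate into modest sustained bit rates; a rough conversion (an illustration only – the 25 MB/hour number is Jha's projection, not a measurement):

```python
# Convert the hourly data figures Jha cited into sustained bit rates.
MAPS_MB_PER_HOUR = 5     # Google Maps, per the talk
AV_MB_PER_HOUR = 25      # projected for self-driving cars

def sustained_kbps(mb_per_hour: float) -> float:
    """Convert megabytes per hour to a sustained rate in kilobits per second."""
    return mb_per_hour * 8 * 1000 / 3600  # MB -> kbit, hour -> seconds

maps_rate = sustained_kbps(MAPS_MB_PER_HOUR)  # ~11.1 kbit/s
av_rate = sustained_kbps(AV_MB_PER_HOUR)      # ~55.6 kbit/s
print(f"Maps ~{maps_rate:.1f} kbit/s; self-driving ~{av_rate:.1f} kbit/s")
```

The fivefold jump is the point: even navigation-class services multiply per-device demand, before any raw sensor data enters the picture.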
The compute and analytics demands of autonomous driving are astounding because life-and-death decisions must be made in a flash. At 70 miles per hour, he said, a car travels about 100 feet per second; and at that speed, braking distance is also about 100 feet.
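A back-of-envelope check of those stopping-time figures (a sketch assuming constant deceleration and the ~100 ft braking distance quoted in the talk):

```python
# Sanity-check the 70 mph stopping arithmetic from the talk.
FT_PER_MILE, S_PER_HOUR = 5280, 3600

speed_fps = 70 * FT_PER_MILE / S_PER_HOUR   # ~102.7 ft/s, i.e. roughly 100 ft/s
braking_distance_ft = 100.0                 # figure quoted in the talk

# Constant deceleration from v to 0 over distance d: average speed is v/2,
# so the stop takes t = 2 * d / v seconds.
stopping_time_s = 2 * braking_distance_ft / speed_fps   # ~1.95 s

# Every 10 ms of sensing/decision latency is distance covered before
# braking even begins:
latency_penalty_ft = speed_fps * 0.010      # ~1 ft lost per 10 ms

print(f"{speed_fps:.1f} ft/s; full stop in ~{stopping_time_s:.2f} s; "
      f"~{latency_penalty_ft:.2f} ft lost per 10 ms of latency")
```

The total window is on the order of two seconds, but the car covers about a foot for every 10 ms the perception pipeline spends deciding – which is why the latency budget is measured in milliseconds.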
"So you have literally milliseconds to take action so you can stop in time,” Jha said. “It is not easy for me to see how we can get that done, the analysis of radar/sonar/ as many as 16 cameras to decide in real time whether to take breaking action – while you are sending all the data back to the network and back.”An important enabling technology: adoption of 5th generation wireless mobile networks.
“I think 5G will disrupt wireless communication the same way data wireless communication disrupted voice wireless communication,” Jha said. “It's probably the biggest change we will see over the next period of time because it delivers seamless connectivity with low latency anywhere at gigabits-per-second rates, as opposed to today, where sustained we probably get somewhere close to a megabit per second.”
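To see what that roughly thousand-fold bandwidth gap means for edge sensor data, consider the time to ship a single uncompressed camera frame over each link (the frame size here is a hypothetical example, not a figure from the talk):

```python
# Time to transfer one raw 1080p RGB camera frame at two link rates:
# today's ~1 Mbit/s sustained vs. a 5G-class ~1 Gbit/s connection.
frame_bits = 1920 * 1080 * 24   # one uncompressed 1080p frame, ~49.8 Mbit

for label, rate_bps in [("~1 Mbit/s (today, sustained)", 1e6),
                        ("~1 Gbit/s (5G-class)", 1e9)]:
    ms = frame_bits / rate_bps * 1000   # transfer time in milliseconds
    print(f"{label}: {ms:,.1f} ms per frame")
```

At a sustained megabit per second, one frame takes nearly a minute to move – hence Jha's argument that a 16-camera vehicle cannot round-trip raw sensor data through the network and must act at the edge.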
Everything else in the ecosystem also will have to step up its game. Better “connected intelligence” at the edge, enabling the autonomous car to make executive decisions itself, means semiconductors of increasing capability along with an architectural change from traditional von Neumann to some form of distributed computing.
Jha sees a future in which greater investments will be made in semiconductor technology to support a host of AI and image recognition workloads, such as augmented reality and virtual reality.
“We have about 80-100 billion neurons in our brain, about 60 billion of that is dedicated to pattern recognition, understanding images,” he said. “I think in computing we are just scratching the surface of understanding these images. These 7 billion cameras (in use in the world) will keep generating images. Our understanding of the pixels, of the images, is very, very rudimentary. To be able to scan those images at the edge and then make sense of it in the cloud will be the biggest driver of silicon consumption going forward.”
He cited the increased venture money pouring into AI, such as the $1.7 invested in LIDAR, a surveying method that measures distance to objects, used in autonomous driving. In response to the growing demand for higher-level capabilities, Jha said, we are seeing the emergence of semiconductor startups, backed by significant venture investments, “that are really beginning to be disruptive.”
“Many of us used to discuss how venture capital had completely dried up,” he said. “Maybe two years ago, a VC funding person would say he could set up a software house for $50,000, ‘So why should I invest in a semiconductor company for $50 million?’
The answer: because semiconductors address $5 billion AI markets.
“That's the innovation that's beginning to happen and I think that the system houses recognize that and we're beginning to see the M&A activity in semiconductor increase as well."
Another big change for the semiconductor industry: FANG* companies buying direct.
“We’re seeing an important shift in the business model of the foundry business,” Jha said. “System companies (like Google, like Amazon, like Tesla, like Microsoft) are coming directly to foundries, they are working with EDA companies, IP companies and system design houses to get the IP. They want to control the hardware/software interface for the next generation of AI developments. They really want to control the architecture of both hardware and software, and it's been the scenario over the last 10 years.
“People who control and capture the hardware/software interface capture most of the value in the industry, and certainly Apple has proven that's where innovation occurs. I think more and more people are beginning to see this business model and I think we're seeing more system houses hiring semiconductor engineers and driving innovation.”
*Facebook, Amazon, Netflix, Google.