Advanced Computing in the Age of AI | Friday, March 29, 2024

IO Runs the Datacenter Like a System 

Building new datacenters is difficult. They are lengthy, complex IT projects with massive up-front costs, and they usually hold a few surprises. Converging server, storage, and networking infrastructure and thinking at the rack level makes building datacenters easier. What if you could take the converged infrastructure model and scale it to the entire datacenter?

One company, IO Data Centers, believes that this is the next logical step, offering 460-square-foot, factory-produced datacenter modules that can be ordered, shipped, and brought online in a couple of months, as opposed to the many months to years it takes to build a conventional brick-and-mortar datacenter. This is a model that IO calls Data Center 2.0.

“There’s great consensus that information infrastructure needs to be converged,” Patrick Flynn of IO told Enterprise Tech in an interview this week. “We need to have abstraction layers to optimize the allocation of applications across geographies, across datacenters – some sort of optimal matching scenario. There is consensus that that’s needed, but most everybody else is attacking that from the top of the stack downward.”

While Flynn says software companies and IT vendors have tried pushing down through the software layers and into the datacenter layer, he says that IO has approached the problem differently, by working from the bottom up. At the core of its approach is a bit of software dubbed IO.OS, which IO bills as the world’s first datacenter operating system.

“Datacenters weren’t talking to each other, visibility wasn’t what it needed to be, so we created a purpose-built software operating system platform that consolidates all of the data, and provides a single view,” explained Flynn. What’s more, he says, is that this approach has given IO a platform on which more can be built.

To that end, IO has launched a new division, called IO Applied Intelligence, tasked with taking advantage of the massive amounts of data passing through IO.OS and turning it into products and services that benefit its own organization as well as its customers.

“Applied Intelligence first and foremost is a team of data scientists and engineers based here in San Francisco,” said Flynn, who is the group leader of the new division. “We see ourselves as a bit of an internal R&D lab/skunkworks project with a broad mandate to push product performance from the data that we’ve captured.”

With that mandate, Applied Intelligence will work to streamline operations, increase customer engagement (both operational engagement and added products and services), and steer the direction of IO's internal hardware and software engineering for its base product, the IO datacenter module.

One customer poised to benefit from the new division is Goldman Sachs, which has bought into the IO modular datacenter approach and has named IO its modular datacenter provider of choice. Recently, Goldman Sachs installed a new datacenter module in Singapore, a project which Flynn said took roughly 120 days. “Speed to market is a big advantage,” says Flynn. “Being able to identify that Singapore was on their roadmap and to be able to get there quickly is a big value proposition.”

With the addition of the Applied Intelligence division, IO believes it will be able to drive more value for Goldman Sachs. “Once you have a standardized hardware platform, the data that you’re capturing is meaningful,” says Flynn. “You can benchmark, you can examine outliers, you can model the architecture of the system. You can do it for one of those 460-square-foot increments, and get it in very high resolution, and you can replicate that analysis across hundreds of modules in place.”

One of IO Applied Intelligence’s key technology partners is McLaren Applied Technologies, which was formed to handle the immense big data needs of the McLaren Formula One racing team. “One of our cars is covered in over 100 sensors,” Ron Dennis, executive chairman of the McLaren Group, told Wired magazine this summer. “In a Grand Prix they harvest 6.5 billion data points to try and give some sort of visualization. So when you want to compare two races, that’s 13 billion data points.”

McLaren has developed an analytics application that ingests this stream of real-time F1 racecar data and transforms it into predictive analytics using SAP’s HANA in-memory database. “If you took the world’s population times two and you asked them a question all at the same time and wanted that answer very quick, HANA delivers the answer to the question in 100 milliseconds. That’s processing power that’s so powerful it can predict the future,” McLaren’s Dennis said.

“One of the first things we did with them was ask them to really zoom in at the most granular level of customer usage,” said Flynn, describing how IO is putting McLaren’s analytics engine to work. “We examined the power behavior at an individual branch within one of our data modules.” Looking at the fine details of its systems and extrapolating over a large assortment of modules, Flynn says that IO has been able to use the analysis to build machine learning algorithms to model and predict the behavior of customer IT usage at the most granular level.

Using the data to predictively operate the datacenter, Flynn says IO can help its customers be more intelligent about how they procure capacity. “A customer might get an alert two months ahead of time warning them that there is a 10 percent chance that, based on our forecast of their IT usage, they will breach the available power capacity of the branch that they’re on,” he explains, adding that the alarm time frames can be individually set to give IO’s customers enough time to make the requisite adjustments.
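IO has not published how IO.OS produces these forecasts, but the idea Flynn describes can be sketched with a deliberately simple model: fit a trend to a branch's historical power draw, project it forward over a configurable horizon, and raise an alert when the probability of crossing the branch's capacity exceeds a threshold. Everything below (the linear-trend-plus-Gaussian-noise model, the function name, the 10 percent threshold applied to a 60-day horizon) is an illustrative assumption, not IO.OS's actual method.

```python
from math import erf, sqrt
from statistics import mean, stdev

def forecast_breach_probability(history_kw, capacity_kw, horizon_days):
    """Estimate the probability that a branch's power draw exceeds
    capacity_kw within horizon_days, using an ordinary least-squares
    linear trend plus Gaussian residual noise. Illustrative only."""
    n = len(history_kw)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(history_kw)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, history_kw)) \
        / sum((x - x_bar) ** 2 for x in xs)
    intercept = y_bar - slope * x_bar
    residuals = [y - (intercept + slope * x) for x, y in zip(xs, history_kw)]
    sigma = stdev(residuals) if n > 2 else 0.0
    projected = intercept + slope * (n - 1 + horizon_days)
    if sigma == 0:
        return 1.0 if projected >= capacity_kw else 0.0
    # Probability that the projected draw (plus noise) lands above the
    # capacity line, via the normal CDF expressed with math.erf.
    z = (capacity_kw - projected) / sigma
    return 0.5 * (1 - erf(z / sqrt(2)))

# Example: 90 days of daily readings growing steadily toward a 110 kW limit,
# with a small alternating wobble standing in for measurement noise.
history = [60 + 0.5 * d + ((-1) ** d) * 2 for d in range(90)]
p = forecast_breach_probability(history, capacity_kw=110, horizon_days=60)
if p >= 0.10:
    print(f"ALERT: {p:.0%} chance of breaching branch capacity within 60 days")
```

A production system would use far richer models than a straight line, but the alert logic itself (forecast, probability, customer-tunable threshold and horizon) follows the shape Flynn describes.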

Benefiting from the system visibility IO.OS provides, he adds, the data being collected gives IO more ways to advise its customers. That includes telling the customer in the scenario above that an adjacent branch has very little utilization and can be reconfigured to give them an additional six months of headroom at the current capacity. “That sort of capacity recommendation at the very granular level, we see as a foundational building block to understanding how customers use their IT, and I think that will play a role in not just that product feature, but many other product features going forward,” says Flynn.
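The reallocation recommendation Flynn sketches can be illustrated in a few lines: given per-branch draw and capacity figures, find the least-utilized neighboring branch and convert its spare kilowatts into months of runway at the customer's current growth rate. The data model, function name, and figures here are all hypothetical (and "adjacency" is simplified to "any other branch"); this is not the IO.OS API.

```python
def recommend_reallocation(branches, primary, growth_kw_per_month):
    """Given per-branch {name: (draw_kw, capacity_kw)}, suggest the
    least-utilized branch other than `primary` and estimate how many
    extra months of headroom reallocating there would buy at the
    current growth rate. Purely illustrative."""
    candidates = {n: dc for n, dc in branches.items() if n != primary}
    # Pick the branch with the lowest utilization ratio (draw / capacity).
    target = min(candidates, key=lambda n: candidates[n][0] / candidates[n][1])
    t_draw, t_cap = candidates[target]
    spare_kw = t_cap - t_draw
    extra = spare_kw / growth_kw_per_month if growth_kw_per_month > 0 else float("inf")
    return target, round(extra, 1)

branches = {
    "branch-A": (95.0, 100.0),   # the customer's branch, nearly full
    "branch-B": (20.0, 100.0),   # lightly used neighbor
    "branch-C": (70.0, 100.0),
}
target, months = recommend_reallocation(branches, "branch-A",
                                        growth_kw_per_month=12.0)
print(f"Reconfigure onto {target}: ~{months} extra months of headroom")
```

With these invented numbers, the lightly used branch-B buys roughly the half-year of additional headroom mentioned in the scenario above.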

There is a security angle to IO.OS as well. “If we can start building a quantified representation of what security is, then we might enable routing applications and workloads based on their security requirements,” says Flynn. And, of course, IO.OS is instrumental in making datacenters more efficient. “It’s also quantifying energy efficiency, and what is eco-friendliness, what is reliability, what is uptime – quantifying those parameters and then assessing what available capacities we have around in all of its different varying parameters will then enable that converged, cohesive, efficient digital infrastructure spanning the entire globe.”
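Flynn's idea of "routing applications and workloads based on their security requirements" presupposes quantified scores per site. As a toy illustration under that assumption, a router could admit only sites whose scores meet every requirement of a workload and then break ties on another quantified parameter such as energy efficiency. The metric names, 0-to-1 scale, and site data below are invented for the sketch.

```python
def route_workload(workload_req, sites):
    """Return the name of a site whose quantified scores meet every
    requirement of the workload (security, uptime, ...), preferring the
    most energy-efficient eligible site. Toy illustration only; the
    metrics and data model are invented, not IO.OS's."""
    eligible = [
        name for name, scores in sites.items()
        if all(scores.get(metric, 0) >= floor
               for metric, floor in workload_req.items())
    ]
    if not eligible:
        return None
    return max(eligible, key=lambda name: sites[name]["efficiency"])

sites = {
    "phoenix":   {"security": 0.9, "efficiency": 0.7, "uptime": 0.999},
    "singapore": {"security": 0.8, "efficiency": 0.9, "uptime": 0.999},
}
# Only phoenix clears the 0.85 security floor, so it wins despite
# singapore's better efficiency score.
choice = route_workload({"security": 0.85, "uptime": 0.99}, sites)
print(choice)
```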

That, says Flynn, is what is in IO’s sights. “That’s where we’re trying to go, and we think we’ve got a unique position to get there because we’re starting from the foundation and working upwards.”

EnterpriseAI