
Big Data Tools Target Datacenter Performance 

Among the growing number of uses for big data analysis is the ability to troubleshoot datacenter performance, along with narrower concerns like storage.

The big data vendor CloudPhysics comes to the VMworld event this week in San Francisco with a dynamic benchmarking tool designed to monitor the overall health of virtualized datacenters on a daily basis. The Silicon Valley company also previewed a "workload shapes" technology intended to provide VMware administrators with a "visual shorthand" for spotting and resolving storage problems.

The CloudPhysics software-as-a-service platform, called "Global Insights," is billed as allowing VMware users to continuously benchmark their virtual infrastructure against global metrics. It works with a "Daily Insights" tracker to spot operational problems across virtualized datacenters.

The platform collects and analyzes a daily stream of configuration, performance, failure and event data from its user base, with an estimated 50 trillion samples collected so far. The big data on virtual datacenter performance is crunched using a proprietary datacenter simulation and resource management techniques, CloudPhysics said Aug. 25.

In a company blog post, CloudPhysics co-founder John Blumenthal said the company has collected "trillions of operational metrics objects" over the past two years, representing a huge dataset drawn from thousands of vSphere datacenters. It then applied data analytics techniques to generate new types of analysis and simulations.

The datacenter intelligence is delivered to customers via a set of algorithms, among them global benchmarks and daily updates. The former compares datacenter metrics against a global dataset, providing benchmarks for evaluating the performance and health of customers' datacenters. The daily tracker aggregates findings and offers recommendations for improving overall datacenter performance, the company claimed.
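CloudPhysics has not disclosed how its benchmarking algorithms work, but the core idea of scoring one datacenter against a global dataset can be sketched as a percentile ranking. Everything below, from the metric name to the sample values, is a hypothetical illustration rather than the company's method:

```python
from bisect import bisect_left

def percentile_rank(global_samples, value):
    """Return the percentile (0-100) of `value` within a global sample set."""
    ordered = sorted(global_samples)
    return 100.0 * bisect_left(ordered, value) / len(ordered)

# Hypothetical global dataset: host CPU ready time (ms) observed across many datacenters.
global_cpu_ready_ms = [12, 18, 25, 31, 40, 55, 70, 95, 120, 210]

# A customer's daily measurement, benchmarked against the global distribution.
ours = 95
print(f"CPU ready time of {ours} ms is at the "
      f"{percentile_rank(global_cpu_ready_ms, ours):.0f}th percentile globally")
```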

The troubleshooting utility is designed to provide visibility into, for example, disk I/O contention at the data store level. A timeline visually correlates patterns among data stores and virtual machines, isolating "hotspots" via an algorithm used to detect data store contention.
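The contention-detection algorithm itself is proprietary, but a minimal sketch of the underlying idea, flagging intervals where several VMs on the same data store see elevated latency at once, might look like this (all data and thresholds here are hypothetical):

```python
from collections import defaultdict

# Hypothetical per-interval samples: (minute, datastore, vm, avg_latency_ms)
samples = [
    (0, "ds1", "vmA", 4), (0, "ds1", "vmB", 5),
    (1, "ds1", "vmA", 30), (1, "ds1", "vmB", 28), (1, "ds1", "vmC", 35),
    (1, "ds2", "vmD", 6),
    (2, "ds1", "vmA", 5),
]

def find_hotspots(samples, latency_ms=20, min_vms=2):
    """Flag (minute, datastore) intervals where multiple VMs see high latency
    together, a crude proxy for data store I/O contention."""
    buckets = defaultdict(list)
    for minute, ds, vm, lat in samples:
        buckets[(minute, ds)].append(lat)
    return [key for key, lats in buckets.items()
            if len(lats) >= min_vms and sum(lats) / len(lats) > latency_ms]

print(find_hotspots(samples))  # -> [(1, 'ds1')]
```

Plotting the flagged intervals on a timeline, per data store, would give the kind of visual correlation among data stores and virtual machines the article describes.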

Meanwhile, the company's new "workload shapes" technology is intended to analyze a storage workload's spatial locality (sequential versus random access), dominant I/O block sizes and latency profiles. It then visualizes the "workload shapes" so IT administrators can determine which shapes are "normal" for the virtual environment while identifying performance hotspots.
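Again, the company's implementation is not public, but a rough sketch of reducing an I/O trace to a "shape" (spatial locality, dominant block size and a latency profile) could look like the following; the trace values are invented for illustration:

```python
from collections import Counter
from statistics import median

# Hypothetical I/O trace: (byte_offset, block_size_bytes, latency_ms)
trace = [
    (0,      4096, 0.4), (4096,   4096, 0.5), (8192,  4096, 0.4),
    (900000, 8192, 2.1), (12288,  4096, 0.6), (16384, 4096, 9.8),
]

def workload_shape(trace):
    """Summarize an I/O trace into a crude 'shape': spatial locality,
    dominant block size, and a latency profile."""
    # An I/O is sequential if it starts where the previous one ended.
    sequential = sum(
        1 for (o1, s1, _), (o2, _, _) in zip(trace, trace[1:]) if o2 == o1 + s1
    )
    sizes = Counter(size for _, size, _ in trace)
    lats = sorted(lat for _, _, lat in trace)
    return {
        "sequential_fraction": sequential / max(len(trace) - 1, 1),
        "dominant_block_size": sizes.most_common(1)[0][0],
        "median_latency_ms": median(lats),
        "max_latency_ms": lats[-1],
    }

print(workload_shape(trace))
```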

CloudPhysics claims the technology is capable of analyzing every I/O to ensure that, unlike other tools, anomalies "don't get lost in the averages."
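The point about averages is easy to demonstrate: a small population of very slow I/Os barely moves the mean, so a tool that reports only averages would miss them. A toy illustration (the latency figures are made up):

```python
latencies_ms = [0.5] * 990 + [50.0] * 10  # 1% of I/Os are 100x slower

mean = sum(latencies_ms) / len(latencies_ms)
outliers = [lat for lat in latencies_ms if lat > 10 * mean]

print(f"mean latency: {mean:.2f} ms")                       # ~1 ms: looks healthy
print(f"slow I/Os hidden in the average: {len(outliers)}")  # 10
```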

Blumenthal called the "workload shapes" approach an example of "little data" in that it is "ultra-narrow in scope" and the opposite of large, aggregated queries. He claimed that understanding the storage demand "shape" helps solve workload issues such as balancing performance troubleshooting and capacity planning. He added it could also help in deciding when, for example, to introduce solid-state drive or flash storage into datacenter infrastructure.
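As a purely hypothetical illustration of that last point, a rule of thumb for flash candidacy might key off the randomness and block size captured in a workload shape like the one sketched above:

```python
def ssd_candidate(shape, random_threshold=0.7, small_block=8192):
    """Hypothetical rule of thumb: mostly random, small-block workloads
    tend to gain the most from moving to SSD/flash tiers."""
    randomness = 1.0 - shape["sequential_fraction"]
    return randomness >= random_threshold and shape["dominant_block_size"] <= small_block

shape = {"sequential_fraction": 0.1, "dominant_block_size": 4096}
print(ssd_candidate(shape))  # True: a random, small-block workload
```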

CloudPhysics said all new features are now available for VMware vSphere environments on an annual subscription basis. A free version includes a new interactive console and benchmarking metrics.

About the author: George Leopold

George Leopold has written about science and technology for more than 30 years, focusing on electronics and aerospace technology. He previously served as executive editor of Electronic Engineering Times. Leopold is the author of "Calculated Risk: The Supersonic Life and Times of Gus Grissom" (Purdue University Press, 2016).
