Inside Advanced Scale Challenges|Monday, July 16, 2018

Qumulo Launches Nearline Archive Series, Aims to Breathe New Life into Archival Data 


On the heels of an oversubscribed $93 million Series D funding round announced two weeks ago, high-performance file storage specialist Qumulo today launched a new product series that it says breathes new life into what is typically cold, hard-to-access archival data, making historical data more visible, performant and usable for operational and high-performance analytics workloads.

Qumulo’s Nearline Archive Series scale-out storage combines Qumulo’s File Fabric (QF2) with cloud data center-standard server hardware using high-density 12TB drives and Intel Xeon SoC chipsets. A Nearline Series server holds 144 terabytes of data in a 1U “pizza box” of rack space, according to Qumulo, and is capable of 6GB/second of reads and 3GB/second of writes per usable petabyte of data stored in the system. To scale, Qumulo product manager Jason Sturgeon told us, customers purchase another Nearline server and connect the nodes via their 10GbE network ports. The result, according to Qumulo, is scalability to billions of files and 20-40 percent improved storage efficiency over existing legacy storage for large and small files.

Qumulo said the Nearline Series overcomes the limitations of existing archive storage technologies, which are characterized by low-volume, proprietary hardware that holds customers back from innovations in advanced server components and leaves archived data offline and relatively inaccessible. The need, according to the company, is to enliven what would otherwise be cold data.

“Because of all the advancements in computing and the ease with which people can deploy lots and lots of computing resources and point it at problems in business today, all data is active now in some way,” Qumulo CMO Peter Zaballos told EnterpriseTech. “We used to put data in an archive for safety. But now you need to put it someplace where you can access it really easily. The idea that you’re putting data where it’s not going to be looked at for a long time, that’s the obsolete part. You need to have all of your data accessible.”

Zaballos said a key to Qumulo’s archived data strategy is the QF2 file fabric’s ability to leverage metadata. “One of the innovations we brought to the market was a file system that collected the metadata when it wrote the data to the storage location, so on the way in we collected the metadata,” he said. “So if you want to find out something about what you’re storing, we just ask the metadata and you get an answer very quickly. We don’t have to check every location. In an age (when customers have) billions of files, this is a total game changer. We can give you literally instant visibility into the state of storage, where it is, who’s using what… We give you (archival data) performance like it’s part of your main active data.”
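The idea Zaballos describes can be illustrated with a minimal sketch: if a file system maintains aggregate metadata (total bytes, file counts) at every directory on the write path, queries about the state of storage become constant-time reads of a cached value rather than a walk over billions of files. The class and field names below are illustrative assumptions, not Qumulo's actual implementation:

```python
# Illustrative sketch (NOT Qumulo's actual code): a directory tree that
# updates aggregate metadata on every write -- "collecting the metadata
# on the way in" -- so queries read a cached value instead of scanning.

class DirNode:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = {}
        self.total_bytes = 0   # aggregate: bytes stored under this subtree
        self.file_count = 0    # aggregate: files stored under this subtree

    def mkdir(self, name):
        child = DirNode(name, parent=self)
        self.children[name] = child
        return child

    def write_file(self, size):
        # Propagate the aggregates up to the root at write time, so no
        # directory walk is ever needed at query time.
        node = self
        while node is not None:
            node.total_bytes += size
            node.file_count += 1
            node = node.parent

root = DirNode("/")
projects = root.mkdir("projects")
projects.write_file(4096)
projects.mkdir("archive").write_file(1_000_000)

# "How much data is under /?" is answered from the cached aggregate.
print(root.total_bytes, root.file_count)  # 1004096 2
```

The trade-off is a small amount of extra work on every write in exchange for instant answers to questions like "where is my capacity going?" over arbitrarily large trees.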

Qumulo said QF2 runs on premises and in the cloud and uses continuous replication to create a data fabric that interconnects QF2 clusters, whether those clusters are all-flash, high-performance hybrid SSD/HDD, nearline archive or running on EC2 instances in AWS.

Earlier this month, Qumulo announced it had closed a $93 million Series D funding round led by BlackRock Private Equity Partners that included Goldman Sachs and Western Digital along with existing investors Highland Capital Partners, Kleiner Perkins Caufield & Byers, Madrona Venture Group and Valhalla Partners.
