Advanced Computing in the Age of AI | Saturday, April 20, 2024

Best Practices for Intelligence-Based Storage Management 

Data is the most valuable asset for any company in any industry today, yet data management is often reactive, underfunded, and, frankly, haphazard. File data tends to be the fastest-growing and least-managed category of data, yet it contains tremendously valuable information about customers, products, and business processes. Firms often let data pile up in whatever repositories are available and wait until the last moment to address migrations forced by end-of-life refreshes. Ongoing optimization takes a back seat to simply keeping the lights on, and with high rates of data growth and flat budgets, staff augmentation to address the problem is unlikely.

As organizations recognize the value of their data and become aware of new ways to derive value from it, it makes good sense to shift from unmanaged chaos to intelligence-based storage management. It's no easy task to take on more technology management objectives in any category, but the risk of not doing so grows as data sets increase in size and competitors gain more resources to turn data into advantage. Companies interested in moving toward intelligence-based storage management should consider the following best practice recommendations:

  • Plan for refresh in advance:
    In a large storage environment, systems reach the end of their usable life throughout the year. Don't let these potentially disruptive events come as a surprise. Maintain a calendar that tracks upcoming migration efforts, and plan for each one well ahead of time. Ideally, put a permanent solution in place that your team knows how to use, installed and ready to go well before each individual migration event.

    Andrew Reichman, Data Dynamics


  • Treat migrations as enterprise IT projects:
    Organizations take application development projects seriously, and bring in qualified experts and advanced solutions to support them. But enterprises often conduct data migrations on an ad hoc basis with free tools and homegrown scripts. When data moves, it exposes the company to risk of data loss, leakage, or inaccessibility. Recognize the importance of moving data systematically and avoid negative consequences by investing in a comprehensive approach to migration.
  • Use discovery to enable good decisions:
    Effective storage management starts with a good inventory of your data. If you don't know how much data you have, who uses it, how important it is, and where it's stored, you can't make good decisions about where to move it, how many copies to keep, what levels of encryption and resilience to configure, or many other key considerations. Getting a solid baseline inventory is a challenge, and keeping it current is harder still, but an automated discovery solution can gather this needed data with less effort.
  • Recognize and accept your multi-vendor reality:
    It's nice to imagine consolidating all data onto systems from a single vendor running a single protocol: fewer systems to learn, easier data movement among them, no silos, and far simpler purchasing. But if you have lots of data, you likely have several vendors, systems, and protocols on hand to store it. Eliminating that complexity is unlikely, so a better approach is to adopt automated management solutions that work across multiple platforms. This also future-proofs your environment: even if you're in love with one vendor today, things may well change as new technologies emerge. With a solution in place that can manage multiple vendors, you won't hesitate to bring in the latest and greatest system when it makes sense.
  • Groom and optimize data continuously:
    As people and applications create, share, and use data, the storage environment grows and changes. Keeping hot data on high-performance resources, moving stale data to cheaper tiers, archiving unneeded data, and maintaining the right number of copies and the correct level of geographic dispersion can be a huge chore, and it only gets harder the longer you wait. If you set up automated solutions and processes to analyze and optimize data from the beginning, the effort stays manageable. If you let data pile up unmanaged, you will likely never be able to go back and clean it all up.
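To make the discovery and grooming practices above concrete, here is a minimal sketch in Python of the kind of automation such a solution performs: walk a tree to build a baseline inventory (size, owner, last access), flag entries stale beyond a threshold, and move them to a cheaper archive tier. The function names, the `STALE_DAYS` threshold, and the flat archive layout are illustrative assumptions, not any vendor's product or API.

```python
import shutil
import time
from pathlib import Path

STALE_DAYS = 365  # assumed policy: untouched for a year = stale


def build_inventory(root):
    """Discovery: walk a tree and record size, owner, and last access per file."""
    inventory = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            st = path.stat()
            inventory.append({
                "path": str(path),
                "size_bytes": st.st_size,
                "owner_uid": st.st_uid,
                "last_access": st.st_atime,
            })
    return inventory


def find_stale(inventory, days=STALE_DAYS, now=None):
    """Return inventory entries not accessed within the last `days` days."""
    now = time.time() if now is None else now
    cutoff = now - days * 86400
    return [e for e in inventory if e["last_access"] < cutoff]


def tier_down(entries, archive_root):
    """Grooming: move stale files to a cheaper archive tier."""
    archive = Path(archive_root)
    archive.mkdir(parents=True, exist_ok=True)
    for e in entries:
        shutil.move(e["path"], archive / Path(e["path"]).name)
```

A real solution adds scheduling, multi-vendor protocol support, and safeguards around open files, but the analyze-then-act loop is the same: inventory first, then optimize continuously against policy.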


About the Author
Andrew Reichman brings more than 15 years of technology leadership as a marketer, analyst, consultant, project manager, and hands-on data programmer to his role as vice president of marketing at Data Dynamics.

Before joining Data Dynamics, Andrew ran an independent analyst practice working with vendors, end users, and investors on advisory, strategy, and content creation projects, which followed a tenure at Amazon Web Services leading enterprise marketing. Prior to that he spent six years as a principal analyst at Forrester Research covering storage, cloud, and data center economics. Before his time at Forrester, he was a consultant at Accenture, optimizing data center environments on behalf of EMC. He has also held a variety of technology leadership positions at vendor and end-user firms, focused on financial analysis, project management, and data Jiu Jitsu. Andrew holds an MBA in finance from the Foster School of Business at the University of Washington and a BA in History from Wesleyan University. Follow him on Twitter: @reichmanIT or @datadynamicsinc

About the author: Alison Diana

Managing editor of Enterprise Technology. I've been covering tech and business for many years, for publications such as InformationWeek, Baseline Magazine, and Florida Today. A native Brit and longtime Yankees fan, I live with my husband, daughter, and two cats on the Space Coast in Florida.

EnterpriseAI