Fast data growth is nothing new in enterprise data centers. Every year, companies generate more data, retain it longer, and demand higher availability for business-critical operations. Until recently, most have found ways to handle data growth and to control complexity in their data protection environments: they have moved from physical tape to faster, more efficient disk-based backup, implemented inline deduplication to reduce data volumes, and begun adding new technologies to cut costs.
In the past few years, however, enterprise data centers have started to lose their battle against data growth. A recent survey* of enterprise data managers provides evidence of the growing difficulty: nearly half (46%) of respondents did not feel that their backup environment was meeting their needs. Many are choosing to leave data under-protected. Others face costly sprawl as they add more and more systems to meet backup windows, and unprecedented complexity as they layer on multiple technologies to improve efficiency. It’s time for a different approach.
Unprecedented Data Sprawl and Complexity Hit Data Centers
Let’s take a closer look at the issues driven by data growth. At one end of the spectrum, data managers are addressing fast-growing data volumes by adding multiple new technologies. For example, when meeting backup windows becomes difficult, they add new backup systems to increase performance and/or capacity. They may add snapshots or other technologies to improve recovery time objectives (RTO) or recovery point objectives (RPO), or leverage database features such as multiplexing and multistreaming to back up large databases faster. Many are moving to tiering, or improving it, as a way to save costs and improve efficiency. While each technology may solve an immediate problem, over time this point-solution approach results in costly data center sprawl and increased complexity.
At the other end of the spectrum, many are not using enough technology. As noted above, they are simply leaving an increasing volume of data under-protected. For example, according to the survey, 55% are not encrypting their data at rest, and more than a fifth (21%) are not providing disaster protection for their data because they fear it will add unnecessary complexity to an already over-burdened data center.
What Is Needed Is an Orchestrated Solution
So, too many technologies add complexity; too few leave data unprotected. What is the solution for protecting massive, fast-growing data volumes without breaking your budget or overworking your IT staff? The answer lies in taking a holistic approach to enterprise-wide data protection and applying the technology needed to implement it. Moving from tactical problem solving to an orchestrated solution will help you apply technology and IT resources where they are most effective, achieving maximum return on investment and optimal data protection.
Address Data Growth by Deduplicating More Efficiently
One of the most productive ways to address data growth is to back up data with efficient deduplication. This is an area where companies can make significant improvements with a minimal investment. Today, most companies are using traditional inline deduplication, which is a point solution with several noteworthy drawbacks.
First, inline deduplication can jeopardize backup windows because it slows ingest performance over time. As a result, it forces you to add more backup targets before fully using the capacity you already have, not only hurting utilization but also adding the labor required to load-balance each new system.
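To make that write-path cost concrete, here is a minimal, purely illustrative Python sketch of inline deduplication (the fixed-size chunking, in-memory dictionaries, and function names are simplifying assumptions; production appliances typically use variable, content-defined chunking and disk-resident indexes). The point to notice is that the fingerprint lookup happens before every chunk is written, so ingest throughput is bound to index performance, and once the index outgrows memory, every lookup gets slower.

```python
import hashlib
import io

CHUNK_SIZE = 8 * 1024  # fixed-size chunks for simplicity

chunk_store = {}  # fingerprint -> unique chunk data (the deduplicated pool)
recipes = {}      # backup name -> ordered list of fingerprints

def ingest(backup_name, stream):
    """Deduplicate a backup stream inline; return bytes actually stored."""
    stored = 0
    fingerprints = []
    while True:
        chunk = stream.read(CHUNK_SIZE)
        if not chunk:
            break
        fp = hashlib.sha256(chunk).hexdigest()
        if fp not in chunk_store:    # index lookup sits on the write path
            chunk_store[fp] = chunk  # unseen data: store the chunk itself
            stored += len(chunk)
        fingerprints.append(fp)      # duplicates cost only a reference
    recipes[backup_name] = fingerprints
    return stored

def restore(backup_name):
    """Reassemble a backup from its chunk recipe."""
    return b"".join(chunk_store[fp] for fp in recipes[backup_name])

# A highly redundant 32 KB stream deduplicates to a single 8 KB chunk.
data = b"A" * 32768
print(ingest("monday-full", io.BytesIO(data)))  # -> 8192
assert restore("monday-full") == data
```

In a real appliance the index holds billions of fingerprints, so the in-memory dictionary above becomes a disk-backed structure, and that per-chunk lookup is exactly where ingest performance erodes as capacity fills.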
*Sepaton Data Protection Index 2013.
Peter Quirk, Director of Product Management, Sepaton, Inc.