Backing up data over a network creates primary stores of active data that must be accessible across locations at any time. Active data is data that is frequently accessed, modified and recommitted to the database over the network. It typically involves replication, duplication and multiple generations of the same data across locations. Backing up primary, or active, data therefore becomes a challenge in itself, as organizations struggle to reduce the primary data footprint and use storage capacity efficiently to create cost savings.
Consequently, there is demand for technologies that reduce the primary storage footprint, and storage manufacturers and service providers have developed a number of solutions to the problems associated with primary storage. Two such solutions are data de-duplication and data compression.
While data compression compresses stored data in real time, data de-duplication removes multiple copies of data and stores references to a single copy in place of the duplicates. Both technologies were once viewed with suspicion because of performance concerns. Today this is no longer a major issue, as the technologies have evolved to address those challenges effectively and efficiently.
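The reference mechanism described above can be sketched in a few lines. This is a minimal, illustrative model only, assuming fixed-size chunking and SHA-256 content hashing; it is not any vendor's implementation, and real systems typically use variable-size chunks and persistent indexes.

```python
import hashlib

# Minimal content-addressed de-duplication sketch (illustrative only):
# each unique chunk is stored once, keyed by its SHA-256 digest, and
# files are kept as lists of references (digests) to those chunks.

CHUNK_SIZE = 4096          # fixed-size chunking for simplicity

chunk_store = {}           # digest -> chunk bytes (each unique chunk stored once)
file_index = {}            # filename -> list of digests (references)

def write_file(name, data):
    refs = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(digest, chunk)   # duplicate chunks are not stored again
        refs.append(digest)
    file_index[name] = refs

def read_file(name):
    # Reassemble the original data by replacing each reference with its chunk
    return b"".join(chunk_store[d] for d in file_index[name])

write_file("a.bin", b"hello world" * 1000)
write_file("b.bin", b"hello world" * 1000)   # identical content: no new chunks stored
```

Writing the second, identical file adds references but no new chunks, which is where the capacity savings come from.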
Modern compression technologies compress data in real time, as it is fed into the storage repository, to create storage savings, increase storage performance and keep data highly available. Decompression is equally immediate: data is decompressed on the fly as it is delivered to the end user. The compression and decompression work is offloaded to a compressor/de-compressor device that operates independently of the storage controller.
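The inline compress-on-write, decompress-on-read pattern can be sketched as follows. zlib is used here purely for illustration; storage appliances use dedicated hardware or tuned codecs, and the `put`/`get` store is a stand-in for the repository.

```python
import zlib

# Sketch of inline (real-time) compression: data is compressed as it is
# written to the store and decompressed as it is read back, so the
# application always sees the original bytes.

store = {}  # key -> compressed bytes

def put(key, data):
    store[key] = zlib.compress(data, level=6)   # compress on the write path

def get(key):
    return zlib.decompress(store[key])          # decompress on the read path

original = b"A" * 100_000      # highly redundant data compresses well
put("blob", original)
```

The caller never handles compressed bytes directly, which is the "seamless" property the text describes.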
Data de-duplication is applied even before the data is compressed and placed in the data store. References to duplicate data are created, and on retrieval the software replaces each reference with a copy of the data, delivering the content from the content store to the user in its original format. The result is seamless access to the original data with no extra effort on the part of the IT administrator.
Nevertheless, a note of caution is in order: the amount of compression and de-duplication that can be achieved on primary data depends on the environment in which the data is created, accessed and stored.
- In environments with a high level of data redundancy there will be a significant return on investment, while in others the reduction may be just 10% to 20%.
- The rate of compression or de-duplication also depends on how well the data qualifies for the process and on the application specificity of the data being compressed, de-duplicated and backed up. Application-specific data can cause performance bottlenecks and may not deliver any appreciable capacity gains.
- In multi-platform, multi-vendor IT infrastructures, multiple data-reduction approaches may be needed, which can cause confusion at the administration level.
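The first point above can be made concrete with a back-of-the-envelope calculation. The figures here are illustrative assumptions only: a low-redundancy environment achieving a 15% reduction versus a highly redundant one achieving a 5:1 de-duplication ratio.

```python
# Illustrative capacity-savings model (assumed numbers, not benchmarks):
# compare a low-redundancy environment (~15% reduction, per the 10%-20%
# range above) with a highly redundant one (an assumed 5:1 ratio).

logical_tb = 100                                      # logical primary data, in TB

low_redundancy_physical = logical_tb * (1 - 0.15)     # 15% reduction -> 85 TB on disk
high_redundancy_physical = logical_tb / 5             # 5:1 ratio     -> 20 TB on disk

savings_low = logical_tb - low_redundancy_physical    # 15 TB saved
savings_high = logical_tb - high_redundancy_physical  # 80 TB saved
```

The same technology, applied to different data profiles, yields savings that differ by more than a factor of five, which is why ROI projections must be environment-specific.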
That said, with an increasing number of vendors offering online backup and storage of primary data, compression and de-duplication on primary storage are gaining traction. More and more storage hardware and software providers are focusing their energies on creating storage options for optimum performance. Higher-performance hardware, multi-core high-speed processors, low-cost DRAM for cache and solid-state technology are all helping to mitigate the performance penalties associated with primary data reduction. The future holds promise.
SecurStore provides a bespoke offsite backup solution catering to customers with both mission-critical and non-critical data; that is, it gives customers a secure and efficient backup and recovery solution that is sustainable over time. This, coupled with agentless technology and advanced support for all environments and applications, makes it suitable for any type of business, data centre provider or reseller.