Most all-flash arrays on the market rely on compression, deduplication, or a combination of the two to reduce data and improve flash performance. Although these are the best-known approaches, two other data reduction strategies appear in some products.
Pattern removal. This data reduction method works at the binary level, removing frequently occurring patterns of zeros and ones -- including long strings of zeros that may indicate empty space or null data. While some observers would classify this technique as a type of compression, there are varying levels of granularity. Some of the more aggressive data reduction engines use 8-bit granularity for pattern removal, which eradicates more data than would be possible when using a higher bit count.
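To make the idea concrete, here is a minimal sketch of pattern removal at 8-bit (one-byte) granularity. It collapses runs of zero bytes, such as empty space or null data, into compact tokens while leaving other bytes as literals. This is an illustration of the concept, not any vendor's actual reduction engine; the function names and token format are invented for the example.

```python
def remove_zero_patterns(data: bytes) -> list:
    """Illustrative 8-bit pattern removal: collapse runs of zero
    bytes into ("zeros", run_length) tokens; keep other bytes as
    ("literal", value) tokens."""
    tokens = []
    i = 0
    while i < len(data):
        if data[i] == 0:
            j = i
            while j < len(data) and data[j] == 0:
                j += 1  # extend the run of zeros
            tokens.append(("zeros", j - i))
            i = j
        else:
            tokens.append(("literal", data[i]))
            i += 1
    return tokens

def restore(tokens) -> bytes:
    """Rebuild the original byte stream from the token list."""
    out = bytearray()
    for kind, val in tokens:
        if kind == "zeros":
            out.extend(b"\x00" * val)
        else:
            out.append(val)
    return bytes(out)
```

Working byte by byte, as here, catches short zero runs that a coarser (for example, 4 KB block-level) pass would miss; that is the sense in which finer granularity removes more data.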
Instant cloning. This technology goes by many different names, but the basic idea is that data is often used for more than one purpose. For instance, a database might be used by a production application while a copy of the database is used on a development and test server. From a capacity-reduction standpoint, it makes no sense to create multiple copies of the same database. An instant cloning feature allows copies of data to be created through the use of pointers or snapshots, rather than through physical data cloning.
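The pointer-based approach can be sketched with a toy copy-on-write store. Cloning copies a list of block pointers, not the blocks themselves, so the database copy for the dev/test server consumes no extra capacity until it diverges. The class and method names are invented for illustration and do not reflect any particular array's implementation.

```python
class CowStore:
    """Toy copy-on-write block store: clones share block pointers,
    and a write allocates a new block only for the volume written."""

    def __init__(self):
        self.blocks = {}   # block_id -> bytes
        self.next_id = 0

    def create_volume(self, chunks):
        """Store each chunk as a block; return the volume's pointer list."""
        vol = []
        for chunk in chunks:
            self.blocks[self.next_id] = chunk
            vol.append(self.next_id)
            self.next_id += 1
        return vol

    def clone(self, vol):
        """Instant clone: duplicate the pointer list, not the data."""
        return list(vol)

    def write(self, vol, index, chunk):
        """Copy-on-write: redirect one pointer to a freshly allocated block."""
        self.blocks[self.next_id] = chunk
        vol[index] = self.next_id
        self.next_id += 1

    def read(self, vol):
        return b"".join(self.blocks[b] for b in vol)
```

A clone is effectively free at creation time; physical capacity grows only in proportion to how much the copy is later modified.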
Array vendors have been working to strike a balance between data reduction strategies and ways to improve flash performance. Early on, the goal for flash vendors was to reduce data at all costs. Flash storage was very expensive, so there was a great deal of pressure to minimize the data footprint. Unfortunately, some data reduction algorithms were very CPU- and memory-intensive, which decreased performance.
Some modern algorithms attempt to decrease memory and CPU consumption by examining data before applying a data reduction algorithm, to estimate how much benefit reduction would deliver. In the past, arrays typically applied data reduction globally, even to nonredundant data that could not be reduced. The end result of these modern techniques is comparable data reduction rates with decreased CPU and memory usage.