
How can I improve flash performance with data reduction strategies?

Data reduction techniques are important for increasing effective capacity on SSDs. Compression and deduplication are the best known, but other techniques can fill similar roles.

Most all-flash arrays on the market improve flash performance and effective capacity through compression, deduplication or a combination of the two. Although these are the best-known approaches, two other data reduction strategies appear in some products.
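As an illustration only (not any vendor's implementation), the two techniques can be sketched in a few lines of Python: deduplication stores each identical block once under a content hash, and compression shrinks each unique block before it is written.

```python
import hashlib
import zlib

# Hypothetical 4 KB logical blocks; two are identical, one is all zeros.
blocks = [b"A" * 4096, b"A" * 4096, b"\x00" * 4096]

store = {}   # content hash -> compressed unique block
refs = []    # per-block pointers into the store

for block in blocks:
    digest = hashlib.sha256(block).hexdigest()
    # Deduplication: an already-seen hash costs no extra physical space.
    store.setdefault(digest, zlib.compress(block))  # compression on write
    refs.append(digest)

logical = sum(len(b) for b in blocks)
physical = sum(len(v) for v in store.values())
print(f"logical {logical} B -> physical {physical} B")
```

Real arrays do this inline at the block layer with far stronger safeguards (hash collision handling, metadata journaling), but the space accounting works the same way.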

Pattern removal. This data reduction method works at the binary level, removing frequently occurring patterns of zeros and ones -- including long strings of zeros that may indicate empty space or null data. While some observers would classify this technique as a type of compression, pattern removal engines differ in granularity. Some of the more aggressive data reduction engines match patterns at 8-bit granularity, which removes more redundant data than matching at a higher bit count would.
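A minimal sketch of 8-bit-granularity pattern removal (essentially run-length substitution on single bytes; real engines recognize richer patterns) might look like this:

```python
def remove_patterns(data: bytes, min_run: int = 8) -> list:
    """Replace runs of a repeated byte (8-bit patterns) with one
    (byte, count) token; short runs pass through as literals.
    A toy sketch, not a production codec."""
    out = []
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        run = j - i
        if run >= min_run:
            out.append(("run", data[i], run))  # one token replaces the run
        else:
            out.append(("lit", data[i:j]))
        i = j
    return out

# A long string of zeros (empty space / null data) collapses to one token.
tokens = remove_patterns(b"\x00" * 4096 + b"header" + b"\xff" * 100)
```

Matching at byte (8-bit) granularity catches runs that a 16- or 32-bit matcher would miss when the pattern is not aligned to a wider word boundary.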

Instant cloning. This technology goes by many different names, but the basic idea is that data is often used for more than one purpose. For instance, a database might be used by a production application while a copy of the database is used on a development and test server. From a capacity-reduction standpoint, it makes no sense to create multiple copies of the same database. An instant cloning feature allows copies of data to be created through the use of pointers or snapshots, rather than through physical data cloning.
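The pointer-based idea behind instant cloning can be illustrated with a toy block store (all names here are hypothetical): a clone copies only the volume's pointer list, and copy-on-write keeps later changes from affecting the original.

```python
class BlockStore:
    """Toy store: each volume is a list of pointers into shared blocks."""

    def __init__(self):
        self.blocks = {}    # block id -> data
        self.volumes = {}   # volume name -> list of block ids
        self.next_id = 0

    def write(self, name, chunks):
        ids = []
        for chunk in chunks:
            self.blocks[self.next_id] = chunk
            ids.append(self.next_id)
            self.next_id += 1
        self.volumes[name] = ids

    def clone(self, src, dst):
        # Instant clone: duplicate the pointer list, not the data.
        self.volumes[dst] = list(self.volumes[src])

    def overwrite(self, name, index, data):
        # Copy-on-write: the new data gets a fresh block id,
        # so other volumes sharing the old block are unaffected.
        self.blocks[self.next_id] = data
        self.volumes[name][index] = self.next_id
        self.next_id += 1

store = BlockStore()
store.write("prod-db", [b"tbl1", b"tbl2"])
store.clone("prod-db", "dev-db")  # no physical data copied
```

The dev/test copy consumes new capacity only for the blocks it actually changes, which is why the technique is a data reduction feature and not just a convenience.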


Array vendors have been working to strike a balance between data reduction and flash performance. Early on, the goal for flash vendors was to reduce data at all costs. Flash storage was very expensive, so there was a great deal of pressure to minimize the data footprint. Unfortunately, some data reduction algorithms were very CPU- and memory-intensive, which degraded performance.

Some modern algorithms attempt to decrease memory and CPU consumption by examining data before applying a reduction algorithm, to estimate the benefit of doing so. In the past, arrays would usually attempt to reduce the data footprint by globally applying data reduction strategies, even to nonredundant data that could not be reduced. The end result of these modern techniques is comparable data reduction rates with decreased CPU and memory usage.
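One common way to implement such an estimate (a sketch under assumed names and thresholds, not any array's actual heuristic) is to measure the byte entropy of a small sample: data whose entropy is near 8 bits per byte is essentially incompressible, so the array can skip the reduction pass for it.

```python
import math
import zlib
from collections import Counter

def looks_reducible(block: bytes, sample: int = 512,
                    threshold: float = 7.5) -> bool:
    """Estimate Shannon entropy (bits/byte) of a small sample.
    High-entropy blocks are unlikely to compress, so skip them."""
    counts = Counter(block[:sample])
    total = sum(counts.values())
    entropy = -sum((n / total) * math.log2(n / total)
                   for n in counts.values())
    return entropy < threshold

def store_block(block: bytes) -> bytes:
    # Pay the CPU cost of compression only when the estimate says it helps.
    return zlib.compress(block) if looks_reducible(block) else block
```

Because the estimate touches only a 512-byte sample instead of the whole block, the check is far cheaper than a failed compression attempt, which is how these techniques keep reduction rates comparable while lowering CPU and memory usage.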

Next Steps

Approaches to data reduction in the all-flash data center

SSD environments require data reduction to optimize performance

Customers value deduplication systems on the storage array

This was last published in April 2016
