Have concerns about data loss with write caching products been addressed? Or is data loss still possible if the...
Write caching improves performance by acknowledging a write to the application as soon as the data is stored in the cache, rather than on the hard disk. This creates a window of time during which a cache failure causes data loss. That failure can occur in two ways: the flash module used for cache storage fails, or the server itself fails.
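To make the data loss window concrete, here is a minimal, purely illustrative sketch (not any vendor's actual implementation) of write-back behavior: the write is acknowledged once it lands in cache, and the backing disk only catches up when a flush runs.

```python
# Hypothetical write-back cache sketch. Dictionaries stand in for the
# flash cache area and the hard disk; names are illustrative only.

class WriteBackCache:
    def __init__(self, backing_store):
        self.cache = {}            # stands in for the flash cache area
        self.dirty = set()         # blocks not yet written to the disk
        self.backing_store = backing_store

    def write(self, block, data):
        self.cache[block] = data
        self.dirty.add(block)
        return "ack"               # application sees success here

    def flush(self):
        # Until this runs, losing the cache loses every dirty block.
        for block in self.dirty:
            self.backing_store[block] = self.cache[block]
        self.dirty.clear()

disk = {}
cache = WriteBackCache(disk)
cache.write(0, b"hello")           # acknowledged before it reaches disk
assert 0 not in disk               # the data loss window: only cache has it
cache.flush()
assert disk[0] == b"hello"         # window closed once the flush completes
```

The two assertions bracket the risk the answer describes: between the acknowledgment and the flush, the cache is the only copy.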
Failure of the flash module storing cache data is a real concern when the cache turnover rate is high -- that is, when the system must constantly update and replace the data held in cache. To protect against this, users should implement a larger cache and mirror the cache drive within the server. Ideally, that mirroring should be handled by the caching software itself, which provides seamless operation in the event of a cache failure.
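The mirroring idea can be sketched as follows; this is a hypothetical illustration of software-level cache mirroring, with two dictionaries standing in for two flash modules, not a real caching product's API.

```python
# Hypothetical software cache mirroring: every cached write lands on two
# flash devices before it is acknowledged, so one module failing still
# leaves a readable copy.

class MirroredCache:
    def __init__(self):
        self.primary = {}    # first flash module
        self.mirror = {}     # second flash module

    def write(self, block, data):
        self.primary[block] = data
        self.mirror[block] = data   # mirrored before acknowledging

    def read(self, block):
        # Seamless operation: fall back to the mirror if the primary fails.
        try:
            return self.primary[block]
        except KeyError:
            return self.mirror[block]

    def fail_primary(self):
        self.primary.clear()        # simulate the flash module failing

c = MirroredCache()
c.write(7, b"payload")
c.fail_primary()
assert c.read(7) == b"payload"     # mirror still serves the data
```

Having the caching software do the fallback itself, as in `read()`, is what makes the module failure invisible to the application.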
None of the above steps protects against a server failure. To guard against that, the caching software should be able to extend the cache to a shared flash device on the storage network. Inbound data is then written to both devices for reliability, but read from the server-side instance for performance. If the server fails, the caching software must recognize that a failure event occurred and check the shared copy first on restart.
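The extended-cache pattern described above can be sketched like this. Again, this is an assumed, simplified model: `shared_flash` stands in for a shared flash device on the storage network that survives the server, and the class names are invented for illustration.

```python
# Hypothetical extended-cache sketch: writes go to both the server-side
# cache and a shared network flash device; reads come from the faster
# local copy; restart recovery checks the shared copy first.

shared_flash = {}                   # survives a server failure

class ExtendedCache:
    def __init__(self, shared):
        self.local = {}             # server-side flash, lost with the server
        self.shared = shared

    def write(self, block, data):
        self.local[block] = data    # local copy for read performance
        self.shared[block] = data   # shared copy for reliability

    def read(self, block):
        return self.local[block]    # always read the server-side instance

    def recover(self):
        # After a crash, repopulate the local cache from the shared copy.
        self.local = dict(self.shared)

node = ExtendedCache(shared_flash)
node.write(1, b"record")
node.local.clear()                  # simulate the server failing
restarted = ExtendedCache(shared_flash)
restarted.recover()
assert restarted.read(1) == b"record"   # recovered from the shared device
```

The split mirrors the answer's design point: the dual write costs a little on the write path, but reads stay local, and the shared copy is only consulted during recovery.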