"You need a reason to go to it," said John Fagg, manager of storage services at the University of Utah Hospital in Salt Lake City. "We're usually bleeding edge on a lot of stuff, but on this one, we're sitting back and waiting."
The hospital is evaluating products such as Fusion-io's ioDrive and NetApp's Performance Acceleration Module (PAM) mainly to support read-intensive Picture Archiving and Communications System (PACS) data and database logs.
One place Fagg said he's shied away from SSDs so far is within the disk array. "Our [Hitachi Data Systems] USP V arrays have a lot of cache in them anyway," he said. "There's still a benefit [to SSDs inside an array] but it could be even better if it's not behind a standard drive interface."
Matthew Barnes, informatics system administrator at the University of Florida's College of Medicine in Gainesville, said his users have already adopted SSDs, and the technology is now making its way into the data center.
Part of Barnes' environment supports researchers working to process large amounts of data.
"Our top tier has been 15,000 rpm Fibre Channel drives, but we're looking to add a higher tier of storage," he said. "There are instances where researchers want to wring out every possible performance gain—it's all about getting an answer, and if you get an answer quicker, it's a competitive edge."
Barnes said he doesn't share some of the qualms in the market about SSD durability. The hard part is funding the purchase of solid-state storage.
"Universities everywhere have been hit hard with budget cuts during the recession," he said. "I love the idea of SSD, but haven't yet found a mechanism to pay for it."
Lawrence Di Gioia, director of information services for the City of Altamonte Springs, Fla., said he has no doubt he will use SSDs eventually.
"We're all going to go there at some point, but it has to be for the right application," he said, adding that the "green piece" of SSDs is especially appealing because they use much less energy than spinning hard drives. However, for now Di Gioia said he's waiting for SSDs to become more cost-effective and reliable.
Chris Lionetti, senior SAN engineer at Microsoft Corp., said he's been testing out SSDs inside Microsoft's storage network for use with database applications, but hasn't decided exactly where they may be put into production in the company's data center. Lionetti stressed the importance of testing SSDs before employing the technology.
"Everything in the industry is fluff until you test it," he said. "You can't believe the specmanship in the market. If I had to sum up our experience in one word it would be 'cautious.'"
Vendors point out SSD pitfalls
While many storage vendors have embraced SSDs over the past year, not all of them are convinced the technology is ready for primetime. In an SNW panel session Monday called "SSD: Enterprise Ready or Not?," representatives from IBM Corp., LSI Corp. and Seagate Technology LLC warned attendees about details of the technology that have yet to be worked out.
Marty Czekalski, interface and emerging architecture program manager at Seagate, said SSDs need more data integrity features such as the data integrity field (DIF) standard, an eight-byte field appended to each standard 512-byte block that carries checksum and address tags used to detect data corruption.
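For readers unfamiliar with DIF (standardized by T10 as Protection Information), the eight-byte field breaks down into a two-byte guard tag (a CRC over the block's data), a two-byte application tag, and a four-byte reference tag that typically holds the low bits of the logical block address. The sketch below, a simplified illustration rather than a production implementation, builds such a field for one 512-byte block using the T10-DIF CRC polynomial 0x8BB7; the function names are our own.

```python
import struct

def crc16_t10dif(data: bytes) -> int:
    """Bitwise CRC-16 with the T10-DIF polynomial 0x8BB7
    (initial value 0, no bit reflection, no final XOR)."""
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x8BB7) if crc & 0x8000 else crc << 1
            crc &= 0xFFFF
    return crc

def dif_field(block: bytes, app_tag: int, ref_tag: int) -> bytes:
    """Build the 8-byte DIF field for one 512-byte block:
    2-byte guard tag (CRC of the data), 2-byte application tag,
    4-byte reference tag (typically derived from the LBA)."""
    if len(block) != 512:
        raise ValueError("DIF protects 512-byte blocks")
    guard = crc16_t10dif(block)
    return struct.pack(">HHI", guard, app_tag, ref_tag)

# Each protected sector grows from 512 to 520 bytes on the medium.
sector = bytes(512)
protection = dif_field(sector, app_tag=0, ref_tag=7)
```

Because the host adapter, the drive, and any intermediate controller can each recompute the guard tag and compare the reference tag against the expected LBA, corruption or misdirected writes can be caught end to end rather than only at the platter.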
"Performance should be predictable with deterministic error recovery" before the drives are enterprise-ready, he said.
Harry Mason, LSI's director of industry marketing, storage components group, said better benchmarks were needed to compare devices on the market. "The variation in quality of drives and higher cost [of the products] invites further inspection, including consistent and accepted industry metrics," he said.
Clod Barrera, distinguished engineer and chief technical strategist for the IBM Systems and Technology Group, said he was concerned that the industry might make an investment in SSDs, only to see a better mousetrap turn up soon.
"If we adapt operating systems, middleware and applications [for SSDs], what happens when the next version of this shows up—do we do it again?" he asked.