What you will learn in this tip: Find out how SSD technology affects the performance balance among CPU, memory, networks and storage, as well as why SSDs and 10 gigabit networking can
be a good match.
You’ve probably seen television advertisements for medications where the announcer eventually says “possible side effects include …” and then describes a long list of potential adverse conditions that may occur when taking it. It might be useful to do the same thing when describing the benefits of solid-state drives (SSDs). The good news is that some side effects associated with SSD technology may actually benefit you.
SSDs have the ability to move bottlenecks away from storage and expose new bottlenecks in systems, sometimes in unexpected places. Based on the tests we've been running in our lab over the past couple of years, I would like to focus on two areas of IT infrastructure that can be affected when SSDs are deployed: the network and the CPU.
SSD technology and the network
Depending on the application, solid-state drive technology can have a noticeable effect on the network. Because SSDs provide significantly improved performance when compared to spinning disk drives, more I/Os can be completed in less time. This means enterprise applications that use the network can generate increased network load when SSDs are deployed. In some cases, this increased network load isn't a problem; it simply drives up network utilization from what may be low levels to moderate levels. In other cases, application performance may grow enough to require an additional network adapter for the purpose of network interface card (NIC) teaming to provide enough bandwidth for peak loads and possibly even sustained loads.
In one of our tests, we deliberately increased the traffic load to very high levels with a PCI-Express (PCIe) SSD card, but were perplexed by the relatively slow performance of the storage. Upon further analysis, we realized that even though we were using four 1 GbE NICs teamed together on the application server, the network was the bottleneck and was preventing us from achieving full performance of the PCIe SSD. So we reconfigured the test to use our 10 gigabit network with 10 gigabit adapters in the clients and application server. It was only then that we were able to get full performance out of the PCIe SSD.
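A quick back-of-the-envelope comparison makes it clear why the teamed 1 GbE links capped the SSD. The sketch below uses illustrative throughput figures (the ~1 GB/s PCIe SSD rate, the 90% protocol-efficiency factor and the `usable_bandwidth` helper are assumptions for the example, not measured Demartek results):

```python
# Back-of-the-envelope check: can the network keep up with the SSD?
# All throughput figures below are illustrative assumptions.

GBIT = 1e9 / 8  # bytes per second in one gigabit per second

def usable_bandwidth(links, gbps, efficiency=0.9):
    """Aggregate usable bandwidth of teamed NICs, in bytes/sec.
    'efficiency' roughly accounts for protocol overhead."""
    return links * gbps * GBIT * efficiency

ssd_throughput = 1.0e9  # assume a PCIe SSD sustaining ~1.0 GB/s

for label, bw in [("4 x 1 GbE team", usable_bandwidth(4, 1)),
                  ("1 x 10 GbE",     usable_bandwidth(1, 10))]:
    bottleneck = "network-bound" if bw < ssd_throughput else "storage-bound"
    print(f"{label}: {bw/1e9:.2f} GB/s usable -> {bottleneck}")
```

Under these assumptions, four teamed 1 GbE links top out well below the SSD's sustained rate, while a single 10 GbE link has headroom, which matches what we saw once we reconfigured the test.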
We're at an interesting inflection point in our industry: we're seeing increased adoption of 10 gigabit networking at the same time we're seeing increased adoption of SSDs. My personal observation is that SSDs and 10 gigabit networking are made for each other.
SSD technology and CPU utilization
When a single operating system in a physical, non-virtualized server environment is running on a relatively modern server platform, we find CPU utilization is generally pretty low -- often well under 20% -- which we characterize as underutilized. In virtualized environments, CPU utilization generally climbs to something much higher and may not be considered underutilized, depending on the number of guest machines and applications running.
In our tests with SSDs where we have driven applications to higher performance levels, we noticed that because of the higher performance and lower latency of SSDs, the CPU utilization goes up in many cases. In some of our tests we've seen CPU utilization of approximately 10% with spinning disks in a physical server environment grow to 50% CPU utilization with SSDs in the same physical server. Of course, we're getting considerably more work done, so this is a good thing. However, the answer to the question of how many virtual machines (VMs) can run on a physical server needs to account for significantly improved storage performance when SSDs are deployed.
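The VM-sizing implication can be sketched in a few lines. The per-VM CPU percentages and the `vms_per_host` helper below are hypothetical illustrations of the effect described above, not a sizing formula from our tests:

```python
# Hypothetical sizing sketch: how SSD-driven CPU utilization changes
# a VMs-per-host estimate. All numbers are illustrative assumptions.

def vms_per_host(per_vm_cpu_pct, headroom_pct=20):
    """VMs that fit on one host while leaving CPU headroom."""
    budget = 100 - headroom_pct
    return budget // per_vm_cpu_pct

# With spinning disks a VM may idle waiting on I/O (~10% CPU each);
# with SSDs the same workload completes more I/Os and burns more CPU.
print(vms_per_host(10))  # spinning disk assumption -> 8 VMs
print(vms_per_host(25))  # SSD assumption -> 3 VMs, each doing more work
```

The point is not that SSDs reduce consolidation value, but that each VM now gets more work done per unit time, so the old per-VM CPU assumptions no longer hold.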
Summary: SSDs can move bottlenecks
SSD technology can move bottlenecks. The performance balance among CPU, memory, network and storage changes when SSDs are deployed, and as SSDs become more widely adopted, server, network and storage administrators need to adjust the workload calculations in each of their respective areas. We invite you to see how bottlenecks moved in our test results in the Demartek SSD zone.
BIO: Dennis Martin is president at Demartek LLC, an Arvada, Colo.-based industry analyst firm that operates an on-site test lab.
This was first published in April 2011