A waste of space: Bulk of drive capacity still underutilized
29 July, 2010 01:17
Near the turn of the century, data centers were only beginning to implement Fibre Channel storage-area networks (SANs), with most relying on direct-attached storage (DAS). Data utilization rates were abysmal, with data centers on average using just 25% to 30% of their hard disk drive capacity.
Despite Fibre Channel and IP SAN adoption and the advent of technologies such as thin provisioning, storage resource management, capacity reclamation and storage virtualization, storage utilization rates remain at 40% or lower. In other words, as much as 60% of the storage capacity in IT shops goes unused, wasting electricity and floor space.
"Most people I talk to don't even know how many terabytes of capacity they have on the floor, much less what their utilization is. And a lot of them don't even know how they'd measure it if they could," said Andrew Reichman, an analyst at Forrester Research.
Without better use of storage management tools, users won't be able to track utilization and improve it, Reichman said.
Reichman is one of several industry analysts who believe storage utilization today still averages between 20% and 40%, mainly because thin provisioning and storage management applications with advanced reporting capabilities that could point to wasted storage assets aren't being used.
"It's a bit of a paradox," he said. "Users don't seem to be willing to spend the money to see what they have."
For an enterprise-class data center, comprehensive monitoring and reporting software can cost as little as $250,000 or as much as $1 million, and in many cases a full-time employee is needed to manage it, Reichman said.
Rick Clark, CEO of Aptare Inc., said most companies can reclaim large chunks of data center storage capacity because it was never used by applications in the first place. Aptare's main offering, StorageConsole, is used for backup and storage capacity reporting. It can show admins where and how storage is being utilized.
Clark said the main problem in data centers today is that there is no single way to view host servers, networks and storage to determine how efficiently assets are being used.
Aptare's latest version of reporting software, StorageConsole 8, costs about $30,000 to $40,000 for small companies, $75,000 to $80,000 for midsize firms, and just over $250,000 for large enterprises.
"Our customers can see a return on the price of the software typically in about six months through better utilization rates and preventing the unnecessary purchase of storage," Clark said.
In many cases, companies buy more raw disk capacity than they need because the base cost of hard disk storage is pennies per gigabyte. But Reichman and others say that it's a fallacy to think that disk storage is cheap, because it costs money to manage and it eats up data center floor space and electricity.
"I start out every presentation with a slide showing it's a higher percentage of IT spending than any other area. Yes, the dollar-per-gigabyte [cost] has gotten cheaper, but application data growth rates are tremendous," Reichman said. "Saving 100TB of capacity out of a storage environment represents $1 million."
The problem of capacity underutilization is as old as digital data storage itself. Traditionally, business units have asked storage admins for more capacity than an application would need, to ensure that they wouldn't run out.
In turn, those storage admins added onto the tab by overprovisioning capacity to ensure that they wouldn't be the cause of an application outage. The result has been an enormous waste of disk capacity.
Adding to the problem of underutilization is the misconfiguration of storage capacity, where storage is purchased but never allocated to any server. It just sits idle.
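The gap the article describes has two distinct layers: capacity that was allocated to hosts but never written, and capacity that was never allocated at all. A minimal sketch, using invented figures rather than numbers from any survey, shows how the two layers combine into a low overall utilization rate:

```python
# Hypothetical illustration of the utilization gap described above.
# The 100/70/30 TB figures are made up for the example.

def utilization(used_tb: float, raw_tb: float) -> float:
    """Percent of raw capacity actually holding application data."""
    return 100.0 * used_tb / raw_tb

# An array with 100 TB raw: 70 TB is allocated to hosts, of which
# applications have written only 30 TB; 30 TB was never allocated.
raw_tb, allocated_tb, used_tb = 100.0, 70.0, 30.0

print(f"allocated but unused: {allocated_tb - used_tb:.0f} TB")
print(f"never allocated:      {raw_tb - allocated_tb:.0f} TB")
print(f"utilization:          {utilization(used_tb, raw_tb):.0f}%")  # 30%
```

Reporting tools of the kind Reichman and Clark describe earn their keep by making both layers visible per array and per host, rather than leaving admins with only the raw-capacity figure.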
A magic bullet?
Over the past five years or so, thin provisioning, or provisioning only as much storage as an application server needs, has been among the most popular technologies for increasing storage utilization. Thin provisioning is a form of virtualization in which application servers do not know the physical location of their storage capacity; they are fed from a pool of disk capacity behind a layer of abstraction.
Thin provisioning applications either automate the provisioning of storage or send alerts to sysadmins to allocate more when a threshold is reached. But only a relatively small percentage of users in the marketplace actually take advantage of thin provisioning, Reichman said.
Recent research from New York-based TheInfoPro shows that thin provisioning adoption is growing quickly. Anders Loftgren, chief research officer at the research firm, said that as many as 50% of the Fortune 1,000 companies it surveyed in June now use thin provisioning or plan to do so. The results of the company's survey of 250 Fortune 1,000 and midsize companies are expected to be released in mid-August, Loftgren said.
Loftgren said the results show utilization rates from 40% to 60% at the companies surveyed. Those numbers shouldn't be considered poor, because most companies pad their capacity needs for future growth with up to 30% more capacity than what is currently required, he said.
"These guys have the charter to make sure their businesses are up and running, and no one wants to get caught not having enough storage capacity," Loftgren said. "Of course, you want to get that to be as efficient as possible."
The most popular thin provisioning vendors include 3Par, Compellent and LeftHand Networks (now part of Hewlett-Packard). All major storage vendors today, however, offer some flavor of thin provisioning, which can increase utilization rates to as much as 80% when optimally used.
For example, a business unit might ask for 50GB of capacity for a new database implementation. But by using thin provisioning technology, a storage administrator can allocate just 5GB and then set thresholds that will either alert him to increase capacity or automatically add it to the application as needed.
"EMC and Hitachi and IBM have some versions of thin provisioning, but I've talked to zero users that actually are doing what we think of as thin provisioning with oversubscription. My objective evidence tells me that there are virtually no users of it," Forrester's Reichman said.
"Most storage administrators are using storage virtualization for its wide-striping capability, which increases performance and eases storage provisioning, but not for thin provisioning," he noted.
Adam Couture, an analyst at market research firm Gartner Inc., said another reason users may not be embracing thin provisioning and other storage optimization technologies faster is that most aren't replacing infrastructure because of the recession.
"The overwhelming dictate was to control costs -- no capital spending -- which meant you lived with what you had," Couture said. "And if your array wasn't built to take advantage of thin provisioning, there's no way you can retrofit it."