Sep 17, 2012 04:45 pm | CIO
by Thor Olavsrud
Complexity in the data center has a number of unwelcome effects on the enterprise, from increased costs to reduced agility and even downtime. For the past five years, organizations have been virtualizing their data centers in an effort to reduce complexity and increase efficiency. But while virtualization offers significant benefits, many such projects have shifted rather than eliminated complexity in the data center. To truly mitigate data center complexity, organizations need training, standardization and information governance.
"So many people think that virtualization is the penicillin of the data center, but in reality, what we've seen is that while people are investing heavily in virtualization, they didn't necessarily have the foresight to see the ramifications of virtualizing so quickly," says Danny Milrad, director of product marketing at Symantec, which just released the results of its 2012 State of the Data Center Survey. "One of the benefits of virtualization is spinning up an application so quickly, but they don't think about how big the footprint of that application can become."
Business-Critical Apps Drive Data Center Complexity
The increasing number of business-critical apps is the primary driver of complexity in the data center: 65 percent of respondents to Symantec's survey cited it as a source of complexity. Symantec polled 2,453 IT professionals from 32 countries, including senior IT staff focused on operations and tactical functions as well as staff focused on planning and IT management.
"Show me an app that isn't a business critical application outside of file and print these days," Milrad says. "Now you've got to replicate it, and your storage footprint goes up. With all these new applications coming online, they're being virtualized, and you've got a ton more data than you ever expected."
When that happens, organizations hit a wall. "As they virtualize more and more, the cost of storage and the cost of virtualization licenses and everything that falls out of that grows faster than expected," he says. "Storage is cheap, but it's still very expensive when you have to buy 10 times more than you expected."
Other key drivers of data center complexity include the growth of strategic IT trends such as mobile computing (cited by 44 percent of respondents), server virtualization (43 percent) and public cloud (41 percent). The most commonly cited consequence of data center complexity is increased costs (47 percent), followed by reduced agility (39 percent), longer lead times for storage migration (39 percent) and storage provisioning (38 percent), security breaches (35 percent) and downtime (35 percent).
Complexity a Key Contributor to Data Center Outages
The survey found that the typical organization experienced an average of 16 data center outages in the past 12 months, at a total cost of $5.1 million. On average, one of those outages was caused by a natural disaster (costing $1.5 million), four were caused by human error (costing $1.7 million) and 11 were caused by system failure resulting from complexity (costing $1.9 million).
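Those per-cause figures square with the reported totals; a quick tally, using only the numbers as reported by the survey:

import hashlib  # unused here; see the deduplication sketch later in this article

# Average outages per organization over 12 months, by cause:
# (count, cost in millions of dollars), figures as reported in the survey.
outages = {
    "natural disaster": (1, 1.5),
    "human error": (4, 1.7),
    "complexity-driven system failure": (11, 1.9),
}
total_count = sum(n for n, _ in outages.values())
total_cost = sum(c for _, c in outages.values())
print(total_count, "outages costing $%.1f million" % total_cost)
# prints: 16 outages costing $5.1 million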
That's not to say virtualization is a bad thing, Milrad is careful to note, but it does mean IT needs to pay attention and prepare for the potential side effects.
"It's much like what happened with the introduction of SharePoint," Milrad says. "SharePoint created a power and cooling nightmare. It wasn't expensive for marketing or sales to spin them up, but power, cooling and storage costs went up as a result. It's the same thing with virtualization. IT needs to get [its] arms around it and manage it as part of the infrastructure. It's just a matter of slowing down and looking at what you're doing."
The survey found that 90 percent of organizations are implementing or actively discussing information governance in an effort to get data center complexity under control. Among the benefits they seek: enhanced security, easier and faster access to the right information, reduced information management and storage costs, reduced legal and compliance risks, and an easier move to the cloud.
Best Practices for Mitigating Data Center Complexity
Trevor Daughney, also a director of product marketing at Symantec, recommends adopting the following best practices to help reduce data center complexity:
Get visibility beyond platforms. Understand the business services that IT is providing, and all of their dependencies, to reduce downtime and miscommunication.
Understand what IT assets you have, how they are being consumed, and by whom. This will help cut costs and risk: the organization won't buy servers and storage it doesn't need, teams can be held accountable for what they use, and the company can be sure it isn't running out of capacity.
Reduce the number of backup applications to meet recovery SLAs and reduce capital expenses, operating expenses and training costs. The typical organization has seven backup applications, generally point products for particular databases.
Deploy deduplication everywhere to help address the information explosion and reduce the rising costs of backing up data. The goal is not simply to deduplicate backups: consider placing an archive with deduplication capabilities next to applications such as Exchange or SharePoint, which tend to be the biggest data offenders (see the sketch after this list).
Use appliances to simplify backup and recovery operations.
Establish C-level ownership of information governance. Building an information-responsible culture and creating an umbrella of information governance can help organizations capture synergies across focused projects.
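Product specifics aside, the core idea behind deduplication is straightforward: split data into chunks, fingerprint each chunk, and store each unique chunk only once. Here is a minimal, illustrative sketch in Python; the fixed-size 4 KB chunking and SHA-256 fingerprints are assumptions made for clarity, not how any particular product implements it (commercial systems typically use variable-size chunking):

import hashlib

def dedupe_chunks(data, chunk_size=4096):
    # Map each chunk's fingerprint to the chunk itself; duplicates collapse.
    store = {}
    # Ordered list of fingerprints needed to reconstruct the original data.
    recipe = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # store each unique chunk only once
        recipe.append(digest)
    return store, recipe

# A payload with heavy internal repetition deduplicates dramatically:
data = b"A" * (4096 * 100) + b"B" * (4096 * 5)
store, recipe = dedupe_chunks(data)
print(len(recipe), "logical chunks,", len(store), "unique chunks stored")
# prints: 105 logical chunks, 2 unique chunks stored

The same principle explains why deduplicating next to the application pays off: repeated attachments and near-identical documents in systems like Exchange or SharePoint collapse to a handful of stored chunks before they ever reach backup.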