For all its undeniable benefits, the information-technology revolution comes at a cost; several costs, in fact. The first is energy: U.S. data centers consumed 61 billion kilowatt-hours in 2006, ten times the amount consumed by all residences and businesses in San Francisco. That in turn exacts an environmental price, with IT an underappreciated but sizable contributor to Corporate America’s collective carbon footprint.
Whatever the dangers of the latter, it is the former that has companies taking action. In a survey by Forrester Research asking companies that have carried out green IT initiatives what motivated them, the number-one answer was high energy costs. “It’s not necessarily to save the planet,” says Christopher Mines, a senior vice president at Forrester. “It’s the other green.”
Environmental concerns did rank second, but some experts say that reducing energy usage may depend less on what the priorities are than on who sets them. “This is often seen as just another issue for the IT organization,” says Ken Brill, founder of the Uptime Institute, an IT benchmarking and consulting firm based in Santa Fe, New Mexico. “But the CFO knows it’s about the viability of the company. If he or she mandates that you have to look at the energy efficiency of equipment when making procurement decisions, you’ll see big changes.”
Cutting energy costs doesn’t just hinge on buying the newest, most energy-efficient equipment. A number of related steps can help companies cut IT power consumption in half within 12 to 18 months, according to Brill.
The first order of business may be to consolidate servers. “There is gold all over the floor of the data center waiting for someone to pick it up,” says Brill. He estimates that up to 30 percent of the servers in a typical data center could be shut off because they are obsolete and were never decommissioned. Of those left, Brill says that, on average, 95 percent of their capacity goes untapped.
Virtualization technology is designed to address that. Typically, a server runs only a single operating system and one application. Virtualization software allows one server to run multiple operating systems and applications, thus doing the job of as many as 30 devices. The Uptime Institute estimates that turning off just one $2,500 server would save up to $1,270 a year in direct electricity and cooling costs, not to mention associated software and maintenance costs, and eliminate tons of greenhouse gases.
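The arithmetic behind these figures is easy to run for a data center of any size. The sketch below (Python, purely illustrative) applies the Uptime Institute estimates cited above naively and uniformly; the obsolete-server share, the 30-to-1 consolidation ratio, and the per-server savings are the article’s figures, not measured values, and real savings would vary widely.

```python
# Back-of-envelope estimate of consolidation savings, using the
# Uptime Institute figures cited in the article. Illustrative only.

SAVINGS_PER_SERVER = 1270   # est. annual power + cooling savings per server ($)
OBSOLETE_FRACTION = 0.30    # share of servers that could simply be shut off
CONSOLIDATION_RATIO = 30    # servers one virtualized host might replace

def estimated_annual_savings(total_servers: int) -> int:
    """Estimate yearly savings from shutting off obsolete servers and
    virtualizing the remainder at the assumed consolidation ratio."""
    obsolete = int(total_servers * OBSOLETE_FRACTION)
    remaining = total_servers - obsolete
    # Roughly one physical host survives per CONSOLIDATION_RATIO servers.
    hosts_needed = -(-remaining // CONSOLIDATION_RATIO)  # ceiling division
    retired = obsolete + (remaining - hosts_needed)
    return retired * SAVINGS_PER_SERVER

print(f"${estimated_annual_savings(1000):,}")  # a hypothetical 1,000-server floor
```

Under these assumptions, a 1,000-server data center would retire 976 machines and save well over a million dollars a year, which is why Brill calls it gold waiting to be picked up.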
Virtualization has helped Highmark Inc., a Pennsylvania-based health insurer, lower IT power consumption by 10 percent this year. It’s not without challenges, however. “Users think they own a server, and even if it’s only 5 percent utilized they don’t want anyone else on it,” says chief information officer Tom Tabor. “So it’s been a culture change.”
Whatever the number of boxes in the data center, better cooling technologies can yield big savings. As servers have become more powerful, the amount of heat they generate has risen; at the same time, they have become more sensitive to high temperatures. Old methods of air conditioning an entire room are being jettisoned as companies find innovative ways to chill equipment.