Quick, what segment of your IT setup uses the most electricity: your servers and network technology or the assorted PCs scattered around the cube farm? If you said the stuff in the server room, you're wrong.
According to data recently released by Forrester Research, desktop computers, printers, and monitors account for a full 55 percent of a company's IT-related electricity use, while servers and other data center gear account for only 45 percent.
The survey polled more than 300 companies in North America and Europe to determine how technology affects energy consumption. Respondents ranged from telecom and utility firms to finance and insurance companies. As ZDNet's Heather Clancey points out, "From a geographic standpoint, European companies used less electricity in their data centers than North American ones, according to the Forrester research. Ditto for stuff outside the data center. There was very little difference in usage when it came to company size."
To put those numbers in better perspective, earlier Forrester research found that the energy used to power a data center with 25,000 servers for one month equals the electricity required to power 420,000 homes for an entire year. That's staggering. Even more concerning, those figures are several years old, so consider what consumption levels must be now -- even with the push for green IT. No wonder analyst firm Gartner predicted that data centers would run out of power by the end of this year.
Now, obviously data centers aren't going to go dark in the next two weeks or alarm bells would be sounding across the planet. There's no denying, however, that the IT sector is confronting a serious problem.
What's the takeaway message? Well, broadly speaking, make a New Year's resolution to learn all you can about green IT. In the short term, for heaven's sake, shut down all those PCs before you lock up the office this holiday season.
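For a rough sense of what that last tip is worth, here's a back-of-envelope sketch. Every number below is an illustrative assumption rather than a figure from the Forrester study: the PC count, idle wattage, break length, and electricity rate are placeholders, so plug in your own values.

    # Rough estimate of electricity saved by shutting office PCs down over a holiday break.
    # Every input below is an illustrative assumption; substitute your own fleet's numbers.
    NUM_PCS = 500           # assumed number of desktops that would otherwise stay on
    WATTS_PER_PC = 100      # assumed average idle draw per PC plus monitor, in watts
    BREAK_HOURS = 10 * 24   # assumed 10-day holiday shutdown
    RATE_PER_KWH = 0.10     # assumed electricity price in dollars per kilowatt-hour

    kwh_saved = NUM_PCS * WATTS_PER_PC * BREAK_HOURS / 1000   # watts -> kilowatt-hours
    dollars_saved = kwh_saved * RATE_PER_KWH

    print(f"Estimated savings: {kwh_saved:,.0f} kWh, roughly ${dollars_saved:,.0f}")

With those assumed numbers, the shutdown saves about 12,000 kWh, or roughly $1,200, over a single break. The point isn't the exact figure; it's how quickly idle desktops add up.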