Mine can’t be the only organization with this problem. On one hand, our plant-management folks are talking about the millions of dollars we could save each year by deploying power management software on our computers. With over one hundred sites, all those computers humming between 5PM and 8AM mean a lot of heat getting cooled for no particular reason.
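A quick back-of-envelope sketch shows how the numbers get to "millions." Every figure below except the site count is an assumption I've plugged in for illustration; swap in your own fleet size, wattage, and utility rate.

```python
# Rough annual cost of desktops idling overnight. All figures except the
# site count are illustrative assumptions, not measured numbers.
SITES = 100            # "over one hundred sites"
PCS_PER_SITE = 500     # assumed desktops per site
IDLE_WATTS = 150       # assumed draw per idle desktop, incl. PSU losses
IDLE_HOURS = 15        # 5PM to 8AM
DAYS = 260             # working days per year
COST_PER_KWH = 0.10    # assumed electricity price in dollars

kwh_per_year = SITES * PCS_PER_SITE * IDLE_WATTS / 1000 * IDLE_HOURS * DAYS
annual_cost = kwh_per_year * COST_PER_KWH
print(f"~${annual_cost:,.0f} per year")
```

At these assumed rates that's roughly $3 million a year, before counting the HVAC bill for cooling the waste heat back out of the building.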
At the same time, the rise of grid computing (in which I have more than a passing interest) demands CPU cycles to help find cures for cancer, to design the peptide polymer behind the latest designer drug against the dreaded liver spots, or to search for Alf’s home planet.
There’s no confluence of interests in the cube farm. Sure, monitors can be powered down, and with flat-screen monitors the heat difference is much smaller anyway. But that CPU, twiddling its digital thumbs through most of its glacially paced interactions with humans, can’t effectively race for the cure while simultaneously giving the HVAC systems — and our environment — a break.
The solution is to move grid computing off the desktop, where the erg ROI is shaky and the grid software just adds to desktop TCO, and back onto the server. Servers handle larger jobs better, with concomitantly faster payoffs, so grid computing is more easily accommodated there. This is the logical direction: the grid community’s ROI improves, and the incremental cooling costs are managed in a more efficient, better-controlled environment.
And we’ll let our faithful desktop servants sleep when we leave them at the end of the day.