Massive data centers like those used by Google and Microsoft are notorious for consuming vast amounts of energy. But some say that dispersing servers throughout a community could actually help local homes and businesses conserve energy.
People talk about the cloud and cloud computing as if it were something the geniuses at Google hide in their basement. In reality, the cloud is just a collection of servers, usually housed in large data centers.
These clusters of servers need massive amounts of electricity to run, and because they're running 24/7/365, they also need constant air conditioning to keep from melting down.
It's this waste heat that has researchers at Microsoft and the University of Virginia talking. In "The Data Furnace: Heating Up with Cloud Computing," they argue that the problem of servers' heat generation can be turned into an advantage: computers placed in buildings could provide low-latency cloud computing while also heating living space, water, or even clothes dryers.
So instead of fighting the heat produced by constant use, operators could put that waste heat to work reducing the energy bills of homes and businesses in the community.
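To make the idea concrete, here is a rough back-of-envelope sketch in Python. The server count and wattage figures are illustrative assumptions, not numbers taken from the paper; the point is simply that nearly every watt a server draws comes back out as heat.

    # Back-of-envelope: heat from a small in-home server cabinet vs. a home furnace.
    # All figures below are assumptions for illustration, not data from the paper.

    SERVER_POWER_W = 400        # assumed draw of one fully loaded server
    NUM_SERVERS = 40            # assumed size of a small "data furnace" cabinet
    FURNACE_OUTPUT_W = 20_000   # rough heat output of a typical gas furnace (~68,000 BTU/h)

    # Essentially all electricity a server consumes is released as heat.
    heat_output_w = SERVER_POWER_W * NUM_SERVERS

    print(f"Cabinet heat output: {heat_output_w / 1000:.1f} kW")
    print(f"Share of a furnace's output: {heat_output_w / FURNACE_OUTPUT_W:.0%}")

Under those assumptions, a cabinet of a few dozen servers throws off several kilowatts of heat, a meaningful fraction of what a household furnace delivers.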
Electricity consumed by computers and other IT equipment has been skyrocketing in recent years and has become a substantial part of the global energy market. In 2006, the IT industry used 61 billion kWh of electricity (about 3% of total energy consumption in the U.S.), and it remains one of the fastest-growing industrial sectors. Energy efficiency is not only important for reducing operational costs; it is also a matter of social responsibility for the entire IT industry.
Image Credit: neospire