As every computer owner knows, machines running complex programs get hot. In fact, cooling the processors can be expensive, especially at the scale of the huge data centers run by large Internet firms such as Google, Apple and Microsoft, which contain thousands of computer servers. As the servers process information, they generate enormous amounts of heat, requiring cooling towers to dissipate it into the atmosphere.
A Dutch firm thinks paying for electricity to run the servers and then paying again to cool them is a waste of energy.
Boaz Leupe, CEO of the start-up Nerdalize, says the company's solution is actually quite simple: put the servers in people's homes and let them double as heaters.
"We don't actually have to build the data center, which saves a lot of costs in infrastructure and we don't have the cooling overhead, plus that you have the environmental benefit, that the kilowatt hour you are using is used twice, once to heat the home and once to compute the clients task without the cooling overhead," says Leupe.
The company developed what it calls an e-Radiator, a computer server that doubles as a heating source. Leupe says five Dutch homeowners are currently testing the devices in their homes.
“We reimburse the electricity the server uses, and that we can do because of the computing clients on the other side. In that way, homeowners actually get heating for free, and computer users don't have to pay for the overhead of the data center,” says Leupe.
One of the participants in the year-long experiment, Jan Visser, says the amount of heat produced by the e-Radiator depends on the workload running on the server’s processors, so it cannot be used as a home's primary heating source. But he is ready to try it.
“If it gives good enough warmth, you can use less of your existing central heating, and there is the chance for a homeowner to pay lower bills.”
Nerdalize says e-Radiators generate temperatures of up to 55 degrees Celsius (131 degrees Fahrenheit) and could save a household up to $440 in annual heating costs.