
Digression: Is it true that every single watt my CPU/server/electronics consume is ultimately turned into heat?

To put it another way, does a server under a constant 500W produce the exact same amount of heat as a dumb 500W electric space heater? Excluding things like the tiny amount of energy that gets sent out on CAT5 cables, which I believe still gets turned into heat just elsewhere.

I think this is true, but it’s still counterintuitive. Makes me think there’s something more useful we could make electric space heaters do with their energy and still produce the same amount of heat.
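
A toy energy balance of what I mean (the non-heat numbers are made up, just to show how small they are):

    # Toy energy balance for a 500 W server. The non-heat outputs are made-up
    # small numbers purely for illustration.
    input_power_w = 500.0

    network_signal_w = 0.2   # assumed: signal energy leaving on the Ethernet cable
    light_w = 0.05           # assumed: LED/display light escaping the room
    sound_w = 0.01           # assumed: fan noise escaping the room

    heat_w = input_power_w - (network_signal_w + light_w + sound_w)
    print(f"heat dissipated locally: {heat_w:.2f} W of {input_power_w:.0f} W")
    # ...and the escaped fraction becomes heat somewhere else anyway.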



> I think this is true, but it’s still counterintuitive. Makes me think there’s something more useful we could make electric space heaters do with their energy and still produce the same amount of heat.

Indeed: people have experimented with mining during the winter to recoup electric heating costs. That makes sense, but only if you ignore the (expensive, and getting more expensive by the day) cost of procuring the mining hardware in the first place (i.e. a GPU).


Hardware is only a big cost if you need current-gen mining hardware to be competitive. If the power is essentially "free" (since you'd be using it for heat anyway), then even very old mining hardware is a net positive. It might only be worth a cup of coffee a week, but it's "free" money you can collect...


It still depends on what your alternative is. Much of the world does not use electricity for heat; in the United States, electricity is rarely used for heat in the parts of the country that traditionally get brutal winters. It's cheaper for me to heat my home with a forced-air natural gas furnace (even though it's a less efficient fuel-to-heat process than resistance heating), given the disparity between residential electricity and natural gas prices. I tried mining ethereum for a month last winter and I don't think I broke even (despite using a relatively recent RTX 2080).
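
Rough numbers on why gas wins for me (the prices and efficiencies below are illustrative assumptions, not my actual bills):

    # Cost per kWh of delivered heat: electric resistance vs. natural gas furnace.
    # All prices and efficiencies are assumed, illustrative values.
    KWH_PER_THERM = 29.3           # 1 therm of natural gas ~ 29.3 kWh of chemical energy

    electricity_price = 0.15       # assumed $/kWh, residential
    gas_price = 1.20               # assumed $/therm, residential
    furnace_efficiency = 0.90      # assumed furnace efficiency; resistance heat is ~1.0

    cost_electric = electricity_price / 1.0
    cost_gas = gas_price / (KWH_PER_THERM * furnace_efficiency)

    print(f"resistance heat: ${cost_electric:.3f} per kWh of heat")  # ~$0.150
    print(f"gas furnace:     ${cost_gas:.3f} per kWh of heat")       # ~$0.046

Even with a furnace that wastes 10% of the fuel energy, gas comes out at roughly a third of the cost per kWh of heat at those assumed prices.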


In short, yes, exactly (well, minus the few watts used to spin the fans and hard drives). Here's one source from a quick search:

https://www.pugetsystems.com/labs/articles/Gaming-PC-vs-Spac...

I remember seeing a paper referenced here once explaining why that energy loss is actually necessary, but I don't seem to have it saved anymore.


> I remember seeing a paper referenced here once explaining why that energy loss is actually necessary, but I don't seem to have it saved anymore.

Taking a stab at explaining the energy loss…

When computing, you are reducing local entropy (e.g. achieving a certain pattern of electric charges in the chips and a certain pattern of pixels on the screen) while still increasing global entropy (the second law of thermodynamics). Heat dissipation from the local system to its surroundings accounts for the difference.

This is analogous to Schrödinger's description of life, where life forms maintain their own low entropy by increasing the entropy of their surroundings.
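
If you want a number for how little heat computation requires in principle, Landauer's principle gives the floor: erasing one bit dissipates at least k_B·T·ln 2. A quick sketch (room temperature assumed, and the bit-erasure rate is a made-up figure):

    import math

    # Landauer's limit: minimum heat to erase one bit is k_B * T * ln(2).
    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # assumed room temperature, K

    e_bit = k_B * T * math.log(2)       # ~2.9e-21 J per bit erased
    print(f"Landauer limit at {T:.0f} K: {e_bit:.2e} J per bit")

    # Even a (generous, made-up) 1e15 bit erasures per second would only need:
    landauer_power = e_bit * 1e15       # ~3e-6 W
    print(f"thermodynamic minimum at 1e15 erasures/s: {landauer_power:.1e} W")

So essentially all of a 500 W server's draw is dissipated far above the thermodynamic minimum; the Landauer cost is on the order of microwatts.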


The spinning fans use energy to move air, a (slightly) viscous fluid, which eventually encounters turbulence and friction and turns its kinetic energy into heat. Similarly, disks use energy to overcome friction in their bearings and to generate noise, which is kinetic energy in the form of vibrations that also eventually dissipates into heat.


By a paper, do you mean the laws of thermodynamics?

The fans also eventually dissipate their energy as heat when they push air around.

There is an alternative heater design for the same power, though: a heat pump (the same as an AC, run in reverse), which can be much more efficient at heating (down to about -5°F outside, or so, below which it may freeze over and stop working entirely).
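
A rough illustration of the difference (the COP below is an assumed, typical-ish value; real heat pumps vary a lot with outdoor temperature and drop toward 1 in deep cold):

    # Heat delivered for the same electrical input: resistance heater vs. heat pump.
    electrical_input_w = 500.0
    cop = 3.0   # assumed coefficient of performance: ~3 units of heat moved per unit of electricity

    resistance_heat_w = electrical_input_w * 1.0   # all 500 W becomes heat in the room
    heat_pump_heat_w = electrical_input_w * cop    # ~1500 W delivered indoors

    print(f"resistance heater: {resistance_heat_w:.0f} W of heat")
    print(f"heat pump (COP {cop:.0f}): {heat_pump_heat_w:.0f} W of heat")

The heat pump can beat "100% efficiency" because it moves heat in from outdoors rather than generating all of it from the electricity.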


> something more useful...

Ha! A space heater with bitcoin mining coils. Call them "bitcoils".


I'm sure you've heard the old joke about gamer college students using their fancy graphics cards to heat their rooms. :)


I did this last winter. Mining 24/7 on my personal rig noticeably took a bit of load off the heater and made a modest amount of ethereum. I stopped once the weather warmed as it seemed like a waste of energy.


I did the same, then my graphics card died and it was a net loss :-(


How were your electric bills?


Not appreciably different; that was the whole logic behind my thinking. In my case I have electric resistance heaters, so it doesn't make a difference compared to a GPU -- they're both effectively 100% efficient at turning electricity into heat.


The amounts of energy that go anywhere else - LED light, radio waves, sound - are small. Almost all of it will go to heat.


And all of those things will go to heat eventually too, as will all other energy!



