8th December 2017, 00:01   #1283
TheFluff
Excessively jovial fellow
 
Join Date: Jun 2004
Location: rude
Posts: 1,100
Quote:
Originally Posted by lordsmurf
I disagree. It's a pretty good guide for what temperatures you can expect, both at idle and at 100%. Again, I get 25 from a CPU with mild load, but an "idle" GPU would easily be in the 30s-40s or more, at least doubling my system heat output. TDP suggested it. The TDP of that CPU is about 90W, while the TDP of most fancy graphics cards is well into the 100s-200s.
I realize that historically I have had absolutely zero success in convincing the various doom9 forum nutjobs I've encountered over the years of anything, but for the record... I currently have, in the computer right next to me, a top-of-the-line graphics card powering my 4K monitor. It's 300mm long, it's got 8GB of VRAM, it's got three big honking fans on it, and it'll clock up to somewhere north of 1.9GHz if it needs to. It's got more than twice the TDP of my CPU, at an impressive 180W. As you can see, this monster is turning the room into a sauna even when idling:
[screenshot removed: GPU sensor readout at idle, fans at 0 RPM]
oh. uh. well.
The fan readout isn't bugged, by the way; the card has a ginormous heatsink on it and the fans are intentionally stopped when the GPU is near idle.
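If you're wondering how that works: semi-passive cards just park the fans below a temperature threshold and only spin them up once the die gets warm, with some hysteresis so they don't constantly toggle on and off. A minimal sketch of the idea (the thresholds here are made up for illustration, not the card's actual firmware values):

Code:
def fan_duty(gpu_temp_c: float, spinning: bool) -> float:
    """Illustrative semi-passive fan curve. Returns duty cycle 0.0-1.0."""
    START_C = 60.0  # hypothetical: parked fans spin up above this
    STOP_C = 50.0   # hypothetical: spinning fans park again below this
    MAX_C = 85.0    # hypothetical: full duty at or above this

    # Hysteresis: the temperature needed to start spinning is higher
    # than the one at which the fans are allowed to stop again.
    threshold = STOP_C if spinning else START_C
    if gpu_temp_c < threshold:
        return 0.0
    # Linear ramp from the stop threshold up to full duty.
    return min(1.0, (gpu_temp_c - STOP_C) / (MAX_C - STOP_C))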

Compare this to the old faithful i5 CPU, which is actually using more power (almost a whole watt more! what a waste!) and running its fan at a few hundred RPM:
[screenshot removed: CPU sensor readout at idle, fan at a few hundred RPM]
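If anyone wants to check their own idle numbers instead of extrapolating from TDP figures, something like this works (assuming an NVIDIA card with nvidia-smi on the PATH; the printed values are just an example, not mine):

Code:
import subprocess

# Query the card's actual sensor readings at this instant.
out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=power.draw,fan.speed,temperature.gpu",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # e.g. "14.52 W, 0 %, 35"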
Also, lol at the idea of even trying to measure the heating effect that dissipating these monstrous 30-ish watts has on an entire room. An old-school incandescent bulb in a desk lamp would put out more heat! Even 300 watts is pretty much peanuts when it comes to space heating. It's an absolutely trivial amount of heat in relation to the thermal inertia of even a small apartment.
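Don't take my word for it, the arithmetic is trivial. A back-of-envelope sketch (room size and constants are assumptions, and it heats only the air, ignoring the walls, furniture and ventilation that soak up heat in reality, i.e. exactly the thermal inertia that makes the real-world rise far smaller still):

Code:
AIR_DENSITY = 1.2  # kg/m^3, room-temperature air
AIR_CP = 1005.0    # J/(kg*K), specific heat of air

def air_temp_rise_per_hour(power_w: float, room_m3: float) -> float:
    """Ideal-case degC/hour from dumping power_w into a sealed
    volume of air, with zero losses. An upper bound, not reality."""
    air_mass_kg = AIR_DENSITY * room_m3
    return power_w * 3600.0 / (air_mass_kg * AIR_CP)

# ~30 W of idle PC dissipation into an assumed 40 m^3 room:
print(air_temp_rise_per_hour(30.0, 40.0))    # ~2.2 degC/h, best case
# versus a typical 1.5 kW space heater:
print(air_temp_rise_per_hour(1500.0, 40.0))  # ~112 degC/h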

Last edited by TheFluff; 8th December 2017 at 00:20.