Generally, the more power budget a GPU has, the more performance it can achieve, as long as it has sufficient cooling. Higher power budgets also mean more heat, so there's a balance there.

Edit: I want to address an argument that happened below. Despite being downvoted, u/splepage is absolutely correct. Every watt of power consumed by an electrical device is eventually output in the form of heat. A 500W TDP graphics card using all 500W of its TDP would output exactly as much heat as a 500W space heater. This is a big part of the reason why we don't want GPUs from each successive generation to continuously raise their TDPs, which unfortunately is exactly what it looks like is going to happen (again) with the upcoming generation. It's bad for the environment and often turns our gaming spaces into uncomfortable hotboxes. What we should want is for Nvidia and AMD to continuously improve on both performance AND power efficiency. End of rant.

Card temperature readouts have nothing to do with how much wattage of heat the card is outputting into the room. If raising the power limit from 100W to 110W ends up raising the GPU temperature from 70C to 80C, you can't say the "heat" has gone up 14%, for several reasons. First of all, Celsius can't be used for this since it's not an absolute scale; you'd have to use an absolute temperature scale like Kelvin. And even then, the die temperature only tells you how well the cooler is keeping up. The heat output tracks the power draw, which went up 10%, not 14%.
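To put numbers on that last paragraph, here's a quick throwaway sketch (plain Python, reusing the 70C/80C and 100W/110W figures from above):

```python
# Why percentage changes in Celsius are meaningless: Celsius has an
# arbitrary zero point, so ratios between Celsius readings mean nothing.

def celsius_to_kelvin(c: float) -> float:
    return c + 273.15

t_before_c, t_after_c = 70.0, 80.0  # figures from the comment above

# Naive ratio taken on the Celsius scale:
print(f"Celsius 'increase': {(t_after_c / t_before_c - 1) * 100:.1f}%")  # ~14.3%

# The same two temperatures on the absolute Kelvin scale:
k_before = celsius_to_kelvin(t_before_c)  # 343.15 K
k_after = celsius_to_kelvin(t_after_c)    # 353.15 K
print(f"Kelvin increase: {(k_after / k_before - 1) * 100:.1f}%")  # ~2.9%

# And neither ratio is the heat output anyway; that tracks power draw:
print(f"Heat output increase: {(110 / 100 - 1) * 100:.0f}%")  # 10%
```

Same two readings, three different percentages, and only the last one says anything about heat entering the room.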
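And for the space-heater comparison in the edit, a back-of-the-envelope sketch (the 500W card is the hypothetical from above; the 4-hour session length is just my own assumption):

```python
# All electrical power a GPU draws ends up as heat in the room, so a
# card running at its full power limit heats the room like a space
# heater of the same wattage.

WATT_TO_BTU_PER_HR = 3.412  # 1 W of continuous draw = ~3.412 BTU/hr

def heat_btu_per_hr(watts: float) -> float:
    """Heat output of a device drawing `watts` continuously, in BTU/hr."""
    return watts * WATT_TO_BTU_PER_HR

def session_heat_kwh(watts: float, hours: float) -> float:
    """Total heat dumped into the room over a session, in kWh."""
    return watts * hours / 1000.0

tdp = 500  # the hypothetical 500W card from the comment
print(f"{tdp} W at full tilt: {heat_btu_per_hr(tdp):.0f} BTU/hr")  # ~1706 BTU/hr
print(f"4-hour session: {session_heat_kwh(tdp, 4):.1f} kWh of heat")  # 2.0 kWh
```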