RussianSensation
Elite Member
Your link shows the power consumption for one game. You can't draw final conclusions from one game.
It doesn't work like that. It's a crazy assertion. Surely AMD's method for finding the 7970's TDP differed from NVIDIA's for Kepler, but in no case would they base their figures on simply loading a save point in a single game (Crysis 2).
No one said anything about using Crysis 2 as the be-all and end-all game for power usage. However, if you browse 30 reviews of the cards listed below, you will see that every one of them consumes a different amount of average and peak power in games. TDP is really guidance for cooling/heatsink design, i.e. how much heat the cooler has to dissipate. TDP != power consumption.
HD6970 = 250W
R9 280 = 250W
HD7970 = 250W
HD7970GE = 250W
780 = 250W
780Ti = 250W
All of these cards draw different amounts of power in practice, yet they all carry the same 250W TDP rating.
The 7970 and 6970 are 190-200W cards and the 780 is a 220W card, not a 250W one, while the 780 Ti easily exceeds 250W:
http://www.techpowerup.com/mobile/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html
Stand back and just think about it: are you really going to claim that an R9 280 uses up to 250W when the 7970 GHz Edition also uses 250W? The TDP ratings from AMD and NV tell us little about a card's real-world power usage in games.
The R9 280 is barely higher clocked than the 7950, yet it has a 250W TDP. The real-world power usage of cards such as the 280, 7950, or 7970 is far below 250W.
Look at the TDP of the 280X vs. the 770. You would think the 770 uses far less power in games, but that's not even remotely true.
http://techreport.com/review/25466/amd-radeon-r9-280x-and-270x-graphics-cards/10
Or
http://www.anandtech.com/show/6994/nvidia-geforce-gtx-770-review/16
In fact, the variance in power usage among 280X models alone shows how worthless TDP is for gauging real-world power consumption in games:
http://www.techspot.com/review/841-radeon-r9-280x-roundup/page11.html
There are countless cases of cards that use much less power than their TDP, about the same, or way more. The fact that AMD and NV don't even define TDP the same way makes comparing the numbers even more pointless.
Then we get to the part where after-market cards use better cooling, digital power delivery and overall more efficient components. As a result, you can have an after-market card that uses less power than the reference design. Alternatively, the board may be designed specifically for heavy overclocking (Classified/Lightning/Matrix/Vapor-X), and those tend to use more power than the reference design. Unless you test the specific card in question, oftentimes the TDP rating doesn't line up at all.
It took gamers a long time to accept that FurMark was a waste of time for gauging a GPU's real-world power usage; treating TDP as power consumption is another one of those old errors/myths that needs to go away. We have tools that let us measure a card's actual power draw, so there is no reason to keep staring at an arbitrary number on the box.
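On the "tools" point: for NVIDIA cards the driver already exposes the board power sensor through NVML, so you can log the real draw yourself while a game is running. Below is only a minimal sketch using the pynvml Python bindings; the GPU index 0, the 10 samples, and the 1-second polling interval are placeholders I picked, and it assumes the card actually reports power through NVML.

```python
# Minimal sketch: log a GPU's reported board power draw via NVML.
# Assumes the pynvml bindings are installed and the card exposes power readings.
import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    for _ in range(10):
        # nvmlDeviceGetPowerUsage returns milliwatts for the whole board
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0
        print(f"Board power draw: {watts:.1f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Run something like this in the background during a gaming session and you get actual numbers for your specific card instead of the sticker TDP.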