That's why TDP is worthless for enthusiast PC gamers:
1) AMD/Intel and NV define TDP differently. Therefore, TDP cannot be compared between these brands. Anyone who uses TDP to compare an AMD GPU against an NV GPU is usually wasting their time.
2) Since you already admitted that TDP deals with heat dissipation, not power usage, we shouldn't care what the TDP of any component is. In fact, there is another key reason why TDP is meaningless:
We can measure the actual power consumption of a given component in many tasks, from gaming to distributed computing to mining. I don't care if my GPU has a TDP of 1000W. If I look at the power usage of the 10 most demanding PC games, from Crysis 2/3 to Metro LL, and add distributed computing/rendering workloads, I have measured my 99th percentile power usage for real-world (non-theoretical/non-power-virus) scenarios.
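For anyone curious, that measurement boils down to something like this (all numbers made up, just a sketch of a nearest-rank percentile over logged peak draws):

```python
# Hypothetical example: peak GPU power draw (watts) logged across a set
# of demanding workloads (games + distributed computing). Numbers are
# invented for illustration only.
draws = [178, 185, 192, 201, 188, 210, 195, 183, 205, 198, 215, 190]

def percentile(values, p):
    """Nearest-rank percentile: the smallest logged value such that at
    least p% of the measurements fall at or below it."""
    ordered = sorted(values)
    rank = max(1, round(p / 100 * len(ordered)))
    return ordered[rank - 1]

p99 = percentile(draws, 99)
print(f"99th percentile real-world draw: {p99}W")  # prints 215W here
```

With a dozen logged workloads the 99th percentile is effectively the worst case you actually hit, which is the number that matters for a build, not the label on the box.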
3) Since a GPU can actually exceed its rated TDP, TDP tells me little about my GPU's maximum power consumption. I know that a stock R9 280 cannot use as much power as a stock HD 7970 GHz, and that a stock 780 cannot use as much power as a stock 780 Ti, yet they are all rated at 250W TDP. That tells me little about their power usage in the real world and shows how out of touch the TDP rating has become, since it's just randomly assigned to many SKUs.
4) If you had no idea what the TDP of a CPU/GPU was, it wouldn't change anything about your build, because ultimately you'd simply read a review that measured the real-world power usage. Since there are GPU coolers capable of dissipating 450-600W (Gigabyte Windforce, Asus Matrix, MSI Lightning), I really don't worry about a 250W TDP-rated card overheating. Similarly, someone who is going to be overclocking an X99 CPU isn't going to use a $20 budget heatsink.
5) Since different tasks, especially games, increase the load on the rest of the system's components, looking at TDP alone doesn't tell me the overall performance/watt (i.e., efficiency) of a modern gaming rig.
Ultimately, I have to look at reviews which find that out for me.
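To show what point 5 means in practice, here's a toy perf/watt comparison (all figures invented; the point is that efficiency comes from measured whole-system results, not TDP labels):

```python
# Hypothetical measured results for two rigs running the same game:
# average fps and total system draw at the wall (watts). Made-up numbers.
rigs = {
    "rig_a": {"fps": 90, "system_watts": 400},
    "rig_b": {"fps": 75, "system_watts": 300},
}

for name, r in rigs.items():
    eff = r["fps"] / r["system_watts"]  # frames per second per watt
    print(f"{name}: {eff:.3f} fps/W")
```

In this made-up case the slower rig is actually the more efficient one (0.250 vs 0.225 fps/W), which is exactly the kind of result no component TDP rating could have told you.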
It's OEMs that need TDP because they don't have the time to test the power usage of games, distributed computing, and other such tasks. It's a shortcut that tells them that if a GPU is rated at 250W TDP, their 300W PSU won't cut it, and an SLI system with 2 of those GPUs crammed into a micro-ATX case with 1 fan won't work well.
TDP is just general guidance, but anyone who cares about real-world power usage in games should just measure it. We tend to use TPU since they test GPU-heavy games such as Crysis 2 and Metro, which are pretty good at representing 95% of GPU load in games.
---
TL;DR
TDP is worthless for enthusiast PC gamers because:
1) We can measure real-world power usage, which makes the TDP rating a theoretical number. Why would I care about a theoretical number when I can get actual real-world results?
2) Since TDP is defined differently across AMD/Intel/NV, comparing TDPs is pointless.
3) Modern high-end CPU/GPU after-market cooling solutions can easily dissipate 250W, which makes TDP heat dissipation ratings pointless for overclockers. Overclockers compare real-world results; they don't care about theoretical numbers.
4) If I don't know the TDP rating of a particular PC component, I lose nothing at all, since in the end I only care about real-world power usage in real-world scenarios. Since a GPU or CPU doesn't operate in a vacuum, I need to know the actual power usage of the total system when assessing whether my PSU is sufficient.
5) Looking at the TDP of individual components tells me little about their overall efficiency in a complete build under load because of CPU/GPU-limited situations.
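For what it's worth, the PSU check from point 4 of the TL;DR boils down to this kind of arithmetic (all figures hypothetical; a real build should use measured whole-system draw from a review or a wall meter):

```python
# Hypothetical measured peak draws (watts) for a build under a gaming
# load -- measured numbers, not TDP labels. Invented for illustration.
measured_draw = {
    "gpu": 230,
    "cpu": 110,
    "motherboard_ram_drives_fans": 60,
}

def psu_ok(draws, psu_watts, headroom=0.20):
    """True if the PSU covers the total measured draw plus a safety
    margin (headroom for load spikes and capacitor aging)."""
    total = sum(draws.values())
    return total * (1 + headroom) <= psu_watts

total = sum(measured_draw.values())
print(f"total measured draw: {total}W")
print("550W PSU sufficient:", psu_ok(measured_draw, 550))
print("450W PSU sufficient:", psu_ok(measured_draw, 450))
```

Here the build pulls 400W measured, so with 20% headroom a 550W unit passes and a 450W unit doesn't. The 20% margin is just one reasonable rule of thumb, not a standard.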