http://www.hardware.fr/articles/787-4/dossier-nvidia-geforce-gtx-480.html
Some real power draw data for the video card only.
You can clearly see the 480 is pulling some mean wattage, getting up to 298W. NV basically lied at the last minute, revising the TDP for their AIB partners because admitting a near-300W limit for a single GPU would have made many of them faint. Another review has it reaching a max of 318W and 105°C temps! So yeah, better have a damn well-ventilated case.
Huge die size, insane power draw and heat, expensive, and all for a marginal performance gain in a few games and one useless synthetic benchmark? I don't see how this kind of monolithic architecture can keep going at this rate. Fermi is an utter failure. Looking forward to the next-generation battle.
They didn't lie outright; they just changed their system.
Instead of TDP being basically a worst-case scenario, as it is for AMD cards and all previous NV cards, they changed it (seemingly only on the GTX480, looking at the various numbers) to mean typical power draw. I think it's similar to the difference there is, or at least used to be, between AMD and Intel TDPs, with AMD quoting typical and Intel quoting maximum.
The main "issue" is that they seem to still be using the old system on the GTX470 (225w was the load under Furmark and in gaming it was lower) while the other method is used on the GTX480, with 250w being the TDP and typical load, and Furmark etc giving maximum load at ~300w or so, which is what the TDP was claimed as being by certain parties, and what it would be if they followed the system which they used on all their previous cards and that AMD/ATI use for their cards.
So yeah, it's all marketing BS, and they changed the TDP definition to make the card seem less of a dog than it really is. The GTX470's 225W still looks like a realistic TDP, or so it seems from the numbers available. The 480 is just a joke though.