Keys didn't list all of the facts, but at least he listed something. Mk6 jumped into this thread and claimed that OCing two 5850s to 1 GHz = ownage, without knowing, or deliberately hiding, the facts that a) not all 5850s can OC up to 1 GHz, b) the heat generated will be much higher than 470 SLI, c) it eats a lot of electricity, and d) the 470 can be OCed too.
You're wrong on all counts. Why do you post if you have no idea what you're talking about? I posted about 5850 CF because it's a better deal than either GTX 470 SLI or a 5970, barring extreme/specific circumstances. If the OP already bought a GTX 480, that's fine, but it gives an alternative to consider for others reading this thread for information. Anyway: A) Almost all 5850s overclock to 1 GHz, and the few that don't come close. It doesn't matter whether it hits exactly 1 GHz anyway; it isn't as if extra shaders magically unlock at 1 GHz. If you can only get your 5850 to 980 MHz, that last 2% really doesn't matter, as it's still going to be far faster than a GTX 470. B) Nope, not at all. Do your research:
http://www.xbitlabs.com/articles/video/display/gpu-power-consumption-2010_4.html#sect0 . You'll notice that even at 975 MHz, the power consumption of that 5850 is still well below the 225 W TDP of the GTX 470 (and that TDP is already an underestimate). The 1 GHz number isn't realistic, as they boosted the voltage to 1.35 V (you don't need an extra 0.15 V to gain 25 MHz). C) GTX 470s can't be overclocked anywhere near this far (5850s can get close to a 40% overclock). From reviews and what users have posted, they generally top out around 700 MHz (about 15%). As I have shown, a GTX 470 at stock trades blows with a 5850 at stock (although I'd say the 470 is probably 5-10% faster overall); only if the GTX 470 could overclock anywhere near these levels would it be competitive.
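For anyone who wants to check the headroom percentages, here is the arithmetic. Note the stock clocks used below (725 MHz for the HD 5850, 607 MHz for the GTX 470) are my assumption of the reference clocks, not figures from the posts above:

```python
# Back-of-envelope overclock headroom. Stock clocks below are assumed
# reference values (725 MHz HD 5850, 607 MHz GTX 470), not from the thread.
def oc_headroom(stock_mhz: float, oc_mhz: float) -> float:
    """Overclock expressed as a percentage over the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(f"5850: 725 -> 1000 MHz = +{oc_headroom(725, 1000):.0f}%")  # ~38%
print(f"470:  607 ->  700 MHz = +{oc_headroom(607, 700):.0f}%")   # ~15%
```

Under those assumed stock clocks, the numbers line up with the "close to 40%" and "about 15%" figures quoted above.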
The quote you took from Keys did show advantages for the 5970: it runs cooler and uses less electricity. What exactly are you trying to say? Are you trying to tell people that Fermi cards will turn your office into a sauna? If so, then you are dumb, as the extra heat produced can hardly raise the room temperature by one degree. It doesn't take a doctorate to know that. As I told Mk6 before, and now tell you: a light bulb's worth of heat won't warm up your office, so you can forget about a free sauna. It is, however, possible to turn your office into a gaming, non-productive place.
What I am saying is: don't try to make a mountain out of a molehill.
And as I said before, you need a basic education in physics. Congrats, you know that a watt is a unit of power (the rate of energy conversion); now if only you could apply it correctly. Do you ever walk up to a 100 W lightbulb that's been on for 20 minutes and cup your hand around it? No? Why? Because it's really hot, that's why. How much hotter do you think it is inside the glass? The light bulb creates an extremely hot pocket of air and radiates heat (heat also spreads by conduction and convection, but that's beside the point) in a gradient around itself. So yes, you don't feel the heat at a decent distance, but that doesn't mean the heat isn't there. Considering that GTX 470 SLI will dump up to 240 W of extra heat (
http://www.hardocp.com/article/2010/03/26/nvidia_fermi_gtx_470_480_sli_review/7) into a room compared to a 5850 CF setup, yes, it's a substantial difference. If you can't understand that, then you need to learn more physics, as it's a relatively simple concept.
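To put a rough number on that 240 W figure, here's a back-of-envelope sketch. Every value other than the 240 W is my assumption (a 4 m × 4 m × 2.5 m room, standard air density and specific heat, and a perfectly sealed room with zero heat loss), so this is an upper bound on how fast the air heats, not a claim about any real office:

```python
# How fast 240 W of extra heat warms the air in a sealed room.
# All values except the 240 W are illustrative assumptions:
# 4 m x 4 m x 2.5 m room, no heat loss through walls or ventilation.
P = 240.0                 # extra heat output, watts (J/s)
volume = 4 * 4 * 2.5      # room volume, m^3 (assumed)
mass_air = 1.2 * volume   # kg of air, at ~1.2 kg/m^3
c_p = 1005.0              # specific heat of air, J/(kg*K)

deg_per_hour = P * 3600 / (mass_air * c_p)
print(f"~{deg_per_hour:.0f} C per hour with zero heat loss")
```

In reality a room constantly loses heat through walls and ventilation, so the temperature plateaus far below what the zero-loss rate suggests, which is why a single light bulb is not noticeable; but 240 W of continuous extra heat is the same order as a small space heater on low, so it is not nothing either.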