Wow, this thread is hilarious. A 6870 losing to a 580? No way... I mean, let's not even get into the fact that for the price of one 580 you can almost buy three 6870s (almost, but not quite, so let's just say two 6870s):
http://www.newegg.com/Product/Produc...n=6870&x=0&y=0
6870s cost $189-229, depending on the mail-in rebate (MIR).
http://www.newegg.com/Product/Produc...-587-_-Product
The GTX 580 costs $499. So how about we take one 580 and OC it to the max, then take two 6870s in CrossFire and OC them to the max? That would actually be a mildly interesting test.
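Just to spell out the price math, here's a quick Python sketch using only the Newegg prices quoted above:

```python
# The price math in plain numbers (the Newegg prices quoted above):
gtx580 = 499
hd6870 = 189                     # low end of the $189-229 range

print(gtx580 / hd6870)           # ~2.64, i.e. "almost three" 6870s per 580
print(2 * hd6870, "<", gtx580)   # 378 < 499: a CrossFire pair still costs less
```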
And then you compare the 560, which has an extremely high OC, to a 6970 with an OC of all of 10 MHz. The 560 got roughly a 25% overclock; the 6970 got about 1%. That makes perfect sense, doesn't it? All the smug Nvidia fans need to quit it with these BS benchmarks saying "oh look, the 6970 is overpriced, it loses to a card that costs a lot less and is OC'd to the max." Easily the most overpriced card here is the GTX 580: it costs twice as much as the GTX 560. Why don't we see anybody going on about how the GTX 580 needs a price cut?
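For anyone who doubts those percentages, here's the rough math (a quick sketch; I'm assuming typical reference clocks of ~810 MHz for the 560 and 880 MHz for the 6970, and the OC'd clocks are illustrative, not pulled from the benchmarks in question):

```python
# Rough math behind the OC gap. Stock clocks are the usual reference values;
# the overclocked figures are illustrative, not the thread's exact settings.
gtx560_stock, gtx560_oc = 810, 1010   # MHz; most 560s clock near 1 GHz
hd6970_stock, hd6970_oc = 880, 890    # MHz; the 10 MHz bump in question

def oc_percent(stock, oc):
    """Overclock expressed as a percentage of the stock core clock."""
    return 100.0 * (oc - stock) / stock

print(f"GTX 560: +{oc_percent(gtx560_stock, gtx560_oc):.0f}%")   # ~25%
print(f"HD 6970: +{oc_percent(hd6970_stock, hd6970_oc):.1f}%")   # ~1.1%
```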
Then AMD fans need to realize the GTX 560 IS a great card, and it WILL OC to beat higher-priced cards, and with some degree of ease. This thread is a flaming POS with the number of fanboys in here. Why don't we run real-world benchmarks? 1920x1080, max settings (8x AA): stock vs. stock, OC vs. OC, SLI vs. CrossFire, price to performance. For example, two max-OC'd 560s in SLI vs. one max-OC'd GTX 580 is a near-perfect dollar-for-dollar comparison. Or two max-OC'd 6850s in CrossFire vs. one max-OC'd GTX 560, since those setups are only about $100 apart. Or even 2x 6950 vs. 2x GTX 560, both at stock and both clocked to draw the same amount of power, to see straight-up scaling. Those are the benchmarks that should be run, not these BS ones that either only play to Nvidia's advantages or tell us the already obvious.
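And since people keep hand-waving about "price to performance," here's literally all it is, fps per dollar (a minimal sketch; the fps value is a made-up placeholder to be replaced with measured benchmark results, and the 560/6850 prices are rough street prices, not quotes):

```python
# "Price to performance" is just fps per dollar. The fps values below are
# placeholders -- plug in measured benchmark results. Prices are rough
# street prices (the $499 for the 580 is from the Newegg link above).
PLACEHOLDER_FPS = 60.0  # stand-in: replace with your measured average fps

setups = {
    "2x GTX 560 SLI (max OC)": {"price": 2 * 230, "fps": PLACEHOLDER_FPS},
    "1x GTX 580 (max OC)":     {"price": 499,     "fps": PLACEHOLDER_FPS},
    "2x HD 6850 CF (max OC)":  {"price": 2 * 180, "fps": PLACEHOLDER_FPS},
}

for name, s in setups.items():
    print(f"{name}: {s['fps'] / s['price']:.3f} fps per dollar")
```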