The limiter is there for a reason, so why turn it off? I am talking about real gaming. Again, you said "the 570 wasn't much of an improvement over the 470/480, slightly better performance per watt, but nothing to write home about."
That's nonsense, because the GTX 570 had nearly 25% more performance per watt in games than the GTX 470, and again it also runs much cooler and quieter. Now you want to try and compare the stock GTX 570 to your modified card? lol
This really isn't the place for this discussion, but it's clear you don't understand watts, nor do you understand kWh. It's also pretty clear that you look down on 470s as inferior to your amazing GTX 570. Let me first make clear what my statements actually were: the performance per watt going from GF100 to GF110 wasn't impressive to me, and it wasn't something I cared about, because we're talking in watts and the difference isn't huge. I also said that just because the reference cooler on the GTX 470 and 480 was bad didn't mean the chip, GF100, was bad.
Now let's be clear about what we're discussing: your opinion against mine. You clearly feel my opinion on the matter of wattage is wrong, and for whatever reason you've decided to be very callous towards me on the subject.
First, let's start with watts and performance per watt. The Anand review showed a five-watt difference with a 20% performance advantage in Crysis: Warhead. This was total system draw: the 470 system was drawing 366 watts and the GTX 570 system was drawing 361 watts. Since we don't have exact figures, and this is just one game, we'll run with it, keeping in mind that we don't actually have all the facts.
Currently, where I live, a kWh costs six cents. A system drawing 361 watts burns through a kWh, six cents' worth, in a bit under three hours of continuous full-load gameplay. Let's assume 12 hours a week of full-load gaming: that's about 4.3 kWh, or roughly 26 cents per week. Keep that up every week for an entire year and I will have spent about $13.50 in electricity running Crysis: Warhead at full load. We both will have paid roughly the same $13.50 to game all year (the stock 470 system draws only five watts more), but you will have obtained 20% more performance than me for the same money. Except it's not the same price, but we'll get to that later.
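Here's that arithmetic as a quick Python sketch, using the total-system-draw numbers from the Anand review and my local rate; swap in your own figures if they differ:

```python
# Back-of-the-envelope yearly electricity cost for full-load gaming.
RATE_PER_KWH = 0.06      # dollars per kWh where I live
HOURS_PER_WEEK = 12      # assumed full-load gaming time
WEEKS_PER_YEAR = 52

def yearly_cost(system_watts):
    """Annual electricity cost in dollars for a given total system draw."""
    kwh_per_year = system_watts / 1000 * HOURS_PER_WEEK * WEEKS_PER_YEAR
    return kwh_per_year * RATE_PER_KWH

print(f"GTX 570 system (361 W): ${yearly_cost(361):.2f} per year")
print(f"GTX 470 system (366 W): ${yearly_cost(366):.2f} per year")
```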
Now let's figure we overclock the GTX 470 to GTX 570 performance. To do that, we're going to need about 770 MHz on the GTX 470's core, with a slight bump to the memory clock. That's about a 27% overclock for the GTX 470. Power clearly isn't going to scale linearly with the core clock, but for the sake of a simple discussion we'll just add 20% to the TOTAL SYSTEM power consumption of 366 watts, which gives about 439 watts.
Now we have the same performance, except my system is drawing about 439 watts while yours is still drawing 361. Using the same example as before: the overclocked 470 system, with its equal performance, burns through a kWh (six cents) in about two and a quarter hours, while the 570 system still takes a bit under three. Over one year of twelve hours a week at full load, that's about $13.50 for the GTX 570 system and about $16.50 for the GTX 470 system. So the total cost of raising the 470 to 570 performance, for one year of gaming twelve hours a week, is roughly three dollars.
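Same sketch, extended to the overclocked case; the +20% system draw is my own rough assumption, not a measured number:

```python
# Same rate and hours as above: $0.06/kWh, 12 h/week, 52 weeks/year.
hours_per_year = 12 * 52
cost_570    = 0.361 * hours_per_year * 0.06          # 361 W GTX 570 system
cost_oc_470 = 0.366 * 1.2 * hours_per_year * 0.06    # assumed +20% draw, ~439 W
print(f"570: ${cost_570:.2f}/yr   OC'd 470: ${cost_oc_470:.2f}/yr   "
      f"difference: ${cost_oc_470 - cost_570:.2f}/yr")
```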
Now let's look at price. The GTX 570 was $350 when the GTX 470 was $250, a $100 difference. If you read the Anand review, they even mention this: "but at $100 over the GTX 470 and Radeon HD 5870 you’re paying a lot for that additional 20-25% in performance." And let's not forget that unless they tested a new game, they didn't go back and retest the GTX 470 with updated drivers; they simply reused the benchmarks from their original review, published months before the 570 came out.
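To put that $100 premium next to the electricity savings, here's one more rough sketch; the yearly-savings figure carries over all the assumptions above (12 hours a week at full load, six cents per kWh, +20% draw for the overclocked 470):

```python
# How long the electricity savings would take to pay back the price premium.
price_premium = 350 - 250      # GTX 570 over GTX 470 at launch, in dollars
yearly_savings = 2.93          # 570 vs. overclocked 470, from the sketch above
print(f"Payback period: {price_premium / yearly_savings:.0f} years")   # ~34 years
```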
This summarizes why I don't care about wattage on a desktop computer. Leave wattage out of the discussion; it only matters to people who pay outrageous kWh prices and people who are clueless. Wattage is only a factor if you are running off a battery, and I'm pretty sure your desktop isn't.
Now let's go back to your dogging the GTX 470 as if it weren't a decent card with some crazy overclocking potential, because you seem to be in need of some reeducation. First, let's look at cost. I'm a bargain shopper: I got my first card for $180 on Black Friday of 2010, and thanks to the 570 I got another for $120 used off Overclock.net. I also got two water blocks from Danger Den for $80 each, $160 total. Let's add those up quick: 180 + 120 + 80 + 80 = $460. Whoa, that's more than a single GTX 570 at the time: $350 (most were $370 with shipping) vs. $460, roughly 30% more! I wonder if I get 30% more performance?
Hopefully we'll find out!
First, let's look at overclocking the GTX 470s in SLI on water:
Stock: 607 MHz core / 1215 MHz shader / 3348 MHz memory
Overclocked: 950 MHz core / 1900 MHz shader / 4300 MHz memory
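For reference, the headline overclock percentages below fall straight out of those clocks:

```python
stock_core, oc_core = 607, 950       # MHz
stock_mem,  oc_mem  = 3348, 4300     # MHz (effective)
print(f"Core overclock:   {(oc_core / stock_core - 1) * 100:.0f}%")   # ~57%
print(f"Memory overclock: {(oc_mem / stock_mem - 1) * 100:.0f}%")     # ~28%
```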
For the tests I used an i5-2500K @ 5,278 MHz with 4GB of DDR3 @ 2206 MHz 8-10-8-24 1T, which sadly was actually bottlenecking my cards at 1080p in Dirt 3, F1 2010, and F1 2011.
So what do a 57% core overclock and a roughly 28% memory overclock get us with $460 invested?
AvP: settings as shown
Metro 2033: settings as shown
Dirt 3: Ultra, 1080p, 8x MSAA
F1 2010: Ultra, 1080p, 8x MSAA
F1 2011: Ultra, 1080p, 8x MSAA
So let's recap. I said the performance-per-watt improvement wasn't impressive because I don't run my desktop off a battery, and the cost difference is negligible at best. I never once said GF100 was better than GF110, nor did I ever say it was more efficient. Let me make this clear to you: the GTX 570 is a better card than the GTX 470; it has better performance per watt and it runs cooler and quieter on reference cooling. What the GTX 470 isn't is bad. It isn't a crap card, and if you still feel that way, please be sure to respond with links to benchmarks of your own.