I am not confused at all. C-states and other power-saving technologies allow the CPU to run below its nominal frequency in idle states. Since none of these technologies affect operating frequencies at load, a 2500k @ 3.4ghz will always consume less power at load in games than a 2500k @ 5.0ghz would, regardless of C-states, EIST, etc. C-states and power-saving technologies do not help a processor when it's being loaded by a game, if the game actually needs the CPU's resources. You can try to minimize power consumption at load by setting the overclock through Turbo multipliers on a per-core basis, but once the game loads a core, it's going to 5.0ghz when in use, which means your CPU will always consume more power at load than a stock 2500k.
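As a rough back-of-envelope sketch of why that is (this assumes the usual dynamic-power approximation P ≈ C·V²·f, with made-up voltage figures rather than measured values for any particular 2500k):

```python
# Back-of-envelope sketch of why load power rises with an overclock, using the
# common dynamic-power approximation P ~ C * V^2 * f. The voltages below are
# illustrative guesses, not measurements for any particular 2500k sample.
def relative_load_power(freq_ghz, vcore, base_freq_ghz=3.4, base_vcore=1.20):
    """Dynamic load power relative to the stock frequency/voltage point."""
    return (vcore / base_vcore) ** 2 * (freq_ghz / base_freq_ghz)

print(round(relative_load_power(5.0, 1.45), 2))  # ~2.15x stock dynamic power at load
```

C-states only change how often a core sits idle; they don't change that ratio in the moments the core is actually loaded.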
And I get 5.0GHz of performance when the core is fully loaded for an instant. What I don't get is 140-150W more power, which you erroneously insisted I would get, twice:
Obviously, especially since HD7970 bottlenecks a stock 2500k in modern games at 2560x1600 4AA. It's like throwing 140-150W of extra power into nothing.
And what exactly is 2500k @ 5.0ghz getting you for 140-150W of extra power consumption in games?
But no, you aren't confusing anything. Games simply don't apply the load to a CPU that a program like Prime95 does, and you won't get the same power consumption either. After flubbing up such a basic concept, I'm surprised you're still actually posting in this thread.
But more so, you already stated that 30-36 fps is perfectly playable for you in an FPS game. Since a 2500k/2600k can easily achieve that without overclocking, why would you overclock your CPU to 5.0ghz? Surely for someone who absolutely cares about 50-55W of power, this would be prohibitive? You said yourself you don't care to game at 60 or even 120 fps, where CPU performance actually makes a huge difference.
Perhaps you believe that your CPU uses 5-10W of power more at load when it's overclocked and running a game at 5.0ghz?
I think it's cute that you're trying to be witty after showing a blatant lack of understanding of the argument at hand. That only works when everyone reading the thread still thinks you have an idea of what you're talking about. Right now you're desperately trying to marry any ad hominem to my argument since you haven't been able to properly refute it despite trying half the night. If I cared about absolute power consumption numbers, I wouldn't be overclocking, would I? I've mentioned several times already that it's the concept of what performance is gained per watt, but you're conveniently ignoring this because you've injected too much personal loathing into what could have been an enlightening discussion for you. But please, continue to show off how little you know about the subject.
I didn't have any intent; it's the only logical conclusion left after 3-4 pages of seeing your side. You thought GTX480's 15-20% performance advantage over 5870 was "meaningless". You thought GTX580's 20% performance advantage over HD6970 was unimportant even though GTX580 held that lead for 14 months. You argued that HD6970 can be overclocked to GTX580 speeds, but when a fully overclocked GTX580 was put head-to-head against an overclocked HD6970 and crushed it, you then claimed that GTX580's performance advantage in its overclocked state didn't add to playability, when in fact it easily cleared 30 fps in games where HD6970 could not.
1) The GTX480 was only ~10% faster than the 5870 - http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html . Use facts when you talk to me or I will just end up ignoring you because you're wasting my time.
2) You link a review which I've already pointed out did not do an equal overclock when comparing the cards. So a highly overclocked GTX 580 pulls ahead of a slightly overclocked HD6970; bravo, next you'll tell us overclocking causes power consumption to rise.
3) Nowhere did I claim that the GTX 580's performance advantage didn't add to playability. Go quote it.
If all you want is 30 fps in games on the PC, then GTX580 was perfectly fast for that for 14 months. Suddenly, HD7970 is a must buy at $550 with only 20% more performance over GTX580. Really?
I've noticed you've conveniently stopped quoting my posts, even though you scurried off to go re-write your own. So what, after your ad hominem attacks fail, you're just going to blatantly put words in my mouth? The GTX 580 wasn't impressive because my overclocked 6950 was just as fast for half the cost. Overclocking a GTX 580 would have netted another 20% performance for a lot more heat and money. Early reviews show an overclocked 7970 performing as well as and even surpassing a GTX 590 on air, never mind what it'll do when I put it under water. 60%+ more performance is impressive; 20% is not.
I understand the argument clearly. In all situations where you cannot logically defend your view, you attack the poster (like Notty22's statistical knowledge, which has no relevance, or my monitor, again no relevance), or you dismiss any enthusiast's desire to game at 50-60 fps in an FPS or a racing game as not the norm, etc. You dismiss most websites anyone links unless they show what you want to see (like H and TPU only), claiming they don't count since they run "canned benchmarks". Ironic, because you completely ignore facts such as GameGPU.ru and Bit-Tech.net running manual runs, and, well, in your own mini HD6950-to-HD7970 review you yourself plan to use canned/in-game benchmarks. Really?
Like I said, keep up the ad hominem and maybe it'll distract from your lack of argument. As for continually showing your own ignorance of the subject, I didn't really have much part in that; you did it yourself. Bravo.
You make a big deal about 50-55W of power consumption, while ignoring that HD7970 is already a very high power consuming card to begin with.
Who made a big deal of it? Where's the quotation? I belittled the difference in my first post on the matter:
First off, if you look at the charts from the last dozen posts, the difference between a GTX 580 and HD6970 isn't even 50-60W; it's more like 29-33W from those charts.
You're still clearly missing the point, though. I guess I'll just have to repeat it again and again: it's performance per watt, and what performance you get at said power consumption.
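To spell the metric out with purely illustrative numbers (these are made up for the example, not measurements for any card in this thread):

```python
# Illustrative performance-per-watt comparison with made-up fps and wattage
# figures, only to show the metric being argued, not real benchmark results.
def perf_per_watt(fps, watts):
    return fps / watts

card_a = perf_per_watt(fps=50, watts=200)  # hypothetical lower-power card
card_b = perf_per_watt(fps=62, watts=230)  # hypothetical higher-power card

print(round(card_a, 3), round(card_b, 3))  # 0.25 vs 0.27: B draws more watts yet wins on perf/watt
```

The point is that a bigger absolute number on a power chart says nothing by itself; what matters is what performance you get for those watts.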
The best part is you discuss how important power consumption is while coming from a previous-generation i5 760 @ 4.1ghz and an HD5850 @ 1000mhz+.
I think this graph speaks for itself:
Source
And that shows how much you don't understand the discussion or my argument. Since the 5850 consumed so little power at stock (127W), it was easy enough to clock it with volts and still maintain a decent TDP ceiling to keep my computer cool and quiet. I could also point out how that test is incorrect and the 5850 doesn't consume anywhere near that amount of power at 1000MHz, but I already did that last year.
Everyone knows that I am a price/performance guy and I don't hide it. People might not agree with me and I am 100% OK with that. At least I consistently ripped GTX580 for being overpriced vs. GTX570, and HD7970 for costing $550 and bringing so little over the 580 after 14 months. I also said power consumption wasn't a big deal for high-end cards like GTX480/6970; even if people didn't agree, I was consistent.
In your case, you throw your arms in the air over 50-55W of power in some cases, while in others (like i5 @ 4.1ghz and 5850 @ 1000+ mhz) it's not important. In other cases 30 fps is fast enough for enthusiast PC gamers, yet you seem to think that HD7970 provides a whole new world of playability when the card can't even break 30 fps in Metro 2033, Dragon Age 2, or Crysis 2 with AA at 2560x1600.
Notice a pattern of inconsistency?
The only inconsistencies are that you change your argument every 10 posts or so, and go back and edit your older ones in some lame attempt to save face. I think I've rebutted every argument you've put forth several times over, as well as shown the forum your blatant lack of understanding of even the most basic physics/engineering principles. And seeing that video made me realize I've been doing this for longer than I thought. This discussion is over; stop wasting my time.