G80 CPU Scaling

lopri

Elite Member
Jul 27, 2002
13,220
606
126
While I was reading posts regarding G80 over at XS, there was a recurring comment that made me uncomfortable. These folks were comparing 3DMark scores with Conroe and Kentsfield, respectively, and concluding that G80 is very CPU-dependent. Whether they're misinformed or their arguments simply revolve around 3DMark is beyond me. And yesterday's G80 reviews state the obvious once again: today's games are absolutely GPU-dependent. Although we do not have AT's data yet, we have results from other independent sites' G80 reviews from yesterday.

Driver Heaven E6700 2.66GHz VS 3.60GHz
Guru3D Core 2 Duo 1.86GHz ~ 3.47GHz

Guru3D's tests look to have been run without any AA/AF, so if you take that into account, you can easily draw your own conclusion about what happens once those post-processing effects are enabled. What I conclude is:

1) Modern CPUs (A64 2.40GHz and up) are more than enough to feed G80 to full force.
2) Fast CPUs can account for a better minimum FPS at low resolutions (read: 1024x768/1280x1024), where the FPS is already high enough.
3) In eye-candy mode at resolutions of 1280x1024 and above, better performance comes from a better GPU, not a better CPU.
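
As a rough way to sanity-check conclusions like these against any review's numbers, one can look at how much of a CPU clock increase actually shows up as extra FPS at a given setting. The sketch below is hypothetical Python; the game names, resolutions, and FPS figures are made up for illustration, not taken from the linked reviews:

```python
# Rough sketch: flag whether a game looks CPU-bound or GPU-bound
# by comparing average FPS at two CPU clocks. All numbers below are
# hypothetical, not taken from the Driver Heaven or Guru3D reviews.

def scaling_ratio(fps_slow, fps_fast, clk_slow_ghz, clk_fast_ghz):
    """Fraction of the CPU clock increase that shows up as extra FPS."""
    fps_gain = (fps_fast - fps_slow) / fps_slow
    clk_gain = (clk_fast_ghz - clk_slow_ghz) / clk_slow_ghz
    return fps_gain / clk_gain

# Hypothetical average FPS at 2.66 GHz vs 3.60 GHz
results = {
    "Game A @ 1024x768":        (155.0, 190.0),
    "Game B @ 1600x1200 4xAA":  (88.0, 90.0),
    "Game C @ 2560x1600 4xAA":  (61.0, 61.5),
}

for setting, (slow, fast) in results.items():
    r = scaling_ratio(slow, fast, 2.66, 3.60)
    verdict = "CPU-bound-ish" if r > 0.5 else "mostly GPU-bound"
    print(f"{setting}: scaling ratio {r:.2f} -> {verdict}")
```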


What worries me is AT's recent push of quad-core CPUs. Multi-core may be the way we're headed, but I still think reviewers should be on the consumer's side and keep pace with them, instead of pretending to be a leader. (The performance/watt thing in the G80 review is another example.) When there is no meaningful application that takes advantage of quad cores now or in the near future, saying "Quad-cores are the best thing since sliced bread because we've heard a few games will be faster with them in, well, maybe a year" is not the most responsible argument, IMO. We already have confused users asking "Will my X2 be enough?" and "Should I wait for Kentsfield instead of buying Conroe now?" in the forum.
 

Fraggable

Platinum Member
Jul 20, 2005
2,799
0
0
I don't know what to think about CPUs 'bottlenecking' GPUs anymore. Architecture for both has changed so dramatically in the last few months that I think we will have to wait a couple more months before it's really clear what DX10 does to the whole equation.

I upgraded from an A64 Venice 3000+ to an FX-55 (1.8 to 2.6GHz) while keeping the same X1800XT and noticed that my minimum framerates improved, but my maximum rates didn't change a whole lot. Minimum framerate is what really matters in games, so I was happy. I always play my games at max detail at 1440x900 with 6xAA and whatever maximum AF the game will allow.
 

SparkyJJO

Lifer
May 16, 2002
13,357
7
81
I think people have become so hung up on the "ub3r l33t" C2D that they can't imagine using anything else with the G80, even if a different CPU would be fine. Games are still more GPU-dependent; yes, you want a fast enough CPU, but I'm willing to bet the difference in game performance between my dual-core Opteron with a G80 and a C2D with a G80 would be minimal.
 

Sunrise089

Senior member
Aug 30, 2005
882
0
71
I strongly agree with the OP's view that tech sites need to be careful not to drift away from serving as the consumer's ally.

Tech sites will continue to build their review systems around X6800s and quad-core chips for their benchmarks, and truly informed users will do what they always have: buy the fancy new GPU and pair it with a high-value processor, either a new mid-range overclocked Conroe, or an old A64, Opty, or X2 for those whose builds aren't that old. All the time you see systems like this in the forum:

-AMD X2 3800+ @ 2.4ghz
-ATI Radeon X1900XT

If CPUs were really the limiting factor, you'd be seeing this:

-Intel Core2Duo E6700
-Nvidia 7600GS

I think the actual numbers of the above systems in the enthusiast community speak for themselves.
 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
I agree, with the only caveat being your second point:

2) Fast CPUs can account for a better minimum FPS at low resolutions (read: 1024x768/1280x1024), where the FPS is already high enough.

There are some rare situations where, even at the lower resolution, the FPS still isn't high enough and the game is CPU-bound, e.g. Oblivion when it first came out, on pretty much everything but the fastest setups at the time. In those cases, cranking up the resolution another notch just to become GPU-bound again isn't always viable.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
I have said a zillion times that gaming is still ridiculously GPU limited, unless you run things at a silly 1024x768 resolution or lower.

And like you point out, it's still true.
 

lopri

Elite Member
Jul 27, 2002
13,220
606
126
Originally posted by: Fraggable
I don't know what to think about CPUs 'bottlenecking' GPUs anymore. Architecture for both has changed so dramatically in the last few months that I think we will have to wait a couple more months before it's really clear what DX10 does to the whole equation.
This is true. It's just that DX10 currently doesn't exist, so it's a non-issue.

Originally posted by: SparkyJJO
I think people have become so bugged with the "ub3r l33t" C2D that they can't imagine using any other than that for the G80 even if a different CPU would be fine. Games still are more GPU dependent, yes you want a fast enough CPU but I'm willing to bet the game performance between my dual core opteron with a G80 and the C2D with a G80 would be minimal.
Very well put, although I understand it as people's desire to upgrade their hardware. (Hey, we're all geeks, aren't we? )
 

SpeedZealot369

Platinum Member
Feb 5, 2006
2,778
1
81
Exactly. When I buy my system next year, guess what it will have in it? A G80, probably an Opteron 170, and some DDR400 sticks.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: Jodiuh
I felt sorry for my 8800, so I'm feeding it a 6600 tomorrow.

But at 1680x1050, you won't notice any difference between your E6400 and a new E6600.

If anything, feel sorry that you're forcing your 8800gtx on a Dell 2007WFP. That's like chaining your dog onto the 10x10 porch so he can stare out at a 30-acre fenced yard.
 

Noubourne

Senior member
Dec 15, 2003
751
0
76
It's clear that a single G80 is still a limiting factor for today's games at most high resolutions, and in that case your CPU doesn't matter as much, so long as you're in the C2D range of power.

I wonder if that's true for SLI though.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: Noubourne
It's clear that a single G80 is still a limiting factor for today's games at most high resolutions, and in that case your CPU doesn't matter as much, so long as you're in the C2D range of power.

I wonder if that's true for SLI though.

Actually, you don't even need to be in the C2D range. A simple single-core A64 would suffice... as long as you're playing at the upper end of resolutions, and if someone is buying a G80 for anything less, then "CPU bottlenecks" shouldn't be their main concern.

As for your SLI question, Crossfire *and* SLI both benefit greatly from a faster CPU, which can create a CPU bottleneck at resolutions of 1600x1200 and below, but again, it varies across games. For example, see this page. There we have one game (SS2) showing a severe CPU limitation with SLI G80s all the way up to 19x12, and below that we have SC3, which scales well across the whole spectrum of resolutions and shows no signs of bottlenecking, even at 1024x768.
 

Jodiuh

Senior member
Oct 25, 2005
287
1
81
Originally posted by: deadseasquirrel
Originally posted by: Jodiuh
I felt sorry for my 8800, so I'm feeding it a 6600 tomorrow.

But at 1680x1050, you won't notice any difference between your E6400 and a new E6600.

If anything, feel sorry that you're forcing your 8800gtx on a Dell 2007WFP. That's like chaining your dog onto the 10x10 porch so he can stare out at a 30-acre fenced yard.

Lol, I was kidding. I'm hoping it hits higher Giggahurtz than my 6400. I was actually hesitant to move to a 20" as even that was rough on Oblivion w/ the X19. 1680x1050's plenty for me to get work done, and the res's nice enough without killing FPS.

*the 2407's looking kinda nice tho...*

 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
As already mentioned, a faster CPU will generally provide higher minimum framerates, so the game will tend to run 'smoother' with fewer framerate dips. Think of it as a capacitor for framerates, if you will.

For *most* current 3D games, I'd say a mid-level A64 would be sufficient for all but the most demanding titles (i.e. Oblivion) to be very playable on a G80. The exceptions would be some RTS games and flight simulators, which are generally very CPU-bound. From my experience, some online shooters such as CS:S and BF2 are also both CPU and GPU intensive.

I guess in a nutshell, a balance between CPU and GPU speed is needed for the 'optimal' gaming experience, with more emphasis on a faster GPU. A mid-range CPU with a top-of-the-line GPU will provide an awesome gaming experience, but a top-end CPU with a mid-range GPU will only deliver slightly above-average results.
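
To make the minimum-framerate point above concrete, here is a minimal sketch with invented frame times (not real measurements) showing how two runs can have nearly the same average FPS while one has much worse dips, which is exactly what a faster CPU tends to smooth out:

```python
# Minimal sketch: average vs. minimum FPS from per-frame times (ms).
# The frame times are invented purely to illustrate that two runs can
# average about the same while one stutters far more.

def fps_stats(frame_times_ms):
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = max(frame_times_ms)               # slowest single frame
    return 1000.0 / avg_ms, 1000.0 / worst_ms    # (avg FPS, min FPS)

smooth_run = [16.7] * 95 + [20.0] * 5   # small, brief dips
spiky_run  = [15.0] * 95 + [50.0] * 5   # similar average, big dips

for name, run in (("smooth", smooth_run), ("spiky", spiky_run)):
    avg_fps, min_fps = fps_stats(run)
    print(f"{name}: avg {avg_fps:.0f} FPS, min {min_fps:.0f} FPS")
```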
 

markymoo

Senior member
Aug 24, 2006
369
0
0

OK, I found an article, I believe on HardOCP, that blew away the myth that using a Conroe gives you higher FPS. I'll try and find it.

What they were saying is that once you get over 2GHz, it all comes down to the graphics card. They did a test with graphics-heavy games like FEAR and Oblivion, and Conroe was in front by just 1 frame.
 

murban135

Platinum Member
Apr 7, 2003
2,747
0
0
Originally posted by: markymoo

OK, I found an article, I believe on HardOCP, that blew away the myth that using a Conroe gives you higher FPS. I'll try and find it.

What they were saying is that once you get over 2GHz, it all comes down to the graphics card. They did a test with graphics-heavy games like FEAR and Oblivion, and Conroe was in front by just 1 frame.

I think you might be referring to this article:

HardOCP.com Link

Quote:
"We have proven here that the flurry of canned benchmarks based on timedemos showing huge gains with Core 2 processors are virtually worthless in rating the true gaming performance of these processors today. The fact of the matter is that real-world gaming performance today greatly lies at the feet of your video card. Almost none of today?s games are performance limited by your CPU. Maybe that will change, but given the trends, it is not likely. You simply do not need a $1000 CPU to get great gaming performance as we proved months ago in our CPU Scaling article."
 

ScythedBlade

Member
Sep 3, 2006
56
0
0
Unfortunately, that article got slammed so many times, especially by real people at xtremesystems.org. Of course, it does tell some truth: you don't really need a great CPU for gaming, and the GPU matters more. The claim they make about whether a GPU is limited by the CPU comes down to the frame gains per CPU speed step, and that's a matter of opinion: once the marginal FPS gain from a faster clock is small enough, it's no longer considered CPU limited.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
In my opinion it's not that the 8800 is CPU limited, but rather that not a lot of current-generation games are sophisticated and complex enough to take advantage of all of G80's capabilities (i.e. their graphics just aren't next-gen yet).

That's why we see 120fps with 4xAA/16xAF at 1600x1200 in the reviews. Sure, with a faster CPU you might get 140, but so what? I want to see a game that looks twice as good and runs at 60fps instead. Who cares about the CPU-limitation argument; the argument should be aimed at the slow progress of software. Where are the games like Doom 3 and Far Cry that put a tremendous load on the graphics cards of their time? If a game like that (ahead of its time) came out today, I doubt many would say the 8800 is CPU limited. It's all relative. You can't test a next-generation card with Doom 3, Quake 4, and Prey, which all run on the same engine and are barely more demanding than the Doom 3 era, and then say the 8800 is CPU limited. Of course it will be when you run it on a two-year-old engine.

Once we see Unreal Engine 3 titles and other advanced games come out, an X2 4600+ with an 8800GTX will without a doubt absolutely blow away a 4.0GHz Core 2 Duo with an X1950XTX.

If anything this goes to show that for a non-hardcore gamer running a 17 or 19 inch LCD monitor, an 8800 right now is probably not worth the price premium if they intend to play at something like 1280x1024. That's why there are cheaper cards out there. It could also be argued that it doesn't always make sense to buy the latest and greatest, since it won't be utilized to its full potential for another 6-12 months. This could partly explain the popularity of cards like the 6600GT and 7800GT, which weren't top of the line but ran everything fast enough for the majority of users. But then someone with a 30 inch LCD will insist that the 8800 is not CPU limited once higher resolutions are used for gaming.


 

Jodiuh

Senior member
Oct 25, 2005
287
1
81
Originally posted by: deadseasquirrel
Originally posted by: Jodiuh
I felt sorry for my 8800, so I'm feeding it a 6600 tomorrow.

But at 1680x1050, you won't notice any difference between your E6400 and a new E6600.

Where were you when I placed my order? The man's 100% right on. I can't even tell a difference in encoding, let alone games. Yes, yes, my SuperPI 1M score went down 2ms...who cares?!?!

350 spent upgrading board, cpu, ram = 0% improvement in games, maybe 5-10% in real life stuffs

350 spent on upgrading to 8800 GTX = 200% plus improvement in games, nothing in real life stuffs

I'm pretty sure this makes sense to all of us...and shame on me for not listening to reason.

 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: Jodiuh
Originally posted by: deadseasquirrel
Originally posted by: Jodiuh
I felt sorry for my 8800, so I'm feeding it a 6600 tomorrow.

But at 1680x1050, you won't notice any difference between your E6400 and a new E6600.

Where were you when I placed my order? The man's 100% right on. I can't even tell a difference in encoding, let alone games. Yes, yes, my SuperPI 1M score went down 2ms...who cares?!?!

350 spent upgrading board, cpu, ram = 0% improvement in games, maybe 5-10% in real life stuffs

350 spent on upgrading to 8800 GTX = 200% plus improvement in games, nothing in real life stuffs

I'm pretty sure this makes sense to all of us...and shame on me for not listening to reason.
Wow, I'm amazed. This is absolutely the first time I can remember someone coming back later just to say they weren't right and someone else was. What's the world coming to? Nice to have you in the forum, BTW.
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
But but but...i want an E6600 for the longer e-penis

Not to mention that they seem to hit 3600 MHz a lot more often than E6400s, & that extra 200 MHz will make a huge real world difference to me
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: n7
But but but...i want an E6600 for the longer e-penis

Not to mention that they seem to hit 3600 MHz a lot more often than E6400s, & that extra 200 MHz will make a huge real world difference to me
Says the guy with ~1½ TB of hard drive space in one PC. What do you expect, we're geeks.:laugh:
 

VERTIGGO

Senior member
Apr 29, 2005
826
0
76
I'm a widescreen dual-card user, and since my Opty's well overclocked I see no reason why I'll need a new CPU for at least another year. These next games (Crysis, UT2007, Alan Wake, etc.) may require me to go with new graphics next summer, but I definitely agree that the A64 (especially dual core) will hold up for a while in gaming performance.
 