Finding optimal CPU speed for GPU

jesterb84

Member
Mar 14, 2008
127
0
0
I'm currently running a 4870/512mb card but have been noticing that games are choking up at 1920x1200 with full detail, etc. However, I have no idea whether this lack of performance is due to the GPU or whether my CPU is the bottleneck (system specs still the same as in my signature). Is there a general guide, or a method which I can use to test whether my CPU is the bottleneck? I'm currently considering an upgrade sometime this year, but was trying to factor whether the video card needed replacing as I still see recommendations here and there for 5770s, which appear to perform on-par or slightly worse than a 4870 (so maybe the performance is still not bad by today's standards).

As an aside, I realize my motherboard is only PCI-X 1.1 but decided to test the CPU first.

Thanks for any suggestions!
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,429
1,780
136
In principle, increasing the resolution or the AA/AF modes does not in any way increase the load on the CPU, while it considerably increases the load on the GPU. So, to test where your bottleneck lies, try dropping the resolution considerably. If this leads to no frame rate gains, the problem is your CPU.
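
If it helps, here's a trivial way to read the outcome of that test. The FPS numbers and the 10% threshold below are just placeholders I made up, not anything official; plug in your own benchmark results:

```python
# Rough sketch of how to interpret the resolution-drop test.
# The FPS figures are placeholders; substitute your own numbers
# from Fraps or a built-in benchmark run.

fps_native = 34.0   # average FPS at 1920x1200
fps_low    = 36.0   # average FPS at a much lower resolution (e.g. 1280x800)

gain = (fps_low - fps_native) / fps_native

if gain < 0.10:      # little to no gain: the CPU is the limit
    print(f"Only {gain:.0%} faster at low res -> likely CPU-bound")
else:                # big gain: the GPU was the limit at native res
    print(f"{gain:.0%} faster at low res -> likely GPU-bound at native res")
```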
 

jesterb84

Member
Mar 14, 2008
127
0
0
Okay, so if I lower resolution and it doesn't get any faster - then I know it's the CPU being the bottleneck. However, if the reverse is true - then I know the GPU is the bottleneck. I know that performance increases by going down to 1680x1050. Additionally, the fan on the 4870 seems to spin up less aggressively. I guess this is the GPU being the bottleneck then.

Now, I'll need to determine whether the PCI-x 1.1 is the bottleneck, or the 512MB VRAM, or both.
 

evilspoons

Senior member
Oct 17, 2005
321
0
76
Sorry to nitpick rather than actually contribute to your question, but I believe you are referring to PCI Express 1.1, which is PCIe/PCI-E. PCI-X is actually something else entirely, mostly used in servers.



*end nitpicking*

Tuna-Fish is right though: increasing resolution or AA/AF shouldn't affect the CPU that much - it still has to set up the same 'environment' regardless of the resolution the GPU renders it at. Turn the sucker down to 800x600, and if it's still slow, it's probably your CPU.
 

kmmatney

Diamond Member
Jun 19, 2000
4,363
1
81
Definitely GPU at 1920x1200, although I think the next generation of games will probably start to bog down the CPU as well. My HD 4890 1GB also bogs down a bit in some games at 1920x1200.
 

DirkGently1

Senior member
Mar 31, 2011
904
0
0
Despite what some would have you believe, most games are still GPU bound at decent 'gamer' settings and as long as you have a reasonable CPU it makes most sense to look at the GPU for upgrades.

You'll hear a lot of people mentioning FPS gains going from C2Q to SB, so by that logic games need the CPU upgrade. What that doesn't tell you is that the frame rate went from, say, 90 to 100 FPS, which is meaningless as long as the minimum stays above 60. Even AT has been guilty of displaying graphs at lower resolutions and settings which show good performance scaling (no doubt to highlight differences between generations of CPU), but the differences disappear when the resolution and settings are turned up. (And when you are getting 60 FPS or more, there is always a way to turn more eye candy on!)

A C2Q will be adequate in most circumstances for a long while yet (a Core 2 Duo maybe less so), whereas GPU upgrades are still de rigueur.
 

jesterb84

Member
Mar 14, 2008
127
0
0
Great, thanks for the insightful comments. Now here's a few more questions I'd like to tag on to this post:

1. How important is it to go to PCIe 2.0 if I were to get a more recent video card (1 GPU, no crossfire/SLI)? Will the PCIe 1.1 severely hamper performance?

2. Which game features use up the most VRAM (e.g. AA, filtering, resolution, etc.)?
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,429
1,780
136
Great, thanks for the insightful comments. Now here's a few more questions I'd like to tag on to this post:

1. How important is it to go to PCIe 2.0 if I were to get a more recent video card (1 GPU, no crossfire/SLI)? Will the PCIe 1.1 severely hamper performance?

No. Frankly, 16x PCIe 1.0 still has much more bandwidth than GPUs ever need.
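
For a sense of scale, here's my own back-of-the-envelope arithmetic from the spec-sheet numbers (2.5 GT/s per lane for PCIe 1.x, 5 GT/s for 2.0, both with 8b/10b encoding):

```python
# Rough PCIe link bandwidth per direction.
# 8b/10b encoding means 10 transferred bits carry 8 bits of data.

def pcie_bandwidth_gb_s(lanes, gt_per_s, encoding=8 / 10):
    bytes_per_s = lanes * gt_per_s * 1e9 * encoding / 8
    return bytes_per_s / 1e9

print(f"PCIe 1.1 x16: ~{pcie_bandwidth_gb_s(16, 2.5):.0f} GB/s")  # ~4 GB/s
print(f"PCIe 2.0 x16: ~{pcie_bandwidth_gb_s(16, 5.0):.0f} GB/s")  # ~8 GB/s
```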

2. Which game features use up the most VRAM (e.g. AA, filtering, resolution, etc.)?

Better texture sets (in-game, this is usually the "texture quality" setting).

AA and higher resolution increase the amount of memory used by the framebuffer, but that's just not that big anymore. A triple-buffered 1920x1200 FSAA scene with a 32-bit z and 8-bit stencil buffer would still take less than 50MB. It used to be a big deal; it just isn't anymore.
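
If you want to check that figure, the back-of-the-envelope arithmetic (my own, and ignoring any extra storage that MSAA sample buffers would add) looks roughly like this:

```python
# Framebuffer size at 1920x1200, triple buffered,
# with a 32-bit depth + 8-bit stencil buffer.

pixels = 1920 * 1200            # 2,304,000 pixels

color = pixels * 4 * 3          # RGBA8 (4 bytes/pixel), three buffers
depth = pixels * 5              # 32-bit z + 8-bit stencil = 5 bytes/pixel

total_mb = (color + depth) / (1024 ** 2)
print(f"~{total_mb:.0f} MB")    # roughly 37 MB -- small next to 512 MB of VRAM
```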

AF means that there have to be anisotropic versions of the textures, which adds ~30% to texture memory usage.

But generally, "how much VRAM do I need?" is the wrong question. What you need is more shader and ROP power, and to get it you'll have to buy a GPU that has a gigabyte of VRAM. If you want to stay in AMD's camp, I'd suggest a 6870 or 6950, depending on your budget. If you feel the grass is greener on the other side, you can't go wrong with a GTX 560 Ti.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Pick an area or level that feels like it's really taxing your system the most. Make a save point in that area if you can, then play there for 5 minutes. Open Task Manager and look at your CPU graphs; you want to look for flat lines where the CPU is pegged at 100%. If either core pegs at 100%, that's telling you you need more CPU power. The thing about the E2160 is that it is quite cache-starved. Even at 3GHz the constant cache misses are going to hit it hard in some situations, and you will perceive that as a loss of framerate. 1MB of cache is simply not enough for an Intel chip, for games anyway. Unfortunately Intel charges out the wazoo for cache, but if you look on eBay you can try to find an old Conroe with 4MB of cache. It will make a huge difference in your minimum frame rates - more so than moving to a 5850, in my opinion; it would be close, but the CPU is the cheaper upgrade. I price-checked an E6420 on eBay just now and there are plenty in the $40 range. Even if you can only manage to OC it to 2.7GHz, the 4x larger cache will make a very noticeable difference.
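
If you'd rather log it than eyeball Task Manager, here's a minimal sketch using the third-party psutil package (the 98% threshold and sample count are arbitrary choices of mine):

```python
# Minimal per-core CPU logger. Run it in the background while playing the
# demanding section, then look for cores that sit pinned near 100%.
# Requires: pip install psutil

import psutil

SAMPLES = 300  # about 5 minutes at one sample per second

for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    line = "  ".join(
        f"core{i}: {p:5.1f}%" + (" PEGGED" if p >= 98 else "")
        for i, p in enumerate(per_core)
    )
    print(line)
```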
 

jesterb84

Member
Mar 14, 2008
127
0
0
@Tuna: Thanks, that was very informative and exactly the type of reply I was looking for! I'm going to price-check some of those cards on NewEgg/Canada Computers.

@sm625: That's a good suggestion, I'll check eBay out and see if I can find some older Conroes/Wolfdales for cheap. I guess the 1MB cache is making my 3.0GHz C2D work harder than an equivalently clocked CPU with more cache.
 

evilspoons

Senior member
Oct 17, 2005
321
0
76
Even AT has been guilty of displaying graphs at lower resolutions and settings which show good performance scaling (no doubt to highlight differences between generations of CPU), but the differences disappear when the resolution and settings are turned up.

I wouldn't say "guilty". I would say they are properly scientifically rigorous when reviewing a CPU by attempting to remove the GPU from the equation.

It's just a matter of people interpreting the data incorrectly - without realizing how the game will scale from the resolution settings used in the CPU test to the settings you'll actually want to play on.
 

Necc

Senior member
Feb 15, 2011
232
0
0
@Tuna: Thanks, that was very informative and exactly the type of reply I was looking for! I'm going to price-check some of those cards on NewEgg/Canada Computers.

@sm625: That's a good suggestion, I'll check eBay out and see if I can find some older Conroes/Wolfdales for cheap. I guess the 1MB cache is making my 3.0GHz C2D work harder than an equivalently clocked CPU with more cache.
Get yourself a second-hand Q6600: 4 cores, 8MB cache. They are pretty cheap these days (eBay).
 

DirkGently1

Senior member
Mar 31, 2011
904
0
0
I wouldn't say "guilty". I would say they are properly scientifically rigorous when reviewing a CPU by attempting to remove the GPU from the equation.

It's just a matter of people interpreting the data incorrectly - without realizing how the game will scale from the resolution settings used in the CPU test to the settings you'll actually want to play on.


I would say that running game benchmarks to show off a CPU is totally unnecessary and misleading, as you create conditions that nobody will run under in real life.

Why not a CPU test like wprime instead?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
That CPU would certainly hold back a 4870 a little bit, but you would be very unlikely to notice it in actual gameplay for most games. Now, if you are trying to run really high settings and AA in all modern games, then a 4870 will choke a bit at 1920x1200. The GPU itself is pretty decent for most games if you reduce some settings, but 512MB is still going to kill you in most modern games at that resolution. Most of the newer titles will easily exceed 512MB at 1920x1200 at what would otherwise be playable settings.
 

jesterb84

Member
Mar 14, 2008
127
0
0
That CPU would certainly hold back a 4870 a little bit, but you would be very unlikely to notice it in actual gameplay for most games. Now, if you are trying to run really high settings and AA in all modern games, then a 4870 will choke a bit at 1920x1200. The GPU itself is pretty decent for most games if you reduce some settings, but 512MB is still going to kill you in most modern games at that resolution. Most of the newer titles will easily exceed 512MB at 1920x1200 at what would otherwise be playable settings.

Thanks, I had a sense that 512MB on the card may be a bit low compared to the actual GPU power for some of the latest games. This card was given to me for free, though, so I can't complain at all ;-)

I think what I'm seeing in the comments in general (and in my testing so far) is that a GPU upgrade is going to help most, but a CPU upgrade to more cache/a quad core is not a bad idea. Although I am tempted to get a Q6600, I just couldn't find one for a decent price in Canada. What I'll probably end up doing is wait for Bulldozer and see what effect it has on the market. Then I can evaluate BD vs. SB and plan an upgrade accordingly at the end of this year.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
I don't know what is wrong with the legionhardware site, but I have a link to a Google-cached (ironic, eh?) version of an article that tests 4 different Intel cache sizes, all at the same speed and multiplier. It is a great source of data. Here is the tinyurl to that Google-cached article:

tinyurl - 3owzfjn

In some games, 4MB of cache gets you nearly 30% more performance compared to 1MB of cache. But I think it is even worse for the small caches when you look at minimum frame rates. And of course it is the minimum frame rate that is most critical.

Keep in mind that it is not just the size of the cache, but also the associativity that suffers on an E2160. It is only an 8-way cache whereas an E6420 supposedly has a 16-way cache.
 

evilspoons

Senior member
Oct 17, 2005
321
0
76
I would say that running game benchmarks to show off a CPU is totally unnecessary and misleading, as you create conditions that nobody will run under in real life.

Why not a CPU test like wprime instead?

It's not "showing it off", it's demonstrating the contribution the CPU makes. Obviously both the CPU and GPU need to be fast for most games to work properly, but some games just don't care and are highly CPU or GPU dependent. If you bottleneck the game by having GPU settings that are too high, you can't see potential CPU bottlenecks.

Remember that benchmarks are an average of a run.

Let's say half a benchmark is easy on the GPU and half is hard on the GPU when you're at high resolution (say, indoors and outdoors). You end up with an average frame rate of 30 FPS because indoors is getting 50 and outdoors is getting 10. That average hides whether the CPU could actually supply enough data for some theoretically faster video card to run at 50 FPS outdoors. Once you turn the GPU-bottlenecking details down or off and let the game 'run free', you can see whether the CPU allows it to run any faster, or whether it's still only running at 10 FPS - in which case the CPU is also holding it back, and you would need a faster CPU (and GPU) to run the game better. On the other hand, if the cheapest CPU on the market runs the 10 FPS section at 400 FPS with the GPU-dependent detail turned down, it's safe to say a faster CPU is not needed to improve the game.
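
A toy example with made-up numbers, just to show how the averaging hides the slow section:

```python
# Half the run (by time) renders at 50 FPS, half at 10 FPS.
# The reported average looks fine; the section you actually feel does not.

indoor_seconds, indoor_fps = 60, 50
outdoor_seconds, outdoor_fps = 60, 10

total_frames = indoor_seconds * indoor_fps + outdoor_seconds * outdoor_fps
average_fps = total_frames / (indoor_seconds + outdoor_seconds)

print(f"reported average: {average_fps:.0f} FPS")               # 30 FPS
print(f"worst section:    {min(indoor_fps, outdoor_fps)} FPS")  # 10 FPS
```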

This is why you must try to isolate the components. Make a CPU decision based on CPU-isolated tests, and make a GPU decision based on GPU-isolated tests.
 

jesterb84

Member
Mar 14, 2008
127
0
0
I don't know what is wrong with the legionhardware site, but I have a link to a Google-cached (ironic, eh?) version of an article that tests 4 different Intel cache sizes, all at the same speed and multiplier. It is a great source of data. Here is the tinyurl to that Google-cached article:

tinyurl - 3owzfjn

In some games, 4MB of cache gets you nearly 30% more performance compared to 1MB of cache. But I think it is even worse for the small caches when you look at minimum frame rates. And of course it is the minimum frame rate that is most critical.

Keep in mind that it is not just the size of the cache, but also the associativity that suffers on an E2160. It is only an 8-way cache whereas an E6420 supposedly has a 16-way cache.

Thank god I didn't get the Celeron with 512KB of L2 cache at the time. That would've really been a bad decision ;-)
 