a good example of CPU limitations


Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: toyota
Originally posted by: RussianSensation
Originally posted by: Idontcare

It doesn't make sense at all.

Sure. A lot of things in life don't make sense because we may lack the knowledge or the technical expertise. I don't have an explanation of how 30% boost can give you 100% to 200% performance increase but here is another site with another bench:

X2 3800+ = 2 x 2.0ghz A64
A64 4000+ = 1 x 2.4ghz A64

So we 'logically' would say you can expect a 2x performance increase at most, right? Apparently that would be Wrong.

Far Cry 2 - X2 3800+ vs. A64 4000+
X2 3800+ = 29.1 average, 20 minimum
A64 4000+ = 9.2 average (>3x slower), 5 minimum (4x slower)

Also notice how the minimum framerates of Q6600 2.4ghz almost match the average frames of E6600 2.4ghz for the "dual core is enough" camp.

well all they do is shut off one core of an X2 and that doesn't always work correctly.

Or that multithreaded programs don't always behave optimally on a single core cpu. By going multithreaded, you're increasing register pressure and memory requirements without providing additional hardware resources to deal with it. Because there are two threads, the app may be forcing many more context switches than a dual core would deal with, and besides the penalty that entails, the cache on an athlon may not be big enough to handle two threads. In fact, it probably isn't: considering how much 256KB of cache cripples a sempron at this point, the 512KB on the athlons probably can't handle two threads. Change to a Phenom I or II and turn off 3 cores and the performance hit may not be as severe, since all the threads will fit comfortably in cache.
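A quick way to see the oversubscription part of Fox5's point is the minimal Python sketch below. It assumes the third-party psutil package and uses arbitrary loop sizes: two CPU-bound worker processes are pinned first to one core, then to two, so the one-core run pays the context-switching cost he describes (it does not isolate the cache-footprint effect).

import time
from multiprocessing import Process

import psutil  # third-party; pip install psutil


def burn(n=20_000_000):
    # Simple CPU-bound loop standing in for one game thread's worth of work.
    total = 0
    for i in range(n):
        total += i * i
    return total


def run_two_workers(cores):
    # Launch two workers and restrict both to the given list of core IDs.
    procs = [Process(target=burn) for _ in range(2)]
    start = time.perf_counter()
    for p in procs:
        p.start()
        psutil.Process(p.pid).cpu_affinity(cores)
    for p in procs:
        p.join()
    return time.perf_counter() - start


if __name__ == "__main__":
    print("two workers on one core :", round(run_two_workers([0]), 2), "s")
    print("two workers on two cores:", round(run_two_workers([0, 1]), 2), "s")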

http://www.tomshardware.com/re...erformance,2373-9.html
Actually, here are some phenom II results.
BTW, there are plenty of games that take good advantage of quad core...basically anything UE3 based or ported from the consoles. Thing is, the console cores are so slow that it doesn't matter if the game scales well going to quad core, dual core was more than fast enough. Well, unless the game is gta4...
I'd imagine console centric engines could make use of up to 6 threads, given that both the ps3 and 360 support 6 threads. That's a very nice point in favor of the i7's hyperthreading, if someone ever makes a game engine that scales to need that kind of horse power.
 

vj8usa

Senior member
Dec 19, 2005
975
0
0
Originally posted by: RussianSensation
here is another site with another bench:

X2 3800+ = 2 x 2.0ghz A64
A64 4000+ = 1 x 2.4ghz A64

So we 'logically' would say you can expect a 2x performance increase at most, right? Apparently that would be Wrong.

Far Cry 2 - X2 3800+ vs. A64 4000+
X2 3800+ = 29.1 average, 20 minimum
A64 4000+ = 9.2 average (>3x slower), 5 minimum (4x slower)

Also notice how the minimum framerates of Q6600 2.4ghz almost match the average frames of E6600 2.4ghz for the "dual core is enough" camp.

I found those numbers interesting as well, and decided to see if I could get something similar. I ran the same benchmark at the same "Very High" preset, but at a lower resolution of 1280x800 (my GPU's weaker than the 4870 they used, after all). I simulated the single-core CPU by setting affinity in task manager.

At 3GHz, I go from 30.8/22.7 (avg/min) on 2 cores to 24.2/17.1 FPS with one core. Not much of a drop.
At 2Ghz, I go from 27.7/19.2 to 18.4/12.6. Pretty significant drop, but still a far cry from what PCGH got.

edit: to reduce any GPU limitations, I just tried again at 640x480 with visual settings on min, and got some better numbers: from 53/37 FPS with 2 cores down to 29/18.5FPS with 1 core. Min framerate is actually cut almost exactly in half, which is pretty neat. Those numbers are in line with my expectations, though (unlike PCGH's numbers).
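For anyone wanting to repeat vj8usa's affinity trick without clicking through Task Manager, here is a minimal Python sketch. It assumes the third-party psutil package, and "FarCry2.exe" is only an illustrative executable name, not necessarily the game's real process name.

import psutil  # third-party; pip install psutil


def pin_process(exe_name, cores):
    # Find every running process with this executable name and
    # restrict it to the given core IDs, e.g. [0] to simulate a single core.
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == exe_name:
            proc.cpu_affinity(cores)
            print(f"pinned {exe_name} (pid {proc.pid}) to cores {cores}")


pin_process("FarCry2.exe", [0])       # single-core run
# pin_process("FarCry2.exe", [0, 1])  # restore both cores afterwards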
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: RussianSensation
Originally posted by: Idontcare

It doesn't make sense at all.

Sure. A lot of things in life don't make sense because we may lack the knowledge or the technical expertise. I don't have an explanation of how 30% boost can give you 100% to 200% performance increase but here is another site with another bench:

X2 3800+ = 2 x 2.0ghz A64
A64 4000+ = 1 x 2.4ghz A64

So we 'logically' would say you can expect a 2x performance increase at most, right? Apparently that would be Wrong.

Far Cry 2 - X2 3800+ vs. A64 4000+
X2 3800+ = 29.1 average, 20 minimum
A64 4000+ = 9.2 average (>3x slower), 5 minimum (4x slower)

Also notice how the minimum framerates of Q6600 2.4ghz almost match the average frames of E6600 2.4ghz for the "dual core is enough" camp.

But that's taking a game threaded for at least a dual core and then benching it with a single core cpu. Without another core it bottoms out, especially if the game isn't optimized for a single core.

Tom's benches are wrong. The benches show the same dual core CPU at a 30% higher clock getting double the frame rates. There is user error somewhere. That's usually the case when dealing with Tom's Hardware nowadays.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,269
12
81
Originally posted by: Idontcare
Originally posted by: vj8usa
I honestly don't understand how the numbers Tomshardware is getting are even possible. According to them, a 30% clock speed increase is netting over a 200% increase in framerates in some cases (FC2 on the 3850 goes from 20.2 to 61.6 FPS, for instance). How can the performance increase be greater than the clockspeed increase?

It doesn't make sense at all.

If the rate-limiting step in the computation sequence that produces each frame is improved by 30%, then at best the fps is going to improve by 30%.

To derive a 200% fps improvement requires something to have been bottlenecking the computation loop quite severely and that something had to be improved upon by 2x without anything else in the computation loop becoming the rate-limiting step.
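As a toy illustration of the cap Idontcare describes (Python, with made-up frame times rather than measurements): if the CPU stage is the slowest part of producing a frame and it gets 30% faster, the frame rate can rise by roughly 30% at most, until some other stage such as the GPU becomes the limiter.

def fps(cpu_ms, gpu_ms):
    # Worst case: each frame waits on whichever stage is slower.
    return 1000.0 / max(cpu_ms, gpu_ms)


cpu_ms, gpu_ms = 40.0, 20.0           # a heavily CPU-bound frame
print(fps(cpu_ms, gpu_ms))            # 25.0 fps
print(fps(cpu_ms / 1.3, gpu_ms))      # ~32.5 fps, i.e. about a 30% gain at best
print(fps(cpu_ms / 3.0, gpu_ms))      # 50.0 fps: with a 3x CPU speedup the GPU (20 ms) is now the limiter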

You're right, it doesn't make sense, unless there is something else going on. But I have similar hardware, and I've tested a stock HD3850 with the Athlon X2 @ 2.00 GHz and 3.1 GHz. Sadly, somehow my 3.1 GHz results have been erased or overwritten, so I'm going to run more tests and edit them in later, but here are my results for the Athlon X2 @ 2.00 GHz. I tried to replicate the same settings they did.

Test Setup
Athlon X2 5000+ BE
Arctic Cooling Freezer 64 Pro
Radeon HD3850 512MB @ stock
Gigabyte MA78G-DS3H
2x2GB G.Skill DDR2-1000
600W OCZ StealthXStream
500GB WD5000AAKS
Enermax Chakra case
Samsung SyncMaster 915n
Windows Vista Ultimate 32-bit

When results are edited in later, I'll put the scores right next to each other. For example:
Max: @2.00 GHz -- @3.1 GHz
Avg: @2.00 GHz -- @3.1 GHz
Min: @2.00 GHz -- @3.1 GHz

(Not a clean install; some background apps were running, like Windows Live Messenger and Skype, but nothing else was hogging CPU time.)

Far Cry 2 Ranch Small

1280x1024, all low settings, no AA

Max: 62.29 -- 85.80
Avg: 46.33 -- 63.95
Min: 34.14 -- 47.69

1280x1024, Low Render Quality settings, Performance settings set to Very High, no AA

Max: 61.43 -- 81.96
Avg: 43.78 -- 59.57
Min: 32.33 -- 43.79

1280x1024, all medium settings with HDR, no AA

Max: 55.86 -- 70.73
Avg: 38.50 -- 49.62
Min: 29.22 -- 37.91

1280x1024, Medium Render Quality settings with HDR, Performance settings set to Very High, no AA

Max: 55.86 -- 65.99
Avg: 35.59 -- 48.42
Min: 25.65 -- 37.95

So far, even at 2.00 GHz and with an HD3850, I'm not getting the ridiculously low results that Tomshardware got.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BFG10K
Yep, my stance is summed up in one sentence: as long as you have a half-decent CPU (e.g. my E6850), the GPU is by far the biggest bottleneck in performance overall
But what is a half-decent CPU for one game can be a massive bottleneck for another, particularly with a really fast GPU setup. GTA4 and ArmAII are two notable examples of games that struggle on any C2D.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Originally posted by: toyota

1920 is a low res? lol.
Low is probably too strong, but certainly middling at best.

if I raised it it wouldnt even be playable in the first place.
That simply proves your bottleneck is the GPU since dropping the resolution is increasing your frames. Even your own tests confirm what I'm saying.

2.0 on the E8500 is not artificially low and is faster than a 5600 X2 which is faster than what the majority of pc users still have.
Sure it's slow; you're vastly underclocking your CPU just to get some kind of difference, but that difference is unrealistic and overblown. This is akin to those 1280x1024 tests done online that conclude a processor is "better" because it scores 150 FPS instead of 130 FPS at said settings.

As for what most users use, most use a GMA, so can I conclude using a 2400 for graphics testing is okay because it's vastly faster? Of course not, because a 2400 is prehistoric junk, much like that X2 processor of yours. Drawing inferences from junk processors is as invalid as drawing them from junk GPUs.

last year I had horrific experiences trying to run a decent card on a 5000 X2.
Did you look at the tests I linked to? Did you look at the settings used? The 8800 Ultra was bottlenecking a stock E6600 even at 1600x1200 with 4xAA in almost every test. Do you really think someone that dropped $1000 on a quad-core i7 is going to be gaming at lower settings than that?

Those benchmarks are the norm, not the exception; I see this all the time when gaming.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Originally posted by: Fox5

BTW, there are plenty of games that take good advantage of quad core...
There really aren't; there are simply a few fringe titles that represent a drop in the bucket compared to what is actually out there. Furthermore, many of those that do show practical differences also run piss-poor on everything, like ArmA 2.

basically anything UE3 based or ported from the consoles.
At what settings? 1280x1024 with no AA?

I can tell you for a fact that every UT3 based game I have is bottlenecked by my GTX285, by far (dual-core E6850 @ stock). Some random benchmark bars might show some pretty differences, but when you actually sit down to play the games at real settings, you quickly find the bottleneck is almost entirely with the GPU.

In all except Bioshock I use 1920x1200 and stick to 2xAA to keep the framerate up in demanding areas. Again, would you expect someone dropping $1000 on a quad-core to be gaming at less than those settings?

So even if there were some differences in a miniscule amount of titles, even those are often nullified when you start using middling resolutions with minimal AA.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Originally posted by: kylebisme

But what is a half-decent CPU for one game can be a massive bottleneck for another, particularly with a really fast GPU setup. GTA4 and ArmAII are two notable examples of games that struggle on any C2D.
Again, I'm not saying there aren't any differences between quad and dual, just that they're fringe cases. Furthermore, I'm also stating that at any reasonable resolution and minimal AA level those differences often cease, and the biggest bottleneck becomes the GPU by far.

Quad-core is still horrifically overpriced and underperforming compared to dual-core, especially the i7 platform. If you invest the difference into a better graphics card you'll get far better gaming performance overall.

As for your two examples, you can clearly see they're piss-poor engines that don't run well on anything; both generally slideshow even at 1280x1024 with no AA.
 
Scholzpdx

Apr 20, 2008
10,162
984
126
Originally posted by: biostud
http://www.tomshardware.com/re...overclocking,2395.html

going from 2Ghz to 2.6Ghz:

Left 4 Dead is clearly limited by the Athlon X2 3800+ CPU--look at the difference 600 MHz makes. Our Athlon 64 X2 4200+ at Athlon 64 X2 5000+ speeds is doubling the frame rate.

It's Tom's Hardware. They don't know what the fuck they are doing over there.

These are the same people who couldn't overclock a Q8200 past 2.5ghz without insane voltages (1.45V). A noob overclocker (yes I said it) such as myself overclocked to 2.8ghz without any adjustments AT ALL. 3ghz took only an hour or so of actual testing.

http://www.tomshardware.com/re...m-pentium,2366-12.html
 

betasub

Platinum Member
Mar 22, 2006
2,677
0
0
Originally posted by: Scholzpdx
These are the same people who couldn't overclock a Q8200 past 2.5ghz without insane voltages (1.45V). A noob overclocker (yes I said it) such as myself overclocked to 2.8ghz without any adjustments AT ALL. 3ghz took only an hour or so of actual testing.

So challenge them to swap their Q8200 CPU with yours and show them how it is done
 
Scholzpdx

Apr 20, 2008
10,162
984
126
Originally posted by: betasub
Originally posted by: Scholzpdx
These are the same people who couldn't overclock a Q8200 past 2.5ghz without insane voltages (1.45V). A noob overclocker (yes I said it) such as myself overclocked to 2.8ghz without any adjustments AT ALL. 3ghz took only an hour or so of actual testing.

So challenge them to swap their Q8200 CPU with yours and show them how it is done

Sarcastic or not, they completely fail.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: BFG10K
Originally posted by: toyota

1920 is a low res? lol.
Low is probably too strong, but certainly middling at best.

if I raised it it wouldnt even be playable in the first place.
That simply proves your bottleneck is the GPU since dropping the resolution is increasing your frames. Even your own tests confirm what I'm saying.

2.0 on the E8500 is not artificially low and is faster than a 5600 X2 which is faster than what the majority of pc users still have.
Sure it's slow; you're vastly underclocking your CPU just to get some kind of difference, but that difference is unrealistic and overblown. This is akin to those 1280x1024 tests done online that conclude a processor is "better" because it scores 150 FPS instead of 130 FPS at said settings.

As for what most users use, most use a GMA, so can I conclude using a 2400 for graphics testing is okay because it's vastly faster? Of course not, because a 2400 is prehistoric junk, much like that X2 processor of yours. Drawing inferences from junk processors is as invalid as drawing them from junk GPUs.

last year I had horrific experiences trying to run a decent card on a 5000 X2.
Did you look at the tests I linked to? Did you look at the settings used? The 8800 Ultra was bottlenecking a stock E6600 even at 1600x1200 with 4xAA in almost every test. Do you really think someone that dropped $1000 on a quad-core i7 is going to be gaming at lower settings than that?

Those benchmarks are the norm, not the exception; I see this all the time when gaming.

1920 isn't middling either. it's still a high res for most people. you act like everybody has a 30inch monitor and decent quad core cpu when that is far from the case. also an E6600 is much faster than a 5000 X2 so your comment about the 8800gtx being a bottleneck with it at 1600 with 4x AA has little bearing. not to mention many newer games are much more demanding on the cpu in general than those games used.


putting my E8500 at 2.0 was not unrealistic because it would still be better than all but the very high end X2 cpus, low end core 2s, and even single cores that people still run. perhaps reality has escaped you but there are many threads about people wanting to upgrade to high end graphics cards or even sli that have just as slow or slower cpus than my example. in some of those cases even their current video card is faster than what their cpu can even handle. adding way more gpu power isn't always the answer, especially when your current cpu is already holding your fps back.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
I pretty much agree with BFG except for 1920 being a low resolution. If anything it's a rather high resolution. Now it's becoming the norm since monitor prices have dropped considerably.

I want to upgrade to a quad as well but there's really no need for quad except for a few exceptions. A 3.0 ghz core 2 duo or even slightly slower is all you need to be gaming comfortably. Jesus, I've had my CPU for nearly 3 years and it's still going strong.

As for Tom's benches, I stopped reading that site after the buyout.
 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: toyota
also an E6600 is much faster than a 5000 X2

It really isn't that much faster. A 3ghz x2 will be neck and neck with 2.4ghz c2d considering a 5000+ is 2.6ghz.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: Azn
Originally posted by: toyota
also an E6600 is much faster than a 5000 X2

It really isn't that much faster. A 3ghz x2 will be neck and neck with 2.4ghz c2d considering a 5000+ is 2.6ghz.

overall it's a slightly bigger difference than that in the games where you really need more oomph from the cpu. being about equal to an E8500 at 1.6 in many games, the 5000 X2 is a lot slower than people think. you can see that in this link where an E2160 at 1.8, which is much slower clock for clock than the E8xxx core 2 cpus, keeps even with a 5000 X2. so at 1680 with very high settings and even 4x AA the E6600 makes the game playable where it wasn't with the 5000 X2. http://www.pcgameshardware.com...hmarks/Reviews/?page=2 a 30% increase in min and average framerate is not bad at all with those settings. sure it's just at 1680 but very high settings and 4x AA are pretty graphically demanding.

as for your cpu of course you have no issues because it is quite fast with that overclock. this cpu limitation stuff is real though and can easily be shown. it just really depends on what cpu and gpu that you have. if you have a 3800 x2 and a gtx260 you are really holding that card back and adding a faster gpu will NOT help in many games at that point. I am not saying that games wouldn't be playable, it's just the fact that so much performance is being missed out on and a faster gpu won't fix that. the only people that would argue that haven't taken the time to test it out. a 3800 X2 would be like my E8500 at 1.3-1.4 and that will hold back even a current midrange card. anyway I completely agree that Toms test seems way off though.


 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,494
4
81
Originally posted by: biostud
increasing your CPU speed by 58% increases your average framerate by ~23%

in the article, increasing the CPU speed by 30% increases the framerate by more than 100% in some games. That is quite surprising to me, and makes me happy that my "poky" but overclocked CPU is running fine with my 8800GT.

Dude I used to be running a similar config. I had a x2 3800 OC'd 25% to 2.5Ghz. It is fast enough, but I recently made the switch to a Phenom II X4 940. It was worth it, you don't realize your CPU limitations till you make the switch sometimes.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,269
12
81
Originally posted by: toyota
Originally posted by: Azn
Originally posted by: toyota
also an E6600 is much faster than a 5000 X2

It really isn't that much faster. A 3ghz x2 will be neck and neck with 2.4ghz c2d considering a 5000+ is 2.6ghz.

overall it's a slightly bigger difference than that in the games where you really need more oomph from the cpu. being about equal to an E8500 at 1.6 in many games, the 5000 X2 is a lot slower than people think. you can see that in this link where an E2160 at 1.8, which is much slower clock for clock than the E8xxx core 2 cpus, keeps even with a 5000 X2. so at 1680 with very high settings and even 4x AA the E6600 makes the game playable where it wasn't with the 5000 X2. http://www.pcgameshardware.com...hmarks/Reviews/?page=2 a 30% increase in min and average framerate is not bad at all with those settings. sure it's just at 1680 but very high settings and 4x AA are pretty graphically demanding.

as for your cpu of course you have no issues because it is quite fast with that overclock. this cpu limitation stuff is real though and can easily be shown. it just really depends on what cpu and gpu that you have. if you have a 3800 x2 and a gtx260 you are really holding that card back and adding a faster gpu will NOT help in many games at that point. I am not saying that games wouldn't be playable, it's just the fact that so much performance is being missed out on and a faster gpu won't fix that. the only people that would argue that haven't taken the time to test it out. a 3800 X2 would be like my E8500 at 1.3-1.4 and that will hold back even a current midrange card. anyway I completely agree that Toms test seems way off though.

Generally speaking, the E2xxx lineup is on par clock for clock with Athlon X2. You really can't base your claim the 1.8 GHz E2160 is the same as the 5000+ off one article which only tested one game. And this game, Far Cry 2, is definitely known to generously favor Intel processors.

Here are two more tested games, and clock for clock they are pretty similar: http://www.anandtech.com/mb/showdoc.aspx?i=3051&p=6

The E6600 is most comparable to the Athlon 6400+ (and probably now the Athlon X2 7750/7850).
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BFG10K
Again, I'm not saying there aren't any differences between quad and dual, just that they're fringe cases. Furthermore, I'm also stating that at any reasonable resolution and minimal AA level those differences often cease, and the biggest bottleneck becomes the GPU by far.

Quad-core is still horrifically overpriced and underperforming compared to dual-core, especially the i7 platform. If you invest the difference into a better graphics card you'll get far better gaming performance overall.

As for your two examples, you can clearly see they're piss-poor engines that don't run well on anything; both generally slideshow even at 1280x1024 with no AA.
While I agree that most games are fine with a decent dual-core, I've yet to see any other engines accomplish as much as ArmAII or GTA4 while running as well on any given setup. They both run respectably well on my Ci7 setup, but not so hot on my C2D.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Originally posted by: toyota

1920 isn't middling either. it's still a high res for most people.
Exactly the same applies to a Radeon 2400: "it's still a high level of performance for most people" given the vast majority of the market uses an Intel GMA, which is grossly slower. So then using your logic, does that make a Radeon 2400 HD a high-end card?

As for middling, let's compare pixel counts:

1920x1080 = 2073600.
2560x1600 = 4096000.

If something is halfway between a number and zero, that puts it in the middle. That's not my definition, that's a mathematical definition.

Seriously, it's really quite middling. 1600x1200 has only a slightly lower pixel count and I've been using said resolution since 2001 on my GF Ti4600, and running modern games at the time on it.

you act like everybody has a 30inch monitor and decent quad core cpu when that is far from the case.
Not at all; in fact I'm specifically arguing the case to use mainstream dual-core processors and putting the savings towards the fastest graphics card you can buy. I'm also arguing against quad-core because there's little benefit in gaming from them overall, and the cost is often horrific compared to a high clocked dual-core.

And again, what resolution do you think someone that just dropped a thousand bucks onto a quad-core i7 is going to be using? Do you honestly think they're going to be gaming at anything less than 1920x1200 with 4xAA? These people pay to get top-level performance and eye candy, not to run prehistoric resolutions without AA.

also an E6600 is much faster than a 5000 X2 so your comment about the 8800gtx being a bottleneck with it at 1600 with 4x AA has little bearing.
Actually it has a lot of bearing. Firstly, the E6600 isn't hugely faster than the 5000 X2:

http://www.virtual-hideout.net...000_Black/index3.shtml

Those tests were done at 1440x900 with no AA; increase the settings to 1600x1200 with 4xAA and witness flat-lining, exactly the same flat-lining I observed when I went from an E6600 to an E6850, using a GPU about twice as fast as theirs I might add:

http://www.legitreviews.com/article/365/4/

Secondly, the GTX285 is about twice as fast as the 8800 Ultra in my real-world testing, a far cry from the 25% theoretical advantage my E6850 has over my former E6600.

So, with a 25% increase in CPU speed and a 100% increase in GPU speed, my GTX285 still massively bottlenecks me in just about every situation I use it in. It's painfully obvious that quad-core simply has no place in that picture.

not to mention many newer games are much more demanding on the cpu in general than those games used.
I definitely disagree there. Again, there are some fringe titles, but by far the vast majority of titles are massively GPU bound, and they get more so as you increase the details.

Cryostasis is an excellent example; it makes my GTX285 cry even with PhysX disabled at 1680x1050 with no AA. I know it's the GPU because dropping the resolution significantly increases performance.
 

biostud

Lifer
Feb 27, 2003
18,402
4,965
136
Originally posted by: OVerLoRDI
Originally posted by: biostud
increasing your CPU speed by 58% increases your average framerate by ~23%

in the article, increasing the CPU speed by 30% increases the framerate by more than 100% in some games. That is quite surprising to me, and makes me happy that my "poky" but overclocked CPU is running fine with my 8800GT.

Dude I used to be running a similar config. I had a x2 3800 OC'd 25% to 2.5Ghz. It is fast enough, but I recently made the switch to a Phenom II X4 940. It was worth it, you don't realize your CPU limitations till you make the switch sometimes.

I can't really decide whether I will upgrade my mboard+CPU or get an SSD first. I plan on switching from WinXP to Win7 when it's released, and my limited funds don't allow for both. I also need to buy a new waterblock if I upgrade my mboard, so it's probably not going to fit into my budget right now.

(I also only play Quake Wars and Civ IV atm, so I have plenty of speed for those games)
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Originally posted by: kylebisme

I've yet to see any other engines accomplish as much as ArmAII or GTA4 while running as well on any given setup. They both run respectably well on my Ci7 setup, but not so hot on my C2D.
"Respectably well"? Would you please list these accomplishments? Because the performance figures look absolutely abysmal. I've seen GTA4 and it doesn't have anything that would warrant that kind of woeful performance.

As for ArmA 2, I haven't seen it in person, but the review you linked to was pretty scalding about some things:

This impressive visualization is ruined by the bad LoD and the aggressive Streaming system as well as the post processing effects: The extremely exaggerated Motion Blur can be dealt with, but the Depth of Field which comes into effect in a short distance doesn't just soften the environment but also opponents, who are more or less made invisible by that - so lower details are much better from a gameplay related point of view. Biggest drawback: Even on a graphics card with 2 GiByte VRAM ArmA 2 loads high resolution object textures at a very late time. Thus it is possible that a wooden box is made of a washy bitmap and until you get close enough you don't even recognize that the brown something is supposed to be a wooden texture.

But even more important is the low performance: Even on overclocked highest-end hardware (Core i7; 12 GiByte RAM and a GTX 285 with 2 GiByte VRAM) ArmA 2 becomes a slide show (less than 15 fps) running at 1,280 x 1,024 pixels with very high details. You have to go to low or medium details and have to activate Pixel Doubling (which is ugly) to get playable framerates. Given the performance you shouldn't even think about Anti Aliasing, but the Real Virtuality Engine doesn't support it, as well as any SLI or Crossfire modes, anyway.
Less than 15 FPS at 1280x1024 with no AA? Is that what you consider "respectably well"? Because I consider that utterly piss-poor.

Call of Juarez BiB is a DX9 renderer like ArmA 2, has very large draw distances and impressive visuals, but runs extremely fast even at 2560x1600:

http://www.xbitlabs.com/articl...-juarez-2_4.html#sect1

So yes, quad-core might score "victories" in those two games, but they're also hollow since nothing runs those games acceptably.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Originally posted by: BFG10K
Originally posted by: kylebisme

I've yet to see any other engines accomplish as much as ArmAII or GTA4 while running as well on any given setup. They both run respectably well on my Ci7 setup, but not so hot on my C2D.
"Respectably well"? Would you please list these accomplishments? Because the performance figures look absolutely abysmal. I've seen GTA4 and it doesn't have anything that would warrant that kind of woeful performance.

As for ArmA 2, I haven't seen it in person, but the review you linked to was pretty scalding about some things:

This impressive visualization is ruined by the bad LoD and the aggressive Streaming system as well as the post processing effects: The extremely exaggerated Motion Blur can be dealt with, but the Depth of Field which comes into effect in a short distance doesn't just soften the environment but also opponents, who are more or less made invisible by that - so lower details are much better from a gameplay related point of view. Biggest drawback: Even on a graphics card with 2 GiByte VRAM ArmA 2 loads high resolution object textures at a very late time. Thus it is possible that a wooden box is made of a washy bitmap and until you get close enough you don't even recognize that the brown something is supposed to be a wooden texture.

But even more important is the low performance: Even on overclocked highest-end hardware (Core i7; 12 GiByte RAM and a GTX 285 with 2 GiByte VRAM) ArmA 2 becomes a slide show (less than 15 fps) running at 1,280 x 1,024 pixels with very high details. You have to go to low or medium details and have to activate Pixel Doubling (which is ugly) to get playable framerates. Given the performance you shouldn't even think about Anti Aliasing, but the Real Virtuality Engine doesn't support it, as well as any SLI or Crossfire modes, anyway.
Less than 15 FPS at 1280x1024 with no AA? Is that what you consider "respectably well"? Because I consider that utterly piss-poor.

Call of Juarez BiB is a DX9 renderer like ArmA 2, has very large draw distances and impressive visuals, but runs extremely fast even at 2560x1600:

http://www.xbitlabs.com/articl...-juarez-2_4.html#sect1

So yes, quad-core might score "victories" in those two games, but they're also hollow since nothing runs those games acceptably.
They're two totally different games running on two totally different engines. Why do people compare linear FPS's with virtual worlds and think that the CPU load is the same? Granted, ArmA 2 needs optimization and patching, but the quad core still flexes its muscles. Just because the load on the CPU is (more or less) unnecessary doesn't negate the fact that there is still a load on the CPU. When that kind of load becomes a requirement years down the road, the CPU is still going to be able to cope.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: cusideabelincoln
Originally posted by: toyota
Originally posted by: Azn
Originally posted by: toyota
also an E6600 is much faster than a 5000 X2

It really isn't that much faster. A 3ghz x2 will be neck and neck with 2.4ghz c2d considering a 5000+ is 2.6ghz.

overall it's a slightly bigger difference than that in the games where you really need more oomph from the cpu. being about equal to an E8500 at 1.6 in many games, the 5000 X2 is a lot slower than people think. you can see that in this link where an E2160 at 1.8, which is much slower clock for clock than the E8xxx core 2 cpus, keeps even with a 5000 X2. so at 1680 with very high settings and even 4x AA the E6600 makes the game playable where it wasn't with the 5000 X2. http://www.pcgameshardware.com...hmarks/Reviews/?page=2 a 30% increase in min and average framerate is not bad at all with those settings. sure it's just at 1680 but very high settings and 4x AA are pretty graphically demanding.

as for your cpu of course you have no issues because it is quite fast with that overclock. this cpu limitation stuff is real though and can easily be shown. it just really depends on what cpu and gpu that you have. if you have a 3800 x2 and a gtx260 you are really holding that card back and adding a faster gpu will NOT help in many games at that point. I am not saying that games wouldn't be playable, it's just the fact that so much performance is being missed out on and a faster gpu won't fix that. the only people that would argue that haven't taken the time to test it out. a 3800 X2 would be like my E8500 at 1.3-1.4 and that will hold back even a current midrange card. anyway I completely agree that Toms test seems way off though.

Generally speaking, the E2xxx lineup is on par clock for clock with Athlon X2. You really can't base your claim the 1.8 GHz E2160 is the same as the 5000+ off one article which only tested one game. And this game, Far Cry 2, is definitely known to generously favor Intel processors.

Here are two more tested games, and clock for clock they are pretty similar: http://www.anandtech.com/mb/showdoc.aspx?i=3051&p=6

The E6600 is most comparable to the Athlon 6400+ (and probably now the Athlon X2 7750/7850).

Not to mention, he ignores the 2Ghz 3800+, which performs on par with the 1.8Ghz e2160 (basically the same clock for clock performance). The game would appear to be very cache intensive, since the 6000+ performs massively better and has 1MB of L2 cache. That would also fit in with my earlier theory: multiple threads greatly increase cache needs, to the point where a multi-threaded app can hurt older cpus with less cache. Far Cry 2 is designed around quad (or possibly more) cores, and the 512KB cache X2's can't cope with it, while the processors with 1MB or more cache per core perform much better.

Still, at this point you can barely find X2's on the market anymore; there's really no reason not to go with at least an Athlon II.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Originally posted by: MrK6

They're two totally different games running on two totally different engines. Why do people compare linear FPS's with virtual worlds and think that the CPU load is the same?
I'm not saying the CPU load is the same, I'm saying the performance of ArmA 2 is utterly abysmal. As for open virtual worlds, look at something like the original Stalker, which again has a DX9 renderer with comparable visuals, but runs far better than ArmA 2. I can actually run it acceptably at 2560x1600 with 2xTrAA (around 50 FPS average in the benchmark runs).

Just because the load on the CPU is (more or less) unnecessary doesn't negate the fact that there is still a load on the CPU. When that kind of load becomes a requirement years down the road, the CPU is still going to be able to cope.
Cope? They can't cope right now in GTA4 and ArmA 2; how are they going to cope in more demanding titles? We need faster quad cores, plain and simple.

Yes, it's quite possible a reliance on quad core will become the norm in the future, but when that happens, quad core will be more proliferated and won't be much more expensive than dual core, so it'll make sense to get one. This is like the situation we're in now where it makes absolutely zero sense to get a single core.

The point is that now it isn't the norm, and advocating quad-core so you can hit 15 FPS @ 1280x1024 with no AA is quite silly. If you've got $1000 to burn on an i7 then it's a far better idea to go with CF/SLI and witness much bigger and more far-reaching performance gains than you ever would with quad-core.
 