- Sep 13, 2003
- 976
- 0
- 0
Originally posted by: Madellga
I get around 11000 on my 1st rig.
Originally posted by: BroadbandGamer
Originally posted by: Madellga
I get around 11000 on my 1st rig.
:Q
How are you getting such a high score? I thought my CPU was pretty good. :frown:
Originally posted by: jim1976
Originally posted by: BroadbandGamer
Originally posted by: Madellga
I get around 11000 on my 1st rig.
:Q
How are you getting such a high score? I thought my CPU was pretty good. :frown:
Well, his score is right.. I get 11130 with a Conroe@3.6GHz and a stock GTX.
The 4400+ is a very good CPU. Just not as good as a Conroe, especially an O/Ced one..
I saw major differences between the two CPUs.. Like you, I had a 4400+@2.8GHz, and I've now switched to a 6600@3.6GHz O/Ced.. Don't believe what you read about the CPU not playing a role @16x12 or above. I measured differences of up to 25-30% @16x12, and if you don't see it with your own eyes you can't believe it.
It's one of the few times that 3DMark tells the truth about CPU power..
Originally posted by: nitromullet
Originally posted by: jim1976
Well, his score is right.. I get 11130 with a Conroe@3.6GHz and a stock GTX.
The 4400+ is a very good CPU. Just not as good as a Conroe, especially an O/Ced one..
I saw major differences between the two CPUs.. Like you, I had a 4400+@2.8GHz, and I've now switched to a 6600@3.6GHz O/Ced.. Don't believe what you read about the CPU not playing a role @16x12 or above. I measured differences of up to 25-30% @16x12, and if you don't see it with your own eyes you can't believe it.
It's one of the few times that 3DMark tells the truth about CPU power..
http://www.xtremesystems.org/forums/showthread.php?t=123831
http://www.driverheaven.net/reviews/8800nvidiareviewx120/36.php
http://www.guru3d.com/article/Videocards/391/23/
CPU scaling benchmarks with actual games don't support your 25-30% @ 1600x1200 claim. 3DMark06 is not even remotely relevant to what you'll see in games @ 1600x1200 anymore. First of all, by default it runs at 1280x1024 with 0AA/0AF, which no one with an 8800GTX should be running their games at. Second, 3DMark06 has a CPU benchmark that relies solely on the CPU to render the screen. Yeah, sounds real-world to me...
3DMark06 MAY be a relevant benchmark for overall system performance (I'm sure your OC'ed Conroe will encode movies/music way faster than a 4400+), but it certainly falls short of being a good test for isolating 3D video performance, as its name would imply.
Originally posted by: nitromullet
Ok, perhaps I misunderstood your statement:
Don't believe what you read about the CPU not playing a role @16x12 or above. I measured differences of up to 25-30% @16x12, and if you don't see it with your own eyes you can't believe it. It's one of the few times that 3DMark tells the truth about CPU power..
...which sounds as if you're saying that 3DMark06 is giving you the truth about what you can expect from OC'ing your CPU. However, you're essentially saying that 3DMark06 does a good job of showing off CPU overclocks, even though they don't really translate to much in the real world. Don't get me wrong, your OC and 3DMarks are impressive, and I would probably have my Conroe clocked up a bit too if I'd had better luck with my 680i board (I think I'll wait for rev. 2 of the Asus boards). That being said, the OP really doesn't need to worry about his 3DMark06 score, since it doesn't count for much when it comes down to it, which is really what this thread is about and the main point I was making.
While not exactly a comparison between an OC'ed 4400+ and an OC'ed C2D, you probably remember this article from HardOCP a while back that got everybody's feathers ruffled, when they compared an FX-62, an E6700, and an X6800 and saw almost no differences even with a 7900GTX, which is a much lesser card than an 8800GTX....
http://enthusiast.hardocp.com/article.html?art=MTEwOCwxLCxobmV3cw==
Originally posted by: tilki
Guys, I just bought a WinFast 8800GTS and put it in the following system:
Win XP SP2 (with latest updates)
Abit AB9 motherboard
2 x 1024 MB 800 MHz RAM
97.02 NVidia driver
But when I try to install it, I get the following message: "the software for this hardware did not pass the Win Logo test to verify its compatibility with XP".
Has anybody experienced the same problem? What do I do?