Alright, I've been reading posts on here about people with an Athlon XP 1900, 2000, or 2100 and a GeForce4 Ti 4200 pushing 11,000 in 3DMark 2001, but only about 1,000 to 1,500 in 3DMark 2003.
Here's my specs:
2x Athlon MP 1900+ (not that they help gaming)
Radeon 9700Pro by ATI, updated drivers
Kingston PC2100 Registered ECC 512MB DDR
EPOX M762A motherboard
Windows XP Pro
With the Radeon 9700 Pro, I get between 11,200 and 11,300 every time in 3DMark 2001, but 4,280 in 3DMark 2003. Is 3DMark 2003 a better benchmark (as in a better measuring stick)? It seems that most of the cards that score just as high as mine in '01 fall way below my score in '03. Why is that?
Second, how much of a role would you say CPU performance plays in the score? Roughly 50%? 25%? Your thoughts are appreciated.