- Mar 20, 2006
- 10
- 0
- 0
Hey everyone. I was running an Nvidia GeForce FX 5200 and then a Radeon 9200 (both 128MB AGP cards). In the game Day of Defeat you can type a command that displays your FPS in real time during the game. I had the maximum set at 100 and would basically get 100 throughout the entire game.
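In case it helps, the console commands I mean are along these lines (from memory, so the exact names might be slightly off for this engine build):

fps_max 100    // cap the frame rate at 100 FPS
cl_showfps 1   // draw the real-time FPS counter
net_graph 3    // alternative overlay that also shows FPS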
My friend offered to sell me his MSI Nvidia 6600GT for $40. I couldn't pass up the deal, seeing as they were selling for up to $150 elsewhere. I installed it and ran the game, only to get a disappointing 40-50ish FPS. My friend and I concluded that my 350-watt power supply wasn't enough, so I installed a 500-watt PSU. Now when I run the game it maxes out at 60 FPS. What gives?! I don't know what else could be wrong. I have a gig of RAM, which is way more than you need for Day of Defeat (it's an old game).
Any help would be appreciated.
*** I originally posted this in the General Hardware forums, where someone suggested that my old cards were running the game in DX7 mode while the new one runs it in DX9 mode... how does that play into it?