PrinceXizor
Platinum Member
- Oct 4, 2002
Originally posted by: AMCRambler
I think video cards have reached the point where their rate of improvement has far exceeded that of the games we play. The same thing happened with PC processors, it just took longer. I remember when a game would require a Pentium 266 while the top of the line was a Pentium 500, and we thought, man, games will never catch up. The gap has grown so that the minimum required is something like 800MHz, and the top of the line is what, 3.2GHz? The same thing is happening with video cards. Upgrades are going to become fewer and farther between for the same reason.
I've got a GeForce4 Ti 4600 that can run everything I throw at it right now on the highest settings. The card came out two years ago and it's still more than enough to run the latest stuff. Now, I don't run it at 1400x1600, because that would be ridiculous on a 17-inch monitor, but even with all details on at 1024x768 it runs UT2K fine. It'll probably be at least a year before I need a new card, just because I don't need the highest settings enabled with 30X anti-aliasing or whatever. It seems to me, though, that the trend is toward longer stretches between component upgrades for video cards and CPUs, because they have outpaced the demands of the software that runs on them.
Make a note...you ready? UT2K is NOT a current game.
Ok...make another note....you ready? Please read Anand's latest article on video cards.
Ok...make another note....you ready? Notice that the LATEST and GREATEST cards are getting sub-30 FPS in some games.
Ok...make another note...you ready? To claim that your video card can handle "anything" and then to disclaim running high resolutions and anti-aliasing is pure and simple B.S.
Ok...make another note...you ready? Please see a doctor, because it's obvious that someone has secretly removed your brain without you knowing it.
P-X