I run a Half-Life dedicated server on my PC, and according to the console window the frame rate has always been around 90-99 FPS with a TNT-based Diamond Viper V550 video card installed. Yesterday I installed a VisionTek GF3, and with nothing else changed (still on the NVIDIA 12.60 drivers, etc.) the reported FPS is now down to 50-60. HL is running the exact same server version and game, with the same number of bots as when it was at 90-99 FPS. Nothing else is running on the PC. Yes, the 4-in-1's, AGP miniport, etc. are installed.
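
For context, my understanding is that the console FPS figure is just a measure of how fast the server's main simulation loop is spinning, something like the rough sketch below. This is NOT actual Half-Life engine code; the helper names (RunServerFrame, GetSeconds) are made up for illustration, and the real engine obviously does more.

    /* Rough sketch (assumed, not actual engine code): a dedicated server's
     * console FPS is typically just the rate of its main loop measured
     * against the clock -- no rendering is involved anywhere. */
    #include <stdio.h>
    #include <time.h>

    static double GetSeconds(void)     /* hypothetical timer helper; clock()
                                          measures CPU time, which is close
                                          enough for a busy loop like this */
    {
        return (double)clock() / CLOCKS_PER_SEC;
    }

    static void RunServerFrame(void)   /* hypothetical: one tick of game logic,
                                          i.e. physics, bot AI, network I/O */
    {
    }

    int main(void)
    {
        int frames = 0;
        double last_report = GetSeconds();

        for (;;) {
            RunServerFrame();
            frames++;

            double now = GetSeconds();
            if (now - last_report >= 1.0) {   /* report once per second */
                printf("fps: %d\n", frames);
                frames = 0;
                last_report = now;
            }
        }
        return 0;
    }

If that picture is right, the number I'm seeing should only depend on how fast the loop runs, which is what makes the drop so puzzling.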
How is the video card a factor in dedicated server FPS? I'm not in the game, so how is video even involved in the equation? Thanks.