Originally posted by: Speedo
RussianSensation: Thanks a lot for your very informative reply!
The question still remaining in my head is: what is TruForm, and what does it do?
Basically, TruForm makes objects smoother by rounding their edges. Games like Counter-Strike and Quake 3 support it, I believe, and perhaps some others. I am not sure whether newer games support it, but it used to be a forward-looking feature. You can enable it in games if you'd like to see whether it makes a difference. The performance drop from enabling it is minimal. You can read more on this
Here and
Here. Of course, TruForm debuted with the R200 (Radeon 8500), but it has since evolved into TruForm 2 on the Radeon 9800 Pro, offering:
TRUFORM™ 2.0
2nd generation N-Patch higher order surface support (better image quality)
Discrete and continuous tessellation levels per polygon
Displacement mapping (for instance, you can now distinguish the tread of a car's tire much better, making it look more 3D)
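For the curious, the N-patch idea behind TruForm can be sketched in a few lines: each flat triangle, together with its per-vertex normals, is promoted to a cubic Bézier triangle, and the GPU tessellates that curved patch into many small triangles, which is what rounds the edges. Below is a rough Python sketch of that construction; the function names are my own and the math is the published PN-triangle scheme, not ATI's actual hardware implementation:

```python
# Sketch of the PN-triangle (N-patch) construction TruForm is based on.
# A triangle (P1, P2, P3) with unit vertex normals (N1, N2, N3) becomes a
# cubic Bezier triangle; evaluating it between the corners gives a rounded surface.
from math import factorial

def add(a, b):   return tuple(x + y for x, y in zip(a, b))
def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def dot(a, b):   return sum(x * y for x, y in zip(a, b))

def edge_point(pi, pj, ni):
    # Pull the edge control point toward the tangent plane at vertex i.
    w = dot(sub(pj, pi), ni)
    return scale(sub(add(scale(pi, 2.0), pj), scale(ni, w)), 1.0 / 3.0)

def pn_triangle(P1, P2, P3, N1, N2, N3):
    # Control points are indexed by barycentric exponents (i, j, k), i+j+k = 3.
    b = {
        (3, 0, 0): P1, (0, 3, 0): P2, (0, 0, 3): P3,
        (2, 1, 0): edge_point(P1, P2, N1), (1, 2, 0): edge_point(P2, P1, N2),
        (0, 2, 1): edge_point(P2, P3, N2), (0, 1, 2): edge_point(P3, P2, N3),
        (1, 0, 2): edge_point(P3, P1, N3), (2, 0, 1): edge_point(P1, P3, N1),
    }
    edges = [v for k, v in b.items() if sorted(k) == [0, 1, 2]]  # the 6 edge points
    E = scale(tuple(map(sum, zip(*edges))), 1.0 / 6.0)
    V = scale(add(add(P1, P2), P3), 1.0 / 3.0)
    b[(1, 1, 1)] = add(E, scale(sub(E, V), 0.5))  # lifted center control point
    return b

def evaluate(b, u, v):
    # Cubic Bezier triangle in barycentric coordinates (u, v, w), u + v + w = 1.
    w = 1.0 - u - v
    pt = (0.0, 0.0, 0.0)
    for (i, j, k), cp in b.items():
        c = factorial(3) // (factorial(i) * factorial(j) * factorial(k))
        pt = add(pt, scale(cp, c * u**i * v**j * w**k))
    return pt
```

At the corners the patch reproduces the original vertices exactly, so the silhouette only changes between them, where the surface bulges toward the vertex normals; that is also why the performance cost is mostly extra triangles rather than changed geometry data.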
Now, I am not sure whether TruForm is still supported on the latest X800 series cards, which leads me to believe that not many games utilize it. Maybe someone can clarify this.
Couldn't it be a good idea to force, let's say, 4xAA/4xAF in all games, since that should run fine at 1024x768, or? I think Far Cry only had "no, medium, high" AA settings, and I don't even know what those translate into. Dunno if there was any in-game AF setting...
Sure, in theory forcing AA should make things more convenient. However, sometimes the graphics card isn't fast enough to run one game at those settings even when it runs another game just fine. This often forces players to keep changing the settings in the control panel, going back and forth, which is inconvenient. A possible solution is to set "application preference" and just adjust to whatever settings your card can handle in-game, avoiding the need to exit the game and make changes on the desktop (ATI control panel).

The problem also lies in that sometimes when you force these settings, they don't take effect and you have to make the changes within the game. Other times, even if you make changes within the game, they are not applied, so you'll have to change them manually in the control panel. It all depends, and I guess you'll have to experiment or ask people on forums about the specific game you are interested in. What's worse, some games (e.g. Halo) don't even support AA and only support AF, so it gets tricky.

Finally, there is personal preference. Personally, I prefer 1600x1200 0xAA/4xAF over 1024x768 4xAA/8xAF. So you'll have to figure out what you like and what looks and runs better for you in every game. The beauty of having the fastest videocard you can get is that you won't have to worry about making these choices, as the card will be able to play anything at any setting. Also, I think ATI should introduce per-game profiles in the drivers, so the settings for each game you play can be remembered, like Nvidia's driver panel does. Hopefully that will come soon.
Speaking of Far Cry: how do I run an FPS test? Some kind of timedemo, or is it possible to enable an FPS counter during normal gameplay?
Yes, there was a timedemo circulating somewhere on the internet. You might want to look into a recent Far Cry thread, search for Far Cry-related threads, or simply post a new question and someone will point you to the timedemo. Another way of estimating performance is to read review sites covering the games you play and approximate what you will get with a "similar" setup. One way to enable an FPS counter is to download
FRAPS 2.1 and run it, enabling the counter in the corner you prefer. You can also average the frames, but I don't remember how, because I don't really use that feature with my slow card. Again, maybe someone will point it out.
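If you do end up with a log of per-frame render times from a benchmarking tool, averaging them yourself is easy, with one gotcha: average FPS is total frames divided by total time, not the mean of each frame's instantaneous FPS (that overweights the fast frames). A quick sketch with made-up frame times:

```python
# Hypothetical frame-time log in milliseconds per frame (values invented
# for illustration; a real benchmark log would supply these).
frame_times_ms = [16.7, 16.9, 33.4, 16.6, 17.0, 50.1, 16.8]

total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds  # frames / total time: the right way

# Averaging instantaneous FPS per frame overstates performance:
naive_avg = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

print(avg_fps, naive_avg)  # the naive figure comes out noticeably higher
```

The difference matters exactly when frame times are uneven, which is when you care about the number in the first place.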
I've seen "ATI tool" being mentioned in some other thread. Any drawbacks with that? It doesn't show up in the driver control panel?
I haven't personally used ATITool, so I cannot comment on its effectiveness or how it works. The Radlinker I posted above installs itself into the ATI control panel: once you enable the clocks, you simply click Set and you are done. PowerStrip is also very simple. It really shouldn't matter, as you should be able to reach equal speeds with all of those tools. But some provide finer increase/decrease increments, making it much easier to find the top overclock.
Btw, would it really be wise to run GPU/mem in sync, considering the big default difference? Also, does the core really put out that much more heat just by upping the frequency, even though there obviously isn't any rise in operating voltage?
Unlike AMD or even Intel architectures, running a videocard's clocks in sync doesn't really gain any improvement over running them async, so it doesn't matter. Besides, for some cards it doesn't really make sense. For instance, my Radeon 8500 runs at 275/275, so I simply click Sync and adjust both sliders at once; but I do that purely for convenience, as my card stops at about 300/300 on each. Your card will probably go 380 => 415 GPU (+35) and 680 => say 730 (+50), so adjusting the sliders together wouldn't be convenient. So try adjusting each to see where the artifacts appear, then set both to those limit speeds and see if you need final adjustments.
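The mismatch is easiest to see as percentages. A quick sketch using the example clocks above (which are just my guesses at your card's headroom, not measured limits):

```python
# Hypothetical stock and maximum stable clocks (MHz) from the example above.
core_stock, core_max = 380, 415
mem_stock,  mem_max  = 680, 730

core_gain = (core_max - core_stock) / core_stock * 100  # core headroom, %
mem_gain  = (mem_max  - mem_stock)  / mem_stock  * 100  # memory headroom, %

print(core_gain, mem_gain)  # core gains a larger fraction than memory
```

Because the two percentages differ, a single synced slider can never sit at both limits at once; one clock will always be short of (or past) its ceiling.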
The core will put out more heat regardless of any increase in voltage: if something has to work faster, it uses more energy and thus produces more heat. If you also increase the voltage (NOT recommended for videocards), it will get even hotter. The heat produced at 415+ MHz will not be that much more in absolute terms, but it is a lot for a videocard, because the cooling on videocards is nothing like a Thermalright SP-97 or Zalman 7000Cu. If you had watercooling, you'd be able to increase it even more. Honestly, though, I wouldn't even bother overclocking the 9800 Pro, because a) it's already fast enough in all the games it plays well, and b) in games where it's slow (Far Cry, Halo), overclocking will not suddenly make them playable at resolutions you couldn't play at before. Personally, I'd keep overclocking for benchmarking purposes, because I doubt it will give you more than a 10%-15% improvement.
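The frequency-vs-heat point follows from the standard first-order CMOS dynamic power model, P ≈ C·V²·f: at fixed voltage, power (and so heat) grows roughly linearly with clock, and quadratically on top of that if you raise voltage. A rough sketch (the capacitance term cancels when comparing relative power, and this ignores static leakage):

```python
def relative_dynamic_power(f_ratio, v_ratio=1.0):
    # First-order CMOS model: P ~ C * V^2 * f, so relative power
    # is (new V / old V)^2 * (new f / old f).
    return (v_ratio ** 2) * f_ratio

# Overclocking the core 380 -> 415 MHz at stock voltage:
print(relative_dynamic_power(415 / 380))  # roughly 9% more dynamic power

# Adding a 10% voltage bump on top (hypothetical, NOT recommended):
print(relative_dynamic_power(415 / 380, 1.10))  # over 30% more
```

Which is why a modest frequency bump is survivable on stock cooling, while voltage mods are where videocard coolers start to fall behind.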