Hey Ben,
<<<<"The Digit-Life guys also took a look at the performance of the anisotropic filtering, and beyond 16 taps there is a huge performance hit; frame rates begin to crawl at very high resolutions.">>>>
<<Driver revision: 10.5, not a shipping GF3 driver by any means. In fact, they had to use a registry hack to enable anisotropic filtering; how much time do you think they have spent optimizing for it? I'm sure a lot of people greatly appreciated the numbers they provided, but it would serve people well to remember the early numbers we saw on the GeForce using the not-for-GF-series 2.08 Detonators. It scored quite poorly compared to the vanilla TNT2, which we now know to be far off the mark. Am I saying not to trust those benches as representative of final GF3 performance? Absolutely.>>
Hmm, yes! You may be right, but anisotropic filtering still uses a lot of texture data to compute each pixel's color. As I understand it, 8-tap means eight texture samples, so the amount of data required per pixel doubles whenever the tap count doubles. I think my point about the performance hit still stands, although they can of course improve performance for every tap configuration, which would make 16+ tap configurations viable as well.
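To put rough numbers on that "doubling" argument, here is a back-of-the-envelope sketch. The per-tap cost model (one bilinear sample of 4 texels, 4 bytes per 32-bit texel) is my assumption for illustration, not anything from the hardware docs:

```python
# Illustrative cost model (my assumption, not vendor data):
# each anisotropic "tap" = one bilinear sample = 4 texel reads,
# and each 32-bit texel is 4 bytes.
TEXELS_PER_TAP = 4
BYTES_PER_TEXEL = 4

def texture_bytes_per_pixel(taps):
    """Texture bytes fetched per shaded pixel for a given tap count."""
    return taps * TEXELS_PER_TAP * BYTES_PER_TEXEL

# Doubling the tap count doubles the texture traffic per pixel.
for taps in (1, 2, 4, 8, 16):
    print(f"{taps:2d} taps -> {texture_bytes_per_pixel(taps)} bytes/pixel")
```

Whatever the exact per-tap cost really is on the chip, the linear scaling in tap count is the point: 16 taps moves twice the texture data of 8 taps, which is why frame rates sag at high resolutions.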
Note: I heard (though of course I cannot verify this) that Nvidia is not supplying boards to reviewers because of the immaturity of their drivers. I think that's what you are talking about. And I am pretty sure they will increase performance considerably; they have, without a doubt, the best driver team in the business.
<<<<"That's why the 16-bit performance scores are not affected much compared to the GeForce2 Ultra + do not expect any enormous improvement in 16-bit, therefore.">>>>
<<They are under 60 FPS at 1600x1200 16-bit in Quake 3 with a GF3; I would gladly wager that those numbers will improve significantly with final drivers. Look at MPixel/MTexel or any other measurement you want: the GF3 is performing significantly slower in 16-bit than it should be.>>
Even if the drivers mature, I still do not expect a huge difference in 16-bit. As far as I understand, the GF2 series already does its job well in 16-bit, and this card would not improve on it considerably. However, I am really happy to see that 32-bit shines now. And if 32-bit performance is almost the same as 16-bit performance, why use 16-bit at all?
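The 16-bit vs. 32-bit question is really a framebuffer-bandwidth question, and the arithmetic is easy to sketch. The resolution and frame rate below are just the figures from this thread (1600x1200, 60 FPS); the model counts only color writes and ignores Z, texture, and overdraw:

```python
# Rough color-write bandwidth, ignoring Z-buffer, texture and overdraw.
def color_write_mb_per_s(width, height, fps, bytes_per_pixel):
    """Megabytes per second of color writes at the given mode."""
    return width * height * fps * bytes_per_pixel / 1e6

mb16 = color_write_mb_per_s(1600, 1200, 60, 2)  # 16-bit color = 2 bytes
mb32 = color_write_mb_per_s(1600, 1200, 60, 4)  # 32-bit color = 4 bytes

print(f"16-bit: {mb16:.1f} MB/s")  # 230.4 MB/s
print(f"32-bit: {mb32:.1f} MB/s")  # 460.8 MB/s
```

32-bit simply writes twice the bytes per pixel, so once a card has enough memory bandwidth that the 32-bit figure is no longer the bottleneck, the two modes converge and 16-bit loses its reason to exist.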