Originally posted by: ChineseDemocracyGNR
The question is - how much of an improvement is SLI for a game that is not optimized/supported by the drivers?
Good question. As you can see from that link, UT2k4 is not listed as supported by nV. It does work with AFR (alternate-frame rendering) after a manual change to the game's driver profile, though. My benches are below:
Used U-Mark utility from http://www.unrealmark.net
A64 3000+ stock
single 6800GT 370/1000 (stock BFG)
1600x1200 with 4x AA 8x AF forced in the drivers
Map 1-- 57.61 FPS
Map 2-- 95.39
Map 3-- 44.32
Map 4-- 104
Map 5-- 56.59
Map 6-- 48
Map 7-- 64.19
Map 8-- 54
Map 9-- 65.77
A64 3000+ stock
SLI GTs 370/1000 (stock BFG)
1600x1200 with 4x AA 8x AF forced in the drivers
Map 1-- 75 (23% increase)
Map 2-- 157 (39% increase)
Map 3-- 62 (29% increase)
Map 4-- 104 (0% increase)
Map 5-- 74 (23% increase)
Map 6-- 48 (0% increase)
Map 7-- 93 (31% increase)
Map 8-- 54 (0% increase)
Map 9-- 66 (0% increase)
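For anyone who wants to redo the math, here's a quick sketch that recomputes the SLI scaling from the raw FPS numbers in the two stock-CPU tables above. One caveat: this uses the conventional formula (SLI minus single, divided by single), which gives somewhat higher figures than the percentages quoted in the table, which look like they were taken relative to the SLI frame rate instead.

```python
# Recompute SLI scaling from the stock-CPU benchmark numbers above.
# FPS values are copied straight from the two tables; "Map N" is just a label.
single = [57.61, 95.39, 44.32, 104, 56.59, 48, 64.19, 54, 65.77]
sli    = [75, 157, 62, 104, 74, 48, 93, 54, 66]

for i, (s, d) in enumerate(zip(single, sli), start=1):
    gain = (d - s) / s * 100  # conventional % increase over the single card
    print(f"Map {i}: {s} -> {d} FPS ({gain:+.0f}%)")
```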
It appears I was CPU-limited on some of those maps, since overclocking to 2500MHz opened things up a little more. I don't have benches of a single 6800GT with the overclocked CPU, though, so these aren't exactly apples to apples:
A64 3000+ @ 2500MHz
SLI GTs stock 370/1000
1600x1200 with 4x AA 8x AF forced in the drivers
Map 1-- 95 (39% increase; all percentages here are the total increase over the single GT)
Map 2-- 157 (44% increase)
Map 3-- 62 (41% increase)
Map 4-- 104 (23% increase)
Map 5-- 74 (40% increase)
Map 6-- 48 (24% increase)
Map 7-- 93 (43% increase)
Map 8-- 54 (24% increase)
Map 9-- 66 (24% increase)
The % increase for UT2k4 (in the apples-to-apples comparison at the stock CPU) doesn't quite match the gains I've seen in games that "officially" carry the nV stamp: Far Cry, HL2, and Doom 3 all saw gains of around 40% at the stock CPU. With UT2k4, it wasn't until I overclocked the 3000+ that I saw 40% gains.
But this may not be a fair way to compare "officially" nV-supported games vs. unofficial ones, since UT2k4 is known for being CPU-dependent.