Originally posted by: Rusin
So according to Bit-Tech this game doesn't even support AA and uses 'film grain' to hide that fact... but BT recommends disabling this option because it makes the game look worse (enemies at greater distances look just bad) :frown:
Originally posted by: golem
Originally posted by: Nemesis 1
This game was ported from the Xbox, so maybe that helps explain some of the AA situation. If you don't know, Google is your friend.
That may explain why ATI is doing better, but it doesn't explain why, in the lower-res test, a 9600 GT beats a 9800 GTX. Aren't the 9800 GTX and 8800 GTS/512 very close to being the same card? Their scores should be much closer.
I'm not saying this one thing invalidates all the benchmarks, but it does indicate that they should be taken with a grain of salt until more tests are done.
Originally posted by: Nemesis 1
AMD is looking better all the time. Question is how much time do they have?
Originally posted by: Rusin
The reason there aren't any AA scores is that this game uses the Unreal Engine 3, and AMD's graphics cards don't support AA with it. AMD card + Unreal Engine 3 game = no AA, it seems. The problem is that Nvidia cards don't seem to take a big performance hit when enabling AA. For example, Unreal Tournament 3 is still playable at 1920x1200 with 8xAA 16xAF on an 8800 GTS 512... and the fps numbers at 0xAA aren't as much higher as one might imagine.
We have seen these driver issues so many times. Radeon cards were weak at first in Lost Planet's DX9 mode, but a new driver release made them really strong players (well, they still lose badly in DX10 mode). The 8600 GTS was at first closer to a 7600 GT in some games, and after a driver update it was on par with a 7900 GTO, etc.
Unreal Engine 3 is not nice about AA on AMD graphics cards. I wouldn't be surprised if this Mass Effect PC version didn't have anti-aliasing support at all.
Originally posted by: taltamir
8.3? March of 2008? Can you really fault him for not knowing, when it was only fixed two months ago?
Originally posted by: taltamir
Isn't the 9600 GT the only card to increase the PCIe bandwidth? Remember that whole deal with people accusing Nvidia of "cheating" because the 9600 GT increases the PCIe bandwidth with the 174+ driver and an nForce mobo? Maybe the 9600 GT being a 9800 GTX with half the shaders and a lower clock speed is compensated for by the... no wait, that makes no sense; the 9600 GT also has a narrower bus if I recall correctly, less than the 256-bit bus the 9800 GTX has. It makes no sense for it to come out on top.
Maybe multiple testers checked the cards on different systems and this was the result?
Or maybe we have something similar to the NWN2 bug (also an Aurora-engine game), where the 8800 GTX suffered from an infinite loop while trying to render water reflections, causing severe slowdowns and much lower performance compared to lower-end Nvidia cards.
Originally posted by: taltamir
Ok, but that was uncalled for. WTF, so much BS the meter is off the charts.
Originally posted by: Foxery
I'm surprised there hasn't been much discussion of the rumored 1GB (yet single GPU) models. Are games ready for >512 yet, or is ATI jumping the gun here? That also adds a lot to the cost when the RAM is also the first release of its generation - GDDR5. No doubt board partners will release 512MB versions, but the plans I've seen imply that the intended difference between 4870 and 4850 performance should be pretty wide.
Originally posted by: dennilfloss
Originally posted by: Foxery
I'm surprised there hasn't been much discussion of the rumored 1GB (yet single GPU) models. Are games ready for >512 yet, or is ATI jumping the gun here? That also adds a lot to the cost when the RAM is also the first release of its generation - GDDR5. No doubt board partners will release 512MB versions, but the plans I've seen imply that the intended difference between 4870 and 4850 performance should be pretty wide.
I need that 1GB card for Oblivion. 512MB is not enough with Qarl's Texture Pack 3.
Originally posted by: JPB
Originally posted by: dennilfloss
I need that 1GB card for Oblivion. 512MB is not enough with Qarl's Texture Pack 3.
Awesome graphics mod. I have this as well, and you are totally right.
Originally posted by: bryanW1995
The GTX 280 will be 1GB and the 4870 should have a similar amount of bandwidth, so I think that 1GB is justified in this case. Obviously it won't be needed for power users like me who game at 14x9, but nitro- or BFG-type users with 30" monitors should actually see quite a bit of benefit this time around from the added memory.
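A rough back-of-envelope sketch of why 1GB starts to matter on 30" monitors (my own simplified numbers, not from any review): MSAA stores one color and one depth/stencil value per sample, so the framebuffer footprint scales with resolution times the sample count, before a single texture is loaded.

```python
def framebuffer_mb(width, height, msaa_samples=1,
                   bytes_per_color=4, bytes_per_depth=4):
    """Rough framebuffer footprint in MB: one color + one depth/stencil
    value per MSAA sample. Ignores textures, geometry, and other
    render targets, so real usage is higher."""
    samples = width * height * msaa_samples
    return samples * (bytes_per_color + bytes_per_depth) / (1024 ** 2)

# 2560x1600 (30" panel) with 8xAA: the framebuffer alone is 250 MB --
# roughly half of a 512MB card before any textures are loaded.
print(framebuffer_mb(2560, 1600, 8))   # -> 250.0
# For comparison: 1920x1200 with 8xAA is ~141 MB,
# and 2560x1600 with no AA is only ~31 MB.
```

This ignores driver overhead and compression tricks, but it illustrates why high-res + high-AA users would feel a 512MB ceiling first.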