Originally posted by: BFG10K
Good stuff and 8xMSAA is a pleasant surprise.
I really hope the xS modes haven't been removed though, as that's the main reason I'm getting an 8800 GTS.
I'm confused though. I didn't think Nvidia offered a 4xS before?

From the review:

NVIDIA has introduced a new method of anti-aliasing called Coverage Sample Anti-Aliasing. This differs from traditional Multi-Sample AA by taking samples from a coverage area instead of subsamples from a pixel. There are several new AA modes. 2x AA is 2-sample MSAA and 4x AA is 4-sample multisample anti-aliasing; simple enough to understand, right? NVIDIA has deviated from their previous practice of mixing Multi-Sample AA with Super-Sample AA.
In point of fact, 8XS and 4XS modes no longer exist in the NVIDIA drivers.
- 8x AA is really 4x multisample anti-aliasing with 4 additional coverage samples.
- 8xQ AA is true 8-sample multisample anti-aliasing.
- 16x AA is 4x AA plus 8 coverage samples. This mode provides the best balance of performance and image quality in most applications, as performance is only a little lower than in the 4x mode.
- 16xQ AA is 4x multisample anti-aliasing plus 12 coverage samples. This is the highest-quality anti-aliasing mode available on the GeForce 8 series.
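If you want to poke at the color/coverage split from code, NVIDIA exposes it directly through the GL_NV_framebuffer_multisample_coverage OpenGL extension. A minimal sketch of my own (not from the review; assumes GLEW and an EXT_framebuffer_object-capable context):

/* Sketch: request a CSAA renderbuffer by asking for more coverage
   samples than color samples. Assumes GLEW and EXT_framebuffer_object. */
#include <GL/glew.h>

GLuint make_csaa_renderbuffer(GLsizei width, GLsizei height)
{
    GLuint rb;
    glGenRenderbuffersEXT(1, &rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rb);

    /* coverageSamples=16, colorSamples=4 -> the "16x" mode described above;
       8/4 gives "8x", and 8/8 gives the pure-MSAA "8xQ". */
    glRenderbufferStorageMultisampleCoverageNV(GL_RENDERBUFFER_EXT,
                                               16, 4, GL_RGBA8,
                                               width, height);
    return rb;
}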
"In point of fact, 8XS and 4XS modes no longer exist in the NVIDIA drivers."

That doesn't mean anything since they're basically just saying the control panel doesn't present those options.
"I didn't think Nvidia offered a 4xS before?"

It used to be in the control panel back in the NV2x/NV3x days, but now you need a tweaker to access it. It's pretty worthless though, as there's absolutely no reason not to use 8xS instead.
"So what new/popular games can justify this much horsepower... Nothing from what I see."

NWN2? G3? CoH? FEAR:EP is more demanding than the original FEAR. Also, for some reason DM:MM is a pretty intensive game with all settings maxed out.
"You lost me there Josh. What do you mean? Doesn't the X1950XTX use about the same amount of power as an X1900? Sure, the 1950 uses GDDR4, but clocked much higher. Kind of cancels out the power savings? Well, whatever you meant there, it would seem that a good quality 700W PSU would suffice for 8800GTX SLI."

For instance, Ackmed had an X1900 CF system running on a PCP&C 510W SLI PSU. Now, if the 8800GTX's power draw at load is very close to one X1900XT(X), why do we need such monster PSU's to power them and not so much for the X1900 CF rigs?
"That doesn't mean anything since they're basically just saying the control panel doesn't present those options."

So why did they change the SSAA availability? Does CSAA look better than SSAA?
Originally posted by: nib95
Anand's review is pretty technical and all, but the actual benchmarks are lame.
Here's hoping they get that retail unit soon so we can get a proper review.
"So why did they change the SSAA availability?"

Probably because the masses find those settings unwieldy. You have to really understand the modes to use them well.
"Does CSAA look better than SSAA?"

No. It may be better at edges than 8xS, but CSAA is still MSAA, which means it can't AA textures. I would still be taking 8xS over any of the CSAA modes.
"So... 8xS does textures and everything?"

Yes. Everything.
"What about the 'transparency antialiasing' mentioned in the AT review? How does that come into play?"

When SuperSamplingAA isn't being used and MultiSamplingAA is, TrAA performs AA on alpha textures.
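TrAA itself is just a driver toggle, but the multisampling flavor is essentially alpha-to-coverage, which a game can also request on its own. A rough sketch of my own using standard ARB_multisample functionality:

/* Rough illustration (not driver code): with alpha-to-coverage enabled,
   a fragment's alpha value decides how many of the pixel's samples it
   covers, which smooths alpha-tested edges (fences, foliage) that plain
   MSAA leaves jagged. Assumes a multisampled context and GLEW. */
#include <GL/glew.h>

void draw_foliage_with_transparency_aa(void)
{
    glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE_ARB);
    /* ... draw the alpha-tested geometry here ... */
    glDisable(GL_SAMPLE_ALPHA_TO_COVERAGE_ARB);
}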
"Looks like I'd be sitting there an hour just to find out which AA mode is the best (meaning highest quality with moderate performance impact)."

For this card it looks like 16x AA with TrSAA would be a great combination. That, of course, is until a SuperSampling mode can be tweaked from the registry as BFG10K suggests.
Originally posted by: josh6079
"You lost me there Josh. What do you mean? Doesn't the X1950XTX use about the same amount of power as an X1900? Sure, the 1950 uses GDDR4, but clocked much higher. Kind of cancels out the power savings? Well, whatever you meant there, it would seem that a good quality 700W PSU would suffice for 8800GTX SLI."

For instance, Ackmed had an X1900 CF system running on a PCP&C 510W SLI PSU. Now, if the 8800GTX's power draw at load is very close to one X1900XT(X), why do we need such monster PSU's to power them and not so much for the X1900 CF rigs?
Originally posted by: keysplayr2003
Originally posted by: josh6079
"You lost me there Josh. What do you mean? Doesn't the X1950XTX use about the same amount of power as an X1900? Sure, the 1950 uses GDDR4, but clocked much higher. Kind of cancels out the power savings? Well, whatever you meant there, it would seem that a good quality 700W PSU would suffice for 8800GTX SLI."

For instance, Ackmed had an X1900 CF system running on a PCP&C 510W SLI PSU. Now, if the 8800GTX's power draw at load is very close to one X1900XT(X), why do we need such monster PSU's to power them and not so much for the X1900 CF rigs?
Ah, I see. But just consider that not all GPU "functionality" is being utilized benching these DX9 titles. I'm thinking that, when they arrive, full-blown DX10 titles loaded with all the bells and whistles (developer pulls out all the stops) just might make the G80s work a bit harder, or more completely, in turn pulling more power to feed that tremendously large 681-million-transistor core.
Because something just doesn't add up.
1. Transistor count skyrocketed from 298 million transistors on 90nm to 681 million, still on 90nm. HUGE increase. This card "should" be drawing more power than it was shown to pull.
I'm just guessing here, but TSMC did not pull off this kind of miracle. I think the core is not being fully utilized (well, of course it isn't; there are no DX10 titles yet) and will pull more power when it actually has to do what it was designed for.
MHO
EDIT: Oh yeah, my point! Spend the extra few bucks on a PSU that will exceed the manufacturer's specs the first time around. Better to do that up front now than to find out it just can't cut it later.
Originally posted by: DeathBUA
Originally posted by: keysplayr2003
Originally posted by: josh6079
"You lost me there Josh. What do you mean? Doesn't the X1950XTX use about the same amount of power as an X1900? Sure, the 1950 uses GDDR4, but clocked much higher. Kind of cancels out the power savings? Well, whatever you meant there, it would seem that a good quality 700W PSU would suffice for 8800GTX SLI."

For instance, Ackmed had an X1900 CF system running on a PCP&C 510W SLI PSU. Now, if the 8800GTX's power draw at load is very close to one X1900XT(X), why do we need such monster PSU's to power them and not so much for the X1900 CF rigs?
Ah, I see. But just consider that not all GPU "functionality" is being utilized benching these DX9 titles. I'm thinking that, when they arrive, full-blown DX10 titles loaded with all the bells and whistles (developer pulls out all the stops) just might make the G80s work a bit harder, or more completely, in turn pulling more power to feed that tremendously large 681-million-transistor core.
Because something just doesn't add up.
1. Transistor count skyrocketed from 298 million transistors on 90nm to 681 million, still on 90nm. HUGE increase. This card "should" be drawing more power than it was shown to pull.
I'm just guessing here, but TSMC did not pull off this kind of miracle. I think the core is not being fully utilized (well, of course it isn't; there are no DX10 titles yet) and will pull more power when it actually has to do what it was designed for.
MHO
EDIT: Oh yeah, my point! Spend the extra few bucks on a PSU that will exceed the manufacturer's specs the first time around. Better to do that up front now than to find out it just can't cut it later.
Yeah... this is true. I wonder how big those L1 caches in the streaming processors are... and I wonder if this new arch really is that much of a power saver.
Who knows.
Like you said, maybe when it's more utilized it'll draw more power...
Unless nVidia pulled some kind of voodoo power magic.
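For what it's worth, the utilization argument has textbook backing: dynamic power scales with switching activity, P ≈ a·C·V²·f, so parts of the chip that sit idle are cheap. A back-of-the-envelope sketch where every constant except the GTX's 575MHz core clock is a made-up placeholder:

/* Back-of-the-envelope sketch of the utilization argument, using the
   standard dynamic-power relation P = a * C * V^2 * f. All constants
   except the core clock are illustrative placeholders, NOT measured
   G80 figures. */
#include <stdio.h>

int main(void)
{
    double C = 250e-9;     /* assumed total switched capacitance, farads */
    double V = 1.2;        /* assumed core voltage, volts */
    double f = 575e6;      /* 8800GTX core clock, Hz */

    double a_dx9  = 0.25;  /* assumed switching activity in a DX9 game */
    double a_full = 0.40;  /* assumed activity with the whole chip busy */

    printf("DX9-ish load:  %.0f W\n", a_dx9  * C * V * V * f);
    printf("Full-ish load: %.0f W\n", a_full * C * V * V * f);
    return 0;
}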
Originally posted by: RussianSensation
Great new features. Not sure about the "must have 450 watt supply or greater (or 30A)" requirement, though. My OCZ PowerStream 420W has 30A and will most likely meet the power requirements. This might be another case of NVIDIA being conservative (recall the 480-watt power supply requirement for the 6800 Ultra). Also, let's not forget that a Pentium D system with an X1950XTX probably uses more power than a Core 2 Duo with an 8800GTX if we consider whole-system consumption in aggregate. Great time to build a new system.
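The 30A figure is easy to sanity-check: amps times volts gives the 12V capacity, and the GTX's board power is commonly reported at around 145W. A quick throwaway calculation (the 145W is the commonly quoted figure, not my own measurement):

/* Quick sanity check of the 30A requirement. The 145W board-power
   number is the commonly quoted 8800GTX figure, not an official spec
   or my own measurement; the CPU and drives on the same rail still
   need their share of what's left. */
#include <stdio.h>

int main(void)
{
    double rail_watts = 30.0 * 12.0;  /* 12V rail capacity from amperage */
    double gpu_watts  = 145.0;        /* commonly reported GTX board power */
    printf("12V capacity: %.0f W, left after GPU: %.0f W\n",
           rail_watts, rail_watts - gpu_watts);
    return 0;
}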
I must say this is probably the only time I can recall in the last 3-5 years where the top card actually justifies the price premium over the GT/GTS version at stock speeds.
GTX is simply amazing!
Going by AT's review, the GTS's stock performance is very disappointing to me.
Originally posted by: flexy
Originally posted by: Hyperlite
Originally posted by: otispunkmeyer
Originally posted by: theprodigalrebel
Anyone who has doubts about these reviews should just go to BFG's website.
Look at the picture in the third column here
And Newegg has them for $660/$500 (GTX/GTS)
lol
I want one that says
FTMFCSCGASPFGIW
See if you can guess what that stands for!
For the mother fcking cork sucking christ... something, I dunno :laugh:
I don't even know what to say about G80. It's unbelievable, but true. ARGH! I need money!!! And a 700-watt PSU!
"[...]GASPFGIW"
G@Y A$$ S***NG PERFORMANCE FOR GAMERS I WANT ?
Originally posted by: BassBomb
Now, I like the fact that these cards own and all... the cooler looks like crap, but from the reviews it does well enough in the sound category.
I didn't see heat output mentioned anywhere.
Anyways...
If only the price was right... at $570 CDN, the cheapest I found, that's a little steep just for the 8800GTS... hopefully by Boxing Day, when I'm ready to blow some money, it'll be around a more reasonable $450 (which I doubt, and that's still way too high for me).
Ah, can't wait for the 8600GT... if only those product cycles weren't so long compared to the high end.
Originally posted by: BFG10K
"In point of fact, 8XS and 4XS modes no longer exist in the NVIDIA drivers."

That doesn't mean anything since they're basically just saying the control panel doesn't present those options.
The fact is 16xS (for example) has never been offered through the control panel but it's been there since the NV40.
As long as the driver hasn't removed the AA code or the registry settings you should be able to still access the xS modes through tweakers just like you can for many other AA modes.
"I didn't think Nvidia offered a 4xS before?"

It used to be in the control panel back in the NV2x/NV3x days, but now you need a tweaker to access it. It's pretty worthless though, as there's absolutely no reason not to use 8xS instead.
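For the curious, those tweakers basically just flip driver registry values. Something like the sketch below, though the AA value name here is purely hypothetical and the layout varies per driver version; real tools like RivaTuner or nHancer know the actual locations:

/* Sketch of the "tweaker" approach. The key path is the old NVTweak
   location, but "AntiAliasMode" is a HYPOTHETICAL value name standing
   in for the driver's real AA setting, which differs across drivers. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY key;
    const char *path = "SOFTWARE\\NVIDIA Corporation\\Global\\NVTweak";

    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, path, 0, KEY_READ, &key) != ERROR_SUCCESS) {
        printf("key not found -- layout differs on this driver\n");
        return 1;
    }

    DWORD value = 0, size = sizeof(value);
    /* HYPOTHETICAL value name, for illustration only */
    if (RegQueryValueExA(key, "AntiAliasMode", NULL, NULL,
                         (LPBYTE)&value, &size) == ERROR_SUCCESS)
        printf("current AA mode id: %lu\n", value);

    RegCloseKey(key);
    return 0;
}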
Originally posted by: BFG10K
"So why did they change the SSAA availability?"

Probably because the masses find those settings unwieldy. You have to really understand the modes to use them well.
"Does CSAA look better than SSAA?"

No. It may be better at edges than 8xS, but CSAA is still MSAA, which means it can't AA textures. I would still be taking 8xS over any of the CSAA modes.
As for 16xS, I'd say it's still the best AA mode we have available, both for edges and textures.