EightySix Four
Diamond Member
- Jul 17, 2004
- 5,122
- 52
- 91
Well, that means Nvidia shot itself in the foot too; most of their cards in people's computers don't support SM3 either. It lowers my opinion of Ubisoft that they would basically tell ~80% of you "you aren't worth getting the full graphics," even though within a month of work they could have made your cards run it just as well!
(I run a 9700 Pro, and although I'm sure it could have handled this game at a decent resolution with pretty good detail, if not 1024x768 with full detail, they basically shut me out of any graphical goodness I could have gotten, which frankly pisses me off... Nvidia makes great cards, but saying SM3 is all that matters is bullshit, and that's from a programmer's standpoint.)
The programmers who developed CryENGINE for Far Cry said SM3 REALLY does not make that big a difference: even if the engine had been coded for SM3 from the ground up and then scaled down to SM2.0b, you still wouldn't see much of a difference. SM3 is a great marketing gimmick. I've written some DX stuff for fun, and having gone through it, there just isn't that big a gap. As for HDR, ATI's hardware is capable of it in a different way, and yes it's slower than Nvidia's method, but hell, even with Nvidia's method you still chug along with HDR enabled, so what's the point?
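To illustrate why scaling SM3 down to SM2.0b doesn't change the picture: the headline SM3.0 feature is dynamic per-pixel branching, which SM2.0b lacks, so an SM2.0b compiler evaluates both sides of a branch and selects the result. Here's a minimal sketch of that idea in plain Python (not real shader code; the shadow factor and function names are made up for illustration):

```python
# Hedged sketch: the same per-pixel lighting decision expressed two ways.
# SM3.0-style hardware can take a real branch; SM2.0b-style hardware
# computes both paths and blends with a predicate mask instead.

def lit_pixel_sm3(in_shadow, base, light):
    # SM3.0-style: a dynamic branch can skip the lighting math entirely.
    if in_shadow:
        return base * 0.2  # hypothetical ambient-only shadow factor
    return base * light

def lit_pixel_sm2b(in_shadow, base, light):
    # SM2.0b-style: no dynamic branching, so evaluate both paths
    # and select the result with a 0/1 mask (like a cmp instruction).
    shadowed = base * 0.2
    lit = base * light
    mask = 1.0 if in_shadow else 0.0
    return mask * shadowed + (1.0 - mask) * lit
```

Both versions produce the identical pixel; the branch only saves work, it doesn't unlock a different image, which is consistent with the CryENGINE claim above. The saved work is where SM3.0 can win on performance, not image quality.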
Originally posted by: keysplayr2003
Creig, what prevents ATI R300 thru 480 from being branded SM3.0 compliant?
Also, what allows Nvidia NV40 thru 46 to be branded SM3.0 compliant?
What are the benefits of having it?
What do you lose by not having it?
Do you believe developers will NOT proceed to code games from the bottom up using SM3.0? Do you think they'll skip right over to SM4.0?
The thing is, you and I both know what the proper buy is. And it's not current ATI products with their aged tech. Are they great cards? By all means. Are they the greatest? By no means.
Buy a PCI-E or AGP (when available, should be soon) X800XL today and save some dough now, but you'll just have to buy another card soon with SM3.0 compliance. So is that really saving money if you have to buy two cards relatively close together?
Sure, you can sell the X800XL and take a hit on it. But that's not very practical when you could have just bought the 6800GT and be done with it knowing full well that it was made to eat pure SM3.0 for breakfast.
I would like to see a 100% pure SM3.0 game run side by side on an X800XL and a 6800GT. I think 6800GT owners would be pleasantly surprised when the reviewers pick apart the inefficiency with which the ATI SM2.0b card runs SM3.0, while the NV4x runs it as smoothly and efficiently as intended.
I now have an X800XT PE thanks to Rollo and am just waiting for new games to come out to see what happens. I have a feeling the ATI cards are called SM2.0b compliant and not SM3.0 compliant for a reason, and I'll get to see it first hand. My GT will do just fine.
Wuv,
Keys