SM3.0 is a scam.

Page 4 - AnandTech Forums

sandorski

No Lifer
Oct 10, 1999
70,620
6,177
126
I sorta, like many others, agree with the OP, he just didn't word it well.

NVidia has always been plagued with this type of criticism. That is, the whizzbang new feature that'll change the world, but not on the card it's Marketing the feature with. That First Generation card will provide a taste of what that Feature will eventually bring, but that's about it. That kind of implementation really fueled the old skool 3dfx vs NVidia Battle Royales. NVidia provided new features that were the future; 3dfx provided features only capable of being used in the Present. NVidia didn't care if the main Marketing feature was useable in a Practical sense; 3dfx only cared that a feature was useable in a Practical sense. Eventually NVidia's Marketing method won out, and why would they change?

On the positive side of NVidia's Marketing method, they do force new features onto the Industry, and these features do indeed eventually offer great benefit, so it's not a total "ripoff". What annoys many, though, are the, as the OP mentions, "OMG Feature XXXX!!!!!!!" types that totally ignore Performance/Price and the results of thorough Testing/Comparison. These types seem to get caught up in the wonders of the Feature rather than the Practicality of it at Present. It's almost as if they think their whizzbang Feature will start running acceptably when more powerful cards come out. Many of them will never know, as they'll have moved on to the next Generation before they really have a chance to try it out.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: hans030390
Yes, it would limit the audience, but there's really nothing developers won't do. If you can't play the games, too bad. Many people upgrade systems to play new games. Many people also buy games not realizing their system won't play them.

Okay, so according to you, economics don't matter, and you'll get plenty of sales from people who buy your game by mistake and can't actually play it. Can't argue with that logic...

Still, even if games just now need sm1.4, what kind of games are those, and do they run on sm1.4 cards? Probably. So, when SM3 is a minimum, the 6 series will be like that sm1.4 card (sucks, but still plays games). I'm assuming this due to a pattern I've seen before.

I was referring to Battlefield 2, which runs using SM1.4 at minimum settings. This is (AFAIK) the first even remotely large game release to require something beyond just DirectX8 support and HW T&L. And we've had SM2.0 hardware on the market (even in the sub-$100 range) for several years.

And you're right about the Sm3 ground-up thingy, but it's like this: Sm3 boosts performance when running code previously meant for 2.0. Now, when you write code just for Sm3, it will either boost performance even more than if it was just "ported" from sm2 to sm3, or it will be as fast as Sm2 or slower but look much better (and if Sm2 tried to look like what Sm3 can deliver, it would run much slower than sm3).

Basically, yes, but I think it is unlikely that performance will increase "even more" by a noticeable amount, and very likely that it will be so much "slower" with the improved IQ that it will not really be usable on today's cards.
 
Feb 19, 2001
20,155
23
81
3....2....1.... DUEL FANBOYS!

*whips out wand*

Avada Kedavra!

muhahahaha

Op: What games do you play? It is a marketing thing, but it does work, right? SM3.0 is good in some games too, but I think it's just overhyped.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: BenSkywalker
2. About the whole sm3-from-the-ground-up thing: IT DOESN'T MATTER. Sm3 or sm2 only determines the shaders that run on the gpu. All the shaders are written in a common high-level language and then compiled into either sm2 or sm3 code. And these shaders make up only a small part of the game engine, and it makes absolutely no difference if the game was designed with sm3 only or with sm3 and sm2 - most of the code is still the same and it will still run at similar speed.

I can't grade you very high on effort here, you should have gone on for much longer. Obviously you fail miserably in terms of accuracy but I assume you already knew this.

What are you trying to say? I can't grade your post either, because your sarcasm isn't very useful for disproving my point. I also do my own 3D game programming when I feel like it, so I know a thing or two about how 3D games are built; I'm not just making stuff up.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I also do my own 3D game programming when I feel like it, so I know a thing or two about how 3D games are built, I'm not just making stuff up.

OK-

it makes absolutely no difference if the game was designed with sm3 only or with sm3 and sm2 - most of the code is still the same and it will still run at similar speed.

What on Earth would make you build in a conditional branching path on a SM 2.0 level shader path? Why wouldn't you use conditional branch paths to significantly reduce computational overhead on a SM 3.0 level part?

You explain it.

Mathias-

In fact, for many simple shaders, there would be no difference whatsoever between SM2.0 and SM3.0 implementations.

Simple shaders being the main point. We are talking about moving forward here and what to expect, are we not? We are starting to see shader hardware that makes utilization of shaders viable in terms of them being superior to texture mapping for general-purpose effects (ie- we have enough shader power now to make a brick wall look better using shaders than with high-quality texture maps). With more complex shaders, the ability to plan on and build shaders with branching in mind from the start will allow developers considerably more flexibility. This has nothing to do with nV or ATi, as obviously the R520 is going to have to support SM 3.0 to remain viable and within MS's targets for viable hardware (not to mention to allow it to see ports from XB360 at any point without a serious reworking). The amount of interaction with the environment, and lighting in particular, can be significantly more complex under 3.0 than 2.0 at the same level of performance, as long as the branching hardware is decent (it is with the GTX, not so with the 6x00 parts).
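The branching tradeoff being argued here can be sketched as a toy cost model. This is a hedged illustration with made-up instruction counts, not real shader code: SM2.0-class hardware without dynamic branching effectively evaluates both sides of a conditional and selects one result, while SM3.0-class hardware pays a small branch overhead and runs only the taken side.

```python
# Toy cost model (not real shader code). All costs below are invented
# illustrative numbers, not measurements of any actual GPU.

def shader_cost_sm2(then_cost, else_cost, select_cost=1):
    """SM2.0-style predication: execute both branches, then select."""
    return then_cost + else_cost + select_cost

def shader_cost_sm3(condition, then_cost, else_cost, branch_cost=2):
    """SM3.0-style dynamic branch: small overhead, only the taken side runs."""
    return branch_cost + (then_cost if condition else else_cost)

# A pixel fully in shadow can skip an expensive lighting path:
expensive_lighting = 40   # hypothetical instruction count
cheap_ambient = 4

both = shader_cost_sm2(expensive_lighting, cheap_ambient)        # 45
taken = shader_cost_sm3(False, expensive_lighting, cheap_ambient)  # 6
print(both, taken)
```

With these assumed costs, a shadowed pixel skips the expensive lighting work entirely on the SM3.0-style model, which is the flexibility being argued for; the savings only materialize when the branching hardware itself is fast.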

We've only just started seeing games that require SM1.4 -- requiring SM3.0 would limit your audience way too much at this point.

Unless you factor in the console element. All of the next-gen consoles are going to be pushing ~SM 3.0 level shader hardware as the default (not sure about the Revolution, but they are indicating R520-based parts, so they should also), so ports from the consoles will have SM 3.0 support built in from the ground up already. At that point the question becomes how much effort it is worth to rewrite the graphics engine and dumb it down for PCs. In the early days of their life cycle we will likely see it, but the move to SM 3.0 is likely to be far more widespread than the move to SM 2.0 has been, simply because all of the mass-market platforms will already be using SM 3.0.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Considering that UE3 had to have 6800gts right away for SM3 support, I'd say it's a pretty big deal.

Obviously it's important enough that ATI is going to use it and that Microsoft has a big article on their website (probably soon to make it the standard shader model thingy).

I'm having trouble understanding why so many people who have SM2 say that SM3 is "stupid". Is it because you don't have it? Because we say it's needed right now? Could it be we don't keep our cards for more than a few months before upgrading? Could it be we like newer features?

What exactly is it? For me, I don't upgrade often (every few years), so SM3 was needed as a safety measure. I also like the fact that it's a new feature that will let me play games with either boosted performance or added eye candy.

What reason do you have not to like it right now? Is it because of me? Yup, probably!
 

WA261

Diamond Member
Aug 28, 2001
4,631
0
0
They also cranked up the clocks 40MHz and listed it lower....made the card seem faster than it is
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
What on Earth would make you build in a conditional branching path on a SM 2.0 level shader path? Why wouldn't you use conditional branch paths to significantly reduce computational overhead on a SM 3.0 level part?

You're missing the whole point of the argument. The sm3 code in an sm3-only game will likely be very similar to the sm3 code in an sm3/sm2 game for a given effect. I never said sm2 is the same code as sm3, and neither did I say that sm2 is just as efficient as sm3. Once the game determines that the hardware is sm3 capable, it chooses the sm3 code path, and the fact that there's also an sm2 code path available makes no difference to the sm3 code running. It's not like the whole game is written in a different language if it only has sm3. Most of the cpu code will still be the same, and the gfx card doesn't know what other possible shader code might be built in if it's not enabled.
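The runtime path selection being described can be sketched in a few lines. Everything here (the file names, the version tuples) is invented for illustration and is not from any real engine; the point is only that the presence of an unused fallback path has no effect on the path that actually runs.

```python
# Hedged sketch: a game shipping both SM2 and SM3 shader paths picks one
# at startup based on the hardware's reported pixel shader version. The
# shader file names are hypothetical placeholders.

SHADER_PATHS = {
    (2, 0): "water_sm2.psh",   # hypothetical precompiled SM2.0 shader
    (3, 0): "water_sm3.psh",   # hypothetical precompiled SM3.0 shader
}

def select_shader_path(ps_version):
    """Pick the highest shader path the hardware supports."""
    supported = [v for v in SHADER_PATHS if v <= ps_version]
    if not supported:
        raise RuntimeError("no compatible shader path for this hardware")
    return SHADER_PATHS[max(supported)]

print(select_shader_path((3, 0)))  # SM3 card -> SM3 path
print(select_shader_path((2, 0)))  # SM2 card -> SM2 fallback
```

On an SM3 card the SM2 entry is never even loaded, which is the sense in which the fallback "makes no difference to the sm3 code running".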
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Hold on, check this out. This is from Mark Rein of the UE3 dev team:

"The best benefit is we now have a card [the 6800] that can run this in real time and do Pixel Shader 3, which is a key component of what we're doing. It's pretty good that Nvidia's come along and is pushing this technology hard. We've had a long relationship with Nvidia and they've done a great job of keeping us on our toes and giving us new powerful toys to play with."

Ok, so they have a relationship with Nvidia...oh, linky....http://www.eurogamer.net/article.php?article_id=55942...
But they did say that DX9 was a minimum. They said that cards that use just DX9 will have to play at low settings. Anyone who has a lot of ram and a dx9 card can play one notch above. Someone with a Sm3 card can go farther than the main competitor card (6800gt vs x800xl) with the Sm3 eye candy on (and by farther I mean graphics-wise while still being playable, not beating it FPS-wise and looking better). They said something like that. Or that's what I made it out to be.

I just thought it was interesting.
Something like that.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: hans030390
And i'd like to add this

http://www.firingsquad.com/news/newsarticle.asp?searchid=7790

just read it. It's old, but does "cards based off of the 6600gt will run the engine easily" mean anything? I'm assuming he means the 6 series...which has Sm3...and the above link shows it's important to them...

um....yeah...

I don't think he meant with the settings even at med.

I think the settings would have to be preeetty low. Though I suppose you know that, as you were just stating SM3 is useful.

It's getting late. Someone knock me out

 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
I'm well aware that my system probably won't run it past medium/low settings. I'm fine with that.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
OTOH, I've heard the 6800U was running the UE3 demo at ~15fps (extrapolating from Epic comments on 7800GTX framerate), and Tim Sweeney has written to Reverend that SM3 was just a minor improvement.

Anyway, SM3 is not the future anymore; it's basically the present reality (GF6 & 7, Xb360, PS3). Hopefully ATI arrives (in force) in a month and we can put this argument behind us, in favor of arguing over whose SM3 featureset and performance is better.
 

zendari

Banned
May 27, 2005
6,558
0
0
Originally posted by: hans030390
And i'd like to add this

http://www.firingsquad.com/news/newsarticle.asp?searchid=7790

just read it. It's old, but does "cards based off of the 6600gt will run the engine easily" mean anything? I'm assuming he means the 6 series...which has Sm3...and the above link shows it's important to them...

um....yeah...

Developer minimum specs are so backwards it's not even funny. With min spec you get like 15 fps at 640x480.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Wow, can anyone say that the whole 15fps thing was on a NON-COMPLETED version of the engine running with ALL eye candy on?

I can.

No wonder it got 15fps....sheesh. It's not like I'm gonna play it on uber IQ settings.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: BenSkywalker
it makes absolutely no difference if the game was designed with sm3 only or with sm3 and sm2 - most of the code is still the same and it will still run at similar speed.

What on Earth would make you build in a conditional branching path on a SM 2.0 level shader path? Why wouldn't you use conditional branch paths to significantly reduce computational overhead on a SM 3.0 level part?

You explain it.

I didn't write that, but I think he might be trying to make the same argument as I was, which is that with the shader lengths you can actually use on something the speed of a GF6, the differences are minor. Many simple shaders don't even need conditional branching.

He may have also been referring to the non-shader parts of the graphics engine, or of the rest of the game as a whole. Better shaders don't increase your card's memory bandwidth, nor will they make your physics simulation or AI processing take less CPU time.

In any case, building a game engine "from the ground up" with SM3.0 in mind will not magically make the shaders run faster. "You can't judge SM3.0 performance based on Far Cry or SC:CT because they weren't built ground-up with SM3.0" is just not a good argument.

Mathias- (sic)

In fact, for many simple shaders, there would be no difference whatsoever between SM2.0 and SM3.0 implementations.

Simple shaders being the main point. We are talking about moving forward here and what to expect, are we not? We are starting to see shader hardware that makes utilization of shaders viable in terms of them being superior to texture mapping for general-purpose effects (ie- we have enough shader power now to make a brick wall look better using shaders than with high-quality texture maps). With more complex shaders, the ability to plan on and build shaders with branching in mind from the start will allow developers considerably more flexibility. This has nothing to do with nV or ATi, as obviously the R520 is going to have to support SM 3.0 to remain viable and within MS's targets for viable hardware (not to mention to allow it to see ports from XB360 at any point without a serious reworking). The amount of interaction with the environment, and lighting in particular, can be significantly more complex under 3.0 than 2.0 at the same level of performance, as long as the branching hardware is decent (it is with the GTX, not so with the 6x00 parts).

I'm unconvinced from the performance numbers that even a (single) 7800GTX can really start to take advantage of SM3.0, but I agree with your basic points. SM3.0 is an improvement over SM2.0 for writing big, scary shaders, and when we start getting GPUs that can really crank out shader ops as their primary function, it'll be a huge advantage. But I think that we're looking at the next next generation (ie, the one after R520 and GF7), when DX10 (or whatever Microsoft is calling it this week) is defined and the GPUs are built primarily as unified shader engines (rather than fixed-function pipelines with some shader units thrown on top).

We've only just started seeing games that require SM1.4 -- requiring SM3.0 would limit your audience way too much at this point.

Unless you factor in the console element. All of the next-gen consoles are going to be pushing ~SM 3.0 level shader hardware as the default (not sure about the Revolution, but they are indicating R520-based parts, so they should also), so ports from the consoles will have SM 3.0 support built in from the ground up already. At that point the question becomes how much effort it is worth to rewrite the graphics engine and dumb it down for PCs. In the early days of their life cycle we will likely see it, but the move to SM 3.0 is likely to be far more widespread than the move to SM 2.0 has been, simply because all of the mass-market platforms will already be using SM 3.0.

Considering that none of these consoles are on the market yet, I think we have a while (I'd say in the realm of 2-3 years, but that's a semi-random guess) before this becomes a major concern. I do agree that SM3.0 will, in the end, become bigger and grow faster than SM2.0, partly because developers are starting to accumulate more and more experience with programmable shaders, and partly because in a few years most households in America will have at least one SM3.0-capable GPU (whether in the form of a PC video card or an XBox360/PS3/Revolution).

Look, I'm not saying that SM3.0 is worthless (far from it!), and I think ATI would be making a huge mistake if R520 didn't support it. I just don't think it makes a lot of sense to base purchasing decisions for a PC graphics card on it right now. If you want to lay out $700+ for a 6800GT/U or 7800GTX SLI setup and keep it for five years, sure. But not for us mortals who have, you know, budgets.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
I have a budget....that's why I bought a sm3 card...cuz...I wouldn't be able to afford upgrading after I got this card...
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: hans030390
I have a budget....that's why I bought a sm3 card...cuz...I wouldn't be able to afford upgrading after I got this card...

Well, I personally disagree with your decision to buy a card that performs slower today but has SM3.0 than a card that is faster overall but has SM2.0. It's just not what I would have done, as I do not think SM3.0 will be terribly useful on a card as slow overall as the 6600GT.

However, I don't have all the details of what you paid, what was available at that time, etc.. And of course, you can't put a price on peace of mind or satisfaction. If you feel an SM3.0 card is going to perform or hold value better, I can't exactly stop you from buying one. I'm just some random guy on teh Intarweb, after all.
 

gac009

Senior member
Jun 10, 2005
403
0
0
^Yes, but can the gpu in your 6600gt really take advantage of the new shaders?
I know you've got a lot of love for SM 3.0 in your heart, but do you really believe that the 6600gt will outperform the x800 in games 2-3 years from now running the SM 3.0 path?
Have you ever tried out those cool SM 3.0 features and HDR while playing SC:CT on your 6600gt?

I believe that one day SM 3.0 will be king, but when that day comes I won't want to be gaming with a 6 series card.

 

Rock Hydra

Diamond Member
Dec 13, 2004
6,466
1
0
It's the people who don't do research and automatically think an SM 3.0 video card is better than an SM 2.0 card, whether it is or not, now or in the future.
 

munchow2

Member
Aug 9, 2005
165
0
0
If an equivalently performing and priced ATI card and Nvidia card were compared, I would of course take the Nvidia, because I would have nothing to lose by taking SM3.

However, it's not like I will get the ub3r-quality graphics SM3 is supposed to deliver using a 6600GT vs. a 7800GTX SLI.

People who say SM3 is a scam are too strong in their judgement. I can see that SM2/3 won't make a HUGE difference in games, but they do make the games slightly prettier. DirectX versions, on the other hand, are extremely crucial and play a much more important role in the videogame experience than anything else. Nevertheless, SM3 is the next evolution in graphics, and we should all just accept it as opposed to bashing it because joe shmoe just purchased an expensive gfx card with SM2 and is feeling ripped off.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
You're missing the whole point of the argument. The sm3 code in an sm3-only game will likely be very similar to the sm3 code in an sm3/sm2 game for a given effect. I never said sm2 is the same code as sm3, and neither did I say that sm2 is just as efficient as sm3. Once the game determines that the hardware is sm3 capable, it chooses the sm3 code path, and the fact that there's also an sm2 code path available makes no difference to the sm3 code running. It's not like the whole game is written in a different language if it only has sm3. Most of the cpu code will still be the same, and the gfx card doesn't know what other possible shader code might be built in if it's not enabled.

I'm not missing the point of your argument- you aren't seeing the flaws in it.

If you are building a game from the ground up to utilize SM3.0, you are going to be building all of your art assets in a different manner than you would for a SM2.0 part - this goes for both the geometry and shading side. Your physics calculations could easily see a sizeable shift: if you start using HOS with deformation properties, it is a considerably different approach than what you would do if you were building your engine around a SM2.0 code path. This is going to affect a significantly larger portion of your code base than just what you can alter using a different flag when compiling. Looking solely at the shaders themselves, particularly when dealing with light interactions, SM 3.0 allows a significantly reduced instruction count for identical shaders compared to SM 2.0 if you code with branching in mind - you can handle this on 2.0 hardware, but it would be so incredibly slow it wouldn't be worthwhile. When talking about games built from the ground up to utilize SM 3.0 level hardware, it is very far removed from simply choosing which code path to utilize at run time - if you are doing that, you did not build the engine from the ground up with that purpose in mind.
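The lighting argument can be made concrete with another toy cost model, again with invented numbers rather than measurements: hardware without dynamic loops typically handles multiple lights as one pass per light (paying per-pass overhead each time), while a single SM3.0-style pass can loop over all the lights and skip the ones that don't reach the pixel.

```python
# Toy cost model of multi-light shading. PASS_SETUP and PER_LIGHT are
# made-up illustrative costs, not real GPU instruction counts.

PASS_SETUP = 10     # per-pass overhead (state changes, blending)
PER_LIGHT = 8       # shading work for one light

def cost_sm2_multipass(num_lights):
    """One pass per light: pay the setup overhead every time."""
    return num_lights * (PASS_SETUP + PER_LIGHT)

def cost_sm3_single_pass(light_reaches_pixel):
    """One pass with a dynamic loop: skip lights whose reach test fails."""
    return PASS_SETUP + sum(PER_LIGHT for hit in light_reaches_pixel if hit)

print(cost_sm2_multipass(4))                              # 72
print(cost_sm3_single_pass([True, False, True, False]))   # 26
```

Under these assumptions the single-pass version pays the setup cost once and skips two of the four lights entirely, which is the "significantly reduced instruction count" claim in miniature; the real-world win depends on how cheap the hardware's branching actually is.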

Matthias-(spelled it properly this time )

I didn't write that, but I think he might be trying to make the same argument as I was, which is that with the shader lengths you can actually use on something the speed of a GF6, the differences are minor. Many simple shaders don't even need conditional branching.

But the problem with that is that an extremely lengthy shader on a SM2.0 part could be a very short one on a SM3.0 part doing the exact same thing. That is what we are really talking about here.

In any case, building a game engine "from the ground up" with SM3.0 in mind will not magically make the shaders run faster.

Except that it very easily can, depending on the situation (although obviously that isn't always the case). Using FC as an example: if they had built the interior shaders from the ground up to use SM 3.0, they could have collapsed the passes and instruction count considerably (beyond what they have modified; it would require rewriting all of the shaders), making it run much better with higher levels of accuracy than the SM2.0 path.

I'm unconvinced from the performance numbers that even a (single) 7800GTX can really start to take advantage of SM3.0

Of course, because you are thinking of it bass-ackwards. What you should be asking yourself is whether the 7800GTX can handle the simplicity of SM3.0, not the complexity. There really isn't much you can do under PS3.0 that you can't do under SM2.0 - you can just do it in a much simpler fashion.

SM3.0 is an improvement over SM2.0 for writing big, scary shaders, and when we start getting GPUs that can really crank out shader ops as their primary function, it'll be a huge advantage.

The 7800GTX can crank out shader ops faster than it can draw pixels - what are you waiting for in terms of it being the 'primary function'? SM3.0 is about taking what would be a big scary shader op and making it a more reasonable one - it is not about exploding complexity, it is about increasing simplicity.

But I think that we're looking at the next next generation (ie, the one after R520 and GF7), when DX10 (or whatever Microsoft is calling it this week) is defined and the GPUs are built primarily as unified shader engines (rather than fixed-function pipelines with some shader units thrown on top).

How would you feel about a CPU that had general-purpose units handling both fp and integer ops, neither as fast as the dedicated units they currently utilize? That is akin to what unified shader hardware is going to be: considerably reduced performance per unit. I'm not convinced it is the ultimate best way to handle things at this point - it sounded nice in theory, but there is quite a bit of compromise as of now.

Beyond that, in no way am I saying that we don't need a massive increase in shader performance before we can think about a fully shaded game - that said, SM3.0 is lowering that bar, and that is a Good Thing.

Considering that none of these consoles are on the market yet, I think we have a while (I'd say in the realm of 2-3 years, but that's a semi-random guess) before this becomes a major concern.

So you are under the impression that the major game developers are going to dumb down their titles for years prior to bringing them to the PC?

I just don't think it makes a lot of sense to base purchasing decisions for a PC graphics card on it right now.

SM3.0 is an advantage for whatever board has it versus a SM2.0 board, period. This is very far removed from an either-or choice ATM: either you get very fast SM2.0 performance with SM3.0 as an option, or you get very fast SM2.0 performance and no SM3.0 as an option. There is no way ATi isn't going to be supporting SM3.0, and when that happens all of their rabid loyalists can end this nonsense about it not being a worthwhile feature. SM2.0 was useless when it first shipped in hardware, as there was no API to exploit it - that is certainly not the case here.
 

trinibwoy

Senior member
Apr 29, 2005
317
3
81
Originally posted by: WA261
They also cranked up the clocks 40MHz and listed it lower....made the card seem faster than it is

Hmmmm, my IQ dropped a couple notches just reading this post. The card is as fast as it is regardless of clocks. Nothing they do can make it "seem" faster. My head is still spinning from reading that.

 