Originally posted by: Genx87
Originally posted by: n7
I suspect the poor performance is due to the stream processor layout on the AMD cards vs. nV's, but I can't find anything stating just how it's set up.
From a poster commenting on the DT article:
The R600 has 64 five-wide Vec5 units, which means it can issue a maximum of 64x5 = 320 stream ops per clock, but it makes its worst-case scenario a lot worse, at only 64 stream ops per clock (for lack of a better term). You can think of it the same way as the SIMD units in our current CPUs: they can deliver huge amounts of processing power if, and only if, they are used correctly and the code is optimized accordingly; otherwise we see no gains.
In my opinion, AMD/ATI made a design compromise: they chose this approach because it could prove to be much better in the DX10 world and, even more interestingly, in the GPGPU world.
Think about it: if you open up the architecture with CTM and give people the power of 64 Vec5 units, you end up with an amazing amount of processing power. That's where I think they are focusing.
NVIDIA is in a much more favorable position in today's gaming world. If you have 128 scalar units, even in the worst-case scenario you still issue 128 stream ops per clock (all else equal, and given you have the bandwidth); the flip side is that your best case is no better than those same 128.
I believe they delayed it because they were expecting DX10 games (of course, this is just speculation on my part). And I hope, for their sake, that it performs a lot better in that world.
Still, if I'm at least partly right, driver updates could better optimize shader programs that weren't written with a SIMD architecture in mind. Then again, I could be entirely wrong.
This is what I was thinking too.
If that is true, they made a terrible design decision for today's market, which will be 99% DX9 titles for the next 12-18 months. By then the R600 and G80 will be old news.
Typically I don't agree with you because your posts often display a certain 'bias', but in this case you're bang on. Poor performance in DX9 titles is a serious blow, because DX10 games just aren't coming en masse in the near future. We've got Halo 2 (been there, done that; plus doesn't it run on DX9 cards anyway, just requiring Vista?), a possible Crysis patch (which, if their unsupported and buggy SM 3.0 patch for Far Cry is any indication, will be late to the table and buggy), and I don't know what else. It's just a total disaster for ATI/AMD (the HD 2900 XT could be OK if it's priced at $399, but the power consumption figures are ridiculous).
Honestly, what's the incentive to go ATI right now, other than potential future performance? And future performance has never been a smart basis for a purchase, because there's always something new on the horizon; ATI and NVIDIA have been on a roughly six-month refresh cycle for years now.
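Going back to the quoted Vec5-vs.-scalar comparison, here's a rough back-of-the-envelope sketch of the per-clock math (my own illustration, not anything from AMD or NVIDIA docs; it assumes idealized issue rates and ignores clock speeds, memory bandwidth, and special-function hardware):

# Toy model of the quoted tradeoff: per-clock op issue only; ignores clocks,
# bandwidth, and special-function units. "packing" is an assumption about how
# well the shader compiler fills the five slots of each Vec5 unit.

def vliw_throughput(num_units=64, slots_per_unit=5, packing=1.0):
    """Ops issued per clock by an R600-style Vec5 (VLIW) array.

    packing = fraction of the 5 slots filled with independent scalar ops
    (1.0 = perfect packing, 0.2 = only one slot used, the worst case
    described in the quote).
    """
    return num_units * slots_per_unit * packing

def scalar_throughput(num_units=128):
    """Ops issued per clock by a G80-style scalar array: one op per unit."""
    return num_units

print(vliw_throughput(packing=1.0))   # best case: 64 * 5 = 320.0 ops/clock
print(vliw_throughput(packing=0.2))   # worst case (serial shader code): 64.0 ops/clock
print(scalar_throughput())            # 128 ops/clock, best case or worst

If that's roughly how it works, then DX9 shaders that don't pack well into the five slots would leave most of the R600's ALUs idle, which would line up with the DX9 numbers we're seeing; the driver/compiler improvements the quoted poster hopes for would essentially amount to raising that packing fraction.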
-----------------
As an aside, the gaming industry in general seems to be in a lull, since development teams are all working on "next-gen"-looking games, which means even longer development times than before...
Not to mention an increase in console-exclusive (or cross-platform) titles, such as Gears of War and many more Unreal Engine 3 games, which appear to be in development for Xbox 360 and PS3 (in many cases) as well as PC. Ever since Microsoft introduced the original Xbox, it has stolen a lot of PC gaming's thunder, and the mockery that the expensive PC graphics card race has become has really turned a lot of people off (myself included). [Though MS has its own problems with the ****** life expectancy of Xbox 360 systems, and PS3 reliability is still to be determined.]