Josh, so you're basing the DX10 performance on a game preview, a game that isn't even finished yet?
Of course not.
I'm basing my opinion of the card's performance on what we've been shown.
Of course it's going to be choppy right now, but when the game eventually comes out, it will be optimised to the point where the devs feel it's right to release.
The game can be "optimized" all it wants to be, but if it's a graphics hog, it's a graphics hog -- look at FEAR.
In the Hellgate demo, I didn't see visual artifacts, crashes, or weird character bugs, which are the majority of what game developers fix in patches. They don't often release a patch just to increase performance; normally it's to resolve glitches or add content.
This time around, I'm not so sure the R600 is going to beat the G80, because this isn't your G70 vs. R520 or NV40 vs. R420 matchup.
That's fine. I never claimed my opinion to be the truth, and you may very well be right. I see this as another of the 100+ speculative threads where we take the information we've been given thus far and produce an educated guess.
Since when does one game preview determine how these next-gen cards handle DX10?
That depends on how far game developers push DX10 to its limits in said application.
Arguing that because it runs slowly in one DX10 application it will run slowly in all DX10 software is a premature statement, and a rather futile one at this point. It would be like saying the Radeon X1900 PRO sucks at DX9 because it can't run Oblivion at max settings on a decent resolution.
Hardly. I never claimed to know the resolution the demo was run on, what settings they used, nor which card(s) may have been running it -- all we know is that it was a G80 in some configuration.
I trust that they did the best they could to give as fluid a preview as possible, especially since it was a live demonstration in front of enthusiasts and not some clip released on the net for download.
What about the Radeon 9700? That card was a beast with DX8 games and handled many DX9 games too.
Sure, it handled DX9 games, but not that well, especially when you started to crank up the IQ. In the grand scheme of things it was what I think the G80 will be: a good introductory card for the start of a new DirectX API.
Not to be rude, but I don't think Josh has a clear understanding of the issue. These GPUs were designed in close cooperation with Microsoft's DX10 software teams, along with game developers. And while support for DX10 will always improve with each new generation, I don't think it's accurate to say that the G80/R600 aren't "meant" for DX10.
My understanding of the issue is that if these companies plan to keep making cards and turning a profit, there has to be demand for those cards. If Nvidia wants to entice people to move from a G80 to a G81, the G80 has to stop satisfying people at some point by no longer delivering enough performance for a given title. Considering the time frame in which Vista gets released, DX10 games emerge, and the G81 arrives, I just think Nvidia will try to phase out the G80 as soon as possible in favor of the G81's smaller-nm core.