Granted it was one preview, I'd have to say both.

Originally posted by: josh6079
Which overall perspective? The capabilities of the G80 or the capabilities of DX10?
Any basis for that? Thus far, I've read quite the contrary. I'd love to know where you're getting your information from, though. Perhaps we can contact that source and fill them in.

Originally posted by: josh6079
If you think FEAR wasn't "well-optimized" on XP near the end of the DX9 era, just wait until you see Hellgate on Vista in the beginning of the DX10 era.
How is it hard to understand that different developers program their games in different ways? What you're suggesting is that this one preview encompasses the code that will be used for all DX10 applications to come. Logically speaking, we have games today that vary in what they require from the hardware. Oblivion happens to be quite a demanding game, whereas game X will run on almost anything. So again, quit judging all DX10 games to come by this one preview, because it's just silly.

Originally posted by: josh6079
I would only hope the unknown variables in the preview were demanding ones, such as high resolutions and acceptable AA levels. Otherwise, the situation will be even worse.
:roll:

Originally posted by: josh6079
So, you're saying that as DX APIs mature and developers learn how to better utilize the Direct3D versions, games don't become more demanding and require better hardware?
You implied nVidia held the G80 back so that they could release a G81 as soon as Vista released. That is where I pointed out your flawed thinking, so bringing up something totally different isn't really going to make your argument more compelling.
Originally posted by: josh6079
I don't seem to remember getting a huge increase in performance by updating DX iterations. Heck, sometimes I only update them when I install a newer game that has a later DX iteration coupled with it. Are you implying that DX10.2 will make or break the G80's DX10 performance?
I guess you missed the memo.
Microsoft has split DX10 up into what one could call "bite sizes." They will release each new iteration after a certain amount of time, which gives game developers more cushion for the features they decide to implement in their games. This hasn't been the case in previous DX versions, so I'm not sure what you're talking about when you say you've done this before.
But yes, I'm very sure that DX10.2 will add new features and capabilities for game developers that aren't supported by today's DX10 GPUs. That's not to say it'll be out any time soon; in fact, I'd venture to guess we won't even see DX10.1 until sometime next year, if even by then. It depends on how Microsoft is going to handle it.
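For what it's worth, here's a minimal sketch of how those bite-size releases surface on the developer side: an application asks for the highest feature level available and falls back to plain 10.0 if the point release isn't supported. The names here (D3D10CreateDevice1, D3D10_FEATURE_LEVEL_10_1) come from the D3D10.1 SDK as it eventually shipped; take it as an illustration of feature-level fallback, not a claim about how Microsoft will package DX10.2.

// Minimal sketch: probe for a D3D10.1 device, fall back to baseline 10.0.
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

ID3D10Device1* CreateBestDevice()
{
    // Try the highest feature level first, then fall back.
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,  // point-release extras (e.g. finer MSAA control)
        D3D10_FEATURE_LEVEL_10_0,  // baseline DX10, what first-gen parts expose
    };

    for (D3D10_FEATURE_LEVEL1 level : levels) {
        ID3D10Device1* device = nullptr;
        HRESULT hr = D3D10CreateDevice1(
            nullptr,                     // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,  // hardware rendering
            nullptr, 0,                  // no software module, no flags
            level,
            D3D10_1_SDK_VERSION,
            &device);
        if (SUCCEEDED(hr))
            return device;               // highest supported level wins
    }
    return nullptr;                      // no DX10-class device at all
}

The point being, a point release doesn't break the older parts: a G80-class card simply creates a 10.0 device and the game skips the 10.1-only paths.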
Then why are you assuming they'll start doing this now?

Originally posted by: josh6079
That's not representative of my analogy, as there was no die shrink from the 7900s to the G80. The last time Nvidia did a die shrink in their flagship products you most certainly did see a "plaguing" of their former GeForce series (aka 7800 ---> 7900).
Lol, a lot of the questions you're asking are the same ones I'm asking you. Maybe you can start answering them instead of waffling around to new ideas and talking points.
No, no. If you're the one making those accusations, it's up to you to provide the evidence/clips. All you have been able to offer is one measly clip of a "so-called" preview, the centerpiece of this debate and the reason we called you out for making characterizations based on it.

Originally posted by: josh6079
It's just my opinion based on what Nvidia and Flagship Studios have shown us. If you think the G80/R600 will do well with some DX10 titles, show me some clips, because I'm most interested.
How about answering that, and then we can get into all the other things you seem to want to discuss.
Nelsieus