The R600 will equal the 8800GTX in performance


Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: josh6079
Which overall perspective? The capabilities of the G80 or the capabilities of DX10?
Granted it was one preview, I'd have to say both.

Originally posted by: josh6079If you think FEAR wasn't "well-optimized" on XP near the end of the DX9 era just wait until you see Hellgate on Vista in the beginning of the DX10 era.
Any basis for that? Thus far, I've read quite the contrary. I'd love to know where you're getting your information, though. Perhaps we can contact that source and fill them in.

Originally posted by: josh6079I would only hope the unknown variables in the preview were demanding ones such as high resolutions and acceptable AA levels. Otherwise, the situation will be even worse.
How is it hard to understand that different developers program their games in different ways? What you're suggesting is that this one preview encompasses the code that is used for all DX10 applications to come. Logically speaking, we have games today that vary in what they require from the hardware. Oblivion happens to be quite a demanding game, whereas game x will run on almost anything. So again, quit judging all DX10 games to come by this one preview, because it's just silly.

Originally posted by: josh6079So, you're saying that as DX APIs mature and developers learn how to better utilize the Direct3D versions games don't become more demanding and require better hardware?
:roll:

You implied nVidia held G80 back so that they could release a G81 as soon as Vista released. That is where I pointed out your flawed thinking, so bringing up something totally different isn't really going to make your argument more compelling.


Originally posted by: josh6079I don't seem to remember getting a huge increase in performance by updating DX iterations. Heck, sometimes I only update them when I install a newer game that has a later DX iteration coupled with it. Are you implying that DX10.2 will make or break the G80's DX10 performance?

I guess you missed the memo.

Microsoft has split DX10 up into what one could call "bite sizes." They will release each new iteration after x amount of time, and what this does is give game developers more cushion for features they decide to implement in their games. This hasn't been the case in previous DX versions, so I'm not sure what you're talking about when you say you've done this before.

But yes, I'm very sure that DX10.2 will add new features / capabilities for game developers not supported by DX10 GPUs. That's not to say it'll be out fairly soon; in fact, I'd venture to guess we won't even see DX10.1 until sometime next year, if even by then. It depends on how Microsoft is going to handle it.

Originally posted by: josh6079That's not representative to my analogy as there was no die shrink from the 7900's to the G80. The last time Nvidia did a die shrink in their flagship products you most certainly did see a "plaguing" of their former GeForce series (aka - 7800--->7900).
Then why are you assuming they'll start doing this now?

Lol, a lot of the questions you're asking are the same ones I'm asking you. Maybe you can start answering them instead of wish-washing around to new ideas and talking-points.

Originally posted by: josh6079It's just my opinion based off of what Nvidia and Flagship Studios have all showed us. If you think the G80 / R600 will do good with some DX10 titles show me some clips because I'm most interested.
No no, if you're the one making those accusations, it's up to you to provide the evidence / clips. All you have been able to offer is one measly clip of a "so-called" preview, which is the center of this debate and why we called you out for making characterizations based on it.

How about answering that, and then we can get into all the other things you seem to want to discuss.

Nelsieus

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I'd have to say both.
The capabilities of the G80 are, I think, amazing, just as DX10's are. However, I can't look at the G80 and see that Nvidia gave it everything they had. They left some variables out of the G80 for their refresh - the G81. Because they did this, I can't see the G80 / R600 being the last card you'll need for DX10, even if Microsoft developed DX10 with these cards in mind. The 9700 was a good entry for DX9, just like the G80 / R600 will be a good entry for DX10. The field where they will really be able to stretch their legs, though, is DX9.
What you're suggesting is that this one preview encompasses the code that is used for all DX10 applications to come.
Hardly. What I'm suggesting is that the games to come will demand all of what the G80 / R600 has to offer and then some.
So again, quit basing all DX10 games to come on this one preview, because it's just silly.
Where have I made assumptions about the efficiency of all the DX10 games to come? I've made assumptions about how our first / current DX10 GPUs will be able to handle those DX10 games, not how well developers will be able to fulfill the DX10 spec.

To be fair, I'm not looking at just the Hellgate previews but the Crysis ones as well:

Link (provided by shabby here.)

Again, no Vsync, no firefights, and hardly any other characters on screen. Performance is, eh....*wobbles hand*

Yes, I don't know all of the specifics surrounding the demo run, but, once again, I only hope they were demanding ones. Otherwise the situation is even worse.
You implied nVidia held G80 back so that they could release a G81 as soon as Vista released.
I did? Where did I say they will throw away G80 as soon as Vista hits?

I implied that Nvidia will probably phase out the G80 when the G81 hits in the same manner they phased out the G70 when G71 hit. Nvidia will instigate that, not Microsoft.
Microsoft has split DX10 up into what one could call "bite sizes." They will release each new iteration after x amount of time, and what this does is give game developers more cushion for features they decide to implement in their games. This hasn't been the case in previous DX versions, so I'm not sure what you're talking about when you say you've done this before.
Really? In truth, I never heard about that. Thanks.

I thought you were discussing DX10 "updates" much like the DX9 updates that Microsoft releases sometimes.
But yes, I'm very sure that Dx10.2 will add new features / capabilities for game developers not supported by DX10 GPUs. That's not to say it'll be out fairly soon, in fact I'd venture to guess we won't even see DX10.1 until sometime next year, if even by then. It depends on how Microsoft is going to handle it.
When you say "add new features" do you propose that those "new features" will increase the G80 / R600 performance with DX10 applications?
Then why are you assuming they'll start doing this now?
I never assumed that they'll do this now. I believe they'll do that phasing out to the G80 after they release the G81.
Lol, a lot of the questions you're asking are the same ones I'm asking you.
I have to ask questions with you misreading my statements and practicing wishful thinking. What were you smoking when you thought I said that the G80's will be phased out when Vista hits and one DX10 game will decide the functionality of all?

I'm not concentrating on the ability for developers to program an efficient DX10 application but rather the ability of the DX10 GPU's to perform well with the games coming for DX10, whether they're efficiently coded or not.
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: josh6079
Where have I made assumptions about all of the efficiency of all the DX10 games to come?

Josh: "I'm not all that excited the R600 / G80 performances in DX10 and that preview explains why."

Originally posted by: josh6079I did? Where did I say they will throw away G80 as soon as Vista hits?

Josh: "Considering the time frame that Vista gets released, DX10 games emerge, and G81 gets released, I just think that the G80 will try to get phased out as soon as possible with the smaller nm core of the G81."


Originally posted by: josh6079When you say "add new features" do you propose that those "new features" will increase the G80 / R600 performance with DX10 applications?
When G80 hit, did we see Geforce 7900 products get faster in DX9? 0_o

Originally posted by: josh6079What were you smoking when you thought I said that the G80's will be phased out when Vista hits and one DX10 game will decide the functionality of all?
Ooh, I get to double paste...

Josh:"I'm not all that excited the R600 / G80 performances in DX10 and that preview explains why.""Considering the time frame that Vista gets released, DX10 games emerge, and G81 gets released, I just think that the G80 will try to get phased out as soon as possible with the smaller nm core of the G81."

And please don't make such inappropriate comments as "what were you smoking..." I find it rather uncalled for.

Nelsieus

EDIT: Spelling

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
You're really good at misreading things, just like you misread tuteja1986's comments here.

Let's look at my comments again, since you're basing your whole argument off of my comments instead of outside evidence.

Originally posted by: josh6079
I'm not all that excited [about] the R600 / G80 performances in DX10 and that preview explains why.
What does this mean? I am not impressed with the DX10 GPU horsepower for DX10 applications. It says nothing about efficient coding, DX10 applications as a whole, or how the developers handle DX10. Why you thought that is beyond me.

Your reading comprehension failed again when you thought I was claiming that the G80 will be phased out when Vista hits. If you had bothered to look at the other factors in that sentence, you would have realized that the time period I was discussing was Q1 of '07, not January:
Originally posted by: josh6079
Considering the time frame that Vista gets released, DX10 games emerge, and G81 gets released, I just think that the G80 will try to get phased out as soon as possible with the smaller nm core of the G81.
"Considering the time frame that Vista gets released" (it's already out but that's beside the point) was one major thing that was going to happen in the time frame I was discussing. The other factors were the emergence of DX10 games and the March / April estimates concerning the G81. When the G81 is launched, Nvidia themselves will put the G80 behind them as fast as they put the 7800's behind them when the 7900's came. That's why I mentioned the phasing out "as soon as possible" because they'll bring the G81's "as soon as possible". Nvidia would much rather make a GPU on a smaller nm die than a larger one.
When G80 hit, did we see Geforce 7900 products get faster in DX9? 0_o
That has absolutely nothing to do with my question. I'll go slow, let's look at it again:
Originally posted by: josh6079
Originally posted by: Nelsieus
But yes, I'm very sure that Dx10.2 will add new features / capabilities for game developers not supported by DX10 GPUs.
When you say "add new features" do you propose that those "new features" will increase the G80 / R600 performance with DX10 applications?
And you answered with, "When G80 hit, did we see Geforce 7900 products get faster in DX9? 0_o"



I'm asking you if you think the "added features" enclosed in DX10.2 will increase or decrease the G80's / R600's performance. Not whether the launch of a GPU increases or decreases the previous GPU's performance.
And please don't make such inappropriate comments as "what were you smoking..." I find it rather uncalled for.
It's just as valid a question as asking me what memo I got. By the looks of things you're simply arguing with comments of mine that I've since clarified for you.

[*] No, Vista is not going to phase out the G80. G81 will.

[*] No, I'm not basing what developers can do with DX10 nor what DX10 in itself can do off of one game.

[*] I'm basing what the current DX10 GPU's can do with DX10 off of what we've been shown thus far.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Considering how good games with great framerates (Gears of War) can be made with the Xenos processor, which is mightily inferior to the G80, and the fact that DX10 makes it easier to code for, a la dedicated consoles (not completely, but you get the drift), I feel as if the lasting power of DX10 GPUs will be more impressive than in any previous DirectX generation.
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: josh6079
Let's look at my comments again, since you're basing your whole argument off of my comments instead of outside evidence.
Aren't I supposed to be basing my argument off your comments?

Originally posted by: josh6079
What does this mean? I am not impressed with the DX10 GPU horsepower for DX10 applications. It says nothing about effecient coding, DX10 applications as a whole, or the how the developers handle DX10. Why you thought that is beyond me.
Which is why we brought up the fact that the preview you spoke of does little to showcase the full "horsepower" of the DX10 GPUs, and that basing your feelings off that one test was going to prove quite inaccurate.

So I'm not sure why you're starting to analyze your comments more closely. Maybe you're starting to realize some glaring flaws within them?

Originally posted by: josh6079Your reading comprehension failed again when you thought I was claiming that the G80 will be phased out when Vista hits. If you would have bothered to look at the other factors in that sentence, you would have realized that the time period I was discussing was Q1 of 07, not Janurary:
I never disputed the fact that you claimed Q1. I disputed the fact that G80 would be "phased out as soon as possible" [your words] during the Vista timeframe.

Lol, unfortunately broadening your "window" time period isn't going to save your argument. *tsk

Originally posted by: josh6079
"Considering the time frame that Vista gets released" (it's already out but that's beside the point) was one major thing that was going to happen in the time frame I was discussing. The other factors were the emergence of DX10 games and the March / April estimates concerning the G81.
March and April timeframes were for a high-end G80 part, aka 8850GTX. G81 is nowhere near scheduled for March or April, so get your facts straight before you start accusing others of "misinterpreting" your statements.

Originally posted by: josh6079When the G81 is launched, Nvidia themselves will put the G80 behind them as fast as they put the 7800's behind them when the 7900's came. That's why I mentioned the phasing out "as soon as possible" because they'll bring the G81's "as soon as possible". Nvidia would much rather make a GPU on a smaller nm die than a larger one.
Then apparently you were absent from the market during that time period. The Geforce 7800GTX and 7800GT existed for quite a long time alongside the Geforce 7900 parts. Nobody would buy them because the newer series had much better price/performance ratios, but they were most definitely available for anyone who desired them.

Originally posted by: josh6079That has absolutely nothing to do with my question.
And you answered with, "When G80 hit, did we see Geforce 7900 products get faster in DX9? 0_o"



I'm asking you if you think the "added features" enclosed in DX10.2 will increase or decrease the G80's / R600's performance.
And I'm explaining the lack of logic in that question by asking whether, when we saw G80 with "new features for DX10," that meant G70 got faster in DX9. The answer is no, but apparently you didn't get it that easily. *sigh, it really wasn't supposed to be a thinker, but I guess it proved a little difficult for ya. ^_^;;


Originally posted by: josh6079[*] No, Vista is not going to phase out the G80. G81 will.
No, ya think...:roll:

*Sigh, it's sleep time.

Nelsieus

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I never disputed the fact that you claimed Q1.
No, you just narrowed my comment down to concentrate on only a portion of the timeframe I was discussing.
Lol, unfortunately broadening your "window" time period isn't going to save your argument. *tsk
I haven't broadened any window. You just misread my initial claim and thought I was only speaking about January.
March and April timeframes were for a high-end G80 part, aka 8850GTX. G81 is nowhere near scheduled for March or April...
I seem to recall a thread with a poll in it asking other members what they were going to do (buy a G80 now, wait for R600, wait for the G80 refresh in March / April, I'm happy with what I have, or other)

Perhaps instead of calling it a G81 I should have said the G80 refresh.
Then apparently you were absent from the market during that time period. The Geforce 7800GTX and 7800GT existed for quite a long time alongside the Geforce 7900 parts. Nobody would buy them because the newer series had much better price/performance ratios, but they were most definitely available for anyone who desired them.
You forgot to add that they stopped production of 7800's too.

Yes, the G80 will still be available for purchase after its refresh hits, but that's irrelevant. The point is that the same situation will entice buyers to a G80 refresh more than the G80. The 8850GTX / G81 / whatever you wanna call it, is going to do to the G80 what you described: "Nobody would buy them because the newer series had much better price/performance ratios..."
Which is why we brought up the fact that the preview you spoke of does little to showcase the full "horsepower" of the DX10 GPUs, and that basing your feelings off that one test was going to prove quite inaccurate.
It's more accurate than your wishful method of hoping for the best. Can you provide a clip or other piece of evidence that will discard my doubts? Or do you just want to play semantics all day?
...when we saw G80, with "new features for DX10," if that meant G70 got faster in DX9.
So the move to DX10.2 will be just like going from DX9 to DX10 or from DX8 to DX9?

Is there going to ever be a DX11? 12? Can I have a link or are you just grabbing this from an orifice?
...*sigh, it really wasn't supposed to be a thinker, but I guess it proved a little difficult for ya. ^_^;;
The only thing I'm having difficulty with is understanding what your argument is.

You say I am making a premature estimate on what the G80 performance will be like for DX10. Of course it's premature, there aren't any DX10 games out. However I'm going off of what I have been given so far, whereas you have provided ZERO evidence of the contrary.

Instead all you've done is say that it's too early to make a call. That's fine, but don't expect me to withdraw my fears just because you believe it will all work out okay.

Do you think the G80 will be anything more than an introductory level for DX10? If so then why?

The bulk of my statement is that more users will use the G80 / R600 for supreme DX9 playability and some mediocre DX10 playability. I'm not going to rush into Vista until a service pack, nor buy some beta DX10 games until there are some solid patches, nor purchase a DX10 GPU just for DX10 Beta. That's why I'm not excited about them for DX10.

What about that do you disagree with?

edit: spelling / grammar
 

sandorski

No Lifer
Oct 10, 1999
70,131
5,658
126
Has there been a Top of the line card in the last 3ish years that was not replaced by something better in 3-4 months? No sense worrying about it, you can pretty much count on it.
 

dogin

Junior Member
Jul 29, 2002
11
0
0
For whatever comfort it is, the 8800GTX is outstanding with DX9 games and well worth the investment. I am using it on an OC'd FX57 system with the card at 650/2.0GHz, and it runs all games I have tried at max settings and the highest in-game resolutions, with the exception of COD2, which will get some stuttering at 2048x1536 unless you play with the settings. It seems best at 1600x1200 with all settings maxed. Oblivion could use 8800s in SLI to get the speed up, but plays at relatively good speed with all settings maxed on one card. FEAR EP is a whole new game with this card. And Doom III at Ultra High at 1600x1200 is fast and beautiful. This card replaced two 7800GTXs in SLI and outperforms them in all games. The R600 has been delayed already, so be happy with this card until your next build. I love mine!! The speed and image quality are leaps above the 7xxx series.

You do need a good Power Supply to run the card. I am using a PCP&C 850W and it runs fine. Make sure your PS is on the certified list.
 