The R600 will equal the 8800GTX in performance


tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: munky
I expect the R600 to be faster than the 8800GTX, but since you already bought the GTX, be happy with what you have. At least you didn't shell out $1000 for a 7800GTX-512 a month before the X1900XTX came out.

Edit: And neither did I, I'm just bringing up an example.


Yeah, I remember the goons buying it... Rollo raved about how awesome they were, but he failed to mention that they were overpriced or that the X1900XT was so close to being released.
 

biostud

Lifer
Feb 27, 2003
18,403
4,966
136
Originally posted by: josh6079
Not to mention we haven't even seen the full power of the 8800GTX (DX10, constant buffer fetching?).
Yes we have.

When they showed the actual gameplay of the armored character running outside in the rain the frame rates were horrible, especially for there being no combat in that same environment.

I'm not all that excited about the R600 / G80 performance in DX10, and that preview explains why. Granted, it may just be unpolished game development, but I'd hardly think they would give a public preview of it if they felt it wasn't almost done. Not to mention we don't really know whether it had a GTS or a GTX powering it.

Not to mention the current state of Vista, nVidia drivers, or the lack of the same...
So they're probably running a beta game on the release version of Vista, with beta DX10 support and beta DX10 G80 drivers. My guess is that they're pretty much running in compatibility mode ATM.

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Not to mention the current state of Vista, nVidia drivers, or the lack of the same...
So they're probably running a beta game on the release version of Vista, with beta DX10 support and beta DX10 G80 drivers. My guess is that they're pretty much running in compatibility mode ATM.
This is also why I'm not in a rush to use a G80 or R600 for DX10. Most of their life-span in that part of the DX era will be spent in beta. By the time DX10 is more common, games are using it, and drivers are meant for it, there will be G81s and R700s. I'll buy one of these new GPUs for what they can do with what I already have, and to introduce me to what's coming.
 

biostud

Lifer
Feb 27, 2003
18,403
4,966
136
You can only wonder how they'll run DX10 software if this is true:

http://theinquirer.net/default.aspx?article=36427

Unfortunately, the Geforce 8800 still doesn't have a Vista driver - not even the basic one - and this has caused a few upset faces amongst folks running the Vista Business Edition.

No doubt the Vista driver will come in a different package to the other three, consigning the unified driver marchitectural idea to the wastebasket of history.

Both the Forceware 10 driver, with a few new tricks inside, and the Vista driver have been delayed for a while and we don't think we are going to see them anytime soon. Nvidia has a beta but it doesn't want to share it, even with its closest partners. µ
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
ATI generally performs better in more shader-intensive games, while Nvidia does well in OpenGL titles. Therefore, in those extreme situations you should see each card winning or losing by a lot more than +/-5%. But it's doubtful that the overall performance of the ATI card will significantly overpower Nvidia's G80, which already doubles the 7900GTX's performance.

More importantly, a situation could arise where the R600 delivers something like 150fps while the G80 delivers 125 in today's games. We simply need newer, more complex games like Crysis, Unreal 3, etc. to compare next-generation cards with one another. There is not much use in comparing the G80's prowess in something like Battlefield 2, which is an old game by today's standards and one that current cards handle just fine.

So until new games come out, esp with DX10 features, it's just guessing at this point as to the average gains R600 will bring, if any.
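As a quick back-of-the-envelope sketch of the kind of margin being tossed around here (the 150 vs. 125 fps figures above are hypothetical examples from the post, not real benchmark numbers):

```python
# Hypothetical frame rates from the post above -- illustrative only,
# not real benchmark results.
r600_fps = 150
g80_fps = 125

# Relative advantage of the faster card over the slower one.
advantage_pct = (r600_fps - g80_fps) / g80_fps * 100
print(f"R600 lead: {advantage_pct:.0f}%")  # -> R600 lead: 20%

# At frame rates this high, the absolute per-frame time difference is
# tiny -- one reason old games make poor comparison points.
frame_time_delta_ms = 1000 / g80_fps - 1000 / r600_fps
print(f"Frame-time gap: {frame_time_delta_ms:.2f} ms per frame")  # -> 1.33 ms
```

A 20% lead sounds large, but it amounts to just over a millisecond per frame at these speeds, which is why heavier next-gen titles are needed to show a meaningful difference.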
 

IndyJaws

Golden Member
Nov 24, 2000
1,931
1
81
Originally posted by: secretanchitman
Originally posted by: 40sTheme
Originally posted by: TBSN
the R600 has GDDR3 RAM, I believe...

Oh, you didn't hear from the Inquirer? R600 is going to have the unheard of, unreleased GDDR5. I heard it will liek totely opertae at 5GHz more easily tahn teh G80.

no no no. GDDR10 is already being secretly produced. its going to operate beyond 10Ghz levels and will pwn any card out there.

R600 will take G80's mother out for a nice seafood dinner and never call her again.


(sorry, couldn't resist)
 

Golgatha

Lifer
Jul 18, 2003
12,685
1,606
126
Originally posted by: lavaheadache
Originally posted by: Lonyo
Originally posted by: Cooler
Originally posted by: allies
Originally posted by: Cooler
Originally posted by: allies
I just bought an 8800GTX and I don't wanna have buyer's remorse in a month or two

Hopefully the R600 is at most only 5% faster; then I'd feel fine with my purchase... I'll get benchies up when the card arrives.

You also have the G81 refresh coming soon as well.

I don't suspect the refresh will be out before March. Stop trying to get me down!

And hopefully it's just cooler/quieter/draws less power, without much increase in performance
That depends on what the R600 is. If it beats the 8800 GTX by ~20%, you can bet the refresh will be another 7800 GTX 512.
Expensive, limited and hard to find?


LOL, you took the words out of my mouth

LOL!
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: munky
I expect the R600 to be faster than the 8800GTX, but since you already bought the GTX, be happy with what you have. At least you didn't shell out $1000 for a 7800GTX-512 a month before the X1900XTX came out.

Edit: And neither did I, I'm just bringing up an example.

I bought 2 of them just for the investment potential. :laugh:

edit: ok I'm fos
 

Elfear

Diamond Member
May 30, 2004
7,115
690
126
I'd be willing to bet the R600 will end up being 15-20% faster overall than the G80, with each winning in certain areas. ATI's had more experience with unified shaders, and they had more development time. ATI's been competitive with Nvidia for the last 5-6 years at least, and I can't imagine they wouldn't have something to beat the G80, which would just continue the cycle: Nvidia on top --> ATI on top --> and on and on.
 

Stoneburner

Diamond Member
May 29, 2003
3,491
0
76
I heard that the r600 has changed dramatically since the merger. It's now a k8l core combined with an etch-a-sketch.
 

thilanliyan

Lifer
Jun 21, 2005
11,912
2,130
126
Originally posted by: Stoneburner
I heard that the r600 has changed dramatically since the merger. It's now a k8l core combined with an etch-a-sketch.

We all know K8L can't change the tide.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Josh, so you're basing the DX10 performance on a game preview, of a game that isn't even finished yet? Of course it's going to be skippy in terms of smoothness, but eventually, when the game comes out, it will be fully optimised to the point where the game devs feel right releasing it. This means it's optimised to run on today's current high-end GPUs and other, older GPUs. (Unless the game devs don't like crunch time )

This time around, I'm not so sure the R600 is going to beat the G80, because this isn't your G70 vs R520 or NV40 vs R420. To me, the G80 is like the R300, where the performance/IQ gains along with the architectural overhaul (CUDA and all) compared to last gen are just too dramatic (I agree with ratchet on this one). For the R600 to beat the G80, it's going to have to be even more brilliant and impressive than the G80.

So, I would predict that it's going to be close. Very close.

 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,230
2
0
Originally posted by: Cookie Monster
Josh, so you're basing the DX10 performance on a game preview, of a game that isn't even finished yet? Of course it's going to be skippy in terms of smoothness, but eventually, when the game comes out, it will be fully optimised to the point where the game devs feel right releasing it. This means it's optimised to run on today's current high-end GPUs and other, older GPUs. (Unless the game devs don't like crunch time )

This time around, I'm not so sure the R600 is going to beat the G80, because this isn't your G70 vs R520 or NV40 vs R420. To me, the G80 is like the R300, where the performance/IQ gains along with the architectural overhaul (CUDA and all) compared to last gen are just too dramatic (I agree with ratchet on this one). For the R600 to beat the G80, it's going to have to be even more brilliant and impressive than the G80.

So, I would predict that it's going to be close. Very close.

Or have tremendous brute force
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: Cookie Monster
Josh, so you're basing the DX10 performance on a game preview, of a game that isn't even finished yet? Of course it's going to be skippy in terms of smoothness, but eventually, when the game comes out, it will be fully optimised to the point where the game devs feel right releasing it. This means it's optimised to run on today's current high-end GPUs and other, older GPUs. (Unless the game devs don't like crunch time )

This time around, I'm not so sure the R600 is going to beat the G80, because this isn't your G70 vs R520 or NV40 vs R420. To me, the G80 is like the R300, where the performance/IQ gains along with the architectural overhaul (CUDA and all) compared to last gen are just too dramatic (I agree with ratchet on this one). For the R600 to beat the G80, it's going to have to be even more brilliant and impressive than the G80.

So, I would predict that it's going to be close. Very close.

Agreed.

Since when does one game preview automatically determine how these next-gen cards handle DX10? Basing the argument that because it runs slow in one DX10 application it will in all DX10 software in general is a premature statement, and one that is rather futile at this point. It would be like saying the Radeon X1900PRO sucks at DX9 because it can't run Oblivion with max settings at a decent resolution.

As far as the R600 competing with the G80, I don't want to make a call yet. But I will say it's pretty evident the G80 probably caught ATI off guard, just like it caught most of us off guard as well. But I don't doubt ATI's ability to compete.
 

TBSN

Senior member
Nov 12, 2006
925
0
76
Wait, I think this is a stupid question, but is the R600 going to be DX10?
 

TBSN

Senior member
Nov 12, 2006
925
0
76
MY GOD.
Hellgate: London looks like one of the most visually appealing, and also one of the most fun, games we've had since Diablo II. When I first saw the theatrical trailer I was amazed by the graphics, and also by the setting of the storyline: some medieval/futuristic landscape filled with demons and undead. Then I saw the gameplay trailer, which made it look a little too Quake-like, with less strategy than in HL2, until I saw the Diablo-style inventory screens and the 3rd-person gameplay!
Sorry about the slightly off-topic rave about a game trailer; this just looks like it would be my favorite mix of PC game genres. The gaming is what we buy these cards for, right?

Of course, this is DX10, and that (Vista) hasn't been released publicly yet. Does anybody know if Vista will ship at the same time as DX10, or will DX10 come out before or after Vista is released?


P.S.: Does anybody know if Command and Conquer 3 is going to come out in DX10, or will it be DirectX 9? Also, can DX10 games be made for Xbox as well?
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: TBSN
MY GOD.
Hellgate: London looks like one of the most visually appealing, and also one of the most fun, games we've had since Diablo II. When I first saw the theatrical trailer I was amazed by the graphics, and also by the setting of the storyline: some medieval/futuristic landscape filled with demons and undead. Then I saw the gameplay trailer, which made it look a little too Quake-like, with less strategy than in HL2, until I saw the Diablo-style inventory screens and the 3rd-person gameplay!
Sorry about the slightly off-topic rave about a game trailer; this just looks like it would be my favorite mix of PC game genres. The gaming is what we buy these cards for, right?

Of course, this is DX10, and that (Vista) hasn't been released publicly yet. Does anybody know if Vista will ship at the same time as DX10, or will DX10 come out before or after Vista is released?

DX10 ships with Vista.

And I agree about Hellgate: London. It's one of the games on my highly anticipated list for the beginning of next year.

Nelsieus
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: josh6079

But, as a general rule of thumb, first cards for new DX versions don't perform that great. These cards are truly meant for outstanding DX9 performance with an introductory level of DX10 usage.

What about the Radeon 9700? That card was a beast with DX8 games, and it handled many DX9 games too.
 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: allies
Originally posted by: josh6079

But, as a general rule of thumb, first cards for new DX versions don't perform that great. These cards are truly meant for outstanding DX9 performance with an introductory level of DX10 usage.

What about the Radeon 9700? That card was a beast with DX8 games, and it handled many DX9 games too.

Not to be rude, but I don't think Josh has a clear understanding of the issue. These GPUs have been designed very closely with Microsoft's DX10 software teams, along with game developers. And while support for DX10 will always get better with each new generation, I don't think it's accurate to say that G80/R600 aren't "meant" for DX10. They're specifically meant for DX10, with the added bonus of bringing increased performance (and support) to DX9.

Nelsieus
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Josh, so you're basing the DX10 performance on a game preview, of a game that isn't even finished yet?
Of course not.

I'm basing the performance of a card on what we've been shown.
Of course it's going to be skippy in terms of smoothness, but eventually, when the game comes out, it will be fully optimised to the point where the game devs feel right releasing it.
The game can be "optimized" all it wants to be, but if it's a graphics hog, it's a graphics hog--look at FEAR.

In the Hellgate demo, I didn't see visual artifacts, crashes, or weird character bugs, which are the majority of what game developers fix in patches. They don't often release a patch just to increase performance. Normally it's simply to resolve glitches or add content.
This time around, I'm not so sure the R600 is going to beat the G80, because this isn't your G70 vs R520 or NV40 vs R420.
That's fine. I never claimed my opinion to be truth, and you very well may be right. I see this as another one of the 100+ speculative threads where we take the information we have been given thus far and produce an educated guess.
Since when does one game preview automatically determine how these next-gen cards handle DX10?
That depends on how much game developers push DX10's limits in said application.
Basing the argument that because it runs slow in one DX10 application it will in all DX10 software in general is a premature statement, and one that is rather futile at this point. It would be like saying the Radeon X1900PRO sucks at DX9 because it can't run Oblivion with max settings at a decent resolution.
Hardly. I never claimed to know the resolution the demo was run on, what settings they used, or what card(s) may have been running it--all we know is that it was a G80 of some sort.

I trust that they did the best they could to give as fluid a preview as possible, especially since it was a live preview to public enthusiasts and not some clip released on the net for download.
What about the Radeon 9700? That card was a beast with DX8 games, and it handled many DX9 games too.
Sure, it handled DX9 games, but not that well, especially when you started to crank up the IQ. In the long scheme of things it was what I think the G80 will be: a good introductory card for the beginning of a new DirectX API.
Not to be rude, but I don't think Josh has a clear understanding of the issue. These GPUs have been designed very closely with Microsoft's DX10 software teams, along with game developers. And while support for DX10 will always get better with each new generation, I don't think it's accurate to say that G80/R600 aren't "meant" for DX10.
My understanding of the issue is that if companies plan to keep making cards and turn a profit, there has to be a demand for those cards. If Nvidia wants to be able to entice people from a G80 to a G81, the G80 has to stop making someone happy at some point, thus not giving enough performance for a certain title. Considering the time frame in which Vista gets released, DX10 games emerge, and the G81 gets released, I just think the G80 will get phased out as soon as possible in favor of the smaller-process core of the G81.


 

Nelsieus

Senior member
Mar 11, 2006
330
0
0
Originally posted by: josh6079
I'm basing the performance of a card on what we've been shown.

But what we've been shown does little to represent the overall perspective, so I don't think it's fair to use it to make a generalized statement.


Originally posted by: josh6079
The game can be "optimized" all it wants to be, but if it's a graphics hog, it's a graphics hog--look at FEAR.

But some would argue FEAR wasn't optimized, so I don't see how that argument is relevant.



Originally posted by: josh6079
That depends on how much game developers push DX10's limits in said application.

Well, we've just started reaching DX9's limitations, so I think it's safe to say developers have quite a bit of leg room as far as DX10 development goes.


Originally posted by: josh6079
Hardly. I never claimed to know the resolution the demo was run on, what settings they used, or what card(s) may have been running it--all we know is that it was a G80 of some sort.

Then how can you base your comment of "I'm not all that excited about the R600 / G80 performance in DX10, and that preview explains why" if there are so many unknown variables within that preview, as you just acknowledged?


Originally posted by: josh6079
My understanding of the issue is that if companies plan to keep making cards and turn a profit, there has to be a demand for those cards. If Nvidia wants to be able to entice people from a G80 to a G81, the G80 has to stop making someone happy at some point, thus not giving enough performance for a certain title. Considering the time frame in which Vista gets released, DX10 games emerge, and the G81 gets released, I just think the G80 will get phased out as soon as possible in favor of the smaller-process core of the G81.

That doesn't seem very plausible; I'd have to disagree. You need to remember there will be several iterations of the DX10 spec, i.e. DX10.1, 10.2, and so on.

And besides that, you didn't see nVidia plaguing their Geforce 7900 series cards to get people to buy G80. So I'm not really buying into your logic.

Nelsieus



 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
But what we've been shown does little to represent the overall perspective, so I don't think it's fair to use it to make a generalized statement.
Which overall perspective? The capabilities of the G80 or the capabilities of DX10?
But some would argue FEAR wasn't optimized, so I don't see how that argument is relevant.
If you think FEAR wasn't "well-optimized" on XP near the end of the DX9 era, just wait until you see Hellgate on Vista at the beginning of the DX10 era.
Well, we've just started reaching DX9's limitations, so I think it's safe to say developers have quite a bit of leg room as far as DX10 development goes.
Of course they do, but my comment was concentrated on the G80's leg room with DX10, not DX10 itself.
Then how can you base your comment of "I'm not all that excited about the R600 / G80 performance in DX10, and that preview explains why" if there are so many unknown variables within that preview, as you just acknowledged?
I would only hope the unknown variables in the preview were demanding ones, such as high resolutions and acceptable AA levels. Otherwise, the situation will be even worse.

If they were low settings and resolutions, what would that say?
Originally posted by: Nelsieus
Originally posted by: josh6079
My understanding of the issue is that if companies plan to keep making cards and turn a profit, there has to be a demand for those cards. If Nvidia wants to be able to entice people from a G80 to a G81, the G80 has to stop making someone happy at some point, thus not giving enough performance for a certain title.
That doesn't seem very plausible; I'd have to disagree.
So, you're saying that as DX APIs mature and developers learn how to better utilize the Direct3D versions, games don't become more demanding and require better hardware?
You need to remember there will be several iterations of the DX10 spec, i.e. DX10.1, 10.2, and so on.
I don't seem to remember getting a huge increase in performance by updating DX iterations. Heck, sometimes I only update them when I install a newer game that has a later DX iteration coupled with it. Are you implying that DX10.2 will make or break the G80's DX10 performance?
And besides that, you didn't see nVidia plaguing their Geforce 7900 series cards to get people to buy G80.
That's not representative of my analogy, as there was no die shrink from the 7900s to the G80. The last time Nvidia did a die shrink in their flagship products, you most certainly did see a "plaguing" of their former GeForce series (aka 7800 ---> 7900).
So I'm not really buying into your logic.
It's just my opinion based off of what Nvidia and Flagship Studios have shown us. If you think the G80 / R600 will do well with some DX10 titles, show me some clips, because I'm most interested.

edit: spelling / grammar
 