Full Review: 9800GX2 vs HD3870X2!!!


Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
I will ***never*** spend $600+ on a video card. Not unless it wipes my a$$ for me and does my laundry. The 8800GT/GTS-g92 and 3870 HD/3870x2 are much better value propositions.

I would have been much more interested in seeing how the card did against the 8800U. We already know it should have a decent edge against the ATI card. I hope other, better-known sites actually bother to compare the two NVidia cards.

3DMarks are meaningless eye candy.

And of course NVidia is going to do everything they can to make sure their Crysis drivers are optimized. In several other results the differences are small enough to be meaningless for normal gaming usage.

Both cards seem to do very well at 2560x1600. Beyond 60 fps, you literally cannot see the differences. Of course the ATI card will likely come in second. But 85 fps vs. 100? Exactly what visual difference does that make? Exactly none. You physically cannot perceive the difference.
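To put rough numbers on that, here's a quick frame-time calculation (a back-of-the-envelope Python sketch; the fps figures are just the ones from this thread):

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 85, 100):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

# 85 fps is ~11.8 ms per frame; 100 fps is 10.0 ms per frame.
# The gap is under 2 ms per frame -- and a 60 Hz monitor caps
# what you actually see at 60 fps anyway.
print(f"gap: {frame_time_ms(85) - frame_time_ms(100):.1f} ms")
```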

Sorry, for my dollar ATI still comes out ahead at this point on the high end. The 3870x2's marginal utility (especially the GDDR4 variant, which I will very likely buy) is far superior. Now, given a choice between the 8800GT and the 3870XT at similar prices, it would be no contest. But either card is more than good enough for my purposes. It all comes down to who gives me the most for my money. JMHO.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: reviewhunter
Guys, do read the review again; it seems the author has updated the English version of the review.
http://lly316.blogspot.com/200...radeon-hd-3870-x2.html

I spotted these changes:

CoJ --> Although the 9800 GX2 is 1 fps faster than the AMD card, it was unable to run the benchmark with 4xAA turned on. The moment the benchmark began, it bounced right back to the desktop. Therefore, the score is recorded as zero.


UT3--> Correction:
Just like in the previous game benchmarks, the 9800 GX2 took the lead easily without AA.
But performance decreased drastically with 4xAA/16xAF turned on, resulting in a single-digit average fps. Thus the score is once again recorded as zero. (No errors were encountered here.)


WIC--> Again, when AA is turned on for the GeForce 9800 GX2, an error message appears. Thus the score is recorded as zero.

Do tell me if you spotted other changes.

Good update. That was not there when I made my last post.
 

reviewhunter

Member
Mar 4, 2008
79
0
0
Originally posted by: Dadofamunky
I will ***never*** spend $600+ on a video card. Not unless it wipes my a$$ for me and does my laundry. The 8800GT/GTS-g92 and 3870 HD/3870x2 are much better value propositions.

I would have been much more interested in seeing how the card did against the 8800U. We already know it should have a decent edge against the ATI card. I hope other, better-known sites actually bother to compare the two NVidia cards.

3DMarks are meaningless eye candy.

And of course NVidia is going to do everything they can to make sure their Crysis drivers are optimized. In several other results the differences are small enough to be meaningless for normal gaming usage.

Both cards seem to do very well at 2560x1600. Beyond 60 fps, you literally cannot see the differences. Of course the ATI card will likely come in second. But 85 fps vs. 100? Exactly what visual difference does that make? Exactly none. You physically cannot perceive the difference.

Sorry, for my dollar ATI still comes out ahead at this point on the high end. The 3870x2's marginal utility (especially the GDDR4 variant, which I will very likely buy) is far superior. Now, given a choice between the 8800GT and the 3870XT at similar prices, it would be no contest. But either card is more than good enough for my purposes. It all comes down to who gives me the most for my money. JMHO.

I guess at the very top of the range, products like the 88U & 98GX2 aren't meant to be price-sensitive. I can see AMD is playing the price war on the high and mid-range.

What 2560?
Over 100 fps? It's time to level up your AA!
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: nitromullet
Originally posted by: lopri
Originally posted by: BFG10K
The currently broken AA is alarming.
I can't seem to force AA using the Control Panel (8800 GT SLI). Is it a known issue?

Which drivers are you using? The last WHQL set I was using with my 8800GTS 512 had the same issue. I'm using the 174.31 (from guru3d) drivers now, and it seems to work.

What BFG is referring to is that in the review posted in this thread, the 9800GX2 failed to run with AA enabled for many of the benchmarks. They don't really make it clear what that means though... Were they just not able to force AA in the driver, or did it crash?

I actually have to amend this... I can't force AA either.

edit: I just checked again... it seems I can actually force AA (in UT3 at least) via the driver.

I picked up the 174.31 driver here:

http://www.start64.com/index.p...iew&id=1959&Itemid=107

...64-bit only of course.
 

schneiderguy

Lifer
Jun 26, 2006
10,788
76
91
Originally posted by: Rusin
Originally posted by: schneiderguy
I doubt nvidia can squeeze much more performance out of the GX2 with better drivers. They've had a few years to work on the G80 SLI drivers
Well, that's G80. They've done decent work with G92 and especially G94.

They are the same core, pretty much. G92 is just a 65nm die shrink.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
More results.

This thing gets slapped by a pair of 8800 GTs, so if you're going SLI you may as well get two of those.

I can't seem to force AA using the Control Panel (8800 GT SLI). Is it a known issue?
Current official nVidia drivers don't force profile AA and instead use the global value. Also, I have seen cases where G92-based cards refuse to run AA in certain situations where their older siblings work fine.

I don't think either of those scenarios applies to the review, though.

What is disappointing is that it's essentially G92 in SLI, something that has already been around for a while in the form of 8800 GT/GTS 512 SLI, yet there are still numerous driver issues.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: schneiderguy
Originally posted by: Rusin
Originally posted by: schneiderguy
I doubt nvidia can squeeze much more performance out of the GX2 with better drivers. They've had a few years to work on the G80 SLI drivers
Well, that's G80. They've done decent work with G92 and especially G94.

They are the same core, pretty much. G92 is just a 65nm die shrink.

The G9X cards do have higher shader clocks, twice as many texture address units per cluster, better color and z-compression, and better video decoding.
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Higher shader clocks don't make a huge impact on performance. Most DX10 apps are shader-hungry, so performance depends more on the number of shader processors than on their clock speeds.
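Rough math on that point (a Python sketch; the shader counts and clocks are the commonly quoted specs for these cards, so treat them as approximate):

```python
# Peak shader throughput scales with units x flops-per-clock x clock.
def peak_gflops(units: int, flops_per_clock: int, clock_ghz: float) -> float:
    return units * flops_per_clock * clock_ghz

# 8800 GTX (G80): 128 SPs @ 1.35 GHz, MADD+MUL = 3 flops/clock (theoretical peak)
print(f"8800 GTX:     {peak_gflops(128, 3, 1.35):.0f} GFLOPS")
# 8800 GTS 512 (G92): same 128 SPs, shader clock raised to 1.625 GHz
print(f"8800 GTS 512: {peak_gflops(128, 3, 1.625):.0f} GFLOPS")

# The ~20% shader clock bump buys at most ~20% peak throughput, and
# only in shader-bound workloads; adding shader units is what really
# moves the needle in shader-hungry DX10 titles.
```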
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,674
146
106
www.neftastic.com
If AMD executes a high-end price cut on the X2, dropping it to around $329, there would be absolutely NO reason to buy a GX2 in my opinion, especially with how well AMD has been executing with drivers lately.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Originally posted by: SunnyD
If AMD executes a high-end price cut on the X2, dropping it to around $329, there would be absolutely NO reason to buy a GX2 in my opinion, especially with how well AMD has been executing with drivers lately.

You think AMD will cut the price that much on the 3870X2s?

Edit - I could see a $30 or $50 price cut, bringing it to $420 or even $400, but a full $120 price cut seems a bit excessive. Of course, AMD does tend to cut the prices on their products a lot more than ATI did on their own. Nvidia rarely cuts their prices.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: schneiderguy

They are the same core, pretty much. G92 is just a 65nm die shrink.
And that G92 has over 70M more transistors, a different structure, etc.

But still, they did a good job with G94 (the 9600 GT scales better than the HD3870 in a two-card configuration) and some decent work with G92. Don't know about G80.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,674
146
106
www.neftastic.com
Originally posted by: Bateluer
Originally posted by: SunnyD
If AMD executes a high-end price cut on the X2, dropping it to around $329, there would be absolutely NO reason to buy a GX2 in my opinion, especially with how well AMD has been executing with drivers lately.

You think AMD will cut the price that much on the 3870X2s?

Edit - I could see a $30 or $50 price cut, bringing it to $420 or even $400, but a full $120 price cut seems a bit excessive. Of course, AMD does tend to cut the prices on their products a lot more than ATI did on their own. Nvidia rarely cuts their prices.

As aggressive as AMD is when the only move they have left is price... I could see them bringing it down to at LEAST $349.
 

nubian1

Member
Aug 1, 2007
111
0
0
I'll have to wait until more "Established" sites such as this one perform their own benchmarks with retail shipping cards before I come to a full conclusion, but there are a few things I can see already.
At this price, the vast majority of PC users will relegate this new Nvidia product to niche status. Something to talk about in forums like this one, something to speculate on, but NOT something to actually purchase. This isn't to say that there aren't those who will lay down that amount of coin for the card, but compared to the masses this will be a very small number.

AMD/ATI is not standing still, and given AMD's skill at lowering prices, even a $50 drop in the price of the 3870x2 will go far toward taking the wind out of Nvidia's new baby, at least from a viable-purchase standpoint.

The average, and even above-average, consumer very easily runs up against the law of diminishing returns, especially when their $$$ is on the line. IMHO, Nvidia's new release just makes the 3870x2 that much more appealing, as well as the obvious SLI possibilities of some of Nvidia's own much less expensive products.

 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
Quick question, would you need an SLI mobo to run the 9800gx2? Or will it run SLI on an X38 chipset

Since we know from the previous test that SLI needs to be enabled for maximum performance, yet Intel chipset based boards do not support SLI, we might be looking at a compatibility issue here.

It was then the GeForce 9800GX2 moved to the Striker II Formula platform, where it managed to perform all tests. On the other hand, the 3870 X2 maintained its stability on both platforms. We have to give credit to AMD on its compatibility advantage. In conclusion, we would advice those considering the 9800 GX2 to get a Nvidia chip based motherboard.

This is really ambiguous... I can't tell whether it will or won't work... (Driver update??)

They would "advice" (sic) me to get an nvidia chipset, but does that mean I would HAVE to?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: wired247
Quick question, would you need an SLI mobo to run the 9800gx2? Or will it run SLI on an X38 chipset

Since we know from the previous test that SLI needs to be enabled for maximum performance, yet Intel chipset based boards do not support SLI, we might be looking at a compatibility issue here.

It was then the GeForce 9800GX2 moved to the Striker II Formula platform, where it managed to perform all tests. On the other hand, the 3870 X2 maintained its stability on both platforms. We have to give credit to AMD on its compatibility advantage. In conclusion, we would advice those considering the 9800 GX2 to get a Nvidia chip based motherboard.

This is really ambiguous... I can't tell whether it will or won't work... (Driver update??)

They would "advice" (sic) me to get an nvidia chipset, but does that mean I would HAVE to?

I think that's the whole issue really. Apparently, the state the drivers are in at the moment requires an SLI motherboard to run the 9800GX2... Obviously, that somewhat defeats the purpose of making a dual-GPU/single-PCIe-slot card, unless you want to run two of them.

There is no technical reason why SLI wouldn't work on an Intel chipset, so NVIDIA has to actually prevent two-card SLI from working on an Intel chipset board via the driver. Most likely the drivers they used in this benchmark still had that artificial restriction enabled. NVIDIA is probably going to have to figure out a way to allow "SLI" for the 9800GX2 on an Intel, AMD, or VIA chipset while ensuring that normal two-card SLI still won't work on those chipsets.
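Presumably the gating works something like this; a hypothetical Python sketch of the check described above (the names and structure are invented for illustration, not real NVIDIA driver code):

```python
# Hypothetical sketch of the chipset gating a driver would need.
# None of these names correspond to actual NVIDIA driver code.
from dataclasses import dataclass

@dataclass
class Gpu:
    name: str
    onboard_sli_bridge: bool  # True for a dual-GPU board like the 9800 GX2

def sli_allowed(gpus, chipset_vendor):
    # A single dual-GPU card: the "SLI" link lives on the card itself,
    # so it has to be permitted regardless of chipset vendor.
    if len(gpus) == 1 and gpus[0].onboard_sli_bridge:
        return True
    # Classic two-card SLI: artificially restricted to NVIDIA chipsets.
    if len(gpus) == 2:
        return chipset_vendor == "nvidia"
    return False

gx2 = Gpu("9800 GX2", onboard_sli_bridge=True)
gt = Gpu("8800 GT", onboard_sli_bridge=False)
print(sli_allowed([gx2], "intel"))     # True:  what the GX2 driver must permit
print(sli_allowed([gt, gt], "intel"))  # False: what NVIDIA wants kept blocked
```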
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Rusin
Originally posted by: schneiderguy

They are the same core, pretty much. G92 is just a 65nm die shrink.
And that G92 has over 70M more transistors, a different structure, etc.

But still, they did a good job with G94 (the 9600 GT scales better than the HD3870 in a two-card configuration) and some decent work with G92. Don't know about G80.

The extra 70M transistors could be because of the integrated NVIO chip and the upgraded PureVideo engine.
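The commonly quoted transistor counts do line up with that (quick arithmetic in Python; both figures are the publicly cited ones, assumed here rather than taken from this thread):

```python
# Publicly cited transistor counts (assumptions; check the spec sheets).
G80 = 681_000_000  # 8800 GTX/Ultra; display I/O sits on a separate NVIO chip
G92 = 754_000_000  # 8800 GT / GTS 512; NVIO folded on-die

print(f"G92 - G80 = {(G92 - G80) / 1e6:.0f}M extra transistors")
# ~73M extra, i.e. "over 70M", consistent with absorbing NVIO
# and the upgraded VP2 PureVideo HD engine.
```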
 

lopri

Elite Member
Jul 27, 2002
13,212
597
126
Originally posted by: nitromullet

Which drivers are you using? The last WHQL set I was using with my 8800GTS 512 had the
same issue. I'm using the 174.31 (from guru3d) drivers now, and it seems to work.
169.25, I believe.

Originally posted by: nitromullet

I think that's the whole issue really. Apparently, the state the drivers are in at the moment requires an SLI motherboard to run the 9800GX2... Obviously, that somewhat defeats the purpose of making a dual-GPU/single-PCIe-slot card, unless you want to run two of them.

There is no technical reason why SLI wouldn't work on an Intel chipset, so NVIDIA has to actually prevent two-card SLI from working on an Intel chipset board via the driver. Most likely the drivers they used in this benchmark still had that artificial restriction enabled. NVIDIA is probably going to have to figure out a way to allow "SLI" for the 9800GX2 on an Intel, AMD, or VIA chipset while ensuring that normal two-card SLI still won't work on those chipsets.
Sharp analysis. I didn't think about that before. It makes perfect sense, because NV has to figure out how to *partially enable* (or partially disable, if you prefer) SLI on Intel chipsets with the GX2. Pathetic, but a legitimate business practice, I guess?

 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
Originally posted by: reviewhunter
Originally posted by: Dadofamunky
I will ***never*** spend $600+ on a video card. Not unless it wipes my a$$ for me and does my laundry. The 8800GT/GTS-g92 and 3870 HD/3870x2 are much better value propositions.

Sorry, for my dollar ATI still comes out ahead at this point on the high end. The 3870x2's marginal utility (especially the GDDR4 variant, which I will very likely buy) is far superior. Now, given a choice between the 8800GT and the 3870XT at similar prices, it would be no contest. But either card is more than good enough for my purposes. It all comes down to who gives me the most for my money. JMHO.

I guess at the very top of the range, products like the 88U & 98GX2 aren't meant to be price-sensitive. I can see AMD is playing the price war on the high and mid-range.

What 2560?
Over 100 fps? It's time to level up your AA!

That's definitely true. They're aimed at the elite enthusiast market, obviously. I'm certainly not in that category.

I noticed in that 'review' that several game tests were done at 2560x1600, which I thought was a good idea. But I'll be much more interested when this site and others do their own assessments.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: ManWithNoName
Don't know if it's been posted yet, but here is a new TweakTown review, with old drivers from January: 8800GT OC vs HD3870 X2 vs 9800 GX2...

http://www.tweaktown.com/artic...ntroduction/index.html

Unfortunately, those drivers (v173.67) are betas from January that basically only enable the card.

I think I've personally seen three driver notifications since these on the corp FTP site; these results are not representative of what you will see on launch day.




 