ATI 4xxx Series Thread

Page 31

AshPhoenix

Member
Mar 12, 2008
187
0
0
Radeon HD 4850 beats 8800 GT

At least in 3DMark06

Someone told us that the Radeon HD 4850 will beat the 8800 GT by about 8-9 percent. This might not sound like much, but considering that the expected retail price of the Radeon HD 4850 will be less than some 8800 GT cards, it's getting interesting.

The 3DMark06 scores we heard were around 10,800 for the 8800 GT, with the Radeon HD 4850 coming in at about 11,760. Not a huge win, but this is with beta drivers on a reference GDDR3 card, so with some help from AMD's partners we might see some better scores.
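For what it's worth, the rumored scores do line up with the quoted figure; a quick sanity check using only the numbers from the post (both scores are unconfirmed rumors):

```python
# Check the rumored 3DMark06 lead of the HD 4850 over the 8800 GT.
# Both scores are rumored figures from the post, not confirmed results.
score_8800gt = 10800
score_hd4850 = 11760

lead_pct = (score_hd4850 - score_8800gt) / score_8800gt * 100
print(f"HD 4850 lead: {lead_pct:.1f}%")  # ~8.9%, matching the 8-9% claim
```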

We haven't seen a single partner-made card so far, but some partners will have shipped cards into the channel by the time AMD is ready to announce the new cards. And we're not talking a handful of cards; we're talking 1,000s if not 10,000s of cards.

Things might just start to look up a little bit for AMD, at least in the upper mid-range and lower high-end of the market, which should be quite a good money making segment these days.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
An 8-9% win in 3DMark06 is not that good. The HD3870 beat the 9600 GT in 3DMark06 by around 5%, but in games it lost by 10%.
And what will the price range of the HD4850 be?
 

Martimus

Diamond Member
Apr 24, 2007
4,488
153
106
Originally posted by: Rusin
An 8-9% win in 3DMark06 is not that good. The HD3870 beat the 9600 GT in 3DMark06 by around 5%, but in games it lost by 10%.
And what will the price range of the HD4850 be?

I have seen you write that a few times, but I had thought that the 3870 beat the 9600 in most benchmarks. I thought maybe I remembered wrong, so I typed "9600gt review" into Google and looked at the first hit, and lo and behold, the 3870 beat the 9600GT in 3 out of 4 games they tested. So I wonder where you get that the 9600GT is better for games? It is a nice card, and because it was cheaper than a 3870 it was a better value, but I think calling it better overall is a stretch.
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: Martimus

I have seen you write that a few times, but I had thought that the 3870 beat the 9600 in most benchmarks. I thought maybe I remembered wrong, so I typed "9600gt review" into Google and looked at the first hit, and lo and behold, the 3870 beat the 9600GT in 3 out of 4 games they tested. So I wonder where you get that the 9600GT is better for games? It is a nice card, and because it was cheaper than a 3870 it was a better value, but I think calling it better overall is a stretch.

It probably depends on if they test with AA on or off and which games. I thought the opposite, that 3870 was faster and checked Xbitlabs benchmarks. They test with a wide variety of games but usually with AA on (if possible). From their benchmarks, it seemed 9600GT was faster from a quick glance through.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: Martimus
I have seen you write that a few times, but I had thought that the 3870 beat the 9600 in most benchmarks. I thought maybe I remembered wrong, so I typed "9600gt review" into Google and looked at the first hit, and lo and behold, the 3870 beat the 9600GT in 3 out of 4 games they tested. So I wonder where you get that the 9600GT is better for games? It is a nice card, and because it was cheaper than a 3870 it was a better value, but I think calling it better overall is a stretch.
Well from these

Average numbers:
Firingsquad: 9600 GT - 2-10% faster than HD3870
Anandtech: 9600 GT - 5% faster than HD3870
Digit-Life: 9600 GT - 6-25% faster than HD3870
Overclockersclub: HD3870 - 5-10% faster than 9600 GT
Tom's Hardware: Without AA HD3870 wins by 5%, with AA 9600 GT wins by 5%
Hothardware: 9600 GT won by 1-4%
Techreport: 9600 GT won by 10-15%
Bit-Tech: 9600 GT won by 1-14%
Techarp: 9600 GT won
Muropaketti: 9600 GT won every game test by a fair margin (used gameplay for review)
Techpowerup: 9600 GT won by 4-8%
PCper: 9600 GT won by 10-18%
Expreview: HD3870 won by 6.75%
Trustedreviews: 9600 GT won by 0-17%

These are reviews including reference-clocked 9600 GT and HD3870 cards that can be easily found with Google. Links etc. if you insist... but since this is the ATi 4000 thread, I'd prefer PM.

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
It would be pretty damn pathetic if AMD's dual GPU card couldn't even beat a single GPU Nvidia card...
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: taltamir
It would be pretty damn pathetic if AMD's dual GPU card couldn't even beat a single GPU Nvidia card...

People need to get rid of this mindset.

R700 is dual-GPU because it was designed to be, not because AMD can't compete with a single-GPU card. If, years ago when AMD was designing R700, they had thought a huge, ridiculously expensive single-GPU solution was best, they would have designed a single GPU with 960SP/64 TMU/32 ROP. But they decided that was not the way to go, and we will see soon whether that gamble pays off.

If R700 > GT200, then R700 > GT200. There is no "well, it's a dual GPU so it doesn't count." That's just BS.

Originally posted by: Rusin
..in 3dMark

This isn't 3DMark06, where R600 had a huge advantage over G80. Current AMD cards suck in Vantage, so if R700 beats GT200 in Vantage it should beat it elsewhere as well. The only area where Vantage doesn't reflect gameplay is HD 3850 vs 9600GT... the two are pretty close in Vantage, but the 9600GT blows it away in 99% of games. This is probably a consequence of Vantage taxing shader resources so heavily that the 9600GT's texture advantage can't save it.

 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: Extelleron
Originally posted by: taltamir
It would be pretty damn pathetic if AMD's dual GPU card couldn't even beat a single GPU Nvidia card...

People need to get rid of this mindset.

R700 is dual-GPU because it was designed to be, not because AMD can't compete with a single-GPU card. If, years ago when AMD was designing R700, they had thought a huge, ridiculously expensive single-GPU solution was best, they would have designed a single GPU with 960SP/64 TMU/32 ROP. But they decided that was not the way to go, and we will see soon whether that gamble pays off.

If R700 > GT200, then R700 > GT200. There is no "well, it's a dual GPU so it doesn't count." That's just BS.

Originally posted by: Rusin
..in 3dMark

This isn't 3DMark06, where R600 had a huge advantage over G80. Current AMD cards suck in Vantage, so if R700 beats GT200 in Vantage it should beat it elsewhere as well. The only area where Vantage doesn't reflect gameplay is HD 3850 vs 9600GT... the two are pretty close in Vantage, but the 9600GT blows it away in 99% of games. This is probably a consequence of Vantage taxing shader resources so heavily that the 9600GT's texture advantage can't save it.

I expect NV to join AMD/ATI in that direction in the future. I'd say it just makes sense, and it does not have to stop with video cards.
If the scaling goes well, can you imagine 10 GPUs on one motherboard in a blade server?
10 TFLOPs.
Back on topic: R700's 2 TFLOPs are a good foundation for a very fast card.
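As a rough sanity check on that 2 TFLOPs figure: theoretical throughput is just shaders × clock × FLOPs per clock. This sketch assumes the rumored RV770 configuration (800 stream processors at 750 MHz, one multiply-add per SP per clock); none of these specs were confirmed at the time:

```python
# Back-of-envelope theoretical single-precision throughput for a
# dual-RV770 card, assuming rumored specs: 800 stream processors
# at 750 MHz, each doing one multiply-add (2 FLOPs) per clock.
shaders = 800
clock_hz = 750e6
flops_per_clock = 2  # one MAD = a multiply and an add

per_gpu_flops = shaders * clock_hz * flops_per_clock
print(f"single RV770: {per_gpu_flops / 1e12:.2f} TFLOPs")
print(f"dual-GPU R700: {2 * per_gpu_flops / 1e12:.2f} TFLOPs")
```

That lands in the ballpark of the ~2 TFLOPs being quoted, depending on the final clocks.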
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Extelleron, you are overlooking scalability.
If the R700 was "designed to be dual GPU", how come there are single GPU R700 parts? Nvidia will probably make a sandwich card as well... and there is still the issue of tri-SLI and quad Crossfire.
Unless Nvidia can only do tri-SLI with a G200, and AMD is gonna pull an octa-Crossfire with 4 individual 4870X2 cards for 8 cores that scale well, my comment about pathetic stands.

Of course I don't expect it to matter, because I expect the 4870X2 to be faster than a single G280. When it works, that is (multi-GPU still has many issues).

Oh, and isn't the biggest problem right now the lack of GDDR5 availability? Wasting half the RAM on a dual GPU setup is VERY bad if that is the case.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Wasn't there speculation that the new X2 series was viewed by the OS as a single card? And that memory was pooled? If these rumors are substantiated then it's not a waste of RAM.... we'll see when the time comes I guess.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Originally posted by: allies
Wasn't there speculation that the new X2 series was viewed by the OS as a single card? And that memory was pooled? If these rumors are substantiated then it's not a waste of RAM.... we'll see when the time comes I guess.
That was a rumour, but it went away. It still uses that PEX chip known from the HD3870 X2 and is a Crossfire setup.
 

chewietobbacca

Senior member
Jun 10, 2007
291
0
0
The rumor hasn't gone away, because use of the chip does not mean it's the same thing.

Basically, unless the chip is an MCM design (which it isn't), the chips still have to communicate with one another, so you're still going to need an intermediary. Whether or not it *IS* CF on the card, we don't know yet. But saying it is straight-up CF on the board just because there is a chip there is not correct.
 

Shortass

Senior member
May 13, 2004
908
0
76
Originally posted by: taltamir
Extelleron, you are overlooking scalability.
If the R700 was "designed to be dual GPU", how come there are single GPU R700 parts?

Because it's cheaper for them; it gives them an easy way to have price and product tiering without creating many variants of 'crippled' cards that waste resources. It's smart.

Nvidia will probably make a sandwich card as well... and there is still the issue of tri-sli and quad xfire.

Except that the size and heat likely produced by the 280 will make it an extremely unpleasant solution, perhaps unfeasible. I believe the GTX 260/280 thread speculated on this a while ago, but of course we'll see in a few weeks what we're dealing with.

Of course I don't expect it to matter, because I expect the 4870X2 to be faster than a single G280. When it works, that is (multi-GPU still has many issues).

Why does this matter? If it's ATI's best performing card versus nVidia's best performing card, the comparison is fair. If nVidia can't make a delish sandwich for us to devour yet, it's their fault they will lose the performance crown. They've had more than enough time to prepare for this.

Oh, and isn't the biggest problem right now the lack of GDDR5 availability? Wasting half the RAM on a dual GPU setup is VERY bad if that is the case.

It will suck for us, but it doesn't really hurt them all that much. Regaining the performance crown would be wonderful for them right now, considering the past few years. That said, I'm really only considering the HD4870 GDDR5 version, and the slower it is to release (and I mean release, not 2 hours of availability and 2 months of price-gouging madness) the worse, so I really hope the shortage is overstated.
 

biostud

Lifer
Feb 27, 2003
18,678
5,408
136
AFAIK the 4870x2 will show as one GPU, but it will still be running some sort of CF. Whether this improves performance compared to 2x 4870 in CF we must wait to see.
 

stepone

Member
Aug 25, 2006
86
0
0
The performance of the 4850 is getting more concerning the more we learn. After all, initial reports suggested it would be 5-10% quicker than a 9800GTX, then it was reported as on par with the 9800GTX, and now we're talking about it being only marginally faster than an 8800GT?!

Even if the 4850 is only $199 at launch, the problem is that you can pick up an Asus 8800GT 512MB with a better non-reference cooler for as little as $135 from Newegg:
http://www.newegg.com/Product/...x?Item=N82E16814121224

With some heavily OC'd cards in the $150-$160 range, the 4850 may struggle for the price/performance crown.
However, this doesn't take into account the improved media capabilities of the 4850. But I feel ATI was too overzealous in trying to differentiate between the 4850 & 4870, as they've apparently reduced the clocks a little too much: 625MHz for the 4850 as opposed to 850MHz for the 4870.
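To make the price/performance worry concrete, here is a rough perf-per-dollar comparison using the prices in this post and the rumored ~9% 3DMark lead from earlier in the thread (both the prices and the performance delta are assumptions, not benchmark results):

```python
# Rough perf-per-dollar comparison. Prices are the figures quoted in
# the post; the 9% performance lead is the rumored 3DMark06 delta,
# applied here only for illustration.
price_8800gt = 135.0   # Asus 8800GT 512MB street price
price_hd4850 = 199.0   # rumored HD 4850 launch price
perf_8800gt = 1.00     # normalized baseline
perf_hd4850 = 1.09     # assumed ~9% faster

ppd_8800gt = perf_8800gt / price_8800gt
ppd_hd4850 = perf_hd4850 / price_hd4850
print(f"8800 GT perf per dollar: {ppd_8800gt:.4f}")
print(f"HD 4850 perf per dollar: {ppd_hd4850:.4f}")
# At these prices the 8800 GT keeps the perf/$ edge unless the
# 4850's real-world lead is much larger than the rumors suggest.
```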

Still, you never know for sure until we actually see some benchies with finalised drivers & cards...
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: taltamir
Extelleron, you are overlooking scalability.
If the R700 was "designed to be dual GPU", how come there are single GPU R700 parts? Nvidia will probably make a sandwich card as well... and there is still the issue of tri-SLI and quad Crossfire.
Unless Nvidia can only do tri-SLI with a G200, and AMD is gonna pull an octa-Crossfire with 4 individual 4870X2 cards for 8 cores that scale well, my comment about pathetic stands.

Of course I don't expect it to matter, because I expect the 4870X2 to be faster than a single G280. When it works, that is (multi-GPU still has many issues).

Oh, and isn't the biggest problem right now the lack of GDDR5 availability? Wasting half the RAM on a dual GPU setup is VERY bad if that is the case.

There are single GPU RV770 parts because that is part of the multi-GPU plan.

There are two major reasons why multi-GPU cards make sense compared to single-GPU now and in the future.

The first is the die size of the chip, which of course affects yield tremendously. If you want to continue the current performance improvement that we see with GPUs every year (typically ~2x) then single-GPU setups cannot continue. The die size just keeps going up, despite smaller processes. G80 was 484mm^2 @ 90nm, GT200 is 576mm^2 @ 65nm... what's next? Is "GT300" going to be 700mm^2 @ 45nm?

With GT200 vs R700, you have a huge 576mm^2 chip with horrible yields or 2 small ~250mm^2 chips with great yields. Which one would you want to produce?
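The yield intuition here can be illustrated with the classic Poisson yield model, Y = exp(-D·A). The defect density used below is an arbitrary illustrative value, not a real TSMC figure:

```python
import math

# Toy Poisson yield model Y = exp(-D * A): bigger dies collect more
# defects, so yield falls off exponentially with area. The defect
# density D below is an assumed value chosen for illustration only.
D = 0.4                        # assumed defects per cm^2
area_gt200 = 576 / 100.0       # 576 mm^2 in cm^2
area_rv770 = 250 / 100.0       # ~250 mm^2 in cm^2

yield_gt200 = math.exp(-D * area_gt200)
yield_rv770 = math.exp(-D * area_rv770)
print(f"GT200-sized die yield: {yield_gt200:.1%}")   # ~10%
print(f"RV770-sized die yield: {yield_rv770:.1%}")   # ~37%
```

Under these assumed numbers the small die yields several times more good chips per wafer area, which is the thrust of the argument above.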

There is another benefit to having small GPUs: it allows AMD to produce their GPUs on the most advanced process available. Because of the immaturity of TSMC's 55nm process (relative to their 65nm process, at least), nVidia had to produce GT200 @ 65nm. Because AMD has a small GPU, they are able to use 55nm without any real consequences. This is also why you always saw a new process debut on midrange or low-end hardware.

The second reason is cutting design time and costs. If your high-end is 2 mid-range GPUs, then you do not have to design a totally new high end GPU. This reduces the design time and also the costs; you don't have to design two chips, tape out two chips, and produce two chips.

In the future I would expect to see both nVidia and AMD move in this direction, likely with non-software connection between the GPUs and a unified connection to a pool of memory. Whether R700 makes any move in this direction, we will see. But I would expect nVidia's next gen chip and R800 certainly will.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: taltamir
If the R700 was "designed to be dual GPU", how come there are single GPU R700 parts? Nvidia will probably make a sandwich card as well... and there is still the issue of tri-SLI and quad Crossfire.
Unless Nvidia can only do tri-SLI with a G200, and AMD is gonna pull an octa-Crossfire with 4 individual 4870X2 cards for 8 cores that scale well, my comment about pathetic stands.

Of course I don't expect it to matter, because I expect the 4870X2 to be faster than a single G280. When it works, that is (multi-GPU still has many issues).

Oh, and isn't the biggest problem right now the lack of GDDR5 availability? Wasting half the RAM on a dual GPU setup is VERY bad if that is the case.

Most of these points apply to the 3870X2. We don't know enough about the 4870X2 to assume that it will have the same issues of scalability, memory usage, etc... and until Nvidia shrinks the GT200 to a smaller, cooler chip, it's highly unlikely there will be a dual GT200 card.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
So you are saying the 48xx is a better card because it is a cheaper card to make and research? How does that work exactly? That doesn't make it better; that just means AMD has an easier time of making it better in the future, maybe.
Does anything in your counter-arguments (your as in the few people arguing) allow for the 4870X2 to be weaker than a single G280 and still make sense? Do you expect the G280 to be magically limited to 2-way SLI, and the RV770 to come in 4 x 4870X2 bundles with 99% scaling, or something of this sort?
When I said it would be pathetic, I meant IF it cannot compete even in that state. And it would be; that whole "people need to drop the multi-GPU is bad attitude" line is a myth. Not to mention "future improvements" in a type of technology don't make the CURRENT implementation worthwhile.

No, it is not going to be seen as a single card by the OS.
No, it is not going to avoid wasting half the RAM (or more if you have more GPUs).
No, it is not going to scale perfectly.
And no, it is most likely not going to be so weak as to lose in CF to a single Nvidia GPU (probably).
And if it is, it WILL be utterly pathetic, because everyone in the high end will own an NV, 2xNV, or 3xNV setup. Because if NV > 2xAMD, then 3xNV is most likely going to be much greater than 4xAMD (maybe AMD will get 8 cores working together... but again, scaling will be terrible).
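The scaling complaint can be made concrete with a simple model: assume each GPU beyond the first contributes only a fraction of its full performance. The 70% efficiency figure below is an assumed, illustrative number, not a measured CF/SLI result:

```python
# Simple multi-GPU scaling model: the first GPU counts fully, and
# each additional GPU adds only `efficiency` of one GPU's worth of
# performance. The 0.70 efficiency is an illustrative assumption.
def effective_speedup(n_gpus, efficiency=0.70):
    return 1 + (n_gpus - 1) * efficiency

for n in (1, 2, 4, 8):
    print(f"{n} GPU(s) -> {effective_speedup(n):.2f}x")
# Far from linear: under this model 8 GPUs deliver well under 8x,
# which is why "just add more cores" doesn't rescue weak scaling.
```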
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
If rumored prices are any indication, then it doesn't seem likely that the 4870X2 will be faster than the 280GTX. It would be nice to have some competition, but $500 << $650. NV is going to dominate the high end again this time. I'll be highly impressed if the 4870 is even close to the GTX 260, and the same for the 4870X2 with the 280 GTX.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Which is pretty sad.

Oh, to reiterate:
Judging a technology should be done on its current capability, not some imagined future ones. (Otherwise Netburst is godly, rather than the worst crap from Intel, ever.)
People are fully justified to judge multi-GPU setups harshly, because right now they suck. If they get better in the future, then opinions will change.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: taltamir
So you are saying the 48xx is a better card because it is a cheaper card to make and research? How does that work exactly? That doesn't make it better; that just means AMD has an easier time of making it better in the future, maybe.
Does anything in your counter-arguments (your as in the few people arguing) allow for the 4870X2 to be weaker than a single G280 and still make sense? Do you expect the G280 to be magically limited to 2-way SLI, and the RV770 to come in 4 x 4870X2 bundles with 99% scaling, or something of this sort?
When I said it would be pathetic, I meant IF it cannot compete even in that state. And it would be; that whole "people need to drop the multi-GPU is bad attitude" line is a myth. Not to mention "future improvements" in a type of technology don't make the CURRENT implementation worthwhile.

No, it is not going to be seen as a single card by the OS.
No, it is not going to avoid wasting half the RAM (or more if you have more GPUs).
No, it is not going to scale perfectly.
And no, it is most likely not going to be so weak as to lose in CF to a single Nvidia GPU (probably).
And if it is, it WILL be utterly pathetic, because everyone in the high end will own an NV, 2xNV, or 3xNV setup. Because if NV > 2xAMD, then 3xNV is most likely going to be much greater than 4xAMD (maybe AMD will get 8 cores working together... but again, scaling will be terrible).

I'm not saying that the production advantages have anything to do with the 4870 X2 being a better card. The 4870 X2 will be a better card if it performs better than the GTX 280, and that is all that matters. Yet some people, such as you, want to say "Well, it might be faster but it's dual GPU so it doesn't count." That is the mindset that needs to go away.

The 4870 X2 is not dual-GPU because AMD is throwing it together as a last-ditch effort because RV770 sucks. It is dual-GPU because that is the future and that is the way it was designed. RV770 and GT200 were never meant to be compared, so the comments that "if R700 doesn't beat GT200 it's pathetic" are not warranted. It's the same as comparing R600 to G80. AMD has taken a smarter approach this time and chosen high yield and low manufacturing cost over extremely low yields and high cost.

There is nothing wrong with the multi-GPU implementation on the HD 3870 X2, and there will be nothing wrong with the implementation on the HD 4870 X2. Anandtech themselves said that the HD 3870 X2 functioned for all intents and purposes like a single card, and with good driver support it sees good performance in pretty much every game.

The 4870 X2 will be much cheaper than the GTX 280 because it is so much cheaper to produce... it is rumored to be priced at ~$500, while the GTX 280 will debut at $649. nVidia can't afford to sell their top GT200 part for $499; it is too expensive to produce, and that would mean the GTX 260 would have to be <$400.

 