ATi 4870 X2 (R700) thread


CP5670

Diamond Member
Jun 24, 2004
5,535
613
126
Originally posted by: Jessica69
So very true. You gotta feel bad for all the suckers, errrrrr, consumers that bought the GTX280 already......visions of $$$ being flushed down the commode come to mind.

The pain........the pain of it all.........

I think the only people I've seen here with it are those of us who got it for $430 with that MS cashback discount a few weeks ago. I'm pretty happy with the card at that price, but it's not quite the killer deal that I thought it was at the time (when the 48xx performance was still unknown). I would have probably gotten a 4870 if I hadn't seen that deal.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: munky
Originally posted by: TC91
Originally posted by: Stoneburner
Munky, why is SLI just NOW a good idea? Has gtx280 resolved the common issues with SLI?

i think he's talking about SLI 8800gt's (or similar) being a better option than a single gtx 280 for the price and performance.

Yep. Up until now a single new high end card offered about 2x the performance of an older generation card (roughly what two of the older cards in SLI would give you), often with better image quality and features thrown in. So there was practically no reason for someone with the older card to get a second one as an upgrade to SLI. But now it actually makes sense to go SLI with cards like the 8800gt (and similar Ati cards in crossfire) for better performance and lower cost than a single gtx280, and you won't even be missing out on any new gaming features.

That's not true m8 and you know it.. Did you forget the 6-month cycles that we had prior to the G80 release?

G80 spoiled every one of us with its amazing performance compared to the previous generation..
It's one thing to say that Nvidia could have brought out the GT200 series a bit earlier and another to say that we were used to seeing 2x the performance thus far in GPU history..

I think the 4870 is a helluva gpu for the bang/buck, but this doesn't negate the fact that the GTX280 is a monster as well.. They just couldn't achieve the clocks they were hoping for in order for these cards to perform much better..
The GT200 series is not overpriced, speaking "RATIONALLY", for what it offers, for the following reasons.. It's just not a good perf/price competitor..

1. They didn't risk going to the 55nm process.. So we are talking about a very expensive SKU in general (transistor count, 512-bit bus, 1GB framebuffer etc.)
2. They didn't expect AMD to produce such a great gpu (4870)
3. They concentrated on making it a more complete "GPGPU" rather than achieving the highest performance levels they could.

I'm not saying that we should care about all of this, but we definitely need to take every aspect into consideration..
It's an undeniable fact that the 4870 cremates the GTX280 in terms of price/perf.. But..
I preferred to take the GTX280 route because I wanted the best single card solution right now to replace my long-lived 8800GTX.
I also believe that in future scenarios the GTX280 will prove to be more "futureproof" in many D3D10 cases, due to its more "rational" architecture.

Either way, I don't and won't care about SLI/XFIRE solutions. I wanted a single card and I decided that the big premium I had to pay for the GTX280 over the 4870 was worth it considering all the aspects.. I know the vast majority won't agree with me, and I find that perfectly rational, but I do believe there's a considerable consumer market for the GTX280 besides the "showoff of the best performing single card"..
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: jim1976
Originally posted by: munky
Originally posted by: TC91
Originally posted by: Stoneburner
Munky, why is SLI just NOW a good idea? Has gtx280 resolved the common issues with SLI?

i think he's talking about SLI 8800gt's (or similar) being a better option than a single gtx 280 for the price and performance.

Yep. Up until now a single new high end card offered about 2x the performance of an older generation card (roughly what two of the older cards in SLI would give you), often with better image quality and features thrown in. So there was practically no reason for someone with the older card to get a second one as an upgrade to SLI. But now it actually makes sense to go SLI with cards like the 8800gt (and similar Ati cards in crossfire) for better performance and lower cost than a single gtx280, and you won't even be missing out on any new gaming features.

That's not true m8 and you know it.. Did you forget the 6-month cycles that we had prior to the G80 release?
G80 spoiled every one of us with its amazing performance compared to the previous generation..
It's one thing to say that Nvidia could have brought out the GT200 series a bit earlier and another to say that we were used to seeing 2x the performance thus far in GPU history..
I got at least a doubling in performance with each generation of cards when I switched from a 9800p -> x800xt -> x1900xt.

I think the 4870 is a helluva gpu for the bang/buck, but this doesn't negate the fact that the GTX280 is a monster as well.. They just couldn't achieve the clocks they were hoping for in order for these cards to perform much better..
The GT200 series is not overpriced, speaking "logically", for what it offers, for the following reasons..

1. They didn't risk going to the 55nm process.. So we are talking about a very expensive SKU in general (transistor count, 512-bit bus, 1GB framebuffer etc.)
2. They didn't expect AMD to produce such a great gpu (4870)
3. They concentrated on making it a more complete "GPGPU" rather than achieving the highest performance levels they could.

I'm not saying that we should care about all of this, but we definitely need to take every aspect into consideration..
Those were all design decisions made by NV, which no doubt increased the costs, but that doesn't change the fact that in the face of competition the gt200 cards are overpriced. And not just competition from Ati, but also competition from their own midrange products.

It's an undeniable fact that the 4870 cremates the GTX280 in terms of price/perf.. But..
I preferred to take the GTX280 route because I wanted the best single card solution right now to replace my long-lived 8800GTX.
I also believe that in future scenarios the GTX280 will prove to be more "futureproof" in many D3D10 cases, due to its more "rational" architecture.
I've no reason to see the gt200 architecture as more rational or futureproof. It might be more efficient in terms of shader utilization, but when considering its performance in relation to other aspects like transistor count, die size and cost, it actually lags behind the competition.

Either way, I don't and won't care about SLI/XFIRE solutions. I wanted a single card and I decided that the big premium I had to pay for the GTX280 over the 4870 was worth it considering all the aspects.. I know the vast majority won't agree with me, and I find that perfectly rational, but I do believe there's a considerable consumer market for the GTX280 besides the "showoff of the best performing single card"..
I'm also generally opposed to multi-gpu setups compared to a single faster card, but with the prices of an 8800gt hovering around $150, I'd be much more inclined to get a second one for SLI as opposed to a gtx280 for $600.
 

etrin

Senior member
Aug 10, 2001
692
5
81
All I hear about is micro stutter on Crossfire (I assume on SLI'd Nvidia cards as well), or that a game either isn't supported or the drivers don't support it--NO SCALING, so two cards are useless. Do the X2 cards suffer from the same things?
At first I was going to get two 4850's, then the 4870's came out... yep, get two of them.. NOW I don't know what to do.

 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: munky
I got at least a doubling in performance with each generation of cards when I switched from a 9800p -> x800xt -> x1900xt.

That's why I talked about the 6-month cycles, something we have forgotten nowadays.. About your gpu "course" I'd have to check, I can't recall; 2x the performance seems a bit much..

I went 9800p -> 6800gt -> 7800gtx -> x1900xt -> 8800gtx -> gtx280.. It was only with G80 that I saw a REAL doubling in performance for a looong time..

Those were all design decisions made by NV, which no doubt increased the costs, but that doesn't change the fact that in the face of competition the gt200 cards are overpriced. And not just competition from Ati, but also competition from their own midrange products.

Yes they are, and that's why I said that we as consumers don't have to care about all this, but the fact remains that Nvidia has held the single card throne for a long time, and it will probably stay like this in the near future.. And this is what counts for me.. In other words.. Overpriced? Yes.. Cannibalized by their own products and the competition? Yes.. Worth it? For me, yes, despite the 4870's presence..

I've no reason to see the gt200 architecture as more rational or futureproof. It might be more efficient in terms of shader utilization, but when considering its performance in relation to other aspects like transistor count, die size and cost, it actually lags behind the competition.

And that is a small thing? If games are inclined that way, you know the outcome..


I'm also generally opposed to multi-gpu setups compared to a single faster card, but with the prices of an 8800gt hovering around $150, I'd be much more inclined to get a second one for SLI as opposed to a gtx280 for $600.

As I said.. Decisions.. I just don't factor in multi gpu solutions..
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
While y'all duke it out on the technical stuff, I'll just say..

This card's gonna whoop ass for decent coin...
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: jim1976
That's why I talked about the 6-month cycles, something we have forgotten nowadays.. About your gpu "course" I'd have to check, I can't recall; 2x the performance seems a bit much..

I went 9800p -> 6800gt -> 7800gtx -> x1900xt -> 8800gtx -> gtx280.. It was only with G80 that I saw a REAL doubling in performance for a looong time..
6-month cycles were refreshes; obviously you won't get a doubling of performance every 6 months. But in my case, the generational transitions did offer a doubling of performance.
Yes they are, and that's why I said that we as consumers don't have to care about all this, but the fact remains that Nvidia has held the single card throne for a long time, and it will probably stay like this in the near future.. And this is what counts for me.. In other words.. Overpriced? Yes.. Cannibalized by their own products and the competition? Yes.. Worth it? For me, yes, despite the 4870's presence..
It may be worth it to you, but if someone was looking for a cost-effective upgrade from an 8800gt, going SLI would be a smart alternative to a new gtx280, and that's something that wasn't true before - for example, getting a second 7800gt when the x1900xt was released. You simply did not get this much performance for the money in years past, and there are practically no new features this generation either.

I've no reason to see the gt200 architecture as more rational or futureproof. It might be more efficient in terms of shader utilization, but when considering its performance in relation to other aspects like transistor count, die size and cost, it actually lags behind the competition.

And that is a small thing? If games are inclined that way, you know the outcome..
Considering that Ati has 800 shaders as opposed to 240, I'd say they can afford to have somewhat lower utilization. If Ati decided to make a 1400 shader, 1.4B transistor gpu the size of g200, they would probably have the fastest card around, despite having lower utilization, simply because they have more performance per mm^2 and per transistor.
As I said.. Decisions.. I just don't factor the multi gpu solutions..
So in your case, the SLI logic doesn't apply. But from a consumer's point of view, it applies now a lot more than it did before.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: jim1976

And that is a small thing? If games are inclined that way, you know the outcome.


Buying for the future likely remains a waste. In this era of consoles, by the time we get real dx 10.1 or physics whatever.......... your current card will be worth $75 on ebay.


Still we might get 1 level with a bit of fog ..............
 

nanaki333

Diamond Member
Sep 14, 2002
3,772
13
81
i'm curious to see the power draw compared to just 2 4870 in crossfire. i would have 2x 4870 in my machine, but i want to see how this thing is on heat, power, and if it really will be ~15% faster than just 2 4870s in crossfire. i guess my 9800GX2 will have to wait a little longer to be upgraded....
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: nanaki333
i'm curious to see the power draw compared to just 2 4870 in crossfire. i would have 2x 4870 in my machine, but i want to see how this thing is on heat, power, and if it really will be ~15% faster than just 2 4870s in crossfire. i guess my 9800GX2 will have to wait a little longer to be upgraded....

Why have 2 x 4870 when you can get a 4870X2 for $499 in a couple of weeks? You could end up saving a lot of power, space, heat and money.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: nanaki333
i'm curious to see the power draw compared to just 2 4870 in crossfire. i would have 2x 4870 in my machine, but i want to see how this thing is on heat, power, and if it really will be ~15% faster than just 2 4870s in crossfire. i guess my 9800GX2 will have to wait a little longer to be upgraded....

I do believe the 4870x2 will be the fastest card available when it's released, but I don't think it'll be much faster than two 4870 cards in CF. There are a few reasons for this, but the main one could be that these 4870x2 cards may have lower core clocks compared to the 4870, for example 700MHz vs 750MHz. This may have to be done to lower power consumption and heat.
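As a very rough first-order sketch of why even a small clock drop helps, assuming dynamic power scales with frequency times voltage squared (the voltages below are purely illustrative, not the card's actual values):

# First-order dynamic power scaling: P ~ f * V^2 (leakage ignored).
def rel_dynamic_power(f_new, f_old, v_new, v_old):
    return (f_new / f_old) * (v_new / v_old) ** 2

print(rel_dynamic_power(700, 750, 1.26, 1.26))  # ~0.93: ~7% less from the clock drop alone
print(rel_dynamic_power(700, 750, 1.20, 1.26))  # ~0.85: ~15% less if the voltage drops a bit too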

Unless ATI surprises everyone and the R700 turns out to be two 45nm chips.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: munky

6 month cycles were refreshes, obviously you won't get a doubling of performance every 6 months. But in my case, the generational transitions did offer a doubling of performance.

Could be, I can't recall; I'm too bored to check benches and I can take your word for it, no worries.. Sometimes though you had to wait longer than a year for a new gen to appear.. So seen through this prism, one and a half years for double the performance (1.8x on avg) from G80 to GT200 is not that long a period..
Anyway, my point is that you never got a doubling in performance after a "refresh".. And G80 did that.. G80 was not 2x the performance of the 7800gtx but of the 7900gtx, if I recall correctly.. The prior "phenomenon gpu" was the 9700pro, which also had great longevity..

It may be worth it to you, but if someone was looking for a cost-effective upgrade from an 8800gt, going SLI would be a smart alternative to a new gtx280, and that's something that wasn't true before - for example, getting a second 7800gt when the x1900xt was released. You simply did not get this much performance for the money in years past, and there are practically no new features this generation either.

SLI/XFIRE have the known problems besides the advantages.. If you're ok with them, that's fine by me.. I just stated my opinion on the matter, which is that they are not a viable option for me..

Considering that Ati has 800 shaders as opposed to 240, I'd say they can afford to have somewhat lower utilization. If Ati decided to make a 1400 shader, 1.4B transistor gpu the size of g200, they would probably have the fastest card around, despite having lower utilization, simply because they have more performance per mm^2 and per transistor.

And if Nvidia had decided to risk it and go 55nm, we could have seen much better clocks and maybe GDDR5.. It would have outclassed the RV770 any day again, since this particular architecture clearly needs higher clocks to demonstrate its raw power.. But we can't talk like that.. It's pure speculation.. Besides that, theoretical GFLOPS are approximately 930 for the 280 and 1200 for the RV770.. Can you say that this is reflected in the benchmarks? A 280 is on average 10-15% faster than the 4870 nowadays, and if you narrow the benchmarks to heavy D3D9 and "D3D10" ones you get a 20-25% lead on average.. So on the one hand one can say that nvidia has the upper hand in future games.. On the other hand one can say that ATI has the upper hand with the 10.1 support.. The real truth can only be seen from the choices of the game devs, and the programming route they will take for their games..
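As a reference for where those theoretical numbers come from, here is a quick sketch; the 1296 MHz shader clock and the per-clock FLOP ratings (3 for GT200's MAD+MUL, 2 for RV770's MAD) are the commonly quoted specs, not figures taken from this thread:

# Rough derivation of the theoretical single-precision GFLOPS figures quoted above.
# Assumed specs: GTX 280 = 240 SPs at ~1296 MHz, rated 3 FLOPs/clock/SP (MAD + MUL);
# HD 4870 (RV770) = 800 SPs at 750 MHz, rated 2 FLOPs/clock/SP (MAD).
def gflops(shaders, clock_mhz, flops_per_clock):
    return shaders * clock_mhz * flops_per_clock / 1000.0

print("GTX 280:", gflops(240, 1296, 3))  # ~933 GFLOPS, the "approximately 930" above
print("HD 4870:", gflops(800, 750, 2))   # 1200 GFLOPS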
Anyway.. Whatever the truth is, you have to agree that for higher resolutions with AA, despite the very good AA scaling of the RV770, the GTX280 is a better solution. Bigger fillrate and more bandwidth are key elements for that, as you surely know..

So in your case, the SLI logic doesn't apply. But from a consumer point, it applies now a lot more than it did before.

I'm talking about myself and, I'm quite sure, from the pov of a significant portion of gamers..
Once again, SLI and its features/problems are not a viable option for me.. But it sure is for a lot of ppl..

 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: ronnn
Originally posted by: jim1976

And that is a small thing? If games are inclined that way, you know the outcome.


Buying for the future likely remains a waste. In this era of consoles, by the time we get real dx 10.1 or physics whatever.......... your current card will be worth $75 on ebay.


Still we might get 1 level with a bit of fog ..............


I know it is a waste.. That wasn't the reason I bought the GTX280.. I've been in this market since the early days of 3D acceleration, so believe me, I know that much..
Once again, the reason I bought my GTX280 was that it is the best single card available right now, and that's it.. Plain and simple.. If one finds that an inadequate excuse to justify the big premium, then I can't help with that.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Yep that makes sense, as it is available - and early adopters always pay a premium. :beer:
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: ronnn
Yep that makes sense, as it is available - and early adopters always pay a premium. :beer:

Not only that, but in 1-2 months I might be outclassed by the GT200b if that card is released as an answer to the 4870x2.. (I doubt we'll see a new x2 card from nvidia as well, but that's another story).. Anyway, in my book you can't go by this logic.. Do you need a gpu today? Can you find a card that suits your performance and financial needs? Go ahead and get it..
Waiting is a dangerous and risky game sometimes..
 

HumblePie

Lifer
Oct 30, 2000
14,665
440
126
I already purchased 2x 4850's in CF when Best Buy had their slip-up with them at $150. Still, I really want to see all the numbers for the 4870x2. I might have to sell off my CF cards for a 4870x2 and maybe add another later. Who knows.

As far as progression goes, I still remember every vid card purchase I've made to date. Although I purchased a few others to "try out" and immediately sold them on in builds for other people, these are the cards I "kept" and used in my main system.

Matrox (initial card) ($100) -> Voodoo 1($75) -> Banshee 32MB($60) -> Obsidian (2x Voodoo2's SLI'd on a single card) ($300) -> Savage 3D 128MB($160) -> TNT2 Ultra ($175) -> 9800p AIW ($130) -> x850xtpe ($125) -> 8800 GTX ($200) -> 4850's CF ($300)

As you can tell from my track record, I typically only update the GFX card every few years, when MAJOR performance jumps justify the price. Even then, I go for the best bang for the buck and the best deals. I listed the prices I paid for all my cards so you can see how I went about upgrading.

I'm not a die-hard fan of either brand. I've gone back and forth between many a company as I upgrade every 2 years or so, always for the best bang for the buck. Only ONCE did I go over the top, and that was with that Obsidian card right as the Voodoo2's came out from 3dfx. It was the first multi-GPU single-card solution ever, if memory serves me. I'm really surprised that it's taken a decade since that card for manufacturers to finally go back to multi-GPU solutions.

To me, if the 4870x2 next month ends up being the performance/price king and I can get it cheap while selling off my 4850's in CF for at or near the price of a 4870x2, then I'll be doing one last mini upgrade that will hold me off for a couple more years, I think.

Glad to see competition again like everyone else.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: jim1976
SLI/XFIRE have the known problems besides the advantages.. If you're ok with them, that's fine by me.. I just stated my opinion on the matter, which is that they are not a viable option for me..
And it IS a viable option for others, despite the associated pitfalls of multi-gpu rendering. Getting >gtx280 performance for $300 less would be a pretty strong incentive to try SLI, even for someone who generally prefers a single faster card.

And if Nvidia had decided to risk it and go 55nm, we could have seen much better clocks and maybe GDDR5.. It would have outclassed the RV770 any day again, since this particular architecture clearly needs higher clocks to demonstrate its raw power.. But we can't talk like that.. It's pure speculation.. Besides that, theoretical GFLOPS are approximately 930 for the 280 and 1200 for the RV770.. Can you say that this is reflected in the benchmarks?
Again, you're judging performance based on theoretical figures, like FLOPS and shader count. Yes, the rv770 can and does beat the gtx280 in some pure shader benches, like the 3dmark perlin noise. But even while the game performance doesn't match the theoretical figures, the rv770 still beats the gtx280 in performance/area even when you scale the gt200 to 55nm.
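A quick sketch of that performance/area point, using the commonly quoted die sizes (~576 mm^2 for GT200 at 65nm, ~256 mm^2 for RV770 at 55nm), the 10-15% performance lead discussed above, and an idealized (55/65)^2 area shrink; these are assumptions for illustration, not measured data:

# Hypothetical perf-per-area comparison, taking the GTX 280 as ~1.15x the HD 4870 in games.
GT200_AREA_65NM = 576.0  # mm^2, commonly quoted
RV770_AREA_55NM = 256.0  # mm^2, commonly quoted

# Idealized optical shrink of GT200 from 65nm to 55nm (real shrinks scale worse than this).
gt200_area_55nm = GT200_AREA_65NM * (55.0 / 65.0) ** 2  # ~412 mm^2

perf_per_mm2_gt200 = 1.15 / gt200_area_55nm
perf_per_mm2_rv770 = 1.00 / RV770_AREA_55NM
print(perf_per_mm2_rv770 / perf_per_mm2_gt200)  # ~1.4: RV770 still ahead even after the shrink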

A 280 is on average 10-15% faster than the 4870 nowadays, and if you narrow the benchmarks to heavy D3D9 and "D3D10" ones you get a 20-25% lead on average.. So on the one hand one can say that nvidia has the upper hand in future games.. On the other hand one can say that ATI has the upper hand with the 10.1 support.. The real truth can only be seen from the choices of the game devs, and the programming route they will take for their games..
Anyway.. Whatever the truth is, you have to agree that for higher resolutions with AA, despite the very good AA scaling of the RV770, the GTX280 is a better solution. Bigger fillrate and more bandwidth are key elements for that, as you surely know..
I've seen no evidence to suggest that the gt200 is better at DX10 code. In Crysis and WiC DX10 benches the 4870 is on par with the gtx260, while in Bioshock it even beats the gtx280. The rv770 might not be better at DX10, but likewise, it's not worse. At any rate, you're comparing a $300 card to a $600 card, and let's not forget the topic of this thread - the dual gpu r700, which is the real competitor to the gtx280.

I'm talking about myself and, I'm quite sure, from the pov of a significant portion of gamers..
Once again, SLI and its features/problems are not a viable option for me.. But it sure is for a lot of ppl..
... and that's where I leave this subject. Back to the thread topic.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Agreed we can discuss this further in another topic, I don't wanna spam this one..

The last sentence was my personal opinion from my first post in this topic..
You have to ask yourself though.. Is it more rational to compare an x2 card with a single card solution (just because they cost more or less the same) than to compare a $300 gpu with a $600 gpu?
Have we reached the day when an x2 card can be directly compared to a single card solution solely due to the price factor?
All I'm trying to say is that each option has its own advantages and disadvantages..
On the one hand you'll have better perf with an x2 card but with the known problems (see 4870x2), and on the other hand an unorthodox price/perf gpu BUT a single solution (see GTX280)..

Pick yer poison.. End of story ..

 

nonameo

Diamond Member
Mar 13, 2006
5,902
2
76
Originally posted by: jim1976
Agreed we can discuss this further in another topic, I don't wanna spam this one..

The last sentence was my personal opinion from my first post in this topic..
You have to ask yourself though.. Is it more rational to compare an x2 card with a single card solution (just because they cost more or less the same) than to compare a $300 gpu with a $600 gpu?
Have we reached the day when an x2 card can be directly compared to a single card solution solely due to the price factor?
All I'm trying to say is that each option has its own advantages and disadvantages..
On the one hand you'll have better perf with an x2 card but with the known problems (see 4870x2), and on the other hand an unorthodox price/perf gpu BUT a single solution (see GTX280)..

Pick yer poison.. End of story ..

It is true that you can't really compare dual card solutions to single cards, but in the end one entity, the consumer, will pick the true winner.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: Kuzi
Originally posted by: nanaki333
i'm curious to see the power draw compared to just 2 4870 in crossfire. i would have 2x 4870 in my machine, but i want to see how this thing is on heat, power, and if it really will be ~15% faster than just 2 4870s in crossfire. i guess my 9800GX2 will have to wait a little longer to be upgraded....

I do believe the 4870x2 will be the fastest card available when it's released, but I don't think it'll be much faster than two 4870 cards in CF. There are a few reasons for this, but the main one could be that these 4870x2 cards may have lower core clocks compared to the 4870, for example 700MHz vs 750MHz. This may have to be done to lower power consumption and heat.

Unless ATI surprises everyone and the R700 turns out to be two 45nm chips.


How about 778MHz?

AMD's team in Austin managed to use two R700 dual-GPU graphics cards (four RV770 chips) to get a score of X12515. This was done with four GPUs, while Nvidia uses three GTX280 boards to achieve a similar score. The R700 boards were clocked at 778 MHz, while the GDDR5 memory was clocked at 980 MHz QDR (that's 3920 "MHz", or 3.92 GigaTransfers/sec), we were told.
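To put that memory clock in perspective, a quick sketch of the implied per-GPU bandwidth, assuming each RV770 keeps the 256-bit GDDR5 interface of the single 4870:

# Per-GPU memory bandwidth implied by the quoted 980 MHz (3920 MT/s) GDDR5, on an assumed 256-bit bus.
base_clock_mhz = 980
transfers_per_s = base_clock_mhz * 4 * 1e6  # QDR -> 3.92 GT/s
bus_width_bytes = 256 // 8                  # 256-bit bus = 32 bytes per transfer
bandwidth_gb_s = transfers_per_s * bus_width_bytes / 1e9
print(bandwidth_gb_s)  # ~125 GB/s per GPU, vs ~115 GB/s on a stock 900 MHz 4870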
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: nitromullet
Originally posted by: chizow
Originally posted by: thilan29
the chip they use is different and so won't necessarily be x8/x8. And in all likelihood, if there was a bottleneck in that part of the card from the 3870x2, ATI would have replaced it this time around.

What do you think the bridge chip does? Unless the PLX gen 2 chip is able to make an x16 PCIE 2.0 slot into an x32 PCIE 2.0 slot then it will result in an x8/x8 PCIE 2.0 on a single slot.

It really depends on what the PCIe bridge does... It may not provide communication between the PCIe slot and both gpus. It's possible that only one gpu has communication with the PCIe slot and that the bridge chip provides communication between the two gpus. In this type of arrangement, you would essentially have dual sets of PCIe X16 lanes: one set of X16 lanes on the motherboard, and another set internally on the card.

From looking at the 16x/16x/4x/4x 4-way crossfire results:

http://forums.anandtech.com/me...=2201032&enterthread=y
http://www.tweaktown.com/artic..._crossfirex/index.html

I think it becomes pretty clear that for multi-gpu scaling beyond 2 or 3 gpus, motherboards are not going to provide enough usable bandwidth to facilitate the communication between all these gpus. Hopefully, ATI has recognized that and is working on a remedy with the 4870X2.

Originally posted by: magreen
That article and "analysis" by John Peddie was nothing but buying into AMD marketing. The 256-bit memory bus w/ faster ram? A stroke of genius? Gimme a break. Every techie knows you can do that. Making smaller chips that work together? They haven't done it yet. They've yet to release a card to compete with this generation's high end. The 4850/70 are very nice value cards. But don't count your X2 chickens before they hatch. Their proprietary bus instead of XFire is the one thing mentioned in that article that could put them ahead. But it's a complete unknown. "AMD bests nvidia with graphics chip strategy"? FUD.

...damn Peddiephiles.

That is a very valid point. This article is a tad bit premature in crowning winners. IMO, if 4870 X2 is to GTX 280 as 3870 X2 was to 8800GTX/Ultra, I wouldn't consider it a clear victory for ATI. That being said, the individual 4870 looks to be a much better card than the 3870 and it beats the GTX 260 in almost every benchmark I've seen, so it does look promising for ATI.

I don't really see much to the hype. After all, the HD 2900XT was as fast as the 8800GTS 640 (sometimes faster, sometimes slower) while the flagship 8800GTX was faster overall, so the HD 3870X2 was created to outperform it. History has repeated itself. The HD 4870 is as fast as or faster than the GTX 260 (sometimes slower in rare cases) and the GTX 280 is faster overall, so the HD 4870X2 is being created to outperform it. The only difference this time is that the HD 4X00 series has more consistent performance, sucks less power, has more features and is cheaper. The only HD 2900XT refresh that was worthy of praise was the HD 3870 (UVD, DX10.1, 55nm etc). I see the AMD multi-GPU approach to reaching the high end as the best in terms of price/profit ratio. Like you stated, if they use the proprietary bus instead of XFire, it will put them ahead.
 

toslat

Senior member
Jul 26, 2007
216
0
76
Originally posted by: chizow
Again, it doesn't matter what kind of interconnect is between the GPUs, if I had to guess that's what the CF sideport is for. You still run into the problem of the physical PCIE 2.0 slot the card is placed into. AFAIK you cannot make this larger than x16 if the physical traces and the slot itself are limited to 16 lanes. Even if you were passing everything through a single GPU and not splitting the PCIE bus you'd have ~2x data (approx. 4870CF) passing over an x16 bus rather than 1x data over an x8 bus. Normally you would think this sufficient but based on the Tweaktown review of 4850CF on P45 and X48 that's not the case.

Yes, PCIE 2.0 doubles bandwidth, but based on the Tweaktown review, running CF over two x8 PCIE 2.0 links starts showing degraded performance with 4850CF. Considering the 4870s would require more bandwidth, that might be more pronounced. The linked Tom's comparison does show decreased performance at PCIE 1.0a x8 (equivalent to PCIE 2.0 x4) but little/no change with PCIE 1.0a x16 (PCIE 2.0 x8) and PCIE 2.0 x16.


On being CPU limited:
A simple test using two quads at two different clocks (all else same) should put this to rest

On PLX 'bridge' and shared PCIe:
The PLX chips on the x870X2 cards do not split lanes, i.e. 16x switched is not the same as a fixed 8x:8x split. It is wrong to use the 8x:8x performance of 4850 CF to predict the performance of the 4870X2.

IIRC the 3870X2 has a 48 lane 3 port v1.1 PEX8547 switch (not a bridge) and thus each GPU gets full 16x access when it is granted and not 8x. The advantage of this is that since the PCIe connection to the north bridge carries bursty traffic (actual traffic pattern depends on the game), the impact of latency is reduced and bandwidth preserved, as opposed to the bandwidth bottlenecking that would occur with a fixed 8x:8x split.

In the 4870X2, the new switch (possibly PEX8648) will support v2.0 which will double the available data rate (though with slight latency increase).
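For reference, the standard PCIe bandwidth figures behind that doubling (a sketch; a 1.x lane signals at 2.5 GT/s and a 2.0 lane at 5 GT/s, both with 8b/10b encoding):

# Usable PCIe bandwidth per direction, from the standard per-lane signalling rates.
def pcie_bandwidth_gb_s(lanes, gen):
    gt_per_s = {1: 2.5, 2: 5.0}[gen]                    # raw signalling rate per lane
    gb_per_s_per_lane = gt_per_s * (8.0 / 10.0) / 8.0   # 8b/10b encoding -> 0.25 / 0.5 GB/s
    return lanes * gb_per_s_per_lane

print(pcie_bandwidth_gb_s(16, 1))  # 4.0 GB/s - PCIe 1.x x16
print(pcie_bandwidth_gb_s(8, 2))   # 4.0 GB/s - PCIe 2.0 x8 (matches 1.x x16)
print(pcie_bandwidth_gb_s(16, 2))  # 8.0 GB/s - PCIe 2.0 x16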

Also, it is rumored that inter-GPU traffic will improve. I'm not sure if inter-GPU traffic went through the NB (and memory) in the 3870X2, since the switch could functionally route traffic between the GPUs. So I expect, at the least, that inter-GPU traffic is switched at the PEX chip, and if possible they might have a common/duplicate memory area.

So there is no basis yet to state that the 4870X2 will be bottlenecked by the 'bridge', unless you can provide stats showing that the traffic pattern on the bus is sustained at >50% of a 16x link for a 4870 (single or CF), and even then show that (2x individual data - common data) exceeds 100% of a 16x link in CF.
 

Elcs

Diamond Member
Apr 27, 2002
6,278
6
81
Just installed my HD4870, haven't got an OS sorted yet, but damn that thing is hot!!!!!

24C ambient, open case. I could clearly hear the fan ramping up a notch during formatting/bios checking/OS pre-installation. It became too hot to touch; the heatsink itself feels much hotter than the 8800GT it replaces, which I finger-tested while the NVidia software told me the core was between 65-70C (idle).

Is anyone else finding that it's absolutely roasting hot? Can't wait to get it up and running properly, but I hope it doesn't need to ramp up constantly, or that I can at least keep it fixed at a lower-than-audible fan speed.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
It's because of the low fan speed. There are workarounds for this, if you can find them. Shouldn't be too hard tbh ;)
 