ATi 4870/4850 Review Thread

Page 9 - AnandTech Forums

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: NullSubroutine
Originally posted by: Sylvanas
Originally posted by: NullSubroutine
Does anyone know if you put it in a PCI-E 2.0 slot if both 6 pin connectors are required or if its just one? (Since the 2.0 delivers more power than the 1.1 slots do)

Yes you need 2 connectors regardless.

I had a similar train of thought up until a few weeks ago, when someone (I forget who it was, sorry!) pointed this out to me: PCI-E 2.0 is supposedly capable of 150w from the slot (there are news articles supporting this if you do a quick Google search), but in practice I don't think it actually does and nobody has implemented it. Have a look at PCI-SIG's documentation, in particular Q11. Nowhere does it state or allude that 150w from the slot is officially part of PCI-E 2.0.

No matter, most PSUs have at least two 6-pin connectors, and if not, the cards come with Molex adapters anyway (since both Molex and 6-pin PCI-E provide 75 watts).

Well, with my PSU I wouldn't have a problem supplying the power to one card, let alone two. The problem is I only have one 6-pin and one 6+2 (8-pin). I was trying to decide if I should wait for the R700 or get a new board and do CF. However, if you need four 6-pin connectors I can't physically do it, as I only have two. And since I just bought this $140 PSU last Sept. I don't really want to spend another $170 just to get two more 6-pins.

If it really requires two 6-pins (four for CF) on 2.0 boards, then I will have to wait for the R700 and just use my 6-pin and 8-pin to provide power.

You can get Molex-to-6-pin adapters if you need 4x 6-pins.
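For reference, the power-budget arithmetic behind the connector question can be sketched like this. It assumes the standard figures quoted above (75W from the x16 slot, 75W per 6-pin connector) and ignores the rumored 150W PCI-E 2.0 slot, per PCI-SIG's FAQ:

```python
# Rough PCIe power-budget math for the connector discussion above.
# Assumptions: 75 W from a PCIe 1.1/2.0 x16 slot, 75 W per 6-pin
# PCI-E connector; the hypothetical 150 W PCIe 2.0 slot is ignored.

SLOT_W = 75      # power available from the x16 slot
SIX_PIN_W = 75   # power per 6-pin PCI-E connector

def board_budget(six_pin_connectors: int) -> int:
    """Total power budget available to one card."""
    return SLOT_W + six_pin_connectors * SIX_PIN_W

# One 4870 with two 6-pin connectors:
print(board_budget(2))      # 225 W available to the card

# CrossFire: two cards, four 6-pin connectors total:
print(2 * board_budget(2))  # 450 W
```

This is why a card can draw more than the slot's 75W yet still need both connectors populated: the spec budgets, not measured draw, determine the requirement.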
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: BFG10K
And in titles where it isn't showing high performance but CF/SLI is significantly higher? What then? Are you claiming a CPU bottleneck is responsible for the GTX280 which magically doesn't affect multi-GPU?
First of all, I never said there were CPU bottlenecks or frame caps in every instance and every game, so get that ridiculous notion and line of questioning out of your head. In games where single-GPU is frame capped/sync'd/smoothed I think it is possible for SLI/CF to override that function and exceed the cap whether by drivers or the game itself.

One example would be Assassin's Creed, which we know for sure is capped normally, yet SteelSix posted screenshots of frame rates that far exceeded capped frame rates. AT's review also indicated capped frame rates for AC which isn't uncommon as different review sites often have different results.

But again you can't claim that if CF/SLI are faster. You also can't claim that if the graphs aren't flat-lining, of which there are numerous examples.
Again, in cases where CF/SLI are only a few FPS faster, it's still valid. Same for single GPU with frame capping or CPU bottlenecking. For instance, if a game is limiting/capping FPS to 60 and you see a spread of only a few FPS between all the parts, it's obvious that capping is going on, and the difference in FPS is just the slower parts spending more time below 60 FPS than the faster parts. But that's not necessarily indicative of gameplay, and it's definitely not a good gauge of potential performance, since the faster parts (and SLI/CF) are capped, limiting your maximums. This is very different from uncapped averages, where you might see spreads between 90 and 30 FPS that come to an average close to 60. Unfortunately there's no way to tell from a review unless they graph the frame dumps or you see it first-hand. Some sites do disclose any capping or smoothing, but it's pretty easy to gloss over, like in the AT review.
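The compression effect described here can be illustrated numerically. This is a hypothetical sketch; the per-sample FPS traces below are invented for the example, not taken from any review:

```python
# Hypothetical illustration of how a 60 FPS cap compresses benchmark
# spreads between fast and slow parts. The FPS samples are invented.

CAP = 60

def capped_average(uncapped_fps):
    """Average FPS after clamping each sample to the cap."""
    capped = [min(f, CAP) for f in uncapped_fps]
    return sum(capped) / len(capped)

fast_part = [90, 120, 55, 110, 50]   # high uncapped potential
slow_part = [62, 70, 48, 65, 45]     # much slower uncapped

print(capped_average(fast_part))  # 57.0 -- only 2 samples below the cap
print(capped_average(slow_part))  # 54.6 -- more time spent below 60

# Uncapped, the same parts would average 85.0 vs 58.0 FPS: the cap
# hides most of the real performance gap, as described above.
```

The capped averages land within a few FPS of each other even though one part is vastly faster uncapped, which is exactly the pattern being argued about.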

I'm not sure what examples you were looking at. While there were plenty that agreed with you, plenty did not:

http://www.firingsquad.com/har...performance/page12.asp

I'm not sure how anyone can claim CPU limitations, frame smoothing or a framecap is responsible for those figures. In Bioshock the 4870 is faster than the GTX280 without AA, no two ways about it, and you can see CF is significantly faster with the gap widening as the resolution increases. This is GPU bottlenecking 101.

Furthermore there is no flat-lining at or near 62 FPS or the refresh rate like you claim.
And once again, I'm not claiming CPU bottlenecking or frame rate capping in every game, and certainly not at every resolution. As I said originally, what I found interesting was that there was clearly more CPU bottlenecking occurring at higher resolutions, even at 16x10 or 19x12, which in the past were considered high resolutions.

Perhaps, but we aren't talking about those situations. We're talking about 159.8 vs 78.9 which is a vast change.
No, you are talking about those situations, and I'm not disagreeing in those cases anyway. But what if you have a 4870 posting 90 FPS, a 4850 posting 75 FPS, 4850 CF posting 130 FPS and 4870 CF posting 131 FPS, along with a 9800GX2 posting 121 FPS, a GTX 280 posting 111 FPS, etc.? Pretty clear there is CPU bottlenecking going on, with the majority of the differences attributable to the slower parts spending longer durations at lower FPS. Sure, there is some difference, but is it indicative of how fast the parts really are?

You're basically saying "well, the 4870 isn't faster than the GTX280 because in the situations it is, it's because of CPU limitations or [insert reason X]. Likewise multi-GPU isn't faster, it's micro-stutter".
No, in the situations the 4870 is faster, it'd be because there are no CPU limitations. Likewise, in the situations the GTX 280 is faster than the 4870, it's because there clearly are no CPU limitations. I'm referring to situations where the 4870, GTX 280 and every other single-GPU solution are within 5 FPS or 10%, like AC, Witcher, and Crysis in AT's review.

That argument is nothing more than green propaganda.

Tell me, when the GTX280 is faster than the 4870 do you also chalk that up to CPU limitations or other nonsensical reasons? Or how about when the GTX280 is faster than the 8800 Ultra? Is that also not really faster using your reasoning?
The only nonsensical reasons I see are the ones you're inventing to prove points I never claimed.

I was heavily involved in that thread and I produced numerous graphs. But I can tell you that the framerate increase here can't be explained by micro-stutter. In fact micro-stutter is totally irrelevant to this argument since multi-GPU cannot provide a performance gain to begin with if there's a bottleneck elsewhere.
Did you even read annihilat0r's post and methodology? Micro-Stutter thread, 3rd to last post.

annihilat0r:
It's obvious that every third frame sees a jump from around 50 FPS to around 150 FPS. I don't think I need to tell you that in this situation your eye will see the fluidity of a 50 FPS system with some stuttering (caused by the super-fast third frames). However, the reported frame rate will be 1000*(63-48)/(981-753) = 66 FPS.

So all hardware sites will take this result and, I'm sorry but, stupidly compare it to non-AFR single-GPU frames and say "wow, our FPS increased from a single 8800GT's 40 to 66 when we plugged in another 8800GT!!" Which is obviously nonsense. I can't believe that after all the awareness raised by these kinds of threads in hardware forums, nearly no hardware site mentions this in their reviews, including Anandtech, which I wouldn't normally expect such a thing from.
Again, this is classic micro-stuttering, and it also shows how multi-GPU can inflate FPS because the GPUs are rendering frames at irregular intervals.
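annihilat0r's arithmetic (1000*(63-48)/(981-753) = 66 FPS) can be reproduced with a short sketch. The frame-time pattern below is hypothetical, chosen to mimic the AFR micro-stutter he describes: two slow frames (~50 FPS) followed by one very fast frame (~150 FPS):

```python
# Hypothetical AFR micro-stutter trace: two ~20 ms frames (50 FPS)
# followed by one ~6.7 ms frame (150 FPS), repeating.
frame_times_ms = [20.0, 20.0, 6.7] * 5   # 15 frames

# Same formula as the quoted post: 1000 * frames / elapsed_ms.
reported_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
print(round(reported_fps))   # 64 -- what a FRAPS-style counter reports

# Perceived smoothness is bounded by the slow frames:
worst_case_fps = 1000 / max(frame_times_ms)
print(worst_case_fps)        # 50.0 -- closer to what the eye tracks
```

The counter reports ~64 FPS, but two out of every three frame intervals are 50 FPS intervals, which is why the reported average overstates the perceived fluidity.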

I never said you were lying, I asked you to provide recent benchmarks of it in action, otherwise it's irrelevant.
No, you dismissed it as if it were a problem that didn't exist, when it clearly does. It's not irrelevant, because it shows devs are implementing methods of frame capping to normalize performance that you may or may not know about, and even if you did, may or may not be able to turn on or off. This might also explain how some sites can get such drastically different results from others. I asked Derek about some of his results in the 4870 feedback and he said they no longer force Vsync off (due to his findings with Crysis, I'm sure). Maybe some reviewers are forcing Vsync off and getting better results in some games as a result. It's clear there is considerable frame capping and/or CPU bottlenecking in Assassin's Creed, Witcher and Crysis up to 1920x1200 in AT's review. I've seen a few others as well from different sites (once again, I'm not claiming CPU limitations in every title and resolution, and certainly not in Bioshock, which has always run better on ATI hardware).

Sure, but I've provided Bioshock examples that demonstrate no such cap is in effect. Again you need to provide real examples or stop dismissing benchmarks on the basis of fictional hypothetical situations.
I never dismissed any benchmark; you're coming up with examples that I never mentioned, and I'm using examples I came across to make my points. At no time did I ever say or imply that I found CPU limitations or frame capping in every instance, but you're clearly assuming that in your arguments, which is ridiculous.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: SlowSpyder
Originally posted by: JPB
Grabs a piece of the HD 4800 pie

Gainward goes ATI


The green-loving Gainward has ditched its Nvidia exclusivity and has prepped its HD 4800 series graphics cards. The reason is probably as simple as can be: DAAMIT has great products with its HD 4800 series and Gainward simply wants to be a part of it.

Gainward is going to announce reference designed HD 4850 and HD 4870 cards. We have seen this happen when Nvidia launched its 8-series cards, and this time it's the other way around. Let's just hope, for Nvidia's sake, that Gainward is the only "Nvidia exclusive" partner that will announce the HD 4800 series.

Gainward's mothership, a big graphics company that starts with a P, has been doing ATI for a while, and since Gainward is now run by an ex-ATI chap from Taiwan, it kind of makes sense that this would happen.

So Gainward finally went DAAMIT, leaving only a few to be Nvidia-only.

I'd love an EVGA KO SSC whatever 4870 with a stepup.

JPB, nice sig.

eVGA isn't going to leave Nvidia. I could see some of the smaller guys doing it as G92(b) becomes a tougher sell, but the top guys (BFG and XFX) will get preferential treatment and inside info on upcoming releases that would keep them in the fold even if they weren't so financially tied to Nvidia.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
chizow, you need to go back and read some of nRollo's posts. We all KNOW that he's biased, he got perma-banned in years past, he has Nvidia info in his signature...yet most of us are more apt to hear positive things come out of his mouth about ATI than yours. Frankly, your clear bias is a detriment to the forums.
 

Compddd

Golden Member
Jul 5, 2000
1,864
0
71
Originally posted by: bryanW1995
chizow, you need to go back and read some of nRollo's posts. We all KNOW that he's biased, he got perma-banned in years past, he has Nvidia info in his signature...yet most of us are more apt to hear positive things come out of his mouth about ATI than yours. Frankly, your clear bias is a detriment to the forums.

I agree.
 

Foxery

Golden Member
Jan 24, 2008
1,709
0
0
Can we get this thread back on track? The amount of personal blogging going on here is inane.

Now, when do the board partners first receive engineering samples to work with? The heat problems and very low quality of the reference coolers for both cards are surprising, yet the first report I've seen of anyone selling good ones is from Fudzilla, and that's still a month away. Considering the PCB size and layout are reportedly the same as the 3800s', what's the holdup?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
The 4870 is back OOS at Newegg. I hate it when they do this; the Sapphire still shows in stock until you click on it...then you get the "auto-notify" option.
 

CP5670

Diamond Member
Jun 24, 2004
5,534
613
126
Regarding the UT3 frame smoothing issue, that feature does indeed limit the framerate to 62fps. I just turned it on and checked it to make sure. It doesn't quite work as you would expect though and seems to create an effect very similar to AFR microstuttering. The game subtly speeds up and slows down all the time, causing an apparent choppiness even when the framerate is at the cap. I remember it struck me immediately when I tried the demo, and they kept it in the full game too. It's easy to disable in UT3's ini files, but there is no option to do it in the game itself.

At the same time, I've only seen this in UT3 itself, not in any other game based on that engine. UT3 may also be automatically disabling it when a benchmark is run, as I don't recall seeing any review mention such a cap.
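For anyone who wants to try disabling it, the frame-smoothing setting lives in UT3's engine ini file (commonly UTEngine.ini). The key names below are from memory of typical Unreal Engine 3 configs and may differ slightly between builds, so treat this as a sketch rather than gospel:

```ini
; In UTEngine.ini, under the engine section ([Engine.GameEngine] in UE3)
bSmoothFrameRate=FALSE      ; TRUE by default; FALSE removes the cap
MinSmoothedFrameRate=22
MaxSmoothedFrameRate=62     ; the 62 FPS ceiling CP5670 measured
```

Setting `bSmoothFrameRate=FALSE` is the usual way to get uncapped framerates, which would also explain why reviews running with it disabled never show the 62 FPS plateau.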
 

teatime0315

Senior member
Nov 18, 2005
646
0
0
Originally posted by: Compddd
Sounds like you're a student, how did you afford two GTX 280s lol

Drug money? haha
I can't wait to get a 4870... I think one should last me for a while considering I'm still on an Nvidia 6800gs
 

Lugaidster

Junior Member
Apr 24, 2007
3
0
0
There is something I've been wondering, and I don't know if this is the right place to ask. Suppose I have a motherboard with two PCI-Express slots (like a P35-based mobo), one 4x-in-a-16x PCIe slot and one PCIe 16x slot, and I put a Radeon HD 4870 in the big one and a GeForce 9800 GTX in the other: will I be able to use PhysX on the GeForce while rendering a game on the Radeon? Is this at all possible? I ask since I already have a GeForce and would like to get the Radeon. This is something I think the review lacked, since PhysX software is already out.

Regards,
Lugaidster
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: Cookie Monster
Wait, why would one sell two GTX 280s for 2 HD4870s? Did i read something wrong?

Yeah. He's gonna get 2 4870X2s.

Also, if I made the mistake of buying $1300 worth of video card equipment, I'd try to recoup my losses by selling it and buying $600 worth of equipment that performs about the same.
 

Elcs

Diamond Member
Apr 27, 2002
6,278
6
81
How big is the 4870 compared to other cards?

I know it's a dual-slot solution, but is it one of those monster-long cards?
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
Originally posted by: Lugaidster
There is something I've been wondering, and I don't know if this is the right place to ask. Suppose I have a motherboard with two PCI-Express slots (like a P35-based mobo), one 4x-in-a-16x PCIe slot and one PCIe 16x slot, and I put a Radeon HD 4870 in the big one and a GeForce 9800 GTX in the other: will I be able to use PhysX on the GeForce while rendering a game on the Radeon? Is this at all possible? I ask since I already have a GeForce and would like to get the Radeon. This is something I think the review lacked, since PhysX software is already out.

Regards,
Lugaidster
Hi,

Welcome to AT Forums.

What you're asking is, unfortunately, not currently possible (nor desirable). Mixing an ATI card and an NV card in one system is a very difficult project, if not downright impossible. This is especially true with DX10 and Vista.

Plus, PhysX is of questionable value at this time. (It only works under 3DMark, AFAIK.) P35 isn't a terribly efficient chipset when it comes to multiple PCIe lane configurations, either.
 

lopri

Elite Member
Jul 27, 2002
13,211
597
126
Originally posted by: Elcs
How big is the 4870 compared to other cards?

I know its a dual slot solution but is it one of these monster long cards?
Nope. It's 9.5", which means it doesn't stick out past the motherboard (motherboards are 9.6" deep, be it mATX or ATX).
 

mharr7

Member
Feb 17, 2008
191
0
0
I'm sure this has been asked before, but this thread is like 32462 pages long....

Will a Corsair HX520 support a single HD 4870?
Comp setup will be E8400 at ~3.6ghz, and most likely single HD.

Thanks.
 

MyLeftNut

Senior member
Jul 22, 2007
393
0
0
Originally posted by: mharr7
I'm sure this has been asked before, but this thread is like 32462 pages long....

Will a Corsair HX520 support a single HD 4870?
Comp setup will be E8400 at ~3.6ghz, and most likely single HD.

Thanks.

Yes, more than enough.
 

deerhunter716

Member
Jul 17, 2007
163
0
0
Always been an Nvidia fanboy, from the 8800GTX to the 8800GT --> now going 4870 ALL THE WAY. Cannot beat the price for the performance.
 