ATi 4870/4850 Review Thread

Page 14

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: tuteja1986
Why are the Nvidia focus group members focusing so much of their energy on the 4870/4850 review thread?

It is sad.

If and when any blockbuster DX 10.1 games come out, the playing field will tilt. Until then, we are really talking about a $50 to $100 value difference, not that big a deal. Except of course for the company losing sales.
 

Canterwood

Golden Member
May 25, 2003
1,138
0
0
Originally posted by: tuteja1986
Why are the Nvidia focus group members focusing so much of their energy on the 4870/4850 review thread?
It does make you wonder.

Speaking for myself, I would NEVER trust the opinions of anyone affiliated with one particular company for an unbiased assessment of a rival's product.

The 4800 series does seem to be a winner for ATI/AMD though.

Great to see them back in the game. I was getting fed up with the stagnation of Nvidia's products. :thumbsup:

 

Ackmed

Diamond Member
Oct 1, 2003
8,487
533
126
Originally posted by: dreddfunk
Keys - as my post in reply to Rollo indicates, I have no problem with people pointing out facts, least of all you. But, as I discovered early in my graduate career, selective pointing out of facts is used to mislead people as often as it is to *help* people make informed decisions. So I was merely pointing out that incessantly telescoping in on particular facts that make the GTX260 look good can be every bit as misleading as ignoring those facts altogether. Point them out, by all means, but be honest about where those 'trees' stand in relation to the 'forest'.

I'm sure you know this, but the reason some people do this is because they're paid to. Paid to try and steer people towards one brand and away from the other. Sad actually, but whatever.

I bet these reviews really make their job harder. :lol:
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: nRollo
Originally posted by: dreddfunk
Keys - you can't 'downplay' CUDA or PhysX, but they certainly aren't of critical interest to the vast majority of 3D gamers at this time, and 3D gamers are the primary market for these cards.

I understand that you and Rollo both have an interest in pointing out the positive aspects of NVIDIA hardware, and that's valid, but to those of us with no dogs in the fight it seems like you're telescoping in on trees rather than looking at the forest.

[edited for clarity]

Depends how you look at it.

For non PhysX, non multi card I would choose a 4870 over a GTX260 now based on the $100 price difference and similar performance.

OTOH, $100 isn't much money in the world where I have to pay that to fill up my truck, and I do like multi card. Not to mention if a guy is looking to keep his card a year and a half there are a whole lot of PhysX games coming.

So to me it's kind of a wash, buyer has to pick what matters to them. Physics obviously can make a difference in immersion level.

For the first time in a long time, I don't think you can go too far wrong either way, especially short term.

You would pay hundreds of dollars extra... for a $650 card... and then keep it a year and a half until it starts outperforming the competition?

I completely agree with you that in a year and a half the GTX 280 should annihilate the 4870 CF, due to 2560x1600 becoming the standard, shaders becoming ever more important, and the proliferation of PhysX.
But I assure you, not a single person who pays over $600 per card would be interested in using a year-and-a-half-old card by then... they will be using the Radeon 6870 or the GTX 480 by then... (after the 55nm die shrink, we will have a new architecture at 45nm... and by then probably a 35nm die shrink of that).
 

Hunt3rj2

Member
Jun 23, 2008
84
0
0
Originally posted by: keysplayr2003
Originally posted by: Hunt3rj2
Originally posted by: nRollo
In any case, tough to speculate on the rest, not an engineer. We've seen what the 9800GTX+ increases are with the drop to 55nm, but the GTX2XX are much more complex designs, so my guess is they're not going to drop to 55nm and say "Haha! Now we crank the core clocks 250MHz FTW!". Unless they're going to have the GTX290 Peltier Edition.

Actually it'd be the GTX290 Liquid Nitrogen edition.


All joking aside, it really looks like Nvidia needs to revise the GTX 200 cores to have higher clocks and to be more efficient, I don't know many people who would like a 200 dollar power bill from leaving their computer on all day.


Wait... is the ATI 48x0 a revision of R600, or is it R700? I'm led to believe it is R700 because the core is called "RV770". I intend on getting a 4850 with a VF1000 when they get pretty cheap and plentiful.

The GT200s idle at like 25W, dude.

Linkified

What's more important is the fact that this core is HUGE. Sure, it might be efficient at idle, but when ATI's 48x0 can beat the entry-level card of this line for 100 dollars less, with far fewer transistors and a smaller die, you really have to think something is being wasted.

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Well, shaders ARE being wasted on the GTX280... it annihilates in Crysis... but other games were just not written to take advantage of so much shader power...

Another thing is, it is clocked very low... it is just too big and can't handle the heat... but if you could put some serious cooling on it and OC, you would get the best thing on the market, by far (watercooling or liquid nitrogen OC, I mean).
They should be able to raise the clocks and performance a lot when they convert the GTX280 to 55nm.

But I don't buy products based on "technology that will be really awesome in future products once they iron out all the kinks"...
This is why I passed on SLI and CF (although with its CF bridge on the GPU, the 4870 seems like it would make a single-card dual-GPU solution truly awesome. I am dying to see this come out; I think it will be my first multi-GPU).
 

deerhunter716

Member
Jul 17, 2007
163
0
0
Bottom line is the 4870 wipes the floor with the 260 and in fact beats the 280 in a number of benchmarks. It might lose to the 280 more often, but it does indeed win quite a few times too. Being so much cheaper --> ATI whooped Nvidia's ass all over the floor.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Yup, too many "ifs" used when talking about the GTX 280... The thing is, the card is what we have seen, not the refresh, and by the time the refresh is out, so will be the 4870 X2.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I wouldn't go that far; the 4870 is actually pretty similar to the 260. Losing some, winning some... winning more than losing. But the 260 has PhysX and more shaders.
Really, the two are equivalent... but the 4870 costs $300 vs. the $400 260 it competes with... and the $650 GTX 280 is ridiculously overpriced.

Originally posted by: ShadowOfMyself
Yup, too many "ifs" used when talking about the GTX 280... The thing is, the card is what we have seen, not the refresh, and by the time the refresh is out, so will be the 4870 X2.

And the point of it all is... when the refresh is out... who cares?
How will the knowledge of the existence of a cheaper, faster GTX 280+ at 55nm matter to a person who bought and owns a 65nm version?

I doubt there are many who would think: Look at my card, it is so uber, I bought it for $650 and now there is something much better with a similar name at $500... this must mean that my card is better... durr... (but if there are, they would surely buy one).
The one exception I see is someone who is willing to SLI or even tri-SLI the behemoth on their god box.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: taltamir

due to 2560x1600 becoming the standard,

Why would that become the standard? For people who like a smaller screen (less footprint), there is no need for that type of resolution. :beer:

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: ronnn
Originally posted by: taltamir

due to 2560x1600 becoming the standard,

Why would that become the standard? For people who like a smaller screen (less footprint), there is no need for that type of resolution. :beer:

Exactly... for the 0.000000000001% of the population that likes smaller screens (I exaggerate), it isn't needed. Everyone else likes bigger.
 

deerhunter716

Member
Jul 17, 2007
163
0
0
Originally posted by: taltamir
I wouldn't go that far; the 4870 is actually pretty similar to the 260. Losing some, winning some... winning more than losing. But the 260 has PhysX and more shaders.
Really, the two are equivalent... but the 4870 costs $300 vs. the $400 260 it competes with... and the $650 GTX 280 is ridiculously overpriced.



It beats the 260 hands down, no questions asked, in a majority of benches. It beats the 280 in plenty of benchmarks too, with the links below just a small sample of those where the uber 280 is beaten.

http://www.firingsquad.com/har...performance/page14.asp

http://www.firingsquad.com/har...performance/page12.asp

http://techreport.com/articles.x/14990/9
 

dadach

Senior member
Nov 27, 2005
204
0
76
taltamir, you are trying too hard... maybe I was right when I called you "nvidia guy"... are you trying to score a focus group membership like ntrollo and keys?
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Is this how you're going to handle yourselves guys? Dadach? Ackmed? Tuteja? ronnn?
So be it.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Originally posted by: deerhunter716
Originally posted by: taltamir
I wouldn't go that far; the 4870 is actually pretty similar to the 260. Losing some, winning some... winning more than losing. But the 260 has PhysX and more shaders.
Really, the two are equivalent... but the 4870 costs $300 vs. the $400 260 it competes with... and the $650 GTX 280 is ridiculously overpriced.



It beats the 260 hands down, no questions asked, in a majority of benches. It beats the 280 in plenty of benchmarks too, with the links below just a small sample of those where the uber 280 is beaten.

http://www.firingsquad.com/har...performance/page14.asp

http://www.firingsquad.com/har...performance/page12.asp

http://techreport.com/articles.x/14990/9


I'd think that the vast majority of people who buy a $400-$650 card are the type that want to play with all the eye candy they can, right? So anyone who buys a GTX280 probably wants to use as much AA as possible. From the benches I've seen, the 4870 seems to be a monster with 8xAA. I'd absolutely love to see benches done at 8xAA between the 4870 and GTX260/GTX280; I think it would pretty well show how poor a value the GTX260/280 are. I won't go so far as to say this will happen, but I wouldn't be shocked if, based on 8xAA performance, the 4870 were the overall leader regardless of price. So, *IF* that were indeed the case, wouldn't that be kind of like saying the GTX280 is the fastest and justifies its very high price... so long as you use low-to-medium quality settings?

And what is the advantage of the 384MB more memory the GTX260 has? In the benches I saw, it is very close to (still often losing to) the 4870 at 2560x1600 with 4xAA. I don't see how the extra memory is going to prove to be anything more than marketing at best. It certainly doesn't appear to actually help over the 4870's 512MB. The GTX260 core is powerful enough; surely if there were an advantage, it would show at that resolution with AA?

Just a few thoughts.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: dadach
taltamir you are trying too hard...maybe i was right when i called you "nvidia guy"...are you trying to score a focus group membership like ntrollo and keys?

Originally posted by: taltamir
And the point of it all is... when the refresh is out... who cares?
How will the knowledge of the existence of a cheaper, faster GTX 280+ at 55nm matter to a person who bought and owns a 65nm version?

I doubt there are many who would think: Look at my card, it is so uber, I bought it for $650 and now there is something much better with a similar name at $500... this must mean that my card is better... durr... (but if there are, they would surely buy one).
The one exception I see is someone who is willing to SLI or even tri-SLI the behemoth on their god box.

Yes, because saying the GTX280 is a foolish purchase (except for an unlimited-budget god box) is obviously me being a huge fanboi.

keys is an upstanding guy. He was made a moderator for a reason (both voted for and approved by the existing ops).

You didn't even try to refute any of my statements, you just personally attacked my integrity and motives. That is a violation of the CoC and also a violation of basic human decency. How about you post a link to a review and counter specific points I have said instead?
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: keysplayr2003
Originally posted by: Janooo
Originally posted by: nRollo
...

I've given some reasons the GTX260 might be worth $100 more to some people.


...

I understand why you are making a fool out of yourself.
I didn't at the time of X1900XTX and 7800GTX 512MB.

Janoo, examples were given why some people may think the GTX260 might be worth $100 more to them. And you point a finger and shout "fool". How wise are you?
I'm not going to show you the benefits of CUDA, since they are all over the internet. But there is no way you can successfully downplay it at this stage of the game. And PhysX? 16 titles this year?
How about it? AMD cannot accomplish on-board physics at this time. Neither will Intel be able to.
And that is why I think they are the source of Charlie Demerjian's article about Futuremark scores, which fell flat on its face in light of Futuremark's quoted comments. Everyone seems to be touting how AMD's GPU technology is more advanced? Where is it?
If you dismiss the advanced features of CUDA and PhysX (which, by the way, goes retroactively all the way back to the 8800 series), then what features would you, or would you not, dismiss on AMD's last few gens of GPUs, including the current one? Start with DX 10.1, for example.

Moral: Don't call somebody a fool for stating facts that can be backed up, else someone else looks the fool. Be civil, dude.

Now, let's just wait and see how long the price is going to stay at $399.
260 is in no man's land.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Janooo
Originally posted by: keysplayr2003
Originally posted by: Janooo
Originally posted by: nRollo
...

I've given some reasons the GTX260 might be worth $100 more to some people.


...

I understand why you are making a fool out of yourself.
I didn't at the time of X1900XTX and 7800GTX 512MB.

Janoo, examples were given why some people may think the GTX260 might be worth $100 more to them. And you point a finger and shout "fool". How wise are you?
I'm not going to show you the benefits of CUDA, since they are all over the internet. But there is no way you can successfully downplay it at this stage of the game. And PhysX? 16 titles this year?
How about it? AMD cannot accomplish on-board physics at this time. Neither will Intel be able to.
And that is why I think they are the source of Charlie Demerjian's article about Futuremark scores, which fell flat on its face in light of Futuremark's quoted comments. Everyone seems to be touting how AMD's GPU technology is more advanced? Where is it?
If you dismiss the advanced features of CUDA and PhysX (which, by the way, goes retroactively all the way back to the 8800 series), then what features would you, or would you not, dismiss on AMD's last few gens of GPUs, including the current one? Start with DX 10.1, for example.

Moral: Don't call somebody a fool for stating facts that can be backed up, else someone else looks the fool. Be civil, dude.

Now, let's just wait and see how long the price is going to stay at $399.
260 is in no man's land.

Thought so. You just answered all of my questions. -Thanks.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I agree CUDA is nice, and whatever ATI does will also be very nice.

Now, the problem I am seeing with CUDA vs. ATI: the NV guys are saying that CUDA is easier to program for than ATI's EPIC, which is likely true. But from where I am seeing things, graphics aside, Intel's Larrabee using x86 will likely destroy CUDA. AMD, even though they have EPIC, can still go x86. Intel has the compiler to convert EPIC into x86, or x86 instructions into EPIC. Now, there is a performance penalty for using this compiler, but with 800 SPs and a little extra cache, that shouldn't be a problem. Intel may or may not give ATI this compiler.

That's exactly why Intel doesn't care about NV's CUDA or AMD's EPIC. Intel has the bases all covered. Intel bought a CPU company just to get that compiler. This compiler does it all. Read about it; it's called Elbrus. The man behind it is an Intel fellow now.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: keysplayr2003
Originally posted by: Janooo
Originally posted by: keysplayr2003
Originally posted by: Janooo
Originally posted by: nRollo
...

I've given some reasons the GTX260 might be worth $100 more to some people.


...

I understand why you are making a fool out of yourself.
I didn't at the time of X1900XTX and 7800GTX 512MB.

Janoo, examples were given why some people may think the GTX260 might be worth $100 more to them. And you point a finger and shout "fool". How wise are you?
I'm not going to show you the benefits of CUDA, since they are all over the internet. But there is no way you can successfully downplay it at this stage of the game. And PhysX? 16 titles this year?
How about it? AMD cannot accomplish on-board physics at this time. Neither will Intel be able to.
And that is why I think they are the source of Charlie Demerjian's article about Futuremark scores, which fell flat on its face in light of Futuremark's quoted comments. Everyone seems to be touting how AMD's GPU technology is more advanced? Where is it?
If you dismiss the advanced features of CUDA and PhysX (which, by the way, goes retroactively all the way back to the 8800 series), then what features would you, or would you not, dismiss on AMD's last few gens of GPUs, including the current one? Start with DX 10.1, for example.

Moral: Don't call somebody a fool for stating facts that can be backed up, else someone else looks the fool. Be civil, dude.

Now, let's just wait and see how long the price is going to stay at $399.
260 is in no man's land.

Thought so. You just answered all of my questions. -Thanks.

Nevertheless, only people who don't care about money would buy or recommend the 260 over the 4870... or they are paid to do so...
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
If you don't care about physics and the many benefits of CUDA, and many do not, the 4870 is a very nice card indeed. Enjoy it. I'll be enjoying my two 4850s soon enough; just working on getting an Xfire board and some more DDR2 to go with it. I already have a CPU for it. You see, dude, I'm not telling anyone not to buy what they want, as you seem to be making yourself believe. I am telling you that there could be good reasons why a GTX260, at 100 bucks more than a 4870, could be worth the money to someone who cares about what it offers. As far as performance goes, the 4870 and GTX260 are pretty close.
I am buying a pair of 4850s because I am interested in them for what they are: very nice cards for their price.
I don't know what else to tell ya.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Nemesis 1
I agree CUDA is nice, and whatever ATI does will also be very nice.

Now, the problem I am seeing with CUDA vs. ATI: the NV guys are saying that CUDA is easier to program for than ATI's EPIC, which is likely true. But from where I am seeing things, graphics aside, Intel's Larrabee using x86 will likely destroy CUDA. AMD, even though they have EPIC, can still go x86. Intel has the compiler to convert EPIC into x86, or x86 instructions into EPIC. Now, there is a performance penalty for using this compiler, but with 800 SPs and a little extra cache, that shouldn't be a problem. Intel may or may not give ATI this compiler.

That's exactly why Intel doesn't care about NV's CUDA or AMD's EPIC. Intel has the bases all covered. Intel bought a CPU company just to get that compiler. This compiler does it all. Read about it; it's called Elbrus. The man behind it is an Intel fellow now.

Is that what they are telling you? Because Intel should be sweating right about now. This discussion would deserve its own thread, however.

 