R9 280 vs GTX 960 - better choice?


garagisti

Senior member
Aug 7, 2007
592
7
81
But if you're buying a GPU now, then the R9 290 is the obvious pick under the constraint of upgrading to an i5 or i7 later. It blows everything else out of the water at any price below $330. It's pretty rare to get a high-end GPU for a mid-level price.
Wouldn't he be GPU limited in more games than CPU limited? Besides, a 290 is already around $200 or so, and how much lower do you think the price can go? Then on top of it there's money to be spent on a 270 too, which is north of $100. Not much, but when you compare it to what a 290 costs, that is almost half the money.
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
Tahiti is a 3-year-old GPU that doesn't support variable refresh rate for gaming, doesn't support DX12's new hardware rendering features, doesn't support 4K H.264 decoding, doesn't support 4K HEVC decoding, and doesn't support HDMI 2.0. And knowing AMD, driver support will be gone soon after it hits its 4th year; just ask the HD 4000 users about their driver support.

http://www.anandtech.com/show/8544

http://techreport.com/news/27000/amd-only-certain-new-radeons-will-work-with-freesync-displays

That means brand-new cards like the Radeon R9 280, 280X, 270, and 270X won't be FreeSync-capable. Nor will any older Radeons in the HD 7000 and 8000 series. AMD tells us these prior-gen GPUs don't have the necessary support for the latest DisplayPort standard.
http://www.anandtech.com/show/5775/...4000-gpus-being-moved-to-legacy-status-in-may

http://zachsaw.blogspot.com/2014/09/state-of-legacy-drivers-2014-amd-vs.html

While the AMD HD 4350 card is definitely a superior card to the NVIDIA 8400GS, it suffers from abysmal driver quality. This makes it completely useless as an HTPC card, where you would want to set your TV frequency to match the video being played. In the recent releases of MPDN, the AMD card is also exhibiting artefacts running PixelShader code that both Intel and NVIDIA GPUs have no problems running. I have yet to investigate this issue, so that will be an article for another day. Given my experience as a software developer, I would stay away from AMD graphics cards if I were you.
You'd be a fool to buy an outdated card like this, but it's your money to waste.
 

kantonburg

Platinum Member
Oct 10, 1999
2,975
1
81
Tahiti is a 3-year-old GPU that doesn't support variable refresh rate for gaming, doesn't support DX12's new hardware rendering features, doesn't support 4K H.264 decoding, doesn't support 4K HEVC decoding, and doesn't support HDMI 2.0. And knowing AMD, driver support will be gone soon after it hits its 4th year; just ask the HD 4000 users about their driver support.

http://www.anandtech.com/show/8544

http://techreport.com/news/27000/amd-only-certain-new-radeons-will-work-with-freesync-displays

http://www.anandtech.com/show/5775/...4000-gpus-being-moved-to-legacy-status-in-may

http://zachsaw.blogspot.com/2014/09/state-of-legacy-drivers-2014-amd-vs.html

You'd be a fool to buy an outdated card like this, but it's your money to waste.

Are you saying the 290 is an outdated card? Judging by your handle, I'm assuming you're nVidia all the way, no matter what?
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
I only wrote about Tahiti/280; I did not write anything about Hawaii/290.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Pretty incredible, when you think about it, how viable a 3-year-old card still is today. Man, how things have changed. I am talking about the 7950 of course, or the 3.5-year-old 7970 if you want to mention the 280X.

Thinking about how old these cards are, I would almost want to stretch my budget for a 290. I've never seen old hardware remain a viable choice for so long, unless you count the god-forsaken 8800 GTS to GTS 250 rebadge era.
 

SimianR

Senior member
Mar 10, 2011
609
16
81
I had a friend who literally just upgraded from an 8800 GT. But he was playing new games with all the settings on low, the ones that would run anyway. He grabbed a 290 for something like $250 Canadian (and also built a new system around it).
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I had a friend who literally just upgraded from an 8800 GT. But he was playing new games with all the settings on low, the ones that would run anyway. He grabbed a 290 for something like $250 Canadian (and also built a new system around it).

Guess the big hurt with that 8800 GT was the 512MB, wasn't it? Assuming, like most 8800 GTs, his had 512MB. I think that old card could still be useful if people had the 1GB version. It would be better than anything from Intel or an APU for gaming.
 

SimianR

Senior member
Mar 10, 2011
609
16
81
Guess the big hurt with that 8800 GT was the 512MB, wasn't it? Assuming, like most 8800 GTs, his had 512MB. I think that old card could still be useful if people had the 1GB version. It would be better than anything from Intel or an APU for gaming.

Some games would still be okay, like Dota 2 or other less demanding games. I felt the same way about my 5870. It only had 1GB, and had I purchased one of the 2GB Crossfire versions I probably would have still been using it, but so many games were starting to push VRAM usage over 1GB.
 

kantonburg

Platinum Member
Oct 10, 1999
2,975
1
81
Some games would still be okay, like Dota 2 or other less demanding games. I felt the same way about my 5870. It only had 1GB, and had I purchased one of the 2GB Crossfire versions I probably would have still been using it, but so many games were starting to push VRAM usage over 1GB.

Yeah, my trusty 6850 is still kicking. Been playing Far Cry 4 with no issues. I'll upgrade my old FX-4100 Zambezi first. Terrible CPU, but it fit the budget at the time.
 

SolMiester

Diamond Member
Dec 19, 2004
5,331
17
76
Really, you showed an individual frame from the video run?

That was from Assassin's Creed Unity.

Even the slower of the two 960s beats both the R9 280 and the R9 285 at 1080p / FXAA in that game. These are the overall average FPS:

Assassin's Creed Unity, High, FXAA (overall avg FPS)
960: 43.0 | 960: 47.7 | 760: 32.3 | R9 280: 41.7 | R9 285: 32.9

From the article:

"Of the nine titles tested, there are wins for the GTX 960 in just four titles (ACU, COD, Tomb Raider, BF4), while the R9 285 wins four (Crysis 3, Metro Redux, Shadow of Mordor, Ryse) and the R9 280 emerges triumphant on Far Cry 4. "


Kinda puts all your fear-mongering into perspective. Or should, if you were objective.


I should point out that the author's statement that the R9 280 won in Far Cry 4 - from their own benchmark table - is incorrect. The R9 285 beat the R9 280 by almost 20%. And that's at 1080p / Ultra / SMAA.

What all these testers and most of the posters here fail to factor in is the texture compression that exists on both the 960 and the R9 285. You cannot just compare physical VRAM size or raw memory bandwidth anymore.

LMAO... That's RS for you... the AMD agenda, man!
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
232
106
What all these testers and most of the posters here fail to factor in is the texture compression that exists on both the 960 and the R9 285. You cannot just compare physical VRAM size or raw memory bandwidth anymore.
Maxwell v2 uses approx. 25% fewer bytes per frame compared to Kepler. So, in the best-case scenario, that GTX 960 is only as good as a Kepler card with 2.5GB of RAM, which is still 512MB less than a cheaper alternative, i.e. the R9 280.

However, there are already games requiring more than 3 gigs of VRAM, making both of these options rather obsolete today.
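
For what it's worth, here's that arithmetic spelled out as a quick sanity check (a rough sketch; it generously assumes the ~25% saving applies to everything in VRAM, which, as posters note later in the thread, is disputed):

```python
# Back-of-the-envelope check on the "2.5GB effective" claim, assuming
# Maxwell v2 really moves ~25% fewer bytes per frame than Kepler and
# that the saving applies to all VRAM contents (a generous best case).
GTX_960_VRAM_GB = 2.0
R9_280_VRAM_GB = 3.0
SAVING = 0.25

naive = GTX_960_VRAM_GB * (1 + SAVING)    # 2.50 GB ("25% more effective")
strict = GTX_960_VRAM_GB / (1 - SAVING)   # 2.67 GB (strict "25% fewer bytes")
print(f"Effective capacity: {naive:.2f}-{strict:.2f} GB")
print(f"Shortfall vs R9 280: {R9_280_VRAM_GB - strict:.2f}-{R9_280_VRAM_GB - naive:.2f} GB")
```

Either way you read the 25%, the 960 still comes up a third to half a gigabyte short of the 280's 3GB.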
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
^ Theoretical arguments about Maxwell's memory compression sound good on paper, but when real-world games like Shadow of Mordor, Wolfenstein NWO, Titanfall, AC Unity, Evolve, DAI, and FC4 already show that stuttering occurs with 2GB of VRAM, I don't understand why some people still try to spin Maxwell's 2GB as "greater" than 2GB when in practice this theory falls apart. There is no real-world example of any PC game listed above which shows that a 960 can cope when memory usage exceeds 2GB of VRAM. What we get are poor frame times and stutters. Furthermore, it's unbelievable how some gamers can be so close-minded as to not recognize that VRAM demands are likely to increase over the useful life of that 960 card, which means it will run into VRAM bottlenecks more often than a 3-4GB card.
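
To see why blowing past physical VRAM shows up as stutter rather than a gentle slowdown, here is a toy model (illustrative numbers only; the PCIe figure, working-set sizes, and touched fraction are assumptions, not measurements):

```python
# Toy model: once the working set exceeds physical VRAM, the excess must
# be paged in over PCIe, which is far slower than on-board GDDR5.
PCIE_3_X16_GBPS = 16.0   # theoretical peak; sustained rates are lower
VRAM_GB = 2.0

def extra_frame_ms(working_set_gb, touched_fraction=0.25):
    """Added per-frame cost of re-fetching spilled data over PCIe.
    touched_fraction = share of the spilled data needed this frame."""
    spill_gb = max(0.0, working_set_gb - VRAM_GB)
    return spill_gb * touched_fraction / PCIE_3_X16_GBPS * 1000.0

for ws in (1.8, 2.2, 2.6, 3.0):
    print(f"{ws:.1f} GB working set -> +{extra_frame_ms(ws):.1f} ms per frame")
# 0.0, 3.1, 9.4, 15.6 ms -- against a 16.7 ms budget at 60fps, even the
# middle cases mean dropped frames whenever the spilled data is touched.
```

The absolute numbers are made up, but the shape is the point: the penalty is zero until you cross the limit, then it climbs steeply, which is exactly the stutter pattern reviewers report.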

Since the OP already said that his friend might consider switching to the i5 (since he has not purchased the i3), and that the budget seems a bit flexible beyond $200, possibly stretching to a used R9 290 4GB, it seems to me this thread has turned into a worthless/irrelevant argument of R9 280 non-X vs. 960, defending the 960, but hardly providing the out-of-the-box thinking that would allow OP's friend to get a 50%+ faster gaming rig by just spending $50 more for the i5 over the i3 and $50 more for the R9 290 over the 960. In the context of gaming over 2 years and the countless PC games he might purchase, I would strongly consider the extra $100 for the CPU and GPU upgrades. What's more, the i5 and 290 will retain a higher resale value than an i3/960 combo, which means should OP's friend decide to sell his parts to upgrade in 2 years, he will probably get back $40-50 of that $100 spent on the i5+290 upgrade, which in turn means an excellent price/performance ratio! But I guess some posters find it easier to label my advice as having an AMD agenda rather than consider the entire context of someone using this PC over 2-3 years.

All it takes is 2-3 AAA games like The Witcher 3, Star Wars Battlefront, or The Division to use more than 2GB of VRAM, and soon the list of AAA titles that tank with 2GB of VRAM will go into double digits.
 
Last edited:

SimianR

Senior member
Mar 10, 2011
609
16
81
Yeah, I'm pretty skeptical of the memory compression being as effective as having an additional 1GB of VRAM. Doesn't the compression just increase the effective bandwidth of the card? It's not a replacement for total available video memory. If the 2GB of available memory is used up and exceeded, then you're going to run into performance issues regardless.
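
That reading matches how lossless delta color compression is usually described: the allocation still has to cover the uncompressed worst case, so the benefit shows up in bytes moved, not bytes reserved. A minimal sketch of the distinction (a simplified model, not actual driver behavior; the buffer size is an assumed example):

```python
# Simplified illustration: delta color compression reduces the bytes
# *transferred* per frame, not the bytes *allocated* in VRAM, because
# the buffer must still fit data that doesn't compress at all.
FRAMEBUFFER_MB = 32        # e.g. a 1080p render target with MSAA (assumed size)
COMPRESSION_RATIO = 0.75   # ~25% fewer bytes moved (NVIDIA's Maxwell v2 claim)

allocated_mb = FRAMEBUFFER_MB                        # capacity cost: unchanged
transferred_mb = FRAMEBUFFER_MB * COMPRESSION_RATIO  # bandwidth cost: reduced

print(f"VRAM reserved:    {allocated_mb} MB (full size, always)")
print(f"Bytes moved/pass: {transferred_mb:.0f} MB (where compression helps)")
```

So once a game's allocations exceed 2GB, compression does nothing to keep you inside the card's memory.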
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
^ Exactly. It's not a situation where a game uses 2001MB of VRAM and the compression suddenly saves the card from doom. 2015-2017 games will use 3-3.5GB. No amount of compression will save current 2GB cards.
 

kantonburg

Platinum Member
Oct 10, 1999
2,975
1
81
This has been a great thread. Just so everyone knows, he picked up a used 290 here in FS/FT.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
7,131
6,001
136
Wow, a 290 for $250 is pretty sweet, but $180 is insane value. Still, man, you gotta pair it with an i5 or better to really hit the potential of such a powerful GPU.
 

kantonburg

Platinum Member
Oct 10, 1999
2,975
1
81
He said that they were looking for a deal... maybe OP should also ask his friend to try looking at fleabay? What do you recommend, señor?

He's pretty excited about using the FS/FT forum. He said he'll probably look into an i5, and possibly an i7, from there in roughly a year. He could probably recoup at least half of what he pays for the i3, I'd think. It's just what fits his budget right now.
 

Spanners

Senior member
Mar 16, 2014
325
1
0
Maxwell v2 uses approx. 25% fewer bytes per frame compared to Kepler. So, in the best-case scenario, that GTX 960 is only as good as a Kepler card with 2.5GB of RAM, which is still 512MB less than a cheaper alternative, i.e. the R9 280.

However, there are already games requiring more than 3 gigs of VRAM, making both of these options rather obsolete today.

I would love to see some proof of this. Every game review I've read shows Maxwell (compression) and Hawaii (none) using very similar amounts of VRAM. I'm of the opinion that, besides some write-backs, colour compression mainly saves GPU-memory bandwidth and has a negligible effect on VRAM usage. I'm prepared to be wrong, but all I've seen is people assuming this is the case with no data to back it up. If Nvidia's 2GB cards were equivalent to 2.5GB (even best case), I'd have expected some mention/promotion of that fact. That would imply the GTX 980 effectively has 5GB compared to a card without compression. Again, I'd assume that would have rated a mention.
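
The bandwidth reading also lines up with the 960's own specs (a rough calculation; the ~25% figure is NVIDIA's marketing claim and won't hold in every workload):

```python
# Rough effective-bandwidth estimate for the GTX 960's 128-bit bus,
# assuming delta colour compression saves ~25% of bytes transferred.
BUS_WIDTH_BITS = 128
GDDR5_GBPS_PER_PIN = 7.0   # 7 Gbps effective data rate

raw_gbps = BUS_WIDTH_BITS * GDDR5_GBPS_PER_PIN / 8  # 112 GB/s physical
effective_gbps = raw_gbps / (1 - 0.25)              # ~149 GB/s "uncompressed-equivalent"
print(f"Raw bandwidth:       {raw_gbps:.0f} GB/s")
print(f"Effective bandwidth: {effective_gbps:.0f} GB/s")
# That's how a narrow 128-bit Maxwell card keeps pace with wider Kepler
# parts on bandwidth -- while the 2GB capacity stays exactly 2GB.
```

Which would explain why Nvidia promotes the compression as a bandwidth feature rather than as extra memory.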

To the OP: good choice. That will have a much longer useful lifespan than a GTX 960.
 
Last edited: