AMD 7970 still decent


Brunnis

Senior member
Nov 15, 2004
506
71
91
Seems I'll finally be leaving Tahiti behind. As I wrote earlier in this thread, a couple of weeks ago I RMA'd my Sapphire R9 280X for the third time. Well, I just got the verdict: The store will be exchanging the card for a Sapphire R9 390 Nitro 8GB! Pretty awesome replacement, if I may say so. I guess they wanted to compensate for the trouble I've had and I'm definitely happy with how they handled it this time.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
That is an awesome outcome, and great customer service. I wish we had that kind of service here in the States; instead, I'm almost forced to stick with EVGA for their customer service, since the stores wipe their hands clean of replacements after 30 days.
 

Seba

Golden Member
Sep 17, 2000
1,497
144
106
96Firebird said:
That is an awesome outcome, and great customer service. I wish we had that kind of service here in the States; instead, I'm almost forced to stick with EVGA for their customer service, since the stores wipe their hands clean of replacements after 30 days.
You won't be that delighted when you see the prices in Europe.
 

Brunnis

Senior member
Nov 15, 2004
506
71
91
96Firebird said:
That is an awesome outcome, and great customer service. I wish we had that kind of service here in the States; instead, I'm almost forced to stick with EVGA for their customer service, since the stores wipe their hands clean of replacements after 30 days.
Yep, really nice. This is actually the first time I've used the mandated three-year period for filing complaints after the warranty has expired. It worked fine, but I guess it helped that I had already complained about this issue within the warranty period.

Seba said:
You won't be that delighted when you see the prices in Europe.
Yeah, prices can be quite a bit higher, but here in Sweden it's mostly because of the high VAT (25%). For example, if you remove the VAT, this card is approximately 330 USD at the current exchange rate, which I think is pretty competitive with US prices.
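For reference, here's a minimal sketch of that VAT math, assuming a hypothetical SEK shelf price and a roughly late-2015 exchange rate (neither figure is given above):

Python:
# Back out the VAT-exclusive price, then convert to USD.
# Both inputs below are illustrative assumptions, not figures from the post.
price_sek_incl_vat = 3500.0   # hypothetical Swedish shelf price, VAT included
vat_rate = 0.25               # Swedish VAT
sek_per_usd = 8.5             # approximate exchange rate (assumption)

price_sek_ex_vat = price_sek_incl_vat / (1 + vat_rate)
price_usd_ex_vat = price_sek_ex_vat / sek_per_usd
print(f"~{price_usd_ex_vat:.0f} USD excluding VAT")  # ~329 USD with these inputs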
 

Brunnis

Senior member
Nov 15, 2004
506
71
91
Jeeeeesus... Just got the R9 390 installed. Huge problems. Works fine as long as no driver is installed, but as soon as the driver installation finishes, the screen starts going black whenever opening a window or doing anything that creates any sort of light or moderate load. Opening any 3D application results in instant black screen and loss of signal until the application is closed (Alt+F4 FTW ).

Just managed to get it to stop the constant black screening by completely turning off PCI-E power savings in the active power profile. Mildly optimistic, I then tried running Furmark. Got a quick black screen flash within 1 minute.

So, folks, I guess it's time for my fourth AMD GPU. To be honest, I just feel like setting fire to the POS and moving on with my life.

EDIT: Been googling around and apparently it's a pretty common problem going all the way back to the introduction of the 290/290X. Just like with the vertical lines issue on the 7970 and derivatives, people have come up with many attempts to "solve" the issue: downclocking, overvolting, creating custom power profiles, etc. Sometimes it works, sometimes it doesn't. However, again, it all seems to come down to either defective hardware or hardware that's simply not up to spec.

I'll see if I can verify the problem in another PC, then it's going back to the store.

EDIT2: I may have found a work-around... The issue only happens on the rightmost of the three DisplayPort ports (I haven't tried HDMI or DVI, though). Both of the others work perfectly. Additionally, even the rightmost DP port seems to start working if you unplug and then plug the DP connector back in while Windows is running. Looks like some strange timing issue. If the card works as expected except for this, I'll probably keep it.

EDIT3: I'm sorry for blaming you, AMD... Believe it or not, it was actually the cable! This cable has worked perfectly until now, but the DP port in question appears to be running closer to the "edge" than the other two (and the DP ports on my previous cards). Pretty strange failure mode (being dependent on load), but replacing the cable with one provided with my HP Z24i monitor fixed the black screen issue completely. Time to game!
 
Last edited:

Seba

Golden Member
Sep 17, 2000
1,497
144
106
Have you tried fully removing the old driver first with DDU?
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
I think the 7970 OC has the most staying power of any card, compared to even the 9700 Pro or 8800 GTX Ultra. That's also a function of PC gaming not advancing technically as fast as it did in the old days. Looking at SW:BF, though, and how well the 7970 does in it, if a lot more PC games were that well optimized, the 7970 could last another two years at 1080p.

AMD's drivers eventually catch up with the hardware, so it ages like a fine wine. I'd say the same thing for even VLIW5 AMD cards.

My Radeon HD 5850 does surprisingly fine in a lot of newer-gen games. However, I can attest that The Witcher 3 is the first game I've tried (video coming soon!) that the 5850 can't really handle at 1080p, 30 FPS, and decent-looking visuals (medium-ish settings). Still not bad for a six-year-old video card, though. The DX11 VLIW5 cards had a LOT of GFLOPS that required very good drivers, more so than GCN.

AMD has to step up their driver game for GPU launches.
 
Last edited:

cyclohexane

Platinum Member
Feb 12, 2005
2,837
19
81
I have never had a problem with AMD cards over the years; they've always run well for me.

Cards:
Radeon 9800 SE (ancient)
Sapphire X1950 XT (old)
HD4650 (old)

Gigabyte R9 390 (new)
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
Blitzvogel said:
AMD's drivers eventually catch up with the hardware, so it ages like a fine wine. I'd say the same thing for even VLIW5 AMD cards.

My Radeon HD 5850 does surprisingly fine in a lot of newer-gen games. However, I can attest that The Witcher 3 is the first game I've tried (video coming soon!) that the 5850 can't really handle at 1080p, 30 FPS, and decent-looking visuals (medium-ish settings). Still not bad for a six-year-old video card, though. The DX11 VLIW5 cards had a LOT of GFLOPS that required very good drivers, more so than GCN.

AMD has to step up their driver game for GPU launches.

Something seems to be off with The Witcher 3 on updated game/driver versions with VLIW cards, judging by the 6970 here.

Around launch time:
http://pclab.pl/art63116-7.html

Newer:
http://pclab.pl/art66374-5.html

As you can see, the 6970's performance is a lot lower while the rest is more or less the same.


In any case, even in the old tests the 6970 was not doing so great, so I wouldn't expect much from the 5850.

That said, when I tried it back when the game launched, with low expectations, I was actually impressed by how it ran on a 5850 (with a decent OC). Keep in mind I was playing at 1280x1024 and 1440x900 with mixed low/medium/high settings (I can't remember exactly), but it looked pretty and ran at around 30 FPS most of the time. I'm not sure how much worse it would be with the updated game/drivers; judging by the 6970 test, a lot.

but I think the 7970 is aging A LOT better than the 5800s were.
 
Last edited:

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
SPBHM said:
but I think the 7970 is aging A LOT better than the 5800s were.

I would agree. But I think it has more to do with

1) being a more efficient architecture when it comes to drivers and actual performance
2) GCN commonality with console platforms benefiting AMD graphics cards

GCN's biggest problem now seems to stem from bandwidth requirements. I'm not sure what kind of cache setup GCN has, but clearly Maxwell benefited from more shader L2 cache, since you have 128-bit memory bus GPUs like the GTX 950 and 960 matching AMD 256-bit memory bus GPUs. A larger bus means more memory modules, a higher price, and typically more overall power draw, making it difficult to use in the mobile space, which has become quite lucrative.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Blitzvogel said:
clearly Maxwell benefited from more shader L2 cache, since you have 128-bit memory bus GPUs like the GTX 950 and 960 matching AMD 256-bit memory bus GPUs.

GTX950/960 are slower than R9 380/380X/280X and that difference grows even more at higher resolutions.
https://www.techpowerup.com/reviews/ASUS/R9_380X_Strix/23.html

SPBHM said:
Something seems to be off with The Witcher 3 on updated game/driver versions with VLIW cards, judging by the 6970 here.

Around launch time:
http://pclab.pl/art63116-7.html

Newer:
http://pclab.pl/art66374-5.html

As you can see, the 6970's performance is a lot lower while the rest is more or less the same.

That does look bad. VLIW is going to suffer more and more as AMD is no longer going to optimize drivers for it. Even though Fermi is doing better than the HD6970, The Witcher 3 is also showing that architecture's age. It seems older architectures are getting bottlenecked by TW3.

HD7850/7870 vs. 6970/580 on launch:


The benches you linked
HD7870 = 49.8 / 43.4 / 42.6 / 50.7 = avg 46.63
GTX580 = 37.7 / 29 / 36.5 / 37.5 = avg 35.18
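Just to show the arithmetic behind those averages, here's a quick sketch using the four runs listed above:

Python:
# Average the four benchmark runs quoted above for each card.
hd7870_fps = [49.8, 43.4, 42.6, 50.7]
gtx580_fps = [37.7, 29.0, 36.5, 37.5]

avg_7870 = sum(hd7870_fps) / len(hd7870_fps)  # ~46.6 FPS
avg_580 = sum(gtx580_fps) / len(gtx580_fps)   # ~35.2 FPS
print(f"HD7870 {avg_7870:.1f} FPS, GTX580 {avg_580:.1f} FPS, "
      f"ratio {avg_7870 / avg_580:.2f}x")     # HD7870 roughly 1.33x the GTX580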

Today the HD7870 costs $90. This just goes to show that future-proofing with $500+ flagship cards for next-gen games arriving 4-5 years later doesn't work well. It would actually have been cheaper to buy a 6970 back then, throw it in the garbage today, and buy an R9 270 to play TW3 than to keep using the $500 580, and it looks even worse considering the HD6950 was $299 and unlocked into a 6970.

SPBHM said:
but I think the 7970 is aging A LOT better than the 5800s were.

The 7970 OC has very few weaknesses for its time compared to the HD5850: 3GB of VRAM vs. 1GB on the 5850, big overclocking headroom vs. much smaller OCing on the 5850/6950, lots of memory bandwidth, lots of shading power, and it benefits greatly from AMD focusing its driver optimizations on GCN. Also, back then graphics improved at a much more rapid pace than they do now. Since the HD7970 is more powerful than the PS4's GPU and many modern AAA games target the PS4 (Just Cause 3, Far Cry Primal, The Division, Watch Dogs, Far Cry 4, Dragon Age Inquisition, etc.), it's not a surprise that the 7970 OC is still viable for 1080p gaming.

The performance difference between the 2012 HD7970Ghz (aka 7970 OC) and the 2009 5850 is just massive compared to the 2012 HD7970Ghz vs. the 2013 R9 290X.

@ 1080p
HD7970Ghz is 2.38X faster than HD5870 (~5850 OC)
vs.
R9 290X is just 31% faster than HD7970Ghz. With a 1.2GHz OC, the R9 290X might stretch that to 45%, but that's about it.


http://www.techspot.com/article/942-five-generations-amd-radeon-graphics-compared/page9.html

It is crazy to think that from September 2009, when the HD5870 launched, AMD increased performance 2.38X by the time they launched the HD7970 in January 2012, yet the 290X wasn't a big improvement at all in the grand scheme of things. Even with the Fury X, AMD is nowhere close to 2.38X faster than the R9 280X/7970Ghz. It would take a card >30% faster than the 980Ti to hold a 2.38X lead over the R9 280X at 1080p, i.e. to recreate the HD5850 OC vs. HD7970Ghz gap.

R9 280X = 99%
GTX980Ti = 179%
=> 99% x 2.38X ~ 236%
https://tpucdn.com/reviews/ASUS/R9_380X_Strix/images/perfrel_1920_1080.png

At 1600p, HD7970Ghz was 2.95X faster than HD5870 per TechSpot's January 2015 review (333% vs. 113%). To put that into perspective, it would take a card rated at 292% on the chart below to sit where HD7970Ghz sits vs. HD5870 in 2015 at 1440/1600p.


WOW.
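Here's the scaling arithmetic from the last two paragraphs written out, using only the percentages quoted in this post (TPU 1080p: R9 280X = 99%, GTX 980 Ti = 179%; TechSpot 1600p: HD7970Ghz = 333%, HD5870 = 113%), and assuming the 280X sits at roughly the same ~99% mark on the higher-resolution chart:

Python:
# Relative-performance arithmetic from the figures quoted above.
r9_280x = 99           # TPU relative performance at 1080p (%)
gtx_980ti = 179
hd7970ghz_1600p = 333  # TechSpot January 2015 review, 1600p (%)
hd5870_1600p = 113

# 1080p: matching the old HD5870 -> HD7970Ghz jump (2.38x) over today's 280X
target_1080p = r9_280x * 2.38              # ~236%
vs_980ti = target_1080p / gtx_980ti - 1    # ~0.32 -> >30% faster than a 980 Ti

# 1600p: the same jump measured from the TechSpot numbers is even bigger
jump_1600p = hd7970ghz_1600p / hd5870_1600p  # ~2.95x
target_1600p = r9_280x * jump_1600p          # ~292% on the same scale

print(f"1080p target: {target_1080p:.0f}% ({vs_980ti:.0%} above the 980 Ti)")
print(f"1600p jump: {jump_1600p:.2f}x -> target {target_1600p:.0f}%")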
 
Last edited:

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
RussianSensation said:
GTX950/960 are slower than R9 380/380X/280X and that difference grows even more at higher resolutions.

I was specifically referring to the R9 270 and R9 270X. Sorry for not clarifying.

RussianSensation said:
That does look bad. VLIW is going to suffer more and more as AMD is no longer going to optimize drivers for it. Even though Fermi is doing better than the HD6970, The Witcher 3 is also showing that architecture's age. It seems older architectures are getting bottlenecked by TW3.

I've got a Witcher 3 w/ Radeon HD 5850 video uploading right now. Check the link in an hour, and it should be up and running.

Witcher 3 w/ Radeon HD 5850

The quick story is that the Radeon HD 5850 gets absolutely pulverized by The Witcher 3 at 1080p, even at the lowest settings. 720p is more than doable, even with some medium settings, but 1080p is just too much for the 5850. Sad, but it's a geezer of a card that has done well over the years, just like the 8800 GTX before it. It's fun testing it out with newer games, and if I could afford to purchase Fallout 4 and some of the more recent AAA releases, I'd give them a test drive with the 5850. Not sure if I'll bother giving Tomb Raider 2013 a video, as it's technically a past-gen game, and the remastered PS4 and Xbone versions are not exactly analogous to the PC version. It's the same deal with Metro: Last Light (the original version, not Redux). Titanfall will probably be the last feature in my "Radeon HD 5850 in 2015" series.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Blitzvogel said:
I've got a Witcher 3 w/ Radeon HD 5850 video uploading right now. Check the link in an hour, and it should be up and running.

Witcher 3 w/ Radeon HD 5850

The quick story is that the Radeon HD 5850 gets absolutely pulverized by The Witcher 3 at 1080p, even at the lowest settings.

I'll try to check your video later but it still says it hasn't been uploaded yet.

5850 is from Sept 2009 and it's already December 2015. It's time to retire that old girl. With your 4690K CPU, lots of great options out there like GTX970/390.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
RussianSensation said:
I'll try to check your video later but it still says it hasn't been uploaded yet.

5850 is from Sept 2009 and it's already December 2015. It's time to retire that old girl. With your 4690K CPU, lots of great options out there like GTX970/390.

It's up now

I have an R9 270 already; it's just not installed while I'm doing these 5850 tests, of course. I might replace it next year, I might not.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Blitzvogel said:
It's up now

I have an R9 270 already; it's just not installed while I'm doing these 5850 tests, of course. I might replace it next year, I might not.

For some reason the first 16 seconds of the video are all blurry. Maybe it's my connection/comp but I tried watching the video 2-3 times and it's the same thing. I like your entire commentary. You should try to do more videos with R9 270 to continue the series although I feel that the R9 270 may not have a life as long as your 5850 did.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
RussianSensation said:
For some reason the first 16 seconds of the video are all blurry. Maybe it's my connection/comp but I tried watching the video 2-3 times and it's the same thing. I like your entire commentary. You should try to do more videos with R9 270 to continue the series although I feel that the R9 270 may not have a life as long as your 5850 did.

Hmmm, it doesn't look more blurry/blocky to me beyond what can be attributed to Quick Sync handling real-time encoding duties in Open Broadcaster Software. I do not have an external recorder (can't afford one right now), and of course using x264 would eat precious CPU resources at a decent bitrate. The caveat is that Quick Sync recording in OBS is VERY temperamental with Windows 7 and likes to randomly crash. Moving to Windows 8 would solve this, but alas, I'm hesitant to spend the cash on an OEM disc. Windows 10 is out of the question because of privacy issues, even if it is a free upgrade from Windows 7, and apparently the crashing issues show up on Win10 too anyway. How odd.

And thanks for your comment on the commentary lol. I try to sound informative and entertaining without coming across like one of those cliché gamer dweebs who sound way too nasal. I like to investigate the technical and hardware side of running PC games, hence all the videos of that nature. Fighting for views and subs is a bitch though lol. I may do some more vids on my fiancee's i3-2120 & Radeon 7750 system to see how it handles recent games, for more perspective on how well it's aged.

As for my R9 270, yeah, it's not the best, but it's given me a good experience so far and a huge boost over the 5850. Honestly, I should've spent more money though. It was originally meant for my Phenom II X4 system, but that didn't go far when I had issues finding DDR3 RAM sticks that would actually work with the old 2009 AM3 mobo (voltage compatibility?). Being stuck on 4 GB of RAM was its death sentence, and I wanted to be absolutely ready for GTA5!
 
Last edited: