Gigabyte Gtx 970 G1 or AMD R9 390?


shady28

Platinum Member
Apr 11, 2004
2,520
397
126
Are we talking about a single card or multi-card setups? Of course, at that point, things change heavily. I wouldn't even recommend 2x of any card except the highest-tier cards (980 Ti, Titan X, Fury, Fury X). If you can get the same or similar performance with 1 card, you do that instead. Plus, 2x 970 will be held back by the 3.5GB memory issue more than 2x 290/X.

For a single card, unless you run some janky case with very, very little airflow, it wouldn't matter much. On the other hand, if you try to shove a GTX 970 into a case with janky airflow, it will suck, too! I have both of those cards. The heat issue is exaggerated. We're talking about a ~50-watt delta. Are you seriously thinking that's going to blow up your room? C'mon, that's a light bulb in terms of wattage.

edit: I'm talking about AIB GPUs, not reference designs. We all know the 290/X reference cooler was jank. A properly cooled 290/X consumes less power, runs cooler and doesn't throttle.

I was looking at his other thread where he got XFX 290s in crossfire. Apparently they massively overheated to 100+ C.

This is one of the more AMD-friendly graphs and it shows a 167W difference between 970SLI and 290 CF (493W vs 326W). The 2 x 290s take as much power as 3 x 970s would.

Granted they both can put out a lot of heat in SLI / CF, but the 290 takes it to a different level. Looking at the R9 390 review from guru3d, the 390X drops about 30W vs the 290X (they didn't have a 390 on the power chart).

So in X-Fire, with 390s maybe he will save 60W of heat dissipation and get down to ~430W dissipated vs. ~493W for 290s.

That is still > 100W more than a 970 SLI setup.
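A quick sketch of the arithmetic above (figures from the post; the ~30W-per-card saving for the 390 is the guru3d estimate, so the 390 CF number is only an approximation):

```python
# System power figures quoted in the post (watts).
power_290_cf = 493    # 2x R9 290 CrossFire
power_970_sli = 326   # 2x GTX 970 SLI
saving_per_390 = 30   # guru3d: 390X draws roughly 30W less than 290X

# Estimated draw for 2x R9 390 in CrossFire.
power_390_cf = power_290_cf - 2 * saving_per_390

print(power_290_cf - power_970_sli)  # 167W delta between 290 CF and 970 SLI
print(power_390_cf)                  # 433W, close to the ~430W estimate above
print(power_390_cf - power_970_sli)  # 107W, still >100W above 970 SLI
```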

But really, it would make more sense to get a Fury X or a 980 Ti. Or perhaps save some $$ and go for a good 980 or regular Fury.


 

werepossum

Elite Member
Jul 10, 2006
29,873
463
126
100% he is. I just realized he posted the $309 970 as a hot deal in the Hot Deals section, where I provided him with 3 alternatives from AMD and NV that were better value to start with. If he thought a $309 970 was a hot deal on August 18th, why did he need to start a thread on the 970 vs. 290 decision, the very same day I linked him a $290 970, a $220 290 and a $260 290X?

It's very easy to see that perf/watt and power usage were never factors that truly mattered, because if they did, the HD4000/5000/6000 series and a lot of the HD7000 series (it took NV 6-9 months to do a full Kepler roll-out from the 650 to the 660 Ti) would already have had 60-70% market share, since NV was not competitive in perf/watt for 3.5 generations. And yet, not only did NV users wait 6 months to buy worse-perf/watt Fermi cards, they also skipped the HD4000-6000 series.

Perf/watt is just a random metric that's used to justify their brand preference, like how a $200 960 is somehow worth buying over a 50% faster $250 R9 290. Similar to how they touted that 290/290X cards all run hot and loud, always ignoring the existence of after-market solutions. I thought you'd see right through all of that by now.

Did you ever wonder why almost none of them to this day go out of their way to recommend a $250 PowerColor PCS+ 290?

1. It's cool





It's 43% faster than an after-market 960, just 4% behind an after-market 970.
http://www.computerbase.de/2015-08/...#abschnitt_tests_in_1920__1080_und_2560__1440

It costs $250 and NV has no competition against it unless we consider a b-stock GTX 970. And yet this forum will NOT recommend an after-market 290, instead using NV marketing gimmicks like 4K decoding, HDMI 2.0, DX12 feature set, number of DisplayPorts and all kinds of other NV marketing bullet points to downplay the gigantic 40-50% performance deficit that cards like the 960 have against similar AMD cards. It's almost as if some posters on here are working for a certain company, or are so devoted to the brand that they downplay traditional metrics like performance and perf/watt in favour of features that either have no benefit to the majority of PC gamers or may only prove beneficial years from now, when it's time to upgrade to a new card anyway.

In Trine 3, a reference R9 290 is 42% faster than a 960 and 91% faster than a $170-180 GTX 950. North American review sites get marketing $, free review samples, media perks and so on, so a company with more resources is able to provide more of these perks. Similarly, if the GPU market share is nearly 80% in favour of a major player that sends you free review samples, which cards are you more likely to recommend?

That's why today the PC gamer has to go out of his way to read European, Australian, New Zealand, etc. objective reviews and do proper research, as most North American sites (especially the ones who don't buy their own cards) can hardly be trusted to provide truly objective/BEST advice for a PC gamer. That's why we are starting to see websites create artificial budget ceilings of $170 or something ridiculous to absolve themselves of any bias, but anyone with experience following this industry can see through most of this marketing PR. Forums such as ours are supposed to help PC builders/gamers weave through all of the marketing BS/PR, because the very existence of these publications rests on the support of the companies that send them samples and marketing $$$. Because of this conflict of interest, it's more or less impossible to expect objectivity in today's North American hardware reviews. Some sites are so obvious now they hardly even try to hide who supports them.
I've noticed that often I get better information from non-American sites. I think the bigger American sites get so much ad revenue that they go out of their way to not offend either company, to the point of obscuring where one manufacturer has an honest and undeniable advantage over another. Typically the market forces equivalent pricing for equivalent performance, but if the performance is perceived as equal even if it's not, then pricing can remain equal. We are not well served by such reportage.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I was looking at his other thread where he got XFX 290s in crossfire. Apparently they massively overheated to 100+ C.

That's the problem right there. He never owned XFX R9 290s in cross-fire.

Notice he doesn't even believe that a single R9 390 can be powered by his 750W power supply?

http://forums.anandtech.com/showpost.php?p=37638999&postcount=12

Do you think someone who ran dual R9 290s in Cross-fire would ask that type of question if his PSU already powered dual R9 290s? Secondly, you are saying he ran R9 290s in Cross-fire, but he says he originally owned an XFX R9 290 and then returned his 2nd R9 290X. So how many R9 290 cards did he own? 3?

Finally, why would he return two cross-fired R9 290(X) cards and buy just a single 970 to replace them? He would get dual 970s or a single 980 if he really wanted the power of dual R9 290s (hence why he bought them supposedly in the first place). Are you suggesting he first spent $400-500 on dual 290s and then had a change of heart and said, oh jeez, I'll just downgrade to a single 390/970, just 1 month later?

The crazy part is that in this very thread he never once mentioned anything about running dual R9 290s; it's only something you brought up. Every review shows that an XFX R9 290/290X Black Edition doesn't hit 90-95C at load, and when he was questioned about any details of his card or his fan profile, he ignored all of that.

His story doesn't add up. What's more likely is that he owned a single XFX R9 290 Black Edition and then flashed it with an R9 290X BIOS from another version of the card, but this new BIOS had a borked fan speed profile, which meant his fan wasn't ramping up properly on the flashed card. Of course, he admits to no such thing in this thread about how he altered the card to perform outside the scope of its boxed specifications, without knowing all the risks that come with flashing a BIOS from a different card.

EDIT:
There you go ^ that's exactly what he did, and then he tried to claim in this thread that R9 290 cards are nuclear reactors and run hot. But now we find out he ran the card outside of its specs and most likely messed up his own card by not adequately accounting for a different BIOS' fan curve settings. It's hard to believe that someone who can't figure out that a 750W PSU can power an R9 390 would even have the necessary skills to do an R9 290 -> R9 290X BIOS flash and set up a proper custom fan profile. Not a chance.

So who is the one making stuff up in this thread?
 
Last edited:
Reactions: Grazick
Feb 19, 2009
10,457
10
76
No, I am not going to walk away. I will call out information that appears questionable. You claimed earlier that your XFX R9 290 was hitting a max load temperature of 40C, and suddenly it hits 90-95C?

Secondly, thus far you have provided zero proof of anything you claim. No MSI Afterburner screenshots of fan profiles, no temperature screenshots, no video, nothing.

Thirdly, per your July 23, 2015 post, you already apparently had an XFX R9 290 in your possession, but on August 18, 2015, you asked in this thread whether an XFX R9 290X is worth buying.

A gamer who has already owned an XFX R9 290 for 1 month isn't going to ask such questions on a gaming forum, since an R9 290X isn't even an upgrade from an XFX R9 290. Furthermore, if both of your R9 290 cards ran hot and loud, what about the mythical XFX R9 290 Black Edition that hit 40C at load -- according to you?

In this post you said that you are returning your 2nd XFX R9 290X, but you never owned a 1st XFX R9 290X.

Your story doesn't add up, and you never asked for help on this forum regarding your supposedly hot and loud R9 290X cards, so that perhaps people could have helped you with fan profiles, seen pictures of your case to assist with airflow, etc.



That's not what his posts say at all. He claimed in this very thread that his R9 290/290X cards hit 90-95C. Yet on July 23rd, he claimed:

GEOrifle
"P.S. By the way my R9 290 Black Edition is quiet as a mouse 35c idle under 40c under heavy load all spect max. First time I saw FarCry4 got actual Fog in game, I got 7 fans in case with adjustable speed, CM Sniper baby." ~ Your own source

Sorry, unless he moved to Africa and has his PC outside, he is either lying or trolling. And no one said anything about Cross-fire or SLI so no point in even bringing that up.



He already did, but the main point is that he never had any after-market R9 290 cards that hit 95C at load. Not to mention, the original thread discussed R9 390 vs. 970, but his very first post started claiming how NV cards run way cooler and so on, basically opening his original thread with false claims to ultimately justify his purchase anyway, without doing objective research on the topic:

MSI Gaming R9 390 vs. EVGA GTX970
Sapphire R9 390 Nitro vs. MSi Gaming GTX970

RS, nice pwning of an obvious troll. I don't even know why people pretend to ask for help, wasting everyone's time.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
That's the problem right there. He never owned XFX R9 290s in cross-fire.

Notice he doesn't even believe that a single R9 390 can be powered by his 750W power supply?

http://forums.anandtech.com/showpost.php?p=37638999&postcount=12

Do you think someone who ran dual R9 290s in Cross-fire would ask that type of question if his PSU already powered dual R9 290s? Secondly, you are saying he ran R9 290s in Cross-fire, but he says he originally owned an XFX R9 290 and then returned his 2nd R9 290X. So how many R9 290 cards did he own? 3?

Finally, why would he return two cross-fired R9 290(X) cards and buy just a single 970 to replace them? He would get dual 970s or a single 980 if he really wanted the power of dual R9 290s (hence why he bought them supposedly in the first place). Are you suggesting he first spent $400-500 on dual 290s and then had a change of heart and said, oh jeez, I'll just downgrade to a single 390/970, just 1 month later?

The crazy part is that in this very thread he never once mentioned anything about running dual R9 290s; it's only something you brought up. Every review shows that an XFX R9 290/290X Black Edition doesn't hit 90-95C at load, and when he was questioned about any details of his card or his fan profile, he ignored all of that. His story doesn't add up.


You're right, that doesn't make sense. I thought he was still doing x-fire / SLI. He was probably trolling to try to make some kind of point.
 
Reactions: Grazick

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
You're right, that doesn't make sense. I thought he was still doing x-fire / SLI. He was probably trolling to try to make some kind of point.

Reading some of his posts more closely, even the original card comparison doesn't make a lot of sense to me:

That's right. I guess 390 wins but there is just couple brands i recommend: MSI and Sapphire, maybe XFX too, to bad they eliminate Lifetime Warranty.
No Powercollor, they don't support second owner warranty, i had a chance to buy 3 month old 390x for $350 and just ignored because you never know when it will go bad.
So lets wait when MSI will have discount on 390, they have on 390x for $399 after MIR but

If he is so concerned about warranty, he should have gone with an EVGA card and its extended warranty. Gigabyte's warranty starts from the date the card was manufactured, not the date of purchase, so it's actually not even the best NV AIB to purchase if warranty and card replacement in case of failure is the #1 priority. In other words, if the longest warranty were the #1 priority, we wouldn't even recommend an AMD R9 390 card.
 

Seba

Golden Member
Sep 17, 2000
1,497
144
106
We all know the 290/x reference cooler was jank. A proper cooled 290/x consumes less power, runs cooler and doesn't throttle.
A better cooler lowers the temperature, but does not make the card consume less power.
 

Seba

Golden Member
Sep 17, 2000
1,497
144
106
Yes it does, all else being equal, cooler transistors are more efficient transistors.

Do you have an example where a better cooler on a graphics card resulted in lower measured power consumption (at the same GPU and VRAM frequencies)?
 

Seba

Golden Member
Sep 17, 2000
1,497
144
106
First link is about a CPU.

The second discusses improvements in R9 290X GPU. I do not see here a power consumption comparison with two coolers on the same card (one at a time).
 
Nov 2, 2013
105
2
81
First link is about a CPU.

You think GPUs are made from magical transistors that don't obey the same general laws?

The second discusses improvements in R9 290X GPU. I do not see here a power consumption comparison with two coolers on the same card (one at a time).

Here is the relevant passage:

All other elements being held equal, temperatures affect silicon devices in 3 important ways: longevity, power consumption (leakage), and attainable clockspeeds. For longevity there’s a direct relationship between temperature and the electromigration effect, with higher temperatures causing electromigration and ultimately ASIC failure to occur sooner than lower temperatures. For power consumption there is a direct relationship between temperature and power consumption, such that higher temperatures will increase static transistor leakage and therefore increase power consumption, even under identical workloads.
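To illustrate the leakage relationship the quoted passage describes, here's a toy model in Python. The exponential form and all the constants are illustrative assumptions only, not measured GPU data:

```python
import math

def static_leakage(temp_c, base_watts=20.0, ref_temp_c=25.0, scale_c=30.0):
    """Toy model: static (leakage) power rising roughly exponentially
    with die temperature. All constants are made up for illustration."""
    return base_watts * math.exp((temp_c - ref_temp_c) / scale_c)

# Even under an identical workload, a hotter die leaks more:
for t in (65, 75, 85, 95):
    print(f"{t}C -> {static_leakage(t):.0f}W static leakage (toy numbers)")
```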
 
Last edited:

redzo

Senior member
Nov 21, 2007
547
5
81
"Facts":
1. The 390 is more power-demanding than a 970. The main letdown, I think, is the multi-monitor power usage. If you use multiple monitors and don't game that much, you'll spend most of the time at the desktop and, unfortunately, the 390's power consumption under this use case is a few times higher than the 970's.
2. The 390 has twice the memory capacity, the 970's 0.5GB full-speed issue aside.
3. Currently, they both perform almost the same, cherry-picked high-clock OC products aside.
4. DirectX 12 is irrelevant right now. It should not matter. By the time DX12 matters, this generation will be obsolete or very underperforming.

"Speculation":
1. I believe more in the longevity of the 290/X and 390/X. The 7970/GHz/280X aged very well, and I expect the same for the 290/X, considering they're 99% the same chips.
2. I expect the 390 to come down in price more.

At the end of the day, it is currently more a matter of pure taste.

I would personally choose a 390 based on "Speculation" and forget about its power consumption disadvantage.
And there is also this feeling: the 280X used to compete with the GTX 770, and the 770 used to be a little more expensive too. After a few years, I would rather end up with a 280X in my rig than with a 770. When I look at both of these GPUs, that's what I see: 280X(390) vs 770(970).
 

Seba

Golden Member
Sep 17, 2000
1,497
144
106

On air coolers you do not get a 50-degree Celsius temperature drop by changing the cooler. That is why I did not see such differences in power draw between various air-cooled cards (same model/same GPU, but with different coolers).

The Sapphire Radeon R9 290X with the stock cooler averaged 146 Watts with a peak of 171.5 Watts. The same card with the water cooler averaged just 120 Watts with a peak of 147 Watts. The fact that GPU-Z was using 26W less power with water cooling was astounding. We showed these results to AMD and they said that at lower temperatures there will be less leakage across insulators inside the GPU. When talking about a temperature drop of 94C to 84C that difference is usually negligible, but since we had a 50C temperature drop that could be part of the reason we are seeing huge power savings.

Read more at http://www.legitreviews.com/nzxt-kr...d-radeon-r9-290x_130344/4#gjJcgYtcmELxuVf0.99
 
Last edited:

redzo

Senior member
Nov 21, 2007
547
5
81
From what I remember, it took at least one hardware generation to get acceptably high FPS at high image quality for each new DX iteration.

My thinking is: HD5000 was DX11-capable, so was HD6000, so was HD7000. I think that current-gen DX12 hardware belongs to the same class as the first DX11 hardware, the HD5000.
Serious DX12 will arrive with HBM2 products, and DX12 will get fine-tuned with next-gen HBM2.

AOS is currently a dx12 benchmark and not a game. Devs may decide to push in game image fidelity much more than this.
 

iiiankiii

Senior member
Apr 4, 2008
759
47
91
On air coolers you do not get a 50-degree Celsius temperature drop by changing the cooler. That is why I did not see such differences in power draw between various air-cooled cards (same model/same GPU, but with different coolers).

It doesn't have to be a 50C delta to have an effect. Basically, at a certain point, the heat is so great that the chip starts leaking like crazy. If you keep your GPU around 75C (which isn't hard at all with AIB coolers), you'll greatly reduce power consumption compared to the 94-95C of the stock blower.

Here's an air cooler that uses less power than the stock blower: notice that the stock cooler consumes 246W on average vs. 221W for the Sapphire Tri-X. Remember that "quiet" mode means running the cooler at 94-95C!!



and another one... 218W vs 246W!!



and another.... 231W vs 246W



Also, don't forget that these cards are also FACTORY OVERCLOCKED when compared to the stock cooler. That means not only do these AIB GPUs consume less power, they also have more performance.
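Putting the figures in this post together, the after-market coolers' savings against the 246W stock blower work out to roughly 6-11%. A quick sketch (the first cooler is the Sapphire Tri-X named above; the other two labels are placeholders, since the original images identifying them are missing):

```python
stock_blower_w = 246  # stock blower average cited in the post

# After-market cooler averages cited above; only the Tri-X is named
# in the surviving text, the other two names are placeholders.
aib_coolers = {"Sapphire Tri-X": 221, "cooler #2": 218, "cooler #3": 231}

for name, watts in aib_coolers.items():
    pct = 100 * (stock_blower_w - watts) / stock_blower_w
    print(f"{name}: {watts}W, {pct:.1f}% less than stock")
```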
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
No, I am not going to walk away. I will call out information that appears questionable. You claimed earlier that your XFX R9 290 was hitting a max load temperature of 40C, and suddenly it hits 90-95C?

Secondly, thus far you have provided zero proof of anything you claim. No MSI Afterburner screenshots of fan profiles, no temperature screenshots, no video, nothing.

Thirdly, per your July 23, 2015 post, you already apparently had an XFX R9 290 in your possession, but on August 18, 2015, you asked in this thread whether an XFX R9 290X is worth buying.

A gamer who has already owned an XFX R9 290 for 1 month isn't going to ask such questions on a gaming forum, since an R9 290X isn't even an upgrade from an XFX R9 290. Furthermore, if both of your R9 290 cards ran hot and loud, what about the mythical XFX R9 290 Black Edition that hit 40C at load -- according to you?

In this post you said that you are returning your 2nd XFX R9 290X, but you never owned a 1st XFX R9 290X.

Your story doesn't add up, and you never asked for help on this forum regarding your supposedly hot and loud R9 290X cards, so that perhaps people could have helped you with fan profiles, seen pictures of your case to assist with airflow, etc.



That's not what his posts say at all. He claimed in this very thread that his R9 290/290X cards hit 90-95C. Yet on July 23rd, he claimed:

GEOrifle
"P.S. By the way my R9 290 Black Edition is quiet as a mouse 35c idle under 40c under heavy load all spect max. First time I saw FarCry4 got actual Fog in game, I got 7 fans in case with adjustable speed, CM Sniper baby." ~ Your own source

Sorry, unless he moved to Africa and has his PC outside, he is either lying or trolling. And no one said anything about Cross-fire or SLI so no point in even bringing that up.



He already did, but the main point is that he never had any after-market R9 290 cards that hit 95C at load. Not to mention, the original thread discussed R9 390 vs. 970, but his very first post started claiming how NV cards run way cooler and so on, basically opening his original thread with false claims to ultimately justify his purchase anyway, without doing objective research on the topic:

MSI Gaming R9 390 vs. EVGA GTX970
Sapphire R9 390 Nitro vs. MSi Gaming GTX970


EPIC takedown. Legendary.

The trolls lose one of their number today
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Off topic, but where are you getting your LED bulbs for $5 like you mention in your sig? All of the decent LED bulbs that I have been able to find locally have been expensive, especially if you want 100W-equivalent bulbs.

Keep an eye on Slickdeals.net. LED Bulbs seem to have really good discounts there. You usually have to buy 6+ bulbs in a pack in order to get around $5 or less per unit. For example: http://slickdeals.net/f/8062402-led...humbs+-+Title+and+First+Post+-+Hot+Deals+Only.

Unit price: $3.20 each for 60W equivalents. The 100-watt calculation was based on a deal I saw: 2 for $5 on 60W equivalents at 10W consumption each, vs. 60W consumption each for the incandescents being replaced -> (60-10) + (60-10) = 100W saved, or 50W per bulb. 100W-equivalent bulbs are generally a good bit more expensive than the 60W equivalents, but they've been coming down now too.
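For clarity, that deal's savings math as a quick sketch:

```python
# Figures from the deal: a 10W LED replaces a 60W incandescent
# (a "60W equivalent"), and the deal was for 2 bulbs.
incandescent_w = 60
led_w = 10
bulbs = 2

saving_per_bulb = incandescent_w - led_w  # 50W per bulb
total_saving = bulbs * saving_per_bulb    # 100W total

print(saving_per_bulb, total_saving)  # 50 100
```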

You can find deals there fairly regularly for all sorts of different types of bulbs: higher/lower power equivalents, flood light versions, IoT bulbs. Watch out for dimmable vs. non-dimmable LEDs and read the reviews. Some are technically dimmable but work like crap when you actually dim them. There's definitely more variation in quality than with incandescents or even CFLs, but there are certainly high-quality ones out there.

edit: sometimes you can get Amazon to price match too
 
Last edited:

tential

Diamond Member
May 13, 2008
7,355
642
121
You couldn't pay me to use a currently released GPU in a DX12 game. Great for those who are thinking of buying and holding an AMD product to play at reduced settings in DX12.

As you can obviously tell, NVIDIA isn't worried about that. They're worried about Pascal/Arctic Islands + DX12.

If you're planning on using a current GPU vs. a GPU twice as fast next year, then you're not a customer either company should be worried about in relation to DX12. People seem to forget that these companies are here to make money; having products that limp through the years like AMD's does nothing to help generate new sales.
 

GEOrifle

Senior member
Oct 2, 2005
806
5
81
***********************************************

WTF??? ARE YOU NUTS ????? RELAX THERE and ENJOY FORUM.
Your logic SUCKS MAN !!!!
I don't need to send you receipts of cards.
1. First card was an XFX R9 290 BE; in the beginning it was good, then BSODs and black screens.
2. Second, an XFX R9 290X: way too hot.

I didn't lie and JUST MAKE SURE YOUR "TEACHING" TONE IS ACCEPTABLE.
Because of YOU I BOUGHT DAMN 290x and YOU GOT YOUR TILE BURNED !!!!

R9 SUCKS and just ENJOY IT....
Your help was really appreciated in the beginning but this is OVERDOSE !!!!
AND enough saying stupid things, open up your own post and do WHATEVER YOU WANT !!!!!
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
If your card started overheating and BSODing out of nowhere, you should RMA it. That's a faulty card. If you think they all work that way you're totally off base.
 