[Rumor (Various)] AMD R7/9 3xx / Fiji / Fury


Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
As it stands, based on the leaks, it looks like we're going to get trash silicon for everything below the R9 390X.

The leaks indicate that there's only going to be a R9 380, not a R9 380X. This hints that it's going to be the same cut-down Tonga chip we're getting now. The Gigabyte box render indicates it has an 8-pin connector; the OEM page says two six-pins. This means no gains in power efficiency. It's worth pointing out that AMD could have fit Tonga in a single 6-pin (under 150W) if they wanted to; the FirePro W7100 does this. They just didn't care.

The R7 370 Pitcairn appears to be 1024 shaders (meaning it's a rebranded R7 265, which is itself a rebranded HD 7850). It's listed as being 20% more powerful than a GTX 750 Ti, and a full Pitcairn would do better than that (25%-30% better). Also, the OEM page indicates that the OEM 370 (which for some reason is R9 series instead of R7) is 1024 shaders. Why AMD chose to do this is baffling, since they can't possibly be having yield issues at this point in the 28nm process on a chip that small, and the existing R9 270 (non-X) already runs on one 6-pin power connector if they were concerned about that.

The OEM R9 360 is 768 shaders (cut-down Bonaire), so it's now safe to assume that the retail R7 360 will be the same. Here, this appears to have been done for power consumption reasons; according to AMD's page, the R9 360 doesn't have an auxiliary power connector. I guess this is supposed to be competition for the GTX 750, but I don't see it being a big seller when the GM107 cards are more powerful and more efficient for these kinds of systems.



At least we'll have Fiji to look forward to.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
I don't think there was a worthwhile purchase below the 290 mark anyway.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yep, $499 for an 8GB Fiji Pro, if it performs better than a 980 (and hopefully close to the 980Ti in a few things), is the perfect choice.

It doesn't have to be 8GB. Fiji PRO with 4GB of HBM1 is more impressive than Fiji PRO with 8GB of GDDR5, because with 3500+ shaders it's better to have next-gen memory tech against the 980 non-Ti.

But what do you think the R9 390/X reviews are going to say, if these cards are nothing more than straight rebrands of a 20-month-old product with more RAM and a slightly higher clock?

OK, we can't have a discussion if the term Rebrand is used incorrectly. You guys keep interchanging Rebrands with Refreshes. According to some people on this forum, the GTX 560 and 770 were rebrands; no, those are Refreshes. A rebrand is taking the exact same product and changing the name. If you up clock speeds, double the VRAM, and add features like HDMI 2.0, that's called a refresh, not a rebrand. With that out of the way, did people on this forum and the Internet criticize the GTX 560 / 770 Refreshes? No! Those cards sold like hot cakes.

If the review sites are not biased/paid off by NV, they will compare these cards against every AMD/NV card in that price range and make objective conclusions. What did these same review sites say about Refreshed GTX560/770 cards?

This forum seriously has a short memory and some slanted bias, alright. The GTX 560 / 770 sold like hot cakes and no one called them Rebrands, rebrands, rebrands. What people call Rebrands today used to be called Refreshes in the past.

HD6870 launched October 21, 2010, cost $239

GTX560, a refresh of GTX460, launched May 17, 2011, cost $199

I don't remember this entire forum ripping GTX560 apart.
http://www.anandtech.com/show/4344/nvidias-geforce-gtx-560-top-to-bottom-overclock/16

In fact, the 770 sold like hot cakes, and that card was overpriced from day 1 based on price/performance. I recall a TON of PC gamers who bought 770s. Was the 770 criticized like crazy for being a refresh of the 680? No, the media praised it.

Do I like refreshes? No, not really, but do they make sense sometimes? Yes, they do; both NV and AMD have done them for years with 10-15% performance increases and minor new features. GTX 280 -> 285 is another such example. Refresh, not rebrand.

They're going to say that AMD is trying to sell old wine in new bottles for an inflated price. The new reviews may not contain the same criticisms as the old ones, but they are definitely going to be critical of AMD for not doing any real work in bringing new midrange designs to market - and rightly so.

What difference does it make if it's a Refresh or a new design, as long as the product is competitive?

680MX = 780M = 880M
The only differences are VRAM, higher clocks, and a maturing fabrication process that improved perf/watt over time (higher GPU clocks at lower power usage). Three consecutive NV flagship mobile dGPUs based on the same 680MX design sold like hot cakes, and not one person on this forum who keeps regurgitating rebrands, rebrands, rebrands criticized those cards.

http://videocardz.com/48633/nvidia-geforce-gtx-880m-rebranded-gtx-780m-8gb-memory

NV and AMD both do this and have done it for years.

Aftermarket coolers can reduce the noise and temperature issues, but they don't substantially reduce power consumption. You're still looking at 253W in games and 316W in FurMark.

#1 I won't argue with you about FurMark since it's pointless. You clearly do not understand what FurMark is and keep using it to gauge power usage.

#2 How do you know that AMD didn't produce revision 2 Hawaii chips with identical specs but achieved better leakage and perf/watt?

I also expect Nvidia to take direct aim at AMD if the 300 series really is all straight rebrands. Traditionally, neither company has used this as a talking point in their marketing campaigns, since they've all been guilty of it to some extent.

If they do, NV will be the most hypocritical GPU firm of all time, considering they reused Fermi/Kepler for years and refreshed the GTX 680MX three generations in a row.


The bottom line is that AMD's current graphics lineup is not only out of date, but it has to compete with a massive backlog of overproduced stock and used ex-mining cards. If AMD wants to be able to charge higher prices, they need to actually produce new silicon, not just new stickers.

We'll just agree to disagree. If you start comparing used cards against new cards, might as well write-off the entire AMD/NV line-up of existing cards besides the 980Ti. What kind of a comparison is that?

You provided no rebuttal as to how a cool and quiet $329 R9 390 with 8GB is not a better buy than a $330 970 3.5GB card if the performance is similar. You provided no rebuttal as to how a $389 R9 390X 8GB that, based on rumours, will be similar to a 980 in performance is a fail against a $499 980. I guess your rebuttal is NV marketing = winning?

As I said, a lot of marketing-brainwashed average-Joe PC gamers think the 290X uses 300W of power, runs at 94C, and sounds louder than a 480. They wouldn't buy it for $199. This marketing argument won't work against a cool and quiet 390/390X. It won't be possible to claim that the R9 390/390X run hot and loud, which removes 2 of the 3 image disadvantages the 290/290X had right there.

You keep discussing perf/watt but ignoring that 970 only has 3.5GB of fast VRAM. You don't think that matters at all?

"Geforce GTX 970 issue destroyed sales in February"

If a 390 beats a 970 by even 3%, I won't recommend a 970 with 3.5GB of VRAM over a true-4GB 390 card that's also cool and quiet and faster. If someone is buying a $300 GPU, what do you think matters more to them: saving $2 a month in electricity, or not knowing if 3.5GB of VRAM will become an issue in 6 months? Since most of the PC gaming community thought that the R9 290/290X sounded like jet engines and ran at 94-95C, their performance didn't even matter to start with. That argument disappears once the 390/390X launch.
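The electricity claim above is easy to sanity-check with a back-of-the-envelope calculation. Every input here is an assumption chosen for illustration, not a measurement: an 80W draw difference, 4 hours of gaming per day, and a US-ish $0.12/kWh rate.

```python
# Rough monthly cost of an 80W power-draw difference between two cards.
# All three inputs are assumptions for illustration only.
DELTA_W = 80          # extra watts drawn by the hungrier card while gaming
HOURS_PER_DAY = 4     # assumed daily gaming time
RATE_PER_KWH = 0.12   # assumed USD/kWh; roughly double for Australian rates

kwh_per_month = DELTA_W / 1000 * HOURS_PER_DAY * 30
cost_per_month = kwh_per_month * RATE_PER_KWH
print(f"${cost_per_month:.2f}/month")  # prints $1.15/month with these inputs
```

So the "$2 a month" figure is roughly the right order of magnitude at higher electricity rates or longer play sessions.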

All of a sudden, the 3.5GB VRAM vulnerability of the 970 and 980's horrible price/performance at $499 are back on the table. If you are going to be objective, then it's only fair you address respective weaknesses of both AMD and NV's line-ups, yet in your case you aren't doing that at all. You are just focusing on how a hypothetically 10% faster R9 390/390X are automatic failures since they are refreshes, yet ignoring the major weaknesses 970 and 980 have too.
 
Last edited:

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
OK, we can't have a discussion if the term Rebrand is used incorrectly. You guys keep interchanging Rebrands with Refreshes. According to some people on this forum, the GTX 560 and 770 were rebrands; no, those are Refreshes. A rebrand is taking the exact same product and changing the name. If you up clock speeds, double the VRAM, and add features like HDMI 2.0, that's called a refresh, not a rebrand.

Nope. If there are no transistor-level changes on the wafer, then it's a rebrand. GTX 580 wasn't a rebrand of GTX 480 because there were changes to the die. In contrast, R9 280X was just a rebrand of 7970 GHz Edition because there was no change in the actual silicon. Putting in faster RAM chips doesn't change that.

Even if they release full Tonga for the desktop, it would still be a rebrand, because the Tonga silicon already exists coming off the wafers. It's only a new chip if they tweak the base layer (not just the metal layer - that's a mere stepping, which is a rebrand).

As I said, a lot of marketing-brainwashed average-Joe PC gamers think the 290X uses 300W of power, runs at 94C, and sounds louder than a 480. They wouldn't buy it for $199. This argument won't work against a cool and quiet 390/390X.

It does use 300W of power, and will probably use even more on the R9 390X due to the completely unnecessary switch to a 1500 MHz memory clock.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Why? If it is indeed $389, it would be $100 cheaper than a 980, have twice the RAM of a 980 and would have nearly the same performance. Where's the downside? Close in performance, cheaper and with more memory. Sounds like a good deal. Oh, and it should be DX12 compatible as well.


?? It's already nearly as fast as a 980 at 4K. And we don't know yet if the 390X is going to have the exact same GPU as the 290X or if it's been enhanced. Personally, I suspect enhanced but we'll have to wait for either a solid leak or launch day to know for sure.

They might be simply going for the moar is better marketing? I hope not though. Considering you can get the Sapphire 290X tri-X 8GB for $360 w/Free game right now.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Nope. If there are no transistor-level changes on the wafer, then it's a rebrand. GTX 580 wasn't a rebrand of GTX 480 because there were changes to the die. In contrast, R9 280X was just a rebrand of 7970 GHz Edition because there was no change in the actual silicon. Putting in faster RAM chips doesn't change that.

Even if they release full Tonga for the desktop, it would still be a rebrand, because the Tonga silicon already exists coming off the wafers. It's only a new chip if they tweak the base layer (not just the metal layer - that's a mere stepping, which is a rebrand).



It does use 300W of power, and will probably use even more on the R9 390X due to the completely unnecessary switch to a 1500 MHz memory clock.

The 290X does not use 300W under normal usage (i.e. games, where it averages ~250W). FurMark is not a game; it is not used to tell people how many watts a GPU draws while gaming (i.e. regular usage). To use it as such is disingenuous.

If you want to go that route, then the GTX980 uses 342W according to TPU, as opposed to 182W while gaming.

Plus, stating as fact that the 300 series are direct renames of the 200 series is WRONG, because we DO NOT KNOW. You can talk about how rumors state one thing or another, but do not state them as fact.
 
Feb 19, 2009
10,457
10
76
Rebrand = giving the same chip a new product name.

If it's a respin with less leakage at TSMC, that isn't a rebrand.
If it's produced at GloFo, that isn't a rebrand.

Both of those result in a different chip. Thus, it's more of a refresh.

Also, it's funny how these clickbait sites, videocardz & WCCFTech, have all this time sold the idea that it's a straight rebadge with higher clocks and more VRAM... and now they are calling it "Enhanced Hawaii". Oh really? How is it enhanced? Well, duh: it's probably on GloFo with ~30% better perf/W, like some of us have speculated since the start of the year.

That's the only way they are gonna sell at ~$329/$389 prices. They are never gonna sell many if the gaming power load remains ~240W.

The problem with the R290/X was never performance. It was 1) high power use compared to the 970/980 and 2) poor OC capability. If those two issues are addressed, it's very competitive.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Nope. If there are no transistor-level changes on the wafer, then it's a rebrand.

Wrong. A rebrand is taking a product and releasing it in identical form: no clock speed changes, no VRAM changes, no new features. The GTX 560/770 are not rebrands; they are refreshes. According to you, the 560/770/680MX/780M/880M are all rebrands.

Are you saying that i7 4790K is a rebrand of an i7 4770K then? That's called a Refresh, not a Rebrand. You are using the word rebrand incorrectly.

GTX 580 wasn't a rebrand of GTX 480 because there were changes to the die.

GTX580 wasn't a re-brand not only because of changes to the die but because it had new features like improved Z-culling and different specs.

If a new product is 10-15% faster and has even 1 new feature (HDMI 2.0), it's a Refresh, not a Rebrand. Rebrand is like taking a 2015 Mustang GT and calling it a 2016 Mustang GT with 0 changes to any specs.

Even if they release full Tonga for the desktop, it would still be a rebrand, because the Tonga silicon already exists coming off the wafers.

You are interchanging the terms based on silicon level vs. product level.

If Porsche takes a 3.8-litre engine and increases its horsepower from 400 to 425hp, is that a rebrand? No, that's a refresh of existing technology with a better revision, even if 98% of the engine is the same.

Since we never had a 2048-shader, 384-bit Tonga XT on the desktop, that's not a rebrand of the 285. That's a completely new product even if the ASIC itself is the same. If the performance and specs improved, it's called a Refresh, for the 100th time. This is how it's been for 30 years of GPU releases, and you are changing the definition because you feel like it.

It's only a new chip if they tweak the base layer (not just the metal layer - that's a mere stepping, which is a rebrand).

New product != new ASIC

i7 4790K is an improved version of i7 4770K, but it's not a rebrand, it's a refresh. When CPU manufacturing nodes mature and AMD/Intel are able to release faster clocked versions of the exact same ASIC, it's called Refreshing the CPU lineup, not Rebranding the CPU lineup.

It does use 300W of power, and will probably use even more on the R9 390X due to the completely unnecessary switch to a 1500 MHz memory clock.

Not in games, it doesn't. Your constant use of FurMark is irrelevant. It's been shown to be irrelevant, but you keep ignoring that.

If you want to live in la-la land and use a power virus as an indication of real-world gaming power draw, no one on these forums will take you seriously in discussions of power usage.


FurMark is a worthless metric for measuring real-world gaming power usage. That is a fact, not an opinion. You really need to catch up on how FurMark works, since no one on these forums except you treats it as indicative of real-world power usage. No game engine is based on FurMark, no game can stress the GPU to 99.99% the way a power virus can, and both AMD and NV have publicly stated that FurMark is not representative of any real-world application.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Rebrand = giving the same chip a new product name.

If its respin with less leakage at TSMC, that isn't a rebrand.
If its produced on GloFO, that isn't a rebrand.

Both of those result in a different chip. Thus, its more a refresh.

It goes beyond that. According to him now, a 1050MHz HD7970 GHz is a rebrand of the 925MHz HD7970, and the 560/770/680MX/780M/880M are all rebrands too. Next he'll tell us that the 6800 Ultra Extreme and X850XT Platinum Edition are rebrands of the 6800 Ultra and X800XT. 99% of the world considers all of those products Refreshes, because performance and specs changed, even if the underlying ASIC is made on the exact same process node.

He is just making up new terms in the industry based on arbitrary definitions in his own mind, just like he continues to use FurMark as a measurement of real world power usage.

What's next, once Intel releases a 4.6Ghz 6790K as a successor to 4.2Ghz i7 6700K, that's also a "rebrand"?

His incorrect use of the term Rebrand is remarkable, since he is trying to rewrite the entire history of GPUs. According to him, the X850XT PE and 6800 Ultra Extreme are both instant fails next to the X800XT/6800 Ultra, but no one said that during that generation, because most people understand that if you improve performance 8-15% over time, whether via a new ASIC or a more mature node with lower leakage, you are Refreshing your product line; that's a normal course of business for both NV and AMD/ATI. NV did it with the 285/560/770 and that barrage of 680MX chips I mentioned. Is releasing an all-new ASIC better? Most of the time, YES, but that doesn't make a Refresh an automatic fail, as he tries so hard to spin it. You can have a great refresh like this:

 
Last edited:
Feb 19, 2009
10,457
10
76
7970 -> 7970GHz is a new SKU, not really a refresh, because they co-existed; the former was not phased out immediately.

What Intel does is a refresh: as their node matures, they can release higher-clocked chips that use the same power, but they phase out the older chip.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
7970 -> 7970GHz is a new SKU, not really a refresh, because they co-existed; the former was not phased out immediately.

What Intel does is a refresh: as their node matures, they can release higher-clocked chips that use the same power, but they phase out the older chip.

Even if we put aside the definitions of Refresh vs. Rebrand, it makes no sense to criticize faster 290/290X cards with double the VRAM while ignoring the weaknesses of the competitor.

Talk about sweeping the 3.5GB VRAM issue under the rug when it hasn't gone away. The fact of the matter is that once VRAM is pushed, the 970's minimums can drop below 30 fps while the R9 290X keeps on going. In Shadow of Mordor, the 970 is not in the same league!




Also, if the R9 390X is 98% as fast as the 980 but costs $110 less, that's a lot of money to some people next to saving $2 on electricity.

I think at this point it's best to wait for the actual launch: see official prices, gauge whether there have been any perf/watt improvements, whether AMD changed the HEVC unit, and where the 390/390X stand vs. the 970/980 once prices settle 1-2 weeks after launch. After all, what if Fiji PRO pushes the price of the $499 980 down? At that point, whatever comparisons we make now won't be relevant against, say, a $449 980. We can't assume that NV will just stand still. They could announce 8GB 970/980 versions, price drops, new game bundles, some amazing new drivers, etc.
 
Last edited:
Feb 19, 2009
10,457
10
76
Power use isn't just electricity cost; it's heat released into the room, requiring more case airflow ($ and noise) and a better PSU ($). If you live in a hotter climate, you know how bad that is; higher ambient temperatures just make everything worse. There's certainly a premium for efficiency. You may not agree, but the market has spoken, so get used to the idea.

The competitor to the 390/X will still be custom 970 models. The 980 is not the prime mover; if we read up on Maxwell sales, the vast majority are 970s. So all perf/$ and perf/W comparisons need to be against custom 970 models that get great boost clocks out of the box.

AMD definitely needs "enhanced" Hawaii, aka an efficient Grenada made on GloFo's node, to be competitive, period.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Power use isn't just electricity cost; it's heat released into the room, requiring more case airflow ($ and noise) and a better PSU ($). If you live in a hotter climate, you know how bad that is; higher ambient temperatures just make everything worse. There's certainly a premium for efficiency. You may not agree, but the market has spoken, so get used to the idea.

The competitor to the 390/X will still be custom 970 models. The 980 is not the prime mover; if we read up on Maxwell sales, the vast majority are 970s. So all perf/$ and perf/W comparisons need to be against custom 970 models that get great boost clocks out of the box.

AMD definitely needs "enhanced" Hawaii, aka an efficient Grenada made on GloFo's node, to be competitive, period.

If Grenada performs above the 980, plenty of people will not care if it uses ~40W more. Sure, you'll have the hardcore nVidia fans acting like it's the end of the world, but that's just a vocal minority.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Man I live in Houston, Texas and I couldn't give two shits less about how much heat a card puts out. A little feminine to nitpick 30W of usage if you ask me.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Power use isn't just electricity cost; it's heat released into the room, requiring more case airflow ($ and noise) and a better PSU ($). If you live in a hotter climate, you know how bad that is; higher ambient temperatures just make everything worse. There's certainly a premium for efficiency. You may not agree, but the market has spoken, so get used to the idea.

I totally understand your point, especially if we were discussing a gaming rig with a 750Ti and a 35-65W Intel CPU against a high-end gaming rig. But how many gamers buying $300+ GPUs will be in that position?

If you are going to argue that a high-end gaming rig will heat up a small room, that applies to most high-end gaming systems, including a GTX970/980 rig.





If you are going to use that argument, then you need to start adding up speaker and monitor power usage too. My Westinghouse 37" LVM-37W3 monitor alone peaks at 210W. Once all of that is added up, do you think a difference of 50-60W matters a great deal in heating up a room when the entire gaming rig, with speakers and monitor, already uses 500W+?

That's why I can't understand how the perf/watt proponents ignore the overall power usage of a gaming system, including the monitor, speakers, etc. Why not measure perf/watt in the context of total system power? I can't game without my monitor or an i5/i7 rig, mobo, etc. If I am legitimately going to compare perf/watt used in generating IQ/FPS, I need to look at overall power usage, since that's how we actually use our computers. That's why I feel sites that quote perf/watt for the videocard alone are being disingenuous to their readers.
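The system-level argument above can be made concrete with a small sketch. All the wattages here are invented, plausible-looking numbers for illustration: two hypothetical cards at 250W and 190W delivering the same frame rate, plus ~250W for the rest of the rig (CPU, mobo, monitor, speakers).

```python
# Compare perf/watt for the card in isolation vs the whole system.
# Every wattage below is hypothetical, chosen only to illustrate the point.
FPS = 60.0                         # same frame rate from both cards
GPU_A_W, GPU_B_W = 250.0, 190.0    # hypothetical card power draws
REST_OF_RIG_W = 250.0              # CPU, mobo, monitor, speakers, etc.

# Efficiency advantage of card B over card A, two ways:
card_only = (FPS / GPU_B_W) / (FPS / GPU_A_W)
whole_system = (FPS / (GPU_B_W + REST_OF_RIG_W)) / (FPS / (GPU_A_W + REST_OF_RIG_W))

print(round(card_only, 2), round(whole_system, 2))  # 1.32 vs 1.14
```

With these made-up inputs, a 32% card-only perf/watt lead shrinks to about 14% once the rest of the system is counted, which is the point the post is making.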

The competitor to 390/X will still be custom 970 models.

It doesn't solve the 3.5GB memory issue. Also, AMD's AIBs tend to be more aggressive with rebates. Even if the R9 390 is $329, it shouldn't be long before $20 rebates are offered.

The 980 is not a prime mover if we read up on sales of Maxwell, the vast majority is due to the 970. So all perf/$ and perf/w comparisons need to be against custom 970 models that get great boost clocks out of the box.

That's another key weakness of the 970: the after-market versions are often not that much faster than the reference version, and can be slower than a stock reference 290X.



Also, if we are going to bring up after-market 970s, we can't use NV's 145W marketing TDP. Look at the power usage of after-market 970 cards: it's 180-190W, easily.



NV marketing => Use after-market 970 level of performance and quote 145W TDP in reviews. Average Joe buys into that. AMD's marketing team needs to discredit this marketing tactic.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Man I live in Houston, Texas and I couldn't give two shits less about how much heat a card puts out. A little feminine to nitpick 30W of usage if you ask me.

Yeah, but he lives in Australia. 1st world vs. 2nd world. (Just kidding, Silverforce11.) Seriously though, electricity is a lot cheaper in TX.

I agree, though, that 30-40W isn't a lot more heat. The only people it really matters to are the OEMs. If you are building 1000s of PCs and you can save $10 on each PSU plus one less fan, it adds up. On top of that, nVidia on the box adds value.

You or I, though, paying $100 more for the same performance (or possibly less) because of 40W is nonsensical.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Yeah, but he lives in Australia. 1st world vs. 2nd world. (Just kidding, Silverforce11.) Seriously though, electricity is a lot cheaper in TX.

I agree, though, that 30-40W isn't a lot more heat. The only people it really matters to are the OEMs. If you are building 1000s of PCs and you can save $10 on each PSU plus one less fan, it adds up. On top of that, nVidia on the box adds value.

You or I, though, paying $100 more for the same performance (or possibly less) because of 40W is nonsensical.
Agree with everything you said.

One could also look at last gen's cards (290X vs. 780 Ti) and see that the AMD part actually provides superior perf/watt. Interesting, given that this metric is supposedly useless apart from extreme outliers.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Agree with everything you said.

One could also look at last gen's cards (290X vs. 780 Ti) and see that the AMD part actually provides superior perf/watt. Interesting, given that this metric is supposedly useless apart from extreme outliers.

Well, nVidia abandoning its last-gen customers is a whole 'nother consideration. I'm pretty sure we have a thread on that somewhere too, though if so, it's likely locked by now.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I agree though that 30w-40w isn't a lot more heat.

You or I though paying $100 more for the same performance (or possibly less) because of 40w is nonsensical.

I actually decided to split this off from my last post to really drive the point home:

That's why I really like Computerbase.de. They call marketing BS right away when they see it.

Five Generations of NV GPUs compared (GTX470 to 970)

Total System Power Usage:

GTX670 (170W TDP) = 253W
GTX970 (145W TDP) = 278W
GTX470 (215W TDP) = 283W
GTX570 (219W TDP) = 292W
GTX770 (230W TDP) = 293W

NV's Maxwell power usage numbers for the GM204 GTX 970 are marketing lies. Or is it another accidental mistake by a marketing department that didn't double-check its figures with engineering?

I think North American review sites aren't performing their journalistic duty to expose these misleading marketing practices, which, irrespective of AMD's TDPs, are not fairly representative of the power usage of an NV product. The average uninformed PC gamer takes TDP to mean power consumption, and we all know it. If we as readers can notice these discrepancies, a professional review site that performs proper objective hardware reviews ought to inform its readers of the same. Alternatively, AMD might as well assign arbitrary TDP values to its products too, since NV is just playing a TDP marketing game at this point. Why not just call the R9 390/390X 180W and 200W TDP cards?
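One quick way to see the mismatch in the Computerbase.de figures quoted above: if TDP tracked real draw, ranking the cards by claimed TDP and by measured total system power would give the same order. A small sketch using only the numbers already listed in this post:

```python
# (claimed TDP in watts, measured total system power in watts),
# per the Computerbase.de figures quoted above.
cards = {
    "GTX 670": (170, 253),
    "GTX 970": (145, 278),
    "GTX 470": (215, 283),
    "GTX 570": (219, 292),
    "GTX 770": (230, 293),
}

by_tdp = sorted(cards, key=lambda c: cards[c][0])
by_wall = sorted(cards, key=lambda c: cards[c][1])

# If TDP reflected reality, the 145W-rated 970 would draw the least at
# the wall; instead the 170W-rated 670 does, so the orderings disagree.
print(by_tdp[0], by_wall[0])  # GTX 970 GTX 670
```

System-level measurements include CPU load differences, so this is only suggestive, but the rank inversion between the 970 and 670 is exactly the discrepancy the post complains about.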

AMD definitely needs "enhanced" Hawaii, aka an efficient Grenada made on GloFo's node, to be competitive, period.

It would certainly be a lot better, but even if the R9 390/390X used 150W of power, some other metric would be found to downplay/discredit them: lack of TXAA, poor performance in GW titles, lack of PhysX.

https://www.youtube.com/watch?v=X2NVRBbi5B4

Let's face it, people who are loyal to NV won't care if R9 390/390X use 100W less power and are 10% faster than 290/290X, they still won't buy them.
 
Last edited:

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Well, nVidia abandoning its last-gen customers is a whole 'nother consideration. I'm pretty sure we have a thread on that somewhere too, though if so, it's likely locked by now.

So you think coming into this forum for the express purpose of behaving disruptively and burdening the staff is funny?

I don't.

-- stahlhart
 
Feb 19, 2009
10,457
10
76
Man I live in Houston, Texas and I couldn't give two shits less about how much heat a card puts out. A little feminine to nitpick 30W of usage if you ask me.

It's not 30W. It's more like 170W vs. 240W in the 970/980 vs. R290/X scenario.

If AMD can bring power use down toward 200W, they are onto a winner: equal performance and more VRAM, for cheaper.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
It's not 30W. It's more like 170W vs. 240W in the 970/980 vs. R290/X scenario.

If AMD can bring power use down toward 200W, they are onto a winner: equal performance and more VRAM, for cheaper.
I think what's reasonable is to take Tahiti Pro's power levels and compare them to Tonga's. It's almost the same percentage improvement as nvidia achieved with Maxwell.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
So are these all respins made by GlobalFoundries? Because there's some confusion over whether they're respins, rebadges, or something like the GTX 480 -> GTX 580 change.

Either way looking forward to June 16th.
 
Feb 19, 2009
10,457
10
76
I think what's reasonable is to take Tahiti Pro's power levels and compare them to Tonga's. It's almost the same percentage improvement as nvidia achieved with Maxwell.

Tonga in the M295X for Apple, maybe; not the 285, which didn't improve perf/W compared to the 7970/R280X.
 
Feb 19, 2009
10,457
10
76
It would certainly be a lot better, but even if the R9 390/390X used 150W of power, some other metric would be found to downplay/discredit them: lack of TXAA, poor performance in GW titles, lack of PhysX.

Let's face it, people who are loyal to NV won't care if R9 390/390X use 100W less power and are 10% faster than 290/290X, they still won't buy them.

Let them get a new metric.

Perf/$ and perf/W have always been important metrics.

If gamers loyal to NV won't buy it, that's fine too, but reducing power usage will certainly make fewer neutral gamers jump to the NV option.

Currently, with the 970 at $330 or the R290X at $300, I would get the 970 just because of the 50-80W power difference alone (total system power), since both are comparable on performance, and OC vs. OC still comparable, except the R290X uses 400W when overclocked.
 