wccftech: AMD Fiji XT Leaked For The Third Time On Zauba – 20nm Looks Promising

Feb 19, 2009
10,457
10
76
They're hemorrhaging market share. How's that for forward thinking?

Even if they were faster, cheaper and more power efficient, they would still lose in marketshare.

The common folks developed a perception early on and they stuck with it, regardless. Early on, AMD became known for cheap knock-offs. This is why even when they led in the Athlon era, Intel still commanded the market.

Likewise for the 9700 era vs NV. Same again for the 4800 and 5800 series (Fermi was very late, used ~twice the power, ran extremely hot and loud, etc., and still NV led in marketshare!).

So it's not really about marketshare dominance. They just need to perform with R380/R390 series and they'll be just fine. There's still plenty of gamers who are brand agnostic and go for whatever is the best currently or best bang for buck.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The 290/290X still hold up well against the 970/980 despite being an entire year older. Even more so at higher resolutions.

But AMD isn't selling many 290/290Xs despite pricing them below the 970/980, and no, it's not because everyone loves Nvidia. People given a choice will pick the lower power solution if the price is anything close. On top of that, you can bet a huge high-power card like the 290 costs a lot to make, more than the lower power Nvidia chips. So AMD can't sell their more-expensive-to-make cards for less money; I'd say that's a pretty major problem.

Then there's the whole gaming laptop market, which is big these days and which AMD has no share of at all once you go above APUs (for which there is barely any money to be made). Another major problem. The whole world is going lower power, higher efficiency these days; if you want to sell more than desktop GPUs you need to nail that.

300W says that AMD can't compete on efficiency; if they could, they wouldn't be releasing new cards using that much power.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
But AMD isn't selling many 290/290Xs despite pricing them below the 970/980, and no, it's not because everyone loves Nvidia. People given a choice will pick the lower power solution if the price is anything close. On top of that, you can bet a huge high-power card like the 290 costs a lot to make, more than the lower power Nvidia chips. So AMD can't sell their more-expensive-to-make cards for less money; I'd say that's a pretty major problem.

Then there's the whole gaming laptop market, which is big these days and which AMD has no share of at all once you go above APUs (for which there is barely any money to be made). Another major problem. The whole world is going lower power, higher efficiency these days; if you want to sell more than desktop GPUs you need to nail that.

300W says that AMD can't compete on efficiency; if they could, they wouldn't be releasing new cards using that much power.
You really think they are even aware of power consumption? It's all about the brand and marketing.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Was never true for AMD when they had the more efficient GPUs.

That's why it's kinda odd to ask if NV users would buy a 300W product, they most certainly would if it's made by NV and provides good flagship performance. I never understood the backlash against high power usage high end cards. If someone only wants a 150-175W card, it's always out there in the market. Just imagine a 300W Maxwell card. You take the most efficient Perf/watt architecture and scale it. High end gamers get what they want and others get to enjoy 970/980. What's not to like? What matters is how the 390X stacks up in price/performance, absolute performance and performance/watt vs. the 980/GM200. Beyond indicating a card's intended high-performance class, a 300W TDP otherwise tells us very little about whether or not it will be a good product.

Also, as mentioned, we already know that TDP rarely matches real world power usage. Cards like 7970/7970Ghz/280/280X/780/780Ti have similar/same TDPs but different power usages. You can also have cards like 970/980 with marketing driven TDP that has nothing to do with reality for 99% of retail after-market cards. This idea that TDP somehow equates to real world power consumption has always been wrong because TDP does not actually tell us the maximum power usage of an ASIC.
 

Creig

Diamond Member
Oct 9, 1999
5,171
13
81
But AMD isn't selling many 290/290Xs despite pricing them below the 970/980, and no, it's not because everyone loves Nvidia.
Could you please provide a link to sales figures of the 290/290X so that I could check it out?

People given a choice will pick the lower power solution if the price is anything close. On top of that, you can bet a huge high-power card like the 290 costs a lot to make, more than the lower power Nvidia chips. So AMD can't sell their more-expensive-to-make cards for less money; I'd say that's a pretty major problem.
First off, I stand by what I said earlier. If a person doesn't lock themselves into a line of cards simply for the name on the box, they look at price and performance. Power consumption would only be a check mark to ensure their current power supply has enough headroom unless they're planning to install the card in an HTPC or some other thermally constrained environment.

Secondly, we have no idea how much it actually costs to make a 290/290X die. The design layout, the transistor density, the wafer yields and many other factors all determine the cost to manufacture. You can't simply say that because Design X consumes more power than Design Y that it must cost more to produce.
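To put some rough arithmetic behind that, here's a minimal sketch of the usual dies-per-wafer and yield reasoning; the wafer cost, defect density and rounded die sizes below are placeholder assumptions, not actual foundry or AMD/Nvidia figures:

Code:
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Classic approximation: gross dies on a round wafer minus edge losses.
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2, defects_per_mm2=0.001):
    # Simple Poisson yield model: larger dies are more likely to catch a defect.
    return math.exp(-defects_per_mm2 * die_area_mm2)

def cost_per_good_die(die_area_mm2, wafer_cost_usd=5000):
    # Placeholder wafer cost; real 28nm wafer pricing is not public.
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction(die_area_mm2)
    return wafer_cost_usd / good_dies

# Rounded die sizes: Hawaii is roughly 438 mm^2, GM204 roughly 398 mm^2.
for name, area in [("Hawaii-class (438 mm^2)", 438), ("GM204-class (398 mm^2)", 398)]:
    print(f"{name}: ~${cost_per_good_die(area):.0f} per good die (placeholder inputs)")

With those placeholder inputs the two dies land within roughly 15% of each other on cost; the point is that die area and yield drive manufacturing cost, not wattage.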

Third, AMD doesn't actually produce video cards anymore. Neither does Nvidia. All they do is come up with a reference design. Their AIBs actually construct and sell the video cards. Their AIBs also come out with their own custom cards designed in-house.

Then there's the whole gaming laptop market, which is big these days and which AMD has no share of at all once you go above APUs (for which there is barely any money to be made). Another major problem. The whole world is going lower power, higher efficiency these days; if you want to sell more than desktop GPUs you need to nail that.
Again, could you provide some facts to back up the "barely any money to be made" and "big gaming laptop market"? You can actually make more money selling a large number of small margin items than selling a few large margin items. There's no way to know for sure unless we have some hard facts.

And despite my interest in computers, none of my friends own a gaming laptop, just gaming desktops. And naturally, all the laptops at my place of work are business machines. That's hundreds of APU/integrated laptops versus 0 gaming laptops.

So a link that shows the actual profit margins on AMD laptops and number of gaming laptops vs business laptops would be great.

300W says that AMD can't compete on efficiency; if they could, they wouldn't be releasing new cards using that much power.
We have no idea if 300W is efficient or not because we have no performance numbers to compare it to. A 300W card that is 60% faster than a 290X would be efficient to me.

You keep going on about efficiency being the end-all be-all metric of the video game world, but try to keep in mind that the GTX 480 pulled 257W at peak draw while its direct competitor, the HD 5870, consumed only 144W while delivering 90% of the performance of the 480. And yet some people still bought the GTX 480 despite it running hotter, being louder and consuming MUCH more power.
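As a quick back-of-the-envelope on those two comparisons (the 480/5870 figures above, plus a hypothetical 300W card 60% faster than a 290X; the ~290W figure for the 290X is an assumed nominal board power, not a measurement):

Code:
# Perf/watt back-of-the-envelope using the figures from the post above.
# Each pair is normalized to its own baseline (480 and 290X respectively).
cards = {
    "GTX 480":              (1.00, 257),  # baseline for the Fermi-era pair
    "HD 5870":              (0.90, 144),  # ~90% of the 480's performance
    "290X (assumed ~290W)": (1.00, 290),  # baseline for the hypothetical pair
    "Hypothetical 300W":    (1.60, 300),  # 60% faster than the 290X
}

for name, (perf, watts) in cards.items():
    print(f"{name:>22}: {perf / watts * 100:.2f} perf per 100 W")

On those numbers the 5870 delivered roughly 60% more performance per watt than the 480, and the hypothetical 300W card would come out roughly 55% ahead of a 290X, despite the bigger number on the box.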

 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The "it's all brand and marketing argument" only goes so far, sure nvidia does have better marketing and brand but why is that? Do they bribe all the journalists? Is there something hypnotic about the nvidia logo? Has JH kidnapped everyone's grandma's?

Just perhaps it's the products? Perhaps the combination of performance, drivers, power usage and features is just better than their competitor's, and that's why people buy. Bottom line is AMD needs to up their game to win customers, and for all the many reasons already discussed, super-hot, high-power products aren't a good solution going forward.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
That's why it's kinda odd to ask if NV users would buy a 300W product, they most certainly would if it's made by NV and provides good flagship performance. I never understood the backlash against high power usage high end cards. If someone only wants a 150-175W card, it's always out there in the market. Just imagine a 300W Maxwell card. You take the most efficient Perf/watt architecture and scale it. High end gamers get what they want and others get to enjoy 970/980. What's not to like? (...)
Also, as mentioned, we already know that TDP rarely matches real world power usage. Cards like 7970/7970Ghz/280/280X/780/780Ti have similar/same TDPs but different power usages. You can also have cards like 970/980 with marketing driven TDP that has nothing to do with reality for 99% of retail after-market cards. This idea that TDP somehow equates to real world power consumption has always been wrong because TDP does not actually tell us the maximum power usage of an ASIC.

Maybe Maxwell at 28nm can't scale out to make a meaningful improvement over Kepler that would justify its development, unless of course NV is willing to make a huge ~550mm2 chip that's targeted only at gaming and some niche compute market where DP doesn't matter, like that dual GK104 Tesla (or was it Quadro?), and leave the compute market to GK110 for some more time. But historically that never happened; their best chips were always dual purpose, and by always I mean ever since CUDA became a thing. GK110 is already at 550mm2, and that's the practical limit for the size of ASICs in products that don't cost thousands of dollars; if they do, the upper limit is upper 6XX mm2*, but I remember only Intel and IBM going that high, and that's for chips that started at $2,461 (not counting cut-down chips) and up. Is Maxwell even more efficient per mm2 than Kepler? Remember that GM204 is just 10% faster than GK110 and it's at 400mm2 already, and that's without the 896 DP shaders that GK110 has. With the same DP performance, do you think it would be smaller than GK110? It would have to have even better DP performance to justify it. Do you think that 150mm2 is all that GK110 dedicates to DP? I think not.
* Some ancient nodes had chips as large as 1000mm2 fabricated on them.
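To put rough numbers on that perf-per-mm2 question, here's a quick check that simply takes the post's figures at face value (the ~10% performance gap is an assumption, not a measured average):

Code:
# Rough perf-per-mm^2 check using the round figures from the post above.
gk110_area = 550    # mm^2, the poster's figure (the official number is ~561 mm^2)
gm204_area = 400    # mm^2 (GM204 is ~398 mm^2)
gk110_perf = 1.00
gm204_perf = 1.10   # assume GM204 is ~10% faster than GK110, as the post states

print(f"GK110 perf/mm^2:  {gk110_perf / gk110_area:.5f}")
print(f"GM204 perf/mm^2:  {gm204_perf / gm204_area:.5f}")
print(f"GM204 advantage:  {gm204_perf / gm204_area / (gk110_perf / gk110_area):.2f}x")

On those assumptions GM204 comes out roughly 1.5x better per mm2, though, as the post notes, it carries no meaningful DP hardware, so it isn't an apples-to-apples comparison with GK110.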
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Maybe Maxwell at 28nm can't scale out to make a meaningful improvement over Kepler that would justify its development, unless of course NV is willing to make a huge ~550mm2 chip that's targeted only at gaming and some niche compute market where DP doesn't matter, like that dual GK104 Tesla (or was it Quadro?), and leave the compute market to GK110 for some more time. But historically that never happened; their best chips were always dual purpose, and by always I mean ever since CUDA became a thing. GK110 is already at 550mm2, and that's the practical limit for the size of ASICs in products that don't cost thousands of dollars; if they do, the upper limit is upper 6XX mm2*, but I remember only Intel and IBM going that high, and that's for chips that started at $2,461 (not counting cut-down chips) and up. Is Maxwell even more efficient per mm2 than Kepler? Remember that GM204 is just 10% faster than GK110 and it's at 400mm2 already, and that's without the 896 DP shaders that GK110 has. With the same DP performance, do you think it would be smaller than GK110? It would have to have even better DP performance to justify it. Do you think that 150mm2 is all that GK110 dedicates to DP? I think not.
* Some ancient nodes had chips as large as 1000mm2 fabricated on them.

Maxwell, like GCN, reuses the DP shaders for SP performance. There are no *dedicated* DP shaders anymore.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
You see, that's not how it works.
You can have a card that takes 400 watts and it could still be the most efficient card out there.
Huge power consumption doesn't mean inefficiency.

If the 390X uses power-efficient HBM, which in turn makes it run lean, will this be directly transferred to the 380X, which will probably sport old GDDR5 (possibly a refreshed Hawaii)? Because I expect the bad product's reputation to rub off on the good one when it comes to AMD products - there is heavy counter-marketing going on to ensure that.

I know I already responded to this post but again to reiterate:
300W suggests they haven't got a hyper-efficient new architecture and are just throwing more power at the problem, which isn't going to work.

Remember posts like this if AMD releases a 300W card that's "efficient". Won't matter if it's super fast, people will see high power draw and immediately think inefficient. And that's all that matters, market perception, not reality.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
If you're doing high performance desktop gaming, you're not power efficient. Period. If you want to game on a total of 250W, go buy a console. If you want to save power, buy LED lightbulbs. Unless you've replaced all your 60W incandescents with 6W LEDs, arguing about video card consumption is disingenuous "do as I say and not as I do." 60W -> LED = 10x improvement in power efficiency. Even switching from CFL bulbs to LEDs doubles your power efficiency. I guarantee you use your lights more often than your GPU. GPU power consumption matters for figuring out what size power supply you should buy.

In the real world, where enthusiasts care about performance and not a negligible increase in annual kWh, I'd take a 300W monster chip if it puts up the performance numbers every day of the week. I'd buy a 400W monster if they could cool it and it put up the performance numbers. I really hope they release a HUGE single chip card so I can get tons of performance without having to resort to SLI or CF with all the issues that brings.

If you really want to talk about efficiency... given that SLI and CF do not scale perfectly, reaching high performance levels via multiple cards becomes less and less efficient as you add cards. A single card which is equivalent to two weaker cards will almost always be more efficient from a power use perspective. For example, if you want to hit Max quality in Crysis 3 at 4k, you will either need 3-4 weak cards or maybe 2 fast cards. The 2 fast cards will be more efficient than 4 weak cards because of the terrible scaling out of the 3rd and 4th GPUs.
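A small sketch of that scaling argument with made-up numbers; the scaling factors and card specs below are illustrative placeholders, not measured SLI/CF data:

Code:
# Hypothetical cards and scaling factors; illustrative only, not measured SLI/CF data.
scaling = {1: 1.00, 2: 1.85, 3: 2.45, 4: 2.90}   # relative perf vs. a single card

setups = {
    "4x weak card": (4, 1.0, 150),   # (card count, single-card perf, single-card watts)
    "2x fast card": (2, 2.0, 275),   # one fast card = two weak cards' performance
}

for name, (n, perf, watts) in setups.items():
    total_perf  = perf * scaling[n]
    total_watts = n * watts
    print(f"{name}: perf {total_perf:.2f}, power {total_watts} W, "
          f"perf/W {total_perf / total_watts:.4f}")

With those placeholder factors the two fast cards end up both faster and lower-power than the four weak ones, which is the point about diminishing returns past two GPUs.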
 
Last edited:

scooterlibby

Senior member
Feb 28, 2009
752
0
0
980 beats your 690, and 780Ti is practically on par. If it can't beat 690 it would be trash.

I did not know that. When the 980 came out I looked at the reviews pretty carefully and it didn't seem like the 980 flat out beat the 690. Well, then that's good news about the 390X.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
The 980 is not faster than the 690 unless in a VRAM-limited scenario.

In many scenarios the 980 and 690 can be close in performance. Some of the recent console ports are very dependent on having more VRAM, and this can give the 980 a real edge... sure. This is a result of unoptimized next-gen console ports. VRAM is becoming more and more important.

The 690 can be VRAM limited, but there are plenty of reviews out there that show it's not slower than the 980 on average. If it weren't for the VRAM limitation, the 980 wouldn't have much at all over a 690. But because of it, depending on the games you play, the GTX 980 might be worth upgrading to. Honestly, though, I think anyone with a 690 would be better off waiting for big Maxwell or a 390X, because those would be more of a true upgrade. Not a 980.
 

scooterlibby

Senior member
Feb 28, 2009
752
0
0
The 980 is not faster than the 690 unless in a VRAM-limited scenario.

In many scenarios the 980 and 690 can be close in performance. Some of the recent console ports are very dependent on having more VRAM, and this can give the 980 a real edge... sure. This is a result of unoptimized next-gen console ports. VRAM is becoming more and more important.

The 690 can be VRAM limited, but there are plenty of reviews out there that show it's not slower than the 980 on average. If it weren't for the VRAM limitation, the 980 wouldn't have much at all over a 690. But because of it, depending on the games you play, the GTX 980 might be worth upgrading to. Honestly, though, I think anyone with a 690 would be better off waiting for big Maxwell or a 390X, because those would be more of a true upgrade. Not a 980.

Thanks for clearing that up. That was my thinking as well.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Yep, part of the reason I ended up with 970 SLI was because a single 980 would be somewhat of a sidegrade at times coming from 670 SLI.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
My thinking was to get a 970 and I could go SLI later. Surely 970 SLI will be very capable even compared to a GM200/390X.

I expect 970 SLI to be faster than those cards, but worst case... just as fast.
 

metalliax

Member
Jan 20, 2014
119
2
81
My thinking was to get a 970 and I could go SLI later. Surely 970 SLI will be very capable even compared to a GM200/390X.

I expect 970 SLI to be faster than those cards, but worst case... just as fast.

When we're supposedly so close to the 390X being released, it's probably a good idea just to wait and see what drops. If the 390X is as fast as some of the rumors state, it could be the ultimate pairing for an Oculus Rift, or even just good enough for 4K gaming... SLI/CrossFire usually isn't as good for overall quality as a high-end single GPU. The current 970/980/290/290X are good enough for 1440p gaming, but not the next-gen experience. These products will leave you wanting more in 6 months.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
If you're doing high performance desktop gaming, you're not power efficient. Period. If you want to game on a total of 250W, go buy a console. If you want to save power, buy LED lightbulbs. Unless you've replaced all your 60W incandescents with 6W LEDs, arguing about video card consumption is disingenuous "do as I say and not as I do." 60W -> LED = 10x improvement in power efficiency. Even switching from CFL bulbs to LEDs doubles your power efficiency. I guarantee you use your lights more often than your GPU. GPU power consumption matters for figuring out what size power supply you should buy.

In the real world, where enthusiasts care about performance and not a negligible increase in annual kWh, I'd take a 300W monster chip if it puts up the performance numbers every day of the week. I'd buy a 400W monster if they could cool it and it put up the performance numbers. I really hope they release a HUGE single chip card so I can get tons of performance without having to resort to SLI or CF with all the issues that brings.

If you really want to talk about efficiency... given that SLI and CF do not scale perfectly, reaching high performance levels via multiple cards becomes less and less efficient as you add cards. A single card which is equivalent to two weaker cards will almost always be more efficient from a power use perspective. For example, if you want to hit Max quality in Crysis 3 at 4k, you will either need 3-4 weak cards or maybe 2 fast cards. The 2 fast cards will be more efficient than 4 weak cards because of the terrible scaling out of the 3rd and 4th GPUs.
Not really; two GPUs can be more power efficient because they can be clocked lower - compare the 590 with the 480 (there's a rough sketch of that clock-scaling argument below). And nowadays M-GPU really scales very well; in the past few months I've only had one "issue". In DA:3, SLI suddenly stopped working because they included a windowed full screen mode and SLI only works in full screen; otherwise scaling is very close to perfect. See for yourself:

Having said that, I've also had some games that didn't work at all with M-GPU, like Far Cry 3.
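Here's the rough clock-scaling sketch mentioned above, assuming the common first-order model that dynamic power scales with frequency times voltage squared; the clock, voltage and scaling numbers are purely illustrative, not the 480/590's actual bins:

Code:
# First-order dynamic power model: P is proportional to f * V^2.
# All clock/voltage/scaling numbers are illustrative, not real GF100/GF110 bins.
def rel_power(freq_ratio, volt_ratio):
    return freq_ratio * volt_ratio**2

single_power = rel_power(1.00, 1.00)        # one GPU at full clock and voltage
dual_power   = 2 * rel_power(0.80, 0.90)    # two GPUs, each ~20% down-clocked at lower voltage

single_perf = 1.00
dual_perf   = 2 * 0.80 * 0.90               # perf tracks clock, with ~90% SLI scaling assumed

print(f"single GPU perf/W: {single_perf / single_power:.2f}")
print(f"dual GPU   perf/W: {dual_perf / dual_power:.2f}")

Even with imperfect SLI scaling, the lower clocks and voltages can leave the dual-GPU setup ahead on perf/watt; that's the 590-vs-480 argument in miniature.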

The 980 is not faster than the 690 unless in a VRAM-limited scenario.

I often see VRAM usage exceeding 3.5GB; a 690 wouldn't even work... The 980 is better in new games, that's that, unless you play at 1080p with the textures not on the highest setting. At this point I wouldn't touch a card with just 2GB of memory; the 690 should have come out with 8GB. It's severely crippled.
See that:
It is faster, and a single GPU without any AFR to boot. I made a poll once and people even preferred a Titan to the 690. I would always prefer a single card that's 20% slower over an M-GPU solution that's just 18% faster with a severe RAM limitation. Faster, more RAM and a single GPU? Not even a contest.
I don't care that much about power consumption, but a lot of people seem fixated on it ever since NV made an efficient GPU; during the Fermi days those same people didn't even mention it, but here it goes.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,355
642
121
When we're supposedly so close to the 390X being released, it's probably a good idea just to wait and see what drops. If the 390X is as fast as some of the rumors state, it could be the ultimate pairing for an Oculus Rift, or even just good enough for 4K gaming... SLI/CrossFire usually isn't as good for overall quality as a high-end single GPU. The current 970/980/290/290X are good enough for 1440p gaming, but not the next-gen experience. These products will leave you wanting more in 6 months.

TBH, I think the 390X is the first clever "hype train" AMD has done. Claim a 380X card is a 300W monster that may be power efficient and fast and on 20nm (so god only knows how fast the 390X is). The rumor/hype train starts building up to the point where people think the 390X may even best GTX 970 SLI.

People hold off purchasing as they wait to see what drops, the R9 390X ends up only marginally better than the GTX 980 Ti (let's say 15% better, and that's no guarantee, just a best case scenario), and then AMD has a better chance at reclaiming market share than if gamers just purchased the GTX 980/GTX 970 now, or the GTX 980 Ti (assuming it comes out before the 390X), or even the GTX 960.

A large part of me just thinks this is a hype train and we aren't going to see some magical AMD card that makes us all hail AMD as our new GPU crown overlords.

@Lepton87 in AC Unity, SLI scales above 100%.
That game really utilizes multi GPU really well /trollface.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
@Lepton87 in AC Unity, SLI scales above 100%.
That game really utilizes multi GPU really well /trollface.

Wouldn't have mattered; that game is in alpha. I usually wait at least until beta because many games never make it past that.



Doubling of framerates wouldn't make it playable. Also it seems that 3GB is not enough for it.

I thought you were joking but apparently not, what a piss poor game.
http://www.hardocp.com/article/2014..._performance_video_card_review/3#.VLdPkS7unT8

Tahiti not playable at all.... CF not working.... NV involvement detected
 
Last edited:

tential

Diamond Member
May 13, 2008
7,355
642
121
To be fair, Gameworks (or Ubisoft, take your pick; Nvidia should disassociate from them) games are so bad that Nvidia's own SLI isn't working in Watch Dogs to this day.
http://forums.guru3d.com/showthread.php?t=395785
It is a well known problem, that SLI scaling was basically broken by Ubisoft in the latest patch (October 27).
No messing around with settings or custom SLI profiles can fix this issue (trust me, I have tried it all)

The only solution is to roll back to the previous 1.05.324 patch, by following the instructions in this tweak guide I wrote for Watch Dogs a while ago:
http://www.forum-3dcenter.org/vbulle...postcount=1712

But please remember that Watch Dogs is still quite CPU limited on most SLI systems below 4K resolutions, even with the old patch.

With AC Unity:

SLI is working. The reason why this shows over a 100% performance increase is because the GTX 980 is unusually slower than it should be normally in this game. We think the GTX 980 is being held back on performance for some reason, chalk it up to some game bug, or other unoptimized nonsense, the GTX 980 isn't working as fast as it should be. This is quite normal for all the video cards in this game, we think the full power and full potential isn't being utilized in this game for whatever buggy or unoptimized reason.

Again, it's probably Ubisoft, lol, but this is part of the reason I was scared of mGPU setups. Having $300-600+ worth of hardware that can't be utilized in major AAA games is ridiculous. Especially if you're an NV user playing a Gameworks game.
Edit: Or, in the case of AC Unity, needing an mGPU setup to get your cards running at 100% of their capability.
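To make the HardOCP explanation above concrete: if the single-card result is artificially depressed by a game bug, the apparent SLI scaling figure gets inflated. Hypothetical frame rates only:

Code:
# Hypothetical frame rates showing how a handicapped single-GPU baseline
# inflates the apparent SLI scaling figure. Not measured AC Unity numbers.
single_expected = 50.0   # fps a single 980 "should" get in this scene
single_actual   = 38.0   # fps it actually gets because of some game bug
sli_actual      = 80.0   # fps with two cards

print(f"Apparent SLI gain:          {(sli_actual / single_actual - 1) * 100:.0f}%")
print(f"Gain vs. expected baseline: {(sli_actual / single_expected - 1) * 100:.0f}%")

Measured against the depressed baseline the "gain" looks like more than 100%, while against the expected baseline it's ordinary scaling.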
 

tential

Diamond Member
May 13, 2008
7,355
642
121
And to theorize blindly here, I think this is part of the reason The Witcher 3 was delayed. Ubisoft screwing up a release? Typical Ubisoft. Gameworks not working well with CDPR? Maybe Gameworks is the common denominator behind a lot of buggy releases after all.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Again, it's probably Ubisoft, lol, but this is part of the reason I was scared of mGPU setups. Having $300-600+ worth of hardware that can't be utilized in major AAA games is ridiculous. Especially if you're an NV user playing a Gameworks game.
Edit: Or, in the case of AC Unity, needing an mGPU setup to get your cards running at 100% of their capability.

I had 4x 6970s. M-GPU issues now are nothing like those I had with that Quad Fire rig. Frankly, everything is working, and if it's not, the game doesn't need SLI anyway, like Fallout 3. I just didn't play the games that had problems with SLI, so maybe that's luck, but I haven't specifically avoided them; I just don't play games until at least a month after release, or at least one driver after release. I only had this strange issue with DA3. Why they would have this windowed full screen mode is beyond me. The only difference from regular full screen is SLI not working; maybe that was the intent, for dual cards that have issues where you can't normally disable M-GPU.
 