Why Doesn't Anyone Get to the Bottom of the Aging of Kepler and GCN?


poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
By the way, saying bad things about AMD can also get you yanked from the review list. TechReport's investigation into frame pacing won it no friends at AMD, and the site was eventually denied a Nano for review. Ironically, it did lead to Scott Wasson leaving the site to work for AMD.

Oh I am sure both do it. In this example Nvidia wouldn't be happy, but some article that tracked Crossfire support would probably irk AMD.

Hell Apple's media "blacklist" is now a very well known thing. Technology has turned into the movie Mean Girls.

Thanks! I actually just purchased a GTX 970 simply to use it for testing... I didn't have a need for it other than for 290/390/980 (and next-gen) face-offs. I'm beginning to build up a stockpile of older cards for the kind of tests you're talking about. I currently have a 560 Ti, HD 7870, 290, 390X, 970, 980, and 980 Ti. I wish I hadn't sold my 670 and 780 Ti... but that's hindsight. And while I got a Fury as a loaner for testing, I intend to buy one for future face-offs.

That is pretty cool, I look forward to your future results.

I am also happy you are branching out beyond what you own for gaming, because you yourself probably won't buy a bad card for personal use. You really need something like a 680 to track what the hell happened to Kepler, and I personally think the 970 will be the candidate for the Maxwell card that tanks in 2017, so you are ready to examine what the hell happened to Maxwell then.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
That's what I would do if I could get away with it. That's what appears to be happening anyway. Most sensible people would do the same thing if they could get away with it and not hurt future sales. You pretend like Nvidia cares about gamers. That's a little cute actually :wub:

You know the saying. "Fool me once shame on you. Fool me twice shame on me." I think Nvidia is underestimating the consumer and overestimating their brand strength. Sometimes when things go your way it looks great. But when things start going wrong it ends up horribly wrong. This upcoming generation is going to be interesting.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,001
126
Read up on the development of the PS4. The forward thinking was actually done by Sony. This is one area where AMD really did benefit from the consoles. Console makers have to predict architecture needs years in advance, since that's the expected lifespan of a console. PC video card makers think in much shorter terms, since they typically replace architectures every couple of years. Sony predicted that by the middle of the PS4's lifespan async compute would be a very important feature, so they pushed for it in GCN among many other features.


But GCN predates the current consoles by quite a while. Even the first GCN 1.1 GPU launched more than half a year before the PS4.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
TechSpot is the only site that does this kind of testing

No, this site has the best comparison, and it was done in January of this year.
You need to read the whole review.

http://www.babeltechreviews.com/hd-7970-vs-gtx-680-2013-revisited/view-all/

Kepler release date: March 2012 = 4 years ago.
Maxwell release date: February 2014 = 2 years ago.

I think most enthusiasts on this ENTHUSIAST forum don't keep their cards for more than 2 years.

I don't think Kepler gets the support it once did because it's just plain older tech. I think maybe optimizing for Kepler would hurt Maxwell performance.

Think: if the GTX 680, 780 and 980 were just rebrands with higher clocks and lower power usage, do you think the 680 would still get optimizations? I'd say yes.

Now cards like the 7950/7970/280 and 290/290x/390/390x are the same cards that use the same optimized driver.

That's the answer.

The result is that Nvidia uses a 2-year architectural release cycle, sells more cards, and makes more money, and AMD doesn't. Plain and simple.
 
Last edited:

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
You know the saying. "Fool me once shame on you. Fool me twice shame on me." I think Nvidia is underestimating the consumer and overestimating their brand strength. Sometimes when things go your way it looks great. But when things start going wrong it ends up horribly wrong. This upcoming generation is going to be interesting.

To be perfectly honest, the target demographic for these products is mostly young men, ages 14 to 20 or so IMO. Boys that age do not use their brain a whole lot, especially the kind who spend a lot of time playing games. They go with the "cool" product and they are happy. They want the new Nvidia card because AMD "sucks".
This is the kind of oversimplified thought that goes into buying something like a gaming GPU for most people who buy them. They don't know anything about Kepler sucking, or even that their GTX 670 was called something by the name of "Kepler". What the hell is Kepler? They don't know. They know what Nvidia wants them to know, and that is that new Nvidia cards are coming out and will be way faster than what they have in their rig right now.

EDIT: I described myself by accident. Hehe, oh well. Still TRUE AND LEGIT.
 
Last edited:

Game_dev

Member
Mar 2, 2016
133
0
0
NVIDIA supports directx 12 back to Fermi, while AMD only supports it on their current gen. Not even 12.1. They abandoned VLIW quickly.
 
Feb 19, 2009
10,457
10
76
From what I see, Kepler does not perform worse for games already shipped. There is no tanking.

Maxwell also doesn't improve in performance for games already shipped. Once a "Game Ready" driver is out, it's out and they move on, rarely do they revisit and release drivers with +performance.

Where the two differ greatly is in newly released games, where Kepler tanks in performance compared to Maxwell and GCN.

This is simply undeniable with all the games we've seen it happen in already since 2015. The question then is simply, WHY?

I'll keep it short, but here are my thoughts: if you look at Kepler's uarch, its ratio of shaders per SM to wavefront size is non-optimal, 192 vs 128 for Maxwell. So if the drivers are not optimized for it, Kepler potentially loses 1/3 of its performance immediately, as 64 out of 192 shaders will idle at each wavefront cycle.

Is that what we see? Yes indeed: the 780 under-performs at 960 level, ~30% under par. Likewise for the 780Ti being well below the 970.
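That back-of-the-envelope arithmetic can be written out explicitly. This is just a sketch of the post's claim using the per-SM shader counts it cites (192 for a Kepler SMX, 128 for a Maxwell SMM), not an official NVIDIA utilization model:

```python
# Per-SM shader counts cited in the post above (illustrative, not a
# full occupancy model of either architecture).
kepler_cores_per_smx = 192   # CUDA cores per Kepler SMX
maxwell_cores_per_smm = 128  # CUDA cores per Maxwell SMM

# If unoptimized code only keeps 128 of Kepler's 192 cores busy per
# scheduling cycle, the remaining cores sit idle:
idle = (kepler_cores_per_smx - maxwell_cores_per_smm) / kepler_cores_per_smx
print(f"Idle share of a Kepler SMX: {idle:.0%}")  # → 33%
```

64 idle cores out of 192 is a 33% loss, which roughly lines up with the ~30% gap between the 780 and the 960 described above.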

What is clear is NVIDIA has not put effort into optimizing "Game Ready" drivers for Kepler in new games. They do not tank Kepler performance in already released titles; they simply shift optimization focus to Maxwell only. So if games or game engines are not optimized for Kepler specifically, NV isn't going to bother to fix it via drivers.

What's interesting out of this is that neutral games, not AMD/NV sponsored, have excellent Kepler performance. Likewise for games made with Unity. These developers have optimized for Kepler at the engine level, and there are no features that run unoptimized. Games where Kepler tanks most often are GameWorks titles, so it seems these NV features are optimized for Maxwell only. NV likes to do that, with even VXAO and HFTS using Maxwell-specific uarch features.

As moonbogg said, it's all about frequent upgrades. If NV and AMD had 50:50 marketshare, the focus would be on how NV gets the AMD user to upgrade to NV. But once the marketshare is 80:20, the focus has to be on how to get more money from the 80%.
 
Feb 19, 2009
10,457
10
76
NVIDIA supports directx 12 back to Fermi, while AMD only supports it on their current gen. Not even 12.1. They abandoned VLIW quickly.

Where's DX12 for Fermi?

Where's Vulkan for Fermi?

They promised it, hasn't materialized and it's 2016.
 
Feb 19, 2009
10,457
10
76
When did they promise vulkan for Fermi? Will VLIW get Vulkan?

Will you care if I present evidence/facts to back my claims or will you just loltroll like usual? I ask because if it's the latter, I won't bother.

AMD never promised DX12/Vulkan for VLIW.
 

Game_dev

Member
Mar 2, 2016
133
0
0
Will you care if I present evidence/facts to back my claims or will you just loltroll like usual? I ask because if it's the latter, I won't bother.

AMD never promised DX12/Vulkan for VLIW.

You sure get angry over this stuff. Maybe stop posting insults and learn some technical details.
 
Feb 19, 2009
10,457
10
76
NVIDIA supports directx 12 back to Fermi.

You sure get angry over this stuff. Maybe stop posting insults and learn some technical details.

NVIDIA does not have DX12 support in drivers for Fermi. You claim they do. Where's the proof, where's Fermi running DX12 games?

The reason I asked you is that if you actually care for facts, I would present them, but if you don't care, like in most of your other posts that just thread-crap, then I won't bother to link facts since you'll just ignore them as usual. -_-

ps. Here, say hello to FACTS. Deny them like usual or for once, admit you're wrong.

Fermi was promised Vulkan support.



Fermi was promised DX12 support.

http://www.geforce.com/hardware/technology/dx12/windows-10



I'm sure even you know about this, when NV promised Maxwell supports DX12 Async Compute.



Basically NV has a history of just plain lying about their hardware, repeatedly.
 
Last edited:

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
Exactly. The unsaid point there is that the Titan was the first $1000 consumer GPU, but today in some cases it is beaten by a competitor that is just as old and was almost half the price: the 7970. That is an amazing loss of value over the life of that card.

I forgot to address this. Yes that is a stunning loss of value. I must say though, that card had absolute crap value from day 1. It had crap value the instant Nvidia decided to charge 1k for it. We are talking about a GPU that they used to charge 5-600 for, and even paying $600 for these things is ridiculous enough.
IMO the $1,000.00 GPU's are not the upper performance class products. They are the Jester class products. The drain bamaged, clown shoes class products. They are Nvidia's exclusive, "BAHA I can't believe you fell for that one" class products, and always, ALWAYS will be.
 
Feb 19, 2009
10,457
10
76
I forgot to address this. Yes that is a stunning loss of value. I must say though, that card had absolute crap value from day 1. It had crap value the instant Nvidia decided to charge 1k for it. We are talking about a GPU that they used to charge 5-600 for, and even paying $600 for these things is ridiculous enough.
IMO the $1,000.00 GPU's are not the upper performance class products. They are the Jester class products. The drain bamaged, clown shoes class products. They are Nvidia's exclusive, "BAHA I can't believe you fell for that one" class products, and always, ALWAYS will be.

Titan wasn't good value either way so that argument doesn't hold.

The comparison should be the 780, also $650 for a long time. Beaten by the $400 R290. Also beaten by the much cheaper 7970Ghz/280X. For some parts of 2014 and most of 2015, R290s were ~$220. R290X ~$300.

The 780Ti was $650-699 through those times. Smashed by the half priced R290X, and even by the 1/3 priced R290. -_- Amazing AMD lost marketshare with Hawaii given how strong it was and still is and growing in power with DX12. Marketing > All.
 

MrTeal

Diamond Member
Dec 7, 2003
3,587
1,748
136
Titan wasn't good value either way so that argument doesn't hold.

The comparison should be the 780, also $650 for a long time. Beaten by the $400 R290. Also beaten by the much cheaper 7970Ghz/280X. For some parts of 2014 and most of 2015, R290s were ~$220. R290X ~$300.

The 780Ti was $650-699 through those times. Smashed by the half priced R290X, and even by the 1/3 priced R290. -_- Amazing AMD lost marketshare with Hawaii given how strong it was and still is and growing in power with DX12. Marketing > All.

The comparison shouldn't be the 780, it should be the 780Ti. The 290X launched two weeks before the 780Ti and both were full chips, so it's a much more natural comparison. You had the $700 Ti, the $550 290X, the $500 price dropped 780 and the $400 290. At the time the premium was justified, as the 780Ti was a decent bit faster than the 290X and everyone not under water was stuck with reference Hawaii coolers. Hell, through most of the next half year GK110 was probably a better value given the prices Hawaii sold for until the bottom dropped out of litecoin mining.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Titan wasn't good value either way so that argument doesn't hold.

The comparison should be the 780, also $650 for a long time. Beaten by the $400 R290. Also beaten by the much cheaper 7970Ghz/280X. For some parts of 2014 and most of 2015, R290s were ~$220. R290X ~$300.

The 780Ti was $650-699 through those times. Smashed by the half priced R290X, and even by the 1/3 priced R290. -_- Amazing AMD lost marketshare with Hawaii given how strong it was and still is and growing in power with DX12. Marketing > All.

Where do you get that the 290/X series was faster than or smashed the 780/Ti series??
Also, the 290 released at $400 and the 780 was already down to $499 at that point, and the 290X was $550 at release, not half the price of the 780 Ti.
http://www.techpowerup.com/reviews/AMD/R9_290/

The 290s dropped in price in late 2014 when the $329 GTX 970 released and forced the price cut.

780 series vs 290 series April 2014, 6 months after release.

 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
Well, if you look at the specs, it's kind of odd that the GTX 680 managed to beat the 7970 in the first place. Perhaps at that point AMD was far behind in supporting GCN with their drivers, and then we have continuity with them keeping close to the same architecture while Nvidia has changed more. And the consoles are a factor to some level, for sure...
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
From what I see, Kepler does not perform worse for games already shipped. There is no tanking.

Maxwell also doesn't improve in performance for games already shipped. Once a "Game Ready" driver is out, it's out and they move on, rarely do they revisit and release drivers with +performance.

Where the two differ greatly is in newly released games, where Kepler tanks in performance compared to Maxwell and GCN.

This is simply undeniable with all the games we've seen it happen in already since 2015. The question then is simply, WHY?

I'll keep it short, but here are my thoughts: if you look at Kepler's uarch, its ratio of shaders per SM to wavefront size is non-optimal, 192 vs 128 for Maxwell. So if the drivers are not optimized for it, Kepler potentially loses 1/3 of its performance immediately, as 64 out of 192 shaders will idle at each wavefront cycle.

Is that what we see? Yes indeed: the 780 under-performs at 960 level, ~30% under par. Likewise for the 780Ti being well below the 970.

Excellent analysis Silverforce11, that seems right on the money. It also doesn't bode well for either the GTX 970 or Fiji, both of which need optimizations to work optimally.

So what about the other big question in your opinion? What has helped GCN more, consoles or better drivers?
 
Last edited:
Feb 19, 2009
10,457
10
76
Excellent analysis Silverforce11, that seems right on the money. It also doesn't bode well for either the GTX 970 or Fiji, both which need optimizations to work optimally.

So what about the other big question in your opinion? What has helped GCN more, consoles or better drivers?

GCN gaining in performance (R290X was 15-20% behind 980, now often equal or ahead in modern games) is a separate issue to the aging of Kepler due to the aforementioned reasons.

For GCN, it's simply the console effect. Games released in 2016 would have been made when the engines had been optimized for the PS4/Xbone. Games released in 2014 would have been made in 2011/2012, before the current console era. We really started to see the GCN effect in some next-gen 2015 titles. That's when the R290/X started to pull away from the 780/Ti.

There are differences in GCN iterations, but it's not a major fundamental change, except for ACEs and queue depths (which needs to be optimized for DX12/Vulkan).

Polaris is enhanced GCN, so I expect it will retain these advantages and more. The hardware primitive discard accelerator (1) and improved command processor (2) alone will make it much better than prior GCN.

(1) The biggest flaw of GCN's front end is that it can be bottlenecked by high geometry load. This often occurs in tessellation-heavy scenes OR scenes with a lot of complexity, like a dense city with buildings and details hidden behind the scene. With the hardware capable of discarding unused (non-visible) geometry before it even enters rendering, this will massively improve MIN FPS performance and prevent over-tessellation from becoming a bottleneck.

(2) Boosting the command processor results in higher shader uptime, something GCN has issues with in DX11, Fiji especially, but so do other GCN SKUs. This is a direct IPC gain, so each shader is worth more.
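To make (1) concrete, here's a hypothetical software sketch of the discard idea: rejecting triangles that can never produce visible pixels before they reach the rasterizer. The function names and the simplified 2D screen-space setup are illustrative assumptions, not AMD's actual hardware algorithm:

```python
def signed_area(tri):
    """Twice the signed area of a 2D screen-space triangle.

    Positive for counter-clockwise (front-facing) winding,
    zero for degenerate (collinear) triangles.
    """
    (ax, ay), (bx, by), (cx, cy) = tri
    return (bx - ax) * (cy - ay) - (cx - ax) * (by - ay)

def discard_pass(triangles):
    """Keep only triangles that are front-facing and non-degenerate."""
    return [t for t in triangles if signed_area(t) > 0]

tris = [
    ((0, 0), (4, 0), (0, 4)),  # front-facing: kept
    ((0, 0), (0, 4), (4, 0)),  # back-facing (clockwise): discarded
    ((1, 1), (2, 2), (3, 3)),  # zero area (collinear): discarded
]
print(len(discard_pass(tris)))  # → 1
```

The real accelerator does this kind of rejection in fixed-function hardware per primitive, which is why it helps exactly the geometry-bound cases described in (1).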
 
Last edited:

Game_dev

Member
Mar 2, 2016
133
0
0
GCN is the prime reason AMD has lost so much market share. Trying to spin AMD dragging out the release of a new architecture as a good thing is not grounded in reality.
 
Feb 19, 2009
10,457
10
76
GCN is the prime reason AMD has lost so much market share. Trying to spin AMD dragging out the release of a new architecture as a good thing is not grounded in reality.

So, where's NV drivers that support Fermi for DX12 and Vulkan like they promised?

How is it you just say things that are blatantly wrong all the time?

I mean I can understand if it's your opinion. But the way you go about it, seems like you make claims.

NVIDIA supports directx 12 back to Fermi.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
This statement worries the crap out of me. If I am decoding that properly it basically boils down to "If a site did that Nvidia would punish them."

I haven't heard about nvidia doing this, but amd unabashedly did so with one of their recent releases. I think it was the Fury Nano was it not?
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
But GCN predates the current consoles by quite a while. Even the first GCN 1.1 GPU launched more than half a year before the PS4.

Development for the PS4 began in 2008 according to Sony. Like I said, do the research. There are numerous older articles out there containing interviews of Sony executives (Mark Cerny, lead architect, in particular) and people involved with the development of the PS4 that talk about their vision for the PS4 and what features they wanted AMD to implement.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
With the hardware capable of discarding unused (non-visible) geometry before it even enters rendering, this will massively improve MIN FPS performance and prevent over-tessellation from becoming a bottleneck.

Wow, that is awesome. Kinda reminds me of the hidden surface removal on old PowerVR technology.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
I haven't heard about nvidia doing this, but amd unabashedly did so with one of their recent releases. I think it was the Fury Nano was it not?

Yeah, you are correct. Video cards are never the battle of good and evil some make them out to be; I was just clarifying in context. My bad if it seemed out of place.

I mean, at the end of the day it doesn't matter why Kepler tanked from a purely predictive standpoint, because past performance is no guarantee of future results. I think making a completely arbitrary assumption like "Nvidia will drop Maxwell like Kepler," or that future Nvidia GPUs can't overcome deficiencies, is folly. And I don't know if I can accept that future AMD GPUs will get the same console advantage going forward, seeing as how the newish GCN GPU Fiji is a basket case when it comes to performance.



But we already had the (very interesting) why is Fiji broken thread.

It will be fun to see how it plays out in 2016 for sure, but I think the retrospective discussion is relevant today.
 
Last edited: