AMD confirms feature level 12_0 maximum for GCN


nvgpu

Senior member
Sep 12, 2014
629
202
81
Resource Binding Tier is irrelevant.

Only feature level tiers are relevant, and they're what control the latest hardware features, which require hardware to fully support them.

If AMD could actually enable Tier 2 Tiled Resources in GCN 1.0, they would have done it already in their driver since GCN 1.1/1.2 hardware has it enabled, but they can't, period.

Also, if you actually try to use ROVs on GCN, they're absolutely unusable.

https://forum.beyond3d.com/threads/direct3d-feature-levels-discussion.56575/page-11#post-1847921

Yeah it was ridiculously bad when I briefly tested it... like turn it on even with a trivial pixel shader and no actual synchronization going on and your frame takes 300ms+. Hopefully that was just a bug or some sort of test implementation, but suffice it to say I'll believe they can support it efficiently when I see it
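All of these are just caps the D3D12 runtime reports per adapter, and anyone on a Windows 10 SDK build can dump them. A rough sketch, assuming you already have a created ID3D12Device (the helper name and which caps get printed are just illustrative, error handling trimmed):

#include <d3d12.h>
#include <cstdio>

// Dump the caps being argued about for whatever adapter the device was created on.
void PrintDx12Caps(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    std::printf("Resource binding tier:    %d\n", (int)opts.ResourceBindingTier);
    std::printf("Tiled resources tier:     %d\n", (int)opts.TiledResourcesTier);
    std::printf("Conservative raster tier: %d\n", (int)opts.ConservativeRasterizationTier);
    std::printf("ROVs supported:           %s\n", opts.ROVsSupported ? "yes" : "no");

    // The feature level is reported separately from the individual tiers above.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = 4;
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &levels, sizeof(levels));
    std::printf("Max feature level:        0x%x\n", (unsigned)levels.MaxSupportedFeatureLevel);
}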
 
Last edited:

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Resource Binding Tier is irrelevant.

Only feature level tiers are relevant, and they're what control the latest hardware features.

That's a lie ...

Resource binding tiers ARE relevant; otherwise Microsoft wouldn't have included them in the feature levels ...

Because of Maxwell v2's LIMITATIONS, Microsoft HAD to downgrade the INITIAL specs of resource binding tier 2 ACCORDING to Max McMullen!

If it wasn't for them waiving the requirements for Nvidia the situation about feature level 12_0 and up would be AMD GCN ONLY!
 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
5,593
8,770
136
https://forum.beyond3d.com/threads/directx-12-api-preview.55653/page-10#post-1840525



Nice try at slandering and smearing the wrong company when an Intel employee said it was related to them. If you like to embarrass yourself, go ahead though.

I'm not an expert, but I believe Buzzkiller is right, as Haswell doesn't support tier 2 as far as I know. Saying that D3D engineers limited tier 2 for Haswell, when Haswell doesn't support it anyway, doesn't make any sense. I believe what the Intel guy was talking about was the second half of the original quoted post:

My team worked with the hardware vendor in Tier 1 that had the 55K limit to find alternate means of programming the GPU

That leaves them limiting tier 2 for either AMD or Nvidia, and, well, from all accounts, that would leave Nvidia. I don't know why it really matters, but that's how it looks to me.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
If AMD could actually enable Tier 2 Tiled Resources in GCN 1.0, they would have done it already in their driver since GCN 1.1/1.2 hardware has it enabled, but they can't, period.

Also, if you actually try to use ROVs on GCN, they're absolutely unusable.

https://forum.beyond3d.com/threads/direct3d-feature-levels-discussion.56575/page-11#post-1847921

They can, but no one has done ANY NEW TESTING for Southern Islands ...

As for ROVs, NOT according to the recent testing done by Christophe Riccio ...

https://twitter.com/g_truc/status/581224843556843521
 

Hitman928

Diamond Member
Apr 15, 2012
5,593
8,770
136
Forgive me if I'm just ignorant, who is Christophe Riccio? I looked through his twitter a little bit, seems to be heavily involved with OpenGL?
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Forgive me if I'm just ignorant, who is Christophe Riccio? I looked through his twitter a little bit, seems to be heavily involved with OpenGL?

He used to work for AMD and possibly Imagination Technologies but now he works at Unity ...
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Or it's wrong on Nvidia's site.

Yea, rite.

Maybe take your own advice
Yes, I don't get why people argue against AMD's own facts.

Are you seriously arguing about which parts of a not-yet-finished API will be supported by yet-to-be-finished drivers?

It all boils down to performance. Feature levels mean nothing if a vendor is deliberately cutting driver optimizations to make EOL'd products worse.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Yea, rite.

Maybe take your own advice


Are you seriously arguing about which parts of a not-yet-finished API will be supported by yet-to-be-finished drivers?

It all boils down to performance. Feature levels mean nothing if a vendor is deliberately cutting driver optimizations to make EOL'd products worse.

For the GTX 750/Ti it's just blatantly wrong. You can test this on Windows 8.x as well, for that matter. But you can do the same with GCN 1.0.

If you want fully featured DX12, you get GCN 1.1, 1.2, Skylake or Maxwell v2. Plain and simple.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Or it's wrong on Nvidia's site.


Maxwell v1 is DX11.0.

LOL, we both may be right (or wrong, depending on how you look at it). I honestly don't know what supporting 'some, but not all' 11.1 and 11.2 features means, but it's more than just 11.0 and less than full 11.2. Maybe this is why we see different information for the version? Nevertheless, partial but not full 11.2 support. Case in point for how complicated all this compliance stuff gets, and this is just one example...

http://www.anandtech.com/show/7764/the-nvidia-geforce-gtx-750-ti-and-gtx-750-review-maxwell/2

From a graphics/gaming perspective there will not be any changes. Maxwell remains a Direct3D 11.0 compliant design, supporting the base 11.0 functionality along with many (but not all) of the features required for Direct3D 11.1 and 11.2. NVIDIA as a whole has not professed much of an interest in being 11.1/11.2 compliant – they weren’t in a rush on 10.1 either – so this didn’t come as a great surprise to us. Nevertheless it is unfortunate, as NVIDIA carries enough market share that their support (or lack thereof) for a feature is often the deciding factor whether it’s used. Developers can still use cap bits to access the individual features of D3D 11.1/11.2 that Maxwell does support, but we will not be seeing 11.1 or 11.2 becoming a baseline for PC gaming hardware this year.
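For what it's worth, those cap bits are queryable: on an otherwise 11.0-class device you can still ask the D3D11.x runtime which individual 11.1/11.2 features the driver exposes. A rough sketch, assuming an existing ID3D11Device and the Windows 8.1 SDK headers (the helper name and the handful of caps shown are just illustrative, error handling trimmed):

#include <d3d11_2.h>
#include <cstdio>

// Ask the D3D11.x runtime which optional 11.1/11.2 features the driver exposes.
void PrintDx11OptionalCaps(ID3D11Device* device)
{
    // A few of the D3D11.1 optional caps.
    D3D11_FEATURE_DATA_D3D11_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS, &opts, sizeof(opts));
    std::printf("Logic ops in output merger: %s\n", opts.OutputMergerLogicOp ? "yes" : "no");
    std::printf("Constant buffer offsetting: %s\n", opts.ConstantBufferOffsetting ? "yes" : "no");

    // A few of the D3D11.2 optional caps.
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts1 = {};
    device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1, &opts1, sizeof(opts1));
    std::printf("Tiled resources tier:       %d\n", (int)opts1.TiledResourcesTier);
    std::printf("Min/max filtering:          %s\n", opts1.MinMaxFiltering ? "yes" : "no");
}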
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
As I said, it's nothing but creative PR without basis in reality to call it DX11.2 when it is DX11.0.
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
Ha! At what settings was it faster? Are we talking it ran at 35-40 fps at 1024x768? Let's assess the reasonability of your argument here.

BF2 came out June 21, 2005. The Radeon 8500 64MB came out October 17, 2001. Believe me, I played games during that time on a ViewSonic 19" 1600x1200 CRT, and as a previous owner of an 8500 64MB, by the point you cite it was a pile of garbage. Do you know how far GPU hardware had come by that point?

June 22, 2005, nV released the 7800GTX 256MB.

http://images.anandtech.com/graphs/nvidia geforce 7800 gtx_06220580647/7753.png

http://images.anandtech.com/graphs/nvidia geforce 7800 gtx_06220580647/7734.png

Voodoo Power Rankings have:

Radeon 8500 (DX8.1) -- 2.1 VP

By June 2005, I bet one could buy a 6600GT for $80-100 max

Geforce 6600GT (DX9.0c) -- 6.3 VP (3X faster than 8500)

Geforce 7800GTX 256MB (DX9.0c) -- 15.4 VP (7.33X faster than Radeon 8500)
http://forums.anandtech.com/showthread.php?t=2298406

Your argument doesn't make any sense, since playing games on an 8500 at that time would have required insane compromises.

If we go back to 2005, we are back to a time when most people had CRT monitors (and most didn't have 1600x1200 CRTs), so 800x600 or 1024x768 was still popular for gaming. You post benchmarks at the highest settings, why? If you acknowledge the card was "outdated", the game scaled nicely with lowered settings, and you didn't need to jump from ultra to low.

The 8500LE I owned was a 128MB model, and BF2 on the 8500 was playable; running the game with lowered details beats getting an error message.

A new 128MB 8500 (9100) in 2003 was still not far from $100, so expecting it to run games in 2005 was not absurd, even if the quality was compromised. I remember GF4 Ti owners on forums being unhappy that they couldn't even launch the game.




Exact same story as above. I am not going to go digging up reviews and games. By the time SM3.0 came into play, the entire GeForce 6 stack had become outdated. How do I know? I had a 6600GT and I upgraded to a Radeon HD4890.

SM3 was relevant long before the HD 4890. If you kept the 6600GT (a card from 2004/2005) until at least Q2 2009 (the 4890 launch), I'm sure at some point you benefited from SM3.0 support, while the ATI equivalent wouldn't have had it. I had a 6800 at launch (an NU model with the pipelines unlocked using RivaTuner) and I remember playing with the Far Cry SM3.0 patch and CS: Source SM3.0 HDR pretty soon. By 2006-2007 you had SM3.0-only games on the market, and a 6800 could play BioShock, for example, while an X850XT could not (and the owners were unhappy and even created some extensive hacks to get the game to run, and it didn't work so well).

But let's go with your story:

Geforce 6800 Ultra 256MB (DX9.0c) -- 10.0 VP
Radeon HD 4890 2GB (DX10.1) -- 88 VP (8.8X faster than GeForce 6800U)

Again, your argument makes no sense. Neither the 9700Pro/9800Pro nor the 6800Ultra/X850XT were good enough for modern DX9 games. I had a 1600x1200 monitor, which meant there was no way I could have bought a card and used it for 4-5 years as you want to imply. Most of us upgraded way more frequently in the past.

If your monitor was a CRT it would give you decent quality at 1024x768. And again, both the 9700s and 6800s benefited from a higher level of DX support compared to their competitors. As you said, we upgraded more frequently back in the day; that's why this discussion is interesting, and Fermi supporting the DX12 API is a good thing, and the different levels of support for DX12, like 12.1 vs 12.0 vs 11.2, can be relevant soon.

I don't think VP is the best metric; also, if you play with a slower card you use different settings.


You mean Radeon 8500, not 5800? Look, if you like gaming at 800x600 or 1024x768 at 30 fps with everything on LOW, that's your choice, but don't try claiming that a GTX 460, 480 or especially a Radeon 8500 is going to provide a good TW3 experience.

The Witcher 3 runs at a similar quality to the Xbone on the 5800; no need for 800x600 in 2015. I'm going to assume you mentioning the 8500 and The Witcher 3 is some kind of joke.

The 480 and 460 can provide a good experience in The Witcher 3, unlike the Iris Pro 6200, for example.
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
A GTX 750Ti is apparently compatible with DX12.

So. Many. Draw. Calls.

Fermi, Kepler and Maxwell v1 do benefit from the lower-overhead part of DX12, and it shows in that draw call benchmark, although they do not support at the hardware level what's required for FL 12.0/12.1 the way most GCN hardware and Maxwell v2 do.

When actual DX12 games arrive, we'll see if those older cards run faster in DX12 mode than in DX11 mode.
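The draw call gains come from D3D12 letting the app record thousands of cheap, pre-validated draws into a command list with state baked into a PSO up front. A rough sketch of what such a benchmark loop boils down to, assuming the device, PSO, root signature, allocator, list and queue already exist from setup (names are illustrative; render targets, viewports and barriers omitted):

#include <d3d12.h>

// Record a pile of cheap draws into a command list and submit them in one go;
// this is roughly where an API-overhead test spends its time.
void RecordAndSubmitDraws(ID3D12CommandAllocator* allocator,
                          ID3D12GraphicsCommandList* cmdList,
                          ID3D12CommandQueue* queue,
                          ID3D12PipelineState* pso,
                          ID3D12RootSignature* rootSig,
                          unsigned drawCount)
{
    allocator->Reset();
    cmdList->Reset(allocator, pso);                 // render state lives in the PSO
    cmdList->SetGraphicsRootSignature(rootSig);
    cmdList->IASetPrimitiveTopology(D3D_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

    for (unsigned i = 0; i < drawCount; ++i)        // the part being measured
        cmdList->DrawInstanced(3, 1, 0, 0);

    cmdList->Close();
    ID3D12CommandList* lists[] = { cmdList };
    queue->ExecuteCommandLists(1, lists);
}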
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Sorry, but there are many users who misguided people here.
Members were misguiding users by saying AMD supports all the features of DX12 and that Nvidia would be left behind.
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
No one misguided anyone. This whole DX12 thing, with the feature levels and who supports what feature level at what capacity, is one huge mess, too much smoke and mirrors. The way I see it, no one has the definitive answer right now.

We see reputable game developers who work with the hardware more intimately than others claim GCN supports virtually every feature under the sun (something that makes sense since they work close to the metal on the consoles, bypassing DX), others then deny that, other people in the know tweet things contradicting or supporting what's out there, Microsoft itself has published different information during these past months... etc., etc. This going back and forth, every time stating things as if they were the ultimate truth, isn't helping anyone. It's getting tiring and irritating.


Once the dust settles in two months, when W10 is officially out and proper DX12 drivers are finished on both sides of the fence, we'll see who supports what in detail. We simple mortals should treat all graphics hardware out there now as DX11.x class and be done with it until August.
 
Last edited:

gamervivek

Senior member
Jan 17, 2011
490
53
91
I'd like them two misses myself. Anyway, GCN doesn't support 12_1 features, and I doubt the upcoming cards will either, considering that the Synapse tapeout 'leak' was over a year ago, so the features were set in stone before that. As for it being able to do them in software, the performance won't be there. And expect Kepler to get toasted even further when Nvidia decide to stress Maxwell's feature set.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
The real question is, will the features that GCN does not support be used by developers? We know that most 11.1 and 11.2 features were never used.

If they aren't used, then it doesn't matter that they are not supported. Only time will tell.
 

.vodka

Golden Member
Dec 5, 2014
1,203
1,537
136
GCN in both consoles is the Hawaii/Bonaire flavor, right? If so, that sets the baseline at 12.0 for the future, at least until the next consoles. 12.1 features could be important going forward if implemented in both vendors' 16nm chips next year and such hardware starts replacing the old.

As for now, a fallback could be 11.0, considering how many Fermi and Kepler cards are out there (I don't think Maxwell v2 is a significant enough part of NV's 75% market share)... not to mention GCN 1.0 hardware like Tahiti and Pitcairn.


August can't get here soon enough, can it? I'm speculating based on the OP's article, but I want to see how all this ends up soon...
 
Feb 19, 2009
10,457
10
76
The real question is, will the features that GCN does not support be used by developers? We know that most 11.1 and 11.2 features were never used.

If they aren't used, then it doesn't matter that they are not supported. Only time will tell.

Console compatibility will determine most if not all the features that will get implemented.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
5000 and 6000 don't because AMD doesn't make any new drivers.

And for Kepler it's quite obvious in this table why:

1. The very chart you cite shows GCN 1.0/1.1/1.2 as all more future-proof than Fermi, Kepler and Maxwell 1.0. Developers are not going to throw Fermi, Kepler, Maxwell 1.0 and all GCN 1.0-1.2 cards under the bus just to support a small fraction of Maxwell 2.0 owners unless NV bribes them via GameWorks. It'll take 2-3 years before we see wide adoption of DX12 games.

2. The information for GCN 1.0 doesn't agree with AMD, maybe they made a mistake?



3. Game developers like zlatan have already discussed which hardware supports which feature levels, and whether a feature level that isn't natively supported could be emulated. Performance won't be the same as native support, but it may not be a deal breaker.

Since DX12 wasn't even finalized during the Fermi, Kepler or GCN 1.0/1.1 generations, it's impossible to expect those GPUs to support every feature level up to 12_1. Not sure why this is a surprise now. What matters is: how is this going to impact older-gen cards?

We don't know until DX12 games launch. This has been repeated to you by others too.

If we go back to 2005, we are back to a time when most people had CRT monitors (and most didn't have 1600x1200 CRTs), so 800x600 or 1024x768 was still popular for gaming. You post benchmarks at the highest settings, why?

I got a 19" 1600x1200 CRT in late 2001. I remember most PC gamers on this site were gaming at 1280x1024 or 1600x1200. I don't know anyone from this sub-forum who buys high-end cards today who was gaming at 800x600 or 1024x768 in 2005. I am pretty sure you were in the minority.

If you acknowledge the card was "outdated", the game scaled nicely with lowered settings, and you didn't need to jump from ultra to low.

The 8500LE I owned was a 128MB model, and BF2 on the 8500 was playable; running the game with lowered details beats getting an error message.

I already said that if you were OK playing at 800x600 or 1024x768 with most things on low in 2005 when you played BF2, that's your gaming choice. Most gamers weren't playing on an 8500 at those settings in the summer of 2005. If you are the type of gamer who keeps his cards for 7-10 years, sure, maybe get a card with the DX12_1 feature set and play at low settings 5 years from today on a 980 Ti.

A new 128MB 8500 (9100) in 2003 was still not far from $100, so expecting it to run games in 2005 was not absurd, even if the quality was compromised. I remember GF4 Ti owners on forums being unhappy that they couldn't even launch the game.

We must have had a completely different hardware upgrade path. As I said already, by June 2005 the 7800GTX had launched, which meant one could pick up a GeForce 6800GT for dirt cheap, never mind a 6600GT. Honestly, by that point a used 8500 was probably $50. I got my 8500 for $275, 7 months before the 9700Pro even came out.

SM3 was relevant long before the HD 4890. If you kept the 6600GT (a card from 2004/2005) until at least Q2 2009 (the 4890 launch), I'm sure at some point you benefited from SM3.0 support.

No, I didn't list the 8800GTS I owned by accident. I went 6600GT, then it bombed in games, and I got an 8800GTS 320MB.

I just looked up my EVGA account for you: 7/30/2007 is when I bought the 8800GTS 320MB. It cost me $289 from Newegg.

Then I upgraded to an HD4890 in August of 2009 for $195. In some games like DiRT, my 4890 was 3X faster than the 8800GTS 320MB at 1600x1200 because 320MB wasn't enough with AA. The 6600GT had long become a slideshow; its SM3.0 support was completely irrelevant. I got the 8800GTS 320MB to play Crysis 1; the 6600GT was mostly used for 2D strategy games.

If your monitor was a CRT it would give you decent quality at 1024x768. And again, both the 9700s and 6800s benefited from a higher level of DX support compared to their competitors. As you said, we upgraded more frequently back in the day; that's why this discussion is interesting, and Fermi supporting the DX12 API is a good thing, and the different levels of support for DX12, like 12.1 vs 12.0 vs 11.2, can be relevant soon.

But look at the performance of the Fermi 480/580, Kepler 680/770/780 and even the HD 7970 GHz. Those cards are on their last 18-month stretch, I feel. In 18 months from now both the 680 and 7970 will be 5 years old or so; the GTX 480 will be 6.5 years old. These cards will be a serious compromise for games like Star Citizen and DX12 games. That's just my hunch.

I don't think VP is the best metric; also, if you play with a slower card you use different settings.

VP was developed by BoFox based on many, many reviews from various sites. It's not made up out of thin air. He actually compiled 10-20 reviews every generation and kept updating the charts accordingly. I verified a lot of the information in that chart using GPU generational comparisons from various sites like TPU, ComputerBase and TechSpot. It's very accurate as of the time he made it. Today it won't be as accurate, since GCN, Fermi and Kepler perform differently relative to each other. For the older cards I cited for you, though, things don't change.

The Witcher 3 runs at a similar quality to the Xbone on the 5800; no need for 800x600 in 2015. I'm going to assume you mentioning the 8500 and The Witcher 3 is some kind of joke.

Looks like we both misunderstood each other.

Look what you wrote below:

I assumed you made a mistake and meant to say Radeon 8500. Did you mean to say Nvidia FX5800 Ultra? I don't know what you meant by Radeon 5800. No such card existed.

The 5 years from 2010 to 2015 had a lot of stability with the OS and API, and even with the Nvidia architecture overall, while in the past we were used to a lot more changes. A 480 or even a 460 can play current games a lot better than a 2000 card could in 2005 or a 2005 card could in 2010; 5-year-old cards are more relevant now than they used to be. I've just finished The Witcher 3 with a Radeon 5800.

The 480 and 460 can provide a good experience in The Witcher 3, unlike the Iris Pro 6200, for example.

Why are you comparing the 460/480 to an Iris Pro 6200? Anyone who bought a $500 GTX 480 and held on to it until TW3 has no clue how to upgrade video cards and keep his rig up to date. I am sorry, but it's the honest truth. That person should have bought an HD5850 on launch day at $259, overclocked it, then later upgraded to a $280 HD7950. An alternative path, if one upgraded later in the cycle, could have been an HD6950 unlocked to a 6970 for $230-250, then last year a $250 R9 290 or a $330 970.

That's the whole point I am making against yours: you say extra DX feature levels matter long-term, but you use examples of being forced to game at crazy low settings and resolutions. That's a horrible gaming experience, no offense. Instead of trying to future-proof with a $550 GTX 980 in September 2014, a gamer is better off grabbing a $250-330 R9 290/GTX 970 and then upgrading again in the summer/fall of 2017, for DX12 games, to another $250-350 card. The resale value of the R9 290/970 will help too.

12.1 features could be important going forward if implemented in both vendors' 16nm chips next year and such hardware starts replacing the old..

Software is always way behind hardware, unless we are talking about Crytek! For DX12.1 features to be widely used in games, developers either have to start making those games right now, to be launched 2 years from now, OR they will wait until the market has enough hardware to support those features. Remember all the previous "latest" versions of DX? It usually takes years for game engines to start using them extensively.

Fallout 4, one very highly anticipated game, has graphics that look waaaaay worse than 2007's Crysis 1. PC game developers are in no hurry at all to start making mind-blowing games that look like a UE4 demo.

Did you see AC Syndicate? No revolutionary graphics to be found either.

Witcher 3? A downgraded console port from a technical point of view. A good-looking game, but not the next-gen PC gaming it was hyped to be, nowhere close!

Right now consoles are dictating the direction of graphics, whether we like it or not.

PS4+XB1 are outselling PS3+Xbox 360 by 58%


Total Combined PlayStation 3 and Xbox 360 Sales: 22,032,717
Total Combined PlayStation 4 and Xbox One Sales: 34,893,866

vs. GPU sales:

Overall GPU shipments dropped 13% in Q1’2015 from last quarter
"AMD’s overall unit shipments decreased -17.80% quarter-to-quarter, Intel’s total shipments decreased -12.01% from last quarter, and Nvidia’s decreased -13.5%."
http://jonpeddie.com/press-releases...pments-dropped-13-in-q12015-from-last-quarter

Sure, there is seasonality, but game sales don't lie. Just look at the sales of The Witcher 3, GTA V and AC Unity: consoles completely dominate PC sales. Most developers will continue porting console games to PC, just giving us slightly improved versions courtesy of GameWorks/AMD GE effects.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
The information for GCN 1.0 doesn't agree with AMD, maybe they made a mistake?

Game developers like zlatan have already discussed which hardware supports which feature levels, and whether a feature level that isn't natively supported could be emulated. Performance won't be the same as native support, but it may not be a deal breaker.

AMD never delivered DX11.2 to GCN 1.0 on Windows 8.1. And if you follow AMD, they say no DX12.0 for GCN 1.0 either. In short, AMD lied with GCN 1.0 just like Nvidia lied with Maxwell v1.

If Zlatan is a game developer, he should know that DX12 feature-level features are mandated in hardware, so emulation is out of the question. And as I told you before, he has been proven wrong again and again: ROVs, CR, etc.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Console compatibility will determine most if not all the features that will get implemented.

Yep, for all console games/console ports, a minimum DX12.0 FL could come relatively fast, putting GCN 1.0 and Fermi/Kepler/Haswell/Broadwell out in the cold.
 