5000 and 6000 don't because AMD doesn't make any new drivers for them.
And for Kepler it's quite obvious in this table why:
1. The very chart you cite shows GCN 1.0/1.1/1.2 all more future-proof than Fermi, Kepler and Maxwell 1.0. Developers are not going to throw Fermi, Kepler, Maxwell 1.0 and all GCN 1.0-1.2 cards under the bus just to support a small fraction of Maxwell 2.0 owners unless NV bribes them via GameWorks. It'll take 2-3 years before we see wide adoption of DX12 in games.
2. The information for GCN 1.0 doesn't agree with AMD's own statements; maybe they made a mistake?
3. Game developers like zlatan have already discussed which hardware supports which feature levels, and whether a feature that isn't natively supported can be emulated (a rough sketch of how an engine checks this at runtime is below). Performance won't be the same as native support, but it may not be a deal breaker.
Since DX12 wasn't even finalized during the Fermi, Kepler or GCN 1.0/1.1 generations, it's unreasonable to expect those GPUs to support every feature of 12_1 and below. Not sure why this is a surprise now. What matters is how this is going to impact older-gen cards.
We don't know until DX12 games launch. This has been repeated to you by others too.
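To make the feature-level point concrete, here's a rough, illustrative C++ sketch (my own, not zlatan's code or any shipping engine) of how a DX12 renderer can create a device on plain FL 11_0 hardware and then probe the individual 12_1 extras (conservative rasterization, ROVs) to pick between a native path and a fallback. The printed messages are just placeholders.

```cpp
// Rough sketch: create a D3D12 device on feature level 11_0 hardware
// (Fermi/Kepler/GCN 1.0 class), then ask the driver what it actually supports.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main()
{
    ID3D12Device* device = nullptr;
    // Any DX12-capable GPU can create a device at FL 11_0; the API itself
    // does not require 12_0 or 12_1 hardware.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device), (void**)&device))) {
        std::printf("No DX12-capable GPU/driver found.\n");
        return 1;
    }

    // Highest feature level the device reports, out of the ones we ask about.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = sizeof(requested) / sizeof(requested[0]);
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                &levels, sizeof(levels));

    // Individual 12_1 features can also be queried one by one, which is how
    // an engine would decide between a native path and a fallback/emulation.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                &opts, sizeof(opts));

    std::printf("Max feature level: 0x%x\n",
                (unsigned)levels.MaxSupportedFeatureLevel);
    if (opts.ConservativeRasterizationTier !=
            D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED &&
        opts.ROVsSupported) {
        std::printf("12_1 extras present: use the native rendering path.\n");
    } else {
        std::printf("No 12_1 extras: fall back to a different code path.\n");
    }

    device->Release();
    return 0;
}
```

The point being: a game doesn't target "12_1 or nothing"; it checks these caps at startup and ships alternate paths, which is exactly why missing or emulated 12_1 features on older cards are a performance question, not a hard compatibility wall.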
If we go back to 2005, we are back to a time when most people had CRT monitors (and most didn't have 1600x1200 CRTs), so 800x600 or 1024x768 was still popular for gaming. Why do you post benchmarks with the highest settings?
I got a 19" 1600x1200 CRT in late 2001. I remember most PC gamers on this site were gaming at 1280x1024 or 1600x1200. I don't know anyone from this sub-forum who buys high-end cards today who was gaming at 800x600 or 1024x768 in 2005. I am pretty sure you were in the minority.
If you acknowledge the card was "outdated", the game scaled nicely with lowered settings and you didn't need to jump from ultra to low.
The 8500LE I owned was a 128MB model, and BF2 on the 8500 was playable.
Running the game with lowered details beats getting an error message.
I already said: if you were OK playing at 800x600 or 1024x768 with most things on low in 2005 when you played BF2, that's your gaming choice. Most gamers weren't playing on an 8500-class card at those settings in the summer of 2005. If you are the type of gamer who keeps his cards for 7-10 years, sure, maybe get a card with the DX12_1 feature set and play at low settings 5 years from today on a 980Ti.
A new 128MB 8500 (9100) in 2003 was still not far from $100; expecting it to run games in 2005 was not absurd, even if the quality was compromised. I remember GF4 Ti owners on forums being unhappy that they couldn't even launch the game.
We must have had completely different hardware upgrade paths. As I said already, by June 2005 the 7800GTX had launched, which meant one could pick up a GeForce 6800GT for dirt cheap, never mind a 6600GT. Honestly, by that point a used 8500 was probably $50. I got my 8500 for $275, 7 months before the 9700Pro even came out.
SM3 was relevant long before the HD 4890. If you kept the 6600GT (a card from 2004/2005) until at least Q2 2009 (the 4890 launch), I'm sure at some point you benefited from SM3.0 support.
No, I didn't list the 8800GTS I owned by accident. I went with a 6600GT, then when it bombed in games I got an 8800GTS 320MB.
I just looked up my EVGA account for you: 7/30/2007 is when I bought the 8800GTS 320MB. It cost me $289 from Newegg.
Then I upgraded to an HD4890 in August 2009 for $195. In some games like DiRT my 4890 was 3X faster than the 8800GTS 320MB at 1600x1200 because 320MB wasn't enough with AA. The 6600GT had long since become a slide show; its SM3.0 support was completely irrelevant. I got the 8800GTS 320MB to play Crysis 1, and the 6600GT was mostly used for 2D strategy games.
If your monitor was a CRT it would give you decent quality at 1024x768. And again, both the 9700s and 6800s benefited from a higher level of DX support compared to their competitors. As you said, we upgraded more frequently back in the day; that's why this discussion is interesting. Fermi supporting the DX12 API is a good thing, and the different levels of DX12 support, like DX12.1 vs DX12.0 vs DX11.2, can be relevant soon.
But look at the performance of Fermi 480/580, Kepler 680/770/780 and even the HD 7970 GHz. Those cards are on their last 18-month stretch, I feel. Eighteen months from now both the 680 and 7970 will be about 5 years old, and the GTX480 will be 6.5 years old. These cards will be a serious compromise for games like Star Citizen and DX12 titles. That's just my hunch.
I don't think VP is the best metric; also, if you play with a slower card you use different settings.
VP was developed by BoFox based on many, many reviews from various sites. It's not made up out of thin air. He actually compiled 10-20 reviews every generation and kept updating the charts accordingly. I verified a lot of the information in that chart using GPU generational comparisons from sites like TPU, ComputerBase and TechSpot. It was very accurate as of the time he made it. Today it won't be as accurate, since GCN, Fermi and Kepler now perform differently relative to each other. For the older cards I cited for you, though, things don't change.
Witcher 3 runs at a similar quality to the Xbone on the 5800; no need for 800x600 in 2015. I'm going to assume you mentioning the 8500 and Witcher 3 is some kind of joke.
Looks like we both misunderstood each other.
Look what you wrote below:
I assumed you made a mistake and meant to say Radeon 8500. Did you mean to say Nvidia FX5800 Ultra? I don't know what you meant by Radeon 5800. No such card existed.
The 5 years from 2010 to 2015 had a lot of stability in the OS and API, and even in the Nvidia architecture overall, while in the past we were used to a lot more changes. A 480 or even a 460 can play current games a lot better than a 2000-era card could in 2005 or a 2005 card could in 2010. Five-year-old cards are more relevant now than they used to be; I've just finished Witcher 3 with a Radeon 5800.
A 480 and 460 can provide a good experience in Witcher 3, unlike the Iris Pro 6200, for example.
Why are you comparing the 460/480 to an Iris Pro 6200? Anyone who bought a $500 GTX480 and held on to it until TW3 has no clue how to upgrade videocards and keep his rig up to date. I am sorry, but it's the honest truth. That person should have bought an HD5850 at launch for $259, overclocked it, and later upgraded to a $280 HD7950. An alternative path, for someone upgrading later in the cycle, could have been an HD6950 unlocked to a 6970 for $230-250, then last year a $250 R9 290 or a $330 970.
That's the whole point I am making against yours: you say extra DX feature levels matter long-term, but you use examples of being forced to game at crazy low settings and resolutions. That's a horrible gaming experience, no offense. Instead of trying to future-proof with a $550 GTX980 in September 2014, a gamer is better off grabbing a $250-330 R9 290/GTX970 and then upgrading again in the summer/fall of 2017 for DX12 games to another $250-350 card. The resale value of the R9 290/970 will help too.
12.1 features could be important going forward if they're implemented in both vendors' 16nm chips next year and such hardware starts replacing the old.
Software is always way behind hardware unless we are talking about Crytek! For DX12.1 features to be widely used in games, developers either have to start making those games right now to launch 2 years from now, OR they will wait until the market has enough hardware to support those features. Remember all the previous "latest" versions of DX? It usually takes years for game engines to start using them extensively.
Fallout 4, one very highly anticipated game, has graphics that look way worse than 2007's Crysis 1. PC game developers are in no hurry at all to start making mind-blowing games that look like the UE4 demos.
Did you see AC Syndicate?
No revolutionary graphics to be found either.
Witcher 3? A downgraded console port from a technical point of view. It's a good looking game, but not the next-gen PC gaming it was hyped to be, nowhere close!
Right now consoles are dictating the direction of graphics, whether we like it or not.
PS4+XB1 are outselling PS3+Xbox 360 by 58%:
Total combined PlayStation 3 and Xbox 360 sales: 22,032,717
Total combined PlayStation 4 and Xbox One sales: 34,893,866
(34,893,866 / 22,032,717 ≈ 1.58, i.e. roughly 58% more units.)
vs. GPU sales:
Overall GPU shipments dropped 13% in Q1’2015 from last quarter
"AMD’s overall unit shipments decreased -17.80% quarter-to-quarter, Intel’s total shipments decreased -12.01% from last quarter, and Nvidia’s decreased -13.5%."
http://jonpeddie.com/press-releases...pments-dropped-13-in-q12015-from-last-quarter
Sure, there is seasonality, but game sales don't lie. Just look at the sales of The Witcher 3, GTA V and AC Unity: consoles completely dominate PC sales. Most developers will continue porting console games to PC, just giving us slightly improved versions courtesy of GameWorks/AMD GE effects.