Closed

Page 5 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Status
Not open for further replies.

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Definitely a good investment. In my own personal GPU learning curve I've made a couple of bad choices, but mostly good ones that have lasted me 5+ years at a time.

We probably play at different resolutions, and maybe have different ideas on what is playable. (I refused to turn the details down during the many times I finished Far Cry, despite framerates dropping as low as 15fps during the action, although now I would consider that unacceptable.) I don't seem to be as sensitive to certain refresh rate issues as others seem to be.

I am still able to max out all the AAA games I play (sometimes after having to wait a month or so) with GameWorks and PhysX enabled, with maybe a concession here and there on foliage distance (GTA4, Witcher 3, DAI), though these concessions come during particular patch versions and are usually resolved. I'm leaning towards a G-Sync ultrawide as my next monitor upgrade, depending on how Pascal performs with DX12.
 

Mezzanine

Member
Feb 13, 2006
99
0
66
So much mad about this alleged poor aging of Kepler yet people forget the 6950/70. Those cards became potatoes almost overnight.
 
Feb 19, 2009
10,457
10
76
So much mad about this alleged poor aging of Kepler yet people forget the 6950/70. Those cards became potatoes almost overnight.

Maybe you can find some reliable sources for your claims.

Like pit the 6950/70 vs Fermi 470/480/560/570/580 for their era and we can see how they aged.

Btw, NV promised DX12 & Vulkan for Fermi, never happened either. They just say NOPE!.. and that was all she wrote.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,362
5,033
136
Maybe you can find some reliable sources for your claims.

Like pit the 6950/70 vs Fermi 470/480/560/570/580 for their era and we can see how they aged.

Btw, NV promised DX12 & Vulkan for Fermi, never happened either. They just say NOPE!.. and that was all she wrote.

Ignore the off-topic thread derail attempt.

It's a poor one and not even worth responding to.

I still stand by my point that it makes great business sense to "neglect" (i.e. treat as a lower priority) older architectures. Sure it's bad for the consumer, but there's nothing like encouraging faster upgrades to pad the bottom line. The only reason AMD hasn't done it is because of the long stretch of GCN (along with the console effect) lending continuity to their optimization efforts. And maybe they have fewer business types making decisions...

In any event, I'd expect GCN-based cards to continue to age well, while I'd expect Kepler to be pretty much EOL once Pascal launches or within a year afterwards. At that point it'd be two generations old and it'll be a third rank priority behind Maxwell and Pascal.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
In any event, I'd expect GCN-based cards to continue to age well, while I'd expect Kepler to be pretty much EOL once Pascal launches or within a year afterwards.

Honestly, I think the more relevant question to take from the whole Kepler situation, beyond "what happened to Kepler," is "will the same happen to Maxwell?" It is fair to say Kepler is old and Nvidia has obviously moved on. Do we as consumers learn something from the experience and avoid Nvidia and its "business types" in our own purchases, or was the Kepler decline a freak accident caused by poor vision on Nvidia's part for a single generation?

That is the elephant in the room that the Kepler aging situation begs you to look at. Well, that plus the Steam survey showing that the GTX 970, with its weird RAM setup (which is sure to age quickly without optimizations), is the most popular card out there by far. Is the 970 the next 770, or worse? Because I don't think a lot of current 970 owners will accept their card tanking below a 380x in a year just because it helps Nvidia's bottom line.
 

nismotigerwvu

Golden Member
May 13, 2004
1,568
33
91
Honestly, I think the more relevant question to take from the whole Kepler situation, beyond "what happened to Kepler," is "will the same happen to Maxwell?" It is fair to say Kepler is old and Nvidia has obviously moved on. Do we as consumers learn something from the experience and avoid Nvidia and its "business types" in our own purchases, or was the Kepler decline a freak accident caused by poor vision on Nvidia's part for a single generation?

That is the elephant in the room that the Kepler aging situation begs you to look at. Well, that plus the Steam survey showing that the GTX 970, with its weird RAM setup (which is sure to age quickly without optimizations), is the most popular card out there by far. Is the 970 the next 770, or worse? Because I don't think a lot of current 970 owners will accept their card tanking below a 380x in a year just because it helps Nvidia's bottom line.

Well I think that gets to the core of all this. Nvidia has based its decisions over the last 15 or so years primarily on the bottom line, whereas ATI/AMD has put less focus there. That isn't to say that NV has done less to "further the field," but they typically only do so when they are the only ones that benefit (PhysX, CUDA, G-Sync, GameWorks, etc.). AMD, on the other hand, has been more willing to open up their innovations more often than not, and has recently been more willing to allocate resources to maintaining/optimizing older hardware. It just goes to show that what benefits us as consumers quite often comes at a cost for the companies involved. In a perfect situation we would see both companies meet a bit more in the middle, where we wouldn't have any talk of AMD going under, or of relatively recent hardware rotting on the vine like this.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yes I am.
If you can explain to me the specific differences between a 7970, 280x and 380x, maybe I would understand. I know the clocks, memory controllers and such are somewhat different, but what else?

I would say a driver for the GTX 680/GTX 780/780 Ti is comparable to a driver for the 7970/280x/290. Would you?
Where does Maxwell fit?

You are asking people to give definitive proof, but you just make a statement and it's fact? As you like to say, gotcha.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Yes I am.
If you can explain to me the specific differences between a 7970, 280x and 380x, maybe I would understand. I know the clocks, memory controllers and such are somewhat different, but what else?

I would say a driver for the GTX 680/GTX 780/780 Ti is comparable to a driver for the 7970/280x/290. Would you?
Where does Maxwell fit?

The 280x and 380x are completely different tech: GCN 1.0 vs 1.2.

290 is GCN 1.1

http://www.anandtech.com/show/9784/the-amd-radeon-r9-380x-review

390 = 290 with updated memory and power, but otherwise same chip, so still useful for the comparison in the chart.
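The card-to-revision mapping above can be summarized as a small lookup table. This is just a sketch of the relationships stated in this thread (280x = rebadged 7970, 390 = same chip as 290), using the GCN 1.0/1.1/1.2 labels the posts use:

```python
# GCN revisions for the cards discussed above, per the thread.
# Labels follow the thread's naming convention (GCN 1.0/1.1/1.2).
gcn_revision = {
    "HD 7970": "GCN 1.0",
    "R9 280X": "GCN 1.0",   # rebadged Tahiti, same chip as the 7970
    "R9 290":  "GCN 1.1",
    "R9 290X": "GCN 1.1",
    "R9 390":  "GCN 1.1",   # same Hawaii chip as the 290, updated memory/power
    "R9 380X": "GCN 1.2",
}

# The 280x/380x comparison from the post: same family, different revision.
print(gcn_revision["R9 280X"], "vs", gcn_revision["R9 380X"])
```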
 

Piroko

Senior member
Jan 10, 2013
905
79
91
A classic example, Shadow of Mordor. R290X >= 980. Clearly GCN optimized.
We find the 780Ti is ahead of the 970, between the 970 & 980, where it belongs basically.

This result is verified by other sites, like AnandTech here, where the 780Ti is slightly below the 980. Not a huge gap.
...
Or it's a game that responds very well to absolute memory bandwidth because it has textures that are too noisy for optimal color compression.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
Nvidia not optimizing Kepler for new games anymore is especially bad news for me, as I just bought a Kepler graphics card 2 weeks ago.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Well I think that gets the core of all this. Nvidia has based it's decisions over the last 15 or so years primarily on the bottom line whereas ATI/AMD has put less focus there. It isn't to say that NV has done less to "further the field" but they typically only do so when they are the only ones that benefit (physx, cuda, g sync, gameworks...ect). AMD on the otherhand has been more willing to open their innovations more often then not and have recently been more willing to allocate more resources in maintaining/optimizing older hardware. It just goes to show that what benefits us as consumers quite often comes at a cost for the companies involved. In a perfect situation we would see both companies meet a bit more in the middle where we wouldn't have any talk of AMD going under or seeing relatively recent hardware rot on the vine like this respectively.
I disagree; ATI/AMD has focused on the bottom line much more. The 4800 and 5800 series were very cost effective, much cheaper to make than their Nvidia counterparts.

With GCN they changed that, probably to get in on the compute market. Now Nvidia has the cheaper-to-make cards, but they still get away with charging more for them.
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
Maybe you can find some reliable sources for your claims.

Like pit the 6950/70 vs Fermi 470/480/560/570/580 for their era and we can see how they aged.

Btw, NV promised DX12 & Vulkan for Fermi, never happened either. They just say NOPE!.. and that was all she wrote.

I think Nvidia still hasn't officially dropped the plan to add DX12 support to Fermi (though I'll be very surprised if they don't give up). Late last year they also released a WDDM 2.0 driver for Fermi, so there is still some ongoing support. But yes, NV has failed to deliver DX12 so far, which they promised for 2015.

As for Fermi vs the 6900s, there is a clear trend of the 6900s suffering a lot more in newer games. It's not the case in all of them, but it is in most. It's not easy to give you a single link to prove this because these cards are rarely tested these days, but if you search through the pclab game-specific tests from 2014 and 2015 you will find some good examples. The VLIW DX11 cards remind me of the GeForce 7 series: they had DX9C support, but as DX9C games developed further they became really horribly slow at it. I saw some tests with the 6970 delivering half the performance of a 260X in newer games.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
I had an HD 6870 1GB and a GTX 470 1.2GB in two identical machines up until last year. They were very evenly matched in most titles (even newer ones), I specifically remember that they scored within 5% of each other in Unigine Valley DX11 1080p extreme preset (the 6870 won). I saved the results somewhere, so if I find them I'll post them.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
Take the 2015 titles. Isolate console vs non-console titles and this is what you get..

R9 290x vs GTX 780 Ti (non-console)
1080p: GTX 780 Ti is 8.3% faster
1440p: GTX 780 Ti is 1.7% faster

R9 290x vs GTX 780 Ti (console)
1080p: R9 290x is 16.2% faster
1440p: R9 290x is 20.8% faster

See a pattern?

Let's do the same thing but comparing a GTX 980 vs R9 290x..

R9 290x vs GTX 980 (non-console)
1080p: GTX 980 is 33.5% faster
1440p: GTX 980 is 29.3% faster

R9 290x vs GTX 980 (console)
1080p: GTX 980 is 8.1% faster
1440p: GTX 980 is 5.6% faster

Now, as we move towards DX12 and async compute titles, the 10-20% boost async compute offers, as well as the API overhead alleviation of DX12, should result in the R9 290x being around 5-15% faster than a GTX 980. (We can ignore the Rise of the Tomb Raider DX12 patch as it is broken, but once fixed you'll see.)

What we're seeing is the console effect. With Microsoft pushing unity between the PC and console platforms, we're going to see this push NVIDIA towards a more GCN-like uarch, or they won't be able to compete.
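The comparison method described above can be sketched in a few lines: group the titles, then average each card's per-title lead. The FPS numbers below are made-up placeholders, not the actual benchmark data behind the post's percentages; only the method (split by console vs non-console, average the per-title ratios) follows the post:

```python
# Sketch of the grouping-and-averaging comparison described above.
# All FPS figures are hypothetical stand-ins, NOT real benchmark results.

def avg_speedup(results, baseline="R9 290X", rival="GTX 780 Ti"):
    """Mean percentage by which `rival` leads `baseline` (negative = trails)."""
    ratios = [(fps[rival] / fps[baseline] - 1.0) * 100 for fps in results]
    return sum(ratios) / len(ratios)

# Hypothetical per-title FPS at one resolution, split as in the post.
console_titles = [
    {"R9 290X": 60.0, "GTX 780 Ti": 52.0},   # e.g. a console port
    {"R9 290X": 75.0, "GTX 780 Ti": 64.0},
]
pc_titles = [
    {"R9 290X": 58.0, "GTX 780 Ti": 62.0},   # e.g. a PC-first title
    {"R9 290X": 70.0, "GTX 780 Ti": 76.0},
]

# With this placeholder data the 780 Ti trails on console ports and
# leads on PC-first titles -- the pattern the post describes.
print(f"console ports: 780 Ti {avg_speedup(console_titles):+.1f}%")
print(f"PC-first:      780 Ti {avg_speedup(pc_titles):+.1f}%")
```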
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
I have this ability to look at a graph full of numbers and intuitively discern patterns. But I understand not everyone is an aspie like me. So I did the math instead.

Console titles:
Star Wars Battlefront
Mad Max
Assassin's Creed
Just Cause 3
Rainbow Six
Dirt Rally
Far Cry Primal
The Division

Non console? All the others from 2015 on.
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
I had an HD 6870 1GB and a GTX 470 1.2GB in two identical machines up until last year. They were very evenly matched in most titles (even newer ones), I specifically remember that they scored within 5% of each other in Unigine Valley DX11 1080p extreme preset (the 6870 won). I saved the results somewhere, so if I find them I'll post them.

Unigine Valley is not a game, and it's old by now. Also, to be fair, both can be quite horrible in newer games; even the VRAM amount is a problem for most games.

But when I see this kind of stuff:
http://pclab.pl/zdjecia/artykuly/chaostheory/2015/12/game_jc3/charts1/jc3_1920l.png
a 6970 at around half of a 7770 makes little sense.
 

Mahigan

Senior member
Aug 22, 2015
573
0
0
These figures..

R9 290x vs GTX 780 Ti (non-console)
1080p: GTX 780 Ti is 8.3% faster
1440p: GTX 780 Ti is 1.7% faster

Match the Hardware Canucks figures.

Once you throw in console titles, the GTX 780 Ti starts to fall back.

Case closed?
 

Goatsecks

Senior member
May 7, 2012
210
7
76
With Microsoft pushing unity between the PC and console platforms then we're going to see this push NVIDIA towards a more GCN-like uarch or they won't be able to compete

I think we really need to wait and see on the actual Polaris and Pascal performance figures; raw performance and no async seems to suit the 980ti just fine.

Can you imagine if the AMD and Nvidia uarchs did converge? I am not sure if that is a good thing? However, it would be fun watching the die-hards argue over nothing.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Don't forget that for almost the entire life of GCN you've been able to mine one flavor of profitable coin or another. My old 7950 bought me a 290 with the coins it made, and that 290 paid itself off again with coins. And I've yet to start mining Ethereum, which could have my two 290s pay out their value yet again.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
Unigine Valley is not a game, and it's old by now. Also, to be fair, both can be quite horrible in newer games; even the VRAM amount is a problem for most games.

But when I see this kind of stuff:
http://pclab.pl/zdjecia/artykuly/chaostheory/2015/12/game_jc3/charts1/jc3_1920l.png
a 6970 at around half of a 7770 makes little sense.

Like I said, I retired those cards last year because they were no longer any good for newer games. They are still great for Borderlands 2 and such (well over 60fps), but FC4 dipped down to 30fps. The tradeoff between fidelity and frame rate became too large, IMO.

You're right about Valley, it's not a game, but it was the only specific, objective measurement that I could recall from memory. It was an advanced DX11 benchmark and both cards did well with it, despite being pretty old.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So much mad about this alleged poor aging of Kepler yet people forget the 6950/70. Those cards became potatoes almost overnight.

Source?

$299 HD6950 2GB unlocked to an HD6970 95%+ of the time.
$369 HD6970 2GB
vs.
$499 GTX580 1.5GB

March 2016 - 1080p gaming
GTX580 = 109%
HD6970 = 100%
http://www.computerbase.de/thema/grafikkarte/rangliste/

Here is Joker's GTX780 vs. GTX970 comparison as of February 2016. GTX780 got wrecked and that didn't even include some of the latest games where 780 is barely as fast as a 280X.
https://www.youtube.com/watch?v=zgbvqWpJAhY

Ironically, in synthetic benchmarks, 780 is very close to 970.

Nvidia not optimizing Kepler for new games anymore is especially bad news for me, as I just bought a Kepler graphics card 2 weeks ago.

Kepler's poor performance was noted as early as late 2014 on these very forums when newer games like Dragon Age Inquisition and Far Cry 4 were fresh:

November 2014:
Even before we had The Witcher 3, Hitman, The Division, NFS, Project CARS, and Far Cry Primal, the red flags were there, but most didn't pay attention or didn't think it would actually become a new trend. I can't imagine buying anything from the GTX600/700 series unless it was an incredible deal for a back-up rig. With $180-225 R9 290/290X cards available on the used market for the last 12 months, and versions of these cards selling new for $250-270, it was laughable seeing 780s going for $230-250+ used on eBay.

The interesting thing is that the R9 290/290X (aka 390/390X) were never meant to actually compete with the GTX970/980. The fact that the 290/290X surpassed the 780/780Ti and are now regularly viewed as 970/980 competitors just goes to show that GCN was more forward-looking for modern games and had superior driver support from AMD.

Let's not forget that TPU uses a reference, thermally throttling 290X in their charts. Computerbase concluded that without blasting the fans at 100%, the R9 290X's GPU clocks can drop to 839-869MHz!
http://www.computerbase.de/2014-05/amd-radeon-r9-290-290x-roundup-test/2/

Essentially, the data TPU uses for the 290/290X is hardly relevant because of those thermally throttled clocks. If we want to look at how a real-world after-market 290/290X performs today, we just look at 390/390X numbers and adjust them down maybe 1-2%.

Another way to look at things is compare an after-market R9 290 vs. 780Ti in 2014 and now.

July 2014:

After-market R9 290
780Ti is 11% faster


That means a true indication of the 290X's performance (i.e., an after-market version clocked at 1GHz) is just 1-2% behind a 390X. When we take this context into account, the 780Ti's performance today looks horrible, especially since it sold for $150 more.

March 31, 2016 = A stock 390 (aka after-market 290) now beats a 780Ti @ 1440P => The resolution a lot of gamers were buying 780Ti for in the first place.


The reasons why Kepler bombed so much can be debated for months. The point is, the 780 gets smoked by the 970 too (the JokerProductions video I posted earlier), and the HD7000/R9 290 series perform better in modern titles than their competitors. And this doesn't even consider the pricing premiums for NV cards ($299 R9 280X vs. $380/$450 770 2-4GB, $500+ GTX780, etc.) or the fact that AMD cards could have earned money with crypto-currency during those generations.
 
Last edited:

kawi6rr

Senior member
Oct 17, 2013
567
156
116
So much mad about this alleged poor aging of Kepler yet people forget the 6950/70. Those cards became potatoes almost overnight.

Given your avatar, this puts Nvidia in the category of petty and classless. Posts like this are completely useless; you should be ashamed.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
The reasons behind why Kepler bombed so much can be debated for months

Yet we came up with a few real good answers in 2 days.
I feel the better answers were over at Hardocp, but that might be because there is less BS marketing over there.
There are still a few truly independent posters here who also gave good answers.

Basically, Nvidia moved on to Maxwell.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yet we came up with a few real good answers in 2 days.
I feel the better answers were over at Hardocp, but that might be because there is less BS marketing over there.
There are still a few truly independent posters here who also gave good answers.

Basically, Nvidia moved on to Maxwell.

It's interesting that the entire analysis of R9 290X vs. 780Ti presented by ABT ignores 2 major areas:

1) $399 After-market R9 290 ~ R9 290X reference.
http://www.computerbase.de/2014-05/amd-radeon-r9-290-290x-roundup-test/2/

Sure, it can be argued that there were after-market 780Ti cards too, but looking at recent performance, the 390X (= 290X) is 10% ahead of the 780Ti as is. It also means it was actually possible to purchase R9 290 CF for just $100 more than a single 780Ti. Bonus: the 290 cards had 4GB of VRAM, which now helps in modern titles where the 780Ti bombs, cannot select above 1080p resolution (Hitman), or cannot use higher-quality textures.

2) Head-to-head: the R9 290X cost $549 but the 780Ti cost $699. On this very forum we see people with $2000/2000 EUR builds going with an i5-6600K & 980Ti because they couldn't justify spending $150 extra for an i7. Even with countless games/benches showing that it no longer makes sense to buy an i5, people still do -- which means they don't have an extra $100-150 to spend on a better CPU. Even if we assume that a 780Ti OC ~ R9 290X OC on average in today's games, that's $150 wasted straight up that could have gone to a larger/faster SSD, a better monitor, or an i5->i7 upgrade.

Also, if $150 extra over 780Ti was justified, what about $100 extra for R9 290 CF? Even last year on April 1, 2015, it became clear that 780Ti was never worth its asking price.



Forget Kepler gimping or GCN playing console-ported games better. There is a far greater lesson here for the future:

$650 GTX280, $500 GTX680, $650 780, $700 GTX780 Ti, $550 GTX980 = every single one was overpriced/underperforming long-term. Yet, when threads like yours pop up, the price disparity is conveniently ignored in the analysis.

Let me put it this way: the R9 290X buyer upgrading to a $500 14nm/16nm GPU in 2016/2017 will effectively be spending the 'equivalent' of $350. Therefore, looking only at how well the 780Ti aged against the 290X (or 780 vs. 290, or 770 2-4GB vs. 7970/R9 280X) while ignoring the huge NV premiums is misleading, imo. It's even more ironic considering how many people trashed the 7970 for being $50 more expensive than a launch 680. Now repeat the savings from not buying flagship overpriced NV cards over the last 5+ consecutive generations, roll them over, and we are talking a thousand+ dollars wasted on what, Unigine/3DMark e-peen scores? Bring in crypto-currency earnings from AMD cards and, objectively speaking, NV cards make no sense at all, since now we are talking a $0 290X vs. a $700 780Ti. :sneaky:
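The rolled-over-savings tally in the post can be sketched as a simple sum. The dollar figures below are the launch/street prices quoted elsewhere in this thread ($500 680 vs. a $550 7970, $650 780 vs. the $399 after-market 290, $699 780Ti vs. the $549 290X); they are taken from the posts, not independently verified:

```python
# Sketch of the price-difference tally made above. Prices are the ones
# quoted in this thread for each head-to-head pairing.
pairs = [
    ("GTX 680",    500, "HD 7970",       550),  # the one gen AMD launched $50 higher
    ("GTX 780",    650, "R9 290 (AIB)",  399),
    ("GTX 780 Ti", 699, "R9 290X",       549),
]

# Positive terms mean the NV card cost more that generation.
saved = sum(nv - amd for _, nv, _, amd in pairs)
print(f"Net saved by picking the AMD card each generation: ${saved}")
```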
 
Last edited:

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Bring in crypto-currency earnings from AMD cards and, objectively speaking, NV cards make no sense at all, since now we are talking a $0 290X vs. a $700 780Ti. :sneaky:

If, of course, you want to spend most of your time and electricity dedicating the card to crypto-currency. I think electricity costs more in Britain than in the USA, so it's not really worth it here. Can't say for other countries.

As always you bring up a lot of good points, but although many will share your logic, others may want different things from their cards. We've been through this so many times in different threads so I don't want to start all that up again.
 