Why are graphic cards not getting cheaper?!

Page 5 - AnandTech Forums

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
Graphics cards have certainly gotten cheaper and more powerful.
For example, the 660 Ti launched at $299 in 2012 and offers similar performance to today's 950 at $159.
So you are getting a card with the performance of a card from 3 years ago at almost half the price.
Another example would be the GTX 670 at $399 vs the GTX 960 at $199. Again, similar performance at half the price.
Even today's mid-range cards offer much better performance per dollar than high-end cards. For example, the 980 Ti at $650 offers only about 2x the performance of the 960 but costs over 3x as much.
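That perf-per-dollar claim checks out as quick arithmetic. A minimal sketch in Python; the "2x the performance" figure is the post's own estimate, not a benchmark:

```python
# Quick arithmetic behind the comparison above. The performance ratio is the
# post's own estimate ("about 2x"), not a measured benchmark.
cards = {
    "GTX 960":   {"price": 199, "relative_perf": 1.0},  # baseline
    "GTX 980Ti": {"price": 650, "relative_perf": 2.0},  # "about 2x the performance"
}

for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price"]
    print(f"{name}: {perf_per_dollar * 1000:.2f} perf per $1000")

price_ratio = cards["GTX 980Ti"]["price"] / cards["GTX 960"]["price"]
print(f"980Ti costs {price_ratio:.2f}x as much for ~2x the performance")
```

The price ratio works out to about 3.3x, which is where the "over 3x as much" comes from.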
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Compared to 2-year-old Haswell, desktop Skylake is just plain embarrassingly pointless if you don't give a flying crap about USB 3.1. SB users are still having a good laugh about how much they got out of a 4-year-old platform.
What I find particularly interesting is the outrage expressed whenever AMD/nVidia don't produce a GPU twice as fast as the previous generation (30% sucks!)

Meanwhile, the 5% we've been getting from Intel every 18 months since 2006's Core 2 Duo is lapped up like sweet milk, at full prices.

The only outlier to this trend has been the 4790K, yet people still scramble to justify the inferior Skylake.

All you need is a new motherboard and overpriced overclocked DDR4, and you too can see 10% more performance in fringe cases. Wow, what progress. :awe:
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
What I find particularly interesting is the outrage expressed whenever AMD/nVidia don't produce a GPU twice as fast as the previous generation (30% sucks!)

Meanwhile, the 5% we've been getting from Intel every 18 months since 2006's Core 2 Duo is lapped up like sweet milk, at full prices.

The only outlier to this trend has been the 4790K, yet people still scramble to justify the inferior Skylake.

All you need is a new motherboard and overpriced overclocked DDR4, and you too can see 10% more performance in fringe cases. Wow, what progress. :awe:

I think I'm a little confused by this post. Who is lapping up the Intel products like sweet milk? Visiting the CPU sub-forum, it's a constant "guess I'm staying on Sandy/Ivy/Haswell longer than I thought." The only people I really see buying Skylake are coming from something older than Sandy, are the kind of people who upgrade because it's cheaper than good blow and it gets them high, or are corporations that have yearly upgrade contracts.

But I must agree, I believe the GPUs get more blowback than CPUs. Probably because GPUs get more use by "gamers" than CPUs do by "CPU users".
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Why are graphics cards not getting cheaper? To sum it up in one word: telecoms.

For years, big telecom, with its two-year subsidized phone plans, has been gobbling up semiconductor capacity. This is nothing new. But ever since phones shifted to "smartphones", the capacity they gobble up happens to be the same capacity that serves GPU vendors. Because the phones are subsidized, the costs are hidden from the consumer, so the consumer ends up paying more money. This drives up the bidding on semiconductor manufacturing capacity beyond what the video card market can support without raising prices. So prices go up. Of course, semiconductor prices generally do not actually rise per se; they just fall less than they would absent this dynamic. So what ends up happening is that prices simply cease to fall.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Compared to 2-year-old Haswell, desktop Skylake is just plain embarrassingly pointless if you don't give a flying crap about USB 3.1. SB users are still having a good laugh about how much they got out of a 4-year-old platform.

Desktop Skylake is thus far shaping up to be a huge failure relative to the massive hype that preceded it (supposedly as big a leap as Prescott to Core 2 Duo -- remember that nonsense?), even more so for the 45-47W mobile i7 Skylake SKUs. The i7 6820HQ is barely 22-23% faster than my nearly 3-year-old IVB i7 3635QM. That's insane.

The overhyped i5-6600K is a whipping boy for the i7 4790K.
http://www.sweclockers.com/test/20862-intel-core-i7-6700k-och-i5-6600k-skylake/14#content

But the i7 bashers won't ever admit to that, instead focusing on how the i7 4790K is older architecture and keep repeating the mantra that i7 has little benefit over i5 in most games.

Also, unlike i7 2600K OC that more or less made the 6-core i7 990X OC irrelevant for most, i7 6700K OC loses overall to the August 2014 i7 5820K OC (and that's not even a well overclocked i7 5820K).

You know Skylake is a failure when the biggest selling points for it are platform features like NVMe PCIe SSD support (incl. RAID-0) and USB 3.1 (lulz - one can just buy a $30 add-on USB 3.1 card for an i7 2500/2600K system).

This drives up the bid on semiconductor manufacturing capacity beyond what the video card market can support without raising prices. So prices go up. Of course, semiconductor prices generally do not actually rise per se, they just fall less compared to how much they would fall absent this dynamic. So what ends up happening is prices simply cease to fall.

I've seen this argument used many times before, but it doesn't work, because NV is posting record profits, record revenue growth (despite selling the fewest units per quarter in a decade) and, most importantly, record gross margins (Revenue - COGS). Gross margins far higher than during the GTX200/400/500 generations, skyrocketing especially during the GTX600/700/900 eras, suggest that consumers are not only subsidizing the new nodes but are putting more $ into NV's pockets. If NV had to pay much higher prices for wafers due to extra demand from telecoms for the same wafers, then NV's gross margins would be at historical levels or even below.
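The margin logic is easy to sanity-check numerically. A minimal sketch, with all dollar figures invented purely for illustration (they are not NV's actual financials):

```python
def gross_margin(revenue, cogs):
    """Gross margin as a fraction of revenue: (Revenue - COGS) / Revenue."""
    return (revenue - cogs) / revenue

# All figures below are invented for illustration, not NV's actual financials.
revenue = 1000.0
baseline_cogs = 650.0  # ~35% margin, roughly the "historical" mid-30s level
print(f"baseline margin: {gross_margin(revenue, baseline_cogs):.0%}")

# If telecom demand really bid up wafer prices, COGS would rise and the
# margin would compress...
squeezed = gross_margin(revenue, baseline_cogs * 1.15)
print(f"with 15% higher input costs: {squeezed:.0%}")

# ...whereas a mid-50s% margin implies COGS fell relative to revenue.
print(f"observed-style margin: {gross_margin(revenue, 450.0):.0%}")
```

The point: more expensive wafers push margins down from the baseline, so expanding margins are evidence against the wafer-scarcity story.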

That is not what we are seeing. NV's gross margins have literally skyrocketed from the mid-30s to the mid-50% range in the last 6 years. The reasons NV is raising prices are complex, but likely boil down to consumers continuing to pay. Do you honestly think NV could have launched a GTX460/GTX560Ti for $499 during the GTX280/285 era and delayed the real flagship GTX480/570/580 cards by a year? Are you kidding?



GTX560Ti was a $249 videocard. Starting with Kepler GTX680, the mid-range chip became a $499 product and with Maxwell, a $550 980.




No one in NV's marketing back in the days of the GTX460/GTX560Ti used BS marketing metrics like perf/watt to justify slapping a $499 price tag on a mid-range NV chip while waiting a year to release the real flagship GTX480/580 cards.

AMD flopped hard with HD7970 generation with low clocks and bad launch drivers and NV pounced with what normally would have been a $250-300 GTX680 and turned it into a $500 product. Instead of consumers seeing through this marketing BS, they bought those cards so NV said hmm...let's introduce the new premium pricing segment of $1000 Titan. That worked too. So really, the main reason GPU prices keep rising is consumers keep buying them.

The second reason is because AMD is so financially weak, NV can bifurcate a generation and coast while milking every single release. Because AMD is financially weak, they have no choice but to raise prices just to make $1.

Graphics cards have certainly gotten cheaper and more powerful.

No, they have not overall.

Nvidia

GTX280 $499 -> Titan $999
GTX580 $499 -> Titan X $999
GTX560Ti $249 -> $549 GTX980

GTX980 is just a GTX960Ti, GTX970 is a 960, 960 is a 950 level card. NV moved up all the names to justify the prices. It's pure marketing.

AMD side

HD4850 $199, HD5850 $259 --> became HD7950 $499, R9 290 $399, Fury $550 (!)

HD4870 $299, HD5870 $379, HD6970 $379 -> became HD7970 $550, R9 290X $549, Fury X $649 (!)

Prices have skyrocketed by 50-100%.

Even the 980Ti, which at $649 seems cheaper than the 6800 Ultra or 8800 Ultra, is NOT a fully unlocked chip. What that means is that a $650 980Ti is really just the successor to a $349 GTX570.

Guess what the Titan X is? That's your $550 GTX580 3GB. ^_^

So you are getting a card with similar performance from 3 years ago at almost half the price.

What you described happened every generation of the last 20 years. You are not making a point here.

Furthermore, you are ignoring the time frame and the price/performance of existing generations. For example, you say the 960 is way cheaper than a 670, but the 960 itself launched almost 3 years after the GTX670. So to make the comparison relevant, you have to look at 2 things:

1) In the past, how fast was a $200-240 next-generation card (hint: GTX460/560Ti/HD7870) vs. the flagships? By that measure the 960 is a failure.

2) Relative context to previous generations. The GTX960 has the worst price/performance and the worst generation-over-generation performance increase of NV's last 5 consecutive x60 generations (!).

GTX680 and especially GTX980 are the worst "flagship" generational increases of all time on team NV. There was already massive outrage when GTX680 outperformed the GTX580 by 35-40%, the lowest generational increase in NV's history. GTX980 against GTX780Ti is literally THE worst ever. But in reality both 680 and 980 are exceptional cards because they are GTX460/560Ti lineage products beating last gen's flagships. But wait a second, that was ALWAYS the case.

GeForce 3 Ti 500 (high-end) < GeForce 4 Ti 4200 (mid-range)
GeForce 4 Ti 4600/4800 (high-end) < GeForce 5600U/5700U (mid-range)
GeForce 5900/5950U (high-end) < GeForce 6600GT (mid-range)
GeForce 6800U (high-end) < 7800GT/7950GT (mid-range)
GeForce 7900GTX (high-end) < 8800GT/8800GTS 320MB (mid-range)
GeForce 8800U/9800GTX+ (high-end) < GTX250/GTX260 (mid-range)
GeForce GTX280/285 (high-end) < GTX460 1GB/560Ti (mid-range)

Now look:

GeForce GTX480/580 (high-end) < $500 GTX680 (mid-range)
GeForce GTX780Ti (high-end) < $550 GTX980 (mid-range)

What's the difference? All of the next generation mid-range NV cards going back at least to GeForce 3 were not $499-550 cards. :biggrin:

What NV/AMD are doing now is bifurcating a generation into two halves and making up flagships arbitrarily.

Even today's mid range cards offer much better performance per dollar compared to high end cards.

That has always been the case. The difference is in the past a $200-300 next gen mid-range GPU would beat last generation's $500-650 flagship card. Today, you cannot buy a launch date $200-300 next generation AMD/NV card that will beat last generation's flagship. In other words, GTX980 would have needed to be $250-300 for that to happen against a $700 GTX780Ti.

Even if we account for inflation, next-generation mid-range cards should be at most $350, maybe $400, yet the 680 was $500 and the 980 was $550. BS.
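A hedged sanity check of the inflation point, assuming a flat ~2% annual rate rather than actual CPI data:

```python
def inflate(price, annual_rate, years):
    """Compound a launch price forward at a flat assumed inflation rate."""
    return price * (1 + annual_rate) ** years

# GTX 560 Ti's $249 launch price (early 2011) carried ~4 years forward at an
# assumed flat 2% per year. Actual CPI varied; this is only a ballpark.
adjusted = inflate(249, 0.02, 4)
print(f"$249 in 2011 is roughly ${adjusted:.0f} in 2015 dollars")
```

Under that assumption, inflation alone gets a $249 card nowhere near $500-550; it stays well under $300.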

What I find particularly interesting is the outrage expressed whenever AMD/nVidia don't produce a GPU twice as fast as the previous generation (30% sucks!)

The performance is just one aspect, but the price is the other. If NV provided 30% more performance instead of the usual 70-90% as was the case in the past, no problem, but why charge near flagship prices for next generation mid-range performance (aka 680/980)?

You've been purchasing NV cards for years, so you should know what I am talking about. Imagine if NV had jacked up the price of the GTX470 to $650. Is that OK? They did exactly that with the spec-neutered $650 GTX780.

I am pretty sure you would not have been thrilled at all if you had to pay $1000 for a fully unlocked GeForce 6800U 512MB or GTX580 3GB back in the days. NV now sells those products, except they are called the Titan series.

Meanwhile, the 5% we've been getting from Intel every 18 months since 2006's Core 2 Duo is lapped up like sweet milk, at full prices.

The majority of gamers are actually not upgrading to Skylake. Many are still on SB/IVB/Haswell. I think a lot of Skylake upgraders are either those always on the cutting edge chasing benchmarks, or owners of Q9xxx, 1st-generation i5/i7, or i5 2500K chips. i7 2600K-and-up owners are not really impressed by Skylake, from what I see online.

The only outlier to this trend has been the 4790K, yet people still scramble to justify the inferior Skylake.

More shocking is that people are buying the i5-6600K when the i7 4790K is often on sale for not much more. Z97 boards are dirt cheap now since they are discontinued. DDR3 1600-2400MHz is cheaper than the DDR4 2666-3000 needed for Skylake to show its true potential. In reality that means it's possible to build a 16GB DDR3 i7 4790K system for close to the cost of a far inferior i5-6600K setup. :sneaky:

All you need is a new motherboard and overpriced overclocked DDR4, and you too can see 10% more performance in fringe cases. Wow, what progress. :awe:

The irony is that those who have upgraded from SB/IVB/Haswell to Skylake often don't even own GTX980Ti SLI/Titan X SLI/Fury X CF. Essentially, anyone coming from an i7 2600K OC would hardly benefit from a Skylake i7 6700K upgrade unless they had close to maxed out their GPU budget.

There is one use case where Skylake is better, though: PCIe NVMe performance. I am pretty sure there are no Z97 boards that boot well off an NVMe drive, even if some are compatible.

You made a great call though, I must say, just reusing your DDR3 and upgrading to the i7 4790K. I remember telling you to wait for Skylake, but I think you made the right decision given that the i7 6700K is a big flop and the DDR4 2666-3000 price premiums persist. And your DDR3 has already been paid for. You'll basically coast another 3 years on the i7 4790K, just in time for the 2018 Icelake new architecture, by which point DDR4 3500+ should be dirt cheap. :biggrin:
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Stock? Why would you run a 6600K without OC?

http://pclab.pl/art65154-26.html

OC vs OC, the 6600K beats the 4790K in most games, and it's a lot cheaper. Is that the definition of a whipping boy?

Not a chance. If you play a wide variety of PC games like Crysis 3 or Ryse: Son of Rome, an i7 4790K @ 4.7GHz would destroy an i5-6600K OC.

But no matter how many times I link actual real-world benchmarks, you will still deny it. No one cares about PCLab because nothing out of there ever matches the 10-20 other websites that do reviews online.

A max-overclocked 6700K barely beats the i7 4790K. We also know that in games that are well threaded, an i5 gets dropped hard by the 4- and 6-core i7s.

The i7 4790K is a way better processor, overall and for games, than an i5-6600K, stock vs. stock or OC vs. OC.

It looks like you didn't even bother looking at that review I linked, because the 6600K barely hangs in there with an ancient 2600K. Why? Because many modern games benefit from HT.
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
Not a chance. If you play a wide variety of PC games like Crysis 3 or Ryse: Son of Rome, an i7 4790K @ 4.7GHz would destroy an i5-6600K OC.

But no matter how many times I link actual real-world benchmarks, you will still deny it. No one cares about PCLab because nothing out of there ever matches the 10-20 other websites that do reviews online.

A max-overclocked 6700K barely beats the i7 4790K. We also know that in games that are well threaded, an i5 gets dropped hard by the 4- and 6-core i7s.

The i7 4790K is a way better processor, overall and for games, than an i5-6600K, stock vs. stock or OC vs. OC.

It looks like you didn't even bother looking at that review I linked, because the 6600K barely hangs in there with an ancient 2600K. Why? Because many modern games benefit from HT.

The review you posted doesn't include OC.
The PCLab review includes Crysis 3 "Welcome to the Jungle", which is the most multithread-heavy section as far as I know, and the 4790K's lead was small.
http://pclab.pl/art65154-28.html

Can you post some of the other reviews with those games and overclocked CPUs? If not, I think the PCLab review is a good source of information; comparing a stock 4790K vs a stock 6600K is kind of pointless, unless you are buying a locked MB or something.



Actually, it's not cheaper overall. Both the DDR4 RAM and the motherboards are more expensive by a lot and close the gap.

The CPU is a lot cheaper. You can pay as much for OC DDR3 and Z97 boards as you can for Z170 and DDR4, depending on what you want. You can also buy cheaper Z170 boards and try using DDR3 with them, but performance would not be as good as with DDR4.
 

Eric1987

Senior member
Mar 22, 2012
748
22
76
No, they have not overall.

GTX280 $499 -> Titan $999
GTX580 $499 -> Titan X $999
GTX560Ti $249 -> $549 GTX980

GTX980 is just a GTX960Ti, GTX970 is a 960, 960 is a 950 level card. NV moved up all the names to justify the prices. It's pure marketing.

GeForce GTX480/580 (high-end) < $500 GTX680 (mid-range)
GeForce GTX780Ti (high-end) < $550 GTX980 (mid-range)

Sorry, so many things are wrong with your analysis. The 680 and 980 are mid-range performance? WTF? Why are you comparing the 280 to the Titan? You should compare the 280 to the 980.
 

Techhog

Platinum Member
Sep 11, 2013
2,834
2
26
Sorry, so many things are wrong with your analysis. The 680 and 980 are mid-range performance? WTF? Why are you comparing the 280 to the Titan? You should compare the 280 to the 980.

You're missing his point. He's comparing the chips to their equivalents. From that perspective, the 560 Ti's (full GF114) successor is the 680 (full GK104), which is in turn succeeded by the 980 (full GM204). The 280, on the other hand, was a full GT200 core, and was succeeded by the 580 (full GF110), then the 780 Ti and Titan Black (GK110), and now the Titan X (full GM200). If you disagree with this way of thinking, that's fine, but there's nothing wrong with what he said. Nvidia didn't create a new tier of chip in the Kepler and Maxwell lines; they just took advantage of how far behind AMD fell so they could raise prices while making it seem like they hadn't.
 
Last edited:

Head1985

Golden Member
Jul 8, 2014
1,866
699
136
NV releases:
Cards up to $400: only recently the GTX950, GTX960, and GTX970. Only 3 cards, and in the whole of last year, only 2.
Above $400: GTX980, GTX980Ti, Titan X, dual GM200. That's 4 cards.

I don't like the current GPU market's condition. These are the worst times in its history. If this continues, we will see a fast end to the GPU market, because not many people can afford this crap.
 

SPBHM

Diamond Member
Sep 12, 2012
5,058
410
126
NV releases:
Cards up to $400: only recently the GTX950, GTX960, and GTX970. Only 3 cards, and in the whole of last year, only 2.
Above $400: GTX980, GTX980Ti, Titan X, dual GM200. That's 4 cards.

I don't like the current GPU market's condition. These are the worst times in its history. If this continues, we will see a fast end to the GPU market, because not many people can afford this crap.

They still rely on older products for anything below the 950, but those are available new and selling a lot (the 750, for example).

At least the 750 is not as old as the R7 370, for example.
 

tajoh111

Senior member
Mar 28, 2005
305
322
136
Dafuq did I just read?

AMD raised the price of graphics cards by offering a 7970 40% faster than a GTX580 for 10% more, and we should thank NVIDIA for fighting to keep prices low with a 680 10% slower than a 7970 GHz for only 10% more money?

Really?
REALLY?

People buying that rip-off that was the 680 IS exactly what started the whole "high end 1000€, mid range 550€, low end 200€" thing.

Thank anyone who bought a 680/770/780/Titan when AMD had faster for less, not someone who bought a 7970 when NVIDIA had nothing within 40% of its performance.

Bull. The 7970 was never 40 percent faster than a GTX 580 at release. It was more like 10-20 percent, as AnandTech and TechPowerUp show.

If you call the GTX 680 a rip-off, the 7970 was an even greater rip-off.

The next generation always has to provide better price-to-performance than the last generation, particularly when that generation hasn't had price cuts yet. There is about an 80% saving in transistor cost between generations, so you can pack 80% more transistors for the same cost, meaning the 7970 was cheaper to produce than a GTX 580. It's why the 5870 was so much faster and cheaper than the generation before it, the 8800 GTX vs the 7900 series, etc.

What made the 7970 a bad deal was that it was 10% more expensive than the GTX 580 and maybe just 15% faster. It didn't pass this transistor saving on to the customer at all.
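The transistor-cost argument can be sketched as follows; the 80% figure and the 10%/15% price/performance deltas are the claims above, treated here as assumptions rather than measured data:

```python
# Toy model of the claim above: a full node shrink historically let you pack
# ~80% more transistors for the same silicon cost (the poster's figure, taken
# here as an assumption).
def transistors_per_dollar(base, shrinks, gain_per_shrink=0.80):
    """Transistors per dollar after N node shrinks at a flat gain per shrink."""
    return base * (1 + gain_per_shrink) ** shrinks

print(f"after one shrink: {transistors_per_dollar(1.0, 1):.2f}x transistors per dollar")

# Under that assumption, ~15% more performance for ~10% more money passes
# almost none of the cost saving on to the buyer:
perf_per_dollar_gain = 1.15 / 1.10
print(f"7970 vs GTX 580 perf/$ gain: {perf_per_dollar_gain:.2f}x (vs 1.80x headroom)")
```

A ~1.05x perf/$ improvement against a claimed 1.8x cost headroom is the gap the poster is complaining about.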

Considering the node shrink and the generally poor price-to-performance of the GTX 580, which meant the 7970 also had poor price-to-performance, it didn't provide much of an upgrade incentive to people who owned a GTX 580 or 6970. Earlier generations tended to be at least 40% faster at the same price vs. the top-end cards manufactured on the previous node.

The GTX 680 had glowing reviews because it offered 10% better performance, was the fastest card at release, and carried a 10% discount vs. the 7970. That made it a better value than the 7970 at its release, because you get a faster card for cheaper. It was reflected in reviews, and AnandTech was particularly pleased with the card.

BTW, the 7970 GHz had the same speed as the GTX 680 at its (later) release, and was the same price. It was also noisy as hell, with the power consumption to boot.

The GTX 680 wasn't that great a value, but compared to the 7970 as released, it appeared to be one.

The reason graphics card prices rose is that AMD, as the underdog, tends to keep the bigger brand's prices in check. If AMD lowers its prices, the better-known brand can't sustain crazy-high prices. But if AMD prices its cards high and stops being the value brand, then at minimum a competitor with higher brand standing can price its cards the same, and if it prices them higher, card prices go up across the board. If the market leader prices its cards under the underdog, it craters the underdog's profits and market share, which is what happened to the 7970.
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
Sorry, so many things are wrong with your analysis. The 680 and 980 are mid-range performance?

Yes. Look at the 780Ti and Titan X. Only release timing fooled people into buying mid-range cards while NV had the real ones in the oven, IMO.
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
So really, the main reason GPU prices keep rising is consumers keep buying them.

This is the core of it. Nvidia has a lot of customers who won't ever consider another brand, because they are comfortable buying Nvidia and don't have the time, interest, or motivation to consider buying any alternatives. They are essentially locked in. For better or worse, they have decided that spending further time deciding what GPU to buy is a waste of their energy, and feel safe by simplifying their purchasing decision by looking at their budget, and picking the most expensive Nvidia card that fits. While there are valid reasons to pick one or the other, any explanations beyond that are essentially rationalizing a decision that was already predetermined, and if a reason also happens to be rational that is pure coincidence.

As a result, AMD has to try to shake things up and break through that mindset and try out risky new tech and try to guide the industry to use it (such as async compute), whereas Nvidia can play it safe and try to brute force things, because the majority of their customers don't care about how future-proof their card is. They will just buy a new Nvidia card with the necessary upgrades when the time comes. With enough people like that, Nvidia is just being smart and raising the prices to just under the level where they piss off their core base of loyal customers.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
This is the core of it. Nvidia has a lot of customers who won't ever consider another brand, because they are comfortable buying Nvidia and don't have the time, interest, or motivation to consider buying any alternatives. They are essentially locked in. For better or worse, they have decided that spending further time deciding what GPU to buy is a waste of their energy, and feel safe by simplifying their purchasing decision by looking at their budget, and picking the most expensive Nvidia card that fits. While there are valid reasons to pick one or the other, any explanations beyond that are essentially rationalizing a decision that was already predetermined, and if a reason also happens to be rational that is pure coincidence.

As a result, AMD has to try to shake things up and break through that mindset and try out risky new tech and try to guide the industry to use it (such as async compute), whereas Nvidia can play it safe and try to brute force things, because the majority of their customers don't care about how future-proof their card is. They will just buy a new Nvidia card with the necessary upgrades when the time comes. With enough people like that, Nvidia is just being smart and raising the prices to just under the level where they piss off their core base of loyal customers.

These watered down generalizations tend to catch me in the mix. Neither an NV fanboy nor a sheeple (or as someone else tried to rationalize it - I incur high debt), did it ever dawn on people that at certain points AMD had inferior products?

I mean, let's look at GCN from launch:
HD 7970: slower than the GTX 680 and cost more. GTX 680 won.
HD 7970 GHz: faster than the GTX 680, at least in a handful of games, and cheaper. AMD saw market share gains.
Then the HD 7970 stumbled: spotty driver support, delayed frame-pacing drivers, a memory-management rewrite that never materialized. People got tired of waiting and moved on.
R9 290X: a superior product to the GTX 780 (overall), but acoustics/thermals were awful, it took almost two months for custom models to address them, and then it went straight into price gouging due to coin mining. It cost more to buy a 290 than a GTX 780, and a 290X cost more than a GTX 780 Ti. That went on for almost six months.

When the bottom collapsed, you expect people to just drop what they bought and pick up AMD?

Fury Nano/Fury/Fury X: price, availability, and performance. The Fury Nano is put up against the GTX 980 even though it costs the same as the Fury X/980 Ti and more than the Fury. WHY? Unless someone is trying to build a rig in a shoebox, the Fury Nano shouldn't even factor in at $650. Fury X, where the hell is it? And again, price vs. performance: the 980 Ti beats it, yet it costs the same? Fury, your winner! And it forced NV to react on 980 prices, woooo!

But cue the novel-length posts about "the 290X is free if you only eat ramen for a week" or the idiotic "if you weren't coin mining you were doing it wrong" rhetoric. Those posts never reflect that important 6-8 months the 290X was on the market: it was gouged to hell and back.

Is it really just NVidiots? I just read a thread over at the FFXIV reddit where it seems AMD has issues with the game, and people are advising others to wait for a driver that was promised a month ago (sounds familiar). Glad I didn't get a Fury X and then get forced to play in DX9 mode because AMD can't deliver a stable driver for DX11 mode (most likely due to NV money-hatting in the first place, right?).


Anyways, as someone who always seems to get mixed up in the "NV has these fanboys that will sell their mom for a new card" generalizations: face it, AMD dropped the ball so hard NV didn't even need its fanboys. AMD basically drove its own supporters to NV's hotel.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
So, I bought my GTX 760 almost TWO years ago now. I paid 220€ for it and it's been serving me well.

Now.

Why is still being sold for the same price, TODAY?

Simple, because it cost X amount to manufacture at the time and if they sell it for less than X they'll make a loss.

Newer-generation cards that are about as fast as an old 760 but are built on cheaper manufacturing processes will, however, be cheaper. And it's not just GPUs; most electronics increase in speed/capacity over time, such as CPUs, RAM, etc.
 

Morbus

Senior member
Apr 10, 2009
998
0
0
Meanwhile, I'm sticking to my GTX760, which is doing the rounds two years after purchase...

I'm liking the read though, great stuff in here
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Sorry so many things wrong with your analysis.

Welcome to 2015. This has been discussed ad-nauseum for the last 3 years. You have provided no rebuttal as to what I have gotten wrong.

680 and 980 mid range performance? WTF?

Absolutely.

1) The very definition of a flagship graphics card, before we even get to code-names, is that it must perform at or near the top of its architectural food chain/product stack.

What's the fastest Tesla graphics chip? GTX280/285, not GTX260/260 216.
What's the fastest Fermi graphics chip? GTX480/580, not GTX460/560/560Ti.
What's the fastest Kepler graphics chip? GTX780Ti, not GTX680/770.
What's the fastest Maxwell graphics chip(s)? GTX980Ti/TitanX, not GTX980.

By very definition in their product stack, GTX680 and 980 are mid-range graphics cards.

2) Code-names prove it too: the GTX680 is GK104 and the GTX980 is GM204, the middle chips of their generations, while the true flagships (780Ti, Titan, Titan X, 980Ti) use the big GK110/GM200 dies.

3) Looking at 10-15 years of NV's generational increases also shows where next-generation mid-range chips fall against the last generation's flagship, and where flagship chips fall against the last generation's flagship. By these performance metrics, it's obvious that the 680 and 980 are mid-range graphics cards.

4) Die sizes. Look at the die sizes of NV's flagship chips across the Tesla, Fermi, Kepler and Maxwell generations. It's as obvious as ever that the GTX680, its GTX770 refresh, and the GTX970/980 are mid-range products.

5) Flagship NV cards also always have the highest memory bandwidth of their respective generation relative to the rest of the product line. Once again, the 680/980 have mid-range 256-bit buses vs. 384-bit for the flagship cards of those generations (780Ti and 980Ti).

In fact, the marketing names NV chooses are often arbitrary. The GTX580 is not a 500-series product; it's really a GTX485, or a GTX480 Ultra. NV just made a brand new generation out of nothing because it sounds nice. Marketing FTW.

Likewise, the GTX780Ti is really what the GTX680 should have been (a true 680, or a 680 Ultra), while the GTX680 is nothing more than a GTX660Ti. NV simply split the Kepler architecture into two halves and milked each half with a "marketing" flagship, aka the GTX680.

According to all 5 key metrics we have used to compare graphics cards for 2+ decades, the 680/980 are mid-range products: (1) position in the architectural product stack of that generation, (2) code-names, (3) inter-generational performance increases relative to the last generation of cards, (4) die sizes within that architectural generation, and (5) total memory bandwidth/memory bus width within that generation.
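The die-size and bus-width metrics above can be sketched as a crude classifier. This is just an illustration of the argument, not an official taxonomy; the die sizes are approximate public figures, so treat them as ballpark numbers.

```python
# Illustrative sketch of metrics (4) and (5) above: NV's big flagship
# dies have historically been roughly 500 mm^2+ on a 384-bit-or-wider
# bus, while the middle chips sit around 300-400 mm^2 on 256-bit buses.
# Die sizes are approximate public figures.
CHIPS = {
    # card: (chip code-name, die size in mm^2, memory bus width in bits)
    "GTX 280":    ("GT200", 576, 512),
    "GTX 580":    ("GF110", 520, 384),
    "GTX 680":    ("GK104", 294, 256),
    "GTX 780 Ti": ("GK110", 561, 384),
    "GTX 980":    ("GM204", 398, 256),
    "GTX 980 Ti": ("GM200", 601, 384),
}

def tier(die_mm2: int, bus_bits: int) -> str:
    """Crude rule of thumb based on the thread's argument: big die on a
    wide bus -> flagship; smaller die on a 256-bit bus -> mid-range."""
    return "flagship" if die_mm2 >= 500 and bus_bits >= 384 else "mid-range"

for card, (chip, die, bus) in CHIPS.items():
    print(f"{card:11s} {chip:6s} {die:4d} mm^2 {bus:3d}-bit -> {tier(die, bus)}")
```

By this yardstick the GTX 680 and GTX 980 land in the mid-range bucket alongside each other, while the 780 Ti and 980 Ti group with the old GTX 280/580 flagships, which is exactly the point being made.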

Why are you comparing the 280 to the titan? You should compare the 280 to the 980.

Because the GTX280/285 is the flagship Tesla chip, the cream-of-the-crop Tesla graphics card of its generation, just as the Titan/Titan X are the flagship/largest dies of their respective Kepler/Maxwell generations. By historical lineage, the GTX280/285/480/580 are the spiritual predecessors of the Titan series.

The difference is NV added a fancy marketing name and doubled the VRAM on the Titan cards, while leaving double-precision compute enabled on the Titan, but crippled on the Titan X. The Titan series is just a new prosumer product line but it comes from the same family of GeForce chips. What do you think the 980Ti is? It's just a Titan X with cut-down units and half the VRAM.



I realize many people don't want to admit that they paid $500-550 for a mid-range graphics card, but that's exactly what happened, because AMD started to seriously lag behind NV in 2012, which allowed NV to execute this strategy.

There are even rumours that NV originally intended to call the GTX680 the GTX670Ti, but after seeing the 7970's performance, they realized AMD was way behind.




Also, marketing is often highly misleading.

The GTX680-to-GTX780Ti gap is equivalent to the GTX980-to-GTX980Ti gap, yet NV spun a new series out of thin air with the 780Ti. Alternatively, if NV had wanted to, they could have called the GTX980Ti a 1080Ti.

Another obvious marketing lie is the GTX460 -> 560. The 560 is just a refresh of the GTX460, but NV made a GTX500 series out of thin air. Why? Because "500 series" sounds way more powerful than "400 series". :sneaky:

Even more misleading is GTX760 vs. GTX960. It appears the GTX960 is two full generations ahead of the 760, but in fact they are only one architectural generation apart (Kepler -> Maxwell). In no way, shape or form should the GTX960 ever have been called a 900-series card. At most it should have been an 800-series card, but NV decided to skip GTX800 on the desktop just cuz... marketing!

The GTX960 is also just 14% faster than the GTX760, despite launching more than 1.5 years later and carrying a designation that suggests it's two full generations ahead. How can that be? Because the GTX960 is not a real x60-series card, that's how. The true x60-series cards, going by NV's historical generations, are the middle chips: that would be the GTX970/980. But when AMD is so far behind and late to launch, why would NV price the GTX970 at $199-249 and the GTX980 at $249-299? They wouldn't.

NV is loving it, while AMD has been forced to raise prices since its price/performance and small-die strategy failed. The end result: consumers pay higher prices for the expected level of performance, or they have to wait longer before a true flagship is released for a given GPU architecture.

Many of us expect this to be repeated with the first wave of Pascal chips.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
The difference is NV added a fancy marketing name and doubled the VRAM on the Titan cards, while leaving double-precision compute enabled on the Titan, but crippled on the Titan X.

I wouldn't say it is crippled. The entire Maxwell family has no double precision. They took it out to use the silicon real estate for something else since they were stuck on the 28nm node.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I wouldn't say it is crippled. The entire Maxwell family has no double precision. They took it out to use the silicon real estate for something else since they were stuck on the 28nm node.

They definitely "crippled" it. It doesn't affect us gamers, but it definitely made the Titan X no longer a prosumer card, which was argued as the sole reason for the $1,000 price tag on the original Titan.

Titan X @ $1,000 was just NV milking the Titan name they established and making extra bank.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
They definitely "crippled" it. It doesn't affect us gamers, but it definitely made the Titan X no longer a pro-sumer card. Which was argued as the sole reason for a $1,000 price tag on the original Titan.

Titan X @ $1,000 was just NV milking the Titan name they established and making extra bank.

When I think crippled, I think they shut something down in the silicon. The difference here is that Maxwell 2 never had it in the first place. So, Maxwell 2's design sacrifices to fit as much gaming performance as possible from 28nm silicon should have rendered it invalid for a Titan card anyway.

I wouldn't say Nvidia crippled the Titan X by removing DP, since there was nothing to remove. They just shouldn't have made it. But without it and its higher price, we wouldn't have gotten the 980 Ti love for $650 with binned chips.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
When I think crippled, I think they shut something down in the silicon. The difference here is that Maxwell 2 never had it in the first place. So, Maxwell 2's design sacrifices to fit as much gaming performance as possible from 28nm silicon should have rendered it invalid for a Titan card anyway.

I wouldn't say Nvidia crippled the Titan X by removing DP, since there was nothing to remove. They just shouldn't have made it. But without it and its higher price, we wouldn't have gotten the 980 Ti love for $650 with binned chips.
They shouldn't have made the Titan X? Are you serious? They make bank off that card. Nvidia knows what they're doing; when they release cards, they're making money. That's what they care about.
 

Sabrewings

Golden Member
Jun 27, 2015
1,942
35
51
They shouldn't have made Titan x? Are you serious? They make bank off that card. Nvidia knows what they're doing when they release cards they're making money. That's what they care about.

It watered down the Titan name. They were prosumer cards, but the Titan X lacks some crucial capabilities. It should've been the 980 Ti, the 980 Ti should've been the 980, and the 980 should've been the 970 Ti. If Maxwell 2 can't do prosumer tasks well, don't release a card under your prosumer brand. Price them the same as they are now (a $1,000 980 Ti), who cares, but they lost some Titan prestige when they decided to release a Titan on an architecture that has no DP.
 