Anybody else unimpressed with new midrange Nvidia GPUs, and much higher MSRP?


exar333

Diamond Member
Feb 7, 2004
8,518
8
91

Hmm, this is indeed very interesting. This totally contradicts the fact that the GTX 970 is the most popular graphics card among Steam users. So either people got richer in the last 6 years or Nvidia got a whole lot smarter. Lol.

I don't think it contradicts it at all.

When you get below the top 2-3 GPUs from each company (980 Ti/980/970 and Fury/390X), the number of SKUs increases for each smaller price bracket. What that means is that the 'lower high-end' products like the 290, 290X and 970 are very well represented in user numbers. Imagine if there had been both a 970 and a 970 Ti offered for the last few years: you would see those numbers split between the two. The fact that the 970 has been THE NV option between $300-400 for almost 2 years makes it a popular card. If the 390/390X had never been released (say, only an '8GB version' of the 290 had), the 290's numbers might be a lot higher instead of being split between 290 and 390 sales. Also throw in the 'X' versions of both as well.

Not saying the 970 didn't sell well, but it has been a consistent product, with nothing from NV within $100 above it and almost nothing within $100 below it...
 

jlee

Lifer
Sep 12, 2001
48,513
221
106
I appreciate the fantastic information and detail provided.

That said, the chart is awfully old (2010). Looking at more recent feedback, the message from AIBs is that the money is being made at the high end rather than the low end. I think NV understands this trend and is taking advantage. Not saying it is right or not, just pointing it out. I would be interested to see hard numbers on ASPs over the last 10 years. Word is they continue to increase, and I think they will continue to do so.

Since 2010, a LOT of lower-end GPUs have been completely integrated into the CPU. That 14% is probably a lot higher now. Also, ASP is a much more important metric than % of sales: one 980 Ti is a more desirable sale than a $100 entry-level card.

http://www.fudzilla.com/news/graphics/38483-overall-gpu-shipments-down-11-percent

edit: I also don't think those categories make much sense now. I would do the following:
- <$99
- $100-199
- $200-399
- $400-799
- $800+

http://www.kitguru.net/components/g...op-graphics-cards-hit-10-year-low-in-q2-2015/

Sales of high-end cards are increasing while overall sales are decreasing - it seems that the higher performance of onboard solutions (and/or the fact that a mid-range card from a year or two ago still does the vast majority of what people want) made an impact.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76

Hmm, this is indeed very interesting. This totally contradicts the fact that the GTX 970 is the most popular graphics card among Steam users. So either people got richer in the last 6 years or Nvidia got a whole lot smarter. Lol.

Well, we know that in the USA the mean adjusted income for the middle class has dropped, so you can rule that out.

The chart is old and inflation is real, although many products have remained approximately the same price regardless (for instance, games are still pretty close to 50 bucks). I think it is partially because the GPU market is not expanding but shrinking, so they are forced to pass inflation on to the customer. However, I am not an economist; I am just guessing.
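
To put that inflation arithmetic in concrete terms, here is a rough Python sketch (the CPI index values are illustrative placeholders, not official BLS figures):

# CPI-based inflation adjustment, as discussed above.
# These index values are made-up placeholders purely to show the mechanics.
CPI = {2004: 190.0, 2010: 218.0, 2016: 240.0}

def adjust_price(price, from_year, to_year, cpi=CPI):
    # Convert a nominal price in from_year dollars to to_year dollars.
    return price * cpi[to_year] / cpi[from_year]

# e.g. what a $50 game from 2010 "should" cost in 2016 dollars:
print(round(adjust_price(50, 2010, 2016), 2))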
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Well, we know that in the USA the mean adjusted income for the middle class has dropped, so you can rule that out.

The chart is old and inflation is real, although many products have remained approximately the same price regardless (for instance, games are still pretty close to 50 bucks). I think it is partially because the GPU market is not expanding but shrinking, so they are forced to pass inflation on to the customer. However, I am not an economist; I am just guessing.

This one is not that old:


This graph is NV admitting to price gouging and testing the market's limits. They admit their products were priced higher than they should have been - out of reach for most of the market. Yet their market share (per board piece, not $ share) increased. Speaks volumes.
 
Last edited:

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Fair enough:


Not much change since Fermi:
10% > $300
vs
14% > $200

Percentages that target different price segments don't really paint a clear picture. I don't think you could assert or refute RS's point with that data, IMHO.

Also, it is pretty obvious we have seen significant price and market changes since the 7xx gen, and this data pre-dates that.

To reference RS's post, look how fast the GPU landscape changes. Around when your slides were created, the 680 launched at $499. That's a little different than what we are seeing with the 1080.

I looked for a few minutes, but didn't find anything that jumped out at me. I still maintain that ASP for discrete GPUs over the last 5-10 years would be really telling.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
This one is not that old:


This graph is NV admitting to price gouging and testing the market's limits. They admit their products were priced higher than they should have been - out of reach for most of the market. Yet their market share (per board piece, not $ share) increased. Speaks volumes.

You completely missed the point of the graph. The slides are from the 660 launch and are meant to point out that the 660 had a huge market, as the 670/680 were above that price point.

Nice try though to skew the message...
 

xorbe

Senior member
Sep 7, 2011
368
0
76
I'm sticking to cards that are less than $400 from now on, so I guess that means the 1070 for me. They now have 3 cards at $599 and up (1080, future 1080 Ti, future Titan).
 

tential

Diamond Member
May 13, 2008
7,355
642
121
So if $300+ GPUs are out of reach for 90% of GPU owners... then Nvidia must clearly understand that the top 10% are what drives revenue for them, since they continually focus on releasing the $300+ GPUs first and are even raising the prices on them.

To me, it's the midrange buyers who see new GPUs, get excited, but don't buy. It's the high-end buyers who almost always buy, so Nvidia is right in launching the 1070/1080 first. Polaris will get some hype, but most people will probably just say "eh, the performance I have is still good enough."
 

sam_816

Senior member
Aug 9, 2014
432
0
76
Your smartass response does nothing for my question, but thanks for the effort. However, I do want to point out that $399 in 2004 is $629 in today's dollars... so you're much better off staying with the 6600GT example ($250 adjusted for inflation).

I'm quite confident that the bolded is the case - people continue to pay, so what incentive does nVidia have to charge less money?



I am not a student of economics, so I have a couple of questions related to this post and would appreciate it if someone could help me here.

Jlee pointed out that the value of $$ has changed over time and an adjustment for inflation has to be made. Now my questions are:

1. Haven't incomes increased along with inflation? If I am not wrong, the USA has seen some of the best income growth, and established companies like Apple and NVIDIA proudly claim that they have very efficient/economical manufacturing processes. Yes?

2. As the customer base increases and cheaper manufacturing processes are developed, the cost of products goes down - just look at the cost of mobile phones. Now, are the new GPUs from NVIDIA so groundbreaking that they justify the increase in launch prices, or are they just trying to take advantage of the hype?

3. I have read that NVIDIA's support for the 700 series GPUs is declining. Yet in this recent presentation it was claimed that billions of dollars and thousands of engineers were involved in the development of the 1000 series.

Do you think that's true? If yes, why do they drop or slow down support on a product they invested so much in to begin with?

There are some more questions in my mind but I will ask them after I have cleared these doubts.
 

96Firebird

Diamond Member
Nov 8, 2010
5,712
316
126
1. Simple answer - no. Look up "real income"; wages have not increased with inflation.

2. Nvidia can set launch prices, as they have been, when there is a lack of competition. Also, who says manufacturing is getting cheaper? Look at the R&D budgets of these GPU companies...

3. Technology moves fast. Any company is going to put more resources into future products if it means greater revenue.
 

sam_816

Senior member
Aug 9, 2014
432
0
76
1. Simple answer - no. Look up "real income"; wages have not increased with inflation.

2. Nvidia can set launch prices, as they have been, when there is a lack of competition. Also, who says manufacturing is getting cheaper? Look at the R&D budgets of these GPU companies...

3. Technology moves fast. Any company is going to put more resources into future products if it means greater revenue.
Thanks for your reply, appreciate it.

I am not sure I completely agree on the 1st point, though, as I see more people buying $600+ cards than before, which suggests (I may be wrong) that people have more disposable income. Also, base wages have not improved, but as we move higher up the chain people have better incomes (inflation well factored in), and these people are the actual consumers of these GPUs.

Now I have some more questions, but let me start with these cases. If I am not wrong, the PS3 launched at $499 (20GB) and $599 (60/80GB, not sure), but the PS4 launched at $399 - and if I am not wrong, there was a lot of effort/R&D etc. involved. The Xbox 360 was $399, but the Xbox One was $499 (included their camera thingy). The Windows OS scenario is similar: Win 7 Home Basic (lowest version) was $149.99, whereas Win 8 and Win 10 are $119.99 (in case you buy a fresh license). How come inflation etc. was not factored in here? Also, when we consider the CPU market, where the AMD-Intel situation is similar to the AMD-Nvidia situation, do we see similar price hikes from Intel with every new series/processor family?

P.S. Sorry for my bad English, it's my 3rd language lol
 
Last edited:

railven

Diamond Member
Mar 25, 2010
6,604
561
126
1. The US is screwed up in regards to cost-of-living growth versus living-wage growth. It varies state to state, and in some situations people's basic needs end up compromised because companies are unwilling to provide proper benefits, retirement plans, and wages.

The upper-middle class is slowly declining. You'll find most Americans live outside their means. I would attribute the growth of higher-priced GPUs to both AMD and NV upping the price of their top products.

2. The price hike is a two-pronged issue. Are you asking about the die size relative to what they are charging, or what they are charging relative to what you are getting? In the last 4 years NV's x80 card went from $500 to $600. That isn't as severe an incline as you'd think. If you're going to argue die size, you're now arguing relevance: people don't buy dies, and most won't even know the actual die of the card they are buying.

3. After the AMD 7000 series dropped, their older cards were almost abandoned. AMD changed their design and never looked back. You didn't see pitchforks or 'planned obsolescence' posts here. AMD invested a lot in their GCN architecture and got it into the hands of important decision makers. Kudos to them. NV basically just optimized their architecture for what was on the table. They handily beat AMD until those investments AMD made started to pay off. But it took years, costing AMD market share, mind share, and then some. They are at the cusp of reaping their investment, but then comes Pascal, updated to take these decisions into account. Suddenly AMD may no longer have that advantage. Does that mean NV is abandoning their old architectures? To a degree, yes, because the market is changing - just like when the market moved to unified shaders and fixed-pipeline hardware was almost instantly abandoned.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
railven said:
3. After the AMD 7000 series dropped, their older cards were almost abandoned. AMD changed their design and never looked back. You didn't see pitchforks or 'planned obsolescence' posts here. AMD invested a lot in their GCN architecture and got it into the hands of important decision makers. Kudos to them. NV basically just optimized their architecture for what was on the table. They handily beat AMD until those investments AMD made started to pay off. But it took years, costing AMD market share, mind share, and then some. They are at the cusp of reaping their investment, but then comes Pascal, updated to take these decisions into account. Suddenly AMD may no longer have that advantage. Does that mean NV is abandoning their old architectures? To a degree, yes, because the market is changing - just like when the market moved to unified shaders and fixed-pipeline hardware was almost instantly abandoned.
That's probably why its performance relative to the 7850 barely changed over the course of 1.5 years, right?

3/22/12 6970 1% faster than 7850


7/25/13 6970 7% faster than 7850


10/22/13 6970 7.5% faster than 7850


AMD sure took a long time to abandon those cards.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Odd, considering the 7870 didn't have such wild swings.

12% Faster
10% Faster
11% Faster
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Radeon 5000 and 6000 series support was decent up until they were put on legacy status with Crimson, IIRC. Crimson even had some nice "farewell" performance improvements, at least for my friend's 6850 in Diablo 3. Nvidia is making their own fan base nervous with how quickly and drastically they took engineers off updating drivers for Kepler. Now I read some of them psyching themselves up for "forced" annual upgrades as some sort of driver-support fee.
 

sam_816

Senior member
Aug 9, 2014
432
0
76
railven said:
1. The US is screwed up in regards to cost-of-living growth versus living-wage growth. It varies state to state, and in some situations people's basic needs end up compromised because companies are unwilling to provide proper benefits, retirement plans, and wages.

I don't want to be rude here, but are you saying American citizens (at least gamers) are (don't know how to put it more nicely) people with low IQ?

railven said:
The upper-middle class is slowly declining. You'll find most Americans live outside their means. I would attribute the growth of higher-priced GPUs to both AMD and NV upping the price of their top products.

I am really concerned here, as that is not good news for the whole tech industry IMO. America is considered the target market, as it consumes the most tech gadgets (per capita and in sheer numbers). What will happen if one day people wise up? lol

I am not happy about the pricing strategy of both Nvidia and AMD, but since AMD is usually following Nvidia, they can't be credited or blamed for setting GPU pricing (just my opinion). And I am annoyed this time because of the Founders Edition vs. MSRP shenanigans - I am not sure if any of the vendors will bring out a $599 version, so basically the price of the reference card is $699, which is $200 up from the $499 launch price of the 980.

railven said:
2. The price hike is a two-pronged issue. Are you asking about the die size relative to what they are charging, or what they are charging relative to what you are getting? In the last 4 years NV's x80 card went from $500 to $600. That isn't as severe an incline as you'd think. If you're going to argue die size, you're now arguing relevance: people don't buy dies, and most won't even know the actual die of the card they are buying.

No, I am not going to argue die size, and I don't think 90% of GPU buyers would argue that either. Like I wrote in my previous post, from a regular customer's point of view Sony paid a lot of R&D cost to make the PS4, which is perceived to be a lot better than the PS3 (they don't care about the die size of the PS4 vs. the PS3); they are just concerned about the launch prices of these consoles. The same goes for newer Windows versions, Intel's newer chips, Samsung's new SSDs, and most importantly games, which have sold for $60 for a long time now.

railven said:
3. After the AMD 7000 series dropped, their older cards were almost abandoned. AMD changed their design and never looked back. You didn't see pitchforks or 'planned obsolescence' posts here. AMD invested a lot in their GCN architecture and got it into the hands of important decision makers. Kudos to them. NV basically just optimized their architecture for what was on the table. They handily beat AMD until those investments AMD made started to pay off. But it took years, costing AMD market share, mind share, and then some. They are at the cusp of reaping their investment, but then comes Pascal, updated to take these decisions into account. Suddenly AMD may no longer have that advantage. Does that mean NV is abandoning their old architectures? To a degree, yes, because the market is changing - just like when the market moved to unified shaders and fixed-pipeline hardware was almost instantly abandoned.

Well, I will read more about it, as I am not aware of it (I'm new to the PC scene). I mentioned Nvidia's slowing support since (primarily) I've read about it on multiple forums. I have also read that AMD's driver support has almost always been slow, and since they don't have a budget like Nvidia's, I didn't point fingers at them (for now). Secondly, I mentioned it because during the presentation their top gun was making vague statements (I felt) about millions of $$ and thousands of engineers. From a person of his stature I expect statements like "over 2 billion in R&D" and "over 1,500 engineers". I honestly paused the presentation video to check Nvidia's total headcount, since he sounded like they have 30k-40k employees, and I was scratching my head thinking how they can spare thousands of engineers just for R&D but have a staff shortage when it comes to supporting previous-gen GPU architecture (as I read in one of the threads here on the AT forums).
....
 
Last edited:

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de
So if $300+ GPUs are out of reach for 90% of GPU owners... then Nvidia must clearly understand that the top 10% are what drives revenue for them, since they continually focus on releasing the $300+ GPUs first and are even raising the prices on them.

To me, it's the midrange buyers who see new GPUs, get excited, but don't buy. It's the high-end buyers who almost always buy, so Nvidia is right in launching the 1070/1080 first. Polaris will get some hype, but most people will probably just say "eh, the performance I have is still good enough."
Would also be interesting to know the replacement rates in these market segments, as this would complete the picture of revenue opportunities.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I don't want to be rude here, but are you saying American citizens (at least gamers) are (don't know how to put it more nicely) people with low IQ?

What does IQ have to do with it? One of my friends is a doctor, and he's broke because he'd rather pay his bills on time. I had a conversation with our PA, who chose to go to a casino instead of paying her overdue bill. These are both smart people who clearly have different approaches to responsibility.

I am really concerned here, as that is not good news for the whole tech industry IMO. America is considered the target market, as it consumes the most tech gadgets (per capita and in sheer numbers). What will happen if one day people wise up? lol

Don't be. The US is a leading nation in consumerism because credit is easily attainable. People will destroy their credit histories to make sure they're wearing Beats by Dre and rocking the newest iPhone. My own brother had his car repoed because he wanted an Infiniti Q60, which he later defaulted on.

Trust me, this country will not give up its consumerism.

I am not happy about the pricing strategy of both Nvidia and AMD, but since AMD is usually following Nvidia, they can't be credited or blamed for setting GPU pricing (just my opinion). And I am annoyed this time because of the Founders Edition vs. MSRP shenanigans - I am not sure if any of the vendors will bring out a $599 version, so basically the price of the reference card is $699, which is $200 up from the $499 launch price of the 980.

It takes two to tango. But it seems you've already drawn your conclusion without even seeing what prices are set. If you want to regurgitate the $700 price point, by all means. I'll continue to hope for non-Founders editions in the ~$600 range.

No, I am not going to argue die size, and I don't think 90% of GPU buyers would argue that either. Like I wrote in my previous post, from a regular customer's point of view Sony paid a lot of R&D cost to make the PS4, which is perceived to be a lot better than the PS3 (they don't care about the die size of the PS4 vs. the PS3); they are just concerned about the launch prices of these consoles. The same goes for newer Windows versions, Intel's newer chips, Samsung's new SSDs, and most importantly games, which have sold for $60 for a long time now.

The PS3's R&D was waaaaaay higher than the PS4's. Both Sony and Microsoft said they went the cheapest route possible - just look at the hardware found in each console for its respective generation. The Xbox 360 launched with unified shaders, a staple not yet found in the PC market. The PS3 had a custom multi-core CPU that basically provided most of the grunt work; hell, you had PS3 games doing accelerated tessellation even though the GPU in it didn't provide that feature (it was done on the Cell's SPEs).

The PS4/Xbone are a regression in console technology, as evidenced by the fact that they both might be replaced within the next two years.

The issue with Windows is more about peace of mind. Windows 8 did a great job turning people away from the platform - basically why Windows 10 was free for its first year out.

Well, I will read more about it, as I am not aware of it (I'm new to the PC scene). I mentioned Nvidia's slowing support since (primarily) I've read about it on multiple forums. I have also read that AMD's driver support has almost always been slow, and since they don't have a budget like Nvidia's, I didn't point fingers at them (for now). Secondly, I mentioned it because during the presentation their top gun was making vague statements (I felt) about millions of $$ and thousands of engineers. From a person of his stature I expect statements like "over 2 billion in R&D" and "over 1,500 engineers". I honestly paused the presentation video to check Nvidia's total headcount, since he sounded like they have 30k-40k employees, and I was scratching my head thinking how they can spare thousands of engineers just for R&D but have a staff shortage when it comes to supporting previous-gen GPU architecture (as I read in one of the threads here on the AT forums).

It's rather interesting how a few articles and vocal (AMD-biased) posters can change the perception of a product. There was no regression on Kepler: it never got slower than at its launch. However, AMD did improve. They improved so much it left Kepler behind. That doesn't mean Nvidia crippled Kepler; it was no longer their focal point. You could argue they abandoned it, and I think any reasonable poster would agree. After a certain point Kepler didn't see any strong gains.

I've been an AMD/ATI-exclusive user for over 15 years; that stopped when AMD couldn't provide drivers for my configuration that satisfied me. I'm sure they've fixed it by now, but for those few months AMD was basically pushing me to drop them. I also don't propagate the "AMD has bad drivers" meme - far from it. Considering the resources of the two companies, AMD kept up. But this forum does a good job of basically erasing the first 1-2 years of the HD 7970's existence, because it wasn't all roses. I owned two 7970s up until 2013, when I couldn't take it anymore and sold them. (I also panned Nvidia's SLI implementation, which I tried next - but I guess I'm just too sensitive to microstutter. I later tried 290X CrossFire and found the issues I had left behind to be worse. The games I play frequently had poor support from AMD, which was unfortunate.)

I wish Brightcandle still posted here. He documented the issues with the 7000 series well early on. But around here I quickly found out that if you aren't tossing praise at AMD, you're an NV sheep/shill/fanboy. 15 years of AMD/ATI-exclusive support erased overnight. Interesting thing.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
3. After the AMD 7000 series dropped, their older cards were almost abandoned. AMD changed their design and never looked back. You didn't see pitchforks or 'planned obsolescence' posts here.

Stop making stuff up. Fermi aged way worse than HD6000.

April 23, 2016 Benchmarks

$299 HD6950 @ 6970 = 100%
$499 GTX580 = 109%
http://www.computerbase.de/thema/grafikkarte/rangliste/

9% greater future performance in modern games for $200 USD more paid out in 2010. :thumbsdown:

The 6950 user could literally throw the 6950 into the garbage today and still have the $200 USD left over from not wasting it on a 580. That $200 can be put towards Polaris 10, a 1060/1060 Ti/1070, etc. The 6950 user could also have invested the $200 in a GIC or a bond and made some interest income over the last 5 years. Even if we compare a $379 6970 vs. a $499 580, the 580 still was not worth the $120 extra. That $120 was better used to purchase an i7 2600K instead of an i5 2500K, or set aside towards a future GPU upgrade.
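
As a back-of-envelope sketch of that opportunity-cost point (the 3% annual yield is an assumption for illustration, not a quoted figure):

# The $200 saved in 2010 by skipping the GTX 580 premium,
# compounded annually in a GIC/bond at an assumed 3% yield.
principal = 200.0
rate = 0.03   # assumed annual yield
years = 5     # 2010 -> 2015
value = principal * (1 + rate) ** years
print(round(value, 2))  # ~231.85 available towards the next GPU upgrade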

The exact same is true for GTX280 vs. 4870, GTX285 vs. HD4890, GTX480 vs. HD5870, GTX780/780Ti vs. R9 290/290X, R9 390/290X vs. 980, etc. Not one of those NV cards was worth its premium for someone who held on to the card beyond 1.5 years.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Nice find! Do you know what games they used to get these averages?
They're still showing the GTX 780Ti beating the 290X and the 780 keeping up with the 290.

That's because Computerbase.de uses reference 290/290X blower cards that throttle. To get them back to factory clocks, they increase PowerTune +50% and set fan speed to 100%. Only then do their cards actually operate at spec. They label them as 290 OC and 290X OC. This is why you see a 390 beating a 290X by 12%, which you know is impossible. Once the 290X is adjusted to operate locked at factory clocks, the 290X OC label shows it outperforming the 390, which is how it should be. This also explains why reference 290/290X blowers were so damaging to AMD.

They explained this a while back when they tested reference 290X vs. after-market 290X.

Look:
290X Reference on Computerbase charts:

Clock in Corsair Obsidian 800D = 836-869 MHz
Clock in Fractal Design R4 = 840-860 MHz

That is why you see the 290/290X (non-OC labels) performing so poorly in ComputerBase's testing - but they did want to show a real-world result for reference cards, which is appreciated, IMO.

So in reality, a non-throttling 290/290X beat the 780/780 Ti.

I am not sure about specific games but whatever their list was in 2016.
 
Last edited:

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Radeon 5000 and 6000 series support was decent up until they were put on legacy status with Crimson, IIRC. Crimson even had some nice "farewell" performance improvements, at least for my friend's 6850 in Diablo 3. Nvidia is making their own fan base nervous with how quickly and drastically they took engineers off updating drivers for Kepler. Now I read some of them psyching themselves up for "forced" annual upgrades as some sort of driver-support fee.
that is pretty insane.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
The PS3's R&D was waaaaaay higher than the PS4's. Both Sony and Microsoft said they went the cheapest route possible - just look at the hardware found in each console for its respective generation. The Xbox 360 launched with unified shaders, a staple not yet found in the PC market. The PS3 had a custom multi-core CPU that basically provided most of the grunt work; hell, you had PS3 games doing accelerated tessellation even though the GPU in it didn't provide that feature (it was done on the Cell's SPEs).

The PS4/Xbone are a regression in console technology, as evidenced by the fact that they both might be replaced within the next two years.

R&D spending is not a definitive indication of final product quality. You know what product AMD sunk a ton of R&D dollars into? Bulldozer.

PS3 wasn't the market failure that Bulldozer was, but it was a technological dead end. Its oddball architecture made it a nightmare to program for. The PS4 and XB1, in contrast, are just PCs, so working with them is far easier for developers.

The era of innovative, ground-up architectures for standard consumer devices is over. All future consoles will be either disguised PCs (x86 SoC) or disguised smartphones/tablets (ARM SoC). With the possible exception of a handful of scientific, industrial, and/or military applications, x86 and ARM are the only two CPU architectures that matter now.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
To reference RS's post, look how fast the GPU landscape changes. Around when your slides were created, the 680 launched at $499. That's a little different than what we are seeing with the 1080.

I looked for a few minutes, but didn't find anything that jumped out at me. I still maintain that ASP for discrete GPUs over the last 5-10 years would be really telling.

I couldn't find ASP data going back beyond 5 years, but for 2010-2015 NV quotes a CAGR of 11% for ASP (1.11^5 = a 68.5% increase in ASP since 2010). Obviously that doesn't include the price increases we are seeing with Pascal ($329 970 -> $379-449 1070 and $549 980 -> $599-699 1080), which suggests NV is continuing to increase ASPs.
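
A quick sanity check of that compounding arithmetic (a minimal sketch using the 11% CAGR figure quoted above):

# NV's quoted 11% ASP CAGR compounded over 2010-2015.
cagr = 0.11
years = 5
growth = (1 + cagr) ** years - 1
print(f"{growth:.1%}")  # -> 68.5% cumulative ASP increase since 2010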



"On May 6, NVIDIA announced both the GeForce GTX 1070 and GeForce GTX 1080 graphics cards. The MSRP for these cards sit at $379 and $599, respectively, but the models designed and built by NVIDIA (i.e., the Founders Edition cards) will sell for $449 and $699, respectively.

At launch, the prior-generation GTX 970 and GTX 980 cards were offered at $329 and $549, respectively,..."

....

"Gamers are willing to pay for performance, and NVIDIA knows this

One of the reasons that the PC gaming market has been so good to NVIDIA is that games are continually requiring more and more horsepower in order to look their best. This means that gamers upgrade at a relatively rapid clip and, more important, are often willing to pay a bit more in order to get meaningfully more performance.

At some point I suspect there will be a ceiling for gaming-oriented graphics chip selling price increases. However, there does seem to be room for the graphics specialist to add products at higher performance/price points before the limit is reached.

For example, there's room for NVIDIA to introduce a product at the $999 price point (i.e., a next generation Titan product), and I could see a cut-down version of that Titan product slotting in at $799.

Heck, if NVIDIA is able to deliver enough performance, a product at a price point of even $1299, with a cut-down variant at, say, $899 could very well be tolerated.

Additionally, NVIDIA's previous product stack had the GTX 970 at $329 and the GTX 960 at around $199. I could see the Pascal-based successor to the GTX 960 coming in at around $249, but offering performance roughly in line with the $499 GTX 980."

http://www.fool.com/investing/gener...poration-may-be-able-to-push-average-sel.aspx

Time for me to invest in more mining rigs as a hedging strategy, as I expect GPU prices to rise even more over the next 5 years. :sneaky:

Let's also not discount that the SKU names assigned to GPUs are often arbitrary. For example, never in NV's history has a 2nd-tier (x70) card trailed the 1st-tier (x80) card by 20-30%, but it appears the 1070 may be the first time this happens. Based on the TFLOPS, it seems the 1070 may only have 1920-2048 CCs at most. With only GDDR5, it's pretty much neutered compared to the 1080 - the performance delta is more akin to 660 Ti vs. 680 than to x70 vs. x80. That's another way NV raises ASPs: by manipulating SKU marketing names.
 
Last edited:

master_shake_

Diamond Member
May 22, 2012
6,430
291
121

Hmm, this is indeed very interesting. This totally contradicts the fact that the GTX 970 is the most popular graphics card among Steam users. So either people got richer in the last 6 years or Nvidia got a whole lot smarter. Lol.

$199 GPUs suck.

They're pretty much all crap.
 