Insert_Nickname
Diamond Member
- May 6, 2012
Because you run 4k @ 200% dpi scaling?
Merely 125% on a 43" TV. As I wrote, there really isn't a rational explanation.
I do agree 200% would be a bit pointless.
It does seem like every 2nd generation is better price/performance. The 2xxx series got a new feature (RTX/DLSS) and the 4xxx series just got more performance without a better price/performance ratio.

Anyway, my observation that I shared before is that Nvidia seems to react to the sales of the previous gen. The 1000-series was a great improvement in price/perf and sold like gangbusters, so they provided a poor increase in price/perf for the 2000-series. Then they seemed to aim for a very good price/perf increase for the 3000-series, as evidenced by the 3080's price/performance. Yet that generation sold outrageously due to the mining boom. So the 4000-series then got an extremely poor price/perf improvement for the most part.
So my expectation is that Nvidia will overcompensate again for the 5000-series, although higher wafer prices will restrict how good a deal they can give.
It does seem like every 2nd generation is better price/performance. The 2xxx series got a new feature (RTX/DLSS) and the 4xxx series just got more performance without a better price/performance ratio.
I just upgraded my video card, so I'll just follow from the sideline for the next 4 years until a new upgrade is imminent.

The 4000-series gives us a new feature as well, in frame generation, so it actually has better price/perf than any generation before it!!!!111
The sad part is that Jensen might actually believe that.
The biggest reason I can see why the 5000-series might not be at least a good value is if they have to make a huge price adjustment for this generation and already overdo it. However, I predict the opposite: they will gradually lower prices and stop once sales are merely mediocre.
I think the margins will mostly stay this way. Next gen will bring some price/perf uplift, but it will probably be some boring small percentage, like in the CPU space.

What I'm wondering is: the current generation hasn't really improved the price/performance ratio vs the last generation (except for the xx90- or x900-class cards). So what will happen next generation? There must be an upper limit to what they can sell video cards for, and if these prices are going to be the "new normal", then next generation will have to have a better price/performance ratio. Or do you think they will have to lower the prices of the current gen at some point?
Keep in mind that if they keep very similar price/performance for generation after generation, sales won't just stagnate, but go down further, as fewer and fewer buyers will then still have a card from before the stagnation. Eventually nearly everyone will just keep their card until it breaks or switch to really long upgrade cycles.
That's what happens with Moore's Law being dead.
I'm hopeful that GDDR7 will help (help Ada's performance at least) a bit
But Ada already got a huge boost per mm2 compared to Ampere. And the current price increases simply can't be explained by stagnating price/perf of the nodes. The highest estimate of the cost difference per mm2 between Samsung and TSMC is 2x, but the 4080 chip is much smaller than the 3080 chip, so you can't explain the 70% price increase even if the entire BOM cost of the 4080 were determined by the cost of the chip, which of course it isn't.
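To put rough numbers on that argument: even granting the highest estimate of 2x cost per mm2, the smaller 4080 die implies only a modest rise in silicon cost, nowhere near the card-level price jump. A minimal sketch in Python; the die sizes (AD103 ~379 mm2, GA102 ~628 mm2) and MSRPs ($1,199 vs $699) are commonly cited figures, not taken from the post itself:

```python
# Rough silicon-cost sanity check for the 4080-vs-3080 price jump.
# All numbers are commonly cited estimates, not official figures.
ga102_mm2 = 628.0   # 3080 die (Samsung 8 nm), approx.
ad103_mm2 = 379.0   # 4080 die (TSMC 4N), approx.
cost_ratio_per_mm2 = 2.0  # highest estimate: TSMC 4N costs 2x per mm2

# Relative silicon cost of the 4080 die vs the 3080 die:
chip_cost_ratio = (ad103_mm2 * cost_ratio_per_mm2) / ga102_mm2
print(f"4080 die cost vs 3080 die cost: {chip_cost_ratio:.2f}x")  # ~1.21x

# Compare with the card-level price increase:
price_ratio = 1199 / 699
print(f"4080 MSRP vs 3080 MSRP: {price_ratio:.2f}x")  # ~1.72x
```

Even with every assumption pushed in Nvidia's favor, the silicon cost rises by roughly 20% while the card price rises by roughly 70%.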
GPU prices follow cryptocurrency prices (mostly BTC and ETH).
The 3080 was very heavily cut down, though. You really should compare the 4080 to GA103. Still, they could stand to cut the price a bit. But we're talking more like $999.
No longer. Mining is mostly irrelevant now.
The cut doesn't change the wafer costs of the chip. Nvidia should have significantly higher yields with TSMC anyway.
Wow, you're making a believer out of me. I swear I see some channeling of Jensen Huang here.

That's what happens with Moore's Law being dead.
I'm hopeful that GDDR7 will help (help Ada's performance at least) a bit... but it's just tough to say how much at this point & whether that would be enough to move people to buy. Assuming that the prices don't change.
The high sales volume of the 3080 is diametrically opposed to your argument, as it implies big losses in manufacturing due to poor yields. But for the sake of the argument, let's look further down the stack:

The 3080 was very heavily cut down, though. You really should compare the 4080 to GA103.
For anyone who has yet to stumble on this talk by Jim Keller, I strongly recommend watching it.

Cache ~= 1/2 the area of a monolithic design, and it can be scaled if you do as the V-cache tech showed and reduce your footprint by 50%.
3060 12GB ~ $330 for a card using a 276mm2 die.
4070Ti 12GB ~ $800 for a card using a 295mm2 die.
Even if we double every manufacturing cost, we're still well over $100 short of the price Nvidia is asking, AFTER they unlaunched the card to sell it cheaper. And remember, this means that even the screws in Nvidia cards would be nearly 100% more expensive.
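The arithmetic in that comparison can be checked directly; a quick sketch using only the prices and die sizes given in the posts above:

```python
# Checking the claim: double every manufacturing cost of a ~$330
# 3060-class card and see how far short of the 4070 Ti's ask that lands.
price_3060 = 330      # 3060 12GB, ~276 mm2 die
price_4070ti = 800    # 4070 Ti 12GB, ~295 mm2 die (post-"unlaunch" price)

doubled_cost = price_3060 * 2          # every cost item, screws included, at 2x
gap = price_4070ti - doubled_cost
print(f"Doubled 3060 price: ${doubled_cost}, gap to 4070 Ti: ${gap}")
```

Even under the doubled-cost assumption, a $140 gap remains to the asking price.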
The wafer price difference between SS8 and 4N is likely more than double. After all, they are getting a 2.7x gain in transistor count between the two. You can't compare die sizes as a relative measure of cost anymore.
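Since die area alone no longer tracks cost across nodes, the more useful quantity is cost per die: wafer price divided by candidate dies per wafer. A minimal sketch using a standard dies-per-wafer approximation; the wafer prices below are illustrative placeholders (actual foundry quotes are not public), and the die sizes are only ballpark figures:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Common approximation: gross dies in the wafer area, minus edge loss."""
    radius = wafer_diameter_mm / 2.0
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def cost_per_die(wafer_price_usd, die_area_mm2):
    return wafer_price_usd / dies_per_wafer(die_area_mm2)

# Placeholder prices: a cheap older node vs an expensive leading-edge node.
old_node = cost_per_die(5_000, 628)   # large die, cheap wafer
new_node = cost_per_die(16_000, 379)  # smaller die, expensive wafer
print(f"~${old_node:.0f} vs ~${new_node:.0f} per die")
```

Under these placeholder prices, the smaller die on the expensive node still costs nearly twice as much per die, which is why raw die-size comparisons mislead.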
Add to that the elephant in the room that so many seem to overlook: 4090 pricing vs the 3090, a very modest 7% gain in price. Yet that damn 4080 with the smaller chip goes up 70% (vs the large-die 3080).

But Ada already got a huge boost per mm2 compared to Ampere. And the current price increases simply can't be explained by stagnating price/perf of the nodes. The highest estimate of the cost difference per mm2 between Samsung and TSMC is 2x, but the 4080 chip is much smaller than the 3080 chip, so you can't explain the 70% price increase even if the entire BOM cost of the 4080 were determined by the cost of the chip, which of course it isn't.
Price gouging is the only thing that makes sense.
Bottom line: There is a severe lack of competition in the GPU/wafer production market.
RX 5600 XT (ASRock Challenger), "New", for $199.99 at Newegg (shipped and sold by Newegg)! (See my thread in Hot Deals)
What corner of their warehouse did these crawl out of?