The 3090 was probably the easiest to find during the crypto boom, since the extra ~20% bandwidth over a 3080 didn't justify the cost increase. Everything eventually sold, but 3090s were at least easier to find than the 3060 Tis and 3080s.
Really? The SM count, TDP, bus width, memory capacity, and memory bandwidth all suggest it will be a pretty huge increase at 4K. Like 50-60% faster than the 4090.
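Out of curiosity I sanity-checked that with some quick Python. The 4090 numbers are the real shipping specs; the 5090 numbers are the rumored ones, so this is a naive sketch (equal clocks, no architectural gains assumed), not a prediction:

```python
# Naive scaling estimate from raw specs. 5090 values are RUMORED placeholders.
specs = {
    #        cores, mem_bw (GB/s), board power (W)
    "4090": (16384, 1008, 450),
    "5090": (21760, 1792, 575),  # rumored, unconfirmed
}

cores = specs["5090"][0] / specs["4090"][0]   # ~1.33
bw    = specs["5090"][1] / specs["4090"][1]   # ~1.78
power = specs["5090"][2] / specs["4090"][2]   # ~1.28

# Geometric mean of compute and bandwidth scaling as a crude perf proxy.
perf = (cores * bw) ** 0.5
print(f"naive perf estimate: +{(perf - 1) * 100:.0f}%")         # ~+54%
print(f"implied perf/W gain: +{(perf / power - 1) * 100:.0f}%")  # ~+20%
```

Which lands right in that 50-60% window, for whatever a spec-sheet estimate is worth.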
The 5080 should be around 4090 performance, depending on clock rates. But it'll still only have 16GB of VRAM. So it's an upgrade on speed and a downgrade on memory, for slightly less money.
> My solution is to skip the rat race this time.

Easier said than done. Haven't missed out on a high end release. Doubt I could break that trend.
> Easier said than done. Haven't missed out on a high end release. Doubt I could break that trend.

Especially since, if the rumours are true, it's a card that's screaming for a full-coverage waterblock.
It will be all the harder to resist if the speculated performance numbers turn out to be true.
> Easier said than done. Haven't missed out on a high end release. Doubt I could break that trend.

I've had a similar bad habit, but I'm breaking it thanks to Nvidia. If I want big FPS I can now turn on DLSS and frame gen and it'll be "better quality and smoother than native" on a 5090, for free! 🤣
> Easier said than done. Haven't missed out on a high end release. Doubt I could break that trend.

Sooner or later you're gonna have to pay the early adopter tax if you keep refusing to tame your patience.
> Sooner or later you're gonna have to pay the early adopter tax if you keep refusing to tame your patience.

With Nvidia focusing primarily on AI, who knows how long a consumer Blackwell successor will take to come to market. The 4090 will be 27 months old by the time the 5090 is released (if rumors are true). I could have a 5090 on day one that will last two and a half years.
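The 27-month figure checks out if the rumored window holds. The 4090 date is its actual launch; the 5090 date is a guess based on the CES-adjacent rumor window:

```python
from datetime import date

launch_4090 = date(2022, 10, 12)  # actual launch
launch_5090 = date(2025, 1, 30)   # hypothetical, rumor-based

months = (launch_5090.year - launch_4090.year) * 12 + (
    launch_5090.month - launch_4090.month
)
print(f"~{months} months between flagships")  # ~27 months
```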
> I could have a 5090 on day one that will last two and a half years.

Possibly true, but I'm talking about some launch fiasco like the melting connectors. It would suck to get back a repaired card after doing an RMA; it's not the same as a brand-new card.
> Possibly true, but I'm talking about some launch fiasco like the melting connectors. It would suck to get back a repaired card after doing an RMA; it's not the same as a brand-new card.

The 4090 connector thing blew up. It didn't affect me, thankfully. A lot of it was a snowballing of controversy driven by those chasing YouTube algorithms and those latching on to it. The problem is real, but not one I feel is as big or devastating as it's made out to be.
> For me, 4090 performance at lower power in a thinner card would be more attractive.

The 5080 should be a bit better than the 4090 at ~50W less and probably somewhat less money. But you lose 8GB of VRAM. Seems they really don't want to give you what you want.
My comment wasn't about how much faster it will be; I'm sure that will be significant. It was more about where the extra power will actually be used when the 4090 already eats up 4K for the most part. For gaming, the 5090 will let you turn off frame gen at 4K in the few games where you still need it with a 4090, or hit 4K/120 with frame gen.
> The 5080 should be a bit better than the 4090 at ~50W less and probably somewhat less money. But you lose 8GB of VRAM. Seems they really don't want to give you what you want.

Not unless you pay out the wazoo for it. It's the Nvidia way.
> Not unless you pay out the wazoo for it. It's the Nvidia way.

The Way It's Meant to Be Played.
I'll be skipping RTX 5000; no point for me, as it's still based on the 4nm family of nodes. I'll get the RTX 6000 series when they move to 2nm, or A16 with BSPDN.
> I'll be skipping RTX 5000; no point for me, as it's still based on the 4nm family of nodes. I'll get the RTX 6000 series when they move to 2nm, or A16 with BSPDN.

You can still limit the power, but then performance will suffer, so that's not as appealing.
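It's at least easy to script if you want to experiment; it's the same thing `nvidia-smi -pl <watts>` does. A minimal sketch via the NVML Python bindings, assuming the `nvidia-ml-py` package is installed and you have admin rights (the 360 W cap is just an example value):

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Query the allowed range before setting anything (values in milliwatts).
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"allowed limit: {min_mw // 1000}-{max_mw // 1000} W")

# Example: cap a 450 W card at 360 W (~80%); needs elevated privileges.
pynvml.nvmlDeviceSetPowerManagementLimit(handle, 360_000)

pynvml.nvmlShutdown()
```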
I don't like how the rumoured power consumption keeps going up. Efficiency is important to me. I skipped RTX 3000 because RDNA2 was more efficient, and Lovelace is excellent in perf/W, so I'm staying put with my 4070 Super. Oh well, more money in my bank account.
> Given that N2 wafers are almost double the price of N4, that's going to be a fun time. A16 even more.

Yeah, and when will we even see RTX 6000? Q1/Q2 2027? That's a long time to go. Maybe by then 12GB will be the minimum VRAM for a GPU.
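Rough numbers on what that wafer pricing does to die cost. The wafer prices below are illustrative assumptions (only the "almost double" ratio comes from the comment above), the die size is a rumored flagship-class figure, and yield is ignored:

```python
from math import pi, sqrt

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    # Standard first-order approximation: wafer area / die area,
    # minus an edge-loss term. Ignores defect density / yield.
    r = wafer_diameter_mm / 2
    return int(pi * r**2 / die_area_mm2
               - pi * wafer_diameter_mm / sqrt(2 * die_area_mm2))

die_area = 744  # mm^2, rumored flagship-class die
dpw = dies_per_wafer(die_area)  # ~70 candidates per 300 mm wafer
for node, wafer_price in [("N4-class", 17_000), ("N2-class", 30_000)]:  # assumed $
    print(f"{node}: ${wafer_price / dpw:,.0f} per die candidate")
```

Double the wafer price means roughly double the silicon cost per die, before yield makes it worse.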
> I'll be skipping RTX 5000; no point for me, as it's still based on the 4nm family of nodes. I'll get the RTX 6000 series when they move to 2nm, or A16 with BSPDN.

Efficiency should be higher still; the 5090 should be >= 20% more power efficient than the 4090. But personally I'm not interested for other reasons. Not having a 384-bit die is a misstep by Nvidia, and I don't want to encourage them to further bifurcate the market.
Nvidia is quite bandwidth-limited, so I think GDDR7 will be a nice benefit. Especially since they're already paying quite a bit for GDDR6X; GDDR7 should be better for a similar price.
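The bandwidth math itself is simple: bus width (in bytes) times per-pin data rate. The 4090 row is the shipping config; the 5090 row uses the rumored 512-bit / 28 Gbps GDDR7 setup:

```python
configs = [
    ("4090 (GDDR6X)", 384, 21),          # shipping spec
    ("5090 (GDDR7, rumored)", 512, 28),  # unconfirmed
]
for name, bus_bits, gbps in configs:
    bw_gbs = bus_bits / 8 * gbps  # bits -> bytes, Gbps per pin
    print(f"{name}: {bus_bits}-bit @ {gbps} Gbps -> {bw_gbs:.0f} GB/s")
```

That's 1008 vs 1792 GB/s, a ~78% raw bandwidth jump, which is where a lot of the rumored 4K gains would come from.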