Josh128
- Oct 14, 2022
Yes they can. But it's pricey.
They can't. Else the 6900XT with its node advantage would have done it. NV has pulled too far ahead.
Yes they can. But it's pricey.
It was as far ahead in efficiency as the RTX 4080 is to the 7900 XTX. And it did all that at 2/3rds the price.
Not true according to TPU. The 6900XT is on par with the 3090, while the 4080 is 16% more efficient, and even the 4090 is more efficient despite a massive speed advantage.
Somehow it never matters when Radeon is ahead though.
It is also true according to TPU
lol, gotcha. This is why I hate discussing these things.
Not even a fake made-up peep from MLID. Would have checked RGT, but can't stand that liar's vewy annoying voice at all ...
RGT's "sources" are usually just the usual Twitter leakers and Chinese forums.
5090 has 33% more of everything, +33% faster memory.
Compared to 4090, memory (bandwidth) is much quicker than that - 70%+ (with the 512-bit bus). Agree with the rest!
5090 has 33% more of everything, +33% faster memory.
No more than 40% in raster for the 5090 vs the 4090; RT probably around the same. Less for other SKU comparisons. 4080 to 5080 in particular is a sad turn of events if the rumored GB203 is correct: only +5% shaders and RT cores vs AD103. This is a repeat of Turing, with all the real gains going to the top card of the generation. All the rest are barely more than refreshes with new GDDR.
Assuming 5% higher raster IPC per SM and 15% higher game clocks, that'd be ~60% more raw raster power with 77% more VRAM bandwidth.
So outside of CPU-limited scenarios (which may be common), I'd expect the 5090 to be 50-60% faster in raster, and 70-100% faster in RT.
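Those percentages follow directly from the assumed specs. A quick sanity check of the arithmetic (all the spec figures here are rumors and assumptions, nothing confirmed by Nvidia):

```python
# Back-of-the-envelope 5090 vs 4090 scaling from the rumored specs.
sm_scale = 1.33      # "33% more of everything" (SM count)
ipc_scale = 1.05     # assumed +5% raster IPC per SM
clock_scale = 1.15   # assumed +15% game clocks

raster_scale = sm_scale * ipc_scale * clock_scale
print(f"raw raster: +{(raster_scale - 1) * 100:.0f}%")  # +61%, i.e. ~60% more

# VRAM bandwidth: rumored 512-bit GDDR7 @ 28 Gbps vs 384-bit GDDR6X @ 21 Gbps
bw_4090 = 384 // 8 * 21   # 1008 GB/s
bw_5090 = 512 // 8 * 28   # 1792 GB/s
print(f"bandwidth: +{(bw_5090 / bw_4090 - 1) * 100:.0f}%")  # +78%
```

The compounding is why "+33% more of everything" lands near +60% raster: the SM, IPC, and clock gains multiply rather than add, and the wider bus and faster GDDR7 multiply the same way on the bandwidth side.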
It's not rocket science, though. Ada isn't Kepler, there's no Maxwell 2.0 moment to be had here, I'm afraid (I wouldn't mind eating my words later, I just don't see it happening).
Anyone expecting a flood of used 4090s from individuals and businesses ditching them for newer, shinier and more power efficient graphics and GPU/AI compute?
No idea about businesses, but from individuals, less than before, given the economic situation and the fact that the 4090 will play everything just fine. Just vibes though, I really have no idea.
Anyone expecting a flood of used 4090s from individuals and businesses ditching them for newer, shinier and more power efficient graphics and GPU/AI compute?
Not from gamers, or at least not this gamer. The additional performance of the 5090 is nearly directly proportional to the increased cost, and my 4090 isn't exactly struggling to play the latest games. The tariffs coming next year may make the price-to-performance ratio worse as well. On that note, I moved up purchases of a laptop, an external 4TB drive, and a video card upgrade for my Plex server to this year instead of next, specifically because I expect electronics in general to increase in price significantly soon due to the incoming tariffs.
The additional performance is nearly directly proportional to the increased cost, and my 4090 isn't exactly struggling to play the latest games.
You can kinda offset some of the increased price by buying the 5090 before the tariffs and selling the 4090 after the tariffs are implemented. But I think there's a risk that Nvidia will introduce something exclusive to the 5090 (and lock it off in the drivers from working on previous RTX cards) that may cause the 4090 to look unattractive. Maybe 60 fps guaranteed path tracing?
You can kinda offset some of the increased price by buying the 5090 before the tariffs and selling the 4090 after the tariffs are implemented. But I think there's a risk that Nvidia will introduce something exclusive to the 5090 (and lock it off in the drivers from working on previous RTX cards) that may cause the 4090 to look unattractive. Maybe 60 fps guaranteed path tracing?
I've thought about trying to time that. The closer to $1999 the 5090 is, the more likely I would be to purchase.
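As a toy illustration of that buy-before, sell-after offset idea (every number below is a made-up placeholder - nobody knows the 5090 MSRP, the tariff rate, or future used 4090 prices):

```python
# Hypothetical upgrade-cost math: buy the 5090 pre-tariff, sell the 4090 after.
# All prices and rates are placeholders, not predictions.
price_5090 = 2000       # hypothetical pre-tariff MSRP
used_4090_now = 1400    # hypothetical used value today
tariff_rate = 0.25      # hypothetical tariff on imported electronics
pass_through = 0.5      # assume used prices absorb half the tariff bump

used_4090_later = used_4090_now * (1 + tariff_rate * pass_through)  # 1575.0

net_cost_sell_now = price_5090 - used_4090_now       # 600
net_cost_sell_later = price_5090 - used_4090_later   # 425.0
print(net_cost_sell_now, net_cost_sell_later)
```

Under these invented numbers, waiting to sell until tariffs lift used prices shaves $175 off the effective upgrade cost - but the whole thing hinges on used prices actually rising and the 4090 staying attractive.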
The die size combined with the number of people I have seen interested in a $2000 5090 suggest it will not really be available. Except perhaps to those who queue at Micro Center. It's an ideal scalping item. Not Nvidia's fault, however, as there isn't much they can do when they make a chip this big.
It's good to live within a 15-minute drive of a Micro Center, especially during the COVID lockdowns. I was able to get upgrades and video cards for new builds while we were in quarantine. Best find was a 3080 10GB on launch day for $699.99 when they were selling on eBay for $1,500+.
The die size combined with the number of people I have seen interested in a $2000 5090 suggest it will not really be available.
The only card worth buying, so demand will be insane - would be amazed if it's below $2500.
The only card worth buying, so demand will be insane - would be amazed if it's below $2500.
The card worth buying would be the 5070. It doesn't really matter, because GDDR7 is an Nvidia exclusive this round of GPUs. The race for the bottom and mid grade is up for grabs.
Big bundle of thanks should also go to Raja Koduri, for sabotaging the GPU efforts of the only other two x86 GPU competitors with his incompetence.
Can we stop blaming him? The current mess is not due to him.
It's probably a worse situation than Apple, as that company only has this kind of stranglehold on a minority of the markets they are in, and they are not able to really stifle the other computer or phone makers.
Apple don't stifle competition; they fight with products and software, not back-room deals like Nvidia, which pressures manufacturers not to work with Intel or AMD. Nvidia is a scummy company, but so are AMD and Apple.
No idea about businesses, but from individuals, less than before, given the economic situation and the fact that the 4090 will play everything just fine. Just vibes though, I really have no idea.
I'll decide whether to get one based on availability and resale values of the 4090, but there is nothing I play that would benefit from it at 4K. If it's effectively Micro Center only, though, then 4090 prices will stay up and the upgrade might not cost that much.
Random comment, but I've always thought that a theoretical 3080 20GB or 24GB would have been the next 1080 Ti in terms of usability 5 years down the line.
Nvidia does not want to make an affordable card that will be usable for 5 years - they've learnt their lesson from the 1080 Ti.
- they've learnt their lesson from 1080 ti
You mean the Pascal series.