With NVIDIA's CUDA announcement, GPU PhysX support in 32-bit games has apparently been discontinued. This affects games like Borderlands 2. This one blindsided me; the news itself wasn't news, I just didn't know that GPU PhysX relied on 32-bit CUDA. Source:
https://arstechnica.com/gadgets/202...ose-losing-physx-powers-on-newer-nvidia-gpus/
The 4090 was a huge upgrade, one of their best generational increases ever. If they'd used 3nm for Blackwell, it would have been pretty good as well, I'd reckon. They are competing against themselves and milking us gamers like cows.
And they’re apparently trying to cook us over literal fires with Blackwell cards as well.
As the owner of a 4090, agreed!
What? The 4090 was good, but hardly one of the best ever. It was ~50% faster than a 3090 Ti and maybe another 10% over the vanilla 3090. That's the same increase the 3090 Ti had over the 2080 Ti. 1080 Ti was twice as fast as a 980 Ti; 980 Ti was 40-50% faster than big Kepler (780 Ti) on the same node. 780 Ti was twice as fast as Fermi (580), and Fermi was again about twice as fast as Tesla. That's just the CUDA era though, go back further and the big steps were even bigger.
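The compounding in that comparison can be sketched quickly (using the rough multipliers quoted above, which are the poster's estimates, not measured benchmarks):

```python
# ~50% over a 3090 Ti, which is itself roughly ~10% over a vanilla 3090,
# so the 4090's uplift over the 3090 compounds multiplicatively.
ada_over_3090 = 1.50 * 1.10
print(f"4090 over 3090: ~{ada_over_3090:.2f}x")  # ~1.65x

# For contrast, the Pascal flagship jump quoted above:
# "1080 Ti was twice as fast as a 980 Ti"
pascal_over_maxwell = 2.0
print(f"1080 Ti over 980 Ti: ~{pascal_over_maxwell:.1f}x")
```

So even granting the generous end of those numbers, the 4090's ~1.65x sits well below the ~2x jumps of the Pascal, Kepler, and Fermi eras.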
The only way Ada and Blackwell are giant leaps forward is if you take the gaming applications of all the AI-specific transistors they're filling the die with at face value and run with FG and MFG.
I am sorry, but I have to hard disagree with you. My 4090FE was a huge leap over my 3090 in every metric. It is the fastest (by far), quietest, and most efficient card I've ever owned. Maybe AIB cards aren't so great? This is the first time I've bought an NVIDIA-made card. I primarily bought from EVGA, and once, a long time ago, a ZOTAC card or two (a GTX 275 and something else, I think). Outside of that, I used to be a Radeon guy before they fell behind.
From what I understand, the 5090FE is even better. I've yet to play with one, but the fact that it's a 2-slot card? Nobody else has managed that.
I suspect you watch very specific reviews or something. FWIW I play a mix of old and new, indie games, AAA games, love setting things to the max, and live for Path Tracing/Ray Tracing because it looks amazing on my OLED monitor or my OLED TV. I'm also not usually concerned about budget because gaming is my hobby. Oh and I'll gladly pay for a 5090FE when the time comes.
Prior to my 3090 I used an EVGA 1080 Ti. That was also an amazing card. I skipped the 20 series because the 1080 Ti was THAT GOOD. If NVIDIA hadn't pulled the 2-slot 5090 stunt I'd probably skip this gen as well, but I really want to go back to my FormD T1. I had to move things out of it because poor airflow was overheating the chipset, but I loved having such a tiny PC on my desk.
That is why I think people need to be careful about using AIB cards as an example of each generation. Board makers aren't really innovating. They throw more/bigger fans and bigger heatsinks on the cards each gen and call it a day. Sure, the cards might clock higher or have dual BIOS or whatever, but they are often noisy, hot, power hungry, or have some other nonsense going on. NVIDIA is innovating bigly and their cards should be used as the example to follow.
Sorry for the rant, twas not intended to be one.
Oh and while I state the above, it only applies to the *90 cards. Every other 50, 40, and 30 series card has been a disappointment in my eyes.