inf64
Diamond Member
- Mar 11, 2011
- 3,884
- 4,689
- 136
Userbenchmark guy is on a call with FrameChasers cringelord; they are in crisis mode.

I would like to see how the userbenchmark guy is coping? Might be bald by now.
Maybe he is inserting chips into the socket to create burnouts? Or so I have heard people say.

I would like to see how the userbenchmark guy is coping? Might be bald by now.
Very common in specific areas of big RPGs. I remember in Starfield the whole of New Atlantis is like this; the framerate is inexplicably much lower than in the rest of the game.

I find it kind of baffling that people will game a lot but never seem curious enough when the frame rate drops significantly, and just assume it's the GPU. Large swings are in my experience more often CPU, and they're situational so you notice them even more, like entering heavily crowded areas, or very complex geometry, or lots of effects from various sources like in MMOs.
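If you do get curious, a rough way to sanity-check a dip from a monitoring log (just a sketch - the thresholds are my own rule of thumb and the field names are made up, use whatever your HWiNFO/RTSS export actually calls them):

def likely_bottleneck(gpu_util_pct, busiest_thread_pct):
    # GPU pegged near 100% while frames drop -> the GPU is the limit.
    if gpu_util_pct > 95:
        return "GPU-bound"
    # GPU has headroom but one CPU thread is pinned -> the CPU is the limit.
    if busiest_thread_pct > 90:
        return "CPU-bound"
    return "unclear (frame cap, engine limit, streaming/IO?)"

print(likely_bottleneck(gpu_util_pct=70, busiest_thread_pct=97))  # -> CPU-bound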
But then there is the AMD Ryzen 7 9800X3D, the new king of the block. There is simply nothing better for gaming, and in Minecraft it shows even greater improvements than in other games - if you have the money for it, it is a no-brainer gaming chip.
THANKS!

Minecraft CPU Benchmarks: Winter 2024 Update
Quick Minecraft performance update with the Ryzen 7 9800X3D and Core Ultra 7 265K for Winter 2024.
nemez.net
Part 2: https://www.igorslab.de/en/amd-ryze...vil-or-does-it-just-want-to-play-part-2-of-2/

Igor is here with part one of his review - https://www.igorslab.de/en/amd-ryze...overclocked-undervolted-and-found-to-be-good/
I get what they're saying. If I game at 4K and I am not buying a new GPU - since that'll be $2000 - what do I care about 720p and 1080p charts? Future games? Approximately 99.99% of future games I won't care about. I find it strange since these X3D chips are far more hit or miss than past CPUs. And until I'm actually in 2026, how can I determine how it gets along?

It is the same partisan crap with the parties switched. Back when AMD CPUs were slower than Intel's, the same folks sang different tunes.
As for myself, I'd say bring on 480p for all CPU tests. I can infer the 4K performance myself. Thank you very much.
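Back-of-the-envelope, that inference is just taking the lower of two ceilings - a minimal sketch with made-up numbers, not anyone's actual benchmark data:

def expected_fps(cpu_ceiling, gpu_ceiling):
    # Frame rate is roughly capped by whichever limit is lower.
    return min(cpu_ceiling, gpu_ceiling)

cpu_ceiling = 180                        # hypothetical 480p/720p CPU-bound result
gpu_ceilings = {"1440p": 140, "4K": 75}  # hypothetical GPU-bound results for my card

for res, gpu_fps in gpu_ceilings.items():
    print(res, expected_fps(cpu_ceiling, gpu_fps))
# At 4K the GPU is the wall, so a faster CPU barely moves the number today;
# the low-res result mostly tells you how far a future GPU upgrade could scale.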
If you look at my HWiNFO, I am not even CPU bound in that Dawntrail scene at laptop-low 720p 🤷

Honestly, I mainly care about what happens around my resolution. So 2560x1440 is mainly what I look at. Now obviously lower resolutions are more CPU bound, but I could still see benefits at 1440p, particularly with smoothness and minimum frame rates, as well as in certain older games or CPU/engine bound games. I also understand that people would be interested in seeing how their potential setup would fare at 4K or 1080p, or whatever resolution they may play at.
This was the area where I saw the most improvement in gaming when I upgraded from a 5600X to a 5800X3D last year. Went from 100% CPU usage and ~45-50 fps to 90% and a solid capped 60 fps. I was definitely CPU bottlenecked pretty hard with a 7800XT.

Very common in specific areas of big RPGs. I remember in Starfield the whole of New Atlantis is like this; the framerate is inexplicably much lower than in the rest of the game.
Second video in a couple of days explaining why they don't benchmark CPUs at 4K. They must be getting extreme levels of cope from Intel copers.
You were not joking. RIP 4/4, 4/8, 6/6, 8/8, and older higher core count CPUs in this game. Daniel-San tested the 5600X in a village and it gets hammered. Video is timestamped -

As expected, the 9800X3D is a killer CPU for this CPU-killing game. Re: Stalker 2
Well, the best thing is to just test at low settings at 1080, 1440, and 4K. Personally, I think anything below 1080 testing is a waste of time. I know people say you can predict future performance from lower resolution results, but I don't know if that theory has actually been proven scientifically.

I get what they're saying. If I game at 4K and I am not buying a new GPU - since that'll be $2000 - what do I care about 720p and 1080p charts? Future games? Approximately 99.99% of future games I won't care about. I find it strange since these X3D chips are far more hit or miss than past CPUs. And until I'm actually in 2026, how can I determine how it gets along?
However, the alternate chip would have to offer something significant. Maybe it isn't great at games, but if it has much more MT performance at a similar or lower price I'd consider it. Or maybe somehow it has better browser performance. It needs significant advantages to be compelling in any way, and I'm not sure Arrow Lake has enough real advantages to compel people to pick it instead.
That's no longer the case. In some games there are now settings that stress the CPU more and are not active on low settings. It is why you will see some reviewers test on ultra and add RT results often.

Well, the best thing is to just test at low settings at 1080, 1440, and 4K.
Is anyone else thinking that the 9950X3D will have a slightly better bin and better boost on the die with the V-Cache, or something else that makes the performance even better?
There was the leaked Factorio benchmark from a few weeks ago, but that is just as likely down to the higher core count as anything else, and not definitive.
True. Spider-Man with RT is more multithreading-sensitive than without RT.

It is why you will see some reviewers test on ultra and add RT results often.