MiddleOfTheRoad (Golden Member, joined Aug 6, 2014)
The FX haven't aged well. It's downright awful.
You can just as well use an i3 instead of an FX.
Ha ha ha ... and buying a dual-core i3 will age even worse than that.
Isn't DirectX 12 restricted to Win10 only?
Really, guys. I've only been around this forum for a few days and the ongoing Intel vs. AMD nonsense is already getting on my nerves. It's not even so much what is said, but how it's said. Every other thread is full of this. Get over it and use what you want, but keep the threads readable so that following them remains fun, not a burden.
And I think you will be sorely disappointed in the dreams of the DX12 savior as games come out. The AMD-sponsored AOTS will pretty much be the best case we will see for AMD products.
http://www.computerbase.de/2016-02/ashes-of-the-singularity-directx-12-amd-nvidia/5/
Likely not happening. But at least I get my post count up to where I can send PMs.
It's been going on 17+ years here ... you'd be the first person people listened to.
Yeah, it's kinda hard to judge the performance improvements of DirectX 12 as a whole based on just one unfinished game.
Don't drag me into your alternate universe where thinking stuff up magically makes it reality.
Unless it comes from Raja, I am sure you would reject it.
Only with AMD cards since AMD didn't do any DX11 optimization.
Yep, there's no shortage of high-end CPUs in the recommended specs for DX12-only titles, not to mention the minimum requirements.
Yeah, eerily similar to the hype that accompanied Mantle, and that pretty much ended up having zero effect. DX12 will undoubtedly see wide adoption eventually, in contrast to Mantle, but I still think a modern, powerful CPU will be required for optimal gameplay. Despite how lazy they are, I just can't see devs not programming games to take advantage of something like a 6700K or 5820K over an ancient Phenom or Core 2, especially with all the GPU power we will supposedly have with next-gen dGPUs.
You should be happy for all those people gaming on older hardware, but I guess since the AMD name is associated with this thread you just couldn't help yourself. By the way, Celeron users will benefit too.
Since consoles don't have the fastest CPUs by today's standards, I believe DX12 will be easy for low-end and older CPUs to handle. Credit Mantle, which I believe was a masterful plan by whoever created it.
Of course not. But as long as the 1 to 7 cores not running the main and/or graphics driver thread sit at, say, 20%, 40% or 60% utilization, there's still something to gain. OK, depending on the game, some multithreading approaches might still suffer from low single-thread performance even with DX12/Mantle/Vulkan, but not all.
As always, when resources get freed, developers tend to use them elsewhere to improve the experience. Low-end CPU users aren't going to get a free lunch.
Yeah, be prepared; it's similar for the old hexa-core Xeon users... they will be happy about this massive improvement.
Time to pull out my true six-core Phenom II X6 1090T?
In this example I wonder, though: Where is the gain when three cores run at 20%, 40% and 60% at lower frequency versus one core running the same load at 100% at higher frequency?
The one fully utilized core even has the benefit of holding the data for all three threads in the very same cache, and it's not unlikely that those concurrently running threads work on some shared data. If not much data is shared, then using more cores, each with its own cache, might be beneficial. That is, if games are much affected by CPU cache size at all.
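The question above can be put in rough numbers. Here is a toy back-of-envelope model (all figures are made-up assumptions for illustration, not benchmarks of any real CPU): whether splitting the work across slower cores wins depends on how even the split is and how much clock speed is given up, because the busiest core sets the frame time.

```python
# Toy model: a fixed amount of work W (in "cycles") is done either on
# one fast core or split unevenly across three slower cores.
# All numbers here are illustrative assumptions, not measurements.

def time_single_core(work_cycles, freq_ghz):
    """Wall time if one core does all of the work."""
    return work_cycles / (freq_ghz * 1e9)

def time_multi_core(work_shares, freq_ghz):
    """Wall time if the work is split: we finish when the busiest core does."""
    return max(work_shares) / (freq_ghz * 1e9)

W = 4.2e9  # total cycles of work per frame (made-up figure)

# One core at 4.2 GHz, fully loaded:
t_one = time_single_core(W, 4.2)

# Three cores at 3.5 GHz with a 20/40/60-style uneven split
# (normalized so the three shares sum to W):
shares = [W * s / 1.2 for s in (0.2, 0.4, 0.6)]
t_three = time_multi_core(shares, 3.5)

# A very lopsided 5/5/90 split at the same lower clock:
lopsided = [W * s for s in (0.05, 0.05, 0.90)]
t_lopsided = time_multi_core(lopsided, 3.5)

print(f"single fast core : {t_one:.3f} s")
print(f"three slow cores : {t_three:.3f} s")
print(f"lopsided split   : {t_lopsided:.3f} s")
```

In this sketch the 20/40/60 split still beats the single fast core despite the lower clock, but the lopsided split loses: once one slow core carries nearly all of the work, the higher-clocked single core comes out ahead, which is exactly the trade-off the question raises.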
The Q6600 will live once more.