frozentundra123456
Lifer
- Aug 11, 2008
- 10,451
- 642
- 126
I keep seeing winrar + playing a game. Is that a thing people do? I mean, why?! Anyone?
Not at all. It has never been done by anyone, ever... unless they were trying to make AMD's ultra-low-IPC, high-core-count CPUs look better. That is why only that one single website runs that test.
And the same reason that one particular poster seems obsessed with it, and posts it incessantly.
I have a TV hooked up to the HDMI on my video card so I can play something on it while a game is running (a movie, a TV show, a YouTube video, maybe a gameplay strategy video), and a shared folder on the network, so pretty much at any time someone might use it while I'm gaming. I know there are people who religiously make sure nothing else is happening on their computer when they play a game, but for me that's just not realistic usage.
That's not even getting into other things like browsers with a bunch of tabs I don't want to close, because I'm just going to reference them after I finish a quick gaming session, and it's annoying to reload an old session with all of its tabs loading at once.
I don't have an AMD processor but I can see the value in knowing if a CPU has enough grunt to handle my apparently heretical usage patterns.
Solder.
So the answer to the OP is "no," but it's still good enough, correct?
I keep seeing winrar + playing a game. Is that a thing people do? I mean, why?! Anyone?
edit:
winrar takes so little time. So I alt-tab and encode something else? Or I have a massive queue of things that need to be encoded or decoded? Is there a common user situation where this is a thing?
Also, the winrar benchmark is one thing and using winrar for real is another: you will also load the SSD/HDD, and that's probably the main bottleneck in this scenario?
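That intuition is easy to sanity-check with a back-of-the-envelope sketch (my own illustration, nothing from the thread): compress a buffer purely in memory, measure the CPU-side compression throughput, and compare it against the roughly 500 MB/s a SATA SSD can sustain. Both the 500 MB/s figure and the synthetic, repetitive buffer are assumptions for illustration only.

```python
# Rough illustration: if the CPU compresses faster than the drive can feed
# it, the drive is the bottleneck in real-world archiving; otherwise the CPU is.
import os
import time
import zlib

SSD_MB_S = 500  # assumed sequential read speed of a SATA SSD

# 64 MiB of repetitive data (16 KiB random block, repeated) so it compresses
data = os.urandom(16 * 1024) * 4096

start = time.perf_counter()
compressed = zlib.compress(data, 6)  # level 6, comparable to default settings
elapsed = time.perf_counter() - start

throughput_mb_s = len(data) / (1024 * 1024) / elapsed
print(f"compressed {len(data) >> 20} MiB in {elapsed:.2f}s "
      f"({throughput_mb_s:.0f} MB/s)")
print("CPU-bound" if throughput_mb_s < SSD_MB_S else "likely disk-bound")
```

A synthetic benchmark that never touches the disk only ever measures the first number, which is the poster's point.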
Enough reviews have been Cinebench plus a poorly-threaded game, a combo that is designed to make FX chips look bad.
I got an 8320E and a UD3P 2.0 board for $133.75 (tax included).
Get the 6300 with a cheap mobo and it is completely possible that the 6300 will not even work at stock clocks but will throttle heavily due to bad VRMs...
https://www.google.gr/search?q=Fx63...&es_sm=122&ie=UTF-8#q=stock+Fx6300+throttling
Yes, you can overclock it even on cheap mobos, but a large amount of knowledge, parts (for cooling), and luck is necessary.
Overclocking a 6300 is not a budget solution.
Well, that's exactly what I was saying: the random person would not even know to put a fan on the VRM or get a good cooler; the random person would just get heavy drops, blame them on AMD, and sell the setup.
I got an 8320E and a UD3P 2.0 board for $133.75 (tax included).
That is budget and the board will get to 4.4 or 4.5 as long as a person puts a fan on the VRM sink.
Buy a high-quality cooler like the D15 and remember that it can be used again in the next system, when you give away the FX combo to family (at stock with the stock cooler) or sell it.
They do use a "real" scenario: they just compress some files. That's also the reason the FX looks good; the winrar benchmark would choke up any CPU if it were to run at 100%, but if you compress for real, the CPU usage is lower.
I got an 8320E and a UD3P 2.0 board for $133.75 (tax included).
That is budget and the board will get to 4.4 or 4.5 as long as a person puts a fan on the VRM sink.
Buy a high-quality cooler like the D15 and remember that it can be used again in the next system, when you give away the FX combo to family (at stock with the stock cooler) or sell it.
It might suggest that the FX does well if you are a heavy user who has CPU-intensive work happening in the background, though. I'm sure that's a minority of gamers, but there probably are those people out there, and for them a $170 FX-8350 might be tempting compared to an i3 or i5, maybe even an i7 given the price difference.
$133 for an FX-8320 and a decent board is fantastic, but the value proposition is gone once you add $100 worth of cooling (and probably some change for a slightly larger PSU). You're now in the price bracket where you can get a Skylake i5 (non-K) and board, which is hands-down a better choice if you're gaming. If you're not gaming, add a discrete GPU to the FX's BOM too (not necessary for Intel CPUs), and you're now competing with Haswell i7 Xeons with integrated GPUs.
Now, I understand there are intermediate price points for coolers and such, but my point is that spending money to get a moderate to large OC out of an FX CPU almost necessarily pushes its price up into a range where there are better choices, unless you're overclocking for the sake of overclocking, and not for performance or value.
Orochi (Zambezi/BDv1) and its derivatives are very obviously server chips shoehorned into the mainstream. There are some use cases for which their architecture works very well; specifically, any highly-threaded, integer-heavy load.
Niche of niches, but for massive compilations (Gentoo Linux!) this thing is a godsend. I am told compiling is entirely integer work, and in most cases is what is known as embarrassingly parallel; for this specific "heretical" use case, the 8320E and 8370E are the best value for money and punch well above their weight, while still having enough single-thread performance to run any Linux WM or DE. Paired with an SSD and Linux's superior scheduling (and because the binaries won't be compiled with ICC...), this setup is a beast.
Horses for courses!
But ironically, the chips' inefficiency caused them to lose essentially the entire server market as well.