schmuckley
Platinum Member
- Aug 18, 2011
Shouldn't x265/H.265 be the standard by now?
OT, but yes... and no, it isn't.
PS: A $75 X5650 is better at absolutely everything than FX-series chips.
..and that's basically Nehalem.
I mean, if you want to talk about used chips that cost $75, feel free to do so.
and the 1337'ist just buy Alienware
First thing I thought.
AMD CPUs are great price-for-performance, and the FX 8350 is great for multitasking: just play your games and render videos in the background as soon as you're done recording the gameplay footage, then upload them to YouTube, if you have a fast enough internet connection to play online while also uploading videos.
I really wonder why people keep thinking this is some sort of great feat; any CPU can do it. I'm doing it on my Celeron while recording gameplay. The only difference is that I can lower the game's FPS to give more power to the transcode, while the FX* forces you to have low FPS and a lot of empty cores, due to its minced-meat module cores.
*any multicore (more than 2 cores) CPU really, no matter if AMD or Intel; the more cores there are, the more of them will stay unused.
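The "transcode in the background while you game" workflow above is easy to sketch: the encode runs in its own OS process at reduced priority, so the scheduler hands it whatever cores the foreground task leaves idle. A minimal sketch in Python, with a CPU-bound loop standing in for the video encode (`background_encode` is a hypothetical name, not a real API; `os.nice` is POSIX-only, so it is guarded):

```python
import multiprocessing as mp
import os

def background_encode(n: int, out: mp.Queue) -> None:
    """Stand-in for a video encode: pure CPU-bound work.

    Runs in a separate process, so on any multicore CPU the
    scheduler can place it on a core the game isn't using.
    """
    if hasattr(os, "nice"):
        os.nice(10)  # lower priority: yield to the foreground task (POSIX only)
    total = sum(i * i for i in range(n))
    out.put(total)

if __name__ == "__main__":
    q: mp.Queue = mp.Queue()
    worker = mp.Process(target=background_encode, args=(10**6, q))
    worker.start()

    # Foreground "game loop" keeps running while the encode proceeds.
    frames = [frame for frame in range(60)]

    worker.join()
    print(len(frames), q.get())
```

The real workflow would launch the encoder (x264, x265, etc.) instead of a Python loop, but the scheduling story is the same: two processes, and the OS balances them across cores.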
with AMD FX CPUs there is really almost no impact on gaming performance if they're 100% working also on other things
No.
With Quick Sync the impact is ~5-10% for the Celeron, which has the slowest current cores, depending on the capture software used; and since almost no game can utilize 100% of even a dual core, the impact is even lower than that.
With faster cores the impact is smaller still.
On the other side, an FX has a huge built-in performance hit for every game you're going to play on it; you see no additional hit only because no game can utilize anywhere near all of its cores.
..that have a good bit more performance than FX chips all the way around.
:awe:
...that cost a grand new and is five years old.
I love a Xeon or suchlike as much as or more than the next guy, but that's not even in the same ballpark. I get it, and it's a cool chip, but it's not very relevant.
I actually looked into some chips of that era since they're cheap now, but the ancient motherboards, which still aren't especially cheap, were a turnoff. I really wanted m.2/SATAx without an add-in card if I was going to board-hop. They're as much of a dead end as the FX is. A very cool and interesting dead end, but just as dead. I'd like to have one to play with if I had the room and spare budget.
The reverse is also valid... :awe: Cheap fake pro users love AMD; they boast about free OC and MT power.
Cheap real pro users buy a 1366 server with dual 4-core Xeons on eBay at $600.
With quick sync....
No, no, no.
Set low details at 1280x720, or buy a high-performance card; you're going to see 100% CPU usage all day with your dual core.
Not sure it's really that simple. Doesn't Windows split up the load among the cores dynamically? I mean, if you look at the core usage of almost any well-threaded game on an FX, all 8 cores are already being used to some extent, and in a game like BF4 all 8 cores are loaded almost equally.
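On the scheduler point: Windows (like any modern OS) does move threads between cores dynamically, but it can only balance work the program has already split into parallel tasks; a single-threaded game stays on one core no matter how the scheduler shuffles it. A rough sketch of what a "well threaded" engine does, assuming the work divides into independent chunks (`simulate_chunk` is a toy stand-in, not real engine code):

```python
from concurrent.futures import ProcessPoolExecutor
import os

def simulate_chunk(chunk: range) -> int:
    """Toy stand-in for one slice of per-frame game work."""
    return sum(i * i for i in chunk)

if __name__ == "__main__":
    n_workers = os.cpu_count() or 1  # e.g. 8 on an FX-8350
    n = 100_000
    step = n // n_workers + 1
    chunks = [range(lo, min(lo + step, n)) for lo in range(0, n, step)]

    # The OS scheduler spreads these workers across cores; all cores
    # see load only because the program split the work beforehand.
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        total = sum(pool.map(simulate_chunk, chunks))

    # Same answer as a single-threaded loop would give.
    print(total)
```

That matches what the BF4 observation suggests: the engine itself is feeding all eight cores, rather than the scheduler conjuring parallelism out of one thread.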