Lol. It's impressive that people still try to paint these turd APUs as awesome options for gaming in the X1/PS4 era. Unless you're looking for an ultraportable device with limited battery capacity, you'll want discrete graphics to play the latest games at decent settings.
Which raises the obvious question: if a 7730-class 45w APU is magically "good enough to run modern games at high resolution & settings", then why did AMD bother to spend all that silicon & expensive GDDR5 RAM on the 7850-class PS4, when they could have given console gamers the "glories" of 15fps slowdowns purely to shave another $50 off the console's price?... I honestly don't understand where this "rock bottom" APU mediocrity advocacy that has sprung up over the past two years came from. "Too cheap" is just as bad as "too expensive" regardless of brand. Baseline gaming requirements are a moving target that changes each year, and today's "console equivalent" baseline is virtually an XB1-equivalent 260X / 750Ti (even on Medium), not a 7730 (no matter how "free" it is).
You cannot say that for the Intel CPUs at the same price/TDP.
Dear Lord... It's almost like some people are "hard-wired" to persistently "not get it".
Most desktop gamers are like iGPU-less FX desktop gamers - they do not care about iGPUs - ANYONE's iGPUs, including Iris Pro. Intel iGPUs are useful to Intel desktop gamers primarily as a backup if a dGPU fails. That's it. Only AMD APU owners obsess over arguing about 12fps vs 20fps in games. It's still as dumb today as those fake pre-price-cut comparisons of "$180 7850K is better than a $180 i5-4670K for gaming due to the APU", whilst ignoring the fact that most people with $200 would buy a $100 CPU + $100 dGPU and get far better perf/$ all round. Or that people don't buy unlocked Intels for gaming just to use the iGPU. Or that many people increasingly don't combine CPU & GPU upgrades, ie, they upgrade the CPU every 3-4 years but the GPU every 1.5-2 years, resulting in them already having an existing dGPU to plug in...
This stuff is just an endless stream of "False Dilemma" logical fallacies - you set up a fake "either / or" iGPU vs APU scenario, ignoring the other alternatives that exist or the different priorities people have (ie, being willing to spend 10-15% more as a percentage of total rig cost if it nets a +150% GPU gain). It's obvious that most of the time you guys don't even have a serious fixed budget; you simply look up whatever the cheapest "price of the day" for APUs is, then declare it to be a universal "every gamer's budget" (along with "every budget gamer wants 720p", etc). Because the A8-7800 is $90, your "total CPU+GPU combined budget" is coincidentally $90. And yet pre-price-cut, and before A8-7800 availability, your budget managed to stretch all the way to $180, because that's what the A10-7850K was previously priced at... :sneaky:
As for TDP - it's amusing to see some grasp at straws with fake, misused TDP limitation scenarios.
The max TDP of a CPU + dGPU with 2x cooling points, 2x heatsinks & 2x fans is not interchangeable with an APU with only 1x cooling point, 1x heatsink & 1x fan. Eg, a 35w Akasa Euler "heatsink case" has a maximum cooling capability limited to 35w for the whole system, yet there are plenty of other compact Mini-ITX cases that can easily accommodate a 50w CPU cooler + a 50-60w dGPU on top, and still remain powered by an external 120-160w "brick" Pico-PSU. How do you think the 100-130w PS4 & Xbox One manage if they're "only allowed 45w"? All this "Mini-ITX must = 45w max purely because AMD have a 45w APU and a single cooling point" is such a fake restriction it's comical.
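The budget point above is trivially checkable. A throwaway sketch, using only the figures cited in the post (the 50w cooler, 50-60w dGPU and 120-160w brick are the post's own numbers, not measurements of any specific build):

```python
# Sanity-checking the power-budget claim with the post's own numbers
# (assumed figures, not measurements of any specific build).
cpu_cooler_w = 50      # "50w CPU cooler"
dgpu_w       = 60      # top of the "50-60w dGPU" range
brick_min_w  = 120     # low end of the "120-160w brick" Pico-PSU range

total_w = cpu_cooler_w + dgpu_w
print(total_w)                  # 110
print(total_w <= brick_min_w)   # True - fits even the smallest brick
```

Ie, even at the top of both ranges, the two-cooling-point build fits inside the smallest brick Pico-PSU with headroom to spare.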
As for perf/price/TDP: if a $150 95w A10-7850K APU can manage 28fps in BI at 1080p on Medium, whilst a $70 70w R7 260 can manage 68fps at the same settings & resolution mated to, say, a $100 55w i3 (total $170 and 125w), then the CPU+GPU combo nets a +142% perf gain for only a $20 / 13% increase in price, and 125w per 68fps works out to 1.84 watts per fps (vs 3.4 watts per fps for the 7850K...)
Now let's do the A8-7600 - it runs BI at 25fps (65w), resulting in 2.6 watts per fps, and at 23fps (45w), resulting in 1.95 watts per fps. Both of which are still worse perf-per-watt than the CPU + dGPU combo (which itself isn't even the most efficient GFX architecture anymore). In terms of price, it works out to $90 for 23fps vs $170 for 68fps, or an 88% price increase for a 195% increase in performance. And that's cherry-picking a "low power" 45w AMD vs a stock 55w i3 (instead of, say, a 35w i3-4160T @ 3.1GHz).
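For anyone who wants to re-run the arithmetic, here's a minimal sketch using only the prices, TDPs and frame rates quoted above (the post's own figures, not independently verified benchmarks):

```python
# Re-running the perf-per-watt / perf-per-$ arithmetic above, using only
# the prices, TDPs and frame rates quoted in the post.

def watts_per_fps(system_tdp_w, fps):
    """System TDP divided by average frame rate (lower is better)."""
    return system_tdp_w / fps

apu_7850k   = watts_per_fps(95, 28)       # A10-7850K alone: ~3.39 W/fps
apu_7600_65 = watts_per_fps(65, 25)       # A8-7600 @ 65w:    2.60 W/fps
apu_7600_45 = watts_per_fps(45, 23)       # A8-7600 @ 45w:   ~1.96 W/fps
combo       = watts_per_fps(55 + 70, 68)  # 55w i3 + 70w R7 260: ~1.84 W/fps

# Price vs performance: combo ($170, 68fps) against the A10-7850K ($150, 28fps)
price_delta = (170 - 150) / 150 * 100     # ~13% more money
perf_delta  = (68 - 28) / 28 * 100        # ~143% more frames

# ...and against the $90 A8-7600 at 45w (23fps)
price_delta_7600 = (170 - 90) / 90 * 100  # ~89% more money
perf_delta_7600  = (68 - 23) / 23 * 100   # ~196% more frames

print(f"{apu_7850k:.2f} vs {combo:.2f} W/fps; "
      f"+{price_delta:.0f}% price for +{perf_delta:.0f}% fps")
```

(The post truncates rather than rounds - 142%, 88%, 195% - the exact values are 142.9%, 88.9% and 195.7%, which changes nothing about the conclusion.)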
In short - no matter what metric you use - even the 45w APUs still lose out on perf-per-watt & perf-per-$ to typical $70-$100 CPUs + $70 "low-mid" class dGPUs. APUs are not the "magic beans" bargain you think they are, and this is precisely why AMD are being forced to slash & slash APU prices - people are in reality doing the maths and figuring out that for a new PC that has to last 2-3 years, it's better to spend an extra $50, $60, even $70 to double or triple GPU performance via a budget dGPU, because most people buy on "bang per buck" and not "the absolute cheapest no matter how slow it is because it comes with a free 7730".
Dragon Age Inquisition (built-in benchmark)
720p Medium
Hang on, earlier you were talking about it being good enough for "high resolution" (ie, 1080p) and "high quality". Now the bar has been lowered to 720p on "Medium" to avoid showing the obvious sub-20fps frame rates (let alone slowdowns)? Precisely my point... :sneaky: