The Official Kaveri Review Thread (A10-7850K, etc)


Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
As it's been said, let's not pretend these APUs are something they are not. Clearly they are not a good option for people who want to use a discrete GPU, period. There are better options.

But as a package and when size, thermals, power is a constraint, they are unmatched in their own category.
If you cared about all that at once, you'd probably want an HTPC or a laptop. At least on the HTPC end, you face stiff competition from modern consoles (which are AMD APUs, so it's not really a loss). In a laptop I can see it being very useful, but the market is pretty used to seeing a "WOW, 4GB" GT 620M and not realizing how crappy it really is. They want to see bigger numbers on the sticker than the neighboring computer has.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
Tom's (DE) showed how much the low-power Kaveri loves bandwidth:


Looks like a good replacement for casual PC that I have:
Athlon X2 2.2GHz + HD4670 1GB DDR3

Except it's barely faster than your 4670 on the GPU side. So, assuming that's an AM2 X2, rather than spending $190 for a processor that needs both a new mobo and premium RAM, you'd probably be better off getting a Phenom II or Athlon II X4 as a drop-in replacement.

e: Didn't see A8. That helps on the price, but makes it even less worthwhile for performance.
 
Last edited:

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
I think we can say some things about Kaveri:

- It won't change much of the desktop market, except for budget gamers.

- It's no smaller than its predecessors, so margins should still be bad, leaving AMD vulnerable to Intel's pricing at the bottom of the market.

- It will suck big time on mobile, especially because there the main competitor will be Broadwell, not Haswell.

In the end, 2014 is more of the same for AMD's big-core line: market share bleed and PR spin.
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
Why don't people say the same thing when it comes to Intel reviews?

You can buy a G3220 + HD7790 for less than just the Core i5 4670K and have faster gaming performance vs the HD4600. :whistle:

Kaveri is not a CPU; it is an APU. You pay for having the fastest iGPU in a 95W-TDP single chip that also has four powerful CPU cores, and you pay more for added features like TrueAudio, etc.

Having an "integration premium" factored into the price makes sense if its target market will pay it. Most of the people in this thread don't seem to be in the target market.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
Why don't people say the same thing when it comes to Intel reviews?

You can buy a G3220 + HD7790 for less than just the Core i5 4670K and have faster gaming performance vs the HD4600. :whistle:

When has anyone in history ever suggested an i5 or i7 + integrated as a budget gaming build? It's ALWAYS Pentium + dGPU or i3 + dGPU over an i5 or i7 with integrated graphics. So that's, "When."

Only AMD fanboys suggest using an IGP for gaming.

So guess which retarded suggestion of "use the IGP" needs to be countered? Hmmm... perhaps the only one being made?

Take it down a notch. Posts like these won't fly here.
-ViRGE
 
Last edited by a moderator:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I think we can say some things about Kaveri:

- It won't change much of the desktop market, except for budget gamers.

- It's no smaller than its predecessors, so margins should still be bad, leaving AMD vulnerable to Intel's pricing at the bottom of the market.

- It will suck big time on mobile, especially because there the main competitor will be Broadwell, not Haswell.

In the end, 2014 is more of the same for AMD's big-core line: market share bleed and PR spin.

It's obvious that their big core line is completely dead. And the cat core family already seems to be in trouble. I wonder what AMD will actually base its future on. The consoles will keep the company running in one way or another for the next 7-10 years. But by then the dGPU will be close to gone, if not gone already. And ARM servers aren't exactly looking bright either.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
It's obvious that their big core line is completely dead. And the cat core family already seems to be in trouble. I wonder what AMD will actually base its future on. The consoles will keep the company running in one way or another for the next 7-10 years. But by then the dGPU will be close to gone, if not gone already. And ARM servers aren't exactly looking bright either.

Definitely not a great time to be an enthusiast when AMD's latest has all the power of an overclocked Q6600 + 8800 GTS.
That was early 2007. Seven years.
 
Last edited:
Feb 19, 2009
10,457
10
76
It's obvious that their big core line is completely dead. And the cat core family already seems to be in trouble. I wonder what AMD will actually base its future on. The consoles will keep the company running in one way or another for the next 7-10 years. But by then the dGPU will be close to gone, if not gone already. And ARM servers aren't exactly looking bright either.

People have been saying the dGPU is dead or close to dead for so long; it's not gonna happen.

On the same node, a massive 550mm2 GPU die that has the potential to use 400W is going to destroy an iGPU on performance. Unless you belong to the camp that feels there's a certain level of performance that is magically "enough" and people will no longer want extra.

I don't, because soon 4K will be standard and not even the best GPU can handle it yet. Moving forward: 4K at 120Hz. Or wide-angle curved monitors. Or, heck, fancier game engines will continue to push the hardware.

There's a LONG way to go for iGPUs to be competitive even at 1080p; we've finally reached a point where an APU can handle 1080p at medium settings at around 37-40 fps in modern games. When are we going to get an iGPU that can push 4K at ultra settings at 60 fps?

I do agree AMD's CPU division is doomed; they have shown no signs of making a leap in IPC, so their CPUs are worthless for high-end usage. But this is where their APUs need to take over: they need design wins on mobile to ensure future success, something they haven't been able to do with Llano or Richland.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
People have been saying the dGPU is dead or close to dead for so long; it's not gonna happen.

On the same node, a massive 550mm2 GPU die that has the potential to use 400W is going to destroy an iGPU on performance. Unless you belong to the camp that feels there's a certain level of performance that is magically "enough" and people will no longer want extra.

I don't, because soon 4K will be standard and not even the best GPU can handle it yet. Moving forward: 4K at 120Hz. Or wide-angle curved monitors. Or, heck, fancier game engines will continue to push the hardware.

There's a LONG way to go for iGPUs to be competitive even at 1080p; we've finally reached a point where an APU can handle 1080p at medium settings at around 37-40 fps in modern games. When are we going to get an iGPU that can push 4K at ultra settings at 60 fps?

I do agree AMD's CPU division is doomed; they have shown no signs of making a leap in IPC, so their CPUs are worthless for high-end usage. But this is where their APUs need to take over: they need design wins on mobile to ensure future survival.

The same node is the exact problem. We might get 20nm GPUs at the end of the year, if we're lucky. Then next we get 14/16nm that's actually 20nm with FinFETs, meaning close to zero benefit for dGPUs. So before we get a real 14/16nm from GloFo/TSMC, it's another four years or more ahead. You're going to end up with 20nm dGPUs against 10nm or even 7nm IGPs. Add stacked memory to remove the bandwidth issue completely, and the dGPU is dead.

IGPs don't have to beat 550mm2 dies. They only need to compete with something like a GTX 750/GTX 760/R9 270/270X. Then the ROI is gone in the dGPU segment.

But just for fun, try an example with two nodes in between: shrink a 28nm GK104 to 14nm and see roughly what the result is. That's something you can fit as an IGP.
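The arithmetic behind that thought experiment is simple. Here's a rough sketch, assuming ideal area scaling with the square of the feature size (real shrinks deliver less than this) and GK104's published ~294mm2 die size:

```python
# Rough die-shrink estimate: area scales (ideally) with the square of
# the feature size. Treat the result as a best case, since real process
# shrinks never deliver full ideal scaling.
GK104_AREA_MM2 = 294.0  # 28nm GK104 (GTX 680) die area


def shrink_area(area_mm2, node_from_nm, node_to_nm):
    """Ideal post-shrink die area: old_area * (new_node / old_node)^2."""
    return area_mm2 * (node_to_nm / node_from_nm) ** 2


# Two full nodes down, 28nm -> 14nm: each dimension halves, area quarters.
shrunk = shrink_area(GK104_AREA_MM2, 28, 14)
print(f"GK104 shrunk to a true 14nm node: ~{shrunk:.1f} mm^2")  # ~73.5 mm^2
```

At roughly 74mm2, even a best-case shrink of a 2012 performance GPU would be small enough to sit alongside CPU cores on an APU die, which is the point being made above.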
 

NTMBK

Lifer
Nov 14, 2011
10,269
5,134
136
People have been saying the dGPU is dead or close to dead for so long; it's not gonna happen.

On the same node, a massive 550mm2 GPU die that has the potential to use 400W is going to destroy an iGPU on performance. Unless you belong to the camp that feels there's a certain level of performance that is magically "enough" and people will no longer want extra.

I don't, because soon 4K will be standard and not even the best GPU can handle it yet. Moving forward: 4K at 120Hz. Or wide-angle curved monitors. Or, heck, fancier game engines will continue to push the hardware.

There's a LONG way to go for iGPUs to be competitive even at 1080p; we've finally reached a point where an APU can handle 1080p at medium settings at around 37-40 fps in modern games. When are we going to get an iGPU that can push 4K at ultra settings at 60 fps?

I do agree AMD's CPU division is doomed; they have shown no signs of making a leap in IPC, so their CPUs are worthless for high-end usage. But this is where their APUs need to take over: they need design wins on mobile to ensure future success, something they haven't been able to do with Llano or Richland.

Can the limited demand for 500mm^2 space heaters sustain the dGPU market? The market for anything less than an HD7770 has already been killed by integrated graphics. Broadwell GT4 will aim even higher. The mass-market volume is in the low-end parts; there's a reason why NVidia are desperately trying to make Tegra into a viable business.
 
Feb 19, 2009
10,457
10
76
Definitely not a great time to be an enthusiast when AMD's latest has all the power of an overclocked Q6600 + 8800 GTS.
That was early 2007. Seven years.

No, it's a great time to be an enthusiast. You get an i5 and a GTX 780/R9 290(X). Or, with the R9 290 CF issues fixed, a CF R9 290 setup is uber performance for the $.

Even bringing up the enthusiast market segment in an APU discussion means you expect too much, way too much, out of a 250mm2 die that is half CPU, half GPU and goes for cheap.

For its targeted segment, it is an awesome product. I am no AMD CPU fan; I haven't used an AMD CPU in my main PC for years, since their Athlon glory days. But I do see massive potential for small HTPC gaming rigs with these APUs, and they even make sense in a cheap mITX gaming rig that fits in your shoebox, doesn't need a massive power supply and can be lugged around easily. At its thermal and physical size level, it can be a powerful gaming setup in a very thin SFF chassis WITH an external power pack like a laptop.
 
Last edited:
Feb 19, 2009
10,457
10
76
Can the limited demand for 500mm^2 space heaters sustain the dGPU market? The market for anything less than a HD7770 has already been killed by integrated graphics. Broadwell GT4 will aim even higher. The mass market volume is in the low end parts- there's a reason why NVidia are desperately trying to make Tegra into a viable business.

As long as people are willing to pay $600 or even $1,000 for a fast discrete card, it certainly is viable. If it gets to a point where it isn't, there will still be gamers who always want the best and are willing to pay a higher price for it. Look at supercars, for example.

The HD7770 is low end and has been for a long time, with rebadges at that perf level. If and when iGPUs kill it too, the current mid-range will become the new low end. I've read a market analysis report that found that several years ago the price range for a mid-range dGPU was ~$200; now it has shifted to above $300.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
On the same node, a massive 550mm2 GPU die that has the potential to use 400W is going to destroy an iGPU on performance. Unless you belong to the camp that feels there's a certain level of performance that is magically "enough" and people will no longer want extra.

POWER has the strength to beat any Xeon out there, but it does so at a much higher cost per unit and with far fewer units to spread R&D costs over, and because of that it is being overwhelmed by Xeon in the big-iron market.

If you look at the market today, both Nvidia and AMD are making less money on dGPUs than they did in 2008. With IC design costs growing with each generation, there will come a time when the ROI won't be enough, and they will either stretch out development cycles or quit the market.

The bottom of the market was effectively killed on the desktop, and Broadwell should wipe out whatever is left of it in the notebook market. How long until the economics of the dGPU market become unbearable?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
At this point, I'm thoroughly convinced that AnandTech's perceived bias is all in people's heads. People will believe whatever they want in order to feel persecuted.

It's pretty clear that a lot of fans have some sort of persecution complex. When their brand loses, it's because of ridiculous reasons, such as:

1) Reviewer was paid off
2) Intel bias
3) nvidia + Intel "axis of evil" bias
4) Intel paid the website off
5) Intel compiler tricks

Just roll with it. For some, the perception of being persecuted is easier than accepting the fact that a new product just isn't that great.
 
Aug 11, 2008
10,451
642
126
It's pretty clear that a lot of fans have some sort of persecution complex. When their brand loses, it's because of ridiculous reasons, such as:

1) Reviewer was paid off
2) Intel bias
3) nvidia + Intel "axis of evil" bias
4) Intel paid the website off
5) Intel compiler tricks

Just roll with it. For some, the perception of being persecuted is easier than accepting the fact that a new product just isn't that great.

I actually thought the article showed Kaveri in the best light possible vs. a discrete solution, since it compared it to the 6570 rather than to an HD7750. And there were playable settings tested in every game except CoH.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I see you have faith that Intel can keep shrinking the node without issues, or even on time.

Even on the "same node" they've got a clear, solid advantage.



But as has been said over and over: it's not about beating the top parts, because those can't sustain themselves on their own. It's just a matter of removing the ROI. Then the dGPU is gone.

nVidia already knows this, and it can be seen in their moves into new markets.
 

NTMBK

Lifer
Nov 14, 2011
10,269
5,134
136
As long as people are willing to pay $600 or even $1,000 for a fast discrete card, it certainly is viable. If it gets to a point where it isn't, there will still be gamers who always want the best and are willing to pay a higher price for it. Look at supercars, for example.

The HD7770 is low end and has been for a long time, with rebadges at that perf level. If and when iGPUs kill it too, the current mid-range will become the new low end. I've read a market analysis report that found that several years ago the price range for a mid-range dGPU was ~$200; now it has shifted to above $300.

You just described the demise of the high end GPU market.

Sportscars- bad analogy. Sportscars are a visible status symbol and icon of conspicuous consumption, and a 20 year old sportscar can often be more valuable than a brand new one. How much do you think you will get for a GTX Titan in 20 years' time? The price really can't survive being pushed much higher. Even on a tech enthusiast forum like this, what percentage of users were actually willing to pay the initial $1000 price for a Titan?

You're also ignoring the matter of shared R&D costs. The high volume low-end market subsidises R&D, developing the same GCN/Kepler cores that go into the high end cards. End of low end sales = reduction in R&D funding for dGPUs.

And the "pushing the low end higher" story is pretty ridiculous. A HD7770 is low end now? A 100W, 123mm^2 chip is the low end? How high can you push the definition of low end before you run out of die size and TDP budget? By pushing the definition of "low end", what you are actually describing is the gradual shrinking of the dGPU market. Those people who used to buy things like the HD6570, HD5450, HD6450 aren't all buying HD7770s now; they're using integrated graphics. That's a segment of the market that is gone forever, and isn't coming back.
 
Feb 19, 2009
10,457
10
76
You just described the demise of the high end GPU market.

Sportscars- bad analogy. Sportscars are a visible status symbol and icon of conspicuous consumption, and a 20 year old sportscar can often be more valuable than a brand new one. How much do you think you will get for a GTX Titan in 20 years' time? The price really can't survive being pushed much higher. Even on a tech enthusiast forum like this, what percentage of users were actually willing to pay the initial $1000 price for a Titan?

You're also ignoring the matter of shared R&D costs. The high volume low-end market subsidises R&D, developing the same GCN/Kepler cores that go into the high end cards. End of low end sales = reduction in R&D funding for dGPUs.

And the "pushing the low end higher" story is pretty ridiculous. A HD7770 is low end now? A 100W, 123mm^2 chip is the low end? How high can you push the definition of low end before you run out of die size and TDP budget? By pushing the definition of "low end", what you are actually describing is the gradual shrinking of the dGPU market. Those people who used to buy things like the HD6570, HD5450, HD6450 aren't all buying HD7770s now; they're using integrated graphics. That's a segment of the market that is gone forever, and isn't coming back.

I'm pretty sure my buddies appreciate their rigs and love them as much as any status symbol. There are dudes who have quad-SLI Titans. I don't think they GAF about pricing; they want the best, the fastest, and are willing to pay for it. As long as these types of people exist, there will be a supplier to feed their needs. It may not be from as huge a company or as wide in scope/activity, but the high-end GPU will remain for a long, long time.

Yes, the 7770 is low end. At around $100 it fits that segment very well. That segment is in danger from iGPUs, but more buyers will migrate towards the mid-range, pushing prices up. I don't see a drop in gaming consumption anytime soon.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
Even bringing up the enthusiast market segment in an APU discussion means you expect too much, way too much, out of a 250mm2 die that is half CPU, half GPU and goes for cheap.

No, seven years.

Could you imagine being on a Celeron 300A and TNT in 2006? Great setup for 1999, but in '06 I was on a X2 5200+ (which was down to a ~$120 processor) and I shortly thereafter got the aforementioned 8800 GTS, which in 320MB flavor was only $300. Putting a 300A and TNT onto a single chip in 2006 would not have been performance that would be on anyone's radar. Asking $190 for it would've been absurd.
Yet here you are drooling over a $190 C2Q + 8800 GTS because it's branded "AMD."
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
That's nonsense, the i5 4670K will slaughter the G3220 on CPU


When you were talking about Kaveri it was all about GPU performance and gaming; when the discussion went into Intel's courtyard, you're talking about CPU performance.

So did anyone say how a G3220 or 750K + dGPU will slaughter the Core i5 4670K with iGPU in gaming at the same or lower price?
Did anyone really think that in order to get Kaveri's CPU and GPU performance with an Intel product you'd have to buy a Core i3 + a dGPU like a GT 630/640?
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
No, seven years.

Could you imagine being on a Celeron 300A and TNT in 2006? Great setup for 1999, but in '06 I was on a X2 5200+ (which was down to a ~$120 processor) and I shortly thereafter got the aforementioned 8800 GTS, which in 320MB flavor was only $300. Putting a 300A and TNT onto a single chip in 2006 would not have been performance that would be on anyone's radar. Asking $190 for it would've been absurd.
Yet here you are drooling over a $190 C2Q + 8800 GTS because it's branded "AMD."

You do know that people play games on Pentium 4-performance CPUs inside tablets and mobile phones?
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Anand did the exact same thing with the Iris Pro review: tested low quality at low res, then scaled up quality and resolution at the same time.

I don't remember anyone complaining about "bias" back then.

Anand generally tested 768p medium and 900p high. There were a few 1050p high tests, and in every test except Metro, fps at the highest settings were over 25 for the top card (650M), generally around 30-40.

768p medium and 900p high generally got acceptable framerates. (And Anand also compared Iris to a 650M at 900/1250 MHz, the FASTEST 650M on the market, overclocked beyond stock and with GDDR5, to test Intel's claim of ~650M performance. This was anti-Intel if anything.)

Compare that to Kaveri: 1280 x 1024 on low. This is not a common resolution, and Kaveri gets 60+ fps in every test except CoH. It's a useless resolution at useless settings. 1050p high is pushing playability, and 1080p extreme is useless because everything is a slideshow. Tom's used better settings (generally), but TR also used sub-30 fps settings.

Thankfully Anand doesn't test hardware only for you. He tests it for people who are actually relevant, like me. In order to put together builds for people I need to know where the performance lies in relation to alternatives, and nobody tests discrete GPUs on Low.

Honestly, your complaints don't hold up to the barest of scrutiny. If games can be scaled down to be playable for the notebook reviews, why do you need to see them scaled down to see if they're playable on a desktop APU?



Oh gee, I wonder if there is any setting that would allow me to play Skyrim at better than 21.6 FPS on an A10-7850K? How ever could I figure this one out?

You're desperate to get one word that you can use in your internet fanboy arguments: "Playable." But the tailor-made settings you ask for are useless to everyone else because nothing else is benched at those settings.

He is testing iGPUs, not dGPUs. The one or two dGPUs included are also not playable at 1080p extreme.

I actually thought the article showed Kaveri in the best light possible vs. a discrete solution, since it compared it to the 6570 rather than to an HD7750. And there were playable settings tested in every game except CoH.

1280 x 1024 is not really a relevant resolution (not one that will actually be used by anyone). Make it 1366 x 768, 900p, or 1080p at playable fps.
 
Last edited:
Aug 11, 2008
10,451
642
126
Anand generally tested 768p medium and 900p high. There were a few 1050p high tests, and in every test except Metro, fps at the highest settings were over 25 for the top card (650M), generally around 30-40.

768p medium and 900p high generally got acceptable framerates. (And Anand also compared Iris to a 650M at 900/1250 MHz, the FASTEST 650M on the market, overclocked beyond stock and with GDDR5, to test Intel's claim of ~650M performance. This was anti-Intel if anything.)

Compare that to Kaveri: 1280 x 1024 on low. This is not a common resolution, and Kaveri gets 60+ fps in every test except CoH. It's a useless resolution at useless settings. 1050p high is pushing playability, and 1080p extreme is useless because everything is a slideshow. Tom's used better settings (generally), but TR also used sub-30 fps settings.



He is testing iGPUs, not dGPUs. The one or two dGPUs included are also not playable at 1080p extreme.



1280 x 1024 is not really a relevant resolution (not one that will actually be used by anyone). Make it 1366 x 768, 900p, or 1080p at playable fps.


Granted, 1280 x 1024 is a strange resolution, but I viewed it as a substitute for 768p. Since it is 25% more pixels, I would assume anything playable at that res would also be playable at 768p, and most of the games were playable at that setting. To me the point is not whether you can lower the resolution/settings enough to get playability on an APU, but whether you get better performance in the same price range with a discrete card and a low-end CPU. They picked a strange resolution, which could make the APU appear less competent, but picked a discrete card that would show it in a good light relative to discrete. I don't really see how that translates to bias.
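For reference, the "25% more pixels" claim checks out; a quick sketch comparing the pixel counts of the resolutions discussed in this thread:

```python
# Pixel counts for the resolutions discussed, relative to 1366x768 (768p).
resolutions = {
    "1280x1024": 1280 * 1024,
    "1366x768": 1366 * 768,
    "1600x900": 1600 * 900,
    "1920x1080": 1920 * 1080,
}

base = resolutions["1366x768"]
for name, pixels in resolutions.items():
    delta = pixels / base - 1  # fraction more (or fewer) pixels than 768p
    print(f"{name}: {pixels:>9,} px ({delta:+.0%} vs 1366x768)")

# 1280x1024 has ~25% more pixels than 1366x768, so anything playable
# at 1280x1024 should also be playable at 768p.
```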
 