Qualcomm Snapdragon 800 (MSM8974) Performance Preview

R0H1T

Platinum Member
Jan 12, 2013
Final Words

Qualcomm's Snapdragon 800 is quite possibly its most ambitious SoC to date. The goal? To drive absolute performance while maintaining power efficiency. While Snapdragon 600 was clearly about delivering evolutionary gains in performance, Snapdragon 800 intends to compete with ARM's Cortex A15 and Intel's Bay Trail platform.

On the CPU performance front, Snapdragon 800's 2.3GHz Krait 400 cores appear to hold their own quite well against ARM's Cortex A15. In some cases ARM holds the advantage, while in others the higher-clocked Krait 400 takes the lead. We still have the question of power to answer, but Qualcomm bets it can deliver A15-like performance without A15-like power thanks to the 28nm HPM process at its foundry partners.

Qualcomm didn't have any power demos set up, so power analysis and battery life testing will have to come at a later date, but the claim is better performance at equivalent platform power to Snapdragon 600.

On the GPU side, we have a new king. Adreno 330 delivers huge performance improvements over Adreno 320 and everything else we've tested thus far. Snapdragon 800 is the new benchmark to beat. It's very clear to me why many tablet designs scheduled for later this year are based on Snapdragon 800 silicon.

Source
 

R0H1T

Platinum Member
Jan 12, 2013
Without further ado, I'd say this is the reason why the mobile/tablet space isn't going to be a walk in the park for Intel, as some evangelists claim: the likes of Apple/Samsung/Qualcomm will squeeze it at the top end while Mediatek/Rockchip eat the bottom half, leaving very little elbow room for 'em. BTW, the next-gen Microsoft Surface 2.0 (Windows 8.1) is going to have a Qualcomm 800 chip ~
http://news.softpedia.com/news/Microsoft-Surface-2-0-Windows-8-1-Qualcomm-800-Chip-362003.shtml
 

R0H1T

Platinum Member
Jan 12, 2013
Not much to see here without actual power consumption figures.
The thing is, power consumption numbers are only a part of the equation. I've said this before but I'll reiterate: for a smartphone the display will eat roughly 70% of the battery, whilst the WiFi/3G radios et al will use 10% or more, leaving the actual SoC with anywhere between a sixth and a third of total consumption, depending upon the usage pattern. For tablets the SoC will not use more than 40% of total power at any time for average users, so I personally believe Qualcomm's claim of a similar level of efficiency wrt Snapdragon 600.
 

krumme

Diamond Member
Oct 9, 2009
I think it's good we have a S800 thread. Thanks. The new ARM CPU is by far the most interesting CPU on the market, while we still tend to discuss utterly unimportant 5 GHz CPUs.

Qualcomm continues to surprise me. The GPU part is very healthy, and I judge from the Great Equalizer that the new Atoms with e.g. 4 or 8 EUs and lower MHz will be far behind from day one, even from this product, on the GPU side.

Intel simply needs a more effective GPU side. SDP talk doesn't help in a professional OEM market. Results do.

And Qualcomm's pipeline is healthy, continuing with a new product every half year. If one wonders why Qualcomm's market cap is as high as Intel's - look at this product. Super high performance in a single-chip solution. Delivering what the market needs - profit for the OEM.
 

Face2Face

Diamond Member
Jun 6, 2001
The Adreno (Radeon) 330 is looking quite good. I have the 1st-gen 320 in my phone, and at 1080p it holds its own. I will say some of the newer mobile games coming out will cause it to stutter some.
 

Exophase

Diamond Member
Apr 19, 2012
I think it's good we have a S800 thread. Thanks. The new ARM CPU is by far the most interesting CPU on the market, while we still tend to discuss utterly unimportant 5 GHz CPUs.

Qualcomm continues to surprise me. The GPU part is very healthy, and I judge from the Great Equalizer that the new Atoms with e.g. 4 or 8 EUs and lower MHz will be far behind from day one, even from this product, on the GPU side.

Intel simply needs a more effective GPU side. SDP talk doesn't help in a professional OEM market. Results do.

And Qualcomm's pipeline is healthy, continuing with a new product every half year. If one wonders why Qualcomm's market cap is as high as Intel's - look at this product. Super high performance in a single-chip solution. Delivering what the market needs - profit for the OEM.

According to Intel slides @ Computex (http://files.shareholder.com/downlo...9b220d3/Hermann_Eul_Computex_2013_keynote.pdf see page 28), Bay Trail will perform 4x better than Clover Trail in GLBenchmark 2.5 Egypt HD. That would be about 28-32fps, which is roughly the same performance that Clover Trail+ gives today:

https://gfxbench.com/subtest_results_of_device.jsp?D=Lenovo+K900&id=484&benchmark=gfx27

You can see from the chart the article gave that this score is worse than what most current tablet SoCs are giving, and S800 will be over twice as fast. So Bay Trail's Gen 7 GPU is definitely not very competitive performance-wise. No idea how it'll do in terms of perf/W. Sadly no one even tries to do perf normalized power consumption tests for mobile GPUs, they just let them run full tilt in a game that can fully utilize them and go from there.
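
The extrapolation above is simple arithmetic; a quick sketch (figures as given in the post - the 7-8 fps Clover Trail baseline and Intel's 4x claim):

```python
# Rough extrapolation of Bay Trail's GLBenchmark 2.5 Egypt HD offscreen
# score from Intel's "4x Clover Trail" Computex claim. Figures from the
# post above; this is an estimate, not a measurement.
clover_trail_fps = (7.0, 8.0)   # Z2760 offscreen scores seen in the wild
intel_claim = 4.0               # Bay Trail vs Clover Trail, per Intel's slide

bay_trail_est = tuple(fps * intel_claim for fps in clover_trail_fps)
print(bay_trail_est)  # (28.0, 32.0) -- the 28-32 fps range cited
```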

Merrifield is rumored to be using an IMG GPU still, possibly Series 6 based. It'll be pretty awkward if it ends up having a stronger GPU than Bay Trail does, but that's starting to look like a real possibility.
 

Khato

Golden Member
Jul 15, 2001
Certainly won't deny the fact that the Adreno 330 looks poised to have the 3D performance crown, but I wouldn't be quite so pessimistic with respect to Bay Trail projections. The Z2760 gets a score of 12 fps in GLBenchmark 2.5 - http://www.techinferno.com/2013/06/08/samsung-ativ-smart-pc-500t-intel-z2760-review/7/ - so 4x the performance would put it up to 48 fps. Which isn't too bad when you consider that there are currently only 3 higher ARM scores than that in the GFXBench database (highest being NVIDIA's Project Shield at 54.5 fps... which at 18.3 fps in GLBenchmark 2.7 means it still gets handily beat by Adreno 330.)

The other point of comparison we have is 3DMark, and there we can use Anandtech's numbers given that they included a Z2760 in the charts. Of course Intel's slide doesn't state whether it's 4.7x the graphics score or total - if it's graphics that would be 13606 versus Adreno 330's 21680 while if total it'd be 11463 versus 19661. Again, no question that Adreno 330 is quite a bit more powerful than anything else currently on the market (including Tegra 4 apparently.)

Also, as has already been pointed out, there's still the rather important question of power - throwing more die space and power at graphics is typically the only way to increase performance. Which isn't necessarily a bad thing since it should only use that extra power when it's actually necessary, it's just a trade-off in terms of die space/cost.
 

LogOver

Member
May 29, 2011
According to Intel slides @ Computex (http://files.shareholder.com/downlo...9b220d3/Hermann_Eul_Computex_2013_keynote.pdf see page 28), Bay Trail will perform 4x better than Clover Trail in GLBenchmark 2.5 Egypt HD. That would be about 28-32fps, which is roughly the same performance that Clover Trail+ gives today:

Qualcomm graphics are still stuck at DX feature level 9.3 and support only OpenGL ES 3.0 (assuming Adreno 330 is an overclocked Adreno 320), which means feature-wise Adreno 330 is just a subset of Intel's Gen 7 graphics, which fully supports DX 11 and OpenGL 4.0.
 

Exophase

Diamond Member
Apr 19, 2012
Certainly won't deny the fact that the Adreno 330 looks poised to have the 3D performance crown, but I wouldn't be quite so pessimistic with respect to Bay Trail projections. The Z2760 gets a score of 12 fps in GLBenchmark 2.5 - http://www.techinferno.com/2013/06/08/samsung-ativ-smart-pc-500t-intel-z2760-review/7/ - so 4x the performance would put it up to 48 fps. Which isn't too bad when you consider that there are currently only 3 higher ARM scores than that in the GFXBench database (highest being NVIDIA's Project Shield at 54.5 fps... which at 18.3 fps in GLBenchmark 2.7 means it still gets handily beat by Adreno 330.)

The review number you quoted is the onscreen score. Onscreen is affected by native resolution, in this case the tablet is running at 1366x768 instead of 1920x1080 used in the offscreen test. In other words, it's the wrong number.

See here:

https://gfxbench.com/device.jsp?D=Samsung%20ATIV%20Smart%20PC%20500T

The right one for comparison is the 7FPS number. I've seen some numbers as high as 8FPS. Hence the 28-32 number I gave.

Didn't look at 3DMark because, as you say, it's vague what they're actually talking about. Gen 7 traditionally does very well in 3DMark vs other things, so I would take it with a little less weight to begin with. Some real game comparisons would be nice but good luck getting that...

Power is an issue, but we're talking about a tablet setting where there should be a fair power budget and the user should be able to exercise that in games if he/she desires (but there should really be more user controls than there are for scaling max allowed GPU load).

Qualcomm graphics are still stuck at DX feature level 9.3 and support only OpenGL ES 3.0 (assuming Adreno 330 is an overclocked Adreno 320), which means feature-wise Adreno 330 is just a subset of Intel's Gen 7 graphics, which fully supports DX 11 and OpenGL 4.0.

And at these performance levels DX11 is close to useless even when running Windows.
 

LogOver

Member
May 29, 2011
And at these performance levels DX11 is close to useless even when running Windows.

DX11 makes less sense indeed (as its main features are tessellation and GPGPU compute), but DX10 is a major improvement over DX9. It introduces a lot of new features; for example, Adreno still lacks geometry shaders. Basically, using DX10 instead of DX9 you can get either better performance with the same picture quality or the same performance with better picture quality.
 

Exophase

Diamond Member
Apr 19, 2012
DX11 makes less sense indeed (as its main features are tessellation and GPGPU compute), but DX10 is a major improvement over DX9. It introduces a lot of new features; for example, Adreno still lacks geometry shaders. Basically, using DX10 instead of DX9 you can get either better performance with the same picture quality or the same performance with better picture quality.

That's all true, but it's really going to be dictated by the number of games that make use of the features and run acceptably at the performance levels this hardware provides. Intel is going to have an uphill battle trying to get developers to make new Windows games that scale well all the way down to this level (not to mention, are tablet-friendly) if that's their plan. Getting Android developers to utilize it through OGL ES extensions is even less likely.

I don't know much about the actual benefits DX10 gives performance-wise - you can probably tell me better - but allegedly instancing support (in ES 3.0) at least helps reduce the performance gap where applicable.
 

Khato

Golden Member
Jul 15, 2001
The review number you quoted is the onscreen score. Onscreen is affected by native resolution, in this case the tablet is running at 1366x768 instead of 1920x1080 used in the offscreen test. In other words, it's the wrong number.

See here:

https://gfxbench.com/device.jsp?D=Samsung%20ATIV%20Smart%20PC%20500T

The right one for comparison is the 7FPS number. I've seen some numbers as high as 8FPS. Hence the 28-32 number I gave.

Such wasn't clear from the review, but 12 fps matched up with where SGX 545 should be at Z2760's 533 MHz frequency in comparison to Z2460's SGX 540 at a 400 MHz frequency - gfxbench for Atom with SGX 540 @ 400 MHz. But you're quite correct that the number I linked was incorrect.

However, there's another wrench in all these comparisons that I forgot about initially, and hence arrived at the incorrect number - OS differences. The numbers given in Intel's slide are based on Windows 8, but it's quite obvious that the PowerVR architecture obtains better performance in GLBenchmark 2.5 under Android than Windows... which could mean that a decent chunk of the performance improvement Intel is claiming is due to the superior Windows drivers they have for Gen compared to PowerVR. (Yes, there are drivers that suck more than those for Intel's Gen architecture.)

Oh, and as for power... yes, tablets have higher tolerances. However, I wouldn't be at all surprised if Adreno 330 was drawing over 4 watts to achieve that level of performance, more likely somewhere in the 6-8 watt range. (Based on the GPU power usage in Anandtech's x86 vs ARM power usage article: Mali T-604 uses around 3.5 watts for 43 fps in GLBenchmark 2.5, while Adreno 225 uses around 1 watt for 14.5 fps on the Android side; Tegra 3 uses around 1.75 watts for 11.5 fps, and SGX 545 uses 0.75 watts for 7 fps on the Windows side, which isn't as useful for this extrapolation.)
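
The efficiency figures behind that estimate can be laid out as a quick sketch. The three data points are those quoted from Anandtech's article; the Adreno 330 figure of ~68 fps (4.7x Adreno 225's 14.5 fps) and the resulting watt estimate are speculative, not measured:

```python
# GLBenchmark 2.5 Egypt HD (GPU rail watts, offscreen fps) as quoted in
# the post from Anandtech's x86-vs-ARM power article.
gpu_points = {
    "Mali T-604": (3.5, 43.0),
    "Adreno 225": (1.0, 14.5),
    "Tegra 3":    (1.75, 11.5),
}

for name, (watts, fps) in gpu_points.items():
    print(f"{name}: {fps / watts:.1f} fps/W")

# Mali T-604 lands around 12.3 fps/W; at that efficiency, a speculative
# ~68 fps for Adreno 330 would need roughly 68 / 12.3 ~= 5.5 W, which
# sits inside the 4-8 W range guessed above.
adreno330_fps_est = 14.5 * 4.7
mali_fps_per_w = 43.0 / 3.5
print(f"~{adreno330_fps_est / mali_fps_per_w:.1f} W")
```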
 

Exophase

Diamond Member
Apr 19, 2012
Such wasn't clear from the review, but 12 fps matched up with where SGX 545 should be at Z2760's 533 MHz frequency in comparison to Z2460's SGX 540 at a 400 MHz frequency - gfxbench for Atom with SGX 540 @ 400 MHz. But you're quite correct that the number I linked was incorrect.

I agree, the numbers for CT on Win 8 look lower than they should be, and bad drivers could be to blame. But unless there's a universal penalty for GLBench 2.5 running on Win 8 instead of Android, it shouldn't really matter why the CT score is as bad as it is, only that it's a baseline for estimating the 28-32 or so score that BT is going to get. That is simply not a competitive score.

Oh, and as for power... yes, tablets have higher tolerances, however I wouldn't be at all surprised if Adreno 330 was drawing over 4 watts to achieve that level of performance, more likely somewhere in the 6-8 watt range. (Based on the GPU power usage in Anandtech's x86 vs ARM power usage article - Mali T-604 uses around 3.5 watts for 43 fps in GLBenchmark 2.5 while Adreno 225 uses around 1 watt for 14.5 fps on the android side while Tegra 3 uses around 1.75 watts for 11.5 fps and SGX 545 uses 0.75 watts for 7 fps on the windows side which isn't as useful for this extrapolation.)

I don't know - if Adreno 225 used 1W and the performance is 4.7x better, then 6-8W would mean 28% to 70% worse perf/W at peak. Some of the peak performance improvement is due to a clock bump from 300MHz to 450MHz, which will take its toll on efficiency, but the rest of it is due to a wider uarch, which for GPUs shouldn't hurt peak perf/W that much. Given the move to HKMG and uarch improvements, 8W seems on the overly pessimistic side, 6W not nearly as much. I do agree well over 4W is pretty much a given.
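
For clarity, the 28%-70% figures fall out of dividing the power increase by the performance increase (all inputs are the post's own estimates):

```python
# Perf/W degradation of a hypothetical Adreno 330 vs Adreno 225, using
# the post's figures: 225 draws ~1 W, 330 is ~4.7x faster. If the 330
# draws 6-8 W, energy per frame grows by (power ratio / perf ratio).
base_power_w = 1.0   # Adreno 225 estimate
perf_gain = 4.7      # Adreno 330 vs 225

for power_w in (6.0, 8.0):
    worse = power_w / (base_power_w * perf_gain) - 1.0
    print(f"{power_w:.0f} W -> {worse:.0%} worse perf/W")
# 6 W -> 28% worse, 8 W -> 70% worse, matching the range in the post
```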

But everyone else is willing to allow this kind of power drain with the GPU running heavily on tablets. I don't think anyone is going to prefer an inability to play more demanding content over playing it while sucking down the battery. That extra performance should mainly go unused when not needed. People just need an option to run games at artificially lower frame rates and/or with less features when they're comfortable doing so to prioritize battery life.

Intel should have better perf/W due to their process advantage, but as far as GPU goes they're really not capitalizing on it at all. Nothing new here though, I suppose. They're getting a lot more serious about offering fatter GPUs with Haswell, but that could be highly influenced by Apple as a customer in that space.
 

Khato

Golden Member
Jul 15, 2001
I agree, the numbers for CT on Win 8 look lower than they should be, and bad drivers could be to blame. But unless there's a universal penalty for GLBench 2.5 running on Win 8 instead of Android, it shouldn't really matter why the CT score is as bad as it is, only that it's a baseline for estimating the 28-32 or so score that BT is going to get. That is simply not a competitive score.
Agreed. And it's likely that unlike PowerVR there won't be a boost to performance under Android compared to Windows. 28-32 fps seems a reasonable expectation when compared to ultrabook HD4000 at ~85 fps.

I don't know - if Adreno 225 used 1W and the performance is 4.7x better, then 6-8W would mean 28% to 70% worse perf/W at peak. Some of the peak performance improvement is due to a clock bump from 300MHz to 450MHz, which will take its toll on efficiency, but the rest of it is due to a wider uarch, which for GPUs shouldn't hurt peak perf/W that much. Given the move to HKMG and uarch improvements, 8W seems on the overly pessimistic side, 6W not nearly as much. I do agree well over 4W is pretty much a given.
Well, Adreno 225 gets 4.6 fps in GLBenchmark 2.7, so Adreno 330 is roughly 5.7x the performance. Of course comparing against the older architecture means it's likely the 330 is more efficient, making up for the increased frequency. So yeah, 6W is the more likely estimate, but I could easily see it drift upwards. Either way it'd definitely be pushing the limits of passive tablet cooling at that performance level. Who knows, we could soon see GPU frequency throttling down after a certain amount of time in order to keep inside a certain thermal profile.
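
The "roughly 5.7x" figure is just the ratio of the two GLBenchmark 2.7 scores mentioned in the thread:

```python
# Ratio behind the "roughly 5.7x" figure: Adreno 225 at 4.6 fps and
# Adreno 330 at ~26 fps, both in GLBenchmark 2.7 (scores from the thread).
adreno225_fps = 4.6
adreno330_fps = 26.0
print(f"{adreno330_fps / adreno225_fps:.1f}x")  # 5.7x
```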
 

sontin

Diamond Member
Sep 12, 2011
Oh, and as for power... yes, tablets have higher tolerances. However, I wouldn't be at all surprised if Adreno 330 was drawing over 4 watts to achieve that level of performance, more likely somewhere in the 6-8 watt range. (Based on the GPU power usage in Anandtech's x86 vs ARM power usage article: Mali T-604 uses around 3.5 watts for 43 fps in GLBenchmark 2.5, while Adreno 225 uses around 1 watt for 14.5 fps on the Android side; Tegra 3 uses around 1.75 watts for 11.5 fps, and SGX 545 uses 0.75 watts for 7 fps on the Windows side, which isn't as useful for this extrapolation.)

As information:
Tegra 4 uses 3.674 W for 54 fps in GLBenchmark 2.5:
http://www.tomshardware.com/reviews/tegra-4-tegra-4i-gpu-architecture,3445-5.html
 

Khato

Golden Member
Jul 15, 2001

Nice find, thanks - I'd only read Anandtech's coverage of the Tegra 4 announcement and I don't recall the power numbers being included there. Makes for a nice data-point that basically reinforces the point of there being a ceiling on graphics efficiency on a particular process. It also points towards something around 6 W being the more likely figure for Adreno 330 under full load. (Scaling Tegra 4 performance in 2.7 up to 26 fps assuming linear power would be around 5.2 watts, and they may well have the more efficient architecture now.)
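
The linear scaling described above works out as follows. The linearity assumption is crude (it ignores voltage/frequency effects), and all inputs come from posts in this thread:

```python
# Linearly scale Tomshardware's Tegra 4 power measurement up to the
# Adreno 330's GLBenchmark 2.7 score, as done in the post above.
tegra4_power_w = 3.674     # measured during GLBenchmark 2.5
tegra4_fps_27 = 18.3       # Project Shield, GLBenchmark 2.7
adreno330_fps_27 = 26.0    # Snapdragon 800 MDP, GLBenchmark 2.7

est_power = tegra4_power_w * adreno330_fps_27 / tegra4_fps_27
print(f"~{est_power:.1f} W")  # ~5.2 W, the figure quoted above
```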
 
Mar 10, 2006
Qualcomm did a good job, although we need power numbers. For all we know, Snapdragon 800 consumes power in the league of a Haswell-Y rather than a typical mobile SoC.

That being said, in the mobile world, I think Qualcomm and Intel will be the two left at the high end. Should be pretty incredible to watch.
 

Exophase

Diamond Member
Apr 19, 2012
According to Fudo, S800 is supposed to be limited to 5W in tablets:

http://www.fudzilla.com/home/item/31532-qualcomm-aims-at-25-to-3w-tdp-for-phones

Now, this isn't saying that you can run all four CPU cores and the GPU at full clock speed and expect to stay under 5W, or that a power-virus-type pathological condition can't cause throttling. But it should mean that the GPU running a normal load at full capacity uses well under 5W - that, or Qualcomm is lying through their teeth.
 

Exophase

Diamond Member
Apr 19, 2012
We don't know if Qualcomm's reference platform was limited to 5W.

I don't believe that Qualcomm released a reference tablet that clocks its GPU at a rate they don't recommend other tablets clock it at. That kind of belies the whole "reference" aspect of it all. And if that tablet included active cooling - which is what the article says is the driver for this power limit - I think someone would have mentioned it.

Now Shield may not be all that representative of usual tablets since it does have active cooling...
 

sontin

Diamond Member
Sep 12, 2011
Shield's chip has no contact with the surface, so it needs active cooling.

But we have numbers from the Slatebook x2, and its GPU clock is around 10% slower. On the other hand, Asus uses the 1.9GHz version for its Tegra 4 tablet.

BTW: the reference phone achieved the same GL numbers as the tablet:
http://www.engadget.com/2013/06/18/qualcomm-snapdragon-800-mdp-benchmarks/

I don't think that Qualcomm limited the TDP of their platforms...
 

Exophase

Diamond Member
Apr 19, 2012
Yeah, well, unless that phone also has a fan in it (and it doesn't), there's a hard limit to how much power it can dissipate, regardless of software limiting. GLBench 2.5 ran for nearly 2 minutes; I don't think it could have done so while drawing 8+W, and I doubt the SoC alone drew over 5W.
 

Khato

Golden Member
Jul 15, 2001
I don't believe that Qualcomm released a reference tablet that clocks its GPU at a rate they don't recommend other tablets clock it at. That kind of belies the whole "reference" aspect of it all. And if that tablet included active cooling - which is what the article says is the driver for this power limit - I think someone would have mentioned it.

While it is semantics, keep in mind that Qualcomm does not offer reference platforms, rather mobile development platforms.

Another indication that the mentioned TDP caps of 5W for tablets and 3W for phones are not in place on these development platforms comes from Engadget, who ran the benchmark suite on both: http://www.engadget.com/2013/06/18/qualcomm-snapdragon-800-mdp-benchmarks/ In order to provide that level of graphics performance in a 3W envelope, Qualcomm would somehow have to double their efficiency compared to previous designs while staying on the same process... and since Adreno 330 isn't a major shift from Adreno 320, I don't see that happening.

Who knows if journalists were even allowed to check/report device temperatures. (Note that I'm not claiming such is necessarily the case... just that it wouldn't be the first time that Qualcomm's MDPs delivered higher performance than actual products.)
 