Web browsing battery life down... That's not good.
What was Samsung thinking?
S5: 2.8 Ah battery and 9:36 hours surf time
S6: 2.55 Ah battery and 8:44 hours surf time
- both Android 5.0 numbers
It seems to me the surf hours per amp-hour come to 3.42 for the S6 vs 3.43 for the S5, meaning marginally better S5 efficiency for this type of task.
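For anyone who wants to reproduce the arithmetic, here is a minimal sketch (Python; the runtimes and capacities are hard-coded from the figures above, and the helper name is just for illustration). Note that hours per Ah ignores battery voltage; hours per Wh would be a fairer basis if the two packs differ in voltage.

```python
# Surf-time efficiency in hours per amp-hour, using the figures quoted above.
def hours(hhmm: str) -> float:
    """Convert an 'H:MM' runtime string to decimal hours."""
    h, m = hhmm.split(":")
    return int(h) + int(m) / 60

phones = {
    "Galaxy S5": {"battery_ah": 2.80, "surf_time": "9:36"},
    "Galaxy S6": {"battery_ah": 2.55, "surf_time": "8:44"},
}

for name, p in phones.items():
    eff = hours(p["surf_time"]) / p["battery_ah"]
    print(f"{name}: {eff:.2f} surf hours per Ah")
# Galaxy S5: 3.43 surf hours per Ah
# Galaxy S6: 3.42 surf hours per Ah
```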
Looking at the near-endless resources a company like Samsung has, its accelerating R&D, and its interest in using in-house tech for branding and marketing purposes, one has to wonder whether ARM can regain its position, and whether the S6 is the last one in this round and for the foreseeable future.
S5:
1.50 watts
351 cd/m2
S6:
1.20 watts
348 cd/m2
... At 100% APL, meaning a full white screen, we see the power consumption rise up to 1.7W. The phone has a base power consumption of around 440mW when displaying a black screen, meaning the screen emission power to display white at 336 cd/m² of this particular device comes in at about 1.25W....
I have an issue with these numbers. If we check the DisplayMate numbers for the Note 4 we see:
Maximum Display Power Full White Screen at Maximum Brightness:
1.80 watts
350 cd/m2
If we check the Note 4 numbers from the Anandtech measurements chart:
1.70 watts
336 cd/m2
Here everything looks fine; this looks like normal/acceptable panel-to-panel variance. But if you read the Anandtech article:
There is only one conclusion that makes sense after reading those numbers:
DisplayMate measures the idle phone's (display on) power consumption and presents it to the reader as display power consumption, which leads them to attribute any platform power saving to display power saving - IMHO a bit sloppy at best.
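The distinction matters numerically. A quick sketch of the decomposition described in the Anandtech excerpt above (values hard-coded from the quotes; this only illustrates the subtraction, not either site's measurement methodology):

```python
# Separating display emission power from the platform's idle floor,
# using the Anandtech Note 4 figures quoted above.
total_white_w = 1.70   # whole phone, full white screen at 336 cd/m2
base_idle_w   = 0.44   # whole phone, black screen (~440 mW platform floor)

emission_w = total_white_w - base_idle_w
print(f"screen emission power: {emission_w:.2f} W")  # ~1.26 W (the article rounds to ~1.25 W)

# If a reviewer reports total_white_w as "display power", any platform-side
# saving (SoC, PMIC, etc.) shows up as an apparent display improvement.
```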
9.6/2.8 = 3.43; 8.73/2.55 = 3.42 - looks practically equal to me...
Regarding the display, the power numbers from DisplayMate are taken at a fixed luminance, while GSMArena just uses the 50% brightness setting, so the values are not directly comparable. It could be that the S6 has higher luminance at the 50% brightness level.
In addition, of course, the GPU has significantly more pixels to drive, so the GPU's power contribution should go up significantly - on top of the fact that the S6's GPU has quite a few more gates.
Finally, you assume that Samsung's SoC power-management implementation is as efficient as Qualcomm's. Do they even employ big.LITTLE in the web browsing scenario?
In summary, I think your analysis is not conclusive - far from it.
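To see why the fixed-luminance vs. 50%-slider comparison breaks down, here is a toy normalization (the per-nit math uses the wattages quoted earlier in the thread; the 50%-slider luminances are made-up numbers purely for illustration):

```python
# Power-per-nit normalization: panel comparisons are only meaningful at a
# known luminance, not at an arbitrary "50% brightness" slider position.
def mw_per_nit(power_w: float, luminance_nits: float) -> float:
    return power_w / luminance_nits * 1000

# DisplayMate-style: fixed, measured luminance (wattages from the thread).
s5 = mw_per_nit(1.50, 351)
s6 = mw_per_nit(1.20, 348)
print(f"S5: {s5:.2f} mW/nit, S6: {s6:.2f} mW/nit")  # S5: 4.27, S6: 3.45

# A "50% brightness" slider maps to an unknown luminance on each device.
# If the S6 at 50% sat at, say, 180 nits while the S5 sat at 150 nits
# (hypothetical values), the S6 would draw more display power in that test
# even though it is more efficient per nit.
```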
Even if Samsung were to conclude that going with its own cores is the right path forward, it would still need an ARM architectural licence, which is also a win for ARM Holdings.
You're correct. DM wrongly labels their power figures as display power. I was also surprised when I made the chart, but it's the only explanation, as my "full device" power pretty much matches what he is measuring.
S810 strikes again. I was hoping the HTC One M9 would be one of the best implementations; it's a thick aluminium phone, after all.
http://tweakers.net/nieuws/101871/htc-one-m9-heeft-last-van-oververhitting-bij-zware-belasting.html
So far Qualcomm is having issues with their TSMC 20nm Cortex A57 design, LG supposedly cancelled their second SoC (Cortex A57, probably TSMC 20nm) due to overheating, and Huawei avoided the Cortex A57 altogether (the Kirin 930 uses enhanced Cortex A53e cores instead).
Is it so hard to get a Cortex A57 phone/tablet SoC to work fine without FinFETs? Samsung has had the Exynos 5433 (Samsung 20nm) out since last September/October with the Galaxy Note 4, and it will power some of the thinnest (if not the thinnest) tablets around when the 5.4mm-thick Galaxy Tab S2 line launches later this year. Galaxy Note 4 users who used both versions even say the Exynos 5433-based units don't get as warm as the Snapdragon 805 ones - not sure which version was tested above. Subjectively, my Exynos 5433 unit barely gets warm in everyday use (including multitasking).
Others of us are having the warmest winter on record...
It's a shame the M9 is delayed - it would have been great to have this winter to keep you warm.
Man, that's crazy. Samsung had some major delays getting 20 nm out as well - don't forget. I wouldn't be surprised at all if it's because bulk planar is at the end of its lifespan. SOI and 3D transistors are certainly going to be a requirement going forward.
Well, it took them longer to get to market... about an extra quarter. They got 14 nm out quicker, though, but it's more expensive than 20 nm. Until those costs come down to where the price difference is less of a deal, 20 nm is going to make more sense for Samsung.
Damn, didn't Qualcomm come out with an official statement rebutting the rumors that the S810 had overheating issues?
Samsung Electronics has started running projects in order to improve its own semiconductor design ability. This is mostly because Samsung Electronics Vice Chairman Lee Jae-yong has directly instructed them to do so. The company also intends to maintain its initiative in semiconductor design technology, which is the foundation of not only mobile devices but also the upcoming era of the Internet of Things.
According to the semiconductor industry on March 17, Vice Chairman Lee recently ordered the company's executive team “to strengthen technology capability in order to design not only mobile devices but also various semiconductors.”
So Samsung Electronics’ System LSI division has recently started its own custom core technology development project for mobile application processors (AP). The company expects that it will see results by Q1 next year at the latest.
Unlike Apple and Qualcomm, Samsung Electronics has not had its own custom-core technology for mobile APs, the heart of smartphones. The company is also making progress on a project to develop smartphone chips integrating mobile APs and modems, for which it has depended on Qualcomm until now.
Samsung Electronics aims to mount its own integrated chip on high-end smartphones to be released after the Galaxy S6. The company will also expand its foundry businesses, which mostly produced mobile APs, and gradually diversify product groups such as graphics processing units (GPUs) and central processing units (CPUs) for PCs.
Which begs the question: will the Note 5's Exynos SoC integrate Samsung's custom ARM cores by Q4, or is that a bit early? Anyway, 2.3-2.5GHz Cortex A72 cores wouldn't hurt either.
Snafuh said:
Yes, Qualcomm did.
But throttling and battery life on the M9 are horrible.
Off topic pet peeve: "begs the question" is a logical fallacy, specifically a type of circular reasoning. A common, incorrect usage - seen here - is to use it to introduce an interesting tangential question.
I suppose the incorrect usage has become so common that in some sense it is now correct. But, I will continue my fight against it nonetheless.
Now back to your regularly scheduled program.
Battery Life:
Sony Xperia Z3 9h 29 min (Excellent)
Samsung Galaxy S6 edge 8h 11 min (Excellent)
Samsung Galaxy S6 7h 14 min (Good)
HTC One M9 6h 25 min (Average)
Not sure if this got posted yet, but this is a very damning article for Samsung. They seem to have ethical issues - last time it was rigging benchmarks.
Ref link - there's a TON of information on the investigation they did into these rumors:
https://semiaccurate.com/2015/03/02/behind-fake-qualcomm-snapdragon-810-overheating-rumors/
Basically it boils down to these:
"All of the ‘overheating problems’ were found not to be true by many testers on dozens of production devices both at trade shows like CES and in the wild. You can buy an LG G Flex 2 now and it doesn’t overheat, it wasn’t underclocked as rumored, and has none of the ‘reported problems’."
"Conspiracy or idiots?
In spite of the complete and utter debunking of the rumors, the Snapdragon 810 ‘overheating’ story kept coming back.... The first thing that became clear is that each new echo started out within a few days of important events in the launch of the 810. "
And finally (much explanation before this) :
"...Samsung is scared, they have good reason to be frightened. With the launches of the S6, G4, and countless ODM Snapdragon 810 products happening in the coming weeks, what can they do? If you can’t win on merit, FUD. They are. The thoroughness of the FUD and smear campaigns strongly intone that Samsung is going to take a pounding during the next product cycle and that they know it too. If they had a winning product, they wouldn’t have gone to the extraordinary lengths they did to attack LG and the ODMs via the proxy of Qualcomm."
Traditional Cortex A53 cores run at 1.2GHz, which is very low compared to competitors. For example, the MT6752 octa-core SoC is clocked at 1.7GHz. So, in order to reach the same performance as these rival chipsets, the company had to increase the frequency to no less than 2.0GHz.
It could have used the Cortex A57, which is almost 56% faster than the A53. However, the company claims that the A57 uses 256% more power than the A53, which would adversely affect battery life. Moreover, the Cortex A57 tends to heat up faster, thereby increasing the temperature of the smartphone.
So, in order to avoid these problems, the company used enhanced Cortex A53e cores on the chip. Of course, we still don't think the performance will be on par with rival processors based on the A57; however, in exchange you get efficiency and therefore additional battery life.
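Taking the article's claims at face value ("almost 56% faster", "256% more power"), the perf-per-watt arithmetic behind the A53e choice looks roughly like this (a sketch only; the percentages are the article's figures, not measurements):

```python
# Rough perf-per-watt comparison, normalizing the Cortex A53 to 1.0.
a53_perf, a53_power = 1.00, 1.00
a57_perf  = a53_perf * 1.56    # "almost 56% faster"
a57_power = a53_power * 3.56   # "256% more power" = 3.56x the A53's draw

print(f"A57 perf/W relative to A53: {a57_perf / a57_power:.2f}")  # ~0.44

# Even clocked up from 1.2 GHz to 2.0 GHz (a ~1.67x frequency bump, with
# power rising at least linearly with frequency), an enhanced A53 should
# still undercut an A57 on power while closing part of the performance gap.
```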