S|A: (rumour) Apple dumps Intel from laptop lines


Voo

Golden Member
Feb 27, 2009
1,684
0
76
"delivers its peak performance of 4000 DMIPS while consuming less than 250mW per CPU when selected from typical silicon."

Same page..
Wait, you want to compete with an Intel x86 CPU using a power-optimized ARM, and not even the performance-optimized version? Yeah, good luck with that. 4k MIPS at 2GHz is... well, for comparison, I think the first Athlons running at 1GHz had about that.

I'm still not sure why almost everyone thinks that ARM will be the savior of the industry from the ancient x86 ISA. Sure, x86 is a grown monstrosity, but most of that (though not all) can be handled by the decoder. That makes the decoder larger (negligible) and more power-hungry, but Intel has some tricks up its sleeve - things like a µop cache, or the fact that modern compilers don't output the problematic opcodes, reduce that problem quite a bit.

So maybe an x86 CPU is 3% less power efficient than a comparable ARM core (which doesn't exist so far) on the same process node - not insignificant, but also not especially noteworthy, and that's ignoring the fact that Intel's process is usually more advanced. Also, the usual power figures for ARM CPUs are cited for the CPU only - without the cache and co., which gets more and more important with every die shrink. And cache is cache - the ISA doesn't have much of an influence on that.

The most prominent feature of ARM cores is that they're a whole lot cheaper than x86 CPUs - well, that and the fact that Intel so far hasn't produced a chip for the power envelope modern ARM CPUs operate in (and Atom really was treated like a stepchild - old tech, old process nodes).
 

Idontcare

Elite Member
Oct 10, 1999
21,118
59
91
Nvidia didn't become the big daddy of the GPU world by commanding high margins on their products; they did it by maintaining an aggressive release cycle that let them hit the entire market in short order, and they rarely let any competitor hold a lead over them before their next product was out. Nvidia is used to iterating products much faster than the typical ARM giants, and the willingness to cannibalize their own products is what may make them a big player in the market. Of course, their competition in the ARM world is larger than anything they ever faced in the GPU world.

I totally agree. IMO this has more to do with the ruthless business savvy of JHH in comparison to the CEOs of established companies who want easy-to-manage business groups under their helm.

Guys like JHH, who are still running billion-dollar companies they personally had a hand in starting (the likes of Gates, Ellison, Grove, Jobs, Sanders, Page, Dell, Chang, etc.), are just a whole different breed of individual and business leader compared to the CEOs who simply inherited a pre-existing business to manage (Ruiz, Templeton, Otellini, Fiorina, etc.). Sometimes it's a good thing; other times it is a handicap in a business sense (Woz, Allen, etc.).

The challenge for a JHH in the world of ARM, though, is that the customers for his products (which is not us - it is the people who ultimately sell us stuff that has Nvidia chips in it, like tablets and video cards) are not super-keen on the idea of a 6-12 month planned obsolescence rate. That gets expensive on the product-design side of the equation.

Nokia is not looking to completely redesign their smartphone portfolio every 6 months, for example; they'd never ship enough volume in a 6-month product cycle to justify the cost they sunk into creating the short-lived product in the first place.

So a short(er) refresh cycle from Nvidia is not exactly an advantage in the market space they are headed into. Stability of product volume is, though. These guys like to know you have a roadmap they can have confidence in you hitting. Promising new chips with super performance every 12 months but consistently missing the delivery date by even one or two months is a deal killer.

Nokia won't accept having their product development team idled for a month because Nvidia needed another stepping to get the via yields fixed on their super-duper ARM chip. Nokia will go with a company that can manage the timeline with consistency, even if it sacrifices some performance off the top, because in those markets time to market (more specifically, volume to market) is everything.
 

Ares1214

Senior member
Sep 12, 2010
268
0
0
Wait, you want to compete with an Intel x86 CPU using a power-optimized ARM, and not even the performance-optimized version? Yeah, good luck with that. 4k MIPS at 2GHz is... well, for comparison, I think the first Athlons running at 1GHz had about that.

I'm still not sure why almost everyone thinks that ARM will be the savior of the industry from the ancient x86 ISA. Sure, x86 is a grown monstrosity, but most of that (though not all) can be handled by the decoder. That makes the decoder larger (negligible) and more power-hungry, but Intel has some tricks up its sleeve - things like a µop cache, or the fact that modern compilers don't output the problematic opcodes, reduce that problem quite a bit.

So maybe an x86 CPU is 3% less power efficient than a comparable ARM core (which doesn't exist so far) on the same process node - not insignificant, but also not especially noteworthy, and that's ignoring the fact that Intel's process is usually more advanced. Also, the usual power figures for ARM CPUs are cited for the CPU only - without the cache and co., which gets more and more important with every die shrink. And cache is cache - the ISA doesn't have much of an influence on that.

The most prominent feature of ARM cores is that they're a whole lot cheaper than x86 CPUs - well, that and the fact that Intel so far hasn't produced a chip for the power envelope modern ARM CPUs operate in (and Atom really was treated like a stepchild - old tech, old process nodes).

I did not say I wanted to compare that to an x86 core; I was merely saying that an average ARM core in a phone needs between 80-250mW depending on load. That was against another argument saying it was 2W. Anyway, you are doing exactly what I said people were doing: comparing current x86 CPUs to current ARM CPUs. You yourself said they weren't even in the same bracket, so why compare? I'm comparing ARM to x86. It just so happens that ARM is more power efficient, more scalable, and costs a lot less, while x86 is generally faster. Both are good for their respective markets and have obviously found their place. But now, with both trying to enter each other's market, one has to either scale ARM up or scale x86 down.

It's not entirely fair to compare an 80W CPU to an 80, 800, or even 8000mW CPU. What will be fair is when ARM goes above a few hundred mW, even above a few watts. Then we can compare a 5W ARM CPU to a 7W Bobcat, and later on a 35W ARM to a 50W x86.

And besides, who really needs x86 performance anyway? My Android phone runs perfectly fine on a 1GHz ARM CPU (MT4G). The current and soon-to-be-used ARM CPUs can support up to 4GB of memory and 2560x1600. That's perfectly acceptable for a netbook. The next gen, which from Nvidia is supposed to be released in August, can do even more. After that, Nvidia moves on to Stark or Logan (I forget which, to be honest), which brings a 2x performance increase over the last one, and maybe support for 64-bit, meaning desktop-sized memory amounts.

There is nothing wrong with ARM performance; the only thing "wrong" is the limited market it covers. I'd love to see ARM make a prototype 6+ core CPU at 300mm^2+, 3GHz+, and 80W+ just to show people that ARM can be as fast as x86 when designed to be. It would cost less, too.
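For reference, the 4GB ceiling and the "desktop-sized memory amounts" that 64-bit would bring are just address-space arithmetic. A minimal sketch (pure arithmetic, not tied to any specific ARM part; the 40-bit row is an illustrative intermediate, along the lines of ARM's LPAE extension):

```python
# Addressable memory as a function of address width - the arithmetic
# behind the 4GB limit and the 64-bit point above.
GIB = 2**30  # bytes in one GiB

print(2**32 // GIB)   # 32-bit address space  ->              4 GiB
print(2**40 // GIB)   # 40-bit (LPAE-style)   ->          1,024 GiB
print(2**64 // GIB)   # 64-bit address space  -> 17,179,869,184 GiB (16 EiB)
```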
 

Khato

Golden Member
Jul 15, 2001
1,225
280
136
"delivers its peak performance of 4000 DMIPS while consuming less than 250mW per CPU when selected from typical silicon."

Same page... I said as low as 80mW, and that would be 1000x less power than a typical desktop CPU. I then said on a more typical basis it would probably be 250-500x less power. Your 2W number was nowhere near.

Yes, 250mW per CPU equates to 500mW in this simple example, since there are two cores in use for that 4000 DMIPS score. What's so difficult to understand about this when they explicitly state that value on the 'performance' tab of their product page? That is also where they state a full 1.9W for the 2GHz 'performance optimized' hard macro design.
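Putting those two datapoints side by side gives a rough efficiency comparison. One flagged assumption: the 2.5 DMIPS/MHz per-core rate below is ARM's commonly cited Cortex-A9 figure, not a number stated in this thread.

```python
# Rough DMIPS-per-milliwatt for the two dual-core Cortex-A9 hard macros
# discussed above. Treat the results as back-of-the-envelope estimates.
DMIPS_PER_MHZ = 2.5   # assumed per-core Cortex-A9 rate (not from the thread)
CORES = 2

# Power-optimized macro: 4000 DMIPS total at <250 mW per core
print(4000 / (250 * CORES))                  # ~8.0 DMIPS per mW

# Performance-optimized macro: 2 GHz, 1.9 W for the whole dual-core macro
perf_dmips = 2000 * DMIPS_PER_MHZ * CORES    # ~10,000 DMIPS
print(perf_dmips / 1900)                     # ~5.3 DMIPS per mW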
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
I'd love to see ARM make a prototype 6+ core CPU at 300mm^2+, 3GHz+, and 80W+ just to show people that ARM can be as fast as x86 when designed to be. It would cost less, too.
Sure, but my whole point is that such a CPU wouldn't be any more power efficient than the x86 core - at least not noticeably (and that's including Intel's process advantage).

Just look at their current designs - their performance product (which is obviously what we're targeting here) shows 1.9W for the 2GHz core, which DOESN'T include the L2 cache (!) or its controller (and the memory controller? I'd wager not). Include those and you're probably over 3W, and the performance is still far away from a modern x86 core. That just shows once again that power usually doesn't scale linearly with performance - so what are all those claims based on that ARM will be so much more power efficient than x86?
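A minimal sketch of why power doesn't scale linearly with clock speed: dynamic power is roughly P = C·V²·f, and reaching a higher frequency generally requires a higher voltage. The voltages below are made-up illustrative values, not specs for any real ARM or x86 core.

```python
# Classic switching-power approximation: P = C * V^2 * f.
def dynamic_power(c, volts, mhz):
    return c * volts**2 * mhz

low  = dynamic_power(1.0, 0.9, 800)    # hypothetical 800 MHz @ 0.9 V
high = dynamic_power(1.0, 1.2, 2000)   # hypothetical 2 GHz @ 1.2 V

print(2000 / 800)   # 2.5x the frequency...
print(high / low)   # ...for ~4.4x the power
```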

Also, while I'm sure that ARM is able to produce high-performance CPUs, that takes time. Who exactly would want to buy a CPU in 2012 that has the performance of a C2D that could be bought in 2008? After all, there are already laptops out there that get 8h+ life out of their batteries (and with every new node that should increase accordingly), which is more than enough for most people - that's different for smartphones (nobody would try to sell a phone that has 8h battery life today; at least not without getting laughed at).
 

mnewsham

Lifer
Oct 2, 2010
14,539
428
136
(nobody would try to sell a phone that has 8h battery life today; at least not without getting laughed at).

My phone usually dies around lunchtime if I don't have it plugged in to recharge at some point. DroidX, circa 2010.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
ARM is all about idle power consumption. You can't leave an x86 machine to idle for 72+ hours. Until that changes, all the process technology in the world isn't going to allow x86 to take hold in mobile.
 

cotak13

Member
Nov 10, 2010
129
0
0
ARM is all about idle power consumption. You can't leave an x86 machine to idle for 72+ hours. Until that changes, all the process technology in the world isn't going to allow x86 to take hold in mobile.

You'd think that if you didn't have one of the modern smartphones multitasking in the background. My iPhone runs on ARM and it doesn't last 72 hours off the charger, even idle. No way, no how.

It used to be that people wanted phones that last a long time because, with NiCads and NiMH, you had the memory effect to deal with, so you typically didn't charge your phone until it was almost dead anyhow. So endurance was important, because people do want to be able to make a call with one bar left.

Today, with Li-ion batteries, you can plug them in whenever you're low on juice. I know my habit is to charge often, and I think most people have developed the same habit. So ultimate endurance, especially idle time, isn't as important as some would like to make it out to be. At any rate, your RF parts consume the most power. My iPhone can easily run for 2 hours playing games, and typically a lot longer than that. Switch to doing things over 3G and, ouch, it's a pretty rapid drain.

Btw, about idle: I can sleep my MacBook Pro with its C2D for 72 hours no problem (mine's been asleep for about 24 hours now and has 85% battery left). How's that for idle time? And if you want to talk resume time, it's pretty much open the lid and get the password prompt.
 

cotak13

Member
Nov 10, 2010
129
0
0
I totally agree. IMO this has more to do with the ruthless business savvy of JHH in comparison to the CEOs of established companies who want easy-to-manage business groups under their helm.

Guys like JHH, who are still running billion-dollar companies they personally had a hand in starting (the likes of Gates, Ellison, Grove, Jobs, Sanders, Page, Dell, Chang, etc.), are just a whole different breed of individual and business leader compared to the CEOs who simply inherited a pre-existing business to manage (Ruiz, Templeton, Otellini, Fiorina, etc.). Sometimes it's a good thing; other times it is a handicap in a business sense (Woz, Allen, etc.).

The challenge for a JHH in the world of ARM, though, is that the customers for his products (which is not us - it is the people who ultimately sell us stuff that has Nvidia chips in it, like tablets and video cards) are not super-keen on the idea of a 6-12 month planned obsolescence rate. That gets expensive on the product-design side of the equation.

Nokia is not looking to completely redesign their smartphone portfolio every 6 months, for example; they'd never ship enough volume in a 6-month product cycle to justify the cost they sunk into creating the short-lived product in the first place.

So a short(er) refresh cycle from Nvidia is not exactly an advantage in the market space they are headed into. Stability of product volume is, though. These guys like to know you have a roadmap they can have confidence in you hitting. Promising new chips with super performance every 12 months but consistently missing the delivery date by even one or two months is a deal killer.

Nokia won't accept having their product development team idled for a month because Nvidia needed another stepping to get the via yields fixed on their super-duper ARM chip. Nokia will go with a company that can manage the timeline with consistency, even if it sacrifices some performance off the top, because in those markets time to market (more specifically, volume to market) is everything.

Exactly. Most people not in the industry don't realize how razor-thin OEMs' product development budgets are. They don't have the money to play around with you missing your target dates, and they mostly also don't have the money to develop products outside their normal product cycles.

Rushing the OEMs by pushing out chip after chip is going to drive them nuts, especially if they see no benefit in it. You can't just dump a new chip on them mid-cycle; they will resist using it unless it really makes a meaningful difference.

Also, a great deal of the cycle is driven by typical consumer spending cycles. You want to release your hot new product a few months (at most) before the "back to school" and "holiday" periods. Shoving a product onto the shelf just after those cycles is a money loser: big-box retailers will be doing their inventory flush, offering discounts and making your brand-new expensive product less attractive. And when the next buying period arrives, your no-longer-new product may be competing against newer, better devices that were just released.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I don't think your argument applies, as Nvidia is not Nokia (a single phone-making company). Nvidia makes chips and doesn't sell to just one company; they sell to anyone. In a hyper-competitive market where everyone is looking for an edge, if Nvidia has just produced the latest, greatest chip, there'll always be someone to buy it. Do you think every company in the market is going to agree in unison to only update at the same time, once a year?

The traditional market is changing anyway - when I used to buy a phone I'd just use it; I'd never think of updating it. Now a phone is like a computer, with major software updates and changes happening all the time. You no longer release a finished product like you once did.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
You'd think that if you didn't have one of the modern smartphones multitasking in the background. My iPhone runs on ARM and it doesn't last 72 hours off the charger, even idle. No way, no how.

It used to be that people wanted phones that last a long time because, with NiCads and NiMH, you had the memory effect to deal with, so you typically didn't charge your phone until it was almost dead anyhow. So endurance was important, because people do want to be able to make a call with one bar left.

Today, with Li-ion batteries, you can plug them in whenever you're low on juice. I know my habit is to charge often, and I think most people have developed the same habit. So ultimate endurance, especially idle time, isn't as important as some would like to make it out to be. At any rate, your RF parts consume the most power. My iPhone can easily run for 2 hours playing games, and typically a lot longer than that. Switch to doing things over 3G and, ouch, it's a pretty rapid drain.

Btw, about idle: I can sleep my MacBook Pro with its C2D for 72 hours no problem (mine's been asleep for about 24 hours now and has 85% battery left). How's that for idle time? And if you want to talk resume time, it's pretty much open the lid and get the password prompt.

If you think that, then you don't have a modern tablet. My iPad can last... well... I've never really found out, but it is way more than 72 hours when "idle". Idle means on, downloading emails, receiving push notifications, waking me up in the morning, etc. Even my Palm Pre can last for several days on a charge if I turn off 3G but leave it connected as a phone.

Your laptop is not idle, it is in standby. The CPU is literally turned off in this mode. The thing draining your battery is your RAM.
 

Ares1214

Senior member
Sep 12, 2010
268
0
0
You'd think that if you didn't have one of the modern smartphones multitasking in the background. My iPhone runs on ARM and it doesn't last 72 hours off the charger, even idle. No way, no how.

It used to be that people wanted phones that last a long time because, with NiCads and NiMH, you had the memory effect to deal with, so you typically didn't charge your phone until it was almost dead anyhow. So endurance was important, because people do want to be able to make a call with one bar left.

Today, with Li-ion batteries, you can plug them in whenever you're low on juice. I know my habit is to charge often, and I think most people have developed the same habit. So ultimate endurance, especially idle time, isn't as important as some would like to make it out to be. At any rate, your RF parts consume the most power. My iPhone can easily run for 2 hours playing games, and typically a lot longer than that. Switch to doing things over 3G and, ouch, it's a pretty rapid drain.

Btw, about idle: I can sleep my MacBook Pro with its C2D for 72 hours no problem (mine's been asleep for about 24 hours now and has 85% battery left). How's that for idle time? And if you want to talk resume time, it's pretty much open the lid and get the password prompt.

What iPhone do you have? My MT4G can do about 8-10 hours of intensive tasks, probably a week if I don't touch it, but definitely longer than 72 hours. And once again, apples to oranges: your Mac's battery is 2x the size of my phone's.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
I don't think your argument applies, as Nvidia is not Nokia (a single phone-making company). Nvidia makes chips and doesn't sell to just one company; they sell to anyone. In a hyper-competitive market where everyone is looking for an edge, if Nvidia has just produced the latest, greatest chip, there'll always be someone to buy it. Do you think every company in the market is going to agree in unison to only update at the same time, once a year?

The traditional market is changing anyway - when I used to buy a phone I'd just use it; I'd never think of updating it. Now a phone is like a computer, with major software updates and changes happening all the time. You no longer release a finished product like you once did.

Yeah, right - and F the other OEMs in the process.
 

RobertPters77

Senior member
Feb 11, 2011
480
0
0
Does anyone seriously believe that Nvidia will be the OEM of choice for ARM chips?

Apple wants control. Apple already makes its own iPad chips. The only company that can supply Apple will be Apple.

Oh, I can see it now. In 2013 the new MacBooks are released with Apple in-house chips, and Steve(blow)Jobs will go on and on about how Apple decided to cut out the middleman and bring a better user experience to the MacBook. And the mindless Apple drones will rave about how great it is. The media and the tech world will be lit aflame over how Apple made another ingenious tech move...

Meanwhile, the entire tech world will snicker behind their backs about what an idiotic move it was. And these new Macs will be slower clock-for-clock than the ancient PPC variants!
 
Mar 11, 2004
23,177
5,641
146
Does anyone seriously believe that Nvidia will be the OEM of choice for ARM chips?

Apple wants control. Apple already makes its own iPad chips. The only company that can supply Apple will be Apple.

Oh, I can see it now. In 2013 the new MacBooks are released with Apple in-house chips, and Steve(blow)Jobs will go on and on about how Apple decided to cut out the middleman and bring a better user experience to the MacBook. And the mindless Apple drones will rave about how great it is. The media and the tech world will be lit aflame over how Apple made another ingenious tech move...

Meanwhile, the entire tech world will snicker behind their backs about what an idiotic move it was. And these new Macs will be slower clock-for-clock than the ancient PPC variants!

I might have missed something, but where was it said Nvidia was to be the OEM? They were just referencing an Nvidia chip and Nvidia's claim about how powerful it will be (~Core 2 Duo), as well as making a point about the instruction set. That's pretty sketchy to go on, and maybe I'm wrong, but I get the hint that it is something different from what people are thinking with ARM. I'm expecting something significantly different from even their current projected future designs.

There's nothing stopping ARM from designing something much more performance-focused, but people seem stuck on thinking of ARM in the sense of the mobile chips in phones. I don't expect it to be more powerful than what Intel is making at the time (which would be whatever comes after Ivy Bridge), but I don't think it needs to be, as they can do a lot of things to make the performance differences moot. For instance, they could come up with a dock that enables substantially higher performance when plugged in, or they could offload intensive tasks to, say, servers.

Apple doesn't have their own fabs; that's what supplying Apple refers to. Apple does not make their own iPad chips; they design them, and then they're made by someone else (I believe Samsung currently). Apple doesn't even own the factories their products are assembled in.

Let them snicker, Apple just keeps making more and more money. A lot of people have been making similar predictions, first about the iPhone and then about the iPad, and see how that's gone.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
It didn't take long until someone found a nasty little detail in the performance numbers: looking at the compiler versions and settings used to compile the CoreMark benchmark, the Core 2 Duo numbers were produced with GCC 3.4 and only the standard set of optimizations (-O2), while the Tegra 3 numbers were run on the more recent GCC 4.4 with aggressive optimizations (-O3). The Il Sistemista website took a Core 2 Duo T7200 and re-ran the benchmark compiled with GCC 4.4 and the same optimization settings. The results were no longer in favor of NVIDIA, as the Core 2 chip scored about 15,200 points, compared to the Tegra's 11,352.

CoreMark benchmark "real" scores, not hampered for the Core2duo:

Core2duo T7200 = 15,200 points scored
Tegra 3 "Kal-El" = 11,352 points scored (about 25% lower; the Core 2 is ~34% faster)

So Kal-El, which isn't even released yet, is still slower than a Core 2 Duo T7200 - a 65nm, 35W CPU from 2006.
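Where the two percentages come from - just arithmetic on the scores above (note the two directions of comparison give different numbers):

```python
# The arithmetic behind the CoreMark comparison quoted above.
core2_t7200 = 15200   # CoreMark, GCC 4.4 -O3 (Il Sistemista re-run)
tegra3_kalel = 11352  # CoreMark, GCC 4.4 -O3

print((1 - tegra3_kalel / core2_t7200) * 100)  # Kal-El scores ~25% lower
print((core2_t7200 / tegra3_kalel - 1) * 100)  # the T7200 is ~34% faster
```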

An Ivy Bridge will probably be something like 5 times (500%) faster than a Kal-El.
People give AMD sh*t for having CPUs that are like 20% slower than the Intel ones, so how is something that will make an AMD CPU look crazy fast going to end up in desktops?

Power use... but I'm wondering whether Intel's next-gen Atom or AMD's next-gen C-50-type APU won't be low enough.

I mean, on 40nm AMD's C-50 can reach 5 watts without taking away from the CPU/GPU, and it would probably be a good bit faster than the Kal-El.

The question is how much lower in power use we will have to go.
Will people be satisfied with a 5-watt TDP if it's faster than ARM?
 

Martimus

Diamond Member
Apr 24, 2007
4,488
153
106
CoreMark benchmark "real" scores, not hampered for the Core2duo:

Core2duo T7200 = 15,200 points scored
Tegra 3 "Kal-El" = 11,352 points scored (about 25% lower; the Core 2 is ~34% faster)

So Kal-El, which isn't even released yet, is still slower than a Core 2 Duo T7200 - a 65nm, 35W CPU from 2006.

An Ivy Bridge will probably be something like 5 times (500%) faster than a Kal-El.
People give AMD sh*t for having CPUs that are like 20% slower than the Intel ones, so how is something that will make an AMD CPU look crazy fast going to end up in desktops?

Power use... but I'm wondering whether Intel's next-gen Atom or AMD's next-gen C-50-type APU won't be low enough.

I mean, on 40nm AMD's C-50 can reach 5 watts without taking away from the CPU/GPU, and it would probably be a good bit faster than the Kal-El.

The question is how much lower in power use we will have to go.
Will people be satisfied with a 5-watt TDP if it's faster than ARM?

If they can get C-50 performance with a 100mW power draw, then it will be extremely useful in handheld devices such as phones. 5W is more than an order of magnitude too much to be useful in a small battery-powered device.
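A rough sketch of the battery-life arithmetic behind that point. One flagged assumption: ~5.5 Wh is a typical 1500 mAh, 3.7 V smartphone battery of the era, not a figure from this thread.

```python
# Runtime on the SoC's draw alone, ignoring screen and radios.
battery_wh = 1.5 * 3.7  # assumed ~5.55 Wh smartphone battery

for soc_w in (5.0, 0.5, 0.1):
    print(f"{soc_w*1000:6.0f} mW draw -> ~{battery_wh/soc_w:5.1f} h on the SoC alone")
# 5 W drains it in about an hour; 100 mW stretches it past two days.
```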
 

Idontcare

Elite Member
Oct 10, 1999
21,118
59
91
If they can get C-50 performance with a 100mW power draw, then it will be extremely useful in handheld devices such as phones. 5W is more than an order of magnitude too much to be useful in a small battery-powered device.

I recall reading somewhere that the CPUs that power iPhones draw between 500mW and 800mW max.

That should probably set the upper limit of expectations for what the market will bear in terms of power consumption, as it relates to both battery life and the general heat/comfort/size of the unit itself.

And the ARM guys aren't standing still either. While Intel and/or AMD are trying to get their x86-based solutions to fit within a sub-1W footprint, the ARM designers are busy working on getting their 800mW products to deliver comparable performance while fitting within a 500mW or 250mW footprint.

The performance/watt targets and thresholds are a moving benchmark, and right now ARM holds the upper hand.
 

OS

Lifer
Oct 11, 1999
15,581
1
76
CPU power consumption is just one part of total device/platform power use.

The larger the device or the more features it has, the more power the other parts of the system require - in which case it makes less sense to sacrifice CPU performance when the rest of your system takes 7-8 watts at idle.
 

Cogman

Lifer
Sep 19, 2000
10,278
126
106
CPU power consumption is just one part of total device/platform power use.

The larger the device or the more features it has, the more power the other parts of the system require - in which case it makes less sense to sacrifice CPU performance when the rest of your system takes 7-8 watts at idle.

Eeeehhh, this is not always true. For example, moving the memory controller from the northbridge to the CPU results in a net reduction in power consumption. There are several reasons for this: the CPU wastes fewer cycles waiting for a memory request, the memory controller is built on a smaller process node than it would be in a motherboard chipset, and the memory controller's interface can be simpler because the distance the signals have to travel is drastically reduced.

An SoC is going to use less power than separate discrete components for the same reasons.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
CPU power consumption is just one part of total device/platform power use.

The larger the device or the more features it has, the more power the other parts of the system require - in which case it makes less sense to sacrifice CPU performance when the rest of your system takes 7-8 watts at idle.

x86 CPUs place a glass ceiling on battery life. You're not going to get more than 10 hours or so on a normal system, even if it is doing nothing and the display is turned off.

They also massively degrade the user experience by having such large variation in power consumption. One of the coolest things about the iPad is that you know it will last 10 hours on a charge no matter what, whether you are streaming Netflix or reading a book. This doesn't extend to PC laptops, not even close. From Apple's perspective, ARM's low idle and low load power consumption is part of the "magic".
 

OS

Lifer
Oct 11, 1999
15,581
1
76
You're going to integrate an HDD/SSD and RAM into your SoC?

A medium-size LCD panel takes something like ~4W by itself.
 

OS

Lifer
Oct 11, 1999
15,581
1
76
x86 CPUs place a glass ceiling on battery life. You're not going to get more than 10 hours or so on a normal system, even if it is doing nothing and the display is turned off.

They also massively degrade the user experience by having such large variation in power consumption. One of the coolest things about the iPad is that you know it will last 10 hours on a charge no matter what, whether you are streaming Netflix or reading a book. This doesn't extend to PC laptops, not even close. From Apple's perspective, ARM's low idle and low load power consumption is part of the "magic".


That isn't my point. My point is that on a larger device the other parts take up more power too, which reduces the benefit of CPU power savings.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
And that's my point: it doesn't. Having a low-power CPU in a large device gives you much more consistent battery performance. Apple will not sell a tablet that gets 5 hours of life playing video and 10 hours reading books. Stable, consistent battery life is as big a part of the user experience as long battery life. Battery life is also "buffered" by the ambient light sensor, which keeps LCD power draw relatively consistent.
 