I know it can't be used as actual data since it's anecdotal. What I'm trying to get at is that we need to come up with a better testing methodology. Who loops through websites until the battery dies? Who voice-calls nonstop until the phone dies? We need some kind of test that does a little bit of everything over an entire day and then sees how the phone fares: make X phone calls, browse X sites, stream music for X hours, watch video for X hours, all together. I dunno, something like that.
And I also don't put much faith in AnandTech because they have really dropped the ball when it comes to Android. When the new iPhone launched, not only did we get a complete battery test, they changed the entire testing process for it. The Note 2? The Razr line? The DNA? We're still waiting for even basic battery figures.
No way.
I prefer it the way it is now to your suggestion.
Not everyone watches Netflix or Blu-ray rips on their phone (that's what my tablet is for). Not everyone talks non-stop until the phone dies. Not everyone keeps browsing on their phone until it dies 8-10 hours later.
I don't do either of those things. However, I can use each scenario as a baseline to estimate my own real-world usage. This is how I do it for every phone.
For example, say a reviewer quotes a talk time of 12 or 20 hours for the battery to reach 0%, 10%, or whatever the standard cutoff is on a particular phone.
I estimate that I talk for 30 minutes a day, an hour at most, so I take that reviewer's 12-hour-to-0% figure and calculate mine from there.
Talk time: 1 hour of talk on a 12-hour rating = 1/12, or about 8.33% battery drain in a day from talking on the phone.
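The math above is just "my hours divided by the rated hours." A minimal sketch of that estimate (the function name and the 8-hour browsing rating are my own assumptions, not figures from any review):

```python
def daily_drain_pct(my_hours, rated_hours):
    """Percent of battery one activity uses per day, given a continuous-use
    rating: my_hours of an activity rated at rated_hours of endurance drains
    roughly my_hours / rated_hours of a full charge."""
    return my_hours / rated_hours * 100

# Example figures (assumptions for illustration):
talk = daily_drain_pct(1.0, 12.0)    # 1 hr of calls, 12-hr rated talk time
browse = daily_drain_pct(1.0, 8.0)   # 1 hr of browsing, 8-hr rated browsing

print(round(talk, 2))    # 8.33
print(round(browse, 1))  # 12.5
```

Add up the per-activity percentages and you get a rough personal daily budget, which is exactly why separate per-scenario ratings are more useful than one blended number.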
Web browsing: I browse the web or read news for a limited time. This is also where AnandTech's benchmark falls short for me: it loads pages constantly, every 10 seconds (or whatever the fixed interval is), until the phone dies. I only read news or browse for 30 minutes to an hour, max. I don't care how fast the phone loads pages on 3G, LTE, EDGE, or whatever. How much battery can it theoretically drain loading pages for an hour? Whatever I don't finish reading or checking can wait until I get home.
Tethering/Hotspot: I use this very often, especially with my Nexus 7 tablet at the airport, at a hotel, or at some other boring place: waiting for meetings at work to start, waiting for events to start, or checking real estate apps and listings on the go since I'm in the process of buying a house.
Videos: I don't watch videos on my phone, except maybe YouTube, which I'd rather watch on my tablet or on my desktop when I get home. If I'm at a hotel and hook an MHL cable from my phone to the TV, fine, but those days are as rare as hen's teeth. I don't care nearly as much about hourly video-playback ratings as I do about other things like WiFi standby time.
Standby time: How many hours does it take to kill the phone with WiFi on 100% of the time? This matters to me because I'm not one of those people who micromanages battery life with JuiceDefender, disabling sync, turning WiFi off when outside, or using Tasker or some other 3rd-party app to turn WiFi off at bedtime and back on 30 minutes before the alarm so everything syncs at once. I do use different kernels and/or SetCPU, but that's as far as I'll go to optimize battery life. WiFi is on 100% of the time my phone is on, no matter whether I'm inside or outside; the only time it's off is when I'm tethering.
And so on.
Your own test suggestion seems worse than the current alternative.
What you're arguing is that reviewers should average everything together, or use some ratio or combination that supposedly reflects the reviewer's own real-world usage (which probably won't match anyone else's), or the crowd's idea of average usage, and lump it all into one number.
About the only way your approach would make sense is if you ran those tests and kept the results separate, rather than lumping them into one sum to create some "average" battery rating figure.
This is like quoting average MPG for cars. Not everyone does 50:50 highway/city driving. Not everyone does 80:20. Not everyone does 20:80.
Does anyone do 100% city or 100% highway driving? Obviously not, but I'd still prefer manufacturers to list both figures separately, as close to pure highway and pure city as possible, so I can weight them by whatever highway/city ratio I personally drive, rather than relying on Ford's, Honda's, or the EPA's bureaucratic estimate of "true" usage, or on whatever ratio John Q. Public supposedly drives.
I'd rather see a car rated 19/33 city/highway than have someone average arbitrarily defined figures and say the MPG is 26 (assuming a 50:50 city/highway ratio) or 28 (assuming whatever ratio it takes to get that).
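One wrinkle worth noting in this analogy: a distance-weighted MPG mix averages harmonically, not arithmetically, because fuel burned is miles divided by MPG. A small sketch using the 19/33 figures above (the ratios are made-up personal splits, not anyone's official numbers):

```python
def combined_mpg(city_mpg, hwy_mpg, city_frac):
    """Distance-weighted combined MPG. For a trip that is city_frac of its
    miles in the city, gallons per mile add linearly, so the combined
    figure is the harmonic mean of the two ratings."""
    hwy_frac = 1.0 - city_frac
    return 1.0 / (city_frac / city_mpg + hwy_frac / hwy_mpg)

print(round(combined_mpg(19, 33, 0.5), 1))  # 50:50 split -> 24.1
print(round(combined_mpg(19, 33, 0.2), 1))  # mostly highway -> 28.8
```

Note that a true 50:50 split of 19/33 works out closer to 24 than to the 26 you get from a simple arithmetic average, which only reinforces the point: publish the two separate figures and let each driver do their own weighting.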
Regarding Anandtech's battery ratings, I agree with you and I'm a little suspicious.
Like you mentioned, they haven't reviewed the battery life of the Note, the Razr HD, or the Razr MAXX HD, which would clearly match or soundly beat anything Apple could ever come up with on the iPhone.
However, their numbers, along with GSMArena's, are the only reliable ones I can use to estimate my own real-world usage, as opposed to a highly subjective review from The Verge that shows no numbers or comparisons against other phones and is just the reviewer's or editor's intuition, with nothing to verify or back it up.