Apple A5X SoC


runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
The buck doesn't stop at a useless "on paper" comparison. Where the rubber hits the road is where it counts.

I don't think GLBenchmark is an on-paper comparison. At most it's a synthetic benchmark, but it still shows the difference. There's really no other way to compare the two SoCs since they are on different platforms.
 

cheezy321

Diamond Member
Dec 31, 2003
6,218
2
0
LOL. So now the anti-apple crowd thinks they should dictate how apple should market their products? The hilarity continues!

The phrase is always true...haters gonna hate.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
I don't get the point of doing "Offscreen" benchmarks...
If "Offscreen" benchmarks are the holy grail, then why don't we start using them in benching CPUs/GPUs instead of doing numerous resolution changes, numerous AA/AF/HDR changes, numerous game platform changes, and so on?

It seems to me that "Offscreen" benchmarks bench theoretical performance, not actual performance.

It's safe to say that Apple just took a single benchmark that had 4x performance out of the dozens of other benches they tested.
Nvidia does that a lot on the PC discrete graphics side too. In fact, they've been doing that for decades. It's an art for them.

Offscreen means it's resolution-independent.

It's a good metric for cross-platform benchmarking, especially when there is no way to test both devices at the same resolution.

The reason they do multiple resolutions + AA/AF for modern GPUs is that it's doable, it gives them more data, and they don't have to rework their tools to test different hardware.

Say, how do you do an offscreen benchmark of Crysis 2? You can't. But what if you want to know how a certain GPU does in Crysis 2? You'll have to use a standard resolution.

But with GLBenchmark, it's not that simple. You have tablets running at 1024 x 768 and other tablets running at 1280 x 800. Needless to say, the pixel count isn't the same, and if they don't do "offscreen", each device will have an advantage/disadvantage due to their native resolution.

But again, "offscreen" is not "on-paper", because the GPU still needs to work and churn out graphics during that time. "On-paper" is when you take the clock speed, transistor count, and other hard numbers about the GPU and calculate a result using math; no benchmark is involved in getting "on-paper" data. The only difference between a regular and an "off-screen" run is that with "off-screen" you don't see the graphics output, but the GPU still has to do all of the rendering.

If you want a good representation of "off-screen", then think CPU benchmarks. It doesn't show you what's being calculated, but that doesn't mean the CPU isn't being stressed.
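The distinction can be sketched in code. A hypothetical illustration (the clock and ALU numbers below are made up, not real A5 or Tegra specs): an "on-paper" figure is pure spec-sheet arithmetic, while a benchmark, offscreen or not, has to actually execute the work and time it.

```python
import time

# "On-paper" figure: arithmetic from spec-sheet numbers alone.
# No hardware is exercised; the result is just math.
def on_paper_gflops(clock_mhz, alus, flops_per_alu_per_cycle=2):
    # e.g. 2 FLOPs/cycle assuming a fused multiply-add per ALU
    return clock_mhz * 1e6 * alus * flops_per_alu_per_cycle / 1e9

# "Measured" figure: the hardware actually performs the work and is
# timed, even though nothing is displayed -- the offscreen idea.
def measured_mflops(n=200_000):
    start = time.perf_counter()
    acc = 0.0
    for x in range(n):
        acc += x * 1.0000001  # one multiply + one add per iteration
    elapsed = time.perf_counter() - start
    return (2 * n / elapsed) / 1e6, acc

# Hypothetical spec-sheet inputs, for illustration only.
print(on_paper_gflops(clock_mhz=250, alus=16))
mflops, _ = measured_mflops()
print(f"measured: {mflops:.1f} MFLOPS")
```

The first number never touched the hardware; the second one did real work, display or no display.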
 

zsdersw

Lifer
Oct 29, 2003
10,560
2
0
But again, "offscreen" is not "on-paper", because the GPU still needs to work and churn out graphics during that time. "On-paper" is when you take the clock speed, transistor count, and other hard numbers about the GPU and calculate a result using math; no benchmark is involved in getting "on-paper" data. The only difference between a regular and an "off-screen" run is that with "off-screen" you don't see the graphics output, but the GPU still has to do all of the rendering.

It remains useless, though. We don't use our computers, tablets, and smartphones without seeing the output.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
It remains useless, though. We don't use our computers, tablets, and smartphones without seeing the output.

Outputting it won't make a difference in performance. It's like unplugging the monitor from your computer while running Crysis 2: it doesn't make Crysis 2 automagically 10x faster.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
I don't get the point of doing "Offscreen" benchmarks...
If "Offscreen" benchmarks are the holy grail, then why don't we start using them in benching CPUs/GPUs instead of doing numerous resolution changes, numerous AA/AF/HDR changes, numerous game platform changes, and so on?

It's so they can benchmark the devices at the same resolution.

Also, it's kind of inane to say, "then let's stop producing such robust PC benchmarks!" If such tests were possible across the three mobile operating systems, I guarantee that they would be used. However, at this point, the mobile benchmarks just aren't that good, and even if one is good, it's probably not multi-platform anyway.

I don't know... while I think seeing the new mobile technology is exciting, I think the benchmarks are kind of useless in the end. Smartphones seem to be a lot like consoles in regard to gaming. Games tend to be tailored to the system. This isn't like the PC world where you have to adjust the game for your individual system to maintain an acceptable frame rate.

So, these benchmarks really just show us that device A is better than device B, but what does that really tell us? Well... nothing, really. When I look up video card benchmarks, I tend to look at specific games to see how a new part would improve performance at specific settings. For example, I looked at the Ivy Bridge preview to see how it would improve performance in World of Warcraft and StarCraft II (both tend to be affected by CPU performance a modest amount). If I see a modest enough improvement over my i5-2500K, I'll swap it out.

That's not really what we do with our phones. Sometimes developers will increase graphical capabilities on the faster devices. I usually check the update descriptions on iTunes before I download the apps (have to make sure the bastards didn't sneak ads into a game I paid for! :|), and I've seen a few that talked about increasing fancy effects on A5-equipped hardware.
 

zsdersw

Lifer
Oct 29, 2003
10,560
2
0
Outputting it won't make a difference in performance. It's like unplugging the monitor from your computer while running Crysis 2: it doesn't make Crysis 2 automagically 10x faster.

Viewing output is how we all use these devices, which is the only performance comparison that truly matters.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
Viewing output is how we all use these devices, which is the only performance comparison that truly matters.

If you have to view output, given there is no other output than the built-in display, then 1024 x 768 on A5 is faster than 1280 x 800 on Tegra 3 any day. That's a no-brainer.

It stands to reason that 1024 x 768 on A5X would also be significantly faster than Tegra 3 at 1280 x 800 as well. Given that developers may still continue to do graphics at 1024 x 768 for a while yet, it's not unreasonable to assume that the iPad 3 running some 3D games at 1024 x 768 would far outperform Tegra 3.

In the unlikely scenario that Anand could hook up a display that's capable of showing 1280 x 720 to both A5 and Tegra 3, then the benchmark numbers would still be the same when you view them on that display. It makes no difference.
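The pixel arithmetic behind these resolution claims is simple enough to sketch (the device labels are just for illustration; this is a purely fill-rate-limited view):

```python
# Per-frame pixel workload at each tablet's native resolution.
resolutions = {
    "iPad 1/2 (A5)":   (1024, 768),
    "Tegra 3 tablets": (1280, 800),
    "iPad 3 (A5X)":    (2048, 1536),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
base = pixels["iPad 1/2 (A5)"]

for name, p in pixels.items():
    print(f"{name}: {p:,} px ({p / base:.2f}x the 1024x768 load)")
```

1280x800 pushes roughly 1.30x the pixels of 1024x768, and the Retina iPad pushes exactly 4x, which is why comparing raw onscreen scores across these devices is apples to oranges without an offscreen run at a common resolution.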
 

smartpatrol

Senior member
Mar 8, 2006
870
0
0
They shouldn't make the "4x the performance of Tegra 3" claim.

Uh no, that's not what you were arguing before. You were whining that the high-res display needs a more powerful GPU, and that Apple was somehow being deceptive by not mentioning that fact. Then when someone brought up the fact that Apple does indeed mention that in their promo video, you whined some more about how it needs to be "omnipresent" in their marketing. I was pointing out how completely ridiculous that suggestion is.

Oh well. I'm sure if Samsung or Asus released a $500 tablet with an insane 3.1 megapixel screen and GPU equal to the Playstation Vita's (or hell, even one that managed to match the A5's), everybody here would be trumpeting it to the heavens.
 

zsdersw

Lifer
Oct 29, 2003
10,560
2
0
If you have to view output, given there is no other output than the built-in display, then 1024 x 768 on A5 is faster than 1280 x 800 on Tegra 3 any day. That's a no-brainer.

It stands to reason that 1024 x 768 on A5X would also be significantly faster than Tegra 3 at 1280 x 800 as well. Given that developers may still continue to do graphics at 1024 x 768 for a while yet, it's not unreasonable to assume that the iPad 3 running some 3D games at 1024 x 768 would far outperform Tegra 3.

In the unlikely scenario that Anand could hook up a display that's capable of showing 1280 x 720 to both A5 and Tegra 3, then the benchmark numbers would still be the same when you view them on that display. It makes no difference.

The point is... what makes the difference is comparing the performance of one tablet versus another while viewing the output. Everything else is purely academic.
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,331
16
81
I'm sure if Samsung or Asus released a $500 tablet with an insane 3.1 megapixel screen and GPU equal to the Playstation Vita's (or hell, even one that managed to match the A5's), everybody here would be trumpeting it to the heavens.

QFT
 

zsdersw

Lifer
Oct 29, 2003
10,560
2
0
Uh no, that's not what you were arguing before. You were whining that the high-res display needs a more powerful GPU, and that Apple was somehow being deceptive by not mentioning that fact. Then when someone brought up the fact that Apple does indeed mention that in their promo video, you whined some more about how it needs to be "omnipresent" in their marketing. I was pointing out how completely ridiculous that suggestion is.

Uh yes. They only mentioned needing a more powerful GPU in the video when they talked about the number of pixels, not when making the "4x the performance of Tegra 3" claim.

Oh well. I'm sure if Samsung or Asus released a $500 tablet with an insane 3.1 megapixel screen and GPU equal to the Playstation Vita's (or hell, even one that managed to match the A5's), everybody here would be trumpeting it to the heavens.

I wouldn't.
 

cheezy321

Diamond Member
Dec 31, 2003
6,218
2
0
The point is... what makes the difference is comparing the performance of one tablet versus another while viewing the output. Everything else is purely academic.

Isn't that what this entire thread is about? The SoC? This thread is about comparing SoCs, so why do you keep trying to change it into something that it's not?
 

zsdersw

Lifer
Oct 29, 2003
10,560
2
0
Isn't that what this entire thread is about? The SoC? This thread is about comparing SoCs, so why do you keep trying to change it into something that it's not?

I'm not changing the subject; I'm essentially saying that an academic discussion about the SoC is useless. I'm commenting on the subject, not changing it.
 

cheezy321

Diamond Member
Dec 31, 2003
6,218
2
0
I'm not changing the subject; I'm essentially saying that an academic discussion about the SoC is useless. I'm commenting on the subject, not changing it.

Great, you commented and nobody agrees with you. Now it's over. So what else do you have to post in here?
 

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
Doesn't look that way when reading your posts. You always defend/credit Apple and rarely, if ever, criticize anything they make or do.

I don't like the Magic Mouse, Apple TV, routers, Mac Mini... but those things aren't really talked about to begin with. When the iPad 1 and iPad 2 were released, I never said they were great tablets, but I never said they were bad tablets either, simply because Android tablets were bad too.

LOL, I know, it's awesome to watch them squirm and bitch.

I'll bench one of mine next week, hell, I might keep both of them

There's much squirming indeed. It's just like when the A5 was released; there was much squirming then too, and the A5 led the performance pack for nearly a year! People need to get over it.
 

zsdersw

Lifer
Oct 29, 2003
10,560
2
0
Great, you commented and nobody agrees with you. Now it's over. So what else do you have to post in here?

I don't care if they don't agree. I'll post in here just to piss you off, how's that?

If you're posting just to piss someone off, you're doing it wrong.
-ViRGE
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
The point is... what makes the difference is comparing the performance of one tablet versus another while viewing the output. Everything else is purely academic.

And as I said, if you view the output, 1024 x 768 on the A5 is faster than 1280 x 800 on Tegra 3. And due to there being fewer pixels to work with, the A5 is undoubtedly 2x faster than Tegra 3 in that case.

Also, if you plug a 1280 x 720 display into the A5 and Tegra 3 and run GLBenchmark, you'll get the same results Anand got.

That's it. Tegra 3 is just slower than A5. I'm sure that has been beaten to death and back in other threads. But I do agree with you that the difference between A5X and Tegra 3 can't be 4x.

In any case, even Anand has admitted that a 4x increase in GPU horsepower over the A5 wasn't strictly necessary to deliver a similar gaming experience on the iPad 3, possibly hinting that a 2x increase might have been enough.

Anand said:
If you remember back to our iPad 2 review however, the PowerVR SGX 543MP2 used in it was largely overkill for the 1024 x 768 display. It's likely that a 4x increase in GPU horsepower wasn't necessary to deliver a similar experience on games.

He also went on to note that memory bandwidth would play a more significant role. But I'd guess that since the GPU core count is doubled, bandwidth should be boosted in some way to prevent a bottleneck as well. If the A5X has a higher clock speed, that would surely remedy the bandwidth issue somewhat.
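A quick back-of-envelope sketch shows why bandwidth looms so large at the Retina resolution. The bytes-per-pixel and overdraw numbers below are illustrative guesses, not A5X measurements, and a tile-based renderer like the SGX will use far less external bandwidth than this naive model suggests:

```python
# Naive framebuffer-traffic estimate: 32-bit color, 60 fps, and an
# assumed overdraw factor. Purely illustrative; real GPUs (especially
# tile-based deferred renderers) avoid much of this external traffic.
def framebuffer_gbps(width, height, fps=60, bytes_per_px=4, overdraw=2.5):
    return width * height * bytes_per_px * overdraw * fps / 1e9

print(f"iPad 2  (1024x768):  {framebuffer_gbps(1024, 768):.2f} GB/s")
print(f"iPad 3 (2048x1536):  {framebuffer_gbps(2048, 1536):.2f} GB/s")
```

Quadrupling the pixel count quadruples this naive traffic estimate, which is why doubling GPU cores without widening the memory interface could leave the extra cores starved.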

There are too many things we don't know about it right now, so it's hard to judge. What I do know, though, is that Apple successfully drove 2048 x 1536 at the same smoothness as iPad 2 for general interface performance. It wouldn't be a stretch to say that they tried to do the same with 3D gaming.
 

zsdersw

Lifer
Oct 29, 2003
10,560
2
0
And as I said, if you view the output, 1024 x 768 on the A5 is faster than 1280 x 800 on Tegra 3. And due to there being fewer pixels to work with, the A5 is undoubtedly 2x faster than Tegra 3 in that case.

Also, if you plug a 1280 x 720 display into the A5 and Tegra 3 and run GLBenchmark, you'll get the same results Anand got.

That's it. Tegra 3 is just slower than A5. I'm sure that has been beaten to death and back in other threads. But I do agree with you that the difference between A5X and Tegra 3 can't be 4x.

In any case, even Anand has admitted that a 4x increase in GPU horsepower over the A5 wasn't strictly necessary to deliver a similar gaming experience on the iPad 3, possibly hinting that a 2x increase might have been enough.



He also went on to note that memory bandwidth would play a more significant role. But I'd guess that since the GPU core count is doubled, bandwidth should be boosted in some way to prevent a bottleneck as well. If the A5X has a higher clock speed, that would surely remedy the bandwidth issue somewhat.

There are too many things we don't know about it right now, so it's hard to judge. What I do know, though, is that Apple successfully drove 2048 x 1536 at the same smoothness as iPad 2 for general interface performance. It wouldn't be a stretch to say that they tried to do the same with 3D gaming.

This entire post is a response to an argument that doesn't exist. You yourself said:

runawayprisoner said:
But what I'm more interested in, is how much faster A5X is compared to Tegra 3 when both have to handle 2048 x 1536, and I'm also interested in how much performance A5X loses when it has to handle such a monstrous res.

... which is what I want to see, too.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
I like that Apple made sure to up the GPU for its Retina iPad. I was scared that they would pull another iPhone 4: up the res with the same GPU. It is kind of disappointing they didn't throw in two more cores, or better/faster cores, but in iOS I guess that isn't as needed as in Android.

What is much more interesting to me is the single-core Apple TV. That tells me that now that Apple is getting closer to the fab process by moving away from Samsung, they have to find something to do with all the silicon that isn't 100%. We don't know yet, but I bet the Apple TV is an A5 with a (faulty) core disabled. That, plus the fact that the iPad 2 is sticking around at a lower price point, might indicate that Apple is planning on making the fab switch with A5 parts instead of waiting for the A6.

Finally, there is the discussion about what this new iPad means for the next iPhone. I disagree with Anand that the A5X will be in the iPhone 5: the iPhone doesn't need the same beefy GPU (unless, again, we're doubling its resolution), especially when it would hurt battery life so much.

That implies that the iPhone 5 will be much like the iPhone 4: almost no upgrade on the SoC side (maybe even no upgrade except for the fab process) but a different form factor.
 

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
... which is what I want to see, too.

I would assume that Tegra 3 would fall flat on its face running 1536p. As for the A5X, Anand predicts performance similar to the iPad 2's A5 at 720p, or a little slower.
 

lothar

Diamond Member
Jan 5, 2000
6,674
7
76
Offscreen means it's resolution-independent.

It's a good metric for cross-platform benchmarking, especially when there is no way to test both devices at the same resolution.

The reason they do multiple resolutions + AA/AF for modern GPUs is that it's doable, it gives them more data, and they don't have to rework their tools to test different hardware.

Say, how do you do an offscreen benchmark of Crysis 2? You can't. But what if you want to know how a certain GPU does in Crysis 2? You'll have to use a standard resolution.

But with GLBenchmark, it's not that simple. You have tablets running at 1024 x 768 and other tablets running at 1280 x 800. Needless to say, the pixel count isn't the same, and if they don't do "offscreen", each device will have an advantage/disadvantage due to their native resolution.

But again, "offscreen" is not "on-paper", because the GPU still needs to work and churn out graphics during that time. "On-paper" is when you take the clock speed, transistor count, and other hard numbers about the GPU and calculate a result using math; no benchmark is involved in getting "on-paper" data.
Resolution and AA/AF? Sure, they can be compared.
MSAA, TSAA, and whatever else from Nvidia can't be compared to the ATI versions, as they're completely different methods... If I remember correctly, this was the argument the Nvidia fanboys on every Video Cards and Graphics forum (yes, even the one on AnandTech) were bringing up about 2-3 years ago when ATI was king.

You (or was it someone else?) mentioned earlier that it amounts to a "synthetic" benchmark.
In that case it's almost as useless as Quadrant (okay, no mobile benchmark is worse than that, so I take that back), 3DMark (I've lost track of whatever number they're on now), SiSoftware Sandra (or whatever it's called), and a few others. I skip those benchmarks whenever I'm reading reviews of CPUs and GPUs. I'm not sure why review sites continue to use them.

The word "synthetic" IMO automatically removes some credibility from any such benchmark.
 

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
While synthetic benchmarks generally don't match real-world performance because different games/apps are optimized differently, they still tell the general story.

It's only when you "game" the synthetic benchmark that it becomes ridiculous, but at stock, it's a good indicator of performance.

It's the synthetic benchmarks of the whole systems that should be taken with a grain of salt. Synthetic GPU benchmarks are still good.

Whichever the case, the A5 is faster than Tegra 3. Even if you don't believe that, the A5X is still definitely faster than Tegra 3. The only question that keeps being posed is how much faster the A5X is than Tegra 3.
 