Sneak Peek of Tegra 4 (Codename Wayne)


djgandy

Member
Nov 2, 2012
78
0
0
All of this speculation about an alleged benchmark on a pre-development board has no real meaning until the silicon is final. It can take multiple samplings before it is: even a working silicon sample at 80% can trigger a cascade of changes before the last 20% is finally optimized.
The fact that Nvidia is cramming that much performance into T4 silicon (reportedly >50% smaller than the A6X) and still will match or surpass A6X performance while reducing power consumption compared to T3 (by >45%) should be a plus for battery life.


Well, according to BenSkyWalker, silicon is final and T4 is a shipping product that can be compared to the A6. We can only hope one day we will all have read as many EE books as he has. For now I'll just have to keep working in the industry itself.

The fact is everyone compares GLBenchmark, and if you have silicon back you are not in the early development stages. Either Nvidia have underclocked their dev boards massively (which is odd, since you can stick active cooling on dev boards very easily), or they must have seriously broken hardware/drivers at this stage if they are going to pull out a 50% performance increase on such a simple benchmark that they will have been optimising for since day one. Generally, if silicon is that bad it would never leave the premises; in fact, you wouldn't even bother going to silicon at that stage.

BSW: Where are you getting these poor iPhone battery figures from? It is consistently near the top in every test Anand has run:
http://www.anandtech.com/show/6330/the-iphone-5-review/13
The only weak point of the iPhone is talk time, something that is irrelevant in terms of CPU/GPU and completely due to its smaller battery. This is Apple's design choice, and they clearly pull it off with a far superior SoC. If Apple decide to go down the large battery route they have far more headroom to play with than the current battery monsters.

Look how badly the T3-based HTC One X does in GPU tests, though. Hilarious. Completely dominated in performance and power. And guess what is on top with 3x more battery life than the T3: an SGX 540. The iPhone 4S completely destroys T3 with a faster GPU, and it was available before T3 too!

Now I await your logic that the Razr i has a 3x advantage over T3 because it has a slower GPU. Extending that thought, shouldn't T3 have at least 3x the battery life of the iPhone 5?
 

djgandy

Member
Nov 2, 2012
78
0
0
We have a bunch of Apple fans in this thread. When the Tegra 3 came out and roflstomped Apple's latest and greatest, the discussion just swapped around to "only fandroids care about specs", until the second Apple got a more up-to-date SoC, when it was the most important thing ever, of course.

What exactly did T3 come out and beat? The Transformer Prime could match the iPad 2 on a few things and lost on the rest. Is that your definition of winning?

BSW: When do you expect T4 to actually launch and be available for Project Shield, and further to that, when do you expect non-Nvidia customers to have it? The reason I ask is because something weird happened with the T4 announcement. A chip that is supposedly next generation and available for mass production by March/April by my estimate was announced, but there was no mention of big licensees or devices outside of camp Nvidia. Don't you find that a little odd? Nvidia's usual stance is to shout from the rooftops about how great they are at every opportunity.

Also, since your electronic engineering knowledge is so good, how do you expect T4 to provide memory bandwidth to all 72 shader cores? Wider memory buses? Higher memory bus clocks?
 
Last edited:

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
So according to you, A5X was old news as soon as it came out. Funny, that sure as hell isn't close to what you were saying when it happened. I detest hypocrisy, have a backbone and stick by your stance.

Why would the A5X be old news? It beat Tegra 3's GPU by a fair margin. T4 looks like it barely beats A6X, if it does at all.

T4 is newer, it should beat A6X by the same margins or more to be impressive.

And? Apple designs/specs the whole phone out. They chose to have sub-optimal battery life compared to the top-tier phones.

Let me break it down for you since you have a hard time following.

You originally said that the iPhone's battery strengths are due to its screen size.

My rebuttal is that, yeah it has a smaller screen, but it also has a smaller battery due to its size.

Now the iPhone has sub-optimal battery life? Lol!

We have a bunch of Apple fans in this thread. When the Tegra 3 came out and roflstomped Apple's latest and greatest, the discussion just swapped around to "only fandroids care about specs", until the second Apple got a more up-to-date SoC, when it was the most important thing ever, of course.

Too bad T3 didn't stomp Apple. T3's CPU was impressive, but its GPU wasn't. T4 seems to follow the same path.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
BSW: When do you expect T4 to actually launch and be available for Project Shield, and further to that, when do you expect non-Nvidia customers to have it? The reason I ask is because something weird happened with the T4 announcement. A chip that is supposedly next generation and available for mass production by March/April by my estimate was announced, but there was no mention of big licensees or devices outside of camp Nvidia. Don't you find that a little odd? Nvidia's usual stance is to shout from the rooftops about how great they are at every opportunity.

So, what's the difference to Samsung and Qualcomm? :\
 

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
Name a percentage that something has to hit so it isn't old news. Go ahead and prove you have some credibility: before T4's official benches hit, name the exact percentage, to the decimal point, that something has to reach in order not to be old news, so you can't backpedal later when it doesn't make Apple look good.

Now you're backpedaling. Not sure how you came to the conclusion that the A5X is old news compared to T3. If Tegra 4 is marginally faster than the A6X, I'm not impressed.

I stated someone could make a mini screen phone like the iPhone using the Tegra 4 that could easily compete with Apple's battery life. Biggest drain on batteries? Screen. Are you really going to try to deny that?

I never said that the screen isn't the biggest drain.

Compared to the Maxx? Absolutely.

Because of its huge battery. *golf clap*.

There is another member of your faith in this thread trying to put forth that using non Apple SoCs gives inferior battery life.

Never said that either. I only said that iPhones generally have better battery life as a whole.

The majority of Android devices are below the iPhone in battery life tests. If the iPhone had terrible battery life it would be at the bottom, and if it were just OK, it would be in the middle.

In tests the iPhone's battery life is near the top. Far from sub-optimal.
 
Last edited:

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
You are the one that keeps changing your standards. I say T4 isn't old news; you say it is because it isn't much faster; I say then the A5X was old news because it wasn't much faster.

Uh...the A5X's GPU is a lot faster than T3's GPU.

When did I say you did? That's right, I didn't.

You just asked me if I'm denying if the screen causes the most battery drain.

I'm telling you that I never said that it didn't.

The majority of Android devices have terrible battery life. Never confuse me with your kind. If something sucks I call it out; it doesn't matter who it is. I am not the lapdog of any corporation.

So the iPhone doesn't have good battery life, it's just that the majority of Android phones have terrible battery life? Lol, ok.
 

djgandy

Member
Nov 2, 2012
78
0
0
Ah, what's the point. Apparently MWC is a launch for T4 despite no availability of products. Apparently Tegra 3 was available six months before the Transformer Prime, despite both being launched in Q4 2011.

Also the memory interface question was a trap, but I knew you'd fall for it. Don't you think that Tegra will be slightly starved of bandwidth for intense workloads? 4 CPU cores + a fairly primitive IMR GPU?
 

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
Uh....the A5X came out after Tegra 3 for starters.

We all know that. Are you really not following? T4 is coming out after the A6X, and I'm saying that if its GPU is only close to the A6X, then it's not impressive.

Want to compare the top cell phones over the last ten years and plot where the iPhone is? I'm thinking it wouldn't make the top 100.

That makes absolutely no sense. When you do comparisons, you compare the same class, in this case, SMARTPHONES. If you want to talk about dumb phones, make a thread about how it has better battery life than smartphones. I'm sure people won't laugh at ya.
 

djgandy

Member
Nov 2, 2012
78
0
0
When the Transformer Prime came out, the iPad 2 didn't exist. The Infinity is the T3 tablet that launched after the iP2.

March 2011: http://en.wikipedia.org/wiki/IPad_2
Dec 2011: http://en.wikipedia.org/wiki/Asus_Eee_Pad_Transformer_Prime

It's just too easy. And apparently it is everyone else who is in an RDF.


5.3 GB/sec-6.4 GB/sec versus ~12 GB/sec-~25 GB/sec. That is the generational difference in bandwidth between T3 and T4. Is it possible T4 could still be bandwidth limited under certain situations? Of course. Does it have more than twice the bandwidth of the previous generation? Yep. It is trivial to make a 7970GE bandwidth limited with an order of magnitude more to play with. These are SoCs we are talking about. The upper limit is ~300% faster than the T3 was. If T3 was 100% bandwidth limited, then we should expect a 300% performance increase for T4; I don't think too many people would consider that poor.

Well, that's all lovely. Let's come back to the real world, where T4 has dual 32-bit LPDDR3 @ 800 MHz, i.e. 12.8 GB/s. How is it going to feed all those ALUs? I guess at least they have cut bandwidth requirements by ~35% by chopping all the precision from the pixel shaders, haha.
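As a sanity check on that 12.8 GB/s figure, peak bandwidth for a dual-channel 32-bit LPDDR3 interface at 800 MHz can be worked out directly. These are the thread's rumored specs, not anything Nvidia has confirmed:

```python
# Peak memory bandwidth sketch for the rumored Tegra 4 interface:
# dual-channel, 32 bits per channel, LPDDR3 at 800 MHz (DDR, so
# 1600 MT/s effective). Figures are the thread's rumors, not
# confirmed specs.

def peak_bandwidth_gb_s(channels: int, bus_bits: int, clock_mhz: float) -> float:
    """Theoretical peak bandwidth in GB/s for a DDR memory interface."""
    transfers_per_sec = clock_mhz * 1e6 * 2       # DDR: two transfers per clock
    bytes_per_transfer = channels * bus_bits / 8  # total bus width in bytes
    return transfers_per_sec * bytes_per_transfer / 1e9

print(peak_bandwidth_gb_s(channels=2, bus_bits=32, clock_mhz=800))  # → 12.8
```

The same formula shows the two obvious levers: widening the bus or ramping the clock (e.g. 1066 MHz LPDDR3 would give ~17 GB/s), each at a power cost.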
 
Last edited:

runawayprisoner

Platinum Member
Apr 2, 2008
2,496
0
76
MrX, do you still think the SoC in the iPhone 5 is not their A15 design?

A15 is more power-hungry than Apple's Swift cores. Apple made a custom core that's more akin to Qualcomm's Snapdragon.

Thus the SoC in the iPhone 5 is not A15. It's just faster than whatever they had in there before.

But even going by preliminary benchmarks (of early T4 samples), it looks like A6 in the iPhone 5 is still about on par with T4 at least with graphics performance. Everything else be damned until we get more coverage.
 

grkM3

Golden Member
Jul 29, 2011
1,407
0
0
Waiting to see how T4 does against the Exynos 5, as it's said Samsung's 28nm will consume 70% less power than the Exynos 5 dual-core in the Nexus 10.

It's also getting a PowerVR GPU clocked at 533 MHz vs. the iPhone 5's 266 MHz.

If anything, the Exynos octa-core is the SoC of 2013 to get.
 
Last edited:

lothar

Diamond Member
Jan 5, 2000
6,674
7
76
Waiting to see how T4 does against the Exynos 5, as it's said Samsung's 28nm will consume 70% less power than the Exynos 5 dual-core in the Nexus 10.

It's also getting a PowerVR GPU clocked at 533 MHz vs. the iPhone 5's 266 MHz.

If anything, the Exynos octa-core is the SoC of 2013 to get.
Way too early to speculate for the entire year of 2013.
My personal guess is that if Samsung doesn't launch Mali T658 with Exynos, they'll be behind somebody.
Don't know if that somebody will be Nvidia, Qualcomm, or both.

Of course if Mali T658 isn't ready on time for the Galaxy S IV launch, Samsung may "partially" redeem themselves by launching it with the Galaxy Note III.

Where are we on this chart?
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71
Way too early to speculate for the entire year of 2013.
My personal guess is that if Samsung doesn't launch Mali T658 with Exynos, they'll be behind somebody.
Don't know if that somebody will be Nvidia, Qualcomm, or both.

Of course if Mali T658 isn't ready on time for the Galaxy S IV launch, Samsung may "partially" redeem themselves by launching it with the Galaxy Note III.

Where are we on this chart?

The chart isn't really accurate anymore. We've already got quad-core A7 chips, which weren't supposed to come out until next year, and the 4xA15/4xA7 doesn't even exist.

Thing is, you can stick as many cores as you want on a chip (within reason) because die area is largely a function of cost, not chip design.
 

lothar

Diamond Member
Jan 5, 2000
6,674
7
76
The chart isn't really accurate anymore. We've already got quad-core A7 chips, which weren't supposed to come out until next year, and the 4xA15/4xA7 doesn't even exist.

Thing is, you can stick as many cores as you want on a chip (within reason) because die area is largely a function of cost, not chip design.
Which quad-core A7 chips did we get? If you're referring to the one in the Galaxy Note II and the international Galaxy S III, I think that is a quad-core A9, not A7.

Isn't 4xA15/4xA7 Samsung's new rumored Exynos Octacore SoC?
If so, then it does exist in some lab somewhere out there and will probably be launched in Korea sometime in April.
 

djgandy

Member
Nov 2, 2012
78
0
0
Way too early to speculate for the entire year of 2013.
My personal guess is that if Samsung doesn't launch Mali T658 with Exynos, they'll be behind somebody.
Don't know if that somebody will be Nvidia, Qualcomm, or both.

Of course if Mali T658 isn't ready on time for the Galaxy S IV launch, Samsung may "partially" redeem themselves by launching it with the Galaxy Note III.

Where are we on this chart?

I think that chart is dead. http://www.arm.com/products/multimedia/mali-graphics-plus-gpu-compute/index.php

Also, it is kind of odd that they thought the Mali-T658 would still be going in 2016. You'd generally assume next gen by that point.
 

djgandy

Member
Nov 2, 2012
78
0
0
So they have a new revision of Tegra 4 that removed not only the low-end LPDDR2 support, but also support for DDR3L, and furthermore have it all set at a fixed frequency? Got any links to that? That is actually fairly major news, that Tegra 4 has your limitation; I'm sure the major tech sites would like to share that information with everyone, as it is kind of a big deal.

You think they'll spin another SoC with a DDR2 memory interface? Doesn't that defeat the purpose of creating a next-generation chip? Maybe they will in a year's time, but we're talking about what is being launched initially here. Who said they dropped DDR3L?


I think it is a safe bet that most devices are going to use LPDDR3; nothing I have ever seen indicates that they are all going to operate at precisely the clock speed you have come up with.

I don't think that clock speed is the maximum memory clock speed for all eternity, but the launch devices will be at those clocks. Let's focus on what is going to be launched and the product that was announced, not what might exist in five years' time, eh? LPDDR3 is in its infancy; do you think we are ready to ramp clocks already? The power penalty of running at 1066 MHz would be huge.




Yeah, because there was a good chance someone was going to be using FP32 shaders on a mobile device anytime soon, heh. Even using the specs we are all waiting to see the source of, T4 worst case has double the bandwidth of T3. You speak as if T4 is going to be entirely bandwidth limited, which, if true, would mean T3 clearly was too, indicating we should see a direct 100% performance improvement if we believe your numbers are the final word on clock rate.

This is Nvidia's latest chip; you'd think they could have at least gone FP24. The latest chip won't even support GLES 3.0? Compute is out of the question too, although that is not really a biggie.

How does T4 being bandwidth limited imply that T3 was? T3 has 12 ALUs vs 72 for T4; I am struggling to follow your logic here. Twice the bandwidth, six times as many ALUs: T3 has 3x the bandwidth per ALU.
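The per-ALU arithmetic is easy to check. The ALU counts are from this thread, and the bandwidth figures assume the rumored (not official) ~6.4 GB/s for T3 and ~12.8 GB/s for T4:

```python
# Bandwidth-per-ALU comparison using the figures from this thread
# (rumored, not official): T3 at ~6.4 GB/s with 12 shader ALUs,
# T4 at ~12.8 GB/s with 72 shader ALUs.

t3_bw, t3_alus = 6.4, 12   # GB/s, ALU count
t4_bw, t4_alus = 12.8, 72

per_alu_t3 = t3_bw / t3_alus  # ≈ 0.533 GB/s per ALU
per_alu_t4 = t4_bw / t4_alus  # ≈ 0.178 GB/s per ALU

# T4 doubles total bandwidth but sextuples the ALU count, so each
# T3 ALU had three times the bandwidth a T4 ALU gets.
print(round(per_alu_t3 / per_alu_t4, 2))  # → 3.0
```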
 

MrX8503

Diamond Member
Oct 23, 2005
4,529
0
0
WHAT PERCENTAGE?

I keep asking, you keep doing an Apple. Say it now, get on the record and show for once you aren't the lapdog of a company. Name the percentage for you not to consider it old news.

You need a percentage to determine the difference between marginal and significant?

But not the Maxx, because, you know, they designed it to get long battery life.....

Do you want to include 10 years of phones or the Razr Maxx to prove that the iPhone has sub-optimal battery life? Seems like you want it both ways.
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71