Nvidia Tegra K1

Page 5 - AnandTech Forums

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
No. According to the whitepaper, the average power consumed by the Kepler.M GPU with some of today's best Android games is <2 W. This makes sense because Kepler.M easily handles most Android games today. Pushed to peak levels, Kepler.M really cannot consume more than ~4 W and still fit in a tablet form factor.

5 W is the rated TDP for the entire TK1 SoC (same as T4).

The perf. per watt (and hence perf.) of the Kepler.M GPU is well beyond that of any high-end ultra-mobile GPU used today.

Ummm, doesn't a quad core A15 processor have a TDP of 5-10w by itself? I've got no clue how Nvidia could fit FOUR A15 cores into a 1w power envelope. It's more likely to be somewhere around 10-15w TDP with quad A15's, that's not bad at all for a tablet. For a phone, they'd have to cut the clocks and perhaps end up at around 7-10?
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
nVidia spent more money than AMD on their GPU tech. Kepler is the first architecture where they really focused on efficiency, and Kepler.M is an evolution of that concept. nVidia has started to develop new architectures with mobile in mind; Maxwell will be the first architecture of this new approach.

It is possible to deliver more performance with less power when you focus on this.

I'm just skeptical of something that promises to be 3x+ more efficient than Kabini's GCN.
 

ams23

Senior member
Feb 18, 2013
907
0
0
Ummm, doesn't a quad core A15 processor have a TDP of 5-10w by itself? I've got no clue how Nvidia could fit FOUR A15 cores into a 1w power envelope. It's more likely to be somewhere around 10-15w TDP with quad A15's, that's not bad at all for a tablet. For a phone, they'd have to cut the clocks and perhaps end up at around 7-10?

That's not how it works. TDP for an SoC needs to be split between CPU/GPU/mem/etc. Depending on CPU and GPU utilization percentages, power consumption will vary, but on average the total dissipated power can be close to 5w in total with CPU and/or GPU intensive apps for an SoC such as T4 or TK1. Peak power is subject to go a bit higher of course.

The total power consumed in a handheld device is a function of the SoC, the screen, and anything else that draws power. IIRC, something like an iPad 4 has a peak (not sustained) power consumption of ~12 W in total for the whole system. A Tegra 4 high-res tablet would be similar in consumption.
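The whole-device arithmetic above can be sketched out; the SoC and ~12 W figures are the ones quoted in this thread, while the screen and "everything else" numbers are purely illustrative assumptions:

```python
# Rough whole-device peak power budget. The SoC TDP and ~12 W total come from
# the figures quoted in this thread; screen and "other" are assumed values.
soc_tdp_w = 5.0   # entire TK1 SoC (CPU + GPU + memory interface, etc.)
screen_w = 6.0    # assumed: high-res tablet panel at full brightness
other_w = 1.0     # assumed: radios, storage, sensors, conversion losses

peak_device_w = soc_tdp_w + screen_w + other_w
print(f"Peak device power: ~{peak_device_w:.0f} W")
```

With those assumed panel numbers the total lands right around the ~12 W iPad 4 ballpark mentioned above, which is the point: the SoC is well under half of the peak system draw.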
 
Last edited:

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
That's not how it works. TDP for an SoC needs to be split between CPU/GPU/mem/etc. Depending on CPU and GPU utilization, power allocation will vary, but on average the total dissipated power can be close to 5w in total with CPU and/or GPU intensive apps. Peak power is subject to go a bit higher of course.

The total power consumed in a handheld device is a function of both SoC and screen power. IIRC, something like an iPad 4 has a peak (not sustained) power consumption of ~12 W in total for the whole system.

TDP should be the maximum power, not "peak" per se but the maximum sustained (i.e. longer than a few seconds) load. I'm skeptical of the TK1 being able to run a graphically intense game at anything close to max clockspeed: even at <2 W for the GPU, that leaves ~2 W for FOUR A15 cores and ~1 W for the rest of the SoC. 2 W for four A15 cores? That requires some serious throttling/clockspeed reduction.

Sure, it could pull 5 W with a reasonable (i.e. not intense) load, but at maximum power draw? 5 W is going to be really tough to believe without lots of throttling.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I'm just skeptical of something that promises to be 3x+ more efficient than Kabini's GCN.

Why? We know that, on the same node, efficiency comes from the architecture. And we know that nVidia has money to spend on R&D.

For me it's clear that AMD is not even trying to improve their perf/watt because for that they need money. Kabini shows it: Kabini is alright, Temash was DoA.
 
Last edited:

ams23

Senior member
Feb 18, 2013
907
0
0
TDP should be the maximum power, not "peak" per se but the maximum sustained (i.e. longer than a few seconds) load. I'm skeptical of the TK1 being able to run a graphically intense game at anything close to max clockspeed: even at <2 W for the GPU, that leaves ~2 W for FOUR A15 cores and ~1 W for the rest of the SoC. 2 W for four A15 cores? That requires some serious throttling/clockspeed reduction.

Sure, it could pull 5 W with a reasonable (i.e. not intense) load, but at maximum power draw? 5 W is going to be really tough to believe without lots of throttling.

It is very unlikely that any modern day Android game will be pegging all four A15 CPU cores (let alone pegging two in fact). So in most games, GPU utilization % will be much higher than CPU utilization %. And with the vast majority of Android games, Kepler.M will not come close to being fully utilized either. Last but not least, the R3 variant of Cortex A15 on 28nm HPM used in Tegra K1 has superior power efficiency compared to the Cortex A15 variant in Tegra 4.

Once again, 5w is the TDP for the entire SoC. For CPU-intensive apps, most of the power is allocated to the CPU (and vice-versa with the GPU). Any scenario where both CPU and GPU are pegged at the same time is pretty unrealistic. It would also be pretty unrealistic to expect Kepler.M to have the same clock operating frequencies in a device like Shield compared to a tablet or phone.
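The shared-TDP behavior described here can be modeled as a toy allocator; the proportional-scaling rule and the per-unit peak figures below are my own assumptions for illustration, not how Tegra's power management actually works:

```python
def allocate_power(tdp_w, cpu_util, gpu_util, cpu_max_w=3.5, gpu_max_w=4.0):
    """Split a shared SoC TDP between CPU and GPU in proportion to demand.

    cpu_util/gpu_util are 0.0-1.0 utilization fractions; the *_max_w
    figures are assumed unconstrained peak draws, not published numbers.
    """
    cpu_demand = cpu_util * cpu_max_w
    gpu_demand = gpu_util * gpu_max_w
    total = cpu_demand + gpu_demand
    if total <= tdp_w:
        return cpu_demand, gpu_demand  # both fit: no throttling needed
    scale = tdp_w / total              # over budget: scale both to the cap
    return cpu_demand * scale, gpu_demand * scale

# GPU-heavy game with light CPU load fits inside the 5 W cap untouched.
print(allocate_power(5.0, cpu_util=0.3, gpu_util=0.9))

# Both pegged at once (the "pretty unrealistic" case): shares are
# throttled so that they sum to exactly the 5 W TDP.
cpu_w, gpu_w = allocate_power(5.0, cpu_util=1.0, gpu_util=1.0)
print(round(cpu_w + gpu_w, 3))
```

The model captures the argument: as long as real games don't peg CPU and GPU simultaneously, a 5 W cap never bites, and only the unrealistic both-pegged case forces throttling.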
 
Last edited:

ams23

Senior member
Feb 18, 2013
907
0
0
Why? We know that, on the same node, efficiency comes from the architecture. And we know that nVidia has money to spend on R&D.

For me it's clear that AMD is not even trying to improve their perf/watt because for that they need money. Kabini shows it: Kabini is alright, Temash was DoA.

Even though Tegra has had growing pains, just as Atom has had growing pains, investing many years ago in ultra-mobile technology was the right thing for NVIDIA and Intel to do. AMD was (and is) more financially strapped in comparison, and invested in "semi-custom" APU's instead. It is what it is, and these decisions will translate directly into products and technology we see from these companies in the near future.
 

NTMBK

Lifer
Nov 14, 2011
10,322
5,352
136
Even though Tegra has had growing pains, just as Atom has had growing pains, investing many years ago in ultra-mobile technology was the right thing for NVIDIA and Intel to do. AMD was (and is) more financially strapped in comparison, and invested in "semi-custom" APU's instead. It is what it is, and these decisions will translate directly into products and technology we see from these companies in the near future.

Here's hoping. They need to turn around the Tegra line, fast, and actually start making a profit on it.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
^Tegra K1 is very promising. Maybe the comparison with the consoles was meant to spark the Ouyas of the world to rise up and deliver console performance for a relatively low price.
 

NTMBK

Lifer
Nov 14, 2011
10,322
5,352
136
^Tegra K1 is very promising. Maybe the comparison with the consoles was meant to spark the Ouyas of the world to rise up and deliver console performance for a relatively low price.

I think that seeing the roaring success of the Ouya may put them off rather more than NVidia's performance promises.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
It is very unlikely that any modern day Android game will be pegging all four A15 CPU cores (let alone pegging two in fact). So in most games, GPU utilization % will be much higher than CPU utilization %. And with the vast majority of Android games, Kepler.M will not come close to being fully utilized either. Last but not least, the R3 variant of Cortex A15 on 28nm HPM used in Tegra K1 has superior power efficiency compared to the Cortex A15 variant in Tegra 4.

Once again, 5w is the TDP for the entire SoC. For CPU-intensive apps, most of the power is allocated to the CPU (and vice-versa with the GPU). Any scenario where both CPU and GPU are pegged at the same time is pretty unrealistic. It would also be pretty unrealistic to expect Kepler.M to have the same clock operating frequencies in a device like Shield compared to a tablet or phone.

So in other words, Intel's SDP is completely valid, because that is what everyone in the mobile space is doing.
 

jpiniero

Lifer
Oct 1, 2010
15,176
5,717
136
Maybe not a console, but I'd like to see nVidia do a Denver NUC-like device (at a price that might actually move units)
 

NTMBK

Lifer
Nov 14, 2011
10,322
5,352
136
Maybe not a console, but I'd like to see nVidia do a Denver NUC-like device (at a price that might actually move units)

But what OS would it run? Android isn't made for keyboard and mouse, Windows RT is a complete disaster, and Ubuntu doesn't sell consumer boxes. Only real option is Chrome OS.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,770
775
136
But what OS would it run? Android isn't made for keyboard and mouse, Windows RT is a complete disaster, and Ubuntu doesn't sell consumer boxes. Only real option is Chrome OS.

The limitations of Chrome OS make it possibly the worst choice; AOSP with a custom UI would be better. Considering Denver K1's purported code morphing, if nVidia could do x86 that way (currently blocked by Intel, I think), SteamOS would be an option.
 

NTMBK

Lifer
Nov 14, 2011
10,322
5,352
136
The limitations of Chrome OS make it possibly the worst choice; AOSP with a custom UI would be better. Considering Denver K1's purported code morphing, if nVidia could do x86 that way (currently blocked by Intel, I think), SteamOS would be an option.

Yeah, if it was x86 then it would have a lot more potential- but I think that Intel have got that route too locked down.
 

lopri

Elite Member
Jul 27, 2002
13,221
612
126

Go check out AT's bench results for the past 2~3 years in mobile devices. They are a mess (even after giving the benefit of the doubt for things like updated OSes). I won't speculate why that is so.

P.S. What happened to Tegra 4i? Did no "journalists" have the curiosity to ask about it? :biggrin:
 

lopri

Elite Member
Jul 27, 2002
13,221
612
126
Short memory span of the collective tech community + lack of an ombudsman in this industry = corporations' propaganda dwarfing critical consumer voices at every turn.

Sad but true.
 