Tensor Processing Units (TPUs) for Consumers: The Next Big Thing?

Page 3

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
A thought:

If they can use deep learning for AA, wouldn't it be possible to use deep learning for raytracing? Render the game offline (not real-time) with high ray settings, then train a DL model on that output and use it to do "pseudo-raytracing" in real-time.
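A rough sketch of what that training loop might look like, assuming you can dump matched pairs of rasterized and offline-raytraced frames from the engine; the model, names, and random toy tensors below are all hypothetical stand-ins:

```python
# Illustrative sketch: train an image-to-image network offline on pairs of
# (fast rasterized frame, high-quality raytraced frame), then run only the
# cheap network at playtime.
import torch
import torch.nn as nn

class PseudoRaytracer(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, rasterized):
        # Predict a residual correction on top of the cheap rasterized frame.
        return rasterized + self.net(rasterized)

model = PseudoRaytracer()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Hypothetical stand-ins for frame pairs dumped from an offline,
# high-ray-count render of the same scenes.
rasterized = torch.rand(8, 3, 64, 64)
raytraced = torch.rand(8, 3, 64, 64)

for step in range(100):
    loss = nn.functional.mse_loss(model(rasterized), raytraced)
    opt.zero_grad()
    loss.backward()
    opt.step()

# At playtime you would run only the (cheap) forward pass on each frame.
```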
 

mv2devnull

Golden Member
Apr 13, 2010
1,503
145
106
"Deep learning for AA": you give a function a 2D image and it gives you a 2D image.

Traditional rendering: You give a function objects, materials and lights of a scene and you get a 2D image.
Raytracing: You give a function objects, materials and lights of a scene and you get a 2D image.

Do you think that a DL function that computes a 2D image from objects, materials and lights of a scene is significantly simpler than either of the handcrafted functions?
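To make the comparison concrete, here is a minimal sketch of the three signatures in question (the Scene/Image types and function names are illustrative, not any real API):

```python
# DL-based AA maps image -> image; any renderer, learned or not,
# must map scene -> image, which is a much harder function to learn.
from dataclasses import dataclass
import numpy as np

@dataclass
class Scene:
    objects: list    # geometry
    materials: list  # surface properties
    lights: list     # light sources

def antialias_dl(image: np.ndarray) -> np.ndarray:
    """Image in, image out: the shape of the problem DL-based AA solves."""
    ...

def rasterize(scene: Scene) -> np.ndarray:
    """Scene in, image out: traditional rendering."""
    ...

def raytrace(scene: Scene) -> np.ndarray:
    """Scene in, image out: raytracing has the same, harder signature."""
    ...
```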
 

ZeroRift

Member
Apr 13, 2005
195
6
81
It might be possible to use deep learning to optimize RT in a similar way as it is used in AA.

For instance, you could train the driver to dynamically adjust the number of rays / bounces that need to be cast to light a given scene based on data generated from rendering similar scenes in the past. In a sense, this would be almost like the pre-rendered shadows of the olden days, in that it would use historical data to accelerate a real-time projection.
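A minimal sketch of that idea, under big assumptions: the scene features, the logged "correct" ray budgets, and the plain least-squares fit below are all hypothetical stand-ins for whatever a driver would actually log and learn:

```python
# A tiny regressor that maps a handful of scene features to a ray budget,
# fit on historical renders where the needed budget is already known.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical data: [light_count, avg_roughness, depth_complexity]
features = rng.random((500, 3))
# Stand-in ground truth: the ray count that was actually needed per scene.
ray_budget = 64 + 512 * features @ np.array([0.5, 0.2, 0.3])

# Ordinary least squares as the simplest possible "learned" predictor.
X = np.hstack([features, np.ones((500, 1))])
coef, *_ = np.linalg.lstsq(X, ray_budget, rcond=None)

def predict_rays(light_count, avg_roughness, depth_complexity):
    x = np.array([light_count, avg_roughness, depth_complexity, 1.0])
    # Clamp so the renderer always casts at least a minimum number of rays.
    return max(16, int(x @ coef))

print(predict_rays(0.8, 0.3, 0.5))
```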
 

Geegeeoh

Member
Oct 16, 2011
145
126
116
Yes, it can do wonders, but lately it's like "quantum physics explains everything"... a bit misrepresented.
You can't just throw it at any problem.
 

ZeroRift

Member
Apr 13, 2005
195
6
81
But it is an interesting new tool in the toolbox, which often leads to novel solutions.
Indeed.

Fundamentally, "deep learning" is just a technique for training neural networks, one that hardware like tensor cores happens to accelerate. Strictly speaking, it doesn't address any problem beyond that.

In computer science, there is always a trade-off between memory and compute when solving a given problem. They usually don't trade off evenly, which is where the majority of code optimization comes from (does this workload scale better with memory or with compute?).
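As a minimal illustration of that trade-off (the function being computed is arbitrary), a cached version spends memory on a table of past results to avoid redoing the arithmetic:

```python
# Memory-vs-compute trade-off in miniature.
from functools import lru_cache

def slow(n):
    # Pure compute: O(n) work on every call, no extra memory.
    return sum(i * i for i in range(n))

@lru_cache(maxsize=None)
def cached(n):
    # Spends memory (one table entry per distinct n) to make repeats O(1).
    return sum(i * i for i in range(n))

cached(1_000_000)  # first call pays the full compute cost
cached(1_000_000)  # repeat call is just a table lookup
```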

What makes AI compute interesting is that it's like a hybrid solution. It uses massive amounts of memory, in the form of historical data, to optimize compute-unit cycles rather than to replace them. In theory, that approach can be used to optimize compute for nearly any problem for which you have a large quantity of historical data.

Circling back to the OP a bit:
Do I think NVIDIA will keep the use of tensor cores on GeForce GPUs proprietary?
Not intentionally, though uptake will always be a function of available hardware. Tensor cores, like any compute unit, will run any code you feed them, meaning that third parties will always be able to provide their own training data/code for the cores to leverage. This will most likely start out as a few game devs providing training data for their specific titles, but could one day even be crowd-sourced by gamers. I can see AMD finding their niche there pretty easily once they get their legs under them, providing an open method by which all AI compute units can be optimized / "trained."

Do I think tensor cores (or more broadly, AI compute units) will become widely popular / ubiquitous?
Yes - the idea of tensor cores is essentially to run suitable compute workloads on smaller, faster compute units. This provides an advantage for any problem that can leverage machine learning, with the only limiting factor being whether it makes cost/benefit sense to implement for a given use case.
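For concreteness, here is what a single tensor-core operation computes, emulated in NumPy; the 4x4 tile and fp16-in/fp32-accumulate behavior match Volta-era tensor cores, but everything else is illustrative rather than how the hardware is actually programmed:

```python
# A tensor core performs a fused D = A @ B + C on a small matrix tile,
# with low-precision inputs and higher-precision accumulation.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)).astype(np.float16)
B = rng.standard_normal((4, 4)).astype(np.float16)
C = rng.standard_normal((4, 4)).astype(np.float32)

# Inputs are fp16, but the multiply-accumulate runs at fp32 precision.
D = A.astype(np.float32) @ B.astype(np.float32) + C
```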

Like RT, I don't think we'll see much immediate uptake, but it seems likely that most (if not all) types of data processing will one day incorporate some kind of AI-optimized feature.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Has DX12 even paid off yet? I’m asking because the last I heard is that most NVidia cards perform better on DX11. Does that still hold?
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
In the BFV beta this weekend, DX11 performed better for my 1080 Ti. It could be the beta status, but more likely Nvidia still just performs best on DX11. It's the same in Total War: Warhammer II for me.
 

beginner99

Diamond Member
Jun 2, 2009
5,223
1,598
136
Has DX12 even paid off yet? I’m asking because the last I heard is that most NVidia cards perform better on DX11. Does that still hold?

I'm not sure if it still holds, but the real problem is that we have yet to see a pure DX12 game. Available DX12 games just have it tacked on, but deep down they are still DX11. This also applies to the game engines themselves, and yes, it applies to AotS as well. As long as a game needs to run on DX11, that requirement influences its whole design and architecture.

Simply put: no, it hasn't paid off yet. You might get a bit more fps in some cases, especially with an AMD GPU, but that's about it. I'm also not sure why this didn't take off, as consoles are already programmed closer to the metal. The only company benefiting from the DX12 delay, and hence DX11 sticking around, was NV...
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
TPUs are already a thing, and there is a slew of new corporations with products in the pipeline. Both AMD and Nvidia are going to face headwinds in this area, and it will be great for consumers. Google has spun their own. NEC is spinning their own, and a slew of startups have chips heading to tape-out or doing sampling. FPGAs are back in vogue, as are completely new architectural approaches. We're in the age of hardware. Just because a company ruled the previous age doesn't mean it will continue to do so. Fresh blood, new ideas, and competition are what drive technology forward, and thank god they're coming.
 