But it is an interesting new tool in the toolbox, which often leads to novel solutions.
Indeed.
Fundamentally, "Deep Learning" is just a training technique for AI compute units (in this case, tensor cores). Strictly speaking, it doesn't address any problem beyond that.
In computer science, there is always a trade-off between memory and compute when solving a given problem. They usually don't trade off evenly, which is where the majority of code optimization comes from (does this workload scale better with memory or with compute?).
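To make that trade-off concrete, here's a minimal sketch (the popcount example and names are purely illustrative, not anything from the original discussion) of the same function written both ways, one burning cycles and one burning memory:

    from functools import lru_cache

    # Compute-heavy version: recomputes the answer on every call,
    # using essentially no memory.
    def popcount_compute(x: int) -> int:
        return bin(x).count("1")

    # Memory-heavy version: trades RAM for cycles by caching every
    # result it has seen (lru_cache standing in for a precomputed
    # lookup table).
    @lru_cache(maxsize=None)
    def popcount_cached(x: int) -> int:
        return bin(x).count("1")

    # Which one wins depends on the workload: a hot loop hitting the
    # same inputs over and over favors the cache; a stream of unique
    # inputs favors the cheap recompute and the smaller footprint.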
What makes AI compute interesting is that it's like a hybrid solution. It uses massive amounts of memory, in the form of historical data, to optimize the compute unit cycles rather than to replace them. In theory, that approach can be used to optimize compute for nearly any problem for which you have a large quantity of historical data.
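A toy sketch of that idea (the function and numbers are made up for illustration, and a plain polynomial fit stands in for the neural net you'd actually train):

    import numpy as np

    # Stand-in for some expensive routine we'd like to spend fewer
    # cycles on.
    def expensive_step(x):
        return np.sin(x) * np.exp(-0.1 * x)

    # "Historical data": inputs and outputs logged from past runs.
    xs = np.linspace(0.0, 10.0, 1000)
    ys = expensive_step(xs)

    # Spend memory (the logged samples) plus a one-off training pass
    # to build a cheap surrogate model of the expensive routine.
    surrogate = np.polynomial.Polynomial.fit(xs, ys, deg=12)

    # At run time the surrogate costs a handful of multiply-adds
    # instead of the original transcendental math, at the price of a
    # bounded approximation error.
    print("max approximation error:", np.max(np.abs(surrogate(xs) - ys)))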
Circling back to the OP a bit:
Do I think NVIDIA will keep the use of tensor cores on GeForce GPUs proprietary?
Not intentionally, though uptake will always be a function of available hardware. Tensor cores, like any compute unit, will run any code you feed them, meaning that third parties will always be able to provide their own training data/code for the cores to leverage. This will most likely start out as a few game devs providing training data for their specific titles, but it could one day even be crowd-sourced by gamers. I can see AMD finding their niche in there pretty easily once they get their legs under them, providing an open method by which all AI compute units can be optimized / "trained."
Do I think tensor cores (or more broadly, AI compute units) will become widely popular / ubiquitous?
Yes - the idea behind tensor cores is essentially to handle arbitrary compute workloads with smaller, faster compute units. This provides an advantage for any problem that can leverage machine learning, with the only limiting factor being whether it makes cost/benefit sense to implement for a given use case.
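For a rough feel of what "smaller/faster" costs in accuracy, here's a throwaway sketch comparing a full-precision matrix multiply with the same multiply done in half precision (numpy on the CPU, standing in for actual tensor-core hardware):

    import numpy as np

    rng = np.random.default_rng(0)
    a = rng.standard_normal((256, 256)).astype(np.float32)
    b = rng.standard_normal((256, 256)).astype(np.float32)

    # Full-precision reference result.
    exact = a @ b

    # Same multiply in half precision, the kind of narrow format tensor
    # cores are built around: each unit is smaller, so more fit on the
    # die and each op is cheaper, at the cost of some relative error.
    approx = (a.astype(np.float16) @ b.astype(np.float16)).astype(np.float32)

    rel_err = np.abs(approx - exact).max() / np.abs(exact).max()
    print(f"worst-case relative error from half precision: {rel_err:.4f}")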
As with RT, I don't think we'll see much immediate uptake, but it seems likely that most (if not all) types of data processing will one day incorporate some kind of AI-optimized feature.