Then tell me: why is maximizing short-term profits a bad business model for Nvidia when they have no competition?
Nvidia should have been maximizing profit for the two years there was no challenge to Pascal. During that period they posted record profits in the gaming segment, plus the windfall from memecoin mining. Prior to that, they were already the segment leader with a dominant brand. They correctly expanded into general-purpose computing with CUDA and cemented a large new segment of business, then made a sound push into the embedded market. They milked the snot out of these markets with little to no competition, expanded brand awareness and supremacy into new markets, and have been riding on a cloud of properly executed business decisions.

Now they face competition in every one of those segments. For longer-term success, the sensible move would be to retract the price premiums and push their proprietary standards so as to solidify market capture before the competition delivers its product. Instead, they took the sophomoric approach of continuing to milk all of their markets, including an exhausted cash cow, with an unproven technology. The result is what you're seeing: broad-based market rejection, damage to the brand, and negative sentiment in their primary core market that favors their competitors. If those competitors respond correctly, they can easily take share in Nvidia's core market and then mount fresh pressure against its unproven growth markets with the momentum they've built. This is a formulaic, classic business miscalculation with severe consequences: never screw your primary market before you have solidified volume in your sought-after growth markets.
What made the others fail was lack of innovation, which allowed a competitor to leapfrog them. Nvidia is not doing that, so what do you think it's doing wrong here?
What others? There was only ATI/Radeon, which has been languishing but has in no way, shape, or form failed. There are console markets, secondary relationships, and a slew of other pro markets that are still up in the air. On top of that, Nvidia has been creating quite the public stink with its greedy maneuvers while not delivering anything new over the past two years...
Have you forgotten about :
https://www.pcworld.com/article/326...ics/nvidia-kills-geforce-partner-program.html
^They had a pipeline of pure greed maneuvers that the market has been signaling it's tired of.
So, who's failed at innovation? Vega 56/64 can actually best a number of Pascal cards in gaming performance and compute. They come equipped with HBM2 and a slew of innovative features. The only bad thing about them is the dev tools/support/drivers, which have improved with time. The big difference is that theirs are open source and non-proprietary, while Nvidia's are aggressively proprietary. Guess who wins in the long term? Open standards. So, in every respect, Nvidia is making a broad and arrogant miscalculation that will carry consequences in the coming years. They milked for too long and slacked off like Intel did.

Ray tracing already occurs in the traditional graphics pipeline via emulation; Nvidia's ray-trace cores only handle a segment of the ray-tracing process. It's a beta feature at most. DLSS is a meme for AA, needed to hide the flaws in their ray-tracing cores. The real thing causing the performance bump is die shrink and cache reconfiguration...

Where's the innovation?
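To ground the point that ray tracing is ordinary math any shader or compute unit can already run: the core primitive of a ray tracer is a ray-triangle intersection test, shown here as a plain-Python sketch of the well-known Möller-Trumbore algorithm (the function name and layout are mine, purely illustrative; a GPU would run the same arithmetic in a compute shader, no dedicated RT core required):

```python
def ray_triangle_intersect(orig, dirn, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray-triangle test on 3-tuples.
    Returns distance t along the ray, or None on a miss."""
    sub = lambda a, b: (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    dot = lambda a, b: a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    cross = lambda a, b: (a[1]*b[2] - a[2]*b[1],
                          a[2]*b[0] - a[0]*b[2],
                          a[0]*b[1] - a[1]*b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)   # triangle edge vectors
    pvec = cross(dirn, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:                  # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv           # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(dirn, qvec) * inv           # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv             # hit distance along the ray
    return t if t > eps else None
```

A ray fired from (0, 0, -1) straight down +z at a triangle in the z = 0 plane hits at t = 1; flip the ray direction and it misses. What RTX hardware accelerates is this test plus BVH traversal; everything else (shading, denoising) still runs on the ordinary pipeline.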
AMD meanwhile brought HBM2 to a desktop GPU and has double-rate FP16, something Nvidia kept segmented to its pro market with no intention of bringing to consumers. Meanwhile, you get a teaser helping of ray-trace cores and tensor cores, which are nothing but matrix math units, bolted onto a retuned Pascal (an SM architecture Nvidia has been sitting on for years). So, this is a mistake from a business perspective, and excuse me for not praising the introduction of meme cores at Quadro pricing.

I've personally cancelled my order of a GeForce 20-series eval card, will not be looking for ways to exploit ray-trace cores or tensor cores for speedups, and have instructed further development to focus on speedups in standardized FP16 compute. I was interested in how tensor cores and NVLink would shape up for the consumer market; my assessment is that others will catalyze and standardize a more open and cheaper linkage and acceleration pipeline, so I will not be investing in a proprietary stack that will not win out in the long run. In the coming years, Vulkan should be sound enough that a wealth of pluggable hardware will fill the gap Nvidia is leaving open. I look forward to 2019, and even the most ardent Nvidia fans do. People are discussing the price more than their excitement over the new features. This is an across-the-board failure, one more likely fueled by a greedy and overconfident business department than by the engineering/R&D team, whose crucial work is being slowly parceled out at moronic pricing.
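To spell out the "matrix math units" claim: per Nvidia's own Volta/Turing descriptions, each tensor core performs one fused multiply-accumulate on small matrix tiles, D = A x B + C. Here is that single operation as a pure-Python sketch (the function name and the plain-float arithmetic are mine; real hardware does the products in FP16 with FP32 accumulation, in one clocked op):

```python
def tensor_core_mma(A, B, C, n=4):
    """One tensor-core-style op on n x n tiles: D = A @ B + C.
    Illustrative loops only; hardware fuses all of this into a
    single matrix multiply-accumulate instruction."""
    D = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            acc = C[i][j]                    # accumulator (FP32 on hardware)
            for k in range(n):
                acc += A[i][k] * B[k][j]     # products (FP16 on hardware)
            D[i][j] = acc
    return D
```

Multiplying by an identity tile with a zero accumulator just returns B unchanged, which is a quick sanity check. Useful for ML inference, certainly, but it is dense linear algebra, not some novel kind of compute.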
Again, Intel made the same mistake, did they not? And look how it's working out for them.
It's a classic misstep mature companies make when they're in the lead. It's as natural as the air you breathe and likely unavoidable... thank God.