Originally posted by: apoppin
Originally posted by: Gstanfor
LOL! I was just interested in your opinion. I can remember a time, roughly from the GF1 to the GF4, when nvidia was effectively a monopoly (none of the "competition" did much in the way of actually competing anyway...), and that era was responsible for some great 3D games; in fact, the 3D scene pretty much exploded during that timeframe.
You mean during the Rage to Radeon 8500 timeframe?
ATi existed as a worthy 'alternative' to nvidia's faster GPUs ever since the Rage Fury 32... they always had better IQ, at least through the R8500 era.
And don't worry about PC gaming... it doesn't matter who is making GPUs... it has enough problems surviving competition from the consoles.
ATI was largely a nothing during that time; they never hit it big with any of their products (except in the mobile market) and generally didn't have the performance to compete. The Voodoo5 was a decent competitor (taking, I think, around 20% of the overall market, with nvidia taking the majority of the rest), along with the Kyro II, which competed mainly on price and on being something "fresh."
XGI sold off its graphics division quite a while ago, so that counts them out. Matrox probably doesn't have the resources to take on the high-end 3D graphics realm again. That really only leaves S3, unless a new company just pops up out of nowhere.
Matrox let their 3D graphics division go a while ago. That leaves S3, who won't go for the high end, Bitboys, who were never in the PC market, and PowerVR, who withdrew years ago. However, it's very possible Intel could take a stab at it.
BTW, I'd imagine AMD will cut a substantial number of employees; hopefully they don't gut ATI.
Now, if you add GPU functionality to the cores (not a GPU on the die, but integrated into the x86 pipeline), you have something that can, on command, eat a GPU for lunch. A very smart game developer told me that with one quarter of the raw power, a CPU can do the same real work as a GPU due to a variety of effects, memory scatter-gather being near the top of that list.
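For what it's worth, here's a minimal C sketch of what "scatter-gather" means in this context (the function names and the index-array setup are mine, purely for illustration, not anything that developer described):

[code]
#include <stddef.h>

/* Gather: out[i] = src[idx[i]] -- one dependent, unpredictable load per
   element. Large caches and out-of-order execution hide this latency;
   hardware built for streaming access stalls on it. */
void gather(float *out, const float *src, const int *idx, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = src[idx[i]];
}

/* Scatter: out[idx[i]] = src[i] -- writes land at unpredictable
   addresses, which is even harder to stream or coalesce than the
   gather above. */
void scatter(float *out, const float *src, const int *idx, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[idx[i]] = src[i];
}
[/code]

The point is that every access depends on an index that can't be predicted ahead of time, so the winner is whichever chip hides memory latency best, not whichever one has the most raw FLOPS.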
See if you can get a CPU up to one quarter of the power of a GPU.
Oh, you could just implement hidden surface removal in a GPU and accomplish the same thing with one quarter of the power (and a much lower transistor count). Most likely a CGPU won't beat a graphics card, but will merely be "good enough" for the mainstream. (Though can they put a DirectX interface on it and still play to its strengths?)
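To illustrate the hidden surface removal point, here's a rough C sketch of the deferred idea (the two-pass structure and every name in it are mine; this is a toy, not how any shipping chip works): resolve depth first, then shade each pixel exactly once, so overdraw costs no shading work at all.

[code]
#include <float.h>
#include <stddef.h>

typedef struct { float depth; int tri; } Frag;  /* one candidate fragment */

/* Pass 1: depth resolve. Keep only the nearest fragment per pixel;
   pixel_of[f] says which pixel fragment f lands on (illustrative). */
void resolve_depth(Frag *fb, size_t npix,
                   const Frag *frags, const size_t *pixel_of, size_t nfrags)
{
    for (size_t p = 0; p < npix; p++) { fb[p].depth = FLT_MAX; fb[p].tri = -1; }
    for (size_t f = 0; f < nfrags; f++) {
        size_t p = pixel_of[f];
        if (frags[f].depth < fb[p].depth)
            fb[p] = frags[f];
    }
}

/* Pass 2: shade each pixel once, no matter how deep the overdraw was.
   The "shading" here is just a placeholder hash of the triangle id. */
void shade_visible(unsigned *color, const Frag *fb, size_t npix)
{
    for (size_t p = 0; p < npix; p++)
        color[p] = (fb[p].tri >= 0) ? (unsigned)fb[p].tri * 2654435761u : 0;
}
[/code]

With brute-force rendering, pass 2's work scales with overdraw (every fragment gets shaded, then mostly thrown away); with the depth pass in front, it scales with screen size only, which is where the "same result at a quarter of the power" claim comes from.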