Heartbreaker
Diamond Member
- Apr 3, 2006
- 4,655
- 6,117
- 136
No offense, but Intel has been making drivers for their GPUs for 20+ years. They just haven't focused on games (for obvious reasons). They have been putting in more effort lately, though.
By all accounts, they've been doing a pretty lousy job for 20+ years.
Intel's drivers have been bad for gaming since the original "Extreme" Graphics*. Those were bad; they even had the CPU handle part of the rendering pipeline back then. Intel hasn't helped themselves by leaving previous generations to wither on the vine while focusing on the new and shiny. They did that every CPU generation until Skylake, where they finally made a unified driver package going forward.
It was only after the IGP security debacle that they released new drivers for Gen7(.5) and Gen8 after all, with a hefty performance penalty.
*The original "Extreme" Graphics (2) couldn't even render the Windows desktop properly (a GDI+ decelerator), and was very often accompanied by... ahhh... "poor"... RAMDACs.
If I remember correctly, SiS started the IGP thing with the highly integrated SiS 530 Socket 7 motherboards that had an IGP, sound, and even Ethernet. That was amazing at the time. I still have a working one stored; it may be the first ever commercial motherboard with an IGP, because I don't remember an earlier one. Then Intel followed with the 810 chipset (I believe), which started the mess of Extreme Graphics, with VIA joining shortly after buying S3 (makers of the Savage).
That was a hardware limitation: it didn't support vertex processing (T&L) in hardware like every other GPU did from DX7 onward (after the GeForce 256 and Radeon), but games required it, so the driver did it in software on the CPU, up until the X3100 and 4500.
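For anyone curious what that looked like from the game's side, here's a minimal C++/Direct3D 9 sketch of the usual pattern: probe the device caps for hardware T&L, and fall back to software vertex processing when the IGP can't do it. D3D9 is a bit later than the DX7-era parts being discussed, and the helper name and window handle are hypothetical placeholders, so treat it as illustrative only.

```cpp
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// Hypothetical helper: create a device using hardware vertex processing
// only if the adapter actually supports hardware T&L. On Intel IGPs of
// that era the cap bit was absent, so the vertex work ran on the CPU.
IDirect3DDevice9* CreateDeviceWithBestVertexProcessing(IDirect3D9* d3d, HWND hWnd)
{
    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // No D3DDEVCAPS_HWTRANSFORMANDLIGHT means the GPU can't transform
    // and light vertices itself; ask the runtime to do it in software.
    const DWORD behavior = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
                               ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                               : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed      = TRUE;
    pp.SwapEffect    = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = hWnd;

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      behavior, &pp, &device);
    return device; // nullptr on failure; error handling trimmed for brevity
}
```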
The SiS530 really was the starting point for integrating as much as possible into the chipset. It was done for cost cutting, but nobody minded at the time, since it enabled much cheaper systems. I remember it well, but it wasn't really geared toward anything to do with 3D. The original nForce, on the other hand... now that was a good chipset.
If I remember rightly, Intel started with integrated graphics because they just needed to do something with the spare die area on the northbridge. The chip had a minimum size due to IO requirements, and hey, why not throw some graphics in there?
up until the X3100 and 4500..
I used to game with a SiS IGP on the motherboard back in the day with my Celeron D single-core 2.53GHz (mid-2000s).
The SiS530 (Socket 7) and SiS620 (Slot 1) had a SiS6306 integrated, which I'm pretty sure is just the integrated version of the 6326, with DX5 support. It was still able to handle some games, and it had the MPEG-2 decoder. Some boards even had 8MB of dedicated memory for the iGPU, so the iGPU supported both shared and dedicated RAM.
Was that the SiS630 by any chance? I had a laptop with it, and it could run most things.
It was a Socket 478 Celeron D 2.53GHz CPU in there, so according to this list:
It wasn't THAT bad, even with software features; it arrived two years before the SiS 315 and the nForce with its GeForce2-class IGP.
By all accounts, they've been doing a pretty lousy job for 20+ years.
But with Xe they are ready now. At no point in Intel's GPU history were they ready, in any shape or form, to get proper dGPUs out.
Let's hope so. Though to be honest, I would not be too upset if DG2's drivers suck for GPGPU. Not that I'm in the market for one, mind you. But I think it would be okay for the market as a whole if that happened.
It may not be a case of Intel's GPGPU being bad so much as the crypto software simply not existing.
How much will you really benefit from mining on an Intel GPU if the timeframe is that short?
But once you start selling it as an add-on card with nearly an order of magnitude better performance, well, people will care.
I fear it will work just fine for modern games but not for many older ones.
Do you guys remember when Intel came out with graphics cards, or at least graphics chips, and were trying to take on NVIDIA (and maybe 3DFX) in the 90s? What was their chip called? It got its ass handed to it mightily lol. Hope this time around they do a good bit better!!!
Intel740 - Wikipedia (en.wikipedia.org)
One of the first AGP video cards on the market. I think the Riva 128 was earlier though.
The i740 was released in February 1998, at $34.50 in large quantities
Do you guys remember when Intel came out with graphics cards, or at least graphics chips, and were trying to take on NVIDIA (and maybe 3DFX) in the 90s? What was their chip called? It got its ass handed to it mightily lol. Hope this time around they do a good bit better!!!
Ah, yes, the i740! Damn, I remember it was bad, but the Wikipedia link reminded me of just how bad. I remember the whole AGP hype: who needs video memory when you have the fat AGP pipeline!!! LOL, fun times! Thanks DrMrLordX!!!