For that to be true, Intel would have to be lying through their teeth when they say Crystalwell gets an average 95% hit rate in games. That would make the backing RAM technology close to irrelevant, especially for average FPS.
A 95% hit rate doesn't really mean much on its own though, does it? It only tells you that the data that IS in the cache gets reused; it says nothing about what the other 5% of accesses cost. And we don't see 128MB GPUs that fall back on system DDR3 being competitive on the desktop in any meaningful way.
That kind of setup is ALWAYS a low-end solution, and never as good as a complete pool of dedicated memory local to the GPU.
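To be fair to the hit-rate number, here's the textbook way to think about it: average access cost is the hit cost plus the miss rate times the miss penalty. The latencies in this sketch are illustrative assumptions, not measured Crystalwell figures; the point is only that the backing DDR3 still sets the miss penalty no matter how good the hit rate looks.

```python
# Rough AMAT (average memory access time) sketch: a hit pays the cache
# latency, a miss pays that plus the trip out to shared DDR3.
# Both latencies below are illustrative assumptions, not Intel specs.
def amat_ns(hit_rate, cache_ns, miss_penalty_ns):
    return cache_ns + (1.0 - hit_rate) * miss_penalty_ns

EDRAM_NS = 30.0         # assumed eDRAM access latency
DDR3_PENALTY_NS = 80.0  # assumed extra cost of going out to DDR3

for h in (0.95, 0.80, 0.50):
    print(f"hit rate {h:.0%}: ~{amat_ns(h, EDRAM_NS, DDR3_PENALTY_NS):.0f} ns average")
```

Even at 95%, the average is dragged away from the eDRAM's latency, and it gets worse fast as the working set outgrows 128MB.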
I'm not denying it helps the Iris in the context of what they have to work with, but follow me here:
Iris Pro w/128MB eDRAM + DDR3-1600 system memory
vs.
Iris Pro w/8GB DDR4-2800 shared main system memory and no eDRAM
vs.
A theoretical discrete Iris Pro with 2GB of 6Gbps GDDR5 on a 384-bit interface
Which would dominate? The GDDR5 variant, easily. Next? The DDR4 variant. The DDR3 + eDRAM setup would easily come in last. The peak-bandwidth math below shows why.
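Quick spec-sheet math for those three setups, assuming dual-channel for the two system-memory cases. These are theoretical peaks, not benchmarks:

```python
# Peak theoretical memory bandwidth, straight spec-sheet math (GB/s).
def dual_channel_gbs(mt_per_s):
    return mt_per_s * 8 * 2 / 1000  # 64-bit channel = 8 bytes, two channels

print("DDR3-1600, dual channel:", dual_channel_gbs(1600), "GB/s")  # 25.6
print("DDR4-2800, dual channel:", dual_channel_gbs(2800), "GB/s")  # 44.8
print("GDDR5 6Gbps on 384-bit :", 6 * 384 / 8, "GB/s")             # 288.0
# For reference, Intel quotes Crystalwell's eDRAM at ~50 GB/s each direction.
```

An order of magnitude separates the GDDR5 card from the DDR3 platform, and the eDRAM can only paper over that for whatever fits in 128MB.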
The only reason they're doing it this way is that designing a special mobo/chipset for Iris Pro with a dedicated 256-bit memory bus and GDDR5 soldered onboard would be silly, and very hard to ask OEMs to take on in the context of the ultrabook form factor.
eDRAM is a bandaid, a way to boost GPU performance a bit when they're saddled with crappy, slow main system memory. But at 128MB it's far too small to let the IGP truly match a good dGPU with a pool of GDDR5 large enough to handle AAA gaming.
Think about it: you have 128MB to work with. What happens when you need data that isn't already in that 128MB while you're playing BF4? Yes, it has to come from that crusty old DDR3, which is already shared with the rest of the platform. So your magical super-fast eDRAM is suddenly bottlenecked by DDR3-1333, 1600, whatever. Even DDR3-2133 sort of sucks for graphics, honestly. A low-end GDDR5 7770 blows the crap out of any IGP on earth. The sketch below shows how fast the advantage evaporates as the hit rate drops.
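A crude way to put numbers on that bottleneck: treat every byte as served either from eDRAM or from DDR3 and blend the two rates. This ignores CPU contention on the shared DDR3 bus (which only makes things worse) and assumes the ~50 GB/s eDRAM figure; it's a sketch, not a simulation.

```python
# Effective bandwidth as the 128MB cache starts missing.
# Simplified serial model: each byte comes from eDRAM or from DDR3.
EDRAM_GBS = 50.0  # Intel's quoted per-direction eDRAM bandwidth
DDR3_GBS = 25.6   # dual-channel DDR3-1600, shared with the CPU

def effective_gbs(hit_rate):
    # Time per byte is a weighted mix of the two paths (harmonic blend).
    return 1.0 / (hit_rate / EDRAM_GBS + (1.0 - hit_rate) / DDR3_GBS)

for h in (0.95, 0.80, 0.50):
    print(f"hit rate {h:.0%}: ~{effective_gbs(h):.1f} GB/s effective")
```

Even at Intel's 95% you land around 48 GB/s, sliding toward DDR3's 25.6 as the working set outgrows the cache, while that theoretical GDDR5 card sits at 288 GB/s no matter what.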