As I remember it, the biggest problem with RDRAM in its early days was the expense. SDRAM was significantly cheaper by comparison, so a lot of people kept going the SDRAM route.
If RDRAM had been implemented more cheaply, it might have been another story. Plus the other reasons mentioned above.
I get that Intel was pandering to the audience this week, trying to get people a little excited about the possibilities. If I had to guess at their 'goal', I'd say they were looking to lure the up-and-coming 'out of the box thinkers' to the Intel brand.
Which ends up not being as much fun for us number guys...
I linked a presentation in another thread where some guy was talking about the transition to 14nm, which he noted as being more effective for power savings than for raw performance. Intel's latest CPUs seem to bear that thinking out, and with the 'high performance user' market not being what it once was, you have to respect the strategy.
The thing Intel has done in the last couple of years that has impressed me is drastically increasing the performance of their iGPUs. They went from lackluster performance (most computer enthusiasts didn't take them seriously) to quite respectable in just a couple of years. With the amount of intellectual muscle Intel has at their disposal, I can see the next generation(s) of Intel iGPUs giving NVidia and AMD some serious headaches.
It does look like Q3 2014 should be very interesting for you Intel gamer types:
http://www.guru3d.com/news_story/intel_haswell_e_roadmap_confirms_launch_in_q3_2014.html
For me, the bigger news this week was Alan Mulally deciding to stay put at Ford. Kinda makes you wonder where the next Microsoft leadership is going to come from...