WhoBeDaPlaya
Diamond Member
Heh... given Intel's (relatively) precarious financial situation, I'd say they are just focusing their efforts on the really sore spots where they need to break in. Once phones/tablets aren't so much of a damn sore spot, we enthusiasts may get some love again.
That is what engineering samples are for...
Probably dating ourselves here, but God, I do remember that. I was days away from pulling the trigger on an Alpha 21164PC desktop; I had the money in hand after months of drooling over Computer Shopper adverts while eating top ramen to save my nickels and dimes.
I'm out of the loop but reading this thread it seems the enthusiast community is pissed at intel. Why exactly?
DEC was trying to get into the then-leading edge of the new wave of personal compute devices (the PC desktop), which is very analogous to Intel's efforts to come up from behind the curve (having missed the Apple opportunity) and get onto the bandwagon for what is now the leading edge of the new wave of personal compute (smartphones).
Unless Intel's decision-making cabinet is made up of superior humans, folks not cut from the same cloth as DEC's decision makers, it isn't clear to me why I shouldn't expect history to repeat itself here.
Because they couldn't use the BIOS auto overclocking and set the multiplier to 50 :awe:
If people call themselves "enthusiasts", I am sure delidding is not an issue.
He sounds more like a "Mr. Silly" by killing it during the delidding process and then whining about it. Delidding a soldered CPU is not an easy task, so I think it's completely acceptable for him to destroy the CPU in the process; the cost is well worth it to the community as a whole to destroy a single engineering sample in order to prove to everyone that Ivy Bridge-E is soldered.

Why mention this? Because I can't believe this guy had access to an ES, even if it was loaned from a friend of a friend of a friend, and yet did not have access to anyone, in an unofficial sense, who could have told him his ES was soldered before he destroyed it.

Which leads me to conclude the following as the more probable scenario here: first, the guy probably knew his IBE was soldered (who wouldn't know that?). Second, he probably fried it while overclocking, with too many volts or too much current... but rather than be embarrassed about it, he takes on the project of "well, I'll just delid this thing; I can claim it died during delidding and save face with all my OC'ing buddies."

That way he gets uber points for being Mr. Extreme with his delidding efforts, and doesn't get caught out for being Mr. Silly for having fried his IBE before the delidding occurred.
So on this CPU it's 7 watts across a ~30°C difference, attributable to a combination of the H100 (vs. the stock cooler), the reduced voltage, and delidding / better TIM. So I think you're looking at maybe 1-2 watts from the delidding / TIM itself, since the H100 vs. stock cooler and the voltage difference are much bigger factors at stock clocks. And that's with IBT, one of the heaviest loads imaginable. As load and heat decrease, any consumption difference also decreases.
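To put rough numbers on why temperature alone only buys you a few watts: leakage power rises roughly exponentially with die temperature, so a cooler chip burns less power at identical clocks and voltage. A minimal back-of-the-envelope sketch in C, assuming leakage roughly doubles every 20°C (a common rule of thumb; the real rate is process-dependent) and a made-up 5 W of leakage at the hotter temperature:

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* All numbers here are illustrative assumptions, not measurements. */
    double leak_hot_w   = 5.0;   /* assumed leakage power at the hotter temp  */
    double temp_drop_c  = 30.0;  /* the ~30C drop described in the post       */
    double double_per_c = 20.0;  /* rule of thumb: leakage ~doubles per ~20C  */

    double leak_cold_w = leak_hot_w / pow(2.0, temp_drop_c / double_per_c);
    printf("leakage: %.1f W -> %.1f W (about %.1f W saved)\n",
           leak_hot_w, leak_cold_w, leak_hot_w - leak_cold_w);
    /* ~5.0 W -> ~1.8 W here, i.e. only ~3 W of the total 7 W, consistent */
    /* with most of the saving coming from the lower voltage instead.    */
    return 0;
}

Under those assumptions the temperature drop explains only part of the 7 W, which lines up with the voltage reduction doing most of the work.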
I'm not sure this is a great comparison. DEC was a computer maker, not a chip maker, and more like HP (where it eventually ended up) than Intel. It was also a company with serious problems by the time it even got to designing the Alpha.
I don't know why this is bad news. It's good news to me.
What you are proposing is impossible because it could never happen to Intel. I cannot fathom what possessed you to even propose such lunacy.
Nowhere in history can you find such a force in any industry, such a goliath, that then fails to ride the next generation of products / evolution in said industry. Nowhere. (DEC, old IBM, Kodak, etc? It happened to them because they weren't Intel. Intel is better than them + you + history, combined.)
So please, sir, stop with your Intel common-sense predictions. They are not welcome on these boards. I do, however, encourage you to focus on, and harp about, Intel's strengths, such as their unassailable process lead and inexhaustible coffers, and may these help to correct your future outlook towards Intel. :thumbsup:
/tongue-firmly-in-cheek
This is sad for overclockers that prefer Intel. Intel employees hinted that Haswell would be a nice surprise for overclockers. Instead, it overclocks worse than Ivy Bridge, and they have removed all remaining overclocking options from the non-K SKUs. Oh, and the K-series has even more features disabled than last time, including TSX, which is meant to improve multithreaded performance.
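For anyone wondering what's actually lost there: TSX exposes hardware transactional memory, e.g. via the RTM intrinsics _xbegin()/_xend()/_xabort() in immintrin.h (GCC/Clang, built with -mrtm). A minimal sketch of the usual lock-elision pattern; the counter and the trivial spinlock are just illustrative:

#include <immintrin.h>   /* RTM intrinsics: _xbegin, _xend, _xabort */

static volatile int lock_taken = 0;   /* trivial fallback spinlock */
static long counter = 0;

static void spin_lock(void)   { while (__sync_lock_test_and_set(&lock_taken, 1)) ; }
static void spin_unlock(void) { __sync_lock_release(&lock_taken); }

void increment(void)
{
    unsigned status = _xbegin();          /* try a hardware transaction */
    if (status == _XBEGIN_STARTED) {
        if (lock_taken)                   /* a lock holder is active: don't race it */
            _xabort(0xff);
        counter++;                        /* runs transactionally, no lock taken */
        _xend();                          /* commit */
    } else {
        /* transaction aborted, or no TSX (e.g. a K-series Haswell): take the lock */
        spin_lock();
        counter++;
        spin_unlock();
    }
}

On a chip with TSX disabled, the else branch is all you ever get, so every increment pays the full cost of the lock.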