How much are these going to cost?
It is hard to imagine how a 14nm quad-core with no HT could possibly run anywhere near 95W, even at 3.9GHz. I thought the 4690 topped out around 70 watts in real-world testing? And the 6MB cache is just a plain ol' rip-off. These chips are going to be so frickin' small that if they charge the same price as a 2500K, which is like 5 times bigger, they'll be gouging the everloving crap out of us. We are rapidly approaching the point where Intel's useless-GPU tax exceeds half the cost of the part. How long will enthusiasts tolerate that?
Do you measure the worth of a chip by the size of the silicon or the value delivered?
95W doesn't mean anything. My 4770K consumes more than 84 watts (its official TDP), fully stock, when running LinX. AVX2 load is the key issue. Also, 95W may still be for the platform and not the actual chip.
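The point about AVX2 loads blowing past the TDP label can be sketched with the usual first-order dynamic power model, P = C_eff * V^2 * f. Every constant below is a made-up illustrative assumption, not a measured 4770K value; the only claim is that wider AVX2 execution switches more transistors per cycle, which shows up as a larger effective capacitance and hence higher draw at the same voltage and clock.

```python
# Illustrative sketch of why measured power can exceed the TDP label.
# All constants are assumptions for demonstration, not real measurements.

def dynamic_power(c_eff, voltage, freq_ghz):
    """Simplified dynamic power model: P = C_eff * V^2 * f (arbitrary units)."""
    return c_eff * voltage**2 * freq_ghz

# Assumed baseline: a "typical" all-core load near the rated TDP.
baseline = dynamic_power(c_eff=6.0, voltage=1.1, freq_ghz=3.5)

# AVX2 workloads switch far more transistors per cycle; model that here
# as a 40% larger effective capacitance (an assumed figure).
avx2 = dynamic_power(c_eff=6.0 * 1.4, voltage=1.1, freq_ghz=3.5)

print(f"AVX2 draw relative to baseline: {avx2 / baseline:.0%}")
```

Under these assumptions the AVX2 case lands at 140% of the baseline figure, which is why a chip can exceed its TDP number under LinX while staying under it in ordinary use.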
Here's to hoping it's faster.
4.0 GHz base 4.2 GHz Turbo? Only other CPU that does is FX-8350.
@sm625 - if you think the iGPU is useless, Skylake-E is your chip: more cache, more cores. This part may not be intended for you.
But if these clocks are correct, it will debunk some of the claims that 14nm would not clock as high as 22nm. Wonder how it will overclock? I am not expecting much more than Haswell unfortunately.
You cannot measure worth by "value delivered", since that invokes circular logic. You have to measure by the size of the silicon as it relates to the fractional cost of the wafer. In this case they are getting a crapton of cores from one wafer. They are probably paying only $12 per core for the actual silicon. So when we buy an i5 quad-core, we are basically paying $50 for the CPU, another $50 for a useless GPU, and $130 for Intel's shareholders and CEO. They shouldn't be wasting half of their wafers on GPU transistors that no K-model purchaser is ever going to use.
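The per-die silicon cost claim above can be sanity-checked with the standard dies-per-wafer approximation. Every number here (wafer cost, die area, yield) is an assumption chosen for illustration, not Intel's actual figures; the point is only that a small die on a 300mm wafer yields hundreds of chips, so the raw silicon cost per die is a small fraction of the retail price.

```python
# Hypothetical back-of-envelope die-cost estimate.
# All inputs are illustrative assumptions, not Intel's real costs.
import math

wafer_cost = 9000.0      # assumed cost of one processed 300mm wafer, USD
wafer_diameter = 300.0   # mm
die_area = 122.0         # mm^2, rough quad-core-with-GPU die size (assumption)
yield_rate = 0.85        # assumed fraction of good dies

wafer_radius = wafer_diameter / 2
# Classic dies-per-wafer approximation: gross area divided by die area,
# minus a correction for partial dies lost at the wafer edge.
dies_per_wafer = (math.pi * wafer_radius**2 / die_area
                  - math.pi * wafer_diameter / math.sqrt(2 * die_area))
good_dies = dies_per_wafer * yield_rate
cost_per_die = wafer_cost / good_dies
print(f"~{good_dies:.0f} good dies, ~${cost_per_die:.2f} silicon cost per die")
```

With these assumed inputs you get roughly 440 good dies and about $20 of silicon per chip, which is the same order of magnitude as the comment's estimate even before packaging, test, R&D, and margin are added on top.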
It is hard to imagine how a 14nm quad-core with no HT could possibly run anywhere near 95W, even at 3.9GHz.
Don't forget, not everyone games with these CPUs. I run a 4770 non-K in my office box, so the iGPU is essential; otherwise I'd need to fork out for a dGPU I don't need. I won't be upgrading either; I'll be running this box into the ground first. That iGPU is useful for hardware media decoding, too.
III-V was expected to reach 5GHz.
We are rapidly approaching the point where the Intel useless gpu tax is exceeding half the cost of the part. How long will enthusiasts tolerate that?