thanks for that clarification.
given the better tim did improve 4790k over 4770k.
that would confirm the theory of cost cutting.
:thumbsup:
Again, the more I delve the clearer I become that this is an ongoing, dynamically unfolding story. That Intel apparently wants to get away from LGA appears centered on trying to move away from conventional desktops and moving to all in one units and mobile.
If true, for me, that's a serious issue, given the advantages of a full sized, entirely configurable/upgradeable desktop system in my experience and my view.
I also now see this as akin to why W8 failed......given MS's thinking/strategy in that was very much the same as Intel's in THIS.
As always, boys and girls...it's about cultivating Eyes by Marcel Proust...as per my signature.
The gossip/rumour stemmed from Broadwell not really launching for the desktop (besides the K and E models). If you look at the threads on the first page, or any other relevant site's news about Skylake, you would already know it's socketed, using the LGA1151 platform.
@UaVag: Devil's Canyon CPUs are not soldered. They instead feature what is claimed to be an improved thermal interface material.
Great quote. I've never had a signature here in 3 years, but for the first time I feel the desire to use this as one. So true.

Most of the people I know now addicted to tablets know ugatz about computers and settle for using devices in the simplest, most basic applications. Good for Instagram and Twitter if those comprise someone's world.
Sorry.....to me, it reflects the dumbing down of America.
I think you need to grab a beer and then read OP's posting history. They're very passionate about this sort of thing.
Hahaha... you remind me so much of my old friend Priscilla! The way you speak so elaborately/intellectually. We would just sit in bed or on rooftops of buildings and contemplate stuff like this all the time for hours... perspectives, human nature, what's actually real, etc. and most of all conspiracies :sneaky: You would fit right in with our crew :biggrin:

Only just caught this. First, I am a she. Passionate is accurate. And it's not virtual, it is actual. I also do not miss much in seeing all that is there, AND IN PERSPECTIVE (I get paid not to)......and make no apologies for coming upon something I did not know about. That anyone actually thinks anyone should....is against the core dynamic of life.
And, given some of the posts in this thread, I submit the wanna be chill experts here maybe need to grow the courage of humility and being open. Cause no WAY is this issue resolved. And it's also very important for the future of our devices. VERY.
It has been nearly 3 years since Ivy Bridge processors launched, and I have not seen or heard of any evidence of an abnormally high failure rate. Can you cite any source to the contrary to back up your claims of a conspiracy to make the chips fail early?
If this is directed at me.....when and where did I say there was a CONSPIRACY?
Never and nowhere. Is where.
If what was in the original piece from Japan and other pieces I found and put up links to is true....that would be the inevitable result of cost cutting and trying to cater to a dumbed down marketplace to maximize profits. Just as an OS designed to incorporate all platforms was. Is.
Was just moved to search coolest running Intel desktop chips.....and LOOK WUT I FOUND!!!!!!! O M G.
http://www.makeuseof.com/tag/two-way...ing-intel-cpu/
I thought the NSA HDD firmware spy was the ultimate scandal.....but suddenly, here's a whole new one! WUT?
I think we humans need heatsinks more than our computers do now.
Could this story be true?
You answer your own question, here:
Well just the chronology of this thread, along with my usual delving during its unfolding brought the answer, re yes, this IS TRUE, and the future in this is unknown. It is also generating more questions and explorations.... which is the way life, in its richness is engineered to work.
And for me, it sure is a scandal.....same as Windows 8. The species of scandal many assume is a normal part of commerce, be it in technology or health care, orthodox Pharms, etc....and therefore must be accepted. I am not among those many.
But did I ever suggest some conspiracy? No.
Pls examine that you seem threatened and loaded for bear....I am being ingenuous as usual, and making a normal, organic journey. No COMBAT called for or justified....and, certainly no screaming offerings in 72 point BOLD type.
You seem to be trying to EXAGGERATE the differences between words like conspiracy/scandal/tin-hat/big-brother/outrageous, to the point of claiming that they mean massively different things. But the basic concept behind these words is reasonably similar; yes, technically speaking they are different words. When you mentioned "NSA HDD firmware spy was the ultimate scandal", you were conveying the concept of conspiracy/scandal/tin-hat/big-brother/outrageous, even if you called it a "scandal".
There are various potentially 100% legitimate reasons why Intel has followed this course, such as (these are examples ONLY, and could well be 100% wrong):
- The original method used materials which were "blood money" resources they wanted to stop using, and/or which were being phased out for being toxic etc.
- The original materials have become too expensive (rising metal etc. prices), so cheaper alternatives were sourced.
- The reduction in chip area meant other cooling solutions needed to be used.
- They wanted a method which readily allowed multiple chips to be used in the same component.
- They wanted the manufacturing costs to be more comparable to ARM's.
- They decided it was both cheaper and "good enough" to meet Intel's specifications and desired quality levels.
- Many, many other possibilities.
So it would be better to have some faith in Intel, and that there are perfectly reasonable (commercial) reasons why they changed things the way they did, rather than jumping into "scandal/conspiracy theories".
The TIM isn't the real problem. The real problem is the gap between the IHS and the core caused by the black glue. You can use any TIM and still get a huge temp drop as long as you remove the glue.
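For what it's worth, the physics behind that claim is easy to sketch: conduction through a flat TIM layer gives ΔT = P·t/(k·A), so the temperature drop across the interface scales linearly with the bond-line thickness t. Here's a minimal Python sketch; all the figures (die area, power, TIM conductivity, gap thicknesses) are illustrative assumptions, not measurements of any actual chip:

```python
# Back-of-the-envelope model of why bond-line thickness (the gap the
# adhesive holds between die and IHS) matters more than the TIM itself.
# Every number below is an assumption for illustration only.

def tim_delta_t(power_w, thickness_m, conductivity_w_mk, area_m2):
    """Temperature rise across a TIM layer modeled as a flat slab:
    R = t / (k * A), dT = P * R."""
    resistance = thickness_m / (conductivity_w_mk * area_m2)  # K/W
    return power_w * resistance

DIE_AREA = 177e-6   # m^2 (~177 mm^2, ballpark Haswell quad-core die)
POWER = 88.0        # W dissipated through the die (assumed)
TIM_K = 5.0         # W/(m*K), typical paste-style TIM (assumed)

thin_gap = 25e-6    # 25 um bond line: glue removed, IHS seated on die
thick_gap = 100e-6  # 100 um bond line: glue holding the IHS off the die

dt_thin = tim_delta_t(POWER, thin_gap, TIM_K, DIE_AREA)
dt_thick = tim_delta_t(POWER, thick_gap, TIM_K, DIE_AREA)

print(f"25 um gap:  {dt_thin:.1f} C across the interface")
print(f"100 um gap: {dt_thick:.1f} C across the interface")
```

With the same paste, a gap four times thicker gives four times the temperature drop across the interface, which is consistent with the claim above: reseating the IHS closer to the die helps regardless of which TIM you use.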
Now THIS......is INTERESTING.:thumbsup:
I agree, hence the careful wording of my post. It's not clear whether the improved material or better binning makes up the difference; heck, it could be something as simple as tightening up the manufacturing tolerance between the die and heat spreader. I do own a 4790K, and while overclocking it is not simple if you want to avoid automatic overvoltage and excess heat, with an H110 cooler I have to say it is decisively faster than the 2700K and the 4770K that I had before now. It's actually a bit anticlimactic; for my purposes there is no faster CPU to be had right now. Broadwell-K may yet prove to be stillborn for all we can tell, leaving me waiting for Skylake and needing a new socket to progress.

It's my understanding that DC may on average indeed clock a couple hundred or so MHz better, but I still want to see evidence that those chips run cooler due to "improved TIM". I haven't seen such evidence yet, anywhere. The main "issue" Haswell has/had, the crazy heat when you overclock and feed them more voltage, still exists. I think most would agree here?
Taken the other way though, you could say that Intel only has $$$ on their brains and wants to keep selling you more chips to make more profit, whereas the article doesn't have this problem so it has the readers' interests at heart. Of course, this is probably not true; both Intel and the website probably make $$$ off people... Intel from selling chips and the website from ad revenue + satisfied readers coming back to read more articles, so it might be a tie.

In short, if you have to make a choice in terms of whom you place your faith in, I'd place your faith in the engineers at Intel over the random blog writings that can be googled.
You are getting what you pay for, and the blogs came for free. Think about that.
See this post, from this thread. It is directly cited and linked in the article on which you based this thread.
Your premise is that most humans have the time or interest/orientation to stay current on all this esoteric stuff.
Most do not.