Charles Kozierok
Elite Member
- May 14, 2012
Doing something first isn't a risk in itself. Doing something the others didn't consider or didn't want to do = risk.
You keep using that word. I do not think it means what you think it means.
Doing something first isn't a risk in itself. Doing something the others didn't consider or didn't want to do = risk.
Doing something first isn't a risk in itself.
Didn't Intel finish Itanium? Can't we buy the processor? I imagine if people wanted to use IA64 and get the software ball rolling with it, they would have. I wouldn't mind seeing it go places if it eventually made things better, but it seems clear that more people wanted x64 for convenience/backwards compatibility anyway. Since the IA64 technology is already out there, though, I suppose Windows/software/Intel could pick it back up anytime if they thought it was viable/beneficial to do so. Right?

Intel spent $10 billion on IA64 to frog-leap performance for the future. AMD spent, what, $50 million on a slap-on x64 to keep us locked in for another 30 years. Who won? Do you call it innovation? I call it stagnation.
I hope they do. If anything, just for the sheer knowledge/experience of knowing what graphene/photonics/etc. can really bring to the table. From what I understand, with a lot of effort, graphene chips could finally enable stable clocks of 10 GHz+ in the mainstream without overheating as easily. Is this true? Which company is willing to take the risk to find out (I read that Samsung is already attempting graphene in the mobile space)?

In my view, I don't see it that way. For me, a risk needs to be a game-changing event, something that sets a true difference. HKMG, tri-gates, etc. are just natural steps in process manufacturing development. If Intel had gambled on graphene, photonics, etc., then yes.
Discrete, IDC, discrete. Discreet is how you handle your mistress.
That's because it was never going to happen, because Itanium has generally sucked compared to our commodity CPUs. Support for it hasn't waned because of x86. If that were the case, PPC support would also be a problem. The difference is that low-end PPCs are still good, and Power still keeps kicking ass. IA64's greatest success was killing off PA-RISC and Alpha, and marginalizing SPARC, at a critical time when many now-old series of computers were more or less dying off. IBM and Intel ended up the only strong survivors, though Sun/Oracle and Fujitsu still keep popping their heads up.

Intel spent $10 billion on IA64 to frog-leap performance for the future. AMD spent, what, $50 million on a slap-on x64 to keep us locked in for another 30 years. Who won? Do you call it innovation? I call it stagnation.
The goal of competition within capitalism is monopoly, and Intel is very good at working it. No big company truly values the need to innovate (provide more output with less input) that customers having options necessitates.

The goal of competition and capitalism is monopolies.
More competition would also be a scary idea to any representatives being lobbied when it comes to silicon-industry special treatment, as they thrive on a similar duopoly system.

A duopoly is more damaging than a monopoly. If you wanted competition, we would need many more companies competing. But the ROI ain't possible for that.
I don't see why not.
It was a completely new technology and could have cost them billions of dollars and been a complete failure.
If Intel had gambled on graphene, photonics, etc., then yes.
Initial results of Tri-Gate were also worse than cutting-edge planar technology. They had to improve it to get to the point they're at now. The same is true of the others.
Sure it is. "Hasn't been done before" weighs heavily in business.
Was that the reason for the IB delays? I know they had yield issues (or at least that was the rumor) but I don't think Intel admitted anything publicly other than that it'll be late.
Hypothetically, if traditional scaling never disappeared, we wouldn't have seen CPUs implement Strained Silicon, HKMG, and Tri-Gates. I bet by the time we get whatever we think is "revolutionary" into real products, there will have been so much advancement in "less revolutionary" products that it might not be that big of a change.

FinFETs/Tri-Gate are a natural progression and not necessarily a "risk."
Wot?
ARM != Apple
Windows is the de facto standard for business software, and even if Microsoft utterly fails at everything over the next decade, they will likely continue to dominate the business computing sphere.
On the consumer end, Microsoft will become less relevant imo, but for that matter so will Apple as the products get better and cheaper from the competition.
ARM is also utterly terrible for certain tasks while being very efficient and effective at others. It's just a side effect of RISC architecture.
I also wouldn't put too many beans in the GPU/APU basket, Intel is headed there pretty fast, but in the very areas you're talking about, it's ironically ARM-powered devices that are replacing the units (think of how many folks pick up iPads or Galaxy tablets and use them to run dumbshit games and such on them rather than get a real laptop with a real GPU or Fusion/Llano/etc).
errr... Apple uses "cheap" Intel CPUs in their Ultrabooks (MBA), and that's not changing anytime soon.
Intel is never going to be taken down by AMD, but by ARM licensees. TI, Nvidia, Qualcomm, Apple, and Samsung have a combined R&D budget greater than AMD could ever hope for, and the future is clearly mobile, where Intel isn't a strong player because they rested on their laurels in x86 desktop/laptop/server land for so long (which, if anything, will be their downfall).
True competition will come from the fact that more and more people, though still a small number, are using iPads instead of traditional x86 PCs, and THAT is what would keep Paul Otellini up at night. It is not pure coincidence that as soon as the iPad came out, Intel started really targeting low power chips, especially with the promise of Haswell.
Intel's been lazy on the mobile front, but they're getting in gear.
A single misstep against TI/Qualcomm means they leap ahead whereas Intel can afford 2-3 missteps on the desktop and still be ahead.
They are but it's at a higher cost than the competing architectures.
The mid-range Z2460 phones are supposed to be in the $400 range unsubsidized.
The low-end Z2000 phones are rumored to be $200 unsubsidized.
They've got to price their chips like the ARM manufacturers price theirs.
You guys really need to read The Innovator's Dilemma by Clayton Christensen.
For the most part, when dealing with an effective monopoly like Intel's, the only way they can be "taken down" is by actually marginalizing their business to make it irrelevant, basically a "disruptive innovation."
A few examples:
1) Digital photography killed film photography, along with Kodak.
2) LCDs replaced CRTs.
3) GPS units replaced traditional maps.
4) Smartphones destroyed numerous markets, including GPS units and point-and-shoot cameras.
5) Wikipedia killed traditional encyclopedias.
6) Digital downloads/streaming are killing CDs/DVDs, which killed floppy drives.
7) SSDs are slowly replacing magnetic HDs.
Source: http://en.wikipedia.org/wiki/Disruptive_innovation#Examples_of_disruptive_innovations
Intel is never going to be taken down by AMD, but by ARM licensees. TI, Nvidia, Qualcomm, Apple, and Samsung have a combined R&D budget greater than AMD could ever hope for, and the future is clearly mobile, where Intel isn't a strong player because they rested on their laurels in x86 desktop/laptop/server land for so long (which, if anything, will be their downfall).
True competition will come from the fact that more and more people, though still a small number, are using iPads instead of traditional x86 PCs, and THAT is what would keep Paul Otellini up at night. It is not pure coincidence that as soon as the iPad came out, Intel started really targeting low power chips, especially with the promise of Haswell.
We've been hearing about the death of the PC from analysts for how many years now? Well, if you look, PC shipments are still growing (not by much, and not by as much as the toy tablet market). There is a flawed assumption that the increase in sales of what I'm calling toy computers must come from a decrease in traditional PCs. So far, that just isn't the case.
Silvermont doesn't even need to be great. They can leave performance the way it is and use all the TDP headroom from 22nm to reduce power. They could even sell Silvermont at a loss just to put everybody else out of business, and then make all that money back on the next node.