I'm astonished at the speed of design. It's amazing to me.
From a simple enthusiast's point of view, I see processors as some of the most complex and amazing machines mankind is able to make, requiring collaboration from a wide range of disciplines (chemistry, engineering in many of its variants, physics, computing obviously, etc.), and each year that passes, the complexity gets even more mind-boggling. I mean, from simple calculators with a few thousand transistors to a 486 not that long ago with a million transistors doing their thing, we're now into the billions! Breathtaking stuff.
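Just for fun, here's a rough back-of-the-envelope check on that growth. The numbers are approximate (the 486 had around 1.2 million transistors circa 1989; "billions" is taken as ~2 billion for a typical modern consumer CPU), and I'm assuming the classic Moore's-law doubling of roughly every two years:

```python
import math

# Approximate transistor counts (public figures, rounded)
i486_transistors = 1.2e6      # Intel 486, circa 1989
modern_transistors = 2.0e9    # a typical modern consumer CPU, roughly

# How many doublings separate the two chips?
doublings = math.log2(modern_transistors / i486_transistors)

# Moore's law: roughly one doubling every two years
years = doublings * 2

print(f"{doublings:.1f} doublings, about {years:.0f} years")
```

About ten or eleven doublings over roughly two decades, which lines up with the jump from the 486 era to billion-transistor chips.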
I can easily see how millions of man-hours go into these little wonders we buy and use; it's amazing to think of all the knowledge that's been put into practice to make them a reality. I love forums for this very reason: the possibility of seeing people in their respective fields give their take on the matter.
+1
Computer processors are incredibly complex devices. A new high-end chip fab can cost
billions of dollars.
A consumer-level chip can easily have a few hundred million transistors.
Design a machine that has 200 million switches in it.
Done? Ok, great.
Are you sure it's 100% correct, and that no special combination of inputs will cause some of it to trigger incorrectly? I sure hope so; the tooling is expensive.
Now each of those switches has to be a few dozen atoms wide. Consistently, too. Scrapping precision-doped pure silicon isn't cheap.
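To put "a few dozen atoms wide" in perspective, here's a quick sanity check. I'm assuming silicon's cubic lattice constant of about 0.543 nm and a "14 nm class" feature size; both are round numbers for illustration:

```python
# Rough sanity check on "a few dozen atoms wide" (all figures approximate)
silicon_lattice_nm = 0.543   # silicon's cubic lattice constant, ~0.543 nm
feature_size_nm = 14.0       # a "14 nm class" transistor feature

# How many unit cells of silicon fit across one feature?
unit_cells = feature_size_nm / silicon_lattice_nm
print(f"about {unit_cells:.0f} unit cells across")
```

Roughly 25 or so unit cells, each containing a handful of atoms, so "a few dozen atoms wide" is no exaggeration.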
Write a program to do it? Sure. You could just string together a bunch of Pentium Pro cores and call it good; a computer program could surely handle that. Want more computing power? Just add more cores. Better add another 200A breaker box to your house to handle the power and air conditioning requirements.
Die shrinking is what pushes the limits. "Hey, the wavelength of UV light (a few hundred billionths of a meter) is becoming a problem. It's too big. We need smaller light." What would the computer program do to handle that?
They're incredibly complex little devices that incorporate some of our most advanced scientific knowledge in their production, and they're designed using computers that can take the place of tens of thousands of people performing calculations with a slide rule or hand calculator.
They push the boundaries of what we know about physics, like reaching the point where insulating barriers are so thin that they can't reliably stop electrons from tunneling through them. Computers aren't yet smart enough to figure out alternate design paths when physics says "You can't do this anymore."
I can't even begin to think what will happen when a technology that does to the transistor what the transistor did to the vacuum tube gets invented. Not to mention if it has the ability to scale and progress over the years just as much as the transistor did and does...
Room-temperature quantum computing.
"If a quantum computer could be built with just 50 quantum bits (qubits), no combination of today’s top 500 supercomputers could successfully outperform it."
Problem is, making those quantum bits in the first place is kind of tricky.
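The quoted claim gets concrete pretty fast: just writing down the full state of 50 qubits on a classical machine takes an astronomical amount of memory, before you do any computing at all. A minimal estimate, assuming each amplitude is stored as a complex128 (two 8-byte floats):

```python
# Why simulating ~50 qubits classically is hopeless: a rough memory estimate
qubits = 50
amplitudes = 2 ** qubits          # the state vector has 2^50 complex amplitudes
bytes_per_amplitude = 16          # complex128: two 8-byte floats
total_bytes = amplitudes * bytes_per_amplitude
pebibytes = total_bytes / 1024**5

print(f"{pebibytes:.0f} PiB just to store the state vector")
# → 16 PiB
```

Sixteen pebibytes of RAM just to hold one state vector, which is why even the top supercomputers can't brute-force their way through it.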
Build something like that in a compact form, and walking not too far behind it will be an android at or beyond the level of Data.
Incidentally, it is this level of complexity that is a big reason you have multiple revisions of the silicon and errata sheets for chips: the thing is such an incredibly complex machine that the human mind simply can't keep track of all its workings. Things are forgotten, or not anticipated.
Then you add software into the mix, which instructs that complex machine how to function. Sometimes it seems amazing that any of it works at all.