Originally posted by: n0cmonkey
Do we need to turn this into a flame war? I was highly enjoying the glaze over my eyes while reading the posts from the Intel guys, and you two have to kill my buzz.
Originally posted by: pm
"Moore's Law" is dead? I remember reading that sometime in the early '80s, too. I wouldn't advise going to Vegas and putting any money down on that assertion. Industries that are worth hundreds of billions of dollars don't just die suddenly. Beyond the fact that it's not the thickness of the gate that makes Moore's Law a reality (i.e., we could stop shrinking the gate thickness altogether and continue to make progress), there are plenty of alternative materials and topologies that could be used. Conventional CMOS will most likely take us down to the 16nm node... maybe lower.
So the way that this generally works is that a task force is created to investigate a new microarchitecture. You have a mix of skills, from architects, to logic engineers, to circuit engineers who interface with the process engineers. They typically choose a few key features - like the time to do a register read, the time to do an ALU add, the time to do a cache read, etc. - and figure out, using process data and circuit simulations, how long each will take in terms of gate delays - essentially, how many inverter delays it takes to get an ALU add done. Then you couple this with estimated performance research regarding how many pipeline stages are optimal depending on the other features that you are considering adding to your chip, add a little black magic based on experience/marketing/customer feedback, and come up with how many delays you can have in your design. This is communicated to the team. The logic engineers working on the RTL code can then check their code against this benchmark by running some form of script.

Originally posted by: Gunnar
If the microarchitecture guys have to take process into consideration, does that mean the process engineers are responsible for providing the VHDL toolset?
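The gate-delay budgeting pm describes can be sketched as back-of-the-envelope arithmetic. Every number below is hypothetical, chosen only to show the shape of the calculation, not real process data:

```python
# Back-of-the-envelope version of the gate-delay budgeting described above.
# All values are made-up illustrations, not any real process's numbers.
fo4_delay_ps = 25.0       # assumed fan-out-of-4 inverter delay for the process
target_freq_ghz = 2.0     # assumed target clock frequency
flop_overhead_fo4 = 3.0   # assumed clock-to-Q plus setup cost, in FO4 delays

cycle_ps = 1000.0 / target_freq_ghz      # one clock cycle in picoseconds
cycle_fo4 = cycle_ps / fo4_delay_ps      # cycle time expressed in FO4 delays
logic_budget = cycle_fo4 - flop_overhead_fo4  # delays left for actual logic

print(f"{cycle_fo4:.0f} FO4 per cycle, {logic_budget:.0f} FO4 for logic")
```

With these made-up numbers, a 2 GHz target on a 25 ps-inverter process leaves roughly 17 inverter delays of logic per pipeline stage, which is the kind of figure a script could check RTL paths against.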
There's a mix of VHDL and Verilog and proprietary tools used in the industry. I'm not a big fan of VHDL and prefer Verilog.

Every time a new process comes out, would a new toolkit/model have to be put out? Am I wrong in thinking microarchitecture guys use VHDL for creating their models? (Personally, I've never used VHDL; I've done Verilog, but I've been told VHDL is what non-government work is done in.)
Back in college I worked with HgCdTe wafers; I did some material properties experiments with them. SoS (Silicon on Sapphire) is a form of SOI and is typically used in space applications.

I was also wondering if you worked with any sort of funky wafer material. I remember reading that the Voyager 1 and 2 satellites had an RCA 1802 fabbed from sapphire. Seeing as how heat is becoming an issue, how long before a switch occurs?
The reason that they tend to use older technologies has more to do with fabrication and licensing issues. Here's a brief article on radiation hardening. The key techniques in hardening are: isolating devices from each other, data redundancy, and error checking. Material properties also help significantly, and some circuitry techniques can be used as well.

Lastly, I'm approaching my knowledge barrier here, but can any of these processes be approved for radiation-hardened chips? I'm not sure how companies radiation-harden, but I always thought the reason satellites still use ancient 386-level processors is that their pathways are huge, to compensate for high-energy particle bombardment.
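One of the hardening techniques mentioned above, data redundancy, is commonly realized as triple modular redundancy (TMR): keep three copies of a value and take a majority vote, so a single-event upset in one copy is outvoted. A minimal sketch of the voting idea:

```python
# Toy illustration of data redundancy via triple modular redundancy (TMR).
# A single-event upset that flips bits in one copy is outvoted by the
# other two copies; this is the idea, not any real rad-hard library's API.
def tmr_vote(a, b, c):
    """Return the majority value among three redundant copies."""
    if a == b or a == c:
        return a
    return b  # a disagrees with both; b and c carry the majority (if any)

# One copy corrupted by a particle strike; the vote recovers the value.
good = 0b1011
assert tmr_vote(good, good, 0b0011) == good
```

Real rad-hard designs vote in hardware (triplicated flops feeding a voter) rather than in software, but the majority function is the same.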
[Most of?] AMD uses Verilog (well, not exactly, but if you know Verilog, it would take about 5 minutes to pick up what AMD uses).

Originally posted by: pm
There's a mix of VHDL and Verilog and proprietary tools used in the industry. I'm not a big fan of VHDL and prefer Verilog.

Every time a new process comes out, would a new toolkit/model have to be put out? Am I wrong in thinking microarchitecture guys use VHDL for creating their models? (Personally, I've never used VHDL; I've done Verilog, but I've been told VHDL is what non-government work is done in.)
No, you are correct - a LOT of the logic in high-end cores is hand-designed. However, certain parts aren't as critical for performance, so you can just synthesize them and have your designers work on more important paths.

Originally posted by: RaynorWolfcastle
Wow, a genuinely interesting thread in General Hardware; I haven't seen that in a while!
pm, I thought that a good portion of the CPU cores was hand-designed (at the gate/transistor level) to squeeze out every last bit of performance; was I mistaken?
FWIW, I'm a co-op at AMD right now, and I don't entirely understand all of that (although people explain it VERY well when I have questions). Probably about half of what I know comes from pm patiently answering my billions of questions, and the rest comes from a combination of books and school.

All I know is that I find digital circuit design interesting, but when we briefly looked at SPICE MOSFET level 7 parameters in an electronics class I didn't have the slightest idea of what was going on. Heck, I didn't understand what half the parameters were referring to.
The flow in my group goes something like this...

BTW, I'm kind of curious: is anything at all done in SPICE for digital electronics in industry? We used a model for TSMC's .18um process when we looked at that insane SPICE model, so I'm thinking maybe the digital synthesis tools compile everything to a SPICE deck for simulation using physical parameters at some point? Can anyone tell me what kind of software the designers use during various phases?
Well, on a large scale, maybe (large being across-chip connections). But the stuff I work on is mostly in the datapath, and you only really have to worry about getting from one flop to the next within a cycle. A lot of the time, there's nothing I can do about interconnect lengths - the amount of logic between two points would be the limiting factor. Of course, if I meet timing requirements, minimum interconnect length isn't necessarily important, so long as I don't waste power by using a larger driver device. Equalizing trace lengths is not really an issue on scales like that.

Originally posted by: RaynorWolfcastle
Very interesting stuff, CTho! I thought that these days specialized layout tools took care of a lot of the circuit drawing so as to equalize/minimize trace lengths though.
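The flop-to-flop constraint CTho describes boils down to one check: the launching flop's clock-to-Q, the logic gates, the wire, and the capturing flop's setup time must all fit within a cycle. A rough sketch with entirely made-up numbers (not any real design's values):

```python
# Rough sketch of a flop-to-flop timing check. Every number is hypothetical.
def path_slack(gate_delays_ps, wire_ps, clk_to_q_ps, setup_ps, cycle_ps):
    """Return timing slack in ps (positive means the path meets timing)."""
    path = clk_to_q_ps + sum(gate_delays_ps) + wire_ps + setup_ps
    return cycle_ps - path

# A path with 8 gates of ~40ps each plus 60ps of wire, in a 400ps cycle.
slack = path_slack([40] * 8, wire_ps=60, clk_to_q_ps=50, setup_ps=40,
                   cycle_ps=400)
print(slack)  # -70: this path misses timing; fewer gates or a shorter wire
```

This is also why, as CTho says, interconnect length often isn't the lever: when `sum(gate_delays_ps)` dominates the path, shortening the wire barely moves the slack.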
Cool. AMD bought the Geode line from National (in August '03, I think). I think they gave up on digital designs.

BTW, I've got a contact at National Semiconductor and I think I might be able to pull off an internship there next summer. The chips I've seen from National are mostly analog, though, so I assume it's going to be an insane amount of SPICE. I hope I can handle a whole summer of SPICE work.
Originally posted by: beer
Originally posted by: biostud666
It will be interesting to see how much the AT forums know about the actual process of making the chip.
Personally, I doubt that many know what actually happens in the factories. Could be some kind of dark ritual that makes it all work.
I agree. Next semester I have a microelectronic fabrication techniques course. I'll get back to ya in a few months.
Yeah, I haven't been posting as much recently. <grins, ducks and runs>

Originally posted by: RaynorWolfcastle
Wow, a genuinely interesting thread in General Hardware; I haven't seen that in a while!
Chris is right: the important bits are done by hand at the FET level, but large chunks are done with synthesis. Most of the transistors nowadays are cache and, though they are done by hand, they are largely the equivalent of making one cell and repeating it ad infinitum.

pm, I thought that a good portion of the CPU cores was hand-designed (at the gate/transistor level) to squeeze out every last bit of performance; was I mistaken?
Yeah, SPICE level 7 parameters are largely incomprehensible. Don't let that put you off... I can barely guess at most of them, and I'm supposed to be doing this for a living. The older SPICE models like level 3 were pretty readable and made sense, but the later ones have pretty much departed reality, and it's really hard to see what they are supposed to physically mean. A lot of the parameters in L7 models are essentially curve-fitting parameters that don't have much physical meaning - if any at all. You are ahead of most circuit designers I've met if you have even looked at the parameter file.

All I know is that I find digital circuit design interesting, but when we briefly looked at SPICE MOSFET level 7 parameters in an electronics class I didn't have the slightest idea of what was going on. Heck, I didn't understand what half the parameters were referring to.
Intel tends to use a lot of in-house software, as do many of the other larger corporations (IBM, HP). Still, Intel's simulator bears such a striking resemblance to SPICE (at least in terms of file formats and on the surface) that if you didn't know what was under the hood, you'd think it was SPICE. And yes and no on the way the synthesis tools work. They could dump everything to SPICE, but that would take forever to simulate, so the ones that I have seen (such as Silicon Ensemble from Cadence) either use an external static timing analysis tool, or have some form of pseudo-static timing analysis built into the synthesizer. A common static timing analysis tool in the industry is Synopsys's PathMill. The advantage of static timing analysis over SPICE is a huge increase in speed (and a reduction in computing requirements), at the cost of some accuracy. The minimal loss in accuracy is a small price to pay for the huge improvement in timing analysis throughput.

BTW, I'm kind of curious: is anything at all done in SPICE for digital electronics in industry? We used a model for TSMC's .18um process when we looked at that insane SPICE model, so I'm thinking maybe the digital synthesis tools compile everything to a SPICE deck for simulation using physical parameters at some point? Can anyone tell me what kind of software the designers use during various phases?
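The static timing analysis pm contrasts with SPICE is, at its core, a longest-path computation over a gate-level graph: propagate arrival times through the netlist once, instead of solving circuit equations. A toy sketch of that idea (not any real tool's algorithm; node names and delays are invented):

```python
from collections import defaultdict

# Toy static timing analysis: arrival times are longest-path distances
# through a combinational DAG. Real tools model slews, loads, and both
# rise/fall arcs; this only shows why STA is so much cheaper than SPICE.
def arrival_times(edges):
    """edges: (src, dst, delay_ps) tuples. Returns max arrival time per node."""
    fanout = defaultdict(list)
    indeg = defaultdict(int)
    nodes = set()
    for u, v, d in edges:
        fanout[u].append((v, d))
        indeg[v] += 1
        nodes.update((u, v))
    arrival = {n: 0.0 for n in nodes}
    ready = [n for n in nodes if indeg[n] == 0]   # primary inputs arrive at t=0
    while ready:                                  # topological traversal
        u = ready.pop()
        for v, d in fanout[u]:
            arrival[v] = max(arrival[v], arrival[u] + d)
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)
    return arrival

# A four-edge "netlist" with invented delays in picoseconds.
netlist = [("in", "g1", 10), ("in", "g2", 5), ("g1", "g2", 15), ("g2", "out", 20)]
print(arrival_times(netlist)["out"])  # 45.0, via the long path in -> g1 -> g2 -> out
```

One linear pass over the graph replaces a transient simulation of every path, which is where the speed advantage over a full SPICE deck comes from.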