As a long-time user of EDA tools for IC design, I have a couple of comments. First, the EDA tools that I use and have used in the recent past almost never crash - not unless I'm doing something really weird or stupid, like trying to pull up the layout for several million transistors completely flat, or running some recursive macro that I didn't think through thoroughly. In my experience, barring the occasional poorly validated release from the developers, they don't have graphical problems, and they don't leave things out. They are mostly designed in-house, or designed by a vendor and then supported locally.
Then we move on to my personal preferences, though I think a lot of long-time engineers feel the same way - certainly the ones I work with do.
I don't like complex or pretty GUIs for IC design. They are good for approachability, but once people know the system, they rapidly move on to more productive ways to access functions. Navigating a menu tree is nowhere near as efficient as an arcane key combination, a custom-created macro, or even mouse-based gestures. In fact, I mostly use Perl (or Python) scripts to generate schematic hierarchy automatically and then make minor changes using the GUI. This kind of stuff is hard to learn - it makes ramping up new hires really hard - but it's vastly more productive. For me, a GUI should graphically show me what I'm working on and then interfere with me as little as possible. I want the speed to draw large, complex objects and to pan and zoom extremely quickly, a large working area, and minimal intrusion by the GUI itself. I want the interface to be easily extensible, easily customizable, easy to drive from macros, and able to read and write data in both a highly compressed format and an easily modifiable ASCII format. I want to be able to write highly complex scripts to automate tasks and integrate them into the GUI with little to no pain.
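To make the script-generated-hierarchy idea concrete, here is a minimal sketch. The ASCII .sch stub format, the block names, and the file layout are all invented for illustration - the point is only that a few lines of script can stamp out a block-level hierarchy that you then touch up in the GUI:

```shell
# Hypothetical sketch: generate one ASCII stub per block, plus a top level
# that instantiates each block. The "CELL"/"PORT"/"INST" lines are made up.
root=$(mktemp -d)            # stand-in for the design directory
blocks="fpu idp cache"

: > "$root/top.sch"          # start with an empty top-level schematic
for b in $blocks; do
    printf 'CELL %s\nPORT clk\n' "$b" > "$root/$b.sch"     # per-block stub
    printf 'INST %s_0 %s\n' "$b" "$b" >> "$root/top.sch"   # wire into the top
done
ls "$root"
```

From there, the minor edits (real ports, real wiring) happen interactively, which is much faster than drawing every block by hand.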
The systems that I like for schematic capture and layout design (polygon pushing) tend toward the very ugly - minimal colors, very high speed, with a small command set that can be used to build very complex macro systems. I really liked the version of PIGLET that I used a few years back -
link to a Linux port of Piglet. Most people would look at it, wrinkle their nose, and say, "ugh, that's awful. That looks like something from the early 80's." But Piglet is a good example of what I prefer in a GUI. Note the size of the interface panel (not visible in most shots), the lack of colors, the line-based rendering... it's fast. Super fast. If you are loading up the top-level clock distribution route for a monster chip like Montecito, you want something that's screaming fast. The original Piglet that I used integrated tightly with Perl. If you hypothetically needed to go through 8 levels of hierarchy and 50 schematics total and change 128 signal names starting with "fpu" so they all start with "idp" - because someone decided that your FPU circuit needs to be ported to the IDP - that's all of about 2 minutes (including writing and executing the script) for an experienced Perl/Piglet user. Now
that's efficient and highly productive. I've seen CAD tools where you need to click each signal individually, right-click, choose "rename", type the new name in, and move on. That could take literally days. And if you want to automate it, you need to write, compile, and debug a LISP or MainSail program to do it for you. I've met people who like LISP (although I have yet to meet someone who likes MainSail), but even they would concede that Perl is a vastly better choice for a one-off task like this.
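This is why the easily modifiable ASCII format matters so much. A sketch of that kind of one-off rename, assuming the schematics live as plain-text *.sch files (the extension, file contents, and directory are hypothetical) - the whole hierarchy is one stream edit:

```shell
# Build a tiny stand-in design tree with a couple of ASCII schematics.
tree=$(mktemp -d)
printf 'SIGNAL fpu_clk\nSIGNAL core_rst\n' > "$tree/top.sch"
printf 'SIGNAL fpu_data0\n' > "$tree/alu.sch"

# The actual one-off: every signal starting with "fpu" now starts with "idp".
# (\b is a GNU sed word boundary, so "myfpu" would be left alone.)
find "$tree" -name '*.sch' -exec sed -i 's/\bfpu/idp/g' {} +

grep -h idp "$tree"/*.sch      # show the renamed signals
```

With a binary-only database, none of this is possible without the vendor's API; with ASCII on disk, it's a one-liner plus whatever sanity checks you care to add.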
I do not like working under Windows - I don't like the way scripting is handled, I don't like the security of the file system, I don't like the inability to easily batch jobs off, and I don't like the overhead that the OS seems to add to things. Loading Cygwin onto Windows helps with many of my complaints, but then leaves me wondering why we don't just run Unix instead. For example, say you are working in the GUI and you want to run a mammoth full-runset DRC (design rule check) job on everything you've done, to check for subtle mistakes. This is likely to take a while and chew up a lot of resources, so you don't want to run it on your own machine. You think, I'll run it on my friend Joe's box because he's off skiing today. Under Linux this takes all of one command line - probably aliased, except for the machine name - and kicking off the job takes less time than figuring out a good machine to run it on. I still haven't found a good way to do something like this in Windows. I'd have to rdesktop in, log in, wait for the OS to log me in and mount my drives, wait some more, pull up the command line, and type out a long command to execute the job.
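The "one command line" looks roughly like the sketch below. The host name ("joes-box"), the tool invocation ("drc -runset ..."), and the paths are all hypothetical placeholders; the sketch builds and prints the command rather than executing it, since the machine is made up:

```shell
# Hypothetical remote DRC kickoff. nohup plus backgrounding lets the job
# keep running on Joe's box after the ssh session disconnects.
host=joes-box
job='cd ~/work/chip && nohup drc -runset full top.gds > drc.log 2>&1 &'

cmd="ssh $host \"$job\""
echo "$cmd"     # printed, not run - swap in a real host and tool to use it
```

In practice you'd alias everything but the host name, which is exactly why it takes seconds to fire off.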
Several times there have been pushes to get the engineers at Intel to migrate to Windows for CAD work. They file us all into a large conference room, show us the "next generation EDA tools" in some glorious PowerPoint slideset, and at the end someone asks, "That looks like Windows XP - will this all be available under Linux?" If the answer is "no", it's time to break out the bodyguards and the asbestos flamesuit. Still, I am seeing a gradual, slow, and grudging migration toward Windows by EDA tools, fought tooth and nail by the vast majority of the engineers.
Patrick Mahoney
Microprocessor Design Engineer
Intel Corp.
Fort Collins, CO