Brown Bag Lunch: Sanguinetti & Sandler
John – It was a serial process. In the end, it was always a combination of this particular version of the operating system with that particular hardware. The hardware was built to a specification, and the software was implemented to that specification. If we discovered the hardware didn’t really match the specification, we’d change the software.
Scott – With additional capacity on the silicon, now we’re embedding the processors and the software onto the chip. It’s embedded software now. The application that gets shipped in the black box, the IC, includes lines and lines of code. My Sony camera? Internally, there’s the SoC and there’s a set of programs, depending on the model of the camera. You can buy a different Sony, but it’s the same hardware. Only the software’s different.
Peggy – But doesn’t that make it more hackable, like the iPhone?
Scott – Sure, so you need to verify that it’s not hackable. You invite people to hack into it by putting in additional [safeguards, and letting them try].
John – That’s been going on for 20-plus years. When I got to Amdahl, I was appalled to discover they were going to create a family of computers out of the 580, but only had one machine. They just changed some code in the thing, some macrocode, [although] that slowed the thing down. That was back in 1982.
Scott – Didn’t they get sued over that?
John – How can you get sued for that?
Peggy – Okay, back to verification.
Scott – With verification, you need to add more value by tying things together to deliver more of what’s missing for the customer. Some of our customers can afford to hire the best people. But others can’t, and they struggle more. [Clearly] there’s a continuum of talent and different companies pay at different levels, so the key word in “EDA” is “Automation.” We look at what our customers are doing, and how they’re spending their time and energy, and we seek to automate that. For the most advanced customers, it’s way harder to automate [their designs], but for the run-of-the-mill customers – the vast mainstream – it’s easier. Back in the 1980s, John and I were writing directed tests and making the tools. Then [various companies] came along, and they wrote programs and automated the process. It’s my observation that EDA is about creating practical programs that automate best practices. Verification planning is the next step in that automation.
John – That’s a real salient point. In EDA, in general, automation has value to the extent that it automates a process. That’s not just in verification, but in all EDA products. They’re only valuable to the extent that they automate a process that’s not possible, or is too tedious, to be done by hand. EDA tools don’t just automate best practices. When you provide the tools to the leading-edge guys who need them to build the leading-edge products, the tools themselves really need to lead.
Scott – Can you give an example?
John – High-level synthesis.
Scott – That’s being done by a human?
John – Probably. How about place and route?
Scott – Sometimes that was best practices and sometimes it was grunt work. How did we verify before simulators? Somebody wrote 1’s and 0’s, and watched them propagate [through the design]. And we needed models for transistors. Layers and layers of programs have been written there; that was always grunt work before. Timing analysis is another example. On my first chip at Intel, I created a list of timing parameters for every cell. I had a cell for every path and referred back to the models along the way. Whenever we changed something, it was recalculated. I did it on an Intel blue box in SuperCalc.
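[Editor’s note: The bookkeeping Scott describes – per-cell timing parameters, path delays recalculated whenever a model changed – can be sketched as a few lines of code. This is purely illustrative; the cell names and delay numbers are invented for the example, not taken from any real library or from the Intel design he mentions.]

```python
# Illustrative sketch of hand-built static timing analysis:
# a timing model (delay per cell type) and path delays computed
# by summing the delays of the cells along each path.
# All cell names and numbers here are invented for the example.

cell_delay_ns = {
    "DFF_CLK_TO_Q": 1.2,  # flop clock-to-output delay
    "NAND2": 0.8,
    "INV": 0.4,
}

def path_delay(path):
    """Total delay of a path: the sum of its cells' delays."""
    return sum(cell_delay_ns[cell] for cell in path)

# A path from a flop's clock-to-Q output through two gates.
critical_path = ["DFF_CLK_TO_Q", "NAND2", "INV"]
print(path_delay(critical_path))  # 2.4 ns with the numbers above

# Changing a cell's model recalculates every path that uses it,
# just as changing one spreadsheet cell rippled through the sheet.
cell_delay_ns["NAND2"] = 1.0
print(path_delay(critical_path))
```

In a spreadsheet like SuperCalc, the timing model lives in one block of cells and each path is a formula referring back to it, so any edit to a model propagates automatically – the recalculation Scott describes.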
Peggy – How would the flows look today if we could start all over again from scratch?
John – That’s fairly difficult to do. We’ve spent the last 9 years trying to do that.
Scott – There have been various attempts to do that. I think there can be vastly different, alternative ways to specify and implement logic into silicon. Getting them to be widely adopted is another big challenge. We’re talking about people here, people with habits who don’t like to take risks. Look back at silicon compilation. Those were efforts to redefine the whole process end-to-end. We haven’t seen anything like that emerge or be discussed [for some time], although some academics are still talking about it.
John – There are a number of companies who have been engaged in electronic design who have developed their own methodologies, and in some cases they look very different. I’ve met with customers in military electronics who have a set of constraints [which define their] business. The military has platforms that must last 30 to 40 years, with the electronics replaced periodically. They’ve got boards crammed full of chips [trying to move to] one big board and one big chip. There has to be absolute compatibility, with not one line of software changed. These guys develop their own methodology that’s completely different from what we use. They’re deciding right now if this is too big of an overhead to keep going this way. They may want to switch to commercial tools.
Scott – We’ve all run into companies over the years that wanted to build their own flows, but it became a liability. So they bought commercial tools and moved on. Even today, however, the big companies still have their own simulators and tools.
John – Look at NEC. By far, they have the most experience doing ESL design. They’ve had their own tools for 10 to 15 years. Periodically, they’ve thought they wanted to commercialize them, but have never succeeded because it’s an awfully big investment to do that.
Peggy – Is EDA about revolution, or about quick-paced evolution?
Scott – In all walks of life, there will be revolutionaries. Mostly they will be outcasts or ignored, if not shot. But from time to time, revolution does in fact happen. Was synthesis a revolution? Not really. It was the automation of something that had already been done manually.
John – Yes, but a qualitative change happened because synthesis allowed a change in the level of abstraction.
Scott – Was that revolutionary?
John – Yes, it was.
Scott – But correct-by-construction says the software system can guarantee that automated implementation is, in fact, correct without manual intervention.
John – Correct-by-construction is automating the transition from one level to another. There’s nothing wrong with that.
Scott – But people have not been able to trust the transition. Why do we still do gate-level simulation and timing analysis? Lots of effort still goes into checking the results of the synthesis tools.
John – Most of the effort goes into verifying at the level of abstraction of the design.
Scott – Constructing models of the specification at two different levels of abstraction.
Peggy – Isn’t that what John’s doing?
John – Not really.
Scott – He’s taking a specification, creating a C model, and then creating an RTL model.
John – We’re saying, you’ve got a C model, create the RTL model, and verify that the C model is right. That’s really the right way of doing things – if you’re in love with building two models.
Scott – I’m not here to take a stand on the right way to do things. But there are a variety of different ways in use today. There are companies making profitable chips [doing things a variety of ways].