Make New Friends, but Keep the Old

Gary Smith is the chief EDA analyst for Dataquest/Gartner Group. I asked him in a recent phone call to talk about the evolution of EDA tools and to comment on the advisability of re-packaging or re-purposing older tools under the guise of new tools. Gary, by his own admission, has been in the industry for a long time - as a customer and an industry analyst. It's for good reason that when Gary speaks, the industry listens. His wide-ranging comments on the evolution of the tools are as follows:

"Basically, the main driver of the tools is the silicon. In essence, an EDA lifetime [for a generation of tools] is two process nodes. Every two process nodes, we run into significantly more severe problems, so that we [find ourselves] having to re-tool. That's been the driving force for moving to the next generation of tools. But the fact that the products [that emerge as a result of the new tools] are more complex, means that the boards are more complex and means that the tools have a wide effect in delaying [the introduction of new products] out into the market."

"[As it turns out], every 10 or 12 years, there's a major inflection point where we come to a place in [design] complexity that we can no longer use the current methodology and timing [strategies] to produce functional silicon. So far, we've seen three generations of tools."

"Initially, there were CAD companies - Calma and the like. The tools from these companies were used to design transistor-type designs, designs that included a handful of transistors like the 7400 logic family from TI. When a maximum of hundreds of transistors started stretching into VLSI-type designs, which included thousands of transistors, we started running into problems."

"Then we moved to the next level of abstraction - gate-level technology, where you designed a physical library, and that physical library became the basis for a design platform, which was basically schematic capture and logic simulation. At the transistor level or CAD level, Spice was our simulator. Those were the days of Daisy-Mentor-Valid and those tools not only did IC design, but also started doing the PCB design, which at that point started to be automated as well."

"[However], once we started approaching 20,000 gates, we found we could no longer get designs out. [Meanwhile], the design cycle has been 9 months to a year FOREVER. They talk about ever-decreasing time for designs - it had decreased by 1997, in fact, in every industry except the automotive industry. The auto industry tried to get it down to 9 months, [but had difficulty doing that]. Historically, the aeronautics industry took 3 to 5 years to get a design out, the automotive industry takes 1 to 2 years, but the computer guys were always at 9 months. Now, everybody is at 9 months."

"[As a result], when you start a design you always ask, 'How much functionality can I put into a design in 9 months?' When we hit 20,000 gates, we found we really couldn't get that many gates done in a 9-month period. That's when you saw the emergence of the RTL technology. [RTL] was pushed commercially - remember LSI was renowned for having the first gate-level simulator, or maybe we can blame IBM, but Synopsys and Cadence came along with this RTL methodology.”

"The RTL technology has proven to be fairly robust - it's been [the norm] for more than 10 years. We went into floorplanning-based design and the infamous IP snafu, and saw productivity go up by 360%. It's all been due to the major boost out of RTL design."

"Now, however, we're out of steam - as of last year, actually, with 90 nanometers. [Today], we're in the middle, somewhere between the 130 and 90-nanometer tool families. But at 100 million gates, none of the initial products for power users are useful - we find we don't have the tools.”

"[It's true], there are quite a few designs with 40 to 50 million gates - some from IBM, for example, and some from Agere Systems - and you're now seeing a second generation of designs coming out of the big guys at 77 million gates. I've even heard stories of designs in excess of 90 million gates - they're doing the design with ESL methodology and in-house tools. Well, actually, there are some commercial tools [in use there as well], but designs are being done with a combination of in-house ES-level tools and brute strength."

"I was on a panel recently at DATE [in Europe] and heard discussion of 25 designs that have been done with SystemC. SystemC is part of the reason that we're able to move up [into higher levels of abstraction in design]. Everybody [on the panel] had used CoCentric tools after the architectural tools were done. There were some low-level tools being used - Axis Systems' XoC tool was a major breakthrough in ESL verification. [Today], Cadence has Incisive, which is a similar tool, but not as powerful - it doesn't [have the same] concentration on the software side."

"One mistake that all of the EDA vendors have made is that they forgot about the software. [Historically], they concentrated on the hardware tools, but ESL tools have to combine both hardware and software [considerations]."

"As a sidebar here, it's important to note that one of the reasons that Verilog doesn't compete with SystemC is that it's not a software-centric tool at all. Verilog's an RTL tool, and SystemVerilog is just a misnomer. I had a guy tell me at DATE that when we first attempted this - in 1994, or '95, or '96 - we were going to use VHDL with all of its software capability. But we [ended up] saying, 'Ouch - that was painful!' I heard another comment at DATE: 'You think writing software in VHDL was painful? Try writing it in Verilog!' And another comment I heard at DATE: 'SystemVerilog is great. It's finally caught up with VHDL!'"
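
[Editor's note: the "software-centric" distinction Smith draws is easiest to see in code. The following is a minimal editorial sketch, not an example from the interview, and all module, port, and channel names are hypothetical. It shows a SystemC producer and consumer exchanging data over an abstract FIFO channel, with no clocks or pin-level handshaking; the behavior reads like ordinary C++ software, whereas an equivalent RTL Verilog description would have to spell out the FIFO's storage, full/empty flags, and a clocked handshake.]

    // Minimal SystemC sketch (hypothetical names) - transaction-level
    // producer/consumer over an abstract, blocking FIFO channel.
    #include <systemc.h>
    #include <iostream>

    SC_MODULE(Producer) {
        sc_fifo_out<int> out;                  // blocking FIFO output port

        void run() {
            for (int i = 0; i < 8; ++i) {
                out.write(i);                  // blocks if the FIFO is full
                wait(10, SC_NS);               // abstract delay, not cycle-accurate
            }
        }

        SC_CTOR(Producer) { SC_THREAD(run); }
    };

    SC_MODULE(Consumer) {
        sc_fifo_in<int> in;                    // blocking FIFO input port

        void run() {
            while (true) {
                int v = in.read();             // blocks until data arrives
                std::cout << sc_time_stamp() << ": consumed " << v << std::endl;
            }
        }

        SC_CTOR(Consumer) { SC_THREAD(run); }
    };

    int sc_main(int, char*[]) {
        Producer producer("producer");
        Consumer consumer("consumer");
        sc_fifo<int> channel(4);               // bounded channel, depth 4
        producer.out(channel);
        consumer.in(channel);
        sc_start(200, SC_NS);                  // run the simulation for 200 ns
        return 0;
    }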

"[Meanwhile], back to tool evolution - what did Synopsys do? They took over the RTL-level methodology, which made them Number 1. [After all], if you own the methodology whose sales go from the RTL level to the gate level, money will be made at the ES level. That's one of the reasons that Synopsys hasn't put together a good virtual silicon prototype - because when they do, mainstream users won't use their tools any more. Their customers are the ASIC designers, and the company wants those guys to buy their layout tools. There's a big reluctance to admit that your world, the one that you own, is going away. [Historically], that was Mentor's problem as well."

"Mentor's good news/bad news was that they owned the military world. But as we moved up to the RTL level, military budgets got slashed as the Berlin Wall came down. The military were power users, and they didn't move to the RTL level. Mentor sat around and asked their customers - the military - if they were going to make the move, and the customers said they'd never do that. Mentor said, 'You won't?' The military said, 'We need a good framework.' So the Falcon framework was developed, while Synopsys and Cadence took the rest of the market away."

"[Today], Cadence has lost the power users to Synopsys - their customers are mainstream. Cadence customers are saying, 'We need a good framework.' So today there's OpenAccess. Now, there's nothing wrong with a good framework - a completion of the methodology so that you can take engineers and move them up into ES-level work and design. If ES becomes the design level, then RTL becomes the implementation level, just as ASIC implementation is [the implementation level] in the gate-level world - give them a netlist and let them do it in silicon."

"Meanwhile, power users are going to continue to do their own layout. They know that you always lose 15% when you move up to the next level. But as you get better, older, more mature - you know you always have to lose 15% as you get that much farther away from the silicon. That's a given."

"We've got something I call the old-tool syndrome. You've always used your old tools. They're in the flow, they're being used. The oldest tool is Spice. We've found that in RTL, and all the way throughout, when the problems get tough … I first saw it in 1990 or '91 with critical paths - the Verilog timing analyzer was not accurate enough to pick up all the timing violations. At the time, I told a guy he could just Spice those critical paths, because [otherwise] the timing violations wouldn't [be caught]. The guy asked me what Spice was, and I said it was analog simulation. He wouldn't use it. The next time somebody asked me for a solution and I suggested Spice, I said it was transistor simulation. Then it was used, and used successfully. Even now, as we're moving up, we're seeing fast Spice being used a lot - providing the accuracy of the old transistor level."
