Real Talk

Archive for August, 2010

The 10 Year Retooling Cycle

Monday, August 23rd, 2010

I still remember the enthusiastic talk around the 10-year EDA retooling cycle in 2000. There was optimism fueled by the dot-com boom. Moore’s Law was in full force. The communications industry was in its infancy, ready for innovative new products. Products were evolving quickly, pressuring designers to produce more and more in less time. This, in turn, fueled an unprecedented demand for new and innovative EDA solutions.

 

Those were the days…  EDA startups were abundant. There were many trade shows, most notably DAC.  Hotels were sold out! The big 3 had huge parties, and oh yes, design engineers could learn of all the new developments over the week.  You really needed a good pair of walking shoes in those days… It was like going to a candy store!

 

From a methodology perspective, automation and re-use quickly became a big focus. Mixed signal designs, multiple clock domains and advanced power management schemes became the norm. Simulators did not have enough horsepower to test all aspects of a chip. Accelerators and emulators became more heavily used, but with them came additional issues.

 

Standards evolved around key issues. The Verilog language grew into SystemVerilog. Standards defined good coding practices, including re-use practices. Lint tools became more heavily utilized to improve design quality and to ensure that re-use guidelines were followed.

 

It is now 2010. The big EDA companies have adopted an all-inclusive volume sales model, putting the squeeze on the smaller companies that have to compete with their “free” software.  As a result, there are fewer EDA companies providing innovation. DAC is a much smaller show. And we don’t hear much about the 10-year retooling cycle anymore.

 

But Moore’s Law is still active, albeit at a slower pace.  Chip sizes continue to grow and complexity continues to increase.  The time-to-market pressures are as strong as before, if not stronger. Verification continues to pose key challenges that beg for automation. And, not surprisingly, the 10-year-old software has slowly aged and no longer meets today’s design requirements.

 

Some lint tools run for tens of hours on designs that could be processed in minutes.  Some CDC tools run for days when it is possible to run in hours.  Some rule-checking tools produce hundreds of thousands of warnings – the wasted debugging effort can add up to an army of engineers.  The confluence of clock domains, power domains and DFT requirements has put significant pressure on design methodologies.

 

There may be fewer EDA companies these days, but innovation is still going strong.  Products for the next 10 years are available and being adopted. Precise lint tools with blazing performance are available. Precise CDC tools make it possible to achieve reliable sign-off on today’s designs. New innovations are underway to solve complex issues such as X-optimism and X-pessimism in simulation.  Automatic formal analysis tools quickly improve design quality with minimal effort.  SDC tools ensure the effectiveness of time-consuming STA efforts. The 10-year retooling cycle is in effect again.
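For readers unfamiliar with the X-optimism and X-pessimism problems mentioned above, here is a toy sketch in Python. This is purely illustrative, not any vendor's implementation: it models a single 2:1 mux and an RTL-style `if` under three-valued logic to show how simulation can either hide an unknown (optimism) or propagate it where real hardware would produce a known value (pessimism).

```python
# Toy three-valued logic: 0, 1, and the unknown value X of 4-state simulation.
X = "x"

def mux_pessimistic(sel, a, b):
    """Gate-level style X-pessimism: an unknown select yields X even when
    both data inputs agree, losing an output the real circuit determines."""
    if sel == X:
        return X
    return a if sel == 1 else b

def mux_hardware(sel, a, b):
    """What silicon actually does: when a == b, the select cannot matter."""
    if a == b:
        return a
    return mux_pessimistic(sel, a, b)

def rtl_if(cond, then_val, else_val):
    """RTL-style X-optimism: Verilog's 'if' treats an X condition as false
    and silently takes the else branch, hiding the unknown."""
    return then_val if cond == 1 else else_val
```

For example, `mux_pessimistic(X, 1, 1)` returns X even though the hardware output is 1, while `rtl_if(X, 1, 0)` quietly returns 0 as if the condition were a clean zero. Reconciling these two behaviors is what makes X-analysis a genuine verification problem.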

 

So what tools are in your flow?  Are they current?  Are they working well?  Can your supplier respond to your needs?  Are you getting what you paid for?

 

You need today’s innovations to deal with tomorrow’s problems!

Hardware-Assisted Verification Usage Survey of DAC Attendees

Monday, August 2nd, 2010

Tradeshows and technical conferences serve as great places to survey the verification landscape and the Design Automation Conference in June was no exception.

 

EVE took the opportunity to poll visitors to its booth with a survey similar to the one used at EDSFair in Japan earlier in the year.  Interestingly enough, some of our findings in the DAC survey tracked with findings from EDSFair.  In other cases, they diverged widely.

 

Our DAC attendees who took part in the survey included designers/engineers, managers, system architects, verification/validation engineers and EDA Tool Support or CAD managers.

 

Both sets of respondents noted that challenges are getting more complex as design teams merge hardware and software into systems on chip (SoCs).  The Verilog Hardware Description Language (HDL) wins out as the number one language for both ASIC and testbench design, with SystemVerilog a distant second.  DAC attendees ranked SystemC ahead of VHDL for ASIC design, while VHDL is used more than SystemC for testbench design.

 

Surprisingly, while more than 70% answered that they own between one and 100 simulation seats, 17% claimed to have more than 200 seats, compared with only 12% owning between 100 and 200 seats.  Our conclusion is that very large farms are more popular than large ones.

 

Unlike their counterparts at EDSFair, almost 70% of whom claimed to be satisfied with their verification flow, DAC attendees are less than satisfied with theirs.

 

DAC attendees noted similar dissatisfaction with runtime performance and rated the setup time for their verification flow poorly.  Efficiency in catching corner cases and reusability were both ranked between less than satisfied and fairly satisfied.

 

When asked to rate the importance of various benefits of a hardware-assisted verification platform when making a purchasing decision, they chose runtime performance, followed by price as most important.  Visibility into the design and In-Circuit Emulation (ICE) came next.  Compilation performance, simulation acceleration and transaction-based design, while considered important, received lower grades than the other criteria.

 

While simulation acceleration doesn’t rank highly among purchasing criteria, those surveyed claimed that simulation acceleration is the mode they use most on their hardware-assisted verification platform.  ICE is the second most used mode, and stand-alone emulation came in third.  Few use transaction-based emulation.  By comparison, the EDSFair survey revealed that transaction-based emulation ranked second after simulation acceleration and was significantly more popular than stand-alone emulation and ICE.

 

The primary use for hardware-assisted verification is ASIC validation, with hardware/software co-verification a close second, a trend we also observed with EDSFair attendees and one most likely driven by the move to include embedded software in SoCs.

Emulation can be used for hardware/software co-verification because it verifies the correctness of both hardware and embedded software simultaneously.  It can quickly process billions of verification cycles.  Unlike older generations that were prohibitively expensive, pricing for today’s emulators is competitive, a key consideration for EDSFair and DAC attendees.

 

The news from Japan in January was positive, and I projected that the widespread adoption of hardware/software co-verification would be good for EDA’s verification sector in 2010.  While the DAC survey didn’t offer up equally encouraging signs, it did confirm that hardware/software co-verification is taking root.  At EVE, we consider that a plus for the hardware-assisted verification market segment.



