 Real Talk
Rick Eram, Sales & Marketing VP
Rick has over 20 years of hands-on experience in the EDA industry, designing tools, directly involved in the development and management of engineering teams, and managing sales and marketing campaigns. Rick’s work was instrumental in two IPOs, with Analogy and Magma. During his tenure at Atrenta he …

Mind the Verification Gap

 
May 24th, 2010 by Rick Eram, Sales & Marketing VP

Would you ever use a wrench to tighten a Phillips screw? Or hammer a square peg into a round hole?

Chip design today has become more of a verification task than a design task. Designers spend more than 50% of their time trying to come up with ways to verify their designs or, worse yet, someone else’s design. Despite the change in the nature of the work, designers keep using the same old design tools, hammering away trying to close the design and verification gap. Shouldn’t you Mind the Gap?

Over the past decade or so, design work has shifted from writing code to integrating IP and verifying code. Most designers today are handed a piece of IP designed by someone who may no longer even be with the company, a design so old that the original designer no longer remembers the details, or IP your company bought from a third party, and are asked to make it satisfy the spec. All is well until you realize that the changes you made to the code have left holes in the functionality that are not covered by the original vectors delivered with the IP or design. In turn, the changes produce unintended consequences that you could not have predicted from the IP or design spec. The issues only magnify once you put all the IP blocks together.

Well, that’s exactly what happens when you try to hammer a Phillips screw into place. Step back and take a good look at the techniques you use today. Are you still using the same simulation methods? Are you still relying on LEC to catch some of the problems? Are you tossing the verification work over the wall to the verification folks and calling it a day – that’s their problem (until it comes back to you with an embarrassing bug!)?

Over the last decade, design teams have added linting to their flow, and EDA vendors have extended linting to cover ever more exotic checks. The tools helped managers become a sort of design IRS and gain a little more visibility into the quality of the design. But the verification tasks did not get any easier, nor did design quality improve as promised. Most designers used these tools only as a checklist, and the unintended consequence was the extra work of deciphering linter reports. This activity often has a low ROI because of the noise, the difficulty of setup, and the burden of managing yet another set of files and results.
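To make the linting point concrete, here is a minimal, hypothetical sketch in Python of the kind of structural check a linter automates. The two rules, the regular-expression matching, and the Verilog snippet are simplifications: real RTL linters work on a fully parsed design, not on text patterns, which is part of why their reports are both richer and noisier than this toy.

```python
import re

# Toy lint rules, roughly in the spirit of common RTL checks:
#   RULE1: a case statement with no default arm (possible unintended behavior)
#   RULE2: a blocking assignment ("=") inside a clocked always block
# Real linters analyze a parsed design; this sketch only pattern-matches text.

CLOCKED_ALWAYS = re.compile(r"always\s*@\s*\(\s*posedge")

def lint(verilog_text):
    findings = []
    in_clocked_block = False
    in_case = False
    case_has_default = False
    for lineno, line in enumerate(verilog_text.splitlines(), start=1):
        if CLOCKED_ALWAYS.search(line):
            in_clocked_block = True
        if in_clocked_block and re.search(r"[^<=!>]=[^=]", line):
            findings.append((lineno, "RULE2: blocking assignment in clocked block"))
        if re.search(r"\bcase\b", line):
            in_case, case_has_default = True, False
        if in_case and re.search(r"\bdefault\b", line):
            case_has_default = True
        if re.search(r"\bendcase\b", line):
            if not case_has_default:
                findings.append((lineno, "RULE1: case without default"))
            in_case = False
        if re.search(r"\bend\b", line):
            in_clocked_block = False
    return findings

if __name__ == "__main__":
    sample = """
    always @(posedge clk) begin
      case (state)
        2'b00: q = d;      // blocking assignment in a clocked block
        2'b01: q <= ~d;
      endcase              // the catch-all arm is missing
    end
    """
    for lineno, message in lint(sample):
        print(f"line {lineno}: {message}")
```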

Even though designers find themselves doing more verification work than design work, the tool of choice is still basically a big hammer (i.e., the simulator). Linters, so far, have helped managers more than the designers in the trenches.

It is perhaps time for more finesse and a bit of strategy. Next-generation tools can help designers strategize their work and target their simulations more effectively. With targeted simulation and on-the-fly functional checking, designers can look deeper into the design and make sure they have not overlooked potential bugs.
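As a rough illustration of what checking functionality on the fly means, here is a minimal sketch of a cycle-by-cycle monitor that flags a request that is never acknowledged within a fixed window. In a real flow this would be an assertion evaluated by the simulator or a formal tool; the signal names (req, ack), the trace format, and the four-cycle window below are all hypothetical.

```python
# A minimal sketch of on-the-fly functional checking: walk a simulation trace
# cycle by cycle and flag any request that is not acknowledged within a
# fixed window. Only one outstanding request is tracked, for simplicity.

def check_req_ack(trace, max_wait=4):
    """trace is a list of per-cycle dicts, e.g. {"req": 1, "ack": 0}."""
    violations = []
    pending_since = None  # cycle at which an unacknowledged request was seen
    for cycle, signals in enumerate(trace):
        if pending_since is not None:
            if signals["ack"]:
                pending_since = None
            elif cycle - pending_since > max_wait:
                violations.append(
                    f"cycle {pending_since}: req not acked within {max_wait} cycles")
                pending_since = None
        if signals["req"] and not signals["ack"] and pending_since is None:
            pending_since = cycle
    return violations

if __name__ == "__main__":
    trace = [{"req": 0, "ack": 0},
             {"req": 1, "ack": 0},   # request at cycle 1 ...
             {"req": 0, "ack": 0},
             {"req": 0, "ack": 0},
             {"req": 0, "ack": 0},
             {"req": 0, "ack": 0},
             {"req": 0, "ack": 0}]   # ... never acknowledged
    for violation in check_req_ack(trace):
        print(violation)
```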

What tools can help in this process? Is it time to rethink strategies and retool? Perhaps it is time to address the design and verification gap head on. That means marrying verification and design activities together and starting verification essentially at the outset. Perhaps it is also time to go beyond traditional simulation, linting and traditional verification techniques. Verification needs to move hand in hand with the design. Early verification will not only increase productivity and ROI, it will also help designers cover as many functional scenarios as possible. Next-generation tools must combine simple setup with very fast analysis runtimes so they can incrementally check the design, help designers target simulation, debug the design on the fly, and provide feedback on the potential holes left in the design by recoding or other changes.
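One small piece of such an incremental, targeted flow can be sketched in a few lines: after each edit, rerun only the regression tests that exercise the blocks you touched. The module-to-test map and file names below are hypothetical, and a production flow would derive the mapping from coverage data or the design hierarchy rather than a hand-written table, but the idea of narrowing the simulation target after every change is the same.

```python
# A minimal sketch of targeted simulation: given the RTL files touched since
# the last run, select only the regression tests that exercise those blocks.
# The mapping and file names are hypothetical placeholders.

TESTS_BY_MODULE = {
    "fifo.v":       ["test_fifo_basic", "test_fifo_overflow"],
    "arbiter.v":    ["test_arbiter_fairness"],
    "bus_bridge.v": ["test_bridge_smoke", "test_bridge_backpressure"],
}

def select_tests(changed_files):
    """Return the set of tests covering the changed RTL files."""
    selected = set()
    for path in changed_files:
        selected.update(TESTS_BY_MODULE.get(path, []))
    # If a changed file is not in the map, fall back to the full regression.
    if any(path not in TESTS_BY_MODULE for path in changed_files):
        return sorted(t for tests in TESTS_BY_MODULE.values() for t in tests)
    return sorted(selected)

if __name__ == "__main__":
    print(select_tests(["fifo.v"]))                 # targeted rerun
    print(select_tests(["fifo.v", "new_block.v"]))  # unknown file -> full run
```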

As your designs grow and you include more IP, your verification tasks will certainly grow. Be sure to Mind the Verification Gap.


2 Responses to “Mind the Verification Gap”

  1. Tom Anderson says:

    Rick, some good comments on the evolving and intersecting roles of design and verification. There’s an additional aspect that you didn’t address – many of the advanced techniques supported by the Open Verification Methodology (OVM) and the Universal Verification Methodology (UVM) are not widely adopted by designers. Developing modern object-oriented testbenches requires significant software skills that seem to be more common among dedicated verification engineers than hardware designers.

    So what’s a designer to do? You hit the nail on the head – early verification is the key. Cadence has been very successful in the formal analysis space largely because we target designers. We show them how to run lint and other types of automatic analysis, then how to write effective assertions, and finally how to apply formal analysis and debug the results in the familiar world of simulation. I’m not going to claim that assertions and formal are always easy to deploy, but they do work well for many design teams.

    Tom A.

  2. Mike Carrell says:

    Rick

    Design and verification are coming closer together, as you and TomA both point out.

    I agree that integrating IP is becoming more challenging, whether the IP is from a third party or from within your own company (across the globe or from a different group a few cubicles away)!

    There is great advantage in tackling these problems as early as possible. After spending so much effort to verify and make the RTL golden, it’s much better to discover EARLY that shutting down a power domain affects your design differently than you’ve planned and verified at RTL. And it’s better to find out at RTL that the timing constraints you THINK are good contain, somewhere, a max-delay value that is less than a min-delay constraint, which would make an optimization or STA tool chew and chew until it cannot converge and exits.
    It’s the designer, not the manager, who has to deal with these surprises and spend more time in non-productive iterations!!

    I’m seeing that visions like EDA360 will pull technology, tools and methodologies closer together and to higher levels of abstraction, and make IP re-use more efficient and practical. That will translate to higher productivity and better profitability for teams who realize the silicon in every application.

    Inspiring article!
    +++ Mike
    P.S. Just for the record, I would not try to use a wrench to hammer a round Phillips screw into a square hole while minding the gap on the London Underground. ;-)




