Beating the Debug Delivery Rush using Packaged Metrics

The author, Hamilton Carter, is a Senior Technical Leader for Verification at Cadence. Mr. Carter has been awarded 14 patents covering efficient sequencers for verification simulators, MESI cache coherency verification, and component-based reusable verification systems. At AMD he worked on verification of the K5, K6, and K7 processors and their chipsets, and at Cirrus Logic he managed the company's first functional verification team as well as over 20 commercial chip and EDA projects. Hamilton is also co-author of the forthcoming book Metric Driven Design Verification: An Engineer's and Executive's Guide to First Pass Success.

 

Has your project team been really good this holiday season?  Give them the gift of efficient debug using packaged metrics and metric-driven processes.

 

Engineers who perform functional debug of a logic design usually engage in a process similar to that shown in Figure 1.

 

Figure 1 - Debug Flow
 
 

The verification engineer first categorizes and investigates detected failures to determine which failing testcases are best suited to rapid debug.  They then re-run the testcase with debug information turned on to gather more information about the failure.  After getting a better level of detail, the engineer studies the failure in earnest to determine if it is an actual design failure, or a failure of the surrounding verification environment.  If an actual design failure is found, then the testcase is passed along with appropriate debug information to the design engineer.

And amazingly, this seemingly simple handoff is where a couple of hours of engineering time can be lost. Usually there’s some confusion as to which version of the source code the testcase was run on.  Then, there may be confusion about how environment variables were set for the simulator and the testbench.  What was the LD_LIBRARY_PATH value?  Where did we store the pre-built library files for this testbench?  All of this information can change from engineer to engineer.

All of these pieces of information also happen to be metrics.  Today, these gems of information are stored in disparate places. Some are stored as cultural knowledge, or as part of what Jung called the ‘group mind’, others are scattered across the landscape.


But when part of the group goes out for lunch, or you’re not as in-tune with the culture as you could be, debug can grind to a halt.

By loading these gems into a deliverable package of process metrics, we guarantee that the knowledge will always be available.  If we go one step further and control our processes with this metric package, the handoff becomes completely automated.  Using verification process automation tools such as Enterprise Manager from Cadence Design Systems, we can do exactly that by metric-enabling our processes today.


First, we plan for the process.  We need to determine what metrics our process will consume and what metrics it will produce.   For our purposes we’re interested in the metrics that the process will consume.  What metrics does our simulation engine need to successfully complete its job?  A partial list of these metrics follows:

  • Settings of environment variables that affect the simulation process, including:

    o Various path variables

    o Tool-specific environment variables

  • Tool command line arguments

  • Tool version

  • Paths to the tools that are used: not only the simulator, but the memory modeler, the verification tool, etc.

  • Revision control information: the release tag for the design and for the verification environment
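As a concrete illustration, the consumed metrics could be captured into a single package file. The sketch below is a minimal Python example; the file format, environment variable list, tool paths, arguments, and release tags are all illustrative assumptions, not the format of any particular VPA tool.

```python
import json
import os

# Hypothetical sketch: gather the metrics a simulation consumes into one
# deliverable package file. All specific values here are placeholders.
def capture_metrics_package(path="metrics_package.json"):
    package = {
        # Environment variables that affect the simulation process
        "env": {var: os.environ.get(var, "")
                for var in ("LD_LIBRARY_PATH", "PATH")},
        # Tool command line arguments (illustrative)
        "simulator_args": ["-debug", "-timescale", "1ns/1ps"],
        # Tool version, as reported by the tool itself
        "simulator_version": "unknown",
        # Paths to the tools that are used
        "tool_paths": {"simulator": "/tools/sim/bin/simulate"},
        # Revision control tags for the design and verification environment
        "release_tags": {"design": "rtl_1.2", "testbench": "tb_1.2"},
    }
    with open(path, "w") as f:
        json.dump(package, f, indent=2)
    return package
```

Because everything a run depends on is now in one file, the package itself becomes the unit that is handed off.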

There may be several other metrics depending on your specific verification tool and environment setup.  We use these metrics as shown in the figure below.

Figure 2 - Metric Enabling a Process

The planned metrics are first encapsulated into a package that can be read by our VPA tool.  The VPA tool in turn uses these metrics to drive our simulator, passing the correct command line arguments and setting the appropriate environment variables.  As the simulation runs, the VPA tool creates a copy of our original metric package and adds output metrics that are specific to this simulation run, such as cycle count and simulation start time.
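This drive-and-clone step can be sketched in a few lines of Python. This is a hedged illustration, not Enterprise Manager's actual API: `run_from_package` stands in for the VPA tool, the package is a plain dictionary, and the output metrics recorded are examples.

```python
import os
import subprocess
import time

# Illustrative sketch of the VPA tool's role: drive a simulation from a
# metrics package, then clone the package with run-specific output metrics.
def run_from_package(package, command=("true",)):
    # Apply the packaged environment variables on top of the current shell
    env = dict(os.environ, **package.get("env", {}))
    start = time.time()
    result = subprocess.run(list(command), env=env)
    # Clone the input package and append output metrics for this run
    clone = dict(package)
    clone["outputs"] = {
        "start_time": start,
        "elapsed_seconds": time.time() - start,
        "return_code": result.returncode,
    }
    return clone
```

Note that the original package is left untouched; the clone carries both the consumed metrics and the run's produced metrics.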

This newly created metrics package is used to automate the handoff.  Using this package and a VPA tool, the design engineer can run the identical simulation on their workstation, the first time, every time.  They also have all the other simulation-specific information from the original run at their fingertips; no more searching the hallways for the verification engineer.
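From the design engineer's side, the handoff could then look like the sketch below. Again, this is an assumption-laden illustration: the package file name and keys are placeholders, not a real tool's format.

```python
import json
import os
import subprocess

# Sketch of the automated handoff: load the delivered metrics package and
# re-run the identical simulation. Keys and file name are assumptions.
def replay_package(path):
    with open(path) as f:
        package = json.load(f)
    # Reconstruct the original environment and command line exactly
    env = dict(os.environ, **package.get("env", {}))
    cmd = [package["tool_paths"]["simulator"]] + package.get("simulator_args", [])
    return subprocess.run(cmd, env=env).returncode
```

Everything the run needs — source revisions, environment, tool paths — travels in the package, so no hallway conversation is required to reproduce the failure.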

This article has discussed a technique called process cloning that is enabled by new metric-driven verification process automation (VPA) tools such as Enterprise Manager from Cadence Design Systems.  This technique allows entire processes to be cloned and passed from engineer to engineer without the usual time-consuming miscommunications that plague workflow hand-offs.  It's only one of a number of new techniques that are enabled by metric-driven technologies.  Do you have a new metric-driven technique?  I'd love to hear about it.  Send an e-mail to Email Contact.

