The author, Hamilton Carter, is currently a Senior Technical Leader for Verification at Cadence. Mr. Carter has been awarded 14 patents addressing efficient sequencers for verification simulators, MESI cache coherency verification, and component-based reusable verification systems. He has also worked on verification of the K5, K6, and K7 processors and their chipsets at AMD. He managed the first functional verification team at Cirrus Logic, as well as over 20 commercial chip and EDA projects. Hamilton is also co-author of the forthcoming book Metric Driven Design Verification: An Engineer's and Executive's Guide to First Pass Success.
Has your project team been really good this holiday season? Give them the gift of efficient debug using packaged metrics and metric driven processes.
Engineers who perform functional debug of a logic design usually engage in a process similar to that shown in Figure 1.
The verification engineer first categorizes and investigates detected failures to determine which failing testcases are best suited to rapid debug. They then re-run the testcase with debug information turned on to gather more information about the failure. After getting a better level of detail, the engineer studies the failure in earnest to determine if it is an actual design failure, or a failure of the surrounding verification environment. If an actual design failure is found, then the testcase is passed along with appropriate debug information to the design engineer.
And amazingly, this seemingly simple handoff is where a couple of hours of engineering time can be lost. Usually there’s some confusion as to which version of the source code the testcase was run on. Then, there may be confusion about how environment variables were set for the simulator and the testbench. What was the LD_LIBRARY_PATH value? Where did we store the pre-built library files for this testbench? All of this information can change from engineer to engineer.
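The scattered facts above can be captured at the moment the failure is seen. As a minimal sketch (the function, argument names, and the revision label are hypothetical, not any real tool's API), the handoff-critical settings might be snapshotted like this:

```python
import os

def capture_run_metrics(source_revision, prebuilt_lib_dir):
    """Snapshot the handoff-critical settings for a failing simulation run.

    The metrics chosen here are illustrative examples of knowledge that
    otherwise lives only in engineers' heads.
    """
    return {
        "source_revision": source_revision,
        "prebuilt_lib_dir": prebuilt_lib_dir,
        # Record the loader path exactly as the failing run saw it.
        "LD_LIBRARY_PATH": os.environ.get("LD_LIBRARY_PATH", ""),
    }

metrics = capture_run_metrics("rev_1234", "/proj/tb/prebuilt")
```

Anything recorded this way no longer depends on which engineer happens to be in the building.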
All of these pieces of information also happen to be metrics. Today, these gems of information are stored in disparate places: some as cultural knowledge, part of what Jung called the 'group mind'; others are scattered across the landscape.
But when part of the group goes out for lunch, or you’re not as in-tune with the culture as you could be, debug can grind to a halt.
By loading these gems into a deliverable package of process metrics, we guarantee that the knowledge will always be available. If we go one step further and control our processes with this metric package, the handoff becomes completely automated. Using verification process automation tools such as Enterprise Manager from Cadence Design, we can do exactly that by metric-enabling our processes today.
First, we plan for the process. We need to determine what metrics our process will consume and what metrics it will produce. For our purposes we’re interested in the metrics that the process will consume. What metrics does our simulation engine need to successfully complete its job? A partial list of these metrics follows:
o Various path variables
o Tool specific environment variables
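The consumed metrics listed above can be gathered into a single deliverable artifact. A minimal sketch follows; every variable name and value is hypothetical, and the schema is illustrative rather than any real VPA tool's format:

```python
import json

# A hypothetical metric package listing what the simulation process consumes.
metric_package = {
    "process": "simulation",
    "consumes": {
        "path_variables": {
            "LD_LIBRARY_PATH": "/proj/tb/libs",       # illustrative value
            "SIM_PREBUILT_LIBS": "/proj/tb/prebuilt",  # illustrative value
        },
        "tool_environment": {
            "SIM_LICENSE_SERVER": "license1:5280",     # illustrative variable
        },
    },
}

# Serializing the package turns it into a deliverable other tools can read.
package_text = json.dumps(metric_package, indent=2)
```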
There may be several other metrics depending on your specific verification tool and environment setup. We use these metrics as shown in the figure below.
Figure 2 - Metric Enabling a Process
The planned metrics are first encapsulated into a package that can be read by our VPA tool. The VPA tool in turn uses these metrics to drive our simulator, passing the correct command line arguments and setting the appropriate environment variables. As the simulation runs, the VPA tool creates a copy of our original metric package and adds output metrics that are specific to this simulation run such as cycle count and simulation start time.
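The copy-and-extend step can be sketched in a few lines. Assuming the package structure from the hypothetical example above, and with cycle count standing in for the many output metrics a real tool would collect:

```python
import copy
import time

def clone_with_outputs(input_package, cycle_count):
    """Deep-copy the input metric package and attach run-specific outputs.

    cycle_count is an illustrative output metric; a real VPA tool would
    record many more, along with the simulation start time.
    """
    run_package = copy.deepcopy(input_package)
    run_package["produces"] = {
        "cycle_count": cycle_count,
        "sim_start_time": time.strftime("%Y-%m-%d %H:%M:%S"),
    }
    return run_package

original = {"consumes": {"path_variables": {"LD_LIBRARY_PATH": "/proj/tb/libs"}}}
run_record = clone_with_outputs(original, cycle_count=1048576)
```

The deep copy matters: the original planned package stays pristine, so it can seed any number of independent runs.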
This newly created metrics package is used to automate the handoff. Using this package and a VPA tool, the design engineer can run the identical simulation on their workstation, the first time, every time. They also have all the other simulation-specific information from the original run at their fingertips; no more searching the hallways for the verification engineer.
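On the design engineer's side, the re-run reduces to mechanically rebuilding the command line and environment from the package. A sketch, with a hypothetical package layout and a made-up `run_sim` command standing in for the real simulator invocation:

```python
import os

# A hypothetical run package as a VPA tool might hand it to the design engineer.
run_package = {
    "source_revision": "rev_1234",                    # illustrative label
    "simulator_args": ["-tb", "top_tb", "-seed", "42"],
    "environment": {"LD_LIBRARY_PATH": "/proj/tb/libs"},
}

def build_rerun(package):
    """Rebuild the exact command line and environment for an identical re-run."""
    env = dict(os.environ)
    env.update(package["environment"])  # restore the original run's settings
    cmd = ["run_sim", "--rev", package["source_revision"]]
    cmd += package["simulator_args"]
    return cmd, env

cmd, env = build_rerun(run_package)
```

Because every setting comes from the package rather than from memory or hallway conversation, the reproduced run matches the original by construction.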
This article has discussed a technique called process cloning that is enabled by new metric driven verification process automation (VPA) tools, such as Enterprise Manager from Cadence Design. This technique allows entire processes to be cloned and passed from engineer to engineer without the usual time-consuming miscommunications that plague workflow hand-offs. It's only one of a number of new techniques that are enabled by metric driven technologies. Do you have a new metric driven technique? I'd love to hear about it. Send an e-mail to Email Contact