The Breker Trekker

Bugged about Debug? We Can Help!

February 11th, 2014 by Tom Anderson, VP of Marketing

For today’s blog post, we use as our text a recent article on SemiWiki by well-known verification expert Hemendra Talesara. He provides a nice summary of a recent talk given in Austin by another verification expert, Harry Foster from Mentor. Many of you have probably seen Harry’s blog posts dissecting in great detail the results of a biennial survey that Mentor commissions from Wilson Research Group. There is much less coverage and analysis of the EDA world available today than there used to be, so we all applaud Mentor’s willingness to fund this survey and share the results.

Hemendra’s focus is on the well-known phenomenon of verification consuming more and more of a chip project’s resources. It is not uncommon to find that SoC projects have two or three verification engineers for every design engineer. So what do these verification engineers do with all their time and resources? The interesting result from the Mentor survey is that verification engineers spend 36% of their time on debug. At Breker, we’ve given a lot of thought to how to reduce debug time and effort, so we’d like to share some of our thinking.

Just to fill out the rest of the results, the Mentor survey that Hemendra references shows the following breakdown for verification engineers’ time:

  • 36% on debug
  • 23% on creating and running tests
  • 22% on testbench development
  • 16% on verification planning
  • 4% on everything else

To anyone who’s been involved in verification, the general outline of these results is not surprising. Other studies and surveys have called out debug as the single biggest consumer of verification time and human resources. The 36% number is on the high side compared with many past assessments, presumably reflecting the ever-growing complexity of SoC designs and the exponentially increasing verification effort that they require.

So what can EDA vendors do to help with the debug problem? First, let’s separate debug into three categories: hardware, software, and infrastructure. Hardware debugging occurs when the verification process has uncovered an actual bug in the design. Software debugging is required when a bug in production code is revealed. In both of these cases, there is a baseline of tracking down the source of the problem that is fundamentally hard to reduce. However, the more that verification tools and models can do to point to the source of a bug, or at least offer some guidance, the shorter the debug time.

The third category refers to a wide variety of cases in which the bug is found in neither the hardware nor the software, but rather in the verification infrastructure itself. Verification engineers are no more perfect than designers or programmers. They can make mistakes in hand-written tests, testbench models, stimulus generators, result checkers, scoreboards, assertions, constraints, coverage models, diagnostic code, and so forth. In fact, hundreds of such problems may be found and fixed before the SoC verification environment is stable enough to start finding real bugs in the hardware or software design.
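To make the infrastructure category concrete, here is a minimal, hypothetical sketch in Python of a testbench scoreboard where the bug lives in the checker, not in the design. The class and method names are invented for illustration; real environments would typically use SystemVerilog/UVM, but the failure mode is the same.

```python
# Hypothetical illustration: the *checker* contains the bug, not the
# design under test. All names here are invented for this sketch.

class Scoreboard:
    """Compares observed DUT outputs against expected values, in order."""

    def __init__(self):
        self.expected = []
        self.mismatches = 0

    def expect(self, value):
        self.expected.append(value)

    def observe(self, value):
        # Infrastructure bug: pop() removes from the *end* of the list,
        # so an in-order, correct DUT is compared out of order and
        # flagged as failing. The fix would be pop(0) (FIFO order).
        golden = self.expected.pop()
        if golden != value:
            self.mismatches += 1


sb = Scoreboard()
for v in (1, 2, 3):
    sb.expect(v)
for v in (1, 2, 3):      # the DUT output is actually correct
    sb.observe(v)

print(sb.mismatches)     # prints 2: false failures from the scoreboard itself
```

An engineer seeing those mismatches would naturally start debugging the design, and only later discover that the scoreboard was at fault; this is exactly the kind of time sink described above.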

In some sense, this is wasted time since it’s not finding SoC bugs, but it is a necessary process. The goal should be to shorten it as much as possible both by reducing the number of bugs in the verification infrastructure and by making it easier to diagnose and fix those that sneak in anyway. The answer, of course, is to automate verification as much as possible and to raise the level of abstraction at which the verification engineers are working.

Many studies have shown that the number of bugs tends to scale with the number of lines of code. Programmers who moved from assembly code to higher-level languages learned this decades ago. Everyone who upgraded from hand-drawn gates to RTL and logic synthesis had a similar experience. Most recently, verification engineers moving from hand-written directed simulation tests to constrained-random stimulus also found that whole categories of little bugs just didn’t happen anymore.
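As a rough sketch of why constrained-random stimulus eliminates whole classes of small errors, here is a hypothetical example in plain Python. Real testbenches would express this with SystemVerilog constraint blocks or a framework such as cocotb; the packet fields and ranges below are invented for illustration.

```python
# Hypothetical constrained-random stimulus sketch. Because every packet is
# generated *by construction* within its constraints, the hand-typed
# mistakes of directed tests (bad lengths, misaligned addresses) cannot occur.
import random

def random_packet(rng):
    """One randomized transaction, with constraints narrowing legal values."""
    length = rng.randrange(1, 65)              # constraint: 1 <= length <= 64
    kind = rng.choice(["read", "write"])       # constraint: legal kinds only
    addr = rng.randrange(0, 1 << 16) & ~0x3    # constraint: 4-byte aligned
    return {"kind": kind, "addr": addr, "length": length}

rng = random.Random(1234)                      # fixed seed: reproducible runs
packets = [random_packet(rng) for _ in range(1000)]

# Every generated packet satisfies the constraints, no matter how many
# we generate; a directed-test writer would have to get each one right by hand.
assert all(1 <= p["length"] <= 64 for p in packets)
assert all(p["addr"] % 4 == 0 for p in packets)
```

The point is not the ten lines of generator code but what is absent: a thousand hand-written transactions, each an opportunity for a typo that would later consume debug time.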

We believe that the time has come to further automate SoC verification and to raise the level of abstraction for the models used. In our next blog post, we’ll provide some specific examples of how our graph-based scenario models, visualization technologies, and automatic test case generation reduce the number of errors in the verification infrastructure and reduce the time spent on all forms of debug. In the meantime, we’d love to have your comments on this topic.

Tom A.

The truth is out there … sometimes it’s in a blog.

The test map shown in this post is from our new x86 server validation case study. Please request it at

