In late March, Brian Bailey of Semiconductor Engineering published an article on standards: “Design by Architect or Committee?” It made me think of my own experience with the Accellera Unified Coverage Interoperability Standard (UCIS), an experience that leaves me both proud and embarrassed. Proud, because when I was at Mentor Graphics I was the architect of the winning donation, and that's a rare thing in any career: to contribute the design and architecture for an industry standard. Embarrassed, because I know I could have done better in a redesign. Any software engineer will tell you this: the second design is always better, because you've learned from the first. We did some redesign as part of the standardization effort, but not to the degree I wanted.
Last week I attended the Design and Verification Conference in San Jose. It had been six years since my last visit to the conference. Before that, I had attended five years in a row, so it was interesting to see what had changed in the industry. I focused on testbench topics, so this post records my impressions in that area.
First, my favorite paper was “Lies, Damned Lies, and Coverage” by Mark Litterick of Verilab, which won an Honorable Mention in the Best Paper category. Mark explained common shortcomings of coverage models implemented as SystemVerilog covergroups. For example, a covergroup has its own sampling event, which may or may not be appropriate for the design. If you sample when a value change does not matter to the design, the covergroup counts a value as covered when in fact it is not. In the slides, Mark’s descriptions of common errors were pithy and, like any good observation, obvious only in retrospect. More interestingly, he proposed correlating coverage events via the UCIS (Unified Coverage Interoperability Standard) to verify that they have the expected relationships. For example, a particular covergroup bin count might be expected to equal the pass count of some cover property (in SystemVerilog Assertions) somewhere else, or to be at most some block count in code coverage. It struck me that some aspects of this must be verifiable using formal analysis. You can read the entire paper here and see the presentation slides here.
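The sampling pitfall Mark described can be sketched in SystemVerilog. The signal names here (`clk`, `mode`, `enable`) are hypothetical, not taken from the paper:

```systemverilog
// This covergroup samples on every clock edge, so it marks values of
// `mode` as covered even on cycles where the design ignores `mode`
// (for example, when `enable` is low).
covergroup mode_cg @(posedge clk);
  coverpoint mode;
endgroup

// A guarded sampling event counts `mode` only on cycles where the
// value actually matters to the design.
covergroup mode_cg_guarded @(posedge clk iff enable);
  coverpoint mode;
endgroup
```

The difference is exactly the kind of error that is obvious only in retrospect: both covergroups compile and fill bins, but the first one can report 100% coverage of values the design never reacted to.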
I was also impressed by the use of the C language in verification: not SystemC, but old-fashioned C itself. Harry Foster of Mentor Graphics shared some results of his Verification Survey, and there were only two languages whose use had increased from year to year: SystemVerilog and C. For example, there was a Cypress paper by David Crutchfield et al. where configuration files were processed in C. Why is this, I wondered? Perhaps because SystemVerilog makes it easy via the Direct Programming Interface (DPI): you can call C functions from SystemVerilog and vice versa. Also, a lot of people know C. I imagine if there were a Python DPI or Perl DPI, people would use those a lot as well!
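As a sketch of why DPI makes this so convenient, here is a small C function in the spirit of processing a configuration file from a testbench. The function name and the key=value format are my own invention, not from the Cypress paper; the SystemVerilog import declaration in the comment shows how such a function would be made callable from a testbench:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical DPI-C helper: look up an integer value for `key` in a
 * buffer of newline-separated "key=value" pairs; return -1 if absent.
 * On the SystemVerilog side it could be imported as:
 *   import "DPI-C" function int cfg_get_int(string text, string key);
 */
int cfg_get_int(const char *text, const char *key)
{
    char line[256];
    const char *p = text;

    while (*p) {
        /* Take one line at a time, with or without a trailing newline. */
        const char *nl = strchr(p, '\n');
        size_t len = nl ? (size_t)(nl - p) : strlen(p);

        if (len < sizeof line) {
            memcpy(line, p, len);
            line[len] = '\0';

            /* Split at '=' and compare the key part. */
            char *eq = strchr(line, '=');
            if (eq) {
                *eq = '\0';
                if (strcmp(line, key) == 0)
                    return atoi(eq + 1);
            }
        }
        if (!nl)
            break;
        p = nl + 1;
    }
    return -1;
}
```

Because DPI marshals SystemVerilog `string` arguments as plain C strings, a function like this needs no simulator-specific glue: the testbench calls `cfg_get_int(...)` as if it were a native SystemVerilog function.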