David Scott
Dave Scott is Principal Architect at Real Intent.  He has been at Real Intent for a little over one year, serving in R&D on Implied Intent Verification, and has gotten a lot better at table tennis besides.  He was drawn into EDA verification software development more than 20 years ago …

My Impressions of DVCon USA 2015: Lies; Experts; Art or Science?

 
March 12th, 2015 by David Scott

Last week I attended the Design and Verification Conference in San Jose.  It had been six years since my last visit to the conference.  Before then, I had attended five years in a row, so it was interesting to see what had changed in the industry.  I focused on test bench topics, so this blog records my impressions in that area.

First, my favorite paper was “Lies, Damned Lies, and Coverage” by Mark Litterick of Verilab, which won an Honorable Mention in the Best Paper category.  Mark explained common shortcomings of coverage models implemented as SystemVerilog covergroups.  For example, a covergroup has its own sampling event, which may or may not be appropriate for the design.  If you sample when a value change does not matter to the design, the covergroup counts a value as covered when in fact it isn’t.  In the slides, Mark’s descriptions of common errors were pithy and, like any good observation, obvious only in retrospect.  More interestingly, he proposed correlating coverage events via the UCIS (Unified Coverage Interoperability Standard) to verify that they have the expected relationships.  For example, a particular covergroup bin count might be expected to match the pass count of some cover property (in SystemVerilog Assertions) somewhere else, or some block count in code coverage.  It struck me that some aspects of this must be verifiable using formal analysis.
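The correlation idea can be sketched in a few lines of C.  This is only an illustration of the check itself, not of Mark's implementation: the two getter functions below are hypothetical stand-ins for queries a real tool would make through the UCIS C API, and the hard-coded counts exist only so the sketch is self-contained.

```c
/* A minimal sketch of cross-checking two coverage counts that are
   expected to agree.  The getters are hypothetical stand-ins for
   UCIS database queries; the constant 42 is placeholder data. */
#include <stdio.h>

static long covergroup_bin_count(const char *bin)       { (void)bin;  return 42; }
static long cover_property_pass_count(const char *prop) { (void)prop; return 42; }

/* Returns 1 if the two coverage events have the expected relationship,
   0 (with a diagnostic) if they do not. */
int counts_correlate(const char *bin, const char *prop)
{
    long b = covergroup_bin_count(bin);
    long p = cover_property_pass_count(prop);
    if (b != p)
        fprintf(stderr, "mismatch: bin=%ld property=%ld\n", b, p);
    return b == p;
}
```

A mismatch here would mean the coverage model and the assertion disagree about what actually happened in simulation — exactly the kind of lie the paper's title warns about.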

I was also impressed by the use of the C language in verification — not SystemC, but old-fashioned C itself.  Harry Foster of Mentor Graphics shared some results of his Verification Survey, and there were only two languages whose use had increased from year to year: SystemVerilog and C.  For example, there was a Cypress paper by David Crutchfield et al. where configuration files were processed in C.  Why is this, I wondered?  Perhaps because SystemVerilog makes it easy via the Direct Programming Interface (DPI): you can call SystemVerilog functions from C and vice-versa.  Also, a lot of people know C.  I imagine if there were a Python DPI or Perl DPI, people would use those a lot as well!
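The DPI pattern is simple enough to show.  Below is a plain C function of the kind a testbench could import; the function name and the "key=value" config format are made up for illustration, not taken from the Cypress paper.  On the SystemVerilog side it would be declared with one line:

```c
/* Sketch of a C function callable from SystemVerilog via DPI.
   SV side (hypothetical):
       import "DPI-C" function int parse_config(string line);
   The config format here is illustrative only. */
#include <string.h>
#include <stdlib.h>

/* Parse a "key=value" config line and return the integer value,
   or -1 if the line is malformed. */
int parse_config(const char *line)
{
    const char *eq = strchr(line, '=');
    if (eq == NULL || eq[1] == '\0')
        return -1;
    return (int)strtol(eq + 1, NULL, 10);
}
```

Because the C side is just ordinary C, the same function can be linked into the simulator or compiled into any other host — which, as a commenter points out below my post, is much of the appeal.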

Of course, the Universal Verification Methodology (UVM) is becoming, well… almost universal.  I get the impression that verification architects are turning into software engineers.  They are having fun, if that is the word, creating abstractions so that they can re-use the same top-level verification code in different circumstances, with differing design blocks or versions of IP.  But as with creating classes in C++ software, as I do for Real Intent, there are many different ways of doing the same thing.  It seems to me UVM has made the verification problem less constrained rather than more constrained, in some sense, and that adds risk as well as making static analysis more difficult.

The crowd got a kick out of the fact that even the UVM experts can’t agree among themselves how much of it is minimally necessary; there were some lively discussions among the presenters in the UVM Session on Wednesday afternoon.  First, Stu Sutherland and Tom Fitzpatrick proposed a minimal subset.  The next two authors contradicted it.  One feature that Tom said never to use was then the subject of a paper by John Aynsley.  Last in the session, my friend Rich Edelman described his UVM template generator.  I think there could be as many template generators as authors!

Some presentations had the tinge of an advertisement.  There was an “e” paper where a user described reasons to miss aspect-oriented programming, which is not found in SystemVerilog.  For the first time, I got a good definition of aspect-oriented programming, which you will find on Wikipedia, as focused on cross-cutting concerns.  My paraphrase of a cross-cutting concern is a feature that would ordinarily require implementation in multiple locations; an aspect-oriented language can put it in one place.  But it also strikes me that an aspect-oriented language really allows the extension or re-definition of anything from anywhere.  This may in fact be aspect-oriented, or it may not; nothing guarantees that it is.  If not, you risk a giant mess where you need to read all the source code to understand anything.  At least object-oriented languages like SystemVerilog have features that push people in an object-oriented direction.
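To make "cross-cutting concern" concrete: call tracing is the classic example, because by hand it must be sprinkled across every call site.  Here is one way a C programmer centralizes it without aspect support — a single macro, names invented for illustration:

```c
/* Sketch of handling a cross-cutting concern (call tracing) in one
   place via a macro, instead of editing every call site by hand.
   All names here are illustrative. */
#include <stdio.h>

static int log_count = 0;   /* how many calls have been traced */

/* Wrap any call expression: count it, log it, then evaluate it. */
#define TRACED(call) \
    (log_count++, fprintf(stderr, "calling %s\n", #call), (call))

static int add(int a, int b) { return a + b; }
static int mul(int a, int b) { return a * b; }
```

An aspect-oriented language would let you attach the tracing to the functions themselves, with no `TRACED(...)` wrapper at the call sites at all — which is both the convenience and, as I argue above, the risk.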

Finally, for Real Intent, I was encouraged to hear from Harry Foster, during the “Art or Science?” panel, that “formal apps” — or focused formal applications dedicated to analysis of a particular problem area — grew in usage year-to-year by over 60%, and this is the fastest-growing area for EDA tools.  I’m glad to be working for a company in such an interesting area.

P.S.  The answer, by the way, to the question of whether verification is “Art or Science” is easy.  Of course, it’s both!


4 Responses to “My Impressions of DVCon USA 2015: Lies; Experts; Art or Science?”

  1. Mark Glasser says:

    As one of the co-authors of David Crutchfield’s paper I can tell you that we were not processing configuration files in C. That doesn’t change your point that C is re-emerging as an important testing language. The reason is not necessarily because of the DPI connection, although that certainly helps. The reason is that people want to write tests that span media – i.e. simulator, emulator, FPGA, and yes, even silicon. They also want to write tests that span hierarchical levels – e.g. cluster-level tests that also run at the system-level. The easiest way to do this is to write the tests in C. C tests can be run on the host simulator via DPI; they can be cross-compiled and run on an ISS or an embedded processor. Once cross-compiled they can also be moved to FPGA or silicon. This is linked with the fact that software is part of the complete system. So, sometimes tests are low-level streams of reads and writes, other times they are driver or application code that runs above the HAL and the low-level firmware.
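    The "tests that span media" point can be sketched in C.  This is my illustration of the pattern Mark describes, not code from the paper: the register address, value, and function names are hypothetical.  The test talks to the design only through two function pointers, so retargeting it from simulator (via DPI) to FPGA or silicon (via memory-mapped I/O) means swapping the backend, not the test.

```c
/* Sketch of a platform-portable test: the test body depends only on a
   tiny read/write abstraction.  Addresses and names are hypothetical. */
#include <stdint.h>

typedef uint32_t (*reg_read_fn)(uint32_t addr);
typedef void     (*reg_write_fn)(uint32_t addr, uint32_t data);

/* The test never knows which platform it is running on. */
int test_scratch_register(reg_read_fn rd, reg_write_fn wr)
{
    wr(0x1000u, 0xA5A5A5A5u);            /* write a scratch register */
    return rd(0x1000u) == 0xA5A5A5A5u;   /* pass if it reads back    */
}

/* Host-side mock backend; on silicon these would be MMIO accesses,
   in simulation they would be DPI calls into the testbench. */
static uint32_t mock_mem[1];
static uint32_t mock_read(uint32_t addr)            { (void)addr; return mock_mem[0]; }
static void     mock_write(uint32_t addr, uint32_t d) { (void)addr; mock_mem[0] = d; }
```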

  2. David Scott says:

    Thanks, Mark! I was puzzled by the resurgence of C; that explains it well.

    I wondered if, for example, a Python DPI would get much use, and your comment suggests not. Nonetheless, a friend of mine said users do ask for one. I didn’t mention that Mark Litterick actually built a Python layer on top of the UCIS API!

  3. […] I am interested to see some positive signs, like Mark Litterick’s DVCon paper I blogged about last time.  But now UCIS has a life of its own without me.  As one of its several parents, I will follow it […]

