Posts Tagged ‘static verification’
Thursday, June 25th, 2015
The propagation of unknown (“X”) states has become a more pressing issue with the move toward billion-gate SoC designs, and especially so with power-managed SoC designs. The SystemVerilog standard defines an X as an “unknown” value used to represent the state in which simulation cannot definitively resolve a signal to a “1,” a “0,” or a “Z.”
Synthesis, on the other hand, defines an X as a “don’t care,” enabling greater flexibility and optimization. Unfortunately, Verilog RTL simulation semantics often mask the propagation of an X value, while gate-level simulations show additional Xs that will not exist in real hardware.
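To make the masking concrete, here is a minimal sketch, in Python rather than SystemVerilog, of the two interpretations of an X-valued condition. The function names and the three-value encoding ('0', '1', 'X') are illustrative assumptions, not part of any standard API: `rtl_if` mimics Verilog RTL semantics, where an X condition evaluates as false, while `xprop_if` mimics an X-accurate interpretation in which the unknown propagates unless both branches agree.

```python
def rtl_if(sel, a, b):
    # Verilog RTL simulation semantics: a non-'1' condition (including X)
    # takes the else-branch, so the unknown selector is silently masked.
    return a if sel == '1' else b

def xprop_if(sel, a, b):
    # X-accurate semantics: with an unknown selector, the result is
    # unknown unless both branches would produce the same value anyway.
    if sel == 'X':
        return a if a == b else 'X'
    return a if sel == '1' else b

print(rtl_if('X', '1', '0'))    # RTL masks the X and returns '0'
print(xprop_if('X', '1', '0'))  # X-accurate semantics return 'X'
print(xprop_if('X', '1', '1'))  # branches agree, so the result is '1'
```

The gap between the two functions is exactly the gap the presentations below discuss: RTL simulation quietly picks a branch, so a real unknown never surfaces, while gate-level models err in the other direction and show Xs that the silicon would never produce.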
The sheer complexity and common use of power management schemes increase the likelihood of an unknown “X” state in the design translating into a functional bug in the final chip. This possibility has been the subject of two technical presentations at the Design and Verification Conference during the last couple of years: I’m Still in Love With My X! But, Do I Want My X to Be an Optimist, a Pessimist, or Eliminated? and X-Propagation Woes: Masking Bugs at RTL and Unnecessary Debug at the Netlist. Let’s look more closely at this issue and the requirements for a solution.
Thursday, March 12th, 2015
Last week I attended the Design and Verification Conference in San Jose. It had been six years since my last visit to the conference. Before then, I had attended five years in a row, so it was interesting to see what had changed in the industry. I focused on test bench topics, so this blog records my impressions in that area.
First, my favorite paper was “Lies, Damned Lies, and Coverage” by Mark Litterick of Verilab, which won an Honorable Mention in the Best Paper category. Mark explained common shortcomings of coverage models implemented as SystemVerilog covergroups. For example, a covergroup has its own sampling event, which may or may not be appropriate for the design. If you sample when a value change does not matter to the design, the covergroup counts a value as covered when in fact it isn’t. In the slides, Mark’s descriptions of common errors were pithy and, like any good observation, obvious only in retrospect. More interestingly, he proposed correlating coverage events via the UCIS (Unified Coverage Interoperability Standard) to verify that they have the expected relationships. For example, a particular covergroup bin count might be expected to equal the pass count of some cover property (in SystemVerilog Assertions) somewhere else, or to match some block count in code coverage. It struck me that some aspects of this must be verifiable using formal analysis. You can read the entire paper here and see the presentation slides here.
I was also impressed by the use of the C language in verification — not SystemC, but old-fashioned C itself. Harry Foster of Mentor Graphics shared some results of his Verification Survey, and there were only two languages whose use had increased year over year: SystemVerilog and C. For example, there was a Cypress paper by David Crutchfield et al. in which configuration files were processed in C. Why is this, I wondered? Perhaps because SystemVerilog makes it easy via the Direct Programming Interface (DPI): you can call SystemVerilog functions from C and vice versa. Also, a lot of people know C. I imagine if there were a Python DPI or Perl DPI, people would use those a lot as well!
Thursday, March 5th, 2015
The Design and Verification Conference Silicon Valley was held this week. During Aart de Geus’ keynote, he shared how SoC verification is “shifting left”, so that debug starts earlier and results are delivered more quickly. He identified a number of key technologies that have made this possible:
- Static verification that uses a mix of specialized code analysis and formal technology, which is much faster and more focused than traditional simulation
- New third generation of analysis engines
- Advancements in debug
Real Intent has also been talking about this new suite of technologies that improve the whole process of SoC verification. Pranav Ashar, CTO at Real Intent, wrote about these in a blog posted on the EETimes website. Titled “Shifting Mindsets: Static Verification Transforms SoC Design at RT Level”, it introduces the idea of objective-driven verification:
We are at the dawn of a new age of digital verification for SoCs. A fundamental change is underway. We are moving away from a tool and technology approach — “I have a hammer, where are some nails?” — and toward a verification-objective mindset for design sign-off, such as “Does my design achieve reset in two cycles?”
Objective-driven verification at the RT level now is being accomplished using static-verification technologies. Static verification comprises deep semantic analysis (DSA) and formal methods. DSA is about understanding the purpose and intent of logic, flip-flops, state machines, etc. in a design, in the context of the verification objective being addressed. When this understanding is at the core of an EDA tool set, a major part of the sign-off process happens before the use or need of formal analysis.
Thursday, September 4th, 2014
Thursday, August 7th, 2014
This article was originally published on TechDesignForums and is reproduced here by permission.
Sometimes it’s useful to take an ongoing debate and flip it on its head. Recent discussion around the future of simulation has tended to concentrate on aspects best understood – and acted upon – by a verification engineer. Similarly, the debate surrounding hardware-software flow convergence has focused on differences between the two.
Pranav Ashar, CTO of Real Intent, has a good position from which to look across these silos. His company is seen as a verification specialist, particularly in areas such as lint, X-propagation and clock domain crossing. But talk to some of its users and you find they can be either design or verification engineers.
How Real Intent addresses some of today’s challenges – and how it got there – offer useful pointers on how to improve your own flow and meet emerging or increasingly complex tasks.
Thursday, July 3rd, 2014
SoC companies are coming to rely on RTL sign-off of many verification objectives as a means to achieve a sensible division of labor between their RTL design team and their system-level verification team. Given the sign-off expectation, the verification of those objectives at the RT level must absolutely be comprehensive.
Increasingly, sign-off at the RTL level can be accomplished using static-verification technologies. Static verification stands on two pillars: Deep Semantic Analysis and Formal Methods. With the judicious synthesis of these two, the need for dynamic analysis (a euphemism for simulation) gets pushed to the margins. To be sure, dynamic analysis continues to have a role, but increasingly as a backstop rather than the main thrust of the verification flow. Even where simulation is used, static methods play an important role in improving its efficacy.
Deep Semantic Analysis is about understanding the purpose or role of RTL structures (logic, flip-flops, state machines, etc.) in a design in the context of the verification objective being addressed. This type of intelligence is at the core of everything that Real Intent does, to the extent that it is even ingrained into the company’s name. Much of sign-off happens based just on the deep semantic intelligence in Real Intent’s tools without the invocation of classical formal analysis.