
EDA Magazine Review October 25, 2004

Assertion-Based Verification

by Jack Horgan - Contributing Editor


In October 1991, a storm stronger than any in recorded history hit off the coast of Gloucester, Massachusetts. This "Perfect Storm" - so called because it was three storms combined into one - created an almost apocalyptic situation in the Atlantic Ocean, where boats encountered waves of 100 feet. It was the basis of a film of the same name. It may not qualify as a perfect storm, but something is happening on the EDA sea with the convergence of a need, a technology and standards.

First, the need. Much, perhaps too much, has been written about the dramatic increase in the complexity of ICs by any available metric. This situation has ramifications for every aspect of the product lifecycle: design, verification, fabrication and test. EDA tools now enable design teams to generate designs that cannot be fully verified in anything like a reasonable time. Many articles cite the figure that 70% of design time is spent in verification. The traditional black-box approach to verification is to apply a set of input test vectors to a design simulation and compare the actual output against the intended output. The test bench consists of a combination of cases that reflect the specification, worry cases and random inputs. Unfortunately, even rather simple designs can require an enormous number of test vectors, and therefore simulation time, to test all possible combinations of inputs exhaustively. Consider the case of a simple comparator with two 32-bit inputs, which would require 2^64 vectors to evaluate exhaustively. Eliminating redundant and illegal inputs from test benches to reduce the number of vectors is itself a major challenge.

A second shortcoming of simulation is that internal errors may never propagate to the outputs, making them unobservable. Even if the effects do reach the output, it may be very difficult to trace them backwards to their cause. Lastly, certain types of implementation errors are difficult to stimulate with functional tests. These are referred to as verification hot spots: there is a lack of control, i.e. an inability to activate, stimulate or sensitize a specific point within the design. The consequence of these observability and controllability limitations is that a chip may require an expensive (~$1M) re-spin or, worse, a field recall due to design flaws undetected during the verification stage. According to a well-publicized study by Collett International Research, Inc., 70% of designs requiring multiple re-spins have logic flaws.
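To put the comparator example in perspective, assume (hypothetically) a simulator throughput of one million vectors per second:

    2^64 vectors ÷ 10^6 vectors/second ≈ 1.8 × 10^13 seconds ≈ 585,000 years

Exhaustive simulation is thus out of the question for all but the smallest blocks.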

Another issue with simulation is coverage: how do we know that the verification process is complete? There is both functional (specification) coverage and implementation coverage. Metrics for the latter include toggle coverage, code coverage (line, branch and sub-expression coverage), and finite state machine (FSM) coverage (state, arc and path). While easy to measure, these metrics are difficult to correlate with coverage of functional intent. Functional coverage metrics, including coverage points, are designed to check that the verification tests have exercised specific key functionality - for example, executing all instruction types or transmitting and receiving all packet types. However, functional coverage on the chip inputs and outputs may miss the exercise of internal structures like FIFOs.
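As an illustration, coverage points can be placed directly on such internal structures. The sketch below (module and signal names are hypothetical) uses SVA cover directives to record whether a FIFO ever actually filled up during a regression:

```systemverilog
// Hypothetical coverage monitor attached to a FIFO's internal signals
module fifo_coverage (input logic clk, full, empty, wr_en, rd_en);
  // Did the FIFO ever fill completely during the regression?
  cover property (@(posedge clk) full);
  // Simultaneous read and write while non-empty: a classic corner case
  cover property (@(posedge clk) wr_en && rd_en && !empty);
endmodule
```

If either property is never covered, the test bench has not exercised that internal behavior, no matter how good the I/O coverage looks.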

Next, the technology: Assertion-Based Verification (ABV). Assertions capture the design intent. They specify both legal and illegal behavior of a circuit structure. The basic ABV method compares the implementation of a design against its specified assertions by embedding the assertions in the design and having them monitor design activity. Assertions are placed wherever protocols may be misused, assumptions violated, or design intent incorrectly implemented. Whenever an assertion is violated, it is reported immediately, without the problem having to propagate to the design's outputs. One can also insert protocol assertion monitors to check for violations of the corresponding interface protocols. Generally, there are two kinds of assertions. Concurrent (temporal) assertions describe behavior that must hold on every clock cycle throughout a simulation, while procedural (immediate) assertions are evaluated only at the point in procedural code where they appear.
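The two kinds of assertions can be sketched in SystemVerilog Assertions syntax (the module and signal names here are hypothetical):

```systemverilog
module fifo_checker (input logic clk, rst_n, wr_en, rd_en, full, empty);
  // Procedural (immediate) assertion: checked only when this statement executes
  always @(posedge clk)
    if (wr_en)
      assert (!full) else $error("write attempted while FIFO full");

  // Concurrent assertion: evaluated on every clock cycle of the simulation
  property no_read_when_empty;
    @(posedge clk) disable iff (!rst_n) rd_en |-> !empty;
  endproperty
  assert property (no_read_when_empty)
    else $error("read attempted while FIFO empty");
endmodule
```

Either failure fires at the cycle and location of the violation, rather than surfacing cycles later at a chip output.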

ABV can also be combined with formal verification techniques. Formal verification uses mathematical techniques to prove some assertions true (given a set of assumptions) and other assertions false (by discovering counterexamples). A proof means that static formal verification has exhaustively explored all possible behavior with respect to the assertion and has determined that it cannot be violated. A counterexample shows the circumstances under which the assertion is not satisfied. One can also produce a bounded proof, measured by a proof radius. The proof radius of an assertion is the number of cycles of design operation from an initial state that are covered by exhaustive formal analysis. Within the proof radius, the formal analysis guarantees that the assertion cannot be violated by any legal input sequence.
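In a formal flow, assumptions constrain the legal input space and assertions are the targets to prove. A minimal SVA sketch (the arbiter and its signal names are hypothetical):

```systemverilog
module arbiter_props (input logic clk, rst_n, req0, req1, gnt0, gnt1);
  // Assumption: a requester holds its request until it is granted
  assume property (@(posedge clk) disable iff (!rst_n)
                   req0 && !gnt0 |=> req0);
  // Assertion to prove: the arbiter never grants both requesters at once
  assert property (@(posedge clk) disable iff (!rst_n)
                   !(gnt0 && gnt1));
endmodule
```

The tool either proves the assertion (possibly only out to some proof radius) or returns a waveform counterexample showing exactly how the two grants can collide.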

Proof radius is an assertion-based coverage metric that suggests whether the design needs more assertions. Assertion density measures the number of assertions of each type in each module. Minimum sequential distance measures the minimum number of levels of sequential logic from a given register to any assertion. Together, these metrics give the development team guidance about where to add more assertions.

0-In Design Automation identifies the fundamental technologies and methodologies (“pillars”) that make the ABV process successful.
-    Automatic Assertion Check
A set of pre-defined assertion rules is applied to the RTL code of the design to target common netlist problems and design flaws. Unlike a syntax and semantic checker such as lint, this approach synthesizes the design and employs formal methods to analyze its internal structures. The process is automatic; no input is required from users.

-    Static Formal Verification
Designers add assertions to their code to capture design intent and to annotate their assumptions. Exhaustive formal verification analyzes the assertions in a design, determining whether each is true, false or indeterminate. No testbench or simulation is required. This approach is typically employed at the block level.

-    Simulation with Assertions
User-specified assertions are simulated with the design and the test environment. The assertions monitor activities inside the design and on the interfaces and report immediately on any violation. At the same time, the assertions collect coverage and statistics data.
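During simulation, an assertion's action blocks can both report violations and gather statistics. A sketch for a hypothetical valid/ready handshake interface:

```systemverilog
module handshake_mon (input logic clk, rst_n, valid, ready);
  int unsigned handshakes;  // statistic gathered alongside the checking
  // Protocol rule: once asserted, valid must hold until ready accepts it
  assert property (@(posedge clk) disable iff (!rst_n)
                   valid && !ready |=> valid)
    else $error("valid dropped before ready was seen");
  // Count completed handshakes as simple coverage data
  always @(posedge clk) if (valid && ready) handshakes++;
endmodule
```

At the end of a regression, the handshake count tells the team whether the interface was meaningfully exercised, not just whether it avoided violations.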

-    Dynamic Formal Verification
This runs concurrently with simulation and leverages all simulation vectors. It identifies states "close" to the assertions and performs local formal analysis around them. Dynamic formal verification takes advantage of the design knowledge encapsulated in the simulation vectors to explore new behaviors in the design, instead of simply duplicating activities already performed by simulation.
Assertion-Based Verification provides benefit to users by:
- Increasing observability of bugs

- Increasing controllability

- Facilitating the diagnosis of bugs by identifying bugs at or near their cause

- Finding bugs earlier in the design cycle (improving time to market)

- Uncovering bugs that would have otherwise remained undetected (more bugs)

- Enabling the use of formal and semi-formal verification techniques

- Preventing wasted simulation cycles

- Improving verification productivity

- Lowering risk of bugs avoiding discovery

- Facilitating the integration of work from multiple designers

- Supporting design reuse and third-party IP

In a paper at DAC 1996, DEC reported that a third of all bugs found in their Alpha microprocessor were detected by assertions.

Third, the standards. Three standards have emerged that are relevant to ABV, namely the Open Verification Library (OVL), the Property Specification Language (PSL), and SystemVerilog Assertions (SVA). Standards lower the risk that resources devoted to developing assertions and assertion-related tools will subsequently be rendered obsolete, and they help create a market for commercial implementations. Users can have confidence that once assertions are inserted into a design they can be used by many tools and remain portable for design reuse. The three standards are described below, followed by a section on Accellera, the organization responsible for driving these standards.




-- Jack Horgan, Contributing Editor.



Copyright © 2020, Internet Business Systems, Inc. — 25 North 14th Street, Suite 710 San Jose, CA 95112 — +1 (408) 882-6554 — All rights reserved.