Verification Test Plan: The Book
Harry Foster - The verification plan is developed concurrently with the design. Keep in mind that design development is different from implementation development (that is, RTL). Also, the way we define and build our verification infrastructure has more impact on effectively verifying late-stage spec changes than the verification plan itself.
For example, modern simulation environments are built from modular, object-oriented, class-based components, versus large monolithic (tightly coupled) software programs. These modern environments simplify support for late-stage changes through localization: changes can be isolated by extending base-class library verification components to accommodate them without affecting the entire simulation ecosystem.
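The base-class extension pattern described above can be sketched as follows. This is a conceptual illustration in Python rather than SystemVerilog, and the names (`BaseDriver`, `RevBDriver`, `Packet`) and the toy framing change are invented for the example, not drawn from any real verification library.

```python
# Conceptual sketch of localizing a late-stage spec change by extending a
# base-class verification component. All class names here are hypothetical.

class Packet:
    def __init__(self, payload):
        self.payload = payload

class BaseDriver:
    """Base-class verification component from the shared library."""
    def build_frame(self, pkt):
        # Original framing per the initial spec: header byte 0xA0.
        return bytes([0xA0]) + pkt.payload

class RevBDriver(BaseDriver):
    """A late-stage spec change (new header plus a trailing check byte)
    is isolated in this subclass; the rest of the simulation
    environment continues to use the BaseDriver interface unchanged."""
    def build_frame(self, pkt):
        frame = bytes([0xA1]) + pkt.payload
        check = sum(frame) & 0xFF  # toy checksum standing in for a real CRC
        return frame + bytes([check])

driver = RevBDriver()
frame = driver.build_frame(Packet(b"\x01\x02"))
```

Because only the subclass changes, every testbench that talks to the driver through the base-class interface picks up the new behavior without modification.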
Formal verification can also assist in rapidly verifying late-stage changes for blocks that are good candidates for formal and ones we have previously chosen to prove (thus having existing formal environments we can modify).
10) Does your company endorse the idea of a verification plan? Do they sell tools to help customers put in place such a plan?
Catherine Ahlschlager - Yes, we definitely endorse the idea of a verification plan at Sun.
Doron Stein - Cisco endorses verification plans, and is constantly searching for ways to improve the verification process.
Craig Cochran & Rajeev Ranjan - Jasper Design Automation absolutely endorses the idea of a verification plan. We have developed GamePlan Verification Planner - a free tool - to assist verification teams with the creation of structured, flexible verification plans. We believe that teams that use a structured approach will benefit more from Jasper's brand of systematic formal verification.
Harry Foster - I guess this means I have to take my "Harry Foster" hat off and put my "Mentor" hat on. Yes, of course my company endorses planning for success. Mentor Graphics provides tools to assist in the bookkeeping aspect of the verification planning process - such as the Unified Coverage DataBase (UCDB) with its open API for third-party tool integration.
Andy Piziali - Cadence Design Systems emphatically endorses the use of a verification plan, in particular an executable verification plan: the vPlan. Our verification management product, Incisive Management, is bundled with vPlan examples and templates for a variety of word processors. The vPlan is read by Incisive Management and displayed in a vPlan window with up-to-date verification progress metrics and roll-ups back-annotated into the user's plan. The vPlan may be configured and instantiated along with verification IP and design IP for an integrated reuse strategy.
Each verification IP (VIP) that Cadence sells is equipped with a complete protocol compliance vPlan linked to the functional coverage models of the VIP. This allows our customers to instantiate the VIP vPlan into their master vPlan rather than creating it from scratch for the specific protocol. vPlans are critical to achieving true design, VIP, and verification plan reuse since they allow an IP integrator to clearly understand exactly what the IP developer intended to verify.
Janick Bergeron - Verification planning is an important part of the verification process, in addition to methodology and technology. But a verification plan cannot be achieved without the right technology and methodology. Synopsys provides the broadly adopted VMM methodology for developing robust environments and VCS NTB technology delivering up to 5 times faster verification, enabling DV engineers to predictably achieve their verification plan. Stay tuned for more.
Chapter III: Closing Commentary
Rich Faris - At a high level, verification planning hasn't changed. Before the chip is built, the functions should be defined clearly in specifications at the system, chip, and module levels. The flow through the tool chain should be defined so that the right information is generated at each stage in the process, and the right tools are available.
Each major function in the specification needs to be exercised by one tool or another, until it is deemed to have high enough verification coverage. But that doesn't really tell the whole story. It is like saying that going into battle is the same now as when the weapons were bows and arrows, even though today they are precision smart bombs. In today's wars, the concept is totally different; lots of damage can be done without ever setting foot on the battlefield.
In the same way, the tools of verification have matured drastically. While the system and chip designers are still required to capture their thoughts and plans in English specifications, this isn't enough. Both for static and dynamic verification, designers should capture their expectations for the different scenarios that need to be tested, and use constraints to define these modes or scenarios.
By using PSL (Property Specification Language) or SVA (SystemVerilog Assertions) to capture the constraints that define a mode, then either the design or verification engineer can write properties that are assertions that the design must conform to. Capturing the constraints and properties along with the design, as part of the design and verification process, removes some of the ambiguity of relying totally on English language specifications.
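The article refers to PSL and SVA properties; the following Python function only mimics the idea of a temporal assertion, checking a hypothetical handshake property ("every req must be followed by an ack within a bounded number of cycles") over a recorded cycle trace. The trace format, signal names, and bound are illustrative assumptions, not from the text; a real flow would express this declaratively in PSL or SVA.

```python
# Sketch of a temporal property check over a recorded trace.
# trace: list of (req, ack) booleans, one tuple per clock cycle.

def check_req_ack(trace, limit=2):
    """Return the cycle index of the first request not acknowledged
    within `limit` cycles, or None if the property holds throughout."""
    for i, (req, _) in enumerate(trace):
        if req:
            window = trace[i + 1 : i + 1 + limit]
            if not any(ack for _, ack in window):
                return i  # property violated at this cycle
    return None

good = [(True, False), (False, True), (False, False)]   # ack arrives in time
bad  = [(True, False), (False, False), (False, False)]  # ack never arrives
```

In SVA the same intent would be a one-line property (roughly, `req |-> ##[1:2] ack`); the point of capturing it alongside the design is that the check is unambiguous and machine-executable, unlike an English specification.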
The other part of planning that is changing is the definition of the quality of test, and deciding when enough testing is enough. For simulation there has always been the concept of code coverage. Simulation-based code coverage gave the engineers some idea of how well the lines and states in their code were executed. Now, dynamic ABV (assertion-based verification) or static formal ABV tools purport to report numbers that define some kind of a quality of coverage of the properties on the given design.
Having an idea of how the coverage of the various tools in the chain will be overlaid, and how much weight to put on the different tools' numbers, is still going to be more art than science. Someday perhaps there will be a common database for the various tools in the chain to write their reports, and this will ease the engineer's burden of merging and understanding the disparate coverage data.
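The coverage-merge idea above can be sketched minimally: each tool reports which coverage points it hit, a common store unions them, and one rolled-up number falls out. The point names, tool names, and the unweighted union are invented for illustration; real coverage databases track bins, weights, and exclusions rather than flat sets.

```python
# Hedged sketch of merging coverage reports from multiple tools into one
# combined view. All names here are hypothetical examples.

def merge_coverage(reports):
    """reports: dict mapping tool name -> set of covered point names.
    Returns the union of all covered points across tools."""
    covered = set()
    for points in reports.values():
        covered |= points
    return covered

all_points = {"fifo_full", "fifo_empty", "retry", "parity_err"}
reports = {
    "simulation": {"fifo_full", "fifo_empty"},
    "formal":     {"retry", "fifo_full"},  # formal closes a point sim missed
}
covered = merge_coverage(reports)
pct = 100.0 * len(covered) / len(all_points)
```

Even this toy version shows why the weighting question is hard: a point proven formally and a point hit once in simulation both count as "covered" here, yet they carry very different levels of assurance.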
Chapter IV: Additional Books
Applied Formal Verification
by Harry Foster and Douglas Perry
Functional Verification Coverage Measurement and Analysis
by Andy Piziali
Writing Testbenches: Functional Verification of HDL Models
by Janick Bergeron
Hardware Verification with C++
by Mike Mintz and Robert Ekendahl
Verification Methodology Manual for SystemVerilog
by Janick Bergeron, Eduard Cerny, Alan Hunter, and Andy Nightingale
SystemVerilog for Design: A Guide to Using SystemVerilog for Hardware Design and Modeling
by Phil Moorby, Stuart Sutherland, Simon Davidmann, and Peter Flake
Chapter V: A Note of Thanks
My thanks to Francine Bacchini for her help with this article. Francine organized the original panel at DAC, chaired by Sharad Malik from Princeton University, and in recent weeks has provided additional, invaluable help in encouraging the panelists to submit their written responses to the questions.
In addition, I am grateful to everyone involved here for their contributions to this discussion.
Peggy Aycinena is Editor of EDA Confidential and a Contributing Editor to EDA Weekly.