October 09, 2006
Verification Test Plan: The Book
Andy Piziali - A "verification test plan" is an oxymoron because the name blends one verification method - test - with the broader design intent preservation process - verification. As a verification engineer, I believe the intent of the panel was to discuss the verification plan, so I will address your question in that context.
A verification plan is a natural language document that defines the scope of the verification problem and its solution. The scope is quantified by a structured set of design-under-verification (DUV) features and their respective coverage models. The solution is captured as the functional specification for a verification environment that employs dynamic and static verification techniques (simulation, formal analysis, assertion-based verification, acceleration, etc.).
If this document is machine readable, we refer to it as an executable verification plan - or vPlan - because a verification management tool can annotate the plan with live progress metrics as regressions and formal proofs are run. This transforms the plan from a verification process artifact into an application-specific document user interface.
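The idea of an executable plan annotated with live metrics can be sketched in a few lines. This is a hypothetical illustration, not the data model of any real verification management tool; the feature names and coverage numbers are invented.

```python
# Hypothetical sketch of an "executable verification plan": a structured plan
# whose sections are annotated with live coverage metrics after each regression.
# Feature names and percentages are illustrative only.

class PlanSection:
    def __init__(self, feature, goal=100.0):
        self.feature = feature      # named DUV feature from the plan
        self.goal = goal            # coverage goal, in percent
        self.coverage = 0.0         # annotated from regression results

    def annotate(self, coverage_pct):
        """Update this section with the latest measured coverage."""
        self.coverage = coverage_pct

    def status(self):
        return f"{self.feature}: {self.coverage:.1f}% of {self.goal:.0f}% goal"

plan = [PlanSection("fifo_overflow"), PlanSection("bus_arbitration")]
plan[0].annotate(83.5)   # e.g. merged coverage from last night's regression
for section in plan:
    print(section.status())
```

Because each section carries its own goal and measured value, the same structure doubles as a progress report, which is what turns the plan into a "document user interface" rather than a static artifact.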
Regarding how this plan differs from traditional verification strategies: in the past we wrote a test plan that enumerated each of the functional test cases or scenarios that needed to be exercised. A test (or set of tests) was associated with each scenario. Once each test was written, run, and passed, the scenario in the test plan was checked off. When all of the scenarios were checked off, verification was deemed complete.
Using a verification plan, verification is considered complete when all of the goals defined in the plan have been reached, usually 100% coverage of each coverage model. However, directed tests are still employed in some situations, so each test, once run and passed, is also counted as a goal achieved.
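The completion criterion just described can be expressed as a simple predicate. This is a minimal sketch, assuming coverage models are tracked as measured-versus-goal percentages and directed tests as pass/fail flags; none of the names come from a real flow.

```python
# Minimal sketch of the completion criterion above: verification is "done"
# when every coverage model has reached its goal and every directed test
# has run and passed. Names and numbers are invented for illustration.

def verification_complete(coverage_models, directed_tests):
    """coverage_models: {name: (measured_pct, goal_pct)}
       directed_tests:  {name: passed (bool)}"""
    models_closed = all(measured >= goal
                        for measured, goal in coverage_models.values())
    tests_passed = all(directed_tests.values())
    return models_closed and tests_passed

models = {"cache_states": (100.0, 100.0), "irq_priorities": (97.5, 100.0)}
tests = {"boot_rom_smoke": True}
print(verification_complete(models, tests))  # False: irq_priorities not closed
```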
Craig Cochran & Rajeev Ranjan - A verification test plan can be in any of those formats. However, the plan should be well structured, flexible, dynamically updated, and prioritized. For these reasons, Jasper developed the GamePlan Verification Planner, which produces a customizable, dynamic verification plan stored in XML format and generates HTML reports for easy sharing of status data.
A verification test plan does not differ from traditional verification strategies. Rather, it organizes them so that the verification team can make the most appropriate use of each strategy in the overall verification effort.
2) The DAC panel description referred to "trading brute force for finesse" when putting a verification plan into place. What does this mean?
Andy Piziali - I interpret this to mean that simulation cycles alone are insufficient to achieve functional verification closure. Finesse is required to carefully analyze all sources of DUV design intent - functional specifications, design specifications, whiteboard diagrams, etc. - to capture the DUV requirements as named features with concise descriptions, to design a coverage model for each feature that quantifies its behavioral space, to implement each coverage model (using code coverage for some), and to measure progress against each model as simulation and formal analysis proceed.
Associated generation constraints for a constrained random simulation environment and properties for formal analysis must be designed to achieve full coverage.
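To make "generation constraints" concrete, here is an illustrative constrained-random stimulus sketch. A real environment would express these constraints in SystemVerilog and rely on a constraint solver; this Python version uses rejection sampling instead, and the packet fields are invented for the example.

```python
# Illustrative constrained-random stimulus generation (sketch, not any real
# tool's API). Rejection sampling stands in for a constraint solver; the
# "packet" fields are hypothetical.

import random

def gen_packet(rng):
    """Draw a packet whose fields satisfy the generation constraints."""
    while True:
        pkt = {"length": rng.randrange(0, 256),
               "kind": rng.choice(["read", "write"])}
        # Constraint: write packets must carry a non-zero payload length.
        if pkt["kind"] == "write" and pkt["length"] == 0:
            continue  # reject and redraw
        return pkt

rng = random.Random(0)          # seeded for reproducible regressions
packets = [gen_packet(rng) for _ in range(1000)]
assert all(p["length"] > 0 for p in packets if p["kind"] == "write")
```

Coverage models then measure how well such randomly generated stimulus actually exercises the feature space, which is what ties the constraints back to the plan's goals.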
Harry Foster - Actually, I didn't like this description - trading brute force for finesse. I'm certainly not saying that I am for brute force over finesse. However, this subtitle moves the discussion down to debating verification infrastructure and tools too soon. And in my mind, the real importance of verification planning is the thought process.
My son recently graduated from high school, and I felt it was my fatherly obligation to offer him some words of wisdom that will shape the future he builds. So I told him: "Always remember, in this incredible world of automation - there is no substitute for thinking."
The same holds true for our industry. Automation can help us by providing solutions to the tedious bookkeeping aspect of the verification planning process. However, getting architects, designers, and verification engineers to all think about the problem space, and share their thoughts with each other, is really fundamental to verification success - and this cannot be automated.
With this understanding, I'll now answer the rest of your first question. The results of the verification planning process are generally described in a document referred to as the Verification Plan. This is a living document that captures the conclusions and decisions derived during the verification planning process - such as resource allocation, verification infrastructure, verification metrics objectives, completion criteria, tracking mechanisms, risk analysis, and feature (and functionality) sets that must be verified.
Craig Cochran & Rajeev Ranjan - “Brute force” in verification usually refers to massive amounts of constrained random simulation. Often, teams will determine that the verification effort is complete when they stop finding bugs using this method.
In our view, the “finesse” method to verification planning involves prioritizing the most critical functionality, determining the coverage thresholds and most appropriate verification approach for each feature, and systematically verifying each feature to the required coverage threshold. This ensures correctness where it matters most in the design, and doesn't waste additional time in brute force simulation.
Catherine Ahlschlager - As verification tools become more sophisticated, it's important for us to use the right tools to address the different verification challenges. As an example, formal verification won't be able to tell us whether a microprocessor can execute an assembly program correctly. On the other hand, formal verification can easily tell us whether thread starvation will ever occur in a multi-threaded processor design, which is hard to verify otherwise.
Janick Bergeron - The "finesse" refers to using modern technologies with modern methodologies. For example, Synopsys' VCS functional verification solution supports Native Testbench (NTB) technology, which allows engineers to use the built-in constrained-random stimulus generator and create powerful SystemVerilog testbenches to generate corner-case scenarios. Such scenarios are impossible to conceive manually.
Also, VCS supports native SystemVerilog Assertions (SVA), making it easier to track and find design bugs. A carefully developed verification plan should leverage the power of these advanced technologies to find more design bugs in a given time. The plan should also factor in the use of high-quality verification IP (VIP) for standard protocols. Synopsys has a rich portfolio of VIPs in its VCS Verification Library.
New verification techniques require deployment of modern verification methodologies. For example, Synopsys partnered with ARM to develop the widely used Verification Methodology Manual (VMM) for SystemVerilog. VMM documents best practices for setting up a robust and efficient verification environment leveraging coverage-driven constrained-random techniques, assertions and formal technologies. So, it's all about working smarter and not just working harder.
3) Isn't having to build a verification test plan just another layer of "structural obligation" that adds to the complexity of verification?
Catherine Ahlschlager - Quite the contrary; a verification test plan offers a quantitative way to measure progress. It's the document that one references in review meetings. It forces verification engineers to think not only of ways to prove that the design works the way it should, but also of how to break the design and of the corner cases related to those tests.
Doron Stein - If the verification test plan is regarded as a "layer," then it is indeed a structural obligation. But if the coverage-driven verification plan is a live, dynamic database that takes its input from a coverage matrix and has updated results fed back (ideally, automatically) from the regression, then this "verification plan" becomes the axis along which the design project's progress moves forward.
Janick Bergeron - It's an investment that will ultimately result in higher quality design. Just like it's not wise to build a chip without clear specifications, it's not wise to perform ad-hoc verification. The better the verification plan, the higher the chances of realizing a high quality design.
It used to be that the only things you couldn't avoid were death and taxes. Now we're getting to the point where we might have to add verification planning to the list.
Andy Piziali - No. When the verification plan was an artifact that became obsolete the day File->Save and File->Exit were selected, it could have been interpreted as pure overhead. But even at that time, substantial value was derived from simply creating the plan, because the necessary specification analysis exposed bugs during the process.
-- Peggy Aycinena, EDACafe.com Contributing Editor.