With the pending acquisition of Jasper Design Systems by Cadence, there is renewed discussion of the state of formal verification for RTL design. It has become a mainstream technology that no longer requires a PhD to use, and it is now considered a standard part of the verification process. What has changed? The following is a perspective by Pranav Ashar, CTO at Real Intent.
The capacity of formal tools has certainly improved by orders of magnitude in the last 20 years, far beyond what the raw increase in computer speed alone would explain. This makes everything else I am saying here possible. The second crucial change is that people have figured out where and how to apply formal methods so that they make a real difference. It turns out that the best places are those where the scope is narrow and where there is a full understanding of the failure modes that need to be analyzed.
The hurdles to the ease of use of assertion-based formal analysis have been (1) coming up with the assertions, (2) controlling analysis complexity, (3) debugging failures, and (4) the absence of a definitive outcome. Narrowly scoped verification objectives allow assertions to be generated automatically, keep formal analysis complexity manageable, provide precise debug information, and always return actionable feedback.
Clock domain crossing (CDC) analysis is a great example of how this can be done. The failure mode that needs to be covered involves a confluence of functionality and timing, which is where simulation falls short. Also, the logical scope of the failure analysis is local, i.e. the complexity of formal analysis does not directly scale with chip size. Finally, because the failure mode is well understood, the checks are generated automatically and the debug feedback is in the context of the designer’s intent. These are powerful reasons for using formal methods. Beyond CDC, there is an increasing number of such high-value verification objectives where static analysis and formal methods are essential and viable.
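To illustrate why a CDC failure is a confluence of functionality and timing that simulation handles poorly, here is a minimal Python sketch (illustrative only, not from the article, and not any vendor's actual check). It models an asynchronous crossing in which each destination flop independently resolves to the old or new value of its bit when the source changes near the destination clock edge. A binary-coded multi-bit crossing can then capture "phantom" values that never existed in the source domain, while a gray-coded crossing cannot, because consecutive gray codes differ in exactly one bit. The failure mode is narrow and fully characterizable, which is exactly what makes automatic check generation possible.

```python
import random

def cross_domain_sample(value_before, value_after, width, rng):
    """Model an asynchronous crossing: when the source changes near the
    destination clock edge, each destination flop independently settles
    to either the old or the new value of its bit."""
    sampled = 0
    for bit in range(width):
        old = (value_before >> bit) & 1
        new = (value_after >> bit) & 1
        sampled |= rng.choice([old, new]) << bit
    return sampled

def gray(n):
    """Binary-to-gray conversion: consecutive codes differ in one bit."""
    return n ^ (n >> 1)

rng = random.Random(0)  # seeded for reproducibility
width = 4
binary_glitches = set()
gray_glitches = set()
for n in range(2 ** width - 1):
    for _ in range(100):
        # Binary-coded crossing: several bits may change at once,
        # so the destination can capture a value the source never held.
        b = cross_domain_sample(n, n + 1, width, rng)
        if b not in (n, n + 1):
            binary_glitches.add((n, b))
        # Gray-coded crossing: only one bit changes per increment,
        # so the sampled value is always the old or the new code.
        g = cross_domain_sample(gray(n), gray(n + 1), width, rng)
        if g not in (gray(n), gray(n + 1)):
            gray_glitches.add((n, g))

print(len(binary_glitches) > 0)  # → True: phantom values observed
print(len(gray_glitches) == 0)   # → True: gray coding rules them out
```

Note that the binary-coded bug is intermittent and timing-dependent, so a directed simulation that never toggles the source near a sampling edge will miss it entirely, whereas the structural condition (a multi-bit crossing without gray coding or a handshake) can be detected and checked automatically.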
On always returning actionable feedback: the formal community has been too focused on improving the core engines of formal analysis. While that work is essential, of course, the community has failed to recognize that there is a gold mine of information in the intermediate (bounded) results produced during formal analysis that can be fed back to the user. This feedback provides a better handle on debug, on coverage, and on actionable next steps to improve verification confidence. We are missing out on some low-hanging fruit by not exploiting this information, especially in the context of narrowly scoped verification objectives.
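The idea that a bounded result is still actionable can be sketched with a toy bounded reachability check (a deliberately simplified stand-in for real bounded model checking; the mod-10 counter example is hypothetical). Rather than reporting only "proved" or "counterexample found", it also returns the partial outcome "no violation within N steps", which tells the user how deep the analysis got and what to try next.

```python
from collections import deque

def bounded_check(init, step, bad, bound):
    """Toy bounded reachability: breadth-first exploration up to `bound`
    steps from the initial states. Returns a counterexample depth, a full
    proof if all reachable states are exhausted within the bound, or a
    bounded partial result that is still actionable feedback."""
    seen = set(init)
    frontier = deque((s, 0) for s in init)
    truncated = False
    while frontier:
        state, depth = frontier.popleft()
        if bad(state):
            return ("counterexample at depth", depth)
        if depth == bound:
            truncated = True  # exploration cut off: result is only bounded
            continue
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    if truncated:
        # Not a proof, but tells the user exactly how far the analysis got.
        return ("no violation within bound", bound)
    return ("proved: no bad state reachable", None)

# Hypothetical system: a mod-10 counter whose "bad" state 9 is 9 steps away.
step = lambda s: [(s + 1) % 10]
print(bounded_check({0}, step, lambda s: s == 9, 5))   # bounded partial result
print(bounded_check({0}, step, lambda s: s == 9, 20))  # definitive counterexample
```

In a real tool the bounded result would come with coverage data and hints (e.g. which assertions were exercised to what depth), which is the "gold mine" of intermediate information the article refers to.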
Simulation of an SOC, or of any other system, is always a fallback option. You would always prefer an analytical model if it serves the purpose. The better you understand a problem, the more you can apply a precise specification and an analytical flavor to the analysis, i.e., formal analysis. As SOC design steps become better understood, more areas are becoming amenable to formal solutions.
The status today is that formal analysis is already being used under the hood for high-value sign-off verification objectives such as the verification of clock domain crossings, power management, design constraints and timing exceptions, power-on and dynamic reset, X-effects, and a few more. More high-value verification objectives will continue to be identified where the narrow-scope template can be applied to make formal analysis viable and simulation-like in its ease of use. Formal verification methods for SOCs are firmly anchored in demonstrated value, and I see a bright future ahead for them.
For further comments by Dr. Ashar, view the following video interview. He discusses the keynote speech he gave at the FMCAD 2013 conference on the topic “Static verification based sign-off is a key enabler for managing verification complexity in the modern SoC.”