 The Breker Trekker
Tom Anderson, VP of Marketing
Tom Anderson is vice president of Marketing for Breker Verification Systems. He previously served as Product Management Group Director for Advanced Verification Solutions at Cadence, Technical Marketing Director in the Verification Group at Synopsys and Vice President of Applications Engineering at …

Guest Post: Formal Verification’s Perfect Storm of Change

January 14th, 2014 by Tom Anderson, VP of Marketing

Both our original post challenging Jasper Design Automation’s statement that “formal will dominate verification” and Jasper’s response have generated excellent readership. Another major player in the formal world, OneSpin Solutions, also has some strong opinions to share. Please join us in welcoming OneSpin’s Director of Marketing Dave Kelf with his guest post:

I would like to thank Breker for driving this debate on the future importance of formal verification. In my opinion, not only will formal dominate verification; I believe the effect of this technology will be as transformational as the advent of logic synthesis.

We need to look at the factors that will drive this perfect storm of change. I would hypothesize that, as with many discontinuous developments, these are:

  • The old methodology, complete with fixes, proving increasingly inadequate
  • A new solution, plus ecosystem, coming of age for the mass market
  • Most important, the gut instinct of a critical mass of decision makers

OK, we all know this. However, I would suggest that the longer these factors take to evolve, the greater the pressure for change (look at LTE in the wireless industry for a good example) and, in the case of formal verification, the lid is about to come off. Let’s look at the change factors.

Now, I know that anyone watching EDA has heard this many times: simulation simply can't scale to modern designs, and in truth it hasn't kept pace for years. However, the problem has worsened recently, even though silicon geometry shrinkage has slowed. Why? Semiconductor companies can no longer count on process improvements to deliver higher clock rates. Instead, they have resorted to adding hardware components to meet the demand for more processing capacity. Think multicore processors and algorithm accelerators.

The verification requirement for these designs rises roughly as the square of circuit size, because each new component must be tested both standalone and in combination with the other components. Multiply these curves and it is easy to see that verification needs will skyrocket. A rigorous IP-based methodology helps to control this, but only to a certain extent. The victim is design quality, at a time of near-zero tolerance for production bugs.
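A little arithmetic makes the squared-law point concrete. The sketch below (a hypothetical illustration, not data from any real project) counts one standalone task per block plus one integration task per pair of blocks: the standalone work grows linearly, but the pairwise work grows as n(n-1)/2.

```python
def verification_tasks(n_blocks: int) -> int:
    """Standalone tests (n) plus pairwise integration tests (n choose 2)."""
    standalone = n_blocks
    pairwise = n_blocks * (n_blocks - 1) // 2
    return standalone + pairwise

# Doubling the block count far more than doubles the task count:
for n in (10, 20, 40):
    print(n, verification_tasks(n))   # 10 -> 55, 20 -> 210, 40 -> 820
```

Going from 10 to 40 blocks (4x the design) multiplies the task count by roughly 15x, which is exactly the curve the paragraph above describes.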

Companies strive to improve their simulation-based environments. In 2010, the emulation market shot from $180 million to $300 million as its application for simulation acceleration went mainstream. SystemVerilog, and before it Verisity's e language with Specman, saw explosive adoption as engineers struggled to create effective simulation stimulus. Yet the verification effort as a proportion of project time continues to grow. What new simulation-based technology is on the horizon to improve this situation? Nothing!

What about formal verification? By this I mean property checking rather than equivalence checking. For years, companies have recognized the potential of this approach. Intel, IBM, and Siemens, to name a few, invested heavily in formal technology, realizing that its exhaustive nature was exactly what their designs needed. The early tools earned a reputation for poor ease of use and limited capacity. Those days are gone. Modern formal engines can handle substantial block sizes and, with better assertion standards, assertion synthesis, and assertion IP, formal has arrived as a mainstream technology.

Formal verification can tackle problems that simulation simply cannot approach, particularly today. Formal is the only practical solution where the number of combinations to test explodes, such as checking communication among multiple IP blocks or ensuring that a power domain can always be restarted regardless of design state. Formal has the fundamental advantage that a property can be verified across all possible operational scenarios without an engineer having to dream up a test to mimic each one. In addition, formal can run a battery of straightforward design checks early in the flow, checks that are impractical in simulation, revealing numerous issues that would otherwise consume hours of debug effort.
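The "all possible operational scenarios" claim is the essence of model checking: exhaustively enumerate every reachable state and check the property in each one, rather than sampling states with test vectors. The toy sketch below (my own illustration, not any vendor's engine, using a made-up two-bit power controller) shows the idea with a plain breadth-first search.

```python
from collections import deque

# Hypothetical 2-bit controller state: (powered, busy).
def successors(state):
    powered, busy = state
    nxt = {(not powered, False)}       # a power toggle always clears busy
    if powered:
        nxt.add((powered, not busy))   # work can start or finish when powered
    return nxt

def holds_everywhere(prop, init=(True, False)):
    """BFS over the full reachable state space; no test vectors needed."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if not prop(s):
            return False, s            # counterexample state found
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True, None                  # property proven for every state

# Property: the controller is never busy while powered down.
ok, cex = holds_everywhere(lambda s: s[0] or not s[1])
```

A real property checker works on symbolic representations of astronomically larger state spaces, but the guarantee is the same shape: either the property holds in every reachable state, or you get a concrete counterexample, which is exactly what no amount of hand-written stimulus can promise.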

All the formal companies are reporting strong 2013 results. OneSpin’s business more than doubled in 2013. Mentor Graphics’ recent verification survey suggests that 27% of participants had adopted formal checking explicitly for manual property checking (not automated assertion generation), and on designs over 20-million gates this shot up to 41%. Yes, 41% of these engineers are writing assertions and running property checking. The formal property checking market has tripled in the last four years. It looks as if it’s catching on!

What stands in the way of formal domination? There are specific requirements that only simulation can meet, and simulation will always retain a reasonably significant share of verification. However, let's consider another driver that could influence engineering gut feel: the intrinsic need to know what a tool is actually doing. For example, synthesis took off when engineers could see the relationship between RTL registers and logic and the gates that were produced. This is possibly one reason why behavioral synthesis is only now starting to catch on after many years. In simulation, designers can watch the design respond to stimulus just as the final circuit would. Could this be what has hampered formal's growth?

Formal tools now present information in a manner that relates easily to the design. Assertions are easier to understand, and techniques have evolved that relate them to real-world operation. Coverage tools point out where an assertion set leaves portions of the design untested. Formal is an understood process, one where confidence in its operation and results is starting to be taken for granted.

Many IP blocks of a reasonable size can be tested today exclusively with formal verification. Formal can also take care of many SoC integration and validation tasks. We’ve already seen formal equivalency checking replace most gate-level simulation.

The technology has come of age, just in time to relieve those aging simulation-based flows. Instead of asking where formal can replace simulation, the real question is: what will be left for simulation to do?

Dave Kelf
OneSpin Solutions






Internet Business Systems © 2016 Internet Business Systems, Inc.