The Dominion of Design
Sanjay Gangal
Sanjay Gangal is a veteran of the electronics design industry with over 25 years of experience. He has previously worked at Mentor Graphics, Meta Software and Sun Microsystems. He has been contributing to EDACafe since 1999.

Will Verification Allow Time for Design?

December 2nd, 2013 by Sanjay Gangal

Article source: TVS

Design engineers are increasingly spending their time on verification. Research suggests that it now consumes more than 50% of their time and, according to Harry Foster of Mentor Graphics in his lighter moments, if the current linear trend continues it will reach 100% by 2030! So why is verification so demanding? IP reuse has enabled designers to create larger, more complex designs that keep pace with our manufacturing capability, but verification productivity has not kept up.

Looking to tools for productivity gains, EDAC (the EDA Consortium) reported that the overall EDA verification market grew by 38% from 2010 to 2012, with emulation up by 94%. But, as Mark Olen of Mentor pointed out, “if Henry Ford had asked people what they wanted, they would have said faster horses”. So innovation is also required, and Chris Brown of Broadcom set EDA companies the challenge of “collaborative competition” through standards. For example, UCIS has enabled TVS to build an innovative requirements sign-off tool (asureSign) by reading verification data from multiple tools.
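To illustrate the kind of cross-tool aggregation a standard such as UCIS makes possible, here is a minimal sketch in Python. The data layout, requirement IDs and sign-off rule are hypothetical illustrations only, not the actual UCIS database format or the asureSign API.

```python
# Hypothetical sketch: merge per-tool coverage exports into a single
# requirements sign-off view. Field names and IDs are illustrative.

def merge_coverage(reports):
    """Each report maps a requirement ID to (hits, total) coverage counts."""
    merged = {}
    for report in reports:
        for req_id, (hits, total) in report.items():
            prev_hits, prev_total = merged.get(req_id, (0, 0))
            # Keep the best result seen for a requirement across tools.
            merged[req_id] = (max(prev_hits, hits), max(prev_total, total))
    return merged

def sign_off_status(merged):
    """A requirement signs off only when every coverage point is hit."""
    return {req: hits == total and total > 0
            for req, (hits, total) in merged.items()}

# Example: coverage data exported from two different tools.
simulation = {"REQ-001": (8, 10), "REQ-002": (4, 4)}
formal     = {"REQ-001": (10, 10), "REQ-003": (0, 2)}

status = sign_off_status(merge_coverage([simulation, formal]))
```

Here REQ-001 signs off because formal closed the coverage that simulation left open, while REQ-003 remains outstanding — the essence of reading verification data from multiple tools into one sign-off view.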

Such innovation is required to deal with the increasing design complexity that verification engineers say is their biggest challenge, from IP, through SoCs, to systems. For example, Tim Blackmore of Infineon highlighted the growing amount of configurability in the TriCore family of CPUs, with over 200 parameters needed in the verification environment to track things such as memory configurations and the safety hardware used. Mark Olen of Mentor highlighted the complexity of SoC interconnects converting transactions across multiple protocols, rich with features to manage power and security, and with cache coherency support. Francois Cerisier of TVS presented an interconnect VIP that can cope with verifying such complexities.

JL Gray of Cadence suggested that improved productivity might come from challenging our assumed “rules and behaviours” to see if they help or hinder our verification. For example, is separating design and verification always best? Is adoption of UVM a given? According to Mentor, over 40% of verification teams have adopted UVM already, but an audience member asked why they would move to UVM from a successful internal SystemVerilog testbench solution. John Aynsley of Doulos suggested that strategic issues around the availability of VIP and engineers might be deciding factors. However, many warn that adoption is difficult due to a lack of methodology and will not necessarily lead to high testbench reuse, better staff portability or improved verification. Nor does UVM solve all issues: the audience confirmed they all use directed testing at SoC level and are mainly looking to control external transaction and event generation. Thus we began to question both the U and the M in UVM! Doulos' Easier UVM might point the way forward. Given that “Integrating Languages, Views and Techniques” is also quoted as an equally high challenge, we need to ensure UVM fits with legacy environments.

One area where we do see a lot of innovation is formal verification, where a number of “apps” are now available to solve specific verification challenges, and EDAC shows a 31% growth in adoption of formal between 2010 and 2012. Oz Levia of Jasper presented a solution for security, with automated proofs of data leakage and absolute data sanctity, for example. Sven Beyer of OneSpin highlighted its use by Renesas for automated SoC integration verification. Robert Eichner of Real Intent focused on their automated solutions for clock domain crossing. However, the main prize for formal is in the general-purpose use of the technology in bug hunting and demonstrating bug absence (using the “AHAA” model outlined by Laurent Arditi of ARM, which covers bug Avoidance, Hunting, Absence and Analysis). Such use will require closer integration of dynamic and static coverage, a topic covered by Oz Levia. Mark Olen of Mentor highlighted their “Intelligent Testbench Automation” as a major innovation, and sponsor Breker also has a solution in the same space.

Research from Mentor's studies continues to show that most of our verification time is spent in debug, and Bindesh Patel of Synopsys showed recent debug innovations built around the Verdi platform. The new UVM-aware debug capability includes SystemVerilog code breakpoints and stepping, access to dynamic objects and stacks, interactive rewind, and macro and constraint debug. In addition, Bindesh showed how Verdi's HW-SW Debug module leverages the Eclipse CDT to give a full programmer's view of C/C++ code running on the processor core during RTL simulation and debug.

A number of user papers promoted home-grown solutions. Jerome Bombal of Samsung focused on going from IP to SoC through subsystems quickly, safely and predictably, via verification-ready IP bundles. Simon Bewick of Ericsson and Yassine Elkhourassani of ST articulated the thoughts of many by asking for “fewer bugs, earlier bugs”. Fewer bugs fits well with the “Avoidance” from the ARM “AHAA” model, but how? Maybe through the use of formal tools to generate waveforms that check the design from a few constraints. Mark Daniel of Infineon reported how he achieved a 10x speed-up in regressions. This has enabled much more comprehensive check-in verification, and early indications are that it has reduced the number of bugs checked in. This prompted discussion of a more “agile” approach to design and verification. Andy Walton of Altera agreed that the “waterfall” approach to verification doesn't work.
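The check-in verification idea described above can be sketched as a simple commit gate: run a fast smoke subset on every check-in and accept the commit only if everything passes. The following Python sketch is illustrative only — the test names and stand-in runner are invented, not Infineon's actual flow.

```python
# Illustrative check-in gate: a fast smoke regression must pass before
# a commit is accepted. Test names and the runner are hypothetical.

SMOKE_TESTS = ["reset_seq", "basic_read_write", "irq_smoke"]

def run_test(name, failing=()):
    """Stand-in for launching a real simulation; returns True on pass."""
    return name not in failing

def checkin_allowed(tests, failing=()):
    """Gate: every smoke test must pass for the commit to be accepted."""
    return all(run_test(t, failing) for t in tests)

ok = checkin_allowed(SMOKE_TESTS)                          # clean commit
blocked = checkin_allowed(SMOKE_TESTS, failing={"irq_smoke"})
```

The design choice is the one the speakers describe: a 10x faster regression makes it affordable to run this gate on every check-in, catching bugs before they enter the repository rather than in nightly runs.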

Simon Bewick of Ericsson also considered the challenge of testbench qualification, and discussions naturally turned to Certitude from Synopsys. Joachim Geishauser of Freescale reported on their positive experience of the tool, which helped them find test cases that had accidentally not been run or had part of the code commented out, problems in the verification infrastructure, and test cases that did not cover all promised faults. Holger Bosch of Infineon also highlighted such qualification being available using formal with OneSpin.

Tim Joyce of ST and Martin Ruhwandl of Infineon both highlighted the challenge of verification resourcing and the difficulty of knowledge management when using short-term resources. Tim Joyce reported positive results from taking on a team of 10 external verification engineers through a managed-service arrangement to help deliver three first-time-working chips.

Mike Bartley of TVS summarised the 74 verification challenges raised by 26 verification engineers over the past three years of Verification Futures in Europe and India. It is clear that verification will continue to absorb the largest part of our development effort, and the Verification Challenge is not going away any time soon!
