
Fundamentals of Clock Domain Crossing Verification: Part Four

July 31st, 2014 by Graham Bell

Last time we discussed practical considerations for designing CDC interfaces.  In this posting, we look at the costs associated with debugging and sign-off verification.

Design setup cost

Design setup starts with importing the design. With the increasing complexity of SOCs, designs include RTL and netlist blocks in a Verilog and VHDL mixed-language environment. In addition, functional setup is required for good quality of verification. A typical SOC has multiple modes of operation characterized by clocking schemes, reset sequences and mode controls. Functional setup requires the design to be set up in functionally valid modes for verification, by proper identification of clocks, resets and mode select pins. Bad setup can lead to poor quality of verification results.

Given the management complexity of the multitude of design tasks, it is highly desirable that there be a large overlap between the setup requirements for different flows. For example, design compilation can be accomplished by processing the existing simulation scripts. Also, there is a large overlap between the functional setup requirements for CDC and those for static timing analysis. Hence, STA setup, based upon Synopsys Design Constraints (SDCs), can be leveraged for cost-effective functional setup.

Design constraints are requirements on, or properties of, your design. You use constraints to ensure that your design meets its performance goals and pin-assignment requirements. Traditionally these are timing constraints, but they can also include power, synthesis, and clocking constraints that guide optimization tools in meeting these goals. You can set timing constraints either globally or on a specific set of paths in your design. You can apply timing constraints to:

  • Specify the required minimum speed of a clock domain.
  • Set the input and output port timing information.
  • Define the maximum delay for a specific path.
  • Identify paths that are considered false and excluded from the analysis.
  • Identify paths that require more than one clock cycle to propagate the data.
  • Provide the external load at a specific port.
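As a rough illustration, the constraint categories above map onto familiar SDC commands. The port, pin, and clock names below are invented for the example; the numeric values are arbitrary.

```tcl
# Minimum speed of a clock domain: a 100 MHz clock on port clk_a
create_clock -name clk_a -period 10.0 [get_ports clk_a]

# Input and output port timing relative to clk_a
set_input_delay  2.0 -clock clk_a [get_ports data_in]
set_output_delay 3.0 -clock clk_a [get_ports data_out]

# Maximum delay for a specific path
set_max_delay 8.0 -from [get_pins u_tx/q] -to [get_pins u_rx/d]

# Paths considered false and excluded from analysis
set_false_path -from [get_clocks clk_a] -to [get_clocks clk_b]

# Paths that require more than one clock cycle to propagate data
set_multicycle_path 2 -setup -from [get_pins u_mcp/q]

# External load at a specific port (in library capacitance units)
set_load 0.5 [get_ports data_out]
```

In a CDC flow, the clock definitions and false-path exceptions in such a file are the parts most directly reusable for functional setup.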

Correct functional setup of large designs may require setup of a very large number of signals. This cumbersome, time-consuming work can be avoided with automatic setup generation. Also, setup has a first-order effect on the quality of verification. Hence, early feedback on setup quality can lead to easy and effective setup refinement for high-quality verification.

Figure 14. Design setup flow.

Debugging and sign-off cost

The debugging cost depends upon the number of errors flagged by the CDC tool. Assuming good setup, this in turn depends upon the size, CDC complexity, and maturity of the design. Typically, the debugging cost for top-level runs on immature designs will be high, because the design may contain a large number of immature CDC interfaces. These can generate a large number of failures requiring significant debugging effort. Also, the ownership of these CDC interfaces may be distributed between multiple designers.

Debugging cost is heavily dependent upon the reporting style of the tools. Source-code-oriented reporting relates the errors to the real source, i.e., HDL functionality, and produces much more compact reports. CDC verification employs multiple technologies of increasing sophistication, such as structural analysis and formal analysis. As a result, a composite report is essential to determine the overall quality of CDC verification. Most waveform viewers can read the industry-standard waveform database format known as Value Change Dump (VCD).
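To make the VCD mention concrete, here is a minimal sketch of how a debug utility might count value changes per signal in a VCD dump. It handles only single-bit scalar changes and uses only the Python standard library; the signal names and sample dump are invented for illustration.

```python
from collections import Counter

def count_toggles(vcd_text):
    """Count scalar value changes per VCD identifier code.

    Assumes single-bit scalar changes of the form '0!', '1!', 'x!';
    header sections and '#' timestamps are skipped.
    """
    counts = Counter()
    for line in vcd_text.splitlines():
        line = line.strip()
        if len(line) >= 2 and line[0] in "01xzXZ":
            counts[line[1:]] += 1  # identifier code follows the value
    return counts

sample = """$var wire 1 ! clk $end
$enddefinitions $end
#0
0!
#5
1!
#10
0!
"""
print(count_toggles(sample))  # Counter({'!': 3})
```

A real flow would of course rely on the waveform viewer itself; the point is only that VCD's simple text format makes such ad-hoc debug scripting cheap.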

Good clock-domain, functional, structural and VCD visualization is essential for effective debugging. Automated and advanced pre-processing of these views, to isolate the error context, further reduces the debugging cost. Finally, debugging support requires advanced sign-off capabilities so that the same issues are not analyzed multiple times in the iterative verification flow.

Verification run-time cost

CDC checking is based upon multiple technologies with varying degrees of precision. In the first step, structural techniques are used to identify clock-domain crossings and possible error sources in the design. Structural analysis tends to be relatively fast and is very useful at detecting gross errors in the design. To guarantee design correctness, however, structural analysis must flag every potential error in the design, and this set can be very large.
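The core of the structural step can be sketched very simply: given each flop's clock domain, any data-path edge between flops in different domains is a crossing. The netlist model and names below are hypothetical, purely for illustration; real tools work on elaborated netlists with clock-tree tracing.

```python
def find_crossings(flop_clock, edges):
    """Minimal structural CDC detection sketch (hypothetical netlist model).

    flop_clock: dict mapping flop name -> clock-domain name
    edges: iterable of (source_flop, dest_flop) data-path edges
    Returns the edges whose endpoints are clocked by different domains.
    """
    return [(src, dst) for src, dst in edges
            if flop_clock[src] != flop_clock[dst]]

# Toy example: tx_reg in clk_a feeds a two-flop synchronizer in clk_b.
flop_clock = {"tx_reg": "clk_a", "sync1": "clk_b", "sync2": "clk_b"}
edges = [("tx_reg", "sync1"), ("sync1", "sync2")]

print(find_crossings(flop_clock, edges))  # [('tx_reg', 'sync1')]
```

Note that this pass cannot tell a properly synchronized crossing from an unsafe one; as the text explains, that refinement is left to later structural pattern matching and formal analysis.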

As an example, consider the design in Figure 12. This reduced-latency design can operate correctly or erroneously depending upon the relative frequency of the clock domains. Also, this structure can be included in a more complex interface that handles stalls and other issues, making precise structural identification difficult. A structural technique that does not compromise the quality of checking therefore has to flag this interface for manual review and sign-off.

Formal analysis is an excellent technology for filtering out false failures from structural analysis and for precisely identifying failures in the design. As mentioned earlier, traditional formal analysis is built to analyze steady-state design behavior, and these formal techniques are incapable of formally analyzing uncertain behavior caused by metastability and glitches. As a result, special formal-analysis techniques that are capable of handling behavioral uncertainty are needed for CDC applications. For example, consider the failure shown in Figure 13. Here the multi-cycle path (MCP) on the data path is violated because of a hazard. Vanilla formal analysis will pass the data stability check (MCP) for this structure. Data stability for CDC interfaces can only be proven with glitch-sensitive formal-analysis techniques.

Formal analysis needs to be seamlessly integrated into the application all the way from invocation to reporting and debugging. This eliminates the huge overhead of integrating external formal-analysis tools into the flow and of correlating the results from these different tools into an integrated view of the verification status.

Because the computational complexity of formal analysis is very high, it can require a large amount of run time. This cost is well worth it, however, as it provides significant savings in debugging and sign-off cost.

Figure 15. Verification and debug flow.

Next time we will look at a practical and efficient CDC verification methodology.
