
When to Retool the Front-End Design Flow?
August 30th, 2012 by Graham Bell

The following blog entry was written by Rick Eram, Director of Field and Sales Operations at Real Intent.

I work with many design teams who are trying to find the optimal time to update and retool their front-end design flow.  The decision is not as easy as you might think.  The various managers I meet struggle with this question, since it requires careful analysis of the existing flow, identification of any bottlenecks, and a detailed understanding of the current engineering design cost compared to a replacement toolset.  Managers also have to understand how their teams interact around the world, their deliverables and responsibilities, and how designers work within each functional group.  And the switching cost must be quantified in hard numbers.

In the back-end world of circuit netlists and layouts, the decision to retool is simpler, since the move to a new silicon technology node typically dictates when to change.  The benefits are obvious and much easier to quantify.  Metrics for run time, capacity, accuracy, and ease of achieving timing closure make the cost of current versus new tools much simpler to understand, quantify, and justify.  If these performance metrics in the current toolset are degrading significantly because of greater design complexity and the impact of multiple operating modes and statistical effects, the design team will not be successful.  A change is clearly needed.

So, how does a manager determine when to retool the front-end design flow and maximize efficiency?  Are current tools consuming far too much engineering time, and are they no longer as efficient as they once were?  What is the real switching cost?  And what about the impact on verification?  Since verification is more and more intertwined with actual RTL design, a decision about a tool change must take that into account.

A manager can take the easy way out and simply make a decision based on what has been traditionally accepted in the marketplace: “the standard” or “the market leader”.  This leads to renewal of older-generation technology year after year without considering the ROI when it is applied to new designs.  The reason for this hesitancy is fear of breaking the flow.  However, older-generation technology carries with it the burden of a legacy software architecture.  The danger is that design quality suffers when the effort to use the tool no longer produces the results and reward needed for larger, ever more complex designs.  Finally, this underperforming toolset compromises and delays the development schedule in the hand-off to the back-end flow.

A vigilant manager needs to separate fact from conventional wisdom and make a decision based on technology advantages.  At the micro level, metrics to consider include how much time is spent setting up a design for analysis (load time), tool run time, and the time it takes to debug a design.  For example, if a standard tool provides reasonable run time but correcting the design is time-consuming and engineering-intensive, you not only risk losing the engineer's attention in debugging the design, but also increase the risk of missing a critical bug.  The result is a tool which is used infrequently or ineffectively and does not deliver the necessary ROI.
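As an illustration of how these micro-level metrics can be rolled up, here is a minimal Python sketch.  The field names, run data, and hourly rate are hypothetical assumptions for the example, not output from any particular tool.

# Minimal sketch (hypothetical data): roll up per-run metrics for a front-end
# analysis tool to expose where engineering time actually goes.
from dataclasses import dataclass

@dataclass
class ToolRun:
    load_hours: float     # time to set up and load the design
    run_hours: float      # tool execution time
    debug_hours: float    # engineer time spent triaging the results
    real_bugs_found: int  # violations that actually led to an RTL fix

def cost_per_real_bug(runs, engineer_rate_per_hour=150.0):
    """Engineer-time cost per design change the tool actually drove."""
    hours = sum(r.load_hours + r.run_hours + r.debug_hours for r in runs)
    bugs = sum(r.real_bugs_found for r in runs)
    return (hours * engineer_rate_per_hour) / max(bugs, 1)

# Example: fast runs but expensive debug still yields a poor cost per bug.
runs = [ToolRun(1.0, 0.5, 14.0, 2), ToolRun(1.0, 0.5, 10.0, 1)]
print(f"${cost_per_real_bug(runs):,.0f} per real bug")

Tracked this way, a tool with fast runs but expensive debug sessions still shows a poor cost per real bug found, which is exactly the inefficiency described above.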

At the macro level, a design manager needs to consider the impact of the tool on each stage of the entire tool chain.  I have seen situations where a design team had a tool to analyze RTL early on, but due to the difficulty of using the tool, it was run much later in the development flow.  Another scenario is one where designers use the tool, but because of poor results at the design level, the verification team ends up catching problems that could have been caught much earlier.

To address these micro- and macro-level issues, a design manager must monitor the tool's impact at the point of use as well as the optimal place to position it in the flow.  Additionally, the impact on upstream and downstream components of the flow must be understood.  To complete such an analysis, the manager must also understand designers' behavior, deliverable responsibilities, and the functional teams and their locations.

So, back to the original question: when is it the right time to replace front-end tools?  The simple answer is when designers no longer deliver the quality level they used to.

One important design area where I have seen this is the verification of clock domain crossings (CDC).  Analyzing and correcting violations requires days of designer time, and the analysis results are not as effective as they used to be.  This is not because the designer is being lazy, but because the designs themselves have moved away from the “standard” tools' sweet spot in terms of size, number of clocks, and mode complexity.  But a manager may not notice this lack of efficiency until schedules get significantly delayed.  In addition, a manager may not be aware that the use of the tool is also being delayed.  A designer may run CDC analysis at any stage of the design, but if they only run it once at the end of the design cycle, many aspects of the CDC may be ignored or not fully analyzed, or the code changes required by the analysis may simply come too late.  So how do you avoid such issues?

1) Make sure the tool technology being used is the most effective solution to the problem and can handle your future designs,

2) Make sure the tool provides meaningful results for engineers to understand and debug,

3) Make sure the tool can be, and is being, used at the right place at the right time in the flow, and finally

4) Make sure designers can adopt the tool quickly.

To understand the ROI, you must account not only for the tool's direct cost, but also for its impact on the designers and the design flow.  After analyzing such metrics, the switching cost may be surprisingly small.
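To make that concrete, here is a minimal Python sketch of such a comparison.  Every number in it (license costs, engineer-hours, hourly rate, and the one-time switching cost) is an illustrative assumption, not data from any real evaluation.

# Minimal sketch (all numbers are illustrative): compare staying on the
# incumbent tool with switching, amortizing a one-time switching cost
# (training, scripts, flow rework) over the evaluation period.
def yearly_cost(license_cost, engineer_hours, rate_per_hour=150.0):
    return license_cost + engineer_hours * rate_per_hour

incumbent = yearly_cost(license_cost=80_000, engineer_hours=1_200)
candidate = yearly_cost(license_cost=100_000, engineer_hours=400)
switching = 60_000  # one-time: training, flow integration, script updates

years = 3
total_stay = incumbent * years
total_switch = candidate * years + switching

print(f"Stay:   ${total_stay:,.0f} over {years} years")
print(f"Switch: ${total_switch:,.0f} over {years} years")

In this made-up scenario, the heavier engineer-time burden of the incumbent tool quickly outweighs its lower license cost, which is why the switching cost can turn out to be surprisingly small.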
