September 04, 2006
Mentor Calibre nmDRC
with the ability to write that data out and we were done.
The nanometer platform that we have put in place facilitates this entire handoff and analysis and enhancement to manufacturability.
against multiple optimization possibilities. There is also an incremental analysis and enhancement capability that lets us look at where those different options will take us in terms of the overall yield space, as well as a report and visualization area where we can take that morass of error markers and turn it into real information that a designer can use to model yield effects. That whole thing put together we are calling the Calibre nm platform.
What types of outputs does it generate?
First you have error markers. As an example you might have a DFM rule that says your spacing is less than 200 nm where your DRC minimum is 130 nm. You get lots and lots of error markers which the designer has no idea what to do with. Then there is a Pareto analysis of all the violations of that particular rule identifying which are the worst violators and would be the best place to go off and spend your energy. You can also see data collected over the entire chip for these rules, which lets you build out an understanding of where your biggest yield effects are overall and how they will impact your chip yield.
There are a number of other reports, such as different Pareto tools and hot-spot mapping into the design database. There is a by-cell analysis which can make for interesting conversation with your IP provider as you talk about what their deliverables should be, not only in terms of functionality and performance but also in terms of yield detractors.
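To make the Pareto idea above concrete, here is a minimal sketch of ranking DFM-rule violation markers by severity and rolling them up per cell. The rule names, cell names, thresholds, and severity metric are all invented for illustration; they are not Calibre's actual data model.

```python
from collections import defaultdict

# Hypothetical DFM-rule violation markers: (rule, cell, actual_spacing_nm).
RECOMMENDED_NM = 200  # DFM-recommended spacing (example from the text)
MINIMUM_NM = 130      # hard DRC minimum (example from the text)

markers = [
    ("M2.SPACING", "cpu_core", 135),
    ("M2.SPACING", "cpu_core", 190),
    ("M2.SPACING", "sram_blk", 140),
    ("M3.SPACING", "io_ring", 170),
]

def severity(actual_nm):
    """How far below the DFM recommendation a marker falls, scaled 0..1."""
    return (RECOMMENDED_NM - actual_nm) / (RECOMMENDED_NM - MINIMUM_NM)

# Pareto view: rank individual markers, then aggregate per cell so the
# worst offenders (and the IP blocks contributing them) surface first.
ranked = sorted(markers, key=lambda m: severity(m[2]), reverse=True)

by_cell = defaultdict(float)
for rule, cell, actual in markers:
    by_cell[cell] += severity(actual)

for rule, cell, actual in ranked:
    print(f"{rule:12s} {cell:10s} {actual} nm  severity={severity(actual):.2f}")
print("per-cell totals:", dict(by_cell))
```

The per-cell totals are the kind of summary that would drive the "conversation with your IP provider" mentioned above: they show which block contributes the most yield-detracting markers, not just which single marker is worst.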
For model-based verification streams we are doing several things, ranging from the critical area analysis tool, which includes a defect density model, to systematic edge effects around the litho space. Probably the most interesting is tying that variability into our silicon modeling, which determines not only the variability of transistors under the lithography but also their detailed context within active regions. Essentially this tells us the deterministic variability of the design and enables people to use smaller guard bands around what they are putting in place for parametric performance.
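Critical area analysis of the kind mentioned above is commonly built on a textbook yield model: integrate the chip's critical area against a defect size distribution to get expected faults per chip, then apply a Poisson yield estimate. The sketch below uses made-up functions and constants (the quadratic critical-area curve, the 1/s³ defect density, the chip area) purely to show the shape of the calculation; it is not Calibre's model.

```python
import math

# Illustrative critical-area yield estimate (assumed, simplified model).
# critical_area_cm2(s): chip area where a defect of size s (um) causes a
# fault -- here a made-up quadratic that saturates at a 1 cm^2 chip.
def critical_area_cm2(size_um):
    return min(1.0, 0.02 * size_um ** 2)

# Defect size distribution: density per cm^2 per um, falling off as 1/s^3
# above a minimum resolvable size S0 (a classic textbook form).
S0_UM, K = 0.1, 0.5
def defect_density(size_um):
    return K / size_um ** 3

# Numerically integrate faults-per-chip over defect sizes, then apply a
# Poisson yield model: Y = exp(-lambda).
STEP = 0.01
sizes = [S0_UM + i * STEP for i in range(500)]
faults = sum(critical_area_cm2(s) * defect_density(s) * STEP for s in sizes)
yield_estimate = math.exp(-faults)
print(f"expected faults/chip = {faults:.4f}, predicted yield = {yield_estimate:.1%}")
```

The systematic litho effects described in the interview would show up as layout-dependent adjustments to the critical-area term, rather than the purely random defect model sketched here.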
We have described fairly radical changes in terms of what customers need to ensure manufacturability, especially as we look forward to 45 nm. We are not only running more and more DRC checks but also adding all those other manufacturability analyses. In a classical environment that quite simply could run for weeks or months. With all the new capabilities we have built into the nanometer DRC tool, the performance characteristics it brings, and all of the analysis, incremental and statistical gathering we are able to build into the tools, we actually enable people to deal with that new environment within their existing design cycle.
When is Calibre nmDRC available?
Customers can gain access to this today. First customer ship was in the middle of Q3, in August.
What is the pricing of the offering?
It is highly variable depending on the number of CPUs and which of the analysis packages are included, but it is at the same price as the old Calibre DRC.
Are current users of the older Calibre DRC entitled to a free or reduced price upgrade?
If they are on maintenance, then they get this as an upgrade.
What is the number of beta sites?
Twenty-four as of yesterday.
How would you characterize their feedback?
It has been great because what we've done is gone in, loaded the software, and within about an hour they have operating configurations. They run it. It's done in 2 or 3 hours. The reaction in general has been "Wow." It's been really successful.
Whom do you see as competitors to the older and now the new Calibre?
In terms of the existing tool, we have market penetration above 60%. There are tools in place out there from Cadence, Synopsys and Magma. I expect all three to continue to compete. From the results we are seeing, we expect to be significantly faster than all of them, as well as having, with the nanometer platform, a far more complete roadmap for manufacturability.
If someone makes a design change, there may be unintended ramifications and therefore a need to re-simulate the entire design. I understood you to say that if you make a change on the manufacturing side (line width or spacing), you can determine the maximum area of influence.
Yes. That's part of the incremental capability where if someone goes in and edits a particular region, our tool will find where the change was and verify only in the necessary region around that.
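The region-scoping idea in this answer can be sketched simply: expand the edited area by the largest distance any rule can "see," and re-verify only the shapes that touch that halo. The rectangle representation, the 200 nm reach constant, and the layout data below are assumptions for illustration, not Calibre's actual algorithm.

```python
# Sketch of incremental re-verification region selection.
from typing import NamedTuple, List

class Rect(NamedTuple):
    x0: float
    y0: float
    x1: float
    y1: float
    def expand(self, d):
        return Rect(self.x0 - d, self.y0 - d, self.x1 + d, self.y1 + d)
    def intersects(self, o):
        return self.x0 < o.x1 and o.x0 < self.x1 and self.y0 < o.y1 and o.y0 < self.y1

MAX_RULE_REACH_NM = 200  # assumed: largest distance any rule checks across

def shapes_to_recheck(edited: Rect, shapes: List[Rect]) -> List[Rect]:
    """Only shapes within the rule-interaction halo of an edit need re-verification."""
    halo = edited.expand(MAX_RULE_REACH_NM)
    return [s for s in shapes if s.intersects(halo)]

# Two shapes near the edit get re-verified; the distant one is skipped.
layout = [Rect(0, 0, 100, 100), Rect(350, 0, 450, 100), Rect(5000, 5000, 5100, 5100)]
touched = shapes_to_recheck(Rect(120, 20, 180, 80), layout)
```

The key design point is that the halo distance must be at least the longest-range rule in the deck; anything farther away provably cannot produce a new violation from the edit, which is what makes the incremental run sound.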
So changing the shape of geometry but not the function?
It depends on what someone is doing. If someone is doing something as extensive as a timing check in the Place and Route tool and having to do a re-place and rip-up-and-reroute, they will tend to do a full verification run rather than an incremental one. Incremental tends to apply when what you are doing is essentially custom layout editing of a particular region.
The target for Calibre nm is the more complex designs at smaller process nodes.
That's definitely the target. We expect that our customers will upgrade as a default. So even for the older 180 nm and 130 nm nodes, they will be taking advantage of the same technology because we can still run on those backwards compatible rule files. So they will essentially just get faster run times and be happy about it.
What are the ramifications of going forward to 45 nm and below? Will litho-friendly changes break down at some point? Do we need a new breakthrough to go to smaller process nodes?
If you turn around and grab another standard cell library for 65 nm and do the same analysis, because that library was not designed to be invariant, there is a huge amount of variability. With the second cell library you end up with difficulties doing timing closure. With the first cell library there were far fewer problems. Doing these different analyses really can make an impact on the design, where rather than things getting harder, they are getting easier.
-- Jack Horgan, EDACafe.com Contributing Editor.