March 10, 2008
Closing the Verification Gap.


by Jack Horgan - Contributing Editor
Posted anew every four weeks or so, the EDA WEEKLY delivers to its readers information concerning the latest happenings in the EDA industry, covering vendors, products, finances and new developments. Frequently, feature articles on selected public or private EDA companies are presented. Brought to you by EDACafe.com.


If you are building systems that do not have a lot of reuse, these tools give you tremendous productivity because otherwise you would have to write all these models from scratch. If you have a system with a lot of reuse and you already have these models, then your productivity improvement is going to be much smaller because you have already made the investment; the tools just make life smoother. You have to look at productivity improvement in light of where people are in terms of overall maturity and experience.


In the case of Multi-view Components (MVC), the model itself separates the notion of functionality from timing, where timing is a transaction, a set of clock edges, or a set of waveforms down at the gate level. We can use synthesis technology to make sure that the ports on that model adapt themselves to the rest of the environment that you are verifying. It gives you reusable RTL and TLM components, reduces your testbench development and refinement time, and lets you adapt abstraction between levels of the modeling hierarchy.
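To make the idea concrete, here is a minimal sketch in Python of what that kind of abstraction adaptation does: a timeless, functional transaction is expanded by an adapter into clock-by-clock pin activity. The signal names and the three-cycle protocol are invented for illustration; this is not Mentor's MVC implementation.

```python
# Hypothetical sketch: adapting a transaction-level (TLM) write into
# pin-level bus cycles. Signal names and protocol are invented; this
# illustrates the concept, not Mentor's MVC implementation.

from dataclasses import dataclass

@dataclass
class WriteTxn:          # the functional content: what to do, with no timing
    addr: int
    data: int

class PinLevelBus:
    """Stand-in for an RTL bus: records one set of signal values per clock."""
    def __init__(self):
        self.cycles = []

    def drive(self, **signals):
        self.cycles.append(signals)

def tlm_to_pins(txn: WriteTxn, bus: PinLevelBus):
    """The adapter: adds timing (a clock-by-clock protocol) to a timeless txn."""
    bus.drive(sel=1, enable=0, addr=txn.addr, wdata=txn.data, write=1)  # setup
    bus.drive(sel=1, enable=1, addr=txn.addr, wdata=txn.data, write=1)  # access
    bus.drive(sel=0, enable=0, addr=0, wdata=0, write=0)                # idle

bus = PinLevelBus()
tlm_to_pins(WriteTxn(addr=0x40, data=0xBEEF), bus)
print(len(bus.cycles), "clock cycles for one transaction")
```

The same transaction stream can then drive an architectural model directly or, through such an adapter, an RTL implementation.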


The InFact technology is quite interesting. It takes a very different approach to writing testbenches. The way InFact thinks about it is as follows. Imagine that you have a block with ports on it. Typically some of those carry control signals and some carry data signals. Take a memory, which is fairly simple. The ports on a memory speak a language that can read, write and modify a memory location. To read, you provide an address; the memory looks up the data at that address and gives it back to you. You can have a write cycle: you give the memory an address and some data, and it puts that data into the memory. You can have a read-modify-write cycle: you give the address once, the memory reads the location, you change the data, and it rewrites the data into the same location. You might also have an idle cycle that lets the memory recharge. The memory might also test itself; it might have certain built-in self-test commands that go in and reconfigure the memory. You might have ECC capability, with some commands and controls to do error correction.

If you think about it this way, this memory block has a language which it speaks. Its vocabulary is things like read, write, read-modify-write, start BIST, stop BIST and so on. These are the words it speaks: send it a command to read and it will read; send it a command to write and it will write. Then there are sentences you can construct, where a sentence is a string of words that follows a legal grammar. Our particular memory may have a quirk: when you do a read-modify-write cycle, you have to follow it with an idle cycle because the memory has to recharge. That is a grammatical rule. If you do not obey it, you have violated the grammar and bad things are going to happen.

With InFact you have a declarative way of capturing the vocabulary and the grammar rules by which you can form sentences. This comes out of compiler theory, where you capture the syntax of a language in BNF, or Backus-Naur Form. InFact lets you capture the language of your blocks in a kind of modified BNF. It is very compact and very easy to do; even languages like C have quite a compact BNF. XML is another way to capture the essence of syntactical rules. Once you have captured the grammar and vocabulary of your blocks, you come to the problem of creating sentences that verify certain properties of the blocks. In InFact we have a way of generating sentences based upon state transition graphs.
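As a rough illustration of that approach (not InFact's actual input format), the memory's vocabulary and grammar can be captured declaratively and legal sentences generated from it. This Python sketch encodes the grammatical rule from the example, that a read-modify-write must be followed by an idle cycle:

```python
# Hypothetical sketch: capture a block's "vocabulary" (the words it speaks)
# and "grammar" (which word may legally follow which), then generate only
# legal sentences. Invented for illustration; not InFact's input format.

import random

# grammar: the successors that may legally follow each word
FOLLOWS = {
    "read":       ["read", "write", "rmw", "idle", "start_bist"],
    "write":      ["read", "write", "rmw", "idle", "start_bist"],
    "rmw":        ["idle"],            # the rule from the text: rmw -> idle
    "idle":       ["read", "write", "rmw", "idle", "start_bist"],
    "start_bist": ["stop_bist"],       # assumed: BIST must stop before other ops
    "stop_bist":  ["read", "write", "rmw", "idle"],
}

def legal_sentence(length, start="idle"):
    """Random walk over the grammar: every sentence produced is legal."""
    word, sentence = start, [start]
    for _ in range(length - 1):
        word = random.choice(FOLLOWS[word])
        sentence.append(word)
    return sentence

print(legal_sentence(8))  # e.g. ['idle', 'rmw', 'idle', 'write', ...]
```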


For your blocks you can specify certain state nodes you want to visit and certain arcs, which are the transition paths you are going to take through your design in order to verify certain properties. This is more powerful than constrained random, which tends to be rather verbose when it comes to generating vectors, because its constraints are not about the design; they are about the vector space. In InFact the constraints are derived from your design, not just from the vector space, so you get a set of vectors that is much more highly tuned to what your circuit does and how it does it. If you were to plot a graph with the number of simulation cycles on the horizontal axis and functional coverage on the vertical axis (of course we would have to argue about what functional coverage means, but assume we had agreement on that point), you would find that the constrained random curve is asymptotic: it eventually gets you a lot of functional coverage, but it requires an exponentially large number of test vectors to do so. The InFact technology, because it understands more of your design, can generate a much more focused vector set, and in many cases can reach the same coverage with 20x to 50x fewer simulation cycles. It is able to traverse your coverage space, the things you want to know about your circuit, much more effectively. If you have fewer simulation cycles, you have fewer simulation output files to look at, and you get your results much more quickly. You can use this technology at the architectural level to look at the things that are interesting there, and then reuse the same thing down at RTL at the implementation stage, so that you can in fact show that your RTL correctly implements what you intended at the architectural level. That is what InFact does.
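The coverage argument can be illustrated with a toy experiment. This Python sketch, using an invented four-state graph, compares how many stimulus cycles unguided random selection needs to exercise every arc against a walk that knows the graph and prefers arcs it has not yet taken. It illustrates only the shape of the claim, not InFact's algorithm:

```python
# Hypothetical toy experiment: cycles to cover every arc of a small state
# graph with (a) unguided random stimulus vs (b) a graph-aware walk.
# The graph and the numbers are invented; only the trend matters.

import random

ARCS = {  # state -> legal successor states; each (state, succ) pair is one arc
    "IDLE":  ["READ", "WRITE"],
    "READ":  ["IDLE", "RMW"],
    "WRITE": ["IDLE", "RMW"],
    "RMW":   ["IDLE"],
}
ALL_ARCS = {(s, t) for s, succs in ARCS.items() for t in succs}

def cycles_to_full_coverage(guided, seed):
    rng, state, seen, cycles = random.Random(seed), "IDLE", set(), 0
    while seen != ALL_ARCS:
        succs = ARCS[state]
        if guided:   # prefer an arc not yet exercised, if one leaves this state
            fresh = [t for t in succs if (state, t) not in seen]
            nxt = rng.choice(fresh or succs)
        else:        # constrained random: any legal successor, uniformly
            nxt = rng.choice(succs)
        seen.add((state, nxt))
        state, cycles = nxt, cycles + 1
    return cycles

for guided, label in [(False, "random"), (True, "graph-guided")]:
    avg = sum(cycles_to_full_coverage(guided, s) for s in range(200)) / 200
    print(f"{label:12s} avg cycles to cover all arcs: {avg:.1f}")
```

Even on this tiny graph the guided walk reaches full arc coverage in noticeably fewer cycles; on a real design with thousands of arcs the gap widens, which is the intuition behind the 20x to 50x figure.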


What is the availability of these two products?

InFact is available immediately. Questa Multi-view Components will be available in 2Q 2008.


What is the pricing?

Pricing starts at $25K.


For each or for both?

For each.


One year TBL or perpetual?

One year TBL.


Who are the major competitors in this verification area and do they offer anything like what you have been describing?

There are three companies that compete in the verification area: Mentor Graphics, Cadence and Synopsys. Gary Smith issued his market-share numbers after he spun out of the Gartner Group. Those numbers show that Mentor and Synopsys are neck and neck, with Cadence two or three points behind: about 34% or 35% each for Synopsys and Mentor, while Cadence is at about 32%.


Synopsys has a set of verification IP out of their DesignWare group. That IP is not retargetable across abstraction levels; they target their VIP at the implementation level. Synopsys is not known for its architectural-level tools; they have very little there. They had some tools a few years ago that were not much of a market success.


On the Cadence side of the world, they typically do much better at the architectural level, but most of their VIP came from the acquisition of Verisity, and that VIP is proprietary. It is well respected in the industry, but it is not portable; it runs only with the Verisity tools. People are shying away from those things because they tend to lock you in. Synopsys has a similar problem in that their IP components are proprietary, and when you use them you are pretty much stuck with their tool set. Neither company has the ability to do this automatic abstraction adaptation. That is unique to Mentor Graphics.


We have this view of creating an open and portable verification methodology. We announced AVM, the Advanced Verification Methodology, last year. This year Cadence and Mentor have gotten together on OVM, the Open Verification Methodology; there was a press release a few weeks ago. OVM has been very well received in the industry.


Multi-view Components and InFact are compatible with that technology. The point of it is to encourage the industry to create open standards. The testbench for a large design has hundreds of thousands of lines of code. Users want that code to be portable across tool sets; they do not want to be tied to one vendor. What we are encouraging is the creation of open standards so that user-created artifacts are protected from obsolescence. Anybody who has a testbench today is worried because there is only one vendor in the world that supports it. What if there is a falling out between them and that vendor? For example, on the Cadence side they were quite late in adopting SystemVerilog. Users were sitting out there thinking, SystemVerilog is good and I want to go there, but I have all these models and I am stuck. OVM is trying to solve that problem. Cadence is working with us to get model interoperability working. The emphasis is to make sure that customers are not disadvantaged by all these EDA squabbles.





-- Jack Horgan, EDACafe.com Contributing Editor.

