May 29, 2006
Verification Update


by Jack Horgan - Contributing Editor
Posted anew every four weeks or so, the EDA WEEKLY delivers to its readers information concerning the latest happenings in the EDA industry, covering vendors, products, finances and new developments. Frequently, feature articles on selected public or private EDA companies are presented. Brought to you by EDACafe.com. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us! Questions? Feedback? Click here. Thank you!

Introduction

Verification accounts for a significant portion of the budget and time for developing a chip. EDA vendors address the verification market in a variety of ways: improved simulation software, simulation farms, hardware acceleration and emulation and so forth. I had a chance to discuss two recent product introductions in this arena from Cadence and Mentor Graphics.


On April 24th Cadence announced its Incisive® Enterprise Scenario Builder. I had an opportunity to talk with Sylvia Hurat of Product Marketing and Steve Brown, author of Scenario Builder.


Everything about verification solutions is actually about the risk of shipping a product that has not been properly verified. Each time you want to introduce a new methodology or a new tool you are increasing your risk. The real balance is to help customers improve their verification process while reducing the overall risk of their design and verification. The Incisive verification solution from Cadence is all about reducing the risk of verification.


We have created different families of verification products within the Incisive platform that address the needs of different types of customers. These are the HDL family, the Design Team family and the Enterprise family. Very small customers doing small ASIC designs may just need to improve the performance of their simulation. That's why we have an HDL family, which is all about HDL creation and simulation. For larger teams doing larger chips we have the Design Team verification solution. Here the design team is in charge of doing the design as well as the verification. This solution includes the creation of HDL and simulation as well as formal analysis and assertion-based verification. We have created a plan-to-closure methodology for this particular type of customer. For those with a higher level of complexity, designing very large chips or system-level chips, we have the Enterprise level of verification solutions, which addresses the needs of complex product teams that include specialists in charge of different tasks during the project life. You have logic designers doing the design, verification engineers in charge of verifying the chip and the system, verification managers and so forth. All these multi-specialist teams have a high level of needs in terms of verification. They also need a solution that addresses all the different specialist skills. For that we also have a plan-to-closure methodology and expert support. For these users we are adding to the Design Team solution SystemC modeling, acceleration and emulation capabilities.


What is Scenario Builder?

Scenario Builder is a technology that we are adding to the Enterprise family solution. It is a technology that addresses the needs of multi-specialist teams. Scenario Builder addresses the needs of designers by enabling them to contribute to advanced verification without having to acquire verification expertise. This is a brand new technology developed by Cadence, a really unique technology which enables people to create very complex multi-channel scenarios without any knowledge of object-oriented programming, as was required in the past. It does not require any knowledge of hardware verification languages either. It is all about graphical creation and editing. That is what is truly unique. The tool has an abstract representation of very complex verification environments and verification IP so that customers can achieve maximum benefit from their reusability.


The Scenario Builder flow supports verification specialists who are in charge of planning, architecting and developing the verification environment. Those are very highly skilled tasks. When the verification environment is ready, they can deliver it to the designers or to the test writers, who in turn can use Scenario Builder to create reusable sequences, complex real-life scenarios that can be run in this verification environment. The goal is to have the verification specialists focus on high-value tasks, while the designers, once they know what has to be tested, focus on creating those tests.


Where did the idea for Scenario Builder come from?

When we started thinking about Scenario Builder, we asked what technology we could use to enable non-specialists to do a very skilled task like creating multi-channel scenarios. We thought about audio recording software that enables people like you and me to create multi-track music by combining existing loops and samples in a multi-track audio system and do pretty much professional recording at home on a PC. This was the idea we leveraged to create Scenario Builder.


Scenario Builder is a GUI-based tool which enables people who do not know anything about verification to create complex scenarios. The verification environment created by the specialist is abstracted to a level that is really easy for anyone to understand. You have multiple channels; each interface to the design is actually a different channel. You can create sequences, which are actual stimuli applied to a channel, or you can create multi-channel sequences, combining these sequences on each channel to form more complex real-life scenarios. The customer is presented with graphics. Once a channel is selected, he has access to all the building blocks that can be used to start building those scenarios. These blocks can be quite complex sequences that have been built in the past and stored in sequence libraries. These sequence libraries are created by the designer or by the verification environment builder who wants to provide people with complex building blocks.
[Figure: Scenario Builder graphical user interface]
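Editor's note: To make the multi-channel idea concrete, here is a minimal SystemVerilog sketch of a scenario composed from reusable per-channel sequences. It is purely illustrative and is not the code Scenario Builder generates (Cadence's environment is built on Specman/e); the class and channel names are invented.

// Hypothetical sketch: a "scenario" assembled from reusable per-channel
// sequences, analogous to what Scenario Builder composes graphically.
module scenario_sketch;

  class seq_item;
    rand bit [7:0] payload;
  endclass

  // A reusable building block for one channel: a named sequence of stimuli.
  class channel_sequence;
    string name;
    function new(string name); this.name = name; endfunction
    // In a real environment this would drive the channel's interface;
    // here it only reports what it would do.
    virtual task run();
      seq_item item = new();
      void'(item.randomize());
      $display("[%0t] %s: driving payload 0x%0h", $time, name, item.payload);
      #10; // stand-in for the time the transfer would take
    endtask
  endclass

  // A multi-channel scenario: sequences launched in parallel on two channels.
  task automatic run_scenario(channel_sequence eth, channel_sequence usb);
    fork
      repeat (3) eth.run();   // traffic on the first channel
      repeat (2) usb.run();   // concurrent traffic on the second channel
    join
  endtask

  initial begin
    channel_sequence eth = new("eth_channel");
    channel_sequence usb = new("usb_channel");
    run_scenario(eth, usb);
  end
endmodule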


At any time the user is presented with context-sensitive help. If the user doesn't know exactly what a sequence in the library is doing, then just by moving the cursor on top of it he will see a description that tells him what the sequence is doing. Users can very easily start combining sequences using drag-and-drop capabilities.


The designer can control the constraints he wants to apply to each element, so he can reach his corner cases in a more efficient manner. This tool is highly useful when incorporating commercial verification IP, because the VIP provider actually builds a set of predefined sequences. Test writing is greatly accelerated.
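Editor's note: The "constraints to reach corner cases" point maps onto SystemVerilog constrained-random stimulus. Below is a minimal sketch of the idea; the packet fields, limits and percentages are invented for illustration and are not from any particular VIP.

// A base item a VIP might supply, and a derived item that tightens the
// constraints to steer stimulus toward a corner case.
class vip_packet;
  rand bit [11:0] length;
  rand bit        crc_error;
  constraint c_legal { length inside {[1:1500]}; }
endclass

class corner_case_packet extends vip_packet;
  constraint c_corner {
    length inside {[1490:1500]};        // stress maximum-length handling
    crc_error dist { 1 := 1, 0 := 9 };  // roughly 10% corrupted packets
  }
endclass

module constraint_demo;
  initial begin
    corner_case_packet pkt = new();
    repeat (5) begin
      void'(pkt.randomize());
      $display("len=%0d crc_error=%0b", pkt.length, pkt.crc_error);
    end
  end
endmodule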


Scenario Builder is about the natural visualization of an advanced verification environment for people who have no knowledge or expertise in verification. It extracts only the information that is important for creating tests. It is about easily composing complex scenarios by dragging and dropping elements onto different interfaces and channels. You end up with real-life scenarios without knowledge of object-oriented programming. It simplifies the use of verification IP by filtering the information down to what is needed.


The benefit for the customer is a more flexible way of managing the verification team. You can use people with little or no experience in the verification process to help with test creation. You remove the need for training. You maximize the value of VIP reuse.


Pricing and packaging?

Scenario Builder is a standalone product. It lists for $20K per year for a shared license.


Can Scenario Builder be effectively used outside the Incisive environment?

It is meant to be used in that environment.


Is there any target opportunity in terms of the chips being designed?

Not really. It is targeted at all the customers using Incisive for verification.


Would you estimate the number of Incisive customers or market share?

No. Incisive is a broad set of verification capabilities. Analyst firms like Gartner provide estimates.
Editor's note: Functional Verification accounted for 21% of total revenue for Cadence in 2005.


How long has Scenario Builder been under development?

It was part of the Verisity solution set.


Was it a commercial product at Verisity or technology still under development?

It was under development there.


Any early release or beta sites?

Yes! It is actually in production. See the quotes from Ceva and Globaltech Solutions in the press announcement. There are about a dozen customers using the tool.


You make the comparison to the average person being able to compose multi-channel audio recording. In that case the average person would know whether the end result was satisfactory without understanding much about the tool or the underlying technology. In the case of Scenario Builder how does the end user know that they have created correct scenarios and that they have reasonable coverage?

The tool is actually creating the scenarios for them. The scenario is human readable. The scenarios are correct by construction. The coverage will be given by running the simulation in the verification system.


My point is that if you give an automated or semiautomated tool to a class of users that has no experience in an area, how do they or how do their managers know that they are using the tool correctly and efficiently?

In this case they are using the same tool which is a simulator and using the same debugging tools to observe the behavior of the simulation. The only thing that is different is the specification of the scenario. The same skills they used to identify a correct test today will be used to identify a correct test with Scenario Builder. They will just be able to capture a more sophisticated test with Scenario Builder. It is not a new requirement for them.


The value proposition is to use lower skilled people to create the same tests, to do it quicker, or to do a more complete verification?

The sophisticated challenge is the abstraction of using a tool to describe the scenario. Debugging and observing the behavior of the system is not a new thing at all. Even the less sophisticated people have been doing that for a while and will continue to do so. It is the concept of what a verification scenario is that is a new domain for them, and we are lowering the barrier on that. Once they have a scenario, debugging or observing the system is the same skill.


Are you aware of any competition?

No! This is a tremendous innovation built upon the innovations in Specman and the verification process automation environment. We are further extending that solution and making it more accessible.


What can be expected in the future for Scenario Builder?

We are trying to further improve the ease of use and the abstraction of the presentation of the environment. At a macro level there are things we are doing with the whole Incisive platform. For example, more multi-language support is a possibility. We are working with customers to prioritize all this. We are expanding the domain of application of Incisive to other specialists, so it is possible that other specialists will have their own versions or expansions of Scenario Builder. This version is explicitly for designers. We have a roadmap for Incisive that Scenario Builder would track. The kinds of things we are going towards are hardware/software, analog and mixed-signal, architectural validation and so on, and towards the different specialists that are involved in these systems from front to back. Those are the kinds of extensions you could reasonably expect in Scenario Builder, and we are discussing them with our customers.


In the case of verification IP the developer or vendor provides certain information. Is the information Scenario Builder needs typically provided, or is additional data required?

The tool works well with the information already available today from the VIP developer. It does not require extra work. Obviously, there is a capability to add extra information that makes usage by the test writer even easier. This might require a little work, but it is very minor, a couple of hours of work.


On May 8 Mentor Graphics announced its comprehensive next-generation Questa 6.2 verification solution. I discussed this announcement with Robert Hum, VP and GM of the Design Verification and Test Division.


What are you announcing?

There are three highlights. The first is the new Questa Verification platform, a single kernel SystemVerilog based simulation environment. The second thing is open source standard based Advanced Verification Methodology called AVM. The third thing is the Questa Vanguard Program which is an organization of companies, currently 25 firms enrolled, who are contributing to SystemVerilog in some way. The list includes people who are doing training, people who are doing consulting, people who are doing conversions from e or Vera to SystemVerilog and folks who have tools available like SpiraTech. It is kind of an ecosystem of companies working together to help the electronics industry to make
a transition to SystemVerilog.


Questa is delivering on three fronts. It is necessary but insufficient to deliver only tools. If all you have is a simulator, you are going to get yourself in trouble, because SystemVerilog itself has quite a bit of capability in it. The question is “How do you most effectively use that capability?” How do you become not only efficient but effective in verification? You really have to deliver three things to the marketplace to make this work: tools, a methodology to use those tools effectively, and infrastructure. You have to have models, methodologies and an ecosystem of companies that provide services and things that simulators consume. Those are the three things being announced.


What is the motivation behind this?

The impact of complexity on the number of verification cycles can be seen in the Collett study. Intel has slides showing that the number of code vectors increases with design size. Bigger designs simply give you more bugs. More bugs mean more tests to find them. More tests mean more people to write them. With more people writing tests, you need more simulators to run them. Also, some of the new tests will have bugs themselves, so you will need more people to debug them. Complexity has some interesting effects. You need to do a much more complete job of testing, and that increases the load on an organization, which drives you to add more people; a very expensive spiral to get into. One of the goals SystemVerilog has had in the marketplace is to create a verification environment that lets you do things more efficiently, the implication being that you will need fewer resources, both people and machines. Another goal is to do things more effectively, the implication being that you will find more bugs. Therefore the design entering the marketplace will more often than not be correct, compared to what you get with today's verification.


The EDA industry had been responding to the growth and the needs of verification by providing tools and methods. There are lots of tools and methods out there: assertion based verification, functional coverage, constrained-random testing, etc. There are lots of these things available that can be applied to the verification problem. The question of course ends up being “How do you know which one of these things you should use in your situation?”


In the old days people used to think that all they needed for better verification was a faster simulator. In fact simulator speed has always been one of the things that everybody benchmarks. These days if you are only benchmarking simulator speed, you are doing yourself a disservice. Simulator speed is all about finding the same bugs you have been finding, but finding them in less time. The industry realized that this wasn't enough. Some of the technologies and techniques we mentioned earlier came into play. The trick there was to find more bugs as quickly as you can. Then we discovered there was a huge learning curve. There were so many different methods and tools and ways of doing things that it took anybody a long time to become productive. This was an era when people were experimenting with different testbench technologies; e and Vera grew up, people were trying different approaches to assertions, PSL grew up. Eventually the industry settled on good ways of doing these things. SystemVerilog was born. With SystemVerilog came a realization that there is a methodology, or a set of methodologies, that can be employed to make the use of these tools more effective and more efficient at the same time and thus get the learning curve under control. This then spawned the thought that something like AVM would be a good thing.


Let's talk about methodology.

In many methodologies that are in use today by customers we find that engineers in the companies have specific tasks. There are engineers that are architectural level folks, engineers that are hardware implementation level folks, verification engineers and software engineers. These folks tend to stratify in terms of what abstraction level they work at. They talk to each other at these abstraction levels.


Software engineers talk to SoC architects but very rarely talk to verification engineers. They really have nothing to talk about. Verification engineers talk to hardware engineers but very rarely to SoC architects. This leads to misinterpretation of specifications, misunderstandings, incomplete things and so on. There is a range of bugs that creep through the verification process today, but the really painful ones are those that come from missed communication between the higher levels of abstraction and the lower levels of abstraction, especially today with the amount of reuse going on.


One of the goals we had in designing AVM is to enable folks in these various jobs - higher level of abstraction people and lower level of abstraction people - to communicate with each other about what the specifications are and what a particular design is supposed to do in terms of meeting those specs.


What is AVM?

One of the essential features of AVM is the way of handling abstractions, the way of allowing people to move between levels of abstraction in their designs seamlessly. We believe that AVM will lead to the industry's first system level to gate level verification methodology.


AVM has been designed not only with SystemVerilog in mind but also SystemC. SystemC is a language that is heavily used in the industry for testbenches. There is no sense in going to the designers in the industry and saying you have to rewrite all of your SystemC in SystemVerilog. One thing we have learned about moving things forward in the industry is that discontinuity is difficult for people to accept. You need to build bridges to get them from where they are to where they want to go. You have to keep their business running. So AVM embraces both SystemC and SystemVerilog.


The point about abstraction layers is that AVM is built upon object-oriented technology plus transaction-level modeling (TLM). TLM is the key to the abstraction adaptation layer. In AVM we cover stimulus generation, assertions and coverage. Coverage is another key point. We have a how-to guide, examples, base class libraries and utilities. We make all that available in source code under open-source licensing. This means that you do not have to strictly be a Mentor customer to use AVM. If you want to use a simulator from a competitor or use your own proprietary simulator, you can pick up AVM and use it in your environment. We use the Apache 2.0 license, which is a very gentle open-source licensing approach. The reason we want to make it available in source code and with an open-source license is that we would like to create a community in the industry that pushes this methodology forward. One of the things I believe is that EDA companies do not invent methodologies. I believe that the only people who really invent methodologies are end users. Therefore we are trying to create an infrastructure to kick-start a user community to develop methodologies that will then move the entire industry forward.
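Editor's note: Since the AVM base-class library itself is not reproduced here, the following minimal SystemVerilog sketch merely illustrates the transaction-level style it is built on: components exchange whole transactions rather than pin activity. The class and signal names are invented and this is not AVM code.

// Transaction-level connection sketch: a generator and a consumer exchange
// complete transactions through a mailbox, the essence of TLM communication.
class bus_txn;
  rand bit [31:0] addr;
  rand bit [31:0] data;
endclass

module tlm_sketch;
  mailbox #(bus_txn) chan = new();

  // Producer: a stimulus generator working purely at transaction level.
  initial begin
    repeat (4) begin
      bus_txn t = new();
      void'(t.randomize());
      chan.put(t);
    end
  end

  // Consumer: could be a scoreboard, a reference model, or a driver that
  // later turns each transaction into pin activity.
  initial begin
    bus_txn t;
    repeat (4) begin
      chan.get(t);
      $display("[%0t] txn addr=0x%08h data=0x%08h", $time, t.addr, t.data);
    end
  end
endmodule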


Proprietary methodologies generally fall flat on their faces sooner or later. The real methodological changes in the industry have always been motivated and promoted by user companies. Synchronous design is the design style that enabled the move from gate-level to RTL simulation and also enabled the advent of RTL synthesis. Synchronous design was motivated by the end-user community. That's the theory behind AVM.


What are the properties of AVM?

It makes a verification team more efficient and more effective. When you write less code, you find more bugs. Reusability is one of the best ways to get efficiency into your operation. Because AVM relies on TLM, you can see why you can write reasonable code. We also reuse SystemC so that you do not have to rewrite your testbenches from scratch. It has in it all the classic stuff like constrained-random testing and assertions to broaden the bug search. There are AVM libraries for common tasks; you don't have to write serializers and deserializers. It is quite a rich library of system services. We deliver examples of testbenches that have certain properties. You might have a testbench aimed at a device with 8 serial channels. You might have a testbench aimed at a very transaction-rich and control-rich chip. Those are handled differently. We have examples of these testbenches that you can actually use and incorporate into your verification environment as a starting point for an advanced testbench in SystemVerilog.


A high-level model is a transaction-level model that deals with packets and is concerned with throughput and latency. A low-level model is a detailed pin-level model of the device under test (DUT), sending waveforms, 1's and 0's. It is concerned with setup time, hold time and various low-level transactions. It would be very good if in both cases you did not have to rewrite your testbenches. The way you can do that is by using an abstraction adaptation layer, an abstraction conversion layer that will take packets and convert them to pin wiggles when you need them to be pin wiggles. If your DUT is a high-level model, you don't need that; you need something that basically generates packets. It is complicated to explain.
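Editor's note: Here is a minimal SystemVerilog sketch of the "packets to pin wiggles" conversion Hum describes. The interface, packet format and timing are invented for illustration; AVM's actual adaptation components are not shown here.

// An abstraction adapter: the same packet-level stimulus can feed a
// transaction-level model directly, or be serialized into pin activity
// for an RTL DUT by a driver like this one.
interface serial_if(input bit clk);
  logic       valid;
  logic [7:0] data;
endinterface

class packet;
  rand bit [7:0] bytes[4];
endclass

class pin_level_driver;
  virtual serial_if vif;
  function new(virtual serial_if vif); this.vif = vif; endfunction
  // Convert one packet into cycle-by-cycle pin wiggles.
  task drive(packet p);
    foreach (p.bytes[i]) begin
      @(posedge vif.clk);
      vif.valid <= 1'b1;
      vif.data  <= p.bytes[i];
    end
    @(posedge vif.clk);
    vif.valid <= 1'b0;
  endtask
endclass

module adaptation_sketch;
  bit clk;
  always #5 clk = ~clk;
  serial_if sif(clk);

  initial begin
    pin_level_driver drv = new(sif);
    packet p = new();
    void'(p.randomize());
    drv.drive(p);   // packet in, pin wiggles out
    $finish;
  end
endmodule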


It is a key point. It is what sets the AVM apart from other verification methodologies in the industry like Synopsys VMM which does not have this concept in it. We believe that this concept is pretty unique in the industry and will make a significant difference to the efficiency and effectiveness with which people can verify their design.


We believe that AVM is a bite-sized methodology that you can implement incrementally. You do not have to swallow the whole thing. I go back to the statement I made that to get people to go from where they are to where they want to be, you have to provide a path for people to follow that is smooth and prevents them from falling off cliffs. One of the design goals of AVM was to give people a way of starting with SystemVerilog without having to take a year off to reeducate themselves. You can use object-oriented programming, which SystemVerilog supports, or you can use standard old HDL with SystemC and make incremental changes to your testbenches. AVM has the property that you can use both object-oriented and HDL-style constructs. You can mix and match.


Are there any competitors out there?

Synopsys VMM (Verification Methodology Manual)! In terms of access, AVM is open, VMM is closed. We can execute AVM with confidence on any SystemVerilog simulator. Synopsys VMM is closed; you have to be on VCS. With AVM, source code is available; with VMM it is not. Both have base libraries. AVM is TLM based, which includes this abstraction adaptation layer. We believe that is unique and we believe it is one of the key things that you have to do to make progress in verification. If you don't do that, I believe you will find that your testbenches are not serving you well. You will not be able to use them to validate things at the system level. We support SystemVerilog and SystemC down to RTL. The use model is hybrid, in contrast to VMM, which is class based. VMM has been in the industry for a while. AVM is relatively new. We have done some learning in the industry as to what works and what doesn't. We have had the benefit of people using previous methodologies and getting their comments and their suggested improvements. AVM is really a second-generation methodology. VMM is first generation.


Any other competitors?

Not really. Our friends at Cadence have been silent on that topic, which is quite curious. We are getting some indication that something may be coming. What we understand is that it will more than likely be aligned with AVM and not VMM. So there are only these two things in the industry right now that we know of. There are a bunch of smaller things, like Spiratech in the UK, which talks about tools to automatically generate transaction-level models. But that is not an entire methodology; it is something very local. I think AVM and VMM are unique.


When was Questa first released?

We announced Questa as the first SystemVerilog simulator in the world at last year's DAC conference. This year Questa 6.2 is coming out with a much more complete feature set, a very stable tool. We are announcing it in time for the DAC conference.


Questa itself works all the way from the specification level down to the gate level. It has links to MATLAB. It is a five-language simulator (SystemVerilog, Verilog, VHDL, PSL, and SystemC), plus C and C++. It is a single-kernel simulator. That means all of these capabilities execute with minimal overhead. Questa has a simulator, a constraint solver, an assertion engine and a functional coverage engine. What we believe is unique about Questa 6.2 is the way it handles coverage. We are pretty excited about this.


The other thing is performance. As you know, every simulator release has to come out with the simulator running a little faster, because of course the circuits being worked on are a little bigger. We have more performance. We have a new optimization engine in there called vopt. We have seen designs accelerated up to 10X using vopt. Vopt, typical of optimization engines, extracts the pieces of the design where users are not interested in poking into certain aspects of the circuit; those things are abstracted out. We have much more debugging in it, so we can debug TLM and assertions. We can debug cause-and-effect relationships. We have a five-language native verification environment. And of course Questa 6.2 executes AVM.


We believe that coverage is really the thing that helps verification and design engineers become more efficient. What you need to be able to do is gauge what impact your testbench is having on your design. If you are applying stimuli from your testbench that are not covering new aspects of your design, you are really wasting your time. You have to be able to tell what the testbench you are executing is doing to your design, what it is covering and not covering. In Questa 6.2 we have automated a great deal of that. SystemVerilog has covergroup and coverpoint statements and things like that. If you use these statements properly, you will get a good snapshot of what is going on in your circuit relative to what the testbench is simulating. You can get to the point where your testbench can query coverage metrics in order to make choices about what vectors it generates next or what path it will take on the next execution cycle. It answers questions like “How much of my test plan is covered?” The test plan is a high-level document that says you will check for FIFO overflow, check for clock recovery … Then when you write your testbench, you have to relate what your testbench does to your design back to your test plan. Did I or did I not cover FIFO overflow? Did I or did I not cover clock recovery error?
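Editor's note: For readers new to SystemVerilog coverage, here is a minimal sketch of a covergroup with coverpoint bins and of a testbench querying the accumulated coverage. The FIFO example and bin choices are invented; they simply echo the FIFO-overflow item from the hypothetical test plan Hum mentions.

// Functional coverage sketch: sample the occupancy of a hypothetical
// 16-deep FIFO and report how much of the interesting behavior was hit.
module coverage_sketch;
  bit       clk;
  bit [4:0] fifo_level;   // 0..16 occupancy

  covergroup fifo_cg @(posedge clk);
    coverpoint fifo_level {
      bins empty = {0};
      bins mid[] = {[1:15]};
      bins full  = {16};   // the overflow-threatening corner case
    }
  endgroup

  fifo_cg cg = new();

  always #5 clk = ~clk;

  initial begin
    repeat (200) begin
      @(posedge clk);
      fifo_level = $urandom_range(0, 16);
    end
    // A testbench can query this figure and keep generating stimulus
    // until the bins it cares about have been hit.
    $display("fifo_level coverage = %f%%", cg.get_coverage());
    $finish;
  end
endmodule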


Where does coverage come from?

Inside the Mentor system you have several choices on where coverage comes from. Part of your coverage can come from formal analysis. You might have a block in your circuit, and the best way to cover that piece of the circuit is to use static formal. Once I use static formal, I know that this block is correct, and I would like to enter the fact that I have covered this block into a coverage database. The other place where you can create coverage is the simulator. The simulator will tell you which assertions fired and which didn't. It will tell you which parts of your code executed. Your testbench further tells you what kind of functional coverage you have. You can see that coverage is not one simple thing: functional coverage, assertion coverage, code coverage and so on.


We are introducing as part of Questa 6.2 a new Unified Coverage Database (UCDB). It has a read and write API that allows users to customize the reporting. What happens with UCDB is that all the coverage metrics are put into a high-performance database in real time while the simulator or formal engine is running. Thus, for example, you might kick off a regression run on Friday. The regression runs all weekend. On Monday morning the manager of a large SoC gets a report on his desk to understand what happened with those regression tests. Did all the changes that were made last week in the chip result in more or fewer bugs, and what is the actual coverage?


UCDB is a very high-performance database. It allows more than 1,000,000 insertions per second. The read and write APIs allow users to create other fields, tag things, tie things together with scripts and whatever else they want to do. We are creating an environment that allows people to understand what their coverage is, what is left to cover on the chip, and things like that. One of the problems we have in the industry today is that generally verification stops when somebody says it is time to tape out rather than when someone says I have enough coverage. I think that leads to respins and a whole bunch of heartache. We believe that if people have the ability to understand their designs and what has been covered, they will do a much better job.


Using UCDB you will be able to identify coverage holes. Graphical printouts tell you, out of the coverage you wanted, how much is in fact done and how much is not, with color coding for people to key in on. You can tie the UCDB output back to the test plan. You can review a test plan and see that you have covered everything you want. You no longer have to test a particular module for those things, because you know that stuff has been tested.


Infrastructure helps the industry make a transition to SystemVerilog. We are announcing the Vanguard Program, a group of companies from around the world working closely together to accelerate the adoption of advanced verification techniques and methodologies. There are a range of companies offering services in training, consulting, conversion and verification IP. All that we require to join Vanguard is that whatever they create has to support AVM and Questa. It can support anything else but at a minimum they have to be able to execute in the Questa environment and be compatible with AVM.


The list of companies includes Sutherland HDL, Willamette HDL, Denali, Averant, and Spiratech. It is a good start in the industry. It is important that customers have an ecosystem otherwise it is very difficult for them to make rapid progress. We are offering this to the industry as a way of motivating the adoption of SystemVerilog.


When was Questa first introduced commercially?

Last year just before the DAC conference (May 2005).


How many customers or seats are out there?

That is something we do not specifically publish. But I can tell you that 30% of our deals have Questa in them. That doesn't mean that 30% of our sales are Questa. I think the question you are asking is how much penetration there is and how many people have started using Questa in the marketplace. We find it interesting that 30% of our deals have Questa in them. There is a percentage of our customers that have now converted to SystemVerilog. The problem we have in understanding the metrics is that no large company has made a wholesale shift. We have a lot of startups, generally in the Bay Area and in the Valley. A new startup company with no legacy can make a choice about going full tilt with SystemVerilog. We have quite a few startups that take a SystemVerilog route and are quite successful at it. We know how many licenses are out there, but the metric itself doesn't mean a heck of a lot. So we haven't published that number.


Has the methodology been tested out with any customers of significant size?

Yes, in fact it has. My statement that the EDA industry does not generate methodologies is absolutely true, so the AVM methodology is not something developed in a vacuum. It was developed with several lead customers. In the press release there is a quote from one customer (HDL Design House) who was a pretty close partner for us. AVM has been out in the industry in both alpha and beta form for about 9 months. What you are seeing is AVM 2.0. This is really a methodology that has made the rounds through the industry. We have been very quiet about it, but it has seen real use. The problem with developing a methodology as an EDA company is that we don't earn our living by designing chips; we earn our living by helping people design chips. The methodology really has to come from the user community. What we have done is find people at the leading edge of things, who have experience with methodologies. They have been the ones to advise us what to do. We have done our homework well and have been able to package up the best ideas in the industry on how these things ought to be done.


Are there any other barriers to adoption out there?

The way people deal with adoption is that they do it incrementally. They say we have all this legacy and we will maintain our current legacy for the products we have. We will launch new products on SystemVerilog. Every medium to large customer we have that has made a commitment to SystemVerilog has taken an incremental approach to it. We've eased the problem for them quite a bit because when you use Questa you do not have to “get rid of” existing methodology. Questa will very happily execute e. If you are an e user, whatever e testbench you have, you have. You can make changes to that testbench with SystemVerilog and quite happily get on with your life. The trick in
adoption is to make progress incrementally. Some people want to make faster progress. Some firms will offer conversion services to these people. People take various approaches to this whole thing. The adoption curve is limited by how quickly people can move.


What other hurdles are out there?

Those are the biggest ones. The ecosystem - the Vanguard Program - is a way of having the industry educate itself around the good things in SystemVerilog. Models are needed. IP needs to come out in ways that are compatible with SystemVerilog. Fortunately, the SystemVerilog spec says you must be able to execute plain old Verilog. So, all the old Verilog models will continue to work under SystemVerilog which is a big plus. You can create new models with some of the more advanced features of SystemVerilog.
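Editor's note: A tiny illustration of the backward-compatibility point: a plain Verilog-2001 model compiles unchanged under SystemVerilog, while a new model can adopt features such as logic, always_ff and enums. Both modules are invented examples, not from any real IP library.

// Legacy Verilog model: still legal SystemVerilog.
module legacy_reg (input wire clk, input wire [7:0] d, output reg [7:0] q);
  always @(posedge clk) q <= d;
endmodule

// New model using a few SystemVerilog conveniences.
module sv_reg (input logic clk, input logic [7:0] d, output logic [7:0] q);
  typedef enum logic [1:0] {IDLE, LOAD, HOLD} state_e;
  state_e state;
  always_ff @(posedge clk) begin
    q     <= d;
    state <= LOAD;   // trivial use of the enum, for illustration only
  end
endmodule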


How would you characterize the adoption of SystemVerilog itself?

Pretty good! As I said, 30% of our deals in the first quarter after we released Questa had SystemVerilog in them. The SystemVerilog sales in terms of revenue are significant. The adoption has been rapid. It is more rapid in startups. However, the momentum is picking up in larger companies. We will be able to announce a major deal this quarter with a company that has made a wholesale conversion to SystemVerilog. You are going to see this move to SystemVerilog accelerating.


What about this transition outside of Mentor's customer base?

Aart de Geus recently stood up at an analysts' meeting and announced that Synopsys has a reasonably mature SystemVerilog simulator available. I think Synopsys is also experiencing people getting on the SystemVerilog bandwagon. Cadence is a little late, but I think they will catch up pretty rapidly. The other way to judge SystemVerilog adoption is to have a look at how many companies at DAC have SystemVerilog in any of their documentation. Last year at DAC there was a sponsored lunch with an opportunity for companies to stand up and give a 2-minute pitch on what they are doing with SystemVerilog. Over 85 companies got up and talked.


It is there in the industry. Things are moving forward. It is hard to measure precisely. From our own metric we are pleased with what we see. I think Synopsys is experiencing the same. Cadence, I just do not know.


Faster simulators and advanced methodologies. What about front end tools, ESL?

One of the design goals of AVM was to allow people, to encourage people, to check things as early in the design as possible. The feedback from our customers is that our testbench and verification approach has to have this abstraction adaptation layer in it, because people want to start off with high-level models and check throughput and latency at the high level before they spend man-years writing RTL, only to find that they have correctly implemented the wrong spec. That's what you are trying to prevent. The way to do that is to have a methodology that lets you smoothly shift between abstraction layers, so that you can write a very high-level testbench to test your high-level models and then, as you refine your design and add more timing details to it, that testbench continues to give you value. In the past the problem has been that you wrote a testbench for a high-level model, threw it away and then had to write a new testbench for a low-level model. The question always was “Did the two testbenches cover the same functionality?” It may be that when you rewrote your testbench for the lower level, you forgot some key things that you are now not checking for any more. Your design may come out leaning slightly to the right or left.


A good methodology has to span the system level down to the gate level. The only way to do that is to have those abstraction conversion layers inside your testbench. It is an architecture of testbenches, a way of writing a testbench that lets you do that. You can do it with SystemC; you don't strictly need SystemVerilog to do it. It just turns out that with SystemVerilog you get a whole lot of help with coverage statements and things like that. AVM has really taken that into account because it is one of the key points about becoming more effective.




Product Availability and Pricing?


The Questa 6.2 verification platform ships in Q2 2006 and includes access to the Advanced Verification Methodology portal. Pricing starts at $28,000 USD. The AVM will be available in Q2 2006 at no charge under a standard, open-source license.




The top articles over the last two weeks as determined by the number of readers were


POSEIDON Announces Sales Channel Partnership with PROTOtyping to Service Growing Japanese Electronics Market Motohiko Torimoto, CEO of PROTOtyping Japan Corp. said, "We are very excited to work with Poseidon. Interest in architectural optimization and acceleration for processor-based platforms by Japanese companies is increasing. We are honored to have reached an agreement with Poseidon to be their ESL software and design services representative in Japan." Poseidon products will be demonstrated in Japan both at PSS2006 Kobe (May 22nd) & Shibuya (May 24th), and ESEC2006 (Tokyo Big Site) on June
28th-30th, 2006 in Japan.


VaST Systems Technology and Wipro Limited Join Forces VaST -powering embedded design innovation-and Wipro Technologies, the global IT services business of Wipro Limited announced that Wipro has become a certified development partner of VaST peripheral models. Wipro will serve as an extended R&D facility for the rapid creation of VaST peripheral models such as UARTs, memory and other controllers, including ARM Prime Cells.


Mentor Graphics Adds Transient Simulation Capability to CHS Named Capital SimTransient, the new product utilizes industry-standard VHDL-AMS component models to simulate time domain effects such as wire melt and fuse blow. This enables harness cost and weight optimizations to be made in the design of electrical distribution systems in the transportation industry.


Virage Logic and Cadence Join to Present Technical Webinar on Advanced Design Methodology for Low-Power Applications; Industry Leaders Provide Integrated Low-Power Design Flow Supporting RTL-to-GDSII for Complete Approach The live webinar entitled, "Advanced Design Methodology for Low-Power Applications: An Integrated RTL-to-GDSII Approach." will be broadcast via TechOnLine on Wednesday, May 24, 2006, at 1:00 p.m. EDT.


Berkeley Design Automation Named a Red Herring 100 Winner This list of 100 privately held companies in North America recognizes those that play a leading role in innovating the technology business. Red Herring is a magazine reporting on how innovation and entrepreneurship are transforming business and how the business of technology is transforming the world. Berkeley Design Automation is a venture-backed, private company whose technology characterizes the nonlinear, time-varying behavior of complex analog and RF circuits.




Other EDA News


Engineering Scholarships Available for 43rd Design Automation Conference


Magma President Roy E. Jewell to Speak at SG Cowen Technology Conference


Jasper Design Automation Promotes Rajeev Ranjan To the Position Of Chief Technology Officer


Magma Announces Repurchase of $40.3 million Principal Amount of Convertible Notes


STMicroelectronics Certifies Mentor Graphics Catapult C Synthesis Libraries and Joins Silicon Vendor Partners Program


Synopsys' JupiterXT Tool Cuts Prototyping and Implementation Time on Cavium Networks' Octeon MIPS64 Processors


Cadence Chief Financial Officer Bill Porter to Present at the Cowen & Co. Technology Conference


UMC Announces Readiness for 65-nanometer X Architecture Designs


ENOVIA MatrixOne to Host Seminars on Environmental Compliance for Electronics Companies


Dongbu Electronics Collaborates with Cadence to Deliver RTL-GDSII Reference Flow


Synopsys Chief Financial Officer Brian Beattie to Speak at the Cowen & Co. 34th Annual Technology Conference


VaST Systems Technology at DAC 2006


Precience Announces New Seamless Integration for OrCAD and PADS Users


Altera Enables ZAPiT Games Breakthrough in Interactive, Console-Based Family Gaming


Pioneer Chooses Mentor Graphics Catapult C Synthesis Tool for R&D of Digital Signal Processing Applications


HP Boosts Server Lines with Dual Core-based Platforms


Ansoft Corporation Reports Record Results


LogicVision Announces the Industry's First True At-Speed Deterministic Test Compression Solution


HARDI Announces First Silicon Success for C2 Microsystems


Magma CEO Rajeev Madhavan to Speak at JP Morgan Technology Conference


ArchPro Raises $4.5 Million in Latest Financing Round


Berkeley Design Automation Named a Red Herring 100 Winner


Winner Selected for IBSystems iPOD Nano Sweepstakes


Altera Delivers PCI-SIG-Compliant x8 PCI Express Solution Supporting Stratix II GX FPGAs


Giga Scale IC Changes Name to Chip Estimate Corporation


Cadence Unites Industry Leaders to Overcome Low-Power Barriers for the Electronics Industry


Altium Designer 6.3 Smoothes Migration to 'Next-Generation' Electronics Product Development Platform


Athena Design Names EDA Veteran Reichmuth to Head Global Sales, Business Development


STARC Adopts Simucad PDK for Analog Design Flow


Calypto Expands Sequential Analysis Capabilities with SLEC 2.0 Release


Denali Announces Verification Solution for Consumer Electronics Storage Systems


ProDesign's CHIPit Supports Transaction Based Verification


Other IP & SoC News


Court Rules in Favor of Genesis Microchip in MStar's Patent Infringement Lawsuit Appeal


Rising Micro Electronics Successfully Develops RF Transceiver Chipset for 3G WCDMA Wireless Handsets


Zarlink DirectConnect Embedded Ethernet Switch Simplifies Next-Generation Packet-Based Equipment Design


TSMC Production-Ready for 65-nm X Architecture Designs


ON Semiconductor Expands ECLinPS MAX(TM) Clock Management Portfolio with New High-Performance LVDS Fanout Buffers


Axiom Microdevices Vindicated in Silicon Labs v. Axiom Microdevices Lawsuit and Allowed to Commence Sales of the World's Only Fully Integrated CMOS Power Amplifier for Mobile Telephones


SIGMA-C CEO Peter Feist to Participate on Panel at Canaccord Adams' "Technology Trends -- Design for Manufacturing" Conference on June 8, 2006


Arithmatica Announces Floating Point Library Customer in Advanced 3D Graphics Group


SigmaTel Announces Availability of Production Ready Reference Platforms Optimized for Video Capability of STMP3600


Toshiba to Launch 2GB miniSD Memory Card for Cell Phones


Cypress Introduces High-Speed, Small-Form-Factor Non-Volatile SRAM Family


POSDATA Spurs Mobile WiMAX Terminal Chip Set Using Silicon Hive Processor


Zarlink Releases Fourth Quarter and Fiscal 2006 Results


Ground-Breaking Transmitter Chip an Industry First


Atmel and u-blox Launch Low-power GPS Baseband IC With SuperSense(R) Weak-signal Tracking Software


Intel's Core Microarchitecture Redefines Computing


NVIDIA Introduces Fastest GPU for Performance Business Notebooks


National Semiconductor's High-Speed 14-bit ADC Offers Industry's Highest Full-Power Bandwidth


Elpida Memory's Fully Buffered DIMMs Support New Dual-Core Intel(R) Xeon(R)-Based Server Platforms


PMC-Sierra Introduces Security-Enabled Multi-Service One GHz MIPS Processor


Pericom Tackles LVDS Video Signal Routing With Two New High-Bandwidth Switches


Tower Semiconductor Begins Production of Metro Wi-Fi Baseband Controller for Wavion


ATI Unveils New Chipset Line-Up Bringing New Features and More Performance to AMD Socket AM2 PCs


Infineon Demonstrates Second Generation Ultra Low-Cost GSM Single-Chip with Successful Live Phone Call


AMD Delivers Desktop Product Powerhouse While Reducing Costs for Ecosystem Partners


Akustica and SigmaTel Develop New Digital Microphone Array Solution for Notebook PCs


Ramtron Introduces the First FRAM-Enhanced 8051 MCU


New SigmaTel Audio CODECs Enable Systems to Be Compliant with Microsoft Vista Premium Logo Program


Chipidea Data Converter Recognized For Leading-Edge Performance at Top Semiconductor Design Showcase


CoWare Joins WiMAX Forum(TM) to Accelerate WiMAX SoC Design


Key ASIC and Silterra Partner to Deliver Design-to-Manufacture Services Targeted for Mobile and Consumer Electronics Markets


Tensilica Granted 8 New Configurable Processor Technology Patents


Bay Microsystems Unveils World's First 40G Network Processor


STMicroelectronics Unveils 90nm System-on-Chip Capability for Hard Disk Drives


Zarlink Strengthens Optical Component Portfolio with Acquisition of Primarion Optical I/O Group


National Semiconductor Introduces New Family of Output Capacitor-less Headphone Amplifiers


Wavesat Previews WiMAX Nomadic Chipset Targeting Wireless Laptop and PDA Applications


BroadLight Announces the High Performance BL2340 System-on-Chip for GPON ONT


Freescale Extends Controller Continuum with First RS08-Based 8-Bit Microcontroller


Renesas Technology's Highly Integrated 32-bit "Euclid" SuperH Processor Provides Sophisticated Functions Essential for Cost-Sensitive Telematics Systems


AnalogicTech Announces New Step-up DC/DC Converter for Ultraportable Systems Powered by Single, Dual AA Cells


Infineon Selected to Co-Develop Specialized Security Chips for Microsoft FlexGo


AsicAhead Enters WiMAX Market with Breakthrough Single-Chip Programmable Radio


Transmeta Delivers Specialized Processor to Support Microsoft's Pay-as-you-go Computing With FlexGo for Emerging Markets


Freescale Paves the Way for Autonomous Vehicles with Industry's First 32-Bit Flash-Based Microcontroller with FlexRay(TM)


You can find the full EDACafe event calendar here.


To read more news, click here.



-- Jack Horgan, EDACafe.com Contributing Editor.

