May 29, 2006
Verification Update
Jack Horgan - Contributing Editor


You make the comparison to the average person being able to compose a multi-channel audio recording. In that case the average person would know whether the end result was satisfactory without understanding much about the tool or the underlying technology. In the case of Scenario Builder, how does the end user know that they have created correct scenarios and that they have reasonable coverage?

The tool is actually creating the scenarios for them. Each scenario is human readable, and the scenarios are correct by construction. Coverage is measured by running the simulation in the verification system.

My point is that if you give an automated or semi-automated tool to a class of users that has no experience in an area, how do they, or their managers, know that they are using the tool correctly and efficiently?

In this case they are using the same tool which is a simulator and using the same debugging tools to observe the behavior of the simulation. The only thing that is different is the specification of the scenario. The same skills they used to identify a correct test today will be used to identify a correct test with Scenario Builder. They will just be able to capture a more sophisticated test with Scenario Builder. It is not a new requirement for them.

Is the value proposition to use lower-skilled people to create the same tests, to create them more quickly, or to do a more complete verification?

The sophisticated challenge is the abstraction of using a tool to describe the scenario. Debugging and observing the behavior of the system is not a new thing at all; even the less sophisticated people have been doing that for a while and will continue to do so. It is the concept of what constitutes a verification scenario that is a new domain for them, and we are lowering the barrier on that. Once they have a scenario, debugging or observing the system is the same skill.

Are you aware of any competition?

No! This is a tremendous innovation built upon the innovations in Specman and the verification process automation environment. We are further extending that solution and making it more accessible.

What can be expected in the future from Scenario Builder?

We will try to improve even further the ease of use and the abstraction of the presentation of the environment. At a macro level there are things we are doing with the whole Incisive platform; for example, more multi-language support is a possibility. We are working with customers to prioritize all of this. We are also expanding the application domain of Incisive to other specialists, so it is possible that other specialists will have their own versions with extensions of Scenario Builder. This version is explicitly for designers. We have a roadmap for Incisive that Scenario Builder will track. The kinds of things we are moving toward are hardware/software, analog and mixed-signal, architectural validation and so on, and toward the different specialists that are involved in these systems from front to back. Those are the kinds of extensions you could reasonably expect in Scenario Builder, and we are discussing them with our customers.

In the case of Verification IP, the developer or vendor provides certain information. Is the information Scenario Builder needs typically already provided, or is additional data required?

The tool works well with the information already available today from the VIP developer; it does not require extra work. Obviously, there is a capability of providing extra information that makes usage by the test writer even easier. This might require a little work, but it is very minor, a couple of hours of work.

On May 8 Mentor Graphics announced its comprehensive next-generation Questa 6.2 verification solution. I discussed this announcement with Robert Hum, VP and GM of the Design Verification and Test Division.

What are you announcing?

There are three highlights. The first is the new Questa Verification platform, a single-kernel SystemVerilog-based simulation environment. The second is an open-source, standards-based Advanced Verification Methodology called AVM. The third is the Questa Vanguard Program, an organization of companies, currently 25 firms enrolled, who are contributing to SystemVerilog in some way. The list includes people who are doing training, people who are doing consulting, people who are doing conversions from e or Vera to SystemVerilog, and folks who have tools available, like SpiraTech. It is kind of an ecosystem of companies working together to help the electronics industry make the transition to SystemVerilog.

Questa is delivering on three fronts. It is necessary but insufficient to deliver only tools. If all you have is a simulator, you are going to get yourself in trouble, because SystemVerilog itself has quite a bit of capability in it. The question is “How do you most effectively use that capability?” How do you become not only efficient but effective in verification? You really have to deliver three things to the marketplace to make this work: tools, a methodology to use those tools effectively, and infrastructure. You have to have models, methodologies and an ecosystem of companies that provide services and things that simulators consume. Those are the three things being announced.

What is the motivation behind this?

The impact of complexity on the number of cycles, as seen in the Collet study. Intel has slides showing that the number of code vectors increases with design size. Bigger designs simply give you more bugs. More bugs mean more tests to find them. More tests mean more people to write them. With more people writing tests, you need more simulators to run them. Also, some of the new tests will have bugs themselves, so you will need more people to debug them. Complexity has some interesting effects: you need to do a much more complete job of testing, which increases the load on an organization and drives you to add more people; a very expensive spiral to get into. One of the goals that SystemVerilog has had in the marketplace is to create a verification environment that does things more efficiently, the implication being that you will need fewer resources, both people and machines. Another goal is to do things more effectively, the implication being that you will find more bugs. Therefore a design entering the marketplace will more often than not be correct, compared to what you get with today's verification practices.

The EDA industry has been responding to the growth and the needs of verification by providing tools and methods. There are lots of tools and methods out there: assertion-based verification, functional coverage, constrained-random testing, etc. There are lots of these things available that can be applied to the verification problem. The question, of course, ends up being “How do you know which of these things you should use in your situation?”
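To make the three techniques named above concrete, here is a minimal sketch of the underlying ideas expressed in plain Python rather than SystemVerilog; all names (packet fields, bin names) are invented for this illustration, not taken from any of the tools discussed:

```python
import random

def random_packet(rng):
    """Constrained-random stimulus: the length is randomized but constrained
    to the legal range 1..64, and the kind is weighted toward 'data'."""
    return {
        "length": rng.randint(1, 64),
        "kind": rng.choices(["data", "ctrl"], weights=[3, 1])[0],
    }

def check_packet(pkt):
    """Assertion-based checking: fail the moment an illegal packet is
    observed, rather than waiting for a bad end result downstream."""
    assert 1 <= pkt["length"] <= 64, f"illegal length {pkt['length']}"
    assert pkt["kind"] in ("data", "ctrl"), f"illegal kind {pkt['kind']}"

# Functional coverage: bins recording which interesting cases the random
# stimulus has actually exercised.
coverage = {"short": 0, "long": 0, "ctrl": 0}

rng = random.Random(2006)  # fixed seed so the run is reproducible
for _ in range(1000):
    pkt = random_packet(rng)
    check_packet(pkt)
    if pkt["length"] <= 8:
        coverage["short"] += 1
    if pkt["length"] >= 57:
        coverage["long"] += 1
    if pkt["kind"] == "ctrl":
        coverage["ctrl"] += 1

# An empty bin means the random stimulus never exercised that case:
# a verification hole, regardless of how many cycles were simulated.
holes = [name for name, hits in coverage.items() if hits == 0]
print("coverage holes:", holes)
```

The point of the sketch is the interplay: randomization generates stimulus you would not think to write by hand, assertions catch illegal behavior immediately, and coverage tells you when the randomization has actually visited the cases you care about.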

In the old days people used to think that all they needed for better verification was a faster simulator. In fact, simulator speed has always been one of the things that everybody benchmarks. These days, if you are only benchmarking simulator speed, you are doing yourself a disservice. Simulator speed is all about finding the same bugs you have been finding, but finding them in less time. The industry realized that this wasn't enough, and some of the technologies and techniques we mentioned earlier came into play. The trick there was to find more bugs as quickly as you can. Then we discovered there was a huge learning curve. There were so many different methods and tools and ways of doing things that it took a long time for anybody to become productive. This was an era when people were experimenting with different testbench technologies: e and Vera grew up, people were trying different approaches to assertions, PSL grew up. Eventually the industry settled on good ways of doing these things, and SystemVerilog was born. With SystemVerilog came the realization that there is a methodology, or a set of methodologies, that can be employed to make the use of these tools more effective and more efficient at the same time, and thus get the learning curve under control. This then spawned the thought that something like AVM would be a good thing.


-- Jack Horgan, Contributing Editor.

