May 29, 2006
You make the comparison to the average person being able to compose a multi-channel audio recording. In that case the average person would know whether the end result was satisfactory without understanding much about the tool or the underlying technology. In the case of Scenario Builder, how does the end user know that they have created correct scenarios and that they have reasonable coverage?
The tool is actually creating the scenarios for them. The scenario is human readable, and the scenarios are correct by construction. Coverage is reported by running the simulation in the verification system.
My point is that if you give an automated or semi-automated tool to a class of users with no experience in an area, how do they, or their managers, know that they are using the tool correctly and efficiently?
In this case they are using the same tool which is a simulator and using the same debugging tools to observe the behavior of the simulation. The only thing that is different is the specification of the scenario. The same skills they used to identify a correct test today will be used to identify a correct test with Scenario Builder. They will just be able to capture a more sophisticated test with Scenario Builder. It is not a new requirement for them.
Is the value proposition to use lower-skilled people to create the same tests, to do it more quickly, or to do a more complete verification?
The sophisticated challenge is the abstraction of using a tool to describe the scenario. Debugging and observing the behavior of the system is not a new thing at all; even the less sophisticated people have been doing that for a while and will continue to do so. It is this concept of what a verification scenario is that is a new domain for them, and we are lowering the barrier on that. Once they have a scenario, debugging or observing the system is the same skill.
Are you aware of any competition?
No! This is a tremendous innovation built upon the innovations in Specman and the verification process automation environment. We are further extending that solution and making it more accessible.
What can be expected in the future for Scenario Builder?
architectural validation and so on, and toward the different specialists that are involved in these systems from front to back. Those are the kinds of extensions you could reasonably expect in Scenario Builder, and we are discussing them with our customers.
In the case of Verification IP, the developer or vendor provides certain information. Is the information Scenario Builder needs typically provided, or is additional data required?
The tool works well with the information already available today from the VIP developer. It does not require extra work. Obviously, there is a capability to provide extra information that makes usage by the test writer even easier. This might require a little work, but it is very minor, a couple of hours' work.
On May 8 Mentor Graphics announced its comprehensive next-generation Questa 6.2 verification solution. I discussed this announcement with Robert Hum, VP and GM of the Design Verification and Test Division.
What are you announcing?
a transition to SystemVerilog.
the three things being announced.
What is the motivation behind this?
SystemVerilog has had in the marketplace is to create a verification environment to do things more efficiently, the implication being that you will need fewer people and resources. Another goal is to do things more effectively, the implication there being that you will find more bugs. Therefore the design entering the marketplace will more often than not be correct, compared to what you get in today's verification market.
The EDA industry had been responding to the growth and the needs of verification by providing tools and methods. There are lots of tools and methods out there: assertion based verification, functional coverage, constrained-random testing, etc. There are lots of these things available that can be applied to the verification problem. The question of course ends up being “How do you know which one of these things you should use in your situation?”
things that took a long time for anybody to become productive with. This was an era when people were experimenting with different testbench technologies, and e and Vera grew up; people were trying different approaches to assertions, and PSL grew up. Eventually the industry settled on good ways of doing these things, and SystemVerilog was born. With SystemVerilog came a realization that there is a methodology, or a set of methodologies, that can be employed to make the use of these tools more effective and more efficient at the same time, and thus get the learning curve under control. This then spawned the thought that something like AVM would be a good thing.
-- Jack Horgan, EDACafe.com Contributing Editor.