The Breker Trekker
Adnan Hamid, CEO of Breker
Adnan Hamid is the founder and CEO of Breker and the inventor of its core technology. Under his leadership, Breker has become a market leader in functional verification technologies for complex systems-on-chips (SoCs), and in Portable Stimulus in particular.

Multi-Dimensional Verification

 
May 28th, 2019 by Adnan Hamid, CEO of Breker

It seems like ancient history now, but in the not-so-distant past, verification was performed by one tool (simulation), at one point in the flow (completion of RTL), using one language and methodology (SystemVerilog and UVM). That changed as designs continued to grow and simulators stopped getting faster quickly enough. Additional help became necessary in the form of emulators and formal verification, but that coincided with the increasingly difficult task of creating a stable testbench. It was no longer possible to migrate a design from a simulator to an emulator without considerable rework of the testbench.

The increasing size and complexity of designs also made it necessary to think about verification as a hierarchy. You could no longer fit the whole design into a simulator, and even if you could, doing so would be highly wasteful: achieving the levels of controllability and observability necessary for complete verification would be too difficult and time-consuming. Unfortunately, when a testbench is developed for a block, it cannot be fully reused when that block is integrated into a larger sub-system without significant rework.

There is a third axis to today's verification flow, perhaps best described as closeness to the eventual application. Consider block-level verification. It has no notion of how the block will eventually be used in the system, so it is important that all potential behaviors are verified and no bugs remain within the block. When the block is developed, its eventual purpose may not be known; it may end up in an audio decoder or an encryption engine, and the way in which it is integrated may restrict the behavior space of the block. At the other extreme, you may want to perform verification using the production software, running on top of the operating system. You may not have time to explore all possible behaviors, but you need to make sure that the most important ones execute successfully. Again, none of this could be accommodated with existing verification languages and flows.

The industry needs a verification methodology that can scale. Teams must be able to trade off the thoroughness of block-level verification against system-level validation. They want to be able to use the best available execution platforms. They need an integrated methodology to track progress. In short, they need a multi-dimensional verification flow.

At this point, you have probably heard about the new Portable Stimulus language created by Accellera. The original application for it was to make the migration of the testbench from simulator to emulator easier. The Portable Stimulus Standard committee also saw the importance of being able to migrate testbenches from block to sub-system to system and made sure that this could be supported. But it stopped short of supporting the third dimension: closeness to the eventual application.
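
To give a flavor of what this portability looks like, here is a minimal sketch of a PSS scenario model for a hypothetical DMA block. Every name in it (dma_c, mem2mem, and so on) is illustrative rather than taken from any real design:

```pss
// A hedged sketch of a PSS scenario model; all names are hypothetical.
component dma_c {
    // A buffer flow object: data produced by one action and
    // consumed by another.
    buffer mem_buf_s {
        rand bit[32] size;
        constraint size in [64..4096];
    }

    // A pool through which actions exchange the buffer objects.
    pool mem_buf_s mem_pool;
    bind mem_pool *;

    // Produce a buffer in memory.
    action fill_buf {
        output mem_buf_s out_data;
    }

    // Copy one buffer to another.
    action mem2mem {
        input  mem_buf_s src_data;
        output mem_buf_s dst_data;
        constraint dst_data.size == src_data.size;
    }

    // Check a copied buffer.
    action check_buf {
        input mem_buf_s in_data;
    }

    // A compound scenario: the tool infers the fill_buf needed to
    // satisfy mem2mem's input, then schedules the check afterward.
    action copy_and_check {
        activity {
            do mem2mem;
            do check_buf;
        }
    }
}
```

Because the model describes intent (what must happen and in what order) rather than how it is driven, the same description can be retargeted across execution platforms.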

Thankfully, that does not stop vendors such as Breker from making this possible today. Other companies within the verification community are also working on aspects of this, and together a comprehensive hardware/software interface layer is being defined. This is part of what Breker calls a Virtual Realization Layer.

The Virtual Realization Layer can be thought of as a set of services that all hardware/software systems require and that are present in major operating systems (OSes). You are unlikely to boot Linux on a simulated platform, and doing so may not even be necessary most of the time on an emulator. However, you may not want to treat the hardware as entirely bare metal either. Memory management is one example: having such a service available makes writing tests a lot easier, and you may want to run tests with a variety of allocation schemes under your control. You do not want to waste time developing such a library of capabilities, nor is it a trivial matter to extract the necessary capabilities from an OS. In many cases, different services may be used for diagnostic testing of the hardware versus its eventual deployment.
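
As a sketch of how such a service might surface in a scenario model, consider a memory-allocation call exposed to PSS as a function prototype. The function vrl_mem_alloc, its arguments, and the scheme enumeration are all invented for illustration; they are not part of the PSS standard or any particular product's API:

```pss
package vrl_mem_pkg {
    // Hypothetical Virtual Realization Layer service: allocate a
    // block of memory under a chosen allocation scheme.
    enum alloc_scheme_e { LINEAR, RANDOMIZED, POOLED };

    function bit[64] vrl_mem_alloc(bit[32] size, alloc_scheme_e scheme);
}

component mem_test_c {
    action alloc_test {
        rand bit[32] size;
        rand vrl_mem_pkg::alloc_scheme_e scheme;
        bit[64] src_addr;
        bit[64] dst_addr;

        exec body {
            // Ask the service layer for memory instead of hard-coding
            // addresses; the allocation scheme is under test control.
            src_addr = vrl_mem_pkg::vrl_mem_alloc(size, scheme);
            dst_addr = vrl_mem_pkg::vrl_mem_alloc(size, scheme);
            // ... program the hardware under test with src_addr/dst_addr ...
        }
    }
}
```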

What is required is a library of these services, potentially with different levels of capability, that can be selected on demand during test synthesis. This is where the hardware/software interface becomes necessary: it insulates the scenario model from the implementation that executes on the final platform.
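
One way to picture that insulation, sketched here with hypothetical names: keep the scenario model free of implementation detail and supply the realization per platform through PSS type extension, selecting the appropriate package at test synthesis:

```pss
// The scenario model itself stays implementation-free.
component timer_c {
    action start_timer {
        rand bit[32] timeout;
    }
}

// Bare-metal realization, selected when synthesizing tests for
// simulation or emulation (target-template exec emitting C).
package bare_metal_pkg {
    extend action timer_c::start_timer {
        exec body C = """
            timer_hw_start({{timeout}});    /* hypothetical HAL call */
        """;
    }
}

// OS-hosted realization, selected when the target boots Linux.
package linux_pkg {
    extend action timer_c::start_timer {
        exec body C = """
            timer_ioctl_start({{timeout}}); /* hypothetical driver call */
        """;
    }
}
```

Because the model never mentions either implementation, retargeting it is purely a matter of which package the synthesis step pulls in.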

To make this multi-dimensional verification platform possible, Breker is partnering with virtual platform and emulation providers. The flow potentially starts with a virtual prototype of the system, running the target OS and maybe also some subset of the final application software. This may be used for architectural validation, performance optimization and for the verification of critical pieces of the software. The testbench is coded in PSS using the Breker Virtual Realization Layer.

Blocks of the system can be carved out for implementation and verification using a combined PSS/UVM methodology. All behaviors necessary to support the system are encapsulated within the PSS model. Additional levels of detail can either be added to that model or supplied by a UVM testbench.
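
One common pattern, sketched here with invented names and a tool-dependent target-language label, realizes a block-level PSS action by launching a UVM sequence; neither the "SV" label nor the testbench paths below are mandated by the PSS standard:

```pss
component codec_c {
    action decode_frame {
        rand bit[32] frame_len;

        // Target-template exec emitting SystemVerilog: the same action
        // that runs at system level is realized at block level by a
        // hypothetical UVM sequence (decode_seq) on a hypothetical
        // testbench path (env.codec_agent.sequencer).
        exec body SV = """
            begin
                decode_seq seq;
                seq = decode_seq::type_id::create("seq");
                seq.frame_len = {{frame_len}};
                seq.start(env.codec_agent.sequencer);
            end
        """;
    }
}
```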

When the block is reintegrated into the system, it is most likely via a hybrid emulation platform, where the RTL code of the block runs on the emulator while the processors and other parts of the system continue to run within a virtual environment. This provides the fastest execution, using emulator resources only for the implemented block. Basic services from the Virtual Realization Layer are used here so that the focus remains on system-level hardware verification. Over time, pieces of actual software may be substituted. This is only possible with a hardware/software interface that can switch between these services without requiring modification of the testbench.
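
Continuing the hypothetical vrl_mem_alloc service from above, substituting a piece of production software then requires no change to the scenario model at all; only the realization behind the function changes:

```pss
// The scenario model calls the same service on every platform; which
// implementation answers is decided when the generated test is built.
package vrl_hybrid_pkg {
    import vrl_mem_pkg::*;

    // Map the hypothetical service onto an external C implementation.
    // Early hybrid-emulation runs link the generated test against a
    // simple bare-metal stub; later runs link the very same test
    // against the production driver's version of the function.
    import C function vrl_mem_alloc;
}
```

The swap then happens entirely on the software side: relink the generated test against the production implementation and rerun, with the scenario model untouched.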

Finally, the system is brought up on the final silicon, most likely using the actual OS and application software. The testbench, and the control of the external I/O necessary to run the tests, remains unchanged throughout the flow, except that the model may have been enhanced during implementation or changed to reflect modified marketing requirements. Whenever that happens, all previous tests can be rerun on any of the configured platforms, making sure that the change does not impact any aspect of the system that has already been verified.

We are getting close to this reality. This is what our users demand, and this is what Breker and our partners are working on to ensure a complete flow. Together we can do this.


Category: Knowledge Depot

