The Breker Trekker
Tom Anderson, VP of Marketing
Tom Anderson is vice president of Marketing for Breker Verification Systems. He previously served as Product Management Group Director for Advanced Verification Solutions at Cadence, Technical Marketing Director in the Verification Group at Synopsys and Vice President of Applications Engineering at …
There Is No Silver Bullet for Low-Power Verification
September 23rd, 2015 by Tom Anderson, VP of Marketing
Anyone who reads The Breker Trekker from time to time needs no convincing from me that verification is a huge challenge for today’s complex chips. Breker’s Trek family of products exists, along with dozens if not hundreds of other EDA products, specifically to address functional verification. There are more technologies, tools, platforms, libraries, and methodologies than any one verification engineer can possibly learn and use on a day-to-day basis.
Why this diversity of solutions? As I first observed in Electronic Engineering Times nearly a decade ago, there is no silver bullet for verification. The problem is both so broad and so deep that no single tool or technology will ever satisfy the need. It takes a mix of solutions, guided by methodologies, to have any chance of first-silicon success. Low-power verification is an area where this is especially true, and unfortunately there is no silver bullet to be found here either.
The topic for this post was prompted by yesterday’s DVClub Europe, a quarterly, mostly virtual event organized by our good friends at TVS. Breker has been a proud sponsor of this series for several years and we generally present a talk every few meetings. As you might have guessed, yesterday’s topic was power verification. Our CEO Adnan Hamid spoke, as did representatives from Mentor Graphics, Synopsys, and ARM. I found the diversity of topics covered in an hour and a half intriguing.
Today’s system-on-chip (SoC) designs typically have anywhere from a few to a hundred different power domains, portions of the chip that can be switched on or off. However, with n power domains it is rarely the case that all 2^n possible combinations are legal. There is typically a much smaller set of allowed power states, often captured in a power state table. That doesn’t sound too complicated, but in fact a whole range of verification techniques is required to ensure that everything works.
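To make the idea concrete, here is a minimal sketch of a power state table as a lookup of legal domain combinations. The domain names and states are invented for illustration and are not from any real design or power format:

```python
# Hypothetical power state table: each entry is one legal combination
# of powered-on domains. Domain names are illustrative only.
LEGAL_STATES = {
    ("cpu", "gpu", "modem"),   # everything on
    ("cpu", "modem"),          # GPU powered off
    ("cpu",),                  # standby: only the CPU domain on
    (),                        # full chip off
}

# Normalize entries once so membership checks are order-independent.
_NORMALIZED = {tuple(sorted(s)) for s in LEGAL_STATES}

def is_legal(active_domains):
    """Return True if this set of active domains appears in the table."""
    return tuple(sorted(active_domains)) in _NORMALIZED

print(is_legal({"modem", "cpu"}))  # True: matches ("cpu", "modem")
print(is_legal({"gpu"}))           # False: GPU alone is not a legal state
```

With four entries instead of 2^3 = 8, a checker can flag any observed domain combination outside the table, which is the essence of what the verification flows below must enforce.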
For a start, there’s not even a universally accepted way to describe the low-power features of the SoC. The “war” between the Common Power Format (CPF) and the Unified Power Format (UPF) has been well documented. The IEEE attempted to satisfy all parties with its 1801 Standard, but there are still some reports of spotty adoption by both users and vendors. Even assuming full convergence on this standard, there are many verification steps to be extracted from a power format description.
At the circuit level, isolation cells are required on signals running between power domains that are active and those that are inactive. Level translators may also be needed if the different domains run at different voltages. These circuit elements are inserted automatically based on the power description, and static netlist analysis tools can check that all required elements are in their proper locations. This aspect of low-power verification is well accepted and has been in place for years.
A second aspect is ensuring that only legal power states can be enabled in the SoC. The legal state transitions can be captured succinctly as a state machine, and this can be automatically translated into a set of assertions or properties. A formal analysis tool can then verify that the power state control logic in the SoC fully obeys the rules for legal power states and legal transitions between these states. This is critically important for chips that will “melt” if too many domains are turned on at once.
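The same state-machine idea can be sketched in a few lines: capture the legal transitions as a table, then check that every consecutive pair of states in an observed trace obeys it. A formal tool proves this exhaustively over the control logic; this toy checker (with invented state names) just shows the rule being applied to one trace:

```python
# Illustrative state machine of legal power-state transitions.
# State names are invented for this sketch.
LEGAL_TRANSITIONS = {
    "OFF":     {"STANDBY"},
    "STANDBY": {"OFF", "ACTIVE"},
    "ACTIVE":  {"STANDBY", "BOOST"},
    "BOOST":   {"ACTIVE"},          # must step down before standby
}

def check_trace(trace):
    """Check each consecutive pair of states against the transition table.

    Returns (True, None) for a legal trace, or (False, bad_pair) for the
    first illegal transition found -- the property an assertion would flag.
    """
    for current, nxt in zip(trace, trace[1:]):
        if nxt not in LEGAL_TRANSITIONS[current]:
            return False, (current, nxt)
    return True, None

print(check_trace(["OFF", "STANDBY", "ACTIVE", "BOOST", "ACTIVE"]))
# (True, None)
print(check_trace(["OFF", "STANDBY", "ACTIVE", "BOOST", "STANDBY"]))
# (False, ('BOOST', 'STANDBY'))
```

In practice the table would be translated into assertions or properties rather than a Python loop, but the legality check is the same.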
If all the power rules are being followed, the next question is whether the design will continue to operate properly as power domains turn on and off. As Adnan’s talk showed, graph-based scenario models provide an excellent method for this verification. Once the functional scenarios are working, the graph-based scenario model can be overlaid with another graph that exercises all the legal power state transitions. I will schedule a future blog post going into this topic in much more detail.
This approach can check, for example, that a scenario not requiring the GPU will run correctly if the GPU is powered off. If power domains are controlled by software, then one could imagine a test requiring the GPU that starts with the unit off and then turns it on just before it is needed. If power domains are controlled by hardware, a test might start with the GPU off and complete successfully only if the unit was turned on in time by a hardware trigger. Graphs enable these possibilities and more.
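The “turn it on just before it is needed” idea can be illustrated with a small sketch: annotate each step of a functional scenario with the domains it requires, then have the generator insert power-on steps just in time. All step and domain names here are hypothetical, and this is only the flavor of the technique, not how Trek actually builds tests:

```python
# Hedged sketch of overlaying power control on a functional scenario.
# Each scenario step carries the set of domains it needs; the expansion
# inserts a power-on action immediately before the first step that
# requires a currently-off domain. Names are invented for illustration.
def schedule(scenario, initially_on):
    """Expand a scenario into a test, powering domains on just in time."""
    powered = set(initially_on)
    test = []
    for step, needed in scenario:
        for domain in sorted(needed - powered):
            test.append(f"power_on {domain}")
            powered.add(domain)
        test.append(step)
    return test

scenario = [("decode_frame", {"cpu"}),
            ("render_frame", {"cpu", "gpu"})]
print(schedule(scenario, initially_on={"cpu"}))
# ['decode_frame', 'power_on gpu', 'render_frame']
```

The hardware-triggered variant from the paragraph above would instead omit the explicit power-on step and check that the trigger fired before `render_frame` ran.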
Yet another dimension to low-power verification is estimating SoC power consumption as early as possible in the verification process. These days, that usually means booting the production software (operating system plus applications) on the design in an emulation system and measuring power. The multi-threaded test cases generated by Breker’s Trek products exercise the design very well and can be used to gauge performance in simulation or emulation before production software is ready.
Clearly, low-power verification is incredibly complex, and once again there is no silver bullet. I’ve touched on some, but not all, of the aspects in this post. It is a topic we will revisit in future posts. Thank you for reading, and as always your comments are most welcome.
The truth is out there … sometimes it’s in a blog.
Tags: 1801, Accellera, ARM, Breker, Common Power Format, CPF, DV, EDA, emulation, formal analysis, functional verification, graph, graph-based, mentor, portable stimulus, scenario model, simulation, SoC verification, standards, Synopsys, test generation, TrekSoC, TrekSoC-Si, Unified Power Format, UPF, use cases