EDACafe Editorial

Breker: Anderson’s verification tutorial rocks DesignCon
February 4th, 2013, by Peggy Aycinena

Peggy Aycinena is a contributing editor for EDACafe.com.
Breker Verification Systems VP Tom Anderson presented a concise tutorial on low-power SOC verification at DesignCon on January 30th. He began by laying out the challenges of low-power design, with an eye to the verification problems associated with various strategies:

The need for low-power design is ubiquitous, with today’s plethora of consumer devices being battery-powered. ‘Big iron’ machines in modern data centers are also driving the need for low-power chips. As well, governments worldwide – especially in Europe – are passing ‘green’ laws; if you’re building a ‘big iron’ class of machine, you may be required by law to meet specified power limits.

Various techniques are emerging to meet these needs. Circuit-level design strategies include special transistor and cell design for non-critical paths. Multiple voltage thresholds are also an option, yielding different performance levels and power consumption at different points on-chip; designers can make a one-time trade-off between performance level and path options. These techniques have little or no impact on functional verification.

Other strategies, however, do. Substrate biasing can help reduce leakage current, an increasing source of power draw. Designers can also trade increased routing congestion and increased area against reduced power. Dynamic voltage and frequency scaling [DVFS] is another choice, with system software enabling fine-grained control of power versus functionality. Designers using DVFS should simulate with combinations of allowed frequencies, and at the full-SOC level, where simulators unfortunately tend to be very slow.

Another available choice is to turn off the clocks to any unused logic, a strategy that can also be controlled by system software on the fly. Such clock gating saves dynamic power but not static leakage; fortunately, all memory and register states can be retained while the clock is off.
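As a rough illustration of why DVFS and clock gating save dynamic but not leakage power, the first-order CMOS relation P_dyn ≈ C·V²·f can be sketched in C. All names and constants here are hypothetical, for illustration only, and are not from Anderson’s tutorial:

```c
#include <assert.h>

/* Hypothetical first-order power model: dynamic power scales with
 * C * V^2 * f, while static leakage is unaffected by clock gating. */

typedef struct {
    double cap_eff;   /* effective switched capacitance, farads */
    double leakage;   /* static leakage power, watts */
    int    clock_on;  /* 1 = clock running, 0 = clock gated */
} block_power_t;

/* Dynamic power: C * V^2 * f while the clock runs, zero when gated. */
double dynamic_power(const block_power_t *b, double volts, double hertz)
{
    if (!b->clock_on)
        return 0.0;
    return b->cap_eff * volts * volts * hertz;
}

/* Total power: clock gating removes the dynamic term, but leakage
 * remains -- the tutorial's point about gating versus power shutoff. */
double total_power(const block_power_t *b, double volts, double hertz)
{
    return dynamic_power(b, volts, hertz) + b->leakage;
}
```

Lowering V helps quadratically and lowering f linearly, which is why DVFS trades performance for power so effectively; gating the clock drives only the dynamic term to zero.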
Under such circumstances, a full SOC-level simulation is also recommended, because a clock turned off may be one that somebody else was actually depending on.

Finally, one of the most widely used low-power design techniques is power shutoff (PSO), a choice that can have the biggest impact on functional verification. Turning off the power completely to unused logic regions, called ‘power domains’, saves both dynamic power and static leakage. The outputs of such un-powered domains must be held in isolation from the rest of the chip, and often there is critical state that must be saved and restored.

There are four steps to verification if PSO is used, each with its own challenges. First is to use static RTL analysis to verify the low-power structures, such as isolation cells where needed. Second is to use formal analysis to verify operation of the power control module (PCM), which controls the state of all the power domains. Third is to ensure that simulations still pass with the appropriate power domains turned off – critical because low-power bugs can be very hard to work around in silicon. Fourth and finally, a newer functional verification choice is to generate and run C testcases to verify the system-level power behavior.

It’s possible to illustrate PSO functional verification with a simple three-block design, each block with its own power domain. The PCM sends signals to individual blocks to isolate, save state, power down, power up, restore state, and reconnect. The same design can also illustrate static RTL analysis and the use of a “super-lint” tool that examines the structure of the RTL, looking for more than just poor coding practices or coding mistakes.

Recent efforts to standardize power formats are a result of such considerations, and have produced well-known disagreements between Si2/Cadence’s Common Power Format (CPF) and Accellera/Mentor Graphics/Synopsys’ Unified Power Format (UPF).
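The isolate / save / power-down / power-up / restore sequence described above can be sketched as a small C model that refuses out-of-order requests. The type and function names are illustrative inventions, not Breker’s or any real PCM’s API:

```c
#include <assert.h>

/* Hypothetical model of the PSO handshake: the PCM must isolate a
 * domain and save its state before removing power, and must restore
 * state when power returns. Each step checks the required ordering. */

typedef enum { DOM_ON, DOM_ISOLATED, DOM_SAVED, DOM_OFF } dom_state_t;

typedef struct {
    dom_state_t state;
    int saved_regs;   /* stand-in for retained register state */
    int live_regs;    /* stand-in for the domain's live registers */
} domain_t;

/* Each returns 0 on success, -1 if the sequence would be violated. */
int pcm_isolate(domain_t *d)
{
    if (d->state != DOM_ON) return -1;
    d->state = DOM_ISOLATED;        /* clamp outputs to the chip */
    return 0;
}

int pcm_save(domain_t *d)
{
    if (d->state != DOM_ISOLATED) return -1;
    d->saved_regs = d->live_regs;   /* retain critical state */
    d->state = DOM_SAVED;
    return 0;
}

int pcm_power_off(domain_t *d)
{
    if (d->state != DOM_SAVED) return -1;
    d->live_regs = 0;               /* contents lost when power drops */
    d->state = DOM_OFF;
    return 0;
}

int pcm_power_on_restore(domain_t *d)
{
    if (d->state != DOM_OFF) return -1;
    d->live_regs = d->saved_regs;   /* restore state, then reconnect */
    d->state = DOM_ON;
    return 0;
}
```

A testbench driving this model can check exactly the bugs the tutorial warns about, e.g. cutting power before isolation, or reconnecting without restoring state.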
Both formats define the power domains on-chip, how they are controlled, and which combinations of domain states should be allowed in a functionally verified design. The reality of these disagreements has forced many EDA vendors to fashion their tools to read one or both formats. Meanwhile, others are working to deliver the IEEE 1801 standard, which intends to resolve the present format confusion.

In the midst of all these efforts, it is still the case that designers want to prove their design rules, expressed in the form of assertions, through formal analysis of the PCM. Happily, it turns out that the PCM is essentially a finite state machine (FSM), and formal analysis is very effective at verifying FSMs. Such rules might include: isolation must be asserted during power down. Static analysis, power-aware simulation, and formal analysis therefore all require that UPF, CPF, or IEEE 1801 descriptions be read in. At the same time, the PCM must turn off power domains only when they are truly not needed for proper functionality, and simulating with the appropriate domains turned off helps to verify the system.

It’s true that today’s simulators are power aware, allowing users to model power on/off, save/restore, and isolation – all important because they provide a way to be sure you’re not relying on something that’s been powered down. Setting up the power states for simulation, however, is usually a manual process. And, as in other scenarios, the PCM is usually controlled by system software.

Looking at a simplified model of a digital camera permits a hypothetical application of these techniques. The example illustrates testbench-based simulation as a challenge at the full-SOC level – slow speed is a problem, especially when low-power features are in use. It’s hard to simulate deep behaviors “from the outside in”, and there is no link between the testbench and the embedded processors. Neither UVM nor OVM has any notion of embedded processors.
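A rule like “isolation must be asserted during power down” can be stated as a cycle-by-cycle checker over the PCM’s control signals – a simulation-style stand-in for what a formal tool would prove exhaustively over every reachable state. The signal and function names here are hypothetical:

```c
#include <assert.h>

/* One sampled cycle of two hypothetical PCM control outputs. */
typedef struct {
    int iso;   /* 1 = isolation cells enabled for the domain */
    int pwr;   /* 1 = domain powered, 0 = shut off */
} pcm_cycle_t;

/* The rule: whenever power is off, isolation must be asserted.
 * Returns 1 if every cycle in the trace obeys the rule, else 0. */
int check_iso_during_pso(const pcm_cycle_t *trace, int n)
{
    for (int i = 0; i < n; i++)
        if (trace[i].pwr == 0 && trace[i].iso == 0)
            return 0;   /* power off without isolation: violation */
    return 1;
}
```

A simulation checker like this only catches violations on the traces it sees; the appeal of formal analysis on the PCM, as the tutorial notes, is that an FSM is small enough for the tool to prove the rule for all input sequences.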
Running production software in an emulator or FPGA prototype is not the full answer. You need to do this for hardware/software co-verification, but production software is available too late in the project schedule. Moreover, production software tends to be well behaved and does not verify SOCs against future software revisions. For instance, millions of people are writing apps for iPhones that may trigger a hardware bug. Therefore, you’ve got to go beyond production software and look for that ‘special something’ that will beat up the chip early enough in the process to verify future use cases. Some people manually write C tests to accomplish that, but only for a single processor with a single thread of code. Breker advocates automatic C testcase generation.

In summary, low-power design is part of virtually all SOC projects. Numerous low-power design techniques exist, many of which have little impact on functional verification. Four steps, as laid out here, are needed to verify PSO, a common low-power design technique. System-level power verification using scenario models is emerging to complement the other three established steps.

[For further detail, contact Tom Anderson at Breker: http://www.brekersystems.com/]
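A generated C testcase of the kind described might take roughly this shape: several threads, standing in for the SOC’s processors, each replay a pre-generated power scenario against the power controller at the same time. Everything below is an illustrative sketch – it models no real generated output and no real Breker API:

```c
#include <pthread.h>
#include <assert.h>

/* Hypothetical multi-threaded power-scenario testcase: three threads
 * concurrently request power transitions on three domains, stressing
 * the (modeled) PCM from several "processors" at once. */

#define NUM_DOMAINS 3

static pthread_mutex_t pcm_lock = PTHREAD_MUTEX_INITIALIZER;
static int domain_on[NUM_DOMAINS] = {1, 1, 1};
static int transitions;          /* total power transitions observed */

/* One scenario step: request a domain on or off via the "PCM". */
static void request_power(int dom, int on)
{
    pthread_mutex_lock(&pcm_lock);
    if (domain_on[dom] != on) {
        domain_on[dom] = on;
        transitions++;
    }
    pthread_mutex_unlock(&pcm_lock);
}

/* Each thread replays a pre-generated scenario for its own domain. */
static void *run_scenario(void *arg)
{
    int dom = *(int *)arg;
    request_power(dom, 0);       /* shut the domain off... */
    request_power(dom, 1);       /* ...and bring it back */
    return 0;
}

/* Launch one thread per domain; returns the transition count (6). */
int run_all_scenarios(void)
{
    pthread_t t[NUM_DOMAINS];
    int ids[NUM_DOMAINS] = {0, 1, 2};
    for (int i = 0; i < NUM_DOMAINS; i++)
        pthread_create(&t[i], 0, run_scenario, &ids[i]);
    for (int i = 0; i < NUM_DOMAINS; i++)
        pthread_join(t[i], 0);
    return transitions;
}
```

The point of generating rather than hand-writing such tests is the combinatorics: a tool can enumerate interleavings of domain transitions across processors that a single-threaded, hand-written test would never reach.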
“Tom Anderson has more than a dozen years of experience in EDA verification applications and marketing. He has served as Product Management Group Director for Advanced Verification Solutions at Cadence, Technical Marketing Director in the Verification Group at Synopsys, and Vice President of Applications Engineering at 0-In Design Automation. Before moving into EDA, he was Vice President of Engineering at IP pioneer Virtual Chips, following roles in ASIC design and management.”

Tags: Breker Verification Systems, DesignCon, Low-power SoC verification, Tom Anderson