Verification Consultant & Investor at Oregon Angel Fund
Five Questions about Emulation No One’s Afraid to Ask
March 31st, 2016 by Lauro Rizzatti
Attending a conference like DVCon offers many benefits, including the opportunity for loads of hallway discussions. I was stopped continually during DVCon by friends, colleagues and acquaintances, all wanting to talk about emulation, which convinced me that it's the hottest verification tool and topic today.
Here are five of the questions I was asked, along with the answers.
Q1. For many years, emulation has been an exotic and rather expensive tool used in very limited market segments, such as the largest processor and graphics designs. Today, it is used across the board by virtually all semiconductor industry segments. What facilitated this broad adoption?
A1. Significant improvements in emulation hardware and in the supporting software. Perhaps even more significant is a dramatic drop in the cost of ownership (COO). Just consider that on a dollar-per-gate basis, the cost dropped from $5 in the early 90s to less than half a penny now. Add to that radically enhanced system reliability, dramatic improvements in the usage model, and multi-user and remote access capabilities, and the COO is a small fraction of what it was a decade ago.
Q2. Can you elaborate a bit on the usage model?
A2. Emulation was conceived three decades ago to allow testing of a pre-silicon design with real data and real traffic. This deployment mode was called ICE, for in-circuit emulation. It consisted of connecting the design-under-test (DUT) mapped inside the emulator to the physical target system where the taped-out design would ultimately reside. In the middle of the 90s, all emulation players added the ability to drive the DUT via a software-based testbench, although with a rather limited acceleration factor. At the end of the 90s, IKOS Systems (now Mentor Graphics) pioneered a new deployment method that split the testbench into two parts. The front end, written at a higher level of abstraction than RTL and executed on the workstation, generated traffic in the form of "transactions." The back end, processed in the emulator, converted those transactions into bit- or signal-level communication to and from the DUT. The approach provided a speedup of two to three orders of magnitude over emulation driven by RTL testbenches. It was quickly adopted by all emulation players and opened the door to large-scale adoption.
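The split described above can be sketched in a few lines of Python. This is purely a conceptual illustration of the idea, not any vendor's API: the function names and signal fields are mine. The point is that the host and emulator exchange one message per transaction, while the transactor expands each transaction into many clock cycles of pin-level activity inside the emulator, which is where the speedup comes from.

```python
def front_end():
    """Workstation side: an untimed testbench that reasons in whole
    transactions (illustrative bus writes), not clock cycles."""
    for addr, data in [(0x10, 0xAB), (0x14, 0xCD)]:
        yield {"op": "write", "addr": addr, "data": data}

def back_end(txn):
    """Emulator side ("transactor"): converts one transaction into the
    cycle-by-cycle, bit-level bus activity driven into the DUT's pins."""
    return [
        {"valid": 1, "we": 1, "addr": txn["addr"]},   # address phase
        {"valid": 1, "we": 1, "data": txn["data"]},   # data phase
        {"valid": 0, "we": 0},                        # bus idle
    ]

# One host<->emulator exchange per transaction; each dict below stands
# in for one clock of signal-level pin wiggling inside the emulator.
total_cycles = sum(len(back_end(txn)) for txn in front_end())
print(total_cycles)  # 2 transactions expand to 6 bus cycles
```

A cycle-driven RTL testbench would instead cross the host/emulator boundary on every clock, which is why the transaction split buys orders of magnitude.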
Q3. Today, as important as it is, functional design verification is only one of several verification areas. Can emulation address all the other areas involved in SoC design verification?
A3. With the exception of timing analysis (the timing behavior of the emulated DUT is far off the actual silicon timing, due to the long propagation delays across the vastly larger footprint of the DUT), virtually all SoC verification aspects can be, and are, tackled by a modern emulation platform. A leading-edge emulation system now supports SVAs and functional coverage, low-power domain verification, and tracking of DUT switching activity for power estimation. It can validate drivers and operating systems, an impossible task for an HDL simulator, and execute application software, albeit at a slower speed than real silicon or even an FPGA or virtual prototype. None of the latter, however, can trace bugs that surface only when the embedded software executes on the DUT hardware inside the emulator.
Q4. Over the years, emulation has earned the reputation of being a very costly proposition. Is it still the case today?
A4. It has been proven over and over again that late market entry in a highly competitive market such as the semiconductor industry leads to revenue loss that may kill a project. Consider that a one-month delay on a 24-month product lifecycle will cut revenues by about 12%. For a 12-month product lifecycle, the loss in revenues amounts to about 25%. The risk of missing a project schedule can be catastrophic, so de-risking a project is of the utmost importance. Emulation is the best verification tool to thoroughly test the functionality of the hardware of an SoC design. It can verify the integration of the hardware with the embedded software and validate the application software, and ultimately the entire SoC design, ahead of silicon availability. Reducing the number of respins and keeping a project on schedule more than pays back what otherwise seems a costly proposition.
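The 12% and 25% figures above are consistent with a classic triangular time-to-market model, in which sales ramp linearly to a peak at mid-life and then decline linearly to zero. The model (my reconstruction, not spelled out in the post) is easy to compute:

```python
def revenue_loss_fraction(delay_months, lifecycle_months):
    """Fraction of total lifecycle revenue lost to a market-entry delay,
    assuming a triangular revenue curve: sales rise linearly to a peak
    at mid-life (month W), then fall linearly to zero at month 2W."""
    w = lifecycle_months / 2.0   # months from launch to peak sales
    d = delay_months
    # On-time revenue is the area of the full triangle (W * peak).
    # A delay of d months forfeits the fraction d*(3W - d) / (2*W^2).
    return d * (3 * w - d) / (2 * w * w)

print(f"{revenue_loss_fraction(1, 24):.1%}")  # 12.2% on a 24-month lifecycle
print(f"{revenue_loss_fraction(1, 12):.1%}")  # 23.6% on a 12-month lifecycle
```

A one-month slip costs roughly twice as much, in relative terms, on the shorter lifecycle, which matches the post's "about 12%" and "about 25%" figures.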
Q5. New and hot markets today include automotive and IoT. Also, safety and security are the buzzwords that every executive in the high-tech industry is using. How does emulation help in these fields?
A5. Emulation powered by a robust set of apps is the most versatile verification engine available to a verification engineering team. Since more and more embedded software now drives automotive and IoT designs, it becomes vital to use the versatility of emulation to address all the verification needs of such designs. It is only a matter of time before we see new application software targeting the verification of safety and security issues in automotive designs.
Questions about hardware emulation? Ask them in the comments section below and I’ll answer them in a future blog post.