EDACafe Weekly Review May 31st, 2016

Of course, anyone who reads my blog posts on EDACafe knows I have a huge bias toward hardware emulation. In fact, my blog is called Hardware Emulation Journal! I’ve been a part of this Design Automation market segment since 1995 and continue to believe it is the most versatile of all verification tools.

If you happen to be at DAC and want to learn more about hardware emulation and its growing use models, stop by the Mentor Graphics Booth (#949) Monday, June 6, at 4pm. I’m moderating an hour-long panel of exceedingly qualified verification experts who will help me answer the intriguing question: What’s up with all those new use models on an emulator?

Sitting on the podium with me will be the inimitable Alex Starr from AMD, Guy Hutchinson of Cavium Networks, and Rick Leatherman of Imagination Technologies. Guy and I had a conversation earlier this week and I can attest to his enthusiasm and knowledge about hardware emulation.

The four of us agree that hardware emulation is moving into the mainstream and away from the dusty back corners of an engineering department. Its reputation has been rehabilitated as well. That’s because it’s accessible to all types of verification engineers and, fortunately for them, they do not need to be experts in the nuances of emulation. This means that emulation can be used to solve problems that previously were almost too tough to solve, and many verification tasks can be completed more quickly and thoroughly. One easily recognizable example is hardware/software co-verification. Hardware emulation is the only verification tool able to track a bug across the embedded software and the underlying hardware, which by all accounts is a big challenge these days.

Another point of agreement is emulation’s horsepower, flexibility and versatility, which suggests we’re moving into the fourth era of emulation where applications rule. In the “apps” era, the emulator becomes the “verification hub” of a verification strategy because it is able to address the end-to-end verification needs of today’s complex designs. Applications extend the use of emulators beyond RTL verification, making it possible to develop new scenarios to target an increasing number of vertical markets, from networking and graphics to automotive and beyond.

Given hardware emulation’s emerging popularity and growing uses, each panelist will be asked to describe the types and sizes of designs he’s asked to debug, along with their applications and the basic verification flow. DAC attendees have noted their interest in hardware emulation’s various deployment modes, so we’ll look at several, including traditional ICE, TBA/TBX and virtual. I’ll ask each panelist to describe which modes they’ve used and their varying degrees of success. We’ll attempt to identify the capabilities lacking in each.

Hardware emulation is being used for some new and, perhaps, unlikely tasks, such as low-power verification, power estimation and design for test. I intend to ask each panelist whether he’s familiar with these new modes and his perspective on the effectiveness of hardware emulation to debug chips with these characteristics.

As you might expect, our goal is to make this panel lively and thought provoking. I predict the panelists will confirm through experience and expertise that hardware emulation is much more usable than most commercial verification tools. We will leave room for questions, so please come armed with them; we’ll try to answer each one, or perhaps you’ll stump us.

Please join us. I look forward to seeing you in Austin.

Life is Short: Carpe Eruditio at DAC 2016
May 26, 2016  by Peggy Aycinena


There are clearly a lot of collateral distractions at the Design Automation Conference: networking, social hours, parties, tchotchkes. But the real fun at DAC comes from carving time out to attend technical sessions. This year in Austin, the offerings are particularly rich.

On Sunday, June 5th, my two favorites are: The Workshop on Design Automation for Cyber-Physical Systems, and The Workshop on Computing in Heterogeneous, Autonomous ‘N’ Goal-Oriented Environments. Both of these all-day events feature experts from academia and industry, most speaking for at least 30 minutes. The topics will be very technical and the schedules allow for detailed presentations. Of course, this doesn’t mean the other workshops on Sunday don’t have great merit, but the two I have identified look to be particularly rich opportunities for learning.

Sunday evening, for the first time, there will also be a 2-hour panel focused on Career Perspectives in EDA, a discussion sponsored by CEDA. Although many will be obliged to attend networking dinners on Sunday evening, or will still be busy setting up booths for Monday morning’s Exhibit Hall opening, attending this Career Panel seems an opportunity not to be missed, particularly as it will be moderated by the supremely knowledgeable Bill Joyner from SRC. Admittedly, this is not a technical session, but the implications for the industry are profound. [File under the heading: ‘Concern for an Aging Industry’]


Ten years ago, Rich Weber and Jamsheed Agahi surveyed an industry they knew well – they each had 10+ years’ involvement in the technology – and found no one was providing hardware/software interface solutions. So in February 2006, they founded a company to “provide good solutions to the industry” and got busy coding. They had their software up and running by DAC, held that year in San Francisco, were featured in the July 2006 issue of EETimes, and were working with their first customers by the end of the year.

Those early successes were an indication of the credibility of Semifore Inc. and a reflection of the singular vision of founders who knew each other well; they had worked together at various companies prior to 2006: Data General, Silicon Graphics, StratumOne and Cisco Systems. Starting Semifore together was the logical next step in their collaborations. Now ten years on, both founders are still with the company.

The Power and Simplicity of Path Constraints
May 25, 2016  by Tom Anderson, VP of Marketing

Last week on The Breker Trekker, we talked about path constraints and how they differ from other kinds of constraints commonly used in SoC design and verification. Our whole approach to verification is based on graph-based scenario models, and constraints on the paths through the graph are a natural way to control how our Trek family of products automatically generates test cases. It’s easy to eliminate some paths, focus on others, or bias the randomization of selections. We believe that path constraints should be a part of any portable stimulus solution that meets the forthcoming Accellera standard.

We have heard some people in the industry argue that path constraints are not needed, and that value constraints would suffice. While we agree that value constraints are a familiar concept from the UVM and other constrained-random approaches, we do not feel that they are the best way to control the scenarios generated from a portable stimulus model. In today’s post we will expand on the example from last week and show how path constraints can handle a more complex design better than value constraints.
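The distinction is easy to see in a toy model. The following Python sketch is purely illustrative, not Breker’s actual Trek API: it represents a hypothetical scenario graph as an adjacency list, enumerates every action path, and applies a path constraint that prunes entire sequences of actions, something a value constraint on an individual field cannot express.

```python
# Hypothetical scenario graph (illustration only): each key is an action,
# each value lists the possible next actions; "end" terminates a scenario.
GRAPH = {
    "start": ["dma_in", "cpu_load"],
    "dma_in": ["process"],
    "cpu_load": ["process"],
    "process": ["dma_out", "cpu_store"],
    "dma_out": ["end"],
    "cpu_store": ["end"],
}

def all_paths(graph, node="start", path=None):
    """Enumerate every action sequence from start to end."""
    path = (path or []) + [node]
    if node == "end":
        yield path
        return
    for nxt in graph[node]:
        yield from all_paths(graph, nxt, path)

def path_constraint(path):
    """Example path constraint: data brought in by DMA must also leave
    by DMA, so scenarios mixing dma_in with cpu_store are eliminated."""
    return not ("dma_in" in path and "cpu_store" in path)

legal = [p for p in all_paths(GRAPH) if path_constraint(p)]
for p in legal:
    print(" -> ".join(p))
```

The constraint here relates actions at opposite ends of the graph, which is the point: a value constraint scoped to one node could not rule out a whole class of paths in a single declaration.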



Copyright © 2017, Internet Business Systems, Inc. — 595 Millich Dr., Suite 216 Campbell, CA 95008 — +1 (408)-337-6870 — All rights reserved.