Hardware Emulation Journal
Verification Consultant & Investor at Oregon Angel Fund
May 31st, 2016 by Lauro Rizzatti
Of course, anyone who reads my blog posts on EDACafe knows I have a huge bias toward hardware emulation –– in fact, my blog is called Hardware Emulation Journal! I’ve been a part of this design automation market segment since 1995 and continue to believe it is the most versatile of all verification tools.
If you happen to be at DAC and want to learn more about hardware emulation and its growing use models, stop by the Mentor Graphics Booth (#949) Monday, June 6, at 4 p.m. I’m moderating an hour-long panel of exceedingly qualified verification experts who will help me answer the intriguing question: What’s up with all those new use models on an emulator?
Sitting on the podium with me will be the inimitable Alex Starr from AMD, Guy Hutchinson of Cavium Networks and Rick Leatherman at Imagination Technologies. Guy and I had a conversation earlier this week and I can attest to his enthusiasm and knowledge about hardware emulation.
The four of us agree that hardware emulation is moving into the mainstream and away from the dusty back corners of an engineering department. Its reputation has been rehabilitated as well. That’s because it’s accessible to all types of verification engineers and, fortunately for them, they do not need to be experts in the nuances of emulation. This means that emulation can be used to solve problems that previously were almost too tough to solve, and many verification tasks can be completed more quickly and thoroughly. One easily recognizable example is hardware/software co-verification. Hardware emulation is the only verification tool able to track a bug across the embedded software and the underlying hardware, which, by all accounts, is a big challenge these days.
Another point of agreement is emulation’s horsepower, flexibility and versatility, which suggests we’re moving into the fourth era of emulation where applications rule. In the “apps” era, the emulator becomes the “verification hub” of a verification strategy because it is able to address the end-to-end verification needs of today’s complex designs. Applications extend the use of emulators beyond RTL verification, making it possible to develop new scenarios to target an increasing number of vertical markets, from networking and graphics to automotive and beyond.
Given hardware emulation’s emerging popularity and growing uses, each panelist will be asked to describe the types and sizes of designs he’s asked to debug, along with their applications and the basic verification flow. DAC attendees have noted their interest in hardware emulation’s various deployment modes, so we’ll look at several, including traditional ICE, TBA/TBX and virtual. I’ll ask each panelist to describe which modes he’s used and with what degree of success. We’ll attempt to identify the capabilities lacking in each.
Hardware emulation is being used for some new and, perhaps, unlikely tasks, such as low-power verification, power estimation and design for test. I intend to ask each panelist whether he’s familiar with these new modes and his perspective on the effectiveness of hardware emulation to debug chips with these characteristics.
As you might expect, our goal is to make this panel lively and thought provoking. I predict the panelists will confirm through experience and expertise that hardware emulation is far more usable than most commercial verification tools. We will leave room for questions, so please come armed with them; we’ll try to answer each one, or you can try to stump us.
Please join us. I look forward to seeing you in Austin.
May 20th, 2016 by Lauro Rizzatti
Why, the vendor sends in a village of AEs, R&D engineers and PhDs, of course, to work onsite with the lead designer and his or her designs. Yes, it’s common for a design automation company to send AEs into an account as “super” users of a specific software tool, such as formal verification, because it’s a specialized technology that not everyone has mastered. And yes, old-style hardware emulators came with AEs because early generations of the tool were difficult to deploy. They required expertise and plenty of manual effort to get them operational, and hence the refrain “time to emulation.”
Some more optimistic project teams assume this gesture is a sign of commitment. Actually, it’s not. Not even close. It’s a sign of the tool’s weakness because it doesn’t work.
May 3rd, 2016 by Lauro Rizzatti
Many proponents and users of hardware emulation continue to enthuse about its benefits, expanding use models and growing popularity among hardware design and verification engineering groups. We believe it is the foundation of almost all verification strategies today, not replacing simulation, but augmenting it.
The topic of hardware emulation’s popularity is starting to show up in technical conferences and other industry events, reinforcing what we’re seeing. Wall Street’s paying attention as well.
March 31st, 2016 by Lauro Rizzatti
Attending a conference like DVCon offers many benefits, including the opportunity for loads of hallway discussions. I was stopped continuously during DVCon by friends, colleagues and acquaintances all wanting to talk about emulation, which convinced me that it’s the hottest verification tool and topic today.
Here are five of the questions I was asked, along with the answers.
Q1. For many years, emulation has been an exotic and rather expensive tool used in very limited market segments, such as the largest processor and graphics designs. Today, it is used across the board by virtually all semiconductor industry segments. What facilitated this broad adoption?
A1. Significant improvements in emulation hardware and in the supporting software. Perhaps even more significant is a dramatic drop in the cost of ownership (COO). Just consider that on a dollar-per-gate basis, the cost dropped from $5 in the early ’90s to less than half a penny now. Add to that the radical enhancement in system reliability, the dramatic improvements in the usage model, and the multi-user and remote-access capabilities, and the COO is a small fraction of what it was a decade ago.
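To put that dollar-per-gate drop in perspective, here is a minimal back-of-the-envelope sketch in Python. The per-gate figures come from the answer above; the 100-million-gate design size is a hypothetical example chosen purely for illustration.

```python
# Rough cost-of-ownership comparison using the dollar-per-gate figures
# cited above. The design size is a hypothetical example.
cost_per_gate_1990s = 5.00    # ~$5 per gate in the early '90s
cost_per_gate_today = 0.005   # less than half a penny per gate now

design_gates = 100_000_000    # hypothetical 100M-gate SoC

cost_then = design_gates * cost_per_gate_1990s
cost_now = design_gates * cost_per_gate_today

print(f"Early-'90s emulator cost: ${cost_then:,.0f}")
print(f"Today's emulator cost:    ${cost_now:,.0f}")
print(f"Reduction factor: {cost_then / cost_now:,.0f}x")
```

Even before factoring in reliability, multi-user access and usage-model gains, the per-gate pricing alone accounts for roughly a thousand-fold reduction.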
March 21st, 2016 by Lauro Rizzatti
Attended by approximately 1,200 visitors (about 700 paying customers and 400+ on the free day) and 30 exhibitors, the conference offered a program of 12 tutorials, 13 technical papers and 45 posters. Topics covered various aspects of design and design verification, with particular emphasis on hardware emulation, the Universal Verification Methodology (UVM), portable stimulus, low-power design, and formal verification.
The event offered a keynote, two panels, one roundtable and three lunches, each sponsored by one of the three main EDA giants. A conversation between Jim Hogan and Ajoy Bose titled “Crossing the Chasm: From Technology to Valuable Enterprise,” hosted by the EDA Consortium, drew on Dr. Bose’s career building several businesses in the high-tech industry.
February 29th, 2016 by Lauro Rizzatti
Note: © Extension Media, 2016. This is the author’s version of the work. It is posted here by permission of Extension Media for your personal use. Not for redistribution. The definitive version was published February 1, 2016, in Embedded Systems Design, http://www.extensionmedia.com
Booting an OS such as Linux gets the designer to the starting line, but how does the real work commence for complex multicore designs destined for gaming, digital signage, and more?
When software developers hear the word emulation, they often think of software emulators, not hardware emulation. That’s changing, though, as the versatile hardware emulator gains a more widespread reputation for being able to ensure that the embedded system software works correctly with the underlying hardware. It’s happening as the emulator becomes a shared resource between hardware and software teams to accelerate hardware/software integration ahead of first silicon. As it should, especially when software developers outnumber hardware engineers on a chip project.
February 10th, 2016 by Lauro Rizzatti
If I were to study the astrological charts for the week of February 29, I’d see many of the planets aligning around DVCon, the center of our verification universe. As always, it will be a special week chock full of topics related to verification. Tutorials, paper sessions, panels and exhibits should keep all attendees energized as they learn about advances in technologies, methodologies and deployment modes to make their professional lives easier.
Our verification tool universe keeps expanding and will be on full display at DVCon. Lately, I’ve been hearing reports that “apps” may be coming to hardware emulation … and why not? What’s good enough for software should be good for hardware as well, especially when it will save time, improve performance and help a verification engineer avoid risk.
As we’ve learned, in emulation, hardware alone is no longer the differentiating factor; hence the introduction of apps. Just as apps transformed the mobile phone into a smartphone, so would hardware with an operating system running specific applications transform a hardware emulation platform into a verification hub and drive new use models, greatly reducing risk and improving performance.
January 19th, 2016 by Lauro Rizzatti
Herein is my 2016 prediction: The Electronics Industry will see more hardware emulation experts and specialists this year than ever before due in large measure to the widespread proliferation of hardware emulators. These experts are both hardware designers and software developers –– giving hardware emulation the distinction of being the only verification tool able to make this claim –– as co-verification becomes a way of life for SoC debug. They become proficient emulation users as the tool tracks a bug across software and into the hardware design and the reverse, from the hardware design to the embedded software.
Another reason why we’ll see more experts can be attributed to hardware emulators now being used as datacenter resources, making them more accessible to more users. These efficient tools can be accessed remotely, in transaction-based or standalone emulation mode, anytime and anywhere.
December 17th, 2015 by Lauro Rizzatti
One day recently, I was considering the varied use models for hardware emulation. It brought back a long-forgotten memory of an evening bowling in New England, where I lived for several years in the ‘80s.
New England has a quaint, little-known (outside of the region) type of bowling called Candlepin. While the play is the same as the more popular form of bowling, the pins are long and narrow, and look a bit like candles. The 10 candlepins are set up in an inverted triangle –– one pin in the first row, two in the second, three in the third, four in the fourth –– and look vaguely as if they’re in “lanes.” This could be a diagram for the verification tool space with most of the available verification tools and techniques in separate and distinct lanes, each with its own function.
Not so with hardware emulation, because it’s able to cross lanes and boundaries and is multi-functional. The best example is hardware/software co-verification. Emulation can track hardware bugs that show up as hardware glitches or software failures, and detect software bugs caused by software breakdowns or hardware problems. Its final step is verifying that the hardware has been properly designed to run the software.
November 18th, 2015 by Lauro Rizzatti
The 2015 DVCon Europe was held in Munich at the Holiday Inn City Centre Hotel November 11-12. November in Munich brought back long-ago memories of a snow-covered city with freezing temperatures from when I lived there in the ’80s. Not this November. Warm, sunny days crowned the conference and invited attendees to take a 20-minute stroll across the Isar River to Marienplatz, the heart of the city.
2015 was the second year of the conference. To the delight of Accellera, the event’s sponsor, it recorded an increase in attendees (from 220 to 270) and in exhibitors (25). The success came despite a Lufthansa flight attendants’ strike that prevented about 50 people from attending, and despite the simultaneous running of the popular Productronica conference in Munich and ARM TechCon in the U.S.