Hardware Emulation Journal
Verification Consultant & Investor at Oregon Angel Fund
September 19th, 2016 by Lauro Rizzatti
Although I don’t want to repeat myself, my 2015 report included my assessment that “the traffic on Bangalore’s roads reminded me of a Circle of Hell from Dante Alighieri’s 14th-century poem, ‘Divine Comedy.’” It was like that again this year. As if this wasn’t enough, the timing of this year’s conference coincided with an unanticipated and unpleasant event.
A little background. Bangalore is the capital of the state of Karnataka, which borders the state of Tamil Nadu. The border is delimited by the Cauvery River, which provides crucial, life-sustaining water to both states via a dam. In good years, the dam holds enough water to fulfill the needs of both. In bad years, the scarcity of water causes grief, tension and confrontation. The summer of 2016 was really bad. I was told that the discord between the two states turned into an enormous political crisis that escalated to the attention of the Prime Minister of India.
September 6th, 2016 by Lauro Rizzatti
DVCon India could be considered the official start of the fall season for our industry. It kicks off Thursday, September 15, and runs through Friday, September 16, at The Leela Palace in Bangalore, an elegant hotel and a great place to host a content-rich technical event like this.
The two-day event, now in its third year, will offer a bit of technical everything for design and verification engineers and engineering managers, from keynotes and panels to tutorials and papers. I hope to see an increase in attendance over the roughly 650 who attended last year. Attendees will have the opportunity to take part in many informal technical discussions. It’s a great networking opportunity.
One not-to-be missed keynote, “Design Verification: Challenging Yesterday, Today and Tomorrow,” will be delivered by Mentor Graphics’ Wally Rhines. According to the abstract found on the DVCon India website, he will review the major phases of the verification evolution over the past several decades and focus on the challenges of newly emerging problems. I’m looking forward to his insights and expect to see some terrific visuals.
As I did last year, I will moderate a panel titled, “The Future Verification Flow,” the first day from 12:10 p.m. until 1 p.m. in the Grand Ballroom. Panelists will be Mike Bartley of Test and Verification Solutions (T&VS), Shankar Narayana Bhat who hails from Qualcomm’s Bangalore Design Centre and Ashish Kumar who will join us from the Broadcom India Design Centre.
We plan on a lively discussion as we review the challenges of the current verification flows and hash over whether emulation will become the de facto verification tool replacing simulation and, if so, the kind of disruption it could create. We intend to take a hard look at emulation versus simulation in the verification flow and determine the effectiveness of a simulation/formal verification flow versus a simulation/emulation flow. I’m planning to put each panelist on the spot and ask them to predict what’s coming next in the continuing evolution of verification.
Descriptions of the technical sessions, papers and tutorials make them all seem interesting and thought-provoking, but one in particular stands out for me: the ESL Tutorial, “Hybrid Solution Combining Emulation and Virtual Prototyping.” It’s high on my “Must See” list.
And then, there is the exhibit floor. The big three –– Cadence, Mentor and Synopsys –– will have booths, as will Verific, Aceic Design Technologies, Breker Verification Systems, Dassault Systemes, Doulos, Magillem, NEC, Real Intent, SmartDV Technologies, T&VS and True Chip.
This year’s DVCon India should be as much of a standout event as it was last year. For more information, visit: https://dvcon-india.org/
My next blog post will be a trip report on DVCon India. Look for it later in September.
August 24th, 2016 by Lauro Rizzatti
I had a chat with a friend yesterday who announced: “Less efforting is working for me.” The use of the noun effort as a verb –– efforting –– didn’t send me to my online dictionary to check my grammar or linguistic skills. Instead, it took me back 30-odd years to the early days of hardware emulation when efforting could have been the catchphrase.
In those days, back in the 1980s, the emulator arrived with a crew of applications engineers (“AEs in a box,” we used to say). Even they didn’t have a magic touch: it seemingly took forever to tweak the system just so to get it to work. Pricing required some justification efforting as well, because emulators were expensive verification tools. As a result, they were reserved for only the largest and most complex chip designs, which in those days averaged about 100,000 ASIC gates. Big price tag, big chips, lots of efforting.
Efforting continued into the 1990s as hardware emulation became a bit more popular. The emulators were an unsightly mess, with cables snaking around the boxes like spaghetti enveloping meatballs, so much so that they were relegated to a back room. All those cables came with in-circuit emulation (ICE), the default (actually, the only) use model for verifying the design-under-test (DUT) with real traffic data. While effective, ICE moved data in and out of the emulator at a lower speed than the actual speed of the real traffic, requiring the insertion of speed adapters and additional efforting. Further, the manned supervision demanded by the ICE mode limited hardware emulation’s ability to become a shared remote resource.
August 5th, 2016 by Lauro Rizzatti
A panel in DAC’s technical program this year continues to yield returns. I looked over my notes the other day and found that the moderator and the five panelists identified a few trends that fall outside the scope of traditional verification as it has been known for many years.
Trend #1: Engineering and verification teams are becoming more strategic. They are looking more carefully at the objectives to determine which verification engine is best suited for the task.
Trend #2: Engineering and verification teams acknowledge that verification today encompasses much more than just simulation. It includes hardware-based verification, whether in the form of emulation or FPGA prototyping, as well as formal verification. And it involves both pre-silicon verification and post-silicon validation.
July 6th, 2016 by Lauro Rizzatti
Recently, I spent a few enjoyable days in Austin at the Design Automation Conference, June 6-9, attending sessions, checking out verification vendors exhibiting on the show floor and catching up with long-time friends and colleagues. Evenings were filled with dinner events and more catching up.
The event was opened Monday morning by Chuck J. Alpert, general chair of the 53rd DAC, who delivered a motivational welcome supported with a few statistics. The following summarizes the paper and poster submissions and the acceptance rate, including the territorial breakdown.
May 31st, 2016 by Lauro Rizzatti
Of course, anyone who reads my blog posts on EDACafe knows I have a huge bias toward hardware emulation –– In fact, my blog is called Hardware Emulation Journal! I’ve been a part of this Design Automation market segment since 1995 and continue to believe it is the most versatile of all verification tools.
If you happen to be at DAC and want to learn more about hardware emulation and its growing use models, stop by the Mentor Graphics Booth (#949) Monday, June 6, at 4pm. I’m moderating an hour-long panel of exceedingly qualified verification experts who will help me answer the intriguing question: What’s up with all those new use models on an emulator?
Sitting on the podium with me will be the inimitable Alex Starr from AMD, Guy Hutchinson of Cavium Networks and Rick Leatherman at Imagination Technologies. Guy and I had a conversation earlier this week and I can attest to his enthusiasm and knowledge about hardware emulation.
The four of us agree that hardware emulation is moving into the mainstream and away from the dusty back corners of an engineering department. Its reputation has been rehabilitated as well. That’s because it’s accessible to all types of verification engineers and, fortunately for them, they do not need to be experts in the nuances of emulation. This means that emulation can be used to solve problems that previously were almost too tough to solve, and many verification tasks can be completed more quickly and thoroughly. One easily recognizable example is hardware/software co-verification. Hardware emulation is the only verification tool able to track a bug across the embedded software and the underlying hardware. By all accounts, a big challenge these days.
Another point of agreement is emulation’s horsepower, flexibility and versatility, which suggests we’re moving into the fourth era of emulation where applications rule. In the “apps” era, the emulator becomes the “verification hub” of a verification strategy because it is able to address the end-to-end verification needs of today’s complex designs. Applications extend the use of emulators beyond RTL verification, making it possible to develop new scenarios to target an increasing number of vertical markets, from networking and graphics to automotive and beyond.
Given hardware emulation’s emerging popularity and growing uses, each panelist will be asked to describe the types and sizes of designs he’s asked to debug, along with their applications and the basic verification flow. DAC attendees have noted their interest in hardware emulation’s various deployment modes, so we’ll look at several, including traditional ICE, TBA/TBX and virtual. I’ll ask each panelist to describe which modes he’s used and with what degree of success. We’ll attempt to identify the capabilities lacking in each.
Hardware emulation is being used for some new and, perhaps, unlikely tasks, such as low-power verification, power estimation and design for test. I intend to ask each panelist whether he’s familiar with these new modes and his perspective on the effectiveness of hardware emulation to debug chips with these characteristics.
As you might expect, our goal is to make this panel lively and thought-provoking. I predict the panelists will confirm through experience and expertise that emulation is much more usable than most commercial verification tools. We will leave room for questions, so please come armed with questions we can try to answer, or try to stump us.
Please join us. I look forward to seeing you in Austin.
May 20th, 2016 by Lauro Rizzatti
Why, the vendor sends in a village of AEs, R&D engineers and PhDs, of course, to work onsite with the lead designer and his or her designs. Yes, it’s common for a design automation company to send AEs into an account as “super” users of a specific software tool, such as formal verification, because it’s specialized technology not everyone has mastered. And yes, old-style hardware emulators came with AEs because early generations of the tool were difficult to deploy. They required expertise and plenty of manual effort to get them operational, hence the refrain “time to emulation.”
Some more optimistic project teams might assume this gesture is a sign of commitment. Actually, it’s not. Not even close. It’s a sign of the tool’s weakness: it doesn’t work on its own.
May 3rd, 2016 by Lauro Rizzatti
Many proponents and users of hardware emulation continue to enthuse about its benefits, expanding use models and growing popularity among hardware design and verification engineering groups. We believe it is the foundation of almost all verification strategies today, not replacing simulation, but augmenting it.
The topic of hardware emulation’s popularity is starting to show up in technical conferences and other industry events, reinforcing what we’re seeing. Wall Street’s paying attention as well.
March 31st, 2016 by Lauro Rizzatti
Attending a conference like DVCon offers many benefits, including the opportunity for loads of hallway discussions. I was stopped continuously during DVCon by friends, colleagues and acquaintances all wanting to talk about emulation, which convinced me that it’s the hottest verification tool and topic today.
Here are five of the questions I was asked, along with the answers.
Q1. For many years, emulation has been an exotic and rather expensive tool used in very limited market segments, such as the largest processor and graphics designs. Today, it is used across the board by virtually all semiconductor industry segments. What facilitated this broad adoption?
A1. Significant improvements in emulation hardware and in the supporting software. Perhaps even more significant is a dramatic drop in the cost of ownership (COO). Just consider that, on a dollar-per-gate basis, the cost dropped from $5 in the early 1990s to less than half a penny now. Add to that the radical enhancement in system reliability, the dramatic improvements in the usage model, and the multi-user and remote-access capabilities, and the COO is a small fraction of what it was a decade ago.
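Taking those dollar-per-gate figures at face value, a quick back-of-the-envelope calculation shows the scale of the drop (the 100,000-gate design size comes from the August 24th post; the exact prices are the round numbers quoted above, not precise market data):

```python
# Rough cost-per-gate comparison using the round figures quoted above.
early_90s_cost_per_gate = 5.00   # dollars per gate, early 1990s
today_cost_per_gate = 0.005      # "less than half a penny" per gate today

# How many times cheaper emulation capacity is per gate now
drop_factor = early_90s_cost_per_gate / today_cost_per_gate
print(drop_factor)  # 1000.0 -- a three-orders-of-magnitude drop

# What a 100,000-gate design (a large chip by early-1990s standards)
# would have cost to emulate at the old price
print(100_000 * early_90s_cost_per_gate)  # 500000.0 dollars
```

In other words, even before counting reliability and multi-user access, the raw per-gate price alone fell by roughly a factor of 1,000.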
March 21st, 2016 by Lauro Rizzatti
Attended by approximately 1,200 visitors (about 700 paying customers and 400+ on the free day) and 30 exhibitors, the event offered a program of 12 tutorials, 13 technical papers and 45 posters. Topics covered various aspects of design and design verification, with particular emphasis on hardware emulation, the Universal Verification Methodology (UVM), portable stimulus, low-power design and formal.
The event also offered a keynote, two panels, a roundtable and three lunches, each sponsored by one of the three main EDA vendors. A conversation between Jim Hogan and Ajoy Bose titled “Crossing the Chasm: From Technology to Valuable Enterprise” was hosted by the EDA Consortium and drew on Dr. Bose’s career building several businesses in the high-tech industry.