Hardware Emulation Journal
Verification Consultant & Investor at Oregon Angel Fund
December 20th, 2016 by Lauro Rizzatti
I recently read a whitepaper on hardware emulation for the IoT market and captured some thoughts here. At the end of this article is a link to the entire whitepaper if you want to read more.
IoT has captured the semiconductor industry’s attention, and the race is on to design chips to support this emerging market. Naturally, IoT chip designs need powerful verification tools, which is why design verification engineers are taking a closer look at hardware emulation. It is the only verification tool that provides the capacity, performance and verification cycles needed to verify IoT designs.
One hardware emulation platform verifies IoT designs by disconnecting the hardware and operating system from applications that run on the end-user products. Design verification engineers test their designs using applications in the same way they run applications on their electronic devices.
The platform’s operating system provides an interface between the emulator and the applications that run on top of it. The OS supports an enterprise server that optimizes resource utilization and provides job management, letting users submit jobs from their desktops to emulation resources housed in datacenters. The enterprise server supports concurrent use of the emulator across multiple projects, groups, users and use modes, and determines where to allocate one or more projects to ensure the most efficient use of datacenter emulation resources.
November 29th, 2016 by Lauro Rizzatti
Continuing a tradition started in the early days of the company, the European edition of Mentor Graphics’ user group meeting, now renamed User2User, or U2U, was held Tuesday, October 11, in Munich, Germany.
In the opening remarks, Matthias Knoppik, Mentor Graphics’ Area Director for Northern-Central Europe, expressed his excitement over the record attendance and briefly presented the agenda for the one-day event. Two keynote presentations and seven technical tracks packed the day, from 9:15 a.m. to 4:30 p.m. An exhibit area set up with computer-equipped podiums gave Mentor’s partners the opportunity to demonstrate their products.
The first keynote was delivered by Malcolm Penn, chairman and CEO of Future Horizons, a market research and analysis firm. Titled “Caveat Emptor: The Triumph of Hype vs. Reality,” the talk highlighted the four factors that influence industry growth: the economy, unit growth, capacity (the ability to make those units) and the price those units command. In discussing each one, Penn peppered his delivery with anecdotes and ironic asides in a lighthearted spirit. It was an enjoyable presentation.
November 18th, 2016 by Lauro Rizzatti
The 2016 DVCon Europe was held October 19-20 in Munich, Germany, at the Holiday Inn City Centre hotel. This was the third year of the conference, which has a decidedly local focus. In its short life, DVCon Europe has become the leading European event for electronics industry participants, mainly chip and system design verification engineers and managers, to gather and share information on innovative design and verification techniques.
In his opening remarks, Matthias Bauer of Infineon Technologies, the event’s program chair, expressed his satisfaction with the record attendance, said to be 20% higher than the previous year’s.
The two-day program included two keynotes, 16 tutorials, 43 technical papers in 13 sessions, two panels and a presentation at the gala dinner. For this year’s event, the decision was made to eliminate the technical posters. As is its custom, an exhibition hall was set up to give 24 exhibitors the opportunity to display and demo their wares.
September 19th, 2016 by Lauro Rizzatti
Although I don’t want to repeat myself, my 2015 report included my assessment that “the traffic on Bangalore’s roads reminded me of a Circle of Hell from Dante Alighieri’s 14th-century poem, ‘Divine Comedy.’” It was like that again this year. As if this wasn’t enough, the timing of this year’s conference coincided with an unanticipated and unpleasant event.
A little background: Bangalore is the capital of the state of Karnataka, which borders the state of Tamil Nadu. The border is delimited by the river Cauvery, which, via a dam, provides crucial, life-sustaining water to both states. In good years, the dam holds enough water to fulfill the needs of both. In bad years, the scarcity of water causes grief, tension and confrontation. The summer of 2016 was really bad. I was told that the discord between the two states turned into an enormous political crisis that escalated to the attention of the Prime Minister of India.
September 6th, 2016 by Lauro Rizzatti
DVCon India could be considered the official start of the fall season for our industry. It kicks off Thursday, September 15, and runs through Friday, September 16, at The Leela Palace in Bangalore, an elegant hotel and a great place to host a content-rich technical event like this.
The two-day event, now in its third year, will offer a bit of technical everything for design and verification engineers and engineering managers, from keynotes and panels to tutorials and papers. I hope to see attendance grow beyond about 650 this year. Attendees will have the opportunity to take part in many informal technical discussions; it’s a great networking opportunity.
One not-to-be missed keynote, “Design Verification: Challenging Yesterday, Today and Tomorrow,” will be delivered by Mentor Graphics’ Wally Rhines. According to the abstract found on the DVCon India website, he will review the major phases of the verification evolution over the past several decades and focus on the challenges of newly emerging problems. I’m looking forward to his insights and expect to see some terrific visuals.
As I did last year, I will moderate a panel titled, “The Future Verification Flow,” the first day from 12:10 p.m. until 1 p.m. in the Grand Ballroom. Panelists will be Mike Bartley of Test and Verification Solutions (T&VS), Shankar Narayana Bhat who hails from Qualcomm’s Bangalore Design Centre and Ashish Kumar who will join us from the Broadcom India Design Centre.
We plan on a lively discussion as we review the challenges of the current verification flows and hash over whether emulation will become the de facto verification tool replacing simulation and, if so, the kind of disruption it could create. We intend to take a hard look at emulation versus simulation in the verification flow and determine the effectiveness of a simulation/formal verification flow versus a simulation/emulation flow. I’m planning to put each panelist on the spot and ask them to predict what’s coming next in the continuing evolution of verification.
Descriptions of all the technical sessions, papers and tutorials, make them all seem interesting and thought-provoking, but one in particular stands out for me. It’s the ESL Tutorial: Hybrid Solution Combining Emulation and Virtual Prototyping. That’s high on my “Must See” list.
And then, there is the exhibit floor. The big three –– Cadence, Mentor and Synopsys –– will have booths, as will Verific, Aceic Design Technologies, Breker Verification Systems, Dassault Systemes, Doulos, Magillem, NEC, Real Intent, SmartDV Technologies, T&VS and True Chip.
This year’s DVCon India should be as much of a standout event as it was last year. For more information, visit: https://dvcon-india.org/
My next blog post will be a trip report on DVCon India. Look for it later in September.
August 24th, 2016 by Lauro Rizzatti
I had a chat with a friend yesterday who announced: “Less efforting is working for me.” The use of the noun effort as a verb –– efforting –– didn’t send me to my online dictionary to check my grammar or linguistic skills. Instead, it took me back 30-odd years to the early days of hardware emulation when efforting could have been the catchphrase.
In those days, back in the 1980s, the emulator arrived with a crew of applications engineers (“AEs in a box,” we used to say). Even they didn’t have a magic touch: it seemingly took forever to tweak the system just so to get it to work. Pricing required some justification efforting as well, because emulators were expensive verification tools. As a result, they were reserved for only the largest and most complex chip designs, which in those days averaged about 100,000 ASIC gates. Big price tag, big chips, lots of efforting.
Efforting continued into the 1990s as hardware emulators became a bit more popular, though they were an unsightly mess, with cables snaking around the boxes like spaghetti enveloping meatballs, so much so that they were relegated to a back room. With all those cables came in-circuit emulation (ICE), the default, and in fact the only, use model for verifying the design under test (DUT) with real traffic data. While effective, the data in and out of the emulator ran at a lower speed than the real traffic, requiring the insertion of speed adapters and yet more efforting. Further, the manned supervision demanded by the ICE mode limited hardware emulation’s ability to become a shared remote resource.
August 5th, 2016 by Lauro Rizzatti
A panel in DAC’s technical program this year continues to yield returns. Looking over my notes the other day, I found that the moderator and the five panelists identified a few trends outside the scope of traditional verification as it has been practiced for many years.
Trend #1: Engineering and verification teams are becoming more strategic. They are looking more carefully at the objectives to determine which verification engine is best suited for the task.
Trend #2: Engineering and verification teams acknowledge that verification today encompasses much more than simulation. It includes hardware-based verification, whether in the form of emulation or FPGA prototyping, as well as formal verification. It spans pre-silicon verification and post-silicon validation.
July 6th, 2016 by Lauro Rizzatti
Recently, I spent a few enjoyable days in Austin at the Design Automation Conference, June 6-9, attending sessions, checking out verification vendors exhibiting on the show floor and catching up with long-time friends and colleagues. Evenings were filled with dinner events and more catching up.
The event was opened Monday morning by Chuck J. Alpert, general chair of the 53rd DAC, who delivered a motivational welcome supported with a few statistics. The following summarizes the paper and poster submissions and the acceptance rate, including the territorial breakdown.
May 31st, 2016 by Lauro Rizzatti
Of course, anyone who reads my blog posts on EDACafe knows I have a huge bias toward hardware emulation. In fact, my blog is called Hardware Emulation Journal! I’ve been a part of this design automation market segment since 1995 and continue to believe emulation is the most versatile of all verification tools.
If you happen to be at DAC and want to learn more about hardware emulation and its growing use models, stop by the Mentor Graphics Booth (#949) Monday, June 6, at 4pm. I’m moderating an hour-long panel of exceedingly qualified verification experts who will help me answer the intriguing question: What’s up with all those new use models on an emulator?
Sitting on the podium with me will be the inimitable Alex Starr of AMD, Guy Hutchinson of Cavium Networks and Rick Leatherman of Imagination Technologies. Guy and I had a conversation earlier this week, and I can attest to his enthusiasm for and knowledge of hardware emulation.
The four of us agree that hardware emulation is moving into the mainstream and away from the dusty back corners of the engineering department. Its reputation has been rehabilitated as well. That’s because it’s accessible to all types of verification engineers who, fortunately, do not need to be experts in the nuances of emulation. This means emulation can be applied to problems that previously were almost too tough to solve, and many verification tasks can be completed more quickly and thoroughly. One easily recognizable example is hardware/software co-verification: hardware emulation is the only verification tool able to track a bug across the embedded software and the underlying hardware, by all accounts a big challenge these days.
Another point of agreement is emulation’s horsepower, flexibility and versatility, which suggests we’re moving into the fourth era of emulation where applications rule. In the “apps” era, the emulator becomes the “verification hub” of a verification strategy because it is able to address the end-to-end verification needs of today’s complex designs. Applications extend the use of emulators beyond RTL verification, making it possible to develop new scenarios to target an increasing number of vertical markets, from networking and graphics to automotive and beyond.
Given hardware emulation’s emerging popularity and growing uses, each panelist will be asked to describe the types and sizes of designs he’s asked to debug, along with their applications and the basic verification flow. DAC attendees have noted their interest in hardware emulation’s various deployment modes, so we’ll look at several, including traditional ICE, TBA/TBX and virtual. I’ll ask each panelist to describe which modes they’ve used and with what degree of success. We’ll also attempt to identify the capabilities each mode lacks.
Hardware emulation is being used for some new and, perhaps, unlikely tasks, such as low-power verification, power estimation and design for test. I intend to ask each panelist whether he’s familiar with these new modes and his perspective on the effectiveness of hardware emulation to debug chips with these characteristics.
As you might expect, our goal is to make this panel lively and thought-provoking. I predict the panelists will confirm, from their own experience and expertise, that emulation is much more usable than most commercial verification tools. We will leave room for questions, so please come armed with some; we’ll try to answer them, or you can try to stump us.
Please join us. I look forward to seeing you in Austin.
May 20th, 2016 by Lauro Rizzatti
Why, the vendor sends in a village of AEs, R&D engineers and PhDs, of course, to work onsite with the lead designer and his or her designs. Yes, it’s common for a design automation company to send AEs into an account as “super” users of a specific software tool, such as formal verification, because it’s a specialized technology not everyone has mastered. And yes, old-style hardware emulators came with AEs because early generations of the tool were difficult to deploy. They required expertise and plenty of manual effort to get them operational, hence the refrain “time to emulation.”
Some more optimistic project teams might assume this gesture is a sign of commitment. Actually, it’s not. Not even close. It’s a sign of weakness, because the tool doesn’t work without them.