Hardware Emulation Journal
Verification Consultant & Investor at Oregon Angel Fund
November 29th, 2016 by Lauro Rizzatti
Continuing a tradition started in the early days of the company, the European edition of the Mentor Graphics User Group meeting, now renamed User2User or U2U, was held Tuesday, October 11, in Munich, Germany.
In his opening remarks, Matthias Knoppik, Mentor Graphics’ Area Director for Northern-Central Europe, expressed his excitement at the record attendance and briefly presented the agenda for the one-day event. Two keynote presentations and seven technical tracks packed the day, from 9:15 a.m. to 4:30 p.m. An exhibit area, set up with computer-equipped podiums, gave Mentor’s partners the opportunity to demonstrate their products.
The first keynote was delivered by Malcolm Penn, Chairman and CEO of Future Horizons, a market research and analysis firm. Titled “Caveat Emptor: The Triumph of Hype vs Reality,” his talk highlighted the four factors that influence industry growth: the economy, unit growth, capacity or the ability to make those units, and the price those units command. In discussing each of them, he peppered his delivery with anecdotes and ironic asides in a lighthearted spirit. It was an enjoyable presentation.
The second keynote was presented by Dr. Walden (Wally) C. Rhines, Chairman and CEO of Mentor Graphics. Titled “Predicting the Next Wave of Semiconductor Growth,” it perfectly matched Malcolm’s earlier keynote, prompting Wally to draw a few parallels between the two. As is always the case, Wally did a terrific job. In this instance, what impressed me was Wally’s use of the Gompertz curve to predict the sectors with the best growth potential in the semiconductor industry. As Wally explained, the curve, or function, was conceived in 1825 by Benjamin Gompertz, a self-educated British mathematician. It is a mathematical model for time series, widely used to predict phenomena that unfold over time, such as tumor growth, population growth, bubble foam uptake, market impacts and finance. It has withstood the test of time.
Wally applied the Gompertz curve to several test cases to show what the future could look like. He started with desktop PCs, today growing at a negative rate. He moved on to notebooks, again with a negative growth rate, yet still projected to ship 40% of all the notebooks that will ever be built. He continued with cell phones, smart phones, internet use, IoT, set-top boxes, smart meters, fitness trackers, medical wearables, electric vehicles, data centers, gateways and data storage. Wally also discussed transistor production, and predicted its future using the Gompertz function.
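To make the mechanics concrete, here is a minimal sketch of the Gompertz function f(t) = a·e^(−b·e^(−c·t)). The parameters below are made up for illustration, not taken from Wally’s data; what matters is the S-shaped rise toward an asymptote, which is why the curve lends itself to forecasting cumulative shipments of a product category:

```python
import math

def gompertz(t, a, b, c):
    """Gompertz growth curve: a is the asymptote (total units ever shipped),
    b shifts the curve along the time axis, c sets the growth rate."""
    return a * math.exp(-b * math.exp(-c * t))

# Hypothetical parameters for a cumulative-shipments forecast.
a, b, c = 100.0, 5.0, 0.4   # asymptote of 100 (say, millions of units)

shipped = [gompertz(year, a, b, c) for year in range(31)]

# The curve rises in an S shape and saturates at the asymptote, so the
# fraction of units still to ship at any point in time is 1 - f(t)/a.
remaining_fraction = 1 - shipped[10] / a
```

Fitting a, b and c to historical shipment data and reading off 1 − f(t)/a is, in essence, how one arrives at statements like “40% of all notebooks ever built will ship in the future.”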
For more on Wally’s keynote and how the Gompertz curve can be applied to the semiconductor industry, see my latest blog post on EE Times.
For the technical tracks, I attended SoC Functional Verification, which complements my area of expertise. The track opened with a keynote by John Lenyo, Vice President and General Manager of the Design Verification Technology Division of Mentor Graphics. John presented industry trends in today’s verification landscape, starting with the growth in verification productivity based on the growth of transistors per design engineer, the drop in EDA cost per transistor, and the decrease in total IC revenue per transistor. This was followed by charts on rising verification complexity, with emphasis on design security and safety-critical design, comparing the worldwide trend to the European trend. A set of slides focused on design verification best practices, with a chart that mapped the past and anticipated future growth in emulation adoption.
Five technical presentations filled the SoC Functional Verification track.
Nigel Elliot from Mentor and Thomas Alofs from STMicroelectronics delivered an interesting test case on the verification challenges and solutions in designing a mixed-signal USB Type-C device.
Daniel Gruber from Univa presented a detailed analysis of the benefits of Univa Grid Engine to manage the workload of a modern multi-user, multi-job emulation platform in a datacenter. I will write about this technology and its benefits in a future piece.
Pranab Saharia of ARM, U.K., presented a paper titled “Reckoning GPU Power with Veloce.” Pranab first highlighted the limits of using simulation for estimating power dissipation in today’s complex SoC designs. He then described the adoption of the Veloce emulation platform to estimate the power consumption of the ARM Mali GPUs, the generation of SAIF files for average power consumption, and the creation of the Switching Activity Graph plotting the peak power. A set of benchmark data concluded his presentation.
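For context, a SAIF file records per-net switching activity, and average dynamic power follows from those toggle rates via the textbook relation P = Σ αᵢ·Cᵢ·Vdd²·f. The sketch below is a generic illustration of that formula with invented nets and values; it is not the actual Veloce flow or ARM’s methodology:

```python
# Average dynamic power from switching activity: P = sum(alpha_i * C_i * Vdd^2 * f),
# where alpha_i is the toggle rate (transitions per clock cycle) of net i and
# C_i its capacitive load. All nets and numbers here are purely illustrative.
def dynamic_power(nets, vdd, freq):
    """nets: list of (toggle_rate, capacitance_in_farads) pairs."""
    return sum(alpha * cap * vdd ** 2 * freq for alpha, cap in nets)

nets = [
    (0.1, 2e-15),  # quiet control net: 0.1 toggles/cycle, 2 fF load
    (0.5, 5e-15),  # busy datapath net: 0.5 toggles/cycle, 5 fF load
]
p_avg = dynamic_power(nets, vdd=0.9, freq=1e9)  # watts at 0.9 V, 1 GHz
```

The advantage of emulation over simulation here is simply throughput: realistic workloads generate billions of cycles of toggle data, far more than a simulator can produce in reasonable time.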
Antti Rautakoura from Nokia in Finland presented a step-by-step verification management methodology with emphasis on functional coverage.
Dr. Carol Marsh from Leonardo, one of the biggest suppliers of defense equipment to the U.K. Ministry of Defence (MoD), presented a test case on adopting UVM with a large development team. Given the nature of the Leonardo business, no slide handouts were allowed. From memory, I recall that the team was able to switch from designing entirely in VHDL, without any knowledge of Verilog, to adopting SystemVerilog and UVM across the board in 18 months or so. This is a remarkable feat that gives credit to management for supporting the trial, but also to the training put in place. Mentor’s Verification Academy was highly prized for helping to achieve this ambitious goal. It goes without saying that the engineers were motivated and among the cream of the crop.
All told, it was an exceptional event starting with the two keynotes. The presentations I saw in the SoC Functional Verification Track were first rate. Kudos to the organizers of U2U for making the event memorable.
November 18th, 2016 by Lauro Rizzatti
The 2016 DVCon Europe was held in Munich, Germany, at the Holiday Inn City Centre Hotel on October 19-20. This was the third year of the conference, and it has a decidedly local focus. In its short life, DVCon Europe has become the leading European event for electronic industry participants, mainly chip and system design verification engineers and managers, to gather and share information on innovative design and verification techniques.
In his opening remarks, Matthias Bauer of Infineon Technologies, the event’s program chair, expressed his satisfaction with the record attendance, said to be 20% higher than the previous year.
The two-day program included two keynotes, 16 tutorials, 43 technical papers in 13 sessions, two panels and a presentation at the gala dinner. For this year’s event, the decision was made to eliminate the technical posters. As is its custom, an exhibition hall was set up to give 24 exhibitors the opportunity to display and demo their wares.
September 19th, 2016 by Lauro Rizzatti
Although I don’t want to repeat myself, my 2015 report included my assessment that “the traffic on Bangalore’s roads reminded me of a Circle of Hell from Dante Alighieri’s 14th-century poem, ‘Divine Comedy.’” It was like that again this year. As if this wasn’t enough, the timing of this year’s conference coincided with an unanticipated and unpleasant event.
A little background: Bangalore is the capital of Karnataka State, which borders Tamil Nadu State. The border is delimited by the river Cauvery, which provides crucial, life-sustaining water to the two states via a dam. In good years, the dam holds enough water to fulfill the needs of both. In bad years, the scarcity of water causes grief, tensions and confrontations. The summer of 2016 was really bad. I was told that the discord between the two states turned into an enormous political crisis that escalated to the attention of the Prime Minister of India.
September 6th, 2016 by Lauro Rizzatti
DVCon India could be considered the official start of the fall season for our industry. It kicks off Thursday, September 15, and runs through Friday, September 16, at The Leela Palace in Bangalore, an elegant hotel and a great place to host a content-rich technical event like this.
The two-day event, now in its third year, will offer a bit of technical everything for design and verification engineers and engineering managers, from keynotes and panels to tutorials and papers. I hope to see an increase in attendance over the roughly 650 who attended in 2015. Attendees will have the opportunity to take part in many informal technical discussions. It’s a great networking opportunity.
One not-to-be missed keynote, “Design Verification: Challenging Yesterday, Today and Tomorrow,” will be delivered by Mentor Graphics’ Wally Rhines. According to the abstract found on the DVCon India website, he will review the major phases of the verification evolution over the past several decades and focus on the challenges of newly emerging problems. I’m looking forward to his insights and expect to see some terrific visuals.
As I did last year, I will moderate a panel titled, “The Future Verification Flow,” the first day from 12:10 p.m. until 1 p.m. in the Grand Ballroom. Panelists will be Mike Bartley of Test and Verification Solutions (T&VS), Shankar Narayana Bhat who hails from Qualcomm’s Bangalore Design Centre and Ashish Kumar who will join us from the Broadcom India Design Centre.
We plan on a lively discussion as we review the challenges of the current verification flows and hash over whether emulation will become the de facto verification tool replacing simulation and, if so, the kind of disruption it could create. We intend to take a hard look at emulation versus simulation in the verification flow and determine the effectiveness of a simulation/formal verification flow versus a simulation/emulation flow. I’m planning to put each panelist on the spot and ask them to predict what’s coming next in the continuing evolution of verification.
Descriptions of all the technical sessions, papers and tutorials make them seem interesting and thought-provoking, but one in particular stands out for me. It’s the ESL Tutorial: Hybrid Solution Combining Emulation and Virtual Prototyping. That’s high on my “Must See” list.
And then, there is the exhibit floor. The big three –– Cadence, Mentor and Synopsys –– will have booths, as will Verific, Aceic Design Technologies, Breker Verification Systems, Dassault Systemes, Doulos, Magillem, NEC, Real Intent, SmartDV Technologies, T&VS and True Chip.
This year’s DVCon India should be as much of a standout event as it was last year. For more information, visit: https://dvcon-india.org/
My next blog post will be a trip report on DVCon India. Look for it later in September.
August 24th, 2016 by Lauro Rizzatti
I had a chat with a friend yesterday who announced: “Less efforting is working for me.” The use of the noun effort as a verb –– efforting –– didn’t send me to my online dictionary to check my grammar or linguistic skills. Instead, it took me back 30-odd years to the early days of hardware emulation when efforting could have been the catchphrase.
In those days in the 1980s, the emulator arrived with a crew of applications engineers (AEs in a box, we used to say). Even they didn’t have a magic touch –– it seemingly took forever to tweak the system just so to get it to work. Pricing required some justification efforting as well because they were expensive verification tools. As a result, they were reserved for only the largest and most complex chip designs, which, in those days, averaged about 100,000 ASIC gates. Big price tag, big chips, lots of efforting.
Efforting continued into the 1990s as hardware emulators became a bit more popular, though they were an unsightly mess with cables snaking around the boxes, like spaghetti enveloping meatballs, so much so that they were relegated to a back room. With all those cables came in-circuit emulation (ICE), the default, and in fact the only, use model to verify the design-under-test (DUT) with real traffic data. While effective, the data in and out of the emulator ran at a lower speed than the real traffic, requiring the insertion of speed adapters and additional efforting. Further, the manned supervision demanded by the ICE mode limited hardware emulation’s ability to become a shared remote resource.
August 5th, 2016 by Lauro Rizzatti
A panel in DAC’s technical program this year continues to yield returns. I looked over my notes the other day and found that the moderator and the five panelists identified a few trends that fall outside the scope of traditional verification as it has been practiced for many years.
Trend #1: Engineering and verification teams are becoming more strategic. They are looking more carefully at the objectives to determine which verification engine is best suited for the task.
Trend #2: Engineering and verification teams acknowledge that verification today encompasses much more than just simulation. It includes hardware-based verification, whether in the form of emulation or FPGA prototyping, and formal verification. It involves pre-silicon verification and post-silicon validation.
July 6th, 2016 by Lauro Rizzatti
Recently, I spent a few enjoyable days in Austin at the Design Automation Conference, June 6-9, attending sessions, checking out verification vendors exhibiting on the show floor and catching up with long-time friends and colleagues. Evenings were filled with dinner events and more catching up.
The event was opened Monday morning by Chuck J. Alpert, general chair of the 53rd DAC, who delivered a motivational welcome supported with a few statistics. The following summarizes the paper and poster submissions and the acceptance rate, including the territorial breakdown.
May 31st, 2016 by Lauro Rizzatti
Of course, anyone who reads my blog posts on EDACafe knows I have a huge bias toward hardware emulation –– in fact, my blog is called Hardware Emulation Journal! I’ve been a part of this design automation market segment since 1995 and continue to believe it is the most versatile of all verification tools.
If you happen to be at DAC and want to learn more about hardware emulation and its growing use models, stop by the Mentor Graphics Booth (#949) Monday, June 6, at 4pm. I’m moderating an hour-long panel of exceedingly qualified verification experts who will help me answer the intriguing question: What’s up with all those new use models on an emulator?
Sitting on the podium with me will be the inimitable Alex Starr from AMD, Guy Hutchinson of Cavium Networks and Rick Leatherman of Imagination Technologies. Guy and I had a conversation earlier this week and I can attest to his enthusiasm and knowledge about hardware emulation.
The four of us agree that hardware emulation is moving into the mainstream and away from the dusty back corners of an engineering department. Its reputation has been rehabilitated as well. That’s because it’s accessible to all types of verification engineers and, fortunately for them, they do not need to be experts in the nuances of emulation. This means that emulation can be used to solve problems that previously were almost too tough to solve, and many verification tasks can be completed more quickly and thoroughly. One easily recognizable example is hardware/software co-verification. Hardware emulation is the only verification tool able to track a bug across the embedded software and the underlying hardware. By all accounts, a big challenge these days.
Another point of agreement is emulation’s horsepower, flexibility and versatility, which suggests we’re moving into the fourth era of emulation where applications rule. In the “apps” era, the emulator becomes the “verification hub” of a verification strategy because it is able to address the end-to-end verification needs of today’s complex designs. Applications extend the use of emulators beyond RTL verification, making it possible to develop new scenarios to target an increasing number of vertical markets, from networking and graphics to automotive and beyond.
Given hardware emulation’s emerging popularity and growing uses, each panelist will be asked to describe the types and sizes of designs he’s asked to debug, along with their applications and the basic verification flow. DAC attendees have noted their interest in hardware emulation’s various deployment modes, so we’ll look at several, including traditional ICE, TBA/TBX and virtual. I’ll ask each panelist to describe which modes they’ve used and their varying degrees of success. We’ll attempt to identify the capabilities lacking in each.
Hardware emulation is being used for some new and, perhaps, unlikely tasks, such as low-power verification, power estimation and design for test. I intend to ask each panelist whether he’s familiar with these new modes and his perspective on the effectiveness of hardware emulation to debug chips with these characteristics.
As you might expect, our goal is to make this panel lively and thought provoking. I predict the panelists will confirm through experience and expertise that it’s much more usable than most commercial verification tools. We will leave room for questions, so please come armed with questions that we can try to answer or stump us.
Please join us. I look forward to seeing you in Austin.
May 20th, 2016 by Lauro Rizzatti
Why, the vendor sends in a village of AEs, R&D engineers and PhDs, of course, to work onsite with the lead designer and his or her designs. Yes, it’s common for a design automation company to send AEs into an account as “super” users of a specific software tool, such as formal verification, because it’s specialized and specific technology not everyone has mastered. And yes, old-style hardware emulators came with AEs because early generations of the tool were difficult to deploy. They required expertise and plenty of manual effort to get them operational, and hence the refrain “time to emulation.”
Some more optimistic project teams might assume this gesture is a sign of commitment. Actually, it’s not. Not even close. It’s a sign of the tool’s weakness because it doesn’t work.
May 3rd, 2016 by Lauro Rizzatti
Many proponents and users of hardware emulation continue to enthuse about its benefits, expanding use models and growing popularity among hardware design and verification engineering groups. We believe it is the foundation of almost all verification strategies today, not replacing simulation, but augmenting it.
The topic of hardware emulation’s popularity is starting to show up in technical conferences and other industry events, reinforcing what we’re seeing. Wall Street’s paying attention as well.