Hardware Emulation Journal
Jean-Marie Brunet is Sr. Director of Marketing at Mentor, a Siemens Business. He has served for over 20 years in application engineering, marketing and management roles in the EDA industry, and has held IC design and design management positions at STMicroelectronics, Cadence, and Micron, among others.
March 6th, 2019 by Jean-Marie Brunet
You can’t turn around these days without seeing a reference to AI – even as a consumer. AI, or artificial intelligence, is hot due to the new machine-learning (ML) techniques that are evolving daily. It’s often cited as one of the critical markets for electronics purveyors, but it’s not really a market: it’s a technology. And it’s quietly – or not so quietly – moving into many, many markets. Some of those markets include safety-critical uses, meaning that life and limb can depend on how well it works.
AI is incredibly important, but it differs from many other important technologies in how it’s verified.
Three Key Requirements
AI/ML verification brings with it three key needs: determinism, scalability, and virtualization. These aren’t uncommon hardware emulation requirements, but many other technologies require only two out of those three. AI is the perfect storm that needs all three.
ML involves the creation of a model during what is called the “training phase” – at least in its supervised version. That model is then implemented in a device or in the cloud for inference, where the trained model is put to use in an application.
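As a minimal illustration of the two phases described above, here is a toy supervised example (not from the article; the function names and data are purely illustrative): a training phase fits model parameters from labeled samples, and an inference phase applies the trained model to new inputs.

```python
# Toy supervised ML flow: "train" derives model parameters from labeled
# data; "infer" applies the trained model to an unseen input.
# This fits y = w*x + b by ordinary least squares on one variable.

def train(xs, ys):
    """Training phase: derive model parameters (w, b) from labeled samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    w = cov / var
    b = mean_y - w * mean_x
    return w, b

def infer(model, x):
    """Inference phase: apply the trained model to an unseen input."""
    w, b = model
    return w * x + b

# Training data follows y = 2x + 1 exactly
model = train([0, 1, 2, 3], [1, 3, 5, 7])
print(infer(model, 10))  # prints 21.0
```

In a real deployment, as the article notes, the training and inference phases typically run on different hardware: training in a data center, inference in the end device or the cloud.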
March 13th, 2017 by Lauro Rizzatti
Each DVCon event tends to have a common thread running through the keynote presentations, panels, sponsored lunches, tutorials and technical sessions. I would pick "machine learning" as the new EDA frontier for DVCon US 2017, held in San Jose, Calif., at the DoubleTree Hotel February 27-March 2. In the keynote and panels, machine learning was mentioned here and there as the panacea for achieving full automation of the verification process.
Attended by approximately 1,051 badge holders (exhibit-only, technical conference and booth staff), it encompassed nine tutorials, 12 sessions with a total of 39 technical papers and 19 posters, and 31 exhibits.
Presenters debated diverse facets of design verification:
A few more tutorials, sessions and posters covered design verification in general, including coverage, testbench automation, SoC verification, IP verification, design safety verification and analog/mixed-signal verification.
Hardware emulation, my favorite topic, was mentioned in the keynote, but it was the subject of only one paper delivered by Samsung in the Power Optimization session.
The keynote presentation, titled "Tomorrow's Verification Today," was delivered by Dr. Anirudh Devgan, Sr. VP and GM of the Digital & Signoff Group and System & Verification Group at Cadence. In his opening remarks, Dr. Devgan reminded the audience that it was Mardi Gras, and went on to discuss the trends in design verification. His unique position overlooking the entire design verification landscape served by a major EDA player gave him the perspective to review the limitations of today's tools and to appraise the requirements for addressing tomorrow's hardware and software development needs.
Regarding formal analysis, Dr. Devgan said that formal verification has made lots of progress over the years, but there is room for improvement. Three areas need attention:
A Special Session hosted by Harry Foster, chief technologist at Mentor Graphics, presented the summary of the 2016 Industry Survey of the functional verification landscape. The 2016 survey followed in the footsteps of the previous surveys of 2007, 2009, 2012 and 2014. The 2014 and 2016 surveys, conducted worldwide as double-blind studies covering all electronic industry market segments, were never published.
The presentation included a trove of interesting data. Given my personal interest in hardware emulation, I wish to mention that the study showed that 24% of projects today adopt emulation. Between 2014 and 2016, there was a significant increase in the use of emulation for IP development and verification. The following chart maps various objectives for adopting emulation.
DVCon US also included three panels, as well as four sponsored lunches, one each from the three EDA giants and Accellera.
In one of the panels titled “The Verify Seven” and backed by the ESD Alliance, OneSpin Solutions and Vista Ventures’ Jim Hogan, six prominent representatives from small EDA companies debated the pains and joys of starting a company. Presenters included:
All shared similar experiences. They spent years and dollars trying different alternatives before settling on a definite product and a business model. My recommendation to a potential entrepreneur is to think long and hard about what to do before jumping in. It would save time, money and, most of all, frustration.
Phil Moorby, credited as the father of Verilog, participated in two other panels, and in all three stated that the SystemVerilog language, while the pinnacle of design description, is a rather poor choice for testbench description. Inadequate for parallelization, it is rather inefficient when executed on x86 or GPU architectures. A better choice, said Phil, would be a C/C++ approach. More than one attendee wondered what was cooking in Montana, but Phil kept his lips sealed.
In a panel titled "Users Talk Back on Portable Stimulus", organized by Nanette Collins, founder of NVC Marketing and Public Relations, and moderated by Adnan Hamid, CEO of Breker Verification Systems, the five panelists, including Asad Khan, Cavium, Dave Brownell, Analog Devices, Mark Glasser, Nvidia, Wolfgang Roesner, IBM, and Sanjay Gupta, Qualcomm, debated the status of the portable stimulus initiative. All five agreed that, viewed from 30,000 feet, the initiative is welcome and badly needed. But as they moved down to the details of the proposed plans, disagreement, contradiction, and a bit of confusion arose. It will be interesting to observe the evolution of the proposal.
All considered, the 2017 DVCon US conference was successful, but I cannot hide my disappointment at the virtual absence of hardware emulation as a topic, especially when compared to the exposure given to formal analysis. Just consider that, via deployment in virtual mode (an alternative to in-circuit emulation, or ICE) and the adoption of emulation apps to perform a variety of functional verification tasks, hardware emulation is used across the verification landscape, and its "shift-left" adoption continues to progress. When installed in a data center, it can serve a worldwide design verification community 24/7, all year round, optimizing the return on investment. All market segments, from processor to networking, storage, multimedia, automotive and more, benefit from the use of emulation. In fact, without emulation, functional verification of the monstrous chip designs the industry is cranking out today would not be possible.
February 6th, 2017 by Lauro Rizzatti
If it’s February, that means the chip design verification community is gearing up for the annual DVCon. This year it will be held Monday, February 27, through Thursday, March 2. As always, it will be a jam-packed week full of tutorials, paper sessions and panels. And, of course, the Expo with 32 of the leading verification companies showcasing their tools will be a highlight.
The stage will be set Monday by conference sponsor and host Accellera with three standards-related tutorials on portable stimulus, universal verification methodology (UVM) and SystemC. An Accellera lunch will update attendees on working groups and an outlook for the future. The lunch will include an awards ceremony as well.
Capping Monday’s activities will be the popular Booth Crawl in the expo area from 5 p.m. until 7 p.m.
December 20th, 2016 by Lauro Rizzatti
I recently read a whitepaper on hardware emulation for the IoT market and captured some thoughts here. At the end of this article is a link to the entire whitepaper if you want to read more.
IoT has captured the semiconductor industry's attention, and the race is on to design chips to support this emerging market. Naturally, IoT chip designs need powerful verification tools, which is why design verification engineers are taking a closer look at hardware emulation. It is the only verification tool that provides the capacity, performance and verification cycles needed for IoT designs.
One hardware emulation platform verifies IoT designs by disconnecting the hardware and operating system from applications that run on the end-user products. Design verification engineers test their designs using applications in the same way they run applications on their electronic devices.
The platform's OS provides an interface to the emulator for applications that run on top of a single operating system. The OS supports an enterprise server that optimizes resource utilization and provides job management, letting users submit jobs from their desktops to emulation resources housed in datacenters. The enterprise server supports concurrent use of the emulator by multiple projects, groups, users and use modes, and it determines where to allocate single or multiple projects to ensure the most efficient use of datacenter emulation resources.
November 29th, 2016 by Lauro Rizzatti
Continuing a tradition started in the early days of the company, the European edition of the Mentor Graphics’ User Group meeting, now renamed User2User or U2U, was held Tuesday, October 11, in Munich, Germany.
In his opening remarks, Matthias Knoppik, Mentor Graphics' Area Director Northern-Central Europe, expressed his excitement over the record attendance, and briefly presented the agenda for the one-day event. Two keynote presentations and seven technical tracks packed the day, from 9:15 a.m. to 4:30 p.m. An exhibit area set up with podiums equipped with computers gave Mentor's partners the opportunity to demonstrate their products.
The first keynote was delivered by Malcolm Penn, Chairman and CEO of Future Horizons, a market research and analysis firm. Titled "Caveat Emptor: The Triumph of Hype vs Reality," the talk highlighted the four factors that influence industry growth: the economy, unit growth, capacity (the ability to make those units), and the price obtained for them. In discussing each of them, he peppered his delivery with anecdotes and ironic asides in a lighthearted spirit. It was an enjoyable presentation.
November 18th, 2016 by Lauro Rizzatti
The 2016 DVCon Europe was held in Munich, Germany, at the Holiday Inn City Centre Hotel on October 19-20. This was the third year of the conference, which has a decidedly local focus. In its short life, DVCon Europe has become the leading European event for electronic industry participants, mainly chip and system design verification engineers and managers, to gather and share information on innovative design and verification techniques.
In his opening remarks, Matthias Bauer from Infineon Technologies and Program Chair of the Event, expressed his satisfaction for the record attendance, said to be 20% higher than the previous year.
The two-day program included two keynotes, 16 tutorials, 43 technical papers in 13 sessions, two panels and a presentation at the gala dinner. For this year's event, the decision was made to eliminate the technical posters. As is the custom, an exhibition hall was set up to give 24 exhibitors the opportunity to display and demo their wares.
September 19th, 2016 by Lauro Rizzatti
Although I don’t want to repeat myself, my 2015 report included my assessment that “the traffic on Bangalore’s roads reminded me of a Circle of Hell from Dante Alighieri’s 14th-century poem, ‘Divine Comedy.’” It was like that again this year. As if this wasn’t enough, the timing of this year’s conference coincided with an unanticipated and unpleasant event.
A little background: Bangalore is the capital of Karnataka State, which borders Tamil Nadu State. The border is delimited by the river Cauvery, which provides crucial, life-sustaining water to the two states via a dam. In good years, the dam holds enough water to fulfill the needs of both. In bad years, the scarcity of water causes grief, tensions and confrontations. The summer of 2016 was really bad. I was told that the discord between the two states turned into an enormous political crisis that escalated to the attention of the Prime Minister of India.
September 6th, 2016 by Lauro Rizzatti
DVCon India could be considered the official start of the fall season for our industry. It kicks off Thursday, September 15, and runs through Friday, September 16, at The Leela Palace in Bangalore, an elegant hotel and a great place to host a content-rich technical event like this.
The two-day event, now in its third year, will offer a bit of technical everything for design and verification engineers and engineering managers, from keynotes and panels to tutorials and papers. I hope to see an increase in attendance over last year's roughly 650. The attendees will have the opportunity to take part in many of the informal technical discussions. It's a great networking opportunity.
One not-to-be missed keynote, “Design Verification: Challenging Yesterday, Today and Tomorrow,” will be delivered by Mentor Graphics’ Wally Rhines. According to the abstract found on the DVCon India website, he will review the major phases of the verification evolution over the past several decades and focus on the challenges of newly emerging problems. I’m looking forward to his insights and expect to see some terrific visuals.
As I did last year, I will moderate a panel titled, “The Future Verification Flow,” the first day from 12:10 p.m. until 1 p.m. in the Grand Ballroom. Panelists will be Mike Bartley of Test and Verification Solutions (T&VS), Shankar Narayana Bhat who hails from Qualcomm’s Bangalore Design Centre and Ashish Kumar who will join us from the Broadcom India Design Centre.
We plan on a lively discussion as we review the challenges of the current verification flows and hash over whether emulation will become the de facto verification tool replacing simulation and, if so, the kind of disruption it could create. We intend to take a hard look at emulation versus simulation in the verification flow and determine the effectiveness of a simulation/formal verification flow versus a simulation/emulation flow. I’m planning to put each panelist on the spot and ask them to predict what’s coming next in the continuing evolution of verification.
Descriptions of all the technical sessions, papers and tutorials make them seem interesting and thought-provoking, but one in particular stands out for me. It's the ESL Tutorial: Hybrid Solution Combining Emulation and Virtual Prototyping. That's high on my "Must See" list.
And then, there is the exhibit floor. The big three –– Cadence, Mentor and Synopsys –– will have booths, as will Verific, Aceic Design Technologies, Breker Verification Systems, Dassault Systemes, Doulos, Magillem, NEC, Real Intent, SmartDV Technologies, T&VS and True Chip.
This year’s DVCon India should be as much of a standout event as it was last year. For more information, visit: https://dvcon-india.org/
My next blog post will be a trip report on DVCon India. Look for it later in September.
August 24th, 2016 by Lauro Rizzatti
I had a chat with a friend yesterday who announced: “Less efforting is working for me.” The use of the noun effort as a verb –– efforting –– didn’t send me to my online dictionary to check my grammar or linguistic skills. Instead, it took me back 30-odd years to the early days of hardware emulation when efforting could have been the catchphrase.
In those days, the 1980s, the emulator arrived with a crew of applications engineers (AEs in a box, we used to say). Even they didn't have a magic touch: it seemingly took forever to tweak the system just so to get it to work. Pricing required some justification efforting as well, because emulators were expensive verification tools. As a result, they were reserved for only the largest and most complex chip designs, which in those days averaged about 100,000 ASIC gates. Big price tag, big chips, lots of efforting.
Efforting continued into the 1990s as hardware emulation became a bit more popular, though emulators remained an unsightly mess, with cables snaking around the boxes like spaghetti enveloping meatballs, so much so that they were relegated to a back room. With all those cables came in-circuit emulation (ICE), the default, and in fact the only, use model for verifying the design-under-test (DUT) with real traffic data. While effective, the data in and out of the emulator ran at a lower speed than the real traffic, requiring the insertion of speed adapters and additional efforting. Further, the manned supervision demanded by the ICE mode limited hardware emulation's ability to become a shared remote resource.
August 5th, 2016 by Lauro Rizzatti
A panel in DAC’s technical program this year continues to yield returns. I looked over my notes the other day and found that the moderator and the five panelists identified a few trends that are outside the scope of the traditional verification as known for many years.
Trend #1: Engineering and verification teams are becoming more strategic. They are looking more carefully at the objectives to determine which verification engine is best suited for the task.
Trend #2: Engineering and verification teams acknowledge that verification today encompasses much more than simulation. It includes hardware-based verification, whether in the form of emulation or FPGA prototyping, as well as formal verification. And it spans both pre-silicon verification and post-silicon validation.