Cindy Wilson, Marketing Communications Manager
Wilson has been marketing communications manager at EVE since 2006 and is responsible for its tradeshows, seminars and events planning, media planning, social networking programs and budgeting. She has worked in EDA since 2002, when she joined Tharas Systems and held various positions, including …
Results from DAC Survey
August 27th, 2012 by Cindy Wilson, Marketing Communications Manager
The EVE Marketing Department takes the opportunity during each live event where we exhibit, such as DAC, to survey attendees who stop by to see us. This year’s DAC was no exception and we were pleased with the number of attendees willing to take time to answer our questions.
Most of the DAC attendees we met were from the U.S., though there was a healthy number of international visitors to our booth as well. When asked their job function, most answered “designer” or simply “engineer.” Other job functions included EDA tool support, management, verification/validation specialist or system architect. In other words, DAC had a wide range of attendees, many of whom appeared to be interested in reining in their verification challenges.
On the design side of things, there was no real surprise: Verilog was by far the most common HDL, followed, in order of preference, by SystemVerilog, VHDL and SystemC. Perhaps a bit more surprising: for testbenches, VHDL beat out Verilog, SystemVerilog and SystemC, in that order.
When asked about simulators, the responses we received aligned with what those familiar with the EDA industry would expect. The #1 simulator ranked in EDA also came up #1 in our survey.
Further, most survey respondents noted that their companies had either 0-100 simulation seats or more than 200, which parallels the nature of the SoC market: large companies and small companies, with relatively few in the middle.
Overall, respondents were reasonably satisfied with specific aspects of their existing verification flow, though with room for improvement, as the highest ratings were 4 or 5 on a scale of 1 to 6. The criteria covered runtime performance, setup time, efficiency in catching corner cases, reusability, and overall satisfaction with the verification flow.
In a subsequent question, we asked respondents to rank the importance of various criteria in their next hardware-assisted verification platform purchase decision. Price and runtime performance were the only criteria most often rated 6 on the 1-6 scale, making them the most important to this set of potential customers. Simulation acceleration and visibility into the design drew importance ratings of 5 and 6. Compilation performance, along with in-circuit emulation and transaction-based emulation methodologies, landed in the middle of the scale: important, but not the primary selection criteria.
Next, we asked about the current or anticipated mode of operation for hardware-assisted verification. Nearly 70% of respondents checked simulation acceleration and transaction-based emulation. This response shows the growing importance of accelerated virtual testing accessible to anyone on the network, versus traditional standalone emulation and in-circuit operating modes.
Perhaps to no one’s surprise, primary uses for a hardware-assisted verification platform were split almost evenly between ASIC validation (RTL debug) and hardware/software co-verification.
We believe surveying attendees at events is a key part of staying on top of trends. As a result, a carefully developed survey is always part of our events checklist. It has served us well and helps us better plan our product strategies.