Archive for April, 2010
Monday, April 26th, 2010
EDA developers need a very active imagination. They need to imagine becoming their own end users. Sometimes they may become the designer, sometimes the verification engineer, or perhaps even the design manager. This role play is essential for creating tools that designers will embrace; otherwise, the tools will be just one-hit wonders. For an EDA tool to become a regular part of a designer’s tool chest, it needs a very high usability quotient, and role play is essential for creating that.
A tool’s usage in a design flow can nominally be broken into three distinct phases: (a) Setup, (b) Analysis, and (c) Debug. Setup is required just once per design and should not be an onerous step. Analysis is done within the tool and should be highly, if not completely, automated. It should require little user attention beyond tracking things like performance, memory size, etc. It is the debug phase where users spend most of their bandwidth. They have to examine the output of the tool, combine their design knowledge with the tool’s analysis data, and quickly identify and repair the source of any detected issues.
To accurately capture and automate this flow, developers need to imagine how users are going to interact with their tool. For example: does the tool present information in the proper terms and conventions of the language in use? Is the debugging output organized consistently with the design structure? Is the tool effective at propagating bugs to observable points in the design? Can the debug environment reconstruct the faulty effect easily under user control? Effective organization of the output is essential to enable users to view results in ways that can be internalized easily.
By a large margin, debug is the major factor in a verification tool’s usability. An accurate understanding of the designer’s desires and needs is the most effective way of organizing the output in a clear, logical fashion. Developer imagination is a key part of this effort, as is real customer feedback to gauge how effectively the goal has been reached. Despite the availability of dedicated tools and methodologies for verification, users are spending a lot of time tracking down bugs that should have been easily caught and debugged. Sometimes, the only difference between failure and success is just a little imagination.
Monday, April 19th, 2010
I have been involved in verification projects for the last ten years. One thing I can say for sure is that the level of complexity is ever rising for both design and verification. With more and more ASICs being designed for high-compute applications in mobile consoles and the consumer market, time to market has become critical and there is zero tolerance for being late. At the same time, increasingly power-sensitive devices add to the functional requirements. The verification effort grows exponentially rather than linearly with respect to the added features. The reason is simple: when the feature list grows by 30%, true verification requires not only 30% more feature checking but also cross-feature verification. The higher demands, coupled with shorter schedules, make verification a challenging task.
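The super-linear growth is easy to see if cross-feature interactions are counted pairwise; here is a rough back-of-the-envelope sketch (the pairwise model and the function name are my own illustration, not a claim about any particular flow):

```python
from math import comb

def verification_items(n_features: int) -> int:
    """Features to check individually, plus pairwise cross-feature interactions."""
    return n_features + comb(n_features, 2)

base = verification_items(100)   # 100 features -> 100 + 4950 = 5050 items
grown = verification_items(130)  # 30% more features -> 130 + 8385 = 8515 items
print(f"feature growth: 30%, verification growth: {(grown - base) / base:.0%}")
```

Under this toy model, a 30% feature increase yields roughly a 69% increase in things to verify, which is the gap between linear staffing assumptions and actual verification workload.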
In addition, globalization has brought about knowledge sharing as well as tough competition around the world. The rule of the game has become to deliver as fast as possible. As a result, verification professionals need to be constantly on the lookout for the right technologies to deploy in their verification flow in order to keep up with the pace of change and get ahead of the competition. The verification flow can become a competitive advantage for a firm.
Random, coverage-driven verification (CDV) is becoming an industry standard, and I believe that trying to deliver bug-free ASICs or FPGAs using directed testing is practically impossible. The problem with CDV is that it involves a huge amount of engineering effort to build all the verification environments. This effort is at least as complex as the design itself, and often even more so. It also requires dedicated teams specializing in environment development. The consequence is that the debugging process has longer iterations. Every time there is a test failure, two engineers, responsible for two different systems (the design and the verification environment), have to find out what went wrong, and only then can the mistakes be corrected. Bug-fixing turnaround time can reach days or even weeks.
This is why automatic formal verification becomes useful. Automatic formal gives the team a way to find many bugs in a much cheaper manner (found earlier, and therefore easier to detect, debug, and fix). These tools can prove that the design is clean of many issues that can be difficult to find using simulation. These issues include dead code segments; logical equations that are implemented incorrectly, thus producing a constant value; state machines that are deadlocked; pairs of state machines that lock each other up; and even clock domain crossing problems such as data instability and incorrect control and feedback implementation.
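The deadlock checks mentioned above reduce to a simple analysis over the state-transition graph, which is why no testbench is needed. A toy illustration in Python (the FSM encoding and state names are hypothetical, and real tools work on the synthesized RTL, not a hand-written graph):

```python
def find_deadlocks(transitions):
    """Return states that can never make progress.

    `transitions` maps each state to the set of states it can move to;
    a state whose only successor is itself (or that has no successors
    at all) is deadlocked, and a formal tool can flag it statically."""
    return {s for s, succs in transitions.items() if not (succs - {s})}

# A four-state FSM where ERROR loops on itself with no way out.
fsm = {
    "IDLE":  {"RUN"},
    "RUN":   {"IDLE", "DONE", "ERROR"},
    "DONE":  {"IDLE"},
    "ERROR": {"ERROR"},  # deadlocked: reachable, but no exit transition
}
print(find_deadlocks(fsm))  # -> {'ERROR'}
```

The same graph view underlies the other checks: dead code is a state or branch that is unreachable from reset, and a cross-locked pair of FSMs is a cycle of mutual wait states.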
One might say that these tools have no understanding of the functional specification, but in many cases functional bugs are a direct consequence of these kinds of issues in the design. Another might say that most of these bugs can be found by well-designed verification environments. Possibly; however, the great thing about automatic formal verification tools is that they do not need any verification environment development, which saves a tremendous amount of time and effort. The concept is similar to lint tools: all you need is a short run script and the design itself. Therefore, when designers are done with coding, they can use these tools, get answers quickly, and debug on their own with a very short turnaround time. All this occurs right when the RTL design is being developed, and bugs found early are usually much easier to debug than ones found weeks or even months later in the project, when verifying at a larger scope.
On the whole, automatic formal usage on top of CDV can save between 10% and 15% of the verification effort and give greater confidence that the design is ready for the next steps in the flow. In the competitive environment we live in, and with the amount of resources put into verification, this saving can make a big difference. Taking the time right now to ensure that you have the right tools in the right place in your verification flow can save you time and money, which equates to revenue for your company.
Do you have the next generation verification flow?
Monday, April 12th, 2010
There is a lot of buzz about SNUG lately. It is not surprising. SNUG’s traditional Tuesday night event went through a big transformation this year and was a big hit with all participants. The original Interoperability Fair was transformed into the “Designer Community Expo” by allowing Synopsys’ partners and suppliers to gather in seven communities (IC Design, IC Verification, IP, System Design, FPGA, Custom Design and AMS Verification, and Compute Infrastructure) and to present integrated solutions to Synopsys users.
Real Intent exhibited its automated formal verification product families at the SNUG Designer Community Expo, including Ascent, for early functional verification, and Meridian, for clock domain crossing verification. We also demonstrated the integration between our products and Synopsys’ VCS simulator. Our engineers were all smiles when recounting the event. According to Jay Littlefield, Sr. Applications Engineer at Real Intent:
SNUG this year appeared to be very well-attended. The vendor show area was easily 4-6x the size of DVCon, and had a much more open floor plan. This made it easy for people to mingle, identify vendors of interest, and quickly gauge the wait at the buffet lines. Most people we spoke to were there for very focused reasons, desiring to find solutions to specific problems as opposed to “just browsing.” This made it much easier to connect with engineers on individual issues for which we provide solutions.
We found a great deal of interest in our products from many different people. Nearly all engaged in detailed explanations of past problems they hoped not to repeat in future designs. We found a lot of interest in both functional verification and clock domain checking across the board. Interestingly, the level of detailed knowledge regarding CDC issues was definitely higher among engineers than in years past, indicating a workforce both more educated and more concerned about the potential issues this class of failures could inflict upon their designs. Many times we heard those magic words, “I’d like to evaluate your tool,” which is all the justification any vendor needs for attending a venue like this. So all in all, it was a very worthwhile show, and I hope we’ll be going back next year.
When I asked Karen Bartleson, Sr. Director of Community Marketing at Synopsys, what prompted the change this year and what benefits they saw, she said:
We understand that the world is made up of communities and our industry is no different. We wanted to bring value to our customers from their perspective – from within their communities of interest. Taking the concept to the logistical level, we developed the layout and color scheme to make it easy for customers to identify the communities of interest to them. Customer appreciation was, of course, the biggest benefit. They expressed delight in the new concept for our event and obtained valuable information to take away. Our partners, too, appreciated the opportunity to participate in the Designer Community Expo which has a high quality audience. It was a means for strengthening our relationships with our partners and hence strengthening the seven communities.
SNUG, true to its name, has become a very focused and intimate event for the design community. Karen told me that more people attend SNUG worldwide than any other event in our industry. We hope that the “Designer Community Expo,” with its success, will be extended to SNUGs at other locations.
Monday, April 5th, 2010
Add me to a growing list of EDA globetrotters because I spent five weeks of the first two months of 2010 traveling around the world, visiting India, Japan, Taiwan, Korea, China and France. It was an eye-opening experience and showed me that, while the world economy has not completely recovered, there is plenty of optimism and design activity in our semiconductor market segment.
During my travels, I found that chip design and verification seem to be on everyone’s mind. For example, many of the design teams I met with are starting new projects in the hot, hot, hot multimedia area to support high-definition TV. I talked with teams designing Blu-Ray and other high-definition disc players. Other consumer electronics areas are booming as well, as is the fast-paced networking and communications market.
In Asia, electronic system level (ESL) is in widespread use and, in Europe, STMicroelectronics still serves as the ESL early adopter and role model. It is also a leader in the move to transaction-level modeling through its efforts on the Open SystemC Initiative (OSCI) Transaction-Level Modeling Working Group. This standard is meant to enable interoperability between system models, intellectual property (IP) models and ESL design tools, and promote widespread acceptance of ESL.
Back in the United States and in meetings around Silicon Valley, I don’t see ESL adoption yet, though that may change as full designs in all market segments around the world are now at least 10 million gates. Moreover, individual blocks are topping out at between two and four million gates.
My travelogue continues with the worldwide challenges of verification. In the verification niche shared by Real Intent, a pioneer in automating formal technology for design verification, and EVE, a developer of emulation and hardware-assisted verification, 10-million-gate designs result in boundless opportunities.
In my roaming around the world, I discovered that many “nice to have” technologies have become “must have” verification tools in the design flow, in particular formal technology and emulation. That’s because design complexity is only increasing due to new features, added capabilities of existing products, and the need to get products to market faster. The added complexity brings forth isolated failure modes that demand specific technologies for the most efficient and effective verification, such as asynchronous clock domain crossing verification and timing exception verification using automated formal technologies. Equally attractive is emulation’s ability to be used across the entire development cycle, from hardware verification and hardware/software integration to embedded software validation. A new generation of emulators is capable of handling one billion or more ASIC gates at high speeds, making them a great choice for such huge designs. Pricing is more competitive, too.
As a member of the EDA Globetrotter Travel Club, I’ve recently had the chance to meet with semiconductor companies worldwide embarking on all sorts of new and exciting development projects. In almost all cases, their verification needs are real and, almost always, verification solutions are available for almost every need. I didn’t need to globetrot the world to learn that.