 Real Talk

Archive for 2010

Will 70 Remain the Verification Number?

Monday, June 7th, 2010

It’s that time of year again.  The design automation community is about to descend on Anaheim for the yearly conference.  The build up of anticipation, the buzz and the extra effort preparing for our booth have me pondering the topic of verification.

With verification consuming 70% of the design cycle, will 70% of the exhibitors at DAC this year offer tools to solve the verification challenge?  We will see.  While the percentage may not reach 70, I am confident that many companies will offer a variety of new, old or repackaged techniques, methodologies and tools for a verification engineer’s consumption.

With an abundance of options and choices, could verification tool categories make up 70% of all EDA tool categories? Hardware emulation is our space at EVE, and formal verification is Real Intent's. Add acceleration, assertions, debug, prototyping, simulation, testbench generation, TLM models, validation, functional qualification and static verification, and the list keeps growing, though it does not quite overtake the rest of the field.

Next are the attendees at this hallowed event.  One can’t help but wonder if 70% of attendees are verification engineers, given the mammoth effort to verify that a chip will work as intended.  Will 70% come from the U.S. or will we see some attendees from Europe, Asia and the rest of the world, as well?  What’s more, of this group, are they spending 70% of their time on the exhibit floor researching verification solutions and new technologies?  Or, for that matter, 70% of their CAD budget on verification tools?

And, lest we forget, does verification account for 70% of yearly EDA revenue? Not according to the EDA Consortium. In 2009, Computer Aided Engineering (CAE) contributed roughly 40% of worldwide EDA revenue, with the remainder coming from IC Physical Design and Verification, PCB and MCM, Semiconductor IP Products and Tools, and Services. Within CAE, adding up all forms of verification, such as logic, formal, timing, analog and ESL, the share does exceed 70%.
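
To put those percentages together, here is a back-of-the-envelope sketch in Python; the numbers are simply the approximate figures quoted above, not official EDAC data:

```python
# Back-of-the-envelope check: does verification reach 70% of total EDA revenue?
# The percentages are the rough figures discussed above, not official EDAC data.
cae_share_of_eda = 0.40            # CAE as a fraction of 2009 worldwide EDA revenue (approx.)
verification_share_of_cae = 0.70   # all forms of verification within CAE (approx.)

verification_share_of_eda = cae_share_of_eda * verification_share_of_cae
print(f"Verification share of total EDA revenue: {verification_share_of_eda:.0%}")
# Prints roughly 28% -- well short of 70%, even though verification exceeds
# 70% of the CAE category itself.
```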

Even if you’re not a verification engineer, verification must matter as SoC design sizes and complexity continue to outwit even the most sophisticated EDA design flow.  After all, the average design size is about 10-million ASIC gates, with individual blocks running between two- and four-million ASIC gates.  And, the push to get products to market is only increasing.

As DAC kicks off next week in Anaheim, the question is whether a company on the exhibit floor will have the breakthrough verification tool to crack the 70% barrier. Many will have software and hardware that help reduce the insidious verification challenges. Emulation, for instance, is emerging as a tool for debugging hardware and for testing the integration of hardware and software within SoCs ahead of first silicon. Stop by EVE's booth (#510) during DAC to see a range of hardware/software co-verification solutions, including super-fast emulation. You'll walk away with a greater understanding of ways to reduce the time consumed by verification, a handy reusable tote bag, and chances to win one of two iPads or a $100 Visa check card. Stop by Real Intent's booth (#722) to see how Real Intent's solutions bridge the verification gap in lint, CDC, SDC, DFT and X-prop verification. They are giving away some really good-looking and useful carabiner flashlights and carabiner watches.

A Model for Justifying More EDA Tools

Monday, May 31st, 2010

One of the overwhelming issues facing the EDA community is the need and desire to increase total sales. One of the greatest hurdles in the ongoing chase to get more seats is the inability to convert design software budget dollars into new seat licenses. Although most large companies have more than adequate dollars budgeted for software, less than a quarter of those dollars represent new tool acquisitions. The balance of the funds goes to maintenance, training, and management functions like parceling out the limited number of seats available.

The inherent value of EDA tools is to provide more automation for the design task, thereby increasing the individual engineer's productivity. As an example of the value of a tool, design-for-test tools reduce test-development time and improve fault coverage over manual methods to more than 90 percent of all faults. The tool leads to better test coverage of the design, resulting in a higher probability of catching the rare or random errors that make the system fail. So the tools simultaneously reduce engineering time and improve test quality by enhancing internal node observability and controllability. As an added benefit, the window into the internal nodes makes system debug and integration much easier, due to the availability of internal state data at the time of failure. So here an additional tool not only improves the risk-performance equation in its intended department, but also aids another group in performing the debugging work.

The EDAC work on ROI justification does a good job of addressing the investment parts of the equation. (See the presentation on the EDAC web page www.edac.org/EDAC/EDACHOME/) The problems with the standard financial models for return on investment (ROI), however, include the lack of a sense of time (ROI equals the average return divided by average investment) and the total lack of connection with the issues that most concern the engineering managers. The managers are most concerned with risk reduction, overall productivity, and net increases in total dollar sales, whereas the standard ROI measures only look at changes in the direct outputs from the investment. The greatest problem in approaching the issue from an investment perspective is the need to quantify the results from a change before the fact.

The EDAC analysis does a very good job of displaying the effects of delays in product release on costs and revenues, but suffers in this regard, because it requires the quantification of risk factors and clear estimates of productivity changes. These are exactly the values that people want to measure, but are also the most difficult values to determine.

In addition, the direct output of a new tool acquisition is a change in productivity, a metric that the engineering community abhors because it implies the design task is a quantifiable, fixed process and not the exercise in creativity and skill that engineers say it is. Therefore, attempts to assign weighting values in the financial analysis to adjust for productivity create a conflict for the person who will be reporting the numbers. A dramatic increase in productivity implies a large part of what the engineer does can be replaced by a piece of software. A small increase or a decrease in productivity implies the tool is not of great value. Neither of these results is desirable for the EDA community or for the engineer reporting the numbers.

One reason that the financial model breaks down in the ASIC world is that the return on investment depends on more than just the engineering department's efforts. External factors like market position, pricing, profitability, and product features are all part of the return portion of the equation, but these factors are not under the control of the EDA tool purchase decision maker. The overall history of ASICs has been, unfortunately, that although over 90 percent of all ASICs pass customer specifications on the first pass, less than half go into production. If a new product doesn't go into production, the return on investment becomes a negative value that has no real relation to the measured productivity parameters.

Another reason that the basic financial models break down is the need to factor in some adjustment for risk. The relative productivity changes, as difficult as they are to measure, are much easier to quantify than risk reduction, because the level of risk may have no correlation with any dollar amount. The addition of a tool may increase risk due to the downtime needed to learn the tool, or it may cause a large enough change in the overall design methodology to expose other missing links in the tool chain. On the other hand, an incremental tool change can reduce risk by enabling a more complete exploration of the design space, thereby helping to ensure a successful product design. Risk reduction and productivity improvement are probably the most difficult parameters to quantify in assessing the value of a new tool, and the traditional financial analyses only point out the inability to predict a virtually unmeasurable future result.

New model

As an attempt to address some of the other issues in the valuation of tools, here is a simplified model that combines traditional financial items like return on investment with some concepts from time-to-market analysis. The traditional inputs for ROI are the costs of the tools and the savings (in time and money) that result from them. The new model also incorporates the estimated reduction in end-item unit volume and ASP for every month the product release is delayed from the best-case schedule. Despite the earlier statement that productivity and risk are hard to quantify, the model generates an ROI number and also provides a means to evaluate a number of scenarios to bound the relative risk.

The model is in an Excel workbook with three worksheets. The assumptions and variables are entered into the first worksheet, called "Inputs". This worksheet passes the data to a second one for cost, ROI, and productivity analysis. The final sheet shows the time-to-market effects of the tool purchase, in terms of total design costs, size of market, and product sales. The effects of new tool purchases show up in the "Impacts" worksheet, where relatively small changes in product development time have a significant effect on the company's sales numbers. The contributions to the bottom line involve too many variables for a general analysis, but they are readily available for more detailed analysis within the company doing the design.
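
For readers who prefer code to spreadsheets, here is a minimal Python sketch of the same structure. The three functions mirror the "Inputs", cost/ROI and "Impacts" worksheets; all field names and formulas are illustrative stand-ins for the workbook, not a copy of it:

```python
from dataclasses import dataclass

@dataclass
class Inputs:
    """The 'Inputs' worksheet: assumptions entered once (illustrative fields only)."""
    tool_cost: float                   # license plus maintenance for the new tool, $
    engineer_cost_per_month: float     # loaded cost per engineer, $/month
    engineers: int                     # engineers on the project
    months_saved: float                # estimated schedule reduction from the tool
    peak_unit_volume: float            # best-case lifetime unit sales
    asp: float                         # average selling price per unit, $
    volume_loss_per_month_late: float  # fraction of lifetime volume lost per month of delay
    asp_erosion_per_month_late: float  # fractional ASP decline per month of delay

def traditional_roi(inp: Inputs) -> float:
    """The cost/ROI worksheet: engineering savings relative to the tool investment."""
    savings = inp.months_saved * inp.engineers * inp.engineer_cost_per_month
    return (savings - inp.tool_cost) / inp.tool_cost

def lifetime_revenue(inp: Inputs, months_late: float) -> float:
    """The 'Impacts' worksheet: product-life revenue as a function of release delay."""
    volume = inp.peak_unit_volume * max(0.0, 1.0 - inp.volume_loss_per_month_late * months_late)
    price = inp.asp * max(0.0, 1.0 - inp.asp_erosion_per_month_late * months_late)
    return volume * price
```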

All of the inputs for the analysis are available on the first page; they are the details you will need to get from the customer. The values are linked into the following sheets as variables in fairly simple equations. The pages are protected only to keep the formulas intact. If you find a better algorithm for the cost/benefit evaluation, feel free to modify the spreadsheet by turning protection off and making your changes.

Note that the "Costs" page shows fairly small changes in productivity and a negative ROI for most cases. This is the problem with the traditional measurements: one can't always find much in the way of good news in productivity or ROI from a standard analysis. Only if a new tool makes a sufficiently large change in productivity does the ROI eventually turn positive.

By combining the cost data and the effects on total product-life revenues, the model provides a means of identifying the total influence a tool purchase has on the company's revenues. In the "Impacts" worksheet, we observe the effects of tool purchases on the release of the target IC. By adjusting costs and delays, a user can also get an estimate for the end-of-life point, the cross-over at which a late introduction pushes revenue below some threshold value.
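
Continuing the sketch above, the end-of-life cross-over can be estimated by scanning the delay axis for the point where projected product-life revenue drops below a chosen threshold (the numbers here are placeholders, not customer data):

```python
inp = Inputs(tool_cost=150_000, engineer_cost_per_month=20_000, engineers=8,
             months_saved=0.5, peak_unit_volume=1_000_000, asp=12.0,
             volume_loss_per_month_late=0.08, asp_erosion_per_month_late=0.03)

print(f"Traditional ROI: {traditional_roi(inp):+.0%}")  # small productivity gain -> negative ROI

threshold = 3_000_000   # minimum acceptable product-life revenue, $
crossover = next((m for m in range(0, 37) if lifetime_revenue(inp, m) < threshold), None)
print(f"Revenue falls below the threshold after {crossover} months of delay")
```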

For some scenarios, this cross-over point falls before the design is even completed, and it is therefore a useful early indicator that a design program should be stopped, rather than expending resources on a money-losing proposition. If the EDA tool can help a company recover from this situation, then the tool truly is of much higher value to the user than just the change in productivity or some ROI number. The value of the tool might be the salvation of a company.

Mind the Verification Gap

Monday, May 24th, 2010

Would you ever use a wrench to tighten a Phillips screw? Or hammer a square peg into a round hole?

Chip design today has become more of a verification task than a design task. Designers spend more than 50% of their time trying to come up with ways to verify their designs or, worse yet, someone else's design. Despite the change in the nature of the work, designers keep using the same old design tools, hammering away trying to close the design-and-verification gap. Shouldn't you Mind the Gap?

Over the past decade or so, design work has shifted from writing code to verifying IP and existing code. Most designers today are tasked with taking a piece of IP designed by someone who may no longer be at the company, or a design so old that the original designer does not remember the details, or IP your company bought from a third party, and trying to make it satisfy the spec. All is well until you realize that the changes you made to the code have left holes in the functionality that are not covered by the original vectors you got with the IP or design. In turn, the changes result in unintended consequences that you could not have predicted from the IP or design spec. The issues only magnify once you put all the IP blocks together.

Well, that's exactly what happens when you try to hammer a Phillips screw into place. Step back and take a good look at the techniques you use today! Are you still using the same simulation methods? Are you still relying on LEC to catch some of the problems? Are you tossing the verification work over the wall to the verification folks and calling it a day – that's their problem (until it comes back to you with an embarrassing bug!)?

Over the last decade, design teams have added linting to their flow. EDA vendors extended linting to cover ever more exotic checks. The tools helped managers become a design IRS and gain a little more visibility into the quality of the design. But the verification tasks did not get any easier, nor did design quality improve as much as promised. Most designers used these tools only as a checklist. The unintended consequence was the amount of extra work spent deciphering linter reports. The problem is that this activity often has low ROI because of the noise, the difficulty of setup, and the burden of managing yet another set of files and results.

Even though designers are finding themselves doing more verification work than design, the tool of choice is still basically a big hammer (i.e. the simulator). Linters so far have helped managers more than the designers in the trenches.

It is perhaps time for more finesse and a bit of strategy. Next-generation tools can help designers better strategize their work and better target their simulations. With targeted simulation and on-the-fly functional checking, designers can look deeper into the design and make sure they have not overlooked potential bugs.

What tools can help in this process? Is it time to rethink strategies and retool? Perhaps it is time to address the Design and Verification Gap. This means marrying verification and design activities together and starting verification essentially at the outset. Perhaps it is also time to go beyond traditional simulation, linting and other traditional verification techniques. Verification needs to move hand-in-hand with the design. Early verification will not only increase productivity and ROI, it will also push designers to cover as many functional scenarios as possible. Next-generation tools must also offer simple setup and super-fast analysis runtimes so they can incrementally check the design, help designers target simulation, debug the design on the fly, and provide feedback on the potential holes left in the design as a result of recoding or other changes.

As your designs grow and you include more IP, your verification tasks will certainly grow. Be sure to Mind the Verification Gap.

ChipEx 2010: a Hot Show under the Hot Sun

Monday, May 17th, 2010

May 4th, 2010. Airport City, Israel. The weather forecast promised rain, so we all came dressed for a storm, only to find a big sun smiling above us! It turned out to be a very hot and dry day. And it was the day of ChipEx.


“ChipEx, what is that?” you might ask. Don't feel bad for not knowing; it is only the second year that ChipEx has been in the trade show business.


ChipEx is an annual international event of the Israeli semiconductor industry, sponsored by TAPEOUT magazine in cooperation with the Global Semiconductor Alliance (GSA). ChipEx consists of three main parts – a vendor exhibition, a technical conference and a GSA executive forum. Given that Israel's economy fared better than many other western countries, and Israelis are known for their technical innovation (the World Economic Forum has designated Israel as one of the leading countries in the world for technological innovation; no surprise that all the latest Intel microprocessors were developed in Israel), the show was hot and sizzling with activity. More than 800 people participated, about 50 EDA and IP companies presented, and keynotes were delivered by industry heavyweights such as Rajeev Madhavan (CEO of Magma) and Gary Smith (industry analyst).


For the second year, Real Intent joined our Israel distributor Satris at ChipEx and we couldn’t be happier with our success. Our booth was hot with activity the entire time.  Besides normal networking among exhibitors and friends, many senior engineers and managers were hunting for the next generation technology and Real Intent has it! It all fits that innovative people are always seeking out innovative technologies that can help them stay at the leading edge.


You will also find that a trade show in Israel is a very different experience from other places in the world. People hardly want to hear any of the "marketing stuff" (often described in stronger words), and that's what they call our presentations, brochures and so on. Instead, people would step in and ask about the technology. You could hardly finish two sentences before the next question came in, as if to say – 'we have no time to waste, give us the highlights and we'll decide here and now if we want to hear more!' In most cases, people did not want to see a demo or a short presentation. If they were interested, they'd ask to see it on a follow-up visit. And this fits too, as innovative people usually have little patience for nonsense and are hot on the heels of the very best solutions.


We had over 40 visitors in total and many follow-up visits scheduled. We definitely hit the "Real" needs in the design community with our automatic formal verification solutions targeting early functional verification, clock domain crossing verification and timing exception verification.


At the end of the day, after all the grueling questions under the constant pressure of staying poised and technical, I was hot and tired! But it was well worth the effort, and it was also great fun for me to engage in intelligent conversation with smart people who have real needs.


Thanks to the Real Intent marketing team (even though I didn't use their "marketing stuff"), Satris and the ChipEx 2010 committee for a successful show under the hot sun!


See you next year!!

We Sell Canaries

Monday, May 10th, 2010

When someone asked me the other day what Real Intent does, I told him, only half in jest, that we make and sell canaries. If you think about it, the verification tools we develop are the proverbial canaries for the chip-design coal mine. They are sent in with the advance party to give early warning of bugs lurking in the chip. Used in this manner, our tools prevent late-stage blow-ups in chip functionality that can ruin profit margins and maybe even subvert an entire business model.


Talking about business models makes me think of start-up companies. It is very hard today to get a start-up venture funded if it has a significant chip design component in its development roadmap. This bias is not wholly without reason. Hardware design is expensive, and having to design your own chips makes it more so. While getting the product wrong the first time around is expensive for any start-up, it is especially so for a hardware company. If you need to reposition your hardware product or fix problems in it, that becomes all the more difficult and expensive if it involves redesigning a complex homegrown chip. The realization of the company's product concept, and indeed the entire business model, becomes a prisoner of the chip-design latency. You must get the chip right enough, quickly enough, to leave any wiggle room in the business model.


The risk is scary, but so is mining coal. Coal continues to be mined despite its risks and so must entrepreneurial initiative in chip design be perpetuated. As in coal mining, systematic processes must be instituted in chip design to mitigate risk. Accidents cannot be done away with, but can certainly be reduced in frequency.


One of the important technologies with the potential to significantly mitigate chip design risk is the application of pre-simulation static verification tools that target chip design errors in the context of specific failure mode classes. The technology has matured enough in the last decade to provide tangible value today. If I were evaluating a chip-design-heavy business proposal at a venture capital firm, I would certainly gate the funding on whether the founders have experience with static verification tools and have instituted their use as an integral part of their chip design process and roadmap.


Real Intent has been a pioneer in this space and provides pre-simulation static verification tools that address some of the key failure modes. Real Intent’s Ascent product family finds bugs in control-dominated logic without the need to write assertions or testbenches. Because Ascent tools perform sequential formal analysis, they can even identify deep bugs that take many clock cycles to manifest as observable symptoms. Our Meridian tool family finds bugs in the implementation of clocks and clock-domain crossings. These bugs result from a confluence of timing and functionality and can be so subtle as to require a specific combination of process parameters for them to materialize. If ever there was a canary for chip design, it is Meridian. Finally, our PureTime tool family finds bugs to do with incorrect timing constraint specifications. Like clock-domain crossing bugs, these bugs too arise from a confluence of timing and functionality. Real Intent continues to develop new tools of this ilk to target additional failure modes. Our goal is to help make chip design risk acceptable again.


The adoption of these tools is up to you. Do you have a canary in your design flow?

Celebrating 10 Years of Emulation Leadership

Monday, May 3rd, 2010

EVE is celebrating its 10th anniversary this year. It has been quite a ride for all of us associated with this industry disrupter out of Paris. Many of the same people from April 2000 are key members of today's EVE team and wouldn't have missed any of the excitement of these past 10 years.

Exciting, it’s been.  It’s especially gratifying to know that our basic assumptions that served as EVE’s foundation when we started the company have turned out to be right.  I am talking about taking a novel approach to hardware-assisted verification by selecting a commercial FPGA instead of designing a custom ASIC as the building block of the emulator.  Similarly, we prioritized speed of execution to address the hardware/software integration stage of SoC verification.

As for the rationale behind our first criterion, we concluded early on that custom silicon would not scale and would be excessively expensive to adopt for an overall market in the ballpark of $200 million. Redesigning a chip every two to three years at smaller and smaller technology nodes would be economically disastrous. We instead chose the best FPGA on the market and have continued to do so.

As for the second assumption, we thought that speed of execution should not be compromised, particularly if we wanted to move outside the traditional space of hardware emulation.

Over time, we have addressed all of the other important parameters that make an emulator a best-in-class tool. They include fast compilation, thorough design debugging and scalability to accommodate a large spectrum of designs, from a few million ASIC gates to a billion or more. Equally, we have addressed energy efficiency by reducing the emulator's footprint, power consumption and air-cooling requirements. We did all of this by devising an architecture that is simple, elegant and efficient, and, even more important, by developing stacks of unique software.

This focus on off-the-shelf FPGA parts and speed has paid off with installations at nine of the top 10 semiconductor companies and more than 60 customers. Our hardware emulator ZeBu is used to verify designs of almost every conceivable consumer electronic product.

The mention of ZeBu brings me to another point about our strategy –– how we came up with the name. Well, a best-in-class verification tool needs to support a best-in-class design … with zero bugs. Zero Bugs, ZeBu. Got it?

It's been a heady trip for the entire EVE team. You'll forgive us if our sense of pride seems outrageously boastful, but 10 years of solid achievement and growth is no small accomplishment. We look forward to the years to come, confident that we will continue the growth we have enjoyed in the past and enjoy today. And, more important, we will keep supporting current and future design teams with a best-in-class emulation system. Let's raise our glasses and toast ZeBu and the team behind it.

Imagining Verification Success

Monday, April 26th, 2010

EDA developers need to have very active imaginations. They need to imagine becoming their own end users. Sometimes they may become the designer, sometimes the verification engineer or perhaps even the design manager. This role play is essential for creating tools that designers will embrace; otherwise, those tools end up as one-tool wonders. For an EDA tool to become a regular part of a designer's tool chest, it needs a very high usability quotient, and role play is essential for creating that.


A tool's usage in a design flow can nominally be broken into three distinct phases: (a) setup, (b) analysis and (c) debug. Setup is required just once per design and should not be a very onerous step. Analysis is done within the tool and should be highly, if not completely, automated; it should require little user attention beyond tracking things like performance, memory size, and so on. It is the debug phase where users spend most of their bandwidth. They have to examine the output of the tool, combine their design knowledge with the tool's analysis data, and quickly identify and repair the source of any detected issues.


To accurately capture and automate this flow, developers need to imagine how users are going to interact with their tool. For example: does the tool present information in the terms and conventions of the language in use? Is the debugging output organized consistently with the design structure? Is the tool effective at propagating bugs to observable points in the design? Can the debug environment reconstruct the faulty behavior easily under user control? Effective organization of the output is essential to let users view the results in ways they can internalize easily.


By a large margin, debug is the major factor in a verification tool’s usability. An accurate understanding of the designer’s desires and needs is the most effective way of organizing the output in a clear, logical fashion.  Developer imagination is a key part of this effort, as is real customer feedback to gauge how effectively the goal has been reached.  Despite the availability of dedicated tools and methodologies for verification, users are spending a lot of time tracking down bugs that should have been easily caught and debugged. Sometimes, the only difference between failure and success is just a little imagination.

Do you have the next generation verification flow?

Monday, April 19th, 2010

I have been involved in verification projects for the last ten years. One thing I can say for sure is that the level of complexity is ever rising for both design and verification. With more and more ASICs being designed for compute-hungry consumer applications such as mobile consoles, time to market has become critical and there is zero tolerance for being late. At the same time, increasingly power-sensitive devices add to the functional requirements. The verification effort grows exponentially rather than linearly with respect to the added features. The reason is simple: when the feature list grows by 30%, true verification requires not only 30% more feature checking but also cross-feature verification. The higher demands coupled with shorter schedules make verification a challenging task.

In addition, globalization has brought about knowledge sharing as well as tough competition around the world. The rule of the game has become to deliver as fast as possible. As a result, verification professionals need to be constantly on the lookout for the right technologies to deploy in their verification flow in order to keep up with the pace of change and get ahead of the competition. A verification flow can become a competitive advantage for a firm.

Random-based, coverage-driven verification (CDV) is becoming an industry standard, and I believe that trying to deliver bug-free ASICs or FPGAs using directed testing alone is practically impossible. The problem with CDV is that it involves a huge amount of engineering effort to build all the verification environments. This effort is at least as complex as the design itself, and often even more so. It also requires dedicated teams specializing in environment development. The consequence is that the debugging process has longer iterations. Every time there is a test failure, two engineers, responsible for two different systems (the design and the verification environment), have to find out what went wrong, and only then can the mistakes be corrected. Bug-fixing turnaround time can reach days or even weeks.

This is why automatic formal verification becomes useful. Automatic formal gives the team a way to find many bugs much more cheaply, because they are found earlier and are therefore easier to detect, debug and fix. These tools can prove that the design is clean of many issues that can be difficult to find using simulation. These issues include dead code segments, logic equations that are implemented incorrectly and thus evaluate to a constant value, state machines that deadlock, pairs of state machines that lock each other up, and even clock domain crossing problems such as data stability issues and incorrect control and feedback implementations.

One might say that these tools have no understanding of the functional specification, but in many cases functional bugs are a direct consequence of these kinds of issues in the design. Another might say that most of these bugs can be found by well-designed verification environments. Possibly; however, the great thing about automatic formal verification tools is that they do not need any verification environment development, which saves a tremendous amount of time and effort. The concept is similar to lint tools: all you need is a short run script and the design itself. Therefore, when designers are done with coding, they can run these tools, get answers quickly, and debug on their own with a very short turnaround time. All this happens while the RTL is still being developed, and bugs found there are usually much easier to debug than ones found weeks or even months later in the project, when verifying at a larger scope.

On the whole, using automatic formal on top of CDV can save between 10% and 15% of the verification effort and give greater confidence that the design is ready for the next steps in the flow. In the competitive environment we live in, and with the amount of resources put into verification, this saving can make a big difference. Taking the time right now to ensure that you have the right tools in the right place in your verification flow can save you time and money, which equates to revenue for your company.


Do you have the next generation verification flow?

A Bug’s Eye View under the Rug of SNUG

Monday, April 12th, 2010

There is a lot of buzz about SNUG lately. It is not surprising: SNUG's traditional Tuesday night event went through a big transformation this year, and it was a big hit with participants. The original Interoperability Fair was transformed into the "Designer Community Expo," allowing Synopsys' partners and suppliers to gather in seven communities (IC Design, IC Verification, IP, System Design, FPGA, Custom Design and AMS Verification, and Compute Infrastructure) and present integrated solutions to Synopsys users.

Real Intent exhibited its automated formal verification product families at the SNUG Designer Community Expo, including Ascent, for early functional verification, and Meridian, for clock domain crossing verification. We also demonstrated the integration between our products and Synopsys' VCS simulator. Our engineers were all smiles when recounting the event. According to Jay Littlefield, Sr. Applications Engineer at Real Intent:

SNUG this year appeared to be very well-attended. The vendor show area was easily 4-6x the size of DVCon's, and had a much more open floor plan. This made it easy for people to mingle, identify vendors of interest, and quickly gauge the wait at the buffet lines. Most people we spoke to were there for very focused reasons, desiring to find solutions to specific problems as opposed to "just browsing." This made it much easier to connect with engineers on individual issues for which we provide solutions.

We found a great deal of interest in our products from many different people. Nearly all engaged in detailed explanations of past problems they hoped not to repeat in future designs. We found a lot of interest in both functional verification and clock domain checking across the board. Interestingly, the level of detailed knowledge regarding CDC issues was definitely higher among engineers than in years past, indicating a work force both more educated and more concerned about the potential issues this class of failures could inflict upon their designs. Many times we heard those magic words, "I'd like to evaluate your tool," which is the justification any vendor needs for attending a venue like this. So, all in all, it was a very worthwhile show, and I hope we'll be going back next year.

When I asked Karen Bartleson, Sr. Director of Community Marketing at Synopsys, what prompted the change this year and what benefits they saw, she said:

We understand that the world is made up of communities and our industry is no different. We wanted to bring value to our customers from their perspective – from within their communities of interest. Taking the concept to the logistical level, we developed the layout and color scheme to make it easy for customers to identify the communities of interest to them. Customer appreciation was, of course, the biggest benefit. They expressed delight in the new concept for our event and obtained valuable information to take away. Our partners, too, appreciated the opportunity to participate in the Designer Community Expo which has a high quality audience. It was a means for strengthening our relationships with our partners and hence strengthening the seven communities.

SNUG, true to its name, has become a very focused and intimate event for the design community. Karen told me that more people attend SNUG worldwide than any other event in our industry. We hope that the "Designer Community Expo," with its success, will be extended to SNUGs at other locations.

Globetrotting 2010

Monday, April 5th, 2010

Add me to a growing list of EDA globetrotters because I spent five weeks of the first two months of 2010 traveling around the world, visiting India, Japan, Taiwan, Korea, China and France. It was an eye-opening experience and showed me that, while the world economy has not completely recovered, there is plenty of optimism and design activity in our semiconductor market segment.

During my travels, I found that chip design and verification seem to be on everyone's mind. For example, many of the design teams I met with are starting new projects in the hot, hot, hot multimedia area to support high-definition TV. I talked with teams designing Blu-Ray and other high-definition disc players. Other consumer electronics areas are booming as well, as is the fast-paced networking and communications market.

In Asia, electronic system level (ESL) is in widespread use and, in Europe, STMicroelectronics still serves as the ESL early adopter and role model. It is also a leader in the move to transaction-level modeling through its efforts on the Open SystemC Initiative (OSCI) Transaction-Level Modeling Working Group. This standard is meant to enable interoperability between system models, intellectual property (IP) models and ESL design tools, and promote widespread acceptance of ESL.

Back in the United States and in meetings around Silicon Valley, I don't see ESL adoption as yet, though that may change as full designs in all market segments around the world are now at least 10-million gates. Moreover, individual blocks are topping out at between two- and four-million.

My travelogue continues with the worldwide challenges of verification. In the verification niche shared by Real Intent, a pioneer in automating formal technology for design verification, and EVE, a developer of emulation and hardware-assisted verification, 10-million-gate designs result in boundless opportunities.

In my roam around the world, I discovered that many "nice to have" technologies have become "must have" verification tools in the design flow, in particular formal technology and emulation. That's because design complexity is only increasing due to new features, the added capabilities of existing products and the need to get products to market faster. The added complexity brings forth isolated failure modes that demand specific technologies for the most efficient and effective verification, such as asynchronous clock domain crossing verification and timing exception verification using automated formal technologies. Equally attractive is emulation's ability to be used across the entire development cycle, from hardware verification and hardware/software integration to embedded software validation. A new generation of emulators is capable of handling up to a billion or more ASIC gates at high speed, making them a great choice for such huge designs. Pricing is more competitive, too.

As a member of the EDA Globetrotter Travel Club, I’ve recently had the chance to meet with semiconductor companies worldwide embarking on all sorts of new and exciting development projects. In almost all cases, their verification needs are real and, almost always, verification solutions are available for almost every need. I didn’t need to globetrot the world to learn that.
