February 23, 2004
Where the Rubber Meets the Road
Peggy Aycinena - Contributing Editor

Posted anew every four weeks or so, the EDA WEEKLY delivers to its readers information concerning the latest happenings in the EDA industry, covering vendors, products, finances and new developments. Frequently, feature articles on selected public or private EDA companies are presented. Brought to you by EDACafe.com. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us!

FPGA Tools - Part II

Ping Chao at Cadence Design Systems, Inc.

John Isaac at Mentor Graphics Corp.

Industry News

Coming soon to a theater near you


Bits & Bytes

In Conclusion

There's just so much to talk about in high tech, and really so little time. This week there's a doubling up - a tripling up, actually - of topic material. We start with Part II of an on-going conversation about the tools for FPGA design.

We then move to a conversation with Ping Chao of Cadence Design Systems, speaking about his vision for the company and his new role there. Saving the best for last, we end with a really interesting set of comments from John Isaac of Mentor Graphics, talking about boards, system design, and all that jazz.

So please go get that cup of coffee, sit back, relax, and read on.

Getting More than You Paid For - Part II

There are actually two chapters in Part II of this discussion of the tools for FPGA design.

The first chapter includes several Letters to the Editor in response to Part I. As you will see, people are expressing some pretty strong and well-articulated sentiments about the current FPGA tools conundrum.

The second chapter includes comments from two tool vendors, one specifically targeting the FPGA market, and three FPGA vendors. As you'll see from their discussion, despite the hypothesis that EDA vendors are inhibited from competing in the FPGA space because of the quality of the tools and pricing offered by the FPGA vendors themselves, the facts of the matter are actually quite a bit more complex.

But first, a word from Xilinx:

Tamara Snowden, Public Relations at Xilinx - “Xilinx President & CEO Wim Roelandts recently gave a talk to a large group of customers. He assured them that fully 50 percent of our resources are spent on software development and support. He said that nobody can use our parts without superior software, and that we're providing our customers with that software. Xilinx is putting as much investment into our software tools as we put into our hardware engineering. That's because the tools are so critical to the design flow.”

FPGA Tools Part II - Chapter 1

Letter No. 1 -

FPGAs and the Beauty of Not Being Cadence/Mentor

The current debate over EDA for FPGAs seems to be built on a few interesting assumptions, echoed by Gary Smith and the other pundits who preside over the design tools landscape. First is the notion that the FPGA vendors 'give the tools away.' Of course, they don't give these tools or anything else away - they simply see the tools as the necessary, enabling technology to drive chip sales. This has made sense because highly optimized FPGA fitters, driven by rapidly evolving platform architectures, have been the only way to get designs into these programmables. A dedicated FPGA tool chain works fine when the FPGA is used as a container for glue logic. But, what happens when Xilinx and Altera start commoditizing large-scale programmables (they have) and offer these as a delivery platform (they have) - the so-called SoPC (system on programmable chip) concept?

Even if FPGA design ended 'between the pins' (and it certainly does not), building real systems and getting them into these devices remains a 'black art' and tantalizingly beyond the reach of many engineers. Embedding systems requires some pretty sophisticated tools and techniques and, to date, neither the FPGA vendors nor the EDA giants seem to be willing to tackle this ogre. The reality is that engineers will need to exploit these devices. If you're part of a large team, the skills are probably on tap. Maybe you're already a seasoned ASIC developer? Maybe Verification is your middle name? But, what if it isn't?

Here's another reality. Engineers have been building complex systems since the birth of the MPU. They do this by sticking down and wiring components. They're working with millions of gates, but the gates are neatly, safely packaged. So why can't this same model work for FPGAs? That is exactly the business that Altium will make from this 'dumb market' (courtesy of Gary, again). Our view is that we can integrate system-level development with FPGAs - we call this 'Board-on-Chip'. We provide a single environment that gives designers access to pre-packaged IP (processor and peripheral device cores), embedded compilers and debuggers, synthesis and virtual instruments - even a vendor-independent development board with swappable Altera and Xilinx FPGAs. Designers are able to build real systems, in real time, with the skills they have today and take full advantage of these new 'platform capable' FPGAs. And, our customers are willing to pay for this solution.

Bruce Edwards

Executive Director

Altium Ltd.

Letter No. 2 -

Very nice article on FPGA design tools. It's true that the major FPGA players in that market have 'poisoned the well' concerning setting the value for the technology they're licensing for design. It's also true that the multi-million dollar mask costs at 90 nanometers and beyond will change the economics of semiconductor product development from 'boutique' fabless companies marketing to a niche market, to companies and markets that can reasonably assure high production volumes to offset the increasingly higher product development NRE costs.

The major problem, as I see it, is that the technical nature of FPGA EDA tools makes them 'hard wired' to a particular firm's hardware architecture and products. Accordingly, since the only FPGA devices that Altera, Xilinx or Actel's respective EDA tools will work with are their own, they've fallen into the 'hole' of giving them away to get the design win ('socket,' they call it) and the future silicon revenue.

The second problem is that the sales channel for the majority of FPGAs is through distributors and manufacturers' reps. The majority of their sales forces are experienced in selling components, and are compensated on the dollar volume of FPGAs shipped. As a result, they have no vested interest, nor the training and selling skills, to create value in the EDA software and demand a premium price.

Frank Perdue said in his advertisements (East Coast), 'It takes a tough man to make a tender chicken.' Likewise, it takes a tough sales rep to say 'No' to free software when they're compensated on silicon shipments.

If you go back in history, Cadence had a program in the early-to-mid 1990's code named 'Figaro.' It was their attempt to 'corner' the back-end place-and-route FPGA market. It was not successful, and when Cadence 'end-of-lifed' the program, one major semiconductor company reversed their strategy for entering the FPGA market at that time - despite having successful and high-performance sample parts already in-hand - when they realized the magnitude of the development effort (and cost) associated with the EDA tools for the products.

Now, my understanding is that the number of software developers for EDA tools at the major FPGA suppliers far exceeds the number of hardware developers for products in those companies.


Letter No. 3 -

I enjoyed the article on EDA tools for FPGAs. I wrote a related article for FPGA Journal a few weeks ago on the same subject: To Buy or Not to Buy - Will FPGA Designers Pay ASIC Prices for EDA Tools?


While the It's-the-FPGA-Vendors'-Fault argument is often cited by EDA vendors for the current pricing model, the problem is more complex than that. The issue goes to the number of FPGA vendors playing in the market, the number of potential customers, and the extremely diverse demographics of the FPGA design community compared with the ASIC community.

FPGA Journal did a survey of FPGA design projects that highlighted this diversity with an indication that a very small group of designers (about 27% of the total) account for 97% of FPGA-related spending. This group of designers has budgets similar to those of a typical ASIC design team. (In fact, they often ARE a typical ASIC design team.)

The remaining 73% of FPGA designers rely heavily on free tools, and range from people doing hobby projects in their garage, to students working on degree program projects and small start-ups developing prototypes. This type of user simply does not show up in the ASIC space because the cost of entry is too high.

As a result, looking at the FPGA market as a 'whole' and making assumptions like 'FPGA designers expect free tools' is grossly misleading. The small group that spends the lion's share of the money on FPGAs is both willing and able to spend money on tools that can earn their keep. The remaining three-quarters of the FPGA designers skew the results and paint a false picture.

Tools supplied by the FPGA vendors are still widely acknowledged as incomplete and inferior in quality and performance to the top EDA-supplied tools. While the vendor tools may be sufficient for 80% to 90% of users, there is still the potential for a robust market for EDA tools in FPGA. The remaining percentage of FPGA designers is still probably similar in size to the entire ASIC community.

[Meanwhile], as discussed in the article, one big problem faced by EDA vendors is their own distribution channels - I won't repeat the argument here.

Kevin Morris


FPGA Journal

FPGA Tools Part II - Chapter 2

David Stewart, CEO at CriticalBlue - “The first important thing to remember is that the complexity of FPGAs has now reached a point where a lot of traditional methods for ASICs or SoCs now need to be applied to FPGAs. Originally, they were pretty much a push-button thing - you ran it through a synthesis tool, through place and route, and you were finished.”

“Now it's a lot more complicated, with hierarchical [design] and floorplanning and timing issues all part of the process. Many companies that used to do complex ASIC designs, and are used to spending reasonable amounts of money on the tools, are now converting exclusively to doing large FPGAs. [It remains to be seen if] they'll bring the same kind of money to those FPGA projects.”

“I think that we're either at, or fast approaching, a discontinuity. We have the simpler FPGA tools, which don't justify more than $10,000 or so. But now, at 10 million gates and beyond - because of the need for floorplanning, implementation, hardware/software levels of design rather than just RTL - companies understand the need for additional sophistication today.”

“Today it's at the top end of the FPGA market where we need to use new approaches. [Meanwhile], there are now embedded processors actually on the chip. So we see a market opportunity to provide hardware co-processor accelerators to co-exist with the main processor on the chip, as well.”

“I think the challenge that large EDA vendors have is in the marketplace, because of the profile of the FPGA users. Generally, their tool budgets are significantly lower than those for ASIC or SoC designs. For the vendors, it's one thing to be able to demonstrate the project value of their tools, and it's something else to be able to recognize that value [through real sales].”

“We're not pursuing FPGA customers right now, although we do support FPGAs in our prototyping. We are looking at attacking the stand-alone FPGA market, but I think we'll have to come up with some sort of business model to get the dollars that we believe are commensurate with [the effort that would require].”

Jeff Garrison, Director of Marketing for FPGA Products at Synplicity, Inc. - “The FPGA vendors basically want to sell silicon. They want to make it as easy as possible for their customers to start a design. They want to provide a complete package with simulation, synthesis, place and route, [and so forth]. They've been doing that for years, and for years we've competed with that. The segment of customers that we serve needs both performance and maximum cost effectiveness from their FPGA tools. The FPGA vendors are only really interested in providing a basic package to their customers.”

“Synthesis and place-and-route have a direct impact [on FPGA vendor revenues], so FPGA vendors are participating there. In that, to a certain extent, they're in competition with the EDA industry, although they haven't really participated in the simulation [market].”

“But, things are changing. Now there's a lot of value in providing the best FPGA synthesis tools. The FPGA vendors know that if you can fit your design into a smaller device, you can provide a huge return to your customers. Customers who use Synplicity tools [to achieve this] are seeing that the tools pay for themselves quickly. And it's that segment of FPGA customers - the ones looking for performance and cost effectiveness - who are willing to pay for the best tools. As FPGAs get bigger, our value [as a tool provider] becomes even more significant.”

“If you look at the Xilinx website, you'll see that they're saying that their silicon, Synplicity's tools, and their back-end tools provide the best-in-class for projects. Altera also recommends Synplicity's tools and then the Altera back-end tools.”

“Clearly, there really aren't any [third-party] alternatives at the back-end. Place and route for an FPGA is tightly tied to the architecture of the device itself. So the FPGA vendors provide their own back-end tools to their customers. The EDA companies aren't able to, and don't try. Our tool does do some routing and adds some additional performance, but doesn't do place and route.”

“We feel that the FPGA vendors re-evaluate tool offerings all the time. From their perspective, they want their silicon to look as good as possible. Quality comes from the silicon architecture and the tools used to produce the silicon - specifically synthesis and place-and-route tools. They're constantly evaluating where they stand with respect to their competition in this complete flow.”

“The FPGA vendors do give away their tools for a segment of their customers. But those customers are the ones who aren't worried about performance or cost, the ones doing small mediocre designs. However, [FPGA vendors giving away the tools for free] is the barrier that the big EDA vendors see to entering the market. We see that barrier as well, but we also see that the number of FPGA designs is growing, where the number of ASIC designs is coming down. We know for the newest devices that, frankly, it's hard for FPGA vendors to provide all the tools that their customers need.”

“[Meanwhile], I don't think that the big EDA vendors have their sales channels set up for the FPGA market. They're set up for a small number of large customers, those customers who spend millions of dollars on tools. At Synplicity, we understand the FPGA market - we're set up for a smaller number of sales transactions.”

“If you look at what it takes to do 130-nanometer full ASIC designs, you've got to have $150 million in revenue on this product to even think of justifying doing a custom ASIC. How many designs are there out there today justifying that kind of volume? The number of ASIC starts is shrinking from [upwards of] 10,000 some years ago, to 2,000 or less today.”

“On the other hand, in the FPGA and structured ASIC segment, those design starts are going up. There are always going to be ASICs for the high-volume, high-performance market segment, but ASIC guys can [think about] re-inventing themselves as FPGA guys. An ASIC designer has always had to deal with VHDL and Verilog, which is the front step to FPGAs, as well. Xilinx and Altera both offer training with regards to their architecture, their IP, and how to design within their environments. Any knowledgeable HDL person can understand the technology.”

“So yes, the FPGA vendors are putting a lot of effort into developing tools, but they're offering a silicon product which is worthless without tools. They see the tools as a critical part of their business, and they want some level of control over the design process.”

“As far as EDA players are concerned, however, there's a lot of room for EDA providers to provide additional value in the FPGA market in areas like debug and physical synthesis that will fit around the core FPGA design. There's a market for tools that will allow FPGA designers to get more productivity out of their efforts. At Synplicity, we've been able to compete successfully against the free tools and make a very nice living out of it. Going forward, as we continue to customize our software in response to each new major architecture announced by the FPGA vendors, our advantage will only increase as the FPGA designs get bigger and bigger and more complex.”

Saloni Howard-Sarin, Director of Tools Marketing at Actel Corp. - “Our front-end design flow depends on contributions from a number of vendors, while the back-end flow continues to be made up from our own tools for physical design, and verification and logic analysis. At Actel, our strategy is to provide our customers with the best tools. Unlike our competition, what we do is look at the existing [third-party] vendors, and see if we can leverage their tools [on behalf of our customers].”

“[It's true], FPGA vendors don't like paying high prices for EDA tools. However, the only area where we like to use our own tools, which is the case for most FPGA vendors, is in the back-end, in place and route where it requires a deep, clear understanding of the architecture. In any other part of the flow, we can't compete with companies like Magma, Synplicity, Mentor, or SynaptiCAD.”

“We've never actually had a complete toolset that we developed and supported in-house. In the simulation case, we've always used Mentor's ModelSim from the beginning. In synthesis, we did at one point have an offering that was our own - but eventually decided to go with others.”

“As the FPGA market has changed, as the devices have gotten bigger and bigger - Actel's devices are now at 2-million gates - a whole different set of problems has [emerged]. You have to look within the product lines of the third-party tool vendors and ask, 'Who has reasonable offerings and do those products offer what our customers need?' Then, you have to OEM the tools [you select].”

“We do two or three major product releases a year. With each release, we include the latest releases of the tool technology, as well. We're constantly re-evaluating. We say to the vendors that we need this or that tool - or that we need an entirely new methodology - as we try to figure out what our customers need.”

“We want to provide a one-stop shopping [experience] for our customers. Our Libero environment is a wrapper around the other tools, a dashboard with a flow that walks you through every phase of your design. It runs about $2,500. If you add in physical synthesis, it's about $3,500. We like to say that we're offering ASIC-level tools at FPGA prices.”

“Currently, the FPGA vendors are the channel for the third-party EDA vendors in the FPGA space. FPGA customers run the gamut today - from huge companies to small mom-and-pop shops. So ease of use is very important, and much more an emphasis for the tools than in the ASIC industry.”

“Today, ASICs have the tools and the methodology to address a huge number of gates. But as FPGAs grow, those technologies are going to have to come into the FPGA space, as well. Verification is very important for ASICs, but FPGAs have reprogrammability. FPGA customers are more used to the luxury of knowing that if it doesn't work, it can be reprogrammed. Some people see that as a weakness in the technology. We choose to see that as a strength.”

“If I was in an EDA vendor's shoes, I wouldn't be wildly enthusiastic about the FPGA space either. But if you look at the economics of ASICs versus FPGAs, the FPGAs are [very attractive]. Outside the United States, FPGAs are on an even steeper ramp [than they are in North America]. China and Japan are getting into FPGAs faster than anywhere else in the world and we're seeing tremendous growth there.”

“But changing the business model is never easy. The big EDA vendors are used to calling on just one or two large accounts. They've got to change their sales forces and their philosophy. Meanwhile, the prices for FPGA tools are so much lower, it's really difficult for them. But the FPGA industry is hungry for tools and for models that will make the tools work. Whether it's through the FPGA vendors, or through the EDA vendors, there's an extraordinary opportunity there for anyone who sells good tools.”

Chris Balough, Director of Software and Tools Marketing, and Jim Smith, Director of EDA Vendor Relationships at Altera Corp. -

Balough: “Altera is in the time-to-market business. The whole idea of reconfigurable, reprogrammable devices is what people come to FPGAs for. We offer time-to-market power and service those customers who want the best results. Altera can provide outstanding results no matter what tools a customer uses.”

“However, we can also offer the option of a total flow to the customer who is starting from scratch with FPGAs. The key aspect here is the relationship between our core architecture and the tools. It's artificial to try to separate those two things, because it's truly an integrated technology.”

“Our focus will always be on synthesis, specifically, and on place and route - that will never end. They are both vital for integrating our FPGA architecture. We are currently making announcements that offer the first revolutionary change in the logic fabric and structure in 5 years. These developments are only possible because we can model and test a variety of different architectures, which in turn is only possible if we have excellent tools in-house.”

“In an FPGA company, the FPGA fabric is highly influenced by the software engineering organization. The ability to design in logic in a cost effective manner means that the tools play a very vital role here.”

“As far as our structured ASIC products are concerned - these tools offer an outstanding and unique value proposition. We're the only ones who offer an FPGA with an equivalent structured ASIC. You can target your design for an FPGA, get the design out into the field, debug it in the field, and once it's proven and working, you can use a wizard to implement the design in a structured ASIC. We believe the structured ASIC [market] exists - we've got real customers there. One reason that we're seeing such a huge interest in our structured ASICs is, there's nowhere else you can go to get an FPGA and then have the ability to put it into a structured ASIC. It's a low-risk proposition.”

Smith: “We put our focus and R&D focus on those things that are specific to our architecture, specifically in the place-and-route area. We also work closely with third parties to address the front-end aspects of design. So you can see, from RTL on down is where we put the focus of our tools. Above and around RTL, we work with third parties.”

“We don't see any change in that going forward. The fitter is unique to our device architecture. It's critical to have a very good understanding of the architecture to get the best optimization possible with the fitter, so naturally that technology will stay with us. Synthesis, however, plays a role between the architecture and the customer's design and place and route. So, a complete synergy needs to be there. We are working very closely with third-party synthesis vendors to enable them to provide the best solutions possible as we roll out each one of our new architectures. We're willing to work with anyone, in fact, who can provide value to our customers.”

“With regards to the issue of system-level design, we're interested in developing implementations there and will work with third-party vendors to solve those system-level issues.”

“[Meanwhile], on the verification side, the devices are getting more complex and the number of gates, memories, I/Os are increasing. In all of those aspects, we work with all of the big EDA vendors. We're also looking increasingly at the smaller players in EDA. We've got over 14,000 customers - we touch almost every FPGA designer out there - so it's important that we touch every tool that our customers use.”

Steve Lass, Director of Software Product Marketing at Xilinx, Inc. - “Our decisions depend on how good the third-party tools are. But, due to competition, we have to sell our tools fairly inexpensively. We basically have a front to back-end shrink-wrapped solution, as does our main competitor - there's a certain amount of competition in providing the tools. Our foundation tools run about $2,500 for a time-based license that comes with support.”

“Why would anybody need to look any farther than our tools? Well, we're doing the tools for inside the FPGA. Our main decision-making criteria are the quality of results and the performance of our FPGAs. Clearly we want to have control over that. Something like a schematic editor, or a board tool, or a signal integrity tool for verification would not impact the quality of results with respect to using our products, because our quality of results mainly depends on clock speed.”

“Lately, we've been evaluating third-party offerings every several months. We actually have strong partnerships with a number of third parties. And, there have been some changes lately. Some of the EDA companies are deciding right now whether they want to be in FPGAs or not be in FPGAs. What I'm hearing is that they're not really that interested. But, we do monitor the situation frequently, watching start-ups in the industry like Hier Design.”

“If we were to see [higher levels of dedication] from the EDA vendors, and we didn't have this competitive issue, things might be different. Clearly, if our competitors are offering something essentially for free, but we were to point our customers to third-party vendors, then basically we would lose.”

“We aren't forcing our customers to stay with us - they can go to another tool vendor if they want to. Sometimes customers actually like the third-party tools. Synplicity's tools offer a great design environment and debugging capabilities, and they offer independence.”

“Synplicity is shifting to structured ASICs. It's not clear right now if that's good or bad. On the bad side, it provides our customers an easy way to move over to structured ASICs. On the good side, a lot of people want to use FPGAs, but at higher volumes want to go to ASICs. If you provide them with a tools flow which allows them to get to that situation, then that's a good thing. It falls on both sides right now.”

“Currently, at the system level, we don't think there are any tools out there really doing a good job. System level hasn't been hit yet, although it does seem like a good opportunity for third-party vendors. The FPGA vendors haven't gone into that area yet.”

“[As far as the state of the art for the FPGA tools today], if you're talking about plain vanilla synthesis, our tools do a very good job. They could be better, of course, but right now they're very reliable. As far as the back-end mapping, place and route, and timing analysis are concerned, however, these tools will always remain with Xilinx as far as I can tell. They are very specific to our architecture.”

“Will things ever change? Well, when you go out to talk to a big customer and they say, 'Look, we're already spending tens of millions of dollars for the silicon, we're not going to pay for the software as well.' - obviously the tool situation today is the model that we're kind of stuck with for now. We can look at potential revenue from software and that from silicon, and it's obvious which is going to be the bigger number. If you give your software to everybody for free, your silicon revenues go up.”

“Right now, we've got about 200,000 users of our tools. It's hard to tell how many FPGA designers there are in the world, however. We don't have any security on our software, so it's really simple to steal.”

“Actually, it would be really nice to figure out a line we could draw between ourselves and the EDA vendors, so they could find an area to go into that we wouldn't be in. However, the EDA vendors would have to lower their costs, because FPGA customers wouldn't tolerate pricing above their pain threshold.”

“[The EDA vendors need to face] the difference between ASICs and FPGAs. If you don't go through all of the expensive tools to make an ASIC perfect, you're stuck having to do a multi-million dollar re-spin. With an FPGA, if you make a mistake, you can just reprogram it.”

The Vision Going Forward

Ping Chao is recognized industry-wide as a distinguished technologist. In his current capacity as Executive Vice President and General Manager of the Design and Verification Division at Cadence Design Systems, he shoulders a heavy burden in defining the present and future technology vision for the company. I had a chance to speak with Chao at length several weeks ago. At my request, he talked and I typed. He started off by answering a somewhat personal question, “Are you enjoying your current situation?”

“For the most part, yes I am. I'll admit I'm a little overworked these days, but I'm not complaining. The work is extremely interesting and the technology is compelling.”

“Even before I took this job, I was involved in Cadence's strategy. Fundamentally, the question is what kind of role should we be playing in this industry. We want to be a long-term business partner to our customers - an overused term, I know - but we know there's a lot of substance and hard work that must go into that. We already have had some success with Agilent, Toshiba, IBM, Infineon, and so forth. We've raised our level of working with these people to more than just being a vendor, which is what we're driving for.”

“The partnerships with our customers are really CEO-level [arrangements]; Cadence addresses those partnerships at the management level. But we participate in conversations with our customers at many levels [within their organization] and take into consideration how the total solution works best for them.”

“As far as what we're doing here, Cadence realizes that our technical leadership needs to be at the forefront of our strategy. We will establish ourselves as a technology leader across the board - particularly in the digital space. More broadly, we enjoy strength in the chip package and board space, and in the analog space. At the moment, however, the most pressing industry-wide issues are in the digital space. That space includes design, verification, RTL to GDSII, and so forth. Everything in that space is undergoing a major change, and we know we must win that space to stay at the leading edge.”

“Complex chips today are at 0.13 micron and below, with 10 million gates and up. That's the emerging and growing segment and we want to be sure to have the leadership position in that segment. As I said, we will have the technology leadership there across the board, and will have substantial relationships with our customers [to support that position]. And not just in design and verification, but also in implementation. It takes leadership in all three of those areas to deliver the technology.”

“Cadence today has two divisions. Lavi Lev manages the Implementation Division, and I manage the Design and Verification Division. However, our strategy is a single product offering, not two. There are two divisions so that we can focus on the concerns of the design and verification engineers versus those of the implementation engineers. But the customer is defined as the entire design team.”

“If you look at the team, there are three primary user groups there - each with its own design domain and unique requirements. There's the front-end design engineering team, the verification team, and the implementation team. Together they're jointly developing the nanometer silicon, so we're approaching the customer in one unified way. It looks like one customer to us, as it should.”

“In the past couple of years, there's been a lot of activity on the implementation side - timing closure, silicon-design issues, and so forth - lots of investment and hype and buzz words around that area. Fundamentally, we're moving towards nanometer, wire-centric complexity and the issues there related to power, timing, and yield.”

“Then comes verification. There are all kinds of new demands in terms of verification, moving from point tools and simulation and emulation towards a more integrated solution. We're calling it the verification platform.”

“The last domain is in the synthesis space. Nanometer issues are now the hot topic in that space, and synthesis is at the center of it. The synthesis technology in use today is not designed for that purpose, particularly global synthesis. Logic and physical synthesis are now moving into nanometer design. In general, when people hear about physical synthesis, they think about what's been done through place and route, some sort of working through the existing synthesis techniques.”

“Now, however, we're talking about large-scale synthesis. Not optimizing at 10,000 gates, but looking at millions of gates. That implies [the move to] global synthesis. You're not just working with wire-load models at that level - the synthesis engine itself needs to be considering the physical effects. The trick is to determine whether or not you're optimizing on the wrong assumptions, whether your down-stream implementation will be accurately modeled in your synthesis. Again, we're dealing here with the large-scale issues in what we consider to be global synthesis.”
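As an editorial aside on the wire-load point above: the gap between a fanout-based wire-load estimate and a placement-aware estimate can be sketched with a toy calculation. Every table entry, coefficient, and coordinate below is invented purely for illustration - this is in no way how Cadence's (or any production) synthesis engine is implemented.

```python
# Toy comparison of a fanout-based wire-load estimate vs a
# placement-aware estimate for a single net. All numbers invented.

# Classic wire-load model: net capacitance guessed from fanout alone.
WIRE_LOAD_TABLE = {1: 0.05, 2: 0.09, 4: 0.16, 8: 0.30}  # fanout -> pF

def wireload_cap(fanout):
    """Look up an estimated net capacitance (pF) from fanout alone."""
    key = max(k for k in WIRE_LOAD_TABLE if k <= fanout)
    return WIRE_LOAD_TABLE[key]

def placement_cap(pins, cap_per_mm=0.2):
    """Estimate capacitance (pF) from the half-perimeter of the
    net's pin bounding box - a crude stand-in for predicted wiring."""
    xs = [x for x, y in pins]
    ys = [y for x, y in pins]
    hpwl_mm = (max(xs) - min(xs)) + (max(ys) - min(ys))
    return cap_per_mm * hpwl_mm

# A 4-pin net whose pins ended up far apart after placement:
pins = [(0.0, 0.0), (3.0, 0.1), (0.2, 2.5), (3.1, 2.6)]
print(wireload_cap(len(pins) - 1))  # fanout-only guess, small
print(placement_cap(pins))          # placement-aware estimate, much larger
```

The only point of the toy is that a net's fanout says little about its real wire length once placement is known - which is the divergence that drives the move to physically aware global synthesis he describes.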

“Global synthesis implies efficiency in the run time - you consider the final physical effects; not just the placement, but the wires must be predicted as well. It's funny that inside of Cadence, we have the notion of the silicon virtual prototype. That's a realistically modeled physical design, which includes the wires at a very early stage in the process. [In principle], you'd have to go to the end of the process and bring it back up to the beginning, but that can take too long - which is where silicon virtual prototyping offers its advantage. You make the best trade-off between timing and accuracy - in favor of timing, mostly. Then, if you're able to be statistically within 5 percent of the final accuracy, and can do that 100 times faster, this gives you a fast-forward.”

“Silicon virtual prototyping is the notion of gaining very accurate physical data early on, based on the trade-off of getting to the physical model quickly while still including the wires. We can calibrate and control that trade-off by going through silicon virtual prototypes.”

“You go internally, and literally, through all of the process of synthesis, placement, and routing - except that it's fast-forward accurate. It's not sufficient to just have a first pass. We believe that most of the time you spend should be on the closing part [of the design process]. Prototyping [assists in the process] and helps with accurate models, but it's not something you can tape-out with. Synthesis still does all of the optimization, but now the technology allows you to model the physics and to optimize to the right thing.”

“We acquired an RTL compiler when we acquired Get2Chip, and subsequently integrated it with other technologies. Now many of our customers are using that in conjunction with our Encounter platform to get to tape-out. The RTL compiler basically has this notion of global synthesis and optimization. What I just described is actually at different stages of refinement right now. Ultimately, when the technologies do integrate, you'll have one integrated flow that shares all the internal models and engines, but we're not completely there yet.”

“Future demands, and markets, in technology are difficult to predict. I don't know the specific time frame that will apply for each particular technology node ahead, even for today's trend to 90 nanometers. I do know there's renewed interest in a couple of areas. Certainly co-verification of hardware and software has become quite interesting. I was involved at QuickTurn a number of years ago, but that technology is very different today.”

“Also, it used to be the graphics guys [who were on the cutting edge]. Now it's the consumer and automotive guys who are doing interesting things. I was amazed recently to learn that the chip in a toy can have up to several hundred thousand lines of code. For those guys, the tape-out schedule is primarily dependent on the bottleneck of whether they can verify their software. They need software sign-off before they can commit, so now the trend is towards high-level and system-level verification.”

“Cadence is pretty committed to SystemC. I'm watching [developments there] and it's pretty interesting. It's only now starting to show some momentum, although it's not quite there yet. But it seems that it's definitely going to happen.”

“We support both SystemC and SystemVerilog, but languages are moving towards system-level design and modeling verification - both of which are starting to happen now, driven by the nanometer process technologies. I believe it's going to be mark-up languages for system verification, which is going to be a challenge, as we'll have to double, almost triple, our work to support it. But that's how it's shaping up right now. If we need to address mark-up languages, we will - and we'll see it as a very interesting phenomenon. [Meanwhile], we know that SystemVerilog and VHDL are not going to go away anytime soon. Unfortunately, the final choice [in languages] is not something we can dictate. It's a hassle, but we'll deal with it as it happens.”

“I can't see into my crystal ball to predict anything in the future. In fact, I don't even have a crystal ball. That's my philosophy, actually - not to have a crystal ball. I'm not a visionary type of guy. I try to see what's going on in the marketplace, and to adjust to that. The important thing is for us to stay very close to what's going on, and to have a sense of pragmatism. Oftentimes, you can have a strategy and start to think that's how things will go. I think it's important for us to have a strategy, but I also think it's important to have an attitude of knowing what's going on in reality and then adjusting to that reality.”

“How do I keep track of what's going on? I read the [San Francisco] Chronicle and the [San Jose] Mercury, and Time, and other magazines. Also, I travel a lot - although I'm traveling more and enjoying it less. I enjoy most everything that I do, but not the [increased] travel.”

“When I took on this job, people asked me, 'Why did you take the job?' I thought about it and I would have asked instead, 'Why did I even stick with EDA?' After all, it's not my first love.”

“[Originally], I wanted to be in computer architecture. But now I like EDA a great deal. There are a lot of very sharp people in this industry and I'll bet this is more difficult technology compared to anything in software. In fact, it's really too hard - and maybe that's why I like it. The problems over the last two years have gotten even harder, just in the time since I joined Cadence.”

“We've been talking about the productivity gap in this industry for years and years. In fact, I often wonder - how do you actually measure engineering productivity? Today, it's even more difficult to measure, because it's not just how much you can design [in a certain amount of time], now it needs to be defined in terms of production as that technology has become so very, very hard.”

“[Clearly], we need all of the interplay between the different disciplines to solve these problems. We need hardware, and software, and manufacturability in design, and lots and lots more. Over the last two years, nanometer design has made it all even that much harder - and made for an even greater separation between the winners and the losers in the industry, now and going forward. It's going to be harder and harder to be a winner. But if you do win, you'll win big!”

Where the Rubber Meets the Road

John Isaac is Director of Market Development for the Systems Design Division at Mentor Graphics Corp., based in Longmont, CO. As an avid sports enthusiast - skiing, hiking, off-road biking, windsurfing - working adjacent to the Colorado Rockies is a natural fit for him. He can leave the office and be out on the slopes in less than an hour.

Unfortunately, that rarely happens. John's way too busy attending to system and board-level design issues for Mentor and their customers. He comes by his expertise in these areas through long years at IBM managing the development of their internal EDA systems, and through his time at Mentor Graphics since 1984 (except for a 3-year non-EDA start-up experience), where he has grown with their systems design solutions since they were first introduced in 1985.

On the day that I spoke by phone with John, he was hard at work. Never mind that his family was off skiing - given the choice, he would surely choose the work over the sport. He could always hear later from those who did hit the slopes how things went. But being away from the office for even a day might mean that he missed out on a development or two in the board or systems area. Clearly, John's first love is his work. The skiing is only secondary.

I asked John to expound on the state of affairs in the system/board area and to start by distinguishing between system and board-level issues. As he talked, I typed, and here's the outcome:

“I think systems design is rather analogous to board design. Basically, it's where the printed circuit board and the components come together to create a system. Digital designers may feel they're creating a system on a single chip, but there's normally lots of other stuff required on the board with that chip for the final product to function. We tend to call it system design because of that - it's a little broader than just board layout in that there's verification technology and tool design included [under the umbrella] of system design. That's where our division fits into Mentor.”

“Relative to what we look at in terms of the system or board design world, it's often [perceived to be] less exciting or not as fast moving as the design challenges in the integrated circuit world. But we're finding that's really not the case, especially right now. There's lots of stuff happening with respect to printed circuit board fabrication. The IC world today is producing high pin count, high-speed chips that are driving the PCB design and fabrication world into new, rapidly advancing areas. In order for the system design to make use of those pins, board fabrication is really being forced to change - and new tools are required from both a design and verification point of view. These ICs need to be optimized for speed, timing, and signal integrity when you put them on the PCB and connect them to the rest of the logic.”

“If we took these developments casually, we feel that some of our leading-edge customers would find that their tools have fallen behind. We're working to not let that happen. We're investing heavily both in R&D and from an acquisitions point of view to keep our customers up-to-speed and ahead of the curve with the advanced technologies coming down the pike. From our standpoint, the printed circuit board world is anything but dull.”

“So, let's talk about two major categories of change currently under way. One area is High Density Interconnect (HDI)/microvia fabrication technology and the other is the area of embedded passives.”

“If you have an IC now packaged with 1500 to 3000 pins, you've got to consider what all that requires when you try to melt that chip onto a PCB. The first thing that's required in dealing with these small packages and dense pin counts is a way to route that device on the printed circuit board. You can't solve it by coming up with a board that's a foot by a foot in size. So, what's happening is that the fabrication industry is moving away from the pure fabrication techniques that they've known really forever in terms of the laminate technology, especially in terms of the relevant dimensions.”

“People have adapted their designs to use microvias on the board, which is basically using integrated circuit technology to form the top layers of the board. Many printed circuit boards are now layered laminates (the classical technology), but with additional high-density microvia layers superimposed on them. You use those extra layers to break out from your IC package to your laminate layers. This is a strategy not unlike that on a chip. Using these strategies, you get a huge savings in terms of the size of your circuit board. The longer you have to run the routing, the more delay you have in the signal, which means you can't capitalize on the performance capabilities of the IC on the board. The new fabrication techniques are solving this. So HDI - high-density interconnects and microvias - are the cutting edge in boards today.”

“Also, if you look at the FPGA world, where FPGAs used to only be considered appropriate for glue logic and the more mundane prototyping tasks associated with putting systems together - today, FPGAs are coming into their own in terms of both density and performance. You get in the range of hundreds of megahertz on an FPGA, with capacity for multi-gigabit speeds externally. Melting a high-performance FPGA onto a board needs to be done to capitalize on the increased capabilities of those chips. As a result, even FPGAs are driving the fabrication and design tool technologies to new levels.”

“In the area of embedded passives, there's lots going on. If you want to mount, for instance, a Pentium IV on a printed circuit board, that requires an ever-increasing number of passive components on the board to make the thing operate properly at speed. A Pentium IV can need up to 400 resistors and capacitors to make it function on a printed circuit board. You can imagine that if you had to mount 400 discrete components next to your IC, it would take up a lot of room and spread your board out significantly. Therefore, a whole new technology, referred to as embedded passives, is emerging. These are basically resistors, capacitors, and even inductors and transformers that you can embed inside the layers of the printed circuit board. This allows you either to reduce the size of the board significantly, or to add functionality and obtain optimal performance on the board.”
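To put that 400-passive figure in perspective, here is a back-of-the-envelope area estimate as an editorial aside. The per-component footprint is an assumed value (roughly a small surface-mount part plus its placement courtyard) - only the count of 400 comes from the interview.

```python
# Rough board surface consumed by discrete passives, which embedding
# them in inner layers would recover. The 2 mm^2 per-part footprint is
# an assumption for illustration; only the 400 count is from the text.

DISCRETE_FOOTPRINT_MM2 = 2.0   # assumed pad + courtyard area per passive
NUM_PASSIVES = 400             # figure quoted above for a Pentium IV

surface_area_mm2 = NUM_PASSIVES * DISCRETE_FOOTPRINT_MM2
surface_area_in2 = surface_area_mm2 / 645.16   # 1 in^2 = 645.16 mm^2

print(surface_area_mm2)              # 800.0
print(round(surface_area_in2, 2))    # 1.24
```

Even with this conservative assumption, embedding the passives recovers on the order of a square inch of board surface - significant for a two-inch-square cell phone board.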

“This is very important in things like cell phones where size is a driving factor, and you want to put more and more functionality on your product to be competitive. Embedded passives technology is definitely coming into its own. It used to be a very expensive option, but like everything - with use, the price is coming down so that even cell phone manufacturers can incorporate the technology in an economical fashion.”

“As you can see, developments in both HDI and microvia layers, and embedded passives, mean that traditional place and route techniques on a board are moving to a next generation of requirements. Synthesis within the laminate layers is becoming a dominant technique and one that has to be dealt with by the designers and their tool providers. So those are some of the things that are happening in PCB fabrication land directly driven by what's happening in IC design.”

“Meanwhile, new and higher-speed ICs are also causing problems that need to be dealt with. For instance, even if you've got an IC operating at 500 megahertz or more on-chip, unfortunately you can't run signals that fast across a printed circuit board, so you end up defeating yourself by having to drive the busses up to 32-bit or 64-bit width instead.”

“Xilinx, for instance, has come out with a family of FPGA technology which includes third-generation IO, 3G IO. In that same vein, Intel has what it calls PCI Express, a protocol used to communicate between chips in a serial fashion, which allows you to avoid the high bit count bus problem. You can run a differential pair in your PCB that allows you to send your signal between chips serially at 3 to 10 gigabits per second, still achieve the performance that you want, and dramatically reduce the number of 32-bit and 64-bit busses. That, in turn, allows you to reduce the size of your printed circuit board - you can go with a smaller solution or put more functionality in the same space.”
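As a quick editorial sketch of the bandwidth arithmetic behind trading a wide bus for serial pairs: the clock rate, lane count, and 8b/10b coding efficiency below are illustrative assumptions, not figures from the interview.

```python
# Rough throughput comparison: a wide single-data-rate parallel bus
# versus a few serial differential-pair lanes. Illustrative numbers only.

def parallel_bus_gbps(width_bits, clock_mhz):
    """Raw throughput of a single-data-rate parallel bus, in Gbit/s."""
    return width_bits * clock_mhz * 1e6 / 1e9

def serial_lanes_gbps(lanes, line_rate_gbps, coding_efficiency=0.8):
    """Payload throughput of serial lanes; 0.8 models 8b/10b coding."""
    return lanes * line_rate_gbps * coding_efficiency

print(parallel_bus_gbps(64, 133))   # 64-bit bus at 133 MHz -> 8.512 Gbit/s
print(serial_lanes_gbps(4, 3.125))  # 4 lanes at 3.125 Gbit/s -> 10.0 Gbit/s
```

Under these assumed numbers, four serial lanes match a 64-bit bus while needing only two signal pins per lane per direction instead of 64-plus bus traces - which is the board-size win described above.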

“As a result, a whole new set of design technologies are being required to accommodate these differential pair routings, as well as the analysis of those signals. Today, we need to be able to route those pairs in a finely tuned parallel manner - while taking care how they're terminated - so that signal integrity will remain intact. So then, a whole new generation of signal integrity design and analysis tools is required.”

“I should mention at this point that there's actually a third area of growing importance in printed circuit boards. It's what we call global design. Companies are going global in terms of their design process and their manufacturing. This means that a company designing a cell phone may want to design around the world to meet its time-to-market crunches. They might partition their global team into a digital section, an analog section, and an RF section. They might then have specialists in these three separate technologies in different locations, and the company must coordinate the various teams [to create] one global team. How the technology teams are mixed is very important to these companies if they want to 'follow the sun.'”

“This allows a global design team to work on the same board simultaneously. Revision control is obviously a big part of this. The way we've implemented our tools, you can partition the design so that there's a master controlling the effort - usually a single individual - who can manage any number of fellow designers, whether the rest of the team is right there in the room with the master or in a room on the other side of the world. The thing is coordinated by periodic communication over a LAN or a WAN and control commands from the project master.”

“We've invested heavily in this team design functionality in our tools, things that support the master/client relationship. We've developed the system in-house and it's called Team PCB. It's what is needed today as the world gets smaller and companies grow fewer and fewer as they swallow each other up.”

“Our industry is very specialized. Using software for project management that is not specifically developed for the EDA industry is really not possible. Someone developing these management tools must understand both the designers and the tools. If a PLM vendor, for instance, were to try to address the EDA industry with their management tools, it wouldn't be able to address many of the specifics like an EDA vendor could. The same thing, in fact, applies to specific databases - merging design databases back together varies significantly depending on the vendor and the tools.”

“Boards have a lot of pre-set design elements. But when it comes to actually determining whether the design is going to be a 6- versus an 8-layer board (most boards are in this range, though some today may have as many as 20 layers), whether it's a rectangle of 6 inches by 8 inches, or a cell phone board that's only 2 inches square - that's the system or product designer's call. They then work with a single manufacturer or a small number of manufacturers, send that data - form factor, thickness, etc. - and integrate it into the manufacturer's pre-set product lines. Printed circuit board manufacturing is pretty much standard across the industry. That's a good thing, because it gives the product house choices of which manufacturer to go with for a particular design.”

“In general, advances in PCB fabrication technology are coming out of the R&D organizations in industry. But there are also joint efforts between industry and the military, and universities as well. The universities often have connections to manufacturing, so it's a combination of [groups] contributing to the advancements.”

“Mentor has a large and loyal customer base. Of course, most of the big vendors in EDA sell printed circuit board design tools. But we feel that some of our competitors' tools are running out of gas - that those companies are resting on their laurels a bit. As a result of our continued investment, however, we feel that our tools will continue to meet the requirements that system designers face now and will face in the future. And, as a result, we're picking up new customers.”

“Mentor's business revenues are fully one-third based on board and system design tools. It's one of those deals where it's sort of a self-fulfilling prophecy. If you're not ahead in the game, you start to lose market share. Like much of EDA recently, the PCB market has been flat to down, but we're finding that when you have new technologies and tools, you can grow. That may not make it a very exciting market for many EDA vendors, but as Mentor has continued to invest we have actually seen an increase in our systems design business over the last several years.”

“We also believe that because our competitors are not taking this business sector very seriously, they're turning even their high-end systems design distribution channels over to third parties. Alternate channels are okay for the ready-to-use customer, but the high-end electronics companies that are pushing to the edge of technology need the “touch” of a direct EDA channel and direct access to development to succeed with their tools. Mentor is continuing to support this business, including direct distribution of our products through high-end, high-touch sales channels to our customers.”

“We know our customers well, both the layout guy and the EE, and we've seen them evolve with the technology. As this high-speed stuff has become the norm, people can no longer view board designers as less sophisticated than digital designers. The board layout guy today has to be more and more cognizant of high-speed effects. He can't have an EE sitting on his lap all day helping him lay out the traces. Also, our marketplace is not just the PCB layout guy. It's also the EE who designs the system and is responsible for assuring that the system performs to the specs.”

“You ask if form factor for a printed circuit board is usually the over-riding design consideration. Actually, that's far from the case. In medical applications, for instance, it's reliability. In telecomm and switching, it's performance. For the telecomm consumer market, it is as you suggest: small form factor, with high functionality, and low power. Finally of course, for the military it's reliability. The requirements vary according to the end-customer's industry, although there's usually a common thread there in performance.”

“But no matter who's on the receiving end of the design put out by the printed circuit designer, we like to think that it's the board where the rubber meets the road.”

Industry News - Tools and IP

Accelerated Technology, the Embedded Systems Division of Mentor Graphics Corp., announced that a version of the Nucleus RTOS and the code|lab Embedded Developer Suite will begin shipping with the Intel IQ80315 Evaluation Kit for the Intel IOP315 I/O processor based on Intel XScale technology in Q4 2003. Per the Press Release: “Developers building storage applications such as RAID solutions, network attached storage (NAS), storage area networks (SAN), and other storage-related applications will benefit directly from this all-inclusive kit by being able to quick-start their embedded application.”

Also from Accelerated Technology - The division announced the addition of the Nucleus POSIX kernel to the Nucleus family of products. The POSIX (Portable Operating System Interface for UNIX) standard provides a standard application program interface (API) for developers. POSIX joins the growing list of APIs supported by the Nucleus software, which, in addition to the Nucleus PLUS standard API, also includes the micro-ITRON and OSEK APIs. The division says that the use of the Nucleus POSIX API provides several advantages to Nucleus application developers. It allows developers currently using a different real-time operating system (RTOS) that also supports a POSIX API to easily migrate their applications to the Nucleus RTOS.

Per the Press Release: “Traditionally, once a company has used a particular RTOS in their design, it is difficult to switch to another RTOS because much of the application code is hard to transfer to a different RTOS. This may prevent a company from migrating to an RTOS that better fits their needs because the switching costs are too high. The Nucleus POSIX API solves this switching problem for those that have POSIX applications. Secondly, many embedded developers come from a UNIX background where the POSIX API is pervasive. Now those developers will be able to write embedded applications in an API that they are already familiar with, instead of learning a new one. Finally, since the standard is supported by other operating systems (OS), the use of a POSIX OS protects engineering investments in several ways. It is easier for a company to recruit engineers with existing knowledge of the POSIX API, resulting in less training of personnel. Also, should it become necessary to change the OS, it will be simpler to port the application from one POSIX OS to another.”

Altium Ltd. announced the release of Nexar and a new 2004 range of LiveDesign-enabled products. Altium will commence rolling out its new 2004 generation of LiveDesign-enabled tools immediately, with others to be released throughout the year. Along with Nexar 2004, the LiveDesign-enabled products being released include: the Protel 2004 board-level design system; the CircuitStudio 2004 universal front-end design system for board-level and FPGA design; the CAMtastic 2004 PCB CAM tool; and the NanoBoard NB1 FPGA-based development board.

Per the Press Release: “Nexar, a vendor-independent solution for embedded system-level design on an FPGA platform, introduces a new design methodology that Altium calls LiveDesign. LiveDesign is a real-time, interactive design and development methodology that enables rapid implementation, testing, and debugging of digital designs through a combination of FPGA-targeted virtual instruments that are incorporated at the schematic level, JTAG communications technology, and the NanoBoard, which connects to the engineer's PC. LiveDesign provides the engineer with a hands-on hardware and software environment for on-the-fly development and implementation of a real, physical circuit, including processor cores, which is directly accessible from their desktop. Nexar's LiveDesign environment minimizes the need for simulation at the system level and enables the development of complete embedded systems on an FPGA without the need for HDL-based entry.”

Also from Altium - The company announced support for Infineon Technologies' new microcontroller, the TC1130, in the TASKING TriCore VX-toolset. Per the Press Release: “The latest addition to Infineon's TriCore microcontroller line is a high-performance 32-bit RISC microcontroller specifically developed for industrial control and automotive applications. Altium is now the only embedded software development tool vendor to support all microcontrollers from Infineon Technologies.”

AmmoCore Technology, Inc. announced that Fujitsu Ltd.'s LSI group has entered into a licensing agreement for Fabrix software, an integrated design flow from netlist to GDSII. Fujitsu says it will deploy Fabrix in the physical design implementation of current and future SoC designs. AmmoCore says the licensing agreement is “the direct result of Fabrix exceeding the strict benchmark requirements set by Fujitsu against both in-house and other physical design implementation solutions.”

Applied Wave Research, Inc. (AWR) announced an agreement with TriQuint Semiconductor, Inc. that “increases the use of AWR's monolithic microwave integrated circuit (MMIC) design tools throughout its Richardson, TX operation.” TriQuint says it will use AWR's Microwave Office 2003 software to streamline the development of its components, which are deployed in a wide array of advanced communications applications throughout the world. Eli Reese, Design Engineering Director at TriQuint, is quoted in the Press Release: “We have seen significant improvements in engineering productivity since we incorporated Microwave Office software into our design…”

Cadence Design Systems, Inc. announced the Cadence Encounter RTL Compiler Ultra synthesis tool with support for VHDL. Per the Press Release: “Encounter RTL Compiler synthesis is a key component of the Encounter digital IC design platform and a critical step in the fastest route to superior silicon. Supporting the Cadence multi-language strategy, Encounter RTL Compiler Ultra synthesis works with existing Verilog and VHDL design flows to increase chip performance, decrease design times, and provide the highest quality of silicon (QoS) for Cadence customers throughout the design chain. QoS measures a design's physical characteristics - area, performance, and power - using wires. Encounter RTL Compiler Ultra synthesis is used throughout the design chain by ASIC and IP vendors and IC designers to help increase overall chip speed performance by 10 percent and improve area by 10 percent. In addition, runtime can be up to three times faster compared to traditional tools.”

“The new generation technology behind Encounter RTL Compiler Ultra synthesis delivers global synthesis for timing closure using a unique patented set of global focus algorithms that maximize the performance of challenging designs. Encounter RTL Compiler synthesis fits into existing flows and can adapt to old and new approaches to design. In addition to the performance benefits, the Encounter RTL Compiler product is drop-in compatible with existing solutions, making it easy to evaluate and deploy into production flows for ASIC vendors, IP suppliers, and end users.”

Catalytic Inc. announced plans to create an automated path from algorithm specification to implementation that the company says will lead to faster and more efficient programming. Catalytic says the first application development and implementation software will target the DSP market. Randy Allen, Catalytic Founder and CEO, is quoted in the Press Release: “As fast as the DSP market is growing, its growth is throttled compared to its potential. The limiting factor is not processor speed, but the complexity of the programming process. That's what we intend to change.”

ChipVision Design Systems AG announced a new version of the company's ORINOCO design tool. Per the Press Release: “ORINOCO 4.1 [also] offers advanced opportunities for tackling power issues at the highest level of abstraction. It features enhanced support of SystemC, and allows multi-language support for different standard system-level specification languages (i.e., C and SystemC) in a unified front end. This integrated front end provides designers with greatly increased flexibility for maximum design leverage, allowing them to mix, as appropriate, C and SystemC modules in designs to be analyzed by ORINOCO. The new ORINOCO version also provides new estimation and optimization techniques that can be used at the system/algorithmic level for creating power-optimal architectures. These include enhanced sampling techniques, which will facilitate processing the very large amounts of data that are typically generated when analyzing complex chip designs. Additional features have been added … to provide the user with improved, user-friendly reports that allow for intuitive and accurate analysis of the detailed power-related data generated by the tool.”

CoWare Inc. announced major new functionality in its LISATek suite of products. With the latest release, the company says embedded processor designers are now able to model their processor using a high-level language, and automatically generate Instruction Set Simulators (ISSs) and a complete set of associated software tools including the associated C compiler.

Per the Press Release: “Custom processors, such as Application Specific Instruction Processors (ASIPs) for DSP and control applications, are also enabled by the automatic generation of synthesizeable RTL code. Increasingly, companies are deciding to create their own programmable IP, typically embedded processors or ASIPs, because these devices provide the necessary flexibility for performing algorithmic acceleration, with the added benefit of easier re-use for derivatives or other projects. The problem they face is the cost and time taken to develop the necessary ISS and software development tools required. [Therefore], the LISATek suite of tools includes Processor Designer for the
creation of processor IP simulation models and their software development tools; C Compiler Designer for the creation of custom C compilers; and Processor Generator for producing RTL implementation code for the processor hardware.”

HARDI Electronics AB announced three new daughter boards in its HAPS family. HAPS is a modular ASIC prototyping system providing real-time speed, real-time debugging and full ASIC functionality for ASIC prototyping designers. HARDI says the new boards add capabilities such as Ethernet, USB, and analog video to the HAPS system, and that until now designers who needed functions such as these were required to develop their own add-on boards. HARDI also says the new daughter boards will allow customers to “immediately start transforming ASIC designs to FPGAs without the need to design additional hardware.”

The new boards include: the combined Ethernet/USB board, called the ETH_USB_1x1, which enhances HAPS debug and configuration capabilities; the AVID_1x1 analog video and audio input/output board, which the company says makes HAPS an ideal platform for ASIC designers working with multimedia applications such as digital cameras, mobile phones, DVD players and PDAs; and the CONF_1x1 stand-alone configuration board, which can configure Xilinx devices on any JTAG-capable Xilinx board with designs stored on a detachable CompactFlash card. The CompactFlash card holds up to eight designs, any of which can be downloaded to the Xilinx devices with the press of a button. All of the boards are available.

MIPS Technologies, Inc. announced that a new chip set from Entropic Communications is based on the MIPS32 4KEm processor core. The Entropic chip set allows homeowners to create a high-speed multimedia network and share high-speed video and data throughout the home using existing coaxial cable, transmitting digital data at 270 Mbps - more than 20x the speed of most home networks. The chip set's two ICs consist of an RF front-end, and a baseband controller built around the 4KEm processor core.

Per the Press Release: “Entropic is a founding member of the Multimedia Over Coax Alliance (MoCA), which made headlines when it was announced last month at the Consumer Electronics Show, and which includes industry leaders Cisco Systems, Comcast, EchoStar Communications, Panasonic, Motorola, RadioShack, and Toshiba. MoCA is developing standards for the distribution of content over in-home coax cable, allowing consumers to easily link household devices such as TVs, digital video recorders and PCs, and access high-quality voice, video and data from room to room.”

Nassda Corp. announced it will integrate its HSIM hierarchical circuit simulator with Mentor Graphics' multi-language ADVance MS (ADMS) mixed-signal simulation platform for the verification of complex mixed-signal SoC designs. The two companies say they will cooperate in delivering the jointly developed solution, with Mentor distributing ADMS and Nassda distributing HSIM.

Sang Wang, CEO at Nassda, is quoted: “By collaborating with Mentor, we believe we will better meet the needs of our shared customers who work with complex mixed-signal designs. Through an ADMS integration, design teams will have access to Nassda's high-capacity circuit simulation capabilities within Mentor's market-leading multi-language simulation platform for their complex mixed-signal SoC designs.”

Jue-Hsien Chern, Vice President and General Manager of the Deep-Submicron Division at Mentor, is also quoted: “By working with Nassda, we intend to add its market-leading hierarchical fast-spice simulation to our comprehensive mixed-signal verification environment. This demonstrates Mentor's commitment to delivering scaleable verification solutions for complex mixed-signal SoCs.”

Novas Software, Inc. announced full debug support for Synopsys' Vera testbench automation tool with Novas' new nBench capability, a component of its unified design and testbench debug tool. The company says, “nBench enables the seamless debug of Vera testbenches and OpenVera assertions (OVA) with Verilog, SystemVerilog, VHDL and mixed-language system-on-a-chip designs. Engineers can now trace cause-and-effect relationships across all design, assertion and testbench source code developed in Vera. This makes it much easier for them to fully comprehend design and testbench behavior using a single debug environment.”

But first, Novas Software announced a “breakthrough in Verification Process Automation (VPA) that allows engineers to debug automated testbench code, assertions, and Verilog and VHDL designs all within a single environment. For the first time, engineers can debug across language boundaries that until now have separated the verification language source code used to create testbench programs from the hardware description language (HDL) code used to describe chip designs.”

“By applying powerful debug capabilities to the testbench, advanced verification environments and assertion domains, engineers can analyze and better understand the behavior of complex verification structures for faster overall debug. Novas is delivering the new nBench capability as an integral component of its core debug platform with the Verdi Behavior-Based and Debussy Debug Systems. nBench applies advanced debug features, including language tracing, event analysis and active annotation, to testbench debug in a manner consistent with design debug. The result is a complete verification solution that allows for a more effective, unified approach to design and testbench debug.”

All of this probably means the competition needs to sit up a little straighter, and take some serious notice of what's going on over at Novas.

PMC-Sierra, Inc. announced that PhatNoise, Inc. has selected PMC-Sierra's 400 MHz RM5231A 64-bit MIPS-based microprocessor for their next-generation car entertainment system. Dannie Lau, Co-founder and Executive Vice President at PhatNoise, is quoted in the Press Release: “PMC-Sierra provides a high performance, low power solution that meets the demands of the new features for our next-generation car entertainment system, and also allows us to seamlessly upgrade performance for future products while preserving code and dollar investments.”

Also from PMC-Sierra, Inc. - The company announced a licensing agreement with Integrated Technology Express (ITE) that allows PMC-Sierra to market and sell the ITE IT8172G System Controller under PMC-Sierra's branding. Jason Chiang, manager of product marketing, Microprocessor Products Division at PMC-Sierra, is quoted: “With this license agreement, PMC-Sierra will now be able to offer customers scaleable chip set solutions for multiple advanced consumer applications with the added convenience of one vendor. For networking and other applications, PMC-Sierra will continue to support other third-party partner solutions.”

Synplicity Inc. announced it has enhanced its Amplify FPGA physical synthesis and Synplify Pro FPGA synthesis software to include new support for Altera and Xilinx devices. The company says that with this new version, Synplify Pro offers new features to support the Stratix II family of FPGAs from Altera. Enhancements to Synplicity's Amplify FPGA physical synthesis software include support for a hierarchical timing report, faster timing closure, and support for the Spartan 3 family of FPGAs from Xilinx.

Per the Press Release: “With the latest versions of the Amplify 3.5 and Synplify Pro 7.5 software, Synplicity continues its efforts to deliver best-in-class performance for FPGA designers.”

Also from Synplicity - Looking forward, the company announced that it intends to offer timing estimation based on placement and automatic initial floorplanning as an alternative to traditional wireload model-based RTL synthesis in future releases of the Synplify ASIC software. The company intends to offer this technology free of charge to customers who are under maintenance at the time of release. The company says the new technology should help users reduce iterations between front-end and back-end design teams and provide more accurate estimates early in the design flow.

Also from Synplicity - The company announced the latest release of its Amplify ASIC software, which includes the company's new router-independent Sensitive Net Analysis and Prevention (SNAP) technology.

Per the Press Release: “The SNAP technology enables tight timing closure regardless of the back-end router used. First, specific routes in the design that are susceptible to significant timing variations due to possible routing choices are identified. Once identified, circuit topology around these sensitive nets is modified to remove router choices that lead to poor results. Synplicity has filed patents for this technology, which will be available in all future versions of Synplicity's ASIC physical synthesis tools - Amplify ASIC, Amplify RapidChip, and Amplify ISSP.”

Finally, and not surprisingly - Synplicity announced the “rapid adoption of its ASIC synthesis technology, including a doubling of ASIC synthesis license revenues in 2003 over the prior year.” The company says that since it entered the ASIC synthesis market in June 2001, 70 companies have purchased its Synplify ASIC and Amplify ASIC software, and 12 ASIC vendors have endorsed the software.

Synplicity says it attributes the success of its ASIC synthesis software to the tool's “industry-leading capacity and runtime, feature-rich functionality, plug-and-play compatibility with current ASIC flows and its focus on ASIC vendors and the design handoff market.”

Verisity Ltd. and ARM announced their collaboration to provide mutual customers with verification IP solutions to address the complexities of system-level verification. The two companies say they will jointly develop verification IP for the ARM11 core family, starting with the AXI e Verification Component (eVC), and advanced methodologies based on Verisity's VPA solutions.

Per the Press Release: “Many of the ARM semiconductor Partners are pushing the limit of system integration on a single chip. These designs, many of which are based on the ARM11 micro-architecture, require billions of verification cycles and hundreds of Gbytes of information, distributed over several compute and engineering resource locations. It is with this problem in mind that ARM has teamed up with Verisity to help ease the issues associated with the verification of these next-generation designs.”

Teseda Corp. and Agilent Technologies Inc. announced the first link that “ensures transportability of Design-for-Test (DFT) data between engineering and production test platforms. Customers of the Teseda V500 and the Agilent 93000 SOC Series can now quickly and reliably validate, debug, and apply IEEE 1450 (STIL)-based production test data generated by electronic design automation (EDA) tools. The net result is a test development flow that cuts weeks from time-to-money for many of today's semiconductor products.”

“Agilent and Teseda verified STIL transportability between the Agilent 93000 and the Teseda V500 using pattern files created by automatic test pattern generators (ATPGs) from leading DFT tool vendors, including Synopsys, Mentor Graphics, Cadence, and SynTest. The STIL files were imported into the two systems and validated as equivalent. Pattern edits were made on the V500 and the revised patterns were output in STIL by the V500 and then read into the Agilent 93000; these patterns were also validated as equivalent.
This process ensures that DFT tests that run on one system will also run on the other with equivalent results.”

Virage Logic announced that MobilEye Vision Technologies used Virage's STAR Memory system to “reduce cost and increase reliability” in the MobilEye single camera driving assistance SoC chip, a product named EyeQ.

Per the Press Release: “The MobilEye EyeQ offers a solution for computationally intensive real-time visual recognition and scene interpretation. The chip architecture is designed to maximize cost performance by performing a full-fledged image application, such as a low-cost Adaptive Cruise Control, using a single video source on a single, ultra low-cost chip. The system can be used in a variety of automotive applications, including on side mirrors to alert the driver of cars coming from behind, inside the car to control the release of air bags based on passenger size, in adaptive cruise control, in forward collision warning, rain sensing, tunnel sensing, and much more. The STAR Memory
system, which self-tests and repairs embedded memories, enables MobilEye and its customers to reduce manufacturing costs and deliver higher product reliability.”

Also from Virage Logic - The company announced that DongbuAnam Semiconductor and Virage have entered into a “licensing and royalty-bearing agreement that provides for the delivery of Virage Logic's Technology-Optimized Platforms for DongbuAnam's new 0.13-micron CMOS processes. In addition, customers with designs featuring Virage Logic memory IP on the 0.18-micron process can now take their designs to DongbuAnam in order to take advantage of the price and performance benefits realized on DongbuAnam's 0.18-micron process technology.”

Coming soon to a theater near you

Semico Summit - The 7th annual Semico Summit Executive Conference will, per the organizers, “feature CEOs and industry executives sharing their vision of today's trends shaping the global semiconductor industry,” from March 14th to the 16th at the Marriott Camelback Inn Resort in Scottsdale, AZ. Panel topics are set to include: Where Is Our Industry Now; The Next Killer App; and Strategies for Moving Forward in Bringing Products to Market in the Deep-Submicron Era. All of these subjects, and more, should be of interest to those inclined to attend.

Mentor Graphics User Group Conference - The company's 20th annual meeting, User2User, will be taking place April 19th to the 21st at the Santa Clara Marriott in Santa Clara, CA. The conference program includes 100+ technical presentations selected from 140+ user-submitted abstracts, as well as presentations by Mentor Graphics personnel. Organizers say the keynote address will be delivered by the in-your-face (my judgment, not Mentor's) computer columnist and media personality John Dvorak. Dvorak, a contributing editor at PC Magazine, also writes for several other magazines and newspapers, and has authored several books. (www.mentor.com)

Synopsys Technical Seminar Series - The company announced its annual technical seminar series - addressing mixed-signal, functional verification, and implementation solutions - will begin in February and will take place in a variety of worldwide venues. The seminar is described as a “forum for members of the electronic design community to get the most recent information on design automation products, methodologies and processes. Intended for designers, developers, verification engineers and managers, these free in-depth technical sessions discuss the latest technological advances and future trends of Synopsys' leading EDA tools and solutions for mixed-signal design, functional
verification and implementation.”

TSMC Technology Symposium - The one-day event will take place on April 13th at the San Jose Convention Center. Per the organizers: “TSMC's Technology Symposium will showcase state-of-the art semiconductor technology and services that benefit the leading Silicon Valley semiconductor companies. Additionally, TSMC will have a featured keynote speaker and panels that will focus on the current state of the industry.”


Cadence Design Systems has announced the first graduating class from the Cadence-sponsored Device and System Design program at the Moscow Institute of Electronic Technology (MIET). The company says the program offers a master's degree in analog/mixed-signal engineering and includes 25 technical courses and accompanying laboratory projects and practical training. Students also study English and complete internships at prominent IC design companies.

Per the Press Release: “This first graduation marks an important milestone for Cadence and for Russia's IC design industry. It illustrates Cadence's strong commitment to providing increased technical competency worldwide to meet its needs and those of its customers. The goal of the MIET 3-year program is to provide students with the skills and knowledge to hold positions with international technology companies in Russia. By growing a community of engineering talent in Russia, the program not only provides companies with the best and brightest minds, but also supports the strong emerging marketplace in Russia.”

Celoxica Ltd. says it has signed a global distribution agreement with XJTAG Ltd., a part of the Cambridge Technology Group, to resell the XJTAG development system. Simon Payne, CEO at XJTAG, is quoted in the Press Release: “This is an important deal for XJTAG as Celoxica's customer base is truly global and includes design and development engineers in major defense and electronics OEMs across the Americas, Europe and Asia.”

Magma Design Automation, Inc. and Nassda Corp. announced that Nassda has joined the MagmaTies EDA Partnership Program, to “further expand their collaboration.” Previously, Nassda partnered with Silicon Metrics, now Magma's Silicon Correlation Division, for full-chip characterization of standalone and embedded memories using Nassda's HSIM. Magma and Nassda say they will now offer designers the ability to characterize IP for the latest process geometries through their combined EDA software. Additionally, Magma and Nassda intend to develop a means for designers to correlate delay and noise effects on circuit
performance in SoCs using Blast Fusion.

Summit Design, Inc. announced that its System Architect for system modeling and validation was selected as a finalist for this year's EDN Magazine's Innovation Awards. The program honors outstanding electronic products and technologies, and the engineers who invent them. Nominees must have demonstrated innovation that resulted in a significant advance in technology and/or product development during the past twelve months. Summit says System Architect was recognized in the EDA: Design Exploration category.

The X Initiative announced that Infineon Technologies has joined the semiconductor supply-chain consortium. Infineon says it has tested its X Architecture manufacturing readiness with the successful fabrication of a 130-nanometer test chip and plans to further validate production designs using the X interconnect architecture in 2004.

Per the Press Release: “The X Architecture represents a new way of orienting a chip's microscopic interconnecting wires using diagonal pathways, as well as the traditional right-angle, or 'Manhattan,' configuration. By enabling designs with significantly less wire and fewer vias (the connectors between wiring layers), the X Architecture can provide significant improvements in chip performance, power consumption and cost. Infineon fabricated the X Architecture test chip at its Corbeil-Essonnes facility using its 130-nanometer production flow. Cadence Design Systems provided the test structure design, DuPont Photomasks and the Infineon maskhouse produced the X
masks, and Nikon's equipment was employed for photolithography.”
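The geometry behind the diagonal-wiring claim is easy to check with a rough sketch. The short comparison below is an illustration of why 45-degree routing shortens wires, not Infineon's measured results; the function names are invented for this example.

```python
import math

def manhattan(dx, dy):
    # Traditional right-angle routing: a horizontal run plus a vertical run.
    return abs(dx) + abs(dy)

def octilinear(dx, dy):
    # X-style routing: cover the shorter span with a 45-degree diagonal,
    # then finish with a straight segment for the remaining distance.
    dx, dy = abs(dx), abs(dy)
    short, long_ = min(dx, dy), max(dx, dy)
    return short * math.sqrt(2) + (long_ - short)

# Best case is a purely diagonal connection: roughly 29 percent less wire.
saving = 1.0 - octilinear(10, 10) / manhattan(10, 10)
```

Shorter wires also mean fewer bends and vias along each route, which is where the claimed performance and power gains come from.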

Bits & Bytes

1 - Let me throw my ARM(s) around you ...

1A - Book signing party at DAC

ARM and Synopsys, Inc. announced a joint project to develop a reference methodology to define a coverage-driven verification architecture using SystemVerilog. ARM and Synopsys will publish the methodology in a co-authored book titled, “SystemVerilog Verification Methodology Manual (VMM).”

The SystemVerilog VMM intends to provide engineers with architecture guidelines and industry best practices for more effective and faster functional verification of SoCs. It will also provide verification IP developers with a standard verification architecture to encourage development of interoperable verification IP. The companies say the SystemVerilog VMM will be based on the collective verification and IP experience from ARM and Synopsys, including input from experts such as Janick Bergeron, Phil Moorby, Peter Flake, and John Goodenough. The book will be on show at DAC 2004 in San Diego, CA. That's one book signing we should all be planning to attend.

Per the Press Release: “The manual will describe SystemVerilog language features relevant to functional verification, as well as document a robust, reusable verification methodology to enable faster and more effective design verification. The manual will deliver a specification of a standard set of libraries for assertions and commonly used verification functions, such as stimulus generation, simulation control and coverage analysis to help implement the recommended methodology. It will provide a blueprint for a robust, scaleable verification architecture based on industry best practices. The methodology will address all aspects of functional verification,
including design-for-verification techniques using SystemVerilog Assertions (SVA) for formal analysis and dynamic verification; use of constrained-random stimulus generation techniques; and use of coverage metrics to achieve rapid verification closure. The methodology will also enable verification IP providers to adhere to a consistent and well-documented architecture, enabling end users to easily integrate verification IP from multiple sources.”
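The constrained-random, coverage-driven flow the VMM text describes can be sketched generically. The snippet below is illustrative Python rather than SystemVerilog, and the bus fields and the constraint are invented for the example; it shows the loop of randomizing stimulus under constraints and tracking functional coverage until every legal bin is hit.

```python
import random

random.seed(0)

# Hypothetical bus transaction: a kind and a transfer size.
KINDS = ("read", "write")
SIZES = (1, 2, 4, 8)

def random_transaction():
    # Constraint: this toy bus forbids writes larger than 4 bytes.
    while True:
        kind, size = random.choice(KINDS), random.choice(SIZES)
        if not (kind == "write" and size > 4):
            return kind, size

# Functional coverage goal: every legal (kind, size) bin must be exercised.
goal = {(k, s) for k in KINDS for s in SIZES if not (k == "write" and s > 4)}

hit = set()
trials = 0
while hit != goal:          # coverage-driven closure
    hit.add(random_transaction())
    trials += 1
```

The point of the methodology is that random generation under declared constraints reaches full coverage without a hand-written directed test for each bin.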

Janick Bergeron, moderator of Verification Guild and a principal R&D engineer at Synopsys, Inc., is quoted in the Press Release: “Every verification project requires a detailed methodology that aims for first-time success. The SystemVerilog VMM will teach engineers how to create a single, reusable verification environment that can be used to verify transaction-level models written in SystemC as well as the RTL implementation of the design. The methodology documented in the SystemVerilog VMM will reduce the amount of code needed to write and maintain tests, and enable extensive re-use within and between projects.”

1B - The Encounter Reference Methodology

Cadence Design Systems, Inc. and ARM have announced the availability of the upgraded ARM-Cadence Encounter Reference Methodology, which the companies say now incorporates Encounter RTL Compiler synthesis, and completes another milestone in the first year of the design chain collaboration between ARM and Cadence.

Per the Press Release: “At 130 nanometers and below, wires dominate the performance and present a host of signal integrity problems to be solved in order to achieve first silicon success. The upgraded ARM-Cadence Reference Methodology, based on the Cadence Encounter digital IC platform, provides an integrated, wire-centric RTL-to-GDSII implementation for ARM Partners. This upgraded release of the Reference Methodology enables customers to achieve improved QoS, the new metric of silicon quality, measured after wires for accuracy.”

John Goodenough, Global Methodology Manager at ARM, is quoted: “The ARM-Cadence Encounter Reference Methodology is now available in limited release for some ARM9 family cores. This release delivers significant performance results over the current Cadence methodology as a result of the addition of Encounter RTL Compiler. This open collaboration demonstrates the commitment of ARM and Cadence to increase the access to new-generation nanometer solutions for our mutual customers.”

Jan Willis, Senior Vice President of Industry Marketing at Cadence Design Systems, is also quoted: “The momentum of the ARM-Cadence alliance in the past year has given ARM Partners an open choice of solutions and an open path to the future. Today, using the upgraded Reference Methodology, our mutual customers will be able to achieve outstanding quality of silicon in less time. Our collaboration with ARM will continue to focus on new-generation technology, open standards and optimizing the silicon design chain to deliver the critical solutions needed for nanometer design.”

2 - ChipMD disavows autopsies

Dale Pollek is President and CEO at ChipMD, Inc. I had a chance to talk with him over lunch recently. Here's what he told me:

“As far as the current design methodologies are concerned, especially for analog and mixed-signal design - the vast majority of experts I've talked to admit they're only doing a subset of analysis. They push the silicon out and then attempt to fix the design layers after the fact. They [go into the process], fully expecting to have to do an autopsy on the project. Frankly, that's the mode they've been in for 20 years because they haven't had the tools to do it differently.”

“The only methods available are very compute-intensive, things like running a Monte Carlo analysis - but that's for verification, not analysis. Some don't even do Monte Carlo, they just do the corner cases. I like to say it's as if they're boxing themselves in with the corner cases. Excuse the pun, but you've really got to start thinking out of the box - those corner cases are traditionally derived from digital work where things are very linear and all about speed, power, and chip size. But we're talking about analog and mixed-signal here, and in the analog and mixed-signal world, worst-case conditions rarely happen at those extreme corner conditions.”
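Pollek's claim that analog worst cases rarely sit at the extreme corners can be illustrated with a toy experiment. Everything below is invented for the sketch, not ChipMD's algorithms or data: a contrived gain model whose worst case lies near the center of the parameter space, which corner analysis misses entirely while Monte Carlo sampling finds it.

```python
import math
import random

random.seed(42)

def gain(vth, tox):
    # Toy analog gain model: the worst case sits near the CENTER of the
    # parameter space, not at the extremes (the shape is contrived, but
    # analog circuits can behave this way).
    return 100.0 - 50.0 * math.exp(
        -((vth - 0.5) ** 2 + (tox - 1.0) ** 2) / 0.001)

# Corner-case analysis: check only the four extreme parameter combinations.
corners = [(v, t) for v in (0.4, 0.6) for t in (0.9, 1.1)]
corner_worst = min(gain(v, t) for v, t in corners)   # close to 100: looks fine

# Monte Carlo analysis: sample the full parameter distributions.
samples = ((random.uniform(0.4, 0.6), random.uniform(0.9, 1.1))
           for _ in range(10_000))
mc_worst = min(gain(v, t) for v, t in samples)       # much lower: the real worst case
```

Here the four corners all report healthy gain while random sampling uncovers a severe interior dip, which is the "boxing yourself in with the corner cases" failure mode he describes.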

“Consider that at 0.35 micron and above, analog and digital designs are still being done on two separate chips. Things still work at those geometries because the voltages are high enough, the speeds low enough, and there's not enough ambient flow to affect the performance for designs done at 0.35 micron and above.”

“For designs at 0.35 micron and below, however, people are seeing vast portions of the silicon fail, principally because of the analog portions of the chip. People have now realized that you can't get analog or mixed-signal designs to work at 0.35 micron or below. In fact, estimates are that 50 to 60 percent of the failures are caused by the analog parts today. You're getting a lot of dead, or poorly performing silicon. When you throw in the digital problems, you may only be getting 15 to 20 percent of your product off the line that actually works.”

“Certainly this is why you're seeing a lot of new analog companies coming up - because for 20 years now, engineers have wanted better tools. ChipMD is offering a new set of silicon-proven tools, which have been in production for 2 years now. The algorithms address worst-case conditions, analyzing them over the operating conditions and process variations that the chip may encounter.”

“Take process variations for instance. In a 0.35-micron world, voltages at 1.5 volts or 2 volts might vary by as much as 0.10 volt - it's still very close to the target voltage. In a 0.15-micron world, however, where you're dealing with a threshold voltage in the range of a half a volt - a 0.10-volt shift is a dramatic one. In that case, variations on a relative basis are much larger and can depend on things like oxide thickness, or doping densities, which have a much greater impact at the smaller geometries. In the digital world, you might say 'Who cares - handle it by way
of corner cases.'”
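The arithmetic behind that point is worth making explicit. This is a back-of-envelope check of the quoted figures, not ChipMD data: the same absolute 0.10-volt shift measured against a roughly 1.5 V nominal versus a roughly 0.5 V threshold.

```python
shift = 0.10  # absolute voltage variation, in volts

# Relative impact of the same shift at two process generations.
rel_035 = shift / 1.5   # 0.35-micron era: ~1.5 V nominal
rel_015 = shift / 0.5   # 0.15-micron era: ~0.5 V threshold

print(f"0.35 um: {rel_035:.0%} of nominal")   # roughly 7%
print(f"0.15 um: {rel_015:.0%} of nominal")   # 20%, three times larger
```

A shift that was noise at 0.35 micron becomes a fifth of the threshold voltage at 0.15 micron, which is why process variation dominates analog behavior at the smaller geometries.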

“In the analog world, worst-case analysis doesn't work at the corners. The issues are very different and very complex. We can help look at those issues, do the analysis, and tell you what your specific design will be sensitive to. We help designers make the proper changes and size their devices accordingly, to remove the impact of process variations.”

“Unfortunately, people are often too busy these days fighting fires with outdated technology to look at new technologies. But they need to find at least one member of the team who does more than just fight fires. That guy needs to learn how to stop the fires in the first place. It's like the companies are trying to douse fires without extinguishing the sources of the fires. In this last week alone, we've talked to three different VPs of engineering who all say they're fighting fires, but don't have the time to look at the [technology] on the fire truck.”

“We're talking here about the engineers who haven't had a lot of automation tools up until now. It's true that over the years, all kinds of people have resisted change in how they do things. But the question is always there - can you afford for your competitors to have the solutions and not take advantage of them yourself?”

3 - Proceeding with Prosyd

Quoting the Press Release: “A number of big-name semiconductor companies, with help to the tune of millions of euros from the European Union, today launched a collaborative research effort, based on a popular 'chip making language,' aimed at dramatically improving the productivity of the design methodology used to make microchips. The result of this research effort - to be known as Prosyd - will give engineers across Europe access to what is known in the technical community as 'PSL-based tools and methodologies,' which will better enable them to create higher quality electronic systems with faster time to market.”

“The Prosyd project was officially launched at the DATE'04 conference in Paris, an important European event for companies involved in electronic system design and testing methodologies. The Prosyd methodology will holistically guide engineers, by integrating processes and tools throughout the development process, from specification through design and implementation to verification. The vehicle for achieving this ambitious goal will be the PSL specification language, recently selected by the Accellera EDA standards organization as a basis for an IEEE standard. PSL, based on the Sugar language from IBM, offers a concise and intuitive means for expressing and sharing design intent. The
project will facilitate and encourage the use of PSL by developing supporting tools and expediting their large-scale deployment in Europe.”

“This new property-based paradigm will streamline the chip design process to enable the development of higher quality electronic systems within shorter design cycles and with lower costs. The Prosyd partners represent a wide range of expertise in system design and verification, and in particular have collaborated in the standardization of PSL. The project was a joint initiative of researchers from the IBM Research Lab in Haifa and Graz University of Technology in Austria. The participants include a group of leading European systems companies, R&D centers, and universities, including IBM Haifa Research Lab; Infineon Technologies in Munich, Germany; STMicroelectronics laboratories in the UK, Italy, and France; the Technical University of Graz in Austria; ITC-IRST in Trento; Verimag in France; the Weizmann Institute of Science in Israel; and the Accellera EDA standards organization.”

“Initial funding for the Prosyd project is 7 million euros (about $8.7 million U.S.), of which 4 million euros come from EU funds. The project will last 3 years. A large portion of the activity will be dedicated to marketing and dissemination of the tools and methodologies developed by the project team. It is being supported by the Information Society Technology (IST) sub-program of the European Union's sixth framework of research, and is coordinated by Dr. Daniel Geist of IBM Haifa Research Laboratory.”

In the category of ...

Copy editing, one last time

Apologies to Dino Caporossi, Vice President of Marketing at Hier Design, for misspelling his name and giving him the wrong title in last week's article. Thanks for your patience, Dino.


The EDA industry is one that few people walk away from, once caught up in the ebb and flow of the thing. No matter what skill set you bring to this industry - whether it's in management, technology, marketing, sales, PR, editorial, or industry analysis - somehow there's a sense of community here that keeps drawing people back into the fold, even if they wander away for a while. I sometimes struggle to identify what the over-riding themes in EDA actually are, the themes that make this such a unique, enriching, and oftentimes difficult place to inhabit - or walk away from.

Some days, I think it's that peculiar mix of technology that seems best suited to the innovative atmosphere of small companies that's most unique to EDA. On other days, I think it's the comfort and ballast offered to the industry by the handful of relatively gigantic corporate entities that's peculiar to EDA - a level of comfort and ballast that actually provides and protects the environment within which the small companies can afford to thrive, even with all of their endearing complaints, and grousing, and David-like throwing of stones against the Goliath-like armor of the Big Guys.

Sometimes I think it's the hubris and swagger of the Goliaths - their ego-soaked attitudes often in stark and bemusing contrast to the nimbleness, honesty, and humility of the small guys - that uniquely characterizes EDA.

At other times I think it's that mix of world-class intellects - and anybody who doesn't think there are, pound-for-pound, more highly educated people in this industry than anywhere else in the high-tech world doesn't really know this industry - and the small, almost incomprehensible bits of intellectual property that they come up with day-by-day which characterizes this place. Or, it's that unique mix of intellect and inventiveness which fosters more layering upon layering of legal infrastructuring/wrangling - in the vain hope that you'll get paid for what you invent and I'll get paid for what I invent - that defines EDA above all else.

Of course, patents and legal fees aren't unique to EDA, nor are venture capitalists and the influence they wield unique to the industry, nor is the power of partnering with universities, IP vendors, foundries, FPGA guys, fabless guys, or the big IDMs, unique to EDA. But there is a special combination and permutation of all of these influences taken together, which pervades the very air we breathe here in EDA in such an intoxicating way - more, in fact, than in any other industry I'm aware of.

The EDA industry is so amazingly small in comparison to the enormous/plucky shadow that it casts across the global semiconductor landscape. EDA is just tiny - many, if not all, of the major players are on a first name basis. Practically everybody's worked with everybody else at one point or another. The place is like a family.

That intimacy invites comparison with the old adage that you can choose your friends, but you can't choose your relatives - a lament that probably characterizes attitudes in this intimate EDA fraternity more frequently than not. It's hard at times not to see the EDA industry as a game of water polo, stimulating and well coached from above - team spirit and sportsman-like conduct all the norm - while beneath the water's surface you suspect there are some questionable moves and an inordinate amount of bloodletting going on.

Maybe the unique thing about EDA is the tools themselves. How in hell does anybody really know how many millions of lines of code are actually embedded in all of those tools, how many algorithms, brilliant ideas, stupid ideas, inefficient nested loops, kludgey solutions, elegant solutions, stolen solutions, secret solutions, old ideas, new ideas, cobbled together new and old ideas, are embedded in there? Who could ever unravel it all? Who would ever really want to? After all, it's the bedrock of the industry; it's the ground we maneuver on.

It's the stuff that executives and managers base salaries, stock options, and annual bonuses on. It's the stuff professors and graduate students build grants on. It's the stuff developers and users chew on. It's the stuff Sales and PR and MarCom brag on. It's the stuff lawyers brief on. It's the stuff that editors and gadflies critique on. It's the stuff analysts prophesy on. Who'd ever really want to unravel it all? Who'd want to destroy the mystique, the mystery, the bigger-than-life nature of it all?

It would be like disassembling the human brain and trying to find the spark that makes the thing work as an inspired and integrated whole. The tools do somehow work as an inspired and integrated whole - rarely perfect and often unreliable. But then, the human brain is rarely perfect or reliable either, and we can't seem to live without it. Our lives are built on it, and so it is with the EDA tools.

In the end, however - all of the attributes listed above notwithstanding - there probably is one thing that uniquely characterizes, motivates, and redeems everything else that goes on here:

“It's not enough to demand respect for what you do. You need to inspire respect for what you do.”

Maybe it's that alone which uniquely characterizes EDA. There's only one way to get respect in this industry and that's to earn it. It's too small of a place to have it be otherwise. The true contributors know it. It's fundamentally what makes them tick. It's the glue that keeps the industry together and the reason that people rarely walk away. And ultimately, I think it's what makes EDA such a darn interesting place to be.


-- Peggy Aycinena, EDACafe.com Contributing Editor.