November 05, 2007
Brown Bag Lunch: Sanguinetti & Sandler



by Peggy Aycinena - Contributing Editor
Posted anew every four weeks or so, the EDA WEEKLY delivers to its readers information concerning the latest happenings in the EDA industry, covering vendors, products, finances and new developments. Frequently, feature articles on selected public or private EDA companies are presented. Brought to you by EDACafe.com. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us! Questions? Feedback? Click here. Thank you!




It’s autumn and the industry’s as busy as a gaggle of ground squirrels preparing for winter. EDAC’s announcing robust growth numbers for the industry, plus celebrating U.C. Berkeley’s Dr. Robert Brayton as the 2007 Phil Kaufman Award winner with help from IEEE’s Council on EDA. CoWare’s announcing their new ESL 2.0 initiative. There have been multiple acquisitions: S3 has acquired Acacia Semiconductor; Apache has acquired Optimal; and CLK DA has acquired Synchronous. And ITC 2007 has come and gone leaving multiple test-related announcements in its wake.

Meanwhile, if you’re finding yourself a little short on time this week, you’re not alone. The 11th OpenAccess Conference is happening all day on Monday, the 5th, in Santa Clara. ICCAD 2007 starts at the DoubleTree in San Jose on the 5th and wraps up on the 8th. The IBM/Samsung/Chartered Common Platform Technology Forum is happening in Santa Clara all day on the 6th. And the 5th International SoC Conference is happening on the 7th and 8th in Newport Beach. If you’re still raring to go next week, Denali’s putting on their PureSpec SystemVerilog course on the 16th.

The details of these and other developments can be found below. First a brief recap of last week’s Kaufman Award Dinner, and then a Brown Bag Lunch with Novas Software’s Scott Sandler and Forte’s John Sanguinetti.

John and Scott can be admired for their technical and business savvy, for their friendship and candor, or for all of the above. It’s up to you to decide. But first, it’s November and the cold is setting in. So go get that grande, extra-hot, non-fat, white mocha and click on “Print Article” up there on the right. You’ll be able to enjoy your coffee and uninterrupted access to this article at the same time.

***************************

EDAC News …

Dr. Robert Brayton, the 2007 recipient of the Phil Kaufman Award, was honored Thursday, November 1st, at a dinner in Santa Clara hosted by the EDA Consortium, the IEEE’s Council on EDA, and the Silicon Valley law firm of White and Lee. “Sincerity” was the operative word for the evening.

EDAC Executive Director Bob Gardner was MC for the evening. The after-dinner presentations began with Synopsys CEO and current EDAC President, Aart de Geus, welcoming the room full of old friends, thanking the EDAC Board for another year of hard work (complete with visuals of CG-enhanced hair-dos for various Board members who shall remain nameless), and presenting a series of slides reflecting the state of growth in the industry, which is currently something to celebrate (see below).

U.C. Berkeley’s Alberto Sangiovanni-Vincentelli then offered an extensive introduction to the evening’s guest of honor in a very Italian and affectionate re-telling (read “roast”) of Dr. Brayton’s rise to prominence in industry (26 years at IBM) and academia (26 years at U.C. Berkeley). Dr. S-V detailed Brayton’s seminal contributions in computer science (the first LISP compiler), in mathematics (the Brayton-Moser theorem), and in circuit simulation and synthesis (Espresso, MIS, SIS, et al). He had the crowd in his hands in describing his many decades of association with Dr. Brayton, his admiration for Brayton’s legendary intelligence, and his affection for Bob Brayton the Family Man (the Brayton family were at the head table), Brayton the Competitive Athlete (per Divine decree, Alberto has never beaten Brayton at tennis), and Brayton the Exceptional Friend, Teacher, and Mentor. It was clear from Dr. S-V’s slide show that Brayton has influenced or been associated with a remarkable number of the major players in EDA over the years.

When Dr. S-V was done, Dr. Brayton took the stage and gave an emotional and understated speech of his own. He thanked his wife, acknowledged with pride his accomplished children, their spouses, and his grandchildren. He acknowledged the many people who have influenced his life, expressed gratitude for the multitude of opportunities that have come his way, singled out his many friends in the room who had come from far and wide to attend the dinner, and thanked the EDA industry for pulling out all the stops to publicize his award and arrange such a spectacular evening in his honor. He was rewarded with a standing ovation.

Over the years, the Kaufman Award dinner has frequently, and perhaps not surprisingly, been both a deep reflection of the state of the industry and of the character of each particular year’s award winner. The 2007 Kaufman Award dinner was an evening memorable for its elegant simplicity and the sincerity with which Dr. Brayton’s contributions and personal accomplishments were honored. It was an evening that brought out the best in the EDA industry, and was an appropriate and welcome reflection of the dignity of the 2007 Phil Kaufman Honoree.

[To learn more about Dr. Brayton, please visit my recent profile in the DACeZine.]

EDAC’s Market Statistics Service announced that the EDA industry’s revenue for Q2 2007 increased 11.4 percent over Q2 2006, from $1.265 billion to $1.409 billion. Not too shabby!

***************************

Brown Bag Lunch: Sandler & Sanguinetti …

Scott Sandler is President and CEO of Novas Software, and sits on the EDAC Board. Scott’s been in the business for well over 20 years, at Intel, Gateway Design Automation, Cadence, and Chrysalis. His degree in CSE is out of the University of Massachusetts.

John Sanguinetti is CTO of Forte Design Systems. John’s also been in the business for 20-something years, at DEC, Amdahl, ELXSI, Ardent, NeXT, and at Chronologic Simulation, which he founded in 1991. John also co-founded CynApps, and was the principal architect of VCS. His Ph.D. is out of the University of Michigan.

Our conversation over lunch unfolded in a conference room at Novas, and ostensibly was to be about verification. Not surprisingly it started there, but moved on. For openers, I asked John and Scott for an update on how Novas and Forte are doing these days. Scott put his hands behind his head, leaned back in his chair, and assured me that our get-together had not been contingent on that messaging. I asked for the update anyway.

Scott – For Novas, we’re feeling it’s up and to the right. Our standard products are doing fantastically, which gives me the opportunity to talk more about verification and where it’s going. Within the context of Novas, we feel that debug enhances verification, so everything we work on enhances verification, particularly around the simulator. Digital functional verification is fundamentally simulation based, and making that flow work better is something we’re dedicated to here at Novas.

Peggy – John? Forte?

John – ESL is an important part of this market. Forte’s not in verification per se – we provide a bridge to implementation from ESL to producing hardware. But from a verification point of view, ESL is pretty interesting. It addresses many of the problems that people are dealing with. A lot of the problems people discovered in traditional verification don’t happen at a higher level, because verification activity changes at that level. The level of abstraction of the design affects the verification activity.

Peggy – Scott, can you give me a status update on the Design for Debug Consortium that was introduced at DAC several years ago?

Scott – Frankly, that never went anywhere. It was a good idea, but it was ahead of its time. [We felt debug] was a fragmented and slowly emerging practice – fairly well established at the processor level, but much less established at the bit level. All of the big companies have their own ways of doing this, but it’s more in the research phase today. We attempted to get people to come together over those practices and talk it over, which they did.

The commercial deployment of DFD techniques in the silicon is nascent, infinitesimal in fact, although it’s fairly well established at the processor level. At the bit level, the necessity has not yet proven itself, so it remains to be seen where all of that goes. We have not dropped any of our interactions with any of those partners or customers, but the need for a Consortium has not really proven itself.

Peggy – How about the concept of verification planning? What’s your take on that?

Scott – What’s so new about that idea? When I started at Intel in 1983, we spent a lot of time on verification planning. [In fact], when I met John Sanguinetti in 1986, he was a verification engineer.

John – That was a time when you figured out how you were going to get the tools for your model, you didn’t spend a lot of time figuring it out ahead of time. Even today, not nearly enough planning [goes into verification].

Scott – We spent a lot of time figuring out what the interfaces were, and in what sequence, to make ourselves confident that we’d covered the bases. We had to make sure that one Write didn’t step on another, to look at all the scenarios. But those designs were fairly straightforward compared to today’s SoCs. Now scenario planning is a big part of verification. There was directed test, and then with the advent of Specman, we got commercialization of directed, random test, which had been done by hook or crook with C or SystemVerilog. The verification community still doesn’t use verification testbenches. There’s no built-in, standard way of doing directed random testing.

It’s a theme I’m [actually] playing with right now, because there’s a lot of talent and human effort and knowledge involved in verification, but nobody’s figured out how to automate verification planning.

John – If you could automate it, that would be tantamount to doing a formal proof. It’s like most computationally intractable problems, if you narrow it down it can be solved in one way or another. That’s how verification’s been done – not by trying to generally prove that one specification matches another, but by trying to prove that this particular representation is the implementation. That’s where people can succeed.

When I first got into design verification, I was trying to verify the design of a whole computing system. [We used] a graphics supercomputer that did 8 megaflops. We built a whole system model and then asked what to do with it. We figured out how to do that, and then figured out how to do directed tests. Each designer would do his design in schematic capture, run a few vectors by hand, and then stick it in at the system level. When I first learned how to approach directed tests, [we would] line up events on this cycle and figure out how we were going to cover interesting cases.

Scott – We’d walk them past each other, and see which one succeeded.

John – After I decided I was spending too much time writing directed tests, I figured out how to write random tests. That was in 1988.

Scott – The only change was, with the advent of e, Vera, and SystemVerilog, it was easier to create directed and random tests, and to blend them in more powerful ways – and it was easier to measure what’s called coverage. There was a fairly well established practice. Remember that ‘ology’ means study, so I prefer to use the term 'practice,' not 'methodology.'

[You] create a bunch of directed tests, and make them directed random – predictable, but widely varied ways to hit as many corners as you can, and measure if you’ve done enough. So, functional coverage and closure gets tied in with planning. You have some metrics, and you build towards closure with respect to the metrics.
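
[Editor’s Note: For readers who want to see the practice Scott describes in concrete form, here is a minimal SystemVerilog sketch of a directed-random, coverage-driven testbench. Everything in it – the bus_txn transaction, its fields, the constraint, and the coverage bins – is invented for illustration and taken from no particular product or flow.]

    // Directed-random stimulus plus functional coverage, as described above.
    // All names (bus_txn, addr, kind) are illustrative only.
    class bus_txn;
      typedef enum bit {READ, WRITE} kind_e;
      rand bit [7:0]  addr;
      rand bit [31:0] data;
      rand kind_e     kind;

      // The constraint is what turns "random" into "directed random":
      // stimulus stays varied but is biased toward the plan's corner cases.
      constraint corners_c {
        addr dist { 8'h00 := 1, 8'hFF := 1, [8'h01 : 8'hFE] :/ 8 };
      }
    endclass

    module tb;
      // Functional coverage: the metric against which closure is measured.
      covergroup bus_cg with function sample(bit [7:0] addr, bit kind);
        cp_addr : coverpoint addr {
          bins low = {8'h00}; bins high = {8'hFF}; bins mid = {[8'h01 : 8'hFE]};
        }
        cp_kind : coverpoint kind;
        addr_x_kind : cross cp_addr, cp_kind;  // e.g. Writes hitting each corner
      endgroup

      bus_txn txn = new();
      bus_cg  cg  = new();

      initial begin
        repeat (1000) begin
          if (!txn.randomize()) $fatal(1, "randomize() failed");
          cg.sample(txn.addr, txn.kind);
          // ... drive txn into the design under test here ...
        end
        // "Measure if you've done enough": build toward closure on the metric.
        $display("functional coverage = %0.2f%%", cg.get_coverage());
      end
    endmodule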

John – A large part of the effort in verification planning is trying to understand what the issues are with your design. What’s important, what set of transactions are important in test, and what’s not? There’s new technology out that people are working on – being able to analyze a design and identify the states that need to be covered, and then determine whether your set of tests have covered those states. There’s actually a fair amount of reason to be optimistic that the technology will improve.

Peggy – Optimistic that we’ll get there?

John – Optimistic that the tools and technology will become available that will give more confidence in the determination of, ‘Are we done yet?’ You never knew for sure [in the past] in the absence of formal methods. You never knew if you’d simulated enough.

Scott – We still don’t know.

John – With simulation, but it’s not a formal method.

Scott – The coverage metrics being used are low level. TransEDA pioneered toggle coverage – branch coverage, derived from code that has errors in it. You have the assumption that if you exercise these things, you’ll have found whether things work. There’s a lot of work going into how to automatically improve the testbench based on some end-to-end analysis of stimulus flows through the design. There’s as much IP and creativity and engineering in the stimulus generator and response checker today as there is in the design. Tools that improve levels of abstraction and the ability to measure the quality of the end-to-end process are all the rage. I really think people are onto something in trying to automate this, because right now it’s really only determined by the talent of the verification engineer.

Peggy – Do we have enough talent working on these things?

John – There’s never enough.

Scott – Your car, the eBay site, your computer – they all work. So A, it’s never good enough. But B, it is, in fact, good enough. So there’s an interesting dichotomy in the situation.

John – I mean it’s never good enough in terms of efficiency. A lot of the chips in your computer [function], but better verification would have reduced the costs.

Scott – People make trade-offs and decide how much better they can afford to make the design before they spin it. The question is: How do you find the bugs and how far do you go? Emulation enters the picture, and then you run the software before you actually build it. But there are a lot of choices to be made.

John – One of the interesting things we’re seeing at Forte is that virtually all of our customers are using FPGAs for prototyping. Most of our business is in consumer electronics, big SoCs. People are seeing so many limits to traditional design verification that it’s worth it to them to build a prototype and to try to run it as much as they can. But even then, it’s a big verification effort. What are you going to run on this prototype? Are you sure you have enough real cases to exercise everything?

Scott – That’s another really important part of this conversation – the software is now part of the system. It’s not something that sits on top of the design that we’re trying to verify. People have been trying to point this out for a long time, that design must be a blend. There’s a cultural inertia, although [at last] we’ve finally broken out of that. Now we’re looking at the verification problem as a combined effort of verifying the software and the hardware together.

John – That’s probably true, but it’s not all that much of a sea change from where we were before. My training before design verification was in software and operating systems. I did a stint doing performance analysis for a couple of computer makers, and was really astounded when I was doing verification and bringing up machines. We’d do work-arounds in the software and then ship the thing. It was pretty astounding the extent to which [we would do that].

Scott – Was it culturally ingrained in those days, or was it hardware verification and then software verification?

John – It was a serial process. In the end, it was always a combination of this particular version of the operating system with that particular hardware. The hardware was built to a specification, and the software was implemented to that specification. If we discovered the hardware didn’t really match the specification, we’d change the software.

Scott – With additional capacity on the silicon, now we’re embedding the processors and the software onto the chip. It’s embedded software now. The application that gets shipped in the black box, the IC, includes lines and lines of code. My Sony camera? Internally, there’s the SoC and there’s a set of programs, depending on the model of the camera. You can buy a different Sony, but it’s the same hardware. Only the software’s different.

Peggy – But doesn’t that make it more hackable, like the iPhone?

Scott – Sure, so you need to verify that it’s not hackable. You invite people to hack into it by putting in additional [safeguards, and letting them try].

John – That’s been going on for 20-plus years. When I got to Amdahl, I was appalled to discover they were going to create a family of computers out of the 580, but only had one machine. They just changed some code in the thing, some macrocode, [although] that slowed the thing down. That was back in 1982.

Scott – Didn’t they get sued over that?

John – How can you get sued for that?

Peggy – Okay, back to verification.

Scott – With verification, you need to add more value by tying things together to deliver more of what’s missing for the customer. Some of our customers can afford to hire the best people. But others can’t, and they struggle more. [Clearly] there’s a continuum of talent and different companies pay at different levels, so the key word in “EDA” is “Automation.” We look at what our customers are doing, and how they’re spending their time and energy, and we seek to automate that. For the most advanced customers, it’s way harder to automate [their designs], but for the run-of-the-mill customers – the vast mainstream – it’s easier. Back in the 1980s, John and I were writing directed tests and making the tools. Then [various companies] came along, and they wrote programs and automated the process. It’s my observation that EDA is about creating practical programs that automate best practices. Verification planning is the next step in that automation.

John – That’s a real salient point. In EDA, in general, automation has value to the extent that it automates a process. That’s not just in verification, but in all EDA products. They’re only valuable to the extent that they automate a process that’s not possible, or is too tedious to be done by hand. EDA tools don’t just automate best practices. When you provide the tools to the leading-edge guys who need them to build the leading-edge products, the tools themselves really need to lead.

Scott – Can you give an example?

John – High-level synthesis.

Scott – That’s being done by a human?

John – Probably. How about place and route?

Scott – Sometimes that was best practices and sometimes it was grunt work. How did we verify before simulators? Somebody wrote 1’s and 0’s, and watched them propagate [through the design]. And we needed models for transistors. There have been layers and layers of programs written there; that was always grunt work before. Timing analysis is another example. On my first chip at Intel, I created a list of timing parameters for every cell. I had a cell for every path and referred back to the models along the way. Whenever we changed something, it was recalculated. I did it on an Intel blue box in SuperCalc.

Peggy – How would the flows look today if we could start all over again from scratch?

John – That’s fairly difficult to do. We’ve spent the last 9 years trying to do that.

Scott – There have been various attempts to do that. I think there can be vastly different, alternative ways to specify and implement logic into silicon. Getting them to be widely adopted is another big challenge. We’re talking about people here, people with habits who don’t like to take risks. Look back at silicon compilation. Those were efforts to redefine the whole process end-to-end. We haven’t seen anything like that emerge or be discussed [for some time], although some academics are still talking about it.

John – There are a number of companies who have been engaged in electronic design who have developed their own methodologies, and in some cases they look very different. I’ve met with customers in military electronics who have a set of constraints [which define their] business. The military has platforms that must last 30 to 40 years, with the electronics replaced periodically. They’ve got boards crammed full of chips [trying to move to] one big board and one big chip. There has to be absolute compatibility, with not one line of software changed. These guys develop their own methodology that’s completely different from what we use. They're deciding right now if this is too big of an overhead to keep going this way. They may want to switch to commercial tools.

Scott – We’ve all run into companies over the years that wanted to build their own flows, but it became a liability. So they bought commercial tools and moved on. Even today, however, the big companies still have their own simulators and tools.

John – Look at NEC. By far, they have the most experience doing ESL design. They’ve had their own tools for 10 to 15 years. Periodically, they’ve thought they wanted to commercialize them, but have never succeeded because it’s an awfully big investment to do that.

Peggy – Is EDA about revolution, or about quick-paced evolution?

Scott – In all walks of life, there will be revolutionaries. Mostly they will be outcasts or ignored, if not shot. But from time to time, revolution does in fact happen. Was synthesis a revolution? Not really. It was the automation of something that had already been done manually.

John – Yes, but a qualitative change happened because synthesis allowed a change in the level of abstraction.

Scott – Was that revolutionary?

John – Yes, it was.

Scott – But correct-by-construction says the software system can guarantee that automated implementation is, in fact, correct without manual intervention.

John – Correct-by-construction is automating the transition from one level to another. There’s nothing wrong with that.

Scott – But people have not been able to trust the transition. Why do we still do gate-level simulation and timing analysis? Lots of effort still goes into checking the results of the synthesis tools.

John – Most of the effort goes into verifying at the level of abstraction of the design.

Scott – Constructing models of the specification at two different levels of abstraction.

Peggy – Isn’t that what John’s doing?

John – Not really.

Scott – He’s taking a specification, creating a C model, and then creating an RTL model.

John – We’re saying, you’ve got a C model, create the RTL model, and verify that the C model is right. That’s really the right way of doing things – if you’re in love with building two models.

Scott – I’m not here to take a stand on the right way to do things. But there are a variety of different ways in use today. There are companies making profitable chips [doing things a variety of ways].

John – Pretty much everybody these days uses a high-level model of their specification. Not everybody uses a means [or a tool] of moving to a lower model. The two-level model is a problem we’ve had for a long time. Basically, you have to have two different models to compare them. It’s very real that the testbench and a set of tests that you run is the model, typically regression suites. You’re comparing the two. People have been trying to figure out how we automatically derive the testbench and vectors from the design.
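
[Editor’s Note: As a concrete illustration of the two-model comparison John describes, here is a minimal SystemVerilog sketch in which one stream of random stimulus drives both a small RTL block and a higher-level reference function standing in for the C model, with a checker comparing the two. The saturating adder and every name in it are invented for this example.]

    // Two models of the same specification, compared vector by vector.
    // sat_add8 (RTL) and sat_add8_ref (high-level reference) are both
    // hypothetical stand-ins; neither comes from the interview.
    module sat_add8 (input  logic [7:0] a, b,
                     output logic [7:0] y);
      logic [8:0] sum;
      assign sum = a + b;
      assign y   = sum[8] ? 8'hFF : sum[7:0];  // saturate on overflow
    endmodule

    module tb_compare;
      logic [7:0] a, b, y_rtl;

      sat_add8 dut (.a(a), .b(b), .y(y_rtl));

      // Reference model at a higher level of abstraction (the "C model" role).
      function automatic logic [7:0] sat_add8_ref(input logic [7:0] x, z);
        int unsigned s = x + z;
        return (s > 255) ? 8'hFF : s[7:0];
      endfunction

      initial begin
        repeat (10000) begin
          a = $urandom();
          b = $urandom();
          #1;  // let the combinational logic settle
          assert (y_rtl == sat_add8_ref(a, b))
            else $error("mismatch: %0d + %0d -> rtl %0d, ref %0d",
                        a, b, y_rtl, sat_add8_ref(a, b));
        end
        $display("two-model comparison finished: 10000 vectors checked");
      end
    endmodule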

Peggy – Do we call that innovation or implementation?

Scott – Innovation comes in creating, in building the automation in a practical way. Remember the example of simulation. When Verilog/VHDL came on the scene, there were a whole variety of simulators in use, and in various stages of market acceptance. For a variety of reasons at Gateway, we were able to effectively wipe them off the map. There were 8 different multi-mixed level simulators, RTL mixed with gate level, and most of them had warts. They weren’t as fun or as effective, and they withered on the vine. The analogy today – there are Hyundais and there are BMWs.

Peggy – But are the differences there real or psychological? They’ll both get you where you’re going.

Scott – If you measure the speed to go from 0 to 60, the differences are very real. In our world, there was Cadabra, but Verilog was so much more satisfying.

John – It was easier to do the job I wanted to do. The first job I had in simulation, I had to choose between N-dot and Verilog. I looked at both simulators and decided I could write a system model in Verilog, and couldn’t do it in N-dot. Verilog addressed the problem better.

Scott – It was more satisfying. In EDA, some things work well. Some things don’t.

Peggy – Does the best technology always win?

John – That’s the conventional wisdom in Silicon Valley. There was Betamax versus VHS, but the more technical the field, the less likely that some marketing organization can push an inferior product onto the customers. That said, we know of examples in our industry of products that didn’t make it because of a comparable product from a more powerful vendor.

Scott – John’s right. The more technical the stuff is, the less impact of marketing. Keep in mind the premise of crossing the chasm [involves] the satisfaction of buying and using a product. Overall, how well is it satisfying the needs of the purchasing organization? The answer has more facets than just being the fastest thingy or the most accurate. It also has to do with the infrastructure around the product. A really fast simulator that no one knows how to create is useless. There may be one team who can create it, but other guys will say it’s a waste of time.

The infrastructure around VHS was there, but it wasn’t there for Betamax. So, it’s not just about the technology, it’s also about creating a complete product. If you get too focused on the technology, you lose sight of what it really takes to satisfy the customer. That’s when the best technology doesn’t win. Even Verilog might not have succeeded if we hadn’t asked the ASIC vendors to build models for their libraries. We had to enhance the language to succeed.

John – That was ultimately the differentiator between Verilog and VHDL. VHDL would have succeeded if it had wiped out Verilog. That was the intention, but it couldn’t do gate-level simulation.

Peggy – So, are the Big Guys in EDA setting the pace?

Scott – Hell no. They compete tooth and nail around pricing and supplying multiple pieces [of the flow]. But we provide the better mousetrap.

John – We do have constraints in the general design flow – the small companies can’t do everything – but often we can fit better into the design flows of the larger companies than their own [point tools].

Peggy – Isn’t there an established infrastructure that must be met?

Scott – It’s not enough to have great technology. One of the aspects that we’ve mastered in creating complete products is to make sure that our great technology fits into the existing infrastructure.

Peggy – Can you maximally innovate?

John – Maybe not maximally, but you can still innovate. When I started Chronologic, Redwood DA started out at the same time. They were making a compiled simulator for their own language. It was supposed to be 10x faster than the existing simulators, and they built a whole collection of stuff around it. They went out [into the market] at the same time that we did our Verilog simulator. There were benchmarks where their simulator would go faster than VCS, but my analysis and characterizations were better. They were saying to their customers, ‘You have got to come to our Brave New World and leave your old baggage behind.’ But, VCS was successful because we said, ‘You don’t have to leave your baggage behind.’

Scott – ‘We’ll carry your baggage for you.’

John – Redwood had good technology, but they didn’t make it.

Peggy – Can additional investment make a difference, particularly in a small company?

Scott – It’s [often] simply a matter of how much you’re willing to invest. Is the idea worthy of that investment? It takes a lot of appetite for risk to go after the whole enchilada. Over the last 20 years, the bar has been raised and the threshold is much higher now because of the depth and breadth of the 3 major companies supplying virtually the same stuff. So [you need to provide] a very sharp arrow that fits within the existing flow. If you’re going in broadside, taking on the big companies, you need an enormous amount of energy.

John – You have to have a compelling proposition. We see that with ESL. The value proposition with ESL is simply compelling, assuming all the ESL tools live up to the promise. [Still] there’s a lot of inertia out there that always prevents people from saying we’ll switch to ESL.

Peggy – Differentiate between your two worlds? Between working at the current level of abstraction versus moving up?

Scott – It takes just as much innovation to improve an existing practice as it does to move that practice to a new place. Comparing the two of us: at Novas, we’re focusing on improving existing practices.

John – We’re focused on moving the practice to a new place. A great deal of our effort over the history of the company has been trying to make the move to higher levels of abstraction in a seamless way.

Peggy – Why do it in a seamless way?

John – Because we think it can be done. We’re just layering a level of abstraction [on existing levels], but at the level of the details, it does change the way people do design. It’s actually much more disruptive than I would have thought back in 1998 when we started the company.

Peggy – I’m not trying to describe your work as dull, Scott, but where’s the innovation?

Scott – Well, your notion of innovation is quite generalized. In some ways, you’re equating innovation with a sea change within the whole sales channel. And I would say, yes, sometimes innovation requires a sea change. At other times, it takes just as much innovation and engineering to make important incremental changes in present practices. It’s easier to throw things out and start [from scratch] than to make improvements in existing practices.

Peggy – Isn’t it about engineering versus science?

Scott – No, it’s about having marketing, which is not at all silly. The core of marketing is understanding the customer’s situation and understanding what it will take to satisfy the customer. That’s working with both scientists and engineers to come up with a product plan. Apple doesn’t succeed because of its brilliant promotion. Its success is in a deep understanding of what will satisfy the customer, the investor, and the engineer. That’s what it’s all about.

Peggy – I heard the Chancellor at U.C. Merced commenting that engineering without commercializing is not valid.

Scott – Okay, but engineering is not just about productization.

John – It’s about doing for a dime what any fool can do for a dollar. That’s what we’re doing in engineering. We’re taking basically pure science and reducing it to a product and doing it in a more efficient way.

Peggy – How is that innovation versus implementation?

John – If we implemented without innovation, we wouldn’t have much fun in our jobs.

Scott – It’s the process of adding value. We’re in the business of generating wealth. Our customers and our competitors are in the business of generating wealth. Innovation is key in that, in order to generate wealth, you need to bring something new to market. Behind innovation is the idea of newness.

I think there’s innovation in implementation quite often. You may create exactly the same thing by implementing in a new way. You can’t tell the difference between VCS results and Verilog-XL results. The black box was identical, but inside it was completely different. The Verilog language was much more visibly innovative. To touch it and use it was different than using N-dot. And the next thing that John did was to create VCS. That innovation there was also hidden away in the implementation.

John – Particularly in EDA, and in most things – why would you bother to implement if you didn’t innovate?

Scott – Bringing something to market to make a profit implies there’s been innovation. But the idea without the implementation? You needed both – the language and the simulator.

John – If you’re trying to equate engineering with just implementation, then science is innovation.

Scott – But that’s wrong!

John – In our field, the implementation is not pure science. We all have R&D departments, and titles of VP of Research [on staff], but we don’t do a lot of basic research. But in our development, there’s still a great deal of innovation.

Scott – Our research takes the form of exploring outwards from our established value – where our product or technologies can automate steps. Research means looking for other parts of the flow that we can extend our technology into in order to add value to the customer. It’s research because we’re looking into the unknown. It’s innovation to make faster timing analysis tools.

John – The problem that we have is that we deal with heuristics a great deal. It’s research to come up with new heuristics.

Scott – It’s trial and error, which engineering has a great deal of, when you’re automating something that no one’s ever automated before. Last year at DAC, we heard from customers creating for the first time the most complex products ever designed. They can’t test the products because the test cases are so complex. These customers will break our EDA products, so how do we deal with that? What expectations do we set for these customers? How are our organizations set up to deal with these realities? How do we build resilient software that can be readily adjusted based on real-world realities? The innovation in all of this for EDA is creating architectures for these customers, and yet not changing the practice of their design. This is innovation!

John – Pretty well said.

Peggy – Ditto, and thanks.

***************************

CoWare, ITC, M&A, and SOI Consortium …

[Editor’s Note: Due to space limitations in EDA Weekly, which is now running every other week, the extended interviews related to these news items can be found in EDA Confidential. Thanks for checking them out.]

* CoWare announced “the first product release supporting companies in their transition from the proof-of-concept ESL era to ESL 2.0. ESL 2.0 refers to a second generation of ESL solutions, which aim to facilitate the design and development of processor-centric, software-intensive products with complex interconnect and memory architectures, in a production environment … CoWare is the first company to deliver a comprehensive and integrated offering for the ESL 2.0 era. The release supports the larger ESL 2.0 community with new features and benefits through a set of six solutions: Platform Architecture Design, Software Development, Platform Verification, Application Sub-System Design, Processor Design, and DSP Algorithm Design.”

* DeFacTo Technologies announced a new DFT product, HiDFT-Scan, that analyzes RTL integrated circuit and system-on-chip designs, creates appropriate RTL scan test structures, and inserts them into the RTL design. “HiDFT-Scan works within existing design flows and with industry-standard synthesis tools. Because it eliminates the need for gate-level scan, the new product has enabled chip designers to create the industry's first high-level DFT sign-off methodology. HiDFT has been used on customer designs in both the U.S. and Europe.”

* Magma unveiled Talus ATPG and Talus ATPG-X with on-chip compression. The company says these ATPG products “enable designers to significantly improve test quality, reduce turnaround time and cut costs of nanometer ICs. By integrating Talus ATPG and Talus ATPG-X into the Talus physical design environment, Magma offers the only IC implementation flow that provides true physically aware DFT.”

* Apache Design Solutions, “the technology leader in power signoff and complete silicon integrity platform solutions, announced that it has signed a definitive agreement to acquire Optimal Corporation, a leader in 3D power, signal, and thermal analysis for package, System-in-Package, and board designs.”

* Silicon & Software Systems (S3), “the Connected Consumer Technology Company and leading provider of semiconductor mixed signal Intellectual Property (IP) for consumer products, announced that the company has acquired Acacia Semiconductor S.A., a developer of high-performance data converter IP based in Portugal. This acquisition is designed to strengthen S3's position in the market and help deliver on its strategy to be the world-leading specialist provider of analog mixed signal IP.”

* CLK Design Automation “announced that it has acquired Synchronous Design Automation, an innovative developer of clock tree synthesis and optimization tools for advanced digital chip designs. Combined with the leading-edge timing and signal integrity capabilities offered by CLK Design Automation, the result is a comprehensive suite of automated timing closure tools for 65-nanometer and 45-nanometer designs.”

***************************

Money matters …

* Applied Wave Research, Inc. (AWR) announced sales and revenue for the first half of its fiscal 2007, which ended September 30, 2007. Per the Press Release: “Sales revenues for the first six months increased nearly 30 percent over the prior year … [and] resulted in record profits for the fiscal second quarter.” Ted Miracco, AWR Co-founder and EVP, is quoted: “Sales in all geographical regions—North America, Europe, and Asia—are well ahead of expectations, with China and Germany experiencing the fastest growth.”

* Bluespec announced $4.25 million in Series C funding, bringing the total amount raised in three funding rounds to $17.25 million from investors Atlas Venture and North Bridge Venture Partners.

* Cadence Design Systems reported Q3 2007 revenue of $401 million, an increase of 9 percent over the $366 million reported for the same period in 2006. On a GAAP basis, Cadence recognized net income of $73 million, or $0.24 per share on a diluted basis, in the third quarter of 2007, compared to $42 million, or $0.14 per share on a diluted basis, in the same period in 2006.

* Chartered Semiconductor reported Q3 2007 revenues of $354.8 million, up 9.4 percent sequentially, and revenues including Chartered’s share of SMP of $381.8 million, up 8.2 percent sequentially. The company also reported net income before tax of $7.1 million, compared to net income before tax of $4.0 million in 2Q 2007. Net income in 3Q 2007 was $114.8 million, which included a tax benefit.

* Magma announced Q2 revenue of $53.5 million, a 27.5 percent increase over Q2 2006, which the company says includes, “Record revenue for the third quarter in a row; revenue that beat Magma’s own guidance and Wall Street estimates, also for the third quarter in a row; and [several] major customer engagements [including] Toshiba and Maxim.”

* Synplicity Inc. announced financial results for the quarter ended September 30, 2007. Revenue for the quarter was $19.4 million, compared to $16.3 million reported for the same period in 2006. For the quarter ended September 30, 2007, GAAP net income included non-cash charges of $959,000 of intangible amortization expense from acquisitions and $747,000 of stock-based compensation expense.

* FSA [Fabless Semiconductor Association] released Q2 2007 additions to its Global Semiconductor Fundings and Financials Report. The latest Report includes statistics for CYQ2 2007 and 1H 2007:

* 1H 2007 semiconductor industry revenue totaled $129.3 billion.
* North American semiconductor companies represented 52 percent of 1H 2007 revenue, followed by Asia with 37 percent, Europe with 11 percent and India with less than one percent.
* 263 fabless and IDM companies reported $63.9 billion in revenue in Q2 2007, an increase of nine percent year-over-year (YoY) and a 2.4 percent decrease quarter-over-quarter (QoQ).
* The top 10 semiconductor companies by Q2 2007 revenue combined for $30 billion, or 47 percent of total Q2 semiconductor revenue.

1. Intel: $8.7B
2. Samsung: $4.6B
3. Texas Instruments: $3.3B
4. Toshiba: $2.5B
5. STMicroelectronics: $2.4B
6. Hynix Semiconductor: $2.1B
7. Renesas: $2.0B
8. Sony: $1.6B
9. NXP Semiconductors: $1.5B
10. Advanced Micro Devices: $1.4B

* The top 10 IP companies by Q2 2007 revenue reported $295 million in revenue, a decrease of 0.2 percent QoQ and one percent YoY.
* The top five packaging and test companies by Q2 2007 revenue reported $2.4 billion in revenue, an increase of four percent QoQ and a three percent decrease YoY.
* The leading foundries, TSMC, UMC, SMIC and Chartered, reported $3.7 billion in revenue in Q2, an increase of 10 percent QoQ and a decrease of seven percent YoY.
* The leading EDA companies, Cadence Design Systems, Synopsys, Mentor Graphics and Magma Design Automation, reported $873 million in revenue in Q2, an increase of six percent QoQ and 10 percent YoY.
* 112 semiconductor companies raised $1.4 billion in 1H 2007.
* 52 semiconductor companies raised $681.7 million in Q2 2007, a nine percent dollar amount decrease and a 13 percent decrease in the total number of deals closed QoQ.

***************************

In other news …

* Acceleware Corp. announced a partnership agreement through which Agilent Technologies will resell the Acceleware acceleration products throughout its sales channel to Agilent AMDS customers.

* Altera announced its Stratix II, II GX, and III FPGAs are included in GiDEL's new PROC algorithm acceleration boards, and the PC_X8 PCI Express adapter for the PROCxM prototyping systems.

* Altium Ltd. announced that NASA’s Johnson Space Center has standardized on Altium Designer as its electronics design software. Per the Press Release: “The Center’s Engineering Directorate will use Altium Designer as its electronics design standard on both manned and unmanned mission support. These include the Space Shuttle and International Space Station programs, as well as the Constellation program to send astronauts back to the moon … Altium Designer will be used on disciplines as varied as guidance and navigation, electrical power systems, avionics systems, instrumentation, thermal protection, spacesuits and other extravehicular activity (EVA) equipment, aerodynamics and related disciplines, advanced automation systems, and overall systems engineering and simulation.”

* Altium also announced that it has updated the design translation capabilities in Altium Designer to support the import of Mentor Graphics DxDesigner files. Per the Press Release: “The enhanced migration tools make it easier for companies and designers currently using a DxDesigner-based point tool solution to upgrade to Altium Designer.”

* Ansoft Corp. says it has certified that the VHDL-AMS models from the FAT-AK30 working group of the German Association of the Automotive Industry (Verband der Automobilindustrie - VDA) run in Ansoft’s Simplorer system simulation software. Per the Press Release: “The German Association of the Automotive Industry consists partly of automobile manufacturers and their development partners.”

* Ansoft also announced a new library of permanent magnet materials from Shin-Etsu Magnetics Inc. for its Maxwell electromagnetic field simulation software. The companies say the library contains “more than 31 high-performance permanent magnets defined at different operating temperatures using rare earth elements that can be downloaded by Ansoft customers and are ready for use within Maxwell.”

* ASSET InterTech says it is expanding its ScanWorks system by adding signal integrity analysis applications that support Intel’s next-generation embedded instrumentation technology, Intel IBIST (Interconnect Built In Self Test).

* Berkeley Design Automation announced that Beceem Communications has adopted Berkeley Design's Analog FastSPICE circuit simulator. Beceem VP of Engineering Stephen Lloyd is quoted: "With Analog FastSPICE, we can verify our complete WiMAX transceiver with full SPICE accuracy, which was impossible with other simulators. We are also able to slash the long verification times for our complex analog and RF blocks by more than 5x, again with full SPICE accuracy."

* Cadence announced a multi-year strategic agreement with NXP, and also announced Cadence has been named the primary EDA supplier to NXP.

* Carbon Design Systems announced availability of Carbon Model Studio, which the company says is designed for the automatic generation, validation and implementation of hardware-accurate software models. Per the Press Release: “Carbon Model Studio was designed for the entire design team, from system architects and software engineers to hardware designers and third-party IP providers. System architects can use Carbon Model Studio for architectural analysis and profiling. Software engineers can develop and debug embedded software, firmware, drivers and diagnostics concurrent with hardware development. Additionally, Carbon Models can be securely distributed to third-party partners to accelerate adoption of an IP provider’s technology devices.”

* CEVA and CoWare announced Processor Support Packages (PSPs) that support the use of CEVA-TeakLite and CEVA-TeakLite-II DSP cores within the CoWare system-level design environment. Per the Press Release: “The jointly developed models, based on the SystemC high-level design language, allow designers to quickly perform early architectural exploration and trade-off analysis before committing a CEVA-based design to silicon. Using the CEVA PSPs and the CoWare Platform Architect environment for ESL design, designers can explore and verify alternatives for using different cores, busses and cache sizes, as well as simulate hardware and software operating together, all without having to commit to a hardware prototype. This early-stage exploration capability streamlines the overall design process and reduces time to market for complex system development.”

CEVA’s Moshe Sheier is quoted: “ESL has emerged as a bona fide methodology to increase design efficiency and manage complexity. Providing comprehensive support at this early stage of the design process to designers wishing to utilize our cores, further enhances the potential of ESL-based approaches and provides designers with more options from an implementation standpoint. We are pleased to be working with CoWare, an acknowledged leader in the ESL space, to improve the effectiveness and utility of design systems with this collaborative offering.”

* Chip Estimate announced its IP Concierge service to aid customers in identifying IP. Per the Press Release: “This centralized service allows designers seeking IP at the ChipEstimate.com chip planning portal to connect with a network of leading IP suppliers with a single request. IP Concierge is being introduced to meet the needs of the design community to get information about IP that can be customized or is under development and is not yet publicly available. This is a free service, and supplements the existing "self-serve" model of exploring the extensive IP catalog at ChipEstimate.com.”

* CoFluent Design announced that it has joined OSCI [The Open SystemC Initiative] as an Associate Corporate Member and The SPIRIT Consortium as a Reviewing Member. OSCI President Mike Meredith is quoted in the Press Release: “I'm very happy to welcome CoFluent Design as an OSCI member. We look forward to their participation in our technical working groups as we continue to extend SystemC-based standards upward in the system level design flow.” The SPIRIT Consortium’s Ralph von Vignau is also quoted: “I would like to welcome CoFluent Design to our team and am looking forward to establishing a fruitful collaboration with them.”

* CoWare and Carbon Design Systems announced the “expansion of their strategic relationship focused on accelerating availability of virtual hardware platforms for architecture design and software development. Through an OEM agreement, CoWare will directly sell and support Carbon Model Studio for use with CoWare Platform Architect.”

* EEMBC [The Embedded Microprocessor Benchmark Consortium] announced OABench 2.0, its “enhanced office-automation benchmark suite that gives users the ability to approximate the performance of embedded microprocessors in printers, plotters, and other systems that handle text and image processing tasks … The OABench 2.0 suite consists of five benchmarks, each with its own datasets, including entirely new Bezier and Ghostscript tests.”

* EMA Design Automation and DesignAdvance introduced a free ROI calculator for the CircuitSpace PCB design software. The calculator helps customers quantify the time and cost benefits to expect using CircuitSpace. Bob Brady, Senior Manager of Engineering Infrastructure at RadiSys, offered a testimonial: “We purchased CircuitSpace, and have already realized substantial savings in our design time.”

* EVE has named Tsugumi Fujitani to be Vice President of Japan and Asia Sales, and has appointed Masao Fujimoto as GM of EVE K.K., a wholly owned subsidiary based in Japan.

Per the Press Release: “Tsugumi Fujitani has more than 20 years of technical and executive management experience in the electronics and EDA industry. Before joining EVE in 2004, she managed Tera Systems’ operations in Japan. Previously, she was CEO and co-founder of Spinnaker Systems. Fujitani began her career as an AE at Hitachi, Ltd. She moved to Okura, Ltd., where she served as an AE manager. Masao Fujimoto has more than 10 years of sales experience in the semiconductor industry. Most recently, he was senior sales manager in Japan at MoSys International Inc. He also served as sales manager at Denali Software K.K., and was a senior manager at Spinnaker Systems Inc. Fujimoto began his career as a sales representative at ALTEX.”

* IMEC says it will install an ASML EUV pre-production tool in IMEC's 300mm facility by 2010. Per the Press Release: “This will enable IMEC and its partners to do research on 22nm CMOS on the world's most advanced lithography system. The installation of the pre-production tool follows ASML's alpha-demo tool at IMEC from which the first high-resolution images were obtained with a Sn source (Philips Extreme UV) at the end of September. The world's first horizontal and vertical 35-nanometer and 40-nanometer lines and spaces in 100-nanometer MET-2D resist (Rohm & Haas) at 18mJ/cm2 were successfully exposed with EUV using a Sn source. Whereas the goal of the alpha-demo tool is to pioneer the technology, demonstrate feasibility and build the infrastructure, the pre-production tool will exhibit considerably higher source power and optimized optics. This will enable full-scale development of EUV technology up to production worthy standards.”

* IMEC also announced a wireless ECG (electrocardiography) patch for continuous monitoring of cardiac activity and heart rate. Per the Press Release: “Wearable, wire-free and easy to set-up, the system removes disturbances and discomfort caused by current cardiac monitoring systems … The ECG patch is a hybrid system combining electronic assembly on a flexible Polyimide substrate and textile integration. This allows achieving flexibility and stretchability. Standard ECG electrodes are used for attachment to the body.”

* IMEC also announced a 2-channel wireless EEG (electroencephalography, or monitoring of brain waves) system powered by a thermo-electric generator that uses the body heat dissipated naturally from the forehead. Per the Press Release: “The wearable EEG system operates completely autonomously and maintenance-free with no need to change or recharge the batteries … The entire system is wearable and integrated into a headband. The small size, low power consumption of only 0.8mW and autonomous operation increase the patient's autonomy and quality of life. Potential applications are detection of imbalance between the two halves of the brain, detection of certain kinds of brain trauma and monitoring of brain activity.”

* Finally, IMEC announced its next-generation DRAM MIMCAP (metal-insulator-metal capacitors) process technology as part of its (sub-)32-nanometer CMOS device scaling program. The organization says, “This research will enable IMEC and its partners to address the material and integration requirements to scale DRAM MIMCAP to future technology generations. This newly added focus follows an earlier extension of its traditional logic- and SRAM-oriented program with a DRAM periphery transistor sub-program in November 2006. The objective of the latter sub-program is to research high-k and metal gate options sustaining a DRAM-oriented process flow.”

* Impinj and Chip Estimate announced that Impinj has joined the Chip Estimate Prime IP Partner Program. As a Prime IP Partner, Impinj is enabling centralized access to information about the company's semiconductor IP at ChipEstimate.com.

* Impinj also announced that Silicon Image has licensed Impinj’s AEON NVM cores “to embed system-critical application data in high-definition multimedia interface (HDMI) and serial advanced technology attachment (SATA) chips.” Larry Morrell, VP and GM of IP Products at Impinj, is quoted: “We are very pleased that Impinj’s AEON/MTP cores have been chosen to help meet the growing demand for silicon innovation required for those sectors.”

* Finally, Impinj announced that the company’s multiple-time programmable AEON/MTP memory designed with 2.5V floating-gate transistors has been silicon-verified on TSMC’s 65-nanometer LP process. Impinj says it is also developing AEON/MTP NVM cores in TSMC’s 45-nanometer processes.

* Magma announced that Toshiba Corp. has deployed Magma’s integrated implementation software in Toshiba Electronics Europe GmbH, and that Magma’s toolset has become the common implementation platform used across Toshiba’s worldwide design groups in Japan, the U.S. and Europe.

* Magma unveiled Talus ATPG and Talus ATPG-X with on-chip compression. The company says these ATPG products “enable designers to significantly improve test quality, reduce turnaround time and cut costs of nanometer ICs. By integrating Talus ATPG and Talus ATPG-X into the Talus physical design environment, Magma offers the only IC implementation flow that provides true physically aware DFT.”
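
For readers unfamiliar with ATPG (automatic test pattern generation), the core idea is to find input stimuli whose responses differ between the fault-free circuit and a faulted copy. The toy C++ sketch below does this by exhaustive search for a single stuck-at fault on a three-input circuit; it is a generic textbook illustration, not Magma’s algorithm, and production tools like Talus ATPG add scalable search algorithms plus scan chains and on-chip compression hardware.

```cpp
// Toy ATPG by exhaustive search (illustration only -- not Magma's
// algorithm). Circuit: y = (a & b) | c. Fault: internal node (a & b)
// stuck-at-0. A test vector is any input making the two outputs differ.
#include <cstdio>

static bool good(bool a, bool b, bool c) { return (a && b) || c; }
static bool faulty(bool, bool, bool c)   { return false || c; }  // (a & b) forced to 0

int main() {
    for (int v = 0; v < 8; ++v) {                 // enumerate all 2^3 inputs
        bool a = v & 1, b = v & 2, c = v & 4;
        if (good(a, b, c) != faulty(a, b, c)) {   // outputs differ => fault detected
            std::printf("test vector: a=%d b=%d c=%d\n", a, b, c);
            return 0;                             // finds a=1, b=1, c=0
        }
    }
    std::printf("fault is undetectable\n");
    return 1;
}
```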

* Magma also announced it has partnered with Inovys Corp. to “ensure interoperability of Talus ATPG and Talus ATPG-X with the Ocelot tester. Magma has also collaborated with Source III to offer a direct path from Talus ATPG and Talus ATPG-X to a variety of test programs. Magma and its partners are also developing validated feedback paths from testers to the diagnostic capability in Talus ATPG that will allow designers to further analyze the causes of device failure.”

* SMIC and Magma Design Automation announced an enhanced low-power IC implementation reference flow for SMIC’s 90-nanometer process. Per the Press Release: “The SMIC-Magma flow utilizes SMIC’s 90-nanometer standard cell and IO libraries with multi-threshold CMOS technology, along with Magma’s low-power design flow, automatic switched domain creation, retention flip-flop insertion and power analysis for active and sleep modes, to optimize dynamic power and minimize leakage power.”
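
As background on why such flows mix threshold voltages and add sleep modes, the first-order CMOS power relations are worth keeping in mind (standard textbook formulas, not anything specific to the SMIC-Magma flow):

```latex
% First-order CMOS power model (textbook; not specific to this flow)
P_{\mathrm{dynamic}} = \alpha \, C_{\mathrm{load}} \, V_{DD}^{2} \, f
\qquad\qquad
P_{\mathrm{leakage}} = I_{\mathrm{leak}} \, V_{DD}
```

High-threshold cells reduce leakage current sharply at the cost of switching speed, which is why a multi-threshold library, switched power domains, and retention flip-flops (to hold state while a domain sleeps) can attack leakage without giving up performance on timing-critical paths.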

* Mentor Graphics announced that Elektrobit Corp. used Mentor’s Catapult C Synthesis tool in designing its next-generation wireless products. Per the Press Release: “EB selected the Catapult C Synthesis product based on the tool’s ability to synthesize pure ANSI C++ and increase hardware designer productivity up to 10x.”

Ari Hulkkonen, Director, Wireless Systems at Elektrobit, is quoted: “Catapult delivers a level of productivity that we are unable to achieve using hand-coded RTL methodologies. The productivity benefits come from automatic RTL creation that eases design exploration, plus verification efficiencies delivered by the C testbench and Catapult’s error-free RTL code.”
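
To make the “pure ANSI C++” claim concrete, the sketch below shows the untimed, algorithmic coding style that high-level synthesis tools consume: a FIR filter whose shift register and multiply-accumulate loop map naturally to hardware. It is a generic illustration, not Elektrobit’s design and not Catapult’s documented coding guidelines.

```cpp
// Generic untimed ANSI C++ of the kind HLS tools synthesize
// (illustration only -- not Elektrobit's design or Catapult's
// documented coding style).
#include <cstdint>
#include <cstdio>

const int TAPS = 4;

// One sample in, one sample out; the static shift register and the
// multiply-accumulate loop map naturally to registers and MAC units.
int32_t fir(int16_t sample, const int16_t coeff[TAPS]) {
    static int16_t shift_reg[TAPS] = {0};
    for (int i = TAPS - 1; i > 0; --i)        // shift in the new sample
        shift_reg[i] = shift_reg[i - 1];
    shift_reg[0] = sample;
    int32_t acc = 0;
    for (int i = 0; i < TAPS; ++i)            // multiply-accumulate
        acc += int32_t(shift_reg[i]) * coeff[i];
    return acc;
}

int main() {
    // The same C++ doubles as the testbench before and after synthesis.
    const int16_t coeff[TAPS] = {1, 2, 2, 1};
    for (int16_t s = 0; s < 8; ++s)
        std::printf("%d\n", (int)fir(s, coeff));
    return 0;
}
```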

* Mentor Graphics also announced a collaboration with LeCroy to deliver a complete platform for USB-based protocol applications. Michael Romm, LeCroy’s Director of Product Development, is quoted: “Mentor’s Veloce family of advanced hardware-assisted verification solutions complements our USB test systems. One of our key customers, a world leader in consumer electronic and multimedia systems, can now perform rigorous testing of their latest applications on this integrated, high-performance verification platform.”

* MIPS Technologies announced that AMIMON has licensed the low-power MIPS32 M4K core for development of wireless HD audio and video transfer applications.

“To achieve the wire-like quality and speed of AMIMON’s wireless HD technology for consumer electronic devices, the underlying technology must offer the best possible mix of high performance, low cost and low power,” said Yoav Nissan-Cohen, CEO, AMIMON. “The MIPS32 M4K core delivers on this promise, and because it is small and synthesizable, it also offers the design flexibility we need to quickly get our products to market.”

* OCP-IP announced that Synopsys has joined the OCP-IP Governing Steering Committee. Other committee members include Nokia, Texas Instruments, Toshiba, and Sonics. Per the Press Release: “Synopsys is already active in OCP-IP’s working groups, and their DesignWare Verification IP for the OCP interface is a part of the CoreCreator verification toolset that all OCP-IP members receive.”

* Ponte Solutions announced a new interface between its YA System and the Laker layout tools from Silicon Canvas, which the companies say “lets design teams easily perform Critical Area Analysis (CAA) and repair, utilizing the full capabilities of Silicon Canvas's Laker.” Ponte VP of Marketing and Business Development, Michael Buehler-Garcia, is quoted: "DFM issues must be tackled early to increase the likelihood of first silicon success, and that means doing DFM analysis at the IP level. [This] announcement, giving designers predictable, actionable CAA analysis within a premier tool like Laker, makes IP-level DFM a reality."

* Ponte Solutions also announced an interface between its YA System and the Cadence Virtuoso platform that the companies say “allows IP designers to address CAA in an actionable manner during the creation of IP elements, standard cells, and memories. As CAA is both statistical and contextual in nature, it has been difficult for designers to take specific measurable actions to reduce CAA issues … Ponte’s YA System addresses this problem by presenting the designer with prioritized CAA hot spots, which will ensure that the most critical CAA effects, as predicted by certified defect kits provided by the leading foundries, are identified and corrected.”

Chartered‘s Walter Ng is quoted: “It is important that designs at 65nm and below consider the impact of DFM during development. This is a good example of a real design solution resulting from companies teaming up to develop the appropriate interfaces which allow designers to fully utilize the capabilities of today’s advanced nanometer process technologies.”
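
For readers new to CAA, the classic critical-area yield model (a textbook formulation; the certified foundry defect kits mentioned above may differ in detail) integrates the layout’s critical area against the defect size distribution:

```latex
% Classic critical-area yield model (textbook form; foundry kits may differ)
\lambda = \int_{r_0}^{\infty} A_c(r)\, D(r)\, dr,
\qquad D(r) \propto r^{-3} \ \ (r > r_0),
\qquad Y_{\mathrm{random}} = e^{-\lambda}
```

Here A_c(r) is the layout area in which a random defect of radius r causes a short or open, D(r) is the foundry’s defect size distribution, and the expected fault count λ feeds a Poisson yield estimate; the “hot spots” Ponte reports are simply the layout regions contributing most to that integral.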

* Ponte Solutions and Blaze DFM announced delivery of the first modeling elements committed to Si2’s DFM Coalition last year. These contributions are the primary drivers for the critical area analysis and lithography elements of Si2’s DFMC efforts. The companies say that Ponte’s model-based yield analysis technology allows yield sensitivity analysis for identifying critical areas, and will be made available royalty-free to Si2 and DFMC members for standardization purposes, including modification and benchmarking for the next three years.

* Sagantec announced its work with TSMC resulted in the development of Sagantec’s DFM-Fix. Per the Press Release: “DFM-Fix speeds turnaround time by automatically addressing hotspots in all critical layers at all design levels, including key building blocks such as library, memory, IP and custom blocks. It also provides automated handling of post-implementation hotspots caused by boundary proximities and inter-level effects … TSMC and Sagantec tested DFM-Fix on multiple complete designs with hotspots in various layers. In all test cases, DFM-Fix automatically corrected most of the hotspots, with correction rates of 95% and above in most cases. The flow also proved highly time-efficient, running all test cases in under three hours on a standard quad-CPU platform.”

Coby Zelnik, Sagantec’s EVP of Marketing, is quoted: “Most lithography-related hotspots are found at the front-end and low metal critical layers in the IP infrastructure and macros of SoCs, as well as in memory and custom designs. These hotspots cannot be fixed by routing-level solutions. DFM-Fix addresses these hotspots very effectively for our mutual customers.”

* SAME 2007 Forum, which took place in early October in Southern France, announced 950 visitors, 46 exhibitors, and 23 sponsors. Awards at the conference included Best Paper, Best Start-up, and Best Poster, and the winners can be seen on the website at www.same-conference.org.

* Sarnoff Corp. and Carbon Design Systems announced that Sarnoff has licensed Carbon's model generation technology to develop cycle-accurate system models directly from its RTL source code. Sarnoff’s Michael Piacentino is quoted: "We selected Carbon Model Studio as a key component in our move to virtual platforms for early software development. Carbon allows us to create system models directly from our RTL, so we don't have the burden of maintaining two separate modeling development efforts."
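
As an illustration of what “cycle-accurate” buys a software team, a model compiled from RTL typically exposes the design’s ports plus a routine that advances exactly one clock. The generic C++ sketch below shows that shape; it is hypothetical and not Carbon Model Studio’s actual generated API.

```cpp
// Hypothetical shape of a cycle-accurate model compiled from RTL
// (illustration only; not Carbon Model Studio's generated API).
#include <cstdint>

struct CycleModel {
    // inputs, sampled on each clock edge
    uint32_t in_data  = 0;
    bool     in_valid = false;
    // outputs, updated exactly as the RTL's registers would be
    uint32_t out_data  = 0;
    bool     out_valid = false;

    // Advance the model by one clock cycle. Register-for-register
    // equivalence with the RTL is what makes the model cycle-accurate.
    void clock() {
        out_valid = in_valid;               // one-cycle pipeline stage
        if (in_valid) out_data = in_data + 1;
    }
};

int main() {
    CycleModel m;
    m.in_data = 41; m.in_valid = true;
    m.clock();                              // firmware sees the result one cycle later
    return (m.out_valid && m.out_data == 42) ? 0 : 1;
}
```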

* Si2 [Silicon Integration Initiative] announced that Synopsys has donated an oaTcl-based OpenAccess graphical visualization program (oaViewer) to the OpenAccess Coalition. The oaViewer program enables software developers of OpenAccess API-based programs to easily view design data such as hierarchical schematics and layouts including PCells. Synopsys VP Rich Goldman is quoted: "This is an important step towards analog design tool interoperability. We see this donation and the recent work in Interoperable PCell Libraries [IPL] to be significant improvements in making OpenAccess a truly open custom analog environment. We hope to inspire more donations to enable OpenAccess to become a complete usable analog solution."

* Stream Processors Inc. [SPI] says it chose Sequence Design's PowerTheater for power management and low-power architecture evaluation for their new processor design, reducing power 20 to 50 percent. Paul Filanowski, SPI's Vice President of Hardware Engineering, is quoted in the Press Release: "Based on SPI's revolutionary Stream Processor architecture, our current Storm-1 family delivers the industry's lowest Watt per GOPS (Giga Operations Per Second), and PowerTheater has enabled us to further reduce power by slashing analysis iteration cycle times by 3-4X while the sophisticated tools suite provided comprehensive feedback."

* SynaptiCAD announced it has acquired, for an undisclosed amount, exclusive development and distribution rights to V2V, HDL translation software from Alternative System Concept (ASC). V2V is a set of command-line tools that perform automatic translations of source code from VHDL to Verilog and vice versa.

Dan Notestein, President of SynaptiCAD, explains the reason for the technology transfer: "A lot of our customers have expressed interest in translating IP and legacy models so that they can debug in a single-language environment. After investigating the available options, we determined that ASC had the best technology for doing these conversions."
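
As a flavor of what such translation involves, the toy sketch below (hypothetical, and written in C++ purely for illustration) maps a VHDL-style port list onto Verilog-style declarations. Direction keywords translate almost mechanically; types, ranges, and concurrent process semantics are the hard part that a real front end like V2V must handle.

```cpp
// Toy sketch of one small piece of VHDL-to-Verilog translation:
// mapping port declarations. Illustration only -- a real translator
// needs full language front ends and semantic analysis.
#include <iostream>
#include <string>
#include <vector>

struct Port { std::string name; std::string vhdl_dir; };  // "in" / "out" / "inout"

// VHDL direction keywords map to Verilog nearly one-to-one.
static std::string to_verilog_dir(const std::string& d) {
    if (d == "in")  return "input";
    if (d == "out") return "output";
    return "inout";
}

int main() {
    // entity dff is port (clk, d : in std_logic; q : out std_logic);
    std::vector<Port> ports = {{"clk", "in"}, {"d", "in"}, {"q", "out"}};
    std::cout << "module dff(";
    for (size_t i = 0; i < ports.size(); ++i)
        std::cout << ports[i].name << (i + 1 < ports.size() ? ", " : "");
    std::cout << ");\n";
    for (const auto& p : ports)
        std::cout << "  " << to_verilog_dir(p.vhdl_dir) << " " << p.name << ";\n";
    std::cout << "endmodule\n";
    return 0;
}
```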

* Synopsys announced it has extended low power management capabilities in the Synopsys Galaxy test solution “to significantly reduce the time and effort needed to generate high-quality, power-aware manufacturing tests for ICs. The TetraMAX ATPG solution now creates tests reflecting designers' power budgets, and the DFT MAX scan compression product further automates integration of DFT structures in designs that deploy advanced low power management techniques.”

* Synopsys also announced general availability of the OdysseyDFT module. The company says “Odyssey yield management software has been widely adopted by leading semiconductor manufacturers to correlate and analyze diverse datasets needed for product yield enhancement.”

* Tensilica announced that Valens Semiconductor selected Tensilica’s Diamond Standard 108Mini as the controller for an SoC for high-quality transmission of audio and video in a home networking environment.

* Think Silicon announced its IPGenius on-line, parameterisable IP generation platform. Per the Press Release: “The program is designed to ease the process of obtaining and integrating IP into SoC designs by offering SoC designers an easy to use web interface for parameterising core properties before receiving them … This tool allows the generation of custom-made IP modules from a selection of modules that can be customized according to users' requirements, packaged and delivered to the end-user via the Internet.” Think Silicon’s Iakovos Stamoulis is quoted in the Press Release: “We genuinely see the advantage of employing this IP platform in rapid design development.”

* Verific Design Automation announced that its Netlist Only Parser is gaining momentum among EDA applications, especially from startup and emerging companies. Per the Press Release: “Verific’s Netlist Only Parser includes a Verilog netlist reader and a generic hierarchical netlist database to help reduce development time for products operating at the gate level rather than the register transfer level (RTL).”
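
To make “a generic hierarchical netlist database” concrete, here is a minimal sketch of the kind of structure a gate-level tool builds after parsing a Verilog netlist; the types are hypothetical and deliberately simplified, not Verific’s actual API.

```cpp
// Minimal hierarchical netlist database sketch (hypothetical; not
// Verific's API). Hierarchy falls out of instances pointing at the
// module definitions they instantiate.
#include <memory>
#include <string>
#include <vector>

struct Net { std::string name; };                 // a wire within one module

struct Instance {                                  // one cell/module instantiation
    std::string name;                              // instance name
    struct Module* master = nullptr;               // module it instantiates
    std::vector<Net*> connections;                 // nets bound to its ports
};

struct Module {                                    // one module definition
    std::string name;
    std::vector<std::unique_ptr<Net>> nets;
    std::vector<std::unique_ptr<Instance>> instances;
};

struct Netlist {                                   // the whole design
    std::vector<std::unique_ptr<Module>> modules;
    Module* top = nullptr;
};

int main() {
    Netlist nl;
    auto leaf = std::make_unique<Module>(); leaf->name = "AND2";
    auto top  = std::make_unique<Module>(); top->name  = "TOP";
    auto u1   = std::make_unique<Instance>();
    u1->name = "u1"; u1->master = leaf.get();      // TOP instantiates AND2
    top->instances.push_back(std::move(u1));
    nl.top = top.get();
    nl.modules.push_back(std::move(top));
    nl.modules.push_back(std::move(leaf));
    return nl.top->instances.size() == 1 ? 0 : 1;
}
```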

* Zuken and Aldec announced a collaborative product for FPGA design and verification, CADSTAR FPGA, which the companies say combines Aldec's Active-HDL Lite verification tool and Zuken's desktop PCB design suite, CADSTAR. Engineers should now be able to “perform mixed language simulation for vendor neutral FPGAs within the CADSTAR environment.”

* Zuken also announced a free simulation design kit for the newest Virtex-5 FPGA from Xilinx, which the companies say offers “first-class signal integrity for 65-nanometer FPGA designs. [The kit] provides a set of topology templates, in-context HTML documentation and useful content for simulation of waveforms and eye patterns.”



You can find the full EDACafe.com event calendar here.

To read more news, click here.


-- Peggy Aycinena, EDACafe.com Contributing Editor.