August 20, 2007
Helpful Advice for Entrepreneurs. Also post-silicon validation, debug and in-system bring-up from out of the ClearBlue by DAFCA


by Jack Horgan - Contributing Editor
Posted anew every four weeks or so, the EDA WEEKLY delivers to its readers information concerning the latest happenings in the EDA industry, covering vendors, products, finances and new developments. Frequently, feature articles on selected public or private EDA companies are presented. Brought to you by EDACafe.com. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us!


Peter Levin, CEO of DAFCA, has an impressive, diverse and quite atypical background for a CEO in the EDA industry.  Dr. Levin earned his Ph.D. at Carnegie Mellon University.  He began his academic career at Worcester Polytechnic Institute, where he set up the Computational Fields Laboratory.  He was a Humboldt Fellow and spent a sabbatical year at the University of Darmstadt as a guest professor of Mathematical Physics.  He was the associate dean for research and graduate studies at the College of Engineering at Boston University.  He worked in the White House, first for the Director of the Office of Management and Budget as a White House Fellow, then as Assistant to the Counselor to the President, and finally in the Office of Science and Technology Policy.  Peter was a general partner in Techno Venture Management (TVM) in Munich, Germany.  He led the investment in several firms, including Neolinear, which was acquired by Cadence in 2004 for $78 million.  I had a chance to interview Peter recently.


How long were you a venture capitalist?

I was in venture for four years.  I went there directly from Boston University, where I had been on the faculty and was research dean.  I focused primarily on infrastructure technologies, from device simulation to semiconductor fabrication, and anything having to do with IT and software generally.


In the interest of full disclosure, my wife graduated from BU and I got a master’s degree in computer science there in the evenings.

No kidding.  When were you there?


A long time ago.  Also, I saw that you were at Worcester Poly (WPI).

That’s right, I was on the faculty there.  I actually began my academic career at WPI and was on the faculty for almost 10 years.  I went through the tenure cycle.  Happily I went through it successfully and stayed on a little while after that.  I had a variety of off-campus appointments towards the end, including one year over in Germany as a Humboldt Fellow and two years down in Washington DC during the Clinton administration.


My nephew graduated from WPI and I did my undergraduate work at Holy Cross, also in Worcester.

Oh, my gosh.  It looks like I have been shadowing you.


I am out here in California now.  The difference between winters in Worcester, Mass and California is tremendous.

I’m actually calling you from a Greek Island.  Let me tell you that the difference between weather on this island and Worcester is also pretty profound.


I am curious.  When you were a VC, what attracted you to the firms that you invested in or seriously considered investing in?  Was it the team?  Was it the technology?  Was it the market opportunity?

It is all four of the major criteria.  You got three of them right.  Absolutely, positively the team!  Without the team, you would not take even a second look.  Often it had to pass through some other market filter before it ended up on my desk.  Somebody who had been a CEO before or had been a VP of Marketing or played some operational role generally would want to have a harsh look at it before they would let me take a look.  Then I would do a deep dive on the technology assessment.  The two of us (typically we worked two on a box) would work together on the financial engineering, on the economic structure.  How much money was it going to take to build the prototype?  How much money was it going to take to commercialize?  Did the market already exist?  Did we think it was going to come?  If it was, how large was it going to be?  When was it actually going to be?  It sounds perhaps somewhat formulaic.  I don’t mean for it to sound like you can plug it into a TI calculator and come up with an up or down decision.  There is a lot of nuance and a lot of subtlety in the decision to move forward.  But it really is all four criteria.  Somehow I seemed to have a knack for both team dynamics and technology assessment, so I would certainly focus on those.


For the firms that you did seriously consider, how did you learn about them?  Unsolicited business plans over the transom?  From other VCs already involved?

Generally not things that came in over the transom.  Of course back in the late 90s and early 2000s, putting up a website was tantamount to putting out a sign on your house, like a doctor or dentist, saying you were in business.  All kinds of people would come and talk to you.  We went to a lot of those meetings for sure.  But almost all of the deals that I did came in through a, let me say, proprietary network.  It sounds elitist and I do not mean it to be.  These were typically people with whom I had personal experience through my career or people who were in very close proximity to people I trusted.  The most successful deal that I did, the deal that led directly to DAFCA which is where I am now, was a Carnegie Mellon spinout.  That’s where I went to college and grad school.  The two founders of that deal (Neolinear) you may know were both CMU professors.  You may not know that they started their academic careers during my last year as a Ph.D. student in the ECE department.  They were more social acquaintances than technical colleagues, but they were guys that I knew pretty well at the time.  We had been tracking each other’s careers.  It was 15 years before we pulled the trigger and put some money in.


I started a venture-funded company once.  Two of the VCs had been my managers at one point.  They sort of recruited me to run the development part.  I know that there are a lot of readers who would like to become successful entrepreneurs but do not have the contacts.

It is certainly not something you can turn on in the course of a few days or even a few weeks, but I would warmly encourage young entrepreneurs to start building those relationships.  It happens that I have recently been honored by an appointment at Stanford.  One of my jobs there is to provide this kind of mentorship and advice to people who, because of age and lack of experience, have not yet had the opportunity to build long-standing relationships of trust and mutual dependence.  But you can do that, and I encourage people to get started.

It begins by cold calling these guys and saying “I am eventually going to want to start a company.  I definitely see that in my career, in my future, but obviously I am not ready to do that today.  Are there any things that I can do with and for your firm that would allow us to get to know each other a little bit, where you would also get a benefit: doing due diligence or technology assessment, participating in some kind of venture forum?”  A lot of VC firms run different kinds of conferences and activities.  Participating in anything that is collectively sponsored by the venture capitalists is a very good way to meet them.

But what you do not want to do is to show up at one of these events and say “Here I am.  I have been working on this business plan in my garage for 5 years and it is ready to go.  All you have to do is write a check for $3 million.”  Besides being hopelessly naïve, it is likely unfundable, because the venture backer needs to get to know you much better as a person.


How did you get involved with DAFCA?

As I mentioned, I had participated in a CMU spinoff; at that time it was on the venture side.  It was one of the most successful deals TVM ever did.  One of the cofounders is a man named Rob Rutenbar.  He was familiar with Miron Abramovici and Al Dunlap.  Al was one of the cofounders of DAFCA and had gone to CMU.  He was part of the CMU Mafia, as we affectionately refer to it.  I knew that there was this DAFCA company out there, but it was unfunded at the time.  It was just Miron and Al working on it, literally in Miron’s kitchen.

Then I heard about it again.  The second time I heard it from Isadore Katz, who’s a pretty famous guy in EDA and who I had gotten very friendly with, very much along the lines I just described to you.  I was actually thinking of that relationship when I was talking about slowly building relationships of trust.  Isadore is a perfect example of how I would have done, and did do, that with someone who was well known to that segment of the EDA marketplace.  To make a long story short, Isadore and I became very friendly and he told me that there was a company out there that, quote, “I would be perfect for”.  That got my undivided attention.  It was the same one Rob had told me about.  So I figured two people in my intimate trusted network were telling me I needed to get off my butt and give Miron a call.  That was enough to get me to call him.  The rest is history.  We hit it off extremely well on the phone and then met in person.  We decided to go forward together.


As impressive as your resume is, I missed seeing any prior CEO experience.

(Laughs.)  You are not the only one who missed that.  I must have forgotten to put it on my resume.


What caused you to want to tackle such a position and why would Miron and others want to hire a person with no CEO experience?

Great question!  The answer is that I had not originally intended to be CEO.  I was helping Miron out of a sense of civic duty and good karma, for lack of a better incentive.  Miron is a life force.  Anybody who knows him or has even met him knows that about him.  I figured that it would just be a good thing for the technology, for the product and for the industry if his vision of on-chip instrumentation could become a reality.  There was already, and there would be, a severe bottleneck exactly at the place where Miron was standing, and a problem that he already knew how to solve.  I saw my role initially as being a facilitator, making the right introductions.  By then I had very good relationships with the venture capitalists.  By no means was I famous, but I was well known for my role in Neolinear and for having an unusually deep technology background.


To make a long story short, it is actually a funny story.  As we got closer to the A round funding I called Miron up on a Sunday; it was Father‘s Day.  I did it very deliberately because I wanted him to get used to being a startup officer.  He was going to be CTO.  This meant you are on duty 24x7.  Your life and your energy are fully devoted to the cause.  Coming from a large company, which he did, there was going to be an enormous amount of culture shock.  In fact there was.  Miron was nothing if not a good sport.  He picked up the phone on a Sunday, one of the many Sundays on which we continue to speak to each other.  I said “Things are looking pretty good.  It is time for you to start thinking about who you want to be CEO.  I have taken the liberty of assembling a list of five names, of which I have already spoken to four.  They would like to meet with us.”

Miron, who is from Eastern Europe, spent most of his career at Bell Labs after training in Israel.  I tell people affectionately that I hit the trifecta in that one.  All of the stereotypes of clear thinking but stubborn commitment to particular objectives, let me say diplomatically, were absolutely true with him.  So very abruptly he almost snapped at me.  He said “I have already picked my CEO and it is you.”  He then hung up the phone.  I said to myself, that’s just Miron being Miron.  I picked up the phone to call him back because he literally hung up.  It was like “That’s a stupid question.  Obviously you are going to do it.  Why did you bother me on a Sunday for that?”  I picked up the phone (it’s a true story) and the hand of God came down from the sky and said “Don’t be stupid.  If they don’t want you to be CEO, let them tell you.”

The original appointment was supposed to be for 6 months.  They figured that I was mature enough and grown up enough to hire a bunch of good people, buy computers and establish the broad architectural strokes.  They would hire a real CEO once we had launched, once we had done the hard part of assembling the first team.  After 6 months we had a board meeting and I said “Guys, my 6 months are up.  It would probably be good if we could recruit somebody.”  They looked around at each other and said “Oh no.  You’re fine.  Stay put.  We will tell you when you are done.”  That was four years ago.


Were there any surprises beyond the appointment itself?  Did you learn anything?

That’s a two hour conversation, one that I would be happy to have.  I tell people that DAFCA is the best thing I have ever done.  You have graciously noted that I have been at some pretty interesting places before I came to DAFCA, so I have a pretty good basis of comparison.  It is also the thing that I have done the best.  I say that without the certainty, without even the prospect, of a large economic outcome.  I have every reason to believe that we are hitting the market at exactly the right time with exactly the right technology.  It has been by far the most spiritual, the most challenging, the best growth opportunity.  I mean that technically, commercially and personally.  It would take me ten minutes to list just the surprises, never mind explain them: how I have developed and matured as a professional and a leader; how the technology has developed and matured.  What used to be a science project is now a commercially available product.  The market has changed from when we started.  Frankly Miron gets a lot of the credit that the major theme is right.  He basically knew that the semiconductor industry was going to be stalled unless it figured out what to do with silicon that was coming back from the foundry that was not perfect and whose complexity far exceeded anyone’s ability to verify and validate pre-silicon.  All three of those things seem to be coming together for us.  Your question about growth and evolution is a long and difficult one, but it has a loud answer, and that is “Oh, hell yes.”


What stands in the way of substantial economic success for DAFCA, if it is the right technology at the right time?

Right now we have a tremendous amount of momentum.  We just got our fourth chip from the foundry.  Now we can demonstrate to a justifiably skeptical, not to say cynical, customer base that this is the most economical and reliable way to go.  I tell people somewhat tongue in cheek that we are selling predictability.  It is a little bit different marketing message than you normally get in the EDA circle, which I do not come from.  One of the advantages we have is that I sort of cut my teeth in other places which are just as difficult if not more difficult, but are well outside the moribund EDA industry.

To answer your question directly, we now need to accelerate our marketing profile.  For that we are going to need to spend more: cash for sure, more time, more creativity on making more people aware that this has in fact been proven in silicon, and in a variety of different large and complicated methodologies, to be the way that chips will be designed in the future.

The first and most severe challenge to DAFCA’s economic success is getting over what I describe as a psychological, even an emotional, barrier that many project managers have.  They want to believe that they have first silicon correct.  It was, I believe, the economist Joseph Schumpeter (famous about 40 years ago for the concept of creative destruction) who came up with a funny way of describing how new technologies are introduced to the market.  He said that the first thing you have to know is that it is difficult to convince a man that you can improve his productivity if he is being paid not to know that.  Our biggest barrier is that we have a customer base that is legitimately concerned and sensitive, but there are non-rational fears that we can address technically and scientifically to say “Yes, it works.  Yes, it helps.  And yes, it is going to make your life a lot easier.”  But many people who are used to doing the same job in a certain way, and are in some sense paid not to be innovative or to solve the biggest problems in a somewhat avant-garde way, are unlikely to want to have a conversation about how many respins and how many functional errors they have had, how many system integration nightmares they have experienced, and how many hardware/software co-debug problems have prevented getting their products to market.  So it is a sort of diffuse responsibility.  It is our job to find the guy who is ready to take that risk and to demonstrate that the risk is much smaller than the benefits he is likely to experience by being able to see inside the device on-chip and at speed.


A more recent book on that theme is Clayton M. Christensen’s “The Innovator’s Dilemma,” subtitled “When New Technologies Cause Great Firms to Fail.”


Earlier and separately I had an opportunity to interview Paul Bradley, VP of Engineering for DAFCA.


Would you give us a brief biography?

I am actually not from the EDA industry.  I came from the datacomm and telecomm space.  I was more a consumer of EDA tools, similar to what DAFCA is building today.  I worked on some high end routers, switches, and Ethernet switches.  I also did a fair amount of FPGA and semiconductor design work at Motorola in the early days.  Then I worked for Sonoma Systems and CrossCom back in the early 90s.  More recently I worked for Nortel Networks in their high end routing and VPN switching groups.  I was doing some consulting work for a company called Internet Photonics when a friend of mine who had recently joined DAFCA, and with whom I had worked back at Motorola, told me what DAFCA was doing.  It seemed pretty intriguing to provide all of this capability embedded inside a semiconductor chip.  It was something I had been interested in after designing a number of chips and FPGAs.  So I joined DAFCA 3 1/2 years ago.  Originally I joined the team as one of the architects and designers.  More recently I have moved into a management and technical marketing role.

Editor: Paul left out that he was cofounder and hardware architect at Broadcastle Corporation.


How old is the company?

The company is almost 4 years old.  We were founded in 2003 by Miron Abramovici and Peter Levin.


Where did they come from?

Miron’s background is Bell Labs.  He is sort of the godfather of test.  He wrote probably one of the most popular and heavily used books on test methodologies and DFT techniques, Digital Systems Testing and Testable Design.  He stayed with Bell Labs and then went to Lucent and Agere Systems.  He left Agere to start DAFCA.  Peter’s background is fairly diverse.  He was a college professor, worked in the White House, and worked for a venture capital firm (TVM).


How big a company is DAFCA?

Right now, we are about 25 people.

Editor:  The company raised $8 million in its first round of funding.


DAFCA was awarded an Advanced Technology Program (ATP) grant totaling $1.8 million from the National Institute of Standards and Technology (NIST).

Only 30 applications were accepted out of hundreds.  We were the semiconductor selection.  It essentially funded our advanced R&D for the first three years of the company’s life.


Would you tell us a little about what DAFCA is up to?

DAFCA essentially delivers instrumentation IP and software for on-chip, at-speed, in-system validation.  The instrumentation is important, but the primary value we deliver is the software that uses the instrumentation after the chip has been fabricated.  We are providing a whole suite of applications.


People tend to think the chip is sitting on a tester or something like that.  While that is possible, it is not the primary application.  The primary application is when the new semiconductor is installed in a system and the system is being run in the lab at speed, while the customer is trying to validate the system in an environment that replicates their customers’ environment.


We already have four chips back, three at 90nm and one at 65nm.  Our solution provides a fair amount of productivity gain, because many of the things our tools do from an instrumentation point of view are things people are already doing today.  People are already putting instrumentation into their designs, but they are doing it by hand.  Some firms do not spend a lot of time verifying their designs.  Either way, it is pretty much an ad hoc solution.


We offer an easy way of inserting compact instruments.  All of the instruments are inserted into an RTL design as synthesizable RTL.  Our solution is designed to be compliant with all the major synthesis flows.  The primary value is through the comprehensive analysis applications.


In the early part of the design cycle, in simulation or maybe emulation, you have pretty good observability into what is going on in your circuitry.  You can see all of your transactions anywhere in the design you need to.  There is a performance issue, because often it takes a long time to create all of the scenarios and run test verification sequences to completion.  So while you see everything, it is often very time consuming to do so.  Once you have fabricated the chip, the observability drops off considerably.  If after the fact you need to observe something that you don’t have access to, it is very expensive and difficult to gain that observability.  In its simplest form, DAFCA is about providing that observability and doing it in a way that is seamless and easy to integrate with design flows.  It is very cost effective.


We had been talking to customers, a number of key companies, for about four years before we had a product in the field.  A lot of folks already had the idea that what we are doing makes a lot of sense.  They were trying to implement it themselves.  On-chip instrumentation is something people have already been doing for a long time.  What is happening now, with the usage of more and more third party IP and with design teams being spread among many organizations across the globe, is that the instrumentation solutions, the post-silicon validation, test and debug solutions that have been created are very disjoint.  One piece of IP has one debug and observability structure; the next piece has something completely different.  And by the way, none of the software that leverages these instruments works together.  They are all fragmented ad hoc solutions.  In the end there are no system-wide end user capabilities.  They are too fragmented to be scalable and to be used throughout a large organization.


DAFCA is trying to automate the implementation of this solution.  We are providing this solution through reconfigurable instrumentation.  Many of our customers tell us the instrumentation overhead has to be low.  They do not want to dedicate large chunks of their silicon to instrumentation.  Our novel concept, and part of our patent portfolio, is a reconfigurable infrastructure with reconfigurable instruments.  In a sense these instruments can have multiple purposes on silicon.  At one point in time they can serve as logic analyzer modules.  At another point in time they can be used for built-in self test or for fault insertion, and still again for performance monitoring.  We intentionally designed the instruments such that the overhead becomes less and less of a problem or a barrier to entry for us.  The other thing we have done is provide a test platform, a post-silicon validation platform, that all of the applications run on.  We provide not only graphical tools to configure, control and analyze the data that comes from all the on-chip instruments, we also have the ability to extend the capabilities or functions of our tools through a TCL interface.  We provide that for our customers who want to do more than our standard interface allows.
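
Editor: To make the reconfigurable-instrument idea concrete, here is a minimal Python sketch of how a host tool might retarget a single embedded instrument among several roles.  The class, mode names and configuration layout are hypothetical illustrations, not DAFCA's actual programming interface, which is not detailed in this interview.

    # Hypothetical sketch: one on-chip instrument block that can be
    # retargeted among several roles by rewriting its configuration
    # word. Mode encodings and field layout are invented for
    # illustration only.
    MODES = {"logic_analyzer": 0x1, "bist": 0x2,
             "fault_insertion": 0x3, "perf_monitor": 0x4}

    class ReconfigurableInstrument:
        def __init__(self, name):
            self.name = name
            self.config = 0

        def configure(self, mode, param=0):
            # Pack the mode and a parameter into one 32-bit word that
            # a real tool would shift into the chip over JTAG.
            self.config = (MODES[mode] << 28) | (param & 0x0FFFFFFF)
            return self.config

    inst = ReconfigurableInstrument("cpu_bus_monitor")
    print(hex(inst.configure("logic_analyzer", param=64)))  # capture depth
    print(hex(inst.configure("perf_monitor", param=7)))     # event select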


If there is a signal one wishes to observe and instrumentation has been inserted for it, then no problem.  But if one did not know or suspect a priori that a particular signal would be of interest, then how does one get observability after the fact?  Do you have access to all of the signals?

Good question!  This is the first question we get from our prospects.  The answer has two parts.  First, understand that our solution is not just about debug.  It is not just putting instruments in the areas you think are going to have trouble.  It is about putting instruments into a design so that you can prove your design is working correctly.


For most people 99.9% of their design is just fine, no issues.  But it takes a long time to prove that.  Instrumentation is about choosing wisely, putting the instrumentation in the right places so that you can perform that validation step, not just the debug step.  With that said, what happens if you fail to choose the signals that are important to look at after the fact?  We have another solution called SnapShot, which combines our at-speed instrumentation and debug solutions.  Most chips already have scan chains in them.  Using our technology we wrap these scan chains so that we have access to them through the JTAG port.  So if there happens to be a signal that the customer needs to get at that is not instrumented, we will typically have access to it through the scan chain.  Using our software applications, the user now has the ability to stop the chip at the precise moment in time when the point of interest occurs, or when the signal is doing something interesting, and can extract the state of that signal through the scan chain.  We provide high coverage through this scan chain debugging technique.
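
Editor: Conceptually, the scan-chain access described here amounts to stopping the chip and shifting the chain's state out through the JTAG data register.  The Python sketch below models that with a software stand-in for the chip; all names are invented, and a real tool would drive an actual JTAG cable instead.

    # Toy model of scan-chain extraction through a JTAG port. The
    # FakeChip stands in for real hardware; names are hypothetical.
    class FakeChip:
        def __init__(self, state_bits):
            self.chain = list(state_bits)   # captured flip-flop states

        def shift_dr(self, tdi):
            # One TCK cycle in Shift-DR: a bit comes out on TDO while
            # a new bit is shifted in on TDI.
            tdo = self.chain.pop(0)
            self.chain.append(tdi)
            return tdo

    def extract_scan_chain(chip, length, new_state=None):
        # Shift the whole chain out, optionally depositing new values
        # (the SnapShot-style "deposit and restart" use case).
        new_state = new_state or [0] * length
        return [chip.shift_dr(new_state[i]) for i in range(length)]

    chip = FakeChip([1, 0, 1, 1, 0, 0, 1, 0])
    print(extract_scan_chain(chip, 8))   # -> [1, 0, 1, 1, 0, 0, 1, 0]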


The solution has three components.  The ClearBlue ReDI (Reconfigurable Debug Instruments) Library is a simple library of RTL primitives.  This library is used by the ClearBlue Instrumentation Studio, which is the tool used by the customer to insert instruments into their designs.  This is not just inserting instruments but creating and configuring them.  The user reads their own circuit design into our tool in RTL form, navigates the design and makes selections: I would like to observe this bus, observe that interface over there, monitor what is going on around that shared memory subsystem.  Essentially they walk around and point out to our tool where the areas of interest are.  Using wizards in ClearBlue Instrumentation Studio, they tell us what types of applications they would like to use post-silicon.  From that information we decide what types of instruments to construct; the tool has RTL generators.  We put all of that instrumentation into the design automatically, stitch everything together, close the design back up and write it out.  In addition we also write constraint files, and we create a test bench and an equivalence checking script.  Essentially we do everything needed for the customer to carry on with the rest of the design flow.  It does not matter to us if it is Cadence, Synopsys, Mentor, Magma or some combination of the four.  The customer continues the design flow and fabricates the chip.  Once the chip comes back, they use the ClearBlue Silicon Validation Studio.  Once they have that up and running and communicating with the chip through the JTAG interface, they have access to all the applications: performance monitoring, logic analysis, debugging, …
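
Editor: As a rough outline of the flow just described, the following Python sketch shows the shape of an instrumentation pass: read the design, record the user's areas of interest, generate instruments, and emit the side files the rest of the flow needs.  Every name is hypothetical and the functions are stubs, not real RTL processing.

    # Sketch of an instrumentation-insertion flow. Everything here is
    # a stub standing in for real tool behavior; names are invented.
    def insert_instruments(rtl_files, selections):
        design = {"files": rtl_files, "instruments": []}   # "read" RTL
        for sel in selections:
            # One generated instrument per user selection, e.g. a bus
            # monitor; a real tool would emit synthesizable RTL here.
            design["instruments"].append({"target": sel["signal"],
                                          "app": sel["application"]})
        side_files = ["constraints.sdc", "instrument_tb.v",
                      "equiv_check.tcl"]                    # emitted artifacts
        return design, side_files

    design, artifacts = insert_instruments(
        ["soc_top.v"],
        [{"signal": "ahb_bus", "application": "logic_analysis"},
         {"signal": "mem_ctrl", "application": "perf_monitor"}])
    print(len(design["instruments"]), "instruments;", artifacts)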


Would you give us a quick synopsis of the major applications?

The most common application, and something most of our customers are already doing, is some form of logic analysis.  We are doing essentially the same thing, except our instruments can do not just logic analysis but a lot more.  We designed the tool to emulate a commercial logic analyzer that you might rent or buy from Agilent or Tektronix.  We have a slick graphical interface where customers create patterns for matching in the logic analysis, design a state machine based triggering sequence, and control what data is written into embedded memory.  They have all the basic controls they are used to.  Most of our customers get this in a matter of minutes.  They can use the tool without reading our user manual.
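
Editor: The state-machine-based triggering described above can be pictured as a small sequencer that advances one state per matched pattern and starts capturing when the final state is reached.  This toy Python version, invented purely for illustration, shows the idea on a sampled signal stream.

    # Toy trigger sequencer: fire after the patterns match in order,
    # then capture the next few samples, like an embedded logic
    # analyzer writing its trace memory. Illustrative only.
    def run_trigger(samples, sequence, capture_depth=4):
        state, trace = 0, []
        for cycle, value in enumerate(samples):
            if state < len(sequence) and value == sequence[state]:
                state += 1                    # advance the trigger FSM
            if state == len(sequence):
                trace.append((cycle, value))  # post-trigger capture
                if len(trace) == capture_depth:
                    break
        return trace

    bus = [0x00, 0xA5, 0x00, 0xA5, 0x3C, 0x01, 0x02, 0x03, 0x04]
    # Trigger on 0xA5 followed by 0x3C, then capture four samples.
    print(run_trigger(bus, [0xA5, 0x3C]))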


The next application we call transaction stimulus.  This is the ability to modify circuit behavior in the system at speed.  We can observe what is going on inside the chip, but some of our customers have validation requirements that necessitate the ability to control or stimulate circuit blocks in-system at speed.  We have a mechanism where we can wrap signals.  By wrapping those signals we can gain control over them and supply functional stimulus: download that stimulus through the JTAG interface and turn on the switch that creates transactions in silicon.  Next to logic analysis this is the most popular application.  Infineon used us for on-chip validation and this was one of the applications they used quite heavily.
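
Editor: The signal wrapping used for transaction stimulus can be thought of as a multiplexer on each wrapped signal: in normal mode the functional value passes through, and in stimulus mode vectors downloaded over JTAG are played back instead.  A small, purely illustrative Python model with invented names follows.

    # Toy model of a stimulus wrapper: a mux that either passes the
    # functional signal through or substitutes downloaded vectors.
    class SignalWrapper:
        def __init__(self):
            self.stimulus, self.enabled, self.index = [], False, 0

        def download(self, vectors):        # e.g. arrives via JTAG
            self.stimulus, self.index = list(vectors), 0

        def output(self, functional_value):
            if self.enabled and self.index < len(self.stimulus):
                value = self.stimulus[self.index]   # injected transaction
                self.index += 1
                return value
            return functional_value                 # normal operation

    wrap = SignalWrapper()
    wrap.download([0xDE, 0xAD, 0xBE, 0xEF])
    wrap.enabled = True
    print([wrap.output(0) for _ in range(6)])  # injected, then functional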


The next application is assertions in silicon.  Many of our customers use some form of assertion-based verification pre-silicon.  We allow them to take those assertions and pull them into silicon, into the reconfigurable instruments, to analyze the behavior of circuits.
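
Editor: A typical pre-silicon assertion such as "every request is acknowledged within N cycles" maps naturally onto a counter plus a comparator, the kind of structure a reconfigurable instrument can take on.  Below is a cycle-by-cycle Python model of that one checker; the assertion and all names are the editor's, for illustration only.

    # Toy hardware-style checker for: "req must be followed by ack
    # within N cycles". Evaluated one cycle at a time, the way an
    # on-chip assertion instrument would be.
    def check_req_ack(trace, n=3):
        pending, failures = None, []
        for cycle, (req, ack) in enumerate(trace):
            if pending is not None and ack:
                pending = None                      # satisfied
            elif pending is not None and cycle - pending > n:
                failures.append(pending)            # assertion fired
                pending = None
            if req and pending is None:
                pending = cycle                     # arm on new request
        return failures

    #         (req, ack) per cycle
    trace = [(1, 0), (0, 0), (0, 1),                       # acked: OK
             (1, 0), (0, 0), (0, 0), (0, 0), (0, 0)]       # never acked
    print(check_req_ack(trace))   # -> [3], the cycle of the failing req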


Event analysis is the combination of the three previous applications.  Often when customers are trying to validate their silicon, they have complex transactions that happen across the entire chip.  Using just logic analysis is not enough.  Using just transaction stimulus is not enough.  Or just having half a dozen assertions is not enough.  They need to stitch all of these things together.  They need to be triggering on one part of the system looking for certain events, while assertions are running and while stimulating an IP block, to really get to the corner cases in silicon and analyze them thoroughly.  They have analyzed them in simulation; it took them weeks and weeks to do it.  They now want to analyze them in silicon, but it is often very difficult to do that, especially when the chip first comes back into the lab.  We are trying to provide a means to do it very early in that first silicon phase.  Within a matter of days of getting the chip into the lab they can be using all of these applications and really seeing what is going on inside the chip in a much more efficient way.


Performance monitoring is a means to observe what is going on: to count events, to measure latency between events, to track sequences of certain events.  With this programmable instrumentation we have access to all kinds of counters, timers and programmable logic.  The customer has the ability to construct a performance monitor unique to his application.  For some type of mobile application, they can track the number of certain types of packets and certain types of error conditions.  For a digital TV product, they can track the behavior of one of the coders or decoders.  There are all kinds of interesting ways to apply performance monitoring.  When people think about performance monitoring they think of instruction tracking: keeping track of what instructions were executed, what piece of memory was used, what threads were executed in a piece of software.  While we can do some of that, what we provide is a much more low-level, customer-defined view of what is going on in silicon.  It can be something as simple and as granular as watching how a single wire wiggles inside the silicon, or as complex as watching AMBA bus transactions over a period of hours.  It is up to the customer to control the performance monitor in our solution.
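
Editor: At the level described above, a performance monitor is assembled from event counters and timers.  The sketch below, with invented names and a software stand-in for the hardware counters, counts one event type and measures latency between a start and a stop event in a sampled trace.

    # Toy performance monitor: one event counter plus a start/stop
    # latency timer, the kind of fabric a programmable instrument
    # could be configured into. Purely illustrative.
    def monitor(trace, count_event, start_event, stop_event):
        count, start_cycle, latencies = 0, None, []
        for cycle, event in enumerate(trace):
            if event == count_event:
                count += 1                       # event counter
            if event == start_event:
                start_cycle = cycle              # arm the timer
            elif event == stop_event and start_cycle is not None:
                latencies.append(cycle - start_cycle)
                start_cycle = None
        return count, latencies

    trace = ["idle", "pkt_rx", "dma_start", "idle", "idle",
             "dma_done", "pkt_rx", "dma_start", "dma_done"]
    print(monitor(trace, "pkt_rx", "dma_start", "dma_done"))
    # -> (2, [3, 1]): two packets, DMA latencies of 3 and 1 cycles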


SnapShot is the scan based debugging solution where we give the customer access to the scan chains.  They can use them in conjunction with their at-speed logic.  So they can create triggers to stop the chip and look at the scan chains.  Or they can have an assertion, and if that assertion fires, they can extract the scan chain.  They can also deposit new values into the scan chain, restart the chip and create test conditions that would otherwise be very hard to create.


What were the chips that have come back from the fab intended for?

There are four examples: a serial ATA controller, a printer platform, an extended ARM-based processor subsystem and a digital image processing solution.  Three are at 90nm and one (the processor subsystem) at 65nm.  They are relatively small in terms of gate count (2M to 4M gates).  Most of our customers think about instrumenting their entire design, but in practice they typically instrument at a relatively high level in their design hierarchy, so only a handful of clock domains are being instrumented.  From a performance point of view, these examples are in the range of 200MHz to 400MHz.  Two were ARM based and one was MIPS based; the serial ATA controller had no embedded processor.  We are finding more and more customers putting us in chips with embedded processors.


In the case of the serial ATA controller, the customer was trying to prove that their new piece of IP was working and compliant with the standard, so that they could prove to their customer that the IP was compliant, and so that once their customer integrated the IP into a system, there was a means to observe or diagnose any problems if and when they occurred.  The solution served in a sense as a demarcation point between IP blocks.


The second and third examples are both ARM-based solutions.  They were using our solution not just to observe what is going on but also for some level of fault injection.  They were using us to validate the design.  They had some software running on the ARM processor and they needed to be able to exercise corner cases of the processor.  They were doing this by fault insertion, creating conditions to verify that the software behaved correctly when some unexpected condition occurred.


The last case was an observe-only use of our solution.  The customer had been using ChipScope from Xilinx for a long time and loved the tool, but could not use it for his ASIC.  They asked if we could provide a similar tool for an ASIC.  We instrumented 4,000 to 5,000 signals, brought these signals down to multiple debug modules and gave them system-wide observability of all the critical parts of their design.



When were the products released?

The first release was in June of last year.  We just released version 2.0.  The SnapShot feature is in alpha now.  We are working with an early adopter who has signed on to use the SnapShot capability.  We are in the process of instrumenting a chip for them.


The top articles over the last two weeks, as determined by the number of readers, were:


Intel Selects Synopsys As Its Primary EDA Supplier
Synopsys announced that it was selected as Intel Corporation's primary EDA supplier. The two companies signed a multi-year, expanded commercial and technology agreement under which they will closely collaborate on advanced design flows that combine Synopsys' breadth of EDA solutions with Intel's technology strengths and design expertise. The agreement expands a long-term relationship between the two companies.


Virage Logic Enters into Agreement to Acquire Ingot Systems
Virage Logic Corporation announced that it has entered into an agreement to acquire privately held Ingot Systems, Inc., a leading provider of critical functional IP and design services to the semiconductor industry. The planned acquisition will expand Virage Logic's ability to serve the company's chosen market by adding new products and services required in the rapid development of System-on-Chip (SoC) integrated circuits to Virage Logic's current family of memory compilers, logic libraries and related development
tools. The proposed acquisition is an all-cash transaction and is expected to be accretive (non-GAAP) beginning in early fiscal year 2008. The transaction is expected to close by August 15, 2007.


Nanno SOLUTIONS Appoints Test Expert as CEO, Kejing Song Joins Design for Manufacturing (DFM) Company
Nanno SOLUTIONS announced that its Board of Directors has appointed Kejing Song as President and CEO. Mr. Song is a successful investor, who is also an expert in silicon testing.  Mr. Song graduated from UC Berkeley with a B.S. in Computer Science. During his more than 19 years of semiconductor industry experience, he has worked at Cypress, Zycad, Lattice, and Sandisk, and has been involved in IC testing.


GUC Adds Apache's RedHawk to their 65nm Signoff Flow
Apache Design Solutions announced that Global Unichip Corp., a leading SoC design foundry, has added the RedHawk power integrity solution as part of their signoff requirements for all designs at 65nm and below. Over the past two years, RedHawk helped GUC perform dynamic IR prevention prior to tape-out, which resulted in very few yield losses due to IR drop.


Xilinx CEO Announces Retirement
Xilinx announced that Willem P. Roelandts, 62, President, Chief Executive Officer and Chairman of the Board intends to retire from the positions of President and Chief Executive Officer. A committee of the Company's Board of Directors has been formed to begin an immediate search for a successor, evaluating both external and internal candidates. Roelandts will remain Chief Executive Officer until a successor is named. Upon retirement, Roelandts will continue to serve as Chairman of the Board.  At the helm of Xilinx since 1996, Roelandts led the
company through an intense period of change within the semiconductor industry, growing Xilinx sales from $560 million to over $1.8 billion.


Microchip Technology Launches Semiconductor Wiki
Microchip Technology Inc., a provider of microcontroller and analog semiconductors, announced ICwiki (www.microchip.com/ICwiki)--a Web site that enables engineers, students and professors working with microelectronics to collaborate and share information related to semiconductor products, applications and best practices. Using Wiki technology, participants can change content on the site and participate in Web logging (or "blogging"), voting and messaging. ICwiki is available in several different languages, including English, Spanish, Chinese, Japanese, French, German and Russian.

Other EDA News



  • Solido Design Strengthens Executive Team, Welcomes Vice President
    of Product Development Douglas Konkin
  • EDA Tech Forum(R) Announces Keynotes for Santa Clara Event
  • Ansoft Corporation Revenue Increases 15% 
  • Xilinx Leads Teams with EDA Leaders to Tackle Ultra
    High-Capacity FPGA Design Verification 
  • Carbon Design Systems Repeats Webinar on Automatic Model
    Generation for CoWare's Platform Architect to Accommodate Asian Market 
  • Agilent Technologies Announces HVMOS Package for IC-CAP That
    Provides Accurate, Fast Extraction for Synopsys' Widely Adopted, HSPICE
    High-Voltage Model 
  • MOSAID Renews Ottawa Real Estate Sale Process 
  • Silicon Image Names Paul Dal Santo Chief Operating Officer 
  • eASIC and Avnet ASIC Israel (AAI) Partner to Support
    Increasing Demand for Structured ASICs in Israel 
  • Synopsys Announces Earnings Release Date and Conference Call
    For Third Quarter Fiscal Year 2007
  • Carbon Design Systems Announces Four Upcoming Webinars on
    Using Automatic Model Generation for CoWare, ARM, MIPS Platforms 
  • Altera Cyclone III FPGAs Land Leading Role in New SVS
    Multiviewer Products
  • Samsung Teams with Denali Software to Improve Memory System
    Design Process 
  • IPextreme Expands Global Presence with Three New
    Representatives: Maojet in Taiwan


Other IP & SoC News



  • TI Acquires Integrated Circuit Designs, Inc. to Expand
    Low-Power RF Design Resources
  • Microsoft Embraces TSMC 90nm Embedded DRAM Process for Xbox
    360 
  • Tensilica Presents "Low-Power, Low-Overhead,
    High-Fidelity Digital Sound for SOCs" 
  • SEMI Reports Second Quarter 2007 Worldwide Semiconductor
    Equipment Figures 
  • InterDigital Issues Revenue Guidance for Third Quarter 2007 
  • Merrimac Reports Second Quarter and Six Months 2007 Results
  • Rick Cassidy of TSMC to Present Keynote at FSA Suppliers Expo
    & Conference 
  • ISSI Announces 5V, 4Mbit Asynchronous SRAMs for Industrial,
    Automotive, Telecom, and Networking 
  • New Intel Server Processors Provide Ultimate Choice in Speed
    and Energy Efficiency 
  • Jazz Semiconductor's Analog BCD Process Chosen by Nexsem to
    Develop New Line of Synchronous PWM Controllers 
  • RFMD to Acquire Sirenza Microdevices 
  • TI Introduces Sub-1GHz RF Transceiver for Low-Power Wireless
    Applications 
  • Renesas Technology Develops Industry's First Microcontroller
    with On-chip Flash Memory Built with a 90nm Process: a 200MHz SuperH(R) Chip
    for Powertrain Systems
  • Winbond's New Audio Controller Chip Gives Time-to-Market Edge
    for Skype(TM) Phones and VoIP Peripherals 
  • NVIDIA Reports Record Results for Second Quarter of Fiscal
    2008
  • TI Introduces Low Temperature Drift, High Accuracy Voltage
    Reference Family with High Output Current 
  • Nokia Selects Broadcom as a Chipset Supplier for Future EDGE
    Phones 
  • Pixelplus Obtains A Completely Favorable Ruling From the
    Intellectual Property Tribunal of the Korea Intellectual Property Office on the
    Non-Infringement of Item 12 of the Disputed Sensor Patent Claimed by MagnaChip 
  • UMC Reports Sales for July 2007 
  • AKA's MIL-STD 1553 RT Core Validated by Leading Independent
    Test House 
  • Leadis Technology Expands LED Driver Portfolio With
    Four-Channel LDS8842 & Three-Channel LDS8830 Enabled With 1.33x Mode









    -- Jack Horgan, EDACafe.com Contributing Editor.

