September 28, 2009
It’s the Customers...
by Jack Horgan - Contributing Editor
While at DAC I had an opportunity to interview Kathryn Kranen, President and CEO of Jasper Design Automation. She has over 20 years' experience in EDA as an executive at several companies. During the interview she argued strongly that existing and prospective EDA customers are more interested in other users' experiences with specific products than in the usual press announcements of new products.
Unfortunately, many end-user firms have a practice or policy against giving product testimonials. Perhaps they want to keep knowledge of a superior product to themselves as a competitive advantage. Perhaps they fear that someone who has an unsuccessful experience with a product they have endorsed will take legal action. I have considerable respect for members of the legal profession, but corporate attorneys can be overly conservative and protective. If you have an attorney draft an agreement for your prospective customers or partners that gives you protection and advantage in all contingencies, only an idiot would sign it, and who wants idiots for customers and partners?
It could be an issue of fairness. Most EDA customers have competing products from multiple vendors spread across multiple groups, departments, divisions and geographies. Left to their own devices, some individuals or groups would be more inclined to give testimonials while others would be more resistant, regardless of the relative merits of the situation. This could be a cultural thing. I worked for a Japanese company that was the leading mechanical CAD vendor at the time, with an installed base of tens of thousands of seats. The best they could get out of their large user community in Japan were statements like "Yes, I am using the product and I look forward to the next release." I remember working for a company where I could not get the head of our US user group to say anything on the public record. He said he would like to, but that it was against his company's policy. I then found a long article in German in a well-respected CAD/CAM magazine in which an employee of the same company went on and on extolling the features and benefits of one of our competitors. I personally translated the article into English and gave our user a copy. I asked, "Now can I get a testimonial, a success story like this one, from you?" He consulted his bosses, and they said maybe it is different in Germany, or maybe it is not different but two wrongs don't make a right. By having a corporate policy against vendor testimonials, a firm ensures that all vendors are treated the same. It is straightforward and simple to implement, and it eliminates the hassle of dealing with vendors complaining about what they perceive as unfair treatment.
From a vendor’s perspective the best end user success story would be where a prospect was stymied, stuck on a critical problem (could not close timing, low yield) and the vendor rode in on his white horse with his wonder product and saved the day. This would be great publicity for the vendor but not so much for the “saved” design team. There might be some reluctance to see this “intervention” in print.
While having a list of or quotes from satisfied customers may help a prospect gain confidence in a particular vendor’s product, having a detailed understanding of the existing customers’ experiences with that product would be more informative and valuable. This would require putting the prospect and the current users in contact, not just writing up a customer success story. While this does happen, one can understand why management might consider this a distraction or helping the enemy. And, of course, it is unlikely that a vendor would put a prospect in contact with an unsatisfied customer.
There is also a blog "Of Investors, bloggers and customers" on our EDACafe website by Jacques Benkoski on this subject.
Would you give us a brief biography?
I am an electrical engineer from Texas A&M. Back in the early 80s I was a chip designer. I signed with Rockwell International right away and did some of their first ASICs. I moved into EDA with Daisy Systems, stayed with Daisy until the late eighties, and then did three successful startups. At Quickturn Systems I was probably the mother of emulation success, in that I started as a salesperson and found the vein of microprocessor emulation. I did the big deal with Intel that funded the company. Then in the next year we did $100 million in processor business, and away they went. I grew up with that company, ultimately as VP of Sales of a public company. Then I was recruited out to Verisity as President and CEO of the US subsidiary of the Israeli parent company. That was testbench automation, another big verification milestone in the industry. There were around six people. I took it to cash-flow positive. After my first child, we brought in Moshe Gavrielov; I stayed for another year and then left, about three years after joining. They went public the next year. So that was my blissful retirement stage for a few years. Then I was lured back to the industry to what is now Jasper (it started out under a different name), where I have been for six and a half years. Jasper is in the formal property verification domain. Since 2003 I have been leading the ship.
Now we are cash-flow positive and have a fantastic customer lineup. We just announced our second major product, which is in the RTL design and evolution space, not just the flagship property verification that is our bread and butter and revenue generator. Aside from that I have a husband, Kevin Kranen, and two kids, Karl and Kayla, ages 11 and 8. I am Vice Chairperson of EDAC, among other professional things. So happy, happy.
I closed a round of funding in November 2008, which is really our cash buffer. We brought in a new lead investor, ZenShin Capital, which focuses on companies with a very strong market presence and growth opportunity in Japan. We do. All of my investors re-invested in the company. We have now topped it off at a little over $30 million. We do not think we need any more. So Jasper is a well-capitalized company.
What is the annual revenue?
We do not discuss our revenue numbers. You can probably reverse-engineer some of this since we are cash positive. We have development sites in Brazil, in Gothenburg, Sweden, and here in the US in Mountain View. The field is all over: Israel, Germany and Japan.
We are hiring. We have actually hired quite a few people, mostly on the application engineering side, with a very selective occasional R&D engineer, but mostly in the channel, because the revenue is built directly on the AE resources in the channel and manifests itself in the revenue stream. We figured out a couple of years ago that the best business model, the way to enable this kind of new and bolder adoption of formal property verification, was to go find the applications where customers have pain points across the SoC development cycle. Instead of saying "Here is our formal property verification tool, and here are the manuals, training and one scripted flow," we say, "If you could wave a magic wand and fix any problem in your design and verification phases, what would that be?" Things came out of the woodwork at us. It is so easy to use the power and flexibility of properties to craft a solution to just about any problem. At last count there were 70 known unique problems that we have solved. There are 8 major areas (if I can remember them all): architectural validation, RTL exploration, early design pre-verification, classic protocol checking, critical block functionality, verifying critical properties like data integrity or cache coherence, and SoC integration. There are many, many subapplications of things that you can automate using JasperGold, our formal property verification product. So you simply eliminate labor. It is low-hanging fruit, trivial from a technology point of view, but since our product was already out there at these customer sites, we said "Let's round it out and find all the ways to deliver more value and utilize more licenses." Last but not least, debug. Did I miss any?
It has become an extremely fun solution space, in that it is like a chameleon: it adapts itself to whatever the problems are. Once you have solved a problem the first time for a customer, you can spread it out across your customer space. That has led to tremendous expansion. We have had over 100% year-to-year growth, 200% into 2008. This year we are still growing, not over 100%, but still significantly. Most of our revenue this year will come from our existing customer base proliferating more and more throughout their divisions.
Do you have new products in these areas or best practices? How do you leverage these successes in eight different areas?
You are the first person to ask, and that is a good question. I have watched how one startup after another takes its good technology and then, as soon as there is a variation, says "Oh, I have a new product." In my opinion this dilutes the value of the core product, and you end up with a lot of teeny, tiny, low-priced products that are very difficult to configure and sell. No! We have used JasperGold, which is our formal property verification platform. In fact, it has an API, so that you can use Tcl or C commands to script new applications on top of it. That is all under the JasperGold umbrella. Our second product has some overlapping functionality in terms of visualization of what is going on in your design, which is still useful in property verification. We created a new product, called ActiveDesign, that has formal algorithms underneath but does not perform formal property verification. It performs behavioral analysis on waveforms, with a database attached, so that you can explore your RTL and tag it with the temporal relationships that are important, simply by clicking on the cycles, signals and states that are meaningful. Comment that. Index that. We call it behavioral indexing. With that you build a knowledge base which is extremely useful to the author of the design for creating new waveform traces and confirming behaviors as the RTL is being developed. Without having all the RTL complete, and without any testbench at all, you are able to say "Get me to that state." It directs; you have 100% controllability. You do not have to worry about the stimulus any more. Get me to that state, get me that combination of things happening, and it will produce waveforms that you can then manipulate. I would like to stretch that out a little bit; I would like a bit more time between these signals. You just declare what you would like the output to look like, and it will fill in the correct trace, deriving it from the RTL. So behavioral indexing lets you abstract some of the temporal functionality and use it as building blocks. You can now explore more interesting compound behaviors and so forth without having to deal with all the bits, abstracting a level above the RTL with complete and full control on the part of the designer. On top of that there is implication analysis, which takes the next version of the RTL and shows you what behaviors were impacted by the changes, what you might have broken. You can ask: if I change this line of code, what else might be impacted? That is a very exciting product.
Getting back to your great question: there are several ways to leverage application success. From a public relations point of view, if one customer says he has a certain problem, you can say that we have a customer who solved that problem. Another way is to provide services based upon these successful experiences. Another would be to document them in a best-practices manual.
How do the applications get delivered? Okay! We view the evolution of the company through what we call the bull's-eye model. I have heard, after the fact, that Geoffrey Moore describes this very well in one of his books. In the very middle you have your software, the tool itself. You can think of the next ring around the bull's-eye as methodology, where you document things that are doable with the tool and provide that as a kit for someone else to do. The outermost ring is service. Service does not necessarily mean someone paying you to fulfill a contract to do something; it just means applying your experience to try to solve a new problem. Over time, things that were in the outermost service ring get documented into methodology, and things that were methodology, repeatable over and over again, get automated into the tool. The tool evolves and grows. The center circle grows much faster than if you were just delivering a shrink-wrapped tool saying "Knock yourself out."
The applications we mentioned, the 70 subapplications of these 8 categories, are predominantly methodologies we have documented. We give samples and templates on how to do each new application, which the first time we may have had to figure out with a collaborative customer. "Here is what I would like to do. I've been thinking of using JasperGold" (because we opened it up for them to do so). They may ask, "Do you know a way to do this sort of thing?" We say, "Yes, that reminds me of the path analysis application over there for X-propagation. I bet we can use that application over here." Over time we find the ones that we think are the most common best practices, or applications that would be generally useful, and we have rank-ordered which ones we will document next. Some of these become automated push-button features in the tool as well. We have stuck to our guns and said that enhancing the value of the tools we have already delivered, rather than fragmenting what is already a small domain, is going to be a winning strategy. We were right about that. There is no doubt.
You can look at other companies that are small and maybe in neighboring domains. You get a lot of press when you can say you have another product, but I do not think that amounts to revenue success or market success, because customers are very busy. Whereas if you give them one useful tool that wears many hats across the design cycle: there is only one stage when you do architectural validation on a given architecture; then you will be doing design exploration, and so on. With a license across their time horizon they get far more value out of the one thing they are purchasing than out of five or six little things that are going to be negotiated down in price. This is a way to aggregate that value and really get the customer's recognition of it. It shows up in our booth when people ask "What is new at DAC?" Although we just had a major, major product launch in June, it is not the first thing we mention. If you walk into our booth, there are two things: one wall has eight customer testimonials, in quotes, the ones that are public. There are quite a few others that we can't acknowledge. The other big wall has a myriad of applications listed on it. So it is who some of the customers are and what they are using it for. By showing people almost a menu of problems solved by this one powerful, flexible product, there is something for almost everyone on the list, without having to put them in a Mixmaster, stir, and hand over this piece of collateral or another. That has been the key. That, and tying our business model for growth to investing in the customers: investing AE resources to go solve the customer's application, and having an account-by-account reconciliation. "Okay, if we are putting that much of our total AE bandwidth into that account, we had better be getting a pro rata amount of the company's operating expenses out of it." We are very mindful of where we invest resources and how that relates to growing the dollars. This has made us very successful. I am very pleased with this team.
Granted that this product is a chameleon and very flexible, are there any sweet spots in terms of end-user applications, say processors or memory?
In terms of industry segments, we have a strong presence in the processor, wireless and consumer segments. We also have networking. We have quite a few processor customers; the ones that have publicly endorsed us are ARM, AMD and Sun. In the consumer area, the company that we may list is Sony Corporation. We have several others that are wildly successful consumer companies that we can't mention. HP certainly; we have a very nice article on the ROI that they have seen from our products. NVIDIA is our oldest customer, and graphics is another industry segment. So processors, wireless and consumer are the three most prevalent, with some networking and graphics thrown in. It is pretty chameleon-like in terms of industry segments.
You asked if there are any sweet spots application-wise. Verifying critical functionality that is clearly risky will be the long pole in verification, the critical path in the schedule. When you can take that and exhaustively verify it with formal, you have fundamentally changed your risk, schedule and so on. Full proof of critical functionality, correlated to the spec, is still the biggest differentiator for our products.
Verification is a big area now. If you could segment this, where would Jasper fit in?
Let me take a stab at that. A lot of this is based on my own history, starting out as the first female simulation user at Rockwell. Simulation I would consider one distinct subsegment of verification. Then testbench automation: I think that testbench automation and testbenches are a distinct market in verification, although it gets blurred because a lot of that is now included in simulation or SystemVerilog Testbench. Some of the graph-based coverage tools I would also put in the testbench area. Emulation and hardware-assisted verification is definitely a category. The static and formal property domain is another. I think we see that one starting to blend into the ESL-type space; those are getting closer and closer together, with the near neighbors getting closer to the implementation side of things. Those are the four big ones in my mind.
Who do you see as your competition?
The big three companies. We compete regularly with Cadence, Synopsys and Mentor. Every single customer who now has a big testimonial on our wall had one, two, if not three of the big companies' tools. Usually it is two; I do not know why. Looking at our life in the market, there was all kinds of interest and excitement when we first launched the company in 2003. A couple of years later, when all of the big vendors had launched their quote-unquote competitive products, everybody froze us out. They said, "We have to see what we already own in our mix and how it fits." So it took a little while for companies to digest what they had, understand its limitations, and decide whether they cared enough about the category to go look for something better. Our turnaround happened when people started saying, "Okay, now we have a handle on what we have. It is time to go look at that product. We definitely want more from formal. Let's see what you have."
Our largest customer, in fact, had evaluated Jasper back in 2004 or 2005, selected us technically, and gave us a low-ball price. We said we cannot take that; we cannot do business at that level. They said they were going to use their existing vendor's tool, and they used it for a while, as well as another big vendor's tool. They came back to us after hearing references from other companies and from people they were hiring. The first order we did for them, after a very short evaluation, was I think over 10 times the price per license of their earlier low-ball offer.
It is because the domain was by then viewed as essential, and we were clearly recognized for our differentiation because they had experience with the alternative. Sometimes you just have to outlast the marketing hype out there in the market.
You had mentioned that you cover new and interesting technology. I have gotten a lot of coverage from those types of things. It is great. But when I think of customers and what is relevant to them, it is knowing which customers have crossed the tipping point into production usage; that is more actionable. Sometimes you look and see (yawn) another announcement of a five-person startup that does not have funding yet and may or may not ever get its product working. It actually gets a disproportionate amount of coverage. They need it to open doors and so forth. It was very helpful when we launched the company in 2003: front page of EE Times and all that. But I think it is so much more important, and customers are so much more interested, when they can see the herd effect. Oh boy, companies now come in and walk directly to the Jasper reception desk and sign up for a meeting. They don't like wandering around wondering who is this company. They say they are here because they heard from their friends at other companies that you guys are the clear domain leader: "We need this. I have a project I want to talk about with you." So it is really a different sales cycle, and much more fun now.
It is difficult to judge whether a company has reached its tipping point from the outside when they cannot tell you even the names of some of their biggest customers or their revenue.
When you finally start seeing testimonials from big companies, it is only a subset that are even willing to do that. But whether it involves my company or not, I am very grateful to the companies that still give sincere endorsements. That is what signals the early majority in the market that it is time to go look at a product.
There was an interesting blog and article put out recently saying that this has really changed from my experience at Verisity in the late 90s, where we were very radical, launching a new language and all of that, to now. That set has shrunk, and the whole industry owes gratitude to the subset that still speaks its mind and cannot be bought. There are some that can be bought for their testimonial; they say they like a product whether they use it or not. That is really not a good practice in my mind.
How do you make the experience of formal better for the customer?
Really, it is about three things. First, methodology: giving the user the hooks to apply their own flows and methodologies. Then there is clearly capacity. Capacity is king in the world of formal, and that is where we have our greatest lead. And the third, not to be shortchanged because it has a bearing on capacity, is visibility into the process. All three of the big vendors' tools are black-box solutions. That is like driving blind. If you get lucky and your property is proven, great. If not, you have to change the property, change the design, and/or change the constraints, and you do everything from outside the black box.
One of the things that attracted me to the company, and that the founders had done differently in JasperGold's initial architecture, was to say, "We want to empower the engineer to take it farther. We want to make the process visible." Over the last 7 years or so, things have been added so that people use JasperGold not only for verification per se, that is, proving properties. Now they use JasperGold ActiveDesign to understand the design. They get a guided tour through their control flow that helps identify where complexity is getting out of control, and they can apply some of the techniques that we have taken from the outer ring, from services and methodology, to now under the hood, for handling complex structures: proof accelerators that deal with FIFOs and memories, for example. We have a formal scoreboard; we are the only company with one, and it is one of the most valuable verification assets. It watches packets going in and packets going out to see if you are dropping, duplicating or corrupting packets. I think that visibility, capacity and methodology go hand in hand. That has been our kind of audacious approach to this formal domain, rather than trying to find something simple and small that formal can do, that is low effort, and where you do not have to teach anyone anything. That does not have any value.
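The scoreboard discipline described here, watching traffic at the boundaries and flagging dropped, duplicated or corrupted packets, can be illustrated with a small sketch. This is not Jasper's implementation, just a hypothetical software model of the same checking idea, assuming each packet carries a unique ID and a payload.

```python
# Hypothetical software model of a verification scoreboard.
# It records packets entering the design under test and checks
# packets leaving it for drops, duplicates and corruption.

class Scoreboard:
    def __init__(self):
        self.in_flight = {}   # id -> payload of packets that entered
        self.errors = []      # recorded violations

    def packet_in(self, pkt_id, payload):
        """Record a packet entering the design under test."""
        self.in_flight[pkt_id] = payload

    def packet_out(self, pkt_id, payload):
        """Check a packet leaving the design against what went in."""
        if pkt_id not in self.in_flight:
            self.errors.append(("duplicate_or_spurious", pkt_id))
        elif self.in_flight.pop(pkt_id) != payload:
            self.errors.append(("corrupted", pkt_id))

    def end_of_test(self):
        """Anything still in flight was dropped."""
        for pkt_id in self.in_flight:
            self.errors.append(("dropped", pkt_id))
        return self.errors


sb = Scoreboard()
for i in range(3):
    sb.packet_in(i, payload=i * 10)
sb.packet_out(0, payload=0)    # matches: ok
sb.packet_out(1, payload=99)   # payload changed: corrupted
# packet 2 never comes out: reported as dropped at end of test
errors = sb.end_of_test()
```

A formal scoreboard proves such properties exhaustively rather than checking one simulated traffic pattern, but the invariant being checked is the same.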
You mentioned capacity: block level, SoC level?
It depends upon the application, because it is a combination of three things: the size of the design element you are reading in (we read in at the SoC level), the complexity of the property you are trying to address, and whether you are trying to do a full proof or something like finding a trace. Those three come together in the capacity of the problem you are trying to solve. Take SoC integration: I mentioned connectivity verification, which is very shallow, very simple. You can operate on a whole chip because the things you are verifying are small. Register configuration verification is similar; that step can save tens of percent of design time during certain phases of RTL development, and it is very predictable in terms of capacity.
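The "shallow" nature of connectivity checking can be seen in a tiny sketch: compare a spec of intended top-level connections against the connections extracted from the netlist. The signal names below are invented for illustration, and a formal tool would additionally prove the connections functionally, not just structurally.

```python
# Hypothetical sketch of SoC connectivity checking: each entry is a
# (driver, receiver) pair. The check reports specified connections
# missing from the netlist and netlist connections not in the spec.

def check_connectivity(spec, netlist):
    """Return (missing, extra) connections as sorted lists."""
    spec_set, net_set = set(spec), set(netlist)
    missing = sorted(spec_set - net_set)   # specified but absent
    extra = sorted(net_set - spec_set)     # present but unspecified
    return missing, extra

spec = [
    ("cpu.irq", "intc.in0"),
    ("uart.tx", "pads.uart_tx"),
    ("dma.done", "intc.in1"),
]
netlist = [
    ("cpu.irq", "intc.in0"),
    ("uart.tx", "pads.uart_tx"),
    ("dma.done", "intc.in2"),   # miswired interrupt line
]
missing, extra = check_connectivity(spec, netlist)
```

Because each individual connection is a trivially small property, checks like this scale to a whole chip, which is the point made above.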
When you are looking at end-to-end, spec-level proofs that suck in most of the design, I would say designer-sized blocks up to unit size, where a unit is the work of a couple of designers. That is about the span for end-to-end proofs, though there are exceptions that are bigger; I would not steer someone toward those without a careful look. In architectural validation, you are running your entire architecture for several different subsystems on a high-level model that does not have RTL attached. In post-silicon debug, you take a chip and say, I have a symptom in the lab. I know that every 600 packets it is dropping one, and I think it is in the read/write mode. You devise a property on the suspect block that the symptom would violate, then go trap it, using the visibility to find out why, and the root-cause bug. You may aim the big spotlight at Block A: you think it is Block A, and you have been trying for weeks to reproduce it in simulation. Aim the big spotlight called JasperGold at Block A, and sometimes what happens is that someone says, "I just proved it can't be Block A. We need to look at Block B." And then you find the bug with what may be a scoreboard, maybe a big broad property that is certain to be violated. That is another way to look at things at a higher level, hierarchically.
If we meet again next year or in 18 months from now, what is Jasper likely to be announcing?
That’s a good one. You are asking me to preempt our strategy that we have not made public. I think what you will certainly see more and more growth in the application space and more of those things that are methodology becoming tool features. In terms of announcements you will see adjacent areas. We really do believe in static as our primary approach to things. It is so much more light weight. It enabled other things. You will see static approaches. You will probably something in the higher level model helping you verify the source which you may be synthesizing from a higher level model. That is probably all I want to say right now. We have a few things cooking. A
lot more customer I would suspect.
What is the pricing and packaging of Jasper’s products?
We have made a big change to that this year. We used to sell the worldwide WAN license to JasperGold for $217,500. Now JasperGold, the formal property verification product, has a cousin called JasperCore, which is a batch-operable node, almost a client-server node of JasperGold. You can run a proof grid using many, many JasperCores or JasperGolds. JasperGold will automatically carve up and distribute all of the tasks across LSF, across the network, across multi-CPU machines, to take advantage of distributed computing and get a linear reduction in human time and schedule. Whether you have a design with 12 properties or 2000 properties (you probably have 40 big ones or so), JasperGold will figure out the best way to distribute them across the available nodes you give it. We actually have some nice functionality, like a command that says: if there are licenses free, I want to use them, but kill the extras if someone else wants them. We have moved to more enterprise-level pricing; I am not talking about all-you-can-eat. A solution will have a farm of JasperCores, plus JasperGolds that are the portals, the user-interactive licenses, and ActiveDesign, the database and behavioral analysis system, which uses time-shared slices of JasperCore. You ask, what is the pricing? For a single copy it is still the same as it was, but that now includes running multiple instances in parallel over a network, letting the fastest answer win, included in the cost. We are taking advantage of multicore machines and multithreading. We thought that was where the application needed to go. We would rather get more money for one very strong product than slice and dice it. JasperCore and ActiveDesign are lower priced, but pricing is so subject to configuration and volume that I really hate to put out numbers that might mislead people.
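The "fastest answer wins" idea mentioned above, racing the same proof task on several engines or nodes and taking the first result while killing the extras, can be sketched in a few lines. This is an illustrative model only; the engine names and runtimes are made up, and it is not Jasper's actual scheduler.

```python
# Illustrative "fastest answer wins" proof dispatch: several solver
# configurations attack the same property in parallel; the first to
# finish supplies the verdict and the remaining tasks are cancelled.
from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait
import time

def run_engine(name, seconds, result):
    """Stand-in for one formal engine working on the same property."""
    time.sleep(seconds)
    return name, result

def prove(property_name, engines):
    """Race several engine configurations; first finisher wins."""
    with ThreadPoolExecutor(max_workers=len(engines)) as pool:
        futures = [pool.submit(run_engine, n, s, r) for n, s, r in engines]
        done, not_done = wait(futures, return_when=FIRST_COMPLETED)
        for f in not_done:          # "kill the extras"
            f.cancel()
        winner, verdict = next(iter(done)).result()
        return property_name, winner, verdict

# Three hypothetical engine configurations; the 0.05-second "sat"
# configuration finishes first and its verdict is used.
engines = [("bdd", 0.3, "proven"), ("sat", 0.05, "proven"), ("bmc", 0.2, "proven")]
prop, winner, verdict = prove("fifo_no_overflow", engines)
```

In a real proof grid the redundancy pays off because different engines excel on different properties, so racing them bounds the wall-clock time by the best engine for each property.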
Anything to add?
Services. ActiveDesign, the tool I mentioned with the database, behavioral analysis and behavioral indexing, is useful for design exploration and for design evolution, which is checking changes to RTL. Once you have finished the design and have a database, it is the greatest thing ever to happen to reuse, because if a design transition happens, you are now sending the design along with a container of knowledge. One thing that we have offered as a service, really as a market on-ramp, is to take a legacy design from a customer and use the tool to data-mine and activate the database so that it is then queryable: ask a what-if question and get a waveform answer. So we do offer services to do design activation. We have several service companies as partners to expand that bandwidth. It is certainly more valuable when you have used the tool in your original design process to accelerate the RTL development itself.
The top articles over the last two weeks as determined by the number of readers were:
IBM has fabricated a test chip with an embedded dynamic random access memory (eDRAM) technology that features the industry's smallest memory cell. Announced in 32nm and 22nm technology, it offers density, speed and capacity better than conventional on-chip static random access memory (SRAM), comparable to what would be expected of an SRAM produced in 15-nanometer technology, three technology generations ahead of chips in volume production today.
IBM's eDRAM cell is twice as dense as any announced 22nm embedded SRAM cell - including the world's smallest 22-nanometer memory cell announced by IBM in August 2008 - and up to four times as dense as any comparable 32nm embedded SRAM in the industry. Higher memory density can lead to chips that are smaller, more efficient and can process more data, improving system performance.
The IBM eDRAM in 32nm SOI technology is the fastest embedded memory announced to date, achieving latency and cycle times of less than 2 nanoseconds. In addition, the IBM eDRAM uses four times less standby power (power used by the chip as it sits idle) and has up to a thousand times lower soft-error rate (errors caused by electrical charges), offering better power savings and reliability compared to a similar SRAM.
Dr. Walden C. Rhines, Chairman and CEO, will present a keynote at the 16th annual CPDA (Collaborative Product Development Associates) PLM Road Map 2009 Conference in Detroit. The title of the talk is "The Paradigm Shift for Vehicle EE Design with Model-driven Development."
Infinisim, a provider of innovative verification solutions for mixed-signal designs, announced that AppliedMicro has selected its flagship verification product, RASER™, as the verification platform for next-generation designs. Infinisim’s RASER technology bridges the gap between advanced process technology verification requirements and capabilities of existing mixed signal simulators. RASER, with its innovative Real-time Adaptive Simulation™, guarantees SPICE accurate results with an average of 50 times higher throughput and capacity, enabling AppliedMicro to verify large mixed-mode SoCs and mixed-signal designs. RASER is production proven and currently in use at
many mixed-signal design houses for their verification.
Gary Smith EDA Research: Missing the Point (Synfora & Apache acquisitions)
Editorial on Synfora’s acquisition of Esterel Studio and Apache’s acquisition of Sequence.
-- Jack Horgan, EDACafe.com Contributing Editor.