Posts Tagged ‘Atrenta’
Monday, April 11th, 2011
Liz and I sat down with Riko Radojcic of Qualcomm to hear his thoughts on how upcoming 3D design and manufacturing would affect the EDA world. Naturally, the conversation morphed into a discussion about standards that will be required to make 3D adoption pervasive.
Liz: Thanks for taking the time out of your busy schedule to sit down with us, Riko. So let me ask you, what is the relevance or importance of standards in adopting 3D?
Riko: Well, first, let me make a general statement about standards. Sometimes some of the EDA companies view proprietary formats as a source of competitive advantage – a way of locking in a customer base. This is especially true when a given company has taken a lead with a given solution, and they fear that opening up a proprietary format would shrink their slice of the market pie. However, in general, design standards, standard exchange formats, and standard models tend to make the whole pie bigger, rather than affecting the size of anyone's slice. So, in the long run, standards are good for users, like Qualcomm, and for vendors, like the EDA companies. I keep referring back to the industry experience with SPICE models and the transition from the proprietary ‘Level 28’ model to the open-standard BSIM generation of models. With all the brilliance of hindsight, I think the industry has benefited from an open standard model.
For 3D technology specifically, we are promoting the concept of standards in order to accelerate the adoption of 3D design and manufacturing methods. We want to help line up the supply chain behind the 3D technology. I would say that most people – users, industry observers, EDA vendors, etc. – perceive 3D technology as a disruptive change. The fear of that change is part of the barrier to adoption. Standards are the other side of that coin: they help dispel the fear.
Ed: Is 3D more a barrier to standards? Are we sabotaging our own efforts?
Riko: There is a lot of FUD in 3D. It is important to realize that there is 3D and then there is 3D. Some future 3D implementations – like stacking logic on logic – do require disruptive change in design tools. We will need design methodologies and tools that comprehend an entirely new dimension of parameters for this class of designs, and until these are developed, standards may even be a bit of a barrier.
On the other hand, 3D in the short term means heterogeneous stacking, like memory on top of logic. So right now, 3D is not that disruptive. We only need some minor upgrades – designing the logic in a smart way so that stacking DRAM on top of it is easier and lower risk. For this class of designs, standards would be extremely helpful – having a standard exchange format, so that we have the relevant information about die A when designing die B, or vice versa, would be excellent. For example, designing the power distribution network on die A requires knowing the power demands of die B.
To accelerate and facilitate adoption, we need more design information. JEDEC, for example, is doing a nice job of working on the standards for memories.
Liz: What is JEDEC doing?
Riko: JEDEC is defining the pin assignment and the pin array configuration required for Wide IO DRAM memories to be stacked on logic die.
Ed: What of vendors’ fears that buying into this format will be giving away too much of their own design data?
Riko: We can all make an investment in a standard format that can provide the right characteristics without exposing too much information. The emphasis is on the format, rather than the specific content – which should remain proprietary. Again, I like to refer back to the SPICE and BSIM models – where the model format, units, etc. are standardized, but the specific coefficients in the model are the proprietary information of whoever owns the process technology.
Liz: Why is this not happening now?
Riko: We are all driven by financial motives. No one feels they will make enough money out of it right now – and this is especially true for standards, which, by definition, belong to everybody. However, there could be certain advantages for someone creating a standard and then giving it away. If you make the rules, you have a better chance of winning the game.
What is required is a series of standard ‘exchange formats’ that would communicate the necessary information about the design of the various die to be stacked, such that 3D stacking of these die is a low-risk enterprise. Basically, they would communicate design attributes such as power demand characteristics, thermal and mechanical stress sensitivities, maybe some floorplanning restrictions, etc.
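To make the idea concrete, here is a purely hypothetical sketch (not an actual or proposed standard; every field name and value here is invented for illustration) of the kind of per-die attributes such an exchange format might carry:

```json
{
  "die": "die_B",
  "power_demand": {
    "nominal_mw": 850,
    "peak_mw": 1400,
    "hotspots": [ { "x_um": 1200, "y_um": 400, "mw_per_mm2": 95 } ]
  },
  "thermal_sensitivity": { "max_junction_c": 105 },
  "mechanical_stress": { "tsv_keep_out_um": 10 },
  "floorplan_restrictions": [ "no_tsv_under_sram" ]
}
```

The point, as Riko notes, is that a format like this could be standardized while the specific numbers remain the proprietary information of whoever owns each die.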
Most of the standards bodies don’t have the capability to develop such standards. They have mechanisms to review a proposed standard and to manage and distribute it afterwards – but not to do the engineering required to develop one. So, there’s a lack of champions willing to put in the work to develop and promote a standard. It could be an EDA company, like Apache, or it could be an institution, like IMEC, or it could be an academic entity.
There are some activities going on, though. IMEC is working with Atrenta to develop a PathFinding tool – which may also involve developing a PathFinding exchange format. Apache has taken the lead in pushing a standard power exchange format for 3D. Perhaps some of the academics could be engaged to develop standard exchange format proposals? Si2 is willing to take a role in managing the standards, but someone needs to give them something to standardize. GSA is active and willing to coordinate the discussions. But someone needs to make a proposal so the industry can say “I like it” or “I don’t like it” or whatever.
Liz: So you are looking for another EDA guy to step up to the plate. And then what if someone like Cadence comes up with a competing idea? Then what?
Riko: Once a product is developed, all the EDA companies are invested in one format or another. We want to get these standards in front of the product development curve, so that it would be easier for any one company to adopt and comply with a standard, rather than making up their own format. This is where the users – such as Qualcomm – come in. We have the responsibility to demand this.
Ed: So it sounds like so far, we have a lot of discussion, but to a certain extent, some organizations are waiting for others to discuss or define a proposed set of 3D standards. Other organizations are waiting for that proposal to get adopted before implementing the 3D standards. How do we get off this merry-go-round?
Riko: I would say, let’s take a stab at partitioning the effort. Qualcomm proposed this last September, at a SEMI/Sematech sponsored meeting in Taiwan. We proposed dividing the world into two buckets: one set of players and activities focused on design-related standards, and another for manufacturing-related standards. For each bucket of standards-related activities, we proposed a suitable existing standards body, a suitable forum for discussion, and a suitable set of champions who would propose appropriate standards. In the manufacturing domain, it would make sense to use SEMI to manage the standards, and Sematech to provide the proposals. In the design domain, it would make sense to use Si2 to manage the standards, and EDA companies or academics involved with EDA to provide the proposals. That way there would be less overlap and, hopefully, fewer gaps.
Liz: What would happen then?
Riko: In addition, we proposed to create a forum which would be conducive to exploring and kicking around some of the proposed standards. Standards bodies, by definition, have a formal review and balloting mechanism – which tends to be slow. So, in order to accelerate the discussion, a separate forum would be nice. The Sematech 3D Enablement Center is doing this already for manufacturing-oriented standards. Let’s work with GSA to create a forum to discuss design-oriented standards, and if (or when) a given proposal is fleshed out, give it to Si2 to create a true standard.
Liz: Your 2011 hope or wish for 3D standards?
Riko: That our industry can actually define a standard without having to fight a turf war. We can do this if we get ahead of the 3D product curve. But only if we all pitch in.
Riko Radojcic, who has over 25 years in the semiconductor industry, is a Director of Engineering at Qualcomm, currently leading the Design-for-Through Silicon Stacking Initiatives.
Monday, March 28th, 2011
Continuing with my conversation with Tom Kozas, president of CADmazing Solutions, I asked him about a hypothetical scenario:
Ed: So Tom, what would happen if, for some reason, the big three EDA vendors all went away? So that instead of Cadence, Mentor, and Synopsys, the biggest three would be, say, Magma, Apache and Atrenta?
Tom: I think this raises even more questions.
Ed: Hmmm…interesting. What questions?
Tom: Several come to mind: Would this mean renewed growth for the industry? Would the fundamentals change that encourage investment in new startups? Would the design flows become more or less integrated, collaborative, and global?
Ed: Ok, good questions to ponder. So what would be THE big issue?
Tom: The “Silicon” in Silicon Valley is missing. Without investment in new semiconductor startups, growth simply won’t happen. Virtually all new design starts are happening within the big systems and semiconductor companies, which means the only way to grow an EDA company is to steal market share.
But would this translate to increased value for the remaining EDA companies in the eyes of the financial community? What’s interesting about this hypothetical is that, even though it would put the remaining EDA companies in a position to take advantage of this opportunity, they might not be able to.
Ed: Just to play devil’s advocate, why wouldn’t that next set of players, whoever they are, be able to take advantage of the sudden disappearance of the big three? And who do you consider to be that next set of players?
Tom: Good questions. But let me respond by saying what they will need to provide.
So, first, the next big three will need products that have great user interfaces, provide online collaboration, and are part of a new ecosystem that enables innovation. The industry already has advanced technology but needs graphical and command-line interfaces that exploit the online design environment.
Second, designers don’t necessarily sit in the same building but often have to work on the same problem. For example, two or more designers should be able to share the timing database and bring up the same timing path without having to rerun static timing analysis and do it within minutes no matter where they are in the world.
Finally, the current EDA ecosystem is in the dark ages; there needs to be a new model that facilitates new algorithm and tool development, with a reward system to match.
Ed: Tom, thanks again for your insights.
Monday, February 7th, 2011
Liz and I attended a panel at DesignCon that asked the question: what are you doing about the chip killers that delay your tapeout? That’s an intriguing, possibly unanswerable question, since we’ve been asking it virtually since EDA’s inception. Ed Sperling of System-Level Design moderated the panel, whose panelists were Sunil Malkani of Broadcom, Ravi Damaraju of Juniper, Ramon Macias of NetLogic, John Busco of NVIDIA and Bernard Murphy of Atrenta.
Sperling moderated a lively discussion; the questions that he, the panelists and the audience posed highlighted the ongoing, perhaps unanswerable nature of the topic. Some were:
• As designers and design managers, what keeps you up at night?
• If your design has to finish in half the time that your previous project took, do you start with a [design methodology and flow] clean slate?
• How do you get hardware and software engineers to work together?
• What’s good enough to get the design out the door?
• How do you define failure?
• What’s the price of failure?
• Who owns quality?
• What do you do when your next project is 4X the size of your last design? Throw people at it? Make the tools do more? Run faster? How?
• How do I turn around a design in a month and get all of these [now-required] apps on it?
• Why does place & route have to be flat?
• When will P&R, timing analysis have to break down the design hierarchically?
• How can verification be improved so that its pessimistic estimates won’t require designers to over-design?
The panelists all bemoaned the dueling standards that plague EDA, attributing them to companies wanting to gain marketing advantage, to the detriment of EDA users.
Sperling will publish a transcript of this panel in a future issue of System-Level Design. Nic Mokhoff published a summary of the panel the next day.
Finally, I have a question: why does DesignCon schedule a management-level panel on a day when the exhibit floor isn’t open? It certainly doesn’t help attendance at DesignCon panels, which has seemed paltry for years.
– end –
Monday, January 31st, 2011
Piyush Sancheti of Atrenta brings up a good point: for IP to work as we envision it can, which players have to contribute to the quality effort? And what does each type of player need to contribute? http://bit.ly/gpAHI1
Friday, January 14th, 2011
Ron Craig, Senior Marketing Manager at Atrenta and an expert on the subject of timing constraints, was good enough to sit down with me (Liz Massingill) recently to talk about the subject—what the current problems are and how to fix them. This is the result of my interview with Ron.
Liz: Ron, I was shocked to see, in the survey you conducted, that 94% of designers have timing constraint problems that could stop their current designs dead in their tracks. But they also don’t see a way to change their current methodology. WHY?!
Ron: There’s certainly no doubt that timing constraints remain and will continue to be a problem for design teams. The irony is that even though timing constraints are repeatedly an issue, most of these design teams feel that they know how to address all the problems that typically arise. It’s almost as if the problems are viewed as less severe if the solutions are known.
Liz: Seems like the problem is more about changing the mindset. But why are designers running into increasing clock domain issues in the first place? Use of more IP? Process nodes going down to the next level? More complex designs?
Ron: The key culprit here seems to be IP. With IP, the functionality is reusable but the timing constraints are often not. Third party IP developers may well be experts in what the IP is supposed to do but not necessarily its implementation, leaving design teams with incomplete and inadequate timing constraints. On the other hand, IP reused from another design may well have been constrained in a way that’s not compatible with how you want to use it in your chip – especially if you need to change how the IP behaves. In today’s designs where IP amounts to 70% or more of a typical SoC, you end up with the constraint-driven implementation process becoming increasingly risky.
The process shrinks and more complex designs mean you simply can’t get away with having inadequate timing constraints anymore.
Liz: Well, what can they, or more appropriately, their project managers or internal CAD departments do about this increasing problem?
Ron: The key is to introduce more certainty into the whole process. Rather than taking an optimistic, or reactive approach to timing constraints, it makes a great deal of sense to put some effort in up-front to make sure that they are good. Many of our customers have noted that they simply can’t deal with the number of iterations it takes to refine timing constraints during the implementation phase of their projects, so they’re working on finalizing them up front as part of their RTL handoff. The trick for project managers or CAD people will be to introduce a methodology that their front end teams (who aren’t necessarily timing constraint experts) can easily adopt, and this is where comprehensive automated solutions such as SpyGlass-Constraints come into play.
Liz: So why isn’t this happening? Seems to me that an ounce of prevention is worth a pound of cure, as they say.
Ron: Let’s look at the two camps. First of all, you have the RTL or front-end team, who historically don’t want to take ownership of any part of the implementation process (even though they know the design well enough to define its constraints). On the other side of that handoff ‘wall’ you have the back-end team, who feel that their expertise in this area, coupled with whatever the implementation and timing tools complain about, is enough of a solution. So depending on which side of that wall you sit on, you may feel that it’s either not your problem… or not a problem at all.
Liz: But we know there IS a problem, and it’ll only increase. So where in the design flow should project managers look first for a fix?
Ron: There is often a perception that timing constraints can’t be fully defined until you are actually using them – until you are in the thick of implementation or timing analysis. The problem with this is that your constraints end up being written so that you can close timing, instead of being defined to set the ground rules for timing closure. A classic example of this is the definition of timing exceptions – they’re often defined to mask timing violations, but in most cases they’re not exhaustively verified. A timing exception is a design characteristic, so it can be defined and proven up front, before the implementation process even starts. Doing it the other way around is like an architect finalizing the plans after the building is complete. If your objectives aren’t clear, how do you know when you are done?
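Ron’s point about timing exceptions can be illustrated with a minimal SDC sketch (the clock names and periods here are invented for illustration, not taken from any real design). Because two domains are asynchronous by construction, the false paths between them are a property of the design, and can be written down and reviewed at RTL handoff, long before implementation:

```tcl
# Two asynchronous clocks, defined up front as part of the RTL handoff
create_clock -name clk_core -period 5.0 [get_ports clk_core]
create_clock -name clk_mem  -period 8.0 [get_ports clk_mem]

# The exception is a design characteristic: the domains are asynchronous
# by construction, so paths between them are declared false paths here,
# rather than being added later to mask violations during timing closure
set_false_path -from [get_clocks clk_core] -to [get_clocks clk_mem]
set_false_path -from [get_clocks clk_mem]  -to [get_clocks clk_core]
```

Written this way, the exceptions set the ground rules for timing closure instead of being reverse-engineered from timing reports.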
Liz: I see what you are saying—it’s like putting the cart before the horse. Stop me, Ron, if I use another one of these old sayings. I’m dating myself. So who’s out there with technology that can help change the methodology and fix the timing disaster that’s looming?
Ron: It’s been possible to do some rudimentary timing constraint analysis in a range of implementation and STA tools since the advent of timing-driven optimization. The problem with this approach, however, is that it’s largely a reactive one, and as a result doesn’t help reduce the risks in your implementation process. More recently, vendors (often ones outside the implementation/STA space) have started to provide solutions that allow the user to check the correctness of their constraints before implementation. What we’ve done with SpyGlass-Constraints is to take it one step further and look at how timing constraint analysis fits into the bigger picture of reducing implementation risk. A great example of this is how we use our constraint verification methodology to ensure that data such as clock setup is in good shape before you use it to drive clock domain crossing (CDC) analysis. Again, it’s all about finding the issues up front and reducing risk later.
Liz: Well it’s intriguing…a Titanic-like iceberg of a design problem out there and we’re forging ahead…like the Titanic?
Ron: (laughs) Indeed – though given that the Titanic was built in my home city I always feel the need to point out that this particular disaster came about as a result of pilot error! To take your analogy further, I guess that the ‘iceberg’ here is a failure to close timing. Better guidance will definitely help you avoid that one.
Liz: Who knew? (laughs) Well, where can we learn more about this problem and how to fix it? Oh…and your customer survey…can we get a look at that? Sounds like some compelling information in there.
Ron: Yes, the customer survey was VERY telling, and it gives us a good leg up on what designers will need to close at RTL for the next several generations of designs. Because it’s based on conversations with customers, we can’t release it in its current form.
However, Bernard Murphy WILL refer to it at length in his DesignCon panel.
Liz: What panel is that?
Ron: At DesignCon we will be holding a panel on: “The Same Chip Killers keep Delaying your Schedules – What are you doing about it?” moderated by Ed Sperling, editor of System-Level Design. The panelists will discuss a broad range of issues, including timing constraints, the impact of IP etc. that repeatedly cause schedule slips. It will take place on Monday, January 31 at 4:45 p.m.
Liz: Sounds like a crucial discussion. I’ll be sure to attend!
Thursday, September 16th, 2010
A couple of weeks ago, a client asked, in essence, “why comment on articles or blogs?”
OK, so he didn’t say it exactly like that. But he did say that he’s
…struggling to figure out what really makes sense regarding the growing amount of posting by anybody and everybody….Is all this writing and blogging serving a real purpose? I’m not sure. Some blogs get recognition and response….I think most don’t.
He’s got a point. I think bloggers (indie, company and editorial) all feel, in our gut, that there’s value. But how do we measure that value? What do comments add to a blog or article? Tough one.
So I asked some of the bloggers what they thought. First off, I went to one of the longest-running bloggers in EDA – Karen Bartleson. (Is it really three years, Karen? She’s at http://www.synopsys.com/blogs/thestandardsgame). She shed really insightful light on why EDA blogs get so few comments compared to consumer blogs like Yelp. And she has a post up on what she’s seen in the three years since she started her blog. So do take a look at Karen’s analysis of EDA blogging. I bet she’s got a take on the state of EDA blog comments.
Karen’s, along with a bunch of other bloggers’ comments on EDA blog comments gave me some trends to ponder. Some recurring points:
__the honeymoon infatuation period for EDA blogging has come…and is going. Now there needs to be some sense of long-term value.
My take…just what is “value” in terms of EDA blogs? It’s different from the perspectives of the client, the journalist and the PR person.
__some indie bloggers say they see their blogs as diaries, written for themselves and interested people.
My take…everyone is aware of a larger cast of potential viewers, however. (By and large, they value comments but don’t use them as a metric of their blog’s value.)
__there are more eyeballs on the blogs than we can ascertain.
My take… however, these numbers are impossible to get for viewers and bloggers hosted by other sites. There’s no SRDS* in the EDA & IP social media world.
*SRDS was (is?) an organization that certified reader numbers for print publications so that they could charge advertising rates based on readership.
__engineers by and large are pretty quiet, shy types who rarely will comment or extend a discussion, even if they do read the blog, article and their accompanying comments.
My take…this came up a lot. I’m not sure…would their shyness prevent them from commenting? Probably. Would the relatively anonymous filter of the comment field encourage them to speak out? Potentially.
__by and large, the number of comments isn’t an accurate measure of eyeballs.
My take…lots of agreement that some sort of metric on value is reasonable, understandable. Less agreement on whether it’s needed now.
(One person compared the dilemma to the old attempt to measure column inches to value, which measures volume but doesn’t take into account perceptual, qualitative value.)
__commenting is a lot like getting a quote into an editorially-written article, insofar as it creates an authoritative voice that gets recognized, over time, as an industry voice to listen to…or not, depending on the content of the comment.
My take…one especially insightful editorial blogger felt that comments are a dynamic part of a living, breathing article that encompasses new perspectives with new comments and discussion.
One difference that I see is that the editor or author of the article hasn’t vetted the comment or incorporated it into his or her article. The comment is a response to the vetted article, which is the insightful editorial blogger’s point, I now see.
__the blog (and blogger) or article (and author) and its comments, to some degree, form a community unto themselves.
My take…this discussion got a bit abstract for me but I hear the notion. Help!
__this is a good time to talk about the expectations of each community (indie bloggers, editorial bloggers, company bloggers) and how to sync up each community so that there is value for everyone.
My take…but it’ll require the different goals and expectations of each community to somehow sync up so that each community’s efforts bring value to one another. How does that sync up with goals and expectations of customers, clients?
Of course, there’s no answer (yet) to the question about value here. The bloggers (indie, company and editorial) feel that there is value in commenting. Many of them agree that no one can measure value right now but that there ought to be some way to do so. Most everyone thinks that there is an existing, intangible value of being a voice of authority, an industry citizen.
And everyone thought we ought to keep talking about this issue.
– end –
Monday, July 12th, 2010
The pre-DAC acquisitions of Denali and Virage drastically realign the core of the EDA industry. When IP first came on the scene here in the US (I think 3Soft was the first IP company I saw), many people figured that IP would become another form of delivery for chip designs – and that it would come from the semiconductor companies.
The EDA executives’ explicit remarks about how IP is key to their continued growth could turn EDA into an industry of IP haves and IP have-nots.
How does this EDA realignment affect customers? We asked Atrenta vice president of marketing and industry voice Mike Gianfagna, “What does the EDA industry realignment mean for customers?”
Here’s what he said:
Realignment can mean two things that are related, but a bit different.
One form of realignment we’re seeing is the IP market merging into the EDA market. This is definitely good for IP customers. Effective IP reuse requires a blend of quality, highly validated IP and a good reuse methodology. The methodology need is for both authoring IP to be reusable and implementing the reuse itself. EDA is a good place to bring all this together. Most larger EDA companies understand what it takes to deliver high quality, validated designs. They also understand what a reuse methodology should include. A lot of the smaller IP shops don’t have this perspective.
Another realignment is the “annexation” of embedded software into EDA. Synopsys is validating this trend with their buying spree, and Cadence is validating the trend with their EDA360 proposal and some buying, too. This is also good for the customer. If software development teams can help to drive the silicon creation process, we are going to see some new killer apps emerge as a result.
What do you think about the combination of IP and EDA? Let us know in the “comments” section.
– end –
Saturday, June 12th, 2010
We asked three EDA figures to comment on how the Synopsys purchase of Virage would impact the EDA and IP industries. Here’s what they said.
This acquisition puts Synopsys squarely at the front of the pack as far as IP suppliers go. This trend could be quite significant. Successful IP reuse is a combination of the right EDA tools, best-practices methodology and well-designed IP. The EDA vendor is a pretty good place for all that to come together. ARM remains the exception to this rule, and several other rules for that matter.
Vice President, Marketing
I don’t see how this doesn’t make Synopsys a competitor with ARM on physical IP and the ARC processor. ARM should start feeling like it is getting surrounded by Synopsys.
With EDA trying to expand its scope and grow beyond its traditional boundaries (see EDA360), and with small and medium-size IP vendors struggling to grow, basic economic forces are pushing this trend.
Synopsys has already been a formidable IP player, and Cadence has now entered the space with its recent acquisition of Denali.
There are still plenty of smaller IP players so we’ll see further consolidation playing out. The IP segment has been trying to define and position itself between EDA and semiconductors. We all wondered if IP would become an intrinsic part of the semiconductor industry, the EDA industry, or stand on its own. These days we clearly see that the IP pendulum has shifted toward EDA.
The outlier is of course ARM which is a different beast, in some ways closer to semiconductors: i.e., look at how ARM competes with Intel. With a market cap equivalent to Synopsys and Cadence put together, ARM is simply too big for that.
– end –
Monday, June 7th, 2010
Mike Gianfagna, a well-known and long-time EDA executive, has quite a bit to say about the EDA360 manifesto that’s electrified the EDA world. As vice president of marketing at Atrenta, Inc., Mike has been an astute, articulate participant in the EDA value discussion. I was able to grab a few minutes with Mike to ask how EDA360 helped define EDA value for 2010 and beyond, and how it might alter the industry’s direction.
ED: EDA360 has caused quite a buzz. Why?
MIKE: Simply put, it’s one of the first times a major EDA vendor has focused on growing the industry and not just winning the next deal.
ED: It’s curious that EDA people have embraced it so vigorously. After all, it’s not a “how to” but more of a “here’s the vision, the dream.” What’s the impact of EDA360 on the EDA industry? The EDA user community? The EDA media?
MIKE: Let’s face it, the EDA industry has been stuck at roughly the same size for a long time. This lack of growth, in my opinion, has a lot to do with the predatory practices most suppliers pursue. That is, “I win the current budget and you lose.” Growing the business takes a broader view, and a good dose of vision to see beyond today’s budget and determine how EDA can serve new customers tomorrow. EDA360 articulates such a vision.
I’d like to think all this will have a positive impact on our industry overall. As for the EDA media, I am honestly not sure who that is anymore, so it’s hard to comment.
ED: This is a Cadence-generated document. How effective can it be if there’s a significant “other” camp?
MIKE: This point is what I find most interesting (and refreshing) about the concepts of EDA360. It’s not a Cadence document per se. It’s a blueprint of where EDA can go to find new customers and add new value. The piece articulates this in terms of current industry trends. It aims to exploit adjacencies in order to grow the market. And it clearly states that everybody needs to start thinking differently if it’s going to work.
ED: Rightfully, some people could view EDA360 as a Cadence effort to regain some of its industry momentum and influence that it has NOT had for years. Why should the rest of EDA buy into a company initiative?
MIKE: As I mentioned, I don’t see this as a company initiative. I see it as a call to action for our industry. We can all keep chasing the same budget, or find new customers and new budgets. A “dog food dish” image is spinning around in my head right now, but I’ll leave that discussion to the class historians among us.
ED: How will EDA360 affect the big 6: Atrenta, Cadence, Magma, Mentor, SpringSoft and Synopsys?
MIKE: Wow, thanks for the flattering reference. It’s not every day that Atrenta gets mentioned in the same sentence with Cadence, Synopsys and Mentor. The reference is correct, however. Atrenta is now at a size, and a popularity level that gives us the opportunity to make a real difference, if you believe the DeepChip readership.
How can we make a difference? First of all, a consistent focus on serving the new and emerging user base referenced in the EDA360 vision will help. That is, the software development community that requires advanced silicon to get its job done. The changes implied by EDA360 will take time – all design paradigm shifts do, and they usually take longer than you’d like.
If a group of forward-looking companies can work together toward the vision, the time required to get there can be reduced. And that spells opportunity for everyone.
ED: How will EDA360 affect the medium sized EDA companies?
MIKE: I think the effect here will be similar, except many mid-size EDA companies may necessarily be slower to respond. Pursuing new markets and new customers takes discretionary resources, and many mid-size companies don’t have a lot of that.
ED: How will EDA360 affect the slew of small and startup EDA companies?
MIKE: For the current crop of startups, I don’t believe the effects will be that noticeable. Some will figure out how to re-invent themselves in new, emerging markets but most will continue on the path they are currently on.
The interesting part for venture-funded startups is what happens next. Will the venture community start writing checks for new business models that address the application software developer’s needs? If this happens, we’ll have another proof point that EDA360 is more than a nicely done White Paper.
– end –