February 20th, 2014
At last year's Design and Verification Conference (DVCon) in San Jose, Real Intent sponsored a panel on “Where Does Design End and Verification Begin?” In this Part 2, we continue with the moderator's questions and the panelists' answers.
The panel was moderated by Brian Hunter of Cavium, Inc. The panelists were:
Pranav Ashar – Real Intent, Inc.
John Goodenough – ARM, Inc.
Harry Foster – Mentor Graphics Corp.
Oren Katzir – Intel Corp.
Gary Smith – Gary Smith EDA
Below are links into the video recording where each question is asked, along with the immediate replies and comments from the panelists. The end-user insights by Goodenough from ARM and Katzir from Intel are particularly interesting.
These are good days for UK-based virtual prototyping vendor Imperas. The company will be making appearances this coming week at Embedded World in Nuremberg, at DVCon in San Jose the following week, and at CDNLive in Santa Clara the week after that, as well as at several events in the UK in the same time frame. Imperas has a lot to talk about, including an announcement involving MIPS, a division of Imagination Technologies.
Per CEO Simon Davidmann in a recent phone call: “We’re small, self-funded and growing, with revenues last year up 65 percent. [Even better], the type of customers we’re seeing are tier-one semiconductor and embedded systems companies. We want to help people build better software. No one builds a chip without simulation, and we believe software development should be done like that as well.”
I asked about the competition. Simon answered, “It’s true, other people have models in the same space as ours – companies like Synopsys, Cadence and ARM – but we tend to cooperate with them. Our real competition is legacy breadboards, and kick-it-and-see techniques, rather than proper methodologies.
“For most complex SoCs, many people try to develop software with simulation at the RTL level, or with a hardware-accelerator box, but those approaches don’t get the throughput of software and performance they need. And with a prototype, they don’t get the controllability and observability. That’s why most of our competition is the legacy mindset in the customers.”
DO-254 defines three verification methods: Analysis, Test, and Review. To satisfy the verification objectives defined in DO-254, applicants must formulate a requirements-based verification plan that employs a combination of the three methods.
Analysis vs. Test
A computerized simulation of the hardware item is considered an Analysis. Test is a method that confirms the actual hardware item correctly responds to a series of stimuli. Any inability to verify specific requirements by Test on the device itself must be justified and alternative means of verification must be provided. In DO-254, the hardware test is far more important than the simulation. Certification authorities favor verification by test for official verification credits because of the simple fact that hardware flies, not simulation models. Requirements describing pin-level behavior of the device must be verified by hardware test.
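The rule above can be illustrated with a toy bookkeeping sketch. This is not part of DO-254 itself; the requirement records, field names, and helper function below are all hypothetical, and serve only to show the logic that a requirements-based verification plan must enforce: pin-level requirements need hardware Test, and any requirement lacking Test credit needs a documented justification.

```python
# Toy sketch of requirements-based verification bookkeeping in the
# spirit of DO-254. All names and data below are hypothetical.

PIN_LEVEL = "pin-level"

def unjustified_requirements(requirements):
    """Return IDs of requirements that claim verification credit
    without hardware Test and lack an acceptable rationale."""
    flagged = []
    for req in requirements:
        methods = set(req["methods"])      # e.g. {"Analysis", "Test", "Review"}
        if "Test" in methods:
            continue                       # verified on the actual hardware item
        # Pin-level behavior must be verified by Test; other requirements
        # verified only by Analysis/Review need a written justification.
        if req["kind"] == PIN_LEVEL or not req.get("justification"):
            flagged.append(req["id"])
    return flagged

reqs = [
    {"id": "REQ-001", "kind": "pin-level", "methods": ["Test"]},
    {"id": "REQ-002", "kind": "internal", "methods": ["Analysis"],
     "justification": "node not observable at device pins"},
    {"id": "REQ-003", "kind": "pin-level", "methods": ["Analysis"]},
]

print(unjustified_requirements(reqs))  # ['REQ-003']
```

REQ-003 is flagged because pin-level behavior verified only by simulation (Analysis) would not earn certification credit; REQ-002 passes because its alternative means of verification is justified.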
A young entrepreneur’s take on EDA
February 20, 2014 by Matthieu Wipliez
You’ve all read many times the opinions of EDA veterans concerning this industry, so for a change I’ll humor you with my opinion as a young entrepreneur and EDA company founder. This post originally appeared on my company’s official blog (see original post). What prompted me to write this piece is Gabe’s article on Starting A New EDA Company.
In his post, Gabe is hoping for “disruption” and “a new business model”, yet he notes the “total lack of new ideas from younger people”. Hmm. Well, I'm young, and I certainly think that younger people have lots of ideas and are actually having huge success; it's just happening in other industries, such as software. I'm talking about Facebook, Twitter, Instagram, Snapchat, and the like. Now why is that? Why are young people succeeding in software but not so much in semiconductors? Could it be that there is something specific about the semiconductor industry? After all, until its recent acquisition by Cadence, Forte was still called a start-up, after 16 years (it was founded in 1998). And VCs seem to agree that the money is elsewhere. So what is it?
It turns out that to create a “new successful EDA company”, you should “understand thoroughly the application industry [your company] serves”. Ok, but how is it possible for young people to do that exactly? The semiconductor industry today is mainly about designing SoCs, and that requires many different skills and companies and people working together. It takes years to become proficient in designing quality hardware with RTL, and this is only the first step to making a chip! Then you need to learn about verification (and SystemC, and SystemVerilog, and UVM, and equivalence checking, and I don’t know what else, after all I’m not a verification engineer!), and back-end, and DFT, etc. How are you supposed to thoroughly understand all this without 10 or 15 years of experience?
This kind of attitude is part of the problem. Let's take another example. Most EDA software uses the same license manager, the well-known FLEXlm. That software is 26 years old. Surely by now you'd imagine we would have a better solution? Well, there are alternatives. So why does EDA keep using this one? Is it because this industry is a conservative triumvirate? Is it because these three companies are just too big? But being a behemoth has never prevented innovation! Agreed, it does make innovation more difficult, because of the innovator's dilemma, but many bigger companies still manage to innovate a lot. Google's revenue for 2013 was about $60 billion; that's roughly 30, 40, and 60 times the revenue of Synopsys, Cadence, and Mentor Graphics, respectively. If being big does not prevent innovation, what else could?
I think this is a cultural problem. We have a chicken-and-egg problem: users have become afraid of change (including new EDA software) because change has all too often caused problems, and companies do not change things because they fear it will cause problems or make users angry. And in the end, users are the ones who give you money, so you try to listen to them. That's actually fine, as long as you keep in mind that only a small percentage of users are innovators and early adopters, and these are the ones willing to change first; if you convince them, you have a much better chance of convincing the others (more or less easily; see Crossing the Chasm, and the post I wrote about this, Are you pre-chasm?). This is a distinctive trait of the semiconductor industry, in my opinion: we seem to hear the late majority (to quote the original research, “older and fairly conservative”) voice its opinion much more than one would otherwise expect.
Despite all that, though, I love writing EDA software for all hardware designers who are open to the possibility of improving their design flow. It makes me pretty happy when I meet or talk with them. And of course I love designing hardware with the Cx language that we created!
Bob Smith, Senior VP Marketing & Business Development at Uniquify, shared with us his predictions for semiconductor IP in 2014.
“If 2014 has a watchword for the semiconductor industry, it is momentum, a result of the rapidly increasing use of IP in SoC designs. Add to that the mushrooming need for ‘adaptive’ IP to mitigate timing and variation challenges in complex SoCs as performance issues multiply and process geometries shrink.
Moves within the DDR memory space continue to rock the industry and create momentum. Designers are heading directly to the latest JEDEC standard, LPDDR4 (low-power DDR4), and moving beyond (or even skipping) LPDDR3 because they get greater gains in performance and lower power, an important consideration for mobile applications.
Making Verification Debug Less Painful
February 18, 2014 by Tom Anderson, VP of Marketing
In our last post, we discussed the results of a survey by Wilson Research Group and Mentor Graphics. Among other interesting statistics, we learned that verification engineers spend 36% of their time on debug. This seems consistent with both previous surveys and general industry wisdom. As SoC designs get larger and more complex, the verification effort grows much faster than the design effort. The term “verification gap” seems to be on the lips of just about every industry observer and analyst.
We noted that debug can be separated into three categories: hardware, software, and infrastructure. Hardware debug involves tracking down an error in the design, usually in the RTL code. Software debug is needed when a coding mistake in production software prevents proper function. Verification infrastructure (testbenches and models of all kinds) may also contain bugs that need to be diagnosed and fixed. As promised, this post discusses some of the ways that Breker can help in all three areas.
Recently, Gabe Moretti, contributing editor to Chip Design, wrote a lengthy article for Systems Design Engineering addressing an important topic, “Verification Management.” It included comments from Atrenta, Breker Verification Systems, Jasper Design Automation, Mentor Graphics, OneSpin Solutions, Oski Technology and Sonics on a series of questions from Gabe on how to manage today’s complex and time-consuming verification process.
Introducing the iBrush
February 18, 2014 by Colin Walls
Today, for a change, instead of discussing some embedded software technology, I would like to put forward a concept for a product. It is an embedded system, which I believe could sell in high volumes. Maybe someone reading this blog would like to develop it. I am happy to waive any rights to royalties on the idea so long as we have an understanding that you will use Mentor Graphics products in your design.
This product is the iBrush …
Next up in our series of predictions is the astute insight of Mike Demler, Senior Analyst with The Linley Group & MICROPROCESSOR report, and former EDA & Chip Design news analyst.
“It’s all about the ecosystem triad: EDA + foundry + IP. Cadence and Synopsys continue to evolve more in the IP direction, and there is really not much to say about the tools that hasn’t been said for a long time —just make it all work together! Redundant “standards” and artificial barriers to interoperability cost the semiconductor industry by lowering productivity. This is the problem with the disaggregated model. Back in the days when “real men” had fabs, companies could develop complete design flows without such obstacles.
The triad needs to work together to get over the stall in Moore’s Law at 28nm. Foundries are incurring delays in getting to 16/14nm FinFETs, and almost nobody is going to use 20nm. The chip industry needs an overall lower-cost solution in order to make sub-28nm processes economically viable. Forget 3D ICs; those will be niche products for a long time, about as popular as 3D TV.
CafeNews is a service for EDA professionals.
Copyright © 2016, Internet Business Systems, Inc. — 595 Millich Dr., Suite 216 Campbell, CA 95008 — +1 (408)-337-6870 — All rights reserved.