Posts Tagged ‘EDA’
Tuesday, August 25th, 2015
For the most part, the terms “verification” and “validation” are used interchangeably in the electronics industry. However, there are many who argue that these are distinct activities in the development of SoCs and systems, performed at different times in the schedule and usually by different groups of engineers. We refer to ourselves as “The SoC Verification Company,” and this is a deliberate choice we made. So we thought that it would be useful to define the two terms as we see them and talk about the similarities and differences.
This post was inspired by an article from 2010 that our CFO and co-founder Maheen Hamid discovered recently. It opens with the “usual definitions” as follows:
- “Validation: Are we building the right system?”
- “Verification: Are we building the system right?”
This seems like a good place to start the discussion.
Thursday, August 20th, 2015
Last week we discussed some of the drivers in the electronics industry influencing the program for the upcoming DVCon India, September 10-11 in Bangalore. The Technical Program Committee has completed its arduous task of selecting among many worthy proposals for sessions and has posted a near-final program. Today we’d like to highlight some of the most interesting aspects of the packed two days, focusing on sessions that we believe will be a particular draw for those who follow Breker and SoC verification.
There are four conference-wide keynote speeches, from Atul Bhatia (formerly of nSys), Harry Foster of Mentor, Manoj Gandhi of Synopsys, and Vinay Shenoy of Infineon. They will set the tone for the event by discussing the high-level challenges in designing and verifying leading-edge semiconductor devices. Nick Heaton of Cadence will keynote the Design and Verification (DV) track, while Pankaj Singh of Infineon and Dr. Sacha Loitz of Continental will give invited talks in the Electronic System Level (ESL) track.
Wednesday, August 12th, 2015
Many of our readers may recall that Breker aggressively promoted the inaugural DVCon India last year. We supported the show itself by sponsoring a booth in the exhibition and delivering three conference talks. It turned out, much to our delight, that the hottest topic at the show was portable stimulus. There was a great deal of interest in the newly formed Accellera Portable Stimulus Working Group (PSWG) and how Breker’s products provided a well-tested solution meeting all of the PSWG’s requirements.
The second DVCon India is less than a month away, on September 10-11 at Leela Palace in Bangalore. I have every expectation that portable stimulus will be a major theme again. We’re also very busy promoting the event to ensure its success, especially since I am co-chair of the Promotions Committee. I will be covering the details of the sessions and our own participation in next week’s blog post. For today, I’d like to focus on some of the industry drivers that are influencing the interest of potential attendees and the selection of content for the technical program.
Wednesday, August 5th, 2015
In last week’s post, we spent quite a bit of time talking about the concept of a (realistic) use case that reflected how actual users will eventually manipulate the design being verified. Our focus was on Breker’s graph-based scenario models and how they can easily and concisely capture such use cases. We did some research on the term “use case” and found that it seems to be more common in software design and verification than in hardware verification. That caused us to think about how we at Breker seem to be living on the hardware-software frontier.
It’s not uncommon for hardware designers and software engineers to borrow ideas from each other. Code coverage, for example, was well established in software before it was adopted for hardware design and verification languages. With the move from gates to RTL, hardware became just another form of code and therefore more amenable to software techniques. This is just one example showing that the boundary between hardware and software is fuzzy and changing over time.
Thursday, July 30th, 2015
One of the signs that a technological domain is still fairly young is frequently evolving terminology as the pioneers attempt to explain to the mainstream what problem needs to be solved and what solutions can be brought to bear on the problem. Such is the case with SoC verification. At Breker we used to start explaining what we do by talking about graphs, but shifted to “graph-based scenario models” to emphasize that graphs are perfect for expressing scenarios of real-world behavior.
Our friends at Mentor, also strong advocates for graphs, began using the term “software-driven verification” to describe their approach. We rather like this description, but feel that it can only be applied accurately when embedded test code is being generated and when the embedded processors are in charge of the test case. Now our friends at Cadence have been sprinkling the term “use case” throughout their discussions on SoC and system verification. Let’s try to sort out what all this means.
Thursday, July 23rd, 2015
The recent death of EDA analyst Gary Smith overshadowed another major transition in our industry: the retirement of longtime EDA journalist Richard Goering at the end of June. Both of these men contributed an extraordinary amount to EDA, and today I’d like to say a bit about Richard and his accomplishments. He is best remembered as the CAD/CAE/EDA editor for Electronic Engineering Times, for many years the newspaper of record for electronics.
It would be hard for today’s young engineers to imagine how influential EE Times was at its peak. It stood out on everyone’s desk with its distinctive tabloid format. Most buyers turned to its pages first. All vendors wanted to achieve editorial coverage for their companies and products, in addition to advertising there. The EE Times journalists and editors were some of the best and brightest. Landing an interview with one of them was a goal for every PR campaign. When it came to EDA, Richard Goering was the man.
Wednesday, July 15th, 2015
Recent announcements from IBM and others about supporting EDA tools in the cloud have spurred renewed discussion on this topic, including here at The Breker Trekker. As expected, the recent posts have been very popular with our readers. Those of you who have been following this topic for a while may recall that, almost exactly two years ago, EDA vendor OneSpin announced cloud support for their formal tools. We invited their VP of Marketing, Dave Kelf, to fill us in on their experiences since then:
Two years ago OneSpin introduced the cloud version of its Design Verification (DV) formal-based products. Some commentators pointed at other failed EDA attempts to make the same move, suggesting more of the same. Others hailed the announcement as a bold move whose time had come. So… did it work out, and what have we learned? The results are surprising, and suggest trends that make some EDA solutions a natural fit for the cloud, whereas others are questionable.
Tuesday, July 7th, 2015
This week began on a very bad note in the EDA world: news of the death of longtime industry analyst Gary Smith. In an industry that has been largely ignored by Wall Street and big market analysis firms in recent years, Gary played a critical role in continuing to carry the torch for EDA and providing both hard data and thoughtful commentary on business-related and technological topics. It is difficult to imagine our world without him.
Beyond his contributions to the industry, Gary was loved and admired by many of his fellow EDA and semiconductor professionals. I’m writing this post in the first person since the memories herein are mostly mine, but I know that I speak for my colleagues at Breker when I say that we always enjoyed meeting with Gary and that we will miss both his humor and his wisdom. We hope that we can all provide a measure of support to help his family get through this terrible time.
Tuesday, June 30th, 2015
Last week on The Breker Trekker, we discussed the resurgence of interest in EDA tools in the cloud. Like our first post on the topic two years ago, last week’s entry was very popular. Clearly this is a topic of interest to both our regular and occasional readers. Two more announcements regarding EDA in the cloud also surfaced during the recent Design Automation Conference (DAC), so it does seem as if there is more effort going toward finding a technically and financially successful industry solution.
Last week we summarized five barriers that have helped prevent cloud-based EDA from achieving mainstream adoption:
- The EDA vendor’s effort to port to a cloud-based platform
- Worries about GUI and interactive responsiveness
- Ability to support users of cloud-based tools
- Lack of an established, proven business model
- Concerns over security of the design and verification data in the cloud
Wednesday, June 24th, 2015
It has been almost exactly two years since we discussed the possibility of EDA tools in the cloud here on The Breker Trekker. The post was popular then, and it remains so. In fact, of the more than 100 posts we’ve published, our cloud post remains the second most read. This week, the recent news that IBM will make its EDA tools available in the cloud through a partnership with SiCAD brought cloud computing back to the forefront. Let’s discuss what has changed, and what hasn’t, in the past two years.
The idea of users being able to run EDA tools as leased enterprise software on remote machines has been around for years, well before the term “the cloud” was widely used. Synopsys invested a great deal of time and effort into its DesignSphere infrastructure, initially more of a grid application than a cloud solution as we use the term today. But the difference is not very important; the key concepts are the same and they represent a major departure from the time-tested model of customers “owning” EDA tools and running them in-house.