Posts Tagged ‘EDA’
Tuesday, March 25th, 2014
As we mentioned in our last few posts regarding the DVCon and SNUG Silicon Valley events, Breker exhibited at both shows with an identical demonstration. We showed our latest product, TrekSoC-Si, generating a test case, downloading it into a commercial SoC (a TI OMAP4430 with dual ARM cores), and running it in the actual chip. This demonstrated our ability to support all verification platforms, from ESL and RTL simulation through acceleration, emulation, FPGA prototyping, and silicon.
This demo attracted quite a bit of interest and some good questions at both shows, so we thought we’d devote this blog post to filling in a few of the details. We especially want to stress that we provide exactly the same level of visualization for a multi-threaded, multi-processor test case running deep inside an actual chip as we do when it’s running in simulation or simulation acceleration.
Monday, March 10th, 2014
In our last two posts, we talked about the 2014 edition of the Design & Verification Conference & Exhibition, DVCon, in San Jose. Now that the show is history, lots of bloggers are summarizing their experience. Since I thought that this was an excellent event all around, allow me to join the chorus of voices praising DVCon 2014.
Here at Breker, our biggest effort goes toward the exhibition. Although both our booth and the exhibit floor are relatively small, we do want to put our best foot forward. So we had all-new signage this year updating attendees on our products and their capabilities. We also showed a very different demo from last year, with our TrekSoC-Si product generating a test case, downloading it into a commercial SoC (a TI OMAP4430), and running it in the actual chip. We chose to repeat our very popular giveaway from DAC: a combined flashlight and distress whistle that will come in handy if you perform inadequate SoC verification and hit an iceberg.
Tuesday, February 25th, 2014
Next week (March 3-6) marks the return of the most important annual event for verification engineers: the Design & Verification Conference & Exhibition 2014, better known as DVCon. Its home remains the DoubleTree hotel in San Jose, a Silicon Valley landmark and site of many interesting conferences going back to its original days as the Red Lion Inn. Breker will be there in force, so we’d like to tell you about our activities as well as preview the technical program.
Of course, Breker will be participating in the exhibition portion of the show, which has expanded from previous years. The exhibit floor will be open on Tuesday (March 4) and Wednesday (March 5) from 2:30pm to 6:00pm as usual, and a special preview on Monday from 5:00pm to 7:00pm has been added this year. You’ll have plenty of time to stop by to visit Breker in booth number 902 and (if you must) perhaps some other vendors as well.
Tuesday, February 18th, 2014
In our last post, we discussed the results of a survey by Wilson Research Group and Mentor Graphics. Among other interesting statistics, we learned that verification engineers spend 36% of their time on debug. This seems consistent with both previous surveys and general industry wisdom. As SoC designs get larger and more complex, the verification effort grows much faster than the design effort. The term “verification gap” seems to be on the lips of just about every industry observer and analyst.
We noted that debug can be separated into three categories: hardware, software, and infrastructure. Hardware debug involves tracking down an error in the design, usually in the RTL code. Software debug is needed when a coding mistake in production software prevents proper operation. Verification infrastructure (testbenches and models of all kinds) may also contain bugs that need to be diagnosed and fixed. As promised, this post discusses some of the ways that Breker can help in all three areas.
Tuesday, February 11th, 2014
For today’s blog post, we use as our text a recent article on SemiWiki by well-known verification expert Hemendra Talesara. He provides a nice summary of a recent talk given in Austin by another verification expert, Harry Foster from Mentor. Many of you have probably seen Harry’s blog posts dissecting in great detail the results of the biennial survey that Mentor commissions from Wilson Research Group. There is much less coverage and analysis of the EDA world available today than there used to be, so we all applaud Mentor’s willingness to fund this survey and share the results.
Hemendra’s focus is on the well-known phenomenon of verification consuming more and more of a chip project’s resources. It is not uncommon for SoC projects to have two or three verification engineers for every design engineer. So what do these verification engineers do with all their time and resources? The interesting result from the Mentor survey is that verification engineers spend 36% of their time on debug. At Breker, we’ve given a lot of thought to how to reduce debug time and effort, so we’d like to share some thoughts.
Tuesday, February 4th, 2014
Our last post on the relationship between the Universal Verification Methodology (UVM) and Breker’s technology was very popular. In only a week, it has become the fifth-most-read post in the nine-month history of The Breker Trekker blog. Clearly people are interested in the UVM and what strengths and weaknesses it brings to the ever more complex world of SoC verification.
This week we’d like to continue the discussion with a topic that we did not address last week: how the UVM offers an alternative to running embedded code by replacing one or more of the processors in the SoC with a verification component (VC). Our CEO, Adnan Hamid, addressed this topic in an Electronic Design article last November. We’d like to revisit some of the key points of that article in the context of last week’s UVM discussion.
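To make the idea concrete, here is a minimal sketch of what such a processor-replacement VC might look like. This is not Breker’s code or the article’s example; the transaction and sequence names (bus_xact, cpu_stub_seq) and the DMA register map are invented for illustration. The point is that the register writes firmware would have performed from the CPU are instead issued by a UVM sequence through a bus VC sitting in the processor’s place:

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

// One bus transaction standing in for a single processor memory access.
class bus_xact extends uvm_sequence_item;
  rand bit [31:0] addr;
  rand bit [31:0] data;
  rand bit        is_write;

  `uvm_object_utils_begin(bus_xact)
    `uvm_field_int(addr,     UVM_ALL_ON)
    `uvm_field_int(data,     UVM_ALL_ON)
    `uvm_field_int(is_write, UVM_ALL_ON)
  `uvm_object_utils_end

  function new(string name = "bus_xact");
    super.new(name);
  endfunction
endclass

// A sequence playing the role of embedded code: instead of a program
// running on the (removed) CPU, the testbench drives the same traffic.
class cpu_stub_seq extends uvm_sequence #(bus_xact);
  `uvm_object_utils(cpu_stub_seq)

  function new(string name = "cpu_stub_seq");
    super.new(name);
  endfunction

  task body();
    // Hypothetical DMA block programming: source, destination, length, go.
    bit [31:0] regs[4] = '{'h1000, 'h1004, 'h1008, 'h100C};
    bit [31:0] vals[4] = '{'h8000_0000, 'h8001_0000, 'h0000_0100, 'h1};
    bus_xact tx;
    foreach (regs[i]) begin
      tx = bus_xact::type_id::create("tx");
      start_item(tx);
      tx.addr     = regs[i];
      tx.data     = vals[i];
      tx.is_write = 1;
      finish_item(tx);
    end
  endtask
endclass
```

A driver connected to the bus interface would translate each item into pin activity on the fabric port the processor once occupied; the stimulus moves into the testbench while the rest of the SoC is exercised unchanged.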
Tuesday, January 28th, 2014
When people first start reading about Breker and what we do, we make the point that transactional simulation testbenches are breaking down at the full-SoC level. Usually, we specifically mention the Universal Verification Methodology (UVM) standard from Accellera as not being up to the challenge of full-chip verification for SoC designs. We sometimes worry that someone will read into this that we don’t like the UVM, or Accellera, or even standards in general. Nothing could be further from the truth!
We have great respect for the UVM and other EDA-related standards developed by Accellera, IEEE, and other organizations. In this post, we’d like to discuss specifically what we see as the strengths and weaknesses of the UVM and explain how Breker’s technology complements rather than replaces this methodology. Yes, the UVM has limitations, and we address those with our tools and technologies. But the UVM forms a stable and standard base on which nearly all of our customers build their simulation-based verification environments.
Tuesday, January 21st, 2014
Recently on this blog, a series of related posts from Breker, Jasper, and OneSpin discussed formal analysis and its potential for playing a greater role in the verification process. We think that it’s important for The Breker Trekker to address topics in verification beyond our own technology and to provide occasional commentary on technology and the world of EDA in general. However, this recent focus on formal has caused some readers to wonder whether we consider ourselves to be in the formal market.
The short answer is “no,” but there is some overlap between the technologies that we use and the techniques employed for formal analysis. Regular readers know that the foundation for our products is a graph-based scenario model that captures both the intended behavior of your SoC design and your system-level test plan. We can automatically extract system coverage from this model, with the model and coverage interacting in interesting ways. Let’s consider to what extent this is formal technology.
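As a rough illustration of the idea (a deliberately tiny toy, not our actual scenario-model format; the module name, steps, and graph are invented), a scenario model can be pictured as a graph of legal steps. Each random walk through the graph yields one test case, and system coverage falls directly out of the edges the walks have exercised:

```systemverilog
module scenario_graph_demo;
  // Hypothetical scenario steps for an imagined SoC; a real model is far richer.
  typedef enum { START, DMA_LOAD, CPU_DECODE, GPU_RENDER, DISPLAY, DONE } step_e;

  step_e next_steps[step_e][$];     // graph edges: legal next steps per step
  bit    edge_hit[step_e][step_e];  // coverage: which edges walks have exercised
  int    covered;

  initial begin
    next_steps[START]      = '{DMA_LOAD};
    next_steps[DMA_LOAD]   = '{CPU_DECODE, GPU_RENDER};
    next_steps[CPU_DECODE] = '{GPU_RENDER, DISPLAY};
    next_steps[GPU_RENDER] = '{DISPLAY};
    next_steps[DISPLAY]    = '{DONE};

    // Each iteration generates one test case: a random path START -> DONE.
    repeat (4) begin
      step_e s = START;
      while (s != DONE) begin
        step_e n = next_steps[s][$urandom_range(next_steps[s].size() - 1)];
        edge_hit[s][n] = 1;
        $display("  %s -> %s", s.name(), n.name());
        s = n;
      end
      $display("--- end of test case ---");
    end

    // Coverage read straight off the same model: edges visited vs. total (7).
    foreach (edge_hit[a, b]) covered++;
    $display("edges covered: %0d of 7", covered);
  end
endmodule
```

Because the tests and the coverage metric come from the same graph, the model can be analyzed statically (which paths exist, which remain unhit) in a way that resembles formal-style reachability reasoning, even though no property prover is involved.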
Tuesday, January 14th, 2014
Both our original post challenging Jasper Design Automation’s statement that “formal will dominate verification” and Jasper’s response have generated excellent readership. Another major player in the formal world, OneSpin Solutions, also has some strong opinions to share. Please join us in welcoming OneSpin’s Director of Marketing Dave Kelf with his guest post:
I would like to thank Breker for driving this debate on the future importance of formal verification. In my opinion, not only will formal dominate verification, but the effect of this technology will be as transformational as the advent of logic synthesis.
Tuesday, January 7th, 2014
This week’s blog post is inspired by Brian Bailey’s recent article “Making Modeling Less Unpleasant.” I noted with amusement that the link to his article ends with “making-modeling-pleasant” which I suspect was automatically generated from an early draft. So perhaps Brian started with the idea that modeling could be pleasant, but concluded that “less unpleasant” is as good as it can get? Is he too pessimistic? Can modeling actually be pleasant?
It depends in part on what aspect of design or verification modeling we consider. Brian’s primary focus is on system-level models of the design, also called electronic system-level (ESL) models, architectural models, or virtual prototypes. The appeal of a simulatable SoC model fast enough to run compiled code, capable of both functional and performance verification, is easy to understand. There have been many attempts to establish standard approaches, such as transaction-level modeling (TLM), and languages, such as SystemC.