The Breker Trekker
Tom Anderson, VP of Marketing
Tom Anderson is vice president of Marketing for Breker Verification Systems. He previously served as Product Management Group Director for Advanced Verification Solutions at Cadence, Technical Marketing Director in the Verification Group at Synopsys and Vice President of Applications Engineering at …
Guest Post: Open Source Requires Open Minds, Especially in EDA Verification
July 20th, 2016 by Tom Anderson, VP of Marketing
Few recent announcements in the EDA, IP, or semiconductor industries have had the impact of SoftBank’s proposed US$32B acquisition of ARM. Many commentators have weighed in on this news. Today’s guest blogger, OneSpin Solutions Vice President of Marketing David Kelf, shares some thoughts on how changes to the ARM universe might intersect with ongoing changes in the open source community:
One side effect of the ARM acquisition news was an intensified debate around the fascinating RISC-V Open Source processor development. Judging by last week's workshop at MIT, for one example, it clearly has the interest of a number of significant ARM users and might represent a significant game changer. It also raises the question of how Open Source, and indeed standardization efforts in general, apply to verification, and how programs in this area might change the dynamics of the increasingly closed environments from the two largest EDA vendors.
In a world, and more specifically a semiconductor industry, that is becoming increasingly community-driven, closed design flows fly in the face of an engineering environment that demands the mixing and matching of preferred tools. On the other hand, as Moore's Law slows while SoCs continue to grow, the business model employed by the large EDA companies, together with their drive to take market share from each other, would seem incompatible with Open Source ideals. Or is it?
The well-known investor and industry luminary Lucio Lanza has been a strong advocate of Open Source, and the video of a panel he ran on this subject at the last DAC is well worth a look. Lanza and his panel discuss the effect of a new community of designers with a different outlook on development processes and environments, along with the way they work and live. They consider the idea of leveraging internet-based mechanisms to build communities that could propel an Open Source design component model.
Can this model work for EDA tools? It has already been tried with some success in the form of SystemC, where a community was indeed created and a development flow introduced. However, some groups within the larger EDA vendors still perceive it as competition, and this has led to a lack of attention.
Verification tool vendors will correctly point out that building and maintaining successful simulation, emulation, and formal technologies requires significant resources and knowledge. In general, these tools are initially produced by a highly focused team of experts who work in a tightly integrated group to maximize the efficiency of the code they produce. It is hard (but not impossible) to duplicate this intense development in an open environment. Additionally, these teams need to be compensated for their efforts.
However, what about the layer of application code that sits on top of these core engines? I would argue that it is this area that we need to be looking at more closely, and might provide a stepping-stone to greater Open Source adoption and verification productivity.
A traditional EDA technology roll-out often takes the form of a small, smart company producing a new technology and working with a handful of customer-partners to knock the edges off it and make it work well in practice. Then follows proliferation to a more general market, and ultimately standardization of sections as they make sense given market conditions. Breker is a good example of this, and it is a proven model for new innovations.
But what about the situation where openness is key, where the implementation of feedback must be more immediate, where building functionality in application layers is important, and where the market simply does not have the time to wait around for the traditional development process? Open Source may well provide the answer.
Around the core verification engines is a plethora of tools, IP, methodology enablers, and integration code. It is this area where new thinking is required. For example, what if the coverage data from different tools were collated in an SQL database, using a meaningful coverage model, which could be accessed by open source tools to extract useful data? Such a coverage environment would have to be open to engines from different companies, allow users to add application code to extract the information relevant to them, and be highly reactive to changing market requirements.
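As a rough sketch of what such an open coverage database might enable, the snippet below collates hit/miss results from two engines into SQLite and runs an application-layer query for points that no engine has covered. The schema, engine names, and coverage points here are all hypothetical, invented purely for illustration; a real coverage model (and the engines feeding it) would be far richer.

```python
import sqlite3

# Hypothetical minimal schema: one row per (tool, coverage point) result.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE coverage (
    tool  TEXT NOT NULL,    -- e.g. a simulator or formal engine (invented names)
    point TEXT NOT NULL,    -- name of the coverage point
    hit   INTEGER NOT NULL  -- 1 if this tool covered the point, 0 if not
);
""")

# Merged results from two imaginary engines from different vendors.
conn.executemany(
    "INSERT INTO coverage VALUES (?, ?, ?)",
    [
        ("sim_engine",    "fifo_overflow", 1),
        ("sim_engine",    "cache_miss",    0),
        ("sim_engine",    "reset_glitch",  0),
        ("formal_engine", "fifo_overflow", 0),
        ("formal_engine", "cache_miss",    1),
        ("formal_engine", "reset_glitch",  0),
    ],
)

# User-written application query: points covered by no engine at all.
uncovered = [
    row[0]
    for row in conn.execute(
        "SELECT point FROM coverage GROUP BY point HAVING MAX(hit) = 0"
    )
]
print(uncovered)  # -> ['reset_glitch']
```

The point of the sketch is that once the data sits in an open, queryable store, the interesting analyses live in small user-level scripts like this one rather than inside any single vendor's closed tool.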
This would seem to be a clear opportunity that could be made lucrative with a business model along the lines of Red Hat's, priced to fit the market. Of course the big EDA vendors would have to go along with such an innovation, but this might not be such a bad thing. They would no longer have to worry about working with their competitors directly, and could meet customer demands for openness to other tools. If they wanted, they could still produce their own, more tightly integrated version.
The idea of building a flexible, openly licensed application layer on top of the core verification engines could provide an answer for the next generation of verification, which is clearly heading towards more narrowly focused segments that cannot be satisfied with general solutions. Open Source could be the mechanism that allows the EDA vendors to sell their high-value core engines into new markets, thus realizing the growth they need and are having trouble extracting from existing sources.
Tags: applications, apps, ARM, bandwidth, Breker, coverage, debug, EDA, emulation, formal, functional verification, graph, multi-threaded, multiprocessor, OneSpin, open source, portable stimulus, reuse, RISC-V, simulation, SoC verification, SystemC