Bridging the Frontier

Breaking Down Chip Design Functional Verification with Breker's Adnan Hamid
November 30th, 2021, by Bob Smith, Executive Director, ESD Alliance
Note: Last week, I posted the first in my Q&A blog series with ESD Alliance member company executives, which appears on the SEMI website. This Q&A features Adnan Hamid of Breker Verification Systems, who describes the functional verification space.

Adnan Hamid, CEO, founder and visionary of Breker Verification Systems, an ESD Alliance member based in San Jose, Calif., once described his job in chip design verification at AMD as "breaking things." When it came to naming his startup, Breaker was a natural choice. After some consideration, the "a" was dropped and the company became Breker. Now Hamid is breaking the most complex semiconductor designs, and Breker, moving from a startup to a scale-up company, is a noted part of the functional verification space.

Smith: Why does verification continue to take the most time in a project cycle?

Hamid: The project cycle for semiconductor design has changed. Design abstraction has been raised far above the days when developers were connecting logic gates. Today's developers write functions that omit lower-level implementation details, and designs incorporate more blocks of reusable IP. Both reduce design time. Meanwhile, designs are getting bigger, with more blocks of IP stitched together, all in need of testing. As design complexity grows, the amount of testing and verification increases as the square of design effort: one block requires one functional verification effort, while four blocks of IP mean up to 16 functional interactions requiring verification. And while design is moving up the abstraction level, that's not the case for verification, where plenty of detail must be reimplemented. Verification has certainly evolved, but engineers still think at the level of independent stimulus, response and coverage, driving the need to allocate so much time for verification.
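Hamid's scaling point can be made concrete with a toy count: with N blocks, every ordered pair of blocks (including a block paired with itself) is a potential interaction to verify, which gives the N-squared growth he describes. A minimal sketch, purely illustrative and not a real verification metric:

```python
from itertools import product

def interaction_count(blocks):
    """Count ordered block-pair interactions, including a block with
    itself, yielding N**2 cases for N blocks."""
    return sum(1 for _ in product(blocks, repeat=2))

# Hypothetical block names, chosen only for illustration.
print(interaction_count(["cpu"]))                       # 1 block  -> 1 interaction
print(interaction_count(["cpu", "gpu", "dma", "mem"]))  # 4 blocks -> 16 interactions
```

The count grows quadratically, which is why the verification budget outpaces the design budget as IP blocks are added.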
Smith: Are chips targeting artificial intelligence and machine learning applications more difficult to verify? If so, why?

Hamid: Yes, absolutely, and it's an interesting challenge, especially given that machine learning is based on massively connected arrays of processing elements. Verifying the individual processing elements and the critical interconnects between them is complex. An AI device array and, interestingly, the operation of verification test content may both be thought of as a mathematical graph of processing elements and interconnect, and their operation involves walking through that graph to generate a result. Finding the optimum path through these arrays is key. To understand how these systems may be effectively verified, it is worth investigating planning algorithms. Originally proposed by IBM, these hold the key to this type of verification process. The AI-style algorithm starts at the end of the processing element array and tracks backward along the most optimal and likely paths through it. At Breker, we have used these planning algorithms extensively to drive our graph-based test content synthesis process.

Smith: Does system integration require verification?

Hamid: Yes, it does. In the past, most functional verification was performed at the block level. However, with the rise of more specialized SoCs, functionality is spread across multiple blocks, as well as the software running on the processors, driving full system-on-chip (SoC) functional verification. In addition, new requirements such as security and safety must be validated. System-level infrastructure such as cache coherency and power domain execution has become more complex and must also be tested. The new frontier in verification is ensuring a fully operational SoC. Of course, given the size of these SoCs, hardware-assisted verification such as emulation is essential, and porting tests from block simulations to SoC emulations has become a requirement.
This porting process is problematic, which in turn has driven demand for portable tests, giving rise to Accellera's Portable Stimulus Standard (PSS), in which Breker was a major participant. Indeed, some companies are taking this to the next level by composing their system-level testbench at the same time as they commence SoC architectural design, then developing the hardware design, software design and test content all in parallel, in the so-called "shift-left" manner.

Smith: Is "shift-left" a growing trend you are seeing in verification?

Hamid: Yes. Shift-left is taking hold in hardware and software design, driving an increase in early test content composition. Then, as individual blocks are finished and connected, their verification is driven from this same test content, saving a significant amount of time and effort. This is a huge change in verification and test generation that was inevitable given increased time-to-market constraints and SoC complexity.

Figure 1: Shift-left is ushering in the next generation of SoC verification. Source: Breker

Smith: As an entrepreneur, what advice would you give someone founding a startup or thinking about starting one?

Hamid: Do not take the attitude "build it and they will come." My best advice for an entrepreneur or fledgling entrepreneur is to solve a specific customer problem, however narrow it might seem. Including services as part of a product offering and developing partnerships with other vendors helps with this, and turns your company into a solution provider, not just a product developer. This is essential for getting the right products to market on time and within budget, and then ultimately scaling them across the market.
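Earlier, Hamid described planning algorithms that start at the end of a processing-element array and track backward along the most optimal paths. One common way to realize that idea is a backward dynamic program over a weighted directed acyclic graph. The sketch below is only an illustration of that backward-search concept under assumed names and a made-up toy graph; it is not Breker's synthesis engine or the IBM algorithm itself:

```python
def best_path_backward(graph, costs, goal):
    """Backward search over a DAG: compute the cheapest cost from every
    node to `goal`, then record each node's best successor so an optimal
    path can be recovered from any start node.
    graph: node -> list of successor nodes; costs: (node, succ) -> cost."""
    # Post-order DFS so every node is processed after all its successors.
    order, seen = [], set()
    def dfs(n):
        if n in seen:
            return
        seen.add(n)
        for s in graph.get(n, []):
            dfs(s)
        order.append(n)
    for n in graph:
        dfs(n)

    best = {goal: 0.0}   # cheapest cost from node to goal
    choice = {}          # best successor on an optimal path
    for n in order:
        if n == goal:
            continue
        for s in graph.get(n, []):
            if s in best:
                c = costs[(n, s)] + best[s]
                if c < best.get(n, float("inf")):
                    best[n] = c
                    choice[n] = s
    return best, choice

def recover_path(choice, start, goal):
    """Follow recorded best successors from start to goal."""
    path = [start]
    while path[-1] != goal:
        path.append(choice[path[-1]])
    return path

# Hypothetical 2x2 processing-element array feeding a result node.
graph = {"pe00": ["pe01", "pe10"], "pe01": ["pe11"],
         "pe10": ["pe11"], "pe11": ["result"], "result": []}
costs = {("pe00", "pe01"): 1.0, ("pe00", "pe10"): 2.0,
         ("pe01", "pe11"): 1.0, ("pe10", "pe11"): 0.5,
         ("pe11", "result"): 0.0}
best, choice = best_path_backward(graph, costs, "result")
print(recover_path(choice, "pe00", "result"))  # ['pe00', 'pe01', 'pe11', 'result']
```

The key property matching Hamid's description is the direction of computation: costs are settled from the goal backward, so when a start node is reached, its optimal route through the array is already known.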