DVCon 2014: Design & Verification on Steroids
March 6th, 2014 by Peggy Aycinena
Oh my gosh: If you arrived at DVCon 2014 at 10:45 am on Tuesday this week, you’d have wondered if you’d wandered into the wrong conference. What happened to sedate, dignified DVCon? Standing at the registration desk on the first floor of the DoubleTree Hotel in San Jose, I heard an unprecedented volume of noise and conviviality sweeping down the staircase from the upstairs mezzanine. What was going on up there? The DVCon morning poster session, awash in company reps and their ideas, and engineers eager to engage with both.
When I got to the top of the staircase, I took a moment before plunging into the crowd, amazed at the vitality and the number of people hobnobbing among the posters. It wasn’t surprising to learn later in the day from DVCon General Chair Stan Krolikoski that over a thousand people – attendees and exhibitors combined – were at this year’s conference. Clearly, DVCon is enjoying an extraordinary renaissance, so much so that DVCon Europe will be debuting this October in Munich, with DVCon India, DVCon China, and DVCon Japan now in the planning stages. Like I said, omg.
So who did I get to chat with during the poster session this week in Silicon Valley, and what were the take-aways?
* Vaibhar Mahimkar from Texas Instruments. His poster illustrated a TI-developed SoC testbench environment architecture with three concentric layers of verification strategies wrapped around the DUV [device under verification]. Layer One consists of a fully synthesizable testbench, written in Verilog; Layer Two is a PPI, written in e/Specman, that provides an abstraction layer hiding all protocol/sequencing details from Layer Three, also written in e/Specman, which in turn provides the stimulus generation sequences, functional coverage bins, monitors and checkers – the CDV [Coverage Driven Verification] layer. The proprietary interactions between these layers have allowed TI to verify devices more quickly, but the reusability of the environment has not yet extended across all of TI.
Take Away: The porting of ‘secret sauce’ processes and IP across business units remains one of the great challenges in the semiconductor design chain.
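To make the layering concrete, here is a minimal conceptual sketch – in Python standing in for the Verilog and e/Specman of the actual environment – of how three such concentric layers might divide the work. All class and method names are my own hypothetical illustrations, not TI’s implementation.

```python
class SynthesizableTestbench:
    """Layer One: drives raw pin-level activity on the DUV."""
    def __init__(self):
        self.log = []          # record of pin-level events driven

    def drive_pins(self, word):
        self.log.append(("pins", word))


class ProtocolInterface:
    """Layer Two (the 'PPI'): hides protocol/sequencing detail."""
    def __init__(self, tb):
        self.tb = tb

    def send_transaction(self, addr, data):
        # Expand one abstract transaction into pin-level cycles.
        for word in (addr, data):
            self.tb.drive_pins(word)


class CDVLayer:
    """Layer Three: stimulus sequences, coverage bins, checkers."""
    def __init__(self, ppi):
        self.ppi = ppi
        self.coverage = set()  # functional coverage bins hit

    def run_sequence(self, transactions):
        for addr, data in transactions:
            self.ppi.send_transaction(addr, data)
            self.coverage.add(addr)


tb = SynthesizableTestbench()
cdv = CDVLayer(ProtocolInterface(tb))
cdv.run_sequence([(0x10, 0xAB), (0x20, 0xCD)])
print(sorted(cdv.coverage))   # → [16, 32]
print(len(tb.log))            # → 4 pin-level events
```

The point of the structure is that Layer Three never sees pin wiggles, and Layer One never sees coverage – which is exactly what makes each layer individually reusable, and why the reuse breaks down when the interfaces between layers stay proprietary.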
* Rich Edelman of Mentor Graphics briefly talked me through his poster, Debugging Communication Systems: The Blame Game, Blurring the Line between Performance Analysis and Debug. The idea is to get the DUV to label its own data, so the data can be tracked as it flows through the device, indicating where things may have gone wonky. The strategy assumes, of course, that you know what the data should look like after transiting the device, and how to label the data in the first place. Definitely a work in progress.
Take Away: Making devices smart enough to verify themselves is really the only way to go.
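A toy sketch of the “blame game” idea: give each packet a history that every stage of the pipeline stamps, then walk the history afterward to find the first stage whose output violates the expectation. The stage names and the expectation function here are invented for illustration; they are not from Edelman’s poster.

```python
def run_pipeline(payload, stages):
    """Pass payload through stages, recording (stage, value) history."""
    history = [("input", payload)]
    for name, fn in stages:
        payload = fn(payload)
        history.append((name, payload))
    return payload, history


def first_blame(history, expected_fn):
    """Return the first stage whose output violates the expectation."""
    for name, value in history[1:]:
        if not expected_fn(value):
            return name
    return None


stages = [
    ("ingress", lambda x: x + 1),
    ("router",  lambda x: x * 0),   # buggy stage zeroes the data
    ("egress",  lambda x: x + 1),
]
out, hist = run_pipeline(41, stages)
print(first_blame(hist, expected_fn=lambda v: v > 0))  # → router
```

The hard part, as the poster concedes, is exactly what this sketch hand-waves: knowing what `expected_fn` should be, and embedding the labeling in silicon rather than in a Python list.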
* Oracle Labs sent Ram Narayan to host a poster, The Future of Formal Model Checking is Now. [He came into the Oracle organization through the acquisition of Sun Labs.] Talking with Narayan was interesting because of his enthusiasm for Formal, and because his poster was predicated on the fact that, even today, Formal is not fully accepted or mainstream for most verification strategists.
Per the poster, the Perception of Formal is that it’s only for experts: that it needs special skills and talents, is hard to use and deploy, is impractical for real designs, and requires a lot of effort before benefits can be reaped.
However, per Narayan’s poster, the Reality of Formal is that yes, it needs planning, diligence and persistence, but establishing the Formal skill set within the verification team can evolve from modest beginnings to a fully targeted strategy that produces immediate results by leveraging fundamental assertion-based verification principles.
Take Away: New technologies always need internal evangelists to propel them through the organization.
* Andreas Meyer used his poster, System-Level Distributed Metrics Analysis and Results, to talk about strategies for verifying devices with multiple clusters of cores. Meyer said ARM deals with design issues at “lower levels” within the device, yet still has to keep an eye on the target SoC within which the ARM cores will reside. To do that, they work with companies like Mentor Graphics to address the larger system’s verification needs, feeding the appropriate data into third-party VIP and providing accurate information about what’s happening on ARM-provided bus structures.
Take Away: It’s in the best interest of megalithic companies like ARM, whose technology constitutes a de facto industry standard, to cultivate an ecosystem conducive to further entrenchment of that standard. What goes around, comes around.
Poster Session to Happy Hour …
I was only able to be at DVCon on Tuesday this week, which was disappointing, but I tried to make the most of that one day. Following the late-morning poster session, I attended technical sessions in the afternoon – after VC Jim Hogan’s aw-shucks lunchtime keynote, where he talked about aging tech warriors such as himself and their need for wearable devices that will get [read “force”] them to live healthier, happier lives by monitoring calorie intake and exercise output.
After the technical sessions ended, it was on to the legendary DVCon Happy Hour on Day 2 of the 3-day Exhibition Hall. If you thought the Poster Sessions were lively, you should have been at DVCon between 5 pm and 6 pm. The exhibitor booths spilled out into the hallway, and the whole place was abuzz with conversation, food and libation.
Who knew DVCon could so totally feel like Design & Verification on Steroids? No wonder the folks who run the conference want to take it international. They can see that technology is at its most mesmerizing when presented face-to-face.
Tags: Andreas Meyer, ARM, DVCon 2014, DVCon Europe, formal verification, Jim Hogan, Mentor Graphics, Oracle Labs, Ram Narayan, Rich Edelman, Stan Krolikoski, TI, Vaibhar Mahimkar