February 25, 2008
First 4 weeks of Shock & Awe then DVCon
Despite this call for peace between factions, it’s worth noting that the DVCon conference bag came fully loaded with two recent publications: Synopsys’ Verification Avenue, which touts VMM, and Mentor’s Verification Horizons, which touts OVM. Other than keeping editors and printers in business, a merger of efforts across the industry doesn’t look likely anytime soon if the promises detailed in each pub are both pursued.
* Accellera at DVCon: Meanwhile, Accellera took advantage of DVCon (its spiritual home) to announce its Board of Directors has approved the VHDL 4.0 standard specification, which will be released to IEEE for balloting this year: “VHDL 4.0 addresses over 90 issues that were discovered during the trial implementation period for the VHDL 3.0 version. These encompass enhancements to major new areas introduced by VHDL 3.0 including generic types, IP protection, PSL integration, VHDL API integration, and the introduction of fixed and floating point types.”
Does this announcement answer John Cooley’s challenge to his DVCon panelists (see below) to prove that VHDL isn’t dead? I don’t know, but somebody must be using VHDL, or Accellera wouldn’t be going to all of this effort.
* OSCI & NASCUG at DVCon: Co-located with DVCon this year, the North American SystemC Users Group (NASCUG) meeting on Tuesday had 70+ people in attendance. Also on Tuesday, the Open SystemC Initiative (OSCI) hosted a tutorial detailing the OSCI TLM-2 draft standard, released in November 2007, which “addresses the interoperability of memory-mapped bus models at the transaction level, as well as providing a foundation and framework for the transaction level modeling of other protocols.”
Over coffee on Wednesday with ESLX Co-Founder Jack Donovan, NASCUG President, and Forte VP of Technical Marketing Mike Meredith, OSCI President, I learned that OSCI is looking for feedback from any interested parties with respect to the TLM-2 standard.
Mike said, “Developing the standard has been challenging, in my view, because it really is multiple standards being built at the same time. It’s meeting the requirements of people who want to do detailed architectural and performance analysis, plus also those who want to do virtual platform development. Yes we’re targeting more than just a single goal, but we think this will prove to be the strength of the standard [in the long run]. There will be interoperable models that can be exchanged across the industry, and you’ll be able to mix and match depending on what you’re trying to accomplish.” Mike also noted that various OSCI events would take place
at both DATE and DAC, where additional opportunities to give feedback on TLM-2 will be available.
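Mike’s point about interoperable, memory-mapped models can be illustrated with a minimal sketch. The names below (GenericPayload, b_transport, SimpleMemory) are loose stand-ins for the TLM-2 draft’s generic payload and blocking transport interface, not the actual OSCI API; the point is that interoperability comes from agreeing on one transaction type and one transport call:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Loose stand-in for the TLM-2 draft's "generic payload": a memory-mapped
// read/write transaction that any initiator and any target can agree on.
enum class Command { Read, Write };

struct GenericPayload {
    Command  command;
    uint64_t address;      // byte address in the target's memory map
    uint8_t* data;         // pointer to the initiator's data buffer
    unsigned length;       // number of bytes to transfer
    bool     response_ok;  // set by the target
};

// Any model exposing this interface can be wired to any initiator; the
// "mix and match" Mike describes comes from sharing the payload type and
// transport call, not from sharing implementations.
struct TargetIf {
    virtual void b_transport(GenericPayload& trans) = 0;
    virtual ~TargetIf() = default;
};

// A trivially simple memory target for the sketch.
class SimpleMemory : public TargetIf {
    std::vector<uint8_t> mem;
public:
    explicit SimpleMemory(size_t size) : mem(size, 0) {}
    void b_transport(GenericPayload& trans) override {
        if (trans.address + trans.length > mem.size()) {
            trans.response_ok = false;  // address out of the memory map
            return;
        }
        if (trans.command == Command::Write)
            std::memcpy(&mem[trans.address], trans.data, trans.length);
        else
            std::memcpy(trans.data, &mem[trans.address], trans.length);
        trans.response_ok = true;
    }
};
```

The same payload could just as easily be routed to an architectural-performance model or a virtual-platform model, which is the dual audience Mike says the standard is trying to serve.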
Meanwhile, Jack said, “There are a whole class of engineers out there who are looking at working at the ESL level, and a whole lot of people using the OSCI simulator and ModelSim. But [in general], those users are still under the radar of the EDA companies. The question [for many users interested in ESL] is how do you grow adoption of SystemC and ESL within the company without shutting down completely for a number of months.”
Jack added that the business models for companies in Europe and Asia provide a better chance to see the opportunities associated with SystemC, versus the fabless business models more common in the U.S. where it’s only about getting the chip out as fast as possible. Companies outside of North America will often give an employee several years to come up to speed on system-level languages and technologies, to essentially become an internal evangelist, and then give that same employee additional time to educate their co-workers in the technology. Jack and Mike said that’s why we’re seeing a different pattern of adoption of ESL and SystemC in North America versus elsewhere
in the world. Nonetheless, they remain extremely optimistic that the move to higher levels of abstraction worldwide is inevitable.
Denali’s freight train (see above) and Cooley’s doubts (see below) notwithstanding, I’d have to agree with Jack Donovan and Mike Meredith. It’s really not over til it’s truly over – and the fight for SystemC and ESL ain’t anywhere near being over yet.
* Formal Verification at DVCon: In a complex hour of conversation positioned at the heart of DVCon’s topic material, a panel that included Intel’s Limor Fix, IBM’s Avi Ziv, Jasper Design’s Rajeev Ranjan, Mentor Graphics’ Harry Foster, and Cadence’s Axel Scherer attempted to create some order out of one of the thorniest questions in verification: Is formal verification a reality or is it not?
Although there appeared to be agreement among the speakers with regards to the need for standards to establish structure among the different verification methods, there was a fair amount of disagreement in other areas of the discussion. In the end, after what seemed to me a confusing array of positions and counter-positions from the various speakers, the panel ended with one clarifying question from discussion moderator Richard Ho: “Has formal verification come of age?”
The answers from the three EDA vendors were inevitable. Harry Foster said, “Yes.” Axel Scherer said, “Yes.” And Rajeev Ranjan said, “Absolutely!” The answers from the EDA customers were not so predictable. IBM’s Avi Ziv said, “I wouldn’t go so far as to say that formal verification’s come of age.” Intel’s Limor Fix got a round of applause: “Formal verification has finished high school, but not yet started university!”
* Low-Power Verification at DVCon: Every vendor laid claim to far more progress in the technology than the engineers in the audience were willing to acknowledge. The idea of verifying designs that can have more than 25 power islands on-chip is so daunting, it’s not a surprise that the technologies and tools suggested by the vendors are being greeted with skepticism by the users.
After speaking about static and formal verification of power-aware design using UPF, Mentor’s Amit Srivastava was stymied by a question from the audience: “So, this tool will generate assertions? Does it actually exist yet?”
Harry Foster, session chair, answered: “This is a proof of concept talk!”
After speaking about power assertions and coverage for low-power verification in that same session, Cadence’s Bill Winkeler was equally stymied by a question: “You’re turning power on and off on a bus as specified, but how can we be sure it’s all covered?” Winkeler’s response: “We measure whether or not it’s a domain, a mode, or a transition. But other than that, there’s no automatic way to do what you want.”
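Winkeler’s three buckets — domains, modes, and transitions — amount to a simple coverage model: which power domains a simulation exercised, which power modes each domain visited, and which mode-to-mode transitions actually occurred. A hypothetical tracker (the class and method names are mine, not from any vendor’s tool) might look like:

```cpp
#include <cassert>
#include <map>
#include <set>
#include <string>
#include <utility>

// Hypothetical low-power coverage tracker: records which power domains,
// modes, and mode-to-mode transitions a simulation actually exercised.
class PowerCoverage {
    std::map<std::string, std::string> current_mode;          // domain -> last mode seen
    std::map<std::string, std::set<std::string>> modes_seen;  // domain -> modes visited
    std::map<std::string,
             std::set<std::pair<std::string, std::string>>> transitions_seen;
public:
    // Call whenever a domain changes power mode (e.g. ON -> RETENTION).
    void record(const std::string& domain, const std::string& mode) {
        auto it = current_mode.find(domain);
        if (it != current_mode.end() && it->second != mode)
            transitions_seen[domain].insert({it->second, mode});
        modes_seen[domain].insert(mode);
        current_mode[domain] = mode;
    }
    size_t domains_covered() const { return modes_seen.size(); }
    size_t modes_covered(const std::string& domain) const {
        auto it = modes_seen.find(domain);
        return it == modes_seen.end() ? 0 : it->second.size();
    }
    bool transition_covered(const std::string& domain,
                            const std::string& from,
                            const std::string& to) const {
        auto it = transitions_seen.find(domain);
        return it != transitions_seen.end() && it->second.count({from, to}) > 0;
    }
};
```

Which is exactly where the tools stop, per Winkeler: they can measure this kind of coverage, but deciding whether the covered set is sufficient — “is it all covered?” — remains with the user.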
Tom Williams gave a dynamic early morning keynote on Thursday on trends in low-power verification, one in which he dramatized on stage the difficulties electrons are having these days making their way efficiently through narrow Cu interconnects (average width 600 Å) versus the much roomier Al interconnects of yore (average width 1000 Å). Although Williams made a terrific electron, he too was hit up with questions from skeptics of Synopsys’ strategy of including dynamic analysis in low-power design verification.
Question: “Even if you’re working on a mix of voltage domains, aren’t there clearly defined boundaries between voltage domains like there are with clock domains [making dynamic analysis unnecessary]?” Williams replied, “There should be, but there can be errors. And yes, one would hope for a global solution [that might arise] if you could shove everything into the static portion of the design reliably, but that’s just not possible.”
ordered-test patterns, and had increased test efficiencies by up to 10x. But it wasn’t enough because even though the cost of components came down, the cost of test did not.
-- Peggy Aycinena, EDACafe.com Contributing Editor.