February 25, 2008
First 4 weeks of Shock & Awe … then DVCon

by Peggy Aycinena - Contributing Editor

Despite this call for peace between factions, it’s worth noting that the DVCon conference bag came fully loaded with two recent publications: Synopsys’ Verification Avenue, which touts VMM, and Mentor’s Verification Horizons, which touts OVM. Other than keeping editors and printers in business, a merger of efforts across the industry doesn’t look like it will come anytime soon if the promises detailed in each pub are both pursued.

* Accellera at DVCon: Meanwhile, Accellera took advantage of DVCon (its spiritual home) to announce its Board of Directors has approved the VHDL 4.0 standard specification, which will be released to IEEE for balloting this year: “VHDL 4.0 addresses over 90 issues that were discovered during the trial implementation period for the VHDL 3.0 version. These encompass enhancements to major new areas introduced by VHDL 3.0 including generic types, IP protection, PSL integration, VHDL API integration, and the introduction of fixed and floating point types.”

Does this announcement answer John Cooley’s challenge to his DVCon panelists (see below) to prove that VHDL isn’t dead? I don’t know, but somebody must be using VHDL; Accellera wouldn’t be going to all of this effort for nothing.

* OSCI & NASCUG at DVCon: Co-located with DVCon this year, the North American SystemC Users Group (NASCUG) meeting on Tuesday had 70+ people in attendance. Also on Tuesday, the Open SystemC Initiative (OSCI) hosted a tutorial detailing the OSCI TLM-2 draft standard, released in November 2007, which “addresses the interoperability of memory-mapped bus models at the transaction level, as well as providing a foundation and framework for the transaction level modeling of other protocols.”

Over coffee on Wednesday with ESLX Co-Founder Jack Donovan, NASCUG President, and Forte VP of Technical Marketing Mike Meredith, OSCI President, I learned that OSCI is looking for feedback from any interested parties with respect to the TLM-2 standard.

Mike said, “Developing the standard has been challenging, in my view, because it really is multiple standards being built at the same time. It’s meeting the requirements of people who want to do detailed architectural and performance analysis, plus also those who want to do virtual platform development. Yes we’re targeting more than just a single goal, but we think this will prove to be the strength of the standard [in the long run]. There will be interoperable models that can be exchanged across the industry, and you’ll be able to mix and match depending on what you’re trying to accomplish.” Mike also noted that various OSCI events would take place at both DATE and DAC, where additional opportunities to give feedback on TLM-2 will be available.

Meanwhile, Jack said, “There are a whole class of engineers out there who are looking at working at the ESL level, and a whole lot of people using the OSCI simulator and ModelSim. But [in general], those users are still under the radar of the EDA companies. The question [for many users interested in ESL] is how do you grow adoption of SystemC and ESL within the company without shutting down completely for a number of months.”

Jack added that the business models for companies in Europe and Asia provide a better chance to see the opportunities associated with SystemC, versus the fabless business models more common in the U.S., where it’s only about getting the chip out as fast as possible. Companies outside of North America will often give an employee several years to come up to speed on system-level languages and technologies, to essentially become an internal evangelist, and then give that same employee additional time to educate co-workers in the technology. Jack and Mike said that’s why we’re seeing a different pattern of adoption of ESL and SystemC in North America versus elsewhere in the world. Nonetheless, they remain extremely optimistic that the move to higher levels of abstraction worldwide is inevitable.

Denali’s freight train (see above) and Cooley’s doubts (see below) notwithstanding, I’d have to agree with Jack Donovan and Mike Meredith. It’s really not over ‘til it’s truly over – and the fight for SystemC and ESL ain’t anywhere near being over yet.

* Formal Verification at DVCon: In a complex hour of conversation positioned at the heart of DVCon’s topic material, a panel that included Intel’s Limor Fix, IBM’s Avi Ziv, Jasper Design’s Rajeev Ranjan, Mentor Graphics’ Harry Foster, and Cadence’s Axel Scherer attempted to create some order out of one of the thorniest questions in verification: Is formal verification a reality or is it not?

Although there appeared to be agreement among the speakers with regard to the need for standards to establish structure among the different verification methods, there was a fair amount of disagreement elsewhere in the discussion. In the end, after what seemed to me a confusing array of positions and counter-positions from the various speakers, the panel closed with one clarifying question from moderator Richard Ho: “Has formal verification come of age?”

The answers from the three EDA vendors were inevitable. Harry Foster said, “Yes.” Axel Scherer said, “Yes.” And Rajeev Ranjan said, “Absolutely!” The answers from the EDA customers were not so predictable. IBM’s Avi Ziv said, “I wouldn’t go so far as to say that formal verification’s come of age.” Intel’s Limor Fix got a round of applause: “Formal verification has finished high school, but not yet started university!”

* Low-Power Design & Verification at DVCon: There were essentially three sessions on low power at DVCon: “Verification of Low-Power Designs,” featuring speakers from STMicro, Mentor, and Cadence; “Assertion-Based Verification of Low-Power Design,” featuring speakers from Mentor and Cadence; and “Trends in Low-Power Verification,” featuring Synopsys Fellow Tom Williams. If you conclude from this list that Mentor, Cadence, and Synopsys are concerned about the verification of low-power designs, I think you’d be right. I attended all or part of all three sessions and came away with the impression that each and every vendor laid claim to far more progress in the technology than the engineers in the audience were willing to acknowledge. The idea of verifying designs that can have more than 25 power islands on-chip is so daunting, it’s no surprise that the technologies and tools suggested by the vendors are being greeted with skepticism by the users.

After speaking about static and formal verification of power-aware design using UPF, Mentor’s Amit Srivastava was stymied by a question from the audience: “So, this tool will generate assertions? Does it actually exist yet?” Harry Foster, session chair, answered: “This is a proof-of-concept talk!”

After speaking about power assertions and coverage for low-power verification in that same session, Cadence’s Bill Winkeler was equally stymied by a question: “You’re turning power on and off on a bus as specified, but how can we be sure it’s all covered?” Winkeler’s response: “We measure whether or not it’s a domain, a mode, or a transition. But other than that, there’s no automatic way to do what you want.”

Tom Williams gave a dynamic early morning keynote on Thursday on trends in low-power verification, one in which he dramatized on stage the difficulties electrons are having these days making their way efficiently through narrow Cu interconnects (average width 600 Å) versus the much roomier Al interconnects of yore (average width 1000 Å). Although Williams made a terrific electron, he too was hit up with questions from skeptics of Synopsys’ strategy of including dynamic analysis in low-power design verification.

Question: “Even if you’re working on a mix of voltage domains, aren’t there clearly defined boundaries between voltage domains like there are with clock domains [making dynamic analysis unnecessary]?” Williams replied, “There should be, but there can be errors. And yes, one would hope for a global solution [that might arise] if you could shove everything into the static portion of the design reliably, but that’s just not possible.”

* Ending Endless Verification: Wally Rhines’ Wednesday afternoon keynote was addressed directly to a packed ballroom full of real engineers. He talked. They listened. He promised to talk about verification, but begged permission to start with DFT. He said on-chip complexity had forced folks to search for better ways to test over the years. Bigger and faster computers had helped, as had testing for stuck-at faults, but the number of transition faults kept growing. Test engineers beat that back, Rhines said, by shifting to scan-based test, introducing ATPG, BIST, and ordered-test patterns, and increasing test efficiencies by up to 10x. But it wasn’t enough, because even though the cost of components came down, the cost of test did not.



-- Peggy Aycinena, EDACafe.com Contributing Editor.

Review Article
  • DVCon Troublemaker's Panel February 26, 2008
    Reviewed by 'Cedric Iwashina'
    Hi Peggy,
    At the DVCon Troublemaker's Panel, I thought that some of Gary Smith's observations/predictions were the most interesting.
    A few weeks ago at the EDAC CEO panel, you asked Mike Fister about the rumor last summer that Kohlberg Kravis Roberts and the Blackstone Group might possibly take Cadence private.
    On this year's Troublemaker's panel, Gary said that a large part of Cadence's recent financial shortfall was due to private equity (PE). Specifically, some of Cadence's largest customers were taken private over the last two years, e.g., NXP and Freescale. Now, the PE firms realize that they made a mistake and are losing money hand over fist. So, they've cut costs to the bone. As a result, they're not spending as much on EDA software as expected. Gary also said that, due to these financial debacles, PE firms were no longer interested in anything even remotely related to semiconductors, including EDA.
    Gary also said in physical implementation, Cadence needed to buy Magma, but probably couldn't afford it any longer after their precipitous drop in market cap. He said that Mentor/Sierra would be able to sustain a viable physical implementation business. And that AtopTech would most likely be acquired, probably by Synopsys.
    One last thing he said, that I found to be a bit strange, is that ESL continues to grow, but hasn't reached the "knee of the curve" in terms of market acceptance, yet. I think he said that it would reach that "knee" around 2012. That can't be good news for ESL companies.


  • What the *&!^? February 25, 2008
    Reviewed by 'Bob Smith'
    Peggy -
This is a great article and dissection of the state of the EDA industry. No doubt the industry is going through very tough times and your article provides several different perspectives about why this is happening. Your observation about the role the IDMs play in the food chain is very good, for example. While EDA may be on the "outs" now, it is clear that if semiconductor technology is going to continue to evolve, it will require new technical innovations to solve the tough problems in design, verification, and implementation. Some of this will come from within the large IDMs / semicos, but based on history we also know that a good part of the fundamental innovation will come from startups and academia. The question is ... can the fundamental business models that drive the industry change and evolve as well?



