Guest Blogger
Jonah McLeod
Marketing Communications Director at Kilopass Technologies, Inc.

Is “Lifecare” the Next Killer App?

 
August 7th, 2012 by Jonah McLeod

Article source: Kilopass Technologies

The world population hit 7 billion last fall, with a billion more expected in a dozen years. “Lifecare” represents an incredible opportunity for the semiconductor industry to promote health, energy conservation, safety, and productivity. From smart-city infrastructure to medical care advances, from sensors and controls to nanotechnology, what new EDA ecosystems will emerge to better model the real world? Panelists in the discussion “Is Lifecare the Next Killer App?” at the Design Automation Conference on June 4, 2012 addressed the question, and their remarks are quite enlightening. Moderator Rick Merritt, Editor-at-Large of Electronic Engineering Times, led the discussion, which included Kristopher Ardis from Maxim Integrated Products, Fabrice Hoerner from QUALCOMM Inc., and Greg Fawcett from Palo Alto Research Center.

Accelerating Coverage Closure with Jasper

 
August 6th, 2012 by Rob van Blommestein

Jasper’s formal technology has advanced to the point that it can address a broad range of verification and design issues. With a strong foundation in fundamental proof technology and best-in-class capacity and performance, Jasper’s users now apply the tools and technology to address questions of connectivity, X-propagation, clock-glitch detection, cache-coherence protocols, deadlock detection, property synthesis, and more.

The added scope and breadth of use of Jasper’s tools and technology is leading users to demand a measurable, quantitative approach that correlates the results of formal proofs with verification closure, often expressed in terms of verification coverage. The first requirement, then, is a methodology that correlates formal proof results with coverage. A second is a methodology that can integrate the coverage results from Jasper’s formal technology with those of other verification tools, such as simulation. A third is the ability for Jasper tools to use external coverage data to target areas of the design that are not covered by other verification methodologies.
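To make the correlation idea concrete, here is a minimal sketch in plain Python — not Jasper’s actual API or data model, just an invented illustration — of how proof results might be folded into a coverage database: points hit in simulation or fully proven by formal close out, while formally unreachable points are excluded from the denominator rather than chased forever.

```python
# Hypothetical merge of formal proof results with simulation coverage.
# None of these names come from Jasper's tools; they illustrate the flow.

from dataclasses import dataclass

@dataclass
class CoverPoint:
    name: str
    hit_in_sim: bool = False    # covered by simulation
    proof_status: str = "none"  # "proven", "unreachable", or "none" (from formal)

def merge_coverage(points):
    """Classify each point after combining simulation and formal results."""
    closed, remaining = [], []
    for p in points:
        if p.hit_in_sim or p.proof_status == "proven":
            closed.append(p.name)                  # covered by sim or by formal
        elif p.proof_status == "unreachable":
            closed.append(p.name + " (excluded)")  # formally unreachable: exclude
        else:
            remaining.append(p.name)               # still needs targeted effort
    return closed, remaining

points = [
    CoverPoint("fifo_full", hit_in_sim=True),
    CoverPoint("arb_grant_both", proof_status="proven"),
    CoverPoint("illegal_state", proof_status="unreachable"),
    CoverPoint("retry_path"),
]
closed, remaining = merge_coverage(points)
print("closed:   ", closed)
print("remaining:", remaining)
```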

Read the rest of Accelerating Coverage Closure with Jasper

Jasper Users Share How They Upgraded Their Verification with Jasper

 
August 1st, 2012 by Rob van Blommestein

There is no substitute for the power of teaching from experience. At this year’s DAC, a few of Jasper’s top users volunteered to give seminars on their best practices for using Jasper Formal technology. If you missed DAC, or attended but didn’t get a chance to visit the Jasper booth, here’s your chance to view the online videos from ST, ARM, and NVIDIA on how they used Jasper Formal technology to get ahead in their designs.

ST: Low Power Verification and Optimization with Jasper Formal

STMicroelectronics talked about the verification challenges associated with sophisticated low-power designs, and the ways those challenges are being addressed by Jasper’s power-aware formal verification technology. The seminar detailed how Jasper’s low-power verification solution applies to the areas below (a toy illustration follows the list):

  • Parsing CPF information to enable power-aware formal analysis
  • X-propagation due to shutting down power
  • Functional impact due to power-down
  • Power-up state analysis
  • Power-state exploration
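As a rough illustration of the X-propagation item, here is a toy three-valued model in Python – invented for this post, not Jasper’s actual flow – showing why a powered-down domain’s X output must be clamped by an isolation cell before it reaches always-on logic.

```python
# Toy 0/1/X model of power-down X-propagation. Illustrative only.

X = "X"  # unknown value driven by a powered-down domain

class PowerDomain:
    def __init__(self, name, on=True):
        self.name, self.on = name, on

    def drive(self, value):
        # A powered-down domain drives X instead of its functional value.
        return value if self.on else X

def isolate(value, clamp=0):
    # Isolation cell: clamp X from a dead domain to a known value.
    return clamp if value is X else value

cpu = PowerDomain("cpu_core", on=False)  # shut down

raw = cpu.drive(1)
assert raw is X, "a powered-down domain must drive X"

captured_bad = raw           # without isolation, X corrupts always-on state
captured_ok = isolate(raw)   # with isolation, a known clamped value is seen

print("without isolation:", captured_bad)  # X
print("with isolation:   ", captured_ok)   # 0
```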

Read the rest of Jasper Users Share How They Upgraded Their Verification with Jasper

Think Parallel First, Then Cloud for EDA

 
July 20th, 2010 by Ziyad Hanna, Chief Architect and VP of Research, Jasper Design Automation

Cloud computing was the subject of much interest and discussion at this year’s DAC.  While I acknowledge that the cloud will play an increasingly important role in our business, its displacement of today’s semiconductor design practices is easily a decade or more away.

The attraction of the cloud is to increase one’s access to raw computing power and software.  If you need more speed, or a specialized program, just grab it and go.  But the hard work and differentiation for our customers is still done in the trenches, not up in the sky.  They are not about to put their proprietary designs on some server, somewhere, or give up their customized and optimized design flows.  What can we in EDA do for them today to help with increased design complexity and the need for higher performance?

For me the answer lies in maximizing throughput by leveraging the advantages of parallel computing.  This is also a subject of great debate within our community, often involving the technical difficulties of parallelizing EDA software (particularly legacy software) and its impact on traditional business models.  These are real challenges, but so are the benefits to our users when we get it right.
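As a small illustration of the point – a generic Python sketch, not Jasper’s engine – independent proof obligations are embarrassingly parallel, so a multicore worker pool raises throughput long before any cloud enters the picture.

```python
# One proof task per property; a process pool spreads them across cores.
# check_property is a CPU-burning stand-in for a real formal engine.

from concurrent.futures import ProcessPoolExecutor

def check_property(prop_id):
    # Stand-in for an expensive, independent formal proof.
    result = sum(i * i for i in range(200_000))
    return prop_id, "proven" if result > 0 else "counterexample"

if __name__ == "__main__":
    properties = [f"assert_{i}" for i in range(16)]
    with ProcessPoolExecutor() as pool:  # one worker per core by default
        for prop, status in pool.map(check_property, properties):
            print(prop, status)
```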

Read the rest of Think Parallel First, Then Cloud for EDA

Accellera at DAC: Defining a Universal Verification Methodology

 
June 7th, 2010 by Stan Krolikoski, Cadence; Dennis Brophy, Mentor; Yatin Trivedi, Synopsys

First of all, we’d like to invite all DAC attendees to Accellera’s breakfast and panel, “UVM: Charting a New Course,” on Tuesday, June 15, 7:30 am – 9:00 am, Room 203B in the Convention Center.

The increasing complexity of SoC design is hardly news, and it is a foregone conclusion that design is a relatively bounded problem compared to verification. Just as design reuse through semiconductor IP (aka design IP) brought designers up the productivity curve, in the last decade Verification IP (VIP) has done the same for verification engineers. Two leading methodologies, the Verification Methodology Manual (VMM) and the Open Verification Methodology (OVM), helped accelerate both the adoption of structured verification methodologies using SystemVerilog and the creation of commercially available verification IP to independently validate the integration of design IP in SoCs. Essentially, both methodologies are a collection of SystemVerilog classes with inherent semantics for their behavior in different phases of the simulation. The user creates verification objects from these classes and attaches them to the design components as traffic/data generators, monitors, checkers, etc.
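That pattern is easy to sketch outside SystemVerilog. The toy Python below (illustrative only; the real VMM/OVM class libraries are far richer) shows the essence: methodology base classes with fixed phase semantics, user-derived generators and checkers, and a kernel that walks every component through each phase in order.

```python
# Minimal phased-component pattern, in the spirit of VMM/OVM. Invented names.

class Component:
    """Methodology base class: every object participates in fixed phases."""
    def __init__(self, name):
        self.name = name
    def build(self):  pass   # construct sub-components
    def run(self):    pass   # drive or observe traffic during simulation
    def report(self): pass   # summarize results at the end

class TrafficGenerator(Component):
    def run(self):
        print(f"{self.name}: sending stimulus to the DUT")

class Checker(Component):
    def __init__(self, name):
        super().__init__(name)
        self.errors = 0
    def report(self):
        print(f"{self.name}: {self.errors} errors")

# The "simulator" walks every component through each phase in order.
env = [TrafficGenerator("gen0"), Checker("chk0")]
for phase in ("build", "run", "report"):
    for comp in env:
        getattr(comp, phase)()
```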

Read the rest of Accellera at DAC: Defining a Universal Verification Methodology

What Will It Take for Next-generation Routing to Meet the Needs of the Most Advanced Process Nodes and Beyond?

 
June 1st, 2010 by Mark Waller, VP of Research and Development, Pulsic

Custom chip designers are reluctant to adopt automation, largely because they have traditionally been able to do a better job by hand. While hand crafting may still suffice for designs with relatively few transistors, it is no longer sufficient for the new, highly complex devices that are becoming the norm.

At advanced nodes, process rules make full hand layout almost impossible. For example, instead of simple space rules, “space” now depends on the width of metal and the length of parallel lines, and there are complex via and contact density rules and end-of-line rules that can’t readily be dealt with by hand crafting. An automatic custom design routing tool that can deal with the new custom world will improve productivity and achieve on-time, efficient design delivery.
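A toy rule check makes the point (the numbers below are invented, not from any foundry’s actual deck): once minimum spacing becomes a function of wire width and parallel run length, a router can evaluate the rule on every candidate track, while a human cannot.

```python
# Toy width- and run-length-dependent spacing rule. Illustrative values only.

def required_spacing(width_a, width_b, parallel_len):
    """Return minimum spacing in nm for two parallel wires."""
    w = max(width_a, width_b)       # rule keys off the wider wire
    base = 60                       # "simple" minimum spacing (nm)
    if w > 100:                     # wide-metal rule kicks in
        base = 100
    if parallel_len > 500:          # long parallel runs need extra space
        base += 40
    return base

# Two 120 nm wires running in parallel for 800 nm:
print(required_spacing(120, 120, 800))  # 140 nm, not the "simple" 60
```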

The thought of automation raises the specter, for some designers, that they, or the majority of their functions, will be replaced. However, as we have seen in the digital design world, increased custom design automation will increase the productivity of designers, not replace them. Given that extremely complex projects now need to be completed in the same time and with the same number of people as older, simpler designs, automating the custom design process to increase designer productivity is the only way to manage multi-thousand-gate designs.

Read the rest of What Will It Take for Next-generation Routing to Meet the Needs of the Most Advanced Process Nodes and Beyond?

DAC Tutorials: Get a shot in the arm for your job skills

 
May 18th, 2010 by Robert Jones, Sr. Principal Engineer at Intel

Hi, I’m Robert Jones, the DAC 2010 Tutorials Chair. The executive and tutorials committees have been working for almost a year on this summer’s tutorial program. I’m excited about the topics; I hope that you will find them as compelling as I do. The speakers are all domain experts; the speaker lists read like a “Who’s Who” in the respective areas.

This year, we will offer seven tutorials: four full-day, two half-day, and one two-hour. The topics are timely and relevant: ESL design and prototyping, low power, SystemC for multiple domains, analog circuit design, and 3D ICs. Two of the tutorials cover topics outside of EDA technology: marketing and software development. Each tutorial will follow the DAC tradition of providing clear, informative education from domain experts.

Monday offers two full-day tutorials. The first, ESL Design and Virtual Prototyping of MPSoCs, will provide a comprehensive introduction to the fast-growing world of Electronic System Level (abstract) design. Attendees will learn about current ESL design techniques and future trends, and will participate in exercises via a live CD distributed to all participants.

The second Monday tutorial, Low Power from A to Z, covers one of today’s hottest (pun intended) issues in design and design automation. This tutorial will provide a comprehensive overview of low-power approaches at all levels of the design process, from process technology to system architecture.

Two tutorials will cover important topics for EDA professionals that are not usually part of a standard electrical engineering curriculum. On Monday morning, the two-hour tutorial Marketing of Technology – The Last Critical Step will teach technically trained professionals about go-to-market strategies and planning.

Read the rest of DAC Tutorials: Get a shot in the arm for your job skills

(How to) Train Your DAC Dragon – Pavilion Panels

 
May 12th, 2010 by Sabina Burns and Yatin Trivedi

Believe it or not, it is time for another DAC. Yes, the 47th DAC will be upon us in just under 6 weeks. The Pavilion Panels Committee has worked hard over the past four months and we think we have a great program to offer in Anaheim, CA.

If you are a long-time DAC attendee, you know the routine – Gary Smith opens the Pavilion festivities on Monday with his list of “What’s Hot at DAC.” On Tuesday, Jim Hogan will bring “Hogan’s Heroes” to the podium to talk about the design and lithography challenges at 22nm. On Wednesday, veteran EDA venture capitalist Lucio Lanza will give a litmus test to four start-ups about starting, surviving, and thriving in the EDA marketplace. The stalwart sessions also include an interview by Peggy Aycinena with the winner of the prestigious Marie R. Pistilli Women in Design Automation Award; an EDA Heritage session with previous Phil Kaufman Award winners Prof. Randy Bryant and Dr. Phil Moorby, who will discuss the impact of commercializing their inventions; and Kathryn Kranen hosting the ever-popular session in which high school students tell the experienced audience “You Don’t Know Jack!” about how they use the latest tech gadgets and what they expect to be using in two to three years.

… and that’s just the regulars. In the new and exciting category, this year we’ve added “Everyone Loves Product Teardowns.” On Tuesday and Wednesday, right after lunch, you’ll get to see live teardowns of two products sponsored by IP vendors ARM and Virage Logic. The in-depth look at the newest products on the market is sure to satisfy your technical appetite, and you can’t afford to miss the session: the same product (a new one in original packaging, of course) will be given away in a prize drawing. You know those pesky auditors require you to be present to win …

This year’s Pavilion Panel Program will also feature a larger number of User Panelists (versus EDA Vendor Panelists) than ever before. Look for User Panelists to speak on Outsourcing Challenges, Analog Interoperability, SoC Verification, Multi-core Multi-OS Applications and SPICE Flavors. Other exciting panels in the lineup include: Career/Job Outlook, IP Commercialization, the 28nm Ecosystem and FPGA Business Opportunities. Keep your search engines tuned to look for information on each of these panels.

That’s how you train your DAC schedule for the right pavilion panels.

This year’s pavilion events will be more exciting and richer in experience than ever before. The Pavilion Panels Committee has done its part, doing a thoroughly great job of reviewing submissions, selecting topics, and finding the most knowledgeable moderators and panelists to appeal to all attendees. It’s been a challenging task, and we look forward to your verdict at DAC. Of course, we are confident you will learn from and enjoy these sessions.

One final word: Just in case you don’t know what or where the DAC Pavilion is, look for signs at the Exhibit Hall entrance. The booth number is 694, located towards the back wall next to the Synopsys booth.  We look forward to seeing you there!

EDA 2010: The Year of “Less is More”

 
February 1st, 2010 by John Zuk, VP of Marketing & Strategy at Tanner EDA

While many of us are just getting used to writing “2010” on our documents and personal checks, it’s clear that the economic impact of 2009 will not be forgotten any time soon.

The consensus across diverse constituencies – ranging from world leaders to industry heads and many leading economists – is clear. We are not simply recovering from a cyclical recession; we are entering a Global Economic Reset. While this Reset creates challenges for balancing our labor forces and manufacturing capacity, it provides a real opportunity for electronic design automation (EDA) providers to demonstrate the intrinsic value of our technologies.

The semiconductor industry stands on the shoulders of its EDA tool providers and we must deliver the innovations and productivity necessary to feed and nurture the designers that have come to rely on us. It is only through this combination of innovation and productivity that we can provide the sustained value that will serve as the growth catalyst our broader ecosystem thrives on.

Innovation without context is irrelevant, however, so it’s essential that we deliver technology and capability in a manner that can be applied and exploited by the intended user. Through our interactions and discussions with designers, we consistently hear that many EDA design tools have exceeded the core requirements for a majority of the user base. In fact, just last week one of our customers referred to their usage of a “big three” vendor tool as “firing up the space shuttle to go to the corner store for milk.” This excess is understandable, as the market leaders are driven by the most extreme requirements for their (niche) user base working in the smallest geometries with unique needs. What’s tragic is that these cumbersome, overburdened tool flows have become the acceptable paradigm for the entire industry. The result is an ever-increasing gap between the requirements of most users and the features and functionality provided by the market-leading tools.

We believe 2010 is ripe for a new paradigm – one where “less is more”: an approach to tool design that delivers just the right mix of top-notch features and functionality, squarely aligned with user requirements. This concept of elegant, efficient design is what John Tanner embraced when he founded Tanner EDA twenty-two years ago, and it’s an approach that we believe is not only relevant but imperative today.

Delivering on “less is more” is difficult. Anyone who has tried to distill a presentation to one slide or simplify a complex design knows that it requires more than just skill. One must achieve a deep level of understanding in order to get to the essence of the topic or issue. For EDA products, we think functionality will not mean superfluous features but excellent, tested, and well-supported solutions. This cannot be achieved in a vacuum; it requires leveraging users, partners, and even competitors. We believe this leverage – achieved through models such as “open innovation” (a term originally coined by Professor Henry Chesbrough) – is essential to achieving and sustaining “less is more.”

The open innovation business model offers a compelling framework for consideration in the EDA industry. With a core principle that ideas and intellectual capital can come from outside the traditional boundaries and connections, open innovation can bring new capabilities and technologies efficiently and effectively. Companies in the broader EDA ecosystem (such as Qualcomm) have already embraced open innovation as a means of effectively bolstering their innovation capacity and effectiveness.

While perhaps not a traditional example of open innovation, process design kits (PDKs) offer a congruent model for connecting technologies and intellectual property (IP) from one domain (IC fabrication) to another (design). One perspective on PDKs is that they are simply rule-sets that provide all users with a consistent base of information, effectively eroding the opportunity for differentiation within a design. Further consideration, however, reveals several other dimensions to PDKs where unique IP can be inserted for sustained differentiation.

One such dimension is PDK selection: simply identifying and applying technologies within and across foundries. A working example of this is Tanner’s recent collaboration with Sound Design Technologies (SDT). SDT and Tanner are launching a PDK to allow users to include advanced integrated passives technologies and 3D chip packaging capabilities in their designs. This offers the potential for substantial space savings as well as production and operating cost reductions. The other dimension here is access – where a tool vendor’s use of standard programming languages and opening of PDKs can provide a designer with the access and opportunity to customize and create differentiation.

Productivity is not exclusive of innovation; in fact, we believe that in this new era of doing more with less, designers will require more productivity from their EDA tools if they are to achieve the breakthroughs demanded by their customers. We believe that 2010 will see productivity requirements expand beyond the basics of performance, security, and capability. Significant advances in design environments and analog automation will achieve prominence, and designers will be able to use more efficient, focused tools to deliver profound breakthroughs for business and society.

Practical Methodologies for Power/Signal Integrity of Chip-Package-Board Designs – An Industry-Focused Workshop at DesignCon 2010

 
January 22nd, 2010 by Bhavana Thudi

Is there a disconnect in your die, package and board design methodology? As technologies evolve to meet demands of higher performance, smaller size and lower cost, there are several challenges in the design of chips, packages and boards which must be addressed with an integrated analysis and verification methodology.

For example, maintaining power integrity means ensuring that the entire power delivery from the voltage regulator on the PCB to the transistor on the die meets the device power supply requirement. This involves designing and optimizing the location of the voltage regulator, PCB and package de-coupling capacitors, power plane impedance, bump placement, on-die power grid, on-die de-coupling capacitors and switching transients in one design and analysis environment. Sufficient data sharing needs to happen between each of the design teams to ensure that the final working part delivers the supply current as needed by the chip within the specified voltage fluctuation limit.
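The budget behind that statement can be sketched with the classic target-impedance estimate, Z_target = (Vdd × allowed ripple fraction) / transient current. The Python below uses illustrative numbers, not figures from the workshop.

```python
# Back-of-the-envelope PDN budget. Example values are invented.

def target_impedance(vdd, ripple_fraction, i_transient):
    """Max PDN impedance (ohms) that keeps supply noise within the ripple budget."""
    return (vdd * ripple_fraction) / i_transient

vdd = 1.0      # supply voltage (V)
ripple = 0.05  # 5% allowed supply fluctuation
i_step = 10.0  # transient switching current (A)

z = target_impedance(vdd, ripple, i_step)
print(f"PDN must stay below {z * 1000:.1f} milliohms across the band of interest")
# 5.0 milliohms -- which is why decoupling is staged from the regulator
# through board and package capacitors down to on-die capacitance.
```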

On the other hand, high-speed memory and serial interfaces have very stringent requirements for simultaneous switching noise (SSN), resulting from their need to maintain the fidelity of transmitted and received signals. Simultaneous consideration of the IO ring design, IO and decoupling-capacitor cell placement, input switching patterns, and package/board power and signal layouts is necessary to meet near-end and far-end SSN goals.

However, as these challenges become increasingly critical to the success of the next generation of designs and systems, there is a singular lack of tools and methodologies to address them. Multiple disparate techniques exist, with contention about the efficacy of each. Tool-sets exist but address only parts of the problem. For example, frequency-domain analysis tools employing fast electromagnetic field solvers have looked at mid- and low-frequency power delivery network noise; time-domain analysis tools are therefore needed to solve for the high-frequency noise that results from the switching of devices on the chip. But these time-domain tools need accurate models of the package and board to produce realistic on-die voltage and current waveforms. Similarly, for IO signal integrity analysis, most methods compromise either on modeling sophistication or on data inclusiveness to generate results in a reasonable time-frame: for example, either the entire IO bank is not considered in the simulation, or the coupling between the signal and power/ground networks is ignored. These trade-offs, however, impact the quality of results that are critical in determining whether chip-to-chip transmission will meet specifications.

Apache is sponsoring a workshop at DesignCon where several industry experts from semiconductor and system design houses – including Larry Smith from Altera, Jim Antonelis from Broadcom, Rick Brooks from Cisco, and Dr. Souvik Mukherjee from Texas Instruments – will come together to discuss their understanding of these challenges, present the approaches they are taking, and outline their needs for tools and methodologies. A panel discussion will also foster debate on techniques for chip model creation, package and board extraction tools, and co-simulation methodologies. Presentations addressing both power and signal integrity will share perspectives on system modeling, extraction and simulation using EM tools, methods for accurate and distributed modeling of the IC, and techniques for performing time-domain and frequency-domain simulations. The topics will include practical methods for designing and evaluating system performance, as well as how to build confidence in these methods. The workshop will be an open forum to share insights, discuss issues, and present proven techniques; such an open exchange of ideas and information from semiconductor and system companies will help define the content of future technologies for chip and package modeling and system-level verification of power and signal integrity.

I hope you will join us at the workshop, “Practical Methodologies for Power/Signal Integrity of Chip-Package-Board Designs,” held from 9 am to noon on Thursday, February 4th, at the DesignCon conference in the Santa Clara Convention Center.

For more information on the workshop, please visit http://www.designcon.com/2010/attendees/th_th1/index.asp

You can also register for a complimentary Exhibit PLUS pass to DesignCon for entry to the workshop: http://www.apache-da.com/apache-da/Home/NewsandEvents/Events.html



