 Real Talk

Archive for January, 2011

EDA Innovation

Monday, January 31st, 2011

I recently came across this quote from Robert Noyce:  “Optimism is an essential ingredient for innovation.  How else can the individual welcome change over security, adventure over staying in safe places?” 

Noyce knew a thing or two about innovation and the alchemy to create it.  The “Mayor of Silicon Valley” co-founded Fairchild Semiconductor and Intel, and is credited, along with Jack Kilby, with inventing the integrated circuit.  He had both an impressive career and an impressive grasp on innovation.

Armed with this quote, Bob Noyce as a role model and a bit of innovative thinking, I went looking for innovation in EDA.  I’m happy to report that I found it, starting with many of the recipients of the Phil Kaufman Award.  Kaufman, who died in 1992 while on a business trip in Japan, was a creative and innovative force within the areas of hardware, software, semiconductors, EDA and computer architecture.  He was CEO of Quickturn Design Systems, now part of Cadence, and accelerated the use of emulation.  It’s easy to understand why a prestigious industry award carries his name.

The emulation and verification space is one segment of EDA that creates unlimited opportunities for innovative types.  The founders of my company EVE, for example, boldly redesigned the architecture of a hardware emulation platform and, in my humble opinion, transformed a market segment. 

Real Intent is another good example.  Formal verification is a hard and complicated problem.  That didn’t appear to deter Real Intent’s founders who pressed on and devised an innovative approach that makes the lives of many verification engineers much easier.

Entrepreneurial Rajeev Madhavan concluded in the late 1990s that synthesis needed to be linked with physical design.  He and his innovative team at Magma introduced the first physical synthesis and rocked the industry.  And, with Madhavan still at the helm, Magma is still innovating today.  More recently, the team at Oasys Design Systems introduced a new synthesis methodology known as Chip Synthesis, enabling designers to synthesize full chips rather than just blocks.  That technology, too, is rocking the industry.

Over in Alameda, Calif., Verific Design Automation has taken the mundane task of developing hardware description language parsers and elaborators and built it into a successful business.  In the meantime, these tools have become the industry’s de facto standard front-end software for just about every imaginable EDA and FPGA company.  This is innovative thinking at its greatest.

Of course, anyone who has been in EDA for a while can point to pockets of tremendous optimism and enthusiasm that resonate throughout the industry.  Who needs security or a safe place when there's a big adventure with an innovative and entrepreneurial big thinker just waiting for you in the Silicon Valley office complex next door?

We’re heading into DVCon later this month and DAC in June where we will see many more examples of creative thinking, enthusiasm and optimism in EDA.  I am looking forward to being wowed.

Top 3 Reasons Why Designers Switch to Meridian CDC from Real Intent

Monday, January 24th, 2011

In a meeting last week with a potential customer, I jotted down the following notes on their experience using another company’s CDC (clock domain crossing) tool:

· The designs were 300K–2M gates with 10 clock domains
· They had lots of issues reading the design in, and had to write a bunch of wrappers to get the VHDL through
· It took 10-15 days to set up each module for CDC analysis
· FIFOs were not recognized by the tool
· Many useless messages were hiding real design problems
· They found 4 bugs in the CDC tool itself
· After a long struggle, they could not verify CDC successfully on any of their designs
· That's why they are talking to Real Intent

In fact, I hear this kind of story at every company I visit. If this sounds like the painful experience you are having with your current CDC tool, read on, because CDC analysis can be much simpler with the right solution. That's what customers who have switched to Real Intent are telling us! Check out the latest newsletter to see our user survey results.

Here are the top 3 reasons why companies are switching to Meridian CDC:

1) Ease of setup – Setup is a very time-consuming step in our main competitor's flow, taking almost 80% of the overall effort according to customers. Unfortunately, when you have garbage going in, you get garbage out, so you have to spend a lot of time setting up the other tool just to get somewhat meaningful information out of it. One of my engineer friends recently told me that you almost have to know the answer before setting up the tool in order to get results – WOW! Productivity goes out the door.

 

Meridian CDC, based on Real Intent’s years of experience in understanding designers’ intent, can automatically extract the clock/reset/constant/static intent from the design or the SDC file to ensure proper setup. 90% of setup is done for you automatically! You’ll be getting real results from Meridian CDC while other engineers are still figuring out how to set up the competing CDC tool.

 

2) Noise – This is primarily a consequence of poor setup. Since setup takes so much painful work and time with the other tool, designers under schedule pressure often have no choice but to forge ahead to the CDC analysis stage without complete setup, just so some progress and results can be shown to management. However, finding bugs in the mountain of spurious messages is a formidable task. Many veteran CDC tool users have waded through tens of thousands of messages before giving up on the analysis altogether. This is a recurring theme I hear in my meetings!

Why is Meridian CDC better? Because of three underlying principles built into the tool: 1) Meridian CDC invests a lot of effort up front to automatically create the proper setup for users, so their manual effort is minimized; users are then much more willing to invest the remaining 10% of the effort to ensure complete setup. 2) Meridian CDC provides comprehensive feedback on user setup so refinement can be done easily. 3) Meridian CDC analysis is smarter about reporting the root causes of problems, not the many symptoms of problems. As a result, quality and accuracy of results are easily achieved!

3) Performance – Have you waited days to get CDC results? Wait no more! Meridian CDC is on average 10X faster than the competition. Finish your project early and take a vacation!

 

4) Coverage – Oops, this is the fourth one! Well, at least you might expect good coverage from our competition when they report tens of thousands of messages. Not so. Aside from false positives, they also produce a great deal of false negatives, or missed issues. And there is nothing worse than a chip re-spin caused by a problem that verification never caught.

Meridian CDC offers a layered approach to CDC signoff to make sure no stone is left unturned in finding sneaky CDC bugs and guaranteeing CDC-safe designs. Following Meridian CDC's recommended methodology, you can rest assured that no CDC bugs will make it to silicon!

The bottom line – Doing CDC verification takes a Real CDC tool architected to do the job, not a linter adapted to do CDC work. Perhaps that was OK 8-10 years ago, when a meager linter could do the job of finding possible clock crossings in a small design with 10-20 clock domains. However, today's multi-million-gate designs may have 100+ clocks and several layers of hierarchy. Using a linter on these is like playing tennis with a ping-pong paddle.
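If you're newer to the problem, here is a minimal, illustrative SystemVerilog sketch (my own, not Real Intent code, with made-up signal names) of what a CDC tool has to recognize: one control bit crossing into an unrelated clock domain through a proper two-flop synchronizer, next to a raw multi-bit crossing that should be flagged.

```systemverilog
// Illustrative sketch only -- not Real Intent code. Two crossings from an
// asynchronous clk_a domain into the clk_b domain: one properly synchronized,
// one that a CDC tool should report.
module cdc_example (
  input  logic       clk_a,
  input  logic       clk_b,         // asynchronous to clk_a
  input  logic       rst_b_n,       // async reset for the clk_b domain
  input  logic       flag_a,        // single-bit control generated in clk_a
  input  logic [7:0] data_a,        // multi-bit bus generated in clk_a
  output logic       flag_b_sync,   // safe: double-flop synchronized
  output logic [7:0] data_b_raw     // unsafe: raw crossing, should be flagged
);
  logic flag_meta;

  // Two-flop synchronizer: the first flop may go metastable; the second
  // gives it a full clk_b cycle to resolve before the value is used.
  always_ff @(posedge clk_b or negedge rst_b_n)
    if (!rst_b_n) {flag_b_sync, flag_meta} <= '0;
    else          {flag_b_sync, flag_meta} <= {flag_meta, flag_a};

  // Unsynchronized multi-bit crossing: individual bits can be captured in
  // different clk_b cycles. The usual fixes are a qualified (MUX-recirculation)
  // transfer, gray coding, or an asynchronous FIFO.
  always_ff @(posedge clk_b)
    data_b_raw <= data_a;
endmodule
```

Much of a CDC tool's setup work amounts to telling it which clocks are related and which crossings, like data_b_raw above, need a closer look.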

If it is painful setting up your CDC tool, if your CDC analysis takes a long time to finish, and if you are tired of weeding through tens of thousands of messages to find bugs, it is time to look at Meridian CDC! Many customers have done so successfully as evidenced by Real Intent’s rapid growth in 2010 (watch out for the press release coming out this week). So why not YOU?

Hot Topics, Hot Food, and Hot Prize

Monday, January 17th, 2011

February in Tokyo is one of the coldest months of the year, with an average high of 48°F and a low of 40°F. To warm things up, Real Intent teamed up with SpringSoft, NextOp and Maxeler Technologies to offer a joint seminar at the University of Tokyo's VLSI Design and Education Center on Feb. 2, 2011, on “Hot topics in high-performance designs and their functional verification & debug.” The seminar features technical discussions by industry experts on problems and solutions in many hot verification areas:

10:00am Keynote: Acceleration of Verification and Verification of Acceleration

Oskar Mencer (CEO, Maxeler Technologies)

Acceleration and verification are mutually important. Acceleration of individual computer applications via special hardware/software extensions “benefits” from verification, i.e., making sure that the accelerated application still produces the correct result for all relevant input patterns. At the same time, verification can take a lot of time if there are very many such relevant inputs, and as a consequence acceleration is of key value. Maxeler provides acceleration solutions, and we encounter a range of verification approaches depending on the domain and people involved. In addition, the talk will show an instance-specific approach to accelerating key verification algorithms, such as SAT, using FPGAs.

                                                 

10:40am Presentation: Chasing X’s Between RTL and Gate Level Efficiently

Pranav Ashar (CTO, Real Intent)

Designers must ensure that their gate-level netlist produces the same results as RTL simulation. X-propagation is a major cause of differences between gate-level and RTL functionality. It is a painful and time-consuming process to identify X sources and chase their propagation from RTL to gates. Logical equivalence checkers ignore X-propagation, and gate-level simulations are very slow. Such “X-prop” issues often lead to dangerous masking of real bugs. This presentation explains the common sources of X's, shows how they can mask real bugs that affect functionality and why they are difficult to avoid. It also discusses the challenges that Real Intent overcame in developing a unique and efficient solution to assist designers in catching bugs caused by X-propagation.
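As a hedged illustration of the talk's topic (my own sketch, not an example from the presentation), here is one of the most common X sources, a register with no reset, together with the X-optimism that can hide it in RTL simulation:

```systemverilog
// Illustrative only: one common X source (a register with no reset) and the
// X-optimism that can mask it in RTL simulation.
module x_prop_example (
  input  logic       clk,
  input  logic [1:0] sel,
  output logic       y
);
  logic mode;   // never reset: starts out as X in RTL simulation

  always_ff @(posedge clk)
    if (sel == 2'b11) mode <= 1'b1;   // assigned only under one condition

  // X-optimism: when "mode" is X, RTL simulation still takes the else branch,
  // while the synthesized gates resolve "mode" to a real 0 or 1. The RTL run
  // and the gate-level run can therefore disagree, and a real bug can be masked.
  always_comb
    if (mode) y = sel[0];
    else      y = 1'b0;
endmodule
```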

 

 

11:20am Presentation: Getting You Closer to Verification Closure

Bindesh Patel (Technology manager, SpringSoft)

Today’s leading-edge designs are verified by sophisticated and diverse verification environments, the complexity of which often rivals or exceeds that of the design itself. Despite advancements in the area of stimulus generation and coverage, existing techniques provide no comprehensive, objective measurement of the quality of your verification environment. They do not tell you how good your testbench is at propagating the effects of bugs to observable outputs or detecting the presence of bugs. The result is that decisions about when you are “done” verifying are often based on partial data or “gut feel” assessments. These shortcomings have led to the development of a new approach, known as Functional Qualification, which provides an objective measure of the quality of your verification environment and guidance on how to improve it. If used effectively, Functional Qualification can help you in the early stages of verification environment development. This seminar provides background information on mutation-based techniques – the technology behind Functional Qualification – and how they are applied to assess the quality of your verification environment. We’ll discuss the problems and weaknesses that Functional Qualification exposes and how they translate into fixes and improvements that give you more confidence in the effectiveness of your verification efforts.
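To make the mutation idea concrete, here is a rough sketch of my own (the module and signal names are hypothetical, and this is not SpringSoft's actual fault model): a Functional Qualification run plants a small artificial bug in otherwise-correct RTL and then asks whether the existing environment activates it, propagates it to an observable output, and fails a check.

```systemverilog
// Illustrative only: the kind of small artificial bug a mutation-based
// Functional Qualification tool injects to grade a verification environment.
module arbiter_cell (
  input  logic request,
  input  logic enable,
  output logic grant
);
  // Original RTL: grant only when a request is present AND the cell is enabled.
  assign grant = request && enable;

  // A typical injected mutation would replace the line above with:
  //   assign grant = request || enable;
  // If the full regression still passes with that change, the environment never
  // activated the faulty condition, never propagated the wrong grant to an
  // observable output, or had no checker watching it -- a verification hole.
endmodule
```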

 

2:10pm Presentation: Assertion Synthesis: Enabling Assertion-Based Verification For Simulation, Formal and Emulation Flows

Yunshan Zhu (CEO, NextOp)

Assertion-based verification (ABV) helps design and verification teams accelerate verification sign-off by enhancing RTL and test specifications with assertions and functional coverage properties. The effectiveness of ABV methodology has been limited by the manual process of creating adequate assertions. Assertion synthesis leverages RTL and testbench to automatically create high quality functional assertions and coverage properties, and therefore removes the bottleneck of ABV adoption. The synthesized properties can be seamlessly integrated in simulation, formal and emulation flows to find bugs, identify coverage holes and improve verification observability.
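As a rough illustration (my own sketch, not NextOp's output format, with hypothetical signal names), the properties such a flow produces are ordinary SVA assertions and cover properties that can run unchanged in simulation, formal and emulation:

```systemverilog
// Illustrative only: the flavor of properties an assertion-synthesis flow
// might produce from observed RTL and testbench behavior.
module req_gnt_props (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic gnt
);
  // Assertion: every request is granted within 1 to 4 cycles.
  a_req_granted: assert property (@(posedge clk) disable iff (!rst_n)
                                  req |-> ##[1:4] gnt);

  // Coverage: a new request arriving in the cycle right after a grant.
  c_back_to_back: cover property (@(posedge clk) disable iff (!rst_n)
                                  gnt ##1 req);
endmodule

// The checker could then be attached to the design with a bind, e.g.:
//   bind arbiter req_gnt_props u_props (.clk(clk), .rst_n(rst_n), .req(req), .gnt(gnt));
```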

 

3:00pm Presentation: SystemVerilog Testbench – Innovative Efficiencies for Understanding Your Testbench Behavior

Bindesh Patel (Technology manager, SpringSoft)

The adoption of SystemVerilog as the core of a modern constrained-random verification environment is ever-increasing. The automation and the sophisticated stimulus and checking capabilities are a large reason why. The supporting standard libraries and methodologies that have emerged have made the case for adoption even stronger, and all the major simulators now support the language nearly 100%. A major consideration in verification is debugging, and naturally debug tools have to extend and innovate around the language. Because the language is object-oriented and more software-like, the standard techniques that have helped with HDL-based debug no longer apply. For example, event-based signal dumping provides unlimited visibility into the behavior of an HDL-based environment; unfortunately, such straightforward dumping is not exactly meaningful for SystemVerilog testbenches. Innovation is necessary. This seminar will discuss the use of message logging and how to leverage the transactional nature of OVM- and UVM-based SystemVerilog testbenches to automatically record transaction data. We'll show you how this data can be viewed in a waveform or a sequence diagram to give you a clearer picture of the functional behavior of the testbench. For more detailed visibility into testbench execution, we will also discuss emerging technologies that allow you to dump dynamic object data and view it in innovative ways, as well as using this same data to drive other applications such as simulation-free virtual interactive capability.
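As a hedged sketch of the message-logging idea, using only standard UVM constructs rather than any SpringSoft-specific API (the class and signal names are hypothetical), a monitor can log each observed transaction and publish it on an analysis port, giving debug tools a transaction-level stream to record and display:

```systemverilog
// Minimal sketch using standard UVM constructs: a monitor logs each observed
// transaction and publishes it for scoreboards and transaction-recording tools.
import uvm_pkg::*;
`include "uvm_macros.svh"

class bus_txn extends uvm_sequence_item;
  rand bit [31:0] addr;
  rand bit [31:0] data;
  `uvm_object_utils(bus_txn)
  function new(string name = "bus_txn");
    super.new(name);
  endfunction
endclass

class bus_monitor extends uvm_monitor;
  `uvm_component_utils(bus_monitor)
  uvm_analysis_port #(bus_txn) ap;

  function new(string name, uvm_component parent);
    super.new(name, parent);
    ap = new("ap", this);
  endfunction

  // Called by the interface-sampling logic (omitted in this sketch) once per
  // observed bus cycle: log the transaction, then publish it on the analysis port.
  function void publish(bus_txn t);
    `uvm_info("BUS_MON",
              $sformatf("observed txn addr=0x%0h data=0x%0h", t.addr, t.data),
              UVM_MEDIUM)
    ap.write(t);
  endfunction
endclass
```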

 

3:40pm Presentation: What do you need to know for effective CDC verification?

Pranav Ashar (CTO, Real Intent)

The complexity of clock architecture is growing with larger designs. Functionality that was traditionally distributed among multiple chips is now integrated into a single chip. As a result, the number of clock domains is increasing, and clock domain crossing (CDC) verification has become increasingly important and complex. For CDC analysis tools to be effective, designers and verification engineers must have a good knowledge of the design's clock/reset architecture so that complete and accurate constraints can be provided to the tools. This knowledge also helps them interpret CDC analysis results meaningfully and efficiently. This seminar discusses what designers and verification engineers need to know in order to perform effective CDC verification.

 

Demo and poster sessions showcasing each company's technology will start at 4:20pm. A dinner reception with hot food will be served from 5 to 8pm, and a drawing for a hot prize, an iPad, will be held at the end.  Click here for more information and free registration. Hope to see you there, and stay warm in Tokyo!

Satisfaction EDA Style!

Monday, January 10th, 2011

EVE's founder and CEO Luc Burgun took home the spoils at DAC last June with his winning performance in EDA360 Idol, the industry's top talent show, during the Cadence/Denali Party.  Besting four other contestants, Luc delighted partygoers by performing the Rolling Stones classic “(I Can't Get No) Satisfaction,” with lyrics rewritten to appeal to DAC attendees.

Luc had some fun with this.  His rewritten refrain laments, “I can’t get no satisfaction, I can’t get no bug reaction,” which makes you wonder if the lyrics played a significant role in his win.  After all, we’ve all heard verification engineers complain about the tools they have at hand and the amount of time verification takes out of the project budget. 

Let’s ask the judges.  “At the Denali Finale, all performers were exceptional,” says Judge Simon Davidmann, president and CEO of Imperas.  “Luc stood out for his stage presence, singing ability and a well-chosen song with lyrics everyone associated with EDA can relate to.  His guitar playing was pretty good, too.”

Judge Dennis Brophy, director of strategic business development at Mentor Graphics Corporation, weighs in with:  “Despite formidable competition, Luc Burgun showed us he really knows how to rock out.  His rendition of ‘Satisfaction’ told us that successful transactions are indeed the key to satisfaction!”

In another stanza, Luc sings, “When I’m drivin’ in my car, When EDA man comes on the radio, He’s tellin’ me more and more, About some useless simulation, Supposed to fire design acceleration.” Useless simulation?  Fire design acceleration?  Well, in the real world, we would never advocate that because each verification tool serves a purpose and works on a specific problem.  Real Intent’s verification solutions, for example, use innovative formal techniques in an easy-to-use methodology, solving critical problems with comprehensive error detection.

And, of course, Luc advocates the use of hardware emulation as a solution.  “Well, I’m doin’ billion cycles, And I’m tryin’ this and I’m trying that, And I’m tryin’ to find the weak bug kink, When boss says get emulation later next week, ‘Cause you see I’m on losing streak.”  After all, a new generation of hardware emulators, including EVE’s ZeBu, can handle a billion ASIC gates and offers flexible support for hardware verification, software development, and hardware/software co-verification across multiple SoC applications.  That should give some satisfaction!

In case you missed his performance, you can view it here:  http://www.youtube.com/watch?v=8SBrDnj0nc0

Are you curious about the rewritten lyrics?  Here they are:

Satisfaction EDA Style

 

I can’t get no satisfaction
I can’t get no bug reaction

‘Cause I try and I try and I try and I try
I can’t get no, I can’t get no

When I’m drivin’ in my car

When EDA man comes on the radio
He’s tellin’ me more and more
About some useless simulation
Supposed to fire design acceleration
I can’t get no, oh no no no
Hey hey hey, that’s what I say

I can’t get no satisfaction
I can’t get no bug reaction

‘Cause I try and I try and I try and I try
I can’t get no, I can’t get no

When I’m workin’ my SoC

And Moore’s Law tells me
How fast my chips can be
But he can’t be a chip jock ‘cause he don’t use
The same ver’fication as me
I can’t get no, oh no no no
Hey hey hey, that’s what I say

I can’t get no satisfaction
I can’t get no bug reaction

‘Cause I try and I try and I try and I try
I can’t get no, I can’t get no

Well, I’m doin’ billion cycles
And I’m tryin’ this and I’m trying that

And I’m tryin’ to find the weak bug kink
When boss says get emulation later next week
‘Cause you see I’m on losing streak
I can’t get no, oh no no no
Hey hey hey, that’s what I say

I can’t get no, I can’t get no
I can’t get no satisfaction
No bug reaction, no satisfaction, no bug reaction

The King is Dead. Long Live the King!

Monday, January 3rd, 2011

The New Paradigm

 

Not long ago, functional simulation and static timing analysis were all there was to RTL verification. In fact, they were all that was needed, because the inner loop of computation and data transfer on a chip was one synchronous block. As chip complexities grew and gate-level simulation became unviable, formal equivalence checking stepped in to pick up the slack, delivering orders-of-magnitude improvements in productivity when comparing gate and RTL representations. But the paradigm remained the same even as the methods changed – verification still needed to cover only the functional input space as comprehensively and efficiently as possible.

Then, somehow, things changed under the hood. Computation on a chip got fragmented, out of necessity and with significant consequences. An illustrative example of this trend is the multicore chip by Tilera, Inc. shown here: a 64-core processor with a number of high-speed interfaces integrated on chip.

 

[Figure: Tile64 Processor Block Diagram]

For one, it has become impractical to send a signal from one end of the chip to the other in one clock cycle, as well as to send the same clock to all parts of the chip with manageable and predictable skew. It is also energy-inefficient and practically impossible to keep raising the clock frequency. Higher performance can increasingly be achieved only with application-specific cores or on-chip parallelism in processors. As a result, computation is being done increasingly in locally synchronous islands that communicate asynchronously with each other on chip. This was predicted some time ago, but it is now truly coming home to roost in the form of heterogeneous and homogeneous multicore chips. With fine-grain fragmentation, communication bandwidths and latencies between the computation islands have come under the design scanner, and the protocols for transferring data and signaling between the islands are beginning to push the limits.
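To make "communicate asynchronously" concrete, here is a minimal, textbook-style sketch (mine, not taken from any particular chip, with made-up names) of the simplest inter-island signaling scheme: a request/acknowledge pair, each double-flop synchronized into the receiving clock domain.

```systemverilog
// Illustrative only: request/acknowledge signaling between two locally
// synchronous islands. The sender holds its data stable until the
// acknowledge returns; each control bit is double-synchronized.
module island_handshake (
  input  logic clk_tx,    // sender island clock
  input  logic clk_rx,    // receiver island clock (asynchronous to clk_tx)
  input  logic req_tx,    // asserted by the sender; held until acknowledged
  output logic req_rx,    // request as seen in the receiver's domain
  input  logic ack_rx,    // asserted by the receiver once data is captured
  output logic ack_tx     // acknowledge as seen back in the sender's domain
);
  logic req_meta, ack_meta;

  // Synchronize the request into the receiver's clock domain
  always_ff @(posedge clk_rx)
    {req_rx, req_meta} <= {req_meta, req_tx};

  // Synchronize the acknowledge back into the sender's clock domain
  always_ff @(posedge clk_tx)
    {ack_tx, ack_meta} <= {ack_meta, ack_rx};
endmodule
```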

 

A second important change is that energy and power optimization is now more aggressive than ever. Beyond parallelism-for-performance and custom cores, this trend has also brought once-arcane design techniques into the mainstream. Each island runs at its optimal frequency, and dynamic control of clocks, clock frequencies and Vdd is now par for the course.
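As a small, hedged illustration of what "dynamic control of clocks" looks like at the RTL level (a generic textbook cell, not any vendor's library), here is a latch-based integrated clock gate:

```systemverilog
// Illustrative only: a latch-based integrated clock gate (ICG). The enable is
// latched while the clock is low so the gated clock never produces a
// truncated pulse.
module clock_gate (
  input  logic clk,      // free-running island clock
  input  logic enable,   // from clock-control / power-management logic
  output logic gclk      // gated clock distributed to the island
);
  logic en_latched;

  always_latch
    if (!clk) en_latched <= enable;

  assign gclk = clk & en_latched;
endmodule
```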

 

Finally, chips are now true systems in that they integrate computation with real-world interfaces to peripherals, sensors, actuators, radios, and you name it. And, these interfaces must talk to the chip’s core logic at their own speeds and per their chosen protocols. Many of these interfaces are also pushing the performance limits of the core logic.

 

An apt analogy is that it is as if chips have transitioned from an orderly two-party political system to an Italian or Indian multi-party system in which the various parties must align with each other at periodic intervals to accomplish something and each party has its own chief whip to get the troops to toe the party line.

 

The implication of this trend for chip verification is that it has gotten messier – one can't cleanly abstract timing from functional analysis any more; the functional space and the timing space must be explored together. Deterministic functional simulation with fixed clock frequencies and delays does not cover all failure modes, and static timing analysis neglects the dynamic, data-dependent nature of the interaction between clock domains in the presence of unrelated clocks and variability. We are still not in a world where we must timing-simulate everything, but the new complexity is daunting nevertheless.
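One way to see why the two spaces must be explored together: a common modeling trick (illustrative only, not necessarily what any particular tool does, with made-up names) is to give a behavioral synchronizer a randomly varying latency, so simulation exercises timing uncertainty that a fixed-delay, deterministic run would never show.

```systemverilog
// Illustrative only: a behavioral synchronizer that randomly adds a cycle of
// latency, modeling metastability resolution time. Fixed-delay deterministic
// simulation never explores this variability.
module sync2_random_latency (
  input  logic clk_dst,   // destination-domain clock
  input  logic d,         // signal arriving from an unrelated clock domain
  output logic q          // synchronized output, 2 or 3 clk_dst cycles later
);
  logic meta, d_prev;

  always_ff @(posedge clk_dst) begin
    d_prev <= d;
    // Randomly pretend the first flop "resolved late" and captured the
    // previous value of d instead of the current one.
    meta   <= ($urandom_range(0, 1)) ? d_prev : d;
    q      <= meta;
  end
endmodule
```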

 

The New Signoff Solution

 

In order to mitigate this complexity, it is essential that the verification tool first decipher design intent to localize the analysis requirements. This exercise also helps make debug more precise. To be sure, this is harder as optimizations get more aggressive – the boundary between computation and interface blurs and designers resort to ever more innovative techniques. Real Intent was prescient in predicting the new verification paradigm many years ago. After much experimentation and interaction with design companies, we have demonstrated that automatic and reliable capture of design intent is indeed viable for clock domain crossings.

 

The design intent step triages the design, finds many types of bugs, and sets up local analysis tasks and models (potentially with special algebras to capture timing and variability effects) for further formal analysis and simulation. I call this the verification four-step: intent extraction, formal analysis, and simulation, all integrated into a systematic, hierarchical approach to analysis and reporting for scalability.

 

[Figure: The new signoff flow]

 

We find from our customers that the special verification requirement for clock domain crossings is now an essential part of the signoff process for all chips. Similar customized signoff is also called for in other contexts like DFT and power optimization for which failures cannot reliably be caught with functional simulation. Effectively, the old paradigm of “functional simulation + static timing analysis” is obsolete and the sign-off flow today looks more like the figure shown above.

 

 

 

 

 
