Gary Smith, a long-time EDA industry analyst, passed away in Flagstaff, Arizona on Friday, July 3, 2015, after a short bout of pneumonia. He died peacefully, surrounded by his family.
Gary Smith in 1963
Gary was from Stockton, CA and graduated from the United States Naval Academy in 1963 with a Bachelor of Science degree in engineering. His class yearbook says: “He managed to maintain an average grade point despite the frantic efforts of the Foreign Language Department. Tuesday nights found Gary carrying his string bass to NA-10 practice.” Gary continued to be a musician and played his electric bass for years with the Full Disclosure Blues band at the Design Automation Conference Denali party with other industry figures. The band started out of a jam session in 2000 with Grant Pierce, who asked Gary to help put together a group for the following DAC. Gary had suggested Aart de Geus as lead guitarist; de Geus ended up giving the band its name.
Gary got into the world of semiconductors in 1969. He had roles at the following companies:
LSI Logic, Design Methodologist (and RTL evangelist), 2 years
Plessey Semiconductor, ASIC Business Unit Manager, 3 years
Signetics, various positions, 7 years
In 1994 he retired from the semiconductor industry and joined Dataquest, a Gartner company, to become an Electronic Design Automation (EDA) analyst. Gary described his retirement this way: “instead of having to worry about Tape Outs and Product Launches, I get to fly around the world and shoot off my big mouth (which I seem to be good at) generally playing the World’s Expert role. Obviously there isn’t much competition. Now if I could only get my ‘retirement’ under sixty hours a week I’d be happy.”
I was an organizer for the industry DAC panel on “Scalable Verification: Evolution or Revolution?” held during the second week of June in San Francisco. While the industry generally agrees on methodologies used to verify IP blocks or subsystems, we lack consensus on approaches required to verify SoC integration and system-level functionality of embedded systems. One of the questions addressed by the panelists was “Can existing standards and methodologies be extended to address system-level challenges, or are new approaches required?”
The panel was moderated by veteran verification technologist Brian Bailey, who did an excellent job steering the panelists through this deep topic. The panelists were all from semiconductor companies (not EDA) and included the following:
Ali Habibi, design verification methodology lead at Qualcomm
Steven Jorgensen, architect for the Hewlett-Packard Networking Provision ASIC Group
Bill Greene, CPU verification manager for ARM
Mark Glasser, verification architect at Nvidia
Brian has written an excellent article on the panelists’ insights in his column on SemiEngineering.com. Here are a few quick snippets to entice you to read the entire piece:
“Simulators are not making effective use of the advances in the underlying hardware. Design sizes are growing faster than the improvements they are making.”
“Design reuse has not helped us, and even if you change only 20% of a design you still have to completely re-verify it. We need to be able to describe features and functionality in an abstract manner, and from that derive the inputs to the verification tools.”
“You might think that we are able to re-use much of our verification collateral from the IP, unit and top levels into the system-level environment, but this isn’t the case. You can’t find new bugs by running stimulus that was used in the past, and this means that the notions of coverage are different.”
You can read the entire column at this link: Wrong Verification Revolution Offered. Be sure to add your comments at the end and let Brian know what you think is missing in verification.
Richard Goering at his 30th DAC, San Francisco in 2014
Richard Goering, the EDA industry’s distinguished reporter and most recently a Cadence blogger, is finally closing his notebook and retiring from the world of EDA writing after 30 years. I can’t think of anyone who is more universally regarded and respected in our industry, even though all he did was report on and analyze industry news and developments.
When Richard left EETimes in 2007, there was universal hand-wringing and distress that we had lost a key part of our industry. John Cooley did a Wiretap post on his DeepChip website with contributions from 20 different executives, analysts and other media heavyweights. Here are just a few quotes that I picked out for this post:
The EDA Consortium and the IEEE Council on EDA are seeking qualified nominations for the 2015 Phil Kaufman Award. The nomination deadline is June 30.
Presented by the EDA Consortium and the IEEE Council on EDA, this prestigious award honors an individual who has had demonstrable impact on the field of electronic design through contributions in Electronic Design Automation.
Additional information on the nomination process is available.
Information on previous Phil Kaufman Award recipients is also available.
This year’s Design Automation Conference in San Francisco was excellent! You don’t have to take my word for it. At the Industry Liaison Committee meeting for DAC exhibitors on Thursday, June 11, the various members were in agreement that show traffic was up and the quality of the customer meetings exceeded expectations. Why is that? It is in large part due to the tremendous efforts of Anne Cerkel, senior director for technology marketing at Mentor Graphics, who was the general chair for the 52nd DAC.
One innovation at this year’s show was opening the exhibit floor at 10 am. This made it easier to see the morning keynotes and gave attendees more flexibility in commuting to the show from around the Bay Area. I think you can expect to see this again at the 53rd DAC in Austin, Texas.
Our two GRID racing car simulators were one reason the show was excellent for Real Intent. We were able to draw a large crowd to our booth. Budding race car drivers could challenge their friends and colleagues to a race and enjoy our license-to-speed verification solutions. A special thank you to Shama Jawaid and the team at OpenText, who were our partners for the license-to-speed promotion.
Here are some quick photos from the show for you to enjoy.
One trend we’re seeing in Asia is the number of FPGA design starts, now counting in the thousands. Getting a functionally correct design is the first goal for designers. It is easy to think that once that is achieved, FPGAs can be shipped in finished products. But that’s not a robust model. For example, we have had customers with failures in the field due to a subtle timing change between FPGA part lots.

Larger FPGA designs have grown in complexity, resulting in an amalgamation of disparate IP that can lead to clock-domain challenges. A robust model for FPGA designs requires advanced signoff tools, a design flow that works easily with Xilinx and Altera tools, and support for high-reliability standards like DO-254. This is where Real Intent’s Meridian and Ascent products excel. For high performance, our CDC and Lint tools provide the confidence design teams need, with unsurpassed verification and sign-off support.
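To give a feel for the clock-domain challenge, here is a toy structural check: given a netlist where each flip-flop is tagged with its clock domain, flag any flop-to-flop connection that crosses domains without going through a synchronizer. This is a simplified illustrative sketch in Python; the data model and the names `Flop` and `find_unsynchronized_crossings` are hypothetical, and this is not how a production CDC tool such as Meridian works internally.

```python
# Toy structural CDC check: flag flop-to-flop edges that cross clock
# domains without landing on a recognized synchronizer flop.
# Hypothetical model for illustration only.
from dataclasses import dataclass, field

@dataclass
class Flop:
    name: str
    clock: str                                  # clock domain of this flop
    fanin: list = field(default_factory=list)   # names of driving flops
    is_sync: bool = False                       # first stage of a synchronizer

def find_unsynchronized_crossings(flops):
    """Return (driver, receiver) pairs that cross domains unsafely."""
    by_name = {f.name: f for f in flops}
    bad = []
    for rx in flops:
        for drv_name in rx.fanin:
            drv = by_name[drv_name]
            # A crossing is safe only if the receiver is a synchronizer.
            if drv.clock != rx.clock and not rx.is_sync:
                bad.append((drv.name, rx.name))
    return bad

if __name__ == "__main__":
    design = [
        Flop("a", clock="clkA"),
        Flop("sync1", clock="clkB", fanin=["a"], is_sync=True),
        Flop("b", clock="clkB", fanin=["sync1"]),
        Flop("c", clock="clkB", fanin=["a"]),   # raw crossing: unsafe
    ]
    print(find_unsynchronized_crossings(design))  # [('a', 'c')]
```

Real CDC analysis must also reason about convergence, glitches, and protocol correctness, which is why the marketing blurb above stresses signoff-grade tools rather than ad hoc scripts like this one.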
Come visit us in Booth #1422 at DAC in San Francisco, June 8-10 to see our latest technical presentations. To choose your technical presentation, click here.
Can’t attend DAC? Check out some of our latest video interviews with Real Intent technologists or email us for a personal presentation to you or your team.
The last two weeks before the Design Automation Conference in San Francisco are a busy time. For us marketeers, it has been called “our Super Bowl.” We want to get the word out that we have something new and important to show visitors at our exhibit booth. But there is more going on, which I will mention after I talk about our booth activities.
Real Intent is number two on the GarySmithEDA What to See @ DAC list. I know why we are number two on the list. But I don’t want to give the secret away. If you know the reason, then please let everyone know in the comments section at the end of the blog.
Here are the quick titles for our technical presentations in our demo suites:
Ascent Lint with 3rd Generation iDebug Platform and DO-254
Meridian CDC for RTL with New 3rd Generation iDebug Platform
Ascent XV with Advanced Gate-level Pessimism Analysis
Accelerate Your RTL Sign-off
Hierarchical CDC Analysis and Reporting for Giga-gate Designs
In stories of the Wild West from the 1800s, the image of a cattle drive is often depicted: a small team of cowboys delivers thousands of head of cattle to market. The cowboys spend many days crossing open land until they reach their destination, one with stockyards to accept their precious herd and a rail station to deliver it quickly to market. Along the way there are dangers, including losses to predators and mad stampedes by cattle rushing blindly when frightened or disturbed. The primary job of the cowboys is to keep the herd on track and settled as they move to ship-out.
I see immediate parallels between the cowboys of the Wild West and today’s system-on-chip (SoC) design and verification engineers. Cowhands struggle to control and move a big herd. Similarly, today’s design teams grapple with how to keep a project on target and converging to tape-out and success when the gate count of SoCs has become so large it can stretch and even overwhelm their ability to stay on track. How big are these new SoCs?
In a recent blog, Does Your Synthesis Code Play Well With Others?, I explored some of the requirements for verifying the quality of the RTL code generated by high-level synthesis (HLS) tools. At a minimum, a state-of-the-art lint tool should be used to ensure that there are no issues with the generated code. Results can be achieved in minutes, if not seconds, for generated blocks.
What else can be done to ensure the quality of the generated RTL code? For functional verification, an autoformal tool, like Real Intent’s Ascent IIV product, can be used to ensure that basic operation is correct. The IIV tool will automatically generate sequences and detect whether incorrect or undesirable behavior can occur. Here is a quick list of what IIV can catch in the generated code:
FSM deadlocks and unreachable states
Bus contention and floating busses
Full- and Parallel-case pragma violations
Constant RTL expressions, nets & state vector bits
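To illustrate the first item on the list, an FSM deadlock or unreachable-state check boils down to reachability analysis over the state graph. The sketch below is a toy illustration in Python, not Ascent IIV’s actual algorithm (a real autoformal tool extracts the FSM from the RTL rather than taking a hand-built graph); the function name `analyze_fsm` and the example FSM are hypothetical.

```python
# Toy FSM check: breadth-first reachability from the reset state finds
# unreachable states; a reachable state with no exit (other than a
# self-loop) is reported as a deadlock. Illustrative sketch only.
from collections import deque

def analyze_fsm(states, transitions, reset_state):
    """transitions: dict mapping state -> set of next states."""
    # BFS from the reset state to find everything reachable.
    reachable = {reset_state}
    work = deque([reset_state])
    while work:
        s = work.popleft()
        for nxt in transitions.get(s, ()):
            if nxt not in reachable:
                reachable.add(nxt)
                work.append(nxt)
    unreachable = set(states) - reachable
    # A reachable state whose only successor is itself can never leave.
    deadlocks = {s for s in reachable
                 if not (transitions.get(s, set()) - {s})}
    return unreachable, deadlocks

if __name__ == "__main__":
    states = {"IDLE", "RUN", "DONE", "DEBUG"}
    transitions = {"IDLE": {"RUN"}, "RUN": {"DONE"}, "DONE": {"DONE"}}
    # DEBUG is unreachable; DONE only loops on itself, so it deadlocks.
    print(analyze_fsm(states, transitions, "IDLE"))
```

The other checks on the list (bus contention, pragma violations, constant expressions) each reduce to a similarly mechanical structural or formal analysis, which is why they suit an automatic tool so well.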
Designers are also concerned about the resettability of their designs and whether they power up into a known good state.
We are at the dawn of a new age of digital verification for SoCs. A fundamental change is underway. We are moving away from a tool and technology approach — “I have a hammer, where are some nails?” — and toward a verification-objective mindset for design sign-off, such as “Does my design achieve reset in two cycles?”
Objective-driven verification at the RT level now is being accomplished using static-verification technologies. Static verification comprises deep semantic analysis (DSA) and formal methods. DSA is about understanding the purpose and intent of logic, flip-flops, state machines, etc. in a design, in the context of the verification objective being addressed. When this understanding is at the core of an EDA tool set, a major part of the sign-off process happens before the use or need of formal analysis.
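A verification objective like “does my design achieve reset in two cycles?” can be pictured as a simple question about state: start every flop at the unknown value X, assert reset, and count cycles until no X remains. The sketch below is a toy three-valued-simulation illustration of that objective, not how any Real Intent tool is implemented; the `next_state` function stands in for a hypothetical design’s RTL.

```python
# Toy "reset in N cycles" objective check: start all flops at X
# (unknown), hold reset asserted, and step until every flop has a
# known value. Illustrative sketch with a hypothetical design model.
X = "X"  # unknown value, as in 3-valued (0/1/X) simulation

def cycles_to_full_reset(next_state, initial, max_cycles=10):
    """Return how many cycles until no flop is X, or None if never."""
    state = dict(initial)
    for cycle in range(1, max_cycles + 1):
        state = next_state(state, reset=True)
        if X not in state.values():
            return cycle
    return None

def next_state(state, reset):
    # Hypothetical two-stage reset: 'ctrl' clears immediately, while
    # 'data' clears only once 'ctrl' is already known.
    new = dict(state)
    if reset:
        new["ctrl"] = 0
        new["data"] = 0 if state["ctrl"] != X else X
    return new

if __name__ == "__main__":
    initial = {"ctrl": X, "data": X}
    print(cycles_to_full_reset(next_state, initial))  # 2
```

Framed this way, the objective (“fully reset within two cycles”) is a yes/no question about the design, which is exactly the mindset shift from tools-first to objective-first verification described above.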