EDACafe Weekly Review June 20th, 2013

In Austin, at the 50th DAC earlier this month, I delivered a poster presentation on “Lending a ‘Formal’ Hand to CDC Verification: A Case Study of Non-Intuitive Failure Signatures”. In this second blog in a series, I discuss a set of failures in a common clock domain crossing synchronizer.

The design used for this case study closely mirrors the CDC synchronization scheme shown earlier in Part 1. Two components of the scheme are represented generically in the schematic below. They are

  1. The logic controlling the loading of the registers in the transmit domain and
  2. The detection logic on the control signals in the receive domain

A designer can use one of many techniques to implement these generic components; the choice of implementation can be treated as a variable for design exploration.

The clock frequencies of the transmit and receive domains (specifically, the ratio they imply) and the relative reset release of the two clock domains can be considered the system variables in this experiment.
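To make the role of the clock-ratio variable concrete, here is a small behavioral sketch in Python. It is illustrative only, not code from the case study, and the function and signal names are invented: a one-cycle control pulse launched in the transmit domain passes through a two-flop synchronizer with rising-edge detection in the receive domain, and is silently lost when the receive clock samples too slowly.

```python
# Illustrative behavioral model (not from the original case study): a two-flop
# synchronizer with rising-edge detection in the receive domain. The receive
# clock period, expressed in transmit-clock cycles, is the ratio variable.

def pulses_detected(tx_pulse_times, rx_period, sim_len):
    """Count rising edges seen by the receive-domain edge detection.

    tx_pulse_times: tx-clock cycles at which a one-cycle control pulse fires.
    rx_period: receive-clock period, in tx-clock cycles (the ratio variable).
    sim_len: simulation length in tx-clock cycles.
    """
    # Build the control signal as driven in the transmit domain.
    ctrl = [0] * sim_len
    for t in tx_pulse_times:
        ctrl[t] = 1

    ff1 = ff2 = prev = 0
    detected = 0
    # Sample the control wire on each receive-clock edge through two flops.
    for rx_edge in range(0, sim_len, rx_period):
        ff1, ff2 = ctrl[rx_edge], ff1          # two-flop synchronizer
        if ff2 == 1 and prev == 0:             # rising-edge detection
            detected += 1
        prev = ff2

    return detected

# When the two clocks run at the same rate, both pulses are caught...
print(pulses_detected([10, 30], rx_period=1, sim_len=50))  # -> 2
# ...but a one-cycle pulse can fall entirely between receive-clock edges.
print(pulses_detected([10, 30], rx_period=4, sim_len=50))  # -> 0
```

This is the kind of non-intuitive failure signature such an experiment can expose: the synchronizer itself is structurally sound, yet the crossing still drops events for some clock ratios.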

Update: IP on the move
June 20, 2013  by Peggy Aycinena

 

Despite their marked contributions to DAC in Austin, the folks in the IP world have not been resting on their laurels, but have continued to generate developments of both a technical and business nature.


Synopsys and OCZ Technology Group announced OCZ “achieved first-pass silicon success” in its newest NAND flash Vector SSD using Synopsys’ DesignWare DDR2/3-Lite PHY, Embedded Memories, STAR Memory System, and Professional Services.

The companies say the OCZ Vector SSD was designed “to deliver superior sustained performance through its new, high-performance Indilinx Barefoot 3 flash controller supporting the SATA-3 protocol. Synopsys’ design consultants worked closely with OCZ’s engineers throughout the implementation of their chip, delivering expertise and advanced methodologies in IP integration, physical design, and physical verification that enabled OCZ to complete their implementation in less than six months.”

 

Great if you were able to attend DAC in Austin this month. Even better if you were able to attend the Monday afternoon Pavilion Panel on the how-and-why of networking for career growth. The topic may seem irrelevant to some of you, but networking sits at the center of successful career development, and it’s definitely not for the faint of heart.

Sashi Obilisetty, Director, R&D at Synopsys, assembled a seasoned panel of experts to discuss the issue – how networking is crucial to professional growth – with the June 3rd panelists including Tufts University Professor and DAC 2014 General Chair Dr. Soha Hassoun, Calibra Consulting President Jan Willis, and Blue Pearl Software VP Kavita Snyder. The panel discussion began with Jan Willis:

Jan Willis – I want to share three perspectives on the issue. First, networking matters a great deal – for changing jobs, for moving into other fields, and for changing your career trajectory. I didn’t realize how much it mattered until I found that 100 percent of my current consulting business is a result of networking.

Second, sponsors are very different from mentors. Sponsors tap you on the shoulder and point out when a job is available that would be good for you going forward. Third, networking is critical, and it’s important to spend time on it. LinkedIn is a wonderful thing, but it offers a false sense of security that you have great connections. If you’re not working at networking, [your network] won’t work for you.

Soha Hassoun – I would like to emphasize that it’s important to network early on in your career. Some people wait until they are at the mid-point in their careers, but that is too late. Whether in academia or industry, it holds true – you need to start early.

As a society entrenched in connectivity, we put a great deal of pressure on our portable electronic devices to provide more and more computing power and capability. Take this blog, for example. As I travel, I am actually writing this post on my smartphone. To write effectively, I need to flip back and forth between PowerPoint, Word, and the Internet while still answering emails and the occasional phone call. The fact that my mobile device handles all of these requests without error is astonishing, given that just a few short years ago this idea was “pie in the sky”. The computational complexity that makes this possible is staggering. What is also staggering is that even more complex designs are being created in ever-shrinking time-to-market windows. How do system and SoC companies remain competitive with these seemingly unrealistic expectations?

There are, of course, myriad answers to that question, but a critical facet is the use of third-party IP. More and more companies must adopt third-party IP so that they can focus their design effort on their core competence. Outsourcing other, proven capabilities to IP providers saves a great deal of time, energy, and money. However, third-party IP also introduces new challenges for interface specification, integration, and verification of SoCs at large scale. If not addressed properly, these challenges can eliminate the productivity gains expected from the use of third-party IP.

We have a winner of the hardware hackathon
June 19, 2013  by Ed Lee

 

Liz here.  I’ve just gotten word from Angel Orrantia of Innopartners, and we have a winner of the 2nd-ever hardware hackathon, mentioned in our earlier post today.

Drumroll please…

 

We’re waiting to hear from SKTA Innopartners LLC director Angel Orrantia on the results of the Open Compute Project hackathon that took place yesterday at the Facebook campus.   Orrantia is one of the judges. We hear that the winner will be announced at the GigaOM Structure conference this afternoon sometime.

What happened at the hackathon?

There were a number of teams, comprising over 50 hackers from Silicon Valley, Singapore, Miami, Boston, Seattle, Virginia, and Texas.

Projects included:

• building an ARM-based system on a chip

• bringing robotics into the datacenter to automate repairs

• building a fast interconnect between ARM boards

• gathering server diagnostic data into a web interface for remote diagnostics

• two projects on car automation:

1. collecting diagnostic data about the car – like speed, fuel consumption, acceleration, etc. – to give people the ability to monitor their driving habits and prevent or avoid accidents

2. designing a headset that measures brainwaves to alert the driver, or a third-party company that can get in touch with the driver

Also, the winners from the last hackathon returned to continue working and expanding on their debug port aggregation hack.

Raiders of the Lost Article
June 18, 2013  by Tom Anderson, VP of Marketing

Back before DAC, I wrote a blog post on the rapid migration of technical information from magazines and catalogs to online-only publication. I addressed the topic from my perspective as a voracious reader of industry news who likes flipping through magazines as a nice break from staring at the screen most of the day. Just for the record, today over lunch I skimmed through the latest hardcopy issues of Information Week, Electronic Design, and IEEE’s Spectrum. But my post also addressed a more serious topic: the evanescence of online technical content.

Futurists would have us believe otherwise: online is supposed to be forever. However, many technical sites are hosted by motivated individuals or organizations who may simply decide one day to stop. Other sites are owned by commercial interests, including publishers, who may fold and take their content with them into the void. Yes, there are organizations trying to capture the ongoing history of the Internet but, in my experience, their retention of desired content is inconsistent at best.

Demystifying Traceability
June 17, 2013  by Louie De Luna

For DO-254 Compliant FPGAs and ASICs

I have been getting a lot of questions from our customers about traceability in the context of DO-254 and airborne FPGAs and ASICs. It seems that several concepts and terms associated with traceability are new to most of us, so I thought I would shed some light in this blog and explain the five basic terms. Also, I have always liked the word “demystify” but never had the chance to use it – so here is my chance.

Traceability – Traceability is the activity that maps all of the design and verification elements back to requirements to ensure that what is being built and tested is based on the requirements. Traceability is the correlation between system requirements, FPGA requirements, conceptual design, HDL design, post-layout design, verification test cases, testbench and test results.

Downstream Traceability – A top-to-bottom reporting activity that shows the mapping or correlation between system requirements, FPGA requirements, HDL design, test cases, testbench and test results. Running a downstream traceability report can expose FPGA requirements that are not implemented by any HDL function or not covered by any test case.

Upstream Traceability – A bottom-to-top reporting activity that shows the mapping or correlation between test results, testbench, test cases, HDL design, FPGA requirements and system requirements. Running an upstream traceability report can expose derived FPGA requirements or unused HDL functions. Tools like Spec-TRACER can also use upstream traceability to expose all of the design and verification elements associated with a FAILED simulation result.
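The downstream and upstream reports described above boil down to queries over a trace map. Here is a hypothetical sketch in Python – the requirement and test-case names are invented for illustration, and real tools such as Spec-TRACER work from far richer data:

```python
# Minimal trace model (hypothetical names): requirements link downstream to
# HDL functions and test cases; test results link back upstream.

# Downstream links: requirement -> implementing HDL functions / covering tests.
req_to_hdl = {
    "REQ-001": {"uart_tx"},
    "REQ-002": {"uart_rx"},
    "REQ-003": set(),            # not implemented by any HDL function
}
req_to_tests = {
    "REQ-001": {"TC-01"},
    "REQ-002": set(),            # not covered by any test case
    "REQ-003": set(),
}
test_results = {"TC-01": "FAILED"}

# Downstream traceability: expose requirements with no implementation or test.
unimplemented = [r for r, funcs in req_to_hdl.items() if not funcs]
uncovered = [r for r, tests in req_to_tests.items() if not tests]

# Upstream traceability: walk from FAILED results back to their requirements.
failed_reqs = [
    r for r, tests in req_to_tests.items()
    if any(test_results.get(t) == "FAILED" for t in tests)
]

print(unimplemented)  # -> ['REQ-003']
print(uncovered)      # -> ['REQ-002', 'REQ-003']
print(failed_reqs)    # -> ['REQ-001']
```

The point of the sketch is only the two directions of the query: downstream finds holes in implementation and coverage, while upstream finds which requirements are implicated by a failing result.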

AWR: Redefining Design
June 17, 2013  by Sherry Hess

When I first learned of NI’s Redefining campaign, I thought… yes, makes perfect sense and fits AWR extremely well. Our company was founded almost 20 years ago on the very idea of redefining design for microwave/RF engineers. We began this mission with the release of our flagship product, Microwave Office®, and have continued our tradition of innovation as the first with a Microsoft Office look and feel, the first to fully embrace the PC as the preferred platform, the first to open our environment to third-party vendor tools through our EM Socket™ interface, the first to offer the eye-catching real-time tuning feature…yup… AWR has been constantly evolving and redefining design with every new technology, product, and partner announcement.

If you look at our innovation timeline (snapshot below), you can see for yourself how we continually work to redefine the tools and technologies our customers require, request, and enjoy and that enable them to achieve design success by first virtually prototyping their MMICs, RF PCBs, RFICs, microwave modules, communication systems, radar systems, antennas, and more.

 

So this year, as we embrace our parent company’s redefining campaign, we want to clearly say, “Hear, hear, we agree and support redefining design in all that we do—past, present and future.” Take a look at Analyst™, which has already begun to redefine the design flow for 3D FEM EM analysis by enabling users to move away from disparate point tools to analysis so seamlessly integrated within Microwave Office circuit design that it effectively makes EM a one-click option. Take a look at our forthcoming Visual System Simulator™ (VSS) software release with 802.11ac IP that has been modularized so it lends itself not only to use within VSS but also within NI’s PXI hardware and LabVIEW software. Take a look at our many AWR Connected™ partners to see how our openness philosophy continues today, providing our customers with a design flow and ecosystem that is flexible and open to better satisfy their ever-changing and challenging design needs.

The Internet of Things
June 16, 2013  by Ed Lee

 

As Mike Demler predicted back in May, the “Internet of Things” was all the buzz at DAC this year. 

Freescale CEO Gregg Lowe talked about the opportunities and challenges in his keynote.  

Mentor CEO Wally Rhines said in his keynote that the big growth in the semiconductor industry will come with the Internet of Things. 

It was simultaneously discussed at the GSA European Executive Forum in Munich and the Sensors Expo in Chicago.

What do you think? 

Is it the next big thing? 

Can EDA step up to the challenge? 

And what does it mean to our future?

Where, Oh Where Should My Little DAC Be?
June 14, 2013  by Tom Anderson, VP of Marketing

I spent my last few posts previewing and reporting on the 50th Design Automation Conference (DAC) in Austin. As I have mentioned, this was the first time DAC was held in Austin, so a lot of vendors were nervous about it. I know at least a couple of companies that downsized their DAC crews in anticipation of a smaller show. Well, the numbers are in, and DAC did fairly well in Austin. Full-conference passes were 1589, down 16% from 2012 in San Francisco. Exhibits-only passes were 2364, down 15%. The number of booth staffers was down 26%, reflecting both consolidation in the EDA industry and smaller crews.

No one really expected Austin to match San Francisco, but the numbers are quite respectable. What was especially interesting was that the number of exhibits-only passes was 15% higher than in San Diego in 2011. It seems that the local electronics community really turned out at DAC this year – something already clear to us exhibitors, since we saw many new faces we had not seen at shows in other locations.
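For the curious, the implied 2012 San Francisco counts can be recovered from the Austin numbers with a quick back-of-the-envelope calculation (approximate, since the percentages reported above are rounded):

```python
# Recover the implied 2012 counts from the reported 2013 numbers and the
# year-over-year drops (16% full-conference, 15% exhibits-only). Results
# are approximate because the published percentages are rounded.
full_2013, exhibits_2013 = 1589, 2364
full_2012 = full_2013 / (1 - 0.16)          # implied 2012 full-conference passes
exhibits_2012 = exhibits_2013 / (1 - 0.15)  # implied 2012 exhibits-only passes
print(round(full_2012), round(exhibits_2012))  # -> 1892 2781
```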




Copyright © 2016, Internet Business Systems, Inc. — 595 Millich Dr., Suite 216 Campbell, CA 95008 — +1 (408)-337-6870 — All rights reserved.