What drives what happens now? A critical combination of the latest technology and knowledge, which is what you’ll find at IPC APEX EXPO 2015.
Archive for 2014
Thanksgiving is here so it’s likely to be a slow week in the EDA industry. Of course, like much else in our culture, this event has been co-opted by rampant media messages to shop and consume. Already I’ve seen lots of stories about Black Friday, mostly discussing whether the whole idea of a 24-hour window is now moot given Cyber Monday and the reality that the holiday shopping season now starts right after Halloween and stretches into January.
As a German I know I must tread lightly when writing about the most American of holidays. Turkey is not all that big on holiday menus back home and, as I've written in an earlier EDA Café post, football (or rather fussball) will always mean something different to me, no matter the success of the regional favorite Seahawks or my staff's obsession with making their weekly fantasy picks. I'll just say that I've grown to like some of the old-fashioned aspects of the holiday (a good meal with friends followed by a hike with the dogs, who demand to get out regardless of the weather). I can't help but be thankful that EDA is not part of this annual shopping lunacy, at least not directly. Last I looked, the big three EDA vendors weren't offering holiday-themed sales, and I've never yet seen a line out the door for a piece of technical software. (That said, DAC attendees have been known to queue up for the free coffee, beer and wine that exhibitors offer almost every day — as I see it, much more reasonable behavior than waiting outside a superstore before it opens.)
Initially, USB provided two speeds: 12 Mbps and 1.5 Mbps. With the rapid adoption and success of the USB standard and the increasing power of PCs and computing devices, the USB 2.0 specification was defined in 2000. USB 2.0 provided up to 480 Mbps of bandwidth while maintaining software compatibility with earlier USB applications. As bandwidth requirements kept growing, the USB 3.0 specification, providing 5 Gbps of bi-directional bandwidth, was released in 2008. USB 3.1 is the next logical step in this progression: it provides 10 Gbps of bi-directional bandwidth while maintaining backward compatibility with previous USB versions. To learn more about the USB 3.1 verification solution, click here.
In this post, we will analyze the technical differences between the USB 3.1 and USB 3.0 specifications. The aim is to enable readers familiar with USB 3.0 to quickly understand the main aspects of USB 3.1.
We will examine the PHY, Link and Protocol layers and list the major ways in which USB 3.1 differs from USB 3.0.
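The bandwidth progression described above can be summarized in a small sketch. The data rates come straight from the history just recounted; the speed-grade names (Low Speed, Full Speed, High Speed, SuperSpeed, SuperSpeedPlus) are the USB-IF's standard labels for those rates, added here for illustration:

```python
# USB raw data-rate progression, as described above.
# All rates are in megabits per second.
usb_speeds_mbps = {
    "USB 1.x Low Speed": 1.5,
    "USB 1.x Full Speed": 12,
    "USB 2.0 High Speed": 480,
    "USB 3.0 SuperSpeed": 5_000,
    "USB 3.1 SuperSpeedPlus": 10_000,
}

# Print the progression, one generation per line.
for name, rate_mbps in usb_speeds_mbps.items():
    print(f"{name}: {rate_mbps} Mbps")

# USB 3.1 doubles USB 3.0's raw bit rate (5 Gbps -> 10 Gbps).
ratio = usb_speeds_mbps["USB 3.1 SuperSpeedPlus"] / usb_speeds_mbps["USB 3.0 SuperSpeed"]
print(f"USB 3.1 vs. USB 3.0 raw bandwidth: {ratio:.0f}x")
```

Note that these are raw signaling rates; effective throughput in either generation is lower once link-layer encoding and protocol overhead are accounted for, which is part of what the layer-by-layer comparison below gets into.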
Okay, this is no hyperbole and I’m no longer just trying to gin up enthusiasm. The first batch of DAC deadlines is upon us. For those of you who won’t read beyond this paragraph, here are dates you need to know if you’re interested in participating in what remains the premier EDA industry conference. Proposals for the following are due November 13: panels, tutorials, workshops and co-located conferences. Abstracts for research papers are due November 21 and full manuscripts are due December 2; you can submit in these five categories: automotive electronic design, EDA research, ESS research, hardware software and security, and work-in-progress (WIP). I hope all those links help. Of course you can find more information and everything else you need on DAC.com.
Verification is a never-ending process! You can never be sure that you have verified everything. The aim of verification is to reduce risk to the level of practical perfection.
The increase in chip complexity, coupled with pressure to shorten time to market, is pushing chip design companies toward the adoption of third-party IPs. Suppose you have weighed all the pros and cons of IP outsourcing and decided to go with a third-party IP for your next project. Then the question is: does the externally bought IP need re-verification?
Many of you may have heard the story of the woodcutter and his blunt axe. The "switching cost" of sharpening the axe or buying a new one may seem too high when you're in a time crunch. But stepping back to review the situation and switching to a better tool can be life-changing!
In today's world this applies to chip design and verification teams more than ever. A verification IP plays a key role in controlling verification schedules. Consider a case where the tape-out schedule is slipping in spite of having both an internal VIP and an external VIP.
Just as a blunt axe will take much longer to fell a tree, a substandard verification IP will prolong your IP development. On the other hand, if you "sharpen your axe," that is, develop or buy a better verification solution, it may initially seem like it's taking longer and others are getting ahead. But in the long run you will develop your IP faster.
Surely the biggest tech news since my last post is the new Apple Watch, finally announced September 9 after months of anticipation. I can’t add much to the volumes that have been written, except perhaps to issue my standard gentle reminder on behalf of our industry anytime a tech device makes a splash. Surely the new watch, and for that matter the two new iPhones that were part of the announcement, simply wouldn’t have been possible to design and test without EDA tools and expertise. The world may look at the watch and make declarations like this from Scott Stein and David Carnoy at CNET: “For fitness-lovers who want a smart connected workout device that plays music, the Apple Watch could be a slam dunk.” Or this from Farhad Manjoo at The New York Times: “The biggest news was about the old Apple: It’s back, and it’s more capable than ever.” Or even make parody videos that get the predictable millions of YouTube views (see below). Meanwhile I can’t help but think of all the hardware/software verification that Apple had to do before Tim Cook could take the stage.
Mixed-signal silicon design, bringing the worlds of analog and digital technology onto a single die, has never been an easy task. Formerly, the analog and digital teams would work independently on their designs, leaving the place-and-route team with the thankless task of integrating everything onto a single chip. A microcontroller design, with all of its carefully thought-out peripherals, would be routed, leaving analog-sized holes for the oscillator, ADC and transceivers needed to complete the design.
Here in Portland summer is in full swing. Outdoor tables are full at the restaurants in my neighborhood and there are more people on the trails in Forest Park where I walk my two Miniature Schnauzers most mornings. And this time of year it’s more than feet that wander. Even as I hurry to keep up with the dogs, my mind is often rambling elsewhere, often to matters related to DAC. Some of these musings are making it into my efforts to blog my way to next year’s conference, weekly on the DAC site and monthly here on EDA Café.
I know at this point most people are thinking, DAC? That’s a lifetime away. But as general chair for DAC 52, I’m often brought up short during my morning strolls by realizations like this: We have just 10 months to plan this conference! Suffice it to say there is lots to do and, summer and eating and trekking aside, those of us on the executive committee haven’t been idle.
Last week, a few of us met in Louisville, Colorado to audit the 2014 conference and begin budget planning for DAC 52. Yes, it’s a somewhat tedious process to go through expense reports, vendor bills and registration data. However, we take this work seriously, understanding that we’re merely stewards of a conference that has been going on since the days of time-sharing on mainframes. Indeed, just as time-sharing has morphed into cloud computing and the Internet of Things, now among the hottest topics in technology, DAC has proven remarkably adept at staying relevant and even reinventing itself through the years. All of us on the executive committee want this to continue on our watch.
There was a time when the "big three" gave the impression that they deeply cared about what their customers saw in their documentation and how usefully they shared critical product information. From my view, that is no longer the case. Maybe that is harsh, but this is more than a simple impression. Whether providing book-based or topic-based documents, offering downloadable PDF copies, sharing online documentation with robust search capabilities, or delivering meaningful embedded tutorials, most EDA companies took an active role in ensuring that what they produced would be innovative, encompass the latest trends and meet (and even exceed) customer expectations.