Archive for the ‘Uncategorized’ Category
Tuesday, May 1st, 2012
DAC’s coming and, at EVE, we’re thinking about the evolution of emulation, a theme that you’ll hear more about from us. It’s been fascinating to look at how emulation has evolved from high-priced, hard-to-use clunkers introduced in the 1980s to sleek, low-cost hardware-assisted verification solutions that execute at high speeds.
At $1 million per seat, those early emulators were available only to the largest companies doing the most complicated designs. Cost alone prevented widespread deployment, and they quickly became outdated as new process technologies emerged, undermining both their practicality and their accuracy. The maximum speed was about 1 MHz, slow even then, and they were roundly criticized for being difficult to set up. The designer’s lament was the excessive time to emulation.
Tuesday, April 17th, 2012
Recently, I’ve started to see an interesting trend cropping up in SoC development. Companies and teams are adopting or “inheriting” the emulation platform of their vendor, partner, or customer to accelerate the SoC realization effort.
Adopting a common emulation platform allows multiple organizations to share data and replicate development environments. Emulation tests for a critical block from an IP vendor can be replicated in-house, and later used as a golden reference model for verification at the system level. Leveraging a common emulation platform and use model enables partners, vendors, and customers to share a high-performance software development environment. Integration testing, along with driver and application software development can occur at multiple sites in parallel prior to tapeout.
Although there are many potential benefits to using a common emulation platform, there are also many potential issues. Failure to address these issues can result in increased project delays and costs, effectively erasing the advantages of using a common platform.
The first potential issue to be addressed is the choice of emulator. In many cases, the choice of the emulation platform to be shared is imposed by one company over another. For example, an important customer might demand that a vendor verify its IP using the same emulator.
However, many organizations already have a preferred emulator or FPGA prototyping platform, and have built a complex verification environment around it. Adding a new emulation platform requires more time and money. An organization that typically uses FPGA prototypes with a target hardware system may need to invest significantly in order to adopt a transaction-based emulation flow built around an Electronic System-Level (ESL) virtual platform.
Alternatively, if both parties have an equal say in the choice of emulation platform, there are other potential difficulties. Each company may have a centralized IT or CAD team that demands its own evaluation, and each could have differing criteria for success. Each company may have differing budgets for the project and differing procurement policies, creating further complications in purchasing a common platform. (more…)
Monday, April 2nd, 2012
By now, you probably know that social media is kind of a big deal. Social media is literally changing the world, having played a significant role in political scenes around the globe. Closer to home, social media marketing is changing the way we do business, providing access to online audiences like never before. If you need some more convincing on the value of social media marketing, you can check out my earlier blog post, “What Is Up With Social Media?”.
We know that social media is important for our business, but exactly how important—or perhaps more accurately, how effective—is it? How do we know if our social media campaigns are having any impact? The number of followers of your Twitter account or fans of your Facebook page doesn’t give you the whole story. Are your links being clicked and reposted? How many people do you influence? Who else is talking about you (and what are they saying)? As social media has evolved into a critical marketing tool, a host of new technologies have been developed to help you answer these questions.
Tracking Your Content/Links
When you post a link to your latest blog post, does anyone retweet it? If you advertise a sale or promotion with a landing page, does your social media campaign result in conversions? Content tracking is the most straightforward and readily available metric for social media. Almost every social media scheduling tool, such as HootSuite, Tracx, Buffer, or Gremln, also includes metrics on clicks, mentions, and reposts. There is also a wide range of dedicated analytics tools to choose from, such as SWIX and Cyfe. You can also collect some of this data through URL shorteners like bit.ly. This information helps you assess the quality of your content. (more…)
Monday, March 19th, 2012
Carol Hallett and I became fast friends in 2006 when EVE acquired Tharas Systems, where she was vice president of marketing and sales. From then on, we often met for coffee after she joined Real Intent to head marketing and sales. It’s an upbeat and positive Carol who called me in mid-January from her home in Twain Harte, Calif., near Yosemite National Park, where the weather was a beautiful, though unseasonable, 60 degrees.
2011 was a tough and, ultimately, transitional year for Carol, starting with her March trip to Japan. While she was at the airport waiting for her flight back to San Jose, the earthquake shook Japan and shocked the world. Carol being Carol, she found a working WiFi area at the airport and proceeded to help fellow travelers rebook flights home.
After arriving back home with an overwhelming feeling of being lucky to survive, Carol got a call that her mother was very ill. She immediately booked a ticket for herself and her sister to fly to Virginia and within that week her mother passed away.
In April, Carol’s husband retired from Lockheed and they decided to put their home in Almaden on the market. They weren’t really expecting it to sell, but a force of nature was in play: the house sold in five days, and the Halletts had 30 days to move.
June always brings a busy time in EDA with DAC and all the follow-up work after the event, so Carol’s focus was on work, as usual. The move was up to Carol’s husband Dave.
With all the changes that happened, it seemed that the next step was inevitable. In July, she decided to retire to begin the next phase of her life. “I helped to build companies and worked hard all my life. It’s time to do some things for me now,” she remarks unapologetically and with a smile in her voice. (more…)
Wednesday, March 7th, 2012
How long did your last EDA tool evaluation take? One month? Three? Six? The EDA industry seems to be the land of the never-ending evaluation.
Of course, it’s totally understandable. EDA tools are amazingly complex, and thorough evaluation ensures that you are getting the right tool for the job.
Evaluations also come at a cost. There’s a direct cost to the evaluator, in that evaluations require internal resources that could otherwise be applied to a live project. There’s also an indirect cost to the industry as a whole, as EDA vendors have to loop the cost of evaluations back into the price of their products. Thus, it’s in everybody’s best interests to maximize the efficiency of evaluations.
Let’s take a look at some of the reasons why EDA evaluations can take forever. (more…)
Monday, February 20th, 2012
Many readers of this blog will know Luc Burgun as the 2010 EDA360 Idol winner who performed the specially tuned version of the Rolling Stones classic “Satisfaction,” charming attendees at that final Denali Party.
Luc is so much more than a good musician. For starters, he is CEO of EVE, a role he’s held since founding the company in 2000, unusual for the EDA industry. He and I sat over cups of espresso one overcast afternoon in San Francisco to catch up on his life. One of the first things we remarked on was how few technical founders of EDA startups remain in that position after the first few years.
But first, let’s talk about his background. Luc was born in Brest in Brittany, France, in the Year of the Dragon. After a short pause of reflection, he notes that the Dragon years, which reappear every 12 years in the Chinese calendar, have played a central role in his life, something he expects will continue.
We move on to his family and early life. Luc is the youngest child of five, three boys and two girls, and the only one to pursue a career in technology. An interest in math came early, then computer science at the university. As he got more curious about how a computer works, he began studying hardware and electronics, then software. Soon after, he was specializing in EDA. The first simulator he wrote was an instruction set simulator (ISS) running on an x86 machine under the old CP/M operating system. That was in 1985. From there, he explored synthesis, layout, timing analysis and formal verification. He even designed his own CMOS cells and eventually earned a Ph.D. in logic synthesis. You could say that in a stroke of prescience, Luc studied the entire spectrum of the EDA tools that would eventually become mainstream.
Monday, February 6th, 2012
Looking at the mobile computing space today, we see a wide array of products filling every niche in the market. A consumer could easily own a smartphone, tablet, and eReader, even though there is significant overlap in functionality of these devices. Each has its strengths (and weaknesses) that cause the consumer to purchase more than one. For example, an eReader may leverage display technology that makes it superior to a tablet for reading outdoors. In other cases, multiple devices can complement each other. A cellphone could be used to provide tethering for a tablet. Consumers and manufacturers recognize that there is room for multiple products to coexist within the mobile ecosystem, and in recent years, I’ve seen the same pattern in hardware-assisted verification.
Products in the hardware-assisted verification space also have their strengths and weaknesses. Traditional emulators, typically based on custom processors, offer full-chip capacity, easy bring-up, and simulator-like debugging capabilities—features ideal for hardware verification. But their performance, typically in the 500 kHz to 1 MHz range, limits the effectiveness of traditional emulation for software validation and hardware/software co-verification, two requirements for modern SoC realization.
At the other end of the spectrum, FPGA prototypes are the converse to traditional emulation. FPGA prototypes offer significantly higher performance, enabling at-speed connections for real-time testing. But bring-up is a largely manual process, and the trade-off for reaching real-time speeds is capacity, typically limited to a sub-system that fits into a handful of FPGAs. FPGA prototypes also lack hardware debug capabilities, making them better suited for software validation than for hardware/software co-verification or hardware verification.
Falling into the middle of the spectrum are FPGA-based emulators such as EVE’s ZeBu. These systems leverage the higher performance FPGA components, but add a software layer to provide the ease of bring-up, capacity, and debugging features associated with traditional emulation. Operating at multi-MHz speeds, these systems enable full-chip hardware/software co-verification, a use model previously inaccessible with either traditional emulators or FPGA prototypes alone.
Given these trade-offs, I’ve seen increasing occurrences of co-existence at companies at the forefront of ASIC/SoC development. These are organizations with the largest designs and tightest time-to-market windows, and therefore see the most value in using the right tool for the right job, at every stage of the project. Traditional emulation is used for enhanced hardware verification, where RTL changes may be more frequent. But when the design is more stable, and needs to boot Linux or process HD video frames, FPGA-based emulation is leveraged for its combination of higher performance and hardware debugging. In parallel, FPGA prototypes are used to validate critical sub-systems that interface to devices like radios or cameras that must be tested at-speed.
Design teams have also asked to integrate emulators with other hardware-assisted verification products to extend functionality. That sub-system running at-speed on the FPGA prototype now gets synchronized with the rest of the emulated SoC, providing a full-chip hardware/software co-verification solution that also interfaces with the real-time target hardware.
So, is co-existence for everyone? Not quite. For many organizations, the resources required to own and manage multiple hardware-assisted verification platforms may be prohibitive. Design teams routinely use our FPGA-based emulators across the entire project development cycle for hardware verification, software validation, and hardware/software co-verification, opting for a single solution that meets most, if not all, of their needs. But for those organizations pushing the boundaries of capacity and performance in SoC realization, co-existence is a winning strategy.
Tuesday, January 24th, 2012
It’s only January, but EVE’s event calendar for the first half of 2012 is filling quickly. We kicked off the year with a presentation by Luc Burgun, EVE’s president and CEO, at the 14th Annual Needham Growth Stock Conference January 10. This was the first time we were invited to participate and we were delighted. From all accounts, his presentation was well received.
Next up is DesignCon January 31-February 1 at the Santa Clara Convention Center in Santa Clara, Calif. We’ll be in Booth #721 from 12:30-6 p.m.
We move on to Shenzhen, China, for IIC China Conference and Exhibition February 23-25, where we will be in Booth #1L26.
Local Silicon Valley favorite DVCon returns to the Doubletree Hotel in San Jose in late February. EVE will exhibit February 28-29 in Booth #602.
Luc Burgun will represent EVE on a panel during DATE March 12-16 in Dresden, Germany. The panel, “Accelerators and emulators: Can they become the platform of choice for hardware verification?,” will be held Wednesday, March 14, at 8:30 a.m. It will be moderated by Professor Bashir M. Al-Hashimi from the University of Southampton.
SNUG Silicon Valley at the Santa Clara Convention Center will host a Designer Community Expo March 26-28 and EVE will be there as a Synopsys partner.
The first half of the year culminates with DAC, as it always does for the designer and EDA community, and EVE will be there in Booth #1926. We’ll have more details about our plans for DAC in the coming months.
Please introduce yourself to any of the EVE staff members at the events we’re attending to learn more about us and our hardware-assisted verification products. At each, you will have an opportunity to discover ZeBu-Blade2, the first member of the ZeBu emulation family based on Xilinx Virtex-6 LX760 FPGAs, used for ASIC and SoC designs implemented in 40-nanometer technology. It offers fast execution and attractive pricing for best-in-class hardware/software integration ahead of silicon availability.
And, do stay tuned for more on what we’re doing in the second half of 2012.
Monday, January 9th, 2012
Not long ago, I overheard a thought-provoking exchange related to the demise of a beloved cultural icon. This conversation gave me pause to consider what intriguing new application is coming next that will displace another symbol of popular culture, much like what’s happened to local book and video stores or hard-wired landline phones. I’m sure some clever entrepreneur is already designing an enabling technology to open a new world for us, beyond our current habits. Gone may be a beloved local store or pink Princess phones, but consider the access to a variety of new adventures offered by these future innovations.
Ah, but these game-changing technologies don’t exist in a vacuum. They rely on innovations occurring at all levels of development. The next iconic technology will require more than just sophisticated application software and fast internet connections. It will also require a robust infrastructure able to process, compress, and transport massive amounts of data. That infrastructure will be built on the next generation of multicore (or manycore) SoC devices executing previously unimaginable amounts of embedded software. And these SoC devices will in turn be designed and verified using the latest and greatest EDA technologies, most notably, emulation.
To support the next big idea, every EDA tool requires innovation, but this is especially true for emulation, given its growing prominence in the process of SoC realization. As I’ve written in the past, emulation has become mandatory for the verification of complex chips. Thus, emulators can’t just keep up with the requirements of SoC development; they must outpace them. SoC design sizes of 20- or 30-million gates are common today, but the emulator of tomorrow must be able to cross the billion-gate threshold. Similarly, the 500 kHz to 1 MHz performance that emulators have traditionally supplied simply won’t cut it for next-generation SoC designs. You need an order-of-magnitude performance boost if you are going to boot an OS, transcode high-definition video, or process multiple pages of scanned images.
Innovation isn’t always about size or speed. Sometimes it’s about breaking down barriers and improving accessibility. Design teams around the globe have concluded that an emulator can accelerate the verification process across the entire SoC project. To support this surge in adoption, emulators must evolve into a cost-effective and flexible solution that supports both enterprise- and desktop-level usage.
I’m still pondering what the next big idea will be, all the while knowing that innovations in emulation will play an important supporting role. I welcome your thoughts.
Thursday, December 15th, 2011
We have a running joke in the EVE office in San Jose, Calif., that our children will keep us employed. The truth is, we’re not far off. Our kids are the driving force behind the development of new technology. They are using social media and downloading videos as forms of communication, networking and entertainment, and it’s a constant barrage on an already overworked Internet.
Bandwidth is becoming an increasing problem. Sluggish downloads or error messages due to capacity and overload are not going to cut it for the younger set who wants it now and in high-definition video. While the baby boomers fashioned instant gratification into a lifestyle, this generation has made it into a high-speed, multitasking art form.
If complaints about a slow-crawling Internet sound familiar, they should. The EDA industry enabled the telecommunications infrastructure overhaul from 1998 to 2000 with powerful, effective hardware and software tools, just as the Internet was becoming a new form of communication.
And here we are again, poised for another huge leap in Internet and cellular bandwidth requirements, supplied by telecom equipment makers and the semiconductor/EDA ecosystem supporting them. Our children are ensuring gainful employment for those of us in these industries as they lead the way for all of us to overload the networks in 2012 and 2013.
We’re already seeing the signs in the form of new communication chip design projects. Whether from fabless semiconductor players or network equipment suppliers directly, processing and multimedia requirements are moving up 4-6X in the next design cycle.
The EDA community has the opportunity once again to play a big role in revamping the backbone of the Internet infrastructure because suppliers will need more powerful EDA tools to develop next-generation devices. Each piece of the design flow will play a role, but the significant opportunity will be available to those of us who address co-development of hardware and software for these new platforms.
Development teams assigned to an SoC project of this magnitude will be juggling embedded multicore processors, DSPs, and third-party IP in hardware, along with huge development teams for software applications, now required across multiple operating systems. A strategic verification plan will be a must-have for the project team, and emulation should be its keystone.
Emulation is uniquely suited to these challenges due to its versatility. It provides a close realization of silicon because it maps a design into a hardware implementation, the only way to ensure that all of the blocks are verified accurately and in a reasonable timeframe. Emulation can test a wide range of design styles, and validate hardware and software on billion-gate devices by exercising billions of clock cycles before tapeout. Emulators include hardware debugging capabilities and execute RTL models at multi-MHz speeds, mitigating runtime performance issues associated with simulation, particularly at the full-chip and system level. Newer emulators are more cost effective than traditional models, making them more accessible than ever before.
I consider this phenomenon a means of coming full circle, both in business and in life, thanks to our children. 2012 should be an exciting year for the EDA industry as a whole — and verification vendors in particular — as we facilitate another overhaul of the Internet. I hope our children will be pleased. And in the meantime, I need help with one of the features on this new PDA.