August 03, 2009
The Role of a Chief Technology Officer
Jack Horgan - Contributing Editor

Posted anew every four weeks or so, the EDA WEEKLY delivers to its readers information concerning the latest happenings in the EDA industry, covering vendors, products, finances and new developments. Frequently, feature articles on selected public or private EDA companies are presented. Brought to you by EDACafe.com.


I often receive press releases announcing that a small company has hired a new CEO. The same release often says that the former CEO and founder has become the company's Chief Technology Officer. The new CEO typically has a sales and marketing background, while the former CEO and founder is more likely to have been the technical visionary and development leader when the company was formed. This is not always the case, and in fact is not the case with Dr. Pranav Ashar, the subject of this interview. He has been the CTO or Chief Scientist at several companies. What is unusual is that he was previously CTO at Real Intent and has recently returned, after a few years' absence, as the company's CTO. I asked him about the role of a CTO. Also on the call was Carol Hallett, VP of Sales and Marketing.

Would you give us a brief biography?

How far back would you like? My last position was at a company called Liga Systems, a simulation acceleration company. Our technology was an FPGA-based solution, but different from the typical FPGA-based simulator: we put a processor on the FPGA, not the circuit under simulation. The original technology was developed at NEC. The company started in about 2005. I served with the company in an official capacity as Chief Scientist during 2008. Before that I was a founder and CTO of a startup called NetFortis. Our goal there was to develop technology for low-power detection of viruses and other malware on cell phones. This was a hardware/software solution, and we had an interesting business model for getting the technology into cell phones using SIM cards. I was there from 2006 to 2007. Before that I was the CTO at Real Intent from 2004 to 2006. Before that I was at NEC Labs in Princeton, NJ for about 13 years, from 1991 through 2004. I got my Ph.D. from UC Berkeley, from 1987 through 1991. My Ph.D. was also in EDA, in the areas of synthesis and formal verification, so basically I have been working in the VLSI and EDA spaces for close to 20 years now. I also did a master's project at Berkeley in circuit simulation.

In these 20 years I have been pretty deep and also pretty wide in VLSI design. I have been exposed to a number of different environments and ecosystems at universities, corporate research and development, and in different size companies.

You said that the last two companies where you have worked had interesting business models. Would you expand on that?

Sure. The company after my two years at Real Intent was called NetFortis, a company funded by me and a couple of other people. The goal was to get low-power, high-performance malware-detection technology into cell phones. Cell phones are interesting because they are computers, but unlike desktops and laptops they do not have the compute and energy resources at their disposal; malware detection becomes a significant overhead as a result. Since the operating systems on these phones are becoming as open as on PCs, we thought there was an opportunity here. The most efficient way to achieve this was to express some of the algorithms in hardware, so we developed a combined hardware/software solution for regular expression detection and matching. Unfortunately, we found that cell phone ecosystems are extremely controlled. The delivery time scales, deadlines, and so forth are consumer-oriented and are determined by factors outside the control of small companies like ours trying to introduce a new technology. What we figured out was that there was a way to get hardware technology inside the phone through the next-generation SIM card. SIM cards are on a path where they are getting more complex in terms of the amount of memory and processing that can be performed in them, and they have a high-bandwidth interface to the motherboard of the cell phone. We developed our technology to be low-power enough to fit the SIM card profile and planned to introduce it through a replacement of the SIM card rather than of some other component. That was one of the important business-model innovations enabled by our technology. The second part was the realization that hardware IP gets commoditized very fast; stable revenue can be generated only if there is a software part. We decided to have this up front as part of the business plan and planned to have revenue through recurring software upgrades and malware-signature licensing fees.
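The core of that regular-expression matching can be sketched in software. This is purely illustrative: the signature names and patterns below are invented, and NetFortis's actual hardware/software partitioning was of course far more involved.

```python
import re

# Illustrative signature scanner. The signature names and patterns are
# invented for this sketch; real signature databases are much larger.
SIGNATURES = {
    "Cabir.A": re.compile(rb"velasco"),
    "Commwarrior": re.compile(rb"commwarrior", re.IGNORECASE),
}

def scan(payload: bytes) -> list:
    """Return the names of all signatures that match the payload."""
    return [name for name, pattern in SIGNATURES.items()
            if pattern.search(payload)]

print(scan(b"... CommWarrior outcast ..."))  # ['Commwarrior']
```

In the hardware/software split described above, the inner matching loop is the part that would move into low-power hardware, with software managing the signature database and the recurring upgrades.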

The company after that was called Liga Systems, a simulation acceleration company. This is a technology that a number of companies have tried and have not been very successful at. Our goal was to provide a technology to perform the acceleration without the overhead of setup and without having to introduce new technology into the design flow. In our technology, the circuit got compiled into a custom processor's instruction set, and it was the processor that got mapped to the FPGA. The company succeeded in delivering substantial performance improvements in gate-level simulation of 20+ million-gate designs.
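The idea of compiling a circuit into a processor's instruction stream, rather than mapping the circuit itself onto FPGA fabric, can be sketched with a toy logic processor. This is only an illustration of the concept; the opcodes, register-file model, and example netlist are invented and bear no relation to Liga's actual architecture.

```python
# Toy "logic processor": the circuit is compiled into instructions that
# operate on a register file of single bits. Opcodes and the example
# netlist are invented for illustration.

def run(program, regs):
    ops = {
        "AND": lambda a, b: a & b,
        "OR":  lambda a, b: a | b,
        "NOT": lambda a, b: 1 - a,   # second operand ignored
    }
    for opcode, dst, a, b in program:
        regs[dst] = ops[opcode](regs[a], regs[b])
    return regs

# XOR(r0, r1) lowered to AND/OR/NOT instructions, result left in r5
xor_prog = [
    ("NOT", 2, 0, 0), ("NOT", 3, 1, 1),
    ("AND", 4, 0, 3), ("AND", 5, 2, 1),
    ("OR",  5, 4, 5),
]
print(run(xor_prog, [1, 0, 0, 0, 0, 0])[5])  # 1
```

The appeal of this style of acceleration is that the FPGA image (the processor) stays fixed across designs; compiling a new circuit only produces a new instruction stream, avoiding the lengthy place-and-route step of mapping the circuit itself onto the FPGA.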

You have been a CTO at several companies. Would you describe in general terms what the role of a CTO is?

The role of the CTO depends on the specific context. There are two ways in which the CTO can leverage his or her deep understanding of the technology: the first is an inward focus and the second is an outward focus. In the inward-focus role, you give direction to the company's development group in terms of the product and technology targets and opportunities. In the outward-focus role, the job is one of communicating with the rest of the industry on the challenges faced by it and on how the company is positioned to address those challenges. In the specific context of Real Intent, verification is one of the most significant challenges that our industry is facing given the complexities of upcoming designs. It appears that there are four or five semiconductor technology-shrink generations yet to come. Today, a design with medium-scale complexity is on the order of 20-30 million gates. If you factor in four or five more technology shrinks, a medium-complexity design will have about 500 million to a billion gates on a chip. With the 20-30 million gates today, companies are finding it hard to design chips, verify them, and get them out on time. So you can imagine what it is going to be like when the number of gates gets to be on the order of one-half to one billion. My goal in this context is to understand the challenges out there and the
technology opportunities in terms of what is on the cutting edge on the research side of our community and how to bring this into our products to address these challenges. On the outward focus side, the role is to make the industry understand that these challenges are real and that companies like Real Intent are working toward finding solutions to address them. Thus there is both an inward and an outward focus.

In many instances the CTO tends to be the founder of a company, who is moved from the CEO slot to CTO when the investors decide to bring in someone with a sales and marketing orientation. This is not the case with you. How would you define the relationship between the CEO and the CTO?

At a previous company where I was a founder and a CTO, we had a CEO who handled just the business side. The CEO is more likely to be focused on a company's business strategy and on how the company is positioned in the market, with customers and with investors. The role of the CTO is to have a very deep understanding of the company's technology, to give direction to the company internally as far as the technology focus is concerned, and to communicate this technology focus to the outside so that it is synergistic with the company's operations and business goals. As far as the CEO and CTO relationship is concerned, it is a partnership, a division of labor if you like, where the CEO takes
care of the business and investment sides of the company and the CTO complements with a technology focus and direction.

Is the current CEO at Real Intent the same one as when you were there before?

Yes. The CEO of Real Intent is Dr. Prakash Narain. Our relationship goes a long way back. I worked with him when I was at NEC Labs and he was CEO at Real Intent. I was a department head developing formal technologies and got Prakash interested in looking at those technologies more closely. That's how our relationship started. In 2004 he recruited me to come over to Real Intent as CTO. That relationship worked out very well. I have to say that while Prakash is very focused on the business side, he also has a Ph.D. and is very savvy about the technologies in our products.

It is not unusual for people to move from one company to another during their professional careers, but it is relatively unusual to return to a company one has left. What brought you back to Real Intent after a couple of years elsewhere with other small companies?

That's a good question. The two years I spent at Real Intent previously went very well. I was instrumental in bringing some new formal technologies into Real Intent's products that had a very good impact on their market competitiveness. The reason I left had nothing to do with Real Intent; it was more an opportunity to found a company in a different space that I thought would be challenging. That said, in a sense I never left, because over the last few years I have maintained close communication with Prakash about the direction that Real Intent was taking and possible opportunities. When I saw the opportunity to come back I took it because, as I said earlier, verification is one of the most important challenges that our industry faces today. It continues to be the show stopper in terms of realization of the next four to five technology-shrink generations. It is really mind-boggling to consider the complexity of designs with one-half to one billion gates on a single chip and of getting them out in a reliable and predictable way. I am basically a technologist; I feed off such fundamental challenges. I am coming back to Real Intent because I see myself as being able to contribute to addressing these challenges.

The complexities in the design of these chips are coming from a number of different directions. It is sort of a multiheaded beast. These challenges are a combination of the timing complexity of these chips, the system complexity as a result of many diverse components, and from the raw complexity of the number of transistors on these chips. And also, going forward, the power and reliability management structures on chips are becoming more complex. Handling all of these things in a predictable manner is a very interesting challenge.

Real Intent, over the years, has established itself as a leader in the chip verification space. Chip verification is interesting in the sense that the longevity of a company is a significant plus. The longer you are in this space and the more tapeouts you have participated in, the more competent you become and the greater recognition you get in the design community.

Real Intent at this point has 30 to 40 customers. It has been around for about a decade. Possibly hundreds of tapeouts have depended on Real Intent. I feel good joining a company with this strong foundation. It provides me an excellent platform to innovate and be in the thick of things as far as being able to participate in addressing chip design challenges.

Editor: While at DAC I sat next to a Chief Technologist quite by accident. He was not only a CT but also a full professor at a well-respected university. He was not a CTO, as the university does not allow its professors to be officers in a company. The company was founded by some of his former students and has subsequently hired many more of them. He said he had nothing to do with the business side. When he told his wife that the company had offered him the position, she said he could take it only if it had nothing to do with the business side, because he was such a lousy businessman. He spends more time with the company than with the university and has taken a few leaves of absence. He travels the world talking about technology. At the table were two Europeans, one from STMicro and one from Infineon. He spoke of meetings with their associates the previous week in two different European countries. He spoke of leveraging the network of contacts he has built up over the years in the research community, both in academia and industry, as a way of keeping track of what is going on. He agrees with Pranav about the need for the CTO to have a deep understanding of the company's technology and of the possible applications of that technology and related technologies, to help drive the technical direction of the company.

How big a company is Real Intent?

About 30 people worldwide.

What is the revenue stream?

We are a privately held company and do not give out that information.

Would you provide us an overview of Real Intent?

Real Intent is a design verification company that offers an innovative cocktail of formal methods, structural analyses and simulation-based methods to verify key aspects of a design like functionality and timing. The company fundamentally believes in the power of formal methods. It also believes that naive application of formal methods is not net productive. We believe in what we call "automatic formal". This is a philosophy that permeates all our products, where the checking to be done is almost completely extracted automatically from the design description itself and the application of formal methods is judicious and under the hood. The role of the designer in terms of telling the
tools what needs to be checked is very minimal. This is consistent with the name of the company, Real Intent! Real Intent is focused on the verification of the intent that the designer probably implied without the designer having to explicitly spell it out. The goal of all of our products is to minimize the designer's overhead. Underlying the tools are formal methods, structural analysis and simulation hooks to check the extracted checks automatically and comprehensively.

Currently the company has three product families. There is a product family called Ascent associated with checking the functional aspects of a design. It extracts basic functional checks automatically from the RTL and then applies simulation and formal methods to perform the checking.

The second family of products is called Meridian. Its role is to check the design around clock domain crossings. Clock domain crossings have become an extremely important part of a design today, partly because the raw diversity of components on chips has grown by leaps and bounds. You have a number of different kinds of components (USB, Ethernet, …) as well as the design itself, everything on a single die. All these different components feed off different clocks and of course communicate with each other, so you have a whole lot of clocks, relatively asynchronous to one another, whose domains communicate with each other. Multiple domains are also necessitated by the difficulty of distributing a single clock to the entire chip in a predictable manner.

Checking a design in the context of this type of asynchronous communication is extremely difficult. It is extremely hard to find these errors in simulation or even in the lab. The manifestation of these errors may actually only take place on a specific sample of chips from the foundry. In that sense Meridian provides an extremely important service to the design community by being able to catch these errors early on.
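A flavor of the structural side of such checking can be sketched as a search for crossings that lack a synchronizer. This is a deliberately simplified illustration of one classic check (the two-flop synchronizer), with an invented netlist representation; a tool like Meridian performs far more thorough structural and formal analysis.

```python
# Simplified structural CDC check: flag clock-domain crossings that do not
# feed a classic two-flop synchronizer. The flop/netlist representation
# (name, clock, fan-in flop names) is invented for this sketch.

def find_unsynchronized_crossings(flops):
    clock_of = {name: clk for name, clk, _ in flops}
    issues = []
    for name, clk, srcs in flops:
        for src in srcs:
            if clock_of[src] != clk:
                # A crossing: look for a second flop in the destination
                # domain fed only by this one (the synchronizer's 2nd stage).
                second_stage = [d for d, c, s in flops
                                if c == clk and s == [name]]
                if not second_stage:
                    issues.append((src, name))
    return issues

flops = [
    ("tx_reg",  "clk_a", []),
    ("sync1",   "clk_b", ["tx_reg"]),
    ("sync2",   "clk_b", ["sync1"]),   # properly synchronized crossing
    ("bad_reg", "clk_b", ["tx_reg"]),  # raw crossing, no second stage
]
print(find_unsynchronized_crossings(flops))  # [('tx_reg', 'bad_reg')]
```

The point of catching these structurally is exactly the one made above: a missing synchronizer may only misbehave on certain silicon samples, so simulation and lab debug are unreliable ways to find it.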

The third family of products is called PureTime. The role of PureTime is to check the timing constraints that are commonly specified in the context of a complex design today and also to analyze these timing constraints in a formal manner. Timing errors can be insidious in that they are very hard to find through simulation. PureTime will likely become an important part of the design flow because of its capability to find timing problems early.

So three product families.

One of the goals of these product families is to find problems as early as possible, and hopefully by a designer rather than a verification engineer. The earlier these bugs are found, the faster the turnaround of debugging and fixing them. That being said, tools like PureTime and Meridian also have application at downstream abstraction levels, the gate level for example, after the timing constraints and timing specifications have been fixed. PureTime can also be used at the gate level. Ascent and Meridian can also be used with simulators at the full-chip level.

What is new with these three product families?

Ascent, the functional verification tool, is being augmented with comprehensive linting. Linting is a first line of defense in design verification. A lot of design companies have started enforcing design rules or design policies to make sure that all designers are on the same page in terms of design styles and accepted idioms. STARC and VMM are examples of such policies. What we are doing is to understand these policies and to build the checking of the design against these policies into our tools. Such a tool makes it easy to enforce these policies across a large and distributed design team. That is one of our most important introductions at DAC. We have STARC, PVD and RMM policies. The linting tool will also help with policies related to SystemVerilog migration.

Editor: The Semiconductor Technology Academic Research Center (STARC) is a research consortium co-founded by major Japanese semiconductor companies in December 1995. STARC's mission is to contribute to the growth of the Japanese semiconductor industry by developing leading-edge SoC design technologies.

Carol Hallett:

Real Intent has always had linting capability. We had it in the reporting section, kind of under the covers. Our customers asked us to bring it up front so that they could actually use the lint tool. We did that in a version around a year and a half ago. Since then, more people have realized that we have lint capability and have come to us and said, "Since you have lint, why not support these industry policies?" We always react as best we can to customer requests, so we decided to develop these policies like STARC. We are not doing VMM, just RFM, because VMM has test benches and our product works before test benches are developed.

We went one step further. You can just run our default lint, if that is all that you need. It doesn't have much noise and you can easily fix the errors that pop up. Or you can select an industry policy that your company requires. And/or you can pick rules from different policies, add some of your own company policies, and change some of the parameters so that you have your own company lint check run automatically with Ascent.
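The composition Carol describes, picking rules from several policies, adding in-house ones, and tuning parameters, can be sketched as follows. The rule names, policies, and checks here are invented stand-ins, not Ascent's actual rule set.

```python
# Sketch of composable lint policies: merge rule sets from several
# policies and override individual parameters. All names are invented.

def max_line_length(line, limit=100):
    return len(line) <= limit

def no_tabs(line, **_):
    return "\t" not in line

POLICIES = {
    "starc_like": {"max_line_length": {"limit": 80}, "no_tabs": {}},
    "in_house":   {"max_line_length": {"limit": 120}},
}
RULES = {"max_line_length": max_line_length, "no_tabs": no_tabs}

def build_policy(*names, **overrides):
    merged = {}
    for name in names:           # later policies override earlier ones
        merged.update(POLICIES[name])
    for rule, params in overrides.items():
        merged[rule] = params
    return merged

def lint(lines, policy):
    return [(i, rule) for i, line in enumerate(lines, 1)
            for rule, params in policy.items()
            if not RULES[rule](line, **params)]

policy = build_policy("starc_like", "in_house", no_tabs={})
print(lint(["\tshort line", "x" * 130], policy))
```

The design choice mirrored here is that a policy is just data (rule names plus parameters), so a company can mix industry and in-house rules without touching the checker itself.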

Pranav Ashar: The second new feature of Ascent is SimPortal. The basic underlying techniques in our tools have been mostly formal thus far. What SimPortal does is create a link or a hook into simulation for these same checks. This allows us to do a couple of things. In many cases where formal capabilities are unable to handle the design complexity, it allows us to perform simulation; as a result, this enables a smoother tradeoff between complexity and the checks. Another thing SimPortal allows us to do is to create a seamless verification flow from the block level all the way to the chip level. Many times the designer will start checking the design at the block level, say a FIFO. In performing these checks, a designer would have to make some assumptions about how this block fits into the rest of the environment, the scaffolding if you will, and the constraints that the scaffold imposes on the block. Imposing these constraints is cumbersome and imprecise. While the check at the block level is extremely useful in that bugs are found, it still requires the designer to confirm that the bugs that are found are real by also performing the same checks at the chip level.

Formal analysis at the chip level is still a work in progress. As you know, what really works at the chip level is simulation. SimPortal enables us to extend the block level checking to the chip level and in some sense validates the checking that happened at the block level.

The third addition to Ascent is the so-called PBV, path-based verification. One of the areas it addresses has to do with checking the design in the context of X's. This is interesting in the sense that there is disagreement in the industry about whether X's in simulation are good for design and verification or not, and what they are supposed to mean. What we heard from customers is that it is important to check the generation and propagation of X's in the design during simulation. We believe that a precise and formal methodology is required that is able to detect the generation and propagation of X's and to verify their soundness. This is a brand new capability that we are introducing, and we believe that it will be extremely useful to the design community.

Editor: Explicit and implicit X sources in a design (X assignments in RTL and non-resettable flops, respectively) can lead to many challenging verification issues, such as masking real design errors and causing RTL-to-netlist simulation mismatches. Depending on coding style, simulation results can be X-pessimistic, which leads to unnecessary unknown values, or X-optimistic, which produces known values when they should have been unknown. Design and verification teams write properties to trap X's, or instrument 2-value simulation with random initialization, to detect design errors while avoiding X ambiguity. However, these approaches take a considerable amount of manual and computational resources without offering complete confidence of X robustness.
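The X-optimism problem described above can be made concrete with a minimal three-valued evaluation, where "X" stands for unknown. This sketch contrasts precise three-valued semantics with the optimistic treatment a Verilog-style `if` gives an X select; it is an illustration of the semantics, not of PBV itself.

```python
# Minimal three-valued (0/1/X) evaluation; "X" stands for unknown.

def and3(a, b):
    if a == 0 or b == 0:
        return 0              # 0 dominates AND even if the other input is X
    if a == 1 and b == 1:
        return 1
    return "X"

def mux3(sel, a, b):
    # Precise semantics: an X select yields X unless both data inputs agree.
    if sel == 1:
        return a
    if sel == 0:
        return b
    return a if a == b else "X"

def mux_optimistic(sel, a, b):
    # A Verilog-style "if (sel)" treats X as false: X-optimistic.
    return a if sel == 1 else b

print(mux3("X", 1, 0))            # X  (the honest answer)
print(mux_optimistic("X", 1, 0))  # 0  (optimism hides the unknown)
```

The optimistic mux silently resolves the unknown select to a definite value, which is exactly how an uninitialized flop's X can mask a real bug in RTL simulation and then reappear as an RTL-to-netlist mismatch.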

What is new with Meridian? Isn't SimPortal part of Meridian?

One of the additions to Meridian is SimPortal. It is similar to Ascent SimPortal in that it provides simulation hooks for the checks normally performed structurally or through formal methods. It allows us to do the checking at the chip level and with test benches in simulation in situations where the formal methods are not able to meet the requirement. Basically, the combination of SimPortal and formal methods and structural analysis provides a very complete and comprehensive approach to the checking of the functionality and clock domain crossing issues. With this combination we can cover all the bases.

The second addition to Meridian is hierarchical analysis. We want to be able to provide a methodology that goes from the block level all the way to the chip level. Many times, in the context of clock domain crossing checks, you might perform a check at the block level, say for a FIFO. Once you have done that, you don't have to check it again. What we are providing is a mechanism to create a shell model of the blocks that you have already checked and then to allow these abstracted versions of the blocks to be used at the chip level, with the understanding that the internals of the shell models have already been checked. The nice thing about it is that the clock setup and structural analysis that happened in the context of the block are still used at the chip level.

A third new aspect of Meridian that we are introducing is something called free-running clocks. As we discussed, clock domain crossing checks are required because the clocks associated with communicating blocks are relatively asynchronous. The checking of these crossings is most precise when you build the relative asynchrony of these clocks into the formal analysis. What the free-running-clock aspect of Meridian does is allow the formal analysis to treat the different clocks as completely asynchronous with respect to each other; it does not place any constraints on the relative times at which the clock edges from the various sources arrive.

What is new in PureTime?

A very important new feature in PureTime is the ability to check the constraints that are provided by the user before any formal analysis of the constraints on the design is performed. The reason for having a constraint validation front-end is that these constraints tend to come from multiple design groups. Many times these constraints can, in fact, conflict with one another. Also the constraints may be incorrect in the context of what is actually present in the design. What constraint validation does is to catch these errors in the specification of the constraints very early on so that the constraints that are actually given to static timing analysis or to the formal analysis in
PureTime are, in fact, meaningful. In addition, the formal analysis of timing exceptions in PureTime has been enhanced quite a bit and can handle much larger designs than before.
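The constraint-validation idea can be sketched as a front-end pass over a simplified, invented stand-in for SDC-style constraints, catching conflicting clock definitions and references to undefined clocks before any expensive analysis runs. This is not PureTime's actual input format or rule set.

```python
# Sketch of a constraint-validation front end. The tuple-based constraint
# format is a simplified stand-in for SDC, invented for this illustration.

def validate(constraints):
    clocks, errors = {}, []
    for c in constraints:
        if c[0] == "create_clock":
            _, name, period = c
            if name in clocks and clocks[name] != period:
                errors.append(f"clock '{name}' redefined: "
                              f"{clocks[name]} vs {period}")
            clocks[name] = period
    for c in constraints:
        if c[0] == "set_false_path":
            _, frm, to = c
            for clk in (frm, to):
                if clk not in clocks:
                    errors.append(
                        f"false path references undefined clock '{clk}'")
    return errors

sdc = [
    ("create_clock", "clk_a", 10),
    ("create_clock", "clk_a", 8),              # conflicting redefinition
    ("set_false_path", "clk_a", "clk_ghost"),  # undefined clock
]
print(validate(sdc))
```

As described above, constraints accumulated from multiple design groups are the usual source of such conflicts, which is why running a cheap validation pass first keeps the downstream timing and formal analysis meaningful.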

The top articles over the last two weeks as determined by the number of readers were:

STMicroelectronics Unveils Latest Advances in Design Methodologies at DAC 2009 STMicroelectronics will participate as presenter or co-author of several papers at DAC 2009. ST's contributions to the conference cover advances in design methodologies and automation in the areas of 3-D stacking for complex SoC (System-on-Chip) ICs, physical- and system-level chip design, and IC reliability.

Cadence Reports Q2 2009 Financial Results Cadence reported second quarter 2009 revenue of $210 million, compared to revenue of $308 million reported for the same period in 2008. On a GAAP basis, Cadence recognized a net loss of $74 million, or $(0.29) per share on a diluted basis, in the second quarter of 2009, compared to a net loss of $19 million, or $(0.07) per share on a diluted basis in the same period in 2008.

Atrenta Announces Major Extensions to 1Team®-Genesis Platform Atrenta Inc announced major extensions to its 1Team®-Genesis platform. 1Team-Genesis supports architectural level chip assembly and provides a rich set of capabilities to plan the design, automate its assembly and establish feasibility. Along with the industry standard SpyGlass® platform for RTL analysis and optimization, Atrenta provides a fully integrated flow from early specification to RTL handoff for implementation.

Virage Logic Reports Third Quarter Fiscal 2009 Results Total revenue for the third quarter of fiscal 2009 was $11.9 million, compared with $11.0 million for the second quarter and $15.1 million for the third quarter of fiscal 2008. License revenue for the third quarter of fiscal 2009 was $10.7 million, compared with $9.1 million for the prior quarter and $12.3 million for the same period a year ago. Royalties for the third quarter of fiscal 2009 were $1.2 million, compared with $1.9 million for the second quarter and $2.8 million for the third quarter of fiscal 2008.

Other EDA News

Mentor Graphics Unveils Android and Embedded Linux Strategy with Acquisition of Embedded Alley

Mentor Graphics Enables Android on Freescale Products Based on Power Architecture Technology, …

Mentor Graphics Announces Linux and Nucleus Multi-OS Support for Marvell Sheeva Embedded Processors

Mentor Graphics Announces Nucleus Graphics and Linux Platform for ARM Mali GPUs

DS Reports Second Quarter 2009 Financial Results At High End of Company Objectives

46th DAC Announces Preliminary Attendance Figures

UMC Adopts Cadence 40-Nanometer Reference Flow for Low Power, Verification, Implementation and …

Virage Logic Reports Third Quarter Fiscal 2009 Results

Altium Accelerates Move Towards Continuous Updates

Cadence Reports Q2 2009 Financial Results

Synopsys Honors Accellera and The SPIRIT Consortium with Ninth Annual Tenzing Norgay …

Verification Special Session at DAC 2009 to Discuss Advances in Debugging

Magma Quartz DRC and Quartz LVS Support TSMC's New Unified Physical Verification Format

Cadence Achieves First-Silicon Results on 32nm Common Platform(TM) Technology

Taiwan's Industrial Technology Research Institute Adopts Cadence C-to-Silicon Compiler to Boost …

Azuro Strengthens Leadership in Low-Power CTS on Complex SoC Designs

Magillem Releases its Verification Solution to Complement Its Existing Suite of Tools

Open SystemC Initiative Unveils Technical Working Group Milestones at DAC

Berkeley Design Automation Analog FastSPICE™ Selected by NEC Electronics

Synopsys Introduces Galaxy 2009 with 2x Faster Throughput

Teradici Chooses Synopsys as Its Primary EDA Partner

Freescale Achieves Design Cycle Reduction and Superior Silicon Predictability With Cadence …

VeriSilicon Delivers Chip Designs on Time and at Lower Cost With Cadence InCyte Chip Estimator

Magma Announces Next-Generation Mixed-Signal Design Flow With New Release of Titan Platform

LG Electronics Adopts Cadence Conformal Technology for Improved Engineering Design Management, …

Atrenta Announces Major Extensions to 1Team®-Genesis Platform

Tuscany Design Automation Introduces Web-enabled Visualization of Complete IC Design Data

Tuscany Design Automation’s Tego Physical Design Software Accelerates Structured Design

Other Embedded, IP & SoC News

OSRAM TopLED Black Series LEDs Improve Readability in High Ambient Light Conditions

Fairchild Semiconductor’s P-Channel MOSFET Offers Industry’s Lowest RDS(ON) in a 1mm x 1.5mm …

Custom Silicon Solutions Gains ISO 9001:2008 Quality Management System Certification

Micron Introduces a New Way to Increase Server Memory Capacity and Improve Performance

Akros Offers First Integrated, 2kV-Isolated, Quad-Output Power SoC

Motorola Reports Second-Quarter Financial Results

TSMC Reports Second Quarter EPS of NT$0.94

AMD Delivers Its Most Powerful Professional 3D Graphics Card With Up To Four Times The Processing …

NetLogic Microsystems Announces Second Quarter 2009 Financial Results

Amkor Reports Second Quarter 2009 Results

LSI Reports Second Quarter 2009 Results

Altera Extends Temperature Range of Stratix III FPGAs to Support Military Applications

ARM Sees Demand for HD Mobile Media and Entertainment Grow as Mali GPU Licensing Momentum Continues

UMC Reports 2009 Second Quarter Results

InterDigital Announces Second Quarter 2009 Financial Results

STMicroelectronics Reports 2009 Second Quarter and First Half Financial Results

austriamicrosystems to Manufacture Triad Semiconductor’s Via-Configurable SoC Solution

Validity Sensors Licenses Two Tensilica Processors for High-Volume Fingerprint Sensors

Austereo licenses audio IP core portfolio from Coreworks including SPDIF, I2S/LJ/RJ serial …

Virage Logic Extends IP Technology Leadership to the 32/28nm Process

Virage Logic Introduces Volume Production-Proven SiPro™ PCI Express PHY IP

IDT Reports Fiscal First Quarter 2010 Results

QuickLogic Announces Fiscal 2009 Second Quarter Results

MoSys, Inc. Reports Second Quarter 2009 Financial Results

SST Reports Second Quarter 2009 Financial Results

ANADIGICS Announces Second Quarter 2009 Results

Actel Announces Second Quarter 2009 Financial Results

Leadis Technology Reports Second Quarter 2009 Results

Analog Resistive Touch-Screen Controllers for Embedded Markets Announced by Microchip Technology

SMIC Reports 2009 Second Quarter Results

IDT Introduces Industry’s First DisplayPort-Based TCON With Integrated Digital LED Backlight for …

HomePlug® AV Technology Enables IEEE 1901 Draft Standard

Altera's 40-nm Arria II GX FPGAs Achieve PCI-SIG Compliance for PCIe Express 2.0 Specification

NXP Dual Channel Class-D Amplifiers Bring Power Efficient Concert Hall-Like Sound Into the Vehicle



-- Jack Horgan, EDACafe.com Contributing Editor.