Peggy Aycinena is a contributing editor for EDACafe.Com
CTO & Visionary: A conversation with Real Intent’s Pranav Ashar
October 23rd, 2014 by Peggy Aycinena
Dr. Pranav Ashar embodies the best of what EDA is all about these days, serving as articulate spokesman for the company’s technology, while tracking the wider view as well, the trends and future of the industry. I spoke with Dr. Ashar in early October and was impressed with his willingness to participate in an unscripted interview.
Our conversation was precipitated by Real Intent’s recent announcement of the 2014 release of its Meridian CDC product for clock-domain crossing analysis, which, per the press release, adds enhanced speed, analysis and debug support for SoC and FPGA design teams, introduces a new CDC interface approach and a new way of debugging CDC violations, and offers a unique way to handle flat and hierarchical designs comprehensively. Dr. Ashar started our conversation by talking about the announcement.
Pranav Ashar – Real Intent is in the business of solving the higher-value problems that existing tools have been unable to solve, or at least not at the level of productivity desired. We are one of the companies that design houses around the world look to for solutions. Huge companies today depend on small companies like Real Intent to solve their problems, and it is this big picture within which our announcement is being made.
Clock-domain crossing, as it currently stands, is in that realm of tough problems. Six or seven years ago, you solved the problem in pieces using laborious simulations or manual inspection. But such strategies are absolutely no longer possible with tens of thousands of crossings in designs, yet signing off on these interfaces with an automated solution has become a requirement for any tapeout.
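To make the scale problem concrete, a clock-domain crossing can be modeled very simply: any direct connection between registers clocked by different domains is a crossing that needs a synchronizer before sign-off. The sketch below is purely illustrative, with toy register names and a hypothetical `find_crossings` helper (this is not how Meridian CDC works internally), but it shows why a design with tens of thousands of such pairs cannot be reviewed by hand.

```python
# Toy CDC model: each register belongs to a clock domain; a "crossing"
# is any register-to-register connection whose endpoints are clocked
# by different domains. Illustrative only, not a real CDC engine.

def find_crossings(clock_of, connections):
    """Return the (src, dst) register pairs whose clock domains differ."""
    return [(src, dst) for src, dst in connections
            if clock_of[src] != clock_of[dst]]

# A four-register toy netlist with two clock domains.
clock_of = {"r1": "clkA", "r2": "clkA", "r3": "clkB", "r4": "clkB"}
connections = [("r1", "r2"), ("r2", "r3"), ("r3", "r4"), ("r1", "r3")]

crossings = find_crossings(clock_of, connections)
# The clkA-to-clkB connections are the crossings that need
# synchronizers (e.g., a double-flop) before tapeout.
print(crossings)  # → [('r2', 'r3'), ('r1', 'r3')]
```

In a real SoC, `connections` would come from netlist extraction and number in the millions, which is exactly why the manual inspection Dr. Ashar describes no longer scales.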
[Which brings us] to the specifics of our Meridian CDC announcement. This product has been maturing over the last several years, and is designed into the verification process of many companies today, including the largest in the world. Our latest release includes a completely new debugger. The tool analyzes the design formally and determines the areas in the design that are susceptible to error.
In any design there is a huge amount of data that an analysis tool must digest, and the tool’s job is to narrow the focus to the part of that data that is truly relevant. Doing this work across a substantial area of the design presents a significant challenge for any tool, and it points to the biggest advantage of our newest release of Meridian CDC: it provides help navigating through the tool. When the user is looking at a certain line, the product now anticipates what the user is likely to want to see, and [then suggests] actions the user may want to pursue as a next step. That is very helpful in terms of using the product.
There is actually a grand analogy here. The reason people buy the Apple iPhone is that the company has put a lot of work into determining what the user wants to do, reducing the onus on the user to anticipate usage needs. We also want our users to have [these assists], which is why this capability is part of the big advancements included in the new release of Meridian.
It is important to note that these advances in the debug user experience and productivity are ultimately only possible when the analysis engines under the hood do more and perform their tasks more efficiently.
As you will see from the press release, the new release reports large improvements in memory and performance of the semantic and formal analysis engines to enable better debug. Also, a major consideration for us in advancing Meridian CDC features is that CDC verification today is a sign-off requirement – the first priority is to not miss a failure.
The new release recommends and enables an ‘uncompromisingly efficient’ workflow that fits intuitively into the typical hierarchical design process, allowing the design team to systematically sign off on the full chip through progressive certification of the lower levels of the design hierarchy.
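The hierarchical workflow described here can be thought of as bottom-up certification: each block is checked only after all of its sub-blocks are certified, so full-chip sign-off reduces to verifying the top level against already-certified children. The sketch below is a hedged illustration with a made-up `certify` helper and a trivial per-block check, not Real Intent’s actual flow.

```python
# Illustrative bottom-up "progressive certification" over a design
# hierarchy: certify every sub-block before its parent, so the final
# full-chip check only has to trust certified children.

def certify(block, children, check, certified=None):
    """Post-order walk of the hierarchy; certify leaves first."""
    if certified is None:
        certified = set()
    for child in children.get(block, []):
        certify(child, children, check, certified)
    if check(block):          # placeholder for a real per-block CDC check
        certified.add(block)
    return certified

# Toy hierarchy: top contains cpu and io; cpu contains alu.
children = {"top": ["cpu", "io"], "cpu": ["alu"]}
result = certify("top", children, check=lambda block: True)
print(sorted(result))  # → ['alu', 'cpu', 'io', 'top']
```

The post-order traversal guarantees that when `top` is checked, `cpu`, `io`, and `alu` are already in the certified set, which is the essence of the progressive sign-off process the release describes.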
WWJD – Introducing new features into a product requires discerning in advance where the user community and the industry are headed. Where do you see Real Intent in 5 years?
Pranav Ashar – I can’t reveal all of our plans, but I can tell you that when Real Intent started in the static verification space 10+ years ago, that space was nascent and we were almost in an evangelistic role. Clock-domain crossing was a problem that was just starting to be known, but that has completely turned around over the last decade. Today, the complexity of chips is so high that all of the dire forecasts have come to pass. The need has grown with the recognition [of increasing problems] over the last three to four years. Now we see that a big change is imminent.
We are seeing something else as well. As recently as 6 years ago, people were wondering how the verification of SoCs was going to happen. Now, as a result of increased familiarity with the challenges, the design houses have a much better understanding of the various steps involved in the process.
Verification is always about analyzing the transformations that happen in the design, or the abstract concepts in the design. Every change you make in the design has implications [with regard to] the steps required in the associated design process. Once you have developed a better understanding of the implications, it becomes easier to develop an analytical model that captures the effect of the changes, a critical analysis that impacts verification.
Along the same lines, if you don’t understand something you fall back on simulation, the last defense if you have the models. But the fact is that [modern thinking] has crystallized that process, verifying along each design step [as it occurs]. This has established an important role for verification in [the overall design process].
Clock-domain verification is also static, so static verification has become firmly anchored in the SoC design and verification process. We see that over the next few years, the number of problems [associated] with static verification is only going to grow.
As a result, a lot of static verification is going to be done before the simulation starts. Simulation will continue to have its role, but it’s not going to be the solution to everything. From the perspective of Real Intent, this is one of the things we believe is going to happen over the next few years. We are firmly anchored in this technology and believe the need will grow substantially over the next few years.
WWJD – You taught VLSI design at Columbia. Why do some schools consider that subject matter to be the stuff of trade schools?
Pranav Ashar – People coming out of a Master’s program, even at top-tier schools like Columbia, are not as well prepared as one would like them to be to jump into the VLSI design field.
VLSI design, and EDA in particular, are interesting – a mash-up of a lot of different domains, including electronics, software programming, and computer science and engineering. Even limiting yourself to just these three disciplines, it is quite hard to put the onus on the schools to [provide this preparation], particularly because it is already difficult to train students well when they don’t yet have the professional perspective to be experts in all of these areas.
That being said, at the end of the day it’s about problem solving in three different domains that are not that dissimilar. The basic principles are very similar.
I had the opportunity to do a bachelor’s in electronics, including analog, digital, power electronics and all of that. I then came to Berkeley for grad school and did a lot of courses in computer science, so my education is in fact a combination of these disciplines. A well-rounded skillset, but one that took me 8 years to complete. We are expecting the universities, however, to pack all of this [curriculum] into just 4 years, clearly a very hard task.
Teaching VLSI design does not make it the [stuff] of trade schools, because both the learning and the research involved depend to a great extent on the application context. What motivates the research, motivates the applications. Yes, it’s like a feedback loop.
The students are the motivators in all of this, and [their advisors]. Researchers and research labs must have money to fund their work, and they need to work in an applied context. Otherwise, they are doing basic research, and that is very hard to fund.
WWJD – You worked earlier at NEC Labs in Princeton. With the phasing out of those types of research facilities, organizations like Bell Labs, where is basic research happening today?
Pranav Ashar – I agree with the current thinking: Whereas basic research used to happen in the Bell Labs of old, it is now happening in companies like Google, Microsoft, Facebook and so on.
Google is for all practical purposes a monopoly in the search domain, and Apple also has a lot of cash. These companies have the latitude today to put some of that cash to work in areas that don’t necessarily align [with their core businesses]. Google in a big way with their cars, Apple in advanced consumer technology, and Microsoft across a variety of applied research areas. These companies are also investing in things like radio-astronomy research and basic materials science, topics that the old Bell Labs and IBM used to explore.
Then there are self-driving cars, a non-trivial problem by anyone’s measure. It is basic research to put the technologies in place that allow a car to run on its own. Tesla is shortly going to announce an automatic assist for their Model S, which can enter the highway and go from entrance to exit. The solutions to these types of problems are very hard, because small errors can have such a large impact.
WWJD – In your time at Real Intent, has basic research been part of the company’s processes?
Pranav Ashar – I joined Real Intent in 2004, but in the interim took a couple of years of sabbatical to work on technology for malware detection for smart phones in the pre-iPhone time frame. At the time, the technology looked at a way to do malware detection by off-loading the string matching to hardware. We developed low-power and low-bandwidth hardware to do that and raised money [for the technology], but it was a little early and did not work out in the end.
I was also involved in starting a company based on simulation-acceleration technology out of NEC Labs. The premise was that it was possible, with advances in FPGAs and so on, to have a simulation-centric processor implemented on an FPGA. You would then compile the RTL instructions to run on this simulation processor, the bottleneck being the bandwidth to bring the instructions to the processor. This company did fairly well for about 3 years, but then suffered reverses in the downturn and no longer exists.
Since those sabbatical years, I have been back at Real Intent for about 5 years.
WWJD – You clearly have experience at both large and small companies. Can you address the small-company versus large-company conundrum with regard to research and innovation?
Pranav Ashar – Yes, in large companies opportunities for research are available. But the reality is that more innovation happens in small companies like Real Intent, which have their ear to the ground to respond to the needs of the environment.
When there are millions of gates, as there are today, and the tools need to be able to run overnight on those designs and determine if the crossings are correct, there are a lot of discoveries. Our work is about discovery, understanding what the requirements are and the scale of design we are dealing with. At Real Intent, we have created a synthesis of all of our technologies, a non-trivial task if you delve into the internals of the product, and work that could almost qualify as basic research in itself.
It’s also important to consider that in smaller companies like Real Intent, a large fraction of our R&D team are Ph.D.s from the universities, so the research and development at the end of the day can certainly be seen as basic research.
The separation between basic and applied research and development is in how you approach the problem. Taking existing technology and applying it to a problem is development, but looking at the problem from first principles and developing technology to solve it – that is research to a certain extent, because discovery is always involved!
And success is not just a matter of execution. In small companies we take risks, because the possibility of success is [so appealing].
Now look at Synopsys as an example of a large company. They would say that 35% of their revenue goes to R&D. But if that is the case, how does a company invest at that level and still not be able to innovate, still relying on acquiring smaller companies for innovation? The 35% figure is clearly misleading!
Yes, supporting the legacy customer base can be difficult for the larger companies. A company needs to continue to innovate and improve their products, yet support existing technologies because of that legacy user base. That R&D budget can often fall into that bucket, but that’s misleading as well.
Even when the revenue models for these large companies are fairly stable, with a lot of engineers on board, a lot of the expense in R&D is just about moving algorithms around to address the scale of the chip size and so on. Given that’s the case, these companies do not have a lot of latitude, because to truly innovate and develop new technologies, there has to be a research element – not just development.
Meanwhile, advances are still happening at smaller companies like Real Intent: companies that are not locked into a business model, not locked into quarterly reporting cycles [like publicly traded companies], and not required to answer to shareholders.
As a result, startups have a lot more freedom and latitude in terms of innovation, compared for instance to a big company like Synopsys. And it continues to be part of the reason that innovation continues to happen in smaller companies. Plus, there is much better payoff in the smaller companies.
So given the choice between being part of a stable, larger company versus a smaller company that’s really fixated on solving problems, you might tend toward the larger company for security. Nonetheless, it’s still the case that the smaller companies in the industry continue to attract the better talent compared to the bigger companies.
WWJD – What will happen to Real Intent if it grows too big to innovate?
Pranav Ashar – [Laughing] The challenge for us first is to get to that point, to see that static verification has its [appropriate] place in the technology, and then to find the next high-value problem. We will want to solve that problem better, to the point that it becomes the next de facto ‘challenge’ to the bigger companies.
Our challenge always is to respond to the problems in the world, to grow the technology to address those problems, and then to apply the company to implementing the solutions.
Of course, as with the entire EDA industry, we also have to meet the challenge of monetizing the solutions we are providing to the whole electronic industry. They continue to depend heavily on the EDA industry, yet that dependence is still not translated adequately into revenue. In that area, Real Intent faces the exact same challenge as every other EDA company, large or small!
Dr. Pranav Ashar brings more than two decades of EDA expertise to Real Intent. Previously he was Department Head at NEC Labs in Princeton, New Jersey, where he developed a number of EDA technologies that have influenced the industry.
Ashar has authored 70 refereed publications with more than 1500 citations, and co-authored a book titled, Sequential Logic Synthesis. His paper titled “Accelerating Boolean Satisfiability with Configurable Hardware” was selected as one of 25 significant contributions from 20 Years of the IEEE Symposium on Field-Programmable Custom Computing Machines. He has 35 patents granted or pending.
Ashar has served as adjunct CSEE faculty at Columbia University, where he taught VLSI design and verification courses. He has a Ph.D. in EECS from the University of California, Berkeley.