March 05, 2007
Characterization – Altos Design Automation

by Jack Horgan - Contributing Editor
Posted anew every four weeks or so, the EDA WEEKLY delivers to its readers information concerning the latest happenings in the EDA industry, covering vendors, products, finances and new developments. Frequently, feature articles on selected public or private EDA companies are presented. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us!

Is Altos self-funded?

Initially we were self-funded. Last December we took a small amount of Series A funding ($1.5 million). Most of that was from a private investor. We got some money from Jim Hogan at Vista Ventures.

How big a firm is Altos?

It is still very small. We are 7 people. Now we are trying to add a few more.

You said that there was a problem that Altos was addressing that others were not. What is that problem?

The general problem of characterization. People have made efforts to solve this problem before, and there are solutions in the marketplace; it is not that there have not been products for characterization. What has happened is that a lot of new factors have come along at the same time and have put the existing characterization solutions under undue stress. This is going to put a big hole in the whole ecosystem of people using the existing digital design flow. Take low power: as you introduce low power you start to do new things. You need to look at multiple voltages on a chip, which means you have to characterize libraries at multiple voltages. You start seeing thermal effects such as temperature inversion, where the worst-case corner no longer occurs at the highest temperature; you may get worst cases occurring at lower temperatures. You see people using multiple-threshold devices, which typically increase the size of the library by 3x, and doing power shut-off with things like state-retention flops. In addition there are new model formats for more accurate modeling, like CCS and ECSM. People are also starting to look at yield, trying to come up with an alternative set of libraries that would trade off performance for yield.

All these factors are exploding the number of potential library views that you are going to need, and the complexity of the models is going up too. You have a kind of perfect storm: more complex cells, like some of these state-retention flops, and more complex models, like current-based models. Then looming on the horizon, of course, is statistical timing. The complexity of generating statistical models is like the hurricane; the other factors are more like gale-force winds. Together they are like a double perfect storm. This is an area where people have been making do with older technology, getting by with huge run times, large compute farms and dedicated teams. A lot of people are doing it in house with a lot of homegrown tools. We just felt that it was time to take a fresh run at this, and I think we are starting to see that this was the right decision.
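Editor: The view explosion described here is multiplicative across corners, voltages, temperatures, Vt flavors and model formats. A back-of-envelope count in a minimal Python sketch; every count below is an illustrative assumption, not an Altos figure:

    # Hypothetical corner counts; each axis multiplies the number of
    # Liberty views that must be characterized for one library.
    process_corners = 3   # slow / typical / fast
    voltages = 3          # multi-voltage design
    temperatures = 3      # includes a low-temperature corner for
                          # temperature inversion
    vt_flavors = 3        # multi-Vt roughly triples the library
    model_formats = 2     # NLDM plus a current-source format (CCS/ECSM)

    views = (process_corners * voltages * temperatures
             * vt_flavors * model_formats)
    print(views)  # 162 library views, before any statistical models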

What do the acronyms CCS and ECSM stand for?

CCS stands for Composite Current Source and ECSM stands for Effective Current Source Model. They are new delay models that use a current source. These give you more accuracy than the table-lookup model that Synopsys introduced in the late 80's or early 90's and that has been the industry standard for 15 years.
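Editor: The table-lookup (NLDM) approach contrasted here stores delay in a two-dimensional table indexed by input slew and output load, and timing tools interpolate between the characterized points. A minimal sketch of that lookup; the axis values and delays in the example are made up for illustration:

    import numpy as np

    def nldm_delay(slew, load, slew_axis, load_axis, table):
        """Bilinear interpolation into an NLDM-style delay table.
        table[i][j] is the delay characterized at input slew
        slew_axis[i] and output load load_axis[j]."""
        i = int(np.clip(np.searchsorted(slew_axis, slew) - 1,
                        0, len(slew_axis) - 2))
        j = int(np.clip(np.searchsorted(load_axis, load) - 1,
                        0, len(load_axis) - 2))
        # Fractional position inside the enclosing table cell.
        u = (slew - slew_axis[i]) / (slew_axis[i + 1] - slew_axis[i])
        v = (load - load_axis[j]) / (load_axis[j + 1] - load_axis[j])
        return ((1 - u) * (1 - v) * table[i][j]
                + u * (1 - v) * table[i + 1][j]
                + (1 - u) * v * table[i][j + 1]
                + u * v * table[i + 1][j + 1])

    # Example with made-up numbers: 3x3 table, slews in ns, loads in pF.
    slews = [0.01, 0.05, 0.20]
    loads = [0.001, 0.010, 0.100]
    delays = [[0.02, 0.05, 0.30],
              [0.03, 0.06, 0.31],
              [0.05, 0.08, 0.33]]
    print(nldm_delay(0.03, 0.005, slews, loads, delays))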


What about CCS Noise?

CCS Noise is the extension of the Synopsys CCS model to address signal integrity. Synopsys had an equivalent to the NLDM model called Liberty SI. That has been deemed very hard to characterize; it takes a very long time and may not be as accurate as some people need at 65 nm and below. CCS Noise is a much more accurate model, and it takes less time to characterize. However, it is a more complex characterization task, as it requires a lot of internal details, not just the boundary information. A lot of in-house tools were written treating cells as black boxes.

Are CCS and ECSM simply generic terms, or are they formal standards?

There are competing standards. ECSM comes from Cadence, plus some stuff from Magma. CCS is the Synopsys equivalent. CCS is part of the Liberty standard, which gets blessed by the TAB at Si2; Si2 formed a technical advisory board to facilitate the evolution of the Liberty library modeling standard. ECSM is an Si2 standard. Both are open to us. They do essentially equivalent things, but they use different data. ECSM derives the current model from voltage waveforms, while CCS requires you to actually capture current information to characterize. That is the main difference.
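Editor: The difference drawn here can be shown in a few lines. An ECSM-style flow can recover the output current from a simulated voltage waveform as i(t) = C·dV/dt, while a CCS-style flow saves the current waveform directly from the simulator. A sketch using a stand-in waveform; all values are assumed for illustration:

    import numpy as np

    t = np.linspace(0.0, 1e-9, 200)   # time points (s), assumed
    c_load = 10e-15                   # output load capacitance (F), assumed
    # Stand-in for a simulated output voltage: a smooth 0-to-1 V transition.
    v_out = 1.0 / (1.0 + np.exp(-(t - 0.5e-9) / 5e-11))

    # ECSM-style: derive the current model from the voltage waveform.
    i_derived = c_load * np.gradient(v_out, t)

    # A CCS-style characterization would instead capture i(t) directly
    # from the simulator (e.g., the current into the load capacitor).
    print(i_derived.max())  # peak output current for this transition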

Editor: Si2 is
an organization of over 100 semiconductor, systems, EDA, and manufacturing
companies focused on improving the way integrated circuits are designed and
manufactured in order to speed time to market, reduce costs, and meet the
challenges of sub-micron design. Si2 focuses on developing practical technology
solutions to industry challenges.

The Open Modeling Coalition (OMC) was formed by Si2 in mid-2005 to address critical issues - such as accuracy, consistency, security, and process variations - in the characterization and modeling of libraries and IP blocks used for the design of integrated circuits.

The OMC technical objectives are to define a consistent modeling and
characterization environment in support of both static and dynamic library
representations for improved integration and adoption of advanced library
features and capabilities, such as statistical timing.  The system will
support delay modeling for library cells, macro-blocks and IP blocks, and
provide increased accuracy to silicon for 90nm and 65nm technologies, while
being extensible to future technology nodes.  Technology contributions
from Cadence Design Systems, IBM, Magma Design Automation, Synopsys, and other
companies are in support of these goals.

Tell us about the Altos products.

Since our inception we have built two products. The first one we call Liberate; it is a standard-cell and I/O library characterizer that builds Liberty models and plugs into existing digital implementation flows. That product took us just one year to build. We released Liberate in December 2005, and we were engaged with three beta customers at that time. Early in 2006 we were able to turn one of those beta sites into a paying customer; they were able to put it into their production flow. The second product, Variety, which obviously leverages a lot of the technology from the first product, was released in September 2006. Before the end of the fourth quarter we were able to get the first deal for that product. We have been able to bring these products to market and to get paying customers.

What is the main differentiation of your product?

The main differentiation of our products is that we do a lot of things to make characterization go faster. Characterization was becoming a bottleneck with all the different views and models that people were starting to require. It was becoming self-evident that it was so costly that people would start cutting corners and would not do certain things. Statistical timing will not become a reality unless models are readily available. That is how we can play a role and add value. The product is also very easy to use. A lot of characterization tools require the user to tell them, "I want to characterize it this way." There is a lot of manual intervention, a lot of setting up vectors and conditions; we automate all of that. We track the optimal set of vectors that you need to fully characterize the cell, and we can filter out duplicate vectors exercising the same path, as sketched below. Because we are automated, we have found that we do better than a lot of other people do with a more manual approach; things may be missing with the manual approach. With about 90% of the libraries we get from other people, we are able to pinpoint some holes, some areas they have missed.
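Editor: Filtering duplicate vectors that exercise the same path could look like the following sketch. The vector representation (timing arc, transition edge, side-input states) is a hypothetical illustration, not Altos's actual data model:

    def dedup_vectors(vectors):
        """Keep one vector per exercised path; drop duplicates.
        Each vector is a dict such as
        {"arc": ("A", "Y"), "edge": "rise", "side_inputs": {"B": 1}}."""
        seen, kept = set(), []
        for vec in vectors:
            key = (vec["arc"], vec["edge"],
                   tuple(sorted(vec["side_inputs"].items())))
            if key not in seen:
                seen.add(key)
                kept.append(vec)
        return kept

    # Two vectors sensitizing the same A->Y rise arc collapse to one.
    vecs = [{"arc": ("A", "Y"), "edge": "rise", "side_inputs": {"B": 1}},
            {"arc": ("A", "Y"), "edge": "rise", "side_inputs": {"B": 1}}]
    print(len(dedup_vectors(vecs)))  # 1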



-- Jack Horgan, Contributing Editor.
