April 09, 2007
Dr. Tom Williams – A Lifetime of Achievement

by Peggy Aycinena - Contributing Editor
Posted anew every four weeks or so, the EDA WEEKLY delivers to its readers information concerning the latest happenings in the EDA industry, covering vendors, products, finances and new developments. Frequently, feature articles on selected public or private EDA companies are presented. Brought to you by EDACafe.com. If we miss a story or subject that you feel deserves to be included, or you just want to suggest a future topic, please contact us!


Life is good for Tom Williams, Fellow at Synopsys. Half of his life is spent in Colorado and the other half in the wilds of Western Canada. Not only does Dr. Williams get to continue to pursue his life’s passion for design and test from his offices in Boulder, he’s also able to indulge his avocations of photography, skiing, and teaching in Alberta, a particularly rugged and scenic part of Canada where he is an adjunct professor at the University of Calgary.

These days, there’s one additional facet of Tom’s life that is the envy of his friends and associates: his core expertise, test, has become a Golden Child of EDA, with an eye to Manufacturing and Yield. DFT is walking hand in hand with DFM/DFY, and together they are moving the industry forward.

Several weeks ago, Williams gave the lunchtime keynote at ISQED at the Double Tree Hotel in San Jose. In an homage to optimism entitled “EDA to the Rescue of the ITRS Roadmap,” Tom detailed the technical challenges associated with the move to 65- and 45-nanometer design, described some of the many initiatives within the EDA community in support of the move, and put out a call for cooperation between customers and vendors to facilitate further evolution in semiconductor design and manufacturing. The talk was well received by a sold-out crowd of designers.

I was lucky to catch up with Tom after lunch that day, and enjoyed a long, impromptu chat in the lobby that covered a number of topics. Tom’s a gracious and engaging individual, down to earth, and appreciative of the many people and circumstances he says have enriched his life over the years.

Happily, I’ll be seeing Tom again shortly at DATE in Nice. He’ll be onstage during the opening session on April 17th, accepting a Lifetime Achievement Award from the European Design and Automation Association [EDAA] for outstanding contributions to the technology. From the sound of things, it couldn’t be happening to a luckier guy.

*************************

Tom Williams grew up in Rochester, New York and despite a less than auspicious start to his education – his parents were advised early on that his learning capabilities were limited at best – the system never gave up on him. Several teachers took an interest in Tom over the years, and eventually he was admitted to Clarkson University in Potsdam, New York, where he earned a BSEE. From there, he went on to the State University of New York at Binghamton for an MA in pure mathematics, and then Colorado State University for a PhD in EE. Tom’s next 360 months (his metric, not mine) were spent at IBM managing the VLSI Design for Testability group, and his last 10 years have been at Synopsys. As methodical as this all sounds, there was actually a lot of capricious luck and circumstance in all of it.

Per Tom, “I’ve been lucky throughout my life to be in the right place at the right time. For instance, my father was a speed skater and introduced me to skating as a child. When I could finally afford to buy my first pair of skates, I looked around and realized other people were taking really expensive lessons. But my family didn’t have a lot of money, so I went and asked the pro at a local rink if I could have a 15-minute lesson once a week. That was all I could afford from my paper-route income. The actual lesson time was 10 to 20 times less than what the serious competitors were getting.”

“Fortunately the guy didn’t laugh and, lo and behold, nine months later I was asked if I would skate in competition if someone paid for my lessons – I was about 14 or 15 at the time, and that was very lucky. Someone really thought that I had potential and was willing to pay for me. It must have been very expensive. I eventually became the Eastern U.S. Figure Skating Champion and was on track to be considered for the US Team. A year later, I quit skating because I was continuing to struggle in school and needed more time to study.”

“The following year, the US Team competed at the North American Figure Skating Championship; I watched it on TV. In 1961, a year after I quit competing, the entire US Team headed off to Europe for a competition, but quite tragically their plane went down in Belgium and the whole team was killed. Since that time, I’ve known for sure that I have had extraordinary luck in life. First I had the luck to be involved with skating, and then the luck to have quit before tragedy struck.”

Tom went on, “In a less profound way, my career at IBM grew out of a series of lucky circumstances as well. When I first went to Binghamton for my masters, I had to get a job to support myself. I was on the desperate side, so I went to the IBM Country Club and tried to caddy. The manager at the club said I could not caddy, but he would help me get another job at the Club. I often say IBM was so impressed with my Bachelor’s in Electrical Engineering that they gave me a job cutting down trees and driving a tractor.” Tom added with a chuckle, “That was really a lucky break!”

“But while I was in graduate school in Binghamton, I ended up being a TA for a calculus class. There was a guy from IBM in the class and, in gratitude for some help I gave him with his problem sets, he helped find a summer internship for me at IBM. At my going-away party at the end of the summer, my manager offered me a consulting position while I finished my masters. Later I asked if I could stay on full time for six months, until I left to go work on my PhD. After checking with personnel, the response was ‘No, we do not hire part time.’ I was so annoyed that my honesty about going on in school had cost me the job that I really pressed the issue by pointing out that if I had lied they would have given me a position. Not only did IBM then offer me the job, they also gave me an educational leave and stipend to help fund my schooling. Again, a very lucky break.”

“Of course, the only reason I was able to go for my PhD was that the head of the department at Clarkson became the head of the department at Colorado State University in Fort Collins. I was very fortunate that he invited me to join the PhD program at CSU, where I did my PhD under Dr. Lee Maxwell.”

“Like I said, I’ve always been lucky and my mother was right. If I hadn’t studied, I would have ended up as a garbage man. The only reason I got through undergrad, through grad school, and had the opportunity to earn the PhD was because several people said I wasn’t as dumb as my grades would lead you to believe. The fact that people went out of their way to help me has made all the difference in the world, and I try as often as possible to pass the favor on to the people I interface with. I try to be a role model to people, and give a hand to the younger generation in the technology. I hope I have earned a reputation for helping people, because I certainly think it’s important.”

*************************

Not surprisingly, as one of the world’s foremost authorities in the technology, Tom Williams has lots to say about test. He started this portion of our conversation by telling me that when he and Ed Eichelberger published their first paper on scan (specifically, “LSSD” or level-sensitive scan design) at DAC in 1977, “We thought it was such a simple concept that it would become common practice within 2 or 3 years. On the contrary, it ended up taking lots and lots of time and education to get the message across to the industry that the overhead of these test structures was worth the cost. Now the industry is well down the path and designing full scan. And, I am very proud to have played a major role in making full scan a de facto standard.”

“Today in the deep-submicron era, we’re at a really sensitive point in the technology because, as has happened every time we take a step forward in the technology, we take a step backwards in yield. So we’re all working to get yield back up and, all of a sudden, DFT [design for testability] is again playing a central role. It is essential for good diagnostics. It’s enabling the diagnostics to work well.”

“There’s actually a kind of convergence to this process. First everyone says, ‘Hey, let’s run some test tools.’ But when the tools fail in the next technology, or we have low yield, then everyone says, ‘Okay, let’s run some diagnostic tools.’”

“My experience has been over the years that it’s sort of like nip and tuck. You start to move along and believe you’re on top of things, then suddenly the technology falls away, too many defects of one type or another start to escape, and people start saying again, ‘We’ve got to do something better here.’ So all of a sudden today, there’s renewed interest in new tests for new fault models.”

“I remember in the late 1970’s and early 1980’s, you would go to a designer and tell them they needed to get as much coverage as possible of stuck-at faults. But they’d respond, ‘No, I don’t need to test for all of that.’ So there had to be a drive to force the designers to get higher coverage.”

“The truth is that jumps in the use of test technology are always driven by what people are willing to tolerate – whether it’s the designers or the customers. Today, for instance, the automobile industry is a driver because they’re a big customer and they’re insisting on zero defective chips. When designers are driven by demands like this, demands for quality from their customers, they’re willing to do more in the area of test.”

*************************

In his current work at Synopsys, Dr. Williams has been heavily involved in developing the company’s DFT MAX product along with Dr. Rohit Kapur. He says the tool includes a very simple approach to doing test data compression without any sequential elements. I asked him about the loss of data integrity that comes with compression.

He responded, “Essentially, there’s a model that you’re going after – the defect model, the stuck-at model, the bridging-fault model, or the transition fault model. You need data to measure relative to what you’re doing with that model, so essentially what you want to do is to generate a set of patterns as if you had all the time in the world and get complete coverage of the fault models you are trying to cover. Of course, if you could apply a lot more patterns, you’d have a high quality test, and there’s a lot of experimentation going on there, but people can’t afford to go after all of those things. Instead they want coverage of the fault model they are going after, and they want to apply as few patterns as possible. So how do we balance the two conflicting demands – less test data with equivalent coverage? This is a challenge. That is what DFT MAX does, with ease of use and very low overhead.”
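
(An editor’s aside for readers who like to see the mechanics: the balancing act Williams describes – full coverage of a chosen fault model with as few patterns as possible – can be sketched as a greedy pattern selection. The Python below is only an illustration of that general idea, not a description of how DFT MAX actually works; the pattern and fault names are invented.)

```python
# Minimal sketch of greedy pattern compaction: keep adding the pattern that
# detects the most still-undetected faults until the coverage target is met.
# This illustrates the coverage-vs-pattern-count tradeoff only; it is not
# how any particular commercial tool works.

def compact(patterns, all_faults, target_coverage=1.0):
    """Greedily pick patterns until the coverage target is reached."""
    selected = []
    undetected = set(all_faults)
    allowed_misses = len(all_faults) * (1.0 - target_coverage)
    while len(undetected) > allowed_misses:
        # pattern that detects the most faults we have not yet covered
        best = max(patterns, key=lambda p: len(patterns[p] & undetected))
        if not patterns[best] & undetected:
            break                      # no remaining pattern adds coverage
        selected.append(best)
        undetected -= patterns[best]
    return selected, 1.0 - len(undetected) / len(all_faults)

# Toy example with made-up pattern and fault names:
faults = {"f1", "f2", "f3", "f4", "f5"}
patterns = {"p1": {"f1", "f2"}, "p2": {"f2", "f3", "f4"},
            "p3": {"f5"}, "p4": {"f1"}}
chosen, coverage = compact(patterns, faults)
print(chosen, coverage)   # ['p2', 'p1', 'p3'] 1.0
```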

“To get higher quality, stuck-at faults are not sufficient. There are random defects which can cause shorts between nets. Can we find the net pairs that are most likely to be contributing to faults due to random defects? Well, you can do critical area analysis. If you have two lines running side by side for a long distance in a design, they’ll have a relatively high coupling capacitance. You can use that capacitance, which it turns out is directly proportional to the contribution of random defects, to target the test.”

“You now have a rank-ordered list of the net pairs that are the most likely sites of failures due to random defects in the design. From here one can generate a bridging-fault test for these net pairs, with the victim net at its weakest level and the aggressor at its strongest level. Essentially the metric we use is, for every gate output, we find the net with the highest probability of having a short to it. Then we generate these tests. This is an example of how a new fault model can be introduced to target defects that we ignored or tolerated in prior technologies.”
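
(A rough sketch of the ranking step described above: treat the extracted coupling capacitance between a pair of nets as a proxy for how likely a random defect is to bridge them, sort the pairs, and hand the top of the list to bridging-fault test generation. The net names and capacitance values below are invented; a production flow would pull them from parasitic extraction.)

```python
# Hedged illustration: rank net pairs by coupling capacitance, which the
# discussion above says is roughly proportional to the random-defect
# contribution, and keep the top pairs as bridging-fault targets.

coupling_caps = {            # (net_a, net_b) -> coupling capacitance, fF (made up)
    ("net_12", "net_47"): 8.4,
    ("net_12", "net_90"): 1.1,
    ("net_33", "net_47"): 5.7,
    ("net_05", "net_66"): 0.3,
}

# Sort from most to least likely bridge site.
ranked = sorted(coupling_caps.items(), key=lambda kv: kv[1], reverse=True)

# Keep the top-N pairs; for each, the ATPG step would drive one net
# (the aggressor) to its strong value while the other (the victim) is
# held at its weak value.
TOP_N = 2
targets = [pair for pair, _cap in ranked[:TOP_N]]
print(targets)   # [('net_12', 'net_47'), ('net_33', 'net_47')]
```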

*************************

Dr. Williams said it’s hard to talk about these technologies without invoking the history of test. We talked about the numerous papers and people that have contributed to the development of the technology.

In particular, he noted: “To me, one of the interesting pieces of history in test involves some work that I did with Eun Sei Park and Ray Mercer in 1988. We said that if you’re going to do delay testing, make sure that when you test for transition faults, you test on the longest path. A short path can absorb a fairly large delay defect and still be accepted as good, but a long path has so little slack that even a small delay defect will cause a failure. The paper on this topic got an Honorable Mention at ITC that year [International Test Conference], but was put aside as a complete dust collector for 17 years. No one referenced it at all. Then all of a sudden, some people in Japan started looking at the same ideas and now there’s a renaissance of looking for these small delay effects. The technology is starting to come back.”
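
(The longest-path argument is easy to see with a couple of made-up numbers: the same small delay defect slips through on a short path but breaks the cycle time when the test exercises the longest path through the fault site. The path delays, defect size, and cycle time below are purely illustrative.)

```python
# Illustrative numbers only: why transition faults should be tested along
# the longest path through the fault site.

cycle_time_ns = 10.0
defect_delay_ns = 1.5             # hypothetical small delay defect

paths_through_fault = {           # path name -> nominal path delay in ns
    "short_path": 4.0,
    "longest_path": 9.2,
}

for name, delay in paths_through_fault.items():
    total = delay + defect_delay_ns
    verdict = "FAIL" if total > cycle_time_ns else "pass"
    print(f"{name}: {total:.1f} ns vs {cycle_time_ns} ns cycle -> {verdict}")

# short_path:   5.5 ns vs 10.0 ns cycle -> pass  (the defect escapes)
# longest_path: 10.7 ns vs 10.0 ns cycle -> FAIL (the defect is caught)
```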

“Similarly, in 1991, Bill Underwood, Ray Mercer, and I received an award for Best Paper at DAC. Our work said that as synthesis tools get better and better, the path delay distribution of synthesized networks will cluster closer and closer to the cycle time of the machine. The impact of this is that designs would become more and more sensitive to small delay defects. Kurt Keutzer was chair of the session where we presented this paper at DAC and he said at the time, ‘Isn’t it funny how the simplest ideas are the most profound.’ He was right! In essence this work predicted that small delay transition fault testing would be required. This is in fact what happened, with the renewed interest in small delay testing by a number of groups in Japan.”

“Another piece of work that I did with Bob Dennard, Ray Mercer, Rohit Kapur and Wojciech Maly, in 1996 at ITC, was on the dramatic effects of scaling on test. One of these impacts was that for every 80 millivolt decrease in the threshold voltage there is a 10x increase in Ioff current. Thus, two things would happen: one, the effectiveness of Iddq testing would dramatically decrease, and two, the impact of power on testing and design would become a major issue in the future. This is in fact what has happened; it has taken a while, and it was key to some of what I talked about today at ISQED.”
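
(For readers who want the arithmetic behind that figure, here is the 10x-per-80 mV relationship written as a one-line scaling formula. The baseline leakage value is an arbitrary illustrative number; only the scaling rule comes from the quote above.)

```python
# Off-state leakage grows roughly one decade for every 80 mV the threshold
# voltage is lowered: I_off ~ I_off0 * 10 ** (delta_Vt_mV / 80).

def scaled_ioff(ioff_baseline_nA, vt_decrease_mV, mV_per_decade=80.0):
    """Estimate off-state leakage after lowering Vt by vt_decrease_mV."""
    return ioff_baseline_nA * 10 ** (vt_decrease_mV / mV_per_decade)

print(scaled_ioff(1.0, 80))    # 1 nA baseline ->  ~10 nA  (one decade)
print(scaled_ioff(1.0, 160))   # 1 nA baseline -> ~100 nA  (two decades)
```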

“The point here is that there is a significant amount of technology in test out there but it has to have the right kind of pull from the industry to put it into use, and I was pleased to be involved in some of these leading edge efforts with my colleagues.”

*************************

Speaking of ITC, I asked Williams if he could comment on my impressions from the International Test Conference last fall [held in Santa Clara in October 2006] that test continues to be a niche unto itself, almost provincial in its separation from the larger world of Design.

Tom chuckled and said, “Historically, when people were still writing their names in cuneiform, test was never part of the engineer’s responsibility. It was the manufacturer’s responsibility. The designers designed things, and the manufacturers got the designs and said, ‘I’ll re-generate the tests to tell the good from the bad.’” “But that started to fall apart when the manufacturing people started saying, ‘Hey, I don’t even know how this thing works, so I certainly don’t know how to write the functional patterns. Let’s ask the design community to do that.’”

“Essentially, the test community was saying that they no longer wanted to deal with absolutely everything that was thrown over the wall at them, so back when I was at IBM the design team started having to be involved. They had to write test patterns, which was difficult for them because people assumed if they could design a perfect design, they could also write a good set of patterns.”

“The designers would give their patterns to the test people to take along with the design, but if the patterns and the chips failed at the tester, it wasn’t clear whether the cause was zero yield, a bad test pattern, or maybe the tester itself, with broken pins or probes. The manufacturer in charge would say to the designer, ‘Go fix that!’” “In order to do so, the expensive tester would have to be taken out of production to debug the problem. The cost to the company was time and money. That’s when IBM started saying that the only kind of designs that were acceptable were ones with full scan, LSSD.”

Referring back to ITC, Tom continued: “Clearly, test has always been downstream, at the end of the process, and there is still a little bit of that ‘provincial’ mentality you mentioned. However, today DFT [design for testability] is really part of the whole design process. You synthesize the test and compression structures right along with the design. Even more importantly, DFM [design for manufacturing] and DFY [design for yield] are becoming part of the process. They fit right in with DFT.” “It’s true that some of the things being done in DFM today – things like the ability to do massive diagnostics and collect massive amounts of manufacturing data – that kind of feedback and data collection is new. But certainly all along, test and diagnostics have played a key role in trying to get yields to levels that are more and more acceptable.”

Tom added: “Test is a really strange thing. If you look at the total number of bits, test only requires that between 1 and 5 percent of the bits be specified. Let’s assume that you only need 2 percent in order to cover your faults, and remember that we’re always going to hold to the same coverage; then no matter how you compress the data, you’ll never do better than 50-to-1 compression. There’s a certain entropy that sets the lower bound on the number of bits required, so aren’t you throwing the baby out with the bath water if you try to compress beyond it? Go any further and you’ll lose coverage; stop short of it and you’re carrying more test data than you need.”
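
(The 50-to-1 figure follows directly from the care-bit fraction: if only 2 percent of the bits in the pattern set are actually specified, an ideal lossless scheme can shrink the data to, at best, that 2 percent. A quick sketch of the arithmetic, using the percentages quoted above:)

```python
# Best-case compression ratio when only a fraction of the test bits are
# specified (care bits); the rest carry no information that must be kept.

def max_compression_ratio(care_bit_fraction):
    """Entropy-style upper bound on lossless test-data compression."""
    return 1.0 / care_bit_fraction

print(max_compression_ratio(0.02))   # 2% care bits -> at best ~50x
print(max_compression_ratio(0.05))   # 5% care bits -> at best ~20x
print(max_compression_ratio(0.01))   # 1% care bits -> at best ~100x
```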

“Ultimately test is about having that intuitive feel; it’s about the tricks you can play to preserve coverage while minimizing pattern data volume. That’s what all of the latest and greatest technologies in test are about – capturing all of the data for small delay defects while optimizing the compression.”

*************************

Our wide-ranging conversation almost at an end, I asked Tom to put aside his self-effacing ways and give me a list of what he considered to be his most important contributions to the industry, and perhaps his top picks among the 50+ papers he’s published over the years.

He balked at the idea, but then agreed to send me an email after he’d had some time to think about it. I was delighted at the prospect, and also impressed on him that I hoped to receive in that same email some of his best photos of what he had told me was one of his favorite places – Stonehenge in the south of England. He agreed to the deal and we parted ways.

When I received his email the next day, Tom had followed through on his promise. Hence, you’ve been enjoying Tom’s favorite photos from Stonehenge peppered throughout this article. And, here are Tom’s favorite ‘moments’ from a lifetime of achievement.

Clearly, both with respect to his photography and his contributions to the technology, luck had nothing to do with it. These are the accomplishments of someone who was humble enough to work hard, and inspired enough to foment true innovation and excellence. The EDAA Lifetime Achievement Award could not go to a worthier guy.

*************************

Hello Peggy,

It was very interesting talking to you today. I hope this is what you wanted. I have reflected on what I think are the most important things I’ve worked on:

  1. The relationship between defect level, yield and test coverage as described in this paper:

    "Defect Level as a Function of Fault Coverage," (with N. C. Brown), IEEE Trans. Computers, Vol. C-30, No. 12, pp. 987-988, December 1981.

  2. How that relationship could be extended to transition or delay testing, which is having a resurgence today:

    "Statistical Delay Fault Coverage and Defect Level for Delay Faults," (with E. S. Park and M. R. Mercer), Proc. 1988 International Test Conference, Washington, DC, pp. 492-499, September 1988 (Best Paper Award - Honorable Mention).

  3. The relationship between synthesis and test:

    "The Interdependence Between Delay-Optimization of Synthesized Networks and Testing," (with B. Underwood and M. R. Mercer), Proc. 28th ACM/IEEE Design Automation Conference, San Francisco, CA, pp. 87-92, June 1991 (Best Paper Award).

  4. Playing a role in getting the industry to change in the direction of Full Scan Designs.
Cheers,

Tom Williams



*************************

Peggy Aycinena is Editor of EDA Confidential and Contributing Editor to EDA Weekly.




-- Peggy Aycinena, EDACafe.com Contributing Editor.