Dr. Tom Williams – A Lifetime of Achievement


“Today in the deep-submicron era, we’re at a really sensitive point in the technology because, as has happened every time we take a step forward in the technology, we take a step backwards in yield. So we’re all working to get yield back up and, all of a sudden, DFT [design for testability] is again playing a central role. It is essential for good diagnostics. It’s enabling the diagnostics to work well.”

“There’s actually a kind of convergence to this process. First everyone says, ‘Hey, let’s run some test tools.’ But when the tools fail in the next technology, or we have low yield, then everyone says, ‘Okay, let’s run some diagnostic tools.’”

“My experience has been over the years that it’s sort of like nip and tuck. You start to move along and believe you’re on top of things, then suddenly the technology falls away, too many defects of one type or another start to escape, and people start saying again, ‘We’ve got to do something better here.’ So all of a sudden today, there’s renewed interest in new tests for new fault models.”

“I remember in the late 1970s and early 1980s, you would go to a designer and tell them they needed to get as much coverage as possible of stuck-at faults. But they’d respond, ‘No, I don’t need to test for all of that.’ So there had to be a drive to force the designers to get higher coverage.”

“The truth is that jumps in the use of test technology are always driven by what people are willing to tolerate – whether it’s the designers or the customers. Today, for instance, the automobile industry is a driver for the industry because they’re a big customer and they’re insisting on zero defective chips. When designers are driven by demands like this, demands for quality from their customers, they’re willing to do more in the area of test.”

*************************

In his current work at Synopsys, Dr. Williams has been heavily involved in developing the company’s DFT MAX product along with Dr. Rohit Kapur. He says the tool includes a very simple approach to doing test data compression without any sequential elements. I asked him about the loss of data integrity that comes with compression.

He responded, “Essentially, there’s a model that you’re going after – the defect model, the stuck-at model, the bridging-fault model, or the transition fault model. You need data to measure relative to what you’re doing with that model, so essentially what you want to do is to generate a set of patterns as if you had all the time in the world and get complete coverage of the fault models you are trying to cover. Of course, if you could apply a lot more patterns, you’d have a higher quality test, and there’s a lot of experimentation going on there, but people can’t afford to go after all of those things. Instead they want coverage of the fault model they are going after, and they want to apply as few patterns as possible. So how do we balance the two conflicting demands: less test data with equivalent coverage? This is a challenge. That is what DFT MAX does, with ease of use and very low overhead.”
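
One simple way to picture the trade-off Williams describes is static test compaction: keep only the patterns that add coverage, so the targeted fault model is still fully covered with far less test data. The sketch below is a toy illustration of that idea, not the DFT MAX algorithm; the greedy selection and the pattern/fault names are invented for this example.

# Toy greedy compaction: cover every fault in the list with as few
# patterns as possible (pattern names and fault sets are made up).
def compact(patterns):
    # patterns: dict mapping pattern name -> set of faults it detects
    remaining = set().union(*patterns.values())   # faults not yet detected
    kept = []
    while remaining:
        # pick the pattern that detects the most still-undetected faults
        best = max(patterns, key=lambda p: len(patterns[p] & remaining))
        kept.append(best)
        remaining -= patterns[best]
    return kept

patterns = {
    "p1": {"f1", "f2", "f3"},
    "p2": {"f2", "f3"},
    "p3": {"f4"},
    "p4": {"f1", "f4"},
}
print(compact(patterns))   # ['p1', 'p3'] -> same coverage, half the patterns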

“To get higher quality, stuck-at faults are not sufficient. There are random defects which can cause shorts between nets. Can we find the net pairs that are most likely to be contributing to faults due to random defects? Well, you can do critical area analysis. If you have two lines running side-by-side for a long distance in a design, they’ll have a relatively high coupling capacitance. You can use that capacitance, which it turns out is directly proportional to the contribution of random defects, to target the test.”

“You now have a rank-ordered list of the net pairs that are the highest contributors to failures due to random defects in the design. From here one can generate a bridging-fault test for these net pairs, with the victim net at its weakest level and the aggressor at its strongest level. Essentially the metric we use is, for every gate output, we find the net with the highest probability of having a short. Then we generate these tests. This is an example of how a new fault model can be introduced to target defects that we ignored or tolerated in prior technologies.”
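
A rough sketch of the flow he describes might look like the following. This is an illustration only, not the Synopsys implementation; the net names and capacitance values are invented. Coupling capacitance stands in for critical area, and each ranked pair becomes a bridging-fault target with the victim held at its weak value and the aggressor at the opposite, strong value.

# Toy ranking of net pairs for bridging-fault test generation
# (all net names and capacitance values are invented).
coupling_cap_fF = {                 # (victim, aggressor) -> coupling cap in fF
    ("net_a", "net_b"): 3.2,
    ("net_c", "net_d"): 1.1,
    ("net_e", "net_f"): 5.7,
}

# Higher coupling capacitance -> larger critical area -> more likely short,
# so sort the pairs from most to least likely and target them in that order.
ranked = sorted(coupling_cap_fF.items(), key=lambda item: item[1], reverse=True)

for (victim, aggressor), cap in ranked:
    # One bridging-fault target per pair: drive the victim to its weak
    # value, the aggressor to the opposite (strong) value, observe the victim.
    print(f"bridge target: victim={victim} (weak), aggressor={aggressor} (strong), {cap} fF")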

*************************

Dr. Williams said it’s hard to talk about these technologies without invoking the history of test. We talked about the numerous papers and people that have contributed to the development of the technology.

In particular, he noted: “To me, one of the interesting pieces of history in test involves some work that I did with Eun Sei Park and Ray Mercer in 1988. We said that if you’re going to do delay testing, make sure that when you test for transition faults, you test on the longest path. A short path could have a large delay defect and still be accepted as good, but a long path has little tolerance, so even a small delay defect will cause a failure. The paper on this topic got an Honorable Mention at ITC [the International Test Conference] that year, but was put aside as a complete dust collector for 17 years. No one referenced it at all. Then all of a sudden, some people in Japan started looking at the same ideas, and now there’s a renaissance of looking for these small delay effects. The technology is starting to come back.”
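
The arithmetic behind this is easy to sketch: the smallest delay defect a test can catch is roughly the slack of the path the transition is launched along, so the longest path leaves the least room for a defect to hide. Below is a toy illustration with invented cycle time and path delays, not data from the 1988 paper.

# Toy slack calculation: why transition-fault tests should use the
# longest path through the fault site (all numbers are invented).
cycle_time_ns = 2.0
paths_ns = {"short_path": 0.8, "medium_path": 1.5, "longest_path": 1.9}

for name, delay in paths_ns.items():
    slack = cycle_time_ns - delay
    # A delay defect smaller than the slack still meets timing, so the
    # test accepts the part as good even though the defect is present.
    print(f"{name}: slack {slack:.1f} ns -> defects below {slack:.1f} ns escape")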

“Similarly, in 1991, Bill Underwood, Ray Mercer, and I received an award for Best Paper at DAC. Our work said that as synthesis tools get better and better, the path delay distribution of synthesized networks will be distributed closer and closer to the cycle time of the machine. The impact of this is that designs would become more and more sensitive to small delay defects. Kurt Keutzer was chair of the session where we presented this paper at DAC, and he said at the time, ‘Isn’t it funny how the simplest ideas are the most profound.’ He was right! In essence this work predicted that small delay transition fault testing would be required. This is in fact what happened, with the renewed interest in small delay testing by a number of groups in Japan.”

“Another piece of work that I did with Bob Dennard, Ray Mercer, Rohit Kapur, and Wojciech Maly, presented at ITC in 1996, was on the dramatic effects of scaling on test. One of these impacts was that for every 80 mV decrease in the threshold voltage there is a 10x increase in Ioff current. Thus, two things would happen: one, the effectiveness of Iddq testing would dramatically decrease, and two, the impact of power on testing and design would be a major issue in the future. This is in fact what has happened; it has taken a while, and it was key to some of what I talked about today at ISQED.”
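
The leakage figure he cites follows from the exponential subthreshold characteristic: with a swing of roughly 80 mV per decade of current, each 80 mV cut in threshold voltage multiplies the off-state current by ten. A quick back-of-the-envelope check of that arithmetic, using only the 80 mV-per-decade figure from the quote:

# Back-of-the-envelope Ioff scaling: one decade of off-state current per
# ~80 mV of threshold-voltage reduction (the figure cited above).
S_mV_per_decade = 80.0

def ioff_factor(delta_vt_mV):
    # factor by which Ioff increases when Vt drops by delta_vt_mV
    return 10 ** (delta_vt_mV / S_mV_per_decade)

print(ioff_factor(80))    # 10.0  -> 10x more leakage, as Williams notes
print(ioff_factor(160))   # 100.0 -> two decades for a 160 mV reduction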

“The point here is that there is a significant amount of technology in test out there, but it has to have the right kind of pull from the industry to put it into use, and I was pleased to be involved in some of these leading-edge efforts with my colleagues.”

*************************

Speaking of ITC, I asked Williams if he could comment on my impressions from the International Test Conference last fall [held in Santa Clara in October 2006] that test continues to be a niche unto itself, almost provincial in its separation from the larger world of Design.

Tom chuckled and said, “Historically, when people were still writing their names in cuneiform, test was never part of the engineer’s responsibility. It was the manufacturer’s responsibility. The designers designed things, and the manufacturers got the designs and said, ‘I’ll re-generate the tests to tell the good from the bad.’ But that started to fall apart when the manufacturing people started saying, ‘Hey, I don’t even know how this thing works, so I certainly don’t know how to write the functional patterns. Let’s ask the design community to do that.’”

