March 15, 2004
TEST & ATE - Cost of Test
While manufacturing costs per transistor have plummeted, testing costs per transistor have remained relatively level. Among the reasons for this phenomenon: increasing gate counts lead to longer test sequences; a mix of analog, digital, and memory circuitry requires different test strategies; higher operating frequencies make at-speed testing difficult; and an increasing number of functional levels requires more complex test sequences.
The Cost of Test (COT) is simplistically given by the formula:

COT per good part = (capital and operational costs) / (equipment utilization × yield × throughput)
Capital and operational costs include depreciation of the acquisition costs of the ATE machine, handlers, and probes; amortization of facility modifications; maintenance and spares; facilities; indirect materials and consumables; and labor and overhead. Equipment utilization is the percentage of time the machine spends in production, excluding non-production uses such as engineering use, maintenance and repair, and idle time. Yield is the percentage of parts passing the test, while throughput is the number of parts tested per unit time. At a purchase price in excess of $1M, depreciation is usually the dominant factor. It is often said that the cheapest ATE machine is the one that is fully depreciated.
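As a rough sketch, the relationship between these factors can be put in a few lines of code. All numbers here are illustrative assumptions, not vendor data:

```python
# Hypothetical cost-of-test estimate. The hourly cost, utilization, yield,
# and throughput figures below are made-up examples for illustration.
def cost_of_test(hourly_cost, utilization, yield_rate, throughput_per_hour):
    """Cost per good part: hourly cost spread over good parts produced."""
    good_parts_per_hour = throughput_per_hour * yield_rate * utilization
    return hourly_cost / good_parts_per_hour

# Example: $250/hr fully burdened ATE cost, 80% utilization,
# 95% yield, 1000 parts tested per hour.
cot = cost_of_test(250.0, 0.80, 0.95, 1000.0)
print(f"Cost of test: ${cot:.4f} per good part")
```

With these assumed numbers, raising utilization or yield directly lowers the cost per good part, which is why idle time and retest loom so large in the economics.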
This cost model covers only the time a chip spends "in-socket" on the ATE; it does not include the costs of test program development, simulation, and debug. Nor does it include the costs to diagnose and address the root causes of detected and undetected defects and their impact (yield loss), or the impact of “defect escapees”: an undetected defect at one level of integration translates into a probability of faults at the next level of integration of the part.
The other side of the cost coin is time as in time to market (TTM), time to volume (TTV) and time to profit (TTP). To the extent that any test related activity lies on the critical path, it has an effect on these important metrics. In an era of increasingly shorter product lifetimes, any delay can impact the size of the remaining market, a firm's share of the available market and its margin.
Reducing price is a challenge for ATE vendors given their relatively small volumes coupled with pressure to align their R&D efforts with rapidly evolving technologies. However, ATE vendors have successfully developed ways (multisite and concurrent testing) to increase throughput by introducing parallelism into what had been a serial process. Multisite capabilities enable two or more devices under test (DUTs) to be tested simultaneously. The number of sites is much greater for memory chips than logic chips. With concurrent testing the parallelism is inside the device itself. An SoC contains several embedded cores (predesigned and verified but untested blocks - logic, memory, analog)
requiring different test strategies and relying on core-vendor-supplied tests. Concurrent testing enables the simultaneous testing of multiple cores. The greatest degree of parallelism comes from combining the two approaches. Parallelism offers the greatest cost savings per device: doubling the number of devices tested simultaneously simplistically halves the cost, while a 10% to 15% reduction in ATE machine purchase price might result in only a 5% COT saving.
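The arithmetic behind that comparison can be sketched as follows. The split of hourly cost into a depreciation part and everything else is an illustrative assumption, chosen to show why a machine price cut moves the needle so little:

```python
# Illustrative numbers only: split the hourly ATE cost into a
# depreciation-driven part and everything else (labor, facilities, ...).
depreciation_per_hour = 100.0
other_per_hour = 200.0
throughput = 500.0  # devices tested per hour, single site

# Baseline cost per device.
full = (depreciation_per_hour + other_per_hour) / throughput

# A 15% cut in machine purchase price shrinks only the depreciation term.
cheaper = (0.85 * depreciation_per_hour + other_per_hour) / throughput

# Dual-site testing (assuming ideal scaling) doubles throughput instead.
dual = (depreciation_per_hour + other_per_hour) / (2 * throughput)

print(f"baseline ${full:.2f}, price cut ${cheaper:.2f} "
      f"({1 - cheaper / full:.0%} saving), dual-site ${dual:.2f} "
      f"({1 - dual / full:.0%} saving)")
```

Under these assumptions the 15% price reduction saves 5% per device, while ideal dual-site testing saves 50%, which is the gap the article describes.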
Some have suggested that a new standard might be part of the solution. In fact, there already are some IEEE standards in the test arena, including JTAG boundary scan, STIL (Standard Test Interface Language) and CTL (Core Test description Language).
In July 2002 at Semicon West, Advantest Corporation, with support from Intel, created a stir by announcing its plans to establish the Semiconductor Test Consortium (STC), a non-profit, industry-wide collaboration to develop and proliferate a Semiconductor Test Open Architecture. By July 2003 STC had released the first two drafts of the specification for OPENSTAR, the open semiconductor test architecture. Twenty-three members are listed on the website www.semitest.org, including Intel, Motorola, Fujitsu, and Philips Semiconductor. Other than founding member Advantest, no other ATE vendor is a member.
In a phone conversation, Sergio Perez, Vice Chairman of STC, acknowledged that it would be difficult for ATE vendors to “jump on board” at this time. It would take 18 to 24 months to develop a compliant product, and such a move would undermine their current product offerings. He sees many smaller module and instrumentation vendors benefiting from this open standard.
Wayne Lonowski, EDA/DFT Marketing Manager at Agilent, sees the greatest benefit for end users coming from standardized interfaces that make information flow seamless. He considers the benefits of a standardized architecture debatable.
In the meantime, the ATE industry, both vendors and customers, continues to gravitate toward single-platform systems that are open in the sense that they admit third-party instrument suppliers, even as the architecture itself remains proprietary.
The traditional approach to testing is functional, or behavioral, testing. Functional testing allows a very large number of "actual functional paths" to be exercised at speed, using millions of vectors in a few milliseconds. The problem is that the number of input/output combinations rises exponentially, and with it the testing time. Sophisticated algorithms, e.g. redundancy removal, have been developed to reduce the number of test vectors.
A different approach is structural testing which seeks to determine whether or not a particular physical failure has occurred. Fault models define the properties of the tests that will detect the faulty behavior caused by defects. The most common fault models are the single-stuck-at DC model, the transition and path delay AC models, and current measurement models. Once a set of faults is modeled, tests can be generated to differentiate the circuit with faults from the circuit without any faults.
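A toy illustration of the single-stuck-at model: the hypothetical circuit below computes (a AND b) OR c, and a second copy models an internal net stuck at 0. Exhaustively searching the input space then yields the test pattern that differentiates the good circuit from the faulty one:

```python
from itertools import product

def good(a, b, c):
    n = a & b          # internal net: output of the AND gate
    return n | c

def faulty(a, b, c):
    n = 0              # the same net modeled as stuck-at-0 (the defect)
    return n | c

# Simplistic test generation: keep every input pattern whose outputs differ
# between the fault-free and the faulty circuit.
tests = [p for p in product([0, 1], repeat=3) if good(*p) != faulty(*p)]
print(tests)  # the pattern(s) that detect this particular fault
```

Only a = b = 1 with c = 0 propagates the fault to the output; real ATPG tools find such activating-and-propagating patterns algorithmically rather than by enumeration.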
A popular approach to the ever-increasing cost-of-test problem is to apply Design-for-Test (DFT) techniques to the device during the design phase. These techniques are designed to detect specific types of faults in the IC, and they all require the addition of circuitry and/or adherence to particular design rules during the design process. DFT techniques provide a high degree of access, controllability, and observability to the internals of the design with reduced test pin count. DFT-related test techniques include internal scan, boundary scan, IDDQ and BIST.
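A minimal sketch of the internal-scan idea, under the simplifying assumption of a chain with no capture step: in test mode the flip-flops are reconfigured into a shift register, so a tester can serially load internal state (controllability) and unload it (observability) through a couple of pins:

```python
class ScanChain:
    """Toy internal-scan model: flops form a shift register in test mode."""
    def __init__(self, length):
        self.flops = [0] * length

    def shift_in(self, pattern):
        # Shift a test pattern serially into the chain, one bit per clock,
        # while the previous contents shift out for observation.
        out = []
        for bit in pattern:
            out.append(self.flops[-1])
            self.flops = [bit] + self.flops[:-1]
        return out

chain = ScanChain(4)
chain.shift_in([1, 0, 1, 1])
# The last bit shifted in sits at the head, so the chain now holds
# the pattern in reverse order.
print(chain.flops)
```

A real scan architecture adds a capture cycle between shift-in and shift-out, clocking the combinational logic's response into the same flops; this sketch shows only the serial-access mechanism.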
The trend is toward increasing test data volumes, which increase test application times. Most major test tool vendors offer deterministic compression schemes that work in conjunction with ATPG algorithms to significantly reduce scan test data volume, scan test time, and vector-buffer-memory requirements (avoiding costly reloads). On-chip circuitry decompresses the full vector set for test execution and compresses test results for transmission back to the ATE for pass/fail determination and fault diagnostics.
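Production compression schemes are hardware-based and proprietary, but the property they exploit, that scan vectors are mostly repeated fill around a few care bits, can be illustrated with a toy run-length codec. This is an assumption-laden sketch of the principle, not any vendor's algorithm:

```python
def compress(vector):
    """Run-length encode a scan vector. ATPG vectors are mostly repeated
    fill bits around a few care bits, so long runs compress well."""
    runs, count = [], 1
    for prev, cur in zip(vector, vector[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append((prev, count))
            count = 1
    runs.append((vector[-1], count))
    return runs

def decompress(runs):
    """Expand (bit, run_length) pairs back into the full vector."""
    return [bit for bit, n in runs for _ in range(n)]

# 24-bit vector with only three care bits in the middle.
v = [0] * 12 + [1, 0, 1] + [0] * 9
print(compress(v))
assert decompress(compress(v)) == v
```

Here 24 bits become five (bit, length) pairs; the same asymmetry is what lets on-chip decompressors expand a small ATE payload into the full vector set.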
-- Jack Horgan, EDACafe.com Contributing Editor.