US20030014205A1 - Methods and apparatus for semiconductor testing - Google Patents

Methods and apparatus for semiconductor testing

Info

Publication number
US20030014205A1
Authority
US
United States
Prior art keywords
data
test
test data
outlier
smoothing
Prior art date
Legal status
Granted
Application number
US10/154,627
Other versions
US6792373B2 (en)
Inventor
Eric Tabor
Current Assignee
In Depth Test LLC
Original Assignee
Individual
Priority date
Filing date
Publication date
Family has litigation
First worldwide family litigation filed ("Global patent litigation dataset" by Darts-ip, licensed under a Creative Commons Attribution 4.0 International License)
Priority claimed from US09/872,195 (US6782297B2)
Priority to US10/154,627 (US6792373B2)
Application filed by Individual
Assigned to TEST ADVANTAGE, INC. Assignor: TABOR, ERIC PAUL
Publication of US20030014205A1
Priority to US10/367,355 (US7167811B2)
Priority to US10/730,388 (US7225107B2)
Priority to US10/817,750 (US7395170B2)
Application granted
Publication of US6792373B2
Priority to US11/053,598 (US7356430B2)
Priority to US11/134,843 (US8417477B2)
Priority to US11/692,021 (US8041541B2)
Priority to US12/021,616 (US20080189575A1)
Priority to US12/111,773 (US8000928B2)
Priority to US12/573,415 (US20100088054A1)
Priority to US12/579,634 (US8606536B2)
Priority to US13/044,202 (US20110178967A1)
Assigned to TEST ACUITY SOLUTIONS, INC. Change of name from TEST ADVANTAGE, INC.
Priority to US13/853,686 (US8788237B2)
Assigned to ACACIA RESEARCH GROUP LLC. Assignor: TEST ACUITY SOLUTIONS, INC.
Assigned to IN-DEPTH TEST LLC. Assignor: ACACIA RESEARCH GROUP LLC
Priority to US15/991,324 (US11853899B2)
Adjusted expiration
Legal status: Expired - Lifetime


Classifications

    • H - ELECTRICITY
    • H01 - ELECTRIC ELEMENTS
    • H01L - SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L22/00 - Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
    • H01L22/20 - Sequence of activities consisting of a plurality of measurements, corrections, marking or sorting steps
    • H01L2924/00 - Indexing scheme for arrangements or methods for connecting or disconnecting semiconductor or solid-state bodies as covered by H01L24/00
    • H01L2924/0001 - Technical content checked by a classifier
    • H01L2924/0002 - Not covered by any one of groups H01L24/00, H01L24/00 and H01L2224/00

Definitions

  • the invention relates to semiconductor testing.
  • semiconductor companies test components to ensure that the components operate properly.
  • the test data not only determine whether the components function properly, but also may indicate deficiencies in the manufacturing process.
  • many semiconductor companies may analyze the collected data from several different components to identify problems and correct them. For example, the company may gather test data for multiple chips on each wafer among several different lots. This data may be analyzed to identify common deficiencies or patterns of defects or identify parts that may exhibit quality and performance issues and to identify or classify user-defined “good parts”. Steps may then be taken to correct the problems. Testing is typically performed before device packaging (at wafer level) as well as upon completion of assembly (final test).
  • Gathering and analyzing test data is expensive and time consuming. Automatic testers apply signals to the components and read the corresponding output signals. The output signals may be analyzed to determine whether the component is operating properly. Each tester generates a large volume of data. For example, each tester may perform 200 tests on a single component, and each of those tests may be repeated 10 times. Consequently, a test of a single component may yield 2000 results. Because each tester is testing 100 or more components an hour and several testers may be connected to the same server, an enormous amount of data must be stored. Further, to process the data, the server typically stores the test data in a database to facilitate the manipulation and analysis of the data. Storage in a conventional database, however, requires further storage capacity as well as time to organize and store the data.
  • the analysis of the gathered data is also difficult.
  • the volume of the data may demand significant processing power and time.
  • the data is not usually analyzed at product run time, but is instead typically analyzed between test runs or in other batches.
  • generating the test data also presents a complex and painstaking process.
  • a test engineer prepares a test program to instruct the tester to generate the input signals to the component and receive the output signals.
  • the program tends to be very complex to ensure full and proper operation of the component. Consequently, the test program for a moderately complex integrated circuit involves a large number of tests and results. Preparing the program demands extensive design and modification to arrive at a satisfactory solution, and optimization of the program, for example to remove redundant tests or otherwise minimize test time, requires additional exertion.
  • a method and apparatus for testing semiconductors comprises a test system comprising an outlier identification element configured to identify significant data in a set of test results.
  • the test system may be configured to provide the data in an output report.
  • the outlier identification element suitably performs the analysis at run time.
  • the outlier identification element may also operate in conjunction with a smoothing system to smooth the data and identify trends and departures from test result norms.
  • FIG. 1 is a block diagram of a test system according to various aspects of the present invention and associated functional components;
  • FIG. 2 is a block diagram of elements for operating the test system
  • FIG. 3 illustrates a flow chart for a configuration element
  • FIGS. 4A-C illustrate a flow chart for a supplemental data analysis element
  • FIG. 5 is a diagram of various sections of a wafer
  • FIGS. 6A-B further illustrate a flow chart for a supplemental data analysis element
  • FIG. 7 illustrates a flow chart for an output element
  • FIG. 8 is a flow chart for operation of an exemplary data smoothing system according to various aspects of the present invention.
  • FIG. 9 is a plot of test data for a test of multiple components
  • FIG. 10 is a representation of a wafer having multiple devices and a resistivity profile for the wafer
  • FIG. 11 is a graph of resistance values for a population of resistors in the various devices of the wafer of FIG. 10.
  • FIGS. 12A-B are general and detailed plots, respectively, of raw test data and smoothed test data for the various devices of FIG. 10.
  • the present invention may be described in terms of functional block components and various process steps. Such functional blocks and steps may be realized by any number of hardware or software components configured to perform the specified functions.
  • the present invention may employ various testers, processors, storage systems, processes, and integrated circuit components, e.g., statistical engines, memory elements, signal processing elements, logic elements, programs, and the like, which may carry out a variety of functions under the control of one or more testers, microprocessors, or other control devices.
  • the present invention may be practiced in conjunction with any number of test environments, and each system described is merely one exemplary application for the invention. Further, the present invention may employ any number of conventional techniques for data analysis, component interfacing, data processing, component handling, and the like.
  • a method and apparatus operates in conjunction with a test system 100 having a tester 102 , such as automatic test equipment (ATE) for testing semiconductors.
  • the test system 100 comprises a tester 102 and a computer system 108 .
  • the test system 100 may be configured for testing any components 106 , such as semiconductor devices on a wafer, circuit boards, packaged devices, or other electrical or optical systems.
  • the components 106 comprise multiple integrated circuit dies formed on a wafer or packaged integrated circuits or devices.
  • the tester 102 suitably comprises any test equipment that tests components 106 and generates output data relating to the testing.
  • the tester 102 may comprise a conventional automatic tester, such as a Teradyne tester, and suitably operates in conjunction with other equipment for facilitating the testing.
  • the tester 102 may be selected and configured according to the particular components 106 to be tested and/or any other appropriate criteria.
  • the tester 102 may operate in conjunction with the computer system 108 to, for example, program the tester 102 , load and/or execute the test program, collect data, provide instructions to the tester 102 , implement a statistical engine, control tester parameters, and the like.
  • the computer system 108 receives tester data from the tester 102 and performs various data analysis functions independently of the tester 102 .
  • the computer system 108 also implements a statistical engine to analyze data from the tester 102 .
  • the computer system 108 may comprise a separate computer, such as a personal computer or workstation, connected to or networked with the tester 102 to exchange signals with the tester 102 .
  • the computer system 108 may be omitted from or integrated into other components of the test system 100 and various functions may be performed by other components, such as the tester 102 .
  • the computer system 108 includes a processor 110 and a memory 112 .
  • the processor 110 comprises any suitable processor, such as a conventional Intel, Motorola, or Advanced Micro Devices processor, operating in conjunction with any suitable operating system, such as Windows 98, Windows NT, Unix, or Linux.
  • the memory 112 may comprise any appropriate memory accessible to the processor 110 , such as a random access memory (RAM) or other suitable storage system, for storing data.
  • the memory 112 of the present system includes a fast access memory for storing and receiving information and is suitably configured with sufficient capacity to facilitate the operation of the computer 108 .
  • the memory 112 includes capacity for storing output results received from the tester 102 and facilitating analysis of the output test data.
  • the memory 112 is configured for fast storage and retrieval of test data for analysis.
  • the memory 112 is suitably configured to store the elements of a dynamic datalog, suitably comprising a set of information selected by the test system 100 and/or the operator according to selected criteria and analysis based on the test results.
  • the memory 112 suitably stores a component identifier for each component 106 , such as x-y coordinates corresponding to a position of the component 106 on a wafer map for the tested wafer.
  • Each x-y coordinate in the memory 112 may be associated with a particular component 106 at the corresponding x-y coordinate on the wafer map.
  • Each component identifier has one or more fields, and each field corresponds, for example, to a particular test performed on the component 106 at the corresponding x-y position on the wafer, a statistic related to the corresponding component 106 , or other relevant data.
  • the memory 112 may be configured to include any data identified by the user as desired according to any criteria or rules.
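  • As an illustration of the x-y keyed storage described above, the following Python sketch organizes per-component test results by wafer-map coordinates; the names (WaferStore, TestRecord) and field layout are assumptions for illustration, not taken from the patent:

```python
# Hypothetical sketch of a wafer-map keyed store; names are illustrative.
from dataclasses import dataclass, field

@dataclass
class TestRecord:
    """Fields associated with one component identifier."""
    results: dict = field(default_factory=dict)  # test name -> measured value
    stats: dict = field(default_factory=dict)    # e.g. statistics, outlier flags

class WaferStore:
    """Maps x-y wafer-map coordinates to per-component test data."""
    def __init__(self):
        self._components = {}

    def record(self, x, y, test_name, value):
        rec = self._components.setdefault((x, y), TestRecord())
        rec.results[test_name] = value

    def component(self, x, y):
        return self._components.get((x, y))

store = WaferStore()
store.record(12, 7, "contact_resistance", 1012.5)
print(store.component(12, 7).results)   # {'contact_resistance': 1012.5}
```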
  • the computer 108 of the present embodiment also suitably has access to a storage system, such as another memory (or a portion of the memory 112 ), a hard drive array, an optical storage system, or other suitable storage system.
  • the storage system may be local, like a hard drive dedicated to the computer 108 or the tester 102 , or may be remote, such as a hard drive array associated with a server to which the test system 100 is connected.
  • the storage system may store programs and/or data used by the computer 108 or other components of the test system 100 .
  • the storage system comprises a database 114 available via a remote server 116 comprising, for example, a main production server for a manufacturing facility.
  • the database 114 stores tester information, such as tester data files, master data files for operating the test system 100 and its components, test programs, downloadable instructions for the test system 100 , and the like.
  • the test system 100 may include additional equipment to facilitate testing of the components 106 .
  • the present test system 100 includes a device interface 104 , like a conventional device interface board and/or a device handler or prober, to handle the components 106 and provide an interface between the components 106 and the tester 102 .
  • the test system 100 may include or be connected to other components, equipment, software, and the like to facilitate testing of the components 106 according to the particular configuration, application, environment of the test system 100 , or other relevant factors.
  • the test system 100 is connected to an appropriate communication medium, such as a local area network, intranet, or global network like the internet, to transmit information to other systems, such as the remote server 116 .
  • the test system 100 may include one or more testers 102 and one or more computers 108 .
  • one computer 108 may be connected to an appropriate number of testers 102, such as up to twenty or more, according to various factors, such as the system's throughput and the configuration of the computer 108.
  • the computer 108 may be separate from the tester 102 , or may be integrated into the tester 102 , for example utilizing one or more processors, memories, clock circuits, and the like of the tester 102 itself.
  • various functions may be performed by different computers. For example, a first computer may perform various pre-analysis tasks, several computers may then receive the data and perform data analysis, and another set of computers may prepare the dynamic datalogs and/or other output reports.
  • a test system 100 tests the components 106 and provides enhanced analysis and test results.
  • the supplemental analysis may identify incorrect, questionable, or unusual results, repetitive tests, and/or tests with a relatively high probability of failure.
  • the operator such as the product engineer, test engineer, manufacturing engineer, device engineer, or other personnel using the test data, may then use the results to verify and/or improve the test system 100 and classify components 106 .
  • the test system 100 executes an enhanced test process for testing the components 106 and collecting and analyzing test data.
  • the test system 100 suitably operates in conjunction with a software application executed by the computer 108 .
  • the software application of the present embodiment includes multiple elements for implementing the enhanced test process, including a configuration element 202 , a supplementary data analysis element 206 , and an output element 208 .
  • Each element 202 , 206 , 208 suitably comprises a software module operating on the computer 108 to perform various tasks.
  • the configuration element 202 prepares the test system 100 for testing and analysis.
  • output test data from the tester 102 is analyzed to generate supplementary test data, suitably at run time and automatically.
  • the supplementary test data is then transmitted to the operator or another system by the output element 208 .
  • the configuration element 202 configures the test system 100 for testing the components 106 and analyzing the test data.
  • the test system 100 suitably uses a predetermined set of initial parameters and, if desired, information from the operator to configure the test system 100 .
  • the test system 100 is suitably initially configured with predetermined or default parameters to minimize operator attendance to the test system 100 . Adjustments may be made to the configuration by the operator, if desired, for example via the computer 108 .
  • an exemplary configuration process 300 performed by the configuration element 202 begins with an initialization procedure (step 302 ) to set the computer 108 in an initial state.
  • the configuration element 202 then obtains application configuration information (step 304 ), for example from the database 114 , for the computer 108 and the tester 102 .
  • the configuration element 202 may access a master configuration file for the enhanced test process and/or a tool configuration file relating to the tester 102 .
  • the master configuration file may contain data relating to the proper configuration for the computer 108 and other components of the test system 100 to execute the enhanced test process.
  • the tool configuration file suitably includes data relating to the tester 102 configuration, such as connection, directory, IP address, tester node identification, manufacturer, flags, prober identification, or any other pertinent information for the tester 102 .
  • the configuration element 202 may then configure the test system 100 according to the data contained in the master configuration file and/or the tool configuration file (step 306 ).
  • the configuration element 202 may use the configuration data to retrieve further relevant information from the database 114 , such as the tester's 102 identifier (step 308 ) for associating data like logistics instances for tester data with the tester 102 .
  • the test system 100 information also suitably includes one or more default parameters that may be accepted, declined, or adjusted by the operator.
  • the test system 100 information may include global statistical process control (SPC) rules and goals that are submitted to the operator upon installation, configuration, power-up, or other appropriate time for approval and/or modification.
  • the test system 100 information may also include default wafer maps or other files that are suitably configured for each product, wafer, component 106 , or other item that may affect or be affected by the test system 100 .
  • the configuration algorithms, parameters, and any other criteria may be stored in a recipe file for easy access, correlation to specific products and/or tests, and for traceability.
  • the test system 100 commences a test run, for example in conjunction with a conventional series of tests, in accordance with a test program.
  • the tester 102 suitably executes the test program to apply signals to connections on the components 106 and read output test data from the components 106 .
  • the tester 102 may perform multiple tests on each component 106 on a wafer, and each test may be repeated several times on the same component 106 .
  • Test data from the tester 102 is stored for quick access and supplemental analysis as the test data is acquired. The data may also be stored in a long-term memory for subsequent analysis and use.
  • an exemplary set of test results for a single test of multiple components comprises a first set of test results having statistically similar values and a second set of test results characterized by values that stray from the first set.
  • Each test result may be compared to an upper test limit and a lower test limit. If a particular result for a component exceeds either limit, the component may be classified as a “bad part”.
  • Some of the test results in the second set that stray from the first set may exceed the control limits, while others do not.
  • those test results that stray from the first set but do not exceed the control limits or otherwise fail to be detected are referred to as “outliers”.
  • the outliers in the test results may be identified and analyzed for any appropriate purpose, such as to identify potentially unreliable components.
  • the outliers may also be used to identify various potential problems and/or improvements in the test and manufacturing processes.
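  • To make the distinction concrete, the sketch below shows a reading that passes fixed test limits yet strays from the surrounding population; the limits, data, and the 2-sigma criterion are illustrative only:

```python
# A part can pass fixed test limits yet still be an outlier relative to
# the distribution of its neighbors; values here are illustrative.
import statistics

LOWER, UPPER = 900.0, 1100.0                     # example test limits
readings = [995, 1001, 998, 1003, 997, 1080, 999, 1002]

mean = statistics.mean(readings)
sigma = statistics.stdev(readings)

for r in readings:
    passes = LOWER <= r <= UPPER                 # conventional pass/fail check
    outlier = abs(r - mean) > 2 * sigma          # example 2-sigma criterion
    print(f"{r:7.1f}  pass={passes}  outlier={outlier}")
```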
  • the output test data for each component, test, and repetition is stored by the tester 102 in a tester data file.
  • the output test data received from each component 106 is analyzed by the tester 102 to classify the performance of the component 106 , for example by comparison to the upper and lower test limits, and the results of the classification are also stored in the tester data file.
  • the tester data file may include additional information as well, such as logistics data and test program identification data.
  • the tester data file is then provided to the computer 108 in an output file, such as a standard tester data format (STDF) file, and stored in memory.
  • the supplementary data analysis element 206 analyzes the data to provide enhanced output results.
  • the supplementary data analysis element 206 may provide any appropriate analysis of the tester data to achieve any suitable objective.
  • the supplementary data analysis element 206 may implement a statistical engine for analyzing the output test data at run time and identifying data and characteristics of the data of interest to the operator. The data and characteristics identified may be stored, while data that is not identified may be otherwise disposed of, such as discarded.
  • the supplementary data analysis element 206 may, for example, calculate statistical figures according to the data and a set of statistical configuration data.
  • the statistical configuration data may call for any suitable type of analysis according to the needs of the test system 100 and/or the operator, such as statistical process control, outlier identification and classification, signature analyses, and data correlation.
  • the supplementary data analysis element 206 suitably performs the analysis at run time, i.e., within a matter of seconds or minutes following generation of the test data.
  • the supplementary data analysis element 206 may also perform the analysis automatically with minimal intervention from the operator and/or test engineer.
  • the supplementary data analysis element 206 performs various preliminary tasks to prepare the computer 108 for analysis of the output test data and facilitate generation of supplementary data and preparation of an output report.
  • the supplementary data analysis element 206 initially copies the tester data file to a tool input directory corresponding to the relevant tester 102 (step 402 ).
  • the supplementary data analysis element 206 also retrieves configuration data to prepare the computer 108 for supplementary analysis of the output test data.
  • the configuration data suitably includes a set of logistics data that may be retrieved from the tester data file (step 404 ).
  • the supplementary data analysis element 206 also creates a logistics reference (step 406 ).
  • the logistics reference may include tester 102 information, such as the tester 102 information derived from the tool configuration file.
  • the logistics reference is assigned an identification.
  • the configuration data may also include an identifier for the test program that generated the output test data.
  • the test program may be identified in any suitable manner, such as looking it up in the database 114 (step 408 ), by association with the tester 102 identification, or reading it from the master configuration file. If no test program identification can be established (step 410 ), a test program identification may be created and associated with the tester identification (step 412 ).
  • the configuration data further identifies the wafers in the test run to be processed by the supplementary data analysis element 206 , if fewer than all of the wafers.
  • the supplementary data analysis element 206 accesses a file indicating which wafers are to be analyzed (step 414 ). If no indication is provided, the computer 108 suitably defaults to analyzing all of the wafers in the test run.
  • the supplementary data analysis element 206 proceeds with performing the supplementary data analysis on the test data file for the wafer. Otherwise, the supplementary data analysis element 206 waits for or accesses the next test data file (step 418 ).
  • the supplementary data analysis element 206 may establish one or more section groups to be analyzed for the various wafers to be tested (step 420 ). To identify the appropriate section group to apply to the output test data, the supplementary data analysis element 206 suitably identifies an appropriate section group definition, for example according to the test program and/or the tester identification. Each section group includes one or more section arrays, and each section array includes one or more sections of the same section types.
  • Section types comprise various sorts of component 106 groups positioned in predetermined areas of the wafer.
  • a section type may include a row 502 , a column 504 , a stepper field 506 , a circular band 508 , a radial zone 510 , a quadrant 512 , or any other desired grouping of components.
  • Different section types may be used according to the configuration of the components, such as order of components processed, sections of a tube, or the like.
  • Such groups of components 106 are analyzed together to identify, for example, common defects or characteristics that may be associated with the group. For example, if a particular portion of the wafer does not conduct heat like other portions of the wafer, the test data for a particular group of components 106 may reflect common characteristics or defects associated with the uneven heating of the wafer.
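  • As a rough sketch of how components might be grouped into such sections, the code below assigns die coordinates to quadrants and radial zones; the wafer center, zone width, and function names are assumptions:

```python
# Illustrative section grouping: assign each die's x-y position to a
# quadrant and a radial zone around an assumed wafer-map center.
import math

CENTER = (50.0, 50.0)   # assumed wafer-map center (illustrative)
ZONE_WIDTH = 20.0       # radial zone width in die units (illustrative)

def quadrant(x, y):
    cx, cy = CENTER
    return {(True, True): "NE", (False, True): "NW",
            (False, False): "SW", (True, False): "SE"}[(x >= cx, y >= cy)]

def radial_zone(x, y):
    r = math.hypot(x - CENTER[0], y - CENTER[1])
    return int(r // ZONE_WIDTH)                  # 0 = innermost zone

for x, y in [(10, 80), (55, 52), (90, 15)]:
    print((x, y), quadrant(x, y), "zone", radial_zone(x, y))
```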
  • the supplemental data analysis element 206 retrieves any further relevant configuration data, such as control limits and enable flags for the test program and/or tester 102 (step 422 ).
  • the supplemental data analysis element 206 suitably retrieves a set of desired statistics or calculations associated with each section array in the section group (step 423 ). Desired statistics and calculations may be designated in any manner, such as by the operator or retrieved from a file.
  • the supplemental data analysis element 206 may also identify one or more signature analysis algorithms (step 424 ) for each relevant section type or other appropriate variation relating to the wafer and retrieve the signature algorithms from the database 114 as well.
  • All of the configuration data may be provided by default or automatically accessed by the configuration element 202 or the supplemental data analysis element 206 . Further, the configuration element 202 and the supplemental data analysis element 206 of the present embodiment suitably allow the operator to change the configuration data according to the operator's wishes or the test system 100 requirements. When the configuration data have been selected, the configuration data may be associated with relevant criteria and stored for future use as default configuration data. For example, if the operator selects a certain section group for a particular kind of components 106 , the computer 108 may automatically use the same section group for all such components 106 unless instructed otherwise by the operator.
  • the supplemental data analysis element 206 also provides for configuration and storage of the tester data file and additional data.
  • the supplemental data analysis element 206 suitably allocates memory (step 426 ), such as a portion of the memory 112 , for the data to be stored.
  • the allocation suitably provides memory for all of the data to be stored by the supplemental data analysis element 206 , including output test data from the tester data file, statistical data generated by the supplemental data analysis element 206 , control parameters, and the like.
  • the amount of memory allocated may be calculated according to, for example, the number of tests performed on the components 106 , the number of section group arrays, the control limits, statistical calculations to be performed by the supplementary data analysis element 206 , and the like.
  • the supplementary data analysis element 206 loads the relevant test data into memory (step 428 ) and performs the supplementary analysis on the output test data.
  • the supplementary data analysis element 206 may perform any number and types of data analyses according to the components 106 , configuration of the test system 100 , desires of the operator, or other relevant criteria.
  • the supplemental data analysis element 206 may be configured to analyze the sections for selected characteristics identifying potentially defective components 106 and patterns, trends, or other characteristics in the output test data that may indicate manufacturing concerns or flaws.
  • the present supplementary data analysis element 206 smoothes the output test data, calculates and analyzes various statistics based on the output test data, and identifies data and/or components 106 corresponding to various criteria.
  • the present supplementary data analysis element 206 may also classify and correlate the output test data to provide information to the operator and/or test engineer relating to the components 106 and the test system 100 .
  • the present supplementary data analysis element 206 may perform output data correlations, for example to identify potentially related or redundant tests, and outlier incidence analyses to identify tests having frequent outliers.
  • the supplementary data analysis element 206 may include a smoothing system to initially process the tester data to smooth the data and assist in the identification of outliers (step 429 ).
  • the smoothing system may also identify significant changes in the data, trends, and the like, which may be provided to the operator by the output element 208 .
  • the smoothing system is suitably implemented, for example, as a program operating on the computer system 108 .
  • the smoothing system suitably comprises multiple phases for smoothing the data according to various criteria.
  • the first phase may include a basic smoothing process.
  • the supplemental phases conditionally provide for enhanced tracking and/or additional smoothing of the test data.
  • the smoothing system suitably operates by initially adjusting an initial value of a selected tester datum according to a first smoothing technique, and supplementarily adjusting the value according to a second smoothing technique if at least one of the initial value and the initially adjusted value meets a threshold.
  • the first smoothing technique tends to smooth the data.
  • the second smoothing technique also tends to smooth the data and/or improve tracking of the data, but in a different manner from the first smoothing technique.
  • the threshold may comprise any suitable criteria for determining whether to apply supplemental smoothing.
  • the smoothing system suitably compares a plurality of preceding adjusted data to a plurality of preceding raw data to generate a comparison result, and applies a second smoothing technique to the selected datum to adjust the value of the selected datum according to whether the comparison result meets a first threshold. Further, the smoothing system suitably calculates a predicted value of the selected datum, and may apply a third smoothing technique to the selected datum to adjust the value of the selected datum according to whether the predicted value meets a second threshold.
  • a first smoothed test data point is suitably set equal to a first raw test data point (step 802 ) and the smoothing system proceeds to the next raw test data point (step 804 ).
  • the smoothing system initially determines whether smoothing is appropriate for the data point and, if so, performs a basic smoothing operation on the data. Any criteria may be applied to determine whether smoothing is appropriate, such as the number of data points received, the deviation of the data point values from a selected value, or comparison of each data point value to a threshold. In the present embodiment, the smoothing system performs a threshold comparison to determine whether data smoothing is appropriate and, if so, proceeds with an initial smoothing of the data.
  • the process starts with an initial raw data point R0, which is also designated as the first smoothed data point S0.
  • a difference between each raw data point (Rn) and the preceding smoothed data point (Sn−1) is calculated and compared to a threshold T1 (step 806). If the difference between the raw data point Rn and the preceding smoothed data point Sn−1 exceeds the threshold T1, the crossing is assumed to correspond to a significant departure from the smoothed data and to indicate a shift in the data. Accordingly, the occurrence of the threshold crossing may be noted, and the current smoothed data point Sn is set equal to the raw data point Rn (step 808). No smoothing is performed, and the process proceeds to the next raw data point.
  • the process calculates a current smoothed data point S n in conjunction with an initial smoothing process (step 810 ).
  • the initial smoothing process provides a basic smoothing of the data.
  • the basic smoothing process comprises a conventional exponential smoothing process, such as according to the following equation:
  • Sn = Sn−1 + M1 × (Rn − Sn−1)
  • where M1 is a selected smoothing coefficient, such as 0.2 or 0.3.
  • the initial smoothing process suitably uses a relatively low coefficient M1 to provide a significant amount of smoothing for the data.
  • the initial smoothing process and coefficients may be selected according to any criteria and configured in any manner, however, according to the application of the smoothing system, the data processed, requirements and capabilities of the smoothing system, and/or any other criteria.
  • the initial smoothing process may employ random, random walk, moving average, simple exponential, linear exponential, seasonal exponential, exponential weighted moving average, or any other appropriate type of smoothing to initially smooth the data.
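  • A minimal sketch of this first phase, assuming the exponential form given above together with the T1 reset rule; the parameter values are illustrative:

```python
# First smoothing phase: reset on a large step (threshold T1), otherwise
# apply conventional exponential smoothing with a low coefficient M1.
def basic_smooth(raw, t1=50.0, m1=0.2):
    smoothed = [raw[0]]                          # S0 = R0
    for r in raw[1:]:
        prev = smoothed[-1]
        if abs(r - prev) > t1:
            smoothed.append(r)                   # significant shift: no smoothing
        else:
            smoothed.append(prev + m1 * (r - prev))
    return smoothed

print(basic_smooth([100, 102, 99, 101, 180, 182, 181]))
```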
  • the data may be further analyzed for and/or subjected to smoothing.
  • Supplementary smoothing may be performed on the data to enhance the smoothing of the data and/or improve the tracking of the smoothed data to the raw data.
  • Multiple phases of supplementary smoothing may also be considered and, if appropriate, applied.
  • the various phases may be independent, interdependent, or complementary.
  • the data may be analyzed to determine whether supplementary smoothing is appropriate.
  • the data is analyzed to determine whether to perform one or more additional phases of smoothing.
  • the data is analyzed according to any appropriate criteria to determine whether supplemental smoothing may be applied (step 812 ).
  • the smoothing system identifies trends in the data, such as by comparing a plurality of adjusted data points and raw data points for preceding data and generating a comparison result according to whether substantially all of the preceding adjusted data share a common relationship (such as less than, greater than, or equal to) with substantially all of the corresponding raw data.
  • the smoothing system of the present embodiment compares a selected number P2 of raw data points to an equal number of smoothed data points. If the values of all of the P2 raw data points exceed (or equal) the corresponding smoothed data points, or if all of the raw data points are less than (or equal to) the corresponding smoothed data points, then the smoothing system may determine that the data are exhibiting a trend and should be tracked more closely. Accordingly, the occurrence may be noted, and the smoothing applied to the data may be changed by applying supplementary smoothing. If, on the other hand, neither of these criteria is satisfied, then the current smoothed data point remains as originally calculated and the relevant supplementary data smoothing is not applied.
  • the criterion for comparing the smoothed data to the raw data is selected to identify a trend in the data behind which the smoothed data may be lagging. Accordingly, the number of points P2 may be selected according to the desired sensitivity of the system to changing trends in the raw data.
  • the supplementary smoothing changes the effect of the overall smoothing according to the data analysis. Any appropriate supplementary smoothing may be applied to the data to more effectively smooth the data or track a trend in the data. For example, in the present embodiment, if the data analysis indicates a trend in the data that should be tracked more closely, then the supplementary smoothing may be applied to reduce the degree of smoothing initially applied so that the smoothed data more closely tracks the raw data (step 814 ).
  • the degree of smoothing is reduced by recalculating the value for the current smoothed data point using a reduced degree of smoothing.
  • Any suitable smoothing system may be used to more effectively track the data or otherwise respond to the results of the data analysis.
  • another conventional exponential smoothing process is applied to the data using a higher coefficient M2:
  • Sn = Sn−1 + M2 × (Rn − Sn−1)
  • the coefficients M1 and M2 may be selected according to the desired sensitivity of the system, both in the absence (M1) and the presence (M2) of trends in the raw data. In various applications, for example, the value of M2 may be higher than the value of M1.
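  • The trend test and supplementary smoothing might be sketched as follows; the helper names and the bookkeeping of the P2-point histories are assumptions:

```python
# Supplementary trend check: if the last P2 raw points all lie on the same
# side of their smoothed counterparts, re-smooth the current point with
# the higher coefficient M2. Parameter values are illustrative.
def trending(raw_hist, smooth_hist, p2=5):
    if len(raw_hist) < p2 or len(smooth_hist) < p2:
        return False
    pairs = list(zip(raw_hist[-p2:], smooth_hist[-p2:]))
    return all(r >= s for r, s in pairs) or all(r <= s for r, s in pairs)

def smooth_point(prev_s, r, raw_hist, smooth_hist, m1=0.2, m2=0.6):
    s = prev_s + m1 * (r - prev_s)               # initial smoothing (M1)
    if trending(raw_hist + [r], smooth_hist + [s]):
        s = prev_s + m2 * (r - prev_s)           # track the trend (M2)
    return s

# Rising raw data: the smoothed value lags, the trend test fires, and the
# higher coefficient pulls the smoothed point closer to the raw data.
raw, smoothed = [10, 12, 14, 16, 18], [10, 10.4, 11.1, 12.1, 13.3]
print(smooth_point(smoothed[-1], 20, raw, smoothed))
```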
  • the supplementary data smoothing may include additional phases as well.
  • the additional phases of data smoothing may similarly analyze the data in some manner to determine whether additional data smoothing should be applied. Any number of phases and types of data smoothing may be applied or considered according to the data analysis.
  • the data may be analyzed and potentially smoothed for noise control, such as using a predictive process based on the slope of the smoothed data.
  • the smoothing system computes a slope (step 816) based on a selected number P3 of smoothed data points preceding the current data point according to any appropriate process, such as line regression, N-points centered, or the like.
  • the data smoothing system uses a "least squares fit" through the preceding P3 smoothed data points to establish the slope.
  • the smoothing system predicts a value of the current smoothed data point according to the calculated slope.
  • the system compares the difference between the previously calculated value for the current smoothed data point (Sn) and the predicted value for the current smoothed data point (Sn-pred) to a range number R3 (step 818). If the difference is greater than the range R3, the occurrence may be noted and the current smoothed data point is not adjusted.
  • otherwise, the current smoothed data point is set equal to the difference between the calculated current smoothed data point (Sn) and the predicted value for the current smoothed data point (Sn-pred), multiplied by a third multiplier M3, and added to the original value of the current smoothed data point (step 820).
  • the current smoothed data point is thus set according to a modified difference between the original smoothed data point and the predicted smoothed data point, reduced by a certain amount (when M3 is less than 1).
  • Applying the predictive smoothing tends to reduce point-to-point noise sensitivity during relatively flat (or otherwise nontrending) portions of the signal.
  • the limited application of the predictive smoothing process to the smoothed data points ensures that the calculated average based on the slope does not affect the smoothed data when significant changes are occurring in the raw data, i.e., when the raw data signal is not relatively flat.
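  • One reading of this predictive phase, sketched below, fits a least-squares slope through the preceding P3 smoothed points, extrapolates one step, and blends the current point toward the prediction by M3 only when the difference stays within R3; the blending direction and parameter values are assumptions rather than a definitive implementation:

```python
# Predictive noise-control phase: least-squares slope over the last P3
# smoothed points, one-step prediction, R3 gate, M3 blend.
def predictive_adjust(smoothed, p3=5, r3=10.0, m3=0.5):
    if len(smoothed) <= p3:
        return smoothed[-1]                      # not enough history yet
    window = smoothed[-p3 - 1:-1]                # the P3 points before current
    n = len(window)
    xbar = (n - 1) / 2.0
    ybar = sum(window) / n
    slope = (sum((i - xbar) * (y - ybar) for i, y in enumerate(window))
             / sum((i - xbar) ** 2 for i in range(n)))
    predicted = window[-1] + slope               # extrapolate one step
    current = smoothed[-1]
    if abs(current - predicted) > r3:
        return current                           # large change: leave unadjusted
    return current + m3 * (predicted - current)  # pull toward the prediction

print(predictive_adjust([100.0, 100.2, 100.1, 100.3, 100.2, 103.0], r3=5.0))
```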
  • the supplementary data analysis element 206 may proceed with further analysis of the tester data.
  • the supplementary data analysis element 206 may conduct statistical process control (SPC) calculations and analyses on the output test data. More particularly, referring again to FIGS. 4 A-C, the supplemental data analysis element 206 may calculate and store desired statistics for a particular component, test, and/or section (step 430 ).
  • the statistics may comprise any statistics useful to the operator or the test system 100 , such as SPC figures that may include averages, standard deviations, minima, maxima, sums, counts, Cp, Cpk, or any other appropriate statistics.
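  • For reference, the Cp and Cpk figures named above follow the standard process-capability definitions; the data and limits in this sketch are illustrative:

```python
# Standard process-capability indices: Cp = (USL - LSL) / (6 * sigma) and
# Cpk = min(USL - mean, mean - LSL) / (3 * sigma).
import statistics

def cp_cpk(data, lsl, usl):
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

print(cp_cpk([998, 1003, 1001, 997, 1002, 999], lsl=900, usl=1100))
```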
  • the supplementary data analysis element 206 also suitably performs a signature analysis to dynamically and automatically identify trends and anomalies in the data, for example according to section, based on a combination of test results for that section and/or other data, such as historical data (step 442 ).
  • the signature analysis identifies signatures and applies a weighting system, suitably configured by the operator, based on any suitable data, such as the test data or identification of defects.
  • the signature analysis may cumulatively identify trends and anomalies that may correspond to problem areas or other characteristics of the wafer or the fabrication process. Signature analysis may be conducted for any desired signatures, such as noise peaks, waveform variations, mode shifts, and noise.
  • the computer 108 suitably performs the signature analysis on the output test data for each desired test in each desired section.
  • a signature analysis process may be performed in conjunction with the smoothing process.
  • results of the analysis indicating a trend or anomaly in the data are stored as being indicative of a change in the data or an outlier that may be of significance to the operator and/or test engineer. For example, if a trend is indicated by a comparison of sets of data in the smoothing process, the occurrence of the trend may be noted and stored. Similarly, if a data point exceeds the threshold T1 in the data smoothing process, the occurrence may be noted and stored for later analysis and/or inclusion in the output report.
  • a signature analysis process 600 may initially calculate a count (step 602 ) for a particular set of test data and control limits corresponding to a particular section and test. The signature analysis process then applies an appropriate signature analysis algorithm to the data points (step 604 ). The signature analysis is performed for each desired signature algorithm, and then to each test and each section to be analyzed. Errors identified by the signature analysis, trend results, and signature results are also stored (step 606 ). The process is repeated for each signature algorithm (step 608 ), test (step 610 ), and section (step 612 ). Upon completion, the supplementary data analysis element 206 records the errors (step 614 ), trend results (step 616 ), signature results (step 618 ), and any other desired data in the storage system.
  • each relevant data point may be associated with a value identifying the relevant characteristics (step 444 ).
  • each relevant component or data point may be associated with a series of values, suitably expressed as a hexadecimal figure, corresponding to the results of the supplementary analysis relating to the data point.
  • Each value may operate as a flag or other designator of a particular characteristic. For example, if a particular data point has failed a particular test completely, a first flag in the corresponding hexadecimal value may be set. If a particular data point is the beginning of a trend in the data, another flag may be set. Another value in the hexadecimal figure may include information relating to the trend, such as the duration of the trend in the data.
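  • A hypothetical encoding along these lines is sketched below; the specific flag positions and the placement of a trend-duration field are assumptions for illustration:

```python
# Bit-flag encoding of per-datum characteristics as one hexadecimal value.
FAILED_TEST = 0x1    # complete test failure
TREND_START = 0x2    # first point of a detected trend
OUTLIER     = 0x4    # classified as an outlier

def encode(flags, trend_duration=0):
    # low nibble: flags; next byte: trend duration (example layout)
    return flags | (trend_duration & 0xFF) << 4

value = encode(TREND_START | OUTLIER, trend_duration=7)
print(hex(value))                                # 0x76
assert value & OUTLIER and not value & FAILED_TEST
```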
  • the supplementary data analysis element 206 may also be configured to classify and correlate the data (step 446 ).
  • the supplementary data analysis element 206 may utilize the information in the hexadecimal figures associated with the data points to identify the failures, outliers, trends, and other features of the data.
  • the supplementary data analysis element 206 also suitably applies conventional correlation techniques to the data, for example to identify potentially redundant or related tests.
  • the computer 108 may perform additional analysis functions upon the generated statistics and the output test data, such as automatically identifying and classifying outliers (step 432 ). Analyzing each relevant datum according to the selected algorithm suitably identifies the outliers. If a particular algorithm is inappropriate for a set of data, the supplementary data analysis element 206 may be configured to automatically abort the analysis and select a different algorithm.
  • the supplementary data analysis element 206 may operate in any suitable manner to designate outliers, such as by comparison to selected values and/or according to treatment of the data in the data smoothing process.
  • an outlier identification element according to various aspects of the present invention initially automatically calibrates its sensitivity to outliers based on selected statistical relationships for each relevant datum (step 434 ). Some of these statistical relationships are then compared to a threshold or other reference point, such as the data mode, mean, or median, or combinations thereof, to define relative outlier threshold limits. In the present embodiment, the statistical relationships are scaled, for example by one, two, three, and six standard deviations of the data, to define the different outlier amplitudes (step 436 ). The output test data may then be compared to the outlier threshold limits to identify and classify the output test data as outliers (step 438 ).
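  • A sketch of this calibration, assuming the median as the central reference and the one, two, three, and six standard-deviation scaling described above:

```python
# Classify each datum by the outermost sigma band it exceeds, with bands
# scaled by 1, 2, 3, and 6 standard deviations around the median.
import statistics

def classify_outliers(data, scales=(1, 2, 3, 6)):
    center = statistics.median(data)
    sigma = statistics.stdev(data)
    labels = {}
    for i, x in enumerate(data):
        dev = abs(x - center) / sigma if sigma else 0.0
        band = max((s for s in scales if dev > s), default=None)
        if band is not None:
            labels[i] = f"{band}-sigma outlier"
    return labels

print(classify_outliers([10, 11, 9, 10, 12, 10, 48, 11]))  # flags index 6
```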
  • the supplementary data analysis element 206 stores the resulting statistics and outliers in memory and identifiers, such as the x-y wafer map coordinates, associated with any such statistics and outliers (step 440 ). Selected statistics, outliers, and/or failures may also trigger notification events, such as sending an electronic message to an operator, triggering a light tower, stopping the tester 102 , or notifying a server.
  • the supplementary data analysis element 206 includes a scaling element 210 and an outlier classification element 212 .
  • the scaling element 210 is configured to dynamically scale selected coefficients and other values according to the output test data.
  • the outlier classification element 212 is configured to identify and/or classify the various outliers in the data according to selected algorithms.
  • the scaling element of the present embodiment suitably uses various statistical relationships for dynamically scaling outlier sensitivity and smoothing coefficients for noise filtering sensitivity.
  • the scaling coefficients are suitably calculated by the scaling element and used to modify selected outlier sensitivity values and smoothing coefficients. Any appropriate criteria, such as suitable statistical relationships, may be used for scaling.
  • a sample statistical relationship for outlier sensitivity scaling is defined as:
  • the outlier classification element is suitably configured to identify and/or classify components 106, output test data, and analysis results according to any suitable algorithm for identifying the outliers in the output test data.
  • the outlier classification element may also identify and classify selected outliers and components 106 according to the test output test results and the information generated by the supplementary analysis element 206 .
  • the outlier classification element is suitably configured to classify the components 106 into critical/marginal/good part categories, for example in conjunction with user-defined criteria; user-defined good/bad spatial patterns recognition; classification of pertinent data for tester data compression; test setup in-situ sensitivity qualifications and analyses; tester yield leveling analyses; dynamic wafer map and/or test strip mapping for part dispositions and dynamic retest; or test program optimization analyses.
  • the outlier classification element may classify data in accordance with conventional SPC control rules, such as Western Electric rules or Nelson rules, to characterize the data.
  • the outlier classification element suitably classifies the data using a selected set of classification limit calculation methods. Any appropriate classification methods may be used to characterize the data according to the needs of the operator.
  • the present outlier classification element classifies outliers by comparing the output test data to selected thresholds, such as values corresponding to one, two, three, and six statistically scaled standard deviations from a central reference, such as the data mean, mode, and/or median. The identification of outliers in this manner tends to normalize any identified outliers for any test, regardless of datum amplitude and relative noise.
  • the outlier classification element analyzes and correlates the normalized outliers and/or the raw data points based on user-defined rules.
  • Sample user-selectable methods for the purpose of part and pattern classification based on identified outliers are as follows:
  • PartCRITICAL = True, if {(PartCumulativeOutlierCount > CountLIMIT) AND (PartCumulativeNormalizedOutlierAmplitude > NormalizedOutlierAmplitudeLIMIT)}
  • PartCRITICAL = True, if {(PartCumulativeOutlierCount² > CountLIMIT²) AND (PartCumulativeNormalizedOutlierAmplitude² > NormalizedOutlierAmplitudeLIMIT²)}
  • where σ in these examples is defined relative to the datum mean, mode, and/or median, based on the datum standard deviation scaled by key statistical relationships:
  • PartCRITICAL = True, if [((PartCOUNT 6σ + PartCOUNT 3σ) ≥ 2) OR ((PartCOUNT 2σ + PartCOUNT 1σ) ≥ 6)]
  • PartCRITICAL = True, if [((PartCOUNT 6σ + PartCOUNT 3σ) ≥ 1) AND ((PartCOUNT 2σ + PartCOUNT 1σ) ≥ 3)]
  • PartMARGINAL = True, if [(PartCOUNT 6σ + PartCOUNT 3σ + PartCOUNT 2σ + PartCOUNT 1σ) ≥ 3]
  • PartNOISY = True, if [(PartCOUNT 6σ + PartCOUNT 3σ + PartCOUNT 2σ + PartCOUNT 1σ) ≥ 1]
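  • Transcribed into code, the count-based rules above give the following sketch; the precedence among the rules (critical checked before marginal and noisy) is an assumption:

```python
# Part classification from per-part outlier counts at each sigma band.
def classify_part(c6, c3, c2, c1):
    """c6..c1: counts of 6/3/2/1-sigma outliers for one part."""
    if (c6 + c3) >= 2 or (c2 + c1) >= 6:
        return "CRITICAL"
    if (c6 + c3) >= 1 and (c2 + c1) >= 3:
        return "CRITICAL"
    if (c6 + c3 + c2 + c1) >= 3:
        return "MARGINAL"
    if (c6 + c3 + c2 + c1) >= 1:
        return "NOISY"
    return "GOOD"

print(classify_part(c6=0, c3=1, c2=1, c1=1))     # MARGINAL
```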
  • the supplementary data analysis element 206 may be configured to perform additional analysis of the output test data and the information generated by the supplementary data analysis element 206 .
  • the supplementary data analysis element 206 may identify tests having high incidences of failure or outliers, such as by comparing the total or average number of failures, outliers, or outliers in a particular classification to one or more threshold values.
  • the supplementary data analysis element 206 may also be configured to correlate data from different tests to identify similar or dissimilar trends, for example by comparing cumulative counts, outliers, and/or correlating outliers between wafers or other data sets.
  • the supplementary data analysis element 206 may also analyze and correlate data from different tests to identify and classify potential critical and/or marginal and/or good parts on the wafer.
  • the supplementary data analysis element 206 may also analyze and correlate data from different tests to identify user-defined good part patterns and/or bad part patterns on a series of wafers for the purposes of dynamic test time reduction.
  • the supplementary data analysis element 206 is also suitably configured to analyze and correlate data from different tests to identify user-defined pertinent raw data for the purposes of dynamically compressing the test data into memory.
  • the supplementary data analysis element may also analyze and correlate statistical anomalies and test data results for test node in-situ setup qualification and sensitivity analysis. Further, the supplementary data analysis element may contribute to test node yield leveling analysis, for example by identifying whether a particular test node may be improperly calibrated or otherwise producing inappropriate results.
  • the supplementary data analysis element may moreover analyze and correlate the data for the purposes of test program optimization including, but not limited to, automatic identification of redundant tests using correlated results and outlier analysis and providing additional data for use in analysis.
  • the supplementary data analysis element is also suitably configured to identify critical tests, for example by identifying regularly failed or almost failed tests, tests that are almost never-fail, and/or tests exhibiting a very low Cpk.
  • the supplementary data analysis may also provide identification of test sampling candidates, such as tests that are rarely or never failed or in which outliers are never detected.
  • the supplementary data analysis element may also provide identification of the best order test sequence based on correlation techniques, such as conventional correlation techniques, combined with analysis and correlation of identified outliers and/or other statistical anomalies, number of failures, critical tests, longest/shortest tests, or basic functionality issues associated with failure of the test.
  • the supplementary data analysis may also provide identification of critical, marginal, and good parts as defined by sensitivity parameters in a recipe configuration file.
  • Part identification may provide disposition/classification before packaging and/or shipping the part that may represent a reliability risk, and/or test time reduction through dynamic probe mapping of bad and good parts during wafer probe. Identification of these parts may be represented and output in any appropriate manner, for example as good and bad parts on a dynamically generated prober control map (for dynamic mapping), a wafer map used for offline inking equipment, a test strip map for strip testing at final test, a results file, and/or a database results table.
  • Supplemental data analysis at the cell controller level tends to increase quality control at the probe, and thus final test yields.
  • quality issues may be identified at product run time, not later.
  • the supplemental data analysis and signature analysis tends to improve the quality of data provided to the downstream and offline analysis tools, as well as test engineers or other personnel, by identifying outliers.
  • the computer 108 may include information on the wafer map identifying a group of components having signature analyses indicating a fault in the manufacturing process.
  • the signature analysis system may identify potentially defective goods that went undetected using conventional test analysis.
  • an array of semiconductor devices is positioned on a wafer.
  • the general resistivity of resistor components in the semiconductor devices varies across the wafer, for example due to uneven deposition of material or treatment of the wafer.
  • the resistance of any particular component may be within the control limits of the test.
  • the target resistance of a particular resistor component may be 1000 Ω +/− 10%.
  • the resistances of most of the resistors approach, but do not exceed, the normal distribution range of 900 Ω to 1100 Ω (FIG. 11).
  • Components on the wafer may include defects, for example due to a contaminant or imperfection in the fabrication process.
  • the defect may increase the resistance of resistors located near the low-resistivity edge of the wafer to 1080 Ω. The resistance is well over the 1000 Ω expected for a device near the middle of the wafer, but is still well within the normal distribution range.
  • the raw test data for each component may be plotted.
  • the test data exhibits considerable variation, due in part to the varying resistivity among components on the wafer as the prober indexes across rows or columns of devices.
  • the devices affected by the defect are not easily identifiable based on visual examination of the test data or comparison to the test limits.
  • the devices affected by the defect may be associated with outliers in the test data.
  • the smoothed test data is largely confined to a certain range of values.
  • the data associated with the defects is unlike the data for the surrounding components. Accordingly, the smoothed data illustrates the departure from the values associated with the surrounding devices without the defect.
  • the outlier classification element may identify and classify the outliers according to the magnitude of the departure of the outlier data from the surrounding data.
  • the output element 208 collects data from the test system 100 , suitably at run time, and provides an output report to a printer, database, operator interface, or other desired destination. Any form, such as graphical, numerical, textual, printed, or electronic form, may be used to present the output report for use or subsequent analysis.
  • the output element 208 may provide any selected content, including selected output test data from the tester 102 and results of the supplementary data analysis.
  • the output element 208 suitably provides a selection of data from the output test data specified by the operator as well as supplemental data at product run time via the dynamic datalog.
  • the output element 208 initially reads a sampling range from the database 114 (step 702 ).
  • the sampling range identifies predetermined information to be included in the output report.
  • the sampling range identifies components 106 on the wafer selected by the operator for review.
  • the predetermined components may be selected according to any criteria, such as data for various circumferential zones, radial zones, random components, or individual stepper fields.
  • the sampling range comprises a set of x-y coordinates corresponding to the positions of the predetermined components on the wafer or an identified portion of the available components in a batch.
  • the output element 208 may also be configured to include information relating to the outliers, or other information generated or identified by the supplementary data analysis element, in the dynamic datalog (step 704 ). If so configured, the identifiers, such as x-y coordinates, for each of the outliers are assembled as well. The coordinates for the operator-selected components and the outliers are merged into the dynamic datalog (step 706 ). The output element 208 retrieves selected information, such as the raw test data and one or more data elements from the supplementary data analysis element 206 , for each entry in the merged x-y coordinate array of the dynamic datalog (step 708 ). A minimal sketch of this merge follows.
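  • For illustration only, the following sketch (in Python) shows how operator-selected coordinates and outlier coordinates might be merged into one dynamic datalog entry list, as in steps 706-708; all names are hypothetical and not part of the disclosed system:

      # Hypothetical sketch: merge operator-selected and outlier x-y coordinates,
      # then attach raw and supplementary data for each merged entry.
      def build_datalog(selected_xy, outlier_xy, raw_data, supplementary_data):
          merged = sorted(set(selected_xy) | set(outlier_xy))   # step 706
          return [{"xy": xy,                                    # step 708
                   "raw": raw_data.get(xy),
                   "supplementary": supplementary_data.get(xy)}
                  for xy in merged]
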
  • the retrieved information is then suitably stored in an appropriate output report (step 710 ).
  • the report may be prepared in any appropriate format or manner.
  • the output report suitably includes the dynamic datalog having a wafer map indicating the selected components on the wafer and their classification.
  • the output element 208 may superimpose wafer map data corresponding to outliers on the wafer map of the preselected components.
  • the output element may include only the outliers from the wafer map or batch as the sampled output.
  • the output report may also include a series of graphical representations of the data to highlight the occurrence of outliers and correlations in the data.
  • the output report may further include recommendations and supporting data for the recommendations.
  • for example, where two tests produce substantially identical results, the output report may include a suggestion that the tests are redundant and recommend that one of the tests be omitted from the test program.
  • the recommendation may include a graphical representation of the data showing the identical results of the tests.
  • the output report may be provided in any suitable manner, for example output to a local workstation, sent to a server, activation of an alarm, or any other appropriate manner (step 712 ).
  • the output report may be provided off-line such that the output does not affect the operation of the system or transfer to the main server.
  • the computer 108 copies data files, performs the analysis, and generates results, for example for demonstration or verification purposes.

Abstract

A method and apparatus for testing semiconductors according to various aspects of the present invention comprises a test system comprising an outlier identification element configured to identify significant data in a set of test results. The test system may be configured to provide the data in an output report. The outlier identification element suitably performs the analysis at run time. The outlier identification element may also operate in conjunction with a smoothing system to smooth the data and identify trends and departures from test result norms.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This application is a continuation-in-part of U.S. patent application Ser. No. 09/872,195, filed on May 31, 2001, entitled METHODS AND APPARATUS FOR DATA SMOOTHING, and claims the benefit of U.S. Provisional Patent Application No. 60/293,577, filed May 24, 2001, entitled METHODS AND APPARATUS FOR DATA SMOOTHING; U.S. Provisional Patent Application No. 60/295,188, filed May 31, 2001, entitled METHODS AND APPARATUS FOR TEST DATA CONTROL AND ANALYSIS; and U.S. Provisional Patent Application No. 60/374,328, filed Apr. 21, 2002, entitled METHODS AND APPARATUS FOR TEST PROGRAM ANALYSIS AND ENHANCEMENT; and [0001]
  • incorporates the disclosure of each application by reference. To the extent that the present disclosure conflicts with any referenced application, however, the present disclosure is to be given priority.[0002]
  • FIELD OF THE INVENTION
  • The invention relates to semiconductor testing. [0003]
  • BACKGROUND OF THE INVENTION
  • Semiconductor companies test components to ensure that the components operate properly. The test data not only determine whether the components function properly, but also may indicate deficiencies in the manufacturing process. Accordingly, many semiconductor companies may analyze the collected data from several different components to identify problems and correct them. For example, the company may gather test data for multiple chips on each wafer among several different lots. This data may be analyzed to identify common deficiencies or patterns of defects or identify parts that may exhibit quality and performance issues and to identify or classify user-defined “good parts”. Steps may then be taken to correct the problems. Testing is typically performed before device packaging (at wafer level) as well as upon completion of assembly (final test). [0004]
  • Gathering and analyzing test data is expensive and time consuming. Automatic testers apply signals to the components and read the corresponding output signals. The output signals may be analyzed to determine whether the component is operating properly. Each tester generates a large volume of data. For example, each tester may perform 200 tests on a single component, and each of those tests may be repeated 10 times. Consequently, a test of a single component may yield 2000 results. Because each tester is testing 100 or more components an hour and several testers may be connected to the same server, an enormous amount of data must be stored. Further, to process the data, the server typically stores the test data in a database to facilitate the manipulation and analysis of the data. Storage in a conventional database, however, requires further storage capacity as well as time to organize and store the data. [0005]
  • The analysis of the gathered data is also difficult. The volume of the data may demand significant processing power and time. As a result, the data is not usually analyzed at product run time, but is instead typically analyzed between test runs or in other batches. [0006]
  • To alleviate some of these burdens, some companies only sample the data from the testers and discard the rest. Analyzing less than all of the data, however, ensures that the resulting analysis cannot be fully complete and accurate. As a result, sampling degrades the complete understanding of the test results. [0007]
  • In addition, acquiring the test data presents a complex and painstaking process. A test engineer prepares a test program to instruct the tester to generate the input signals to the component and receive the output signals. The program tends to be very complex to ensure full and proper operation of the component. Consequently, the test program for a moderately complex integrated circuit involves a large number of tests and results. Preparing the program demands extensive design and modification to arrive at a satisfactory solution, and optimization of the program, for example to remove redundant tests or otherwise minimize test time, requires additional exertion. [0008]
  • SUMMARY OF THE INVENTION
  • A method and apparatus for testing semiconductors according to various aspects of the present invention comprises a test system comprising an outlier identification element configured to identify significant data in a set of test results. The test system may be configured to provide the data in an output report. The outlier identification element suitably performs the analysis at run time. The outlier identification element may also operate in conjunction with a smoothing system to smooth the data and identify trends and departures from test result norms. [0009]
  • BRIEF DESCRIPTION OF THE DRAWING
  • A more complete understanding of the present invention may be derived by referring to the detailed description and the claims when considered in connection with the following illustrative figures, which may not be to scale. Like reference numbers refer to similar elements throughout the figures. [0010]
  • FIG. 1 is a block diagram of a test system according to various aspects of the present invention and associated functional components; [0011]
  • FIG. 2 is a block diagram of elements for operating the test system; [0012]
  • FIG. 3 illustrates a flow chart for a configuration element; [0013]
  • FIGS. 4A-C illustrate a flow chart for a supplemental data analysis element; [0014]
  • FIG. 5 is a diagram of various sections of a wafer; [0015]
  • FIGS. 6A-B further illustrate a flow chart for a supplemental data analysis element; [0016]
  • FIG. 7 illustrates a flow chart for an output element; [0017]
  • FIG. 8 is a flow chart for operation of an exemplary data smoothing system according to various aspects of the present invention; [0018]
  • FIG. 9 is a plot of test data for a test of multiple components; [0019]
  • FIG. 10 is a representation of a wafer having multiple devices and a resistivity profile for the wafer; [0020]
  • FIG. 11 is a graph of resistance values for a population of resistors in the various devices of the wafer of FIG. 10; and [0021]
  • FIGS. 12A-B are general and detailed plots, respectively, of raw test data and smoothed test data for the various devices of FIG. 10. [0022]
  • Elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the connections and steps performed by some of the elements in the figures may be exaggerated or omitted relative to other elements to help to improve understanding of embodiments of the present invention. [0023]
  • DETAILED DESCRIPTION OF AN EXEMPLARY EMBODIMENT
  • The present invention may be described in terms of functional block components and various process steps. Such functional blocks and steps may be realized by any number of hardware or software components configured to perform the specified functions. For example, the present invention may employ various testers, processors, storage systems, processes, and integrated circuit components, e.g., statistical engines, memory elements, signal processing elements, logic elements, programs, and the like, which may carry out a variety of functions under the control of one or more testers, microprocessors, or other control devices. In addition, the present invention may be practiced in conjunction with any number of test environments, and each system described is merely one exemplary application for the invention. Further, the present invention may employ any number of conventional techniques for data analysis, component interfacing, data processing, component handling, and the like. [0024]
  • Referring to FIG. 1, a method and apparatus according to various aspects of the present invention operates in conjunction with a [0025] test system 100 having a tester 102, such as automatic test equipment (ATE) for testing semiconductors. In the present embodiment, the test system 100 comprises a tester 102 and a computer system 108. The test system 100 may be configured for testing any components 106, such as semiconductor devices on a wafer, circuit boards, packaged devices, or other electrical or optical systems. In the present embodiment, the components 106 comprise multiple integrated circuit dies formed on a wafer or packaged integrated circuits or devices.
  • The [0026] tester 102 suitably comprises any test equipment that tests components 106 and generates output data relating to the testing. The tester 102 may comprise a conventional automatic tester, such as a Teradyne tester, and suitably operates in conjunction with other equipment for facilitating the testing. The tester 102 may be selected and configured according to the particular components 106 to be tested and/or any other appropriate criteria.
  • The [0027] tester 102 may operate in conjunction with the computer system 108 to, for example, program the tester 102, load and/or execute the test program, collect data, provide instructions to the tester 102, implement a statistical engine, control tester parameters, and the like. In the present embodiment, the computer system 108 receives tester data from the tester 102 and performs various data analysis functions independently of the tester 102. The computer system 108 also implements a statistical engine to analyze data from the tester 102. The computer system 108 may comprise a separate computer, such as a personal computer or workstation, connected to or networked with the tester 102 to exchange signals with the tester 102. In an alternative embodiment, the computer system 108 may be omitted from or integrated into other components of the test system 100 and various functions may be performed by other components, such as the tester 102.
  • The [0028] computer system 108 includes a processor 110 and a memory 112. The processor 110 comprises any suitable processor, such as a conventional Intel, Motorola, or Advanced Micro Devices processor, operating in conjunction with any suitable operating system, such as Windows 98, Windows NT, Unix, or Linux. Similarly, the memory 112 may comprise any appropriate memory accessible to the processor 110, such as a random access memory (RAM) or other suitable storage system, for storing data. In particular, the memory 112 of the present system includes a fast access memory for storing and receiving information and is suitably configured with sufficient capacity to facilitate the operation of the computer 108.
  • In the present embodiment, the [0029] memory 112 includes capacity for storing output results received from the tester 102 and facilitating analysis of the output test data. The memory 112 is configured for fast storage and retrieval of test data for analysis. The memory 112 is suitably configured to store the elements of a dynamic datalog, suitably comprising a set of information selected by the test system 100 and/or the operator according to selected criteria and analysis based on the test results.
  • For example, the [0030] memory 112 suitably stores a component identifier for each component 106, such as x-y coordinates corresponding to a position of the component 106 on a wafer map for the tested wafer. Each x-y coordinate in the memory 112 may be associated with a particular component 106 at the corresponding x-y coordinate on the wafer map. Each component identifier has one or more fields, and each field corresponds, for example, to a particular test performed on the component 106 at the corresponding x-y position on the wafer, a statistic related to the corresponding component 106, or other relevant data. The memory 112 may be configured to include any data identified by the user as desired according to any criteria or rules.
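  • As a purely illustrative sketch of such a store, test results might be keyed by x-y wafer-map coordinates, with one field per test or statistic; the field names below are hypothetical:

      # Hypothetical sketch: per-component records keyed by wafer-map x-y position.
      wafer_map = {}

      def record(x, y, field, value):
          # each component identifier (x, y) maps to its tests and statistics
          wafer_map.setdefault((x, y), {})[field] = value

      record(12, 7, "contact_resistance_ohms", 1012.0)
      record(12, 7, "outlier_flags", 0x0)
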
  • The [0031] computer 108 of the present embodiment also suitably has access to a storage system, such as another memory (or a portion of the memory 112), a hard drive array, an optical storage system, or other suitable storage system. The storage system may be local, like a hard drive dedicated to the computer 108 or the tester 102, or may be remote, such as a hard drive array associated with a server to which the test system 100 is connected. The storage system may store programs and/or data used by the computer 108 or other components of the test system 100. In the present embodiment, the storage system comprises a database 114 available via a remote server 116 comprising, for example, a main production server for a manufacturing facility. The database 114 stores tester information, such as tester data files, master data files for operating the test system 100 and its components, test programs, downloadable instructions for the test system 100, and the like.
  • The [0032] test system 100 may include additional equipment to facilitate testing of the components 106. For example, the present test system 100 includes a device interface 104, like a conventional device interface board and/or a device handler or prober, to handle the components 106 and provide an interface between the components 106 and the tester 102. The test system 100 may include or be connected to other components, equipment, software, and the like to facilitate testing of the components 106 according to the particular configuration, application, environment of the test system 100, or other relevant factors. For example, in the present embodiment, the test system 100 is connected to an appropriate communication medium, such as a local area network, intranet, or global network like the internet, to transmit information to other systems, such as the remote server 116.
  • The [0033] test system 100 may include one or more testers 102 and one or more computers 108. For example, one computer 108 may be connected to an appropriate number of, such as up to twenty or more, testers 102 according to various factors, such as the system's throughput and the configuration of the computer 108. Further, the computer 108 may be separate from the tester 102, or may be integrated into the tester 102, for example utilizing one or more processors, memories, clock circuits, and the like of the tester 102 itself. In addition, various functions may be performed by different computers. For example, a first computer may perform various pre-analysis tasks, several computers may then receive the data and perform data analysis, and another set of computers may prepare the dynamic datalogs and/or other output reports.
  • A [0034] test system 100 according to various aspects of the present invention tests the components 106 and provides enhanced analysis and test results. For example, the supplemental analysis may identify incorrect, questionable, or unusual results, repetitive tests, and/or tests with a relatively high probability of failure. The operator, such as the product engineer, test engineer, manufacturing engineer, device engineer, or other personnel using the test data, may then use the results to verify and/or improve the test system 100 and classify components 106.
  • The [0035] test system 100 according to various aspects of the present invention executes an enhanced test process for testing the components 106 and collecting and analyzing test data. The test system 100 suitably operates in conjunction with a software application executed by the computer 108. Referring to FIG. 2, the software application of the present embodiment includes multiple elements for implementing the enhanced test process, including a configuration element 202, a supplementary data analysis element 206, and an output element 208. Each element 202, 206, 208 suitably comprises a software module operating on the computer 108 to perform various tasks. Generally, the configuration element 202 prepares test system 100 for testing and analysis. In the supplementary data analysis element 206, output test data from the tester 102 is analyzed to generate supplementary test data, suitably at run time and automatically. The supplementary test data is then transmitted to the operator or another system by the output element 208.
  • The [0036] configuration element 202 configures the test system 100 for testing the components 106 and analyzing the test data. The test system 100 suitably uses a predetermined set of initial parameters and, if desired, information from the operator to configure the test system 100. The test system 100 is suitably initially configured with predetermined or default parameters to minimize operator attendance to the test system 100. Adjustments may be made to the configuration by the operator, if desired, for example via the computer 108.
  • Referring to FIG. 3, an [0037] exemplary configuration process 300 performed by the configuration element 202 begins with an initialization procedure (step 302) to set the computer 108 in an initial state. The configuration element 202 then obtains application configuration information (step 304), for example from the database 114, for the computer 108 and the tester 102. For example, the configuration element 202 may access a master configuration file for the enhanced test process and/or a tool configuration file relating to the tester 102. The master configuration file may contain data relating to the proper configuration for the computer 108 and other components of the test system 100 to execute the enhanced test process. Similarly, the tool configuration file suitably includes data relating to the tester 102 configuration, such as connection, directory, IP address, tester node identification, manufacturer, flags, prober identification, or any other pertinent information for the tester 102.
  • The [0038] configuration element 202 may then configure the test system 100 according to the data contained in the master configuration file and/or the tool configuration file (step 306). In addition, the configuration element 202 may use the configuration data to retrieve further relevant information from the database 114, such as the tester's 102 identifier (step 308) for associating data like logistics instances for tester data with the tester 102. The test system 100 information also suitably includes one or more default parameters that may be accepted, declined, or adjusted by the operator. For example, the test system 100 information may include global statistical process control (SPC) rules and goals that are submitted to the operator upon installation, configuration, power-up, or other appropriate time for approval and/or modification. The test system 100 information may also include default wafer maps or other files that are suitably configured for each product, wafer, component 106, or other item that may affect or be affected by the test system 100. The configuration algorithms, parameters, and any other criteria may be stored in a recipe file for easy access, correlation to specific products and/or tests, and for traceability.
  • When the initial configuration process is complete, the [0039] test system 100 commences a test run, for example in conjunction with a conventional series of tests, in accordance with a test program. The tester 102 suitably executes the test program to apply signals to connections on the components 106 and read output test data from the components 106. The tester 102 may perform multiple tests on each component 106 on a wafer, and each test may be repeated several times on the same component 106. Test data from the tester 102 is stored for quick access and supplemental analysis as the test data is acquired. The data may also be stored in a long-term memory for subsequent analysis and use.
  • Each test generates at least one result for at least one of the components. Referring to FIG. 9, an exemplary set of test results for a single test of multiple components comprises a first set of test results having statistically similar values and a second set of test results characterized by values that stray from the first set. Each test result may be compared to an upper test limit and a lower test limit. If a particular result for a component exceeds either limit, the component may be classified as a “bad part”. [0040]
  • Some of the test results in the second set that stray from the first set may exceed the control limits, while others do not. For the present purposes, those test results that stray from the first set but do not exceed the control limits or otherwise fail to be detected are referred to as “outliers”. The outliers in the test results may be identified and analyzed for any appropriate purpose, such as to identify potentially unreliable components. The outliers may also be used to identify various potential problems and/or improvements in the test and manufacturing processes. [0041]
  • As the [0042] tester 102 generates the test results, the output test data for each component, test, and repetition is stored by the tester 102 in a tester data file. The output test data received from each component 106 is analyzed by the tester 102 to classify the performance of the component 106, for example by comparison to the upper and lower test limits, and the results of the classification are also stored in the tester data file. The tester data file may include additional information as well, such as logistics data and test program identification data. The tester data file is then provided to the computer 108 in an output file, such as a standard tester data format (STDF) file, and stored in memory.
  • When the [0043] computer 108 receives the tester data file, the supplementary data analysis element 206 analyzes the data to provide enhanced output results. The supplementary data analysis element 206 may provide any appropriate analysis of the tester data to achieve any suitable objective. For example, the supplementary data analysis element 206 may implement a statistical engine for analyzing the output test data at run time and identifying data and characteristics of the data of interest to the operator. The data and characteristics identified may be stored, while data that is not identified may be otherwise disposed of, such as discarded.
  • The supplementary [0044] data analysis element 206 may, for example, calculate statistical figures according to the data and a set of statistical configuration data. The statistical configuration data may call for any suitable type of analysis according to the needs of the test system 100 and/or the operator, such as statistical process control, outlier identification and classification, signature analyses, and data correlation. Further, the supplementary data analysis element 206 suitably performs the analysis at run time, i.e., within a matter of seconds or minutes following generation of the test data. The supplementary data analysis element 206 may also perform the analysis automatically with minimal intervention from the operator and/or test engineer.
  • In the [0045] present test system 100, after the computer 108 receives and stores the tester data file, the supplementary data analysis element 206 performs various preliminary tasks to prepare the computer 108 for analysis of the output test data and facilitate generation of supplementary data and preparation of an output report. Referring now to FIGS. 4A-C, in the present embodiment, the supplementary data analysis element 206 initially copies the tester data file to a tool input directory corresponding to the relevant tester 102 (step 402). The supplementary data analysis element 206 also retrieves configuration data to prepare the computer 108 for supplementary analysis of the output test data.
  • The configuration data suitably includes a set of logistics data that may be retrieved from the tester data file (step [0046] 404). The supplementary data analysis element 206 also creates a logistics reference (step 406). The logistics reference may include tester 102 information, such as the tester 102 information derived from the tool configuration file. In addition, the logistics reference is assigned an identification.
  • The configuration data may also include an identifier for the test program that generated the output test data. The test program may be identified in any suitable manner, such as by looking it up in the database 114 (step 408), by association with the tester 102 identification, or by reading it from the master configuration file. [0047] If no test program identification can be established (step 410), a test program identification may be created and associated with the tester identification (step 412).
  • The configuration data further identifies the wafers in the test run to be processed by the supplementary [0048] data analysis element 206, if fewer than all of the wafers. In the present embodiment, the supplementary data analysis element 206 accesses a file indicating which wafers are to be analyzed (step 414). If no indication is provided, the computer 108 suitably defaults to analyzing all of the wafers in the test run.
  • If the wafer for the current test data file is to be analyzed (step 416), the supplementary data analysis element 206 proceeds with performing the supplementary data analysis on the test data file for the wafer. Otherwise, the supplementary data analysis element 206 waits for or accesses the next test data file (step 418). [0049]
  • The supplementary [0050] data analysis element 206 may establish one or more section groups to be analyzed for the various wafers to be tested (step 420). To identify the appropriate section group to apply to the output test data, the supplementary data analysis element 206 suitably identifies an appropriate section group definition, for example according to the test program and/or the tester identification. Each section group includes one or more section arrays, and each section array includes one or more sections of the same section types.
  • Section types comprise various sorts of component 106 groups positioned in predetermined areas of the wafer. [0051] For example, referring to FIG. 5, a section type may include a row 502, a column 504, a stepper field 506, a circular band 508, a radial zone 510, a quadrant 512, or any other desired grouping of components. Different section types may be used according to the configuration of the components, such as the order in which components are processed, sections of a tube, or the like. Such groups of components 106 are analyzed together to identify, for example, common defects or characteristics that may be associated with the group. For example, if a particular portion of the wafer does not conduct heat like other portions of the wafer, the test data for a particular group of components 106 may reflect common characteristics or defects associated with the uneven heating of the wafer.
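  • As an illustrative sketch, components might be assigned to two such section types, quadrants and radial zones, from their x-y positions; the zone geometry below is hypothetical, not prescribed by the disclosure:

      import math

      # Hypothetical sketch: assign a die at (x, y) to a quadrant 512 or a
      # radial zone 510 measured from the wafer center (cx, cy).
      def quadrant(x, y, cx, cy):
          return ("N" if y >= cy else "S") + ("E" if x >= cx else "W")

      def radial_zone(x, y, cx, cy, zone_width):
          return int(math.hypot(x - cx, y - cy) // zone_width)  # 0 = innermost
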
  • Upon identifying the section group for the current tester data file, the supplemental [0052] data analysis element 206 retrieves any further relevant configuration data, such as control limits and enable flags for the test program and/or tester 102 (step 422). In particular, the supplemental data analysis element 206 suitably retrieves a set of desired statistics or calculations associated with each section array in the section group (step 423). Desired statistics and calculations may be designated in any manner, such as by the operator or retrieved from a file. Further, the supplemental data analysis element 206 may also identify one or more signature analysis algorithms (step 424) for each relevant section type or other appropriate variation relating to the wafer and retrieve the signature algorithms from the database 114 as well.
  • All of the configuration data may be provided by default or automatically accessed by the [0053] configuration element 202 or the supplemental data analysis element 206. Further, the configuration element 202 and the supplemental data analysis element 206 of the present embodiment suitably allow the operator to change the configuration data according to the operator's wishes or the test system 100 requirements. When the configuration data have been selected, the configuration data may be associated with relevant criteria and stored for future use as default configuration data. For example, if the operator selects a certain section group for a particular kind of components 106, the computer 108 may automatically use the same section group for all such components 106 unless instructed otherwise by the operator.
  • The supplemental [0054] data analysis element 206 also provides for configuration and storage of the tester data file and additional data. The supplemental data analysis element 206 suitably allocates memory (step 426), such as a portion of the memory 112, for the data to be stored. The allocation suitably provides memory for all of the data to be stored by the supplemental data analysis element 206, including output test data from the tester data file, statistical data generated by the supplemental data analysis element 206, control parameters, and the like. The amount of memory allocated may be calculated according to, for example, the number of tests performed on the components 106, the number of section group arrays, the control limits, statistical calculations to be performed by the supplementary data analysis element 206, and the like.
  • When all of the configuration data for performing the supplementary analysis are ready and upon receipt of the output test data, the supplementary [0055] data analysis element 206 loads the relevant test data into memory (step 428) and performs the supplementary analysis on the output test data. The supplementary data analysis element 206 may perform any number and types of data analyses according to the components 106, configuration of the test system 100, desires of the operator, or other relevant criteria. The supplemental data analysis element 206 may be configured to analyze the sections for selected characteristics identifying potentially defective components 106 and patterns, trends, or other characteristics in the output test data that may indicate manufacturing concerns or flaws.
  • The present supplementary [0056] data analysis element 206, for example, smoothes the output test data, calculates and analyzes various statistics based on the output test data, and identifies data and/or components 106 corresponding to various criteria. The present supplementary data analysis element 206 may also classify and correlate the output test data to provide information to the operator and/or test engineer relating to the components 106 and the test system 100. For example, the present supplementary data analysis element 206 may perform output data correlations, for example to identify potentially related or redundant tests, and outlier incidence analyses to identify tests having frequent outliers.
  • The supplementary [0057] data analysis element 206 may include a smoothing system to initially process the tester data to smooth the data and assist in the identification of outliers (step 429). The smoothing system may also identify significant changes in the data, trends, and the like, which may be provided to the operator by the output element 208.
  • The smoothing system is suitably implemented, for example, as a program operating on the [0058] computer system 108. The smoothing system suitably comprises multiple phases for smoothing the data according to various criteria. The first phase may include a basic smoothing process. The supplemental phases conditionally provide for enhanced tracking and/or additional smoothing of the test data.
  • The smoothing system suitably operates by initially adjusting an initial value of a selected tester datum according to a first smoothing technique, and supplementarily adjusting the value according to a second smoothing technique if at least one of the initial value and the initially adjusted value meets a threshold. The first smoothing technique tends to smooth the data. The second smoothing technique also tends to smooth the data and/or improve tracking of the data, but in a different manner from the first smoothing technique. Further, the threshold may comprise any suitable criteria for determining whether to apply supplemental smoothing. The smoothing system suitably compares a plurality of preceding adjusted data to a plurality of preceding raw data to generate a comparison result, and applies a second smoothing technique to the selected datum to adjust the value of the selected datum according to whether the comparison result meets a first threshold. Further, the smoothing system suitably calculates a predicted value of the selected datum, and may apply a third smoothing technique to the selected datum to adjust the value of the selected datum according to whether the predicted value meets a second threshold. [0059]
  • Referring to FIG. 8, a first smoothed test data point is suitably set equal to a first raw test data point (step 802), and the smoothing system proceeds to the next raw test data point (step 804). [0060] Before performing smoothing operations, the smoothing system initially determines whether smoothing is appropriate for the data point and, if so, performs a basic smoothing operation on the data. Any criteria may be applied to determine whether smoothing is appropriate, such as the number of data points received, the deviation of the data point values from a selected value, or comparison of each data point value to a threshold. In the present embodiment, the smoothing system performs a threshold comparison to determine whether data smoothing is appropriate and, if so, proceeds to an initial smoothing of the data.
  • More particularly, in the present embodiment, the process starts with an initial raw data point R0, which is also designated as the first smoothed data point S0. [0061] As additional data points are received and analyzed, a difference between each raw data point (Rn) and a preceding smoothed data point (Sn−1) is calculated and compared to a threshold (T1) (step 806). If the difference between the raw data point Rn and the preceding smoothed data point Sn−1 exceeds the threshold T1, the crossing is taken to correspond to a significant departure from the smoothed data and to indicate a shift in the data. Accordingly, the occurrence of the threshold crossing may be noted, and the current smoothed data point Sn is set equal to the raw data point Rn (step 808). No smoothing is performed, and the process proceeds to the next raw data point.
  • If the difference between the raw data point and the preceding smoothed data point does not exceed the threshold T1, the process calculates a current smoothed data point Sn in conjunction with an initial smoothing process (step 810). [0062] The initial smoothing process provides a basic smoothing of the data. For example, in the present embodiment, the basic smoothing process comprises a conventional exponential smoothing process, such as according to the following equation:
  • Sn = (Rn − Sn−1) * M1 + Sn−1
  • where M1 is a selected smoothing coefficient, such as 0.2 or 0.3. [0063]
  • The initial smoothing process suitably uses a relatively low coefficient M1 to provide a significant amount of smoothing for the data. [0064] The initial smoothing process and coefficients may be selected according to any criteria and configured in any manner, according to the application of the smoothing system, the data processed, the requirements and capabilities of the smoothing system, and/or any other criteria. For example, the initial smoothing process may employ random, random walk, moving average, simple exponential, linear exponential, seasonal exponential, exponential weighted moving average, or any other appropriate type of smoothing to initially smooth the data. A minimal sketch of this first phase follows.
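  • A minimal sketch of the first smoothing phase, assuming illustrative values for the threshold T1 and coefficient M1 (the disclosure does not prescribe specific values):

      # Sketch of the initial phase: threshold check plus conventional
      # exponential smoothing.
      def basic_smooth(raw, T1=50.0, M1=0.2):
          smoothed = [raw[0]]                      # S0 = R0
          for r in raw[1:]:
              s_prev = smoothed[-1]
              if abs(r - s_prev) > T1:             # significant shift: note it,
                  smoothed.append(r)               # set Sn = Rn, no smoothing
              else:                                # Sn = (Rn - Sn-1)*M1 + Sn-1
                  smoothed.append((r - s_prev) * M1 + s_prev)
          return smoothed
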
  • The data may be further analyzed for and/or subjected to smoothing. Supplementary smoothing may be performed on the data to enhance the smoothing of the data and/or improve the tracking of the smoothed data to the raw data. Multiple phases of supplementary smoothing may also be considered and, if appropriate, applied. The various phases may be independent, interdependent, or complementary. In addition, the data may be analyzed to determine whether supplementary smoothing is appropriate. [0065]
  • In the present embodiment, the data is analyzed to determine whether to perform one or more additional phases of smoothing. The data is analyzed according to any appropriate criteria to determine whether supplemental smoothing may be applied (step 812). [0066] For example, the smoothing system may identify trends in the data, such as by comparing a plurality of adjusted data points and raw data points for preceding data and generating a comparison result according to whether substantially all of the preceding adjusted data share a common relationship (such as less than, greater than, or equal to) with substantially all of the corresponding raw data.
  • The smoothing system of the present embodiment compares a selected number P2 of raw data points to an equal number of smoothed data points. [0067] If the values of all of the P2 raw data points exceed (or are equal to) the corresponding smoothed data points, or if all of the raw data points are less than (or equal to) the corresponding smoothed data points, then the smoothing system may determine that the data is exhibiting a trend and should be tracked more closely. Accordingly, the occurrence may be noted, and the smoothing applied to the data may be changed by applying supplementary smoothing. If, on the other hand, neither of these criteria is satisfied, then the current smoothed data point remains as originally calculated and the relevant supplementary data smoothing is not applied.
  • In the present embodiment, the criterion for comparing the smoothed data to the raw data is selected to identify a trend in the data behind which the smoothed data may be lagging. Accordingly, the number of points P2 may be selected according to the desired sensitivity of the system to changing trends in the raw data. [0068]
  • The supplementary smoothing changes the effect of the overall smoothing according to the data analysis. Any appropriate supplementary smoothing may be applied to the data to more effectively smooth the data or track a trend in the data. For example, in the present embodiment, if the data analysis indicates a trend in the data that should be tracked more closely, then supplementary smoothing may be applied to reduce the degree of smoothing initially applied so that the smoothed data more closely tracks the raw data (step 814). [0069]
  • In the present embodiment, the degree of smoothing is reduced by recalculating the value for the current smoothed data point using a reduced degree of smoothing. Any suitable smoothing system may be used to more effectively track the data or otherwise respond to the results of the data analysis. In the present embodiment, another conventional exponential smoothing process is applied to the data using a higher coefficient M2:[0070]
  • Sn = (Rn − Sn−1) * M2 + Sn−1
  • The coefficients M1 and M2 may be selected according to the desired sensitivity of the system, both in the absence (M1) and the presence (M2) of trends in the raw data. Because this second process is applied to track a trend more closely, the value of M2 is suitably higher than the value of M1. [0071] A sketch of the trend test and this supplementary phase follows.
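  • A sketch of the trend test and the supplementary smoothing step, with illustrative values for P2 and M2:

      # Sketch: detect a trend when the last P2 raw points all lie on one side
      # of their smoothed counterparts, then re-smooth with the higher M2.
      def is_trending(raw, smoothed, P2=5):
          if len(raw) < P2:
              return False
          pairs = list(zip(raw[-P2:], smoothed[-P2:]))
          return (all(r >= s for r, s in pairs) or
                  all(r <= s for r, s in pairs))

      def supplementary_smooth(r_n, s_prev, M2=0.6):
          return (r_n - s_prev) * M2 + s_prev      # Sn = (Rn - Sn-1)*M2 + Sn-1
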
  • The supplementary data smoothing may include additional phases as well. The additional phases of data smoothing may similarly analyze the data in some manner to determine whether additional data smoothing should be applied. Any number of phases and types of data smoothing may be applied or considered according to the data analysis. [0072]
  • For example, in the present embodiment, the data may be analyzed and potentially smoothed for noise control, such as by using a predictive process based on the slope of the smoothed data. The smoothing system computes a slope (step 816) based on a selected number P3 of smoothed data points preceding the current data point, according to any appropriate process, such as line regression, N-points centered, or the like. [0073] In the present embodiment, the data smoothing system uses a “least squares fit through” process to establish a slope of the preceding P3 smoothed data points.
  • The smoothing system predicts a value of the current smoothed data point according to the calculated slope. The system then compares the difference between the previously calculated value for the current smoothed data point (Sn) and the predicted value to a range number (R3) (step 818). [0074] If the difference is greater than the range R3, the occurrence may be noted and the current smoothed data point is not adjusted. If the difference is within the range R3, the current smoothed data point is set equal to the difference between the calculated current smoothed data point (Sn) and the predicted value for the current smoothed data point (Sn−pred), multiplied by a third multiplier M3 and added to the original value of the current smoothed data point (step 820), according to the following equation:
  • Sn = (Sn−pred − Sn) * M3 + Sn
  • Thus, the current smoothed data point is set according to a modified difference between the original smoothed data point and the predicted smoothed data point, reduced by a certain amount (when M3 is less than 1). [0075] Applying the predictive smoothing tends to reduce point-to-point noise sensitivity during relatively flat (or otherwise nontrending) portions of the signal. The limited application of the predictive smoothing process to the smoothed data points ensures that the calculated average based on the slope does not affect the smoothed data when significant changes are occurring in the raw data, i.e., when the raw data signal is not relatively flat. A sketch of this predictive phase follows.
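  • A sketch of the predictive phase, assuming illustrative values for P3, R3, and M3, with the slope taken from a simple least-squares fit:

      # Sketch: predict Sn from the slope of the preceding P3 smoothed points;
      # if the prediction is within range R3, damp Sn toward it by M3.
      def predictive_adjust(smoothed, P3=5, R3=10.0, M3=0.5):
          if len(smoothed) <= P3:
              return smoothed[-1]
          window = smoothed[-P3 - 1:-1]            # P3 points preceding Sn
          n = len(window)
          x_mean = (n - 1) / 2.0
          y_mean = sum(window) / n
          slope = (sum((x - x_mean) * (y - y_mean)
                       for x, y in enumerate(window)) /
                   sum((x - x_mean) ** 2 for x in range(n)))
          s_pred = window[-1] + slope              # extrapolated value for Sn
          s_n = smoothed[-1]
          if abs(s_n - s_pred) <= R3:
              return (s_pred - s_n) * M3 + s_n     # Sn = (Spred - Sn)*M3 + Sn
          return s_n                               # out of range: leave as-is
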
  • After smoothing the data, the supplementary [0076] data analysis element 206 may proceed with further analysis of the tester data. For example, the supplementary data analysis element 206 may conduct statistical process control (SPC) calculations and analyses on the output test data. More particularly, referring again to FIGS. 4A-C, the supplemental data analysis element 206 may calculate and store desired statistics for a particular component, test, and/or section (step 430). The statistics may comprise any statistics useful to the operator or the test system 100, such as SPC figures that may include averages, standard deviations, minima, maxima, sums, counts, Cp, Cpk, or any other appropriate statistics.
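  • As an illustration of two of the SPC figures named above, Cp and Cpk can be computed from the test data and the test limits; this is the standard calculation, not a method specific to this disclosure:

      import statistics

      # Standard process-capability figures for one test across components.
      def cp_cpk(data, lsl, usl):
          mu = statistics.mean(data)
          sigma = statistics.stdev(data)
          cp = (usl - lsl) / (6 * sigma)
          cpk = min(usl - mu, mu - lsl) / (3 * sigma)
          return cp, cpk
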
  • The supplementary [0077] data analysis element 206 also suitably performs a signature analysis to dynamically and automatically identify trends and anomalies in the data, for example according to section, based on a combination of test results for that section and/or other data, such as historical data (step 442). The signature analysis identifies signatures and applies a weighting system, suitably configured by the operator, based on any suitable data, such as the test data or identification of defects. The signature analysis may cumulatively identify trends and anomalies that may correspond to problem areas or other characteristics of the wafer or the fabrication process. Signature analysis may be conducted for any desired signatures, such as noise peaks, waveform variations, mode shifts, and noise. In the present embodiment, the computer 108 suitably performs the signature analysis on the output test data for each desired test in each desired section.
  • In the present embodiment, a signature analysis process may be performed in conjunction with the smoothing process. As the smoothing process analyzes the tester data, results of the analysis indicating a trend or anomaly in the data are stored as being indicative of a change in the data or an outlier that may be of significance to the operator and/or test engineer. For example, if a trend is indicated by a comparison of sets of data in the smoothing process, the occurrence of the trend may be noted and stored. Similarly, if a data point exceeds the threshold T1 in the data smoothing process, the occurrence may be noted and stored for later analysis and/or inclusion in the output report. [0078]
  • For example, referring to FIGS. 6A-B, a signature analysis process 600 may initially calculate a count (step 602) for a particular set of test data and control limits corresponding to a particular section and test. [0079] The signature analysis process then applies an appropriate signature analysis algorithm to the data points (step 604). The signature analysis is performed for each desired signature algorithm, and then applied to each test and each section to be analyzed. Errors identified by the signature analysis, trend results, and signature results are also stored (step 606). The process is repeated for each signature algorithm (step 608), test (step 610), and section (step 612). Upon completion, the supplementary data analysis element 206 records the errors (step 614), trend results (step 616), signature results (step 618), and any other desired data in the storage system.
  • Upon identification of each relevant data point, such as outliers and other data of importance identified by the supplementary analysis, each relevant data point may be associated with a value identifying its relevant characteristics (step 444). [0080] For example, each relevant component or data point may be associated with a series of values, suitably expressed as a hexadecimal figure, corresponding to the results of the supplementary analysis relating to the data point. Each value may operate as a flag or other designator of a particular characteristic. For example, if a particular data point has failed a particular test completely, a first flag in the corresponding hexadecimal value may be set. If a particular data point is the beginning of a trend in the data, another flag may be set. Another value in the hexadecimal figure may include information relating to the trend, such as the duration of the trend in the data. Such flags might be packed as in the following sketch.
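  • A sketch of such flag packing; the particular bit assignments are hypothetical, since the disclosure does not fix them:

      # Hypothetical bit assignments for the per-datum hexadecimal figure.
      FLAG_FAILED_TEST = 0x1    # datum failed the test completely
      FLAG_TREND_START = 0x2    # datum begins a trend in the data
      FLAG_OUTLIER     = 0x4    # datum classified as an outlier

      flags = FLAG_FAILED_TEST | FLAG_TREND_START
      print(hex(flags))         # 0x3
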
  • The supplementary [0081] data analysis element 206 may also be configured to classify and correlate the data (step 446). For example, the supplementary data analysis element 206 may utilize the information in the hexadecimal figures associated with the data points to identify the failures, outliers, trends, and other features of the data. The supplementary data analysis element 206 also suitably applies conventional correlation techniques to the data, for example to identify potentially redundant or related tests.
  • The [0082] computer 108 may perform additional analysis functions upon the generated statistics and the output test data, such as automatically identifying and classifying outliers (step 432). Analyzing each relevant datum according to the selected algorithm suitably identifies the outliers. If a particular algorithm is inappropriate for a set of data, the supplementary data analysis element 206 may be configured to automatically abort the analysis and select a different algorithm.
  • The supplementary [0083] data analysis element 206 may operate in any suitable manner to designate outliers, such as by comparison to selected values and/or according to treatment of the data in the data smoothing process. For example, an outlier identification element according to various aspects of the present invention initially automatically calibrates its sensitivity to outliers based on selected statistical relationships for each relevant datum (step 434). Some of these statistical relationships are then compared to a threshold or other reference point, such as the data mode, mean, or median, or combinations thereof, to define relative outlier threshold limits. In the present embodiment, the statistical relationships are scaled, for example by one, two, three, and six standard deviations of the data, to define the different outlier amplitudes (step 436). The output test data may then be compared to the outlier threshold limits to identify and classify the output test data as outliers (step 438).
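  • A sketch of this identification step, using the data mean as the reference point and a single scale factor standing in for the dynamically calibrated sensitivity (the scaling relationships themselves are illustrated later):

      import statistics

      # Classify each datum by the largest k-sigma threshold it exceeds,
      # for k in (6, 3, 2, 1) scaled standard deviations from the mean.
      def classify_outliers(data, scale=1.0):
          mu = statistics.mean(data)
          sigma = statistics.stdev(data) * scale
          classes = {}
          for i, x in enumerate(data):
              for k in (6, 3, 2, 1):               # largest amplitude first
                  if abs(x - mu) > k * sigma:
                      classes[i] = k               # outlier amplitude class
                      break
          return classes
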
  • The supplementary [0084] data analysis element 206 stores the resulting statistics and outliers in memory and identifiers, such as the x-y wafer map coordinates, associated with any such statistics and outliers (step 440). Selected statistics, outliers, and/or failures may also trigger notification events, such as sending an electronic message to an operator, triggering a light tower, stopping the tester 102, or notifying a server.
  • In the present embodiment, the supplementary [0085] data analysis element 206 includes a scaling element 210 and an outlier classification element 212. The scaling element 210 is configured to dynamically scale selected coefficients and other values according to the output test data. The outlier classification element 212 is configured to identify and/or classify the various outliers in the data according to selected algorithms.
  • More particularly, the scaling element of the present embodiment suitably uses various statistical relationships for dynamically scaling outlier sensitivity and smoothing coefficients for noise filtering sensitivity. The scaling coefficients are suitably calculated by the scaling element and used to modify selected outlier sensitivity values and smoothing coefficients. Any appropriate criteria, such as suitable statistical relationships, may be used for scaling. For example, a sample statistical relationship for outlier sensitivity scaling is defined as: [0086]
  • √(1 + ln(Cpk²))
  • Another sample statistical relationship for outlier sensitivity and smoothing coefficient scaling is defined as: [0087]
  • √(1 + ln(Cpk²)) * Cpm
  • Another sample statistical relationship for outlier sensitivity and smoothing coefficient scaling is defined as: [0088]
  • (σ * Cpk) / (Max − Min), where σ = datum standard deviation
  • A sample statistical relationship used in multiple algorithms for smoothing coefficient scaling is: [0089]
  • (σ / μ) * 10, where σ = datum standard deviation and μ = datum mean
  • Another sample statistical relationship used in multiple algorithms for smoothing coefficient scaling is: [0090]
  • (σ² / μ²) * 10, where σ = datum standard deviation and μ = datum mean
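  • The sample relationships above transcribe directly into code; the sketch below assumes the named statistics (Cpk, Cpm, σ, μ, Max, Min) have already been computed for the datum:

      import math

      # Direct transcriptions of the sample scaling relationships.
      def outlier_sensitivity_scale(cpk):
          return math.sqrt(1 + math.log(cpk ** 2))   # ln = natural log;
                                                     # valid when 1 + ln(cpk**2) >= 0
      def sensitivity_and_smoothing_scale(cpk, cpm):
          return math.sqrt(1 + math.log(cpk ** 2)) * cpm

      def range_scale(sigma, cpk, max_, min_):
          return (sigma * cpk) / (max_ - min_)

      def smoothing_scale(sigma, mu):
          return (sigma / mu) * 10

      def smoothing_scale_squared(sigma, mu):
          return (sigma ** 2 / mu ** 2) * 10
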
  • The outlier classification element is suitably configured to identify and/or classify the outliers in the output test data, as well as components 106 and analysis results, according to any suitable algorithms. [0091] The outlier classification element may also identify and classify selected outliers and components 106 according to the output test results and the information generated by the supplementary analysis element 206. For example, the outlier classification element is suitably configured to classify the components 106 into critical/marginal/good part categories, for example in conjunction with user-defined criteria; user-defined good/bad spatial pattern recognition; classification of pertinent data for tester data compression; test setup in-situ sensitivity qualifications and analyses; tester yield leveling analyses; dynamic wafer map and/or test strip mapping for part dispositions and dynamic retest; or test program optimization analyses. The outlier classification element may also classify data in accordance with conventional SPC control rules, such as Western Electric rules or Nelson rules, to characterize the data.
  • The outlier classification element suitably classifies the data using a selected set of classification limit calculation methods. Any appropriate classification method may be used to characterize the data according to the needs of the operator. The present outlier classification element, for example, classifies outliers by comparing the output test data to selected thresholds, such as values corresponding to one, two, three, and six statistically scaled standard deviations from a reference point, such as the data mean, mode, and/or median. Identifying outliers in this manner tends to normalize the identified outliers for any test, regardless of datum amplitude and relative noise. [0092]
  • The outlier classification element analyzes and correlates the normalized outliers and/or the raw data points based on user-defined rules. Sample user-selectable methods for the purpose of part and pattern classification based on identified outliers are as follows: [0093]
  • Cumulative Amplitude, Cumulative Count Method: [0094]

$$Count_{LIMIT} = \mu_{OverallOutlierCount} + \frac{3\,\sigma^2_{OverallOutlierCount}}{Max_{OverallOutlierCount} - Min_{OverallOutlierCount}}$$

$$NormalizedOutlierAmplitude_{LIMIT} = \mu_{OverallNormalizedOutlierAmplitude} + \frac{3\,\sigma^2_{OverallNormalizedOutlierAmplitude}}{Max_{OverallNormalizedOutlierAmplitude} - Min_{OverallNormalizedOutlierAmplitude}}$$
  • Classification Rules: [0095]
  • Part_CRITICAL = True, if (Part_CumulativeOutlierCount > Count_LIMIT) AND (Part_CumulativeNormalizedOutlierAmplitude > NormalizedOutlierAmplitude_LIMIT)
  • Part_MARGINAL:HighAmplitude = True, if (Part_CumulativeNormalizedOutlierAmplitude > NormalizedOutlierAmplitude_LIMIT)
  • Part_MARGINAL:HighCount = True, if (Part_CumulativeOutlierCount > Count_LIMIT)
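  • A minimal sketch of the cumulative amplitude, cumulative count method and its classification rules is given below, assuming the per-part outlier counts and normalized outlier amplitudes have already been accumulated; variable and category names are illustrative. The squared method that follows applies the same pattern to squared counts and amplitudes.

    import numpy as np

    def cumulative_limit(x):
        """limit = mean + 3 * variance / (max - min), computed over the
        per-part values for the whole data set."""
        x = np.asarray(x, dtype=float)
        return x.mean() + 3.0 * x.var(ddof=1) / (x.max() - x.min())

    def classify_part(count, amplitude, count_limit, amplitude_limit):
        """Critical if both limits are exceeded, marginal if only one is."""
        if count > count_limit and amplitude > amplitude_limit:
            return "critical"
        if amplitude > amplitude_limit:
            return "marginal-high-amplitude"
        if count > count_limit:
            return "marginal-high-count"
        return "good"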
  • Cumulative Amplitude Squared, Cumulative Count Squared Method: [0096]

$$Count^2_{LIMIT} = \mu_{OverallOutlierCount^2} + \frac{3\,\sigma^2_{OverallOutlierCount^2}}{Max_{OverallOutlierCount^2} - Min_{OverallOutlierCount^2}}$$

$$NormalizedOutlierAmplitude^2_{LIMIT} = \mu_{OverallNormalizedOutlierAmplitude^2} + \frac{3\,\sigma^2_{OverallNormalizedOutlierAmplitude^2}}{Max_{OverallNormalizedOutlierAmplitude^2} - Min_{OverallNormalizedOutlierAmplitude^2}}$$
  • Classification Rules: [0097]
  • Part_CRITICAL = True, if (Part_CumulativeOutlierCount² > Count²_LIMIT) AND (Part_CumulativeNormalizedOutlierAmplitude² > NormalizedOutlierAmplitude²_LIMIT)
  • Part_MARGINAL:HighAmplitude = True, if (Part_CumulativeNormalizedOutlierAmplitude² > NormalizedOutlierAmplitude²_LIMIT)
  • Part_MARGINAL:HighCount = True, if (Part_CumulativeOutlierCount² > Count²_LIMIT)
  • N-Points Method: [0098]
  • The actual numbers and logic rules used in the following examples can be customized by the end user per scenario (test program, test node, tester, prober, handler, test setup, etc.). In these examples, σ denotes deviation relative to the datum mean, mode, and/or median, based on the datum standard deviation scaled by key statistical relationships. [0099]
  • Part_CRITICAL = True, if ((Part_COUNT_6σ + Part_COUNT_3σ) ≥ 2) OR ((Part_COUNT_2σ + Part_COUNT_1σ) ≥ 6)
  • Part_CRITICAL = True, if ((Part_COUNT_6σ + Part_COUNT_3σ) ≥ 1) AND ((Part_COUNT_2σ + Part_COUNT_1σ) ≥ 3)
  • Part_MARGINAL = True, if ((Part_COUNT_6σ + Part_COUNT_3σ + Part_COUNT_2σ + Part_COUNT_1σ) ≥ 3)
  • Part_NOISY = True, if ((Part_COUNT_6σ + Part_COUNT_3σ + Part_COUNT_2σ + Part_COUNT_1σ) ≥ 1)
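  • The N-points rules above translate directly into code. The sketch below applies the sample thresholds in order of severity, with cK denoting the part's count of outliers at the K-sigma level, on the understanding that the end user would tune these numbers per scenario.

    def classify_part_n_points(c6, c3, c2, c1):
        """Apply the sample N-points rules; first matching rule wins."""
        if (c6 + c3) >= 2 or (c2 + c1) >= 6:
            return "critical"
        if (c6 + c3) >= 1 and (c2 + c1) >= 3:
            return "critical"
        if (c6 + c3 + c2 + c1) >= 3:
            return "marginal"
        if (c6 + c3 + c2 + c1) >= 1:
            return "noisy"
        return "good"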
  • The supplementary [0100] data analysis element 206 may be configured to perform additional analysis of the output test data and the information generated by the supplementary data analysis element 206. For example, the supplementary data analysis element 206 may identify tests having high incidences of failure or outliers, such as by comparing the total or average number of failures, outliers, or outliers in a particular classification to one or more threshold values.
  • The supplementary [0101] data analysis element 206 may also be configured to correlate data from different tests to identify similar or dissimilar trends, for example by comparing cumulative counts and outliers, and/or by correlating outliers between wafers or other data sets. The supplementary data analysis element 206 may also analyze and correlate data from different tests to identify and classify potential critical, marginal, and/or good parts on the wafer. The supplementary data analysis element 206 may also analyze and correlate data from different tests to identify user-defined good part patterns and/or bad part patterns on a series of wafers for the purposes of dynamic test time reduction.
  • The supplementary [0102] data analysis element 206 is also suitably configured to analyze and correlate data from different tests to identify user-defined pertinent raw data for the purposes of dynamically compressing the test data into memory. The supplementary data analysis element may also analyze and correlate statistical anomalies and test data results for test node in-situ setup qualification and sensitivity analysis. Further, the supplementary data analysis element may contribute to test node yield leveling analysis, for example by identifying whether a particular test node may be improperly calibrated or otherwise producing inappropriate results. The supplementary data analysis element may moreover analyze and correlate the data for the purposes of test program optimization, including, but not limited to, automatic identification of redundant tests using correlated results and outlier analysis, and provision of additional data for use in analysis. The supplementary data analysis element is also suitably configured to identify critical tests, for example by identifying tests that regularly fail or nearly fail, tests that almost never fail, and/or tests exhibiting a very low Cpk.
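  • One hedged reading of redundant-test identification is sketched below: flag pairs of tests whose per-part measurements are almost perfectly correlated, so that one test of each pair becomes a candidate for removal. The correlation threshold and data layout are assumptions for illustration, not prescribed by the present embodiment.

    import numpy as np

    def redundant_test_pairs(results, threshold=0.99):
        """results maps test name -> array of measurements over the same
        parts; returns highly correlated (candidate-redundant) pairs."""
        names = list(results)
        pairs = []
        for i, first in enumerate(names):
            for second in names[i + 1:]:
                r = np.corrcoef(results[first], results[second])[0, 1]
                if abs(r) >= threshold:
                    pairs.append((first, second, r))
        return pairs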
  • The supplementary data analysis may also provide identification of test sampling candidates, such as tests that are rarely or never failed or in which outliers are never detected. The supplementary data analysis element may also provide identification of the best order test sequence based on correlation techniques, such as conventional correlation techniques, combined with analysis and correlation of identified outliers and/or other statistical anomalies, number of failures, critical tests, longest/shortest tests, or basic functionality issues associated with failure of the test. [0103]
  • The supplementary data analysis may also provide identification of critical, marginal, and good parts as defined by sensitivity parameters in a recipe configuration file. Part identification may provide disposition/classification, before packaging and/or shipping, of parts that may represent a reliability risk, and/or test time reduction through dynamic probe mapping of bad and good parts during wafer probe. Identification of these parts may be represented and output in any appropriate manner, for example as good and bad parts on a dynamically generated prober control map (for dynamic mapping), a wafer map used for offline inking equipment, a test strip map for strip testing at final test, a results file, and/or a database results table. [0104]
  • Supplemental data analysis at the cell controller level tends to increase quality control at the probe, and thus final test yields. In addition, quality issues may be identified at product run time, not later. Furthermore, the supplemental data analysis and signature analysis tend to improve the quality of data provided to downstream and offline analysis tools, as well as to test engineers or other personnel, by identifying outliers. For example, the [0105] computer 108 may include information on the wafer map identifying a group of components having signature analyses that indicate a fault in the manufacturing process. Thus, the signature analysis system may identify potentially defective goods that may go undetected using conventional test analysis.
  • EXAMPLE
  • Referring now to FIG. 10, an array of semiconductor devices is positioned on a wafer. In this wafer, the general resistivity of resistor components in the semiconductor devices varies across the wafer, for example due to uneven deposition of material or treatment of the wafer. The resistance of any particular component, however, may be within the control limits of the test. For example, the target resistance of a particular resistor component may be 1000Ω +/−10%. Near the ends of the wafer, the resistances of most of the resistors approach, but do not exceed, the limits of the normal distribution range of 900Ω to 1100Ω (FIG. 11). [0106]
  • Components on the wafer may include defects, for example due to a contaminant or imperfection in the fabrication process. The defect may increase the resistance of resistors located near the low-resistivity edge of the wafer to 1080Ω. The resistance is well over the 1000Ω expected for a device near the middle of the wafer, but is still well within the normal distribution range. [0107]
  • Referring to FIGS. [0108] 12A-B, the raw test data for each component may be plotted. The test data exhibits considerable variation, due in part to the varying resistivity among components on the wafer as the prober indexes across rows or columns of devices. The devices affected by the defect are not easily identifiable based on visual examination of the test data or comparison to the test limits.
  • When the test data is processed according to various aspects of the present invention, the devices affected by the defect may be associated with outliers in the test data. The smoothed test data is largely confined to a certain range of values. The data associated with the defects, however, is unlike the data for the surrounding components. Accordingly, the smoothed data illustrates the departure from the values associated with the surrounding devices without the defect. The outlier classification element may identify and classify the outliers according to the magnitude of the departure of the outlier data from the surrounding data. [0109]
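  • The resistor example can be reproduced numerically; the sketch below is illustrative only (the gradient, noise level, and moving-average smoother are assumptions), but it shows a 1080Ω device passing the 900-1100Ω limits while standing out clearly once the data is smoothed.

    import numpy as np

    rng = np.random.default_rng(0)
    positions = np.arange(100)
    gradient = 950.0 + positions            # slow resistivity drift across the wafer
    raw = gradient + rng.normal(0.0, 5.0, positions.size)
    raw[20] = 1080.0                        # defective part near the low-resistivity edge

    kernel = np.ones(9) / 9.0               # simple moving-average smoother
    padded = np.pad(raw, 4, mode="edge")    # avoid zero-padding artifacts at the wafer edges
    smoothed = np.convolve(padded, kernel, mode="valid")
    residual = raw - smoothed               # departure from the surrounding devices
    outliers = np.flatnonzero(np.abs(residual) > 3.0 * residual.std(ddof=1))
    print(outliers)                         # flags index 20 even though 1080 < 1100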
  • The [0110] output element 208 collects data from the test system 100, suitably at run time, and provides an output report to a printer, database, operator interface, or other desired destination. Any form, such as graphical, numerical, textual, printed, or electronic form, may be used to present the output report for use or subsequent analysis. The output element 208 may provide any selected content, including selected output test data from the tester 102 and results of the supplementary data analysis.
  • In the present embodiment, the [0111] output element 208 suitably provides a selection of data from the output test data specified by the operator as well as supplemental data at product run time via the dynamic datalog. Referring to FIG. 7, the output element 208 initially reads a sampling range from the database 114 (step 702). The sampling range identifies predetermined information to be included in the output report. In the present embodiment, the sampling range identifies components 106 on the wafer selected by the operator for review. The predetermined components may be selected according to any criteria, such as data for various circumferential zones, radial zones, random components, or individual stepper fields. The sampling range comprises a set of x-y coordinates corresponding to the positions of the predetermined components on the wafer or an identified portion of the available components in a batch.
  • The [0112] output element 208 may also be configured to include information relating to the outliers, or other information generated or identified by the supplementary data analysis element, in the dynamic datalog (step 704). If so configured, the identifiers, such as x-y coordinates, for each of the outliers are assembled as well. The coordinates for the operator-selected components and the outliers are merged into the dynamic datalog (step 706). The output element 208 then retrieves selected information, such as the raw test data and one or more results from the supplementary data analysis element 206, for each entry in the merged x-y coordinate array of the dynamic datalog (step 708), as sketched below.
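  • A minimal sketch of this merge (steps 704-708) follows, assuming coordinates are (x, y) tuples and that raw data and analysis results are keyed by coordinate; all names are illustrative.

    def build_dynamic_datalog(sampled_xy, outlier_xy, raw_data, analysis):
        """Merge operator-selected coordinates with outlier coordinates,
        then gather the stored data for each entry."""
        outlier_set = set(outlier_xy)
        merged = sorted(set(sampled_xy) | outlier_set)
        return [
            {
                "xy": xy,
                "raw": raw_data.get(xy),        # raw test data for the component
                "analysis": analysis.get(xy),   # supplementary analysis results
                "is_outlier": xy in outlier_set,
            }
            for xy in merged
        ]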
  • The retrieved information is then suitably stored in an appropriate output report (step [0113] 710). The report may be prepared in any appropriate format or manner. In the present embodiment, the output report suitably includes the dynamic datalog having a wafer map indicating the selected components on the wafer and their classification. Further, the output element 208 may superimpose wafer map data corresponding to outliers on the wafer map of the preselected components. Additionally, the output element may include only the outliers from the wafer map or batch as the sampled output. The output report may also include a series of graphical representations of the data to highlight the occurrence of outliers and correlations in the data. The output report may further include recommendations and supporting data for the recommendations. For example, if two tests appear to generate identical sets of failures and/or outliers, the output report may include a suggestion that the tests are redundant and recommend that one of the tests be omitted from the test program. The recommendation may include a graphical representation of the data showing the identical results of the tests.
  • The output report may be provided in any suitable manner, for example by output to a local workstation, transmission to a server, or activation of an alarm (step [0114] 712). In one embodiment, the output report may be provided off-line such that the output does not affect the operation of the system or transfer to the main server. In this configuration, the computer 108 copies data files, performs the analysis, and generates results, for example for demonstration or verification purposes.
  • The particular implementations shown and described herein are illustrative of the invention and its best mode and are not intended to otherwise limit the scope of the present invention in any way. Indeed, for the sake of brevity, conventional signal processing, data transmission, and other functional aspects of the systems (and components of the individual operating components of the systems) may not be described in detail herein. Furthermore, the connecting lines shown in the various figures contained herein are intended to represent exemplary functional relationships and/or physical couplings between the various elements. Many alternative or additional functional relationships or physical connections may be present in a practical test system. [0115]
  • The present invention has been described with reference to a preferred embodiment. Changes and modifications may be made, however, without departing from the scope of the present invention. These and other changes or modifications are intended to be included within the scope of the present invention, as expressed in the following claims. [0116]

Claims (20)

1. A test system, comprising:
a tester configured to test a component and generate test data; and
an outlier identification element configured to receive the test data and identify an outlier in the test data.
2. A test system according to claim 1, wherein the outlier identification element is configured to operate in conjunction with a set of configuration data in a recipe file.
3. A test system according to claim 1, wherein the test data corresponds to a section group of components on a wafer.
4. A test system according to claim 1, wherein the outlier identification element is configured to automatically calibrate a sensitivity of the outlier identification element to the test data.
5. A test system according to claim 1, further comprising a data correlation element configured to correlate the test data.
6. A test system according to claim 1, wherein the outlier identification element is configured to identify the outlier at run time.
7. A test system according to claim 1, further comprising a data smoothing element configured to receive the test data and smooth the test data, and wherein the outlier identification element is configured to receive the smoothed test data and identify the outlier in the smoothed test data.
8. A data analysis system for semiconductor test data, comprising:
a supplementary data analysis element configured to identify outliers in the test data; and
an output element configured to generate an output report including the identified outliers.
9. A data analysis system according to claim 8, wherein the supplementary data analysis element is configured to operate in conjunction with a set of configuration data in a recipe file.
10. A data analysis system according to claim 8, wherein the test data corresponds to a section group of components on a wafer.
11. A data analysis system according to claim 8, wherein the supplementary data analysis element is configured to automatically calibrate a sensitivity of the outlier identification element to the test data.
12. A data analysis system according to claim 8, wherein the supplementary data analysis element includes a data correlation element configured to correlate the test data.
13. A data analysis system according to claim 8, wherein the supplementary data analysis element is configured to identify the outliers at run time.
14. A data analysis system according to claim 8, wherein the supplementary data analysis element includes a data smoothing element configured to receive the test data and smooth the test data, and wherein the supplementary data analysis element is configured to identify the outliers in the smoothed test data.
15. A method for testing semiconductors, comprising:
generating test data for multiple components; and
identifying an outlier in the test data at run time.
16. A method according to claim 15, further comprising reading configuration data from a recipe file, wherein identifying the outlier includes identifying the outlier according to the configuration data in the recipe file.
17. A method according to claim 15, wherein the test data corresponds to a section group of components on a wafer.
18. A method according to claim 15, further comprising calibrating a sensitivity for identifying the outlier in the test data.
19. A method according to claim 15, further comprising smoothing the test data.
20. A method according to claim 15, further comprising correlating the test data to identify similarities in the test data.
US10/154,627 2001-01-05 2002-05-24 Methods and apparatus for semiconductor testing Expired - Lifetime US6792373B2 (en)

Priority Applications (14)

Application Number Priority Date Filing Date Title
US10/154,627 US6792373B2 (en) 2001-05-24 2002-05-24 Methods and apparatus for semiconductor testing
US10/367,355 US7167811B2 (en) 2001-05-24 2003-02-14 Methods and apparatus for data analysis
US10/730,388 US7225107B2 (en) 2001-05-24 2003-12-07 Methods and apparatus for data analysis
US10/817,750 US7395170B2 (en) 2001-05-24 2004-04-02 Methods and apparatus for data analysis
US11/053,598 US7356430B2 (en) 2001-05-24 2005-02-07 Methods and apparatus for data analysis
US11/134,843 US8417477B2 (en) 2001-05-24 2005-05-20 Methods and apparatus for local outlier detection
US11/692,021 US8041541B2 (en) 2001-05-24 2007-03-27 Methods and apparatus for data analysis
US12/021,616 US20080189575A1 (en) 2001-05-24 2008-01-29 Methods and apparatus for data analysis
US12/111,773 US8000928B2 (en) 2001-05-24 2008-04-29 Methods and apparatus for data analysis
US12/573,415 US20100088054A1 (en) 2001-05-24 2009-10-05 Methods and apparatus for data analysis
US12/579,634 US8606536B2 (en) 2002-05-24 2009-10-15 Methods and apparatus for hybrid outlier detection
US13/044,202 US20110178967A1 (en) 2001-05-24 2011-03-09 Methods and apparatus for data analysis
US13/853,686 US8788237B2 (en) 2001-05-24 2013-03-29 Methods and apparatus for hybrid outlier detection
US15/991,324 US11853899B2 (en) 2001-01-05 2018-05-29 Methods and apparatus for data analysis

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US29357701P 2001-05-24 2001-05-24
US29518801P 2001-05-31 2001-05-31
US09/872,195 US6782297B2 (en) 2001-05-24 2001-05-31 Methods and apparatus for data smoothing
US37432802P 2002-04-21 2002-04-21
US10/154,627 US6792373B2 (en) 2001-05-24 2002-05-24 Methods and apparatus for semiconductor testing

Related Parent Applications (3)

Application Number Title Priority Date Filing Date
US09/872,195 Continuation-In-Part US6782297B2 (en) 2001-01-05 2001-05-31 Methods and apparatus for data smoothing
US11/535,851 Continuation US20070219741A1 (en) 2002-05-24 2006-09-27 Methods and apparatus for hybrid outlier detection
US12/579,634 Continuation US8606536B2 (en) 2002-05-24 2009-10-15 Methods and apparatus for hybrid outlier detection

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US10/367,355 Continuation-In-Part US7167811B2 (en) 2001-01-05 2003-02-14 Methods and apparatus for data analysis

Publications (2)

Publication Number Publication Date
US20030014205A1 true US20030014205A1 (en) 2003-01-16
US6792373B2 US6792373B2 (en) 2004-09-14

Family

ID=27501599

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/154,627 Expired - Lifetime US6792373B2 (en) 2001-01-05 2002-05-24 Methods and apparatus for semiconductor testing

Country Status (10)

Country Link
US (1) US6792373B2 (en)
EP (1) EP1479025B1 (en)
JP (2) JP2005507557A (en)
KR (1) KR20040067875A (en)
AT (1) ATE483186T1 (en)
AU (1) AU2002312045A1 (en)
CA (1) CA2448460A1 (en)
DE (1) DE60237849D1 (en)
IL (2) IL159009A0 (en)
WO (1) WO2002095802A2 (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020152800A1 (en) * 2001-03-29 2002-10-24 Bouten Petrus Cornelis Paulus Method for measuring a permeation rate, a test and an apparatus for measuring and testing
US20030144810A1 (en) * 2001-05-24 2003-07-31 Tabor Eric Paul Methods and apparatus for data analysis
US20040006447A1 (en) * 2000-06-22 2004-01-08 Jacky Gorin Methods and apparatus for test process enhancement
US20040088074A1 (en) * 2002-11-02 2004-05-06 Taiwan Semiconductor Manufacturing Company Auto classification shipping system
US20040119749A1 (en) * 2002-12-24 2004-06-24 Lam Research Corporation User interface for wafer data analysis and visualization
US20040138846A1 (en) * 2001-05-24 2004-07-15 Buxton Paul M. Methods and apparatus for data analysis
US6816811B2 (en) * 2001-06-21 2004-11-09 Johnson Controls Technology Company Method of intelligent data analysis to detect abnormal use of utilities in buildings
US6862540B1 (en) 2003-03-25 2005-03-01 Johnson Controls Technology Company System and method for filling gaps of missing data using source specified data
US20060085155A1 (en) * 2001-05-24 2006-04-20 Emilio Miguelanez Methods and apparatus for local outlier detection
US20060106469A1 (en) * 2004-11-17 2006-05-18 Taiwan Semiconductor Manufacturing Co., Ltd. Systems and methods for statistical process control
EP1696295A1 (en) * 2005-02-25 2006-08-30 Siemens Aktiengesellschaft Method and device for data evaluation, computer programm product and computer readable medium
US20060226053A1 (en) * 2005-03-16 2006-10-12 Masafumi Asano System of testing semiconductor devices, a method for testing semiconductor devices, and a method for manufacturing semiconductor devices
US20070157056A1 (en) * 2005-12-29 2007-07-05 Lsi Logic Corporation Method and apparatus for detecting defects in integrated circuit die from stimulation of statistical outlier signatures
WO2007098426A2 (en) * 2006-02-17 2007-08-30 Test Advantage, Inc. Methods and apparatus for data analysis
US20070219741A1 (en) * 2005-05-20 2007-09-20 Emilio Miguelanez Methods and apparatus for hybrid outlier detection
US20080091977A1 (en) * 2004-04-02 2008-04-17 Emilio Miguelanez Methods and apparatus for data analysis
US20080249742A1 (en) * 2001-05-24 2008-10-09 Scott Michael J Methods and apparatus for data analysis
US20090013218A1 (en) * 2007-07-02 2009-01-08 Optimal Test Ltd. Datalog management in semiconductor testing
US20090078817A1 (en) * 2005-11-23 2009-03-26 Raytheon Company Absolute time encoded semi-active laser designation
US7533313B1 (en) * 2006-03-09 2009-05-12 Advanced Micro Devices, Inc. Method and apparatus for identifying outlier data
US20110117683A1 (en) * 2008-07-22 2011-05-19 Ricoh Company, Ltd. Chip quality determination method and marking mechanism using same
US20130275357A1 (en) * 2012-04-11 2013-10-17 Henry Arnold Algorithm and structure for creation, definition, and execution of an spc rule decision tree
US8725748B1 (en) * 2004-08-27 2014-05-13 Advanced Micro Devices, Inc. Method and system for storing and retrieving semiconductor tester information
WO2014179801A1 (en) 2013-05-03 2014-11-06 Avocent Huntsville Corp. System and method for ups battery monitoring and data analysis
US8948494B2 (en) 2012-11-12 2015-02-03 Kla-Tencor Corp. Unbiased wafer defect samples
CN107369635A (en) * 2017-06-06 2017-11-21 上海集成电路研发中心有限公司 A kind of intelligent semi-conductor change system based on deep learning
US10223492B1 (en) * 2011-02-22 2019-03-05 Kla-Tencor Corporation Based device risk assessment
US10371744B2 (en) 2012-04-11 2019-08-06 Advantest Corporation Method and apparatus for an efficient framework for testcell development

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7944971B1 (en) * 2002-07-14 2011-05-17 Apple Inc. Encoding video
US6907378B2 (en) * 2002-09-26 2005-06-14 Agilent Technologies, Inc. Empirical data based test optimization method
US7319935B2 (en) * 2003-02-12 2008-01-15 Micron Technology, Inc. System and method for analyzing electrical failure data
US7281165B2 (en) * 2003-06-12 2007-10-09 Inventec Corporation System and method for performing product tests utilizing a single storage device
US7129735B2 (en) * 2004-07-21 2006-10-31 Texas Instruments Incorporated Method for test data-driven statistical detection of outlier semiconductor devices
US9037280B2 (en) 2005-06-06 2015-05-19 Kla-Tencor Technologies Corp. Computer-implemented methods for performing one or more defect-related functions
US7528622B2 (en) 2005-07-06 2009-05-05 Optimal Test Ltd. Methods for slow test time detection of an integrated circuit during parallel testing
US7567947B2 (en) * 2006-04-04 2009-07-28 Optimaltest Ltd. Methods and systems for semiconductor testing using a testing scenario language
JP4931710B2 (en) 2007-06-29 2012-05-16 株式会社リコー Non-defective chip classification method on wafer, chip quality determination method using the same, chip classification program, chip quality determination program, marking mechanism, and semiconductor device manufacturing method
US9943014B2 (en) 2013-03-15 2018-04-10 Coolit Systems, Inc. Manifolded heat exchangers and related systems
US9496200B2 (en) 2011-07-27 2016-11-15 Coolit Systems, Inc. Modular heat-transfer systems
US20100070211A1 (en) * 2008-09-12 2010-03-18 Analog Devices, Inc. Rolling average test
US10118200B2 (en) 2009-07-06 2018-11-06 Optimal Plus Ltd System and method for binning at final test
US20110288808A1 (en) 2010-05-20 2011-11-24 International Business Machines Corporation Optimal test flow scheduling within automated test equipment for minimized mean time to detect failure
US8855959B2 (en) 2010-08-30 2014-10-07 International Business Machines Corporation Integrated cross-tester analysis and real-time adaptive test
WO2014141162A1 (en) 2013-03-15 2014-09-18 Coolit Systems, Inc. Sensors, multiplexed communication techniques, and related systems
US10365667B2 (en) 2011-08-11 2019-07-30 Coolit Systems, Inc. Flow-path controllers and related systems
US9052252B2 (en) 2013-03-15 2015-06-09 Coolit Systems, Inc. Sensors, communication techniques, and related systems
US11662037B2 (en) 2019-01-18 2023-05-30 Coolit Systems, Inc. Fluid flow control valve for fluid flow systems, and methods
US11473860B2 (en) 2019-04-25 2022-10-18 Coolit Systems, Inc. Cooling module with leak detector and related systems
WO2021229365A1 (en) 2020-05-11 2021-11-18 Coolit Systems, Inc. Liquid pumping units, and related systems and methods

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5668745A (en) * 1995-10-20 1997-09-16 Lsi Logic Corporation Method and apparatus for testing of semiconductor devices
US5835891A (en) * 1997-02-06 1998-11-10 Hewlett-Packard Company Device modeling using non-parametric statistical determination of boundary data vectors
US6184048B1 (en) * 1999-11-03 2001-02-06 Texas Instruments Incorporated Testing method and apparatus assuring semiconductor device quality and reliability
US6366851B1 (en) * 1999-10-25 2002-04-02 Navigation Technologies Corp. Method and system for automatic centerline adjustment of shape point data for a geographic database

Cited By (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040006447A1 (en) * 2000-06-22 2004-01-08 Jacky Gorin Methods and apparatus for test process enhancement
US20020152800A1 (en) * 2001-03-29 2002-10-24 Bouten Petrus Cornelis Paulus Method for measuring a permeation rate, a test and an apparatus for measuring and testing
US6993956B2 (en) * 2001-03-29 2006-02-07 Koninklijke Philips Electronics N.V. Method for measuring a permeation rate, a test and an apparatus for measuring and testing
US7117720B2 (en) 2001-03-29 2006-10-10 Koninklijke Philips Electronics N. V. Method for measuring a permeation rate, a test and an apparatus for measuring and testing
US20060147346A1 (en) * 2001-03-29 2006-07-06 Bouten Petrus C P Method for measuring a permeation rate, a test and an apparatus for measuring and testing
US7167811B2 (en) 2001-05-24 2007-01-23 Test Advantage, Inc. Methods and apparatus for data analysis
US8000928B2 (en) 2001-05-24 2011-08-16 Test Advantage, Inc. Methods and apparatus for data analysis
US20040138846A1 (en) * 2001-05-24 2004-07-15 Buxton Paul M. Methods and apparatus for data analysis
US7225107B2 (en) 2001-05-24 2007-05-29 Test Advantage, Inc. Methods and apparatus for data analysis
US20080249742A1 (en) * 2001-05-24 2008-10-09 Scott Michael J Methods and apparatus for data analysis
US8417477B2 (en) 2001-05-24 2013-04-09 Test Acuity Solutions, Inc. Methods and apparatus for local outlier detection
US8788237B2 (en) * 2001-05-24 2014-07-22 Test Acuity Solutions Methods and apparatus for hybrid outlier detection
US20130226491A1 (en) * 2001-05-24 2013-08-29 Test Acuity Solutions Methods and apparatus for hybrid outlier detection
US20030144810A1 (en) * 2001-05-24 2003-07-31 Tabor Eric Paul Methods and apparatus for data analysis
US20060085155A1 (en) * 2001-05-24 2006-04-20 Emilio Miguelanez Methods and apparatus for local outlier detection
US6816811B2 (en) * 2001-06-21 2004-11-09 Johnson Controls Technology Company Method of intelligent data analysis to detect abnormal use of utilities in buildings
WO2004003572A2 (en) * 2002-06-28 2004-01-08 Test Advantage, Inc. Methods and apparatus for test process enhancement
WO2004003572A3 (en) * 2002-06-28 2004-03-25 Test Advantage Inc Methods and apparatus for test process enhancement
US20040088074A1 (en) * 2002-11-02 2004-05-06 Taiwan Semiconductor Manufacturing Company Auto classification shipping system
US7079960B2 (en) * 2002-11-02 2006-07-18 Taiwan Semiconductor Manufacturing Company, Ltd. Auto classification shipping system
US20100211903A1 (en) * 2002-12-24 2010-08-19 Lam Research Corporation User Interface for Wafer Data Analysis and Visualization
US7738693B2 (en) * 2002-12-24 2010-06-15 Lam Research Corporation User interface for wafer data analysis and visualization
US7945085B2 (en) * 2002-12-24 2011-05-17 Lam Research Corporation User interface for wafer data analysis and visualization
US20040119749A1 (en) * 2002-12-24 2004-06-24 Lam Research Corporation User interface for wafer data analysis and visualization
EP1593083A2 (en) * 2003-02-14 2005-11-09 Test Advantage, Inc. Methods and apparatus for data analysis
EP1593083A4 (en) * 2003-02-14 2006-05-03 Test Advantage Inc Methods and apparatus for data analysis
US6862540B1 (en) 2003-03-25 2005-03-01 Johnson Controls Technology Company System and method for filling gaps of missing data using source specified data
WO2005001667A3 (en) * 2003-06-27 2005-08-04 Test Advantage Inc Methods and apparatus for data analysis
US20080091977A1 (en) * 2004-04-02 2008-04-17 Emilio Miguelanez Methods and apparatus for data analysis
US7904279B2 (en) * 2004-04-02 2011-03-08 Test Advantage, Inc. Methods and apparatus for data analysis
EP1787132A2 (en) * 2004-08-20 2007-05-23 Test Advantage, Inc. Methods and apparatus for local outlier detection
EP1787132A4 (en) * 2004-08-20 2010-09-29 Test Advantage Inc Methods and apparatus for local outlier detection
US8725748B1 (en) * 2004-08-27 2014-05-13 Advanced Micro Devices, Inc. Method and system for storing and retrieving semiconductor tester information
US7957821B2 (en) * 2004-11-17 2011-06-07 Taiwan Semiconductor Manufacturing Co., Ltd. Systems and methods for statistical process control
US20060106469A1 (en) * 2004-11-17 2006-05-18 Taiwan Semiconductor Manufacturing Co., Ltd. Systems and methods for statistical process control
EP1696295A1 (en) * 2005-02-25 2006-08-30 Siemens Aktiengesellschaft Method and device for data evaluation, computer programm product and computer readable medium
US7982155B2 (en) * 2005-03-16 2011-07-19 Kabushiki Kaisha Toshiba System of testing semiconductor devices, a method for testing semiconductor devices, and a method for manufacturing semiconductor devices
US7629550B2 (en) * 2005-03-16 2009-12-08 Kabushiki Kaisha Toshiba System of testing semiconductor devices, a method for testing semiconductor devices, and a method for manufacturing semiconductor devices
US20060226053A1 (en) * 2005-03-16 2006-10-12 Masafumi Asano System of testing semiconductor devices, a method for testing semiconductor devices, and a method for manufacturing semiconductor devices
US20100068833A1 (en) * 2005-03-16 2010-03-18 Kabushiki Kaisha Toshiba System of testing semiconductor devices, a method for testing semiconductor devices, and a method for manufacturing semiconductor devices
US20070219741A1 (en) * 2005-05-20 2007-09-20 Emilio Miguelanez Methods and apparatus for hybrid outlier detection
US7767945B2 (en) 2005-11-23 2010-08-03 Raytheon Company Absolute time encoded semi-active laser designation
US20090078817A1 (en) * 2005-11-23 2009-03-26 Raytheon Company Absolute time encoded semi-active laser designation
US7617427B2 (en) * 2005-12-29 2009-11-10 Lsi Corporation Method and apparatus for detecting defects in integrated circuit die from stimulation of statistical outlier signatures
US20070157056A1 (en) * 2005-12-29 2007-07-05 Lsi Logic Corporation Method and apparatus for detecting defects in integrated circuit die from stimulation of statistical outlier signatures
WO2007098426A2 (en) * 2006-02-17 2007-08-30 Test Advantage, Inc. Methods and apparatus for data analysis
WO2007098426A3 (en) * 2006-02-17 2008-11-13 Test Advantage Inc Methods and apparatus for data analysis
US7533313B1 (en) * 2006-03-09 2009-05-12 Advanced Micro Devices, Inc. Method and apparatus for identifying outlier data
WO2008039918A3 (en) * 2006-09-27 2008-10-30 Test Advantage Inc Methods and apparatus for hybrid outlier detection
WO2008039918A2 (en) * 2006-09-27 2008-04-03 Test Advantage, Inc. Methods and apparatus for hybrid outlier detection
US20090013218A1 (en) * 2007-07-02 2009-01-08 Optimal Test Ltd. Datalog management in semiconductor testing
US8440474B2 (en) * 2008-07-22 2013-05-14 Ricoh Company, Ltd. Chip quality determination method and marking mechanism using same
US20110117683A1 (en) * 2008-07-22 2011-05-19 Ricoh Company, Ltd. Chip quality determination method and marking mechanism using same
US10223492B1 (en) * 2011-02-22 2019-03-05 Kla-Tencor Corporation Based device risk assessment
US20130275357A1 (en) * 2012-04-11 2013-10-17 Henry Arnold Algorithm and structure for creation, definition, and execution of an spc rule decision tree
CN104364664A (en) * 2012-04-11 2015-02-18 爱德万测试公司 An algorithm and structure for creation, definition, and execution of an SPC rule decision tree
US10371744B2 (en) 2012-04-11 2019-08-06 Advantest Corporation Method and apparatus for an efficient framework for testcell development
US8948494B2 (en) 2012-11-12 2015-02-03 Kla-Tencor Corp. Unbiased wafer defect samples
WO2014179801A1 (en) 2013-05-03 2014-11-06 Avocent Huntsville Corp. System and method for ups battery monitoring and data analysis
EP2992340A4 (en) * 2013-05-03 2017-01-11 Liebert Corporation System and method for ups battery monitoring and data analysis
US10852357B2 (en) 2013-05-03 2020-12-01 Vertiv Corporation System and method for UPS battery monitoring and data analysis
CN107369635A (en) * 2017-06-06 2017-11-21 上海集成电路研发中心有限公司 A kind of intelligent semi-conductor change system based on deep learning

Also Published As

Publication number Publication date
JP2010226125A (en) 2010-10-07
IL159009A (en) 2011-01-31
WO2002095802A2 (en) 2002-11-28
EP1479025B1 (en) 2010-09-29
WO2002095802A3 (en) 2004-09-23
AU2002312045A1 (en) 2002-12-03
KR20040067875A (en) 2004-07-30
JP2005507557A (en) 2005-03-17
ATE483186T1 (en) 2010-10-15
DE60237849D1 (en) 2010-11-11
CA2448460A1 (en) 2002-11-28
EP1479025A4 (en) 2006-04-12
IL159009A0 (en) 2004-05-12
US6792373B2 (en) 2004-09-14
EP1479025A2 (en) 2004-11-24

Similar Documents

Publication Publication Date Title
US6792373B2 (en) Methods and apparatus for semiconductor testing
US7437271B2 (en) Methods and apparatus for data analysis
US8000928B2 (en) Methods and apparatus for data analysis
US7225107B2 (en) Methods and apparatus for data analysis
US11853899B2 (en) Methods and apparatus for data analysis
US8417477B2 (en) Methods and apparatus for local outlier detection
US8041541B2 (en) Methods and apparatus for data analysis
US7904279B2 (en) Methods and apparatus for data analysis
US20100036637A1 (en) Methods and apparatus for hybrid outlier detection
WO2007098426A2 (en) Methods and apparatus for data analysis

Legal Events

Date Code Title Description
AS Assignment

Owner name: TEST ADVANTAGE, INC., ARIZONA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TABOR, ERIC PAUL;REEL/FRAME:013458/0973

Effective date: 20020729

STCF Information on status: patent grant

Free format text: PATENTED CASE

RR Request for reexamination filed

Effective date: 20061102

FPAY Fee payment

Year of fee payment: 4

B1 Reexamination certificate first reexamination

Free format text: THE PATENTABILITY OF CLAIMS 1-20 IS CONFIRMED.

FPAY Fee payment

Year of fee payment: 8

AS Assignment

Owner name: TEST ACUITY SOLUTIONS, INC., ARIZONA

Free format text: CHANGE OF NAME;ASSIGNOR:TEST ADVANTAGE, INC.;REEL/FRAME:029002/0009

Effective date: 20120821

AS Assignment

Owner name: ACACIA RESEARCH GROUP LLC, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TEST ACUITY SOLUTIONS, INC.;REEL/FRAME:032067/0660

Effective date: 20130731

AS Assignment

Owner name: IN-DEPTH TEST LLC, TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ACACIA RESEARCH GROUP LLC;REEL/FRAME:032089/0907

Effective date: 20131016

CBM Aia trial proceeding filed before patent trial and appeal board: covered business methods

Free format text: TRIAL NO: CBM2015-00060

Opponent name: FAIRCHILD SEMICONDUCTOR CORPORATION

Effective date: 20150113

IPR Aia trial proceeding filed before the patent and appeal board: inter partes review

Free format text: TRIAL NO: IPR2015-01627

Opponent name: MAXIM INTEGRATED PRODUCTS INC.

Effective date: 20150727

IPR Aia trial proceeding filed before the patent and appeal board: inter partes review

Free format text: TRIAL NO: IPR2015-01998

Opponent name: LINEAR TECHNOLOGY CORPORATION

Effective date: 20150929

IPR Aia trial proceeding filed before the patent and appeal board: inter partes review

Free format text: TRIAL NO: IPR2015-01994

Opponent name: LINEAR TECHNOLOGY CORPORATION

Effective date: 20150928

FPAY Fee payment

Year of fee payment: 12

IPR Aia trial proceeding filed before the patent and appeal board: inter partes review

Free format text: TRIAL NO: IPR2016-01833

Opponent name: ON SEMICONDUCTOR CORPORATION,SEMICONDUCTOR COMPONE

Effective date: 20160916

IPR Aia trial proceeding filed before the patent and appeal board: inter partes review

Free format text: TRIAL NO: IPR2017-02009

Opponent name: MENTOR GRAPHICS CORPORATION

Effective date: 20170828

IPR Aia trial proceeding filed before the patent and appeal board: inter partes review

Free format text: TRIAL NO: IPR2017-02094

Opponent name: MENTOR GRAPHICS CORPORATION

Effective date: 20170911