WO2005100961A2 - Imaging semiconductor structures using solid state illumination - Google Patents

Imaging semiconductor structures using solid state illumination

Info

Publication number
WO2005100961A2
WO2005100961A2 (PCT/US2005/013448, US2005013448W)
Authority
WO
WIPO (PCT)
Prior art keywords
radiation
wavelengths
semiconductor
inspection system
wavelength
Prior art date
Application number
PCT/US2005/013448
Other languages
French (fr)
Other versions
WO2005100961A3 (en)
Inventor
Mark D. Owen
Francois Vlach
Steven J. Olson
Original Assignee
Phoseon Technology, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Phoseon Technology, Inc. filed Critical Phoseon Technology, Inc.
Priority to EP05745456.3A priority Critical patent/EP1738156A4/en
Publication of WO2005100961A2 publication Critical patent/WO2005100961A2/en
Publication of WO2005100961A3 publication Critical patent/WO2005100961A3/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N 21/9501 Semiconductor wafers
    • G01N 21/9505 Wafer internal defects, e.g. microcracks
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/8806 Specially adapted optical and illumination features
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G01N 21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N 21/956 Inspecting patterns on the surface of objects

Definitions

  • test and quality control may be performed, e.g., to inspect the relevant features, including to identify various conditions and to detect defects, such as in bonding and adjacent layer(s).
  • infrared microscopy provides for imaging and inspection of semiconductor devices having bonded or stacked substrates
  • microscopy also tends to have drawbacks.
  • a typical light source is a halogen or other bulb, which provides light across a broad spectrum, including infrared. In order to provide infrared light, then, an appropriate filter is used.
  • the radiation source when imaging based on a radiation orientation that directs the selected wavelength(s) to the back side of the structure, the radiation source preferably provides infrared wavelength(s) that may be transmitted entirely through the to-be-imaged structure and that are half or less than half the relevant dimensions of the feature to be detected. Moreover, the radiation source preferably (i) provides the selected wavelengths (e.g., at appropriate intensities and for sufficient durations) so as to enable the imaging device to capture the image based on the selected wavelengths, i.e., despite the device's relative insensitivity to such wavelengths, while (ii) excluding all other wavelengths so that the imaging device's sensor cells are not electrically saturated by such other wavelengths.
  • the selected wavelengths e.g., at appropriate intensities and for sufficient durations
  • FIG. 16 shows a cutaway side view of a single semiconductor wafer for imaging and inspection using tools and processes in accordance with the invention.
  • FIG. 17 guides discussion of how various features of a semiconductor device are depicted when imaged using tools and processes in accordance with the invention.
  • FIG. 18 shows a cutaway side view of a semiconductor package application for imaging and inspection using tools and processes in accordance with the invention.
  • FIG. 19 shows a cutaway side view of two wafers being aligned and bonded.
  • the incident radiation's photons are unlikely to so excite an electron and, as such, the photons are unlikely to produce charge for image capture.
  • a particular semiconductor material is subject to incident radiation of a wavelength at or below λc (i.e., corresponding to energy above the material's band-gap energy)
  • the collision of photons with the material's atoms is likely to excite valence-band electron(s) into the conduction band.
  • incident radiation exceeds a material's critical wavelength ⁇ c, the radiation's photons tend to penetrate either deeply into or completely through the material.
  • Charge generation factors also include: the diffusion length and recombination rate of the liberated electron (or hole); the chemical and physical nature of materials overlying the device's surface (e.g., shielding elements); and the location and depth at which the photon is absorbed (relative to the location and depth of relevant circuit structures).
  • the photons are absorbed at locations and depths in the potential well of, e.g., a CCD, the photons are likely to generate charge that will be effective in the CCD's imaging operations.
  • any electron-hole pairs created by the photons may be more likely to recombine before contributing to the device's imaging operation.
  • the human eye's photopic and scotopic vision i.e., arising from cones and rods, respectively
  • typical CCDs both as to any individual wavelength and in terms of the range of wavelengths.
  • the model CCDs tend to have more substantial quantum efficiency (and, thus, sensitivity) in the visible light spectrum, with diminishing response at and into the near infrared spectrum.
  • These devices have various total array areas; however, for those example devices having 3-6 megapixels, the total array area tends to exceed 20mm (measured on the diagonal).
  • An example is the Marconi Applied Technologies CCD39-01 sensor, which is a back illuminated CCD having square unit sensor cells, each such cell having sides of 24 microns.
  • This chip has a pixel number of 80x80 pixels (6400 pixels total), and a total array area of only 1.92 mm x 1.92 mm.
  • This chip's quantum efficiency curve is shown in FIG. 3. The curve exhibits little to no ripple, which is thought to follow from the absence of antireflective coating.
  • the radiation source(s) may exclude other wavelengths, at least at some selected time and/or for a selected time duration.
  • the certain wavelengths may include infrared wavelengths alone, or in combination with wavelengths of the visible or ultraviolet spectra, such combination being simultaneously or at different times.
  • This example also contemplates one or more imaging devices, wherein each imaging device may be tuned to specific wavelength(s) or band(s) of wavelengths.
  • one or more mainstream solid state imaging devices may be used. Generally, one or more mainstream imaging devices may be substituted for selected or all imaging devices in the example embodiment described above relating to use of scientific-grade solid state imaging devices.
  • mainstream devices may be used, e.g., to image and inspect features relevant to selected structure(s) of semiconductor devices, particularly where such features have sizes more compatible, in the context of the imaging system, with imaging using the typically smaller sensor cells of the mainstream imaging device than with the sensor cells of the scientific-grade imaging device.
  • mainstream devices, due to their smaller sensor cells, may introduce sensitivity, signal-to-noise and dynamic-range issues, with attendant ramifications, e.g., in the provision of radiation and exclusion of noise.
  • the lens system may lead to use of higher quality lens systems, at least higher quality than may typically be associated with scientific-grade imaging devices (e.g., due to the relatively larger sensor cells and array area).
  • mainstream devices may be used, e.g., to image and inspect such features where such features are capable of being imaged and inspected via certain wavelength(s) (e.g., infrared wavelengths), to which wavelengths the mainstream imaging device is appropriately responsive, while the scientific-grade device either is not responsive or not any more responsive.
  • the selected wavelength(s) may be such that the features may be imaged or inspected either best or only with such wavelength(s).
  • the imaging device should have sufficient sensitivity (or, equivalently for our purposes, have sufficient quantum efficiency) at the selected radiation wavelength(s) (e.g., such wavelengths being selected based on the expected defect's size and/or to enable imaging through the material or structure under consideration).
  • solid state imaging devices, including both scientific-grade and mainstream devices, generally exhibit diminishing sensitivity as the radiation wavelengths extend into the infrared spectrum. Accordingly, where the selected radiation wavelength(s) approach or are in the infrared spectrum, an example embodiment in accordance with the invention provides for modification of the imaging devices so as to obtain sufficient sensitivity.
  • Examples of such features and/or advantages are directed to one or more of: selectivity of the radiation's wavelength(s) (including variations therein, e.g., over time); control over, and quality of, collimation (as well as selected departures therefrom, including as a function of wavelength); control and quality of coherence (as well as selected departures therefrom); quantity and control over intensity (e.g., providing variations of intensity, including as a function of wavelength); control over duty cycle (e.g., from pulsed to continuous, including as a function of wavelength), as well as other characteristics of the source and/or its radiation.
  • the radiation source provides radiation in one or more selected, narrow band(s).
  • FIG. 5 shows a first example embodiment in accordance with the invention.
  • a solid state light source 1 irradiates selected semiconductor structures 4 via a fiber optic light guide 2 and a lens system 3.
  • the source's radiation is directed to structures 4 via an internal beam splitter in the lens system 3.
  • the source's radiation is transmitted through the structures 4 (as well as through the substrate 7 of the semiconductor device having such structures 4) at different intensities to the lens system 3 for image formation on the CCD/CMOS camera 5.
  • the image, so captured, may be provided for further processing via, e.g., computer 6.
  • the captured image, so processed or otherwise, may be employed for test and quality control, toward identifying relevant features of such structures 4.
  • transmission of the radiation through the structures 4 and substrate 7 will depend on various factors, as previously described, including the absence of metal or other interconnect layers or other materials which would block the transmission of the radiation, or reflect it away from the lens system 3.
  • the radiation source 1 is oriented to the side of the to-be-imaged structures 4. Whereas most of the source's radiation will tend to be reflected by the substantially flat surface of the substrate 7 away from the lens system 3, so as to be unavailable for image capture via CCD/CMOS camera 5, the structures 4 will cause dark field reflections perpendicular to the substrate's surface. Since such reflections respond to the structures (e.g., topology, conditions and other features), such orientation is generally suitable for providing higher contrast imaging and inspection.
  • An extension of the third example embodiment, shown in FIG. 8, contemplates capturing a plurality of images with dark field lighting to deduce height information associated with a selected structure 4.
  • the radiation source 1 typically is a solid state source, preferably a one dimensional array of solid state emitting devices —such as, e.g., LEDs/VCSELs— radiating either through a lens array or through a linearly-arranged fiber optic light guide.
  • each source radiates a specific wavelength or band and each source is subject to individual control, including, as examples, control of one or more of: radiating at selected time(s), for selected duration(s), at selected intensities, with selectable collimation, with selected pulse duty cycles, etc.
  • a plurality of arrays may be provided, pairs of which provide a distinct, narrow band of wavelengths about a center wavelength, one collimated and the other not collimated, and such that each array may be sequentially, simultaneously or otherwise powered with respect to any other array(s), such power being for respective time duration(s), or continuous, or in pulse modes of selected duty cycle, so as to enable gathering of various responses of a structure to such applied radiation(s) and, with that information (and, preferably, processing thereof), imaging, inspecting or otherwise analyzing the structure, including as to relevant conditions and/or defects.
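To make this sort of per-array control concrete, the following is a minimal, hypothetical sketch of how such individually controlled emitter arrays might be described and sequenced. The class name, field names and numeric values are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class EmitterArray:
    """Hypothetical description of one individually controlled solid state
    emitter array (LED/VCSEL) of the kind discussed above."""
    center_wavelength_nm: float   # narrow band is centered here
    bandwidth_nm: float           # width of the narrow band
    collimated: bool              # collimated or deliberately uncollimated
    intensity: float              # relative drive level, 0.0 .. 1.0
    duty_cycle: float             # 1.0 = continuous, < 1.0 = pulsed
    on_time_ms: float             # duration this array is powered

# A collimated / non-collimated pair sharing one narrow band, fired in turn:
pair_1070nm = [
    EmitterArray(1070.0, 30.0, True,  0.8, 0.25, 50.0),
    EmitterArray(1070.0, 30.0, False, 1.0, 1.00, 50.0),
]

for array in pair_1070nm:
    # In a real system this would drive the array and trigger a synchronized
    # image capture; here we only print the intended drive parameters.
    print("firing:", array)
```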
  • Still other example embodiments of the invention include, but are not limited to: • Providing fiber-optics in image acquisition • Providing pulsed illumination with synchronized image capture (e.g., synchronizing the camera's shutter or other gating component, of the camera or otherwise). • Providing enhanced high-intensity radiation in, e.g., a through-lighting orientation, such as by super high intensity radiation, preferably pulsed, from one or more LED arrays, in one or more selected bands
  • Solid state sources such as LEDs
  • solid state sources tend to have a direct cost advantage.
  • Filters, e.g., IR band-pass filters as in FIG. 1, are eliminated because LEDs and LED-based arrays can be provided that deliver narrow band(s) of wavelength(s), thus indirectly reducing the cost and complexity of implementations.
  • o Readily enable implementations having spectral separation between back and top light sources.
  • Clear images are promoted as LEDs tend to have narrow band radiation, which tends to preclude certain problems, e.g., chromatic aberration (where rays of different wavelengths bend differently when passing through lenses).
  • o Narrow band radiation also results in interference fringes in which bonding defects show up as concentric rings due to constructive and destructive interference.
  • o Backlighting is scalable with LEDs by simply increasing size of array.
  • o LEDs have stable light output - eliminates calibration problems with bulbs.
  • o LEDs have long lifetime (~100,000 hours) - no need to replace after only 1000 hours as with a bulb.
  • o LEDs are narrow band and do not put additional IR (heat) energy into the inspection target. Heat could damage target.
  • o LED arrays can be used to selectively provide collimation at one or more wavelengths.
  • o LED arrays can be populated with various wavelength specific LEDs so as to provide various wavelengths at selected times, e.g., sequential or simultaneous pulsing at various power and duty cycles.
  • Optics The lens system typically is selected based on various factors. Examples of these factors include the field-of-view requirements of the imaging/inspection application and the applicable (selected) radiation source orientation (with examples of same described above). Optics typically are treated with antireflective coatings to reduce reflections in a range of selected wavelengths, e.g., those centered on 1070 nm.
  • One example embodiment with particular application to the first example embodiment described above with reference to FIG. 5, uses a zoom lens which provides a field of view ranging from 6 mm to 40 mm.
  • Focus and zoom may be set either/both manually (e.g., by turning a dial) or automatically (e.g., by computer control). For applications where dimensional measurement is required, a telecentric lens may also be used.
  • optics selected for proper magnification and coated for maximum transmission at selected wavelength(s), e.g., wavelength(s) generally in the 700 nm to 3000 nm long-visible-red to near infrared spectra, or more specific band(s), e.g., centered on 1070 nm, or centered on 1200 nm, or in any of various bands, such as the 1050-1200 nm, 1050-1300 nm, or 1000-1300 nm wavelength ranges
  • CCD/CMOS imaging devices, e.g., near the upper wavelength limits of their spectral sensitivity.
  • CCD/CMOS imaging technologies are substantially mature, particularly relative to some infrared camera technologies, such as those based on arrays of certain gallium arsenide detectors or microbolometers. This maturity translates into various advantages for CCD/CMOS imaging devices and the cameras based thereon, particularly as compared to cameras specific to infrared imaging: • Sensor density: CCD/CMOS cameras are commercially available with up to 8 million pixels (compared to typical infrared cameras, which typically have as few as 0.25 million pixels). • Standardized electrical interfaces: CCD/CMOS cameras are commonly available with standard electrical interfaces to frame grabbers, or to flexible high-speed bus architectures, such as IEEE 1394, USB-2, or 100Base-T.
  • CCD/CMOS cameras are significantly less expensive than such infrared cameras (by as much as an order of magnitude).
  • the use of CCD/CMOS imaging devices is enabled by the use of selected radiation wavelength(s). The radiation wavelengths typically are selected based, among other things, on the spectral response of the imaging devices.
  • radiation in the infrared band may be employed, which radiation typically corresponds to significantly diminished sensitivity in semiconductor-based imaging devices, e.g., silicon-based CCDs and CMOS sensors.
  • tools and processes are provided that exclude (or substantially exclude) radiation wavelengths —other than those of the selected infrared wavelength(s) or band(s)— from the imaging device, such exclusion being maintained at least during the time period(s) associated with imaging using the selected wavelengths.
  • the relative insensitivity of the imaging devices is overcome. That is, absent wavelengths to which the CCD/CMOS imaging device is more responsive, the imaging device responds only to the narrow band of selected wavelengths and the image reflects such response.
  • the signal levels for such imaging are brought up to a measurable level using various approaches, such approaches including, as examples, opening the lens aperture, increasing exposure time, increasing electronic gain, digitally averaging multiple acquired images, or using other exposure techniques known in the art.
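Of the signal-boosting approaches just listed, digital averaging of multiple acquired frames is the simplest to illustrate. Below is a minimal sketch, assuming the frames are supplied as equally sized NumPy arrays.

```python
import numpy as np

def average_frames(frames):
    """Average several acquired frames of the same scene. Uncorrelated sensor
    noise falls roughly as 1/sqrt(N), while the weak narrow-band signal is
    preserved, raising the effective signal level."""
    stack = np.stack([np.asarray(f, dtype=float) for f in frames])
    return stack.mean(axis=0)
```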
  • tools and processes are provided which recognize and respond to the quantum efficiencies and other physical properties of solid state imaging devices.
  • Such tools and processes preferably respond to and result from coordination of various facts and factors, including: (a) the particular, to-be-imaged semiconductor structure has known or determined semiconductor materials (and associated band-gap energy or energies) and may have features (including conditions and defects) of known or determined parameters, including as to typical size(s), shape(s) and location(s); (b) radiation wavelength(s) or band(s) of wavelength(s) are selected based on such materials, energies, and parameters, as well as the orientation of the radiation source and subject to the spectral response of the imaging device; (c) one or more radiation sources are selected and oriented, which radiation sources are enabled to provide the selected wavelength(s) and to deliver the radiation at appropriate orientations (e.g., angles and locations, including from the back side of the structure) relative to the semiconductor structure, as well as, preferably, to control radiation characteristics (including as to intensity, collimation, lack of collimation, pulsing, etc.)
  • this general embodiment contemplates coordination between these two factors, which factors may at times tend to push in different directions (e.g., longer wavelengths to pass through the substrate but shorter wavelengths so as to detect and properly image the structure as to its relevant features).
  • the radiation source(s) preferably (i) provides the selected wavelengths (e.g., at appropriate intensities, for sufficient durations, at proper duty cycles) so as to enable the imaging device(s) to capture the image based on the selected wavelengths, i.e., despite the device's relative insensitivity to such wavelengths, while (ii) excluding all (or substantially all) other wavelengths, so that an imaging device's sensor cells are not electrically saturated by such other wavelengths.
  • CCD/CMOS tends to provide various advantages, with examples including: o Cost advantage o Enhanced flexibility in selection of resolution and pixel sizes (e.g., scientific-grade vs mainstream), such that tools and processes may render detail and "big picture" in same view. More information tends to be collected in one snapshot, which simplifies image analysis. o Improved data rates o CCD/CMOS cameras are mainstream and mature.
  • the threshold may then be defined as
  • Shading Correction - Non-planar illumination and distortion in the optics train lead to images that tend to be darker near the edges relative to the center. Shading correction is applied to correct this problem and, in so doing, facilitate qualitative and quantitative comparisons between regions of the image that may be illuminated at different levels. Shading correction entails taking a calibration image using a uniform background, with the illumination power set so that the brightest portion of the image (near the center) almost, but does not quite, saturate the image. To perform shading correction of subsequent images, each pixel of a raw acquired image is corrected by dividing by the value of the corresponding pixel of the calibration image.
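A minimal sketch of the shading correction just described, assuming the raw and calibration images are NumPy arrays of the same shape; the normalization and epsilon guard are implementation choices, not specified in the text.

```python
import numpy as np

def shading_correct(raw, calibration, eps=1e-6):
    """Divide each pixel of the raw image by the corresponding pixel of a
    calibration image taken of a uniform background (illumination set so the
    brightest region, near the center, almost but not quite saturates)."""
    cal = np.asarray(calibration, dtype=float)
    cal = cal / cal.max()                  # normalize so the brightest region ~ 1.0
    return np.asarray(raw, dtype=float) / np.maximum(cal, eps)
```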
  • Scratch removal -
  • the surfaces of unpolished wafers are often marred by scratches (e.g., horizontal, multi-directional, etc.). Scratches tend to add periodic oriented noise to the digital images obtained with a CCD/CMOS camera. Moreover, scratches interfere with standard computer vision techniques, such as template matching, edge detection, and connectivity analysis, as well as with a human operator's ability to inspect for defects. Therefore it is expedient to digitally remove this oriented noise.
  • the removal algorithm (i) transforms the implicated image into the Fourier domain by use of the Fast Fourier Transform, (ii) analyzes the transformed image to detect oriented noise, (iii) subtracts a Gabor filtered approximation of the oriented noise, and finally (iv) converts the result back to the image domain via the inverse Fourier transform.
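The following is a simplified sketch of that four-step idea: the dominant orientation of the periodic noise is estimated from the Fourier magnitude, a Gabor-filtered approximation of the oriented noise is built (using scikit-image's gabor filter), and that approximation is subtracted. The frequency parameter and the peak-picking heuristic are illustrative assumptions; a production implementation would detect and model the noise more carefully.

```python
import numpy as np
from skimage.filters import gabor

def remove_oriented_noise(image, frequency=0.1):
    """Simplified scratch-removal sketch: estimate the orientation of periodic
    noise in the Fourier domain, then subtract a Gabor-filtered approximation
    of that oriented noise from the image."""
    img = np.asarray(image, dtype=float)
    # (i)/(ii) transform and look for the strongest off-center spectral peak
    spectrum = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    mag = np.abs(spectrum)
    cy, cx = mag.shape[0] // 2, mag.shape[1] // 2
    mag[cy - 2:cy + 3, cx - 2:cx + 3] = 0.0       # ignore the DC neighborhood
    py, px = np.unravel_index(np.argmax(mag), mag.shape)
    theta = np.arctan2(py - cy, px - cx)           # orientation of the noise
    # (iii) Gabor-filtered approximation of the oriented noise
    noise_estimate, _ = gabor(img, frequency=frequency, theta=theta)
    # (iv) subtract it (the Gabor response is already in the image domain, so
    # no explicit inverse FFT is needed in this simplified version)
    return img - noise_estimate
```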
  • This function has the effect of linearly stretching the histogram over the complete dynamic range (0 ... 2^bits - 1) of a pixel represented with a number of bits equal to bits. In an example embodiment according to this invention 8 bits per pixel are used; however, it is understood that other bit values may be used without departing from the principles of the invention.
  • the parameters a and b (nominally set to 0) control the dark level and bright level of the histogram. Larger values of a cause the histogram to be stretched more, with one or more values of p merged into a single value of n. Larger values of b also increase the degree of histogram stretching, i.e., by causing one or more values of p to be merged into a single value of n.
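The stretching function itself appears in the patent as an image and is not reproduced in this text, so the sketch below is only a plausible reconstruction consistent with the description: a linear stretch onto 0 ... 2^bits - 1, where a raises the dark clip level and b lowers the bright clip level (both nominally 0).

```python
import numpy as np

def stretch_contrast(image, a=0.0, b=0.0, bits=8):
    """Linearly stretch pixel values p onto the full range 0 .. 2**bits - 1.
    Larger a or b clips more of the histogram, so several input values p end
    up merged into a single output value n, increasing the stretch."""
    p = np.asarray(image, dtype=float)
    lo = p.min() + a          # dark level
    hi = p.max() - b          # bright level
    n = (p - lo) / max(hi - lo, 1e-12) * (2 ** bits - 1)
    return np.clip(n, 0, 2 ** bits - 1)
```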
  • Deconvolution is the process of undoing the smearing in a data set that has occurred under the influence of a known response function, for example, because of the known effect of a less-than-perfect measuring apparatus. This definition comes from: Press, W., Teukolsky, S.A., Vetterling, W.T., and Flannery, B.P., 1992, Numerical Recipes in C: The Art of Scientific Computing, Second Edition (Cambridge: Cambridge University Press).
  • Deconvolution can be applied to help eliminate blurring effects of the optical or imaging system and can yield improved object resolution.
  • Wiener filtering - Wiener filtering is similar to deconvolution, but in the presence of noise. The process is to find an optimal filter, which is applied to the input image before deconvolution to eliminate the deleterious effects of noise. This is a well-known technique described in Press, W., Teukolsky, S.A., Vetterling, W.T., and Flannery, B.P., 1992, Numerical Recipes in C: The Art of Scientific Computing, Second Edition (Cambridge: Cambridge University Press) and Gonzalez, R.C. and Wintz, P., 1987, Digital Image Processing, Second Edition (Reading, Massachusetts: Addison-Wesley Publishing Company).
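A minimal frequency-domain sketch of Wiener-style deconvolution follows, assuming the blur kernel (point spread function) of the optics is known and approximating the noise by a constant noise-to-signal ratio; the references cited above develop the method fully.

```python
import numpy as np

def wiener_deconvolve(image, psf, nsr=0.01):
    """Wiener deconvolution: W = conj(H) / (|H|^2 + NSR). For nsr -> 0 this
    reduces to a plain inverse filter (noise-free deconvolution)."""
    img = np.asarray(image, dtype=float)
    # pad the PSF to the image size and circularly shift it so its center
    # sits at the origin (required for FFT-based filtering)
    psf_pad = np.zeros_like(img)
    psf_pad[:psf.shape[0], :psf.shape[1]] = psf
    psf_pad = np.roll(psf_pad, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    H = np.fft.fft2(psf_pad)               # optical transfer function
    G = np.fft.fft2(img)                   # blurred, noisy image spectrum
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(W * G))
```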
  • FIG. 12 illustrates an example flow chart of representative image processing operations contemplated to be performed in accordance with this invention.
  • a raw image is captured by a CCD/CMOS camera and made available for image processing.
  • a computer 6 such as a PC.
  • the algorithms may be provided via electronics and/or software associated with the camera itself (e.g., triggered by selecting a hard or soft button on the camera).
  • dust is removed, as described above or otherwise.
  • shading correction is performed, as described above or otherwise.
  • in step 14, scratch removal is performed, as described above or otherwise.
  • contrast enhancement is performed, as described above or otherwise.
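Putting the steps just listed together, a chain like the following (reusing the shading_correct, remove_oriented_noise and stretch_contrast sketches given earlier) illustrates the flow; the median filter standing in for dust removal is an assumption, since the dust-removal details are not given in this excerpt.

```python
import numpy as np
from scipy.ndimage import median_filter

def process_raw_image(raw, calibration):
    """Illustrative FIG. 12-style chain: dust removal, shading correction,
    scratch removal, then contrast enhancement."""
    img = median_filter(np.asarray(raw, dtype=float), size=3)  # dust removal (stand-in)
    img = shading_correct(img, calibration)                    # shading correction
    img = remove_oriented_noise(img)                           # scratch removal
    return stretch_contrast(img)                               # contrast enhancement
```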
  • FIG. 13 shows a cutaway side view of a typical MEMS sensor wafer sandwich.
  • the incident radiation generally will include selected infrared wavelength(s) or band(s), so as to penetrate to the bond layer and any relevant features therein (e.g., the above-described conditions and defects).
  • FIG. 14 shows a cutaway side view of a typical fusion bonded, bare wafer sandwich as is typically used in production of Silicon on Insulator (SOI) bare wafers.
  • This structure includes a substrate carrier layer 110, a cap layer 112 and a bond layer 114.
  • uniformity and integrity in the bond layer are generally of importance. As such, presence of particulates, voids or other defects 116 in the bond layer 114, or even slight differences in uniformity, are not desirable.
  • the incident radiation generally will include selected infrared wavelength(s) or band(s), so as to penetrate to the bond layer and to any relevant features therein (e.g., the above-described conditions and defects).
  • when imaging, subjecting the structure to narrow band IR backlight illumination in the presence of particulates/voids in the bond layer 114, or even slight differences in uniformity therein, will generally result in formation of interference fringes in the image.
  • FIG. 15 illustrates a representative ring pattern that might typically be formed when imaging a fusion bonded bare wafer using tools and processes in accordance with the invention. These periodic patterns are readily detected by eye, and may be automatically detected by an algorithm designed to detect such periodic features. The fringes may also be used to estimate the height of internal defects. One full period of an interference fringe (transition from dark to light to dark again) corresponds to a change in distance between bonded materials of 1 wavelength of incident light.
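Taking the statement above at face value (one full fringe period per wavelength of separation change), a rough defect-height estimate follows directly from a fringe count; the 1070 nm figure in the example is simply one of the bands mentioned elsewhere in the text.

```python
def height_change_from_fringes(full_fringe_periods, wavelength_nm):
    """Per the description above, each full dark-light-dark period corresponds
    to a change in separation of one wavelength of the incident light."""
    return full_fringe_periods * wavelength_nm

# e.g. 3 full rings across a void under 1070 nm illumination:
print(height_change_from_fringes(3, 1070) / 1000.0, "microns")   # ~3.2 microns
```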
  • FIG. 16 shows a cutaway side view of a single semiconductor wafer 120, which wafer 120 may be either patterned or non-patterned.
  • a crystalline bond is shown to be cleaved, resulting in formation of a micro crack 122.
  • a void or insertion defect 124 is also interior to the wafer.
  • each such crack or defect may escape detection using conventional imaging and inspection approaches (e.g., with typical pattern, electrical, or surface inspection tools).
  • even one such micro-crack 122 or defect 124 may result in unacceptably low, and thus, costly yields and/or poor long-term reliability.
  • relevant features for imaging and inspection in even single wafers include micro-cracks 122 and void, insertion or other defects 124 interior to the wafer 120.
  • the incident radiation generally will include selected infrared wavelength(s) or band(s), so as to penetrate into the wafer's interior to any of the above-described conditions or defects.
  • FIG. 17 guides discussion of how various features of a semiconductor device are depicted when imaged using tools and processes in accordance with the invention.
  • a typical patterned wafer 130 has devices 132 positioned in a regular grid on the wafer's surface. Dicing lines 134 indicate where the wafer 130 will be cut to liberate individual devices 132 for packaging and, prior to such cutting process, such lanes separate the devices 132.
  • a bond region 136 is recognized (and distinguished) from a "no print" zone 138 surrounding certain active circuitry 140 of the device 132. That recognition is achieved based on variation in the intensity of light reaching the camera at that position, relative to other positions. Since the bonding material attenuates light more than the air filling a void region, the bond region 136 (demarcated by inner boundary 142 and outer boundary 144) will appear darker in the resulting image.
  • an example process for inspecting a typical patterned wafer has the following steps: • Locate the circuitry 140 and/or the device 132 (optionally, use this position to fix the detection of the following features).
  • the circuitry/device 140, 132 may be located by, e.g., a template matching algorithm; an algorithm utilizing normalized correlation suffices.
  • the outer boundary 144 may be located, e.g., with any of a variety of line detection or "caliper" tools. Such tools find the location of extended straight lines between regions of image contrast.
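As one way to realize the normalized-correlation template matching mentioned above, the sketch below uses OpenCV's matchTemplate purely as an illustration; any normalized-correlation implementation would serve, and the function and variable names are assumptions rather than names from the patent.

```python
import cv2

def locate_by_template(image, template):
    """Find the best match of `template` in `image` using normalized
    correlation; returns the top-left pixel offset of the match and its
    correlation score."""
    result = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    return top_left, score
```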
  • Flip chip devices 150 are typically bare dies placed upside down upon a piece of interposer material, typically glass based or organic based. Placement accuracy can be ascertained by imaging and inspecting the position of an on-chip alignment target 152 (e.g., imaging through the chip) relative to the position of an alignment target 154 on the substrate 156. In this FIG. 18 the respective alignment targets are shown to have a displacement Δ. A similar application arises when two wafers are to be aligned and then bonded. As shown in FIG. 19, a lower substrate 160 is positioned underneath a cap substrate 162. Substrates 160 and 162 have respective alignment targets 164 and 166.
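Building on the template-matching sketch above, the displacement Δ between the on-chip and substrate alignment targets of FIG. 18 (or the wafer targets of FIG. 19) could be estimated as follows; the target templates and any pixel-to-micron scaling are assumed inputs, not details given in the text.

```python
import math

def alignment_displacement(image, chip_target_tmpl, substrate_target_tmpl):
    """Locate both alignment targets by normalized-correlation matching
    (using locate_by_template from the previous sketch) and report their
    displacement in pixels, both as (dx, dy) and as a magnitude."""
    (x1, y1), _ = locate_by_template(image, chip_target_tmpl)
    (x2, y2), _ = locate_by_template(image, substrate_target_tmpl)
    dx, dy = x2 - x1, y2 - y1
    return (dx, dy), math.hypot(dx, dy)
```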

Abstract

The invention consists of a camera, light sources, lenses and software algorithms that are used to image and inspect semiconductor structures, including through infrared radiation. The use of various configurations of solid state lighting and software algorithms enhances the imaging and inspection.

Description

IMAGING SEMICONDUCTOR STRUCTURES USING SOLID STATE ILLUMINATION Inventors: Mark D. Owen, Francois Vlach, and Steven J. Olson
Related Applications This invention claims the benefit of co-pending U.S. Provisional Application No. 60/563,856, entitled METHOD AND APPARATUS FOR THROUGH-SUBSTRATE IMAGING AND INSPECTION OF BONDED SILICON WAFERS USING SOLID STATE ILLUMINATION, filed on April 19, 2004, the entire disclosure of which is hereby incorporated by reference, as if set forth herein, for all purposes.
Background of the Invention The semiconductor industry is continually innovating in fabrication processes. This innovation has resulted, and will likely continue to result, in the development of new structures and, as such, new semiconductor devices. More specifically, this innovation has taken semiconductor fabrication from (a) having active circuitry in largely flat layers disposed substantially at or in the very top of a single semiconductor substrate, toward (b) providing active circuitry at one or more of various layers, in new substrates, or in combination(s) of substrates, including between two or more bonded or stacked substrates. This innovation has resulted in semiconductor devices such as Micro Electro Mechanical Systems (MEMS), Micro Electro Optical Mechanical Systems (MOEMS), Silicon on Insulator (SOI) devices and Light Emitting Diodes (LEDs). Fabrication innovations in the semiconductor industry generally are accompanied by innovations in test and quality control. In test and quality control, tools and processes are employed that identify defects in particular chips/wafers, while also generally contributing to improvements in fabrication (e.g., process control so as to increase yield) and reliability (e.g., to anticipate and help control failure parameters of products in the field). Such tools and processes are directed, among other things, to imaging and inspecting semiconductor devices, particularly as to the semiconductor structures thereof. Accordingly, when fabrication innovation results in new semiconductor structures, innovations generally keep pace in tools and processes so as to enable imaging and inspection of such structures. As would be expected for conventional semiconductor devices having active circuitry substantially at or near the surface of a single semiconductor substrate, conventional imaging and inspection tools and processes are employed. These tools and processes enable identification of features located substantially at or near the wafer's surface, e.g., within approximately 200 Angstroms of the wafer's surface. Clearly, these tools and processes have capabilities paired to the structures that are to be imaged or inspected. As for conventional semiconductor devices, new semiconductor devices generally need tools and processes that enable imaging and inspection of the relevant features of a device's structure(s), including to identify various conditions and to detect defects. However, these relevant features may be disposed other than at or near the surface of the substrate. Indeed, these relevant features within bonded or stacked substrates tend to be located inside the bonded or stacked layers (e.g., in the interface layer(s), including the characteristics of the bond itself). As such, for these and other new semiconductor devices, conventional imaging and inspection tends generally to be insufficiently effective, or even ineffective, if performed using the above-described conventional tools and processes. Tools and processes have been developed that enable imaging and inspection of features relevant to the structure(s) of the above described semiconductor devices. To illustrate, tools and processes exist for imaging and inspection of semiconductor devices having bonded or stacked substrates, or other structures based on bonding or stacking materials. These tools and processes include infrared microscopy using high magnification optics under infrared light provided by bulbs; X-Ray imaging; and ultrasonic imaging.
Of these, ultrasonic imaging may be the most prevalent. It entails placing a wafer in a liquid bath, applying an ultrasonic signal and, using ultrasound wave flight measurement, constructing a map of the wafer bond's integrity. Even though prevalent, ultrasonic imaging has several drawbacks. These drawbacks include, as examples: the liquid bath tends to be detrimental to electronic production environments; it not only adds the steps described above, but also introduces additional steps before fabrication can proceed (e.g., to clean and dry the wafer); and it enables only the inspection for wafer bond defects, such that other relevant conditions or defects are identified/detected using additional imaging/inspection tools and/or processes. The drawbacks of ultrasonic imaging are not present in infrared microscopy. Infrared microscopy, as illustrated in FIG. 1, typically entails using a halogen or other bulb light source 10 in conjunction with an appropriate infrared high-pass or band-pass filter 20 so as to generate infrared light. The infrared light is provided to irradiate objects 50 via a fiber optic light guide 2 and a lens system 3. In this configuration, the infrared light is directed to objects 50 via an internal beam splitter in the lens system 3. The infrared light, so directed, generally is reflected by objects 50 at various intensities (e.g., depending on the bond characteristics and other structural features of the semiconductor device) back up through the lens system 3 to an infrared camera 60 for image capture. Via such image, test and quality control may be performed, e.g., to inspect the relevant features, including to identify various conditions and to detect defects, such as in bonding and adjacent layer(s). While infrared microscopy provides for imaging and inspection of semiconductor devices having bonded or stacked substrates, microscopy also tends to have drawbacks. As an example, a typical light source is a halogen or other bulb, which provides light across a broad spectrum, including infrared. In order to provide infrared light, then, an appropriate filter is used. As another example, a typical infrared camera in conventional microscopy arrangements is or employs, e.g., a vidicon camera, gallium arsenide detectors, microbolometers, or other scientific, professional or industrial-grade technologies, which technologies tend to be technically more complex to develop, manufacture and use, while also tending to be produced in lower volumes and at higher costs than mainstream solid state imaging devices (e.g., standard, consumer-grade, silicon-based charge coupled devices or CMOS image sensors, used in, for example, consumer digital still cameras that are widely sold to average consumers in retail outlets). Accordingly, it is desirable to have tools and processes that broadly enable imaging and inspection of the various features relevant to selected structure(s) of semiconductor devices. In addition, it is desirable to have tools and processes that enable imaging and inspection of features relevant to selected structure(s) of semiconductor devices, particularly where such structures and associated features are disposed other than at or near the surface of the device.
Summary of the Invention The present invention provides tools and processes that broadly enable imaging and inspection of the various features relevant to selected structure(s) of semiconductor devices. The present invention also provides tools and processes that enable imaging and inspection of features relevant to selected structure(s) of semiconductor devices, particularly where such structures and associated features are disposed other than at or near the surface of the device. The present invention also provides tools and processes that enable imaging and inspection of features relevant to selected structure(s) of semiconductor devices, where such relevant features (such as defects) are associated with bonded or stacked layers (e.g., in the interfacing layer(s) of bonded or stacked substrates or in the bond itself) or with other bonded or stacked materials. The present invention also provides tools and processes that have enhanced source(s) of radiation, particularly infrared radiation. Such source(s) are variously enhanced, including, as examples, as to selectivity of the radiation's wavelength(s)
(including variations therein, e.g., over time), control and quality of collimation (as well as selected departures therefrom, including as a function of wavelength), control and quality of coherence (as well as selected departures therefrom), control over intensity (e.g., selected variations therein, including as a function of wavelength), control over duty cycle (e.g., from pulsed to continuous, including as a function of wavelength), as well as other characteristics of the source and/or its radiation. The present invention also provides tools and processes that employ infrared camera(s) based on or using either or both scientific-grade and/or mainstream solid state imaging devices. The present invention also provides tools and processes that —as to infrared wavelength(s) capable of imaging selected, relevant features of a selected semiconductor structure— couple a light source and a solid state imaging device, such that the light source is enabled to provide such infrared wavelength(s) and the imaging device is appropriately responsive to such wavelength(s). In this example embodiment, the infrared wavelength(s) may be selected not only for ability to detect such features, but also for transmissiveness through the entire semiconductor structure. Moreover, in this example embodiment, the imaging device preferably also has sufficient resolution to properly image the condition or defect being imaged. In this embodiment, the light source preferably is enabled to provide such infrared wavelength(s), e.g., (a) to the exclusion of other wavelengths, at least at some selected time and/or for a selected time duration, and (b) with selected characteristics, including as to intensity, collimation, and the like. The present invention also provides tools and processes that —as to radiated wavelength(s) capable of imaging selected, relevant features of a selected semiconductor structure— couple a light source and a solid state imaging device (e.g., a camera based on such a device), such that the light source is enabled to provide such wavelength(s) and the imaging device is appropriately responsive to such wavelength(s). In this example embodiment, certain wavelength(s) may be selected not only for ability to detect such features, but also for transmissiveness through the entire semiconductor structure, e.g., infrared wavelengths. Moreover, so as to enable or enhance imaging and inspection, the selected wavelengths may include combinations of wavelengths or bands of wavelengths among one or more of the visible, infrared and/or ultraviolet spectra, simultaneously or at different times. In this example embodiment, the imaging device preferably also has sufficient resolution to properly image the condition or defect being imaged. This example embodiment also contemplates one or more imaging devices, wherein each imaging device may be tuned to specific wavelength(s) or band(s) of wavelengths based, e.g., on the respective device's sensitivity to such wavelengths and/or its ability to resolve features sought to be imaged. In this embodiment, the light source preferably is enabled to provide such infrared wavelength(s), e.g., (a) to the exclusion of other wavelengths, at least at some selected time and/or for a selected time duration, and (b) with selected characteristics, including as to intensity, collimation, and the like.
In a general embodiment in accordance with this invention, tools and processes are provided which recognize and respond to the quantum efficiencies and other physical properties of solid state imaging devices. Such tools and processes preferably respond to and result from coordination of various facts and factors, including: (a) the particular, to-be-imaged semiconductor structure has known or determined semiconductor materials (and associated band-gap energy or energies) and may have features of known or determined parameters, including as to typical size, shape and location; (b) radiation wavelength(s) or band(s) of wavelength(s) are selected based on such materials, energies, and parameters, as well as the orientation of the radiation source and subject to the spectral response of the imaging device; (c) the radiation source is selected and oriented, which radiation source is enabled both to provide the selected wavelengths, to control radiation characteristics (including as to intensity, collimation, lack of collimation, pulsing, etc.), and to deliver the radiation at appropriate orientations (e.g., angles and locations, including from the back side of the structure) relative to the semiconductor structure; (d) a lens system is selected so as to transmit the selected wavelengths to the imaging device and to match the lens' image-forming capabilities with the imaging device's image-capture capabilities (e.g., the lens is able to resolve features of size equal to, or less than, the feature sizes that the imaging device resolves), so as to properly image the features; and (e) the imaging device is able to capture an image of the features, based on sufficient sensitivity to the selected wavelength(s) and having sensor cell size and number sufficient to resolve the imaged features, as well as proper delivery of the selected radiation. To illustrate, when imaging based on a radiation orientation that directs the selected wavelength(s) to the back side of the structure, the radiation source preferably provides infrared wavelength(s) that may be transmitted entirely through the to-be-imaged structure and that are half or less than half the relevant dimensions of the feature to be detected. Moreover, the radiation source preferably (i) provides the selected wavelengths (e.g., at appropriate intensities and for sufficient durations) so as to enable the imaging device to capture the image based on the selected wavelengths, i.e., despite the device's relative insensitivity to such wavelengths, while (ii) excluding all other wavelengths so that the imaging device's sensor cells are not electrically saturated by such other wavelengths. These and other embodiments are described in more detail in the following detailed descriptions and the figures. The foregoing is not intended to be exhaustive of all embodiments and features of the present invention. Persons skilled in the art are capable of appreciating other embodiments and features from the following detailed description in conjunction with the drawings.
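As a compact illustration of the two-sided wavelength selection rule just described for back-side (through-substrate) illumination, the check below assumes the substrate's critical wavelength and the relevant feature dimension are known; the silicon value in the example is the commonly cited approximate figure, not a value quoted from the patent's Table 1.

```python
def wavelength_is_suitable(wavelength_nm, substrate_critical_wavelength_nm, feature_size_nm):
    """The wavelength should exceed the substrate's critical wavelength (so it
    is transmitted rather than absorbed) yet be no more than half the relevant
    feature dimension (so the feature can still be detected)."""
    transmits = wavelength_nm > substrate_critical_wavelength_nm
    resolves = wavelength_nm <= feature_size_nm / 2.0
    return transmits and resolves

# Silicon substrate (critical wavelength roughly 1100 nm) and a 3 micron feature:
print(wavelength_is_suitable(1200, 1100, 3000))   # True
```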
Brief Description of the Drawings FIG. 1 is a schematic diagram of a conventional infrared microscopy arrangement using an IR bulb and filter. FIG. 2 shows representative quantum efficiency curves of several model CCDs, as well as of a typical human eye. FIG. 3 shows the spectral response of a typical, high performance CCD. FIG. 4 shows an example of a representative composite sensitivity for a hypothetical imaging device. FIG. 5 shows a first example embodiment in accordance with the invention. FIG. 6 shows a second example embodiment in accordance with the invention. FIG. 7 shows a third example embodiment in accordance with the invention. FIG. 8 shows an extension of the third example embodiment in accordance with the invention. FIG. 9 shows a fourth example embodiment in accordance with the invention. FIG. 10 shows a fifth example embodiment in accordance with the invention. FIG. 11 shows an extension of the fifth example embodiment in accordance with the invention. FIG. 12 illustrates an example flow chart of representative image processing operations contemplated to be performed in accordance with this invention. FIG. 13 shows a cutaway side view of a typical MEMS sensor wafer sandwich for imaging and inspection using tools and processes in accordance with the invention. FIG. 14 shows a cutaway side view of a typical fusion bonded wafer sandwich for imaging and inspection using tools and processes in accordance with the invention. FIG. 15 illustrates representative ring patterns that might typically be formed when imaging a fusion bonded bare wafer using tools and processes in accordance with the invention. FIG. 16 shows a cutaway side view of a single semiconductor wafer for imaging and inspection using tools and processes in accordance with the invention. FIG. 17 guides discussion of how various features of a semiconductor device are depicted when imaged using tools and processes in accordance with the invention. FIG. 18 shows a cutaway side view of a semiconductor package application for imaging and inspection using tools and processes in accordance with the invention. FIG. 19 shows a cutaway side view of two wafers being aligned and bonded.
Detailed Description of the Invention Representative embodiments of the present invention are shown in Figs. 1-19, wherein similar features share common reference numerals. Solid State Imaging Devices
Solid state imaging devices (e.g., charge coupled devices (CCDs) or CMOS image sensors) have been developed that (a) sense incident radiation and (b) where such incident radiation is representative of an image, capture such image. These imaging devices respond to, and perform based on, the known physical relationship among semiconductors and incident radiation, which relationship, generally, provides that photons may interact with silicon to produce electric charge. Though known, the relationship is a relatively complex function involving various factors, including the incident light's wavelength, the implicated semiconductor material(s), and the semiconductor material's doping (e.g., the dopant(s), concentration(s) and dimensional profiles of such doping). This relationship provides, for selected wavelengths in the infrared spectrum, that semiconductor materials tend to be more or less transmissive of incident radiation. In this relationship, the implicated semiconductor material's band-gap energy figures prominently. This band-gap energy is a constant. Generally, this band-gap energy represents the minimum amount of energy required for an electron to jump an energy band (e.g., from a valence band to the conduction band). This band-gap energy, for the particular semiconductor material, follows the formula:
Ee(material) = hc/λ, where h is Planck's constant, c is the velocity of light in vacuum and λ is the wavelength of incident radiation. Applied to imaging, the above formula may be restated to describe each semiconductor material's critical wavelength for incident radiation, as follows: λc = hc/Ee(material). This restated formula may be used to determine whether or not, in the collision of a photon of a specific wavelength with an atom of a particular semiconductor material, any electrons are likely to be excited from the valence band to the conduction band due to the reaction between the photons and orbital electrons. Above the material-specific critical wavelength λc, the incident radiation's photons are unlikely to so excite an electron and, as such, the photons are unlikely to produce charge for image capture. Conversely, when a particular semiconductor material is subject to incident radiation of a wavelength at or below λc (i.e., corresponding to energy above the material's band-gap energy), the collision of photons with the material's atoms is likely to excite valence-band electron(s) into the conduction band. When incident radiation exceeds a material's critical wavelength λc, the radiation's photons tend to penetrate either deeply into or completely through the material. Table 1 below lists the band-gap energy and critical wavelength (calculated using such energies) for each of a variety of materials. From this table, it is apparent that typical substrate materials such as germanium, silicon and gallium arsenide are characterized by critical wavelengths in the infrared spectrum, particularly the near infrared spectrum.
Table 1
[Table 1 (image not reproduced in this text): band-gap energies and corresponding critical wavelengths for a variety of semiconductor materials.]
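Since Table 1 itself is not reproduced here, the following short calculation shows how the critical wavelengths follow from λc = hc/Ee(material); the band-gap values are commonly cited approximate room-temperature figures, not values quoted from the table.

```python
H = 6.626e-34    # Planck's constant, J*s
C = 2.998e8      # speed of light in vacuum, m/s
EV = 1.602e-19   # joules per electron-volt

def critical_wavelength_nm(band_gap_ev):
    """Critical wavelength in nanometres: lambda_c = h*c / Ee(material)."""
    return H * C / (band_gap_ev * EV) * 1e9

for name, eg_ev in [("Ge", 0.66), ("Si", 1.12), ("GaAs", 1.42)]:
    print(f"{name}: ~{critical_wavelength_nm(eg_ev):.0f} nm")
# Ge ~1880 nm, Si ~1110 nm, GaAs ~870 nm -- all in the (near) infrared
```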
Table 2 below is representative of the depth to which incident photons tend to penetrate a model, silicon-based CCD. From this table, it is apparent that penetration (and conversely, absorption) of photons in silicon, as well as other semiconductor materials, is wavelength dependent. Indeed, as incident radiation goes further into the infrared spectrum, photons tend to penetrate ever more deeply into the semiconductor material. That is, in a solid state imaging device, photons of ever longer infrared wavelengths tend to penetrate ever more deeply into the bulk of the substrate. Where penetration exceeds the thickness of the device's substrate, the incident radiation may pass through the substrate, and device, entirely. It is noted, however, that the penetration depth (and, conversely, absorption) of a photon into a silicon-based CCD, or other solid state imaging device, will tend also to depend on other structures (e.g., passivation layers, oxide layers, metal and polysilicon interconnect layers, diffusion layers, active layer shielding elements, protective windows, etc.) which the photon may encounter along the way. Table 2
[Table 2 (image not reproduced in this text): representative photon penetration depths in a model silicon-based CCD as a function of wavelength.]
When a photon is absorbed by a solid state imaging device, as previously described, electronic charge is generated through the excitation of valence-band electron(s) into the conduction band (i.e., creating an electron and a hole). The amount of electronic charge generated in connection with the photon depends on various factors. These factors include the radiant power density of the radiation incident on the device, the total duration of irradiation of the device and, if pulsed, the duty cycle of the radiation. Generally, greater power density results in greater charge generation. Similarly, greater irradiation duration and duty cycle (e.g., approaching or being continuous irradiation) also result in greater charge generation. Charge generation factors also include: the diffusion length and recombination rate of the liberated electron (or hole); the chemical and physical nature of materials overlying the device's surface (e.g., shielding elements); and the location and depth at which the photon is absorbed (relative to the location and depth of relevant circuit structures). As to the lattermost factor, if photons are absorbed at locations and depths in the potential well of, e.g., a CCD, the photons are likely to generate charge that will be effective in the CCD's imaging operations. Conversely, if photons are absorbed deep in the CCD's substrate, any electron-hole pairs created by the photons may be more likely to recombine before contributing to the device's imaging operation. In that case, the photons are ineffective in the CCD's imaging operations. In solid state imaging devices, like CCDs and CMOS sensors, a device's responsiveness in converting incident radiation to charge effective for the device's imaging operations typically is known as "quantum efficiency". FIG. 2 shows representative quantum efficiency curves of several model CCDs (i.e., these CCDs and the curves do not necessarily correspond to any actual devices or curves, but are meant to be typical of the devices and curves). Figure 2 also shows correlative responsiveness of the typical human eye. In doing so, Figure 2 provides data extending from the longer wavelengths of the ultraviolet light spectrum, across the visible light spectrum and into the near infrared light spectrum. As is apparent from Figure 2, the human eye's photopic and scotopic vision (i.e., arising from cones and rods, respectively) is far less sensitive than typical CCDs, both as to any individual wavelength and in terms of the range of wavelengths. As is also apparent from Figure 2, the model CCDs tend to have more substantial quantum efficiency (and, thus, sensitivity) in the visible light spectrum, with diminishing response at and into the near infrared spectrum. These conclusions are expected to apply similarly to typical CMOS sensors as well as solid state imaging devices generally. Figure 2 also illustrates that among solid state imaging devices, including these model CCDs, some will exhibit quantum efficiencies that are superior to others. Figure 2 illustrates this using representative quantum efficiency curves for model CCDs that are labeled, respectively, as standard, "back thinned" and "blue plus". Compared to the standard CCD, the "back thinned" and "blue plus" CCDs generally exhibit enhanced performance across most of the spectrum (subject to (a) ripples thought to be due to antireflective films often used on protective windows found in many CCDs and (b) slight performance degradation of the "blue plus" in the approximately 750-875 nm range).
Certain enhanced-performance CCDs, such as those illustrated in Figure 2, characterize a device category often referred to in the industry as "scientific-grade" (also sometimes referred to as "professional-grade" or other such terms). Generally, scientific-grade solid state imaging devices offer various advantages over mainstream solid state imaging devices (e.g., the standard consumer-grade, silicon-based imaging devices used in, for example, digital still cameras widely sold to average consumers in retail outlets). As illustrated already, one typical advantage is enhanced quantum efficiencies, typically either across a broad spectrum as in Figure 2 or as to a specific section of interest. Other advantages typically include, as examples: higher signal to noise ratios, larger dynamic ranges, fewer defects, lower noise, enhanced gain uniformity across the array of sensors, and enhanced control of the chip's operations (e.g., control over read-out rates and shutter speeds, as well as over performance enhancement features, such as "binning", which pools the charge of a select number of adjacent sensor cells). However, scientific-grade imaging devices also tend to have disadvantages. Typical disadvantages include, for example, that they tend to be technically more complex to develop, manufacture and use, while also tending to be produced in lower volumes and at higher costs than mainstream solid state imaging devices. As well, compared to mainstream solid state imaging devices, scientific-grade solid state imaging devices generally have relatively large dimensions for each sensor cell, together with either relatively few pixels or relatively large total array size. To illustrate, scientific-grade devices typically have unit sensor cell sizes ranging from about 4.65 microns to as much as 24 microns on a side, with typical sizes tending to be between 6.5 and 9 microns. These devices have pixel numbers ranging from the low thousands up to approximately 8 megapixels, but with typical pixel numbers tending to be between 0.3 and 1.5 megapixels. These devices have various total array areas; however, for those example devices having 3-6 megapixels, the total array area tends to exceed 20 mm (measured on the diagonal). An example is the Marconi Applied Technologies CCD39-01 sensor, which is a back illuminated CCD having square unit sensor cells, each such cell having sides of 24 microns. This chip has a pixel number of 80x80 pixels (6,400 pixels total), and a total array area of only 1.92 mm x 1.92 mm. This chip's quantum efficiency curve is shown in FIG. 3. The curve exhibits little to no ripple, which is thought to follow from the absence of an antireflective coating. As well, this curve exhibits substantial quantum efficiency across a broad spectrum of wavelengths, from the ultraviolet into the infrared, which quality may reflect, in addition to other design and fabrication choices, the absence of a protective window over the sensor cells and/or the absence of an infrared cut-off filter (i.e., an optical filter blocking infrared wavelengths). Another example of a scientific-grade CCD is the Sony ICX285AL. This chip provides 1.5 megapixels, wherein each unit sensor cell is 6.45 microns x 6.45 microns. This chip has a total array area of 11 mm (on the diagonal). By comparison, mainstream solid state imaging devices having the same total array area of 11 mm as the Sony chip typically provide 6-8 megapixels, wherein each unit sensor cell is at or below 2.5 microns per side.
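For illustration of the "binning" feature mentioned above, a minimal sketch follows. It emulates 2x2 binning in software on an already-captured frame (on-chip binning pools charge before read-out); the function name is an assumption, not part of this disclosure.

    import numpy as np

    # 2x2 binning sketch: pool the values of each 2x2 block of sensor cells
    # into one output value, trading spatial resolution for signal.
    def bin2x2(frame):
        h, w = frame.shape
        h2, w2 = h - h % 2, w - w % 2            # trim to even dimensions
        blocks = frame[:h2, :w2].reshape(h2 // 2, 2, w2 // 2, 2)
        return blocks.sum(axis=(1, 3))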
Scientific-grade imaging devices generally are specified and used as a matter of industry practice for cameras and systems directed to imaging and inspection of semiconductor structures. Generally, such specification is weighted based on the larger sensor cell sizes typical of scientific-grade imaging devices, i.e., relative to mainstream imaging devices. Larger sensor cell sizes enable collection of larger amounts of charge over a fixed exposure time or, similarly, enable collection of a required amount of charge over a shorter period of time. As well, larger sensor cells promote greater signal to noise ratios and greater dynamic range (i.e., so as to enable clear recognition of both bright and dim areas in an image). As well, as previously described, quantum efficiencies may be important in any particular application, whether at a particular wavelength, across a range of wavelengths or in a particular band of interest.

In an example embodiment in accordance with this invention, one or more scientific-grade imaging devices are used for cameras and systems directed to imaging and inspection of semiconductor structures. To illustrate, one or more such scientific-grade imaging devices may be employed in tools and processes that enable imaging and inspection of features relevant to selected structure(s) of semiconductor devices, particularly where such structures and associated features are disposed other than at or near the surface of the device. To further illustrate, such imaging devices are employed to image and inspect relevant features (including various conditions and defects) associated with bonded or stacked layers (e.g., in the interfacing layer(s) of bonded or stacked substrates or in the bond itself) or with other bonded or stacked materials. To so image and inspect, the one or more imaging devices preferably are coupled with one or more radiation source(s), particularly enhanced source(s) of radiation. Where certain wavelength(s) are beneficial or otherwise required to image selected, relevant features of a selected semiconductor structure, the imaging device preferably is (a) coupled to one or more radiation sources that provide such wavelength(s) and (b) appropriately responsive to such wavelength(s). In providing such certain wavelength(s), the radiation source(s) may exclude other wavelengths, at least at some selected time and/or for a selected time duration. In this example, the certain wavelengths may include infrared wavelengths alone, or in combination with wavelengths of the visible or ultraviolet spectra, such combination being simultaneous or at different times. This example also contemplates one or more imaging devices, wherein each imaging device may be tuned to specific wavelength(s) or band(s) of wavelengths.

In another example embodiment in accordance with this invention, one or more mainstream solid state imaging devices may be used. Generally, one or more mainstream imaging devices may be substituted for selected or all imaging devices in the example embodiment described above relating to use of scientific-grade solid state imaging devices. Such use is either alone or in conjunction with one or more scientific-grade imaging devices.
In another example embodiment, mainstream devices may be used, e.g., to image and inspect features relevant to selected structure(s) of semiconductor devices, particularly where such features have sizes more compatible, in the context of the imaging system, with imaging via the typically smaller sensor cells of the mainstream imaging device than with the sensor cells of the scientific-grade imaging device. Generally, such use of mainstream devices, due to their smaller sensor cells, may introduce sensitivity, signal to noise and dynamic range issues, with attendant ramifications, e.g., in the provision of radiation and exclusion of noise. Moreover, to provide proper (e.g., sharp) imaging, a higher quality lens system may be required, at least higher quality than may typically be associated with scientific-grade imaging devices (e.g., due to their relatively larger sensor cells and array area). In another example embodiment, mainstream devices may be used, e.g., to image and inspect such features where such features are capable of being imaged and inspected via certain wavelength(s) (e.g., infrared wavelengths), to which wavelengths the mainstream imaging device is appropriately responsive, while the scientific-grade device either is not responsive or is not any more responsive. In either of these embodiments, the selected wavelength(s) may be such that the features may be imaged or inspected either best or only with such wavelength(s).

Generally, in using solid state imaging devices, the imaging device should have sufficient sensitivity (or, equivalently for our purposes, sufficient quantum efficiency) at the selected radiation wavelength(s) (e.g., such wavelengths being selected based on the expected defect's size and/or to enable imaging through the material or structure under consideration). However, as previously described, including with reference to Figure 2, solid state imaging devices, including both scientific-grade and mainstream devices, generally exhibit diminishing sensitivity as the radiation wavelengths extend into the infrared spectrum. Accordingly, where the selected radiation wavelength(s) approach or are in the infrared spectrum, an example embodiment in accordance with the invention provides for modification of the imaging devices so as to obtain sufficient sensitivity. One such modification entails removal of a mainstream imaging device's infrared cut-off filter, which filters typically are found in mainstream solid state imaging devices (e.g., often affixed on top of, or otherwise above, the sensor array), but typically are not found in scientific-grade imaging devices. Another modification entails altering the design and fabrication of the imaging device(s) so as to provide doping profiles (e.g., in or about each sensor cell's potential well) or other alterations, so as to increase the probability of absorption of photons in or around the selected (e.g., infrared) spectrum. The former modification tends to increase sensitivity generally through the previously blocked infrared wavelengths. The latter modification may be employed to improve sensitivity less broadly, e.g., as to more limited bands. These and other modifications preferably are employed to improve the imaging device's sensitivity to the selected wavelength(s).
Where the radiation wavelength(s) are selected so as to be transmitted entirely through the semiconductor materials being imaged and inspected, a composite wavelength sensitivity may be associated with an embodiment in accordance with the invention, which sensitivity generally is a function of both the transmitted radiation (e.g., characterized by the spectrum and optical power thereof) and the imaging device's spectral sensitivity. An example of a representative composite sensitivity is illustrated in FIG. 4. It is understood, as well, that when radiation wavelengths are selected for reflection by the semiconductor materials being imaged and inspected, a composite sensitivity may also be associated with an embodiment in accordance with the invention, which sensitivity similarly is a function of both the reflected radiation (e.g., characterized by the spectrum and optical power thereof) and the imaging device's spectral sensitivity.
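A minimal numerical sketch of this idea is to multiply, wavelength by wavelength, the radiation reaching the imaging device by the device's spectral sensitivity. The sketch below is an illustrative assumption about how such a composite curve might be tabulated; the sample values are invented for illustration and are not data from this disclosure or from FIG. 4.

    import numpy as np

    # Composite sensitivity: elementwise product of the transmitted (or
    # reflected) radiation spectrum and the imaging device's quantum efficiency.
    wavelengths_nm = np.array([1000, 1050, 1070, 1100, 1150, 1200])
    delivered_power = np.array([0.02, 0.20, 0.35, 0.30, 0.15, 0.05])   # assumed, arbitrary units
    quantum_efficiency = np.array([0.08, 0.05, 0.04, 0.03, 0.02, 0.01])  # assumed

    composite = delivered_power * quantum_efficiency
    print(wavelengths_nm[np.argmax(composite)], composite)   # wavelength of peak response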
Lighting

Generally, embodiments in accordance with the invention provide tools and processes that preferably employ enhanced-performance source(s) of radiation, particularly radiation selected in coordination with the other components of the tools and processes contemplated in the invention. Such enhanced radiation source(s) may have various features and/or advantages over more conventional sources of the selected radiation (e.g., over bulbs). Examples of such features and/or advantages are directed to one or more of: selectivity of the radiation's wavelength(s) (including variations therein, e.g., over time); control over, and quality of, collimation (as well as selected departures therefrom, including as a function of wavelength); control and quality of coherence (as well as selected departures therefrom); quantity of, and control over, intensity (e.g., providing variations of intensity, including as a function of wavelength); control over duty cycle (e.g., from pulsed to continuous, including as a function of wavelength); as well as other characteristics of the source and/or its radiation.

In an example embodiment in accordance with the invention, the radiation source provides radiation in one or more selected, narrow band(s). The source's radiation band typically may be characterized by its central wavelength, e.g., 1070 nm. The source preferably provides radiation using an array of one or more light emitting diodes (LEDs) and, in application, does so in connection with a selected orientation relative to the to-be-imaged structure (e.g., top lighting, side lighting, etc.). LED arrays having various enhanced features are shown and described in (a) U.S. Patent Application No. , filed March 18, 2005 (Attorney Docket No. PHO-2.010.US), entitled
"MICRO-REFLECTORS ON A SUBSTRATE FOR HIGH-DENSITY LED ARRAY", which application claims priority from U.S. Provisional Application Serial Number 60/554,628, filed March 18, 2004, (b) U.S. Patent Application No. , filed
March 18, 2005 (Attorney Docket No. PHO-2.009.US), entitled "DIRECT COOLING OF LEDS", which application claims priority from U.S. Provisional Application Serial Number 60/554,632, filed March 18, 2004, (c) U.S. Patent Application No. , filed March 30, 2005 (Attorney Docket No. PHO-2.008.US), entitled "LED ARRAY HAVING ARRAY-BASED LED DETECTORS", which application claims priority from U.S. Provisional Application No. 60/558,205, filed March 30, 2004, and (d) U.S. Patent Application No. 10/984,589, filed November 8, 2004, entitled "HIGH EFFICIENCY SOLID-STATE LIGHT SOURCE AND METHODS OF USE AND MANUFACTURE"; the contents of all such applications are hereby incorporated by reference, as if recited in full herein, for all purposes.

FIG. 5 shows a first example embodiment in accordance with the invention. There, a solid state light source 1 irradiates selected semiconductor structures 4 via a fiber optic light guide 2 and a lens system 3. The source's radiation is directed to structures 4 via an internal beam splitter in the lens system 3. The radiation, so directed, generally is reflected by structures 4 at various intensities (e.g., depending on the bond characteristics and other features and defects of the semiconductor structures), so as to travel back up through the lens system 3, to a camera 5, such camera being based on or using one or more solid state imaging devices, e.g., CCD or CMOS detectors. The camera 5 preferably detects such reflected radiation of one or more wavelengths. Via such detection, an image of the structures 4 is captured. The image, so captured, may be provided for further processing via, e.g., computer 6. The captured image, so processed or otherwise, may be employed for test and quality control, toward identifying relevant features of such structures 4, e.g., where such relevant features are associated with bonded or stacked layers (e.g., in the interfacing layer(s) of bonded or stacked substrates or in the bond itself) or with other bonded or stacked materials.

In a second example embodiment, shown in FIG. 6, the radiation source 1 is oriented on the side of the to-be-imaged structures 4 opposite the lens system 3 and camera 5, so as to provide back light. In this orientation, the source's radiation is transmitted through the structures 4 (as well as through the substrate 7 of the semiconductor device having such structures 4) at different intensities to the lens system 3 for image formation on the CCD/CMOS camera 5. The image, so captured, may be provided for further processing via, e.g., computer 6. As with the first example embodiment, the captured image, so processed or otherwise, may be employed for test and quality control, toward identifying relevant features of such structures 4. In this back light orientation, transmission of the radiation through the structures 4 and substrate 7 will depend on various factors, as previously described, including the absence of metal or other interconnect layers or other materials which would block the transmission of the radiation, or reflect it away from the lens system 3.

In a third example embodiment, shown in FIG. 7, the radiation source 1 is oriented to the side of the to-be-imaged structures 4. Whereas most of the source's radiation will tend to be reflected by the substantially flat surface of the substrate 7 away from the lens system 3, so as to be unavailable for image capture via the CCD/CMOS camera 5, the structures 4 will cause dark field reflections perpendicular to the substrate's surface.
Since such reflections respond to the structures (e.g., topology, conditions and other features), such orientation is generally suitable for providing higher contrast imaging and inspection. An extension of the third example embodiment, shown in FIG. 8, contemplates capturing a plurality of images with dark field lighting to deduce height information associated with a selected structure 4. Because the image is generated using radiation directed to the structure 4 at a known angle (i.e., based on the orientation of the source 1), the height of the structure 4 (or a given feature of the structure 4) is measured by measuring the width of the shadows of the given structure or its given feature. More specifically, the height is given by the product of the measured shadow width and the tangent of the known angle of the directed radiation (a worked sketch of this calculation appears after the lists of example embodiments below).

In a fourth example embodiment, shown in FIG. 9, the radiation source 1 is oriented underneath the to-be-imaged structure 4 so as to direct radiation toward the structure 4, but at an angle from the axis of the lens system 3 and CCD/CMOS camera 5. This orientation is suitable to outline specific directional edges of any given feature located in a transparent or semi-transparent medium, the specific edge direction being determined to be perpendicular to the direction of the light source. This orientation also generates high edge contrast. Similarly, a topside angled radiation source may be used to highlight features and feature textures that may not be visible, or not sufficiently visible, via other methods. An extension of the fourth example embodiment contemplates capturing a plurality of images with backside light sources shining at different angles, so as to collect all or a substantial variety of directions and construct the multidirectional edge profiles associated with a structure 4.

In a fifth example embodiment, shown in FIG. 10, the radiation source 1 is oriented to shine precisely perpendicular to the edge of the surface of a semiconductor substrate 7. So delivered, the radiation is retained in the material by total internal reflection, provided that the angle of incidence is less than the critical angle for that material. However, whenever such radiation encounters a feature (e.g., the structure 4) on or at one of the surfaces providing such internal reflection, the radiation will tend to be directed out of the substrate, e.g., from the back side of the chip, for capture by the lens system 3 and imaging by the CCD/CMOS camera 5. Here, the radiation source 1 typically is a solid state source, preferably a one dimensional array of solid state emitting devices (such as, e.g., LEDs/VCSELs) radiating either through a lens array or through a linearly-arranged fiber optic light guide. This orientation has advantages, including, for example, that it enables provision of enhanced visual contrast of structures disposed inside a semiconductor device, including defects. In practice, absorption of the radiation tends to limit the penetration of the incident light to up to a few millimeters (see Table 2, above, for example penetrations into silicon associated with various wavelengths). Even so, when this orientation may be applied, it generally provides useful images of back side device structures, including, as examples, circuitry, cracks, voids and particulate defects embedded within diced semiconductor chips. An extension of the fifth embodiment, as shown in FIG. 11, contemplates the use of a plurality of radiation sources 1a-1d, each in an orientation described by the fifth embodiment.
More specifically, as illustrated, this extension contemplates the use of four radiation sources, one for each dicing direction (top, bottom, left, right), so as, e.g., to outline directional features embedded within diced semiconductor chips. It is understood that more or fewer than four sources may be used without departing from the principles of the invention. Other example embodiments of the invention include, but are not limited to, one or more LED arrays, or other solid-state radiation source(s):
• Irradiating a beam splitter in a through-the-lens lighting system directly (i.e., absent a fiber optic light guide).
• Providing top light irradiation in either a "ring" or "dome" configuration.
• Providing "ring" or "dome" irradiation via a fiber optic ring or dome light guide.
• Providing radiation in backlight and/or toplight orientations, via fiber optic light guide.
• Providing a variety of selected wavelength(s) or bands of wavelength(s), such that each source radiates a specific wavelength or band.
• Providing a variety of selected wavelength(s) or bands of wavelength(s), such that each source radiates a specific wavelength or band and each source is subject to individual control, including, as examples, control over one or more of: radiating at selected time(s), for selected duration(s), at selected intensities, with selectable collimation, with selected pulse duty cycles, etc. To illustrate, a plurality of arrays may be provided, pairs of which provide distinct, narrow bands of wavelengths about a center wavelength, one collimated and the other not collimated, and such that each array may be sequentially, simultaneously or otherwise powered with respect to any other array(s), such power being for respective time duration(s), or continuing, or in pulse modes of selected duty cycle, so as to enable gathering of various responses of a structure to such applied radiation(s) and, with that information (and, preferably, processing thereof), imaging, inspecting or otherwise analyzing the structure, including as to relevant conditions and/or defects.
Still other example embodiments of the invention include, but are not limited to:
• Providing fiber-optics in image acquisition.
• Providing pulsed illumination with synchronized image capture (e.g., synchronizing the camera's shutter or other gating component, of the camera or otherwise).
• Providing enhanced high-intensity radiation in, e.g., a through-lighting orientation, such as by super high intensity radiation, preferably pulsed, from one or more LED arrays, in one or more selected bands.
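As a worked illustration of the shadow-based height measurement described above in connection with FIG. 8, a minimal sketch follows; the function name and the example numbers are assumptions for illustration, not values from this disclosure.

    import math

    # Height from a dark-field shadow: h = w * tan(theta), where w is the
    # measured shadow width and theta is the known angle of the directed radiation.
    def height_from_shadow(shadow_width_um, radiation_angle_deg):
        return shadow_width_um * math.tan(math.radians(radiation_angle_deg))

    # Example (assumed): a 20 um shadow cast by radiation directed at 30 degrees
    # yields a feature height of roughly 11.5 um.
    print(height_from_shadow(20.0, 30.0))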
Solid state sources, such as LEDs, have various characteristics, including some advantages, including:
o As compared to bulbs, solid state sources tend to have a direct cost advantage.
o Elimination of filters: IR band pass filters, e.g., as in FIG. 1, are eliminated because LEDs and LED-based arrays can be provided that deliver narrow band(s) of wavelength(s), thus indirectly reducing the cost and complexity of implementations.
o Solid state sources readily enable implementations having spectral separation between back and top light sources.
o Clear images are promoted, as LEDs tend to have narrow band radiation, which tends to preclude certain problems, e.g., chromatic aberration (where rays of different wavelengths bend differently when passing through lenses).
o Solid state sources readily enable collimation or absence of collimation.
o Narrow band radiation also results in interference fringes, in which bonding defects show up as concentric rings due to constructive and destructive interference.
o Backlighting is scalable with LEDs by simply increasing the size of the array.
o LEDs have stable light output, which eliminates the calibration problems associated with bulbs.
o LEDs have long lifetimes (~100,000 hours); there is no need to replace them after only 1000 hours, as with a bulb.
o LEDs are narrow band and do not put additional IR (heat) energy into the inspection target. Heat could damage the target.
o LED arrays can be used to selectively provide collimation at one or more wavelengths.
o LED arrays can be populated with various wavelength-specific LEDs so as to provide various wavelengths at selected times, e.g., sequential or simultaneous pulsing at various power and duty cycles.

Optics

The lens system typically is selected based on various factors. Examples of these factors include the field-of-view requirements of the imaging/inspection application and the applicable (selected) radiation source orientation (with examples of same described above). Optics typically are treated with antireflective coatings to reduce reflections in a range of selected wavelengths, e.g., those centered on 1070 nm. One example embodiment, with particular application to the first example embodiment described above with reference to FIG. 5, uses a zoom lens which provides a field of view ranging from 6 mm to 40 mm. A second example embodiment, with particular application to the second example embodiment described above with reference to FIG. 6, uses a fixed magnification lens system which provides a field of view of 4 mm, and has both a beam splitter and an input port to accommodate a fiber optic light guide directing radiation from a source. Focus and zoom (if applicable) may be set either/both manually (e.g., by turning a dial) or automatically (e.g., by computer control). For applications where dimensional measurement is required, a telecentric lens may also be used. Using optics selected for proper magnification and coated for maximum transmission at selected wavelength(s) (e.g., wavelength(s) generally in the 700 nm-3000 nm long visible red to near infrared spectra, or more specific band(s), e.g., centered on 1070 nm, or centered on 1200 nm, or in any of various bands, such as the 1050-1200 nm, 1050-1300 nm, or 1000-1300 nm wavelength ranges) enables the use of high resolution CCD/CMOS imaging devices, e.g., near the upper wavelength limits of their spectral sensitivity.

Imaging

Example embodiments in accordance with this invention employ high-sensitivity cameras based on or using CCD/CMOS imaging device(s).
CCD/CMOS imaging technologies are substantially mature, particularly relative to some infrared camera technologies, such as those based on arrays of certain gallium arsenide detectors or microbolometers. This maturity translates into various advantages for CCD/CMOS imaging devices and the cameras based thereon, particularly as compared to cameras specific to infrared imaging:
• Sensor density: CCD/CMOS cameras are commercially available with up to 8 million pixels (compared to typical infrared cameras, which typically have as few as 0.25 million pixels).
• Standardized electrical interfaces: CCD/CMOS cameras are commonly available with standard electrical interfaces to frame grabbers, or to flexible high-speed bus architectures, such as IEEE 1394, USB-2, or 100-Base-T.
• Cost: CCD/CMOS cameras are significantly less expensive than such infrared cameras (by as much as an order of magnitude).
• Noise: CCD/CMOS cameras may have various noise performance (e.g., cameras using scientific-grade solid state imaging devices tend to have superior signal to noise ratios and, generally, relatively low noise characteristics). For those cameras using imaging devices where noise may nevertheless be an issue, the noise may be readily and relatively inexpensively reduced by cooling, e.g., through Peltier cooling assemblies.

The use of CCD/CMOS imaging devices is enabled by the use of selected radiation wavelength(s). The radiation wavelengths typically are selected based, among other things, on the spectral response of the imaging devices. Generally, particularly for through-substrate orientations, radiation in the infrared band may be employed, which radiation typically corresponds to significantly diminished sensitivity in semiconductor-based imaging devices, e.g., silicon-based CCDs and CMOS sensors. In a general embodiment in accordance with the invention, tools and processes are provided that exclude (or substantially exclude) radiation wavelengths (other than those of the selected infrared wavelength(s) or band(s)) from the imaging device, such exclusion being maintained at least during time period(s) associated with imaging using the selected wavelengths. In so doing, the relative insensitivity of the imaging devices is overcome. That is, absent wavelengths to which the CCD/CMOS imaging device is more responsive, the imaging device responds only to the narrow band of selected wavelengths and the image reflects such response. Preferably, the signal levels for such imaging are brought up to a measurable level using various approaches, such approaches including, as examples, opening the lens aperture, increasing exposure time, increasing electronic gain, digitally averaging multiple acquired images, or using other exposure techniques that may be known in the art.

In another general embodiment in accordance with this invention, tools and processes are provided which recognize and respond to the quantum efficiencies and other physical properties of solid state imaging devices.
Such tools and processes preferably respond to and result from coordination of various facts and factors, including: (a) the particular, to-be-imaged semiconductor structure has known or determined semiconductor materials (and associated band-gap energy or energies) and may have features (including conditions and defects) of known or determined parameters, including as to typical size(s), shape(s) and location(s); (b) radiation wavelength(s) or band(s) of wavelength(s) are selected based on such materials, energies, and parameters, as well as the orientation of the radiation source and subject to the spectral response of the imaging device; (c) one or more radiation sources are selected and oriented, which radiation sources are enabled to provide the selected wavelength(s) and to deliver the radiation at appropriate orientations (e.g., angles and locations, including from the back side of the structure) relative to the semiconductor structure, as well as, preferably, to control radiation characteristics (including as to intensity, collimation, lack of collimation, pulsing, etc.); (d) a lens system is selected so as to transmit the selected wavelengths to, and form the images on, the imaging device, which selection preferably matches the lens' image-forming capabilities with the imaging device's image-capture capabilities (e.g., the lens is able to resolve features of size equal to, or less than, the feature sizes that the imaging device resolves), so as to properly image the structure as to its relevant feature(s), including conditions and defects; and (e) one or more solid state imaging device(s) are employed that are able to properly capture an image of the structure's relevant features (e.g., conditions and defects), at least one of which imaging devices, among other attributes, has (i) sufficient sensitivity to the selected wavelength(s) to capture the image, provided proper delivery of the radiation is maintained, and (ii) sensor cell size and number sufficient to resolve the to-be-imaged structure and its relevant features.

In this general embodiment, when imaging based on a radiation orientation that directs the selected wavelength(s) to the back side of the structure (e.g., for through-substrate imaging), the radiation source preferably provides infrared wavelength(s) that are long enough to be transmitted entirely through the to-be-imaged structure. However, such wavelengths should yet be short enough to enable imaging of the structure and its relevant features, including the various relevant conditions and defects that may be driving the imaging. Principles of physics generally dictate that, to image a device having a relevant dimension "x," the wavelength employed should be no greater than about "x" and, preferably, even smaller. In selecting the wavelength(s), this general embodiment contemplates coordination between these two factors, which factors may at times tend to push in different directions (e.g., longer wavelengths to pass through the substrate but shorter wavelengths so as to detect and properly image the structure as to its relevant features).
Moreover, in this general embodiment, the radiation source(s) preferably (i) provides the selected wavelengths (e.g., at appropriate intensities, for sufficient durations, at proper duty cycles) so as to enable the imaging device(s) to capture the image based on the selected wavelengths, i.e., despite the device's relative insensitivity to such wavelengths, while (ii) excluding all (or substantially all) other wavelengths, so that an imaging device's sensor cells are not electrically saturated by such other wavelengths.
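As noted above, signal levels at wavelengths where the imaging device is relatively insensitive may be raised by opening the aperture, longer exposure, higher gain, or digital averaging of multiple acquired frames. The following is a minimal sketch of frame averaging only; the capture function is a placeholder assumption, not an API named in this disclosure.

    import numpy as np

    # Frame averaging: averaging N frames reduces uncorrelated random noise by
    # roughly sqrt(N) while preserving the (weak) narrow-band signal.
    def average_frames(capture_frame, n_frames=16):
        acc = None
        for _ in range(n_frames):
            frame = capture_frame().astype(np.float64)   # capture_frame() is a placeholder
            acc = frame if acc is None else acc + frame
        return acc / n_frames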
Use of CCD/CMOS imaging devices tends to provide various advantages, with examples including:
o Cost advantage.
o Enhanced flexibility in the selection of resolution and pixel sizes (e.g., scientific-grade vs. mainstream), such that tools and processes may render detail and the "big picture" in the same view. More information tends to be collected in one snapshot, which simplifies image analysis.
o Improved data rates.
o CCD/CMOS cameras are mainstream and mature.
Image Enhancement

The images captured by the CCD/CMOS camera in the example embodiments may be enhanced using one or more of various digital image processing techniques. Examples of these techniques include:

• Dust removal: Small particles of dust on the surface of a wafer show up as dark spots in the image, typically very dark spots. The impact of these spots on subsequent image enhancement algorithms and on subjective quality judgments may be reduced by thresholding the image. To do so, all pixels with values less than the threshold are set to the threshold value. This threshold may be variously determined, including empirically. In any case, the threshold is determined so that, in application, it reduces the impact of dust in the image, while having either no (or a non-substantial) impact on the image otherwise. A reasonable threshold setting may be obtained by first computing the mean p̄ and standard deviation σ of the image pixels. The threshold may then be defined as

    t = p̄ - Aσ

where A is a constant that, like the threshold itself, may be determined empirically. (An illustrative sketch of this thresholding, together with the contrast enhancement described below, appears after this list of techniques.)
• Shading Correction: The result of non-planar illumination and distortion in the optics train is that images tend to be darker near the edges relative to the center. Shading correction is applied to correct this problem and, in so doing, facilitate qualitative and quantitative comparisons between regions of the image that may be illuminated at different levels. Shading correction entails taking a calibration image using a uniform background, with the illumination power set so that the brightest portion of the image (near the center) almost, but does not quite, saturate the image. To perform shading correction of subsequent images, each pixel of a raw acquired image is corrected by dividing by the value of the corresponding pixel of the calibration image. This results in an array of pixels in the range [0...1], which may be renormalized to fit in the more standard pixel range [0...255] or any other range appropriate to downstream processing.

• Scratch removal: The surfaces of unpolished wafers are often marred by scratches (e.g., horizontal, multi-directional, etc.). Scratches tend to add periodic, oriented noise to the digital images obtained with a CCD/CMOS camera. Moreover, scratches interfere with standard computer vision techniques, such as template matching, edge detection, and connectivity analysis, as well as with a human operator's ability to inspect for defects. Therefore it is expedient to digitally remove this oriented noise.
Because this noise is a strong oriented signal in Fourier space, the removal algorithm (i) transforms the implicated image into the Fourier domain by use of the Fast Fourier Transform, (ii) analyzes the transformed image to detect oriented noise, (iii) subtracts a Gabor filtered approximation of the oriented noise, and finally (iv) converts the result back to the image domain via the inverse Fourier transform.

• Contrast enhancement: This image processing algorithm is a form of the well-known histogram equalization technique, in which pixel values are globally remapped according to a "stretching" function. First, the minimum and maximum grey levels are computed. These computed levels are used to remap the original pixel values according to the formula:
    n_{x,y} = (2^bits - 1) * (p_{x,y} - (min(p) + a)) / ((max(p) - b) - (min(p) + a))
This function has the effect of linearly stretching the histogram over the complete dynamic range (0...2^bits - 1) of a pixel represented with a number of bits equal to bits. In an example embodiment according to this invention, 8 bits per pixel are used; however, it is understood that other bit values may be used without departing from the principles of the invention. The parameters a and b (nominally set to 0) control the dark level and bright level of the histogram. Larger values of a cause the histogram to be stretched more, while causing one or more values of p to be merged into a single value of n. Larger values of b also increase the degree of histogram stretching, i.e., by causing one or more values of p to be merged into a single value of n. (See the illustrative sketch following this list of techniques.)
Other pixel transformation functions may also be used without departing from the principles of the invention. As an example, the following function may be used:
[The alternative transformation is given in the original application as equation images (imgf000040_0001, imgf000040_0002) that are not reproduced in this text.]
In this case, c and d define the upper and lower bounds of the stretched histogram.
• Combining images: Images of the same field of view taken using different radiation sources having different orientations (e.g., toplight, backlight or sidelight) generally emphasize different sets of physical features. Two or more of these images may be merged together in a number of different ways. Examples include:
o The pixels in pairs of images may be subtracted to yield a set of differential images. This method is particularly effective when the top semiconductor substrate is unpolished, allowing substantial reduction of noise in the backlight image based on the information contained in the toplight (reflected) image.
o The pixels in three selected images may be used to represent color channels, e.g., in an RGB image.
o Individual images may be analyzed independently for features that are robustly detectable under each illumination scheme. Measurements may then be made from features detected in one image to features detected in another image.
• Deconvolution: Deconvolution is the process of undoing the smearing in a data set that has occurred under the influence of a known response function, for example, because of the known effect of a less-than-perfect measuring apparatus. This definition comes from: Press, W., Teukolsky, S.A., Vetterling, W.T., and Flannery, B.P., 1992, Numerical Recipes in C: The Art of Scientific Computing, Second Edition (Cambridge: Cambridge University Press).
Deconvolution can be applied to help eliminate blurring effects of the optical or imaging system and can yield improved object resolution.
• Wiener filtering: Wiener filtering is similar to deconvolution in the presence of noise. The process is to find an optimal filter, which is applied to the input image before deconvolution to eliminate the deleterious effects of noise. This is a well-known technique described in Press, W., Teukolsky, S.A., Vetterling, W.T., and Flannery, B.P., 1992, Numerical Recipes in C: The Art of Scientific Computing, Second Edition (Cambridge: Cambridge University Press) and Gonzalez, R.C. and Wintz, P., 1987, Digital Image Processing, Second Edition (Reading, Massachusetts: Addison-Wesley Publishing Company).
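For the dust-removal thresholding and contrast stretching described above, a minimal sketch is given below. It is illustrative only: the function names, the default A = 3, and the 8-bit output are assumptions for this sketch, not values prescribed by this disclosure.

    import numpy as np

    # Dust removal: raise pixels darker than t = mean - A*std up to t.
    def remove_dust(image, a=3.0):
        img = image.astype(np.float64)
        t = img.mean() - a * img.std()
        return np.maximum(img, t)

    # Contrast enhancement: linearly stretch (min(p)+a .. max(p)-b) onto the
    # full output range 0 .. 2**bits - 1, per the formula given above.
    def stretch_contrast(image, a=0.0, b=0.0, bits=8):
        p = image.astype(np.float64)
        lo, hi = p.min() + a, p.max() - b
        n = (2**bits - 1) * (p - lo) / (hi - lo)
        return np.clip(n, 0, 2**bits - 1).astype(np.uint8)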
FIG. 12 illustrates an example flow chart of representative image processing operations contemplated to be performed in accordance with this invention. In step 11, a raw image is captured by a CCD/CMOS camera and made available for image processing. In various example embodiments described above, such processing is performed using a computer 6, such as a PC. However, it is to be recognized that any one or more image processing algorithms may be implemented other than via a PC, without departing from the principles of the invention. As an example, the algorithms may be provided via electronics and/or software associated with the camera itself (e.g., triggered by selecting a hard or soft button on the camera). In step 12, dust is removed, as described above or otherwise. In step 13, shading correction is performed, as described above or otherwise. In step 14, scratch removal is performed, as described above or otherwise. In step 15, contrast enhancement is performed, as described above or otherwise.
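A minimal sketch of such a processing chain is given below; it implements the shading correction division described above and chains the FIG. 12 steps, with the remaining steps supplied as caller-provided functions. All names here are illustrative assumptions, not components defined by this disclosure.

    import numpy as np

    # Shading correction: divide by the calibration image and renormalize to 0..255.
    def shading_correction(image, calibration):
        ratio = image.astype(np.float64) / np.maximum(calibration.astype(np.float64), 1e-12)
        return np.clip(255.0 * ratio, 0, 255).astype(np.uint8)

    # FIG. 12-style chain: capture (step 11) is assumed already done.
    def enhance(raw_image, calibration_image, remove_dust, remove_scratches, stretch_contrast):
        img = remove_dust(raw_image)                        # step 12: dust removal
        img = shading_correction(img, calibration_image)    # step 13: shading correction
        img = remove_scratches(img)                         # step 14: scratch removal
        return stretch_contrast(img)                        # step 15: contrast enhancement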
Application
FIG. 13 shows a cutaway side view of a typical MEMS sensor wafer sandwich.
The substrate layer 100 and the cap layer 102 are both made of silicon, which is transparent to wavelengths in the near infrared spectrum (NIR). The bond layer 104 holds the substrate layer 100 and the cap layer 102 together and, depending on the specifics of the wafer construction process, may serve as a hermetic seal to protect the devices 106 from the environment. The purpose of the imaging and inspection process is to verify the integrity and consistency of the bond layer 104, including any defects 108 (e.g., here illustrated as a void) that may be disposed therein. These conditions and defects may be identified and measured from digital images captured using tools and processes according to the invention. To do so, the incident radiation generally will include selected infrared wavelength(s) or band(s), so as to penetrate to the bond layer and any relevant features therein (e.g., the above-described conditions and defects).

FIG. 14 shows a cutaway side view of a typical fusion bonded, bare wafer sandwich as is typically used in production of Silicon on Insulator (SOI) bare wafers. This structure includes a substrate carrier layer 110, a cap layer 112 and a bond layer 114. In such structures, uniformity and integrity in the bond layer is generally of importance. As such, the presence of particulates, voids or other defects 116 in the bond layer 114, or even slight differences in uniformity, is not desirable. Accordingly, relevant features for imaging and inspection include the uniformity condition, as well as any particulate, void or other defects associated with the bond layer 114. Again, such conditions and defects may be identified and measured from digital images captured using tools and processes according to the invention. To do so, the incident radiation generally will include selected infrared wavelength(s) or band(s), so as to penetrate to the bond layer and to any relevant features therein (e.g., the above-described conditions and defects). In that imaging, subjecting the structure to narrow band IR backlight illumination in the presence of particulates/voids in the bond layer 114, or even slight differences in uniformity therein, will generally result in formation of interference fringes in the image. With more broadband illumination, such interference fringes (also referred to as ring patterns) tend to appear in the image in proximity to bond layer defects. FIG. 15 illustrates a representative ring pattern that might typically be formed when imaging a fusion bonded bare wafer using tools and processes in accordance with the invention. These periodic patterns are readily detected by eye, and may be automatically detected by an algorithm designed to detect such periodic features. The fringes may also be used to estimate the height of internal defects. One full period of an interference fringe (transition from dark to light to dark again) corresponds to a change in distance between bonded materials of 1 wavelength of incident light.

FIG. 16 shows a cutaway side view of a single semiconductor wafer 120, which wafer 120 may be either patterned or non-patterned. Interior to the wafer 120, a crystalline bond is shown to be cleaved, resulting in formation of a micro-crack 122. Also interior to the wafer is a void or insertion defect 124. Clearly, neither such defect is desirable (and, being interior to the wafer 120, each such crack or defect may escape detection using conventional imaging and inspection approaches, e.g., with typical pattern, electrical, or surface inspection tools).
Moreover, depending on the circuits to be constructed using such wafer 120, even one such micro-crack 122 or defect 124 may result in unacceptably low, and thus costly, yields and/or poor long-term reliability. Accordingly, relevant features for imaging and inspection in even single wafers include micro-cracks 122 and void, insertion or other defects 124 interior to the wafer 120. Again, such conditions and defects may be identified and measured from digital images captured using tools and processes according to the invention. To do so, the incident radiation generally will include selected infrared wavelength(s) or band(s), so as to penetrate into the wafer's interior to any of the above-described conditions or defects. By imaging through the wafer 120 at an angle or with infrared backlight, the presence of any such micro-crack or defect may be detected.

FIG. 17 guides discussion of how various features of a semiconductor device are depicted when imaged using tools and processes in accordance with the invention. Here, a typical patterned wafer 130 has devices 132 positioned in a regular grid on the wafer's surface. Dicing lines 134 indicate where the wafer 130 will be cut to liberate individual devices 132 for packaging and, prior to such cutting process, such lanes separate the devices 132. In digital imaging, a bond region 136 is recognized (and distinguished) from a "no print" zone 138 surrounding certain active circuitry 140 of the device 132. That recognition is achieved based on variation in the intensity of light reaching the camera at that position, relative to other positions. Since the bonding material attenuates light more than the air filling a void region, the bond region 136 (demarcated by inner boundary 142 and outer boundary 144) will appear darker in the resulting image. Similarly, the silicon features making up the dicing lines 134 and other surrounding wafer features will appear even darker than the bond region 136. With reference to FIG. 17, an example process for inspecting a typical patterned wafer has the following steps:
• Locate the circuitry 140 and/or the device 132 (optionally, use this position to fix the detection of the following features). The circuitry/device 140, 132 may be located by, e.g., a template matching algorithm; an algorithm utilizing normalized correlation typically suffices.
• Locate the outer boundary 144 of the bond region 136. The outer boundary 144 may be located, e.g., with any of a variety of line detection or "caliper" tools. Such tools find the location of extended straight lines between regions of image contrast.
• Locate the inner boundary 142 of the bond region 136. The typically irregular inner boundary may be identified by applying any of a variety of line detection tools, toward finding, e.g., a set of short line segments that approximate an inner boundary contour. Greater approximation accuracy may be achieved by increasing the number of detectors used.
• Compute the distance from each point along the bond region's inner boundary 142 to its outer boundary 144.
• Locate all voids in the bond region. Voids may be identified by using connectivity analysis. Connectivity analysis separates the foreground from the background by considering the foreground to be all image components with a grey scale value larger than a threshold. The threshold is determined empirically and varies depending on application, optics, lighting, and imaging.

FIG. 18 shows a cutaway side view of a semiconductor package application 148.
Flip chip devices 150 are typically bare dies placed upside down upon a piece of interposer material, typically glass based or organic based. Placement accuracy can be ascertained by imaging and inspecting the position of an on-chip alignment target 152 (e.g., imaging through the chip) relative to the position of an alignment target 154 on the substrate 156. In FIG. 18, the respective alignment targets are shown to have a displacement Δ. A similar application arises when two wafers are to be aligned and then bonded. As shown in FIG. 19, a lower substrate 160 is positioned underneath a cap substrate 162. Substrates 160 and 162 have respective alignment targets 164 and 166. Placement accuracy can be ascertained by imaging and inspecting the relative positions of such on-substrate alignment targets 164, 166 (e.g., imaging through the cap substrate to do so). In FIG. 19, the respective alignment targets 164, 166 are shown to be aligned. Application of the principles of the invention to image and inspect is viable for, but not limited to, a number of semiconductor structures. These structures include, as examples: micro electromechanical devices, CCDs, CMOS sensors, other sensors, electro-optical components, and semiconductors with mirrors.
These operations generally may be performed using well-known computer vision techniques. A number of computer vision software packages are commercially available (for example, MVTec's Halcon, or Intel's Integrated Performance Primitives (IPP)) that provide a rich set of software tools.
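For illustration only, a compressed sketch of the locate-and-inspect steps described above is given below using OpenCV. The choice of OpenCV, and all function and parameter names in this sketch, are assumptions for illustration; this disclosure names Halcon and IPP only as examples of available packages and does not prescribe any particular implementation.

    import cv2
    import numpy as np

    # Locate the device by normalized-correlation template matching, then find
    # void candidates in the image by thresholding and connectivity analysis.
    # Both inputs are assumed to be 8-bit grayscale images.
    def inspect(image, device_template, void_threshold=180):
        result = cv2.matchTemplate(image, device_template, cv2.TM_CCOEFF_NORMED)
        _, score, _, device_xy = cv2.minMaxLoc(result)    # best-match score and location

        _, bright = cv2.threshold(image, void_threshold, 255, cv2.THRESH_BINARY)
        n, _labels, stats, centroids = cv2.connectedComponentsWithStats(bright)
        voids = [tuple(centroids[i]) for i in range(1, n)          # label 0 is background
                 if stats[i, cv2.CC_STAT_AREA] > 4]                # ignore tiny specks
        return device_xy, score, voids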
Persons skilled in the art will recognize that many modifications and variations are possible in the details, materials, and arrangements of the parts and actions which have been described and illustrated in order to explain the nature of this invention and that such modifications and variations do not depart from the spirit and scope of the teachings and claims contained therein.

Claims

WHAT IS CLAIMED:
1. A through-substrate optical inspection system for inspecting a semiconductor structure, comprising a first solid state radiation source oriented to provide backlight with radiation at a first wavelength having intensity > 0.001 mW/cm2, a second solid state radiation source oriented to provide toplight with radiation at a second wavelength having intensity > 0.001 mW/cm2, and a solid state imaging device sensitive to the first and second wavelengths, and wherein the second solid state radiation source provides radiation at wavelengths, times and intensity insufficient to saturate the imaging device, whereby the imaging device may capture inspection information from the first solid state radiation source.
2. An inspection system as in claim 1, wherein the first radiation source provides radiation at one or more wavelengths that are transmissive of the semiconductor structure.
3. An inspection system as in claim 1, wherein at least one light source comprises one or more arrays of light-emitting, solid state semiconductor devices.
4. An inspection system as in claim 3, wherein the array of light-emitting, solid state semiconductor devices provides radiation at one or more wavelengths that are transmissive of the semiconductor structure.
5. An inspection system as in claim 1, wherein the semiconductor structure is silicon-based and the first radiation source provides radiation at wavelengths in a range of 1050-1200nm.
6. An inspection system as in claim 5, wherein the semiconductor structure is silicon-based and the second radiation source provides radiation at wavelengths in a range of 1100-1300nm.
7. An inspection system as in claim 1, further comprising a lens system that provides resolution matched to the resolution of the imaging device.
8. An inspection system as in claim 7, wherein the imaging device has useful spectral sensitivity up to wavelengths at least as long as 1200nm and has sensor cell size with dimensions at or below half the dimension of the semiconductor structure's relevant feature.
9. An inspection system as in claim 8, further comprising a plurality of lighting systems to support a selected combination of front side, backside, side and dark field irradiation of the semiconductor structure.
10. An inspection system as in claim 1, further comprising an image processing mechanism, which mechanism supports at least one of stretching the region of interest, identifying edges and features in the image, and automatically inspecting the semiconductor structure.
11. An inspection system as in claim 10, wherein the image processing mechanism provides for measuring height of a feature enclosed between two silicon wafers of the semiconductor structure.
12. An inspection system as in claim 1, wherein the semiconductor structure includes direct fusion bonded semiconductor materials and at least one of the radiation sources provides radiation at an orientation, wavelength and intensity sufficient to enable detection of the presence of defects associated with the bond of such structure.
13. An inspection system as in claim 1, wherein the imaging device captures an image of at least one alignment target associated with the semiconductor structure.
14. An inspection system as in claim 1, wherein the imaging device captures an image representative of crystal defects (cracks, insertions or voids) inside the semiconductor structure.
15. An inspection system as in claim 1, wherein the imaging device captures an image representing the depth or z location of a buried defect, like a crack, dislocation or void, whereby such depth or z location is measured.
16. A process for an inspection system, the process providing for inspection of a semiconductor structure for relevant features, such features having known size, the process comprising: identifying the semiconductor structure's relevant semiconductor materials and associated band-gap energy or energies; identifying the semiconductor structure's relevant features, including size and location of such features; selecting one or more imaging devices based on the resolution thereof, in coordination with the size of the relevant features; identifying the imaging device's spectral sensitivity curve; determining one or more orientations for irradiation using one or more radiation sources, in coordination with the location of such features; selecting one or more radiation wavelength(s) or band(s) of wavelength(s) based on such materials, energies, size and location, in coordination with the orientation of the radiation sources and the spectral sensitivity curve of the imaging device; and selecting one or more solid state radiation sources to provide the one or more selected radiation wavelengths at the selected orientations.
17. A process for an inspection system as in claim 16, further comprising selecting a lens system so as to transmit the selected wavelengths to, and form the images on, the imaging device, which selection matches the lens' image-forming capabilities with the imaging device's image-capture capabilities.
18. A process for an inspection system as in claim 16, further comprising operating the selected one or more solid state radiation sources to control radiation characteristics, including as to one or more of intensity, collimation, lack of collimation, and pulsed operation.
19. A process for an inspection system as in claim 16, further comprising operating the selected one or more solid state radiation sources so that, when imaging using wavelengths transmissive of the structure, other wavelengths are substantially excluded, whereby the imaging device images the structure responsive to the selected, transmissive wavelengths.
20. A process for an inspection system as in claim 16, wherein selecting one or more radiation wavelength(s) or band(s) of wavelength(s) provides wavelength(s) that are both long enough to be transmitted entirely through the structure and short enough to enable imaging of the features.
PCT/US2005/013448 2004-04-19 2005-04-19 Imaging semiconductor strucutures using solid state illumination WO2005100961A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP05745456.3A EP1738156A4 (en) 2004-04-19 2005-04-19 Imaging semiconductor strucutures using solid state illumination

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US56385604P 2004-04-19 2004-04-19
US60/563,856 2004-04-19

Publications (2)

Publication Number Publication Date
WO2005100961A2 true WO2005100961A2 (en) 2005-10-27
WO2005100961A3 WO2005100961A3 (en) 2007-02-08

Family

ID=35150586

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/013448 WO2005100961A2 (en) 2004-04-19 2005-04-19 Imaging semiconductor strucutures using solid state illumination

Country Status (4)

Country Link
US (1) US8077305B2 (en)
EP (1) EP1738156A4 (en)
TW (1) TWI302756B (en)
WO (1) WO2005100961A2 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1801569A2 (en) * 2005-12-23 2007-06-27 Basler Aktiengesellschaft Method and device for detecting cracks in silicon wafers
EP1857977A2 (en) * 2006-05-16 2007-11-21 Mitsubishi Electric Corporation Image inspection method and image inspection apparatus employing the same
WO2009027517A1 (en) 2007-08-31 2009-03-05 Icos Vision Systems Nv Apparatus and method for detecting semiconductor substrate anomalies
US7816638B2 (en) 2004-03-30 2010-10-19 Phoseon Technology, Inc. LED array having array-based LED detectors
WO2010145881A1 (en) * 2009-04-30 2010-12-23 Wilcox Associates, Inc. An inspection method and an inspection apparatus
US8077305B2 (en) 2004-04-19 2011-12-13 Owen Mark D Imaging semiconductor structures using solid state illumination
US8192053B2 (en) 2002-05-08 2012-06-05 Phoseon Technology, Inc. High efficiency solid-state light source and methods of use and manufacture
US8428337B2 (en) 2008-07-28 2013-04-23 Bluplanet Pte Ltd Apparatus for detecting micro-cracks in wafers and method therefor
US9651502B2 (en) 2008-07-28 2017-05-16 Bluplanet Pte Ltd Method and system for detecting micro-cracks in wafers

Families Citing this family (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2839147B1 (en) * 2002-04-30 2004-07-09 Soitec Silicon On Insulator DEVICE AND METHOD FOR AUTOMATICALLY CONTROLLING THE CONDITION OF THE PLATE SURFACE BY MEASURING THE ADHESIVE SPEED
WO2006072071A2 (en) 2004-12-30 2006-07-06 Phoseon Technology Inc. Methods and systems relating to light sources for use in industrial processes
EP1678442B8 (en) 2003-10-31 2013-06-26 Phoseon Technology, Inc. Led light module and manufacturing method
US7638808B2 (en) 2004-03-18 2009-12-29 Phoseon Technology, Inc. Micro-reflectors on a substrate for high-density LED array
US7235878B2 (en) * 2004-03-18 2007-06-26 Phoseon Technology, Inc. Direct cooling of LEDs
US7071493B2 (en) * 2004-04-12 2006-07-04 Phoseon Technology, Inc. High density LED array
JP4778755B2 (en) * 2005-09-09 2011-09-21 株式会社日立ハイテクノロジーズ Defect inspection method and apparatus using the same
US8363048B2 (en) * 2006-11-16 2013-01-29 General Electric Company Methods and apparatus for visualizing data
US7561329B2 (en) 2006-12-14 2009-07-14 Cytyc Corporation Illumination source for stained biological samples
US8115213B2 (en) 2007-02-08 2012-02-14 Phoseon Technology, Inc. Semiconductor light sources, systems, and methods
JP4407707B2 (en) * 2007-03-02 2010-02-03 日産自動車株式会社 IMAGING DEVICE, IMAGE DISPLAY SYSTEM, AND IMAGING DEVICE CONTROL METHOD
WO2008119550A1 (en) * 2007-04-02 2008-10-09 Viscom Ag Inspection apparatus and method
WO2008125330A1 (en) * 2007-04-16 2008-10-23 Viscom Ag Through-substrate optical imaging device and method
WO2008152020A1 (en) * 2007-06-12 2008-12-18 Icos Vision Systems Nv Method for semiconductor substrate inspection
JP5801558B2 (en) * 2008-02-26 2015-10-28 スリーエム イノベイティブ プロパティズ カンパニー Multi-photon exposure system
US7891159B2 (en) * 2008-05-30 2011-02-22 Cryovac, Inc. Method for positioning a loaded bag in a vacuum chamber
GB2461510A (en) * 2008-06-30 2010-01-06 Ubidyne Inc Reconfigurable Bandpass Delta-Sigma Modulator
US20100193412A1 (en) * 2009-02-02 2010-08-05 Satake Usa, Inc. Beam splitter
EP2284523A1 (en) * 2009-07-28 2011-02-16 VITRONIC Dr.-Ing. Stein Bildverarbeitungssysteme GmbH Method and device for inspecting at least one solar cell
DE102009050711A1 (en) * 2009-10-26 2011-05-05 Schott Ag Method and device for detecting cracks in semiconductor substrates
US8450688B2 (en) 2009-11-05 2013-05-28 The Aerospace Corporation Refraction assisted illumination for imaging
US8138476B2 (en) * 2009-11-05 2012-03-20 The Aerospace Corporation Refraction assisted illumination for imaging
US8461532B2 (en) * 2009-11-05 2013-06-11 The Aerospace Corporation Refraction assisted illumination for imaging
TWI567381B (en) * 2009-11-16 2017-01-21 魯道夫科技股份有限公司 Infrared inspection of bonded substrates
TWI412736B (en) * 2009-12-04 2013-10-21 Delta Electronics Inc A apparatus and method for inspecting inner defect of substrate
US8947664B2 (en) * 2009-12-23 2015-02-03 Infineon Technologies Ag Apparatus and method for aligning a wafer's backside to a wafer's frontside
WO2011127474A1 (en) * 2010-04-09 2011-10-13 Northeastern University A tunable laser-based infrared imaging system and method of use thereof
KR101214806B1 (en) * 2010-05-11 2012-12-24 가부시키가이샤 사무코 Apparatus and method for defect inspection of wafer
TWI422814B (en) * 2010-08-23 2014-01-11 Delta Electronics Inc An apparatus and method for inspecting inner defect of substrate
CN103098459B (en) * 2010-09-22 2016-11-09 富士胶片株式会社 Stereo photographic device and shadow correction method
US8766192B2 (en) * 2010-11-01 2014-07-01 Asm Assembly Automation Ltd Method for inspecting a photovoltaic substrate
JP6000546B2 (en) * 2011-06-30 2016-09-28 浜松ホトニクス株式会社 Optical device for microscopic observation
JP6289450B2 (en) * 2012-05-09 2018-03-07 シーゲイト テクノロジー エルエルシーSeagate Technology LLC Surface feature mapping
US8896827B2 (en) 2012-06-26 2014-11-25 Kla-Tencor Corporation Diode laser based broad band light sources for wafer inspection tools
US9212900B2 (en) * 2012-08-11 2015-12-15 Seagate Technology Llc Surface features characterization
FR2994734B1 (en) * 2012-08-21 2017-08-25 Fogale Nanotech DEVICE AND METHOD FOR MAKING DIMENSION MEASUREMENTS ON MULTI-LAYER OBJECTS SUCH AS WAFERS.
US9427776B2 (en) * 2012-08-23 2016-08-30 Raytheon Company Method of stress relief in anti-reflective coated cap wafers for wafer level packaged infrared focal plane arrays
JP5862522B2 (en) * 2012-09-06 2016-02-16 株式会社島津製作所 Inspection device
US9297759B2 (en) 2012-10-05 2016-03-29 Seagate Technology Llc Classification of surface features using fluorescence
US9297751B2 (en) 2012-10-05 2016-03-29 Seagate Technology Llc Chemical characterization of surface features
US9377394B2 (en) 2012-10-16 2016-06-28 Seagate Technology Llc Distinguishing foreign surface features from native surface features
US9007454B2 (en) 2012-10-31 2015-04-14 The Aerospace Corporation Optimized illumination for imaging
US9217714B2 (en) 2012-12-06 2015-12-22 Seagate Technology Llc Reflective surfaces for surface features of an article
US9513215B2 (en) 2013-05-30 2016-12-06 Seagate Technology Llc Surface features by azimuthal angle
US9217715B2 (en) 2013-05-30 2015-12-22 Seagate Technology Llc Apparatuses and methods for magnetic features of articles
US9201019B2 (en) 2013-05-30 2015-12-01 Seagate Technology Llc Article edge inspection
US9274064B2 (en) 2013-05-30 2016-03-01 Seagate Technology Llc Surface feature manager
DE102013112885A1 (en) * 2013-11-21 2015-05-21 Osram Opto Semiconductors Gmbh Method for optically characterizing an optoelectronic semiconductor material and device for carrying out the method
KR101886947B1 (en) 2014-05-05 2018-08-08 아르코닉 인코포레이티드 Apparatus and methods for weld measurement
JP6298577B2 (en) * 2014-05-16 2018-03-20 アーコニック インコーポレイテッドArconic Inc. Peeling apparatus and method for separating welded layers
US20160056065A1 (en) * 2014-08-25 2016-02-25 Robert E. Stahlbush Method and apparatus for removing experimental artifacts from ensemble images
AR104246A1 (en) 2015-04-14 2017-07-05 Cryovac Inc METHOD OF PLACING AND SEALING A BAG IN AN EMPTY CHAMBER, BAG PLACEMENT APPLIANCE, AND METHOD OF MANUFACTURING A PATCH BAG
GB2549071B (en) * 2016-03-23 2020-11-11 Sony Interactive Entertainment Inc 3D printing system
SG11201810682PA (en) * 2016-06-02 2018-12-28 Tokyo Electron Ltd Dark field wafer nano-defect inspection system with a singular beam
KR20170138207A (en) * 2016-06-07 2017-12-15 삼성전자주식회사 Method for Inspecting Surface
US11137356B2 (en) * 2017-11-03 2021-10-05 Sela Semiconductor Engineering Laboratories Ltd. System and method of cleaving of buried defects
EP3707499A4 (en) 2017-11-07 2021-07-28 ABB Schweiz AG Method and apparatus for imaging analysis of a switchgear or the like
CN108593564B (en) * 2018-02-11 2021-01-29 常德金德新材料科技股份有限公司 Color quality detection method for cigarette label printing stock
TW202036059A (en) * 2018-11-07 2020-10-01 荷蘭商露明控股公司 Illumination module
CN109540918B (en) * 2018-11-28 2021-04-16 鞍钢集团自动化有限公司 Hot-rolled coil edge defect detection device and method
TWI802843B (en) * 2021-02-04 2023-05-21 敬鵬工業股份有限公司 Intelligent logistics data collection system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4595289A (en) * 1984-01-25 1986-06-17 At&T Bell Laboratories Inspection system utilizing dark-field illumination
US5892579A (en) * 1996-07-16 1999-04-06 Orbot Instruments Ltd. Optical inspection method and apparatus
US6141040A (en) * 1996-01-09 2000-10-31 Agilent Technologies, Inc. Measurement and inspection of leads on integrated circuit packages

Family Cites Families (150)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1218852A (en) * 1968-04-02 1971-01-13 English Electric Co Ltd High voltage thyristor equipment
US4435732A (en) * 1973-06-04 1984-03-06 Hyatt Gilbert P Electro-optical illumination control system
US3936686A (en) 1973-05-07 1976-02-03 Moore Donald W Reflector lamp cooling and containing assemblies
US4011575A (en) * 1974-07-26 1977-03-08 Litton Systems, Inc. Light emitting diode array having a plurality of conductive paths for each light emitting diode
US4118873A (en) 1976-12-13 1978-10-10 Airco, Inc. Method and apparatus for inerting the atmosphere above a moving product surface
JPS57180005A (en) 1981-04-30 1982-11-05 Hitachi Ltd Silicon carbide electric insulator with low dielectric constant
US4530040A (en) 1984-03-08 1985-07-16 Rayovac Corporation Optical focusing system
US4680644A (en) * 1984-07-23 1987-07-14 Canon Kabushiki Kaisha Method and apparatus for reading an image
JPH0825302B1 (en) 1984-09-27 1996-03-13 Sanyo Denki Kk
DE3578768D1 (en) * 1985-03-14 1990-08-23 Toppan Printing Co Ltd DEVICE FOR CHECKING PRINTING.
US4684801A (en) * 1986-02-28 1987-08-04 Carroll Touch Inc. Signal preconditioning for touch entry device
KR880014692A (en) * 1987-05-30 1988-12-24 강진구 Semiconductor Light Emitting Device with Reflector
GB2224374A (en) 1988-08-24 1990-05-02 Plessey Co Plc Temperature control of light-emitting devices
DE8815418U1 (en) 1988-12-12 1989-02-16 Isensee-Electronic-Gmbh, 7012 Fellbach, De
US5296724A (en) * 1990-04-27 1994-03-22 Omron Corporation Light emitting semiconductor device having an optical element
JPH0424541A (en) * 1990-05-21 1992-01-28 Mitsui Mining & Smelting Co Ltd Method and apparatus for measuring internal defect
US5018853A (en) * 1990-06-04 1991-05-28 Bear Automotive Service Equipment Company Angle sensor with CCD
US5150623A (en) 1990-07-17 1992-09-29 The Boeing Company Inspection device for flush head bolts and rivets
US5032734A (en) * 1990-10-15 1991-07-16 Vti, Inc. Method and apparatus for nondestructively measuring micro defects in materials
US5195102A (en) * 1991-09-13 1993-03-16 Litton Systems Inc. Temperature controlled laser diode package
JP3025109B2 (en) * 1992-03-11 2000-03-27 シャープ株式会社 Light source and light source device
US5397867A (en) * 1992-09-04 1995-03-14 Lucas Industries, Inc. Light distribution for illuminated keyboard switches and displays
JPH06301304A (en) 1993-02-19 1994-10-28 Minolta Camera Co Ltd Fixing device
US6118383A (en) 1993-05-07 2000-09-12 Hegyi; Dennis J. Multi-function light sensor for vehicle
FR2707223B1 (en) * 1993-07-07 1995-09-29 Valeo Vision Improved signaling light with light-emitting diodes.
US5424544A (en) * 1994-04-29 1995-06-13 Texas Instruments Incorporated Inter-pixel thermal isolation for hybrid thermal detectors
US5449926A (en) 1994-05-09 1995-09-12 Motorola, Inc. High density LED arrays with semiconductor interconnects
US5632551A (en) 1994-07-18 1997-05-27 Grote Industries, Inc. LED vehicle lamp assembly
US5698866A (en) 1994-09-19 1997-12-16 Pdt Systems, Inc. Uniform illuminator for phototherapy
US5555038A (en) * 1994-10-28 1996-09-10 Bausch & Lomb Incorporated Unitary lens for eyewear
US5660461A (en) * 1994-12-08 1997-08-26 Quantum Devices, Inc. Arrays of optoelectronic devices and method of making same
US5522225A (en) * 1994-12-19 1996-06-04 Xerox Corporation Thermoelectric cooler and temperature sensor subassembly with improved temperature control
US5554849A (en) * 1995-01-17 1996-09-10 Flir Systems, Inc. Micro-bolometric infrared staring array
US5623510A (en) 1995-05-08 1997-04-22 The United States Of America As Represented By The United States Department Of Energy Tunable, diode side-pumped Er: YAG laser
US20020054291A1 (en) * 1997-06-27 2002-05-09 Tsai Bin-Ming Benjamin Inspection system simultaneously utilizing monochromatic darkfield and broadband brightfield illumination sources
US5719589A (en) 1996-01-11 1998-02-17 Motorola, Inc. Organic light emitting diode array drive apparatus
US5981949A (en) * 1996-01-18 1999-11-09 The United States Of America As Represented By The Secretary Of The Air Force Locating defects in solid material
US5806965A (en) * 1996-01-30 1998-09-15 R&M Deese, Inc. LED beacon light
US5936353A (en) * 1996-04-03 1999-08-10 Pressco Technology Inc. High-density solid-state lighting array for machine vision applications
US5777729A (en) * 1996-05-07 1998-07-07 Nikon Corporation Wafer inspection method and apparatus using diffracted light
US5880828A (en) * 1996-07-26 1999-03-09 Hitachi Electronics Engineering Co., Ltd. Surface defect inspection device and shading correction method therefor
US6058012A (en) * 1996-08-26 2000-05-02 Compaq Computer Corporation Apparatus, method and system for thermal management of an electronic system having semiconductor devices
US5857767A (en) * 1996-09-23 1999-01-12 Relume Corporation Thermal management system for L.E.D. arrays
US5715270A (en) 1996-09-27 1998-02-03 Mcdonnell Douglas Corporation High efficiency, high power direct diode laser systems and methods therefor
US5910706A (en) 1996-12-18 1999-06-08 Ultra Silicon Technology (Uk) Limited Laterally transmitting thin film electroluminescent device
TW402856B (en) 1996-12-26 2000-08-21 Palite Corp LED illuminator
US5783909A (en) * 1997-01-10 1998-07-21 Relume Corporation Maintaining LED luminous intensity
US5877899A (en) 1997-05-13 1999-03-02 Northeast Robotics Llc Imaging system and method for imaging indicia on wafer
US6319425B1 (en) 1997-07-07 2001-11-20 Asahi Rubber Inc. Transparent coating member for light-emitting diodes and a fluorescent color light source
US6376329B1 (en) 1997-08-04 2002-04-23 Nikon Corporation Semiconductor wafer alignment using backside illumination
US6459919B1 (en) * 1997-08-26 2002-10-01 Color Kinetics, Incorporated Precision illumination methods and systems
US6577332B2 (en) 1997-09-12 2003-06-10 Ricoh Company, Ltd. Optical apparatus and method of manufacturing optical apparatus
US6163036A (en) * 1997-09-15 2000-12-19 Oki Data Corporation Light emitting element module with a parallelogram-shaped chip and a staggered chip array
US6273596B1 (en) 1997-09-23 2001-08-14 Teledyne Lighting And Display Products, Inc. Illuminating lens designed by extrinsic differential geometry
US6200134B1 (en) * 1998-01-20 2001-03-13 Kerr Corporation Apparatus and method for curing materials with radiation
EP0935145A1 (en) 1998-02-04 1999-08-11 IMS Industrial Micro System AG Optical signal and display device
US6239702B1 (en) 1998-03-10 2001-05-29 Raytheon Company Electromagnetic energy detection
US6088185A (en) * 1998-06-05 2000-07-11 Seagate Technology, Inc. Rotational vibration detection using a velocity sense coil
US6536923B1 (en) * 1998-07-01 2003-03-25 Sidler Gmbh & Co. Optical attachment for a light-emitting diode and brake light for a motor vehicle
JP3195294B2 (en) * 1998-08-27 2001-08-06 スタンレー電気株式会社 Vehicle lighting
US6291839B1 (en) * 1998-09-11 2001-09-18 Lumileds Lighting, U.S. Llc Light emitting device having a finely-patterned reflective contact
GB9821311D0 (en) 1998-10-02 1998-11-25 Koninkl Philips Electronics Nv Reflective liquid crystal display device
DE19852323C2 (en) * 1998-11-12 2001-08-16 Steag Hamatech Ag Method for determining the thickness of layers provided on a substrate
US6534791B1 (en) * 1998-11-27 2003-03-18 Lumileds Lighting U.S., Llc Epitaxial aluminium-gallium nitride semiconductor substrate
US20010042866A1 (en) * 1999-02-05 2001-11-22 Carrie Carter Coman Inxalygazn optical emitters fabricated via substrate removal
US6320206B1 (en) * 1999-02-05 2001-11-20 Lumileds Lighting, U.S., Llc Light emitting devices having wafer bonded aluminum gallium indium nitride structures and mirror stacks
US6155699A (en) * 1999-03-15 2000-12-05 Agilent Technologies, Inc. Efficient phosphor-conversion led structure
AU3667900A (en) 1999-04-07 2000-10-23 Mv Research Limited Material inspection
JP3536203B2 (en) * 1999-06-09 2004-06-07 東芝セラミックス株式会社 Method and apparatus for measuring crystal defects in wafer
JP4332933B2 (en) * 1999-06-10 2009-09-16 ソニー株式会社 Inspection device
US6285449B1 (en) * 1999-06-11 2001-09-04 University Of Chicago Optical method and apparatus for detection of defects and microstructural changes in ceramics and ceramic coatings
DE19931689A1 (en) 1999-07-08 2001-01-11 Patent Treuhand Ges Fuer Elektrische Gluehlampen Mbh Planar LED assembly on thermally-conductive board, increases cooling, component packing density and life, whilst permitting active device integration to form display- or illumination panel in or on e.g. vehicle
US6366017B1 (en) * 1999-07-14 2002-04-02 Agilent Technologies, Inc. Organic light emitting diodes with distributed bragg reflector
TW457732B (en) * 1999-08-27 2001-10-01 Lumileds Lighting Bv Luminaire, optical element and method of illuminating an object
JP4131891B2 (en) * 1999-09-03 2008-08-13 ローム株式会社 Lens array and method for manufacturing lens array
US6788895B2 (en) 1999-12-10 2004-09-07 Altera Corporation Security mapping and auto reconfiguration
JP2001194321A (en) * 2000-01-12 2001-07-19 Tokyo Seimitsu Co Ltd Semiconductor wafer inspection device
US6318886B1 (en) * 2000-02-11 2001-11-20 Whelen Engineering Company High flux led assembly
US7320593B2 (en) 2000-03-08 2008-01-22 Tir Systems Ltd. Light emitting diode light source for curing dental composites
US6419384B1 (en) * 2000-03-24 2002-07-16 Buztronics Inc Drinking vessel with indicator activated by inertial switch
US6328456B1 (en) 2000-03-24 2001-12-11 Ledcorp Illuminating apparatus and light emitting diode
US6704089B2 (en) * 2000-04-28 2004-03-09 Asml Netherlands B.V. Lithographic projection apparatus, a method for determining a position of a substrate alignment mark, a device manufacturing method and device manufactured thereby
EP1158761A1 (en) 2000-05-26 2001-11-28 GRETAG IMAGING Trading AG Photographic image acquisition device using led chips
US6850637B1 (en) * 2000-06-28 2005-02-01 Teradyne, Inc. Lighting arrangement for automated optical inspection system
DE60143152D1 (en) * 2000-06-29 2010-11-11 Koninkl Philips Electronics Nv OPTOELECTRIC ELEMENT
JP4142234B2 (en) * 2000-07-04 2008-09-03 株式会社エンプラス Surface light source device and liquid crystal display device
US7099005B1 (en) * 2000-09-27 2006-08-29 Kla-Tencor Technologies Corporation System for scatterometric measurements and applications
CN1259732C (en) * 2000-09-29 2006-06-14 欧姆龙株式会社 Optical device for optical element and equipment using the same
EP1366552B1 (en) 2000-09-29 2011-11-09 Panasonic Electric Works Co., Ltd. Semiconductor device with protective functions
US7714301B2 (en) 2000-10-27 2010-05-11 Molecular Devices, Inc. Instrument excitation source and calibration method
US6525335B1 (en) * 2000-11-06 2003-02-25 Lumileds Lighting, U.S., Llc Light emitting semiconductor devices including wafer bonded heterostructures
GB2369428B (en) * 2000-11-22 2004-11-10 Imperial College Detection system
GB0101985D0 (en) * 2001-01-25 2001-03-14 Marconi Comm Ltd Optical component
CA2332190A1 (en) 2001-01-25 2002-07-25 Efos Inc. Addressable semiconductor array light source for localized radiation delivery
US7075112B2 (en) * 2001-01-31 2006-07-11 Gentex Corporation High power radiation emitter device and heat dissipating package for electronic components
US6547249B2 (en) * 2001-03-29 2003-04-15 Lumileds Lighting U.S., Llc Monolithic series/parallel led arrays formed on highly resistive substrates
US6755647B2 (en) * 2001-04-26 2004-06-29 New Photonics, Llc Photocuring device with axial array of light emitting diodes and method of curing
US6607286B2 (en) * 2001-05-04 2003-08-19 Lumileds Lighting, U.S., Llc Lens and lens cap with sawtooth portion for light emitting diode
US6630689B2 (en) * 2001-05-09 2003-10-07 Lumileds Lighting, U.S. Llc Semiconductor LED flip-chip with high reflectivity dielectric coating on the mesa
US6578986B2 (en) * 2001-06-29 2003-06-17 Permlight Products, Inc. Modular mounting arrangement and method for light emitting diodes
US6785001B2 (en) * 2001-08-21 2004-08-31 Silicon Light Machines, Inc. Method and apparatus for measuring wavelength jitter of light signal
JP4067802B2 (en) * 2001-09-18 2008-03-26 松下電器産業株式会社 Lighting device
US6561808B2 (en) 2001-09-27 2003-05-13 Ceramoptec Industries, Inc. Method and tools for oral hygiene
US6942018B2 (en) 2001-09-28 2005-09-13 The Board Of Trustees Of The Leland Stanford Junior University Electroosmotic microchannel cooling system
US6498355B1 (en) 2001-10-09 2002-12-24 Lumileds Lighting, U.S., Llc High flux LED array
US6561640B1 (en) * 2001-10-31 2003-05-13 Xerox Corporation Systems and methods of printing with ultraviolet photosensitive resin-containing materials using light emitting devices
US8057903B2 (en) * 2001-11-30 2011-11-15 Sabic Innovative Plastics Ip B.V. Multilayer articles comprising resorcinol arylate polyester and method for making thereof
EP1335577A3 (en) * 2002-02-08 2006-04-05 Canon Kabushiki Kaisha Illumination Apparatus and Image Reading Apparatus
IL148566A (en) * 2002-03-07 2007-06-17 Nova Measuring Instr Ltd Method and system for overlay measurement
JP3991730B2 (en) 2002-03-13 2007-10-17 チッソ株式会社 Polymerizable compound and polymer thereof
US6724473B2 (en) * 2002-03-27 2004-04-20 Kla-Tencor Technologies Corporation Method and system using exposure control to inspect a surface
US6796698B2 (en) * 2002-04-01 2004-09-28 Gelcore, Llc Light emitting diode-based signal light
KR20050044865A (en) 2002-05-08 2005-05-13 포세온 테크날러지 인코퍼레이티드 High efficiency solid-state light source and methods of use and manufacture
WO2006072071A2 (en) 2004-12-30 2006-07-06 Phoseon Technology Inc. Methods and systems relating to light sources for use in industrial processes
US6642066B1 (en) * 2002-05-15 2003-11-04 Advanced Micro Devices, Inc. Integrated process for depositing layer of high-K dielectric with in-situ control of K value and thickness of high-K dielectric layer
DE10223201C1 (en) * 2002-05-24 2003-05-28 Fraunhofer Ges Forschung Optical detection device for optical data has primary detection diode and secondary detection diodes coupled to respective read-out circuits with different read-out rates
US6573536B1 (en) * 2002-05-29 2003-06-03 Optolum, Inc. Light emitting diode light source
GB2396331A (en) 2002-12-20 2004-06-23 Inca Digital Printers Ltd Curing ink
US20040134603A1 (en) * 2002-07-18 2004-07-15 Hideo Kobayashi Method and apparatus for curing adhesive between substrates, and disc substrate bonding apparatus
US7279069B2 (en) 2002-07-18 2007-10-09 Origin Electric Company Limited Adhesive curing method, curing apparatus, and optical disc lamination apparatus using the curing apparatus
EP1551329A4 (en) 2002-07-25 2006-08-16 Jonathan S Dahm Method and apparatus for using light emitting diodes for curing
WO2004038759A2 (en) 2002-08-23 2004-05-06 Dahm Jonathan S Method and apparatus for using light emitting diodes
US7084935B2 (en) 2002-08-28 2006-08-01 Adaptive Micro Systems, Llc Display device with molded light guide
US7008795B2 (en) 2002-09-20 2006-03-07 Mitsubishi Electric Research Labs, Inc. Multi-way LED-based chemochromic sensor
US20040207836A1 (en) * 2002-09-27 2004-10-21 Rajeshwar Chhibber High dynamic range optical inspection system and method
US6822991B2 (en) * 2002-09-30 2004-11-23 Lumileds Lighting U.S., Llc Light emitting devices including tunnel junctions
US6880954B2 (en) 2002-11-08 2005-04-19 Smd Software, Inc. High intensity photocuring system
US6708501B1 (en) * 2002-12-06 2004-03-23 Nanocoolers, Inc. Cooling of electronics by electrically conducting fluids
TW571449B (en) * 2002-12-23 2004-01-11 Epistar Corp Light-emitting device having micro-reflective structure
US7465909B2 (en) 2003-01-09 2008-12-16 Con-Trol-Cure, Inc. UV LED control loop and controller for causing emitting UV light at a much greater intensity for UV curing
US7211299B2 (en) * 2003-01-09 2007-05-01 Con-Trol-Cure, Inc. UV curing method and apparatus
US7175712B2 (en) * 2003-01-09 2007-02-13 Con-Trol-Cure, Inc. Light emitting apparatus and method for curing inks, coatings and adhesives
GB0304761D0 (en) 2003-03-01 2003-04-02 Integration Technology Ltd Ultraviolet curing
US20040206970A1 (en) 2003-04-16 2004-10-21 Martin Paul S. Alternating current light emitting device
US7068363B2 (en) * 2003-06-06 2006-06-27 Kla-Tencor Technologies Corp. Systems for inspection of patterned or unpatterned wafers and other specimen
US7271921B2 (en) * 2003-07-23 2007-09-18 Kla-Tencor Technologies Corporation Method and apparatus for determining surface layer thickness using continuous multi-wavelength surface scanning
US7002677B2 (en) 2003-07-23 2006-02-21 Kla-Tencor Technologies Corporation Darkfield inspection system having a programmable light selection array
JP2007503622A (en) 2003-08-26 2007-02-22 レッドシフト システムズ コーポレイション Infrared camera system
US7102172B2 (en) * 2003-10-09 2006-09-05 Permlight Products, Inc. LED luminaire
US8264431B2 (en) 2003-10-23 2012-09-11 Massachusetts Institute Of Technology LED array with photodetector
WO2005043598A2 (en) 2003-10-31 2005-05-12 Phoseon Technology, Inc. Use of potting gels for fabricating microoptic arrays
EP1678442B8 (en) 2003-10-31 2013-06-26 Phoseon Technology, Inc. Led light module and manufacturing method
US7524085B2 (en) 2003-10-31 2009-04-28 Phoseon Technology, Inc. Series wiring of highly reliable light sources
US7179670B2 (en) * 2004-03-05 2007-02-20 Gelcore, Llc Flip-chip light emitting diode device without sub-mount
US7638808B2 (en) * 2004-03-18 2009-12-29 Phoseon Technology, Inc. Micro-reflectors on a substrate for high-density LED array
US7235878B2 (en) * 2004-03-18 2007-06-26 Phoseon Technology, Inc. Direct cooling of LEDs
EP1743384B1 (en) 2004-03-30 2015-08-05 Phoseon Technology, Inc. Led array having array-based led detectors
US7071493B2 (en) * 2004-04-12 2006-07-04 Phoseon Technology, Inc. High density LED array
WO2005100961A2 (en) 2004-04-19 2005-10-27 Phoseon Technology, Inc. Imaging semiconductor strucutures using solid state illumination
US7554656B2 (en) * 2005-10-06 2009-06-30 Kla-Tencor Technologies Corp. Methods and systems for inspection of a wafer

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4595289A (en) * 1984-01-25 1986-06-17 At&T Bell Laboratories Inspection system utilizing dark-field illumination
US6141040A (en) * 1996-01-09 2000-10-31 Agilent Technologies, Inc. Measurement and inspection of leads on integrated circuit packages
US5892579A (en) * 1996-07-16 1999-04-06 Orbot Instruments Ltd. Optical inspection method and apparatus

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8192053B2 (en) 2002-05-08 2012-06-05 Phoseon Technology, Inc. High efficiency solid-state light source and methods of use and manufacture
US10401012B2 (en) 2002-05-08 2019-09-03 Phoseon Technology, Inc. High efficiency solid-state light source and methods of use and manufacture
US7816638B2 (en) 2004-03-30 2010-10-19 Phoseon Technology, Inc. LED array having array-based LED detectors
US8077305B2 (en) 2004-04-19 2011-12-13 Owen Mark D Imaging semiconductor structures using solid state illumination
EP1801569A3 (en) * 2005-12-23 2008-07-23 Basler Aktiengesellschaft Method and device for detecting cracks in silicon wafers
EP1801569A2 (en) * 2005-12-23 2007-06-27 Basler Aktiengesellschaft Method and device for detecting cracks in silicon wafers
EP1857977A2 (en) * 2006-05-16 2007-11-21 Mitsubishi Electric Corporation Image inspection method and image inspection apparatus employing the same
EP1857977A3 (en) * 2006-05-16 2010-06-16 Mitsubishi Electric Corporation Image inspection method and image inspection apparatus employing the same
US7965883B2 (en) 2006-05-16 2011-06-21 Mitsubishi Electric Corporation Image inspection method and image inspection apparatus employing the same
WO2009027517A1 (en) 2007-08-31 2009-03-05 Icos Vision Systems Nv Apparatus and method for detecting semiconductor substrate anomalies
US8379964B2 (en) 2007-08-31 2013-02-19 Kla-Tencor Corporation Detecting semiconductor substrate anomalies
US8428337B2 (en) 2008-07-28 2013-04-23 Bluplanet Pte Ltd Apparatus for detecting micro-cracks in wafers and method therefor
US9651502B2 (en) 2008-07-28 2017-05-16 Bluplanet Pte Ltd Method and system for detecting micro-cracks in wafers
US8781208B2 (en) 2009-04-30 2014-07-15 Wilcox Associates, Inc. Inspection method and inspection apparatus
WO2010145881A1 (en) * 2009-04-30 2010-12-23 Wilcox Associates, Inc. An inspection method and an inspection apparatus

Also Published As

Publication number Publication date
EP1738156A4 (en) 2017-09-27
TW200539489A (en) 2005-12-01
US8077305B2 (en) 2011-12-13
TWI302756B (en) 2008-11-01
EP1738156A2 (en) 2007-01-03
US20050231713A1 (en) 2005-10-20
WO2005100961A3 (en) 2007-02-08

Similar Documents

Publication Publication Date Title
US8077305B2 (en) Imaging semiconductor structures using solid state illumination
JP7373527B2 (en) Workpiece defect detection device and method
US8426223B2 (en) Wafer edge inspection
US20090196489A1 (en) High resolution edge inspection
JP6850332B2 (en) Imaging system
TWI589861B (en) System and method for detecting cracks in a wafer
CN108449972B (en) Low-level photoluminescence imaging by optical filtering
TW522447B (en) Method and apparatus for embedded substrate and system status monitoring
US9064922B2 (en) Substrate inspection apparatus and substrate inspection method
US9651502B2 (en) Method and system for detecting micro-cracks in wafers
TW201315990A (en) Solar metrology methods and apparatus
CN106688110B (en) It is bonded the method for detecting of defective part and checks system
TW201339572A (en) Apparatus and method for detecting defects in device
EP2609419A1 (en) Methods and systems for inspecting bonded wafers
TW201727215A (en) Micro photoluminescence imaging with optical filtering
JP2018146531A (en) Substrate inspection device, substrate polishing device, substrate inspection method, and substrate polishing method
JP2001141659A (en) Image pick-up device and defect detecting apparatus
JP3505655B2 (en) Glass container inspection equipment
JP2003007746A (en) Method and equipment for inspecting semiconductor element
JP2007010586A (en) Inspection device and method for infrared reflecting film coated member

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2005745456

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

WWP Wipo information: published in national office

Ref document number: 2005745456

Country of ref document: EP