US20010012069A1 - Confocal microscope with a motorized scanning table - Google Patents


Info

Publication number
US20010012069A1
US20010012069A1
Authority
US
United States
Prior art keywords
array
sensor
confocal microscope
diaphragm
motion
Prior art date
Legal status
Granted
Application number
US09/779,960
Other versions
US6429897B2
Inventor
Eberhard Derndinger
Norbert Czarnetzki
Peter Ott
Thomas Scherubl
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US09/779,960
Publication of US20010012069A1
Application granted
Publication of US6429897B2
Anticipated expiration
Status: Expired - Fee Related


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/002 Scanning microscopes
    • G02B21/0024 Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
    • G02B21/0036 Scanning details, e.g. scanning stages
    • G02B21/004 Scanning details, e.g. scanning stages fixed arrays, e.g. switchable aperture arrays
    • G02B21/0052 Optical details of the image generation
    • G02B21/0056 Optical details of the image generation based on optical coherence, e.g. phase-contrast arrangements, interference arrangements
    • G02B21/0068 Optical details of the image generation arrangements using polarisation
    • G02B21/0072 Optical details of the image generation details concerning resolution or correction, including general design of CSOM objectives
    • G02B21/008 Details of detection or image processing, including general computer control

Definitions

  • This invention relates to a confocal microscope and, more particularly, to a confocal microscope with a motorized scanning table for moving a sample perpendicularly to the optical axis of the microscope.
  • a confocal microscope with a motorized scanning table to move a sample perpendicularly to the optical axis of the microscope is known from U.S. Pat. No. 5,239,178. Furthermore, the microscope has a light source array in a plane conjugate to the focal plane of an objective, and a detector array with numerous light-sensitive elements, also in a plane conjugate to the focal plane of the microscope objective. The movement of the specimen perpendicularly to the optical axis of the microscope takes place primarily in the microscopic region in order to increase the resolution, otherwise defined by the raster spacing of the light source array, perpendicular to the optical axis.
  • a Nomarski microscope (not confocal) is designed for taking and storing corresponding series of images, and is described, for example, in European Patent EP 0 444 450-A1. Since this Nomarski microscope is not confocal, it has only a small resolution in the direction of the optical axis. Furthermore, this microscope is much too slow when the image data in a large number of image fields must be sensed. The sensing of large object fields in the shortest possible time, with high resolution, is imperative in inspection equipment used in production processes, for example, in the semiconductor industry or in LCD production.
  • a microscope used for wafer inspection, also not confocal, is described in U.S. Pat. No. 5,264,912.
  • filtering takes place in the Fourier plane of the objective.
  • the transmission characteristic of the spatial filter in the Fourier plane corresponds to the inverse diffraction figure of the integrated circuit (IC) that is being produced. Consequently, the filter transmits light only when the diffraction image of the momentarily imaged IC deviates from the diffraction image of the reference IC, and it can be concluded that the structure of the observed IC deviates from the reference structure.
  • a CCD array or, alternatively, a high speed multiple output time delay integration (TDI) sensor is provided as the light detector.
  • U.S. Pat. No. 5,365,084 includes an arrangement for inspecting a running length of fabric during its manufacture, in which a TDI sensor is used, synchronized with the motion of the length of fabric.
  • a video inspection device cannot be considered for inspecting semiconductors in a production process, because of its low resolution both in the direction of the optical axis and perpendicular to the optical axis.
  • the object of the present invention is to provide an arrangement that can be used for the optical inspection of semiconductors in the production process.
  • a further object is to achieve a sufficient resolution both in the direction of, and also perpendicular to, the optical axis.
  • an object is to sense large image fields in the shortest possible time.
  • a synchronizing unit for effecting displacement of the charges corresponding to motion of an image point of an object point in a plane of the sensor array.
  • the arrangement according to the invention is a confocal microscope with a motorized scanning table to move the specimen perpendicularly to the optical axis of the microscope. It has a diaphragm array with numerous light-transmitting regions, so-called pinholes, in a plane that is conjugate to the focal plane of the microscope objective.
  • the diaphragm array is followed by a sensor array that has numerous photosensitive elements. Each photosensitive element is associated with a charge storage element.
  • the sensor array has a device for displacing the charges stored in the charge storage elements from one storage element to another storage element, as in the case in the so-called TDI sensors.
  • a synchronizing unit is provided that effects displacing charges corresponding to the movement of the image point of a specimen point in the plane of the sensor array.
  • the motion of the sample takes place along linear paths that extend over the complete length of the sample in the direction of motion.
  • corresponding linear paths can be combined in a meander form.
  • Short acceleration or deceleration segments during which no signal recording takes place, occur respectively at the start and at the end of each linear path. Between these acceleration and deceleration segments, the motion of the sample is uniform, so that the movement of charge between the storage elements of the sensor array and the motion of the image point on the sensor array are mutually synchronized.
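The synchronization described above amounts to matching the TDI line-shift rate to the speed at which the image travels in the sensor plane. A minimal sketch in Python; the stage speed, imaging scale, and pixel pitch below are illustrative assumptions, not values from the patent:

```python
def tdi_line_rate(stage_speed_um_s, magnification, pixel_pitch_um):
    """Required charge-shift rate for a TDI sensor tracking a moving stage.

    The image of an object point moves at (stage speed x magnification) in
    the sensor plane; the charges must advance one row (one pixel pitch)
    in the time the image crosses one pixel.
    """
    image_speed_um_s = stage_speed_um_s * magnification
    return image_speed_um_s / pixel_pitch_um  # rows (stages) per second

# hypothetical example: 10 mm/s stage speed, 20x scale, 13 um pixel pitch
rate_hz = tdi_line_rate(10_000.0, 20.0, 13.0)
```

During the uniform part of each linear path this rate is constant; during the short acceleration and deceleration segments no signal is recorded, so no rate needs to be matched there.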
  • a light source array that has numerous mutually spaced-apart light sources is arranged in a plane conjugate to the focal plane of the objective.
  • the positions of the individual light sources are conjugate to the positions of the transparent regions of the diaphragm array.
  • Corresponding light source arrays can be formed in different ways. The simplest variant results when the diaphragm array is arranged in a common portion of the illumination and observation beam paths, and the diaphragm array is illuminated from the back.
  • this simple arrangement has a disadvantage, in that a substantial portion of the illuminating light is reflected at the back side of the diaphragm array and thus produces a strong signal background on the sensor array.
  • Such a strong signal background can be prevented by providing two separate diaphragm arrays, one in the illuminating beam path and the other in the observation beam path or measuring beam path.
  • the diaphragm array in the illumination beam path is then again illuminated from the back.
  • the diaphragm array in the illumination beam path can be preceded by a lens array, as described in U.S. Pat. No. 5,239,178.
  • the light source array can also be formed by light-conducting fibers with their end surfaces arranged in an array.
  • a correspondingly constructed diffractive element may be used.
  • the diaphragm array, the light source array, and the sensor array are at rest. All three components are mutually stationary.
  • the sensor array is a two-dimensional array of photosensitive elements and charge storage elements associated with the photosensitive elements, with numerous columns arranged parallel to each other. The direction of the columns is then defined by the direction in which the charges are displaced between the charge storage elements.
  • the light source array and diaphragm array, on the one hand, and the sensor array, on the other hand, are arranged relative to each other so that at least one transparent region of the diaphragm array is imaged on each of the mutually parallel columns of the sensor array.
  • TDI sensors may be used as the corresponding sensor array. To the extent that such TDI sensors have light-insensitive regions between the photosensitive surfaces, these can be arranged, and the imaging between the diaphragm array and the sensor can be chosen so that the transparent regions of the diaphragm array are exclusively imaged on the photosensitive regions.
  • the transparent regions of the diaphragm array are formed, corresponding to the direction of motion of the scanning table and to the imaging ratio between the object plane and the diaphragm array, so that the paths of the images of all the transparent regions closely fill, preferably without a gap, a portion of the focal plane, while maintaining the confocal filtering.
  • the image data for a strip whose width corresponds to the width of the image section sensed perpendicularly to the direction of motion are sensed completely confocally, without micro-movements perpendicular to the direction of motion being required.
  • the transparent regions of the diaphragm array may be arranged in the form of a two-dimensional rhombic grid.
  • the position of each transparent region corresponds to the position of a theoretical grid point.
  • alternatively, the transparent regions of the diaphragm array can be arranged in the form of a rectangular grid, the grid axes of which are rotated relative to the linear direction of motion.
  • such a rectangular geometry confers advantages when the light source array is formed as a fiber illumination, a lens array, or a diffractive element producing a corresponding illumination.
  • With anamorphotic imaging of the diaphragm array on the sensor array, such an offset arrangement of several two-dimensional sensor arrays has an image field that is larger, by a factor of the number of arrays, than that of a single sensor array with the same number of photosensitive elements, so that a correspondingly better signal/noise ratio results.
  • FIG. 1 a is a schematic of the principles of a first embodiment of the invention, with a single pinhole array arranged in the common portion of the illuminating and observation beam paths;
  • FIG. 1 b shows a second embodiment of the invention with separate light source array and diaphragm array;
  • FIG. 1 c is a schematic explaining the principle of the synchronization between object motion and charge displacement in the sensor array;
  • FIG. 2 a is a block circuit diagram for the synchronization between the object motion and the charge displacement in the sensor array;
  • FIG. 2 b is a detailed representation of the functioning sequence in the microcontroller of FIG. 2 a;
  • FIGS. 3 a - 3 c show sections of a diaphragm array forming a rhombic grid and the associated image points in the object plane and in the plane of the sensor array;
  • FIGS. 4 a - 4 c show sections of a diaphragm array forming a rectangular grid and the associated image points in the object plane and the plane of the sensor array;
  • FIG. 5 a shows a schematic representation of a sensor array consisting of several two-dimensional partial sensor arrays that are arranged mutually offset;
  • FIG. 5 b is a schematic of the principle of a pinhole array suitable for the sensor array of FIG. 5 a.
  • In the embodiment of FIG. 1 a, a single diaphragm array ( 4 ) with numerous transparent regions or holes is arranged in the common portion of the illumination and observation beam paths.
  • This arrangement forms, at one and the same time, the diaphragm array for the detection beam path and the light source array for the illumination of the object ( 8 ).
  • the diaphragm array ( 4 ) is uniformly illuminated from the back side by a light source ( 1 ) that is followed by a condenser ( 2 ).
  • Each transparent region, or each pinhole, of the diaphragm array ( 4 ) thus forms a secondary light source.
  • a tube lens ( 5 ) with a microscope objective ( 7 ) is arranged after the diaphragm array ( 4 ) in order to image the diaphragm array ( 4 ) on the object ( 8 ) positioned on the motorized scanning table ( 9 ).
  • the microscope objective ( 7 ) is shown, greatly simplified, as a single lens in FIG. 1 a.
  • the microscope objective ( 7 ) is corrected to an infinite focal intercept, and thus to an infinite image distance. This is indicated in FIG. 1 a by the telecentering diaphragm ( 6 ).
  • the diaphragm array ( 4 ), and thus also the light source array imaged by the diaphragm array ( 4 ), is arranged, by means of the tube lens ( 5 ) and the telecentric imaging, to be confocal with the focal plane of the objective ( 7 ).
  • a pattern of illumination corresponding to the image of the diaphragm array ( 4 ) arises in the focal plane of the objective ( 7 ).
  • the object ( 8 ) is illuminated at the points that are conjugate to the transparent regions of the diaphragm array ( 4 ).
  • the light scattered or reflected by the object ( 8 ) is imaged backward again, by the objective ( 7 ) with the subsequent tube lens ( 5 ), onto the diaphragm array ( 4 ).
  • the diaphragm array ( 4 ) effects confocal filtering, resulting in only such light being transmitted through the transparent regions of the diaphragm array ( 4 ) as was scattered or reflected in regions of the object ( 8 ) that are confocal to the transparent regions of the diaphragm array ( 4 ).
  • the light that is scattered or reflected on the object ( 8 ) above or below the focal plane of the objective ( 7 ) is trapped by the non-transparent regions of the diaphragm array ( 4 ).
  • This confocal microscope results in high resolution in the direction of the optical axis (z-direction), denoted by a dot-dash line.
  • a beam-splitter mirror ( 3 ) is arranged between the diaphragm array ( 4 ) and the condenser ( 2 ), and a portion of the light scattered or reflected at the object ( 8 ) and transmitted through the diaphragm array ( 4 ) is reflected out towards the sensor array ( 11 ).
  • a further imaging optics ( 10 ) that images the diaphragm array ( 4 ) on the sensor array ( 11 ) is provided in the reflected beam path; that is, between the beam-splitter mirror ( 3 ) and the sensor array ( 11 ).
  • the sensor array ( 11 ) is a so-called TDI sensor (Time Delay and Integration), such as is offered, for example, by DALSA Inc., Ontario, Canada, under the reference IT-E1 or IT-F2.
  • The TDI sensor has 2,048 columns, each with 96 TDI stages or rows.
  • a photosensitive region and a charge storage element are associated with each TDI stage in each column, so that the number of pixels (photosensitive regions) and charge storage elements amounts to 96 × 2048.
  • the diaphragm array ( 4 ) has at least a number of transparent regions corresponding to the number of columns of the TDI sensor, so that at least one transparent region of the diaphragm array ( 4 ) is imaged on each column of the TDI sensor.
  • the detailed imagewise arrangement of the pixels of the TDI sensor and of the transparent regions is described in more detail herein below with reference to FIGS. 3 a - 3 c and 4 a - 4 c.
  • the scanning table ( 9 ) can be moved by motor drive in two directions perpendicular to the optical axis, so that large object regions can be sensed. Its motion is sensed by means of two position measuring systems ( 12 ).
  • the summed charges in the charge storage elements of the TDI sensor ( 11 ) are displaced in the stage direction by means of a synchronization unit ( 13 ), corresponding to the motion of the scanning table ( 9 ).
  • the motion of the scanning table takes place along (possibly several) linear paths of movement, so that on the TDI sensor ( 11 ) the image point belonging to an object point is displaced along the columns. This state of affairs will be explained with reference to the simplified representation of FIG. 1 c.
  • an object point ( 8 a ) is imaged at an image point ( 11 a ) on the TDI sensor ( 11 ). Due to the motion of the scanning table ( 9 ), a motion of the object ( 8 ) results in the direction of the arrow (P 1 ) and at a somewhat later instant the object point ( 8 a ) has traveled to position ( 8 b ). Simultaneously with the motion of the object ( 8 ), the charges stored in the charge storage elements of the TDI sensor ( 11 ) are displaced in the direction of the arrow (P 2 ) from the stage ( 11 a ) to the stage ( 11 b ).
  • Measurement can proceed during the motion of the object ( 8 ) due to this synchronization between the motion of the object ( 8 ) and the motion of the charges.
  • the motion of the object ( 8 ) therefore does not take place in start-stop operation but uniformly during the measurement.
  • Substantially shorter measurement times are attained at the same signal/noise ratio compared to arrangements in which the object motion takes place in start/stop operation and a measurement takes place when the object is stationary.
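The benefit of measuring during motion can be illustrated with a toy one-column TDI model: when the charge shift is synchronized with the object motion, every charge packet keeps tracking the same object point and integrates it once per stage. The list-based charge register, stage count, and scene values below are illustrative assumptions, not the sensor's actual architecture:

```python
def tdi_scan(scene, n_stages=2):
    """Toy model of one TDI column. One clock tick = the scene advances one
    pixel AND the stored charges shift one stage, so a charge packet keeps
    tracking the same object point and integrates it n_stages times before
    being read out (illustrative sketch)."""
    charge = [0.0] * n_stages
    readout = []
    for t in range(len(scene) + n_stages):
        # expose: stage s currently images scene pixel (t - s)
        for s in range(n_stages):
            if 0 <= t - s < len(scene):
                charge[s] += scene[t - s]
        readout.append(charge[-1])      # last stage is read out
        charge = [0.0] + charge[:-1]    # shift charges one stage onward
    return readout

# each object point is integrated n_stages times, boosting signal/noise:
samples = [v for v in tdi_scan([1.0, 2.0, 3.0], n_stages=2) if v]
```

With the shift synchronized, each output sample is n_stages exposures of one object point; without synchronization the exposures would smear over neighboring points instead.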
  • In FIG. 1 b, components corresponding to the individual components of the embodiment according to FIG. 1 a are referenced with the same symbols as in FIG. 1 a.
  • the difference between the embodiment according to FIG. 1 a and that according to FIG. 1 b is that the diaphragm array ( 4 b ) is arranged following the beam splitter ( 3 ′) in the observation beam path or the detection beam path.
  • the illumination beam path has its own diaphragm array ( 4 a ), which forms the light source array.
  • the two diaphragm arrays ( 4 a ) and ( 4 b ) are arranged conjugate to each other and conjugate to the focal plane of the objective ( 7 ).
  • the transparent regions of the two diaphragm arrays ( 4 a ) and ( 4 b ) are also mutually conjugate.
  • the use of separate diaphragm arrays ( 4 a, 4 b ) in the illumination and observation beam paths avoids producing a large signal background on the TDI sensor ( 11 ) due to the relatively large proportion of light reflected at the diaphragm array ( 4 a ) of the illumination beam path.
  • the beam splitter ( 3 ′) is constructed as a polarizing beam splitter, and the illumination of the diaphragm array ( 4 a ) in the illumination beam path also takes place with polarized light, denoted by a polarizer ( 2 a ) preceding the diaphragm array ( 4 a ).
  • a quarter wavelength plate ( 14 ) is provided on the object side of the beam splitter ( 3 ′) and, in a known manner, effects a rotation of 90° in the polarization of the light that is transmitted twice through the quarter wavelength plate ( 14 ).
  • the use of a polarizing beam splitter ( 3 ′) and a quarter wavelength plate ( 14 ) results in a better use, by a factor of four, of the light present behind the condenser ( 2 ), compared to the embodiment according to FIG. 1 a.
  • a corresponding arrangement of polarizing beam splitter, polarizing filter and quarter wavelength plate is also possible in the embodiment with only one diaphragm array according to FIG. 1 a.
  • A first embodiment of a diaphragm array ( 4 , 4 a, 4 b ) is shown in FIG. 3 b.
  • the diaphragm array ( 4 ) contains a number of transparent regions, of which only 20 ( 4 1 - 4 20 ), are shown in FIG. 3 b for reasons of clarity.
  • the spacing of closest neighboring transparent regions is at least 4 times the diameter of the transparent regions.
  • the transparent regions ( 4 1 - 4 20 ) form a two-dimensional rhombic grid.
  • the angle between the two grid axes is chosen so that, taking into account the imaging ratio between the diaphragm array ( 4 , 4 b ) and the TDI sensor ( 11 ), the center of respective closest neighboring transparent regions is imaged on neighboring columns of the TDI sensor ( 11 ).
  • This imagewise arrangement is shown in FIG. 3 c.
  • Each square in FIG. 3 c represents a photosensitive region.
  • the 96 stages are represented in the vertical direction, and a section of the 2,048 columns in the horizontal direction, the columns being denoted by (P 1 , P 2 , P 10 , P 11 ).
  • the transparent region ( 4 1 ) is imaged on the column (P 1 ); the transparent region ( 4 2 ) on the column (P 2 ); and so on, on different columns of the TDI sensor ( 11 ).
  • the transparent regions ( 4 1 - 4 10 ) are imaged on different stages.
  • the transparent region ( 4 11 ) is imaged on the same stage position as the region ( 4 1 ).
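The column/stage assignment just described, where each of ten consecutive transparent regions lands on a different stage and the pattern repeats with region 4 11, can be sketched as a simple index mapping. The group size of 10 follows the example above; everything else is an illustrative assumption:

```python
def rhombic_pinhole_map(n_pinholes, group=10):
    """Sketch of the rhombic-grid mapping: pinhole k is imaged on sensor
    column k, and on a TDI stage index that advances with k and repeats
    after `group` pinholes (as with regions 4_1..4_10, then 4_11).
    Returns (column, stage) pairs; values are illustrative."""
    return [(k, k % group) for k in range(n_pinholes)]

mapping = rhombic_pinhole_map(12)
```

Because every pinhole gets its own column, the confocal spacing between pinholes is preserved while each column of the sensor still receives light.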
  • FIG. 3 a shows the image of the diaphragm array ( 4 ) and the TDI sensor ( 11 ) in the focal plane of the objective ( 7 ), and hence in a sectional plane of the object ( 8 ).
  • the images of the transparent regions of the diaphragm array ( 4 ) are denoted using the same symbols as in FIG. 3 b.
  • Each square that has been drawn represents the image of the associated photosensitive region of the TDI sensor 11 .
  • the linear direction of motion of the scanning table ( 9 ) on the long meander paths is denoted by the arrow (S).
  • The same situation as in FIGS. 3 a - 3 c is shown in principle in FIGS. 4 a - 4 c for an alternative diaphragm array ( 4 ′) (see FIG. 4 b ).
  • the transparent regions correspond, in their diameter and in their spacing from the neighboring transparent regions, to those of FIG. 3 b.
  • These transparent regions are arranged so that a rectangular two-dimensional grid of transparent regions results.
  • the grid axes of the rectangular grid are rotated relative to the scanning direction (arrow S) so that here, as in the previously described embodiment according to FIGS. 3 a - 3 c, a respective transparent region ( 4 1′ - 4 6′ ) is imaged on a respective column of the TDI sensor ( 11 ).
  • In FIG. 4 a, the image of the diaphragm array ( 4 ′) and of the TDI sensor ( 11 ) are again shown in the focal plane of the objective ( 7 ).
  • As shown in FIG. 2 a, the arrangement comprises the object table or stage ( 9 ), consisting of table elements that are displaceable in two mutually perpendicular directions, the motorized drives ( 20 , 21 ), the position measuring systems ( 22 , 23 ), and a microcontroller ( 24 ).
  • the object table itself ( 9 ) is displaceably received, for a focusing in the direction of the optical axis, on a stand (not shown).
  • the two motorized drives ( 20 , 21 ), for producing motion in two orthogonal directions are preferably constituted as linear drives.
  • the position measuring systems ( 22 , 23 ) that sense the motion or deflection of the table ( 9 ) independently of each other in the two mutually perpendicular directions, are constructed as length measuring interferometers.
  • these interferometers provide an intensity of irradiation on a radiation sensor that has a sinusoidal dependence on the path traveled.
  • the period of the sinusoidal signal, which is proportional to the wavelength of the measuring light used, is then directly associated with the distance traveled.
  • a null position is first traveled to, since the measuring signal is ambiguous over long traveled paths and an absolute calibration is required.
  • the present position is then given in relation to this null position by the number of times the interferometer signal passed through zero, together with the phase difference of the detected sine wave signal in the calibration position and the present position.
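The position reconstruction from the counted zero crossings and the phase difference can be sketched as follows. The distance-per-period value is an arbitrary illustration, not a specification of the measuring light actually used:

```python
import math

def interferometer_position(n_periods, phase_rad, distance_per_period):
    """Position relative to the calibrated null position (sketch): the
    number of full signal periods (counted via zero crossings of the sine
    signal), plus the fractional period given by the phase difference,
    times the travel distance one period corresponds to (proportional to
    the wavelength of the measuring light)."""
    return (n_periods + phase_rad / (2.0 * math.pi)) * distance_per_period

# hypothetical example: 5 full periods plus a quarter period,
# 0.316 um of travel per signal period
pos_um = interferometer_position(5, math.pi / 2.0, 0.316)
```

The null-position calibration fixes the integer period count; the phase term then resolves position to a fraction of a period.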
  • the microcontroller ( 24 ) controls the drives ( 20 , 21 ) of the object table ( 9 ) corresponding to the present position values that are supplied by the measuring systems ( 22 , 23 ), and to the reference position values that are determined by a host computer (not shown) via a bus line ( 29 ).
  • FIG. 2 b shows (on a larger scale), the controller circuit required for this purpose within the microcontroller 24 .
  • the data supplied via the control bus (for example, a CAN bus) are converted in an arithmetic logic unit (ALU) ( 33 ) into the present reference positions.
  • the values determined in the ALU ( 33 ) are respectively subtracted from the values supplied from the two measuring systems ( 22 , 23 ), so that the difference represents the amount of deviation between the actual position and the reference position.
  • This difference is integrated over time in an integrator ( 34 ) and then multiplied in a unit ( 35 ) by a factor that gives the amplification of the open control circuit. This factor is as a rule negative, in order to effect a phase displacement of 180°.
  • This amplified and time integrated difference signal then represents the drive signal for the drives ( 20 , 21 ).
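The control law just described, integrating the position deviation over time and multiplying by a negative open-loop gain, can be sketched as a closure per axis. The gain and time-step values are illustrative assumptions, not the actual controller parameters:

```python
def make_axis_controller(gain=-2.0, dt=0.001):
    """Integral controller as described: the deviation (actual position
    minus reference position) is integrated over time, then multiplied by
    the open-loop gain, which is as a rule negative so as to effect the
    180 degree phase displacement. Gain and dt are illustrative."""
    integral = [0.0]  # mutable integrator state
    def step(actual, reference):
        integral[0] += (actual - reference) * dt
        return gain * integral[0]  # drive signal for the motor drive
    return step

# one controller per orthogonal axis; dt=1.0 chosen for a simple demo
ctrl_x = make_axis_controller(gain=-2.0, dt=1.0)
```

A persistent positive deviation drives the integrator up and, through the negative gain, produces an increasingly negative drive signal that pushes the table back toward the reference position.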
  • the values of the present reference positions in the two mutually perpendicular directions are simultaneously passed on by the ALU ( 33 ) via data leads ( 30 , 31 ) to a further microcontroller ( 28 ), a drive ( 27 ) for the reading out, or the cycle timing, of the TDI sensor ( 11 ), and an image processing electronics ( 25 ).
  • the drive ( 27 ) (driven by the microcontroller ( 24 )), effects a displacement of the charges stored in the TDI sensor corresponding to the travel of each image point on the TDI sensor ( 11 ).
  • the charge data read out from the TDI sensor ( 11 ) are digitized by an A/D converter ( 26 ) and are then also passed on to the image processing electronics ( 25 ).
  • the image processing electronics ( 25 ) obtains the information for which table position the radiation intensities recorded with the TDI sensor are to be entered into the image to be produced.
  • the electronics takes into consideration the delays caused by the systematic properties of the TDI sensor. Should the table be located at a position outside the region to be sensed by the recording, the values given by the TDI sensor are disregarded.
  • the image processing electronics first carries out a restoration of the recording. Constant and linear errors (that can arise, for example, due to changes of the radiation intensity, or due to deviations of the dimensions of the transparent regions within the diaphragm array, or deviations of the table speed from the reference speed, or different sensitivity characteristics of the pixels of the TDI sensor) are thereby compensated. After such constant or linear errors are compensated, the structures of the object (for example, of the illuminated wafer) can be suppressed somewhat by suitable filtering, in order to better establish the existence of errors between the dies.
  • the portions of the recording that are to be compared with each other are brought to cover one another, with pixel accuracy, taking into account errors in the table system.
  • the portions of the recording to be compared are then subtracted one from another, the die-to-die comparison is carried out, and defects such as contaminating particles are detected exclusively by threshold formation.
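The subtract-and-threshold die-to-die comparison on already registered recordings can be sketched minimally as follows; the pixel values and the threshold are arbitrary illustrations:

```python
def die_to_die_defects(die_a, die_b, threshold):
    """Die-to-die comparison per the description: the two recordings
    (already brought to cover one another with pixel accuracy) are
    subtracted pixel by pixel, and defects are flagged purely by
    thresholding the absolute difference (threshold is illustrative)."""
    return [
        [abs(a - b) > threshold for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(die_a, die_b)
    ]

# hypothetical 2x2 recordings of two nominally identical dies
reference_die = [[10, 10], [10, 10]]
test_die      = [[10, 12], [10, 40]]
defect_mask = die_to_die_defects(reference_die, test_die, threshold=5.0)
```

Small residual differences (here the value 12) stay below the threshold, while a genuine deviation (the value 40, e.g. a contaminating particle) is flagged.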
  • the nominal desired speed and the course of the table are predetermined by the host computer.
  • the microcontroller calculates, from the specified speed, the reference position of the table and the cycle time according to which the table is regulated, and the cycle times are set for the drive ( 27 ) for the TDI sensor and for the image processing electronics.
  • alternatively, the cycle times for reading out the TDI sensor and for the image processing electronics are set directly by the host computer.
  • in that case, the reference position is not passed on via the data leads ( 30 , 31 ); instead, the momentary actual positions are passed on to the image processing electronics ( 25 ).
  • the image recording of a large object field takes place by an object table motion of meander form, in which the long motion is oriented so that the image points travel in the direction of the 96 stages of the TDI sensor.
  • the motion then takes place at a constant speed over the image region to be recorded.
  • at the end of a long path, a displacement of the table takes place in the direction perpendicular to it, so that when scanning the next long meander path the neighboring object regions are imaged on the TDI sensor. Scanning then takes place in the opposite direction, wherein at the same time the direction of the charge transport between the storage elements of the TDI sensor is reversed.
  • it is advantageous for this purpose that the TDI sensor has bidirectional scanning properties, so that the charges are displaceable in the two opposite directions.
  • the sensor can be, for this purpose, an IT-F2 type from DALSA, Inc.
  • the frequency that is predetermined by the host computer or by the clock ( 36 ) of the microcontroller ( 24 ) is determined so that the object table is moved at the maximum speed possible for a readout of the TDI rows with the maximum frequency, while taking into account the imaging scale and the image drift.
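The fastest table speed compatible with the maximum TDI row-readout frequency and the imaging scale can be estimated as sketched below; the line rate, pixel pitch, and magnification are illustrative values, not the specifications of the sensor named above:

```python
def max_table_speed(max_line_rate_hz, pixel_pitch_um, magnification):
    """Fastest uniform stage speed at which the TDI rows can still be
    clocked at their maximum frequency: the image in the sensor plane may
    advance at most one pixel pitch per line period, and the stage moves
    slower than its image by the imaging scale (illustrative sketch that
    ignores the image-drift allowance mentioned in the text)."""
    image_speed_um_s = max_line_rate_hz * pixel_pitch_um
    return image_speed_um_s / magnification  # um/s at the object

# hypothetical example: 10 kHz line rate, 13 um pixels, 20x objective
speed_um_s = max_table_speed(10_000.0, 13.0, 20.0)
```

Changing the objective (and hence the imaging scale) changes this limit, which is why the synchronized speeds must be rematched when the revolving nosepiece position is changed.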
  • a change of the objective ( 7 ) is required to change the imaging scale.
  • this takes place by means of a coded revolving nosepiece, where the scale data of the objectives belonging to the positions of the revolving nosepiece are stored in a memory.
  • the mutually synchronized speeds of the TDI sensor readout and of the object table can then be matched anew whenever the revolving nosepiece position is changed.
  • a change in the imaging scale is associated with a change in the diaphragm array, since the diameter of the transparent regions remains matched to the size of the Airy disk, which depends on the numerical aperture of the objective.
  • A particularly advantageous arrangement of a TDI sensor in combination with the present invention is shown in FIG. 5 a. The sensor array consists of several partial sensors ( 38 , 39 , 40 ) arranged one behind the other in the stage direction and mutually offset, perpendicularly to the column direction, by a distance Δ=d/n, where d is the pixel spacing and n is the number of partial sensors. The diaphragm array ( 41 ) is imaged anamorphotically on this sensor arrangement.
  • the transparent regions lying in the first two rows (Z 1 , Z 2 ) of the diaphragm array ( 41 ) are then imaged on the first partial sensor ( 38 ); the two succeeding rows (Z 3 , Z 4 ) are imaged on the second partial sensor ( 39 ); and so on.
  • This anamorphotic imaging is shown in FIG. 5 a by the oval images of the circular transparent regions of the diaphragm array ( 41 ).
  • the offset arrangement of several partial sensors makes it possible to image the transparent regions that are imaged on each partial sensor as right-angled, partial grids directed parallel to the rows and columns of the partial sensors.
  • the partial grids are mutually offset in correspondence with the mutual offset of the partial sensors, so that the whole image field is sensed without gaps when the image data of the partial TDIs are correspondingly sorted to obtain the correct sequence.
  • Several transparent regions can thereby be imaged on one column of each partial sensor at different stage positions, resulting in an improved signal/noise ratio.
  • two transparent regions are imaged on each pixel position at correspondingly offset stage positions of the same partial sensor ( 38 ).
  • the use of only two transparent regions per pixel position serves only for illustration.
  • the number of the transparent regions can be chosen to correspond to the number of partial sensors ( 38 , 39 , 40 ), so that with 9 partial sensors an amount of light per pixel results that is greater by a factor of 9 than in the embodiments according to FIGS. 3 a - 3 c and 4 a - 4 c, so that with the same signal/noise ratio the scanning of the object can take place at 9 times the speed.
  • a sensor array of several partial sensors may also be put to use in combination with a normal, non-anamorphotic imaging of the diaphragm array on the sensor array. In this case, only a portion of the columns of the partial sensors contributes to the formation of the image.
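The synchronization condition described above — the table speed limited by the maximum TDI row readout frequency and the imaging scale — can be sketched as a small calculation. This is an illustrative sketch only; the pixel pitch, line rate, and magnification values below are assumptions, not figures from this disclosure.

```python
def max_table_speed(line_rate_hz, pixel_pitch_m, magnification):
    """Maximum object-table speed at which the image of an object point
    advances exactly one TDI stage (one pixel pitch) per line period.

    The image-plane speed must equal pixel_pitch * line_rate; dividing by
    the imaging scale maps this back to the object plane."""
    return pixel_pitch_m * line_rate_hz / magnification

# Illustrative numbers (assumed): 13 um pixel pitch, 50 kHz line rate,
# 20x objective magnification
v = max_table_speed(50e3, 13e-6, 20)   # 0.0325 m/s at the object
```

At higher magnification the same readout frequency thus permits only a proportionally lower table speed, which is why a change of the objective requires the speeds to be re-matched.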

Abstract

A confocal microscope has a motorized scanning table for moving the sample perpendicularly to the optical axis of the microscope. The object is illuminated simultaneously at many places by means of a light source array. The light reflected or scattered at the object is detected by means of a diaphragm array, which is conjugate to the object and to the light source array. A sensor array is provided as a detector and makes a displacement of charges possible between individual positions in the scanning direction. The sensor is a so-called TDI sensor. The displacement of the charges is synchronized with the motion of the object corresponding to the motion of the image points in the plane of the sensor array. The image data can thereby be recorded during the motion of the object, so that even large object fields can be sensed in a short time with high lateral resolution. The motion of the object takes place along linear paths (if necessary linear paths combined in a meander form) and the motion along the linear paths takes place uniformly. The microscope is particularly suitable for inspection in the semiconductor industry (wafer inspection, LCD inspection).

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • This invention relates to a confocal microscope and, more particularly, to a confocal microscope with a motorized scanning table for moving a sample perpendicularly to the optical axis of the microscope. [0002]
  • 2. Discussion of Prior Art [0003]
  • A confocal microscope with a motorized scanning table to move a sample perpendicularly to the optical axis of the microscope is known from U.S. Pat. No. 5,239,178. Furthermore, the microscope has a light source array in a plane conjugate to the focal plane of an objective, and a detector array with numerous light-sensitive elements, also in a plane conjugate to the focal plane of the microscope objective. The movement of the specimen perpendicularly to the optical axis of the microscope takes place primarily in the microscopic region in order to increase the resolution, otherwise defined by the raster spacing of the light source array, perpendicular to the optical axis. [0004]
  • With this confocal microscope, sensing large object fields that are substantially greater than the visual field imaged by the objective is only possible to a limited extent. A series of individual images of the object must be recorded. Between each individual image, the object must be displaced over a path length corresponding to the image field diameter. [0005]
  • A Nomarski microscope (not confocal) is designed for taking and storing corresponding series of images, and is described, for example, in European Patent EP 0 444 450-A1. Since this Nomarski microscope is not confocal, it has only a small resolution in the direction of the optical axis. Furthermore, this microscope is much too slow when the image data in a large number of image fields must be sensed. The sensing of large object fields in the shortest possible time, with high resolution, is imperative in inspection equipment used in production processes, for example, in the semiconductor industry or in LCD production. [0006]
  • A microscope used for wafer inspection, also not confocal, is described in U.S. Pat. No. 5,264,912. In it, filtering takes place in the Fourier plane of the objective. The transmission characteristic of the spatial filter in the Fourier plane corresponds to the inverse diffraction figure of the integrated circuit (IC) that is being produced. Consequently, the filter transmits light only when the diffraction image of the momentarily imaged IC deviates from the diffraction image of the reference IC, and it can be concluded that the structure of the observed IC deviates from the reference structure. In this microscope, a CCD array or, alternatively, a high speed multiple output time delay integration (TDI) sensor is provided as the light detector. However, the reason for using a TDI sensor is not stated. Furthermore, because of the non-confocal arrangement, this microscope also has only a small resolution in the direction of the optical axis. [0007]
  • Furthermore, U.S. Pat. No. 5,365,084 includes an arrangement for inspecting a running length of fabric during its manufacture, in which a TDI sensor is used, synchronized with the motion of the length of fabric. However, such a video inspection device cannot be considered for inspecting semiconductors in a production process, because of its low resolution both in the direction of the optical axis and perpendicular to the optical axis. [0008]
  • SUMMARY OF THE INVENTION
  • The object of the present invention is to provide an arrangement that can be used for the optical inspection of semiconductors in the production process. With this arrangement, a further object is to achieve a sufficient resolution both in the direction of, and also perpendicular to, the optical axis. At the same time, an object is to sense large image fields in the shortest possible time. These objects are achieved by a confocal microscope including: [0009]
  • A motorized scanning table for moving an object at right angles to the optical axis of the microscope; [0010]
  • A diaphragm array in a plane that is conjugate to the focal plane of the microscope objective; [0011]
  • A sensor array following the diaphragm array in an observation direction with a plurality of photosensitive elements, charge storage elements associated with the photosensitive elements, and a device for displacing charges stored in the charge storage elements from one storage element to another storage element; and [0012]
  • A synchronizing unit for effecting displacement of the charges corresponding to motion of an image point of an object point in a plane of the sensor array. [0013]
  • The arrangement according to the invention is a confocal microscope with a motorized scanning table to move the specimen perpendicular to the optical axis of the microscope. It has a diaphragm array with numerous light transmitting regions, so-called pinholes, in a plane that is conjugate to the focal plane of the microscope objective. The diaphragm array is followed by a sensor array that has numerous photosensitive elements. Each photosensitive element is associated with a charge storage element. Furthermore, the sensor array has a device for displacing the charges stored in the charge storage elements from one storage element to another storage element, as is the case in so-called TDI sensors. Furthermore, a synchronizing unit is provided that effects a displacement of the charges corresponding to the movement of the image point of a specimen point in the plane of the sensor array. [0014]
  • In the confocal microscopic arrangement, high resolution both in the direction of the optical axis and perpendicular to the optical axis, which is usual for confocal microscopes, is attained. The resolution that can be attained by using a strong magnifying objective, for example, one having a magnification of 20-120 times, is sufficient for semiconductor inspection. By using a diaphragm array, and the numerous parallel confocal beam paths associated with the diaphragm array, a number of object positions is sensed that correspond to the number of pinholes in the diaphragm array. By synchronizing the displacement of the charges in the sensor array corresponding to the motion of the image point of an object point, the measurement takes place while the sample is in motion. Preferably, the motion of the sample takes place along linear paths that extend over the complete length of the sample in the direction of motion. For sensing large, two-dimensional surfaces, corresponding linear paths can be combined in a meander form. Short acceleration or deceleration segments, during which no signal recording takes place, occur respectively at the start and at the end of each linear path. Between these acceleration and deceleration segments, the motion of the sample is uniform, so that the movement of charge between the storage elements of the sensor array and the motion of the image point on the sensor array are mutually synchronized. [0015]
  • In order to produce the parallel confocal beam paths, a light source array that has numerous mutually spaced-apart light sources is arranged in a plane conjugate to the focal plane of the objective. The positions of the individual light sources are conjugate to the positions of the transparent regions of the diaphragm array. Corresponding light source arrays can be formed in different ways. The simplest variant results when the diaphragm array is arranged in a common portion of the illumination and observation beam paths, and the diaphragm array is illuminated from the back. However, this simple arrangement has a disadvantage, in that a substantial portion of the illuminating light is reflected at the back side of the diaphragm array and thus produces a strong signal background on the sensor array. Such a strong signal background can be prevented by providing two separate diaphragm arrays, one in the illuminating beam path and the other in the observation beam path or measuring beam path. The diaphragm array in the illumination beam path is then again illuminated from the back. For a more effective use of light, the diaphragm array in the illumination beam path can be preceded by a lens array, as described in U.S. Pat. No. 5,239,178. In an alternative to using diaphragm arrays illuminated at the back, the light source array can also be formed by light-conducting fibers with their end surfaces arranged in an array. Likewise, as an alternative to a lens array, a correspondingly constructed diffractive element may be used. [0016]
  • As the sample is scanned, the diaphragm array, the light source array, and the sensor array are at rest. All three components are mutually stationary. [0017]
  • Preferably, the sensor array is a two-dimensional array of photosensitive elements and charge storage elements associated with the photosensitive elements, with numerous columns arranged parallel to each other. The direction of the columns is then defined by the direction in which the charges are displaced between the charge storage elements. On the one hand, the light source array and diaphragm array, and on the other hand, the sensor array, are arranged relative to each other so that at least one transparent region of the diaphragm array is imaged on each of the mutually parallel columns of the sensor array. [0018]
  • TDI sensors may be used as the corresponding sensor array. To the extent that such TDI sensors have light-insensitive regions between the photosensitive surfaces, these can be arranged, and the imaging between the diaphragm array and the sensor can be chosen so that the transparent regions of the diaphragm array are exclusively imaged on the photosensitive regions. [0019]
  • The transparent regions of the diaphragm array are formed, corresponding to the direction of motion of the scanning table and to the imaging ratio between the object plane and the diaphragm array, so that the paths of the images of all the transparent regions closely fill, preferably without a gap, a portion of the focal plane, while maintaining the confocal filtering. With linear, one-dimensional scanning of the object, the image data for a strip whose width corresponds to the width of the image section sensed perpendicularly to the direction of motion are sensed completely confocally, without micro-movements perpendicular to the direction of motion being required. For this purpose, the transparent regions of the diaphragm array may be arranged in the form of a two-dimensional rhombic grid. The midpoint of each transparent region then corresponds to the position of the theoretical grid point. However, it is particularly advantageous to arrange the transparent regions of the diaphragm array in the form of a rectangular grid, the grid axes of which are rotated relative to the linear direction of motion. Such a rectangular geometry confers advantages when the light source array is formed as a fiber illumination, a lens array, or a diffractive element producing a corresponding illumination. [0020]
  • A particularly advantageous sensor array has several mutually independent two-dimensional partial sensor arrays that are arranged one behind the other in the column or stage direction, and that are respectively offset, perpendicularly to the column direction or stage direction, by a distance Δ=d/n from each other, where d is the spacing of the individual sensors perpendicularly to the column direction and n is the number of two-dimensional partial arrays. With anamorphotic imaging of the diaphragm array on the sensor array, such an offset arrangement of several two-dimensional sensor arrays has an image field that is larger, by a factor equal to the number of two-dimensional arrays, than an arrangement of a single sensor array with the same number of photosensitive elements, so that a correspondingly larger signal/noise ratio results. [0021]
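The offset Δ=d/n of the partial sensor arrays can be sketched as follows. This is a minimal illustration of the geometry only, with arbitrary example numbers; it shows that the n mutually offset column grids together sample the row direction at the uniform finer pitch d/n, i.e. without gaps.

```python
def partial_sensor_offsets(d, n):
    """Lateral offsets of n partial sensor arrays arranged one behind the
    other, each shifted perpendicularly to the column direction by
    delta = d / n relative to its predecessor (d = sensor spacing)."""
    delta = d / n
    return [k * delta for k in range(n)]

def tiles_row_without_gaps(d, n):
    """Check that the n offset column grids, taken together, sample the
    row direction at the uniform finer pitch d / n, i.e. gap-free."""
    positions = sorted(off % d for off in partial_sensor_offsets(d, n))
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    return all(abs(g - d / n) < 1e-9 for g in gaps)
```

Sorting the image data of the partial sensors back into the correct sequence, as described above, then reassembles the gap-free image field.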
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Details of the invention are described in further detail herein below taken together with the accompanying drawings, in which: [0022]
  • FIG. 1a comprises a schematic of the principles of a first embodiment of the invention, with a single pinhole array arranged in the common portion of the illuminating and observation beam paths; [0023]
  • FIG. 1b shows a second embodiment of the invention with separate light source array and diaphragm array; [0024]
  • FIG. 1c is a schematic explaining the principle of the synchronization between object motion and charge displacement in the sensor array; [0025]
  • FIG. 2a is a block circuit diagram for the synchronization between the object motion and the charge displacement in the sensor array; [0026]
  • FIG. 2b is a detailed representation of the functioning sequence in the microcontroller of FIG. 2a; [0027]
  • FIGS. 3a-3c show sections of a diaphragm array forming a rhombic grid and the associated image points in the object plane and in the plane of the sensor array; [0028]
  • FIGS. 4a-4c show sections of a diaphragm array forming a rectangular grid and the associated image points in the object plane and the plane of the sensor array; [0029]
  • FIG. 5a shows a schematic representation of a sensor array consisting of several two-dimensional partial sensor arrays that are arranged mutually offset; [0030]
  • FIG. 5b is a schematic of the principle of a pinhole array suitable for the sensor array of FIG. 5a. [0031]
  • DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • In the confocal microscope according to the invention shown in FIG. 1a, a single diaphragm array (4) with numerous transparent regions or holes is arranged in the common portion of the illumination and observation beam paths. This arrangement forms, at one and the same time, the diaphragm array for the detection beam path and the light source array for the illumination of the object (8). For this purpose, the diaphragm array (4) is uniformly illuminated from the back side by a light source (1) that is followed by a condenser (2). Each transparent region, or each pinhole, of the diaphragm array (4) thus forms a secondary light source. [0032]
  • A tube lens (5) with a microscope objective (7) is arranged after the diaphragm array (4) in order to image the diaphragm array (4) on the object (8) positioned on the motorized scanning table (9). The microscope objective (7) is shown, greatly simplified, as a single lens in FIG. 1a. The microscope objective (7) is corrected to an infinite focal intercept, and thus to an infinite image distance. This is indicated in FIG. 1a by the telecentering diaphragm (6). [0033]
  • The diaphragm array (4), and thus also the light source array imaged by the diaphragm array (4), is arranged, by means of the tube lens (5) and the telecentric imaging, to be conjugate to the focal plane of the objective (7). A pattern of illumination corresponding to the image of the diaphragm array (4) arises in the focal plane of the objective (7). The object (8) is illuminated at the points that are conjugate to the transparent regions of the diaphragm array (4). The light scattered or reflected by the object (8) is imaged backward again, by the objective (7) with the subsequent tube lens (5), onto the diaphragm array (4). In this backward imaging, the diaphragm array (4) effects confocal filtering, resulting in only such light being transmitted through the transparent regions of the diaphragm array (4) as was scattered or reflected in regions of the object (8) that are conjugate to the transparent regions of the diaphragm array (4). In contrast to this, the light that is scattered or reflected on the object (8) above or below the focal plane of the objective (7) is trapped by the non-transparent regions of the diaphragm array (4). This confocal microscope results in high resolution in the direction of the optical axis (z-direction), denoted by a dot-dash line. For separating the illumination and observation beam paths, a beam-splitter mirror (3) is arranged between the diaphragm array (4) and the condenser (2), and a portion of the light scattered or reflected at the object (8) and transmitted through the diaphragm array (4) is reflected out towards the sensor array (11). A further imaging optics (10) that images the diaphragm array (4) on the sensor array (11) is provided in the reflected beam path; that is, between the beam-splitter mirror (3) and the sensor array (11). The sensor array (11) is a so-called TDI sensor (Time Delay and Integration), such as is offered, for example, by DALSA Inc., Ontario, Canada, under the reference IT-E1 or IT-F2.
Such a TDI sensor has 2048 columns, each with 96 TDI stages or rows. A photosensitive region and a charge storage element are associated with each TDI stage in each column, so that the number of pixels (photosensitive regions) and charge storage elements amounts to 96×2048. The diaphragm array (4) has at least a number of transparent regions corresponding to the number of columns of the TDI sensor, so that at least one transparent region of the diaphragm array (4) is imaged on each column of the TDI sensor. The detailed imagewise arrangement of the pixels of the TDI sensor and of the transparent regions is described in more detail herein below with reference to FIGS. 3a-3c and 4a-4c. [0034]
  • The scanning table (9) can be moved by motor drive in two directions perpendicular to the optical axis, so that large object regions can be sensed. Its motion is sensed by means of two position measuring systems (12). The summed charges in the charge storage elements of the TDI sensor (11) are displaced in the stage direction by means of a synchronization unit (13), corresponding to the motion of the scanning table (9). For this purpose, the motion of the scanning table takes place along (possibly several) linear paths of movement, so that on the TDI sensor (11) the image point belonging to an object point is displaced along the columns. This state of affairs will be explained with reference to the simplified representation of FIG. 1c. Suppose that, at a first instant, an object point (8 a) is imaged at an image point (11 a) on the TDI sensor (11). Due to the motion of the scanning table (9), a motion of the object (8) results in the direction of the arrow (P1), and at a somewhat later instant the object point (8 a) has traveled to position (8 b). Simultaneously with the motion of the object (8), the charges stored in the charge storage elements of the TDI sensor (11) are displaced in the direction of the arrow (P2) from the stage (11 a) to the stage (11 b). Measurement can proceed during the motion of the object (8) due to this synchronization between the motion of the object (8) and the motion of the charges. The motion of the object (8) therefore does not take place in start-stop operation but uniformly during the measurement. Substantially shorter measurement times are attained at the same signal/noise ratio, compared to arrangements in which the object motion takes place in start-stop operation and a measurement takes place only when the object is stationary. [0035]
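The synchronization between object motion and charge displacement can be illustrated with a toy model of a single TDI column: at every clock step the charges shift one stage toward readout, in step with the moving image, while each stage adds a new exposure of the line currently imaged on it. This sketch is for illustration only; noise, transfer efficiency and real sensor clocking are not modelled.

```python
def tdi_scan(lines, n_stages):
    """Minimal model of TDI operation for one column.  Each object line
    is integrated n_stages times, once per stage it passes over, because
    the charge shift is synchronized with the image motion."""
    stages = [0.0] * n_stages          # charge per TDI stage
    output = []
    for step in range(len(lines) + n_stages - 1):
        # expose: object line (step - k) is currently imaged on stage k
        for k in range(n_stages):
            i = step - k
            if 0 <= i < len(lines):
                stages[k] += lines[i]
        # shift: last stage is read out, an empty stage enters at the top
        output.append(stages.pop())
        stages.insert(0, 0.0)
    return output
```

Each object line thus appears in the output with n_stages times its intensity, which is the signal/noise advantage of TDI over a single-row line sensor at the same table speed.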
  • The complete scanning of the object field at right angles to the direction of motion of the scanning table (9) takes place through an arrangement of the transparent regions that is offset at right angles to the direction of motion. In combination with the synchronization of the charge displacement in the sensor array corresponding to the motion of the image point of an object point, the whole object field, which corresponds to the row width of the sensor array, is sensed. Due to the offset arrangement of the diaphragms in the diaphragm array, the paths of the image points of the diaphragms lie close together, without gaps, in the focal plane of the objective (7). Complete sensing of the image field is possible without any micro-displacements at right angles to the direction of motion. This reduces the costs of data storage (data sorting) and reduces the tolerance requirements on the motion of the scanning table. [0036]
  • In the embodiment according to FIG. 1b, components corresponding to the individual components of the embodiment according to FIG. 1a are referenced with the same symbols as in FIG. 1a. The difference between the embodiment according to FIG. 1a and that according to FIG. 1b is that the diaphragm array (4 b) is arranged following the beam splitter (3′) in the observation beam path or the detection beam path. The illumination beam path has its own diaphragm array (4 a), which forms the light source array. The two diaphragm arrays (4 a) and (4 b) are arranged conjugate to each other and conjugate to the focal plane of the objective (7). The transparent regions of the two diaphragm arrays (4 a) and (4 b) are also mutually conjugate. The use of separate diaphragm arrays (4 a, 4 b) in the illumination and observation beam paths avoids producing a large signal background on the TDI sensor (11) due to the relatively large proportion of light reflected at the diaphragm array (4 a) of the illumination beam path. [0037]
  • In addition, in the embodiment according to FIG. 1b, the beam splitter (3′) is constructed as a polarizing beam splitter, and the illumination of the diaphragm array (4 a) in the illumination beam path also takes place with polarized light, denoted by a polarizer (2 a) preceding the diaphragm array (4 a). In addition, a quarter wavelength plate (14) is provided on the object side of the beam splitter (3′) and, in a known manner, effects a rotation of 90° in the polarization of the light that is transmitted twice through the quarter wavelength plate (14). Using polarized light, a polarizing beam splitter (3′) and a quarter wavelength plate (14) results in a better use, by a factor of four, of the light present behind the condenser (2), compared to the embodiment according to FIG. 1a. However, a corresponding arrangement of polarizing beam splitter, polarizing filter and quarter wavelength plate is also possible in the embodiment with only one diaphragm array according to FIG. 1a. [0038]
  • A first embodiment of a diaphragm array (4, 4 a, 4 b) is shown in FIG. 3b. The diaphragm array (4) contains a number of transparent regions, of which only 20 (4 1-4 20) are shown in FIG. 3b for reasons of clarity. The diameter of each transparent region (4 1-4 20) corresponds to about half the diameter of the Airy disk, and with an objective of numerical aperture NA=0.95 and for a wavelength lambda=365 nm amounts to about 0.25 μm multiplied by the imaging scale between the object (8) and the diaphragm array (4, 4 a, 4 b). In order to obtain the best possible confocal filtering, the spacing of closest neighboring transparent regions is at least 4 times the diameter of the transparent regions. The transparent regions (4 1-4 20) form a two-dimensional rhombic grid. The angle between the two grid axes is chosen so that, taking into account the imaging ratio between the diaphragm array (4, 4 b) and the TDI sensor (11), the centers of respective closest neighboring transparent regions are imaged on neighboring columns of the TDI sensor (11). This imagewise arrangement is shown in FIG. 3c. Each square in FIG. 3c represents a photosensitive region. The 96 stages are represented in the vertical direction, and a section of the 2048 columns in the horizontal direction, the columns being denoted by (P1, P2, P10, P11). As can be gathered from the views of FIGS. 3b and 3c, the transparent region (4 1) is imaged on the column (P1); the transparent region (4 2) on the column (P2); and so on, on different columns of the TDI sensor (11). At the same time, the transparent regions (4 1-4 10) are imaged on different stages. The transparent region (4 11) is imaged on the stage position that corresponds again to the stage position of the region (4 1). [0039]
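The quoted pinhole diameter can be checked with the standard Airy-disk formula (first dark ring at a diameter of 1.22·λ/NA). This is a sanity-check sketch, not a formula stated in the text itself.

```python
def pinhole_diameter_object_side(wavelength_m, numerical_aperture):
    """Half the Airy-disk diameter, referred to the object plane.
    The Airy disk (first dark ring) has diameter 1.22 * lambda / NA."""
    airy_diameter = 1.22 * wavelength_m / numerical_aperture
    return 0.5 * airy_diameter

# NA = 0.95 and lambda = 365 nm as in the text give about 0.23 um,
# consistent with the quoted value of about 0.25 um
d = pinhole_diameter_object_side(365e-9, 0.95)
```

Multiplying this object-side value by the imaging scale gives the physical diameter of the transparent regions in the diaphragm array, as stated above.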
  • FIG. 3a shows the image of the diaphragm array (4) and the TDI sensor (11) in the focal plane of the objective (7), and hence in a sectional plane of the object (8). The images of the transparent regions of the diaphragm array (4) are denoted using the same symbols as in FIG. 3b. Each square that has been drawn represents the image of the associated photosensitive region of the TDI sensor (11). The linear direction of motion of the scanning table (9) on the long meander paths is denoted by the arrow (S). [0040]
  • The same situation as in FIGS. 3a-3c is shown in principle in FIGS. 4a-4c for an alternative diaphragm array (4′) (see FIG. 4b). In this alternative embodiment of the diaphragm array (4′), the transparent regions correspond, in their diameter and their spacing from neighboring transparent regions, to those of FIG. 3b. These transparent regions are arranged so that a rectangular two-dimensional grid of transparent regions results. The grid axes of the rectangular grid are rotated relative to the scanning direction (arrow S) so that here, as in the previously described embodiments according to FIGS. 3a-3c, a respective transparent region (4 1′-4 6′) is imaged on a respective column of the TDI sensor (11). In FIG. 4a, the image of the diaphragm array (4′) and of the TDI sensor (11) are again shown in the focal plane of the objective (7). [0041]
  • The rectangular grid arrangement of the transparent regions confers constructional advantages when the light source array (4 a) is constituted not solely by a diaphragm array that is homogeneously illuminated from the back, but by a diaphragm array with a preceding lens array, a diffractive element, or a preceding fiber array for better illumination of the transparent regions of the diaphragm array (4 a). If the resulting secondary light sources are sufficiently point-like, an illuminating diaphragm array (4 a) may even be dispensed with. [0042]
  • The electronics required for controlling the object motion and the simultaneous synchronization of the charge displacement is now described, with reference to the block circuit diagrams in FIGS. 2a and 2b. [0043]
  • Essentially, the object table or stage (9) consists of table elements that are displaceable in two mutually perpendicular directions, the motorized drives (20, 21), the position measuring systems (22, 23), and a microcontroller (24). The object table (9) itself is displaceably received, for focusing in the direction of the optical axis, on a stand (not shown). The two motorized drives (20, 21), for producing motion in two orthogonal directions, are preferably constituted as linear drives. The position measuring systems (22, 23), which sense the motion or deflection of the table (9) independently of each other in the two mutually perpendicular directions, are constructed as length measuring interferometers. When the table moves in the direction of the measuring beam path of the associated interferometer, these interferometers provide an intensity of irradiation on a radiation sensor that has a sinusoidal dependence on the path traveled. The period of the sinusoidal signal, which is proportional to the wavelength of the measuring light used, is then directly associated with the distance traveled. At the beginning of a measurement, a null position is traveled to, since the measuring signal has ambiguities for long traveled paths and an absolute calibration is required. At each later instant, the present position is then given in relation to this null position by the number of times the interferometer signal has passed through zero, together with the phase difference of the detected sine wave signal between the calibration position and the present position. [0044]
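The position reconstruction from zero crossings and residual phase can be sketched as follows. The fringe period is taken here as wavelength/2 (a double-pass assumption; the text only states that the period is proportional to the wavelength of the measuring light), so this is an illustrative sketch rather than the exact scheme of the described interferometers.

```python
import math

def interferometer_position(n_zero_crossings, phase_rad, wavelength_m):
    """Position relative to the calibrated null position.

    The sinusoidal detector signal has two zero crossings per period, so
    the crossing count gives the number of whole half-periods travelled;
    the residual phase supplies the fraction of a period.  The fringe
    period wavelength / 2 is an assumption, not stated in the text."""
    period = wavelength_m / 2.0
    return (n_zero_crossings / 2.0 + phase_rad / (2.0 * math.pi)) * period
```

With a 633 nm measuring laser, for example, four zero crossings and zero residual phase would correspond to two full fringe periods, i.e. 633 nm of travel under this assumption.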
  • The microcontroller (24) controls the drives (20, 21) of the object table (9) corresponding to the present position values that are supplied by the measuring systems (22, 23), and to the reference position values that are determined by a host computer (not shown) via a bus line (29). FIG. 2b shows, on a larger scale, the controller circuit required for this purpose within the microcontroller (24). The data supplied via the control bus, for example, a CAN bus, are converted in an arithmetic logic unit (ALU) (33) into the present reference positions. In a further ALU (32) that follows, the values determined in the ALU (33) are respectively subtracted from the values supplied from the two measuring systems (22, 23), so that the difference represents the amount of deviation between the actual position and the reference position. This difference is integrated over time in an integrator (34) and then multiplied in a unit (35) by a factor that gives the amplification of the open control circuit. This factor is, as a rule, negative, in order to effect a phase displacement of 180°. This amplified and time-integrated difference signal then represents the drive signal for the drives (20, 21). [0045]
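One regulation cycle of this controller structure can be sketched in a few lines. The gain and time step are illustrative assumptions; only the signal flow (error formation, integration, negative amplification) follows the block diagram described above.

```python
def controller_step(reference, actual, integral, dt, gain=-0.5):
    """One control cycle following the described block diagram: the
    deviation between actual and reference position (ALU 32) is
    integrated over time (integrator 34) and multiplied by the open-loop
    gain (unit 35), which is as a rule negative.  Gain and time step
    are illustrative assumptions."""
    error = actual - reference
    integral = integral + error * dt
    drive = gain * integral
    return drive, integral

# a positive position error produces a negative (restoring) drive signal
drive, integ = controller_step(reference=0.0, actual=1.0, integral=0.0, dt=0.1)
```

The negative gain realizes the 180° phase displacement mentioned in the text: the drive always counteracts the accumulated position error.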
  • The values of the present reference positions in the two mutually perpendicular directions are simultaneously passed on by the ALU (32) via data leads (30, 31) to a further microcontroller (28), to a drive (27) for the reading out, or cycle timing, of the TDI sensor (11), and to an image processing electronics (25). The drive (27), driven by the microcontroller (24), effects a displacement of the charges stored in the TDI sensor corresponding to the travel of each image point on the TDI sensor (11). The charge data read out from the TDI sensor (11) are digitized by an A/D converter (26) and are then also passed on to the image processing electronics (25). In this manner, the image processing electronics (25) obtains the information indicating for which table position the radiation intensities recorded with the TDI sensor are to be entered into the image to be produced. Here, the electronics takes into consideration the delays that are caused by the systematic properties of the TDI sensor. Should the table be located at a position outside the region to be sensed by the recording, the values supplied by the TDI sensor remain unconsidered. [0046]
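The synchronization rule for the drive (27) is that one TDI row shift is due each time the image of an object point has traveled one pixel pitch on the sensor, i.e. each time the table has moved the pixel pitch divided by the imaging scale. A sketch with illustrative names and parameters (not from the patent):

```python
def tdi_shifts_for_motion(position, pixel_pitch, magnification, shifts_done):
    """Number of new TDI row shifts due at the given table position.

    position:      table travel since the scan start (object side)
    pixel_pitch:   sensor pixel spacing in the stage direction
    magnification: imaging scale from object plane to sensor plane
    shifts_done:   row shifts already issued in earlier cycles
    """
    step = pixel_pitch / magnification   # object-side travel per row shift
    due = int(position // step)          # total shifts due so far
    return due - shifts_done             # new shifts to issue this cycle
```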
  • The image processing electronics first carries out a restoration of the recording. Constant and linear errors (which can arise, for example, from changes of the radiation intensity, from deviations of the dimensions of the transparent regions within the diaphragm array, from deviations of the table speed from the reference speed, or from different sensitivity characteristics of the pixels of the TDI sensor) are thereby compensated. After such constant or linear errors are compensated, the structures of the object (for example, of the illuminated wafer) can be suppressed somewhat by suitable filtering, in order to better establish the existence of errors between the dies. [0047]
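The compensation of constant and linear errors can be pictured as a per-pixel dark/flat correction; the patent does not specify the exact restoration algorithm, so the model below is an assumption:

```python
def restore(raw, dark, flat):
    """Compensate constant (offset) and linear (gain) per-pixel errors.

    dark: signal recorded with illumination off (constant error per pixel)
    flat: normalized response of each pixel to uniform illumination,
          capturing e.g. pinhole-diameter and pixel-sensitivity variations
    """
    return [(r - d) / f for r, d, f in zip(raw, dark, flat)]
```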
  • In order to carry out a so-called die-to-die comparison, the portions of the recording that are to be compared are brought to cover one another with pixel accuracy, taking into account errors in the table system. The portions of the recording to be compared are then subtracted one from another, the die-to-die comparison is carried out, and defects such as contaminating particles are detected solely by threshold formation. [0048]
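For registered (pixel-accurate) die images, the comparison then reduces to a subtraction and a threshold. A sketch with illustrative names, operating on flattened pixel lists:

```python
def die_to_die_defects(die_a, die_b, threshold):
    """Flag pixel indices where two registered die images differ.

    Defects such as contaminating particles appear in one die but not
    the other, so they survive the subtraction and exceed the threshold.
    """
    return [idx for idx, (a, b) in enumerate(zip(die_a, die_b))
            if abs(a - b) > threshold]
```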
  • With reference to FIGS. 2a and 2b, in the control circuit described above, the nominal desired speed and the course of the table are predetermined by the host computer. With the aid of the clock (36) built into the microcontroller (24), the microcontroller calculates from the specified speeds the reference position of the table and the cycle time according to which the table is regulated, and the cycle times are set for the drive (27) of the TDI sensor and for the image processing electronics. As an alternative, the cycle times for reading out the TDI sensor and for the image processing electronics are set directly by the host computer. In this case, it is not the reference positions that are passed on via the data leads (30, 31), but the momentary actual positions that are passed on to the image processing electronics (25). [0049]
  • Preferably, the image recording of a large object field takes place with an object table motion of meander form, in which the long motion is oriented so that the image points travel in the direction of the 96 stages of the TDI sensor. The motion then takes place at a constant speed over the image region to be recorded. After the object has been scanned in one direction, the table is displaced in the direction perpendicular to this, so that when the next long meander path is scanned, the neighboring object regions are imaged on the TDI sensor. Scanning then takes place in the opposite direction, wherein at the same time the direction of the charge transport between the storage elements of the TDI sensor is reversed. Here it is of course required that the TDI sensor have bidirectional scanning properties, so that the charges are displaceable in the two opposite directions. For this purpose, the sensor can be an IT-F2 type from DALSA, Inc. [0050]
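The meander (boustrophedon) course of the table can be sketched as a generator; reversing the column direction on every other row corresponds to reversing the TDI charge transport direction. An illustrative sketch, not the patent's scan controller:

```python
def meander_rows(n_rows, n_cols):
    """Yield (row, col) scan positions along a meander path.

    The long axis (col) is the TDI stage direction; after each long
    pass the row advances and the scan direction is reversed.
    """
    for row in range(n_rows):
        cols = range(n_cols) if row % 2 == 0 else range(n_cols - 1, -1, -1)
        for col in cols:
            yield row, col
```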
  • The frequency that is predetermined by the host computer or by the clock (36) of the microcontroller (24) is chosen so that the object table is moved at the maximum speed possible for a readout of the TDI rows at the maximum frequency, while taking into account the imaging scale and the image drift. [0051]
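With a maximum TDI line rate, a sensor pixel pitch, and an imaging scale, the maximum table speed follows directly (neglecting the image-drift margin mentioned above; the values in the test are purely illustrative):

```python
def max_table_speed(line_rate_hz, pixel_pitch, magnification):
    """Maximum object-side table speed for TDI readout at the given line rate.

    Each line period the image must advance one pixel pitch on the sensor,
    i.e. pixel_pitch / magnification on the object side.
    """
    return line_rate_hz * pixel_pitch / magnification
```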
  • A change of the objective (7) is required to change the imaging scale. Preferably, this takes place by means of a coded revolving nosepiece, where the scale data of the objectives belonging to the positions of the revolving nosepiece are stored in a memory. The mutually synchronized speeds of the TDI sensor readout and the object table can then also be rematched when the revolving nosepiece position is changed. [0052]
  • As a rule, a change in the imaging scale is associated with a change in the diaphragm array, since the diameter of the transparent regions remains matched to the size of the Airy disk, which depends on the numerical aperture of the objective. [0053]
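The matching condition is that the pinhole diameter track the Airy disk diameter, which for wavelength λ and numerical aperture NA is given by the standard diffraction formula 1.22·λ/NA; the values in the test are illustrative:

```python
def airy_diameter(wavelength, numerical_aperture):
    """Diameter of the Airy disk (first dark ring) for a given objective.

    The transparent regions of the diaphragm array are kept matched to
    this diameter, which is why a change of objective (and hence of NA)
    generally requires a change of the diaphragm array.
    """
    return 1.22 * wavelength / numerical_aperture
```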
  • A particularly advantageous arrangement of a TDI sensor in combination with the present invention is shown in FIG. 5a. The TDI sensor (37) consists of several partial sensors (38, 39, 40) that are arranged one after the other in the stage direction and that are mutually offset in the pixel direction (the horizontal in FIG. 5a) by the distance Δ=d/n, where d is the pixel spacing and n is the number of partial sensors. Together with an anamorphotic imaging of the diaphragm array (41) (FIG. 5b) on the composite TDI sensor, an improvement in the signal/noise ratio corresponding to the number of partial sensors (38, 39, 40) arranged one behind the other results, compared to a TDI sensor having an identical total surface area. In the embodiment shown in FIG. 5a, a total of 9 partial sensors (38, 39, 40), again with 96 stages each, are arranged one behind the other. The stage direction here again corresponds to the motion of the object point when the object is scanned. The imaging of the diaphragm array (41) then takes place with a 9 times greater imaging scale in the scanning direction than in the direction at right angles to it. By this anamorphotic imaging, the transparent regions lying in the first two rows (Z1, Z2) of the diaphragm array (41) are imaged on the first partial sensor (38); the two succeeding rows (Z3, Z4) are imaged on the second partial sensor (39); and so on. This anamorphotic imaging is shown in FIG. 5a by the oval images of the circular transparent regions of the diaphragm array (41). First, the offset arrangement of several partial sensors makes it possible to image the transparent regions that are imaged on each partial sensor as right-angled partial grids directed parallel to the rows and columns of the partial sensors.
Second, at the same time, the partial grids are mutually offset in correspondence with the mutual offset of the partial sensors, so that the whole image field is sensed without gaps when the image data of the partial TDIs are correspondingly sorted to obtain the correct sequence. Several transparent regions can thereby be imaged on one column of each partial sensor at different stage positions, resulting in the improved signal/noise ratio. In the illustrations of FIGS. 5a and 5b, two transparent regions are imaged on each pixel position at correspondingly offset stage positions of the same partial sensor (38). However, the use of only two transparent regions per pixel position serves only for illustration. In order to optimally use the surface of the sensor (37) at a predetermined ratio of diameter of the transparent regions to the spacing of the transparent regions, the number of transparent regions can be chosen corresponding to the number of partial sensors (38, 39, 40), so that with 9 partial sensors, an amount of light per pixel results that is greater by a factor of 9 than in the embodiments according to FIGS. 3a-3c and 4a-4c, so that with the same signal/noise ratio, the scanning of the object can take place at 9 times the speed. [0054]
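That the offset partial sensors sense the row direction without gaps can be checked numerically: with pixel spacing d and n partial sensors mutually offset by Δ = d/n, the combined sample positions of all partial sensors form a uniform grid of pitch d/n. A sketch with illustrative names:

```python
def combined_sample_positions(d, n, pixels_per_row):
    """Row-direction sample positions of n partial sensors offset by d/n.

    Each partial sensor k samples at positions p*d + k*(d/n); merging
    all partial sensors yields a gap-free grid of pitch d/n.
    """
    delta = d / n
    return sorted(p * d + k * delta
                  for k in range(n) for p in range(pixels_per_row))
```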
  • Due to the anamorphotic imaging, all columns of all the partial sensors contribute to image production. A sensor array of several partial sensors may also be put to use in combination with a normal, non-anamorphotic imaging of the diaphragm array on the sensor array. In this case, only a portion of the columns of the partial sensors contributes to the formation of the image. [0055]
  • Instead of TDIs as the partial sensors, an arrangement of a corresponding number of row sensors in a mutually offset arrangement is conceivable. Such an arrangement can be compared, in terms of light sensitivity, with the embodiments according to FIGS. 3a-3c. Of course, in comparison, the sensor surface used is clearly reduced. [0056]

Claims (12)

We claim:
1. A confocal microscope having an optical axis and an objective (7) with a focal plane, comprising:
a motorized scanning table (9) for moving an object (8) at right angles to said optical axis of said microscope,
a diaphragm array (4, 4 a, 4 b) in a plane that is conjugate to said focal plane of said microscope objective (7),
a sensor array (11) following said diaphragm array (4, 4 a, 4 b) in an observation direction, with a plurality of photosensitive elements, charge storage elements associated with said photosensitive elements, and a device for displacing charges stored in said charge storage elements from one charge storage element to another charge storage element, and a synchronizing unit (13, 24) for effecting displacement of said charges corresponding to motion of an image point of an object point in a plane of said sensor array (11).
2. The confocal microscope according to claim 1, wherein said scanning table (9) is arranged to move said object (8) along linear paths.
3. The confocal microscope according to claim 1, wherein said diaphragm array (4, 4 a, 4 b) is fixed relative to an observation beam path during motion of said object.
4. The confocal microscope according to claim 3, wherein said diaphragm array (4, 4 a, 4 b) has a plurality of transparent regions (4 1-4 20) that are arranged such that image paths of said plurality of transparent regions in said focal plane of said objective (7) fill a portion of said focal plane of said objective (7) without gaps.
5. The confocal microscope according to claim 1, wherein a light source array (4, 4 a) is arranged for producing a plurality of mutually spaced-apart light sources in a plane conjugate to said focal plane of said objective (7), and positions of said plurality of light sources are conjugate to positions of transparent regions (4 1-4 20) of said diaphragm array (4, 4 a).
6. The confocal microscope according to claim 1, wherein said sensor array (11) has a plurality of mutually parallel linear sensor columns and said charges are displaced in the direction of said sensor columns.
7. The confocal microscope according to claim 6, wherein said diaphragm array has a plurality of transparent regions (4 1-4 20), and each column of said sensor array has at least one of said transparent regions imaged on it.
8. The confocal microscope according to claim 7, wherein said transparent regions (4 1-4 20) of said diaphragm array (4, 4 a, 4 b) form a two-dimensional rhombic grid arrangement.
9. The confocal microscope according to claim 7, wherein said transparent regions (4 1-4 20) of said diaphragm array (4, 4 a, 4 b) form a two-dimensional rectangular grid arrangement.
10. The confocal microscope according to claim 9, wherein said diaphragm array (4, 4 b) is imaged on said sensor array (11).
11. The confocal microscope according to claim 6, wherein said sensor array (37) comprises a plurality of mutually independent partial sensor arrays (38, 39, 40) arranged one behind the other in a columnar direction, mutually offset in a row direction by a distance (Δ) equal to d/n, where d is the spacing of individual sensors in said row direction and n is the number of said partial sensor arrays.
12. The confocal microscope according to claim 11, wherein said diaphragm array (41) is anamorphotically imaged on said sensor array (37).
US09/779,960 1997-04-07 2001-02-09 Confocal microscope with a motorized scanning table Expired - Fee Related US6429897B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/779,960 US6429897B2 (en) 1997-04-07 2001-02-09 Confocal microscope with a motorized scanning table

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
DE19714221 1997-04-07
DE19714221A DE19714221A1 (en) 1997-04-07 1997-04-07 Confocal microscope with a motorized scanning table
DE19714221.4 1997-04-07
US92347097A 1997-09-04 1997-09-04
US09/779,960 US6429897B2 (en) 1997-04-07 2001-02-09 Confocal microscope with a motorized scanning table

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US92347097A Continuation 1997-04-07 1997-09-04

Publications (2)

Publication Number Publication Date
US20010012069A1 true US20010012069A1 (en) 2001-08-09
US6429897B2 US6429897B2 (en) 2002-08-06

Family

ID=7825633

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/779,960 Expired - Fee Related US6429897B2 (en) 1997-04-07 2001-02-09 Confocal microscope with a motorized scanning table

Country Status (4)

Country Link
US (1) US6429897B2 (en)
EP (1) EP0871052B1 (en)
JP (1) JP3970998B2 (en)
DE (2) DE19714221A1 (en)

Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020074493A1 (en) * 2000-07-27 2002-06-20 Hill Henry A. Multiple-source arrays for confocal and near-field microscopy
US6591003B2 (en) * 2001-03-28 2003-07-08 Visiongate, Inc. Optical tomography of small moving objects using time delay and integration imaging
US20030210760A1 (en) * 2002-05-13 2003-11-13 Nelson Alan C. Method and apparatus for emission computed tomography using temporal signatures
US20040017487A1 (en) * 2002-03-22 2004-01-29 Olympus Optical Co., Ltd. Image acquiring apparatus and image acquiring method
US6697508B2 (en) 2002-05-10 2004-02-24 Visiongate, Inc. Tomographic reconstruction of small objects using a priori knowledge
US20040076319A1 (en) * 2002-04-19 2004-04-22 Fauver Mark E. Method and apparatus of shadowgram formation for optical tomography
US6741730B2 (en) 2001-08-10 2004-05-25 Visiongate, Inc. Method and apparatus for three-dimensional imaging in the fourier domain
US20040101210A1 (en) * 2001-03-19 2004-05-27 The Arizona Board Of Regents On Behalf Of The University Of Arizona Miniaturized microscope array digital slide scanner
US20040169922A1 (en) * 2001-03-29 2004-09-02 Tony Wilson Stereo microscopy
US20050010108A1 (en) * 2002-04-19 2005-01-13 Rahn John Richard Method for correction of relative object-detector motion between successive views
US20050085721A1 (en) * 2002-04-19 2005-04-21 University Of Washington System and method for processing specimens and images for optical tomography
US20050085708A1 (en) * 2002-04-19 2005-04-21 University Of Washington System and method for preparation of cells for 3D image acquisition
US6944322B2 (en) 2001-03-28 2005-09-13 Visiongate, Inc. Optical tomography of small objects using parallel ray illumination and post-specimen optical magnification
US20060006311A1 (en) * 2001-06-22 2006-01-12 Orbotech Ltd High-sensitivity optical scanning using memory integration
US20060018013A1 (en) * 2004-07-07 2006-01-26 Yoshimasa Suzuki Microscope imaging apparatus and biological-specimen examination system
US6991738B1 (en) 2004-10-13 2006-01-31 University Of Washington Flow-through drum centrifuge
US20060023219A1 (en) * 2001-03-28 2006-02-02 Meyer Michael G Optical tomography of small objects using parallel ray illumination and post-specimen optical magnification
US20060096358A1 (en) * 2004-10-28 2006-05-11 University Of Washington Optical projection tomography microscope
US7197355B2 (en) 2002-04-19 2007-03-27 Visiongate, Inc. Variable-motion optical tomography of small objects
US20080273196A1 (en) * 1999-03-23 2008-11-06 Kla-Tencor Corporation Confocal wafer inspection system and method
US20090028414A1 (en) * 2000-05-03 2009-01-29 Aperio Technologies, Inc. Data Management in a Linear-Array-Based Microscope Slide Scanner
US7494809B2 (en) 2004-11-09 2009-02-24 Visiongate, Inc. Automated cell sample enrichment preparation method
US20100027856A1 (en) * 2000-05-03 2010-02-04 Aperio Technologies, Inc. Method for Pre-focus of Digital Slides
US20100188655A1 (en) * 2009-01-23 2010-07-29 Kla-Tencor Corporation TDI Sensor Modules With Localized Driving And Signal Processing Circuitry For High Speed Inspection
US20100214639A1 (en) * 2009-02-23 2010-08-26 Visiongate, Inc. Optical tomography system with high-speed scanner
US7787112B2 (en) 2007-10-22 2010-08-31 Visiongate, Inc. Depth of field extension for optical tomography
US7835561B2 (en) 2007-05-18 2010-11-16 Visiongate, Inc. Method for image processing and reconstruction of images for optical tomography
US20100322494A1 (en) * 2001-03-28 2010-12-23 University Of Washington Focal Plane Tracking for Optical Microtomography
USRE42220E1 (en) 1999-04-21 2011-03-15 Hamamatsu Photonics K.K. Microscopy
US20110090223A1 (en) * 2004-05-27 2011-04-21 Aperio Technologies, Inc. Creating and viewing three dimensional virtual slides
WO2012002893A1 (en) * 2010-06-30 2012-01-05 Ge Healthcare Bio-Sciences Corp A system for synchronization in a line scanning imaging microscope
WO2014053415A1 (en) * 2012-10-01 2014-04-10 Carl Zeiss Microscopy Gmbh Confocal microscope with freely adjustable sample scanning
US8736924B2 (en) 2011-09-28 2014-05-27 Truesense Imaging, Inc. Time-delay-and-integrate image sensors having variable integration times
US8743195B2 (en) 2008-10-24 2014-06-03 Leica Biosystems Imaging, Inc. Whole slide fluorescence scanner
US20140152797A1 (en) * 2012-12-04 2014-06-05 Samsung Electronics Co., Ltd. Confocal optical inspection apparatus and confocal optical inspection method
US8805050B2 (en) 2000-05-03 2014-08-12 Leica Biosystems Imaging, Inc. Optimizing virtual slide image quality
US20150042997A1 (en) * 2013-02-26 2015-02-12 Beijing Boe Optoelectronics Technology Co., Ltd. Transmittance testing apparatus
US20150109512A1 (en) * 2011-12-19 2015-04-23 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
US9235041B2 (en) 2005-07-01 2016-01-12 Leica Biosystems Imaging, Inc. System and method for single optical axis multi-detector microscope slide scanner
US20160109693A1 (en) * 2014-10-16 2016-04-21 Illumina, Inc. Optical scanning systems for in situ genetic analysis
US9860437B2 (en) 2011-12-19 2018-01-02 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
US9971140B2 (en) 2011-12-19 2018-05-15 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
CN108181005A (en) * 2017-11-17 2018-06-19 天津津航技术物理研究所 A kind of method and system for the debugging of TDI ccd detectors focal plane
US10194108B2 (en) 2015-05-14 2019-01-29 Kla-Tencor Corporation Sensor with electrically controllable aperture for inspection and metrology systems
US10298833B2 (en) 2011-12-19 2019-05-21 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
US10313622B2 (en) 2016-04-06 2019-06-04 Kla-Tencor Corporation Dual-column-parallel CCD sensor and inspection systems using a sensor
CN111323899A (en) * 2010-10-26 2020-06-23 完整基因有限公司 Method and system for imaging high density biochemical arrays by sub-pixel alignment
US10778925B2 (en) 2016-04-06 2020-09-15 Kla-Tencor Corporation Multiple column per channel CCD sensor architecture for inspection and metrology
US10852519B2 (en) 2016-11-30 2020-12-01 Asm Technology Singapore Pte Ltd Confocal imaging of an object utilising a pinhole array
US11069054B2 (en) 2015-12-30 2021-07-20 Visiongate, Inc. System and method for automated detection and monitoring of dysplasia and administration of immunotherapy and chemotherapy
CN113479353A (en) * 2021-07-14 2021-10-08 贵州航天林泉电机有限公司 Satellite turntable path planning method based on speed planning

Families Citing this family (47)

Publication number Priority date Publication date Assignee Title
JP4042185B2 (en) * 1997-08-29 2008-02-06 株式会社ニコン Pattern inspection device
US6388788B1 (en) 1998-03-16 2002-05-14 Praelux, Inc. Method and apparatus for screening chemical compounds
US20030036855A1 (en) 1998-03-16 2003-02-20 Praelux Incorporated, A Corporation Of New Jersey Method and apparatus for screening chemical compounds
DE19921127A1 (en) * 1999-05-07 2000-11-16 Metasystems Hard & Software Gm Microscope systems for optical scanning of microscopic objects and methods for optical scanning
JP3544892B2 (en) 1999-05-12 2004-07-21 株式会社東京精密 Appearance inspection method and apparatus
DE19956438A1 (en) * 1999-11-24 2001-05-31 Leica Microsystems Structure for scanning an object-recording device records data with a confocal laser scan microscope enabling image capture of large object fields with adequate speed by scanning the object-recording device.
US7738688B2 (en) 2000-05-03 2010-06-15 Aperio Technologies, Inc. System and method for viewing virtual slides
DE10024687A1 (en) * 2000-05-18 2001-11-22 Zeiss Carl Jena Gmbh Autofocus unit for e.g. semiconductor wafer inspection microscope, includes e.g. diaphragms with complementary structures straddling conjugate point of illuminant
DE10026392A1 (en) * 2000-05-27 2001-11-29 Leica Microsystems Method and arrangement for coding live images in microscopy
DE10050529B4 (en) * 2000-10-11 2016-06-09 Leica Microsystems Cms Gmbh Method for beam control in a scanning microscope, arrangement for beam control in a scanning microscope and scanning microscope
DE10122607B4 (en) * 2001-05-10 2006-11-30 Leica Microsystems Cms Gmbh Method and arrangement for direct Fourier imaging of samples
DE10126291C2 (en) 2001-05-30 2003-04-30 Leica Microsystems microscope
DE10126286A1 (en) * 2001-05-30 2002-12-19 Leica Microsystems Method and apparatus for spot scanning a sample
CN1602451A (en) 2001-11-07 2005-03-30 应用材料有限公司 Maskless photon-electron spot-grid array printer
US6946655B2 (en) 2001-11-07 2005-09-20 Applied Materials, Inc. Spot grid array electron imaging system
US6639201B2 (en) 2001-11-07 2003-10-28 Applied Materials, Inc. Spot grid array imaging system
JP3731073B2 (en) * 2002-09-17 2006-01-05 独立行政法人理化学研究所 Microscope equipment
US7116440B2 (en) 2003-02-28 2006-10-03 Aperio Technologies, Inc. Image processing and analysis framework
US7257268B2 (en) 2003-02-28 2007-08-14 Aperio Technologies, Inc. Systems and methods for image pattern recognition
KR100942841B1 (en) * 2003-06-02 2010-02-18 엘지디스플레이 주식회사 Method and Apparatus for Testing and Repairing Liquid Crystal Display Device
EP2278383A1 (en) * 2004-11-24 2011-01-26 Battelle Memorial Institute A test tube handling apparatus
JP5336088B2 (en) 2005-01-27 2013-11-06 アペリオ・テクノロジーズ・インコーポレイテッド System and method for visualizing a three-dimensional virtual slide
US7684048B2 (en) 2005-11-15 2010-03-23 Applied Materials Israel, Ltd. Scanning microscopy
DE102007009550B4 (en) 2007-02-27 2008-12-18 Ludwig-Maximilian-Universität Method and microscope device for observing a moving sample
JP5389016B2 (en) 2007-05-04 2014-01-15 アペリオ・テクノロジーズ・インコーポレイテッド System and method for quality assurance in pathology
EP3543357A1 (en) 2007-05-08 2019-09-25 Trustees of Boston University Chemical functionalization of solid-state nanopores and nanopore arrays and applications thereof
WO2009034564A2 (en) * 2007-09-16 2009-03-19 Meir Ben-Levy Imaging measurements system with periodic pattern illumination and tdi
JP2009181088A (en) * 2008-02-01 2009-08-13 Nikon Corp Confocal unit, confocal microscope, and confocal diaphragm
DE102008010435B4 (en) 2008-02-21 2010-07-29 Tecan Trading Ag Data acquisition procedure with a laser scanner device
JP2010216880A (en) * 2009-03-13 2010-09-30 Omron Corp Displacement sensor
CA2808576A1 (en) 2009-09-30 2011-04-07 Quantapore, Inc. Ultrafast sequencing of biological polymers using a labeled nanopore
WO2011072211A2 (en) 2009-12-11 2011-06-16 Aperio Technologies, Inc. Improved signal to noise ratio in digital pathology image analysis
US8421903B2 (en) * 2010-05-10 2013-04-16 Abbott Laboratories Staggered contact image sensor imaging system
DE202011001569U1 (en) * 2011-01-14 2012-03-01 Berthold Technologies Gmbh & Co. Kg Device for measuring optical properties in microplates
US9651539B2 (en) 2012-10-28 2017-05-16 Quantapore, Inc. Reducing background fluorescence in MEMS materials by low energy ion beam treatment
JP6131448B2 (en) * 2012-12-04 2017-05-24 三星電子株式会社Samsung Electronics Co.,Ltd. Confocal optical inspection apparatus and confocal optical inspection method
DE102013005563A1 (en) * 2013-03-28 2014-10-02 Carl Zeiss Microscopy Gmbh LIGHT MICROSCOPE AND METHOD FOR EXAMINING A MICROSCOPIC SAMPLE
US9862997B2 (en) 2013-05-24 2018-01-09 Quantapore, Inc. Nanopore-based nucleic acid analysis with mixed FRET detection
US9772297B2 (en) 2014-02-12 2017-09-26 Kla-Tencor Corporation Apparatus and methods for combined brightfield, darkfield, and photothermal inspection
FR3018603B1 (en) * 2014-03-13 2018-03-09 Holo3 - Association Pour Le Developpement Des Methodes Et Techniques Holographiques Optiques Et Connexes DEVICE FOR ANALYZING AND IMAGING A SURFACE BY DEFLECTOMETRY AND CORRESPONDING METHOD
US9606069B2 (en) * 2014-06-25 2017-03-28 Kla-Tencor Corporation Method, apparatus and system for generating multiple spatially separated inspection regions on a substrate
ES2789000T3 (en) 2014-10-10 2020-10-23 Quantapore Inc Nanopore-based polynucleotide analysis with mutually inactivating fluorescent labels
JP6757316B2 (en) 2014-10-24 2020-09-16 クアンタポール, インコーポレイテッド Efficient optical analysis of polymers using nanostructured arrays
EP3482196B1 (en) 2016-07-05 2022-02-23 Quantapore, Inc. Optically based nanopore sequencing
JP6863578B2 (en) * 2017-04-19 2021-04-21 日本分光株式会社 Infrared microscope
JP7115826B2 (en) * 2017-07-18 2022-08-09 三星電子株式会社 Imaging device and imaging method
US10429315B2 (en) * 2017-07-18 2019-10-01 Samsung Electronics Co., Ltd. Imaging apparatus and imaging method

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US4382267A (en) * 1981-09-24 1983-05-03 Rca Corporation Digital control of number of effective rows of two-dimensional charge-transfer imager array
NL8402340A (en) * 1984-07-25 1986-02-17 Philips Nv MICROSCOPE FOR NON-DIFFERENTIATED PHASE IMAGING.
EP0437883B1 (en) * 1990-01-16 1994-06-29 Koninklijke Philips Electronics N.V. Scanning device for optically scanning a surface along a line
JP2954381B2 (en) * 1991-05-27 1999-09-27 株式会社日立製作所 Pattern inspection method and apparatus
KR960007481B1 (en) * 1991-05-27 1996-06-03 가부시끼가이샤 히다찌세이사꾸쇼 Pattern recognition method and the device thereof
JP2971628B2 (en) * 1991-06-27 1999-11-08 株式会社日立製作所 Pattern inspection method and apparatus
US5248876A (en) * 1992-04-21 1993-09-28 International Business Machines Corporation Tandem linear scanning confocal imaging system with focal volumes at different heights
US5544338A (en) * 1992-12-31 1996-08-06 International Business Machines Corporation Apparatus and method for raster generation from sparse area array output
WO1996039619A1 (en) * 1995-06-06 1996-12-12 Kla Instruments Corporation Optical inspection of a specimen using multi-channel responses from the specimen

Cited By (122)

Publication number Priority date Publication date Assignee Title
US20080273196A1 (en) * 1999-03-23 2008-11-06 Kla-Tencor Corporation Confocal wafer inspection system and method
US7858911B2 (en) * 1999-03-23 2010-12-28 Kla-Tencor Corporation Confocal wafer inspection system and method
USRE42220E1 (en) 1999-04-21 2011-03-15 Hamamatsu Photonics K.K. Microscopy
US20110037847A1 (en) * 2000-05-03 2011-02-17 Aperio Technologies, Inc. Fully Automatic Rapid Microscope Slide Scanner
US20090028414A1 (en) * 2000-05-03 2009-01-29 Aperio Technologies, Inc. Data Management in a Linear-Array-Based Microscope Slide Scanner
US8385619B2 (en) 2000-05-03 2013-02-26 Aperio Technologies, Inc. Fully automatic rapid microscope slide scanner
US20130162802A1 (en) * 2000-05-03 2013-06-27 Aperio Technologies, Inc. Fully Automatic Rapid Microscope Slide Scanner
US8094902B2 (en) * 2000-05-03 2012-01-10 Aperio Technologies, Inc. Data management in a linear-array-based microscope slide scanner
US8055042B2 (en) 2000-05-03 2011-11-08 Aperio Technologies, Inc. Fully automatic rapid microscope slide scanner
US20110221882A1 (en) * 2000-05-03 2011-09-15 Aperio Technologies, Inc. Data Management in a Linear-Array-Based Microscope Slide Scanner
US7978894B2 (en) * 2000-05-03 2011-07-12 Aperio Technologies, Inc. Fully automatic rapid microscope slide scanner
US20110141263A1 (en) * 2000-05-03 2011-06-16 Aperio Technologies, Inc. Achieving Focus in a Digital Pathology System
US7949168B2 (en) 2000-05-03 2011-05-24 Aperio Technologies, Inc. Data management in a linear-array-based microscope slide scanner
US7893988B2 (en) 2000-05-03 2011-02-22 Aperio Technologies, Inc. Method for pre-focus of digital slides
US8456522B2 (en) 2000-05-03 2013-06-04 Aperio Technologies, Inc. Achieving focus in a digital pathology system
US8731260B2 (en) 2000-05-03 2014-05-20 Leica Biosystems Imaging, Inc. Data management in a linear-array-based microscope slide scanner
US9213177B2 (en) 2000-05-03 2015-12-15 Leica Biosystems Imaging, Inc. Achieving focus in a digital pathology system
US9851550B2 (en) 2000-05-03 2017-12-26 Leica Biosystems Imaging, Inc. Fully automatic rapid microscope slide scanner
US9729749B2 (en) 2000-05-03 2017-08-08 Leica Biosystems Imaging, Inc. Data management in a linear-array-based microscope slide scanner
US8755579B2 (en) * 2000-05-03 2014-06-17 Leica Biosystems Imaging, Inc. Fully automatic rapid microscope slide scanner
US7826649B2 (en) 2000-05-03 2010-11-02 Aperio Technologies, Inc. Data management in a linear-array-based microscope slide scanner
US9535243B2 (en) 2000-05-03 2017-01-03 Leica Biosystems Imaging, Inc. Optimizing virtual slide image quality
US8805050B2 (en) 2000-05-03 2014-08-12 Leica Biosystems Imaging, Inc. Optimizing virtual slide image quality
US20100027856A1 (en) * 2000-05-03 2010-02-04 Aperio Technologies, Inc. Method for Pre-focus of Digital Slides
US20090141126A1 (en) * 2000-05-03 2009-06-04 Aperio Technologies, Inc. Fully Automatic Rapid Microscope Slide Scanner
US9386211B2 (en) 2000-05-03 2016-07-05 Leica Biosystems Imaging, Inc. Fully automatic rapid microscope slide scanner
US9521309B2 (en) 2000-05-03 2016-12-13 Leica Biosystems Imaging, Inc. Data management in a linear-array-based microscope slide scanner
US20020074493A1 (en) * 2000-07-27 2002-06-20 Hill Henry A. Multiple-source arrays for confocal and near-field microscopy
US20040101210A1 (en) * 2001-03-19 2004-05-27 The Arizona Board Of Regents On Behalf Of The University Of Arizona Miniaturized microscope array digital slide scanner
US7184610B2 (en) * 2001-03-19 2007-02-27 The Arizona Board Of Regents On Behalf Of The University Of Arizona Miniaturized microscope array digital slide scanner
US6944322B2 (en) 2001-03-28 2005-09-13 Visiongate, Inc. Optical tomography of small objects using parallel ray illumination and post-specimen optical magnification
US20060023219A1 (en) * 2001-03-28 2006-02-02 Meyer Michael G Optical tomography of small objects using parallel ray illumination and post-specimen optical magnification
US20100322494A1 (en) * 2001-03-28 2010-12-23 University Of Washington Focal Plane Tracking for Optical Microtomography
US6591003B2 (en) * 2001-03-28 2003-07-08 Visiongate, Inc. Optical tomography of small moving objects using time delay and integration imaging
US7907765B2 (en) 2001-03-28 2011-03-15 University Of Washington Focal plane tracking for optical microtomography
US20040169922A1 (en) * 2001-03-29 2004-09-02 Tony Wilson Stereo microscopy
US7129509B2 (en) 2001-06-22 2006-10-31 Orbotech, Ltd. High-sensitivity optical scanning using memory integration
US8536506B2 (en) 2001-06-22 2013-09-17 Orbotech Ltd. Imaging device and method for high-sensitivity optical scanning and integrated circuit therefor
US20080278775A1 (en) * 2001-06-22 2008-11-13 Orbotech Ltd. High-sensitivity optical scanning using memory integration
US20060006311A1 (en) * 2001-06-22 2006-01-12 Orbotech Ltd High-sensitivity optical scanning using memory integration
US20070012865A1 (en) * 2001-06-22 2007-01-18 Orbotech Ltd. High-sensitivity optical scanning using memory integration
US7417243B2 (en) 2001-06-22 2008-08-26 Orbotech Ltd High-sensitivity optical scanning using memory integration
US7897902B2 (en) 2001-06-22 2011-03-01 Orbotech Ltd. Imaging device and method for high-sensitivity optical scanning and integrated circuit therefor
US7009163B2 (en) 2001-06-22 2006-03-07 Orbotech Ltd. High-sensitivity optical scanning using memory integration
US9232114B2 (en) 2001-06-22 2016-01-05 Orbotech Ltd. Imaging device and method for high-sensitivity optical scanning and integrated circuit therefor
US8119969B2 (en) 2001-06-22 2012-02-21 Orbotech Ltd Imaging device and method for high-sensitivity optical scanning and integrated circuit therefor
US6741730B2 (en) 2001-08-10 2004-05-25 Visiongate, Inc. Method and apparatus for three-dimensional imaging in the fourier domain
US20040017487A1 (en) * 2002-03-22 2004-01-29 Olympus Optical Co., Ltd. Image acquiring apparatus and image acquiring method
US7411626B2 (en) * 2002-03-22 2008-08-12 Olympus Corporation Image acquiring apparatus and image acquiring method
US7542597B2 (en) 2002-04-19 2009-06-02 Visiongate, Inc. Method for correction of relative object-detector motion between successive views
US20070071357A1 (en) * 2002-04-19 2007-03-29 Visiongate, Inc. Method for correction of relative object-detector motion between successive views
US20050085708A1 (en) * 2002-04-19 2005-04-21 University Of Washington System and method for preparation of cells for 3D image acquisition
US20050085721A1 (en) * 2002-04-19 2005-04-21 University Of Washington System and method for processing specimens and images for optical tomography
US7811825B2 (en) 2002-04-19 2010-10-12 University Of Washington System and method for processing specimens and images for optical tomography
US20040076319A1 (en) * 2002-04-19 2004-04-22 Fauver Mark E. Method and apparatus of shadowgram formation for optical tomography
US7260253B2 (en) 2002-04-19 2007-08-21 Visiongate, Inc. Method for correction of relative object-detector motion between successive views
US7197355B2 (en) 2002-04-19 2007-03-27 Visiongate, Inc. Variable-motion optical tomography of small objects
US7738945B2 (en) 2002-04-19 2010-06-15 University Of Washington Method and apparatus for pseudo-projection formation for optical tomography
US20050010108A1 (en) * 2002-04-19 2005-01-13 Rahn John Richard Method for correction of relative object-detector motion between successive views
US6697508B2 (en) 2002-05-10 2004-02-24 Visiongate, Inc. Tomographic reconstruction of small objects using a priori knowledge
US6770893B2 (en) 2002-05-13 2004-08-03 Visiongate, Inc. Method and apparatus for emission computed tomography using temporal signatures
US20030210760A1 (en) * 2002-05-13 2003-11-13 Nelson Alan C. Method and apparatus for emission computed tomography using temporal signatures
WO2004031809A3 (en) * 2002-09-30 2004-08-19 Visiongate Inc Optical tomography of small moving objects using time delay and integration imaging
WO2004031809A2 (en) * 2002-09-30 2004-04-15 Visiongate, Inc. Optical tomography of small moving objects using time delay and integration imaging
US8565480B2 (en) 2004-05-27 2013-10-22 Leica Biosystems Imaging, Inc. Creating and viewing three dimensional virtual slides
US20110090223A1 (en) * 2004-05-27 2011-04-21 Aperio Technologies, Inc. Creating and viewing three dimensional virtual slides
US9069179B2 (en) 2004-05-27 2015-06-30 Leica Biosystems Imaging, Inc. Creating and viewing three dimensional virtual slides
US8923597B2 (en) 2004-05-27 2014-12-30 Leica Biosystems Imaging, Inc. Creating and viewing three dimensional virtual slides
US20070121200A1 (en) * 2004-07-07 2007-05-31 Yoshimasa Suzuki Microscope imaging apparatus and biological-specimen examination system
US20060018013A1 (en) * 2004-07-07 2006-01-26 Yoshimasa Suzuki Microscope imaging apparatus and biological-specimen examination system
US20070121199A1 (en) * 2004-07-07 2007-05-31 Yoshimasa Suzuki Microscope imaging apparatus and biological-specimen examination system
US20070121198A1 (en) * 2004-07-07 2007-05-31 Yoshimasa Suzuki Microscope imaging apparatus and biological-specimen examination system
US7271952B2 (en) 2004-07-07 2007-09-18 Olympus Corporation Microscope imaging apparatus and biological-specimen examination system
US6991738B1 (en) 2004-10-13 2006-01-31 University Of Washington Flow-through drum centrifuge
US20060096358A1 (en) * 2004-10-28 2006-05-11 University Of Washington Optical projection tomography microscope
US7494809B2 (en) 2004-11-09 2009-02-24 Visiongate, Inc. Automated cell sample enrichment preparation method
US9235041B2 (en) 2005-07-01 2016-01-12 Leica Biosystems Imaging, Inc. System and method for single optical axis multi-detector microscope slide scanner
US7835561B2 (en) 2007-05-18 2010-11-16 Visiongate, Inc. Method for image processing and reconstruction of images for optical tomography
US7933010B2 (en) 2007-10-22 2011-04-26 Rahn J Richard Depth of field extension for optical tomography
US7787112B2 (en) 2007-10-22 2010-08-31 Visiongate, Inc. Depth of field extension for optical tomography
US20100321786A1 (en) * 2007-10-22 2010-12-23 Visiongate, Inc. Depth of field extension for optical tomography
US9523844B2 (en) 2008-10-24 2016-12-20 Leica Biosystems Imaging, Inc. Whole slide fluorescence scanner
US8743195B2 (en) 2008-10-24 2014-06-03 Leica Biosystems Imaging, Inc. Whole slide fluorescence scanner
US8624971B2 (en) 2009-01-23 2014-01-07 Kla-Tencor Corporation TDI sensor modules with localized driving and signal processing circuitry for high speed inspection
US20100188655A1 (en) * 2009-01-23 2010-07-29 Kla-Tencor Corporation TDI Sensor Modules With Localized Driving And Signal Processing Circuitry For High Speed Inspection
US9077862B2 (en) 2009-01-23 2015-07-07 Kla-Tencor Corporation TDI sensor modules with localized driving and signal processing circuitry for high speed inspection
US8254023B2 (en) 2009-02-23 2012-08-28 Visiongate, Inc. Optical tomography system with high-speed scanner
US20100214639A1 (en) * 2009-02-23 2010-08-26 Visiongate, Inc. Optical tomography system with high-speed scanner
US10027855B2 (en) 2010-06-30 2018-07-17 Ge Healthcare Bio-Science Corp. System for synchronization in a line scanning imaging microscope
WO2012002893A1 (en) * 2010-06-30 2012-01-05 Ge Healthcare Bio-Sciences Corp A system for synchronization in a line scanning imaging microscope
CN111323899A (en) * 2010-10-26 2020-06-23 完整基因有限公司 Method and system for imaging high density biochemical arrays by sub-pixel alignment
US9049353B2 (en) 2011-09-28 2015-06-02 Semiconductor Components Industries, Llc Time-delay-and-integrate image sensors having variable integration times
US8964088B2 (en) 2011-09-28 2015-02-24 Semiconductor Components Industries, Llc Time-delay-and-integrate image sensors having variable integration times
US8736924B2 (en) 2011-09-28 2014-05-27 Truesense Imaging, Inc. Time-delay-and-integrate image sensors having variable integration times
US9503606B2 (en) 2011-09-28 2016-11-22 Semiconductor Components Industries, Llc Time-delay-and-integrate image sensors having variable integration times
US20150109512A1 (en) * 2011-12-19 2015-04-23 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
US10298833B2 (en) 2011-12-19 2019-05-21 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
US20170285306A9 (en) * 2011-12-19 2017-10-05 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
US9860437B2 (en) 2011-12-19 2018-01-02 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
US9921392B2 (en) * 2011-12-19 2018-03-20 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
US9971140B2 (en) 2011-12-19 2018-05-15 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
US10571664B2 (en) 2011-12-19 2020-02-25 Hamamatsu Photonics K.K. Image capturing apparatus and focusing method thereof
WO2014053415A1 (en) * 2012-10-01 2014-04-10 Carl Zeiss Microscopy Gmbh Confocal microscope with freely adjustable sample scanning
US10645247B2 (en) 2012-10-01 2020-05-05 Carl Zeiss Microscopy Gmbh Confocal microscope with a pinhole arrangement
US9983395B2 (en) 2012-10-01 2018-05-29 Carl Zeiss Microscopy Gmbh Confocal microscope with a pinhole arrangement
US20140152797A1 (en) * 2012-12-04 2014-06-05 Samsung Electronics Co., Ltd. Confocal optical inspection apparatus and confocal optical inspection method
US9182346B2 (en) * 2013-02-26 2015-11-10 Beijing Boe Optoelectronics Technology Co., Ltd. Transmittance testing apparatus
US20150042997A1 (en) * 2013-02-26 2015-02-12 Beijing Boe Optoelectronics Technology Co., Ltd. Transmittance testing apparatus
US9897791B2 (en) * 2014-10-16 2018-02-20 Illumina, Inc. Optical scanning systems for in situ genetic analysis
CN112099214A (en) * 2014-10-16 2020-12-18 亿明达股份有限公司 Optical scanning system for in situ genetic analysis
TWI671549B (en) * 2014-10-16 2019-09-11 美商伊路米納有限公司 Confocal tdi line scan imaging system and method of performing scanning
US20160109693A1 (en) * 2014-10-16 2016-04-21 Illumina, Inc. Optical scanning systems for in situ genetic analysis
CN107076975A (en) * 2014-10-16 2017-08-18 亿明达股份有限公司 Optical scanning system for genetic analysis in situ
CN107076975B (en) * 2014-10-16 2020-09-11 亿明达股份有限公司 Optical scanning system for in situ genetic analysis
US10194108B2 (en) 2015-05-14 2019-01-29 Kla-Tencor Corporation Sensor with electrically controllable aperture for inspection and metrology systems
US11069054B2 (en) 2015-12-30 2021-07-20 Visiongate, Inc. System and method for automated detection and monitoring of dysplasia and administration of immunotherapy and chemotherapy
US10313622B2 (en) 2016-04-06 2019-06-04 Kla-Tencor Corporation Dual-column-parallel CCD sensor and inspection systems using a sensor
US10764527B2 (en) 2016-04-06 2020-09-01 Kla-Tencor Corporation Dual-column-parallel CCD sensor and inspection systems using a sensor
US10778925B2 (en) 2016-04-06 2020-09-15 Kla-Tencor Corporation Multiple column per channel CCD sensor architecture for inspection and metrology
US10852519B2 (en) 2016-11-30 2020-12-01 Asm Technology Singapore Pte Ltd Confocal imaging of an object utilising a pinhole array
CN108181005A (en) * 2017-11-17 2018-06-19 天津津航技术物理研究所 Method and system for focal-plane adjustment of TDI CCD detectors
CN113479353A (en) * 2021-07-14 2021-10-08 贵州航天林泉电机有限公司 Satellite turntable path planning method based on speed planning

Also Published As

Publication number Publication date
JPH10326587A (en) 1998-12-08
DE19714221A1 (en) 1998-10-08
EP0871052B1 (en) 2004-01-14
DE59810569D1 (en) 2004-02-19
US6429897B2 (en) 2002-08-06
EP0871052A1 (en) 1998-10-14
JP3970998B2 (en) 2007-09-05

Similar Documents

Publication Publication Date Title
US6429897B2 (en) Confocal microscope with a motorized scanning table
US6640014B1 (en) Automatic on-the-fly focusing for continuous image acquisition in high-resolution microscopy
US5248876A (en) Tandem linear scanning confocal imaging system with focal volumes at different heights
KR100685574B1 (en) Apparatus and method for evaluating a large target relative to the measuring hole of the sensor
JP3411780B2 (en) Laser microscope and pattern inspection apparatus using this laser microscope
CN101114134B (en) Alignment method and micro-device manufacturing method used for shadow cast scan photo-etching machine
JPH0117523B2 (en)
TWI402498B (en) An image forming method and image forming apparatus
KR101444048B1 (en) Confocal scanner and optical measurement apparatus using the same
JP2000275027A (en) Slit confocal microscope and surface shape measuring apparatus using it
TW202217278A (en) Grey-mode scanning scatterometry overlay metrology
JPH0578761B2 (en)
JP2001272603A (en) Optical device
JPH08128923A (en) Image evaluation device
KR101867081B1 (en) Confocal 3d sensing system with digital optical system
JPH0324965B2 (en)
JP2501098B2 (en) microscope
JPH07111505B2 (en) Photoelectric microscope
US20220357285A1 (en) Defect inspection apparatus and defect inspection method
JP2001012926A (en) Three-dimensional inspection apparatus for object
JP4675011B2 (en) Shape measuring method and shape measuring apparatus
JPH03102249A (en) Method and apparatus for detecting foreign matter
JP2002277399A (en) Lighting system and inspection device using the same, encoder, and instrument for measuring height
JPH102724A (en) Optical three-dimensional measuring apparatus
JPH09250912A (en) Pattern measurement device

Legal Events

Date Code Title Description
FPAY Fee payment (Year of fee payment: 4)
FPAY Fee payment (Year of fee payment: 8)
FEPP Fee payment procedure (Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY)
REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation (Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362)
FP Lapsed due to failure to pay maintenance fee (Effective date: 20140806)