US20110025831A1 - 3D Biplane Microscopy - Google Patents

3D Biplane Microscopy

Info

Publication number
US20110025831A1
Authority
US
United States
Prior art keywords
particle
sample
microscopy system
data set
planes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/936,095
Inventor
Joerg Bewersdorf
Manuel F. Juette
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jackson Laboratory
Original Assignee
Jackson Laboratory
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jackson Laboratory filed Critical Jackson Laboratory
Assigned to THE JACKSON LABORATORY reassignment THE JACKSON LABORATORY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BEWERSDORF, JOERG, JUETTE, MANUEL F.
Publication of US20110025831A1 publication Critical patent/US20110025831A1/en

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00: Microscopes
    • G02B 21/16: Microscopes adapted for ultraviolet illumination; fluorescence microscopes
    • G02B 21/36: Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B 21/361: Optical details, e.g. image relay to the camera or image sensor
    • G02B 21/365: Control or image processing arrangements for digital or video microscopes
    • G02B 21/367: Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B 1/00-G02B 26/00, G02B 30/00
    • G02B 27/58: Optics for apodization or superresolution; optical synthetic aperture systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; multi-view video systems; details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/207: Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N 13/218: Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
    • H04N 13/254: Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H04N 2013/0074: Stereoscopic image analysis
    • H04N 2013/0085: Motion estimation from stereoscopic image signals

Definitions

  • the present invention relates generally to microscopic imaging and, more specifically, to three-dimensional (“3D”) sub-100 nanometer resolution by biplane microscope imaging.
  • STED microscopy and other members of the reversible saturable optical fluorescence transitions (“RESOLFT”) family achieve a resolution >10-fold beyond the diffraction barrier by engineering the microscope's point-spread function (“PSF”), also referred to as a “reference data set,” through optically saturable transitions of the (fluorescent) probe molecules.
  • PSF point-spread function
  • FPALM fluorescence photoactivation localization microscopy
  • PAM photoactivation localization microscopy
  • STORM stochastic optical reconstruction microscopy
  • PALMIRA PALM with independently running acquisition
  • FPALM is described in more detail in an article titled “Ultra-High Resolution Imaging by Fluorescence Photoactivation Localization Microscopy” by Samuel T. Hess et al. (91 Biophysical Journal, 4258-4272, December 2006), which is incorporated herein by reference in its entirety.
  • PALM is described in more detail in an article titled "Imaging Intracellular Fluorescent Proteins at Nanometer Resolution" by Eric Betzig et al.
  • photo-sensitive refers to both photo-activatable (e.g., switching probes between an on state and an off state) and photo-switching (e.g., switching between a first color and a second color).
  • the sample is labeled with photo-sensitive probes, such as photo-activatable (“PA”) fluorescent probes (e.g., PA proteins or caged organic dyes).
  • PA photo-activatable
  • Activation of only a sparse subset of molecules at a time allows their separate localization.
  • the final sub-diffraction image of the labeled structure is generated by plotting the positions of some or all localized molecules.
  • Particle-tracking techniques can localize small objects (typically smaller than the diffraction limit) in live cells with sub-diffraction accuracy and track their movement over time. But conventional particle-tracking fluorescence microscopy cannot temporally resolve interactions of organelles, molecular machines, or even single proteins, which typically happen within milliseconds.
  • d spatial resolution
  • N total number of detected fluorescence photons from the particle
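  • The spatial resolution d of these localization-based methods improves with the total number of detected photons N. The sketch below uses the commonly cited photon-limited approximation, in which the localization precision scales as the width of the PSF divided by the square root of N; the function name and the example numbers are illustrative assumptions, not the exact expression intended by this disclosure.

```c
/* Hedged sketch: the standard photon-limited localization-precision
 * estimate, sigma ~ PSF standard deviation / sqrt(N detected photons).
 * Background and pixelation contributions are ignored here.            */
#include <math.h>
#include <stdio.h>

static double localization_precision_nm(double psf_sigma_nm, double n_photons)
{
    return psf_sigma_nm / sqrt(n_photons);
}

int main(void)
{
    /* example values: ~250 nm FWHM PSF (sigma ~ 106 nm), 1000 photons */
    printf("precision ~ %.1f nm\n", localization_precision_nm(106.0, 1000.0));
    return 0;
}
```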
  • technical constraints arising from axial scanning and/or camera readout times limit the recording speed, and therefore, the temporal resolution.
  • a particular 3D particle-tracking technique can track particles only with 32 milliseconds time resolution.
  • This technique scans a 2-photon excitation focus in a 3D orbit around the fluorescent particle and determines its 3D position by analyzing the temporal fluorescence fluctuations. The temporal resolution is ultimately limited by the frequency with which the focus can revolve in 3D around the particle.
  • This technique is described in more detail in an article titled "3-D Particle Tracking In A Two-Photon Microscope: Application To The Study Of Molecular Dynamics In Cells" by V. Levi, Q. Ruan, and E. Gratton (Biophys. J., 2005, 88(4): pp. 2919-28), which is incorporated by reference in its entirety.
  • another current 3D particle-tracking technique combines traditional particle-tracking with widefield “bifocal detection” imaging. Particles are simultaneously detected in one plane close to the focal plane of the particle and a second plane 1 micrometer out of focus. The lateral and axial coordinates are derived from the two images.
  • the temporal resolution is limited to the 2-50 milliseconds range, and the localization accuracy is limited to the 2-5 nanometer range. Additional details are described in an article titled “Three-Dimensional Particle Tracking Via Bifocal Imaging” by E. Toprak et al. (Nano Lett., 2007, 7(7): pp. 2043-45), which is incorporated by reference in its entirety. As such, advances in temporal resolution to sub-millisecond levels have been limited only to 2D imaging.
  • determining the 3D position of a particle by any of the methods mentioned above requires fitting a model function to the respective experimental data.
  • the particle position (and also, typically, the particle brightness and background value) can be deduced from the parameters that fit the experimental data best, according to a chosen figure of merit.
  • an analytical function is used to reasonably model the characteristics that dominantly describe the 3D particle position, e.g., the diameter of the defocused image or the particle ellipticity in the case of astigmatism.
  • the model function is calibrated with imaged particles located at known positions to achieve mapping of the determined fit parameters to real spatial positions.
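  • As an illustration of such a calibration step (see the sketch below), a measured fit parameter, here an assumed ellipticity value from an astigmatism measurement, can be mapped to an axial position by linear interpolation in a calibration table recorded with particles at known positions; the table values and the choice of parameter are hypothetical.

```c
/* Hedged sketch: mapping a fit parameter to an axial position via a
 * measured calibration table and linear interpolation.  The calibration
 * values below are made up for illustration.                           */
#include <stdio.h>

static double z_from_calibration(const double *param, const double *z_nm,
                                 int n, double measured)
{
    /* assumes param[] increases monotonically over the calibrated range */
    if (measured <= param[0])     return z_nm[0];
    if (measured >= param[n - 1]) return z_nm[n - 1];
    for (int i = 1; i < n; i++) {
        if (measured <= param[i]) {
            double t = (measured - param[i - 1]) / (param[i] - param[i - 1]);
            return z_nm[i - 1] + t * (z_nm[i] - z_nm[i - 1]);
        }
    }
    return z_nm[n - 1];
}

int main(void)
{
    /* hypothetical calibration: ellipticity measured for beads at known z */
    const double ellipticity[] = {0.60, 0.80, 1.00, 1.25, 1.60};
    const double z_nm[]        = {-400.0, -200.0, 0.0, 200.0, 400.0};
    printf("z ~ %.0f nm\n", z_from_calibration(ellipticity, z_nm, 5, 1.10));
    return 0;
}
```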
  • a microscopy system is configured for creating 3D images from individually localized probe molecules.
  • the microscopy system includes a sample stage, an activation light source, a readout light source, a beam splitting device, at least one camera, and a controller.
  • the activation light source activates probes of at least one probe subset of photo-sensitive luminescent probes
  • the readout light source causes luminescence light from the activated probes.
  • the activation light source and the readout light source are the same light source.
  • the beam splitting device splits the luminescence light into at least two paths to create at least two detection planes that correspond to the same or different number of object planes of the sample.
  • the camera detects simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest.
  • the controller is programmable to combine a signal from the regions of interest into a 3D data set.
  • a method for creating 3D images from individually localized probe molecules includes mounting a sample on a sample stage, the sample having a plurality of photo-sensitive luminescent probes.
  • probes of at least one probe subset of the plurality of photo-sensitive luminescent probes are activated.
  • luminescence light from the activated probes is caused.
  • the luminescence light is split into at least two paths to create at least two detection planes, the at least two detection planes corresponding to the same or different object planes in the sample.
  • At least two detection planes are detected via a camera.
  • the object planes are recorded in corresponding recorded regions of interest in the camera.
  • a signal from the regions of interest is combined into a 3D data stack.
  • a microscopy system is configured for tracking microscopic particles in 3D.
  • the system includes a sample, a sample stage, at least one light source, a beam-steering device, a beam splitting device, at least one camera, and a controller.
  • the sample, which includes luminescent particles, is mounted to the sample stage.
  • the light source is configured to illuminate an area of the sample to cause luminescence light, primarily, from one tracked particle of the luminescence particles.
  • the beam-steering device is configured to selectively move a light beam to illuminate different areas of the sample such that the luminescence light is detected.
  • the beam splitting device, which is located in a detection light path, splits the luminescence light into at least two paths to create at least two detection planes that correspond to different object planes in the sample.
  • the camera is positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest.
  • the controller is programmable to combine a signal from the recorded regions of interest, determine a 3D trajectory of the particle at each time point of a recorded data sequence, and move the beam-steering device to illuminate the different areas of the sample in accordance with corresponding positions of the one tracked particle.
  • a method for tracking microscopic particles in 3D includes mounting a sample on a sample stage, the sample including luminescent particles.
  • a small area of the sample is illuminated to cause luminescence light from primarily one particle of the luminescent particles.
  • the light beam is selectively moved to illuminate different areas of the sample to track movement of the one particle, the different areas including the small area of the sample and corresponding to respective positions of the one particle.
  • the luminescence light is split into at least two paths to create at least two detection planes that correspond to the same or different number of object planes in the sample.
  • the at least two detection planes are detected simultaneously.
  • the number of object planes is represented in a camera by the same number of recorded regions of interest. Based on a combined signal from the recorded regions of interest, a 3D trajectory of the one particle is determined at each time point of a recorded data sequence.
  • a microscopy system is configured for creating 3D images from individually localized probe molecules.
  • the microscopy system includes an activation light source, a readout light source, a beam splitting device, at least one camera, and a controller.
  • the activation light source is configured to illuminate a sample with an activation light, the activation light source being configured to activate probes of at least one probe subset of the plurality of photo-sensitive luminescent probes.
  • the readout light source is configured to illuminate the sample with a readout light, the readout light source being configured to cause luminescence light from the activated probes.
  • the beam splitting device is located in a detection light path that splits the luminescence light into at least two paths, the beam splitting device creating at least two detection planes that correspond to the same or different number of object planes of the sample.
  • the camera is positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest.
  • the controller is programmable to combine a signal from the regions of interest into a 3D data stack, to calculate a figure of merit as a measure of the difference between imaged pixel data and a reference data set h_0(ξ_l) modifiable by at least one parameter, and to optimize the figure of merit by adjusting the at least one parameter.
  • FIG. 1 is a schematic view illustrating a biplane microscope setup for Fluorescence Photoactivation Localization Microscopy (FPALM), according to one embodiment.
  • FPALM Fluorescence Photoactivation Localization Microscopy
  • FIG. 2 is a schematic view illustrating a biplane microscope setup, according to an alternative embodiment.
  • FIG. 3 is a schematic view illustrating a fluorescent particle image on a CCD chip.
  • FIG. 4A is a graph representing an axial resolution measured from an axial profile of caged fluorescein-labeled antibodies.
  • FIG. 4B is a representative image showing added-up projections of a data set in three different orientations for the axial resolution measured in FIG. 4A .
  • FIG. 5A is a representative image of a data set for beads labeled with caged fluorescein at an axial position of 300 nanometers.
  • FIG. 5B illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of 100 nanometers.
  • FIG. 5C illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of −100 nanometers.
  • FIG. 5D illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of −300 nanometers.
  • FIG. 5E illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of −500 nanometers.
  • FIG. 5F illustrates a volume-rendered representation of the data set illustrated in FIGS. 5A-5E .
  • FIG. 6 is a schematic view illustrating adjustment of a biplane microscope setup, according to an alternative embodiment.
  • FIG. 7A is a schematic view illustrating a fluorescent particle image on a CCD chip when the particle is in focus, in a first position.
  • FIG. 7B is a schematic view illustrating the fluorescent particle image of FIG. 7A when the particle is out of focus, in a second position.
  • FIG. 7C is a schematic view illustrating the fluorescent particle image of FIG. 7B when the particle is in focus, in a third position.
  • FIG. 8 is a diagrammatic representation of an exemplary embodiment of a photoactivation localization microscopy method.
  • FIG. 9 is a diagrammatic representation of an exemplary embodiment of a single particle tracking method.
  • FIG. 10 is a diagrammatic representation of an exemplary conventional embodiment of a non-iterative localization algorithm.
  • FIG. 11 is a diagrammatic representation of an exemplary conventional embodiment of an iterative localization algorithm.
  • FIG. 12 is a diagrammatic representation of an exemplary embodiment of a localization algorithm.
  • FIG. 13 is a diagrammatic representation of an exemplary embodiment of an algorithm for calculating a figure of merit.
  • FIG. 14 is a diagrammatic representation of an exemplary embodiment of a Fourier-Transform based Shift Algorithm.
  • a biplane (“BP”) microscope system 100 allows 3D imaging at an unmatched resolution well below 100 nanometers in all three dimensions, resulting in at least a 100-fold smaller resolvable volume than obtainable by conventional 3D microscopy.
  • the BP microscope system 100 is optionally a BP FPALM system, which is generally based on a conventional FPALM design.
  • the BP microscope system 100 includes a modified detection path that allows the simultaneous detection from two focal planes.
  • the simultaneous detection of two planes for localization-based super-resolution microscopy speeds up the imaging process by making axial scanning unnecessary, and more importantly, in contrast to scanning-based systems, eliminates localization artifacts caused by abrupt blinking and bleaching common to single molecules.
  • the BP microscope system 100 can optionally be located on an air-damped optical table to minimize vibrations.
  • the BP microscope system 100 can also achieve temporal resolution below 1 millisecond.
  • the BP microscope system 100 can also be a next-generation 3D particle-tracking microscope (“3D PTM”) for providing unprecedented temporal and spatial resolution when tracking fluorescent particles in live cells in 3D.
  • 3D PTM next-generation 3D particle-tracking microscope
  • FPALM and particle-tracking are just some exemplary applications of the BP microscope system 100 .
  • the BP microscope system 100 tracks one particle at a time (in contrast to conventional 2D and 3D tracking techniques that visualize the entire field).
  • the BP microscope system 100 can include a detection scheme without any moving parts that detects simultaneously two axially shifted detection planes.
  • the BP microscope system 100 can include a focused laser beam for excitation combined with spatially limited detection. Background light is filtered out to avoid localization disturbances and to increase sensitivity in samples thicker than about 1 micrometer. This enables particle-tracking even in tissue sections.
  • the BP microscope system 100 can include, for example, high-speed piezo-mirrors and a fast piezo-driven sample stage. The combination of focused excitation and feedback-driven beam-tracking reduces the background and enhances the speed limit by approximately one order of magnitude.
  • a second (different) luminescence color can be detected to enable correlative studies of the movement of the tracked particle.
  • Illumination for readout and activation can be provided by a readout laser 102 , operating typically at 496 nanometers, and an activation laser 104 (e.g., 50 mW, Crystalaser), operating typically at 405 nanometers.
  • the readout laser 102 is optionally a water-cooled Argon laser (e.g., Innova 70, Coherent Inc.) that can provide 458, 472, 488, 496, or 514 nanometers for readout illumination.
  • the wavelength of the readout laser 102 is selected to minimize activation of inactive probes of a plurality of photo-sensitive probes of a sample 124 .
  • the readout laser 102 and the activation laser 104 can be the same source.
  • the readout laser 102 can perform both the readout functions and the activation functions, without requiring the use of the activation laser 104 .
  • at least one illuminated area of the sample 124 is a relatively small area, having, for example, a general diameter that is less than about three times an Airy disk diameter.
  • Both lasers 102, 104 are combined, via a first dichroic beam splitter 110, and coupled, via a second dichroic beam splitter 120, into a microscope stand 106 equipped with a 63× 1.2 NA water immersion tube lens 108 after passing through a field aperture 107.
  • Both lasers 102 , 104 can be switched on and off by software-controlled electrical shutters (e.g., SH05, Thorlabs).
  • Other components that may be included along the path between the lasers 102 , 104 and the microscope stand 106 are a first mirror 112 and a first lens 114 .
  • the microscope stand 106 can have a plurality of components, including a sample stage 116 and an objective 118 .
  • the sample 124, including for example a biological cell 124a, is generally positioned on the sample stage 116.
  • the sample stage 116 can be a mechanical stage or a three-axis piezo stage (e.g., P-733.3DD, Physik Instrumente).
  • Other components which are not shown, may include shutters in front of the lasers 102 , 104 and further optics for folding the beam path.
  • Fluorescence is collected by the objective 118 , passes through a second dichroic beam splitter 120 (which reflects the laser light) and is focused by the tube lens 108 via an optional second mirror 122 (e.g., a piezo-driven mirror) into an intermediate focal plane 140 .
  • the focal plane 140 is imaged by two lenses—a second lens 128 and a third lens 132 —onto a high-sensitivity EM-CCD camera 126 (e.g., DU897DCS-BV iXon, Andor Technology).
  • Scattered laser light is attenuated by bandpass and Raman edge filters (e.g., Chroma and Semrock), such as filter 130 .
  • the detection scheme can be achieved by moving the CCD camera 126 out of the standard image plane closer to the tube lens 108 and thereby shifting the corresponding focal plane approximately 350 nanometers deeper into the sample.
  • a beam splitter cube 134 is placed into a focused light path 136 a in front of the CCD camera 126 .
  • the beam splitter cube 134 redirects a reflected light path 136b via a third mirror 138 towards the CCD camera 126 to form a second image in a different region of the same CCD. Due to the longer optical path, this second image corresponds to a focal plane approximately 350 nanometers closer to the objective 118 than the original focal plane.
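  • The plane separation is set by the extra image-space path length of the reflected beam and the axial magnification of the detection optics. The sketch below uses the standard paraxial relation, axial magnification approximately equal to the lateral magnification squared divided by the immersion index (consistent with the remark on axial magnification later in this description); the 63× and n = 1.33 values are taken from the water immersion optics mentioned above, and the result is an order-of-magnitude illustration rather than a stated design value.

```c
/* Rough, hedged estimate (not a value stated in this disclosure): the
 * extra image-space path length corresponding to a ~700 nm object-plane
 * separation, using M_axial ~ M_lateral^2 / n.                         */
#include <stdio.h>

int main(void)
{
    double m_lat   = 63.0;         /* lateral magnification (63x objective)  */
    double n_imm   = 1.33;         /* refractive index of water immersion    */
    double m_axial = m_lat * m_lat / n_imm;

    double dz_sample_nm  = 700.0;  /* desired object-plane separation        */
    double extra_path_mm = dz_sample_nm * m_axial * 1e-6;   /* nm -> mm      */

    printf("axial magnification ~ %.0f\n", m_axial);
    printf("extra detection path ~ %.1f mm for %.0f nm plane separation\n",
           extra_path_mm, dz_sample_nm);
    return 0;
}
```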
  • the BP microscope system 100, using a single camera, is straightforward to implement and avoids synchronization problems between separate cameras.
  • the BP microscope system 100 features a reasonable field of view of ~20×50 micrometers (pixel size corresponding to ~100 nanometers in the sample 124; 512×512 pixels), sufficient to image large portions of a cell.
  • the BP microscope system 100 is able to image 100 frames per second with a field of view of 10 to 20 micrometers in length and 2×2 binning.
  • the use of the CCD camera 126, which features negligible readout noise due to its on-chip electron multiplication, avoids additional noise that would otherwise result from splitting the light up into two fields as required for BP detection. Combined with the fact that there is minimal loss of fluorescence detection efficiency, this exemplary BP microscope system 100 expands conventional FPALM to 3D imaging without significant drawbacks.
  • BP FPALM technology is compatible with live cell imaging and can be expanded to multicolor imaging (even realizable on the same CCD detector).
  • BP FPALM can record 3D structures in a ~1 micrometer thick z-section without scanning. Larger volumes can be recorded by recording BP FPALM data at different sample positions.
  • BP FPALM can be combined with a 2-photon (“2P”) laser scanner. 2P excitation-mediated activation is directed to diffraction-limited planes of ~800 nanometers thickness, a thickness that is compatible with the axial detection range of BP FPALM.
  • BP FPALM therefore has the potential of imaging specimens such as cell nuclei or tissue sections far exceeding 1 micrometer in thickness.
  • BP FPALM can be readily implemented in practically every existing FPALM, PALM, PALMIRA or STORM instrument. BP FPALM therefore provides the means to investigate a large variety of biological 3D structures at resolution levels previously far out of reach.
  • in BP FPALM, the detected luminescence from activated probes is fluorescence or scattered light.
  • the activation of activated probes is achieved via a non-linear process that limits the activation to a plane of diffraction-limited thickness.
  • 100 nanometer diameter yellow-green fluorescent beads (Invitrogen, F-8803) can be attached to a poly-L-lysine coated cover slip.
  • the sample can be mounted on a piezo stage and imaged in the BP FPALM setup with 496 nm excitation.
  • 101 images at z-positions ranging from −2.5 to +2.5 micrometers with 50 nanometer step size are recorded.
  • the same bead is imaged 2 to 3 times to check for drift and to correct for bleaching.
  • the data set can be smoothed in Imspector with a Gaussian filter of sub-diffraction size. Additionally, the data set can be corrected for mono-exponential bleaching, cropped to an appropriate size, centered, and normalized to 1.
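  • A minimal sketch of such a mono-exponential bleaching correction follows; the per-frame decay constant would in practice be estimated from the repeated images of the same bead mentioned above, and the numbers used here are made up.

```c
/* Hedged sketch: undoing an assumed mono-exponential bleaching decay
 * exp(-k * frame) by multiplying each frame with exp(+k * frame).      */
#include <math.h>
#include <stdio.h>

static void correct_bleaching(double *frames, int n, double k_per_frame)
{
    for (int i = 0; i < n; i++)
        frames[i] *= exp(k_per_frame * i);
}

int main(void)
{
    double series[5] = {100.0, 95.1, 90.5, 86.1, 81.9};  /* ~5%/frame decay */
    correct_bleaching(series, 5, 0.05);
    for (int i = 0; i < 5; i++)
        printf("frame %d: %.1f\n", i, series[i]);
    return 0;
}
```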
  • the accumulation time per frame is typically 10 milliseconds.
  • electron multiplying gain is set to 300, the readout is 2×2 binned, only the region occupied by two recorded regions of interest (“ROIs”) is read out, and, typically, 5,000 to 50,000 frames are recorded.
  • ROIs regions of interest
  • At least some of the ROIs are detected at different wavelengths by including suitable detection filters in the BP microscope system 100 .
  • at least some of the ROIs are detected at different polarization directions by including suitable polarization optics in the BP microscopy system 100 .
  • the BP microscope system 200 includes a microscope stand 202 having a piezo-driven sample stage 204 on which a sample 206 is positioned.
  • the sample 206 includes a plurality of fluorescent particles 206 a - 206 d .
  • the microscope stand 202 further includes an objective 208 and a first lens 210 .
  • the components include a second lens 216, a beam-steering device 218 (e.g., a piezo-driven mirror), a dichroic beam splitter 220, a bandpass filter 222, a third lens 224, a neutral 50:50 beam splitter 226, and a mirror 228.
  • the beam-steering device 218 can include generally a focusing optical element that moves illumination and detection focal planes axially to follow the tracked particle.
  • the beam-steering device 218 can include a phase-modulating device that moves an illuminated area laterally and illumination and detection focal planes axially to follow the tracked particle.
  • more than one piezo-driven mirror 218 can be included in the BP microscope system 200 .
  • a polarized laser beam from a laser 229 is coupled into the microscope stand 202 and focused into the sample 206 by the objective 208 .
  • a fourth lens 230 and a λ/4 plate 232 are positioned between the laser 229 and the dichroic beam splitter 220.
  • the focus can be positioned in the region of interest by moving the sample stage 204 and the beam-steering device 218 .
  • the fluorescence emerging from the focal region is collected by the objective 208 and is imaged onto the CCD camera 214 via the first lens 210 , the second lens 216 , and the third lens 224 .
  • the dichroic beam splitter 220 and the bandpass filter 222 filter out scattered excitation light and other background light.
  • the neutral 50:50 beam splitter 226 splits the fluorescence light into two beam paths, a transmitted beam 215 a and a reflected beam 215 b .
  • the transmitted beam 215 a images light emitted from a plane deeper in the sample onto one area of the CCD chip.
  • the reflected beam 215 b images light from a plane closer to the objective onto another well-separated area to avoid cross-talk.
  • two ROIs on the CCD chip represent two focal planes in the sample 206 (illustrated in FIG. 2 ), typically 700 nanometers apart, arranged like wings of a biplane.
  • the two ROIs include a transmitted ROI 300 and a reflected ROI 302 , each having nine pixels showing an image of the fluorescent particle 206 b from the sample 206 .
  • the dashed areas 304 a - 304 i , 306 a - 306 i depict the pixels that are used for tracking the fluorescent particle 206 b .
  • the two 9-pixel-areas 304 a - 304 i , 306 a - 306 i represent in general the position of the particle 206 b in 3D.
  • the fluorescent particle 206 b which is generally smaller than the laser focus and located in the focal region, is excited homogeneously and 3 (binned) lines (i.e., the two 9-pixel-areas represented by dashed areas 304 a - 304 i , 306 a - 306 i ) of the CCD chip arranged around the laser focus image are read out at every time point. Particles laterally shifted with respect to the laser focus center will appear shifted on the CCD chip.
  • the two 9-pixel-areas 304a-304i, 306a-306i act in the same way as two confocal pinholes in different planes: if the particle 206b moves axially, the signal will increase in one of the 9-pixel-areas and decrease in the other.
  • An axial shift will be represented by a sharper intensity distribution in one of the two 9-pixel-areas depending on the direction of the shift.
  • the 3D position can be determined by subtracting different pixel values of the two 9-pixel-areas from each other. For the axial coordinate (z-axis), the sum of all pixels from one 9-pixel-area can be subtracted from the other 9-pixel-area. The fact that the lateral information is preserved in the 9-pixel-areas allows for lateral localization of the particle 206b at the same time.
  • the signal collected in the left columns 304 a , 304 d , 304 g , 306 a , 306 d , 306 g (or upper rows: 304 a , 304 b , 304 c and 306 a , 306 b , 306 c ) of both 9-pixel-areas 300 and 302 can be subtracted from the one in the right columns 304 c , 304 f , 304 i , 306 c , 306 f , 306 i (or lower rows: 304 g , 304 h , 304 i and 306 g , 306 h , 306 i ).
  • the determined values are approximately proportional to the particle position offset from the center as long as the position stays in a range of ±250 nanometers axially and ±100 nanometers laterally.
  • these values can be fed back to piezo controllers tilting piezo mirrors and moving the sample stage piezo to re-center the particle in the 9-pixel-areas after every measurement.
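  • The sketch below illustrates the pixel-difference estimates described above for the two 9-pixel-areas: the axial coordinate from the difference of the two summed intensities, the lateral coordinates from column and row differences of the combined areas. The pixel values in the example and the scale factors needed to convert the relative results to nanometers are hypothetical and would come from a calibration of the actual setup.

```c
/* Hedged sketch of the 3x3-area position estimate: z from the difference
 * of the summed intensities of the transmitted and reflected areas; x and
 * y from column (right minus left) and row (lower minus upper) differences
 * of the combined areas.  Results are relative and need calibration.     */
#include <stdio.h>

typedef struct { double x, y, z; } Offset;

static Offset estimate_offset(double t[3][3], double r[3][3])
{
    double sum_t = 0.0, sum_r = 0.0, dx = 0.0, dy = 0.0;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++) {
            sum_t += t[i][j];
            sum_r += r[i][j];
            dx += (j - 1) * (t[i][j] + r[i][j]);   /* right minus left columns */
            dy += (i - 1) * (t[i][j] + r[i][j]);   /* lower minus upper rows   */
        }
    double total = sum_t + sum_r;
    Offset o;
    o.x = dx / total;                 /* relative units; calibration needed   */
    o.y = dy / total;
    o.z = (sum_t - sum_r) / total;    /* sign indicates which plane is closer */
    return o;
}

int main(void)
{
    /* hypothetical pixel values: particle nearer the transmitted plane,
       shifted toward the right column and the lower row                   */
    double transmitted[3][3] = {{1, 2, 3}, {2, 5, 6}, {1, 3, 4}};
    double reflected[3][3]   = {{1, 1, 2}, {1, 3, 3}, {1, 2, 2}};
    Offset o = estimate_offset(transmitted, reflected);
    printf("x=%.3f  y=%.3f  z=%.3f (relative units)\n", o.x, o.y, o.z);
    return 0;
}
```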
  • the position can be determined by taking the image shape and brightness into account in the data analysis to increase the tracking range.
  • the pixels of the transmitted ROI 300 show a brighter image than the pixels of the reflected ROI 302 (on the right).
  • the top-right dashed areas 304 b , 304 c , 304 e , 304 f of the transmitted ROI 300 are generally brighter than the other 5 pixels in the same ROI 300 and than all pixels of the reflected ROI 302
  • the fluorescent particle 206 b is located axially more towards the focal plane 140 imaged on transmitted ROI 300 and is shifted by about half the diffraction limit toward the right and top relative to the excitation focus.
  • the signal from the two ROIs 300 , 302 can also be combined into a 3D data stack (2 pixels in z; x and y dimensions are determined by the size of the ROIs 300 , 302 ).
  • Data analysis is a generalization of standard FPALM methods to 3D. Instead of a Gaussian, an experimentally obtained 3D-PSF can be fit to each data set consisting of the pixels around each detected probe molecule. The x, y and z-coordinates of each molecule are determined from the best fit of the molecule image with the PSF.
  • For BP FPALM, typically but not necessarily, larger ROIs 300, 302 are used to allow localization of particles over a larger field of view. Also, several particles can be present in the same ROI and still be analyzed separately. Slight variations in the magnification and rotation between the two detection areas may be corrected by software before combination of the two ROIs 300, 302 into a 3D data stack. The slight difference in the tilt of the focal planes between the two ROIs 300, 302 is negligible because of the large axial magnification (proportional to the lateral magnification squared).
  • the analysis of the 3D data can be seen as the generalization of standard 2D FPALM analysis to 3D.
  • Particles are identified in the z-projected images by iteratively searching for the brightest pixels and eliminating this region in the subsequent search until a lower intensity threshold has been reached.
  • the raw data may be cut out in each ROI 300 , 302 around each found particle in a square window of, for example, 10-19 pixels long and wide.
  • a theoretical or experimentally obtained 3D-PSF can be fitted to the data sets in this cutout window using a simplex fitting algorithm adapted from Numerical Recipes in C, or a different algorithm.
  • the algorithm can be a localization algorithm that is independent of theoretical models and, therefore, is generally applicable to a large number of experimental realizations. A more detailed description of the localization algorithm is provided below, after the description of FIGS. 7A-7C .
  • the localized position is extracted and stored. Additionally, amplitude, background, the deviation from the cutout windows center, the number of iterations and the chi square value are stored, which allow later determination of the quality of the fit.
  • the stored list of fit results is analyzed and translated into 3D data sets of customizable voxel sizes.
  • the fit amplitude is used as the voxel intensity for every molecule found that fulfills the user-defined quality criteria.
  • data acquisition may be controlled by the camera software (Solis, Andor Technology).
  • Software to operate the microscope with the piezo stage, for fitting, and to create 3D data sets, may be programmed in LabView 8.2 (National Instruments). Imspector (Andreas Schoenle, Max Planck Institute for Biophysical Chemisty, Goettingen, Germany) is used for display and analysis of 3D data sets. 3D rendered images may be created using Amira.
  • a graph illustrates the axial resolution measured using a BP FPALM setup. Specifically, the axial resolution is measured from an axial profile of caged fluorescein-labeled antibodies on a cover slip and embedded in 87% glycerol. The black line represents raw data and the dashed line represents a Gaussian fit.
  • FWHM full-width-at-half-maximum
  • an inset shows added-up projections of the data set (of FIG. 4A ) in three different orientations.
  • the white box marks the region used to generate the axial profile.
  • the scale bar of the original images was 2 micrometers.
  • the data shown in all planes of FIGS. 5A-5F is recorded simultaneously without scanning. Especially to image samples thicker than 1 micrometer, the sample stage can be moved after finishing recording at one sample position to access different sample depth positions, and the data recording process is repeated until all sample positions of interest have been recorded.
  • in FIG. 5F, a volume-rendered representation is shown based on the data sets of FIGS. 5A-5E.
  • the curved surface of the bead is nicely reproduced over nearly 1 μm in depth without scanning.
  • the optical images show resolution well below 100 nanometers in all three dimensions. With approximately 30×30×80 nm³, the resolvable volume is ~500-fold below the diffraction-limited observation volume and represents the smallest observation volume achieved in a far-field light microscope.
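  • A back-of-the-envelope check of the quoted ~500-fold figure is sketched below; the diffraction-limited observation volume used for comparison (roughly 250×250×600 nm³) is an assumed typical value, not a number taken from this disclosure.

```c
/* Hedged arithmetic check of the ~500-fold volume reduction quoted above. */
#include <stdio.h>

int main(void)
{
    double v_bp   = 30.0 * 30.0 * 80.0;      /* reported resolvable volume, nm^3   */
    double v_diff = 250.0 * 250.0 * 600.0;   /* assumed diffraction-limited volume */
    printf("ratio ~ %.0f-fold\n", v_diff / v_bp);
    return 0;
}
```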
  • a BP microscope system 600 is illustrated to show the tracking of a single particle 606 positioned on a sample stage 604 .
  • the BP microscope system 600 is generally similar to the BP microscope system 200 described above in reference to FIG. 2.
  • the fluorescence light beam is adjusted by tilting one or more piezo-mounted mirrors or adjusting alternative beam-steering devices 618 .
  • the piezo-mounted mirror 618 is tilted counterclockwise from a first position (indicated in solid line) to a second position (indicated in dashed line).
  • the rotation of the mirror 618 steers the fluorescence light beam on the camera as well as the excitation light beam focusing into the sample and coming from the laser to correct for sideways movement of the particle 606 .
  • the mirror 618 is rotated until the excitation light beam is again centered on the particle 606 .
  • the sample stage 604 is moved up or down to correct for vertical movement.
  • a suitable beam-steering device 618 refocuses the beam vertically. After the necessary adjustments are made to track the particle 606 , the positions of the piezo and stage are recorded to reconstruct large scale movement in post-processing.
  • Referring to FIGS. 7A and 7B, two insets show the images recorded when a particle moves from a first position to a second position as described above in reference to FIG. 6.
  • a transmitted ROI 700 a and a reflected ROI 700 b are recorded on a CCD chip when the particle is in the first position.
  • the pixels of the transmitted ROI 700 a show the same focus and intensity as the pixels in the reflected ROI 700 b .
  • a black box surrounds a general 5 ⁇ 5 pixel area of interest.
  • the transmitted ROI 700 a and the reflected ROI 700 b change such that the respective pixels in the area of interest are now out of focus and of different intensity.
  • the pixels of the transmitted ROI 700 a are now generally brighter (i.e., more intense) than in the first position, and off-center with respect to the area of interest (i.e., up and to the right).
  • the pixels of the reflected ROI 700 b are now generally less bright (i.e., less intense) than in the first position, and off-center with respect to the area of interest (i.e., up and to the right).
  • the fluorescence light beam has now been steered to center it on the particle 606 in the second position.
  • the pixels of the transmitted ROI 700 a and of the reflected ROI 700 b are generally similar, if not identical, to the pixels illustrated in the first position of the particle 606 (shown in FIG. 7A ). Specifically, the pixels are generally centered within the area of interest and are now of similar intensity in both the transmitted ROI 700 a and the reflected ROI 700 b.
  • 3D reference data sets are used to fit data sets obtained by either a multi-plane approach or an astigmatism approach.
  • all raw data that contributes to the image of a particle is taken into account by the fitting process according to the particle's statistical weight, which is especially relevant for photon-limited applications such as imaging single molecules. Additional calibration steps are generally not necessary because the raw data and the fit reference data set are acquired by the same setup.
  • the localization algorithm converges in close to 100% of the cases over a range of 1 to 2 μm, as characterized by the fraction of images in which the particle could be localized correctly.
  • this fraction depends on the axial particle position and the number of detected photons, N_det. Only for axial particle positions far away from either of the focal detection planes may the localization algorithm fail to converge properly.
  • biplane detection is capable of localizing particles over a range nearly twice the value achievable by astigmatic detection. This is a relevant feature in imaging of thick biological samples and, therefore, the biplane mode seems to be a favorable mode for these applications.
  • the fact that the signal in biplane detection is spread over double the number of pixels does not have a detectable negative effect in the experimental setup.
  • the localization algorithm works independently of a theoretical model function and instead uses experimentally obtained reference data sets. This enables the algorithm to account for experimental deviations from perfect theoretical descriptions that are often difficult to include accurately in theoretical models and, also, reduces artifacts in the localization process. Additionally, complex theoretical models that cannot be described by a simple formula are now accessible by numerically generating a reference data set to feed into the algorithm. Small systematic deviations of the determined positions from actual positions can result from differences between the used reference data sets and the data generated during imaging, or from the fact that the reference data set is of finite size, and can be readily corrected by proper calibration curves.
  • the localization algorithm allows general applicability to a range of 3D localization strategies without a need to develop individual theoretical model functions for every case. Additionally, it can be readily applied to recently reported detection schemes of iPALM and double-helix reference data sets.
  • the iPALM detection scheme is described in more detail in an article titled "Interferometric fluorescent super-resolution microscopy resolves 3D cellular ultrastructure" by G. Shtengel, J. A. Galbraith, C. G. Galbraith, J. Lippincott-Schwartz, J. M. Gillette, S. Manley, R. Sougrat, C. M. Waterman, P. Kanchanawong, M. W. Davidson, R. D. Fetter, and H. F. Hess (Proc. Natl. Acad. Sci. USA, 2009, 106(9): pp. 3125-3130), which is incorporated by reference in its entirety.
  • the localization algorithm is not limited to the five fitting parameters identified below (i.e., x-position, y-position, z-position, amplitude, and background) or to 3D reference data sets.
  • the localization algorithm can be readily expanded to include parameters such as interference phase, polarization, or wavelength by providing experimental reference data sets that provide the necessary information in a fourth, fifth, or even higher dimension. Alternatively, fewer parameters and dimensions are possible, allowing for application to 2D imaging.
  • Image acquisition software that controls camera and piezo actuator parameters may be written, for example, using LabVIEW 8.2 software in Windows XP, the software being available from National Instruments Corp., Austin, Tex. Recorded data is stored in a raw data format that is later analyzed by separate analysis software programmed in C, which may run on a Linux computer cluster (e.g., 31 compute nodes, each equipped with two dual-core AMD Opteron processors and 16 GB of RAM and connected by a single gigabit Ethernet network).
  • ROIs corresponding to an illuminated field of view are cut out automatically.
  • in biplane detection mode, the two ROIs in every frame representing the two detected planes are co-registered by slightly tilting and magnifying one ROI according to earlier determined calibration values.
  • Particles are identified as the brightest pixels in smoothed versions of the ROIs.
  • one ROI of 15×15 pixels at 2×2 binning (corresponding to 1.9 μm × 1.9 μm in the sample) is cut out from the non-smoothed data centered on the identified brightest pixel in the astigmatic (biplane) detection mode. This data is then corrected for an electronic offset in the signal stemming from the camera, translated from counts into number of photons, and fed into the fit algorithm.
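  • The count-to-photon conversion mentioned above can be sketched as follows for an EM-CCD signal chain; the offset, EM gain, and sensitivity values are placeholders, not the calibration of the instrument described here.

```c
/* Hedged sketch: converting EM-CCD counts to detected photons by removing
 * the electronic offset and dividing out the gain of the signal chain.   */
#include <stdio.h>

static double counts_to_photons(double counts, double offset,
                                double em_gain, double electrons_per_count)
{
    double photons = (counts - offset) * electrons_per_count / em_gain;
    return photons > 0.0 ? photons : 0.0;     /* clip negative noise values */
}

int main(void)
{
    /* placeholder values: offset 100 counts, EM gain 300, 11 e- per count */
    printf("%.1f photons\n", counts_to_photons(1500.0, 100.0, 300.0, 11.0));
    return 0;
}
```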
  • the fit algorithm provides the best estimates for the three spatial particle coordinates of the particle, as well as the amplitude and a background value of the particle.
  • the coordinates, amplitude, and/or background value are stored together with other parameters that indicate quality of the fit (e.g., χ²-values, number of iterations before convergence, etc.) in ASCII data lists. These lists are later compiled into the data presented below using, for example, computer programs such as Microsoft Excel, Origin (OriginLab, Northampton, Mass.), and LabVIEW.
  • the same software, especially the same fit algorithm, may be applied to data from both imaging modes, with the only difference being that in biplane mode two ROIs are used instead of one. Combined with the required minimal changes in the optical setup, this ensures optimal conditions for a direct and thorough comparison of the two methods of 3D localization.
  • the fit algorithm, also referred to as a particle localization routine, performs for every identified particle a least-squares fit based on the Nelder-Mead downhill simplex method. This method is described in more detail in the book "Numerical Recipes: The Art of Scientific Computing" by W. H. Press, S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery (Cambridge University Press, 2007), which is incorporated by reference in its entirety.
  • This algorithm finds the best fit by successively contracting an m-dimensional polytope (a "simplex" with m+1 vertices) around the minimum of the figure-of-merit function in m-dimensional parameter space.
  • the figure-of-merit function χ² is calculated from the error-weighted squared differences between the observed number of photons, n_j, and a model function, F, which depends on a set of fit parameter values, summed over all pixels j that describe the image of an identified particle: χ² = Σ_j (n_j − F(x_j))² / n_j.
  • the 3D positions x_j describe the coordinates in the sample and correspond to a lattice of 15×15×2 or 15×15×1 extracted pixels in biplane and astigmatic detection mode, respectively.
  • the 3D positions can represent any distribution matching the experimental imaging conditions.
  • the estimated statistical error of n_j is assumed to be n_j^(1/2) because shot noise is generally the main error contribution.
  • the normalized instrument response at point x for a particle located at position a is described by h_a(x) and is derived from experimentally obtained reference data sets.
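  • A minimal sketch of this figure of merit follows. The model form F_j = A·h_a(x_j) + b, with amplitude A and background b, is an assumption based on the fit parameters listed in this description; h_a is the reference data set already shifted to the trial position a.

```c
/* Hedged sketch: chi^2 = sum_j (n_j - F_j)^2 / n_j with the assumed model
 * F_j = amplitude * h_a[j] + background and shot-noise weighting.        */
#include <stdio.h>

static double chi_square(const double *n_obs, const double *h_a, int n_pixels,
                         double amplitude, double background)
{
    double chi2 = 0.0;
    for (int j = 0; j < n_pixels; j++) {
        double model = amplitude * h_a[j] + background;
        double diff  = n_obs[j] - model;
        double var   = n_obs[j] > 1.0 ? n_obs[j] : 1.0;  /* shot-noise weight */
        chi2 += diff * diff / var;
    }
    return chi2;
}

int main(void)
{
    /* toy 5-pixel example with made-up numbers */
    double n_obs[5] = {12.0, 48.0, 103.0, 51.0, 11.0};
    double h_a[5]   = {0.10, 0.45, 1.00, 0.48, 0.09};   /* shifted reference */
    printf("chi^2 = %.3f\n", chi_square(n_obs, h_a, 5, 100.0, 2.0));
    return 0;
}
```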
  • the reference data set h_0(ξ_l) is defined for a lattice of voxel coordinates ξ_l.
  • Simple linear interpolation and related methods generate points of non-differentiability that can cause failure of proper convergence of the simplex method and induce localization artifacts. To address this issue, the following interpolation method based on Fourier transforms has been developed.
  • evaluating h_0 at x − a can be interpreted as determining the function value at the nearest node ξ_l after shifting the whole function by the amount necessary to make (x − a) coincide with ξ_l.
  • This shift can be achieved by convolving the function with a suitably displaced Dirac delta distribution, which in Fourier space corresponds to multiplying the transform by a phase factor.
  • H_0(ξ_l) is the optical transfer function (“OTF”) of the system and is calculated once at the beginning of the fit procedure as the Fourier transform of h_0(ξ_l).
  • OTF optical transfer function
  • the spacing of the pixel positions x_j is an integer multiple of the spacing of the reference data set nodes ξ_l.
  • D_a(x) is independent of x_j, and a single inverse Fourier transformation of Eq. 3 is sufficient to find all the reference data set values required to calculate χ² for a given shift a.
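  • The sketch below illustrates the Fourier-transform based shift in one dimension (the procedure described here operates on the 3D reference data set): the transform of the reference data is multiplied by a phase factor corresponding to the sub-node shift and transformed back, yielding the data resampled at the shifted positions. A naive DFT keeps the example self-contained; production code would use an FFT library, and the single bright node used as reference data is made up.

```c
/* Hedged 1D illustration of the Fourier-shift interpolation: multiply the
 * DFT of the reference data by exp(-2*pi*i*k*s/N) (signed frequencies k)
 * and transform back to obtain the data shifted by the sub-node amount s. */
#include <complex.h>
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define N 16

static void fourier_shift(const double *h0, double *shifted, double s)
{
    double complex H[N];
    for (int k = 0; k < N; k++) {                 /* forward DFT             */
        H[k] = 0.0;
        for (int n = 0; n < N; n++)
            H[k] += h0[n] * cexp(-2.0 * M_PI * I * k * n / N);
    }
    for (int n = 0; n < N; n++) {                 /* inverse DFT with shift  */
        double complex v = 0.0;
        for (int k = 0; k < N; k++) {
            int kk = (k <= N / 2) ? k : k - N;    /* signed frequency index  */
            v += H[k] * cexp(-2.0 * M_PI * I * kk * s / N)
                      * cexp( 2.0 * M_PI * I * k * n / N);
        }
        shifted[n] = creal(v) / N;
    }
}

int main(void)
{
    double h0[N] = {0.0}, hs[N];
    h0[8] = 1.0;                                   /* a single bright node   */
    fourier_shift(h0, hs, 0.5);                    /* shift by half a node   */
    for (int n = 6; n <= 10; n++)
        printf("node %2d: %+.3f\n", n, hs[n]);
    return 0;
}
```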
  • fluorescent latex beads of 100 nm diameter with an emission maximum at 560 nm (F-8800, Invitrogen, Carlsbad, Calif.) were imaged. Beads were adhered on poly-L-lysine coated (Sigma-Aldrich, St. Louis, Mo.) cover slips, immersed in water, and mounted on a slide. Bead density was chosen low enough that only about eight to twelve beads were visible in the field of view when imaging. This guaranteed that fluorescence from neighboring particles did not influence the analysis.
  • an experimentally obtained reference data set (also referred to as a point-spread function) replaces theoretical models used elsewhere.
  • Because this reference data set replaces the theoretical model, great care was exercised in its generation during the experiments.
  • the same bead samples as in later experiments were imaged at maximum electron-multiplying gain of the camera without pixel binning.
  • Single frames were recorded with acquisition times of 30 ms at 50 nm axial piezo steps over a range of 10 μm.
  • the background was removed and the data was corrected for bleaching that had occurred during the imaging process.
  • the reference data sets were then cut to a size of approximately 3.8 μm × 3.8 μm × 7.5 μm with the reference data set centered in the middle.
  • both reference data sets were cut in an identical way such that (i) the stack centers were located axially in the middle between the two reference data set centers and (ii) the reference data set centers maintained their original axial distance.
  • this assured maximum processing speed in the fit algorithm, which depends on the number of reference data set voxels, without altering the optical characteristics of the original reference data set.
  • the reference data set has been further normalized to a maximum value of 1 for easier determination of reasonable start parameters for the fit algorithm.
  • the brighter reference data set was normalized to 1 and the other reference data set was normalized to a lower value.
  • an exemplary photoactivation localization microscopy method in accordance with the features described above includes providing recorded raw images ( 801 ) from which particles are identified in images ( 803 ).
  • the photoactivation localization microscopy method can be performed in both 2D and 3D.
  • a computation is made separately for each identified particle ( 805 ) and pixels are extracted in regions of interest centered around each of the identified particles ( 807 ).
  • Each identified particle is localized by determining the particle position from the intensity distribution in the region of interest ( 809 ). Additional details regarding the localization of each identified particle are provided in FIGS. 10-12.
  • the determined positions of all the particles are merged ( 811 ) and a particle distribution map is created from the determined particle positions ( 813 ).
  • the particle distribution map is provided as a resulting image ( 815 ).
  • an exemplary single particle tracking method in accordance with the features described above includes providing a recorded image sequence ( 901 ) from which a particle is identified in each frame ( 903 ).
  • the single particle tracking method can be performed in both 2D and 3D.
  • a computation is made separately for each frame ( 905 ) and pixels are extracted in regions of interest centered around the particle ( 907 ).
  • the particle is localized by determining the particle position from the intensity distribution in the region of interest ( 909 ). Additional details regarding the localization of each identified particle are provided in FIGS. 10-12.
  • the determined positions of the particle are merged ( 911 ) and a particle trajectory is created from the determined particle positions ( 913 ).
  • the resulting particle trajectory is provided ( 915 ).
  • an example of a conventional non-iterative localization algorithm includes extracting pixels for one particle (imaging) or one point in time (particle tracking) ( 1001 ), after which the center of mass is calculated ( 1003 ). The particle position is then determined ( 1005 ).
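  • A simple sketch of the non-iterative estimate referenced above, the intensity-weighted center of mass of the extracted pixel window, follows; the example pixel values are arbitrary.

```c
/* Hedged sketch: intensity-weighted centre of mass of a pixel window.    */
#include <stdio.h>

static void center_of_mass(const double *pix, int w, int h,
                           double *cx, double *cy)
{
    double sum = 0.0, sx = 0.0, sy = 0.0;
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            double v = pix[y * w + x];
            sum += v;
            sx  += v * x;
            sy  += v * y;
        }
    *cx = sx / sum;       /* in pixel units; multiply by pixel size for nm */
    *cy = sy / sum;
}

int main(void)
{
    double img[9] = {0, 1, 0,  1, 4, 2,  0, 1, 1};   /* arbitrary 3x3 image */
    double cx, cy;
    center_of_mass(img, 3, 3, &cx, &cy);
    printf("centre of mass: (%.2f, %.2f) pixels\n", cx, cy);
    return 0;
}
```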
  • an example of a conventional iterative localization algorithm includes extracting pixels for one particle (imaging) or one point in time (particle tracking) ( 1101 ). After calculating the center of mass ( 1103 ), a first guess of the particle position ( 1105 ) is provided to calculate the figure of merit as a measure of the difference between the imaged pixel data and a model function based on the guessed particle position ( 1107 ). The guess is modified ( 1109 ) and the figure of merit is calculated based on the modified guess ( 1111 ). If the figure of merit has not decreased ( 1113 ), the old guess is used to determine if the figure of merit is below a specific threshold ( 1115 ).
  • If the figure of merit has decreased ( 1113 ), the modified guess is used to determine if the figure of merit is below the specific threshold ( 1115 ). Then, if the figure of merit is below the specific threshold, the particle position is outputted as the last guess ( 1117 ). However, if the figure of merit is not below the specific threshold, the guess is modified ( 1109 ) and the figure of merit is calculated based on the modified guess ( 1111 ).
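  • The accept/reject structure of this iterative scheme can be sketched as follows. The coordinate-wise perturbation and the quadratic stand-in merit function are illustrative assumptions only; they are not the simplex routine or the chi-square comparison against pixel data described later in this text.

```c
/* Hedged sketch of the iterative accept/reject loop: perturb one coordinate
 * at a time, keep a change only if it lowers the figure of merit, and stop
 * when the merit falls below a threshold or the iteration budget runs out. */
#include <stdio.h>

static double figure_of_merit(const double p[3])       /* placeholder merit */
{
    const double truth[3] = {120.0, -40.0, 260.0};     /* assumed true x,y,z */
    double m = 0.0;
    for (int c = 0; c < 3; c++)
        m += (p[c] - truth[c]) * (p[c] - truth[c]);
    return m;
}

static void localize(double p[3], double threshold, int max_iter)
{
    double best = figure_of_merit(p);
    double step = 100.0;                                /* nm, assumed        */
    for (int it = 0; it < max_iter && best > threshold; it++) {
        for (int c = 0; c < 3; c++)                     /* perturb x, y, z    */
            for (int dir = -1; dir <= 1; dir += 2) {
                p[c] += dir * step;
                double m = figure_of_merit(p);
                if (m < best) best = m;                 /* keep the new guess */
                else          p[c] -= dir * step;       /* revert             */
            }
        step *= 0.7;                                    /* refine the search  */
    }
}

int main(void)
{
    double p[3] = {0.0, 0.0, 0.0};                      /* initial guess      */
    localize(p, 1e-3, 200);
    printf("estimated position: %.2f %.2f %.2f\n", p[0], p[1], p[2]);
    return 0;
}
```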
  • pixels are extracted for one particle (imaging) or one point in time (particle tracking) ( 1201 ).
  • After calculating the center of mass ( 1203 ), a first guess of the particle position ( 1205 ) and a reference data set ( 1207 ) are provided to calculate the figure of merit as a measure of the difference between the imaged pixel data and the reference data set adjusted for the guessed particle position ( 1209 ).
  • Exemplary algorithms for calculating the figure of merit are further described below in reference to FIGS. 13 and 14 .
  • the guess is modified ( 1211 ) and the figure of merit is calculated based on the modified guess ( 1213 ). If the figure of merit has not decreased ( 1215 ), the old guess is used to determine if the figure of merit is below a specific threshold ( 1217 ). If the figure of merit has decreased ( 1215 ), the modified guess is used to determine if the figure of merit is below the specific threshold ( 1217 ). Then, if the figure of merit is below the specific threshold, the particle position is outputted as the last guess ( 1219 ). However, if the figure of merit is not below the specific threshold, the guess is modified ( 1211 ) and the figure of merit is calculated based on the modified guess ( 1213 ).
  • an algorithm for calculating the figure of merit includes extracting pixels for a particle ( 1301 ), guessing the particle position, brightness, and background ( 1303 ), and providing the reference data set ( 1305 ).
  • the figure of merit is calculated ( 1313 ) based on the model function and the extracted pixels for the particle, and the value of the figure-of-merit function is provided ( 1315 ).
  • an exemplary embodiment for a Fourier-transform based Shift Algorithm includes extracting pixels for a particle ( 1401 ), and guessing the particle position, brightness, and background ( 1403 ).
  • a calculation ( 1405 ) is performed to obtain a reference data set ( 1405 a ), a Fourier Transform ( 1405 b ), and the OTF ( 1405 c ).
  • the OTF is multiplied with a parameter-dependent phase factor ( 1407 ), based on the particle position, brightness, and background.
  • the inverse Fourier Transform is calculated ( 1409 ), providing a shifted reference data set ( 1411 ).
  • the figure of merit is calculated ( 1415 ) based on the model function and the extracted pixels for the particle, and the value of the figure-of-merit function is provided ( 1417 ).

Abstract

A microscopy system is configured for creating 3D images from individually localized probe molecules. The microscopy system includes a sample stage, an activation light source, a readout light source, a beam splitting device, at least one camera, and a controller. The activation light source activates probes of at least one probe subset of photo-sensitive luminescent probes, and the readout light source causes luminescence light from the activated probes. The beam splitting device splits the luminescence light into at least two paths to create at least two detection planes that correspond to the same or different number of object planes of the sample. The camera detects simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest. The controller is programmable to combine a signal from the regions of interest into a 3D data set.

Description

    FIELD OF THE INVENTION
  • The present invention relates generally to microscopic imaging and, more specifically, to three-dimensional (“3D”) sub-100 nanometer resolution by biplane microscope imaging.
  • BACKGROUND OF THE INVENTION
  • Until about a decade ago, resolution in far-field light microscopy was thought to be limited to ˜200-250 nanometers in the focal plane, concealing details of sub-cellular structures and constraining its biological applications. Breaking this diffraction barrier by the seminal concept of stimulated emission depletion (“STED”) microscopy has made it possible to image biological systems at the nano scale with light. Additional details are provided in an article titled “Far-Field Optical Nanoscopy” by Stefan W. Hell (316 Science, 1153-1158, May 25, 2007), which is incorporated herein by reference in its entirety. STED microscopy and other members of the reversible saturable optical fluorescence transitions (“RESOLFT”) family achieve a resolution >10-fold beyond the diffraction barrier by engineering the microscope's point-spread function (“PSF”), also referred to as a “reference data set,” through optically saturable transitions of the (fluorescent) probe molecules.
  • Lately, an emerging group of localization-based techniques has obtained similar resolution in the lateral plane. This group includes fluorescence photoactivation localization microscopy (“FPALM”), photoactivation localization microscopy (“PALM”), stochastic optical reconstruction microscopy (“STORM”), and PALM with independently running acquisition (“PALMIRA”). FPALM is described in more detail in an article titled “Ultra-High Resolution Imaging by Fluorescence Photoactivation Localization Microscopy” by Samuel T. Hess et al. (91 Biophysical Journal, 4258-4272, December 2006), which is incorporated herein by reference in its entirety. PALM is described in more detail in an article titled “Imaging Intracellular Fluorescent Proteins at Nanometer Resolution” by Eric Betzig et al. (313 Science, 1642-1645, Sep. 15, 2006), which is incorporated herein by reference in its entirety. STORM is described in more detail in an article titled “Sub-Diffraction-Limit Imaging by Stochastic Optical Reconstruction Microscopy” by Michael J. Rust et al. (Nature Methods/Advance Online Publication, Aug. 9, 2006), which is incorporated herein by reference in its entirety. PALMIRA is described in more detail in an article titled “Resolution of λ/10 in Fluorescence Microscopy Using Fast Single Molecule Photo-Switching” by H. Bock et al. (88 Applied Physics A, 223-226, Jun. 1, 2007), and an article titled “Photochromic Rhodamines Provide Nanoscopy With Optical Sectioning” by J. Folling et al. (Angew. Chem. Int. Ed., 46, 6266-6270, 2007), each of which is incorporated herein by reference in its entirety. As referred to in the current application, the term photo-sensitive refers to both photo-activatable (e.g., switching probes between an on state and an off state) and photo-switching (e.g., switching between a first color and a second color).
  • While utilizing similar optical switching mechanisms, this latter group of microscopes circumvents the diffraction limit by basing resolution improvement on the precise localization of spatially well-separated fluorescent molecules, a method previously used to track, for example, conventionally labeled myosin V molecules with 1.5 nanometers localization accuracy. This method is described in more detail in an article titled “Myosin V Walks Hand-Over-Hand: Single Fluorophore Imaging With 1.5-nanometers Localization” by Ahmet Yildiz et al. (300 Science, 2061-2065, Jun. 27, 2003), which is incorporated herein by reference in its entirety.
  • To resolve complex nanoscale structures by localization-based methods, the sample is labeled with photo-sensitive probes, such as photo-activatable (“PA”) fluorescent probes (e.g., PA proteins or caged organic dyes). Activation of only a sparse subset of molecules at a time allows their separate localization. By repeated bleaching or deactivation of the active molecules in concert with activation of other inactive probe molecules, a large fraction of the whole probe ensemble can be localized over time. The final sub-diffraction image of the labeled structure is generated by plotting the positions of some or all localized molecules.
  • Based on the rapid development in both RESOLFT and localization-based techniques, the impact of super-resolution far-field fluorescence microscopy on the biological sciences is expected to increase significantly. In 2007 alone, subdiffraction multi-color imaging was reported for the first time for STED microscopy, PALMIRA, and STORM, and FPALM was successfully demonstrated in live cells. Some of these reports are included in an article titled “Two-Color Far-Field Fluorescence Nanoscopy” by Gerald Donnert et al. (Biophysical Journal, L67-L69, Feb. 6, 2007), in an article by M. Bates, B. Huang, G. T. Dempsey, and X. Zhuang (Science 317, 1749-1753, 2007), and in an article titled “Dynamic Clustered Distribution of Hemagglutinin Resolved at 40 nanometers in Living Cell Membranes Discriminates Between Raft Theories” by Samuel T. Hess et al. (Proc. Natl. Acad. Sci. USA 104, 17370-17375, Oct. 30, 2007), each of which is incorporated herein by reference in its entirety.
  • However, the slow progress in 3D super-resolution imaging has limited the application of these techniques to two-dimensional (“2D”) imaging. The best 3D resolution until recently had been 100 nanometers axially at conventional lateral resolution. Achieved by the combination of two objective lens apertures in 4Pi microscopy, this resolution has been available for more than a decade. This is described in more detail in an article titled “H2AX Chromatin Structures and Their Response to DNA Damage Revealed by 4Pi Microscopy” by Joerg Bewersdorf et al. (Proc. Natl. Acad. Sci. USA 103, 18137-18142, Nov. 28, 2006), which is incorporated by reference in its entirety. Only recently have the first 3D STED microscopy images been published, moderately exceeding this resolution with 139 nanometer lateral and 170 nanometer axial resolution. These images are presented in more detail in an article by K. I. Willig, B. Harke, R. Medda, and S. W. Hell (Nat. Methods 4, 915-918, 2007), which is incorporated by reference in its entirety. While this represents a ˜10-fold smaller resolvable volume than provided by conventional microscopy, it is still at least 10-fold larger than a large number of sub-cellular components, for example synaptic vesicles. Recently, an article (Huang et al., Science 2008) reported the first 3D STORM imaging of thin optical sections (<600 nanometers) with sub-100 nanometer 3D resolution under reducing (low oxygen) conditions.
  • Moreover, current understanding of fundamental biological processes on the nanoscale (e.g., neural network formation, chromatin organization) is limited because these processes cannot be visualized at the necessary sub-millisecond time resolution. Current biological research at the sub-cellular level is constrained by the limits of spatial and temporal resolution in fluorescence microscopy. The diameter of most organelles is below the diffraction limit of light, limiting spatial resolution and concealing sub-structure. Recent developments (e.g., STED, FPALM, STORM, etc.) have dramatically enhanced the spatial resolution and even overcome the traditional diffraction barrier. However, comparable improvements in temporal resolution are still needed.
  • Particle-tracking techniques can localize small objects (typically smaller than the diffraction limit) in live cells with sub-diffraction accuracy and track their movement over time. However, conventional particle-tracking fluorescence microscopy cannot temporally resolve interactions of organelles, molecular machines, or even single proteins, which typically happen within milliseconds.
  • The spatial localization accuracy of single particles in a fluorescence microscope is approximately proportional to d/√N (d=spatial resolution; N=total number of detected fluorescence photons from the particle) in the absence of background and effects due to finite pixel size. For example, d=250 nanometers and N=1,000 detected photons correspond to a localization accuracy of roughly 8 nanometers. More signal can be accumulated with longer acquisition times; hence, increasing the temporal resolution comes at the cost of decreased spatial localization accuracy. For bright organelles containing a few hundred fluorescent molecules (or future fluorescent molecules with increased brightness), sufficient signal can be accumulated quickly. However, especially for 3D localization, where data acquisition is far more complicated than in 2D, technical constraints arising from axial scanning and/or camera readout times limit the recording speed and, therefore, the temporal resolution.
  • For example, a particular 3D particle-tracking technique can track particles only with 32 millisecond time resolution. This technique scans a 2-photon excitation focus in a 3D orbit around the fluorescent particle and determines its 3D position by analyzing the temporal fluorescence fluctuations. The temporal resolution is ultimately limited by the frequency with which the focus can revolve in 3D around the particle. This technique is described in more detail in an article titled “3-D Particle Tracking in a Two-Photon Microscope: Application to the Study of Molecular Dynamics in Cells” by V. Levi, Q. Ruan, and E. Gratton (Biophys. J., 2005, 88(4): pp. 2919-28), which is incorporated by reference in its entirety.
  • In another example, another current 3D particle-tracking technique combines traditional particle-tracking with widefield “bifocal detection” images. Particles are simultaneously detected in one plane close to the focal plane of the particle and a second plane 1 micrometer out of focus. The lateral and axial coordinates are derived from the 2 images. In accordance with this technique, the temporal resolution is limited to the 2-50 milliseconds range, and the localization accuracy is limited to the 2-5 nanometer range. Additional details are described in an article titled “Three-Dimensional Particle Tracking Via Bifocal Imaging” by E Toprak et al. (Nano Lett., 2007, 7(7): pp. 2043-45), which is incorporated by reference in its entirety. As such, advances in temporal resolution to sub-millisecond levels have been limited only to 2D imaging.
  • In general, determining the 3D position of a particle by any of the methods mentioned above requires fitting a model function to the respective experimental data. The particle position (and also, typically, the particle brightness and background value) can be deduced from the parameters that fit the experimental data best, according to a chosen figure of merit. In most cases, an analytical function is used to reasonably model the characteristics that dominantly describe the 3D particle position, e.g., the diameter of the defocused image or the particle ellipticity in the case of astigmatism. The model function is calibrated with imaged particles located at known positions to achieve mapping of the determined fit parameters to real spatial positions.
  • This indirect method, especially of acquiring the z-position from abstract fit parameters such as ellipticity or ring diameter, is problematic, however, because it is prone to artifacts in the analysis process. Experimental deviations from the theoretical descriptions by the model function can lead to divergence between real and measured particle positions. Additionally, every model function is limited to a certain optical setup and weighs the information content of the raw data differently. This prevents a direct comparison of different optical setups.
  • Thus, there is a need for a microscopy system that can provide 3D imaging with resolution below 100 nanometers in all three dimensions. Another need is directed to achieving particle-tracking in 3D with a temporal resolution below 1 millisecond for enabling visualization of dynamic sub-cellular processes. The present invention is directed to satisfying one or more of these needs and solving other problems.
  • SUMMARY OF THE INVENTION
  • According to one embodiment, a microscopy system is configured for creating 3D images from individually localized probe molecules. The microscopy system includes a sample stage, an activation light source, a readout light source, a beam splitting device, at least one camera, and a controller. The activation light source activates probes of at least one probe subset of photo-sensitive luminescent probes, and the readout light source causes luminescence light from the activated probes. Optionally, the activation light source and the readout light source are the same light source. The beam splitting device splits the luminescence light into at least two paths to create at least two detection planes that correspond to the same or different number of object planes of the sample. The camera detects simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest. The controller is programmable to combine a signal from the regions of interest into a 3D data set.
  • According to another embodiment, a method for creating 3D images from individually localized probe molecules includes mounting a sample on a sample stage, the sample having a plurality of photo-sensitive luminescent probes. In response to illuminating the sample with an activation light, probes of at least one probe subset of the plurality of photo-sensitive luminescent probes are activated. In response to illuminating the sample with a readout light, luminescence light from the activated probes is caused. The luminescence light is split into at least two paths to create at least two detection planes, the at least two detection planes corresponding to the same or different object planes in the sample. The at least two detection planes are detected via a camera. The object planes are recorded in corresponding recorded regions of interest in the camera. A signal from the regions of interest is combined into a 3D data stack.
  • According to yet another embodiment, a microscopy system is configured for tracking microscopic particles in 3D. The system includes a sample, a sample stage, at least one light source, a beam-steering device, a beam splitting device, at least one camera, and a controller. The sample, which includes luminescence particles, is mounted to the sample stage. The light source is configured to illuminate an area of the sample to cause luminescence light, primarily, from one tracked particle of the luminescence particles. The beam-steering device is configured to selectively move a light beam to illuminate different areas of the sample such that the luminescence light is detected. The beam splitting device, which is located in a detection light path, splits the luminescence light into at least two paths to create at least two detection planes that correspond to different object planes in the sample. The camera is positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest. The controller is programmable to combine a signal from the recorded regions of interest, determine a 3D trajectory of the particle at each time point of a recorded data sequence, and move the beam-steering device to illuminate the different areas of the sample in accordance with corresponding positions of the one tracked particle.
  • According to yet another embodiment, a method for tracking microscopic particles in 3D includes mounting a sample on a sample stage, the sample including luminescent particles. A small area of the sample is illuminated to cause luminescence light from primarily one particle of the luminescent particles. The light beam is selectively moved to illuminate different areas of the sample to track movement of the one particle, the different areas including the small area of the sample and corresponding to respective positions of the one particle. The luminescence light is split into at least two paths to create at least two detection planes that correspond to the same or different number of object planes in the sample. The at least two detection planes are detected simultaneously. The number of object planes is represented in a camera by the same number of recorded regions of interest. Based on a combined signal from the recorded regions of interest, a 3D trajectory of the one particle is determined at each time point of a recorded data sequence.
  • According to yet another embodiment, a microscopy system is configured for creating 3D images from individually localized probe molecules. The microscopy system includes an activation light source, a readout light source, a beam splitting device, at least one camera, and a controller. The activation light source is configured to illuminate a sample with an activation light, the activation light source being configured to activate probes of at least one probe subset of the plurality of photo-sensitive luminescent probes. The readout light source is configured to illuminate the sample with a readout light, the readout light source being configured to cause luminescence light from the activated probes. The beam splitting device is located in a detection light path that splits the luminescence light into at least two paths, the beam splitting device creating at least two detection planes that correspond to the same or different number of object planes of the sample. The camera is positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest. The controller is programmable to combine a signal from the regions of interest into a 3D data stack, to calculate a figure of merit as a measure of the difference between imaged pixel data and a reference data set h0(ξl) modifiable by at least one parameter, and to optimize the figure of merit by adjusting the at least one parameter.
  • Additional aspects of the invention will be apparent to those of ordinary skill in the art in view of the detailed description of various embodiments, which is made with reference to the drawings, a brief description of which is provided below.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic view illustrating a biplane microscope setup for Fluorescence Photoactivation Localization Microscopy (FPALM), according to one embodiment.
  • FIG. 2 is a schematic view illustrating a biplane microscope setup, according to an alternative embodiment.
  • FIG. 3 is a schematic view illustrating a fluorescent particle image on a CCD chip.
  • FIG. 4A is a graph representing an axial resolution measured from an axial profile of caged fluorescein-labeled antibodies.
  • FIG. 4B is a representative image showing added-up projections of a data set in three different orientations for the axial resolution measured in FIG. 4A.
  • FIG. 5A is a representative image of a data set for beads labeled with caged fluorescein at an axial position of 300 nanometers.
  • FIG. 5B illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of 100 nanometers.
  • FIG. 5C illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of −100 nanometers.
  • FIG. 5D illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of −300 nanometers.
  • FIG. 5E illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of −500 nanometers.
  • FIG. 5F illustrates a volume-rendered representation of the data set illustrated in FIGS. 5A-5E.
  • FIG. 6 is a schematic view illustrating adjustment of a biplane microscope setup, according to an alternative embodiment.
  • FIG. 7A is a schematic view illustrating a fluorescent particle image on a CCD chip when the particle is in focus, in a first position.
  • FIG. 7B is a schematic view illustrating the fluorescent particle image of FIG. 7A when the particle is out of focus, in a second position.
  • FIG. 7C is a schematic view illustrating the fluorescent particle image of FIG. 7B when the particle is in focus, in a third position.
  • FIG. 8 is a diagrammatic representation of an exemplary embodiment of a photoactivation localization microscopy method.
  • FIG. 9 is a diagrammatic representation of an exemplary embodiment of a single particle tracking method.
  • FIG. 10 is a diagrammatic representation of an exemplary conventional embodiment of a non-iterative localization algorithm.
  • FIG. 11 is a diagrammatic representation of an exemplary conventional embodiment of an iterative localization algorithm.
  • FIG. 12 is a diagrammatic representation of an exemplary embodiment of a localization algorithm.
  • FIG. 13 is a diagrammatic representation of an exemplary embodiment of an algorithm for calculating a figure of merit.
  • FIG. 14 is a diagrammatic representation of an exemplary embodiment of a Fourier-Transform based Shift Algorithm.
  • DETAILED DESCRIPTION
  • While this invention is susceptible of embodiment in many different forms, there is shown in the drawings and will herein be described in detail preferred embodiments of the invention with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the broad aspect of the invention to the embodiments illustrated.
  • Referring to FIG. 1, a biplane (“BP”) microscope system 100 allows 3D imaging at an unmatched resolution well below 100 nanometers in all three dimensions, resulting in at least a 100-fold smaller resolvable volume than obtainable by conventional 3D microscopy. The BP microscope system 100 is optionally a BP FPALM system, which is generally based on a conventional FPALM design. However, in contrast to conventional FPALM design, the BP microscope system 100 includes a modified detection path that allows the simultaneous detection from two focal planes. The simultaneous detection of two planes for localization-based super-resolution microscopy speeds up the imaging process by making axial scanning unnecessary, and more importantly, in contrast to scanning-based systems, eliminates localization artifacts caused by abrupt blinking and bleaching common to single molecules. The BP microscope system 100 can optionally be located on an air-damped optical table to minimize vibrations.
  • In addition to achieving 3D particle localization down to nanometer-range accuracy, the BP microscope system 100 can also achieve temporal resolution <1 millisecond. As such, in addition to being a BP FPALM system, the BP microscope system 100 can also be a next-generation 3D particle-tracking microscope (“3D PTM”) for providing unprecedented temporal and spatial resolution when tracking fluorescent particles in live cells in 3D. FPALM and particle-tracking are just some exemplary applications of the BP microscope system 100. To achieve unprecedented temporal resolution at least as short as 0.3 milliseconds, the BP microscope system 100 tracks one particle at a time (in contrast to conventional 2D and 3D tracking techniques that visualize the entire field). Additionally, the BP microscope system 100 can include a detection scheme without any moving parts that detects simultaneously two axially shifted detection planes.
  • In contrast to current PTM techniques, the BP microscope system 100 can include a focused laser beam for excitation combined with spatially limited detection. Background light is filtered out to avoid localization disturbances and to increase sensitivity in samples thicker than about 1 micrometer. This enables particle-tracking even in tissue sections. To follow a particular particle over several microns in 3D, the BP microscope system 100 can include, for example, high-speed piezo-mirrors and a fast piezo-driven sample stage. The combination of focused excitation and feedback-driven beam-tracking reduces the background and enhances the speed limit by approximately one order of magnitude. Optionally, a second (different) luminescence color can be detected to enable correlative studies of the movement of the tracked particle.
  • Illumination for readout and activation can be provided by a readout laser 102, operating typically at 496 nanometers, and an activation laser 104 (e.g., 50 mW, Crystalaser), operating typically at 405 nanometers. The readout laser 102 is optionally a water-cooled Argon laser (e.g., Innova 70, Coherent Inc.) that can provide 458, 472, 488, 496, or 514 nanometers for readout illumination. Optionally, the wavelength of the readout laser 102 is selected to minimize activation of inactive probes of a plurality of photo-sensitive probes of a sample 124. Optionally yet, the readout laser 102 and the activation laser 104 can be the same source. For example, the readout laser 102 can perform both the readout functions and the activation functions, without requiring the use of the activation laser 104. According to one embodiment, at least one illuminated area of the sample 124 is a relatively small area, having, for example, a general diameter that is less than about three times an Airy disk diameter.
  • Both lasers 102, 104 are combined, via a first dichroic beam splitter 110, and coupled, via a second dichroic beam splitter 120, into a microscope stand 106 equipped with a 63×1.2NA water immersion tube lens 108 after passing through a field aperture 107. Both lasers 102, 104 can be switched on and off by software-controlled electrical shutters (e.g., SH05, Thorlabs). Other components that may be included along the path between the lasers 102, 104 and the microscope stand 106 are a first mirror 112 and a first lens 114.
  • The microscope stand 106 can have a plurality of components, including a sample stage 116 and an objective 118. The sample 124, including, for example, a biological cell 124 a, is generally positioned on the sample stage 116. The sample stage 116 can be a mechanical stage or a three-axis piezo stage (e.g., P-733.3DD, Physik Instrumente). Other components, which are not shown, may include shutters in front of the lasers 102, 104 and further optics for folding the beam path.
  • Fluorescence is collected by the objective 118, passes through a second dichroic beam splitter 120 (which reflects the laser light) and is focused by the tube lens 108 via an optional second mirror 122 (e.g., a piezo-driven mirror) into an intermediate focal plane 140. The focal plane 140 is imaged by two lenses—a second lens 128 and a third lens 132—onto a high-sensitivity EM-CCD camera 126 (e.g., DU897DCS-BV iXon, Andor Technology). Scattered laser light is attenuated by bandpass and Raman edge filters (e.g., Chroma and Semrock), such as filter 130.
  • The detection scheme can be achieved by moving the CCD camera 126 out of the standard image plane closer to the tube lens 108 and thereby shifting the corresponding focal plane ˜350 nanometers deeper into the sample. A beam splitter cube 134 is placed into a focused light path 136 a in front of the CCD camera 126. The beam splitter cube 134 redirects a reflected light path 136 b via a third mirror 138 towards the CCD camera 126 to form a second image in a different region of the same CCD. Due to the longer optical path, this second image corresponds to a focal plane ˜350 nanometers closer to the objective 118 than the original focal plane.
  • The BP microscope system 100, using a single camera, is straightforward to implement and avoids synchronization problems between separate cameras. The BP microscope system 100 features a reasonable field of view of ˜20×50 micrometers (pixel size corresponding to ˜100 nanometers in the sample 124; 512×512 pixels), sufficient to image large portions of a cell. The BP microscope system 100 is able to image 100 frames per second with a field of view of 10 to 20 micrometers in length and 2×2 binning. The use of the CCD camera 126, which features negligible readout noise due to its on-chip electron multiplication, avoids additional noise that would otherwise result from splitting the light up into two fields as required for BP detection. Combined with the fact that there is minimal loss of fluorescence detection efficiency, this exemplary BP microscope system 100 expands conventional FPALM to 3D imaging without significant drawbacks.
  • BP FPALM technology is compatible with live cell imaging and can be expanded to multicolor imaging (even realizable on the same CCD detector). BP FPALM can record 3D structures in a ˜1 micrometer thick z-section without scanning. Larger volumes can be recorded by recording BP FPALM data at different sample positions. To minimize activation of out of focus PA molecules, BP FPALM can be combined with a 2-photon (“2P”) laser scanner. 2P excitation-mediated activation is directed to diffraction-limited planes of ˜800 nanometers thickness, a thickness that is compatible with the axial detection range of BP FPALM. BP FPALM therefore has the potential of imaging specimens such as cell nuclei or tissue sections far exceeding 1 micrometer in thickness.
  • Moreover, combined with or without 2P excitation, BP FPALM can be readily implemented in practically every existing FPALM, PALM, PALMIRA or STORM instrument. BP FPALM therefore provides the means to investigate a large variety of biological 3D structures at resolution levels previously far out of reach.
  • Optionally, the luminescence detected from activated probes in BP FPALM is fluorescence or scattered light. In an alternative embodiment, the activation of the probes is achieved via a non-linear process that limits the activation to a plane of diffraction-limited thickness.
  • For PSF measurement, according to one example, 100 nanometer diameter yellow-green fluorescent beads (Invitrogen, F-8803) can be attached to a poly-L-lysine coated cover slip. The sample can be mounted on a piezo stage and imaged in the BP FPALM setup with 496 nm excitation. Typically, 101 images at z-positions ranging from −2.5 to +2.5 micrometers with 50 nanometers step size are recorded. The same bead is imaged 2 to 3 times to check for drift and to correct for bleaching. To reduce noise, the data set can be smoothed in Imspector with a Gaussian filter of sub-diffraction size. Additionally, the data set can be corrected for mono-exponential bleaching, cropped to an appropriate size, centered, and normalized to 1.
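  • As an illustrative sketch only (not the Imspector processing itself), the mono-exponential bleaching correction and normalization of such a recorded z-stack could take the following form in C; the time constant tau is assumed to have been estimated separately, for example from the intensity decay between the repeated images of the same bead, and all names and parameters are illustrative.

    #include <math.h>
    #include <stddef.h>

    /* Hypothetical mono-exponential bleaching correction for a recorded z-stack:
       frame i (acquired at time i*dt) is divided by exp(-i*dt/tau), i.e. multiplied
       by exp(+i*dt/tau), and the corrected stack is then normalized to a maximum
       value of 1. */
    void correct_bleaching_and_normalize(double *stack, int n_frames,
                                         int pixels_per_frame,
                                         double dt, double tau)
    {
        double max_value = 0.0;

        for (int i = 0; i < n_frames; ++i) {
            double gain = exp(((double)i * dt) / tau);   /* undo the exponential decay */
            double *frame = stack + (size_t)i * pixels_per_frame;
            for (int p = 0; p < pixels_per_frame; ++p) {
                frame[p] *= gain;
                if (frame[p] > max_value)
                    max_value = frame[p];
            }
        }

        if (max_value > 0.0) {
            for (size_t p = 0; p < (size_t)n_frames * pixels_per_frame; ++p)
                stack[p] /= max_value;    /* normalize the whole stack to 1 */
        }
    }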
  • Use of two focal planes for z-position determination is generally sufficient for particle localization under the constraints that (1) a sparse distribution of particles is analyzed (no overlapping signal within the size of one PSF) and (2) the axial position of the particle is close to one of the detection planes or lies between them. For example, to evaluate the range and accuracy of z-localization, 40 nanometers diameter fluorescent beads (FluoSpheres, F8795, Invitrogen) were imaged on a cover slip over 1,000 frames. A piezo-driven sample stage was moved by one 100 nanometers z-step every 100 frames. Localization analysis of the BP images reproduced that z-movement very accurately with σ≈6 to 10 nanometers axial localization accuracy. The beads could be localized over a range of 800 nanometers exceeding the distance between the two detection planes (in this case 500 nanometers) by more than 50%.
  • In one example, the accumulation time per frame is typically 10 milliseconds. In this example, electron multiplying gain is set to 300, the readout is 2×2 binned, only the region occupied by two recorded regions of interest (“ROIs”) is read out, and, typically, 5,000 to 50,000 frames are recorded.
  • Optionally, at least some of the ROIs are detected at different wavelengths by including suitable detection filters in the BP microscope system 100. In alternative embodiments, at least some of the ROIs are detected at different polarization directions by including suitable polarization optics in the BP microscope system 100.
  • Referring to FIG. 2, a BP microscope system 200 is shown according to an alternative embodiment. The BP microscope system 200 includes a microscope stand 202 having a piezo-driven sample stage 204 on which a sample 206 is positioned. The sample 206 includes a plurality of fluorescent particles 206 a-206 d. The microscope stand 202 further includes an objective 208 and a first lens 210.
  • Additional components are positioned between a focal plane 212 and the CCD camera 214 along a fluorescence light path 215. Specifically, the components include a second lens 216, a beam-steering device 218 (e.g., a piezo-driven mirror), a dichroic beam splitter 220, a bandpass filter 222, a third lens 224, a neutral 50:50 beam splitter 226, and a mirror 228. Optionally, the beam-steering device 218 can generally include a focusing optical element that moves illumination and detection focal planes axially to follow the tracked particle. In yet another example, the beam-steering device 218 can include a phase-modulating device that moves an illuminated area laterally and illumination and detection focal planes axially to follow the tracked particle. Optionally yet, more than one piezo-driven mirror 218 can be included in the BP microscope system 200.
  • A polarized laser beam from a laser 229 is coupled into the microscope stand 202 and focused into the sample 206 by the objective 208. A fourth lens 230 and a λ/4 plate 232 are positioned between the laser 229 and the dichroic beam splitter 220.
  • The focus can be positioned in the region of interest by moving the sample stage 204 and the beam-steering device 218. The fluorescence emerging from the focal region is collected by the objective 208 and is imaged onto the CCD camera 214 via the first lens 210, the second lens 216, and the third lens 224. The dichroic beam splitter 220 and the bandpass filter 222 filter out scattered excitation light and other background light.
  • The neutral 50:50 beam splitter 226 splits the fluorescence light into two beam paths, a transmitted beam 215 a and a reflected beam 215 b. The transmitted beam 215 a images light emitted from a plane deeper in the sample onto one area of the CCD chip. The reflected beam 215 b images light from a plane closer to the objective onto another well-separated area to avoid cross-talk.
  • Referring to FIG. 3, two ROIs on the CCD chip represent two focal planes in the sample 206 (illustrated in FIG. 2), typically 700 nanometers apart, arranged like wings of a biplane. The two ROIs include a transmitted ROI 300 and a reflected ROI 302, each having nine pixels showing an image of the fluorescent particle 206 b from the sample 206. The dashed areas 304 a-304 i, 306 a-306 i depict the pixels that are used for tracking the fluorescent particle 206 b. Thus, the two 9-pixel-areas 304 a-304 i, 306 a-306 i represent in general the position of the particle 206 b in 3D.
  • The fluorescent particle 206 b, which is generally smaller than the laser focus and located in the focal region, is excited homogeneously, and 3 (binned) lines (i.e., the two 9-pixel-areas represented by dashed areas 304 a-304 i, 306 a-306 i) of the CCD chip arranged around the laser focus image are read out at every time point. Particles laterally shifted with respect to the laser focus center will appear shifted on the CCD chip. For the z direction, the two 9-pixel-areas 304 a-304 i, 306 a-306 i act in the same way as two confocal pinholes in different planes: if the particle 206 b moves axially, the signal will increase in one of the 9-pixel-areas and decrease in the other. An axial shift will be represented by a sharper intensity distribution in one of the two 9-pixel-areas depending on the direction of the shift.
  • The 3D position can be determined by subtracting different pixel values of the two 9-pixel-areas from each other. For the axial coordinate (z-axis), the sum of all pixels from one 9-pixel-area can be subtracted from the other 9-pixel-area. The fact that the lateral information is preserved in the 9-pixel-areas allows for lateral localization of the particle 206 b at the same time. For the lateral x-axis (or y-axis) direction, the signal collected in the left columns 304 a, 304 d, 304 g, 306 a, 306 d, 306 g (or upper rows: 304 a, 304 b, 304 c and 306 a, 306 b, 306 c) of both 9-pixel-areas 300 and 302 can be subtracted from the one in the right columns 304 c, 304 f, 304 i, 306 c, 306 f, 306 i (or lower rows: 304 g, 304 h, 304 i and 306 g, 306 h, 306 i). Calculations show that the determined values are approximately proportional to the particle position offset from the center as long as the position stays in a range of +/−250 nanometers axially and +/−100 nanometers laterally. In a simple feedback loop, these values can be fed back to piezo controllers tilting piezo mirrors and moving the sample stage piezo to re-center the particle in the 9-pixel-areas after every measurement. Optionally, for larger movements up to about double the linear ranges, the position can be determined by taking the image shape and brightness into account in the data analysis to increase the tracking range.
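  • A minimal C sketch of the subtraction scheme described above is given below; the sign conventions are arbitrary, the returned values are only proportional to the particle offset within the stated linear ranges, and converting them to nanometers requires calibration factors that are not shown. All identifiers are illustrative.

    /* Difference signals from the two 9-pixel areas (3x3 binned pixels each) of
       the transmitted ROI t[][] and the reflected ROI r[][].  dz compares the
       summed signal of the two areas; dx and dy compare right/left columns and
       lower/upper rows of both areas combined, as described above. */
    typedef struct { double dx, dy, dz; } OffsetSignal;

    OffsetSignal position_signals(const double t[3][3], const double r[3][3])
    {
        double sum_t = 0.0, sum_r = 0.0;
        double left = 0.0, right = 0.0, top = 0.0, bottom = 0.0;

        for (int row = 0; row < 3; ++row) {
            for (int col = 0; col < 3; ++col) {
                double both = t[row][col] + r[row][col];
                sum_t += t[row][col];
                sum_r += r[row][col];
                if (col == 0) left   += both;
                if (col == 2) right  += both;
                if (row == 0) top    += both;
                if (row == 2) bottom += both;
            }
        }

        OffsetSignal s;
        s.dx = right - left;   /* lateral x: right columns minus left columns */
        s.dy = bottom - top;   /* lateral y: lower rows minus upper rows      */
        s.dz = sum_t - sum_r;  /* axial: one 9-pixel area minus the other     */
        return s;
    }

  • In a feedback implementation, these three signals would be scaled by calibration factors and fed to the piezo controllers after every readout to re-center the particle, as described above.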
  • According to an alternative embodiment, the pixels of the transmitted ROI 300 (on the left) show a brighter image than the pixels of the reflected ROI 302 (on the right). For example, the top-right dashed areas 304 b, 304 c, 304 e, 304 f of the transmitted ROI 300 are generally brighter than the other 5 pixels in the same ROI 300 and than all pixels of the reflected ROI 302. As such, the fluorescent particle 206 b is located axially more towards the focal plane 140 imaged on the transmitted ROI 300 and is shifted by about half the diffraction limit toward the right and top relative to the excitation focus.
  • The signal from the two ROIs 300, 302 can also be combined into a 3D data stack (2 pixels in z; x and y dimensions are determined by the size of the ROIs 300, 302). Data analysis is a generalization of standard FPALM methods to 3D. Instead of a Gaussian, an experimentally obtained 3D-PSF can be fit to each data set consisting of the pixels around each detected probe molecule. The x, y and z-coordinates of each molecule are determined from the best fit of the molecule image with the PSF.
  • For BP FPALM, typically but not necessarily, larger ROIs 300, 302 are used to allow localization of particles over a larger field of view. Also, several particles can be present in the same ROI and still be analyzed separately. Slight variations in the magnification and rotation between the two detection areas may be corrected by software before combination of the two ROIs 300, 302 into a 3D data stack. The slight difference in the tilt of the focal planes between the two ROIs 300, 302 is negligible because of the large axial magnification (proportional to the lateral magnification squared). The analysis of the 3D data can be seen as the generalization of standard 2D FPALM analysis to 3D. Particles are identified in the z-projected images by iteratively searching for the brightest pixels and eliminating this region in the subsequent search until a lower intensity threshold has been reached. The raw data may be cut out in each ROI 300, 302 around each found particle in a square window of, for example, 10-19 pixels long and wide. Instead of a 2D Gaussian, a theoretical or experimentally obtained 3D-PSF can be fitted to the data sets in this cutout window using a simplex fitting algorithm adapted from Numerical Recipes in C, or a different algorithm. For example, the algorithm can be a localization algorithm that is independent of theoretical models and, therefore, is generally applicable to a large number of experimental realizations. A more detailed description of the localization algorithm is provided below, after the description of FIGS. 7A-7C.
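  • The iterative brightest-pixel search mentioned above can be sketched in C as follows; this is a simplified illustration operating on a copy of the smoothed, z-projected image, with an illustrative square elimination window and threshold rather than the actual implementation.

    /* Iterative particle identification: repeatedly find the brightest remaining
       pixel of a (smoothed, z-projected) image, record it as a candidate
       particle, and blank a window around it so that the same particle is not
       found again; stop once the brightest remaining pixel falls below the
       intensity threshold.  The image is modified, so a copy of the smoothed
       data should be passed in. */
    typedef struct { int x, y; } Peak;

    int find_particles(double *image, int width, int height,
                       double threshold, int half_window,
                       Peak *peaks, int max_peaks)
    {
        int n = 0;
        while (n < max_peaks) {
            int best_x = 0, best_y = 0;
            double best = -1.0;
            for (int y = 0; y < height; ++y)
                for (int x = 0; x < width; ++x)
                    if (image[y * width + x] > best) {
                        best = image[y * width + x];
                        best_x = x;
                        best_y = y;
                    }
            if (best < threshold)
                break;                       /* lower intensity threshold reached */

            peaks[n].x = best_x;
            peaks[n].y = best_y;
            ++n;

            /* Eliminate the region around the found particle from later searches. */
            for (int dy = -half_window; dy <= half_window; ++dy)
                for (int dx = -half_window; dx <= half_window; ++dx) {
                    int x = best_x + dx, y = best_y + dy;
                    if (x >= 0 && x < width && y >= 0 && y < height)
                        image[y * width + x] = 0.0;
                }
        }
        return n;   /* number of candidate particles found */
    }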
  • From the resulting best-fitting x, y, and z-coordinates, the localized position is extracted and stored. Additionally, the amplitude, background, the deviation from the cutout window's center, the number of iterations, and the chi-square value are stored, which allow later determination of the quality of the fit. The stored list of fit results is analyzed and translated into 3D data sets of customizable voxel sizes. The fit amplitude is used as the voxel intensity for every molecule found that fulfills the user-defined quality criteria. For operation without the piezo stage, the camera software (Solis, Andor Technology) is used for data recording. Software to operate the microscope with the piezo stage, for fitting, and to create 3D data sets may be programmed in LabVIEW 8.2 (National Instruments). Imspector (Andreas Schoenle, Max Planck Institute for Biophysical Chemistry, Goettingen, Germany) is used for display and analysis of 3D data sets. 3D rendered images may be created using Amira.
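  • For illustration, translating the stored list of fit results into a 3D data set of customizable voxel size might look as follows in C; the record layout, the single χ²-based quality test, and the coordinate convention (positions measured in the same units as the voxel sizes, relative to the volume origin) are assumptions made for this sketch.

    #include <stddef.h>

    /* Translate a list of fit results into a 3D data set: every localized
       molecule that fulfills the quality criterion adds its fit amplitude to
       the voxel that contains its position.  volume[] has nx*ny*nz voxels,
       stored x-fastest. */
    typedef struct {
        double x, y, z;       /* localized position                      */
        double amplitude;     /* fit amplitude (used as voxel intensity) */
        double chi2;          /* figure-of-merit value of the fit        */
    } FitResult;

    void render_3d(const FitResult *fits, int n_fits,
                   double *volume, int nx, int ny, int nz,
                   double voxel_xy, double voxel_z, double chi2_max)
    {
        for (int i = 0; i < n_fits; ++i) {
            if (fits[i].chi2 > chi2_max)
                continue;                          /* fails quality criteria */

            int ix = (int)(fits[i].x / voxel_xy);
            int iy = (int)(fits[i].y / voxel_xy);
            int iz = (int)(fits[i].z / voxel_z);

            if (ix < 0 || ix >= nx || iy < 0 || iy >= ny || iz < 0 || iz >= nz)
                continue;                          /* outside the data set   */

            volume[((size_t)iz * ny + iy) * nx + ix] += fits[i].amplitude;
        }
    }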
  • Referring to FIG. 4A, a graph illustrates the axial resolution measured using a BP FPALM setup. Specifically, the axial resolution is measured from an axial profile of caged fluorescein-labeled antibodies on a cover slip and embedded in 87% glycerol. The black line represents raw data and the dashed line represents a Gaussian fit.
  • From the axial profile, a full-width-at-half-maximum (“FWHM”) distribution of 75 nanometers is measured, which is about 10-fold below the axial FWHM of measured PSF (which represents the axial resolution of conventional diffraction-limited microscopy). Since localization-based resolution is proportional to the diffraction-limited PSF size and the axial FWHM of a widefield 1.2NA PSF is ˜250% larger than the lateral FWHM, the measured z-localization precision is consistent with x and y-resolution of 20 to 40 nanometers previously obtained in FPALM and PALM.
  • Referring to FIG. 4B, an inset shows added-up projections of the data set (of FIG. 4A) in three different orientations. The white box marks the region used to generate the axial profile. The scale bar of the original images was 2 micrometers.
  • Referring to FIGS. 5A-5E, 3D BP FPALM imaging of 2 micrometer diameter beads labeled with caged fluorescein shows data sets at different axial positions. Specifically, representative 100 nanometer thick xy images of the resulting data set are illustrated at z=+300 nanometers, +100 nanometers, −100 nanometers, −300 nanometers, and −500 nanometers, respectively. The data shown in all planes of FIGS. 5A-5E is recorded simultaneously without scanning. Especially to image samples thicker than 1 micrometer, the sample stage can be moved after finishing recording at one sample position to access different sample depth positions, and the data recording process is repeated until all sample positions of interest have been recorded.
  • Referring to FIG. 5F, a volume-rendered representation is shown based on the data sets of FIGS. 5A-5E. The curved surface of the bead is nicely reproduced over nearly 1 μm in depth without scanning. The optical images show well below 100 nanometer resolution in all three dimensions. With approximately 30×30×80 nanometers³, the resolvable volume is ˜500-fold below the diffraction-limited observation volume and represents the smallest observation volume achieved in a far-field light microscope.
  • Referring to FIG. 6, a BP microscope system 600 is illustrated to show the tracking of a single particle 606 positioned on a sample stage 604. The BP microscope system 600 is generally similar to the BP microscope system 200 described above in reference to FIG. 2.
  • As the single particle 606 moves relative to the sample stage 604 from a first position (indicated in solid line) to a second position (indicated in dashed line), the fluorescence light beam is adjusted by tilting one or more piezo-mounted mirrors or adjusting alternative beam-steering devices 618. In the exemplary scenario, the piezo-mounted mirror 618 is tilted counterclockwise from a first position (indicated in solid line) to a second position (indicated in dashed line). The rotation of the mirror 618 steers both the fluorescence light beam on the camera and the excitation light beam coming from the laser and focused into the sample, thereby correcting for sideways movement of the particle 606. The mirror 618 is rotated until the excitation light beam is again centered on the particle 606.
  • Optionally, the sample stage 604 is moved up or down to correct for vertical movement. Alternatively, a suitable beam-steering device 618 refocuses the beam vertically. After the necessary adjustments are made to track the particle 606, the positions of the piezo and stage are recorded to reconstruct large scale movement in post-processing.
  • Referring to FIGS. 7A and 7B, two insets show the images recorded when a particle moves from a first position to a second position as described above in reference to FIG. 6. In FIG. 7A, a transmitted ROI 700 a and a reflected ROI 700 b are recorded on a CCD chip when the particle is in the first position. The pixels of the transmitted ROI 700 a show the same focus and intensity as the pixels in the reflected ROI 700 b. A black box surrounds a general 5×5 pixel area of interest.
  • When the particle moves to the second position, as shown in FIG. 7B, the transmitted ROI 700 a and the reflected ROI 700 b change such that the respective pixels in the area of interest are now out of focus and of different intensity. For example, the pixels of the transmitted ROI 700 a are now generally brighter (i.e., more intense) than in the first position, and off-center with respect to the area of interest (i.e., up and to the right). Similarly, the pixels of the reflected ROI 700 b are now generally less bright (i.e., less intense) than in the first position, and off-center with respect to the area of interest (i.e., up and to the right).
  • Referring to FIG. 7C, the fluorescence light beam has now been steered to center it on the particle 606 in the second position. The pixels of the transmitted ROI 700 a and of the reflected ROI 700 b are generally similar, if not identical, to the pixels illustrated in the first position of the particle 606 (shown in FIG. 7A). Specifically, the pixels are generally centered within the area of interest and are now of similar intensity in both the transmitted ROI 700 a and the reflected ROI 700 b.
  • Various aspects of the present invention may be used alone, in combination, or in a variety of arrangements not specifically discussed in the embodiments described in the foregoing, and the aspects of the present invention described herein are not limited in their application to the details and arrangements of components set forth in the foregoing description or illustrated in the drawings. The aspects of the invention are capable of other embodiments and of being practiced or of being carried out in various ways.
  • 3D Particle Localization Algorithm
  • Referring to the 3D particle localization algorithm, experimentally obtained 3D reference data sets are used to fit data sets obtained by either a multi-plane approach or an astigmatism approach. Practically all raw data that contributes to the image of a particle is taken into account by the fitting process according to its statistical weight, which is especially relevant for photon-limited applications such as imaging single molecules. Additional calibration steps are generally not necessary because the raw data and the fit reference data set are acquired by the same setup.
  • Based on performed experiments, the localization algorithm converges in close to 100% of the cases over a range of 1 to 2 μm, as measured by the fraction Φ of images in which the particle could be localized correctly. The fraction Φ depends on Δ and the number of detected photons, Ndet. Only for axial particle positions far away from either of the focal detection planes may the localization algorithm fail to converge properly.
  • Differences between the two detection modes generally arise in the axial localization range: biplane detection is capable of localizing particles over a range nearly twice that achievable by astigmatic detection. This is a relevant feature for imaging thick biological samples and, therefore, the biplane mode appears favorable for these applications. The fact that the signal in biplane detection is spread over double the number of pixels does not have a detectable negative effect in the experimental setup.
  • As mentioned above, the localization algorithm works independently of a theoretical model function and instead uses experimentally obtained reference data sets. This enables the algorithm to account for experimental deviations from perfect theoretical descriptions that are often difficult to include accurately in theoretical models and, also, reduces artifacts in the localization process. Additionally, complex theoretical models that cannot be described by a simple formula are now accessible by numerically generating a reference data set to feed into the algorithm. Small systematic deviations of the determined positions from actual positions can result from differences between the reference data sets used and the data generated during imaging, or from the fact that the reference data set is of finite size; such deviations can be readily corrected by proper calibration curves.
  • More relevantly, the localization algorithm allows general applicability to a range of 3D localization strategies without a need to develop individual theoretical model functions for every case. Additionally, it can be readily applied to recently reported detection schemes of iPALM and double-helix reference data sets. The iPALM detection scheme is described in more detail in an article titled “Interferometric fluorescent super-resolution microscopy resolves 3D cellular ultrastructure” by G. Shtengel, J. A. Galbraith, C. G. Galbraith, J. Lippincott-Schwartz, J. M. Gillette, S. Manley, R. Sougrat, C. M. Waterman, P. Kanchanawong, M. W. Davidson, R. D. Fetter, and H. F. Hess (Proc. Natl. Acad. Sci. USA 106, 3125-3130, 2009), which is incorporated by reference in its entirety. The double-helix reference data set detection scheme is described in more detail in an article titled “Three-dimensional, single-molecule fluorescence imaging beyond the diffraction limit by using a double-helix point spread function” by S. R. Pavani, M. A. Thompson, J. S. Biteen, S. J. Lord, N. Liu, R. J. Twieg, R. Piestun, and W. E. Moerner (Proc. Natl. Acad. Sci. USA 106, 2995-2999, 2009), which is incorporated by reference in its entirety.
  • Furthermore, the localization algorithm is not limited to the five fitting parameters identified below (i.e., x-position, y-position, z-position, amplitude, and background) or to 3D reference data sets. The localization algorithm can be readily expanded to include parameters such as interference phase, polarization, or wavelength by providing experimental reference data sets that provide the necessary information in a fourth, fifth, or even higher dimension. Alternatively, fewer parameters and dimensions are possible, allowing for application to 2D imaging.
  • Software
  • Image acquisition software that controls camera and piezo actuator parameters may be written, for example, using LabVIEW 8.2 software in Windows XP, the software being available from National Instruments Corp., Austin, Tex. Recorded data is stored in a raw data format that is later analyzed by separate analysis software programmed in C, which may run on a Linux computer cluster (e.g., 31 compute nodes, each equipped with two dual-core AMD Opteron processors and 16 GB of RAM and connected by a single gigabit Ethernet network).
  • As a brief overview, ROIs corresponding to an illuminated field of view are cut out automatically. In biplane detection mode, the two ROIs in every frame representing the two detected planes are co-registered by slightly tilting and magnifying one ROI according to earlier determined calibration values. Particles are identified as the brightest pixels in smoothed versions of the ROIs. For every identified particle, one ROI (or two ROIs in biplane detection mode) of 15×15 pixels at 2×2 binning (corresponding to 1.9 μm×1.9 μm in the sample) is cut out from the non-smoothed data, centered on the identified brightest pixel. This data is then corrected for an electronic offset in the signal stemming from the camera, translated from counts into numbers of photons, and fed into the fit algorithm.
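  • As a hedged illustration of the last preprocessing step (the exact calibration depends on the camera and its settings and is not disclosed here), the offset correction and conversion from camera counts into photon numbers could be written as:

    /* Convert cut-out ROI data from camera counts into estimated photon numbers:
       subtract the electronic offset and divide by a calibration factor that
       accounts for the A/D conversion and electron-multiplying gain.  The linear
       form and the parameter names are assumptions made for this illustration. */
    void counts_to_photons(double *roi, int n_pixels,
                           double offset, double counts_per_photon)
    {
        for (int i = 0; i < n_pixels; ++i)
            roi[i] = (roi[i] - offset) / counts_per_photon;
    }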
  • The fit algorithm provides the best estimates for the three spatial coordinates of the particle, as well as the amplitude and a background value of the particle. The coordinates, amplitude, and/or background value are stored together with other parameters that indicate the quality of the fit (e.g., χ2-values, number of iterations before convergence, etc.) in ASCII data lists. These lists are later compiled into the data presented below using, for example, computer programs such as Microsoft Excel, Origin (OriginLab, Northampton, Mass.), and LabVIEW.
  • The same software, especially the same fit algorithm, may be applied to data from both imaging modes with the only difference being that in biplane mode two ROIs are used instead of one. Combined with the required minimal changes in the optical setup, this warrants optimal conditions for a direct and thorough comparison of the two methods of 3D localization.
  • Fit Algorithm
  • The fit algorithm, also referred to as a particle localization routine, performs for every identified particle a least-squares fit based on the Nelder-Mead downhill simplex method. This method is described in more detail in the book “Numerical Recipes: The Art of Scientific Computing” by W. H. Press, S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery (Cambridge University Press, 2007), which is incorporated by reference in its entirety. This algorithm, in short, finds the best fit by successively contracting a polytope (a “simplex” with m+1 vertices) around the minimum of the figure-of-merit function in m-dimensional parameter space.
  • The figure-of-merit function, χ², is calculated as the sum, over all pixels j that describe the image of an identified particle, of the squared error-weighted differences between the observed number of photons, nj, and a model function, F, which depends on a set of fit parameter values:
  • χ²(v, b, a) = Σj [(nj − Fv,b,a(xj)) / σj]²   (1)
  • The 3D positions, xj, describe the coordinates in the sample and correspond to a lattice of 15×15×2 or 15×15×1 extracted pixels in biplane and astigmatic detection mode, respectively. Alternatively, the 3D positions can represent any distribution matching the experimental imaging conditions. The estimated statistical error of nj, σj, is assumed to be √nj because shot noise is generally the main error contribution.
  • To fit the particle data, the model function Fv,b,a(x)=vha(x)+b depends on m=5 parameters. Specifically, the 5 parameters are (1-3) the particle's 3D position a=(ax, ay, az), (4) the number of photons, v, at the intensity maximum detected over the area of one pixel, and (5) the number of background photons, b, per pixel. The normalized instrument response at point x for a particle located at position a is described by ha(x) and is derived from experimentally obtained reference data sets.
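  • A minimal C sketch of Eq. 1 with this model function is shown below; it assumes that the reference data set has already been shifted to the guessed particle position a (see the Fourier-transform-based shift further below) and uses σj ≈ √nj with a small guard against division by zero. All identifiers are illustrative.

    /* Figure of merit of Eq. 1: error-weighted sum of squared differences
       between the observed photon numbers n[j] and the model
       F(x_j) = v*h_a(x_j) + b.  h_shifted[] holds the reference data set
       already shifted to the guessed particle position a; sigma_j^2 is
       approximated by n_j (shot noise). */
    double chi_squared(const double *n, const double *h_shifted, int n_pixels,
                       double v, double b)
    {
        double chi2 = 0.0;
        for (int j = 0; j < n_pixels; ++j) {
            double model  = v * h_shifted[j] + b;
            double sigma2 = (n[j] > 1.0) ? n[j] : 1.0;  /* guard for near-empty pixels */
            double diff   = n[j] - model;
            chi2 += diff * diff / sigma2;
        }
        return chi2;
    }

  • The simplex fit then varies the five parameters (ax, ay, az, v, b) to minimize this value for each identified particle.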
  • The reference data set h0l) is defined for a lattice of voxel coordinates ξl. The required values ha(xj) have to be determined from the reference data set by interpolation, making use of the fact that for a translationally invariant system ha(x)=h0(x−a). Simple linear interpolation and related methods generate points of non-differentiability that can cause failure of proper convergence of the simplex method and induce localization artifacts. To address this issue, the following interpolation method based on Fourier transforms has been developed.
  • The problem of estimating the value of a sampled function at a certain point of interest, x−a can be interpreted as determining the function value at the nearest node ξl after shifting the whole function by the amount necessary to make (x−a) coincide with ξl. This shift can be achieved by convolving the function with a Dirac delta distribution:

  • ha(xj) = h0(ξl) ⊗ δ(ξl − Da(xj))   (2),
  • where Da(xj)=ξl−(xj−a) describes the vector between the position xj−a and its closest neighbor ξl. In Fourier space the convolution assumes the simple form of a multiplication,

  • FT{ha(xj)} = H0(κl) e^(−iκl·Da(xj))   (3).
  • H0(κl) is the optical transfer function (“OTF”) of the system and is calculated once at the beginning of the fit procedure as the Fourier transform of h0(ξl). Multiplication with the parameter-dependent phase factor exp(−iκl·Da(xj)) and an inverse Fourier transform yield the shifted reference data set ha(xj).
  • The spacing of the pixel positions, xj, can easily be chosen to be an integer multiple of the spacing of the reference data set nodes, ξl. In this case, Da(xj) is independent of xj, and a single inverse Fourier transformation of Eq. 3 is sufficient to find all the reference data set values required to calculate χ2 for a given shift a.
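  • A sketch of this Fourier-transform-based shift is given below, under the assumption of a 3D reference stack `h0` stored in (z, y, x) order with voxel spacings `voxel`; array names and layout are illustrative, and in practice the forward transform would be computed only once at the start of the fit.

```python
import numpy as np

def shift_reference(h0, shift, voxel):
    """Return h0 translated by the sub-voxel offset `shift` (same units as `voxel`)."""
    H0 = np.fft.fftn(h0)                                  # OTF of the reference data set
    kz, ky, kx = [2 * np.pi * np.fft.fftfreq(n, d=d)      # angular spatial frequencies
                  for n, d in zip(h0.shape, voxel)]
    phase = np.exp(-1j * (kz[:, None, None] * shift[0] +  # exp(-i kappa . D_a(x_j))
                          ky[None, :, None] * shift[1] +
                          kx[None, None, :] * shift[2]))
    return np.real(np.fft.ifftn(H0 * phase))              # shifted reference h_a on the lattice
```

  • In biplane mode, both reference stacks would be shifted by the same offset, as described in the next paragraph.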
  • Because of small experimental differences between the two biplane detection reference data sets, a slight modification of the described method is used that performs the reference data set translation described by Eq. 3 simultaneously for both reference data sets. Values ha(xj) are then extracted from the appropriate OTFs according to the detection plane in which xj is located. For calculating the discrete Fourier transforms, the fit algorithm can use, for example, the freely available FFTW library by M. Frigo and S. G. Johnson (http://www.fftw.org, 2008), which is incorporated by reference in its entirety.
  • Sample
  • In the performed experiments, fluorescent latex beads of 100 nm diameter with an emission maximum at 560 nm (F-8800, Invitrogen, Carlsbad, Calif.) were imaged. Beads were adhered to poly-L-lysine-coated (Sigma-Aldrich, St. Louis, Mo.) cover slips, immersed in water, and mounted on a slide. The bead density was chosen low enough that only about eight to twelve beads were visible in the field of view during imaging. This ensured that fluorescence from neighboring particles did not influence the analysis.
  • Generation of the Reference Data Set
  • In the localization algorithm, an experimentally obtained reference data set (also referred to as a point-spread function) replaces the theoretical models used elsewhere. To rule out localization artifacts caused by this reference data set, great care was exercised in its generation during the performed experiments. The same bead samples as in later experiments were imaged at the maximum electron-multiplying gain of the camera without pixel binning. Single frames were recorded with acquisition times of 30 ms at 50 nm axial piezo steps over a range of 10 μm.
  • Typically, during the performed experiments, approximately 3,000 photons were detected from each bead at each z-position near the focal plane. Single beads were identified visually from the recorded data stacks, and ROIs of 3.8 μm×3.8 μm size centered on the signal maximum were extracted. In the case of biplane detection, stacks were cut out for both recorded planes, resulting in two correlated reference data sets. The extracted stacks were loaded into the data processing software Imspector (written by Dr. Andreas Schoenle, Max Planck Institute for Biophysical Chemistry, Goettingen, Germany, and available via Max-Planck-Innovation GmbH, Munich, Germany).
  • In Imspector, the background was removed and the data was corrected for bleaching that had occurred during the imaging process. The reference data sets were then cut to a size of approximately 3.8 μm×3.8 μm×7.5 μm with the reference data set centered in the middle. In biplane mode, both reference data sets were cut in an identical way such that (i) the stack centers were located axially in the middle between the two reference data set centers and (ii) the reference data set centers maintained their original axial distance. To reduce noise, the reference data sets were resampled to voxel sizes close to the resolution limit (x=y=127 nm, z=200 nm).
  • This resampling assured maximum processing speed in the fit algorithm, which, because of the Fourier transformation steps, depends on the number of reference data set voxels, without altering the optical characteristics of the original reference data set. The reference data set was further normalized to a maximum value of 1 for easier determination of reasonable start parameters for the fit algorithm. In biplane mode, the brighter reference data set was normalized to 1 and the other reference data set was normalized to a lower value.
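  • The conditioning steps described above might be sketched as follows. The bleaching correction and the exact crop geometry are omitted, the median-based background removal is only an assumed stand-in for the processing performed in Imspector, and the voxel spacings are given in nanometers in (z, y, x) order.

```python
import numpy as np
from scipy import ndimage

def condition_reference(stack, old_voxel, new_voxel=(200.0, 127.0, 127.0)):
    stack = stack - np.median(stack)                 # crude background removal (assumption)
    stack = np.clip(stack, 0.0, None)
    factors = [o / n for o, n in zip(old_voxel, new_voxel)]
    resampled = ndimage.zoom(stack, factors)         # resample to voxels near the resolution limit
    return resampled / resampled.max()               # normalize the maximum to 1
```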
  • Referring to FIG. 8, an exemplary photoactivation localization microscopy method in accordance with the features described above includes providing recorded raw images (801) from which particles are identified in images (803). The photoactivation localization microscopy method can be performed in both 2D and 3D. A computation is made separately for each identified particle (805) and pixels are extracted in regions of interest centered around each of the identified particles (807). Each identified particle is localized by determining the particle position from the intensity distribution in the region of interest (809). Additional details regarding the localization of each identified particle are provided in FIGS. 10-12. The determined positions of all the particles are merged (811) and a particle distribution map is created from the determined particle positions (813). The particle distribution map is provided as a resulting image (815).
  • Referring to FIG. 9, an exemplary single particle tracking method in accordance with the features described above includes providing a recorded image sequence (901) from which a particle is identified in each frame (903). The single particle tracking method can be performed in both 2D and 3D. A computation is made separately for each frame (905) and pixels are extracted in regions of interest centered around the particle (907). The particle is localized by determining the particle position from the intensity distribution in the region of interest (909). Additional details regarding the localization of each identified particle are provided in FIGS. 10-12. The determined positions of the particle are merged (911) and a particle trajectory is created from the determined particle positions (913). The resulting particle trajectory is provided (915).
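  • Both workflows share the per-particle structure sketched below; `identify`, `extract_roi`, and `fit_particle` are assumed helpers standing in for the steps of FIGS. 8 and 9, not functions from the original disclosure.

```python
def localize_frames(frames, identify, extract_roi, fit_particle):
    positions = []
    for frame in frames:
        for particle in identify(frame):         # identify particles (803 / 903)
            roi = extract_roi(frame, particle)   # extract pixels in an ROI (807 / 907)
            positions.append(fit_particle(roi))  # localize the particle (809 / 909)
    return positions                             # merged positions (811 / 911)
```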
  • Referring to FIG. 10, an example of a conventional non-iterative localization algorithm includes extracting pixels for one particle (imaging) or one point in time (particle tracking) (1001), after which the center of mass is calculated (1003). The particle position is then determined (1005).
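  • A minimal center-of-mass estimate of this kind, which also serves as a start value for the iterative fits described below, could be computed as follows; `roi` is an assumed pixel array and coordinates are returned in pixel units.

```python
import numpy as np

def center_of_mass(roi):
    total = roi.sum()
    grids = np.indices(roi.shape)                          # one coordinate grid per axis
    return tuple((g * roi).sum() / total for g in grids)   # intensity-weighted mean position
```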
  • Referring to FIG. 11, an example of a conventional iterative localization algorithm includes extracting pixels for one particle (imaging) or one point in time (particle tracking) (1101). After calculating the center of mass (1103), a first guess of the particle position (1105) is provided to calculate the figure of merit as a measure of the difference between the imaged pixel data and a model function based on the guessed particle position (1107). The guess is modified (1109) and the figure of merit is calculated based on the modified guess (1111). If the figure of merit has not decreased (1113), the old guess is used to determine whether the figure of merit is below a specific threshold (1115); if it has decreased (1113), the modified guess is used for that determination (1115). If the figure of merit is below the specific threshold, the particle position is output as the last guess (1117). Otherwise, the guess is modified again (1109) and the figure of merit is recalculated based on the modified guess (1111).
  • Referring to FIG. 12, in an exemplary implementation of the fit algorithm described above, pixels are extracted for one particle (imaging) or one point in time (particle tracking) (1201). After calculating the center of mass (1203), a first guess of the particle position (1205) and a reference data set (1207) are provided to calculate the figure of merit as a measure of the difference between the imaged pixel data and the reference data set adjusted for the guessed particle position (1209). Exemplary algorithms for calculating the figure of merit are further described below in reference to FIGS. 13 and 14.
  • The guess is modified (1211) and the figure of merit is calculated based on the modified guess (1213). If the figure of merit has not decreased (1215), the old guess is used to determine whether the figure of merit is below a specific threshold (1217); if it has decreased (1215), the modified guess is used for that determination (1217). If the figure of merit is below the specific threshold, the particle position is output as the last guess (1219). Otherwise, the guess is modified again (1211) and the figure of merit is recalculated based on the modified guess (1213).
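  • The control flow of FIGS. 11 and 12 can be summarized by the loop sketched below; `propose` abstracts the update of the guess (in the described fit, the simplex step), and the iteration cap and threshold are illustrative assumptions rather than part of the disclosure.

```python
def iterate_fit(merit, propose, guess, threshold, max_iter=1000):
    best = merit(guess)
    for _ in range(max_iter):
        candidate = propose(guess)          # modify the guess (1109 / 1211)
        value = merit(candidate)            # recompute the figure of merit (1111 / 1213)
        if value < best:                    # has the figure of merit decreased? (1113 / 1215)
            guess, best = candidate, value  # yes: keep the modified guess
        if best < threshold:                # below the threshold? (1115 / 1217)
            break
    return guess                            # output the last guess (1117 / 1219)
```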
  • Referring to FIG. 13, an algorithm for calculating the figure of merit includes extracting pixels for a particle (1301), guessing the particle position, brightness, and background (1303), and providing the reference data set (1305). The reference data set is shifted in accordance with the particle position (1307), and the shifted reference data set (1309) and the particle position, brightness, and background are provided for calculating the model function Fv,b,a(xj)=vha(xj)+b (1311). The figure of merit is calculated (1313) based on the model function and the extracted pixels for the particle, and the figure-of-merit value is provided (1315).
  • Referring to FIG. 14, an exemplary embodiment of a Fourier-transform-based shift algorithm includes extracting pixels for a particle (1401) and guessing the particle position, brightness, and background (1403). A calculation (1405) is performed in which the reference data set (1405 a) is Fourier transformed (1405 b) to obtain the OTF (1405 c). The OTF is multiplied with a parameter-dependent phase factor (1407) based on the particle position, brightness, and background. The inverse Fourier transform is calculated (1409), providing a shifted reference data set (1411). The model function Fv,b,a(xj)=vha(xj)+b is calculated (1413) based on the shifted reference data set and the particle position, brightness, and background. The figure of merit is calculated (1415) based on the model function and the extracted pixels for the particle, and the figure-of-merit value is provided (1417).
  • Each of these embodiments and obvious variations thereof is contemplated as falling within the spirit and scope of the claimed invention, which is set forth in the following claims.

Claims (11)

1-40. (canceled)
41. A microscopy system configured for creating 3D images from individually localized probe molecules, the system comprising:
an activation light source configured to illuminate a sample with an activation light, the activation light source being configured to activate probes of at least one probe subset of the plurality of photo-sensitive luminescent probes;
a readout light source configured to illuminate the sample with a readout light, the readout light source being configured to cause luminescence light from the activated probes;
a beam splitting device located in a detection light path that splits the luminescence light into at least two paths, the beam splitting device creating at least two detection planes that correspond to the same or different number of object planes of the sample;
at least one camera positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest; and
a controller programmable to
combine a signal from the regions of interest into a 3D data stack,
calculate a figure of merit as a measure of the difference between imaged pixel data and a reference data set modifiable by at least one parameter, and
optimize the figure of merit by adjusting the at least one parameter.
42. The microscopy system of claim 41, wherein the controller is further programmable to perform a least-squares fit for finding a best fit by successively contracting an n-dimensional polytope around a minimum of a figure-of-merit function in m-dimensional parameter space.
43. The microscopy system of claim 41, wherein the controller is further programmable to calculate the figure-of-merit function as a square of error-weighted differences between observed number of photons and a model function that depends on a set of fit parameter values summed over all pixels.
44. The microscopy system of claim 41, wherein the figure-of-merit function χ2 is expressed as
\chi^2(v, b, a) = \sum_j \left( \frac{n_j - F_{v,b,a}(x_j)}{\sigma_j} \right)^2,
wherein xj describes 3D coordinates in the sample, nj is the observed number of photons, j is the number of all pixels, Fv,b,a(x)=vha(x)+b, a is the particle 3D position in a specific dimension, v is the number of photons at the intensity maximum detected over the area of one pixel, b is the number of background photons per pixel, and ha(x) describes the normalized instrument response at point x for a particle located at position a.
45. The microscopy system of claim 41, wherein fitting particle data is based on 5 parameters, including (i) particle 3D position in x direction ax, (ii) particle 3D position in y direction ay, (iii) particle 3D position in z direction az, (iv) number of photons v at the intensity maximum detected over the area of one pixel, and (v) the number of background photons b per pixel.
46. The microscopy system of claim 41, wherein the reference data set is defined for a lattice of voxel coordinates ξl, the controller being further programmable to determine values ha(xj) from the reference data set by interpolation.
47. The microscopy system of claim 46, wherein interpolation is based on Fourier transforms.
48. The microscopy system of claim 46, wherein the controller is further programmable to convolve the reference data set with a shifted Dirac delta distribution.
49. The microscopy system of claim 41, wherein the at least one parameter includes a particle position, an amplitude, and a background signal.
50. The microscopy system of claim 41, wherein the at least one parameter includes an interference phase parameter, a polarization parameter, and a wavelength parameter.
US12/936,095 2008-04-01 2009-03-30 3D Biplane Microscopy Abandoned US20110025831A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/060,730 US7772569B2 (en) 2008-04-01 2008-04-01 3D biplane microscopy
PCT/US2009/038799 WO2009146016A1 (en) 2008-04-01 2009-03-30 3d biplane microscopy

Publications (1)

Publication Number Publication Date
US20110025831A1 true US20110025831A1 (en) 2011-02-03

Family

ID=41115683

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/060,730 Active 2029-02-05 US7772569B2 (en) 2008-04-01 2008-04-01 3D biplane microscopy
US12/936,095 Abandoned US20110025831A1 (en) 2008-04-01 2009-03-30 3D Biplane Microscopy
US12/826,422 Expired - Fee Related US7880149B2 (en) 2008-04-01 2010-06-29 3D biplane microscopy

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/060,730 Active 2029-02-05 US7772569B2 (en) 2008-04-01 2008-04-01 3D biplane microscopy

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/826,422 Expired - Fee Related US7880149B2 (en) 2008-04-01 2010-06-29 3D biplane microscopy

Country Status (3)

Country Link
US (3) US7772569B2 (en)
EP (3) EP2631633A1 (en)
WO (1) WO2009146016A1 (en)

Families Citing this family (61)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7656172B2 (en) * 2005-01-31 2010-02-02 Cascade Microtech, Inc. System for testing semiconductors
EP2023812B1 (en) 2006-05-19 2016-01-27 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US7838302B2 (en) 2006-08-07 2010-11-23 President And Fellows Of Harvard College Sub-diffraction limit image resolution and other imaging techniques
US8217992B2 (en) * 2007-01-11 2012-07-10 The Jackson Laboratory Microscopic imaging techniques
CN105403545B (en) * 2007-12-21 2019-05-28 哈佛大学 Sub- diffraction limit image resolution in three-dimensional
JP5337676B2 (en) * 2009-06-25 2013-11-06 株式会社日立ハイテクノロジーズ Fluorescence analyzer and fluorescence detector
DE102009031231A1 (en) * 2009-06-26 2010-12-30 Carl Zeiss Microlmaging Gmbh Methods and arrangements for fluorescence microscopy
CN102713571B (en) * 2009-09-28 2015-08-05 皇家飞利浦电子股份有限公司 There is the sensor device of imaging optic element
DE102009043744A1 (en) 2009-09-30 2011-03-31 Carl Zeiss Microlmaging Gmbh Method and microscope for three-dimensional resolution-enhanced microscopy
DE102009060793A1 (en) * 2009-12-22 2011-07-28 Carl Zeiss Microlmaging GmbH, 07745 High-resolution microscope and method for two- or three-dimensional position determination of objects
US8237786B2 (en) * 2009-12-23 2012-08-07 Applied Precision, Inc. System and method for dense-stochastic-sampling imaging
DE102010007730B4 (en) * 2010-02-12 2021-08-26 Leica Microsystems Cms Gmbh Method and device for setting a suitable evaluation parameter for a fluorescence microscope
DE102010013223B4 (en) * 2010-03-29 2016-05-12 Lavision Biotec Gmbh Method and arrangement for microscopy
GB201007055D0 (en) * 2010-04-28 2010-06-09 Vib Vzw Method and apparatus for the imaging of a labelled sample
US20120092480A1 (en) * 2010-05-28 2012-04-19 Putman Matthew C Unique digital imaging method employing known background
FR2966258B1 (en) * 2010-10-15 2013-05-03 Bioaxial FLUORESCENCE SUPERRESOLUTION MICROSCOPY SYSTEM AND METHOD FOR BIOLOGICAL APPLICATIONS
DE102010049751B4 (en) 2010-10-29 2020-11-05 "Stiftung Caesar" (Center Of Advanced European Studies And Research) Optical beam splitter for the simultaneous recording of a Z-stack on a semiconductor chip, kit for the construction of an optical beam splitter and light microscope
DE102010044013A1 (en) * 2010-11-16 2012-05-16 Carl Zeiss Microimaging Gmbh Depth resolution enhanced microscopy
KR101669214B1 (en) 2010-12-31 2016-10-25 삼성전자주식회사 Scanning lens apparatus adopting bimorph actuator
DE102011005432A1 (en) 2011-03-11 2012-09-13 Hellma Gmbh & Co. Kg Device for the analysis of a small amount of liquid
DE102011007751B4 (en) * 2011-04-20 2023-10-19 Carl Zeiss Microscopy Gmbh Wide-field microscope and method for wide-field microscopy
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
DE102011053232B4 (en) 2011-09-02 2020-08-06 Leica Microsystems Cms Gmbh Microscopic device and microscopic method for the three-dimensional localization of punctiform objects
CN103033129B (en) * 2011-10-07 2015-10-21 财团法人工业技术研究院 Optical apparatus and optical addressing method
DE102011087770A1 (en) 2011-12-05 2013-06-27 Technische Universität Braunschweig High-resolution microscope
DE102012201003A1 (en) * 2012-01-24 2013-07-25 Carl Zeiss Microscopy Gmbh Microscope and method for high-resolution 3-D fluorescence microscopy
EP2839298B1 (en) 2012-04-13 2022-06-01 Bioaxial SAS Optical measurement method and device
US9103784B1 (en) 2012-11-16 2015-08-11 Iowa State University Research Foundation, Inc. Fluorescence axial localization with nanometer accuracy and precision
JP2014115151A (en) * 2012-12-07 2014-06-26 Shimadzu Corp Optical imaging device
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
CN105392423B (en) 2013-02-01 2018-08-17 凯内蒂科尔股份有限公司 The motion tracking system of real-time adaptive motion compensation in biomedical imaging
US9435993B2 (en) 2013-03-24 2016-09-06 Bruker Nano, Inc. Three dimensional microscopy imaging
DE102013208415B4 (en) 2013-05-07 2023-12-28 Carl Zeiss Microscopy Gmbh Microscope and method for 3D high-resolution localization microscopy
DE102013208926A1 (en) * 2013-05-14 2014-11-20 Carl Zeiss Microscopy Gmbh Method for 3D high-resolution localization microscopy
DE102013106895B4 (en) 2013-07-01 2015-09-17 Leica Microsystems Cms Gmbh Light microscopic method for the localization of point objects
CN107655812A (en) * 2013-12-18 2018-02-02 香港科技大学 Method, system and the prismatic light chip device of deep layer cells super-resolution imaging
CN106572810A (en) 2014-03-24 2017-04-19 凯内蒂科尔股份有限公司 Systems, methods, and devices for removing prospective motion correction from medical imaging scans
TWI480536B (en) * 2014-05-20 2015-04-11 Univ Nat Taiwan System for analyzing fluorescence intensity and synthesizing fluorescence image and method thereof
EP3188660A4 (en) 2014-07-23 2018-05-16 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10921255B2 (en) 2014-12-09 2021-02-16 Bioaxial Sas Optical measuring device and process
CN104568877A (en) * 2014-12-25 2015-04-29 中国科学院苏州生物医学工程技术研究所 Stochastic optical reconstruction microscopy system and method based on LED light sources
JP6635052B2 (en) * 2015-02-05 2020-01-22 株式会社ニコン Structured illumination microscope and observation method
DE102015004104B4 (en) 2015-03-27 2020-09-03 Laser-Laboratorium Göttingen e.V. Method for localizing at least one emitter by means of a localization microscope
US10989661B2 (en) 2015-05-01 2021-04-27 The Board Of Regents Of The University Of Texas System Uniform and scalable light-sheets generated by extended focusing
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
DE102015121403A1 (en) * 2015-12-09 2017-06-14 Carl Zeiss Microscopy Gmbh LIGHT FIELD IMAGING WITH SCANOPTICS
US10663750B2 (en) * 2016-03-15 2020-05-26 The Regents Of The University Of Colorado, A Body Super-resolution imaging of extended objects
WO2017180680A1 (en) 2016-04-12 2017-10-19 The Board Of Regents Of The University Of Texas System LIGHT-SHEET MICROSCOPE WITH PARALLELIZED 3D lMAGE ACQUISITION
DE102016116620B3 (en) * 2016-09-06 2017-11-02 Stiftung Caesar Center Of Advanced European Studies And Research Beam guidance unit and system of beam guidance units and their use
DE102016119263B4 (en) * 2016-10-10 2018-06-07 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. A method for spatially high-resolution determination of the location of a singled, excitable light for the emission of luminescent light molecule in a sample
DE102016119262B4 (en) * 2016-10-10 2018-06-07 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. A method for spatially high-resolution determination of the location of a singled, excitable light for the emission of luminescent light molecule in a sample
DE102017211031A1 (en) 2016-11-21 2018-05-24 Carl Zeiss Microscopy Gmbh Method and microscope for determining a fluorescence intensity
EP3664705A4 (en) * 2017-08-09 2021-09-29 Allen Institute Systems, devices, and methods for image processing to generate an image having predictive tagging
DE102017129519B4 (en) * 2017-12-12 2020-08-06 Technische Universität Ilmenau Arrangement and method for the simultaneous measurement of the fluorescence of individual layers in a layer system, for example the fundus
DE102018105308A1 (en) * 2018-03-08 2019-09-12 Carl Zeiss Microscopy Gmbh Microscope and method for microscopy of a sample for displaying images with extended depth of field or three-dimensional images
CN108507986A (en) * 2018-03-17 2018-09-07 杨佳苗 The discrete fluorescence spectrum of differential confocal and fluorescence lifetime detection method and device
CN110231320B (en) * 2019-06-05 2021-06-22 复旦大学 Sub-millisecond real-time three-dimensional super-resolution microscopic imaging system
US11635607B2 (en) 2020-05-18 2023-04-25 Northwestern University Spectroscopic single-molecule localization microscopy

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE465009B (en) 1989-11-07 1991-07-15 Skaanemejerier Ek Foer FOOD PRODUCTS WITH LOW FAT CONTENT AND PROCEDURES FOR PREPARING THEREOF
SE9001235L (en) 1990-04-04 1991-10-05 Faerg Ab Nv PROCEDURES AND EQUIPMENT BEFORE SHIPPING
DE4040441A1 (en) 1990-12-18 1992-07-02 Hell Stefan DOUBLE CONFOCAL GRID MICROSCOPE
WO1996006003A1 (en) 1993-06-04 1996-02-29 Mats Gustafsson A floating platform stabilizing arrangement
EP0783428B1 (en) 1994-08-25 1999-05-19 Mats Gustafsson A floating platform stabilizing arrangement
AU6718498A (en) 1997-02-22 1998-09-09 Universitat Heidelberg Marking of nucleic acids with special probe mixtures
WO1999002974A1 (en) 1997-07-10 1999-01-21 Ruprecht-Karls-Universität Heidelberg Wave field microscope, method for a wave field microscope, including for dna sequencing, and calibration method for wave field microscopy
SE9901302L (en) 1998-12-01 2000-06-02 Ericsson Telefon Ab L M Method and device in a communication network
SE0002587D0 (en) 2000-07-07 2000-07-07 Ericsson Telefon Ab L M Rake receiver and method related to a rake receiver
US6909150B2 (en) * 2001-07-23 2005-06-21 Agere Systems Inc. Mixed signal integrated circuit with improved isolation
EP1359452B1 (en) 2002-05-03 2006-05-03 Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. Confocal microscope having two micro-lens arrays and a pinhole array
JP2006522989A (en) 2003-04-13 2006-10-05 マックス−プランク−ゲゼルシャフト・ツーア・フェルデルング・デア・ヴィセンシャフテン・エー.ファウ. Production of constant structures with high spatial resolution
JP5414147B2 (en) 2003-04-13 2014-02-12 マックス−プランク−ゲゼルシヤフト・ツーア・フェルデルング・デア・ヴィッセンシャフテン・アインゲトラーゲナー・フェライン Stereoscopic high resolution imaging
SE525789C2 (en) 2003-07-17 2005-04-26 Delaval Holding Ab Method and apparatus for indicating a state of health of a dairy animal
EP1582858A1 (en) 2004-03-29 2005-10-05 Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. Method to excite molecules from a first state to a second state with an optical signal
JP4605447B2 (en) 2004-09-17 2011-01-05 横河電機株式会社 3D confocal microscope system
DE102005012739B4 (en) 2005-03-19 2010-09-16 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method for producing spatial fine structures
DE102005013969A1 (en) 2005-03-26 2006-10-05 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method for the microscopic examination of a spatial fine structure
DE102005020003B4 (en) 2005-04-27 2007-10-11 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. fluorescence microscope
SE528838C2 (en) 2005-04-29 2007-02-27 Delaval Holding Ab Detection method and arrangement for dairy cattle
GB2416261A (en) 2005-05-21 2006-01-18 Zeiss Carl Jena Gmbh Laser scanning microscope with parallel illumination and simultaneous, locally resolved detection
US7887803B2 (en) * 2005-12-02 2011-02-15 Amorfix Life Sciences Methods and compositions to treat misfolded-SOD1 mediated diseases
SE529453C2 (en) 2005-12-02 2007-08-14 Tetra Laval Holdings & Finance Method for detecting leaks in a heat exchanger
US7855690B2 (en) 2005-12-23 2010-12-21 Telefonaktiebolaget L M Ericsson (Publ) Array antenna with enhanced scanning
ATE449190T1 (en) 2006-03-25 2009-12-15 Univ Ruprecht Karls Heidelberg METHOD FOR MICROSCOPIC DETERMINING THE LOCATION OF A SELECTED, INTRACELLULAR DNA SECTION OF KNOWN NUCLEOTIDE SEQUENCE

Patent Citations (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4748980A (en) * 1984-11-23 1988-06-07 Christoph Cremer Application of cuts to biological material
US4621911A (en) * 1985-03-12 1986-11-11 Carnegie-Mellon University Standing wave luminescence microscopy
US5888734A (en) * 1992-05-22 1999-03-30 Cremer; Christoph Method for preparing and hybridizing specific probes
US6005916A (en) * 1992-10-14 1999-12-21 Techniscan, Inc. Apparatus and method for imaging with wavefields using inverse scattering techniques
US5731588A (en) * 1994-02-01 1998-03-24 Hell; Stefan Process and device for optically measuring a point on a sample with high local resolution
US5777732A (en) * 1994-04-28 1998-07-07 Hanninen; Pekka Luminescence-scanning microscopy process and a luminescence scanning microscope utilizing picosecond or greater pulse lasers
USRE38307E1 (en) * 1995-02-03 2003-11-11 The Regents Of The University Of California Method and apparatus for three-dimensional microscopy with enhanced resolution
US5874726A (en) * 1995-10-10 1999-02-23 Iowa State University Research Foundation Probe-type near-field confocal having feedback for adjusting probe distance
US6210977B1 (en) * 1996-01-17 2001-04-03 Micronas Intermetall Gmbh Measuring device and method for making same
US5922543A (en) * 1996-03-15 1999-07-13 Universitat Heidelberg Detection as chromosomal translocations by extending and ligating differentially-labeled probes Hybridized on different sides of a break-point
US6262423B1 (en) * 1996-12-22 2001-07-17 Max-Planck-Gesellschaft Zur Forderung Der Wissenschaften E. V. Scanning microscope in which a sample is simultaneously and optically excited at various points
US6424421B1 (en) * 1996-12-23 2002-07-23 Ruprecht-Karls-Universität Heidelberg Method and devices for measuring distances between object structures
US20060050146A1 (en) * 1997-04-09 2006-03-09 Richardson Technologies Inc. Color translating UV microscope
US5851052A (en) * 1997-10-22 1998-12-22 Gustafsson; Mats Foldable stool
US6135557A (en) * 1997-10-22 2000-10-24 Multiw Inc. Foldable stool
US6337472B1 (en) * 1998-10-19 2002-01-08 The University Of Texas System Board Of Regents Light imaging microscope having spatially resolved images
US20020030811A1 (en) * 1998-10-28 2002-03-14 Hansgeorg Schindler Arrangement for visualizing molecules
US6608717B1 (en) * 1999-01-29 2003-08-19 Colorado State University Research Foundation Optical coherence microscope and methods of use for rapid in vivo three-dimensional visualization of biological function
US6909105B1 (en) * 1999-03-02 2005-06-21 Max-Planck-Gesellschaft Zur Forderung Der Wissenschaften E.V. Method and device for representing an object
US20020101593A1 (en) * 2000-04-28 2002-08-01 Massachusetts Institute Of Technology Methods and systems using field-based light scattering spectroscopy
US20020023979A1 (en) * 2000-06-22 2002-02-28 Opm Fishing Tackle Ltd. Bearing arrangement
US20020064789A1 (en) * 2000-08-24 2002-05-30 Shimon Weiss Ultrahigh resolution multicolor colocalization of single fluorescent probes
US20020076200A1 (en) * 2000-10-23 2002-06-20 Takuro Hamaguchi Host system, driving apparatus, information recording and reading method for the host system, and information recording and reading method for the driving apparatus
US6804385B2 (en) * 2000-10-24 2004-10-12 Oncosis Method and device for selectively targeting cells within a three-dimensional specimen
US20060187974A1 (en) * 2001-01-30 2006-08-24 Marcos Dantus Control system and apparatus for use with ultra-fast laser
US20040114138A1 (en) * 2001-04-12 2004-06-17 Stefan Hell Method and device for multi photon excitation of a sample
US7115885B2 (en) * 2001-04-12 2006-10-03 Max-Planck-Gesellschaft zur Förderung der Wissen-schaften e.V. Method and device for multi photon excitation of a sample
US7009700B2 (en) * 2001-06-29 2006-03-07 Universite Libre De Bruxelles Method and device for obtaining a sample with three-dimensional microscopy
US7151246B2 (en) * 2001-07-06 2006-12-19 Palantyr Research, Llc Imaging system and methodology
US7105795B2 (en) * 2001-07-06 2006-09-12 Palantyr Research, Llc Imaging system, methodology, and applications employing reciprocal space optical design
US7298461B2 (en) * 2001-10-09 2007-11-20 Ruprecht-Karls-Universitat Far field light microscopical method, system and computer program product for analysing at least one object having a subwavelength size
US7253893B2 (en) * 2001-11-09 2007-08-07 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method and apparatus for spatially limited excitation of an optical transition
US20040207854A1 (en) * 2001-11-09 2004-10-21 Stefan Hell Method and apparatus for spatially limited excitation of an optical transition
US20040133112A1 (en) * 2002-03-08 2004-07-08 Milind Rajadhyaksha System and method for macroscopic and confocal imaging of tissue
US20050238118A1 (en) * 2002-04-29 2005-10-27 Daniel Asraf Method for detecting a signal contaminated by additive gaussian noise and a detector using the method
US6934079B2 (en) * 2002-05-03 2005-08-23 Max-Planck-Gesellschaft zur Förderung der Wissen-schaften e. V. Confocal microscope comprising two microlens arrays and a pinhole diaphragm array
US20050094261A1 (en) * 2002-05-03 2005-05-05 Stefan Hell Confocal microscope comprising two microlens arrays and a pinhole diaphragm array
US7154598B2 (en) * 2002-07-12 2006-12-26 Decision Biomarkers, Inc. Excitation and imaging of fluorescent arrays
US20060038993A1 (en) * 2003-04-13 2006-02-23 Stefan Hell High spatial resolution imaging
US7064824B2 (en) * 2003-04-13 2006-06-20 Max-Planck-Gesellschaft Zur Forderung Der Wissenschaften E.V. High spatial resoulution imaging and modification of structures
US20040212799A1 (en) * 2003-04-13 2004-10-28 Max-Planck-Gesellschaft Zur Forderung Der Wissenschaften E.V. High spatial resoulution imaging and modification of structures
US20060044985A1 (en) * 2003-04-13 2006-03-02 Stefan Hell Creating a permanet structure with high spatial resolution
US20050036667A1 (en) * 2003-08-15 2005-02-17 Massachusetts Institute Of Technology Systems and methods for volumetric tissue scanning microscopy
US20070109555A1 (en) * 2003-10-09 2007-05-17 Phase Holographic Imaging Phi Ab Method and apparatus for holographic refractometry
US7256894B2 (en) * 2003-10-20 2007-08-14 The Regents Of The University Of California Method and apparatus for performing second harmonic optical coherence tomography
US20080289966A1 (en) * 2004-01-29 2008-11-27 Joel Voldman Microscale sorting cytometer
US20070165225A1 (en) * 2004-03-06 2007-07-19 Michael Trainer Methods and apparatus for determining the size and shape of particles
US20050259008A1 (en) * 2004-05-21 2005-11-24 Telefonaktiebolaget Lm Ericsson (Publ) Broadband array antennas using complementary antenna
US20060013492A1 (en) * 2004-07-16 2006-01-19 Frank Hecht Process for the acquisition of images from a probe with a light scanning electron microscope with punctiform light source distribution
US20070053594A1 (en) * 2004-07-16 2007-03-08 Frank Hecht Process for the acquisition of images from a probe with a light scanning electron microscope
US20060012870A1 (en) * 2004-07-16 2006-01-19 Ralf Engelmann Light scanning microscope with line-by-line scanning and use
US7253408B2 (en) * 2004-08-31 2007-08-07 West Paul E Environmental cell for a scanning probe microscope
US20080312540A1 (en) * 2004-12-08 2008-12-18 Vasilis Ntziachristos System and Method for Normalized Flourescence or Bioluminescence Imaging
US20060171846A1 (en) * 2005-01-10 2006-08-03 Marr David W M Microfluidic systems incorporating integrated optical waveguides
US20060256338A1 (en) * 2005-01-31 2006-11-16 The Board Of Trustees Of The University Of Iilinois Methods and devices for characterizing particles in clear and turbid media
US20070069940A1 (en) * 2005-02-28 2007-03-29 Telefonaktiebolaget Lm Ericsson (Publ) Method and arrangement for reducing the radar cross section of integrated antennas
US20090011948A1 (en) * 2005-04-25 2009-01-08 Unlu M Selim Structured Substrates for Optical Surface Profiling
US20080070323A1 (en) * 2005-05-23 2008-03-20 Robert Betzig Optical microscopy with phototransformable optical labels
US7535012B2 (en) * 2005-05-23 2009-05-19 Robert Eric Betzig Optical microscopy with phototransformable optical labels
US20090206251A1 (en) * 2005-05-23 2009-08-20 Robert Eric Betzig Optical microscopy with phototransformable optical labels
US20070047287A1 (en) * 2005-08-26 2007-03-01 Max-Planck-Gesellchaft Zur Forderung Der Wissenschaften E.V. Method and apparatus for storing a three-dimensional arrangement of data bits in a solid-state body
US20070065936A1 (en) * 2005-09-22 2007-03-22 Kazuhiro Hasegawa Tissue culture microscope apparatus
US20070160175A1 (en) * 2005-09-23 2007-07-12 Lang Matthew J Systems and methods for force-fluorescence microscopy
US20080158551A1 (en) * 2006-12-21 2008-07-03 Hess Harald F Systems and methods for 3-dimensional interferometric microscopy
US20090237501A1 (en) * 2008-03-19 2009-09-24 Ruprecht-Karis-Universitat Heidelberg Kirchhoff-Institut Fur Physik method and an apparatus for localization of single dye molecules in the fluorescent microscopy

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8994807B2 (en) 2009-03-18 2015-03-31 University Of Utah Research Foundation Microscopy system and method for creating three dimensional images using probe molecules
US20110090327A1 (en) * 2009-10-15 2011-04-21 General Electric Company System and method for imaging with enhanced depth of field
US20170131350A1 (en) * 2010-05-12 2017-05-11 International Business Machines Corporation Method and system for quickly identifying circuit components in an emission image
US10895596B2 (en) * 2010-05-12 2021-01-19 International Business Machines Corporation Method and system for quickly identifying circuit components in an emission image
US10571674B2 (en) * 2010-09-24 2020-02-25 Carl Zeiss Microscopy Gmbh 3D localization microscopy and 4D localization microscopy and tracking methods and systems
WO2012154333A1 (en) * 2011-04-07 2012-11-15 The Uwm Research Foundation, Inc. High speed microscope with spectral resolution
US8982206B2 (en) 2011-04-07 2015-03-17 Uwm Research Foundation, Inc. High speed microscope with narrow detector and pixel binning
US9103721B2 (en) 2011-04-07 2015-08-11 Uwm Research Foundation, Inc. High speed microscope with spectral resolution
US20150133048A1 (en) * 2012-04-27 2015-05-14 Sony Corporation Information processing device, information processing method, and program
US20180234866A1 (en) * 2012-08-03 2018-08-16 Intel Corporation Network assistance for device-to-device discovery
US20140064147A1 (en) * 2012-08-29 2014-03-06 Qualcomm Incorporated Methods and apparatus for wan enabled peer discovery
US9532032B2 (en) * 2013-04-18 2016-12-27 Ellis Amalgamated, LLC Astigmatic depth from defocus imaging using intermediate images and a merit function map
US20150015676A1 (en) * 2013-04-18 2015-01-15 Ellis Amalgamated, LLC dba Optics for Hire Astigmatic depth from defocus imaging
US10791318B2 (en) 2015-04-10 2020-09-29 The Board Of Trustees Of The Leland Stanford Junior University Multi-wavelength phase mask
US10187626B2 (en) 2015-04-10 2019-01-22 The Board Of Trustees Of The Leland Stanford Junior University Apparatuses and methods for three-dimensional imaging of an object
US10341640B2 (en) 2015-04-10 2019-07-02 The Board Of Trustees Of The Leland Stanford Junior University Multi-wavelength phase mask
US10638112B2 (en) * 2015-04-10 2020-04-28 The Board Of Trustees Of The Leland Stanford Junior University Apparatuses and methods for three-dimensional imaging of an object
US20160330436A1 (en) * 2015-05-05 2016-11-10 Goodrich Corporation Multi-axis center of mass balancing system for an optical gimbal assembly guided by inertial measurement
US10015481B2 (en) * 2015-05-05 2018-07-03 Goodrich Corporation Multi-axis center of mass balancing system for an optical gimbal assembly guided by inertial measurement
DE102015121920A1 (en) * 2015-12-16 2017-06-22 Carl Zeiss Microscopy Gmbh High-resolution short-term microscopy method and high-resolution short-term microscope
US10119914B2 (en) 2015-12-16 2018-11-06 Carl Zeiss Microscopy Gmbh Fast high-resolution microscopy method and fast high-resolution microscope
US10783697B2 (en) 2016-02-26 2020-09-22 Yale University Systems, methods, and computer-readable media for ultra-high resolution 3D imaging of whole cells
US20170366965A1 (en) * 2016-06-21 2017-12-21 Chiun Mai Communication Systems, Inc. Communication device, communication system and method therefor
US11209367B2 (en) * 2018-08-27 2021-12-28 Yale University Multi-color imaging using salvaged fluorescence
US20210302316A1 (en) * 2020-03-27 2021-09-30 Leica Microsystems Cms Gmbh Method and device for estimating a sted resolution
US11841324B2 (en) * 2020-03-27 2023-12-12 Leica Microsystems Cms Gmbh Method and device for estimating a STED resolution
DE102020116547A1 (en) 2020-06-23 2021-12-23 Abberior Instruments Gmbh Method for localizing individual molecules of a dye in a sample and for generating high-resolution images of a structure in a sample

Also Published As

Publication number Publication date
WO2009146016A1 (en) 2009-12-03
EP2265932A1 (en) 2010-12-29
EP2265932A4 (en) 2011-12-07
US7772569B2 (en) 2010-08-10
US7880149B2 (en) 2011-02-01
US20090242798A1 (en) 2009-10-01
EP2631633A1 (en) 2013-08-28
US20100265318A1 (en) 2010-10-21
EP2631632A1 (en) 2013-08-28
EP2265932B1 (en) 2013-05-22

Similar Documents

Publication Publication Date Title
EP2265932B1 (en) 3d biplane microscopy
US10944896B2 (en) Single-frame autofocusing using multi-LED illumination
Jouchet et al. Nanometric axial localization of single fluorescent molecules with modulated excitation
US8994807B2 (en) Microscopy system and method for creating three dimensional images using probe molecules
JP6934856B2 (en) Optical sheet microscope that simultaneously images multiple target surfaces
US8217992B2 (en) Microscopic imaging techniques
Mlodzianoski et al. Experimental characterization of 3D localization techniques for particle-tracking and super-resolution microscopy
Juette et al. Three-dimensional sub–100 nm resolution fluorescence microscopy of thick samples
US8155409B2 (en) Wave field microscope with sub-wavelength resolution and methods for processing microscopic images to detect objects with sub-wavelength dimensions
US11946854B2 (en) Systems and methods for two-dimensional fluorescence wave propagation onto surfaces using deep learning
US20120287244A1 (en) Non-coherent light microscopy
US20140340482A1 (en) Three Dimensional Microscopy Imaging
JP6637653B2 (en) Microscope and SPIM microscopy method
US20150098126A1 (en) Multiview Light-Sheet Microscopy
Dobbie et al. OMX: A new platform for multi-modal, multi-channel widefield imaging
Chen et al. Superresolution structured illumination microscopy reconstruction algorithms: a review
US20160313548A1 (en) Method for capturing image of three-dimensional structure of specimen and microscopic device
US20140022373A1 (en) Correlative drift correction
US11947098B2 (en) Multi-focal light-sheet structured illumination fluorescence microscopy system
Torres-García et al. Extending resolution within a single imaging frame
US11327018B2 (en) Sub-diffraction imaging, coding and decoding of non-bleaching scatters
Herrmannsdörfer et al. 3D d STORM imaging of fixed brain tissue
Dalgarno et al. Nanometric depth resolution from multi-focal images in microscopy
Naredi‐Rainer et al. Confocal microscopy
Schüttpelz et al. dSTORM: real-time subdiffraction-resolution fluorescence imaging with organic fluorophores

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE JACKSON LABORATORY, MAINE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BEWERSDORF, JOERG;JUETTE, MANUEL F.;REEL/FRAME:025083/0364

Effective date: 20080422

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION