WO2009146016A1 - 3d biplane microscopy - Google Patents

Info

Publication number
WO2009146016A1
Authority
WO
WIPO (PCT)
Prior art keywords
particle
sample
microscopy system
light
probes
Prior art date
Application number
PCT/US2009/038799
Other languages
French (fr)
Inventor
Joerg Bewersdorf
Manuel F. Juette
Travis Gould
Sam T. Hess
Original Assignee
The Jackson Laboratory
University Of Maine
Priority date
Filing date
Publication date
Application filed by The Jackson Laboratory, University Of Maine filed Critical The Jackson Laboratory
Priority to US 12/936,095, published as US20110025831A1
Priority to EP09755379.6A, granted as EP2265932B1
Publication of WO2009146016A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365 Control or image processing arrangements for digital or video microscopes
    • G02B21/367 Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/16 Microscopes adapted for ultraviolet illumination; Fluorescence microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/361 Optical details, e.g. image relay to the camera or image sensor
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/58 Optics for apodization or superresolution; Optical synthetic aperture systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N13/218 Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/254 Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074 Stereoscopic image analysis
    • H04N2013/0085 Motion estimation from stereoscopic image signals

Definitions

  • the present invention relates generally to microscopic imaging and, more specifically, to three-dimensional ("3D") sub-100 nanometer resolution by biplane microscope imaging.
  • STED microscopy and other members of the reversible saturable optical fluorescence transitions ("RESOLFT") family achieve a resolution > 10-fold beyond the diffraction barrier by engineering the microscope's point-spread function ("PSF"), also referred to as a "reference data set," through optically saturable transitions of the (fluorescent) probe molecules.
  • FPALM fluorescence photoactivation localization microscopy
  • PALM photoactivation localization microscopy
  • STORM stochastic optical reconstruction microscopy
  • PALMIRA PALM with independently running acquisition
  • FPALM is described in more detail in an article titled "Ultra-High Resolution Imaging by Fluorescence Photoactivation Localization Microscopy" by Samuel T. Hess et al. (91 Biophysical Journal, 4258-4272, December 2006), which is incorporated herein by reference in its entirety.
  • PALM is described in more detail in an article titled “Imaging Intracellular Fluorescent Proteins at Nanometer Resolution” by Eric Betzig et al. (313 Science, 1642-1645, September 15, 2006), which is incorporated herein by reference in its entirety.
  • STORM is described in more detail in an article titled "Sub-Diffraction-Limit Imaging by Stochastic Optical Reconstruction Microscopy" by Michael J. Rust et al. (Nature Methods / Advance Online Publication, August 9, 2006), which is incorporated herein by reference in its entirety.
  • PALMIRA is described in more detail in an article titled "Resolution of λ/10 in Fluorescence Microscopy Using Fast Single Molecule Photo-Switching" by H. Bock et al. (88 Applied Physics A, 223-226, 2007), which is incorporated herein by reference in its entirety.
  • photosensitive refers to both photo-activatable (e.g., switching probes between an on state and an off state) and photo-switching (e.g., switching between a first color and a second color).
  • the sample is labeled with photo-sensitive probes, such as photo-activatable ("PA”) fluorescent probes (e.g., PA proteins or caged organic dyes).
  • PA photo-activatable
  • Activation of only a sparse subset of molecules at a time allows their separate localization.
  • the final sub-diffraction image of the labeled structure is generated by plotting the positions of some or all localized molecules.
  • Particle-tracking techniques can localize small objects (typically ⁇ diffraction limit) in live cells with sub-diffraction accuracy and track their movement over time. But conventional particle-tracking fluorescence microscopy cannot temporally resolve interactions of organelles, molecular machines, or even single proteins, which typically happen within milliseconds.
  • a particular 3D particle-tracking technique can track particles only with 32 milliseconds time resolution.
  • This technique scans a 2-photon excitation focus in a 3D orbit around the fluorescent particle and determines its 3D position by analyzing the temporal fluorescence fluctuations. The temporal resolution is ultimately limited by the frequency with which the focus can revolve in 3D around the particle.
  • This technique is described in more detail in an article titled "3-D Particle Tracking In A Two-Photon Microscope: Application To The Study Of Molecular Dynamics IN Cells" by V. Levi, Q. Ruan, and E. Gratton (Biophys. J., 2005, 88(4): pp. 2919-28), which is incorporated by reference in its entirety.
  • another current 3D particle-tracking technique combines traditional particle-tracking with widefield "bifocal detection" images. Particles are simultaneously detected in one plane close to the focal plane of the particle and a second plane 1 micrometer out of focus. The lateral and axial coordinates are derived from the 2 images.
  • the temporal resolution is limited to the 2-50 milliseconds range, and the localization accuracy is limited to the 2-5 nanometer range. Additional details are described in an article titled "Three-Dimensional Particle Tracking Via Bifocal Imaging" by E. Toprak et al. (Nano Lett., 2007, 7(7): pp. 2043-45), which is incorporated by reference in its entirety. As such, advances in temporal resolution to sub-millisecond levels have been limited only to 2D imaging.
  • determining the 3D position of a particle by any of the methods mentioned above requires fitting a model function to the respective experimental data.
  • the particle position (and also, typically, the particle brightness and background value) can be deduced from the parameters that fit the experimental data best, according to a chosen figure of merit.
  • an analytical function is used to reasonably model the characteristics that dominantly describe the 3D particle position, e.g., the diameter of the defocused image or the particle ellipticity in the case of astigmatism.
  • the model function is calibrated with imaged particles located at known positions to achieve mapping of the determined fit parameters to real spatial positions.
  • a microscopy system is configured for creating 3D images from individually localized probe molecules.
  • the microscopy system includes a sample stage, an activation light source, a readout light source, a beam splitting device, at least one camera, and a controller.
  • the activation light source activates probes of at least one probe subset of photo-sensitive luminescent probes
  • the readout light source causes luminescence light from the activated probes.
  • in some embodiments, the activation light source and the readout light source are the same light source.
  • the beam splitting device splits the luminescence light into at least two paths to create at least two detection planes that correspond to the same or different number of object planes of the sample.
  • a method for creating 3D images from individually localized probe molecules includes mounting a sample on a sample stage, the sample having a plurality of photo-sensitive luminescent probes. In response to illuminating the sample with an activation light, probes of at least one probe subset of the plurality of photo-sensitive luminescent probes are activated. In response to illuminating the sample with a readout light, luminescence light from the activated probes is caused.
  • the luminescence light is split into at least two paths to create at least two detection planes, the at least two detection planes corresponding to the same or different object planes in the sample. The at least two detection planes are detected via a camera. The object planes are recorded in corresponding recorded regions of interest in the camera. A signal from the regions of interest is combined into a 3D data stack.
  • a microscopy system is configured for tracking microscopic particles in 3D.
  • the system includes a sample, a sample stage, at least one light source, a beam- steering device, a beam splitting device, at least one camera, and a controller.
  • the sample, which includes luminescent particles, is mounted to the sample stage.
  • the light source is configured to illuminate an area of the sample to cause luminescence light primarily from one tracked particle of the luminescent particles.
  • the beam-steering device is configured to selectively move a light beam to illuminate different areas of the sample such that the luminescence light is detected.
  • the beam splitting device, which is located in a detection light path, splits the luminescence light into at least two paths to create at least two detection planes that correspond to different object planes in the sample.
  • the camera is positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest.
  • the controller is programmable to combine a signal from the recorded regions of interest, determine a 3D trajectory of the particle at each time point of a recorded data sequence, and move the beam-steering device to illuminate the different areas of the sample in accordance with corresponding positions of the one tracked particle.
  • a method for tracking microscopic particles in 3D includes mounting a sample on a sample stage, the sample including luminescent particles.
  • a small area of the sample is illuminated to cause luminescence light from primarily one particle of the luminescent particles.
  • the light beam is selectively moved to illuminate different areas of the sample to track movement of the one particle, the different areas including the small area of the sample and corresponding to respective positions of the one particle.
  • the luminescence light is split into at least two paths to create at least two detection planes that correspond to the same or different number of object planes in the sample.
  • the at least two detection planes are detected simultaneously.
  • the number of object planes is represented in a camera by the same number of recorded regions of interest. Based on a combined signal from the recorded regions of interest, a 3D trajectory of the one particle is determined at each time point of a recorded data sequence.
  • a microscopy system is configured for creating 3D images from individually localized probe molecules.
  • the microscopy system includes an activation light source, a readout light source, a beam splitting device, at least one camera, and a controller.
  • the activation light source is configured to illuminate a sample with an activation light, the activation light source being configured to activate probes of at least one probe subset of the plurality of photo-sensitive luminescent probes.
  • the readout light source is configured to illuminate the sample with a readout light, the readout light source being configured to cause luminescence light from the activated probes.
  • the beam splitting device is located in a detection light path that splits the luminescence light into at least two paths, the beam splitting device creating at least two detection planes that correspond to the same or different number of object planes of the sample.
  • the camera is positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest.
  • the controller is programmable to combine a signal from the regions of interest into a 3D data stack, to calculate a figure of merit as a measure of the difference between imaged pixel data and a reference data set h_0(ξ_i) modifiable by at least one parameter, and to optimize the figure of merit by adjusting the at least one parameter.
  • FIG. 1 is a schematic view illustrating a biplane microscope setup for Fluorescence Photoactivation Localization Microscopy ("FPALM").
  • FIG. 2 is a schematic view illustrating a biplane microscope setup, according to an alternative embodiment.
  • FIG. 3 is a schematic view illustrating a fluorescent particle image on a CCD chip.
  • FIG. 4A is a graph representing an axial resolution measured from an axial profile of caged fluorescein-labeled antibodies.
  • FIG. 4B is a representative image showing added-up projections of a data set in three different orientations for the axial resolution measured in FIG. 4A.
  • FIG. 5A is a representative image of a data set for beads labeled with caged fluorescein at an axial position of 300 nanometers.
  • FIG. 5B illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of 100 nanometers.
  • FIG. 5C illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of -100 nanometers.
  • FIG. 5D illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of -300 nanometers.
  • FIG. 5E illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of -500 nanometers.
  • FIG. 5F illustrates a volume-rendered representation of the data set illustrated in FIGs. 5A-5E.
  • FIG. 6 is a schematic view illustrating adjustment of a biplane microscope setup, according to an alternative embodiment.
  • FIG. 7A is a schematic view illustrating a fluorescent particle image on a CCD chip when the particle is in focus, in a first position.
  • FIG. 7B is a schematic view illustrating the fluorescent particle image of FIG. 7A when the particle is out of focus, in a second position.
  • FIG. 7C is a schematic view illustrating the fluorescent particle image of FIG. 7B when the particle is in focus, in a third position.
  • FIG. 8 is a diagrammatic representation of an exemplary embodiment of a photoactivation localization microscopy method.
  • FIG. 9 is a diagrammatic representation of an exemplary embodiment of a single particle tracking method.
  • FIG. 10 is a diagrammatic representation of an exemplary conventional embodiment of a non-iterative localization algorithm.
  • FIG. 11 is a diagrammatic representation of an exemplary conventional embodiment of an iterative localization algorithm.
  • FIG. 12 is a diagrammatic representation of an exemplary embodiment of a localization algorithm.
  • FIG. 13 is a diagrammatic representation of an exemplary embodiment of an algorithm for calculating a figure of merit.
  • FIG. 14 is a diagrammatic representation of an exemplary embodiment of a Fourier-Transform based Shift Algorithm.
  • a biplane (“BP") microscope system 100 allows 3D imaging at an unmatched resolution well below 100 nanometers in all three dimensions, resulting in at least a 100-fold smaller resolvable volume than obtainable by conventional 3D microscopy.
  • the BP microscope system 100 is optionally a BP FPALM system, which is generally based on a conventional FPALM design.
  • the BP microscope system 100 includes a modified detection path that allows the simultaneous detection from two focal planes.
  • the simultaneous detection of two planes for localization- based super-resolution microscopy speeds up the imaging process by making axial scanning unnecessary, and more importantly, in contrast to scanning-based systems, eliminates localization artifacts caused by abrupt blinking and bleaching common to single molecules.
  • the BP microscope system 100 can optionally be located on an air-damped optical table to minimize vibrations.
  • the BP microscope system 100 can also achieve temporal resolution below 1 millisecond.
  • the BP microscope system 100 can also be a next-generation 3D particle-tracking microscope ("3D PTM") for providing unprecedented temporal and spatial resolution when tracking fluorescent particles in live cells in 3D.
  • 3D PTM next-generation 3D particle-tracking microscope
  • FPALM and particle-tracking are just some exemplary applications of the BP microscope system 100.
  • the BP microscope system 100 tracks one particle at a time (in contrast to conventional 2D and 3D tracking techniques that visualize the entire field).
  • the BP microscope system 100 can include a detection scheme without any moving parts that detects simultaneously two axially shifted detection planes.
  • the BP microscope system 100 can include a focused laser beam for excitation combined with spatially limited detection. Background light is filtered out to avoid localization disturbances and to increase sensitivity in samples thicker than about 1 micrometer. This enables particle-tracking even in tissue sections.
  • the BP microscope system 100 can include, for example, high-speed piezo-mirrors and a fast piezo-driven sample stage. The combination of focused excitation and feedback-driven beam-tracking reduces the background and enhances the speed limit by approximately one order of magnitude.
  • a second (different) luminescence color can be detected to enable correlative studies of the movement of the tracked particle.
  • Illumination for readout and activation can be provided by a readout laser 102, operating typically at 496 nanometers, and an activation laser 104 (e.g., 50 mW, Crystalaser), operating typically at 405 nanometers.
  • the readout laser 102 is optionally a water-cooled Argon laser (e.g., Innova 70, Coherent Inc.) that can provide 458, 472, 488, 496, or 514 nanometers for readout illumination.
  • the wavelength of the readout laser 102 is selected to minimize activation of inactive probes of a plurality of photo- sensitive probes of a sample 124.
  • the readout laser 102 and the activation laser 104 can be the same source.
  • the readout laser 102 can perform both the readout functions and the activation functions, without requiring the use of the activation laser 104.
  • at least one illuminated area of the sample 124 is a relatively small area, having, for example, a general diameter that is less than about three times an Airy disk diameter.
  • Both lasers 102, 104 are combined, via a first dichroic beam splitter 110, and coupled, via a second dichroic beam splitter 120, into a microscope stand 106 equipped with a 63x 1.2NA water immersion tube lens 108 after passing through a field aperture 107.
  • Both lasers 102, 104 can be switched on and off by software-controlled electrical shutters (e.g., SH05, Thorlabs).
  • Other components that may be included along the path between the lasers 102, 104 and the microscope stand 106 are a first mirror 112 and a first lens 114.
  • the microscope stand 106 can have a plurality of components, including a sample stage 116 and an objective 118.
  • the sample 124, including for example a biological cell 124a, is generally positioned on the sample stage 116.
  • the sample stage 116 can be a mechanical stage or a three-axis piezo stage (e.g., P-733.3DD, Physik Instrumente).
  • Other components, which are not shown, may include shutters in front of the lasers 102, 104 and further optics for folding the beam path.
  • Fluorescence is collected by the objective 118, passes through a second dichroic beam splitter 120 (which reflects the laser light) and is focused by the tube lens 108 via an optional second mirror 122 (e.g., a piezo-driven mirror) into an intermediate focal plane 140.
  • the focal plane 140 is imaged by two lenses - a second lens 128 and a third lens 132 - onto a high- sensitivity EM-CCD camera 126 (e.g., DU897DCS-BV iXon, Andor Technology).
  • Scattered laser light is attenuated by bandpass and Raman edge filters (e.g., Chroma and Semrock), such as filter 130.
  • the detection scheme can be achieved by moving the CCD camera 126 out of the standard image plane closer to the tube lens 108 and thereby shifting the corresponding focal plane ~350 nanometers deeper into the sample.
  • a beam splitter cube 134 is placed into a focused light path 136a in front of the CCD camera 126.
  • the beam splitter cube 134 redirects a reflected light path 136b via a third mirror 138 towards the CCD camera 126 to form a second image in a different region of the same CCD. Due to the longer optical path, this second image corresponds to a focal plane ~350 nanometers closer to the objective 118 than the original focal plane.
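  • As a rough illustration (not part of the patent text), a focal-plane shift of this order is consistent with the axial magnification of a high-NA detection path, which scales roughly as the lateral magnification squared divided by the immersion refractive index. The numbers below (M_lat ≈ 63, n ≈ 1.33, and an image-space path difference of about 1 mm) are assumptions chosen only to illustrate the order of magnitude:

```latex
% Illustrative estimate only; M_lat, n and \Delta z_image are assumed values.
\[
M_{\mathrm{ax}} \approx \frac{M_{\mathrm{lat}}^{2}}{n} \approx \frac{63^{2}}{1.33} \approx 3.0\times 10^{3},
\qquad
\Delta z_{\mathrm{sample}} \approx \frac{\Delta z_{\mathrm{image}}}{M_{\mathrm{ax}}}
\approx \frac{1\,\mathrm{mm}}{3.0\times 10^{3}} \approx 335\,\mathrm{nm}.
\]
```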
  • the BP microscope system 100, using a single camera, is straightforward to implement and avoids synchronization problems between separate cameras.
  • the BP microscope system 100 features a reasonable field of view of ~20 x 50 micrometers² (pixel size corresponding to ~100 nanometers in the sample 124; 512 x 512 pixels), sufficient to image large portions of a cell.
  • the BP microscope system 100 is able to image 100 frames per second with a field of view of 10 to 20 micrometers in length and 2 x 2 binning.
  • the use of the CCD camera 126, which features negligible readout noise due to its on-chip electron multiplication, avoids additional noise that would otherwise result from splitting the light up into two fields as required for BP detection. Combined with the fact that there is minimal loss of fluorescence detection efficiency, this exemplary BP microscope system 100 expands conventional FPALM to 3D imaging without significant drawbacks.
  • BP FPALM technology is compatible with live cell imaging and can be expanded to multicolor imaging (even realizable on the same CCD detector).
  • BP FPALM can record 3D structures in a ~1 micrometer thick z-section without scanning. Larger volumes can be recorded by recording BP FPALM data at different sample positions.
  • BP FPALM can be combined with a 2-photon ("2P") laser scanner. 2P excitation-mediated activation is directed to diffraction-limited planes of ~800 nanometers thickness, a thickness that is compatible with the axial detection range of BP FPALM.
  • BP FPALM therefore has the potential of imaging specimens such as cell nuclei or tissue sections far exceeding 1 micrometer in thickness.
  • BP FPALM can be readily implemented in practically every existing FPALM, PALM, PALMIRA or STORM instrument. BP FPALM therefore provides the means to investigate a large variety of biological 3D structures at resolution levels previously far out of reach.
  • BP FPALM detected luminescence from activated probes is fluorescence or scattered light.
  • the activation of activated probes is achieved via a non-linear process that limits the activation to a plane of diffraction- limited thickness.
  • 100 nanometer diameter yellow-green fluorescent beads (Invitrogen, F-8803) can be attached to a poly-L-lysine coated cover slip.
  • the sample can be mounted on a piezo stage and imaged in the BP FPALM setup with 496 nm excitation.
  • 101 images at z-positions ranging from -2.5 to +2.5 micrometers with 50 nanometer step size are recorded.
  • the same bead is imaged 2 to 3 times to check for drift and to correct for bleaching.
  • the data set can be smoothed in Imspector with a Gaussian filter of sub-diffraction size. Additionally, the data set can be corrected for mono-exponential bleaching, cropped to appropriate size and to be centered and normalized to 1.
  • the beads could be localized over a range of 800 nanometers exceeding the distance between the two detection planes (in this case 500 nanometers) by more than 50%.
  • the accumulation time per frame is typically 10 milliseconds.
  • electron multiplying gain is set to 300, the readout is 2 x 2 binned, only the region occupied by two recorded regions of interest ("ROIs") is read out, and, typically, 5,000 to 50,000 frames are recorded.
  • a BP microscope system 200 is shown according to an alternative embodiment.
  • the BP microscope system 200 includes a microscope stand 202 having a piezo-driven sample stage 204 on which a sample 206 is positioned.
  • the sample 206 includes a plurality of fluorescent particles 206a-206d.
  • the microscope stand 202 further includes an objective 208 and a first lens 210.
  • Additional components are positioned between a focal plane 212 and the CCD camera 214 along a fluorescence light path 215.
  • the components include a second lens 216, a beam-steering device 218 (e.g., a piezo-driven mirror), a dichroic beam splitter 220, a bandpass filter 222, a third lens 224, a neutral 50:50 beam splitter 226, and a mirror 228.
  • the beam-steering device 218 can include generally a focusing optical element that moves illumination and detection focal planes axially to follow the tracked particle.
  • the beam-steering device 218 can include a phase- modulating device that moves an illuminated area laterally and illumination and detection focal planes axially to follow the tracked particle.
  • more than one piezo-driven mirror 218 can be included in the BP microscope system 200.
  • a polarized laser beam from a laser 229 is coupled into the microscope stand 202 and focused into the sample 206 by the objective 208.
  • a fourth lens 230 and a λ/4 plate 232 are positioned between the laser 229 and the dichroic beam splitter 220.
  • the focus can be positioned in the region of interest by moving the sample stage 204 and the beam-steering device 218.
  • the fluorescence emerging from the focal region is collected by the objective 208 and is imaged onto the CCD camera 214 via the first lens 210, the second lens 216, and the third lens 224.
  • the dichroic beam splitter 220 and the bandpass filter 222 filter out scattered excitation light and other background light.
  • the neutral 50:50 beam splitter 226 splits the fluorescence light into two beam paths, a transmitted beam 215a and a reflected beam 215b.
  • the transmitted beam 215a images light emitted from a plane deeper in the sample onto one area of the CCD chip.
  • the reflected beam 215b images light from a plane closer to the objective onto another well- separated area to avoid cross-talk.
  • two ROIs on the CCD chip represent two focal planes in the sample 206 (illustrated in FIG. 2), typically 700 nanometers apart, arranged like wings of a biplane.
  • the two ROIs include a transmitted ROI 300 and a reflected ROI 302, each having nine pixels showing an image of the fluorescent particle 206b from the sample 206.
  • the dashed areas 304a-304i, 306a-306i depict the pixels that are used for tracking the fluorescent particle 206b.
  • the two 9-pixel-areas 304a-304i, 306a-306i represent in general the position of the particle 206b in 3D.
  • the fluorescent particle 206b, which is generally smaller than the laser focus and located in the focal region, is excited homogeneously, and 3 (binned) lines (i.e., the two 9-pixel-areas represented by dashed areas 304a-304i, 306a-306i) of the CCD chip arranged around the laser focus image are read out at every time point. Particles laterally shifted with respect to the laser focus center will appear shifted on the CCD chip.
  • the two 9-pixel-areas 304a-304i, 306a-306i act in the same way as two confocal pinholes in different planes: if the particle 206b moves axially, the signal will increase in one of the 9-pixel-areas and decrease in the other.
  • An axial shift will be represented by a sharper intensity distribution in one of the two 9-pixel-areas depending on the direction of the shift.
  • the 3D position can be determined by subtracting different pixel values of the two 9-pixel-areas from each other. For the axial coordinate (z-axis), the sum of all pixels from one 9-pixel-area can be subtracted from the other 9-pixel-area. The fact that the lateral information is preserved in the 9-pixel-areas allows for lateral localization of the particle 206b at the same time.
  • the signal collected in the left columns 304a, 304d, 304g, 306a, 306d, 306g (or upper rows: 304a, 304b, 304c and 306a, 306b, 306c) of both 9-pixel-areas 300 and 302 can be subtracted from the one in the right columns 304c, 304f, 304i, 306c, 306f, 306i (or lower rows: 304g, 304h, 304i and 306g, 306h, 306i).
  • the determined values are approximately proportional to the particle position offset of the center as long as the position stays in a range of +/- 250 nanometers axially and +/- 100 nanometers laterally.
  • these values can be fed back to piezo controllers tilting piezo mirrors and moving the sample stage piezo to re-center the particle in the 9-pixel-areas after every measurement.
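  • A minimal sketch of this pixel-subtraction scheme is given below. It is illustrative only and not the patent's actual code; the array layout, sign conventions, and normalization by the total signal are assumptions of the sketch. It derives an axial signal from the difference of the summed intensities of the two 9-pixel-areas and lateral signals from column and row differences, which could then be scaled by calibration factors and fed back to the piezo controllers:

```c
/*
 * Illustrative sketch only (not the patent's code): derive re-centering
 * feedback signals from the two 3x3 pixel areas recorded in the transmitted
 * ROI (T) and the reflected ROI (R).  Array layout, sign conventions, and
 * the normalization are assumptions of this sketch.
 */
#include <stdio.h>

typedef struct { double x, y, z; } Offset;

static double sum3x3(const double p[3][3])
{
    double s = 0.0;
    for (int r = 0; r < 3; r++)
        for (int c = 0; c < 3; c++)
            s += p[r][c];
    return s;
}

static Offset estimate_offset(const double T[3][3], const double R[3][3])
{
    Offset o = {0.0, 0.0, 0.0};
    double total = sum3x3(T) + sum3x3(R);
    if (total <= 0.0)
        return o;

    /* Axial signal: the two areas act like confocal pinholes in different
       planes, so their summed intensities shift in opposite directions. */
    o.z = (sum3x3(T) - sum3x3(R)) / total;

    /* Lateral signals: right columns minus left columns (x) and
       upper rows minus lower rows (y), summed over both areas. */
    for (int i = 0; i < 3; i++) {
        o.x += (T[i][2] - T[i][0]) + (R[i][2] - R[i][0]);
        o.y += (T[0][i] - T[2][i]) + (R[0][i] - R[2][i]);
    }
    o.x /= total;
    o.y /= total;
    return o;  /* roughly proportional to the particle offset within
                  ~+/-100 nm laterally and ~+/-250 nm axially */
}

int main(void)
{
    /* toy data: particle slightly brighter in the transmitted area */
    const double T[3][3] = { {2, 3, 2}, {3, 6, 4}, {2, 3, 2} };
    const double R[3][3] = { {1, 2, 1}, {2, 4, 2}, {1, 2, 1} };
    Offset o = estimate_offset(T, R);
    printf("x %.3f  y %.3f  z %.3f\n", o.x, o.y, o.z);
    return 0;
}
```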
  • the position can be determined by taking the image shape and brightness into account in the data analysis to increase the tracking range.
  • the pixels of the transmitted ROI 300 show a brighter image than the pixels of the reflected ROI 302 (on the right).
  • the top-right dashed areas 304b, 304c, 304e, 304f of the transmitted ROI 300 are generally brighter than the other 5 pixels in the same ROI 300 and than all pixels of the reflected ROI 302
  • the fluorescent particle 206b is located axially more towards the focal plane 140 imaged on transmitted ROI 300 and is shifted by about half the diffraction limit toward the right and top relative to the excitation focus.
  • the signal from the two ROIs 300, 302 can also be combined into a 3D data stack (2 pixels in z; x and y dimensions are determined by the size of the ROIs 300, 302).
  • Data analysis is a generalization of standard FPALM methods to 3D. Instead of a Gaussian, an experimentally obtained 3D-PSF can be fit to each data set consisting of the pixels around each detected probe molecule. The x, y and z-coordinates of each molecule are determined from the best fit of the molecule image with the PSF.
  • In BP FPALM, typically but not necessarily, larger ROIs 300, 302 are used to allow localization of particles over a larger field of view. Also, several particles can be present in the same ROI and still be analyzed separately. Slight variations in the magnification and rotation between the two detection areas may be corrected by software before combination of the two ROIs 300, 302 into a 3D data stack. The slight difference in the tilt of the focal planes between the two ROIs 300, 302 is negligible because of the large axial magnification (proportional to the lateral magnification squared). The analysis of the 3D data can be seen as the generalization of standard 2D FPALM analysis to 3D.
  • Particles are identified in the z-projected images by iteratively searching for the brightest pixels and eliminating this region in the subsequent search until a lower intensity threshold has been reached.
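  • A compact sketch of this search loop is shown below; it is illustrative only, and the function name, the fixed square blanking window, and the in-place modification of the smoothed image are assumptions of the sketch rather than details of the patent's software:

```c
/*
 * Sketch of the particle-identification step described above (illustrative,
 * not the patent's code): repeatedly take the brightest pixel of a smoothed,
 * z-projected image, record it as a candidate particle, blank a window
 * around it, and stop once a lower intensity threshold is reached.
 */
#include <stddef.h>

/* img: smoothed z-projection, width*height pixels, row-major, modified in place.
   xs/ys: output coordinate buffers of capacity max_particles.
   Returns the number of candidate particles found. */
size_t find_particles(float *img, int width, int height,
                      float threshold, int blank_radius,
                      int *xs, int *ys, size_t max_particles)
{
    size_t n = 0;
    while (n < max_particles) {
        int best_x = 0, best_y = 0;
        float best = img[0];
        for (int y = 0; y < height; y++)
            for (int x = 0; x < width; x++)
                if (img[y * width + x] > best) {
                    best = img[y * width + x];
                    best_x = x;
                    best_y = y;
                }
        if (best < threshold)          /* lower intensity threshold reached */
            break;
        xs[n] = best_x;
        ys[n] = best_y;
        n++;
        /* eliminate this region from the subsequent search */
        for (int dy = -blank_radius; dy <= blank_radius; dy++)
            for (int dx = -blank_radius; dx <= blank_radius; dx++) {
                int x = best_x + dx, y = best_y + dy;
                if (x >= 0 && x < width && y >= 0 && y < height)
                    img[y * width + x] = 0.0f;
            }
    }
    return n;
}
```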
  • the raw data may be cut out in each ROI 300, 302 around each found particle in a square window of, for example, 10-19 pixels long and wide.
  • a theoretical or experimentally obtained 3D-PSF can be fitted to the data sets in this cutout window using a simplex fitting algorithm adapted from Numerical Recipes in C, or a different algorithm.
  • the algorithm can be a localization algorithm that is independent of theoretical models and, therefore, is generally applicable to a large number of experimental realizations. A more detailed description of the localization algorithm is provided below, after the description of FIGs.
  • the localized position is extracted and stored. Additionally, amplitude, background, the deviation from the cutout windows center, the number of iterations and the chi square value are stored, which allow later determination of the quality of the fit.
  • the stored list of fit results is analyzed and translated into 3D data sets of customizable voxel sizes. The fit amplitude is used as the voxel intensity for every molecule found that fulfills the user-defined quality criteria.
  • the camera software Solis, Andor Technology
  • a graph illustrates the axial resolution measured using a BP FPALM setup. Specifically, the axial resolution is measured from an axial profile of caged fluorescein-labeled antibodies on a cover slip and embedded in 87% glycerol. The black line represents raw data and the dashed line represents a Gaussian fit.
  • FWHM full-width-at-half-maximum
  • an inset shows added-up projections of the data set (of FIG. 4A) in three different orientations.
  • the white box marks the region used to generate the axial profile.
  • the scale bar of the original images was 2 micrometers.
  • the data shown in all planes 5A-5F is recorded simultaneously without scanning. Especially to image samples thicker than 1 micrometer, the sample stage can be moved after finishing recording at one sample position to access different sample depth positions and the data recording process is repeated until all sample positions of interest have been recorded.
  • FIG. 5F a volume-rendered representation is shown based on the data sets of FIGs. 5A-5E.
  • the curved surface of the bead is nicely reproduced over nearly 1 μm in depth without scanning.
  • the optical images show well below 100 nanometer resolution in all three dimensions. With approximately 30 x 30 x 80 nanometers³, the resolvable volume is ~500-fold below the diffraction-limited observation volume and represents the smallest observation volume achieved in a far-field light microscope.
  • FIG. 6 a BP microscope system 600 is illustrated to show the tracking of a single particle 606 positioned on a sample stage 604.
  • the BP microscope system 600 is generally similar to the BP microscope system 200 described above in reference to FIG. 2.
  • the fluorescence light beam is adjusted by tilting one or more piezo-mounted mirrors or adjusting alternative beam-steering devices 618.
  • the piezo-mounted mirror 618 is tilted counterclockwise from a first position (indicated in solid line) to a second position (indicated in dashed line). The rotation of the mirror 618 steers the fluorescence light beam on the camera as well as the excitation light beam focusing into the sample and coming from the laser to correct for sideways movement of the particle 606.
  • FIGs. 7A and 7B two insets show the images recorded when a particle moves from a first position to a second position as described above in reference to FIG. 6.
  • a transmitted ROI 700a and a reflected ROI 700b are recorded on a CCD chip when the particle is in the first position.
  • the pixels of the transmitted ROI 700a show the same focus and intensity as the pixels in the reflected ROI 700b.
  • a black box surrounds a general 5 x 5 pixel area of interest.
  • the transmitted ROI 700a and the reflected ROI 700b change such that the respective pixels in the area of interest are now out of focus and of different intensity.
  • the pixels of the transmitted ROI 700a are now generally brighter (i.e., more intense) than in the first position, and off-center with respect to the area of interest (i.e., up and to the right).
  • the pixels of the reflected ROI 700b are now generally less bright (i.e., less intense) than in the first position, and off-center with respect to the area of interest (i.e., up and to the right).
  • the fluorescence light beam has now been steered to center it on the particle 606 in the second position.
  • the pixels of the transmitted ROI 700a and of the reflected ROI 700b are generally similar, if not identical, to the pixels illustrated in the first position of the particle 606 (shown in FIG. 7A). Specifically, the pixels are generally centered within the area of interest and are now of similar intensity in both the transmitted ROI 700a and the reflected ROI 700b.
  • 3D reference data sets are used to fit data sets obtained by either a multi-plane approach or an astigmatism approach. Practically, all raw data that contributes to the image of a particle is taken into account by the fitting process according to the particle's statistical weight, which is especially relevant for photon-limited applications such as imaging single molecules. Additional calibration steps are generally not necessary because the raw data and the fit reference data set are acquired by the same setup.
  • the localization algorithm converges in close to 100% of the cases over a range of 1 to 2 μm, as measured by the fraction of images in which the particle could be localized correctly.
  • this fraction depends on the axial particle position and the number of detected photons, N_det. Only for axial particle positions far away from either of the focal detection planes may the localization algorithm fail to converge properly.
  • the localization algorithm works independently of a theoretical model function and instead uses experimentally obtained reference data sets. This enables the algorithm to account for experimental deviations from perfect theoretical descriptions that are often difficult to include accurately in theoretical models and, also, reduces artifacts in the localization process. Additionally, complex theoretical models that cannot be described by a simple formula are now accessible by numerically generating a reference data set to feed into the algorithm. Small systematic deviations of the determined positions from actual positions can result from differences between used reference data sets and data generated during imaging, or from the fact that the reference data set is of finite size, and can be readily corrected by proper calibration curves.
  • the localization algorithm allows general applicability to a range of 3D localization strategies without a need to develop individual theoretical model functions for every case. Additionally, it can be readily applied to recently reported detection schemes of iPALM and double-helix reference data sets.
  • the iPALM detection scheme is described in more detail in an article titled "Interferometric fluorescent super-resolution microscopy resolves 3D cellular ultrastructure" by G. Shtengel, J. A. Galbraith, C. G. Galbraith, J. Lippincott-Schwartz, J. M. Gillette, S. Manley, R. Sougrat, C. M. Waterman, P. Kanchanawong, M. W. Davidson, R. D. Fetter, and H. F. Hess (Proc. Natl. Acad. Sci. USA 106, 3125-3130, 2009), which is incorporated by reference in its entirety.
  • the localization algorithm is not limited to the five fitting parameters identified below (i.e., x-position, y-position, z-position, amplitude, and background) or to 3D reference data sets.
  • the localization algorithm can be readily expanded to include parameters such as interference phase, polarization, or wavelength by providing experimental reference data sets that provide the necessary information in a fourth, fifth, or even higher dimension. Alternatively, fewer parameters and dimensions are possible, allowing for application to 2D imaging.
  • Image acquisition software that controls camera and piezo actuator parameters may be written, for example, using LabVIEW 8.2 software in Windows XP, the software being available from National Instruments Corp., Austin, Texas. Recorded data is stored in a raw data format that is later analyzed by separate analysis software programmed in C, which may run on a Linux computer cluster (e.g., 31 compute nodes, each equipped with two dual-core AMD Opteron processors and 16 GB of RAM and connected by a single gigabit Ethernet network).
  • ROIs corresponding to an illuminated field of view are cut out automatically.
  • in biplane detection mode, the two ROIs in every frame, representing the two detected planes, are co-registered by slightly tilting and magnifying one ROI according to earlier determined calibration values.
  • Particles are identified as the brightest pixels in smoothed versions of the ROIs.
  • one ROI of 15 x 15 pixels at 2 x 2 binning (corresponding to 1.9 μm x 1.9 μm in the sample) is cut out from the non-smoothed data centered on the identified brightest pixel in the astigmatic (biplane) detection mode.
  • the fit algorithm provides the best estimates for the three spatial particle coordinates of the particle, as well as the amplitude and a background value of the particle.
  • the coordinates, amplitude, and/or background value are stored together with other parameters that indicate quality of the fit (e.g., χ²-values, number of iterations before convergence, etc.) in ASCII data lists. These lists are later compiled into the data presented below using, for example, computer programs such as Microsoft Excel, Origin (OriginLab, Northampton, MA), and LabVIEW.
  • the fit algorithm, also referred to as a particle localization routine, performs for every identified particle a least-squares fit based on the Nelder-Mead downhill simplex method. This method is described in more detail in an article titled "Numerical Recipes: The Art of Scientific Computing" by W. H. Press, S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery (Cambridge University Press, 2007), which is incorporated by reference in its entirety.
  • This algorithm finds the best fit by successively contracting an m-dimensional polytope (a "simplex" with m+1 vertices) around the minimum of the figure-of-merit function in the m-dimensional parameter space.
  • The figure-of-merit function, χ², is calculated as the sum, over all pixels j that describe the image of an identified particle, of the squares of the error-weighted differences between the observed number of photons, n_j^p, and a model function, F, which depends on a set of fit parameter values:
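  • The equation itself is not reproduced in this text record; a plausible reconstruction from the definitions given in the surrounding paragraphs, with an amplitude A and a background b as fit parameters, is:

```latex
% Hedged reconstruction from the surrounding text; the exact form in the
% original filing may differ.
\[
\chi^{2} \;=\; \sum_{j} \frac{\bigl( n^{p}_{j} - F(\mathbf{x}_{j}) \bigr)^{2}}{\sigma_{j}^{2}},
\qquad
F(\mathbf{x}_{j}) \;=\; A \, h_{\mathbf{a}}(\mathbf{x}_{j}) + b,
\qquad
\sigma_{j} \;=\; \sqrt{n^{p}_{j}} .
\]
```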
  • the 3D positions x_j describe the coordinates in the sample and correspond to a lattice of 15 x 15 x 2 or 15 x 15 x 1 extracted pixels in biplane and astigmatic detection mode, respectively.
  • the 3D positions can represent any distribution matching the experimental imaging conditions.
  • the estimated statistical error of n_j^p is assumed to be (n_j^p)^(1/2) because shot noise is generally the main error contribution.
  • the normalized instrument response at point x for a particle located at position a is described by h_a(x) and is derived from experimentally obtained reference data sets.
  • the reference data set h_0(ξ_i) is defined for a lattice of voxel coordinates ξ_i.
  • Simple linear interpolation and related methods generate points of non-differentiability that can cause failure of proper convergence of the simplex method and induce localization artifacts. To address this issue, the following interpolation method based on Fourier transforms has been developed.
  • H_0(k_i) is the optical transfer function ("OTF") of the system and is calculated once at the beginning of the fit procedure as the Fourier transform of h_0(ξ_i). Multiplication with the parameter-dependent phase factor exp(-i k_i·D_a(x_j)) and an inverse Fourier transform yield the shifted reference data set h_a(x_j).
  • the spacing of the pixel positions, x_j, is an integer multiple of the spacing of the reference data set nodes ξ_i.
  • D_a(x) is independent of x_j, and a single inverse Fourier transformation of Eq. 3 is sufficient to find all the reference data set values required to calculate χ² for a given shift a.
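  • A one-dimensional sketch of this Fourier-transform shift is given below; it is illustrative only (the patent applies the idea to 3D reference data sets), and the use of FFTW, the periodic boundary, and the per-call planning are assumptions of the sketch rather than details of the original software:

```c
/*
 * Illustrative 1-D sketch of the Fourier-transform shift described above:
 * the reference profile h0 is Fourier transformed, every trial shift `a`
 * (in pixels) is applied as a phase ramp in frequency space, and an inverse
 * transform returns the shifted, smoothly interpolated profile h_a.
 * Uses FFTW3; this is not the patent's actual code.
 */
#include <complex.h>
#include <math.h>
#include <fftw3.h>

void shifted_reference(const double *h0, double *ha, int n, double a)
{
    fftw_complex *buf = fftw_malloc(sizeof(fftw_complex) * n);
    fftw_complex *H0  = fftw_malloc(sizeof(fftw_complex) * n);
    const double pi = acos(-1.0);

    for (int i = 0; i < n; i++)
        buf[i] = h0[i];                      /* real reference profile h0 */

    /* "OTF" of the reference: computed once per call here; in an actual fit
       it would be computed once and reused for all trial shifts. */
    fftw_plan fwd = fftw_plan_dft_1d(n, buf, H0, FFTW_FORWARD, FFTW_ESTIMATE);
    fftw_execute(fwd);
    fftw_destroy_plan(fwd);

    /* parameter-dependent phase factor implementing a shift by `a` pixels */
    for (int m = 0; m < n; m++) {
        int k = (m <= n / 2) ? m : m - n;    /* signed frequency index */
        H0[m] *= cexp(-I * 2.0 * pi * k * a / n);
    }

    fftw_plan inv = fftw_plan_dft_1d(n, H0, buf, FFTW_BACKWARD, FFTW_ESTIMATE);
    fftw_execute(inv);
    fftw_destroy_plan(inv);

    for (int i = 0; i < n; i++)
        ha[i] = creal(buf[i]) / n;           /* FFTW's inverse is unnormalized */

    fftw_free(buf);
    fftw_free(H0);
}
```

  • In an actual fit, the forward transform of the reference data set would be cached, so that each trial position costs only the phase multiplication and one inverse transform, as described in the text above.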
  • the background was removed and the data was corrected for bleaching that had occurred during the imaging process.
  • the reference data sets were then cut to a size of approximately 3.8 μm x 3.8 μm x 7.5 μm with the reference data set centered in the middle.
  • both reference data sets were cut in an identical way such that (i) the stack centers were located axially in the middle between the two reference data set centers and (ii) the reference data set centers maintained their original axial distance.
  • an exemplary photoactivation localization microscopy method in accordance with the features described above includes providing recorded raw images (801) from which particles are identified in images (803). The photoactivation localization microscopy method can be performed in both 2D and 3D.
  • an exemplary single particle tracking method in accordance with the features described above includes providing a recorded image sequence (901) from which a particle is identified in each frame (903).
  • the single particle tracking method can be performed in both 2D and 3D.
  • a computation is made separately for each frame (905) and pixels are extracted in regions of interest centered around the particle (907).
  • the particle is localized by determining the particle position from the intensity distribution in the region of interest (909). Additional details regarding the localization of each identified particle are provided in FIGs. 10-12.
  • the determined positions of the particle are merged (911) and a particle trajectory is created from the determined particle positions (913).
  • the resulting particle trajectory is provided (915).
  • an example of a conventional non-iterative localization algorithm includes extracting pixels for one particle (imaging) or one point in time (particle tracking) (1001), wherein the center of mass is calculated (1003). The particle position is, then, determined (1005).
  • an example of a conventional iterative localization algorithm includes extracting pixels for one particle (imaging) or one point in time (particle tracking) (1101). After calculating the center of mass (1103), a first guess of the particle position (1105) is provided to calculate the figure of merit as a measure of the difference between the imaged pixel data and a model function based on the guessed particle position (1107). The guess is modified (1109) and the figure of merit is calculated based on the modified guess (1111). If the figure of merit has not decreased (1113), the old guess is used to determine if the figure of merit is below a specific threshold (1115). If the figure of merit has decreased (1113), the modified guess is used to determine if the figure of merit is below the specific threshold (1115).
  • the particle position is outputted as the last guess (1117).
  • the guess is modified (1109) and the figure of merit is calculated based on the modified guess (1111).
  • pixels are extracted for one particle (imaging) or one point in time (particle tracking) (1201).
  • a first guess of the particle position (1205) and a reference data set (1207) is provided to calculate the figure of merit as a measure of the difference between the imaged pixel data and the reference data set adjusted for guessed particle position (1209).
  • Exemplary algorithms for calculating the figure of merit are further described below in reference to FIGs. 13 and 14.
  • the guess is modified (1211) and the figure of merit is calculated based on the modified guess (1213). If the figure of merit has not decreased (1215), the old guess is used to determine if the figure of merit is below a specific threshold (1217). If the figure of merit has decreased (1215), the modified guess is used to determine if the figure of merit is below the specific threshold (1217). Then, if the figure of merit is below the specific threshold, the particle position is outputted as the last guess (1219). However, if the figure of merit is not below the specific threshold, the guess is modified (1211) and the figure of merit is calculated based on the modified guess (1213).
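  • The control flow of FIGs. 11 and 12 can be summarized in code; the sketch below is a deliberately simplified stand-in (a coordinate-wise search with a shrinking step) for the Nelder-Mead downhill-simplex search described in the text, and the function and parameter names are assumptions of the sketch:

```c
/*
 * Simplified stand-in for the iterative localization loop of FIG. 12
 * (illustrative only; the patent text describes a Nelder-Mead downhill
 * simplex search).  A trial parameter set is perturbed, the figure of
 * merit is recomputed against the reference data set, improved guesses
 * are kept, and the loop stops once the figure of merit falls below a
 * threshold or the iteration budget is exhausted.
 */
#define NPAR 5  /* x-position, y-position, z-position, amplitude, background */

typedef double (*merit_fn)(const double params[NPAR]);

void localize(double params[NPAR], double step[NPAR],
              merit_fn figure_of_merit, double threshold, int max_iter)
{
    double merit = figure_of_merit(params);

    for (int iter = 0; iter < max_iter && merit > threshold; iter++) {
        int improved = 0;
        for (int p = 0; p < NPAR; p++) {
            for (int dir = -1; dir <= 1; dir += 2) {
                double trial[NPAR];
                for (int q = 0; q < NPAR; q++)
                    trial[q] = params[q];
                trial[p] += dir * step[p];          /* modify the guess */
                double m = figure_of_merit(trial);
                if (m < merit) {                    /* keep the improved guess */
                    merit = m;
                    for (int q = 0; q < NPAR; q++)
                        params[q] = trial[q];
                    improved = 1;
                }
            }
        }
        if (!improved)                              /* no gain: shrink the search */
            for (int p = 0; p < NPAR; p++)
                step[p] *= 0.5;
    }
    /* params now holds the last (best) guess: the localized 3D position
       plus the amplitude and background values. */
}
```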
  • an algorithm for calculating the figure of merit includes extracting pixels for a particle (1301), guessing the particle position, brightness, and background (1303), and providing the reference data set (1305).
  • the figure of merit is calculated (1313) based on the model function and the extracted pixels for the particle, and the resulting figure-of-merit value is provided (1315).
  • an exemplary embodiment for a Fourier-transform based Shift Algorithm includes extracting pixels for a particle (1401), and guessing the particle position, brightness, and background (1403).
  • a calculation (1405) is performed to obtain a reference data set (1405a), a Fourier Transform (1405b), and the OTF (1405c).
  • the OTF is multiplied with a parameter-dependent phase factor (1407), based on the particle position, brightness, and background.
  • the inverse Fourier Transform is calculated (1409), providing a shifted reference data set (1411).
  • the figure of merit is calculated (1415) based on the model function and the extracted pixels for the particle, and the resulting figure-of-merit value is provided (1417).

Abstract

A microscopy system is configured for creating 3D images from individually localized probe molecules. The microscopy system includes a sample stage, an activation light source, a readout light source, a beam splitting device, at least one camera, and a controller. The activation light source activates probes of at least one probe subset of photo-sensitive luminescent probes, and the readout light source causes luminescence light from the activated probes. The beam splitting device splits the luminescence light into at least two paths to create at least two detection planes that correspond to the same or different number of object planes of the sample. The camera detects simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest. The controller is programmable to combine a signal from the regions of interest into a 3D data stack.

Description

3D BIPLANE MICROSCOPY
FIELD OF THE INVENTION
[0001] The present invention relates generally to microscopic imaging and, more specifically, to three-dimensional ("3D") sub-100 nanometer resolution by biplane microscope imaging.
BACKGROUND OF THE INVENTION
[0002] Until about a decade ago, resolution in far-field light microscopy was thought to be limited to ~200-250 nanometers in the focal plane, concealing details of sub-cellular structures and constraining its biological applications. Breaking this diffraction barrier by the seminal concept of stimulated emission depletion ("STED") microscopy has made it possible to image biological systems at the nanoscale with light. Additional details are provided in an article titled "Far-Field Optical Nanoscopy" by Stefan W. Hell (316 Science, 1153-1158, May 25, 2007), which is incorporated herein by reference in its entirety. STED microscopy and other members of the reversible saturable optical fluorescence transitions ("RESOLFT") family achieve a resolution > 10-fold beyond the diffraction barrier by engineering the microscope's point-spread function ("PSF"), also referred to as a "reference data set," through optically saturable transitions of the (fluorescent) probe molecules.
[0003] Lately, an emerging group of localization-based techniques has obtained similar resolution in the lateral plane. This group includes fluorescence photoactivation localization microscopy ("FPALM"), photoactivation localization microscopy ("PALM"), stochastic optical reconstruction microscopy ("STORM"), and PALM with independently running acquisition ("PALMIRA"). FPALM is described in more detail in an article titled "Ultra-High Resolution Imaging by Fluorescence Photoactivation Localization Microscopy" by Samuel T. Hess et al. (91 Biophysical Journal, 4258-4272, December 2006), which is incorporated herein by reference in its entirety. PALM is described in more detail in an article titled "Imaging Intracellular Fluorescent Proteins at Nanometer Resolution" by Eric Betzig et al. (313 Science, 1642-1645, September 15, 2006), which is incorporated herein by reference in its entirety. STORM is described in more detail in an article titled "Sub-Diffraction-Limit Imaging by Stochastic Optical Reconstruction Microscopy" by Michael J. Rust et al. (Nature Methods / Advance Online Publication, August 9, 2006), which is incorporated herein by reference in its entirety. PALMIRA is described in more detail in an article titled "Resolution of λ/10 in Fluorescence Microscopy Using Fast Single Molecule Photo-Switching" by H. Bock et al. (88 Applied Physics A, 223-226, June 1, 2007), and an article titled "Photochromic Rhodamines Provide Nanoscopy With Optical Sectioning" by J. Fölling et al. (Angew. Chem. Int. Ed., 46, 6266-6270, 2007), each of which is incorporated herein by reference in its entirety. As referred to in the current application, the term photosensitive refers to both photo-activatable (e.g., switching probes between an on state and an off state) and photo-switching (e.g., switching between a first color and a second color). [0004] While utilizing similar optical switching mechanisms, this latter group of microscopes circumvents the diffraction limit by basing resolution improvement on the precise localization of spatially well-separated fluorescent molecules, a method previously used to track, for example, conventionally labeled myosin V molecules with 1.5 nanometers localization accuracy. This method is described in more detail in an article titled "Myosin V Walks Hand-Over-Hand: Single Fluorophore Imaging With 1.5-nanometers Localization" by Ahmet Yildiz et al. (300 Science, 2061-2065, June 27, 2003), which is incorporated herein by reference in its entirety.
[0005] To resolve complex nanoscale structures by localization-based methods, the sample is labeled with photo-sensitive probes, such as photo-activatable ("PA") fluorescent probes (e.g., PA proteins or caged organic dyes). Activation of only a sparse subset of molecules at a time allows their separate localization. By repeated bleaching or deactivation of the active molecules in concert with activation of other inactive probe molecules, a large fraction of the whole probe ensemble can be localized over time. The final sub-diffraction image of the labeled structure is generated by plotting the positions of some or all localized molecules.
[0006] Based on the rapid development in both RESOLFT and localization-based techniques, the impact of super-resolution far-field fluorescence microscopy on the biological sciences is expected to increase significantly. Within 2007 alone, subdiffraction multi-color imaging was reported for the first time for STED microscopy, PALMIRA, and STORM, and FPALM was successfully demonstrated in live cells. Some of these reports are included in an article titled "Two-Color Far-Field Fluorescence Nanoscopy" by Gerald Donnert et al. (Biophysical Journal, L67-L69, February 6, 2007), in an article by M. Bates, B. Huang, G. T. Dempsey, and X. Zhuang (Science 317, 1749-1753, 2007), and in an article titled "Dynamic Clustered Distribution of Hemagglutinin Resolved at 40 nanometers in Living Cell Membranes Discriminates Between Raft Theories" by Samuel T. Hess et al. (Proc. Natl. Acad. Sci. USA 104, 17370-17375, October 30, 2007), each of which is incorporated herein by reference in its entirety.
[0007] However, the slow progress in 3D super-resolution imaging has limited the application of these techniques to two-dimensional ("2D") imaging. The best 3D resolution until recently had been 100 nanometers axially at conventional lateral resolution. Achieved by the combination of two objective lens apertures in 4Pi microscopy, it has been applied for more than a decade. This is described in more detail in an article titled "H2AX Chromatin Structures and Their Response to DNA Damage Revealed by 4Pi Microscopy" by Joerg Bewersdorf et al. (Proc. Natl. Acad. Sci. USA 103, 18137-18142, November 28, 2006), which is incorporated by reference in its entirety. Only recently have the first 3D STED microscopy images been published that moderately exceed this resolution, with 139 nanometer lateral and 170 nanometer axial resolution. These images are presented in more detail in an article by K. I. Willig, B. Harke, R. Medda, and S. W. Hell (Nat. Methods 4, 915-918, 2007), which is incorporated by reference in its entirety. While this represents a ~10-fold smaller resolvable volume than provided by conventional microscopy, it is still at least 10-fold larger than a large number of sub-cellular components, for example synaptic vesicles. Recently, an article (Huang et al., Science 2008) has reported the first 3D STORM of thin optical sections (<600 nanometers) with sub-100 nanometer 3D resolution under reducing (low oxygen) conditions.
[0008] Moreover, current understanding of fundamental biological processes on the nanoscale (e.g., neural network formation, chromatin organization) is limited because these processes cannot be visualized at the necessary sub-millisecond time resolution. Current biological research at the sub-cellular level is constrained by the limits of spatial and temporal resolution in fluorescence microscopy. The diameter of most organelles is below the diffraction limit of light, limiting spatial resolution and concealing sub-structure. Recent developments (e.g., STED, FPALM, STORM, etc.) have dramatically enhanced the spatial resolution and even overcome the traditional diffraction barrier. However, comparable improvements in temporal resolution are still needed.
[0009] Particle-tracking techniques can localize small objects (typically < diffraction limit) in live cells with sub-diffraction accuracy and track their movement over time. But conventional particle-tracking fluorescence microscopy cannot temporally resolve interactions of organelles, molecular machines, or even single proteins, which typically happen within milliseconds.
[0010] The spatial localization accuracy of single particles in a fluorescence microscope is approximately proportional to d/√N (d = spatial resolution; N = total number of detected fluorescence photons from the particle) in the absence of background and effects due to finite pixel size. For longer acquisition times, more signal can be accumulated; hence, increased temporal resolution comes at the cost of decreased spatial localization accuracy. For bright organelles containing a few hundred fluorescent molecules (or future fluorescent molecules with increased brightness), sufficient signal can be accumulated quickly. However, especially for 3D localization where data acquisition is far more complicated than in 2D, technical constraints arising from axial scanning and/or camera readout times limit the recording speed, and therefore, the temporal resolution.
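By way of illustration only, the following short sketch (written in Python; the photon detection rate and acquisition times are assumed example values, not measured properties of any system described herein) demonstrates the trade-off between acquisition time and localization accuracy that follows from the d/√N scaling:

```python
import numpy as np

# Illustrative sketch only: localization accuracy ~ d / sqrt(N) in the absence of
# background and finite-pixel effects; the values below are assumed examples.
d = 250.0          # diffraction-limited spatial resolution in nanometers (assumed)
photon_rate = 2e5  # detected fluorescence photons per second from the particle (assumed)

for t_ms in (0.1, 0.3, 1.0, 10.0):
    N = photon_rate * t_ms * 1e-3   # photons accumulated during the acquisition time
    accuracy = d / np.sqrt(N)       # approximate localization accuracy in nanometers
    print(f"t = {t_ms:5.1f} ms   N = {N:7.0f}   accuracy ~ {accuracy:5.1f} nm")
```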
[0011] For example, a particular 3D particle-tracking technique can track particles with only 32 milliseconds time resolution. This technique scans a 2-photon excitation focus in a 3D orbit around the fluorescent particle and determines its 3D position by analyzing the temporal fluorescence fluctuations. The temporal resolution is ultimately limited by the frequency with which the focus can revolve in 3D around the particle. This technique is described in more detail in an article titled "3-D Particle Tracking In A Two-Photon Microscope: Application To The Study Of Molecular Dynamics In Cells" by V. Levi, Q. Ruan, and E. Gratton (Biophys. J., 2005, 88(4): pp. 2919-28), which is incorporated by reference in its entirety.
[0012] In another example, another current 3D particle-tracking technique combines traditional particle-tracking with widefield "bifocal detection" images. Particles are simultaneously detected in one plane close to the focal plane of the particle and a second plane 1 micrometer out of focus. The lateral and axial coordinates are derived from the two images. In accordance with this technique, the temporal resolution is limited to the 2-50 millisecond range, and the localization accuracy is limited to the 2-5 nanometer range. Additional details are described in an article titled "Three-Dimensional Particle Tracking Via Bifocal Imaging" by E. Toprak et al. (Nano Lett., 2007, 7(7): pp. 2043-45), which is incorporated by reference in its entirety. As such, advances in temporal resolution to sub-millisecond levels have been limited only to 2D imaging.
[0013] In general, determining the 3D position of a particle by any of the methods mentioned above requires fitting a model function to the respective experimental data. The particle position (and also, typically, the particle brightness and background value) can be deduced from the parameters that fit the experimental data best, according to a chosen figure of merit. In most cases, an analytical function is used to reasonably model the characteristics that dominantly describe the 3D particle position, e.g., the diameter of the defocused image or the particle ellipticity in the case of astigmatism. The model function is calibrated with imaged particles located at known positions to achieve mapping of the determined fit parameters to real spatial positions.
[0014] This indirect method, especially of acquiring the z-position from abstract fit parameters such as ellipticity or ring diameter, is problematic, however, because it is prone to artifacts in the analysis process. Experimental deviations from the theoretical descriptions by the model function can lead to divergence between real and measured particle positions. Additionally, every model function is limited to a certain optical setup and weighs the information content of the raw data differently. This prevents a direct comparison of different optical setups.
[0015] Thus, there is a need for a microscopy system that can provide 3D imaging with resolution below 100 nanometers in all three dimensions. Another need is directed to achieving particle-tracking in 3D with a temporal resolution below 1 millisecond for enabling visualization of dynamic sub-cellular processes. The present invention is directed to satisfying one or more of these needs and solving other problems.
SUMMARY OF THE INVENTION
[0016] According to one embodiment, a microscopy system is configured for creating 3D images from individually localized probe molecules. The microscopy system includes a sample stage, an activation light source, a readout light source, a beam splitting device, at least one camera, and a controller. The activation light source activates probes of at least one probe subset of photo-sensitive luminescent probes, and the readout light source causes luminescence light from the activated probes. Optionally, the activation light source and the readout light source are the same light source. The beam splitting device splits the luminescence light into at least two paths to create at least two detection planes that correspond to the same or different number of object planes of the sample. The camera detects simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest. The controller is programmable to combine a signal from the regions of interest into a 3D data set.
[0017] According to another embodiment, a method for creating 3D images from individually localized probe molecules includes mounting a sample on a sample stage, the sample having a plurality of photo-sensitive luminescent probes. In response to illuminating the sample with an activation light, probes of at least one probe subset of the plurality of photo-sensitive luminescent probes are activated. In response to illuminating the sample with a readout light, luminescence light from the activated probes is caused. The luminescence light is split into at least two paths to create at least two detection planes, the at least two detection planes corresponding to the same or different object planes in the sample. At least two detection planes are detected via a camera. The object planes are recorded in corresponding recorded regions of interest in the camera. A signal from the regions of interest is combined into a 3D data stack.
[0018] According to yet another embodiment, a microscopy system is configured for tracking microscopic particles in 3D. The system includes a sample, a sample stage, at least one light source, a beam-steering device, a beam splitting device, at least one camera, and a controller. The sample, which includes luminescent particles, is mounted to the sample stage. The light source is configured to illuminate an area of the sample to cause luminescence light, primarily, from one tracked particle of the luminescent particles. The beam-steering device is configured to selectively move a light beam to illuminate different areas of the sample such that the luminescence light is detected. The beam splitting device, which is located in a detection light path, splits the luminescence light into at least two paths to create at least two detection planes that correspond to different object planes in the sample. The camera is positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest. The controller is programmable to combine a signal from the recorded regions of interest, determine a 3D trajectory of the particle at each time point of a recorded data sequence, and move the beam-steering device to illuminate the different areas of the sample in accordance with corresponding positions of the one tracked particle.
[0019] According to yet another embodiment, a method for tracking microscopic particles in 3D includes mounting a sample on a sample stage, the sample including luminescent particles. A small area of the sample is illuminated to cause luminescence light from primarily one particle of the luminescent particles. The light beam is selectively moved to illuminate different areas of the sample to track movement of the one particle, the different areas including the small area of the sample and corresponding to respective positions of the one particle. The luminescence light is split into at least two paths to create at least two detection planes that correspond to the same or different number of object planes in the sample. The at least two detection planes are detected simultaneously. The number of object planes is represented in a camera by the same number of recorded regions of interest. Based on a combined signal from the recorded regions of interest, a 3D trajectory of the one particle is determined at each time point of a recorded data sequence.
[0020] According to yet another embodiment, a microscopy system is configured for creating 3D images from individually localized probe molecules. The microscopy system includes an activation light source, a readout light source, a beam splitting device, at least one camera, and a controller. The activation light source is configured to illuminate a sample with an activation light, the activation light source being configured to activate probes of at least one probe subset of the plurality of photo-sensitive luminescent probes. The readout light source is configured to illuminate the sample with a readout light, the readout light source being configured to cause luminescence light from the activated probes. The beam splitting device is located in a detection light path that splits the luminescence light into at least two paths, the beam splitting device creating at least two detection planes that correspond to the same or different number of object planes of the sample. The camera is positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest. The controller is programmable to combine a signal from the regions of interest into a 3D data stack, to calculate a figure of merit as a measure of the difference between imaged pixel data and a reference data set ho(ξl) modifiable by at least one parameter, and to optimize the figure of merit by adjusting the at least one parameter.
[0021] Additional aspects of the invention will be apparent to those of ordinary skill in the art in view of the detailed description of various embodiments, which is made with reference to the drawings, a brief description of which is provided below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] FIG. 1 is a schematic view illustrating a biplane microscope setup for Fluorescence Photoactivation Localization Microscopy (FPALM), according to one embodiment.
[0023] FIG. 2 is a schematic view illustrating a biplane microscope setup, according to an alternative embodiment.
[0024] FIG. 3 is a schematic view illustrating a fluorescent particle image on a CCD chip.
[0025] FIG. 4A is a graph representing an axial resolution measured from an axial profile of caged fluorescein-labeled antibodies.
[0026] FIG. 4B is a representative image showing added-up projections of a data set in three different orientations for the axial resolution measured in FIG. 4A.
[0027] FIG. 5A is a representative image of a data set for beads labeled with caged fluorescein at an axial position of 300 nanometers.
[0028] FIG. 5B illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of 100 nanometers.
[0029] FIG. 5C illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of -100 nanometers.
[0030] FIG. 5D illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of -300 nanometers.
[0031] FIG. 5E illustrates a representative image of a resulting data set for the beads of FIG. 5A at an axial position of -500 nanometers.
[0032] FIG. 5F illustrates a volume-rendered representation of the data set illustrated in FIGs. 5A-5E.
[0033] FIG. 6 is a schematic view illustrating adjustment of a biplane microscope setup, according to an alternative embodiment.
[0034] FIG. 7A is a schematic view illustrating a fluorescent particle image on a CCD chip when the particle is in focus, in a first position.
[0035] FIG. 7B is a schematic view illustrating the fluorescent particle image of FIG. 7A when the particle is out of focus, in a second position.
[0036] FIG. 7C is a schematic view illustrating the fluorescent particle image of FIG. 7B when the particle is in focus, in a third position.
[0037] FIG. 8 is a diagrammatic representation of an exemplary embodiment of a photoactivation localization microscopy method.
[0038] FIG. 9 is a diagrammatic representation of an exemplary embodiment of a single particle tracking method.
[0039] FIG. 10 is a diagrammatic representation of an exemplary conventional embodiment of a non-iterative localization algorithm.
[0040] FIG. 11 is a diagrammatic representation of an exemplary conventional embodiment of an iterative localization algorithm.
[0041] FIG. 12 is a diagrammatic representation of an exemplary embodiment of a localization algorithm. [0042] FIG. 13 is a diagrammatic representation of an exemplary embodiment of an algorithm for calculating a figure of merit.
[0043] FIG. 14 is a diagrammatic representation of an exemplary embodiment of a Fourier-Transform based Shift Algorithm.
DETAILED DESCRIPTION
[0044] While this invention is susceptible of embodiment in many different forms, there is shown in the drawings and will herein be described in detail preferred embodiments of the invention with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the broad aspect of the invention to the embodiments illustrated.
[0045] Referring to FIG. 1, a biplane ("BP") microscope system 100 allows 3D imaging at an unmatched resolution well below 100 nanometers in all three dimensions, resulting in at least a 100-fold smaller resolvable volume than obtainable by conventional 3D microscopy. The BP microscope system 100 is optionally a BP FPALM system, which is generally based on a conventional FPALM design. However, in contrast to conventional FPALM design, the BP microscope system 100 includes a modified detection path that allows the simultaneous detection from two focal planes. The simultaneous detection of two planes for localization- based super-resolution microscopy speeds up the imaging process by making axial scanning unnecessary, and more importantly, in contrast to scanning-based systems, eliminates localization artifacts caused by abrupt blinking and bleaching common to single molecules. The BP microscope system 100 can optionally be located on an air-damped optical table to minimize vibrations.
[0046] In addition to achieving 3D particle localization down to nanometer-range accuracy, the BP microscope system 100 can also achieve temporal resolution <1 millisecond. As such, in addition to being a BP FPALM system, the BP microscope system 100 can also be a next-generation 3D particle-tracking microscope ("3D PTM") for providing unprecedented temporal and spatial resolution when tracking fluorescent particles in live cells in 3D. FPALM and particle-tracking are just some exemplary applications of the BP microscope system 100. To achieve unprecedented temporal resolution at least as short as 0.3 milliseconds, the BP microscope system 100 tracks one particle at a time (in contrast to conventional 2D and 3D tracking techniques that visualize the entire field). Additionally, the BP microscope system 100 can include a detection scheme without any moving parts that detects simultaneously two axially shifted detection planes.
[0047] In contrast to current PTM techniques, the BP microscope system 100 can include a focused laser beam for excitation combined with spatially limited detection. Background light is filtered out to avoid localization disturbances and to increase sensitivity in samples thicker than about 1 micrometer. This enables particle-tracking even in tissue sections. To follow a particular particle over several microns in 3D, the BP microscope system 100 can include, for example, high-speed piezo-mirrors and a fast piezo-driven sample stage. The combination of focused excitation and feedback-driven beam-tracking reduces the background and enhances the speed limit by approximately one order of magnitude. Optionally, a second (different) luminescence color can be detected to enable correlative studies of the movement of the tracked particle.
[0048] Illumination for readout and activation can be provided by a readout laser 102, operating typically at 496 nanometers, and an activation laser 104 (e.g., 50 mW, Crystalaser), operating typically at 405 nanometers. The readout laser 102 is optionally a water-cooled Argon laser (e.g., Innova 70, Coherent Inc.) that can provide 458, 472, 488, 496, or 514 nanometers for readout illumination. Optionally, the wavelength of the readout laser 102 is selected to minimize activation of inactive probes of a plurality of photo-sensitive probes of a sample 124. Optionally yet, the readout laser 102 and the activation laser 104 can be the same source. For example, the readout laser 102 can perform both the readout functions and the activation functions, without requiring the use of the activation laser 104. According to one embodiment, at least one illuminated area of the sample 124 is a relatively small area, having, for example, a general diameter that is less than about three times an Airy disk diameter.
[0049] Both lasers 102, 104 are combined, via a first dichroic beam splitter 110, and coupled, via a second dichroic beam splitter 120, into a microscope stand 106 equipped with a 63x 1.2NA water immersion tube lens 108 after passing through a field aperture 107. Both lasers 102, 104 can be switched on and off by software-controlled electrical shutters (e.g., SH05, Thorlabs). Other components that may be included along the path between the lasers 102, 104 and the microscope stand 106 are a first mirror 112 and a first lens 114. [0050] The microscope stand 106 can have a plurality of components, including a sample stage 116 and an objective 118. The sample 124, including for example a biological cell 124a is generally positioned on the sample stage 116. The sample stage 116 can be a mechanical stage or a three-axis piezo stage (e.g., P-733.3DD, Physik Instrumente). Other components, which are not shown, may include shutters in front of the lasers 102, 104 and further optics for folding the beam path.
[0051] Fluorescence is collected by the objective 118, passes through the second dichroic beam splitter 120 (which reflects the laser light) and is focused by the tube lens 108 via an optional second mirror 122 (e.g., a piezo-driven mirror) into an intermediate focal plane 140. The focal plane 140 is imaged by two lenses - a second lens 128 and a third lens 132 - onto a high-sensitivity EM-CCD camera 126 (e.g., DU897DCS-BV iXon, Andor Technology). Scattered laser light is attenuated by bandpass and Raman edge filters (e.g., Chroma and Semrock), such as filter 130.
[0052] The detection scheme can be achieved by moving the CCD camera 126 out of the standard image plane closer to the tube lens 108 and thereby shifting the corresponding focal plane ~350 nanometers deeper into the sample. A beam splitter cube 134 is placed into a focused light path 136a in front of the CCD camera 126. The beam splitter cube 134 redirects a reflected light path 136b via a third mirror 138 towards the CCD camera 126 to form a second image in a different region of the same CCD. Due to the longer optical path, this second image corresponds to a focal plane ~350 nanometers closer to the objective 118 than the original focal plane.
[0053] The BP microscope system 100, using a single camera, is straightforward to implement and avoids synchronization problems between separate cameras. The BP microscope system 100 features a reasonable field of view of ~20 x 50 micrometers² (pixel size corresponding to ~100 nanometers in the sample 124; 512 x 512 pixels), sufficient to image large portions of a cell. The BP microscope system 100 is able to image 100 frames per second with a field of view of 10 to 20 micrometers in length and 2 x 2 binning. The use of the CCD camera 126, which features negligible readout noise due to its on-chip electron multiplication, avoids additional noise that would otherwise result from splitting the light up into two fields as required for BP detection. Combined with the fact that there is minimal loss of fluorescence detection efficiency, this exemplary BP microscope system 100 expands conventional FPALM to 3D imaging without significant drawbacks.
[0054] BP FPALM technology is compatible with live cell imaging and can be expanded to multicolor imaging (even realizable on the same CCD detector). BP FPALM can record 3D structures in a ~1 micrometer thick z-section without scanning. Larger volumes can be recorded by recording BP FPALM data at different sample positions. To minimize activation of out of focus PA molecules, BP FPALM can be combined with a 2-photon ("2P") laser scanner. 2P excitation-mediated activation is directed to diffraction-limited planes of ~800 nanometers thickness, a thickness that is compatible with the axial detection range of BP FPALM. BP FPALM therefore has the potential of imaging specimens such as cell nuclei or tissue sections far exceeding 1 micrometer in thickness.
[0055] Moreover, combined with or without 2P excitation, BP FPALM can be readily implemented in practically every existing FPALM, PALM, PALMIRA or STORM instrument. BP FPALM therefore provides the means to investigate a large variety of biological 3D structures at resolution levels previously far out of reach.
[0056] Optionally, the luminescence detected from activated probes in BP FPALM is fluorescence or scattered light. In an alternative embodiment, the activation of activated probes is achieved via a non-linear process that limits the activation to a plane of diffraction-limited thickness.
[0057] For PSF measurement, according to one example, 100 nanometer diameter yellow-green fluorescent beads (Invitrogen, F-8803) can be attached to a poly-L-lysine coated cover slip. The sample can be mounted on a piezo stage and imaged in the BP FPALM setup with 496 nm excitation. Typically, 101 images at z-positions ranging from -2.5 to +2.5 micrometers with 50 nanometer step size are recorded. The same bead is imaged 2 to 3 times to check for drift and to correct for bleaching. To reduce noise, the data set can be smoothed in Imspector with a Gaussian filter of sub-diffraction size. Additionally, the data set can be corrected for mono-exponential bleaching, cropped to appropriate size, centered, and normalized to 1.
[0058] Use of two focal planes for z-position determination is generally sufficient for particle localization under the constraints that (1) a sparse distribution of particles is analyzed (no overlapping signal within the size of one PSF) and (2) the axial position of the particle is close to one of the detection planes or lies between them. For example, to evaluate the range and accuracy of z-localization, 40 nanometer diameter fluorescent beads (FluoSpheres, F8795, Invitrogen) were imaged on a cover slip over 1,000 frames. A piezo-driven sample stage was moved by one 100 nanometer z-step every 100 frames. Localization analysis of the BP images reproduced that z-movement very accurately with σ ~ 6 to 10 nanometers axial localization accuracy. The beads could be localized over a range of 800 nanometers, exceeding the distance between the two detection planes (in this case 500 nanometers) by more than 50%.
[0059] In one example, the accumulation time per frame is typically 10 milliseconds. In this example, electron multiplying gain is set to 300, the readout is 2 x 2 binned, only the region occupied by two recorded regions of interest ("ROIs") is read out, and, typically, 5,000 to 50,000 frames are recorded.
[0060] Optionally, at least some of the ROIs are detected at different wavelengths by including suitable detection filters in the BP microscope system 100. In alternative embodiments, at least some of the ROIs are detected at different polarization directions by including suitable polarization optics in the BP microscope system 100.
[0061] Referring to FIG. 2, a BP microscope system 200 is shown according to an alternative embodiment. The BP microscope system 200 includes a microscope stand 202 having a piezo-driven sample stage 204 on which a sample 206 is positioned. The sample 206 includes a plurality of fluorescent particles 206a-206d. The microscope stand 202 further includes an objective 208 and a first lens 210.
[0062] Additional components are positioned between a focal plane 212 and the CCD camera 214 along a fluorescence light path 215. Specifically, the components include a second lens 216, a beam-steering device 218 (e.g., a piezo-driven mirror), a dichroic beam splitter 220, a bandpass filter 222, a third lens 224, a neutral 50:50 beam splitter 226, and a mirror 228. Optionally, the beam-steering device 218 can include generally a focusing optical element that moves illumination and detection focal planes axially to follow the tracked particle. In yet another example, the beam-steering device 218 can include a phase-modulating device that moves an illuminated area laterally and illumination and detection focal planes axially to follow the tracked particle. Optionally yet, more than one piezo-driven mirror 218 can be included in the BP microscope system 200.
[0063] A polarized laser beam from a laser 229 is coupled into the microscope stand 202 and focused into the sample 206 by the objective 208. A fourth lens 230 and a λ/4 plate 232 are positioned between the laser 229 and the dichroic beam splitter 220.
[0064] The focus can be positioned in the region of interest by moving the sample stage 204 and the beam-steering device 218. The fluorescence emerging from the focal region is collected by the objective 208 and is imaged onto the CCD camera 214 via the first lens 210, the second lens 216, and the third lens 224. The dichroic beam splitter 220 and the bandpass filter 222 filter out scattered excitation light and other background light.
[0065] The neutral 50:50 beam splitter 226 splits the fluorescence light into two beam paths, a transmitted beam 215a and a reflected beam 215b. The transmitted beam 215a images light emitted from a plane deeper in the sample onto one area of the CCD chip. The reflected beam 215b images light from a plane closer to the objective onto another well-separated area to avoid cross-talk.
[0066] Referring to FIG. 3, two ROIs on the CCD chip represent two focal planes in the sample 206 (illustrated in FIG. 2), typically 700 nanometers apart, arranged like wings of a biplane. The two ROIs include a transmitted ROI 300 and a reflected ROI 302, each having nine pixels showing an image of the fluorescent particle 206b from the sample 206. The dashed areas 304a-304i, 306a-306i depict the pixels that are used for tracking the fluorescent particle 206b. Thus, the two 9-pixel-areas 304a-304i, 306a-306i represent in general the position of the particle 206b in 3D.
[0067] The fluorescent particle 206b, which is generally smaller than the laser focus and located in the focal region, is excited homogeneously, and 3 (binned) lines (i.e., the two 9-pixel-areas represented by dashed areas 304a-304i, 306a-306i) of the CCD chip arranged around the laser focus image are read out at every time point. Particles laterally shifted with respect to the laser focus center will appear shifted on the CCD chip. For the z direction, the two 9-pixel-areas 304a-304i, 306a-306i act in the same way as two confocal pinholes in different planes: if the particle 206b moves axially, the signal will increase in one of the 9-pixel-areas and decrease in the other. An axial shift will be represented by a sharper intensity distribution in one of the two 9-pixel-areas depending on the direction of the shift.
[0068] The 3D position can be determined by subtracting different pixel values of the two 9-pixel-areas from each other. For the axial coordinate (z-axis), the sum of all pixels from one 9-pixel-area can be subtracted from the other 9-pixel-area. The fact that the lateral information is preserved in the 9-pixel-areas allows for lateral localization of the particle 206b at the same time. For the lateral x-axis (or y-axis) direction, the signal collected in the left columns 304a, 304d, 304g, 306a, 306d, 306g (or upper rows: 304a, 304b, 304c and 306a, 306b, 306c) of both 9-pixel-areas 300 and 302 can be subtracted from the one in the right columns 304c, 304f, 304i, 306c, 306f, 306i (or lower rows: 304g, 304h, 304i and 306g, 306h, 306i). Calculations show that the determined values are approximately proportional to the particle position offset from the center as long as the position stays in a range of +/- 250 nanometers axially and +/- 100 nanometers laterally. In a simple feedback loop, these values can be fed back to piezo controllers tilting piezo mirrors and moving the sample stage piezo to re-center the particle in the 9-pixel-areas after every measurement. Optionally, for larger movements up to about double the linear ranges, the position can be determined by taking the image shape and brightness into account in the data analysis to increase the tracking range.
[0069] According to an alternative embodiment, the pixels of the transmitted ROI 300 (on the left) show a brighter image than the pixels of the reflected ROI 302 (on the right). For example, the top-right dashed areas 304b, 304c, 304e, 304f of the transmitted ROI 300 are generally brighter than the other 5 pixels in the same ROI 300 and than all pixels of the reflected ROI 302. As such, the fluorescent particle 206b is located axially more towards the focal plane 140 imaged on the transmitted ROI 300 and is shifted by about half the diffraction limit toward the right and top relative to the excitation focus.
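By way of illustration only, the subtraction scheme of paragraph [0068] may be sketched as follows (Python; the pixel intensities and the normalization by the total signal are assumptions chosen for the example, and converting the resulting offsets into nanometers would require the calibration discussed above):

```python
import numpy as np

def biplane_offsets(roi_t, roi_r):
    """Estimate x, y, z offsets from two 3x3 pixel areas (transmitted, reflected).

    Illustrative sketch only: near the focus center the returned values are
    approximately proportional to the particle offset; scale factors to
    nanometers would come from a separate calibration."""
    total = roi_t.sum() + roi_r.sum()
    # Axial signal: difference of the summed intensities of the two areas.
    z = (roi_t.sum() - roi_r.sum()) / total
    # Lateral signals: right columns minus left columns and lower rows minus
    # upper rows, using both areas so the lateral information is preserved.
    x = ((roi_t[:, 2].sum() + roi_r[:, 2].sum())
         - (roi_t[:, 0].sum() + roi_r[:, 0].sum())) / total
    y = ((roi_t[2, :].sum() + roi_r[2, :].sum())
         - (roi_t[0, :].sum() + roi_r[0, :].sum())) / total
    return x, y, z

# Assumed example: particle closer to the plane imaged in the transmitted area
# and shifted to the right of the excitation focus.
roi_transmitted = np.array([[1., 2., 3.], [2., 5., 6.], [1., 2., 3.]])
roi_reflected   = np.array([[1., 1., 2.], [1., 3., 4.], [1., 1., 2.]])
print(biplane_offsets(roi_transmitted, roi_reflected))
```

In a feedback loop, values of this kind could be fed to the piezo controllers to re-center the particle after every measurement, as described above.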
[0070] The signal from the two ROIs 300, 302 can also be combined into a 3D data stack (2 pixels in z; x and y dimensions are determined by the size of the ROIs 300, 302). Data analysis is a generalization of standard FPALM methods to 3D. Instead of a Gaussian, an experimentally obtained 3D-PSF can be fit to each data set consisting of the pixels around each detected probe molecule. The x, y and z-coordinates of each molecule are determined from the best fit of the molecule image with the PSF.
[0071] For BP FPALM, typically but not necessarily, larger ROIs 300, 302 are used to allow localization of particles over a larger field of view. Also, several particles can be present in the same ROI and still be analyzed separately. Slight variations in the magnification and rotation between the two detection areas may be corrected by software before combination of the two ROIs 300, 302 into a 3D data stack. The slight difference in the tilt of the focal planes between the two ROIs 300, 302 is negligible because of the large axial magnification (proportional to the lateral magnification squared). The analysis of the 3D data can be seen as the generalization of standard 2D FPALM analysis to 3D. Particles are identified in the z-projected images by iteratively searching for the brightest pixels and eliminating this region in the subsequent search until a lower intensity threshold has been reached. The raw data may be cut out in each ROI 300, 302 around each found particle in a square window of, for example, 10-19 pixels in length and width. Instead of a 2D Gaussian, a theoretical or experimentally obtained 3D-PSF can be fitted to the data sets in this cutout window using a simplex fitting algorithm adapted from Numerical Recipes in C, or a different algorithm. For example, the algorithm can be a localization algorithm that is independent of theoretical models and, therefore, is generally applicable to a large number of experimental realizations. A more detailed description of the localization algorithm is provided below, after the description of FIGs. 7A-7C.
[0072] From the resulting best fitting x, y and z-coordinates, the localized position is extracted and stored. Additionally, amplitude, background, the deviation from the cutout window's center, the number of iterations and the chi square value are stored, which allow later determination of the quality of the fit. The stored list of fit results is analyzed and translated into 3D data sets of customizable voxel sizes. The fit amplitude is used as the voxel intensity for every molecule found that fulfills the user-defined quality criteria. For operation without the piezo stage, the camera software (Solis, Andor Technology) is used for data recording. Software to operate the microscope with the piezo stage, for fitting, and to create 3D data sets, may be programmed in LabVIEW 8.2 (National Instruments). Imspector (Andreas Schoenle, Max Planck Institute for Biophysical Chemistry, Goettingen, Germany) is used for display and analysis of 3D data sets. 3D rendered images may be created using Amira.
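By way of illustration only, the iterative brightest-pixel particle identification described in paragraph [0071] may be sketched as follows (Python; the window size, intensity threshold, and simulated frame are assumptions and do not correspond to recorded data):

```python
import numpy as np

def find_particles(image, window=15, threshold=50.0):
    """Iteratively pick the brightest pixel, record it, and blank its
    surroundings so the next search cannot find the same region, until no
    pixel exceeds the lower intensity threshold.

    Illustrative sketch only; window and threshold are assumed values."""
    img = image.astype(float).copy()
    half = window // 2
    centers = []
    while True:
        j, i = np.unravel_index(np.argmax(img), img.shape)
        if img[j, i] < threshold:
            break
        centers.append((i, j))
        # Eliminate this region from the subsequent search.
        img[max(0, j - half): j + half + 1, max(0, i - half): i + half + 1] = 0.0
    return centers

# Assumed example frame with two bright spots on a noisy background.
rng = np.random.default_rng(0)
frame = rng.poisson(5.0, size=(64, 64)).astype(float)
frame[20, 30] += 200.0
frame[45, 10] += 150.0
print(find_particles(frame))
```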
[0073] Referring to FIG. 4A, a graph illustrates the axial resolution measured using a BP FPALM setup. Specifically, the axial resolution is measured from an axial profile of caged fluorescein-labeled antibodies on a cover slip and embedded in 87% glycerol. The black line represents raw data and the dashed line represents a Gaussian fit.
[0074] From the axial profile, a full-width-at-half-maximum ("FWHM") distribution of 75 nanometers is measured, which is about 10-fold below the axial FWHM of the measured PSF (which represents the axial resolution of conventional diffraction-limited microscopy). Since localization-based resolution is proportional to the diffraction-limited PSF size and the axial FWHM of a widefield 1.2NA PSF is ~250% larger than the lateral FWHM, the measured z-localization precision is consistent with the x and y-resolution of 20 to 40 nanometers previously obtained in FPALM and PALM.
[0075] Referring to FIG. 4B, an inset shows added-up projections of the data set (of FIG. 4A) in three different orientations. The white box marks the region used to generate the axial profile. The scale bar of the original images was 2 micrometers.
[0076] Referring to FIGs. 5A-5E, 3D BP FPALM imaging of 2 micrometer diameter beads labeled with caged fluorescein shows data sets at different axial positions. Specifically, representative 100 nanometer thick xy images of the resulting data set are illustrated at z = +300 nanometers, +100 nanometers, -100 nanometers, -300 nanometers, and -500 nanometers, respectively. The data shown in all planes of FIGs. 5A-5E is recorded simultaneously without scanning. Especially to image samples thicker than 1 micrometer, the sample stage can be moved after finishing recording at one sample position to access different sample depth positions, and the data recording process is repeated until all sample positions of interest have been recorded.
[0077] Referring to FIG. 5F, a volume-rendered representation is shown based on the data sets of FIGs. 5A-5E. The curved surface of the bead is nicely reproduced over nearly 1 μm in depth without scanning. The optical images show well-below 100 nanometers resolution in all three dimensions. With approximately 30 x 30 x 80 nanometers³, the resolvable volume is ~500-fold below the diffraction-limited observation volume and represents the smallest observation volume achieved in a far-field light microscope.
[0078] Referring to FIG. 6, a BP microscope system 600 is illustrated to show the tracking of a single particle 606 positioned on a sample stage 604. The BP microscope system 600 is generally similar to the BP microscope system 200 described above in reference to FIG. 2.
[0079] As the single particle 606 moves relatively to the sample stage 604 from a first position (indicated in solid line) to a second position (indicated in dashed line), the fluorescence light beam is adjusted by tilting one or more piezo-mounted mirrors or adjusting alternative beam-steering devices 618. In the exemplary scenario, the piezo-mounted mirror 618 is tilted counterclockwise from a first position (indicated in solid line) to a second position (indicated in dashed line). The rotation of the mirror 618 steers the fluorescence light beam on the camera as well as the excitation light beam focusing into the sample and coming from the laser to correct for sideways movement of the particle 606. The mirror 618 is rotated until the excitation light beam is again centered on the particle 606. [0080] Optionally, the sample stage 604 is moved up or down to correct for vertical movement. Alternatively, a suitable beam-steering device 618 refocuses the beam vertically. After the necessary adjustments are made to track the particle 606, the positions of the piezo and stage are recorded to reconstruct large scale movement in post-processing. [0081] Referring to FIGs. 7A and 7B, two insets show the images recorded when a particle moves from a first position to a second position as described above in reference to FIG. 6. In FIG. 7A, a transmitted ROI 700a and a reflected ROI 700b are recorded on a CCD chip when the particle is in the first position. The pixels of the transmitted ROI 700a show the same focus and intensity as the pixels in the reflected ROI 700b. A black box surrounds a general 5 x 5 pixel area of interest.
[0082] When the particle moves to the second position, as shown in FIG. 7B, the transmitted ROI 700a and the reflected ROI 700b change such that the respective pixels in the area of interest are now out of focus and of different intensity. For example, the pixels of the transmitted ROI 700a are now generally brighter (i.e., more intense) than in the first position, and off-center with respect to the area of interest (i.e., up and to the right). Similarly, the pixels of the reflected ROI 700b are now generally less bright (i.e., less intense) than in the first position, and off-center with respect to the area of interest (i.e., up and to the right).
[0083] Referring to FIG. 7C, the fluorescence light beam has now been steered to center it on the particle 606 in the second position. The pixels of the transmitted ROI 700a and of the reflected ROI 700b are generally similar, if not identical, to the pixels illustrated in the first position of the particle 606 (shown in FIG. 7A). Specifically, the pixels are generally centered within the area of interest and are now of similar intensity in both the transmitted ROI 700a and the reflected ROI 700b.
[0084] Various aspects of the present invention may be used alone, in combination, or in a variety of arrangements not specifically discussed in the embodiments described in the foregoing, and the aspects of the present invention described herein are not limited in their application to the details and arrangements of components set forth in the foregoing description or illustrated in the drawings. The aspects of the invention are capable of other embodiments and of being practiced or of being carried out in various ways.
3D Particle Localization Algorithm
[0085] Referring to the 3D particle localization algorithm, experimentally obtained 3D reference data sets are used to fit data sets obtained by either a multi-plane approach or an astigmatism approach. Practically all raw data that contributes to the image of a particle is taken into account by the fitting process according to its statistical weight, which is especially relevant for photon-limited applications such as imaging single molecules. Additional calibration steps are generally not necessary because the raw data and the fit reference data set are acquired by the same setup.
[0086] Based on performed experiments, the localization algorithm converges in close to 100% of the cases over a range of 1 to 2 μm, as characterized by the fraction Φ of images in which the particle could be localized correctly. The fraction Φ depends on Δ and the number of detected photons, Ndet. Only for axial particle positions far away from either of the focal detection planes may the localization algorithm fail to converge properly.
[0087] Differences generally arise in experiments directed to the axial localization range: biplane detection is capable of localizing particles over a range nearly twice the value achievable by astigmatic detection. This is a relevant feature in imaging of thick biological samples and, therefore, the biplane mode seems to be a favorable mode for these applications. The fact that the signal in biplane detection is spread over double the number of pixels does not have a detectable negative effect in the experimental setup.
[0088] As mentioned above, the localization algorithm works independently of a theoretical model function and instead uses experimentally obtained reference data sets. This enables the algorithm to account for experimental deviations from perfect theoretical descriptions that are often difficult to include accurately in theoretical models and, also, reduces artifacts in the localization process. Additionally, complex theoretical models that cannot be described by a simple formula are now accessible by numerically generating a reference data set to feed into the algorithm. Small systematic deviations of the determined positions from actual positions can result from differences between the used reference data sets and data generated during imaging, or from the fact that the reference data set is of finite size, and can be readily corrected by proper calibration curves.
[0089] More relevantly, the localization algorithm allows general applicability to a range of 3D localization strategies without a need to develop individual theoretical model functions for every case. Additionally, it can be readily applied to the recently reported iPALM and double-helix detection schemes. The iPALM detection scheme is described in more detail in an article titled "Interferometric fluorescent super-resolution microscopy resolves 3D cellular ultrastructure" by G. Shtengel, J. A. Galbraith, C. G. Galbraith, J. Lippincott-Schwartz, J. M. Gillette, S. Manley, R. Sougrat, C. M. Waterman, P. Kanchanawong, M. W. Davidson, R. D. Fetter, and H. F. Hess (Proc. Natl. Acad. Sci. USA 106, 3125-3130, 2009), which is incorporated by reference in its entirety. The double-helix detection scheme is described in more detail in an article titled "Three-dimensional, single-molecule fluorescence imaging beyond the diffraction limit by using a double-helix point spread function" by S. R. Pavani, M. A. Thompson, J. S. Biteen, S. J. Lord, N. Liu, R. J. Twieg, R. Piestun, and W. E. Moerner (Proc. Natl. Acad. Sci. USA 106, 2995-2999, 2009), which is incorporated by reference in its entirety.
[0090] Furthermore, the localization algorithm is not limited to the five fitting parameters identified below (i.e., x-position, y-position, z-position, amplitude, and background) or to 3D reference data sets. The localization algorithm can be readily expanded to include parameters such as interference phase, polarization, or wavelength by providing experimental reference data sets that provide the necessary information in a fourth, fifth, or even higher dimension. Alternatively, fewer parameters and dimensions are possible, allowing for application to 2D imaging.
Software
[0091] Image acquisition software that controls camera and piezo actuator parameters may be written, for example, using LabVIEW 8.2 software in Windows XP, the software being available from National Instruments Corp., Austin, Texas. Recorded data is stored in a raw data format that is later analyzed by separate analysis software programmed in C, which may run on a Linux computer cluster (e.g., 31 compute nodes, each equipped with two dual-core AMD Opteron processors and 16 GB of RAM and connected by a single gigabit Ethernet network).
[0092] As a brief overview, ROIs corresponding to an illuminated field of view are cut out automatically. In biplane detection mode, the two ROIs in every frame representing the two detected planes are co-registered by slightly tilting and magnifying one ROI according to earlier determined calibration values. Particles are identified as the brightest pixels in smoothed versions of the ROIs. For every identified particle, one ROI of 15 x 15 pixels at 2 x 2 binning (corresponding to 1.9 μm x 1.9 μm in the sample) is cut out from the non-smoothed data centered on the identified brightest pixel in the astigmatic (biplane) detection mode. This data is, then, corrected for an electronic offset in the signal stemming from the camera, translated from counts into number of photons, and fed into the fit algorithm.
[0093] The fit algorithm provides the best estimates for the three spatial coordinates of the particle, as well as the amplitude and a background value of the particle. The coordinates, amplitude, and/or background value are stored together with other parameters that indicate the quality of the fit (e.g., χ²-values, number of iterations before convergence, etc.) in ASCII data lists. These lists are later compiled into the data presented below using, for example, computer programs such as Microsoft Excel, Origin (OriginLab, Northampton, MA), and LabVIEW.
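By way of illustration only, the offset correction and count-to-photon conversion step mentioned in paragraph [0092] may be sketched as follows (Python; the offset, sensitivity, and electron-multiplying gain values are assumed examples, and the actual calibration of the camera described herein may differ):

```python
import numpy as np

def counts_to_photons(roi_counts, offset=100.0, e_per_count=10.0, em_gain=300.0):
    """Remove the electronic camera offset and convert counts into photons.

    Illustrative sketch only; offset, sensitivity (electrons per count), and
    electron-multiplying gain are assumed example values."""
    electrons = (roi_counts - offset) * e_per_count   # remove offset, convert to electrons
    photons = electrons / em_gain                     # undo the electron-multiplying gain
    return np.clip(photons, 0.0, None)                # clip negative values caused by noise

roi = np.array([[110.0, 460.0], [130.0, 3100.0]])     # assumed raw counts
print(counts_to_photons(roi))
```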
[0094] The same software, especially the same fit algorithm, may be applied to data from both imaging modes with the only difference being that in biplane mode two ROIs are used instead of one. Combined with the required minimal changes in the optical setup, this provides optimal conditions for a direct and thorough comparison of the two methods of 3D localization.
Fit Algorithm
[0095] The fit algorithm, also referred to as a particle localization routine, performs for every identified particle a least-squares fit based on the Nelder-Mead downhill simplex method. This method is described in more detail in "Numerical Recipes: The Art of Scientific Computing" by W. H. Press, S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery (Cambridge University Press, 2007), which is incorporated by reference in its entirety. This algorithm, in short, finds the best fit by successively contracting an m-dimensional polytope ("simplex" with m+1 vertices) around the minimum of the figure-of-merit function in m-dimensional parameter space.
[0096] The figure-of-merit function, χ², is calculated as the sum, over all pixels j that describe the image of an identified particle, of the squared error-weighted differences between the observed number of photons, nj, and a model function, F, which depends on a set of fit parameter values:
χ² = Σj [nj - F(xj)]² / σj²     (1)
[0097] The 3D positions, xj, describe the coordinates in the sample and correspond to a lattice of 15 x 15 x 2 or 15 x 15 x 1 extracted pixels in biplane and astigmatic detection mode, respectively. Alternatively, the 3D positions can represent any distribution matching the experimental imaging conditions. The estimated statistical error of nj, σj, is assumed to be √nj because shot noise is generally the main error contribution.
[0098] To fit the particle data, the model function Fv,b,a(x) = v·ha(x) + b depends on m = 5 parameters. Specifically, the 5 parameters are (1-3) the particle's 3D position a = (ax, ay, az), (4) the number of photons, v, at the intensity maximum detected over the area of one pixel, and (5) the number of background photons, b, per pixel. The normalized instrument response at point x for a particle located at position a is described by ha(x) and is derived from experimentally obtained reference data sets.
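By way of illustration only, the figure of merit of Eq. 1 and its minimization by the Nelder-Mead downhill simplex method may be sketched as follows (Python; a 3D Gaussian stands in for the experimentally obtained reference data set so that the sketch is self-contained, and all numerical values are assumptions):

```python
import numpy as np
from scipy.optimize import minimize

def h_a(x, y, z, a):
    """Stand-in for the normalized instrument response ha(x): a 3D Gaussian is
    used here only so the sketch runs on its own.  The actual algorithm
    interpolates an experimentally obtained reference data set instead (see
    the Fourier-transform based shift described below)."""
    ax, ay, az = a
    return np.exp(-((x - ax) ** 2 + (y - ay) ** 2) / (2 * 150.0 ** 2)
                  - (z - az) ** 2 / (2 * 350.0 ** 2))

def chi_square(params, n, x, y, z):
    """Sum over all pixels j of the error-weighted squared differences between
    the observed photons nj and the model F(xj) = v*ha(xj) + b, with sigma_j^2 = nj."""
    ax, ay, az, v, b = params
    model = v * h_a(x, y, z, (ax, ay, az)) + b
    return np.sum((n - model) ** 2 / np.maximum(n, 1.0))

# Assumed example: a 15 x 15 x 2 pixel lattice (biplane mode) with 100 nm pixels
# and detection planes 700 nm apart; the "true" particle parameters are invented.
xs, ys = np.meshgrid(np.arange(15) * 100.0 - 700.0, np.arange(15) * 100.0 - 700.0)
x = np.stack([xs, xs]); y = np.stack([ys, ys])
z = np.stack([np.full_like(xs, -350.0), np.full_like(xs, 350.0)])
rng = np.random.default_rng(1)
truth = (30.0, -20.0, 120.0, 300.0, 2.0)          # ax, ay, az, v, b
n = rng.poisson(truth[3] * h_a(x, y, z, truth[:3]) + truth[4]).astype(float)

start = (0.0, 0.0, 0.0, float(n.max()), 1.0)
fit = minimize(chi_square, x0=start, args=(n, x, y, z), method="Nelder-Mead")
print(fit.x)   # best-fit estimates of (ax, ay, az, v, b)
```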
[0099] The reference data set ho(ξl) is defined for a lattice of voxel coordinates ξl. The required values ha(xj) have to be determined from the reference data set by interpolation, making use of the fact that for a translationally invariant system ha(x) = ho(x - a). Simple linear interpolation and related methods generate points of non-differentiability that can cause failure of proper convergence of the simplex method and induce localization artifacts. To address this issue, the following interpolation method based on Fourier transforms has been developed.
[0100] The problem of estimating the value of a sampled function at a certain point of interest, x - a, can be interpreted as determining the function value at the nearest node ξl after shifting the whole function by the amount necessary to make (x - a) coincide with ξl. This shift can be achieved by convolving the function with a Dirac delta distribution:
ha(xj) = ho(ξl) ⊗ δ(ξl - Da(xj))     (2),
where Da(xj) = ξl - (xj - a) describes the vector between the position xj - a and its closest neighbor ξl. In Fourier space the convolution assumes the simple form of a multiplication,
Ha(κl) = Ho(κl) · exp(-iκl·Da(xj))     (3).
[0101] Ho(κl) is the optical transfer function ("OTF") of the system and is calculated once at the beginning of the fit procedure as the Fourier transform of ho(ξl). Multiplication with the parameter-dependent phase factor exp(-iκl·Da(xj)) and inverse Fourier transform yield the shifted reference data set ha(xj).
[0102] It can be easily realized that the spacing of the pixel positions, xj, is an integer multiple of the spacing of the reference data set nodes ξl. In this case, Da(xj) is independent of j and a single inverse Fourier transformation of Eq. 3 is sufficient to find all the reference data set values required to calculate χ² for a given shift a.
[0103] Because of small experimental differences between the two biplane detection reference data sets, we use a slight modification of the described method that performs the reference data set translation as described by Eq. 3 simultaneously for both reference data sets. Values ha(xj) are extracted from the appropriate OTFs according to the detection plane in which the positions xj are located. For calculating the discrete Fourier transforms, for example, the fit algorithm can use the freely available FFTW by M. Frigo and S. G. Johnson (http://www.fftw.org, 2008), which is incorporated by reference in its entirety.
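By way of illustration only, the Fourier-transform based shift of the reference data set (Eqs. 2 and 3) may be sketched as follows (Python with NumPy FFTs in place of FFTW; the reference data set is replaced by a small synthetic Gaussian so that the sketch is self-contained):

```python
import numpy as np

def shift_reference(h0, shift_voxels):
    """Shift a sampled reference data set h0 by a (possibly fractional) number
    of voxels via multiplication with a phase factor in Fourier space.

    Illustrative sketch only; the actual implementation computes the OTF once,
    uses FFTW, and shifts both biplane reference stacks simultaneously."""
    H0 = np.fft.fftn(h0)                                 # OTF of the sampled stack
    freqs = [np.fft.fftfreq(n) for n in h0.shape]        # cycles per voxel along each axis
    kz, ky, kx = np.meshgrid(*freqs, indexing="ij")
    phase = np.exp(-2j * np.pi * (kz * shift_voxels[0]
                                  + ky * shift_voxels[1]
                                  + kx * shift_voxels[2]))
    return np.real(np.fft.ifftn(H0 * phase))             # shifted reference values

# Assumed example: a synthetic Gaussian blob shifted by 2.5 voxels along z.
zz, yy, xx = np.mgrid[-8:8, -8:8, -8:8].astype(float)
h0 = np.exp(-(xx ** 2 + yy ** 2 + zz ** 2) / (2 * 2.0 ** 2))
h_shifted = shift_reference(h0, (2.5, 0.0, 0.0))
print(np.unravel_index(h0.argmax(), h0.shape),
      np.unravel_index(h_shifted.argmax(), h_shifted.shape))  # maximum moves by the shift
```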
Sample
[0104] In the performed experiments, fluorescent latex beads of 100 nm diameter with an emission maximum at 560 nm (F-8800, Invitrogen, Carlsbad, California) were imaged. Beads were adhered on poly-L-lysine coated (Sigma-Aldrich, St. Louis, MO) cover slips, immersed in water and mounted on a slide. Bead density was chosen low enough that only about eight to twelve beads were visible in the field of view when imaging. This guaranteed that fluorescence from neighboring particles did not influence the analysis.
Generation of the Reference Data Set
[0105] In the localization algorithm, an experimentally obtained reference data set (also referred to as a point-spread function) replaces theoretical models used elsewhere. To rule out localization artifacts caused by this reference data set, great care was exercised in its generation during performed experiments. The same bead samples as in later experiments were imaged at maximum electron-multiplying gain of the camera without pixel binning. Single frames were recorded with acquisition times of 30 ms at 50 nm axial piezo steps over a range of 10 μm.
[0106] Typically, during performed experiments, approximately 3,000 photons were detected from each bead at each z-position near the focal plane. Single beads were identified visually from the recorded data stacks, and ROIs of 3.8 μm x 3.8 μm size centered on the signal maximum were extracted. In the case of biplane detection, stacks were cut out for both recorded planes resulting in two correlated reference data sets. The extracted stacks were loaded into the data processing software Imspector (written by Dr. Andreas Schoenle, Max Planck Institute for Biophysical Chemistry, Goettingen, Germany, and available via Max-Planck-Innovation GmbH, Munich, Germany).
[0107] In Imspector, the background was removed and the data was corrected for bleaching that had occurred during the imaging process. The reference data sets were then cut to a size of approximately 3.8 μm x 3.8 μm x 7.5 μm with the reference data set centered in the middle. In biplane mode, both reference data sets were cut in an identical way such that (i) the stack centers were located axially in the middle between the two reference data set centers and (ii) the reference data set centers maintained their original axial distance. To reduce noise, the reference data sets were resampled to voxel sizes close to the resolution limit (x = y = 127 nm, z = 200 nm).
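By way of illustration only, the background removal, bleaching correction, and normalization steps of paragraph [0107] may be sketched as follows (Python; in the described experiments these steps were carried out in Imspector, and the bleaching rate, background level, and frame data below are assumptions):

```python
import numpy as np

def preprocess_reference(stack, frame_times, bleach_rate, background):
    """Background-subtract, correct mono-exponential bleaching, and normalize a
    recorded z-stack so it can serve as a reference data set.

    Illustrative sketch only; bleach_rate (per second) and background are
    assumed values that would normally come from the recorded data itself."""
    corrected = (stack - background).clip(min=0.0)
    # Undo mono-exponential bleaching frame by frame.
    corrected *= np.exp(bleach_rate * frame_times)[:, None, None]
    return corrected / corrected.max()        # normalize the maximum to 1

# Assumed example: 101 frames of 32 x 32 pixels with a decaying signal.
rng = np.random.default_rng(2)
t = np.arange(101) * 0.03                      # 30 ms acquisition time per frame
stack = 50.0 + 500.0 * np.exp(-0.1 * t)[:, None, None] * rng.random((101, 32, 32))
print(preprocess_reference(stack, t, bleach_rate=0.1, background=50.0).max())
```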
[0108] This resampling assured maximum processing speed in the fit algorithm, whose speed, due to the inclusion of Fourier transformation steps, depends on the number of reference data set voxels, without altering the optical characteristics of the original reference data set. The reference data set has been further normalized to a maximum value of 1 for easier determination of reasonable start parameters for the fit algorithm. In biplane mode, the brighter reference data set was normalized to 1 and the other reference data set was normalized to a lower value.
[0109] Referring to FIG. 8, an exemplary photoactivation localization microscopy method in accordance with the features described above includes providing recorded raw images (801) from which particles are identified in images (803). The photoactivation localization microscopy method can be performed in both 2D and 3D. A computation is made separately for each identified particle (805) and pixels are extracted in regions of interest centered around each of the identified particles (807). Each identified particle is localized by determining the particle position from the intensity distribution in the region of interest (809). Additional details regarding the localization of each identified particle are provided in FIGs. 10-12. The determined positions of all the particles are merged (811) and a particle distribution map is created from the determined particle positions (813). The particle distribution map is provided as a resulting image (815).
[0110] Referring to FIG. 9, an exemplary single particle tracking method in accordance with the features described above includes providing a recorded image sequence (901) from which a particle is identified in each frame (903). The single particle tracking method can be performed in both 2D and 3D. A computation is made separately for each frame (905), and pixels are extracted in a region of interest centered around the particle (907). The particle is localized by determining the particle position from the intensity distribution in the region of interest (909). Additional details regarding the localization of the identified particle are provided in FIGs. 10-12. The determined positions of the particle are merged (911), and a particle trajectory is created from the determined particle positions (913). The resulting particle trajectory is provided (915).
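The single particle tracking flow of FIG. 9 mirrors FIG. 8 but operates frame by frame on one particle. A minimal sketch follows; taking the particle as the brightest pixel in each frame, and assuming it stays away from the frame border, are illustrative simplifications, not part of the described method.

```python
import numpy as np

def track_particle(frames, roi_half=7):
    """Sketch of FIG. 9: localize one particle in each frame of a sequence."""
    trajectory = []
    for frame in frames:                              # (901) recorded sequence
        # (903) identify the particle; here simply the brightest pixel.
        py, px = np.unravel_index(np.argmax(frame), frame.shape)
        roi = frame[py - roi_half:py + roi_half + 1,
                    px - roi_half:px + roi_half + 1]  # (907) region of interest
        yy, xx = np.indices(roi.shape)                # (909) localize by the
        dy = (yy * roi).sum() / roi.sum() - roi_half  # intensity center of mass
        dx = (xx * roi).sum() / roi.sum() - roi_half
        trajectory.append((py + dy, px + dx))         # (911) merge positions
    return np.asarray(trajectory)                     # (913)/(915) trajectory
```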
[0111] Referring to FIG. 10, an example of a conventional non-iterative localization algorithm includes extracting pixels for one particle (imaging) or one point in time (particle tracking) (1001), after which the center of mass is calculated (1003). The particle position is then determined (1005).
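A center-of-mass localization such as the one in FIG. 10 can be written in a few lines; the following sketch assumes a background-corrected region of interest and works for 2D or 3D arrays.

```python
import numpy as np

def center_of_mass(roi):
    """Sketch of FIG. 10: non-iterative localization by center of mass."""
    coords = np.indices(roi.shape)            # pixel (2D) or voxel (3D) grids
    total = roi.sum()                         # (1003) intensity-weighted mean
    return tuple((c * roi).sum() / total for c in coords)   # (1005) position
```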
[0112] Referring to FIG. 11, an example of a conventional iterative localization algorithm includes extracting pixels for one particle (imaging) or one point in time (particle tracking) (1101). After calculating the center of mass (1103), a first guess of the particle position (1105) is provided to calculate the figure of merit as a measure of the difference between the imaged pixel data and a model function based on the guessed particle position (1107). The guess is modified (1109) and the figure of merit is calculated based on the modified guess (1111). If the figure of merit has not decreased (1113), the old guess is used to determine if the figure of merit is below a specific threshold (1115). If the figure of merit has decreased (1113), the modified guess is used to determine if the figure of merit is below the specific threshold (1115). Then, if the figure of merit is below the specific threshold, the particle position is outputted as the last guess (1117). However, if the figure of merit is not below the specific threshold, the guess is modified (1109) and the figure of merit is calculated based on the modified guess (1111).
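The generic iterative loop of FIG. 11 can be sketched as follows. The symmetric Gaussian model, the random perturbation of the guess, and the threshold value are illustrative assumptions of this sketch; claim 42, for example, contemplates a simplex-based (polytope-contracting) modification of the guess instead.

```python
import numpy as np

def gaussian_model(shape, guess, sigma=1.5, amplitude=1.0):
    # Illustrative model image: symmetric 2D Gaussian at the guessed position.
    yy, xx = np.indices(shape)
    return amplitude * np.exp(-((yy - guess[0]) ** 2 + (xx - guess[1]) ** 2)
                              / (2 * sigma ** 2))

def iterative_localize(roi, model=gaussian_model, max_iter=200,
                       threshold=1e-3, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    yy, xx = np.indices(roi.shape)
    guess = np.array([(yy * roi).sum() / roi.sum(),      # (1103) center of mass
                      (xx * roi).sum() / roi.sum()])     # (1105) first guess

    def merit(g):                                        # (1107) figure of merit
        return np.sum((roi - model(roi.shape, g)) ** 2)

    best = merit(guess)
    for _ in range(max_iter):
        trial = guess + step * rng.standard_normal(2)    # (1109) modify guess
        value = merit(trial)                             # (1111) recompute merit
        if value < best:                                 # (1113) decreased?
            guess, best = trial, value
        if best < threshold:                             # (1115) below threshold?
            break
    return guess                                         # (1117) particle position
```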
[0113] Referring to FIG. 12, in an exemplary implementation of the fit algorithm described above, pixels are extracted for one particle (imaging) or one point in time (particle tracking) (1201). After calculating the center of mass (1203), a first guess of the particle position (1205) and a reference data set (1207) are provided to calculate the figure of merit as a measure of the difference between the imaged pixel data and the reference data set adjusted for the guessed particle position (1209). Exemplary algorithms for calculating the figure of merit are further described below in reference to FIGs. 13 and 14.
[0114] The guess is modified (1211) and the figure of merit is calculated based on the modified guess (1213). If the figure of merit has not decreased (1215), the old guess is used to determine if the figure of merit is below a specific threshold (1217). If the figure of merit has decreased (1215), the modified guess is used to determine if the figure of merit is below the specific threshold (1217). Then, if the figure of merit is below the specific threshold, the particle position is outputted as the last guess (1219). However, if the figure of merit is not below the specific threshold, the guess is modified (1211) and the figure of merit is calculated based on the modified guess (1213).
[0115] Referring to FIG. 13, an algorithm for calculating the figure of merit includes extracting pixels for a particle (1301), guessing the particle position, brightness, and background (1303), and providing the reference data set (1305). The reference data set is shifted in accordance with the guessed particle position (1307), and the shifted reference data set (1309), together with the guessed particle position, brightness, and background, is used to calculate the model function Fv,b,a(xj) = vha(xj) + b (1311). The figure of merit is calculated (1313) from the model function and the extracted pixels for the particle, and the resulting figure-of-merit value is provided (1315).
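A figure-of-merit calculation along the lines of FIG. 13 might look as follows. The real-space linear interpolation used for the shift and the weighting of the squared differences by the observed counts (consistent with claims 43 and 44) are assumptions of this sketch.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def figure_of_merit(pixels, reference, guess):
    """Sketch of FIG. 13: merit value for a guessed particle state.

    guess = (az, ay, ax, v, b): 3D position offset in voxel units, amplitude v,
    and background b, as in the model F(x) = v * h_a(x) + b.
    """
    az, ay, ax, v, b = guess
    # (1307) shift the reference data set to the guessed particle position
    h_shifted = nd_shift(reference, (az, ay, ax), order=1)
    model = v * h_shifted + b                      # (1311) model function
    # (1313) error-weighted squared differences, summed over all pixels
    return np.sum((pixels - model) ** 2 / np.maximum(pixels, 1))
```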
[0116] Referring to FIG. 14, an exemplary embodiment of a Fourier-transform based shift algorithm includes extracting pixels for a particle (1401) and guessing the particle position, brightness, and background (1403). A precalculation (1405) provides the reference data set (1405a), to which a Fourier transform is applied (1405b) to obtain the OTF (1405c). The OTF is multiplied by a parameter-dependent phase factor (1407) based on the particle position, brightness, and background. The inverse Fourier transform is calculated (1409), providing a shifted reference data set (1411). The model function Fv,b,a(xj) = vha(xj) + b is calculated (1413) based on the shifted reference data set and the guessed particle position, brightness, and background. The figure of merit is calculated (1415) from the model function and the extracted pixels for the particle, and the resulting figure-of-merit value is provided (1417).

[0117] Each of these embodiments and obvious variations thereof is contemplated as falling within the spirit and scope of the claimed invention, which is set forth in the following claims.
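For the Fourier-transform based shift described in paragraph [0116] above in reference to FIG. 14, a minimal sketch (assuming numpy FFT routines and offsets given in voxel units) is:

```python
import numpy as np

def fourier_shift_reference(reference, offsets):
    """Sketch of FIG. 14: shift the reference data set via its OTF.

    The Fourier transform of the reference data set (the OTF) is multiplied
    by a position-dependent phase factor and transformed back. In practice
    the OTF (1405c) would be precalculated once rather than per call.
    """
    otf = np.fft.fftn(reference)                           # (1405b)/(1405c)
    freqs = np.meshgrid(*[np.fft.fftfreq(n) for n in reference.shape],
                        indexing="ij")
    phase = np.exp(-2j * np.pi * sum(f * d for f, d in zip(freqs, offsets)))
    return np.fft.ifftn(otf * phase).real                  # (1407)-(1411)

def model_function(shifted_reference, v, b):
    # (1413) F_{v,b,a}(x) = v * h_a(x) + b
    return v * shifted_reference + b
```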

Claims

What is claimed is:
1. A microscopy system configured for creating 3D images from individually localized probe molecules, the system comprising: a sample stage for mounting a sample having a plurality of photo-sensitive luminescent probes; an activation light source configured to illuminate the sample with an activation light, the activation light source being configured to activate probes of at least one probe subset of the plurality of photo-sensitive luminescent probes; a readout light source configured to illuminate the sample with a readout light, the readout light source being configured to cause luminescence light from the activated probes; a beam splitting device located in a detection light path that splits the luminescence light into at least two paths, the beam splitting device creating at least two detection planes that correspond to the same or different number of object planes of the sample; at least one camera positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest; and a controller programmable to combine a signal from the regions of interest into a 3D data stack.
2. The microscopy system of claim 1, wherein the sample stage is moveable by the controller to image different object planes in the sample.
3. The microscopy system of claim 1, wherein the detected luminescence from the activated probes is fluorescence.
4. The microscopy system of claim 1, wherein the activation of the activated probes is achieved by a non-linear process limiting the activation to a plane of diffraction-limited thickness.
5. The microscopy system of claim 1, wherein a wavelength of the readout light source is selected to minimize activation of inactive probes of the plurality of photosensitive probes.
6. The microscopy system of claim 1, wherein the activation light source and the readout light source are the same.
7. The microscopy system of claim 1, wherein the at least one camera is a CCD camera.
8. The microscopy system of claim 1, wherein the at least one camera is a single camera, the recorded regions of interest being arranged on the single camera such that they are well separated to avoid cross-talk.
9. The microscopy system of claim 1, wherein at least some of the recorded regions of interest are detected at different wavelengths via suitable detection filters.
10. The microscopy system of claim 1, wherein at least some of the recorded regions of interest are detected at different polarization directions via suitable polarization optics.
11. The microscopy system of claim 1, wherein the activated probes are spatially separated by at least a microscope resolution.
12. A method for creating 3D images from individually localized probe molecules, the method comprising:
(A) mounting a sample on a sample stage, the sample having a plurality of photo-sensitive luminescent probes; (B) in response to illuminating the sample with an activation light, activating probes of at least one probe subset of the plurality of photo-sensitive luminescent probes; (C) in response to illuminating the sample with a readout light, causing luminescence light from the activated probes;
(D) splitting the luminescence light into at least two paths to create at least two detection planes, the at least two detection planes corresponding to the same or different object planes in the sample;
(E) detecting simultaneously the at least two detection planes via a camera;
(F) recording the object planes in corresponding recorded regions of interest in the camera; and
(G) combining a signal from the regions of interest into a 3D data stack.
13. The method of claim 12, further comprising moving the sample stage via a controller to image different object planes in the sample.
14. The method of claim 12, further comprising limiting the activating to a plane of diffraction-limited thickness.
15. The method of claim 12, wherein a wavelength of the readout light source is selected to minimize activation of inactive probes of the plurality of photo-sensitive probes.
16. The method of claim 12, further comprising arranging the recorded regions of interest on a single camera such that they are well separated to avoid cross-talk.
17. The method of claim 12, further comprising (H) deactivating the activated probes of the at least one probe subset;
(I) repeating (B)-(G) for at least one more subset of probes of the plurality of photo-sensitive luminescent probes; and (J) based on localized three-dimensional positions of the activated probes of the at least one probe subset and the at least one more subset of probes, constructing a 3D image of the sample.
18. A microscopy system configured for tracking microscopic particles in 3D, the system comprising: a sample comprising luminescence particles; a sample stage for mounting the sample; at least one light source configured to illuminate an area of the sample, the at least one light source causing luminescence light from primarily one tracked particle of the luminescence particles; a beam-steering device configured to selectively move a light beam to illuminate different areas of the sample such that the luminescence light is detected; a beam splitting device located in a detection light path that splits the luminescence light into at least two paths, the beam splitting device creating at least two detection planes that correspond to different object planes in the sample; at least one camera positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest; and a controller programmable to combine a signal from the recorded regions of interest, determine a 3D trajectory of the one tracked particle at each time point of a recorded data sequence, and move the beam-steering device to illuminate the different areas of the sample in accordance with corresponding positions of the one tracked particle.
19. The microscopy system of claim 18, wherein the controller is further programmable to move the sample stage to track motion of the one tracked particle in at least one direction.
20. The microscopy system of claim 18, wherein detected luminescence light from the one tracked particle is fluorescence light.
21. The microscopy system of claim 18, wherein detected luminescence light from the one tracked particle is scattered light.
22. The microscopy system of claim 18, wherein the luminescence particles are photo-activated.
23. The microscopy system of claim 18, wherein the light source is a laser.
24. The microscopy system of claim 23, wherein the luminescence light is created in a non-linear way, the luminescence light being limited to a focal region of the laser centered around the one tracked particle.
25. The microscopy system of claim 18, wherein the beam-steering device includes a piezo-driven tiltable mirror that moves an illuminated area laterally to follow the one tracked particle.
26. The microscopy system of claim 18, wherein the beam-steering device includes a focusing optical element that moves illumination and detection focal planes axially to follow the one tracked particle.
27. The microscopy system of claim 18, wherein the beam-steering device includes a phase-modulating device that moves an illuminated area laterally and illumination and detection focal planes axially to follow the one tracked particle.
28. The microscopy system of claim 18, wherein the at least one camera is a CCD camera.
29. The microscopy system of claim 18, wherein each of the recorded regions of interest includes a small number of pixels to limit the amount of processed data.
30. The microscopy system of claim 18, wherein the recorded regions of interest are arranged in the same pixel lines of the at least one camera to optimize readout time.
31. The microscopy system of claim 18, wherein another luminescence color is detected to enable correlative studies of the one tracked particle movement.
32. The microscopy system of claim 18, wherein at least one illuminated area of the sample has a diameter that is less than about three times an Airy disk diameter.
33. A method for tracking microscopic particles in 3D, the method comprising: mounting a sample on a sample stage, the sample including luminescence particles; illuminating an area of the sample to cause luminescence light from primarily one particle of the luminescence particles; selectively moving a light beam to illuminate different areas of the sample to track movement of the one particle, the different areas including the area of the sample and corresponding to respective positions of the one particle; splitting the luminescence light into at least two paths to create at least two detection planes that correspond to the same or different number of object planes in the sample; detecting simultaneously the at least two detection planes; representing in a camera the number of object planes by the same number of recorded regions of interest; and based on a combined signal from the recorded regions of interest, determining a 3D trajectory of the one particle at each time point of a recorded data sequence.
34. The method of claim 33, further comprising moving the sample stage to track motion of the one particle in at least one direction.
35. The method of claim 33, further comprising creating the luminescence light in a non-linear way, the luminescence light being limited to a focal region of a laser centered around the one particle.
36. The method of claim 33, wherein a piezo-driven tiltable mirror moves an illuminated area laterally to follow the one particle.
37. The method of claim 33, wherein a focusing optical element moves illumination and detection focal planes axially to follow the one particle.
38. The method of claim 33, wherein a phase-modulating device moves an illuminated area laterally and illumination and detection focal planes axially to follow the one particle.
39. The method of claim 33, further comprising arranging the recorded regions of interest in the same pixel lines of at least one camera to optimize readout time.
40. The method of claim 33, further comprising detecting another luminescence color to enable correlative studies of the one particle movement.
41. A microscopy system configured for creating 3D images from individually localized probe molecules, the system comprising: an activation light source configured to illuminate a sample with an activation light, the activation light source being configured to activate probes of at least one probe subset of the plurality of photo-sensitive luminescent probes; a readout light source configured to illuminate the sample with a readout light, the readout light source being configured to cause luminescence light from the activated probes; a beam splitting device located in a detection light path that splits the luminescence light into at least two paths, the beam splitting device creating at least two detection planes that correspond to the same or different number of object planes of the sample; at least one camera positioned to detect simultaneously the at least two detection planes, the number of object planes being represented in the camera by the same number of recorded regions of interest; and a controller programmable to combine a signal from the regions of interest into a 3D data stack, calculate a figure of merit as a measure of the difference between imaged pixel data and a reference data set modifiable by at least one parameter, and optimize the figure of merit by adjusting the at least one parameter.
42. The microscopy system of claim 41, wherein the controller is further programmable to perform a least-squares fit for finding a best fit by successively contracting an m-dimensional polytope around a minimum of a figure-of-merit function in m-dimensional parameter space.
43. The microscopy system of claim 41, wherein the controller is further programmable to calculate the figure-of-merit function as a square of error-weighted differences between the observed number of photons and a model function that depends on a set of fit parameter values, summed over all pixels.
44. The microscopy system of claim 41, wherein the figure-of-merit function χ² is expressed as

χ²(v, b, a) = Σj [nj − Fv,b,a(xj)]² / nj,

wherein xj describes 3D coordinates in the sample, nj is the observed number of photons in pixel j, the sum runs over all pixels j, Fv,b,a(x) = vha(x) + b, a is the particle 3D position, v is the number of photons at the intensity maximum detected over the area of one pixel, b is the number of background photons per pixel, and ha(x) describes the normalized instrument response at point x for a particle located at position a.
45. The microscopy system of claim 41, wherein fitting particle data is based on 5 parameters, including (i) particle 3D position in x direction αx, (ii) particle 3D position in y direction αy, (iii) particle 3D position in z direction αz, (iv) number of photons v at the intensity maximum detected over the area of one pixel, and (v) the number of background photons b per pixel.
46. The microscopy system of claim 41, wherein the reference data set is defined for a lattice of voxel coordinates ξl, the controller being further programmable to determine values ha(xj) from the reference data set by interpolation.
47. The microscopy system of claim 46, wherein interpolation is based on Fourier transforms.
48. The microscopy system of claim 46, wherein the controller is further programmable to convolve the reference data set with a shifted Dirac delta distribution.
49. The microscopy system of claim 41, wherein the at least one parameter includes a particle position, an amplitude, and a background signal.
50. The microscopy system of claim 41, wherein the at least one parameter includes an interference phase parameter, a polarization parameter, and a wavelength parameter.
PCT/US2009/038799 2008-04-01 2009-03-30 3d biplane microscopy WO2009146016A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/936,095 US20110025831A1 (en) 2008-04-01 2009-03-30 3D Biplane Microscopy
EP09755379.6A EP2265932B1 (en) 2008-04-01 2009-03-30 3d biplane microscopy

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/060,730 2008-04-01
US12/060,730 US7772569B2 (en) 2008-04-01 2008-04-01 3D biplane microscopy

Publications (1)

Publication Number Publication Date
WO2009146016A1 true WO2009146016A1 (en) 2009-12-03

Family

ID=41115683

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/038799 WO2009146016A1 (en) 2008-04-01 2009-03-30 3d biplane microscopy

Country Status (3)

Country Link
US (3) US7772569B2 (en)
EP (3) EP2631633A1 (en)
WO (1) WO2009146016A1 (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010049751A1 (en) 2010-10-29 2012-05-03 "Stiftung Caesar" (Center Of Advanced European Studies And Research) Optical beam splitter, particularly for use in beam path of light microscope, for displaying multiple focal planes of object on optical detector, has monolithic base module comprising beam splitter module
CN102656442A (en) * 2009-12-23 2012-09-05 应用精密公司 System and method for dense-stochastic-sampling imaging
EP2516993A1 (en) * 2009-12-22 2012-10-31 Carl Zeiss Microscopy GmbH High-resolution microscope and method for determining the two- or three-dimensional positions of objects
TWI480536B (en) * 2014-05-20 2015-04-11 Univ Nat Taiwan System for analyzing fluorescence intensity and synthesizing fluorescence image and method thereof
US9103784B1 (en) 2012-11-16 2015-08-11 Iowa State University Research Foundation, Inc. Fluorescence axial localization with nanometer accuracy and precision
DE102015004104A1 (en) 2015-03-27 2016-09-29 Laser-Laboratorium Göttingen e.V. Method for locating at least one emitter by means of a localization microscope
DE102015121403A1 (en) * 2015-12-09 2017-06-14 Carl Zeiss Microscopy Gmbh LIGHT FIELD IMAGING WITH SCANOPTICS
DE102016116620B3 (en) * 2016-09-06 2017-11-02 Stiftung Caesar Center Of Advanced European Studies And Research Beam guidance unit and system of beam guidance units and their use
DE102016119263A1 (en) * 2016-10-10 2018-04-12 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. A method for spatially high-resolution determination of the location of a singled, excitable light for the emission of luminescent light molecule in a sample
US10783697B2 (en) 2016-02-26 2020-09-22 Yale University Systems, methods, and computer-readable media for ultra-high resolution 3D imaging of whole cells

Families Citing this family (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7656172B2 (en) * 2005-01-31 2010-02-02 Cascade Microtech, Inc. System for testing semiconductors
US8121361B2 (en) 2006-05-19 2012-02-21 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
US7838302B2 (en) 2006-08-07 2010-11-23 President And Fellows Of Harvard College Sub-diffraction limit image resolution and other imaging techniques
US8217992B2 (en) * 2007-01-11 2012-07-10 The Jackson Laboratory Microscopic imaging techniques
EP4060327A1 (en) * 2007-12-21 2022-09-21 President and Fellows of Harvard College Sub-diffraction limit image resolution in three dimensions
US8994807B2 (en) 2009-03-18 2015-03-31 University Of Utah Research Foundation Microscopy system and method for creating three dimensional images using probe molecules
JP5337676B2 (en) * 2009-06-25 2013-11-06 株式会社日立ハイテクノロジーズ Fluorescence analyzer and fluorescence detector
DE102009031231A1 (en) * 2009-06-26 2010-12-30 Carl Zeiss Microlmaging Gmbh Methods and arrangements for fluorescence microscopy
US9250181B2 (en) * 2009-09-28 2016-02-02 Koninklijke Philips N.V. Sensor device with imaging optics
DE102009043744A1 (en) 2009-09-30 2011-03-31 Carl Zeiss Microlmaging Gmbh Method and microscope for three-dimensional resolution-enhanced microscopy
US20110090327A1 (en) * 2009-10-15 2011-04-21 General Electric Company System and method for imaging with enhanced depth of field
DE102010007730B4 (en) * 2010-02-12 2021-08-26 Leica Microsystems Cms Gmbh Method and device for setting a suitable evaluation parameter for a fluorescence microscope
DE102010013223B4 (en) * 2010-03-29 2016-05-12 Lavision Biotec Gmbh Method and arrangement for microscopy
GB201007055D0 (en) * 2010-04-28 2010-06-09 Vib Vzw Method and apparatus for the imaging of a labelled sample
US9581642B2 (en) * 2010-05-12 2017-02-28 International Business Machines Corporation Method and system for quickly identifying circuit components in an emission image
US20120092480A1 (en) * 2010-05-28 2012-04-19 Putman Matthew C Unique digital imaging method employing known background
DE112011103187B4 (en) 2010-09-24 2021-10-14 Carl Zeiss Microscopy Gmbh System and method for 3D localization microscopy
FR2966258B1 (en) 2010-10-15 2013-05-03 Bioaxial FLUORESCENCE SUPERRESOLUTION MICROSCOPY SYSTEM AND METHOD FOR BIOLOGICAL APPLICATIONS
DE102010044013A1 (en) * 2010-11-16 2012-05-16 Carl Zeiss Microimaging Gmbh Depth resolution enhanced microscopy
KR101669214B1 (en) 2010-12-31 2016-10-25 삼성전자주식회사 Scanning lens apparatus adopting bimorph actuator
DE102011005432A1 (en) 2011-03-11 2012-09-13 Hellma Gmbh & Co. Kg Device for the analysis of a small amount of liquid
WO2012154333A1 (en) * 2011-04-07 2012-11-15 The Uwm Research Foundation, Inc. High speed microscope with spectral resolution
DE102011007751B4 (en) 2011-04-20 2023-10-19 Carl Zeiss Microscopy Gmbh Wide-field microscope and method for wide-field microscopy
EP2747641A4 (en) 2011-08-26 2015-04-01 Kineticor Inc Methods, systems, and devices for intra-scan motion correction
DE102011053232B4 (en) * 2011-09-02 2020-08-06 Leica Microsystems Cms Gmbh Microscopic device and microscopic method for the three-dimensional localization of punctiform objects
CN103033129B (en) * 2011-10-07 2015-10-21 财团法人工业技术研究院 Optical apparatus and optical addressing method
DE102011087770A1 (en) 2011-12-05 2013-06-27 Technische Universität Braunschweig High-resolution microscope
DE102012201003A1 (en) 2012-01-24 2013-07-25 Carl Zeiss Microscopy Gmbh Microscope and method for high-resolution 3-D fluorescence microscopy
JP6335160B2 (en) 2012-04-13 2018-05-30 バイオアキシアル エスエーエス Optical measuring method and optical measuring device
EP2843973B1 (en) * 2012-04-27 2019-02-06 Sony Corporation Information processing device, information processing method, and program
US9036603B2 (en) * 2012-08-03 2015-05-19 Intel Corporation Network assistance for device-to-device discovery
US20140064147A1 (en) * 2012-08-29 2014-03-06 Qualcomm Incorporated Methods and apparatus for wan enabled peer discovery
JP2014115151A (en) * 2012-12-07 2014-06-26 Shimadzu Corp Optical imaging device
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
EP2950714A4 (en) 2013-02-01 2017-08-16 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging
US9435993B2 (en) 2013-03-24 2016-09-06 Bruker Nano, Inc. Three dimensional microscopy imaging
US9532032B2 (en) * 2013-04-18 2016-12-27 Ellis Amalgamated, LLC Astigmatic depth from defocus imaging using intermediate images and a merit function map
DE102013208415B4 (en) 2013-05-07 2023-12-28 Carl Zeiss Microscopy Gmbh Microscope and method for 3D high-resolution localization microscopy
DE102013208926A1 (en) * 2013-05-14 2014-11-20 Carl Zeiss Microscopy Gmbh Method for 3D high-resolution localization microscopy
DE102013106895B4 (en) 2013-07-01 2015-09-17 Leica Microsystems Cms Gmbh Light microscopic method for the localization of point objects
CN107655812A (en) * 2013-12-18 2018-02-02 香港科技大学 Method, system and the prismatic light chip device of deep layer cells super-resolution imaging
CN106572810A (en) 2014-03-24 2017-04-19 凯内蒂科尔股份有限公司 Systems, methods, and devices for removing prospective motion correction from medical imaging scans
CN106714681A (en) 2014-07-23 2017-05-24 凯内蒂科尔股份有限公司 Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10921255B2 (en) 2014-12-09 2021-02-16 Bioaxial Sas Optical measuring device and process
CN104568877A (en) * 2014-12-25 2015-04-29 中国科学院苏州生物医学工程技术研究所 Stochastic optical reconstruction microscopy system and method based on LED light sources
JP6635052B2 (en) * 2015-02-05 2020-01-22 株式会社ニコン Structured illumination microscope and observation method
US10187626B2 (en) * 2015-04-10 2019-01-22 The Board Of Trustees Of The Leland Stanford Junior University Apparatuses and methods for three-dimensional imaging of an object
WO2016178856A1 (en) 2015-05-01 2016-11-10 The Board Of Regents Of The University Of Texas System Uniform and scalable light-sheets generated by extended focusing
US10015481B2 (en) * 2015-05-05 2018-07-03 Goodrich Corporation Multi-axis center of mass balancing system for an optical gimbal assembly guided by inertial measurement
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US10716515B2 (en) 2015-11-23 2020-07-21 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
DE102015121920A1 (en) * 2015-12-16 2017-06-22 Carl Zeiss Microscopy Gmbh High-resolution short-term microscopy method and high-resolution short-term microscope
US10663750B2 (en) 2016-03-15 2020-05-26 The Regents Of The University Of Colorado, A Body Super-resolution imaging of extended objects
WO2017180680A1 (en) 2016-04-12 2017-10-19 The Board Of Regents Of The University Of Texas System LIGHT-SHEET MICROSCOPE WITH PARALLELIZED 3D lMAGE ACQUISITION
US20170366965A1 (en) * 2016-06-21 2017-12-21 Chiun Mai Communication Systems, Inc. Communication device, communication system and method therefor
DE102016119262B4 (en) * 2016-10-10 2018-06-07 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. A method for spatially high-resolution determination of the location of a singled, excitable light for the emission of luminescent light molecule in a sample
DE102017211031A1 (en) * 2016-11-21 2018-05-24 Carl Zeiss Microscopy Gmbh Method and microscope for determining a fluorescence intensity
CA3071685C (en) * 2017-08-09 2023-11-21 Allen Institute Systems, devices, and methods for image processing to generate an image having predictive tagging
DE102017129519B4 (en) * 2017-12-12 2020-08-06 Technische Universität Ilmenau Arrangement and method for the simultaneous measurement of the fluorescence of individual layers in a layer system, for example the fundus
DE102018105308A1 (en) * 2018-03-08 2019-09-12 Carl Zeiss Microscopy Gmbh Microscope and method for microscopy of a sample for displaying images with extended depth of field or three-dimensional images
CN108507986A (en) * 2018-03-17 2018-09-07 杨佳苗 The discrete fluorescence spectrum of differential confocal and fluorescence lifetime detection method and device
US11209367B2 (en) * 2018-08-27 2021-12-28 Yale University Multi-color imaging using salvaged fluorescence
CN110231320B (en) * 2019-06-05 2021-06-22 复旦大学 Sub-millisecond real-time three-dimensional super-resolution microscopic imaging system
EP3885813A1 (en) * 2020-03-27 2021-09-29 Leica Microsystems CMS GmbH Method and device for estimating a sted resolution
US11635607B2 (en) 2020-05-18 2023-04-25 Northwestern University Spectroscopic single-molecule localization microscopy
DE102020116547A1 (en) 2020-06-23 2021-12-23 Abberior Instruments Gmbh Method for localizing individual molecules of a dye in a sample and for generating high-resolution images of a structure in a sample

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020064789A1 (en) * 2000-08-24 2002-05-30 Shimon Weiss Ultrahigh resolution multicolor colocalization of single fluorescent probes
US20020076200A1 (en) * 2000-10-23 2002-06-20 Takuro Hamaguchi Host system, driving apparatus, information recording and reading method for the host system, and information recording and reading method for the driving apparatus
US20020101593A1 (en) * 2000-04-28 2002-08-01 Massachusetts Institute Of Technology Methods and systems using field-based light scattering spectroscopy
US6804385B2 (en) * 2000-10-24 2004-10-12 Oncosis Method and device for selectively targeting cells within a three-dimensional specimen

Family Cites Families (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3572612D1 (en) 1984-11-23 1989-10-05 Basf Ag Process for producing cuts in biological material
US4621911A (en) 1985-03-12 1986-11-11 Carnegie-Mellon University Standing wave luminescence microscopy
SE465009B (en) 1989-11-07 1991-07-15 Skaanemejerier Ek Foer FOOD PRODUCTS WITH LOW FAT CONTENT AND PROCEDURES FOR PREPARING THEREOF
SE9001235L (en) 1990-04-04 1991-10-05 Faerg Ab Nv PROCEDURES AND EQUIPMENT BEFORE SHIPPING
DE4040441A1 (en) 1990-12-18 1992-07-02 Hell Stefan DOUBLE CONFOCAL GRID MICROSCOPE
DE4216949C2 (en) 1992-05-22 1997-07-24 Christoph Prof Dr Dr Cremer Non-enzymatic method for in situ hybridization on specific samples
US6005916A (en) 1992-10-14 1999-12-21 Techniscan, Inc. Apparatus and method for imaging with wavefields using inverse scattering techniques
WO1996006003A1 (en) 1993-06-04 1996-02-29 Mats Gustafsson A floating platform stabilizing arrangement
ATE204086T1 (en) 1994-02-01 2001-08-15 Hell Stefan DEVICE AND METHOD FOR OPTICALLY MEASURING A SAMPLE POINT OF A SAMPLE WITH HIGH SPATIAL RESOLUTION
DE4414940C2 (en) 1994-04-28 1998-07-02 Pekka Haenninen Luminescence scanning microscope with two photons excitation
EP0783428B1 (en) 1994-08-25 1999-05-19 Mats Gustafsson A floating platform stabilizing arrangement
US5671085A (en) 1995-02-03 1997-09-23 The Regents Of The University Of California Method and apparatus for three-dimensional microscopy with enhanced depth resolution
US5874726A (en) 1995-10-10 1999-02-23 Iowa State University Research Foundation Probe-type near-field confocal having feedback for adjusting probe distance
DE19601488C1 (en) 1996-01-17 1997-05-28 Itt Ind Gmbh Deutsche Measuring device manufacturing method for measuring or testing physiological parameter at biocomponent
DE19610255B4 (en) 1996-03-15 2004-11-04 Universität Heidelberg Process for the preparation of nucleic acid sequences and process for the detection of translocations between chromosomes
DE19653413C2 (en) 1996-12-22 2002-02-07 Stefan Hell Scanning microscope, in which a sample is simultaneously optically excited in several sample points
AU5748598A (en) 1996-12-23 1998-07-17 Ruprecht-Karls-Universitat Heidelberg Method and devices for measuring distances between object structures
AU6718498A (en) 1997-02-22 1998-09-09 Universitat Heidelberg Marking of nucleic acids with special probe mixtures
US6650357B1 (en) 1997-04-09 2003-11-18 Richardson Technologies, Inc. Color translating UV microscope
DE19830596B4 (en) 1997-07-10 2008-08-21 Ruprecht-Karls-Universität Heidelberg Wave field microscope, wave field microscopy method, also for DNA sequencing, and calibration method for wave field microscopy
US5851052A (en) 1997-10-22 1998-12-22 Gustafsson; Mats Foldable stool
US6337472B1 (en) 1998-10-19 2002-01-08 The University Of Texas System Board Of Regents Light imaging microscope having spatially resolved images
AT410718B (en) 1998-10-28 2003-07-25 Schindler Hansgeorg Dr DEVICE FOR VISUALIZING MOLECULES
SE9901302L (en) 1998-12-01 2000-06-02 Ericsson Telefon Ab L M Method and device in a communication network
WO2000045153A1 (en) 1999-01-29 2000-08-03 June Iris Medford Optical coherence microscope and methods of use for rapid in vivo three-dimensional visualization of biological function
DE19908883A1 (en) 1999-03-02 2000-09-07 Rainer Heintzmann Process for increasing the resolution of optical imaging
SE516598C2 (en) 2000-06-22 2002-02-05 Mats Gustavsson Bearing device for primarily fly fishing reels with threaded outer bearing ring
SE0002587D0 (en) 2000-07-07 2000-07-07 Ericsson Telefon Ab L M Rake receiver and method related to a rake receiver
US7973936B2 (en) 2001-01-30 2011-07-05 Board Of Trustees Of Michigan State University Control system and apparatus for use with ultra-fast laser
DE10118355B4 (en) 2001-04-12 2005-07-21 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method and apparatus for multiphoton excitation of a sample
EP2420822B1 (en) 2001-06-29 2020-09-02 Universite Libre De Bruxelles Device intended for obtaining three-dimensional images of a sample by microscopy
US7105795B2 (en) 2001-07-06 2006-09-12 Palantyr Research, Llc Imaging system, methodology, and applications employing reciprocal space optical design
US7151246B2 (en) 2001-07-06 2006-12-19 Palantyr Research, Llc Imaging system and methodology
US6909150B2 (en) * 2001-07-23 2005-06-21 Agere Systems Inc. Mixed signal integrated circuit with improved isolation
EP1436597B1 (en) 2001-10-09 2008-07-02 Ruprecht-Karls-Universität Heidelberg Far yield light microscopical method and system for analysing at least one object having a subwavelength size
DE10154699B4 (en) 2001-11-09 2004-04-08 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method and device for stimulating an optical transition in a spatially limited manner
US20040133112A1 (en) 2002-03-08 2004-07-08 Milind Rajadhyaksha System and method for macroscopic and confocal imaging of tissue
SE521296C2 (en) 2002-04-29 2003-10-21 Totalfoersvarets Forskningsins Methods to detect a signal in the presence of additive Gaussian noise and a detector utilizing the method
US6934079B2 (en) 2002-05-03 2005-08-23 Max-Planck-Gesellschaft zur Förderung der Wissen-schaften e. V. Confocal microscope comprising two microlens arrays and a pinhole diaphragm array
EP1359452B1 (en) 2002-05-03 2006-05-03 Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. Confocal microscope having two micro-lens arrays and a pinhole array
US7154598B2 (en) 2002-07-12 2006-12-26 Decision Biomarkers, Inc. Excitation and imaging of fluorescent arrays
US7430045B2 (en) 2003-04-13 2008-09-30 Max-Planck-Gesellschaft Zur Forderung Der Wissenschaften E.V. High spatial resolution imaging
US7064824B2 (en) 2003-04-13 2006-06-20 Max-Planck-Gesellschaft Zur Forderung Der Wissenschaften E.V. High spatial resoulution imaging and modification of structures
US7539115B2 (en) 2003-04-13 2009-05-26 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Creating a permanent structure with high spatial resolution
JP2006522989A (en) 2003-04-13 2006-10-05 マックス−プランク−ゲゼルシャフト・ツーア・フェルデルング・デア・ヴィセンシャフテン・エー.ファウ. Production of constant structures with high spatial resolution
WO2004090617A2 (en) 2003-04-13 2004-10-21 Max-Planck-Gesselschaft Zur Förderung Der Wissenschaften E.V. High three-dimensional resolution representation
SE525789C2 (en) 2003-07-17 2005-04-26 Delaval Holding Ab Method and apparatus for indicating a state of health of a dairy animal
US7372985B2 (en) 2003-08-15 2008-05-13 Massachusetts Institute Of Technology Systems and methods for volumetric tissue scanning microscopy
SE0302676D0 (en) 2003-10-09 2003-10-09 Neural Ab Method and apparatus for holographic refractometry
US7256894B2 (en) 2003-10-20 2007-08-14 The Regents Of The University Of California Method and apparatus for performing second harmonic optical coherence tomography
WO2005072399A2 (en) 2004-01-29 2005-08-11 Massachusetts Institute Of Technology Microscale sorting cytometer
WO2005091970A2 (en) * 2004-03-06 2005-10-06 Michael Trainer Methods and apparatus for determining the size and shape of particles
EP1582858A1 (en) 2004-03-29 2005-10-05 Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. Method to excite molecules from a first state to a second state with an optical signal
CN101065881B (en) 2004-05-21 2012-05-16 艾利森电话股份有限公司 Broadband array antennas using complementary antenna
DE102004034984A1 (en) 2004-07-16 2006-02-02 Carl Zeiss Jena Gmbh Method for acquiring images of a sample with a light scanning microscope with punctiform light source distribution
DE102004034973A1 (en) 2004-07-16 2006-02-16 Carl Zeiss Jena Gmbh Method for acquiring images of a sample with a light scanning microscope
DE102004034993A1 (en) 2004-07-16 2006-02-02 Carl Zeiss Jena Gmbh Scanning microscope with linear scanning and use
US7253408B2 (en) 2004-08-31 2007-08-07 West Paul E Environmental cell for a scanning probe microscope
JP4605447B2 (en) 2004-09-17 2011-01-05 横河電機株式会社 3D confocal microscope system
EP1824379B1 (en) 2004-12-08 2017-04-12 The General Hospital Corporation System and method for normalized fluorescence or bioluminescence imaging
US20060171846A1 (en) 2005-01-10 2006-08-03 Marr David W M Microfluidic systems incorporating integrated optical waveguides
BRPI0606807A2 (en) 2005-01-31 2009-07-14 Univ Illinois device for analyzing particles in a sample and method for analyzing particles in a sample containing such particles
WO2006091162A1 (en) 2005-02-28 2006-08-31 Telefonaktiebolaget Lm Ericsson (Publ) Method and arrangement for reducing the radar cross section of integrated antennas
DE102005012739B4 (en) 2005-03-19 2010-09-16 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method for producing spatial fine structures
DE102005013969A1 (en) 2005-03-26 2006-10-05 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method for the microscopic examination of a spatial fine structure
US9599611B2 (en) * 2005-04-25 2017-03-21 Trustees Of Boston University Structured substrates for optical surface profiling
DE102005020003B4 (en) 2005-04-27 2007-10-11 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. fluorescence microscope
SE528838C2 (en) 2005-04-29 2007-02-27 Delaval Holding Ab Detection method and arrangement for dairy cattle
GB2416261A (en) 2005-05-21 2006-01-18 Zeiss Carl Jena Gmbh Laser scanning microscope with parallel illumination and simultaneous, locally resolved detection
JP4709278B2 (en) 2005-05-23 2011-06-22 エフ. ヘスス ハラルド Optical microscopy using optically convertible optical labels
DE102005040671B4 (en) 2005-08-26 2008-04-30 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method and apparatus for storing a three-dimensional array of data bits in a solid
JP5010867B2 (en) 2005-09-22 2012-08-29 オリンパス株式会社 Culture microscope equipment
WO2007038260A2 (en) 2005-09-23 2007-04-05 Massachusetts Institute Of Technology Systems and methods for force-fluorescence microscopy
SE529453C2 (en) 2005-12-02 2007-08-14 Tetra Laval Holdings & Finance Method for detecting leaks in a heat exchanger
US7887803B2 (en) * 2005-12-02 2011-02-15 Amorfix Life Sciences Methods and compositions to treat misfolded-SOD1 mediated diseases
US7855690B2 (en) 2005-12-23 2010-12-21 Telefonaktiebolaget L M Ericsson (Publ) Array antenna with enhanced scanning
ATE449190T1 (en) 2006-03-25 2009-12-15 Univ Ruprecht Karls Heidelberg METHOD FOR MICROSCOPIC DETERMINING THE LOCATION OF A SELECTED, INTRACELLULAR DNA SECTION OF KNOWN NUCLEOTIDE SEQUENCE
US7916304B2 (en) 2006-12-21 2011-03-29 Howard Hughes Medical Institute Systems and methods for 3-dimensional interferometric microscopy
WO2009115108A1 (en) 2008-03-19 2009-09-24 Ruprecht-Karls-Universität Heidelberg A method and an apparatus for localization of single dye molecules in the fluorescent microscopy

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020101593A1 (en) * 2000-04-28 2002-08-01 Massachusetts Institute Of Technology Methods and systems using field-based light scattering spectroscopy
US20020064789A1 (en) * 2000-08-24 2002-05-30 Shimon Weiss Ultrahigh resolution multicolor colocalization of single fluorescent probes
US20020076200A1 (en) * 2000-10-23 2002-06-20 Takuro Hamaguchi Host system, driving apparatus, information recording and reading method for the host system, and information recording and reading method for the driving apparatus
US6804385B2 (en) * 2000-10-24 2004-10-12 Oncosis Method and device for selectively targeting cells within a three-dimensional specimen

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2265932A4 *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2516993A1 (en) * 2009-12-22 2012-10-31 Carl Zeiss Microscopy GmbH High-resolution microscope and method for determining the two- or three-dimensional positions of objects
US9933363B2 (en) 2009-12-23 2018-04-03 Ge Healthcare Bio-Sciences Corp. System and method for dense-stochastic-sampling imaging
US8830314B2 (en) 2009-12-23 2014-09-09 Ge Healthcare Bio-Sciences Corp. System and method for dense-stochastic-sampling imaging
CN102656442A (en) * 2009-12-23 2012-09-05 应用精密公司 System and method for dense-stochastic-sampling imaging
DE102010049751A1 (en) 2010-10-29 2012-05-03 "Stiftung Caesar" (Center Of Advanced European Studies And Research) Optical beam splitter, particularly for use in beam path of light microscope, for displaying multiple focal planes of object on optical detector, has monolithic base module comprising beam splitter module
US9103784B1 (en) 2012-11-16 2015-08-11 Iowa State University Research Foundation, Inc. Fluorescence axial localization with nanometer accuracy and precision
TWI480536B (en) * 2014-05-20 2015-04-11 Univ Nat Taiwan System for analyzing fluorescence intensity and synthesizing fluorescence image and method thereof
DE102015004104A1 (en) 2015-03-27 2016-09-29 Laser-Laboratorium Göttingen e.V. Method for locating at least one emitter by means of a localization microscope
DE102015004104B4 (en) * 2015-03-27 2020-09-03 Laser-Laboratorium Göttingen e.V. Method for localizing at least one emitter by means of a localization microscope
US10371932B2 (en) 2015-12-09 2019-08-06 Carl Zeiss Microscopy Gmbh Light field imaging with scanning optical unit
DE102015121403A1 (en) * 2015-12-09 2017-06-14 Carl Zeiss Microscopy Gmbh LIGHT FIELD IMAGING WITH SCANOPTICS
US10783697B2 (en) 2016-02-26 2020-09-22 Yale University Systems, methods, and computer-readable media for ultra-high resolution 3D imaging of whole cells
DE102016116620B3 (en) * 2016-09-06 2017-11-02 Stiftung Caesar Center Of Advanced European Studies And Research Beam guidance unit and system of beam guidance units and their use
DE102016119263B4 (en) 2016-10-10 2018-06-07 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. A method for spatially high-resolution determination of the location of a singled, excitable light for the emission of luminescent light molecule in a sample
DE102016119263A1 (en) * 2016-10-10 2018-04-12 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. A method for spatially high-resolution determination of the location of a singled, excitable light for the emission of luminescent light molecule in a sample

Also Published As

Publication number Publication date
US20110025831A1 (en) 2011-02-03
EP2631633A1 (en) 2013-08-28
US20090242798A1 (en) 2009-10-01
EP2265932A4 (en) 2011-12-07
US7772569B2 (en) 2010-08-10
US7880149B2 (en) 2011-02-01
EP2631632A1 (en) 2013-08-28
EP2265932B1 (en) 2013-05-22
US20100265318A1 (en) 2010-10-21
EP2265932A1 (en) 2010-12-29

Similar Documents

Publication Publication Date Title
EP2265932B1 (en) 3d biplane microscopy
US10944896B2 (en) Single-frame autofocusing using multi-LED illumination
US10281702B2 (en) Multi-focal structured illumination microscopy systems and methods
JP6934856B2 (en) Optical sheet microscope that simultaneously images multiple target surfaces
US8217992B2 (en) Microscopic imaging techniques
US8994807B2 (en) Microscopy system and method for creating three dimensional images using probe molecules
JP6637653B2 (en) Microscope and SPIM microscopy method
US20150098126A1 (en) Multiview Light-Sheet Microscopy
US20120287244A1 (en) Non-coherent light microscopy
US20140071452A1 (en) Fluid channels for computational imaging in optofluidic microscopes
Murray Methods for imaging thick specimens: confocal microscopy, deconvolution, and structured illumination
Dobbie et al. OMX: A new platform for multi-modal, multi-channel widefield imaging
JP2022516467A (en) Two-dimensional fluorescence wave propagation system and method to the surface using deep learning
Chen et al. Superresolution structured illumination microscopy reconstruction algorithms: a review
CN110023813B (en) Multi-focal structured illumination microscopy systems and methods
US20140022373A1 (en) Correlative drift correction
WO2013176549A1 (en) Optical apparatus for multiple points of view three-dimensional microscopy and method
US11287627B2 (en) Multi-focal light-sheet structured illumination fluorescence microscopy system
US20230221541A1 (en) Systems and methods for multiview super-resolution microscopy
JP2021184264A (en) Image processing apparatus, microscope system, image processing method, and program
Clark et al. Nonscanning three-dimensional differential holographic fluorescence microscopy
CN117369106B (en) Multi-point confocal image scanning microscope and imaging method
US20220252856A1 (en) Method and device for determining the optimal position of the focal plane for examining a specimen by microscopy
Peng et al. Depth resolution enhancement using light field light sheet fluorescence microscopy
WO2023049164A1 (en) Multiscale multiview light-sheet imaging

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09755379

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 12936095

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2009755379

Country of ref document: EP