WO2001037195A2 - Graphical user interface for in-vivo imaging - Google Patents

Graphical user interface for in-vivo imaging

Info

Publication number
WO2001037195A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
user
interest
user interface
region
Prior art date
Application number
PCT/US2000/031482
Other languages
French (fr)
Other versions
WO2001037195A3 (en)
Inventor
Michael D. Cable
Original Assignee
Xenogen Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xenogen Corporation filed Critical Xenogen Corporation
Priority to AU17691/01A priority Critical patent/AU1769101A/en
Publication of WO2001037195A2 publication Critical patent/WO2001037195A2/en
Publication of WO2001037195A3 publication Critical patent/WO2001037195A3/en


Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/743 Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/742 Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B5/7435 Displaying user selection data, e.g. icons in a graphical user interface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B2503/40 Animals
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00 Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02 Operational features
    • A61B2560/0223 Operational features of calibration, e.g. protocols for calibrating sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20092 Interactive image processing based on input by user
    • G06T2207/20104 Interactive definition of region of interest [ROI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00 Indexing scheme for image generation or computer graphics
    • G06T2210/41 Medical

Definitions

  • the present invention relates generally to user interface software running on computers or computer systems. More specifically, the invention relates to user interface systems and methods used in examining and analyzing images.
  • Graphical user interfaces (GUIs) on computer systems allow easy use of windows, control icons, etc. to display information to the user.
  • the data displayed in a window may be of different types. Some may be graphical, such as icons or pictures, or textual, such as a word processing document, or a combination of both.
  • the interface may include various data-specific tools and functions.
  • an application might desirably present one or more windows for viewing the image, a tool for changing the image's appearance (e.g., sharpness), and a tool to measure features of one or more images.
  • Interfaces for available applications typically require that the user first select or open various windows, menus, buttons, and/or tiles and then manipulate the resulting tool to implement a single operation pertinent to image analysis. Because the user may be required to perform numerous operations for a single image, or handle numerous images simultaneously, the available user interfaces are generally very awkward or unwieldy. Obviously this compromises user efficiency and effectiveness in evaluating images.
  • the image may include one or more representations of emissions from internal portions of a specimen superimposed on a photographic representation of the specimen.
  • the photographic representation provides the user with a pictorial reference of the specimen.
  • the luminescence representation indicates portions of the specimen where an activity of interest may be taking place.
  • the in-vivo data may include light emissions from specific regions of the specimen used in tracking the progression of a tumor or a pathogen within the specimen.
  • the present invention addresses this need by providing a computer user interface having a window or other feature that provides tools allowing the user to quickly define a perimeter around a "region of interest" on the image and then measure a property of the image within the region of interest.
  • the region of interest may be bounded by an ellipse, rectangle, or other shape selected and sized by the user.
  • both the image and the tool for generating the region of interest reside on the same window or other interface feature.
  • a region of interest can be generated with one or two user interface actions (e.g., clicking on a button and then dragging a perimeter to an appropriate location on the image to specify the region of interest).
  • the property measured within the region of interest may be an average or total pixel value within the region of interest.
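The measurement described above (a total or average pixel value within a user-drawn elliptical region of interest) can be sketched as follows. The function and parameter names are illustrative assumptions; the patent does not prescribe an implementation:

```python
import numpy as np

def measure_roi_ellipse(image, cx, cy, rx, ry):
    """Total and average pixel value inside an elliptical region of interest.

    `image` is a 2-D array of pixel values; (cx, cy) is the ellipse
    centre and (rx, ry) its radii, in pixel units.
    """
    ys, xs = np.indices(image.shape)
    # A pixel lies inside the ROI if it satisfies the ellipse inequality.
    inside = ((xs - cx) / rx) ** 2 + ((ys - cy) / ry) ** 2 <= 1.0
    return float(image[inside].sum()), float(image[inside].mean())

# On a uniform image every ROI has the same average value.
img = np.full((64, 64), 2.0)
total, mean = measure_roi_ellipse(img, cx=32, cy=32, rx=10, ry=6)
```

A rectangle or grid tool would differ only in the mask construction; the sum-and-mean step is the same.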
  • a computer system is provided with an image measurement window, which allows the user to perform certain operations that are particularly useful for presenting and analyzing an image.
  • the computer system includes a graphical user interface having a measurement window that provides both the image itself and one or more tools for defining a region of interest on the image.
  • the computer system can calculate information about a portion of the image within the defined region of interest.
  • the measurement window may also include display controls for controlling at least one of the following features of the displayed image: threshold, brightness, contrast, and sharpness.
  • the one or more tools for defining the region of interest allow the user to graphically create a rectangle on the image, an ellipse on the image, and/or a grid on the image. At least one of these tools may be provided as a button which, when selected, causes a region of interest to appear on the displayed image. After the region of interest is created on the image, the user can move and/or reshape the region of interest by the action of the pointer.
  • the present invention may provide a date stamped electronic notebook in conjunction with the image measurement window.
  • the electronic notebook may display image analysis data (typically text pertaining to the image) such as measurement results, experimental parameters, user notes, and the like.
  • the computer system may automatically display and date stamp image measurement results obtained via the user interface.
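Automatic date stamping of results in the electronic notebook could look like the following minimal sketch; the line format and all names are assumptions for illustration:

```python
import datetime

def notebook_entry(results, when=None, notes=""):
    """Format one date-stamped electronic-notebook line for measurement results.

    `results` maps measurement names to values; `when` defaults to the
    current time, as an automatic date stamp would.
    """
    when = when or datetime.datetime.now()
    body = ", ".join(f"{name}={value}" for name, value in results.items())
    line = f"[{when:%Y-%m-%d %H:%M}] {body}"
    return f"{line} ({notes})" if notes else line

entry = notebook_entry({"ROI-1 total counts": 15230},
                       when=datetime.datetime(2000, 11, 16, 9, 30),
                       notes="mouse 2, dorsal view")
```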
  • the present invention also relates to computer interfaces to assist data management. More specifically, in accordance with one embodiment of the present invention, a computer system is provided with one or more data management windows, which allow the user to perform certain operations that are useful for browsing, presenting and analyzing previously stored imaging data.
  • a user interface for presenting and analyzing an image including a photographic representation of an object and a luminescence representation of the object.
  • the luminescence representation presents the location and magnitude of radiation emitted from the object.
  • the user interface may be characterized by the following features: (1) a first display control permitting a user to manipulate the visual presentation of at least one of the luminescence representation and the photographic representation; (2) a second display control permitting the user to create at least one region of interest on the luminescence representation; and (3) a third display control permitting the user to make a measurement of a portion of the luminescence representation bounded by the at least one region of interest.
  • Other display controls of the user interface may include a fourth display control that permits the user to select which of the photographic representation and the luminescence representation is to be displayed.
  • An optional fifth display control allows the user to print some portion or all of the image.
  • Yet another aspect of the present invention relates to a method implemented on a computer system.
  • the method includes analyzing a region of interest on an image presented on a display associated with the computer system. This includes defining a region of interest on the image when the user has selected a region of interest tool from a user interface presented on the display. Note that the region of interest tool and the image are concurrently displayed on the display.
  • the method further includes calculating a property of the image within the region of interest.
  • Embodiments of the present invention further relate to a computer readable medium including instructions for applying the above mentioned interfaces and methods.
  • FIG. 1 illustrates an imaging apparatus suitable for capturing photographic and luminescence images in accordance with one embodiment of the present invention.
  • FIG. 2 is a flowchart illustrating a method of capturing photographic and luminescence images using the imaging apparatus of FIG. 1, for example, in accordance with one embodiment of the present invention.
  • FIG. 3A is an illustration showing a graphical user interface having an overlay of a photographic representation and a luminescence representation of a specimen as well as various image manipulation, analysis and measurement tools.
  • FIG. 3B illustrates an electronic notebook page suitable for storing raw data and non-analysis information in accordance with one embodiment of the present invention.
  • FIGs. 3C-E illustrate the automatic storage of analysis data and measurement results into an electronic notebook page in accordance with another embodiment of the present invention.
  • FIG. 3F illustrates another example of an image control/measurement window in accordance with the present invention.
  • FIG. 4 is a flowchart illustrating a method of making measurements using the GUI of FIG. 3 in accordance with one embodiment of the present invention.
  • FIG. 5A illustrates an exemplary file structure for data management in accordance with one embodiment of the present invention.
  • FIG. 5B illustrates a browser control window in accordance with one embodiment of the present invention.
  • FIG. 5C illustrates an empty data file in accordance with a specific embodiment of the present invention.
  • FIG. 5D illustrates a search label name editing window in accordance with a specific embodiment of the present invention.
  • FIG. 5E illustrates an exemplary table resulting from a search using the browser control window of FIG. 5B in accordance with a specific embodiment of the present invention.
  • FIG. 5F illustrates a sorting priority window in accordance with a specific embodiment of the present invention.
  • FIG. 6A illustrates an image capture graphical user interface suitable for controlling the imaging apparatus of FIG. 1 in accordance with one embodiment of the present invention.
  • FIG. 6B illustrates another image capture graphical user interface suitable for controlling the imaging apparatus of FIG. 1 in accordance with another embodiment of the present invention.
  • FIGS. 7A and 7B illustrate a computer system suitable for implementing embodiments of the present invention.
  • Using the graphical user interface (GUI), the user may create and manipulate analysis tools and perform a wide variety of measurements on complex images (such as in-vivo images) conveniently and efficiently.
  • the present invention may allow the user to store measurement results in a dated electronic notebook, display testing information, manipulate the image presentation and print while maintaining view of the image. For management of stored information corresponding to multiple images, one or more GUIs are provided which simplify imaging data management.
  • One preferred embodiment of this invention pertains to graphical user interfaces for presenting and analyzing "overlay" or "composite" images including a photographic image on which is overlaid an "emissions" image.
  • the photographic and luminescence images are taken of the same object.
  • the object is a biological specimen.
  • the luminescence image is taken without using light sources other than the object itself. Luminescence from the object is recorded as a function of position to produce the luminescence image.
  • Fig. 1 illustrates an imaging system 10 configured to capture photographic and luminescence images in accordance with one embodiment of the present invention.
  • the imaging system 10 may be used for imaging a low intensity light source, such as luminescence from luciferase-expressing cells, fluorescence from fluorescing molecules, and the like.
  • the low intensity light source may be emitted from any of a variety of light-emitting samples which may include, for example, tissue culture plates, multi-well plates (including 96, 384, 864 and 1536 well plates), and animals or plants containing light-emitting molecules, such as various mammalian subjects such as mice containing luciferase expressing cells.
  • the imaging system 10 comprises an imaging box 12 adapted to receive a light-emitting sample in which low intensity light, e.g., luciferase-based luminescence, is to be detected.
  • the imaging box 12 includes an upper housing 16 in which a camera lens is mounted.
  • An intensified or a charge-coupled device (CCD) camera 20 is optically engaged with, and positioned above, the camera lens.
  • the CCD camera 20 is capable of capturing luminescent and photographic (i.e., reflection-based) images of the sample within the imaging box 12.
  • the CCD camera 20 is cooled by a suitable source such as a refrigeration device 22 that cycles a cryogenic fluid through the CCD camera via conduits 24.
  • a suitable refrigeration device is the "CRYOTIGER" compressor, which can be obtained from IGC-APD Cryogenics Inc., Allentown, PA.
  • An image processing apparatus 26 interfaces between CCD camera 20 and a computer 28 through cables 30 and 32 respectively.
  • the computer 28, which may be of any suitable type, typically comprises a main unit 36 that contains hardware including a processor, memory components such as random-access memory (RAM) and read-only memory (ROM), and disk drive components (e.g., hard drive, CD, floppy drive, etc.).
  • the computer 28 also includes a display 38 and input devices such as a keyboard 40 and mouse 42.
  • the computer 28 is in communication with various components in the imaging box 12 via cable 34. To provide communication and control for these components, the computer 28 includes suitable processing hardware and software configured to provide output for controlling any of the devices in the imaging box 12.
  • the processing hardware and software may include an I/O card, control logic for controlling any of the components of the imaging system 10, and a suitable graphical user interface for the imaging system 10.
  • the computer 28 also includes suitable processing hardware and software for the camera 20 such as additional imaging hardware, software, and image processing logic for processing information obtained by the camera 20.
  • Components controlled by the computer 28 may include the camera 20, the motors responsible for camera 20 focus, the motors responsible for position control of a platform supporting the sample, the camera lens, f-stop, etc.
  • the logic in computer 28 may take the form of software, hardware or a combination thereof.
  • the computer 28 also communicates with a display 38 for presenting imaging information to the user.
  • the display 38 may be a monitor, which presents an image measurement graphical user interface (GUI) that allows the user to view imaging results and also acts as an interface to control the imaging system 10, as will be discussed in further detail below.
  • FIG. 2 is a flowchart illustrating a method of capturing photographic and luminescence images using the imaging apparatus of FIG. 1 in accordance with one embodiment of the present invention.
  • a process flow 200 begins with placing the light-emitting sample to be imaged in the imaging device (202).
  • the imaging system 10 is then prepared for photographic capture of the light-emitting sample (204).
  • the preparation may include turning on the lights in the imaging box 12, focusing the CCD camera 20, positioning the light-emitting sample, etc.
  • the photographic image is captured (206).
  • a 'live mode' is used in photographic capture of the light-emitting sample.
  • the live mode includes a sequence of photographic images taken frequently enough to simulate live video.
  • the photographic image data is transferred to processing apparatus 26 (208).
  • the processing apparatus may manipulate and store the photographic image data as well as present it on the display 38.
  • the imaging system 10 is prepared for capturing a luminescence image (210).
  • the preparation may include turning off the lights in the imaging box 12, for example.
  • the CCD camera 20 captures the luminescence image.
  • the luminescence image data is transferred to the processing apparatus 26 (212).
  • the processing apparatus may store and manipulate the luminescence image data as well as present it on the display 38 (214).
  • the manipulation may also include overlaying the luminescence image with the photographic image and illustrating the two images together. This overlay image may then be the basis for user analysis (216).
  • the user now has the components of a digital overlay image stored in the processing apparatus 26 including the luminescence image and the photographic image.
  • the information contained in the digital overlay image may be analyzed and manipulated as desired.
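The overlay step in the flow above can be sketched as an alpha blend of the luminescence representation over the photograph. The blend weights, thresholding, and normalisation are assumptions for illustration; the patent does not specify the compositing rule:

```python
import numpy as np

def overlay(photo, lum, threshold=0.0, alpha=0.6):
    """Blend a luminescence map over a greyscale photograph.

    Where luminescence is at or below `threshold`, the photograph shows
    through unchanged; brighter pixels are alpha-blended with the
    peak-normalised luminescence signal.
    """
    photo = photo.astype(float)
    lum = lum.astype(float)
    peak = lum.max() if lum.max() > 0 else 1.0
    out = photo.copy()
    hot = lum > threshold
    out[hot] = (1 - alpha) * photo[hot] + alpha * (lum[hot] / peak)
    return out

photo = np.zeros((4, 4))                   # dark photographic frame
lum = np.zeros((4, 4)); lum[1, 2] = 10.0   # a single luminescent spot
composite = overlay(photo, lum)
```

In a real display the blended pixels would be mapped through a colour table rather than left as grey values.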
  • the photographic and luminescence representations provided by the imaging system 10 and imaging interface of the present invention have a wide variety of applications.
  • the luminescence representation indicates the number of times each detector pixel has received a photon over a defined length of time.
  • the luminescence representation may display magnitude values representing the photon counts at the individual detector pixels. Regions of the object emitting radiation (e.g., photons) will appear in the luminescence representation.
  • the luminescence images may indicate the presence of a biocompatible entity, for example.
  • the entity can be a molecule, macromolecule, cell, microorganism, particle or the like.
  • an in-vivo analysis may include detecting localization of a biocompatible entity in a mammalian subject.
  • the information in the live mode may be used to track the localization of the entity over time.
  • FIG. 3A illustrates one example of an image control/measurement window 300 in accordance with this invention.
  • the image control window 300 includes an image measurement window 301. Within the image measurement window 301, an overlay image 302 is displayed.
  • the overlay image 302 includes a visual superposition of a photographic representation of the light-emitting sample and a luminescence representation of the light-emitting sample.
  • the light-emitting sample comprises three mice.
  • the light-emitting sample comprises a high density well plate comprising an array of wells, e.g., 24x16 and 48x32 well plates are common, wherein each well contains a separate biological specimen.
  • the image control window 300 is well suited for manipulating the display of the overlay image 302 as well as making measurements and analyzing the luminescence representation.
  • the photographic representation provides the user with a visual frame of reference of the image.
  • the luminescence representation provides photon emission data derived from the object.
  • the photon emission data may represent the specific pixels on the CCD camera 20 that detect photons over the duration of the live mode image capture period.
  • Since the imaging system 10 is typically used to measure the entire light-emitting sample, the data in the luminescence representation typically has one or more distinct luminescent portions of interest.
  • the luminescence representation illustrated includes luminescent portions 308 and 310 of the left mammalian sample. Alternatively, for a well plate, each portion of interest may correspond to a single well in the plate.
  • Although the image control window 300 displays an overlay image 302 comprised of two separate representations, most data manipulation and analysis of interest is performed on the luminescence representation.
  • an analysis may include a summation of the illumination magnitudes over the pixels within a portion of the luminescence representation. Note that although the discussion will focus on a single luminescence representation for the overlay image 302, the image control window 300 may include multiple luminescence representations taken at different times.
  • image control/measurement window 300 includes a control panel 312.
  • the control panel 312 includes a plurality of user interface control components for facilitating manipulation and analysis of information in the image measurement window 301.
  • the user interface control components may be grouped into functional sections within the control panel 312.
  • the control panel 312 includes a display function section 314, a create function section 316 and a measurement function section 318.
  • Other arrangements, with or without a "control panel" are contemplated.
  • the display function section 314 includes controls for allowing the user to manipulate the presentation of the photographic representation and the luminescence representation.
  • the display function section 314 includes a brightness setting 320.
  • the brightness setting 320 is used for improving the user's visual perception of the photographic representation by allowing adjustment of the photograph's brightness.
  • other controls on a photographic image such as contrast, sharpness, and the like may be provided.
  • the display function section 314 includes an upper luminescence limit 322 and a lower luminescence limit 324.
  • the upper luminescence limit 322 allows the user to designate the maximum data value displayed in the luminescence representation. Any pixels within the luminescence representation having a data value (e.g., a photon count) at or over this upper luminescence limit 322 will be displayed with a color corresponding to the upper luminescence limit 322.
  • the lower luminescence limit 324 allows the user to designate the minimum data value displayed in the luminescence representation. Any pixels within the luminescence representation having a data value below this lower luminescence limit 324 will not be displayed.
  • the upper and lower luminescence limits specify the range of pixel illumination values over which the full range of display colors will vary.
  • the upper luminescence limit 322 and the lower luminescence limit 324 may be useful when the user wants to selectively clear the image of outlying data for a particular analysis.
  • the lower luminescence limit 324 may be useful when the user wants to clear the image of noise.
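The limit behaviour described above can be sketched as a clamp-and-mask operation; names and the NaN convention for "not displayed" are assumptions:

```python
import numpy as np

def apply_display_limits(lum, lower, upper):
    """Map luminescence data to display intensities in [0, 1].

    Pixels at or above `upper` saturate at the colour for the upper
    limit; pixels below `lower` become NaN so the photograph shows
    through undisturbed.
    """
    scaled = np.clip((lum.astype(float) - lower) / (upper - lower), 0.0, 1.0)
    scaled[lum < lower] = np.nan   # not displayed at all
    return scaled

display = apply_display_limits(np.array([[0.0, 50.0, 100.0, 200.0]]),
                               lower=50, upper=100)
```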
  • the display function section 314 also includes a global setting 326.
  • the global setting 326 provides a default option for the presentation of the luminescence representation.
  • the global setting sets the upper luminescence limit 322 and lower luminescence limit 324 to specified values.
  • the upper luminescence limit 322 and lower luminescence limit 324 are set to the 'full range' of values for the luminescence representation.
  • the upper limit is set to the value of the maximum intensity measured for any pixel in the luminescence representation and the lower limit is set to the value of the minimum intensity measured for any pixel in the luminescence representation.
  • another preset option may set the upper luminescence limit 322 and lower luminescence limit 324 to a standardized range of values for the luminescence representation.
  • the standardized range may set the upper luminescence limit 322 at 95% of the maximum photon count for the luminescence representation and the lower luminescence limit 324 at 5% of the maximum photon count.
  • another standardized range may be based on a statistical analysis, such as the standard deviation, of the range of data values for the luminescence representation.
  • the display function section 314 may also include binning control. Binning is an image processing procedure often used to compensate for insufficient signal in individual pixels.
  • the number of pixels in each direction of the luminescence representation may be halved to produce a new pixel array comprising the magnitude of four previous pixels in a single new pixel. Binning in this manner may be useful to obtain a better signal to noise ratio for the luminescence representation, or to improve statistical analysis of the luminescence representation when the amount of data in the luminescent representation is small or otherwise better analyzed when pooled.
  • the display function section 314 may include a user designated binning factor that specifies the amount of binning in each direction of the luminescence representation.
  • the number of pixels in each direction of the luminescence representation may be reduced by a factor of 5 to produce a new pixel array comprising the magnitude of 25 previous pixels in a single new pixel.
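The binning described above (a factor of 2 pools four pixels into one superpixel, a factor of 5 pools 25) can be sketched with a reshape-and-sum; dimensions are assumed divisible by the factor:

```python
import numpy as np

def bin_pixels(lum, factor):
    """Pool `factor` x `factor` pixel blocks into single superpixels by summing.

    Summing (rather than averaging) preserves total photon counts while
    improving the per-superpixel signal-to-noise ratio.
    """
    h, w = lum.shape
    return lum.reshape(h // factor, factor, w // factor, factor).sum(axis=(1, 3))

binned = bin_pixels(np.ones((4, 4)), factor=2)   # 2x2 array, each entry 4.0
```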
  • the display function section 314 includes one or more software tools that compensate for hardware induced errors. For example, random points and 'spots' of abnormal activity may appear in the photographic and luminescence representation as a result of interactions by cosmic rays and other background radiation.
  • FIG. 3F illustrates another example of an image control/measurement window 400 in accordance with the present invention.
  • the image control window 400 includes a cosmic control 402 that detects localized abnormal activity in the photographic and/or luminescence representation and corrects for it.
  • the cosmic control 402 applies a median filter to an abnormal point or spot and smooths the abnormality using values from neighboring points.
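One plausible reading of the median-filter correction above: replace any pixel that exceeds the median of its 3x3 neighbourhood by more than a threshold with that median. The filter size, threshold, and names are assumptions, not taken from the patent:

```python
import numpy as np

def despike(image, threshold):
    """Smooth isolated 'hot' pixels (e.g. cosmic-ray hits) with a 3x3 median.

    Edge pixels are handled by replicating the border before filtering.
    """
    h, w = image.shape
    padded = np.pad(image.astype(float), 1, mode="edge")
    # Stack the nine shifted copies that make up each pixel's 3x3 neighbourhood.
    shifts = [padded[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)]
    med = np.median(np.stack(shifts), axis=0)
    out = image.astype(float)
    spikes = out - med > threshold
    out[spikes] = med[spikes]
    return out

img = np.zeros((5, 5)); img[2, 2] = 100.0   # a single cosmic-ray spike
clean = despike(img, threshold=10.0)
```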
  • the image control window 400 may include a flat field tool 404 that compensates for hardware induced radial inconsistency in a photographic or luminescence representation.
  • the flat field tool 404 calibrates images to be displayed with calibration data previously obtained and stored for the camera that the representation was taken with.
  • the image control window may derive and maintain calibration data for each camera used in conjunction with the window.
  • For the image control window 400 as illustrated in FIG. 3F, the flat field tool 404 is a toggle that, when turned on, automatically determines whether calibration is required for an image to be displayed (based on recorded imaging parameters in the data file for the image) and calibrates the image, if necessary.
  • the user is thus provided a transparent solution to hardware induced errors.
• the image for a blank view of the imaging box 12 without the light-emitting sample is often referred to as a 'dark image'.
  • the dark image may indicate inherent defects in a solid state camera, which defects should be subtracted from images taken with the camera.
• the dark image may contain bright spots corresponding to camera pixels having a high leakage current.
  • the display function section 314 may include a background compensation tool 325.
  • the computer system alters the photographic representation and the luminescence representation to compensate for any information associated with the dark image.
• the background compensation tool 325 is a toggle for automatic calibration of images. When turned on, the background compensation tool 325 automatically calibrates all displayed photographic or luminescent images against a dark image. If a suitable dark image is not available, the image control window 300 may notify the user and prompt the user to obtain one.
  • the background compensation tool 325 may include a table of different dark images that compensate for different cameras and camera imaging conditions. Dark images within the table may vary based on binning parameters, exposure time, camera temperature, etc. Thus, upon initiating a photographic or luminescent image, the image control window 300 may probe camera conditions for the image and select an appropriate dark image from the table. The image control window 300 may also update the dark image values in the table. In imaging systems where the camera 20 is on for extended periods for example, the image control window 300 may periodically update the table by capturing new dark images when the dark image reaches a certain age (e.g., 3 days), comparing these new dark images with those stored in the table and renewing them if there has been substantial change. Coupled with a tool for automatic calibration for all displayed images, automatic dark image renewal in this manner provides a transparent image calibration tool that further simplifies imaging analysis.
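The dark-image table lookup and subtraction can be sketched as below. The key layout (binning, exposure, temperature), the function names, and the error behavior are illustrative assumptions; the patent specifies only that dark images vary with these camera conditions:

```python
import numpy as np

# Hypothetical table of dark images keyed by camera conditions:
# (binning factor, exposure in seconds, camera temperature in degrees C).
dark_table = {
    (1, 10.0, -25): np.full((4, 4), 3.0),
    (2, 10.0, -25): np.full((2, 2), 12.0),
}

def calibrate(image, binning, exposure_s, temperature_c):
    """Subtract the dark image matching the camera conditions.

    Raises if no suitable dark image is stored; the window would then
    notify the user and prompt for a new dark image.
    """
    key = (binning, exposure_s, temperature_c)
    if key not in dark_table:
        raise LookupError("no suitable dark image; capture one first")
    return image - dark_table[key]

raw = np.full((4, 4), 103.0)
corrected = calibrate(raw, binning=1, exposure_s=10.0, temperature_c=-25)
```

Periodic renewal would simply replace a table entry with a freshly captured dark image once the stored one exceeds a chosen age.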
  • the create function section 316 includes controls for allowing the user to create and manipulate tools which enable simple and flexible analysis of the data within the image measurement window 301.
  • the create function section 316 includes a create button 326.
  • the create button 326 allows the user to create a region of interest (ROI) with one action on the interface. For example, the user simply clicks on button 326 with a pointer and a new ROI appears in the image measurement window.
  • the ROI may be any geometric shape or tool for measuring and analyzing data in a portion or portions of the luminescence representation.
  • the create function section 316 includes a pop-up menu 327.
  • the pop-up menu 327 includes a number of ROIs commonly used for image analysis.
  • the create button 326 and pop-up menu 327 may allow the user to create an ellipse (circle) 328, a rectangle (square) 330 or a grid 332.
  • a label may be attached to the geometric outline of the ROI for user clarity.
  • a label 334 is attached to circle 328.
  • the label 334 may include label identification and user information such as information relating to the light-emitting sample.
  • the create function section 316 includes a designation pop-up menu 335.
  • the designation pop-up menu 335 lists and numbers the ROIs as they are created.
• the designation pop-up menu 335 allows the user to re-access ROIs that were previously created, numbered and stored.
  • the ROI currently being accessed by the user is indicated to the user via highlights 336.
  • the create function section 316 also includes a remove tool 337.
• the remove tool 337 allows the user to delete any or all of the ROIs stored in the designation pop-up menu 335.
  • the remove tool 337 may also include a pop-up menu 338 for convenience in deleting the ROIs.
  • the image control window 300 also allows the user to manipulate the ROIs.
  • the size, shape, position and orientation of the circle 328 may be altered.
  • the orthogonal axis of the circle 328 may be altered to form an ellipse.
• the ellipse may then be characterized by a major axis and a minor axis.
  • the dimensions of the square 330 may be altered to form a rectangle.
  • the manipulation of the ROIs may further include rotations, expansions, etc.
• the dragging of an ROI is performed by clicking a pointer 344 on a portion of the circle 328 and dragging it to the desired position.
  • the reshaping of an ROI may be performed by clicking the pointer 344 on one of the highlights 346 and dragging.
  • the manipulation and alterations of the ROIs may include keyboard input.
• the ROI options may include a free-hand drawing option, polygons of five or more sides, curve drawing options, and the like.
• in the free-hand drawing option, the user marks a series of points, which then form the perimeter of a closed ROI. Using the created ROIs, the user may then proceed to use one or more of the ROIs to measure and analyze data.
  • the image control window 300 may also allow the user to create and save custom ROIs.
  • a user may create a custom ROI that corresponds to a 24x16 well plate using the grid 332 and suitable size, position and shape manipulation.
• the grid may then be adapted such that each well in the 24x16 well plate receives one or more grid units of the grid 332.
  • the custom grid ROI may then be saved with a name, e.g., '24x16 grid'. After saving, custom ROIs will appear in the designation pop-up menu 335 and may be repeatedly used as desired. Creating and saving custom ROIs in this manner removes the need for continually having to recreate the same ROI, which is often useful for imaging performed on the same sample over numerous days or other imaging scenarios requiring repetitive analysis.
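A grid ROI of the kind saved as '24x16 grid' can be sketched as a list of per-well bounding boxes. The function name, the (x, y, w, h) box representation, and the pixel region used in the example are illustrative assumptions:

```python
def make_grid_roi(x0, y0, width, height, cols, rows):
    """Return the bounding box (x, y, w, h) of each grid unit of a
    cols x rows grid ROI, e.g. one unit per well of a 24x16 plate.
    Illustrative sketch only.
    """
    cell_w = width / cols
    cell_h = height / rows
    return [(x0 + c * cell_w, y0 + r * cell_h, cell_w, cell_h)
            for r in range(rows) for c in range(cols)]

# A '24x16 grid' ROI covering a 480x320 pixel region of the image.
wells = make_grid_roi(0, 0, 480, 320, cols=24, rows=16)
```

Saving such a parameterized ROI under a name lets the same well-plate layout be reapplied day after day without recreating it.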
  • the measurement function section 318 includes GUI controls for allowing the user to measure and analyze data within the image measurement window 301.
  • the illustrated measurement function section 318 includes a measure button 348.
• the measure command performs one or more functions for analysis of the data in the luminescence representation.
  • one function may be summation of all the pixel magnitudes within the perimeter of one or more of the ROIs.
  • Another function may include an average of the magnitudes over the area within an ROI.
  • Yet another function may also be a statistical analysis of the data within one or more of the ROIs.
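The three measurement functions above (summation, average, and statistics over an ROI) can be sketched with a boolean mask. The function name, the mask representation of an ROI, and the choice of statistics returned are illustrative assumptions:

```python
import numpy as np

def measure_roi(luminescence, mask):
    """Sum, average and basic statistics of the pixel magnitudes
    (photon counts) inside an ROI mask; illustrative sketch of the
    measure command."""
    pixels = luminescence[mask]
    return {
        "sum": float(pixels.sum()),    # summation of all pixel magnitudes
        "mean": float(pixels.mean()),  # average magnitude over the ROI area
        "std": float(pixels.std()),    # a simple statistical analysis
        "pixels": int(pixels.size),
    }

lum = np.arange(16, dtype=float).reshape(4, 4)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                  # a square ROI covering 4 pixels
result = measure_roi(lum, mask)
```

Measuring several ROIs is then a matter of calling the same function with each ROI's mask and, if desired, totaling the per-ROI sums.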
  • the measurement function section 318 includes a measurement designation pop-up menu 350.
  • the measurement designation pop-up menu 350 enables the user to specify which of the created ROIs stored in the designation pop-up menu 335 are to be included in a measurement.
  • the user may specify any or all of the previously created ROIs for a particular measurement.
  • the measurement function section 318 also includes a record option 352.
  • the record option 352 allows the user to store the data produced from the measure command 348.
  • the data is stored in an electronic notebook.
  • the electronic notebook is an on-line tool which allows testing results and information to be stored automatically and will be described further with respect to FIG. 3B.
  • the image control window 300 also includes numerous other user interface tools.
  • a global display tool 354 is included to allow the user to control which representations are displayed in the measurement window 301. More specifically, the user may select the overlay image 302 including the visual superposition of the photographic representation and the luminescence representation. Alternatively, using the global display tool 354, the user may select just one of the photographic representation and the luminescence representation.
  • the global display tool 354 may also allow the user to select between numerous luminescence representations stored for the photographic representation.
  • the global display tool 354 typically has a default setting when a data file is accessed. For the image control window 300, the default setting is the overlay image 302 comprising the photographic representation and one luminescence representation.
  • the image control/measurement window 400 includes a blend tool 406 that allows blending of the luminescent and photographic images.
  • the blend tool 406 allows the underlying photographic image to display details in the region of the luminescent image.
  • the blend tool 406 may also include a blend bar 408 that allows the user to vary the degree of opacity for the overlying luminescent image. More specifically, the blend bar 408 ranges from 0, corresponding to transparency of the overlying luminescent image, to 1, corresponding to no blending between the two images and the luminescent image blocking out the photographic image.
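The blend bar's behavior is consistent with a standard alpha blend, sketched below. Treating the blend value as a per-image alpha is an assumption; the patent describes only the endpoints (0 = transparent luminescent layer, 1 = luminescent layer blocks the photograph):

```python
import numpy as np

def blend(photo, lum, opacity):
    """Blend the overlying luminescent image onto the photographic
    image: 0 leaves the luminescent layer fully transparent, 1 blocks
    out the photograph entirely. Illustrative alpha-blend sketch."""
    if not 0.0 <= opacity <= 1.0:
        raise ValueError("opacity must lie in [0, 1]")
    return (1.0 - opacity) * photo + opacity * lum

photo = np.full((2, 2), 100.0)
lum = np.full((2, 2), 200.0)
```

Intermediate settings let details of the underlying photographic image show through the luminescent overlay.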
  • the luminescence image display section 358 includes a number of components to assist in viewing and comprehension of the luminescence representation.
  • the luminescence image display section 358 includes an image maximum 360 and an image minimum 362.
  • the image maximum 360 indicates the magnitude of the highest data value (photon count) for any pixel in the luminescence representation.
  • the image minimum 362 indicates the magnitude of the lowest data value (photon count) for any pixel in the luminescence representation. The difference between the image maximum 360 and the image minimum 362 corresponds to the full range of pixel magnitudes for the luminescence representation.
• the luminescence image display section 358 also includes a legend maximum 364 and a legend minimum 366.
  • the legend maximum 364 indicates the magnitude of the maximum data value (photon count) for the image measurement window 301.
  • the legend maximum 364 corresponds to the upper luminescence limit 322.
  • the legend minimum 366 indicates the magnitude of the minimum data value for the image measurement window 301, similarly corresponding to the lower luminescence limit 324.
  • the legend maximum 364 and the legend minimum 366 may correspond to the image maximum 360 and the image minimum 362.
  • the legend maximum 364 indicates the magnitude of the highest data value (photon count) in the luminescence representation that will be displayed with the highest intensity color (e.g., red).
  • the legend minimum 366 indicates the magnitude of the lowest data value (photon count) in the luminescence representation that will be displayed with the lowest intensity color (e.g., blue).
  • the scale 364 provides a visual mapping between a range of colors for the luminescence representation and the magnitude range specified at 322 and 324.
• the scale may be represented as a gray scale, in which case individual magnitudes correspond to shades of gray, or as a color scale, in which case individual magnitudes correspond to different colors.
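The mapping from photon count to position on the color scale can be sketched as a normalized fraction between the legend minimum and maximum. The function name and the clamping of out-of-range counts to the scale's endpoints are illustrative assumptions:

```python
def magnitude_to_color(count, legend_min, legend_max):
    """Map a photon count to a fraction of the color scale: counts at
    or below the legend minimum map to 0.0 (lowest-intensity color,
    e.g. blue), counts at or above the legend maximum map to 1.0
    (highest-intensity color, e.g. red). Illustrative sketch."""
    if legend_max <= legend_min:
        raise ValueError("legend maximum must exceed legend minimum")
    frac = (count - legend_min) / (legend_max - legend_min)
    return min(1.0, max(0.0, frac))
```

Narrowing the legend limits therefore stretches the full color range over a smaller band of photon counts, improving contrast in that band.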
  • the image control window 300 also includes a user information section 370.
  • the user information section 370 provides testing information for the overlay 302, which may be helpful to the user.
  • the testing information may include, for example, a user ID 371, a test classification 372, a testing reference number 373, a date 374, a label 375 and a comments field 376.
  • a print button 356 is also included and allows the user to conveniently print portions of the image control window 300.
  • the print button 356 employs a default setting corresponding to preferred use.
  • the default setting automatically prints the image measurement window 301, the luminescence image display section 358 and the user information section 370.
  • the print command 356 may introduce a settings window to set the default print options.
  • the imaging system 10 may take alternate luminescence images of the light-emitting sample. More specifically, to overcome dependency on the depth of the image, different wavelengths may be used in capturing the luminescence image.
  • FIG. 3B illustrates an electronic notebook page 380 in accordance with one embodiment of the present invention.
  • the notebook page 380 appears as a separate window to the side of the image control window 300.
  • the electronic notebook page 380 may automatically store information corresponding to the overlay image 302 upon instantiation.
  • the electronic notebook page 380 may be instantiated when the user initially loads an image. In this case, image capture information will be entered first.
  • the electronic notebook page 380 may be instantiated when the user creates an ROI for the first time or inputs any information, which is relevant to an analysis of the luminescence representation.
• when the overlay image 302 is accessed from memory, the electronic notebook page 380 may be instantiated as last saved.
  • the electronic notebook page 380 solely contains reference information corresponding to the overlay image 302 and does not include analysis information and results.
  • the electronic notebook page 380 is referred to as a 'raw data' page.
  • the electronic notebook page 380 may include a header 382.
  • the header contains classification information for the overlay image 302 such as the testing reference number 373.
  • the electronic notebook page 380 may also include a user input section 384.
• the user input section 384 may include, for example, the user ID 371, test classification 372, the label 375 and comments field 376 corresponding to the user information section 370 of the image control window 300. Alternatively, the user may enter information into the notebook page 380 as desired.
  • the electronic notebook page 380 may also include other information used in characterizing data or an image.
  • a photographic image section 386 includes information relevant to the settings used for photographic image capture using the imaging system 10.
• a luminescence image section 388 includes information relevant to the settings used for luminescence image capture using the imaging system 10.
  • the electronic notebook page 380 may be saved in a database which stores all the raw data files and corresponding image files in a common directory.
  • this database for raw data files is referred to as a 'raw data set'.
  • the database may be arranged such that a text file is associated with each image file in the data set.
  • the text file may contain information about the image captured.
• the image information may include the time the image was taken, the camera settings (e.g., the exposure length), image identification numbers, and labeling information entered by the user when the image was taken.
  • each photographic representation and luminescence representation may have its own file in the raw data set.
  • the file formats used in the raw data set are generic to increase application flexibility.
  • the photographic representation and luminescence representation may be saved in a TIFF format to allow access from a wide variety of software packages.
  • each text file may be saved as an ASCII text file to allow flexible access.
  • a raw data database may be established which is generic and easily accessible as will be described in further detail below.
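The generic text file paired with each TIFF in the raw data set can be sketched as plain ASCII with one 'label: value' line per entry. The line format, the function name, and the example labels are illustrative assumptions; the patent specifies only that the text files are ASCII and the images TIFF:

```python
import os
import tempfile

def write_raw_text_file(path, info):
    """Save image-capture information as a plain ASCII text file
    alongside its TIFF image, one 'label: value' line per entry.
    Illustrative sketch of the raw data set's text files."""
    with open(path, "w", encoding="ascii") as f:
        for label, value in info.items():
            f.write(f"{label}: {value}\n")

info = {"capture time": "2000-11-15 09:30",
        "exposure length": "60 s",
        "image ID": "XG-0042"}
path = os.path.join(tempfile.mkdtemp(), "XG-0042.txt")
write_raw_text_file(path, info)
```

Because the format is plain text, any software package, not just the imaging application, can read or search the stored capture information.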
  • an electronic notebook page may contain information that includes analysis settings and measurement results.
  • FIGs. 3C-E illustrate the automatic storage of analysis data and measurement results into an electronic notebook page 390 in accordance with another embodiment of the present invention.
• the electronic notebook page 390 is referred to as an 'analyzed data' page.
  • the electronic notebook automatically stores the analysis data 382 obtained from the measure command 348 when the record option 352 is selected. Storing the analysis data provides the user a convenient mechanism to access the analysis data at a subsequent time.
  • the analysis data may include the characteristic geometric information of the ROIs as well as the results of any measurements using the ROIs.
  • FIG. 3C illustrates the electronic notebook page 390 before analysis information is entered.
  • the header 391 includes a testing reference 398 and file references 392.
  • the testing reference 398 includes the analysis date in the first six numerical digits, the user name and may also include other testing information.
  • the electronic notebook page 390 automatically records an automatic date stamp in which an analysis was performed in the testing reference 398.
  • the automatic date stamp may be advantageous in the future for verifying the date of analysis and testing.
  • the automatic date stamp is subsequently unalterable to further strengthen subsequent validation.
• the file references 392 refer to data files which include the loading information of the photographic representation and the luminescence representation.
  • the file references 392 may also include the information relevant to the settings used for photographic and luminescence image capture using the imaging system 10.
  • FIG. 3D illustrates the electronic notebook page 390 after an analysis using three ROIs 393. For each ROI, its characteristic geometric information 394 regarding the perimeter of the ROI is automatically stored in the electronic notebook page 390. In addition, results 395 for the measurement within each ROI 393 are automatically stored. Further, total measurement results 396 for all three ROIs are automatically stored.
  • FIG. 3E illustrates the electronic notebook page 390 prior to exiting the image control window 300.
  • filing information 397 may be stored prior to closing of the electronic notebook page 390 (FIG. 3E).
  • the electronic notebook page 390 allows the user to make notes and input additional analysis information at any time. Preferably this is accomplished by simply placing the cursor/pointer at the desired location within the electronic notebook and typing in the notes of interest.
  • statistical information for the luminescence representation such as a number of pixels 398 in the ROIs, area sums 399 and standard deviation, etc. may be stored if they are not included in the automatic transfer of information from the image control window 300.
  • luminescence representation information such as the image maximum 360, image minimum 362 and average data value per pixel may be stored in the electronic notebook page 390.
  • the electronic notebook page 390 may store any information relevant to re-creating a measurement or analysis.
  • a directory may be maintained for the electronic notebook page 390 and similar other analyzed data files.
  • this analyzed file directory is referred to as an 'analyzed data set'.
  • two databases may be maintained: the first containing raw data and the second containing analyzed data.
  • the files stored in the analyzed data set will contain information about the photographic representation, the luminescence representation and the text file corresponding to the analyzed data all in one file.
• the file may be any format which accommodates both pictorial and text components.
  • the file may be a Living Image file suitable for use with the Living Image Software.
  • FIG. 4 is a flowchart representative of an exemplary data analysis using the image control window 300.
  • a process flow 560 typically begins with accessing a data file (562). Accessing the data file may include opening a data file previously stored in memory. Alternatively, the data file may be accessed from an image recently captured by the imaging system 10, without opening a stored file.
  • the user may alter the photographic representation and the luminescence representation using any of the tools in the display function section 314 (566).
  • the user may alter the upper luminescence limit 322 and lower luminescence limit 324 to facilitate viewing clarity and comprehension of the image.
  • the user may also enter details in the user information section 370.
  • the user may proceed to further alter the image display (568).
  • the user may alter the luminescence limit 322 and lower luminescence limit 324 to facilitate viewing clarity of a particular portion of the luminescence representation to be analyzed.
• the user may then create one or more ROIs (570). After the corresponding ROIs are generated, the user may alter the generic ROIs for a particular analysis (572). More specifically, the user may manipulate the position, shape and angle of rotation of one or more created ROIs. Upon completion of ROI manipulation, the user may then perform a measurement within one or more of the ROIs (574). In a preferred embodiment, the measurement involves taking a summation of the photon counts for each of the pixels within the perimeter of one or more ROIs. The results of the measurement may then be transferred to the electronic notebook page 390 (576).
  • the user may then continue (578) to make measurements on the same or different portions of the luminescence representation using the image control window 300.
• the user may return to alter the image display for another measurement (568), create more ROIs (570) or manipulate the existing ROIs (572).
  • the user may save the work and exit (579).
  • the process flow 560 is one method of using the image control window 300 for analysis of the overlay image 302. Obviously, many of the elements of the process flow 560 may be repeated or performed outside of the order illustrated.
  • the user may alter the image display to improve viewing (568), print the image control window 300 or make notes in the electronic notebook page 390 (578).
  • a user or user group may generate and store a library of photographic, luminescence, and digital overlay images as well as associated data files. For many imaging applications, the amount of data accumulated may be excessive.
  • the present invention may provide computer interfaces and tools to assist a user with imaging data management.
  • a data management system is provided with one or more data management windows, each allowing the user to perform operations that are particularly useful for browsing, presenting and analyzing imaging data.
  • conventional proprietary data management systems often require specialized skills outside the experience of most researchers and computer users.
  • data management tools in accordance with one embodiment of the present invention are designed without requiring such specialized services.
  • data management tools in accordance with one embodiment of the present invention allow researchers and users to manage large numbers of images and imaging data in a simplified manner that does not require specialized data management skills.
  • FIG. 5A illustrates an exemplary file structure 400 for data management in accordance with one embodiment of the present invention.
  • the file structure 400 is illustrated in a graphical user interface (GUI) environment and includes a base file 401 that comprises various graphics files 402a-e and data files 404a-b common to a single image or sample.
• Numerous base files 401 may be grouped in a hierarchical arrangement according to date, imaging conditions, sample, testing specifics, etc. Grouping in this manner allows a user to customize file arrangements and higher level groups according to user preferences.
  • base files 401 for testing the effect of a biological chemical on several biological specimens over an extended period of time may be labeled by day, the days grouped into a parent file by specimen, and the specimen files grouped into a parent file by chemical test.
  • the chemical test files may be further grouped according to user.
  • FIG. 5B illustrates a browser control window 410 in accordance with one embodiment of the present invention.
  • the browser control window 410 allows the user to specify one or more search criteria for obtaining a set of data files containing the search criteria.
• the search criteria are determined by user input into one or more fields 412a-416a designated by search label names 412b-416b.
  • the fields 412a-416a of the browser control window 410 correspond to text, values or other information that may be input and stored by a user in a data file associated with an image.
  • the search criteria for the browser control window 410 may correspond to any information stored in the various imaging and information data files in the computer 28.
  • the data files to be searched may correspond to the electronic notebook page 380 of FIG. 3B and the information may include reference information corresponding to the overlay image 302.
  • the information used in searching may include any information found in the user input section 384 and photographic image section 386 of the electronic notebook page 380.
  • the information may include the capture date of an image, camera specifications, camera calibration information and any information useful for re-capturing the image, imaging specifics such as binning parameters and exposure time, image identification numbers, labeling information entered by the user when the image was taken, ROI analysis information, etc.
  • the browser control window 410 includes a Series field 412a, an Experiment field 413a, a Label field 414a, a Comment field 415a and an analysis Comment field 416a.
  • the information is stored in relation to a similar data label name in the data file.
  • the browser control window 410 also includes toggles 438a-e that allow a user to turn off or on each of the fields 412a-416a as a criteria in a search.
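The search performed by the browser control window, with per-field toggles, can be sketched as a plain substring match over each data file's labeled fields. The function name, the dictionary representations, and the substring-match rule are illustrative assumptions consistent with the text-search behavior described later:

```python
def matches(data_file, criteria, toggles):
    """True if the data file contains every enabled search criterion
    as a substring of the correspondingly labeled field; a plain text
    search sketch. Fields toggled off are ignored."""
    for label, text in criteria.items():
        if not toggles.get(label, True):
            continue                      # field toggled off in the search
        if text and text not in data_file.get(label, ""):
            return False
    return True

data_file = {"Series": "oncology 12", "Label": "day 3"}
criteria = {"Series": "oncology", "Label": "day 9"}
```

Running this predicate over every data file in the file structure yields the set of files listed in the results table.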
  • FIG. 5C illustrates an empty data file 430 in accordance with a specific embodiment of the present invention.
  • the data file 430 includes one or more data fields 432a-432e designated by data label names 434a-434e stored in a simple text format.
  • the data label names 434a-434e correspond to the search label names 412b- 416b of the browser control window 410.
  • Text, values or other information for each of the data fields 432a-432e may be input into the data file 430 during initial creation and analysis of the corresponding image or subsequent analysis and data file 430 alteration.
  • the present invention allows users to customize the labels within the data file 430 and within the browser control window 410. This allows browsing, presenting and analyzing of image data to be customized to a user. This is often advantageous when a user produces a large number of user-specific images that are related in some manner, e.g., a common testing procedure for a large number of samples tested daily.
  • the labels within the browser control window 410 may be grouped into customized sets common to a large number of data files.
  • the data file 430 includes three exemplary User Label Name Sets 432, 434 and 436 used to group common label names.
• the User Label Name Set 434, entitled 'Xenogen Oncology', includes data fields having data label names: group ID, experiment number, time point, animal number, cell line & number, animal model, comment 1, comment 2, IACUC number, animal strain, and user.
  • All data files produced in the set will include information for these data fields.
• the User Label Name Set 436, entitled 'Xenogen Infectious Disease', includes another set of data fields having data label names: group ID, experiment ID, animal model, animal strain, pathogen, route of infection, dose, treatment, animal number, time point, comment 1, comment 2, and IACUC number.
  • all data files produced in the set will include information for these data fields.
  • a Label Name Set tool 446 has a pulldown menu 447 that includes User Label Name Sets. In one embodiment, selecting a specific User Label Name Set using the pulldown menu 447 automatically generates search label names and fields corresponding to data fields and data label names used in data files of that User Label Name Set; thus simplifying searching within the chosen set.
  • the browser control window 410 also includes a User ID tool 440 to allow a user to specify a user ID as a search criteria.
  • the user ID tool 440 includes a pulldown menu 442 that contains a list of previously recorded user identifications and an open ID field 444 that allows a user to specify non-recorded user identification information.
  • the user may initiate a search by selecting 'done' tool 446.
  • the browser control window 410 searches solely through data and text files in the file structure 400 according to a text or string search.
  • the browser control window 410 may include other features to expedite and simplify data management.
• information entered into the browser control window 410 is stored and re-displayed when the browser control window 410 is re-started. This removes the need to re-enter information common between two searches and expedites repeated searching with similar search criteria.
• each of the search label names 412b-416b may also include search pulldown menus 412c-416c that contain a list of previously recorded search label names. Recording search label names may remove the need to continually enter the same information and expedite input into the fields 412a-416a.
  • FIG. 5D illustrates a search label name editing window 450 in accordance with a specific embodiment of the present invention.
  • the search label name editing window 450 includes search label names 452 corresponding to the data label name set 'Xenogen Oncology' of FIG. 5C.
  • the window 450 includes a standard values menu 454 that allows a user to enter and edit standard values that will appear in the search pulldown menus of the browser control window 410.
  • FIG. 5E illustrates an exemplary table 460 resulting from a search using the browser control window 410.
  • the table 460 includes columns 462 corresponding to the search label names that were used during the search, e.g., search label names 412b-416b.
  • Individual files 464 that included the search criteria information are listed as rows as illustrated. In one embodiment, the individual files 464 are listed in the order that they are found. As the number of files produced by such a search may be excessive, the present invention may also include user assisting sorting tools.
  • FIG. 5F illustrates a sorting priority window 580 in accordance with a specific embodiment of the present invention.
  • the sorting priority window 580 includes three priority levels 582a-582c.
  • Each priority level 582 includes a pulldown menu 584 that includes each of the search label names used in a search, e.g. search label names 412b-416b.
• for the sorting priority window 580, only the first priority level 582a is selected, with the search label name 'Series' 412b.
  • the search label names used in the pulldown menus 584a-584c are determined using a User Label Name Set pulldown 586.
  • the sorting priority window 580 may also include additional conventional sorting tools such as a reverse toggle 588.
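The three-level sort with a reverse toggle can be sketched as a multi-key sort over the search-result rows. The function name, the row representation, and applying the reverse toggle to the whole ordering are illustrative assumptions:

```python
def sort_results(rows, priorities, reverse=False):
    """Sort search-result rows by up to three priority levels, each a
    search label name, with an optional reverse toggle; illustrative
    sketch of the sorting priority window."""
    def key(row):
        # Missing fields sort as empty strings.
        return tuple(row.get(label, "") for label in priorities)
    return sorted(rows, key=key, reverse=reverse)

rows = [
    {"Series": "B", "Experiment": "2"},
    {"Series": "A", "Experiment": "9"},
    {"Series": "A", "Experiment": "1"},
]
ordered = sort_results(rows, ["Series", "Experiment"])
```

Selecting only the first priority level corresponds to passing a single label name; ties then keep their found order because the sort is stable.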
  • FIG. 6A illustrates an image capture GUI 500 suitable for controlling the imaging system 10.
  • the image capture GUI 500 includes an imaging mode control section 502.
  • the imaging mode control section 502 allows the user to designate one or multiple images to be taken.
• the camera GUI interface 500 also provides the user with interfaces to control platform selection 504, lights on/off 505, light strength 506 and exposure duration 508.
• the camera GUI interface 500 may also include other functionality and tools, not shown in FIG. 6A, useful in obtaining an image with the imaging system 10.
  • the camera GUI interface 500 may also include control for manipulating the photon threshold for registering a pixel in the luminescence representation.
  • the photon threshold may be used to reduce electronic noise when capturing the luminescence representation.
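The photon-threshold behavior described above can be illustrated with a short sketch. This is a hypothetical illustration, not the patent's code; `apply_photon_threshold` and its signature are invented. Pixels whose photon count falls below the user-set threshold are suppressed, which reduces electronic noise in the luminescence representation:

```python
import numpy as np

def apply_photon_threshold(counts, threshold):
    """Zero out pixels whose photon count falls below the threshold,
    suppressing low-level electronic noise in the luminescence image."""
    counts = np.asarray(counts)
    return np.where(counts >= threshold, counts, 0)

noisy = np.array([[1, 0, 7],
                  [2, 9, 1]])
clean = apply_photon_threshold(noisy, threshold=5)  # only the 7 and 9 survive
```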
  • FIG. 6B illustrates a second image capture GUI 520 suitable for controlling the imaging apparatus of FIG. 1 in accordance with another embodiment of the present invention.
  • the image capture GUI 520 includes an imaging mode control section 522 that allows the user to designate one or more images to be captured and control parameters associated with the images.
  • the imaging mode control section 522 allows the user to capture a photographic image 524 and a luminescence image 526.
  • the image capture GUI 520 allows the user to control both software and hardware parameters of imaging.
  • the image capture GUI 520 allows the user to designate software parameters such as a binning factor 534 and to designate whether a combined photographic and luminescence image is represented as an overlay image using an overlay tool 536.
  • the GUI 520 also allows the user to control hardware parameters in the imaging apparatus 10 such as an exposure time 528 for the image, lights 540, live mode imaging 542, an f-stop 532 that designates the aperture of the camera lens, and a filter control 534.
  • the filter control 534 includes pulldown menus 538a and 538b that allow the user to select one of a series of filters implemented in the imaging apparatus 10.
  • the image capture GUI 520 allows the user to input camera cooling control parameters such as a temperature 544 and display temperature conditions for the camera 20.
  • a field of view (FOV) control 546 allows the user to adjust the image field of view by moving the sample shelf toward or away from the lens.
  • a focus tool 548 allows the user to manually or automatically control camera 20 focus.
  • a system status display 550 provides the user with a continual update of the current imaging status for the imaging apparatus 10.
  • the GUI 520 is suitable for use with an 'in-vivo imaging system' (IVIS) as produced by Xenogen Corporation of Alameda, CA.
  • FIGs. 7A and 7B illustrate a computer system 600 suitable for implementing embodiments of the present invention.
  • FIG. 7A shows one possible physical form of the computer system.
  • the computer system may have many physical forms ranging from an integrated circuit, a printed circuit board, and a small handheld device up to a huge supercomputer.
  • Computer system 600 includes a monitor 602, a display 604, a housing 606, a disk drive 608, a keyboard 610 and a mouse 612.
  • Disk 614 is a computer-readable medium used to transfer data to and from computer system 600.
  • FIG. 7B is an example of a block diagram for computer system 600. Attached to system bus 620 are a wide variety of subsystems.
  • Processor(s) 622, also referred to as central processing units, or CPUs, are coupled to memory 624.
  • Memory 624 includes random access memory (RAM) and read-only memory (ROM).
  • a fixed disk 626 is also coupled bi-directionally to CPU 622; it provides additional data storage capacity and may also include any of the computer-readable media described below.
  • Fixed disk 626 may be used to store programs, data and the like and is typically a secondary storage medium (such as a hard disk) that is slower than primary storage. It will be appreciated that the information retained within fixed disk 626 may, in appropriate cases, be incorporated in standard fashion as virtual memory in memory 624.
  • Removable disk 614 may take the form of any of the computer-readable media described below.
  • CPU 622 is also coupled to a variety of input/output devices such as display 604, keyboard 610, mouse 612 and speakers 630.
  • an input/output device may be any of: video displays, track balls, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, biometrics readers, or other computers.
  • CPU 622 optionally may be coupled to another computer or telecommunications network using network interface 640. With such a network interface, it is contemplated that the CPU might receive information from the network, or might output information to the network in the course of performing the above-described method steps.
  • method embodiments of the present invention may execute solely upon CPU 622 or may execute over a network such as the Internet in conjunction with a remote CPU that shares a portion of the processing.
  • embodiments of the present invention further relate to computer storage products with a computer-readable medium that have computer code thereon for performing various computer-implemented operations.
  • the media and computer code may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those having skill in the computer software arts.
  • Examples of computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (ASICs), programmable logic devices (PLDs) and ROM and RAM devices.
  • Examples of computer code include machine code, such as produced by a compiler, and files containing higher level code that are executed by a computer using an interpreter.
  • although the present invention has been discussed primarily in the context of making measurements for the summation of photon counts within the image measurement window 301, the present invention is suitable for other imaging applications and may be tailored correspondingly.
  • the present invention may be adapted for analysis of high detail in-vivo applications and thus may include zoom tools in the display function section 314.

Abstract

A graphical user interface is provided which allows the user to perform numerous operations suitable for analysis of in-vivo images within a single display screen or a single window. Using the in-vivo GUI, the user may create and manipulate analysis tools such as rectangle and ellipse tools to define regions of interest and perform various measurements on an in-vivo image. In addition, the GUI allows the user to store measurement results in a dated electronic notebook, display testing information, manipulate image presentation and print while maintaining view of the image.

Description

GRAPHICAL USER INTERFACE FOR IN-VIVO IMAGING
FIELD OF THE INVENTION:
The present invention relates generally to user interface software running on computers or computer systems. More specifically, the invention relates to user interface systems and methods used in examining and analyzing images.
BACKGROUND OF THE INVENTION:
In a computer application, there are numerous ways to present and manage user information. Graphical user interfaces (GUIs) on computer systems allow easy use of windows, control icons, etc. to display information to the user. The data displayed in a window may be of different types. Some may be graphical, such as icons or pictures, or textual, such as a word processing document, or a combination of both.
When a computer interface is used for data management in a scientific application, the interface may include various data-specific tools and functions. To handle images, for example, an application might desirably present one or more windows for viewing the image, a tool for changing the image's appearance (e.g., sharpness), and a tool to measure features of one or more images.
Unfortunately, the unique combination of functionality required for many imaging applications is not provided in a simple and easy to use computer interface. Specifically, available user interfaces, even those developed to handle imaging applications, do not provide a suite of particular image presentation and analysis tools that allow users to manipulate and measure image features with minimal navigation through the user interface.
Interfaces for available applications typically require that the user first select or open various windows, menus, buttons, and/or tiles and then manipulate the resulting tool to implement a single operation pertinent to image analysis. Because the user may be required to perform numerous operations for a single image, or handle numerous images simultaneously, the available user interfaces are generally very awkward or unwieldy. Obviously, this compromises user efficiency and effectiveness in evaluating images.
Specialized in-vivo imaging applications can present particular challenges to the design of an appropriate user interface. In one example, the image may include one or more representations of emissions from internal portions of a specimen superimposed on a photographic representation of the specimen. The photographic representation provides the user with a pictorial reference of the specimen. The luminescence representation indicates portions of the specimen where an activity of interest may be taking place. For example, the in-vivo data may include light emissions from specific regions of the specimen used in tracking the progression of a tumor or a pathogen within the specimen.
In view of the foregoing, an improved user interface for imaging applications would be highly beneficial.
SUMMARY OF THE INVENTION
The present invention addresses this need by providing a computer user interface having a window or other feature that provides tools allowing the user to quickly define a perimeter around a "region of interest" on the image and then measure a property of the image within the region of interest. The region of interest may be bounded by an ellipse, rectangle, or other shape selected and sized by the user. Preferably, both the image and the tool for generating the region of interest reside on the same window or other interface feature. Thus, a region of interest can be generated with one or two user interface actions (e.g., clicking on a button and then dragging a perimeter to an appropriate location on the image to specify the region of interest). The property measured within the region of interest may be an average or total pixel value within the region of interest.
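The region-of-interest measurement just described, summing or averaging pixel values inside a user-drawn ellipse or rectangle, can be sketched as follows. This is a minimal illustration under the assumption that the region is specified by a center and half-widths; `roi_measure` and its parameters are invented names, not the patent's implementation:

```python
import numpy as np

def roi_measure(image, cx, cy, rx, ry, shape="ellipse"):
    """Sum and average pixel values inside a region of interest centered
    at (cx, cy) with half-widths rx and ry, bounded by an ellipse or a
    rectangle as selected and sized by the user."""
    image = np.asarray(image, dtype=float)
    yy, xx = np.indices(image.shape)
    if shape == "ellipse":
        mask = ((xx - cx) / rx) ** 2 + ((yy - cy) / ry) ** 2 <= 1.0
    else:  # rectangle
        mask = (np.abs(xx - cx) <= rx) & (np.abs(yy - cy) <= ry)
    total = image[mask].sum()
    return total, total / mask.sum()
```

In the photon-count setting of the patent, the returned total corresponds to the summed photon counts within the region of interest, and the second value to the average pixel value.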
In accordance with one embodiment of the present invention, a computer system is provided with an image measurement window, which allows the user to perform certain operations that are particularly useful for presenting and analyzing an image. In addition to having conventional computer hardware such as a processor, memory, and a display, the computer system includes a graphical user interface having a measurement window that provides both the image itself and one or more tools for defining a region of interest on the image. When a user uses one of the tools to define a region of interest, the computer system can calculate information about a portion of the image within the defined region of interest. By providing various frequently used features in a single window, interfaces of this invention remove the need to flip between alternate windows to take advantage of these features.
Among the other features that may be provided with the image measurement window is a measurement tool. When this tool is selected, the computer system automatically calculates the information about the portion of the image when the user uses one of the tools to define the region of interest on the image. The measurement window may also include display controls for controlling at least one of the following features of the displayed image: threshold, brightness, contrast, and sharpness.
In a preferred embodiment, the one or more tools for defining the region of interest allows the user to graphically create a rectangle on the image, an ellipse on the image, and/or a grid on the image. At least one of these tools may be provided as a button which, when selected, causes a region of interest to appear on the displayed image. After the region of interest is created on the image, the user can move and/or reshape the region of interest by the action of the pointer.
In addition, the present invention may provide a date stamped electronic notebook in conjunction with the image measurement window. The electronic notebook may display image analysis data (typically text pertaining to the image) such as measurement results, experimental parameters, user notes, and the like. The computer system may automatically display and date stamp image measurement results obtained via the user interface.
As the amount of data generated and stored in many imaging applications may be excessive, the present invention also relates to computer interfaces to assist data management. More specifically, in accordance with one embodiment of the present invention, a computer system is provided with one or more data management windows, which allow the user to perform certain operations that are useful for browsing, presenting and analyzing previously stored imaging data.
Another aspect of the present invention provides a user interface for presenting and analyzing an image including a photographic representation of an object and a luminescence representation of the object. The luminescence representation presents the location and magnitude of radiation emitted from the object. The user interface may be characterized by the following features: (1) a first display control permitting a user to manipulate the visual presentation of at least one of the luminescence representation and the photographic representation; (2) a second display control permitting the user to create at least one region of interest on the luminescence representation; and (3) a third display control permitting the user to make a measurement of a portion of the luminescence representation bounded by the at least one region of interest. Other display controls of the user interface may include a fourth display control that permits the user to select which of the photographic representation and the luminescence representation is to be displayed. An optional fifth display control allows the user to print some portion or all of the image.
Yet another aspect of the present invention relates to a method implemented on a computer system. The method includes analyzing a region of interest on an image presented on a display associated with the computer system. This includes defining a region of interest on the image when the user has selected a region of interest tool from a user interface presented on the display. Note that the region of interest tool and the image are concurrently displayed on the display. The method further includes calculating a property of the image within the region of interest.
Embodiments of the present invention further relate to a computer readable medium including instructions for applying the above mentioned interfaces and methods. These and other features of the present invention will be described in more detail below in the detailed description of the invention and in conjunction with the following figures.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
FIG. 1 illustrates an imaging apparatus suitable for capturing photographic and luminescence images in accordance with one embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of capturing photographic and luminescence images using the imaging apparatus of FIG. 1, for example, in accordance with one embodiment of the present invention.
FIG. 3A is an illustration showing a graphical user interface having an overlay of a photographic representation and a luminescence representation of a specimen as well as various image manipulation, analysis and measurement tools.
FIG. 3B illustrates an electronic notebook page suitable for storing raw data and non-analysis information in accordance with one embodiment of the present invention.
FIGs. 3C-E illustrate the automatic storage of analysis data and measurement results into an electronic notebook page in accordance with another embodiment of the present invention.
FIG. 3F illustrates another example of an image control/measurement window in accordance with the present invention.
FIG. 4 is a flowchart illustrating a method of making measurements using the GUI of FIG. 3 in accordance with one embodiment of the present invention.
FIG. 5A illustrates an exemplary file structure for data management in accordance with one embodiment of the present invention.
FIG. 5B illustrates a browser control window in accordance with one embodiment of the present invention.
FIG. 5C illustrates an empty data file in accordance with a specific embodiment of the present invention.
FIG. 5D illustrates a search label name editing window in accordance with a specific embodiment of the present invention.
FIG. 5E illustrates an exemplary table resulting from a search using the browser control window of FIG. 5B in accordance with a specific embodiment of the present invention.
FIG. 5F illustrates a sorting priority window in accordance with a specific embodiment of the present invention.
FIG. 6A illustrates an image capture graphical user interface suitable for controlling the imaging apparatus of FIG. 1 in accordance with one embodiment of the present invention.
FIG. 6B illustrates another image capture graphical user interface suitable for controlling the imaging apparatus of FIG. 1 in accordance with another embodiment of the present invention.
FIGS. 7A and 7B illustrate a computer system suitable for implementing embodiments of the present invention.
DETAILED DESCRIPTION
In the following detailed description of the present invention, numerous specific embodiments are set forth in order to provide a thorough understanding of the invention. However, as will be apparent to those skilled in the art, the present invention may be practiced without these specific details or by using alternate elements or processes. In other instances well known processes, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the present invention.
In accordance with one embodiment of the present invention, a graphical user interface (GUI) is provided which allows the user to perform numerous operations suitable for image analysis within a single window. This removes the need to navigate between alternate windows in order to perform a specific image analysis function. Using the GUI of this invention, the user may create and manipulate analysis tools and perform a wide variety of measurements on complex images (such as in-vivo images) conveniently and efficiently. In addition, the present invention may allow the user to store measurement results in a dated electronic notebook, display testing information, manipulate the image presentation and print while maintaining view of the image. For management of stored information corresponding to multiple images, one or more GUIs are provided which simplify imaging data management.
One preferred embodiment of this invention pertains to graphical user interfaces for presenting and analyzing "overlay" or "composite" images including a photographic image on which is overlaid an "emissions" image. The photographic and luminescence images are taken of the same object. In one application, the object is a biological specimen. The luminescence image is taken without using light sources other than the object itself. Luminescence from the object is recorded as a function of position to produce the luminescence image.
Fig. 1 illustrates an imaging system 10 configured to capture photographic and luminescence images in accordance with one embodiment of the present invention. The imaging system 10 may be used for imaging a low intensity light source, such as luminescence from luciferase-expressing cells, fluorescence from fluorescing molecules, and the like. The low intensity light source may be emitted from any of a variety of light-emitting samples which may include, for example, tissue culture plates, multi-well plates (including 96, 384, 864 and 1536 well plates), and animals or plants containing light-emitting molecules, such as various mammalian subjects such as mice containing luciferase expressing cells.
The imaging system 10 comprises an imaging box 12 adapted to receive a light-emitting sample in which low intensity light, e.g., luciferase-based luminescence, is to be detected. The imaging box 12 includes an upper housing 16 in which a camera lens is mounted. An intensified or a charge-coupled device (CCD) camera 20 is optically engaged with, and positioned above, the camera lens. The CCD camera 20 is capable of capturing luminescent and photographic (i.e., reflection based images) images of the sample within the imaging box 12. The CCD camera 20 is cooled by a suitable source such as a refrigeration device 22 that cycles a cryogenic fluid through the CCD camera via conduits 24. A suitable refrigeration device is the "CRYOTIGER" compressor, which can be obtained from IGC-APD Cryogenics Inc., Allentown, PA.
An image processing apparatus 26 interfaces between CCD camera 20 and a computer 28 through cables 30 and 32 respectively. The computer 28, which may be of any suitable type, typically comprises a main unit 36 that contains hardware including a processor, memory components such as random-access memory (RAM) and read-only memory (ROM), and disk drive components (e.g., hard drive, CD, floppy drive, etc.). The computer 28 also includes a display 38 and input devices such as a keyboard 40 and mouse 42. The computer 28 is in communication with various components in the imaging box 12 via cable 34. To provide communication and control for these components, the computer 28 includes suitable processing hardware and software configured to provide output for controlling any of the devices in the imaging box 12. The processing hardware and software may include an I/O card, control logic for controlling any of the components of the imaging system 10, and a suitable graphical user interface for the imaging system 10. The computer 28 also includes suitable processing hardware and software for the camera 20 such as additional imaging hardware, software, and image processing logic for processing information obtained by the camera 20. Components controlled by the computer 28 may include the camera 20, the motors responsible for camera 20 focus, the motors responsible for position control of a platform supporting the sample, the camera lens, f-stop, etc. The logic in computer 28 may take the form of software, hardware or a combination thereof. The computer 28 also communicates with a display 38 for presenting imaging information to the user. By way of example, the display 38 may be a monitor, which presents an image measurement graphical user interface (GUI) that allows the user to view imaging results and also acts as an interface to control the imaging system 10, as will be discussed in further detail below.
FIG. 2 is a flowchart illustrating a method of capturing photographic and luminescence images using the imaging apparatus of FIG. 1 in accordance with one embodiment of the present invention. A process flow 200 begins with placing the light-emitting sample to be imaged in the imaging device (202). The imaging system 10 is then prepared for photographic capture of the light-emitting sample (204). The preparation may include turning on the lights in the imaging box 12, focusing the CCD camera 20, positioning the light-emitting sample, etc. After preparation, the photographic image is captured (206). In one imaging system suitable for use with the present invention, a 'live mode' is used in photographic capture of the light-emitting sample. The live mode includes a sequence of photographic images taken frequently enough to simulate live video. Upon completion of photographic capture, the photographic image data is transferred to processing apparatus 26 (208). The processing apparatus may manipulate and store the photographic image data as well as present it on the display 38.
Subsequently, the imaging system 10 is prepared for capturing a luminescence image (210). The preparation may include turning off the lights in the imaging box 12, for example. When ready, the CCD camera 20 captures the luminescence image. The luminescence image data is transferred to the processing apparatus 26 (212). The processing apparatus may store and manipulate the luminescence image data as well as present it on the display 38 (214). The manipulation may also include overlaying the luminescence image with the photographic image and illustrating the two images together. This overlay image may then be the basis for user analysis (216).
At this point, the user now has the components of a digital overlay image stored in the processing apparatus 26 including the luminescence image and the photographic image. The information contained in the digital overlay image may be analyzed and manipulated as desired. As explained, the photographic and luminescence representations provided by the imaging system 10 and imaging interface of the present invention have a wide variety of applications. In one particular embodiment, the luminescence representation indicates the number of times each detector pixel has received a photon over a defined length of time. In other words, the luminescence representation may display magnitude values representing the photon counts at the individual detector pixels. Regions of the object emitting radiation (e.g., photons) will appear in the luminescence representation. The luminescence images may indicate the presence of a biocompatible entity, for example. The entity can be a molecule, macromolecule, cell, microorganism, a particle or the like. Thus, an in-vivo analysis may include detecting localization of a biocompatible entity in a mammalian subject. Alternatively, the information in the live mode may be used to track the localization of the entity over time.
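The superposition of the luminescence representation on the photographic representation can be sketched as follows. This is a simplified illustration, not the patent's rendering pipeline; `overlay`, its color choice, and the assumption of a grayscale photograph with values in [0, 1] are all invented for the example:

```python
import numpy as np

def overlay(photo_gray, photon_counts, min_count=1):
    """Build an RGB overlay: the photographic image rendered in grayscale,
    with luminescent pixels (photon count >= min_count) painted red to mark
    regions of the object emitting radiation."""
    photo = np.asarray(photo_gray, dtype=float)          # grayscale in [0, 1]
    rgb = np.stack([photo, photo, photo], axis=-1)       # gray -> RGB
    hot = np.asarray(photon_counts) >= min_count
    rgb[hot] = [1.0, 0.0, 0.0]                           # mark emission sites
    return rgb
```

A real interface would map photon counts through a full colormap rather than a single color, but the principle is the same: the photograph supplies the pictorial frame of reference and the luminescence data is drawn on top of it.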
FIG. 3A illustrates one example of an image control/measurement window 300 in accordance with this invention. The image control window 300 includes an image measurement window 301. Within the image measurement window 301, an overlay image 302 is displayed. The overlay image 302 includes a visual superposition of a photographic representation of the light-emitting sample and a luminescence representation of the light-emitting sample. In this example, the light- emitting sample comprises three mice. In another example, the light-emitting sample comprises a high density well plate comprising an array of wells, e.g., 24x16 and 48x32 well plates are common, wherein each well contains a separate biological specimen. The image control window 300 is well suited for manipulating the display of the overlay image 302 as well as making measurements and analyzing the luminescence representation.
The photographic representation provides the user with a visual frame of reference of the image. The luminescence representation provides photon emission data derived from the object. As mentioned, the photon emission data may represent the specific pixels on the CCD camera 20 that detect photons over the duration of the live mode image capture period. Because the imaging system 10 is typically used to measure the entire light-emitting sample, the data in the luminescence representation typically has one or more distinct luminescent portions of interest. For example, the luminescence representation illustrated includes luminescent portions 308 and 310 of the left mammalian sample. Alternatively, for a well plate, each portion of interest may correspond to a single well in the plate.
Although the image control window 300 displays an overlay image 302 comprised of two separate representations, most data manipulation and analysis of interest is performed on the luminescence representation. In particular, an analysis may include a summation of the illumination magnitudes over the pixels within a portion of the luminescence representation. Note that although the discussion will focus on a single luminescence representation for the overlay image 302, the image control window 300 may include multiple luminescence representations taken at different times.
In the illustrated embodiment, image control/measurement window 300 includes a control panel 312. The control panel 312 includes a plurality of user interface control components for facilitating manipulation and analysis of information in the image measurement window 301. To facilitate discussion, the user interface control components may be grouped into functional sections within the control panel 312. As illustrated, the control panel 312 includes a display function section 314, a create function section 316 and a measurement function section 318. Other arrangements, with or without a "control panel" are contemplated. The display function section 314 includes controls for allowing the user to manipulate the presentation of the photographic representation and the luminescence representation. To manipulate the presentation of the photographic representation, the display function section 314 includes a brightness setting 320. The brightness setting 320 is used for improving the user's visual perception of the photographic representation by allowing adjustment of the photograph's brightness. In alternative embodiments, other controls on a photographic image such as contrast, sharpness, and the like may be provided.
To manipulate the presentation of the luminescence representation, the display function section 314 includes an upper luminescence limit 322 and a lower luminescence limit 324. The upper luminescence limit 322 allows the user to designate the maximum data value displayed in the luminescence representation. Any pixels within the luminescence representation having a data value (e.g., a photon count) at or over this upper luminescence limit 322 will be displayed with a color corresponding to the upper luminescence limit 322. Similarly, the lower luminescence limit 324 allows the user to designate the minimum data value displayed in the luminescence representation. Any pixels within the luminescence representation having a data value below this lower luminescence limit 324 will not be displayed. Those pixels having a data value at the lower luminescence limit will be displayed with a color corresponding to the lower luminescence limit. Thus, the upper and lower luminescence limits specify the range of pixel illumination values over which the full range of display colors will vary. The upper luminescence limit 322 and the lower luminescence limit 324 may be useful when the user wants to selectively clear the image of outlying data for a particular analysis. Alternatively, the lower luminescence limit 324 may be useful when the user wants to clear the image of noise.
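The effect of the upper and lower luminescence limits can be expressed as a small mapping function. This sketch assumes the display colormap is driven by a normalized value in [0, 1]; `scale_for_display` is an invented name and the NaN convention for hidden pixels is an assumption of this example:

```python
import numpy as np

def scale_for_display(counts, lower, upper):
    """Map photon counts onto [0, 1] for colormapping. Pixels at or above
    the upper limit saturate at 1; pixels below the lower limit are masked
    out (NaN) so they are not displayed, as described for limits 322/324."""
    counts = np.asarray(counts, dtype=float)
    scaled = np.clip((counts - lower) / (upper - lower), 0.0, 1.0)
    scaled[counts < lower] = np.nan  # hidden, not rendered
    return scaled
```

Setting the limits to the minimum and maximum measured counts reproduces the 'full range' behavior of the global setting 326.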
The display function section 314 also includes a global setting 326. The global setting 326 provides a default option for the presentation of the luminescence representation. Specifically, the global setting sets the upper luminescence limit 322 and lower luminescence limit 324 to specified values. In a preferred embodiment, the upper luminescence limit 322 and lower luminescence limit 324 are set to the 'full range' of values for the luminescence representation. In other words, the upper limit is set to the value of the maximum intensity measured for any pixel in the luminescence representation and the lower limit is set to the value of the minimum intensity measured for any pixel in the luminescence representation. Alternatively, another preset option may set the upper luminescence limit 322 and lower luminescence limit 324 to a standardized range of values for the luminescence representation. For example, the standardized range may set the upper luminescence limit 322 at 95% of the maximum photon count for the luminescence representation and the lower luminescence limit 324 at 5% of the maximum photon count. Alternatively, another standardized range may be based on a statistical analysis, such as the standard deviation, of the range of data values for the luminescence representation. The display function section 314 may also include binning control. Binning is an image processing procedure often used to account for insufficient information, e.g., per pixel. For example, the number of pixels in each direction of the luminescence representation may be halved to produce a new pixel array comprising the magnitude of four previous pixels in a single new pixel. Binning in this manner may be useful to obtain a better signal to noise ratio for the luminescence representation, or to improve statistical analysis of the luminescence representation when the amount of data in the luminescent representation is small or otherwise better analyzed when pooled. 
In accordance with a specific embodiment of the present invention, the display function section 314 may include a user designated binning factor that specifies the amount of binning in each direction of the luminescence representation. Thus, for a bin factor of '5', the number of pixels in each direction of the luminescence representation may be reduced by a factor of 5 to produce a new pixel array in which the magnitudes of 25 previous pixels are combined into a single new pixel.
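For illustration only (the disclosure specifies the effect of binning but not its implementation), binning by a user-designated factor could be sketched as follows; summing the neighboring magnitudes is an assumption consistent with the pooling described above:

```python
import numpy as np

def bin_image(pixels, factor):
    """Reduce the pixel count by `factor` in each direction, pooling
    the magnitudes of factor x factor neighboring pixels into one
    new pixel."""
    h, w = pixels.shape
    h2, w2 = h // factor, w // factor
    # Trim any remainder rows/columns that do not fill a whole bin.
    trimmed = pixels[:h2 * factor, :w2 * factor]
    # Reshape into (h2, factor, w2, factor) blocks and sum each block.
    return trimmed.reshape(h2, factor, w2, factor).sum(axis=(1, 3))
```

A bin factor of 5 applied to a 10x10 array of ones thus yields a 2x2 array in which each new pixel holds the pooled magnitude 25.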
Hardware used in many conventional imaging systems may often produce errors in the displayed images. In one embodiment, the display function section 314 includes one or more software tools that compensate for hardware induced errors. For example, random points and 'spots' of abnormal activity may appear in the photographic and luminescence representations as a result of interactions with cosmic rays and other background radiation. FIG. 3F illustrates another example of an image control/measurement window 400 in accordance with the present invention. To compensate for radiation-induced random points and spots, the image control window 400 includes a cosmic control 402 that detects localized abnormal activity in the photographic and/or luminescence representation and corrects for it. In one embodiment, the cosmic control 402 applies a median filter to an abnormal point or spot and smooths the abnormality using values from neighboring points.
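A median-filter correction of the kind the cosmic control 402 is said to apply might be sketched as below; the 3x3 neighborhood and the spike-detection threshold are illustrative assumptions, not details from the disclosure:

```python
import numpy as np

def remove_cosmic_spots(image, threshold=5.0):
    """Replace isolated abnormal pixels with the median of their
    3x3 neighborhood, leaving normal pixels untouched."""
    padded = np.pad(image.astype(float), 1, mode='edge')
    corrected = image.astype(float).copy()
    h, w = image.shape
    for y in range(h):
        for x in range(w):
            # 3x3 window centered on (y, x) in the padded image.
            med = np.median(padded[y:y + 3, x:x + 3])
            # A pixel far above its neighborhood median is treated as
            # a cosmic-ray hit and smoothed to the median value.
            if image[y, x] > med * threshold and image[y, x] > med + threshold:
                corrected[y, x] = med
    return corrected
```

Applied to a uniform field containing a single bright spike, the spike is smoothed to the neighborhood value while the rest of the image is unchanged.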
Another hardware induced error often encountered is inconsistent light detection across the radius of conventional CCD camera lenses, e.g., detection becomes weaker as radius increases. This radial inconsistency significantly reduces imaging quality and may compromise data analysis. Correspondingly, the image control window 400 may include a flat field tool 404 that compensates for hardware induced radial inconsistency in a photographic or luminescence representation. In one embodiment, the flat field tool 404 calibrates images to be displayed with calibration data previously obtained and stored for the camera with which the representation was taken. The image control window may derive and maintain calibration data for each camera used in conjunction with the window. For the image control window 400 as illustrated in FIG. 3F, the flat field tool 404 is a toggle that, when turned on, automatically determines whether calibration is required for an image to be displayed (based on recorded imaging parameters in the data file for the image) and calibrates the image, if necessary. By automatically applying software correction to hardware induced errors using the flat field tool 404 in this manner, the user is provided a transparent solution to hardware induced errors. Often, it is desirable to calibrate the photographic representation and the luminescence representation to a blank view of the imaging box 12 without the light-emitting sample. The image for a blank view of the imaging box 12 without the light-emitting sample is often referred to as a 'dark image'. The dark image may indicate inherent defects in a solid state camera, which defects should be subtracted from images taken with the camera. For example, the dark image may contain bright spots corresponding to camera pixels having a high leakage current. To allow correction for such defective pixels, the display function section 314 may include a background compensation tool 325.
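Flat-field correction of the kind the flat field tool 404 performs is standard camera-calibration practice; a hypothetical sketch (the convention of normalizing the stored calibration image to unit mean response is an assumption) is:

```python
import numpy as np

def flat_field_correct(image, calibration):
    """Divide out the camera's radial sensitivity map so that a
    uniformly lit scene displays uniformly.

    `calibration` stands in for the stored per-camera calibration
    data: an image of a uniformly lit field, whose falloff encodes
    the lens's radial response.
    """
    # Normalize so the mean camera response corresponds to gain 1.0.
    gain = calibration / calibration.mean()
    return image / gain
```

If the calibration image falls off toward the edges and a captured image shows the same falloff, the corrected image comes out uniform.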
When a user selects the background compensation tool 325, the computer system alters the photographic representation and the luminescence representation to compensate for any information associated with the dark image. In one embodiment, the background compensation tool 325 is a toggle that enables automatic calibration of images. When turned on, the background compensation tool 325 automatically calibrates all displayed photographic or luminescent images against a dark image. If a suitable dark image is not available, the image control window 300 may notify the user that a suitable dark image was not available and prompt the user to obtain one.
To facilitate automatic calibration, the background compensation tool 325 may include a table of different dark images that compensate for different cameras and camera imaging conditions. Dark images within the table may vary based on binning parameters, exposure time, camera temperature, etc. Thus, upon initiating a photographic or luminescent image, the image control window 300 may probe camera conditions for the image and select an appropriate dark image from the table. The image control window 300 may also update the dark image values in the table. In imaging systems where the camera 20 is on for extended periods, for example, the image control window 300 may periodically update the table by capturing new dark images when the dark image reaches a certain age (e.g., 3 days), comparing these new dark images with those stored in the table and renewing them if there has been substantial change. Coupled with a tool for automatic calibration of all displayed images, automatic dark image renewal in this manner provides a transparent image calibration tool that further simplifies imaging analysis.
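Selecting a dark image by camera conditions and subtracting it could be sketched as follows; keying the table on (binning, exposure, temperature) and clamping negative residuals to zero are illustrative assumptions about details the disclosure leaves open:

```python
import numpy as np

def select_dark_image(table, binning, exposure, temperature):
    """Look up the stored dark image whose capture conditions match
    the current camera settings; return None when no match exists
    (the window would then prompt the user to obtain one)."""
    return table.get((binning, exposure, temperature))

def subtract_dark(image, dark):
    """Calibrate an image against a dark image by subtraction,
    clamping negative residuals to zero."""
    corrected = image - dark
    corrected[corrected < 0] = 0
    return corrected
```

With a dark image of uniform leakage level 3, a pixel counting 10 calibrates to 7, while a pixel counting only 2 clamps to 0 rather than going negative.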
The create function section 316 includes controls for allowing the user to create and manipulate tools which enable simple and flexible analysis of the data within the image measurement window 301. In the specific embodiment depicted, the create function section 316 includes a create button 326. The create button 326 allows the user to create a region of interest (ROI) with one action on the interface. For example, the user simply clicks on button 326 with a pointer and a new ROI appears in the image measurement window. The ROI may be any geometric shape or tool for measuring and analyzing data in a portion or portions of the luminescence representation. To facilitate generation of ROIs, the create function section 316 includes a pop-up menu 327. The pop-up menu 327 includes a number of ROIs commonly used for image analysis. For example, the create button 326 and pop-up menu 327 may allow the user to create an ellipse (circle) 328, a rectangle (square) 330 or a grid 332. Upon creating an ROI, a label may be attached to the geometric outline of the ROI for user clarity. In FIG. 3A, for example, a label 334 is attached to circle 328. The label 334 may include label identification and user information such as information relating to the light-emitting sample.
To manage multiple ROIs, the create function section 316 includes a designation pop-up menu 335. The designation pop-up menu 335 lists and numbers the ROIs as they are created. In addition, the designation pop-up menu 335 allows the user to re-access previously created ROIs that were numbered and stored. Typically, the ROI currently being accessed by the user is indicated to the user via highlights 336. The create function section 316 also includes a remove tool 337. The remove tool 337 allows the user to delete any or all of the ROIs stored in the designation pop-up menu 335. The remove tool 337 may also include a pop-up menu 338 for convenience in deleting the ROIs.
The image control window 300 also allows the user to manipulate the ROIs. Thus, after the circle 328 is dragged to its desired position, the size, shape, position and orientation of the circle 328 may be altered. For example, the orthogonal axes of the circle 328 may be altered to form an ellipse. The ellipse may then be characterized by a major axis and a minor axis. Similarly, the dimensions of the square 330 may be altered to form a rectangle. The manipulation of the ROIs may further include rotations, expansions, etc. In one embodiment, the dragging of an ROI is done by clicking a pointer 344 on a portion of the circle 328 and dragging. Alternatively, the reshaping of an ROI may be performed by clicking the pointer 344 on one of the highlights 346 and dragging. In another embodiment, the manipulation and alteration of the ROIs may include keyboard input.
While the image control window 300 only shows three ROI options, there are a large number of alternative ROI configurations which may be implemented. By way of example, the ROI options may include a free-hand drawing option, polygons of five or more sides, curve drawing options, and the like. In the free-hand drawing option, the user marks a series of points, which then form a perimeter of a closed ROI. Using the created ROIs, the user may then proceed to use one or more of the ROIs to measure and analyze data.
The image control window 300 may also allow the user to create and save custom ROIs. For example, a user may create a custom ROI that corresponds to a 24x16 well plate using the grid 332 and suitable size, position and shape manipulation. The grid may then be adapted such that each well in the 24x16 well plate receives one or more grid units of the grid 332. The custom grid ROI may then be saved with a name, e.g., '24x16 grid'. After saving, custom ROIs will appear in the designation pop-up menu 335 and may be repeatedly used as desired. Creating and saving custom ROIs in this manner removes the need to continually recreate the same ROI, which is often useful for imaging performed on the same sample over numerous days or other imaging scenarios requiring repetitive analysis. The measurement function section 318 includes GUI controls for allowing the user to measure and analyze data within the image measurement window 301. The illustrated measurement function section 318 includes a measure button 348. The measure command allows one or more functions for analysis of the data in the luminescence representation. By way of example, one function may be summation of all the pixel magnitudes within the perimeter of one or more of the ROIs. Another function may include an average of the magnitudes over the area within an ROI. Yet another function may also be a statistical analysis of the data within one or more of the ROIs.
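The measurement functions named above (summation, average and statistics over an ROI) can be sketched as below; the elliptical-mask construction and the returned field names are illustrative, not taken from the disclosure:

```python
import numpy as np

def ellipse_mask(shape, center, axes):
    """Boolean mask for an elliptical ROI with the given center and
    semi-axes (e.g., the major and minor axes), in pixel coordinates."""
    yy, xx = np.indices(shape)
    cy, cx = center
    ay, ax = axes
    return ((yy - cy) / ay) ** 2 + ((xx - cx) / ax) ** 2 <= 1.0

def measure_roi(luminescence, mask):
    """Summation, average and standard deviation of photon counts
    over the pixels inside the ROI perimeter."""
    values = luminescence[mask]
    return {'sum': values.sum(),
            'mean': values.mean(),
            'std': values.std(),
            'pixels': values.size}
```

A measurement over several ROIs would simply repeat `measure_roi` with each ROI's mask and, if desired, total the per-ROI sums.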
To increase measurement flexibility, the measurement function section 318 includes a measurement designation pop-up menu 350. The measurement designation pop-up menu 350 enables the user to specify which of the created ROIs stored in the designation pop-up menu 335 are to be included in a measurement. Correspondingly, the user may specify any or all of the previously created ROIs for a particular measurement.
The measurement function section 318 also includes a record option 352. The record option 352 allows the user to store the data produced from the measure command 348. In one embodiment, the data is stored in an electronic notebook. The electronic notebook is an on-line tool which allows testing results and information to be stored automatically and will be described further with respect to FIG. 3B.
The image control window 300 also includes numerous other user interface tools. For example, a global display tool 354 is included to allow the user to control which representations are displayed in the measurement window 301. More specifically, the user may select the overlay image 302 including the visual superposition of the photographic representation and the luminescence representation. Alternatively, using the global display tool 354, the user may select just one of the photographic representation and the luminescence representation. The global display tool 354 may also allow the user to select between numerous luminescence representations stored for the photographic representation. The global display tool 354 typically has a default setting when a data file is accessed. For the image control window 300, the default setting is the overlay image 302 comprising the photographic representation and one luminescence representation.
For the overlay image of FIG. 3A, overlying luminescent portions block out the underlying photographic image and thus prevent a user from viewing photographic details in the photographic image beneath the luminescent portions. To avoid losing photographic information in this manner, the image control/measurement window 400 includes a blend tool 406 that allows blending of the luminescent and photographic images. In other words, the blend tool 406 allows the underlying photographic image to display details in the region of the luminescent image. The blend tool 406 may also include a blend bar 408 that allows the user to vary the degree of opacity for the overlying luminescent image. More specifically, the blend bar 408 ranges from 0, corresponding to transparency of the overlying luminescent image, to 1, corresponding to no blending between the two images and the luminescent image blocking out the photographic image.
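The blend bar's opacity semantics can be sketched as simple alpha compositing; the exact arithmetic is an assumption, since the disclosure specifies only the endpoints of the blend bar:

```python
import numpy as np

def blend_overlay(photographic, luminescent, opacity):
    """Composite the luminescent image over the photographic image.

    `opacity` follows the blend bar 408: 0 leaves the luminescent
    layer fully transparent (only the photograph shows), while 1
    lets the luminescent layer completely block the photograph.
    """
    return (1.0 - opacity) * photographic + opacity * luminescent
```

Intermediate settings let photographic detail show through beneath the luminescent portions, which is the purpose of the blend tool 406.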
On the right side of the image control window 300 is a luminescence image display section 358. The luminescence image display section 358 includes a number of components to assist in viewing and comprehension of the luminescence representation. The luminescence image display section 358 includes an image maximum 360 and an image minimum 362. The image maximum 360 indicates the magnitude of the highest data value (photon count) for any pixel in the luminescence representation. The image minimum 362 indicates the magnitude of the lowest data value (photon count) for any pixel in the luminescence representation. The difference between the image maximum 360 and the image minimum 362 corresponds to the full range of pixel magnitudes for the luminescence representation.
The luminescence image display section 358 also includes a legend maximum 364 and legend minimum 366. The legend maximum 364 indicates the magnitude of the maximum data value (photon count) for the image measurement window 301. In other words, the legend maximum 364 corresponds to the upper luminescence limit 322. The legend minimum 366 indicates the magnitude of the minimum data value for the image measurement window 301, similarly corresponding to the lower luminescence limit 324. Thus, if the full range is selected in the global setting 326, the legend maximum 364 and the legend minimum 366 may correspond to the image maximum 360 and the image minimum 362. The legend maximum 364 indicates the magnitude of the highest data value (photon count) in the luminescence representation that will be displayed with the highest intensity color (e.g., red). Any pixels having an intensity magnitude of greater than or equal to the highest data value will be given the highest intensity color. The legend minimum 366 indicates the magnitude of the lowest data value (photon count) in the luminescence representation that will be displayed with the lowest intensity color (e.g., blue).
Included in the image display section 358 is a scale 364. The scale 364 provides a visual mapping between a range of colors for the luminescence representation and the magnitude range specified at 322 and 324. For the image control window 300, the scale may be represented by a gray scale, in which case individual magnitudes correspond to shades of gray, or by color, in which case individual magnitudes correspond to different colors.
The image control window 300 also includes a user information section 370. The user information section 370 provides testing information for the overlay 302, which may be helpful to the user. The testing information may include, for example, a user ID 371, a test classification 372, a testing reference number 373, a date 374, a label 375 and a comments field 376.
A print button 356 is also included and allows the user to conveniently print portions of the image control window 300. Typically, the print button 356 employs a default setting corresponding to preferred use. For the image control window 300, the default setting automatically prints the image measurement window 301, the luminescence image display section 358 and the user information section 370. Alternatively, the print command 356 may introduce a settings window to set the default print options.
Although the present invention has been discussed primarily in the context of manipulating simple two-dimensional images, the analysis tools and methods of the present invention are also suitable for more complicated applications. By way of example, to compensate for different sized specimens or images at varying depths, the imaging system 10 may take alternate luminescence images of the light-emitting sample. More specifically, to overcome dependency on the depth of the image, different wavelengths may be used in capturing the luminescence image.
As mentioned previously, the present invention may implement an electronic notebook. The electronic notebook allows the user to automatically store analysis data, ROI settings, measurement results and conveniently perform other useful note taking functions. FIG. 3B illustrates an electronic notebook page 380 in accordance with one embodiment of the present invention. In this case, the notebook page 380 appears as a separate window to the side of the image control window 300.
The electronic notebook page 380 may automatically store information corresponding to the overlay image 302 upon instantiation. By way of example, the electronic notebook page 380 may be instantiated when the user initially loads an image. In this case, image capture information will be entered first. Alternatively, the electronic notebook page 380 may be instantiated when the user creates an ROI for the first time or inputs any information that is relevant to an analysis of the luminescence representation. In addition, if the overlay image 302 is accessed from memory, the electronic notebook page 380 may be instantiated as last saved.
In one embodiment, the electronic notebook page 380 solely contains reference information corresponding to the overlay image 302 and does not include analysis information and results. In this case, the electronic notebook page 380 is referred to as a 'raw data' page. The electronic notebook page 380 may include a header 382. The header contains classification information for the overlay image 302 such as the testing reference number 373. The electronic notebook page 380 may also include a user input section 384. The user input section 384 may include, for example, the user ID 371, test classification 372, the label 375 and comments field 376 corresponding to the user information section 370 of the image control window 300. Alternatively, the user may enter information into the notebook page 380 as desired.
The electronic notebook page 380 may also include other information used in characterizing data or an image. For example, a photographic image section 386 includes information relevant to the settings used for photographic image capture using the imaging system 10. In addition, a luminescence image section 388 includes information relevant to the settings used for luminescence image capture using the imaging system 10.
For user convenience, the electronic notebook page 380 may be saved in a database which stores all the raw data files and corresponding image files in a common directory. In one embodiment, this database for raw data files is referred to as a 'raw data set'. More specifically, the database may be arranged such that a text file is associated with each image file in the data set. The text file may contain information about the image captured. By way of example, the image information may include the time the image was taken, the camera settings (e.g., the exposure length), image identification numbers, and labeling information entered by the user when the image was taken. In addition, each photographic representation and luminescence representation may have its own file in the raw data set.
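The pairing of each image with a companion ASCII text file could be sketched as follows; the file naming scheme and the 'key: value' line layout are hypothetical, chosen only to illustrate the generic-format idea described below:

```python
import os

def save_raw_data(directory, image_id, metadata):
    """Write the companion ASCII text file for an image in the raw
    data set. The image itself would be stored separately as, e.g.,
    `<image_id>.tif` in the same common directory.
    """
    os.makedirs(directory, exist_ok=True)
    path = os.path.join(directory, image_id + '.txt')
    with open(path, 'w') as f:
        # One 'key: value' line per piece of capture information.
        for key, value in metadata.items():
            f.write('%s: %s\n' % (key, value))
    return path
```

Because the companion file is plain ASCII, any software package can read the capture information without a proprietary parser.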
Preferably, the file formats used in the raw data set are generic to increase application flexibility. By way of example, the photographic representation and luminescence representation may be saved in a TIFF format to allow access from a wide variety of software packages. Similarly, each text file may be saved as an ASCII text file to allow flexible access. Thus, a raw data database may be established which is generic and easily accessible as will be described in further detail below.
In another embodiment, an electronic notebook page may contain information that includes analysis settings and measurement results. FIGs. 3C-E illustrate the automatic storage of analysis data and measurement results into an electronic notebook page 390 in accordance with another embodiment of the present invention. In this case, the electronic notebook page 390 is referred to as an 'analyzed data' page. In a preferred embodiment, the electronic notebook automatically stores the analysis data 382 obtained from the measure command 348 when the record option 352 is selected. Storing the analysis data provides the user a convenient mechanism to access the analysis data at a subsequent time. The analysis data may include the characteristic geometric information of the ROIs as well as the results of any measurements using the ROIs.
FIG. 3C illustrates the electronic notebook page 390 before analysis information is entered. In this case, the header 391 includes a testing reference 398 and file references 392. The testing reference 398 includes the analysis date in the first six numerical digits, the user name and may also include other testing information. In one embodiment, the electronic notebook page 390 automatically records an automatic date stamp indicating when an analysis was performed in the testing reference 398. The automatic date stamp may be advantageous in the future for verifying the date of analysis and testing. In one embodiment, the automatic date stamp is subsequently unalterable to further strengthen subsequent validation. The file references 392 refer to data files which include the loading information of the photographic representation and the luminescence representation. The file references 392 may also include the information relevant to the settings used for photographic and luminescence image capture using the imaging system 10.
FIG. 3D illustrates the electronic notebook page 390 after an analysis using three ROIs 393. For each ROI, its characteristic geometric information 394 regarding the perimeter of the ROI is automatically stored in the electronic notebook page 390. In addition, results 395 for the measurement within each ROI 393 are automatically stored. Further, total measurement results 396 for all three ROIs are automatically stored. FIG. 3E illustrates the electronic notebook page 390 prior to exiting the image control window 300.
At this point, filing information 397 may be stored prior to closing of the electronic notebook page 390 (FIG. 3E). In addition, the electronic notebook page 390 allows the user to make notes and input additional analysis information at any time. Preferably this is accomplished by simply placing the cursor/pointer at the desired location within the electronic notebook and typing in the notes of interest. By way of example, statistical information for the luminescence representation such as a number of pixels 398 in the ROIs, area sums 399 and standard deviation, etc. may be stored if they are not included in the automatic transfer of information from the image control window 300. Alternatively, luminescence representation information such as the image maximum 360, image minimum 362 and average data value per pixel may be stored in the electronic notebook page 390. Broadly speaking, the electronic notebook page 390 may store any information relevant to re-creating a measurement or analysis.
In addition to the raw data set, a directory may be maintained for the electronic notebook page 390 and similar other analyzed data files. In one embodiment, this analyzed file directory is referred to as an 'analyzed data set'. Thus, two databases may be maintained: the first containing raw data and the second containing analyzed data. In one embodiment, the files stored in the analyzed data set will contain information about the photographic representation, the luminescence representation and the text file corresponding to the analyzed data all in one file. The file may be any format which accommodates these pictorial and text components. By way of example, the file may be a Living Image file suitable for use with the Living Image Software.
FIG. 4 is a flowchart representative of an exemplary data analysis using the image control window 300. A process flow 560 typically begins with accessing a data file (562). Accessing the data file may include opening a data file previously stored in memory. Alternatively, the data file may be accessed from an image recently captured by the imaging system 10, without opening a stored file.
After the image measurement window 301 for the data file is displayed (564), the user may alter the photographic representation and the luminescence representation using any of the tools in the display function section 314 (566). By way of example, the user may alter the upper luminescence limit 322 and lower luminescence limit 324 to facilitate viewing clarity and comprehension of the image. For an initial use of an image, the user may also enter details in the user information section 370. In addition, the user may proceed to further alter the image display (568). By way of example, the user may alter the upper luminescence limit 322 and lower luminescence limit 324 to facilitate viewing clarity of a particular portion of the luminescence representation to be analyzed. Upon determining which portion or portions of the luminescence representation are to be analyzed, the user may then create one or more ROIs (570). After the corresponding ROIs are generated, the user may alter the generic ROIs for a particular analysis (572). More specifically, the user may manipulate the position, shape and angle of rotation of one or more created ROIs. Upon completion of ROI manipulation, the user may then perform a measurement within one or more of the ROIs (574). In a preferred embodiment, the measurement involves taking a summation of the photon counts for each of the pixels within the perimeter of one or more ROIs. The results of the measurement may then be transferred to the electronic notebook page 390 (576).
The user may then continue (578) to make measurements on the same or different portions of the luminescence representation using the image control window 300. Correspondingly, the user may return to alter the image display for another measurement (568), create more ROIs (570) or manipulate the existing ROIs (572). If the user is finished with analysis on the current image, then the user may save the work and exit (579). It should be noted that the process flow 560 is one method of using the image control window 300 for analysis of the overlay image 302. Obviously, many of the elements of the process flow 560 may be repeated or performed outside of the order illustrated. By way of example, at any point the user may alter the image display to improve viewing (568), print the image control window 300 or make notes in the electronic notebook page 390 (578).
After imaging a large number of samples, a user or user group may generate and store a library of photographic, luminescence, and digital overlay images as well as associated data files. For many imaging applications, the amount of data accumulated may be excessive. Correspondingly, the present invention may provide computer interfaces and tools to assist a user with imaging data management. In accordance with one aspect of the present invention, a data management system is provided with one or more data management windows, each allowing the user to perform operations that are particularly useful for browsing, presenting and analyzing imaging data.
As imaging systems in accordance with the present invention are often implemented in laboratories, research sites and facilities that do not benefit from comprehensive information services (IS) and specialized hardware and software support, data management tools in accordance with one embodiment of the present invention are designed without requiring such specialized services. In contrast, conventional proprietary data management systems often require specialized skills outside the experience of most researchers and computer users. To avoid situations where imaging system users must rely on specialized data management support, data management tools in accordance with one embodiment of the present invention allow researchers and users to manage large amounts of images and imaging data in a simplified manner that does not require specialized data management skills.
In one embodiment, simplified imaging data management tools are implemented using a series of data management control windows that manage imaging data according to a file structure. FIG. 5A illustrates an exemplary file structure 400 for data management in accordance with one embodiment of the present invention. The file structure 400 is illustrated in a graphical user interface (GUI) environment and includes a base file 401 that comprises various graphics files 402a-e and data files 404a-b common to a single image or sample. Numerous base files 401 may be grouped in a hierarchical arrangement according to date, imaging conditions, sample, testing specifics, etc. Grouping in this manner allows a user to customize file arrangements and higher level groups according to user preferences. For example, base files 401 for testing the effect of a biological chemical on several biological specimens over an extended period of time may be labeled by day, the days grouped into a parent file by specimen, and the specimen files grouped into a parent file by chemical test. In an environment where many users work simultaneously on the imaging system, the chemical test files may be further grouped according to user.
To assist a user in searching through a potentially large number of stored data and images and finding desired information, the present invention includes browsing tools useful for retrieving desired information. FIG. 5B illustrates a browser control window 410 in accordance with one embodiment of the present invention. The browser control window 410 allows the user to specify one or more search criteria for obtaining a set of data files containing the search criteria. For the browser control window 410, the search criteria are determined by user input into one or more fields 412a-416a designated by search label names 412b-416b. The fields 412a-416a of the browser control window 410 correspond to text, values or other information that may be input and stored by a user in a data file associated with an image.
Generally speaking, the search criteria for the browser control window 410 may correspond to any information stored in the various imaging and information data files in the computer 28. By way of example, the data files to be searched may correspond to the electronic notebook page 380 of FIG. 3B and the information may include reference information corresponding to the overlay image 302. More specifically, the information used in searching may include any information found in the user input section 384 and photographic image section 386 of the electronic notebook page 380. Alternatively, the information may include the capture date of an image, camera specifications, camera calibration information and any information useful for re-capturing the image, imaging specifics such as binning parameters and exposure time, image identification numbers, labeling information entered by the user when the image was taken, ROI analysis information, etc. As illustrated, the browser control window 410 includes a Series field 412a, an Experiment field 413a, a Label field 414a, a Comment field 415a and an analysis Comment field 416a. In many cases, the information is stored in relation to a similar data label name in the data file. The browser control window 410 also includes toggles 438a-e that allow a user to turn off or on each of the fields 412a-416a as a criterion in a search.
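Matching the enabled search fields against stored data files might be sketched as below; the case-insensitive substring matching rule and the dictionary representation of a data file are assumptions, since the disclosure describes the interface rather than the matching algorithm:

```python
def search_data_files(data_files, criteria):
    """Return the data files whose fields contain every enabled
    search term.

    `data_files` is a list of dicts mapping label names to stored
    text; `criteria` maps label names to the user's search input,
    with fields toggled off simply omitted from the mapping.
    """
    matches = []
    for record in data_files:
        # Every enabled field must contain its search term
        # (case-insensitive substring match).
        if all(term.lower() in record.get(field, '').lower()
               for field, term in criteria.items()):
            matches.append(record)
    return matches
```

Toggling a field off thus just drops it from `criteria`, widening the search, while enabling several fields narrows the result set to files matching all of them.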
FIG. 5C illustrates an empty data file 430 in accordance with a specific embodiment of the present invention. The data file 430 includes one or more data fields 432a-432e designated by data label names 434a-434e stored in a simple text format. The data label names 434a-434e correspond to the search label names 412b- 416b of the browser control window 410. Text, values or other information for each of the data fields 432a-432e may be input into the data file 430 during initial creation and analysis of the corresponding image or subsequent analysis and data file 430 alteration.
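The simple text format described above can be illustrated with a short sketch. The exact on-disk syntax is not specified by the description; the "label: value" line convention, the function name, and the sample field values below are assumptions for illustration only.

```python
# Hypothetical sketch of the simple text data file format of FIG. 5C:
# each data field is stored as a "label: value" line, so a data file can
# be parsed into a dictionary keyed by data label name. The on-disk
# syntax shown here is an assumption; the description only states that
# fields are stored in a simple text format.

def parse_data_file(text):
    """Parse 'label: value' lines into a dict of data fields."""
    fields = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or ":" not in line:
            continue  # skip blank and malformed lines
        label, _, value = line.partition(":")
        fields[label.strip()] = value.strip()
    return fields

sample = """\
group ID: G-12
experiment number: 7
time point: 24h
animal number: 3
comment 1: baseline scan
"""
record = parse_data_file(sample)
```

A record parsed this way maps directly onto the data fields 432a-432e, keyed by their data label names.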
As the information included in data files may vary between users, the present invention allows users to customize the labels within the data file 430 and within the browser control window 410. This allows browsing, presenting and analyzing of image data to be customized to a user. This is often advantageous when a user produces a large number of user-specific images that are related in some manner, e.g., a common testing procedure for a large number of samples tested daily.
To further improve data management, the labels within the browser control window 410 may be grouped into customized sets common to a large number of data files. The data file 430 includes three exemplary User Label Name Sets 432, 434 and 436 used to group common label names. The User Label Name Set 434, entitled 'Xenogen Oncology', includes data fields having data label names: group ID, experiment number, time point, animal number, cell line & number, animal model, comment 1, comment 2, IACUC number, animal strain, and user. For the User Label Name Set 434, all data files produced in the set will include information for these data fields. The User Label Name Set 436, entitled 'Xenogen Infectious Disease', includes another set of data fields having data label names: group ID, experiment ID, animal model, animal strain, pathogen, route of infection, dose, treatment, animal number, time point, comment 1, comment 2, and IACUC number. Similarly, for the User Label Name Set 436, all data files produced in the set will include information for these data fields. Referring back to the browser control window 410 of FIG. 5B, a Label Name Set tool 446 has a pulldown menu 447 that includes User Label Name Sets. In one embodiment, selecting a specific User Label Name Set using the pulldown menu 447 automatically generates search label names and fields corresponding to the data fields and data label names used in data files of that User Label Name Set, thus simplifying searching within the chosen set.
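The relationship between User Label Name Sets and the search fields they generate can be sketched as follows. This is illustrative only: the sets are modeled as named lists of data label names, with contents following FIG. 5C; the variable and function names are assumptions.

```python
# Illustrative model of User Label Name Sets: each set is a named list of
# data label names. Selecting a set yields the search label names for the
# browser control window, mirroring the pulldown behavior described above.
# Set contents follow FIG. 5C; all identifiers here are assumptions.

USER_LABEL_NAME_SETS = {
    "Xenogen Oncology": [
        "group ID", "experiment number", "time point", "animal number",
        "cell line & number", "animal model", "comment 1", "comment 2",
        "IACUC number", "animal strain", "user",
    ],
    "Xenogen Infectious Disease": [
        "group ID", "experiment ID", "animal model", "animal strain",
        "pathogen", "route of infection", "dose", "treatment",
        "animal number", "time point", "comment 1", "comment 2",
        "IACUC number",
    ],
}

def search_fields_for(set_name):
    """Return the search label names generated for a chosen set."""
    return list(USER_LABEL_NAME_SETS[set_name])

oncology_fields = search_fields_for("Xenogen Oncology")
```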
The browser control window 410 also includes a User ID tool 440 to allow a user to specify a user ID as a search criterion. The user ID tool 440 includes a pulldown menu 442 that contains a list of previously recorded user identifications and an open ID field 444 that allows a user to specify non-recorded user identification information. When all search criteria have been entered, the user may initiate a search by selecting the 'done' tool 446. In one embodiment, the browser control window 410 searches solely through data and text files in the file structure 400 according to a text or string search.
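The text/string search described above, together with the field toggles 438a-e, can be sketched minimally. The matching rule (case-insensitive substring match on each toggled-on field) is an assumption for illustration; the description only states that a text or string search is performed.

```python
# Minimal sketch of the browser search: only fields whose toggle is on
# participate, and a file matches when every active criterion appears as
# a substring of the corresponding stored value (a simple text search).
# The case-insensitive substring rule is an assumption.

def matches(record, criteria, toggles):
    """Return True if record satisfies every toggled-on criterion."""
    for label, needle in criteria.items():
        if not toggles.get(label, False):
            continue  # field toggled off: ignored in the search
        if needle.lower() not in record.get(label, "").lower():
            return False
    return True

records = [
    {"Series": "tumor-A", "Label": "day 3"},
    {"Series": "tumor-B", "Label": "day 3"},
]
criteria = {"Series": "tumor-A", "Label": "day"}
toggles = {"Series": True, "Label": True}
hits = [r for r in records if matches(r, criteria, toggles)]
```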
The browser control window 410 may include other features to expedite and simplify data management. In one embodiment, information entered into the browser control window 410 is stored and re-displayed when the browser control window 410 is re-started, removing the need to re-enter information common to successive searches and expediting repeated searching with similar search criteria. In another embodiment, each of the search label names 412b-416b may also include search pulldown menus 412c-416c that contain a list of previously recorded values. Recording these values may remove the need to continually enter the same information and expedite input into the fields 412a-416a. FIG. 5D illustrates a search label name editing window 450 in accordance with a specific embodiment of the present invention. The search label name editing window 450 includes search label names 452 corresponding to the data label name set 'Xenogen Oncology' of FIG. 5C. For each search label name 452, the window 450 includes a standard values menu 454 that allows a user to enter and edit standard values that will appear in the search pulldown menus of the browser control window 410.
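The recorded-values behavior behind the search pulldown menus can be sketched as a small history structure. The most-recent-first ordering and de-duplication shown here are assumptions, not details given in the description.

```python
# Hypothetical sketch of recorded values for a field's search pulldown
# menu: values entered into a field are recorded so they can be offered
# again on later searches. Most-recent-first ordering and duplicate
# removal are assumptions for illustration.

def record_value(history, label, value):
    """Record a value for a field's pulldown, most recent first."""
    values = history.setdefault(label, [])
    if value in values:
        values.remove(value)  # avoid duplicate menu entries
    values.insert(0, value)
    return history

history = {}
record_value(history, "Series", "tumor-A")
record_value(history, "Series", "tumor-B")
record_value(history, "Series", "tumor-A")  # re-entry moves it to the top
```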
FIG. 5E illustrates an exemplary table 460 resulting from a search using the browser control window 410. The table 460 includes columns 462 corresponding to the search label names that were used during the search, e.g., search label names 412b-416b. Individual files 464 that included the search criteria information are listed as rows as illustrated. In one embodiment, the individual files 464 are listed in the order that they are found. As the number of files produced by such a search may be excessive, the present invention may also include sorting tools to assist the user.
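Constructing the results table of FIG. 5E from matching files can be sketched as a simple projection. The row contents and function name below are hypothetical.

```python
# Sketch of building the results table of FIG. 5E: each matching file
# contributes one row, with one column per search label name used in the
# search, rows listed in the order found. Sample data is hypothetical.

def build_table(matching_records, columns):
    """Project each matching record onto the searched columns."""
    return [[rec.get(col, "") for col in columns] for rec in matching_records]

columns = ["Series", "Experiment", "Label"]
found = [{"Series": "S1", "Experiment": "E2", "Label": "day 1"}]
table = build_table(found, columns)
```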
FIG. 5F illustrates a sorting priority window 580 in accordance with a specific embodiment of the present invention. The sorting priority window 580 includes three priority levels 582a-582c. Each priority level 582 includes a pulldown menu 584 that includes each of the search label names used in a search, e.g., search label names 412b-416b. For the sorting priority window 580, only the first priority level 582a is selected, with the search label name 'Series' 412b. The search label names used in the pulldown menus 584a-584c are determined using a User Label Name Set pulldown 586. The sorting priority window 580 may also include additional conventional sorting tools such as a reverse toggle 588.
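The three-level priority sort with a reverse toggle reduces to an ordinary multi-key sort. A minimal sketch, assuming rows are dictionaries keyed by search label name:

```python
# Sketch of the sorting priority of FIG. 5F: result rows are sorted by up
# to three search label names in priority order, with an optional reverse
# toggle. Row contents here are hypothetical.

def sort_rows(rows, priorities, reverse=False):
    """Sort result rows by the given label names, highest priority first."""
    key = lambda row: tuple(row.get(label, "") for label in priorities)
    return sorted(rows, key=key, reverse=reverse)

rows = [
    {"Series": "B", "Experiment": "2"},
    {"Series": "A", "Experiment": "9"},
    {"Series": "A", "Experiment": "1"},
]
ordered = sort_rows(rows, ["Series", "Experiment"])
```

Because Python's `sorted` is stable and keys are compared as tuples, ties on 'Series' fall through to 'Experiment', exactly the priority-level behavior described.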
The present invention may also include other user interface components outside of the image control/measurement window 300 and data management windows described above. By way of example, FIG. 6A illustrates an image capture GUI 500 suitable for controlling the imaging system 10. The image capture GUI 500 includes an imaging mode control section 502. The imaging mode control section 502 allows the user to designate one or multiple images to be taken. The image capture GUI 500 also provides the user with interfaces to control platform selection 504, lights on/off 505, set light strength 506 and set the exposure duration 508. The image capture GUI 500 may also include other functionality and tools useful in obtaining an image with the imaging system 10 not shown in FIG. 6A. By way of example, the image capture GUI 500 may also include a control for manipulating the photon threshold for registering a pixel in the luminescence representation. Correspondingly, the photon threshold may be used to reduce electronic noise when capturing the luminescence representation.
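The photon-threshold noise suppression can be illustrated on a small pixel grid. This is a sketch only: a pure-Python grid stands in for the camera's image array, and the zeroing rule (pixels below the threshold are not registered) is the assumed behavior.

```python
# Illustration of the photon threshold described above: pixels whose
# photon count falls below the threshold are not registered in the
# luminescence representation, suppressing electronic noise. A plain
# Python grid stands in for the camera image array.

def apply_photon_threshold(counts, threshold):
    """Zero out pixels whose photon count is below the threshold."""
    return [
        [c if c >= threshold else 0 for c in row]
        for row in counts
    ]

noisy = [
    [2, 0, 150],
    [3, 220, 1],
]
cleaned = apply_photon_threshold(noisy, threshold=10)
```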
FIG. 6B illustrates a second image capture GUI 520 suitable for controlling the imaging apparatus of FIG. 1 in accordance with another embodiment of the present invention. The image capture GUI 520 includes an imaging mode control section 522 that allows the user to designate one or more images to be captured and control parameters associated with the images. The imaging mode control section 522 allows the user to capture a photographic image 524 and a luminescence image 526. For both imaging types, the image capture GUI 520 allows the user to control both software and hardware parameters of imaging. For example, the image capture GUI 520 allows the user to designate software parameters such as a binning factor 534 and to designate whether a combined photographic and luminescence image is represented as an overlay image using an overlay tool 536. The GUI 520 also allows the user to control hardware parameters in the imaging apparatus 10 such as an exposure time 528 for the image, lights 540, live mode imaging 542, an f-stop 532 that designates the aperture of the camera lens, and a filter control 534. The filter control 534 includes pulldown menus 538a and 538b that allow the user to select one of a series of filters implemented in the imaging apparatus 10. The image capture GUI 520 allows the user to input camera cooling control parameters such as a temperature 544 and display temperature conditions for the camera 20. A field of view (FOV) control 546 allows the user to adjust the image field of view by moving the sample shelf to or from the lens. A focus tool 548 allows the user to manually or automatically control camera 20 focus. A system status display 550 provides the user with continual updates of the current imaging status for the imaging apparatus 10. In a specific embodiment, the GUI 520 is suitable for use with an 'in-vivo imaging system' IVIS as produced by Xenogen Corporation of Alameda, CA.
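The binning factor set in the imaging mode control section can be sketched as block summation over the pixel grid: an n-by-n binning factor sums blocks of adjacent pixels, trading spatial resolution for sensitivity. The sketch below assumes image dimensions are exact multiples of the binning factor.

```python
# Hedged sketch of n-by-n pixel binning: each n-by-n block of raw pixel
# counts is summed into one binned pixel, increasing sensitivity at the
# cost of spatial resolution. Dimensions are assumed to be exact
# multiples of the binning factor n.

def bin_pixels(counts, n):
    """Sum each n-by-n block of the pixel grid into one binned pixel."""
    rows, cols = len(counts), len(counts[0])
    return [
        [
            sum(counts[r + dr][c + dc] for dr in range(n) for dc in range(n))
            for c in range(0, cols, n)
        ]
        for r in range(0, rows, n)
    ]

raw = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
    [13, 14, 15, 16],
]
binned = bin_pixels(raw, 2)
```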
FIGs. 7A and 7B illustrate a computer system 600 suitable for implementing embodiments of the present invention. FIG. 7A shows one possible physical form of the computer system. Of course, the computer system may have many physical forms ranging from an integrated circuit, a printed circuit board and a small handheld device up to a huge supercomputer. Computer system 600 includes a monitor 602, a display 604, a housing 606, a disk drive 608, a keyboard 610 and a mouse 612. Disk 614 is a computer-readable medium used to transfer data to and from computer system 600.
FIG. 7B is an example of a block diagram for computer system 600. Attached to system bus 620 are a wide variety of subsystems. Processor(s) 622 (also referred to as central processing units, or CPUs) are coupled to storage devices including memory 624. Memory 624 includes random access memory (RAM) and read-only memory (ROM). As is well known in the art, ROM acts to transfer data and instructions uni-directionally to the CPU and RAM is used typically to transfer data and instructions in a bi-directional manner. Both of these types of memories may include any of the computer-readable media described below. A fixed disk 626 is also coupled bi-directionally to CPU 622; it provides additional data storage capacity and may also include any of the computer-readable media described below. Fixed disk 626 may be used to store programs, data and the like and is typically a secondary storage medium (such as a hard disk) that is slower than primary storage. It will be appreciated that the information retained within fixed disk 626 may, in appropriate cases, be incorporated in standard fashion as virtual memory in memory 624. Removable disk 614 may take the form of any of the computer-readable media described below.
CPU 622 is also coupled to a variety of input/output devices such as display 604, keyboard 610, mouse 612 and speakers 630. In general, an input/output device may be any of: video displays, track balls, mice, keyboards, microphones, touch-sensitive displays, transducer card readers, magnetic or paper tape readers, tablets, styluses, voice or handwriting recognizers, biometrics readers, or other computers. CPU 622 optionally may be coupled to another computer or telecommunications network using network interface 640. With such a network interface, it is contemplated that the CPU might receive information from the network, or might output information to the network in the course of performing the above-described method steps. Furthermore, method embodiments of the present invention may execute solely upon CPU 622 or may execute over a network such as the Internet in conjunction with a remote CPU that shares a portion of the processing.
In addition, embodiments of the present invention further relate to computer storage products with a computer-readable medium that have computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (ASICs), programmable logic devices (PLDs) and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher level code that are executed by a computer using an interpreter.
Although the present invention has been discussed primarily in the context of making measurements for the summation of photon counts within the image measurement window 301, the present invention is suitable for other imaging applications and may be tailored correspondingly. By way of example, the present invention may be adapted for analysis of high detail in-vivo applications and thus may include zoom tools in the display function section 314. Although various details have been omitted for brevity's sake, obvious design alternatives may be implemented. Therefore, the present examples are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope of the appended claims.

Claims

What is claimed is:
1. A computer system capable of displaying and analyzing an image, the computer system comprising: one or more processors; one or more user input devices; a display capable of displaying the image and associated information in particular ways responsive to input signals from one or more of the input devices and signals from one or more of the processors; and a graphical user interface running on one or more of the processors and providing, on a single display window, the image and one or more tools for defining a region of interest on the image, wherein when a user uses one of the tools to define a region of interest, the computer system can calculate information about a portion of the image within the defined region of interest.
2. The computer system of claim 1, wherein the display window includes a measurement tool which when selected causes the computer system to automatically calculate the information about the portion of the image when the user uses one of the tools to define the region of interest on the image.
3. The computer system of claim 1, wherein the graphical user interface further provides on the single display window display controls for controlling at least one of the following features of the displayed image: threshold, brightness, contrast, and sharpness.
4. The computer system of claim 1, wherein the one or more tools for defining the region of interest allow the user to graphically create at least one of a rectangle on the image, an ellipse on the image, and a grid on the image.
5. The computer system of claim 1, wherein at least one of the one or more tools is a button which, when selected, causes a region of interest to appear on the displayed image.
6. The computer system of claim 1, wherein the graphical user interface further includes a pointer displayed on the display, and wherein the region of interest can be moved and reshaped by action of the pointer.
7. A user interface for use with an imaging system, the imaging system including a processor and a display for presenting and analyzing an image obtained from the imaging system, the image comprising a photographic representation of an object and a luminescence representation of the object, the luminescence representation corresponding to the location and magnitude of electro-magnetic radiation emitted from the object, the user interface comprising:
a first display control permitting a user to manipulate the visual presentation of the luminescence representation and the photographic representation;
a second display control permitting the user to create at least one region of interest on the luminescence representation; and
a third display control permitting the user to make a measurement of a portion of the luminescence representation bounded by the at least one region of interest.
8. The user interface of claim 7 wherein none of the display controls block the visual presentation of the photographic representation or the luminescence representation.
9. The user interface of claim 7 further including a fourth display control permitting the user to select which of the photographic representation and the luminescence representation are to be displayed.
10. The user interface of claim 7 wherein the luminescence representation includes in-vivo luminescence data from at least one mammalian specimen.
11. The user interface of claim 7 wherein the first display control permits the user to adjust threshold magnitudes of the luminescence data which will be displayed in the luminescence representation of the image.
12. The user interface of claim 11 wherein the first display control permits the user to set an upper threshold magnitude and a lower threshold magnitude on the display of the luminescence representation.
13. The user interface of claim 7 wherein the first display control includes a blend tool that allows the underlying photographic image to display details in the region of the luminescent image.
14. The user interface of claim 13 wherein the blend tool comprises a blend bar that allows the user to vary the degree of opacity for the luminescent image.
15. The user interface of claim 7 wherein the first display control comprises binning control.
16. The user interface of claim 7 further comprising a tool that compensates for hardware induced errors in the imaging system.
17. The user interface of claim 7 wherein the luminescence representation comprises photon counts for each pixel in the image.
18. The user interface of claim 7 wherein the luminescent emissions data includes data obtained from using a live mode.
19. The user interface of claim 7 further including a fifth display control permitting the user to print all or a portion of the image.
20. The user interface of claim 7 wherein the at least one region of interest includes one of an ellipse, a rectangle or a grid.
21. The user interface of claim 7 wherein the user interface permits the user to manipulate the size and position of at least one region of interest and create a custom region of interest.
22. The user interface of claim 21 wherein the user interface permits the user to save the custom region of interest.
23. The user interface of claim 7 further including an electronic notebook space displaying text information pertaining to the image.
24. The user interface of claim 23 wherein the electronic notebook page automatically stores information corresponding to the photographic representation and the luminescence representation.
25. The user interface of claim 23 wherein the electronic notebook page automatically stores measurement information generated via the third display control.
26. The user interface of claim 25 wherein the electronic notebook page automatically date stamps the measurement information.
27. The user interface of claim 25 wherein the electronic notebook page is stored in an analysis directory.
28. The user interface of claim 7 further comprising a background compensation tool that calibrates a photographic or luminescent image with a dark image.
29. The user interface of claim 28 wherein the background compensation tool automatically calibrates a photographic or luminescent image.
30. In a computer system, an electronic notebook for displaying data in conjunction with a user interface, the electronic notebook comprising:
a displayed space for recording image analysis data; and
a displayed date stamp applied to the electronic notebook when the image analysis data is accessed or manipulated, wherein the displayed date stamp indicates when the data was recorded.
31. The electronic notebook of claim 30 wherein the data is obtained via an in-vivo imaging application.
32. The electronic notebook of claim 31 wherein the electronic notebook is created when an image for the in-vivo imaging application is acquired.
33. The electronic notebook of claim 30 wherein the electronic notebook automatically stores measurement information obtained from the image analysis.
34. The electronic notebook of claim 30 wherein the electronic notebook is stored as a text file in a base file, the base file comprising graphics files and data files common to a single image.
35. The electronic notebook of claim 30 further comprising data label names that correspond to search label names included in a graphical user interface browser.
36. A method implemented on a computer system, the method analyzing a region of interest on an image presented on a display associated with the computer system, the method comprising:
defining a region of interest on the image by selecting a region of interest tool from a user interface presented on the display, such that the region of interest tool and the image are concurrently displayed on the display; and
calculating a property of the image within the region of interest.
37. The method of claim 36, wherein defining a region of interest comprises at least one of positioning and reshaping, in response to a user input, the region of interest on the image to define boundaries within which the property is calculated.
38. The method of claim 36, wherein the region of interest is provided as at least one of a polygon, an ellipse, and a grid.
39. The method of claim 36, further comprising adjusting, in response to a user input, at least one of the brightness, sharpness, and contrast of the image.
40. The method of claim 36, further comprising displaying an electronic notebook on the display concurrently with the region of interest tool and the image, wherein the calculated property of the image within the region of interest is displayed in the electronic notebook.
41. The method of claim 40, further comprising displaying in said electronic notebook details about an experiment or imaged object.
42. The method of claim 40, further comprising date stamping information appearing in the electronic notebook when said information is added to the electronic notebook.
43. The method of claim 40, further comprising displaying in said electronic notebook user notes about an experiment or imaged object.
44. A computer program product comprising a computer readable medium and program instructions provided via the computer readable medium, the program instructions comprising instructions for analyzing a region of interest on an image presented on a display associated with a computer system, the instructions specifying:
defining a region of interest on the image by selecting a region of interest tool from a user interface presented on the display, such that the region of interest tool and the image are concurrently displayed on the display; and
calculating a property of the image within the region of interest.
45. The computer program product of claim 44, wherein instructions for defining a region of interest comprises instructions for at least one of positioning and reshaping, in response to a user input, the region of interest on the image to define boundaries within which the property is calculated.
46. The computer program product of claim 44, further comprising instructions for displaying an electronic notebook on the display concurrently with the region of interest tool and the image, wherein the calculated property of the image within the region of interest is displayed in the electronic notebook.
PCT/US2000/031482 1999-11-15 2000-11-14 Graphical user interface for in-vivo imaging WO2001037195A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU17691/01A AU1769101A (en) 1999-11-15 2000-11-14 Graphical user interface for in-vivo imaging

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/439,381 US6614452B1 (en) 1999-11-15 1999-11-15 Graphical user interface for in-vivo imaging
US09/439,381 1999-11-15

Publications (2)

Publication Number Publication Date
WO2001037195A2 true WO2001037195A2 (en) 2001-05-25
WO2001037195A3 WO2001037195A3 (en) 2002-07-11

Family

ID=23744486

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/031482 WO2001037195A2 (en) 1999-11-15 2000-11-14 Graphical user interface for in-vivo imaging

Country Status (3)

Country Link
US (5) US6614452B1 (en)
AU (1) AU1769101A (en)
WO (1) WO2001037195A2 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2827399A1 (en) * 2001-07-12 2003-01-17 Bruno Regnier Measuring tablet with a tactile screen, uses scanner of digital images with manual input to tactile screen to allow user to identify data of interest and perform computations or visualize the data
US6647179B2 (en) 1999-12-21 2003-11-11 Agilent Technologies, Inc. Process and device for making gratings in optical fibres
US6649143B1 (en) 1994-07-01 2003-11-18 The Board Of Trustees Of The Leland Stanford Junior University Non-invasive localization of a light-emitting conjugate in a mammal
EP1406081A1 (en) * 2001-07-03 2004-04-07 Hitachi, Ltd. Biological sample optical measuring method and biological sample optical measuring apparatus
US6737245B1 (en) 1999-09-08 2004-05-18 Xenogen Corporation Luciferase expression cassettes and methods of use
US6867348B1 (en) 1999-12-16 2005-03-15 Xenogen Corporation Methods and compositions for screening for angiogenesis modulating compounds
US7056728B2 (en) 2000-07-06 2006-06-06 Xenogen Corporation Compositions and methods for use thereof in modifying the genomes of microorganisms
US7366333B2 (en) * 2002-11-11 2008-04-29 Art, Advanced Research Technologies, Inc. Method and apparatus for selecting regions of interest in optical imaging
US7449615B2 (en) 1998-12-17 2008-11-11 Xenogen Corporation Non-invasive evaluation of physiological response in a transgenic mouse
CN100431475C (en) * 2003-04-25 2008-11-12 奥林巴斯株式会社 Device, method and program for image processing
WO2009003515A1 (en) * 2007-07-02 2009-01-08 Trimble Jena Gmbh Feature detection apparatus and metod for measuring object distances
US8545814B2 (en) 1994-07-01 2013-10-01 The Board Of Trustees Of The Leland Stanford Junior University Non-invasive localization of a light-emitting conjugate in a mammal
US8705817B2 (en) 2008-10-24 2014-04-22 Eos Imaging Measurement of geometric quantities intrinsic to an anatomical system
CN107991295A (en) * 2014-02-17 2018-05-04 安盛生科股份有限公司 Utilize mobile device measuring physics and biochemical parameters

Families Citing this family (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6614452B1 (en) * 1999-11-15 2003-09-02 Xenogen Corporation Graphical user interface for in-vivo imaging
US7581191B2 (en) * 1999-11-15 2009-08-25 Xenogen Corporation Graphical user interface for 3-D in-vivo imaging
JP2003534079A (en) * 2000-05-24 2003-11-18 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Direct mouse control of measurement functions for medical images
WO2002009571A2 (en) * 2000-07-31 2002-02-07 Galil Medical Ltd. Planning and facilitation systems and methods for cryosurgery
US7119814B2 (en) * 2001-05-18 2006-10-10 Given Imaging Ltd. System and method for annotation on a moving image
US20030053951A1 (en) * 2001-07-26 2003-03-20 Millennium Pharmaceuticals, Inc. Use of non-invasive imaging technologies to monitor in vivo gene-expression
JP2003156460A (en) * 2001-09-10 2003-05-30 Jeol Ltd Method and system for managing data
DE10229407B4 (en) * 2002-06-29 2021-10-14 Leica Microsystems Cms Gmbh Procedure for setting the system parameters of a scanning microscope and scanning microscope
US9307884B1 (en) * 2003-01-27 2016-04-12 The Pnc Financial Services Group, Inc. Visual asset structuring tool
ES2360701T3 (en) * 2003-10-02 2011-06-08 Given Imaging Ltd. SYSTEM AND PROCEDURE FOR THE PRESENTATION OF DATA FLOWS.
JP4758353B2 (en) * 2003-11-13 2011-08-24 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Three-dimensional segmentation using deformable surfaces
US20050235272A1 (en) * 2004-04-20 2005-10-20 General Electric Company Systems, methods and apparatus for image annotation
US7732743B1 (en) 2005-06-03 2010-06-08 Michael Paul Buchin Low-photon-flux image acquisition and processing tool
US20070060798A1 (en) * 2005-09-15 2007-03-15 Hagai Krupnik System and method for presentation of data streams
FR2891924B1 (en) * 2005-10-10 2007-12-28 Biospace Mesures LUMINESCENCE IMAGING DEVICE AND METHOD
CA2625775A1 (en) 2005-10-14 2007-04-19 Applied Research Associates Nz Limited A method of monitoring a surface feature and apparatus therefor
US7680629B2 (en) * 2007-08-30 2010-03-16 Fego Precision Industrial Co., Ltd. System and method for providing notes in measurement devices
US8220415B2 (en) 2007-09-05 2012-07-17 Li-Cor, Inc. Modular animal imaging apparatus
FR2920874B1 (en) * 2007-09-10 2010-08-20 Biospace Lab LUMINESCENCE IMAGING INSTALLATION AND METHOD
US9418474B2 (en) * 2008-01-04 2016-08-16 3M Innovative Properties Company Three-dimensional model refinement
WO2009129543A1 (en) * 2008-04-18 2009-10-22 Coinsecure, Inc. Apparatus for producing optical signatures from coinage
US20090295912A1 (en) * 2008-05-12 2009-12-03 Coinsecure, Inc. Coin edge imaging device
WO2009154707A2 (en) 2008-06-18 2009-12-23 The Smartpill Corporation System and method of evaluating a subject with an ingestible capsule
US8229193B2 (en) * 2008-09-03 2012-07-24 General Electric Company System and methods for applying image presentation context functions to image sub-regions
IT1396752B1 (en) * 2009-01-30 2012-12-14 Galileo Avionica S P A Ora Selex Galileo Spa VISUALIZATION OF A THREE-DIMENSIONAL VIRTUAL SPACE GENERATED BY AN ELECTRONIC SIMULATION SYSTEM
US7856135B1 (en) * 2009-12-02 2010-12-21 Aibili—Association for Innovation and Biomedical Research on Light and Image System for analyzing ocular fundus images
US8719294B2 (en) * 2010-03-12 2014-05-06 Fiitotech Company Limited Network digital creation system and method thereof
US8436321B2 (en) 2010-05-21 2013-05-07 Li-Cor, Inc. Optical background suppression systems and methods for fluorescence imaging
US8901516B2 (en) 2010-09-01 2014-12-02 Spectral Instruments Imaging, LLC Excitation light source assembly
WO2012030973A2 (en) 2010-09-01 2012-03-08 Spectral Instruments Imaging, LLC Methods and systems for producing visible light and x-ray image data
JP2012198139A (en) * 2011-03-22 2012-10-18 Olympus Corp Image processing program, image processing device, measurement analysis device and image processing method
US8873816B1 (en) 2011-04-06 2014-10-28 Given Imaging Ltd. Method and system for identification of red colored pathologies in vivo
US9179844B2 (en) 2011-11-28 2015-11-10 Aranz Healthcare Limited Handheld skin measuring or monitoring device
US8712126B2 (en) * 2012-03-12 2014-04-29 Xerox Corporation Web-based system and method for video analysis
WO2013164826A1 (en) 2012-05-04 2013-11-07 Given Imaging Ltd. System and method for automatic navigation of a capsule based on image stream captured in-vivo
US9324145B1 (en) 2013-08-08 2016-04-26 Given Imaging Ltd. System and method for detection of transitions in an image stream of the gastrointestinal tract
WO2015059613A2 (en) * 2013-10-22 2015-04-30 Koninklijke Philips N.V. Image visualization
USD771089S1 (en) * 2014-07-23 2016-11-08 General Electric Company Display screen or portion thereof with graphical user interface for a radiation dose mapping system
US9480448B2 (en) 2014-07-23 2016-11-01 General Electric Company System and method for use in mapping a radiation dose applied in an angiography imaging procedure of a patient
US10013527B2 (en) 2016-05-02 2018-07-03 Aranz Healthcare Limited Automatically assessing an anatomical surface feature and securely managing information related to the same
US11116407B2 (en) 2016-11-17 2021-09-14 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems
EP4183328A1 (en) 2017-04-04 2023-05-24 Aranz Healthcare Limited Anatomical surface assessment methods, devices and systems

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4948975A (en) * 1988-09-08 1990-08-14 The United States Of America As Represented By The Secretary Of The Air Force Quantitative luminescence imaging system
US5202091A (en) * 1985-03-01 1993-04-13 Lisenbee Wayne F Luminescence measurement arrangement
WO1994013095A2 (en) * 1992-11-25 1994-06-09 Rstar, Inc. User interface for picture archiving and communication system
US5625377A (en) * 1992-05-27 1997-04-29 Apple Computer, Inc. Method for controlling a computerized organizer

Family Cites Families (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5850465A (en) * 1989-06-26 1998-12-15 Fuji Photo Film Co., Ltd. Abnormal pattern detecting or judging apparatus, circular pattern judging apparatus, and image finding apparatus
JP2970967B2 (en) 1991-11-20 1999-11-02 浜松ホトニクス株式会社 Intracellular ion concentration measurement method using fluorescent probe reagent
US5384862A (en) * 1992-05-29 1995-01-24 Cimpiter Corporation Radiographic image evaluation apparatus and method
US5431161A (en) * 1993-04-15 1995-07-11 Adac Laboratories Method and apparatus for information acquisition, processing, and display within a medical camera system
US5414258A (en) 1993-11-22 1995-05-09 Angstrom Technologies, Inc. Apparatus and method for calibration of fluorescence detectors
JP3494692B2 (en) * 1994-03-07 2004-02-09 富士写真フイルム株式会社 Radiation image alignment method
US6343142B1 (en) * 1994-05-20 2002-01-29 Fuji Photo Film Co., Ltd. Image analyzing apparatus
US5650135A (en) 1994-07-01 1997-07-22 The Board Of Trustees Of The Leland Stanford Junior University Non-invasive localization of a light-emitting conjugate in a mammal
US6649143B1 (en) 1994-07-01 2003-11-18 The Board Of Trustees Of The Leland Stanford Junior University Non-invasive localization of a light-emitting conjugate in a mammal
US5672881A (en) 1994-09-14 1997-09-30 Glyko, Inc. Charge-coupled device imaging apparatus
US5840572A (en) 1994-10-11 1998-11-24 United States Of America As Represented By The Secretary Of The Navy Bioluminescent bioassay system
US5705807A (en) 1994-10-24 1998-01-06 Nissan Motor Co., Ltd. Photo detecting apparatus for detecting reflected light from an object and excluding an external light component from the reflected light
AU4594796A (en) * 1994-11-25 1996-06-19 Yuriy Alexandrov System and method for diagnosis of living tissue diseases
JP2675532B2 (en) 1994-12-20 1997-11-12 株式会社バイオセンサー研究所 Chemiluminescence measuring device
US5636299A (en) 1994-12-28 1997-06-03 Lockheed Missiles & Space Company, Inc. Hybrid luminescent device and method for imaging penetrating radiation
US5919140A (en) 1995-02-21 1999-07-06 Massachusetts Institute Of Technology Optical imaging using time gated scattered light
EP0774730B1 (en) * 1995-11-01 2005-08-24 Canon Kabushiki Kaisha Object extraction method, and image sensing apparatus using the method
US5738101A (en) 1996-01-18 1998-04-14 The Regents Of The University Of California Optical imaging through turbid media with a degenerate four-wave mixing correlation time gate
US5867250A (en) 1996-05-03 1999-02-02 Baron; William S. Apparatus and method for optically mapping front and back surface topographies of an object
US6175655B1 (en) * 1996-09-19 2001-01-16 Integrated Medical Systems, Inc. Medical imaging system for displaying, manipulating and analyzing three-dimensional images
US5832931A (en) * 1996-10-30 1998-11-10 Photogen, Inc. Method for improved selectivity in photo-activation and detection of molecular diagnostic agents
US6030344A (en) * 1996-12-04 2000-02-29 Acuson Corporation Methods and apparatus for ultrasound image quantification
US6756207B1 (en) * 1997-02-27 2004-06-29 Cellomics, Inc. System for cell-based screening
US6081612A (en) * 1997-02-28 2000-06-27 Electro Optical Sciences Inc. Systems and methods for the multispectral imaging and characterization of skin tissue
US6058322A (en) * 1997-07-25 2000-05-02 Arch Development Corporation Methods for improving the accuracy in differential diagnosis on radiologic examinations
JPH1156828A (en) * 1997-08-27 1999-03-02 Fuji Photo Film Co Ltd Abnormal shadow candidate detecting method and its device
JP3554172B2 (en) * 1998-01-09 2004-08-18 キヤノン株式会社 Radiography equipment
US6364829B1 (en) 1999-01-26 2002-04-02 Newton Laboratories, Inc. Autofluorescence imaging system for endoscopy
JP3895492B2 (en) * 1998-03-13 2007-03-22 株式会社リコー Image processing apparatus, image processing method, and computer-readable recording medium storing program for causing computer to execute the method
US6385474B1 (en) * 1999-03-19 2002-05-07 Barbara Ann Karmanos Cancer Institute Method and apparatus for high-resolution detection and characterization of medical pathologies
JP3549725B2 (en) * 1998-04-13 2004-08-04 シャープ株式会社 Image processing device
US6242743B1 (en) 1998-08-11 2001-06-05 Mosaic Imaging Technology, Inc. Non-orbiting tomographic imaging system
US6949081B1 (en) * 1998-08-26 2005-09-27 Non-Invasive Technology, Inc. Sensing and interactive drug delivery
JP2002525603A (en) 1998-09-18 2002-08-13 セロミックス インコーポレイテッド System for cell-based screening
US6757412B1 (en) * 1998-10-21 2004-06-29 Computerized Thermal Imaging, Inc. System and method for helping to determine the condition of tissue
EP1314980B1 (en) 1999-02-26 2009-12-09 Cellomics, Inc. A system for cell-based screening
US6633657B1 (en) * 1999-07-15 2003-10-14 General Electric Company Method and apparatus for controlling a dynamic range of a digital diagnostic image
US6246745B1 (en) * 1999-10-29 2001-06-12 Compumed, Inc. Method and apparatus for determining bone mineral density
US6614452B1 (en) * 1999-11-15 2003-09-02 Xenogen Corporation Graphical user interface for in-vivo imaging
US6775567B2 (en) 2000-02-25 2004-08-10 Xenogen Corporation Imaging apparatus
US6615063B1 (en) 2000-11-27 2003-09-02 The General Hospital Corporation Fluorescence-mediated molecular tomography
US7113217B2 (en) * 2001-07-13 2006-09-26 Xenogen Corporation Multi-view imaging apparatus
US7003161B2 (en) * 2001-11-16 2006-02-21 Mitutoyo Corporation Systems and methods for boundary detection in images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Di Mauro, E. C. et al.: "CHECK! A generic and specific industrial inspection tool", IEE Proceedings: Vision, Image and Signal Processing, Institution of Electrical Engineers, GB, vol. 143, no. 4, 1 August 1996 (1996-08-01), pages 241-249, XP000627046, ISSN: 1350-245X *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6649143B1 (en) 1994-07-01 2003-11-18 The Board Of Trustees Of The Leland Stanford Junior University Non-invasive localization of a light-emitting conjugate in a mammal
US8545814B2 (en) 1994-07-01 2013-10-01 The Board Of Trustees Of The Leland Stanford Junior University Non-invasive localization of a light-emitting conjugate in a mammal
US7449615B2 (en) 1998-12-17 2008-11-11 Xenogen Corporation Non-invasive evaluation of physiological response in a transgenic mouse
US6737245B1 (en) 1999-09-08 2004-05-18 Xenogen Corporation Luciferase expression cassettes and methods of use
US6867348B1 (en) 1999-12-16 2005-03-15 Xenogen Corporation Methods and compositions for screening for angiogenesis modulating compounds
US7196190B2 (en) 1999-12-16 2007-03-27 Xenogen Corporation Methods and compositions for screening for angiogenesis modulating compounds
US6647179B2 (en) 1999-12-21 2003-11-11 Agilent Technologies, Inc. Process and device for making gratings in optical fibres
US7056728B2 (en) 2000-07-06 2006-06-06 Xenogen Corporation Compositions and methods for use thereof in modifying the genomes of microorganisms
EP1406081A4 (en) * 2001-07-03 2011-10-05 Hitachi Ltd Biological sample optical measuring method and biological sample optical measuring apparatus
EP1406081A1 (en) * 2001-07-03 2004-04-07 Hitachi, Ltd. Biological sample optical measuring method and biological sample optical measuring apparatus
FR2827399A1 (en) * 2001-07-12 2003-01-17 Bruno Regnier Measuring tablet with a tactile screen; uses a scanner of digital images with manual input to the tactile screen, allowing the user to identify data of interest and to perform computations on or visualize the data
US7366333B2 (en) * 2002-11-11 2008-04-29 Art, Advanced Research Technologies, Inc. Method and apparatus for selecting regions of interest in optical imaging
CN100431475C (en) * 2003-04-25 2008-11-12 奥林巴斯株式会社 Device, method and program for image processing
WO2009003515A1 (en) * 2007-07-02 2009-01-08 Trimble Jena Gmbh Feature detection apparatus and method for measuring object distances
US8633983B2 (en) 2007-07-02 2014-01-21 Trimble Jena Gmbh Feature detection apparatus and method for measuring object distances
US8705817B2 (en) 2008-10-24 2014-04-22 Eos Imaging Measurement of geometric quantities intrinsic to an anatomical system
CN107991295A (en) * 2014-02-17 2018-05-04 安盛生科股份有限公司 Utilize mobile device measuring physics and biochemical parameters
US10690659B2 (en) 2014-02-17 2020-06-23 Ixensor Co., Ltd. Measuring physical and biochemical parameters with mobile devices

Also Published As

Publication number Publication date
US20080134074A1 (en) 2008-06-05
AU1769101A (en) 2001-05-30
US6614452B1 (en) 2003-09-02
US20030193517A1 (en) 2003-10-16
US7765487B2 (en) 2010-07-27
US20100260395A1 (en) 2010-10-14
US7299420B2 (en) 2007-11-20
WO2001037195A3 (en) 2002-07-11
US8734342B2 (en) 2014-05-27
US20150104086A1 (en) 2015-04-16

Similar Documents

Publication Publication Date Title
WO2001037195A2 (en) Graphical user interface for in-vivo imaging
US7581191B2 (en) Graphical user interface for 3-D in-vivo imaging
JP6313325B2 (en) Selection and display of biomarker expression
JP6348504B2 (en) Biological sample split screen display and system and method for capturing the records
US8290236B2 (en) Quantitative, multispectral image analysis of tissue specimens stained with quantum dots
JP2021063819A (en) Systems and methods for comprehensive multi-assay tissue analysis
JP7424289B2 (en) Information processing device, information processing method, information processing system, and program
JP2018512072A (en) Quality control for automated slide-wide analysis
JP2009527063A (en) System and method for using and integrating samples and data in a virtual environment
EP3844772A1 (en) Medical system, medical apparatus, and medical method
JP2021515912A (en) Digital pathology scanning interface and workflow
Heebner et al. Deep learning-based segmentation of cryo-electron tomograms
Banavar et al. Image montaging for creating a virtual pathology slide: An innovative and economical tool to obtain a whole slide image
WO2021220873A1 (en) Generation device, generation method, generation program, and diagnosis assistance system
US20230230398A1 (en) Image processing device, image processing method, image processing program, and diagnosis support system
JPH04212043A (en) Video densitometer
JP2021124861A (en) Analysis device, analysis method, analysis program, and diagnosis support system
CN116235223A (en) Annotation data collection using gaze-based tracking
US20240004540A1 (en) Methods, apparatuses, and computer-readable media for enhancing digital pathology platform
WO2023248954A1 (en) Biological specimen observation system, biological specimen observation method, and dataset creation method
JP2010128847A (en) Image forming device, image display device, image forming method, and image display method
WO2020045536A1 (en) Medical system, medical apparatus, and medical method
JP2024013078A (en) Program, display method, and display system
Carter et al. Single Particle Tracking Software
Wood A microscopy scanning system for clinical chromosome diagnostics

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase