WO2001038965A9 - User interface for a medical informatics system - Google Patents

User interface for a medical informatics system

Info

Publication number
WO2001038965A9
Authority
WO
WIPO (PCT)
Prior art keywords
view
user
providing
study
medical images
Application number
PCT/US2000/042257
Other languages
French (fr)
Other versions
WO2001038965A2 (en)
WO2001038965A3 (en)
Inventor
Paul Joseph Chang
Bradford V Hebert
Benjamin J Mccurtain
Original Assignee
Stentor Inc
Paul Joseph Chang
Bradford V Hebert
Benjamin J Mccurtain
Application filed by Stentor Inc, Paul Joseph Chang, Bradford V Hebert, Benjamin J Mccurtain
Priority to AU34411/01A (AU3441101A)
Priority to EP00991761A (EP1236083A2)
Publication of WO2001038965A2
Publication of WO2001038965A3
Publication of WO2001038965A9

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B 6/56: Details of data transmission or power supply, e.g. use of slip rings
    • A61B 6/563: Details of data transmission or power supply, e.g. use of slip rings involving image data transmission via a network
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B 6/46: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment, with special arrangements for interfacing with the operator or the patient
    • A61B 6/461: Displaying means of special interest
    • A61B 6/463: Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/20: ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS

Definitions

  • the present invention is directed toward the field of medical informatics, and more particularly toward a user interface for a medical informatics system.
  • Radiology equipment (e.g., CT scanners, MRI scanners, X-Ray machines, etc.) is in widespread use as a diagnostic tool in hospitals today. Traditionally, radiology departments have used equipment, such as X-Ray machines, that generate images on film.
  • typically, when collecting information from a diagnostic tool, several medical images are generated for subsequent analysis and diagnosis of the patient's medical condition. This collection of medical images may be referred to as a "study." For example, a study from an X-Ray machine may consist of a number of X-Rays taken from different perspectives of the target area. It is the totality of the study that the physician uses to make a diagnosis of the patient.
  • in a typical PACS application, image data obtained by imaging equipment such as CT scanners or MRI scanners are stored in the form of computer data files.
  • the size of a data file for an image varies depending on the size and resolution of the image. For example, a typical image file for a diagnostic-quality chest X-ray is on the order of 10 megabytes (MB).
  • the image data files are usually formatted in a "standard", or widely accepted, format. In the medical field, one widely used image format is known as DICOM.
  • DICOM image data files are distributed over computer networks to specialized viewing stations capable of converting the image data to high-resolution images on a CRT display.
  • digitized medical images potentially provide advancements to the medical community due to the ability to electronically store, transfer and view digitized images over geographically disparate areas.
  • prior art systems for viewing the digital data do not comport with how physicians traditionally operate. Physicians have become accustomed to working with analog film. First, to conduct a diagnosis using traditional film, the physician chooses the films for a patient that will aid in the analysis of the patient's condition. From the selected films, the physician organizes the films in a manner suitable to conduct the analysis and subsequent diagnosis.
  • the film is placed on a light board for viewing.
  • the light board projects light through the film so that the physician may read the image imposed on the film.
  • a physician may organize the physical sheets of film on the light board in a manner suitable for conducting the analysis. It may be advantageous for a physician to place, on the light board, two sheets of film next to one another in order to analyze a condition relative to the two films.
  • the first film may comprise data taken at an earlier date, whereas the second film may contain data recently obtained.
  • the physician may analyze how a particular condition has changed over time.
  • Prior art systems for viewing digitized medical images do not provide a means to operate in a manner in which physicians work. As illustrated by the above example, using these prior art systems, a physician is not permitted to effectively organize medical images in a manner in which physicians may organize traditional analog films. Accordingly, it is desirable to develop a user interface for a medical informatics system that emulates the way a physician works by providing maximum flexibility for the physician to select, organize, navigate and subsequently analyze medical images. Furthermore, prior art systems for viewing digitized medical images display static images, in that the user is not permitted to navigate (i.e., pan or zoom the original image). Accordingly, it is also desirable to generate a system that permits "dynamic" interaction with medical images to provide the physician with maximum flexibility to interact with the image.
  • a user interface for a medical informatics system permits a physician to work with digitized medical images in a manner that the physician is accustomed to working with traditional analog film.
  • the user interface provides the user the ability to select studies, which consist of medical images and series, for patients.
  • the user selects studies on the user interface through a patient browser view.
  • the user may then organize the studies as well as the images/series within the studies, including resizing the studies and the images/series within a study.
  • the user may also navigate around the images/series. Specifically, the user has the ability to pan and zoom images to view portions of an image at various resolutions.
  • the user of the user interface may analyze the image by selecting to view the image in detail in a large floating window.
  • the organization, navigation, and analysis of studies and images/series are performed through a patient canvas view.
  • the patient canvas view is displayed in a standard orientation such that each horizontal scroll bar contains a study.
  • each study displayed on the patient canvas view is broken out left to right into one or more series for CT/MR and one or more images for CR/DR.
  • the user, using a horizontal scroll bar, is permitted to scroll left and right to display the series/images contained within the study.
  • Multiple studies are laid out from top to bottom on the patient canvas view.
  • a single vertical scroll bar is provided to permit the user to scroll, in a vertical direction (i.e., from top to bottom), to display the multiple studies.
  • the user may organize studies by re-arranging the relative vertical positions among the studies.
  • the studies may be re-sized to any user-desired size.
  • the user may also use the features of the patient canvas view to organize images, within a study, by re-arranging the relative horizontal positions among the images/series within a study via a drag and drop operation.
  • Fig. 1 illustrates one embodiment for an initial patient browser view.
  • Fig. 2 illustrates an example patient browser view with a plurality of the studies for each patient.
  • Fig. 3 illustrates an example display of a patient browser view that includes selection of patient studies.
  • Fig. 4 illustrates an example patient canvas view in accordance with one embodiment of the present invention.
  • Fig. 5 illustrates a user operation to scroll images within a study.
  • Fig. 6 illustrates the patient canvas view subsequent to a user operation that scrolls among studies.
  • Fig. 7 illustrates one embodiment of a medical informatics system for use with the user interface of the present invention.
  • Fig. 8a illustrates an example of a pyramidal data structure.
  • Fig. 8b illustrates level three and level four decompositions for the 4K x 4K source image of Fig. 8a.
  • the user interface of the medical informatics system provides a ubiquitous viewing environment for fast and simple access to medical images across the enterprise.
  • the user interface may be operated by a physician in a manner in which physicians are accustomed to working with traditional analog film.
  • the user of the medical informatics system may select studies, which consist of medical images/series, for patients. In one embodiment, this functionality is provided through a patient browser view.
  • the user may then organize the studies as well as the images/series within the studies, including resizing the studies and the images/series within a study.
  • the user may also navigate around the images/series. Specifically, the user has the ability to pan and zoom images to view portions of an image at various resolutions.
  • the user of the user interface may analyze the image by selecting to view the image in detail in a large floating window.
  • the organization, navigation, and analysis of studies and images/series are performed through a patient canvas view. Accordingly, the user interface of the present invention emulates the way a physician works with medical images by providing full capabilities to select, organize, navigate and analyze medical information.
  • the user interface consists primarily of a single window interface. However, additional floating windows are generated, when appropriate, to provide detailed image viewing.
  • tabs are presented to the user to permit the user to navigate between a patient browser view and one or more patient canvas views.
  • the patient browser view permits the user to select studies for one or more patients.
  • a study, specific to a patient, comprises images obtained from a diagnostic tool, and in some cases, additional information (e.g., medical report) to augment the image data.
  • the studies define the repository of medical images that may be used during the session.
  • the patient canvas view provides a screen surface area for the organization, navigation and analysis of the patient medical information selected.
  • the user interface presents the user with a simple login window. Using this login window, the user may enter a user name and password. If the user is successfully authenticated by the server (e.g., image server 720, Fig. 7), then the main window of the client computer is displayed with the patient browser tab selected.
  • the user interface operates as a plug-in with an Internet browser application, such as Microsoft Internet Explorer or Netscape Navigator.
  • the user interface comprises, in part, executable software configured as a Microsoft® ActiveX Control.
  • the ActiveX Control is a "plug-in" to a Web browser application.
  • the Internet browser application includes a title bar (e.g., title bar 102, Fig. 1) including controls to minimize, maximize and close the browser application, as well as a tool bar (e.g., tool bar 104, Fig. 1).
  • Fig. 1 illustrates one embodiment for an initial patient browser view.
  • the user interface opens and displays a patient browser tab, labeled 106 on Fig. 1.
  • the user interface 100 contains search capabilities to permit a user to locate and select medical information for one or more patients.
  • the user interface 100 contains, as part of the patient browser tab view, controls and entry boxes (122) to allow searching for patients and studies.
  • entry boxes for searching include: patient's name 110, patient ID 116, patient location 118, and date of last exam 120.
  • the user interface 100 also permits submission of predefined queries for the fields: physician, patient location, physician group, and body part. These predefined queries are stored as part of a user profile.
  • the physician group is a class that groups different types of physicians (e.g., neurology, orthopedic, oncology, etc.).
  • the physician groups may be assigned by an administrator of the medical informatics system. If the predefined query occurs as part of the user login process, then the initial state of the patient browser displays the results of that query. Alternatively, if a login query is not found or available, then there is no content in the patient list display area.
  • the patient browser list view 100 displays a list of patients and their corresponding studies. Information on patients and their studies is displayed in the area labeled 128 in the user interface window 100.
  • the example display of Fig. 1 displays, in a patient list display area, information for two patients, Jamie Walter 124 and Charles Wilkins 126. A patient ID, corresponding to the patient's name, is also displayed.
  • the list of studies only indicates the specific study, and does not indicate the series or image contained in that study.
  • the information displayed for each patient includes: last name 110, first name 112, middle initial 114, patient ID 116, patient location 118, and date of last exam 120 (derived from the most recent study).
  • Fig. 2 illustrates an example patient browser view with a plurality of the studies for each patient.
  • a patient name line e.g., patient name line 124 for Jamie Walter and patient name line 126 for Charles Wilkins
  • a tree paradigm is used to display the studies beneath the patient title bars 124 and 126.
  • the display line for each study includes: a check box to indicate selection status, modality, study description, accession number, and exam date.
  • the example of Fig. 2 shows, for the patient "Walter Jamie", the studies labeled 130, 132 and 134 on Fig. 2.
  • the abbreviation "MR” connotes magnetic resonance
  • the abbreviation “CR” connotes conventional radiography (e.g., an X-Ray).
  • FIG. 3 illustrates an example display of a patient browser view that includes selection of patient studies.
  • the user selected, for the patient "Charles Wilkins", CT studies 140, 142, 146, 157, 159 and 160.
  • the selection of the studies is indicated by the check mark in the check box adjacent to the study description (e.g., CT).
  • This selection response adds and/or subtracts studies from the current selection for subsequent display in the patient canvas view.
  • additional user interface features for the patient browser view permit ease of selecting and deselecting studies.
  • the key strokes "shift-click" executed by the user select a contiguous range of studies.
  • the key strokes "control-click" deselect all other studies and select the single study.
  • the user interface creates tabs for each patient that has at least one study selected.
  • patient tabs 150 and 155 are displayed for the patients "Walter Jamie" and “Charles Wilkins", respectively.
  • the tabs are created on a per patient basis, one tab for each patient with selected studies.
  • the tabs are displayed from left to right in an order dictated by the current sort order.
  • the example of Fig. 3 shows the patients sorted by last name in alphabetical order. The user may move to the patient canvas view (described below) for that patient by selecting the corresponding tab.
  • a canvas tab is created for the patient, and that tab is displayed similar to tabs 150 and 155 on Fig. 3.
  • the user "double-clicks" on a study in the list. If all studies are selected during a double-click, then each of the selected studies is displayed within the canvas view.
  • Fig. 4 illustrates an example patient canvas view in accordance with one embodiment of the present invention.
  • a patient canvas view 200 includes a plurality of studies for the selected patient, "Jamie, Walter." As shown in Fig. 4, the patient tab, labeled 150 for "Jamie, Walter", is highlighted. Each tab displayed has a corresponding patient canvas view. Thus, another patient canvas view exists for the patient "Charles Wilkins."
  • the area beneath the displayed tabs is the primary display area for the studies and series/images.
  • two studies, arranged vertically on the screen, are shown. In one embodiment, selected studies are automatically laid out from top to bottom on the patient canvas view. Each study is broken out left to right into one or more series for CT/MR and one or more images for CR/DR.
  • the first or top study includes the series of images labeled 230, 235 and 240 on Fig. 4.
  • the second study, displayed on the bottom of the patient canvas view, currently displays three images: 260, 265, and 270.
  • the patient canvas view is displayed in a standard orientation such that each horizontal scroll bar (scroll bar 110 for the top study) contains a study.
  • the user, using the horizontal scroll bar (e.g., horizontal scroll bar 110), is permitted to scroll left and right to display the series/images contained within the study.
  • a single vertical scroll bar (e.g., vertical scroll bar 205 on Fig. 4) is provided to permit the user to scroll, in a vertical direction, to display the multiple studies.
  • the height of each study may be varied within the patient canvas view.
  • the user uses a cursor control device, places the cursor on a horizontal grab bar on the study (e.g., bar 290 for the top study and bar 295 for the bottom study), and resizes the study to the appropriate height.
  • the studies (i.e., the windows encompassing the studies) may be re-sized to any user-desired size.
  • the user may organize studies by re-arranging the relative vertical positions among the studies.
  • the user may also use the features of the patient canvas view to organize images, within a study, by re-arranging the relative horizontal positions among the images/series within a study.
  • these organization operations are executed via a drag and drop operation.
  • a drag and drop operation the user "selects" a series/image or study, and drags the series/image or study to the destination location. When the image is located at the destination location, the user releases the series/image or study to complete the drag and drop operation.
  • a control "hot area" at the left side of each study row is displayed to provide a handle for the user to grab the study in the drag and drop operation.
  • the study "handle" is labeled 275 for the top study and is labeled 280 for the bottom study in Fig. 4.
  • the series (CT/MR) and images (CR/DR) may also be re-arranged within a study (i.e., their relative horizontal positions may be re-arranged) using the drag and drop operation.
  • the user may "grab" an image or series using the title bar or annotation area, such as title bar 220 for series 235 on Fig. 4.
  • the drag and drop operation provides maximum flexibility for the user to arrange the patient canvas view in any manner desired by the user.
  • the position of studies and images displayed on the patient canvas view may also be arranged by user execution of a cut and paste operation.
  • the user selects the study (e.g., using the cursor control device or entering a keystroke sequence), executes the cut operation with the appropriate keystroke, re-positions the cursor with the cursor control device in the new destination location, and executes the "paste" command.
  • a rule set is applied to analyze the series/images of a study displayed in a single row to determine the proper height for the row (a sketch of one possible rule appears after this list).
  • row heights are selected based on the nearest optimal representation.
  • the default target row height is 320 pixels.
  • the actual row height is determined by analyzing the row contents (i.e., series/images) so that unnecessary space is eliminated.
  • the row height is saved as a user preference, and a target row height is used to display studies for that user.
  • the target row height is determined from the user's screen size or window resolution.
  • the patient canvas view on the user interface provides the functionality to "clone" an image.
  • a user may copy an image or series, and paste the image or series in a different location.
  • the user may copy, through a standard copy operation, image 260 in the second study (i.e., the bottom study), and paste the image to the right of image 265.
  • the user may copy an image in one study (e.g., the bottom study), and paste the image into a different study (e.g., the first or top study).
  • the result of this operation is shown on the display of Fig. 5, starting with the display of Fig. 4.
  • Fig. 5 illustrates a user operation to scroll images within a study.
  • the user utilizing scroll bar 210, scrolls through the images/series contained within the study.
  • Fig. 5 shows a different view from the study of Fig. 4 subsequent to a user operation to scroll the images/series from right to left.
  • Fig. 6 illustrates a user operation to scroll among studies.
  • Fig. 6 illustrates the patient canvas view subsequent to a user operation that scrolls among studies.
  • the user utilizing the scroll bar 205, scrolls, in a vertical direction (e.g., from bottom to top), the top and bottom studies to view more of the bottom study (and subsequently less of the top study).
  • each series/image displayed within a study includes control points and annotation information.
  • the control points and annotation information may be implemented similar to a standard window in a user interface.
  • the study "date and time” may be displayed at the top of the image, and the study and series descriptive information may be displayed below the image.
  • the image 230 of the top study includes, as a control point, the bar with the text "test 3", labeled 215, and an annotation field.
  • the annotation field may include any type of information used to describe the image, including image frame and canvas row frame information.
  • annotation information such as patient name and date of study may be displayed.
  • the selection and arrangement of studies and images/series are stored in a persistent datastore.
  • the patient browser view is restored to the previous display from the prior session.
  • the patient canvas view of the user interface permits a user to fully "navigate" the image.
  • medical images are large, and cannot be displayed at full resolution on a computer monitor.
  • in small windows (e.g., image 235 in Fig. 4, displayed at approximately 320 pixels), only portions of the medical image are displayed at any one time.
  • a medical image consisting of a pixel resolution of 4K x 4K cannot be displayed at full resolution on a monitor comprising a pixel resolution of 1024 x 768.
  • the user interface may display, in a 512 x 512 window, the entire source image at a lower resolution (i.e., a thumbnail sketch of the image).
  • the images, displayed on the user interface, are "dynamic images.”
  • the images are dynamic because the user may fully manipulate each image to display different portions of the image (pan the original image) at different resolutions (zoom in and out).
  • a dynamic transfer syntax, described below, provides full functionality to allow the user to manipulate the image in any manner desired. Starting with the lower resolution "dynamic image", the user may zoom-in on a more specific portion of the image. Thereafter, the user may pan the image to view a different portion of the image at the higher resolution. Accordingly, through the pan and zoom functions, the user may navigate through the images.
  • only portions of an image are displayed as the user continuously pans an image.
  • the eye is only capable of perceiving a certain level of detail while pixels are moving during the pan operation.
  • the user interface takes advantage of this fact and only uses a lower resolution version of the image during the pan operation. The lower resolution version provides adequate detail for user perception.
  • additional details are supplied to the image to display the image at the desired resolution.
  • the patient canvas view on the user interface permits a user to link series within the canvas. With this feature, as a user scrolls through slices of a first series, the second series, linked to the first series, is also scrolled.
  • the patient canvas view also permits linking of any image or series, including images and series displayed in floating windows.
  • the user interface also permits a user to clone a series for display at different window widths and window levels ("WW/WL”) (i.e., contrast and brightness, respectively).
  • the user interface further permits a user to scroll a CT/MR series to a particular slice, and then link this series to another series for simultaneous cine.
  • the patient canvas view maintains, for simultaneous cine between two series, the same anatomical position for both series, even if the series contains a different number of slices. For example, a first series may contain 100 slices within an anatomical position of a patient, and a second series may contain only 10 slices within the same anatomical position of the patient.
  • the simultaneous cine feature displays 10 slices of the first series for every 1 slice of the second series (a sketch of this slice mapping appears after this list).
  • a WW/WL adjustment made in any of the three display modes is inherited by subsequent display of that series or image during the current session. This includes larger windows created for a series or image.
  • Multiple link channels are supported, as indicated by a number by a link icon and a drop down selection option at the point of linking.
  • the user may move through a single or link series with a scroll wheel on a cursor control device, pan and zoom around CR/DR images, and use the left button of the cursor control device to change WW/WL.
  • the patient canvas view of the user interface permits a user to fully "analyze" the image.
  • the user interface of the present invention permits a user to create detailed views of selected images.
  • the user interface for the medical informatics system permits the user to generate large floating windows for detailed views.
  • the detailed views permit a physician to analyze the image once the desired portion of the image is located in the navigation phase. For example, if the user navigates to a specific portion of a large image, the user may invoke the user interface to display the detailed portion of the image in a large floating window size to capture the full resolution of the specific portion.
  • to create a floating window the user double-clicks on the image, using the cursor control device, and the image is displayed in the large floating window.
  • the full floating window consists of approximately 75 percent of the display area.
  • the floating window target height may be 640 pixels, as compared to the study scroll area target height of 320 pixels.
  • the display area may comprise a 512 x 512 pixel window.
  • Linked images and series may also be displayed in floating windows.
  • the user double clicks on a linked image, or selects multiple series/images, to receive a display of a collage of those series/images, each displayed as a floating window.
  • the user may cine through the series, as described above, link or unlink two series, change the WW/WL with the right mouse button, or pan and zoom with single or linked CR/DR images.
  • a double-click at the floating window level or zoom box control takes the user directly to the full screen display mode with the same image manipulation interactions.
  • a double click cursor action by the user brings the images up to a nine-on-one tile mode.
  • the nine-on-one tile mode displays images on top of one another.
  • the user may use the scroll wheel to move the tile images back and forth one page at a time.
  • a left-hand button on the cursor control device permits the user to WW/WL upon all the displayed images, so as to maintain persistence while scrolling the pages.
  • the size of each image within the window may comprise 256 x 256 pixels.
  • floating windows have basic intelligent layout properties, in that multiple floating windows stack using an offset of approximately 16 pixels to the right and 100 pixels down.
  • the floating windows have a button at the base of the window for prior image/series and next image/series control.
  • the floating windows have a link item menu available on the lower right corner of the image area, similar to the link button on the canvas. This allows shared pan zoom for plain images, shared cine for CT/MR series and shared page by page review for CT/MR series in tiled mode.
  • the user interface permits the user to toggle, using the cursor control device and keys, functionality between pan/zoom and slice navigation for CT/MR images.
  • the user interface displays, in addition to images/series and studies, radiological reports to the left side of each study display area.
  • the report area is associated with the study, and a scroll bar permits the user to scroll vertically to read the text contained in the report.
  • the report feature includes a "splitter" control bar so as to allow the user to adjust the horizontal display area of the report area to expand or contract the size.
  • Example splitter control bars 275 and 280 are shown in Fig. 4.
  • Fig. 7 illustrates one embodiment of a medical informatics system for use with the user interface of the present invention.
  • a medical informatics system employs dynamic transfer syntax.
  • medical informatics system 700 includes imaging equipment 705 to generate source images 715 for storage in electronic form in an image archive 712.
  • the image archive 712 contains electronic storage components such as disk drives and tape drives used to store the images in a highly reliable manner.
  • the images are stored in a suitable archival format, such as the above-mentioned DICOM format.
  • the imaging equipment 705 includes any type of equipment to generate images, including medical equipment (e.g., X-ray equipment, CT scanners, and MR scanners).
  • the medical informatics system 700 includes at least one image server 720.
  • the pyramidal data structure is stored in image server 720.
  • Image server 720 is coupled to one or more client computers via a direct or network connection, labeled 780 on Fig 7.
  • the user interface of the present invention operates on client computers.
  • the user interface may operate as a server application that provides functionality to the client computers.
  • client computers include both thick clients (i.e., a computer with robust processing, memory, and display resources), as well as thin clients (i.e., a computer with minimal processing, memory, and display resources).
  • a client computer 740 consists of a workstation, client computers 750 and 760 consist of desktop computers, and client computer 770 consists of a portable or notebook computer.
  • the image server 720 transmits to the client computers 740, 750 and 760 transformations of the source image 715 ("transform data"), stored as pyramidal data structure 735, to re-create images and sub-images in the client computers.
  • the image server 720 transfers only the coefficient data required to reconstruct a requested image at the client(s), thus implementing a "just in time" data delivery system.
  • the dynamic transfer syntax technique permits use of a network with moderate bandwidth capacity, while still providing low latency for transfer of large data files from the image server 720 to client computers 740, 750, 760 and 770.
  • the network 780 in the medical informatics system 700 may utilize an Ethernet (10BaseT) medium or an ISDN transmission medium.
  • any network including wide area networks (WANs) and local area networks (LANs) may be used without deviating from the spirit and scope of the invention.
  • the medical informatics system 700 processes one or more source images 715.
  • the source image(s) 715 includes a digitized medical image generated from medical instrumentation (e.g., mammogram, X-Ray, MRI, CATSCAN, etc.).
  • any large data file may be used as a source image 715 without deviating from the spirit or scope of the invention.
  • the source image(s) 715 are input to decomposition processing 725.
  • decomposition processing 725 transforms the source images 715 into the dynamic transfer syntax representation, also referred to herein as pyramidal data structure 735.
  • the pyramidal data structure 735 comprises a hierarchical representation of the source image. Each level of the hierarchical representation is sufficient to reconstruct the source image at a given resolution.
  • the decomposition processing 725 utilizes a sub-band decomposition to generate the hierarchical representation.
  • sub-band decomposition consists of executing a process to separate "high-pass" information from "low-pass” information.
  • in one embodiment, decomposition processing 725 comprises a finite impulse response (FIR) filter.
  • in one embodiment, decomposition processing 725 uses wavelet transforms, which are a sub-class of the sub-band decomposition transform.
  • the wavelet transform may be selected so that the kernels aggregate a sufficient amount of the image information into the terms or coefficients. Specifically, the information is aggregated into the "low low" component of the decomposition.
  • kernels of the wavelet transform are selected so as to balance the computational efficiency of the transform with optimization of the aggregate information in the low pass components. This characteristic of wavelet transforms permits transfer, and subsequent display, of a good representation of the source image at a particular resolution while maintaining the computational efficiency of the transform.
  • the wavelet transform function embodiment generates mathematically independent information among the levels of the hierarchical representation. Accordingly, there is no redundant information in the pyramidal data structure 735. Thus, pyramidal data structure 735 is not merely multiple replications of the source image at different resolutions, which consists of redundant information, but it contains unique data at the different levels of the hierarchical representation.
  • the mathematically independent nature of the wavelet transform permits minimizing the amount of data transferred over a network, by requiring only the transfer of "additional data" not yet transferred to the computer from the server necessary to construct a given image.
  • the wavelet transforms are lossless, in that no data from the original source image is lost in the decomposition into the pyramidal data structure 735. Accordingly, the dynamic transfer syntax system is well suited for use in medical imaging applications.
  • fixed point kernels are used in the wavelet transform (i.e., decomposition processing 725).
  • the use of fixed point kernels generates coefficients for the pyramidal data structure that permit an easy implementation into a standard pixel footprint.
  • the wavelet transform, a spatial transform, generates a dynamic range of the "low low" component that is equal to the dynamic range of the source image. Because of this characteristic, the "low low" component does not contain overshoot or undershoot components.
  • the use of fixed point kernels is preferred because no normalization process to convert the transformed dynamic range to the pixel dynamic range is required.
  • the medical informatics system 700 directly utilizes the transform coefficients as pixels, without re-scaling the coefficients.
  • the range of the high-pass components (i.e., the "low high", "high low", and "high high" components) is the range of the input source data plus two bits per coefficient. This characteristic permits mapping of all components (i.e., high and low pass components) to a given pixel footprint.
  • the use of the wavelet transform to generate the pyramidal data structure provides a scalable solution for transferring different portions of a large data file.
  • once the source image 715 is decomposed into the pyramidal data structure 735, sub-images and sub-resolution images are extracted directly from memory of the image server 720.
  • the image server then transmits only the data, in the form of physical coefficients, required to reconstruct the exact size of the desired image for display at the client. Accordingly, the multi-resolution format is implicit in the pyramidal data structure (a sketch of this selection appears after this list).
  • a wavelet transform is a spatial transform.
  • the information is aggregated so as to preserve the predictability of the geometry of the source image.
  • specific coefficients of the transform data may be identified that contribute to specific geometric features of the source image (i.e., a pre-defined portion of a source image is directly identifiable in the transform data).
  • the wavelet transforms use floating point kernels.
  • the wavelet transform may be used to generate multi-spectral transform data.
  • multi-spectral transform data aggregates multi-components of the source image into a vector for the transform data.
  • the wavelet transform may aggregate multi-dimensional data (e.g., two dimensional, three dimensional, etc.) for a source image.
  • multi-dimensional transform data may be used to reconstruct a source image in three dimensions.
  • the multi-spectral transform data may comprise any type of attribute for binding to the source image, such as color variations and/or non- visual components (e.g., infrared components).
  • the transform is applied across the columns, and then this transform, or a different transform, is applied across the rows.
  • the selection of the transform for decomposition processing 725 is dependent upon the particular characteristics of the pyramidal data structure desired.
  • Each level of the pyramidal data structure is generated by recursing on the low-pass, "low low", component of the previous higher level. This recursion continues until a predetermined size is obtained (a sketch of this recursion appears after this list).
  • the lowest level in the pyramidal data structure for a source image having an aspect ratio of one-to-one consists of a low-pass component of 128 x 128.
  • any granularity of resolution may be generated for use in a pyramidal data structure without deviating from the spirit or scope of the invention.
  • any quadrant may be used in the recursion process with any desired transform.
  • Fig. 8a illustrates an example of a pyramidal data structure.
  • the source image comprises a 4K x 4K image.
  • the decomposition processing 725 generates, in a first iteration, a level one Mallat structure.
  • a low-pass component, "low low” is generated and consists of a 2K x 2K sub-image.
  • the 2K x 2K sub-image is labeled in Fig. 8a as 805.
  • the high-pass components consisting of "low high”, “high high”, and "high low" contain physical coefficient coordinates (e.g., the upper right hand coordinate for the rectangle that constitutes the "low high” component is (4K, 0)).
  • decomposition processing 725 operates on the low pass (i.e., "low low"), component of the level one data.
  • the low-pass component, "low low", consists of a 1K x 1K sub-image, as labeled in Fig. 8a.
  • Fig. 8b illustrates level three and level four decompositions for the 4K x 4K source image of Fig. 8a.
  • decomposition processing 725 operates on the level two "low low" component (i.e., the 1K x 1K image).
  • the low-pass component is a 512 x 512 sub-image as labeled on Fig. 8a.
  • Fig. 8b also illustrates a fourth level of decomposition for the 4K x 4K source image.
  • the low-pass component comprises a sub-image of 256 x 256 pixels.
  • the wavelet kernel comprises the wavelet kernel derived from D.
  • the kernel consists of a low pass and a high pass biorthogonal filter.
  • the wavelet transform is a spatial transform such that the information is aggregated to preserve the predictability of the geometry of the source image.
  • coefficient coordinates sufficient to reconstruct a desired image or sub-image at a particular level are readily identifiable.
  • FIG. 9 illustrates a high-level block diagram of a general purpose computer system for implementing the user interface for the medical informatics system.
  • a computer system 1000 contains a processor unit 1005, main memory 1010, and an interconnect bus 1025.
  • the processor unit 1005 may contain a single microprocessor, or may contain a plurality of microprocessors for configuring the computer system 1000 as a multi-processor system.
  • the main memory 1010 stores, in part, instructions and data for execution by the processor unit 1005. If the user interface for the medical informatics system of the present invention is partially implemented in software, the main memory 1010 stores the executable code when in operation.
  • the main memory 1010 may include banks of dynamic random access memory (DRAM) as well as high speed cache memory.
  • the computer system 1000 further includes a mass storage device 1020, peripheral device(s) 1030, portable storage medium drive(s) 1040, input control device(s) 1070, a graphics subsystem 1050, and an output display 1060.
  • for purposes of simplicity, all components in the computer system 1000 are shown in Fig. 9 as being connected via the bus 1025. However, the computer system 1000 may be connected through one or more data transport means.
  • the processor unit 1005 and the main memory 1010 may be connected via a local microprocessor bus, and the mass storage device 1020, peripheral device(s) 1030, portable storage medium drive(s) 1040, graphics subsystem 1050 may be connected via one or more input/output (I/O) busses.
  • the mass storage device 1020, which may be implemented with a magnetic disk drive or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by the processor unit 1005.
  • the mass storage device 1020 stores the user interface software for loading to the main memory 1010.
  • the portable storage medium drive 1040 operates in conjunction with a portable non-volatile storage medium, such as a floppy disk or a compact disc read only memory (CD-ROM).
  • the peripheral device(s) 1030 may include any type of computer support device, such as an input/output (I/O) interface, to add additional functionality to the computer system 1000.
  • the peripheral device(s) 1030 may include a network interface card for interfacing the computer system 1000 to a network.
  • the input control device(s) 1070 provide a portion of the user interface for a user of the computer system 1000.
  • the input control device(s) 1070 may include an alphanumeric keypad for inputting alphanumeric and other key information, a cursor control device, such as a mouse, a trackball, stylus, or cursor direction keys.
  • the computer system 1000 contains the graphics subsystem 1050 and the output display 1060.
  • the output display 1060 may include a cathode ray tube (CRT) display or liquid crystal display (LCD).
  • the graphics subsystem 1050 receives textual and graphical information, and processes the information for output to the output display 1060.
  • the components contained in the computer system 1000 are those typically found in general purpose computer systems, and in fact, these components are intended to represent a broad category of such computer components that are well known in the art.
  • the user interface for the medical informatics system may be implemented in either hardware or software.
  • the user interface is software that includes a plurality of computer executable instructions for implementation on a general purpose computer system. Prior to loading into a general-purpose computer system, the user interface software may reside as encoded information on a computer readable medium, such as a magnetic floppy disk, magnetic tape, and compact disc read only memory (CD-ROM).
  • the user interface system may comprise a dedicated processor including processor instructions for performing the functions described herein. Circuits may also be developed to perform the functions described herein.
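The row-height rule set described in the patient canvas discussion above (a default target of 320 pixels, an actual height derived from the row contents, an optional saved user preference, and a target derived from the screen size) can be pictured with a small helper. The following Python sketch is hypothetical: it shows one plausible way such a rule might combine those inputs, not the rule set disclosed above, and the function and parameter names are invented for illustration.

```python
def row_height(item_heights, target=320, saved_preference=None, screen_height=None):
    """Choose a display height for one study row of the patient canvas.

    Preference order (an assumption for illustration): a saved per-user row
    height, then a target derived from the screen size, then the 320-pixel
    default, capped by the tallest item in the row so that unnecessary
    vertical space is eliminated.
    """
    if saved_preference is not None:
        target = saved_preference
    elif screen_height is not None:
        target = screen_height // 3        # assumption: roughly three rows visible at once
    tallest = max(item_heights, default=target)
    return min(target, tallest)

print(row_height([256, 300, 280]))                   # contents shorter than the target -> 300
print(row_height([512, 480], screen_height=1080))    # screen-derived target of 360 pixels
```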
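The linked-series cine behavior described above keeps two linked series at the same relative anatomical position even when they contain different numbers of slices. The following Python sketch is an illustration only (the function name is hypothetical); it uses the example from the text of a 100-slice series linked to a 10-slice series, so the linked series advances roughly one slice for every ten slices scrolled.

```python
def linked_slice(primary_index, primary_count, linked_count):
    """Map a slice index of the scrolled series onto the linked series.

    Both series are assumed to span the same anatomical range, so the same
    relative position is kept even when the slice counts differ.
    """
    if primary_count <= 1:
        return 0
    fraction = primary_index / (primary_count - 1)   # 0.0 at the first slice, 1.0 at the last
    return round(fraction * (linked_count - 1))

# A 100-slice series linked to a 10-slice series: the linked series advances
# roughly one slice for every ten slices the user scrolls.
for i in (0, 9, 50, 99):
    print(i, "->", linked_slice(i, 100, 10))
```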
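To make the pyramidal data structure concrete, the sketch below builds a multi-level sub-band decomposition by recursing on the "low low" quadrant of each level until a 256 x 256 low-pass image remains, matching the 4K to 2K to 1K to 512 to 256 progression of Figs. 8a and 8b. It is a minimal Python illustration, not the disclosed implementation: a simple Haar-style averaging and differencing step with NumPy stands in for the fixed-point biorthogonal wavelet kernels described above, and the integer division used here is not lossless.

```python
import numpy as np

def haar_level(img):
    """One sub-band step: split an even-sized 2-D image into LL, LH, HL, HH quadrants."""
    a = img[0::2, 0::2].astype(np.int32)
    b = img[0::2, 1::2].astype(np.int32)
    c = img[1::2, 0::2].astype(np.int32)
    d = img[1::2, 1::2].astype(np.int32)
    # The low-pass ("low low") average keeps the dynamic range of the source;
    # the sums behind the high-pass bands span the input range plus two bits.
    ll = (a + b + c + d) // 4
    lh = (a + b - c - d) // 4
    hl = (a - b + c - d) // 4
    hh = (a - b - c + d) // 4
    return ll, lh, hl, hh

def build_pyramid(source, min_size=256):
    """Recur on the "low low" component of each level until min_size is reached."""
    levels = []
    ll = source
    while min(ll.shape) > min_size:
        ll, lh, hl, hh = haar_level(ll)
        levels.append({"lh": lh, "hl": hl, "hh": hh})   # unique detail data per level
    levels.append({"ll": ll})                           # e.g. 256 x 256 for a 4K source
    return levels

# A 4K x 4K source yields detail levels of 2K, 1K, 512 and 256, plus a 256 x 256 low-pass image.
pyramid = build_pyramid(np.zeros((4096, 4096), dtype=np.uint16))
print([next(iter(level.values())).shape for level in pyramid])
```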
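The "just in time" delivery and pan/zoom navigation described above amount to choosing, for each client request, the coarsest pyramid level that satisfies the requested display size and then sending only the coefficient region that covers the requested portion of the image; while the user is panning, a lower-resolution level suffices, and full detail is fetched once the pan stops. The Python sketch below is a simplified illustration of that selection logic under the assumptions of the Haar-style pyramid sketched above; the function names and the extra-level-while-panning rule are assumptions for illustration, not the actual server protocol.

```python
def choose_level(source_px, window_px, panning=False, num_levels=4):
    """Pick how many halvings of the source are enough for the display window.

    Each pyramid level halves the source resolution, so level k represents the
    image at source_px / 2**k pixels on a side.  While the user is panning, one
    extra level of reduction is tolerated because fine detail is not perceived
    in moving pixels; full detail is requested once the pan stops.
    """
    level = 0
    while level < num_levels and source_px // 2 ** (level + 1) >= window_px:
        level += 1
    if panning:
        level = min(num_levels, level + 1)
    return level

def region_at_level(x, y, w, h, level):
    """Map a requested source-image region to coefficient coordinates at a level.

    Because the transform is spatial, a region of the source maps directly to a
    (smaller) region of each level, so only those coefficients need to be sent.
    """
    s = 2 ** level
    return (x // s, y // s, max(1, w // s), max(1, h // s))

# A 4K x 4K image viewed in a 512 x 512 window: three halvings (level 3) suffice,
# so the entire source can be shown as a lower-resolution 512 x 512 image.
level = choose_level(4096, 512)
print(level, region_at_level(1024, 2048, 512, 512, level))
```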

Abstract

A user interface for a medical informatics system, which permits a physician to work with digitized medical images in a manner that the physician is accustomed to working with traditional analog film, is disclosed. The user interface includes a patient browser view to provide the ability to select studies, which consist of medical images and series, for patients. After selecting the studies, the user, through a patient canvas view, may then organize the studies as well as the images/series within the studies, including resizing the studies and the images/series within a study. The user may also pan and zoom images to view portions of an image at various resolutions. Furthermore, the user of the user interface may analyze the image by selecting to view the image in detail in a large floating window.

Description

USER INTERFACE FOR A MEDICAL INFORMATICS SYSTEM
BACKGROUND OF THE INVENTION
Field of the Invention:
The present invention is directed toward the field of medical informatics, and more particularly toward a user interface for a medical informatics system.
Art Background:
Radiology equipment (e.g., CT scanners, MRI scanners, X-Ray etc.) is in widespread use as a diagnostic tool in hospitals today. Traditionally, radiology departments utilize equipment, such as X-Ray machines, that generate images on film. Typically, when collecting information from a diagnostic tool, several medical images are generated for subsequent analysis and diagnosis of the patient's medical condition. This collection of medical images may be referred to as a "study." For example, a study from an X-Ray machine may consist of a number of X-Rays taken from different perspectives of the target area. It is the totality of the study that the physician uses to make a diagnosis of the patient.
It has become more common in the medical field for images to be stored, distributed, and viewed in digital form using computer technology. Currently, Picture Archival and Communication Systems (PACS) are in widespread use. In a typical PACS application, image data obtained by imaging equipment such as CT scanners or MRI scanners are stored in the form of computer data files. The size of a data file for an image varies depending on the size and resolution of the image. For example, a typical image file for a diagnostic-quality chest X-ray is on the order of 10 megabytes (MB). The image data files are usually formatted in a "standard" or widely accepted format. In the medical field, one widely used image format is known as DICOM. The DICOM image data files are distributed over computer networks to specialized viewing stations capable of converting the image data to high-resolution images on a CRT display.
The digitized medical images potentially provide advancements to the medical community due to the ability to electronically store, transfer and view digitized images over geographically disparate areas. However, prior art systems for viewing the digital data do not comport with how physicians traditionally operate. Physicians have become accustomed to working with analog film. First, to conduct a diagnosis using traditional film, the physician chooses the films for a patient that will aid in the analysis of the patient's condition. From the selected films, the physician organizes the films in a manner suitable to conduct the analysis and subsequent diagnosis. Specifically, the film is placed on a light board for viewing. The light board projects light through the film so that the physician may read the image imposed on the film. Prior to analyzing a study, a physician may organize the physical sheets of film on the light board in a manner suitable for conducting the analysis. It may be advantageous for a physician to place, on the light board, two sheets of film next to one another in order to analyze a condition relative to the two films. For example, the first film may comprise data taken at an earlier date, whereas the second film may contain data recently obtained. By placing the films side-by-side, the physician may analyze how a particular condition has changed over time.
Prior art systems for viewing digitized medical images do not provide a means to operate in a manner in which physicians work. As illustrated by the above example, using these prior art systems, a physician is not permitted to effectively organize medical images in a manner in which physicians may organize traditional analog films. Accordingly, it is desirable to develop a user interface for a medical informatics system that emulates the way a physician works by providing maximum flexibility for the physician to select, organize, navigate and subsequently analyze medical images. Furthermore, prior art systems for viewing digitized medical images display static images, in that the user is not permitted to navigate (i.e., pan or zoom the original image). Accordingly, it is also desirable to generate a system that permits "dynamic" interaction with medical images to provide the physician with maximum flexibility to interact with the image.
SUMMARY OF THE INVENTION
A user interface for a medical informatics system permits a physician to work with digitized medical images in a manner that the physician is accustomed to working with traditional analog film. The user interface provides the user the ability to select studies, which consist of medical images and series, for patients. In one embodiment, the user selects studies on the user interface through a patient browser view. After selecting the studies, the user may then organize the studies as well as the images/series within the studies, including resizing the studies and the images/series within a study. The user may also navigate around the images/series. Specifically, the user has the ability to pan and zoom images to view portions of an image at various resolutions. Furthermore, the user of the user interface may analyze the image by selecting to view the image in detail in a large floating window. In one embodiment, the organization, navigation, and analysis of studies and images/series are performed through a patient canvas view.
In one embodiment, the patient canvas view is displayed in a standard orientation such that each horizontal scroll bar contains a study. For this embodiment, each study displayed on the patient canvas view is broken out left to right into one or more series for CT/MR and one or more images for CR/DR. The user, using a horizontal scroll bar, is permitted to scroll left and right to display the series/images contained within the study. Multiple studies are laid out from top to bottom on the patient canvas view. A single vertical scroll bar is provided to permit the user to scroll, in a vertical direction (i.e., from top to bottom), to display the multiple studies. Using the user interface, the user may organize studies by re-arranging the relative vertical positions among the studies. Thus, the studies (i.e., the window encompassing the studies) may be re-sized to any user-desired size. The user may also use the features of the patient canvas view to organize images, within a study, by re-arranging the relative horizontal positions among the images/series within a study via a drag and drop operation.
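The canvas layout just summarized can be pictured with a small data model: a patient canvas is a vertically ordered stack of study rows, and each row is a horizontally ordered list of series or images. The following Python sketch is illustrative only; the class and method names are hypothetical and are not part of the disclosed system. It captures the re-ordering and re-sizing operations described above.

```python
class StudyRow:
    """One horizontally scrollable row of the patient canvas (one study)."""
    def __init__(self, study_id, items, height_px=320):
        self.study_id = study_id          # accession number or other study key
        self.items = list(items)          # series (CT/MR) or images (CR/DR), left to right
        self.height_px = height_px        # user-resizable row height

    def move_item(self, from_idx, to_idx):
        """Re-arrange relative horizontal positions (drag and drop within a study)."""
        item = self.items.pop(from_idx)
        self.items.insert(to_idx, item)


class PatientCanvas:
    """Vertically scrollable stack of study rows for one patient."""
    def __init__(self, patient_id, rows):
        self.patient_id = patient_id
        self.rows = list(rows)            # top-to-bottom order of studies

    def move_study(self, from_idx, to_idx):
        """Re-arrange relative vertical positions among studies (drag and drop)."""
        row = self.rows.pop(from_idx)
        self.rows.insert(to_idx, row)

    def resize_study(self, idx, height_px):
        """Re-size a study row to any user-desired height."""
        self.rows[idx].height_px = max(1, int(height_px))


# Example: two studies laid out top to bottom, then re-arranged by the user.
canvas = PatientCanvas("patient-123", [
    StudyRow("study-A", ["series-230", "series-235", "series-240"]),
    StudyRow("study-B", ["image-260", "image-265", "image-270"]),
])
canvas.move_study(1, 0)                   # bottom study dragged to the top
canvas.rows[0].move_item(2, 0)            # image 270 dragged to the left edge of its study
```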
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 illustrates one embodiment for an initial patient browser view.
Fig. 2 illustrates an example patient browser view with a plurality of the studies for each patient.
Fig. 3 illustrates an example display of a patient browser view that includes selection of patient studies.
Fig. 4 illustrates an example patient canvas view in accordance with one embodiment of the present invention.
Fig. 5 illustrates a user operation to scroll images within a study.
Fig. 6 illustrates the patient canvas view subsequent to a user operation that scrolls among studies.
Fig. 7 illustrates one embodiment of a medical informatics system for use with the user interface of the present invention.
Fig. 8a illustrates an example of a pyramidal data structure.
Fig. 8b illustrates level three and level four decompositions for the 4K x 4K source image of Fig. 8a.
DETAILED DESCRIPTION
The user interface of the medical informatics system provides a ubiquitous viewing environment for fast and simple access to medical images across the enterprise. The user interface may be operated by a physician in a manner in which physicians are accustomed to working with traditional analog film. First, the user of the medical informatics system may select studies, which consist of medical images/series, for patients. In one embodiment, this functionality is provided through a patient browser view. The user may then organize the studies as well as the images/series within the studies, including resizing the studies and the images/series within a study. The user may also navigate around the images/series. Specifically, the user has the ability to pan and zoom images to view portions of an image at various resolutions. Furthermore, the user of the user interface may analyze the image by selecting to view the image in detail in a large floating window. In one embodiment, the organization, navigation, and analysis of studies and images/series are performed through a patient canvas view. Accordingly, the user interface of the present invention emulates the way a physician works with medical images by providing full capabilities to select, organize, navigate and analyze medical information.
In one embodiment, the user interface consists primarily of a single window interface. However, additional floating windows are generated, when appropriate, to provide detailed image viewing. For the single window interface embodiment, tabs are presented to the user to permit the user to navigate between a patient browser view and one or more patient canvas views. In general, the patient browser view permits the user to select studies for one or more patients. A study, specific to a patient, comprises images obtained from a diagnostic tool, and in some cases, additional information (e.g., medical report) to augment the image data. The studies define the repository of medical images that may be used during the session. The patient canvas view provides a screen surface area for the organization, navigation and analysis of the patient medical information selected.
In one embodiment, the user interface presents the user with a simple login window. Using this login window, the user may enter a user name and password. If the user is successfully authenticated by the server (e.g., image server 720, Fig. 7), then the main window of the client computer is displayed with the patient browser tab selected.
In one embodiment, the user interface operates as a plug-in with an Internet browser application, such as Microsoft Internet Explorer or Netscape Navigator. In one embodiment, the user interface comprises, in part, executable software configured as a Microsoft® ActiveX Control. For this embodiment, the ActiveX Control is a "plug-in" to a Web browser application. The Internet browser application includes a title bar (e.g., title bar 102, Fig. 1) including controls to minimize, maximize and close the browser application, as well as a tool bar (e.g., tool bar 104, Fig. 1). Although the user interface is shown herein as an Internet browser plug-in, the user interface may operate independent of other application programs without deviating from the spirit or scope of the invention.
Patient Browser View for Selecting Studies:
Fig. 1 illustrates one embodiment for an initial patient browser view. For this embodiment, following user login, the user interface opens and displays a patient browser tab, labeled 106 on Fig. 1. In one embodiment, the user interface 100 contains search capabilities to permit a user to locate and select medical information for one or more patients. For this feature, the user interface 100 contains, as part of the patient browser tab view, controls and entry boxes (122) to allow searching for patients and studies. Specifically, entry boxes for searching include: patient's name 110, patient ID 116, patient location 118, and date of last exam 120. The user interface 100 also permits submission of predefined queries for the fields: physician, patient location, physician group, and body part. These predefined queries are stored as part of a user profile. The physician group is a class that groups different types of physicians (e.g., neurology, orthopedic, oncology, etc.). The physician groups may be assigned by an administrator of the medical informatics system. If the predefined query occurs as part of the user login process, then the initial state of the patient browser displays the results of that query. Alternatively, if a login query is not found or available, then there is no content in the patient list display area.
In general, the patient browser list view 100 displays a list of patients and their corresponding studies. Information on patients and their studies is displayed in the area labeled 128 in the user interface window 100. The example display of Fig. 1 shows, in a patient list display area, information for two patients, Jamie Walter 124 and Charles Wilkins 126. A patient ID, corresponding to the patient's name, is also displayed. For this embodiment, the list of studies only indicates the specific study, and does not indicate the series or images contained in that study. As shown in Fig. 1, the information displayed for each patient includes: last name 110, first name 112, middle initial 114, patient ID 116, patient location 118, and date of last exam 120 (derived from the most recent study). In addition, the user may sort the list of patients by last name, patient location, and date of last exam. Fig. 2 illustrates an example patient browser view with a plurality of studies for each patient. To obtain the patient browser view of Fig. 2 from the patient browser view of Fig. 1, the user, using a cursor control device, "clicks" on a patient name line (e.g., patient name line 124 for Jamie Walter and patient name line 126 for Charles Wilkins), and the studies available for that patient are displayed. As shown in Fig. 2, a tree paradigm is used to display the studies beneath the patient title bars 124 and 126. For the hierarchical patient study display of Fig. 2, the display line for each study includes: a check box to indicate selection status, modality, study description, accession number, and exam date. Specifically, the example of Fig. 2 shows, for the patient "Walter Jamie", the studies labeled 130, 132 and 134 on Fig. 2. The abbreviation "MR" connotes magnetic resonance, and the abbreviation "CR" connotes conventional radiography (e.g., an X-Ray). For the patient "Charles Wilkins", a plurality of CT studies are revealed.
If the user clicks, using a cursor control device, to select a study, then the selection check mark in the corresponding check box (e.g., check boxes 136 and 138) is toggled. Fig. 3 illustrates an example display of a patient browser view that includes selection of patient studies. For this example, the user selected, for the patient "Charles Wilkins", CT studies 140, 142, 146, 157, 159 and 160. The selection of the studies is indicated by the check mark in the check box adjacent to the study description (e.g., CT). This selection response adds and/or removes studies from the current selection for subsequent display in the patient canvas view. In other embodiments, additional user interface features for the patient browser view permit ease of selecting and deselecting studies. For example, a "shift-click" executed by the user selects a contiguous range of studies. A "control-click" deselects all other studies and selects the single study.
As study selections are made by the user from the patient browser view, the user interface creates tabs for each patient that has at least one study selected. For the example shown in Fig. 3, patient tabs 150 and 155 are displayed for the patients "Walter Jamie" and "Charles Wilkins", respectively. The tabs are created on a per patient basis, one tab for each patient with selected studies. For this embodiment, the tabs are displayed from left to right in an order dictated by the current sort order. The example of Fig. 3 shows sorting by the patient's last name in alphabetical order. The user may move to the patient canvas view (described below) for that patient by selecting the corresponding tab. As a shortcut, if the user "double clicks" on a patient's name (e.g., line 126 for Charles Wilkins and line 124 for Jamie Walter), a canvas tab is created for the patient, and that tab is displayed similar to tabs 150 and 155 on Fig. 3. As a shortcut to adding the canvas tab for the patient and selecting all studies for display on the patient canvas view, the user "double-clicks" on a study in the list. If all studies are selected during a double-click, then each of the selected studies is displayed within the canvas view.
Patient Canvas View for Organization of Studies and Images:
In general, the patient canvas view of the user interface permits a user to organize, navigate, and analyze images/series in the studies selected. Fig. 4 illustrates an example patient canvas view in accordance with one embodiment of the present invention. A patient canvas view 200 includes a plurality of studies for the selected patient, "Jamie, Walter." As shown in Fig. 4, the patient tab, labeled 150 for "Jamie, Walter", is highlighted. Each tab displayed has a corresponding patient canvas view. Thus, another patient canvas view exists for the patient "Charles Wilkins."
The area beneath the displayed tabs is the primary display area for the studies and series/images. For the example of Fig. 4, two studies, arranged vertically on the screen, are shown. In one embodiment, selected studies are automatically laid out from top to bottom on the patient canvas view. Each study is broken out left to right into one or more series for CT/MR and one or more images for CR/DR. In the example of Fig. 4, the first or top study includes the series of images labeled 230, 235 and 240 on Fig. 4. The second study, displayed on the bottom of the patient canvas view, currently displays three images: 260, 265, and 270.
In one embodiment, the patient canvas view is displayed in a standard orientation such that each horizontal scroll bar (e.g., scroll bar 210 for the top study) contains a study. The user, using the horizontal scroll bar (e.g., horizontal scroll bar 210), is permitted to scroll left and right to display the series/images contained within the study. Also, a single vertical scroll bar (e.g., vertical scroll bar 205 on Fig. 4) is provided to permit the user to scroll, in a vertical direction (i.e., from top to bottom), to display multiple studies. Furthermore, the height of each study may be varied within the patient canvas view. To accomplish this operation, the user, using a cursor control device, places the cursor on a horizontal grab bar on the study (e.g., bar 290 for the top study and bar 295 for the bottom study), and resizes the study to the appropriate height. Using this technique, the studies (i.e., the windows encompassing the studies) may be re-sized.
Using the user interface, the user may organize studies by re-arranging the relative vertical positions among the studies. The user may also use the features of the patient canvas view to organize images, within a study, by re-arranging the relative horizontal positions among the images/series within a study. In one embodiment, these organization operations are executed via a drag and drop operation. As is well known, in a drag and drop operation, the user "selects" a series/image or study, and drags the series/image or study to the destination location. When the image is located at the destination location, the user releases the series/image or study to complete the drag and drop operation. A control "hot area" at the left side of each study row is displayed to provide a handle for the user to grab the study in the drag and drop operation. The study "handle" is labeled 275 for the top study and 280 for the bottom study of Fig. 4. The series (CT/MR) and images (CR/DR) may also be re-arranged within a study (i.e., re-arrange relative horizontal positions) using the drag and drop operation. For this operation, the user may "grab" an image or series using the title bar or annotation area, such as title bar 220 for series 235 on Fig. 4. The drag and drop operation provides maximum flexibility for the user to arrange the patient canvas view in any manner desired by the user. The position of studies and images displayed on the patient canvas view may also be arranged by user execution of a cut and paste operation. As is well-known for a general cut and paste operation, the user selects the study (e.g., using the cursor control device or entering a keystroke sequence), executes the cut operation with the appropriate keystroke, re-positions the cursor with the cursor control device in the new destination location, and executes the "paste" command.
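By way of illustration only, the following Python sketch (not part of the patent disclosure; the data structure and function names are assumptions of this example) models the organization operations described above: studies as rows on the canvas and series/images as cells within a row, so that drag and drop or cut and paste reordering reduces to simple list moves.

```python
# Illustrative canvas model: studies are rows, series/images are cells in a row.
# Names and structure are this sketch's own assumptions, not the patent's.

def move(items, src, dst):
    """Move items[src] to index dst, as a completed drag-and-drop would."""
    item = items.pop(src)
    items.insert(dst, item)
    return items

canvas = [
    {"study": "MR Brain", "series": ["T1", "T2", "FLAIR"]},
    {"study": "CR Chest", "images": ["PA", "Lateral"]},
]

move(canvas, 1, 0)               # drag the bottom study row above the top one
move(canvas[1]["series"], 2, 0)  # re-order series within the (now lower) MR study
print([row["study"] for row in canvas])   # ['CR Chest', 'MR Brain']
```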
In one embodiment, a rule set is applied to analyze a study of series/images displayed in a single row to determine the proper height for the row. Using this technique, row heights are selected based on the nearest optimal representation. The default target row height is 320 pixels. The actual row height is determined by analyzing the row contents (i.e., series/images) so that unnecessary space is eliminated. In another embodiment, the row height is saved as a user preference, and a target row height is used to display studies for that user. In other embodiments, the target row height is determined from the user's screen size or window resolution.
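The rule set itself is not spelled out in the text; the sketch below is one plausible reading (the function name, the 320-pixel default, and the preference handling are assumptions of this example): start from a target height, then trim the row to the tallest item actually present so that unnecessary space is eliminated.

```python
# Hedged sketch of a row-height rule: cap at a target (or user preference),
# then shrink to the tallest series/image actually in the row.

def row_height(item_heights, target=320, user_preference=None):
    """Pick a display height for a study row from its series/image heights."""
    goal = user_preference or target
    tallest = max(item_heights, default=goal)
    return min(goal, tallest)

print(row_height([256, 300, 280]))              # 300: the row shrinks to fit
print(row_height([512, 480]))                   # 320: capped at the default target
print(row_height([512], user_preference=400))   # 400: a stored preference wins
```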
The patient canvas view on the user interface provides the functionality to "clone" an image. To this end, a user may copy an image or series, and paste the image or series in a different location. For the example shown in Fig. 4, the user may copy, through a standard copy operation, image 260 in the second study (i.e., the bottom study), and paste the image to the right of image 265. In other embodiments, the user may copy an image in one study (e.g., the bottom study), and paste the image into a different study (e.g., the first or top study). The result of this operation is shown on the display of Fig. 5, starting with the display of Fig. 4, with image 260 appearing in both the first and second studies.
Fig. 5 illustrates a user operation to scroll images within a study. To view additional images contained in the first study shown in Fig. 4, the user, utilizing scroll bar 210, scrolls through the images/series contained within the study. Fig. 5 shows a different view from the study of Fig. 4 subsequent to a user operation to scroll the images/series from right to left.
Fig. 6 illustrates the patient canvas view subsequent to a user operation that scrolls among studies. Specifically, for this example, the user, utilizing the scroll bar 205, scrolls, in a vertical direction (e.g., from bottom to top), the top and bottom studies to view more of the bottom study (and subsequently less of the top study).
In one embodiment, each series/image displayed within a study includes control points and annotation information. The control points and annotation information may be implemented similar to a standard window in a user interface. For example, the study "date and time" may be displayed at the top of the image, and the study and series descriptive information may be displayed below the image. For the example of Fig. 4, the image 230 of the top study includes, as a control point, the bar with the text "test 3", labeled 215, and an annotation field 245 entitled "annotation test 3." The annotation field may include any type of information used to describe the image, including image frame and canvas row frame information. In addition, for floating windows (described below), annotation information such as patient name and date of study may be displayed.
After a session, the selection and arrangement of studies and images/series are stored in a persistent datastore. When that user selects the same patient again, the patient browser view is restored to the previous display from the prior session.
Patient Canvas View for Navigation and Analysis of Images:
The patient canvas view of the user interface permits a user to fully "navigate" the image. Typically, medical images are large, and cannot be displayed at full resolution on a computer monitor. Thus, when displayed in small windows (e.g., image 235 in Fig. 4 displayed at approximately 320 pixels), only portions of the medical image are displayed at any one time. For example, a medical image consisting of a pixel resolution of 4K x 4K cannot be displayed at full resolution on a monitor comprising a pixel resolution of 1024 x 768. Thus, only portions of the 4096 x 4096 source image are displayed through the user interface at a given time. For example, the user interface may display, in a 512 x 512 window, the entire source image at a lower resolution (i.e., a thumbnail sketch of the image).
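The resolution arithmetic in this paragraph can be illustrated with a short sketch (the helper below is hypothetical, not part of the disclosure): pick the shallowest decomposition level whose low-pass image fits within the viewing window, so a 4096 x 4096 source shown in a 512 x 512 window is served from level three.

```python
# Assumed helper: choose the decomposition level whose low-pass image fits the window.

def level_for_window(source_px, window_px):
    """Smallest level whose low-pass image is no larger than the window."""
    level, size = 0, source_px
    while size > window_px:
        level += 1
        size //= 2
    return level, size

print(level_for_window(4096, 512))   # (3, 512)  -> thumbnail of the whole image
print(level_for_window(4096, 1024))  # (2, 1024)
```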
The images, displayed on the user interface, are "dynamic images." The images are dynamic because the user may fully manipulate each image to display different portions of the image (pan the original image) at different resolutions (zoom in and out). In one embodiment, a dynamic transfer syntax, described below, provides full functionality to allow the user to manipulate the image in any manner desired. Starting with the lower resolution "dynamic image", the user may zoom-in on a more specific portion of the image. Thereafter, the user may pan the image to view a different portion of the image at the higher resolution. Accordingly, through the pan and zoom functions, the user may navigate through the images.
In one embodiment, only portions of an image are displayed as the user continuously pans an image. The eye is only capable of perceiving a certain level of detail while pixels are moving during the pan operation. The user interface takes advantage of this fact and uses only a lower resolution version of the image during the pan operation. The lower resolution version provides adequate detail for user perception. When the continuous panning activity halts, additional details are supplied to the image to display the image at the desired resolution. This feature enhances performance of the medical informatics system in low network bandwidth implementations.
The patient canvas view on the user interface permits a user to link series within the canvas. With this feature, as a user scrolls through slices of a first series, the second series, linked to the first series, is also scrolled. The patient canvas view also permits linking of any image or series, including images and series displayed in floating windows. The user interface also permits a user to clone a series for display at different window widths and window levels ("WW/WL") (i.e., contrast and brightness, respectively). The user interface further permits a user to scroll a CT/MR series to a particular slice, and then link this series to another series for simultaneous cine. In addition, the patient canvas view maintains, for simultaneous cine between two series, the same anatomical position for both series, even if the series contain a different number of slices. For example, a first series may contain 100 slices within an anatomical position of a patient, and a second series may contain only 10 slices within the same anatomical position of the patient. For this example, the simultaneous cine feature displays 10 slices of the first series for every 1 slice of the second series. A WW/WL adjustment made in any of the three display modes is inherited by subsequent displays of that series or image during the current session. This includes larger windows created for a series or image. Multiple link channels are supported, as indicated by a number by a link icon and a drop down selection option at the point of linking. For this embodiment, the user may move through a single or linked series with a scroll wheel on a cursor control device, pan and zoom around CR/DR images, and use the left button of the cursor control device to change WW/WL.
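A minimal sketch of the slice-mapping rule for simultaneous cine follows, assuming both linked series span the same anatomical range (the function name and the proportional-mapping rule are assumptions of this example, not taken from the patent): the linked series is advanced in proportion to its slice count, so a 100-slice series advances roughly ten slices for each slice of a 10-slice series.

```python
# Assumed mapping for simultaneous cine between linked series that cover the
# same anatomical range but contain different numbers of slices.

def linked_slice_index(primary_index, primary_count, linked_count):
    """Index into the linked series matching the primary slice position."""
    fraction = primary_index / max(primary_count - 1, 1)
    return round(fraction * (linked_count - 1))

# Scrolling a 10-slice series drives a 100-slice series in steps of ~10-11 slices.
for i in range(10):
    print(i, linked_slice_index(i, 10, 100))
```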
The patient canvas view of the user interface permits a user to fully "analyze" the image. For the analysis stage, the user interface of the present invention permits a user to create detailed views of selected images. In one embodiment, the user interface for the medical informatics system permits the user to generate large floating windows for detailed views. The detailed views permit a physician to analyze the image once the desired portion of the image is located in the navigation phase. For example, if the user navigates to a specific portion of a large image, the user may invoke the user interface to display the detailed portion of the image in a large floating window sized to capture the full resolution of the specific portion. In one embodiment, to create a floating window, the user double-clicks on the image, using the cursor control device, and the image is displayed in the large floating window. In one embodiment, the full floating window consists of approximately 75 percent of the display area. For example, the floating window target height may be 640 pixels, as compared to the study scroll area target height of 320 pixels. For standard CT/MR, the display area may comprise a 512 x 512 pixel window. Linked images and series may also be displayed in floating windows. In one embodiment for implementing this feature, the user double clicks on a linked image, or selects multiple series/images, to receive a display of a collage of those series/images, each displayed as a floating window. In this view, the user may cine through the series, as described above, link or unlink two series, change the WW/WL with the right mouse button, or pan and zoom within single or linked CR/DR images.
A double-click at the floating window level or zoom box control takes the user directly to the full screen display mode with the same image manipulation interactions. On a floating unlinked CT/MR series window, a double click cursor action by the user brings the images up to a nine-on-one tile mode. The nine-on-one tile mode displays images on top of one another. Under this scenario, the user may use the scroll wheel to move the tile images back and forth one page at a time. A left-hand button on the cursor control device permits the user to adjust WW/WL for all the displayed images, so as to maintain persistence while scrolling the pages. In one embodiment, for the nine-on-one tile mode, the size of each image within the window may comprise 256 x 256 pixels. In one embodiment, floating windows have basic intelligent layout properties, in that multiple floating windows stack using an offset of approximately 16 pixels to the right and 100 pixels down. As an additional feature, the floating windows have a button at the base of the window for prior image/series and next image/series control. Also, the floating windows have a link item menu available on the lower right corner of the image area, similar to the link button on the canvas. This allows shared pan and zoom for plain images, shared cine for CT/MR series, and shared page by page review for CT/MR series in tiled mode. Additionally, the user interface permits the user to toggle, using the cursor control device and keys, functionality between pan/zoom and slice navigation for CT/MR images.
In one embodiment, the user interface displays, in addition to images/series and studies, radiological reports to the left side of each study display area. The report area is associated with the study, and a scroll bar permits the user to scroll vertically to read the text contained in the report. In one embodiment, the report feature includes a "splitter" control bar so as to allow the user to adjust the horizontal display area of the report area to expand or contract its size. Example splitter control bars 275 and 280 are shown in Fig. 4.
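A small sketch of the cascading layout behavior described above follows (the start position, screen size, and wrap-around rule are assumptions of this example; only the 16-pixel and 100-pixel offsets come from the text).

```python
# Assumed cascade layout: each new floating window is offset ~16 px right and
# ~100 px down from the previous one; the wrap-around rule is this sketch's own.

def cascade_positions(count, start=(64, 64), dx=16, dy=100,
                      screen=(1280, 1024), window=(640, 640)):
    """Top-left positions for `count` stacked floating windows."""
    positions, (x, y) = [], start
    for _ in range(count):
        positions.append((x, y))
        x, y = x + dx, y + dy
        if x + window[0] > screen[0] or y + window[1] > screen[1]:
            x, y = start   # restart the cascade once it would run off screen
    return positions

print(cascade_positions(5))
```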
Medical Informatics System:
Fig. 7 illustrates one embodiment of a medical informatics system for use with the user interface of the present invention. In one embodiment, a medical informatics system employs dynamic transfer syntax. For this embodiment, medical informatics system 700 includes imaging equipment 705 to generate source images 715 for storage in electronic form in an image archive 712. The image archive 712 contains electronic storage components such as disk drives and tape drives used to store the images in a highly reliable manner. The images are stored in a suitable archival format, such as the above-mentioned DICOM format. The imaging equipment 705 includes any type of equipment to generate images, including medical equipment (e.g., X-ray equipment, CT scanners, and MR scanners).
For this embodiment, the medical informatics system 700 includes at least one image server 720. The pyramidal data structure is stored in image server 720. Image server 720 is coupled to one or more client computers via a direct or network connection, labeled 780 on Fig. 7. The user interface of the present invention operates on client computers. However, the user interface may operate as a server application that provides functionality to the client computers. For the example shown in Fig. 7, client computers include both thick clients (i.e., computers with robust processing, memory, and display resources) and thin clients (i.e., computers with minimal processing, memory, and display resources). Specifically, for the example embodiment of Fig. 7, client computer 740 consists of a workstation, client computers 750 and 760 consist of desktop computers, and client computer 770 consists of a portable or notebook computer. For this embodiment, the image server 720 transmits to the client computers 740, 750 and 760 transformations of the source image 715 ("transform data"), stored as pyramidal data structure 735, to re-create images and sub-images in the client computers. The image server 720 transfers only the coefficient data required to reconstruct a requested image at the client(s), thus implementing a "just in time" data delivery system. The dynamic transfer syntax technique permits use of a network with moderate bandwidth capacity, while still providing low latency for transfer of large data files from the image server 720 to client computers 740, 750, 760 and 770. For example, the network 780 in the medical informatics system 700 may utilize an Ethernet (10BaseT) medium or an ISDN transmission medium. Regardless, any network, including wide area networks (WANs) and local area networks (LANs), may be used without deviating from the spirit and scope of the invention. The medical informatics system 700 processes one or more source images 715. Generally, the source image(s) 715 includes a digitized medical image generated from medical instrumentation (e.g., mammogram, X-Ray, MRI, CAT scan, etc.). Although the present invention is described for use with medical images, any large data file may be used as a source image 715 without deviating from the spirit or scope of the invention. As shown in Fig. 7, the source image(s) 715 are input to decomposition processing 725.
In general, decomposition processing 725 transforms the source images 715 into the dynamic transfer syntax representation, also referred to herein as pyramidal data structure 735. In general, the pyramidal data structure 735 comprises a hierarchical representation of the source image. Each level of the hierarchical representation is sufficient to reconstruct the source image at a given resolution. In one embodiment, the decomposition processing 725 utilizes a sub-band decomposition to generate the hierarchical representation. In general, sub-band decomposition consists of executing a process to separate "high-pass" information from "low-pass" information. For the sub-band decomposition embodiment, decomposition processing 725 comprises a finite impulse response (FIR) filter. In one embodiment that uses sub-band decomposition, the decomposition processing 725 uses wavelet transforms, which are a sub-class of the sub-band decomposition transform. In general, the wavelet transform may be selected so that the kernels aggregate a sufficient amount of the image information into the terms or coefficients. Specifically, the information is aggregated into the "low low" component of the decomposition. In one embodiment, kernels of the wavelet transform are selected so as to balance the computational efficiency of the transform with optimization of the aggregate information in the low pass components. This characteristic of wavelet transforms permits transfer, and subsequent display, of a good representation of the source image at a particular resolution while maintaining the computational efficiency of the transform.
The wavelet transform function embodiment generates mathematically independent information among the levels of the hierarchical representation. Accordingly, there is no redundant information in the pyramidal data structure 735. Thus, pyramidal data structure 735 is not merely multiple replications of the source image at different resolutions, which would consist of redundant information; it contains unique data at the different levels of the hierarchical representation. The mathematically independent nature of the wavelet transform permits minimizing the amount of data transferred over a network, by requiring transfer of only the additional data, not yet resident at the client, that is necessary to construct a given image. The wavelet transforms are lossless, in that no data from the original source image is lost in the decomposition into the pyramidal data structure 735. Accordingly, the dynamic transfer syntax system has applications for use in medical imaging.
In one embodiment, fixed point kernels are used in the wavelet transform (i.e., decomposition processing 725). The use of fixed point kernels generates coefficients for the pyramidal data structure that permit an easy implementation into a standard pixel footprint. The wavelet transform, a spatial transform, generates a dynamic range of the "low low" component that is equal to the dynamic range of the source image. Because of this characteristic, the "low low" component does not contain overshoot or undershoot components. As a result, the use of fixed point kernels is preferred because no normalization process to convert the transformed dynamic range to the pixel dynamic range is required.
For this embodiment, the medical informatics system 700 directly utilizes the transform coefficients as pixels, without re-scaling the coefficients. The range of the high-pass components (i.e., "low high", "high low", and "high high" components) is the range of the input source data plus two bits per coefficient. This characteristic permits mapping of all components (i.e., high and low pass components) to a given pixel footprint.
The use of the wavelet transform to generate the pyramidal data structure provides a scalable solution for transferring different portions of a large data file. When the source image 715 is decomposed into the pyramidal data structure 735, sub-images and sub-resolution images are extracted directly from memory of the image server 720. The image server then transmits only the data, in the form of physical coefficients, required to reconstruct the exact size of the desired image for display at the client. Accordingly, the multi-resolution format is implicit in the pyramidal data structure.
A wavelet transform is a spatial transform. In general, in a spatial transform, the information is aggregated so as to preserve the predictability of the geometry of the source image. For example, using a wavelet transform with fixed point kernels, specific coefficients of the transform data may be identified that contribute to specific geometric features of the source image (i.e., a pre-defined portion of a source image is directly identifiable in the transform data). In another embodiment, the wavelet transforms use floating point kernels.
In other embodiments, the wavelet transform may be used to generate multi-spectral transform data. In general, multi-spectral transform data aggregates multiple components of the source image into a vector for the transform data. Through use of multi-spectral transform data, the wavelet transform may aggregate multi-dimensional data (e.g., two dimensional, three dimensional, etc.) for a source image. For example, multi-dimensional transform data may be used to reconstruct a source image in three dimensions. Also, the multi-spectral transform data may comprise any type of attribute for binding to the source image, such as color variations and/or non-visual components (e.g., infrared components). In general, to generate the pyramidal data structure 735, the transform is applied across the columns, and then this transform, or a different transform, is applied across the rows. The selection of the transform for decomposition processing 725 is dependent upon the particular characteristics of the pyramidal data structure desired. Each level of the pyramidal data structure is generated by recursing on the low-pass ("low low") component of the previous higher level. This recursion continues until a predetermined size is obtained. For example, in one embodiment, the lowest level in the pyramidal data structure for a source image having an aspect ratio of one-to-one consists of a low-pass component of 128 x 128. However, any granularity of resolution may be generated for use in a pyramidal data structure without deviating from the spirit or scope of the invention. Also, any quadrant may be used in the recursion process with any desired transform.
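The recursion depth follows directly from the source dimensions, as the following sketch illustrates (the helper is hypothetical; only the 128-pixel floor comes from the embodiment described above): each level halves the "low low" sub-image until the predetermined size is reached.

```python
# Assumed helper: list the low-pass sizes produced by recursing until the
# "low low" component reaches a predetermined floor (128 in the embodiment above).

def pyramid_levels(width, height, floor_size=128):
    """(level, low-pass size) pairs for a source image."""
    levels, w, h, level = [], width, height, 0
    while min(w, h) > floor_size:
        level += 1
        w, h = w // 2, h // 2
        levels.append((level, (w, h)))
    return levels

# A 4K x 4K source: level 1 -> 2048, 2 -> 1024, 3 -> 512, 4 -> 256, 5 -> 128.
print(pyramid_levels(4096, 4096))
```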
Fig. 8a illustrates an example of a pyramidal data structure. For this example, the source image comprises a 4K x 4K image. The decomposition processing 725 generates, in a first iteration, a level one Mallat structure. Specifically, as shown in Fig. 8a, a low-pass component, "low low", is generated and consists of a 2K x 2K sub-image. The 2K x 2K sub-image is labeled in Fig. 8a as 805. The high-pass components, consisting of "low high", "high high", and "high low", contain physical coefficient coordinates (e.g., the upper right hand coordinate for the rectangle that constitutes the "low high" component is (4K, 0)). Fig. 8a also illustrates a second level decomposition. The second iteration of decomposition processing 725 operates on the low-pass (i.e., "low low") component of the level one data. For the second level, the low-pass component, "low low", consists of a 1K x 1K sub-image, as labeled in Fig. 8a. Fig. 8b illustrates level three and level four decompositions for the 4K x 4K source image of Fig. 8a. To generate the level three decomposition, decomposition processing 725 operates on the level two "low low" component (i.e., the 1K x 1K image). For the level three transform, the low-pass component, "low low", is a 512 x 512 sub-image, as labeled on Fig. 8b. Fig. 8b also illustrates a fourth level of decomposition for the 4K x 4K source image. For the level four transform, the low-pass component comprises a sub-image of 256 x 256 pixels. In one embodiment, the wavelet kernel is derived from D. LeGall and A. Tabatabai (see "Sub-band coding of digital images using symmetric short kernel filters and arithmetic coding techniques," IEEE International Conference on Acoustics, Speech and Signal Processing, New York, NY, pp. 761-765, 1988). Any sub-band kernel or pyramid transform could be used within the infrastructure described by the dynamic transfer syntax; however, an integer kernel with no coefficient growth in the low pass term has particular advantages in that the low pass coefficients can be used without processing as pixels, and the transform can be inverted exactly in the integer domain. Although floating point kernels can have superior signal transfer characteristics, the additional processing required to use these coefficients as pixels, and the need for additional storage to guarantee perfect reconstruction, work to their disadvantage.
The kernel consists of a low-pass and a high-pass biorthogonal filter. With input defined as {d[j]} and [x] defined as the floor function, the forward transform is:

Low[j]  = [(d[2j] + d[2j+1]) / 2]
High[j] = d[2j] - d[2j+1] + Poly[j]
Poly[j] = [(3*Low[j-2] - 22*Low[j-1] + 22*Low[j+1] - 3*Low[j+2] + 32) / 64]

The inverse transform, used to reconstruct the image, is:

d[2j]   = Low[j] + [(High[j] - Poly[j] + 1) / 2]
d[2j+1] = Low[j] - [(High[j] - Poly[j]) / 2]
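A direct Python transcription of these formulas is given below for illustration (boundary handling by index clamping and the even-length restriction are this sketch's own additions; the patent does not specify edge treatment). Because the inverse recomputes the same prediction term from the transmitted low-pass band, the integer round trip is exact.

```python
# Sketch of the integer biorthogonal kernel above. Edge clamping and the
# even-length restriction are assumptions of this example.

def _poly(low, j):
    """Prediction term Poly[j]; indices are clamped at the signal edges."""
    def L(k):
        return low[min(max(k, 0), len(low) - 1)]
    return (3 * L(j - 2) - 22 * L(j - 1) + 22 * L(j + 1) - 3 * L(j + 2) + 32) // 64

def forward(d):
    """One level of the forward transform on an even-length list of integers."""
    assert len(d) % 2 == 0
    n = len(d) // 2
    low = [(d[2 * j] + d[2 * j + 1]) // 2 for j in range(n)]
    high = [d[2 * j] - d[2 * j + 1] + _poly(low, j) for j in range(n)]
    return low, high

def inverse(low, high):
    """Exact integer reconstruction from the low- and high-pass bands."""
    d = []
    for j in range(len(low)):
        p = _poly(low, j)
        d.append(low[j] + (high[j] - p + 1) // 2)   # d[2j]
        d.append(low[j] - (high[j] - p) // 2)       # d[2j+1]
    return d

samples = [12, 15, 14, 9, 200, 198, 7, 3]
lo, hi = forward(samples)
assert inverse(lo, hi) == samples   # lossless round trip
```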
As discussed above, the wavelet transform is a spatial transform such that the information is aggregated to preserve the predictability of the geometry of the source image. Thus, coefficient coordinates sufficient to reconstruct a desired image or sub-image at a particular level are readily identifiable.
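For example, because the geometry is preserved, the block of coefficients needed for a requested region scales by a factor of two per level, as in the following hedged sketch (the rounding policy of expanding outward to cover the requested pixels is an assumption of this example):

```python
# Assumed mapping from a full-resolution region of interest to the rectangle of
# coefficient coordinates needed at a given decomposition level.

def roi_to_coefficient_rect(x0, y0, x1, y1, level):
    """Coefficient-space rectangle covering the ROI at `level`."""
    scale = 2 ** level
    cx0, cy0 = x0 // scale, y0 // scale
    cx1, cy1 = -(-x1 // scale), -(-y1 // scale)   # ceiling division
    return cx0, cy0, cx1, cy1

# A 1024 x 1024 region of a 4K x 4K image, viewed at level 3, needs only a
# 128 x 128 block of coefficients.
print(roi_to_coefficient_rect(1024, 1024, 2048, 2048, 3))   # (128, 128, 256, 256)
```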
A more complete description of the dynamic transfer syntax is contained in U.S. Provisional Patent Application, entitled "Flexible Representation and Interactive Image Data Delivery Protocol", Serial No.: 60/091,697, inventors Paul Joseph Chang and Carlos Bentancourt, filed July 3, 1998, and U.S. Patent Application, entitled "Methods and Apparatus for Dynamic Transfer of Image Data", Serial No.: 09/339,077, inventors Paul Joseph Chang and Carlos Bentancourt, filed June 23, 1999, both of which are expressly incorporated herein by reference.
Computer System:
Fig. 9 illustrates a high-level block diagram of a general purpose computer system for implementing the user interface for the medical informatics system. A computer system 1000 contains a processor unit 1005, main memory 1010, and an interconnect bus 1025. The processor unit 1005 may contain a single microprocessor, or may contain a plurality of microprocessors for configuring the computer system 1000 as a multi-processor system. The main memory 1010 stores, in part, instructions and data for execution by the processor unit 1005. If the user interface for the medical informatics system of the present invention is partially implemented in software, the main memory 1010 stores the executable code when in operation. The main memory 1010 may include banks of dynamic random access memory (DRAM) as well as high speed cache memory. The computer system 1000 further includes a mass storage device 1020, peripheral device(s) 1030, portable storage medium drive(s) 1040, input control device(s) 1070, a graphics subsystem 1050, and an output display 1060. For purposes of simplicity, all components in the computer system 1000 are shown in Fig. 9 as being connected via the bus 1025. However, the computer system 1000 may be connected through one or more data transport means. For example, the processor unit 1005 and the main memory 1010 may be connected via a local microprocessor bus, and the mass storage device 1020, peripheral device(s) 1030, portable storage medium drive(s) 1040, and graphics subsystem 1050 may be connected via one or more input/output (I/O) busses. The mass storage device 1020, which may be implemented with a magnetic disk drive or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by the processor unit 1005. In the software embodiment, the mass storage device 1020 stores the user interface software for loading to the main memory 1010.
The portable storage medium drive 1040 operates in conjunction with a portable non-volatile storage medium, such as a floppy disk or a compact disc read only memory (CD-ROM), to input and output data and code to and from the computer system 1000. In one embodiment, the user interface for the medical informatics system software is stored on such a portable medium, and is input to the computer system 1000 via the portable storage medium drive 1040. The peripheral device(s) 1030 may include any type of computer support device, such as an input/output (I/O) interface, to add additional functionality to the computer system 1000. For example, the peripheral device(s) 1030 may include a network interface card for interfacing the computer system 1000 to a network.
The input control device(s) 1070 provide a portion of the user interface for a user of the computer system 1000. The input control device(s) 1070 may include an alphanumeric keypad for inputting alphanumeric and other key information, a cursor control device, such as a mouse, a trackball, stylus, or cursor direction keys. In order to display textual and graphical information, the computer system 1000 contains the graphics subsystem 1050 and the output display 1060. The output display 1060 may include a cathode ray tube (CRT) display or liquid crystal display (LCD). The graphics subsystem 1050 receives textual and graphical information, and processes the information for output to the output display 1060. The components contained in the computer system 1000 are those typically found in general purpose computer systems, and in fact, these components are intended to represent a broad category of such computer components that are well known in the art.
The user interface for the medical informatics system may be implemented in either hardware or software. For the software implementation, the user interface is software that includes a plurality of computer executable instructions for implementation on a general purpose computer system. Prior to loading into a general-purpose computer system, the user interface software may reside as encoded information on a computer readable medium, such as a magnetic floppy disk, magnetic tape, or compact disc read only memory (CD-ROM). In one hardware implementation, the user interface system may comprise a dedicated processor including processor instructions for performing the functions described herein. Circuits may also be developed to perform the functions described herein. Although the present invention has been described in terms of specific exemplary embodiments, it will be appreciated that various modifications and alterations might be made by those skilled in the art without departing from the spirit and scope of the invention.

Claims

What is claimed is:
1. A method for viewing medical images in a computer system, said method comprising the steps of: receiving in a computer at least one study, wherein said study comprises an identification to a plurality of medical images for a patient; displaying, on an output display of said computer, a first arrangement of said medical images for said study including displaying a first view of said medical images, wherein a view comprises display of at least a portion of a medical image at a specified resolution; providing through said computer a means to organize said first arrangement of said medical images for said study to generate a second arrangement of said medical images; providing through said computer a means to select a second view of said medical images, wherein said second view comprises a view different than said first view; and displaying on an output display of said computer said second view of said medical image in said second arrangement.
2. The method as set forth in claim 1, further comprising the step of providing through said computer a means to select said at least one study of a patient.
3. The method as set forth in claim 2, wherein the step of selecting at least one study of a patient comprises the step of displaying a patient browser view, said patient browser view providing a means to select at least one patient and at least one study for said patient.
4. The method as set forth in claim 1, wherein: the step of receiving at least one study comprises the step of receiving a plurality of studies for a single patient; and the step of providing to a user a means to organize said first arrangement into a second arrangement comprises the step of providing to a user a means to re-arrange an order of display for said studies.
5. The method as set forth in claim 1, wherein: the step of receiving at least one study comprises the step of receiving a plurality of studies for a single patient; and the step of providing to a user a means to organize said first arrangement into a second arrangement comprises the step of providing to a user a means to scroll among said medical images of said studies.
6. The method as set forth in claim 1, wherein the step of providing to a user a means to organize said first arrangement into a second arrangement comprises the step of providing to a user a means to re-arrange an order of display for said medical images within said study.
7. The method as set forth in claim 1, wherein the step of providing a means to select a second view of said medical images comprises the step of providing to a user a means to resize said medical images.
8. The method as set forth in claim 7, wherein the step of providing to a user a means to resize said medical images further comprises the step of maintaining an aspect ratio between height and width of said medical image.
9. The method as set forth in claim 1, wherein the step of providing a means to select a second view of said medical images comprises the step of providing to a user a means to pan a medical image such that said second view comprises a display of a different portion of said medical images than said first view.
10. The method as set forth in claim 1, wherein the step of providing a means to select a second view of said medical images comprises the step of providing to a user a means to zoom in on a portion of a medical image such that said second view comprises a resolution different from said first view.
11. The method as set forth in claim 1, wherein the step of providing to a user a means to organize said first arrangement into a second arrangement comprises the step of providing to a user a means to scroll said medical images within a study.
12. The method as set forth in claim 1, further comprising the steps of: receiving at least two series; and linking said two series so as to provide simultaneous cine.
13. The method as set forth in claim 1, further comprising the step of displaying, in response to user selection, a report associated with said study.
14. A method for viewing and navigating a medical image on a computer, said method comprising the steps of: storing a dynamic medical image so as to define data to reconstruct a plurality of portions of said medical image at a plurality of resolutions, and to reconstruct said portions of said medical image at a plurality of resolutions; displaying at least one medical image on an output display at a first view, wherein a view comprises display of at least a portion of said medical image at a specified resolution; receiving user input that designates a second view for said medical image, wherein said second view comprises a view different from said first view; reconstructing, at said computer, said second view for said medical image; and displaying said second view on said output display.
15. A computer readable medium comprising a plurality of instructions, which when executed by a computer, causes the computer to perform the steps of: receiving in a computer at least one study, wherein said study comprises an identification to a plurality of medical images for a patient; displaying, on an output display of said computer, a first arrangement of said medical images for said study including displaying a first view of said medical images, wherein a view comprises display of at least a portion of a medical image at a specified resolution; providing through said computer a means to organize said first arrangement of said medical images for said study to generate a second arrangement of said medical images; providing through said computer a means to select a second view of said medical images, wherein said second view comprises a view different than said first view; and displaying on an output display of said computer said second view of said medical image in said second arrangement.
16. The computer readable medium as set forth in claim 15, further comprising the step of providing through said computer a means to select said at least one study of a patient.
17. The computer readable medium as set forth in claim 16, wherein the step of selecting at least one study of a patient comprises the step of displaying a patient browser view, said patient browser view providing a means to select at least one patient and at least one study for said patient.
18. The computer readable medium as set forth in claim 15, wherein: the step of receiving at least one study comprises the step of receiving a plurality of studies for a single patient; and the step of providing to a user a means to organize said first arrangement into a second arrangement comprises the step of providing to a user a means to re-arrange an order of display for said studies.
19. The computer readable medium as set forth in claim 15, wherein: the step of receiving at least one study comprises the step of receiving a plurality of studies for a single patient; and the step of providing to a user a means to organize said first arrangement into a second arrangement comprises the step of providing to a user a means to scroll among said medical images of said studies.
20. The computer readable medium as set forth in claim 15, wherein the step of providing to a user a means to organize said first arrangement into a second arrangement comprises the step of providing to a user a means to re-arrange an order of display for said medical images within said study.
21. The computer readable medium as set forth in claim 15, wherein the step of providing a means to select a second view of said medical images comprises the step of providing to a user a means to resize said medical images.
22. The computer readable medium as set forth in claim 21, wherein the step of providing to a user a means to resize said medical images further comprises the step of maintaining an aspect ratio between height and width of said medical image.
23. The computer readable medium as set forth in claim 15, wherein the step of providing a means to select a second view of said medical images comprises the step of providing to a user a means to pan a medical image such that said second view comprises a display of a different portion of said medical images than said first view.
24. The computer readable medium as set forth in claim 15, wherein the step of providing a means to select a second view of said medical images comprises the step of providing to a user a means to zoom in on a portion of a medical image such that said second view comprises a resolution different from said first view.
25. The computer readable medium as set forth in claim 15, wherein the step of providing to a user a means to organize said first arrangement into a second arrangement comprises the step of providing to a user a means to scroll said medical images within a study.
26. The computer readable medium as set forth in claim 15, further comprising the steps of: receiving at least two series; and linking said two series so as to provide simultaneous cine.
27. The computer readable medium as set forth in claim 15, further comprising the step of displaying, in response to user selection, a report associated with said study.
28. A computer system comprising: input device for receiving at least one study, wherein said study comprises an identification to a plurality of medical images for a patient; output display for displaying a first arrangement of said medical images for said study including displaying a first view of said medical images, wherein a view comprises display of at least a portion of a medical image at a specified resolution; input control device for receiving user input; processing unit, coupled to said input control device, for providing a means to organize said first arrangement of said medical images for said study to generate a second arrangement of said medical images, and for providing a means to select a second view of said medical images, wherein said second view comprises a view different than said first view; and wherein said output display displays said second view of said medical image in said second arrangement.
PCT/US2000/042257 1999-11-24 2000-11-21 User interface for a medical informatics system WO2001038965A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU34411/01A AU3441101A (en) 1999-11-24 2000-11-21 User interface for a medical informatics system
EP00991761A EP1236083A2 (en) 1999-11-24 2000-11-21 User interface for a medical informatics system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/449,115 US6734880B2 (en) 1999-11-24 1999-11-24 User interface for a medical informatics systems
US09/449,115 1999-11-24

Publications (3)

Publication Number Publication Date
WO2001038965A2 WO2001038965A2 (en) 2001-05-31
WO2001038965A3 WO2001038965A3 (en) 2002-06-06
WO2001038965A9 true WO2001038965A9 (en) 2002-11-28

Family

ID=23782921

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/042257 WO2001038965A2 (en) 1999-11-24 2000-11-21 User interface for a medical informatics system

Country Status (4)

Country Link
US (1) US6734880B2 (en)
EP (1) EP1236083A2 (en)
AU (1) AU3441101A (en)
WO (1) WO2001038965A2 (en)

Families Citing this family (133)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6674449B1 (en) * 1998-11-25 2004-01-06 Ge Medical Systems Global Technology Company, Llc Multiple modality interface for imaging systems
US20040015079A1 (en) 1999-06-22 2004-01-22 Teratech Corporation Ultrasound probe with integrated electronics
US9402601B1 (en) * 1999-06-22 2016-08-02 Teratech Corporation Methods for controlling an ultrasound imaging procedure and providing ultrasound images to an external non-ultrasound application via a network
US7236637B2 (en) * 1999-11-24 2007-06-26 Ge Medical Systems Information Technologies, Inc. Method and apparatus for transmission and display of a compressed digitized image
US7421136B2 (en) * 1999-11-24 2008-09-02 Ge Medical Systems Information Technologies Inc. Image tessellation for region-specific coefficient access
US20020044696A1 (en) * 1999-11-24 2002-04-18 Sirohey Saad A. Region of interest high resolution reconstruction for display purposes and a novel bookmarking capability
JP2002165787A (en) * 2000-02-22 2002-06-11 Nemoto Kyorindo:Kk Medical tomogram display device
AU2001245481A1 (en) * 2000-03-07 2001-09-17 Hotlens.Com Inc. Server-side web browsing and multiple lens system, method and apparatus
JP2003525720A (en) * 2000-03-09 2003-09-02 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ User interface for processing and displaying image data
WO2002009107A1 (en) * 2000-07-21 2002-01-31 Fujitsu Limited Optical disk device, formatting method for optical disk, and optical disk
EP1348168A1 (en) * 2000-10-24 2003-10-01 Singingfish.Com, Inc. Method of collecting data using an embedded media player page
US7165221B2 (en) * 2000-11-13 2007-01-16 Draeger Medical Systems, Inc. System and method for navigating patient medical information
US20020138512A1 (en) * 2000-11-17 2002-09-26 William Buresh Flexible form and window arrangement for the display of medical data
AU2002218556A1 (en) * 2000-11-25 2002-06-03 Jin-Wook Chung 3 dimensional slab rendering system, method and computer-readable medium
US6934698B2 (en) * 2000-12-20 2005-08-23 Heart Imaging Technologies Llc Medical image management system
US6807293B2 (en) * 2001-03-29 2004-10-19 Ge Medical Systems Global Technology Company, Llp Method for multi-path rendering of medical images
US6826729B1 (en) 2001-06-29 2004-11-30 Microsoft Corporation Gallery user interface controls
DE10202286A1 (en) * 2002-01-22 2003-07-31 Siemens Ag Control of access to personal data, especially medical data, whereby to ensure that only authorized persons can access sensitive patient data at least a part of an authentication code is specific to the patient alone
JP2003220056A (en) * 2002-01-29 2003-08-05 Konica Corp Medical image display, image acquisition display, image display method and display format select program therein
US20040078211A1 (en) * 2002-03-18 2004-04-22 Merck & Co., Inc. Computer assisted and/or implemented process and system for managing and/or providing a medical information portal for healthcare providers
JP3788510B2 (en) * 2002-03-20 2006-06-21 コニカミノルタホールディングス株式会社 Medical image apparatus, display screen transition method and screen transition program in the apparatus
US7343565B2 (en) * 2002-03-20 2008-03-11 Mercurymd, Inc. Handheld device graphical user interfaces for displaying patient medical records
US7757183B2 (en) * 2002-04-23 2010-07-13 Draeger Medical Systems, Inc. Timing adaptive patient parameter acquisition and display system and method
US20040047497A1 (en) * 2002-09-10 2004-03-11 Confirma, Inc. User interface for viewing medical images
US20040061889A1 (en) * 2002-09-27 2004-04-01 Confirma, Inc. System and method for distributing centrally located pre-processed medical image data to remote terminals
US7406150B2 (en) * 2002-11-29 2008-07-29 Hologic, Inc. Distributed architecture for mammographic image acquisition and processing
US20040146221A1 (en) * 2003-01-23 2004-07-29 Siegel Scott H. Radiography Image Management System
US6859547B2 (en) * 2003-01-25 2005-02-22 The Mostert Group Methods and computer-readable medium for tracking motion
US20050021377A1 (en) * 2003-05-13 2005-01-27 Dobbs Andrew Bruno Method and system for direct and persistent access to digital medical data
US9715678B2 (en) 2003-06-26 2017-07-25 Microsoft Technology Licensing, Llc Side-by-side shared calendars
US8799808B2 (en) 2003-07-01 2014-08-05 Microsoft Corporation Adaptive multi-line view user interface
US7716593B2 (en) 2003-07-01 2010-05-11 Microsoft Corporation Conversation grouping of electronic mail records
US7707255B2 (en) * 2003-07-01 2010-04-27 Microsoft Corporation Automatic grouping of electronic mail
DE10332831B4 (en) * 2003-07-18 2009-07-30 Siemens Ag Method for displaying a file structure
US7366992B2 (en) * 2003-09-19 2008-04-29 Siemens Medical Solutions Usa, Inc. Method and system for displaying and/or manipulating medical image data
JP2005103055A (en) * 2003-09-30 2005-04-21 Konica Minolta Medical & Graphic Inc Medical image processor
WO2005040843A1 (en) * 2003-10-24 2005-05-06 Koninklijke Philips Electronics N.V. Diagnostic imaging system with user interface
US10437964B2 (en) 2003-10-24 2019-10-08 Microsoft Technology Licensing, Llc Programming interface for licensing
JP2005158010A (en) * 2003-10-31 2005-06-16 Hewlett-Packard Development Co Lp Apparatus, method and program for classification evaluation
US7555707B1 (en) 2004-03-12 2009-06-30 Microsoft Corporation Method and system for data binding in a block structured user interface scripting language
KR20050093019A (en) * 2004-03-18 2005-09-23 주식회사 메디슨 System and method for providing ultrasound image of an embryo through wireless network
US20050206967A1 (en) * 2004-03-19 2005-09-22 General Electric Company Method and system for managing modality worklists in hybrid scanners
JP4653557B2 (en) * 2004-05-24 2011-03-16 株式会社東芝 Medical image display device and medical image display method
US7912268B2 (en) * 2004-06-14 2011-03-22 Canon Kabushiki Kaisha Image processing device and method
US20050283739A1 (en) * 2004-06-18 2005-12-22 Julia Mohr Method and system to improve usability of a web application by providing a zoom function
US9015621B2 (en) 2004-08-16 2015-04-21 Microsoft Technology Licensing, Llc Command user interface for displaying multiple sections of software functionality controls
US7703036B2 (en) * 2004-08-16 2010-04-20 Microsoft Corporation User interface for displaying selectable software functionality controls that are relevant to a selected object
US7895531B2 (en) 2004-08-16 2011-02-22 Microsoft Corporation Floating command object
US8117542B2 (en) 2004-08-16 2012-02-14 Microsoft Corporation User interface for displaying selectable software functionality controls that are contextually relevant to a selected object
US8146016B2 (en) 2004-08-16 2012-03-27 Microsoft Corporation User interface for displaying a gallery of formatting options applicable to a selected object
US8255828B2 (en) 2004-08-16 2012-08-28 Microsoft Corporation Command user interface for displaying selectable software functionality controls
US20070016018A1 (en) * 2004-08-18 2007-01-18 Koninklijke Phillips Electronics N.V. Review mode graphical user interface for an ultrasound imaging system
JP4885432B2 (en) * 2004-08-18 2012-02-29 オリンパス株式会社 Image display device, image display method, and image display program
GB0419607D0 (en) * 2004-09-03 2004-10-06 Accenture Global Services Gmbh Documenting processes of an organisation
US7747966B2 (en) 2004-09-30 2010-06-29 Microsoft Corporation User interface for providing task management and calendar information
US7885440B2 (en) 2004-11-04 2011-02-08 Dr Systems, Inc. Systems and methods for interleaving series of medical images
US7970625B2 (en) 2004-11-04 2011-06-28 Dr Systems, Inc. Systems and methods for retrieval of medical data
US7920152B2 (en) 2004-11-04 2011-04-05 Dr Systems, Inc. Systems and methods for viewing medical 3D imaging volumes
US7787672B2 (en) * 2004-11-04 2010-08-31 Dr Systems, Inc. Systems and methods for matching, naming, and displaying medical images
US7660488B2 (en) * 2004-11-04 2010-02-09 Dr Systems, Inc. Systems and methods for viewing medical images
DE102004055473A1 (en) * 2004-11-17 2006-05-24 Siemens Ag Image display method for use in medical examinations, in which a sub-area of the image is electronically processed differently from the remaining area and the processing region can be shifted across the image
US7738684B2 (en) * 2004-11-24 2010-06-15 General Electric Company System and method for displaying images on a PACS workstation based on level of significance
US7929740B2 (en) * 2004-11-26 2011-04-19 Hologic, Inc. User definable scanning protocols for use with mammographic computer-aided detection and film scanning systems
US20060126108A1 (en) * 2004-12-15 2006-06-15 Lexmark International, Inc. Method, printer, and storage medium for printing a medical image
US20060229748A1 (en) * 2005-04-11 2006-10-12 Yarger Richard W Method and apparatus for dynamic comparison of data sets
US20070050718A1 (en) * 2005-05-19 2007-03-01 Moore Michael R Systems and methods for web server based media production
US7886290B2 (en) 2005-06-16 2011-02-08 Microsoft Corporation Cross version and cross product user interface
WO2007012996A2 (en) * 2005-07-26 2007-02-01 Koninklijke Philips Electronics, N.V. Revolutionary series control for medical imaging archive manager
US8239882B2 (en) 2005-08-30 2012-08-07 Microsoft Corporation Markup based extensibility for user interfaces
US8689137B2 (en) 2005-09-07 2014-04-01 Microsoft Corporation Command user interface for displaying selectable functionality controls in a database application
US9542667B2 (en) 2005-09-09 2017-01-10 Microsoft Technology Licensing, Llc Navigating messages within a thread
US7739259B2 (en) 2005-09-12 2010-06-15 Microsoft Corporation Integrated search and find user interface
US8627222B2 (en) 2005-09-12 2014-01-07 Microsoft Corporation Expanded search and find user interface
US20070197909A1 (en) * 2006-02-06 2007-08-23 General Electric Company System and method for displaying image studies using hanging protocols with perspectives/views
US7864995B2 (en) * 2006-02-11 2011-01-04 General Electric Company Systems, methods and apparatus of handling structures in three-dimensional images
US7864994B2 (en) * 2006-02-11 2011-01-04 General Electric Company Systems, methods and apparatus of handling structures in three-dimensional images having multiple modalities and multiple phases
US8683362B2 (en) * 2008-05-23 2014-03-25 Qualcomm Incorporated Card metaphor for activities in a computing device
US9274807B2 (en) 2006-04-20 2016-03-01 Qualcomm Incorporated Selective hibernation of activities in an electronic device
US8296684B2 (en) 2008-05-23 2012-10-23 Hewlett-Packard Development Company, L.P. Navigating among activities in a computing device
US7945083B2 (en) * 2006-05-25 2011-05-17 Carestream Health, Inc. Method for supporting diagnostic workflow from a medical imaging apparatus
US9727989B2 (en) 2006-06-01 2017-08-08 Microsoft Technology Licensing, Llc Modifying and formatting a chart using pictorially provided chart elements
US8605090B2 (en) 2006-06-01 2013-12-10 Microsoft Corporation Modifying and formatting a chart using pictorially provided chart elements
US8280483B2 (en) * 2006-06-14 2012-10-02 Koninklijke Philips Electronics N.V. Multi-modality medical image viewing
US10387612B2 (en) * 2006-06-14 2019-08-20 Koninklijke Philips N.V. Multi-modality medical image layout editor
US7869637B2 (en) * 2006-07-31 2011-01-11 Siemens Medical Solutions Usa, Inc. Histogram calculation for auto-windowing of collimated X-ray image
US8242972B2 (en) * 2006-09-06 2012-08-14 Stereotaxis, Inc. System state driven display for medical procedures
US7634733B2 (en) * 2006-09-18 2009-12-15 Agfa Inc. Imaging history display system and method
JP2008119146A (en) * 2006-11-09 2008-05-29 Olympus Medical Systems Corp Image display device
US7953614B1 (en) 2006-11-22 2011-05-31 Dr Systems, Inc. Smart placement rules
US7684544B2 (en) * 2006-12-14 2010-03-23 Wilson Kevin S Portable digital radiographic devices
US20080175460A1 (en) * 2006-12-19 2008-07-24 Bruce Reiner Pacs portal with automated data mining and software selection
EP2165326A2 (en) * 2007-05-31 2010-03-24 Visan Industries Systems and methods for rendering media
US8484578B2 (en) 2007-06-29 2013-07-09 Microsoft Corporation Communication between a document editor in-space user interface and a document editor out-space user interface
US8201103B2 (en) 2007-06-29 2012-06-12 Microsoft Corporation Accessing an out-space user interface for a document editor program
US8762880B2 (en) 2007-06-29 2014-06-24 Microsoft Corporation Exposing non-authoring features through document status information in an out-space user interface
EP2031531A3 (en) * 2007-07-20 2009-04-29 BrainLAB AG Integrated medical technical display system
US20090037840A1 (en) * 2007-08-03 2009-02-05 Siemens Medical Solutions Usa, Inc. Location Determination For Z-Direction Increments While Viewing Medical Images
US8286090B2 (en) * 2007-10-22 2012-10-09 General Electric Company Systems and methods for displaying and visualizing information
US8803911B2 (en) * 2007-11-16 2014-08-12 Three Palm Software User interface and viewing workflow for mammography workstation
US20090158181A1 (en) * 2007-12-18 2009-06-18 Mellmo Llc User interface method and apparatus to navigate a document file
US9588781B2 (en) 2008-03-31 2017-03-07 Microsoft Technology Licensing, Llc Associating command surfaces with multiple active components
EP2108328B2 (en) * 2008-04-09 2020-08-26 Brainlab AG Image-based control method for medical devices
DE102008026610B4 (en) * 2008-06-03 2010-08-12 Siemens Aktiengesellschaft Method for remote monitoring of the image data quality when taking pictures with at least one medical image recording device
RU2566462C2 (en) * 2008-06-11 2015-10-27 Конинклейке Филипс Электроникс Н.В. System and method of computerised diagnostics with multiple modalities
US9665850B2 (en) 2008-06-20 2017-05-30 Microsoft Technology Licensing, Llc Synchronized conversation-centric message list and message reading pane
US8402096B2 (en) 2008-06-24 2013-03-19 Microsoft Corporation Automatic conversation techniques
US8839116B2 (en) * 2008-08-22 2014-09-16 Siemens Aktiengesellschaft User interface in an information technology (IT) system
EP2316327B1 (en) * 2008-10-14 2013-05-15 Olympus Medical Systems Corp. Image display device, image display method, and image display program
US8380533B2 (en) 2008-11-19 2013-02-19 DR Systems Inc. System and method of providing dynamic and customizable medical examination forms
US9003319B2 (en) * 2008-11-26 2015-04-07 General Electric Company Method and apparatus for dynamic multiresolution clinical data display
US8082312B2 (en) * 2008-12-12 2011-12-20 Event Medical, Inc. System and method for communicating over a network with a medical device
DE102009007040A1 (en) * 2009-02-02 2010-08-12 Siemens Aktiengesellschaft Method for controlling the measurement data recording at a medical image recording device, image recording device and user interface
US8799353B2 (en) 2009-03-30 2014-08-05 Josef Larsson Scope-based extensibility for control surfaces
US9046983B2 (en) 2009-05-12 2015-06-02 Microsoft Technology Licensing, Llc Hierarchically-organized control galleries
US8762889B2 (en) * 2009-09-23 2014-06-24 Vidan Industries Method and system for dynamically placing graphic elements into layouts
US8712120B1 (en) 2009-09-28 2014-04-29 Dr Systems, Inc. Rules-based approach to transferring and/or viewing medical images
WO2011066222A1 (en) * 2009-11-25 2011-06-03 Vital Images, Inc. User interface for providing clinical applications and associated data sets based on image data
WO2011085815A1 (en) 2010-01-14 2011-07-21 Brainlab Ag Controlling a surgical navigation system
US8171094B2 (en) * 2010-01-19 2012-05-01 Event Medical, Inc. System and method for communicating over a network with a medical device
US10235728B1 (en) * 2010-02-18 2019-03-19 NexTech Systems, Inc. Integrated medical practice management and image management
US8302014B2 (en) 2010-06-11 2012-10-30 Microsoft Corporation Merging modifications to user interface components while preserving user customizations
US9262444B2 (en) 2010-11-24 2016-02-16 General Electric Company Systems and methods for applying series level operations and comparing images using a thumbnail navigator
US20120324397A1 (en) * 2011-06-20 2012-12-20 Tabb Alan Patz System and method for wireless interaction with medical image data
US9075899B1 (en) * 2011-08-11 2015-07-07 D.R. Systems, Inc. Automated display settings for categories of items
JP5782520B2 (en) * 2011-09-29 2015-09-24 株式会社日立メディコ Image display control device, image display control method, and program
US8799358B2 (en) 2011-11-28 2014-08-05 Merge Healthcare Incorporated Remote cine viewing of medical images on a zero-client application
US9471747B2 (en) 2012-01-06 2016-10-18 Upmc Apparatus and method for viewing medical information
US8773463B2 (en) 2012-01-20 2014-07-08 Nephosity, Inc. Systems and methods for image data management
US9495604B1 (en) 2013-01-09 2016-11-15 D.R. Systems, Inc. Intelligent management of computerized advanced processing
KR20150065376A (en) * 2013-12-05 2015-06-15 삼성전자주식회사 Radiation imaging apparatus and method for display of radioactive image
USD789973S1 (en) * 2013-12-20 2017-06-20 Roche Diagnostics Operations, Inc. Display screen or portion thereof with graphical user interface and computer icons
US10909168B2 (en) 2015-04-30 2021-02-02 Merge Healthcare Solutions Inc. Database systems and interactive user interfaces for dynamic interaction with, and review of, digital medical image data
US20230084352A1 (en) * 2021-09-10 2023-03-16 Cerner Innovation, Inc. Linking graph

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5452416A (en) * 1992-12-30 1995-09-19 Dominator Radiology, Inc. Automated system and a method for organizing, presenting, and manipulating medical images
US5431161A (en) * 1993-04-15 1995-07-11 Adac Laboratories Method and apparatus for information acquisition, processing, and display within a medical camera system
US5542003A (en) 1993-09-13 1996-07-30 Eastman Kodak Method for maximizing fidelity and dynamic range for a region of interest within digitized medical image display
JP3544557B2 (en) * 1994-04-08 2004-07-21 オリンパス株式会社 Image file device
US5621430A (en) 1994-08-29 1997-04-15 Software Garden, Inc. Method and apparatus for navigating multiple independent windowed images
US5986662A (en) * 1996-10-16 1999-11-16 Vital Images, Inc. Advanced diagnostic viewer employing automated protocol selection for volume-rendered imaging
JP3878259B2 (en) * 1996-11-13 2007-02-07 東芝医用システムエンジニアリング株式会社 Medical image processing device
US5987345A (en) 1996-11-29 1999-11-16 Arch Development Corporation Method and system for displaying medical images
JPH10225441A (en) 1997-02-14 1998-08-25 Hitachi Medical Corp Method for selecting display image in medical image observing device
JP4202461B2 (en) 1998-04-09 2008-12-24 株式会社日立メディコ Medical image display device
US6081267A (en) * 1998-11-19 2000-06-27 Columbia Scientific Incorporated Computerized apparatus and method for displaying X-rays and the like for radiological analysis and manipulation and transmission of data

Also Published As

Publication number Publication date
WO2001038965A2 (en) 2001-05-31
US6734880B2 (en) 2004-05-11
US20020109735A1 (en) 2002-08-15
WO2001038965A3 (en) 2002-06-06
AU3441101A (en) 2001-06-04
EP1236083A2 (en) 2002-09-04

Similar Documents

Publication Publication Date Title
US6734880B2 (en) User interface for a medical informatics system
US6556724B1 (en) Methods and apparatus for resolution independent image collaboration
US7280702B2 (en) Methods and apparatus for dynamic transfer of image data
US20200227158A1 (en) Electronic Method and System that Improves Efficiencies for Rendering Diagnosis of Radiology Procedures
Reiner et al. Radiologists' productivity in the interpretation of CT scans: a comparison of PACS with conventional film
JP3704492B2 (en) Reporting system in network environment
US7212661B2 (en) Image data navigation method and apparatus
US20030026503A1 (en) Workstation interface for use in digital mammography and associated methods
US7492970B2 (en) Reporting system in a networked environment
US20020023067A1 (en) Integrating a primary record viewing system with a different secondary record viewing system
US20190138157A1 (en) Display device and image display system
US8803911B2 (en) User interface and viewing workflow for mammography workstation
Strickland Some cost-benefit considerations for PACS: a radiological perspective
US20080123925A1 (en) Medical Imaging System
Haynor et al. Hardware and software requirements for a picture archiving and communication system’s diagnostic workstations
Honeyman et al. Functional requirements for diagnostic workstations
Bauman et al. The digital computer in medical imaging: a critical review.
Gay et al. Processes involved in reading imaging studies: workflow analysis and implications for workstation development
Moise et al. Workflow oriented hanging protocols for radiology workstation
Feingold et al. Web-based radiology applications for clinicians and radiologists
Wernert et al. PViN: a scalable and flexible system for visualizing pedigree databases
Thoma et al. A client/server system for Internet access to biomedical text/image databanks
Weinberg et al. X-window-based 2k Display Workstation
Gohel et al. Workstation interface for ROC studies in digital mammography
Knots et al. PACS in practice: on-line communications in daily routine

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

REEP Request for entry into the european phase

Ref document number: 2000991761

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2000991761

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2000991761

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

AK Designated states

Kind code of ref document: C2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: C2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG