CA2343825A1 - Method and system for the assessment of tumor extent in magnetic resonance images - Google Patents
- Publication number
- CA2343825A1
- Authority
- CA
- Canada
- Prior art keywords
- voxels
- volume
- mechanism configured
- image data
- seed point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10088—Magnetic resonance imaging [MRI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30068—Mammography; Breast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
Abstract
A method, computer program product, and system for assessing tumor extent in medical temporally acquired images. An image acquisition unit (2) acquires images and converts the breast images into digital images. A breast volume segmentation unit (6) performs volume segmentation on the images obtained by the image acquisition unit (2), and a border removal unit (8) performs breast border removal based on data generated by the volume segmentation unit (6). A lesion enhancement unit (10) performs lesion enhancement on image data generated by the volume segmentation unit (6). A bounding unit (12) determines a bounding sphere based on the image data generated by the lesion enhancement unit (10), and a three-dimensional search volume unit (14) determines a three-dimensional search volume within the image data generated by the bounding unit (12). A suppression unit (16) suppresses surrounding structures in the image data generated by the three-dimensional search volume unit (14).
Description
TITLE OF THE INVENTION
METHOD AND SYSTEM FOR THE ASSESSMENT OF TUMOR
EXTENT IN MAGNETIC RESONANCE IMAGES
The present invention was made in part with U.S. Government support under grant numbers DAMD 17-96-1-6058 and DAMD 17-98-1-8194 from the US Army Medical Research and Materiel Command and grant number RR11459 from the NIH USPHS. The U.S. Government has certain rights in this invention.
BACKGROUND OF THE INVENTION
Field of the Invention:
The invention relates generally to a method and system for the computerized assessment of tumor extent in magnetic resonance images. Accurate quantification of the shape and extent of breast tumors is of interest for the assessment of the likelihood of malignancy, the response to therapy, the planning of radiation and/or surgical procedures, and to establish a reliable predictor for disease-free survival. Specifically, the system includes the assessment of tumor extent in the breast using three-dimensional analyses.
Techniques include novel developments and implementations of breast volume segmentation, breast border removal, lesion enhancement, determination of the bounding sphere, computation of a 3-D search volume, suppression of surrounding structures, and volume growing.
Output from the methods yields an estimate of the extent of tumor (lesion) in the breast. The system also includes an option to use the methods for initial detection of lesions in the breast.
The present invention generally relates to computerized techniques for automated analysis of digital images, for example, as disclosed in one or more of U.S.
Patents 4,839,807; 4,841,555; 4,851,984; 4,875,165; 4,907,156; 4,918,534; 5,072,384; 5,133,020; 5,150,292; 5,224,177; 5,289,374; 5,319,549; 5,343,390; 5,359,513; 5,452,367; 5,463,548; 5,491,627; 5,537,485; 5,598,481; 5,622,171; 5,638,458; 5,657,362; 5,666,434; 5,673,332; 5,668,888; 5,732,697; 5,740,268; 5,790,690; 5,832,103; 5,873,824; 5,881,124; and 5,931,780; as well as U.S. patent applications 08/173,935; 08/398,307; 08/523,210; 08/562,087; 08/757,611; 08/900,191; 08/900,361; 08/900,362; 08/900,188; 08/900,189; 08/900,192; 08/979,623; 08/979,639; 08/982,282; 09/027,468; 09/027,685; 09/028,518; 09/053,798; 09/092,004; 09/098,504; 09/121,719; 09/131,162; and 09/141,53; all of which are incorporated herein by reference.
The present invention includes use of various technologies referenced and described in the above-noted U.S. Patents and Applications, as well as described in the references identified in the appended APPENDIX and cross-referenced throughout the specification by reference to the author(s) and year of publication, in brackets, of the respective references listed in the APPENDIX, the entire contents of which, including the related patents and applications listed above and references listed in the APPENDIX, are incorporated herein by reference.
Discussion of the Background:
Accurate quantification of the shape and extent of breast tumors is of interest for the assessment of the likelihood of malignancy, the response to therapy, the planning of radiation and/or surgical procedures, and to establish a reliable predictor for disease-free survival.
Magnetic resonance imaging (MRI) has been shown to yield higher correlation between measured tumor size and actual tumor size than mammography and sonography [Gribbestad et al., 1992; Boetes et al., 1995; Davis et al., 1996; Mumtaz et al., 1997].
Currently, MR
images are typically examined on a slice-by-slice basis, thus not taking full advantage of the three-dimensional (3-D) nature of the data. The analysis is often performed visually by radiologists. Visual assessment of the 3-D extent of lesions from two-dimensional (2-D) cross sections may be difficult, especially in irregular masses. Random variations of up to 30% have been reported between the true and the observed tumor size using MRI
[Boetes et al., 1995]. These variations may be due, in part, to inter- and intra-observer differences in the analysis.
Automated techniques for the segmentation of breast lesions may improve the objectivity and the consistency of the results. Semi-automated methods for lesion segmentation have been developed in other areas such as mammography [Kupinski, 1998].
Computerized methods based on 2-D slice-by-slice analysis of MR images have also been proposed [Lucas, 1996]. In a previous study, the present inventors developed a method for computerized assessment of the malignancy of suspicious masses in MR images [Gilhuijs et al., 1998], based on interactive segmentation of the lesion by a radiologist.
SUMMARY OF THE INVENTION
Accordingly, an object of this invention is to provide a method and system for the assessment of tumor extent in magnetic resonance (MR) images.
Another object of the invention is to provide an automated method and system for automatic volume growing of lesions in MR images.
Another object of the invention is to provide an automated method and system for lesion enhancement in MR images.
Another object of the invention is to provide an automated method and system for the detection of lesions (tumors) in MR images of the breast.
Another object of the invention is to provide a method and system for the automated 3-D
segmentation of breast lesions in contrast enhanced MRI data.
These and other objects are achieved according to the invention by providing a new and improved automated method, storage medium storing a program for performing the steps of the method, and system for assessing tumor extent in medical temporally acquired images.
The method, on which the computer program product and system are based, includes obtaining image data corresponding to temporally acquired images including a tumor and surrounding anatomy and performing variance processing on the obtained image data to derive variance image data defining a variance image indicative of variation of voxels in said temporally acquired medical images.
Techniques include novel developments and implementations of breast volume segmentation, breast border removal, lesion enhancement, determination of the bounding sphere, computation of a 3-D search volume, suppression of surrounding structures, and volume growing. Output from these techniques yields an estimate of the extent of tumor (lesion) in the breast.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Figure 1 is a flowchart describing the overall method for the automated assessment of tumor extent in MRI images, which consists of seven stages: breast volume segmentation, breast border removal, lesion enhancement, determination of the bounding sphere, computation of a 3-D search volume, suppression of surrounding structures, and volume growing (for the assessment of tumor extent, the lesion location, in terms of estimated center, is determined by either a human or computerized detection method and input;
this input is referred to as the seed point);
Figure 2 is a graph showing the distribution of tumor sizes in the database used in the demonstration and evaluation of the invention;
Figure 3 is a graph showing the distribution of tumor pathology in the database used in the demonstration and evaluation of the invention;
Figures 4(a), 4(b), 4(c), and 4(d) are illustrative images of the cross sections of two breasts (left and right) at four MR acquisition times, respectively, where the time of 0 seconds refers to the pre-contrast image and a lesion is apparent in the left breast;
Figures 5(a), 5(b), 5(c), and 5(d) are images illustrating the results at various stages of breast volume segmentation, including respectively, (a) preprocessing with 3x3x3 morphological processing, (b) segmentation using maximization of the interclass variance, (c) contour tracing and filling for gap removal, and (d) the segmented breast volume in 3-D
presentation;
Figure 5(e) is a gray level histogram used in determining the gray level threshold that maximizes the variance between two classes, here, background voxels and breast voxels;
Figures 6(a) and 6(b) are images illustrating the method for the removal of the breast border (i.e., the skin) from the volume of interest for subsequent analyses, including respectively, (a) the segmented borders overlaid on the precontrast image data, and (b) the remaining volume of interest after a 3x3x3 erosion operation along the 3-D
breast border data;
Figures 7(a) and 7(b) are images illustrative of the resulting image data after lesion enhancement using, respectively, either (a) the variance or (b) the time-weighted variance processing methods;
Figure 8(a) is an image of a seed point;
Figure 8(b) is a graph of the resulting function when the mean of the voxel values on the surface of a sphere which is centered in the seed point in Figure 8(a) is computed as the function of the radius of the sphere and low-pass filtered;
Figures 9(a), 9(b), and 9(c) are images demonstrating the suppression of surrounding structures, including respectively, (a) the enhanced volume data, (b) the limited search volume, and (c) the resulting volume with the surrounding structures suppressed;
Figures 10(a) and 10(b) are, respectively, images of (a) an initial seed point and (b) a modified seed point, for illustrating the shifting of the seed point in the x, y, and z directions;
Figure 10(c) is a graph showing the method for assessment of tumor extent including the volume growing of the tumor in which the stopping criterion (to determine the gray level for final volume growing) is derived from the maximization of the interclass voxel variance between the tumor and the surrounding structures;
Figures 11 (a) and 11 (b) are images illustrating, respectively, (a) the computer output indicating the grown volume using the time-weighted variance processing and (b) the "truth"
as indicated by an experienced MR radiologist;
Figures 12(a) and 12(b) are graphs illustrating the performance of the variance and the time-weighted variance processing on assessing tumor extent in terms of, respectively, (a) overlap with the radiologist assessment of the tumor extent and (b) rms distances from the radiologist assessment of tumor extent;
Figure 13 is a flowchart describing a method for the computerized detection of lesions (tumors) within volume images of the breast by iteratively performing the volume growing and subsequent assessment of tumor extent at all regions exhibiting high uptake (of the contrast agent) that result from the variance or time-weighted variance enhancement;
Figure 14 is a block diagram of a system for performing the computerized assessment of tumor extent in MR images of the breast in accordance with the present invention; and Figure 15 is a schematic illustration of a general purpose computer 100 programmed according to the teachings of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now to the drawings, and more particularly to Figure 1 thereof, a flowchart for the automated method for the assessment of tumor extent in MR images is shown. In step S1 a set of 2-D MR images that include a volume are acquired. Then, in step S2 a 3-D representation of the volume is generated from the 2-D images.
The lesion location, in terms of the estimated center of the lesion within the volume, is input from either a human or a computer. The estimated center that is input is referred to as the seed point. The remaining steps of the image analysis scheme include breast volume segmentation in step S3, breast border removal in step S4, lesion enhancement in step S5, determination of the bounding sphere in step S6, determination of a 3-D search volume in step S7, suppression of surrounding structures in step S8, volume growing in step S9, and generating and outputting the extent of the tumor in step S10. The entire image analysis scheme is described in greater detail below.
Database
The images in this study were obtained by 3-D FLASH acquisition. A Gd-DTPA
contrast agent was administered after acquisition of the precontrast volume.
At least 3 postcontrast volumes were obtained at 90 second intervals. Each volume consists of 64 slices of 256 x 256 pixels. The pixel size is 1.25 x 1.25 mm², and the typical slice thickness is 2.0 mm. The database consists of 13 benign and 15 malignant lesions from 27 patients. The size of the lesions varies from 0.1 cm³ to 17 cm³. As shown in Figure 2, most of the lesions in the database (twenty out of twenty-eight) have volumes less than 2 cm³. Referring to Figure 3, pathology included fibroadenoma in six out of the twenty-eight lesions, papilloma in two out of the twenty-eight lesions, benign mastopathy in five out of the twenty-eight lesions, ductal carcinoma in situ (DCIS) in one out of the twenty-eight lesions, papillary carcinoma in one out of the twenty-eight lesions, tubular carcinoma in two out of the twenty-eight lesions, medullary carcinoma in one out of the twenty-eight lesions, invasive lobular carcinoma in three out of the twenty-eight lesions, and ductal carcinoma in seven out of twenty-eight of the lesions. All lesions were unifocal, some diffuse in appearance. Examples of cross section images of a malignant lesion in a contrast-enhanced MRI image are shown in Figures 4(a), 4(b), 4(c), and 4(d). Note that the images contain cross sections through both the left and right breasts. The lesion (tumor) is in the left breast and is the result of invasive ductal cancer.
Computerized segmentation and assessment of tumor extent
Referring back to Figure 1, the computerized segmentation of the breast lesions includes: breast volume segmentation in step S3, breast border removal in step S4, lesion enhancement in step S5, determination of the bounding sphere in step S6, determination of a 3-D search volume in step S7, suppression of surrounding structures in step S8, and volume growing in step S9. As noted above, the lesion location in terms of estimated center is input from either a human or computer, and is referred to as the seed point.
In step S3, the breast volume is segmented from the rest of the image data as shown in Figures 5(a), 5(b), 5(c), and 5(d). The precontrast volume data are processed by 3-D morphological opening (erosion followed by dilation) and closing (dilation followed by erosion) using a structuring element (kernel) of 3 x 3 x 3 voxels (Figure 5(a)). This processing is employed to remove small gaps (holes) and spikes from the data.
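The opening/closing preprocessing can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the use of `scipy.ndimage` are assumptions:

```python
import numpy as np
from scipy import ndimage

def preprocess_volume(vol):
    """Remove small spikes and gaps with 3x3x3 grey-scale opening and closing.

    Opening (erosion followed by dilation) removes small bright spikes;
    closing (dilation followed by erosion) fills small dark holes.
    """
    opened = ndimage.grey_opening(vol, size=(3, 3, 3))
    return ndimage.grey_closing(opened, size=(3, 3, 3))
```

An isolated bright voxel (a spike) is removed by the opening, while an isolated dark voxel inside tissue (a hole) is filled by the closing.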
Next, the preprocessed volume is segmented at a threshold derived from the global histogram of voxel values by maximization of the inter-class variance [Otsu, 1997] between breast and background voxels (Figure 5(b)). That is, given that there are two classes (breast voxels and background voxels), the gray level threshold is set at the gray level that separates the classes such that the variance between the classes is maximized. In this maximization, values from the gray level histogram (Figure 5(e)) of the entire 3-D image are used in calculating the interclass variance. Finally, gaps in the segmented breast region, which are caused by breast tissue that corresponds to low MR signals, are removed by a sequence of contour tracing and region filling (Figure 5(c)). It should be noted that step S3 can be performed on either pre- or post-contrast MR volume data. Figure 5(d) illustrates the contour of the final segmented breast volume.
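The interclass-variance (Otsu-style) threshold selection described above can be sketched as follows; this is a generic histogram-based implementation, not the patent's code, and the bin count is an assumption:

```python
import numpy as np

def otsu_threshold(volume, n_bins=256):
    """Gray level that maximizes the between-class (interclass) variance
    between two classes of voxels (e.g. background vs. breast)."""
    hist, edges = np.histogram(np.ravel(volume), bins=n_bins)
    p = hist.astype(float) / hist.sum()       # probability per gray-level bin
    omega = np.cumsum(p)                      # cumulative weight of class 0
    mu = np.cumsum(p * np.arange(n_bins))     # cumulative mean (in bin units)
    mu_total = mu[-1]
    # between-class variance for every candidate threshold bin k
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)
    k = int(np.argmax(sigma_b))
    return edges[k + 1]                       # threshold in gray-level units
```

For a clearly bimodal volume the returned threshold falls between the two modes; the segmented breast mask is then simply `volume > otsu_threshold(volume)`.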
In step S4 the border of the segmented breast volume is removed by morphological erosion along the contour of the breasts as determined in step S3. Figure 6(a) shows the borders of the breasts (determined in step S3) overlaid on the precontrast image data. Erosion is performed along this border with a structuring element, e.g., a 3x3x3 structuring element.
The border is removed to prevent the skin's uptake of the contrast agent from affecting the subsequent segmentation of lesions close to the skin. Therefore, the size of the structuring element preferably varies in proportion to the thickness of the skin in real coordinates. Figure 6(b) shows the breast section after removal of the breast borders. The resulting binary volume is used as a mask in which subsequent calculations are performed. That is, the subsequent calculations are only performed within the remaining breast volume data (i.e., without the breast borders).
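The border removal can be sketched as a binary erosion of the breast mask. This is a simplified reading: the `n_erosions` parameter is a hypothetical stand-in for scaling the structuring element with skin thickness in real coordinates:

```python
import numpy as np
from scipy import ndimage

def remove_breast_border(breast_mask, n_erosions=1):
    """Peel the skin layer off a binary breast mask by 3x3x3 erosion.

    The eroded mask is then used to restrict all subsequent calculations
    to the breast interior (i.e., excluding the skin border).
    """
    struct = np.ones((3, 3, 3), dtype=bool)
    return ndimage.binary_erosion(breast_mask, structure=struct,
                                  iterations=n_erosions)
```

Each erosion pass with a 3x3x3 element removes a one-voxel-thick shell from the mask surface, so a thicker skin would call for more passes (or a larger element).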
In step S5, regions that exhibit a relatively high uptake of the contrast agent are enhanced by assigning the variance of the uptake values over time to each individual voxel.
The processed image (volume data), P(r), is given by

P(r) = { Σ_i [F(r,i) - <F(r)>]² } / (N - 1)

where F(r,i) denotes the voxel value at position (vector) r and time frame i (where there are N time frames) prior to this variance processing and <F(r)> denotes the average voxel value over time at position r. Optionally, the calculation of the variance of the uptake values over time can be weighted by the sequence number of the acquisition, as given by

P(r) = { Σ_i [F(r,i) - <F(r)>]² / i } / (N - 1).
Using the time-weighted procedure, regions that exhibit high uptake of the contrast agent in earlier time sequences (a property commonly associated with lesions) are assigned higher values in the enhanced image than regions that correspond with slower uptake of contrast.
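The two enhancement formulas can be sketched directly from the definitions above. This is an illustrative implementation (function name and array layout are assumptions), with the acquisition index i running from 1 to N as in the time-weighted formula:

```python
import numpy as np

def variance_image(frames, time_weighted=False):
    """Per-voxel (optionally time-weighted) variance of uptake over time.

    `frames` has shape (N, z, y, x): N temporal acquisitions of one volume.
    Implements P(r) = sum_i [F(r,i) - <F(r)>]^2 / (N-1), and the
    time-weighted variant that divides each term by the acquisition index i.
    """
    frames = np.asarray(frames, dtype=float)
    n = frames.shape[0]
    dev2 = (frames - frames.mean(axis=0)) ** 2   # [F(r,i) - <F(r)>]^2
    if time_weighted:
        # divide each squared deviation by its acquisition index i = 1..N
        i = np.arange(1, n + 1).reshape(-1, 1, 1, 1)
        dev2 = dev2 / i
    return dev2.sum(axis=0) / (n - 1)
```

The division by i down-weights late deviations, so voxels whose signal changes early (rapid uptake, typical of lesions) receive higher enhanced values than voxels with equally large but late changes.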
Figure 7(a) is an image illustrating the results of the variance method for lesion enhancement, and Figure 7(b) is an image illustrating the results of the time-weighted variance method for lesion enhancement. After enhancement, the volume data are smoothed with a 3 x 3 x 3 uniform filter for further processing in the subsequent stages. Note that the processed image generated in step S5 may also be called the variance image.
In step S6 a bounding sphere for roughly limiting the subsequent volume growing of the tumor is determined as follows. The mean of the values of the voxels that intersect the surface of a sphere, which is overlaid on the enhanced image data (generated in step S5) and centered in the seed point, is computed as a function of the radius of the sphere. This function, after application of low-pass filtering to capture the low frequency trend, shall be referred to as the expansion function. Figure 8(b) shows an example of the expansion function for a malignant lesion with a necrotic core. The radius Rbound corresponds to the radius of the sphere that closely encompasses the uptake region of the lesion.
This sphere shall be referred to as the bounding sphere. Radius Rbound corresponds to the first point where the derivative of the expansion function reaches zero after crossing the global maximum. As shown in Figure 8(a), if the center of the tumor is necrotic, the function will usually rise and then decrease, indicating slower uptake of the contrast agent in the middle of the tumor. If the center of the tumor contains live cells, then the function will start high and level for a short distance, and then decrease, indicating a faster uptake of the contrast agent in the middle of the tumor. The inner, middle, and outer circles in Figure 8(a) correspond to radii R1, R2, and Rbound of the sphere, respectively. The radius Rbound provides a conservative estimate of the extent of the tumor. Further, Rbound could be used to analyze tumor growth/shrinkage in a patient over time. A more detailed estimate of the tumor extent is obtained by additionally performing steps S7-S9 described below.
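The expansion function and the Rbound criterion can be sketched as follows. This is a simplified reading, not the patent's implementation: the shell half-width, the smoothing window, and the function names are assumptions.

```python
import numpy as np
from scipy import ndimage

def expansion_function(enhanced, seed, r_max=30):
    """Mean enhanced-voxel value on spherical shells of growing radius
    around the seed point, low-pass filtered to keep the slow trend."""
    zz, yy, xx = np.indices(enhanced.shape)
    dist = np.sqrt((zz - seed[0]) ** 2 + (yy - seed[1]) ** 2
                   + (xx - seed[2]) ** 2)
    radii = np.arange(1, r_max)
    shell_mean = np.array([enhanced[np.abs(dist - r) < 0.5].mean()
                           for r in radii])
    return radii, ndimage.uniform_filter1d(shell_mean, size=5)

def bounding_radius(radii, f):
    """First radius past the global maximum of the (smoothed) expansion
    function where its derivative comes back up to zero, i.e. Rbound."""
    k = int(np.argmax(f))
    df = np.diff(f)
    for j in range(k, len(df)):
        if df[j] >= 0:          # derivative has returned to zero
            return radii[j]
    return radii[-1]
```

For a decreasing-then-flat expansion function, `bounding_radius` returns the radius where the curve first levels off after the peak.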
In step S7, the limiting search function for eventual volume growing is obtained as follows. A binary image representation of the bounding sphere is initially obtained. As an option, a smoothed version of the binary image representation of the bounding sphere is obtained by means of a triangular smoothing filter. The triangular smoothing filter is implemented as the convolution of two cubes, resulting in a function that includes the bounding sphere (white) and a gray area surrounding the bounding sphere (gray). Figure 9(b) illustrates the constraint function when determined from the bounding sphere by convolution with a triangular smoothing filter (i.e., convolution with two cubes).
In step S8, details outside the bounding sphere are suppressed by multiplication of the variance-processed volume (i.e., the variance image determined in step S5 and shown in Figure 9(a)) with the limiting search function (Figure 9(b)). The resulting volume is referred to as the target volume, since this volume may be sufficient for radiation treatment planning or surgical treatment planning. Figure 9(c) illustrates the target volume obtained by the multiplication (voxel by voxel) of the data from Figure 9(a) and Figure 9(b).
The target volume defines the limits of search for the final volume growing in which a more detailed quantification of the tumor surface is obtained (see step S9 below). The multiplication used to determine the target volume also suppresses spurious details outside the tumor. Since triangular smoothing was applied to the binary representation of the bounding sphere, the target volume represents the area of the variance image corresponding to the bounding sphere, plus the area of the variance image within the vicinity of the sphere, weighted according to the triangular smoothing function. Note that if the triangular smoothing filter had not been used on the binary image representation of the bounding sphere, volume data beyond R_bound (centered on the seed point) would simply be excluded from subsequent analyses.
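A minimal sketch of steps S7-S8, assuming NumPy/SciPy; the kernel size `cube` is an assumed parameter, and two successive box (uniform) filters stand in for the "convolution with two cubes" described above, which is equivalent to one triangular filter.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def target_volume(variance_image, seed, r_bound, cube=5):
    """Multiply the variance image by a smoothed binary bounding sphere
    (the limiting search / constraint function)."""
    zz, yy, xx = np.indices(variance_image.shape)
    sphere = (((zz - seed[0])**2 + (yy - seed[1])**2 +
               (xx - seed[2])**2) <= r_bound**2).astype(float)
    # Two successive box filters == one triangular smoothing filter.
    constraint = uniform_filter(uniform_filter(sphere, size=cube), size=cube)
    return variance_image * constraint
```

Multiplying voxel by voxel, structures well outside the bounding sphere are driven to zero while a graded transition zone survives, as in Figure 9(c).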
In step S9 volume growing is performed. A gray value threshold to terminate volume growing is determined by maximizing the interclass variance [Otsu, 1979]
between lesion and background voxels in the volume of interest encompassed by the bounding sphere, i.e., in the target volume. That is, given that there are two classes (lesion voxels and background voxels), the gray level threshold is set at the gray level that separates the classes such that the variance between the classes is maximized, based on a gray level histogram of all of the voxels in the volume of interest.
The gray level to be used in the subsequent volume growing is the level at which the interclass voxel variance within R_bound is maximized. Next, the user-indicated seed point is shifted to the nearest point in 3-D that has a voxel value greater than or equal to the global maximum of the expansion function. The expansion function (shown in Figure 10(c)) indicates the mean voxel value along the surface of the sphere as a function of the radius from the initial seed point. The nearest voxel to the initial seed point having a voxel value (i.e., gray value) corresponding to that at the radius R_max becomes the new seed point. This step is required to minimize the effect of inter- and intra-observer variations in indication of the seed point and to allow successful volume growing of lesions with seed points placed in necrotic centers (so that volume growing does not begin in the necrotic center). Figure 10(a) illustrates the tumor with the original seed point, and Figure 10(b) illustrates the tumor with the seed point shifted in the x, y, and z directions. The modified seed point is determined by shifting the seed point to the voxel with a value greater than or equal to F(R_max) at distance R_max (in three dimensions) from the initial seed point. Note that due to the shift in the z-direction (i.e., to another cross section), the necrotic center is not seen. Next, the extent of the tumor is obtained by 6-connected volume growing in the target volume, based on the gray level threshold determined in the earlier interclass variance calculation (between background and tumor) and starting from the adjusted seed point. Six-connected volume growing proceeds as follows: starting at the seed point, determine whether any of the six face-adjacent voxels of the seed point voxel are above the gray level threshold; connect each such voxel that has a gray level higher than the gray level threshold;
repeat for each connected voxel. Then, in step S10 the extent of the tumor determined in step S9 is output visually, as shown in Figure 11(a).
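Step S9 can be sketched as follows, assuming NumPy. Here `otsu_threshold` is a generic implementation of interclass-variance maximization over a gray level histogram (the 64-bin count is an arbitrary choice, not from the text), and `grow_6_connected` is a plain breadth-first flood fill over the six face neighbours.

```python
import numpy as np
from collections import deque

def otsu_threshold(values, bins=64):
    """Gray level maximizing the between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    best_t, best_v = centers[0], -1.0
    for k in range(1, bins):
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:k] * centers[:k]).sum() / w0
        m1 = (p[k:] * centers[k:]).sum() / w1
        v = w0 * w1 * (m0 - m1)**2       # between-class variance
        if v > best_v:
            best_v, best_t = v, centers[k]
    return best_t

def grow_6_connected(volume, seed, threshold):
    """Breadth-first flood fill over the six face neighbours of each voxel."""
    grown = np.zeros(volume.shape, dtype=bool)
    queue = deque([tuple(seed)])
    grown[tuple(seed)] = True
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1,0,0), (-1,0,0), (0,1,0),
                           (0,-1,0), (0,0,1), (0,0,-1)):
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[i] < volume.shape[i] for i in range(3)) \
               and not grown[n] and volume[n] >= threshold:
                grown[n] = True
                queue.append(n)
    return grown
```

In the method above, `otsu_threshold` would be applied to the voxels of the target volume, and `grow_6_connected` started from the shifted seed point.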
Evaluation
To serve as a reference in the evaluation of the computerized segmentation method, the lesions were also manually segmented in each slice by a well-trained MR
radiologist. This segmentation was performed in the subtraction images (postcontrast - precontrast) by indicating the enhanced tumor area in each slice that intersected the lesion. All available subtraction images were used for this purpose. As an additional reference, the radiologist used the original (non-subtracted) MR images, and complementary information from other modalities in case of doubt.
Figure 11(a) is a computer output image indicating the grown volume using the time-weighted variance processing. Figure 11(b) is an image showing the "truth" as indicated by an experienced MR radiologist. The agreement between the results of the computerized segmentation and the well-trained MR radiologist was objectively evaluated by overlap measures and by metric measures. The overlap measure is defined as:
(A ∩ B)/(A ∪ B), where A in this study denotes the set of voxels in the result of the computerized segmentation determination of tumor extent, and B denotes the set of voxels in the result of the reference segmentation (segmentation performed by the radiologist). Figure 12(a) shows the performance of the variance and the time-weighted variance processing on assessing tumor extent in terms of overlap with the radiologist assessment of the tumor extent. The mean overlap in 3-D
between the results of the computerized segmentation and the results of the radiologist was found to be 0.66 ± 0.14 (standard deviation). The overlap measure has a tendency to penalize deviations in contours extracted from small lesions more than it penalizes similar deviations in contours extracted from larger lesions. This problem was not observed in metric measures of similarity, as shown below.
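The overlap measure above (the Jaccard index of the two voxel sets) can be computed directly from two binary segmentation masks; a minimal NumPy sketch:

```python
import numpy as np

def overlap_measure(a, b):
    """|A ∩ B| / |A ∪ B| between two binary segmentations of the same grid."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0
```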
The metric measures of performance are based on the nearest distances in 3-D
between the contour points of the computerized segmentation and those of the reference segmentation. Figure 12(b) shows the performance of the variance and time-weighted variance processing on assessing tumor extent in terms of rms distances from the radiologist assessment of tumor extent. The significance of the difference in the results obtained by the computerized method and by the well-trained radiologist was calculated by a two-tailed Student's t-test at a 95% confidence level, given the hypothesized mean deviation of zero between the positions of the nearest contour points in 3-D. The mean of the rms distances in 3-D between the nearest contour points per lesion was found to be 1.3 mm ± 1.0 mm (standard deviation). The method showed no significant difference in performance compared with the well-trained radiologist (p>0.5).
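The metric measure can be sketched with a k-d tree over the reference contour points. This sketch assumes SciPy, takes the distances one-way from A to B (the text does not spell out whether the measure is symmetrized), and uses the voxel spacing reported in the database section (2.0 mm slice thickness, 1.25 mm pixels) to convert indices to millimetres.

```python
import numpy as np
from scipy.spatial import cKDTree

def rms_nearest_distance(points_a, points_b, spacing=(2.0, 1.25, 1.25)):
    """RMS (in mm) of the nearest distances from each contour point of A to
    the contour points of B; `spacing` maps (z, y, x) voxel indices to mm."""
    a = np.asarray(points_a, float) * spacing
    b = np.asarray(points_b, float) * spacing
    d, _ = cKDTree(b).query(a)       # nearest-neighbour distances in 3-D
    return float(np.sqrt(np.mean(d**2)))
```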
Discussion
The present invention provides a method for consistent, computerized segmentation of breast lesions and quantitation of tumor extent in 3-D from contrast-enhanced MR data.
The method requires selection of only a single point inside the lesion, which can be done by either human or computer. Inhomogeneity of the magnetic field may cause low frequency variations in image intensity in the MR data, in particular near the borders of the breast. To minimize the effect of these variations on the result of the segmentation, the stop criterion for volume growing is computed only in a volume of interest encompassed by the bounding sphere. Patient motion during acquisition of the MR data may cause artifacts in the enhancement. The average patient motion in this study is estimated to be 2 mm.
Since the pixel size is 1.25 mm, the motion mostly caused additional blurring rather than actual displacement of image structures. Image registration to align the breast volumes in the images obtained at the different time frames may further improve the accuracy of the performance.
It should be noted that although the method is presented on 3-D image data sets, the variance processing can be performed on 2-D image data sets obtained over time. Such images (slices) are obtained from the MR images of the breast.
It should also be noted that although the method is presented for MR images of the breast, the application can be performed on MR images of tumors elsewhere in the body. The uptake of the contrast agent would still be increased by the increased vascularity of the tumor.
It should also be noted that although the method is presented for MR images, the techniques can be applied to other 2-D or 3-D image sets acquired over time, thus allowing variance processing to be calculated.
It should also be noted that although the above discussion relates primarily to the initial identification of a seed point for the volume growing and the assessment of tumor extent, the presented method can also be used for the computerized detection of lesions (tumors) within volume images of the breast. This detection is performed by iteratively performing the volume growing and subsequent assessment of tumor extent at all regions of high uptake that result from the variance or time-weighted variance enhancement. As shown in Figure 13, regions exhibiting high uptake are identified in step S5, and in step S12 tumors (lesions) are detected.
To detect tumors, thresholding is performed on the variance image data to detect one or more seed points where volume growing will be initiated. The seed points correspond to detected tumors. The thresholding may be performed by identifying voxels with a higher value than that of other voxels. Alternatively, seed point voxels can be determined by identifying the center voxel (i.e., the centroid) of groups (clusters) of contiguous voxels having values above the threshold value. Volume growing (described above) may be initiated from one or more of the identified seed points to determine the extent of the identified tumors. In the event that a seed point does not have a value greater than the threshold used to terminate volume growing, volume growing can start from the voxel that is closest to the seed point and has a value exceeding the threshold, within the volume growing target region.
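The detection variant (thresholding the variance image and taking cluster centroids as seed points) might be sketched as below, assuming SciPy's connected-component labelling; the function name and the choice of threshold are illustrative.

```python
import numpy as np
from scipy.ndimage import label, center_of_mass

def detect_seed_points(variance_image, threshold):
    """Seed points for detection: centroids of connected clusters of voxels
    above the threshold in the variance (or time-weighted variance) image."""
    mask = variance_image > threshold
    labels, n = label(mask)                     # face-connected clusters
    centroids = center_of_mass(mask, labels, range(1, n + 1))
    return [tuple(int(round(c)) for c in centroid) for centroid in centroids]
```

Each returned seed point could then be handed to the volume growing of step S9 to delineate the corresponding lesion.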
Figure 14 is a block diagram of a system for the automated assessment of tumor extent in MR images of the breast. The system of Figure 14 is configured to perform any of steps S1 through S12, as described above in conjunction with Figures 1 and 13. An image acquisition unit 2 acquires breast images and converts the breast images into digital images.
The image acquisition unit 2 may also acquire digital images, eliminating the need to convert the images into a digital format. A breast volume segmentation unit 6 performs breast volume segmentation on the images obtained by the image acquisition unit, and a border removal unit 8 performs breast border removal based on data generated by the volume segmentation unit 6. A lesion enhancement unit 10 performs lesion enhancement for image data generated by the volume segmentation unit 6. A bounding unit 12 determines a bounding sphere based on the image data generated by the lesion enhancement unit 10, and a three-dimensional search volume unit 14 determines a three-dimensional search volume within the image data generated by the bounding unit 12. A suppression unit 16 suppresses surrounding structures in the image data generated by the three-dimensional search unit. A
volume growing unit 18 performs volume growing based on the image data generated by the suppression unit 16, and an output unit 20 generates a final set of image data based on the image data generated by the volume growing unit 18. The output unit may also generate additional data, including statistical data for any of the image data acquired, or generated by, the system, information representative of whether tumors were detected, and/or information representative of the extent of the tumors. As noted above, the lesion location in terms of estimated center is input from either a human or computer, and is referred to as the seed point. A three-dimensional display unit 4 generates image data corresponding to a three-dimensional representation of the breast images acquired by the image acquisition unit 2 and displays data, including two- and three-dimensional digital images generated by any element of the system, including the image acquisition unit 2, a breast volume segmentation unit 6, a border removal unit 8, a lesion enhancement unit 10, a bounding unit 12, a three-dimensional search volume unit 14, a suppression unit 16, a volume growing unit 18, and an output unit 20.
Figure 15 is a schematic illustration of a computer system for the assessment of tumor extent in magnetic resonance images. A computer 100 implements the method of the present invention, wherein the computer housing 102 houses a motherboard 104 which contains a CPU 106, memory 108 (e.g., DRAM, ROM, EPROM, EEPROM, SRAM, SDRAM, and Flash RAM), and other optional special purpose logic devices (e.g., ASICs) or configurable logic devices (e.g., GAL and reprogrammable FPGA). The computer 100 also includes plural input devices (e.g., a keyboard 122 and mouse 124), and a display card 110 for controlling monitor 120. In addition, the computer system 100 further includes a floppy disk drive 114;
other removable media devices (e.g., compact disc 119, tape, and removable magneto-optical media (not shown)); and a hard disk 112, or other fixed, high density media drives, connected using an appropriate device bus (e.g., a SCSI bus, an Enhanced IDE bus, or an Ultra DMA
bus). Also connected to the same device bus or another device bus, the computer 100 may additionally include a compact disc reader 118, a compact disc reader/writer unit (not shown) or a compact disc jukebox (not shown). Although compact disc 119 is shown in a CD caddy, the compact disc 119 can be inserted directly into CD-ROM drives which do not require caddies.
As stated above, the system includes at least one computer readable medium.
Examples of computer readable media are compact discs 119, hard disks 112, floppy disks, tape, magneto-optical disks, PROMs (EPROM, EEPROM, Flash EPROM), DRAM, SRAM, SDRAM, etc. Stored on any one or on a combination of computer readable media, the present invention includes software for controlling both the hardware of the computer 100 and for enabling the computer 100 to interact with a human user. Such software may include, but is not limited to, device drivers, operating systems and user applications, such as development tools. Such computer readable media further includes the computer program product of the present invention for performing any of steps S1 through S12, described above. The computer code devices of the present invention can be any interpreted or executable code mechanism, including but not limited to scripts, interpreters, dynamic link libraries, Java classes, and complete executable programs.
Obviously, numerous modifications and variations of the present invention are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.
APPENDIX
References:
Gribbestad, I.S., Nilsen, G., Fjøsne, H., Fougner, R., Haugen, O.A., Petersen, S.B., Rinck, P.A., and Kvinnsland, S., Contrast-enhanced magnetic resonance imaging of the breast, Acta Oncologica 8, 1992, 833-842.
Boetes, C., Mus, R.D.M., Holland, R., Barentsz, J.O., Strijk, S.P., Wobbes, T., Hendriks, J.H.C.L., and Ruys, S.H.J., Breast tumors: comparative accuracy of MR imaging relative to mammography and US for demonstrating extent, Radiology 197, 1995, 743-747.
Davis, P.L., Staiger, M.J., Harris, K.B., Ganott, M.A., Klementaviciene, J., McCarty, K.S. Jr., and Tobon, H., Breast cancer measurements with magnetic resonance imaging, ultrasonography, and mammography, Breast Cancer Res. Treat. 37, 1996, 1-9.
Mumtaz, H., Hall-Craggs, M.A., Davidson, T., Walmsley, K., Thurell, W., Kissin, M.W., and Taylor, I., Staging of primary breast cancer with MR imaging, AJR 169, 1997, 417-424.
Kupinski, M., and Giger, M., Automated seeded lesion segmentation on digital mammograms, IEEE Trans. Med. Imaging, in press, 1998.
Lucas-Quesada, F.A., Sinha, U., and Sinha S., Segmentation strategies for breast tumors from dynamic MR images, J. Magn. Reson. Imaging 6, 1996, 753-763.
Gilhuijs, K.G.A., Giger, M., and Bick, U., Computerized analysis of breast lesions in three dimensions using dynamic magnetic resonance imaging, Med. Phys. 25, 1998, 1647-1654.
Otsu, N., A threshold selection method from gray-level histograms, IEEE Transactions on Systems, Man, and Cybernetics 9, 1979, 62-66.
Figures 10(a) and 10(b) are, respectively, images of (a) an initial seed point and (b) a modified seed point, for illustrating the shifting of the seed point in the x, y, and z directions;
Figure 10(c) is a graph showing the method for assessment of tumor extent including the volume growing of the tumor in which the stopping criterion (to determine the gray level for final volume growing) is derived from the maximization of the interclass voxel variance between the tumor and the surrounding structures;
Figures 11 (a) and 11 (b) are images illustrating, respectively, (a) the computer output indicating the grown volume using the time-weighted variance processing and (b) the "truth"
as indicated by an experienced MR radiologist;
Figures 12(a) and 12(b) are graphs illustrating the performance of the variance and the time-weighted variance processing on assessing tumor extent in terms of, respectively, (a) overlap with the radiologist assessment of the tumor extent and (b) rms distances from the radiologist assessment of tumor extent;
Figure 13 is a flowchart describing a method for the computerized detection of lesions (tumors) within volume images of the breast by iteratively performing the volume growing and subsequent assessment of tumor extent at all regions exhibiting high uptake (of the contrast agent) that result from the variance or time-weighted variance enhancement;
Figure 14 is a block diagram of a system for performing the computerized assessment of tumor extent in MR images of the breast in accordance with the present invention; and Figure 15 is a schematic illustration of a general purpose computer 100 programmed according to the teachings of the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now to the drawings, and more particularly to Figure 1 thereof, a flowchart for the automated method for the assessment of tumor extent in MR images is shown. In step S1 a set of 2-D MR images that include a volume are acquired. Then, in step S2 a 3-D
representation of the volume is generated from the 2-D images.
The lesion location, in terms of the estimated center of the lesion within the volume, is input from either a human or a computer. The estimated center that is input is referred to as the seed point. The remaining steps of the image analysis scheme include breast volume segmentation in step S3, breast border removal in step S4, lesion enhancement in step S5, determination of the bounding sphere in step S6, determination of a 3-D search volume in step S7, suppression of surrounding structures in step S8, volume growing in step S9, and generating and outputting the extent of the tumor in step S10. The entire image analysis scheme is described in greater detail below.
Database
The images in this study were obtained by 3-D FLASH acquisition. A Gd-DTPA
contrast agent was administered after acquisition of the precontrast volume.
At least 3 postcontrast volumes were obtained at 90 second intervals. Each volume consists of 64 slices of 256 x 256 pixels. The pixel size is 1.25 x 1.25 mm², and the typical slice thickness is 2.0 mm. The database consists of 13 benign and 15 malignant lesions from 27 patients. The size of the lesions varies from 0.1 cm³ to 17 cm³. As shown in Figure 2, most of the lesions in the database (twenty out of twenty-eight) have volumes less than 2 cm³. Referring to Figure 3, pathology included fibroadenoma in six out of the twenty-eight lesions, papilloma in two out of the twenty-eight lesions, benign mastopathy in five out of the twenty-eight lesions, ductal carcinoma in situ (DCIS) in one out of the twenty-eight lesions, papillary carcinoma in one out of the twenty-eight lesions, tubular carcinoma in two out of the twenty-eight lesions, medullary carcinoma in one out of the twenty-eight lesions, invasive lobular carcinoma in three out of the twenty-eight lesions, and ductal carcinoma in seven out of the twenty-eight lesions. All lesions were unifocal, some diffuse in appearance. Examples of cross-section images of a malignant lesion in a contrast-enhanced MRI image are shown in Figures 4(a), 4(b), 4(c), and 4(d). Note that the images contain cross sections through both the left and right breasts. The lesion (tumor) is in the left breast and is the result of invasive ductal cancer.
Computerized segmentation and assessment of tumor extent
Referring back to Figure 1, the computerized segmentation of the breast lesions includes: breast volume segmentation in step S3, breast border removal in step S4, lesion enhancement in step S5, determination of the bounding sphere in step S6, determination of a 3-D search volume in step S7, suppression of surrounding structures in step S8, and volume growing in step S9. As noted above, the lesion location in terms of estimated center is input from either a human or computer, and is referred to as the seed point.
In step S3, the breast volume is segmented from the rest of the image data as shown in Figures 5(a), 5(b), 5(c), and 5(d). The precontrast volume data are processed by 3-D
morphological opening (erosion followed by dilation) and closing (dilation followed by erosion) using a structuring element (kernel) of 3 x 3 x 3 voxels (Figure 5(a)). This processing is employed to remove small gaps (holes) and spikes from the data.
Next, the preprocessed volume is segmented at a threshold derived from the global histogram of voxel values by maximization of the interclass variance [Otsu, 1979] between breast and background voxels (Figure 5(b)). That is, given that there are two classes (breast voxels and background voxels), the gray level threshold is set at the gray level that separates the classes such that the variance between the classes is maximized. In this maximization, values from the gray level histogram (Figure 5(e)) of the entire 3-D image are used in calculating the interclass variance. Finally, gaps in the segmented breast region, which are caused by breast tissue that corresponds to low MR signals, are removed by a sequence of contour tracing and region filling (Figure 5(c)). It should be noted that step S3 can be performed on either pre- or postcontrast MR volume data. Figure 5(d) illustrates the contour of the final segmented breast volume.
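Step S3 might be sketched as follows, assuming SciPy. Here `binary_fill_holes` stands in for the contour tracing and region filling described above, and the `threshold` argument is assumed to come from an Otsu-style interclass-variance maximization over the global histogram; both are illustrative substitutions rather than the patented implementation.

```python
import numpy as np
from scipy.ndimage import grey_opening, grey_closing, binary_fill_holes

def segment_breast(precontrast, threshold):
    """Step S3 sketch: 3x3x3 grey-scale opening and closing to remove small
    gaps and spikes, global thresholding, then filling of interior gaps."""
    smoothed = grey_closing(grey_opening(precontrast, size=(3, 3, 3)),
                            size=(3, 3, 3))
    return binary_fill_holes(smoothed > threshold)
```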
In step S4 the border of the segmented breast volume is removed by morphological erosion along the contour of the breasts as determined in step S3. Figure 6(a) shows the borders of the breasts (determined in step S3) overlaid on the precontrast image data. Erosion is performed along this border with a structuring element, e.g., a 3x3x3 structuring element.
The border is removed to prevent the skin's uptake of the contrast agent from affecting the subsequent segmentation of lesions close to the skin. Therefore, the size of the structuring element preferably varies in proportion to the thickness of the skin in real coordinates. Figure 6(b) shows the breast section after removal of the breast borders. The resulting binary volume is used as a mask in which subsequent calculations are performed. That is, the subsequent calculations are only performed within the remaining breast volume data (i.e., without the breast borders).
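Step S4 reduces to a binary erosion of the breast mask. A minimal SciPy sketch, where the number of erosion iterations (`skin_voxels`) is an assumed stand-in for scaling the structuring element to the real skin thickness:

```python
import numpy as np
from scipy.ndimage import binary_erosion

def remove_breast_border(breast_mask, skin_voxels=2):
    """Peel the border of the segmented breast volume; the result serves as
    the mask within which all subsequent calculations are performed."""
    return binary_erosion(breast_mask,
                          structure=np.ones((3, 3, 3), bool),
                          iterations=skin_voxels)
```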
In step S5, regions that exhibit a relatively high uptake of the contrast agent are enhanced by assigning the variance of the uptake values over time to each individual voxel.
The processed image (volume data), P(r), is given by

P(r) = { Σ_i [F(r,i) - <F(r)>]^2 } / (N-1)

where F(r,i) denotes the voxel value at position (vector) r and time frame i (where there are N time frames) prior to this variance processing, and <F(r)> denotes the average voxel value over time at position r. Optionally, the calculation of the variance of the uptake values over time can be weighted by the sequence number of the acquisition, as given by

P(r) = { Σ_i [F(r,i) - <F(r)>]^2 / i } / (N-1) .
Using the time-weighted procedure, regions that exhibit high uptake of the contrast agent in earlier time sequences (a property commonly associated with lesions) are assigned higher values in the enhanced image than regions that correspond with slower uptake of contrast.
Figure 7(a) is an image illustrating the results of the variance method for lesion enhancement, and Figure 7(b) is an image illustrating the results of the time-weighted variance method for lesion enhancement. After enhancement, the volume data are smoothed with a 3 x 3 x 3 uniform filter for further processing in the subsequent stages. Note that the processed image generated in step S5 may also be called the variance image.
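The variance and time-weighted variance enhancement of step S5, including the subsequent 3 x 3 x 3 uniform smoothing, can be sketched as follows, assuming NumPy/SciPy (time frames are indexed from 1 in the weighting, matching the equations above):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def variance_image(frames, time_weighted=False):
    """Per-voxel variance of the uptake values over the N time frames,
    optionally weighted by 1/i so that early enhancement scores higher."""
    frames = np.asarray(frames, float)            # shape (N, z, y, x)
    n = frames.shape[0]
    dev2 = (frames - frames.mean(axis=0))**2      # [F(r,i) - <F(r)>]^2
    if time_weighted:
        weights = 1.0 / np.arange(1, n + 1)       # 1/i, i = 1..N
        dev2 = dev2 * weights[:, None, None, None]
    p = dev2.sum(axis=0) / (n - 1)
    # 3x3x3 uniform smoothing for the subsequent stages.
    return uniform_filter(p, size=3)
```

Voxels whose values climb rapidly in the early postcontrast frames receive higher enhanced values under the time-weighted variant, matching the behaviour described above.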
In step S6 a bounding sphere for roughly limiting the subsequent volume growing of the tumor is determined as follows. The mean of the values of the voxels that intersect the surface of a sphere, which is overlaid on the enhanced image data (generated in step SS) and centered in the seed point, is computed as a function of the radius of the sphere. This function, after application of low-pass filtering to capture the low frequency trend, shall be referred to as the expansion function. Figure 8(b) shows an example of the expansion function for a malignant lesion with a necrotic core. The radius Rbo~"a corresponds to the radius of the sphere that closely encompasses the uptake region of the lesion.
This sphere shall be referred to as the bounding sphere. Radius R~"~a corresponds to the first point where the derivative of the expansion function reaches zero after crossing the global maximum. As shown in Figure 8(a), if the center of the tumor is necrotic, the function will usually rise and then decrease, indicating slower uptake of the contrast agent in the middle of the tumor. If the center of the tumor contains live cells, then the function will start high and level for a _g_ WO 00/1b696 PCTNS99/20862 short distance, and then decrease, indicating a faster uptake of the contrast agent in the middle of the tumor. The inner, middle, and outer circles in Figure 8(a) correspond to radii R,, R2, and Rbo"~a of the sphere, respectively. The radius R~~~a.provides a conservative estimate of the extent of the tumor. Further, R,~~~a could be used to analyze tumor growth/shrinkage in a patient over time. A more detailed estimate of the tumor extent is obtained by additionally performing steps S7-S9 described below.
In step S7, the limiting search function for eventual volume growing is obtained as follows. A binary image representation of the bounding sphere is initially obtained. As an option, a smoothed version of the binary image representation of the bounding sphere is obtained by means of a triangular smoothing filter. The triangular smoothing filter is implemented as the convolution of two cubes, resulting in the a function that includes the bounding sphere (white) and a gray area surrounding the bounding sphere (gray). Figure 9(b) illustrates the constraint function when determined from the bounding sphere by convolution with a triangular smoothing filter (i.e., convolution with two cubes).
In step S8, details outside the bounding sphere are suppressed by multiplication of the variance-processed volume (i.e., the variance image determined in step 5 and shown in Figure 9(a)) with the limiting search function (Figure 9(b)). The resulting volume is referred to as the target volume, since this volume may be sufficient for radiation treatment planning or surgical treatment planning. Figure 9(c) illustrates the target volume obtained by the multiplication (voxel by voxel) of the data from Figure 9(a) and Figure 9(b).
The target volume defines the limits of search for the final volume growing in which a more detailed quantification of the tumor surface is obtained (see step S9 below). The multiplication used to determine the target volume also suppresses spurious details outside the tumor. Since the triangle smoothing was used on the binary representation of the bounding sphere, the target volume represents the area of the variance image corresponding to the bounding sphere and the area of the variance image corresponding to the area within the vicinity of the sphere weighted according to the triangular smoothing function. Note that if the triangular smoothing filter had not been used on the binary image representation of the bounding sphere, volume data beyond Rbo~~a (centered on the seed point) would just be excluded from subsequent analyses.
In Step S9 volume growing is performed. A gray value threshold to terminate volume growing is determined by maximizing the interclass variance [Otsu, 1997]
between lesion and background voxels in the volume of interest encompassed by the bounding sphere, i.e., in the target volume. That is, given that there are two classes (lesion voxels and background voxels), the gray level threshold is set at the gray level that separates the classes such that the variance between the classes is maximized, based on a gray level histogram of all of the voxels in the volume of interest.
The gray level to be used in the subsequent volume growing is the level where the interclass voxel variance within Rbo,~d is maximized at a given gray level threshold. Next, the user indicated seed point is shifted to the nearest point in 3-D that has a voxel value larger or equal to the global maximum in the expansion function. The expansion function (shown in Figure 10(c) indicates the mean voxel value along the surface as a function of radius from the initial seed point. The nearest voxel to the initial seed point having a voxel value (i.e., gray value) of that corresponding to that at the radius Rmax becomes the new seed point. This step is required to minimize the effect of inter- and intra-observer variations in indication of the seed point and to allow successful volume growing of lesions with seed points placed in necrotic centers (so that volume growing does not begin in the necrotic center). Figure 10{a) illustrates the tumor with the original seed point, and Figure 10(b) illustrates the tumor with the seed point shifted in the x, y, and z directions. The modified seed point is determined by shifting the seed point to the voxel with value greater or equal to F(R",~) at distance Rmax (in three-dimensions) from the initial seed point. Note that due to the shift in z-direction (i.e., to another cross section), the necrotic center is not seen. Next, the extent of the tumor is obtained by 6-connected volume growing in the target volume, based on the gray level threshold determined in the earlier interclass variance calculation (between background and tumor) and starting from the adjusted seed point. Six-connected volume growing proceeds as follows: starting at the seed point, determine whether any of the six voxels on the surface of the seed point voxel are above the gray level threshold; connect each voxel on the surface of the seed point that has a gray level higher than the gray level threshold;
repeat for each connected voxel. Then, in step S 10 the extent of the tumor determined in step S9 is output visually, as shown in Figure 11 (a).
Evaluation To serve as reference in the evaluation of the computerized segmentation method, the lesions were also manually segmented in each slice by a well-trained MR
radiologist. This segmentation was performed in the subtraction images (postcontrast -precontrast) by S indicating the enhanced tumor area in each slice that intersected the lesion. All available subtraction images were used for this purpose. As additional reference, the radiologist used the original (non-subtracted) MR images, and complementary information from other modalities in'case of doubt.
Figure 11(a) is a computer output image indicating the grown volume using the time-weighted variance processing. Figure 11(b) is an image showing the "truth" as indicated by an experienced MR radiologist. The agreement between the results of the computerized segmentation and the well-trained MR radiologist was objectively evaluated by overlap measures and by metric measures. The overlap measure is defined as:
(A ∩ B)/(A ∪ B), where A in this study denotes the set of voxels in the result of the computerized segmentation determination of tumor extent, and B denotes the set of voxels in the result of the reference segmentation (the segmentation performed by the radiologist). Figure 12(a) shows the performance of the variance and the time-weighted variance processing on assessing tumor extent in terms of overlap with the radiologist's assessment of the tumor extent. The mean overlap in 3-D between the results of the computerized segmentation and the results of the radiologist was found to be 0.66 ± 0.14 (standard deviation). The overlap measure has a tendency to penalize deviations in contours extracted from small lesions more than it penalizes similar deviations in contours extracted from larger lesions. This problem was not observed in the metric measures of similarity, as shown below.
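For illustration, the overlap measure (A ∩ B)/(A ∪ B) can be computed directly from two boolean segmentation masks. This sketch assumes NumPy arrays and is not part of the patent's disclosed implementation:

```python
import numpy as np


def overlap_measure(a, b):
    """Overlap |A ∩ B| / |A ∪ B| between two boolean segmentation masks."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 0.0  # both masks empty: define overlap as zero
    return float(np.logical_and(a, b).sum() / union)
```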
The metric measures of performance are based on the nearest distances in 3-D between the contour points of the computerized segmentation and those of the reference segmentation. Figure 12(b) shows the performance of the variance and time-weighted variance processing on assessing tumor extent in terms of rms distances from the radiologist's assessment of tumor extent. The significance of the difference between the results obtained by the computerized method and by the well-trained radiologist was calculated by a two-tailed Student's t-test at a 95% confidence level, given the hypothesized mean deviation of zero between the positions of the nearest contour points in 3-D. The mean of the rms distances in 3-D between the nearest contour points per lesion was found to be 1.3 mm ± 1.0 mm (standard deviation). The method showed no significant difference in performance compared with the well-trained radiologist (p>0.5).
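As a sketch (not the patent's implementation), the rms nearest-distance measure between two contour point sets can be computed by brute force; `a_pts` and `b_pts` are hypothetical N×3 arrays of contour coordinates in millimeters:

```python
import numpy as np


def rms_nearest_distance(a_pts, b_pts):
    """RMS of each contour point's nearest 3-D distance to the other
    contour, taken symmetrically over both point sets."""
    a = np.asarray(a_pts, dtype=float)
    b = np.asarray(b_pts, dtype=float)
    # pairwise distance matrix, shape (len(a), len(b))
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    # nearest distance from each a-point to b, and from each b-point to a
    nearest = np.concatenate([d.min(axis=1), d.min(axis=0)])
    return float(np.sqrt(np.mean(nearest ** 2)))
```

The pairwise matrix is O(N·M) in memory; for the contour sizes involved here that is unproblematic, but a k-d tree would scale better.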
Discussion

The present invention provides a method for consistent, computerized segmentation of breast lesions and quantitation of tumor extent in 3-D from contrast-enhanced MR data.
The method requires selection of only a single point inside the lesion, which can be done by either human or computer. Inhomogeneity of the magnetic field may cause low frequency variations in image intensity in the MR data, in particular near the borders of the breast. To minimize the effect of these variations on the result of the segmentation, the stop criterion for volume growing is computed only in a volume of interest encompassed by the bounding sphere. Patient motion during acquisition of the MR data may cause artifacts in the enhancement. The average patient motion in this study is estimated to be 2 mm.
Since the pixel size is 1.25 mm, the motion mostly caused additional blurring rather than actual displacement of image structures. Image registration to align the breast volumes in the images obtained at the different time frames may further improve the accuracy of the performance.
It should be noted that although the method is presented on 3-D image data sets, the variance processing can be performed on 2-D image data sets obtained over time. Such images (slices) are obtained from the MR images of the breast.
It should also be noted that although the method is presented for MR images of the breast, the application can be performed on MR images of tumors elsewhere in the body. The uptake of the contrast agent would still be increased by the increased vascularity of the tumor.
It should also be noted that although the method is presented for MR images, the techniques can be applied to other 2-D or 3-D image sets acquired over time, thus allowing the variance processing to be computed on such data.
It should also be noted that although the above discussion relates primarily to the initial identification of a seed point for the volume growing and the assessment of tumor extent, the presented method can also be used for the computerized detection of lesions (tumors) within volume images of the breast. This detection is performed by iteratively performing the volume growing and subsequent assessment of tumor extent at all regions of high uptake that result from the variance or time-weighted variance enhancement. As shown in Figure 13, regions exhibiting high uptake are identified in step S5, and in step S12 tumors (lesions) are detected.
To detect tumors, thresholding is performed on the variance image data to detect one or more seed points where volume growing will be initiated. The seed points correspond to detected tumors. The thresholding may be performed by identifying voxels with a higher value than that of other voxels. Alternatively, seed point voxels can be determined by identifying the center voxel (i.e., the centroid) in groups (clusters) of contiguous voxels having values above the threshold value. Volume growing (described above) may be initiated from one or more of the identified seed points to determine the extent of the identified tumors. In the event that a seed point does not have a value greater than the threshold, volume growing can start from the voxel closest to the seed point that has a value exceeding the threshold used to terminate volume growing in the volume growing target region.
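One way to realize the cluster-centroid seed detection described above is connected-component labeling of the thresholded variance image. The use of `scipy.ndimage` here is an assumption for illustration; the patent does not prescribe a particular library:

```python
import numpy as np
from scipy import ndimage


def detect_seed_points(variance_img, threshold):
    """Return one seed point per cluster of contiguous voxels whose
    variance value exceeds the threshold (cluster centroid, rounded
    to the nearest voxel)."""
    mask = variance_img > threshold
    # label contiguous clusters (6-connected by default in 3-D)
    labels, n = ndimage.label(mask)
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    return [tuple(int(round(c)) for c in centroid) for centroid in centroids]
```

Each returned tuple can then be handed to the volume growing step as a candidate seed; if the rounded centroid itself falls below the threshold (possible for ring-shaped clusters), the nearest supra-threshold voxel would be used instead, as described above.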
Figure 14 is a block diagram of a system for the automated assessment of tumor extent in MR images of the breast. The system of Figure 14 is configured to perform any of Steps S1 through S12, as described above in conjunction with Figures 1 and 13. An image acquisition unit 2 acquires breast images and converts the breast images into digital images.
The image acquisition unit 2 may also acquire digital images, eliminating the need to convert the images into a digital format. A breast volume segmentation unit 6 performs breast volume segmentation on the images obtained by the image acquisition unit, and a border removal unit 8 performs breast border removal based on data generated by the volume segmentation unit 6. A lesion enhancement unit 10 performs lesion enhancement for image data generated by the volume segmentation unit 6. A bounding unit 12 determines a bounding sphere based on the image data generated by the lesion enhancement unit 10, and a three-dimensional search volume unit 14 determines a three-dimensional search volume within the image data generated by the bounding unit 12. A suppression unit 16 suppresses surrounding structures in the image data generated by the three-dimensional search unit. A
volume growing unit 18 performs volume growing based on the image data generated by the suppression unit 16, and an output unit 20 generates a final set of image data based on the image data generated by the volume growing unit 18. The output unit may also generate additional data, including statistical data for any of the image data acquired, or generated by, the system, information representative of whether tumors were detected, and/or information representative of the extent of the tumors. As noted above, the lesion location in terms of estimated center is input from either a human or computer, and is referred to as the seed point. A three-dimensional display unit 4 generates image data corresponding to a three-dimensional representation of the breast images acquired by the image acquisition unit 2 and displays data, including two- and three-dimensional digital images generated by any element of the system, including the image acquisition unit 2, a breast volume segmentation unit 6, a border removal unit 8, a lesion enhancement unit 10, a bounding unit 12, a three-dimensional search volume unit 14, a suppression unit 16, a volume growing unit 18, and an output unit 20.
Figure 15 is a schematic illustration of a computer system for the assessment of tumor extent in magnetic resonance images. A computer 100 implements the method of the present invention, wherein the computer housing 102 houses a motherboard 104 which contains a CPU 106, memory 108 (e.g., DRAM, ROM, EPROM, EEPROM, SRAM, SDRAM, and Flash RAM), and other optional special purpose logic devices (e.g., ASICs) or configurable logic devices (e.g., GAL and reprogrammable FPGA). The computer 100 also includes plural input devices (e.g., a keyboard 122 and mouse 124), and a display card 110 for controlling monitor 120. In addition, the computer system 100 further includes a floppy disk drive 114;
other removable media devices (e.g., compact disc 119, tape, and removable magneto-optical media (not shown)); and a hard disk 112, or other fixed, high density media drives, connected using an appropriate device bus (e.g., a SCSI bus, an Enhanced IDE bus, or an Ultra DMA
bus). Also connected to the same device bus or another device bus, the computer 100 may additionally include a compact disc reader 118, a compact disc reader/writer unit (not shown) or a compact disc jukebox (not shown). Although compact disc 119 is shown in a CD caddy, the compact disc 119 can be inserted directly into CD-ROM drives which do not require caddies.
As stated above, the system includes at least one computer readable medium.
Examples of computer readable media are compact discs 119, hard disks 112, floppy disks, tape, magneto-optical disks, PROMs (EPROM, EEPROM, Flash EPROM), DRAM, SRAM, SDRAM, etc. Stored on any one or on a combination of computer readable media, the present invention includes software for controlling both the hardware of the computer 100 and for enabling the computer 100 to interact with a human user. Such software may include, but is not limited to, device drivers, operating systems and user applications, such as development tools. Such computer readable media further includes the computer program product of the present invention for performing any of steps S1 through S12, described above. The computer code devices of the present invention can be any interpreted or executable code mechanism, including but not limited to scripts, interpreters, dynamic link libraries, Java classes, and complete executable programs.
Obviously, numerous modifications and variations of the present invention are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.
APPENDIX
References:
Gribbestad, I.S., Nilsen, G., Fjøsne, H., Fougner, R., Haugen, O.A., Petersen, S.B., Rinck, P.A., and Kvinnsland, S., Contrast-enhanced magnetic resonance imaging of the breast, Acta Oncologica 8, 1992, 833-842.
Boetes, C., Mus, R.D.M., Holland, R., Barentsz, J.O., Strijk, S.P., Wobbes, T., Hendriks, J.H.C.L., and Ruys, S.H.J., Breast tumors: comparative accuracy of MR imaging relative to mammography and US for demonstrating extent, Radiology 197, 1995, 743-747.
Davis, P.L., Staiger, M.J., Harris, K.B., Gannot, M.A., Klementaviciene, J., McCarty K.S.
Jr., and Tobon H., Breast cancer measurements with magnetic resonance imaging, ultrasonography, and mammography, Breast Cancer Res. Treat. 37, 1996, 1-9.
Mumtaz, H., Hall-Craggs, M.A., Davidson, T., Walmsley, K., Thurell, W., Kissin, M.W., and Taylor, I., Staging of primary breast cancer with MR imaging, AJR 169, 1997, 417-424.
Kupinski, M., and Giger, M., Automated seeded lesion segmentation on digital mammograms, IEEE Trans. Med. Imaging, in press, 1998.
Lucas-Quesada, F.A., Sinha, U., and Sinha S., Segmentation strategies for breast tumors from dynamic MR images, J. Magn. Reson. Imaging 6, 1996, 753-763.
Gilhuijs, K.G.A., Giger, M., and Bick, U., Computerized analysis of breast lesions in three dimensions using dynamic magnetic resonance imaging, Med. Phys. 25, 1998, 1647-1654.
Otsu, N., A threshold selection method from gray-level histograms, IEEE Transactions on Systems, Man, and Cybernetics 9, 1979, 62-66.
Claims (89)
1. A method for assessing tumor extent in medical temporally acquired images, comprising:
obtaining image data corresponding to temporally acquired images including a tumor and surrounding anatomy; and performing variance processing with respect to time on the obtained image data to derive variance image data defining a variance image indicative of variation of voxels in said temporally acquired medical images over time.
2. The method of Claim 1, wherein the obtaining step comprises:
obtaining magnetic resonance (MR) image data representing temporal progression of a contrast agent in said tumor and surrounding anatomy.
3. The method of Claim 1, further comprising:
performing extent processing on the variance image data to determine the extent of the tumor.
4. The method of Claim 3, wherein:
said variance processing step comprises weighting each voxel as a function of temporal change in value of the voxel due to contrast uptake to obtain said variance image data; and said extent processing step comprises, determining a mean value function based on the mean value of the values of voxels of the variance image data that intersect the surface of a sphere which has a center defined by a seed point, as a function of radius of the sphere, and determining a radius which closely encompasses an uptake region of said tumor based on the mean value function.
5. The method of Claim 4, wherein said extent processing step comprises:
low pass filtering said mean value function to produce an expansion function.
6. The method of Claim 5, where said extent processing step comprises determining said radius which closely encompasses said uptake region of said tumor based on the expansion function.
7. The method of Claim 6, wherein said extent processing step comprises:
determining said radius which closely encompasses said uptake region of said tumor as that radius corresponding to the first point in said expansion function after a global maximum at which the derivative of the expansion function reaches zero.
8. The method of Claim 4, wherein said extent processing comprises:
producing a binary image representative of a bounding sphere defined by said radius which closely encompasses said uptake region of said tumor; and multiplying said variance image with data derived from said binary image to suppress data outside said bounding sphere and to identify a volume growing target region.
9. The method of Claim 4, wherein said extent processing comprises:
producing a binary image representative of a bounding sphere defined by said radius which closely encompasses said uptake region of said tumor;
smoothing said binary image to obtain a smoothed binary image; and multiplying said variance image with data derived from said smoothed binary image.
10. The method of Claim 8, wherein said extent processing comprises:
determining a threshold to terminate volume growing in said volume growing target region;
starting from a predetermined voxel having a value exceeding said threshold, performing volume growing processing by comparing neighboring voxels to said threshold, and adding to a region growing volume those neighboring voxels bearing a predetermined relationship to said threshold in dependence on the result of the comparison; and performing further iterative comparing, in which for each voxel included in said region growing volume, voxels neighboring thereto are compared to said threshold and included in said region growing volume in dependence on the result of the comparison.
11. The method of Claim 10, further comprising:
identifying said voxels included in said region growing volume as being tumor voxels.
12. The method of Claim 1, wherein said step of obtaining image data comprises:
obtaining image data including a breast of a subject; and segmenting the breast volume from the rest of the obtained image data.
13. The method of Claim 12, wherein said step of segmenting comprises:
performing 3-dimensional morphological processing to remove small gaps and spikes from the obtained image data;
determining a global histogram of voxel values from the image data after performing 3-dimensional morphological processing;
determining a threshold from the global histogram by maximization of the inter-class variance between peaks in said global histogram; and defining the breast volume based on those voxels having a value exceeding the threshold determined from the global histogram.
14. The method of Claim 13, wherein said step of defining a breast volume comprises:
performing a sequence of contour tracing and region filling steps to remove gaps in the voxels having a value exceeding the threshold determined from the global histogram.
15. The method of Claim 12, further comprising:
removing a border of the segmented breast volume by morphological erosion.
16. The method of Claim 14, further comprising:
removing a border of the segmented breast volume by morphological erosion.
17. The method of Claim 1, further comprising:
performing thresholding on the variance image data to detect potential tumor sites.
18. The method of Claim 4, further comprising:
performing thresholding on the variance image data to detect said seed point.
19. The method of Claim 18, wherein said step of performing thresholding comprises:
identifying voxels with a higher value than that of other voxels and using at least one of the identified voxels as said seed point.
20. The method of Claim 19, wherein said step of performing thresholding comprises:
detecting plural clusters of voxels with a higher value than that of other voxels, and using a voxel at a center of said plural clusters of voxels as said seed point.
21. The method of Claim 10, further comprising:
performing thresholding on the variance image data to detect said seed point.
22. The method of Claim 21, wherein said step of performing thresholding comprises:
identifying voxels with a higher value than that of other voxels and using at least one of the identified voxels as said seed point.
23. The method of Claim 22, wherein said step of performing thresholding comprises:
detecting plural clusters of voxels with a higher value than that of other voxels, and using a voxel at a center of said plural clusters of voxels as said seed point.
24. The method of Claim 22, wherein said step of starting from a predetermined voxel having a value exceeding said threshold comprises starting from said seed point when said seed point has a value equal to or exceeding the threshold to terminate volume growing in said volume growing target region.
25. The method of Claim 23, wherein said step of starting from a predetermined voxel having a value exceeding said threshold comprises starting from said seed point when said seed point has a value equal to or exceeding the threshold to terminate volume growing in said volume growing target region.
26. The method of Claim 21, wherein said step of starting from a predetermined voxel having a value exceeding said threshold comprises starting from that voxel closest to said seed point and having a value exceeding said threshold to terminate volume growing in said volume growing target region.
27. The method of Claim 11, further comprising:
performing thresholding on the variance image data to detect said seed point.
28. The method of Claim 27, wherein said step of performing thresholding comprises:
identifying voxels with a higher value than that of other voxels and using at least one of the identified voxels as said seed point.
29. The method of Claim 28, wherein said step of performing thresholding comprises:
detecting plural clusters of voxels with a higher value than that of other voxels, and using a voxel at a center of said plural clusters of voxels as said seed point.
30. (Deleted)
31. (Deleted)
32. A computer readable medium storing computer instructions for assessing tumor extent in medical temporally acquired images, by performing the steps of:
obtaining image data corresponding to temporally acquired images including a tumor and surrounding anatomy; and performing variance processing with respect to time on the obtained image data to derive variance image data defining a variance image indicative of variation of voxels in said temporally acquired medical images over time.
33. The computer readable medium of Claim 32, wherein the obtaining step comprises:
obtaining magnetic resonance (MR) image data representing temporal progression of a contrast agent in said tumor and surrounding anatomy.
34. The computer readable medium of Claim 32, further comprising:
performing extent processing on the variance image data to determine the extent of the tumor.
35. The computer readable medium of Claim 34, wherein:
said variance processing step comprises weighting each voxel as a function of temporal change in value of the voxel due to contrast uptake to obtain said variance image data; and said extent processing step comprises, determining a mean value function based on the mean value of the values of voxels of the variance image data that intersect the surface of a sphere which has a center defined by a seed point, as a function of radius of the sphere, and determining a radius which closely encompasses an uptake region of said tumor based on the mean value function.
36. The computer readable medium of Claim 35, wherein said extent processing step comprises:
low pass filtering said mean value function to produce an expansion function.
37. The computer readable medium of Claim 36, where said extent processing step comprises determining said radius which closely encompasses said uptake region of said tumor based on the expansion function.
38. The computer readable medium of Claim 37, wherein said extent processing step comprises:
determining said radius which closely encompasses said uptake region of said tumor as that radius corresponding to the first point in said expansion function after a global maximum at which the derivative of the expansion function reaches zero.
39. The computer readable medium of Claim 35, wherein said extent processing comprises:
producing a binary image representative of a bounding sphere defined by said radius which closely encompasses said uptake region of said tumor; and multiplying said variance image with data derived from said binary image to suppress data outside said bounding sphere and to identify a volume growing target region.
40. The computer readable medium of Claim 35, wherein said extent processing comprises:
producing a binary image representative of a bounding sphere defined by said radius which closely encompasses said uptake region of said tumor;
smoothing said binary image to obtain a smoothed binary image; and multiplying said variance image with data derived from said smoothed binary image.
41. The computer readable medium of Claim 39, wherein said extent processing comprises:
determining a threshold to terminate volume growing in said volume growing target region;
starting from a predetermined voxel having a value exceeding said threshold, performing volume growing processing by comparing neighboring voxels to said threshold, and adding to a region growing volume those neighboring voxels bearing a predetermined relationship to said threshold in dependence on the result of the comparison; and performing further iterative comparing, in which for each voxel included in said region growing volume, voxels neighboring thereto are compared to said threshold and included in said region growing volume in dependence on the result of the comparison.
42. The computer readable medium of Claim 41, further comprising:
identifying said voxels included in said region growing volume as being tumor voxels.
43. The computer readable medium of Claim 32, wherein said step of obtaining image data comprises:
obtaining image data including a breast of a subject; and segmenting the breast volume from the rest of the obtained image data.
44. The computer readable medium of Claim 43, wherein said step of segmenting comprises:
performing 3-dimensional morphological processing to remove small gaps and spikes from the obtained image data;
determining a global histogram of voxel values from the image data after performing 3-dimensional morphological processing;
determining a threshold from the global histogram by maximization of the inter-class variance between peaks in said global histogram; and defining the breast volume based on those voxels having a value exceeding the threshold determined from the global histogram.
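Maximizing the inter-class variance between histogram peaks, as in claim 44, is the classic Otsu criterion. A NumPy-only sketch (the `otsu_threshold` name and bin count are illustrative assumptions):

```python
import numpy as np

def otsu_threshold(voxels, bins=256):
    """Pick the threshold that maximizes the between-class variance
    of the global voxel-value histogram (Otsu's criterion)."""
    hist, edges = np.histogram(voxels, bins=bins)
    p = hist.astype(float) / hist.sum()        # per-bin probability
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                          # weight of lower class
    w1 = 1.0 - w0                              # weight of upper class
    mu = np.cumsum(p * centers)                # cumulative mean
    mu_t = mu[-1]                              # global mean
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * w1)  # between-class variance
    sigma_b = np.nan_to_num(sigma_b)
    return centers[np.argmax(sigma_b)]
```

Voxels whose value exceeds the returned threshold would then define the breast volume.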
45. The computer readable medium of Claim 44, wherein said step of defining a breast volume comprises:
performing a sequence of contour tracing and region filling steps to remove gaps in the voxels having a value exceeding the threshold determined from the global histogram.
46. The computer readable medium of Claim 43, further comprising:
removing a border of the segmented breast volume by morphological erosion.
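The border removal of claim 46 peels surface voxels off the binary breast mask. A minimal 6-connected binary erosion in plain NumPy (`scipy.ndimage.binary_erosion` offers the same operation more generally; the function name here is illustrative):

```python
import numpy as np

def erode_border(mask, iterations=1):
    """Morphological erosion of a 3-D binary volume: a voxel survives
    one step only if all six face-neighbors are also inside the volume,
    so each iteration strips one voxel layer from the border."""
    m = mask.astype(bool)
    for _ in range(iterations):
        padded = np.pad(m, 1, constant_values=False)
        m = (padded[1:-1, 1:-1, 1:-1]
             & padded[:-2, 1:-1, 1:-1] & padded[2:, 1:-1, 1:-1]
             & padded[1:-1, :-2, 1:-1] & padded[1:-1, 2:, 1:-1]
             & padded[1:-1, 1:-1, :-2] & padded[1:-1, 1:-1, 2:])
    return m
```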
47. The computer readable medium of Claim 45, further comprising:
removing a border of the segmented breast volume by morphological erosion.
48. The computer readable medium of Claim 32, further comprising:
performing thresholding on the variance image data to detect potential tumor sites.
49. The computer readable medium of Claim 35, further comprising:
performing thresholding on the variance image data to detect said seed point.
50. The computer readable medium of Claim 49, wherein said step of performing thresholding comprises:
identifying voxels with a higher value than that of other voxels, and using at least one of the identified voxels as said seed point.
51. The computer readable medium of Claim 50, wherein said step of performing thresholding comprises:
detecting plural clusters of voxels with a higher value than that of other voxels, and using a voxel at a center of said plural clusters of voxels as said seed point.
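One way to realize the seed selection of claim 51 is to threshold the variance image and center the seed on the surviving high-value voxels. A simplified sketch, assuming NumPy arrays; a fuller implementation would label individual clusters first, whereas this version takes the rounded centroid of all above-threshold voxels (`seed_from_variance` is an illustrative name):

```python
import numpy as np

def seed_from_variance(variance, threshold):
    """Return a voxel at the center of the high-variance voxels,
    for use as the seed point of subsequent volume growing."""
    coords = np.argwhere(variance > threshold)
    if coords.size == 0:
        return None                      # no candidate tumor site
    return tuple(np.round(coords.mean(axis=0)).astype(int))
```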
52. The computer readable medium of Claim 41, further comprising:
performing thresholding on the variance image data to detect said seed point.
53. The computer readable medium of Claim 52, wherein said step of performing thresholding comprises:
identifying voxels with a higher value than that of other voxels, and using at least one of the identified voxels as said seed point.
54. The computer readable medium of Claim 53, wherein said step of performing thresholding comprises:
detecting plural clusters of voxels with a higher value than that of other voxels, and using a voxel at a center of said plural clusters of voxels as said seed point.
55. The computer readable medium of Claim 53, wherein said step of starting from a predetermined voxel having a value exceeding said threshold comprises starting from said seed point when said seed point has a value equal to or exceeding the threshold to terminate volume growing in said volume growing target region.
56. The computer readable medium of Claim 54, wherein said step of starting from a predetermined voxel having a value exceeding said threshold comprises starting from said seed point when said seed point has a value equal to or exceeding the threshold to terminate volume growing in said volume growing target region.
57. The computer readable medium of Claim 52, wherein said step of starting from a predetermined voxel having a value exceeding said threshold comprises starting from that voxel closest to said seed point and having a value exceeding said threshold to terminate volume growing in said volume growing target region.
58. The computer readable medium of Claim 42, further comprising:
performing thresholding on the variance image data to detect said seed point.
59. The computer readable medium of Claim 58, wherein said step of performing thresholding comprises:
identifying voxels with a higher value than that of other voxels and using at least one of the identified voxels as said seed point.
60. The computer readable medium of Claim 59, wherein said step of performing thresholding comprises:
detecting plural clusters of voxels with a higher value than that of other voxels, and using a voxel at a center of said plural clusters of voxels as said seed point.
61. A system for assessing tumor extent in medical temporally acquired images, comprising:
a mechanism configured to obtain image data corresponding to temporally acquired images including a tumor and surrounding anatomy; and a mechanism configured to perform variance processing with respect to time on the obtained image data to derive variance image data defining a variance image indicative of variation of voxels in said temporally acquired medical images over time.
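The variance processing of claim 61 weights each voxel by its variation over the temporal image series, so voxels that change strongly (for instance from contrast uptake) come out bright. A minimal sketch, assuming the co-registered series is a NumPy array of shape (time, z, y, x):

```python
import numpy as np

def variance_image(series):
    """Per-voxel variance over time of a temporally acquired series:
    the time axis is axis 0, so the result is a single 3-D volume in
    which high values indicate strong temporal variation."""
    return series.var(axis=0)
```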
62. The system of Claim 61, wherein the mechanism configured to obtain image data comprises:
a mechanism configured to obtain magnetic resonance (MR) image data representing temporal progression of a contrast agent in said tumor and surrounding anatomy.
63. The system of Claim 61, further comprising:
a mechanism configured to perform extent processing on the variance image data to determine the extent of the tumor.
64. The system of Claim 63, wherein:
said mechanism configured to perform said variance processing comprises weighting each voxel as a function of temporal change in value of the voxel due to contrast uptake to obtain said variance image data; and said mechanism configured to perform extent processing comprises, a mechanism configured to determine a mean value function based on the mean value of the values of voxels of the variance image data that intersect the surface of a sphere which has a center defined by a seed point, as a function of radius of the sphere, and a mechanism configured to determine a radius which closely encompasses an uptake region of said tumor based on the mean value function.
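The mean value function of claim 64 averages the variance image over the surface of a sphere centered at the seed point, as a function of radius; claims 65-67 then low-pass filter this function and pick the bounding radius from it. A sketch of the mean value function, assuming NumPy and approximating each sphere surface by a one-voxel-thick shell (`shell_mean_function` is an illustrative name):

```python
import numpy as np

def shell_mean_function(variance, seed, max_radius):
    """Mean variance-image value on the surface of a sphere centered
    at `seed`, evaluated at integer radii 1..max_radius."""
    grids = np.indices(variance.shape)
    dist = np.sqrt(sum((g - s) ** 2 for g, s in zip(grids, seed)))
    means = []
    for r in range(1, max_radius + 1):
        shell = (dist >= r - 0.5) & (dist < r + 0.5)  # ~sphere surface
        means.append(variance[shell].mean() if shell.any() else 0.0)
    return np.array(means)
```

As the radius grows past the uptake region the mean drops toward zero, which is what makes the bounding radius detectable from this function.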
65. The system of Claim 64, wherein said mechanism configured to perform extent processing comprises:
a mechanism configured to low pass filter said mean value function to produce an expansion function.
66. The system of Claim 65, wherein said mechanism configured to perform extent processing comprises:
a mechanism configured to determine said radius which closely encompasses said uptake region of said tumor based on the expansion function.
67. The system of Claim 66, wherein said mechanism configured to perform extent processing comprises:
a mechanism configured to determine said radius which closely encompasses said uptake region of said tumor as that radius corresponding to the first point in said expansion function after a global maximum at which the derivative of the expansion function reaches zero.
68. The system of Claim 64, wherein said mechanism configured to perform extent processing comprises:
a mechanism configured to produce a binary image representative of a bounding sphere defined by said radius which closely encompasses said uptake region of said tumor; and a mechanism configured to multiply said variance image with data derived from said binary image to suppress data outside said bounding sphere and to identify a volume growing target region.
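The masking step of claim 68 can be sketched directly: build a binary bounding-sphere image and multiply it into the variance image, suppressing everything outside the sphere; what remains is the volume-growing target region. NumPy is assumed, and `apply_bounding_sphere` is an illustrative name:

```python
import numpy as np

def apply_bounding_sphere(variance, seed, radius):
    """Suppress variance-image data outside a bounding sphere of the
    given radius centered at `seed` by multiplying with a binary mask."""
    grids = np.indices(variance.shape)
    dist2 = sum((g - s) ** 2 for g, s in zip(grids, seed))
    mask = (dist2 <= radius ** 2).astype(variance.dtype)
    return variance * mask
```

Claim 69 differs only in smoothing the binary mask before the multiplication, which softens the sphere boundary instead of cutting it off sharply.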
69. The system of Claim 64, wherein said mechanism configured to perform extent processing comprises:
a mechanism configured to produce a binary image representative of a bounding sphere defined by said radius which closely encompasses said uptake region of said tumor;
a mechanism configured to smooth said binary image to obtain a smoothed binary image;
and a mechanism configured to multiply said variance image with data derived from said smoothed binary image.
70. The system of Claim 68, wherein said mechanism configured to perform extent processing comprises:
a mechanism configured to determine a threshold to terminate volume growing in said volume growing target region;
a mechanism configured to perform volume growing processing, starting from a predetermined voxel having a value exceeding said threshold, by comparing neighboring voxels to said threshold, and adding to a region growing volume those neighboring voxels bearing a predetermined relationship to said threshold in dependence on the result of the comparison; and a mechanism configured to perform further iterative comparing, in which for each voxel included in said region growing volume, voxels neighboring thereto are compared to said threshold and included in said region growing volume in dependence on the result of the comparison.
71. The system of Claim 70, further comprising:
a mechanism configured to identify said voxels included in said region growing volume as being tumor voxels.
72. The system of Claim 61, wherein said mechanism configured to obtain image data comprises:
a mechanism configured to obtain image data including a breast of a subject;
and a mechanism configured to segment the breast volume from the rest of the obtained image data.
73. The system of Claim 72, wherein said mechanism configured to segment comprises:
a mechanism configured to perform 3-dimensional morphological processing to remove small gaps and spikes from the obtained image data;
a mechanism configured to determine a global histogram of voxel values from image data received from the mechanism configured to perform 3-dimensional morphological processing;
a mechanism configured to determine a threshold from the global histogram by maximization of the inter-class variance between peaks in said global histogram; and a mechanism configured to define the breast volume based on those voxels having a value exceeding the threshold determined from the global histogram.
74. The system of Claim 73, wherein said mechanism configured to define a breast volume comprises:
a mechanism configured to perform a sequence of contour tracing and region filling steps to remove gaps in the voxels having a value exceeding the threshold determined from the global histogram.
75. The system of Claim 72, further comprising:
a mechanism configured to remove a border of the segmented breast volume by morphological erosion.
76. The system of Claim 74, further comprising:
a mechanism configured to remove a border of the segmented breast volume by morphological erosion.
77. The system of Claim 61, further comprising:
a mechanism configured to perform thresholding on the variance image data to detect potential tumor sites.
78. The system of Claim 64, further comprising:
a mechanism configured to perform thresholding on the variance image data to detect said seed point.
79. The system of Claim 78, wherein said mechanism configured to perform thresholding comprises:
a mechanism configured to identify voxels with a higher value than that of other voxels, and to use at least one of the identified voxels as said seed point.
80. The system of Claim 79, wherein said mechanism configured to perform thresholding comprises:
a mechanism configured to detect plural clusters of voxels with a higher value than that of other voxels, and to use a voxel at a center of said plural clusters of voxels as said seed point.
81. The system of Claim 70, further comprising:
a mechanism configured to perform thresholding on the variance image data to detect said seed point.
82. The system of Claim 81, wherein said mechanism configured to perform thresholding comprises:
a mechanism configured to identify voxels with a higher value than that of other voxels, and to use at least one of the identified voxels as said seed point.
83. The system of Claim 82, wherein said mechanism configured to perform thresholding comprises:
a mechanism configured to detect plural clusters of voxels with a higher value than that of other voxels, and to use a voxel at a center of said plural clusters of voxels as said seed point.
84. The system of Claim 82, wherein said mechanism configured to perform volume growing processing is further configured to start said volume growing processing from said seed point when said seed point has a value equal to or exceeding the threshold to terminate volume growing in said volume growing target region.
85. The system of Claim 83, wherein said mechanism configured to perform volume growing processing is further configured to start said volume growing processing from said seed point when said seed point has a value equal to or exceeding the threshold to terminate volume growing in said volume growing target region.
86. The system of Claim 81, wherein said mechanism configured to perform volume growing processing is further configured to start said volume growing processing from that voxel closest to said seed point and having a value exceeding said threshold to terminate volume growing in said volume growing target region.
87. The system of Claim 71, further comprising:
a mechanism configured to perform thresholding on the variance image data to detect said seed point.
88. The system of Claim 87, wherein said mechanism configured to perform thresholding comprises:
a mechanism configured to identify voxels with a higher value than that of other voxels, and to use at least one of the identified voxels as said seed point.
89. The system of Claim 88, wherein said mechanism configured to perform thresholding comprises:
a mechanism configured to detect plural clusters of voxels with a higher value than that of other voxels, and to use a voxel at a center of said plural clusters of voxels as said seed point.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/156,413 US6112112A (en) | 1998-09-18 | 1998-09-18 | Method and system for the assessment of tumor extent in magnetic resonance images |
US09/156,413 | 1998-09-18 | ||
PCT/US1999/020862 WO2000016696A1 (en) | 1998-09-18 | 1999-09-15 | Method and system for the assessment of tumor extent in magnetic resonance images |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2343825A1 true CA2343825A1 (en) | 2000-03-30 |
Family
ID=22559472
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002343825A Abandoned CA2343825A1 (en) | 1998-09-18 | 1999-09-15 | Method and system for the assessment of tumor extent in magnetic resonance images |
Country Status (6)
Country | Link |
---|---|
US (1) | US6112112A (en) |
EP (1) | EP1113752A4 (en) |
JP (1) | JP2002526190A (en) |
AU (1) | AU5919099A (en) |
CA (1) | CA2343825A1 (en) |
WO (1) | WO2000016696A1 (en) |
Families Citing this family (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6009212A (en) | 1996-07-10 | 1999-12-28 | Washington University | Method and apparatus for image registration |
US6611630B1 (en) | 1996-07-10 | 2003-08-26 | Washington University | Method and apparatus for automatic shape characterization |
CN1230120C (en) | 1997-05-23 | 2005-12-07 | 普罗里森姆股份有限公司 | MRI-guided therapeutic unit and method |
AU8586098A (en) | 1997-07-25 | 1999-02-16 | Arch Development Corporation | Method and system for the segmentation of lung regions in lateral chest radiographs |
US6633686B1 (en) | 1998-11-05 | 2003-10-14 | Washington University | Method and apparatus for image registration using large deformation diffeomorphisms on a sphere |
US6754374B1 (en) * | 1998-12-16 | 2004-06-22 | Surgical Navigation Technologies, Inc. | Method and apparatus for processing images with regions representing target objects |
US6694057B1 (en) | 1999-01-27 | 2004-02-17 | Washington University | Method and apparatus for processing images with curves |
KR20020057946A (en) * | 1999-07-29 | 2002-07-12 | 로퍼, 랜달 비. | Targeting multimeric imaging agents through multilocus binding |
US6941323B1 (en) | 1999-08-09 | 2005-09-06 | Almen Laboratories, Inc. | System and method for image comparison and retrieval by enhancing, defining, and parameterizing objects in images |
US6898303B2 (en) * | 2000-01-18 | 2005-05-24 | Arch Development Corporation | Method, system and computer readable medium for the two-dimensional and three-dimensional detection of lesions in computed tomography scans |
US6901156B2 (en) * | 2000-02-04 | 2005-05-31 | Arch Development Corporation | Method, system and computer readable medium for an intelligent search workstation for computer assisted interpretation of medical images |
US6772132B1 (en) * | 2000-03-02 | 2004-08-03 | Trading Technologies International, Inc. | Click based trading with intuitive grid display of market depth |
US7310437B2 (en) * | 2000-03-08 | 2007-12-18 | Fujifilm Corporation | Image processing method and system, and storage medium |
WO2002001242A2 (en) * | 2000-06-28 | 2002-01-03 | The Regents Of The University Of Minnesota | Imaging methods for visualizing implanted living cells |
AU2002239267A1 (en) * | 2000-11-24 | 2002-06-03 | U-Systems, Inc. | Diagnosis method and ultrasound information display system therefor |
US20020164070A1 (en) * | 2001-03-14 | 2002-11-07 | Kuhner Mark B. | Automatic algorithm generation |
US6859554B2 (en) * | 2001-04-04 | 2005-02-22 | Mitsubishi Electric Research Laboratories, Inc. | Method for segmenting multi-resolution video objects |
US7995825B2 (en) * | 2001-04-05 | 2011-08-09 | Mayo Foundation For Medical Education | Histogram segmentation of FLAIR images |
US8958654B1 (en) * | 2001-04-25 | 2015-02-17 | Lockheed Martin Corporation | Method and apparatus for enhancing three-dimensional imagery data |
US20030036083A1 (en) * | 2001-07-19 | 2003-02-20 | Jose Tamez-Pena | System and method for quantifying tissue structures and their change over time |
TWI221406B (en) * | 2001-07-30 | 2004-10-01 | Epix Medical Inc | Systems and methods for targeted magnetic resonance imaging of the vascular system |
US20030088177A1 (en) * | 2001-09-05 | 2003-05-08 | Virtualscopics, Llc | System and method for quantitative assessment of neurological diseases and the change over time of neurological diseases |
US20030095696A1 (en) * | 2001-09-14 | 2003-05-22 | Reeves Anthony P. | System, method and apparatus for small pulmonary nodule computer aided diagnosis from computed tomography scans |
CA2459557A1 (en) * | 2001-09-17 | 2003-03-27 | Virtualscopics, Llc | System and method for quantitative assessment of cancers and their change over time |
US20030072479A1 (en) * | 2001-09-17 | 2003-04-17 | Virtualscopics | System and method for quantitative assessment of cancers and their change over time |
US7123762B2 (en) * | 2002-02-08 | 2006-10-17 | University Of Chicago | Method and system for risk-modulated diagnosis of disease |
US20030161513A1 (en) * | 2002-02-22 | 2003-08-28 | The University Of Chicago | Computerized schemes for detecting and/or diagnosing lesions on ultrasound images using analysis of lesion shadows |
US7123008B1 (en) | 2002-04-19 | 2006-10-17 | Fonar Corporation | Positional magnetic resonance imaging |
US8036730B1 (en) | 2002-04-19 | 2011-10-11 | Fonar Corporation | Temporal magnetic resonance imaging |
US7418123B2 (en) * | 2002-07-12 | 2008-08-26 | University Of Chicago | Automated method and system for computerized image analysis for prognosis |
US6731782B2 (en) * | 2002-10-02 | 2004-05-04 | Virtualscopics, Llc | Method and system for automatic identification and quantification of abnormal anatomical structures in medical images |
US6836557B2 (en) * | 2002-10-02 | 2004-12-28 | VirtualS{tilde over (c)}opics, LLC | Method and system for assessment of biomarkers by measurement of response to stimulus |
FR2846830B1 (en) * | 2002-10-31 | 2005-01-21 | Ge Med Sys Global Tech Co Llc | SPATIO-TEMPORAL FILTRATION METHOD OF FLUOROSCOPIC NOISE |
US20040102692A1 (en) * | 2002-11-27 | 2004-05-27 | General Electric Company | Magnetic resonance imaging system and methods for the detection of brain iron deposits |
JP4499090B2 (en) * | 2003-02-28 | 2010-07-07 | セダラ ソフトウェア コーポレイション | Image region segmentation system and method |
JP4184842B2 (en) * | 2003-03-19 | 2008-11-19 | 富士フイルム株式会社 | Image discrimination device, method and program |
US8045770B2 (en) * | 2003-03-24 | 2011-10-25 | Cornell Research Foundation, Inc. | System and method for three-dimensional image rendering and analysis |
WO2004088589A1 (en) * | 2003-04-04 | 2004-10-14 | Philips Intellectual Property & Standards Gmbh | Volume measurements in 3d datasets |
US8055323B2 (en) * | 2003-08-05 | 2011-11-08 | Imquant, Inc. | Stereotactic system and method for defining a tumor treatment region |
US7343030B2 (en) * | 2003-08-05 | 2008-03-11 | Imquant, Inc. | Dynamic tumor treatment system |
US20050096530A1 (en) * | 2003-10-29 | 2005-05-05 | Confirma, Inc. | Apparatus and method for customized report viewer |
US20050113651A1 (en) * | 2003-11-26 | 2005-05-26 | Confirma, Inc. | Apparatus and method for surgical planning and treatment monitoring |
CA2749057A1 (en) * | 2004-02-20 | 2005-09-09 | University Of Florida Research Foundation, Inc. | System for delivering conformal radiation therapy while simultaneously imaging soft tissue |
US7233687B2 (en) * | 2004-03-30 | 2007-06-19 | Virtualscopics Llc | System and method for identifying optimized blood signal in medical images to eliminate flow artifacts |
GB2414357A (en) * | 2004-05-18 | 2005-11-23 | Medicsight Plc | Nodule boundary detection |
US20060018524A1 (en) * | 2004-07-15 | 2006-01-26 | Uc Tech | Computerized scheme for distinction between benign and malignant nodules in thoracic low-dose CT |
DE102004061507B4 (en) * | 2004-12-21 | 2007-04-12 | Siemens Ag | Method for correcting inhomogeneities in an image and imaging device therefor |
CN100370952C (en) * | 2004-12-30 | 2008-02-27 | 中国医学科学院北京协和医院 | Method for processing lung images |
JP2008529639A (en) * | 2005-02-11 | 2008-08-07 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Inspection apparatus, image processing device, method of inspecting target object with inspection apparatus, computer-readable medium, and program element |
US8611632B2 (en) * | 2005-04-08 | 2013-12-17 | 361° Systems, Inc. | Method of selecting and visualizing findings within medical images |
US7599542B2 (en) * | 2005-04-08 | 2009-10-06 | John Philip Brockway | System and method for detection and display of diseases and abnormalities using confidence imaging |
US7756317B2 (en) * | 2005-04-28 | 2010-07-13 | Carestream Health, Inc. | Methods and systems for automated detection and analysis of lesion on magnetic resonance images |
US20060247864A1 (en) * | 2005-04-29 | 2006-11-02 | Jose Tamez-Pena | Method and system for assessment of biomarkers by measurement of response to surgical implant |
CN101288102B (en) * | 2005-08-01 | 2013-03-20 | 拜奥普蒂根公司 | Methods and systems for analysis of three dimensional data sets obtained from samples |
US7760941B2 (en) * | 2005-09-23 | 2010-07-20 | Mevis Research Gmbh | Method and apparatus of segmenting an object in a data set and of determination of the volume of segmented object |
US20070127796A1 (en) * | 2005-11-23 | 2007-06-07 | General Electric Company | System and method for automatically assessing active lesions |
WO2007078258A1 (en) * | 2006-01-06 | 2007-07-12 | Agency For Science, Technology And Research | Obtaining a threshold for partitioning a dataset based on class variance and contrast |
US20070206844A1 (en) * | 2006-03-03 | 2007-09-06 | Fuji Photo Film Co., Ltd. | Method and apparatus for breast border detection |
US20070211930A1 (en) * | 2006-03-09 | 2007-09-13 | Terry Dolwick | Attribute based image enhancement and display for medical imaging applications |
WO2007113720A1 (en) * | 2006-03-30 | 2007-10-11 | Koninklijke Philips Electronics N.V. | Automatic cardiac band detection on breast mri |
US20080056548A1 (en) * | 2006-09-05 | 2008-03-06 | Pablo Irarrazaval | Enhancement of visual perception through dynamic cues |
CN101675455B (en) * | 2007-04-26 | 2016-11-09 | 皇家飞利浦电子股份有限公司 | The risk instruction of surgical procedures |
WO2008144751A1 (en) | 2007-05-21 | 2008-11-27 | Cornell University | Method for segmenting objects in images |
US20090082637A1 (en) * | 2007-09-21 | 2009-03-26 | Michael Galperin | Multi-modality fusion classifier with integrated non-imaging factors |
US20100113921A1 (en) * | 2008-06-02 | 2010-05-06 | Uti Limited Partnership | Systems and Methods for Object Surface Estimation |
WO2010067219A1 (en) * | 2008-12-09 | 2010-06-17 | Koninklijke Philips Electronics N.V. | Synopsis of multiple segmentation results for breast lesion characterization |
US8744159B2 (en) * | 2010-03-05 | 2014-06-03 | Bioptigen, Inc. | Methods, systems and computer program products for collapsing volume data to lower dimensional representations thereof using histogram projection |
WO2012090106A1 (en) * | 2010-12-29 | 2012-07-05 | Koninklijke Philips Electronics N.V. | Tnm classification using image overlays |
US8965484B2 (en) * | 2011-04-27 | 2015-02-24 | General Electric Company | Method and apparatus for generating a perfusion image |
JP5980490B2 (en) * | 2011-10-18 | 2016-08-31 | Olympus Corporation | Image processing apparatus, operation method of image processing apparatus, and image processing program |
US10561861B2 (en) | 2012-05-02 | 2020-02-18 | Viewray Technologies, Inc. | Videographic display of real-time medical treatment |
US9439623B2 (en) | 2012-05-22 | 2016-09-13 | Covidien Lp | Surgical planning system and navigation system |
US9498182B2 (en) | 2012-05-22 | 2016-11-22 | Covidien Lp | Systems and methods for planning and navigation |
US9439622B2 (en) | 2012-05-22 | 2016-09-13 | Covidien Lp | Surgical navigation system |
US9439627B2 (en) | 2012-05-22 | 2016-09-13 | Covidien Lp | Planning system and navigation system for an ablation procedure |
US8750568B2 (en) | 2012-05-22 | 2014-06-10 | Covidien Lp | System and method for conformal ablation planning |
US9262834B2 (en) * | 2012-07-30 | 2016-02-16 | General Electric Company | Systems and methods for performing segmentation and visualization of images |
KR20150080527A (en) | 2012-10-26 | 2015-07-09 | Viewray Incorporated | Assessment and improvement of treatment using imaging of physiological responses to radiation therapy |
US9446263B2 (en) | 2013-03-15 | 2016-09-20 | Viewray Technologies, Inc. | Systems and methods for linear accelerator radiotherapy with magnetic resonance imaging |
CN105684040B (en) | 2013-10-23 | 2020-04-03 | Koninklijke Philips N.V. | Method of supporting tumor response measurement |
US9377291B2 (en) | 2013-12-05 | 2016-06-28 | Bioptigen, Inc. | Image registration, averaging, and compounding for high speed extended depth optical coherence tomography |
US9786092B2 (en) * | 2015-02-18 | 2017-10-10 | The Regents Of The University Of California | Physics-based high-resolution head and neck biomechanical models |
KR20180087310A (en) | 2015-11-24 | 2018-08-01 | Viewray Technologies, Inc. | Radiation beam collimation system and method |
KR20180120705A (en) | 2016-03-02 | 2018-11-06 | Viewray Technologies, Inc. | Particle therapy using magnetic resonance imaging |
US11378629B2 (en) | 2016-06-22 | 2022-07-05 | Viewray Technologies, Inc. | Magnetic resonance imaging |
CA3030577A1 (en) | 2016-07-12 | 2018-01-18 | Mindshare Medical, Inc. | Medical analytics system |
CA3046091A1 (en) | 2016-12-13 | 2018-06-21 | Viewray Technologies, Inc. | Radiation therapy systems and methods |
US11276175B2 (en) | 2017-05-18 | 2022-03-15 | Brainlab Ag | Determining a clinical target volume |
EP3438928A1 (en) | 2017-08-02 | 2019-02-06 | Koninklijke Philips N.V. | Detection of regions with low information content in digital x-ray images |
WO2019112880A1 (en) | 2017-12-06 | 2019-06-13 | Viewray Technologies, Inc. | Optimization of multimodal radiotherapy |
US11049606B2 (en) | 2018-04-25 | 2021-06-29 | Sota Precision Optics, Inc. | Dental imaging system utilizing artificial intelligence |
US11209509B2 (en) | 2018-05-16 | 2021-12-28 | Viewray Technologies, Inc. | Resistive electromagnet systems and methods |
EP3833290A1 (en) | 2018-08-10 | 2021-06-16 | Covidien LP | Systems for ablation visualization |
CN111815624A (en) * | 2020-07-28 | 2020-10-23 | Huashan Hospital Affiliated to Fudan University | Tumor interstitial ratio determination method and system based on image processing algorithm |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---
US5845639A (en) * | 1990-08-10 | 1998-12-08 | Board Of Regents Of The University Of Washington | Optical imaging methods |
US5311131A (en) * | 1992-05-15 | 1994-05-10 | Board Of Regents Of The University Of Washington | Magnetic resonance imaging using pattern recognition |
US5627907A (en) * | 1994-12-01 | 1997-05-06 | University Of Pittsburgh | Computerized detection of masses and microcalcifications in digital mammograms |
US5579360A (en) * | 1994-12-30 | 1996-11-26 | Philips Electronics North America Corporation | Mass detection by computer using digital mammograms of the same breast taken from different viewing directions |
US5615243A (en) * | 1996-03-12 | 1997-03-25 | University Of Pittsburgh | Identification of suspicious mass regions in mammograms |
US5799100A (en) * | 1996-06-03 | 1998-08-25 | University Of South Florida | Computer-assisted method and apparatus for analysis of x-ray images using wavelet transforms |
US5796862A (en) * | 1996-08-16 | 1998-08-18 | Eastman Kodak Company | Apparatus and method for identification of tissue regions in digital mammographic images |
US5768333A (en) * | 1996-12-02 | 1998-06-16 | Philips Electronics N.A. Corporation | Mass detection in digital radiologic images using a two stage classifier |
US6026316A (en) * | 1997-05-15 | 2000-02-15 | Regents Of The University Of Minnesota | Method and apparatus for use with MR imaging |
- 1998
  - 1998-09-18 US US09/156,413 patent/US6112112A/en not_active Expired - Lifetime
- 1999
  - 1999-09-15 AU AU59190/99A patent/AU5919099A/en not_active Abandoned
  - 1999-09-15 WO PCT/US1999/020862 patent/WO2000016696A1/en not_active Application Discontinuation
  - 1999-09-15 CA CA002343825A patent/CA2343825A1/en not_active Abandoned
  - 1999-09-15 JP JP2000573659A patent/JP2002526190A/en not_active Withdrawn
  - 1999-09-15 EP EP99946877A patent/EP1113752A4/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---
AU5919099A (en) | 2000-04-10 |
EP1113752A1 (en) | 2001-07-11 |
EP1113752A4 (en) | 2003-06-25 |
JP2002526190A (en) | 2002-08-20 |
US6112112A (en) | 2000-08-29 |
WO2000016696A1 (en) | 2000-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---
US6112112A (en) | Method and system for the assessment of tumor extent in magnetic resonance images |
Noble et al. | Ultrasound image segmentation: a survey | |
Mughal et al. | Removal of pectoral muscle based on topographic map and shape-shifting silhouette | |
US7983732B2 (en) | Method, system, and computer software product for automated identification of temporal patterns with high initial enhancement in dynamic magnetic resonance breast imaging | |
Rogowska | Overview and fundamentals of medical image segmentation | |
US6317617B1 (en) | Method, computer program product, and system for the automated analysis of lesions in magnetic resonance, mammogram and ultrasound images | |
US8634622B2 (en) | Computer-aided detection of regions of interest in tomographic breast imagery | |
Chang et al. | Segmentation of breast tumor in three-dimensional ultrasound images using three-dimensional discrete active contour model | |
US20040252870A1 (en) | System and method for three-dimensional image rendering and analysis | |
US8311301B2 (en) | Segmenting an organ in a medical digital image | |
Aggarwal et al. | Role of segmentation in medical imaging: A comparative study | |
JP2002523123A (en) | Method and system for lesion segmentation and classification | |
US7486812B2 (en) | Shape estimates and temporal registration of lesions and nodules | |
Shao et al. | Prostate boundary detection from ultrasonographic images | |
US20030099384A1 (en) | Detection and analysis of lesions in contact with a structural boundary | |
Singh et al. | Segmentation of prostate contours for automated diagnosis using ultrasound images: A survey | |
Park | Connectivity-based local adaptive thresholding for carotid artery segmentation using MRA images | |
Jumaat et al. | Comparison of balloon snake and GVF snake in segmenting masses from breast ultrasound images | |
Chen et al. | Multiview contouring for breast tumor on magnetic resonance imaging | |
Wu et al. | Automatic segmentation of ultrasound tomography image | |
Li et al. | Image segmentation and 3D visualization for MRI mammography | |
Logeswaran et al. | Liver isolation in abdominal MRI | |
Moltz et al. | Segmentation of juxtapleural lung nodules in ct scan based on ellipsoid approximation | |
Kutty et al. | Improved Segmentation for Intravascular Ultrasound (IVUS) Modality | |
Deenadhayalan et al. | Computed Tomography Image based Classification and Detection of Lung Diseases with Image Processing Approach |
Legal Events
Date | Code | Title | Description |
---|---|---|---
EEER | Examination request
FZDE | Discontinued |