CA2214101A1 - Method and system for the detection of lesions in medical images - Google Patents
- Publication number
- CA2214101A1
- Authority
- CA
- Canada
- Prior art keywords
- recited
- determining
- gray
- mass
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/149—Segmentation; Edge detection involving deformable models, e.g. active contour models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/155—Segmentation; Edge detection involving morphological operators
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/187—Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/478—Contour-based spectral representations or scale-space representations, e.g. by Fourier analysis, wavelet analysis or curvature scale-space [CSS]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20056—Discrete and fast Fourier transform, [DFT, FFT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20156—Automatic seed setting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30068—Mammography; Breast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30096—Tumor; Lesion
Abstract
A method and system for the automated detection of lesions in medical images. Medical images, such as mammograms, are segmented and optionally processed with peripheral enhancement and/or modified median filtering. A modified morphological open operation (104-106) and filtering with a modified mass filter (107-109) are performed for the initial detection of circumscribed lesions. Then, the lesions are matched using a deformable shape template with Fourier descriptors (110-112). Characterization of the match is done using simulated annealing and measuring the circularity and density characteristics of the suspected lesion. The procedure is performed iteratively at different spatial resolutions, in which at each resolution step a specific lesion size is detected. The detection of a lesion leads to the localization of a suspicious region and thus the likelihood of cancer.
Description
CA 02214101 1997-08-28 WO 96/27846 PCT/US96/02439
TITLE OF THE INVENTION
METHOD AND SYSTEM FOR THE DETECTION OF LESIONS IN MEDICAL IMAGES
BACKGROUND OF THE INVENTION
Field of the Invention
The invention relates generally to a method and system for improved computerized, automatic detection and characterization of lesions in medical images, and more particularly to the detection of circumscribed masses in digital mammograms. Novel techniques in the localization (segmentation) and detection of masses in mammograms include initially processing with peripheral equalization (correction), a modified median filter, a modified morphological open operation, filtering with a modified mass filter for the initial detection of circumscribed densities, matching using a deformable shape template with Fourier descriptors, optimization of the match using simulated annealing, and measuring the circularity and density characteristics of the suspected lesion to distinguish true positives from false positives and malignant lesions from benign lesions. The procedure is performed iteratively at different spatial resolutions, in which at each resolution step a specific lesion size is detected. The detection of the mass leads to a localization of a suspicious region and thus the likelihood of cancer.
Discussion of the Background
Although mammography is currently the best method for the detection of breast cancer, between 10-30% of women who have breast cancer and undergo mammography have negative mammograms. In approximately two-thirds of these false-negative mammograms, the radiologist failed to detect the cancer that was evident retrospectively. The missed detections may be due to the subtle nature of the radiographic findings (i.e., low conspicuity of the lesion), poor image quality, eye fatigue or oversight by the radiologists. In addition, it has been suggested that double reading (by two radiologists) may increase sensitivity. It is apparent that the efficiency and effectiveness of screening procedures could be increased by using a computer system, as a "second opinion" or second reading, to aid the radiologist by indicating locations of suspicious abnormalities in mammograms. In addition, mammography is becoming a high volume x-ray procedure routinely interpreted by radiologists.
If a suspicious region is detected by a radiologist, he or she must then visually extract various radiographic characteristics. Using these features, the radiologist then decides if the abnormality is likely to be malignant or benign, and what course of action should be recommended (i.e., return to screening, return for follow-up or return for biopsy). Many patients are referred for surgical biopsy on the basis of a radiographically detected mass lesion or cluster of microcalcifications. Although general rules for the differentiation between benign and malignant breast lesions exist, considerable misclassification of lesions occurs with current radiographic techniques. On average, only 10-20% of masses referred for surgical breast biopsy are actually malignant. Thus, another aim of computer use is to extract and analyze the characteristics of benign and malignant lesions in an objective manner in order to aid the radiologist by reducing the number of false-positive diagnoses of malignancies, thereby decreasing patient morbidity as well as the number of surgical biopsies performed and their associated complications.
SUMMARY OF THE INVENTION
Accordingly, an object of this invention is to provide a method and system for detecting, classifying, and displaying masses in medical images of the breast.
Another object of this invention is to provide an automated method and system for the detection and/or classification of masses based on a multi-resolution analysis of mammograms.
Another object of this invention is to provide an automated method and system for the detection and/or classification of masses based on a modified morphological open operation, filtering with a modified mass filter for the initial detection of circumscribed densities, matching using a deformable shape template with Fourier descriptors, optimization of the match using simulated annealing, and measuring the circularity and density characteristics of the suspected lesion.
These and other objects are achieved according to the invention by providing a new and improved automated method and system in which a segmentation of densities (masses) within a mammogram is performed followed by optimal characterization.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
FIGS. 1A-1C are schematic diagrams illustrating embodiments of the automated method for the detection of lesions according to the invention;
FIG. 2 is a graph illustrating the step of peripheral enhancement according to the invention;
FIG. 3A is a schematic diagram of the modified median filtering according to the invention;
FIG. 3B is a schematic diagram of the modified morphological open operation according to the invention;
FIGS. 3C and 3D are graphs illustrating the criteria used in the modified morphological open operation of FIG. 3B;
FIG. 4 is a diagram illustrating the circular kernel used in the modified mass filter;
FIG. 5 is a diagram illustrating a gradient vector in the modified mass filtering;
FIG. 6 is a diagram illustrating examples of the deformable templates corresponding to the possible shapes assigned to localized densities from the Fourier descriptors analysis;
FIG. 7 is a diagram of calculating a gradient in a region of interest;
FIG. 8 is a schematic diagram illustrating the analysis of a suspected lesion;
FIGS. 9A and 9B are tables illustrating the relationship between pixel size of the image and the lesion size being detected, and the relationship between kernel size and the lesion size being detected, respectively;
FIG. 10 is a schematic diagram of the changing of the kernel size in mass filtering;
FIG. 11 is a diagram of two detected lesions;
FIGS. 12A-12F illustrate examples of (12A) an original mammogram, (12B) after border segmentation, (12C) after the modified open operation, (12D) after the modified mass filter, (12E) after template matching and (12F) after feature extraction;
FIGS. 13A-13F illustrate examples of (13A) a mammogram after peripheral enhancement, (13B) after morphological filtering, (13C) an image of the difference of the images of FIGS. 13A and 13B, and (13D-13F) after morphological filtering with pixel sizes of 1, 2 and 4 mm;
FIGS. 14A-14C illustrate (14A) an artificial lesion, (14B) its detection results, and (14C) the edge maps used in the detection;
FIG. 15A shows the location of a region of interest (ROI) used for feature analysis;
FIGS. 15B-15D show an enlargement of the ROI of FIG. 15A, a truth margin, and detection results, respectively;
FIG. 16 is a graph illustrating the performance of the method in the detection of malignant lesions in a screening mammographic database; and
FIG. 17 is a schematic block diagram illustrating a system for implementing the automated method for the detection of lesions in medical images.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now to the drawings, and more particularly to FIGS. 1A-1C thereof, schematic diagrams of the automated method for the detection and classification of lesions in breast images are shown. In FIG. 1A a first embodiment of the overall scheme includes an initial acquisition of a mammogram and digitization (step 100). Next, the breast border is segmented from the rest of the image area (step 101) and peripheral density enhancement is performed on the image (step 102). The image is processed (step 103) and then subjected to a modified morphological open operation using different filter sizes (steps 104-106). The image after the open operation is mass filtered (steps 107-109) and template matched (steps 110-112). Feature extraction is then performed (step 113) followed by integration (step 114) and classification (step 115) of the detected lesions.
The method of detecting circumscribed masses according to the invention uses an automatically segmented mammographic image indicating only the actual breast region (step 101), after an optional application of the peripheral density equalization (step 102). Segmentation of a mammogram is described in application Serial No. 08/158,320 to Bick et al., the disclosure of which is herein incorporated by reference.
In the segmentation process, noise filtering is applied to the digital mammogram followed by application of the gray-value range operator. Using information from the local range operator, a modified global histogram analysis is performed.
Region growing is performed on the thresholded image using connectivity (counting pixels), followed by a morphological erosion operation. The distance map of the image is determined, and the boundary of the segmented object (breast) in the image is then tracked to yield its contour. The contour can then be output onto the digital image or fed to other computer algorithms.
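The region-growing step above can be sketched as a flood fill over the thresholded image. This is a minimal sketch only: the text does not state the connectivity used, so 4-connectivity is an assumption, and the function name and interface are illustrative.

```python
import numpy as np
from collections import deque

def region_grow(mask, seed):
    """Flood-fill region growing with 4-connectivity, counting pixels.

    mask : boolean array, True where the thresholded image is "on"
    seed : (row, col) starting pixel, assumed to lie inside the mask
    """
    H, W = mask.shape
    region = np.zeros_like(mask, dtype=bool)
    region[seed] = True
    q = deque([seed])
    while q:
        y, x = q.popleft()
        # visit the 4-connected neighbors of the current pixel
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < H and 0 <= nx < W and mask[ny, nx] and not region[ny, nx]:
                region[ny, nx] = True
                q.append((ny, nx))
    return region  # region.sum() gives the connected pixel count
```

The returned boolean image contains only the connected component of the seed, so separate thresholded blobs are kept apart.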
Note that there is an inverse relationship between gray level and optical density. A low optical density (white region) on the mammogram (high anatomic density) corresponds to a high gray level (1023), whereas a high optical density (black region) on the mammogram corresponds to a low gray level (0).
The image after segmenting can be processed (step 103), or peripheral density enhancement can be performed. Peripheral density enhancement is described in application Serial No. 08/158,320. The average gray values of the pixels are determined as a function of distance from the breast border. An enhancement curve is determined by fitting, such as polynomial fitting, a curve to the average gray values as a function of distance, and then reversing the fit. The enhancement curve is added to the curve of the average gray values as a function of distance to produce an enhanced gray value curve. This results in a peripherally enhanced image where the center and the portion near the border are simultaneously displayed without loss in contrast. FIG. 2 shows the curve of the average gray values as a function of distance, the reversed fitted curve and the peripherally enhanced curve.
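The enhancement-curve construction can be sketched roughly as follows. The polynomial degree and the exact meaning of "reversing the fit" (taken here as flipping the fitted curve about its own maximum, so that the fall-off toward the border is compensated) are assumptions, since the text does not fix them.

```python
import numpy as np

def peripheral_enhancement_curve(avg_gray, degree=3):
    """Sketch of the peripheral enhancement step (cf. FIG. 2).

    avg_gray : average gray value at each distance (in pixels)
               from the breast border.
    """
    d = np.arange(len(avg_gray), dtype=float)
    coeffs = np.polyfit(d, avg_gray, degree)       # polynomial fit
    fitted = np.polyval(coeffs, d)
    # "Reverse" the fit: flip it about its maximum so it compensates
    # the peripheral fall-off (assumed interpretation).
    enhancement = fitted.max() - fitted
    # Enhanced curve = original averages + enhancement term.
    return avg_gray + enhancement
```

Because the enhancement term is nonnegative by construction, the enhanced curve never falls below the original averages; pixels at each border distance would then be offset by the corresponding curve value.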
The segmented image, with or without peripheral density enhancement, is then optionally processed (step 103). An initial modified median filter of size nxn may be used to eliminate isolated aberrant (very dark, low gray level) pixel values in the segmented image, since these would disturb the erosion step. The modified median filtering is shown in FIG. 3A. The median filter can be of 3x3 size, for example. The conventional median filter is described in, for example, "The Image Processing Handbook," 2nd Ed., by John Russ (CRC Press 1995).
At a beginning pixel location I(x,y) in the image (step 300), which can be either the segmented or the peripherally enhanced segmented image, the local minimum is determined (step 301) in the surrounding neighborhood (nxn pixels). If the gray level at pixel location I(x,y) is smaller than the local minimum by a certain number of gray levels (M) in step 302, then that gray level is corrected by the median filter (step 303). An example of M is 5 gray levels, but other values are possible. In the embodiment, the gray level of the pixel at I(x,y) is updated to the median pixel value of the neighborhood.
It is checked whether the pixel is the last pixel for processing (step 304). If no, the next pixel is selected (step 305) and step 301 is repeated. If the answer in step 302 is no, the process moves to step 304. When the last pixel location is reached (step 304), the filtering is completed for all of the pixels in the image.
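The modified median filter of steps 300-305 can be sketched as below: a pixel is replaced only when it is more than M gray levels below the minimum of its neighbors, so normal pixels pass through unchanged. The 3x3 window and M = 5 follow the examples in the text; border handling is an implementation choice.

```python
import numpy as np

def modified_median_filter(img, n=3, M=5):
    """Sketch of the modified median filter (FIG. 3A).

    Only isolated aberrant dark pixels (more than M gray levels
    below the local minimum of the surrounding neighbors) are
    replaced by the median of the n x n window.
    """
    out = img.copy()
    r = n // 2
    H, W = img.shape
    for y in range(r, H - r):
        for x in range(r, W - r):
            hood = img[y - r:y + r + 1, x - r:x + r + 1].astype(float)
            center = hood[r, r]
            neighbors = np.delete(hood.ravel(), (n * n) // 2)  # drop center
            if center < neighbors.min() - M:
                out[y, x] = np.median(hood)  # correct the aberrant pixel
    return out
```

A pixel that is dark but within M gray levels of its local minimum is deliberately left untouched, preserving genuine low-gray structure.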
Two criteria are then used to control which pixels are used as seed pixels for the morphological operation, to preserve the gray value characteristics of larger lesions as far as possible. As shown in FIG. 3B, beginning at pixel location I(x,y) (step 310), a check is made to determine whether pixel I(x,y) is a seed pixel. The local maximum of the neighborhood is calculated (step 311).
To qualify as a seed pixel the following criteria must be fulfilled. First, there must be a negative Laplacian: the gray value of the pixel in question minus the local minimum gray value must be less than the local maximum gray value minus the gray value of the pixel in question (step 312). This, as demonstrated in FIGS. 3C and 3D, prevents erosion of the center of a small mass. In FIG. 3C, I(x,y)-MIN >= MAX-I(x,y), so no change is made to the pixel value and the center is preserved. In FIG. 3D, I(x,y)-MIN < MAX-I(x,y), so the gray value of the pixel at location I(x,y) is changed.
Second, only pixels with a small distance from the local minimum are used as erosion centers (step 313). That is, the location of the seed pixel must be close to the location of the local MIN. This preserves the gray value slope in the periphery of larger lesions. An example of the distance is 3 pixels, and other values may be chosen.
If the answer is no at either of steps 312 and 313, the next pixel is selected (step 314) and the process is repeated.
For those pixels which qualify as seed pixels, the morphological open operation is performed (step 315).
The morphological open operation (erosion followed by dilation) shown in FIG. 3B is performed on the segmented image, with or without modified median filtering, as shown in FIG. 1A. The morphological open operation is also described in Russ, supra.
Only the erosion processing can be performed, omitting the dilation procedure. The main effect of the erosion is smoothing of the image while keeping lesions that are of interest. The main effect of the dilation is to return masses to roughly their original size. The dilation is optional.
The structuring element in the embodiment for the morphological operation is a circle with a diameter of 7 pixels, e.g. for a pixel size of 0.5 mm. This structuring element eliminates small circular and thin linear structures up to a diameter of 3.5 mm (for a 0.5 mm pixel size). If larger structuring elements are used, the subsequently used mass filter size is changed (as discussed below). At the same time irregular densities are rounded by this process.
This morphological operation is different from the conventional operation in the sense that a threshold E is used to control how much structure is eroded. If the difference, i.e. the gray level value of a pixel in the image prior to the morphological operation, I(x,y), minus the gray level value after the morphological operation, P(x,y), is larger than the threshold E (step 316), then the gray level value of the pixel is replaced by the output of the morphological operation (step 317). Examples of E can range from 0-10 in terms of gray levels. When dilation is performed, if the gray level after dilation exceeds the original gray level of the pixel, the original gray level value is used for the pixel. This is repeated for all pixels.
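The seed criteria and the threshold E can be combined into one pass, sketched below for the erosion half only (the text notes the dilation is optional). A small square neighborhood stands in for the circular structuring element, and the window size, E and the distance limit are illustrative choices, not the patent's exact parameters.

```python
import numpy as np

def modified_erosion_step(img, size=3, E=5, dist_max=1):
    """Sketch of the seed-controlled erosion of FIG. 3B.

    A pixel is eroded to the local minimum only if
      (a) its Laplacian is negative, i.e. I - MIN < MAX - I
          (preserves the bright center of a small mass), and
      (b) it lies within dist_max pixels of the local minimum's
          location (preserves the gray-value slope at lesion rims),
    and the new value replaces the original only when the change
    I - P exceeds the threshold E.
    """
    out = img.astype(float).copy()
    r = size // 2
    H, W = img.shape
    for y in range(r, H - r):
        for x in range(r, W - r):
            hood = img[y - r:y + r + 1, x - r:x + r + 1].astype(float)
            c = float(img[y, x])
            mn, mx = hood.min(), hood.max()
            iy, ix = np.unravel_index(np.argmin(hood), hood.shape)
            near_min = max(abs(iy - r), abs(ix - r)) <= dist_max
            if (c - mn < mx - c) and near_min:   # seed criteria (a), (b)
                if c - mn > E:                   # threshold E check
                    out[y, x] = mn               # erosion to local min
    return out
```

On an edge between a dark and a bright region, only pixels on the dark side (closer to MIN than to MAX) are pulled down, so bright mass centers survive the operation.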
Referring to FIG. 1A, the morphological step is performed at different image resolutions. For example, resolution 1 (step 104) can use an image having a 0.5 mm pixel size, with the image being 512x512 pixels. The process is repeated in parallel for images having 1 mm, 1.5 mm, 2.5 mm, etc. pixel sizes, with a corresponding decrease in image size as the pixel size increases (for a 1.0 mm pixel size, the image is 256x256, etc.).
The process can also be conducted serially with a change in the resolution for each iteration. A second embodiment of the method according to the invention is shown in FIG. 1B.
After steps 100-103, the morphological operation is performed at a beginning resolution (step 104), followed by mass filtering (step 107) and template matching (step 110). The image resolution is changed in step 116 and the results of the matching are stored in step 117. It is then determined whether the maximum resolution has been exceeded (step 118). If no, the process is repeated at the new resolution. If yes, feature extraction, integration and classification (steps 113-115) are performed the same as in FIG. 1A.
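The serial multi-resolution loop can be sketched as below. Here `process` stands in for the chain of open operation, mass filtering and template matching at one resolution; the 2x2 block averaging used to double the pixel size between iterations is an assumed resampling method, since the text only states that the pixel size grows while the image size shrinks.

```python
import numpy as np

def detect_multiresolution(img, process, pixel_sizes=(0.5, 1.0, 2.0)):
    """Serial multi-resolution detection loop in the spirit of FIG. 1B.

    Each iteration runs the per-resolution detection, then halves
    the resolution so a larger lesion size is targeted next.
    """
    results = []
    cur = img.astype(float)
    for ps in pixel_sizes:
        results.append((ps, process(cur)))  # detect at this resolution
        H, W = cur.shape
        # halve resolution: 2x2 block average (pixel size doubles)
        cur = cur[:H - H % 2, :W - W % 2].reshape(H // 2, 2, W // 2, 2).mean(axis=(1, 3))
    return results
```

Because each resolution's matches are stored (step 117), the results list can later be integrated across scales (step 114).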
FIG. 1C shows a third embodiment of the invention. The method shown in FIG. 1C differs from the method shown in FIG. 1B in that a thresholding operation 119 is performed using the output of the mass filtering step. The mass filtered image identifies areas suspected of containing a lesion that can be further processed by gray-level thresholding. After thresholding, the image with the remaining suspected lesions is input to step 113 for feature analysis, followed by steps 114 and 115, as in the method of FIG. 1B.
FIG. 4 is a diagram illustrating the circular kernel used in the mass filter. For detection of circumscribed densities a mass filter with a circular base is used (this mass filter is a modified IRIS filter; for a description of the IRIS filter see Kobatake et al., CAR 1993: pp 624-629). The kernel is ring-shaped (pixels 402) around a center pixel 400.
Note in this kernel that the center pixel locations 401 are absent, since they would not contribute useful values to the overall filter value (as described below). A ring-shaped filter, rather than just a solid circular filter, is thus used.
The mass filter value is based on the local gradient (in the embodiment a 7x7 kernel is used) in the x- (Dx) and y- (Dy) directions. Differences from the description of the IRIS filter in Kobatake et al. include use of a ring-shaped filter, the second derivative instead of the gradient, and edge orientation bins. Gradient values smaller than a gradient threshold (e.g., 10) are not used in the calculation of the filter value.
The edge orientation at a specific image point is equivalent to the gradient vector, and the edge strength is calculated as the second derivative in the edge orientation. FIG. 5 shows a gradient 500 at point 501. This assures that regions with a constant gradual slope do not contribute to the mass filter value. The gradient is oriented at an angle relative to a radial line from point 501 to point (x,y). The filter value is calculated separately for a specific number of edge orientation bins, such as 16 (B1, B2...B16). Orientation bins are radial sectors of the circular area. For example, each of 16 bins would cover an angle of π/8. A bin 502, shown for a sector of π/8, is made of the pixels 402 between lines 503.
The calculation of each orientation bin for a given pixel location (x,y) is given by:

f(B_i) = (1/N) Σ_{P in K} [max(0, cos φ) × EdgeStrength(P)]

where:
f(B_i) — filter value for edge orientation bin B_i
K — filter kernel
P — neighbor point in K
N — number of points in K
φ — angle between the gradient vector and the connection line from the center point to the neighbor point

Edge strength is obtained from the second derivative at P, calculated in the edge orientation. The final filter value is calculated as the sum of the individual orientation bins, where a specified number j of bins, for example 4, with the highest values are ignored. That is:

filter value at pixel I(x,y) = Σ_i f(B_i), for B_i not among the j highest bins.

This prevents an influence of straight edges (e.g. the pectoralis muscle border) on the filter value, since all points along such an edge lie within the same orientation bin, without changing the filter value for ideal circular lesions.
Usually the filter value is highest in the center of a lesion. The highest filter values are found for round or slightly oval shaped lesions. The neighborhood used in calculation of the filter value is empirically determined to be around 10 pixels (outer radius); this could be increased to improve the detection of oval shaped masses. In addition, a gradient threshold can be employed so that pixels in the neighborhood that have a gradient smaller than the threshold (e.g., 10) do not contribute to the calculation of the filter value.
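The filter response at one candidate center can be sketched as follows. Two simplifications relative to the text are assumed: the gradient magnitude stands in for the second-derivative edge strength, and the weight max(0, cos φ) uses the angle between the gradient and the line from the ring pixel toward the center (the sign convention under which bright circumscribed masses score high). Ring radii, bin count, dropped-bin count and gradient threshold follow the examples in the text.

```python
import numpy as np

def mass_filter_value(img, cx, cy, r_in=4, r_out=10,
                      n_bins=16, j_drop=4, grad_thresh=10.0):
    """Sketch of the modified (ring-shaped, binned) mass filter."""
    sums = np.zeros(n_bins)
    counts = np.zeros(n_bins)
    H, W = img.shape
    for y in range(max(1, cy - r_out), min(H - 1, cy + r_out + 1)):
        for x in range(max(1, cx - r_out), min(W - 1, cx + r_out + 1)):
            rx, ry = cx - x, cy - y            # vector toward the center
            rad = np.hypot(rx, ry)
            if not (r_in <= rad <= r_out):
                continue                        # ring-shaped kernel
            dx = (float(img[y, x + 1]) - float(img[y, x - 1])) / 2.0
            dy = (float(img[y + 1, x]) - float(img[y - 1, x])) / 2.0
            mag = np.hypot(dx, dy)
            if mag < grad_thresh:
                continue                        # weak gradients ignored
            cos_phi = (dx * rx + dy * ry) / (mag * rad)
            # orientation bin = radial sector containing this ring pixel
            b = int((np.arctan2(ry, rx) + np.pi) / (2 * np.pi) * n_bins) % n_bins
            sums[b] += max(0.0, cos_phi) * mag
            counts[b] += 1
    vals = sums / np.maximum(counts, 1)         # per-bin averages
    return float(np.sort(vals)[:n_bins - j_drop].sum())  # drop j highest bins
```

Dropping the j highest bins removes the contribution of a straight edge concentrated in one sector, while a circular lesion spreads its response over all sectors and loses little.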
The image outputted by the mass filter is then subjected to template matching. Local maxima of the filter value define potential center points of mass lesions, which are used in steps 111-113, the matching of a deformable template onto the lesion border. The edges of the suspect lesion can be obtained from the derivative or second derivative of the image output from the mass filtering. The deformable shape template is defined using Fourier descriptors. An initial shape is selected and the Fourier descriptors are varied to dynamically fit the shape of the lesion. Fourier descriptors are described in, for example, Arbter et al., Application of Affine-invariant Fourier Descriptors to Recognition of 3-D Objects, IEEE Trans. Pattern Analysis Machine Intelligence 12:640-647 (1990); Kuhl et al., Elliptic Fourier Features of a Closed Contour, Computer Graphics Image Processing 18:236-258 (1982); Wallace et al., An Efficient Three-dimensional Aircraft Recognition Algorithm Using Normalized Fourier Descriptors, ibid., 13:99-126 (1980); Granlund, Fourier Preprocessing for Hand Print Character Recognition, IEEE Trans. Computers 21:195-201 (1972); Zahn et al., Fourier Descriptors for Plane Closed Curves, ibid., 21:269-281 (1972); Crimmins, A Complete Set of Fourier Descriptors for Two-Dimensional Shape, IEEE Trans. Sys. Man Cybernetics 12:848-855 (1982); Persoon et al., Shape Discrimination Using Fourier Descriptors, ibid., 7:170-179 (1977); and Richard et al., Identification of Three-dimensional Objects Using Fourier Descriptors of the Boundary Curve, ibid., 4:371-378 (1974).
In the template matching step the object contour is generated as an inverse Fourier transform of a limited number of complex Fourier terms. The following relationship exists between a closed planar curve g(l) and the Fourier descriptors C_k:

C_k = |C_k| e^{iθ_k} = (1/L) ∫_0^L g(l) e^{-i2πkl/L} dl

where:
g(l) — a planar curve with runlength l; the real part of g = x coordinate, the imaginary part of g = y coordinate
C_k — Fourier descriptors with -N/2 ≤ k ≤ N/2, N → ∞
By variation of the terms -2, -1, 0, 1, and 2, arbitrary elliptical or kidney-shaped contours can be generated. The terms -2 to 2 were selected since the lesions are of simple shape. However, one can modify the terms using a priori knowledge of the lesions to be detected. The term 0 defines the position, and the terms -1 and 1 define the size and orientation of the main ellipse.
In the mass detection the following Fourier descriptors are used:

C_k = 0 for k < -2 or k > 2
C_{-1} = s·P1·e^{iθ}
C_0 = x + jy
C_1 = s·e^{iθ}
C_2 = s·P2·e^{iφ}

where:
x, y — center position
s — size
θ — orientation (angle between main ellipse and x-axis)
P1 — variable parameter describing the short/long axis ratio of the main ellipse, with 0 ≤ P1 ≤ 0.5 (long axis: s + sP1, short axis: s - sP1; for P1 = 0, the Fourier descriptors define a circle as a special case of an ellipse)
P2 — variable parameter describing the degree of asymmetry (kidney shape), with 0 ≤ P2 ≤ 0.3

FIG. 6 is a diagram illustrating examples of the deformable templates corresponding to the possible shapes assigned to localized densities from the Fourier descriptor analysis discussed above, with the P1 and P2 values indicated for each shape. Note that the center position, the angle (orientation) and the size of each can be varied. FIG. 6 shows examples of possible shapes, and the invention is not limited to these particular shapes or this number of shapes.
The lesion contour is generated by variation of the Fourier terms within a certain range, with minimization of a cost function using lesion contrast, edge strength and deviation from the ideal circular shape. This process is performed on the output from the mass filter. Simulated annealing is used for minimization.
Simulated annealing is a technique for optimization, which involves a description of possible system configurations, a generator of random changes in the configuration (i.e., the "options"), a function for minimization and a control parameter (temperature) that controls the increments of the random changes. It is described in, for example, Numerical Recipes by Press et al., Cambridge Press (1988).
The configuration in the embodiment is the "correct" Fourier descriptor of an extracted contour. This configuration could be obtained as an entire curve or in radial segments of the curve, using different Fourier descriptors for each segment. Once "fit", the inverse of the Fourier descriptors is performed, yielding the contour. With the radial segments, only a limited number of points are generated in the inverse transformation. The changes in the configuration (i.e., the contour shape, that is, the Fourier descriptor coefficients Ck) are made by changes in the center location, the size of the "lesion", the orientation (θ), the long/short axis ratio (indicating the degree of being oval) and the degree of asymmetry. The method limits the changes in the Fourier descriptors to these. Examples of the range of variation for each parameter include increments in center position by one pixel, a size range of 5 to 80 pixels in diameter with an increment of 2 pixels, and a range in θ from -360° to 360°. The function to be minimized includes a center cost index of 20 (in each direction), a size cost index of 10 and an angle cost index of 10. The starting temperature was set at 30. Upon minimizing the cost function, the difference between the "lesion" center and the "fit" center, the difference between the size of the "lesion" and the size of the "fit", the Euclidean difference between the x-y position of the lesion contour and the x-y position of the fit contour, etc. are minimized. The temperature is modified (cooled) as the iterations increase, so that after a specified number of iterations a downward step in the temperature is taken.
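The annealing loop described above can be sketched generically as follows. The starting temperature of 30 matches the text, but the cooling schedule (multiply by alpha every cool_every iterations) and the iteration count are assumptions, since the text only says the temperature is stepped down after a specified number of iterations.

```python
import math
import random

def simulated_annealing(cost, state, propose, t0=30.0,
                        n_iter=2000, cool_every=50, alpha=0.8):
    """Generic simulated-annealing loop, of the kind applied here
    to the template parameters (center, size, angle, P1, P2).

    cost    : function mapping a configuration to its cost
    propose : generator of a random change in the configuration
    """
    best = cur = state
    best_c = cur_c = cost(state)
    T = t0
    for i in range(1, n_iter + 1):
        cand = propose(cur)                 # random configuration change
        c = cost(cand)
        # accept downhill moves always, uphill with prob e^(-dc/T)
        if c < cur_c or random.random() < math.exp(-(c - cur_c) / T):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = cand, c
        if i % cool_every == 0:
            T *= alpha                      # downward temperature step
    return best, best_c
```

In the template-matching use, `state` would hold the descriptor parameters and `cost` the weighted sum of contrast, edge-strength and circularity penalties described in the text.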
In the template matching the following can be varied: the shape in terms of Fourier descriptors, the penalty factor for deviations from the mean, the center, the angle and the size.
The penalty factor is a measure based on st~n~Ard deviation, i.e. a limit on the amount of deformation during the template match; nq, An example of the parameter file used in the deformable template matching is shown below:
-- a shape file giving which part of the curve is used -- start temperature for the simulated An~eAling -- number of iterations -- in~,~ ents such as for incrementing the center position, the size, the angle during the simulated annealing -- number of points generated in the inverse transformation.
Note that after the matching is 5nCc~ful~ the final coefficients of the Fourier descriptors are used to return to the x,y domain. Thus, discontinuous margin pixels along a -W 096/27846 PCT~US96/02439 "mass" will be ronn~cted. The ouL~L of the template matchin~
is contour or a partial contour of the suspect lesion.
Sixteen edge maps can be used in the shape matçhi n~.
Edge maps are obtained from the second derivative, as described above. Edge maps are used since sometimes there is only one good edge in the suspected lesion. The lesion contour is generated by variation of the Fourier terms within a certain range with ~;n; m i7ation of a cost function using lesion contrast, edge strength and deviation from the ideal circular shape. Simulated annealing is used for ;ni~;~ation.
In the matchinq one can have varied the following: the shape in terms of Fourier descriptors, the penalty factor deviations from the mean, the center, the angle and the size.
For further characterization a rectangular ROI cont~;n;~
the suspected lesion identified in the open, mass filtering and template matching operations is extracted from the original peripheral density ~nh;~nr~rl image. Feature extraction and analysis is performed on the suspected lesion.
Feature extraction and is described in application Serial No.
08/158,389 to Giger et al., the disclosure of which is herein incorporated by reference.
This is shown in more detail in FIG. 8. The suspected lesion from the template matching is obtained (step 800).
Note that a suspected lesion from another method, by a computer or manually by an observer, can also be used as input (step 801). A region of interest (ROI) cont~;n; ng the suspected lesion is selected (automatically or ~nt~lly) in = step 802. The gradient and orientation of the ROI is W096/27846 CA 02214101 1997-08-28 PCT~S96/02439 calculated in step 803, followed by a calculation of the gradient index R, contrast and elongation factor in step 804.
This is shown in more detail in FIG. 7, where in an ROI 700 a gradient 701 is calculated at a point 702 in a suspected lesion 703 having a center point (x,y). The pixels in the area enclosed by ~AshP~ line 704 are those pixels that do not contribute much to the gradient index (the gray value varies more towards the edge of the suspected lesion), and may be excluded.
The radial gradient index R, defined as follows:
WO 96/27846 PCT/US96/02439
METHOD AND SYSTEM FOR THE DETECTION OF LESIONS IN MEDICAL IMAGES
BACKGROUND OF THE INVENTION
Field of the Invention
The invention relates generally to a method and system for improved computerized, automatic detection and characterization of lesions in medical images, and more particularly to the detection of circumscribed masses in digital mammograms. Novel techniques in the localization (segmentation) and detection of masses in mammograms include initial processing with peripheral equalization (correction), a modified median filter, a modified morphological open operation, filtering with a modified mass filter for the initial detection of circumscribed densities, matching using a deformable shape template with Fourier descriptors, optimization of the match using simulated annealing, and measuring the circularity and density characteristics of the suspected lesion to distinguish true positives from false positives and malignant lesions from benign lesions. The procedure is performed iteratively at different spatial resolutions, in which at each resolution step a specific lesion size is detected. The detection of the mass leads to the localization of a suspicious region and thus an assessment of the likelihood of cancer.
Discussion of the Background
Although mammography is currently the best method for the detection of breast cancer, between 10-30% of women who have breast cancer and undergo mammography have negative mammograms. In approximately two-thirds of these false-negative mammograms, the radiologist failed to detect the cancer that was evident retrospectively. The missed detections may be due to the subtle nature of the radiographic findings (i.e., low conspicuity of the lesion), poor image quality, eye fatigue or oversight by the radiologists. In addition, it has been suggested that double reading (by two radiologists) may increase sensitivity. It is apparent that the efficiency and effectiveness of screening procedures could be increased by using a computer system, as a "second opinion" or "second reading", to aid the radiologist by indicating locations of suspicious abnormalities in mammograms. In addition, mammography is becoming a high-volume x-ray procedure routinely interpreted by radiologists.
If a suspicious region is detected by a radiologist, he or she must then visually extract various radiographic characteristics. Using these features, the radiologist then decides if the abnormality is likely to be malignant or benign, and what course of action should be recommended (i.e., return to screening, return for follow-up or return for biopsy). Many patients are referred for surgical biopsy on the basis of a radiographically detected mass lesion or cluster of microcalcifications. Although general rules for the differentiation between benign and malignant breast lesions exist, considerable misclassification of lesions occurs with current radiographic techniques. On average, only 10-20% of masses referred for surgical breast biopsy are actually malignant. Thus, another aim of computer use is to extract and analyze the characteristics of benign and malignant lesions in an objective manner in order to aid the radiologist by reducing the number of false-positive diagnoses of malignancies, thereby decreasing patient morbidity as well as the number of surgical biopsies performed and their associated complications.
SUMMARY OF THE INVENTION
Accordingly, an object of this invention is to provide a method and system for detecting, classifying, and displaying masses in medical images of the breast.
Another object of this invention is to provide an automated method and system for the detection and/or classification of masses based on a multi-resolution analysis of mammograms.
Another object of this invention is to provide an automated method and system for the detection and/or classification of masses based on a modified morphological open operation, filtering with a modified mass filter for the initial detection of circumscribed densities, matching using a deformable shape template with Fourier descriptors, optimization of the match using simulated annealing, and measuring the circularity and density characteristics of the suspected lesion.
These and other objects are achieved according to the invention by providing a new and improved automated method and system in which a segmentation of densities (masses) within a mammogram is performed followed by optimal characterization.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
FIGS. 1A-1C are schematic diagrams illustrating embodiments of the automated method for the detection of lesions according to the invention;
FIG. 2 is a graph illustrating the step of peripheral enhancement according to the invention;
FIG. 3A is a schematic diagram of the modified median filtering according to the invention;
FIG. 3B is a schematic diagram of the modified morphological open operation according to the invention;
FIGS. 3C and 3D are graphs illustrating the criteria used in the modified morphological open operation of FIG. 3B;
FIG. 4 is a diagram illustrating the circular kernel used in the modified mass filter;
FIG. 5 is a diagram illustrating a gradient vector in the modified mass filtering;
FIG. 6 is a diagram illustrating examples of the deformable templates corresponding to the possible shapes assigned to localized densities from the Fourier descriptors analysis;
FIG. 7 is a diagram of calculating a gradient in a region of interest;
FIG. 8 is a schematic diagram illustrating the analysis of a suspected lesion;
FIGS. 9A and 9B are tables illustrating the relationship between pixel size of the image and the lesion size being detected, and the relationship between kernel size and the lesion size being detected, respectively;
FIG. 10 is a schematic diagram of the changing of the kernel size in mass filtering;
FIG. 11 is a diagram of two detected lesions;
FIGS. 12A-12F illustrate examples of (12A) an original mammogram, (12B) after border segmentation, (12C) after the modified open operation, (12D) after the modified mass filter, (12E) after template matching and (12F) after feature extraction;
FIGS. 13A-13F illustrate examples of (13A) a mammogram after peripheral enhancement, (13B) after morphological filtering, (13C) an image of the difference of the images of FIGS. 13A and 13B, and (13D-13F) after morphological filtering with pixel sizes of 1, 2 and 4 mm;
FIGS. 14A-14C illustrate (14A) an artificial lesion, (14B) its detection results, and (14C) the edge maps used in the detection;
FIG. 15A shows the location of a region of interest (ROI) used for feature analysis;
FIGS. 15B-15D show an enlargement of the ROI of FIG. 15A, a truth margin, and detection results, respectively;
FIG. 16 is a graph illustrating the performance of the method in the detection of malignant lesions in a screening mammographic database; and
FIG. 17 is a schematic block diagram illustrating a system for implementing the automated method for the detection of lesions in medical images.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now to the drawings, and more particularly to FIGS. 1A-1C thereof, schematic diagrams of the automated method for the detection and classification of lesions in breast images are shown. In FIG. 1A a first embodiment of the overall scheme includes an initial acquisition of a mammogram and digitization (step 100). Next, the breast border is segmented from the rest of the image area (step 101) and peripheral density enhancement is performed on the image (step 102). The image is processed (step 103) and then subjected to a modified morphological open operation using different filter sizes (steps 104-106). The image after the open operation is mass filtered (steps 107-109) and template matched (steps 110-112). Feature extraction is then performed (step 113) followed by integration (step 114) and classification (step 115) of the detected lesions.
The method of detecting circumscribed masses according to the invention uses an automatically segmented mammographic image indicating only the actual breast region (step 101) after an optional application of the peripheral density equalization (step 102). Segmentation of a mammogram is described in application Serial No. 08/158,320 to Bick et al., the disclosure of which is herein incorporated by reference.
In the segmentation process, noise filtering is applied to the digital mammogram followed by application of the gray-value range operator. Using information from the local range operator, a modified global histogram analysis is performed.
Region growing is performed on the threshold image using connectivity (counting pixels), followed by a morphological erosion operation. The distance map of the image is determined and the boundary of the segmented object (breast) in the image is then tracked to yield its contour. The contour can then be output onto the digital image or fed to other computer algorithms.
Note that there is an inverse relationship between gray level and optical density. A low optical density (white region) on the mammogram (high anatomic density) corresponds to a high gray level (1023), whereas a high optical density (black region) on the mammogram corresponds to a low gray level (0).
The image after segmenting can be processed (step 103) or peripheral density enhancement can be performed. Peripheral density enhancement is described in application Serial No.
08/158,320. The average gray values of the pixels are determined as a function of distance from the breast border. An enhancement curve is determined by fitting, such as polynomial fitting, a curve to the average gray values as a function of distance, and then reversing the fit. The enhancement curve is added to the curve of the average gray values as a function of distance to produce an enhanced gray value curve. This results in a peripherally enhanced image where the center and the portion near the border are simultaneously displayed without loss in contrast. FIG. 2 shows the curve of the average gray values as a function of distance, the reversed fitted curve and the peripherally enhanced curve.
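The enhancement-curve construction described above can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the function name and polynomial degree are assumptions, and "reversing the fit" is interpreted here as flipping the fitted falloff so that the correction is largest at the border and approaches zero toward the center.

```python
import numpy as np

def peripheral_enhancement(avg_gray, degree=3):
    """Sketch of the peripheral-density enhancement curve.

    avg_gray: average gray value at each distance from the breast
    border (index 0 = border, where the breast is thin and gray
    levels are low). A polynomial is fit to the falloff; the fit is
    reversed (flipped) to form the enhancement curve, which is then
    added back to the measured curve.
    """
    d = np.arange(len(avg_gray), dtype=float)
    fit = np.polyval(np.polyfit(d, avg_gray, degree), d)
    # Reversed fit: largest correction at the border (smallest fit value).
    enhancement = fit.max() - fit
    return avg_gray + enhancement
```

Applied to a profile that rises away from the border, the correction lifts the peripheral values so that border and center display in a comparable gray-level range.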
The segmented image, with or without peripheral density enhancement, is then optionally processed (step 103). An initial modified median filter of size n×n may be used to eliminate isolated aberrant (very dark, low gray level) pixel values in the segmented image, since these would disturb the erosion step. The modified median filtering is shown in FIG. 3A. The median filter can be of 3x3 size, for example. The conventional median filter is described in, for example, "The Image Processing Handbook," 2nd Ed., by John Russ (CRC Press 1995).
At a beginning pixel location I(x,y) in the image (step 300), which can be either the segmented or the peripherally enhanced segmented image, the local minimum is determined (step 301) in the surrounding neighborhood (n×n pixels). If the gray level at pixel location I(x,y) is smaller than the local minimum by a certain number of gray levels (M) in step 302, then that gray level is corrected by the median filter (step 303). An example of M is 5 gray levels, but other values are possible. In the embodiment, the gray level of the pixel at I(x,y) is updated to the median pixel value of the neighborhood.
It is checked whether the pixel is the last pixel for processing (step 304). If not, the next pixel is selected (step 305) and step 301 is repeated. If the answer in step 302 is no, the process moves to step 304. When the last pixel location is reached (step 304), the filtering is completed for all of the pixels in the image.
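The modified median filter of steps 300-305 can be sketched as follows. The function name is an assumption; the logic follows the text: a pixel is replaced by the neighborhood median only when it is more than M gray levels below the local minimum of its surrounding pixels.

```python
import numpy as np

def modified_median_filter(img, n=3, M=5):
    """Median-filter only isolated aberrant (very dark) pixels.

    A pixel is replaced by the median of its n x n neighborhood only
    when its gray level is more than M gray levels below the local
    minimum of the neighborhood (the pixel itself excluded), so normal
    image structure is left untouched.
    """
    out = img.copy()
    h, w = img.shape
    r = n // 2
    for y in range(r, h - r):
        for x in range(r, w - r):
            nb = img[y - r:y + r + 1, x - r:x + r + 1].astype(int)
            center = nb[r, r]
            # Neighborhood values without the center pixel.
            others = np.delete(nb.ravel(), (n * n) // 2)
            if center < others.min() - M:
                out[y, x] = int(np.median(others))
    return out
```

A uniform region with a single very dark pixel is repaired, while a pixel only slightly below its neighbors is left alone.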
Two criteria are then used to control which pixels are used as seed pixels for the morphological operation, to preserve the gray value characteristics of larger lesions as far as possible. As shown in FIG. 3B, beginning at pixel location I(x,y) (step 310), a check is made to determine whether pixel I(x,y) is a seed pixel. The local maximum of the neighborhood is calculated (step 311).
To qualify as a seed pixel the following criteria must be fulfilled. First, there must be a negative Laplacian: the gray value of the pixel in question minus the local minimum gray value must be less than the local maximum gray value minus the gray value of the pixel in question (step 312). This, as demonstrated in FIGS. 3C and 3D, prevents erosion of the center of a small mass. In FIG. 3C the gray value I(x,y) = MAX, so no change is made to the pixel value and the center is preserved. In FIG. 3D, I(x,y)-MIN < MAX-I(x,y), so the gray value of the pixel at location I(x,y) is changed.
Second, only pixels with a small distance from the local minimum are used as erosion centers (step 313). That is, the location of the seed pixel must be close to the location of the local MIN. This preserves the gray value slope in the periphery of larger lesions. An example of the distance is 3 pixels, and other values may be chosen.
If the answer is no at either of steps 312 and 313, the next pixel is selected (step 314) and the process is repeated.
For those pixels which qualify as a seed pixel, the morphological open operation is performed (step 315).
The morphological open (erosion followed by dilation) shown in FIG. 3B is performed on the segmented image, with or without modified median filtering, as shown in FIG. lA. The morphological open operation is also described in Russ, supra.
The erosion processing alone can be performed, omitting the dilation procedure. The main effect of the erosion is smoothing of the image while keeping lesions that are of interest. The main effect of the dilation is to return masses to roughly their original size. The dilation is optional.
The structuring element in the embodiment for the morphological operation is a circle with a diameter of 7 pixels, e.g., for a pixel size of 0.5 mm. This structuring element eliminates small circular and thin linear structures up to a diameter of 3.5 mm (for a 0.5 mm pixel size). If larger structuring elements are used, the subsequently used mass filter size is changed (as discussed below). At the same time, irregular densities are rounded by this process.
This morphological operation is different from the conventional operation in the sense that a threshold E is used to control how much structure is eroded. If the difference, i.e., the gray level value of a pixel in the image prior to the morphological operation, I(x,y), minus the gray level value after the morphological operation, P(x,y), is larger than the threshold E (step 316), then the gray level value of the pixel is replaced by the output of the morphological operation (step 317). Examples of E can range from 0-10 in terms of gray levels. When dilation is performed, if the gray level after dilation exceeds the original gray level of the pixel, the original gray level value is used for the pixel. This is repeated for all pixels.
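The threshold-controlled open operation can be sketched as below. This is a simplified illustration under stated assumptions: the seed-pixel tests of FIG. 3B are omitted for brevity, a square boolean mask stands in for the circular structuring element, and all function names are hypothetical.

```python
import numpy as np

def grey_erode(img, se):
    """Grayscale erosion: each pixel becomes the minimum over the
    structuring element (a boolean mask) centered on it."""
    r = se.shape[0] // 2
    pad = np.pad(img, r, mode='edge')
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = pad[y:y + se.shape[0], x:x + se.shape[1]][se].min()
    return out

def grey_dilate(img, se):
    """Grayscale dilation: maximum over the structuring element."""
    r = se.shape[0] // 2
    pad = np.pad(img, r, mode='edge')
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            out[y, x] = pad[y:y + se.shape[0], x:x + se.shape[1]][se].max()
    return out

def modified_open(img, se, E=5):
    """Open (erode, then dilate), but replace a pixel only when the
    open lowers it by more than E gray levels; after dilation the
    original gray level is never exceeded."""
    opened = grey_dilate(grey_erode(img, se), se)
    out = np.where(img - opened > E, opened, img)
    return np.minimum(out, img)
```

With E = 5, an isolated bright pixel 20 gray levels above its surround is flattened, while a pixel only 3 levels above is preserved, which is exactly the selectivity the threshold is meant to provide.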
Referring to FIG. 1A, the morphological step is performed at different image resolutions. For example, resolution 1 (step 104) can use an image having a 0.5 mm pixel size, with the image being 512x512 pixels. The process is repeated in parallel for images having 1 mm, 1.5 mm, 2.5 mm, etc. pixel size, with a corresponding decrease in image size as the pixel size increases (for 1.0 mm pixel size, the image is 256x256, etc.).
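One common way to build such a multi-resolution set is block averaging, where each factor x factor block of pixels becomes a single pixel at the coarser resolution. The patent does not specify the resampling method, so this sketch is an assumption for illustration only.

```python
import numpy as np

def downsample(img, factor):
    """Block-average downsampling: each factor x factor block becomes
    one pixel, so a 0.5 mm / 512x512 image becomes a 1.0 mm / 256x256
    image at factor 2, a 2.0 mm / 128x128 image at factor 4, etc."""
    h, w = img.shape
    h2, w2 = h // factor, w // factor
    img = img[:h2 * factor, :w2 * factor]   # trim to a whole number of blocks
    return img.reshape(h2, factor, w2, factor).mean(axis=(1, 3))
```

Each coarser image can then be fed through the same open / mass-filter / template-matching chain to target a larger lesion size.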
The process can also be conducted serially with a change in the resolution for each iteration. A second embodiment of the method according to the invention is shown in FIG. 1B.
After steps 100-103, the morphological operation is performed at a beginning resolution (step 104), followed by mass filtering (step 107) and template matching (step 110). The image resolution is changed in step 116 and the results of the matching are stored in step 117. It is then determined whether the maximum resolution has been exceeded (step 118). If not, the process is repeated at the new resolution. If so, feature extraction, integration and classification (steps 113-115) are performed the same as in FIG. 1A.
FIG. 1C shows a third embodiment of the invention. The method shown in FIG. 1C differs from the method shown in FIG. 1B in that a thresholding operation 119 is performed using the output of the mass filtering step. The mass filtered image identifies areas suspected of containing a lesion that can be further processed by gray-level thresholding. After thresholding, the image with the remaining suspected lesions is input to step 113 for feature analysis, followed by steps 114 and 115, as in the method of FIG. 1B.
FIG. 4 is a diagram illustrating the circular kernel used in the mass filter. For detection of circumscribed densities a mass filter with a circular base is used (this mass filter is a modified IRIS filter; for a description of the IRIS filter see Kobatake et al., CAR 1993: pp 624-629). The kernel is ring-shaped (pixels 402) around a center pixel 400.
Note in this kernel that the center pixel locations 401 are absent since they would not contribute useful values to the overall filter value (as described below). A ring-shaped filter rather than just a solid circular filter is thus used.
The mass filter value is based on the local gradient (in the embodiment a 7x7 kernel is used) in the x- (Dx) and y- (Dy) directions. Differences from the description of the IRIS filter in Kobatake et al. include the use of a ring-shaped filter, the second derivative instead of the gradient, and edge orientation bins. Gradient values smaller than a gradient threshold (e.g., 10) are not used in the calculation of the filter value.
The edge orientation at a specific image point is equivalent to the gradient vector, and the edge strength is calculated as the second derivative in the edge orientation. FIG. 5 shows a gradient 500 at point 501. This assures that regions with a constant gradual slope do not contribute to the mass filter value. The gradient is oriented at an angle φ relative to a radial line from point 501 to point (x,y). The filter value is calculated separately for a specific number of edge orientation bins, such as 16 (B1, B2...B16). Orientation bins are radial sectors of the circular area. For example, each of 16 bins would cover an angle of π/8. A bin 502, shown for a sector of π/8, is made of the pixels 402 between lines 503.
The calculation for a given pixel location (x,y) is given for each orientation bin by:

f(B_i) = (1/N) Σ_{P∈K} [max(0, cos φ) · EdgeStrength(P)]

where:
f(B_i)  filter value for edge orientation bin i
K       filter kernel
P       neighbor point in K
N       number of points in K
φ       angle between the gradient vector and the connection line from the center point to the neighbor point

Edge strength is obtained from the second derivative at P calculated in the edge orientation. The final filter value is calculated as the sum of the individual orientation bins, where a specified number of bins j, for example 4, with the highest values are ignored. That is:

filter value at pixel I(x,y) = Σ_i f(B_i), for B_i not among the j highest bins.

This prevents an influence of straight edges (e.g., the pectoralis muscle border) on the filter value, since all points along such an edge fall within the same orientation bin, without changing the filter value for ideal circular lesions.
W 096/27846 PCTrUS96/02439 Usually the filter value is highest in the center of a lesion. The highest filter values are found for round or slightly oval shaped lesions. The neighborhood used in calculation of the filter value is empirically det~ ; n~ to be around 10 pixels (outer radius); this could be increased to improve the detection of oval shaped masses. In addition, a gradient threshold can be employed so that pixels in the neighborhood that have a gradient smaller than the threshold (e.g., 10) do not contribute to the calculation of the filter value.
The image outputted by the mass filter is then subjected to template matching. Local ~ir~ of the filter value define potential center points of mass lesions, which are used in steps 111-113, the matching of a deformable template on to the lesion border. The edges of the suspect lesion can be obtained from the derivative or second derivative of the image output from the mass filtering. The deformable shape template is defined using Fourier descriptors. An initial shape is selected and the Fourier descriptors are varied to dynamically fit the shape of the lesion. Fourier descriptors are described in, for example Arbter et al., Application of Affine-invariant Fourier Descriptors to Recognition of 3-D
Objects, IEEE Trans. Pattern Analysis Machine Intelligence 12:640-647 (1990); Kuhl et al., Elliptic Fourier Features of a Closed Contour, Computer Graphics Image Processing 18:236-258 (1982); Wallace et al., An Efficient Three-dimensional Aircraft Recognition Algorithm Using Normalized Fourier Descriptors, ibid., 13:99-126 (1980); Granlund, Fourier W 096/27846 PCTrUS96/02439 Preprocessing for ~and Print Character Reco~nition~ IEEE
Trans. Computers 21:195-201 (1972~; Zahn et al., Fourier Descriptors for Plane Closed Curves. ibid., 21:269-281 (1972);
Cri i nC ~ A Complete Set of Fourier Descriptors for 'hwo-~; ~n.cional Shape, IEEE Trans. Sys. Man Cybernetics 12:848-855 (1982); Persoon et al., Shape Discrimination Using Fourier Descriptors, ibid., 7:170-179 (1977); and Richard et al., Identification of Three-dimensional Objects Using Fourier Descriptors of the Boundary Curve, ibid., 4:371-378 (1974).
In the template match; ng step the object contour is generated as an inverse Fourier transform of a limited number of complex Fourier terms. The following relationship exists between a closed planar curve g(l) and Fourier descriptors Ck:
g(1) planar curve and Fourier descriptors:
c~ = jck¦ei~ = (1/L)¦ g(l)e~~ dl o where:
g(l) is a planar curve with a runlength l; the real of part of g = x coordinate, the imaginary part of g = y coordinate Ck Fourier descriptors with -N/2 S k S N/2, N ~ ~
For further characterization, a rectangular ROI containing the suspected lesion identified in the open, mass filtering and template matching operations is extracted from the original peripheral density enhanced image. Feature extraction and analysis is performed on the suspected lesion.
Feature extraction is described in application Serial No.
08/158,389 to Giger et al., the disclosure of which is herein incorporated by reference.
This is shown in more detail in FIG. 8. The suspected lesion from the template matching is obtained (step 800).
Note that a suspected lesion from another method, identified by a computer or manually by an observer, can also be used as input (step 801). A region of interest (ROI) containing the suspected lesion is selected (automatically or manually) in step 802. The gradient and orientation of the ROI are calculated in step 803, followed by a calculation of the gradient index R, contrast and elongation factor in step 804.
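The five-term contour generation can be sketched directly from the descriptors above. This is a reconstruction for illustration: the function name, the exact phase conventions, and the assignment of the asymmetry term to C2 are assumptions consistent with the stated axis lengths (long axis s + sP1, short axis s - sP1).

```python
import numpy as np

def template_contour(x, y, s, theta, P1, P2, n_points=64):
    """Generate a deformable-template contour from the Fourier
    descriptors C_-2..C_2 described above.

    x, y   center position; s size; theta orientation;
    P1     short/long axis ratio parameter (P1 = 0 gives a circle);
    P2     asymmetry (kidney-shape) parameter.
    """
    t = 2 * np.pi * np.arange(n_points) / n_points
    c = {0: x + 1j * y,                     # position
         1: s * np.exp(1j * theta),         # main ellipse, with C_-1
         -1: s * P1 * np.exp(1j * theta),   # long axis s+sP1, short s-sP1
         2: s * P2 * np.exp(1j * theta),    # asymmetry term
         -2: 0.0}
    # Inverse Fourier transform: g(t) = sum_k C_k e^{ikt}.
    g = sum(ck * np.exp(1j * k * t) for k, ck in c.items())
    return g.real, g.imag                   # contour x- and y-coordinates
```

With P1 = P2 = 0 the contour degenerates to a circle of radius s about (x, y), the special case noted in the parameter list.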
The lesion contour is generated by variation of the Fourier terms within a certain range with minimization of a cost function using lesion contrast, edge strength and deviation from the ideal circular shape. This process is performed on the output from the mass filter. Simulated annealing is used for minimization.
Simulated annealing is a technique for optimization which involves a description of possible system configurations, a generator of random changes in the configuration (i.e., the "options"), a function for minimization and a control parameter (temperature) that controls the increments of the random changes. It is described in, for example, Numerical Recipes by Press et al., Cambridge Press (1988).
The configuration in the embodiment is the "correct"
Fourier descriptor of an extracted contour. This configuration could be obtained as an entire curve or in radial segments of the curve using different Fourier descriptors for each segment. Once "fit", the inverse of the Fourier descriptors is performed, yielding the contour. With the radial segments, only a limited number of points are generated in the inverse transformation. The changes in the configuration (i.e., the contour shape, that is, the Fourier descriptor coefficients Ck) are made by changes in the center location, the size of the "lesion", the orientation (θ), the long/short axis ratio (indicating the degree of being oval) and the degree of asymmetry. The method limits the changes in the Fourier descriptors to these. Examples of the range of variation for each parameter include increments in center position by one pixel, a size range of 5 to 80 pixels in diameter with an increment of 2 pixels, and a range in θ
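The annealing loop described above can be sketched generically: random configuration changes, Metropolis-style acceptance, and a downward temperature step after a fixed number of iterations. The function name, cooling factor and default schedule are illustrative assumptions, not the patent's exact values.

```python
import numpy as np

def anneal(cost, state, neighbor, t0=30.0, n_iter=500,
           cool_every=50, cool=0.9, rng=None):
    """Generic simulated annealing.

    cost(state) -> value to minimize; neighbor(state, rng) -> a random
    configuration change. Worse states are accepted with probability
    exp(-delta/t); the temperature steps down every cool_every
    iterations. States are treated as immutable (neighbor returns a
    new one).
    """
    rng = rng or np.random.default_rng(0)
    t = t0
    best, best_c = state, cost(state)
    cur, cur_c = best, best_c
    for i in range(1, n_iter + 1):
        cand = neighbor(cur, rng)
        c = cost(cand)
        if c < cur_c or rng.random() < np.exp((cur_c - c) / t):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = cand, c
        if i % cool_every == 0:
            t *= cool      # downward temperature step
    return best, best_c
```

In the patent's setting, the state would bundle center, size, angle and the axis-ratio/asymmetry parameters, with the neighbor function applying the per-parameter increments (one pixel in center, 2 pixels in diameter, etc.).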
In the template matching the following can be varied: the shape in terms of Fourier descriptors, the penalty factor for deviations from the mean, the center, the angle and the size.
The penalty factor is a measure based on standard deviation, i.e., a limit on the amount of deformation during the template matching. An example of the parameter file used in the deformable template matching is shown below:
-- a shape file giving which part of the curve is used
-- start temperature for the simulated annealing
-- number of iterations
-- increments, such as for incrementing the center position, the size and the angle during the simulated annealing
-- number of points generated in the inverse transformation.
Note that after the matching is successful, the final coefficients of the Fourier descriptors are used to return to the x,y domain. Thus, discontinuous margin pixels along a "mass" will be connected. The output of the template matching is a contour or a partial contour of the suspect lesion.
Sixteen edge maps can be used in the shape matching.
Edge maps are obtained from the second derivative, as described above. Edge maps are used since sometimes there is only one good edge in the suspected lesion. The lesion contour is generated by variation of the Fourier terms within a certain range with minimization of a cost function using lesion contrast, edge strength and deviation from the ideal circular shape. Simulated annealing is used for minimization.
In the matching the following can be varied: the shape in terms of Fourier descriptors, the penalty factor for deviations from the mean, the center, the angle and the size.
For further characterization, a rectangular ROI containing the suspected lesion identified in the open, mass filtering and template matching operations is extracted from the original peripheral density enhanced image. Feature extraction and analysis is performed on the suspected lesion.
Feature extraction is described in application Serial No.
08/158,389 to Giger et al., the disclosure of which is herein incorporated by reference.
This is shown in more detail in FIG. 8. The suspected lesion from the template matching is obtained (step 800).
Note that a suspected lesion from another method, identified by a computer or manually by an observer, can also be used as input (step 801). A region of interest (ROI) containing the suspected lesion is selected (automatically or manually) in step 802. The gradient and orientation of the ROI are calculated in step 803, followed by a calculation of the gradient index R, contrast and elongation factor in step 804.
This is shown in more detail in FIG. 7, where in an ROI 700 a gradient 701 is calculated at a point 702 in a suspected lesion 703 having a center point (x,y). The pixels in the area enclosed by dashed line 704 are those pixels that do not contribute much to the gradient index (the gray value varies more towards the edge of the suspected lesion), and may be excluded.
The radial gradient index R is defined as follows:

R = [ Σ(P in L) cosφ · sqrt(Dx² + Dy²) ] / [ Σ(P in L) sqrt(Dx² + Dy²) ]

where:
R      radial gradient index, -1 ≤ R ≤ 1
P      image point
(x,y)  center of suspected lesion from template matching
L      detected lesion excluding the center part
Dx     gradient in x-direction
Dy     gradient in y-direction
φ      angle between the gradient vector and the connection line from the center (x,y) to P
The radial gradient index is a measure of the circularity and density characteristics of the lesion. The radial gradient index approaches 1 for ideal circular lesions. This radial gradient index can be viewed as the average gradient in the radial direction normalized by the average gradient. The suspected lesion size is given by the difference between the gray level at the center and that at the margin of the suspected lesion.
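As a concrete illustration, the radial gradient index can be computed as a gradient-weighted average of cosφ over the lesion pixels. This is a sketch under assumptions, not the patented implementation: it uses NumPy finite-difference gradients, a boolean lesion mask standing in for the template-matched region, and a sign convention in which gray levels increasing away from the center give R near 1.

```python
import numpy as np

def radial_gradient_index(image, cx, cy, mask):
    """Radial gradient index R, -1 <= R <= 1; R approaches 1 for an
    ideal circular lesion whose gradients point along the radial direction.

    image : 2-D gray-level array
    cx, cy: center of the suspected lesion (from template matching)
    mask  : boolean array marking the detected lesion, excluding the
            center part (whose pixels contribute little, as in FIG. 7)
    """
    dy, dx = np.gradient(image.astype(float))   # Dy, Dx at every pixel
    ys, xs = np.nonzero(mask)
    rx, ry = xs - cx, ys - cy                   # connection line center -> P
    rnorm = np.hypot(rx, ry)
    gx, gy = dx[ys, xs], dy[ys, xs]
    gmag = np.hypot(gx, gy)                     # sqrt(Dx^2 + Dy^2)
    # cos(phi): angle between the gradient vector and the center-to-P line
    cos_phi = (gx * rx + gy * ry) / (gmag * rnorm + 1e-12)
    # average radial gradient normalized by the average gradient
    return float(np.sum(cos_phi * gmag) / (np.sum(gmag) + 1e-12))
```

On a synthetic radially symmetric blob the index comes out close to 1, consistent with the ideal-circle behavior described above.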
To limit the number of false positives, thresholding is performed in step 805. For example, lesions with a diameter less than some preset value (e.g., 4 mm), a contrast less than some preset value (e.g., 0.1 optical density) or a radial gradient index less than a preset value (e.g., 0.5) are eliminated.
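The rule-out step can be sketched as a simple predicate over the extracted features. The cutoff values below are the example values from the text; the feature names and the tuple representation of a candidate are hypothetical.

```python
def passes_thresholds(diameter_mm, contrast_od, rgi,
                      min_diameter_mm=4.0, min_contrast_od=0.1, min_rgi=0.5):
    """Step 805: a candidate survives only if its diameter, contrast
    (in optical density) and radial gradient index all meet the preset
    values (example cutoffs from the text)."""
    return (diameter_mm >= min_diameter_mm
            and contrast_od >= min_contrast_od
            and rgi >= min_rgi)

def threshold_candidates(candidates):
    """Keep only candidates passing all three rules; each candidate is a
    (diameter_mm, contrast_od, rgi) tuple."""
    return [c for c in candidates if passes_thresholds(*c)]
```

Candidates failing any single rule are eliminated, mirroring the "or" in the text.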
The features after thresholding are indicated in step 806 and can be merged using, for example, rule-based methods or an artificial neural network trained to detect and classify lesions (step 807) to eliminate further false positives or to distinguish between malignant and benign lesions.
Malignant and benign lesions will possess different R values if the malignant lesion is highly spiculated.
The open, mass filtering and template matching are performed repeatedly at different resolutions. In each resolution step a specific lesion size is detected. FIG. 9A
is a table illustrating the relationship between the pixel size of the image and the lesion size being detected. The number and size of the resolutions chosen depend upon the type of lesions to be detected and the amount of processing time available for detection.
The kernel size in the mass filtering can also be varied.
FIG. 9B is a table showing the relationship between kernel size and the size of the lesion being detected. In the embodiments described above, a single mass filter can be chosen for the different resolutions of the open filter. In a modification of these embodiments, the kernel size in the mass filtering can be varied, for example as shown in FIG. 9B.
The modified mass filtering step is shown in FIG. 10. The image resolution is kept constant while the kernel size is varied, the kernel size is kept constant while the image resolution is varied, or both can be varied.
In step 1000, the image from the morphological operation is obtained. The initial kernel size is set (step 1001) and the mass filtering is performed at the initial kernel size (step 1002). The image after mass filtering is stored (step 1003). Next, it is checked whether the maximum kernel size has been reached in step 1004. If not, then a new kernel size is selected (step 1005) and the mass filtering is performed again. After the last kernel size is used, the images are output (step 1006).
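The loop of FIG. 10 amounts to running the mass filter once per kernel size and collecting every output for later integration. In this sketch the mass filter itself is a caller-supplied function, since its internals are described elsewhere in the text; the function name is hypothetical.

```python
def multi_kernel_mass_filter(image, mass_filter, kernel_sizes):
    """Steps 1000-1006: apply `mass_filter(image, k)` at each kernel size
    and store every filtered image.

    image        : output of the morphological operation (step 1000)
    mass_filter  : callable implementing the mass filtering for kernel k
    kernel_sizes : sizes from the initial to the maximum kernel (FIG. 9B)
    """
    outputs = []
    for k in kernel_sizes:                           # steps 1001, 1004, 1005
        outputs.append((k, mass_filter(image, k)))   # steps 1002, 1003
    return outputs                                   # step 1006: output images
```

The same scaffold covers the variants described above: varying the kernel size at fixed resolution, varying the resolution at fixed kernel size, or both.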
After feature analysis has been performed, in step 116 of FIG. 1A the different detected lesions from all of the outputs obtained from different resolution images, different size kernels, or both, are integrated. Locations indicating the same lesion may show up in more than one image.
If two lesions overlap, the lesion with the smaller radial gradient index is eliminated. The amount of acceptable overlap can be varied by specifying the percent of overlap allowed. In this embodiment, 30% was chosen, but other values can be used. Referring to FIG. 11, two lesions 1100 and 1101 are shown. The smaller lesion, having the larger radial gradient index, is kept.
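The integration step can be sketched by modelling each detection as a circle and discarding, among overlapping pairs, the one with the smaller radial gradient index. The 30% default mirrors the text; the `(x, y, radius, rgi)` representation and the circle-overlap test (exact lens area over the smaller circle's area) are assumptions for illustration.

```python
import math

def _overlap_fraction(c1, c2):
    """Area of intersection of two circles divided by the smaller circle's area."""
    (x1, y1, r1), (x2, y2, r2) = c1, c2
    d = math.hypot(x2 - x1, y2 - y1)
    if d >= r1 + r2:
        return 0.0                             # disjoint circles
    rmin = min(r1, r2)
    if d <= abs(r1 - r2):
        return 1.0                             # one circle inside the other
    # standard circle-circle lens area
    a1 = r1 * r1 * math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
    a2 = r2 * r2 * math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
    tri = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                          * (d - r1 + r2) * (d + r1 + r2))
    return (a1 + a2 - tri) / (math.pi * rmin * rmin)

def integrate_detections(lesions, max_overlap=0.30):
    """Step 116: of any pair overlapping by more than max_overlap, keep
    the detection with the larger radial gradient index.

    lesions: (x, y, radius, rgi) tuples from all resolutions and kernels.
    """
    kept = []
    for les in sorted(lesions, key=lambda l: l[3], reverse=True):
        if all(_overlap_fraction(les[:3], k[:3]) <= max_overlap for k in kept):
            kept.append(les)
    return kept
```

Sorting by R first guarantees that when two detections conflict, the lower-R one is the one eliminated, as in FIG. 11.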
FIGS. 12A-12F illustrate examples of (12A) an original mammogram, (12B) after border segmentation, (12C) after the modified open operation, (12D) after the mass filtering, (12E) after template matching and (12F) after feature extraction in which the suspect lesions are prioritized by number (with one being the most suspicious). In this case lesion 1 was an intramammary lymph node with a radial gradient index of 0.92, lesion 2 was a 10 mm invasive ductal cancer (R = 0.90), lesion 3 was a 7 mm invasive ductal cancer (R = 0.85), and lesions 4 through 7 were false positives with R ranging from 0.78 to 0.52.
In FIG. 12D the suspected lesions are evidently highlighted, allowing their extraction through thresholding as described above. FIG. 12E contains many contrast features not evident from a visual inspection of FIG. 12D. The template matching is sensitive to subtle variations in the mass-filtered image.
FIGS. 13A-13F show examples of a mammogram (13A) after peripheral enhancement and (13B) after morphological filtering with a pixel size of 0.5 mm. FIG. 13C shows the difference image of FIG. 13A minus FIG. 13B, illustrating the small-detail, non-lesion-like structures that are eliminated by the morphological operation. The effect of morphological operations with different pixel sizes is shown in FIGS. 13D-13F for pixel sizes of 1.0 mm, 2.0 mm and 4.0 mm, respectively.
FIGS. 14A-14C illustrate (14A) an artificial ideal spherical lesion and (14B) its detection results. FIG. 14C
shows the 16 directional edge maps used in the method. The 16 edge maps correspond to 16 equal radial sectors making up the circular lesion. Other numbers of edge maps can be chosen.
FIG. 15A shows the location of the ROI used for feature analysis within the original mammogram after peripheral enhancement. FIGS. 15B-15D show enlargements of the ROI, the truth margin as marked by a radiologist, and the detection result for lesions 1 and 2 from FIG. 12F, respectively.
FIG. 16 is a graph illustrating the performance of the method in the detection of malignant lesions in a screening mammographic database in terms of an FROC (free-response operating characteristic) curve. For this performance evaluation, 45 invasive cancers less than 10 mm in size were used.
FIG. 17 is a schematic block diagram illustrating a system for implementing the automated method for the detection of lesions in medical images. The system of FIG. 17 operates and carries out functions as described above. A data input device 1700, such as an x-ray mammography device with a laser scanner and digitizer, produces a digitized mammogram. The digitized mammogram is segmented by segmenting circuit 1701 and then input to a peripheral enhancement circuit 1702 or sampling circuit 1703. Either the digitized mammogram or the peripherally enhanced image is sampled by the sampling circuit 1703 (to select a pixel size) and then optionally processed by a modified median filter 1704. Either the output of the sampling circuit 1703 or the filter 1704 is input to and processed by morphological circuit 1705. The output of circuit 1705 is fed to mass filter circuit 1706 for mass filtering. Next, the mass-filtered image is fed to a Fourier descriptors generating circuit 1707, edge images generating circuit 1708 and simulated annealing circuit 1709 for template matching. The image(s) are then fed to a feature analysis circuit 1710 for feature extraction and analysis. Memory 1711 is available to store images. The features are merged for classification and integration in feature merging circuit 1712, and can be displayed on display 1713, such as a video display terminal.
The images can also be transferred from memory 1711 via transfer circuit 1714 to a feature analysis circuit 1715 to perform feature extraction and analysis. The features are fed to a rule-based circuit or neural network 1716 to perform detection and classification of lesions. Superimposing circuit 1717 allows the detected lesions to be displayed on the images.
The elements of the system of FIG. 17 can be carried out in software or in hardware, such as a programmed microcomputer. The neural network can also be carried out in software or as a semiconductor layout.
Obviously, numerous modifications and variations of the present invention are possible in light of the above technique. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein. Although the current application is focused on the detection and classification of mass lesions in mammograms, the concept can be expanded to the detection and classification of abnormalities in other organs of the human body, such as the lungs and the liver.
Claims (55)
PATENT IS:
1. A method for the automated detection of mass lesions in mammographic images, comprising:
generating a mammogram;
segmenting said mammogram to produce a segmented mammogram;
performing a modified morphological operation on said segmented mammogram;
performing mass filtering;
performing template matching; and detecting whether a lesion is present in said mammogram.
2. A method as recited in Claim 1, wherein performing said morphological open operation comprises performing a modified morphological operation using a plurality of filter sizes.
3. A method as recited in Claim 1, wherein segmenting said mammogram comprises segmenting a breast border from said mammogram.
4. A method as recited in Claim 1, wherein segmenting said mammogram comprises:
performing noise analysis on said mammogram;
applying a gray value range operator;
performing global gray-level histogram analysis to provide a thresholded image;
region growing using said thresholded image;
performing a morphological erosion operation; and determining a distance map.
5. A method as recited in Claim 1, comprising:
peripherally enhancing said segmented image to produce an enhanced image; and performing said morphological operation using said enhanced image.
6. A method as recited in Claim 5, wherein said peripherally enhancing said segmented image comprises:
determining average gray-level values of pixels as a function of distance from a border of a breast in said mammogram;
fitting an enhancement curve to said gray-level values;
reversing a fit of said enhancement curve to obtain a second enhancement curve; and adding said second enhancement curve to a curve of said average gray-level values as a function of distance.
7. A method as recited in Claim 1, comprising:
processing said segmented image to produce a processed image; and performing said morphological operation using said processed image.
8. A method as recited in Claim 7, wherein processing said segmented image comprises:
obtaining a gray-level value in said segmented image at a first pixel;
determining a local gray-level minimum in said segmented image in a neighborhood of pixels disposed around said first pixel;
substituting said local gray-level minimum for said gray level value of said first pixel when a first difference between said minimum and said gray-level value is greater than a first gray-level threshold value; and repeating said obtaining, determining and substituting steps for selected pixels of said segmented image to obtain a processed image.
9. A method as recited in Claim 8, comprising:
determining whether a pixel in said processed image is a seed pixel; and performing said morphological operation using said seed pixel.
10. A method as recited in Claim 9, wherein determining whether said pixel is said seed pixel comprises:
determining a local gray-level maximum of said neighborhood;
a first step of determining whether a first difference between said gray-level value and said local gray-level minimum is less than a second difference between said local gray-level maximum and said gray-level value; and a second step of determining whether said pixel is within a predetermined distance of a pixel having said local minimum gray-level value.
11. A method as recited in Claim 9, comprising:
performing said morphological operation at a selected seed pixel in said segmented image;
determining whether a second difference of a first gray level value of said selected seed pixel before performing said morphological operation and a second gray-level value of said selected seed pixel after performing said morphological operation is less than a second gray-level threshold value; and replacing said first gray-level value with said second gray-level value when said second difference is greater than said second gray-level threshold value.
12. A method as recited in Claim 1, comprising:
performing said morphological operation, said mass filtering and said template matching serially at a plurality of morphological resolutions;
wherein said morphological operations performed at said plurality of resolutions are conducted in parallel.
13. A method as recited in Claim 1, comprising:
performing said morphological operation, said mass filtering and said template matching serially at a plurality of morphological resolutions;
wherein said morphological operations performed at said plurality of resolutions are conducted iteratively.
14. A method as recited in Claim 13, further comprising thresholding a mass filtered image obtained from said mass filtering.
15. A method as recited in Claim 1, further comprising:
thresholding a mass filtered image obtained from said mass filtering to obtain a threshold image; and using said threshold image in detecting said lesions.
16. A method as recited in Claim 15, wherein:
detecting said lesions comprises performing feature extraction; and said threshold image is used in said feature extraction.
17. A method as recited in Claim 1, wherein detecting said lesions comprises:
extracting features using a result of said morphological operation, said mass filtering and said template matching;
integrating said features; and classifying said features.
18. A method as recited in Claim 1, comprising:
performing said mass filtering after performing said morphological operation to produce a filtered image;
performing template matching using said filtered image;
performing thresholding using said filtered image; and detecting whether said lesion is present using said template matching and said thresholding.
19. A method as recited in Claim 1, wherein performing said mass filtering comprises:
using a ring-shaped kernel; and determining a filter value based upon a second derivative of edge orientation.
20. A method as recited in Claim 1, wherein performing said mass filtering comprises:
determining a filter value for a plurality of edge orientation bins.
21. A method as recited in Claim 20, comprising:
determining a final filter value as a sum of said filter values.
22. A method as recited in Claim 21, comprising:
determining said final filter value using a plurality of said orientation bins excluding a predetermined number of said orientation bins having highest values.
23. A method as recited in Claim 20, wherein determining said filter value comprises:
calculating said filter value using:

f(Bi) = (1/N) .SIGMA. [MAX(0, cos.PHI.) * EDGE STRENGTH (at P)]
        P in K

where:
f(Bi) is said filter value at an orientation bin;
K is a filter kernel;
P is a neighbor point in K;
N is a number of points in K; and
.PHI. is an angle between a gradient vector and a connection line between said neighbor point and a center point.
24. A method as recited in Claim 1, wherein said template matching comprises:
determining a deformable shape template; and fitting said shape template to a shape of a suspected lesion.
25. A method as recited in Claim 24, comprising:
determining said deformable shape template using Fourier descriptors; and varying said descriptors to dynamically fit said shape template to said shape of said suspected lesion.
26. A method as recited in claim 24, comprising:
determining said shape template as an inverse Fourier transform having a plurality of Fourier terms;
varying said Fourier terms to fit said shape template to said shape of said suspected lesion;
and outputting a contour of said suspected lesion.
27. A method as recited in Claim 26, wherein varying said Fourier terms comprises minimizing a cost function using simulated annealing.
28. A method as recited in claim 26, comprising:
varying a center location, a size, an orientation, a ratio of a long axis to a short axis, and a degree of asymmetry of said inverse Fourier transform.
29. A method as recited in claim 24, comprising:
determining a center of a suspected lesion as a maximum mass filter value;
determining edges of said suspected lesion as one of a derivative and a second derivative of said image after mass-filtering; and fitting said template using said center and said edges.
30. A method as recited in Claim 5, comprising:
selecting a region of interest containing said lesion;
and performing feature extraction using said region of interest.
31. A method as recited in Claim 30, comprising:
selecting said region of interest containing a lesion identified using template matching.
32. A method as recited in Claim 29, wherein performing feature extraction comprises:
determining a gradient of said region of interest;
determining an orientation of said region of interest;
determining a gradient index of said region of interest;
determining a contrast of said region of interest; and determining an elongation factor of said region of interest.
33. A method as recited in Claim 32, comprising:
thresholding at least one of said gradient, orientation, gradient index, contrast and elongation factor.
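The feature measures of claims 32 and 33 are not given explicit formulas here, so the sketch below uses plausible stand-ins for three of the five: contrast as interior-minus-border mean, a gradient index as the fraction of ROI gradient vectors pointing toward the centre, and elongation as the ratio of principal second moments of the bright region. All three definitions are assumptions for illustration only:

```python
import math

def roi_features(roi):
    # Illustrative feature measures on a square region of interest (ROI).
    # Assumes the ROI contains a bright structure near its centre.
    h, w = len(roi), len(roi[0])
    flat = [v for row in roi for v in row]
    # Contrast: mean of the interior minus mean of the one-pixel border ring.
    border = [roi[y][x] for y in range(h) for x in range(w)
              if y in (0, h - 1) or x in (0, w - 1)]
    inner = [roi[y][x] for y in range(1, h - 1) for x in range(1, w - 1)]
    contrast = sum(inner) / len(inner) - sum(border) / len(border)
    # Gradient index: fraction of gradient vectors pointing toward the ROI centre.
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    toward = total = 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (roi[y][x + 1] - roi[y][x - 1]) / 2.0
            gy = (roi[y + 1][x] - roi[y - 1][x]) / 2.0
            if gx == 0 and gy == 0:
                continue
            if gx * (cx - x) + gy * (cy - y) > 0:
                toward += 1
            total += 1
    gradient_index = toward / total if total else 0.0
    # Elongation: sqrt of the ratio of principal second moments of the
    # above-mean pixels (1.0 for a round region, larger for elongated ones).
    pts = [(x, y) for y in range(h) for x in range(w)
           if roi[y][x] > sum(flat) / len(flat)]
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    sxx = sum((x - mx) ** 2 for x, y in pts) / len(pts)
    syy = sum((y - my) ** 2 for x, y in pts) / len(pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts) / len(pts)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    elongation = math.sqrt(l1 / l2) if l2 > 0 else float("inf")
    return {"contrast": contrast, "gradient_index": gradient_index,
            "elongation": elongation}
```

Claim 33's thresholding step is then a simple comparison of each returned value against a cut-off chosen during training.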
34. A method as recited in Claim 1, wherein performing said mass filtering comprises:
selecting an initial kernel size of a mass filter;
performing mass filtering using said initial kernel size to produce a mass-filtered image;
varying said kernel size up to a maximum kernel size; and performing mass filtering at each kernel size and producing corresponding mass-filter images.
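Claim 34's multi-scale loop, with a simple box average standing in for the patent's mass filter (the real filter is built from edge-orientation bins, as in claim 47). Kernel sizes and step are illustrative:

```python
def mean_filter(image, k):
    # Stand-in "mass filter": k x k box average, clipped at the image border.
    h, w = len(image), len(image[0])
    r = k // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [image[yy][xx]
                    for yy in range(max(0, y - r), min(h, y + r + 1))
                    for xx in range(max(0, x - r), min(w, x + r + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

def multiscale_mass_filter(image, k_min=3, k_max=9, step=2):
    # Claim 34: start from an initial kernel size and grow it to a maximum,
    # producing one mass-filtered image per kernel size.
    return {k: mean_filter(image, k) for k in range(k_min, k_max + 1, step)}
```

Each entry of the returned dictionary is one of the "corresponding mass-filter images" that the later integration step (claims 35 to 37) reconciles.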
35. A method as recited in Claim 31, comprising:
performing said template matching on each of said mass-filter images;
detecting suspected lesions in said mass-filtered images;
and integrating said suspected lesions.
36. A method as recited in Claim 35, wherein integrating said suspected lesions comprises:
determining whether suspected lesions in said mass-filtered images spatially overlap; and eliminating selected ones of overlapping suspected lesions.
37. A method as recited in Claim 36, comprising:
determining a radial index for each of said overlapping suspected lesions; and eliminating one of said overlapping suspected lesions having a smaller radial index.
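Claims 35 to 37 integrate the detections found at the different kernel sizes: of any spatially overlapping pair, the candidate with the smaller radial index is eliminated. A sketch, representing each detection as an `(x, y, radius, radial_index)` tuple (the tuple layout is an assumption):

```python
def overlaps(a, b):
    # Two circular detections overlap if their centre distance is less than
    # the sum of their radii.
    (ax, ay, ar, _), (bx, by, br, _) = a, b
    return ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 < ar + br

def integrate(detections):
    # Claims 36-37: greedily keep detections in order of decreasing radial
    # index; any later detection that overlaps a kept one is eliminated.
    kept = []
    for d in sorted(detections, key=lambda d: -d[3]):
        if not any(overlaps(d, k) for k in kept):
            kept.append(d)
    return kept
```

Sorting by radial index first makes the greedy pass equivalent to the pairwise rule of claim 37: whenever two candidates overlap, the one with the smaller index is the one dropped.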
38. A system for detecting lesions in medical images, comprising:
a data input device;
a segmenting circuit connected to said input device;
a modified morphological circuit connected to receive an output of said segmenting circuit;
a mass filter circuit connected to said morphological circuit;
a template matching circuit connected to said mass filter circuit;
a feature analysis circuit connected to said template matching circuit; and a display.
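Viewed as a data flow, the circuit chain of claim 38 is a straight pipeline from segmentation through feature analysis. A sketch with each "circuit" standing in as a callable (the stage functions themselves are placeholders supplied by the caller):

```python
def detect_lesions(image, segment, enhance, morph, mass_filter, match, analyze):
    # Claim 38 data flow; each argument stands in for one circuit.
    segmented = segment(image)        # segmenting circuit
    enhanced = enhance(segmented)     # peripheral enhancement circuit (claim 39)
    cleaned = morph(enhanced)         # modified morphological circuit
    filtered = mass_filter(cleaned)   # mass filter circuit
    candidates = match(filtered)      # template matching circuit
    return analyze(candidates)        # feature analysis circuit -> display
```

Claim 40's sampling circuit would slot in between enhancement and the morphological stage, so each stage only ever consumes the output of the stage the claim wires it to.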
39. A system as recited in claim 38, comprising:
a peripheral enhancement circuit connected to said segmenting circuit;
wherein said morphological circuit is further connected to receive an output of said peripheral enhancement circuit.
40. A system as recited in Claim 39, comprising:
a sampling circuit connected to said segmenting, peripheral enhancement and morphological circuits;
wherein said sampling circuit is connected to receive said outputs of said segmenting and peripheral enhancement circuits;
and said morphological circuit is connected to receive an output of said sampling circuit.
41. A system as recited in Claim 38, comprising:
a merging circuit and a rule-based circuit connected to said feature analysis circuit.
42. A system as recited in Claim 38, wherein said template matching circuit comprises:
a Fourier descriptor generating circuit;
an edge generating circuit connected to said Fourier descriptor generating circuit; and a simulated annealing circuit connected to said edge generating circuit.
43. A system as recited in Claim 38, wherein said morphological circuit comprises:
means for performing a modified morphological operation using a plurality of filter sizes.
44. A system as recited in Claim 38, wherein said morphological circuit comprises:
means for determining whether a pixel in said medical image is a seed pixel; and performing a modified morphological operation using said seed pixel.
45. A system as recited in Claim 38, wherein said morphological circuit comprises:
means for obtaining a gray-level value in said medical image at a first pixel;
means for determining a local gray-level minimum in said segmented image in a neighborhood of pixels disposed around said first pixel;
means for substituting said local gray-level minimum for said gray-level value of said first pixel when a first difference between said minimum and said gray-level value is greater than a first gray-level threshold value.
46. A system as recited in Claim 45, comprising:
means for determining a local gray-level maximum of said neighborhood of pixels;
first means for determining whether a first difference between said gray-level value and said local gray level minimum is less than a second difference between said local gray-level maximum and said gray-level value; and second means for determining whether said pixel is within a predetermined distance of a pixel having said local minimum gray-level value.
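One plausible reading of the modified morphological operation of claims 45 and 46, in Python: a pixel is replaced by its local gray-level minimum when (a) the pixel-minus-minimum difference exceeds a threshold, (b) that difference is smaller than the maximum-minus-pixel difference, and (c) the pixel lies within a set distance of the minimum's location. The neighborhood radius, threshold, and distance values are illustrative assumptions:

```python
def modified_morph(image, radius=1, thresh=2, max_dist=2):
    # Claims 45-46 sketch; reads from the original image, writes to a copy.
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            v = image[y][x]
            nbrs = [(image[yy][xx], xx, yy)
                    for yy in range(max(0, y - radius), min(h, y + radius + 1))
                    for xx in range(max(0, x - radius), min(w, x + radius + 1))]
            lo, lx, ly = min(nbrs)                 # local gray-level minimum
            hi = max(n[0] for n in nbrs)           # local gray-level maximum
            near = abs(x - lx) + abs(y - ly) <= max_dist
            # Substitute the local minimum only for pixels that are well above
            # it (claim 45) yet still closer to it than to the local maximum,
            # and near the minimum's location (claim 46).
            if v - lo > thresh and v - lo < hi - v and near:
                out[y][x] = lo
    return out
```

The effect is selective: dim pixels adjoining a dark region are pulled down to the background, while genuinely bright pixels (closer to the local maximum) survive, unlike a plain gray-scale erosion that would shrink them too.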
47. A system as recited in Claim 38, wherein said mass filter circuit comprises:
means for determining a filter value for a plurality of edge orientation bins; and means for determining a final filter value as a sum of said filter values.
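Claim 47 computes a filter value for each of a set of edge-orientation bins and takes the final mass-filter value as their sum. In the sketch below each bin scores 1 when it contains at least one edge whose gradient points toward the candidate centre, so the sum is high only when edges surround the centre on all sides, as a mass does; the per-bin scoring rule is an assumption:

```python
import math

def mass_filter_value(edges, cx, cy, n_bins=8):
    # edges: iterable of (x, y, gx, gy) - edge location and gradient direction.
    bins = [0.0] * n_bins
    for x, y, gx, gy in edges:
        # Bin each edge by its angular position around the candidate centre.
        theta = math.atan2(y - cy, x - cx) % (2 * math.pi)
        b = int(theta / (2 * math.pi) * n_bins) % n_bins
        # An inward-pointing gradient (toward the centre) fills its bin.
        if gx * (cx - x) + gy * (cy - y) > 0:
            bins[b] = 1.0
    return sum(bins)  # final filter value: sum over the orientation bins
```

A straight linear edge fills at most a couple of bins, so its score stays low; a roughly circular mass boundary fills most of them, which is what lets the filter respond to masses rather than to vessels or ducts.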
48. A system as recited in Claim 38, wherein said template matching circuit comprises:
means for determining a deformable shape template; and means for fitting said shape template to a shape of a suspected lesion.
49. A system as recited in Claim 48, wherein said template matching circuit further comprises:
means for determining said deformable shape template using Fourier descriptors; and means for varying said descriptors to dynamically fit said shape template to said shape of said suspected lesion.
50. A system as recited in Claim 48, wherein said template matching circuit further comprises:
means for determining said shape template as an inverse Fourier transform having a plurality of Fourier terms;
means for varying said Fourier terms to fit said shape template to said shape of said suspected lesion; and means for outputting a contour of said suspected lesion.
51. A system as recited in Claim 50, wherein said template matching circuit further comprises:
means for varying a center location, a size, an orientation, a ratio of a long axis to a short axis, and a degree of asymmetry of said inverse Fourier transform.
52. A system as recited in Claim 48, wherein said template matching circuit further comprises:
means for determining a center of a suspected lesion as a maximum mass filter value;
means for determining edges of said suspected lesion; and means for fitting said template using said center and said edges.
53. A system as recited in Claim 38, wherein said feature extraction circuit comprises:
means for determining a gradient of said region of interest;
means for determining an orientation of said region of interest;
means for determining a gradient index of said region of interest;
means for determining a contrast of said region of interest; and means for determining an elongation factor of said region of interest.
54. A system as recited in Claim 53, wherein said feature extraction circuit further comprises a thresholding circuit.
55. A system as recited in Claim 38, wherein said mass filter circuit comprises:
means for selecting an initial kernel size of a mass filter;
means for performing mass filtering using said initial kernel size to produce a mass-filtered image;
means for varying said kernel size up to a maximum kernel size; and means for performing mass filtering at each kernel size and producing corresponding mass-filter images.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US39830795A | 1995-03-03 | 1995-03-03 | |
US08/398,307 | 1995-03-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2214101A1 true CA2214101A1 (en) | 1996-09-12 |
Family
ID=23574882
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002214101A Abandoned CA2214101A1 (en) | 1995-03-03 | 1996-03-04 | Method and system for the detection of lesions in medical images |
Country Status (6)
Country | Link |
---|---|
US (1) | US6185320B1 (en) |
EP (1) | EP0813720A4 (en) |
JP (1) | JPH11501538A (en) |
AU (1) | AU705713B2 (en) |
CA (1) | CA2214101A1 (en) |
WO (1) | WO1996027846A1 (en) |
Families Citing this family (91)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6009212A (en) | 1996-07-10 | 1999-12-28 | Washington University | Method and apparatus for image registration |
US5768333A (en) * | 1996-12-02 | 1998-06-16 | Philips Electronics N.A. Corporation | Mass detection in digital radiologic images using a two stage classifier |
KR100219628B1 (en) * | 1997-02-15 | 1999-09-01 | 윤종용 | Signal adaptive filtering method and signal adaptive filter |
EP0923760B1 (en) * | 1997-06-06 | 2005-11-16 | Koninklijke Philips Electronics N.V. | Noise reduction in an image |
EP0998719A4 (en) | 1997-07-25 | 2000-11-22 | Arch Dev Corp | Method and system for the segmentation of lung regions in lateral chest radiographs |
US6542187B1 (en) * | 1998-07-09 | 2003-04-01 | Eastman Kodak Company | Correcting for chrominance interpolation artifacts |
US6697107B1 (en) * | 1998-07-09 | 2004-02-24 | Eastman Kodak Company | Smoothing a digital color image using luminance values |
US6633686B1 (en) * | 1998-11-05 | 2003-10-14 | Washington University | Method and apparatus for image registration using large deformation diffeomorphisms on a sphere |
FR2790851B1 (en) | 1999-03-12 | 2001-06-08 | Ge Medical Syst Sa | METHOD FOR IMPROVING THE DETECTION OF ELEMENTS OF INTEREST IN A DIGITAL RADIOGRAPHIC IMAGE |
US6941323B1 (en) * | 1999-08-09 | 2005-09-06 | Almen Laboratories, Inc. | System and method for image comparison and retrieval by enhancing, defining, and parameterizing objects in images |
US6674880B1 (en) * | 1999-11-24 | 2004-01-06 | Confirma, Inc. | Convolution filtering of similarity data for visual display of enhanced image |
KR20010101880A (en) * | 1999-11-29 | 2001-11-15 | 요트.게.아. 롤페즈 | Method for coding and decoding multimedia data |
US6898303B2 (en) * | 2000-01-18 | 2005-05-24 | Arch Development Corporation | Method, system and computer readable medium for the two-dimensional and three-dimensional detection of lesions in computed tomography scans |
US6901156B2 (en) * | 2000-02-04 | 2005-05-31 | Arch Development Corporation | Method, system and computer readable medium for an intelligent search workstation for computer assisted interpretation of medical images |
US6724945B1 (en) * | 2000-05-24 | 2004-04-20 | Hewlett-Packard Development Company, L.P. | Correcting defect pixels in a digital image |
JP4169954B2 (en) * | 2000-09-18 | 2008-10-22 | 富士フイルム株式会社 | Abnormal shadow candidate detection method |
CA2323883C (en) * | 2000-10-19 | 2016-02-16 | Patrick Ryan Morin | Method and device for classifying internet objects and objects stored oncomputer-readable media |
US20020164070A1 (en) * | 2001-03-14 | 2002-11-07 | Kuhner Mark B. | Automatic algorithm generation |
JP2002330950A (en) * | 2001-05-11 | 2002-11-19 | Fuji Photo Film Co Ltd | Abnormal shadow candidate detector |
JP2004520923A (en) * | 2001-06-20 | 2004-07-15 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | How to segment digital images |
US7110525B1 (en) | 2001-06-25 | 2006-09-19 | Toby Heller | Agent training sensitive call routing system |
JP2003057771A (en) * | 2001-08-20 | 2003-02-26 | Fuji Photo Film Co Ltd | Abnormal shadow detector |
WO2003024184A2 (en) * | 2001-09-14 | 2003-03-27 | Cornell Research Foundation, Inc. | System, method and apparatus for small pulmonary nodule computer aided diagnosis from computed tomography scans |
US6766043B2 (en) * | 2001-11-23 | 2004-07-20 | R2 Technology, Inc. | Pleural nodule detection from CT thoracic images |
US6855114B2 (en) * | 2001-11-23 | 2005-02-15 | Karen Drukker | Automated method and system for the detection of abnormalities in sonographic images |
US20030103663A1 (en) * | 2001-11-23 | 2003-06-05 | University Of Chicago | Computerized scheme for distinguishing between benign and malignant nodules in thoracic computed tomography scans by use of similar images |
US7336809B2 (en) * | 2001-11-23 | 2008-02-26 | R2 Technology, Inc. | Segmentation in medical images |
US7359538B2 (en) * | 2001-11-23 | 2008-04-15 | R2 Technology | Detection and analysis of lesions in contact with a structural boundary |
WO2004010374A2 (en) * | 2002-07-19 | 2004-01-29 | Philips Intellectual Property & Standards Gmbh | Simultaneous segmentation of multiple or composed objects by mesh adaptation |
US20040215072A1 (en) * | 2003-01-24 | 2004-10-28 | Quing Zhu | Method of medical imaging using combined near infrared diffusive light and ultrasound |
US9818136B1 (en) | 2003-02-05 | 2017-11-14 | Steven M. Hoffberg | System and method for determining contingent relevance |
GB2398379A (en) * | 2003-02-11 | 2004-08-18 | Qinetiq Ltd | Automated digital image analysis |
US7489829B2 (en) * | 2003-03-11 | 2009-02-10 | Sightic Vista Ltd. | Adaptive low-light image processing |
NO322089B1 (en) * | 2003-04-09 | 2006-08-14 | Norsar V Daglig Leder | Procedure for simulating local preamp deep-migrated seismic images |
US7668358B2 (en) * | 2003-07-18 | 2010-02-23 | Hologic, Inc. | Model-based grayscale registration of medical images |
US7664302B2 (en) * | 2003-07-18 | 2010-02-16 | Hologic, Inc. | Simultaneous grayscale and geometric registration of images |
KR100503424B1 (en) * | 2003-09-18 | 2005-07-22 | 한국전자통신연구원 | Automated method for detection of pulmonary nodules on multi-slice computed tomographic images and recording medium in which the method is recorded |
US20050075566A1 (en) * | 2003-09-19 | 2005-04-07 | Fuji Photo Film Co., Ltd. | Ultrasonice diagnosing apparatus |
US7515743B2 (en) * | 2004-01-08 | 2009-04-07 | Siemens Medical Solutions Usa, Inc. | System and method for filtering a medical image |
US7634139B2 (en) * | 2004-03-16 | 2009-12-15 | Sony Corporation | System and method for efficiently performing a pattern matching procedure |
GB2461199B (en) * | 2004-06-23 | 2010-04-28 | Medicsight Plc | Lesion extent determination in a CT scan image |
WO2006093523A2 (en) * | 2004-07-15 | 2006-09-08 | Kenji Suzuki | Computerized scheme for distinction between benign and malignant nodules in thoracic low-dose ct |
US7970625B2 (en) | 2004-11-04 | 2011-06-28 | Dr Systems, Inc. | Systems and methods for retrieval of medical data |
US7885440B2 (en) * | 2004-11-04 | 2011-02-08 | Dr Systems, Inc. | Systems and methods for interleaving series of medical images |
US7787672B2 (en) | 2004-11-04 | 2010-08-31 | Dr Systems, Inc. | Systems and methods for matching, naming, and displaying medical images |
US7660488B2 (en) * | 2004-11-04 | 2010-02-09 | Dr Systems, Inc. | Systems and methods for viewing medical images |
US7920152B2 (en) * | 2004-11-04 | 2011-04-05 | Dr Systems, Inc. | Systems and methods for viewing medical 3D imaging volumes |
US7736313B2 (en) * | 2004-11-22 | 2010-06-15 | Carestream Health, Inc. | Detecting and classifying lesions in ultrasound images |
JP2008529639A (en) * | 2005-02-11 | 2008-08-07 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Inspection apparatus, image processing device, method of inspecting target object with inspection apparatus, computer-readable medium, and program element |
US8517945B2 (en) * | 2005-04-28 | 2013-08-27 | Carestream Health, Inc. | Segmentation of lesions in ultrasound images |
CN1907225B (en) * | 2005-08-05 | 2011-02-02 | Ge医疗系统环球技术有限公司 | Process and apparatus for dividing intracerebral hemorrhage injury |
US7764820B2 (en) * | 2005-08-24 | 2010-07-27 | The General Hospital Corporation | Multi-threshold peripheral equalization method and apparatus for digital mammography and breast tomosynthesis |
JP4717585B2 (en) * | 2005-10-14 | 2011-07-06 | 富士フイルム株式会社 | Medical image determination apparatus, medical image determination method and program thereof |
WO2007048844A1 (en) * | 2005-10-28 | 2007-05-03 | France Telecom | Method for processing a representaitve source image of at least one object, processing device, corresponding distance map and a computer software product |
US20070211930A1 (en) * | 2006-03-09 | 2007-09-13 | Terry Dolwick | Attribute based image enhancement and display for medical imaging applications |
US20070250548A1 (en) * | 2006-04-21 | 2007-10-25 | Beckman Coulter, Inc. | Systems and methods for displaying a cellular abnormality |
US8571287B2 (en) * | 2006-06-26 | 2013-10-29 | General Electric Company | System and method for iterative image reconstruction |
WO2008005554A2 (en) * | 2006-07-06 | 2008-01-10 | University Of Connecticut | Method and apparatus for medical imaging using near-infrared optical tomography, fluorence tomography combined with ultrasound |
US8070682B2 (en) * | 2006-07-19 | 2011-12-06 | The University Of Connecticut | Method and apparatus for medical imaging using combined near-infrared optical tomography, fluorescent tomography and ultrasound |
US7873194B2 (en) * | 2006-10-25 | 2011-01-18 | Rcadia Medical Imaging Ltd. | Method and system for automatic analysis of blood vessel structures and pathologies in support of a triple rule-out procedure |
US7983459B2 (en) | 2006-10-25 | 2011-07-19 | Rcadia Medical Imaging Ltd. | Creating a blood vessel tree from imaging data |
US7940970B2 (en) * | 2006-10-25 | 2011-05-10 | Rcadia Medical Imaging, Ltd | Method and system for automatic quality control used in computerized analysis of CT angiography |
US7860283B2 (en) | 2006-10-25 | 2010-12-28 | Rcadia Medical Imaging Ltd. | Method and system for the presentation of blood vessel structures and identified pathologies |
US7940977B2 (en) * | 2006-10-25 | 2011-05-10 | Rcadia Medical Imaging Ltd. | Method and system for automatic analysis of blood vessel structures to identify calcium or soft plaque pathologies |
US7953614B1 (en) | 2006-11-22 | 2011-05-31 | Dr Systems, Inc. | Smart placement rules |
US7929762B2 (en) * | 2007-03-12 | 2011-04-19 | Jeffrey Kimball Tidd | Determining edgeless areas in a digital image |
US7903900B2 (en) * | 2007-03-30 | 2011-03-08 | Hong Kong Applied Science And Technology Research Institute Co., Ltd. | Low complexity color de-noising filter |
US20090082637A1 (en) * | 2007-09-21 | 2009-03-26 | Michael Galperin | Multi-modality fusion classifier with integrated non-imaging factors |
US20090118600A1 (en) * | 2007-11-02 | 2009-05-07 | Ortiz Joseph L | Method and apparatus for skin documentation and analysis |
US20100094134A1 (en) * | 2008-10-14 | 2010-04-15 | The University Of Connecticut | Method and apparatus for medical imaging using near-infrared optical tomography combined with photoacoustic and ultrasound guidance |
US8380533B2 (en) | 2008-11-19 | 2013-02-19 | DR Systems Inc. | System and method of providing dynamic and customizable medical examination forms |
US8189943B2 (en) * | 2009-03-17 | 2012-05-29 | Mitsubishi Electric Research Laboratories, Inc. | Method for up-sampling depth images |
CN101540061B (en) * | 2009-04-10 | 2011-06-22 | 西北工业大学 | Topological and ordering matching method for disordered images based on simulated annealing |
JP5258694B2 (en) * | 2009-07-27 | 2013-08-07 | 富士フイルム株式会社 | Medical image processing apparatus and method, and program |
US8712120B1 (en) | 2009-09-28 | 2014-04-29 | Dr Systems, Inc. | Rules-based approach to transferring and/or viewing medical images |
KR101111055B1 (en) | 2009-10-12 | 2012-02-15 | 서울대학교산학협력단 | Method for Automatic Breast Density Measurement on Digital Mammogram |
US9092551B1 (en) | 2011-08-11 | 2015-07-28 | D.R. Systems, Inc. | Dynamic montage reconstruction |
US9495604B1 (en) | 2013-01-09 | 2016-11-15 | D.R. Systems, Inc. | Intelligent management of computerized advanced processing |
KR20140138501A (en) * | 2013-05-24 | 2014-12-04 | 삼성전자주식회사 | Lesion classification apparatus, and method for modifying a lesion classification data |
EP3035850B1 (en) | 2013-08-20 | 2020-05-13 | Densitas Incorporated | Methods and systems for determining breast density |
CN105982685A (en) * | 2015-03-03 | 2016-10-05 | 东芝医疗系统株式会社 | Medical image processing device and method and medical image diagnosing device and method |
CN104700419A (en) * | 2015-03-27 | 2015-06-10 | 马学梅 | Image handling method of X-ray picture of radiology department |
US20170046483A1 (en) | 2015-04-30 | 2017-02-16 | D.R. Systems, Inc. | Database systems and interactive user interfaces for dynamic interaction with, and comparison of, digital medical image data |
CN104933701B (en) * | 2015-05-18 | 2017-10-27 | 重庆大学 | The mammary glandular cell dividing method of adhesion model is removed with double strategies based on multiple dimensioned growth |
CA3030577A1 (en) | 2016-07-12 | 2018-01-18 | Mindshare Medical, Inc. | Medical analytics system |
US10380739B2 (en) * | 2017-08-15 | 2019-08-13 | International Business Machines Corporation | Breast cancer detection |
US11049606B2 (en) | 2018-04-25 | 2021-06-29 | Sota Precision Optics, Inc. | Dental imaging system utilizing artificial intelligence |
CN109801235B (en) * | 2018-12-28 | 2023-03-28 | 佛山科学技术学院 | Method and device for detecting disease cause of epipremnum aureum leaves |
CN110136161A (en) * | 2019-05-31 | 2019-08-16 | 苏州精观医疗科技有限公司 | Image characteristics extraction analysis method, system and device |
EP4038546A1 (en) * | 2019-10-01 | 2022-08-10 | 10X Genomics, Inc. | Systems and methods for identifying morphological patterns in tissue samples |
US11749401B2 (en) * | 2020-10-30 | 2023-09-05 | Guerbet | Seed relabeling for seed-based segmentation of a medical image |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4797806A (en) * | 1987-02-19 | 1989-01-10 | Gtx Corporation | High speed serial pixel neighborhood processor and method |
US4761819A (en) * | 1987-02-27 | 1988-08-02 | Picker International, Inc. | Adaptive noise reduction filter for reconstructed images |
US4907156A (en) * | 1987-06-30 | 1990-03-06 | University Of Chicago | Method and system for enhancement and detection of abnormal anatomic regions in a digital image |
US5121436A (en) * | 1987-08-14 | 1992-06-09 | International Remote Imaging Systems, Inc. | Method and apparatus for generating a plurality of parameters of an object in a field of view |
US4945478A (en) * | 1987-11-06 | 1990-07-31 | Center For Innovative Technology | Noninvasive medical imaging system and method for the identification and 3-D display of atherosclerosis and the like |
US5016173A (en) * | 1989-04-13 | 1991-05-14 | Vanguard Imaging Ltd. | Apparatus and method for monitoring visually accessible surfaces of the body |
US5079698A (en) * | 1989-05-03 | 1992-01-07 | Advanced Light Imaging Technologies Ltd. | Transillumination method apparatus for the diagnosis of breast tumors and other breast lesions by normalization of an electronic image of the breast |
US5133020A (en) * | 1989-07-21 | 1992-07-21 | Arch Development Corporation | Automated method and system for the detection and classification of abnormal lesions and parenchymal distortions in digital medical images |
US5212637A (en) * | 1989-11-22 | 1993-05-18 | Stereometrix Corporation | Method of investigating mammograms for masses and calcifications, and apparatus for practicing such method |
US5237626A (en) * | 1991-09-12 | 1993-08-17 | International Business Machines Corporation | Universal image processing module |
US5359513A (en) * | 1992-11-25 | 1994-10-25 | Arch Development Corporation | Method and system for detection of interval change in temporally sequential chest images |
US5440653A (en) * | 1993-09-24 | 1995-08-08 | Genesis Microchip Inc. | Image mirroring and image extension for digital filtering |
FR2712415B1 (en) * | 1993-11-09 | 1995-12-22 | Ge Medical Syst Sa | Method for automatically locating points of interest during a stereotaxic examination in mammography. |
US5452367A (en) * | 1993-11-29 | 1995-09-19 | Arch Development Corporation | Automated method and system for the segmentation of medical images |
US5579445A (en) * | 1993-12-17 | 1996-11-26 | Xerox Corporation | Image resolution conversion method that employs statistically generated multiple morphological filters |
US5781667A (en) * | 1995-07-31 | 1998-07-14 | Neopath, Inc. | Apparatus for high speed morphological processing |
US5757953A (en) * | 1996-02-29 | 1998-05-26 | Eastman Kodak Company | Automated method and system for region decomposition in digital radiographic images |
JP3678377B2 (en) * | 1996-08-26 | 2005-08-03 | 富士写真フイルム株式会社 | Abnormal shadow extraction method and apparatus |
-
1996
- 1996-03-04 WO PCT/US1996/002439 patent/WO1996027846A1/en not_active Application Discontinuation
- 1996-03-04 AU AU49932/96A patent/AU705713B2/en not_active Ceased
- 1996-03-04 JP JP8526892A patent/JPH11501538A/en active Pending
- 1996-03-04 EP EP96906597A patent/EP0813720A4/en not_active Withdrawn
- 1996-03-04 CA CA002214101A patent/CA2214101A1/en not_active Abandoned
-
1997
- 1997-12-01 US US08/982,282 patent/US6185320B1/en not_active Expired - Lifetime
Also Published As
Publication number | Publication date |
---|---|
EP0813720A1 (en) | 1997-12-29 |
JPH11501538A (en) | 1999-02-09 |
WO1996027846A1 (en) | 1996-09-12 |
AU705713B2 (en) | 1999-05-27 |
AU4993296A (en) | 1996-09-23 |
EP0813720A4 (en) | 1998-07-01 |
US6185320B1 (en) | 2001-02-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU705713B2 (en) | Method and system for the detection of lesions in medical images | |
CN109635846B (en) | Multi-type medical image judging method and system | |
Schilham et al. | A computer-aided diagnosis system for detection of lung nodules in chest radiographs with an evaluation on a public database | |
EP0757544B1 (en) | Computerized detection of masses and parenchymal distortions | |
EP1035508B1 (en) | Automated method and system for the segmentation of medical images | |
US6801645B1 (en) | Computer aided detection of masses and clustered microcalcifications with single and multiple input image context classification strategies | |
US5815591A (en) | Method and apparatus for fast detection of spiculated lesions in digital mammograms | |
US6640001B2 (en) | Method and apparatus for fast detection of lesions | |
US20020006216A1 (en) | Method, system and computer readable medium for the two-dimensional and three-dimensional detection of lesions in computed tomography scans | |
US20090297002A1 (en) | Computer aided detection of microcalcification clusters | |
Székely et al. | A hybrid system for detecting masses in mammographic images | |
EP1009283A1 (en) | Method and system for automated detection of clustered microcalcifications from digital mammograms | |
WO1999028853A1 (en) | Automated detection of clustered microcalcifications from digital mammograms | |
JP2006325937A (en) | Image determination device, image determination method, and program therefor | |
JP2006034585A (en) | Picture display device and picture display method, and program thereof | |
Amit et al. | Hybrid mass detection in breast MRI combining unsupervised saliency analysis and deep learning | |
WO2000079474A1 (en) | Computer aided detection of masses and clustered microcalcification strategies | |
Valliappan et al. | A theoretical methodology and prototype implementation for detection segmentation classification of digital mammogram tumor by machine learning and problem solving approach | |
Nguyen et al. | Combination of gabor filter and convolutional neural network for suspicious mass classification | |
Murphy et al. | Automated detection of pulmonary nodules from low-dose computed tomography scans using a two-stage classification system based on local image features | |
Thomas et al. | An automated kidney tumour detection technique from computer tomography images | |
Novitasari et al. | Automatic detection of breast cancer in mammographic image using the histogram oriented gradient (HOG) descriptor and deep rule based (DRB) classifier method | |
Sahba et al. | Mean shift based algorithm for mammographic breast mass detection | |
Alhabib et al. | Detection of partially overlapped masses in mammograms | |
Naveen et al. | Deep Learning Technique to Detect and Classify Brain Tumor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FZDE | Discontinued |