WO2015087203A1 - Imaging systems and methods for monitoring treatment of tissue lesions - Google Patents


Publication number
WO2015087203A1
WO2015087203A1 (PCT/IB2014/066537)
Authority
WO
WIPO (PCT)
Prior art keywords
lesion
ultrasound image
tissue
ultrasound
image
Prior art date
Application number
PCT/IB2014/066537
Other languages
French (fr)
Inventor
James Robertson Jago
Thomas Patrice Jean Arsene Gauthier
Lars Jonas Olsson
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2015087203A1 publication Critical patent/WO2015087203A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/06 Measuring blood flow
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 … involving detecting or locating foreign bodies or organic structures
    • A61B 8/0841 … for locating instruments
    • A61B 8/085 … for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/48 Diagnostic techniques
    • A61B 8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/485 Diagnostic techniques involving measuring strain or elastic properties
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215 … involving processing of medical diagnostic data
    • A61B 8/5238 … for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5261 … combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • A61B 18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B 18/04 … by heating
    • A61B 18/12 … by passing a current through the tissue to be heated, e.g. high-frequency current
    • A61B 18/14 Probes or electrodes therefor
    • A61B 18/18 … by applying electromagnetic radiation, e.g. microwaves
    • A61B 18/1815 … using microwaves
    • A61B 2018/00571 … for achieving a particular surgical effect
    • A61B 2018/00577 Ablation
    • A61B 2018/00636 Sensing and controlling the application of energy
    • A61B 2018/0066 Sensing and controlling the application of energy without feedback, i.e. open loop control
    • A61B 2018/00994 … combining two or more different kinds of non-mechanical energy or combining one or more non-mechanical energies with ultrasound
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/378 Surgical systems with images on a monitor during operation using ultrasound
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G06T 7/0014 Biomedical image inspection using an image reference approach
    • G06T 7/0016 Biomedical image inspection using an image reference approach involving temporal comparison
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Determination of transform parameters for the alignment of images using feature-based methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10072 Tomographic images
    • G06T 2207/10132 Ultrasound image

Definitions

  • The present invention relates to medical diagnostic ultrasound systems and, in particular, to imaging systems and methods for monitoring ablation of tissue lesions.
  • One such treatment is tissue ablation, in which the diseased tissue is destroyed by application of local tissue heating, cooling, or other means.
  • Ablation methods in common use are RF ablation, microwave ablation, HIFU, and cryoablation.
  • Imaging methods are typically used to attempt to verify that all the diseased tissue has been treated, but current imaging methods have limitations.
  • Computed tomography is often used to plan, guide and monitor ablation, but it is expensive, delivers potentially harmful ionizing radiation doses to the patient and the operators, and has relatively poor soft tissue contrast.
  • Magnetic resonance imaging (MRI) is expensive and not well suited to interventional procedures.
  • Positron emission tomography (PET)
  • methods and systems are provided for monitoring treatment (e.g., ablation) of lesions using a combination of ultrasound imaging with another non-ultrasound imaging modality.
  • the present invention includes imaging a lesion of interest using a modality other than ultrasound, such as MRI, CT and/or PET imaging.
  • The non-ultrasound image, in particular, can be used to provide an accurate representation of the lesion boundary.
  • a first ultrasound image of the same lesion is generated and registered with the MRI, CT and/or PET image.
  • the lesion can be treated with or without a treatment plan using known ablative therapies, chemotherapy, radiation therapies, and/or other treatment techniques.
  • a second ultrasound image is generated to identify the volume of the treated tissue, which can be registered with the first ultrasound image, the non-ultrasound image, or both.
  • the ultrasound images, along with the non-ultrasound image, are also more accurately registered with reference to tissue features outside of the treatment volume.
  • tissue differences and/or similarities between the lesion and the treated tissue can be identified by comparing data in the two ultrasound images within or around the lesion boundary. Portions of the lesion and the treated tissue that have similar tissue characteristics can indicate insufficiently treated lesion tissue, and because the images are spatially registered, the location of any such tissue can be identified for further treatment.
  • Tissue lesions being treated include, e.g., tumors, cysts, and other tissues that can be treated using therapies such as known ablative therapies, chemotherapy, radiation therapies, and/or other treatment techniques, such as local injections of alcohol or other substances for killing the lesion tissue.
  • FIGURE 1 illustrates in block diagram form the use of three dimensional ultrasonic imaging to guide or monitor treatment in an embodiment of the present invention.
  • FIGURE 2 illustrates in block diagram form the functional subsystems of a three dimensional ultrasonic imaging system suitable for use in an embodiment of the present invention.
  • FIGURE 3 illustrates a workflow in accordance with the present invention for monitoring treatment of a lesion.
  • FIGURE 4 depicts an example registration of an ultrasound image and an MRI image using tissue features outside of a treatment volume.
  • FIGURE 5A illustrates a comparison of 3D rendered ultrasound volumes of a lesion and of the treated tissue.
  • FIGURE 5B illustrates untreated tissue in need of further treatment as identified with registration with an MRI image.
  • the present invention includes imaging systems.
  • the present invention provides an imaging system for measuring a remaining volume of a lesion after an ablation treatment.
  • the system can receive a non-ultrasound image of a target region comprising a lesion.
  • the system can acquire a first ultrasound image comprising the lesion, and register the first ultrasound image with the non-ultrasound image.
  • the system can acquire a second ultrasound image that includes image data of the treated tissue.
  • the system can determine if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image. In certain embodiments, the system can determine the level of treatment by registering any combination of the non-ultrasound image, the first ultrasound image, and the second ultrasound image using imaged structures that are outside of the treated tissue.
  • the imaging system can include an ultrasonic diagnostic imaging system adapted to acquire the first and second ultrasound images.
  • Non-ultrasound images can be used, such as, e.g., a magnetic resonance (MR) image, a computed tomography (CT) image, or a positron emission tomography (PET) image.
  • the first and second ultrasound images can include a 3D volume of the lesion and a 3D volume of the treated tissue, respectively. Determining if sufficient treatment has occurred can include subtracting the 3D volume of the lesion from the 3D volume of the treated tissue according to similar or different tissue characteristics.
  • the system can also register the non-ultrasound image, the first and second ultrasound images, and a treatment plan depicting a predicted tissue volume to be treated.
  • the treated tissue is ablated tissue.
  • Structural components of the system can include processors and other well-known components used to carry out the methods and features of the imaging and ultrasound systems described herein.
  • In FIGURE 1, the use of three dimensional ultrasonic imaging to monitor ablation with a tissue ablation probe is shown in partial block diagram form.
  • The 3D ultrasonic imaging system includes an ultrasound probe 10 having a two dimensional array transducer.
  • the transducer array transmits ultrasonic beams over a volumetric field of view 12 including a lesion 14 under control of an ultrasound acquisition subsystem 16 and receives echoes in response to the transmitted beams which are coupled to and processed by the acquisition subsystem.
  • the echoes received by the elements of the transducer array are combined into coherent echo signals by the acquisition subsystem, and the echo signals, along with the coordinates from which they are received (r, θ, φ for a radial transmission pattern), are coupled to a 3D image processor 18.
  • the 3D image processor processes the echo signals into a three dimensional ultrasonic image, which is displayed on a display 20.
  • the ultrasound system is controlled by a control panel 22 by which the user defines the
  • FIGURE 1 includes an interventional device system for performing treatment, e.g., tissue ablation.
  • the interventional device system includes an ablation probe 24, the different types of which are well known in the art.
  • the ablation probe 24 is used to ablate a desired tissue region in a patient, and it can be manipulated by a physician (not shown) and/or a guidance subsystem 26, which may mechanically assist the maneuvering and placement of the interventional device within the body.
  • the ablation probe 24 is operated to ablate tissue under the control of an intervention subsystem 28, which can be operated via control panel 36 (or control panel 22, if only one control panel is used).
  • the intervention subsystem 28 can also receive information on the procedure being performed, such as optical or acoustic image data.
  • the ablation probe 24 and/or the ultrasound probe 10 may also have active position sensors that are used to provide information as to the location of the tip of the ablation probe along the insertion path 32 and/or the position of the transducer, which can be used to determine the position of the transducer imaging plane as well.
  • the active position sensors may operate by transmitting or receiving signals in the acoustic, optical, radio frequency or electromagnetic spectrum, and their output is coupled to a device position measurement subsystem 34.
  • Position information of the interventional device is coupled to the display processor 30 when appropriate for the processing or display of information concerning the position of the interventional device within the body.
  • Information pertinent to the functioning or operation of the ablation probe is displayed on a display 20.
  • image data may be exchanged over a signal path 38 between the 3D image processor 18 of the ultrasound system and the display processor 30 of the interventional device system for the formation of a 3D image containing information from both systems.
  • the system in FIGURE 1 further includes a signal path 40 that connects the ultrasound acquisition subsystem 16 of the ultrasound system and the device position measurement subsystem 34 of the interventional device system to allow synchronization of the imaging system and the interventional device.
  • FIGURE 2 illustrates some of the components of the 3D ultrasound system of FIG. 1 in further detail.
  • the elements of a two dimensional array transducer 42 are coupled to a plurality of microbeamformers 44.
  • the microbeamformers control the transmission of ultrasound by the elements of the array transducer 42 and partially beamform the echoes received in return.
  • the microbeamformers 44 are preferably fabricated in integrated circuit form and located in the housing of the ultrasound probe 10 near the array transducer. Microbeamformers, or subarray beamformers as they are often called, are more fully described in U.S. Pat. Nos. 6,375,617 and 5,997,479, which are incorporated by reference herein in their entirety.
  • the ultrasound probe 10 may also include a position sensor 46 which provides signals indicative of the position of the probe 10 to a transducer position detector 48.
  • the sensor 46 may be a magnetic, electromagnetic, radio frequency, infrared, or other type of sensor such as one which transmits a signal that is detected by a voltage impedance circuit.
  • the transducer position signal 50 produced by the detector 48 may be used by the ultrasound system or coupled to the interventional device system when useful for the formation of spatially coordinated images containing information from both systems.
  • the partially beamformed signals produced by the microbeamformers 44 are coupled to a beamformer 52 where the beam formation process is completed.
  • the resultant coherent echo signals along the beams are processed by filtering, amplitude detection, Doppler signal detection, and other processes by a signal processor 54.
  • the echo signals are then processed into image signals in the coordinate system of the probe (r, θ, φ, for example) by an image processor 56.
  • the image signals are converted to a desired image format (x,y,z Cartesian coordinates, for example) by a scan converter 58.
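The scan conversion step described above can be sketched in a few lines. The sketch below is only an illustration under stated assumptions: the angle conventions (θ as elevation from the beam axis z, φ as azimuth) and the function name are hypothetical, and a real scan converter also resamples the converted samples onto a regular voxel grid, which is omitted here.

```python
import numpy as np

def spherical_to_cartesian(r, theta, phi):
    """Map echo samples addressed in the probe's radial coordinates
    (r, theta, phi) to Cartesian (x, y, z).

    Conventions assumed for illustration: theta is the elevation angle
    measured from the z (beam) axis, phi is the azimuth in the x-y plane.
    """
    x = r * np.sin(theta) * np.cos(phi)
    y = r * np.sin(theta) * np.sin(phi)
    z = r * np.cos(theta)
    return x, y, z
```

A sample straight down the beam axis (θ = 0) maps to a point on the z axis at depth r, as expected.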
  • the three dimensional image data is coupled to a volume renderer 60 which renders a three dimensional view of the volumetric region 12 as seen from a selected look direction.
  • Volume rendering is well known in the art and is described, e.g., in U.S. Pat. No. 5,474,073, which is incorporated by reference herein in its entirety. Volume rendering may also be performed on image data which has not been scan converted.
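As a minimal illustration of one common volume-rendering technique, a maximum intensity projection (MIP) collapses the 3D echo volume to a 2D image by keeping the brightest sample along each ray of the chosen look direction. This is only one of many rendering methods and is not claimed to be the method used by renderer 60.

```python
import numpy as np

def max_intensity_projection(volume, look_axis=0):
    """Render a 3D scalar echo volume as a 2D image by taking the
    maximum intensity along the chosen look direction (axis)."""
    return volume.max(axis=look_axis)
```

A bright reflector anywhere along a ray shows up at the corresponding pixel of the projected image.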
  • the image plane data bypasses the volume renderer and is coupled directly to a video processor 62 which produces video drive signals compatible with the requirements of the display 64.
  • the volume rendered 3D images are also coupled to the video processor 62 for display.
  • the system can display individual volume rendered images or a series of volume rendered images.
  • two volume renderings can be done of a volumetric data set from slightly offset look directions, and the two are displayed together for a stereoscopic presentation.
  • a graphics and registration processor 66 is used for analysis and registration of images, such as the registration of two ultrasound images, an ultrasound image with a non-ultrasound image, or two ultrasound images and a non-ultrasound image.
  • the graphics and registration processor 66 can receive images and data associated with a treatment plan for the ablation procedure, including an expected ablation region for the treatment. The treatment plan and the expected ablation region can also be registered with the other images.
  • the graphics and registration processor 66 receives either scan converted or non-scan converted image data.
  • the ultrasound system described above can be used with a freehand approach or with an interventional device system such as the PercuNav system, elements of which are shown in FIGURE 2.
  • The PercuNav system provides imaging tools to assist clinicians in ablation procedures. It combines electromagnetic tracking of flexible or rigid instruments with patient images from multiple modalities (e.g., CT, MRI, PET and/or ultrasound) to create a real-time 3D map of the patient space that displays the instrument position, orientation, and trajectory, as well as anatomical landmarks. This map helps guide physicians to areas of interest, even when they are small, hard to visualize, difficult to access, or close to sensitive organs, vessels, or tissue. Furthermore, corresponding data from multiple imaging modalities can be overlaid, and areas of interest can be automatically marked on side-by-side images. Hard-to-find or ambiguous ultrasound targets can also be easily localized and compared by referring to related CT or MR images with corresponding areas of interest marked on the different modality images.
  • The PercuNav system has a field generator 68 which radiates an electromagnetic field permeating the site of the procedure and surrounding space.
  • Sensors 46 are located on the ultrasound probe 10, the ablation probe 24 and the patient (not shown) which interact with the electromagnetic field and produce signals used to calculate the position and orientation of the 2D image plane of the ultrasound transducer, the tissue ablation probe, and the patient.
  • a coordinate generator 70 of the PercuNav system receives the transducer position signal 50 and signals from the ablation apparatus; position and orientation coordinates for the image plane of the probe are also coupled to the field generator for field registration purposes.
  • Coordinate information of the ablation probe and image plane is coupled to the graphics and registration processor 66, which produces graphics in response to operator control signals from the control panel 36 and uses the positional information provided by the PercuNav system to register the various images in accordance with the implementations of the present invention.
  • the present invention provides methods of identifying insufficiently treated portions of a lesion after treatment.
  • the methods can include, for example: generating a non-ultrasound image of a target region of tissue comprising the lesion; acquiring a first ultrasound image comprising the lesion; registering the first ultrasound image with the non-ultrasound image; treating tissue that comprises at least a portion of the lesion; acquiring a second ultrasound image comprising the treated tissue; and determining if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image.
  • FIGURE 3 is a flow chart showing the workflow 72 of an implementation of the present invention.
  • This workflow 72 begins with a step 74 that includes obtaining a non-ultrasound image, such as a CT, MR, and/or PET image that includes three-dimensional data of imaged tissue of a patient.
  • the non-ultrasound images can be acquired as part of the interventional procedure, or the CT, MR, and/or PET images can be acquired prior to the interventional procedure and uploaded for display in the system.
  • the field of view for the non-ultrasound image includes a lesion of interest (e.g., a liver tumor) that can be processed using known methods to generate a region of interest identifying an accurate representation of a lesion boundary in 2D or 3D.
  • the lesion boundary is used to accurately determine whether the entire lesion is sufficiently treated during a treatment procedure.
  • the non-ultrasound image also includes surrounding tissue that is outside a treatment volume (e.g., an ablation volume) in the vicinity of the lesion.
  • the surrounding tissue can include, e.g., tissue structures and/or blood vessels that will provide accurate registration of later acquired ultrasound images with the non-ultrasound images.
  • Automated image-based systems for registering ultrasound volumes of tissue are well known. However, current algorithms that reference the lesion for registration will be confused or compromised by changes in the tissue due to ablation. These changes are, in fact, the very changes in the tissue that are detected and used to determine whether further treatment is needed.
  • tissue that is outside the ablation region is used by the registration algorithm for more accurate registration of images acquired before and after ablation because the tissue outside the ablation region is unaffected. Identifying the tissue outside the ablation region can be done manually through user selection of tissue spatially removed from the lesion or via the expected ablation region defined by a treatment plan that is determined using the non-ultrasound images.
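The masking idea described above can be sketched as follows. This is an illustrative 2D brute-force search over integer shifts, not the registration algorithm of the patent: a real implementation would be 3D, sub-voxel, and far more efficient, and the function name and parameters are assumptions. The point shown is that the matching score is computed only over pixels outside the expected ablation region, so ablation-induced tissue changes cannot corrupt the alignment.

```python
import numpy as np

def register_outside_treatment(pre, post, treatment_mask, max_shift=4):
    """Find the integer (row, col) shift that best aligns `post` to `pre`,
    scoring only pixels OUTSIDE the expected treatment region
    (treatment_mask is True inside the ablation region)."""
    valid = ~treatment_mask  # pixels unaffected by the treatment
    best_shift, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(post, (dy, dx), axis=(0, 1))
            err = np.abs(pre - shifted)[valid].mean()
            if err < best_err:
                best_shift, best_err = (dy, dx), err
    return best_shift
```

Because the score ignores the masked region, even a drastic intensity change inside the ablation zone leaves the recovered shift unchanged.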
  • a treatment plan can also be used to model how to treat the lesion.
  • the treatment plan depicts a predicted tissue volume to be treated (the expected ablation region).
  • the treatment plan and the expected tissue treatment (e.g., ablation) volume can also be registered with the non-ultrasound image and superimposed over ultrasound images of the patient.
  • Step 76 includes acquiring an ultrasound image that includes the lesion and surrounding tissue that is outside the treatment volume.
  • the ultrasound image can be acquired in 2D or it can include 3D image data of the lesion that is processed and rendered with the volume renderer 60 to generate a 3D volume of the lesion.
  • the non-ultrasound image and the ultrasound image are registered and fused in step 78, thereby overlaying the 3D ultrasound image volume of the lesion with the lesion imaged with the non-ultrasound modality.
  • registration can be conducted using the lesion (as no tissue changes have occurred due to ablation) or by using tissue features surrounding the lesion, such as those outside the treatment volume.
  • Step 80 includes treating (e.g., ablating) the tissue that includes the lesion. Depending on the volume of the ablation, some or all of the lesion will be ablated.
  • a second ultrasound image is acquired after tissue ablation in step 82.
  • the second 3D ultrasound image is acquired immediately or after a specified duration, such as a duration needed for dispersal of any gases associated with the treatment, such as gases generated with RF ablation.
  • the second ultrasound image also includes tissue that is outside of the predicted treatment volume. This ensures that the tissue within the treatment volume is registered accurately even though it has been affected by the treatment.
  • Step 84 includes determining if any portion of the lesion remaining after ablation was insufficiently treated.
  • the determination can be performed in a variety of ways that use the non-ultrasound image to provide a more accurate representation of the lesion boundary.
  • registration of the ultrasound images, non-ultrasound images, and/or the treatment plan can be accomplished in different orders. But registration with the non-ultrasound image provides accurate data for defining the lesion boundary, and therefore allows for a more accurate determination of whether the lesion has been sufficiently treated.
  • the non-ultrasound image is registered with the first ultrasound image.
  • the second ultrasound image can be registered with the non-ultrasound image, the ultrasound image, or both.
  • the registration algorithm can use tissue features outside the treatment volume.
  • the treatment plan is also registered with one or more of the images and is used to define tissue that is outside the treatment volume.
  • differences in elastography imaging, differences in image parameter analysis (e.g., grayscale differences), and/or differences identified with flow-based imaging can be used to distinguish treated from untreated tissue.
  • tissue characteristics within a given volume can also be used to identify portions of the lesion or surrounding tissue that was not ablated.
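A grayscale-difference comparison of the kind mentioned above can be sketched briefly. This is a hypothetical illustration: voxels inside the registered lesion boundary whose echo intensity changed little between the pre- and post-treatment images are flagged as possibly untreated. The change threshold is an arbitrary illustrative value, not a clinically validated one.

```python
import numpy as np

def flag_untreated(pre, post, lesion_mask, change_threshold=20.0):
    """Flag lesion voxels whose grayscale value changed by less than
    `change_threshold` between the registered pre- and post-treatment
    images; little change suggests insufficiently treated tissue."""
    change = np.abs(post.astype(float) - pre.astype(float))
    return lesion_mask & (change < change_threshold)
```

The returned mask marks candidate untreated tissue within the lesion boundary only; changes outside the lesion are ignored.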
  • rendered ultrasound volumes can be compared between the pre-ablation lesion and the ablated tissue.
  • the two rendered volumes of tissue can be subtracted from each other to highlight tissue that has changed.
  • a 3D volume of the lesion can be subtracted from a 3D volume of the treated tissue.
  • the lesion border, more accurately defined by the non-ultrasound modality, can be used to determine whether the treated (e.g., ablated) tissue (as determined by the pre- and post-ablation comparison algorithm) extends beyond the border of the target lesion in all orientations. In this way, the clinician can be confident that the entire lesion has been treated or, if not, can continue with the treatment for a further period of time. Any tissue that lies within the lesion border that has not been highlighted may then be considered as tissue that was not sufficiently ablated and that may require further ablation. If there is any lesion tissue remaining, then the steps can be repeated until the entire lesion has been fully treated. In some embodiments, the treated tissue volume may be larger than the lesion volume, which would indicate that all of the lesion was treated and no further treatment is needed.
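The coverage test described above reduces to simple boolean mask arithmetic once the lesion boundary (from the registered non-ultrasound image) and the treated-tissue volume (from the pre/post ultrasound comparison) are expressed as voxel masks on a common grid. A minimal sketch, with assumed function and argument names:

```python
import numpy as np

def lesion_coverage(lesion_mask, treated_mask):
    """Return (fully_treated, residual_mask). The lesion counts as fully
    treated only when every voxel inside the lesion boundary also lies
    inside the treated-tissue volume; residual_mask marks any lesion
    voxels that may require further ablation."""
    residual = lesion_mask & ~treated_mask
    return not residual.any(), residual
```

If the treated volume extends beyond the lesion border in all orientations, the residual is empty and no further treatment is indicated.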
  • the ultrasound systems can operate to perform any of the following steps: receive a non-ultrasound image of a target region comprising a lesion in a patient; acquire a first ultrasound image comprising the lesion; register the first ultrasound image with the non-ultrasound image; acquire a second ultrasound image comprising a region of treated tissue, the region of treated tissue comprising at least a portion of the lesion in the patient; and determine if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image.
  • FIGURE 4 illustrates an embodiment of registering an ultrasound image 86 with a non-ultrasound image 88, such as a MRI image.
  • a non-ultrasound image 88 such as a MRI image.
  • the ultrasound image 86 includes tissue features 90 that are outside of a region surrounding the lesion 92.
  • tissue features 90 that are spatially removed from the lesion
  • the ultrasound and non-ultrasound images may be anatomically aligned in the same orientation and overlaid. Registration may be done using known image fusion techniques such as the image fusion capability available on the Percunav image guidance system with image fusion, available from Philips Healthcare of Andover, MA.
  • Image matching techniques may also be used, such as those used to stitch digital photographs together to form a panoramic image or those used in medical diagnostic panoramic imaging, in which a sequence of images are stitched together as they are acquired.
  • Common image matching techniques use block matching, in which arrays of pixels from two images are manipulated to find a difference between them which meets a least squares (MSAD) fit. These techniques are useful for both 2D and 3D medical images as described in US Pat. 6,442,289 (Olsson et al . ) , which is
  • FIGURE 5A and 5B illustrate an implementation of comparing ultrasound images of a lesion for determining whether further treatment is needed after one treatment (e.g., an ablation treatment) .
  • ultrasound imaging data from a first ultrasound scan 94 can be used to render a 3D volume 96 of a lesion of interest.
  • a second 3D volume of the treated tissue volume 98 can be rendered from ultrasound imaging data from a second scan 100.
  • the tissue of the lesion and the treated volume can be compared using a variety of techniques that respond differently to treated vs.
  • untreated tissue such as using ultrasound contrast agents that show up differently in treated vs.
  • tissue characteristics within a given volume can also be used to identify portions of the lesion or surrounding tissue that was not treated.
  • the volume of the lesion is larger than the treated tissue volume that showed different characteristics than the untreated lesion tissue.
  • the portion having similar tissue characteristics is shown in black with respect to the lesion and can readily be determined for 2D or for 3D volumes.
  • the accurate location of the insufficiently treated tissue is
  • non-ultrasound image such as an MRI image 88
  • non-ultrasound images can have brighter contrast to show a lesion boundary more clearly.
  • registering the ultrasound images and/or compared ultrasound image data with the MRI image provide the physician with guidance data on what region of a lesion will need further treatment (e.g., ablation) .

Abstract

Methods and systems are provided for monitoring treatment of lesions using a combination of ultrasound imaging with another non-ultrasound imaging modality, such as CT, MRI, and/or PET. An image of a lesion is acquired using a modality other than ultrasound. A first ultrasound image of the lesion is generated and registered with the MRI, CT and/or PET images. The lesion is treated, and a second ultrasound image is generated to identify the volume of the treated tissue. The two ultrasound images are registered with the non-ultrasound image to better identify what tissue lies within a boundary of the lesion. The registered first and second ultrasound images are compared to determine whether any lesion tissue remains within the lesion boundary in the non-ultrasound image, thereby identifying lesion tissue that may be insufficiently treated.

Description

IMAGING SYSTEMS AND METHODS FOR MONITORING TREATMENT
OF TISSUE LESIONS
This application claims the benefit of U.S. Provisional Application No. 61/915,657, filed December 13, 2013, which is incorporated by reference herein in its entirety.
The present invention relates to medical
diagnostic ultrasound systems and, in particular, to imaging systems and methods for monitoring ablation of tissue lesions.
The use of local and minimally invasive therapies as alternatives to surgery is growing rapidly for the treatment of many lesions, especially cancer, and in many parts of the body. The advantages of these minimally invasive treatments include fewer side effects, faster recovery and, in some cases, the possibility to treat more advanced disease. One of these minimally invasive therapies is tissue ablation, in which the diseased tissue is destroyed by application of local tissue heating, cooling, or other means. Some examples of ablation methods in common use are RF ablation, microwave ablation, high-intensity focused ultrasound (HIFU), and cryoablation.
Failure to destroy the diseased tissue completely can result in recurrence of the disease. Since local ablation therapies rely on destroying the diseased tissue in situ, rather than excising it (as in surgery), it is more difficult to determine whether the diseased tissue has been completely destroyed, because the margins of the treated tissue cannot be examined directly, such as by histopathology. Imaging methods are typically used to attempt to verify that all the diseased tissue has been treated, but current imaging methods have limitations. Computed tomography (CT) is often used to plan, guide, and monitor ablation, but it is expensive, delivers potentially harmful ionizing radiation doses to the patient and the operators, and has relatively poor soft tissue contrast. Magnetic resonance imaging (MRI) is expensive and not well suited to interventional procedures involving metallic instruments. Positron emission tomography (PET) is expensive and not available in many institutions. Conventional ultrasound has the advantages of real-time imaging, lower cost, and non-ionizing operation, but in some cases it is limited in reliably visualizing tissues as compared to other imaging modalities.
Accordingly, there is a need for better methods to monitor treatment therapies, such as ablation, and to determine whether a lesion has been completely treated.
In accordance with the principles of the present invention, methods and systems are provided for monitoring treatment (e.g., ablation) of lesions using a combination of ultrasound imaging with another non-ultrasound imaging modality. The present invention includes imaging a lesion of interest using a modality other than ultrasound, such as MRI, CT and/or PET imaging. The non-ultrasound image, in particular, can be used to provide an accurate representation of the lesion boundary. A first ultrasound image of the same lesion is generated and registered with the MRI, CT and/or PET image. The lesion can be treated with or without a treatment plan using known ablative therapies, chemotherapy, radiation therapies, and/or other treatment
techniques, such as local injections of alcohol or other substances for killing the lesion tissue. Following treatment, a second ultrasound image is generated to identify the volume of the treated tissue, which can be registered with the first ultrasound image, the non-ultrasound image, or both. The ultrasound images, along with the non-ultrasound image, are also more accurately registered with reference to tissue features outside of the treatment volume. After registration, tissue differences and/or similarities between the lesion and the treated tissue can be identified by comparing data in the two ultrasound images within or around the lesion boundary. Portions of the lesion and the treated tissue that have similar tissue characteristics can indicate insufficiently treated lesion tissue, and because the images are spatially registered a physician can easily identify which portions of the lesion will need to undergo further treatment. Tissue lesions being treated include, e.g., tumors, cysts, and other tissues that can be treated using therapies, such as known ablative therapies, chemotherapy, radiation therapies, and/or other treatment techniques, such as local injections of alcohol or other substances for killing the lesion tissue.
In the drawings:
FIGURE 1 illustrates in block diagram form the use of three dimensional ultrasonic imaging to guide or monitor treatment in an embodiment of the present invention.
FIGURE 2 illustrates in block diagram form the functional subsystems of a three dimensional
ultrasonic imaging system suitable for use in an embodiment of the present invention.
FIGURE 3 illustrates a workflow in accordance with the present invention for monitoring treatment of a lesion.
FIGURE 4 depicts an example registration of an ultrasound image and an MRI image using tissue features outside of a treatment volume.
FIGURE 5A illustrates a comparison of 3D ultrasound image data before and after treatment of a lesion.
FIGURE 5B illustrates untreated tissue in need of further treatment as identified by registration with an MRI image.
In one aspect, the present invention includes imaging systems. For example, the present invention provides an imaging system for measuring a remaining volume of a lesion after an ablation treatment. The system can receive a non-ultrasound image of a target region comprising a lesion. The system can acquire a first ultrasound image comprising the lesion, and register the first ultrasound image with the non-ultrasound image. The system can acquire a second ultrasound image that includes image data representing a region of treated tissue. The region of treated tissue can include at least a portion of the lesion. The system can determine if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image. In certain embodiments, the system can determine the level of treatment by registering any combination of the non-ultrasound image, the first ultrasound image, and the second ultrasound image using imaged structures that are outside of the treated tissue. In some embodiments, the imaging system can include an ultrasonic diagnostic imaging system adapted to acquire the first and second ultrasound images. Different non-ultrasound images can be used, such as, e.g., a magnetic resonance (MR) image, a computed tomography (CT) image, or a positron emission tomography (PET) image. In some aspects, the first and second ultrasound images can include a 3D volume of the lesion and a 3D volume of the treated tissue, respectively. Determining if sufficient treatment has occurred can include subtracting the 3D volume of the lesion from the 3D volume of the treated tissue according to similar or different tissue characteristics in the first and second ultrasound images. The system can also register the non-ultrasound image, the first and second ultrasound images, and a treatment plan depicting a predicted tissue volume to be treated. In certain embodiments, the treated tissue is ablated tissue. Structural components of the system can include processors and other well-known components used to carry out the methods and features of imaging and ultrasound imaging systems. Software and algorithms can further be used to operate the various structures of the systems of the present invention.
Referring first to FIGURE 1, the use of three dimensional ultrasonic imaging to monitor ablation with a tissue ablation probe is shown in partial block diagram form. On the left side of the drawing is a three dimensional (3D) ultrasonic imaging system including an ultrasound probe 10 having a two
dimensional array transducer. A two-dimensional ultrasonic imaging system can also be used. Here, the transducer array transmits ultrasonic beams over a volumetric field of view 12 including a lesion 14 under control of an ultrasound acquisition subsystem 16 and receives echoes in response to the transmitted beams which are coupled to and processed by the acquisition subsystem. The echoes received by the elements of the transducer array are combined into coherent echo signals by the acquisition subsystem, and the echo signals, along with the coordinates from which they are received (r,θ,φ for a radial transmission pattern), are coupled to a 3D image processor 18. The 3D image processor processes the echo signals into a three dimensional ultrasonic image, which is displayed on a display 20. The ultrasound system is controlled by a control panel 22 by which the user defines the
characteristics of the imaging to be performed.
In addition to the ultrasound imaging system, FIGURE 1 includes an interventional device system for performing treatment, e.g., tissue ablation. In some embodiments, the interventional device system includes an ablation probe 24, the different types of which are well known in the art. The ablation probe 24 is used to ablate a desired tissue region in a patient, and it can be manipulated by a physician (not shown) and/or a guidance subsystem 26, which may mechanically assist the maneuvering and placement of the interventional device within the body. The ablation probe 24 is operated to ablate tissue under the control of an intervention subsystem 28, which can be operated via control panel 36 (or control panel 22, if only one control panel is used). The intervention subsystem 28 can also receive information on the procedure being performed, such as optical or acoustic image information, temperature, electrophysiologic, or other measured information, or information signaling the completion of an invasive operation. Information that is acceptable to process for display is coupled to a display processor 30. As described further below, the ablation probe 24 and/or the ultrasound probe 10 may also have active position sensors that are used to provide information as to the location of the tip of the ablation probe along the insertion path 32 and/or the position of the transducer, which can be used to determine the position of the transducer imaging plane as well. The active position sensors may operate by transmitting or receiving signals in the acoustic, optical, radio frequency, or electromagnetic spectrum, and their outputs are coupled to a device position measurement subsystem 34. Position information of the interventional device is coupled to the display processor 30 when appropriate for the processing or display of information concerning the position of the interventional device within the body. Information pertinent to the functioning or operation of the ablation probe is displayed on the display 20.
As also shown in FIGURE 1, image data may be exchanged over a signal path 38 between the 3D image processor 18 of the ultrasound system and the display processor 30 of the interventional device system for the formation of a 3D image containing information from both systems. The system in FIGURE 1 further includes a signal path 40 that connects the ultrasound acquisition subsystem 16 of the ultrasound system and the device position measurement subsystem 34 of the interventional device system to allow synchronization of the imaging system and the interventional device.
FIGURE 2 illustrates some of the components of the 3D ultrasound system of FIG. 1 in further detail. The elements of a two dimensional array transducer 42 are coupled to a plurality of microbeamformers 44. The microbeamformers control the transmission of ultrasound by the elements of the array transducer 42 and
partially beamform echoes returned to groups of the elements. The microbeamformers 44 are preferably fabricated in integrated circuit form and located in the housing of the ultrasound probe 10 near the array transducer. Microbeamformers, or subarray beamformers as they are often called, are more fully described in U.S. Pat. Nos. 6,375,617 and 5,997,479, which are incorporated by reference herein in their entirety. The ultrasound probe 10 may also include a position sensor 46 which provides signals indicative of the position of the probe 10 to a transducer position detector 48. The sensor 46 may be a magnetic, electromagnetic, radio frequency, infrared, or other type of sensor, such as one which transmits a signal that is detected by a voltage impedance circuit. As will be described further below, the transducer position signal 50 produced by the detector 48 may be used by the ultrasound system or coupled to the interventional device system when useful for the formation of spatially coordinated images containing information from both systems.
The partially beamformed signals produced by the microbeamformers 44 are coupled to a beamformer 52 where the beam formation process is completed. The resultant coherent echo signals along the beams are processed by filtering, amplitude detection, Doppler signal detection, and other processes by a signal processor 54. The echo signals are then processed into image signals in the coordinate system of the probe (r,θ,φ, for example) by an image processor 56. The image signals are converted to a desired image format (x,y,z Cartesian coordinates, for example) by a scan converter 58. The three dimensional image data is coupled to a volume renderer 60 which renders a three dimensional view of the volumetric region 12 as seen from a
selected look direction. Volume rendering is well known in the art and is described, e.g., in U.S. Pat. No. 5,474,073, which is incorporated by reference herein in its entirety. Volume rendering may also be performed on image data which has not been scan converted as
described in U.S. Pat. No. 6,723,050, which is
incorporated by reference herein in its entirety.
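The scan conversion described above maps echo samples from the probe's spherical coordinates to Cartesian coordinates. A minimal sketch of that mapping follows; the particular angle convention (beam axis along z at theta = 0) and the function name are illustrative assumptions, and a real scan converter also resamples the converted samples onto a regular voxel grid:

```python
import math

def probe_to_cartesian(r, theta, phi):
    """Map one echo sample from probe-space spherical coordinates
    (range r, angles theta and phi in radians) to Cartesian x, y, z.
    The convention here places the beam axis along z at theta = 0;
    actual transducer geometries vary."""
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return x, y, z
```

Under this convention, a sample at unit range straight down the beam axis maps to (0, 0, 1).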
During two dimensional imaging the image plane data bypasses the volume renderer and is coupled directly to a video processor 62 which produces video drive signals compatible with the requirements of the display 64. The volume rendered 3D images are also coupled to the video processor 62 for display. The system can display individual volume rendered images or a series of volume rendered images. In addition, two volume renderings can be done of a volumetric data set from slightly offset look directions, and the two are displayed
simultaneously on a stereoscopic display. A graphics and registration processor 66 is used for analysis and registration of images, such as the registration of two ultrasound images, an ultrasound image with a non-ultrasound image, or two ultrasound images and a non-ultrasound image. In addition, the graphics and registration processor 66 can receive images and data associated with a treatment plan for the ablation procedure, including an expected ablation region for the treatment. The treatment plan and the expected ablation region can also be registered with the other images. For the graphics function, the graphics and registration processor 66 receives either scan-converted image data from the scan converter 58 or unscan-converted image data from the image processor 56. Any graphics associated with the images are coupled to the video processor where they are
coordinated and overlaid for display.
The ultrasound imaging described above can be performed with a freehand approach or with an interventional device system, such as the PercuNav system, elements of which are shown in FIGURE 2.
Like a GPS for medical instruments, the PercuNav system provides imaging tools to assist clinicians in ablation procedures. It combines electromagnetic tracking of flexible or rigid instruments with patient images from multiple modalities (e.g., CT, MRI, PET and/or ultrasound) to create a real-time 3D map of the patient space that displays the instrument position, orientation, and trajectory, as well as anatomical landmarks. This map helps guide physicians to areas of interest, even when they are small, hard to visualize, difficult to access, or close to sensitive organs, vessels, or tissue. Furthermore, corresponding data from multiple imaging modalities can be overlaid for display, and areas of interest can be automatically marked on side-by-side images. Easy localization and comparison of hard-to-find or ambiguous ultrasound targets can also be conducted by referring to related CT or MR images with corresponding areas of interest marked on the different modality images.
For determining position information, the
PercuNav system has a field generator 68 which radiates an electromagnetic field permeating the site of the procedure and surrounding space. Sensors 46 are located on the ultrasound probe 10, the ablation probe 24 and the patient (not shown) which interact with the electromagnetic field and produce signals used to calculate the position and orientation of the 2D image plane of the ultrasound transducer, the tissue ablation probe, and the patient. This
calculation is done by a coordinate generator 70 of the PercuNav system, which receives the transducer position signal 50 and signals from the ablation apparatus; position and orientation coordinates for the image plane of the probe are also coupled to the field generator for field registration purposes.
Coordinate information of the ablation probe and image plane is coupled to the graphics and
registration processor 66, which produces graphics in response to operator control signals from the control panel 36 and uses the positional information provided by the PercuNav system to register the various images in accordance with the implementations of the present invention.
In another aspect, the present invention provides methods of identifying insufficiently treated portions of a lesion after treatment. The methods can include, for example, generating a non-ultrasound image of a target region of tissue comprising the lesion,
acquiring a first ultrasound image comprising the lesion, registering the first ultrasound image with the non-ultrasound image, treating tissue that comprises at least a portion of the lesion, acquiring a second ultrasound image comprising the treated tissue, and determining if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image.
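The final determining step can be illustrated with co-registered boolean volumes: a lesion mask derived from the lesion boundary in the non-ultrasound image, and a treated-tissue mask from the post-treatment ultrasound comparison. This is only a sketch; the function name, the NumPy mask representation, and the 1-D example are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def untreated_within_boundary(lesion_mask, treated_mask):
    """Return a boolean mask of voxels that lie inside the lesion
    boundary (from the non-ultrasound image) but were not treated
    (per the post-treatment ultrasound comparison). Both masks are
    assumed to be co-registered arrays of identical shape."""
    return lesion_mask & ~treated_mask

# Illustrative 1-D example: the lesion spans indices 2-7, but the
# treatment only covered indices 2-5.
lesion = np.zeros(10, dtype=bool); lesion[2:8] = True
treated = np.zeros(10, dtype=bool); treated[2:6] = True
residual = untreated_within_boundary(lesion, treated)
# residual is True only at indices 6 and 7, flagging tissue that
# may require further treatment
```

An empty residual mask would indicate that the entire lesion boundary has been covered by treatment.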
FIGURE 3 is a flow chart showing the workflow 72 of an implementation of the present invention. This workflow 72 begins with a step 74 that includes obtaining a non-ultrasound image, such as a CT, MR, and/or PET image that includes three-dimensional data of imaged tissue of a patient. The non-ultrasound images can be acquired as part of the interventional procedure, or the CT, MR, and/or PET images can be acquired prior to the interventional procedure and uploaded for display in the system. The field of view for the non-ultrasound image includes a lesion of interest (e.g., a liver tumor) that can be processed using known methods to generate a region of interest identifying an accurate representation of a lesion boundary in 2D or 3D. As will be described further below, the lesion boundary is used to accurately determine whether the entire lesion is sufficiently treated during a treatment procedure.
For registration purposes, the non-ultrasound image also includes surrounding tissue that is outside a treatment volume (e.g., an ablation volume) in the vicinity of the lesion. In particular, the surrounding tissue can include, e.g., tissue structures and/or blood vessels that will provide accurate registration of later-acquired ultrasound images with the non-ultrasound images. Automated image-based systems for registering ultrasound volumes of tissue are well known. However, current algorithms that reference the lesion for registration will be confused or compromised by changes in the tissue due to ablation. These changes are, in fact, the very changes in the tissue that are detected and used to determine whether further treatment is needed. In the present invention, tissue that is outside the ablation region is used by the registration algorithm for more accurate registration of images acquired before and after ablation, because the tissue outside the ablation region is unaffected. Identifying the tissue outside the ablation region can be done manually through user selection of tissue spatially removed from the lesion, or via the expected ablation region defined by a treatment plan that is determined using the non-ultrasound images.
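Restricting the registration to tissue outside the expected ablation region can be sketched as a masked similarity search. The 1-D exhaustive shift search below is only a stand-in for a full rigid or deformable registration, and the function names and the masked sum-of-absolute-differences cost are illustrative assumptions:

```python
import numpy as np

def masked_sad(pre, post, outside_mask):
    """Sum of absolute differences computed only over samples that
    lie outside the expected ablation region, so treatment-induced
    tissue changes do not bias the registration cost."""
    diff = np.abs(pre.astype(float) - post.astype(float))
    return diff[outside_mask].sum()

def best_shift(pre, post, outside_mask, max_shift=3):
    """Exhaustively search integer 1-D shifts of the post image and
    return the shift minimizing the masked SAD."""
    costs = {s: masked_sad(pre, np.roll(post, s), outside_mask)
             for s in range(-max_shift, max_shift + 1)}
    return min(costs, key=costs.get)

# Illustration: the post image is the pre image shifted left by 2;
# the mask excludes a central "ablated" zone (indices 4-6).
pre = np.arange(12)
post = np.roll(pre, -2)
outside = np.ones(12, dtype=bool); outside[4:7] = False
# best_shift(pre, post, outside) recovers the shift of 2
```

Because the cost ignores the masked-out zone, the same search would still recover the correct shift even if the ablation had altered the intensities inside that zone.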
A treatment plan can also be used to model how to treat the lesion. The treatment plan depicts a
predicted treatment volume expected for a given therapy system, and can be generated using methods generally known in the art. For the treatment procedure, the treatment plan and the expected tissue treatment (e.g., ablation) volume can also be registered with the non-ultrasound image and superimposed over ultrasound images of the patient.
Step 76 includes acquiring an ultrasound image that includes the lesion and surrounding tissue that is outside the treatment volume. The ultrasound image can be acquired in 2D, or it can include 3D image data of the lesion that is processed and rendered with the volume renderer 60 to generate a 3D volume of the lesion.
The non-ultrasound image and the ultrasound image are registered and fused in step 78, thereby overlaying the 3D ultrasound image volume of the lesion with the lesion imaged with the non-ultrasound modality. In this step, registration can be conducted using the lesion (as no tissue changes have occurred due to ablation) or by using tissue features surrounding the lesion, such as those outside the treatment volume.
Step 80 includes treating (e.g., ablating) the tissue that includes the lesion. Depending on the volume of the ablation, some or all of the lesion will be ablated. To determine the volume of ablated tissue, a second ultrasound image is acquired after tissue ablation in step 82. The second 3D ultrasound image is acquired immediately or after a specified duration, such as a duration needed for dispersal of any gases associated with the treatment, such as gases generated with RF ablation. For registration purposes, the second ultrasound image also includes tissue that is outside of the predicted treatment volume. This ensures that the tissue within the treatment volume is registered accurately even though it has been affected by the treatment.
Step 84 includes determining whether any portion of the lesion remains insufficiently treated after ablation. The determination can be performed in a variety of ways that use the non-ultrasound image to provide a more accurate representation of the lesion boundary, owing to the better contrast and resolution of imaging modalities such as MRI and CT as compared to ultrasound. The registration of the ultrasound images, non-ultrasound images, and/or the treatment plan can be accomplished in different orders. But registration with the non-ultrasound image provides accurate data for defining the lesion boundary, and therefore allows a physician to determine whether further ablation treatment is needed after an ablation treatment is performed. In one example, and as provided in step 78, the non-ultrasound image is registered with the ultrasound image acquired before ablation. After treatment (e.g., ablation), the second ultrasound image can be registered with the non-ultrasound image, the first ultrasound image, or both. Moreover, the registration algorithm can use tissue features outside the treatment volume. And, in some embodiments, the treatment plan is also registered with one or more of the images and is used to define tissue that is outside the treatment volume.
Once the non-ultrasound image is registered with the two ultrasound images, two dimensional images or 3D volume renderings of the lesion and the ablated tissue are compared to determine whether any tissue has not been treated sufficiently and requires subsequent ablation treatments. Identifying differences in tissue before and after treatment (e.g., ablation) is performed with known methods that can distinguish differences between treated and untreated tissue, such as using ultrasound contrast agents that show up differently in treated vs. untreated tissue, differences in elastography imaging, differences in image parameter analysis (e.g., grayscale differences), and/or differences identified with flow-based techniques. The same tissue characteristics within a given volume can also be used to identify portions of the lesion or surrounding tissue that were not ablated.

After identifying tissue differences, rendered ultrasound volumes can be compared between the pre-ablation lesion and the post-ablation tissue. The two rendered volumes of tissue can be subtracted from each other to highlight tissue that has changed significantly (e.g., as defined by a threshold level). For example, a 3D volume of the lesion can be subtracted from the 3D volume of the treated tissue according to similar or different tissue characteristics in the first and second ultrasound images. 3D subtraction methods and algorithms are well known in the art. Because the ultrasound images are registered with the non-ultrasound image, the lesion border more accurately defined by the non-ultrasound modality can be used to determine whether the treated (e.g., ablated) tissue (as determined by the pre- and post-ablation comparison algorithm) extends beyond the border of the target lesion in all orientations. In this way, the clinician can be confident that the entire lesion has been treated or, if not, can continue with the treatment for a further period of time. Any tissue that lies within the lesion border that has not been highlighted may then be considered as tissue that was not sufficiently ablated and that may require further ablation. If there is any lesion tissue remaining, then the steps can be repeated until the entire lesion has been fully treated. In some instances, the treated tissue volume may be larger than the lesion volume, which would indicate that all of the lesion was treated and no further treatment is needed.
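The subtraction-and-threshold comparison described above can be sketched as follows. The grayscale threshold of 20 and the array representation are illustrative assumptions, since the disclosure permits several change-detection criteria (contrast agents, elastography, flow-based techniques):

```python
import numpy as np

def changed_tissue(pre_vol, post_vol, threshold=20.0):
    """Boolean mask of voxels whose intensity changed by more than
    the threshold between the pre- and post-ablation volumes."""
    return np.abs(post_vol.astype(float) - pre_vol.astype(float)) > threshold

def lesion_fully_treated(lesion_mask, change_mask):
    """True when every voxel inside the lesion boundary (defined on
    the registered non-ultrasound image) shows a significant
    treatment-induced change."""
    return bool(np.all(change_mask[lesion_mask]))

# Illustration: a 4x4 slice where one voxel of the lesion did not
# change after ablation.
pre = np.zeros((4, 4))
post = pre.copy()
post[1:3, 1:3] = 50.0       # ablation changed a 2x2 region...
post[2, 2] = 5.0            # ...except one voxel
lesion = np.zeros((4, 4), dtype=bool); lesion[1:3, 1:3] = True
# lesion_fully_treated(lesion, changed_tissue(pre, post)) -> False
```

In the clinical workflow described above, a False result would prompt the clinician to continue treating and re-check.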
The methods of the present invention are carried out using ultrasound systems as described herein. For example, the ultrasound systems can operate to perform any of the following steps: receive a non-ultrasound image of a target region comprising a lesion in a patient; acquire a first ultrasound image comprising the lesion; register the first ultrasound image with the non-ultrasound image; acquire a second ultrasound image comprising a region of treated tissue, the region of treated tissue comprising at least a portion of the lesion in the patient; and determine if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image.
FIGURE 4 illustrates an embodiment of registering an ultrasound image 86 with a non-ultrasound image 88, such as an MRI image. The figure depicts the process in 2D; its application in 3D will be readily appreciated by one of ordinary skill in the art. As depicted with the arrows, the ultrasound image 86 includes tissue features 90 that are outside of a region surrounding the lesion 92. Using these tissue features that are spatially removed from the lesion, the ultrasound and non-ultrasound images may be anatomically aligned in the same orientation and overlaid. Registration may be done using known image fusion techniques, such as the image fusion capability available on the PercuNav image guidance system, available from Philips Healthcare of Andover, MA. Image matching techniques may also be used, such as those used to stitch digital photographs together to form a panoramic image, or those used in medical diagnostic panoramic imaging, in which a sequence of images is stitched together as it is acquired. Common image matching techniques use block matching, in which arrays of pixels from two images are compared to find the displacement between them that yields a minimum sum of absolute differences (MSAD) fit. These techniques are useful for both 2D and 3D medical images, as described in US Pat. 6,442,289 (Olsson et al.), which is incorporated by reference herein in its entirety, and can also allow a 2D image to be aligned with the corresponding projection or tomographic section in a 3D dataset. The images can also be anatomically aligned manually by manipulating one until the same image or image plane is seen in both images. Image orientation alignment (registration) is performed by the graphics and registration processor 74 shown in FIGURE 2.
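The block matching described above can be illustrated with a toy MSAD search over integer shifts. The function name, block geometry, and search window below are hypothetical choices for illustration; a clinical implementation, such as that of US Pat. 6,442,289, would operate on real 2D or 3D image data rather than a random test array.

```python
import numpy as np

def msad_offset(fixed, moving, block, search=5):
    """Find the integer (dy, dx) shift at which a patch of `moving` best
    matches the patch of `fixed` given by block = (y, x, h, w), using the
    minimum sum of absolute differences (MSAD) over a +/- search window."""
    y, x, h, w = block
    ref = fixed[y:y + h, x:x + w].astype(float)
    best, best_shift = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            # Skip candidate blocks that fall outside the moving image.
            if yy < 0 or xx < 0 or yy + h > moving.shape[0] or xx + w > moving.shape[1]:
                continue
            sad = np.abs(ref - moving[yy:yy + h, xx:xx + w]).sum()
            if sad < best:
                best, best_shift = sad, (dy, dx)
    return best_shift

rng = np.random.default_rng(0)
img = rng.random((40, 40))
shifted = np.roll(img, shift=(3, -2), axis=(0, 1))  # displace by (3, -2)
print(msad_offset(img, shifted, block=(10, 10, 12, 12)))  # (3, -2)
```

Repeating this search for many blocks spread across the image would yield a displacement field from which a rigid or deformable registration could be estimated.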
FIGURES 5A and 5B illustrate an implementation of comparing ultrasound images of a lesion to determine whether further treatment is needed after an initial treatment (e.g., an ablation treatment). As shown, ultrasound imaging data from a first ultrasound scan 94 can be used to render a 3D volume 96 of a lesion of interest. Following treatment (not shown), a second 3D volume of the treated tissue volume 98 can be rendered from ultrasound imaging data from a second scan 100. As described above, the tissue of the lesion and the treated volume can be compared using a variety of techniques that respond differently to treated versus untreated tissue, such as ultrasound contrast agents that appear differently in treated and untreated tissue, elastography imaging results that show different values, image parameter analysis (e.g., grayscale differences), and/or flow-based techniques. Similar tissue characteristics within a given volume can also be used to identify portions of the lesion or surrounding tissue that were not treated. Here, the volume of the lesion is larger than the treated tissue volume, which showed different characteristics than the untreated lesion tissue. The portion having similar tissue characteristics is shown in black with respect to the lesion and can readily be determined for 2D images or for 3D volumes. In FIGURE 5B, the accurate location of the insufficiently treated tissue (in black) is determined by registering a non-ultrasound image, such as an MRI image 88, with the two ultrasound images. As described herein, non-ultrasound images can provide better contrast to show a lesion boundary more clearly. As such, registering the ultrasound images and/or the compared ultrasound image data with the MRI image provides the physician with guidance on which region of a lesion will need further treatment (e.g., ablation).
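The volume comparison of FIGURES 5A and 5B — flagging voxels within the lesion boundary whose tissue characteristics did not change after treatment — can be sketched as follows. The change threshold, array shapes, and function name are illustrative assumptions; the voxel values stand in for any of the modalities named above (grayscale, elastography, or contrast-agent intensity).

```python
import numpy as np

def residual_lesion(pre, post, lesion_mask, change_threshold=0.2):
    """Voxels within the lesion boundary whose imaging characteristics did
    not change after treatment, i.e., candidate under-treated tissue.

    `pre` and `post` are co-registered volumes; voxels whose absolute
    change falls below `change_threshold` are flagged as unchanged."""
    unchanged = np.abs(post - pre) < change_threshold
    return lesion_mask & unchanged

# Toy 3D example: a 4x4x4 lesion; the treatment altered all but 3 voxels.
pre = np.full((10, 10, 10), 0.5)
lesion = np.zeros(pre.shape, dtype=bool)
lesion[3:7, 3:7, 3:7] = True
post = pre.copy()
post[lesion] = 0.9                  # treated tissue reads differently
post[3, 3, 3:6] = pre[3, 3, 3:6]    # three voxels missed by the treatment

print(residual_lesion(pre, post, lesion).sum())  # 3 flagged voxels
```

Mapping the flagged voxels back through the registration onto the non-ultrasound image then localizes the region needing further treatment, as described for FIGURE 5B.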

Claims

WHAT IS CLAIMED IS:
1. A method of identifying insufficiently treated portions of a lesion after treatment, the method comprising:
generating a non-ultrasound image of a target region of tissue comprising the lesion;
acquiring a first ultrasound image comprising the lesion;
registering the first ultrasound image with the non-ultrasound image;
treating tissue that comprises at least a portion of the lesion;
acquiring a second ultrasound image comprising the treated tissue; and
determining if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image.
2. The method of claim 1, wherein the determining comprises registering any combination of the non-ultrasound image, the first ultrasound image, and the second ultrasound image using imaged structures that are outside of the treated tissue.
3. The method of claim 1, wherein the non-ultrasound image comprises a magnetic resonance (MR) image, a computed tomography (CT) image, or a positron emission tomography (PET) image.
4. The method of claim 1, wherein the first and second ultrasound images comprise a 3D volume of the lesion and a 3D volume of the treated tissue, respectively.
5. The method of claim 4, wherein the determining comprises subtracting the 3D volume of the lesion from the 3D volume of the treated tissue according to similar or different tissue characteristics in the first and second ultrasound images.
6. The method of claim 1, wherein the treating is performed according to a treatment plan depicting a predicted tissue volume to be treated.
7. The method of claim 6, further comprising registering the treatment plan with any one of the non-ultrasound image, the first ultrasound image, or the second ultrasound image.
8. The method of claim 1, further comprising treating the insufficiently treated portions within the lesion boundary in the non-ultrasound image.
9. The method of claim 1, wherein the treating comprises ablating the tissue.
10. An imaging system for measuring a remaining volume of a lesion after an ablation treatment, the system comprising instructions that when executed cause the system to:
receive a non-ultrasound image of a target region comprising a lesion;
acquire a first ultrasound image comprising the lesion;
register the first ultrasound image with the non-ultrasound image;
acquire a second ultrasound image comprising a region of treated tissue, the region of treated tissue comprising at least a portion of the lesion; and
determine if the lesion has been sufficiently treated by comparing the lesion in the first ultrasound image with the treated tissue in the second ultrasound image and identifying whether there are untreated portions within a lesion boundary in the non-ultrasound image.
11. The imaging system of claim 10, wherein the determine step comprises registering any combination of the non-ultrasound image, the first ultrasound image, and the second ultrasound image using imaged structures that are outside of the treated tissue.
12. The imaging system of claim 10, comprising an ultrasonic diagnostic imaging system adapted to acquire the first and second ultrasound images.
13. The imaging system of claim 10, wherein the non-ultrasound image comprises a magnetic resonance (MR) image, a computed tomography (CT) image, or a positron emission tomography (PET) image.
14. The imaging system of claim 10, wherein the first and second ultrasound images comprise a 3D volume of the lesion and a 3D volume of the treated tissue, respectively.
15. The imaging system of claim 14, wherein the determine step comprises subtracting the 3D volume of the lesion from the 3D volume of the treated tissue according to similar or different tissue characteristics in the first and second ultrasound images.
16. The imaging system of claim 10, wherein the instructions, when executed, further cause the system to register the non-ultrasound image, the first and second ultrasound images, and a treatment plan depicting a predicted tissue volume to be treated.
17. The imaging system of claim 10, wherein the treated tissue is ablated tissue.
PCT/IB2014/066537 2013-12-13 2014-12-03 Imaging systems and methods for monitoring treatment of tissue lesions WO2015087203A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361915657P 2013-12-13 2013-12-13
US61/915,657 2013-12-13

Publications (1)

Publication Number Publication Date
WO2015087203A1 true WO2015087203A1 (en) 2015-06-18

Family

ID=52350159

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/066537 WO2015087203A1 (en) 2013-12-13 2014-12-03 Imaging systems and methods for monitoring treatment of tissue lesions

Country Status (1)

Country Link
WO (1) WO2015087203A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5474073A (en) 1994-11-22 1995-12-12 Advanced Technology Laboratories, Inc. Ultrasonic diagnostic scanning for three dimensional display
US5997479A (en) 1998-05-28 1999-12-07 Hewlett-Packard Company Phased array acoustic systems with intra-group processors
WO2002009588A1 (en) * 2000-08-01 2002-02-07 Tony Falco Method and apparatus for lesion localization, definition and verification
US6375617B1 (en) 2000-08-24 2002-04-23 Atl Ultrasound Ultrasonic diagnostic imaging system with dynamic microbeamforming
US6442289B1 (en) 1999-06-30 2002-08-27 Koninklijke Philips Electronics N.V. Extended field of view ultrasonic diagnostic imaging
US6723050B2 (en) 2001-12-19 2004-04-20 Koninklijke Philips Electronics N.V. Volume rendered three dimensional ultrasonic images with polar coordinates
WO2005010711A2 (en) * 2003-07-21 2005-02-03 Johns Hopkins University Robotic 5-dimensional ultrasound
US20050261591A1 (en) * 2003-07-21 2005-11-24 The Johns Hopkins University Image guided interventions with interstitial or transmission ultrasound
WO2013179221A1 (en) * 2012-05-29 2013-12-05 Koninklijke Philips N.V. Elasticity imaging-based methods for improved gating efficiency and dynamic margin adjustment in radiation therapy

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11419682B2 (en) 2016-11-11 2022-08-23 Gynesonics, Inc. Controlled treatment of tissue and dynamic interaction with, and comparison of, tissue and/or treatment data
JP2020518385A (en) * 2017-05-04 2020-06-25 ガイネソニックス, インコーポレイテッド A method for monitoring ablation progression using Doppler ultrasound
EP3638126A4 (en) * 2017-05-04 2021-03-10 Gynesonics, Inc. Methods for monitoring ablation progress with doppler ultrasound
US11612431B2 (en) 2017-05-04 2023-03-28 Gynesonics, Inc. Methods for monitoring ablation progress with doppler ultrasound
WO2023222845A1 (en) * 2022-05-20 2023-11-23 Koninklijke Philips N.V. Multi-modality image visualization for stroke detection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14827540

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14827540

Country of ref document: EP

Kind code of ref document: A1