CA2300245A1 - System and method for intra-operative image-based, interactive verification of a pre-operative surgical plan - Google Patents


Info

Publication number
CA2300245A1
Authority
CA
Canada
Prior art keywords
operative
data
images
intra
imaging camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002300245A
Other languages
French (fr)
Inventor
Andre Gueziec
Alan David Kalvin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Publication of CA2300245A1 publication Critical patent/CA2300245A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5211 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data
    • A61B6/5229 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image
    • A61B6/5235 Devices using data or image processing specially adapted for radiation diagnosis involving processing of medical diagnostic data combining image data of a patient, e.g. combining a functional image with an anatomical image combining images from the same or different ionising radiation imaging techniques, e.g. PET and CT
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/16 Bone cutting, breaking or removal means other than saws, e.g. Osteoclasts; Drills or chisels for bones; Trepans
    • A61B17/17 Guides or aligning means for drills, mills, pins or wires
    • A61B17/1739 Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body
    • A61B17/1742 Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body for the hip
    • A61B17/175 Guides or aligning means for drills, mills, pins or wires specially adapted for particular parts of the body for the hip for preparing the femur for hip prosthesis insertion
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937 Visible markers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers

Abstract

A system and method for intra-operatively providing a surgeon with visual evaluations of possible surgical outcomes ahead of time, and for generating simulated data, includes a medical imaging camera, a registration device for registering data to a physical space and to the medical imaging camera, and a fusion mechanism for fusing the data and the images to generate simulated data. The simulated data (e.g., augmented X-ray images) is natural and easy for a surgeon to interpret.
In an exemplary implementation, the system preferably includes a data processor which receives a three-dimensional surgical plan or three-dimensional plan of therapy delivery, one or a plurality of two-dimensional intra-operative images, a three-dimensional model of pre-operative data, registration data, and image calibration data. The data processor produces one or a plurality of simulated post-operative images by integrating a projection of a three-dimensional model of pre-operative data onto one or a plurality of two-dimensional intra-operative images.

Description

SYSTEM AND METHOD FOR INTRA-OPERATIVE, IMAGE-BASED, INTERACTIVE VERIFICATION
OF A PRE-OPERATIVE SURGICAL PLAN
BACKGROUND OF THE INVENTION
Field of the Invention
The present invention generally relates to robotics and medical imaging techniques and, more particularly, to robotically-assisted surgical systems and other devices incorporating methods for registering image data (both pre-operative and intra-operative) to physical space and for providing feedback, in particular visual feedback, to the clinician.
Description of the Related Art
Computers are increasingly used to plan complex surgeries by analyzing pre-operative Computed Tomography (CT) or Magnetic Resonance Imaging (MRI) scans of a patient.
To execute the surgical plan, it is important to accurately align or register the three-dimensional pre-operative and intra-operative data to an actual location of the patient's anatomical features of interest during surgery.
One conventional technique for performing this type of registration is to attach a stereo-tactic frame or fiducial markers to the patient, and to precisely locate the frame or markers prior to and during surgery.
For example, in the case of a surgery involving a patient's femur, a conventional registration protocol includes implanting three metallic markers or pins in the patient's femur (e.g., one proximally in the trochanter and two distally in the condyles, near the knee).
However, the insertion of the pins requires minor surgery. A CT-scan image of the patient is subsequently acquired. By analyzing the CT data, the surgeon decides upon the size and location of the implant that best fits the patient's anatomy. During surgery, the metallic pins are exposed at the hip and knee. The patient's leg is attached to a surgical robot device that then must locate the exposed pins. A registration, or coordinate transformation from CT space to robot space, is computed using the locations of the three pins as a Cartesian frame. The accuracy of this registration has been measured to be better than one millimeter. This conventional registration protocol is described in U.S. Patent No. 5,299,288 entitled "IMAGE-DIRECTED ROBOTIC SYSTEM FOR PRECISE ROBOTIC SURGERY INCLUDING REDUNDANT CONSISTENCY CHECKING" by Glassman et al.
However, using such pins as markers is not always desirable, as they may cause significant patient discomfort, and the required surgical procedure to insert and subsequently remove the pins is inconvenient and costly to the patient.
An alternative registration technique is to perform anatomy-based registration, which uses anatomical features of the patient (generally bone features) as markers for registration.
Conventional methods for performing anatomy-based registration are described in "Registration of Head CT Images to Physical Space Using a Weighted Combination of Points and Surfaces" by Herring et al., in IEEE Transactions on Medical Imaging, Vol. 17, No. 5, pages 753-761, 1998, and in U.S. Patent Application No. 08/936,935 (Y0997-322) entitled "METHODS AND APPARATUS FOR REGISTERING CT-SCAN DATA TO MULTIPLE FLUOROSCOPIC IMAGES", filed on September 27, 1997 by A. Gueziec et al.
Once the registration has been performed, it is important to provide the clinician with means to assess the registration, allowing him or her to validate, reject or improve the registration (and the surgical plan). A system and method for advising a surgeon is described in U.S. Patent No. 5,445,166, entitled "SYSTEM FOR ADVISING A SURGEON", by Taylor. Taylor describes a system for guiding the motions of a robot, or of a positioning device controlled by motors, and teaches how audio feedback and force feedback can be provided to a surgeon.
Taylor also describes a visual adviser allowing comparison of the surgical plan with its execution.
The system taught by Taylor optionally uses a camera at the end of a surgical instrument that sends an image to the graphics adapter, optionally mixed with graphics output of the computer.
A conventional technique for simulating a post-operative X-ray image is described in "An Overview of Computer-Integrated Surgery at the IBM T. J. Watson Research Center" by Taylor et al., in the IBM Journal of Research and Development, 1996.
Thus, conventional techniques are useful for registering three-dimensional pre-operative and intra-operative data to an actual location of anatomical features of interest during surgery, and for providing advice to the surgeon. However, none of the conventional techniques teaches how to simulate a post-operative condition depending upon the registration of image data to physical space, by fusing intra-operative images with registered pre-operative data and generating new images.
In Taylor et al., the simulated post-operative X-ray image is generated using only pre-operative CT (Computed Tomography) data. Herring et al. do not teach how to evaluate the registration accuracy intra-operatively.
Although Glassman et al.'s and Taylor's systems compare a surgical plan and its execution, neither Glassman et al. nor Taylor teaches how to simulate the outcome of a surgical plan prior to the actual execution of the plan. With Taylor's system, a surgeon can take corrective measures to minimize the effects of a wrongful execution of the plan, but cannot make a decision before any execution of the plan and therefore cannot prevent all errors before they occur.
Further, the information produced by Taylor's system for advising a surgeon is not presented in the form of conventional medical media (e.g., X-ray images), which places an extra burden on the surgeon in interpreting and evaluating this information.
Thus, it is believed that conventional techniques do not exist (or at the very least are inadequate) for (a) providing the surgeon with post-operative evaluations prior to executing the surgical plan, obtained by merging intra-operative image data and pre-operative data, and (b) presenting such evaluations in a standard clinical fashion (e.g., augmented X-ray images) that is natural for a surgeon to interpret.
Other problems of the conventional systems and methods include the limited availability of 2-D/3-D registration methods in conventional systems for advising a surgeon, and the fact that 2-D/3-D registration postdates such systems.
SUMMARY OF THE INVENTION
In view of the foregoing and other problems of the conventional methods and structures, an object of the present invention is to provide a method and structure for intra-operatively providing the surgeon with visual evaluations of possible surgical outcomes ahead of time, the evaluations being obtained by merging intra-operative image data and pre-operative data, and presented in a standard clinical fashion (e.g., such as augmented X-ray images) that is natural and easy for a surgeon to interpret.
Another object of the present invention is to provide a system and method for comparing several registration methods of pre-operative data to the physical space of the operating room.
Yet another object of the present invention is to provide a system and method for assisting the surgeon in improving an inaccurate registration of a pre-operative surgical plan to the physical space.
Still another object of the present invention is to provide an improved robotically assisted surgical system that also provides visual post-operative evaluations.
A further object of the present invention is to provide an improved robotically-assisted surgical system that includes a system for assisting the surgeon in improving a registration.
Another object of this invention is to provide an improved robotically assisted surgical system that includes a system for preventing surgical errors caused by internal failure of the robot's calibration system.
The present invention includes a system to intra-operatively provide the surgeon with visual evaluations of possible surgical outcomes ahead of time, the evaluations being obtained by merging intra-operative image data and pre-operative data, and being presented in a standard clinical fashion (e.g., such as augmented X-ray images) that is natural and easy for a surgeon to interpret.
The inventive system preferably includes a data processor. The data processor takes as inputs a three-dimensional surgical plan or three-dimensional plan of therapy delivery, one or a plurality of two-dimensional intra-operative images, a three-dimensional model of pre-operative data, registration data, and image calibration data.
The data processor produces one or a plurality of simulated post-operative images, by integrating a projection of a three-dimensional model of pre-operative data onto one or a plurality of two-dimensional intra-operative images.
The data processor optionally receives an input from a surgeon or a clinician.
The input preferably includes a set of constraints on the surgical plan or plan of therapy delivery. The data processor preferably optimizes the surgical plan or plan of therapy delivery using the constraints.
In a first aspect of the present invention, a system (and method) for generating simulated data, includes a medical imaging camera for generating images, a registration device for registering data to a physical space, and to the medical imaging camera, and a fusion (integration) mechanism for fusing (integrating) the data and the images to generate simulated data.
In another aspect of the invention, a signal-bearing medium is provided for storing a program for performing the method of the invention. Other aspects of the invention are also set forth below.
With the invention, the surgeon is provided with intra-operative visual evaluations of possible surgical outcomes in advance, with the evaluations being obtained by merging intra-operative image data and pre-operative data. Such evaluations are presented in a standard clinical fashion that is natural and easy for a surgeon to interpret. Further, the inventive system compares several registration methods of pre-operative data to the physical space of the operating room.
Moreover, the invention assists the surgeon in improving an inaccurate registration of a pre-operative surgical plan to the physical space. Additionally, the system can be robotically-assisted and can provide visual post-operative evaluations.
Finally, in the robotically-assisted implementation of the inventive system, surgical errors, caused by internal failure of the robot's calibration system, can be prevented.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other purposes, aspects and advantages will be better understood from the following detailed description of preferred embodiments of the invention with reference to the drawings, in which:
Figure 1 is a block diagram of a preferred embodiment of a system according to the present invention;
Figure 2 is a flow chart showing an overview of a process to generate a post-operative simulation;
Figure 3 is a flow chart showing an overview of a process for validating, rejecting or improving a surgical plan using post-operative simulations;
Figure 4 is a diagram illustrating a preferred method for calibrating intra-operative images with respect to a coordinate system of relevance;

Figure 5 is a diagram illustrating a preferred method for generating a reformatted and optionally undistorted image from an intra-operative image;
Figure 6 is a diagram illustrating a method for determining a new image resolution and new pixel locations;
s Figure 7 is a diagram illustrating a method for determining image gray-scale values corresponding to new pixel locations;
Figure 8 is a diagram showing a simulated post-operative X-ray image;
Figure 9 is a diagram illustrating a method for comparing several registration methods;
Figure 10 is a diagram illustrating how one or several two-dimensional regions of acceptable positions can be used to define a three-dimensional region of acceptable positions;
Figure 11 is a diagram illustrating a two-dimensional region of acceptable positions; and
Figure 12 illustrates a storage medium 1200 for storing steps of the program for generating a post-operative simulation.

DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS OF THE INVENTION
Referring now to the drawings, and more particularly to Figures 1-12, there are shown preferred embodiments of the method and structures according to the present invention.
Generally, the present invention resides in a system and method to intra-operatively provide the surgeon with visual evaluations of possible surgical outcomes ahead of time, the evaluations being obtained by merging intra-operative image data and pre-operative data, and being presented in a standard clinical fashion (such as augmented X-ray images) that is natural and easy for a surgeon to interpret.
A novel aspect of the present invention is to allow intra-operative manipulation of a model (e.g., such as a CAD model of an implant) as opposed to a real object (e.g., such as a cutter of a surgical robot as in Taylor's system).
Referring to Figure 1, a system 1000 according to the present invention uses a two-dimensional intra-operative image 1010 (e.g., a two-dimensional X-ray or other type of image) and a three-dimensional shape of a prosthetic implant 1020, and comprises a data processor 1040.
The pre-operative image (e.g., of the shape of the implant with respect to anatomical features) may be obtained by an X-ray, computed tomography (CT) scanner, whereas the intra-operative image(s) may be obtained by a two-dimensional (2D) X-ray camera.
The data processor 1040 receives the image 1010 and the shape 1020, as well as registration data 1050 and a surgical plan 1060. The registration data registers the shape 1020 with the camera used for acquiring the image 1010. An example of a registration process producing registration data 1050 is provided in the above-mentioned U.S. Patent Application No. 08/936,935.
A typical example of the surgical plan 1060 is a planned type, orientation and position of an implant relative to anatomical structures in a pre-operative CT scan.
Another example of the surgical plan 1060 is the planned type, orientation and position of an implant relative to co-registered intra-operative X-ray images of anatomical structures.
Image calibration data 1070 is also input to the data processor. The data processor 1040 produces a simulated post-operative image 1030. Image 1030 may be presented visually to the surgeon on a display 1035. That is, the post-operative simulation (e.g., data which preferably includes an image such as a 2-dimensional image) may be displayed on any of a cathode ray tube (CRT), liquid crystal display (LCD), or the like.
Referring now to Figure 2, the operation of the present invention will be described hereinbelow. Figure 2 is a flow chart illustrating how a post-operative simulation can be generated using the present invention.
In Step 2010, an image (e.g., an X-ray image or other intra-operative image 1010 as shown in Figure 1) is captured intra-operatively. Conventional methods for capturing an X-ray image include using a frame grabber connected to the video output of a conventional fluoroscope.
Fluoroscopes are manufactured by many medical imaging equipment manufacturers.
An example of a fluoroscope is the Ziehm Exposcop Plus System (Exposcop Plus is a trademark of the Ziehm Corporation). Another method for capturing an X-ray image intra-operatively is to use an X-ray flat panel detector. An example of an X-ray flat panel detector is the FlashScan 30. FlashScan 30 is a trademark of the DPIX Corporation.
Then in Step 2020, a geometric calibration of the X-ray image is performed.
Geometric calibration is preferably performed using the teachings of the above-mentioned U.S. Patent Application 08/936,935.
In Step 2030, X-ray and pre-operative CT data are registered (e.g., this data represents the registration data 1050 of Figure 1). A preferred method for registering X-ray and pre-operative CT data is described in the above-mentioned U.S. Patent Application 08/936,935.
Then, in Step 2040, the geometric distortion of the X-ray image is corrected.
Further understanding of Step 2040 can be achieved with reference to Figures 5-7 described below.
In Step 2050, the registration and calibration are used to project pre-operative data such as a three-dimensional shape of an implant (e.g., shape 1020 in Figure 1) onto the X-ray image. The result is the simulated post-operative image 1030 in Figure 1.
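The projection in Step 2050 can be sketched as follows, under the assumption that the registration data 1050 and image calibration data 1070 together reduce to a 3x4 perspective projection matrix mapping registered model coordinates to image pixels; the matrix P, the function name and the toy numbers below are illustrative only and are not taken from the patent:

import numpy as np

def project_model(vertices_3d, P):
    """Project 3-D model vertices onto the 2-D image plane.

    vertices_3d : (N, 3) array of points in the registered camera frame.
    P           : (3, 4) perspective projection matrix assumed to combine
                  the registration and image calibration data.
    Returns an (N, 2) array of pixel coordinates.
    """
    n = vertices_3d.shape[0]
    homogeneous = np.hstack([vertices_3d, np.ones((n, 1))])   # (N, 4)
    projected = homogeneous @ P.T                              # (N, 3)
    return projected[:, :2] / projected[:, 2:3]                # divide by w

# Toy example (illustrative numbers only): eight corners of a small box
# located roughly one metre from the assumed X-ray source.
P = np.array([[1000.0, 0.0, 256.0, 0.0],
              [0.0, 1000.0, 256.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
box = np.array([[x, y, z] for x in (-0.01, 0.01)
                          for y in (-0.01, 0.01)
                          for z in (0.9, 1.1)])
print(project_model(box, P))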
Figure 3 is a flow chart showing an overview of a process for validating, rejecting or improving a surgical plan using post-operative simulations.
In Step 2000, a post-operative simulation (as described above in Figure 2) is generated using the present invention. After reviewing this simulation on the display or the like, the surgeon may optionally request one or more additional 2D X-ray views to assist in evaluating the simulation, as shown in Step 3010.
In Step 3020, the surgical plan is evaluated as follows: (1) if the surgeon evaluates it as being satisfactory, the plan is accepted as is, and the surgery proceeds (Step 3030); (2) if the plan is considered unsatisfactory, the plan is either (a) rejected (Step 3040), and, as shown in Step 3050, the surgeon reverts to manual surgery or the surgical procedure is canceled, or (b) the plan is modified, as shown in Steps 3060-3070.
In Step 3060, the surgeon has the option to specify additional three-dimensional constraints in a manner that preserves existing three-dimensional constraints. That is, the surgeon may define constraints that preserve a projected alignment on a subset of the total number of views. For example, the surgeon can indicate "prohibited regions" on one or more of the two-dimensional images, where such regions are "off limits" to the prosthesis being implanted.
Typical "off limit" regions are those close to critical arteries, nerves, or regions where the bone is very thin and fragile. Each such two-dimensional "prohibited region" corresponds to a three-dimensional "prohibited volume" that is defined by the 2-D region and the projection geometry. It is noted that these existing three-dimensional constraints are a subset of the constraints that are represented in the existing two-dimensional X-ray views.
In Step 3070, the surgeon optionally refines the surgical plan (e.g., adjusting the position of an end-effector or prosthetic implant, etc.) based on the existing three-dimensional constraints and those new constraints produced in Step 3060. A typical example of the refinement process occurs if the surgeon sees that the planned position for implanting a prosthesis is unsatisfactory because the implant will impinge on a region of thin bone, thereby increasing the chance of subsequent bone-fracture. The surgical procedure is therefore refined by calculating a modified position for the implant relative to the patient.
Then, the process loops to step 2000, and a new post-operative simulation is now generated.
The sequence of steps 2000, 3010, 3020, 3040, 3060, 3070 is repeated until the resulting surgical plan is accepted (Step 3030) or rejected (Step 3050).
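The review loop of Figure 3 can be summarized by the following control-flow sketch; the callables (generate_simulation, surgeon_review, refine_plan) are hypothetical placeholders for the steps described above, not an implementation of them:

def verify_plan(plan, images, model, registration, calibration,
                generate_simulation, surgeon_review, refine_plan):
    """Control-flow sketch of Figure 3 (Steps 2000-3070).

    The three callables are supplied by the surrounding system; this sketch
    only mirrors the accept / reject / refine loop described in the text.
    """
    while True:
        simulation = generate_simulation(plan, images, model,
                                         registration, calibration)   # Step 2000
        verdict, new_constraints = surgeon_review(simulation)          # Steps 3010-3020
        if verdict == "accept":      # Step 3030: proceed with the planned surgery
            return plan
        if verdict == "reject":      # Steps 3040-3050: manual surgery or cancel
            return None
        plan = refine_plan(plan, new_constraints)                      # Steps 3060-3070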
Referring now to Figure 4, the geometric calibration of an X-ray image of Step 2020 will be further described below.
As shown in Figure 4, both a near grid 4020 of markers and a far grid 4030 of markers are positioned in front of an X-ray source 4010. Distorted grid points (e.g., distorted because of the physical process of X-ray image capture using an image intensifier and because of other potential optical distortions) are observed on one or a plurality of captured distorted images 4040. Any image pixel x can be associated with two locations x_n (near grid) and x_f (far grid) on the grids, specifying a ray in three dimensions.
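A minimal sketch of how a calibrated pixel yields a three-dimensional ray, assuming the calibration supplies, for every pixel, its near-grid point x_n and far-grid point x_f expressed in a common three-dimensional coordinate system (the function name and the numbers are illustrative):

import numpy as np

def pixel_ray(x_near, x_far):
    """Return (origin, unit direction) of the 3-D ray defined by a pixel's
    near-grid point x_n and far-grid point x_f (Figure 4)."""
    x_near = np.asarray(x_near, dtype=float)
    x_far = np.asarray(x_far, dtype=float)
    direction = x_far - x_near
    return x_near, direction / np.linalg.norm(direction)

# Illustrative example: a pixel whose ray passes through (0, 0, 100) on the
# near grid and (2, 1, 500) on the far grid (units in millimetres).
origin, direction = pixel_ray([0.0, 0.0, 100.0], [2.0, 1.0, 500.0])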
Referring now to Figures 5-7, the correction of geometric distortion in X-ray images of Step 2040 will be described in detail.
As shown in Figure 5, each pixel of a distorted X-ray image 5010 is mapped to (u, v)-space 5020 (or any suitable coordinate system) of any one of the near grid 4020 of markers or the far grid 4030 of markers.
The four corner pixels of the image 5010 are preferably mapped to (u, v)-space 5020, and a quadrilateral 5030 of transformed corner pixels is obtained.
Then, a rectangular area 5040 bounding the distortion-corrected image is determined. A preferred method for determining the rectangular area 5040 is to use the first and last transformed pixels of the first pixel row of image 5010 to define a first side of the rectangular area 5040, and to determine a second side, orthogonal to the first side, whose length is equal to the distance between the first and last transformed pixels of the first pixel column of image 5010.
Referring to Figure 6, new pixel locations 6010 are preferably determined by dividing a rectangular boundary 6030 of a distortion-corrected image, determined with the rectangular area 5040, into substantially equally-sized square pixels, the rectangular boundary 6030 approximating an area 6020 of a transformed image.
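A minimal sketch of the resampling grid of Figure 6, assuming the bounding rectangle has already been determined in (u, v)-space; the origin, extent and pixel size used below are illustrative values:

import numpy as np

def new_pixel_grid(u_min, v_min, width, height, pixel_size):
    """Centres of substantially equally-sized square pixels covering the
    rectangular boundary 6030 of the distortion-corrected image."""
    n_cols = int(round(width / pixel_size))
    n_rows = int(round(height / pixel_size))
    us = u_min + (np.arange(n_cols) + 0.5) * pixel_size
    vs = v_min + (np.arange(n_rows) + 0.5) * pixel_size
    uu, vv = np.meshgrid(us, vs)
    return np.stack([uu, vv], axis=-1)          # shape (n_rows, n_cols, 2)

# Illustrative example: a 100 x 80 unit rectangle sampled at 0.5 units per pixel.
grid = new_pixel_grid(0.0, 0.0, 100.0, 80.0, 0.5)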
Referring to Figure 7, a triangular mesh 7010 is defined by transforming the pixels of the distorted image and connecting the transformed pixel locations with triangles.
For each new pixel 7020, a closest triangle is determined. A gray-scale value for the new pixel 7020 is preferably determined using gray-scale values of the three corners 7030 of the closest triangle, by interpolating the gray-scale values of the three corners of the closest triangle using triangle barycentric coordinates 7040 of the new pixel 7020 with respect to the three corners 7030 of the closest triangle.
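The interpolation of Figure 7 can be sketched as follows, assuming the closest triangle for a new pixel has already been found and is given by the (u, v) locations of its three transformed corners together with their gray-scale values (function and variable names are illustrative):

import numpy as np

def barycentric_gray(p, corner_uv, corner_gray):
    """Interpolate a gray-scale value for new pixel p inside a triangle.

    p           : (2,) new pixel location in (u, v)-space.
    corner_uv   : (3, 2) transformed locations of the triangle corners 7030.
    corner_gray : (3,) gray-scale values at those corners.
    """
    a, b, c = (np.asarray(v, dtype=float) for v in corner_uv)
    p = np.asarray(p, dtype=float)
    # Solve p = w0*a + w1*b + w2*c with w0 + w1 + w2 = 1.
    T = np.column_stack([b - a, c - a])
    w1, w2 = np.linalg.solve(T, p - a)
    w0 = 1.0 - w1 - w2
    return w0 * corner_gray[0] + w1 * corner_gray[1] + w2 * corner_gray[2]

# Illustrative example: the centroid of the triangle receives the mean value.
print(barycentric_gray([1.0, 1.0],
                       [[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]],
                       [30.0, 60.0, 90.0]))    # -> 60.0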
Referring to Figure 8, the step 2050 of using the registration and calibration to project pre-operative data, such as a three-dimensional shape of an implant, onto an X-ray image will be further described.
As shown in Figure 8, a two-dimensional projection 8010 of a silhouette of a three-dimensional implant model 8030 is integrated in an X-ray image 8020.
The process of integrating the two-dimensional projection 8010 onto the image 8020 preferably uses the following steps.
Using the calibration of the X-ray image 8020, a center of perspective 8040 is determined whose location represents an estimate of the location of an X-ray source. The center of perspective 8040 is used to compute silhouette curves 8050 on the three-dimensional implant model 8030.
Silhouette curves (or apparent contours) are such that rays emanating from the center of perspective and tangent to the three-dimensional model meet the three-dimensional model on a silhouette curve.
Silhouette curves are defined precisely in the above-mentioned US Patent Application 08/936,935.
Various techniques can be used to project silhouette curves 8050 on a two-dimensional image, such as the X-ray image 8020.
One preferred method for projecting silhouette curves is to consider in turn each new pixel 7020, determine a line in three dimensions corresponding to that pixel by image calibration, compute the distance from the line to the silhouette curves, and assign a pixel gray-scale value depending on that distance. The distance is preferably computed using a method described in the above-mentioned U.S. Patent Application No. 08/936,935.
An example of assignment of pixel gray-scale values corresponding to distances is as follows.
If the distance is less than, for example, about 0.05 mm (e.g., this value can be freely selected depending on the size of the anatomical feature or area of interest, etc.), then the gray-scale value is set to 0. Otherwise, if the distance is less than about 0.1 mm, then the gray-scale value is set to 30. Otherwise, if the distance is less than 0.2 mm, then the gray-scale value is set to 60; otherwise, the gray-scale value is not modified for the purpose of projecting the silhouette curves.
It is noted that other values may be more suitable for other applications.
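The example assignment above corresponds to a simple threshold mapping; the sketch below uses the thresholds and gray-scale values quoted in the text, which, as noted, may be tuned for the application:

def silhouette_gray(distance_mm, current_gray):
    """Map the distance from a pixel's 3-D line to the silhouette curves
    onto a gray-scale value (example thresholds from the text)."""
    if distance_mm < 0.05:
        return 0
    if distance_mm < 0.1:
        return 30
    if distance_mm < 0.2:
        return 60
    return current_gray          # pixel left unchanged

print(silhouette_gray(0.03, 128))   # 0: pixel lies on the projected silhouette
print(silhouette_gray(0.50, 128))   # 128: pixel unchanged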
Various techniques are known in conventional computer graphics for projecting a three-dimensional shape onto a two-dimensional image using the image calibration information 1070. These conventional methods differ from the method explained previously in that individual three-dimensional polygons (or other elementary primitives) that define the shape are individually projected onto the two-dimensional image and shaded according to the orientation of the normals of the vertices of the three-dimensional polygon and other factors, including colors potentially attached to polygon vertices. Generally, the objective is to preserve the three-dimensional appearance of the object when projecting it. Such techniques are described in standard textbooks such as "Computer Graphics: Principles and Practice", by Foley et al., Addison Wesley, 1991.
In an alternative embodiment, the silhouette curves 8050 may be projected on the two-dimensional image (e.g., by associating pixels with projected curve vertices and connecting such pixels with lines using Bresenham's line-drawing algorithm, which is described in Foley et al., pages 72-73), forming closed two-dimensional polygonal loops.
In yet another embodiment, the closed polygonal loops may be rasterized using conventional polygon-filling methods (e.g., yielding a uniform gray level or color for the area of the two-dimensional image where the three-dimensional shape projects).
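A minimal sketch of the alternative embodiment, assuming the silhouette-curve vertices have already been projected to pixel coordinates (for example with a projection like the one sketched after Step 2050); a dense line-sampling routine stands in for Bresenham's algorithm, and the image size is illustrative:

import numpy as np

def draw_line(image, p0, p1, value=0):
    """Set pixels along the segment p0-p1 by dense sampling (a simple
    stand-in for Bresenham's line-drawing algorithm)."""
    steps = int(max(abs(p1[0] - p0[0]), abs(p1[1] - p0[1]))) + 1
    for t in np.linspace(0.0, 1.0, steps + 1):
        x = int(round(p0[0] + t * (p1[0] - p0[0])))
        y = int(round(p0[1] + t * (p1[1] - p0[1])))
        if 0 <= y < image.shape[0] and 0 <= x < image.shape[1]:
            image[y, x] = value

def overlay_silhouette(image, curve_pixels, value=0):
    """Connect consecutive projected silhouette-curve vertices into a
    closed two-dimensional polygonal loop drawn on the image."""
    for p0, p1 in zip(curve_pixels, np.roll(curve_pixels, -1, axis=0)):
        draw_line(image, p0, p1, value)

# Illustrative example: overlay a small triangular loop on a blank image.
image = np.full((64, 64), 200, dtype=np.uint8)
overlay_silhouette(image, np.array([[10, 10], [50, 15], [30, 50]]))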

Referring to Figure 9, two or more registration methods (preferably used for registering pre-operative data to the physical space of the operating room) are compared by integrating, onto an X-ray image 9030, a projection 9010 of an implant predicted with a first registration method (any method, e.g., that of the above-mentioned "Registration of Head CT Images to Physical Space Using a Weighted Combination of Points and Surfaces" by Herring et al., in IEEE Transactions on Medical Imaging, Vol. 17, No. 5, pages 753-761) and a second projection 9020 of an implant predicted with a second registration method. Thus, the invention provides a mechanism for comparing the efficiency of various methods.
Figure 10 illustrates the relationship between (a) the set of three-dimensional acceptable positions 10030 of an end-effector, prosthetic implant or other object being positioned as part of the surgical plan, and (b) the two-dimensional regions of acceptable positions 10010 depicted in the two-dimensional intra-operative images 10020.
Given the set of three-dimensional acceptable positions 10030, the two-dimensional regions of acceptable positions 10010 are computed using standard conventional methods to compute two-dimensional projections of a three-dimensional object.
Figure 11 shows a technique for depicting to the surgeon the relationship between a two-dimensional projection of the end-effector, prosthetic implant or other object being positioned as part of the surgical plan 11010, and the corresponding two-dimensional region of acceptable positions 11020. The region of acceptable positions 11020 is represented as a closed two-dimensional curve.
If, in repositioning the object being positioned as part of the surgical plan, any portion of its two-dimensional projection 11010 moves outside of the two-dimensional region of acceptable positions 11020, the surgical plan is considered unacceptable. Therefore, as shown in Figure 11, the surgeon is provided with a visual representation (e.g., on a display coupled to, for example, the data processor) to assist him in evaluating and modifying the surgical plan.
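The acceptability test of Figure 11 amounts to checking that every point of the object's two-dimensional projection 11010 lies inside the closed region 11020. The sketch below assumes the region is represented as a polygon and uses a standard ray-casting point-in-polygon test; the polygonal representation is an assumption for illustration, not mandated by the patent:

def point_in_polygon(point, polygon):
    """Ray-casting test: is a 2-D point inside a closed polygonal region?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x0, y0 = polygon[i]
        x1, y1 = polygon[(i + 1) % n]
        if (y0 > y) != (y1 > y):
            x_cross = x0 + (y - y0) * (x1 - x0) / (y1 - y0)
            if x < x_cross:
                inside = not inside
    return inside

def plan_is_acceptable(projection_points, acceptable_region):
    """True only if every point of the object's 2-D projection 11010 lies
    inside the 2-D region of acceptable positions 11020."""
    return all(point_in_polygon(p, acceptable_region) for p in projection_points)

# Illustrative example: a square acceptable region and a projection inside it.
region = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0), (0.0, 100.0)]
projection = [(10.0, 10.0), (90.0, 10.0), (90.0, 90.0), (10.0, 90.0)]
print(plan_is_acceptable(projection, region))   # -> True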
As shown in Figure 12, in addition to the hardware and process environment described above, a different aspect of the invention includes a computer-implemented method for generating a post-operative simulation, as described above. As an example, this method may be implemented in the particular hardware environment discussed above.

Such a method may be implemented, for example, by operating a CPU, to execute a sequence of machine-readable instructions. These instructions may reside in various types of signal-bearing media.
Thus, this aspect of the present invention is directed to a programmed product, comprising signal-bearing media tangibly embodying a program of machine-readable instructions executable by a digital data processor incorporating the CPU and hardware above, to perform a method of generating a post-operative simulation.
This signal-bearing media may include, for example, a random access memory (RAM) contained within the CPU, represented by fast-access storage. Alternatively, the instructions may be contained in other signal-bearing media, such as a magnetic data storage diskette 1200 (Figure 12), directly or indirectly accessible by the CPU.
Whether contained in the diskette 1200, the computer/CPU, or elsewhere, the instructions may be stored on a variety of machine-readable data storage media, such as DASD storage (e.g., a conventional "hard drive" or a RAID array), magnetic tape, electronic read-only memory (e.g., ROM, EPROM, or EEPROM), an optical storage device (e.g., CD-ROM, WORM, DVD, digital optical tape, etc.), paper "punch" cards, or other suitable signal-bearing media, including transmission media such as digital and analog communication links and wireless links. In an illustrative embodiment of the invention, the machine-readable instructions may comprise software object code, compiled from a language such as "C", etc.
While the invention has been described in terms of several preferred embodiments, those skilled in the art will recognize that the invention can be practiced with modification within the spirit and scope of the appended claims.
It is noted that the present invention can be implemented in many applications.
For example, the invention can be used in orthopedic surgery (e.g., such as total hip replacement surgery, revision total hip replacement surgery, spine surgery, etc.). In one implementation, the pre-operative images typically are three-dimensional CT images or MRI (Magnetic Resonance Imaging) images, and the intra-operative images typically are X-ray fluoroscopy images. A three-dimensional pre-operative plan (e.g., such as planning the position of a prosthetic implant with respect to the surrounding bony anatomy) may be integrated onto one or several two-dimensional X-ray images to provide the surgeon with images to evaluate a potential surgical outcome.
The present invention also can be used in treating cancer by radio-therapy.
Conventional radio-therapy delivery devices include an imaging device (e.g., producing "portal" images), whereas the present invention can be used to project a three-dimensional radio-therapy plan onto two-dimensional images produced by the imaging device, thereby providing the clinician with a mechanism and technique to evaluate the accuracy with which the therapy will be delivered.
The present invention also can be used in brain surgery, in which case the pre-operative images typically may be three-dimensional CT or MRI images, and the intra-operative images typically may be X-ray images. A three-dimensional surgical plan (e.g., such as planning the removal of a tumor of a specified shape and location relative to the surrounding imaged anatomy) may be integrated onto one or several two-dimensional X-ray images to provide the surgeon with images to evaluate a potential surgical outcome.
The present invention also can be used in craniofacial surgery. In such a case, the pre-operative images typically would be three-dimensional CT or MRI images, and the intra-operative images typically would be X-ray images. A three-dimensional surgical plan typically would involve osteotomies and the relocation of bone fragments to correct physical deformities. A robotic device would be used to manipulate the bone fragments. The three-dimensional plan would be integrated onto one or several two-dimensional X-ray images to provide the surgeon with images to evaluate a potential surgical outcome, and in particular to compare the resulting images with X-ray images of normal individuals, or to verify that the execution of the plan will be correct.

Claims (45)

1. A system for generating simulated data, comprising:
a medical imaging camera for generating images;
a registration device for registering data to a physical space, and to said medical imaging camera; and a fusion mechanism for fusing said data and said images to generate simulated data.
2. The system according to claim 1, further comprising:
another medical imaging camera for collecting said data, and wherein said medical imaging camera collects said images.
3. The system according to claim 1, wherein said data comprises pre- operative images and said images comprise intra-operative images.
4. The system according to claim 1, wherein said data comprises a two-dimensional image.
5. The system according to claim 1, wherein said simulated data comprises simulated post-operative images.
6. The system according to claim 1, wherein said fusion mechanism generates said simulated data while a surgery is being performed.
7. The system as in claim 3, wherein the pre-operative data comprises data of a surgical plan including a position of a component, and a three-dimensional shape of said component.
8. The system as in claim 7, wherein said component comprises an implant for a patient.
9. The system as in claim 2, wherein said another imaging camera comprises an X-ray, computed tomography (CT) scanner.
10. The system as in claim 1, wherein said medical imaging camera comprises a two-dimensional (2D) X-ray camera.
11. The system as in claim 1, wherein said fusion mechanism comprises a data processor.
12. A system for generating simulated post-operative data, comprising:
an imaging camera;
a registration device for registering data to a physical space, and to the imaging camera; and a fusion mechanism for fusing said data and intra-operative images to generate simulated data.
13. The system according to claim 12, wherein data comprises pre-operative images and other pre-operative data, and wherein said simulated data comprises simulated post-operative images.
14. The system according to claim 13, further comprising:
a calibration mechanism for calibrating said imaging camera; and another imaging camera for collecting said pre-operative images, wherein said imaging camera collects said intra-operative images.
15. The system according to claim 12, wherein said fusion mechanism generates said simulated data while a surgery is being performed.
16. The system as in claim 13, wherein the pre-operative data comprises data of a surgical plan including a position of a component, and a three-dimensional shape of said component.
17. The system as in claim 16, wherein said component comprises an implant for a patient.
18. The system as in claim 14, wherein said another imaging camera comprises an X-ray, computed tomography (CT) scanner.
19. The system as in claim 12, wherein said imaging camera comprises a two-dimensional (2D) X-ray camera.
20. The system as in claim 1, wherein said fusion mechanism comprises a data processor.
21. A system for providing intra-operative visual evaluations of potential surgical outcomes, using medical images, comprising:
a first medical imaging camera for collecting pre-operative images;
a second medical imaging camera for collecting intra-operative images, while a surgery is being performed;
a registration mechanism for registering said pre-operative images and other pre-operative data to a physical space, and to said second medical imaging camera; and a fusion mechanism for fusing said pre-operative data and said intra-operative images to generate simulated post-operative images.
22. A system for providing advice to a surgeon without requiring physical manipulation of objects inside and in a vicinity of a patient, said system comprising:
a manipulation device for manipulating one or a plurality of virtual objects;
a registration mechanism for registering the one or plurality of virtual objects to a physical space, and to an imaging camera; and a fusion mechanism for fusing the one or plurality of virtual objects with an intra-operative image.
23. A robotically-assisted surgical system, comprising:

a surgical robot;
a first medical imaging camera for collecting pre-operative data;
a second medical imaging camera for collecting intra-operative images;
a registration mechanism for registering said pre-operative data to the surgical robot, and to the second medical imaging camera; and a fusion mechanism for fusing said pre-operative data and said intra-operative images to generate simulated data.
24. A system for comparing a plurality of methods for registering pre-operative data with a physical space, comprising:
a first medical imaging device for collecting pre-operative data;
a second medical imaging device for collecting intra-operative images;
an image registration mechanism for registering the second medical imaging device to a physical space;
a plurality of pre-operative-data-to-physical-space registration mechanisms for registering said pre-operative data to the physical space; and a fusion mechanism for fusing said pre-operative data, that are registered using the plurality of pre-operative-data-to-physical-space registration mechanisms, and said intra-operative images, to generate new images.
25. A system for assisting a surgeon in a registration of pre-operative data with a physical space, comprising:
means for collecting pre-operative data;
means for collecting intra-operative images;
an image registration mechanism for registering the intra-operative image collecting means to a physical space;
a plurality of pre-operative-data-to-physical-space registration mechanisms for registering said pre-operative data to the physical space;
a fusion mechanism for fusing said pre-operative data, that are registered using the plurality of pre-operative-data-to-physical-space registration mechanisms, and said intra-operative images, to generate new images; and a constraint mechanism for specifying positional constraints on a three-dimensional position of a physical object that is moved during surgery.
26. The system according to claim 25, further comprising:
a refinement mechanism for refining a surgical plan by refining the constraints on the three-dimensional position of a physical object that is moved during surgery.
27. A robotically-assisted surgical system, comprising:
a surgical robot;
an image registration mechanism for registering a medical imaging camera to a physical space;
a plurality of pre-operative-data-to-physical-space registration mechanisms for registering pre-operative data to the physical space;
a fusion mechanism for fusing said pre-operative data, that are registered using the plurality of pre-operative-data-to-physical-space registration mechanisms, and intra-operative images produced by said imaging camera, to generate new images; and a refinement mechanism for refining a surgical plan by refining constraints on a three-dimensional position of a physical object that is moved during surgery.
28. The system according to claim 27, further comprising:
a constraint mechanism for specifying positional constraints on the three-dimensional position of the physical object that is moved during surgery.
29. The system according to claim 27, wherein said surgical robot includes an internal calibration system, said system further comprising:
a graphical warning mechanism for providing a visual representation of an outcome of a surgical plan, and potential failure of the surgical plan resulting from the internal failure of the calibration system of said robot.
30. A system for assessing accuracy of a proposed execution of a pre-operative surgical plan, comprising:
an image device for obtaining intra-operative images of an anatomical feature of interest; and a data processor for computing a projection of a pre-operative three-dimensional model of an implant or planned trajectory onto the intra-operative image.
31. The system according to claim 30, further comprising a display system for displaying an output of said data processor while a surgery is on-going.
32. A method of generating simulated data, comprising:
registering data to a physical space, and to a medical imaging camera; and fusing said data and said images to generate simulated data.
33. A method of generating simulated post-operative data, comprising:
collecting pre-operative data and collecting intra-operative images;
registering said pre-operative data to a physical space, and to a medical imaging camera; and fusing said pre-operative data and said intra-operative images to generate simulated post-operative data.
34. The method of claim 33, further comprising:
calibrating the medical imaging camera, wherein the pre-operative data comprises data of a surgical plan including a position of a component, and a three-dimensional shape of said component.
35. The method of claim 34, wherein said component of said surgical plan includes an implant position.
36. A signal-bearing medium tangibly embodying a program of machine-readable instructions executable by a digital processing apparatus to perform a method for computer-implemented generating of simulated data, comprising:
registering data to a physical space, and to a medical imaging camera; and fusing said data and said images to generate simulated data.
37. A signal-bearing medium tangibly embodying a program of machine-readable instructions executable by a digital processing apparatus to perform a method for computer-implemented generating simulated post-operative data, comprising:
collecting pre-operative data and collecting intra-operative images;
registering said pre-operative data to a physical space, and to a medical imaging camera; and fusing said pre-operative data and said intra-operative images to generate simulated post-operative data.
38. The system according to claim 1, wherein said fusion mechanism fuses said data with said images by integrating a two-dimensional projection of a silhouette of a three-dimensional implant model in an X-ray image.
39. The system according to claim 38, wherein said fusion mechanism uses a calibration of the X-ray image, to determine a center of perspective whose location represents an estimate of a location of an X-ray source, said center of perspective being used to compute silhouette curves on the three-dimensional implant model.
40. The system according to claim 39, wherein said silhouette curves are such that rays emanating from the center of perspective and tangent to the three-dimensional model meet the three-dimensional implant model on a silhouette curve.
41. The system according to claim 1, wherein said fusion mechanism fuses by projecting a silhouette curve of said data by considering in turn each new pixel, determining a line in three dimensions corresponding to that pixel by image calibration, computing a distance from the line to the silhouette curve, and assigning a pixel gray-scale value depending on the distance.
42. The system according to claim 41, wherein said fusion mechanism assigns pixel gray-scale values corresponding to a distance, wherein if the distance is less than a first predetermined value, then the gray-scale value is set to a first predetermined number.
43. The system according to claim 42, wherein if the distance is less than a second predetermined value, then the gray-scale value is set to a second predetermined number larger than said first predetermined number.
44. The system according to claim 43, wherein if the distance is less than a third predetermined value greater than said first and second predetermined values, then the gray-scale value is set to a third predetermined number larger than said first and second predetermined numbers.
45. The system according to claim 44, wherein if the distance is greater than or equal to said third predetermined value, then the gray-scale value is not modified for projecting the silhouette curves.
CA002300245A 1999-04-27 2000-03-08 System and method for intra-operative image-based, interactive verification of a pre-operative surgical plan Abandoned CA2300245A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/299,643 1999-04-27
US09/299,643 US6301495B1 (en) 1999-04-27 1999-04-27 System and method for intra-operative, image-based, interactive verification of a pre-operative surgical plan

Publications (1)

Publication Number Publication Date
CA2300245A1 true CA2300245A1 (en) 2000-10-27

Family

ID=23155657

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002300245A Abandoned CA2300245A1 (en) 1999-04-27 2000-03-08 System and method for intra-operative image-based, interactive verification of a pre-operative surgical plan

Country Status (2)

Country Link
US (1) US6301495B1 (en)
CA (1) CA2300245A1 (en)

Families Citing this family (275)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6256529B1 (en) * 1995-07-26 2001-07-03 Burdette Medical Systems, Inc. Virtual reality 3D visualization for surgical procedures
US6778850B1 (en) 1999-03-16 2004-08-17 Accuray, Inc. Frameless radiosurgery treatment system and method
US7853311B1 (en) * 1999-04-23 2010-12-14 3M Innovative Properties Company Surgical targeting system
US6415171B1 (en) * 1999-07-16 2002-07-02 International Business Machines Corporation System and method for fusing three-dimensional shape data on distorted images without correcting for distortion
JP3608448B2 (en) 1999-08-31 2005-01-12 株式会社日立製作所 Treatment device
US6837892B2 (en) * 2000-07-24 2005-01-04 Mazor Surgical Technologies Ltd. Miniature bone-mounted surgical robot
US6718194B2 (en) * 2000-11-17 2004-04-06 Ge Medical Systems Global Technology Company, Llc Computer assisted intramedullary rod surgery system with enhanced features
US6990222B2 (en) * 2001-11-21 2006-01-24 Arnold Ben A Calibration of tissue densities in computerized tomography
FR2836818B1 (en) * 2002-03-05 2004-07-02 Eurosurgical PROCESS FOR VISUALIZING AND CHECKING THE BALANCE OF A SPINE COLUMN
FR2855292B1 (en) * 2003-05-22 2005-12-09 Inst Nat Rech Inf Automat DEVICE AND METHOD FOR REAL TIME REASONING OF PATTERNS ON IMAGES, IN PARTICULAR FOR LOCALIZATION GUIDANCE
US20050004446A1 (en) * 2003-06-25 2005-01-06 Brett Cowan Model assisted planning of medical imaging
DE10333543A1 (en) * 2003-07-23 2005-02-24 Siemens Ag A method for the coupled presentation of intraoperative as well as interactive and iteratively re-registered preoperative images in medical imaging
US8276091B2 (en) * 2003-09-16 2012-09-25 Ram Consulting Haptic response system and method of use
WO2005079660A1 (en) * 2004-02-18 2005-09-01 Siemens Aktiengesellschaft Method and device for verifying the adherence to a performance specification for a medical procedure performed on a patient
US20080199059A1 (en) * 2004-05-14 2008-08-21 Koninklijke Philips Electronics, N.V. Information Enhanced Image Guided Interventions
US7097357B2 (en) * 2004-06-02 2006-08-29 General Electric Company Method and system for improved correction of registration error in a fluoroscopic image
US7371068B2 (en) * 2004-07-22 2008-05-13 General Electric Company System and method for improved surgical workflow development
US20060030771A1 (en) * 2004-08-03 2006-02-09 Lewis Levine System and method for sensor integration
US20060063998A1 (en) * 2004-09-21 2006-03-23 Von Jako Ron Navigation and visualization of an access needle system
JP2008526422A (en) 2005-01-13 2008-07-24 メイザー サージカル テクノロジーズ リミテッド Image guide robot system for keyhole neurosurgery
US8055487B2 (en) * 2005-02-22 2011-11-08 Smith & Nephew, Inc. Interactive orthopaedic biomechanics system
DE102005016256B3 (en) * 2005-04-08 2006-06-08 Siemens Ag Displaying preoperative three-dimensional images with two-dimensional x-ray image acquisition involves repeatedly generating two-dimensional representations with varying parameters and displaying them on a screen
US20060241972A1 (en) * 2005-04-26 2006-10-26 Lang Robert G Medical outcomes systems
US8219178B2 (en) 2007-02-16 2012-07-10 Catholic Healthcare West Method and system for performing invasive medical procedures using a surgical robot
US10357184B2 (en) 2012-06-21 2019-07-23 Globus Medical, Inc. Surgical tool systems and method
US10893912B2 (en) 2006-02-16 2021-01-19 Globus Medical Inc. Surgical tool systems and methods
US10653497B2 (en) 2006-02-16 2020-05-19 Globus Medical, Inc. Surgical tool systems and methods
US20150335438A1 (en) 2006-02-27 2015-11-26 Biomet Manufacturing, Llc. Patient-specific augments
US9918740B2 (en) 2006-02-27 2018-03-20 Biomet Manufacturing, Llc Backup surgical instrument system and method
US8603180B2 (en) 2006-02-27 2013-12-10 Biomet Manufacturing, Llc Patient-specific acetabular alignment guides
US7967868B2 (en) 2007-04-17 2011-06-28 Biomet Manufacturing Corp. Patient-modified implant and associated method
US8092465B2 (en) * 2006-06-09 2012-01-10 Biomet Manufacturing Corp. Patient specific knee alignment guide and associated method
US9907659B2 (en) 2007-04-17 2018-03-06 Biomet Manufacturing, Llc Method and apparatus for manufacturing an implant
US9173661B2 (en) 2006-02-27 2015-11-03 Biomet Manufacturing, Llc Patient specific alignment guide with cutting surface and laser indicator
US9113971B2 (en) 2006-02-27 2015-08-25 Biomet Manufacturing, Llc Femoral acetabular impingement guide
US8407067B2 (en) 2007-04-17 2013-03-26 Biomet Manufacturing Corp. Method and apparatus for manufacturing an implant
US9339278B2 (en) 2006-02-27 2016-05-17 Biomet Manufacturing, Llc Patient-specific acetabular guides and associated instruments
US9289253B2 (en) 2006-02-27 2016-03-22 Biomet Manufacturing, Llc Patient-specific shoulder guide
US10278711B2 (en) 2006-02-27 2019-05-07 Biomet Manufacturing, Llc Patient-specific femoral guide
US9345548B2 (en) 2006-02-27 2016-05-24 Biomet Manufacturing, Llc Patient-specific pre-operative planning
US9795399B2 (en) 2006-06-09 2017-10-24 Biomet Manufacturing, Llc Patient-specific knee alignment guide and associated method
US8331634B2 (en) * 2006-09-26 2012-12-11 Siemens Aktiengesellschaft Method for virtual adaptation of an implant to a body part of a patient
EP2044884B1 (en) * 2007-10-02 2015-12-09 Brainlab AG Detection and determination of changes in position of structural parts of a body
US8549888B2 (en) 2008-04-04 2013-10-08 Nuvasive, Inc. System and device for designing and forming a surgical implant
US8758263B1 (en) 2009-10-31 2014-06-24 Voxel Rad, Ltd. Systems and methods for frameless image-guided biopsy and therapeutic intervention
US9014835B2 (en) * 2010-08-25 2015-04-21 Siemens Aktiengesellschaft Semi-automatic customization of plates for internal fracture fixation
US9968376B2 (en) 2010-11-29 2018-05-15 Biomet Manufacturing, Llc Patient-specific orthopedic instruments
US9053563B2 (en) 2011-02-11 2015-06-09 E4 Endeavors, Inc. System and method for modeling a biopsy specimen
WO2012131660A1 (en) 2011-04-01 2012-10-04 Ecole Polytechnique Federale De Lausanne (Epfl) Robotic system for spinal and other surgeries
DE102011078212B4 (en) 2011-06-28 2017-06-29 Scopis Gmbh Method and device for displaying an object
US11207132B2 (en) 2012-03-12 2021-12-28 Nuvasive, Inc. Systems and methods for performing spinal surgery
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
US9498182B2 (en) 2012-05-22 2016-11-22 Covidien Lp Systems and methods for planning and navigation
US8750568B2 (en) 2012-05-22 2014-06-10 Covidien Lp System and method for conformal ablation planning
US9439627B2 (en) 2012-05-22 2016-09-13 Covidien Lp Planning system and navigation system for an ablation procedure
US9439623B2 (en) 2012-05-22 2016-09-13 Covidien Lp Surgical planning system and navigation system
US9439622B2 (en) 2012-05-22 2016-09-13 Covidien Lp Surgical navigation system
US11045267B2 (en) 2012-06-21 2021-06-29 Globus Medical, Inc. Surgical robotic automation with tracking markers
US10231791B2 (en) 2012-06-21 2019-03-19 Globus Medical, Inc. Infrared signal based position recognition system for use with a robot-assisted surgery
US10136954B2 (en) 2012-06-21 2018-11-27 Globus Medical, Inc. Surgical tool systems and method
US11607149B2 (en) 2012-06-21 2023-03-21 Globus Medical Inc. Surgical tool systems and method
US11298196B2 (en) 2012-06-21 2022-04-12 Globus Medical Inc. Surgical robotic automation with tracking markers and controlled tool advancement
US11399900B2 (en) 2012-06-21 2022-08-02 Globus Medical, Inc. Robotic systems providing co-registration using natural fiducials and related methods
US11116576B2 (en) 2012-06-21 2021-09-14 Globus Medical Inc. Dynamic reference arrays and methods of use
US11857149B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. Surgical robotic systems with target trajectory deviation monitoring and related methods
US11857266B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. System for a surveillance marker in robotic-assisted surgery
US11253327B2 (en) 2012-06-21 2022-02-22 Globus Medical, Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US11395706B2 (en) 2012-06-21 2022-07-26 Globus Medical Inc. Surgical robot platform
US10624710B2 (en) 2012-06-21 2020-04-21 Globus Medical, Inc. System and method for measuring depth of instrumentation
US11793570B2 (en) 2012-06-21 2023-10-24 Globus Medical Inc. Surgical robotic automation with tracking markers
US11864839B2 (en) 2012-06-21 2024-01-09 Globus Medical Inc. Methods of adjusting a virtual implant and related surgical navigation systems
US11317971B2 (en) 2012-06-21 2022-05-03 Globus Medical, Inc. Systems and methods related to robotic guidance in surgery
US10350013B2 (en) 2012-06-21 2019-07-16 Globus Medical, Inc. Surgical tool systems and methods
JP2015528713A (en) 2012-06-21 2015-10-01 Globus Medical Incorporated Surgical robot platform
US10758315B2 (en) 2012-06-21 2020-09-01 Globus Medical Inc. Method and system for improving 2D-3D registration convergence
US11864745B2 (en) 2012-06-21 2024-01-09 Globus Medical, Inc. Surgical robotic system with retractor
WO2014008613A1 (en) * 2012-07-12 2014-01-16 Ao Technology Ag Method for generating a graphical 3d computer model of at least one anatomical structure in a selectable pre-, intra-, or postoperative status
KR102084534B1 (en) * 2013-03-13 2020-03-04 Think Surgical, Inc. Methods, devices and systems for computer-assisted robotic surgery
DE102013209158A1 (en) * 2013-05-16 2014-11-20 Fiagon Gmbh Method for integrating data obtained by means of an imaging method
DE102013211055B3 (en) 2013-06-13 2014-09-18 Scopis Gmbh Adapter for receiving a medical device and a position detection system
US9283048B2 (en) 2013-10-04 2016-03-15 KB Medical SA Apparatus and systems for precise guidance of surgical tools
US9848922B2 (en) 2013-10-09 2017-12-26 Nuvasive, Inc. Systems and methods for performing spine surgery
KR20160068922A (en) * 2013-10-11 2016-06-15 SonaCare Medical, LLC System for and method of performing sonasurgery
DE102013222230A1 (en) 2013-10-31 2015-04-30 Fiagon Gmbh Surgical instrument
WO2015107099A1 (en) 2014-01-15 2015-07-23 KB Medical SA Notched apparatus for guidance of an insertable instrument along an axis during spinal surgery
US10039605B2 (en) 2014-02-11 2018-08-07 Globus Medical, Inc. Sterile handle for controlling a robotic surgical system from a sterile field
GB2524955A (en) 2014-04-01 2015-10-14 Scopis Gmbh Method for cell envelope segmentation and visualisation
WO2015162256A1 (en) 2014-04-24 2015-10-29 KB Medical SA Surgical instrument holder for use with a robotic surgical system
US10357257B2 (en) 2014-07-14 2019-07-23 KB Medical SA Anti-skid surgical instrument for use in preparing holes in bone tissue
US9913669B1 (en) 2014-10-17 2018-03-13 Nuvasive, Inc. Systems and methods for performing spine surgery
US11504192B2 (en) 2014-10-30 2022-11-22 Cilag Gmbh International Method of hub communication with surgical instrument systems
GB201501157D0 (en) 2015-01-23 2015-03-11 Scopis Gmbh Instrument guidance system for sinus surgery
US10013808B2 (en) 2015-02-03 2018-07-03 Globus Medical, Inc. Surgeon head-mounted display apparatuses
WO2016131903A1 (en) 2015-02-18 2016-08-25 KB Medical SA Systems and methods for performing minimally invasive spinal surgery with a robotic surgical system using a percutaneous technique
US10646298B2 (en) 2015-07-31 2020-05-12 Globus Medical, Inc. Robot arm and methods of use
US10058394B2 (en) 2015-07-31 2018-08-28 Globus Medical, Inc. Robot arm and methods of use
US10080615B2 (en) 2015-08-12 2018-09-25 Globus Medical, Inc. Devices and methods for temporary mounting of parts to bone
JP6894431B2 (en) 2015-08-31 2021-06-30 KB Medical SA Robotic surgical system and method
US10034716B2 (en) 2015-09-14 2018-07-31 Globus Medical, Inc. Surgical robotic systems and methods thereof
US9771092B2 (en) 2015-10-13 2017-09-26 Globus Medical, Inc. Stabilizer wheel assembly and methods of use
US10201320B2 (en) * 2015-12-18 2019-02-12 OrthoGrid Systems, Inc Deformed grid based intra-operative system and method of use
US11883217B2 (en) 2016-02-03 2024-01-30 Globus Medical, Inc. Portable medical imaging system and method
US10842453B2 (en) 2016-02-03 2020-11-24 Globus Medical, Inc. Portable medical imaging system
US10448910B2 (en) 2016-02-03 2019-10-22 Globus Medical, Inc. Portable medical imaging system
US11058378B2 (en) 2016-02-03 2021-07-13 Globus Medical, Inc. Portable medical imaging system
US10117632B2 (en) 2016-02-03 2018-11-06 Globus Medical, Inc. Portable medical imaging system with beam scanning collimator
US10866119B2 (en) 2016-03-14 2020-12-15 Globus Medical, Inc. Metal detector for detecting insertion of a surgical device into a hollow tube
JP7233841B2 (en) 2017-01-18 2023-03-07 KB Medical SA Robotic Navigation for Robotic Surgical Systems
US10722310B2 (en) 2017-03-13 2020-07-28 Zimmer Biomet CMF and Thoracic, LLC Virtual surgery planning system and method
US11071594B2 (en) 2017-03-16 2021-07-27 KB Medical SA Robotic navigation of robotic surgical systems
US11478662B2 (en) 2017-04-05 2022-10-25 Accuray Incorporated Sequential monoscopic tracking
US10349986B2 (en) 2017-04-20 2019-07-16 Warsaw Orthopedic, Inc. Spinal implant system and method
US10675094B2 (en) 2017-07-21 2020-06-09 Globus Medical Inc. Robot surgical platform
US11311342B2 (en) 2017-10-30 2022-04-26 Cilag Gmbh International Method for communicating with surgical instrument systems
US11229436B2 (en) 2017-10-30 2022-01-25 Cilag Gmbh International Surgical system comprising a surgical tool and a surgical hub
US11123070B2 (en) 2017-10-30 2021-09-21 Cilag Gmbh International Clip applier comprising a rotatable clip magazine
US20190125320A1 (en) 2017-10-30 2019-05-02 Ethicon Llc Control system arrangements for a modular surgical instrument
US11291510B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11801098B2 (en) 2017-10-30 2023-10-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11317919B2 (en) 2017-10-30 2022-05-03 Cilag Gmbh International Clip applier comprising a clip crimping system
US11510741B2 (en) 2017-10-30 2022-11-29 Cilag Gmbh International Method for producing a surgical instrument comprising a smart electrical system
US11911045B2 (en) 2017-10-30 2024-02-27 Cilag Gmbh International Method for operating a powered articulating multi-clip applier
US11564756B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11794338B2 (en) 2017-11-09 2023-10-24 Globus Medical Inc. Robotic rod benders and related mechanical and motor housings
US11357548B2 (en) 2017-11-09 2022-06-14 Globus Medical, Inc. Robotic rod benders and related mechanical and motor housings
EP3492032B1 (en) 2017-11-09 2023-01-04 Globus Medical, Inc. Surgical robotic systems for bending surgical rods
US11134862B2 (en) 2017-11-10 2021-10-05 Globus Medical, Inc. Methods of selecting surgical implants and related devices
US10426424B2 (en) 2017-11-21 2019-10-01 General Electric Company System and method for generating and performing imaging protocol simulations
US11864934B2 (en) 2017-11-22 2024-01-09 Mazor Robotics Ltd. Method for verifying hard tissue location using implant imaging
US11771487B2 (en) 2017-12-28 2023-10-03 Cilag Gmbh International Mechanisms for controlling different electromechanical systems of an electrosurgical instrument
US11559307B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method of robotic hub communication, detection, and control
US10849697B2 (en) 2017-12-28 2020-12-01 Ethicon Llc Cloud interface for coupled surgical devices
US10695081B2 (en) 2017-12-28 2020-06-30 Ethicon Llc Controlling a surgical instrument according to sensed closure parameters
US11100631B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Use of laser light and red-green-blue coloration to determine properties of back scattered light
US11291495B2 (en) 2017-12-28 2022-04-05 Cilag Gmbh International Interruption of energy due to inadvertent capacitive coupling
US11304720B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Activation of energy devices
US11257589B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes
US11076921B2 (en) 2017-12-28 2021-08-03 Cilag Gmbh International Adaptive control program updates for surgical hubs
US11179175B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Controlling an ultrasonic surgical instrument according to tissue location
US20190201146A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Safety systems for smart powered surgical stapling
US10987178B2 (en) 2017-12-28 2021-04-27 Ethicon Llc Surgical hub control arrangements
US11179208B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Cloud-based medical analytics for security and authentication trends and reactive measures
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11324557B2 (en) 2017-12-28 2022-05-10 Cilag Gmbh International Surgical instrument with a sensing array
US11589888B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Method for controlling smart energy devices
US11069012B2 (en) 2017-12-28 2021-07-20 Cilag Gmbh International Interactive surgical systems with condition handling of devices and data capabilities
US11844579B2 (en) 2017-12-28 2023-12-19 Cilag Gmbh International Adjustments based on airborne particle properties
US11633237B2 (en) 2017-12-28 2023-04-25 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US11051876B2 (en) 2017-12-28 2021-07-06 Cilag Gmbh International Surgical evacuation flow paths
US11571234B2 (en) 2017-12-28 2023-02-07 Cilag Gmbh International Temperature control of ultrasonic end effector and control system therefor
US11432885B2 (en) 2017-12-28 2022-09-06 Cilag Gmbh International Sensing arrangements for robot-assisted surgical platforms
US20190200981A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws
US11423007B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Adjustment of device control programs based on stratified contextual data in addition to the data
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11308075B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity
US11818052B2 (en) 2017-12-28 2023-11-14 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US10943454B2 (en) 2017-12-28 2021-03-09 Ethicon Llc Detection and escalation of security responses of surgical instruments to increasing severity threats
US11317937B2 (en) 2018-03-08 2022-05-03 Cilag Gmbh International Determining the state of an ultrasonic end effector
US11304699B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11666331B2 (en) 2017-12-28 2023-06-06 Cilag Gmbh International Systems for detecting proximity of surgical end effector to cancerous tissue
US11446052B2 (en) 2017-12-28 2022-09-20 Cilag Gmbh International Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue
US11659023B2 (en) 2017-12-28 2023-05-23 Cilag Gmbh International Method of hub communication
US11576677B2 (en) 2017-12-28 2023-02-14 Cilag Gmbh International Method of hub communication, processing, display, and cloud analytics
US11096693B2 (en) 2017-12-28 2021-08-24 Cilag Gmbh International Adjustment of staple height of at least one row of staples based on the sensed tissue thickness or force in closing
US11304745B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical evacuation sensing and display
US11559308B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method for smart energy device infrastructure
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11832840B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical instrument having a flexible circuit
US11389164B2 (en) 2017-12-28 2022-07-19 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
US11234756B2 (en) 2017-12-28 2022-02-01 Cilag Gmbh International Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter
US11529187B2 (en) 2017-12-28 2022-12-20 Cilag Gmbh International Surgical evacuation sensor arrangements
US11202570B2 (en) 2017-12-28 2021-12-21 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US11304763B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Image capturing of the areas outside the abdomen to improve placement and control of a surgical device in use
US11278281B2 (en) 2017-12-28 2022-03-22 Cilag Gmbh International Interactive surgical system
US10892995B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11744604B2 (en) 2017-12-28 2023-09-05 Cilag Gmbh International Surgical instrument with a hardware-only control circuit
US11266468B2 (en) 2017-12-28 2022-03-08 Cilag Gmbh International Cooperative utilization of data derived from secondary sources by intelligent surgical hubs
US11832899B2 (en) * 2017-12-28 2023-12-05 Cilag Gmbh International Surgical systems with autonomously adjustable control programs
US11013563B2 (en) 2017-12-28 2021-05-25 Ethicon Llc Drive arrangements for robot-assisted surgical platforms
US11786245B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Surgical systems with prioritized data transmission capabilities
US11540855B2 (en) 2017-12-28 2023-01-03 Cilag Gmbh International Controlling activation of an ultrasonic surgical instrument according to the presence of tissue
US11364075B2 (en) 2017-12-28 2022-06-21 Cilag Gmbh International Radio frequency energy device for delivering combined electrical signals
US10966791B2 (en) 2017-12-28 2021-04-06 Ethicon Llc Cloud-based medical analytics for medical facility segmented individualization of instrument function
US11109866B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Method for circular stapler control algorithm adjustment based on situational awareness
US10932872B2 (en) 2017-12-28 2021-03-02 Ethicon Llc Cloud-based medical analytics for linking of local usage trends with the resource acquisition behaviors of larger data set
US11273001B2 (en) 2017-12-28 2022-03-15 Cilag Gmbh International Surgical hub and modular device response adjustment based on situational awareness
US10944728B2 (en) 2017-12-28 2021-03-09 Ethicon Llc Interactive surgical systems with encrypted communication capabilities
US11056244B2 (en) 2017-12-28 2021-07-06 Cilag Gmbh International Automated data scaling, alignment, and organizing based on predefined parameters within surgical networks
US11410259B2 (en) 2017-12-28 2022-08-09 Cilag Gmbh International Adaptive control program updates for surgical devices
US10898622B2 (en) 2017-12-28 2021-01-26 Ethicon Llc Surgical evacuation system with a communication circuit for communication between a filter and a smoke evacuation device
US10892899B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Self describing data packets generated at an issuing instrument
US11253315B2 (en) 2017-12-28 2022-02-22 Cilag Gmbh International Increasing radio frequency to create pad-less monopolar loop
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11213359B2 (en) 2017-12-28 2022-01-04 Cilag Gmbh International Controllers for robot-assisted surgical platforms
US11602393B2 (en) 2017-12-28 2023-03-14 Cilag Gmbh International Surgical evacuation sensing and generator control
US11160605B2 (en) 2017-12-28 2021-11-02 Cilag Gmbh International Surgical evacuation sensing and motor control
US11311306B2 (en) 2017-12-28 2022-04-26 Cilag Gmbh International Surgical systems for detecting end effector tissue distribution irregularities
US11424027B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Method for operating surgical instrument systems
US11672605B2 (en) 2017-12-28 2023-06-13 Cilag Gmbh International Sterile field interactive control displays
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US10755813B2 (en) 2017-12-28 2020-08-25 Ethicon Llc Communication of smoke evacuation system parameters to hub or cloud in smoke evacuation module for interactive surgical platform
US10758310B2 (en) 2017-12-28 2020-09-01 Ethicon Llc Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11132462B2 (en) 2017-12-28 2021-09-28 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US11678881B2 (en) 2017-12-28 2023-06-20 Cilag Gmbh International Spatial awareness of surgical hubs in operating rooms
US11376002B2 (en) 2017-12-28 2022-07-05 Cilag Gmbh International Surgical instrument cartridge sensor assemblies
US11284936B2 (en) 2017-12-28 2022-03-29 Cilag Gmbh International Surgical instrument having a flexible electrode
US11419667B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location
US11419630B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Surgical system distributed processing
US11147607B2 (en) 2017-12-28 2021-10-19 Cilag Gmbh International Bipolar combination device that automatically adjusts pressure based on energy modality
US11464559B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Estimating state of ultrasonic end effector and control system therefor
US11464535B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Detection of end effector emersion in liquid
US11786251B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11166772B2 (en) 2017-12-28 2021-11-09 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US20190254753A1 (en) 2018-02-19 2019-08-22 Globus Medical, Inc. Augmented reality navigation systems for use with robotic surgical systems and methods of their use
US11337746B2 (en) 2018-03-08 2022-05-24 Cilag Gmbh International Smart blade and power pulsing
US11534196B2 (en) 2018-03-08 2022-12-27 Cilag Gmbh International Using spectroscopy to determine device use state in combo instrument
US11259830B2 (en) 2018-03-08 2022-03-01 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US11278280B2 (en) 2018-03-28 2022-03-22 Cilag Gmbh International Surgical instrument comprising a jaw closure lockout
US11471156B2 (en) 2018-03-28 2022-10-18 Cilag Gmbh International Surgical stapling devices with improved rotary driven closure systems
US11096688B2 (en) 2018-03-28 2021-08-24 Cilag Gmbh International Rotary driven firing members with different anvil and channel engagement features
US11219453B2 (en) 2018-03-28 2022-01-11 Cilag Gmbh International Surgical stapling devices with cartridge compatible closure and firing lockout arrangements
US10973520B2 (en) 2018-03-28 2021-04-13 Ethicon Llc Surgical staple cartridge with firing member driven camming assembly that has an onboard tissue cutting feature
US11589865B2 (en) 2018-03-28 2023-02-28 Cilag Gmbh International Methods for controlling a powered surgical stapler that has separate rotary closure and firing systems
US11166716B2 (en) 2018-03-28 2021-11-09 Cilag Gmbh International Stapling instrument comprising a deactivatable lockout
US11207067B2 (en) 2018-03-28 2021-12-28 Cilag Gmbh International Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing
US10573023B2 (en) 2018-04-09 2020-02-25 Globus Medical, Inc. Predictive visualization of medical imaging scanner component movement
WO2020033947A1 (en) 2018-08-10 2020-02-13 Covidien Lp Systems for ablation visualization
US11337742B2 (en) 2018-11-05 2022-05-24 Globus Medical Inc Compliant orthopedic driver
US11278360B2 (en) 2018-11-16 2022-03-22 Globus Medical, Inc. End-effectors for surgical robotic systems having sealed optical components
US11744655B2 (en) 2018-12-04 2023-09-05 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US11602402B2 (en) 2018-12-04 2023-03-14 Globus Medical, Inc. Drill guide fixtures, cranial insertion fixtures, and related methods and robotic systems
US10806339B2 (en) 2018-12-12 2020-10-20 Voxel Rad, Ltd. Systems and methods for treating cancer using brachytherapy
US11317915B2 (en) 2019-02-19 2022-05-03 Cilag Gmbh International Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers
US11357503B2 (en) 2019-02-19 2022-06-14 Cilag Gmbh International Staple cartridge retainers with frangible retention features and methods of using same
US11369377B2 (en) 2019-02-19 2022-06-28 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout
US11259807B2 (en) 2019-02-19 2022-03-01 Cilag Gmbh International Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device
US11751872B2 (en) 2019-02-19 2023-09-12 Cilag Gmbh International Insertable deactivator element for surgical stapler lockouts
US11571265B2 (en) 2019-03-22 2023-02-07 Globus Medical Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11317978B2 (en) 2019-03-22 2022-05-03 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11806084B2 (en) 2019-03-22 2023-11-07 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
US11419616B2 (en) 2019-03-22 2022-08-23 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, robotic surgery, and related methods and devices
US11382549B2 (en) 2019-03-22 2022-07-12 Globus Medical, Inc. System for neuronavigation registration and robotic trajectory guidance, and related methods and devices
EP3719749A1 (en) 2019-04-03 2020-10-07 Fiagon AG Medical Technologies Registration method and setup
US11045179B2 (en) 2019-05-20 2021-06-29 Globus Medical Inc. Robot-mounted retractor system
USD964564S1 (en) 2019-06-25 2022-09-20 Cilag Gmbh International Surgical staple cartridge retainer with a closure system authentication key
USD952144S1 (en) 2019-06-25 2022-05-17 Cilag Gmbh International Surgical staple cartridge retainer with firing system authentication key
USD950728S1 (en) 2019-06-25 2022-05-03 Cilag Gmbh International Surgical staple cartridge
US11628023B2 (en) 2019-07-10 2023-04-18 Globus Medical, Inc. Robotic navigational system for interbody implants
US11571171B2 (en) 2019-09-24 2023-02-07 Globus Medical, Inc. Compound curve cable chain
US11890066B2 (en) 2019-09-30 2024-02-06 Globus Medical, Inc Surgical robot with passive end effector
US11426178B2 (en) 2019-09-27 2022-08-30 Globus Medical Inc. Systems and methods for navigating a pin guide driver
US11864857B2 (en) 2019-09-27 2024-01-09 Globus Medical, Inc. Surgical robot with passive end effector
US11510684B2 (en) 2019-10-14 2022-11-29 Globus Medical, Inc. Rotary motion passive end effector for surgical robots in orthopedic surgeries
US11464581B2 (en) 2020-01-28 2022-10-11 Globus Medical, Inc. Pose measurement chaining for extended reality surgical navigation in visible and near infrared spectrums
US11382699B2 (en) 2020-02-10 2022-07-12 Globus Medical Inc. Extended reality visualization of optical tool tracking volume for computer assisted navigation in surgery
US11207150B2 (en) 2020-02-19 2021-12-28 Globus Medical, Inc. Displaying a virtual model of a planned instrument attachment to ensure correct selection of physical instrument attachment
US11253216B2 (en) 2020-04-28 2022-02-22 Globus Medical Inc. Fixtures for fluoroscopic imaging systems and related navigation systems and methods
US11607277B2 (en) 2020-04-29 2023-03-21 Globus Medical, Inc. Registration of surgical tool with reference array tracked by cameras of an extended reality headset for assisted navigation during surgery
US11510750B2 (en) 2020-05-08 2022-11-29 Globus Medical, Inc. Leveraging two-dimensional digital imaging and communication in medicine imagery in three-dimensional extended reality applications
US11382700B2 (en) 2020-05-08 2022-07-12 Globus Medical Inc. Extended reality headset tool tracking and control
US11153555B1 (en) 2020-05-08 2021-10-19 Globus Medical Inc. Extended reality headset camera system for computer assisted navigation in surgery
US11317973B2 (en) 2020-06-09 2022-05-03 Globus Medical, Inc. Camera tracking bar for computer assisted navigation during surgery
US11382713B2 (en) 2020-06-16 2022-07-12 Globus Medical, Inc. Navigated surgical system with eye to XR headset display calibration
US11877807B2 (en) 2020-07-10 2024-01-23 Globus Medical, Inc Instruments for navigated orthopedic surgeries
US11793588B2 (en) 2020-07-23 2023-10-24 Globus Medical, Inc. Sterile draping of robotic arms
US11737831B2 (en) 2020-09-02 2023-08-29 Globus Medical Inc. Surgical object tracking template generation for computer assisted navigation during surgical procedure
US11523785B2 (en) 2020-09-24 2022-12-13 Globus Medical, Inc. Increased cone beam computed tomography volume length without requiring stitching or longitudinal C-arm movement
US11911112B2 (en) 2020-10-27 2024-02-27 Globus Medical, Inc. Robotic navigational system
US11717350B2 (en) 2020-11-24 2023-08-08 Globus Medical Inc. Methods for robotic assistance and navigation in spinal surgery and related systems
US11857273B2 (en) 2021-07-06 2024-01-02 Globus Medical, Inc. Ultrasonic robotic surgical navigation
CN113558765B (en) * 2021-07-09 2023-03-21 北京罗森博特科技有限公司 Navigation and reduction operation control system and method
WO2023288261A1 (en) * 2021-07-15 2023-01-19 Bullseye Hip Replacement, Llc Devices, systems, and methods for orthopedics
US11439444B1 (en) 2021-07-22 2022-09-13 Globus Medical, Inc. Screw tower and rod reduction tool
US11911115B2 (en) 2021-12-20 2024-02-27 Globus Medical Inc. Flat panel registration fixture and method of using same

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5568384A (en) * 1992-10-13 1996-10-22 Mayo Foundation For Medical Education And Research Biomedical imaging and analysis
US5951475A (en) * 1997-09-25 1999-09-14 International Business Machines Corporation Methods and apparatus for registering CT-scan data to multiple fluoroscopic images

Also Published As

Publication number Publication date
US6301495B1 (en) 2001-10-09

Similar Documents

Publication Publication Date Title
US6301495B1 (en) System and method for intra-operative, image-based, interactive verification of a pre-operative surgical plan
US6747646B2 (en) System and method for fusing three-dimensional shape data on distorted images without correcting for distortion
Joskowicz et al. FRACAS: a system for computer‐aided image‐guided long bone fracture surgery
US9524581B2 (en) Orthopedic treatment device co-display systems and methods
US5868675A (en) Interactive system for local intervention inside a nonhomogeneous structure
JP5474303B2 (en) Surgery planning
EP1631931B1 (en) Methods and systems for image-guided placement of implants
US20100030232A1 (en) System for positioning of surgical inserts and tools
EP2056255B1 (en) Method for reconstruction of a three-dimensional model of an osteo-articular structure
Penney Registration of tomographic images to X-ray projections for use in image guided interventions
Guéziec et al. Providing visual information to validate 2-D to 3-D registration
US20100063420A1 (en) Method for verifying the relative position of bone structures
Morooka et al. A survey on statistical modeling and machine learning approaches to computer assisted medical intervention: Intraoperative anatomy modeling and optimization of interventional procedures
US20080119724A1 (en) Systems and methods for intraoperative implant placement analysis
WO2022133442A1 (en) Systems and methods for generating a three-dimensional model of a joint from two-dimensional images
Zollei 2D-3D rigid-body registration of X-ray fluoroscopy and CT images
US20100094308A1 (en) Artificial joint replacement assisting device, artificial joint replacement assisting method using same, and assisting system
CN109155068B (en) Motion compensation in combined X-ray/camera interventions
US20220130509A1 (en) Image-processing methods and systems
Popescu et al. A new method to compare planned and achieved position of an orthopaedic implant
Gueziec et al. Registration of computed tomography data to a surgical robot using fluoroscopy: A feasibility study
Yao et al. Deformable 2D-3D medical image registration using a statistical model: accuracy factor assessment
Xie et al. A small-scaled intraoperative 3D visualization navigation system for femoral head repair surgery
Zheng et al. Reality-augmented virtual fluoroscopy for computer-assisted diaphyseal long bone fracture osteosynthesis: a novel technique and feasibility study results
Gueziec et al. Anatomy-based registration of CT-scan and x-ray fluoroscopy data for intraoperative guidance of a surgical robot

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued