US20080033240A1 - Auxiliary image display and manipulation on a computer display in a medical robotic system


Info

Publication number
US20080033240A1
US20080033240A1
Authority
US
United States
Prior art keywords
image
display screen
robotic system
computer display
medical robotic
Prior art date
Legal status
Abandoned
Application number
US11/583,963
Inventor
Brian Hoffman
Rajesh Kumar
David Larkin
Giuseppe Prisco
Nitish Swarup
Guanghua Zhang
Current Assignee
Intuitive Surgical Operations Inc
Original Assignee
Intuitive Surgical Inc
Priority date
Filing date
Publication date
Application filed by Intuitive Surgical Inc filed Critical Intuitive Surgical Inc
Priority to US11/583,963
Assigned to INTUITIVE SURGICAL, INC. Assignors: HOFFMAN, BRIAN DAVID; LARKIN, DAVID Q.; SWARUP, NITISH; PRISCO, GIUSEPPE; KUMAR, RAJESH; ZHANG, GUANGHUA
Publication of US20080033240A1
Priority to US15/139,682 (published as US20160235496A1)
Assigned to Intuitive Surgical Operations, Inc. Assignor: INTUITIVE SURGICAL, INC.
Corrective assignment to Intuitive Surgical Operations, Inc., correcting the assignor execution date previously recorded at Reel 042260, Frame 0780. Assignor: INTUITIVE SURGICAL, INC.
Priority to US16/564,734 (published as US11197731B2)
Priority to US17/530,166 (published as US20220071721A1)
Legal status: Abandoned

Classifications

    • A61B34/30 Surgical robots
    • A61B34/37 Master-slave robots
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B34/70 Manipulators specially adapted for use in surgery
    • A61B34/71 Manipulators operated by drive cable mechanisms
    • A61B34/76 Manipulators having means for providing feel, e.g. force or tactile feedback
    • A61B1/00193 Endoscope optical arrangements adapted for stereoscopic vision
    • A61B1/044 Endoscopes combined with photographic or television appliances for absorption imaging
    • A61B1/313 Endoscopes for introduction through surgical openings, e.g. laparoscopes
    • A61B18/12 Transferring non-mechanical energy to or from the body by heating, by passing a current through the tissue to be heated, e.g. high-frequency current
    • A61B18/14 Probes or electrodes therefor
    • A61B18/1482 Probes or electrodes having a long rigid shaft for transcutaneous access in minimally invasive surgery, e.g. laparoscopy
    • A61B2018/00577 Ablation
    • A61B2018/00595 Cauterization
    • A61B2018/00982 Energy-delivery instruments combined with or comprising means for visual or photographic inspection inside the body, e.g. endoscopes
    • A61B2018/00994 Combining two or more kinds of non-mechanical energy, or combining non-mechanical energy with ultrasound
    • A61B5/742 Details of notification to user or communication with user or patient using visual displays
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/101 Stereotaxic surgery, e.g. frame-based stereotaxis, for stereotaxic radiosurgery
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/374 Monitor images using NMR or MRI
    • A61B2090/378 Monitor images using ultrasound
    • A61B2090/3782 Ultrasound transmitter or receiver in a catheter or minimally invasive instrument
    • A61N7/022 Localised ultrasound hyperthermia, intracavitary
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/016 Input arrangements with force or tactile feedback as computer-generated output to the user
    • G06F3/0346 Pointing devices with detection of orientation or free movement in 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/0481 GUI interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F3/04817 GUI interaction techniques using icons
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04845 GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0486 Drag-and-drop
    • G06F2203/014 Force feedback applied to GUI
    • G06F2203/04804 Transparency, e.g. transparent or translucent windows
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • the present invention generally relates to medical robotic systems and in particular, to the displaying and manipulating of auxiliary images on a computer display in a medical robotic system.
  • Medical robotic systems such as those used in performing minimally invasive surgical procedures offer many benefits over traditional open surgery techniques, including less pain, shorter hospital stays, quicker return to normal activities, minimal scarring, reduced recovery time, and less injury to tissue. Consequently, demand for minimally invasive surgery using medical robotic systems is strong and growing.
  • the daVinci® Surgical System includes a surgeon's console, a patient-side cart, a high performance 3-D vision system, and Intuitive Surgical's proprietary EndoWrist™ articulating instruments, which are modeled after the human wrist so that, when added to the motions of the robotic arm assembly holding the surgical instrument, they allow at least a full six degrees of freedom of motion, which is comparable to the natural motions of open surgery.
  • the daVinci® surgeon's console has a high-resolution stereoscopic video display with two progressive scan cathode ray tubes (“CRTs”).
  • the system offers higher fidelity than polarization, shutter eyeglass, or other techniques.
  • Each eye views a separate CRT presenting the left or right eye perspective, through an objective lens and a series of mirrors. The surgeon sits comfortably and looks into this display throughout surgery, making it an ideal place for the surgeon to display and manipulate 3-D intra-operative imagery.
  • auxiliary information may be provided in various modes such as text information, bar graphs, two-dimensional picture-in-picture images, and two-dimensional or three-dimensional images that are registered and properly overlaid with respect to their primary image counterparts.
  • the images may be captured pre-operatively or intra-operatively using techniques such as ultrasonography, magnetic resonance imaging, computed axial tomography, and fluoroscopy to provide internal details of an anatomic structure being treated. This information may then be used to supplement external views of the anatomic structure such as captured by a locally placed camera.
  • one object of various aspects of the present invention is a method for displaying auxiliary information, including the effect of a therapeutic procedure, as an overlay to, or otherwise associated with, an image of an anatomic structure that is being treated by the procedure at the time.
  • Another object of various aspects of the present invention is a method for displaying a user selected portion at a user specified magnification factor of a volume rendering of an auxiliary image of an anatomic structure as a registered overlay to a primary image of the anatomic structure on a computer display screen.
  • Another object of various aspects of the present invention is a medical robotic system having a master input device that may be used to manually register images in a three-dimensional space of a computer display.
  • Another object of various aspects of the present invention is a medical robotic system having a master input device that may be used to define cut-planes of a volume rendering of an anatomic structure in a three-dimensional space of a computer display.
  • Another object of various aspects of the present invention is a medical robotic system having a master input device that may be used to selectively modify portions or details of a volume rendering of an anatomic structure in a three-dimensional space of a computer display.
  • Another object of various aspects of the present invention is a medical robotic system having a master input device that may be used to vary display parameters for a rendering of an anatomic structure being displayed on a computer display screen.
  • Still another object of various aspects of the present invention is a medical robotic system having a master input device that may be switched between an image capturing mode wherein the master input device controls movement of an image capturing device, and an image manipulating mode wherein the master input device controls display and manipulation of images captured by the image capturing device on a computer display screen.
  • one aspect is a method for displaying on a computer display screen an effect of a therapeutic procedure being applied by a therapy instrument to an anatomic structure, comprising: generating an auxiliary image that indicates the effect of the therapeutic procedure being applied by the therapy instrument to the anatomic structure; and displaying a primary image of the anatomic structure overlaid with the auxiliary image on the computer display screen during the therapeutic procedure.
  • Another aspect is a method for displaying a selected portion of an auxiliary image of an anatomic structure as an overlay to a primary image of the anatomic structure on a computer display screen, comprising: associating a movable window with a pointing device such that the movable window is positionable on the computer display screen using the pointing device; registering an auxiliary image of an anatomic structure with a primary image of the anatomic structure so as to be at a same position and orientation in a common reference frame; and displaying the primary image on the computer display screen, and a portion of the registered auxiliary image corresponding to the same screen coordinates as the movable window as an overlay to the primary image in the movable window.
  • Still another aspect is a medical robotic system comprising: an image capturing device for capturing images; a robotic arm holding the image capturing device; a computer display screen; a master input device adapted to be manipulatable by a user in multiple degrees-of-freedom movement; and a processor configured to control movement of the image capturing device according to user manipulation of the master input device when the master input device is in an image capturing mode, and to control the display of images derived from the captured images on the computer display screen according to user manipulation of the master input device when the master input device is in an image manipulating mode.
  • FIG. 1 illustrates a top view of an operating room employing a medical robotic system utilizing aspects of the present invention.
  • FIG. 2 illustrates a block diagram of a medical robotic system utilizing aspects of the present invention.
  • FIG. 3 illustrates a laparoscopic ultrasound probe useful for a medical robotic system utilizing aspects of the present invention.
  • FIG. 4 illustrates a flow diagram of a method for displaying on a computer display screen an effect of a therapeutic procedure being applied by a therapeutic instrument to an anatomic structure, utilizing aspects of the present invention.
  • FIG. 5 illustrates an external view of an anatomic structure with a therapeutic instrument inserted in the anatomic structure for performing a therapeutic procedure.
  • FIG. 6 illustrates an internal view of an anatomic structure with a discernable therapeutic effect shown as captured by a therapy sensing device.
  • FIG. 7 illustrates a computer display screen displaying an effect of a therapeutic procedure registered to an anatomic structure being treated by the procedure, as generated by a method utilizing aspects of the present invention.
  • FIG. 8 illustrates a flow diagram of a method for displaying a selected portion of an auxiliary image of an anatomic structure in a user movable magnifying glass on a computer display screen, utilizing aspects of the present invention.
  • FIG. 9 illustrates a flow diagram of a method for displaying a manipulatable window of an internal view of an anatomic structure at a specified magnification factor, utilizing aspects of the present invention.
  • FIG. 10 illustrates an auxiliary image of an anatomic structure and concentric areas of the auxiliary image representing different magnification factors for display on a computer display screen in a magnifying glass by a method utilizing aspects of the present invention.
  • FIG. 11 illustrates a computer display screen with a primary image of an anatomic structure and an overlaid portion of an auxiliary image of the anatomic structure viewed in a magnifying glass lens as displayed by a method utilizing aspects of the present invention.
  • FIG. 12 illustrates a flow diagram of a method performed by a processor in a medical robotic system for manipulating objects displayed on a computer display screen utilizing aspects of the present invention.
  • FIG. 1 illustrates, as an example, a top view of an operating room employing a medical robotic system.
  • the medical robotic system in this case is a Minimally Invasive Robotic Surgical (“MIRS”) System 100 including a Console (“C”) utilized by a Surgeon (“S”) while performing a minimally invasive diagnostic or surgical procedure with assistance from one or more Assistants (“A”) on a Patient (“P”) who is reclining on an Operating table (“O”).
  • the Console includes a Master Display 104 (also referred to herein as a “Display Screen” or “computer display screen”) for displaying one or more images of a surgical site within the Patient as well as perhaps other information to the Surgeon. Also included are Master Input Devices 107 , 108 (also referred to herein as “Master Manipulators”), one or more Foot Pedals 105 , 106 , a Microphone 103 for receiving voice commands from the Surgeon, and a Processor 102 .
  • the Master Input Devices 107 , 108 may include any one or more of a variety of input devices such as joysticks, gloves, trigger-guns, hand-operated controllers, grippers, or the like.
  • the Processor 102 is preferably a personal computer that may be integrated into the Console or otherwise connected to it in a conventional manner.
  • the Surgeon performs a medical procedure using the MIRS System 100 by manipulating the Master Input Devices 107 , 108 so that the Processor 102 causes their respectively associated Slave Arms 121 , 122 to manipulate their respective removably coupled and held Surgical Instruments 138 , 139 (also referred to herein as “Tools”) accordingly, while the Surgeon views three-dimensional (“3D”) images of the surgical site on the Master Display 104 .
  • the Tools 138, 139 are preferably Intuitive Surgical's proprietary EndoWrist™ articulating instruments, which are modeled after the human wrist so that, when added to the motions of the robot arm holding the tool, they allow at least a full six degrees of freedom of motion, which is comparable to the natural motions of open surgery. Additional details on such tools may be found in commonly owned U.S. Pat. No. 5,797,900 entitled “Wrist Mechanism for Surgical Instrument for Performing Minimally Invasive Surgery with Enhanced Dexterity and Sensitivity,” which is incorporated herein by this reference.
  • Each Tool typically terminates in a manipulatable end effector such as a clamp, grasper, scissor, stapler, blade, needle, needle holder, or energizable probe.
  • the Master Display 104 has a high-resolution stereoscopic video display with two progressive scan cathode ray tubes (“CRTs”).
  • the system offers higher fidelity than polarization, shutter eyeglass, or other techniques.
  • Each eye views a separate CRT presenting the left or right eye perspective, through an objective lens and a series of mirrors. The Surgeon sits comfortably and looks into this display throughout surgery, making it an ideal place for the Surgeon to display and manipulate 3-D intra-operative imagery.
  • a Stereoscopic Endoscope 140 provides right and left camera views to the Processor 102 so that it may process the information according to programmed instructions and cause it to be displayed on the Master Display 104 .
  • a Laparoscopic Ultrasound (“LUS”) Probe 150 provides two-dimensional (“2D”) ultrasound image slices of an anatomic structure to the Processor 102 so that the Processor 102 may generate a 3D ultrasound computer model or volume rendering of the anatomic structure.
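  • The patent does not describe an algorithm for assembling the 2D slices into a volume. The sketch below shows one minimal, conventional approach (freehand 3D ultrasound compounding), assuming each slice arrives with a known probe pose, e.g. from the slave-arm kinematics; all function names and parameters are illustrative, not from the patent.

```python
import numpy as np

def insert_slice(volume, slice_2d, pose, origin, voxel_size, pixel_spacing):
    """Scatter one 2D ultrasound slice into a 3D voxel grid.

    volume        -- (X, Y, Z) float array accumulating intensities
    slice_2d      -- (H, W) float array from the LUS sensor
    pose          -- 4x4 transform from the slice (image) frame to the
                     common reference frame, e.g. from arm kinematics
    origin        -- (3,) position of voxel (0, 0, 0) in that frame
    voxel_size    -- voxel edge length, same units as the pose translation
    pixel_spacing -- physical size of one slice pixel
    """
    h, w = slice_2d.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    # Homogeneous points in the slice plane (z = 0 in the image frame).
    pts = np.stack([us.ravel() * pixel_spacing,
                    vs.ravel() * pixel_spacing,
                    np.zeros(h * w), np.ones(h * w)])
    world = (pose @ pts)[:3].T                        # N x 3 positions
    idx = np.round((world - origin) / voxel_size).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(volume.shape)), axis=1)
    volume[idx[inside, 0], idx[inside, 1], idx[inside, 2]] = \
        slice_2d.ravel()[inside]
    return volume
```

Repeated over a sweep of slices, this fills a voxel grid from which a surface or volume rendering can be produced; smoothing and hole-filling are omitted for brevity.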
  • Each of the Tools 138 , 139 , as well as the Endoscope 140 and LUS Probe 150 is preferably inserted through a cannula or trocar (not shown) or other tool guide into the Patient so as to extend down to the surgical site through a corresponding minimally invasive incision such as Incision 161 .
  • Each of the Slave Arms 121 - 124 includes a slave manipulator and setup arms.
  • the slave manipulators are robotically moved using motor controlled joints (also referred to as “active joints”) in order to manipulate and/or move their respectively held Tools.
  • the setup arms are manually manipulated by releasing normally braked joints (also referred to as “setup joints”) to horizontally and vertically position the Slave Arms 121 - 124 so that their respective Tools may be inserted into the cannulae.
  • the number of surgical tools used at one time and consequently, the number of slave arms being used in the System 100 will generally depend on the medical procedure to be performed and the space constraints within the operating room, among other factors. If it is necessary to change one or more of the tools being used during a procedure, the Assistant may remove the tool no longer being used from its slave arm, and replace it with another tool, such as Tool 131 , from a Tray (“T”) in the Operating Room.
  • the Master Display 104 is positioned near the Surgeon's hands so that it will display a projected image that is oriented so that the Surgeon feels that he or she is actually looking directly down onto the surgical site.
  • To this end, the images of the Tools 138, 139 preferably appear to be located substantially where the Surgeon's hands are located, even though the observation points (i.e., those of the Endoscope 140 and LUS Probe 150) may not be from the point of view of the image.
  • the real-time image is preferably projected into a perspective image such that the Surgeon can manipulate the end effector of a Tool, 138 or 139 , through its associated Master Input Device, 107 or 108 , as if viewing the workspace in substantially true presence.
  • By “true presence,” it is meant that the presentation of an image is a true perspective image simulating the viewpoint of an operator that is physically manipulating the Tools.
  • the Processor 102 transforms the coordinates of the Tools to a perceived position so that the perspective image is the image that one would see if the Endoscope 140 was looking directly at the Tools from a Surgeon's eye-level during an open cavity procedure.
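  • As a hedged illustration of that coordinate transformation (not the patent's actual implementation), a tool pose can be re-expressed in the endoscope's frame by composing forward-kinematics transforms, assuming each pose is available as a 4x4 homogeneous matrix:

```python
import numpy as np

def tool_in_eye_frame(T_base_tool, T_base_camera):
    """Re-express a tool pose in the endoscope (eye) frame.

    T_base_tool   -- 4x4 tool-tip pose in the robot base frame,
                     from slave-arm forward kinematics
    T_base_camera -- 4x4 endoscope pose in the same base frame
    Returns the tool pose as seen from the camera, i.e. the frame in
    which the perspective image presented to the Surgeon is rendered.
    """
    return np.linalg.inv(T_base_camera) @ T_base_tool
```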
  • the Processor 102 performs various functions in the System 100 .
  • One important function that it performs is to translate and transfer the mechanical motion of Master Input Devices 107 , 108 to their associated Slave Arms 121 , 122 through control signals over Bus 110 so that the Surgeon can effectively manipulate their respective Tools 138 , 139 .
  • Another important function is to implement the various methods described herein in reference to FIGS. 4-12 .
  • Processor 102 may be implemented in practice by any combination of hardware, software and firmware. Also, its functions as described herein may be performed by one unit, or divided up among different components, each of which may be implemented in turn by any combination of hardware, software and firmware. When divided up among different components, the components may be centralized in one location or distributed across the System 100 for distributed processing purposes.
  • Prior to performing a medical procedure, ultrasound images captured by the LUS Probe 150, right and left 2D camera images captured by the stereoscopic Endoscope 140, and end effector positions and orientations as determined using kinematics of the Slave Arms 121-124 and their sensed joint positions are calibrated and registered with each other.
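  • One standard way to register two such measurement frames, given corresponding fiducial or feature points located in each, is a least-squares rigid fit. The following is an illustrative sketch of the Kabsch algorithm, not the patent's specific calibration procedure:

```python
import numpy as np

def rigid_registration(src_pts, dst_pts):
    """Least-squares rigid transform (R, t) mapping src_pts onto
    dst_pts -- e.g. fiducial positions seen in one imaging frame versus
    the same fiducials located via arm kinematics. Both inputs are
    N x 3 arrays with matching row order (Kabsch algorithm)."""
    src_c, dst_c = src_pts.mean(axis=0), dst_pts.mean(axis=0)
    H = (src_pts - src_c).T @ (dst_pts - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: keep det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t
```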
  • Slave Arms 123 , 124 may manipulate the Endoscope 140 and LUS Probe 150 in similar manners as Slave Arms 121 , 122 manipulate Tools 138 , 139 .
  • Since there are only two Master Input Devices 107, 108 in the System 100, in order for the Surgeon to manually control movement of either the Endoscope 140 or the LUS Probe 150, it may be necessary to temporarily associate one of the Master Input Devices 107, 108 with whichever of the Endoscope 140 or the LUS Probe 150 the Surgeon desires manual control over, while its previously associated Tool and Slave Manipulator are locked in position.
  • Additional sources of auxiliary images of anatomic structures may be included in the System 100, such as those commonly used for capturing ultrasound, magnetic resonance, computed axial tomography, and fluoroscopic images. Each of these sources may be used pre-operatively and, where appropriate and practical, intra-operatively.
  • FIG. 2 illustrates, as an example, a block diagram of the System 100 .
  • Master Input Device 107 controls movement of either a Tool 138 or a stereoscopic Endoscope 140 , depending upon which mode its Control Switch Mechanism 211 is in, and Master Input Device 108 controls movement of either a Tool 139 or a LUS Probe 150 , depending upon which mode its Control Switch Mechanism 231 is in.
  • the Control Switch Mechanisms 211 and 231 may be placed in either a first or second mode by a Surgeon using voice commands, switches physically placed on or near the Master Input Devices 107 , 108 , Foot Pedals 105 , 106 on the Console, or Surgeon selection of appropriate icons or other graphical user interface selection means displayed on the Master Display 104 or an auxiliary display (not shown).
  • Control Switch Mechanism 211 When Control Switch Mechanism 211 is placed in the first mode, it causes Master Controller 202 to communicate with Slave Controller 203 so that manipulation of the Master Input 107 by the Surgeon results in corresponding movement of Tool 138 by Slave Arm 121 , while the Endoscope 140 is locked in position. On the other hand, when Control Switch Mechanism 211 is placed in the second mode, it causes Master Controller 202 to communicate with Slave Controller 233 so that manipulation of the Master Input 107 by the Surgeon results in corresponding movement of Endoscope 140 by Slave Arm 123 , while the Tool 138 is locked in position.
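  • As an illustrative aside (not from the patent), this first/second-mode routing can be sketched as follows; the slave-controller interface (lock_in_position, command_motion) is invented for illustration:

```python
class ControlSwitchMechanism:
    """Routes one master input device to one of two slave controllers,
    locking whichever slave is not currently being driven."""

    def __init__(self, tool_slave, camera_slave):
        self.slaves = {"first": tool_slave, "second": camera_slave}
        self.mode = "first"

    def set_mode(self, mode):
        self.slaves[self.mode].lock_in_position()  # freeze the old slave
        self.mode = mode

    def on_master_motion(self, delta_pose):
        # Master motion reaches only the currently associated slave.
        self.slaves[self.mode].command_motion(delta_pose)
```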
  • Control Switch Mechanism 231 when Control Switch Mechanism 231 is placed in the first mode, it causes Master Controller 108 to communicate with Slave Controller 223 so that manipulation of the Master Input 108 by the Surgeon results in corresponding movement of Tool 139 by Slave Arm 122 .
  • In this first mode, however, the LUS Probe 150 is not necessarily locked in position; its movement may be guided by an Auxiliary Controller 242 according to stored instructions in Memory 240.
  • the Auxiliary Controller 242 also provides haptic feedback to the Surgeon through Master Input 108 that reflects readings of a LUS Probe Force Sensor 247 .
  • When Control Switch Mechanism 231 is placed in the second mode, it causes Master Controller 222 to communicate with Slave Controller 243 so that manipulation of the Master Input 108 by the Surgeon results in corresponding movement of the LUS Probe 150 by Slave Arm 124, while the Tool 139 is locked in position.
  • Before a Control Switch Mechanism effects a switch back to its first or normal mode, its associated Master Input Device is preferably repositioned to where it was before the switch. Alternatively, the Master Input Device may remain in its current position while the kinematic relationship between the Master Input Device and its associated Tool Slave Arm is readjusted so that, upon the Control Switch Mechanism switching back to its first or normal mode, abrupt movement of the Tool does not occur (one way to implement this readjustment is sketched below).
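  • One conventional way to implement that kinematic readjustment is to recompute a master-to-tool offset at switch time; the following is a minimal sketch under that assumption (pose representation and frame choices are illustrative):

```python
import numpy as np

def master_slave_offset(master_pose, tool_pose):
    """Recompute the master-to-tool offset at switch time so the tool
    does not jump when teleoperation resumes. Poses are 4x4 homogeneous
    transforms expressed in a common reference frame."""
    return np.linalg.inv(master_pose) @ tool_pose

# After the switch, each new master pose maps through the stored
# offset, so commanded motion continues from the tool's current pose:
#   tool_target = master_pose_now @ offset
```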
  • For additional details on such control switching, see, e.g., commonly owned U.S. Pat. No. 6,659,939 entitled “Cooperative Minimally Invasive Telesurgical System,” which is incorporated herein by this reference.
  • a third Control Switch Mechanism 241 is provided to allow the user to switch between an image capturing mode and an image manipulating mode while the Control Switch Mechanism 231 is in its second mode (i.e., associating the Master Input Device 108 with the LUS Probe 150 ).
  • the LUS Probe 150 In its first or normal mode (i.e., image capturing mode), the LUS Probe 150 is normally controlled by the Master Input Device 108 as described above.
  • the LUS Probe 150 In its second mode (i.e., image manipulating mode), the LUS Probe 150 is not controlled by the Master Input Device 108 , leaving the Master Input Device 108 free to perform other tasks such as the displaying and manipulating of auxiliary images on the Display Screen 104 and in particular, for performing certain user specified functions as described herein.
  • Although the LUS Probe 150 may not be controlled by the Master Input Device 108 in this second mode of the Control Switch Mechanism 241, it may still be automatically rocked or otherwise moved under the control of the Auxiliary Controller 242 according to stored instructions in Memory 240, so that a 3D volume rendering of a proximate anatomic structure may be generated from a series of 2D ultrasound image slices captured by the LUS Probe 150.
  • the Auxiliary Controller 242 also performs other functions related to the LUS Probe 150 and the Endoscope 140 . It receives output from a LUS Probe Force Sensor 247 , which senses forces being exerted against the LUS Probe 150 , and feeds the force information back to the Master Input Device 108 through the Master Controller 222 so that the Surgeon may feel those forces even if he or she is not directly controlling movement of the LUS Probe 150 at the time. Thus, potential injury to the Patient is minimized since the Surgeon has the capability to immediately stop any movement of the LUS Probe 150 as well as the capability to take over manual control of its movement.
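  • A hedged sketch of such force reflection follows; the gain, saturation limit, and apply_force interface are illustrative assumptions, not values or APIs from the patent:

```python
def reflect_probe_force(sensor_force_n, master, gain=0.3, limit_n=4.0):
    """Map a reading from the LUS probe force sensor into a feedback
    force on the master input device. A scalar gain attenuates the
    reflected force; a saturation limit keeps the cue gentle.

    sensor_force_n -- (fx, fy, fz) in newtons, probe frame
    master         -- object exposing apply_force((fx, fy, fz))
    """
    scaled = tuple(max(-limit_n, min(limit_n, gain * f))
                   for f in sensor_force_n)
    master.apply_force(scaled)
```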
  • Another key function of the Auxiliary Controller 242 is to cause processed information from the Endoscope 140 and the LUS Probe 150 to be displayed on the Master Display 104 according to user selected display options. Examples of such processing include generating a 3D ultrasound image from 2D ultrasound image slices received from the LUS Probe 150 through an Ultrasound Processor 246, causing either 3D or 2D ultrasound images corresponding to a selected position and orientation to be displayed in a picture-in-picture window of the Master Display 104, causing either 3D or 2D ultrasound images of an anatomic structure to overlay a camera captured image of the anatomic structure being displayed on the Master Display 104, and performing the methods described below in reference to FIGS. 4-12.
  • The Master Controllers 202, 222, the Slave Controllers 203, 233, 223, 243, the Auxiliary Controller 242, and certain mode-switching aspects of the Control Switch Mechanisms 211, 231, 241 are preferably implemented as software modules executed by the Processor 102.
  • The Ultrasound Processor 246 and Video Processor 236 may be implemented as software modules, or as separate boards or cards inserted into appropriate slots coupled to or otherwise integrated with the Processor 102, in order to convert signals received from these image capturing devices into signals suitable for display on the Master Display 104 and/or for additional processing by the Auxiliary Controller 242 before being displayed.
  • Although each Master Input Device is shown as being shared by only one pre-assigned Tool Slave Robotic Arm and one pre-assigned Image Capturing Device Robotic Arm, alternative arrangements are also feasible and envisioned to be within the full scope of the present invention. For example, an arrangement in which each of the Master Input Devices may be selectively associated with any one of the Tool and Image Capturing Device Robotic Arms is also possible, and even preferable, for maximum flexibility. Also, although the Endoscope Robotic Arm is shown in this example as being controlled by a single Master Input Device, it may instead be controlled using both Master Input Devices to give the sensation of being able to “grab the image” and move it to a different location or view.
  • FIG. 3 illustrates a side view of one embodiment of the LUS Probe 150 .
  • the LUS Probe 150 is a dexterous tool with preferably two distal degrees of freedom. Opposing pairs of Drive Rods or Cables (not shown) physically connected to a proximal end of the LUS Sensor 301 and extending through an internal passage of Elongated Shaft 312 mechanically control pitch and yaw movement of the LUS Sensor 301 using conventional push-pull type action.
  • the LUS Sensor 301 captures 2D ultrasound slices of a proximate anatomic structure, and transmits the information back to the Processor 102 through LUS Cable 304 . Although shown as running outside of the Elongated Shaft 312 , the LUS Cable 304 may also extend within it.
  • A Clamshell Sheath 321 encloses the Elongated Shaft 312 and LUS Cable 304 to provide a good seal when passing through a Cannula 331 (or trocar). Fiducial Marks 302 and 322 are placed on the LUS Sensor 301 and the Sheath 321 for video tracking purposes.
  • FIG. 4 illustrates, as an example, a flow diagram of a method for displaying the effect of a therapeutic procedure or treatment on the Display Screen 104 .
  • a primary image of an anatomic structure is captured by an image capturing device.
  • FIG. 5 illustrates a primary image which has been captured by the Endoscope 140 and includes an anatomic structure 501 and therapeutic instrument 511 that has been partially inserted into the anatomic structure 501 in order to perform a therapeutic procedure at a therapy site within the anatomic structure 501 .
  • the therapeutic instrument 511 may only need to touch or come close to the anatomic structure 501 in order to perform a therapeutic procedure.
  • the primary image may be captured before or during the therapeutic procedure.
  • a primary image captured before the procedure is referred to as being a “pre-operative” image, and a primary image captured during the procedure is referred to as being an “intra-operative” image.
  • Where the primary image is a pre-operative image, it is generally not updated during the procedure, so that the method employs only one primary image.
  • Where the primary image is an intra-operative image, it is preferably updated periodically during the procedure, so that the method employs multiple primary images in that case.
  • Pre-operative images are typically captured using techniques such as Ultrasonography, Magnetic Resonance Imaging (MRI), or Computed Axial Tomography (CAT).
  • Intra-operative images may be captured at the surgical or therapeutic site by image capturing devices such as the stereoscopic Endoscope 140 or LUS Probe 150 , or they may be captured externally by techniques such as those used to capture the pre-operative images.
  • the therapeutic instrument is turned on, or otherwise activated or energized, so as to be capable of applying therapy to the anatomic structure within the patient.
  • the instrument generally has a tip for applying the therapeutic energy to abnormal tissue such as diseased or damaged tissue.
  • As an example, Radio Frequency Ablation (RFA) may be used to destroy diseased tissue, such as a tumor located in an anatomic structure such as the liver, by applying heat to the diseased tissue site using an RFA probe. Other energy-delivering therapies, such as High Intensity Focused Ultrasound (HIFU), may be applied in a similar fashion.
  • the therapeutic instrument may be one of the Tools 138 , 139 attached to Slave Arms 121 , 122 so that it may be moved to and manipulated at the therapy site through the master/slave control system by the Surgeon.
  • an auxiliary image is generated, wherein the auxiliary image indicates the effect of the therapeutic procedure on the anatomic structure.
  • the auxiliary image may be an actual image of the anatomic structure that has been provided by or generated from information captured by a sensing device which is capable of sensing the effect of the therapeutic procedure.
  • the auxiliary image may be a computer model indicating the effect of the therapy, which may be generated using an empirically derived or otherwise conventionally determined formula of such effect.
  • the computer model is generally a volumetric shape determined by such factors as the geometry of the tip of the therapeutic instrument, the heat or energy level being applied to the anatomic structure by the tip of the therapeutic instrument, and the features of the surrounding tissue of a therapy site being subjected to the therapeutic procedure in the anatomic structure.
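  • The patent characterizes this model only qualitatively. The following deliberately toy Python sketch shows how such a volumetric estimate might be parameterized; the growth law and all constants are placeholders for illustration, not clinical values or the patent's formula:

```python
import numpy as np

def ablation_radius_mm(power_w, duration_s, k=0.35, tissue_factor=1.0):
    """Toy model: lesion radius grows with delivered energy, moderated
    by a tissue-dependent factor. Form and constants are placeholders."""
    energy_j = power_w * duration_s
    return tissue_factor * k * energy_j ** (1.0 / 3.0)

def ablation_mask(grid_pts, tip_pos, radius_mm):
    """Boolean mask of grid points (N x 3) inside a spherical ablation
    volume centered on the instrument tip position (3,)."""
    return np.linalg.norm(grid_pts - tip_pos, axis=1) <= radius_mm
```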
  • FIG. 6 illustrates a three-dimensional ultrasound image of an anatomic structure 601 which has been conventionally derived from two-dimensional ultrasound slices captured by the LUS Probe 150.
  • an ablation volume 621 is shown which represents the effect of a therapeutic procedure in which a tip 613 of an RFA probe 612 is being applied to a tumor site of the anatomic structure 601 .
  • the growth of the ablation volume in this case is viewable due to changes in tissue properties from the heating and necrosis of the surrounding tissue at the tumor site.
  • the primary and auxiliary images are registered so as to be of the same scale and refer to a same position and orientation in a common reference frame. Registration of this sort is well known.
  • For details on such registration techniques, see, e.g., commonly owned U.S. Pat. No. 6,522,906 entitled “Devices and Methods for Presenting and Regulating Auxiliary Information on an Image Display of a Telesurgical System to Assist an Operator in Performing a Surgical Procedure,” which is incorporated herein by this reference.
  • the primary image is displayed on the Display Screen 104 while the therapeutic procedure is being performed, with the registered auxiliary image preferably overlaid upon the primary image so that corresponding structures or objects in each of the images appear as the same size and at the same location and orientation on the Display Screen 104 .
  • the effect of the therapeutic procedure is shown as an overlay over the anatomic structure that is being subjected to the procedure.
  • FIG. 7 shows an exemplary Display Screen 104 in which an auxiliary image, distinguished as a dotted line for illustrative purposes, is overlaid over the primary image of FIG. 5 .
  • Where the auxiliary image is provided by or derived from information captured by a sensing device, the therapy effect 521, the therapeutic instrument 512, and the instrument tip 513 are all provided by or derived from the captured information. Where the therapy effect 521 is instead generated as a volumetrically shaped computer model using an empirically determined formula, the positions of the therapeutic instrument 512 and the instrument tip 513 may be determined using conventional tool tracking computations based at least in part upon the joint positions of its manipulating slave arm.
  • the method then checks whether the therapeutic instrument has been turned off. If it has, then this means that the therapeutic procedure is over, and the method ends. On the other hand, if the therapeutic instrument is still on, then the method assumes that the therapeutic procedure is still being performed, and proceeds in 407 to determine whether a new primary image has been captured. If no new primary image has been captured, for example, because the primary image is a pre-operative image, then the method jumps back to 403 to update the auxiliary image and continue to loop through 403 - 407 until the therapeutic procedure is determined to be completed by detecting that the therapeutic instrument has been turned off.
  • If a new primary image has been captured, the method updates the primary image in 408 before jumping back to 403 to update the auxiliary image, and continues to loop through 403-408 until the therapeutic procedure is determined to be completed by detecting that the therapeutic instrument has been turned off (the overall loop is sketched below).
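  • Under those assumptions, the 403-408 loop of FIG. 4 might look like the following sketch; therapy, primary_src, effect_model, and display are hypothetical interfaces invented for illustration:

```python
def run_therapy_overlay(display, therapy, primary_src, effect_model):
    """Display-update loop paralleling 403-408 of FIG. 4: while the
    therapy instrument is on, regenerate the auxiliary (effect) image,
    refresh the primary image if a new one has been captured, and
    redraw the registered overlay."""
    primary = primary_src.latest()            # pre- or intra-operative
    while therapy.is_on():
        auxiliary = effect_model.render()     # current therapy effect
        new_primary = primary_src.latest()    # None if nothing new
        if new_primary is not None:
            primary = new_primary
        display.show(primary, overlay=auxiliary)
```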
  • FIG. 8 illustrates, as an example, a flow diagram of a method for displaying an auxiliary image of an anatomic structure as a registered overlay to a primary image of the anatomic structure at a user specified magnification in a window defined as the lens area of a magnifying glass whose position and orientation as displayed on the Display Screen 104 is manipulatable by the user using an associated pointing device.
  • the method starts out by associating the magnifying glass with the pointing device so that as the pointing device moves, the magnifying glass being displayed on the Display Screen 104 (and in particular, its lens which may be thought of as a window) moves in a corresponding fashion.
  • the association in this case may be performed by “grabbing” the magnifying glass in a conventional manner using the pointing device, or by making the magnifying glass effectively the cursor for the pointing device.
  • the Display Screen 104 is preferably a three-dimensional display
  • the pointing device is correspondingly preferably a three-dimensional pointing device with orientation indicating capability.
  • the primary image in this example is captured by the Endoscope 140 and the auxiliary captured by the LUS Probe 150 .
  • other sources for the primary and auxiliary images are also usable and contemplated in practicing the invention, including primary and auxiliary images captured from the same source.
  • a high resolution camera may capture images at a resolution greater than that being used to display images on a display screen.
  • the high resolution image captured by the camera may be treated as the auxiliary image, and the downsized image to be displayed on the display screen may be treated as the primary image.
  • a user selectable magnification factor is read.
  • the magnification factor is user selectable by, for example, a dial or wheel type control on the pointing device. Alternatively, it may be user selectable by user selection of item in a menu displayed on the Display Screen 104 , or any other conventional user selectable parameter value scheme or mechanism. If the user fails to make a selection, then a default value is used, such as a magnification factor of 1.0.
  • the primary and auxiliary images are registered so as to be of the same scale and refer to a same position and orientation in a common reference frame so that corresponding structures and objects in the two images have the same coordinates.
  • the primary image is displayed on the Display Screen 104 such as a three-dimensional view of the anatomic structure, in which case, a portion of a two-dimensional slice of the auxiliary image of the anatomic structure may be displayed as an overlay in the lens of the magnifying glass.
  • the portion of the two-dimensional slice in this case is defined by a window area having a central point that has the same position and orientation of as the central point of the lens of the magnifying glass, and an area determined by the magnification factor so that the portion of the two-dimensional slice may be enlarged or reduced so as to fit in the lens of the magnifying glass.
  • the two-dimensional slice can correspond to any user selected depth within the anatomic structure.
  • its view is not limited to inspecting only the exterior of the anatomic structure.
  • the method determines whether the magnifying glass command has been turned off by, for example, the user releasing a “grabbed” image of the magnifying glass, or otherwise switching off the association between the magnifying glass and the pointing device by the use of a conventional switch mechanism of some sort. If it has, then the method ends. On the other hand, if it has not, then the method jumps back to 802 and continues to loop through 802 - 806 until the magnifying glass command is detected to have been turned off. Note that each time the method loops through 802 - 806 , updated versions, if any, of the primary and auxiliary images are processed along with updated values, if any, for the user selectable magnification factor. Thus, if the method proceeds through the looping in a sufficiently fast manner, the user will not notice any significant delay if the user is turning a dial or knob to adjust the magnification factor while viewing the anatomic structure at a selected position and orientation of the magnifying glass.
  • FIG. 9 illustrates, as an example, a flow diagram of a method for displaying an auxiliary image view of an anatomic structure at a specified magnification factor as an overlay to a primary image view of the anatomic structure in the lens of a user movable magnifying glass. As previously explained, this method may be used to perform 805 of FIG. 8 .
  • the current position and orientation of a central point of the lens of the magnifying glass are determined in the three-dimensional space of the Display Screen 104 .
  • a two-dimensional slice of the registered volumetric model of the auxiliary image is taken from the perspective of that position and orientation, and a portion of the two-dimensional slice is taken as defined in an auxiliary view window having a central point preferably at that same position and orientation.
  • the area of the auxiliary view window in this case is inversely proportional to that of the lens according to the current magnification factor for the magnifying glass.
  • the portion of the two-dimensional slice defined by the auxiliary view window is then enlarged by the magnification factor so that it fits in the lens area of the magnifying glass, and in 904 , the primary image of the anatomic structure is displayed on the Display Screen 104 with the enlarged portion of the two-dimensional slice of the auxiliary image overlaid in the lens area of the magnifying glass being displayed on the Display Screen 104 .
  • In FIGS. 10-11, as an example, a two-dimensional slice 1001 of an auxiliary image of an anatomic structure is shown along with two circular windows 1021, 1022 on the two-dimensional slice, as illustrated in FIG. 10. Each of the windows 1021, 1022 in this case corresponds in shape to, and has a central point coinciding with that of, a lens 1121 of a magnifying glass 1120 which is being displayed along with a primary image of an external view 1101 of the anatomic structure on the Display Screen 104, as illustrated in FIG. 11.
  • The area of the window 1021 is equal to the area of the lens 1121, so that if the magnification factor were 1.0, then window 1021 would be selected for use in 902. The area of the window 1022 is less than the area of the lens 1121, so that if the magnification factor is greater than 1.0, then window 1022 may be selected for use in 902 (a minimal sketch of this window sizing follows). Although the lens 1121 of the magnifying glass 1120 is depicted as being circularly shaped, it may also have other shapes common for a magnifying glass, such as a rectangular shape.
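  • For illustration only, the window sizing just described can be expressed in a few lines of Python. This sketch assumes the magnification factor is a linear scale factor, so the selected window's radius is the lens radius divided by that factor (the areas are then inversely related through the square of the factor); none of the names below come from the patent itself.

        import math

        def auxiliary_window_radius(lens_radius_px, magnification):
            """Radius of the window taken on the auxiliary image slice.

            A factor of 1.0 selects a window the same size as the lens
            (window 1021); factors > 1.0 select a smaller window (window
            1022) whose contents are enlarged to fill the lens.
            """
            if magnification <= 0:
                raise ValueError("magnification must be positive")
            return lens_radius_px / magnification

        # Example: a 100-pixel lens at 2x magnification samples a 50-pixel
        # window, i.e., one quarter of the lens area.
        r = auxiliary_window_radius(100.0, 2.0)
        print(r, math.pi * r**2 / (math.pi * 100.0**2))  # 50.0 0.25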
  • FIG. 12 illustrates, as an example, a flow diagram of a method performed by a processor in a medical robotic system for manipulating image objects displayed on a computer display screen of the medical robotic system in response to corresponding manipulation of an associated master input device when the master input device is in an image manipulating mode.
  • In this example, the medical robotic system includes: an image capturing device to capture images (such as either the Endoscope 140 or the LUS Probe 150); a robotic arm holding the image capturing device (such as the Slave Arm 123 or the Slave Arm 124, respectively holding the Endoscope 140 and the LUS Probe 150); a computer display screen (such as the Display Screen 104); a master input device adapted to be manipulatable by a user in multiple degrees-of-freedom movement (such as the Master Input Device 107 or the Master Input Device 108); and a processor (such as the Auxiliary Controller 242) that is configured to control movement of the image capturing device according to user manipulation of the master input device when the master input device is in an image capturing mode, and to control the displaying of images derived from the captured images on the computer display screen according to user manipulation of the master input device when the master input device is in the image manipulating mode.
  • In 1201, the processor detects that the user has placed the master input device into its image manipulating mode. Typically, this is done through a master clutch mechanism provided in the medical robotic system, which supports disengaging the master input device from its associated robotic arm so that the master input device may be repositioned. When this mode is activated by some mechanism, such as the user depressing a button on the master input device, pressing down on a foot pedal, or using voice activation, the associated robotic arm is locked in position and a cursor (nominally an iconic representation of a hand) is presented to the user on the computer display screen.
  • In 1202, the processor determines whether a control input, such as that generated by depressing a button on a conventional mouse, has been activated (i.e., turned “on”) by the user. The control input in this case may be activated by depressing a button provided on the master input device, or it may be activated in some other fashion, such as by squeezing a gripper or pincher formation provided on the master input device. For additional details on clutching, and on gripper or pincher formations on a master input device, see, e.g., commonly owned U.S. Pat. No. 6,659,939 entitled “Cooperative Minimally Invasive Telesurgical System,” which has been previously incorporated herein by reference.
  • In 1203, after receiving an indication that the control input is “on”, the processor checks to see if the cursor is positioned on (or within a predefined distance of) an object being displayed on the computer display screen. If it is not, then in 1204, the processor causes a menu of user selectable items or actions to be displayed on the computer display screen, and in 1205, the processor receives and reacts to a menu selection made by the user.
  • Examples of user selectable menu items include: magnifying glass, cut-plane, eraser, and image registration. If the user selects the magnifying glass item, then an image of a magnifying glass is displayed on the computer display screen and the method described in reference to FIG. 8 may be performed by the processor. When the user is finished with the magnifying glass function, then the user may indicate exiting of the function in any conventional manner and the processor returns to 1202 .
  • If the user selects the cut-plane item, then a plane (or rectangular window of fixed or user adjustable size) is displayed on the computer display screen.
  • The master input device may then be associated with the plane so that the user may position and orient the plane in the three-dimensional space of the computer display screen by manipulating the master input device in the manner of a pointing device. If the plane is maneuvered so as to intersect a volume rendering of an anatomic structure, then it functions as a cut-plane defining a two-dimensional slice of the volume rendering at the intersection. Alternatively, the master input device may be associated with the volume rendering of the anatomic structure, which may then be maneuvered so as to intersect the displayed plane to define the cut-plane. Association of the plane or volume rendering with the pointing device may be performed in substantially the same manner as described in reference to the magnifying glass with respect to 801 of FIG. 8.
  • The two-dimensional slice may then be viewed either in the plane itself, or in a separate window on the computer display screen, such as in a picture-in-picture. The user may further select the cut-plane item additional times to define additional two-dimensional slices of the volume rendering for concurrent viewing in respective planes or picture-in-picture windows on the computer display screen, and a conventional delete function is provided so that the user may selectively delete any cut-planes and their corresponding slices. When the user is finished with the cut-plane function, the user may indicate exiting of the function in any conventional manner and the processor returns to 1202.
  • If the user selects the eraser item, then an eraser is displayed on the computer display screen. The master input device is then associated with the eraser so that the user may position and orient the eraser in the three-dimensional space of the computer display screen by manipulating the master input device in the manner of a pointing device; this association may be performed in substantially the same manner as described in reference to the magnifying glass with respect to 801 of FIG. 8. If the eraser is maneuvered so as to intersect a volume rendering of an anatomic structure, then it completely or partially erases the rendering wherever it traverses it.
  • If partial erasing is selected by the user (or otherwise pre-programmed into the processor), then each time the eraser traverses the volume rendering, less detail of the anatomic structure may be shown (a minimal sketch of such partial erasing follows this paragraph). Less detail in this case may refer to the coarseness/fineness of the rendering, or it may refer to the stripping away of layers in the three-dimensional volume rendering. All such characteristics or options of the erasing may be user selected using conventional means. If the user inadvertently erases a portion of the volume rendering, a conventional undo feature is provided to allow the user to undo the erasure. When the user is finished with the erasing function, the user may indicate exiting of the function in any conventional manner and the processor returns to 1202.
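  • As an illustrative sketch only (not the patent's implementation), partial erasing can be modeled as attenuating per-voxel opacity within the eraser's footprint each time it traverses the rendering; the voxel array, radius, and attenuation factor below are all assumptions.

        import numpy as np

        def partial_erase(opacity, center_ijk, radius_vox, attenuation=0.5):
            """Attenuate voxel opacity within a spherical eraser footprint.

            opacity     -- 3D array of per-voxel opacities in [0, 1]
            center_ijk  -- eraser center in voxel coordinates (assumed already
                           transformed from display-screen space)
            radius_vox  -- eraser radius in voxels (hypothetical parameter)
            attenuation -- fraction of opacity kept per traversal; repeated
                           passes show progressively less detail
            """
            zi, yi, xi = np.indices(opacity.shape)
            dist2 = ((zi - center_ijk[0]) ** 2 +
                     (yi - center_ijk[1]) ** 2 +
                     (xi - center_ijk[2]) ** 2)
            opacity[dist2 <= radius_vox ** 2] *= attenuation  # 0.0 erases completely
            return opacity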
  • Other spatially localized modifying functions are also contemplated and considered to be within the full scope of the present invention, including selectively sharpening, brightening, or coloring portions of a displayed image to enhance its visibility in, or otherwise highlight, a selected area. Each such spatially localized modifying function may be performed using substantially the same method described above in reference to the eraser function.
  • If the user selects the image registration item, then images may be manually registered as described below. Image registration in this case typically involves manually registering an auxiliary image of an object, such as an anatomic structure, with a corresponding primary image of the object.
  • As an alternative to the menu, icons respectively indicating each of the selectable items or actions described above may be displayed on the computer display screen upon entering the image manipulating mode and selected by the user clicking on them, after which the processor proceeds to perform as described above in reference to selection of their corresponding menu items.
  • On the other hand, after receiving an indication that the control input is on in 1202 and determining in 1203 that the cursor is positioned on or near an object (not an icon) being displayed on the computer display screen, the processor preferably changes the cursor from an iconic representation of a hand, for example, to that of a grasping hand to indicate that the object has been “grabbed” and is ready to be moved or “dragged” to another position and/or orientation in the three-dimensional space of the computer display screen through user manipulation of the master input device.
  • In 1206, the processor determines whether the user has indicated that a display parameter of the selected object is to be adjusted, and if the user has so indicated, then in 1207, the processor performs the display adjustment. As an example, a dial on the master input device may be turned by the user to indicate that the display parameter associated with the dial is to be adjusted on the selected object according to the amount of rotation of the dial. Where the master input device is provided with a gripper, the gripper may be rotated so as to function as such a dial. Examples of display parameters that may be adjusted in this manner include: brightness, contrast, color, and level of detail (e.g., mesh coarseness/fineness, or voxel size and/or opaqueness) of the selected object being displayed on the computer display screen.
  • In any event, the processor then proceeds to 1208 to determine whether the cursor has moved since the “grabbing” of the selected object upon an affirmative determination in 1203. If it has not moved, then the processor jumps back to 1202, since the user may only have wanted to adjust a display parameter of the selected object at this time. On the other hand, if the cursor has moved since “grabbing” the selected object, then in 1209, the processor moves the selected object to the new cursor position. Since the cursor operates in the three-dimensional space of the computer display screen, when it moves “into” the display screen, it may indicate such movement by, for example, getting progressively smaller in size.
  • In 1210, haptic feedback may be provided back to the master input device so that the user may sense reflected forces while the “grabbed” object is being moved in 1209. For example, user interactions with the object may be reflected haptically back to the user by associating a virtual mass and inertial properties with the object, so that the user feels a reflected force when coming into contact with the object or when translating or rotating the object as it is accelerated or decelerated (a minimal sketch of such force reflection follows this paragraph). The haptic feedback performed in 1210 may only be performed for some types of objects and not for others, or it may take effect only in certain circumstances. Use of such haptic feedback may also be applied to the movement of the magnifying glass and/or the plane used for defining cut-planes as described above; in such cases, however, the haptic feedback may be restricted to occurring only after the magnifying glass or the plane enters an anatomic structure of interest.
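  • For illustration only, a reflected force of the kind just described can be computed from an assumed virtual mass and damping acting on the cursor's motion; the constants and the finite-difference acceleration estimate below are assumptions, not the system's actual control law.

        import numpy as np

        def reflected_force(virtual_mass, damping, velocity, prev_velocity, dt):
            """Force fed back to the master input device while dragging.

            Implements F = m * a + b * v, with the acceleration of the
            "grabbed" object estimated by finite difference over one
            control period dt (all values hypothetical).
            """
            v = np.asarray(velocity, dtype=float)
            accel = (v - np.asarray(prev_velocity, dtype=float)) / dt
            return virtual_mass * accel + damping * v

        # Example: a 0.2 kg virtual mass dragged at a 1 kHz control rate.
        f = reflected_force(0.2, 1.5, [0.02, 0.0, 0.0], [0.015, 0.0, 0.0], 0.001)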
  • In 1211, the processor determines whether the control input is still in its “on” state. If the control input is still “on”, then the processor jumps back to 1208 to track and respond to cursor movement. On the other hand, if the control input has been turned off by, for example, the user releasing a button that was initially depressed to indicate that the control was turned “on”, then in 1212, the processor performs a selected menu action.
  • As an example of such a menu action, in the case of image registration, the object that has been moved is registered in 1212 with another image of the object that is now aligned with it and is being displayed on the computer display screen at the time, so that the two have the same coordinate and orientation values in a common reference frame, such as that of the computer display screen. This feature facilitates, for example, manual registration of an auxiliary image of an anatomic structure (such as one obtained using the LUS Probe 150) with a primary image of the anatomic structure (such as one obtained using the Endoscope 140). After such registration, changes to the position and/or orientation of the corresponding object in the primary image may be mirrored so as to cause corresponding changes to the selected object in the auxiliary image, thereby maintaining its relative position and orientation with respect to the primary image. After performing the selected menu action, the processor returns to 1202.

Abstract

To assist a surgeon performing a medical procedure, auxiliary images generally indicating internal details of an anatomic structure being treated are displayed and manipulated by the surgeon on a computer display screen to supplement primary images generally of an external view of the anatomic structure. A master input device controlling a robotic arm in a first mode may be switched by the surgeon to a second mode in order to function instead as a mouse-like pointing device to facilitate the surgeon performing such auxiliary information display and manipulation.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to U.S. provisional application Ser. No. 60/728,450 filed Oct. 20, 2005, which is incorporated herein by reference.
  • FIELD OF THE INVENTION
  • The present invention generally relates to medical robotic systems and in particular, to the displaying and manipulating of auxiliary images on a computer display in a medical robotic system.
  • BACKGROUND OF THE INVENTION
  • Medical robotic systems such as those used in performing minimally invasive surgical procedures offer many benefits over traditional open surgery techniques, including less pain, shorter hospital stays, quicker return to normal activities, minimal scarring, reduced recovery time, and less injury to tissue. Consequently, demand for minimally invasive surgery using medical robotic systems is strong and growing.
  • One example of a medical robotic system is the daVinci® Surgical System from Intuitive Surgical, Inc., of Sunnyvale, Calif. The daVinci® system includes a surgeon's console, a patient-side cart, a high performance 3-D vision system, and Intuitive Surgical's proprietary EndoWrist™ articulating instruments, which are modeled after the human wrist so that when added to the motions of the robotic arm assembly holding the surgical instrument, they allow at least a full six degrees of freedom of motion, which is comparable to the natural motions of open surgery.
  • The daVinci® surgeon's console has a high-resolution stereoscopic video display with two progressive scan cathode ray tubes (“CRTs”). The system offers higher fidelity than polarization, shutter eyeglass, or other techniques. Each eye views a separate CRT presenting the left or right eye perspective, through an objective lens and a series of mirrors. The surgeon sits comfortably and looks into this display throughout surgery, making it an ideal place for the surgeon to display and manipulate 3-D intra-operative imagery.
  • In addition to primary imagery being displayed on the display screen, it is also desirable at times to be able to concurrently view auxiliary information to gain better insight or to otherwise assist in the medical procedure being performed. The auxiliary information may be provided in various modes such as text information, bar graphs, two-dimensional picture-in-picture images, and two-dimensional or three-dimensional images that are registered and properly overlaid with respect to their primary image counterparts.
  • For auxiliary images, the images may be captured pre-operatively or intra-operatively using techniques such as ultrasonography, magnetic resonance imaging, computed axial tomography, and fluoroscopy to provide internal details of an anatomic structure being treated. This information may then be used to supplement external views of the anatomic structure such as captured by a locally placed camera.
  • Although there are a plethora of auxiliary information sources as well as manners of displaying that information, improvements in the display and manipulation of auxiliary images are still useful to better assist surgeons in performing medical procedures with medical robotic systems.
  • OBJECTS AND SUMMARY OF THE INVENTION
  • Accordingly, one object of various aspects of the present invention is a method for displaying auxiliary information including the effect of a therapeutic procedure as an overlay to or otherwise associated with an image of an anatomic structure being treated at the time by the procedure.
  • Another object of various aspects of the present invention is a method for displaying a user selected portion at a user specified magnification factor of a volume rendering of an auxiliary image of an anatomic structure as a registered overlay to a primary image of the anatomic structure on a computer display screen.
  • Another object of various aspects of the present invention is a medical robotic system having a master input device that may be used to manually register images in a three-dimensional space of a computer display.
  • Another object of various aspects of the present invention is a medical robotic system having a master input device that may be used to define cut-planes of a volume rendering of an anatomic structure in a three-dimensional space of a computer display.
  • Another object of various aspects of the present invention is a medical robotic system having a master input device that may be used to selectively modify portions or details of a volume rendering of an anatomic structure in a three-dimensional space of a computer display.
  • Another object of various aspects of the present invention is a medical robotic system having a master input device that may be used to vary display parameters for a rendering of an anatomic structure being displayed on a computer display screen.
  • Still another object of various aspects of the present invention is a medical robotic system having a master input device that may be switched between an image capturing mode wherein the master input device controls movement of an image capturing device, and an image manipulating mode wherein the master input device controls display and manipulation of images captured by the image capturing device on a computer display screen.
  • These and additional objects are accomplished by the various aspects of the present invention, wherein briefly stated, one aspect is a method for displaying on a computer display screen an effect of a therapeutic procedure being applied by a therapy instrument to an anatomic structure, comprising: generating an auxiliary image that indicates the effect of the therapeutic procedure being applied by the therapy instrument to the anatomic structure; and displaying a primary image of the anatomic structure overlaid with the auxiliary image on the computer display screen during the therapeutic procedure.
  • Another aspect is a method for displaying a selected portion of an auxiliary image of an anatomic structure as an overlay to a primary image of the anatomic structure on a computer display screen, comprising: associating a movable window with a pointing device such that the movable window is positionable on the computer display screen using the pointing device; registering an auxiliary image of an anatomic structure with a primary image of the anatomic structure so as to be at a same position and orientation in a common reference frame; and displaying the primary image on the computer display screen, and a portion of the registered auxiliary image corresponding to the same screen coordinates as the movable window as an overlay to the primary image in the movable window.
  • Still another aspect is a medical robotic system comprising: an image capturing device for capturing images; a robotic arm holding the image capturing device; a computer display screen; a master input device adapted to be manipulatable by a user in multiple degrees-of-freedom movement; and a processor configured to control movement of the image capturing device according to user manipulation of the master input device when the master input device is in an image capturing mode, and to control the displaying of images derived from the captured images on the computer display screen according to user manipulation of the master input device when the master input device is in an image manipulating mode.
  • Additional objects, features and advantages of the various aspects of the present invention will become apparent from the following description of its preferred embodiment, which description should be taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a top view of an operating room employing a medical robotic system utilizing aspects of the present invention.
  • FIG. 2 illustrates a block diagram of a medical robotic system utilizing aspects of the present invention.
  • FIG. 3 illustrates a laparoscopic ultrasound probe useful for a medical robotic system utilizing aspects of the present invention.
  • FIG. 4 illustrates a flow diagram of a method for displaying on a computer display screen an effect of a therapeutic procedure being applied by a therapeutic instrument to an anatomic structure, utilizing aspects of the present invention.
  • FIG. 5 illustrates an external view of an anatomic structure with a therapeutic instrument inserted in the anatomic structure for performing a therapeutic procedure.
  • FIG. 6 illustrates an internal view of an anatomic structure with a discernable therapeutic effect shown as captured by a therapy sensing device.
  • FIG. 7 illustrates a computer display screen displaying an effect of a therapeutic procedure registered to an anatomic structure being treated by the procedure, as generated by a method utilizing aspects of the present invention.
  • FIG. 8 illustrates a flow diagram of a method for displaying a selected portion of an auxiliary image of an anatomic structure in a user movable magnifying glass on a computer display screen, utilizing aspects of the present invention.
  • FIG. 9 illustrates a flow diagram of a method for displaying a manipulatable window of an internal view of an anatomic structure at a specified magnification factor, utilizing aspects of the present invention.
  • FIG. 10 illustrates an auxiliary image of an anatomic structure and concentric areas of the auxiliary image representing different magnification factors for display on a computer display screen in a magnifying glass by a method utilizing aspects of the present invention.
  • FIG. 11 illustrates a computer display screen with a primary image of an anatomic structure and an overlaid portion of an auxiliary image of the anatomic structure viewed in a magnifying glass lens as displayed by a method utilizing aspects of the present invention.
  • FIG. 12 illustrates a flow diagram of a method performed by a processor in a medical robotic system for manipulating objects displayed on a computer display screen utilizing aspects of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • FIG. 1 illustrates, as an example, a top view of an operating room employing a medical robotic system. The medical robotic system in this case is a Minimally Invasive Robotic Surgical (“MIRS”) System 100 including a Console (“C”) utilized by a Surgeon (“S”) while performing a minimally invasive diagnostic or surgical procedure with assistance from one or more Assistants (“A”) on a Patient (“P”) who is reclining on an Operating table (“O”).
  • The Console includes a Master Display 104 (also referred to herein as a “Display Screen” or “computer display screen”) for displaying one or more images of a surgical site within the Patient as well as perhaps other information to the Surgeon. Also included are Master Input Devices 107, 108 (also referred to herein as “Master Manipulators”), one or more Foot Pedals 105, 106, a Microphone 103 for receiving voice commands from the Surgeon, and a Processor 102. The Master Input Devices 107, 108 may include any one or more of a variety of input devices such as joysticks, gloves, trigger-guns, hand-operated controllers, grippers, or the like. The Processor 102 is preferably a personal computer that may be integrated into the Console or otherwise connected to it in a conventional manner.
  • The Surgeon performs a medical procedure using the MIRS System 100 by manipulating the Master Input Devices 107, 108 so that the Processor 102 causes their respectively associated Slave Arms 121, 122 to manipulate their respective removably coupled and held Surgical Instruments 138, 139 (also referred to herein as “Tools”) accordingly, while the Surgeon views three-dimensional (“3D”) images of the surgical site on the Master Display 104.
  • The Tools 138, 139 are preferably Intuitive Surgical's proprietary EndoWrist™ articulating instruments, which are modeled after the human wrist so that when added to the motions of the robot arm holding the tool, they allow at least a full six degrees of freedom of motion, which is comparable to the natural motions of open surgery. Additional details on such tools may be found in commonly owned U.S. Pat. No. 5,797,900 entitled “Wrist Mechanism for Surgical Instrument for Performing Minimally Invasive Surgery with Enhanced Dexterity and Sensitivity,” which is incorporated herein by this reference. At the operating end of each of the Tools 138, 139 is a manipulatable end effector such as a clamp, grasper, scissor, stapler, blade, needle, needle holder, or energizable probe.
  • The Master Display 104 has a high-resolution stereoscopic video display with two progressive scan cathode ray tubes (“CRTs”). The system offers higher fidelity than polarization, shutter eyeglass, or other techniques. Each eye views a separate CRT presenting the left or right eye perspective, through an objective lens and a series of mirrors. The Surgeon sits comfortably and looks into this display throughout surgery, making it an ideal place for the Surgeon to display and manipulate 3-D intra-operative imagery.
  • A Stereoscopic Endoscope 140 provides right and left camera views to the Processor 102 so that it may process the information according to programmed instructions and cause it to be displayed on the Master Display 104. A Laparoscopic Ultrasound (“LUS”) Probe 150 provides two-dimensional (“2D”) ultrasound image slices of an anatomic structure to the Processor 102 so that the Processor 102 may generate a 3D ultrasound computer model or volume rendering of the anatomic structure.
  • Each of the Tools 138, 139, as well as the Endoscope 140 and LUS Probe 150, is preferably inserted through a cannula or trocar (not shown) or other tool guide into the Patient so as to extend down to the surgical site through a corresponding minimally invasive incision such as Incision 161. Each of the Slave Arms 121-124 includes a slave manipulator and setup arms. The slave manipulators are robotically moved using motor controlled joints (also referred to as “active joints”) in order to manipulate and/or move their respectively held Tools. The setup arms are manually manipulated by releasing normally braked joints (also referred to as “setup joints”) to horizontally and vertically position the Slave Arms 121-124 so that their respective Tools may be inserted into the cannulae.
  • The number of surgical tools used at one time and consequently, the number of slave arms being used in the System 100 will generally depend on the medical procedure to be performed and the space constraints within the operating room, among other factors. If it is necessary to change one or more of the tools being used during a procedure, the Assistant may remove the tool no longer being used from its slave arm, and replace it with another tool, such as Tool 131, from a Tray (“T”) in the Operating Room.
  • Preferably, the Master Display 104 is positioned near the Surgeon's hands so that it will display a projected image that is oriented so that the Surgeon feels that he or she is actually looking directly down onto the surgical site. To that end, the images of the Tools 138, 139 preferably appear to be located substantially where the Surgeon's hands are located, even though the observation points (i.e., those of the Endoscope 140 and LUS Probe 150) may not be from the point of view of the image.
  • In addition, the real-time image is preferably projected into a perspective image such that the Surgeon can manipulate the end effector of a Tool, 138 or 139, through its associated Master Input Device, 107 or 108, as if viewing the workspace in substantially true presence. By true presence, it is meant that the presentation of an image is a true perspective image simulating the viewpoint of an operator that is physically manipulating the Tools. Thus, the Processor 102 transforms the coordinates of the Tools to a perceived position so that the perspective image is the image that one would see if the Endoscope 140 was looking directly at the Tools from a Surgeon's eye-level during an open cavity procedure.
  • The Processor 102 performs various functions in the System 100. One important function that it performs is to translate and transfer the mechanical motion of Master Input Devices 107, 108 to their associated Slave Arms 121, 122 through control signals over Bus 110 so that the Surgeon can effectively manipulate their respective Tools 138, 139. Another important function is to implement the various methods described herein in reference to FIGS. 4-12.
  • Although described as a processor, it is to be appreciated that the Processor 102 may be implemented in practice by any combination of hardware, software and firmware. Also, its functions as described herein may be performed by one unit, or divided up among different components, each of which may be implemented in turn by any combination of hardware, software and firmware. When divided up among different components, the components may be centralized in one location or distributed across the System 100 for distributed processing purposes.
  • Prior to performing a medical procedure, ultrasound images captured by the LUS Probe 150, right and left 2D camera images captured by the stereoscopic Endoscope 140, and end effector positions and orientations as determined using kinematics of the Slave Arms 121-124 and their sensed joint positions, are calibrated and registered with each other.
  • Slave Arms 123, 124 may manipulate the Endoscope 140 and LUS Probe 150 in similar manners as Slave Arms 121, 122 manipulate Tools 138, 139. When there are only two master input devices in the system, however, such as Master Input Devices 107, 108 in the System 100, in order for the Surgeon to manually control movement of either the Endoscope 140 or LUS Probe 150, it may be required to temporarily associate one of the Master Input Devices 107, 108 with the Endoscope 140 or the LUS Probe 150 that the Surgeon desires manual control over, while its previously associated Tool and Slave Manipulator are locked in position.
  • Although not shown in this example, other sources of primary and auxiliary images of anatomic structures may be included in the System 100, such as those commonly used for capturing ultrasound, magnetic resonance, computed axial tomography, and fluoroscopic images. Each of these sources of imagery may be used pre-operatively, and where appropriate and practical, intra-operatively.
  • FIG. 2 illustrates, as an example, a block diagram of the System 100. In this system, there are two Master Input Devices 107, 108. Master Input Device 107 controls movement of either a Tool 138 or a stereoscopic Endoscope 140, depending upon which mode its Control Switch Mechanism 211 is in, and Master Input Device 108 controls movement of either a Tool 139 or a LUS Probe 150, depending upon which mode its Control Switch Mechanism 231 is in.
  • The Control Switch Mechanisms 211 and 231 may be placed in either a first or second mode by a Surgeon using voice commands, switches physically placed on or near the Master Input Devices 107, 108, Foot Pedals 105, 106 on the Console, or Surgeon selection of appropriate icons or other graphical user interface selection means displayed on the Master Display 104 or an auxiliary display (not shown).
  • When Control Switch Mechanism 211 is placed in the first mode, it causes Master Controller 202 to communicate with Slave Controller 203 so that manipulation of the Master Input 107 by the Surgeon results in corresponding movement of Tool 138 by Slave Arm 121, while the Endoscope 140 is locked in position. On the other hand, when Control Switch Mechanism 211 is placed in the second mode, it causes Master Controller 202 to communicate with Slave Controller 233 so that manipulation of the Master Input 107 by the Surgeon results in corresponding movement of Endoscope 140 by Slave Arm 123, while the Tool 138 is locked in position.
  • Similarly, when Control Switch Mechanism 231 is placed in the first mode, it causes Master Controller 222 to communicate with Slave Controller 223 so that manipulation of the Master Input 108 by the Surgeon results in corresponding movement of Tool 139 by Slave Arm 122. In this case, however, the LUS Probe 150 is not necessarily locked in position. Its movement may be guided by an Auxiliary Controller 242 according to stored instructions in Memory 240. The Auxiliary Controller 242 also provides haptic feedback to the Surgeon through Master Input 108 that reflects readings of a LUS Probe Force Sensor 247. On the other hand, when Control Switch Mechanism 231 is placed in the second mode, it causes Master Controller 222 to communicate with Slave Controller 243 so that manipulation of the Master Input 108 by the Surgeon results in corresponding movement of LUS Probe 150 by Slave Arm 124, while the Tool 139 is locked in position.
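  • For illustration only, the routing performed by such a control switch mechanism can be sketched as a simple dispatcher; the class and method names below are assumptions, not the system's actual software interfaces (and, as noted above, the LUS Probe 150 may continue under Auxiliary Controller 242 guidance rather than being locked when it is not the active slave).

        class ControlSwitchMechanism:
            """Route master motion to one of two slaves by mode (sketch)."""

            def __init__(self, tool_slave, imaging_slave):
                self.mode = 1                       # 1: tool, 2: image capturing device
                self.tool_slave = tool_slave        # e.g., Slave Arm 122 / Tool 139
                self.imaging_slave = imaging_slave  # e.g., Slave Arm 124 / LUS Probe 150

            def route(self, master_motion):
                if self.mode == 1:
                    self.imaging_slave.lock()       # or automated guidance instead
                    self.tool_slave.follow(master_motion)
                else:
                    self.tool_slave.lock()
                    self.imaging_slave.follow(master_motion)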
  • Before a Control Switch Mechanism effects a switch back to its first or normal mode, its associated Master Input Device is preferably repositioned to where it was before the switch. Alternatively, the Master Input Device may remain in its current position and kinematic relationships between the Master Input Device and its associated Tool Slave Arm readjusted so that upon the Control Switch Mechanism switching back to its first or normal mode, abrupt movement of the Tool does not occur. For additional details on control switching, see, e.g., commonly owned U.S. Pat. No. 6,659,939 entitled “Cooperative Minimally Invasive Telesurgical System,” which is incorporated herein by this reference.
  • A third Control Switch Mechanism 241 is provided to allow the user to switch between an image capturing mode and an image manipulating mode while the Control Switch Mechanism 231 is in its second mode (i.e., associating the Master Input Device 108 with the LUS Probe 150). In its first or normal mode (i.e., image capturing mode), the LUS Probe 150 is normally controlled by the Master Input Device 108 as described above. In its second mode (i.e., image manipulating mode), the LUS Probe 150 is not controlled by the Master Input Device 108, leaving the Master Input Device 108 free to perform other tasks such as the displaying and manipulating of auxiliary images on the Display Screen 104 and in particular, for performing certain user specified functions as described herein. Note however that although the LUS Probe 150 may not be controlled by the Master Input Device 108 in this second mode of the Control Switch Mechanism 241, it may still be automatically rocked or otherwise moved under the control of the Auxiliary Controller 242 according to stored instructions in Memory 240 so that a 3D volume rendering of a proximate anatomic structure may be generated from a series of 2D ultrasound image slices captured by the LUS Probe 150. For additional details on such and other programmed movement of the LUS Probe 150, see commonly owned U.S. patent Application Ser. No. 11/447,668 entitled “Laparoscopic Ultrasound Robotic Surgical System,” filed Jun. 6, 2006, which is incorporated herein by this reference.
  • The Auxiliary Controller 242 also performs other functions related to the LUS Probe 150 and the Endoscope 140. It receives output from a LUS Probe Force Sensor 247, which senses forces being exerted against the LUS Probe 150, and feeds the force information back to the Master Input Device 108 through the Master Controller 222 so that the Surgeon may feel those forces even if he or she is not directly controlling movement of the LUS Probe 150 at the time. Thus, potential injury to the Patient is minimized since the Surgeon has the capability to immediately stop any movement of the LUS Probe 150 as well as the capability to take over manual control of its movement.
  • Another key function of the Auxiliary Controller 242 is to cause processed information from the Endoscope 140 and the LUS Probe 150 to be displayed on the Master Display 104 according to user selected display options. Examples of such processing include generating a 3D ultrasound image from 2D ultrasound image slices received from the LUS Probe 150 through an Ultrasound Processor 246, causing either 3D or 2D ultrasound images corresponding to a selected position and orientation to be displayed in a picture-in-picture window of the Master Display 104, causing either 3D or 2D ultrasound images of an anatomic structure to overlay a camera captured image of the anatomic structure being displayed on the Master Display 104, and performing the methods described below in reference to FIGS. 4-12.
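  • As an illustrative sketch of how a 3D ultrasound image can be compounded from tracked 2D slices (a generic nearest-voxel scheme under assumed known probe poses, not the actual algorithm of the Ultrasound Processor 246 or Auxiliary Controller 242):

        import numpy as np

        def compound_volume(slices, poses, grid_shape, voxel_mm):
            """Place tracked 2D ultrasound pixels into a voxel grid.

            slices -- list of 2D arrays of echo intensities
            poses  -- list of 4x4 transforms mapping homogeneous pixel
                      coordinates (row, col, 0, 1) of each slice into
                      millimeters in the volume frame (assumed known from
                      slave arm kinematics and calibration)
            """
            vol = np.zeros(grid_shape, dtype=np.float32)
            for img, pose in zip(slices, poses):
                r, c = (ax.ravel() for ax in np.indices(img.shape))
                pts = np.stack([r, c, np.zeros_like(r), np.ones_like(r)]).astype(float)
                ijk = np.round((pose @ pts)[:3] / voxel_mm).astype(int)
                ok = ((ijk >= 0).all(axis=0) &
                      (ijk < np.array(grid_shape)[:, None]).all(axis=0))
                vol[tuple(ijk[:, ok])] = img[r[ok], c[ok]]
            return vol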
  • Although shown as separate entities, the Master Controllers 202, 222, Slave Controllers 203, 233, 223, 243, and Auxiliary Controller 242 are preferably implemented as software modules executed by the Processor 102, as are certain mode switching aspects of the Control Switch Mechanisms 211, 231, 241. The Ultrasound Processor 246 and Video Processor 236, on the other hand, may be software modules or separate boards or cards that are inserted into appropriate slots coupled to or otherwise integrated with the Processor 102 to convert signals received from these image capturing devices into signals suitable for display on the Master Display 104 and/or for additional processing by the Auxiliary Controller 242 before being displayed on the Master Display 104.
  • Although the present example assumes that each Master Input Device is being shared by only one pre-assigned Tool Slave Robotic Arm and one pre-assigned Image Capturing Device Robotic Arm, alternative arrangements are also feasible and envisioned to be within the full scope of the present invention. For example, a different arrangement wherein each of the Master Input Devices may be selectively associated with any one of the Tool and Image Capturing Device Robotic Arms is also possible, and even preferable for maximum flexibility. Also, although the Endoscope Robotic Arm is shown in this example as being controlled by a single Master Input Device, it may also be controlled using both Master Input Devices to give the sensation of being able to “grab the image” and move it to a different location or view. Still further, although only an Endoscope and LUS Probe are shown in this example, other Image Capturing Devices such as those used for capturing camera, ultrasound, magnetic resonance, computed axial tomography, and fluoroscopic images are also fully contemplated within the System 100, although each of these Image Capturing Devices may not necessarily be manipulated by one of the Master Input Devices.
  • FIG. 3 illustrates a side view of one embodiment of the LUS Probe 150. The LUS Probe 150 is a dexterous tool with preferably two distal degrees of freedom. Opposing pairs of Drive Rods or Cables (not shown) physically connected to a proximal end of the LUS Sensor 301 and extending through an internal passage of Elongated Shaft 312 mechanically control pitch and yaw movement of the LUS Sensor 301 using conventional push-pull type action.
  • The LUS Sensor 301 captures 2D ultrasound slices of a proximate anatomic structure, and transmits the information back to the Processor 102 through LUS Cable 304. Although shown as running outside of the Elongated Shaft 312, the LUS Cable 304 may also extend within it. A Clamshell Sheath 321 encloses the Elongated Shaft 312 and LUS Cable 304 to provide a good seal passing through a Cannula 331 (or trocar). Fiducial Marks 302 and 322 are placed on the LUS Sensor 301 and the Sheath 321 for video tracking purposes.
  • FIG. 4 illustrates, as an example, a flow diagram of a method for displaying the effect of a therapeutic procedure or treatment on the Display Screen 104. In 401, a primary image of an anatomic structure is captured by an image capturing device. As an example, FIG. 5 illustrates a primary image which has been captured by the Endoscope 140 and includes an anatomic structure 501 and therapeutic instrument 511 that has been partially inserted into the anatomic structure 501 in order to perform a therapeutic procedure at a therapy site within the anatomic structure 501. In another application, the therapeutic instrument 511 may only need to touch or come close to the anatomic structure 501 in order to perform a therapeutic procedure.
  • The primary image may be captured before or during the therapeutic procedure. A primary image captured before the procedure is referred to as being a “pre-operative” image, and a primary image captured during the procedure is referred to as being an “intra-operative” image. When the primary image is a pre-operative image, the image is generally not updated during the procedure, so that the method generally only employs one primary image. On the other hand, when the primary image is an intra-operative image, the image is preferably updated periodically during the procedure, so that the method employs multiple primary images in that case.
  • Pre-operative images are typically captured using techniques such as Ultrasonography, Magnetic Resonance Imaging (MRI), or Computed Axial Tomography (CAT). Intra-operative images may be captured at the surgical or therapeutic site by image capturing devices such as the stereoscopic Endoscope 140 or LUS Probe 150, or they may be captured externally by techniques such as those used to capture the pre-operative images.
  • In 402 of FIG. 4, the therapeutic instrument is turned on, or otherwise activated or energized, so as to be capable of applying therapy to the anatomic structure within the patient. The instrument generally has a tip for applying the therapeutic energy to abnormal tissue such as diseased or damaged tissue. As one example of such a therapeutic procedure, Radio Frequency Ablation (RFA) may be used to destroy diseased tissue such as a tumor located in an anatomic structure such as the liver by applying heat to the diseased tissue site using an RFA probe. Examples of other procedures include High Intensity Focused Ultrasound (HIFU) and Cauterization. The therapeutic instrument may be one of the Tools 138, 139 attached to Slave Arms 121, 122 so that it may be moved to and manipulated at the therapy site through the master/slave control system by the Surgeon.
  • In 403, an auxiliary image is generated, wherein the auxiliary image indicates the effect of the therapeutic procedure on the anatomic structure. The auxiliary image may be an actual image of the anatomic structure that has been provided by or generated from information captured by a sensing device which is capable of sensing the effect of the therapeutic procedure. Alternatively, the auxiliary image may be a computer model indicating the effect of the therapy, which may be generated using an empirically derived or otherwise conventionally determined formula of such effect. In this latter case, the computer model is generally a volumetric shape determined by such factors as the geometry of the tip of the therapeutic instrument, the heat or energy level being applied to the anatomic structure by the tip of the therapeutic instrument, and the features of the surrounding tissue of a therapy site being subjected to the therapeutic procedure in the anatomic structure.
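  • Purely for illustration, the volumetric computer model mentioned above might grow an ablation volume around the instrument tip as a saturating function of applied power and time. The growth law and coefficients below are invented placeholders, not a validated ablation model or the patent's empirically determined formula:

        import math

        def ablation_radius_mm(power_w, seconds, k=0.35, tau=90.0, r_max=25.0):
            """Hypothetical ablation radius: saturating growth with time,
            scaled by the applied RF power (k, tau, r_max illustrative only)."""
            return min(r_max, k * power_w * (1.0 - math.exp(-seconds / tau)))

        def ablation_volume_mm3(power_w, seconds):
            """Volume of the modeled ablation, treated as a sphere at the tip."""
            r = ablation_radius_mm(power_w, seconds)
            return (4.0 / 3.0) * math.pi * r ** 3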
  • As an example of an auxiliary image provided or otherwise derived from information captured by a sensing device, FIG. 6 illustrates a three-dimensional ultrasound image of an anatomic structure 601 which has been conventionally derived from two-dimensional ultrasound slices captured by the LUS Probe 150. In this example, an ablation volume 621 is shown which represents the effect of a therapeutic procedure in which a tip 613 of an RFA probe 612 is being applied to a tumor site of the anatomic structure 601. The growth of the ablation volume in this case is viewable due to changes in tissue properties from the heating and necrosis of the surrounding tissue at the tumor site.
  • In 404, the primary and auxiliary images are registered so as to be of the same scale and refer to a same position and orientation in a common reference frame. Registration of this sort is well known. As an example, see commonly owned U.S. Pat. No. 6,522,906 entitled “Devices and Methods for Presenting and Regulating Auxiliary Information on an Image Display of a Telesurgical System to Assist an Operator in Performing a Surgical Procedure,” which is incorporated herein by this reference.
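  • Conceptually, registration of the kind referenced above brings corresponding points of the two images to the same coordinates in a common frame via a scale and a rigid transform. A minimal sketch under that assumption (the 4x4 transform is presumed already known, e.g., from calibration or the manual registration described elsewhere herein):

        import numpy as np

        def register_points(aux_points, T_aux_to_common, scale):
            """Map auxiliary-image points into the common reference frame.

            aux_points      -- Nx3 array of points in the auxiliary image frame
            T_aux_to_common -- 4x4 homogeneous rigid transform (assumed known)
            scale           -- scalar bringing the auxiliary image to the
                               primary image's scale
            """
            pts = np.hstack([np.asarray(aux_points, dtype=float) * scale,
                             np.ones((len(aux_points), 1))])
            return (T_aux_to_common @ pts.T).T[:, :3]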
  • In 405, the primary image is displayed on the Display Screen 104 while the therapeutic procedure is being performed, with the registered auxiliary image preferably overlaid upon the primary image so that corresponding structures or objects in each of the images appear as the same size and at the same location and orientation on the Display Screen 104. In this way, the effect of the therapeutic procedure is shown as an overlay over the anatomic structure that is being subjected to the procedure.
  • As an example, FIG. 7 shows an exemplary Display Screen 104 in which an auxiliary image, distinguished as a dotted line for illustrative purposes, is overlaid over the primary image of FIG. 5. When the auxiliary image is provided by or derives from information captured by a sensing device, the therapy effect 521, therapeutic instrument 512, and instrument tip 513 are provided by or derived from the captured information. On the other hand, when the therapy effect 521 is generated as a volumetric shaped computer model using an empirically determined formula, the therapeutic instrument 512 and instrument tip 513 may be determined using conventional tool tracking computations based at least in part upon joint positions of its manipulating slave arm.
  • In 406 of FIG. 4, the method then checks whether the therapeutic instrument has been turned off. If it has, then this means that the therapeutic procedure is over, and the method ends. On the other hand, if the therapeutic instrument is still on, then the method assumes that the therapeutic procedure is still being performed, and proceeds in 407 to determine whether a new primary image has been captured. If no new primary image has been captured, for example, because the primary image is a pre-operative image, then the method jumps back to 403 to update the auxiliary image and continue to loop through 403-407 until the therapeutic procedure is determined to be completed by detecting that the therapeutic instrument has been turned off. On the other hand, if a new primary image has been captured, for example, because the primary image is an intra-operative image, then the method updates the primary image in 408 before jumping back to 403 to update the auxiliary image and continue to loop through 403-408 until the therapeutic procedure is determined to be completed by detecting that the therapeutic instrument has been turned off.
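  • Restated as code, the loop of 403-408 simply alternates updating the auxiliary image (and, intra-operatively, the primary image) until the instrument is switched off. The sketch below paraphrases FIG. 4 with hypothetical helper routines:

        def therapy_display_loop(sys):
            """Paraphrase of FIG. 4 (401-408); 'sys' bundles assumed helpers."""
            primary = sys.capture_primary_image()          # 401
            sys.energize_instrument()                      # 402
            while sys.instrument_is_on():                  # 406
                aux = sys.generate_auxiliary_image()       # 403 (sensed or modeled)
                sys.register(primary, aux)                 # 404
                sys.display_overlay(primary, aux)          # 405
                if sys.new_primary_available():            # 407 (intra-operative case)
                    primary = sys.capture_primary_image()  # 408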
  • FIG. 8 illustrates, as an example, a flow diagram of a method for displaying an auxiliary image of an anatomic structure as a registered overlay to a primary image of the anatomic structure at a user specified magnification in a window defined as the lens area of a magnifying glass whose position and orientation as displayed on the Display Screen 104 is manipulatable by the user using an associated pointing device.
  • In 801, the method starts out by associating the magnifying glass with the pointing device so that as the pointing device moves, the magnifying glass being displayed on the Display Screen 104 (and in particular, its lens which may be thought of as a window) moves in a corresponding fashion. The association in this case may be performed by “grabbing” the magnifying glass in a conventional manner using the pointing device, or by making the magnifying glass effectively the cursor for the pointing device. Since the Display Screen 104 is preferably a three-dimensional display, the pointing device is correspondingly preferably a three-dimensional pointing device with orientation indicating capability.
  • In 802, current primary and auxiliary images are made available for processing. The primary image in this example is captured by the Endoscope 140 and the auxiliary image by the LUS Probe 150. However, other sources for the primary and auxiliary images are also usable and contemplated in practicing the invention, including primary and auxiliary images captured from the same source. As an example of this last case, a high resolution camera may capture images at a resolution greater than that being used to display images on a display screen. In this case, the high resolution image captured by the camera may be treated as the auxiliary image, and the downsized image to be displayed on the display screen may be treated as the primary image.
  • In 803, a user selectable magnification factor is read. The magnification factor is user selectable by, for example, a dial or wheel type control on the pointing device. Alternatively, it may be user selectable by user selection of an item in a menu displayed on the Display Screen 104, or any other conventional user selectable parameter value scheme or mechanism. If the user fails to make a selection, then a default value is used, such as a magnification factor of 1.0.
• In 804, the primary and auxiliary images are registered so as to be of the same scale and to refer to the same position and orientation in a common reference frame, so that corresponding structures and objects in the two images have the same coordinates.
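Registration in 804 reduces to applying a scale and a rigid transform so that auxiliary-image coordinates coincide with primary-image coordinates. A minimal sketch, assuming the scale, rotation, and translation have already been obtained from calibration or tracking data:

```python
import numpy as np

def register(aux_points, scale, R, t):
    # Map auxiliary-image coordinates into the primary image's frame:
    # p_primary = scale * R @ p_aux + t, so corresponding structures
    # share the same coordinates after registration.
    return scale * (aux_points @ R.T) + t

R = np.eye(3)                     # rotation aligning the two frames (assumed)
t = np.array([5.0, -2.0, 0.0])    # translation between frame origins (assumed)
pts = np.array([[10.0, 0.0, 3.0]])
print(register(pts, scale=1.0, R=R, t=t))
```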
• In 805, the primary image is displayed on the Display Screen 104, for example as a three-dimensional view of the anatomic structure, in which case a portion of a two-dimensional slice of the auxiliary image of the anatomic structure may be displayed as an overlay in the lens of the magnifying glass. The portion of the two-dimensional slice in this case is defined by a window area having a central point with the same position and orientation as the central point of the lens of the magnifying glass, and an area determined by the magnification factor, so that the portion of the two-dimensional slice may be enlarged or reduced to fit in the lens of the magnifying glass. Since the position and orientation of the magnifying glass are manipulatable by the pointing device to any position in the three-dimensional space of the Display Screen 104, including those within the volume of the anatomic structure, the two-dimensional slice can correspond to any user selected depth within the anatomic structure. Unlike a physical magnifying glass, its view is not limited to inspecting only the exterior of the anatomic structure. For additional details on 805, see the description below in reference to FIG. 9.
  • In 806, the method then determines whether the magnifying glass command has been turned off by, for example, the user releasing a “grabbed” image of the magnifying glass, or otherwise switching off the association between the magnifying glass and the pointing device by the use of a conventional switch mechanism of some sort. If it has, then the method ends. On the other hand, if it has not, then the method jumps back to 802 and continues to loop through 802-806 until the magnifying glass command is detected to have been turned off. Note that each time the method loops through 802-806, updated versions, if any, of the primary and auxiliary images are processed along with updated values, if any, for the user selectable magnification factor. Thus, if the method proceeds through the looping in a sufficiently fast manner, the user will not notice any significant delay if the user is turning a dial or knob to adjust the magnification factor while viewing the anatomic structure at a selected position and orientation of the magnifying glass.
  • FIG. 9 illustrates, as an example, a flow diagram of a method for displaying an auxiliary image view of an anatomic structure at a specified magnification factor as an overlay to a primary image view of the anatomic structure in the lens of a user movable magnifying glass. As previously explained, this method may be used to perform 805 of FIG. 8.
  • In 901, the current position and orientation of a central point of the lens of the magnifying glass are determined in the three-dimensional space of the Display Screen 104. In 902, a two-dimensional slice of the registered volumetric model of the auxiliary image is taken from the perspective of that position and orientation, and a portion of the two-dimensional slice is taken as defined in an auxiliary view window having a central point preferably at that same position and orientation. The area of the auxiliary view window in this case is inversely proportional to that of the lens according to the current magnification factor for the magnifying glass. In 903, the portion of the two-dimensional slice defined by the auxiliary view window is then enlarged by the magnification factor so that it fits in the lens area of the magnifying glass, and in 904, the primary image of the anatomic structure is displayed on the Display Screen 104 with the enlarged portion of the two-dimensional slice of the auxiliary image overlaid in the lens area of the magnifying glass being displayed on the Display Screen 104.
• As a pictorial example of 901-904, FIG. 10 shows a two-dimensional slice 1001 of an auxiliary image of an anatomic structure along with two circular windows 1021, 1022 on the two-dimensional slice. Each of the windows 1021, 1022 in this case corresponds in shape to, and has a central point coincident with, that of a lens 1121 of a magnifying glass 1120 which is being displayed along with a primary image of an external view 1101 of the anatomic structure on the Display Screen 104, as illustrated in FIG. 11. In this example, the area of the window 1021 is equal to the area of the lens 1121, so that if the magnification factor were 1.0, then window 1021 would be selected for use in 902. On the other hand, the area of the window 1022 is less than the area of the lens 1121, so that if the magnification factor is greater than 1.0, then the window 1022 may be selected for use in 902. Note that although the lens 1121 of the magnifying glass 1120 is depicted as being circularly shaped, it may also have other common shapes for a magnifying glass, such as a rectangular shape.
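Putting 901-904 together, the lens view can be computed by cropping an auxiliary-view window whose size shrinks as the magnification factor grows, then resampling it up to the lens size. The sketch below treats the magnification factor as a linear scale, uses a square rather than circular window, and resamples by nearest neighbour; all three are simplifying assumptions.

```python
import numpy as np

def magnified_overlay(aux_slice, center, lens_radius_px, mag):
    # 901-902: crop a window of radius lens_radius/mag around the lens
    # center from the registered 2-D auxiliary slice.
    r = int(round(lens_radius_px / mag))
    cy, cx = center
    window = aux_slice[cy - r:cy + r, cx - r:cx + r]
    # 903: enlarge the window to the lens diameter by index scaling
    # (nearest-neighbour resampling), ready to overlay in the lens (904).
    n = 2 * lens_radius_px
    idx = (np.arange(n) * window.shape[0] / n).astype(int)
    return window[np.ix_(idx, idx)]

slice_ = np.random.rand(512, 512)   # stand-in for the registered 2-D slice
lens_view = magnified_overlay(slice_, (256, 256), lens_radius_px=60, mag=2.0)
print(lens_view.shape)              # (120, 120): fills the lens area
```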
  • FIG. 12 illustrates, as an example, a flow diagram of a method performed by a processor in a medical robotic system for manipulating image objects displayed on a computer display screen of the medical robotic system in response to corresponding manipulation of an associated master input device when the master input device is in an image manipulating mode.
  • As a preface to the method, the medical robotic system includes an image capturing device to capture images (such as either the Endoscope 140 or the LUS Probe 150); a robotic arm holding the image capturing device (such as the Slave Arm 123 or the Slave Arm 124 respectively holding the Endoscope 140 and the LUS Probe 150); a computer display screen (such as the Display Screen 104); a master input device adapted to be manipulatable by a user in multiple degrees-of-freedom movement (such as the Master Input Device 107 or the Master Input Device 108); and a processor (such as the Auxiliary Controller 242) that is configured to control movement of the image capturing device according to user manipulation of the master input device when the master input device is in an image capturing mode, and control the displaying of images derived from the captured images on the computer display screen according to user manipulation of the master input device when the master input device is in the image manipulating mode.
• In 1201, the processor detects that the user has placed the master input device into its image manipulating mode. One way that this may be implemented is using a master clutch mechanism provided in the medical robotic system, which supports disengaging the master input device from its associated robotic arm so that the master input device may be repositioned. When this mode is activated by some mechanism such as the user depressing a button on the master input device, pressing down on a foot pedal, or using voice activation, the associated robotic arm is locked in position, and a cursor (nominally an iconic representation of a hand) is presented to the user on the computer display screen. When the user exits this mode, the cursor is hidden and control of the robotic arm may be resumed after readjusting its position if required.
• In 1202, the processor determines whether a control input such as that generated by depressing a button on a conventional mouse has been activated by the user. The control input in this case may be activated by depressing a button provided on the master input device, or it may be activated in some other fashion, such as by squeezing a gripper or pincher formation provided on the master input device. For additional details on clutching, and on gripper or pincher formations on a master input device, see, e.g., commonly owned U.S. Pat. No. 6,659,939 entitled "Cooperative Minimally Invasive Telesurgical System," which has been previously incorporated herein by reference. If the control input is not determined to be "on" (i.e., activated) in 1202, then the processor waits until it either receives an "on" indication or the image manipulating mode is exited.
  • In 1203, after receiving an indication that the control input is “on”, the processor checks to see if the cursor is positioned on (or within a predefined distance to) an object being displayed on the computer display screen. If it is not, then in 1204, the processor causes a menu of user selectable items or actions to be displayed on the computer display screen, and in 1205, the processor receives and reacts to a menu selection made by the user.
  • Examples of user selectable menu items include: magnifying glass, cut-plane, eraser, and image registration. If the user selects the magnifying glass item, then an image of a magnifying glass is displayed on the computer display screen and the method described in reference to FIG. 8 may be performed by the processor. When the user is finished with the magnifying glass function, then the user may indicate exiting of the function in any conventional manner and the processor returns to 1202.
  • If the user selects the cut-plane item, then a plane (or rectangular window of fixed or user adjustable size) is displayed on the computer display screen. The master input device may then be associated with the plane so that the user may position and orientate the plane in the three-dimensional space of the computer display screen by manipulating the master input device in the manner of a pointing device. If the plane is maneuvered so as to intersect a volume rendering of an anatomic structure, then it functions as a cut-plane defining a two-dimensional slice of the volume rendering at the intersection. Alternatively, the master input device may be associated with the volume rendering of the anatomic structure, which may then be maneuvered so as to intersect the displayed plane to define the cut-plane. Association of the plane or volume rendering with the pointing device may be performed in substantially the same manner as described in reference to the magnifying glass with respect to 801 of FIG. 8.
  • The two-dimensional slice may then be viewed either in the plane itself, or in a separate window on the computer display screen such as in a picture-in-picture. The user may further select the cut-plane item additional times to define additional two-dimensional slices of the volume rendering for concurrent viewing in respective planes or picture-in-picture windows on the computer display screen. So as not to clutter the computer display screen with unwanted cut-plane slices, a conventional delete function is provided so that the user may selectively delete any cut-planes and their corresponding slices. When the user is finished with the cut-plane function, then the user may indicate exiting of the function in any conventional manner and the processor returns to 1202.
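A cut-plane slice such as the one just described can be sampled by stepping across the plane's two in-plane axes and reading the nearest voxel at each point. The following is a rough sketch under assumed voxel-space coordinates and nearest-neighbour lookup; it is not the disclosure's rendering pipeline.

```python
import numpy as np

def cut_plane_slice(volume, origin, u, v, size, spacing=1.0):
    # Sample a size x size slice of `volume` on the plane spanned by
    # orthonormal vectors u and v through `origin` (voxel coordinates).
    # Points falling outside the volume read as 0.
    half = size // 2
    out = np.zeros((size, size), dtype=volume.dtype)
    for i in range(size):
        for j in range(size):
            p = origin + ((i - half) * u + (j - half) * v) * spacing
            idx = np.round(p).astype(int)
            if all(0 <= idx[k] < volume.shape[k] for k in range(3)):
                out[i, j] = volume[tuple(idx)]
    return out

vol = np.random.rand(64, 64, 64)   # stand-in volume rendering data
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
print(cut_plane_slice(vol, np.array([32.0, 32.0, 20.0]), u, v, 48).shape)
```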
  • If the user selects the eraser item, then an eraser is displayed on the computer display screen. The master input device is then associated with the eraser so that the user may position and orientate the eraser in the three-dimensional space of the computer display screen by manipulating the master input device in the manner of a pointing device. Association of the eraser with the pointing device in this case may be performed in substantially the same manner as described in reference to the magnifying glass with respect to 801 of FIG. 8. If the eraser is maneuvered so as to intersect a volume rendering of an anatomic structure, then it functions to either completely or partially erase such rendering wherever it traverses the volume rendering. If partial erasing is selected by the user (or otherwise pre-programmed into the processor), then each time the eraser traverses the volume rendering, less detail of the anatomic structure may be shown. Less detail in this case may refer to the coarseness/fineness of the rendering, or it may refer to the stripping away of layers in the three-dimensional volume rendering. All such characteristics or options of the erasing may be user selected using conventional means. If the user inadvertently erases a portion of the volume rendering, a conventional undo feature is provided to allow the user to undo the erasure. When the user is finished with the erasing function, then the user may indicate exiting of the function in any conventional manner and the processor returns to 1202.
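The complete-versus-partial erasing behavior could be modeled as attenuating every voxel within the eraser's radius along its path, with an attenuation of zero corresponding to full erasure. A sketch under that assumption (the spherical eraser and the attenuation constant are illustrative choices):

```python
import numpy as np

def apply_eraser(volume, path, radius, attenuation=0.0):
    # Erase (attenuation = 0) or partially fade (0 < attenuation < 1)
    # every voxel within `radius` of each eraser position along `path`.
    zz, yy, xx = np.indices(volume.shape)
    for cz, cy, cx in path:
        mask = (zz - cz)**2 + (yy - cy)**2 + (xx - cx)**2 <= radius**2
        volume[mask] *= attenuation
    return volume

vol = np.ones((32, 32, 32))
apply_eraser(vol, path=[(16, 16, 8), (16, 16, 12)], radius=4, attenuation=0.5)
print(vol.min(), vol.max())   # overlap region faded twice: 0.25; untouched: 1.0
```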
  • In addition to an eraser function as described above, other spatially localized modifying functions are also contemplated and considered to be within the full scope of the present invention, including selectively sharpening, brightening, or coloring portions of a displayed image to enhance its visibility in, or otherwise highlight, a selected area. Each such spatially localized modifying function may be performed using substantially the same method described above in reference to the eraser function.
  • If the user selects the image registration item, then the processor records such selection for future action as described below in reference to 1212 before jumping back to process 1202 again. Image registration in this case typically involves manually registering an auxiliary image of an object such as an anatomic structure with a corresponding primary image of the object.
• As an alternative to the above described menu approach, icons respectively indicating each of the selectable items described above may be displayed on the computer display screen upon entering the image manipulating mode and selected by the user clicking on them, after which the processor proceeds to perform as described above in reference to selection of their corresponding menu items.
• Now continuing with the method described in reference to FIG. 12, after receiving an indication that the control input is on in 1202 and determining that the cursor is positioned on or near an object (not an icon) being displayed on the computer display screen in 1203, the processor preferably changes the cursor from an iconic representation of a hand, for example, to that of a grasping hand to indicate that the object has been "grabbed" and is ready to be moved or "dragged" to another position and/or orientation in the three-dimensional space of the computer display screen through user manipulation of the master input device.
• In 1206, the processor then determines whether the user has indicated that a display parameter of the selected object is to be adjusted, and if the user has so indicated, in 1207, the processor performs the display adjustment. As an example, a dial on the master input device may be turned by the user to indicate that a display parameter associated with the dial is to be adjusted on the selected object according to the amount of rotation of the dial. Alternatively, if the master input device is equipped with a gripper, the gripper may be rotated so as to function as a dial. Examples of display parameters that may be adjusted in this manner include: brightness, contrast, color, and level of detail (e.g., mesh coarseness/fineness, or voxel size and/or opaqueness) of the selected object being displayed on the computer display screen.
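As a concrete illustration of the dial-driven adjustment in 1206-1207, the sketch below maps a dial rotation increment onto a clamped brightness change for the grabbed object; the gain constant and the dictionary-based object are hypothetical.

```python
def adjust_brightness(obj, dial_delta_rad, gain=0.2):
    # Map dial rotation (radians since the last poll) to a brightness
    # change, clamped to a displayable [0, 1] range.
    obj["brightness"] = min(1.0, max(0.0, obj["brightness"] + gain * dial_delta_rad))
    return obj

obj = {"brightness": 0.5}
print(adjust_brightness(obj, dial_delta_rad=0.8))   # {'brightness': 0.66}
```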
• The processor then proceeds to 1208 to determine whether the cursor has moved since "grabbing" the selected object upon the affirmative determination in 1203. If it has not moved, then the processor jumps back to 1202, since the user may only have wanted to adjust a display parameter of the selected object at this time. On the other hand, if the cursor has moved since "grabbing" the selected object, then in 1209, the processor moves the selected object to the new cursor position. Since the cursor operates in the three-dimensional space of the computer display screen, when it moves "into" the display screen, it may indicate such movement by, for example, getting progressively smaller in size. Where the three-dimensional nature of the computer display screen is achieved through the use of right and left two-dimensional views of the object, with disparities of common points between the two views indicating depth values, decreasing the depth values for the images of the cursor in the right and left views indicates that the cursor is moving "into" the display screen.
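Where depth is conveyed by left/right disparity as described above, the cursor's apparent depth can be sketched by drawing it at horizontally offset positions in the two views, with the offset shrinking as the cursor moves "into" the screen. The depth-to-disparity mapping and its constants below are illustrative assumptions, not the display's actual calibration.

```python
def cursor_views(x, y, depth, eye_sep_px=8.0, depth_scale=100.0):
    # Offset the cursor's x position oppositely in the left and right
    # views; the disparity shrinks as depth "into" the screen grows.
    disparity = eye_sep_px * depth_scale / (depth_scale + depth)
    left = (x - disparity / 2.0, y)
    right = (x + disparity / 2.0, y)
    return left, right

print(cursor_views(320, 240, depth=0))     # at the screen plane: full disparity
print(cursor_views(320, 240, depth=200))   # deeper: disparity shrinks
```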
• Optionally, in 1210, haptic feedback may be provided to the master input device so that the user may sense reflected forces while the "grabbed" object is being moved in 1209. As an example, user interactions with the object may be reflected haptically back to the user by associating a virtual mass and inertial properties with the object, so that the user feels a reflected force when coming into contact with the object or when translating or rotating the object as it is accelerated or decelerated. The haptic feedback in 1210 may be performed only for some types of objects and not for others, or it may take effect only in certain circumstances. Such haptic feedback may also be applied to the movement of the magnifying glass and/or the plane used for defining cut-planes as described above. In such cases, however, the haptic feedback may be restricted to occurring only after the magnifying glass or the plane enters an anatomic structure of interest.
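The virtual mass and inertial properties described here might be reduced to reflecting a force proportional to the grabbed object's acceleration plus a damping term. A minimal sketch with assumed mass and damping values:

```python
def reflected_force(mass, velocity, prev_velocity, dt, damping=0.5):
    # Virtual mass/inertia model: reflect F = m*a + b*v back to the
    # master input device while the grabbed object is translated.
    accel = (velocity - prev_velocity) / dt
    return mass * accel + damping * velocity

# The force grows when the user accelerates the grabbed object quickly.
print(reflected_force(mass=0.8, velocity=0.12, prev_velocity=0.05, dt=0.01))
```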
  • In 1211, the processor determines whether the control input is still in an “on” state. If the control is still “on”, then the processor jumps back to 1208 to track and respond to cursor movement. On the other hand, if the control has been turned off by, for example, the user releasing a button that was initially depressed to indicate that control was turned “on”, then in 1212, the processor performs a selected menu action.
  • For example, if the image registration item had been selected by the user in response to the processor displaying the menu in 1204 (or alternatively, the user clicking an icon indicating that item), then the object that has been moved is registered with another image of the object that is now aligned with and is being displayed on the computer display screen at the time so that they have the same coordinate and orientation values in a common reference frame such as that of the computer display screen. This feature facilitates, for example, manual registration of an auxiliary image of an anatomic structure (such as obtained using the LUS Probe 150) with a primary image of the anatomic structure (such as obtained using the Endoscope 140). After the initial registration, changes to the position and/or orientation of the corresponding object in the primary image may be mirrored so as to cause corresponding changes to the selected object in the auxiliary image so as to maintain its relative position/orientation with respect to the primary image. When the user is finished with the image registration function, then the processor returns to 1202.
  • Although the various aspects of the present invention have been described with respect to a preferred embodiment, it will be understood that the invention is entitled to full protection within the full scope of the appended claims.

Claims (72)

1. A method for displaying on a computer display screen an effect of a therapeutic procedure being applied by a therapy instrument to an anatomic structure, comprising:
generating an auxiliary image indicating the effect of the therapeutic procedure being applied by the therapy instrument to the anatomic structure; and
displaying a primary image of the anatomic structure overlaid with the auxiliary image on the computer display screen during the therapeutic procedure.
2. The method according to claim 1, wherein the therapeutic procedure is performed using a medical robotic system, and the therapy instrument is robotically manipulatable by a surgeon using the medical robotic system to perform the therapeutic procedure.
3. The method according to claim 1, wherein the primary image is captured prior to the therapeutic procedure.
4. The method according to claim 3, wherein the primary image is a pre-operative image generated by ultrasound.
5. The method according to claim 3, wherein the primary image is a pre-operative image generated by magnetic resonance imaging.
6. The method according to claim 3, wherein the primary image is a pre-operative image generated by computed axial tomography.
7. The method according to claim 3, wherein the auxiliary image is a computer model of the therapeutic effect being applied by the therapy instrument during the therapeutic procedure.
8. The method according to claim 7, wherein the computer model is a volumetric shape determined at least partially by the geometry of a therapeutic end of the therapy instrument.
9. The method according to claim 7, wherein the computer model is a volumetric shape determined at least partially by a heat level being applied to the anatomic structure by a therapeutic end of the therapy instrument.
10. The method according to claim 7, wherein the computer model is a volumetric shape determined at least partially by features of surrounding tissue of the anatomic structure being subjected to the therapeutic procedure.
11. The method according to claim 1, wherein the primary image is captured during the therapeutic procedure.
12. The method according to claim 11, wherein the primary image is an intra-operative image captured by a camera unit.
13. The method according to claim 12, wherein the camera unit includes a stereoscopic pair of cameras.
14. The method according to claim 12, wherein the camera unit is included in an endoscope.
15. The method according to claim 14, wherein the endoscope is a laparoscope.
16. The method according to claim 11, wherein the auxiliary image is a computer model of the therapeutic effect being applied by the therapy instrument during the therapeutic procedure.
17. The method according to claim 16, wherein the computer model is a volumetric shape determined at least partially by a shape of a therapeutic end of the therapy instrument.
18. The method according to claim 16, wherein the computer model is a volumetric shape determined at least partially by a heat level being applied to the anatomic structure by a therapeutic end of the therapy instrument.
19. The method according to claim 16, wherein the computer model is a volumetric shape determined at least partially by features of surrounding tissue of the anatomic structure being subjected to the therapeutic procedure.
20. The method according to claim 11, wherein the auxiliary image is an intra-operative image generated by ultrasound.
21. The method according to claim 1, wherein the therapeutic procedure destroys abnormal tissue of the anatomic structure using radio frequency ablation.
22. The method according to claim 21, wherein the abnormal tissue includes diseased tissue.
23. The method according to claim 22, wherein the diseased tissue includes at least one tumor.
24. The method according to claim 21, wherein the abnormal tissue includes damaged tissue.
25. The method according to claim 21, wherein the therapeutic procedure is one of a group consisting of radio frequency ablation, high intensity focused ultrasound, and cauterization.
26. A method for displaying a selected portion of an auxiliary image of an anatomic structure as an overlay to a primary image of the anatomic structure on a computer display screen, comprising:
associating a movable window with a pointing device such that the movable window is positionable on the computer display screen using the pointing device;
registering an auxiliary image of an anatomic structure with a primary image of the anatomic structure so as to be at a same position and orientation in a common reference frame; and
displaying the primary image on the computer display screen, and a portion of the registered auxiliary image corresponding to the same screen coordinates as the movable window as an overlay to the primary image in the movable window.
27. The method according to claim 26, wherein the primary image is captured by an image capturing device during a minimally invasive surgical procedure being performed using a medical robotic system, and the image capturing device is robotically manipulatable using the medical robotic system while performing the medical procedure.
28. The method according to claim 26, wherein the movable window appears as a circular lens on the display screen.
29. The method according to claim 26, wherein the movable window appears as a rectangular lens on the display screen.
30. The method according to claim 26, wherein the primary image is a three-dimensional image of the anatomic structure, and the computer display screen is a three-dimensional computer display screen.
31. The method according to claim 26, wherein an entire part of the registered auxiliary image corresponding to the same screen coordinates as the movable window is displayed as an overlay to the primary image in the movable window.
32. The method according to claim 26, wherein the portion of the registered auxiliary image corresponding to the same screen coordinates as the movable window is expanded so as to fit and be displayed as an overlay to the primary image in the movable window so as to appear as a magnified view of the auxiliary image.
33. The method according to claim 32, further comprising:
receiving a magnification factor selected by a user viewing the computer display screen; and
applying the magnification factor to determine the portion of the registered auxiliary image to be fitted and displayed as an overlay to the primary image in the movable window.
34. The method according to claim 26, wherein the primary and auxiliary images are three-dimensional images of the anatomic structure, and the computer display screen is a three-dimensional computer display screen.
35. The method according to claim 26, wherein the movable window is associated with a user selectable depth of the auxiliary image so that a two-dimensional slice of the auxiliary image corresponding to a depth selected by a user is displayed as an overlay to the primary image in the movable window.
36. The method according to claim 26, wherein the primary image is a pre-operative image generated by magnetic resonance imaging.
37. The method according to claim 26, wherein the primary image is a pre-operative image generated by computed axial tomography.
38. The method according to claim 26, wherein the primary image is an intra-operative image captured by a camera unit.
39. The method according to claim 38, wherein the camera unit is included in an endoscope.
40. The method according to claim 38, wherein the auxiliary image is a pre-operative captured image.
41. The method according to claim 40, wherein the pre-operative captured image is generated by magnetic resonance imaging.
42. The method according to claim 40, wherein the pre-operative captured image is generated by computed axial tomography.
43. The method according to claim 40, wherein the pre-operative captured image is generated by ultrasound.
44. The method according to claim 38, wherein the auxiliary image is an intra-operative captured image.
45. The method according to claim 44, wherein the intra-operative captured image is generated by ultrasound.
46. The method according to claim 44, wherein the intra-operative captured image is generated by a second camera unit.
47. A medical robotic system comprising:
an image capturing device for capturing images;
a robotic arm holding the image capturing device;
a computer display screen;
a master input device adapted to be manipulatable by a user in multiple degrees-of-freedom movement; and
a processor configured to control movement of the image capturing device according to user manipulation of the master input device when the master input device is in an image capturing mode, and to control the displaying of images derived from the captured images on the computer display screen according to user manipulation of the master input device when the master input device is in an image manipulating mode.
48. The medical robotic system according to claim 47, wherein the master input device is configured so as to be manipulatable in six degrees of freedom so that the master input device operates as a three-dimensional mouse when in the image manipulating mode.
49. The medical robotic system according to claim 47, wherein the processor is configured so as to perform a grabbing function on one of the derived images being displayed on the computer display screen when a user activates a control input while a cursor associated with the master input device is being displayed on the derived image, and perform a moving function on the derived image when the user moves the master input device while keeping the control input activated when in the image manipulating mode.
50. The medical robotic system according to claim 49, wherein the processor is further configured to provide haptic feedback to the master input device while performing the moving function on the derived image.
51. The medical robotic system according to claim 50, wherein the haptic feedback is provided by associating a virtual mass and inertial properties to the derived image so that the user would feel a reflected force when the processor is performing the grabbing and moving functions on the derived image in response to user manipulation of the master input device while in the image manipulating mode.
52. The medical robotic system according to claim 49, wherein the image capturing device captures auxiliary images and the processor is configured to cause a primary image captured by a primary image capturing device to be displayed on the computer display screen with at least a portion of one of the derived images overlaid over the primary image.
53. The medical robotic system according to claim 52, wherein the processor is configured to facilitate manually registering the one of the derived images with the primary image by a user performing the grabbing and moving functions on the derived image so as to register the derived image with the primary image as they are both being displayed on the computer display screen when in the image manipulating mode.
54. The medical robotic system according to claim 47, wherein the master input device has a gripper adapted to be squeezed by a hand of a user to function as a control input when the master input device is in the image manipulating mode.
55. The medical robotic system according to claim 54, wherein the processor is configured to adjust a parameter associated with the derived images when the gripper is squeezed and rotated around an axis of the gripper when the master input device is in the image manipulating mode.
56. The medical robotic system according to claim 55, wherein the adjustable parameter is a brightness of the derived image.
57. The medical robotic system according to claim 55, wherein the adjustable parameter is a color of the derived image.
58. The medical robotic system according to claim 55, wherein the adjustable parameter is a level of detail of the derived image.
59. The medical robotic system according to claim 58, wherein the level of detail of the derived image is determined by a level of coarseness of a mesh structure defining the derived image.
60. The medical robotic system according to claim 47, wherein the derived images are three-dimensional volumes generated from the captured images; and the processor is further configured to display one of the three-dimensional volumes and a two-dimensional window on the computer display screen, manipulate a position and orientation of the window on the computer display screen in response to user manipulation of the master input device, and define a cut-plane by an intersection of the window with the three-dimensional volume so as to indicate a two-dimensional slice of the three-dimensional volume.
61. The medical robotic system according to claim 60, wherein the two-dimensional slice is displayed in the window.
62. The medical robotic system according to claim 60, wherein the two-dimensional slice is displayed in a picture-in-picture window of the computer display screen.
63. The medical robotic system according to claim 60, wherein the processor is further configured to display a user selectable number of two-dimensional windows on the computer display screen, individually manipulate positions and orientations of the windows on the computer display screen in response to user manipulation of the master input device, and define cut-planes by intersections of the manipulated windows with the three-dimensional volume so as to indicate corresponding two-dimensional slices of the three-dimensional volume.
64. The medical robotic system according to claim 63, wherein the two-dimensional slices are displayed in corresponding picture-in-picture windows of the computer display screen.
65. The medical robotic system according to claim 60, wherein the processor is configured to display the two-dimensional window on the computer display screen in response to user selection of an item included in a displayed menu on the computer display screen.
66. The medical robotic system according to claim 60, wherein the processor is configured to display the two-dimensional window on the computer display screen in response to user selection of an icon being displayed on the display screen.
67. The medical robotic system according to claim 66, wherein the icon is displayed in a periphery area of the computer display screen, and the processor is further configured to interpret user mouse-type actions of clicking on the icon and dragging the icon away from the periphery area as a user selection of the icon.
68. The medical robotic system according to claim 67, wherein the image capturing device is an ultrasound probe and the derived images are three-dimensional ultrasound images of an anatomic structure that are computer generated from two-dimensional ultrasound slices captured by the ultrasound probe.
69. The medical robotic system according to claim 47, wherein the processor is further configured to display one of the derived images and an eraser image on the computer display screen, manipulate at least a position of the eraser image on the computer display screen in response to user manipulation of the master input device, and erase any portion of the displayed derived image that is traversed by the eraser image as the eraser image is being manipulated on the computer display screen.
70. The medical robotic system according to claim 69, wherein the processor is configured to erase all detail of the portion of the derived image that is traversed by the eraser image.
71. The medical robotic system according to claim 69, wherein the processor is configured to reduce the detail of the portion of the derived image that is traversed by the eraser image.
72. The medical robotic system according to claim 71, wherein the reduction of detail of the portion of the derived image that is traversed by the eraser image entails reducing the fineness of a mesh structure defining the derived image.
US11/583,963 2005-10-20 2006-10-19 Auxiliary image display and manipulation on a computer display in a medical robotic system Abandoned US20080033240A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US11/583,963 US20080033240A1 (en) 2005-10-20 2006-10-19 Auxiliary image display and manipulation on a computer display in a medical robotic system
US15/139,682 US20160235496A1 (en) 2005-10-20 2016-04-27 Auxiliary image display and manipulation on a computer display in a medical robotic system
US16/564,734 US11197731B2 (en) 2005-10-20 2019-09-09 Auxiliary image display and manipulation on a computer display in a medical robotic system
US17/530,166 US20220071721A1 (en) 2005-10-20 2021-11-18 Auxiliary image display and manipulation on a computer display in a medical robotic system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US72845005P 2005-10-20 2005-10-20
US11/583,963 US20080033240A1 (en) 2005-10-20 2006-10-19 Auxiliary image display and manipulation on a computer display in a medical robotic system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/139,682 Division US20160235496A1 (en) 2005-10-20 2016-04-27 Auxiliary image display and manipulation on a computer display in a medical robotic system

Publications (1)

Publication Number Publication Date
US20080033240A1 true US20080033240A1 (en) 2008-02-07

Family

ID=37744551

Family Applications (4)

Application Number Title Priority Date Filing Date
US11/583,963 Abandoned US20080033240A1 (en) 2005-10-20 2006-10-19 Auxiliary image display and manipulation on a computer display in a medical robotic system
US15/139,682 Abandoned US20160235496A1 (en) 2005-10-20 2016-04-27 Auxiliary image display and manipulation on a computer display in a medical robotic system
US16/564,734 Active 2026-11-20 US11197731B2 (en) 2005-10-20 2019-09-09 Auxiliary image display and manipulation on a computer display in a medical robotic system
US17/530,166 Pending US20220071721A1 (en) 2005-10-20 2021-11-18 Auxiliary image display and manipulation on a computer display in a medical robotic system

Family Applications After (3)

Application Number Title Priority Date Filing Date
US15/139,682 Abandoned US20160235496A1 (en) 2005-10-20 2016-04-27 Auxiliary image display and manipulation on a computer display in a medical robotic system
US16/564,734 Active 2026-11-20 US11197731B2 (en) 2005-10-20 2019-09-09 Auxiliary image display and manipulation on a computer display in a medical robotic system
US17/530,166 Pending US20220071721A1 (en) 2005-10-20 2021-11-18 Auxiliary image display and manipulation on a computer display in a medical robotic system

Country Status (6)

Country Link
US (4) US20080033240A1 (en)
EP (4) EP3155998B1 (en)
JP (4) JP5322648B2 (en)
KR (1) KR101320379B1 (en)
CN (3) CN101291635B (en)
WO (1) WO2007047782A2 (en)

Cited By (105)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050200324A1 (en) * 1999-04-07 2005-09-15 Intuitive Surgical Inc. Non-force reflecting method for providing tool force information to a user of a telesurgical system
US20060258938A1 (en) * 2005-05-16 2006-11-16 Intuitive Surgical Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US20080004603A1 (en) * 2006-06-29 2008-01-03 Intuitive Surgical Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US20080065109A1 (en) * 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Preventing instrument/tissue collisions
US20080234866A1 (en) * 2007-03-20 2008-09-25 Kosuke Kishi Master-slave manipulator system
US20090069804A1 (en) * 2007-09-12 2009-03-12 Jensen Jeffrey L Apparatus for efficient power delivery
US20090088774A1 (en) * 2007-09-30 2009-04-02 Nitish Swarup Apparatus and method of user interface with alternate tool mode for robotic surgical tools
US20090141966A1 (en) * 2007-11-30 2009-06-04 Microsoft Corporation Interactive geo-positioning of imagery
US20090192524A1 (en) * 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical robot
US20090248036A1 (en) * 2008-03-28 2009-10-01 Intuitive Surgical, Inc. Controlling a robotic surgical tool with a display monitor
US20090326556A1 (en) * 2008-06-27 2009-12-31 Intuitive Surgical, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
US20090326318A1 (en) * 2008-06-27 2009-12-31 Intuitive Surgical, Inc. Medical robotic system providing an auxilary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US20090326553A1 (en) * 2008-06-27 2009-12-31 Intuitive Surgical, Inc. Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide
US20100082039A1 (en) * 2008-09-26 2010-04-01 Intuitive Surgical, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US20100139808A1 (en) * 2007-11-26 2010-06-10 Thompson Ray P Special articulating tool holder
US20100149183A1 (en) * 2006-12-15 2010-06-17 Loewke Kevin E Image mosaicing systems and methods
US20100164950A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Efficient 3-d telestration for local robotic proctoring
US20100166323A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Robust sparse image matching for robotic surgery
US20100274087A1 (en) * 2007-06-13 2010-10-28 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US20100285438A1 (en) * 2009-03-12 2010-11-11 Thenkurussi Kesavadas Method And System For Minimally-Invasive Surgery Training
US20100318099A1 (en) * 2009-06-16 2010-12-16 Intuitive Surgical, Inc. Virtual measurement tool for minimally invasive surgery
US20100331855A1 (en) * 2005-05-16 2010-12-30 Intuitive Surgical, Inc. Efficient Vision and Kinematic Data Fusion For Robotic Surgical Instruments and Other Applications
US20110040305A1 (en) * 2009-08-15 2011-02-17 Intuitive Surgical, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US20110050852A1 (en) * 2005-12-30 2011-03-03 Intuitive Surgical Operations, Inc. Stereo telestration for robotic surgery
US20110202068A1 (en) * 2010-02-12 2011-08-18 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
WO2011104135A1 (en) * 2010-02-25 2011-09-01 Siemens Aktiengesellschaft Method for displaying an area to be examined and/or treated
US20110234754A1 (en) * 2008-11-24 2011-09-29 Koninklijke Philips Electronics N.V. Combining 3d video and auxiliary data
WO2012060901A1 (en) * 2010-11-04 2012-05-10 The Johns Hopkins University System and method for the evaluation of or improvement of minimally invasive surgery skills
CN103054612A (en) * 2012-12-10 2013-04-24 苏州佳世达电通有限公司 Ultrasonic probe mouse and ultrasonoscope
US20130314418A1 (en) * 2012-05-24 2013-11-28 Siemens Medical Solutions Usa, Inc. System for Erasing Medical Image Features
US8675939B2 (en) 2010-07-13 2014-03-18 Stryker Leibinger Gmbh & Co. Kg Registration of anatomical data sets
US20140142593A1 (en) * 2007-04-16 2014-05-22 Tim Fielding Frame Mapping and Force Feedback Methods, Devices and Systems
US8792963B2 (en) 2007-09-30 2014-07-29 Intuitive Surgical Operations, Inc. Methods of determining tissue distances using both kinematic robotic tool position information and image-derived position information
US20140330432A1 (en) * 2012-04-20 2014-11-06 Vanderbilt University Systems and methods for safe compliant insertion and hybrid force/motion telemanipulation of continuum robots
US8903546B2 (en) 2009-08-15 2014-12-02 Intuitive Surgical Operations, Inc. Smooth control of an articulated instrument across areas with different work space conditions
US20140358161A1 (en) * 1999-09-17 2014-12-04 Intuitive Surgical Operations, Inc. Systems and methods for commanded reconfiguration of a surgical manipulator using the null-space
US9138129B2 (en) 2007-06-13 2015-09-22 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US9161772B2 (en) 2011-08-04 2015-10-20 Olympus Corporation Surgical instrument and medical manipulator
US20150313445A1 (en) * 2014-05-01 2015-11-05 Endochoice, Inc. System and Method of Scanning a Body Cavity Using a Multiple Viewing Elements Endoscope
US9218053B2 (en) 2011-08-04 2015-12-22 Olympus Corporation Surgical assistant system
US9244524B2 (en) 2011-08-04 2016-01-26 Olympus Corporation Surgical instrument and control method thereof
US9244523B2 (en) 2011-08-04 2016-01-26 Olympus Corporation Manipulator system
US20160026266A1 (en) * 2006-12-28 2016-01-28 David Byron Douglas Method and apparatus for three dimensional viewing of images
US9259204B2 (en) 2011-05-30 2016-02-16 General Electric Company Ultrasound diagnostic apparatus and method of displaying medical image thereof
US9277968B2 (en) 2011-12-09 2016-03-08 Samsung Electronics Co., Ltd. Medical robot system and method for controlling the same
WO2016049294A1 (en) * 2014-09-25 2016-03-31 The Johns Hopkins University Surgical system user interface using cooperatively-controlled robot
US20160235285A1 (en) * 2013-10-30 2016-08-18 Olympus Corporation Endoscope apparatus
US9423869B2 (en) 2011-08-04 2016-08-23 Olympus Corporation Operation support device
US9469034B2 (en) 2007-06-13 2016-10-18 Intuitive Surgical Operations, Inc. Method and system for switching modes of a robotic system
US9477301B2 (en) 2011-08-04 2016-10-25 Olympus Corporation Operation support device and assembly method thereof
US9486189B2 (en) 2010-12-02 2016-11-08 Hitachi Aloka Medical, Ltd. Assembly for use with surgery system
US9492927B2 (en) 2009-08-15 2016-11-15 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9519341B2 (en) 2011-08-04 2016-12-13 Olympus Corporation Medical manipulator and surgical support apparatus
US9524022B2 (en) 2011-08-04 2016-12-20 Olympus Corporation Medical equipment
US9549720B2 (en) 2012-04-20 2017-01-24 Vanderbilt University Robotic device for establishing access channel
US9568992B2 (en) 2011-08-04 2017-02-14 Olympus Corporation Medical manipulator
US9586323B2 (en) 2012-02-15 2017-03-07 Intuitive Surgical Operations, Inc. User selection of robotic system operating modes using mode distinguishing operator actions
US9632577B2 (en) 2011-08-04 2017-04-25 Olympus Corporation Operation support device and control method thereof
US9632573B2 (en) 2011-08-04 2017-04-25 Olympus Corporation Medical manipulator and method of controlling the same
US9671860B2 (en) 2011-08-04 2017-06-06 Olympus Corporation Manipulation input device and manipulator system having the same
US9687303B2 (en) 2012-04-20 2017-06-27 Vanderbilt University Dexterous wrists for surgical intervention
US9699445B2 (en) 2008-03-28 2017-07-04 Intuitive Surgical Operations, Inc. Apparatus for automated panning and digital zooming in robotic surgical systems
US9713460B2 (en) 2013-05-02 2017-07-25 Samsung Medison Co., Ltd. Ultrasound system and method for providing change information of target object
US9788909B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc Synthetic representation of a surgical instrument
US9814392B2 (en) 2009-10-30 2017-11-14 The Johns Hopkins University Visual tracking and annotation of clinically important anatomical landmarks for surgical interventions
US9851782B2 (en) 2011-08-04 2017-12-26 Olympus Corporation Operation support device and attachment and detachment method thereof
US9956042B2 (en) 2012-01-13 2018-05-01 Vanderbilt University Systems and methods for robot-assisted transurethral exploration and intervention
US10008017B2 (en) 2006-06-29 2018-06-26 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
US10057590B2 (en) * 2014-01-13 2018-08-21 Mediatek Inc. Method and apparatus using software engine and hardware engine collaborated with each other to achieve hybrid video encoding
US20180254099A1 (en) * 2017-03-03 2018-09-06 University of Maryland Medical Center Universal device and method to integrate diagnostic testing into treatment in real-time
US20190110855A1 (en) * 2017-10-17 2019-04-18 Verily Life Sciences Llc Display of preoperative and intraoperative images
US10272270B2 (en) 2012-04-12 2019-04-30 Koninklijke Philips N.V. Coordinate transformation of graphical objects registered to a magnetic resonance image
US20190167221A1 (en) * 2016-08-18 2019-06-06 Stryker European Holdings I, Llc Method For Visualizing A Bone
US10390728B2 (en) * 2014-03-31 2019-08-27 Canon Medical Systems Corporation Medical image diagnosis apparatus
US20190282308A1 (en) * 2002-03-20 2019-09-19 P Tech, Llc Robotic surgery
US10420575B2 (en) 2014-07-25 2019-09-24 Olympus Corporation Treatment tool and treatment tool system
US10507066B2 (en) 2013-02-15 2019-12-17 Intuitive Surgical Operations, Inc. Providing information of tools by filtering image areas adjacent to or on displayed images of the tools
US10571671B2 (en) * 2014-03-31 2020-02-25 Sony Corporation Surgical control device, control method, and imaging control system
US20200073526A1 (en) * 2018-08-28 2020-03-05 Johnson Controls Technology Company Energy management system with draggable and non-draggable building component user interface elements
US10682191B2 (en) 2012-06-01 2020-06-16 Intuitive Surgical Operations, Inc. Systems and methods for commanded reconfiguration of a surgical manipulator using the null-space
US10795457B2 (en) 2006-12-28 2020-10-06 D3D Technologies, Inc. Interactive 3D cursor
US20210030491A1 (en) * 2014-11-13 2021-02-04 Intuitive Surgical Operations, Inc. Interaction between user-interface and master controller
US10967504B2 (en) 2017-09-13 2021-04-06 Vanderbilt University Continuum robots with multi-scale motion through equilibrium modulation
US11013480B2 (en) 2012-06-28 2021-05-25 Koninklijke Philips N.V. C-arm trajectory planning for optimal image acquisition in endoscopic surgery
US11197728B2 (en) 2018-09-17 2021-12-14 Auris Health, Inc. Systems and methods for concomitant medical procedures
US11197731B2 (en) 2005-10-20 2021-12-14 Intuitive Surgical Operations, Inc. Auxiliary image display and manipulation on a computer display in a medical robotic system
US20220001092A1 (en) * 2017-08-21 2022-01-06 RELIGN Corporation Arthroscopic devices and methods
US11228753B1 (en) 2006-12-28 2022-01-18 Robert Edwin Douglas Method and apparatus for performing stereoscopic zooming on a head display unit
US11275242B1 (en) 2006-12-28 2022-03-15 Tipping Point Medical Images, Llc Method and apparatus for performing stereoscopic rotation of a volume on a head display unit
US11278182B2 (en) * 2012-06-28 2022-03-22 Koninklijke Philips N.V. Enhanced visualization of blood vessels using a robotically steered endoscope
US11315307B1 (en) 2006-12-28 2022-04-26 Tipping Point Medical Images, Llc Method and apparatus for performing rotating viewpoints using a head display unit
US11317979B2 (en) * 2014-03-17 2022-05-03 Intuitive Surgical Operations, Inc. Systems and methods for offscreen indication of instruments in a teleoperational medical system
US11485012B2 (en) * 2019-12-12 2022-11-01 Seiko Epson Corporation Control method and robot system
US11504197B1 (en) 2021-03-31 2022-11-22 Moon Surgical Sas Co-manipulation surgical system having multiple operational modes for use with surgical instruments for performing laparoscopic surgery
USD981425S1 (en) * 2020-09-30 2023-03-21 Karl Storz Se & Co. Kg Display screen with graphical user interface
US11723734B2 (en) 2014-11-13 2023-08-15 Intuitive Surgical Operations, Inc. User-interface control using master controller
US11793394B2 (en) 2016-12-02 2023-10-24 Vanderbilt University Steerable endoscope with continuum manipulator
US11812938B2 (en) 2021-03-31 2023-11-14 Moon Surgical Sas Co-manipulation surgical system having a coupling mechanism removeably attachable to surgical instruments
US11819302B2 (en) 2021-03-31 2023-11-21 Moon Surgical Sas Co-manipulation surgical system having user guided stage control
US11832910B1 (en) 2023-01-09 2023-12-05 Moon Surgical Sas Co-manipulation surgical system having adaptive gravity compensation
US11832909B2 (en) 2021-03-31 2023-12-05 Moon Surgical Sas Co-manipulation surgical system having actuatable setup joints
US11844583B2 (en) 2021-03-31 2023-12-19 Moon Surgical Sas Co-manipulation surgical system having an instrument centering mode for automatic scope movements
US11903650B2 (en) 2019-09-11 2024-02-20 Ardeshir Rastinehad Method for providing clinical support for surgical guidance during robotic surgery
USD1022197S1 (en) 2020-11-19 2024-04-09 Auris Health, Inc. Endoscope

Families Citing this family (77)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102008041867B4 (en) 2008-09-08 2015-09-10 Deutsches Zentrum für Luft- und Raumfahrt e.V. Medical workstation and operating device for manually moving a robot arm
KR101039108B1 (en) * 2009-09-01 2011-06-07 한양대학교 산학협력단 Medical robot system and Method for controlling the same
WO2011040769A2 (en) * 2009-10-01 2011-04-07 주식회사 이턴 Surgical image processing device, image-processing method, laparoscopic manipulation method, surgical robot system and an operation-limiting method therefor
KR101683057B1 (en) * 2009-10-30 2016-12-07 (주)미래컴퍼니 Surgical robot system and motion restriction control method thereof
KR101598774B1 (en) * 2009-10-01 2016-03-02 (주)미래컴퍼니 Apparatus and Method for processing surgical image
US8706184B2 (en) * 2009-10-07 2014-04-22 Intuitive Surgical Operations, Inc. Methods and apparatus for displaying enhanced imaging data on a clinical image
US9298260B2 (en) * 2010-03-12 2016-03-29 Broadcom Corporation Tactile communication system with communications based on capabilities of a remote system
EP2729084A4 (en) 2011-07-07 2015-03-04 Olympus Corp Medical master slave manipulator
JP5892361B2 (en) * 2011-08-02 2016-03-23 ソニー株式会社 Control device, control method, program, and robot control system
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
US9216068B2 (en) 2012-06-27 2015-12-22 Camplex, Inc. Optics for video cameras on a surgical visualization system
US9642606B2 (en) 2012-06-27 2017-05-09 Camplex, Inc. Surgical visualization system
US9076227B2 (en) * 2012-10-01 2015-07-07 Mitsubishi Electric Research Laboratories, Inc. 3D object tracking in multiple 2D sequences
US9386908B2 (en) * 2013-01-29 2016-07-12 Gyrus Acmi, Inc. (D.B.A. Olympus Surgical Technologies America) Navigation using a pre-acquired image
AU2014233662B2 (en) * 2013-03-15 2019-01-17 Sri International Hyperdexterous surgical system
EP4331519A2 (en) * 2013-03-15 2024-03-06 Medtronic Holding Company Sàrl A system for treating tissue
EP2999414B1 (en) 2013-05-21 2018-08-08 Camplex, Inc. Surgical visualization systems
US10881286B2 (en) 2013-09-20 2021-01-05 Camplex, Inc. Medical apparatus for use with a surgical tubular retractor
JP5781135B2 (en) * 2013-09-27 2015-09-16 エフ・エーシステムエンジニアリング株式会社 3D navigation video generation device
JP6358463B2 (en) 2013-11-13 2018-07-18 パナソニックIpマネジメント株式会社 Master device for master-slave device, control method therefor, and master-slave device
EP3096692B1 (en) * 2014-01-24 2023-06-14 Koninklijke Philips N.V. Virtual image with optical shape sensing device perspective
US10083278B2 (en) * 2014-02-12 2018-09-25 Edda Technology, Inc. Method and system for displaying a timing signal for surgical instrument insertion in surgical procedures
KR102356213B1 (en) 2014-03-17 2022-01-28 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Guided setup for teleoperated medical device
CN106456258B (en) * 2014-03-17 2019-05-10 直观外科手术操作公司 Remotely operate the automatic structure with pre-established arm position in medical system
CN106456271B (en) 2014-03-28 2019-06-28 直观外科手术操作公司 The quantitative three-dimensional imaging and printing of surgery implant
JP6854237B2 (en) 2014-03-28 2021-04-07 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Quantitative 3D visualization of instruments in the field of view
CN110897590B (en) 2014-03-28 2021-11-16 直观外科手术操作公司 Surgical system with haptic feedback based on quantitative three-dimensional imaging
WO2015149040A1 (en) 2014-03-28 2015-10-01 Dorin Panescu Quantitative three-dimensional imaging of surgical scenes
CN106535806B (en) 2014-03-28 2019-06-18 直观外科手术操作公司 The quantitative three-dimensional imaging of surgical scene from multiport visual angle
CN105321202A (en) * 2014-07-16 2016-02-10 南京普爱射线影像设备有限公司 Medical two-dimensional image and 3D image display software system
WO2016090336A1 (en) 2014-12-05 2016-06-09 Camplex, Inc. Surgical visualization systems and displays
WO2016154589A1 (en) 2015-03-25 2016-09-29 Camplex, Inc. Surgical visualization systems and displays
CN104887175A (en) * 2015-06-03 2015-09-09 皖南医学院 Virtual gastroscopy and diagnosis system
EP3355824A1 (en) * 2015-09-29 2018-08-08 Koninklijke Philips N.V. Instrument controller for robotically assisted minimally invasive surgery
US20180271613A1 (en) * 2015-10-02 2018-09-27 Sony Corporation Medical control apparatus, control method, program, and medical control system
WO2017091704A1 (en) * 2015-11-25 2017-06-01 Camplex, Inc. Surgical visualization systems and displays
CN105376503B (en) * 2015-12-14 2018-07-20 北京医千创科技有限公司 Surgical image processing apparatus and method
KR20230141937A (en) * 2016-06-09 2023-10-10 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Computer-assisted teleoperated surgical system and method
CN114027987A (en) 2016-06-30 2022-02-11 直观外科手术操作公司 Graphical user interface for displaying instructional information in multiple modes during an image guidance procedure
KR102387222B1 (en) 2016-06-30 2022-04-18 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Graphical user interface for displaying guidance information during video-guided procedures
KR102549728B1 (en) * 2016-07-14 2023-07-03 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Systems and methods for onscreen menus in a teleoperational medical system
KR101715026B1 (en) * 2016-08-26 2017-03-13 (주)미래컴퍼니 Surgical robot system and motion restriction control method thereof
GB2568616B (en) * 2016-11-01 2019-12-25 Bio Medical Eng Hk Ltd Surgical robotic devices and systems for use in performing minimally invasive and natural orifice transluminal endoscopic surgical actions
EP3582709A4 (en) * 2017-02-14 2020-11-25 Intuitive Surgical Operations Inc. Multi-dimensional visualization in computer-assisted tele-operated surgery
CN110621252B (en) * 2017-04-18 2024-03-15 直观外科手术操作公司 Graphical user interface for monitoring image-guided procedures
US10918455B2 (en) 2017-05-08 2021-02-16 Camplex, Inc. Variable light source
CN110913749B (en) * 2017-07-03 2022-06-24 富士胶片株式会社 Medical image processing device, endoscope device, diagnosis support device, medical service support device, and report creation support device
EP4279013A3 (en) * 2017-08-22 2023-11-29 Intuitive Surgical Operations, Inc. User-installable part installation detection techniques
US20200261180A1 (en) * 2017-09-06 2020-08-20 Covidien LP Systems, methods, and computer-readable media for providing stereoscopic visual perception notifications and/or recommendations during a robotic surgical procedure
CN111356407A (en) * 2017-10-20 2020-06-30 昆山华大智造云影医疗科技有限公司 Ultrasonic detection device, ultrasonic control device, ultrasonic system and ultrasonic imaging method
US10959744B2 (en) 2017-10-30 2021-03-30 Ethicon LLC Surgical dissectors and manufacturing techniques
US11896443B2 (en) * 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11612444B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Adjustment of a surgical device function based on situational awareness
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11026751B2 (en) 2017-12-28 2021-06-08 Cilag Gmbh International Display of alignment of staple cartridge to prior linear staple line
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US20190201139A1 (en) 2017-12-28 2019-07-04 Ethicon LLC Communication arrangements for robot-assisted surgical platforms
US11389188B2 (en) 2018-03-08 2022-07-19 Cilag Gmbh International Start temperature of blade
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
CN108836392B (en) * 2018-03-30 2021-06-22 中国科学院深圳先进技术研究院 Ultrasonic imaging method, device and equipment based on ultrasonic RF signal and storage medium
EP3793465A4 (en) 2018-05-18 2022-03-02 Auris Health, Inc. Controllers for robotically-enabled teleoperated systems
CN109330690B (en) * 2018-07-31 2020-06-16 深圳市精锋医疗科技有限公司 Slave operating equipment assembly and surgical robot
WO2020054566A1 (en) * 2018-09-11 2020-03-19 ソニー株式会社 Medical observation system, medical observation device and medical observation method
US11529038B2 (en) 2018-10-02 2022-12-20 Elements Endoscopy, Inc. Endoscope with inertial measurement units and/or haptic input controls
CN109498162B (en) * 2018-12-20 2023-11-03 深圳市精锋医疗科技股份有限公司 Master operating console with improved immersion, and surgical robot
US11331100B2 (en) 2019-02-19 2022-05-17 Cilag Gmbh International Staple cartridge retainer system with authentication keys
WO2020243425A1 (en) * 2019-05-31 2020-12-03 Intuitive Surgical Operations, Inc. Composite medical imaging systems and methods
KR102247545B1 (en) * 2019-07-24 2021-05-03 경북대학교 산학협력단 Surgical Location Information Providing Method and Device Thereof
US11918307B1 (en) * 2019-11-15 2024-03-05 Verily Life Sciences Llc Integrating applications in a surgeon console user interface of a robotic surgical system
US11931119B1 (en) 2019-11-15 2024-03-19 Verily Life Sciences Llc Integrating applications in a surgeon console user interface of a robotic surgical system
US20220087763A1 (en) * 2020-09-23 2022-03-24 Verb Surgical Inc. Deep disengagement detection during telesurgery
CN114601564B (en) * 2020-10-08 2023-08-22 深圳市精锋医疗科技股份有限公司 Surgical robot, graphical control device thereof and graphical display method thereof
CN112957107B (en) * 2021-02-19 2021-11-30 南昌华安众辉健康科技有限公司 Pleuroperitoneal cavity surgical instrument with laparoscope
CN113925615A (en) * 2021-10-26 2022-01-14 北京歌锐科技有限公司 Minimally invasive surgery equipment and control method thereof
US11717149B1 (en) 2022-04-27 2023-08-08 Maciej J. Kieturakis Methods and systems for robotic single-port laparoscopic access
CN115363751B (en) * 2022-08-12 2023-05-16 华平祥晟(上海)医疗科技有限公司 Intraoperative anatomical structure indication method

Citations (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5279309A (en) * 1991-06-13 1994-01-18 International Business Machines Corporation Signaling device and method for monitoring positions in a surgical operation
US5397323A (en) * 1992-10-30 1995-03-14 International Business Machines Corporation Remote center-of-motion robot for surgery
US5417210A (en) * 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5493595A (en) * 1982-02-24 1996-02-20 Schoolman Scientific Corp. Stereoscopically displayed three dimensional medical imaging
US5551432A (en) * 1995-06-19 1996-09-03 New York Eye & Ear Infirmary Scanning control system for ultrasound biomicroscopy
US5759153A (en) * 1992-06-30 1998-06-02 Cardiovascular Imaging Systems, Inc. Automated longitudinal position translator for ultrasonic imaging probes, and methods of using same
US5765561A (en) * 1994-10-07 1998-06-16 Medical Media Systems Video-based surgical targeting system
US5788688A (en) * 1992-11-05 1998-08-04 Bauer Laboratories, Inc. Surgeon's command and control
US5797849A (en) * 1995-03-28 1998-08-25 Sonometrics Corporation Method for carrying out a medical procedure using a three-dimensional tracking and imaging system
US5810008A (en) * 1996-12-03 1998-09-22 ISG Technologies Inc. Apparatus and method for visualizing ultrasonic images
US5817022A (en) * 1995-03-28 1998-10-06 Sonometrics Corporation System for displaying a 2-D ultrasound image within a 3-D viewing environment
US5836880A (en) * 1995-02-27 1998-11-17 Micro Chemical, Inc. Automated system for measuring internal tissue characteristics in feed animals
US5842993A (en) * 1997-12-10 1998-12-01 The Whitaker Corporation Navigable ultrasonic imaging probe assembly
US5842473A (en) * 1993-11-29 1998-12-01 Life Imaging Systems Three-dimensional imaging system
US5853367A (en) * 1997-03-17 1998-12-29 General Electric Company Task-interface and communications system and method for ultrasound imager control
US5887121A (en) * 1995-04-21 1999-03-23 International Business Machines Corporation Method of constrained Cartesian control of robotic mechanisms with active and passive joints
US6129670A (en) * 1997-11-24 2000-10-10 Burdette Medical Systems Real time brachytherapy spatial registration and visualization system
US6241725B1 (en) * 1993-12-15 2001-06-05 Sherwood Services Ag High frequency thermal ablation of cancerous tumors and functional targets with image data assistance
US6256529B1 (en) * 1995-07-26 2001-07-03 Burdette Medical Systems, Inc. Virtual reality 3D visualization for surgical procedures
US6312391B1 (en) * 2000-02-16 2001-11-06 Urologix, Inc. Thermodynamic modeling of tissue treatment procedure
US6402737B1 (en) * 1998-03-19 2002-06-11 Hitachi, Ltd. Surgical apparatus
US20020193800A1 (en) * 2001-06-11 2002-12-19 Kienzle Thomas C. Surgical drill for use with a computer assisted surgery system
US6522906B1 (en) * 1998-12-08 2003-02-18 Intuitive Surgical, Inc. Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US6599247B1 (en) * 2000-07-07 2003-07-29 University Of Pittsburgh System and method for location-merging of real-time tomographic slice images with human vision
US6602185B1 (en) * 1999-02-18 2003-08-05 Olympus Optical Co., Ltd. Remote surgery support system
US6642836B1 (en) * 1996-08-06 2003-11-04 Computer Motion, Inc. General purpose distributed operating room control system
US6799065B1 (en) * 1998-12-08 2004-09-28 Intuitive Surgical, Inc. Image shifting apparatus and method for a telerobotic system
US6837883B2 (en) * 1998-11-20 2005-01-04 Intuitive Surgical, Inc. Arm cart for telerobotic surgical system
US20060058988A1 (en) * 2001-05-29 2006-03-16 Defranoux Nadine A Method and apparatus for computer modeling a joint
US7107124B2 (en) * 1992-01-21 2006-09-12 Sri International Roll-pitch-roll wrist methods for minimally invasive robotic surgery
US20080020362A1 (en) * 2004-08-10 2008-01-24 Cotin Stephane M Methods and Apparatus for Simulation of Endovascular and Endoluminal Procedures
US7413565B2 (en) * 2002-01-16 2008-08-19 Intuitive Surgical, Inc. Minimally invasive surgical training using robotics and telecollaboration

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5181514A (en) 1991-05-21 1993-01-26 Hewlett-Packard Company Transducer positioning system
US5182728A (en) * 1991-06-28 1993-01-26 Acoustic Imaging Technologies Corporation Ultrasound imaging system and method
JPH07508449A (en) * 1993-04-20 1995-09-21 ゼネラル・エレクトリック・カンパニイ Computer graphics and live video systems to better visualize body structures during surgical procedures
JPH08111816A (en) * 1994-10-11 1996-04-30 Toshiba Corp Medical image display device
JPH08275958A (en) * 1995-04-07 1996-10-22 Olympus Optical Co Ltd Surgical manipulator device
JPH09173352A (en) * 1995-12-25 1997-07-08 Toshiba Medical Eng Co Ltd Medical navigation system
US5797900A (en) 1996-05-20 1998-08-25 Intuitive Surgical, Inc. Wrist mechanism for surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity
JPH10322629A (en) * 1997-05-16 1998-12-04 Canon Inc Image pickup device, image pickup system and storage medium
US6950689B1 (en) * 1998-08-03 2005-09-27 Boston Scientific Scimed, Inc. Dynamically alterable three-dimensional graphical model of a body region
US6468265B1 (en) * 1998-11-20 2002-10-22 Intuitive Surgical, Inc. Performing cardiac surgery without cardioplegia
JP2000300579A (en) * 1999-04-26 2000-10-31 Olympus Optical Co Ltd Multifunctional manipulator
US7831292B2 (en) * 2002-03-06 2010-11-09 Mako Surgical Corp. Guidance system and method for surgical procedures with improved feedback
AU2003257309A1 (en) * 2002-08-13 2004-02-25 Microbotics Corporation Microsurgical robot system
US20060161218A1 (en) * 2003-11-26 2006-07-20 Wicab, Inc. Systems and methods for treating traumatic brain injury
JP4377827B2 (en) * 2004-03-30 2009-12-02 株式会社東芝 Manipulator device
JP2006055273A (en) * 2004-08-18 2006-03-02 Olympus Corp Surgery support system
US7396129B2 (en) * 2004-11-22 2008-07-08 Carestream Health, Inc. Diagnostic system having gaze tracking
JP2006320427A (en) * 2005-05-17 2006-11-30 Hitachi Medical Corp Endoscopic operation support system
JP2006321027A (en) * 2005-05-20 2006-11-30 Hitachi Ltd Master slave type manipulator system and its operation input device
JP4398405B2 (en) * 2005-05-30 2010-01-13 アロカ株式会社 Medical system
EP3679882A1 (en) * 2005-06-06 2020-07-15 Intuitive Surgical Operations, Inc. Laparoscopic ultrasound robotic surgical system
CN101291635B (en) 2005-10-20 2013-03-27 直观外科手术操作公司 Auxiliary image display and manipulation on a computer display in a medical robotic system

Patent Citations (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5493595A (en) * 1982-02-24 1996-02-20 Schoolman Scientific Corp. Stereoscopically displayed three dimensional medical imaging
US6201984B1 (en) * 1991-06-13 2001-03-13 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5279309A (en) * 1991-06-13 1994-01-18 International Business Machines Corporation Signaling device and method for monitoring positions in a surgical operation
US7107124B2 (en) * 1992-01-21 2006-09-12 Sri International Roll-pitch-roll wrist methods for minimally invasive robotic surgery
US5417210A (en) * 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5572999A (en) * 1992-05-27 1996-11-12 International Business Machines Corporation Robotic system for positioning a surgical instrument relative to a patient's body
US5749362A (en) * 1992-05-27 1998-05-12 International Business Machines Corporation Method of creating an image of an anatomical feature where the feature is within a patient's body
US5759153A (en) * 1992-06-30 1998-06-02 Cardiovascular Imaging Systems, Inc. Automated longitudinal position translator for ultrasonic imaging probes, and methods of using same
US5397323A (en) * 1992-10-30 1995-03-14 International Business Machines Corporation Remote center-of-motion robot for surgery
US5788688A (en) * 1992-11-05 1998-08-04 Bauer Laboratories, Inc. Surgeon's command and control
US5842473A (en) * 1993-11-29 1998-12-01 Life Imaging Systems Three-dimensional imaging system
US6241725B1 (en) * 1993-12-15 2001-06-05 Sherwood Services Ag High frequency thermal ablation of cancerous tumors and functional targets with image data assistance
US5765561A (en) * 1994-10-07 1998-06-16 Medical Media Systems Video-based surgical targeting system
US5836880A (en) * 1995-02-27 1998-11-17 Micro Chemical, Inc. Automated system for measuring internal tissue characteristics in feed animals
US5817022A (en) * 1995-03-28 1998-10-06 Sonometrics Corporation System for displaying a 2-D ultrasound image within a 3-D viewing environment
US5797849A (en) * 1995-03-28 1998-08-25 Sonometrics Corporation Method for carrying out a medical procedure using a three-dimensional tracking and imaging system
US5887121A (en) * 1995-04-21 1999-03-23 International Business Machines Corporation Method of constrained Cartesian control of robotic mechanisms with active and passive joints
US5551432A (en) * 1995-06-19 1996-09-03 New York Eye & Ear Infirmary Scanning control system for ultrasound biomicroscopy
US6256529B1 (en) * 1995-07-26 2001-07-03 Burdette Medical Systems, Inc. Virtual reality 3D visualization for surgical procedures
US6642836B1 (en) * 1996-08-06 2003-11-04 Computer Motion, Inc. General purpose distributed operating room control system
US5810008A (en) * 1996-12-03 1998-09-22 ISG Technologies Inc. Apparatus and method for visualizing ultrasonic images
US5853367A (en) * 1997-03-17 1998-12-29 General Electric Company Task-interface and communications system and method for ultrasound imager control
US6129670A (en) * 1997-11-24 2000-10-10 Burdette Medical Systems Real time brachytherapy spatial registration and visualization system
US5842993A (en) * 1997-12-10 1998-12-01 The Whitaker Corporation Navigable ultrasonic imaging probe assembly
US6402737B1 (en) * 1998-03-19 2002-06-11 Hitachi, Ltd. Surgical apparatus
US6837883B2 (en) * 1998-11-20 2005-01-04 Intuitive Surgical, Inc. Arm cart for telerobotic surgical system
US6522906B1 (en) * 1998-12-08 2003-02-18 Intuitive Surgical, Inc. Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
US6799065B1 (en) * 1998-12-08 2004-09-28 Intuitive Surgical, Inc. Image shifting apparatus and method for a telerobotic system
US6602185B1 (en) * 1999-02-18 2003-08-05 Olympus Optical Co., Ltd. Remote surgery support system
US6312391B1 (en) * 2000-02-16 2001-11-06 Urologix, Inc. Thermodynamic modeling of tissue treatment procedure
US6599247B1 (en) * 2000-07-07 2003-07-29 University Of Pittsburgh System and method for location-merging of real-time tomographic slice images with human vision
US20060058988A1 (en) * 2001-05-29 2006-03-16 Defranoux Nadine A Method and apparatus for computer modeling a joint
US20020193800A1 (en) * 2001-06-11 2002-12-19 Kienzle Thomas C. Surgical drill for use with a computer assisted surgery system
US7413565B2 (en) * 2002-01-16 2008-08-19 Intuitive Surgical, Inc. Minimally invasive surgical training using robotics and telecollaboration
US20080020362A1 (en) * 2004-08-10 2008-01-24 Cotin Stephane M Methods and Apparatus for Simulation of Endovascular and Endoluminal Procedures

Cited By (219)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10271909B2 (en) 1999-04-07 2019-04-30 Intuitive Surgical Operations, Inc. Display of computer generated image of an out-of-view portion of a medical device adjacent a real-time image of an in-view portion of the medical device
US8944070B2 (en) 1999-04-07 2015-02-03 Intuitive Surgical Operations, Inc. Non-force reflecting method for providing tool force information to a user of a telesurgical system
US9101397B2 (en) 1999-04-07 2015-08-11 Intuitive Surgical Operations, Inc. Real-time generation of three-dimensional ultrasound image using a two-dimensional ultrasound transducer in a robotic system
US20050200324A1 (en) * 1999-04-07 2005-09-15 Intuitive Surgical Inc. Non-force reflecting method for providing tool force information to a user of a telesurgical system
US9232984B2 (en) 1999-04-07 2016-01-12 Intuitive Surgical Operations, Inc. Real-time generation of three-dimensional ultrasound image using a two-dimensional ultrasound transducer in a robotic system
US10433919B2 (en) 1999-04-07 2019-10-08 Intuitive Surgical Operations, Inc. Non-force reflecting method for providing tool force information to a user of a telesurgical system
US20110105898A1 (en) * 1999-04-07 2011-05-05 Intuitive Surgical Operations, Inc. Real-Time Generation of Three-Dimensional Ultrasound Image Using a Two-Dimensional Ultrasound Transducer in a Robotic System
US9517106B2 (en) * 1999-09-17 2016-12-13 Intuitive Surgical Operations, Inc. Systems and methods for commanded reconfiguration of a surgical manipulator using the null-space
US9949801B2 (en) 1999-09-17 2018-04-24 Intuitive Surgical Operations, Inc. Systems and methods for commanded reconfiguration of a surgical manipulator using the null-space
US20140358161A1 (en) * 1999-09-17 2014-12-04 Intuitive Surgical Operations, Inc. Systems and methods for commanded reconfiguration of a surgical manipulator using the null-space
US20190282308A1 (en) * 2002-03-20 2019-09-19 P Tech, Llc Robotic surgery
US20200060775A1 (en) * 2002-03-20 2020-02-27 P Tech, Llc Robotic surgery
US10932869B2 (en) * 2002-03-20 2021-03-02 P Tech, Llc Robotic surgery
US10959791B2 (en) * 2002-03-20 2021-03-30 P Tech, Llc Robotic surgery
US10842571B2 (en) 2005-05-16 2020-11-24 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US11672606B2 (en) 2005-05-16 2023-06-13 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US10792107B2 (en) 2005-05-16 2020-10-06 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US11116578B2 (en) 2005-05-16 2021-09-14 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US20060258938A1 (en) * 2005-05-16 2006-11-16 Intuitive Surgical Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US20100331855A1 (en) * 2005-05-16 2010-12-30 Intuitive Surgical, Inc. Efficient Vision and Kinematic Data Fusion For Robotic Surgical Instruments and Other Applications
US11478308B2 (en) 2005-05-16 2022-10-25 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US8971597B2 (en) 2005-05-16 2015-03-03 Intuitive Surgical Operations, Inc. Efficient vision and kinematic data fusion for robotic surgical instruments and other applications
US10555775B2 (en) 2005-05-16 2020-02-11 Intuitive Surgical Operations, Inc. Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery
US11197731B2 (en) 2005-10-20 2021-12-14 Intuitive Surgical Operations, Inc. Auxiliary image display and manipulation on a computer display in a medical robotic system
US20110050852A1 (en) * 2005-12-30 2011-03-03 Intuitive Surgical Operations, Inc. Stereo telestration for robotic surgery
US20080065109A1 (en) * 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Preventing instrument/tissue collisions
US9345387B2 (en) 2006-06-13 2016-05-24 Intuitive Surgical Operations, Inc. Preventing instrument/tissue collisions
US20080004603A1 (en) * 2006-06-29 2008-01-03 Intuitive Surgical Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US9718190B2 (en) 2006-06-29 2017-08-01 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US11865729B2 (en) 2006-06-29 2024-01-09 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US10137575B2 (en) 2006-06-29 2018-11-27 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US20090192524A1 (en) * 2006-06-29 2009-07-30 Intuitive Surgical, Inc. Synthetic representation of a surgical robot
US10008017B2 (en) 2006-06-29 2018-06-26 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
US10773388B2 (en) 2006-06-29 2020-09-15 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US11638999B2 (en) 2006-06-29 2023-05-02 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US9789608B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US9788909B2 (en) 2006-06-29 2017-10-17 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical instrument
US9801690B2 (en) 2006-06-29 2017-10-31 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical instrument
US10730187B2 (en) 2006-06-29 2020-08-04 Intuitive Surgical Operations, Inc. Tool position and identification indicator displayed in a boundary area of a computer display screen
US10737394B2 (en) 2006-06-29 2020-08-11 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US20100149183A1 (en) * 2006-12-15 2010-06-17 Loewke Kevin E Image mosaicing systems and methods
US10936090B2 (en) 2006-12-28 2021-03-02 D3D Technologies, Inc. Interactive 3D cursor for use in medical imaging
US11520415B2 (en) 2006-12-28 2022-12-06 D3D Technologies, Inc. Interactive 3D cursor for use in medical imaging
US11016579B2 (en) 2006-12-28 2021-05-25 D3D Technologies, Inc. Method and apparatus for 3D viewing of images on a head display unit
US11228753B1 (en) 2006-12-28 2022-01-18 Robert Edwin Douglas Method and apparatus for performing stereoscopic zooming on a head display unit
US11275242B1 (en) 2006-12-28 2022-03-15 Tipping Point Medical Images, Llc Method and apparatus for performing stereoscopic rotation of a volume on a head display unit
US20160026266A1 (en) * 2006-12-28 2016-01-28 David Byron Douglas Method and apparatus for three dimensional viewing of images
US11315307B1 (en) 2006-12-28 2022-04-26 Tipping Point Medical Images, Llc Method and apparatus for performing rotating viewpoints using a head display unit
US10942586B1 (en) 2006-12-28 2021-03-09 D3D Technologies, Inc. Interactive 3D cursor for use in medical imaging
US11036311B2 (en) 2006-12-28 2021-06-15 D3D Technologies, Inc. Method and apparatus for 3D viewing of images on a head display unit
US9980691B2 (en) * 2006-12-28 2018-05-29 David Byron Douglas Method and apparatus for three dimensional viewing of images
US10795457B2 (en) 2006-12-28 2020-10-06 D3D Technologies, Inc. Interactive 3D cursor
US20080234866A1 (en) * 2007-03-20 2008-09-25 Kosuke Kishi Master-slave manipulator system
US8002694B2 (en) * 2007-03-20 2011-08-23 Hitachi, Ltd. Master-slave manipulator system
US9044257B2 (en) * 2007-04-16 2015-06-02 Tim Fielding Frame mapping and force feedback methods, devices and systems
US20140142593A1 (en) * 2007-04-16 2014-05-22 Tim Fielding Frame Mapping and Force Feedback Methods, Devices and Systems
US10695136B2 (en) 2007-06-13 2020-06-30 Intuitive Surgical Operations, Inc. Preventing instrument/tissue collisions
US9333042B2 (en) 2007-06-13 2016-05-10 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US9469034B2 (en) 2007-06-13 2016-10-18 Intuitive Surgical Operations, Inc. Method and system for switching modes of a robotic system
US10271912B2 (en) 2007-06-13 2019-04-30 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US9629520B2 (en) 2007-06-13 2017-04-25 Intuitive Surgical Operations, Inc. Method and system for moving an articulated instrument back towards an entry guide while automatically reconfiguring the articulated instrument for retraction into the entry guide
US20100274087A1 (en) * 2007-06-13 2010-10-28 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US9138129B2 (en) 2007-06-13 2015-09-22 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US8620473B2 (en) 2007-06-13 2013-12-31 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US10188472B2 (en) 2007-06-13 2019-01-29 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US11432888B2 (en) 2007-06-13 2022-09-06 Intuitive Surgical Operations, Inc. Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide
US11399908B2 (en) 2007-06-13 2022-08-02 Intuitive Surgical Operations, Inc. Medical robotic system with coupled control modes
US9901408B2 (en) 2007-06-13 2018-02-27 Intuitive Surgical Operations, Inc. Preventing instrument/tissue collisions
US11751955B2 (en) 2007-06-13 2023-09-12 Intuitive Surgical Operations, Inc. Method and system for retracting an instrument into an entry guide
US20090069804A1 (en) * 2007-09-12 2009-03-12 Jensen Jeffrey L Apparatus for efficient power delivery
US8224484B2 (en) 2007-09-30 2012-07-17 Intuitive Surgical Operations, Inc. Methods of user interface with alternate tool mode for robotic surgical tools
US9649174B2 (en) 2007-09-30 2017-05-16 Intuitive Surgical Operations, Inc. User interface with state machine for alternate tool mode for robotic surgical tools
US20090088775A1 (en) * 2007-09-30 2009-04-02 Nitish Swarup Methods of user interface with alternate tool mode for robotic surgical tools
US9050120B2 (en) 2007-09-30 2015-06-09 Intuitive Surgical Operations, Inc. Apparatus and method of user interface with alternate tool mode for robotic surgical tools
US9339343B2 (en) 2007-09-30 2016-05-17 Intuitive Surgical Operations, Inc. User interface methods for alternate tool modes for robotic surgical tools
US8792963B2 (en) 2007-09-30 2014-07-29 Intuitive Surgical Operations, Inc. Methods of determining tissue distances using both kinematic robotic tool position information and image-derived position information
US20090088774A1 (en) * 2007-09-30 2009-04-02 Nitish Swarup Apparatus and method of user interface with alternate tool mode for robotic surgical tools
US20100139808A1 (en) * 2007-11-26 2010-06-10 Thompson Ray P Special articulating tool holder
US8042435B2 (en) * 2007-11-26 2011-10-25 Thompson Ray P Special articulating tool holder
US20090141966A1 (en) * 2007-11-30 2009-06-04 Microsoft Corporation Interactive geo-positioning of imagery
US9123159B2 (en) * 2007-11-30 2015-09-01 Microsoft Technology Licensing, Llc Interactive geo-positioning of imagery
US8808164B2 (en) * 2008-03-28 2014-08-19 Intuitive Surgical Operations, Inc. Controlling a robotic surgical tool with a display monitor
US20090248036A1 (en) * 2008-03-28 2009-10-01 Intuitive Surgical, Inc. Controlling a robotic surgical tool with a display monitor
US10432921B2 (en) 2008-03-28 2019-10-01 Intuitive Surgical Operations, Inc. Automated panning in robotic surgical systems based on tool tracking
US10038888B2 (en) 2008-03-28 2018-07-31 Intuitive Surgical Operations, Inc. Apparatus for automated panning and zooming in robotic surgical systems
US11076748B2 (en) 2008-03-28 2021-08-03 Intuitive Surgical Operations, Inc. Display monitor control of a telesurgical tool
US11019329B2 (en) 2008-03-28 2021-05-25 Intuitive Surgical Operations, Inc. Automated panning and zooming in teleoperated surgical systems with stereo displays
US10674900B2 (en) 2008-03-28 2020-06-09 Intuitive Surgical Operations, Inc. Display monitor control of a telesurgical tool
US20140323803A1 (en) * 2008-03-28 2014-10-30 Intuitive Surgical Operations, Inc. Methods of controlling a robotic surgical tool with a display monitor
US9699445B2 (en) 2008-03-28 2017-07-04 Intuitive Surgical Operations, Inc. Apparatus for automated panning and digital zooming in robotic surgical systems
US20090326556A1 (en) * 2008-06-27 2009-12-31 Intuitive Surgical, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
US11382702B2 (en) 2008-06-27 2022-07-12 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US10368952B2 (en) 2008-06-27 2019-08-06 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US9516996B2 (en) 2008-06-27 2016-12-13 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
US8864652B2 (en) 2008-06-27 2014-10-21 Intuitive Surgical Operations, Inc. Medical robotic system providing computer generated auxiliary views of a camera instrument for controlling the positioning and orienting of its tip
US9089256B2 (en) 2008-06-27 2015-07-28 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US10258425B2 (en) 2008-06-27 2019-04-16 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide
US20090326318A1 (en) * 2008-06-27 2009-12-31 Intuitive Surgical, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US11638622B2 (en) 2008-06-27 2023-05-02 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide
US20090326553A1 (en) * 2008-06-27 2009-12-31 Intuitive Surgical, Inc. Medical robotic system providing an auxiliary view of articulatable instruments extending out of a distal end of an entry guide
US9717563B2 (en) 2008-06-27 2017-08-01 Intuitive Surgical Operations, Inc. Medical robotic system providing an auxiliary view including range of motion limitations for articulatable instruments extending out of a distal end of an entry guide
US8583274B2 (en) 2008-09-26 2013-11-12 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US8315720B2 (en) 2008-09-26 2012-11-20 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US8892224B2 (en) 2008-09-26 2014-11-18 Intuitive Surgical Operations, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US20100082039A1 (en) * 2008-09-26 2010-04-01 Intuitive Surgical, Inc. Method for graphically providing continuous change of state directions to a user of a medical robotic system
US20110234754A1 (en) * 2008-11-24 2011-09-29 Koninklijke Philips Electronics N.V. Combining 3d video and auxiliary data
US20100164950A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Efficient 3-d telestration for local robotic proctoring
US20100166323A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Robust sparse image matching for robotic surgery
US8830224B2 (en) 2008-12-31 2014-09-09 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local robotic proctoring
US8184880B2 (en) 2008-12-31 2012-05-22 Intuitive Surgical Operations, Inc. Robust sparse image matching for robotic surgery
US8639000B2 (en) 2008-12-31 2014-01-28 Intuitive Surgical Operations, Inc. Robust sparse image matching for robotic surgery
US9402690B2 (en) 2008-12-31 2016-08-02 Intuitive Surgical Operations, Inc. Efficient 3-D telestration for local and remote robotic proctoring
US20100285438A1 (en) * 2009-03-12 2010-11-11 Thenkurussi Kesavadas Method And System For Minimally-Invasive Surgery Training
WO2010105237A3 (en) * 2009-03-12 2011-01-13 Health Research Inc. Method and system for minimally-invasive surgery training
US10984567B2 (en) 2009-03-31 2021-04-20 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
US11941734B2 (en) 2009-03-31 2024-03-26 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
US10282881B2 (en) 2009-03-31 2019-05-07 Intuitive Surgical Operations, Inc. Rendering tool information as graphic overlays on displayed images of tools
EP3842190A1 (en) * 2009-03-31 2021-06-30 Intuitive Surgical Operations, Inc. Synthetic representation of a surgical robot
US9155592B2 (en) * 2009-06-16 2015-10-13 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US9492240B2 (en) 2009-06-16 2016-11-15 Intuitive Surgical Operations, Inc. Virtual measurement tool for minimally invasive surgery
US20100318099A1 (en) * 2009-06-16 2010-12-16 Intuitive Surgical, Inc. Virtual measurement tool for minimally invasive surgery
US11596490B2 (en) 2009-08-15 2023-03-07 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US20110040305A1 (en) * 2009-08-15 2011-02-17 Intuitive Surgical, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US8903546B2 (en) 2009-08-15 2014-12-02 Intuitive Surgical Operations, Inc. Smooth control of an articulated instrument across areas with different work space conditions
US9084623B2 (en) * 2009-08-15 2015-07-21 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US10772689B2 (en) 2009-08-15 2020-09-15 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US9492927B2 (en) 2009-08-15 2016-11-15 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US9956044B2 (en) 2009-08-15 2018-05-01 Intuitive Surgical Operations, Inc. Controller assisted reconfiguration of an articulated instrument during movement into and out of an entry guide
US10271915B2 (en) 2009-08-15 2019-04-30 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US10959798B2 (en) 2009-08-15 2021-03-30 Intuitive Surgical Operations, Inc. Application of force feedback on an input device to urge its operator to command an articulated instrument to a preferred pose
US9814392B2 (en) 2009-10-30 2017-11-14 The Johns Hopkins University Visual tracking and annotation of clinically important anatomical landmarks for surgical interventions
US20110202068A1 (en) * 2010-02-12 2011-08-18 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US9622826B2 (en) 2010-02-12 2017-04-18 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US8918211B2 (en) 2010-02-12 2014-12-23 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US10828774B2 (en) 2010-02-12 2020-11-10 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
US10537994B2 (en) 2010-02-12 2020-01-21 Intuitive Surgical Operations, Inc. Medical robotic system providing sensory feedback indicating a difference between a commanded state and a preferred pose of an articulated instrument
WO2011104135A1 (en) * 2010-02-25 2011-09-01 Siemens Aktiengesellschaft Method for displaying an area to be examined and/or treated
US8675939B2 (en) 2010-07-13 2014-03-18 Stryker Leibinger Gmbh & Co. Kg Registration of anatomical data sets
US9572548B2 (en) 2010-07-13 2017-02-21 Stryker European Holdings I, Llc Registration of anatomical data sets
KR101975808B1 (en) * 2010-11-04 2019-08-28 더 존스 홉킨스 유니버시티 System and method for the evaluation of or improvement of minimally invasive surgery skills
WO2012060901A1 (en) * 2010-11-04 2012-05-10 The Johns Hopkins University System and method for the evaluation of or improvement of minimally invasive surgery skills
KR20150004726A (en) * 2010-11-04 2015-01-13 더 존스 홉킨스 유니버시티 System and method for the evaluation of or improvement of minimally invasive surgery skills
US9486189B2 (en) 2010-12-02 2016-11-08 Hitachi Aloka Medical, Ltd. Assembly for use with surgery system
US9259204B2 (en) 2011-05-30 2016-02-16 General Electric Company Ultrasound diagnostic apparatus and method of displaying medical image thereof
US9161772B2 (en) 2011-08-04 2015-10-20 Olympus Corporation Surgical instrument and medical manipulator
US9477301B2 (en) 2011-08-04 2016-10-25 Olympus Corporation Operation support device and assembly method thereof
US9519341B2 (en) 2011-08-04 2016-12-13 Olympus Corporation Medical manipulator and surgical support apparatus
US9524022B2 (en) 2011-08-04 2016-12-20 Olympus Corporation Medical equipment
US9423869B2 (en) 2011-08-04 2016-08-23 Olympus Corporation Operation support device
US9244523B2 (en) 2011-08-04 2016-01-26 Olympus Corporation Manipulator system
US9851782B2 (en) 2011-08-04 2017-12-26 Olympus Corporation Operation support device and attachment and detachment method thereof
US9568992B2 (en) 2011-08-04 2017-02-14 Olympus Corporation Medical manipulator
US9218053B2 (en) 2011-08-04 2015-12-22 Olympus Corporation Surgical assistant system
US9632577B2 (en) 2011-08-04 2017-04-25 Olympus Corporation Operation support device and control method thereof
US9632573B2 (en) 2011-08-04 2017-04-25 Olympus Corporation Medical manipulator and method of controlling the same
US9671860B2 (en) 2011-08-04 2017-06-06 Olympus Corporation Manipulation input device and manipulator system having the same
US9244524B2 (en) 2011-08-04 2016-01-26 Olympus Corporation Surgical instrument and control method thereof
US9277968B2 (en) 2011-12-09 2016-03-08 Samsung Electronics Co., Ltd. Medical robot system and method for controlling the same
US9956042B2 (en) 2012-01-13 2018-05-01 Vanderbilt University Systems and methods for robot-assisted transurethral exploration and intervention
US10836045B2 (en) 2012-02-15 2020-11-17 Intuitive Surgical Operations, Inc. User selection of robotic system operating modes using mode distinguishing operator actions
US9586323B2 (en) 2012-02-15 2017-03-07 Intuitive Surgical Operations, Inc. User selection of robotic system operating modes using mode distinguishing operator actions
US10532467B2 (en) 2012-02-15 2020-01-14 Intuitive Surgical Operations, Inc. User selection of robotic system operating modes using mode distinguishing operator actions
US10272270B2 (en) 2012-04-12 2019-04-30 Koninklijke Philips N.V. Coordinate transformation of graphical objects registered to a magnetic resonance image
US9687303B2 (en) 2012-04-20 2017-06-27 Vanderbilt University Dexterous wrists for surgical intervention
US20140330432A1 (en) * 2012-04-20 2014-11-06 Vanderbilt University Systems and methods for safe compliant insertion and hybrid force/motion telemanipulation of continuum robots
US10500002B2 (en) 2012-04-20 2019-12-10 Vanderbilt University Dexterous wrists
US9549720B2 (en) 2012-04-20 2017-01-24 Vanderbilt University Robotic device for establishing access channel
US9539726B2 (en) * 2012-04-20 2017-01-10 Vanderbilt University Systems and methods for safe compliant insertion and hybrid force/motion telemanipulation of continuum robots
US10300599B2 (en) * 2012-04-20 2019-05-28 Vanderbilt University Systems and methods for safe compliant insertion and hybrid force/motion telemanipulation of continuum robots
US20130314418A1 (en) * 2012-05-24 2013-11-28 Siemens Medical Solutions Usa, Inc. System for Erasing Medical Image Features
US10682191B2 (en) 2012-06-01 2020-06-16 Intuitive Surgical Operations, Inc. Systems and methods for commanded reconfiguration of a surgical manipulator using the null-space
US11278182B2 (en) * 2012-06-28 2022-03-22 Koninklijke Philips N.V. Enhanced visualization of blood vessels using a robotically steered endoscope
US11013480B2 (en) 2012-06-28 2021-05-25 Koninklijke Philips N.V. C-arm trajectory planning for optimal image acquisition in endoscopic surgery
CN103054612A (en) * 2012-12-10 2013-04-24 苏州佳世达电通有限公司 Ultrasonic probe mouse and ultrasonoscope
US10507066B2 (en) 2013-02-15 2019-12-17 Intuitive Surgical Operations, Inc. Providing information of tools by filtering image areas adjacent to or on displayed images of the tools
US11389255B2 (en) 2013-02-15 2022-07-19 Intuitive Surgical Operations, Inc. Providing information of tools by filtering image areas adjacent to or on displayed images of the tools
US11806102B2 (en) 2013-02-15 2023-11-07 Intuitive Surgical Operations, Inc. Providing information of tools by filtering image areas adjacent to or on displayed images of the tools
US9713460B2 (en) 2013-05-02 2017-07-25 Samsung Medison Co., Ltd. Ultrasound system and method for providing change information of target object
US20160235285A1 (en) * 2013-10-30 2016-08-18 Olympus Corporation Endoscope apparatus
US10085630B2 (en) * 2013-10-30 2018-10-02 Olympus Corporation Endoscope apparatus
US10057590B2 (en) * 2014-01-13 2018-08-21 Mediatek Inc. Method and apparatus using software engine and hardware engine collaborated with each other to achieve hybrid video encoding
US11317979B2 (en) * 2014-03-17 2022-05-03 Intuitive Surgical Operations, Inc. Systems and methods for offscreen indication of instruments in a teleoperational medical system
US11903665B2 (en) 2014-03-17 2024-02-20 Intuitive Surgical Operations, Inc. Systems and methods for offscreen indication of instruments in a teleoperational medical system
US10390728B2 (en) * 2014-03-31 2019-08-27 Canon Medical Systems Corporation Medical image diagnosis apparatus
US10571671B2 (en) * 2014-03-31 2020-02-25 Sony Corporation Surgical control device, control method, and imaging control system
US20150313445A1 (en) * 2014-05-01 2015-11-05 Endochoice, Inc. System and Method of Scanning a Body Cavity Using a Multiple Viewing Elements Endoscope
US10420575B2 (en) 2014-07-25 2019-09-24 Olympus Corporation Treatment tool and treatment tool system
WO2016049294A1 (en) * 2014-09-25 2016-03-31 The Johns Hopkins University Surgical system user interface using cooperatively-controlled robot
US9815206B2 (en) 2014-09-25 2017-11-14 The Johns Hopkins University Surgical system user interface using cooperatively-controlled robot
US11723734B2 (en) 2014-11-13 2023-08-15 Intuitive Surgical Operations, Inc. User-interface control using master controller
US20210030491A1 (en) * 2014-11-13 2021-02-04 Intuitive Surgical Operations, Inc. Interaction between user-interface and master controller
US11478207B2 (en) * 2016-08-18 2022-10-25 Stryker European Operations Holdings Llc Method for visualizing a bone
US10827998B2 (en) * 2016-08-18 2020-11-10 Stryker European Holdings I, Llc Method for visualizing a bone
US20190167221A1 (en) * 2016-08-18 2019-06-06 Stryker European Holdings I, Llc Method For Visualizing A Bone
US11793394B2 (en) 2016-12-02 2023-10-24 Vanderbilt University Steerable endoscope with continuum manipulator
US10839956B2 (en) * 2017-03-03 2020-11-17 University of Maryland Medical Center Universal device and method to integrate diagnostic testing into treatment in real-time
US20180254099A1 (en) * 2017-03-03 2018-09-06 University of Maryland Medical Center Universal device and method to integrate diagnostic testing into treatment in real-time
US20220001092A1 (en) * 2017-08-21 2022-01-06 RELIGN Corporation Arthroscopic devices and methods
US11897129B2 (en) 2017-09-13 2024-02-13 Vanderbilt University Continuum robots with multi-scale motion through equilibrium modulation
US10967504B2 (en) 2017-09-13 2021-04-06 Vanderbilt University Continuum robots with multi-scale motion through equilibrium modulation
US20190110855A1 (en) * 2017-10-17 2019-04-18 Verily Life Sciences Llc Display of preoperative and intraoperative images
US10835344B2 (en) * 2017-10-17 2020-11-17 Verily Life Sciences Llc Display of preoperative and intraoperative images
US20200073526A1 (en) * 2018-08-28 2020-03-05 Johnson Controls Technology Company Energy management system with draggable and non-draggable building component user interface elements
US11903661B2 (en) 2018-09-17 2024-02-20 Auris Health, Inc. Systems and methods for concomitant medical procedures
US11197728B2 (en) 2018-09-17 2021-12-14 Auris Health, Inc. Systems and methods for concomitant medical procedures
US11903650B2 (en) 2019-09-11 2024-02-20 Ardeshir Rastinehad Method for providing clinical support for surgical guidance during robotic surgery
US11485012B2 (en) * 2019-12-12 2022-11-01 Seiko Epson Corporation Control method and robot system
USD981425S1 (en) * 2020-09-30 2023-03-21 Karl Storz SE & Co. KG Display screen with graphical user interface
USD1022197S1 (en) 2020-11-19 2024-04-09 Auris Health, Inc. Endoscope
US11819302B2 (en) 2021-03-31 2023-11-21 Moon Surgical Sas Co-manipulation surgical system having user guided stage control
US11844583B2 (en) 2021-03-31 2023-12-19 Moon Surgical Sas Co-manipulation surgical system having an instrument centering mode for automatic scope movements
US11504197B1 (en) 2021-03-31 2022-11-22 Moon Surgical Sas Co-manipulation surgical system having multiple operational modes for use with surgical instruments for performing laparoscopic surgery
US11622826B2 (en) 2021-03-31 2023-04-11 Moon Surgical Sas Co-manipulation surgical system for use with surgical instruments for performing laparoscopic surgery while compensating for external forces
US11832909B2 (en) 2021-03-31 2023-12-05 Moon Surgical Sas Co-manipulation surgical system having actuatable setup joints
US11786323B2 (en) 2021-03-31 2023-10-17 Moon Surgical Sas Self-calibrating co-manipulation surgical system for use with surgical instrument for performing laparoscopic surgery
US11812938B2 (en) 2021-03-31 2023-11-14 Moon Surgical Sas Co-manipulation surgical system having a coupling mechanism removeably attachable to surgical instruments
US11737840B2 (en) 2021-03-31 2023-08-29 Moon Surgical Sas Co-manipulation surgical system having a robot arm removeably attachable to surgical instruments for performing laparoscopic surgery
US11839442B1 (en) 2023-01-09 2023-12-12 Moon Surgical Sas Co-manipulation surgical system for use with surgical instruments for performing laparoscopic surgery while estimating hold force
US11832910B1 (en) 2023-01-09 2023-12-05 Moon Surgical Sas Co-manipulation surgical system having adaptive gravity compensation

Also Published As

Publication number Publication date
JP5467615B2 (en) 2014-04-09
WO2007047782A2 (en) 2007-04-26
CN101291635B (en) 2013-03-27
CN103251455A (en) 2013-08-21
KR20080068640A (en) 2008-07-23
JP2009512514A (en) 2009-03-26
EP3155998B1 (en) 2021-03-31
JP2012061336A (en) 2012-03-29
CN101291635A (en) 2008-10-22
WO2007047782A3 (en) 2007-09-13
KR101320379B1 (en) 2013-10-22
EP3524202A1 (en) 2019-08-14
EP1937176B1 (en) 2019-04-17
JP5322648B2 (en) 2013-10-23
US20160235496A1 (en) 2016-08-18
EP3162318A2 (en) 2017-05-03
JP2012055752A (en) 2012-03-22
EP3162318A3 (en) 2017-08-09
CN103251455B (en) 2016-04-27
CN103142309B (en) 2015-06-17
US20190388169A1 (en) 2019-12-26
JP5276706B2 (en) 2013-08-28
EP1937176A2 (en) 2008-07-02
US11197731B2 (en) 2021-12-14
EP3155998A1 (en) 2017-04-19
JP5639223B2 (en) 2014-12-10
US20220071721A1 (en) 2022-03-10
EP3162318B1 (en) 2019-10-16
CN103142309A (en) 2013-06-12
JP2013150873A (en) 2013-08-08

Similar Documents

Publication Publication Date Title
US11197731B2 (en) Auxiliary image display and manipulation on a computer display in a medical robotic system
JP6138227B2 (en) Laparoscopic ultrasonic robotic surgical system
JP2009512514A5 (en)
US20210212773A1 (en) System and method for hybrid control using eye tracking

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTUITIVE SURGICAL, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HOFFMAN, BRIAN DAVID;KUMAR, RAJESH;LARKIN, DAVID Q.;AND OTHERS;REEL/FRAME:018715/0463;SIGNING DATES FROM 20061103 TO 20061207

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: INTUITIVE SURGICAL OPERATIONS, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTUITIVE SURGICAL, INC.;REEL/FRAME:042260/0780

Effective date: 20100110

AS Assignment

Owner name: INTUITIVE SURGICAL OPERATIONS, INC., CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNOR EXECUTION DATE PREVIOUSLY RECORDED AT REEL: 042260 FRAME: 0780. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:INTUITIVE SURGICAL, INC.;REEL/FRAME:043096/0051

Effective date: 20100219