Publication number: US 7395249 B2
Publication type: Grant
Application number: US 10/942,374
Publication date: 1 Jul 2008
Filing date: 16 Sep 2004
Priority date: 22 Sep 1994
Fee status: Paid
Also published as: CA2200716A1, CA2200716C, US6463361, US6965812, US20020183894, US20050033580, WO1996009587A1
Inventors: Yulun Wang, Darrin Uecker
Original assignee: Intuitive Surgical, Inc.
External links: USPTO, USPTO Assignment, Espacenet
Speech interface for an automated endoscope system
US 7395249 B2
Abstract
A robotic system which controls the movement of a surgical instrument in response to voice commands from the user. The robotic system has a computer controlled arm that holds the surgical instrument. The user provides voice commands to the computer through a microphone. The computer contains a phrase recognizer that matches the user's speech with words stored in the computer. Matched words are then processed to determine whether the user has spoken a robot command. If the user has spoken a recognized robot command, the computer will move the robotic arm in accordance with the command.
Images (4)
Claims (26)
1. A voice recognition system for use with a surgical instrument, the system comprising:
a processor coupled to a microphone and to a memory, the processor having a first state and a second state;
wherein the microphone receives a plurality of spoken surgical instructions, each of the spoken surgical instructions including a spoken qualifier and a spoken command, the plurality of spoken surgical instructions including a first surgical instruction having a first spoken qualifier and a first spoken command;
wherein the memory stores a first plurality of allowable commands associated with the second state of the processor; and
wherein in the first state, the processor is configured to:
receive the first surgical instruction and determine if the first spoken qualifier included in the first surgical instruction satisfies a first condition;
if the first spoken qualifier satisfies the first condition, then advance to the second state; and
if the first spoken qualifier does not satisfy the first condition, then remain in the first state; and
wherein in the second state, the processor is configured to:
determine whether the first spoken command included in the first surgical instruction is among the first plurality of allowable commands associated with the second state; and
if the first spoken command is among the first plurality of allowable commands, then transmit a first command signal to the surgical instrument in response to the first spoken command being among the first plurality of allowable commands.
2. The voice recognition system of claim 1, wherein the surgical instrument includes an endoscope configured to be coupled to a monitor, the processor configured to be coupled to the endoscope so as to alter an image from the endoscope that is shown on the monitor.
3. The voice recognition system of claim 2, the surgical instrument including a robotic arm supporting the endoscope, a distal end of the endoscope including a tip defining a viewing coordinate frame, wherein the processor is configured to calculate transformations between the viewing coordinate frame and a coordinate frame of the robotic arm, and the first command signal includes motor signals derived from the transformations so that the tip moves in an internal surgical site to effect an instructed change in the image shown on the monitor.
4. The voice recognition system of claim 1, wherein:
the processor is configured to be coupled to the surgical instrument.
5. The voice recognition system of claim 1, wherein the plurality of spoken surgical instructions includes a second instruction having a second spoken qualifier and a second spoken command, the memory operable to store a second plurality of allowable commands, the processor operable to change to a third state in response to the second spoken qualifier satisfying a second condition, the processor in the third state configured to determine whether the second spoken command is among the second plurality of allowable commands associated with the third state, the processor operable to generate a second command signal in response to the second spoken command being among the second plurality of allowable commands.
6. The voice recognition system of claim 5, wherein the processor has a fourth state, the processor operable to change to the fourth state in response to the second spoken command, the processor in the fourth state operable to accept a third plurality of allowable commands stored in the memory and associated with the fourth state.
7. The voice recognition system of claim 1, further comprising a speaker coupled to the processor for generating audible messages to a surgeon regarding operation of the system.
8. The voice recognition system of claim 7, wherein the audible messages include audible feedback indicating successful receipt of each spoken surgical instruction.
9. The voice recognition system of claim 8, wherein the audible messages include synthesized voice messages.
10. The voice recognition system of claim 1, wherein the microphone accepts a spoken stop command, and the processor is configured to transmit a stop command signal to the surgical instrument in response to the spoken stop command without an associated spoken qualifier.
11. The voice recognition system of claim 10, wherein the stop command signal from the processor is configured to inhibit potential injury to the patient that might otherwise be inflicted by the surgical instrument.
12. The voice recognition system of claim 1, wherein:
determining whether the first spoken command is among the first plurality of allowable commands includes comparing the first spoken command to at least one of the allowable commands in the first plurality of allowable commands.
13. A method comprising:
receiving a spoken surgical instruction, the spoken surgical instruction comprising a verbal qualifier and a verbal control command, wherein the verbal qualifier precedes the verbal control command;
determining whether the verbal qualifier matches an expected qualifier associated with a medical device;
if the verbal qualifier matches the expected qualifier, then determining whether the verbal control command is among one or more predefined commands from a library of multiple predefined commands; and
providing an output command signal that corresponds to the verbal control command to the medical device only if the verbal qualifier matches the expected qualifier and the verbal control command is among the one or more predefined commands.
14. The method of claim 13, further comprising:
providing audio or visual feedback after receiving the verbal control command.
15. The method of claim 13, wherein:
the expected qualifier includes a name of the medical device.
16. The method of claim 13, wherein:
the medical device comprises a robotic arm.
17. The method of claim 13, wherein:
determining whether the verbal qualifier matches the expected qualifier includes comparing the verbal qualifier to the expected qualifier; and
determining whether the verbal control command is among the one or more predefined commands includes comparing the verbal control command to at least one of the one or more predefined commands.
18. A voice recognition system for use with a surgical instrument, the system comprising:
a microphone for inputting a plurality of spoken surgical instructions, each of the spoken surgical instructions including a spoken qualifier and a spoken command, the plurality of spoken surgical instructions including a first instruction comprising a first spoken qualifier and a first spoken command, wherein the first spoken command comprises a first portion and a second portion;
a memory for storing a first plurality of allowable commands and a second plurality of allowable commands; and
a processor coupled to the microphone and the memory, the processor having a first state, a second state, and a third state, wherein the first plurality of allowable commands are associated with the second state, wherein the second plurality of allowable commands are associated with the third state, and wherein the processor is configured to:
in the first state, in response to the first spoken qualifier matching an expected qualifier, advance to the second state;
in the second state, determine if the first portion of the first spoken command is among the first plurality of allowable commands associated with the second state and to advance to the third state in response to the first portion being among the first plurality of allowable commands;
in the third state, determine if the second portion of the first spoken command is among the second plurality of allowable commands associated with the third state; and
in response to the second portion of the first spoken command being among the second plurality of allowable commands, provide a first command signal to the surgical instrument corresponding to the first spoken command.
19. A method for controlling a surgical instrument, the method comprising:
in a first state, receiving a plurality of spoken surgical instructions, each of the spoken surgical instructions including a spoken qualifier and a spoken command, the plurality of spoken surgical instructions including a first surgical instruction having a first spoken qualifier and a first spoken command;
determining if the first spoken qualifier included in the first surgical instruction satisfies a first condition;
if the first spoken qualifier satisfies the first condition, then advancing to a second state;
in the second state, determining whether the first spoken command included in the first surgical instruction is among a plurality of allowable commands associated with the second state; and
if the first spoken command is among the plurality of allowable commands, then transmitting a first command signal to the surgical instrument in response to the first spoken command being among the plurality of allowable commands.
20. The voice recognition system of claim 1, wherein the first plurality of allowable commands comprises a save command.
21. The voice recognition system of claim 1, wherein the first plurality of allowable commands comprises a return command.
22. The voice recognition system of claim 1, wherein the first plurality of allowable commands comprises a track instrument command.
23. The voice recognition system of claim 1, wherein the first plurality of allowable commands comprises a track head command.
24. A method comprising:
receiving a first spoken surgical instruction, the first spoken surgical instruction comprising a verbal qualifier and a first verbal control command, the verbal qualifier preceding the first verbal control command;
determining whether the verbal qualifier matches an expected qualifier;
if the verbal qualifier matches the expected qualifier, then determining whether the first verbal control command is among one or more predefined commands from a first library of multiple predefined commands;
providing a first output command signal that corresponds to the first verbal control command to a medical device only if the verbal qualifier matches the expected qualifier and the first verbal control command is among the predefined commands in the first library;
receiving a second spoken surgical instruction, the second spoken surgical instruction comprising the verbal qualifier and a second verbal control command, the verbal qualifier preceding the second verbal control command;
determining whether the verbal qualifier matches the expected qualifier;
if the verbal qualifier matches the expected qualifier, then determining whether the second verbal control command is among one or more predefined commands from a second library of multiple predefined commands; and
providing a second output command signal that corresponds to the second verbal control command to the medical device only if the verbal qualifier matches the expected qualifier and the second verbal control command is among the predefined commands in the second library.
25. A method comprising:
receiving a spoken surgical instruction, the spoken surgical instruction comprising a verbal qualifier and a verbal control command comprising a first part and a second part, the verbal qualifier preceding the verbal control command;
determining whether the verbal qualifier matches an expected qualifier;
if the verbal qualifier matches the expected qualifier, then determining whether the first part of the verbal control command is among one or more predefined commands from a first library of multiple predefined commands;
if the first part of the verbal control command is among the predefined commands in the first library, then determining whether the second part of the verbal control command is among one or more predefined commands in a second library of multiple predefined commands; and
providing an output command signal to a medical device, wherein the output command signal corresponds to the first and second parts of the verbal control command only if: the verbal qualifier matches the expected qualifier, the first part of the verbal control command is among the predefined commands in the first library, and the second part of the verbal control command is among the predefined commands in the second library.
26. The voice recognition system of claim 1, wherein:
the plurality of spoken surgical instructions includes a second instruction including the first spoken qualifier and a second spoken command including a first part and a second part;
the memory is further operable to store a second plurality of allowable commands associated with a third state of the processor;
the processor is further operable in the second state to:
determine whether the first part of the second spoken command is among the first plurality of allowable commands associated with the second state;
if the first part is among the first plurality of allowable commands, then advance to the third state and determine whether the second part of the second spoken command is among the second plurality of allowable commands associated with the third state; and
if the second part is among the second plurality of allowable commands, then transmit a second command signal to the surgical instrument in response to the second spoken command.
Description
CROSS-REFERENCE TO RELATED APPLICATION

This is a continuation of U.S. patent application Ser. No. 10/095,488, filed Mar. 11, 2002, the full disclosure of which is incorporated herein by reference, which is a continuation of U.S. patent application Ser. No. 08/310,665, filed on Sep. 22, 1994.

BRIEF SUMMARY OF THE INVENTION

The present invention is a robotic system which controls the movement of a surgical instrument in response to voice commands from the user. A surgical instrument is a tool or device used during a surgery or operation. Examples of surgical instruments include forceps, laparoscopes, endoscopes, and medical telescopes. The robotic system has a computer controlled arm that holds the surgical instrument. The user provides voice commands to the computer through a microphone. The computer contains a phrase recognizer that matches the user's speech with words stored in the computer. Matched words are then processed to determine whether the user has spoken a robot command. If the user has spoken a recognized robot command, the computer will move the robotic arm in accordance with the command.

BRIEF DESCRIPTION OF THE DRAWINGS

The objects and advantages of the present invention will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings, wherein:

FIG. 1 is a perspective view of a robotic endoscope system of the present invention;

FIG. 2 is a schematic of an endoscope within two separate coordinate systems;

FIG. 3 is a top view of a foot pedal;

FIG. 4 is a schematic of a computer system;

FIG. 5 is a schematic of a grammar process;

FIG. 6 is a schematic of a robotic arm.

DETAILED DESCRIPTION OF THE INVENTION

Referring to the drawings more particularly by reference numbers, FIG. 1 shows a robotic system 10 of the present invention. The system 10 is typically used in a sterile operating room where a surgeon performs a surgical procedure on a patient. The patient is placed on an operating table 12. Attached to the table 12 is a robotic arm assembly 14 which can move a surgical instrument 16 relative to the table 12 and the patient. The surgical instrument 16 is typically an endoscope which is inserted into the abdomen of the patient. The endoscope 16 enters the patient through a cannula, wherein the scope 16 rotates about a cannula pivot point. The endoscope is typically connected to a monitor 18 which allows the surgeon to view the organs, etc., of the patient. Although an endoscope is described and shown, it is to be understood that the present invention can be used with other surgical instruments.

The robotic arm assembly 14 is controlled by a computer 20. In the preferred embodiment, the robotic arm assembly 14 includes a linear actuator 24 fixed to the table 12. The linear actuator 24 is connected to a linkage arm assembly 26 and adapted to move the linkage assembly 26 along the z axis of a first coordinate system. The first coordinate system also has an x axis and a y axis.

The linkage arm assembly 26 includes a first linkage arm 28 attached to a first rotary actuator 30 and an end effector 32. The first rotary actuator 30 is adapted to rotate the first linkage arm 28 and end effector 32 in a plane perpendicular to the z axis (x-y plane). The first rotary actuator 30 is connected to a second rotary actuator 34 by a second linkage arm 36. The second actuator 34 is adapted to rotate the first actuator 30 in the x-y plane. The second rotary actuator 34 is connected to the output shaft of the linear actuator 24. The actuators 24, 30 and 34 rotate in response to output signals provided by the computer 20. As shown in FIG. 2, the junction of the endoscope 16 and the end effector 32 defines a second coordinate system which has an x′ axis, a y′ axis and a z′ axis. The junction of the end effector 32 and endoscope 16 also defines the origin of a third coordinate system which has an x″ axis, a y″ axis and a z″ axis. The z″ axis is parallel with the longitudinal axis of the endoscope 16.

The arm assembly may have a pair of passive joints that allow the end effector to be rotated in the direction indicated by the arrows. The actuators 24, 30 and 34, and joints of the arm may each have position sensors (not shown) that are connected to the computer 20. The sensors provide positional feedback signals of each corresponding arm component.

The system has a microphone 40 that is connected to the computer 20. The system may also have a speaker 42 that is connected to the computer 20. The microphone 40 and speaker 42 may be mounted to a headset 44 that is worn by the user. Placing the microphone 40 in close proximity to the user reduces the amount of background noise provided to the computer and decreases the probability of an inadvertent input command.

As shown in FIG. 3, the system may also have a foot pedal 50. The foot pedal 50 has a housing 56 that supports a pair of outer first foot switches 58 and a second foot switch 60. One outer foot switch 58 has a first pressure transducer 62 and the other switch has a second pressure transducer 64. The second foot switch 60 has third 66, fourth 68, fifth 70 and sixth 72 pressure transducers. The transducers are each connected to a corresponding operational amplifier that provides a voltage input to the computer 20. The pressure transducers 62-72 are preferably constructed so that the resistance of each transducer decreases as the surgeon increases the pressure on the foot switches. Such a transducer is sold by Interlink Electronics. The decreasing transducer resistance increases the input voltage provided to the computer 20 from the operational amplifier. Each transducer corresponds to a predetermined direction within the image displayed by the monitor. In the preferred embodiment, the first pressure transducer 62 corresponds to moving the endoscope toward the image viewed by the surgeon. The second transducer 64 moves the scope away from the image. The third 66 and fourth 68 transducers move the image “up” and “down”, respectively, and the fifth 70 and sixth 72 transducers move the image “left” and “right”, respectively. The pedal may have a button 73 that enables the foot pedal 50 and disables the voice command feature, or vice versa.
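For illustration only, the sketch below maps digitized transducer voltages to scope directions in the manner just described; the channel numbering, threshold value, and function names are assumptions rather than details from the patent.

```python
# Channel -> direction, following the transducer ordering described above.
CHANNEL_DIRECTIONS = {0: "in", 1: "out", 2: "up", 3: "down", 4: "left", 5: "right"}
THRESHOLD_VOLTS = 0.5   # pressing harder lowers resistance, raising the voltage

def pedal_commands(adc_readings):
    """Map ADC readings (channel -> volts) to the active scope directions."""
    return [CHANNEL_DIRECTIONS[ch]
            for ch, volts in sorted(adc_readings.items())
            if ch in CHANNEL_DIRECTIONS and volts > THRESHOLD_VOLTS]

print(pedal_commands({0: 0.1, 3: 0.8}))   # ['down']
```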

FIG. 4 shows a schematic of the computer 20. The computer 20 has a multiplexer 74 which is connected to the pressure transducers of the foot pedal 50 and the position sensors of the arm. The multiplexer 74 is connected to a single analog to digital (A/D) converter 76. The computer 20 also has a processor 78 and memory 80.

The processor 78 is connected to an address decoder 82 and separate digital to analog (D/A) converters 84. Each D/A converter is connected to an actuator 24, 30 and 34. The D/A converters 84 provide analog output signals to the actuators in response to output signals received from the processor 78. The analog output signals have a sufficient voltage level to energize the electric motors and move the robotic arm assembly. The decoder 82 correlates the addresses provided by the processor with a corresponding D/A converter, so that the correct motor(s) is driven. The address decoder 82 also provides an address for the input data from the A/D converter 76 so that the data is associated with the correct input channel.

The computer 20 has a phrase recognizer 86 connected to the microphone 40 and the processor 78. The phrase recognizer 86 digitizes voice commands provided by the user through the microphone 40. The voice commands are then processed to convert the spoken words into electronic form. The electronic words are typically generated by matching the user's speech with words stored within the computer 20. In the preferred embodiment, the recognizer 86 is an electronic board with accompanying software that is marketed by SCOTT INSTRUMENTS of Denton, Tex. under the trademark “Coretechs Technology”.

The electronic words are provided to the processor 78. The processor 78 compares a word, or a combination of words, to predefined robot commands that are stored within a library in the memory 80 of the computer 20. If a word, or combination of words, matches a word or combination of words in the library, the processor 78 provides output commands to the D/A converter 84 to move the robotic arm in accordance with the command.

FIG. 5 shows exemplary words and combinations of words that provide robot commands. A grammar process is performed to determine whether the voice commands satisfy certain conditions. The process contains a number of states that are advanced by the satisfaction of a condition. If the voice command provided by the user satisfies a first condition, then the process proceeds to the first state. If a condition of a next state is satisfied, then the process proceeds to the next corresponding state, and so on. For example, to prevent a robot command from being inadvertently spoken, it is desirable to predicate all voice commands with a qualifier. For example, the qualifier may be a name given to the robot such as “AESOP”. Therefore when the user provides a voice command, the process initially determines whether the spoken word is AESOP. If the spoken word is not AESOP then the process ends. The term “stop” may be an exception to this rule, wherein the computer will stop arm movement when the user provides a simple “stop” voice command.

If the spoken word is AESOP the process continues to state 1. The process next determines whether the user has spoken a word that satisfies a condition to advance to states 2-6. These words include “move”, “step”, “save”, “return”, “speed”, “track instrument” and “track head”. The track instrument command is for a system which has the ability to move an endoscope to automatically track the movement of a second instrument that is inserted into the patient. The track head command may enable the system so that the endoscope movement tracks the user's eyes. For example, if the user looks to the right of the image displayed by the monitor, the robot will move the endoscope to move the image in a rightward direction. The move and step commands induce movement of the scope in a desired direction. The save command saves the position of the endoscope within the memory of the computer. The return command will return the scope to a saved position.

From states 2-6 the process will determine whether the user has spoken words that meet the next condition, and so on. When a certain number of conditions have been met, the processor 78 will provide an output command to the D/A converter 84 in accordance with the voice commands. For example, if the user says “AESOP move left”, the processor 78 will provide output commands to move the endoscope 16, so that the image displayed by the monitor moves in a leftward direction. The microphone 40, phrase recognizer 86 and grammar process essentially provide the same input function as the foot pedal 50, multiplexer 74 and A/D converter 76.
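The grammar process described above amounts to a small state machine. The following is a minimal sketch, assuming a recognizer that delivers one lower-cased word at a time; the reduced command set, state names, and returned tuples are illustrative, not the patent's actual implementation.

```python
QUALIFIER = "aesop"
COMMANDS = {"move", "step", "save", "return", "speed"}   # advance to states 2-6
DIRECTIONS = {"up", "down", "left", "right", "in", "out"}

def grammar_process(words):
    """Scan a stream of recognized words and collect robot commands."""
    out, state, verb = [], "start", None
    for w in words:
        if w == "stop":                    # exception: "stop" needs no qualifier
            out.append(("stop",))
            state, verb = "start", None
        elif state == "start":
            if w == QUALIFIER:             # condition to reach state 1
                state = "qualified"
        elif state == "qualified":
            if w in ("save", "return"):    # simplified: treat as one-word commands
                out.append((w,))
                state = "start"
            elif w in COMMANDS:
                state, verb = "verb", w    # await a direction word
            else:
                state = "start"            # not a recognized command: discard
        elif state == "verb":
            if w in DIRECTIONS:
                out.append((verb, w))
            state, verb = "start", None
    return out

print(grammar_process("aesop move left".split()))  # [('move', 'left')]
print(grammar_process("move left".split()))        # [] (no qualifier spoken)
```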

The processor 78 can also provide the user with feedback regarding the recognized command through the speaker 42 or the monitor 18. For example, when the user states “AESOP move right”, after processing the speech, the processor 78 can provide an audio message through the speaker 42, or a visual message on the monitor 18, “AESOP move right”. Additionally, the processor 78 can provide messages regarding system errors, or the present state of the system such as “speed is set for slow”.

Referring to FIG. 6, the processor 78 typically computes the movement of the robotic arm assembly 14 in accordance with the following equations.

$$a_3 = \pi - \cos^{-1}\!\left(\frac{x^2 + y^2 - L_1^2 - L_2^2}{-2\,L_1 L_2}\right)$$

$$\Delta = \cos^{-1}\!\left(\frac{x^2 + y^2 + L_1^2 - L_2^2}{2\,L_1\sqrt{x^2 + y^2}}\right)$$

$$a_0 = \operatorname{atan2}(y, x)$$

$$a_2 = a_0 \pm \Delta \tag{1}$$

where;

a2=angle between the second linkage arm 36 and the x axis.

a3=angle between the first linkage arm 28 and the longitudinal axis of the second linkage arm 36.

L1=length of the second linkage arm.

L2=length of the first linkage arm.

x=x coordinate of the end effector in the first coordinate system.

y=y coordinate of the end effector in the first coordinate system.

To move the end effector to a new location in the x-y plane, the processor 78 computes the change in the angles a2 and a3 and then provides output signals to move the actuators accordingly. The original angular position of the end effector is provided to the processor 78 by the position sensors. The processor moves the linkage arms through an angle that corresponds to the difference between the new location and the original location of the end effector. A differential angle Δa2 corresponds to the amount of angular displacement provided by the second actuator 34, and a differential angle Δa3 corresponds to the amount of angular displacement provided by the first actuator 30.
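As a worked example of equation (1), the sketch below computes the joint angles for a target end-effector position; the link lengths, target coordinates, and the elbow_up flag (standing in for the ± choice in a2) are illustrative assumptions.

```python
import math

def arm_angles(x, y, L1, L2, elbow_up=True):
    """Inverse kinematics of equation (1) for the two linkage arms."""
    a3 = math.pi - math.acos((x*x + y*y - L1*L1 - L2*L2) / (-2.0 * L1 * L2))
    delta = math.acos((x*x + y*y + L1*L1 - L2*L2)
                      / (2.0 * L1 * math.sqrt(x*x + y*y)))
    a0 = math.atan2(y, x)
    a2 = a0 + delta if elbow_up else a0 - delta
    return a2, a3

# Differential angles for one move: da2 drives the second actuator 34,
# da3 drives the first actuator 30.
a2_old, a3_old = arm_angles(1.0, 0.8, L1=1.0, L2=1.0)
a2_new, a3_new = arm_angles(0.9, 0.9, L1=1.0, L2=1.0)
da2, da3 = a2_new - a2_old, a3_new - a3_old
```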

To improve the effectiveness of the system 10, the system is constructed so that the desired movement of the surgical instrument correlates to a direction relative to the image displayed by the monitor. Thus when the surgeon commands the scope to move up, the scope always appears to move in the up direction. To accomplish this result, the processor 78 converts the desired movement of the end of the endoscope in the third coordinate system to coordinates in the second coordinate system, and then converts the coordinates of the second coordinate system into the coordinates of the first coordinate system.

Referring to FIG. 2, the desired movement of the endoscope is converted from the third coordinate system to the second coordinate system by using the following transformation matrix:

$$\begin{pmatrix} \Delta x' \\ \Delta y' \\ \Delta z' \end{pmatrix} = \begin{pmatrix} \cos(a_6) & 0 & -\sin(a_6) \\ -\sin(a_5)\sin(a_6) & \cos(a_5) & -\sin(a_5)\cos(a_6) \\ \cos(a_5)\sin(a_6) & \sin(a_5) & \cos(a_5)\cos(a_6) \end{pmatrix} \begin{pmatrix} \Delta x'' \\ \Delta y'' \\ \Delta z'' \end{pmatrix} \tag{2}$$

where;

Δx″=the desired incremental movement of the scope along the x″ axis of the third coordinate system.

Δy″=the desired incremental movement of the scope along the y″ axis of the third coordinate system.

Δz″=the desired incremental movement of the scope along the z″ axis of the third coordinate system.

a5=the angle between the z′ axis and the scope in the y′-z′ plane.

a6=the angle between the z′ axis and the scope in the x′-z′ plane.

Δx′=the computed incremental movement of the scope along the x′ axis of the second coordinate system.

Δy′=the computed incremental movement of the scope along the y′ axis of the second coordinate system.

Δz′=the computed incremental movement of the scope along the z′ axis of the second coordinate system.

The angles a5 and a6 are provided by position sensors located on the end effector 32. The angles a5 and a6 are shown in FIG. 2.
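A direct transcription of equation (2) as code might look like the following sketch, assuming a5 and a6 are supplied in radians by the end-effector position sensors; displacements are plain tuples.

```python
import math

def third_to_second(d3, a5, a6):
    """Equation (2): rotate a desired (x'', y'', z'') scope-tip displacement
    into the second (end-effector) coordinate system."""
    dx, dy, dz = d3
    c5, s5, c6, s6 = math.cos(a5), math.sin(a5), math.cos(a6), math.sin(a6)
    return (c6 * dx - s6 * dz,
            -s5 * s6 * dx + c5 * dy - s5 * c6 * dz,
            c5 * s6 * dx + s5 * dy + c5 * c6 * dz)
```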

The desired movement of the endoscope is converted from the second coordinate system to the first coordinate system by using the following transformation matrix:

$$\begin{pmatrix} \Delta x \\ \Delta y \\ \Delta z \end{pmatrix} = \begin{pmatrix} \cos(\pi) & -\sin(\pi) & 0 \\ \sin(\pi) & \cos(\pi) & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} \Delta x' \\ \Delta y' \\ \Delta z' \end{pmatrix} \tag{3}$$

where;

Δx′=the computed incremental movement of the scope along the x′ axis of the second coordinate system.

Δy′=the computed incremental movement of the scope along the y′ axis of the second coordinate system.

Δz′=the computed incremental movement of the scope along the z′ axis of the second coordinate system.

π=the angle between the first linkage arm and the x axis of the first coordinate system.

Δx=the computed incremental movement of the scope along the x axis of the first coordinate system.

Δy=the computed incremental movement of the scope along the y axis of the first coordinate system.

Δz=the computed incremental movement of the scope along the z axis of the first coordinate system.

The incremental movements Δx and Δy are inserted into the algorithms described above for computing the angular movements (Δa2 and Δa3) of the robotic arm assembly to determine the amount of rotation that is to be provided by each electric motor. The value Δz is used to determine the amount of linear movement provided by the linear actuator 24.
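Chaining equations (2), (3) and (1) gives one movement cycle, sketched below. The patent labels the first-linkage angle π; it is renamed phi here to avoid colliding with the mathematical constant, and all numeric values are illustrative.

```python
import math

def second_to_first(d2, phi):
    """Equation (3): rotate a (x', y', z') displacement into the fixed
    first coordinate system; phi is the angle the patent labels 'pi'."""
    dx, dy, dz = d2
    c, s = math.cos(phi), math.sin(phi)
    return (c * dx - s * dy, s * dx + c * dy, dz)

# One full cycle, reusing third_to_second() from the previous sketch:
#   dx, dy, dz = second_to_first(third_to_second((0, 0, 5.0), a5, a6), phi)
# dx and dy then feed equation (1) for the differential angles da2 and da3,
# while dz sets the stroke of the linear actuator 24.
```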

The surgical instrument is typically coupled to a camera and a viewing screen so that any spinning of the instrument about its own longitudinal axis will result in a corresponding rotation of the image on the viewing screen. Rotation of the instrument and viewing image may disorient the viewer. It is therefore desirable to maintain the orientation of the viewing image. In the preferred embodiment, the end effector has a worm gear (not shown) which rotates the surgical instrument about the longitudinal axis of the instrument. To insure proper orientation of the endoscope 16, the worm gear rotates the instrument 16 about its longitudinal axis an amount Δθ6 to insure that the y″ axis is oriented in the most vertical direction within the fixed coordinate system. Δθ6 is computed from the following cross-products.

$$\Delta\theta_6 = z_i'' \cdot \left( y_o'' \times y_i'' \right)$$

where;

Δθ6=the angle that the instrument is to be rotated about the z″ axis.

yo″=the vector orientation of the y″ axis when the instrument is in the first position.

yi″=the vector orientation of the y″ axis when the instrument is in the second position.

zi″=the vector orientation of the z″ axis when the instrument is in the second position.

The vectors of the yi″ and zi″ axes are computed with the following algorithms.

$$z_i'' = \begin{pmatrix} \cos a_6 & 0 & -\sin a_6 \\ -\sin a_5 \sin a_6 & \cos a_5 & -\sin a_5 \cos a_6 \\ \cos a_5 \sin a_6 & \sin a_5 & \cos a_5 \cos a_6 \end{pmatrix} \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$$

$$x_i'' = z \times z_i''$$

$$y_i'' = z_i'' \times x_i''$$

where;

a5=the angle between the instrument and the z axis in the y-z plane.

a6=the angle between the instrument and the z axis in the x-z plane.

z=the unit vector of the z axis in the first coordinate system.

The angles a5 and a6 are provided by position sensors. The vector yo″ is computed using the angles a5 and a6 of the instrument in the original or first position. For the computation of yi″ the angles a5 and a6 of the second position are used in the transformation matrix. After each arm movement yo″ is set to yi″ and a new yi″ vector and corresponding Δθ6 angle are computed and used to re-orient the endoscope. Using the above described algorithms, the worm gear continuously rotates the instrument about its longitudinal axis to insure that the pivotal movement of the endoscope does not cause a corresponding rotation of the viewing image.
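A sketch of this orientation bookkeeping follows, assuming the reconstructed formula above reads Δθ6 = zi″ · (yo″ × yi″); the helper functions and the small-angle interpretation of the result are illustrative assumptions.

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def orientation_step(a5, a6, yo):
    """Return (d_theta6, yi) for one arm movement; the caller then sets
    yo = yi before the next movement."""
    c5, s5, c6, s6 = math.cos(a5), math.sin(a5), math.cos(a6), math.sin(a6)
    zi = (-s6, -s5 * c6, c5 * c6)      # transformation matrix applied to (0, 0, 1)
    xi = cross((0.0, 0.0, 1.0), zi)    # xi'' = z x zi''
    yi = cross(zi, xi)                 # yi'' = zi'' x xi''
    return dot(zi, cross(yo, yi)), yi  # signed rotation about the z'' axis
```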

The system may have a memory feature to store desired instrument positions within the patient. The memory feature may be enabled either by voice commands or through a button on an input device such as the foot pedal. When a save command is spoken, the coordinates of the end effector in the first coordinate system are saved at dedicated address(es) of the computer memory. When a return command is spoken, the processor retrieves the data stored in memory and moves the end effector to the coordinates of the effector when the save command was enabled.

The memory feature allows the operator to store the coordinates of the end effector in a first position, move the end effector to a second position and then return to the first position with a simple command. By way of example, the surgeon may take a wide eye view of the patient from a predetermined location and store the coordinates of that location in memory. Subsequently, the surgeon may manipulate the endoscope to enter cavities, etc. which provide a more narrow view. The surgeon can rapidly move back to the wide eye view by merely stating “AESOP return to one”.
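A minimal sketch of the save/return bookkeeping, assuming named memory slots and a move_to callback, both of which are illustrative:

```python
saved = {}

def save_position(slot, xyz):
    """Record the end effector coordinates, e.g. on "AESOP save one"."""
    saved[slot] = tuple(xyz)

def return_to(slot, move_to):
    """Drive the arm back to a stored position, e.g. on "AESOP return to one"."""
    if slot in saved:
        move_to(saved[slot])
```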

In operation, the user provides spoken words to the microphone. The phrase recognizer 86 matches the user's speech with stored words and provides matched electronic words to the processor 78. The processor performs a grammar process to determine whether the spoken words are robot commands. If the words are commands, the computer energizes the actuators and moves the endoscope accordingly. The system also allows the user to control the movement of the endoscope with a foot pedal if voice commands are not desired.

While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.

Classifications
U.S. Classification: 706/14, 704/200, 704/E15.045
International Classification: G06F19/00, G06F15/00, G10L11/00, A61B17/00, A61B19/00, G10L15/26, G06F15/18
Cooperative Classification: A61B19/22, A61B19/50, G10L15/265, G06F19/3406, A61B2017/00203
European Classification: G06F19/34A, A61B19/22, G10L15/26A
Legal events
Date: 19 Dec 2011; Code: FPAY; Event: Fee payment; Description: Year of fee payment: 4
Date: 18 Nov 2004; Code: AS; Event: Assignment; Description: Owner name: INTUITIVE SURGICAL, INC., DELAWARE; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: COMPUTER MOTION, INC.; REEL/FRAME: 015384/0491; Effective date: 20041115