WO2013176760A1 - Graphical user interfaces including touchpad driving interfaces for telemedicine devices


Info

Publication number
WO2013176760A1
Authority
WO
WIPO (PCT)
Prior art keywords
storage medium
readable storage
transitory computer
video feed
live video
Prior art date
Application number
PCT/US2013/031743
Other languages
French (fr)
Inventor
Charles S. Jordan
Andy YOUNG
Mei Sheng NG
Yair LURIE
Fuji Lai
Timothy C. Wright
John Cody Herzog
Blair Whitney
Bill RIZZI
James Ballantyne
Yulun Wang
Cheuk Wah WONG
Justin Kearns
Orjeta Taka
Ramchandra KARANDIKAR
Original Assignee
Intouch Technologies, Inc.
iRobot Corporation
Priority date
Filing date
Publication date
Application filed by Intouch Technologies, Inc. and iRobot Corporation
Priority to EP13793865.0A (EP2852881A4)
Publication of WO2013176760A1
Priority to US14/550,750 (US9361021B2)
Priority to US15/154,518 (US10061896B2)
Priority to US16/045,608 (US10658083B2)
Priority to US15/931,451 (US10892052B2)
Priority to US17/146,306 (US11515049B2)
Priority to US17/992,074 (US11756694B2)
Priority to US18/229,570 (US20230377761A1)

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/0088 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1674 Programme controls characterised by safety, monitoring, diagnostic
    • B25J 9/1676 Avoiding collision or forbidden zones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06Q 50/22 Social work
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H 40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10 TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S 901/00 Robots
    • Y10S 901/01 Mobile robot
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10 TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S 901/00 Robots
    • Y10S 901/46 Sensing device
    • Y10S 901/47 Optical
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10 TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S 901/00 Robots
    • Y10S 901/49 Protective device

Definitions

  • This disclosure relates to interactive and display interfaces for remote presence devices.
  • This disclosure provides various graphical user interfaces and interactive interfaces for remote presence devices.
  • FIG. 1 illustrates an embodiment of a home page of a portable electronic device (PED), including an application providing a remote presence interface (RPI) for interacting with a telepresence device.
  • PED: portable electronic device
  • RPI: remote presence interface
  • FIG. 2A illustrates an embodiment of an initial launch page for an RPI or other application associated with a telepresence device.
  • FIG. 2B illustrates an embodiment of an initial launch page for an RPI or other application associated with a telepresence device intended to induce a user to orient a PED in a portrait orientation.
  • FIG. 2C illustrates a front-facing camera located near the top of a PED in a portrait orientation capturing an image of a healthcare practitioner.
  • FIGS. 3A and 3B illustrate embodiments of a login page for the RPI.
  • FIG. 4 illustrates an embodiment of an endpoint list of various telepresence devices.
  • FIG. 5 illustrates a connection wizard configured to facilitate the connection of a user to a telepresence network.
  • FIG. 6 illustrates an exemplary embodiment of a module within the RPI configured to provide visual and interactive control of a telepresence device.
  • FIG. 7 illustrates an embodiment of a map and navigation module of the RPI.
  • FIG. 8 illustrates an example of a notification that may be displayed to a user via the RPI.
  • FIG. 9 illustrates an embodiment of a media manager module of the RPI configured to allow a user to capture and manage audiovisual media from the telepresence device.
  • FIG. 10 illustrates an embodiment of a menu of the RPI allowing a user to modify various settings on the local PED and/or the remote telepresence device.
  • FIG. 11 illustrates an embodiment of the RPI in full-screen video mode.
  • FIG. 12 illustrates an embodiment of the RPI in full-screen data mode.
  • FIG. 13 illustrates an embodiment of the RPI including a visual
  • FIGS. 14A-C illustrate exemplary embodiments of a telepresence device.
  • FIG. 15 illustrates an RPI configured to provide patient visualization, remote settings control, navigation control, and access to patient data integration.
  • FIG. 16 illustrates a navigation module within the RPI configured to allow a user to navigate a telepresence device using one or more navigational techniques.
  • FIG. 17 illustrates the RPI displaying a video of a patient, associated telemetry data, and associated records.
  • FIGS. 18A and 18B illustrate embodiments of what may be displayed during a consult on a telepresence device and via an RPI on a PED, respectively.
  • FIG. 19 illustrates an embodiment of a telepresence device in a
  • FIG. 20 illustrates various embodiments of avatars and/or personalities that may be assigned or used by a medical practitioner and/or telepresence device.
  • FIG. 21 illustrates an RPI configured to utilize various pointer-based navigational modes for navigating a telepresence device.
  • FIG. 22 illustrates an RPI configured to utilize destination-based navigational modes for navigating a telepresence device.
  • FIG. 23 illustrates a selectable destination patient list that can be used within an RPI to navigate a telepresence device.
  • FIG. 24 illustrates an RPI configured to utilize map-based navigational modes for navigating a telepresence device.
  • FIG. 25 illustrates a full-screen view of a hallway from a telepresence device as visualized on a PED via an RPI.
  • FIG. 26 illustrates a full-screen view of various doorways in a healthcare facility, as visualized on a PED via an RPI.
  • FIG. 27A illustrates an intended or directed navigation path of a telepresence device as visualized on a PED via an RPI.
  • FIG. 27B illustrates a mouse-based driving interface for navigating a telepresence device via an RPI.
  • FIG. 27C illustrates the mouse-based driving interface with a long drive vector drawn on the video feed to indicate a direction and a relatively fast velocity.
  • FIG. 27D illustrates the mouse-based driving interface with a short drive vector drawn on the video feed to indicate a direction and a relatively slow velocity.
  • FIG. 27E illustrates the mouse-based driving interface with a vector for rounding a corner within the RPI.
  • FIG. 27F illustrates the mouse-based driving interface with a vector drawn to cause the telepresence device to spin in place.
  • FIG. 27G illustrates the mouse-based driving interface with a vector drawn towards an object.
  • FIG. 27H illustrates the mouse-based driving interface used to reverse the telepresence device with a camera of the telepresence device oriented in reverse and slightly downward.
  • FIG. 28 illustrates a bedside view of a patient bed and an associated patient monitor, as visualized on a PED via an RPI.
  • FIG. 29 illustrates a click-to-zoom feature of an RPI that can be used when medical practitioners and/or other users visualize a patient on a PED via an RPI.
  • FIG. 30 illustrates a StrokeRESPOND application for accessing a telepresence device that may be a separate application or integrated within an RPI.
  • FIG. 31 illustrates a transparent user image overlaid on an image generated by a telepresence device.
  • FIG. 32 illustrates a toolbar for managing modules and control operations available via an RPI, while simultaneously displaying a video feed from a telepresence device.
  • FIG. 33 illustrates a toolbar associated with a media management module of an RPI.
  • FIG. 34 illustrates a toolbar separating an endpoint list and a patient list in an RPI, allowing for quick user selection of various possible permutations.
  • FIG. 35 illustrates a view of a touch pad control pane for navigating a telepresence device while displaying a video feed from a telepresence device in an upper window.
  • FIG. 36 illustrates an avatar display of telepresence devices in a lower window and a video feed from a telepresence device in an upper window.
  • FIG. 37 illustrates a visualization of a telepresence device overlaid on a video feed from a telepresence device that may be useful for navigation of the telepresence device.
  • FIG. 38 illustrates a toolbar of an RPI associated with a settings manager for managing settings on a local PED and/or a telepresence device.
  • FIG. 39 illustrates a navigational mode including a landing strip navigational panel allowing a user to specify a direction of a telepresence device.
  • FIG. 40 illustrates a full-screen video feed from a telepresence device, including an overlaid toolbar.
  • FIG. 41 illustrates a joystick-style control on a touch interface of a PED for controlling a telepresence device via an RPI.
  • FIG. 42 illustrates a dual joystick-style control on a touch interface of a PED for controlling a telepresence device via an RPI.
  • FIG. 43 illustrates a state diagram for an RPI for use on a PED.
  • FIG. 44 illustrates a full-screen video feed from a telepresence device, including an overlaid toolbar and joystick control.
  • FIG. 45 illustrates an exemplary toolbar of icons that may be overlaid within an RPI.
  • FIG. 46 illustrates an overlaid instructional panel associated with driving a telepresence device via an RPI on a PED.
  • FIG. 47 illustrates a multi-participant telepresence session conducted via an RPI on a PED.
  • FIG. 48 illustrates a window accessible via an RPI on a PED providing access to a care team of a particular patient.
  • FIG. 49 illustrates an exemplary overlay help screen accessible within the RPI on a PED to provide instructions regarding available functions on any given screen.
  • Healthcare facilities may include telemedicine technologies, such as telepresence devices in a telepresence network, that allow remote healthcare practitioners to provide services to patients and/or other healthcare practitioners in remote locations.
  • a remote healthcare practitioner may be a neurologist practicing in a relatively large hospital who may, via a telepresence device, provide services and consultations to patients and/or other medical professionals in a relatively small hospital that may otherwise not have a neurologist on staff.
  • a telepresence device such as an autonomous or semi-autonomous robot, may communicate with an interfacing device via a communications network.
  • PED: portable electronic device
  • RPI: remote presence interface
  • an RPI application may be executed on a stand-alone PED dedicated solely to running the RPI application.
  • the RPI application may be executed by any of a wide variety of multi-purpose PEDs, such as the Apple iPad®.
  • a user may launch an RPI application and log in using login credentials.
  • the login process may utilize any of a wide variety of encryption algorithms and data protection systems. In various embodiments, the login process may meet or exceed standards specified by the Health Insurance Portability and Accountability Act (HIPAA).
  • a user may select one or more endpoint telepresence devices via the RPI.
  • the RPI may display a video feed from the telepresence device on the PED.
  • the RPI may include any number of navigational panels, setting controls, telemetry data displays, map views, and/or patient information displays.
  • the RPI may allow a user to manually, semi-autonomously, or autonomously control the movement of the telepresence device.
  • the RPI may allow a user to specify movement (i.e., a location within a healthcare facility or a physical movement, such as a head turn, of the telepresence device) using a destination selection panel, an arrow, a physical or virtual joystick, a touch pad, click-to-destination, vector-based driving, mouse-based vector driving, and/or other navigational control.
  • a user may provide a navigation input by selecting a location within a live video feed (e.g., via a click or a touch).
  • the navigation input may be used to transmit navigation instructions to a remote telepresence device.
  • in a click drive mode, the selection of a location within the live video feed may be used to navigate the telepresence device to the selected location.
  • the selection of a location on a floor or hallway may result in navigation instructions that cause the telepresence robot to autonomously or semi-autonomously navigate to the selected location.
  • an operator may input a navigation input in the form of a navigation vector provided (e.g., drawn, traced, mapped) with respect to the live video feed.
  • the navigation vector may include a length on the display and/or other input device (e.g., touchpad) and an angle with respect to the plane of the display and/or other input device.
  • a plane of the display and/or other input device may be described with a Cartesian coordinate system as having an x-axis and a y-axis.
  • the navigation vector may be provided with respect to the display plane and described as having an x-axis component (horizontal direction) and a y-axis component (vertical direction).
  • a navigation input provided as a navigation vector may be decomposed into a horizontal (x-axis) component and a vertical (y-axis) component.
  • the length and sign (i.e., positive or negative value) of the horizontal component of the navigation vector may be used to determine a magnitude and direction of a rotational velocity and/or an angular displacement of a telepresence device.
  • the length of the vertical component of the navigation vector may be used to determine the magnitude of a forward velocity and/or a forward displacement of a telepresence device.
  • the length of the horizontal component may be used to determine the magnitude of the rotational velocity and/or angular displacement using a scaling function.
  • the scaling function may be constant, linear, and/or non-linear. Thus, using a non-linear scaling function, a first horizontal component twice as long as a second horizontal component may not result in a first rotational velocity double that of the second rotational velocity.
  • the length of the vertical component may be used to determine the magnitude of the forward velocity and/or forward displacement using a scaling function.
  • the scaling function may be constant, linear, and/or non-linear.
  • the scaling function used to translate the horizontal component may be different than the scaling function used to translate the vertical component.
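The decomposition and scaling described above can be sketched as follows; the function names, gains, and caps are illustrative assumptions, not part of the disclosure:

```python
def scale_linear(length, gain=0.5, cap=1.0):
    # Linear scaling: output proportional to the component length, capped.
    return max(-cap, min(cap, gain * length))

def scale_nonlinear(length, gain=0.02, cap=1.0):
    # Non-linear (quadratic) scaling: doubling the component length
    # quadruples the raw output, so a component twice as long does not
    # simply double the resulting velocity.
    return max(-cap, min(cap, gain * length * abs(length)))

def vector_to_velocities(dx, dy):
    """Map a navigation vector drawn on the display plane (in pixels) to
    device velocities: the horizontal (x-axis) component determines the
    rotational velocity and the vertical (y-axis) component determines
    the forward velocity, each through its own scaling function."""
    rotational = scale_nonlinear(dx)      # sign of dx sets turn direction
    forward = scale_linear(max(0.0, dy))  # only forward motion from dy
    return forward, rotational
```

Using different scaling functions for the two axes, as the text allows, lets fine rotation remain precise while long vertical drags still saturate at a safe maximum forward velocity.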
  • the selection of a location within the live video feed may be used to generate a navigation vector, where the length of the vector corresponds to the velocity at which the telepresence device should navigate and/or the distance the telepresence device should navigate, and the angle of the vector corresponds to the direction the telepresence device should navigate.
  • a navigation input may be in the form of a vector drawn as either a final endpoint selection or as a vector including a beginning point and an end point. The end point of the vector may represent the location to which the telepresence device should navigate (i.e., a desired navigation point).
  • Navigation instructions may be generated based on the current location of the telepresence device and the endpoint of the vector within the live video feed.
  • the vector's image pixel endpoint may be mapped via a lookup table to one or more translate and rotate commands to cause the telepresence device to navigate to the selected location.
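A minimal sketch of such a lookup, assuming a coarse hand-built grid rather than a calibrated camera model (the grid resolution and command values are purely illustrative):

```python
# Hypothetical lookup table: the vector's endpoint pixel is quantized
# into a 3x2 grid over the video frame, and each cell maps to a
# (rotate_degrees, translate_meters) command pair. A real table would be
# derived from the camera's projection of the floor plane.
COMMAND_TABLE = {
    (0, 0): (-20.0, 3.0),  # top-left: rotate left, longer drive
    (1, 0): (0.0, 3.0),    # top-middle: drive straight, farther away
    (2, 0): (20.0, 3.0),   # top-right: rotate right, longer drive
    (0, 1): (-30.0, 1.0),  # bottom-left: rotate left, short drive
    (1, 1): (0.0, 1.0),    # bottom-middle: drive straight, nearby
    (2, 1): (30.0, 1.0),   # bottom-right: rotate right, short drive
}

def endpoint_to_commands(px, py, width=640, height=480):
    # Quantize the endpoint pixel into a grid cell; image y-coordinates
    # grow downward, so the top of the frame (small py) corresponds to
    # floor locations farther from the device.
    col = min(2, int(px * 3 / width))
    row = min(1, int(py * 2 / height))
    return COMMAND_TABLE[(col, row)]
```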
  • the length of the vector may correspond to the desired distance to move the telepresence device, a desired acceleration of the telepresence device, and/or a desired velocity of the telepresence device.
  • a navigation input may be received that directs a telepresence device to navigate forward (with respect to the live video feed) 3 meters and to the right 2 meters.
  • Any of a wide variety of navigation instructions may be possible to correctly navigate the telepresence device.
  • the navigation instructions may direct the telepresence device to navigate approximately 3.6 meters at a 34 degree angle relative to the video feed.
  • the navigation instructions may direct the telepresence device to navigate 3 meters forward and then 2 meters to the right.
  • the navigation input may be translated into navigation instructions as a plurality of drive forward commands coupled with rotate commands.
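For the 3-meter/2-meter example above, the single rotate-then-drive form of the navigation instructions can be computed as follows (the function name is an illustrative assumption):

```python
import math

def offset_to_instructions(forward_m, right_m):
    """Convert a navigation input given as forward/right offsets
    (relative to the live video feed) into one rotate-then-drive pair:
    a heading angle in degrees and a straight-line distance in meters."""
    distance = math.hypot(forward_m, right_m)
    heading_deg = math.degrees(math.atan2(right_m, forward_m))
    return distance, heading_deg

# 3 meters forward and 2 meters to the right:
distance, heading = offset_to_instructions(3.0, 2.0)
# distance is approximately 3.6 m at a heading of approximately
# 34 degrees relative to the video feed, matching the example above.
```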
  • the navigation instructions are derived from a navigation input indicating a desired direction and/or location and the current location of the telepresence device.
  • the live video feed may be delayed slightly due to network and/or processing limitations. For example, a live video feed may be delayed by a few tenths of a second or even by a few seconds. This video latency may result in inaccurate navigation instructions. Accordingly, the navigation input may be adjusted based on the known video latency and the velocity and direction of the robot.
  • the telepresence device may have already traveled 1 of the desired 3 meters by the time the selection is made. Accordingly, the selection of a location 3 meters ahead and 2 meters to the right may be translated or mapped to navigation instructions that cause the telepresence device to travel 2 meters forward and 2 meters to the right (or approximately 2.8 meters at a 45 degree angle) to compensate for the movement of the telepresence device.
  • the navigation instructions may be adjusted based on the latency of the video feed.
  • a navigation input in the form of a vector may be translated into navigation instructions through the use of a lookup table.
  • the lookup table may include values that compensate for latency. That is, the navigation instructions returned by the lookup table may be based on the navigation input and the current or recent average video latency.
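The latency adjustment described above can be sketched directly as well; the straight-forward-motion model and the function name are assumptions, not from the disclosure:

```python
def compensate_for_latency(forward_m, right_m, velocity_mps, latency_s):
    """Shorten the forward component of a navigation input by the
    distance the device has already traveled during the video latency
    (assuming roughly straight forward motion); the adjusted forward
    distance never goes below zero."""
    traveled = velocity_mps * latency_s
    return max(0.0, forward_m - traveled), right_m

# As in the text's example: 1 meter already covered out of a 3-meter,
# 2-meters-right selection leaves 2 meters forward and 2 to the right.
adjusted = compensate_for_latency(3.0, 2.0, velocity_mps=1.0, latency_s=1.0)
```

A latency-aware lookup table could precompute this same subtraction for each (navigation input, average latency) pair rather than evaluating it per selection.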
  • the RPI may provide various notifications associated with the network connection, the PED, a patient, a healthcare facility, a healthcare practitioner, a telepresence device, and/or the like.
  • the RPI may include a media management module configured to allow a user to record and/or store audio and/or visual data for subsequent use.
  • a settings panel may allow settings on the PED and/or the telepresence device to be adjusted. In some views, multiple windows may provide quick access to various panels of information.
  • one or more panels associated with a video feed, patient data, calendars, date, time, telemetry data, PED data, telepresence device data, healthcare facility information, healthcare practitioner information, menu tabs, settings controls, and/or other features may be displayed simultaneously and/or individually in a full-screen mode.
  • the RPI may utilize a camera of the PED to capture an image of the user of the PED and project the image on a screen on the telepresence device.
  • the image on the screen of the telepresence device may be modified and/or enhanced.
  • an avatar representing the user of the PED is displayed on the PED.
  • the RPI may encourage or induce a user to utilize the PED with a front-facing camera at or above eye-level.
  • the RPI may encourage or induce a user to utilize the PED with a front-facing camera approximately centered on a user's face. For instance, on an Apple iPad®, the RPI may encourage a user to utilize the iPad® in a portrait mode for many tasks in order to maintain a more natural perspective of the user for projection via the screen on the telepresence device.
  • the RPI may facilitate conference sessions with more than two people interacting via a combination of PEDs and/or telepresence devices.
  • multiple healthcare practitioners may participate in a remote consultation.
  • each healthcare practitioner may utilize an RPI on a PED to access a telepresence device at a bedside of a patient.
  • the remote presence system via the network, servers, RPIs, and/or telepresence devices, may facilitate the multi-user experience.
  • the RPI may incorporate sub-applications and/or provide access to related applications, such as a StrokeRESPOND application configured to provide one or more functions and/or workflow processes described in U.S. Patent
  • the display of a PED may be utilized by the RPI to display any combination of video feed panels, informational panels, data panels, setting control panels, navigation panels, and/or panels providing access to any of various functions made accessible via the RPI.
  • the RPI may be configured to maintain a "stateful" connection at the application layer, such that a session and/or variables may be continued and/or maintained in the event that the connection is lost or dropped.
  • the RPI application may attempt to re-establish a disconnected session using saved or logged variables in the event a connection is lost or dropped.
  • the PED and/or RPI application may have settings that enable a user to maximize frame rate or image quality at the expense of the battery life of the device, or vice versa.
  • the RPI may include an information bar ("status bar") that displays various status information related to the PED, including battery life, wireless connection strength, wireless connection name or SSID, or the current time.
  • the RPI may include one or more toolbars.
  • a toolbar may be disposed along the top edge of the upper pane of the RPI.
  • the toolbar may be manually hidden by touching or selecting an icon or a specified area of the display, or the toolbar may auto-hide after a period of time. Once hidden, the user may un-hide the toolbar by touching, selecting, or otherwise moving an input device on or around an icon or specified area of the display.
  • the RPI may include a "Picture-in-Picture" region or window that displays local video or image data currently being captured by the camera of the PED.
  • the local video feed may be captured from a camera either incorporated within or otherwise in communication with the PED.
  • the user may resize the local video window, reposition it within the display, and/or remove it.
  • the local video window may be displayed in a lower pane of the GUI, while the remote video is displayed in the upper pane, or vice versa.
  • the RPI may include an in-video interface that enables the user to control the endpoint device by interacting with the live video feed. For example, when the user touches, taps, clicks, or otherwise selects a point in the live video feed, the endpoint may change a mechanical pan or tilt, or digital pan or tilt, or both, such that the point selected by the user is centered in the live video feed.
  • the user may also adjust an optical or digital zoom, or both, of the live video feed. For example, a user may adjust an optical and/or digital zoom by pinching together or spreading apart two or more fingers on the surface of a touch sensitive display of the PED.
  • the user may control all or some of a mechanical or digital pan, tilt, or zoom of the live video feed by pressing and dragging a finger on the surface of the display to specify a diagonal or other dimension of a zoom region.
  • the endpoint may mechanically or digitally pan, tilt, or zoom to match a dimension of the zoom region to a dimension of the live video feed, and/or to match a center of the zoom region to a center of the live video feed.
  • a user may zoom out to a default zoom (which may be fully zoomed out) by performing a double-tap in the live video feed, or by double- clicking or right-clicking a mouse cursor within the live video feed.
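A simple model of the click-to-center behavior above, assuming a linear pixel-to-angle mapping over the camera's field of view (a real endpoint would use its camera calibration):

```python
def click_to_pan_tilt(px, py, width, height, hfov_deg, vfov_deg):
    """Map a selected point in the live video feed to the pan/tilt
    offsets (in degrees) needed to center that point, given the frame
    size in pixels and the camera's horizontal and vertical fields of
    view. Image y grows downward, so points above center tilt up."""
    pan = (px - width / 2) / width * hfov_deg     # positive: pan right
    tilt = (height / 2 - py) / height * vfov_deg  # positive: tilt up
    return pan, tilt

# Selecting the exact center of a 640x480 frame requires no movement;
# selecting the right edge pans by half the horizontal field of view.
```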
  • the user may direct movement of a telemedicine device (or other telepresence device) to a particular location within the live video feed by performing a "touch-hold-tap," where the user touches a location on the screen, holds his or her finger on that location for a brief interval until a cursor appears on the screen under the user's finger, positions the cursor (which now follows the user's finger) at a point in the remote video window representing the desired destination, and subsequently taps his or her finger once again to confirm the location as the desired destination.
  • the telepresence device may then proceed to a position in the live video feed corresponding to the location selected by the user.
  • Additional functionalities available via the RPI through a touch screen interface may include a two-finger swipe to display video from one or more auxiliary video sources, such as a video camera, still camera, endoscope, ultrasound device, radiological imaging device, magnetic resonance imaging device, or other medical imaging device.
  • a toolbar may be shrunken and/or expanded (vertically or horizontally, or both), or hidden and unhidden, by touching or tapping an icon disposed at the top of the screen.
  • an icon may have the appearance of an arrow or triangle that points "up" when the toolbar is fully expanded or unhidden, and may point "down" when the toolbar is shrunken or hidden.
  • the icon itself may shrink or expand with the toolbar.
  • the toolbar may be unhidden by the user "pulling down" or making a downward swipe from the top (or other edge) of the screen.
  • the toolbar may again be hidden by "pushing up" or making a swipe toward the top (or other edge) of the screen.
  • Additional functions or applications may be accessed from the toolbar.
  • Each function or application may have a distinctive icon in the toolbar indicative of the underlying functionality. If there are more functions/icons available than can be displayed on the toolbar, the toolbar may be configured to scroll icons onto and off of the toolbar with a swipe of the user's finger in the toolbar. The icons may scroll continuously in a carousel fashion, or the scrolling may be disabled in a particular direction when there are no further icons to be displayed, thus informing the user of this fact. Alternatively, the toolbar may not scroll available icons, but instead show only a specified set of icons. In this case, additional functions or applications would be exposed to the user via additional menu levels, windows, or pop-overs, accessible via one or more icons contained in the toolbar or elsewhere in the RPI.
  • the toolbar may provide access to various functions of the RPI such as activating or deactivating a headset or handset (which may be coupled to the telepresence device, in either a wired or wireless fashion) when additional privacy is desired in communicating with the remote user (i.e., the user at the control station/PED); activating or deactivating a stethoscope for monitoring a patient;
  • destinations (e.g., people's names associated with a device or location)
  • the user may initiate a vector drive mode for controlling a telepresence device.
  • a mouse click, touch-drag motion, or other action may be used to "draw" or otherwise input a vector for controlling the motion of a telepresence device.
  • a user may touch two fingers in the lower pane and drag them a desired distance and direction on the screen to send a drive command with a respective velocity and heading to the mobile telepresence device.
  • the interface may further include a mechanism for deactivating an obstacle detection/obstacle avoidance (ODOA) system, such that the user may operate in a "nudge mode" and continue to slowly move the telepresence device despite the telepresence device being in contact with, or in close proximity to, another object.
  • This mechanism may automatically time-out such that the ODOA system is reengaged after a period of inactivity.
  • ODOA: obstacle detection/obstacle avoidance
  • Placing the telepresence device in nudge mode may cause a camera of the telepresence device to be automatically positioned to look down around the body of the telepresence device and either in the direction of the drive command issued by the user or in the direction of an object closest to or in contact with the telepresence device (so that the user may see what obstacle he or she is commanding the telepresence device to nudge).
  • the ODOA system may be entirely deactivated. In one embodiment, the ODOA system may be deactivated so long as the telepresence device is driven or moved below a specified velocity.
  • the RPI may be configured for deployment on any of a wide variety of PEDs, including the Apple iPad®, iPod®, and iPhone®.
  • the RPI may be configured for deployment on any of a wide variety of PEDs, such as mobile phones, computers, laptops, tablets, and/or any other mobile or stationary computing device.
  • the RPI may be presented via a plug-in or in-browser application within a standard web browser.
  • the RPI may allow for varying feature sets depending on the type of PED utilized. For example, on a larger tablet-sized PED, the RPI may include any combination of the numerous features described herein, while on a smaller PED, such as an Apple iPhone®, the available features may be limited to suit a particular context or use-case.
  • the RPI may allow a user, such as nurse or hospital staffer, to control the movement of a telepresence device without establishing a telepresence audio/video session.
  • a remote family member of a patient may conduct a two-way voice and video telepresence session with his or her iPhone, but may not have permission to drive or otherwise control the movement of the telepresence device. Any number of features or combination of features may be included and/or excluded for a particular PED.
  • a nurse may utilize an RPI on a PED with limited functionality, such as an Apple iPhone®, to request a cardiac consult for a patient.
  • a telepresence system in communication with the RPI submits the request to a telepresence device.
  • the telepresence device may begin navigating to the patient while simultaneously initiating a connection with a cardiac doctor.
  • the telepresence device may call an appropriate cardiac doctor (e.g., the cardiologist on call, the nearest cardiologist, the patient's specific cardiologist, etc.) on one or more PEDs belonging to the doctor.
  • the nurse may be allowed to control the telepresence device and/or participate in an audio-only telepresence session, but may be provided a limited feature set. Accordingly, the nurse may be able to communicate with the doctor as the telepresence device navigates to the patient.
  • a nurse or a patient may request a doctor (or a specific type of doctor) via an RPI or via a display interface directly on a telepresence device.
  • the RPI, the telepresence device, and/or a corresponding telepresence system may intelligently call a specific doctor as described herein.
  • the RPI, the telepresence device, and/or a corresponding telepresence system may call a plurality of doctors.
  • the first doctor to attend may be connected via an RPI to a telepresence device to service the request of the nurse or patient.
  • the call routing and telepresence session may be managed in a network cloud, a telepresence network, and/or via some other suitable computing device.
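The first-to-attend routing described above can be sketched simply: every eligible doctor is called, and whichever doctor accepts earliest is connected to the telepresence device. The data shapes below (a mapping of doctor to acceptance time) are an illustrative assumption, not part of the disclosure:

```python
def connect_first_responder(answers):
    """Given a mapping of doctor -> acceptance time (None = no answer),
    return the doctor who accepted first; that doctor would be connected
    via an RPI to the telepresence device and the other calls cancelled."""
    responders = [(t, doctor) for doctor, t in answers.items() if t is not None]
    if not responders:
        return None
    return min(responders)[1]  # earliest acceptance wins
```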
  • reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
  • the appearances of the phrases “in one embodiment” and “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment.
  • an “embodiment” may be a system, an article of manufacture (such as a computer-readable storage medium), a method, and/or a product of a process.
  • the phrases “connected to” and “in communication with” refer to any form of interaction between two or more entities, including mechanical, electrical, magnetic, and electromagnetic interaction. Two components may be connected to each other even though they are not in direct contact with each other and even though there may be intermediary devices between the two components.
  • Types of telepresence devices include, but are not limited to, remote telepresence devices, mobile telepresence units, and/or control stations.
  • a remote telepresence device may include a telepresence robot configured to move within a medical facility and provide a means for a remote practitioner to perform remote consultations.
  • telepresence devices may comprise any of a wide variety of endpoint devices, such as those described in U.S. Patent Application No. 13/360,579 filed on January 27, 2012, titled “INTERFACING WITH A MOBILE TELEPRESENCE ROBOT," which application is hereby incorporated by reference in its entirety.
  • Telepresence devices may also comprise any of the endpoint devices described in U.S. Patent Application No. 13/360,590 filed on January 27, 2012, titled "INTERFACING WITH A MOBILE TELEPRESENCE ROBOT."
  • a "portable electronic device” may include any of a wide variety of electronic devices. Specifically contemplated and illustrated are tablet-style electronic devices, including, but not limited to, electronic readers, tablet computers, tablet PCs, cellular phones, interactive displays, video displays, touch screens, touch computers, and the like. Examples of PEDs include the Apple iPad®, iPod®, and iPhone®, the Hewlett Packard Slate®, the Blackberry Playbook®, the Acer Iconia Tab®, the Samsung Galaxy®, the LG Optimus G-Slate®, the Motorola Xoom®, the HP TouchPad Topaz®, the Dell Streak®, and the like.
  • a tablet-style touch-screen PED is used as an exemplary PED; however, any of a wide variety of PEDs and/or other electronic devices may be used instead.
  • operations and functions performed on or by a PED may also be performed on a stationary portable electronic device, such as a desktop computer or server.
  • Embodiments may include various features, which may be embodied in machine-executable instructions executed by a general-purpose or special-purpose computer (or other electronic device). Alternatively, the features may be performed by hardware components that include specific logic for performing the steps or by a combination of hardware, software, and/or firmware. Accordingly, the various components, modules, systems, and/or features described herein may be embodied as modules within a system. Such a system may be implemented in software, firmware, hardware, and/or physical infrastructure.
  • Embodiments may also be provided as a computer program product including a non-transitory machine-readable medium having stored thereon instructions that may be used to program or be executed on a computer (or other electronic device, such as a PED) to perform processes described herein.
  • the machine-readable medium may include, but is not limited to, hard drives, floppy diskettes, optical disks, CD-ROMs, DVD-ROMs, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, solid-state memory devices, or other types of media/machine-readable media suitable for storing electronic instructions.
  • FIG. 1 illustrates an embodiment 100 of a home page of a portable electronic device (PED) 105.
  • the PED 105 may include one or more physical buttons, such as button 120, and a display screen.
  • the display screen may be a touch-screen configured to allow a user to provide input via the touch screen.
  • the PED 105 may be configured to display various icons 115 to launch corresponding applications.
  • a remote presence interface (RPI) icon 110 may be used to launch an RPI application for interfacing with a telepresence device.
  • An RPI according to any of the various embodiments described herein may alternatively be utilized on any of a wide variety of computing platforms and using any of a wide variety of programming tools and languages.
  • FIG. 2A illustrates an embodiment 200 of an initial launch page 210 for an RPI or other application associated with a telepresence device.
  • a PED 205 may display the initial launch page 210 until the RPI fully loads.
  • Alternative logos, graphics, informational displays and/or other content may be displayed.
  • a general greeting or specific greeting may be displayed on/during launch.
  • an icon or object may indicate the progress or timeframe until loading is complete.
  • a cancel button or settings configuration may be available within the initial launch page 210.
  • FIG. 2B illustrates an embodiment 250 of an alternative initial launch page 270 including an image of a telepresence device 275.
  • the launch page 270 may be oriented such that it induces or encourages a user to align the PED 255 such that the camera 280 is oriented in about the middle of a user's face and at or above an eye level of the user's face.
  • the image of the telepresence device 275 in the portrait orientation may encourage the user to rotate the PED 255 to the portrait orientation with the camera 280 in the top middle.
  • An information bar 260 may provide information such as battery power, time, and/or connection strength.
  • the initial launch page 270 may be unique to each PED on which it is utilized. For example, the RPI may detect that a PED with a camera positioned for a landscape orientation is being used and reorient the displayed launch page 270 accordingly.
  • FIG. 2C illustrates a front-facing camera 280 of the PED 200.
  • the camera 280 is illustrated as located on the top of the PED 200 for capturing an image of a healthcare practitioner just above eye level in a portrait orientation.
  • This location of the camera 280 relative to the face of a healthcare practitioner 290 may facilitate the capture of an aesthetically pleasing image of the healthcare practitioner 290 for display on a screen of a telepresence device.
  • a patient or other person viewing the screen on the telepresence device may view a natural image of the healthcare practitioner 290 using the RPI on the PED 200, rather than an un-aesthetically pleasing perspective of the healthcare practitioner 290.
  • a side-captured or bottom-captured image of a healthcare practitioner 290 may not be aesthetically pleasing.
  • the RPI may automatically adjust the field of view (FOV) of the camera 280.
  • the healthcare practitioner 290, or other user, may manually adjust the field of view of the camera 280.
  • any number of compositional and exposure variables of the camera may be automatically or manually adjusted/adjustable.
  • the RPI may automatically select, or the healthcare practitioner 290 may manually select, which camera to use in the event a PED has multiple cameras. A user may, in some embodiments, select which camera is used during a session.
  • FIGS. 3A and 3B illustrate embodiments of a login page 325 on a PED 305 for accessing a telepresence device or a telepresence network, respectively, via an RPI.
  • Any of a wide variety of login credentials 310 and 315 and/or security technologies may be employed.
  • a username, handle, and/or password may be provided on a different settings page.
  • the username, handle, and/or password may be maintained in a configuration file.
  • the login page may include a button 330 or link that opens a settings or configuration page. Additionally, a "remember me" option 320 may be provided to a user.
  • FIG. 4 illustrates an embodiment 400 of an endpoint list 410 generated by the RPI running on the PED 405.
  • the endpoint list 410 may include various telepresence devices 420 and their respective connectivity states 415.
  • a user may indicate (via a touch, a click, using a stylus, and/or by speaking) to which of the available endpoints he or she would like to connect.
  • patients from an ADT (Admissions, Discharges, and Transfers) feed may also be listed. Selecting a particular patient may initiate a connection to an endpoint device (e.g., a telemedicine device) that is assigned to, associated with, and/or in proximity to the selected patient.
  • a telemedicine device in proximity to the patient could be the telemedicine device nearest a location of the selected patient, such as one assigned to the same bed, room, or floor number.
  • the RPI may automatically determine an optimal endpoint device to dispatch to the patient.
  • Factors involved in determining the optimal endpoint device could be distance or estimated travel time to the selected patient or remaining battery life of respective telepresence devices. Additionally, the RPI may dispatch the endpoint device that can reach the selected patient while minimizing travel through areas with poor wireless connectivity, confined spaces/areas, high-traffic areas, sensitive and/or secure areas, dangerous areas, and/or otherwise undesirable zones.
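The dispatch heuristic described above (travel time, battery life, and avoidance of undesirable zones) might be sketched as a scoring function over the available endpoints. The weights, field names, and callback signatures below are illustrative assumptions only:

```python
def pick_endpoint(endpoints, patient_location, eta, hazard_penalty):
    """Choose an endpoint to dispatch: prefer a low estimated travel time,
    penalize routes through undesirable zones (poor connectivity, high
    traffic, etc.), and penalize endpoints with low remaining battery.

    eta(ep, loc) and hazard_penalty(ep, loc) are assumed callbacks that
    return costs in comparable units (e.g., seconds)."""
    def score(ep):
        cost = eta(ep, patient_location) + hazard_penalty(ep, patient_location)
        if ep["battery_pct"] < 20:      # assumed low-battery threshold
            cost += 50.0                # assumed penalty weight
        return cost

    available = [ep for ep in endpoints if ep["status"] == "ready"]
    return min(available, key=score) if available else None
```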
  • the endpoint list may also include doctors, nurses, staff, or any other persons that may currently (or for a scheduled period of time) be associated with a particular location and/or patient to which the endpoint can navigate.
  • the endpoint list may be searchable and/or filterable.
  • the endpoint list may be implemented with a text box, in a window, or in separate tabs. As the user enters alphanumeric characters into the text box, the list may be instantaneously filtered to exclude endpoints whose names do not match the character string currently contained in the text box.
  • filtering parameters may be specified, such as endpoint type, manufacturer, status, facility, building, floor, room, customer, and/or any other grouping.
  • Logical, arbitrary, or otherwise customized groupings of endpoints may be created by a user or administrator, and these groupings may additionally be used to filter or otherwise search the list of endpoints.
  • each endpoint in the list may have an associated status indicator, which informs the user whether a device is ready, busy, offline, in a private session, and/or in a multi-presence session (which the user may join to receive session audio, video, images, or potentially control some or all functions of the endpoint).
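The incremental text filtering and attribute filtering described above can be sketched in a few lines; the field names (`name`, `status`, `facility`) are assumptions for illustration:

```python
def filter_endpoints(endpoints, query="", **criteria):
    """Incremental filter sketch: a case-insensitive substring match on the
    endpoint name (re-applied as each character is typed), combined with
    optional exact-match fields such as type, facility, or status."""
    q = query.lower()
    return [ep for ep in endpoints
            if q in ep["name"].lower()
            and all(ep.get(k) == v for k, v in criteria.items())]
```

Calling this on every keystroke with the current text-box contents reproduces the instantaneous-filtering behavior described above.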
  • FIG. 5 illustrates an embodiment 500 of a PED 505 displaying a connection wizard page as a user is connected to a telepresence network and/or telepresence device.
  • the connection wizard may facilitate the connection of a user to a telepresence network, a specific telepresence device, and/or other operators of RPIs.
  • the displayed connection wizard page may be purely a placeholder image displayed while a connection is established. In other embodiments, the connection wizard may enable or allow a user to select various connection parameters and/or settings.
  • FIG. 6 illustrates an exemplary embodiment 600 of a module within the RPI configured to provide visual and interactive control of a telepresence device.
  • the RPI may be divided into an upper panel 610 and a lower panel 620.
  • the upper panel 610 may include a video window 615 showing the user of the PED 605 what the camera 680 is capturing.
  • the upper panel 610 may display a video feed originating from a camera on the remote telepresence device.
  • the upper panel 610 may further include various informational items and/or control panels, described in greater detail in conjunction with subsequent figures.
  • the lower panel 620 may be used to display additional information and/or to receive inputs from a user of the PED.
  • FIG. 7 illustrates an embodiment 700 of an RPI on a PED 705 including a map 720 and a navigation module 710.
  • the map 720 may display a plan view map of a healthcare facility.
  • An image captured by the camera 780 may be displayed in a window 715.
  • a list of destinations 730 may be displayed in an upper panel.
  • a user may select a destination by selecting a room within the healthcare facility, or by selecting a patient within a patient tab.
  • the selection of a room may be considered a navigation input.
  • the selection may then be translated into one or more navigation instructions to guide the telepresence robot, autonomously or semi-autonomously, to the selected location.
  • the RPI may communicate the desired destination to the telepresence device via navigation instructions.
  • the navigation instructions may allow for the telepresence device to be manually navigated, semi-autonomously navigated, or autonomously navigated to the desired destination.
  • Systems and methods for manual, semi-autonomous, and autonomous navigation are described in U.S. Patent Application No. 13/360,579, previously incorporated by reference.
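The translation of a navigation input (a selected room, patient, or map point) into a destination instruction might be sketched as follows. The dictionary shapes, field names, and `directory` lookup are hypothetical; the disclosure leaves the instruction format unspecified:

```python
def to_navigation_instruction(selection, directory):
    """Translate a user selection into a destination instruction for the
    telepresence device. `directory` is an assumed mapping of room/patient
    names to plan-map coordinates."""
    if selection["kind"] == "map_point":
        target = (selection["x"], selection["y"])
    else:  # "room" or "patient" selections resolve through the directory
        target = directory[selection["name"]]
    return {"command": "navigate", "target": target, "mode": "autonomous"}
```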
  • FIG. 8 illustrates an embodiment 800 of a notification 820 generated or received by the RPI.
  • the notification 820 may be displayed on a PED 805 in a lower panel or an upper panel.
  • the notification 820 may include the remaining battery life of the telepresence robot, the robot number, the location of the robot, and/or the local time.
  • a window 815 may show the image currently being captured by a camera.
  • An upper panel 810 may be maximized, or shown in a full-screen mode, by selecting a full-screen icon 830.
  • the notifications 820 may be displayed as overlays when the upper panel 810 is maximized or is in a full-screen mode.
  • the RPI may be designed to provide a user with notifications regarding various states of a device (e.g., the PED 805 or the telepresence device), a connection, or a session. For instance, the RPI may display indications of whether video or audio recording is active and/or enabled or if a laser pointer or remote pointer is active, a battery level or remaining time available based on the current battery level, related to network performance, and/or a wireless signal strength.
  • the RPI may provide a message to the user (who may be moving around) if the PED loses (or detects a decrease in strength of) a communication signal (e.g., a WiFi or 4G signal) as the result of the user's movement.
  • the message provided to the user may distinguish whether the loss, outage, or drop in strength has occurred locally, remotely, or elsewhere in the link.
  • the RPI may be configured to provide notifications or alerts using various display modifications, annotations, or overlays.
  • the screen may pulsate with a partially translucent overlay to indicate that a battery of the telepresence device is dying and/or that connectivity is below a predetermined level.
  • the screen may pulsate with a red glow, possibly with a vignette such that it is increasingly translucent toward the center of the display, to indicate that a battery is dying.
  • a notification may include one or more pop-up dialogue boxes to bring critical information to the attention of the user of the RPI.
  • the RPI may request that a user of the RPI acknowledge the current limitation in order to continue the telepresence session.
  • the RPI may request that a user acknowledge anomalous conditions associated with the RPI, the patient, and/or the telepresence device.
  • the RPI may request that a user acknowledge that they are aware of or have noticed patient conditions considered outside of predetermined ranges.
  • the telepresence device and/or the RPI may recognize that a patient's heart rate is below a threshold level and request that a remote user of the RPI acknowledge awareness of the condition.
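The acknowledgment flow described above (flagging patient conditions outside predetermined ranges until the remote user acknowledges them) can be sketched as follows; the metric names and threshold ranges are assumptions for illustration:

```python
def pending_acknowledgements(vitals, thresholds):
    """Sketch: compare patient vitals against predetermined ranges and
    return alert records the RPI would require the remote user to
    acknowledge before (or while) continuing the session."""
    alerts = []
    for name, value in vitals.items():
        lo, hi = thresholds.get(name, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            alerts.append({"metric": name, "value": value, "acknowledged": False})
    return alerts
```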
  • FIG. 9 illustrates an embodiment 900 of a media manager module 920 of the RPI configured to allow a user to capture and manage audio and/or visual media from the telepresence device.
  • the upper panel 910 of the PED 905 may include a live video feed from the telepresence device.
  • the live video feed in the upper panel 910 may be expanded to full-screen via the full-screen icon 930.
  • the lower panel may include a media manager module 920 adapted to capture still images, audio, video, and/or a combination thereof from the telepresence device.
  • the media manager module 920 may allow for editing, playback, trimming, storing, emailing, and/or other functions.
  • the RPI may display the native video controls of the PED overlaid above the live video feed, along a toolbar across the upper panel 910 or a lower panel, or elsewhere on the screen.
  • a library of network-accessible or locally stored video clips or movies may be made accessible in the lower panel. The user may navigate the video clip library by swiping left or right, which may cause successive clips to be visually cycled from completely obscured to partially obscured to fully visible (and selectable/playable), to partially obscured again, and so on.
  • the user may drag and drop the video from the media manager module to the upper panel 910 to cause the video to be displayed on a screen on the telepresence device.
  • the RPI may enable playback to be controlled from the PED and/or the telepresence device. Once a video is selected, the RPI may enable the user to navigate or skip through an index of key frames contained in the video. The RPI may additionally enable the user to delete a video using a flicking gesture. For security and privacy, videos may be stored in an encrypted format. In another example, when the user taps the camera icon in a toolbar, the interface may display a cover-flow image library in the lower pane. Images may be searched, browsed, displayed on the PED, or displayed at the remote endpoint in the same manner as described above with respect to video files in the media manager.
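The key-frame skip navigation mentioned above can be sketched as a jump to the nearest key frame in the chosen direction; the time-indexed key-frame list is an assumed representation:

```python
def nearest_key_frame(key_frames, position, direction):
    """Sketch of skipping through a video's key-frame index: jump to the
    next key frame after (direction > 0) or before (direction < 0) the
    current playback position, clamping at the ends of the index."""
    if direction > 0:
        later = [t for t in key_frames if t > position]
        return min(later) if later else key_frames[-1]
    earlier = [t for t in key_frames if t < position]
    return max(earlier) if earlier else key_frames[0]
```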
  • FIG. 10 illustrates an embodiment 1000 of a menu 1040 of the RPI allowing a user to modify various settings on the local PED 1005 and/or a remote telepresence device.
  • a window 1010 may show the user what is currently being captured by the camera 1080.
  • the upper panel 1015 may display a live video feed generated by the telepresence device.
  • the menu 1040 in the lower panel 1030 may contain settings for local brightness control 1045, remote brightness control 1065, microphone levels 1035, speaker levels 1055, and/or any of a wide variety of other features.
  • a lower bar 1020 may also include one or more management tools. As illustrated, a picture-in-picture selection may allow a user to selectively disable the picture-in-picture window 1010. An auto-brightness feature 1045 may be enabled via a checkbox.
  • the control settings may further include a microphone level meter 1050 to indicate a volume or sound pressure level relative to a maximum specified input or clipping level. Additionally, the settings control may include a microphone gain slider 1035 to allow adjustment of a microphone gain to a desired level.
  • a speaker level meter 1060 may graphically illustrate the current volume or sound pressure level relative to a maximum specified output or clipping level.
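A level "relative to a maximum specified input or clipping level" is conventionally expressed in decibels relative to full scale. As an illustration only (the disclosure does not specify the computation), such a meter value might be derived from audio samples like this:

```python
import math

def level_meter_db(samples, full_scale=1.0):
    """Compute an RMS level in dB relative to the clipping (full-scale)
    level, as a microphone or speaker level meter might display.
    0 dB means the signal is at the clipping level; more negative is quieter."""
    if not samples:
        return float("-inf")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0:
        return float("-inf")
    return 20.0 * math.log10(rms / full_scale)
```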
  • the remote controls may include a picture-in-picture checkbox to enable the user to toggle a picture-in-picture display of the remote video view at the remote endpoint.
  • FIG. 11 illustrates an embodiment 1100 of the RPI in full-screen video mode.
  • a window 1115 may show a picture-in-picture of the image captured by camera 1180.
  • the upper panel 1110 may display a full-screen live video feed captured by a telepresence device.
  • FIG. 12 illustrates an embodiment 1200 of the RPI in full-screen data mode.
  • the image captured by camera 1280 may not be displayed in a picture-in-picture window to make more room for data entry and/or visualization.
  • the display of the telepresence device, which normally shows the image captured by the camera 1280, may display a no-signal or screensaver image, or a previous image of the healthcare practitioner may be "frozen" on the screen.
  • the PED 1205 may allow data to be entered via the touch screen using keyboard 1230.
  • the RPI may display a single data panel 1215 in a full-screen mode by removing the upper panel typically used to display the live video feed from the telepresence device.
  • Various tabs 1210 may toggle the screen between various data input and/or visualization modes.
  • the remote video view may continue to be displayed in a small window or panel.
  • FIG. 13 illustrates an embodiment 1300 of the RPI including a visual representation of patient telemetry data 1350 displayed in a lower panel 1320.
  • the PED 1305 may again include a camera 1380 and a window 1310 may be displayed by the RPI to show a user what is currently being captured by the camera 1380.
  • full-screen icons 1335 may be available in the RPI to transition either of the lower panel 1320 or the upper panel 1315 to a full-screen mode.
  • a software button 1330 may allow the telemetry data to be toggled, changed, scrolled, and/or otherwise manipulated.
  • a patient icon 1325 may allow a user to select a different patient, and telemetry data associated with the selected patient may be displayed as numerical values and/or in graphical form 1340.
  • FIGS. 14A-C illustrate exemplary embodiments of a telepresence device.
  • a telepresence device may comprise a base 1410 capable of being manually driven, semi-autonomously driven, and/or autonomously driven.
  • the telepresence device may also include a handle 1435.
  • a head portion 1440 of the telepresence device may include one or more cameras 1460, speakers, and/or microphones.
  • the head portion 1440 may include one or more two- or three-dimensional depth sensors, such as LADARs, LIDARs, or structured light projectors/scanners. Multiple cameras 1460 may be useful to render 3D images, and multiple microphones and/or speakers may be useful for rendering and/or generating directional sound.
  • the head portion may also include a display 1450 configured to display, for example, an image captured using an RPI on a PED.
  • the display 1450 and/or the interface 1430 may comprise a touch screen or other interface to receive inputs.
  • the display 1450 and/or the interface 1430 may provide a list of destinations, healthcare practitioners, and/or patients to which, as described herein, the telepresence device can be sent or connected.
  • the display 1450 and/or the interface 1430 may also enable a person to stop the telepresence device when it is autonomously navigating and, likewise, enable the telepresence device to resume autonomous navigation to its destination.
  • the display 1450 and/or the interface 1430 may additionally have a button or menu option that instructs the telepresence device to autonomously navigate to an out-of-the-way location (e.g., a wall, corner, etc.), a dock, a storage location, and/or a charging station.
  • the display 1450 and/or the interface 1430 may include buttons or menu options for various settings or to page or notify support personnel of a problem with or question regarding the operation of the telepresence device.
  • FIG. 15 illustrates a panel 1530 including a plan map view, a video feed window, and various settings panels.
  • the panel 1530 may be displayed in various forms and configurations via an RPI on a PED, such as the PEDs 1510, 1520, and 1540.
  • Each of the PEDs 1510, 1520, and 1540 illustrates an example of a panel arrangement generated by the RPI to maximize the usable screen space for various tasks performed using the RPI.
  • a larger display may accommodate more panels or larger panels.
  • devices with multiple displays may accommodate one or more panels on each display.
  • a desktop version of the RPI may utilize multiple monitors connected to the desktop to display one or more panels.
  • FIG. 16 illustrates an embodiment 1600 of an RPI on a PED 1605.
  • the RPI may utilize the camera 1680 to transmit an image of the user to a telepresence device.
  • the image captured by the camera 1680 may be displayed in a picture-in- picture window 1610.
  • the upper panel 1615 of the RPI may display a live video feed received from the telepresence device.
  • the lower panel 1620 may display a plan map view.
  • the plan map view may show a current location of the telepresence device within a healthcare facility (or other locale). The user may manually navigate the telepresence device using the live video feed and the plan view map.
  • the user may select a location within the plan view map and the telepresence device may autonomously or semi-autonomously navigate to the selected location.
  • the input intended to direct the telepresence device to a new location may be considered a navigation input.
  • the selection of a room, patient, healthcare practitioner, location within a video feed, and/or other selection or input intended to cause the telepresence robot to navigate may be considered a navigation input.
  • the navigation input may then be translated into one or more navigation instructions to guide the telepresence robot, autonomously or semi-autonomously, to the selected location.
  • FIG. 17 illustrates an embodiment 1700 of an RPI on a PED 1710 displaying multiple panels.
  • a radiography panel 1720 may display images associated with a patient displayed in a live video feed 1750.
  • Telemetry data 1730, lab results 1740, patient data 1760, and physician notes 1770 may be displayed in various other panels on the PED 1710 via the RPI.
  • each of the participating users may be displayed in a panel 1790.
  • each of the panels 1720, 1730, 1740, 1750, 1760, 1770, and 1790 may be moved, enlarged, merged with another panel, removed, and/or captured (recorded), intelligently based on decisions made by the RPI, based on usage history, based on relevancy, and/or based on user selection.
  • a camera 1780 may be selectively enabled or disabled by the user.
  • the RPI may enable complete integration of patient data monitoring with the remote telepresence session, thereby adding a dimension of data-driven functionality uniquely valuable in telepresence applications.
  • the user may select an icon from a toolbar or other panel to activate a patient bedside data monitoring app, such as those offered by any of a variety of real-time patient data monitoring application providers. Upon selecting the appropriate icon, a patient data monitoring window may appear in the RPI. The user may expand this pane to a full-screen view, reposition the pane, and/or resize the pane as described above.
  • the RPI may show any number of real-time or archived patient biometrics or waveforms, such as temperature, heart rate, pulse, blood pressure, oxygen saturation, etc.
  • the user may pause and resume real-time, time-delayed, or archived patient data.
  • the user may move back and forth through time-based patient data using dragging or swiping gestures, or the user may zoom or scale the waveform or metric along an amplitude axis and/or time axis.
  • the application may further allow the user to set markers along a waveform to measure variations in amplitude or time associated with various features of the patient data, such as peaks, valleys, maxima or minima (global or local), global averages, running averages, threshold crossings, or the like.
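The marker and feature-measurement behavior described above can be sketched over a sampled waveform of (time, value) points; the data representation is an assumption for illustration:

```python
def marker_delta(waveform, t1, t2):
    """Sketch: place two markers on a sampled waveform [(t, value), ...]
    and report the time and amplitude differences between them."""
    lookup = dict(waveform)
    return {"dt": abs(t2 - t1), "dv": lookup[t2] - lookup[t1]}

def local_maxima(waveform):
    """Return the (t, value) samples that are higher than both neighbors,
    as candidate 'peak' features for marker placement."""
    return [waveform[i] for i in range(1, len(waveform) - 1)
            if waveform[i][1] > waveform[i - 1][1]
            and waveform[i][1] > waveform[i + 1][1]]
```

For instance, measuring between two detected peaks would yield the peak-to-peak interval and amplitude variation the passage describes.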
  • the data may be collected from bedside monitors or other monitoring devices in real-time and archived for a period of time (or indefinitely) in a server or database.
  • the monitoring app may be a separate application and/or integrated within the RPI.
  • the monitoring app may retrieve the relevant data and provide it to the RPI through an application programming interface (API) and/or the RPI may
  • the data may also be collected by the telepresence device by, for example, directing a camera of the telepresence device to the display of a monitoring device, and either recording video of the monitor display or performing image analysis on the video image to extract the patient data.
  • the telepresence device may annotate the data and store the annotations with the data, either locally or in a remote server, for later retrieval.
  • the monitoring app may enable alarms, alerts, notifications, or other actions or scripted activities set to take place in response to certain events in the data.
  • the interface may integrate available ADT with patient bedside or biometric data. For example, if a patient's vitals or other biometrics trigger an alert or alarm condition, the telepresence device may be configured to autonomously navigate to the bed or room number of that patient, and send a notification or invitation to a doctor, caregiver, or specialist to begin a telepresence session with the patient.
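The alarm-to-dispatch flow above can be sketched as an event handler: when a patient's vitals trigger an alarm, the bed location is looked up from ADT (admission/discharge/transfer) data, the robot is sent there, and a caregiver is invited. All names here (`on_alarm`, `dispatch_robot`, `notify`) are hypothetical stand-ins, not APIs from the patent.

```python
# Hedged sketch of alarm-driven autonomous navigation plus notification.

ADT = {"patient-17": "Room 204, Bed B"}   # illustrative ADT lookup data

def on_alarm(patient_id, vital, value, dispatch_robot, notify):
    location = ADT.get(patient_id)
    if location is None:
        return None
    dispatch_robot(location)        # autonomous navigation goal for the robot
    notify(f"{vital}={value} alarm for {patient_id}; "
           f"telepresence session starting at {location}")
    return location

sent = []
loc = on_alarm("patient-17", "SpO2", 86,
               dispatch_robot=lambda dest: sent.append(("goto", dest)),
               notify=lambda msg: sent.append(("notify", msg)))
```

Injecting `dispatch_robot` and `notify` as callables keeps the trigger logic independent of any particular robot or messaging backend.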
  • the bedside or biometric data for a patient associated with the selected location, destination, or patient may be automatically retrieved and used to populate a "dashboard" of patient data that the healthcare practitioner can then review, annotate, or otherwise interact with as discussed above and depicted in FIG. 17.
  • an autonomous mobile telepresence device may be used to conduct patient rounds in a healthcare facility. As the telepresence device moves from one location to the next, the location of the telepresence device may be used to retrieve the name and/or other data of a patient(s) associated with that location. For example, the telepresence device may retrieve patient biometrics, bedside data, electronic medical records, and/or other patient data to populate a patient dashboard on a display of the PED. In one embodiment, this information may be retrieved from a health level 7 (HL7) compliant server associated with the facility, healthcare practitioner, and/or patient.
  • an autonomous mobile telepresence device may be scripted or scheduled to make scheduled stops at various beds, rooms, locations, or patients.
  • the RPI may retrieve the names or contact info of people (such as doctors, nurses, students, family members, etc.) associated with a scheduled or upcoming stop at a particular patient or location, and send a notification via SMS, email, etc., to the associated people inviting them to join the telepresence session by receiving audio and/or video from the session on a PED via the RPI.
  • the telepresence device may send invitations, notifications, and/or reminders to join the session a predetermined amount of time prior to the time the session is scheduled to begin. Repeated or reminder notifications may be sent to each party at regular or decreasing intervals to remind them of an upcoming session.
  • the notifications may contain a hyperlink to follow to join the session, a link to the RPI, an app notification or badge for display on the PED, or the address or phone number of another device address to connect to.
  • the notification may further include a username, password, pin and/or other credential(s) that the invitees may provide to join the session.
  • the length of the session may be at least partially based on the number of users connected and/or their priority levels.
  • FIGS. 18A and 18B illustrate an embodiment 1800 of what may be displayed during a consult on a telepresence device 1850 and via an RPI on a PED 1805, respectively.
  • the telepresence device 1850 may include audio and/or visual equipment 1880 to capture images and/or audio for display on the PED 1805.
  • PED 1805 may include a camera to capture an image 1815 for display on a screen 1857 of a head portion 1855 of the telepresence device 1850.
  • a lower portion of the telepresence device may include adjustment knobs for the microphone volume 1861 and/or the speaker volume 1862.
  • a screen 1870 may provide additional information about the user of the PED 1805.
  • FIG. 19 illustrates an embodiment of a telepresence device 1900 including audio and/or visual instruments 1980, controls 1961 and 1962, and displays 1950 and 1970.
  • the upper display 1950 may act as a screen saver by displaying information about a company, the healthcare facility, medical facts, and/or other information associated with healthcare in general.
  • a lower display 1970 may also enter an independent screen saver mode, and/or allow for user inputs associated with the telepresence device and/or the information displayed via the upper display 1950.
  • FIG. 20 illustrates various embodiments of avatars 2091, 2092, 2093, and 2094 and/or personalities that may be assigned to or used by a healthcare practitioner and/or telepresence device.
  • Although the image 2090 displayed on the display 2050 may normally be associated with the image captured by a camera of a PED via the RPI, any of a wide variety of characters, avatars, cartoons, licensed caricatures, and/or other images may be used in place of an image received from the camera on the PED.
  • the avatars 2091, 2092, 2093, and 2094 may be particularly useful to give human-like traits to the telepresence device.
  • the telepresence device may, as previously described, include controls 2061 and 2062 for audio and a touch sensitive screen 2070.
  • FIG. 21 illustrates an RPI that can be displayed on a PED including various informational, control, video, and settings panels 2120, 2130, 2140, 2150, 2160, 2170, 2180, and 2190.
  • a live video feed may be displayed in a panel 2110.
  • Remote brightness, zoom, and volume controls may be accessible via a panel 2190.
  • Media management controls may be accessible via a panel 2180.
  • Advanced controls with various tabs 2160 may be available via a panel 2170.
  • a current image captured by a camera on the PED may be displayed in a panel 2150.
  • Local zoom, brightness, volume, and microphone controls may be accessible via a panel 2140.
  • Additional controls and tabs of control icons may be accessible via a panel 2130.
  • a user of the RPI may select a navigation mode via a selection panel 2120.
  • Inputs for selecting the navigation mode and/or inputs to direct the actual navigation may be performed using any of a wide variety of inputs, including, but not limited to, a voice input, a keyboard, a mouse, a touch input, a stylus, and/or via another peripheral input device.
  • a user may provide a navigation input in one or more manners. The navigation input may then be translated (e.g., processed directly, run through a lookup table, etc.) to generate and transmit navigation instructions to the telepresence robot to guide it, autonomously or semi-autonomously, to a desired location.
  • FIG. 22 illustrates an embodiment 2200 of an RPI configured to utilize destination-based navigational modes for navigating a telepresence device.
  • a centralized panel 2210 may display a live video feed 2220.
  • Various remote settings controls may be available in a panel 2290.
  • Another panel 2240 may provide similar audiovisual setting controls for a PED.
  • a window 2250 may display the image currently captured by a camera on the PED.
  • Advanced controls may be available in a panel 2270, and multiple tabs 2260 may provide for additional informational and/or control panels.
  • the illustrated embodiment 2200 includes a destination navigation panel 2230 configured to allow a user to select from a list of destinations.
  • the RPI may facilitate communication with the telepresence device to direct the telepresence device to the selected destination.
  • the telepresence device may be manually driven to the selected destination or may autonomously (or semi-autonomously) navigate to the selected destination.
  • the destinations may include various locations within a healthcare facility, specific patient names, and/or the names of various healthcare practitioners.
  • the locations of healthcare practitioners may be determined using cameras within the healthcare facility, global positioning systems, local positioning systems, radio frequency identification (RFID) tags, and/or the like.
  • FIG. 23 illustrates an embodiment 2300 of an RPI including a panel 2320 displaying a live feed from a telepresence device.
  • a lower panel 2310 may display a list of patients. A user may select a patient on the patient list to direct the telepresence device to navigate to the selected patient.
  • the telepresence device may be manually driven, autonomously navigate, or semi-autonomously navigate to the selected destination.
  • FIG. 24 illustrates an embodiment 2400 of an RPI including a panel 2410 displaying a plan view map, allowing for map-based navigational modes for navigating a telepresence device.
  • the map-based navigational mode may be part of an advanced control panel including multiple tabs.
  • the size of the panel 2410 may be adapted to suit a particular need and may be user-selectable.
  • Media controls 2440 may allow a user to capture audio and/or visual information from the live video feed in panel 2420.
  • a panel 2430 may display the current image captured by a camera of a PED running the RPI. As illustrated, the panel 2430 displaying the current image captured by the camera of the PED may include integrated settings controls.
  • Various tabs 2450 may provide access to additional features, options, help, and/or navigation within the RPI.
  • FIG. 25 illustrates an embodiment 2500 of a full-screen view of a hallway from a telepresence device as visualized on a PED via an RPI.
  • the panel 2510 may display a live video feed from a telepresence device.
  • Icons 2520 and 2522 may be overlaid on the video feed and may provide access to various controls or settings.
  • a destination panel 2530 may also be overlaid on the video feed and allow a user to select a destination.
  • the RPI may communicate a selected destination to the telepresence device, which may autonomously or semi-autonomously navigate to the selected destination.
  • FIG. 26 illustrates an embodiment 2600 of a full-screen view of a live video feed from a telepresence device as may be displayed via an RPI.
  • Various icons 2620 and 2622 may provide access to various features, settings, options, and/or other aspects of the RPI.
  • the icons 2620 and 2622 may be overlaid on the live video feed or a separate panel may be displayed containing icons 2620 and 2622.
  • FIG. 27A illustrates another embodiment 2710 of a live video feed from a telepresence device, as may be displayed via an RPI.
  • An arrow 2730 may be overlaid on the video feed as a navigation input to indicate an intended or desired navigation path of a telepresence device.
  • the arrow may be drawn by the RPI after receiving the live video feed or by the telepresence device prior to sending the live video feed.
  • the user may draw the arrow 2730 to indicate a desired direction of travel.
  • the arrow 2730 may then be translated by the RPI or by the telepresence device into directional commands to navigate the telepresence device.
  • various icons 2720 may provide access to various features, settings, options, and/or other aspects of the RPI.
  • the icons 2720 may be overlaid on the live video feed or a separate panel may be displayed containing icons 2720.
  • the arrow 2730 may be a vector quantity describing both the direction and the velocity (current, planned, or directed) of the telepresence device.
  • the arrow 2730 may be used to generate navigation instructions that are transmitted to a remote telepresence device.
  • the navigation input may be in the form of a vector (arrow 2730) drawn as either a final endpoint selection or as a vector including a beginning point and an end point.
  • the end point of the vector may represent the location to which the telepresence device is directed to navigate.
  • navigation instructions may be generated based on the current location of the telepresence device and the endpoint of the vector within the live video feed.
  • the vector's image pixel endpoint may be mapped via a lookup table to one or more translate and rotate commands to cause the telepresence device to navigate to the selected location.
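The pixel-endpoint lookup described above can be sketched as a table keyed on a coarse bucket of the endpoint's position in the video frame. The bucket granularity and the (rotate, translate) values below are invented for illustration; a real table would come from the camera's calibration.

```python
# Illustrative lookup table mapping a vector's endpoint pixel to
# (rotate degrees, translate meters) commands.

LOOKUP = {
    # (column bucket, row bucket) -> (rotate degrees, translate meters)
    (0, 0): (-30.0, 1.0),
    (1, 0): (0.0, 1.0),
    (2, 0): (30.0, 1.0),
    (1, 1): (0.0, 2.5),
}

def commands_for_pixel(x, y, width=640, height=480):
    """Discretize the endpoint pixel and look up the drive commands."""
    bucket = (x * 3 // width, y * 2 // height)
    return LOOKUP.get(bucket)

# commands_for_pixel(320, 100) -> (0.0, 1.0): drive straight ahead 1 m.
```

A finer-grained table (or interpolation between entries) would trade memory for smoother command resolution.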
  • the length of the vector may correspond to the desired distance to move the telepresence device, a desired acceleration of the telepresence device, and/or a desired velocity of the telepresence device.
  • a navigation input may direct a telepresence device to navigate forward (with respect to the live video feed) 3 meters and to the right 2 meters. Any of a wide variety of navigation instructions may be possible to correctly navigate the telepresence device. For instance, the navigation instructions may direct the telepresence device to navigate approximately 3.6 meters at a 34 degree angle relative to the video feed. Alternatively, the navigation instructions may direct the telepresence device to navigate 3 meters forward and then 2 meters to the right. As will be appreciated, any number of navigations routes and corresponding navigation instructions may be possible.
  • the navigation input may be translated into navigation instructions as a plurality of drive forward commands coupled with rotate commands.
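The worked example above (3 meters forward, 2 meters to the right) can be checked directly with plane geometry: a single rotate-then-translate route reproduces the "approximately 3.6 meters at a 34 degree angle" figure. This is a geometry sketch, not the patent's implementation.

```python
import math

def rotate_then_translate(forward_m, right_m):
    """One rotation (degrees, positive = toward the right of the video feed)
    followed by one straight translation covering the same displacement."""
    angle = math.degrees(math.atan2(right_m, forward_m))
    distance = math.hypot(forward_m, right_m)
    return angle, distance

angle, dist = rotate_then_translate(3.0, 2.0)
# angle ~ 33.7 degrees, dist ~ 3.61 m, matching the route described above.
```

The alternative route in the text (3 m forward, then a 90-degree turn and 2 m) covers 5 m of travel, illustrating why route choice matters even for identical endpoints.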
  • the live video feed may be delayed slightly due to network and/or processing limitations. This video latency may result in inaccurate navigation instructions. Accordingly, the navigation input may be adjusted based on the known video latency and the current or past velocity and direction of the robot. For example, the selection of a location within the video feed may be translated into navigation instructions that compensate for the latency of the video feed.
  • if an end point (endpoint pixel) of a vector drawn on the video feed maps to a position 1 meter forward and 0.5 meters to the right relative to the current location of the telepresence device, and the telepresence device has moved 0.2 meters forward since the video frame on which the vector was drawn was captured (or since the associated command was issued), then a lookup table entry for 0.8 meters forward and 0.5 meters to the right may be used to determine and/or adjust the navigation instructions.
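The latency adjustment in the example above amounts to subtracting the distance already traveled since the selected video frame was captured from the commanded target before consulting the lookup table. A minimal sketch, with invented function and parameter names:

```python
# Sketch: compensate a forward/right navigation target for video latency.

def compensate(target_forward_m, target_right_m, velocity_mps, latency_s):
    """Shift the target by the distance covered during the video latency."""
    traveled = velocity_mps * latency_s
    return (target_forward_m - traveled, target_right_m)

# A 1.0 m forward / 0.5 m right target with 0.2 m already covered
# (e.g., 0.5 m/s of forward motion over 0.4 s of latency):
adjusted = compensate(1.0, 0.5, velocity_mps=0.5, latency_s=0.4)
# adjusted is (0.8, 0.5), the lookup-table entry used in the example above.
```

Lateral motion and rotation during the latency window could be compensated the same way by tracking the full pose delta rather than only forward travel.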
  • Video latency calculation and command compensation may be performed at the telepresence device, at a remote interface device, and/or at any networked computing device.
  • the navigation instructions may be generated to compensate for the video latency.
  • the telepresence device and/or other computing device may adjust the navigation instructions to compensate for the video latency.
  • FIGS. 27B-27H illustrate various aspects of a mouse-based driving interface 2711 of an RPI for manually or semi-autonomously controlling a telepresence device.
  • the mouse-based driving interface 2711 for navigating a telepresence device via an RPI may be presented within a live video feed 2712 from the telepresence device.
  • a four way controller 2750 may be overlaid on the live video feed. According to various embodiments, the arrows of the four way controller 2750 may be selected by a finger or stylus on a touch screen device, or via a pointer 2751 as controlled by a peripheral device (e.g., a keyboard or mouse).
  • a user may operate a mouse to control the pointer 2751 and click on one of the arrows on the four way controller 2750.
  • the live video feed 2712 and the four way controller 2750 may be displayed in a full-screen or expanded mode.
  • a remote controls panel 2713 may allow a user to control various settings of the remote telepresence device, such as the audio and video settings.
  • a media control panel 2714 may allow a user to open, play, record, and/or otherwise manipulate the current live video (or audio) feed 2712, network- accessible audiovisual material, remotely (at the telepresence device) stored audiovisual material, and/or locally stored audiovisual material.
  • An advanced controls panel 2715 may allow a user (or select users based on account privileges) to modify various aspects of the connection or session settings, access external or peripheral applications (e.g., StrokeRESPOND), and/or control other aspects of the remote telepresence session.
  • a local controls panel 2716 may allow a user to control various settings of the RPI, the PED, and/or other local peripheral devices, such as the audio and video settings associated with the telepresence session.
  • a battery life indicator 2717 may provide information regarding the current battery life and/or expected battery life based on current or past consumption rates.
  • Various additional icons and tabs 2718 and 2719 may provide additional controls and/or options within the RPI and/or associated with the remote telepresence device.
  • FIG. 27C illustrates the mouse-based driving interface 2711 with a long drive vector 2755 drawn on the video feed to indicate a direction and a relatively fast velocity.
  • the direction of the vector may be associated with an intended, directed (as directed by the user of the RPI), and/or current direction of travel.
  • a user may click the uppermost arrow of the four way controller 2750 and then drag a vector to describe a desired direction of travel.
  • the length (magnitude) of the vector may correspond to the velocity and/or acceleration of the telepresence device.
  • a grid display 2760 may illustrate the current direction of travel and/or the velocity (magnitude) of the telepresence device on a grid.
  • the grid may correspond to and/or display travel angles (e.g., in degrees, radians, with respect to north, etc.) and/or a velocity as a numeric or descriptive value.
  • the length of the dark line on the grid display 2760 may be aligned with numbers between 1 and 10 or descriptive terms like "slow," "medium," and "fast."
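The length-to-velocity mapping shown on the grid display can be sketched as a simple normalization: the drawn vector's length, relative to a maximum drag length, selects a speed and a descriptive label. The thresholds, maximum speed, and maximum drag length below are invented for illustration.

```python
# Sketch: map a drive vector's length (pixels) to a speed and a label.

MAX_SPEED_MPS = 1.5   # assumed top speed for the telepresence device

def speed_from_vector(length_px, max_length_px=300):
    fraction = min(length_px / max_length_px, 1.0)
    speed = fraction * MAX_SPEED_MPS
    label = "slow" if fraction < 1/3 else "medium" if fraction < 2/3 else "fast"
    return speed, label

# A short drag (FIG. 27D) reads as "slow"; a full-length drag as "fast".
```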
  • FIG. 27D illustrates the mouse-based driving interface 2711 with a short drive vector 2756 drawn on the video feed to indicate a direction and a relatively slow velocity.
  • a corresponding display of a short line may be displayed in the grid display 2760.
  • a navigation input in the form of a vector provided in a mouse-based vector input mode may be translated into navigation instructions through the use of a lookup table.
  • the lookup table may include values that compensate for latency. That is, the navigation instructions returned by the lookup table may be based on the navigation input and the current or recent average video latency, as described herein.
  • FIG. 27E illustrates the mouse-based driving interface 2711 with a vector 2757 for rounding a corner 2772 within the RPI.
  • the RPI and/or telepresence device may detect objects such as the walls to determine passable hallways and corridors.
  • the RPI and/or telepresence device may utilize maps to determine that a vector 2757 is intended to direct a telepresence device to turn down a different hallway or round a corner.
  • the vector 2757 in FIG. 27E may be intended to round corner 2772.
  • the user may draw a rounded or curved vector 2757.
  • the RPI and/or telepresence device may interpret a straight vector 2757 as intended to direct the telepresence device down the hallway.
  • FIG. 27F illustrates the mouse-based driving interface 2711 with a horizontal vector 2758 (relative to the live video feed 2712), which may be used to direct the telepresence device to spin in place either clockwise or counterclockwise.
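One way to interpret the horizontal-vector gesture: if the drawn vector is predominantly horizontal, treat it as a spin-in-place command whose rotation direction follows the drag direction; otherwise treat it as an ordinary drive vector. The 4:1 horizontal-to-vertical ratio used here is an assumption, not from the patent.

```python
# Sketch: classify a drawn vector (screen-pixel deltas) as a spin or drive.

def interpret_drag(dx_px, dy_px):
    """dx_px > 0 means a rightward drag; dy_px > 0 means downward."""
    if abs(dx_px) > 4 * abs(dy_px):
        return "spin_clockwise" if dx_px > 0 else "spin_counterclockwise"
    return "drive_vector"

# Dragging right along the horizon spins the device clockwise in place.
```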
  • FIG. 27G illustrates the mouse-based driving interface 2711 with a vector drawn toward an obstacle. The mouse-based driving interface may prevent the telepresence device from colliding with objects (such as walls, people, and other objects).
  • an obstacle detection/obstacle avoidance (ODOA) system may prevent such collisions.
  • a user may deactivate the ODOA system, such that the user may operate in a "nudge mode" and continue to slowly move the telepresence device despite the telepresence device being in contact with or in close proximity to another object. This mechanism may automatically time out such that the ODOA system is re-engaged after a period of inactivity.
  • Placing the telepresence device in nudge mode may cause a camera of the telepresence device to be automatically positioned to look down around the body of the telepresence device and either in the direction of the drive command issued by the user or in the direction of an object closest to or in contact with the telepresence device (so that the user may see what obstacle he or she is commanding the telepresence device to nudge).
  • the ODOA system may be entirely deactivated. In one embodiment, the ODOA system may be deactivated so long as the telepresence device is driven or moved below a specified velocity.
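The nudge-mode timeout described above can be sketched as a small gate: the ODOA system stays disengaged only while nudge commands keep arriving, and re-engages automatically after a period of inactivity. The class name and the 10-second default are invented; the injectable clock just makes the behavior testable.

```python
import time

# Sketch: ODOA disengagement that times out after nudge inactivity.

class OdoaGate:
    def __init__(self, timeout_s=10.0, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock
        self.last_nudge = None

    def nudge(self):
        """Record that the user issued a nudge command just now."""
        self.last_nudge = self.clock()

    def odoa_active(self):
        """ODOA is engaged unless a nudge arrived within the timeout window."""
        return (self.last_nudge is None or
                self.clock() - self.last_nudge >= self.timeout_s)
```

A velocity-based variant (ODOA disengaged only below a speed threshold, as the following paragraph describes) could be layered on by also checking the commanded speed in `odoa_active`.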
  • FIG. 27H illustrates the mouse-based driving interface 2711 used to reverse the telepresence device 2785 with a camera of the telepresence device (or other associated camera) oriented in reverse and slightly downward toward the floor 2790.
  • the telepresence device 2785 may be reversed. The reverse mode may allow for vector controls (direction and velocity) or may be limited to simple directional commands.
  • a view from a rear camera may be displayed in the live video feed 2712 to show the direction of travel and help prevent the telepresence device 2785 from colliding with objects.
  • a head portion of the telepresence device may be rotated to the direction of travel (reverse) and/or inclined slightly downward to facilitate the reverse navigation.
  • the controls and vectors may be "drawn" as overlays on the live video feed (e.g., via a click and hold of a mouse button or a tap and hold via a stylus or finger).
  • a panel or peripheral device may be used to "draw" the vectors to control the velocity and/or direction of travel.
  • the velocity selected by the length of the vector may be overridden based on other factors (such as obstacle detection, congestion, human presence, etc.).
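The velocity override described above can be sketched as taking the minimum of the commanded speed and the caps imposed by the current conditions. The specific cap values and condition names below are illustrative assumptions.

```python
# Sketch: clamp the vector-selected velocity based on surroundings.

def effective_speed(commanded_mps, obstacle_m, people_present, congested):
    """Return the lowest speed allowed by any active condition."""
    caps = [commanded_mps]
    if obstacle_m < 1.0:
        caps.append(0.3)          # creep when an obstacle is within 1 m
    if people_present:
        caps.append(0.6)
    if congested:
        caps.append(0.8)
    return min(caps)

# A 1.5 m/s command drops to 0.3 m/s half a meter from an obstacle.
```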
  • a head portion and/or the camera of the telepresence device may be re-centered via an icon within the mouse-based driving interface 2711 or via a keyboard (or other peripheral device) selection.
  • the selection of any of the arrows within the four way controller 2750 may orient or re-orient the head portion of the telepresence device.
  • FIG. 28 illustrates an embodiment 2810 of a full-screen live video feed from a telepresence device including a bedside view of a patient bed 2820 and an associated patient monitor 2830.
  • the live video feed may be able to visualize a patient in a patient bed 2820 as well as a patient monitor.
  • the RPI and/or the telepresence device may detect a refresh rate of the patient monitor 2830 and dynamically match the refresh rate to reduce scrolling bars and/or flickering.
  • the live video feed of the patient monitor 2830 may be digitally enhanced to reduce the flickering and/or scrolling bars.
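One way to reduce rolling bars when filming a monitor is to pick, from the camera's supported frame rates, the one whose period divides most evenly into the monitor's refresh period, so the banding phase stays fixed and can be filtered. This selection heuristic is an assumption for illustration, not the patent's method.

```python
# Sketch: choose a camera frame rate that best matches a monitor's refresh.

def best_frame_rate(monitor_hz, supported_fps):
    """Prefer rates where refresh/frame is nearly an integer ratio;
    break ties by closeness to the refresh rate itself."""
    def misfit(fps):
        ratio = monitor_hz / fps
        return abs(ratio - round(ratio))
    return min(supported_fps, key=lambda fps: (misfit(fps), abs(monitor_hz - fps)))

# For a 60 Hz patient monitor, a camera offering 24/25/30/50 fps picks 30
# (exactly two refresh cycles per frame).
rate = best_frame_rate(60.0, [24, 25, 30, 50])
```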
  • FIG. 29 illustrates an embodiment 2910 of an RPI including a click-to-zoom feature 2925.
  • the click-to-zoom feature 2925 may be used to zoom in on a specific area of a live video feed, a captured video feed, or a still image.
  • the box defining the click-to-zoom feature 2925 may be drawn as a box, or by dragging one or more corners.
  • the RPI may receive the click-to-zoom box using any of a wide variety of input devices, such as via a touch, a stylus, or a mouse pointer.
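The click-to-zoom mapping can be sketched as converting the drawn box (any two opposite corners, in view pixels) into a zoom center and factor for the video feed. The view dimensions and function name are illustrative assumptions.

```python
# Sketch: turn a drawn click-to-zoom box into a zoom center and factor.

def zoom_from_box(x0, y0, x1, y1, view_w=640, view_h=480):
    left, right = sorted((x0, x1))
    top, bottom = sorted((y0, y1))
    center = ((left + right) / 2, (top + bottom) / 2)
    # Zoom so the box fills the view; the tighter dimension wins.
    factor = min(view_w / max(right - left, 1), view_h / max(bottom - top, 1))
    return center, factor

# A 160x120 box drawn around the feed's center yields a 4x zoom there.
center, factor = zoom_from_box(240, 180, 400, 300)
```

Sorting the corners means the box may be drawn in any direction (including by dragging a corner outward), matching the paragraph above.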
  • FIG. 29 also illustrates a multi-party session. In the illustrated embodiment, multiple users are shown in a session guests panel 2930.
  • the session guests panel 2930 may include images captured by PEDs associated with each of the guests. Accordingly, multiple users may participate simultaneously.
  • one or more of the users may have limited access and/or control of the telepresence device.
  • control and options icons 2940 and 2942 may be overlaid and provide access to various features of the RPI.
  • FIG. 30 illustrates an embodiment 3000 of a PED running an RPI configured to display a live video feed in an upper panel 3015 along with a current image window 3010.
  • the RPI may incorporate sub-applications and/or provide access to related applications, such as a StrokeRESPOND application (in a lower panel 3020) configured to provide one or more functions and/or workflow processes associated with strokes.
  • the StrokeRESPOND application may display various patient names 3030, which may be filterable, at 3025, and allow a user to select them, at 3035, in order to see more details.
  • the RPI may allow for any number of sub-routines, sub-applications, and/or other applications to run within the RPI or be launched from the RPI.
  • the RPI may provide access to a dictionary, a medical text, an internet search engine, and/or any other external or integrated application.
  • the user may switch to viewing the application in full-screen mode by grabbing the upper part of the application window and dragging upward toward the upper panel 3015.
  • the user may return to the two-pane view by dragging downward from the top of the upper panel 3015.
  • the full-screen application view may include a picture-in-picture of the live video feed from the telepresence device so that the user may continue to monitor the remote environment. Some applications (and/or the RPI) may continue to run in the background, even when not displayed.
  • FIG. 31 illustrates an embodiment 3100 of an RPI on a PED 3105, in which an image 3125 captured by a camera of the PED 3105 is displayed as a transparent image overlaid on a live video feed 3110 originating from a telepresence device.
  • the transparent image 3125 may alternatively be displayed in a lower panel 3120.
  • FIG. 32 illustrates an embodiment 3200 of an RPI on a PED 3205, including a toolbar 3225 in a lower panel 3220.
  • the toolbar may provide quick access to any of a wide variety of settings and/or features of the RPI.
  • a user may select an icon using any of a wide variety of methods depending on the PED. For instance, a user may touch an icon to select it.
  • Settings and/or features of the RPI may be accessed simultaneously while a live video feed is shown in the upper panel 3210.
  • a media management toolbar 3230 (selectively enabled) may allow for the video feed in upper panel 3210 to be recorded, at 3240.
  • a notification 3235 may alert a user of the PED 3205 that the battery on the telepresence device is nearly depleted.
  • a window 3215 may display the image currently being captured by a camera on the PED 3205, or may display modules and control operations available via the RPI, while a video feed from a telepresence device is displayed simultaneously.
  • the toolbar 3225 may provide access to a handset, a stethoscope, a camera, a video, a live cursor, a laser pointer, microphone settings, a map, navigational options, a disconnect button, and/or other features, options or settings.
  • the toolbar 3225 may provide access to various other functions or applications, such as StrokeRESPOND, SureNotes, a media manager, patient data, lab results, image data, and/or team communication.
  • FIG. 33 illustrates an embodiment 3300 of an RPI on a PED 3305 that includes a media management toolbar 3325 associated with a media management window 3330 in a lower panel 3320 of the PED 3305.
  • an upper panel 3315 may include a live video feed of a patient.
  • the user of the RPI may access stored videos, images, audio, and/or telemetry data associated with the patient via the media management toolbar 3325.
  • FIG. 34 illustrates an embodiment 3400 of an RPI on a PED 3405 including a video window 3410 displaying a list of telepresence devices to which the user has access, a work space window 3430 displaying a list of patients, and a toolbar 3415 as a tool belt dividing the display.
  • the selection of a telepresence device via video window 3410 will display a live video feed from the selected telepresence device and initiate a communication session with the telepresence device to allow the user of the RPI on PED 3405 to control the telepresence device and/or join in a multi-user experience with the telepresence device.
  • the selection of a patient via work space window 3430 may automatically select an associated telepresence device based on availability, proximity, and/or other preferences. Alternatively, the user of the RPI on the PED 3405 may additionally select a telepresence device. The selection of a patient via work space window 3430 may also direct a telepresence robot to navigate to the location of the patient.
  • FIG. 35 illustrates an embodiment 3500 of an RPI on a PED 3505 including a touch pad control pane 3540 for navigating a telepresence device while displaying a video feed from a telepresence device in an upper panel 3510.
  • a one finger tap in the upper panel 3510 may be used to control the direction in which a head portion of a telepresence device is oriented.
  • a single finger press and drag may draw a click-to-zoom box.
  • Other touch controls, such as pinch to zoom, mouse-based driving, and/or swiping to move the telepresence device may also be available in upper panel 3510.
  • a four way controller may be overlaid within the live video feed in the upper panel 3510.
  • the touch pad control pane 3540 may incorporate various touch controls. For example, a user may swipe left to gain access to local controls/settings of the PED 3505, swipe right to access telepresence device controls/settings, swipe down to access a toolbar, and use multiple fingers to drive the telepresence device, such as by defining a vector with a magnitude and direction.
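The touch-pad dispatch above can be sketched as a classifier over finger count and drag deltas. The thresholds and action names are invented; a real implementation would hook into the platform's gesture recognizer rather than raw deltas.

```python
import math

# Sketch: route touch-pad gestures to RPI actions.

def dispatch(fingers, dx, dy, swipe_threshold=50):
    """dx > 0 is a rightward drag; dy > 0 is downward (screen coordinates)."""
    if fingers >= 2:
        # Multi-finger drag defines a drive vector: heading (0 = straight
        # ahead, from an upward drag) plus magnitude in pixels.
        return ("drive", math.degrees(math.atan2(dx, -dy)), math.hypot(dx, dy))
    if abs(dx) >= max(abs(dy), swipe_threshold):
        return ("local_settings",) if dx < 0 else ("remote_settings",)
    if dy >= swipe_threshold:
        return ("toolbar",)
    return ("none",)

# One-finger swipe left opens local settings; a two-finger drag drives.
```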
  • FIG. 36 illustrates an embodiment 3600 of an RPI on a PED 3605, including an avatar display of telepresence devices in a lower panel 3620 and a video feed from a telepresence device in an upper panel 3610.
  • tapping a camera icon may capture an image. The image may then appear within a media manager in the lower panel 3620. Drag and drop availability may be available in order for the user to customize the telepresence device, such as by adding avatars or special effects to the video feed and/or images.
  • FIG. 37 illustrates an embodiment 3700 of an RPI on a PED 3705 including a visualization of a telepresence device overlaid on a video feed from a telepresence device in an upper panel 3710.
  • the user may place the chess-piece-style telepresence device visualization within the video feed in the upper panel 3710 in order to control movement of the actual telepresence device.
  • a lower panel 3730 may allow a user to provide other navigational inputs as well.
  • a lower toolbar 3720 may provide access to various settings and/or functions of the RPI.
  • FIG. 38 illustrates another embodiment 3800 of an RPI on a PED 3805 including a live video feed in an upper panel 3815, a toolbar belt including various media management icons 3817, and settings manager toolbars 3850 and 3860 for the PED 3805 and a telepresence device, respectively.
  • FIG. 39 illustrates an embodiment 3900 of an RPI on a PED 3905 that includes a landing strip navigational panel 3920.
  • a user may provide an intended navigation path that can be displayed on the upper panel 3910 and/or the lower panel 3920.
  • the landing strip navigational panel 3920 may be used to display an intended navigational path, a directed navigational path (i.e. a path provided by a user to direct the telepresence device), or a current navigation path on the upper panel 3910 as a vector and/or within the lower panel 3920 as a vector and/or as an avatar or rendered image of the telepresence device.
  • FIG. 40 illustrates an embodiment of an RPI on a PED 4000 including a landscape orientation of a full-screen video feed 4030 from a telepresence device.
  • a toolbar may be overlaid and/or included in a separate lower panel 4020.
  • FIG. 41 illustrates an embodiment 4100 of an RPI on a PED 4105 including a joystick-style control 4125 on a touch interface of a lower panel 4120 of the PED 4105.
  • the lower panel 4120 may also include information icons relating to battery level 4122 and/or network quality 4123.
  • a toolbar 4140 may provide access to various settings, features, and/or controls.
  • a full-screen icon 4130 may allow the live video feed in the upper panel 4110 to be maximized to a full-screen mode.
  • a window 4115 may be overlaid in the upper panel 4110 to show the current image captured by a camera 4180 of the PED 4105.
  • FIG. 42 illustrates an embodiment 4200 of an RPI on a PED 4205 including dual joystick-style controls 4225 and 4226 on a touch interface of a lower panel 4220 of the PED 4205.
  • the lower panel 4220 may also include information icons relating to battery level 4222 and/or network quality 4223.
  • a toolbar 4240 may provide access to various settings, features, and/or controls.
  • a full-screen icon 4230 may allow the live video feed in the upper panel 4210 to be maximized to a full-screen mode.
  • a window 4215 may be overlaid in the upper panel 4210 to show the current image captured by a camera 4280 of the PED 4205.
  • the left joystick 4226 may be configured to control movement of the base or body of the telepresence device.
  • the right joystick 4225 may be configured to control movement of a head or upper portion of the telepresence device.
  • the portion of the telepresence device controlled by each joystick 4225 and 4226 may be user-selectable and/or reversed.
  • FIG. 43 illustrates a state diagram 4300 for an RPI for use on a PED.
  • a login page 4310 may be displayed.
  • a session notify page 4330 may be displayed indicating the network and communication status.
  • a successful login may result in an in-session page being displayed 4332 with a live video feed.
  • a dock button may be selected, causing the telepresence device to display an in-transit page 4325 while it navigates to a docking station.
  • a stop button may cause a pause page 4327 to be displayed.
  • a navigation button may display a navigation page 4320.
  • the navigation page 4320 may allow a user to select between various navigation modes, such as a map button to result in a map page 4322, or a location button to display a list of locations 4324.
  • the in-transit page 4325 may be displayed as the telepresence device navigates to the selected location.
  • a settings page 4340 may be displayed allowing a user to select from any number of settings.
  • a WiFi selection may result in a WiFi page 4342 being displayed.
  • a robot selection may result in a robot page 4344 being displayed.
  • the state diagram 4300 illustrated in FIG. 43 is a simplified state diagram and intentionally omits numerous possible states and connections between states for clarity.
  • Each and every panel, icon, setting, application, option, tab, selection, input, and the like described herein may be represented as a separate state, entry action, transition condition, transition, exit action, and/or other component of a complex state diagram.
  • each of the various aspects, functions, operations, control panels, icons, objects, buttons, display panels, display windows, etc., described herein may be described and/or
  • FIG. 44 illustrates an embodiment of an RPI on a PED 4400 including a full-screen video feed 4410 from a telepresence device.
  • a toolbar 4450 and a navigational joystick 4430 are overlaid on the full-screen video feed 4410.
  • a user may touch, click, or otherwise manipulate the joystick in order to navigate the telepresence device.
  • a dual joystick overlay may be employed to provide
  • clicking a location within the live video feed 4410 may control the head movement of the telepresence device while the joystick 4430 is intended to control the body movement of the telepresence device.
  • a mouse-based driving interface illustrated in FIGS. 27B-27H may be overlaid on the full-screen video feed 4410 instead of or in addition to the joystick 4430.
  • FIG. 45 illustrates an exemplary toolbar 4500 of icons that may be overlaid within a page of an RPI and/or inserted as a separate panel within a page of an RPI.
  • the icons within toolbar 4500 may provide access to charts, vitals, telemetry data, images, lab results, a home page of the RPI, documents, notes, associated healthcare practitioners, navigation options and features, multimedia control panels, and the like.
  • Each page within the RPI may include context-based toolbars and/or general toolbars.
  • the toolbar 4500 may be positioned at a bottom, top, edge, and/or other location with respect to the RPI or a pane or panel within the RPI.
  • the toolbar 4500 may be configured to vanish and selectively reappear, such as, for example, upon a mouseover, swipe, or other action.
  • FIG. 46 illustrates an embodiment 4600 of an RPI on a PED 4605 including an overlaid instructional panel 4630 describing how a user may manually drive a telepresence device.
  • an upper panel 4610 may include a live video feed from the telepresence device.
  • a toolbar 4650 may provide access to various settings, controls, and/or functions.
  • the toolbar 4650 may provide access to a headset, stethoscope, camera, video, navigation, local camera, point, laser, mute, settings for local and remote devices, and/or a
  • the instructional panel 4630 may provide instructions for finger swipes to move the telepresence device forward and reverse 4631, slide the telepresence device side to side 4632, rotate the telepresence device to the right 4633, and rotate the telepresence device to the left 4634. Separate control of a head portion of the telepresence device may be available using multiple fingers, using a toggle button (software or hardware), or by tapping within the live video feed 4610.
  • a lower toolbar 4640 may provide access to various other functions, features, and/or settings of the RPI, such as those described herein and especially in conjunction with FIG. 45.
  • the RPI may include instructional or demonstrational videos for any of the various embodiments or functions described herein.
  • FIG. 47 illustrates an embodiment 4700 of an RPI on a PED 4705 during a multi-participant telepresence session.
  • a live video feed of a patient may be displayed in an upper panel 4710 of the PED 4705.
  • a toolbar 4750 (and/or 4740) may provide access to various related functions, settings, and/or controls.
  • a lower panel 4730 may include video feeds 4731, 4732, and 4733 from each of three participants in the multi-participant telepresence session.
  • Each of the three participants may be using an RPI on a PED and see a similar image of the video feed of the patient in the upper panel 4710. Any number of participants may participate.
  • each participant may be able to control the telepresence device.
  • only one, or only a select few, of the participants may have control of the telepresence device.
  • FIG. 48 illustrates a window or panel 4800 accessible via an RPI on a PED providing access to a care team of a particular patient.
  • the care team panel 4800 may be presented in a full-screen mode and/or as a panel within a display of multiple panels, such as illustrated in FIG. 17.
  • the care team panel 4800 may identify the relevant patient by name 4805 and/or an identification number 4807.
  • the care team panel 4800 may include a column of healthcare practitioners 4810 associated with the patient 4805.
  • a column of healthcare practitioner data 4820 may describe each of the healthcare practitioners 4810 associated with the patient 4805. Whether each healthcare practitioner 4810 is on duty or off duty may be displayed in a third column 4830.
  • a user may also select icons via the RPI to consult 4840, text (e.g., an SMS or email) 4850, and/or call 4860 the associated healthcare practitioner 4810.
  • the RPI may utilize inputs provided via panel 4800 to perform one or more functions via the telepresence device and/or a related telepresence system.
  • FIG. 49 illustrates an exemplary overlay help screen 4900 accessible within the RPI on a PED to provide instructions regarding available functions on any given screen.
  • a user may have selected a "help" icon or be in a training mode in order to be presented with instruction on how to use a particular interface or toolbar within a screen of the RPI.
  • a window 4950 may display what is currently being captured by a camera of the PED and displayed remotely on a display interface of a telepresence device.
  • a full-screen image of a live video feed 4910 from a telepresence device may be displayed.
  • a toolbar 4920 may be displayed on a top edge (or other location) of the full-screen display of the live video feed 4910. The toolbar may be pulled down from a hidden state and/or vanish and reappear, as described herein.
  • a help screen overlay may provide instructions and/or guidance with how-to toolbars and/or other interface options. As illustrated in FIG. 49, the overlay may comprise text descriptions of the toolbar icons connected by lines to each icon. In some embodiments, instructions and/or guidance for camera controls and/or driving controls for the telepresence device may be illustrated as an overlay as well. For example, instructions on how to use the joysticks 4225 and 4226 in FIG. 42 may be provided in a help screen overlay. Similarly, the gestures 4631, 4632, 4633, and 4634 in FIG. 46 may be provided in an instructional or help screen overlay. In some embodiments, the instructional overlay may be in the form of moving or video instructions overlaid on an existing display. For example, the overlay may
  • an RPI may be configured with all or some of the features and embodiments described herein.
  • an RPI may include any number of the features and embodiments described herein as selectively displayed and/or selectively functional options.
  • An explicit enumeration of all possible permutations of the various embodiments is not included herein; however, it will be apparent to one of skill in the art that any of the variously described embodiments may be selectively utilized, if not at the same time, in a single RPI.

Abstract

The present disclosure describes various aspects of remote presence interfaces (RPIs) for use on portable electronic devices (PEDs) to interface with remote telepresence devices. An RPI may allow a user to interact with a telepresence device, view a live video feed, provide navigational instructions, and/or otherwise interact with the telepresence device. The RPI may allow a user to manually, semi-autonomously, or autonomously control the movement of the telepresence device. One or more panels associated with a video feed, patient data, calendars, date, time, telemetry data, PED data, telepresence device data, healthcare facility information, healthcare practitioner information, menu tabs, settings controls, and/or other features may be utilized via the RPI.

Description

GRAPHICAL USER INTERFACES INCLUDING TOUCHPAD DRIVING
INTERFACES FOR TELEMEDICINE DEVICES
RELATED APPLICATIONS
[0001] This U.S. Patent Application claims priority under 35 U.S.C. §119(e) to: U.S. Provisional Application No. 61/650,205 filed May 22, 2012, titled "Remote Presence Interface and Patient Data Integration;" U.S. Provisional Application No. 61/674,794 filed July 23, 2012, titled "Graphical User Interfaces Including Touchpad Driving Interfaces for Telemedicine Devices;" U.S. Provisional Application No.
61/674,796 filed July 23, 2012, titled "Clinical Workflows Utilizing Autonomous and Semi-Autonomous Telemedicine Devices;" U.S. Provisional Application No.
61/674,782 filed July 23, 2012, titled "Behavioral Rules For a Telemedicine Robot To Comply With Social Protocols;" U.S. Provisional Application No. 61/766,623 filed February 19, 2013, titled "Graphical User Interfaces Including Touchpad Driving Interfaces for Telemedicine Devices;" which applications are all hereby incorporated by reference in their entireties.
TECHNICAL FIELD
[0002] This disclosure relates to interactive and display interfaces for
telepresence devices in healthcare networks. More specifically, this disclosure provides various graphical user interfaces and interactive interfaces for remote presence devices.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] Non-limiting and non-exhaustive embodiments of the disclosure are described herein, including various embodiments of the disclosure illustrated in the figures listed below.
[0004] FIG. 1 illustrates an embodiment of a home page of a portable electronic device (PED), including an application providing a remote presence interface (RPI) for interacting with a telepresence device.
[0005] FIG. 2A illustrates an embodiment of an initial launch page for an RPI or other application associated with a telepresence device. [0006] FIG. 2B illustrates an embodiment of an initial launch page for an RPI or other application associated with a telepresence device intended to induce a user to orient a PED in a portrait orientation.
[0007] FIG. 2C illustrates a front-facing camera located near the top of a PED in a portrait orientation capturing an image of a healthcare practitioner.
[0008] FIGS. 3A and 3B illustrate embodiments of a login page for the RPI.
[0009] FIG. 4 illustrates an embodiment of an endpoint list of various
telepresence devices and their connectivity state.
[0010] FIG. 5 illustrates a connection wizard configured to facilitate the
connection of a user to a telepresence network.
[0011] FIG. 6 illustrates an exemplary embodiment of a module within the RPI configured to provide visual and interactive control of a telepresence device.
[0012] FIG. 7 illustrates an embodiment of a map and navigation module of the RPI.
[0013] FIG 8 illustrates an example of a notification that may be displayed to a user via the RPI.
[0014] FIG. 9 illustrates an embodiment of a media manager module of the RPI configured to allow a user to capture and manage audiovisual media from the telepresence device.
[0015] FIG. 10 illustrates an embodiment of a menu of the RPI allowing a user to modify various settings on the local PED and/or the remote telepresence device.
[0016] FIG. 11 illustrates an embodiment of the RPI in full-screen video mode.
[0017] FIG. 12 illustrates an embodiment of the RPI in full-screen data mode.
[0018] FIG. 13 illustrates an embodiment of the RPI including a visual
representation of patient telemetry data.
[0019] FIGS. 14A-C illustrate exemplary embodiments of a telepresence device.
[0020] FIG. 15 illustrates an RPI configured to provide patient visualization, remote settings control, navigation control, and access to patient data integration.
[0021] FIG. 16 illustrates a navigation module within the RPI configured to allow a user to navigate a telepresence device using one or more navigational techniques.
[0022] FIG. 17 illustrates the RPI displaying a video of a patient, associated telemetry data, and associated records.
[0023] FIGS. 18A and 18B illustrate embodiments of what may be displayed during a consult on a telepresence device and via an RPI on a PED, respectively. [0024] FIG. 19 illustrates an embodiment of a telepresence device in a
screensaver mode.
[0025] FIG. 20 illustrates various embodiments of avatars and/or personalities that may be assigned or used by a medical practitioner and/or telepresence device.
[0026] FIG. 21 illustrates an RPI configured to utilize various pointer-based navigational modes for navigating a telepresence device.
[0027] FIG. 22 illustrates an RPI configured to utilize destination-based navigational modes for navigating a telepresence device.
[0028] FIG. 23 illustrates a selectable destination patient list that can be used within an RPI to navigate a telepresence device.
[0029] FIG. 24 illustrates an RPI configured to utilize map-based navigational modes for navigating a telepresence device.
[0030] FIG. 25 illustrates a full-screen view of a hallway from a telepresence device as visualized on a PED via an RPI.
[0031] FIG. 26 illustrates a full-screen view of various doorways in a healthcare facility, as visualized on a PED via an RPI.
[0032] FIG. 27A illustrates an intended or directed navigation path of a
telepresence device as visualized on a PED via an RPI.
[0033] FIG. 27B illustrates a mouse-based driving interface for navigating a telepresence device via an RPI.
[0034] FIG. 27C illustrates the mouse-based driving interface with a long drive vector drawn on the video feed to indicate a direction and a relatively fast velocity.
[0035] FIG. 27D illustrates the mouse-based driving interface with a short drive vector drawn on the video feed to indicate a direction and a relatively slow velocity.
[0036] FIG. 27E illustrates the mouse-based driving interface with a vector for rounding a corner within the RPI.
[0037] FIG. 27F illustrates the mouse-based driving interface with a vector drawn to cause the telepresence device to spin in place.
[0038] FIG. 27G illustrates the mouse-based driving interface with a vector drawn towards an object.
[0039] FIG. 27H illustrates the mouse-based driving interface used to reverse the telepresence device with a camera of the telepresence device oriented in reverse and slightly downward. [0040] FIG. 28 illustrates a bedside view of a patient bed and an associated patient monitor, as visualized on a PED via an RPI.
[0041] FIG. 29 illustrates a click-to-zoom feature of an RPI that can be used when medical practitioners and/or other users visualize a patient on a PED via an RPI.
[0042] FIG. 30 illustrates a StrokeRESPOND application for accessing a telepresence device that may be a separate application or integrated within an RPI.
[0043] FIG. 31 illustrates a transparent user image overlaid on an image generated by a telepresence device.
[0044] FIG. 32 illustrates a toolbar for managing modules and control operations available via an RPI, while simultaneously displaying a video feed from a
telepresence device.
[0045] FIG. 33 illustrates a toolbar associated with a media management module of an RPI.
[0046] FIG. 34 illustrates a toolbar separating an endpoint list and a patient list in an RPI, allowing for quick user selection of various possible permutations.
[0047] FIG. 35 illustrates a view of a touch pad control pane for navigating a telepresence device while displaying a video feed from a telepresence device in an upper window.
[0048] FIG. 36 illustrates an avatar display of telepresence devices in a lower window and a video feed from a telepresence device in an upper window.
[0049] FIG. 37 illustrates a visualization of a telepresence device overlaid on a video feed from a telepresence device that may be useful for navigation of the telepresence device.
[0050] FIG. 38 illustrates a toolbar of an RPI associated with a settings manager for managing settings on a local PED and/or a telepresence device.
[0051] FIG. 39 illustrates a navigational mode including a landing strip
navigational panel allowing a user to specify a direction of a telepresence device.
[0052] FIG. 40 illustrates a full-screen video feed from a telepresence device, including an overlaid toolbar.
[0053] FIG. 41 illustrates a joystick-style control on a touch interface of a PED for controlling a telepresence device via an RPI.
[0054] FIG. 42 illustrates a dual joystick-style control on a touch interface of a
PED for controlling a telepresence device via an RPI.
[0055] FIG. 43 illustrates a state diagram for an RPI for use on a PED. [0056] FIG. 44 illustrates a full-screen video feed from a telepresence device, including an overlaid toolbar and joystick control.
[0057] FIG. 45 illustrates an exemplary toolbar of icons that may be overlaid within an RPI.
[0058] FIG. 46 illustrates an overlaid instructional panel associated with driving a telepresence device via an RPI on a PED.
[0059] FIG. 47 illustrates a multi-participant telepresence session conducted via an RPI on a PED.
[0060] FIG. 48 illustrates a window accessible via an RPI on a PED providing access to a care team of a particular patient.
[0061] FIG. 49 illustrates an exemplary overlay help screen accessible within the RPI on a PED to provide instructions regarding available functions on any given screen.
[0062] The described features, structures, and/or characteristics of the systems and methods described herein may be combined in any suitable manner in one or more alternative embodiments, and may differ from the illustrated embodiments.
DETAILED DESCRIPTION
[0063] Healthcare facilities may include telemedicine technologies, such as telepresence devices in a telepresence network, that allow remote healthcare practitioners to provide services to patients and/or other healthcare practitioners in remote locations. For example, a remote healthcare practitioner may be a neurologist practicing in a relatively large hospital who may, via a telepresence device, provide services and consultations to patients and/or other medical professionals in a relatively small hospital that may otherwise not have a neurologist on staff.
[0064] The present disclosure provides various interfaces for visualizing, controlling, and driving telepresence devices. In a remote presence (RP) system, a telepresence device, such as an autonomous or semi-autonomous robot, may communicate with an interfacing device via a communications network. In various embodiments, a portable electronic device (PED) may be used to run a remote presence interface (RPI) adapted to provide various telepresence functions. [0065] In various embodiments, an RPI application may be executed on a device configured as a stand-alone PED configured to solely run the RPI application.
Alternatively, the RPI application may be executed by any of a wide variety of PEDs configured as multi-purpose PEDs, such as the Apple iPad®. In various
embodiments, a user may launch an RPI application and log in using login
credentials. The login process may utilize any of a wide variety of encryption algorithms and data protection systems. In various embodiments, the login process may meet or exceed standards specified for Health Insurance Portability and
Accountability Act (HIPAA) compliance.
[0066] A user may select one or more endpoint telepresence devices via the RPI. Once a secure connection is established, the RPI may display a video feed from the telepresence device on the PED. In addition, the RPI may include any number of navigational panels, setting controls, telemetry data displays, map views, and/or patient information displays.
[0067] In some embodiments, the RPI may allow a user to manually, semi-autonomously, or autonomously control the movement of the telepresence device. The RPI may allow a user to specify movement (i.e., a location within a healthcare facility or a physical movement, such as a head turn, of the telepresence device) using a destination selection panel, an arrow, a physical or virtual joystick, a touch pad, click-to-destination, vector-based driving, mouse-based vector driving, and/or other navigational control.
[0068] For example, a user may provide a navigation input by selecting a location within a live video feed (e.g., via a click or a touch). The navigation input may be used to transmit navigation instructions to a remote telepresence device. For instance, in a click drive mode, the selection of a location within the live video feed may be used to navigate the telepresence device to the selected location. The selection of a location on a floor or hallway may result in navigation instructions that cause the telepresence robot to autonomously or semi-autonomously navigate to the selected location.
[0069] In some embodiments, an operator may input a navigation input in the form of a navigation vector provided (e.g., drawn, traced, mapped) with respect to the live video feed. The navigation vector may include a length on the display and/or other input device (e.g., touchpad) and an angle with respect to the plane of the display and/or other input device. As an example, a plane of the display and/or other input device may be described with a Cartesian coordinate system as having an x-axis and a y-axis. The navigation vector may be provided with respect to the display plane and described as having an x-axis component (horizontal direction) and a y-axis
[0070] According to various embodiments, a navigation input provided as a navigation vector may be decomposed into a horizontal (x-axis) component and a vertical (y-axis) component. The length and sign (i.e., positive or negative value) of the horizontal component of the navigation vector may be used to determine a magnitude and direction of a rotational velocity and/or an angular displacement of a telepresence device. The length of the vertical component of the navigation vector may be used to determine the magnitude of a forward velocity and/or a forward displacement of a telepresence device.
[0071] The length of the horizontal component may be used to determine the magnitude of the rotational velocity and/or angular displacement using a scaling function. The scaling function may be constant, linear, and/or non-linear. Thus, using a non-linear scaling function, a first horizontal component twice as long as a second horizontal component may not result in a first rotational velocity double that of the second rotational velocity. Similarly, the length of the vertical component may be used to determine the magnitude of the forward velocity and/or forward displacement using a scaling function. Again, the scaling function may be constant, linear, and/or non-linear. The scaling function used to translate the horizontal component may be different than the scaling function used to translate the vertical component.
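The decomposition and scaling described in paragraphs [0070] and [0071] might be sketched as follows. The function names, pixel range, and velocity limits are illustrative assumptions, not values from this disclosure; a quadratic scaling function is used purely as one example of the non-linear case:

```python
import math

def scale_component(length_px, max_len_px=300.0, max_out=1.0, exponent=2.0):
    # Non-linear (quadratic) scaling: a drag twice as long yields four
    # times the velocity, not double -- fine control for short drags,
    # fast motion for long ones.
    frac = min(abs(length_px) / max_len_px, 1.0)
    return max_out * frac ** exponent

def vector_to_velocities(dx_px, dy_px):
    # Screen y grows downward, so an upward drag (negative dy) maps to
    # forward motion; the sign of dx sets the rotation direction.
    forward = math.copysign(scale_component(dy_px, max_out=1.5), -dy_px)   # m/s
    rotation = math.copysign(scale_component(dx_px, max_out=1.0), dx_px)   # rad/s
    return forward, rotation
```

Different `max_out` and `exponent` values per axis correspond to the paragraph's point that the horizontal and vertical components may use different scaling functions.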
[0072] Alternatively, selectively, and/or in a different navigation mode, the selection of a location within the live video feed may be used to generate a navigation vector, where the length of the vector corresponds to the velocity at which the telepresence device should navigate and/or the distance the telepresence device should navigate, and the angle of the vector corresponds to the direction the telepresence device should navigate. For instance, a navigation input may be in the form of a vector drawn as either a final endpoint selection or as a vector including a beginning point and an end point. The end point of the vector may represent the location to which the telepresence device should navigate (i.e., a desired navigation point). Navigation instructions may be generated based on the current location of the telepresence device and the endpoint of the vector within the live video feed. According to some embodiments, the vector's image pixel endpoint may be mapped via a lookup table to one or more translate and rotate commands to cause the telepresence device to navigate to the selected location. In some embodiments, the length of the vector may correspond to the desired distance to move the
telepresence device, a desired acceleration of the telepresence device, and/or a desired velocity of the telepresence device.
[0073] As an example, a navigation input may be received that directs a telepresence device to navigate forward (with respect to the live video feed) 3 meters and to the right 2 meters. Any of a wide variety of navigation instructions may be possible to correctly navigate the telepresence device. For instance, the navigation instructions may direct the telepresence device to navigate approximately 3.6 meters at a 34 degree angle relative to the video feed. Alternatively, the navigation instructions may direct the telepresence device to navigate 3 meters forward and then 2 meters to the right. As will be appreciated, any number of navigation routes and corresponding navigation instructions may be possible. The navigation input may be translated into navigation instructions as a plurality of drive forward commands coupled with rotate commands.
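The worked example above (3 meters forward, 2 meters right) can be reproduced with basic trigonometry. This sketch assumes a simple rotate-then-translate command pair, which is only one of the many possible instruction sequences the paragraph notes; the command names are illustrative:

```python
import math

def click_to_commands(forward_m, right_m):
    # Convert a selected point, expressed in metres relative to the live
    # video feed, into one rotate command and one translate command.
    distance = math.hypot(forward_m, right_m)
    heading_deg = math.degrees(math.atan2(right_m, forward_m))
    return [("rotate_deg", round(heading_deg)), ("translate_m", round(distance, 1))]
```

Here `click_to_commands(3, 2)` returns `[("rotate_deg", 34), ("translate_m", 3.6)]`, matching the approximately 3.6 meters at a 34 degree angle given above.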
[0074] In various embodiments, the navigation instructions are derived from a navigation input indicating a desired direction and/or location and the current location of the telepresence device. In some embodiments, the live video feed may be delayed slightly due to network and/or processing limitations. For example, a live video feed may be delayed by a few tenths of a second or even by a few seconds. This video latency may result in inaccurate navigation instructions. Accordingly, the navigation input may be adjusted based on the known video latency and the velocity and direction of the robot.
[0075] Returning to the example above, if the telepresence device were traveling at 1.25 meters per second in a forward direction relative to the live video feed and the video latency was 0.8 seconds, then the telepresence device will have already traveled 1 meter of the desired 3 meters when the selection is made. Accordingly, the selection of a location 3 meters ahead and 2 meters to the right may be translated or mapped to navigation instructions that cause the telepresence device to travel 2 meters forward and 2 meters to the right (or 2.8 meters at a 45 degree angle) to compensate for the movement of the telepresence device. Thus, the navigation instructions may be adjusted based on the latency of the video feed. [0076] In various embodiments, a navigation input in the form of a vector (such as a vector provided in a mouse-based vector input mode) may be translated into navigation instructions through the use of a lookup table. In some embodiments, the lookup table may include values that compensate for latency. That is, the navigation instructions returned by the lookup table may be based on the navigation input and the current or recent average video latency.
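The latency adjustment in the example above might be sketched as follows, assuming the device is moving straight ahead relative to the feed when the selection is made (the function and parameter names are illustrative):

```python
def compensate_for_latency(forward_m, right_m, velocity_mps, latency_s):
    # Subtract the distance already travelled during the video delay
    # from the forward component of the navigation input.
    already_travelled = velocity_mps * latency_s
    adjusted_forward = max(forward_m - already_travelled, 0.0)
    return adjusted_forward, right_m
```

With the figures above, `compensate_for_latency(3.0, 2.0, 1.25, 0.8)` yields 2 meters forward and 2 meters to the right, i.e. about 2.8 meters at a 45 degree angle.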
[0077] Various alternative navigation systems, methods, and processing steps may be used in conjunction with the presently described remote presence interface, including those described in United States Patent No. 6,845,297, titled "Method and System for Remote Control of Mobile Robot," filed on January 9, 2003, and
European Patent No. 1279081, titled "Method and System for Remote Control of Mobile Robot," filed on May 1, 2001, which applications are hereby incorporated by reference in their entireties.
[0078] The RPI may provide various notifications associated with the network connection, the PED, a patient, a healthcare facility, a healthcare practitioner, a telepresence device, and/or the like. The RPI may include a media management module configured to allow a user to record and/or store audio and/or visual data for subsequent use. A settings panel may allow settings on the PED and/or the telepresence device to be adjusted. In some views, multiple windows may provide quick access to various panels of information. For example, one or more panels associated with a video feed, patient data, calendars, date, time, telemetry data, PED data, telepresence device data, healthcare facility information, healthcare practitioner information, menu tabs, settings controls, and/or other features may be displayed simultaneously and/or individually in a full-screen mode.
[0079] The RPI may utilize a camera of the PED to capture an image of the user of the PED and project the image on a screen on the telepresence device. In some embodiments, the image on the screen of the telepresence device may be modified and/or enhanced. In one embodiment, an avatar representing the user of the PED is displayed on the PED. In some embodiments, the RPI may encourage or induce a user to utilize the PED with a front-facing camera at or above eye-level. Similarly, the RPI may encourage or induce a user to utilize the PED with a front-facing camera approximately centered on a user's face. For instance, on an Apple iPad®, the RPI may encourage a user to utilize the iPad® in a portrait mode for many tasks in order to maintain a more natural perspective of the user for projection via the screen on the telepresence device.
[0080] In various embodiments, the RPI may facilitate conference sessions with more than two people interacting via a combination of PEDs and/or telepresence devices. For example, multiple healthcare practitioners may participate in a remote consultation. In such an example, each healthcare practitioner may utilize an RPI on a PED to access a telepresence device at a bedside of a patient. The remote presence system, via the network, servers, RPIs, and/or telepresence devices, may facilitate the multi-user experience.
[0081] The RPI may incorporate sub-applications and/or provide access to related applications, such as a StrokeRESPOND application configured to provide one or more functions and/or workflow processes described in U.S. Patent
Application No. 12/362,454, titled "DOCUMENTATION THROUGH A REMOTE PRESENCE ROBOT," filed on January 29, 2009, which application is hereby incorporated by reference in its entirety.
[0082] As described herein and illustrated in various embodiments, the display of a PED may be utilized by the RPI to display any combination of video feed panels, informational panels, data panels, setting control panels, navigation panels, and/or panels providing access to any of various functions made accessible via the RPI. The RPI may be configured to maintain a "stateful" connection at the application layer, such that a session and/or variables may be continued and/or maintained in the event that the connection is lost or dropped. The RPI application may attempt to re-establish a disconnected session using saved or logged variables in the event a connection is lost or dropped. The PED and/or RPI application may have settings that enable a user to maximize frame rate or image quality at the expense of the battery life of the device, or vice versa.
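The stateful reconnection behavior described above may be sketched as follows. This is an illustrative sketch only; the names (`SessionState`, `reconnect`), the particular session variables saved, and the retry policy are assumptions for purposes of illustration and are not part of the disclosed embodiments:

```python
import time

class SessionState:
    """Hypothetical snapshot of an RPI session's variables, saved so a
    dropped connection can be re-established where it left off."""
    def __init__(self, endpoint_id, zoom=1.0, mic_muted=False):
        self.endpoint_id = endpoint_id
        self.zoom = zoom
        self.mic_muted = mic_muted

def reconnect(saved, connect_fn, attempts=3, delay_s=0.0):
    """Retry the connection, restoring the saved session variables on
    success. Returns the restored session, or None if all attempts fail."""
    for _ in range(attempts):
        session = connect_fn(saved.endpoint_id)
        if session is not None:
            # Re-apply the logged variables so the session continues
            # where it left off.
            session.update(zoom=saved.zoom, mic_muted=saved.mic_muted)
            return session
        time.sleep(delay_s)
    return None
```

In a real implementation the saved state would also capture in-progress recordings, pointer state, and the like, and the retry loop would back off between attempts.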
[0083] The RPI may include an information bar ("status bar") that displays various status information related to the PED, including battery life, wireless connection strength, wireless connection name or SSID, or the current time. The RPI may include one or more toolbars. A toolbar may be disposed along the top edge of the upper pane of the RPI. The toolbar may be manually hidden by touching or selecting an icon or a specified area of the display, or the toolbar may auto-hide after a period of time. Once hidden, the user may un-hide the toolbar by touching, selecting, or otherwise moving an input device on or around an icon or specified area of the display.
[0084] The RPI may include a "Picture-in-Picture" region or window that displays local video or image data currently being captured by the camera of the PED. The local video feed may be captured from a camera either incorporated within or otherwise in communication with the PED. The user may resize the local video window, reposition it within the display, and/or remove it. The local video window may be displayed in a lower pane of the GUI, while the remote video is displayed in the upper pane, or vice versa.
[0085] The RPI may include an in-video interface that enables the user to control the endpoint device by interacting with the live video feed. For example, when the user touches, taps, clicks, or otherwise selects a point in the live video feed, the endpoint may change a mechanical pan or tilt, or digital pan or tilt, or both, such that the point selected by the user is centered in the live video feed. The user may also adjust an optical or digital zoom, or both, of the live video feed. For example, a user may adjust an optical and/or digital zoom by pinching together or spreading apart two or more fingers on the surface of a touch-sensitive display of the PED. As another example, the user may control all or some of a mechanical or digital pan, tilt, or zoom of the live video feed by pressing and dragging a finger on the surface of the display to specify a diagonal or other dimension of a zoom region. After a period of time or upon releasing his or her finger, the endpoint may mechanically or digitally pan, tilt, or zoom to match a dimension of the zoom region to a dimension of the live video feed, and/or to match a center of the zoom region to a center of the live video feed. In one embodiment, a user may zoom out to a default zoom (which may be fully zoomed out) by performing a double-tap in the live video feed, or by double-clicking or right-clicking a mouse cursor within the live video feed.
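The tap-to-center behavior above can be illustrated with a simple pixel-to-angle conversion. This sketch assumes a linear (pinhole-style) mapping from screen position to view angle and a camera whose field of view is known; the function name and parameters are hypothetical:

```python
def tap_to_pan_tilt(tap_x, tap_y, width, height, h_fov_deg, v_fov_deg):
    """Convert a tap at pixel (tap_x, tap_y) in a width-by-height video
    feed into pan/tilt deltas (in degrees) that re-center the camera on
    the selected point."""
    dx = (tap_x - width / 2) / width    # -0.5 .. 0.5 of the horizontal FOV
    dy = (tap_y - height / 2) / height  # -0.5 .. 0.5 of the vertical FOV
    pan = dx * h_fov_deg                # positive: pan right
    tilt = -dy * v_fov_deg              # positive: tilt up (screen y grows downward)
    return pan, tilt
```

The endpoint could apply the deltas mechanically, digitally (by shifting a crop window), or as a combination of both, as described above.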
[0086] In one embodiment, the user may direct movement of a telemedicine device (or other telepresence device) to a particular location within the live video feed by performing a "touch-hold-tap," where the user touches a location on the screen, holds his or her finger on that location for a brief interval until a cursor appears on the screen under the user's finger, positions the cursor (which now follows the user's finger) at a point in the remote video window representing the desired destination, and subsequently taps his or her finger once again to confirm the location as the desired destination. The telepresence device may then proceed to a position in the live video feed corresponding to the location selected by the user.
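The "touch-hold-tap" gesture may be modeled as a small state machine. The sketch below is illustrative only; the hold interval `HOLD_S` and the class structure are assumptions, and a production interface would also handle touch-up, cancellation, and multi-touch events:

```python
HOLD_S = 0.5  # assumed hold interval (seconds) before the cursor appears

class TouchHoldTap:
    """Minimal state machine for the 'touch-hold-tap' drive gesture:
    touch and hold until a cursor appears under the finger, drag the
    cursor to the desired destination, then tap again to confirm."""
    def __init__(self):
        self.state = "idle"
        self.cursor = None

    def touch_down(self, x, y, t):
        self.state, self._t0, self.cursor = "holding", t, (x, y)

    def touch_move(self, x, y, t):
        if self.state == "holding" and t - self._t0 >= HOLD_S:
            self.state = "cursor"      # cursor appears and follows the finger
        if self.state == "cursor":
            self.cursor = (x, y)

    def tap(self):
        """Second tap confirms; returns the destination point or None."""
        confirmed = self.cursor if self.state == "cursor" else None
        self.state = "idle"
        return confirmed
```

The confirmed screen point would then be projected onto the floor plane of the live video feed to produce the drive destination.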
[0087] Additional functionalities available via the RPI through a touch screen interface may include a two-finger swipe to display video from one or more auxiliary video sources, such as a video camera, still camera, endoscope, ultrasound device, radiological imaging device, magnetic resonance imaging device, or other medical imaging device.
[0088] A toolbar may be shrunken and/or expanded (vertically or horizontally, or both), or hidden and unhidden, by touching or tapping an icon disposed at the top of the screen. For example, an icon may have the appearance of an arrow or triangle that points "up" when the toolbar is fully expanded or unhidden, and may point "down" when the toolbar is shrunken or hidden. The icon itself may shrink or expand with the toolbar. Alternatively, the toolbar may be unhidden by the user "pulling down" or making a downward swipe from the top (or other edge) of the screen. The toolbar may again be hidden by "pushing up" or making a swipe toward the top (or other edge) of the screen.
[0089] Additional functions or applications may be accessed from the toolbar. Each function or application may have a distinctive icon in the toolbar indicative of the underlying functionality. If there are more functions/icons available than can be displayed on the toolbar, the toolbar may be configured to scroll icons onto and off of the toolbar with a swipe of the user's finger in the toolbar. The icons may scroll continuously in a carousel fashion, or the scrolling may be disabled in a particular direction when there are no further icons to be displayed, thus informing the user of this fact. Alternatively, the toolbar may not scroll available icons, but instead show only a specified set of icons. In this case, additional functions or applications would be exposed to the user via additional menu levels, windows, or pop-overs, accessible via one or more icons contained in the toolbar or elsewhere in the RPI.
[0090] The toolbar may provide access to various functions of the RPI such as activating or deactivating a headset or handset (which may be coupled to the telepresence device, in either a wired or wireless fashion) when additional privacy is desired in communicating with the remote user (i.e., the user at the control station/PED); activating or deactivating a stethoscope for monitoring a patient;
creating a still image or snapshot from a camera (still or video) on or otherwise coupled to the telepresence device or any auxiliary input; starting or stopping recording video from a camera on or coupled to the telepresence device or any auxiliary input; navigation features; opening or bringing up a map or list of destinations (e.g., people's names associated with a device or location); reversing local camera view (toggle local video to come from front of the PED or back of the PED); activating or deactivating remote pointer (cursor on remote display follows or mirrors cursor position on local display, which is positioned by the user tapping or touching the desired location on the display); activating or deactivating a laser pointer on the remote device (a laser pointer may be positioned on the telepresence device so as to correspond to a position of tap or touch within the live video feed); muting or un-muting a microphone of the PED; opening the settings panel;
disconnecting; and/or other features, options, and/or settings.
[0091] From the lower pane of the RPI, the user may initiate a vector drive mode for controlling a telepresence device. For example, a mouse click, touch-drag motion, or other action may be used to "draw" or otherwise input a vector for controlling the motion of a telepresence device. For example, a user may touch two fingers in the lower pane and drag them a desired distance and direction on the screen to send a drive command with a respective velocity and heading to the mobile telepresence device.
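The translation from a touch-drag to a velocity-and-heading drive command can be sketched as below. The constants `MAX_SPEED` and `FULL_SPEED_DRAG` are illustrative assumptions, as is the convention that heading is measured clockwise from "straight ahead" (up on the screen):

```python
import math

MAX_SPEED = 1.0        # assumed top speed of the telepresence device, m/s
FULL_SPEED_DRAG = 300  # assumed drag length (pixels) that maps to full speed

def drag_to_drive_command(x0, y0, x1, y1):
    """Translate a touch-drag in the lower pane into a (velocity, heading)
    drive command. Velocity scales with drag length, capped at MAX_SPEED;
    heading is in degrees clockwise from straight ahead (up on screen)."""
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    velocity = min(length / FULL_SPEED_DRAG, 1.0) * MAX_SPEED
    heading = math.degrees(math.atan2(dx, -dy)) % 360  # up = 0 degrees
    return velocity, heading
```

A drag straight up thus commands forward motion, a longer drag commands a higher velocity, and a drag to the right commands a 90-degree heading.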
[0092] The interface may further include a mechanism for deactivating an obstacle detection/obstacle avoidance (ODOA) system, such that the user may operate in a "nudge mode" and continue to slowly move the telepresence device despite the telepresence device being in contact with or in close proximity to another object. This mechanism may automatically time out such that the ODOA system is re-engaged after a period of inactivity. Placing the telepresence device in nudge mode may cause a camera of the telepresence device to be automatically positioned to look down around the body of the telepresence device and either in the direction of the drive command issued by the user or in the direction of an object closest to or in contact with the telepresence device (so that the user may see what obstacle he or she is commanding the telepresence device to nudge). Alternatively, the ODOA system may be entirely deactivated. In one embodiment, the ODOA system may be deactivated so long as the telepresence device is driven or moved below a specified velocity.
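The automatic time-out of nudge mode can be captured by a small gate object. This is an illustrative sketch; the `NUDGE_TIMEOUT_S` value and the convention that each drive input extends the bypass window are assumptions:

```python
NUDGE_TIMEOUT_S = 10.0  # assumed inactivity window before ODOA re-engages

class OdoaGate:
    """Tracks whether obstacle detection/avoidance is bypassed ('nudge
    mode'). The bypass automatically times out after a period of driver
    inactivity, re-engaging the ODOA system."""
    def __init__(self):
        self._nudge_until = None

    def enter_nudge_mode(self, now):
        self._nudge_until = now + NUDGE_TIMEOUT_S

    def drive_input(self, now):
        # Driver activity while in nudge mode extends the bypass window.
        if not self.odoa_active(now):
            self._nudge_until = now + NUDGE_TIMEOUT_S

    def odoa_active(self, now):
        """True when the ODOA system is engaged (nudge mode off or expired)."""
        return self._nudge_until is None or now >= self._nudge_until
```

A velocity-limited variant, as in the last embodiment above, would additionally check the commanded speed before honoring the bypass.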
[0093] The RPI may be configured for deployment on any of a wide variety of PEDs, including the Apple iPad®, iPod®, and iPhone®, as well as mobile phones, computers, laptops, tablets, and/or any other mobile or stationary computing device. In some embodiments, the RPI may be presented via a plug-in or in-browser application within a standard web browser.
[0094] In some embodiments, the RPI may allow for varying feature sets depending on the type of PED utilized. For example, on a larger tablet-sized PED, the RPI may include any combination of the numerous features described herein, while on a smaller PED, such as an Apple iPhone®, the available features may be limited to suit a particular context or use-case. In one embodiment, the RPI may allow a user, such as a nurse or hospital staffer, to control the movement of a telepresence device without establishing a telepresence audio/video session. In other embodiments, for example, a remote family member of a patient may conduct a two-way voice and video telepresence session with his or her iPhone, but may not have permission to drive or otherwise control the movement of the telepresence device. Any number of features or combination of features may be included and/or excluded for a particular PED.
[0095] As an example scenario, a nurse may utilize an RPI on a PED with limited functionality, such as an Apple iPhone®, to request a cardiac consult for a patient. A telepresence system in contact with the RPI may submit the request to a telepresence device. The telepresence device may begin navigating to the patient while simultaneously initiating a connection with a cardiac doctor. For example, the telepresence device may call an appropriate cardiac doctor (e.g., the cardiologist on call, the nearest cardiologist, the patient's specific cardiologist, etc.) on one or more PEDs belonging to the doctor. Given that the nurse is using a PED with limited functionality, the nurse may be allowed to control the telepresence device and/or participate in an audio-only telepresence session, but may be provided a limited feature set. Accordingly, the nurse may be able to communicate with the doctor as the telepresence device navigates to the patient.
[0096] In some embodiments, a nurse or a patient may request a doctor (or a specific type of doctor) via an RPI or via a display interface directly on a
telepresence device. The RPI, the telepresence device, and/or a corresponding telepresence system may intelligently call a specific doctor as described herein. Alternatively, the RPI, the telepresence device, and/or a corresponding telepresence system may call a plurality of doctors. In such a scenario, the first doctor to attend may be connected via an RPI to a telepresence device to service the request of the nurse or patient. The call routing and telepresence session may be managed in a network cloud, a telepresence network, and/or via some other suitable computing device.
[0097] Throughout the specification, various functions, features, processing, and/or other computing actions are described as being performed by at least one of an RPI, a PED, a telepresence system, and/or other computing device. It will be appreciated that in many instances, the actual processing, calling, function implementation, recording, and/or other computer-performed actions may be executed on a local device, a remote device, and/or via a networked or cloud device.
[0098] Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases "in one embodiment" and "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. In particular, an "embodiment" may be a system, an article of manufacture (such as a computer-readable storage medium), a method, and/or a product of a process.
[0099] The phrases "connected to" and "in communication with" refer to any form of interaction between two or more entities, including mechanical, electrical, magnetic, and electromagnetic interaction. Two components may be connected to each other even though they are not in direct contact with each other and even though there may be intermediary devices between the two components.
[00100] Types of telepresence devices include, but are not limited to, remote telepresence devices, mobile telepresence units, and/or control stations. For example, a remote telepresence device may include a telepresence robot configured to move within a medical facility and provide a means for a remote practitioner to perform remote consultations. Additionally, telepresence devices may comprise any of a wide variety of endpoint devices, such as those described in U.S. Patent Application No. 13/360,579 filed on January 27, 2012, titled "INTERFACING WITH A MOBILE TELEPRESENCE ROBOT," which application is hereby incorporated by reference in its entirety. Telepresence devices may also comprise any of the endpoint devices described in U.S. Patent Application No. 13/360,590 filed on January 27, 2012, titled "INTERFACING WITH A MOBILE TELEPRESENCE
ROBOT," which application is hereby incorporated by reference in its entirety.
[00101] A "portable electronic device" (PED) as used throughout the specification may include any of a wide variety of electronic devices. Specifically contemplated and illustrated are tablet-style electronic devices, including, but not limited to, electronic readers, tablet computers, tablet PCs, cellular phones, interactive displays, video displays, touch screens, touch computers, and the like. Examples of PEDs include the Apple iPad®, iPod®, and iPhone®, the Hewlett Packard Slate®, the Blackberry Playbook®, the Acer Iconia Tab®, the Samsung Galaxy®, the LG Optimus G-Slate®, the Motorola Xoom®, the HP TouchPad Topaz®, the Dell Streak®, and the like.
[00102] Throughout this description, a tablet-style touch-screen PED is used as an exemplary PED; however, any of a wide variety of PEDs and/or other electronic devices may be used instead. For instance, tablet computing devices, cellular phones, computers, laptops, etc., could be used in place of the illustrated and described touch-screen tablet devices. It will be appreciated by one of skill in the art that operations and functions performed on or by a PED may also be performed on a stationary electronic device, such as a desktop computer or server.
[00103] The embodiments of the disclosure may be understood by reference to the drawings, wherein like elements are designated by like numerals throughout. In the following description, numerous specific details are provided for a thorough understanding of the embodiments described herein. However, those of skill in the art will recognize that one or more of the specific details may be omitted, or other methods, components, or materials may be used. In some cases, operations and/or components are not shown or described in detail.
[00104] Furthermore, the described features, operations, or characteristics may be combined in any suitable manner in one or more embodiments. The order of the steps or actions of the methods described in connection with the embodiments disclosed may be varied. Thus, any order in the drawings or Detailed Description is for illustrative purposes only and is not meant to imply a required order, unless otherwise specified.
[00105] Embodiments may include various features, which may be embodied in machine-executable instructions executed by a general-purpose or special-purpose computer (or other electronic device). Alternatively, the features may be performed by hardware components that include specific logic for performing the steps or by a combination of hardware, software, and/or firmware. Accordingly, the various components, modules, systems, and/or features described herein may be embodied as modules within a system. Such a system may be implemented in software, firmware, hardware, and/or physical infrastructure.
[00106] Embodiments may also be provided as a computer program product including a non-transitory machine-readable medium having stored thereon instructions that may be used to program or be executed on a computer (or other electronic device, such as a PED) to perform processes described herein. The machine-readable medium may include, but is not limited to, hard drives, floppy diskettes, optical disks, CD-ROMs, DVD-ROMs, ROMs, RAMs, EPROMs,
EEPROMs, magnetic or optical cards, solid-state memory devices, or other types of media/machine-readable media suitable for storing electronic instructions.
[00107] FIG. 1 illustrates an embodiment 100 of a home page of a portable electronic device (PED) 105. The PED 105 may include one or more physical buttons, such as button 120, and a display screen. The display screen may be a touch-screen configured to allow a user to provide input via the touch screen. The PED 105 may be configured to display various icons 115 to launch corresponding applications. In various embodiments, a remote presence interface (RPI) icon 110 may be used to launch an RPI application for interfacing with a telepresence device. An RPI according to any of the various embodiments described herein may alternatively be utilized on any of a wide variety of computing platforms and using any of a wide variety of programming tools and languages.
[00108] FIG. 2A illustrates an embodiment 200 of an initial launch page 210 for an RPI or other application associated with a telepresence device. As illustrated, a PED 205 may display the initial launch page 210 until the RPI fully loads. Alternative logos, graphics, informational displays and/or other content may be displayed. For example, a general greeting or specific greeting may be displayed on/during launch. In some embodiments an icon or object may indicate the progress or timeframe until loading is complete. In some embodiments, a cancel button or settings configuration may be available within the initial launch page 210.
[00109] FIG. 2B illustrates an embodiment 250 of an alternative initial launch page 270 including an image of a telepresence device 275. The launch page 270 may be oriented such that it induces or encourages a user to align the PED 255 such that the camera 280 is oriented at about the middle of a user's face and at or above an eye level of a user's face. In the illustrated embodiment, if a user had been holding the PED 255 in a landscape orientation, the image of the telepresence device 275 in the portrait orientation may encourage the user to rotate the PED 255 to the portrait orientation with the camera 280 in the top middle. An information bar 260 may provide information such as battery power, time, and/or connection strength. The initial launch page 270 may be unique to each PED on which it is utilized. For example, the RPI may detect that a PED with a camera positioned for a landscape orientation is being used and reorient the displayed launch page 270 accordingly.
[00110] FIG. 2C illustrates a front-facing camera 280 of the PED 200. The camera 280 is illustrated as located on the top of the PED 200 for capturing an image of a healthcare practitioner just above eye level in a portrait orientation. This location of the camera 280 relative to the face of a healthcare practitioner 290 may facilitate the capture of an aesthetically pleasing image of the healthcare practitioner 290 for display on a screen of a telepresence device. Accordingly, a patient or other person viewing the screen on the telepresence device may view a natural image of the healthcare practitioner 290 using the RPI on the PED 200, rather than an un-aesthetically pleasing perspective of the healthcare practitioner 290. For example, a side-captured or bottom-captured image of a healthcare practitioner 290 may not be aesthetically pleasing.
[00111] According to various embodiments, the RPI may automatically adjust the field of view (FOV) of the camera 280. In some embodiments, the healthcare practitioner 290, or other user, may manually adjust the field of view of the camera 280. Additionally, any number of compositional and exposure variables of the camera may be automatically or manually adjusted/adjustable. The RPI or the healthcare practitioner 290 may automatically or manually select which camera to use in the event a PED has multiple cameras. A user may, in some embodiments, select which camera is used during a session.
[00112] FIGS. 3A and 3B illustrate embodiments of a login page 325 on a PED 305 for accessing a telepresence device or telepresence network via an RPI. Any of a wide variety of login credentials 310 and 315 and/or security technologies may be employed. In some embodiments, a username, handle, and/or password may be provided on a different settings page. In some embodiments, the username, handle, and/or password may be maintained in a configuration file. The login page may include a button 330 or link that opens a settings or configuration page. A "remember me" 320 option may also be provided to a user.
[00113] FIG. 4 illustrates an embodiment 400 of an endpoint list 410 generated by the RPI running on the PED 405. The endpoint list 410 may include various telepresence devices 420 and their respective connectivity states 415. A user may indicate (via a touch, a click, using a stylus, and/or by speaking) to which of the available endpoints he or she would like to connect. Where an ADT (Admissions, Discharges, and Transfers) data source is available, patients may also be listed. Selecting a particular patient may initiate a connection to an endpoint device
(telemedicine device) that is assigned to, associated with, and/or in proximity to the selected patient. A telemedicine device in proximity to the patient could be the telemedicine device nearest the location of the selected patient, or one associated with the same bed, room, or floor number. Where multiple autonomous or semi-autonomous endpoint devices (telemedicine devices) are present and capable of navigating to the selected patient, the RPI may automatically determine an optimal endpoint device to dispatch to the patient.
[00114] Factors involved in determining the optimal endpoint device could include distance or estimated travel time to the selected patient and the remaining battery life of the respective telepresence devices. Additionally, the RPI may dispatch the endpoint device that can reach the selected patient while minimizing travel through areas with poor wireless connectivity, confined spaces/areas, high-traffic areas, sensitive and/or secure areas, dangerous areas, and/or otherwise undesirable zones.
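The dispatch decision above can be sketched as a simple scoring function over candidate endpoints. The weights, field names, and penalty value below are illustrative assumptions only; a real implementation would derive travel times and route penalties from the facility map:

```python
def pick_endpoint(candidates):
    """Choose an endpoint to dispatch by scoring estimated travel time
    against remaining battery life, penalizing endpoints whose route
    crosses a poor-coverage or otherwise undesirable zone. Lower is
    better; the weights are illustrative only."""
    def score(ep):
        penalty = 50.0 if ep.get("route_through_bad_zone") else 0.0
        return ep["travel_time_s"] - 2.0 * ep["battery_pct"] + penalty
    return min(candidates, key=score)
```

Under these assumed weights, a nearby endpoint with a healthy battery and a clean route wins over a closer one that would have to cross an undesirable zone.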
[00115] The endpoint list may also include doctors, nurses, staff, or any other persons that may currently (or for a scheduled period of time) be associated with a particular location and/or patient to which the endpoint can navigate. The endpoint list may be searchable and/or filterable. In some embodiments, the endpoint list may be implemented with a text box, in a window, or in separate tabs. As the user enters alphanumeric characters into the text box, the list may be instantaneously filtered to exclude endpoints whose names do not match the character string currently contained in the text box.
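The instantaneous filtering described above amounts to re-applying a case-insensitive substring match on each keystroke. A minimal sketch, with hypothetical names:

```python
def filter_endpoints(endpoints, query):
    """Case-insensitive substring filter over endpoint names, re-applied
    on each keystroke so the list updates instantaneously as the user
    types into the text box."""
    q = query.lower()
    return [name for name in endpoints if q in name.lower()]
```

An empty query leaves the full list visible; each additional character can only narrow the previous result, so an implementation may filter the prior result set rather than the full list.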
[00116] Other filtering parameters may be specified, such as endpoint type, manufacturer, status, facility, building, floor, room, customer, and/or any other grouping. Logical, arbitrary, or otherwise customized groupings of endpoints may be created by a user or administrator, and these groupings may additionally be used to filter or otherwise search the list of endpoints. In some embodiments, each endpoint in the list may have an associated status indicator, which informs the user whether a device is ready, busy, offline, in a private session, and/or in a multi-presence session (which the user may join to receive session audio, video, images, or potentially control some or all functions of the endpoint).
[00117] FIG. 5 illustrates an embodiment 500 of a PED 505 displaying a
connection wizard page as a user is connected to a telepresence network and/or telepresence device. The connection wizard may facilitate the connection of a user to a telepresence network, a specific telepresence device, and/or other operators of RPIs. In various embodiments, the displayed connection wizard page may be purely a placeholder image displayed while a connection is established. In other
embodiments, the connection wizard may enable or allow a user to select various connection parameters and/or settings.
[00118] FIG. 6 illustrates an exemplary embodiment 600 of a module within the RPI configured to provide visual and interactive control of a telepresence device. As illustrated, the RPI may be divided into an upper panel 610 and a lower panel 620. The upper panel 610 may include a video window 615 showing the user of the PED 605 what the camera 680 is capturing. The upper panel 610 may display a video feed originating from a camera on the remote telepresence device. The upper panel 610 may further include various informational items and/or control panels, described in greater detail in conjunction with subsequent figures. The lower panel 620 may be used to display additional information and/or to receive inputs from a user of the PED.
[00119] FIG. 7 illustrates an embodiment 700 of an RPI on a PED 705 including a map 720 and a navigation module 710. The map 720 may display a plan view map of a healthcare facility. An image captured by the camera 780 may be displayed in a window 715. A list of destinations 730 may be displayed in an upper panel.
According to various embodiments, a user may select a destination by selecting a room within the healthcare facility, or by selecting a patient within a patient tab. The selection of a room may be considered a navigation input. The selection may then be translated into one or more navigation instructions to guide the telepresence robot, autonomously or semi-autonomously, to the selected location.
[00120] Once a destination, patient, or other person is selected, the RPI may communicate the desired destination to the telepresence device via navigation instructions. The navigation instructions may allow for the telepresence device to be manually navigated, semi-autonomously navigated, or autonomously navigated to the desired destination. Systems and methods for manual, semi-autonomous, and autonomous navigation are described in U.S. Patent Application No. 13/360,579, previously incorporated by reference.
[00121] FIG. 8 illustrates an embodiment 800 of a notification 820 generated or received by the RPI. The notification 820 may be displayed on a PED 805 in a lower panel or an upper panel. As illustrated, the notification 820 may include the remaining battery life of the telepresence robot, the robot number, the location of the robot, and/or the local time. As in previous embodiments, a window 815 may show the image currently being captured by a camera. An upper panel 810 may be maximized, or shown in a full-screen mode, by selecting a full-screen icon 830. In some embodiments, the notifications 820 may be displayed as overlays when the upper panel 810 is maximized or is in a full-screen mode.
[00122] The RPI may be designed to provide a user with notifications regarding various states of a device (e.g., the PED 805 or the telepresence device), a connection, or a session. For instance, the RPI may display indications of whether video or audio recording is active and/or enabled, whether a laser pointer or remote pointer is active, a battery level or remaining time available based on the current battery level, notifications related to network performance, and/or a wireless signal strength.
[00123] In one embodiment, the RPI may provide a message to the user (who may be moving around) if the PED loses (or detects a decrease in strength of) a communication signal (e.g., a WiFi or 4G signal) as the result of the user's
movement and/or as a result of the telepresence device leaving a good WiFi zone, or some loss, outage, or failure elsewhere in the link. The message provided to the user may distinguish whether the loss, outage, or drop in strength has occurred locally, remotely, or elsewhere in the link.
[00124] In some embodiments, the RPI may be configured to provide notifications or alerts using various display modifications, annotations, or overlays. For example, the screen may pulsate with a partially translucent overlay to indicate that a battery of the telepresence device is dying and/or that connectivity is below a predetermined level. For example, the screen may pulsate with a red glow, possibly with a vignette such that it is increasingly translucent toward the center of the display, to indicate that a battery is dying.

[00125] In one embodiment, a notification may include one or more pop-up dialogue boxes to bring critical information to the attention of the user of the RPI. For example, if the connectivity falls below a level considered clinically viable, the RPI may request that a user of the RPI acknowledge the current limitation in order to continue the telepresence session. As another example, the RPI may request that a user acknowledge anomalous conditions associated with the RPI, the patient, and/or the telepresence device. The RPI may request that a user acknowledge that he or she is aware of or has noticed patient conditions considered outside of predetermined ranges. For example, the telepresence device and/or the RPI may recognize that a patient's heart rate is below a threshold level and request that a remote user of the RPI acknowledge awareness of the condition.
[00126] FIG. 9 illustrates an embodiment 900 of a media manager module 920 of the RPI configured to allow a user to capture and manage audio and/or visual media from the telepresence device. As illustrated, the upper panel 910 of the PED 905 may include a live video feed from the telepresence device. The live video feed in the upper panel 910 may be expanded to full-screen via the full-screen icon 930. The lower panel may include a media manager module 920 adapted to capture still images, audio, video, and/or a combination thereof from the telepresence device. Additionally, the media manager module 920 may allow for editing, playback, trimming, storing, emailing, and/or other functions.
[00127] For example, when a user taps the video or movie projector icon in a toolbar, the RPI may display the native video controls of the PED overlaid above the live video feed, along a toolbar across the upper panel 910 or a lower panel, or elsewhere on the screen. Additionally, a library of network-accessible or locally stored video clips or movies may be made accessible in the lower panel. The user may navigate the video clip library by swiping left or right, which may cause successive clips to be visually cycled from completely obscured to partially obscured to fully visible (and selectable/playable), to partially obscured again, and so on.
Once the user locates a desired video, the user may drag and drop the video from the media manager module to the upper panel 910 to cause the video to be displayed on a screen on the telepresence device.
[00128] The RPI may enable playback to be controlled from the PED and/or the telepresence device. Once a video is selected, the RPI may enable the user to navigate or skip through an index of key frames contained in the video. The RPI may additionally enable the user to delete a video using a flicking gesture. For security and privacy, videos may be stored in an encrypted format. In another example, when the user taps the camera icon in a toolbar, the interface may display a cover-flow image library in the lower pane. Images may be searched, browsed, displayed on the PED, or displayed at the remote endpoint in the same manner as described above with respect to video files in the media manager.
[00129] FIG. 10 illustrates an embodiment 1000 of a menu 1040 of the RPI allowing a user to modify various settings on the local PED 1005 and/or a remote
telepresence device. As illustrated, a window 1010 may show the user what is currently being captured by the camera 1080. The upper panel 1015 may display a live video feed generated by the telepresence device. The menu 1040 in the lower panel 1030 may contain settings for local brightness control 1045, remote brightness control 1065, microphone levels 1035, speaker levels 1055, and/or any of a wide variety of other features.
[00130] A lower bar 1020 may also include one or more management tools. As illustrated, a picture-in-picture selection may allow a user to selectively disable the picture-in-picture window 1010. An auto-brightness feature 1045 may be enabled via a checkbox. The control settings may further include a microphone level meter 1050 to indicate a volume or sound pressure level relative to a maximum specified input or clipping level. Additionally, the settings control may include a microphone gain slider 1035 to allow adjustment of a microphone gain to a desired level.
Similarly, a speaker level meter 1060 may graphically illustrate the current volume or sound pressure level relative to a maximum specified output or clipping level. The remote controls may include a picture-in-picture checkbox to enable the user to toggle a picture-in-picture display of the remote video view at the remote endpoint.
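As a rough illustration of how such a level meter might express the current level relative to the clipping level, the following sketch computes the RMS fraction of full scale and its decibel equivalent (the function name and full-scale value are assumptions):

```python
import math

# Illustrative sketch: a level meter expresses the current RMS sample
# level as a fraction of the clipping level (1.0 == clipping), both
# linearly and in decibels relative to full scale (dBFS).

def meter_level(samples, full_scale=1.0):
    """Return (linear_fraction, dbfs) for a block of audio samples."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    fraction = min(rms / full_scale, 1.0)
    dbfs = 20 * math.log10(fraction) if fraction > 0 else float("-inf")
    return fraction, dbfs

frac, db = meter_level([0.5, -0.5, 0.5, -0.5])
print(round(frac, 3), round(db, 1))   # half of full scale is about -6 dBFS
```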
[00131] FIG. 11 illustrates an embodiment 1100 of the RPI in full-screen video mode. As illustrated, a window 1115 may show a picture-in-picture of the image captured by camera 1180. The upper panel 1110 may display a full-screen live video feed captured by a telepresence device.
[00132] FIG. 12 illustrates an embodiment 1200 of the RPI in full-screen data mode. In the illustrated embodiment, the image captured by camera 1280 may not be displayed in a picture-in-picture window in order to make more room for data entry and/or visualization. In some embodiments, the display of the telepresence device that normally shows the image captured by the camera 1280 may instead display a no-signal or screensaver image, or a previous image of the healthcare practitioner may be "frozen" on the screen.
[00133] The PED 1205 may allow data to be entered via the touch screen using keyboard 1230. As illustrated, the RPI may display a single data panel 1215 in a full-screen mode by removing the upper panel typically used to display the live video feed from the telepresence device. Various tabs 1210 may toggle the screen between various data input and/or visualization modes. In some embodiments, the remote video view may continue to be displayed in a small window or panel.
[00134] FIG. 13 illustrates an embodiment 1300 of the RPI including a visual representation of patient telemetry data 1350 displayed in a lower panel 1320. The PED 1305 may again include a camera 1380 and a window 1310 may be displayed by the RPI to show a user what is currently being captured by the camera 1380. As illustrated, full-screen icons 1335 may be available in the RPI to transition either of the lower panel 1320 or the upper panel 1315 to a full-screen mode. A software button 1330 may allow the telemetry data to be toggled, changed, scrolled, and/or otherwise manipulated. A patient icon 1325 may allow a user to select a
telepresence device and/or telemetry data associated with a different patient. The telemetry data 1350 may be displayed as numerical values and/or in graphical form 1340.
[00135] FIGS. 14A-C illustrate exemplary embodiments of a telepresence device. As illustrated in each of FIGS. 14A-C, a telepresence device may comprise a base 1410 capable of being manually driven, semi-autonomously driven, and/or
autonomously driven. Various display features, connection features, and/or data ports 1420, 1430, and 1465 may be available on a mid-section of the telepresence device. The telepresence device may also include a handle 1435. A head portion 1440 of the telepresence device may include one or more cameras 1460, speakers, and/or microphones. In addition, the head portion 1440 may include one or more two- or three-dimensional depth sensors, such as LADARs, LIDARs, or structured light projectors/scanners. Multiple cameras 1460 may be useful to render 3D images, and multiple microphones and/or speakers may be useful for rendering and/or generating directional sound. The head portion may also include a display 1450 configured to display, for example, an image captured using an RPI on a PED.
[00136] The display 1450 and/or the interface 1430 may comprise a touch screen or other interface to receive inputs. In some embodiments, if a telepresence device is an autonomous mobile telepresence device, the display 1450 and/or the interface 1430 may provide a list of destinations, healthcare practitioners, and/or patients to which, as described herein, the telepresence device can be sent or connected. The display 1450 and/or the interface 1430 may also enable a person to stop the telepresence device when it is autonomously navigating and, likewise, enable the telepresence device to resume autonomous navigation to its destination. The display 1450 and/or the interface 1430 may additionally have a button or menu option that instructs the telepresence device to autonomously navigate to an out of the way location (e.g., a wall, corner, etc.), a dock, a storage location, and/or a charging station. The display 1450 and/or the interface 1430 may include buttons or menu options for various settings or to page or notify support personnel of a problem with or question regarding the operation of the telepresence device.
[00137] FIG. 15 illustrates a panel 1530 including a plan map view, a video feed window, and various settings panels. The panel 1530 may be displayed in various forms and configurations via an RPI on a PED, such as the PEDs 1510, 1520, and 1540. Each of the PEDs 1510, 1520, and 1540 illustrates an example of a panel arrangement generated by the RPI to maximize the usable screen space for various tasks performed using the RPI. It will be appreciated by one of skill in the art that a larger display may accommodate more panels or larger panels. Similarly, devices with multiple displays may accommodate one or more panels on each display. For example, a desktop version of the RPI may utilize multiple monitors connected to the desktop to display one or more panels.
[00138] FIG. 16 illustrates an embodiment 1600 of an RPI on a PED 1605. The RPI may utilize the camera 1680 to transmit an image of the user to a telepresence device. The image captured by the camera 1680 may be displayed in a picture-in-picture window 1610. The upper panel 1615 of the RPI may display a live video feed received from the telepresence device. The lower panel 1620 may display a plan map view. In some embodiments, the plan map view may show a current location of the telepresence device within a healthcare facility (or other locale). The user may manually navigate the telepresence device using the live video feed and the plan view map. Alternatively or additionally, the user may select a location within the plan view map and the telepresence device may autonomously or semi-autonomously navigate to the selected location.

[00139] The input intended to direct the telepresence device to a new location may be considered a navigation input. For example, the selection of a room, patient, healthcare practitioner, location within a video feed, and/or other selection or input intended to cause the telepresence robot to navigate may be considered a navigation input. The navigation input may then be translated into one or more navigation instructions to guide the telepresence robot, autonomously or
semi-autonomously, to the selected location.
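A minimal sketch of translating a plan-map selection into a navigation goal might look like the following, assuming a known map scale and origin (all names and values are hypothetical):

```python
# Hypothetical sketch: the tapped pixel on the plan map is converted to
# map coordinates using the map's scale and origin, yielding a goal
# position that a planner could navigate the telepresence device toward.

def map_tap_to_goal(tap_px, origin_px=(0, 0), meters_per_pixel=0.05):
    """Convert an (x, y) pixel tap on the plan map to world meters."""
    dx = (tap_px[0] - origin_px[0]) * meters_per_pixel
    dy = (tap_px[1] - origin_px[1]) * meters_per_pixel
    return {"x_m": dx, "y_m": dy}

goal = map_tap_to_goal((200, 120))
print(goal)   # a tap 200 px right, 120 px down maps to (10 m, 6 m)
```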
[00140] FIG. 17 illustrates an embodiment 1700 of an RPI on a PED 1710 displaying multiple panels. As illustrated, a radiography panel 1720 may display images associated with a patient displayed in a live video feed 1750. Telemetry data 1730, lab results 1740, patient data 1760, and physician notes 1770 may be displayed in various other panels on the PED 1710 via the RPI. In a multi-user telepresence conference, each of the participating users may be displayed in a panel 1790. According to various embodiments, each of the panels 1720, 1730, 1740, 1750, 1760, 1770, and 1790 may be moved, enlarged, merged with another panel, removed, and/or captured (recorded), intelligently based on decisions made by the RPI, based on usage history, based on relevancy, and/or based on user selection. A camera 1780 may be selectively enabled or disabled by the user.
[00141] The RPI may enable complete integration of patient data monitoring with the remote telepresence session, thereby adding a dimension of data-driven functionality uniquely valuable in telepresence applications. The user may select an icon from a toolbar or other panel to activate a patient bedside data monitoring app, such as those offered by any of a variety of real-time patient data monitoring application providers. Upon selecting the appropriate icon, a patient data monitoring window may appear in the RPI. The user may expand this pane to a full-screen view, reposition the pane, and/or resize the pane as described above. The RPI may show any number of real-time or archived patient biometrics or waveforms, such as temperature, heart rate, pulse, blood pressure, oxygen saturation, etc.
[00142] Using the touch-screen interface, the user may pause and resume real-time, time-delayed, or archived patient data. The user may move back and forth through time-based patient data using dragging or swiping gestures, or the user may zoom or scale the waveform or metric along an amplitude axis and/or time axis. The application may further allow the user to set markers along a waveform to measure variations in amplitude or time associated with various features of the patient data, such as peaks, valleys, maxima or minima (global or local), global averages, running averages, threshold crossings, or the like.
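The marker measurements described above amount to simple differences between two sampled points on a time-stamped waveform; a sketch with hypothetical names and data:

```python
# Sketch of the marker measurement described above (function and field
# names are assumptions): given two markers placed on a time-stamped
# waveform, report the time interval and amplitude change between them.

def measure_markers(waveform, i, j):
    """waveform: list of (time_s, value); i, j: marker sample indices."""
    t1, v1 = waveform[i]
    t2, v2 = waveform[j]
    return {"delta_t_s": abs(t2 - t1), "delta_amplitude": v2 - v1}

# Hypothetical ECG-like samples: two R-wave peaks about 0.8 s apart.
ecg = [(0.00, 0.1), (0.02, 0.9), (0.04, 0.2), (0.82, 0.95)]
print(measure_markers(ecg, 1, 3))
```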
[00143] The data may be collected from bedside monitors or other monitoring devices in real-time and archived for a period of time (or indefinitely) in a server or database. The monitoring app may be a separate application and/or integrated within the RPI. The monitoring app may retrieve the relevant data and provide it to the RPI through an application programming interface (API) and/or the RPI may
independently retrieve the data from a database.
[00144] The data may also be collected by the telepresence device by, for example, directing a camera of the telepresence device to the display of a monitoring device, and either recording video of the monitor display or performing image analysis on the video image to extract the patient data. The user and/or
telepresence device may annotate the data and store the annotations with the data, either locally or in a remote server, for later retrieval. The monitoring app may enable alarms, alerts, notifications, or other actions or scripted activities set to take place in response to certain events in the data.
[00145] Further, the interface may integrate available ADT (admission, discharge, and transfer) data with patient bedside or biometric data. For example, if a patient's vitals or other biometrics trigger an alert or alarm condition, the telepresence device may be configured to autonomously navigate to the bed or room number of that patient, and send a notification or invitation to a doctor, caregiver, or specialist to begin a telepresence session with the patient. Additionally, when a healthcare practitioner initiates a session with a telepresence device and selects either a location, destination, or patient to visit with the autonomous telepresence device, the bedside or biometric data for a patient associated with the selected location, destination, or patient may be automatically retrieved and used to populate a "dashboard" of patient data that the healthcare practitioner can then review, annotate, or otherwise interact with as discussed above and depicted in FIG. 17.
[00146] Moreover, an autonomous mobile telepresence device may be used to conduct patient rounds in a healthcare facility. As the telepresence device moves from one location to the next, the location of the telepresence device may be used to retrieve the name and/or other data of a patient(s) associated with that location. For example, the telepresence device may retrieve patient biometrics, bedside data, electronic medical records, and/or other patient data to populate a patient dashboard on a display of the PED. In one embodiment, this information may be retrieved from a health level 7 (HL7) compliant server associated with the facility, healthcare practitioner, and/or patient.
[00147] In addition, an autonomous mobile telepresence device may be scripted or scheduled to make scheduled stops at various beds, rooms, locations, or patients. The RPI may retrieve the names or contact info of people (such as doctors, nurses, students, family members, etc.) associated with a scheduled or upcoming stop at a particular patient or location, and send a notification via SMS, email, etc., to the associated people inviting them to join the telepresence session by receiving audio and/or video from the session on a PED via the RPI.
[00148] To accommodate a time interval that may be necessary or convenient to allow others to join the session, the telepresence device may send invitations, notifications, and/or reminders to join the session a predetermined amount of time prior to the time the session is scheduled to begin. Repeated or reminder
notifications may be sent to each party at regular or decreasing intervals to remind them of an upcoming session. The notifications may contain a hyperlink to follow to join the session, a link to the RPI, an app notification or badge for display on the PED, or the address or phone number of another device to connect to. The notification may further include a username, password, PIN, and/or other credential(s) that the invitees may provide to join the session. The length of the session may be at least partially based on the number of users connected and/or their priority levels.
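One possible way to schedule the initial invitation and the decreasing-interval reminders described above (the specific lead time and intervals are assumptions):

```python
# Illustrative sketch: an invitation goes out a fixed lead time before
# the session, followed by reminders at decreasing intervals before the
# scheduled start. Times already in the past are dropped.

def reminder_times(session_start_min, lead_min=60, intervals=(30, 15, 5)):
    """Return minutes-from-now at which to send the invite and reminders."""
    times = [session_start_min - lead_min]               # initial invitation
    times += [session_start_min - m for m in intervals]  # reminders
    return [t for t in times if t >= 0]

print(reminder_times(90))   # invite at 30 min; reminders at 60, 75, 85
```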
[00149] FIGS. 18A and 18B illustrate an embodiment 1800 of what may be displayed during a consult on a telepresence device 1850 and via an RPI on a PED 1805, respectively. As illustrated, the telepresence device 1850 may include audio and/or visual equipment 1880 to capture images and/or audio for display on the PED 1805. PED 1805 may include a camera to capture an image 1815 for display on a screen 1857 of a head portion 1855 of the telepresence device 1850. A lower portion of the telepresence device may include adjustment knobs for the microphone volume 1861 and/or the speaker volume 1862. Additionally, a screen 1870 may provide additional information about the user of the PED 1805. Accordingly, a patient being cared for via the telepresence device 1850 may see a healthcare practitioner using the RPI on the PED 1805. Similarly, the healthcare provider may see and interact with the patient via the telepresence device using the RPI on the PED 1805.

[00150] FIG. 19 illustrates an embodiment of a telepresence device 1900 including audio and/or visual instruments 1980, controls 1961 and 1962, and displays 1950 and 1970. In a screensaver mode, the upper display 1950 may preserve the screen by displaying information about a company, the healthcare facility, medical facts, and/or other information associated with healthcare in general. A lower display 1970 may also enter an independent screensaver mode, and/or allow for user inputs associated with the telepresence device and/or the information displayed via the upper display 1950.
[00151] FIG. 20 illustrates various embodiments of avatars 2091, 2092, 2093, and 2094 and/or personalities that may be assigned or used by a healthcare practitioner and/or telepresence device. For instance, while the image 2090 displayed on the display 2050 may normally be associated with the image captured by a camera of a PED via the RPI, any of a wide variety of characters, avatars, cartoons, licensed caricatures, and/or other images may be used in place of an image received from the camera on the PED. The avatars 2091, 2092, 2093, and 2094 may be particularly useful to give human-like traits to the telepresence device. The telepresence device may, as previously described, include controls 2061 and 2062 for audio and a touch sensitive screen 2070.
[00152] FIG. 21 illustrates an RPI that can be displayed on a PED including various informational, control, video, and settings panels 2120, 2130, 2140, 2150, 2160, 2170, 2180, and 2190. As illustrated, a live video feed may be displayed in a panel 2110. Remote brightness, zoom, and volume controls may be accessible via a panel 2190. Media management controls may be accessible via a panel 2180.
Advanced controls with various tabs 2160 may be available via a panel 2170. A current image captured by a camera on the PED may be displayed in a panel 2150. Local zoom, brightness, volume, and microphone controls may be accessible via a panel 2140. Additional controls and tabs of control icons may be accessible via a panel 2130.
[00153] In the illustrated embodiment, a user of the RPI may select a navigation mode via a selection panel 2120. Inputs for selecting the navigation mode and/or inputs to direct the actual navigation may be provided using any of a wide variety of inputs, including, but not limited to, a voice input, a keyboard, a mouse, a touch input, a stylus, and/or another peripheral input device. In each of the navigation modes 2120, a user may provide a navigation input in one or more manners. The navigation input may then be translated (e.g., processed or mapped via a lookup table) to generate and transmit navigation instructions to the telepresence robot to guide the
telepresence robot, autonomously or semi-autonomously, to a desired location.
[00154] FIG. 22 illustrates an embodiment 2200 of an RPI configured to utilize destination-based navigational modes for navigating a telepresence device. As illustrated, a centralized panel 2210 may display a live video feed 2220. Various remote settings controls may be available in a panel 2290. Another panel 2240 may provide similar audiovisual setting controls for a PED. A window 2250 may display the image currently captured by a camera on the PED. Advanced controls may be available in a panel 2270, and multiple tabs 2260 may provide for additional informational and/or control panels. The illustrated embodiment 2200 includes a destination navigation panel 2230 configured to allow a user to select from a list of destinations. Once a destination is selected, the RPI may facilitate communication with the telepresence device to direct the telepresence device to the selected destination. The telepresence device may be manually driven to the selected destination or may autonomously (or semi-autonomously) navigate to the selected destination.
[00155] According to various embodiments, the destinations may include various locations within a healthcare facility, specific patient names, and/or the names of various healthcare practitioners. In some embodiments, the locations of healthcare practitioners may be determined using cameras within the healthcare facility, global positioning systems, local positioning systems, radio frequency identification (RFID) tags, and/or the like.
[00156] FIG. 23 illustrates an embodiment 2300 of an RPI including a panel 2320 displaying a live feed from a telepresence device. A lower panel 2310 may display a list of patients. A user may select a patient on the patient list to direct the
telepresence device to navigate to the selected patient. Again, the telepresence device may be manually driven, autonomously navigate, or semi-autonomously navigate to the selected destination.
[00157] FIG. 24 illustrates an embodiment 2400 of an RPI including a panel 2410 of a plan view map allowing for map-based navigational modes for navigating a telepresence device. As illustrated, the map-based navigational mode may be part of an advanced control panel including multiple tabs. The size of the panel 2410 may be adapted to suit a particular need and may be user-selectable. Media controls 2440 may allow a user to capture audio and/or visual information from the live video feed in panel 2420. A panel 2430 may display the current image captured by a camera of a PED running the RPI. As illustrated, the panel 2430 displaying the current image captured by the camera of the PED may include integrated settings controls. Various tabs 2450 may provide access to additional features, options, help, and/or navigation within the RPI.
[00158] FIG. 25 illustrates an embodiment 2500 of a full-screen view of a hallway from a telepresence device as visualized on a PED via an RPI. The panel 2510 may display a live video feed from a telepresence device. Icons 2520 and 2522 may be overlaid on the video feed and may provide access to various controls or settings. A destination panel 2530 may also be overlaid on the video feed and allow a user to select a destination. As in previous embodiments, the RPI may communicate a selected destination to the telepresence device, which may autonomously or semi- autonomously navigate to the selected destination.
[00159] FIG. 26 illustrates an embodiment 2600 of a full-screen view of a live video feed from a telepresence device as may be displayed via an RPI. Various icons 2620 and 2622 may provide access to various features, settings, options, and/or other aspects of the RPI. The icons 2620 and 2622 may be overlaid on the live video feed or a separate panel may be displayed containing icons 2620 and 2622.
[00160] FIG. 27A illustrates another embodiment 2710 of a live video feed from a telepresence device, as may be displayed via an RPI. An arrow 2730 may be overlaid on the video feed as a navigation input to indicate an intended or desired navigation path of a telepresence device. The arrow may be drawn by the RPI after receiving the live video feed or by the telepresence device prior to sending the live video feed. In some embodiments, the user may draw the arrow 2730 to indicate a desired direction of travel. The arrow 2730 may then be translated by the RPI or by the telepresence device into directional commands to navigate the telepresence device. Again, various icons 2720 may provide access to various features, settings, options, and/or other aspects of the RPI. The icons 2720 may be overlaid on the live video feed or a separate panel may be displayed containing icons 2720. The arrow 2730 may be a vector quantity describing both the direction and the velocity (current, planned, or directed) of the telepresence device.
[00161] The arrow 2730, provided as a navigation input, may be used to generate navigation instructions that are transmitted to a remote telepresence device. As illustrated, the navigation input may be in the form of a vector (arrow 2730) drawn as either a final endpoint selection or as a vector including a beginning point and an end point. The end point of the vector may represent the location to which the
telepresence device should navigate. Accordingly, navigation instructions may be generated based on the current location of the telepresence device and the endpoint of the vector within the live video feed. In some embodiments, the vector's image pixel endpoint may be mapped via a lookup table to one or more translate and rotate commands to cause the telepresence device to navigate to the selected location. In some embodiments, the length of the vector may correspond to the desired distance to move the telepresence device, a desired acceleration of the telepresence device, and/or a desired velocity of the telepresence device.
[00162] As previously described, a navigation input may direct a telepresence device to navigate forward (with respect to the live video feed) 3 meters and to the right 2 meters. Any of a wide variety of navigation instructions may be possible to correctly navigate the telepresence device. For instance, the navigation instructions may direct the telepresence device to navigate approximately 3.6 meters at a 34 degree angle relative to the video feed. Alternatively, the navigation instructions may direct the telepresence device to navigate 3 meters forward and then 2 meters to the right. As will be appreciated, any number of navigation routes and corresponding navigation instructions may be possible. The navigation input may be translated into navigation instructions as a plurality of drive forward commands coupled with rotate commands.
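The arithmetic in the worked example above can be checked directly: a navigation input of 3 m forward and 2 m right may be expressed either as a single heading-and-distance command or as two sequential straight segments (the function name is an assumption):

```python
import math

# Decompose a forward/right displacement into a single
# heading-and-distance command, matching the worked example above.

def to_heading_distance(forward_m, right_m):
    """Return (distance_m, angle_deg) for a forward/right displacement."""
    distance = math.hypot(forward_m, right_m)
    angle_deg = math.degrees(math.atan2(right_m, forward_m))
    return distance, angle_deg

dist, angle = to_heading_distance(3.0, 2.0)
print(round(dist, 1), round(angle))   # approximately 3.6 m at a 34 degree angle
```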
[00163] In some embodiments, the live video feed may be delayed slightly due to network and/or processing limitations. This video latency may result in inaccurate navigation instructions. Accordingly, the navigation input may be adjusted based on the known video latency and the current or past velocity and direction of the robot. For example, the selection of a location within the video feed may be translated into navigation instructions that compensate for the latency of the video feed. For instance, if an end point (endpoint pixel) of a vector drawn on the video feed maps to a position 1 meter forward and 0.5 meters to the right relative to the current location of the telepresence device, and the telepresence device has moved 0.2 meters forward since the video frame on which the vector was drawn was captured, or since the associated command was issued, a lookup table entry for 0.8 meters forward and 0.5 meters to the right may be used to determine and/or adjust the navigation instructions. Video latency calculation and command compensation may be performed at the telepresence device, at a remote interface device, and/or at any networked computing device. In some embodiments, the navigation instructions may be generated to compensate for the video latency. In other embodiments, the telepresence device and/or other computing device may adjust the navigation instructions to compensate for the video latency.
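A minimal sketch of the latency compensation described above, assuming straight-line motion at a known velocity (function and parameter names are hypothetical):

```python
# Sketch of the latency compensation described above: the target from
# the drawn vector is expressed relative to where the device was when
# the frame was captured, so the distance already traveled during the
# video latency is subtracted from the forward component.

def compensate_target(target_fwd_m, target_right_m,
                      velocity_mps, latency_s):
    """Adjust a forward/right target for motion during video latency."""
    traveled = velocity_mps * latency_s   # assumes straight-line motion
    return max(target_fwd_m - traveled, 0.0), target_right_m

# 1 m forward, 0.5 m right; 0.2 m covered while the frame was in flight,
# so the forward target shrinks from 1.0 m to 0.8 m.
print(compensate_target(1.0, 0.5, velocity_mps=0.4, latency_s=0.5))
```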
[00164] FIGS. 27B-27H illustrate various aspects of a mouse-based driving interface 2711 of an RPI for manually or semi-autonomously controlling a
telepresence device. As illustrated in FIG. 27B, the mouse-based driving interface 2711 for navigating a telepresence device via an RPI may be presented within a live video feed 2712 from the telepresence device. A four way controller 2750 may be overlaid on the live video feed. According to various embodiments, the arrows of the four way controller 2750 may be selected by a finger or stylus on a touch screen device, or via a pointer 2751 controlled by a peripheral device (e.g., a keyboard or mouse). For instance, a user may operate a mouse to control the pointer 2751 and click on one of the arrows on the four way controller 2750.
[00165] In some embodiments, the live video feed 2712 and the four way controller 2750 may be displayed in a full-screen or expanded mode. In other embodiments and/or modes, various additional panels, icons, tabs, and/or other objects 2713, 2714, 2715, 2716, 2717, 2718, and 2719 may be displayed simultaneously with the mouse-based driving interface. For example, a remote controls panel 2713 may allow a user to control various settings of the remote telepresence device, such as the audio and video settings. A media control panel 2714 may allow a user to open, play, record, and/or otherwise manipulate the current live video (or audio) feed 2712, network-accessible audiovisual material, remotely (at the telepresence device) stored audiovisual material, and/or locally stored audiovisual material.
[00166] An advanced controls panel 2715 may allow a user (or select users based on account privileges) to modify various aspects of the connection or session settings, access external or peripheral applications (e.g., StrokeRESPOND), and/or control other aspects of the remote telepresence session. A local controls panel 2716 may allow a user to control various settings of the RPI, the PED, and/or other local peripheral devices, such as the audio and video settings associated with the telepresence session. A battery life indicator 2717 may provide information regarding the current battery life and/or expected battery life based on current or past consumption rates. Various additional icons and tabs 2718 and 2719 may provide additional controls and/or options within the RPI and/or associated with the remote telepresence device.
[00167] FIG. 27C illustrates the mouse-based driving interface 2711 with a long drive vector 2755 drawn on the video feed to indicate a direction and a relatively fast velocity. According to various embodiments, the direction of the vector may be associated with an intended, directed (as directed by the user of the RPI), and/or current direction of travel. In various embodiments, a user may click the uppermost arrow of the four way controller 2750 and then drag a vector to describe a desired direction of travel. The length (magnitude) of the vector may correspond to the velocity and/or acceleration of the telepresence device. Additionally, a grid display 2760 may illustrate the current direction of travel and/or the velocity (magnitude) of the telepresence device on a grid. The grid may correspond to and/or display travel angles (e.g., in degrees, radians, with respect to north, etc.) and/or a velocity as a numeric or descriptive value. For example, the length of the dark line on the grid display 2760 may be aligned with numbers between 1 and 10 or descriptive terms like "slow," "medium," and "fast."
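One plausible implementation of the length-to-velocity mapping and the "slow"/"medium"/"fast" labels described above (the scale factors and bucket boundaries are assumptions):

```python
# Hypothetical sketch of the grid display's mapping: the drawn vector's
# length in pixels is scaled to a velocity, then bucketed into one of
# the descriptive terms mentioned above.

def vector_to_speed(length_px, max_px=300, max_mps=1.5):
    """Map a drive-vector length in pixels to (velocity_mps, label)."""
    fraction = min(length_px / max_px, 1.0)
    velocity = fraction * max_mps
    if fraction < 0.33:
        label = "slow"
    elif fraction < 0.66:
        label = "medium"
    else:
        label = "fast"
    return velocity, label

print(vector_to_speed(60))    # short vector -> slow
print(vector_to_speed(290))   # long vector -> fast
```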
[00168] FIG. 27D illustrates the mouse-based driving interface 2711 with a short drive vector 2756 drawn on the video feed to indicate a direction and a relatively slow velocity. A correspondingly short line may be displayed in the grid display 2760.
[00169] In various embodiments, a navigation input in the form of a vector provided in a mouse-based vector input mode may be translated into navigation instructions through the use of a lookup table. In some embodiments, the lookup table may include values that compensate for latency. That is, the navigation instructions returned by the lookup table may be based on the navigation input and the current or recent average video latency, as described herein.
[00170] FIG. 27E illustrates the mouse-based driving interface 2711 with a vector 2757 for rounding a corner 2772 within the RPI. According to various embodiments, the RPI and/or telepresence device may detect objects such as the walls to determine passable hallways and corridors. Alternatively or additionally, the RPI and/or telepresence device may utilize maps to determine that a vector 2757 is intended to direct a telepresence device to turn down a different hallway or round a corner. As illustrated, the vector 2757 in FIG. 27E may be intended to round corner 2772. In some embodiments, the user may draw a rounded or curved vector 2757. In other embodiments, the RPI and/or telepresence device may interpret a straight vector 2757 as intended to direct the telepresence device down the hallway.
[00171] FIG. 27F illustrates the mouse-based driving interface 2711 with a vector
2758 drawn to cause the telepresence device to spin in place. According to various embodiments, a horizontal vector 2758 (relative to the live video feed 2712) may be used to direct the telepresence device to spin in place either clockwise or
counterclockwise depending on the direction of the horizontal vector 2758.
[00172] FIG. 27G illustrates the mouse-based driving interface 2711 with a vector
2759 drawn towards an object (a wall). According to various embodiments, the mouse-based driving interface may prevent the telepresence device from colliding with objects (such as walls, people, and other objects). As previously described, an obstacle detection/obstacle avoidance (ODOA) system may prevent such collisions. In some embodiments, a user may deactivate the ODOA system, such that the user may operate in a "nudge mode" and continue to slowly move the telepresence device despite the telepresence device being in contact with or in close proximity to another object. This mechanism may automatically time out such that the ODOA system is re-engaged after a period of inactivity.
[00173] Placing the telepresence device in nudge mode may cause a camera of the telepresence device to be automatically positioned to look down around the body of the telepresence device and either in the direction of the drive command issued by the user or in the direction of an object closest to or in contact with the telepresence device (so that the user may see what obstacle he or she is commanding the telepresence device to nudge). Alternatively, the ODOA system may be entirely deactivated. In one embodiment, the ODOA system may be deactivated so long as the telepresence device is driven or moved below a specified velocity.
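The nudge-mode behavior of paragraphs [0172]–[0173] — ODOA suspended for slow motion, re-engaged after an inactivity time-out or above a velocity cap — can be sketched as a small gate. The class name, the 10-second window, and the 0.15 m/s ceiling are invented for the example; the disclosure leaves these parameters unspecified.

```python
import time

NUDGE_TIMEOUT_S = 10.0   # hypothetical inactivity window before ODOA re-engages
NUDGE_MAX_SPEED = 0.15   # hypothetical speed ceiling (m/s) while ODOA is off

class OdoaGate:
    """Decides whether the ODOA system is active for a given drive command."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._nudge_until = 0.0   # ODOA active until nudge mode is entered

    def enter_nudge_mode(self):
        # Each nudge command refreshes the inactivity time-out.
        self._nudge_until = self._clock() + NUDGE_TIMEOUT_S

    def odoa_active(self, commanded_speed):
        # ODOA always applies above the speed cap, and after the time-out.
        if commanded_speed > NUDGE_MAX_SPEED:
            return True
        return self._clock() >= self._nudge_until
```

Injecting the clock makes the time-out behavior testable without waiting in real time.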
[00174] FIG. 27H illustrates the mouse-based driving interface 2711 used to reverse the telepresence device 2785 with a camera of the telepresence device (or other associated camera) oriented in reverse and slightly downward toward the floor 2790. In some embodiments, by selecting (touch or mouse click) the bottom arrow of the four way controller 2750, the telepresence device 2785 may be reversed. The reverse mode may allow for vector controls (direction and velocity) or may
alternatively reverse straight back and at a constant velocity. In some embodiments, a rear camera may be displayed on the live video feed 2712 to show the direction of travel and help prevent the telepresence device 2785 from colliding with objects. In other embodiments, a head portion of the telepresence device may be rotated to the direction of travel (reverse) and/or inclined slightly downward to facilitate the reverse navigation.
[00175] In various embodiments of the mouse-based driving interface, the controls and vectors may be "drawn" as overlays on the live video feed (e.g., via a click and hold of a mouse button or a tap and hold via a stylus or finger). In other
embodiments, a panel or peripheral device may be used to "draw" the vectors to control the velocity and/or direction of travel. In some embodiments, the velocity selected by the length of the vector may be overridden based on other factors (such as obstacle detection, congestion, human presence, etc.).
[00176] According to various embodiments, a head portion and/or the camera of the telepresence device may be re-centered via an icon within the mouse-based driving interface 2711 or via a keyboard (or other peripheral device) selection. In some embodiments, the selection of any of the arrows within the four way controller 2750 may orient or re-orient the head portion of the telepresence device.
[00177] FIG. 28 illustrates an embodiment 2810 of full-screen live video feed from a telepresence device including a bedside view of a patient bed 2820 and an associated patient monitor 2830. As illustrated, the live video feed may be able to visualize a patient in a patient bed 2820 as well as a patient monitor. In some embodiments, the RPI and/or the telepresence device may detect a refresh rate of the patient monitor 2830 and dynamically match the refresh rate to reduce scrolling bars and/or flickering. In other embodiments, the live video feed of the patient monitor 2830 may be digitally enhanced to reduce the flickering and/or scrolling bars.
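The refresh-rate matching of paragraph [0177] amounts to choosing a camera frame rate with an integer relationship to the patient monitor's refresh rate, so rolling bars stay stationary. The sketch below is an assumption about how that selection might work; the function name and the fallback heuristic are not from the disclosure.

```python
def best_capture_rate(monitor_hz, supported_fps):
    """Pick the supported camera frame rate that best matches the patient
    monitor's refresh rate; an integer divisor minimizes rolling bars."""
    divisors = [fps for fps in supported_fps if monitor_hz % fps == 0]
    if divisors:
        return max(divisors)
    # Otherwise prefer the rate closest to an integer multiple relationship.
    return min(
        supported_fps,
        key=lambda fps: abs(monitor_hz / fps - round(monitor_hz / fps)),
    )
```

For a 60 Hz monitor and a camera supporting 24/30/60 fps this picks 60 fps; for a 75 Hz monitor it would fall back to 25 fps.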
[00178] FIG. 29 illustrates an embodiment 2910 of an RPI including a click-to-zoom feature 2925. The click-to-zoom feature 2925 may be used to zoom in on a specific area of a live video feed, a captured video feed, or a still image. The box defining the click-to-zoom feature 2925 may be drawn directly as a box or adjusted by dragging one or more of its corners. Depending on the PED used, the RPI may receive the click-to-zoom box using any of a wide variety of input devices, such as via a touch, a stylus, or a mouse pointer.
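Normalizing a dragged click-to-zoom box, as described in paragraph [0178], can be sketched as below. The function name and the minimum-drag threshold (used to distinguish a zoom box from a tap) are illustrative assumptions.

```python
def zoom_rect(x0, y0, x1, y1, min_size=16):
    """Normalize a dragged click-to-zoom box (any corner order) into a
    (left, top, width, height) crop region; tiny drags are treated as taps."""
    left, top = min(x0, x1), min(y0, y1)
    width, height = abs(x1 - x0), abs(y1 - y0)
    if width < min_size or height < min_size:
        return None  # a tap, not a zoom box
    return left, top, width, height
```

Accepting the corners in any order lets the user drag from any corner of the intended region, which matches the "dragging one or more corners" interaction.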
[00179] FIG. 29 also illustrates a multi-party session. In the illustrated
embodiments, multiple users are shown in a session guests panel 2930. The session guests panel 2930 may include images captured by PEDs associated with each of the guests. Accordingly, multiple users may participate simultaneously. In such embodiments, one or more of the users may have limited access and/or control of the telepresence device. As in previous embodiments, control and options icons 2940 and 2942 may be overlaid and provide access to various features of the RPI.
[00180] FIG. 30 illustrates an embodiment 3000 of a PED running an RPI configured to display a live video feed in an upper panel 3015 along with a current image window 3010. As illustrated, the RPI may incorporate sub-applications and/or provide access to related applications, such as a StrokeRESPOND application (in a lower panel 3020) configured to provide one or more functions and/or workflow processes associated with strokes. The StrokeRESPOND application may display various patient names 3030, which may be filterable, at 3025, and allow a user to select them, at 3035, in order to see more details. The RPI may allow for any number of sub-routines, sub-applications, and/or other applications to run within the RPI or be launched from the RPI. As other examples, the RPI may provide access to a dictionary, a medical text, an internet search engine, and/or any other external or integrated application.
[00181] In some embodiments, while an application is open in the lower panel 3020, the user may switch to viewing the application in full-screen mode by grabbing the upper part of the application window and dragging upward toward the upper panel 3010. The user may return to the two-pane view by dragging downward from the top of the upper panel 3010. The full-screen application view may include a picture-in-picture of the live video feed from the telepresence device so that the user may continue to monitor the remote environment. Some applications (and/or the RPI) may continue to run in the background, even when not displayed.
[00182] FIG. 31 illustrates an embodiment 3100 of an RPI on a PED 3105, in which an image 3125 captured by a camera of the PED 3105 is displayed as a transparent image overlaid on a live video feed 3110 originating from a telepresence device. The transparent image 3125 may alternatively be displayed in a lower panel 3120.
[00183] FIG. 32 illustrates an embodiment 3200 of an RPI on a PED 3205, including a toolbar 3225 in a lower panel 3220. The toolbar may provide quick access to any of a wide variety of settings and/or features of the RPI. A user may select an icon using any of a wide variety of methods depending on the PED. For instance, a user may touch an icon to select it. Settings and/or features of the RPI may be accessed simultaneously while a live video feed is shown in the upper panel 3210. A media management toolbar 3230 (selectively enabled) may allow for the video feed in upper panel 3210 to be recorded, at 3240. A notification 3235 may alert a user of the PED 3205 that the battery on the telepresence device is nearly depleted. As in previous embodiments, a window 3215 may display the image currently being captured by a camera on the PED 3205, or the modules and control operations available via the RPI, while a video feed from a telepresence device is displayed simultaneously.
[00184] According to various embodiments, the toolbar 3225 may provide access to a handset, a stethoscope, a camera, a video, a live cursor, a laser pointer, microphone settings, a map, navigational options, a disconnect button, and/or other features, options or settings. The toolbar 3225 may provide access to various other functions or applications, such as StrokeRESPOND, SureNotes, a media manager, patient data, lab results, image data, and/or team communication.
[00185] FIG. 33 illustrates an embodiment 3300 of an RPI on a PED 3305 that includes a media management toolbar 3325 associated with a media management window 3330 in a lower panel 3320 of the PED 3305. As illustrated, an upper panel 3315 may include a live video feed of a patient. The user of the RPI may access stored videos, images, audio, and/or telemetry data associated with the patient via the media management toolbar 3325.
[00186] FIG. 34 illustrates an embodiment 3400 of an RPI on a PED 3405 including a video window 3410 displaying a list of telepresence devices to which the user has access, a work space window 3430 displaying a list of patients, and a toolbar 3415 as a tool belt dividing the display. In various embodiments, the selection of a telepresence device via video window 3410 will display a live video feed from the selected telepresence device and initiate a communication session with the telepresence device to allow the user of the RPI on PED 3405 to control the telepresence device and/or join in a multi-user experience with the telepresence device. The selection of a patient via work space window 3430 may automatically select an associated telepresence device based on availability, proximity, and/or other preferences. Alternatively, the user of the RPI on the PED 3405 may additionally select a telepresence device. The selection of a patient via work space window 3430 may also direct a telepresence robot to navigate to the location of the patient.
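Paragraph [0186] describes automatically selecting a telepresence device for a chosen patient based on availability and proximity. A minimal sketch of that selection follows; the field names (`available`, `distance`) and the tie-breaking rule are assumptions, since the disclosure only names the criteria.

```python
def pick_robot(robots):
    """Choose a telepresence device for a selected patient: prefer available
    devices, break ties by distance to the patient's location.

    Each robot is a dict like {"id": ..., "available": bool, "distance": float}.
    Returns None when no device is available.
    """
    candidates = [r for r in robots if r["available"]]
    if not candidates:
        return None
    return min(candidates, key=lambda r: r["distance"])
```

In the described flow, the chosen device would then receive navigation instructions directing it to the patient's location.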
[00187] FIG. 35 illustrates an embodiment 3500 of an RPI on a PED 3505 including a touch pad control pane 3540 for navigating a telepresence device while displaying a video feed from a telepresence device in an upper panel 3510.
According to various embodiments, a one finger tap in the upper panel 3510 may be used to control the direction in which a head portion of a telepresence device is oriented. A single finger press and drag may draw a click-to-zoom box. Other touch controls, such as pinch to zoom, mouse-based driving, and/or swiping to move the telepresence device may also be available in upper panel 3510. In some
embodiments, a four way controller (illustrated in FIGS. 27B-27H) may be overlaid within the live video feed in the upper panel 3510. The touch pad control pane 3540 may incorporate various touch controls. For example, a user may swipe left to gain access to local controls/settings of the PED 3505, swipe right to access
controls/settings from the telepresence device, swipe down to access a toolbar, and use multiple fingers to drive the telepresence device, such as by defining a vector with a magnitude and direction.
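The touch pad gesture mapping of paragraph [0187] can be sketched as a small dispatcher. The action names and the exact mapping are illustrative assumptions drawn from the described swipes: left for local PED controls, right for telepresence-device controls, down for the toolbar, and multi-finger drags for driving.

```python
def dispatch_touchpad_gesture(finger_count, swipe_dir):
    """Map a gesture on the touch pad control pane 3540 to an action.

    swipe_dir is one of "left", "right", "down", or None.
    """
    if finger_count >= 2:
        return "drive_vector"      # multi-finger drag defines a drive vector
    if swipe_dir == "left":
        return "local_settings"    # PED-side controls/settings
    if swipe_dir == "right":
        return "remote_settings"   # telepresence-device controls/settings
    if swipe_dir == "down":
        return "show_toolbar"
    return "ignore"
```

A real implementation would also carry the drag's magnitude and direction through to the vector-driving path, as paragraph [0187] suggests.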
[00188] FIG. 36 illustrates an embodiment 3600 of an RPI on a PED 3605, including an avatar display of telepresence devices in a lower panel 3620 and a video feed from a telepresence device in an upper panel 3610. In various embodiments, tapping a camera icon may capture an image. The image may then appear within a media manager in the lower panel 3620. Drag and drop availability may be available in order for the user to customize the telepresence device, such as by adding avatars or special effects to the video feed and/or images.
[00189] FIG. 37 illustrates an embodiment 3700 of an RPI on a PED 3705 including a visualization of a telepresence device overlaid on a video feed from a telepresence device in an upper panel 3710. As illustrated, the user may place the chess-piece telepresence devices within the video feed in the upper panel 3710 in order to control movement of the actual telepresence device. A lower panel 3730 may allow a user to provide other navigational inputs as well. A lower toolbar 3720 may provide access to various settings and/or functions of the RPI.
[00190] FIG. 38 illustrates another embodiment 3800 of an RPI on a PED 3805 including a live video feed in an upper panel 3815, a toolbar belt including various media management icons 3817, and settings manager toolbars 3850 and 3860 for the PED 3805 and a telepresence device, respectively.
[00191] FIG. 39 illustrates an embodiment 3900 of an RPI on a PED 3905 that includes a landing strip navigational panel 3920. A user may provide an intended navigation path that can be displayed on the upper panel 3910 and/or the lower panel 3920. The landing strip navigational panel 3920 may be used to display an intended navigational path, a directed navigational path (i.e., a path provided by a user to direct the telepresence device), or a current navigation path on the upper panel 3910 as a vector and/or within the lower panel 3920 as a vector and/or as an avatar or rendered image of the telepresence device.
[00192] FIG. 40 illustrates an embodiment of an RPI on a PED 4000 including a landscape orientation of a full-screen video feed 4030 from a telepresence device. In addition, a toolbar may be overlaid and/or included in a separate lower panel 4020.
[00193] FIG. 41 illustrates an embodiment 4100 of an RPI on a PED 4105 including a joystick-style control 4125 on a touch interface of a lower panel 4120 of the PED 4105. The lower panel 4120 may also include information icons relating to battery level 4122 and/or network quality 4123. A toolbar 4140 may provide access to various settings, features, and/or controls. A full-screen icon 4130 may allow the live video feed in the upper panel 4110 to be maximized to a full-screen mode. A window 4115 may be overlaid in the upper panel 4110 to show the current image captured by a camera 4180 of the PED 4105.
[00194] FIG. 42 illustrates an embodiment 4200 of an RPI on a PED 4205 including dual joystick-style controls 4225 and 4226 on a touch interface of a lower panel 4220 of the PED 4205. The lower panel 4220 may also include information icons relating to battery level 4222 and/or network quality 4223. A toolbar 4240 may provide access to various settings, features, and/or controls. A full-screen icon 4230 may allow the live video feed in upper panel 4210 to be maximized to a full-screen mode. A window 4215 may be overlaid in the upper panel 4210 to show the current image captured by a camera 4280 of the PED 4205.
[00195] In the illustrated embodiment, the left joystick 4226 may be configured to control movement of the base or body of the telepresence device. The right joystick 4225 may be configured to control movement of a head or upper portion of the telepresence device. The portion of the telepresence device controlled by each joystick 4225 and 4226 may be user-selectable and/or reversed.
[00196] FIG. 43 illustrates a state diagram 4300 for an RPI for use on a PED. As illustrated, following startup, a login page 4310 may be displayed. After a username and password are entered, a session notify page 4330 may be displayed indicating the network and communication status. A successful login may result in an in-session page 4332 being displayed with a live video feed. A dock button may be selected, causing the telepresence device to display an in-transit page 4325 while it navigates to a docking station. A stop button may cause a pause page 4327 to be displayed.
[00197] From the login page, potentially after successfully logging in, a navigation button may display a navigation page 4320. The navigation page 4320 may allow a user to select between various navigation modes, such as a map button to result in a map page 4322, or a location button to display a list of locations 4324. Again, once a destination is selected, the in-transit page 4325 may be displayed as the telepresence device navigates to the selected location.
[00198] A settings page 4340 may be displayed allowing a user to select from any number of settings. For example, a WiFi selection may result in a WiFi page 4342 being displayed. A robot selection may result in a robot page 4344 being displayed. The state diagram 4300 illustrated in FIG. 43 is a simplified state diagram and intentionally omits numerous possible states and connections between states for clarity. Each and every panel, icon, setting, application, option, tab, selection, input, and the like described herein may be represented as a separate state, entry action, transition condition, transition, exit action, and/or other component of a complex state diagram. As will be appreciated by one of skill in the art, each of the various aspects, functions, operations, control panels, icons, objects, buttons, display panels, display windows, etc., described herein may be described and/or
implemented in terms of software, hardware, and/or firmware and could potentially be described as a complex state machine.
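The simplified state diagram of FIG. 43 can be expressed as a transition table. The state and event names below are paraphrased from paragraphs [0196]–[0198] and are assumptions; as the text notes, the full state machine would contain many more states and transitions.

```python
# Minimal transition table for the simplified state diagram 4300.
TRANSITIONS = {
    ("login", "login_ok"):            "session_notify",
    ("session_notify", "ready"):      "in_session",
    ("login", "navigate"):            "navigation",
    ("navigation", "map"):            "map",
    ("navigation", "locations"):      "location_list",
    ("map", "destination"):           "in_transit",
    ("location_list", "destination"): "in_transit",
    ("in_session", "dock"):           "in_transit",
    ("in_transit", "stop"):           "paused",
}

def next_state(state, event):
    """Advance the UI state machine; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```

Keeping the table data-driven makes it easy to add the settings pages (4340, 4342, 4344) and the many omitted states as further entries.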
[00199] FIG. 44 illustrates an embodiment of an RPI on a PED 4400 including a full-screen video feed 4410 from a telepresence device. A toolbar 4450 and a navigational joystick 4430 are overlaid on the full-screen video feed 4410. A user may touch, click, or otherwise manipulate the joystick in order to navigate the telepresence device. A dual joystick overlay may be employed to provide
navigational control over a head portion of a telepresence device. In some embodiments, clicking a location within the live video feed 4410 may control the head movement of the telepresence device while the joystick 4430 is intended to control the body movement of the telepresence device. Similarly, a mouse-based driving interface (illustrated in FIGS. 27B-27H) may be overlaid on the full-screen video feed 4410 instead of or in addition to the joystick 4430.
[00200] FIG. 45 illustrates an exemplary toolbar 4500 of icons that may be overlaid within a page of an RPI and/or inserted as a separate panel within a page of an RPI. The icons within toolbar 4500 may provide access to charts, vitals, telemetry data, images, lab results, a home page of the RPI, documents, notes, associated healthcare practitioners, navigation options and features, multimedia control panels, and the like. Each page within the RPI may include context-based toolbars and/or general toolbars. The toolbar 4500 may be positioned at a bottom, top, edge, and/or other location with respect to the RPI or a pane or panel within the RPI. In some embodiments, the toolbar 4500 may be configured to vanish and selectively reappear, such as, for example, upon a mouseover, swipe, or other action.
[00201] FIG. 46 illustrates an embodiment 4600 of an RPI on a PED 4605 including an overlaid instructional panel 4630 describing how a user may manually drive a telepresence device. As in previous embodiments, an upper panel 4610 may include a live video feed from the telepresence device. A toolbar 4650 may provide access to various settings, controls, and/or functions. For example, the toolbar 4650 may provide access to a headset, stethoscope, camera, video, navigation, local camera, point, laser, mute, settings for local and remote devices, and/or a
disconnect (end) button.
[00202] The instructional panel 4630 may provide instructions for finger swipes to move the telepresence device forward and reverse 4631, slide the telepresence device side to side 4632, rotate the telepresence device to the right 4633, and rotate the telepresence device to the left 4634. Separate control of a head portion of the telepresence device may be available using multiple fingers, using a toggle button (software or hardware), or by tapping within the live video feed 4610. A lower toolbar 4640 may provide access to various other functions, features, and/or settings of the RPI, such as those described herein and especially in conjunction with FIG. 45. In some embodiments, the RPI may include instructional or demonstrational videos for any of the various embodiments or functions described herein.

[00203] FIG. 47 illustrates an embodiment 4700 of an RPI on a PED 4705 during a multi-participant telepresence session. As illustrated, a live video feed of a patient may be displayed in an upper panel 4710 of the PED 4705. A toolbar 4750 (and/or 4740) may provide access to various related functions, settings, and/or controls. A lower panel 4730 may include video feeds 4731, 4732, and 4733 from each of three participants in the multi-participant telepresence session. Each of the three participants may be using an RPI on a PED and see a similar image of the video feed of the patient in the upper panel 4710. Any number of participants may participate. In some embodiments, each participant may be able to control the telepresence device. In other embodiments, only one, or only a select few, of the participants may have control of the telepresence device.
[00204] FIG. 48 illustrates a window or panel 4800 accessible via an RPI on a PED providing access to a care team of a particular patient. The care team panel 4800 may be presented in a full-screen mode and/or as a panel within a display of multiple panels, such as illustrated in FIG. 17. The care team panel 4800 may identify the relevant patient by name 4805 and/or an identification number 4807. The care team panel 4800 may include a column of healthcare practitioners 4810 associated with the patient 4805. A column of healthcare practitioner data 4820 may describe each of the healthcare practitioners 4810 associated with the patient 4805. Whether each healthcare practitioner 4810 is on duty or off duty may be displayed in a third column 4830. A user may also select icons via the RPI to consult 4840, text (e.g., an SMS or email) 4850, and/or call 4860 the associated healthcare practitioner 4810. In various embodiments, the RPI interface may utilize inputs provided via panel 4800 to perform one or more functions via the telepresence device and/or a related telepresence system.
[00205] FIG. 49 illustrates an exemplary overlay help screen 4900 accessible within the RPI on a PED to provide instructions regarding available functions on any given screen. In the illustrated example, a user may have selected a "help" icon or be in a training mode in order to be presented with instruction on how to use a particular interface or toolbar within a screen of the RPI. In the illustrated
embodiment, a window 4950 may display what is currently being captured by a camera of the PED and displayed remotely on a display interface of a telepresence device. A full-screen image of a live video feed 4910 from a telepresence device may be displayed. In addition, a toolbar 4920 may be displayed on a top edge (or other location) of the full-screen display of the live video feed 4910. The toolbar may be pulled down from a hidden state and/or vanish and reappear, as described herein.
[00206] During training, initial use, or after indicating help is needed, the live video feed 4910 and/or the window 4950 may be dimmed and/or otherwise less obtrusive. A help screen overlay may provide instructions and/or guidance with how-to toolbars and/or other interface options. As illustrated in FIG. 49, the overlay may comprise text descriptions of the toolbar icons connected by lines to each icon. In some embodiments, instructions and/or guidance for camera controls and/or driving controls for the telepresence device may be illustrated as an overlay as well. For example, instructions on how to use the joysticks 4225 and 4226 in FIG. 42 may be provided in a help screen overlay. Similarly, the gestures 4631, 4632, 4633, and 4634 in FIG. 46 may be provided in an instructional or help screen overlay. In some embodiments, the instructional overlay may be in the form of moving or video instructions overlaid on an existing display. For example, the overlay may demonstrate the gestures 4631, 4632, 4633, and 4634 in FIG. 46.
[00207] According to various embodiments, an RPI may be configured with all or some of the features and embodiments described herein. For example, an RPI may include any number of the features and embodiments described herein as selectively displayed and/or selectively functional options. An explicit enumeration of all possible permutations of the various embodiments is not included herein; however, it will be apparent to one of skill in the art that any of the variously described
embodiments may be selectively utilized, if not at the same time, in a single RPI.
[00208] This disclosure has been made with reference to various exemplary embodiments, including the best mode. However, those skilled in the art will recognize that changes and modifications may be made to the exemplary
embodiments without departing from the scope of the present disclosure. While the principles of this disclosure have been shown in various embodiments, many modifications may be made to adapt the RPI for a specific environment and/or to satisfy particular operating requirements without departing from the principles and scope of this disclosure. These and other changes or modifications are intended to be included within the scope of the present disclosure. This disclosure includes all possible permutations of the independent claims and their dependent claims.

Claims

What is claimed:
1. A non-transitory computer-readable storage medium storing
instructions that, when executed by a processor, are configured to cause the processor to perform operations comprising:
communicatively connecting an electronic device to a remote presence device;
selectively displaying a video feed from the remote telepresence device in a video panel on an electronic display of the electronic device;
receiving a navigation input; and
transmitting navigation instructions associated with the navigation input to the remote telepresence device.
2. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying the video panel in a full-screen mode.
3. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying a toolbar comprising at least one selectable setting or function associated with at least one of the electronic device and the remote presence device.
4. The non-transitory computer-readable storage medium of claim 3, wherein the toolbar, when selectively displayed, is overlaid on the live video feed.
5. The non-transitory computer-readable storage medium of claim 3, wherein the toolbar comprises a row of selectable icons,
wherein the row of selectable icons is one of horizontally aligned and vertically aligned with respect to the live video feed, and
wherein the row of selectable icons is selectively positionable on the electronic display.
6. The non-transitory computer-readable storage medium of claim 3, wherein the toolbar comprises a media manager icon for selecting a media manager, and
wherein the media manager is configured to allow for selectively recording audiovisual data received from the remote telepresence device and selectively controlling playback of recorded audiovisual material.
7. The non-transitory computer-readable storage medium of claim 1, wherein the received navigation input is provided with respect to the live video feed.
8. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
displaying a login page for authenticating an operator of the electronic device.
9. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
displaying a request for a selection of at least one of a plurality of remote presence devices;
receiving a selection of one of the plurality of remote presence devices; and communicatively connecting the electronic device to the at least one selected remote presence device.
10. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
receiving a live video feed of an operator of the electronic device from a camera associated with the electronic device; and
transmitting the live video feed of the operator to the remote presence device.
11. The non-transitory computer-readable storage medium of claim 10, wherein the operations further comprise:
displaying the live video feed of the operator in a picture-in-picture video panel on the electronic display of the electronic device.
12. The non-transitory computer-readable storage medium of claim 10, wherein the operations further comprise:
selectively displaying the live video feed of the operator as a partially transparent image overlaid on at least a portion of the live video feed in the video panel.
13. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying a plurality of destination locations within a healthcare facility;
receiving a selection of one of the plurality of destination locations within the healthcare facility; and
transmitting navigation instructions associated with the selected destination location to the telepresence device.
14. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
displaying a notification associated with at least one of the electronic device, the remote presence device, and a communication session between the electronic device and the remote telepresence device.
15. The non-transitory computer-readable storage medium of claim 14, wherein the notification is associated with at least one of battery life, connection quality, a video recording status, an audio recording status, and a status of a peripheral device associated with the remote presence device.
16. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
displaying a media manager for selectively recording audiovisual data received from the remote telepresence device and selectively controlling playback of recorded audiovisual material.
17. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise: selectively displaying patient telemetry data simultaneously with the live video feed.
18. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying a plan view map of a healthcare facility associated with the remote presence device, and
wherein the navigation input is provided with respect to the plan view map.
19. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying a plan view map of a healthcare facility associated with the remote presence device, and
wherein the navigation input is provided with respect to either the plan view map or the live video feed.
20. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
receiving a selection of an avatar to visually represent the operator of the electronic device; and
transmitting the selected avatar to the remote presence device.
21. The non-transitory computer-readable storage medium of claim 1, wherein receiving a navigation input comprises:
receiving a vector input comprising a length and a direction provided relative to the live video feed.
22. The non-transitory computer-readable storage medium of claim 21, wherein the operations further comprise:
selectively overlaying a plurality of directional icons on the live video feed within the video panel, and
wherein the vector input is provided with respect to at least one of the plurality of overlaid directional icons on the live video feed.
23. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively overlaying a plurality of directional icons on the live video feed within the video panel, and
wherein receiving a navigation input comprises at least one of:
receiving a selection of a left directional icon to strafe the remote presence device to the left relative to the live video feed;
receiving a selection of a right directional icon to strafe the remote presence device to the right relative to the live video feed;
receiving a selection of a reverse directional icon to move the remote presence device in reverse relative to the live video feed; and
receiving a selection of a forward directional icon to move the remote presence device forward relative to the live video feed.
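The four-icon drive scheme recited in claim 23 can be illustrated with a short sketch. This is not part of the application; the dictionary keys, function name, and the (forward, lateral) velocity convention are all hypothetical, chosen only to make the mapping concrete:

```python
# Illustrative only: mapping the four directional icons of claim 23 to
# planar motion commands expressed relative to the camera (live video
# feed) frame. +forward is into the scene; +lateral is to the right.
ICON_COMMANDS = {
    "forward": (1.0, 0.0),
    "reverse": (-1.0, 0.0),
    "strafe_left": (0.0, -1.0),
    "strafe_right": (0.0, 1.0),
}

def icon_to_command(icon, speed):
    """Return a (forward, lateral) velocity command for a selected icon."""
    fwd, lat = ICON_COMMANDS[icon]
    return (fwd * speed, lat * speed)
```

A drive layer would then translate this camera-frame command into wheel velocities on the device itself.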
24. The non-transitory computer-readable storage medium of claim 23, wherein the navigation instructions associated with a selection of the reverse directional icon are configured to orient the live video feed from the remote telepresence device in the direction of movement.
25. The non-transitory computer-readable storage medium of claim 24, wherein the navigation instructions associated with a selection of the reverse directional icon are configured to cause the remote presence device to orient a head portion of the device in the direction of movement.
26. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying a selection of navigational modes, including at least one of:
a cursor pointer mode configured to allow an operator to provide a navigation input relative to the live video feed via an on-screen cursor;
a head mode configured to allow an operator to provide a navigation input for controlling only the movement of a head portion of the remote presence device;
a laser pointer mode configured to allow an operator to provide a navigation input relative to the live video feed by controlling the location of a simulated on-screen illumination point;
a map drive mode configured to allow an operator to provide a navigation input relative to a displayed map;
a click drive mode configured to allow an operator to provide a navigation input relative to the live video feed by selecting a location within the live video feed; and
a drag-drive mode configured to allow an operator to provide a navigation input relative to the live video feed by defining a navigational path within the live video feed; and
receiving a selection of a navigational mode.
27. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying a plurality of patient identifiers, wherein each patient identifier is associated with a location within a healthcare facility;
receiving a selection of one of the plurality of patient identifiers; and
transmitting navigation instructions based on the location within the healthcare facility associated with the selected patient identifier.
28. The non-transitory computer-readable storage medium of claim 27, wherein the plurality of patient identifiers comprises a list of patient names.
29. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
receiving a selection of a click-zoom mode;
receiving a selection of a first point within the live video feed;
receiving a selection of a second point within the live video feed; and
reframing the live video feed such that the perimeter of the live video feed displayed within the video panel is defined by the selected first and second points.
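The click-zoom reframing of claim 29 amounts to cropping the feed to the axis-aligned rectangle whose opposite corners are the two selected points. A minimal sketch, illustrative only; the function name, return convention, and pixel-coordinate system are assumptions, not from the application:

```python
def click_zoom_region(p1, p2):
    """Compute the axis-aligned region whose perimeter is defined by two
    selected points, as (left, top, width, height) in feed pixels."""
    (x1, y1), (x2, y2) = p1, p2
    left, top = min(x1, x2), min(y1, y2)
    return (left, top, abs(x2 - x1), abs(y2 - y1))
```

In practice the region would also be expanded to match the video panel's aspect ratio before the feed is rescaled to fill it.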
30. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
displaying a request for a selection of at least one of a plurality of remote presence devices;
receiving a selection of one of the plurality of remote presence devices;
displaying a request for a selection of at least one of a plurality of patients, each of the plurality of patients associated with a location within a healthcare facility;
receiving a selection of one of the plurality of patients; and
communicatively connecting the electronic device to the selected remote presence device, and
wherein receiving the navigation input comprises receiving the location within the healthcare facility associated with the selected patient.
31. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying a touchpad panel simultaneously with the live video feed within the video panel on the electronic display, and
wherein the navigation input comprises a touch input provided relative to the live video feed via the touchpad panel.
32. The non-transitory computer-readable storage medium of claim 31 , wherein the touch input comprises touch swipes of one or more fingers within the touchpad panel to control the movement of at least one portion of the remote telepresence device.
33. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying an avatar representation of the remote telepresence device in an avatar panel on the electronic display of the electronic device.
34. The non-transitory computer-readable storage medium of claim 33, wherein the operations further comprise:
displaying the avatar representation of the remote telepresence device so as to convey an intended navigation path with respect to a displayed landing strip.
35. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying an avatar representation of the remote telepresence device overlaid on the live video feed within the video panel.
36. The non-transitory computer-readable storage medium of claim 35, wherein the operations further comprise:
displaying the avatar representation of the remote telepresence device so as to convey an intended navigation path with respect to the live video feed.
37. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying a first virtual joystick on the electronic display of the electronic device, and
wherein the navigation input is provided via the first virtual joystick with respect to the live video feed.
38. The non-transitory computer-readable storage medium of claim 37, wherein the operations further comprise:
displaying a second virtual joystick on the electronic display of the electronic device;
receiving a head movement input via the second virtual joystick with respect to the live video feed; and
transmitting head movement input instructions associated with the head movement input to the remote telepresence device.
39. The non-transitory computer-readable storage medium of claim 37, wherein the first virtual joystick is overlaid on the live video feed within the video panel.
40. The non-transitory computer-readable storage medium of claim 1, wherein the operations further comprise:
selectively displaying a help screen overlay, wherein the help screen overlay provides information regarding at least one available function currently displayed on the electronic display.
41. The non-transitory computer-readable storage medium of claim 1, wherein the navigation instructions are adjusted based on a latency of the video feed and movement of the remote telepresence device during the latency period.
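The latency adjustment of claims 41 and 76 can be approximated by discounting the distance the device is estimated to have already travelled during the video latency window. An illustrative sketch, not the application's method; the linear motion model and all names are assumptions:

```python
def adjust_for_latency(target_distance, velocity, latency_s):
    """Shorten a commanded travel distance by the distance the device is
    estimated to have covered while the operator's video was in flight.
    Never returns a negative distance."""
    travelled = velocity * latency_s
    return max(0.0, target_distance - travelled)
```

For example, a 2 m command issued while the device moves at 0.5 m/s over a 1 s latency window would be shortened to 1.5 m.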
42. A non-transitory computer-readable storage medium storing instructions that, when executed by a processor, are configured to cause the processor to perform operations comprising:
communicatively connecting an electronic device to a remote presence device;
selectively displaying a live video feed from the remote telepresence device in a video panel on an electronic display of the electronic device;
selectively overlaying a plurality of directional icons on the live video feed;
receiving a navigation input in the form of a vector comprising a length and a direction with respect to the live video feed; and
transmitting navigation instructions associated with the navigation input to the remote telepresence device, wherein the navigation instructions include a velocity corresponding to the length of the vector.
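The vector-to-velocity mapping of claim 42 (speed corresponding to vector length, direction taken relative to the live video feed) might be sketched as follows. All names, the linear speed scaling, and the saturation length are assumptions for illustration, not the application's implementation:

```python
import math

def vector_to_velocity(dx, dy, max_speed, max_len):
    """Map a drag vector (in feed pixels) to a (speed, heading) command:
    speed scales linearly with vector length, saturating at max_speed
    once the drag reaches max_len pixels; heading is the drag angle."""
    length = math.hypot(dx, dy)
    speed = max_speed * min(length, max_len) / max_len
    heading = math.atan2(dy, dx)
    return speed, heading
```

A longer drag thus commands a faster move, while the drag direction steers the device relative to what the operator currently sees.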
43. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
receiving a second navigation input comprising at least one of:
receiving a selection of a left directional icon to strafe the remote presence device to the left relative to the live video feed;
receiving a selection of a right directional icon to strafe the remote presence device to the right relative to the live video feed;
receiving a selection of a reverse directional icon to move the remote presence device in reverse relative to the live video feed; and
receiving a selection of a forward directional icon to move the remote presence device forward relative to the live video feed; and
transmitting second navigation instructions associated with the second navigation input to the remote telepresence device.
44. The non-transitory computer-readable storage medium of claim 43, wherein the navigation instructions associated with a selection of the reverse directional icon are configured to orient the live video feed from the remote telepresence device in the direction of movement.
45. The non-transitory computer-readable storage medium of claim 44, wherein the navigation instructions associated with a selection of the reverse directional icon are configured to cause the remote presence device to orient a head portion of the device in the direction of movement.
46. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
selectively displaying the video panel in a full-screen mode.
47. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
selectively displaying a toolbar comprising at least one selectable setting function associated with at least one of the electronic device and the remote presence device.
48. The non-transitory computer-readable storage medium of claim 47, wherein the toolbar, when selectively displayed, is overlaid on the live video feed.
49. The non-transitory computer-readable storage medium of claim 47, wherein the toolbar comprises a row of selectable icons,
wherein the row of selectable icons is one of horizontally aligned and vertically aligned with respect to the live video feed, and
wherein the row of selectable icons is selectively positionable on the electronic display.
50. The non-transitory computer-readable storage medium of claim 47, wherein the toolbar comprises a media manager icon for selecting a media manager, and wherein the media manager is configured to allow for selectively recording audiovisual data received from the remote telepresence device and selectively controlling playback of recorded audiovisual material.
51. The non-transitory computer-readable storage medium of claim 42, wherein the received navigation input is provided with respect to the live video feed.
52. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
displaying a login page for authenticating an operator of the electronic device.
53. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
displaying a request for a selection of at least one of a plurality of remote presence devices;
receiving a selection of one of the plurality of remote presence devices; and communicatively connecting the electronic device to the at least one selected remote presence device.
54. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
receiving a live video feed of an operator of the electronic device from a camera associated with the electronic device; and
transmitting the live video feed of the operator to the remote presence device.
55. The non-transitory computer-readable storage medium of claim 54, wherein the operations further comprise:
displaying the live video feed of the operator in a picture-in-picture video panel on the electronic display of the electronic device.
56. The non-transitory computer-readable storage medium of claim 54, wherein the operations further comprise:
displaying the live video feed of the operator as a partially transparent image overlaid on at least a portion of the live video feed in the video panel.
57. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
displaying a plurality of destination locations within a healthcare facility;
receiving a selection of one of the plurality of destination locations within the healthcare facility; and
transmitting navigation instructions associated with the selected destination location to the telepresence device.
58. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
displaying a notification associated with at least one of the electronic device, the remote presence device, and a communication session between the electronic device and the remote telepresence device.
59. The non-transitory computer-readable storage medium of claim 58, wherein the notification is associated with at least one of battery life, connection quality, a video recording status, an audio recording status, and a status of a peripheral device associated with the remote presence device.
60. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
displaying a media manager for selectively recording audiovisual data received from the remote telepresence device and selectively controlling playback of recorded audiovisual material.
61. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
selectively displaying patient telemetry data simultaneously with the live video feed.
62. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
selectively displaying a plan view map of a healthcare facility associated with the remote presence device;
receiving a second navigation input provided with respect to the plan view map; and
transmitting second navigation instructions associated with the second navigation input to the remote telepresence device.
63. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
receiving a selection of an avatar to visually represent the operator of the electronic device; and
transmitting the selected avatar to the remote presence device.
64. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
selectively displaying a selection of navigational modes, including at least one of:
a cursor pointer mode configured to allow an operator to provide a navigation input relative to the live video feed via an on-screen cursor;
a head mode configured to allow an operator to provide a navigation input for controlling only the movement of a head portion of the remote presence device;
a laser pointer mode configured to allow an operator to provide a navigation input relative to the live video feed by controlling the location of a simulated on-screen illumination point;
a map drive mode configured to allow an operator to provide a navigation input relative to a displayed map;
a click drive mode configured to allow an operator to provide a navigation input relative to the live video feed by selecting a location within the live video feed; and
a drag-drive mode configured to allow an operator to provide a navigation input relative to the live video feed by defining a navigational path within the live video feed; and
receiving a selection of a navigational mode.
65. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
displaying a plurality of patient identifiers, wherein each patient identifier is associated with a location within a healthcare facility;
receiving a selection of one of the plurality of patient identifiers; and
transmitting navigation instructions based on the location within the healthcare facility associated with the selected patient identifier.
66. The non-transitory computer-readable storage medium of claim 65, wherein the plurality of patient identifiers comprises a list of patient names.
67. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
receiving a selection of a click-zoom mode;
receiving a selection of a first point within the live video feed;
receiving a selection of a second point within the live video feed; and
reframing the live video feed such that the perimeter of the live video feed displayed within the video panel is defined by the selected first and second points.
68. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
displaying a request for a selection of at least one of a plurality of remote presence devices;
receiving a selection of one of the plurality of remote presence devices;
displaying a request for a selection of at least one of a plurality of patients, each of the plurality of patients associated with a location within a healthcare facility;
receiving a selection of one of the plurality of patients; and
communicatively connecting the electronic device to the selected remote presence device, and
transmitting navigation instructions associated with the location within the healthcare facility associated with the selected patient.
69. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
displaying a touchpad panel simultaneously with the live video feed within the video panel on the electronic display;
receiving a touch navigation input provided relative to the live video feed via the touchpad panel; and
transmitting navigation instructions based on the touch navigation input.
70. The non-transitory computer-readable storage medium of claim 69, wherein the touch navigation input comprises touch swipes of one or more fingers within the touchpad panel to control the movement of at least one portion of the remote telepresence device.
71. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
selectively displaying an avatar representation of the remote telepresence device in an avatar panel on the electronic display of the electronic device.
72. The non-transitory computer-readable storage medium of claim 71 , wherein the operations further comprise:
displaying the avatar representation of the remote telepresence device so as to convey an intended navigation path with respect to a displayed landing strip.
73. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
selectively displaying an avatar representation of the remote telepresence device overlaid on the live video feed within the video panel.
74. The non-transitory computer-readable storage medium of claim 73, wherein the operations further comprise:
displaying the avatar representation of the remote telepresence device so as to convey an intended navigation path with respect to the live video feed.
75. The non-transitory computer-readable storage medium of claim 42, wherein the operations further comprise:
selectively displaying a help screen overlay, wherein the help screen overlay provides information regarding at least one available function currently displayed on the electronic display.
76. The non-transitory computer-readable storage medium of claim 42, wherein the navigation instructions are adjusted based on a latency of the video feed and movement of the remote telepresence device during the latency period.
PCT/US2013/031743 2012-05-22 2013-03-14 Graphical user interfaces including touchpad driving interfaces for telemedicine devices WO2013176760A1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
EP13793865.0A EP2852881A4 (en) 2012-05-22 2013-03-14 Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US14/550,750 US9361021B2 (en) 2012-05-22 2014-11-21 Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US15/154,518 US10061896B2 (en) 2012-05-22 2016-05-13 Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US16/045,608 US10658083B2 (en) 2012-05-22 2018-07-25 Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US15/931,451 US10892052B2 (en) 2012-05-22 2020-05-13 Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US17/146,306 US11515049B2 (en) 2012-05-22 2021-01-11 Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US17/992,074 US11756694B2 (en) 2012-05-22 2022-11-22 Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US18/229,570 US20230377761A1 (en) 2012-05-22 2023-08-02 Graphical user interfaces including touchpad driving interfaces for telemedicine devices

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US201261650205P 2012-05-22 2012-05-22
US61/650,205 2012-05-22
US201261674796P 2012-07-23 2012-07-23
US201261674794P 2012-07-23 2012-07-23
US201261674782P 2012-07-23 2012-07-23
US61/674,794 2012-07-23
US61/674,782 2012-07-23
US61/674,796 2012-07-23
US201361766623P 2013-02-19 2013-02-19
US61/766,623 2013-02-19

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/550,750 Continuation US9361021B2 (en) 2012-05-22 2014-11-21 Graphical user interfaces including touchpad driving interfaces for telemedicine devices

Publications (1)

Publication Number Publication Date
WO2013176760A1 true WO2013176760A1 (en) 2013-11-28

Family

ID=49624223

Family Applications (3)

Application Number Title Priority Date Filing Date
PCT/US2013/031708 WO2013176758A1 (en) 2012-05-22 2013-03-14 Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
PCT/US2013/031743 WO2013176760A1 (en) 2012-05-22 2013-03-14 Graphical user interfaces including touchpad driving interfaces for telemedicine devices
PCT/US2013/031778 WO2013176762A1 (en) 2012-05-22 2013-03-14 Social behavior rules for a medical telepresence robot

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/US2013/031708 WO2013176758A1 (en) 2012-05-22 2013-03-14 Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/US2013/031778 WO2013176762A1 (en) 2012-05-22 2013-03-14 Social behavior rules for a medical telepresence robot

Country Status (3)

Country Link
US (9) US10603792B2 (en)
EP (2) EP2852881A4 (en)
WO (3) WO2013176758A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022075970A1 (en) * 2020-10-05 2022-04-14 Hewlett-Packard Development Company, L.P. Transmitting biometric healthcare data

Families Citing this family (206)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140040094A (en) 2011-01-28 2014-04-02 인터치 테크놀로지스 인코퍼레이티드 Interfacing with a mobile telepresence robot
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
EP2810748A4 (en) * 2012-02-03 2016-09-07 Nec Corp Communication draw-in system, communication draw-in method, and communication draw-in program
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
EP2852881A4 (en) 2012-05-22 2016-03-23 Intouch Technologies Inc Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US9220651B2 (en) * 2012-09-28 2015-12-29 Elwha Llc Automated systems, devices, and methods for transporting and supporting patients
US9838645B2 (en) * 2013-10-31 2017-12-05 Elwha Llc Remote monitoring of telemedicine device
US9075906B2 (en) 2013-06-28 2015-07-07 Elwha Llc Medical support system including medical equipment case
EP3086708A4 (en) * 2013-12-23 2017-08-09 Justin C. Scott Remote anesthesia monitoring
WO2015123468A1 (en) 2014-02-12 2015-08-20 Mobile Heartbeat Llc System for setting and controlling functionalities of mobile devices
US10121015B2 (en) * 2014-02-21 2018-11-06 Lens Ventures, Llc Management of data privacy and security in a pervasive computing environment
US9579799B2 (en) * 2014-04-30 2017-02-28 Coleman P. Parker Robotic control system using virtual reality input
USD752668S1 (en) * 2014-09-16 2016-03-29 Gripa Holding Aps Mount system for robotic tooling
US9434069B1 (en) 2014-11-10 2016-09-06 Google Inc. Motion heat map
JP6034892B2 (en) * 2015-01-27 2016-11-30 ファナック株式会社 A robot system in which the brightness of the robot mount changes
US10019553B2 (en) * 2015-01-27 2018-07-10 Catholic Health Initiatives Systems and methods for virtually integrated care delivery
US10216196B2 (en) * 2015-02-01 2019-02-26 Prosper Technology, Llc Methods to operate autonomous vehicles to pilot vehicles in groups or convoys
US20180099846A1 (en) 2015-03-06 2018-04-12 Wal-Mart Stores, Inc. Method and apparatus for transporting a plurality of stacked motorized transport units
US9757002B2 (en) 2015-03-06 2017-09-12 Wal-Mart Stores, Inc. Shopping facility assistance systems, devices and methods that employ voice input
WO2016142794A1 (en) 2015-03-06 2016-09-15 Wal-Mart Stores, Inc Item monitoring system and method
USD761339S1 (en) * 2015-03-17 2016-07-12 Rethink Robotics, Inc. Compliant manufacturing robot
US9588519B2 (en) * 2015-03-17 2017-03-07 Amazon Technologies, Inc. Systems and methods to facilitate human/robot interaction
US9649766B2 (en) 2015-03-17 2017-05-16 Amazon Technologies, Inc. Systems and methods to facilitate human/robot interaction
PL241337B1 (en) * 2015-04-14 2022-09-12 Medinice Spolka Akcyjna Method for monitoring and controlling of a patient's parameters and for transmitting medical information and the system for the execution of this method
USD802040S1 (en) * 2015-04-28 2017-11-07 Savioke, Inc. Robot
TWD171958S (en) * 2015-05-14 2015-11-21 鴻海精密工業股份有限公司 Emotion interactive robot
US10114379B2 (en) * 2015-06-01 2018-10-30 Dpix, Llc Point to point material transport vehicle improvements for glass substrate
CN106293042B (en) * 2015-06-26 2020-06-23 联想(北京)有限公司 Information processing method and electronic equipment
US9796091B1 (en) 2015-08-17 2017-10-24 X Development Llc Selective robot deployment
US9895809B1 (en) 2015-08-20 2018-02-20 X Development Llc Visual annotations in robot control interfaces
US10880470B2 (en) * 2015-08-27 2020-12-29 Accel Robotics Corporation Robotic camera system
US10016897B2 (en) * 2015-09-14 2018-07-10 OneMarket Network LLC Robotic systems and methods in prediction and presentation of resource availability
JP1560758S (en) * 2015-10-02 2016-10-17
CN106570443A (en) * 2015-10-09 2017-04-19 芋头科技(杭州)有限公司 Rapid identification method and household intelligent robot
US10120057B1 (en) 2015-10-13 2018-11-06 Google Llc System and method for determining the direction of an actor
US9691153B1 (en) 2015-10-21 2017-06-27 Google Inc. System and method for using image data to determine a direction of an actor
US10081106B2 (en) 2015-11-24 2018-09-25 X Development Llc Safety system for integrated human/robotic environments
FR3044193A1 (en) * 2015-11-25 2017-05-26 Orange METHOD AND DEVICE FOR MANAGING COMMUNICATION
US10409292B2 (en) * 2015-12-10 2019-09-10 Panasonic Intellectual Property Corporation Of America Movement control method, autonomous mobile robot, and recording medium storing program
US9740207B2 (en) 2015-12-23 2017-08-22 Intel Corporation Navigating semi-autonomous mobile robots
CN105700681B (en) * 2015-12-31 2018-10-12 联想(北京)有限公司 A kind of control method, electronic equipment
WO2017118001A1 (en) * 2016-01-04 2017-07-13 杭州亚美利嘉科技有限公司 Method and device for returning robots from site
US9776323B2 (en) * 2016-01-06 2017-10-03 Disney Enterprises, Inc. Trained human-intention classifier for safe and efficient robot navigation
US10782686B2 (en) 2016-01-28 2020-09-22 Savioke, Inc. Systems and methods for operating robots including the handling of delivery operations that cannot be completed
US10025308B1 (en) 2016-02-19 2018-07-17 Google Llc System and method to obtain and use attribute data
JP6726388B2 (en) * 2016-03-16 2020-07-22 富士ゼロックス株式会社 Robot control system
CN109153127B (en) 2016-03-28 2022-05-31 Groove X 株式会社 Behavior autonomous robot for executing head-on behavior
CA2961938A1 (en) 2016-04-01 2017-10-01 Wal-Mart Stores, Inc. Systems and methods for moving pallets via unmanned motorized unit-guided forklifts
US10074226B2 (en) * 2016-04-05 2018-09-11 Honeywell International Inc. Systems and methods for providing UAV-based digital escort drones in visitor management and integrated access control systems
GB2549264B (en) 2016-04-06 2020-09-23 Rolls Royce Power Eng Plc Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices
US11212437B2 (en) * 2016-06-06 2021-12-28 Bryan COLIN Immersive capture and review
US10124170B2 (en) 2016-06-13 2018-11-13 Cyberonics, Inc. Neurostimulator with titration assist
US9868214B2 (en) * 2016-06-20 2018-01-16 X Development Llc Localization of a mobile system
USD849813S1 (en) * 2016-07-13 2019-05-28 Crosswing Inc. Mobile robot
DE112017003651T5 (en) 2016-07-20 2019-04-04 Groove X, Inc. Autonomous robot that understands body contact
KR102577571B1 (en) 2016-08-03 2023-09-14 삼성전자주식회사 Robot apparatus amd method of corntrolling emotion expression funtion of the same
USD836690S1 (en) * 2016-08-05 2018-12-25 Samsung Electronics Co., Ltd. Robot
TWD184746S (en) * 2016-08-12 2017-08-01 北京欣奕華科技有限公司 Service robot
EP3287861A1 (en) * 2016-08-24 2018-02-28 Siemens Aktiengesellschaft Method for testing an autonomous system
JP1578668S (en) * 2016-08-29 2017-06-12
US9829333B1 (en) * 2016-09-13 2017-11-28 Amazon Technologies, Inc. Robotic traffic density based guidance
US10095231B2 (en) * 2016-09-14 2018-10-09 International Business Machines Corporation Drone and drone-based system for collecting and managing waste for improved sanitation
TWD186020S (en) * 2016-09-19 2017-10-11 北京欣奕華科技有限公司 Service robot
US11181908B2 (en) 2016-09-20 2021-11-23 Hewlett-Packard Development Company, L.P. Access rights of telepresence robots
EP3298874B1 (en) * 2016-09-22 2020-07-01 Honda Research Institute Europe GmbH Robotic gardening device and method for controlling the same
ES2846098T3 (en) * 2016-09-27 2021-07-28 Sz Dji Technology Co Ltd System and method to control an unmanned vehicle in the presence of a live object
KR20180039821A (en) * 2016-10-11 2018-04-19 삼성전자주식회사 Method for monitoring system control and electronic device supporting the same
CA3040928A1 (en) 2016-10-18 2018-04-26 Piaggio Fast Forward, Inc. Vehicle having non-axial drive and stabilization system
US10987804B2 (en) * 2016-10-19 2021-04-27 Fuji Xerox Co., Ltd. Robot device and non-transitory computer readable medium
WO2018089700A1 (en) * 2016-11-10 2018-05-17 Warner Bros. Entertainment Inc. Social robot with environmental control feature
US10905520B2 (en) * 2016-11-11 2021-02-02 Stryker Corporation Autonomous accessory support for transporting a medical accessory
CA171589S (en) * 2016-11-15 2017-08-04 Crosswing Inc Security robot
USD862551S1 (en) * 2016-11-21 2019-10-08 Ninebot (Beijing) Tech. Co., Ltd Head for mobile service robot
AU2017363489B2 (en) * 2016-11-22 2023-09-14 The Toro Company Autonomous path treatment systems and methods
KR20180080498A (en) * 2017-01-04 2018-07-12 엘지전자 주식회사 Robot for airport and method thereof
USD810167S1 (en) * 2017-01-05 2018-02-13 Kinpo Electronics, Inc. Service robot
USD833498S1 (en) * 2017-01-10 2018-11-13 Intuition Robotics Ltd. Social robot
JP1584573S (en) * 2017-01-12 2018-08-20
JP1584574S (en) * 2017-01-12 2018-08-20
CN107329680A (en) * 2017-01-24 2017-11-07 问众智能信息科技(北京)有限公司 Information residency method and apparatus based on a smart device
USD817375S1 (en) * 2017-02-06 2018-05-08 Cobalt Robotics Inc. Mobile robot
USD852857S1 (en) * 2017-02-21 2019-07-02 Beijing Ling Technology Co., Ltd. Robot
CN110366758A (en) * 2017-03-07 2019-10-22 索尼奥林巴斯医疗解决方案公司 Medical information management device, medical information management method, and medical information management system
JP1599484S (en) * 2017-03-23 2018-03-12
CN107030691B (en) * 2017-03-24 2020-04-14 华为技术有限公司 Data processing method and device for nursing robot
US11024289B2 (en) 2017-03-29 2021-06-01 International Business Machines Corporation Cognitive recommendation engine facilitating interaction between entities
WO2018191818A1 (en) * 2017-04-18 2018-10-25 Clearpath Robotics Inc. Stand-alone self-driving material-transport vehicle
USD843428S1 (en) * 2017-04-19 2019-03-19 Simbe Robotics, Inc. Inventory-tracking robotic system
USD819712S1 (en) * 2017-04-19 2018-06-05 Simbe Robotics, Inc. Inventory-tracking robotic system
USD881961S1 (en) * 2017-05-19 2020-04-21 Sita Information Networking Computing Usa, Inc. Robot
TWD188692S (en) * 2017-05-19 2018-02-21 隆宸星股份有限公司 Robot
TWD188693S (en) * 2017-05-19 2018-02-21 隆宸星股份有限公司 Robot
JP6761990B2 (en) * 2017-05-22 2020-09-30 パナソニックIpマネジメント株式会社 Communication control method, communication control device, telepresence robot, and communication control program
WO2018213931A1 (en) 2017-05-25 2018-11-29 Clearpath Robotics Inc. Systems and methods for process tending with a robot arm
USD818020S1 (en) * 2017-06-06 2018-05-15 The Provost, Fellows, Foundation Scholars and the other members of Board, of the College of the Holy and Undivided Trinity of Queen Elizabeth near Dublin Robot head
USD839940S1 (en) * 2017-06-06 2019-02-05 College of the Holy and Undivided Trinity Robot neck
USD835693S1 (en) * 2017-06-23 2018-12-11 Samsung Electronics Co., Ltd. Robotic device
USD859485S1 (en) * 2017-06-23 2019-09-10 Crosswing Inc. Retail and security robot
USD829249S1 (en) * 2017-07-11 2018-09-25 Intel Corporation Robotic finger
US11837341B1 (en) * 2017-07-17 2023-12-05 Cerner Innovation, Inc. Secured messaging service with customized near real-time data integration
USD829250S1 (en) * 2017-07-17 2018-09-25 Robo-Team Home Ltd Robot
US10483007B2 (en) * 2017-07-25 2019-11-19 Intouch Technologies, Inc. Modular telehealth cart with thermal imaging and touch screen user interface
USD829793S1 (en) * 2017-07-28 2018-10-02 Engineering Services Inc. Robot
USD829794S1 (en) * 2017-07-28 2018-10-02 Engineering Services Inc. Docking station for robot
CA176118S (en) * 2017-07-28 2018-09-20 Genesis Robotics Llp Robotic arm
USD829252S1 (en) * 2017-07-28 2018-09-25 Engineering Services Inc. Robot
USD829251S1 (en) * 2017-07-28 2018-09-25 Engineering Services Inc. Robot head
USD839332S1 (en) * 2017-08-23 2019-01-29 Brent Andrew BAILEY End effector
US11568265B2 (en) * 2017-08-23 2023-01-31 Sony Interactive Entertainment Inc. Continual selection of scenarios based on identified tags describing contextual environment of a user for execution by an artificial intelligence model of the user by an autonomous personal companion
USD825632S1 (en) * 2017-08-28 2018-08-14 MerchSource, LLC Robotic arm
USD841711S1 (en) * 2017-08-29 2019-02-26 Shenzhen GLI Technology Limited Household robot
WO2019041043A1 (en) 2017-08-31 2019-03-07 Clearpath Robotics Inc. Systems and methods for generating a mission for a self-driving material-transport vehicle
USD827006S1 (en) * 2017-09-06 2018-08-28 Hiwin Technologies Corp. Robotic arm
USD827005S1 (en) * 2017-09-06 2018-08-28 Hiwin Technologies Corp. Robotic arm
TWD189310S (en) * 2017-09-08 2018-03-21 趙嘉浩 Robot body structure
TWD189311S (en) * 2017-09-08 2018-03-21 趙嘉浩 Robot's eyelid structure
JP1612101S (en) * 2017-10-03 2020-08-17
IT201700114497A1 (en) 2017-10-11 2019-04-11 Piaggio Fast Forward Inc Two-wheel vehicle with linear stabilization system
CN107844117B (en) * 2017-10-23 2020-10-30 上海木木聚枞机器人科技有限公司 Road locking system and method based on cloud
USD852860S1 (en) * 2017-10-27 2019-07-02 Samsung Electronics Co., Ltd. Robotic device
USD852861S1 (en) * 2017-10-27 2019-07-02 Samsung Electronics Co., Ltd. Robotic device
WO2019084686A1 (en) 2017-10-31 2019-05-09 Clearpath Robotics Inc. Systems and methods for operating robotic equipment in controlled zones
US20210137438A1 (en) * 2017-10-31 2021-05-13 Hewlett-Packard Development Company, L.P. Control system for mobile robots
US10533858B2 (en) * 2017-11-06 2020-01-14 International Business Machines Corporation Automated emergency response
JP7036399B2 (en) * 2017-11-08 2022-03-15 学校法人早稲田大学 Autonomous mobile robots, their control devices and motion control programs
USD852858S1 (en) * 2017-11-08 2019-07-02 Guangdong Kang Yun Technologies Limited Autonomous indoor scanning robot
USD882655S1 (en) * 2017-11-09 2020-04-28 Teenage Engineering Ab Robot
WO2019090417A1 (en) 2017-11-10 2019-05-16 Clearpath Robotics Inc. Systems and methods for updating an electronic map
USD855674S1 (en) * 2017-11-14 2019-08-06 Baidu Online Network Technology (Beijing) Co., Ltd Smart device
US10901430B2 (en) * 2017-11-30 2021-01-26 International Business Machines Corporation Autonomous robotic avatars
US20190172284A1 (en) * 2017-12-05 2019-06-06 Savioke Inc. Apparatus, System, and Method for Secure Robot Access
USD868864S1 (en) * 2017-12-12 2019-12-03 EasyRobotics ApS Robotic workstation
US10698413B2 (en) * 2017-12-28 2020-06-30 Savioke Inc. Apparatus, system, and method for mobile robot relocalization
TWD192704S (en) * 2018-01-08 2018-09-01 廣達電腦股份有限公司 Interactive robot
USD841067S1 (en) * 2018-01-19 2019-02-19 Myav S.R.L. Industrial robot
US10424412B2 (en) * 2018-01-31 2019-09-24 Jeffrey Huang Privacy-controlled care requester communication system with on-demand caregiver conferencing and real-time vital statistics alert
USD838759S1 (en) * 2018-02-07 2019-01-22 Mainspring Home Decor, Llc Combination robot clock and device holder
USD868129S1 (en) * 2018-03-21 2019-11-26 Productive Robotics, Inc. Robot stand with work table
USD868865S1 (en) * 2018-03-21 2019-12-03 Productive Robotics, Inc. Robot stand
JP1612145S (en) * 2018-03-29 2019-08-19
USD856389S1 (en) * 2018-03-30 2019-08-13 Jabil Inc. Base of an autonomous mobile robot
USD857073S1 (en) * 2018-03-30 2019-08-20 Jabil Inc. Autonomous mobile robot
USD877786S1 (en) * 2018-03-30 2020-03-10 Jabil Inc. Autonomous mobile robot
USD854595S1 (en) * 2018-03-30 2019-07-23 Jabil Inc. Tower of an autonomous mobile robot
CA3098000C (en) * 2018-05-01 2021-12-07 Piaggio Fast Forward, Inc. Method for determining self-driving vehicle behavior models, a self-driving vehicle, and a method of navigating a self-driving vehicle
CN110475087B (en) * 2018-05-09 2022-03-15 视联动力信息技术股份有限公司 Service processing system and method and electronic equipment
EP3569366B1 (en) * 2018-05-17 2023-06-28 Siemens Aktiengesellschaft Robot control method and apparatus
EP3570134B1 (en) * 2018-05-18 2021-06-30 Mobile Industrial Robots A/S System for evacuating one or more mobile robots
CN109003666A (en) * 2018-06-21 2018-12-14 珠海金山网络游戏科技有限公司 Method, device, and system for remote operation of a companion care robot based on motion capture
US10933528B2 (en) * 2018-07-06 2021-03-02 International Business Machines Corporation Autonomous robotic monitor for alerting of hazards
JP7035886B2 (en) * 2018-07-30 2022-03-15 トヨタ自動車株式会社 Image processing device, image processing method
IT201800007934A1 (en) * 2018-08-07 2020-02-07 Universita' Degli Studi Di Siena Method for generating an awareness signal of a collaborative robot and related hardware system
KR102629036B1 (en) * 2018-08-30 2024-01-25 삼성전자주식회사 Robot and the controlling method thereof
CN113016038A (en) * 2018-10-12 2021-06-22 索尼集团公司 Haptic obstruction to avoid collisions with robotic surgical equipment
KR102228866B1 (en) * 2018-10-18 2021-03-17 엘지전자 주식회사 Robot and method for controlling the same
EP3870470A4 (en) 2018-10-22 2022-08-10 Piaggio Fast Forward, Inc. Shifting assembly and mobile carrier comprising same
JP7253900B2 (en) * 2018-11-13 2023-04-07 株式会社日立製作所 communication robot
USD897387S1 (en) * 2018-12-27 2020-09-29 Iei Integration Corp. Accompany robot
TWD200891S (en) * 2018-12-27 2019-11-11 威強電工業電腦股份有限公司 Accompany robot
USD897388S1 (en) * 2018-12-27 2020-09-29 Iei Integration Corp. Accompany robot
USD897389S1 (en) * 2018-12-27 2020-09-29 Iei Integration Corp. Accompany robot
US11338438B2 (en) * 2019-01-25 2022-05-24 Bear Robotics, Inc. Method, system and non-transitory computer-readable recording medium for determining a movement path of a robot
US11078019B2 (en) 2019-01-30 2021-08-03 Locus Robotics Corp. Tote induction in warehouse order fulfillment operations
US11034027B2 (en) * 2019-02-01 2021-06-15 Locus Robotics Corp. Robot assisted personnel routing
US11724395B2 (en) 2019-02-01 2023-08-15 Locus Robotics Corp. Robot congestion management
KR20200115696A (en) 2019-03-07 2020-10-08 삼성전자주식회사 Electronic apparatus and controlling method thereof
US11347226B2 (en) * 2019-04-25 2022-05-31 Lg Electronics Inc. Method of redefining the position of a robot using artificial intelligence and robot implementing the same
US11191005B2 (en) 2019-05-29 2021-11-30 At&T Intellectual Property I, L.P. Cyber control plane for universal physical space
US20220218432A1 (en) * 2019-05-31 2022-07-14 Intuitive Surgical Operations, Inc. Systems and methods for bifurcated navigation control of a manipulator cart included within a computer-assisted medical system
USD894987S1 (en) * 2019-06-14 2020-09-01 Bear Robotics Korea, Inc. Robot
WO2021034681A1 (en) 2019-08-16 2021-02-25 Bossa Nova Robotics Ip, Inc. Systems and methods for image capture and shelf content detection
US11260536B1 (en) * 2019-08-27 2022-03-01 Amazon Technologies, Inc. Simulation of emotional state of an interactive device
DE102019124720B3 (en) * 2019-09-13 2020-12-03 Franka Emika Gmbh Online conformity analysis and conformity marking for robots
KR20210045022A (en) * 2019-10-16 2021-04-26 네이버 주식회사 Method and system for controlling robot using recognized personal space associated with human
US11504849B2 (en) * 2019-11-22 2022-11-22 Edda Technology, Inc. Deterministic robot path planning method for obstacle avoidance
USD976976S1 (en) * 2019-12-09 2023-01-31 Zebra Technologies Corporation Mobile automation apparatus
USD958860S1 (en) * 2019-12-31 2022-07-26 Nvidia Corporation Robot
USD927580S1 (en) * 2020-05-07 2021-08-10 Expper Technologies, Inc. Interactive robot
USD978210S1 (en) * 2020-05-15 2023-02-14 GoBe Robots ApS Display assembly
USD963013S1 (en) 2020-05-15 2022-09-06 GoBe Robots ApS Robot
USD968492S1 (en) * 2020-05-29 2022-11-01 Blue Ocean Robotics Aps UV-light disinfection robot
JP6867721B2 (en) * 2020-06-25 2021-05-12 株式会社高山商事 Monitoring system
USD945507S1 (en) * 2020-06-30 2022-03-08 Bossa Nova Robotics Ip, Inc. Mobile robot for object detection
KR102337531B1 (en) * 2020-07-08 2021-12-09 네이버랩스 주식회사 Method and system for specifying node for robot path planning
US11741564B2 (en) 2020-09-11 2023-08-29 Locus Robotics Corp. Sequence adjustment for executing functions on items in an order
JP1711631S (en) * 2020-09-27 2022-04-04 Disinfection robot
JP2022062635A (en) * 2020-10-08 2022-04-20 トヨタ自動車株式会社 Server device, system, control device, mobile device, and operating method for system
US11837363B2 (en) 2020-11-04 2023-12-05 Hill-Rom Services, Inc. Remote management of patient environment
USD936057S1 (en) 2020-11-24 2021-11-16 Vr Media Technology, Inc. Scanner
US11835949B2 (en) 2020-11-24 2023-12-05 Mobile Industrial Robots A/S Autonomous device safety system
USD964446S1 (en) * 2020-11-26 2022-09-20 Samsung Electronics Co., Ltd. Service robot
USD970572S1 (en) * 2020-12-09 2022-11-22 Samsung Electronics Co., Ltd. Service robot
EP4015155A1 (en) 2020-12-16 2022-06-22 Tata Consultancy Services Limited Mobile robotic manipulator with telepresence system
CN112998666B (en) * 2020-12-17 2023-09-05 杭州慧能科技有限公司 Intelligent medical device and intelligent medical system for remotely visiting patient
EP4269037A1 (en) 2020-12-23 2023-11-01 Panasonic Intellectual Property Management Co., Ltd. Robot control method, robot, and program
JP2022100936A (en) * 2020-12-24 2022-07-06 トヨタ自動車株式会社 Autonomous movement system, autonomous movement method and autonomous movement program
USD954120S1 (en) * 2021-03-16 2022-06-07 Ubtech Robotics Corp Ltd Robot
USD958858S1 (en) * 2021-03-29 2022-07-26 Heyun Yang Vacuum sealing machine
JP1696990S (en) * 2021-03-31 2021-10-11
USD975764S1 (en) * 2021-06-30 2023-01-17 Heyun Yang Vacuum sealing machine
US20230043172A1 (en) * 2021-08-06 2023-02-09 Zebra Technologies Corporation Adaptive Perimeter Intrusion Detection for Mobile Automation Apparatus
USD989145S1 (en) * 2021-11-05 2023-06-13 Ubtech North America Research And Development Center Corp Disinfecting robot
US11861959B2 (en) * 2022-01-06 2024-01-02 Johnson Controls Tyco IP Holdings LLP Methods and systems for integrating autonomous devices with an access control system
USD989146S1 (en) * 2022-01-11 2023-06-13 Ubtech Robotics Corp Ltd Robot
US11908577B2 (en) * 2022-07-22 2024-02-20 Health Science Partners LLC Telemedicine platform including virtual assistance

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030216834A1 (en) * 2000-05-01 2003-11-20 Allard James R. Method and system for remote control of mobile robot
US20070271122A1 (en) * 2006-05-02 2007-11-22 Siemens Medical Solutions Usa, Inc. Patient Video and Audio Monitoring System
US20090278912A1 (en) * 2008-05-11 2009-11-12 Revolutionary Concepts, Inc. Medical audio/video communications system
US20110288682A1 (en) * 2010-05-24 2011-11-24 Marco Pinter Telepresence Robot System that can be Accessed by a Cellular Phone
US20120010518A1 (en) * 2000-07-12 2012-01-12 Dimicine Research It, Llc Telemedicine system

Family Cites Families (913)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3821995A (en) 1971-10-15 1974-07-02 E Aghnides Vehicle with composite wheel
US4107689A (en) 1976-06-07 1978-08-15 Rca Corporation System for automatic vehicle location
US4213182A (en) 1978-12-06 1980-07-15 General Electric Company Programmable energy load controller system and methods
US4413693A (en) 1981-03-27 1983-11-08 Derby Sherwin L Mobile chair
US6317953B1 (en) 1981-05-11 2001-11-20 Lmi-Diffracto Vision target based assembly
US5148591A (en) 1981-05-11 1992-09-22 Sensor Adaptive Machines, Inc. Vision target based assembly
US4471354A (en) 1981-11-23 1984-09-11 Marathon Medical Equipment Corporation Apparatus and method for remotely measuring temperature
US4519466A (en) 1982-03-30 1985-05-28 Eiko Shiraishi Omnidirectional drive system
EP0108657B1 (en) 1982-09-25 1987-08-12 Fujitsu Limited A multi-articulated robot
US4625274A (en) 1983-12-05 1986-11-25 Motorola, Inc. Microprocessor reset system
US4572594A (en) 1984-02-08 1986-02-25 Schwartz C Bruce Arthroscopy support stand
US4638445A (en) 1984-06-08 1987-01-20 Mattaboni Paul J Autonomous mobile robot
US4766581A (en) 1984-08-07 1988-08-23 Justin Korn Information retrieval system and method using independent user stations
US4553309A (en) 1984-09-26 1985-11-19 General Motors Corporation Robotic assembly of vehicle headliners
JPS6180410A (en) 1984-09-28 1986-04-24 Yutaka Kanayama Drive command system of mobile robot
JPS61111863A (en) 1984-11-05 1986-05-29 Nissan Motor Co Ltd Assembling work by using robots
US4679152A (en) 1985-02-20 1987-07-07 Heath Company Navigation system and method for a mobile robot
US4697278A (en) 1985-03-01 1987-09-29 Veeder Industries Inc. Electronic hub odometer
US4652204A (en) 1985-08-02 1987-03-24 Arnett Edward M Apparatus for handling hazardous materials
US4733737A (en) 1985-08-29 1988-03-29 Reza Falamak Drivable steerable platform for industrial, domestic, entertainment and like uses
US4709265A (en) 1985-10-15 1987-11-24 Advanced Resource Development Corporation Remote control mobile surveillance system
US4751658A (en) 1986-05-16 1988-06-14 Denning Mobile Robotics, Inc. Obstacle avoidance system
US4777416A (en) 1986-05-16 1988-10-11 Denning Mobile Robotics, Inc. Recharge docking system for mobile robot
SE455539B (en) 1986-05-23 1988-07-18 Electrolux Ab Electro-optical position-sensing system for an object movable in a plane, preferably a mobile robot
US4803625A (en) 1986-06-30 1989-02-07 Buddy Systems, Inc. Personal health monitor
US4878501A (en) 1986-09-24 1989-11-07 Shue Ming Jeng Electronic stethoscopic apparatus
JPS63289607A (en) 1987-05-21 1988-11-28 Toshiba Corp Inter-module communication control system for intelligent robot
US4847764C1 (en) 1987-05-21 2001-09-11 Meditrol Inc System for dispensing drugs in health care institutions
JPH0191834A (en) 1987-08-20 1989-04-11 Tsuruta Hiroko Abnormal data detection and information method in individual medical data central control system
JPS6479097A (en) 1987-09-21 1989-03-24 Sumitomo Electric Industries Compound semiconductor vapor growth device
US4942538A (en) 1988-01-05 1990-07-17 Spar Aerospace Limited Telerobotic tracker
US5193143A (en) 1988-01-12 1993-03-09 Honeywell Inc. Problem state monitoring
US4979949A (en) 1988-04-26 1990-12-25 The Board Of Regents Of The University Of Washington Robot-aided system for surgery
US5142484A (en) 1988-05-12 1992-08-25 Health Tech Services Corporation An interactive patient assistance device for storing and dispensing prescribed medication and physical device
US5008804A (en) 1988-06-23 1991-04-16 Total Spectrum Manufacturing Inc. Robotic television-camera dolly system
US5040116A (en) 1988-09-06 1991-08-13 Transitions Research Corporation Visual navigation and obstacle avoidance structured light system
US5157491A (en) 1988-10-17 1992-10-20 Kassatly L Samuel A Method and apparatus for video broadcasting and teleconferencing
US5155684A (en) 1988-10-25 1992-10-13 Tennant Company Guiding an unmanned vehicle by reference to overhead features
US4953159A (en) 1989-01-03 1990-08-28 American Telephone And Telegraph Company Audiographics conferencing arrangement
US5016173A (en) 1989-04-13 1991-05-14 Vanguard Imaging Ltd. Apparatus and method for monitoring visually accessible surfaces of the body
US5006988A (en) 1989-04-28 1991-04-09 University Of Michigan Obstacle-avoiding navigation system
US4977971A (en) 1989-05-17 1990-12-18 University Of Florida Hybrid robotic vehicle
US5224157A (en) 1989-05-22 1993-06-29 Minolta Camera Kabushiki Kaisha Management system for managing maintenance information of image forming apparatus
US5051906A (en) 1989-06-07 1991-09-24 Transitions Research Corporation Mobile robot navigation employing retroreflective ceiling features
JP3002206B2 (en) 1989-06-22 2000-01-24 神鋼電機株式会社 Travel control method for mobile robot
US5341854A (en) 1989-09-28 1994-08-30 Alberta Research Council Robotic drug dispensing system
US5084828A (en) 1989-09-29 1992-01-28 Healthtech Services Corp. Interactive medication delivery system
JP2964518B2 (en) 1990-01-30 1999-10-18 日本電気株式会社 Voice control method
JP2679346B2 (en) 1990-03-28 1997-11-19 神鋼電機株式会社 Charging control method for mobile robot system
US5130794A (en) 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
JP2921936B2 (en) 1990-07-13 1999-07-19 株式会社東芝 Image monitoring device
US6958706B2 (en) 1990-07-27 2005-10-25 Hill-Rom Services, Inc. Patient care and communication system
JP2541353B2 (en) 1990-09-18 1996-10-09 三菱自動車工業株式会社 Active suspension system for vehicles
US5563998A (en) 1990-10-19 1996-10-08 Moore Business Forms, Inc. Forms automation system implementation
US5276445A (en) 1990-11-30 1994-01-04 Sony Corporation Polling control system for switching units in a plural stage switching matrix
US5310464A (en) 1991-01-04 1994-05-10 Redepenning Jody G Electrocrystallization of strongly adherent brushite coatings on prosthetic alloys
JPH0530502A (en) 1991-07-24 1993-02-05 Hitachi Ltd Integrated video telephone set
US5217453A (en) 1991-03-18 1993-06-08 Wilk Peter J Automated surgical system and apparatus
JPH04289379A (en) 1991-03-18 1992-10-14 Toda Constr Co Ltd Mounting method for guide rail for cutting machine
JPH06502054A (en) 1991-04-22 1994-03-03 エバンズ・アンド・サザーランド・コンピューター・コーポレーション Head-mounted projection display system using beam splitter
US5341459A (en) 1991-05-09 1994-08-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Generalized compliant motion primitive
US5231693A (en) 1991-05-09 1993-07-27 The United States Of America As Represented By The Administrator, National Aeronautics And Space Administration Telerobot control system
US7382399B1 (en) 1991-05-13 2008-06-03 Sony Corporation Omniview motionless camera orientation system
JP3173042B2 (en) 1991-05-21 2001-06-04 ソニー株式会社 Robot numerical controller
US5417210A (en) 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
US5182641A (en) 1991-06-17 1993-01-26 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Composite video and graphics display for camera viewing systems in robotics and teleoperation
US5366896A (en) 1991-07-30 1994-11-22 University Of Virginia Alumni Patents Foundation Robotically operated laboratory system
US5441042A (en) 1991-08-05 1995-08-15 Putman; John M. Endoscope instrument holder
IL99420A (en) 1991-09-05 2000-12-06 Elbit Systems Ltd Helmet mounted display
WO1993006690A1 (en) 1991-09-17 1993-04-01 Radamec Epo Limited Setting-up system for remotely controlled cameras
US5186270A (en) 1991-10-24 1993-02-16 Massachusetts Institute Of Technology Omnidirectional vehicle
US5419008A (en) 1991-10-24 1995-05-30 West; Mark Ball joint
JP3583777B2 (en) 1992-01-21 2004-11-04 エス・アール・アイ・インターナシヨナル Teleoperator system and telepresence method
US5631973A (en) 1994-05-05 1997-05-20 Sri International Method for telemanipulation with telepresence
EP0559348A3 (en) 1992-03-02 1993-11-03 AT&T Corp. Rate control loop processor for perceptual encoder/decoder
US5441047A (en) 1992-03-25 1995-08-15 David; Daniel Ambulatory patient health monitoring techniques utilizing interactive visual communication
US5544649A (en) 1992-03-25 1996-08-13 Cardiomedix, Inc. Ambulatory patient health monitoring techniques utilizing interactive visual communication
US5262944A (en) 1992-05-15 1993-11-16 Hewlett-Packard Company Method for use of color and selective highlighting to indicate patient critical events in a centralized patient monitoring system
US5594859A (en) 1992-06-03 1997-01-14 Digital Equipment Corporation Graphical user interface for video teleconferencing
US5375195A (en) 1992-06-29 1994-12-20 Johnston; Victor S. Method and apparatus for generating composites of human faces
US5762458A (en) 1996-02-20 1998-06-09 Computer Motion, Inc. Method and apparatus for performing minimally invasive cardiac procedures
DE4228281A1 (en) 1992-08-26 1994-03-03 Koenig & Bauer Ag Process for displaying machine malfunctions
US5374879A (en) 1992-11-04 1994-12-20 Martin Marietta Energy Systems, Inc. Omni-directional and holonomic rolling platform with decoupled rotational and translational degrees of freedom
US5600573A (en) 1992-12-09 1997-02-04 Discovery Communications, Inc. Operations center with video storage for a television program packaging and delivery system
US5315287A (en) 1993-01-13 1994-05-24 David Sol Energy monitoring system for recreational vehicles and marine vessels
DE69413585T2 (en) 1993-03-31 1999-04-29 Siemens Medical Systems Inc Apparatus and method for providing dual output signals in a telemetry transmitter
US5319611A (en) 1993-03-31 1994-06-07 National Research Council Of Canada Method of determining range data in a time-of-flight ranging system
US5350033A (en) 1993-04-26 1994-09-27 Kraft Brett W Robotic inspection vehicle
DE69317267T2 (en) 1993-05-19 1998-06-25 Alsthom Cge Alcatel Network for video on demand
DE69434779T2 (en) 1993-09-20 2007-06-14 Canon K.K. video system
US5689641A (en) 1993-10-01 1997-11-18 Vicor, Inc. Multimedia collaboration system arrangement for routing compressed AV signal through a participant site without decompressing the AV signal
US6594688B2 (en) 1993-10-01 2003-07-15 Collaboration Properties, Inc. Dedicated echo canceler for a workstation
CA2174336A1 (en) 1993-10-20 1995-04-27 Leo M. Cortjens Adaptive videoconferencing system
US5876325A (en) 1993-11-02 1999-03-02 Olympus Optical Co., Ltd. Surgical manipulation system
US5623679A (en) 1993-11-19 1997-04-22 Waverley Holdings, Inc. System and method for creating and manipulating notes each containing multiple sub-notes, and linking the sub-notes to portions of data objects
US5510832A (en) 1993-12-01 1996-04-23 Medi-Vision Technologies, Inc. Synthesized stereoscopic imaging system and method
US5347306A (en) 1993-12-17 1994-09-13 Mitsubishi Electric Research Laboratories, Inc. Animated electronic meeting place
GB2284968A (en) 1993-12-18 1995-06-21 Ibm Audio conferencing system
JP3339953B2 (en) 1993-12-29 2002-10-28 オリンパス光学工業株式会社 Medical master-slave manipulator
US5511147A (en) 1994-01-12 1996-04-23 Uti Corporation Graphical interface for robot
US5436542A (en) 1994-01-28 1995-07-25 Surgix, Inc. Telescopic camera mount with remotely controlled positioning
JPH07213753A (en) 1994-02-02 1995-08-15 Hitachi Ltd Personal robot device
DE4408329C2 (en) 1994-03-11 1996-04-18 Siemens Ag Method for building a cellular-structured environment map for a self-propelled mobile unit that orients itself using wave-reflection-based sensors
JPH07248823A (en) 1994-03-11 1995-09-26 Hitachi Ltd Personal robot device
JPH07257422A (en) 1994-03-19 1995-10-09 Hideaki Maehara Omnidirectional drive wheel and omnidirectional traveling vehicle providing the same
US5659779A (en) 1994-04-25 1997-08-19 The United States Of America As Represented By The Secretary Of The Navy System for assigning computer resources to control multiple computer directed devices
US5784546A (en) 1994-05-12 1998-07-21 Integrated Virtual Networks Integrated virtual networks
JPH084328A (en) 1994-06-16 1996-01-09 Sekisui Chem Co Ltd Bathroom unit
US5734805A (en) 1994-06-17 1998-03-31 International Business Machines Corporation Apparatus and method for controlling navigation in 3-D space
CA2148631C (en) 1994-06-20 2000-06-13 John J. Hildin Voice-following video system
JPH0811074A (en) 1994-06-29 1996-01-16 Fanuc Ltd Robot system
BE1008470A3 (en) 1994-07-04 1996-05-07 Colens Andre Automatic floor-dusting device and system and machine adapted thereto
US5462051A (en) 1994-08-31 1995-10-31 Colin Corporation Medical communication system
JP3302188B2 (en) 1994-09-13 2002-07-15 日本電信電話株式会社 Telexistence-type video phone
US5675229A (en) 1994-09-21 1997-10-07 Abb Robotics Inc. Apparatus and method for adjusting robot positioning
US6463361B1 (en) 1994-09-22 2002-10-08 Computer Motion, Inc. Speech interface for an automated endoscopic system
US5598208A (en) 1994-09-26 1997-01-28 Sony Corporation Video viewing and recording system
US5764731A (en) 1994-10-13 1998-06-09 Yablon; Jay R. Enhanced system for transferring, storing and using signaling information in a switched telephone network
US5767897A (en) 1994-10-31 1998-06-16 Picturetel Corporation Video conferencing system
JPH08139900A (en) 1994-11-14 1996-05-31 Canon Inc Image communication equipment
JP2726630B2 (en) 1994-12-07 1998-03-11 インターナショナル・ビジネス・マシーンズ・コーポレイション Gateway device and gateway method
JPH08166822A (en) * 1994-12-13 1996-06-25 Nippon Telegr &amp; Teleph Corp &lt;Ntt&gt; User-tracking mobile robot device and sensing method
US5486853A (en) 1994-12-13 1996-01-23 Picturetel Corporation Electrical cable interface for electronic camera
US5553609A (en) 1995-02-09 1996-09-10 Visiting Nurse Service, Inc. Intelligent remote visual monitoring system for home health care service
US5619341A (en) 1995-02-23 1997-04-08 Motorola, Inc. Method and apparatus for preventing overflow and underflow of an encoder buffer in a video compression system
US5973724A (en) 1995-02-24 1999-10-26 Apple Computer, Inc. Merging multiple teleconferences
US5854898A (en) 1995-02-24 1998-12-29 Apple Computer, Inc. System for automatically adding additional data stream to existing media connection between two end points upon exchange of notifying and confirmation messages therebetween
DE69530367T2 (en) 1995-03-06 2004-02-19 Perkin-Elmer Ltd., Beaconsfield Checking a microscope carrier
US5657246A (en) 1995-03-07 1997-08-12 Vtel Corporation Method and apparatus for a video conference user interface
JP2947113B2 (en) 1995-03-09 1999-09-13 日本電気株式会社 User interface device for image communication terminal
US5652849A (en) 1995-03-16 1997-07-29 Regents Of The University Of Michigan Apparatus and method for remote control using a visual information stream
US5673082A (en) 1995-04-10 1997-09-30 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Light-directed ranging system implementing single camera system for telerobotics applications
JP3241564B2 (en) 1995-05-10 2001-12-25 富士通株式会社 Control device and method for motion control of normal wheel type omnidirectional mobile robot
JPH08320727A (en) 1995-05-24 1996-12-03 Shinko Electric Co Ltd Moving device
US5630566A (en) 1995-05-30 1997-05-20 Case; Laura Portable ergonomic work station
JPH08335112A (en) 1995-06-08 1996-12-17 Minolta Co Ltd Mobile working robot system
US5956342A (en) 1995-07-19 1999-09-21 Fujitsu Network Communications, Inc. Priority arbitration for point-to-point and multipoint transmission
JPH0965224A (en) 1995-08-24 1997-03-07 Hitachi Ltd Television receiver
US5825982A (en) 1995-09-15 1998-10-20 Wright; James Head cursor control interface for an automated endoscope system for optimal positioning
US6710797B1 (en) 1995-09-20 2004-03-23 Videotronic Systems Adaptable teleconferencing eye contact terminal
US5961446A (en) 1995-10-06 1999-10-05 Tevital Incorporated Patient terminal for home health care system
US5797515A (en) 1995-10-18 1998-08-25 Adds, Inc. Method for controlling a drug dispensing system
EP0804028B1 (en) 1995-11-13 2008-01-23 Sony Corporation Near video on-demand system and televising method of the same
US20010034475A1 (en) 1995-11-13 2001-10-25 Flach Terry E. Wireless lan system with cellular architecture
US6219032B1 (en) 1995-12-01 2001-04-17 Immersion Corporation Method for providing force feedback to a user of an interface device based on interactions of a controlled cursor with graphical elements in a graphical user interface
US5838575A (en) 1995-12-14 1998-11-17 Rx Excell Inc. System for dispensing drugs
WO1997023094A1 (en) 1995-12-18 1997-06-26 Bell Communications Research, Inc. Head mounted displays linked to networked electronic panning cameras
US5793365A (en) 1996-01-02 1998-08-11 Sun Microsystems, Inc. System and method providing a computer user interface enabling access to distributed workgroup members
US5701904A (en) 1996-01-11 1997-12-30 Krug International Telemedicine instrumentation pack
US5624398A (en) 1996-02-08 1997-04-29 Symbiosis Corporation Endoscopic robotic surgical tools and methods
DE69727660T2 (en) 1996-03-18 2004-12-23 General Instrument Corporation DYNAMIC BANDWIDTH ASSIGNMENT FOR A COMMUNICATION NETWORK
US5682199A (en) 1996-03-28 1997-10-28 Jedmed Instrument Company Video endoscope with interchangeable endoscope heads
JP3601737B2 (en) 1996-03-30 2004-12-15 技術研究組合医療福祉機器研究所 Transfer robot system
US5801755A (en) 1996-04-09 1998-09-01 Echerer; Scott J. Interactive communication system for medical treatment of remotely located patients
US5867653A (en) 1996-04-18 1999-02-02 International Business Machines Corporation Method and apparatus for multi-cast based video conferencing
US6135228A (en) 1996-04-25 2000-10-24 Massachusetts Institute Of Technology Human transport system with dead reckoning facilitating docking
AU2829697A (en) 1996-05-06 1997-11-26 Camelot Corporation, The Videophone system
US6189034B1 (en) 1996-05-08 2001-02-13 Apple Computer, Inc. Method and apparatus for dynamic launching of a teleconferencing application upon receipt of a call
US6006191A (en) 1996-05-13 1999-12-21 Dirienzo; Andrew L. Remote access medical image exchange system and methods of operation therefor
US6496099B2 (en) 1996-06-24 2002-12-17 Computer Motion, Inc. General purpose distributed operating room control system
US5949758A (en) 1996-06-27 1999-09-07 International Business Machines Corporation Bandwidth reservation for multiple file transfer in a high speed communication network
US5787745A (en) 1996-07-22 1998-08-04 Chang; Kun-Sheng Key-chain with annual calendar
JPH1079097A (en) 1996-09-04 1998-03-24 Toyota Motor Corp Mobile object communication method
US6195357B1 (en) 1996-09-24 2001-02-27 Intervoice Limited Partnership Interactive information transaction processing system with universal telephony gateway capabilities
US5754631A (en) 1996-09-30 1998-05-19 Intervoice Limited Partnership Voice response unit having robot conference capability on ports
US5974446A (en) 1996-10-24 1999-10-26 Academy Of Applied Science Internet based distance learning system for communicating between server and clients wherein clients communicate with each other or with teacher using different communication techniques via common user interface
US6646677B2 (en) 1996-10-25 2003-11-11 Canon Kabushiki Kaisha Image sensing control method and apparatus, image transmission control method, apparatus, and system, and storage means storing program that implements the method
US5917958A (en) 1996-10-31 1999-06-29 Sensormatic Electronics Corporation Distributed video data base with remote searching for image data features
US5867494A (en) 1996-11-18 1999-02-02 Mci Communication Corporation System, method and article of manufacture with integrated video conferencing billing in a communication system architecture
US6331181B1 (en) 1998-12-08 2001-12-18 Intuitive Surgical, Inc. Surgical robotic tools, data architecture, and use
US8182469B2 (en) 1997-11-21 2012-05-22 Intuitive Surgical Operations, Inc. Surgical accessory clamp and method
US6113343A (en) 1996-12-16 2000-09-05 Goldenberg; Andrew Explosives disposal robot
US6148100A (en) 1996-12-20 2000-11-14 Bechtel Bwxt Idaho, Llc 3-dimensional telepresence system for a robotic environment
US5886735A (en) 1997-01-14 1999-03-23 Bullister; Edward T Video telephone headset
WO1998038958A1 (en) 1997-03-05 1998-09-11 Massachusetts Institute Of Technology A reconfigurable footprint mechanism for omnidirectional vehicles
US5995884A (en) 1997-03-07 1999-11-30 Allen; Timothy P. Computer peripheral floor cleaning system and navigation method
US6501740B1 (en) 1997-03-07 2002-12-31 At&T Corp. System and method for teleconferencing on an internetwork comprising connection-oriented and connectionless networks
JP3217723B2 (en) 1997-03-13 2001-10-15 Susumu Tachi Telecommunications system and telecommunications method
WO1998042407A1 (en) 1997-03-27 1998-10-01 Medtronic, Inc. Concepts to implement medconnect
JPH10288689A (en) 1997-04-14 1998-10-27 Hitachi Ltd Remote monitoring system
US20040157612A1 (en) 1997-04-25 2004-08-12 Minerva Industries, Inc. Mobile communication and stethoscope system
US6914622B1 (en) 1997-05-07 2005-07-05 Telbotics Inc. Teleconferencing robot with swiveling video monitor
WO1998051078A1 (en) 1997-05-07 1998-11-12 Telbotics Inc. Teleconferencing robot with swiveling video monitor
GB2325376B (en) 1997-05-14 2001-09-19 Dsc Telecom Lp Allocation of bandwidth to calls in a wireless telecommunications system
US5857534A (en) 1997-06-05 1999-01-12 Kansas State University Research Foundation Robotic inspection apparatus and method
US5995119A (en) 1997-06-06 1999-11-30 At&T Corp. Method for generating photo-realistic animated characters
DE69805068T2 (en) 1997-07-02 2002-11-07 Borringia Ind Ag Arlesheim DRIVE WHEEL
US6330486B1 (en) 1997-07-16 2001-12-11 Silicon Graphics, Inc. Acoustic perspective in a virtual three-dimensional environment
US6445964B1 (en) 1997-08-04 2002-09-03 Harris Corporation Virtual reality simulation-based training of telekinegenesis system for training sequential kinematic behavior of automated kinematic machine
JPH11126017A (en) 1997-08-22 1999-05-11 Sony Corp Storage medium, robot, information processing device and electronic pet system
WO1999012082A1 (en) 1997-09-04 1999-03-11 Dynalog, Inc. Method for calibration of a robot inspection system
US6714839B2 (en) 1998-12-08 2004-03-30 Intuitive Surgical, Inc. Master having redundant degrees of freedom
US6400378B1 (en) 1997-09-26 2002-06-04 Sony Corporation Home movie maker
JPH11175118A (en) 1997-10-08 1999-07-02 Denso Corp Robot controller
US6597392B1 (en) 1997-10-14 2003-07-22 Healthcare Vision, Inc. Apparatus and method for computerized multi-media data organization and transmission
US7956894B2 (en) 1997-10-14 2011-06-07 William Rex Akers Apparatus and method for computerized multi-media medical and pharmaceutical data organization and transmission
US7885822B2 (en) 2001-05-09 2011-02-08 William Rex Akers System and method for electronic medical file management
US6209018B1 (en) 1997-11-13 2001-03-27 Sun Microsystems, Inc. Service framework for a distributed object network system
US6389329B1 (en) 1997-11-27 2002-05-14 Andre Colens Mobile robots and their control system
US6532404B2 (en) 1997-11-27 2003-03-11 Colens Andre Mobile robots and their control system
JP3919040B2 (en) 1997-11-30 2007-05-23 ソニー株式会社 Robot equipment
US6036812A (en) 1997-12-05 2000-03-14 Automated Prescription Systems, Inc. Pill dispensing system
US6006946A (en) 1997-12-05 1999-12-28 Automated Prescriptions System, Inc. Pill dispensing system
WO1999030876A1 (en) 1997-12-16 1999-06-24 Board Of Trustees Operating Michigan State University Spherical mobile robot
US6047259A (en) 1997-12-30 2000-04-04 Medical Management International, Inc. Interactive method and system for managing physical exams, diagnosis and treatment protocols in a health care practice
US5983263A (en) 1998-01-02 1999-11-09 Intel Corporation Method and apparatus for transmitting images during a multimedia teleconference
US6563533B1 (en) 1998-01-06 2003-05-13 Sony Corporation Ergonomically designed apparatus for selectively actuating remote robotics cameras
US6380968B1 (en) 1998-01-06 2002-04-30 Intel Corporation Method and apparatus for controlling a remote video camera in a video conferencing system
DE19803494A1 (en) 1998-01-29 1999-08-05 Berchtold Gmbh & Co Geb Procedure for manipulating an operating light
JPH11220706A (en) 1998-02-03 1999-08-10 Nikon Corp Video telephone system
JPH11249725A (en) 1998-02-26 1999-09-17 Fanuc Ltd Robot controller
US6346962B1 (en) 1998-02-27 2002-02-12 International Business Machines Corporation Control of video conferencing system with pointing device
US6373855B1 (en) 1998-03-05 2002-04-16 Intel Corporation System and method for using audio performance to control video bandwidth
US6643496B1 (en) 1998-03-31 2003-11-04 Canon Kabushiki Kaisha System, method, and apparatus for adjusting packet transmission rates based on dynamic evaluation of network characteristics
GB9807540D0 (en) 1998-04-09 1998-06-10 Orad Hi Tec Systems Ltd Tracking system for sports
US6650748B1 (en) 1998-04-13 2003-11-18 Avaya Technology Corp. Multiple call handling in a call center
US6233504B1 (en) 1998-04-16 2001-05-15 California Institute Of Technology Tool actuation and force feedback on robot-assisted microsurgery system
US6529765B1 (en) 1998-04-21 2003-03-04 Neutar L.L.C. Instrumented and actuated guidance fixture for stereotactic surgery
US20020151514A1 (en) 1998-05-11 2002-10-17 Paz Einat Genes associated with mechanical stress, expression products therefrom, and uses thereof
US6219587B1 (en) 1998-05-27 2001-04-17 Nextrx Corporation Automated pharmaceutical management and dispensing system
US6250928B1 (en) 1998-06-22 2001-06-26 Massachusetts Institute Of Technology Talking facial display method and apparatus
JP4328997B2 (en) 1998-06-23 2009-09-09 ソニー株式会社 Robot device
JP3792901B2 (en) 1998-07-08 2006-07-05 キヤノン株式会社 Camera control system and control method thereof
US6452915B1 (en) 1998-07-10 2002-09-17 Malibu Networks, Inc. IP-flow classification in a wireless point to multi-point (PTMP) transmission system
US6266577B1 (en) 1998-07-13 2001-07-24 Gte Internetworking Incorporated System for dynamically reconfiguring a wireless robot network
JP3487186B2 (en) 1998-07-28 2004-01-13 日本ビクター株式会社 Network remote control system
JP4100773B2 (en) 1998-09-04 2008-06-11 富士通株式会社 Robot remote control method and system
JP2000153476A (en) 1998-09-14 2000-06-06 Honda Motor Co Ltd Leg type movable robot
US6594527B2 (en) 1998-09-18 2003-07-15 Nexmed Holdings, Inc. Electrical stimulation apparatus and method
US6175779B1 (en) 1998-09-29 2001-01-16 J. Todd Barrett Computerized unit dose medication dispensing cart
US6457043B1 (en) 1998-10-23 2002-09-24 Verizon Laboratories Inc. Speaker identifier for multi-party conference
WO2000025516A1 (en) 1998-10-24 2000-05-04 Vtel Corporation Graphical menu items for a user interface menu of a video teleconferencing system
US6602469B1 (en) 1998-11-09 2003-08-05 Lifestream Technologies, Inc. Health monitoring and diagnostic device and network-based health assessment and medical records maintenance system
US6468265B1 (en) 1998-11-20 2002-10-22 Intuitive Surgical, Inc. Performing cardiac surgery without cardioplegia
US6852107B2 (en) 2002-01-16 2005-02-08 Computer Motion, Inc. Minimally invasive surgical training using robotics and tele-collaboration
US8527094B2 (en) 1998-11-20 2013-09-03 Intuitive Surgical Operations, Inc. Multi-user medical robotic system for collaboration or training in minimally invasive surgical procedures
US6951535B2 (en) 2002-01-16 2005-10-04 Intuitive Surgical, Inc. Tele-medicine system that transmits an entire state of a subsystem
US6232735B1 (en) 1998-11-24 2001-05-15 Thames Co., Ltd. Robot remote control system and robot image remote control processing system
JP2000196876A (en) 1998-12-28 2000-07-14 Canon Inc Image processing system, image forming controller, image forming device, control method for image processing system, control method for the image forming controller, and control method for the image forming device
US6170929B1 (en) 1998-12-02 2001-01-09 Ronald H. Wilson Automated medication-dispensing cart
US6535182B2 (en) 1998-12-07 2003-03-18 Koninklijke Philips Electronics N.V. Head-mounted projection display system
US6799065B1 (en) 1998-12-08 2004-09-28 Intuitive Surgical, Inc. Image shifting apparatus and method for a telerobotic system
US6522906B1 (en) 1998-12-08 2003-02-18 Intuitive Surgical, Inc. Devices and methods for presenting and regulating auxiliary information on an image display of a telesurgical system to assist an operator in performing a surgical procedure
ATE460127T1 (en) 1998-12-08 2010-03-15 Intuitive Surgical Inc TELEROBOT FOR MOVING PICTURES
JP3980205B2 (en) 1998-12-17 2007-09-26 コニカミノルタホールディングス株式会社 Work robot
US6259956B1 (en) 1999-01-14 2001-07-10 Rawl & Winstead, Inc. Method and apparatus for site management
US6463352B1 (en) 1999-01-21 2002-10-08 Amada Cutting Technologies, Inc. System for management of cutting machines
JP4366617B2 (en) 1999-01-25 2009-11-18 ソニー株式会社 Robot device
US6338013B1 (en) 1999-03-19 2002-01-08 Bryan John Ruffner Multifunctional mobile appliance
WO2000060522A2 (en) 1999-04-01 2000-10-12 Acist Medical Systems, Inc. An integrated medical information management and medical device control system and method
US7007235B1 (en) 1999-04-02 2006-02-28 Massachusetts Institute Of Technology Collaborative agent interaction control and synchronization system
US6594552B1 (en) 1999-04-07 2003-07-15 Intuitive Surgical, Inc. Grip strength with tactile feedback for robotic surgery
US6424885B1 (en) 1999-04-07 2002-07-23 Intuitive Surgical, Inc. Camera referenced control in a minimally invasive surgical apparatus
US6788651B1 (en) 1999-04-21 2004-09-07 Mindspeed Technologies, Inc. Methods and apparatus for data communications on packet networks
US6346950B1 (en) 1999-05-20 2002-02-12 Compaq Computer Corporation System and method for displaying images using anamorphic video
US6781606B2 (en) 1999-05-20 2004-08-24 Hewlett-Packard Development Company, L.P. System and method for displaying images using foveal video
US6292713B1 (en) 1999-05-20 2001-09-18 Compaq Computer Corporation Robotic telepresence system
US6523629B1 (en) 1999-06-07 2003-02-25 Sandia Corporation Tandem mobile robot system
US7256708B2 (en) 1999-06-23 2007-08-14 Visicu, Inc. Telecommunications network for remote patient monitoring
US6804656B1 (en) 1999-06-23 2004-10-12 Visicu, Inc. System and method for providing continuous, expert network critical care services from a remote location(s)
US6304050B1 (en) 1999-07-19 2001-10-16 Steven B. Skaar Means and method of robot control relative to an arbitrary surface using camera-space manipulation
US7606164B2 (en) 1999-12-14 2009-10-20 Texas Instruments Incorporated Process of increasing source rate on acceptable side of threshold
US6540039B1 (en) 1999-08-19 2003-04-01 Massachusetts Institute Of Technology Omnidirectional vehicle with offset wheel pairs
ATE306096T1 (en) 1999-08-31 2005-10-15 Swisscom Ag MOBILE ROBOT AND CONTROL METHOD FOR A MOBILE ROBOT
US6810411B1 (en) 1999-09-13 2004-10-26 Intel Corporation Method and system for selecting a host in a communications network
EP1090722B1 (en) 1999-09-16 2007-07-25 Fanuc Ltd Control system for synchronously cooperative operation of a plurality of robots
JP2001094989A (en) 1999-09-20 2001-04-06 Toshiba Corp Dynamic image transmitter and dynamic image communications equipment
JP2001088124A (en) 1999-09-22 2001-04-03 Japan Steel Works Ltd:The Apparatus for cooling and taking off strand
US6480762B1 (en) 1999-09-27 2002-11-12 Olympus Optical Co., Ltd. Medical apparatus supporting system
US7123292B1 (en) 1999-09-29 2006-10-17 Xerox Corporation Mosaicing images with an offset lens
US6449762B1 (en) 1999-10-07 2002-09-10 Synplicity, Inc. Maintaining correspondence between text and schematic representations of circuit elements in circuit synthesis
US6798753B1 (en) 1999-10-14 2004-09-28 International Business Machines Corporation Automatically establishing conferences from desktop applications over the Internet
US7467211B1 (en) 1999-10-18 2008-12-16 Cisco Technology Inc. Remote computer system management through an FTP internet connection
WO2001031861A1 (en) 1999-10-22 2001-05-03 Nomadix, Inc. Systems and methods for dynamic bandwidth management on a per subscriber basis in a communications network
JP4207336B2 (en) 1999-10-29 2009-01-14 ソニー株式会社 Charging system for mobile robot, method for searching for charging station, mobile robot, connector, and electrical connection structure
US6524239B1 (en) 1999-11-05 2003-02-25 Wcr Company Apparatus for non-intrusively measuring health parameters of a subject and method of use thereof
AT409238B (en) 1999-11-05 2002-06-25 Fronius Schweissmasch Prod DETECTING AND/OR DETERMINING USER AUTHORIZATIONS BY MEANS OF A TRANSPONDER, A FINGERPRINT IDENTIFIER OR THE LIKE
JP2001134309A (en) 1999-11-09 2001-05-18 Mitsubishi Electric Corp Robot operation terminal and remote control system for robot
JP2001142512A (en) 1999-11-16 2001-05-25 Mitsubishi Electric Corp Remote operation system for robot
US6459955B1 (en) 1999-11-18 2002-10-01 The Procter & Gamble Company Home cleaning robot
JP2001147718A (en) 1999-11-19 2001-05-29 Sony Corp Information communication robot device, information communication method and information communication robot system
US6374155B1 (en) 1999-11-24 2002-04-16 Personal Robotics, Inc. Autonomous multi-platform robot system
US6443359B1 (en) 1999-12-03 2002-09-03 Diebold, Incorporated Automated transaction system and method
US7156809B2 (en) 1999-12-17 2007-01-02 Q-Tec Systems Llc Method and apparatus for health and disease management combining patient data monitoring with wireless internet connectivity
US20010051881A1 (en) 1999-12-22 2001-12-13 Aaron G. Filler System, method and article of manufacture for managing a medical services network
EP1239805B1 (en) 1999-12-23 2006-06-14 Hill-Rom Services, Inc. Surgical theater system
JP2001179663A (en) 1999-12-24 2001-07-03 Sony Corp Leg type mobile robot, its control method and charging station
JP2001188124A (en) 1999-12-27 2001-07-10 Ge Toshiba Silicones Co Ltd Saponified cellulose acetate composite polarizing base plate, its manufacturing method and liquid crystal display device
US7389252B2 (en) 2000-01-06 2008-06-17 Anne E. Robb Recursive method and system for accessing classification information
JP2001198868A (en) 2000-01-17 2001-07-24 Atr Media Integration & Communications Res Lab Robot for cyber two man comic dialogue and support device
JP3791663B2 (en) 2000-01-17 2006-06-28 富士電機ホールディングス株式会社 Omnidirectional moving vehicle and its control method
JP2001198865A (en) 2000-01-20 2001-07-24 Toshiba Corp Bipedal robot device and its operating method
JP2001222309A (en) 2000-02-10 2001-08-17 Yaskawa Electric Corp Robot controller
JP2001252884A (en) 2000-03-06 2001-09-18 Matsushita Electric Ind Co Ltd Robot, robot system, and method of controlling robot
US20010054071A1 (en) 2000-03-10 2001-12-20 Loeb Gerald E. Audio/video conference system for electronic caregiving
FR2806561B1 (en) 2000-03-16 2002-08-09 France Telecom HOME TELE ASSISTANCE SYSTEM
US6369847B1 (en) 2000-03-17 2002-04-09 Emtel, Inc. Emergency facility video-conferencing system
KR100351816B1 (en) 2000-03-24 2002-09-11 LG Electronics Inc. Apparatus for converting format
US20010048464A1 (en) 2000-04-07 2001-12-06 Barnett Howard S. Portable videoconferencing system
US6590604B1 (en) 2000-04-07 2003-07-08 Polycom, Inc. Personal videoconferencing system having distributed processing architecture
JP3511088B2 (en) 2000-04-10 2004-03-29 独立行政法人航空宇宙技術研究所 Pressure distribution sensor for multi-joint care robot control
JP4660879B2 (en) 2000-04-27 2011-03-30 ソニー株式会社 Information providing apparatus and method, and program
EP1279081B1 (en) 2000-05-01 2012-01-04 iRobot Corporation Method and system for remote control of mobile robot
US6292714B1 (en) 2000-05-12 2001-09-18 Fujitsu Limited Robot cooperation device, and robot cooperation program storage medium
CA2411187A1 (en) 2000-05-24 2001-11-29 Virtual Clinic, Inc. Method and apparatus for providing personalized services
US7215786B2 (en) 2000-06-09 2007-05-08 Japan Science And Technology Agency Robot acoustic device and robot acoustic system
JP2001353678A (en) 2000-06-12 2001-12-25 Sony Corp Authoring system and method and storage medium
JP3513084B2 (en) 2000-06-14 2004-03-31 株式会社東芝 Information processing system, information equipment and information processing method
JP2002000574A (en) 2000-06-22 2002-01-08 Matsushita Electric Ind Co Ltd Robot for nursing care support and nursing care support system
US7782363B2 (en) 2000-06-27 2010-08-24 Front Row Technologies, Llc Providing multiple video perspectives of activities through a data network to a remote multimedia server for selective display by remote viewing audiences
US6629028B2 (en) 2000-06-29 2003-09-30 Riken Method and system of optical guidance of mobile body
US20020086262A1 (en) 2000-07-13 2002-07-04 Rainey J. Tim Parabolic light source with incorporated photographic device
US6539284B2 (en) * 2000-07-25 2003-03-25 Axonn Robotics, Llc Socially interactive autonomous robot
US6746443B1 (en) 2000-07-27 2004-06-08 Intuitive Surgical Inc. Roll-pitch-roll surgical tool
US8751248B2 (en) 2000-07-28 2014-06-10 Visual Telecommunications Network, Inc. Method, apparatus, and medium using a master control file for computer software interoperability between disparate operating systems
US7886054B1 (en) 2000-10-11 2011-02-08 Siddhartha Nag Graphical user interface (GUI) for administering a network implementing media aggregation
CA2416253A1 (en) 2000-07-28 2002-02-07 American Calcar Inc. Technique for effective organization and communication of information
US6738076B1 (en) 2000-07-31 2004-05-18 Hewlett-Packard Development Company, L.P. Method and system for maintaining persistence of graphical markups in a collaborative graphical viewing system
JP2002046088A (en) 2000-08-03 2002-02-12 Matsushita Electric Ind Co Ltd Robot device
US20020027597A1 (en) 2000-09-05 2002-03-07 John Sachau System for mobile videoconferencing
US20070273751A1 (en) 2000-09-05 2007-11-29 Sachau John A System and methods for mobile videoconferencing
EP1189169A1 (en) 2000-09-07 2002-03-20 STMicroelectronics S.r.l. A VLSI architecture, particularly for motion estimation applications
US6523905B2 (en) 2000-09-08 2003-02-25 Hitachi Construction Machinery Co., Ltd. Crawler carrier having an engine, a hydraulic pump and a heat exchanger positioned in a lateral direction
WO2002023403A2 (en) 2000-09-11 2002-03-21 Pinotage, Llc. System and method for obtaining and utilizing maintenance information
US20020091659A1 (en) 2000-09-12 2002-07-11 Beaulieu Christopher F. Portable viewing of medical images using handheld computers
KR100373323B1 (en) 2000-09-19 2003-02-25 한국전자통신연구원 Method of multipoint video conference in video conferencing system
US6741911B2 (en) 2000-09-20 2004-05-25 John Castle Simmons Natural robot control
JP2002101333A (en) 2000-09-26 2002-04-05 Casio Comput Co Ltd Remote controller and remote control service system, and recording medium for recording program for them
AU2001296925A1 (en) 2000-09-28 2002-04-08 Vigilos, Inc. Method and process for configuring a premises for monitoring
US20030060808A1 (en) 2000-10-04 2003-03-27 Wilk Peter J. Telemedical method and system
JP2004538538A (en) 2000-10-05 2004-12-24 シーメンス コーポレイト リサーチ インコーポレイテツド Intraoperative image-guided neurosurgery and surgical devices with augmented reality visualization
US20050149364A1 (en) 2000-10-06 2005-07-07 Ombrellaro Mark P. Multifunction telemedicine software with integrated electronic medical record
US6674259B1 (en) 2000-10-06 2004-01-06 Innovation First, Inc. System and method for managing and controlling a robot competition
JP2002112970A (en) 2000-10-10 2002-04-16 Daito Seiki Co Ltd Device and method for observing surface of skin
JP2002113675A (en) 2000-10-11 2002-04-16 Sony Corp Robot control system and introducing method for robot controlling software
WO2002033641A2 (en) 2000-10-16 2002-04-25 Cardionow, Inc. Medical image capture system and method
US8348675B2 (en) 2000-10-19 2013-01-08 Life Success Academy Apparatus and method for delivery of instructional information
US6636780B1 (en) 2000-11-07 2003-10-21 Mdg Medical Inc. Medication dispensing system including medicine cabinet and tray therefor
JP4310916B2 (en) 2000-11-08 2009-08-12 コニカミノルタホールディングス株式会社 Video display device
US7219364B2 (en) 2000-11-22 2007-05-15 International Business Machines Corporation System and method for selectable semantic codec pairs for very low data-rate video transmission
US20020104094A1 (en) 2000-12-01 2002-08-01 Bruce Alexander System and method for processing video data utilizing motion detection and subdivided video fields
US6543899B2 (en) 2000-12-05 2003-04-08 Eastman Kodak Company Auto-stereoscopic viewing system using mounted projection
WO2002046901A1 (en) 2000-12-06 2002-06-13 Vigilos, Inc. System and method for implementing open-protocol remote device control
US6411209B1 (en) 2000-12-06 2002-06-25 Koninklijke Philips Electronics N.V. Method and apparatus to select the best video frame to transmit to a remote station for CCTV based residential security monitoring
US6791550B2 (en) 2000-12-12 2004-09-14 Enounce, Inc. Management of presentation time in a digital media presentation system with variable rate presentation capability
US20040260790A1 (en) 2000-12-21 2004-12-23 Ge Medical System Global Technology Company, Llc Method and apparatus for remote or collaborative control of an imaging system
US7339605B2 (en) 2004-04-16 2008-03-04 Polycom, Inc. Conference link between a speakerphone and a video conference unit
US6442451B1 (en) 2000-12-28 2002-08-27 Robotic Workspace Technologies, Inc. Versatile robot control system
US20020085030A1 (en) 2000-12-29 2002-07-04 Jamal Ghani Graphical user interface for an interactive collaboration system
KR20020061961A (en) 2001-01-19 2002-07-25 사성동 Intelligent pet robot
JP2002342759A (en) 2001-01-30 2002-11-29 Nec Corp System and method for providing information and its program
US20020106998A1 (en) 2001-02-05 2002-08-08 Presley Herbert L. Wireless rich media conferencing
JP3736358B2 (en) 2001-02-08 2006-01-18 株式会社チューオー Wall material
US20020109775A1 (en) 2001-02-09 2002-08-15 Excellon Automation Co. Back-lighted fiducial recognition system and method of use
JP4182464B2 (en) 2001-02-09 2008-11-19 富士フイルム株式会社 Video conferencing system
US7184559B2 (en) 2001-02-23 2007-02-27 Hewlett-Packard Development Company, L.P. System and method for audio telepresence
EP1386472A4 (en) 2001-02-27 2009-05-27 Anthrotronix Inc Robotic apparatus and wireless communication system
US20020128985A1 (en) 2001-03-09 2002-09-12 Brad Greenwald Vehicle value appraisal system
US20020133062A1 (en) 2001-03-15 2002-09-19 Arling Robert Stanley Embedded measurement values in medical reports
JP4739556B2 (en) 2001-03-27 2011-08-03 株式会社安川電機 Remote adjustment and abnormality judgment device for control target
US6965394B2 (en) 2001-03-30 2005-11-15 Koninklijke Philips Electronics N.V. Remote camera control device
US20020143923A1 (en) 2001-04-03 2002-10-03 Vigilos, Inc. System and method for managing a device network
JP2002305743A (en) 2001-04-04 2002-10-18 Rita Security Engineering:Kk Remote moving picture transmission system compatible with adsl
US20030199000A1 (en) 2001-08-20 2003-10-23 Valkirs Gunars E. Diagnostic markers of stroke and cerebral injury and methods of use thereof
US6920373B2 (en) 2001-04-13 2005-07-19 Board Of Trustees Operating Michigan State University Synchronization and task control of real-time internet based super-media
AU767561B2 (en) 2001-04-18 2003-11-13 Samsung Kwangju Electronics Co., Ltd. Robot cleaner, system employing the same and method for reconnecting to external recharging device
KR100437372B1 (en) 2001-04-18 2004-06-25 삼성광주전자 주식회사 Robot cleaning System using by mobile communication network
US7111980B2 (en) 2001-04-19 2006-09-26 Honeywell International Inc. System and method using thermal image analysis and slope threshold classification for polygraph testing
JP2002321180A (en) 2001-04-24 2002-11-05 Matsushita Electric Ind Co Ltd Robot control system
AU2002305392A1 (en) 2001-05-02 2002-11-11 Bitstream, Inc. Methods, systems, and programming for producing and displaying subpixel-optimized images and digital content including such images
US7202851B2 (en) 2001-05-04 2007-04-10 Immersion Medical Inc. Haptic interface for palpation simulation
US6723086B2 (en) 2001-05-07 2004-04-20 Logiq Wireless Solutions, Inc. Remote controlled transdermal medication delivery device
US7242306B2 (en) 2001-05-08 2007-07-10 Hill-Rom Services, Inc. Article locating and tracking apparatus and method
JP2004536634A (en) 2001-05-25 2004-12-09 レゴ エー/エス Robot toy programming
JP2002354551A (en) 2001-05-25 2002-12-06 Mitsubishi Heavy Ind Ltd Robot service providing method and system thereof
JP2002352354A (en) 2001-05-30 2002-12-06 Denso Corp Remote care method
JP2002355779A (en) 2001-06-01 2002-12-10 Sharp Corp Robot type interface device and control method for the same
US6763282B2 (en) 2001-06-04 2004-07-13 Time Domain Corp. Method and system for controlling a robot
US20020186243A1 (en) 2001-06-06 2002-12-12 Robert Ellis Method and system for providing combined video and physiological data over a communication network for patient monitoring
US6507773B2 (en) 2001-06-14 2003-01-14 Sharper Image Corporation Multi-functional robot with remote and video system
US6995664B1 (en) 2001-06-20 2006-02-07 Jeffrey Darling Remote supervision system and method
US6604021B2 (en) 2001-06-21 2003-08-05 Advanced Telecommunications Research Institute International Communication robot
WO2003000015A2 (en) 2001-06-25 2003-01-03 Science Applications International Corporation Identification by analysis of physiometric variation
US7483867B2 (en) 2001-06-26 2009-01-27 Intuition Intelligence, Inc. Processing device with intuitive learning capability
GB2377117B (en) 2001-06-27 2004-08-18 Cambridge Broadband Ltd Method and apparatus for providing communications bandwidth
NO20013450L (en) 2001-07-11 2003-01-13 Simsurgery As Systems and methods for interactive training of procedures
WO2003013140A1 (en) 2001-07-25 2003-02-13 Stevenson Neil J A camera control apparatus and method
US7831575B2 (en) 2001-08-02 2010-11-09 Bridge Works, Ltd Library virtualisation module
US6580246B2 (en) 2001-08-13 2003-06-17 Steven Jacobs Robot touch shield
US6667592B2 (en) 2001-08-13 2003-12-23 Intellibot, L.L.C. Mapped robot system
JP4689107B2 (en) 2001-08-22 2011-05-25 本田技研工業株式会社 Autonomous robot
US6952470B1 (en) 2001-08-23 2005-10-04 Bellsouth Intellectual Property Corp. Apparatus and method for managing a call center
WO2003019450A2 (en) 2001-08-24 2003-03-06 March Networks Corporation Remote health-monitoring system and method
JP2003070804A (en) 2001-09-05 2003-03-11 Olympus Optical Co Ltd Remote medical support system
US6728599B2 (en) 2001-09-07 2004-04-27 Computer Motion, Inc. Modularity system for computer assisted surgery
JP4378072B2 (en) 2001-09-07 2009-12-02 キヤノン株式会社 Electronic device, imaging device, portable communication device, video display control method and program
WO2003022142A2 (en) 2001-09-13 2003-03-20 The Boeing Company Method for transmitting vital health statistics to a remote location form an aircraft
US20030053662A1 (en) 2001-09-19 2003-03-20 Koninklijke Philips Electronics N.V. Method and apparatus for digital encoding and operator identification using stored user image
US6587750B2 (en) 2001-09-25 2003-07-01 Intuitive Surgical, Inc. Removable infinite roll master grip handle and touch sensor for robotic surgery
JP2003110652A (en) 2001-10-01 2003-04-11 Matsushita Graphic Communication Systems Inc Method of reinitializing adsl modem and the adsl modem
US6840904B2 (en) 2001-10-11 2005-01-11 Jason Goldberg Medical monitoring device and system
US7058689B2 (en) 2001-10-16 2006-06-06 Sprint Communications Company L.P. Sharing of still images within a video telephony call
US7307653B2 (en) 2001-10-19 2007-12-11 Nokia Corporation Image stabilizer for a microcamera module of a handheld device, and method for stabilizing a microcamera module of a handheld device
WO2003036557A1 (en) 2001-10-22 2003-05-01 Intel Zao Method and apparatus for background segmentation based on motion localization
US20030080901A1 (en) 2001-10-25 2003-05-01 Koninklijke Philips Electronics N.V. RFID navigation system
JP2003136450A (en) 2001-10-26 2003-05-14 Communication Research Laboratory Remote control system of robot arm by providing audio information
JP2003205483A (en) 2001-11-07 2003-07-22 Sony Corp Robot system and control method for robot device
US8199188B2 (en) 2001-11-09 2012-06-12 Karl Storz Imaging, Inc. Video imaging system with a camera control unit
US20030152145A1 (en) 2001-11-15 2003-08-14 Kevin Kawakita Crash prevention recorder (CPR)/video-flight data recorder (V-FDR)/cockpit-cabin voice recorder for light aircraft with an add-on option for large commercial jets
US7317685B1 (en) 2001-11-26 2008-01-08 Polycom, Inc. System and method for dynamic bandwidth allocation for videoconferencing in lossy packet switched networks
US6785589B2 (en) 2001-11-30 2004-08-31 Mckesson Automation, Inc. Dispensing cabinet with unit dose dispensing drawer
US20050101841A9 (en) 2001-12-04 2005-05-12 Kimberly-Clark Worldwide, Inc. Healthcare networks with biosensors
US7539504B2 (en) 2001-12-05 2009-05-26 Espre Solutions, Inc. Wireless telepresence collaboration system
US6839612B2 (en) 2001-12-07 2005-01-04 Intuitive Surgical, Inc. Microwrist system for surgical procedures
JP3709393B2 (en) 2001-12-14 2005-10-26 富士ソフトエービーシ株式会社 Remote control system and remote control method
US7227864B2 (en) 2001-12-17 2007-06-05 Microsoft Corporation Methods and systems for establishing communications through firewalls and network address translators
US7305114B2 (en) 2001-12-26 2007-12-04 Cognex Technology And Investment Corporation Human/machine interface for a machine vision sensor and method for installing and operating the same
US7082497B2 (en) 2001-12-28 2006-07-25 Hewlett-Packard Development Company, L.P. System and method for managing a moveable media library with library partitions
US7647320B2 (en) 2002-01-18 2010-01-12 Peoplechart Corporation Patient directed system and method for managing medical information
US7167448B2 (en) 2002-02-04 2007-01-23 Sun Microsystems, Inc. Prioritization of remote services messages within a low bandwidth environment
US6693585B1 (en) 2002-02-07 2004-02-17 Aradiant Corporation Self-contained selectively activated mobile object position reporting device with reduced power consumption and minimized wireless service fees.
US6784916B2 (en) 2002-02-11 2004-08-31 Telbotics Inc. Video conferencing apparatus
AU2002335152A1 (en) 2002-02-13 2003-09-04 Toudai Tlo, Ltd. Robot-phone
JP2003241807A (en) 2002-02-19 2003-08-29 Yaskawa Electric Corp Robot control unit
JP4100934B2 (en) 2002-02-28 2008-06-11 シャープ株式会社 Composite camera system, zoom camera control method, and zoom camera control program
WO2003077101A2 (en) 2002-03-06 2003-09-18 Z-Kat, Inc. System and method for using a haptic device in combination with a computer-assisted surgery system
US7860680B2 (en) 2002-03-07 2010-12-28 Microstrain, Inc. Robotic system for powering and interrogating sensors
US6915871B2 (en) 2002-03-12 2005-07-12 Dan Gavish Method and apparatus for improving child safety and adult convenience while using a mobile ride-on toy
US6769771B2 (en) 2002-03-14 2004-08-03 Entertainment Design Workshop, Llc Method and apparatus for producing dynamic imagery in a visual medium
JP3945279B2 (en) 2002-03-15 2007-07-18 ソニー株式会社 Obstacle recognition apparatus, obstacle recognition method, obstacle recognition program, and mobile robot apparatus
EP1485008A1 (en) 2002-03-18 2004-12-15 Medic4all AG Monitoring method and monitoring system for assessing physiological parameters of a subject
US7343565B2 (en) 2002-03-20 2008-03-11 Mercurymd, Inc. Handheld device graphical user interfaces for displaying patient medical records
KR100483790B1 (en) 2002-03-22 2005-04-20 한국과학기술연구원 Multi-degree of freedom telerobotic system for micro assembly
JP4032793B2 (en) 2002-03-27 2008-01-16 ソニー株式会社 Charging system, charging control method, robot apparatus, charging control program, and recording medium
US7117067B2 (en) 2002-04-16 2006-10-03 Irobot Corporation System and methods for adaptive control of robotic devices
US20030231244A1 (en) 2002-04-22 2003-12-18 Bonilla Victor G. Method and system for manipulating a field of view of a video image from a remote vehicle
US20040172301A1 (en) 2002-04-30 2004-09-02 Mihai Dan M. Remote multi-purpose user interface for a healthcare system
US6898484B2 (en) 2002-05-01 2005-05-24 Dorothy Lemelson Robotic manufacturing and assembly with relative radio positioning using radio based location determination
CN100379391C (en) 2002-05-07 2008-04-09 国立大学法人京都大学 Medical cockpit system
US6836701B2 (en) 2002-05-10 2004-12-28 Royal Appliance Mfg. Co. Autonomous multi-platform robotic system
JP4081747B2 (en) 2002-05-17 2008-04-30 技研株式会社 Robot drive control method and apparatus
AU2003301240A1 (en) 2002-05-20 2004-05-04 William O. Kling Skin cleanser compositions and methods of use
AU2003239555A1 (en) 2002-05-20 2003-12-12 Vigilos, Inc. System and method for providing data communication in a device network
US6807461B2 (en) 2002-05-22 2004-10-19 Kuka Roboter Gmbh Coordinated robot control from multiple remote instruction sources
JP4043289B2 (en) * 2002-05-27 2008-02-06 シャープ株式会社 Search robot system
US6743721B2 (en) 2002-06-10 2004-06-01 United Microelectronics Corp. Method and system for making cobalt silicide
KR100478452B1 (en) 2002-06-12 2005-03-23 삼성전자주식회사 Localization apparatus and method for mobile robot
US20030232649A1 (en) 2002-06-18 2003-12-18 Gizis Alexander C.M. Gaming system and method
JP3910112B2 (en) 2002-06-21 2007-04-25 シャープ株式会社 Camera phone
US7181455B2 (en) 2002-06-27 2007-02-20 Sun Microsystems, Inc. Bandwidth management for remote services system
US6752539B2 (en) 2002-06-28 2004-06-22 International Business Machines Corporation Apparatus and system for providing optical bus interprocessor interconnection
KR100556612B1 (en) 2002-06-29 2006-03-06 삼성전자주식회사 Apparatus and method of localization using laser
DE10231388A1 (en) 2002-07-08 2004-02-05 Alfred Kärcher Gmbh & Co. Kg Tillage system
DE10231391A1 (en) 2002-07-08 2004-02-12 Alfred Kärcher Gmbh & Co. Kg Tillage system
FR2842320A1 (en) 2002-07-12 2004-01-16 Thomson Licensing Sa MULTIMEDIA DATA PROCESSING DEVICE
US7084809B2 (en) 2002-07-15 2006-08-01 Qualcomm, Incorporated Apparatus and method of position determination using shared information
JP2004042230A (en) 2002-07-15 2004-02-12 Kawasaki Heavy Ind Ltd Remote control method and remote control system of robot controller
US20120072024A1 (en) 2002-07-25 2012-03-22 Yulun Wang Telerobotic system with dual application screen presentation
US20040162637A1 (en) 2002-07-25 2004-08-19 Yulun Wang Medical tele-robotic system with a master remote station with an arbitrator
US7593030B2 (en) 2002-07-25 2009-09-22 Intouch Technologies, Inc. Tele-robotic videoconferencing in a corporate environment
US6925357B2 (en) 2002-07-25 2005-08-02 Intouch Health, Inc. Medical tele-robotic system
DE10234233A1 (en) 2002-07-27 2004-02-05 Kuka Roboter Gmbh Process for the exchange of data between controls of machines, in particular robots
US6975229B2 (en) 2002-08-09 2005-12-13 Battelle Memorial Institute K1-53 System and method for acquisition management of subject position information
EP1388813A2 (en) 2002-08-09 2004-02-11 Matsushita Electric Industrial Co., Ltd. Apparatus and method for image watermarking
US7523505B2 (en) 2002-08-16 2009-04-21 Hx Technologies, Inc. Methods and systems for managing distributed digital medical data
US20050288571A1 (en) 2002-08-20 2005-12-29 Welch Allyn, Inc. Mobile medical workstation
US6753899B2 (en) 2002-09-03 2004-06-22 Audisoft Method and apparatus for telepresence
WO2004025947A2 (en) 2002-09-13 2004-03-25 Irobot Corporation A navigational control system for a robotic device
US8182440B2 (en) 2002-09-27 2012-05-22 Baxter International Inc. Dialysis machine having combination display and handle
US20140254896A1 (en) * 2011-07-18 2014-09-11 Tiger T G Zhou Unmanned drone, robot system for delivering mail, goods, humanoid security, crisis negotiation, mobile payments, smart humanoid mailbox and wearable personal exoskeleton heavy load flying machine
JP2004124824A (en) 2002-10-02 2004-04-22 Toyota Motor Corp Secondary air supply device
US20040065073A1 (en) 2002-10-08 2004-04-08 Ingersoll-Rand Energy Systems Corporation Flexible recuperator mounting system
US7881658B2 (en) 2002-10-10 2011-02-01 Znl Enterprises, Llc Method and apparatus for entertainment and information services delivered via mobile telecommunication devices
US6804579B1 (en) 2002-10-16 2004-10-12 Abb, Inc. Robotic wash cell using recycled pure water
WO2004036371A2 (en) 2002-10-16 2004-04-29 Rocksteady Networks, Inc. System and method for dynamic bandwidth provisioning
WO2004040437A1 (en) 2002-10-28 2004-05-13 The General Hospital Corporation Tissue disorder imaging analysis
US6879879B2 (en) 2002-10-31 2005-04-12 Hewlett-Packard Development Company, L.P. Telepresence system with automatic user-surrogate height matching
US6920376B2 (en) 2002-10-31 2005-07-19 Hewlett-Packard Development Company, L.P. Mutually-immersive mobile telepresence system with user rotation and surrogate translation
US20040093409A1 (en) 2002-11-07 2004-05-13 Vigilos, Inc. System and method for external event determination utilizing an integrated information system
US8073304B2 (en) 2002-11-16 2011-12-06 Gregory Karel Rohlicek Portable recorded television viewer
KR100542340B1 (en) 2002-11-18 2006-01-11 삼성전자주식회사 home network system and method for controlling home network system
US7123974B1 (en) 2002-11-19 2006-10-17 Rockwell Software Inc. System and methodology providing audit recording and tracking in real time industrial controller environment
JP2004181229A (en) 2002-11-20 2004-07-02 Olympus Corp System and method for supporting remote operation
KR20040046071A (en) 2002-11-26 2004-06-05 삼성전자주식회사 Method for displaying antenna-bar of terminal
JP3885019B2 (en) 2002-11-29 2007-02-21 株式会社東芝 Security system and mobile robot
US20040172306A1 (en) 2002-12-02 2004-09-02 Recare, Inc. Medical data entry interface
AU2003289142A1 (en) * 2002-12-10 2004-06-30 Honda Motor Co., Ltd. Robot control device, robot control method, and robot control program
US6889120B2 (en) 2002-12-14 2005-05-03 Hewlett-Packard Development Company, L.P. Mutually-immersive mobile telepresence with gaze and eye contact preservation
US7015831B2 (en) 2002-12-17 2006-03-21 Evolution Robotics, Inc. Systems and methods for incrementally updating a pose of a mobile device calculated by visual simultaneous localization and mapping techniques
US20090030552A1 (en) 2002-12-17 2009-01-29 Japan Science And Technology Agency Robotics visual and auditory system
US6938167B2 (en) 2002-12-18 2005-08-30 America Online, Inc. Using trusted communication channel to combat user name/password theft
US7584019B2 (en) 2003-12-15 2009-09-01 Dako Denmark A/S Systems and methods for the automated pre-treatment and processing of biological samples
US20040135879A1 (en) 2003-01-03 2004-07-15 Stacy Marco A. Portable wireless indoor/outdoor camera
US6745115B1 (en) 2003-01-07 2004-06-01 Garmin Ltd. System, method and apparatus for searching geographic area using prioritized spatial order
CN101390098A (en) 2003-01-15 2009-03-18 英塔茨科技公司 5 degrees of freedom mobile robot
US7158859B2 (en) 2003-01-15 2007-01-02 Intouch Technologies, Inc. 5 degrees of freedom mobile robot
ITMI20030121A1 (en) 2003-01-27 2004-07-28 Giuseppe Donato MODULAR SURVEILLANCE SYSTEM FOR MONITORING OF CRITICAL ENVIRONMENTS.
US7404140B2 (en) 2003-01-31 2008-07-22 Siemens Medical Solutions Usa, Inc. System for managing form information for use by portable devices
US20040176118A1 (en) 2003-02-18 2004-09-09 Michael Strittmatter Service attribute based filtering system and method
US7158860B2 (en) 2003-02-24 2007-01-02 Intouch Technologies, Inc. Healthcare tele-robotic system which allows parallel remote station observation
US7171286B2 (en) 2003-02-24 2007-01-30 Intouch Technologies, Inc. Healthcare tele-robotic system with a robot that also functions as a remote station
US7388981B2 (en) 2003-02-27 2008-06-17 Hewlett-Packard Development Company, L.P. Telepresence system with automatic preservation of user head size
JP2004261941A (en) 2003-03-04 2004-09-24 Sharp Corp Communication robot and communication system
US7262573B2 (en) 2003-03-06 2007-08-28 Intouch Technologies, Inc. Medical tele-robotic system with a head worn device
US7593546B2 (en) 2003-03-11 2009-09-22 Hewlett-Packard Development Company, L.P. Telepresence system with simultaneous automatic preservation of user height, perspective, and vertical gaze
US20050065813A1 (en) 2003-03-11 2005-03-24 Mishelevich David J. Online medical evaluation system
JP3879848B2 (en) * 2003-03-14 2007-02-14 松下電工株式会社 Autonomous mobile device
EP1627524A4 (en) 2003-03-20 2009-05-27 Ge Security Inc Systems and methods for multi-resolution image processing
JP4124682B2 (en) 2003-03-20 2008-07-23 日本放送協会 Camera control device
US20040205664A1 (en) 2003-03-25 2004-10-14 Prendergast Thomas V. Claim data and document processing system
JP2004298977A (en) 2003-03-28 2004-10-28 Sony Corp Action control device, action control method, action control program and mobile robot device
US6804580B1 (en) 2003-04-03 2004-10-12 Kuka Roboter Gmbh Method and control system for controlling a plurality of robots
US20040201602A1 (en) 2003-04-14 2004-10-14 Invensys Systems, Inc. Tablet computer system for industrial process design, supervisory control, and data management
CA2466371A1 (en) 2003-05-05 2004-11-05 Engineering Services Inc. Mobile robot hybrid communication link
WO2005008804A2 (en) 2003-05-08 2005-01-27 Power Estimate Company Apparatus and method for providing electrical energy generated from motion to an electrically powered device
US6856662B2 (en) 2003-05-13 2005-02-15 Framatome Anp, Inc. Remote examination of reactor nozzle J-groove welds
US20040236830A1 (en) 2003-05-15 2004-11-25 Steve Nelson Annotation management system
GB2391361B (en) 2003-05-23 2005-09-21 Bridgeworks Ltd Library element management
US20040240981A1 (en) 2003-05-29 2004-12-02 I-Scan Robotics Robot stacking system for flat glass
US6905941B2 (en) 2003-06-02 2005-06-14 International Business Machines Corporation Structure and method to fabricate ultra-thin Si channel devices
US20040252966A1 (en) 2003-06-10 2004-12-16 Holloway Marty M. Video storage and playback system and method
IL156556A (en) 2003-06-19 2010-02-17 Eran Schenker Life signs detector
US6888333B2 (en) 2003-07-02 2005-05-03 Intouch Health, Inc. Holonomic platform for a robot
US20050003330A1 (en) 2003-07-02 2005-01-06 Mehdi Asgarinejad Interactive virtual classroom
JP2005028066A (en) 2003-07-08 2005-02-03 Kikuta Sogo Kikaku:Kk Remote cleaning management system
US7154526B2 (en) 2003-07-11 2006-12-26 Fuji Xerox Co., Ltd. Telepresence system and method for video teleconferencing
US20050065435A1 (en) 2003-07-22 2005-03-24 John Rauch User interface for remote control of medical devices
US7995090B2 (en) 2003-07-28 2011-08-09 Fuji Xerox Co., Ltd. Video enabled tele-presence control host
US20050027567A1 (en) 2003-07-29 2005-02-03 Taha Amer Jamil System and method for health care data collection and management
US7395126B2 (en) 2003-07-29 2008-07-01 Far Touch, Inc. Remote control of wireless electromechanical device using a web browser
US7133062B2 (en) 2003-07-31 2006-11-07 Polycom, Inc. Graphical user interface for video feed on videoconference terminal
DE20312211U1 (en) 2003-08-07 2003-10-30 Yueh Wen Hsiang Swiveling USB plug
US7413040B2 (en) 2003-08-12 2008-08-19 White Box Robotics, Inc. Robot with removable mounting elements
JP2005059170A (en) 2003-08-18 2005-03-10 Honda Motor Co Ltd Information collecting robot
US7982763B2 (en) 2003-08-20 2011-07-19 King Simon P Portable pan-tilt camera and lighting unit for videoimaging, videoconferencing, production and recording
US7432949B2 (en) 2003-08-20 2008-10-07 Christophe Remy Mobile videoimaging, videocommunication, video production (VCVP) system
WO2005033832A2 (en) 2003-08-28 2005-04-14 University Of Maryland, Baltimore Techniques for delivering coordination data for a shared facility
US20050049898A1 (en) 2003-09-01 2005-03-03 Maiko Hirakawa Telemedicine system using the internet
US20070061041A1 (en) 2003-09-02 2007-03-15 Zweig Stephen E Mobile robot with wireless location sensing apparatus
US7174238B1 (en) 2003-09-02 2007-02-06 Stephen Eliot Zweig Mobile robotic system with web server and digital radio links
US20050065438A1 (en) 2003-09-08 2005-03-24 Miller Landon C.G. System and method of capturing and managing information during a medical diagnostic imaging procedure
JP2005103680A (en) 2003-09-29 2005-04-21 Toshiba Corp Monitoring system and monitoring robot
IL158276A (en) 2003-10-02 2010-04-29 Radvision Ltd Method for dynamically optimizing bandwidth allocation in variable bitrate (multi-rate) conferences
US7221386B2 (en) 2003-10-07 2007-05-22 Librestream Technologies Inc. Camera for communication of streaming media to a remote client
JP2005111083A (en) 2003-10-09 2005-04-28 Olympus Corp Medical integrated system
US7307651B2 (en) 2003-10-16 2007-12-11 Mark A. Chew Two-way mobile video/audio/data interactive companion (MVIC) system
KR100820743B1 (en) 2003-10-21 2008-04-10 삼성전자주식회사 Charging Apparatus For Mobile Robot
JP4325853B2 (en) 2003-10-31 2009-09-02 富士通株式会社 Communication adapter device
US7096090B1 (en) 2003-11-03 2006-08-22 Stephen Eliot Zweig Mobile robotic router with web server and digital radio links
US20050125083A1 (en) 2003-11-10 2005-06-09 Kiko Frederick J. Automation apparatus and methods
US20060010028A1 (en) 2003-11-14 2006-01-12 Herb Sorensen Video shopper tracking system and method
US7115102B2 (en) 2003-11-17 2006-10-03 Abbruscato Charles R Electronic stethoscope system
US7161322B2 (en) 2003-11-18 2007-01-09 Intouch Technologies, Inc. Robot with a manipulator arm
US7092001B2 (en) 2003-11-26 2006-08-15 Sap Aktiengesellschaft Video conferencing system with physical cues
GB2408655B (en) 2003-11-27 2007-02-28 Motorola Inc Communication system, communication units and method of ambience listening thereto
US7624166B2 (en) 2003-12-02 2009-11-24 Fuji Xerox Co., Ltd. System and methods for remote control of multiple display and devices
US7292912B2 (en) 2003-12-05 2007-11-06 Intouch Technologies, Inc. Door knocker control system for a remote controlled teleconferencing robot
US7813836B2 (en) 2003-12-09 2010-10-12 Intouch Technologies, Inc. Protocol for a remotely controlled videoconferencing robot
EP1704710A4 (en) 2003-12-24 2007-09-19 Walker Digital Llc Method and apparatus for automatically capturing and managing images
US7613313B2 (en) 2004-01-09 2009-11-03 Hewlett-Packard Development Company, L.P. System and method for control of audio field based on position of user
US8824730B2 (en) 2004-01-09 2014-09-02 Hewlett-Packard Development Company, L.P. System and method for control of video bandwidth based on pose of a person
US20050154265A1 (en) 2004-01-12 2005-07-14 Miro Xavier A. Intelligent nurse robot
US8229186B2 (en) 2004-01-15 2012-07-24 Algotec Systems Ltd. Vessel centerline determination
WO2005069890A2 (en) 2004-01-15 2005-08-04 Mega Robot, Inc. System and method for reconfiguring an autonomous robot
EP1705087A1 (en) 2004-01-16 2006-09-27 Yoshiaki Takida Robot arm type automatic car washing device
US7332890B2 (en) 2004-01-21 2008-02-19 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
WO2005074362A2 (en) 2004-02-03 2005-08-18 F. Robotics Aquisitions Ltd. Robot docking station
US7079173B2 (en) 2004-02-04 2006-07-18 Hewlett-Packard Development Company, L.P. Displaying a wide field of view video image
US20050182322A1 (en) 2004-02-17 2005-08-18 Liebel-Flarsheim Company Injector auto purge
US20050204438A1 (en) 2004-02-26 2005-09-15 Yulun Wang Graphical interface for a remote presence system
US7756614B2 (en) 2004-02-27 2010-07-13 Hewlett-Packard Development Company, L.P. Mobile device control system
CN1259891C (en) 2004-03-17 2006-06-21 哈尔滨工业大学 Robot-assisted bone-setting surgical medical system with interlocking intramedullary nail
JP2005270430A (en) 2004-03-25 2005-10-06 Funai Electric Co Ltd Station for mobile robot
DE602005013938D1 (en) 2004-03-29 2009-05-28 Philips Intellectual Property METHOD FOR CONTROLLING MULTIPLE APPLICATIONS AND DIALOG MANAGEMENT SYSTEM
US20050264649A1 (en) 2004-03-29 2005-12-01 Calvin Chang Mobile video-interpreting mounting system
US20050225634A1 (en) 2004-04-05 2005-10-13 Sam Brunetti Closed circuit TV security system
JP2005312096A (en) 2004-04-16 2005-11-04 Funai Electric Co Ltd Electric apparatus
TWI258259B (en) 2004-04-20 2006-07-11 Jason Yan Automatic charging system of mobile robotic electronic device
KR20070011495A (en) 2004-04-22 2007-01-24 프론트라인 로보틱스 Open control system architecture for mobile autonomous systems
US7769705B1 (en) 2004-05-21 2010-08-03 Ray Anthony Luechtefeld Method, artificially intelligent system and networked complex for facilitating group interactions
US20050278446A1 (en) 2004-05-27 2005-12-15 Jeffery Bryant Home improvement telepresence system and method
US7949616B2 (en) 2004-06-01 2011-05-24 George Samuel Levy Telepresence by human-assisted remote controlled devices and robots
US7011538B2 (en) 2004-06-02 2006-03-14 Elementech International Co., Ltd. Dual input charger with cable storing mechanism
CN100461212C (en) 2004-06-04 2009-02-11 松下电器产业株式会社 Display control device, display control method, program, and portable apparatus
US20050283414A1 (en) 2004-06-17 2005-12-22 Fernandes Curtis T Remote system management
JP4479372B2 (en) 2004-06-25 2010-06-09 ソニー株式会社 Environmental map creation method, environmental map creation device, and mobile robot device
US7292257B2 (en) 2004-06-28 2007-11-06 Microsoft Corporation Interactive viewpoint video system and process
US8244542B2 (en) 2004-07-01 2012-08-14 Emc Corporation Video surveillance
US7539187B2 (en) 2004-07-07 2009-05-26 Qvidium Technologies, Inc. System and method for low-latency content-sensitive forward error correction
US20060007943A1 (en) 2004-07-07 2006-01-12 Fellman Ronald D Method and system for providing site independent real-time multimedia transport over packet-switched networks
US8503340B1 (en) 2004-07-11 2013-08-06 Yongyong Xu WiFi phone system
US8077963B2 (en) 2004-07-13 2011-12-13 Yulun Wang Mobile robot with a head-based movement mapping scheme
US7551647B2 (en) 2004-07-19 2009-06-23 Qvidium Technologies, Inc. System and method for clock synchronization over packet-switched networks
US7979157B2 (en) 2004-07-23 2011-07-12 Mcmaster University Multi-purpose robotic operating system and method
US7319469B2 (en) 2004-07-26 2008-01-15 Sony Corporation Copy protection arrangement
JP4315872B2 (en) * 2004-07-28 2009-08-19 本田技研工業株式会社 Mobile robot controller
CN100394897C (en) 2004-08-03 2008-06-18 张毓笠 Compound vibrated ultrasonic bone surgery apparatus
JP4912577B2 (en) 2004-09-01 2012-04-11 本田技研工業株式会社 Biped walking robot charging system
US20060052676A1 (en) 2004-09-07 2006-03-09 Yulun Wang Tele-presence system that allows for remote monitoring/observation and review of a patient and their medical records
US9820658B2 (en) 2006-06-30 2017-11-21 Bao Q. Tran Systems and methods for providing interoperability among healthcare devices
US7502498B2 (en) 2004-09-10 2009-03-10 Available For Licensing Patient monitoring apparatus
FI116749B (en) 2004-09-14 2006-02-15 Nokia Corp A device comprising camera elements
US20060064212A1 (en) 2004-09-22 2006-03-23 Cycle Time Corporation Reactive automated guided vehicle vision guidance system
US20060066609A1 (en) 2004-09-28 2006-03-30 Iodice Arthur P Methods and systems for viewing geometry of an object model generated by a CAD tool
US8060376B2 (en) 2004-10-01 2011-11-15 Nomoreclipboard, Llc System and method for collection of community health and administrative data
US7720570B2 (en) 2004-10-01 2010-05-18 Redzone Robotics, Inc. Network architecture for remote robot with interchangeable tools
JP2006109094A (en) 2004-10-05 2006-04-20 Nec Software Kyushu Ltd Remote controller, remote control system, and remote control method
US7441953B2 (en) 2004-10-07 2008-10-28 University Of Florida Research Foundation, Inc. Radiographic medical imaging system using robot mounted source and sensor for dynamic image capture and tomography
US20060087746A1 (en) 2004-10-22 2006-04-27 Kenneth Lipow Remote augmented motor-sensory interface for surgery
KR100645379B1 (en) 2004-10-29 2006-11-15 삼성광주전자 주식회사 A robot controlling system and a robot control method
KR100703692B1 (en) 2004-11-03 2007-04-05 삼성전자주식회사 System, apparatus and method for improving readability of a map representing objects in space
US20060098573A1 (en) 2004-11-08 2006-05-11 Beer John C System and method for the virtual aggregation of network links
US20060173712A1 (en) 2004-11-12 2006-08-03 Dirk Joubert Portable medical information system
US8738891B1 (en) 2004-11-15 2014-05-27 Nvidia Corporation Methods and systems for command acceleration in a video processor via translation of scalar instructions into vector instructions
US7522528B2 (en) 2004-11-18 2009-04-21 Qvidium Technologies, Inc. Low-latency automatic repeat request packet recovery mechanism for media streams
US20060122482A1 (en) 2004-11-22 2006-06-08 Foresight Imaging Inc. Medical image acquisition system for receiving and transmitting medical images instantaneously and method of using the same
WO2007035185A2 (en) 2004-12-03 2007-03-29 Mckesson Automation Inc. Mobile point of care system and associated method and computer program product
US7400578B2 (en) 2004-12-16 2008-07-15 International Business Machines Corporation Method and system for throttling network transmissions using per-receiver bandwidth control at the application layer of the transmitting server
KR100499770B1 (en) 2004-12-30 2005-07-07 주식회사 아이오. 테크 Network based robot control system
KR100497310B1 (en) 2005-01-10 2005-06-23 주식회사 아이오. 테크 Selection and playback method of multimedia content having motion information in network based robot system
US7395508B2 (en) 2005-01-14 2008-07-01 International Business Machines Corporation Method and apparatus for providing an interactive presentation environment
US7222000B2 (en) 2005-01-18 2007-05-22 Intouch Technologies, Inc. Mobile videoconferencing platform with automatic shut-off features
JP2006203821A (en) 2005-01-24 2006-08-03 Sony Corp Automatic transmission system
US20060173708A1 (en) 2005-01-28 2006-08-03 Circle Of Care, Inc. System and method for providing health care
KR100636270B1 (en) 2005-02-04 2006-10-19 삼성전자주식회사 Home network system and control method thereof
US20060176832A1 (en) 2005-02-04 2006-08-10 Sean Miceli Adaptive bit-rate adjustment of multimedia communications channels using transport control protocol
US7944469B2 (en) 2005-02-14 2011-05-17 Vigilos, Llc System and method for using self-learning rules to enable adaptive security monitoring
US8670866B2 (en) * 2005-02-18 2014-03-11 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US20060189393A1 (en) 2005-02-22 2006-08-24 Albert Edery Real action network gaming system
US7475112B2 (en) 2005-03-04 2009-01-06 Microsoft Corporation Method and system for presenting a video conference using a three-dimensional object
US20060224781A1 (en) 2005-03-10 2006-10-05 Jen-Ming Tsao Method and apparatus for controlling a user interface of a consumer electronic device
US7644898B2 (en) 2005-03-28 2010-01-12 Compview Medical, Llc Medical boom with articulated arms and a base with preconfigured removable modular racks used for storing electronic and utility equipment
US20080285886A1 (en) 2005-03-29 2008-11-20 Matthew Emmerson Allen System For Displaying Images
WO2006113553A2 (en) 2005-04-15 2006-10-26 New Jersey Institute Of Technology Dynamic bandwidth allocation and service differentiation for broadband passive optical networks
US7680038B1 (en) 2005-04-25 2010-03-16 Electronic Arts, Inc. Dynamic bandwidth detection and response for online games
US7436143B2 (en) 2005-04-25 2008-10-14 M-Bots, Inc. Miniature surveillance robot
US7864209B2 (en) 2005-04-28 2011-01-04 Apple Inc. Audio processing in a multi-participant conference
US20060248210A1 (en) 2005-05-02 2006-11-02 Lifesize Communications, Inc. Controlling video display mode in a video conferencing system
US20070165106A1 (en) 2005-05-02 2007-07-19 Groves Randall D Distributed Videoconferencing Processing
EP1877981A4 (en) 2005-05-02 2009-12-16 Univ Virginia Systems, devices, and methods for interpreting movement
WO2006119396A2 (en) 2005-05-04 2006-11-09 Board Of Regents, The University Of Texas System System, method and program product for delivering medical services from a remote location
US7240879B1 (en) 2005-05-06 2007-07-10 United States of America as represented by the Administrator of the National Aeronautics and Space Administration Method and associated apparatus for capturing, servicing and de-orbiting earth satellites using robotics
US20060259193A1 (en) 2005-05-12 2006-11-16 Yulun Wang Telerobotic system with a dual application screen presentation
KR100594165B1 (en) 2005-05-24 2006-06-28 삼성전자주식회사 Robot controlling system based on network and method for controlling velocity of robot in the robot controlling system
US20060293788A1 (en) 2005-06-26 2006-12-28 Pavel Pogodin Robotic floor care appliance with improved remote management
JP2007007040A (en) 2005-06-29 2007-01-18 Hitachi Medical Corp Surgery support system
GB2428110A (en) 2005-07-06 2007-01-17 Armstrong Healthcare Ltd A robot and method of registering a robot.
US20070008321A1 (en) 2005-07-11 2007-01-11 Eastman Kodak Company Identifying collection images with special events
US7379664B2 (en) 2005-07-26 2008-05-27 Tinkers & Chance Remote view and controller for a camera
GB2453902B (en) 2005-08-11 2012-03-14 Beon Light Corp Pty Ltd Portable lamp
KR100749579B1 (en) 2005-09-05 2007-08-16 삼성광주전자 주식회사 Moving Robot having a plurality of changeable work module and Control Method for the same
US7643051B2 (en) 2005-09-09 2010-01-05 Roy Benjamin Sandberg Mobile video teleconferencing system and control method
EP1763243A3 (en) 2005-09-09 2008-03-26 LG Electronics Inc. Image capturing and displaying method and system
JP2007081646A (en) 2005-09-13 2007-03-29 Toshiba Corp Transmitting/receiving device
US20070070069A1 (en) 2005-09-26 2007-03-29 Supun Samarasekera System and method for enhanced situation awareness and visualization of environments
CN1743144A (en) 2005-09-29 2006-03-08 天津理工大学 Internet-based robot long-distance control method
US8098603B2 (en) 2005-09-30 2012-01-17 Intel Corporation Bandwidth adaptation in a wireless network
US9198728B2 (en) 2005-09-30 2015-12-01 Intouch Technologies, Inc. Multi-camera mobile teleconferencing platform
US8935006B2 (en) 2005-09-30 2015-01-13 Irobot Corporation Companion robot for personal interaction
JP2009516237A (en) 2005-10-07 2009-04-16 インテンシーブ ケア オン−ライン Online medical treatment service system and method of using the same
GB0520576D0 (en) 2005-10-10 2005-11-16 Applied Generics Ltd Using traffic monitoring information to provide better driver route planning
US20070093279A1 (en) 2005-10-12 2007-04-26 Craig Janik Wireless headset system for the automobile
CA2864027C (en) * 2005-10-14 2017-05-02 Aldo Zini Robotic ordering and delivery apparatuses, systems and methods
US7733224B2 (en) 2006-06-30 2010-06-08 Bao Tran Mesh network personal emergency response appliance
ES2726017T3 (en) 2005-10-28 2019-10-01 Viasat Inc Adaptive coding and modulation for broadband data transmission
US20070109324A1 (en) 2005-11-16 2007-05-17 Qian Lin Interactive viewing of video
US7751780B2 (en) 2005-11-23 2010-07-06 Qualcomm Incorporated Method and apparatus for collecting information from a wireless device
US20070120965A1 (en) 2005-11-25 2007-05-31 Sandberg Roy B Mobile video teleconferencing authentication and management system and method
CN101123911B (en) 2005-11-25 2012-02-15 株式会社东芝 Medical image diagnostic device
KR101300492B1 (en) * 2005-12-02 2013-09-02 아이로보트 코퍼레이션 Coverage robot mobility
EP2466411B1 (en) 2005-12-02 2018-10-17 iRobot Corporation Robot system
EP2816434A3 (en) * 2005-12-02 2015-01-28 iRobot Corporation Autonomous coverage robot
US20070135967A1 (en) 2005-12-08 2007-06-14 Jung Seung W Apparatus and method of controlling network-based robot
EP1796332B1 (en) 2005-12-08 2012-11-14 Electronics and Telecommunications Research Institute Token bucket dynamic bandwidth allocation
US8190238B2 (en) 2005-12-09 2012-05-29 Hansen Medical, Inc. Robotic catheter system and methods
DE102005058867B4 (en) * 2005-12-09 2018-09-27 Cine-TV Broadcast Systems GmbH Method and device for moving a camera arranged on a pan and tilt head along a predetermined path of movement
US7480870B2 (en) 2005-12-23 2009-01-20 Apple Inc. Indication of progress towards satisfaction of a user input condition
WO2008013568A2 (en) * 2005-12-30 2008-01-31 Irobot Corporation Autonomous mobile robot
US8577538B2 (en) 2006-07-14 2013-11-05 Irobot Corporation Method and system for controlling a remote vehicle
US20070171275A1 (en) 2006-01-24 2007-07-26 Kenoyer Michael L Three Dimensional Videoconferencing
JP2007232208A (en) 2006-01-31 2007-09-13 Mitsuboshi Belting Ltd Toothed belt and tooth cloth used therefor
US7979059B2 (en) 2006-02-06 2011-07-12 Rockefeller Alfred G Exchange of voice and video between two cellular or wireless telephones
FI20060131A0 (en) 2006-02-13 2006-02-13 Kone Corp connection system
US20090201372A1 (en) 2006-02-13 2009-08-13 Fraudhalt, Ltd. Method and apparatus for integrated atm surveillance
JP2007245332A (en) 2006-02-14 2007-09-27 Honda Motor Co Ltd Charging system of legged mobile robot
US7769492B2 (en) 2006-02-22 2010-08-03 Intouch Technologies, Inc. Graphical interface for a remote presence system
JP4728860B2 (en) 2006-03-29 2011-07-20 Toshiba Corporation Information retrieval device
EP1842474A3 (en) 2006-04-04 2007-11-28 Samsung Electronics Co., Ltd. Robot cleaner system having robot cleaner and docking station
US20100171826A1 (en) 2006-04-12 2010-07-08 Store Eyes, Inc. Method for measuring retail display and compliance
US7610083B2 (en) 2006-04-27 2009-10-27 Medtronic, Inc. Method and system for loop recording with overlapping events
US20070255115A1 (en) 2006-04-27 2007-11-01 Anglin Richard L Jr Remote diagnostic & treatment system
US7539533B2 (en) 2006-05-16 2009-05-26 Bao Tran Mesh network monitoring appliance
WO2007136769A2 (en) 2006-05-19 2007-11-29 Mako Surgical Corp. Method and apparatus for controlling a haptic device
JP2007316966A (en) 2006-05-26 2007-12-06 Fujitsu Ltd Mobile robot, control method thereof and program
US20070279483A1 (en) 2006-05-31 2007-12-06 Beers Ted W Blended Space For Aligning Video Streams
US8849679B2 (en) 2006-06-15 2014-09-30 Intouch Technologies, Inc. Remote controlled robot system that provides medical images
US20070291128A1 (en) 2006-06-15 2007-12-20 Yulun Wang Mobile teleconferencing system that projects an image provided by a mobile robot
US7920962B2 (en) 2006-06-19 2011-04-05 Kiva Systems, Inc. System and method for coordinating movement of mobile drive units
US8649899B2 (en) 2006-06-19 2014-02-11 Amazon Technologies, Inc. System and method for maneuvering a mobile drive unit
US20070299316A1 (en) 2006-06-21 2007-12-27 Patrick Haslehurst System and method for remote medical device operation
WO2008097252A2 (en) * 2006-06-22 2008-08-14 Roy Sandberg Method and apparatus for robotic path planning, selection, and visualization
US7801644B2 (en) * 2006-07-05 2010-09-21 Battelle Energy Alliance, Llc Generic robot architecture
US7620477B2 (en) * 2006-07-05 2009-11-17 Battelle Energy Alliance, Llc Robotic intelligence kernel
US8355818B2 (en) * 2009-09-03 2013-01-15 Battelle Energy Alliance, Llc Robots, systems, and methods for hazard evaluation and visualization
US8271132B2 (en) * 2008-03-13 2012-09-18 Battelle Energy Alliance, Llc System and method for seamless task-directed autonomy for robots
US7668621B2 (en) * 2006-07-05 2010-02-23 The United States Of America As Represented By The United States Department Of Energy Robotic guarded motion system and method
US8965578B2 (en) * 2006-07-05 2015-02-24 Battelle Energy Alliance, Llc Real time explosive hazard information sensing, processing, and communication for autonomous operation
US8073564B2 (en) 2006-07-05 2011-12-06 Battelle Energy Alliance, Llc Multi-robot control interface
US7587260B2 (en) 2006-07-05 2009-09-08 Battelle Energy Alliance, Llc Autonomous navigation system and method
US7584020B2 (en) * 2006-07-05 2009-09-01 Battelle Energy Alliance, Llc Occupancy change detection system and method
US7974738B2 (en) * 2006-07-05 2011-07-05 Battelle Energy Alliance, Llc Robotics virtual rail system and method
US8843244B2 (en) 2006-10-06 2014-09-23 Irobot Corporation Autonomous behaviors for a remote vehicle
US7974924B2 (en) 2006-07-19 2011-07-05 Mvisum, Inc. Medical data encryption for communication over a vulnerable system
US20080033641A1 (en) 2006-07-25 2008-02-07 Medalia Michael J Method of generating a three-dimensional interactive tour of a geographic location
US7599290B2 (en) 2006-08-11 2009-10-06 Latitude Broadband, Inc. Methods and systems for providing quality of service in packet-based core transport networks
US20080056933A1 (en) * 2006-08-29 2008-03-06 Moore Barrett H Self-Propelled Sterilization Robot and Method
US20090192821A9 (en) 2006-09-01 2009-07-30 Athenahealth, Inc. Medical image annotation
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US7693757B2 (en) 2006-09-21 2010-04-06 International Business Machines Corporation System and method for performing inventory using a mobile inventory robot
US20100243344A1 (en) * 2006-09-25 2010-09-30 Board Of Trustees Of Leland Stanford Junior University Electromechanically counterbalanced humanoid robotic system
KR100812724B1 (en) * 2006-09-29 2008-03-12 Samsung Heavy Industries Co., Ltd. Multi function robot for moving on wall using indoor global positioning system
US8180486B2 (en) 2006-10-02 2012-05-15 Honda Motor Co., Ltd. Mobile robot and controller for same
US20070170886A1 (en) 2006-10-03 2007-07-26 Plishner Paul J Vehicle equipped for providing solar electric power for off-vehicle use and systems in support thereof
US7761185B2 (en) 2006-10-03 2010-07-20 Intouch Technologies, Inc. Remote presence display through remotely controlled robot
US7654348B2 (en) 2006-10-06 2010-02-02 Irobot Corporation Maneuvering robotic vehicles having a positionable sensor head
US20080126132A1 (en) 2006-11-28 2008-05-29 General Electric Company Smart bed system
US8095238B2 (en) 2006-11-29 2012-01-10 Irobot Corporation Robot development platform
US7630314B2 (en) 2006-12-05 2009-12-08 Latitude Broadband, Inc. Methods and systems for dynamic bandwidth management for quality of service in IP Core and access networks
US20100054566A1 (en) 2006-12-19 2010-03-04 Konica Minolta Medical & Graphic, Inc. Medical image management system
TWI330305B (en) 2006-12-28 2010-09-11 Ind Tech Res Inst Method for routing a robotic apparatus to a service station and robotic apparatus service system using thereof
US7557758B2 (en) 2007-03-26 2009-07-07 Broadcom Corporation Very high frequency dielectric substrate wave guide
KR101370478B1 (en) 2007-01-10 2014-03-06 Qualcomm Incorporated Content-and link-dependent coding adaptation for multimedia telephony
WO2008101117A1 (en) 2007-02-14 2008-08-21 Teliris, Inc. Telepresence conference room layout, dynamic scenario manager, diagnostics and control system and method
US20080232763A1 (en) 2007-03-15 2008-09-25 Colin Brady System and method for adjustment of video playback resolution
US8265793B2 (en) 2007-03-20 2012-09-11 Irobot Corporation Mobile robot for telecommunication
US7747223B2 (en) 2007-03-29 2010-06-29 Research In Motion Limited Method, system and mobile device for prioritizing a discovered device list
JP5053690B2 (en) 2007-04-12 2012-10-17 Toshiba Corporation Image diagnosis support system and image diagnosis support program
US8505086B2 (en) 2007-04-20 2013-08-06 Innovation First, Inc. Managing communications between robots and controllers
US8305914B2 (en) 2007-04-30 2012-11-06 Hewlett-Packard Development Company, L.P. Method for signal adjustment through latency control
US8310521B2 (en) 2007-04-30 2012-11-13 Microsoft Corp. Insertion of virtual video into live video
US8909370B2 (en) * 2007-05-08 2014-12-09 Massachusetts Institute Of Technology Interactive systems employing robotic companions
KR101345528B1 (en) 2007-05-09 2013-12-27 iRobot Corporation Autonomous robot
US9160783B2 (en) 2007-05-09 2015-10-13 Intouch Technologies, Inc. Robot system that operates through a network firewall
US8175677B2 (en) 2007-06-07 2012-05-08 MRI Interventions, Inc. MRI-guided medical interventional systems and methods
US20130198321A1 (en) 2012-01-31 2013-08-01 Paul W. Martin Content associated with primary content
JP2009022368A (en) 2007-07-17 2009-02-05 Toshiba Corp Medical image observation supporting system
WO2009011542A1 (en) * 2007-07-18 2009-01-22 Lg Electronics Inc. Mobile robot and controlling method thereof
US8645527B1 (en) 2007-07-25 2014-02-04 Xangati, Inc. Network monitoring using bounded memory data structures
KR20090012542A (en) 2007-07-30 2009-02-04 Microrobot Co., Ltd. System for home monitoring using robot
US8400491B1 (en) 2007-08-01 2013-03-19 Sprint Communications Company L.P. Use-based adaptive video client for a bandwidth-constrained network
US7631833B1 (en) 2007-08-03 2009-12-15 The United States Of America As Represented By The Secretary Of The Navy Smart counter asymmetric threat micromunition with autonomous target selection and homing
US8639797B1 (en) 2007-08-03 2014-01-28 Xangati, Inc. Network monitoring of behavior probability density
US20090044334A1 (en) 2007-08-13 2009-02-19 Valence Broadband, Inc. Automatically adjusting patient platform support height in response to patient related events
US8116910B2 (en) 2007-08-23 2012-02-14 Intouch Technologies, Inc. Telepresence robot with a printer
KR101330734B1 (en) 2007-08-24 2013-11-20 Samsung Electronics Co., Ltd. Robot cleaner system having robot cleaner and docking station
WO2009032922A1 (en) 2007-09-04 2009-03-12 Objectvideo, Inc. Stationary target detection by exploiting changes in background model
US20090070135A1 (en) 2007-09-10 2009-03-12 General Electric Company System and method for improving claims processing in the healthcare industry
US8632376B2 (en) * 2007-09-20 2014-01-21 Irobot Corporation Robotic game systems and methods
US8237769B2 (en) 2007-09-21 2012-08-07 Motorola Mobility Llc System and method of videotelephony with detection of a visual token in the videotelephony image for electronic control of the field of view
US9060094B2 (en) 2007-09-30 2015-06-16 Optical Fusion, Inc. Individual adjustment of audio and video properties in network conferencing
US7890351B2 (en) * 2007-10-02 2011-02-15 American Well Corporation Managing utilization
US20090248200A1 (en) 2007-10-22 2009-10-01 North End Technologies Method & apparatus for remotely operating a robotic device linked to a communications network
US8045458B2 (en) 2007-11-08 2011-10-25 Mcafee, Inc. Prioritizing network traffic
US8219670B2 (en) 2007-11-08 2012-07-10 University Of Maryland System and method for adaptive context aware interaction of user with entity of interest
US7987069B2 (en) 2007-11-12 2011-07-26 Bee Cave, Llc Monitoring patient support exiting and initiating response
US20090146822A1 (en) * 2007-11-13 2009-06-11 Elevate Technologies Pty Ltd. Telemedicine Application for Remote Monitoring, Viewing and Updating of Patient Records
JP2009125133A (en) 2007-11-20 2009-06-11 Asano Dental Inc Dental treatment support system and x-ray sensor for the same
US7969469B2 (en) 2007-11-30 2011-06-28 Omnivision Technologies, Inc. Multiple image sensor system with shared processing
JP4839487B2 (en) 2007-12-04 2011-12-21 Honda Motor Co., Ltd. Robot and task execution system
US7908393B2 (en) 2007-12-04 2011-03-15 Sony Computer Entertainment Inc. Network bandwidth detection, distribution and traffic prioritization
WO2009072090A1 (en) 2007-12-07 2009-06-11 Nokia Corporation Introducing location support into a user plane of a communications network
US20090164657A1 (en) 2007-12-20 2009-06-25 Microsoft Corporation Application aware rate control
US20090171170A1 (en) 2007-12-28 2009-07-02 Nellcor Puritan Bennett Llc Medical Monitoring With Portable Electronic Device System And Method
US20090102919A1 (en) 2007-12-31 2009-04-23 Zamierowski David S Audio-video system and method for telecommunications
US20090177641A1 (en) 2008-01-03 2009-07-09 General Electric Company Patient monitoring network and method of using the patient monitoring network
WO2009095967A1 (en) 2008-01-31 2009-08-06 Mitsubishi Electric Corporation Navigation device
JP4680273B2 (en) 2008-02-05 2011-05-11 Kyocera Corporation Terminal with display function
US8140188B2 (en) * 2008-02-18 2012-03-20 Toyota Motor Engineering & Manufacturing North America, Inc. Robotic system and method for observing, learning, and supporting human activities
KR100971609B1 (en) 2008-03-05 2010-07-20 Pantech Co., Ltd. Method and system for improving performance of connection to receiver
US8374171B2 (en) 2008-03-06 2013-02-12 Pantech Co., Ltd. Method for reducing the risk of call connection failure and system to perform the method
US8244469B2 (en) 2008-03-16 2012-08-14 Irobot Corporation Collaborative engagement for target identification and tracking
US10875182B2 (en) 2008-03-20 2020-12-29 Teladoc Health, Inc. Remote presence system mounted to operating room hardware
US20100088232A1 (en) 2008-03-21 2010-04-08 Brian Gale Verification monitor for critical test result delivery systems
GB2458388A (en) 2008-03-21 2009-09-23 Dressbot Inc A collaborative online shopping environment, virtual mall, store, etc. in which payments may be shared, products recommended and users modelled.
US8179418B2 (en) * 2008-04-14 2012-05-15 Intouch Technologies, Inc. Robotic based health care system
US8170241B2 (en) 2008-04-17 2012-05-01 Intouch Technologies, Inc. Mobile tele-presence system with a microphone system
US8521893B2 (en) 2008-06-27 2013-08-27 Qualcomm Incorporated Multi-rate proximity based peer discovery methods and apparatus
US9193065B2 (en) * 2008-07-10 2015-11-24 Intouch Technologies, Inc. Docking system for a tele-presence robot
US9842192B2 (en) 2008-07-11 2017-12-12 Intouch Technologies, Inc. Tele-presence robot system with multi-cast features
GB0910117D0 (en) 2008-07-14 2009-07-29 Holicom Film Ltd Method and system for filming
US8036915B2 (en) 2008-07-31 2011-10-11 Consortium of Rheumatology Researchers of North America, Inc. System and method for collecting and managing patient data
CN101640295A (en) 2008-07-31 2010-02-03 鸿富锦精密工业(深圳)有限公司 Charging device
US20100041998A1 (en) 2008-08-18 2010-02-18 Postel Olivier B Method for Detecting and/or Monitoring a Wound Using Infrared Thermal Imaging
US8476555B2 (en) 2008-08-29 2013-07-02 Illinois Tool Works Inc. Portable welding wire feed system and method
JP5040865B2 (en) 2008-09-08 2012-10-03 NEC Corporation Robot control system, remote management device, remote management method and program
JP5111445B2 (en) 2008-09-10 2013-01-09 Mitsubishi Electric Corporation Air conditioner
US8144182B2 (en) 2008-09-16 2012-03-27 Biscotti Inc. Real time video communications system
US8340819B2 (en) 2008-09-18 2012-12-25 Intouch Technologies, Inc. Mobile videoconferencing robot system with network adaptive driving
JP5410720B2 (en) 2008-09-25 2014-02-05 Hitachi Consumer Electronics Co., Ltd. Digital information signal transmitting/receiving apparatus and digital information signal transmitting/receiving method
US8180712B2 (en) 2008-09-30 2012-05-15 The Nielsen Company (Us), Llc Methods and apparatus for determining whether a media presentation device is in an on state or an off state
US8000235B2 (en) 2008-10-05 2011-08-16 Contextream Ltd. Bandwidth allocation method and apparatus
US20100145479A1 (en) 2008-10-09 2010-06-10 G2 Software Systems, Inc. Wireless Portable Sensor Monitoring System
US8996165B2 (en) 2008-10-21 2015-03-31 Intouch Technologies, Inc. Telepresence robot with a camera boom
US8301193B1 (en) 2008-11-03 2012-10-30 Sprint Communications Company L.P. Differential planes for video I/O in a hearing impaired application
US20100118112A1 (en) 2008-11-13 2010-05-13 Polycom, Inc. Group table top videoconferencing device
JP5587331B2 (en) 2008-11-21 2014-09-10 Stryker Corporation Wireless operating room communication system
US8305423B2 (en) * 2008-11-24 2012-11-06 Innovatec, S.L. Communication system for remote patient visits and clinical status monitoring
US8463435B2 (en) 2008-11-25 2013-06-11 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
KR101553521B1 (en) * 2008-12-11 2015-09-16 Samsung Electronics Co., Ltd. Intelligent robot and control method thereof
US7995493B2 (en) 2008-12-23 2011-08-09 Airvana, Corp. Estimating bandwidth in communication networks
US8462681B2 (en) 2009-01-15 2013-06-11 The Trustees Of Stevens Institute Of Technology Method and apparatus for adaptive transmission of sensor data with latency controls
US8620077B1 (en) 2009-01-26 2013-12-31 Google Inc. Spatio-temporal segmentation for video
US8849680B2 (en) 2009-01-29 2014-09-30 Intouch Technologies, Inc. Documentation through a remote presence robot
US10163175B2 (en) * 2009-02-25 2018-12-25 Humana Inc. System and method for improving healthcare through social robotics
US8418073B2 (en) 2009-03-09 2013-04-09 Intuitive Surgical Operations, Inc. User interfaces for electrosurgical tools in robotic surgical systems
US20100238323A1 (en) 2009-03-23 2010-09-23 Sony Ericsson Mobile Communications Ab Voice-controlled image editing
US8423284B2 (en) 2009-04-15 2013-04-16 Abalta Technologies, Inc. Monitoring, recording and testing of navigation systems
US8897920B2 (en) 2009-04-17 2014-11-25 Intouch Technologies, Inc. Tele-presence robot system with software modularity, projector and laser pointer
US9792660B2 (en) 2009-05-07 2017-10-17 Cerner Innovation, Inc. Clinician to device association
US8745139B2 (en) 2009-05-22 2014-06-03 Cisco Technology, Inc. Configuring channels for sharing media
US8340654B2 (en) 2009-05-26 2012-12-25 Lextech Labs Llc Apparatus and method for video display and control for portable device
JP5466435B2 (en) 2009-06-16 2014-04-09 Nintendo Co., Ltd. Information processing program and information processing apparatus
JP5430246B2 (en) 2009-06-23 2014-02-26 Nintendo Co., Ltd. Game device and game program
US8626499B2 (en) 2009-07-21 2014-01-07 Vivu, Inc. Multimedia signal latency management by skipping
US8983772B2 (en) 2009-07-27 2015-03-17 Htc Corporation Method for displaying navigation route, navigation apparatus and recording medium
US8379130B2 (en) 2009-08-07 2013-02-19 Qualcomm Incorporated Apparatus and method of processing images based on an adjusted value of an image processing parameter
US8384755B2 (en) 2009-08-26 2013-02-26 Intouch Technologies, Inc. Portable remote presence robot
WO2011028589A2 (en) 2009-08-26 2011-03-10 Intouch Technologies, Inc. Portable telepresence apparatus
US11399153B2 (en) 2009-08-26 2022-07-26 Teladoc Health, Inc. Portable telepresence apparatus
CN102006450A (en) 2009-09-01 2011-04-06 华为终端有限公司 Method, equipment and system for displaying video conference virtual meeting place
US8473558B2 (en) 2009-09-22 2013-06-25 Thwapr, Inc. Progressive registration for mobile media sharing
US8244402B2 (en) 2009-09-22 2012-08-14 GM Global Technology Operations LLC Visual perception system and method for a humanoid robot
US9120224B2 (en) * 2009-09-22 2015-09-01 GM Global Technology Operations LLC Framework and method for controlling a robotic system using a distributed computer network
US20110077852A1 (en) 2009-09-25 2011-03-31 Mythreyi Ragavan User-defined marked locations for use in conjunction with a personal navigation device
US8712120B1 (en) 2009-09-28 2014-04-29 Dr Systems, Inc. Rules-based approach to transferring and/or viewing medical images
US9147284B2 (en) 2009-10-05 2015-09-29 Myles L. Strohl System and method for generating a computer model to display a position of a person
US9571784B2 (en) 2009-10-30 2017-02-14 Verizon Patent And Licensing Inc. Media content watch list systems and methods
US9626826B2 (en) 2010-06-10 2017-04-18 Nguyen Gaming Llc Location-based real-time casino data
US20110153198A1 (en) 2009-12-21 2011-06-23 Navisus LLC Method for the display of navigation instructions using an augmented-reality concept
US8212533B2 (en) 2009-12-23 2012-07-03 Toyota Motor Engineering & Manufacturing North America, Inc. Robot battery charging apparatuses and methods
US9586471B2 (en) * 2013-04-26 2017-03-07 Carla R. Gillett Robotic omniwheel
JP5537160B2 (en) 2010-01-05 2014-07-02 Canon Inc. Event proxy notification device, control method thereof, and program thereof
US20110169832A1 (en) 2010-01-11 2011-07-14 Roy-G-Biv Corporation 3D Motion Interface Systems and Methods
US11154981B2 (en) * 2010-02-04 2021-10-26 Teladoc Health, Inc. Robot user interface for telepresence robot system
US20110187875A1 (en) 2010-02-04 2011-08-04 Intouch Technologies, Inc. Robot face used in a sterile environment
US9823342B2 (en) 2010-02-09 2017-11-21 Aeroscout, Ltd. System and method for mobile monitoring of non-associated tags
US8234010B2 (en) * 2010-02-16 2012-07-31 Deere & Company Tethered robot positioning
US20110231796A1 (en) 2010-02-16 2011-09-22 Jose Manuel Vigil Methods for navigating a touch screen device in conjunction with gestures
US8670017B2 (en) 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
KR101169674B1 (en) 2010-03-11 2012-08-06 Korea Institute of Science and Technology Telepresence robot, telepresence system comprising the same and method for controlling the same
US9124804B2 (en) 2010-03-22 2015-09-01 Microsoft Technology Licensing, Llc Using accelerometer information for determining orientation of pictures and video images
US8457830B2 (en) * 2010-03-22 2013-06-04 John R. Goulding In-line legged robot vehicle and method for operating
US8539353B2 (en) 2010-03-30 2013-09-17 Cisco Technology, Inc. Tabs for managing content
US8577535B2 (en) 2010-03-31 2013-11-05 Massachusetts Institute Of Technology System and method for providing perceived first-order control of an unmanned vehicle
US8725880B2 (en) 2010-04-07 2014-05-13 Apple, Inc. Establishing online communication sessions between client computing devices
WO2011130634A1 (en) 2010-04-16 2011-10-20 University Of Pittsburgh - Of The Commonwealth System Of Higher Education Versatile and integrated system for telehealth
US8837900B2 (en) 2010-05-11 2014-09-16 Cisco Technology, Inc. Unintended video recording detection in a video recording device
US20110288417A1 (en) * 2010-05-19 2011-11-24 Intouch Technologies, Inc. Mobile videoconferencing robot system with autonomy and image analysis
US8935005B2 (en) * 2010-05-20 2015-01-13 Irobot Corporation Operating a mobile robot
US9014848B2 (en) * 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
US8918213B2 (en) * 2010-05-20 2014-12-23 Irobot Corporation Mobile human interface robot
US8918209B2 (en) * 2010-05-20 2014-12-23 Irobot Corporation Mobile human interface robot
US10808882B2 (en) 2010-05-26 2020-10-20 Intouch Technologies, Inc. Tele-robotic system with a robot face placed on a chair
US8621213B2 (en) 2010-06-08 2013-12-31 Merge Healthcare, Inc. Remote control of medical devices using instant messaging infrastructure
US8429674B2 (en) 2010-07-20 2013-04-23 Apple Inc. Maintaining data states upon forced exit
US8522167B2 (en) 2010-08-09 2013-08-27 Microsoft Corporation Relationship visualization and graphical interaction model in it client management
US8832293B2 (en) 2010-09-03 2014-09-09 Hulu, LLC Bandwidth allocation with modified seek function
US8781629B2 (en) 2010-09-22 2014-07-15 Toyota Motor Engineering & Manufacturing North America, Inc. Human-robot interface apparatuses and methods of controlling robots
WO2012050932A1 (en) 2010-09-29 2012-04-19 University Of Virginia Patent Foundation Method, system and computer program product for optimizing route planning digital maps
US9440356B2 (en) * 2012-12-21 2016-09-13 Crosswing Inc. Customizable robotic system
CA2720886A1 (en) 2010-11-12 2012-05-12 Crosswing Inc. Customizable virtual presence system
US20120143906A1 (en) 2010-12-02 2012-06-07 Twisted Castle, LLC Method of Accessing and Executing Digital Media
US9001207B1 (en) 2010-12-14 2015-04-07 Logitech Europe S.A. Apparatus and method for motion detection in video
US9007415B2 (en) 2010-12-16 2015-04-14 Mitel Networks Corporation Method and system for audio-video communications
US8984708B2 (en) 2011-01-07 2015-03-24 Irobot Corporation Evacuation station system
US20120176525A1 (en) 2011-01-12 2012-07-12 Qualcomm Incorporated Non-map-based mobile interface
GB2487375B (en) 2011-01-18 2017-09-20 Aptina Imaging Corp Interest point detection
US20120191464A1 (en) 2011-01-21 2012-07-26 Intouch Technologies, Inc. Telerobotic System with a Dual Application Screen Presentation
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
KR20140040094A (en) 2011-01-28 2014-04-02 InTouch Technologies, Inc. Interfacing with a mobile telepresence robot
EP2487577A3 (en) 2011-02-11 2017-10-11 BlackBerry Limited Presenting buttons for controlling an application
US8511540B2 (en) 2011-02-18 2013-08-20 Echostar Technologies L.L.C. Matrix code for use in verification of data card swap
US20120215380A1 (en) 2011-02-23 2012-08-23 Microsoft Corporation Semi-autonomous robot that supports multiple modes of navigation
US9094420B2 (en) 2011-02-24 2015-07-28 Avaya Inc. System and method for assuring quality real-time communication experience in virtual machine
US8532860B2 (en) 2011-02-25 2013-09-10 Intellibot Robotics Llc Methods and systems for automatically yielding to high-priority traffic
US8743244B2 (en) 2011-03-21 2014-06-03 HJ Laboratories, LLC Providing augmented reality based on third party information
US20120245957A1 (en) 2011-03-21 2012-09-27 healthEworks LLC Method and apparatus for providing electronic aftercare instructions
US20140139616A1 (en) 2012-01-27 2014-05-22 Intouch Technologies, Inc. Enhanced Diagnostics for a Telepresence Robot
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US9259842B2 (en) 2011-06-10 2016-02-16 Microsoft Technology Licensing, Llc Interactive robot initialization
US9030522B2 (en) 2011-06-24 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for providing media content
US8761933B2 (en) * 2011-08-02 2014-06-24 Microsoft Corporation Finding a called party
US20130044180A1 (en) 2011-08-16 2013-02-21 Sony Corporation Stereoscopic teleconferencing techniques
US20130065604A1 (en) 2011-09-13 2013-03-14 Qualcomm Incorporated Method for seamless transition from urban outdoor environments to indoor navigation
US20130100269A1 (en) 2011-10-20 2013-04-25 Jay Shiro Tashiro System and Method for Assessing an Individual's Physical and Psychosocial Abilities
US8836751B2 (en) 2011-11-08 2014-09-16 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
US9480910B2 (en) * 2011-11-09 2016-11-01 Marta Isabel Santos Paiva Ferraz Conceição Interactive embodied robot videogame through the use of sensors and physical objects
US8947495B2 (en) 2011-12-06 2015-02-03 Alcatel Lucent Telepresence apparatus for immersion of a human image in a physical environment
US20130158720A1 (en) 2011-12-15 2013-06-20 Honeywell International Inc. Hvac controller with performance log
US9219857B2 (en) 2011-12-21 2015-12-22 Nokia Technologies Oy Image capture
KR101913332B1 (en) * 2011-12-23 2018-10-31 Samsung Electronics Co., Ltd. Mobile apparatus and localization method of mobile apparatus
US9398262B2 (en) 2011-12-29 2016-07-19 Intel Corporation Communication using avatar
US9258459B2 (en) 2012-01-24 2016-02-09 Radical Switchcam Llc System and method for compiling and playing a multi-channel video
US8902278B2 (en) 2012-04-11 2014-12-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US8965623B2 (en) 2012-05-11 2015-02-24 International Business Machines Corporation Automated cleaning in a sensor network
US8908947B2 (en) 2012-05-21 2014-12-09 Terarecon, Inc. Integration of medical software and advanced image processing
EP2852881A4 (en) * 2012-05-22 2016-03-23 Intouch Technologies Inc Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US20140015914A1 (en) 2012-07-12 2014-01-16 Claire Delaunay Remote robotic presence
US10152467B2 (en) 2012-08-13 2018-12-11 Google Llc Managing a sharing of media content among client computers
US20140128103A1 (en) 2012-11-02 2014-05-08 Raymond Anthony Joao Apparatus and method for providing information regarding the presence or location of members of a social network
EP2933069B1 (en) * 2014-04-17 2019-07-03 Softbank Robotics Europe Omnidirectional wheeled humanoid robot based on a linear predictive position and velocity controller
JP5931167B1 (en) * 2014-12-11 2016-06-08 Fanuc Corporation Human cooperative robot system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030216834A1 (en) * 2000-05-01 2003-11-20 Allard James R. Method and system for remote control of mobile robot
US20120010518A1 (en) * 2000-07-12 2012-01-12 Dimicine Research It, Llc Telemedicine system
US20070271122A1 (en) * 2006-05-02 2007-11-22 Siemens Medical Solutions Usa, Inc. Patient Video and Audio Monitoring System
US20090278912A1 (en) * 2008-05-11 2009-11-12 Revolutionary Concepts, Inc. Medical audio/video communications system
US20110288682A1 (en) * 2010-05-24 2011-11-24 Marco Pinter Telepresence Robot System that can be Accessed by a Cellular Phone

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2852881A4 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022075970A1 (en) * 2020-10-05 2022-04-14 Hewlett-Packard Development Company, L.P. Transmitting biometric healthcare data

Also Published As

Publication number Publication date
US20150081338A1 (en) 2015-03-19
US9174342B2 (en) 2015-11-03
US20230226694A1 (en) 2023-07-20
US10780582B2 (en) 2020-09-22
US9776327B2 (en) 2017-10-03
US20210008722A1 (en) 2021-01-14
US11453126B2 (en) 2022-09-27
US20230016135A1 (en) 2023-01-19
US20200009736A1 (en) 2020-01-09
WO2013176758A1 (en) 2013-11-28
US20180099412A1 (en) 2018-04-12
EP2852881A4 (en) 2016-03-23
EP2852475A4 (en) 2016-01-20
US20160229058A1 (en) 2016-08-11
US11628571B2 (en) 2023-04-18
US10328576B2 (en) 2019-06-25
US20150088310A1 (en) 2015-03-26
WO2013176762A1 (en) 2013-11-28
EP2852475A1 (en) 2015-04-01
US10603792B2 (en) 2020-03-31
EP2852881A1 (en) 2015-04-01
US20200198142A1 (en) 2020-06-25

Similar Documents

Publication Publication Date Title
US11756694B2 (en) Graphical user interfaces including touchpad driving interfaces for telemedicine devices
WO2013176760A1 (en) Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US10983659B1 (en) Emissive surfaces and workspaces method and apparatus
US20220365739A1 (en) Shared-content session user interfaces
US20220286314A1 (en) User interfaces for multi-participant live communication
US20200296147A1 (en) Systems and methods for real-time collaboration
US20220374136A1 (en) Adaptive video conference user interfaces
US11937021B2 (en) Camera and visitor user interfaces
US20100179390A1 (en) Collaborative tabletop for centralized monitoring system
US10928977B2 (en) Mobile terminal and method of controlling medical apparatus by using the mobile terminal
CN104346076B (en) Information processing equipment, information processing method and program
CN104346022A (en) Method and apparatus for message processing
JP2008118301A (en) Electronic blackboard system
US20240069711A1 (en) User interfaces for managing accessories
US20230370507A1 (en) User interfaces for managing shared-content sessions
US20160179355A1 (en) System and method for managing image scan parameters in medical imaging
US20120253851A1 (en) System And Method For Controlling Displaying Medical Record Information On A Secondary Display
KR20230084336A (en) Multi-participant live communication user interface
US11775127B1 (en) Emissive surfaces and workspaces method and apparatus
KR102233429B1 (en) Mobile device for controlling medical apparatus and method for controlling medical apparatus thereof
KR20170087438A (en) Mobile device for controlling medical apparatus and method for controlling medical apparatus thereof
KR102541365B1 (en) Multi-participant live communication user interface
CN107077276A (en) Method and apparatus for providing user interface

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13793865

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2013793865

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2013793865

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE