US6580246B2 - Robot touch shield - Google Patents

Robot touch shield

Info

Publication number
US6580246B2
Authority
US
United States
Prior art keywords
robot system
shell
base
area
force applied
Prior art date
Legal status
Expired - Lifetime
Application number
US09/976,420
Other versions
US20030030399A1 (en)
Inventor
Stephen Jacobs
Current Assignee
Diversey Inc
Original Assignee
Individual
Priority date
Filing date
Publication date
Priority claimed from US09/928,669 (external priority; published as US6667592B2)
Application filed by Individual
Priority to US09/976,420
Assigned to INTELLIBOT, L.L.C. Assignment of assignors interest; assignor: JACOBS, STEPHEN
Publication of US20030030399A1
Application granted
Publication of US6580246B2
Assigned to AXXON ROBOTICS, LLC. Assignment of assignors interest; assignor: INTELLIBOT, LLC
Assigned to INTELLIBOT ROBOTICS LLC. Assignment of assignors interest; assignor: AXXON ROBOTICS LLC
Assigned to DIVERSEY, INC. Assignment of assignors interest; assignor: INTELLIBOT ROBOTICS, LLC
Security agreement: assigned to CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, as collateral agent; assignors: DIVERSEY, INC.; THE BUTCHER COMPANY
Adjusted expiration
Release of security agreement (reel/frame 045300/0141) to THE BUTCHER COMPANY and DIVERSEY, INC.; assignor: CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH
Status: Expired - Lifetime

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268: Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274: Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0225: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay

Definitions

  • the system and methods of the invention relate to utilizing a robot system with a touch shield to perform a function. More specifically, the invention relates to a service robot system and a method of utilizing a service robot system to perform a service function in an area.
  • a robot system could operate for extended periods of time autonomously, without the need for extended human supervision.
  • a robot system could perform a series of tasks that free the robot system operator to perform other duties.
  • This need can, once again, be seen in the industrial cleaning industry.
  • illustrative cleaning systems will autonomously clean an area, but then require an operator to move the system to the next area that requires service. This may require transport over areas that do not require any type of cleaning or other service.
  • a method of controlling a cleaning robot system such that the system can be given multiple tasks, in many different areas, wherein the robot system could finish the tasks in each different area without a human operator being required.
  • a robot touch shield device comprising a shell supported by at least one shell support member mounted on a base member, and a sensor device for sensing an exterior force applied to the shell, the sensor device having a base sensor portion having a center and a vertical member, the base sensor portion affixed on the base member, the vertical member affixed on the shell, the vertical member positioned over the center of the base sensor portion, wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
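  • by way of illustration only (not part of the patent text), the sketch below shows how a two-axis reading of the vertical member's displacement from the center of the base sensor portion could be converted into a direction and degree of the applied force; the function name, normalization and dead-zone value are assumptions:

      import math

      def read_touch_shield(dx: float, dy: float, dead_zone: float = 0.05):
          # Interpret a displacement (dx, dy) of the shell's vertical member
          # relative to the center of the base sensor portion, normalized to [-1, 1].
          degree = math.hypot(dx, dy)            # degree of the applied force
          if degree < dead_zone:                 # ignore sensor noise near the center
              return None                        # no meaningful exterior force
          direction = math.degrees(math.atan2(dy, dx)) % 360.0
          return direction, degree               # direction and degree of the force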
  • a robot system with a touch shield device comprising a processing portion for processing data in the robot system, a memory portion, the processor portion storing data in the memory portion and retrieving data from the memory portion, a transport portion for transporting the robot system from a first location to a second location, a body portion, the body portion containing at least one of the processor portion, the memory portion, and the transport portion, a touch shield device mounted on the body portion, the touch shield device having a shell supported by at least one shell support member mounted on a base member, and a sensor device for sensing an exterior force applied to the shell, the sensor device having a base sensor portion having a center and a vertical member, the base sensor portion affixed on the base member, the vertical member affixed on the shell, the vertical member positioned over the center of the base sensor portion, wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
  • a method of utilizing a robot system with a touch shield device comprising the steps of commanding the robot system to perform a function in an area, the function having at least one function task, the area having an area layout including at least one area segment; accessing by the robot system a stored map of the area layout, the stored map having at least one function task associated with the at least one area segment; localizing a first position of the robot system in the area; determining a function path by the robot system from the first position of the robot system for navigation of the area and completion of the at least one function task; continuously localizing a current position of the robot system while navigating the robot system along the function path; continuously monitoring by the robot system the touch shield device for obstacles in the function path, the touch shield device having a shell supported by at least one shell support member mounted on a base member, and a sensor device for sensing an exterior force applied to the shell, the sensor device having a base sensor portion having a center and a vertical member, the base sensor portion affixed on the base member, the vertical member affixed on the shell, the vertical member positioned over the center of the base sensor portion; and completing the at least one function task that is associated with the current position of the robot system on the stored map of the area.
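  • to make the claimed method steps concrete, here is a minimal control-loop sketch (all robot interfaces are hypothetical names; this is a sketch, not the patent's implementation):

      def perform_function(robot, area, function):
          layout = robot.load_stored_map(area)              # stored map with tasks per segment
          pose = robot.localize(layout)                     # localize a first position
          path = robot.plan_function_path(pose, layout, function)
          while not path.complete():
              pose = robot.localize(layout)                 # continuously localize current position
              if robot.touch_shield.poll() is not None:     # monitor shield for obstacles
                  robot.stop()
                  path = robot.plan_function_path(pose, layout, function)  # amended path
                  continue
              robot.follow(path, pose)
              robot.do_tasks(layout.tasks_at(pose))         # complete tasks tied to this position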
  • FIG. 1 is a block diagram showing a robot system in accordance with one embodiment of the system and method of the invention.
  • FIG. 2 is a block diagram showing the processor portion of FIG. 1 in further detail in accordance with one embodiment of the system and method of the invention.
  • FIG. 3 is a block diagram showing the device subsystem portion of FIG. 2 in further detail in accordance with one embodiment of the system and method of the invention.
  • FIG. 4 is a block diagram showing the motion subsystem portion of FIG. 2 in further detail in accordance with one embodiment of the system and method of the invention.
  • FIG. 5 is a block diagram showing the memory portion of FIG. 1 in further detail in accordance with one embodiment of the system and method of the invention.
  • FIG. 6 is a block diagram showing the interaction portion of FIG. 1 in further detail in accordance with one embodiment of the system and method of the invention.
  • FIG. 7 is a block diagram showing the cleaning portion of FIG. 1 in further detail in accordance with one embodiment of the system and method of the invention.
  • FIG. 8 is a block diagram showing the transport portion of FIG. 1 in further detail in accordance with one embodiment of the system and method of the invention.
  • FIG. 9 is a flowchart showing a method of utilizing a robot system to perform a function in an area in accordance with an embodiment of the method and system of the invention.
  • FIG. 10 is a flowchart showing the “robot system determines a function path” step of FIG. 9 in further detail in accordance with an embodiment of the method and system of the invention.
  • FIG. 11 is a flowchart showing a method of mapping an area utilizing a robot system in accordance with one embodiment of the method and system of the invention.
  • FIG. 12 is a flowchart showing a method of mapping an area utilizing a robot system in accordance with yet another embodiment of the method and system of the invention.
  • FIG. 13 is a flowchart showing a method of storing a map of an area layout in accordance with one embodiment of the system and method of the invention.
  • FIG. 14 is a flowchart showing a method of associating a function task with an area segment on a map of an area layout in accordance with yet another embodiment of the method and system of the invention.
  • FIG. 15 is a diagram of an illustrative area layout in accordance with one embodiment of the method and system of the invention.
  • FIG. 16 is a flowchart showing a method of editing a map of an area layout in accordance with one embodiment of the system and method of the invention.
  • FIG. 17 is a flowchart showing a method of editing a map of an area layout in accordance with a further embodiment of the method and system of the invention.
  • FIG. 18 a is a diagram of an unedited area layout in accordance with one embodiment of the method and system of the invention.
  • FIG. 18 b is a diagram of an edited area layout in accordance with one embodiment of the method and system of the invention.
  • FIG. 19 a is a diagram of an illustrative area layout in accordance with one embodiment of the method and system of the invention.
  • FIG. 19 b is a diagram of the illustrative area layout of FIG. 19 a in further detail in accordance with one embodiment of the method and system of the invention.
  • FIG. 19 c is a diagram of the illustrative area layout of FIG. 19 a in further detail in accordance with one embodiment of the method and system of the invention.
  • FIG. 19 d is a diagram of the illustrative area layout of FIG. 19 a in further detail in accordance with one embodiment of the method and system of the invention.
  • FIG. 20 is an isometric view of an illustrative robot with touch shield in accordance with one embodiment of the method and system of the invention.
  • FIG. 21 is an isometric view of an illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention.
  • FIG. 22 shows a planar view of an illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention.
  • FIG. 23 shows a planar view of the illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention.
  • FIG. 24 is an isometric view of an illustrative robot without a touch shield device shell in accordance with one embodiment of the method and system of the invention.
  • FIG. 25 is a planar view of the illustrative robot without a touch shield device shell of FIG. 24, in further detail, in accordance with one embodiment of the method and system of the invention.
  • FIG. 26 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention.
  • FIG. 27 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention.
  • FIG. 28 is an illustrative flowchart showing a method of utilizing a robot with a touch shield to perform a function in an area in accordance with one embodiment of the method and system of the invention.
  • FIG. 29 is an isometric view of an illustrative robot without a touch shield device shell in accordance with one embodiment of the method and system of the invention.
  • FIG. 30 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention.
  • FIG. 31 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention.
  • the invention provides a method of utilizing a robot system, the method comprising the steps of commanding the robot system to perform a function in an area, the area having an area layout including at least one area segment.
  • the method further includes accessing by the robot system a stored map of the area layout, the stored map having at least one function task associated with the at least one area segment, localizing a first position of the robot system in the area, and determining a function path from the first position of the robot system for navigation of the area and completion of the at least one function task.
  • the method includes continuously localizing a current position of the robot system while navigating the robot system along the function path, and completing the at least one function task that is associated with the current position of the robot system on the stored map of the area, for example.
  • a “robot” or “robot system” or “cleaning robot system” is a stand-alone system, for example, that is mobile and performs both physical activities and computational activities.
  • the physical activities may be performed using a wide variety of movable parts including cleaning devices and tools, for example.
  • the computational activities may be performed utilizing a suitable processor and memory stores, i.e., a data memory storage device, for example.
  • the computational activities may include processing information input from various sensors or other inputs of the robot system to perform commanded functions; processing the input information, as well as other data in the memory stores of the robot system, to generate a variety of desired information; or outputting information that has been acquired or produced by the robot system to a desired destination, for example.
  • an area is a distinct part or section of an environment, surroundings or space, that is set aside from other parts or sections, for example.
  • An area may include, but not be limited to, a part or section of a store, factory, warehouse, shop, mall, fair, outside market, display area, hospital, law firm, accounting firm, restaurant, commercial office space, convention center, hotel, airport, arena, stadium, outdoor venue or any other space either inside a structure or outside in which boundaries may be provided for the surroundings, for example.
  • An area may describe a two-dimensional plot or a three-dimensional space, for example. Accordingly, an area could be mapped utilizing coordinates along the X and Y axes, or coordinates along the X, Y and Z axes.
  • area layout is an arrangement, plan, or structuring of an area, for example.
  • An area layout may define the walls of a structure, different zones within a building, or other structural features of an area, for example.
  • area segment is any portion, part or section of an area layout that can be divided or subdivided. Therefore, an area segment could include a hallway, doorway, staircase, or other section of an area layout, for example. It should be appreciated that an area segment does not have to be separated from other area segments by physical boundaries, need not be contiguous, and may overlap other area segments. In general, an area segment can be defined in any suitable manner, as desired by an operator, for the performance of commanded tasks.
  • the term “function” describes any assigned duty, activity, service, assignment or role, that an operator commands a robot system to perform.
  • the robot system may be commanded to perform a cleaning function, security function, entertainment function, or other services.
  • the term “function task” describes a piece of work assigned or done as part of a robot system's function. It should be appreciated that in order to perform its function, a robot system must complete at least one function task associated with the function.
  • cleaning function describes any function associated with cleaning of an area.
  • a cleaning function may include, but not be limited to, rinsing, wringing, flushing, wiping, mopping, dust mopping, sponging, scouring, abrading, grinding, leveling, swabbing, scrubbing, scraping, stripping, sanding, brushing, washing, drying, laving, laundering, applying detergent to, applying abrasive to, clearing, disinfecting, irradiating, deodorizing, whitewashing, fumigating, applying antimicrobial agents to, sweeping, vacuuming, soaking, removing stains and soil marks from, waxing, buffing, utilizing a squeegee device on, applying cleaning solution to, dusting, bleaching, or shampooing a portion of an area.
  • an operator may develop a cleaning function, entitled “basic clean,” in which the robot system is programmed to sweep the floor of an area.
  • This “basic clean” function would include two function tasks, in which the robot system would (1) navigate the area it is sweeping and (2) perform the sweeping.
  • an operator may associate a function with an area, such that when the robot system receives the command to perform the function, the function is associated with a known area.
  • the operator may utilize a data map of the first floor of a building and the “basic clean” function.
  • the operator could create a new function, entitled “basic clean 1st floor,” in which the robot system accesses a map of the first floor and function tasks associated with each section of the first floor.
  • the first floor area could be broken down into three sections A, B and C, in which sections A and C have tile floors that must be swept, while section B is carpeted and must be traversed to travel between sections A and C.
  • the operator could program “basic clean 1st floor,” based upon the stored data map of the first floor, such that the robot would navigate and sweep section A, simply transport over section B without performing any cleaning task, and then navigate and sweep section C, as sketched below.
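  • a minimal sketch of how the “basic clean 1st floor” example could associate function tasks with area segments (a hypothetical encoding, for illustration only):

      # Each area segment on the stored map is associated with its function tasks.
      FIRST_FLOOR_TASKS = {
          "A": ["navigate", "sweep"],   # tile floor: navigate and sweep
          "B": ["navigate"],            # carpeted: transport over only, no cleaning task
          "C": ["navigate", "sweep"],   # tile floor: navigate and sweep
      }

      def tasks_for_segment(segment: str) -> list:
          return FIRST_FLOOR_TASKS.get(segment, [])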
  • a function may be very broad, and may include more than one function task.
  • functions may be associated with stored maps of areas, in which the area can be broken down into smaller segments, each having different function tasks associated with it. These segments do not have to be contiguous, may overlap other area segments, and in general, can be defined in any suitable manner as desired by an operator. Therefore, a robot system may, as part of performing its commanded function, perform one task in three different area segments within an area, or six tasks within one area segment in an area, for example.
  • FIG. 1 is a block diagram showing a robot system 10 in accordance with one embodiment of the system and method of the invention.
  • the robot system 10 includes a control portion 20 .
  • the control portion 20 includes a processor portion 100 and a memory portion 600 .
  • the robot system 10 further includes an interaction portion 700 , a cleaning portion 800 , and a transport portion 900 .
  • Each of the processor portion 100 , the memory portion 600 , the interaction portion 700 , the cleaning portion 800 and the transport portion 900 is connected to and in communication with the others through a data bus 30 .
  • any suitable communication interface might be utilized to connect the operating components of the robot system 10 .
  • the components of the robot system 10 as described above perform a wide variety of operations.
  • the processor portion 100 monitors and controls the various operations of the robot system 10 as described in detail below.
  • the memory portion 600 serves as a memory store for a wide variety of data used by the processor portion 100 as well as the other components of the robot system 10 .
  • the interaction portion 700 includes a variety of operational components that are controlled by the processor portion 100 .
  • the interaction portion 700 includes components that allow navigation of the robot system 10 and interaction with operators.
  • the robot system 10 further includes a cleaning portion 800 .
  • the cleaning portion 800 also includes a variety of components which are in communication with the processor portion 100 in accordance with some embodiments of the invention.
  • the components contained in the cleaning portion 800 perform a variety of cleaning function tasks in the area in which the robot system is operating.
  • the robot system 10 further includes a transport portion 900 .
  • the transport portion 900 is controlled by the processor portion 100 based on data input to the processor portion 100 .
  • the transport portion 900 provides mobile capabilities to the robot system 10 .
  • the transport portion 900 may include a mechanical system of wheels or an electromechanical system, for example. Further details of the transport portion 900 are described below.
  • FIG. 1 illustrates various operating components of the robot system 10 .
  • the operating components of the robot system 10 may be encased or enclosed in a suitable body or body portion 40 , as illustrated in FIG. 1 .
  • the operating components of the robot system 10 may simply be suitably disposed on a support framework or structure.
  • FIG. 2 is a block diagram showing in further detail the processor portion 100 .
  • the processor portion 100 includes a general operating portion 200 , a device subsystem portion 300 , a motion subsystem portion 400 and a cognition subsystem portion 500 .
  • the cognition subsystem 500 might be characterized as the “brain” of the robot system 10 .
  • the components of the processor portion 100 allow the robot system 10 to interact with operators and other robot systems, navigate within an area, and perform function tasks in the area.
  • the general operating portion 200 controls general operations of the processor portion 100 not otherwise handled by the other processor portions.
  • the general operating portion 200 controls system and memory backup operations, virus protection processes and processor multitasking monitoring and control, for example.
  • the device subsystem portion 300 is responsible for controlling a variety of devices in the interaction portion 700 , as well as devices in the cleaning portion 800 .
  • such devices in the interaction portion 700 and the cleaning portion 800 may be electrical, electro-mechanical or mechanical devices.
  • these devices include, but are not limited to, sonar sensors, laser sensors, a touch shield device, shell, analog, optical or digital joystick sensor, odometry sensors, a gyroscope, a global positioning system (GPS) device, solution container, cleaning brush, vacuum device, squeegee device, a monitor, joystick, magnetic strip readers, speakers, touch screens, keypads, a mouse, and motor controllers, for example.
  • further devices may be included in the cleaning portion 800 , such as a buffer device, waxing device, dryer device, mopping device, or other cleaning devices necessary to effectuate any of the above-described cleaning functions or tasks, for example.
  • the processor portion 100 also includes a motion subsystem portion 400 .
  • the motion subsystem portion 400 monitors and controls various navigational aspects of the robot system 10 .
  • the motion subsystem portion 400 determines the position of the robot system 10 in the area (i.e. localization), and controls navigation of the robot to different positions in the area and along the function path. Further aspects of the motion subsystem portion 400 are described below with reference to FIG. 4 .
  • the processor portion 100 also includes a cognition subsystem portion 500 .
  • the cognition subsystem portion 500 is essentially the brain of the robot system 10 .
  • the cognition subsystem portion 500 is responsible for all cognitive tasks performed in the robot system 10 including environment interaction, logic processes, and game logic processes, for example.
  • FIG. 3 shows the device subsystem portion 300 in further detail. As described above, the device subsystem portion 300 controls a variety of devices utilized in operation of the robot system 10 .
  • the device subsystem portion 300 includes an environment interface controller 310 , a user interface controller 320 , and a system support controller 340 .
  • the various controllers respectively control operational devices in the interaction portion 700 .
  • the environment interface controller 310 in general controls devices utilized to input information regarding the area, as well as to navigate within the area.
  • the user interface controller 320 in general controls a variety of devices utilized to input information from operators, and other robot systems, and output responsive information.
  • the system support controller 340 controls a variety of devices, not otherwise controlled by the environment interface controller 310 or the user interface controller 320 , that are used in operation of the robot system 10 . Further aspects of the controllers ( 310 , 320 , 340 ) will be described with reference to FIG. 6 below.
  • FIG. 4 is a block diagram showing further details of the motion subsystem portion 400 .
  • the motion subsystem 400 includes a path planner portion 410 , a path tracker portion 420 , a localizer portion 430 , and a Kalman filter portion 440 .
  • the motion subsystem portion 400 monitors and controls a variety of navigational aspects of the robot system 10 .
  • the localizer portion 430 is responsible for gathering a variety of sensor information.
  • the sensor information may include laser data, sonar data, touch shield device data, shell sensor data, position sensor data (X-Y axis coordinate data) from an analog, optical or digital joystick sensor, odometry data, gyroscope data, global positioning system (GPS) data, pre-stored map data, and x-y position system data, i.e., grid data, for example.
  • the localizer portion 430 accesses a stored map of the area layout in which the robot system has been commanded to perform a function. The localizer portion 430 then localizes the robot system's position in the area and associates that current position with an actual position on the stored map.
  • the path planner portion 410 may determine a function path for the robot system to complete its assigned function and tasks. Furthermore, the localizer portion 430 utilizes the devices contained in the interaction portion 700 shown in FIG. 6 . Accordingly, the localizer portion 430 determines the position and heading of the robot system 10 in the area in which the robot system 10 is operating.
  • the path planner portion 410 , based on a variety of input, generates a desired path of travel from the current position of the robot system 10 to a final position in the area, at which the assigned function and tasks will have been completed.
  • a suitable algorithm or other logic may be utilized by the path planner portion 410 to generate such a desired path of travel.
  • the path planner portion 410 may utilize stored pre-determined function paths for given areas to complete assigned function tasks upon localization in the area.
  • the localizer portion 430 may detect obstacles in the planned path of the robot system 10 while the robot system 10 is moving along the path. When an obstacle is detected, the localizer portion 430 communicates with the path tracker portion 420 to stop the movement of the robot system 10 , and with the path planner portion 410 to generate an amended function path that avoids the obstacle, as sketched below.
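  • one iteration of this stop-and-replan behavior might look as follows (hypothetical interfaces; a sketch, not the patent's implementation):

      def navigation_step(localizer, path_tracker, path_planner, goal):
          pose = localizer.current_pose()
          obstacle = localizer.detect_obstacle_on(path_tracker.path)
          if obstacle is not None:
              path_tracker.stop()                                  # halt the robot's movement
              amended = path_planner.plan(pose, goal, avoid=[obstacle])
              path_tracker.follow(amended)                         # resume on the amended path
          else:
              path_tracker.step(pose)                              # continue along the function path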
  • the path tracker portion 420 utilizes this information. Specifically, the path tracker portion 420 uses the path information from the path planner, as well as the position information from the localizer portion 430 , to control the robot system 10 to move along the desired function path.
  • the path tracker portion 420 may utilize a suitable logic or algorithm as is necessary or desired.
  • the path tracker portion 420 may further include an obstacle avoidance planner portion which may track the position of obstacles detected in the area and in the function path of the robot system.
  • the Kalman filter 440 may be employed by the localizer portion 430 to assist in the prediction of the robot system's position based on a current sensed position. It should be appreciated that the various software pieces illustrated in the motion subsystem 400 perform separate tasks, as described above. However, in another embodiment of the system of the invention, two or more of these respective tasks may be performed by a single processor, or alternatively, the tasks performed by a particular component may be further broken down into multiple processing components.
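  • as a rough sketch of the kind of fusion a Kalman filter provides (scalar case for brevity; the noise values are assumed, and the patent does not specify a formulation):

      class Kalman1D:
          def __init__(self, x=0.0, p=1.0, q=0.01, r=0.25):
              self.x, self.p = x, p      # state estimate and its variance
              self.q, self.r = q, r      # process and measurement noise variances

          def predict(self, dx):
              self.x += dx               # motion increment from odometry/gyroscope
              self.p += self.q           # uncertainty grows with motion

          def update(self, z):
              k = self.p / (self.p + self.r)   # Kalman gain
              self.x += k * (z - self.x)       # blend in a sonar/laser position measurement
              self.p *= (1.0 - k)              # uncertainty shrinks after measurement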
  • the control portion 20 in the robot system 10 includes a memory portion 600 .
  • the memory portion 600 stores a variety of data utilized in operation of the robot system 10 .
  • FIG. 5 is a block diagram showing further details of the memory portion 600 .
  • the memory portion 600 includes a general operating memory 610 , a device subsystem memory 620 , a motion subsystem memory 630 , and a cognition subsystem memory 640 .
  • the general operating memory 610 in the memory portion 600 stores a wide variety of data used by the general operating portion 200 in general operations of the robot system 10 . Furthermore, the general operating memory 610 contains specialty data stores in accordance with one embodiment of the system of the invention. Specifically, the general operating memory 610 includes a stored area memory 612 , a function path mapping memory 614 , a function task memory 616 and a function path editing memory 618 .
  • the stored area memory 612 may contain data and/or files relating to area layouts and area segments.
  • the function task memory 616 may contain data and/or files relating to functions, function commands, function tasks, and associations between this information and data on areas, area layouts and area segments.
  • the function path mapping memory 614 may store data and input information acquired by the robot system 10 and utilized to produce maps of different areas.
  • the function path editing memory 618 may store a wide variety of data related to the editing of stored maps of areas, area layouts or area segments, as is necessary or desired.
  • the memory portion 600 also includes the device subsystem memory 620 .
  • the device subsystem memory 620 is the memory store utilized by the device subsystem portion 300 . Accordingly, the device subsystem memory 620 stores a variety of information related to operation of the devices in the interaction portion 700 , which are controlled by the device subsystem portion 300 .
  • the memory portion 600 also includes a motion subsystem memory 630 .
  • the motion subsystem memory 630 is the memory store utilized by the components of the motion subsystem portion 400 , i.e., the path planner portion 410 , the path tracker portion 420 , the localizer portion 430 and the Kalman filter 440 .
  • the motion subsystem memory 630 includes a path planner memory 632 and a path tracker memory 634 utilized by the path planner portion 410 and the path tracker portion 420 , respectively.
  • the memory portion 600 also includes a cognition subsystem memory 640 .
  • the cognition subsystem memory 640 serves as the memory store for the cognition subsystem portion 500 , which is shown in FIG. 2 . Accordingly, the cognition subsystem memory 640 contains a wide variety of data utilized in the cognitive operations performed by the cognition subsystem portion 500 . Illustratively, the cognition subsystem memory 640 may contain data associating particular observed inputs with desired outputs.
  • the various memory components contained in the memory portion 600 may take on a variety of architectures as is necessary or desired by the particular operating circumstances. Further, the various memory components of the memory portion 600 may exchange data or utilize other memory component data utilizing known techniques such as relational database techniques.
  • the interaction portion 700 contains various components to collect information and data from the area in which the robot system 10 is operating, as well as to output information and data.
  • the interaction portion 700 includes an environment interface portion 710 , a user interface portion 720 and a system support portion 740 .
  • the environment interface portion 710 collects various information regarding the area in which the robot system 10 is operating, as well as information regarding travel of the robot system 10 through the area.
  • the user interface portion 720 generally provides operator interaction capabilities.
  • the user interface portion 720 is controlled by the processor portion 100 , or components thereof, to interface with an operator or other robot system including inputting commands and outputting data or information relating to the areas and functions.
  • the system support portion 740 contains a variety of operational components of a more general type not contained in either the environment interface portion 710 or the user interface portion 720 .
  • Further aspects of the portions ( 710 , 720 , 740 ) will be described below.
  • the environment interface portion 710 as shown in FIG. 6, contains a variety of components to input information from the area in which the robot system 10 is operating. More specifically, the environment interface portion 710 may include sonar sensors 711 , laser sensors 712 , an odometry sensor 713 , a global position (GPS) device 714 , microwave sensors 715 , Doppler radar sensors 716 , a gyroscope 717 , motion sensors 718 , and touch shield device 760 , as described hereinafter, for example.
  • Further devices may be included in the environment interface portion 710 , such as an x-y grid device, a gaseous sensor, a heat sensor, a camera, a recording portion, air quality sensors, flame sensors, and a metal detector, for example.
  • Each of the components in the environment interface portion 710 performs respective operations.
  • the various components in the environment interface portion 710 are in general controlled by the environment interface controller 310 , which includes sub-processing systems that respectively control some of the components in the environment interface portion 710 .
  • the map generator controller 311 controls and receives feedback from the sonar sensors 711 and the laser sensors 712 .
  • the gyroscope controller 317 controls the gyroscope 717 .
  • the interface device controller 318 controls any additional devices in the environment interface portion 710 that are not controlled by the map generator controller 311 or gyroscope controller 317 .
  • the environment interface controller 310 monitors and controls the various devices contained in the environment interface portion 710 .
  • the environment interface portion 710 includes sonar sensors 711 and laser sensors 712 , which are both responsible for “localization,” i.e., determining the position of the robot system 10 within the area in which the robot system is operating.
  • the sonar sensors 711 and laser sensors 712 can be utilized to determine positions of the robot system 10 in the area, and during navigation of the function path and completion of tasks. Accordingly, in one embodiment of the invention, the robot system 10 utilizes the sonar sensors 711 and laser sensors 712 to continuously localize the position of the robot system 10 in the area in which the robot system 10 is operating.
  • Both the sonar sensors 711 and the laser sensors 712 are controlled by the map generator controller 311 .
  • the sonar sensors 711 and the laser sensors 712 provide the map generator controller 311 with various spatial information regarding the surrounding area in which the robot system 10 is operating. Specifically, the sonar sensors 711 transmit and detect reflected acoustic waves to determine surrounding objects.
  • the sonar sensors 711 in accordance with the system of the invention can detect both static objects, i.e., such as a wall, or dynamic objects such as a moving person.
  • the sonar sensors 711 convey gathered information back to the map generator controller 311 . Thereafter, the map generator controller 311 outputs the processed information to other components in the robot system 10 as is desired or necessary.
  • the laser sensors 712 gather information utilizing laser technology.
  • the laser sensors 712 may include a device that converts incident electromagnetic radiation of mixed frequencies to coherent visible radiation having one or more discrete frequencies. The reflection of this radiation off surrounding objects may be sensed by the laser sensors 712 so as to determine the specifics of the surrounding area.
  • the laser sensors 712 in accordance with the system of the invention can also detect both static objects, i.e., such as a wall, or dynamic objects such as a moving person.
  • the laser sensors 712 convey gathered information back to the map generator controller 311 . Thereafter, the map generator controller 311 outputs the processed information to other components in the robot system 10 as is desired or necessary.
  • the sonar or laser sensors may be utilized in the system of the invention to produce a map of an area in which the robot system will operate.
  • the robot system may receive a command to map an area, for example.
  • the map generator controller 311 would utilize the information and data gathered by the sonar and laser sensors to create a map of the area sensed.
  • an operator could utilize a stored map of an area to develop an increasingly enhanced navigational capability in the area for later navigation by the robot system.
  • the robot system could generate a map of the area for alternate uses such as providing floor plans, emergency exit maps, for example.
  • the robot system may create updated maps of the area layout when obstacles are detected in the area layout. Once detected, the obstacles can be added to an amended map of the area layout that could be utilized from then on. Also, if the robot system continued to operate within the area layout and the obstacle was later removed, the robot system could either create another amended map that removed the obstacle, or could return to the previous stored map. In this respect, the robot system could operate efficiently in an environment such as a grocery store where floor displays are constantly being moved and rearranged.
  • the updating of stored maps by the robot system can be effectuated in several different fashions.
  • the operator may set the updating of a stored map to occur on the detection of an obstacle for the third consecutive cleaning cycle through a given area layout. Therefore, the operator may program the robot system to create an amended map only upon three detections of an obstacle to avoid unnecessary effort.
  • the number of repeated detections of an obstacle before it is added to a map of the area layout can be determined by one of ordinary skill in the art based upon the needs of the given area layout.
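  • a minimal sketch of such a persistence rule (the map interface is hypothetical; three consecutive detections, matching the example above):

      REQUIRED_DETECTIONS = 3
      detection_counts = {}

      def note_cycle_observations(observed_obstacles, stored_map):
          for obstacle in observed_obstacles:
              detection_counts[obstacle] = detection_counts.get(obstacle, 0) + 1
              if detection_counts[obstacle] == REQUIRED_DETECTIONS:
                  stored_map.add_obstacle(obstacle)   # amend the stored map
          for obstacle in list(detection_counts):
              if obstacle not in observed_obstacles:
                  del detection_counts[obstacle]      # a miss resets the consecutive count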
  • the robot system might be manually taught regarding the surroundings. Accordingly, the robot system could receive data and input relating to obstacles in the environment as well as the location of beacons relative to those objects.
  • an operator may edit a pre-determined stored map, change the nature of mapped boundaries in the area layout, i.e., increase the size of a room, add a doorway, or place a pillar in a hallway, and the robot system would receive updated information on the area.
  • other suitable techniques may be utilized to provide improved navigational capabilities to the robot system.
  • the environment interface portion 710 also includes an odometry sensor 713 .
  • the odometry sensor 713 may monitor the distance traveled by the robot system 10 for any of a variety of purposes. For example, the distance traveled by the robot system 10 may be utilized in combination with a stored map of an area to provide an efficient function path for performing assigned tasks. Alternatively, the distance traveled might assist in estimations relating to when replenishment of the robot system 10 will be required.
  • the information gathered by the odometry sensor 713 , as well as the information gathered by the other components of the environment interface portion 710 , may be stored in the device subsystem memory 620 in accordance with one embodiment of the system of the invention.
  • the environment interface portion 710 also includes a gyroscope 717 .
  • the gyroscope 717 is monitored and controlled by the gyroscope controller 317 in the environment interface controller 310 as shown in FIG. 3 .
  • the gyroscope 717 may include a known structure using orientational gyroscope technology, comprising a spinning mass, the spin axis of which is allowed to rotate between low-friction supports so as to maintain its angular orientation with respect to initial coordinates when the spinning mass is not subjected to external torque. Accordingly, the gyroscope 717 provides feedback to the gyroscope controller 317 indicative of movement of the robot system 10 .
  • other gyroscope technology may be utilized. For example, tuning fork or laser ring gyroscope technology might be utilized in conjunction with the system and method of the invention.
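  • whatever the gyroscope technology, its angular-rate output can be integrated into a heading estimate, roughly as follows (illustrative only):

      def integrate_heading(heading_deg: float, rate_dps: float, dt: float) -> float:
          # Dead-reckon heading from an angular rate in degrees per second
          # over a timestep dt in seconds; wrap the result to [0, 360).
          return (heading_deg + rate_dps * dt) % 360.0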
  • the localizer portion 430 in the motion subsystem portion 400 is responsible for gathering sensor information and determining the position or heading of the robot system 10 .
  • a purpose of the localizer portion 430 is to assist in navigation of the robot system 10 in its travels through the area in which the robot system 10 is operating, i.e. along a function path.
  • the map generator controller 311 assists the localizer portion 430 in its operations. Specifically, the map generator controller 311 forwards the information it gathers from the sonar sensors 711 and/or the laser sensors 712 to the localizer portion 430 .
  • the user interface portion 720 as shown in FIG. 6, contains a variety of components utilized to interface with an operator in an area.
  • the user interface portion 720 includes a touch screen 721 , a keypad 722 , a mouse 723 , a joystick 724 , speakers 725 , a magnetic strip reader 726 , i.e., a card reader, user buttons 727 , and a monitor 728 , for example.
  • the user interface portion may include additional components including, for example, an armature, a microphone, or a printer.
  • the various components in the user interface portion 720 are controlled by the user interface controller 320 , in general, or alternatively, by a sub-processing system of the user interface controller 320 .
  • the user interface controller 320 outputs data to the speakers 725 so as to provide audible messages, automated alert signals for robot operation, or simulated speech and voice generation using the speakers, for example.
  • the card reader controller 328 in the user interface controller 320 controls and inputs information from the magnetic strip reader 726 .
  • the touch screen controller 333 in the user interface controller 320 controls and inputs information from the touch screen 721 and key pad 722 .
  • the user interface portion 720 includes a variety of devices used to operate the robot system in an area to perform a function.
  • the devices in the user interface portion 720 are controlled by the user interface controller 320 or a sub-component thereof.
  • the user interface portion 720 includes a touch screen 721 and key pad 722 that are controlled by a specialized processing component in the user interface controller 320 , which is the touch screen controller 333 .
  • An operator may use the touch screen 721 to input information into the robot system 10 , i.e. commands to perform a function.
  • an operator may use the touch screen 721 to command a cleaning robot system to perform a cleaning function in a given area.
  • the touch screen 721 could be used to select a specific hallway, and the operator could specify that the floor of the hallway should be washed and waxed.
  • an operator could utilize the key pad 722 , or any combination of the touch screen and key pad to give similar commands.
  • the user interface portion 720 includes a mouse 723 , joystick 724 , user buttons 727 , and monitor 728 .
  • Each of these additional components may be utilized to input a wide variety of information into the robot system 10 .
  • each of these components could be utilized to command the robot system, including changing the function path, for example.
  • the user interface portion 720 also includes a magnetic strip reader 726 , i.e., a card reader. Alternatively, it should be appreciated that barcode or laser scanners might also be utilized.
  • the magnetic strip reader 726 is controlled by the card reader controller 328 . In one embodiment, the magnetic strip reader 726 may be utilized to identify an operator, wherein the robot system would not respond to a given command unless the command came from an identified and authorized operator, for example.
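  • such an authorization gate could be sketched as follows (the card IDs and interfaces are hypothetical):

      AUTHORIZED_IDS = {"OP-1001", "OP-1002"}   # example operator card IDs

      def accept_command(card_id: str, command) -> bool:
          if card_id not in AUTHORIZED_IDS:
              return False                      # ignore commands from unidentified operators
          command.execute()                     # honor commands from authorized operators
          return True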
  • the system support portion 740 as shown in FIG. 6 includes a variety of components used to support operation of the robot system 10 .
  • the system support portion 740 includes a communication interface 742 , a battery 746 , an encoder device 748 , and a security portion 749 .
  • the devices in the system support portion 740 are controlled by respective processing components in the system support controller 340 . That is, the communication interface controller 342 controls operation of the communication interface 742 .
  • the battery interface controller 346 controls operation of the battery 746 .
  • the encoder interface controller 348 controls operation of the encoder device 748 .
  • the security controller 349 controls the security portion 749 .
  • the system support portion 740 includes the communication interface 742 .
  • the communication interface 742 is controlled by the communication interface controller 342 , as shown in FIG. 3 .
  • the communication interface 742 provides for transmission of data both to the robot system 10 and from the robot system 10 .
  • the communication interface 742 is a wireless device.
  • Various communications techniques may be utilized to provide the wireless transmission both to and from the robot system 10 , including radio, spread spectrum, infrared line of sight, cellular, microwave, or satellite, for example.
  • the communication interface 742 may use wire technology wherein a physical cable is running from the robot system 10 to a desired location, such as a modem which may then be connected to the Internet, for example.
  • the wire technology may be utilized where the robot system 10 is operated in a small defined area.
  • the system support portion 740 also includes a battery 746 .
  • the battery 746 is monitored and controlled by the battery interface controller 346 .
  • the battery 746 may be any suitable type including lithium polymer, nickel cadmium, nickel hydride, lead acid, lithium ion, zinc air or alkaline, for example. Further, it should be appreciated that a plurality of batteries may be utilized that are the same or different types. This may be preferable in that various processing systems and operational devices utilized on the robot system 10 may optimally utilize different types of batteries for enhanced performance.
  • the battery interface controller 346 monitors the battery 746 , or alternatively batteries, for possible malfunctions and recharging requirements.
  • when the battery interface controller 346 determines that the battery 746 requires recharging or replacement, the battery interface controller 346 works in conjunction with the other processing portions and devices to effect travel of the robot system 10 to a recharging station, in accordance with one embodiment of the invention.
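  • a sketch of that recharge behavior (the threshold value and interfaces are assumptions):

      RECHARGE_THRESHOLD = 0.15   # e.g., seek a recharge below 15% capacity

      def check_battery(robot):
          if robot.battery.level() < RECHARGE_THRESHOLD:
              robot.pause_current_task()
              robot.navigate_to(robot.recharging_station)   # travel to a recharging station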
  • the system support portion also includes a security portion 749 that is controlled by the security controller 349 in the system support controller 340 .
  • the security portion 749 working in conjunction with the security controller 349 provides various capabilities related to security of both the area in which the robot system 10 is operating, as well as the robot system 10 itself.
  • the security portion 749 may provide theft detection capabilities.
  • the security portion 749 may include a proximity sensor that interacts with a base station, an embodiment of which is described below. As a result, once the robot system 10 is a predetermined distance away from the base station various operations may be performed such as sounding an audio alarm, electronically transmitting a signal to close exit ways, or effecting certain mechanical operations such as locking wheels of the robot system 10 so as to hinder transport.
  • the processing capabilities of the security controller 349 may also utilize input devices controlled by the environment interface controller 310 and the user interface controller 320 .
  • the security controller 349 may direct that a communication or message, describing an emergency condition, be dispatched to an emergency services provider, such as the police, fire department or building manager. Accordingly, if the robot system 10 detected smoke in an operation area, the security controller 349 could alert the fire department of the emergency. This monitoring is controlled by the security controller 349 , working in conjunction with the other processing systems.
  • FIG. 1 includes a cleaning portion 800 in accordance with one embodiment of the system and method of the invention.
  • FIG. 7 is a block diagram showing the cleaning portion 800 in further detail.
  • the cleaning portion 800 includes various operational components providing additional capabilities to the robot system 10 , i.e. the ability to perform cleaning functions.
  • the components in the cleaning portion 800 provide feedback to the device subsystem portion 300 , which monitors and controls operation of the components of the cleaning portion 800 .
  • the cleaning portion 800 includes a solution container 810 , applicator nozzles 812 , a cleaning brush 820 , a vacuum device 830 , and a squeegee device 840 .
  • the solution container 810 is a physical container that is disposed on the robot system 10 in accordance with one embodiment of the invention.
  • the solution container 810 may be disposed on or within the body portion 40 so as to be accessible by an operator, i.e. for filling and re-filling cleaning solution.
  • a sensor or sensors may be disposed in the solution container 810 to monitor the quantity of solution disposed in the solution container 810 . Accordingly, once such sensors in the solution container 810 convey feedback to the device subsystem portion 300 that the quantity of solution is sufficiently diminished, the device subsystem portion 300 may effect a desired action.
  • the device subsystem portion 300 may effect travel of the robot system 10 to a predetermined location such that the solution container 810 may be refilled with more solution.
  • the cleaning portion 800 also includes applicator nozzles 812 for use in the application of the cleaning solution in the operation of the robot system 10 .
  • the cleaning portion 800 also includes a cleaning brush 820 .
  • the brush 820 is disposed on the body portion 40 such that it may be utilized to perform a cleaning function upon command.
  • a vacuum device 830 is provided in the cleaning portion 800 , and disposed on the body portion 40 such that it may be utilized to vacuum an area upon command.
  • a squeegee device 840 is provided in the cleaning portion 800 , and disposed on the body portion 40 such that it may be utilized to perform a cleaning function upon command.
  • the robot system 10 includes a transport portion 900 .
  • the transport portion 900 is controlled by the device subsystem portion 300 .
  • the device subsystem portion 300 inputs various information from the interaction portion 700 including operator commands, for example.
  • the robot system 10 moves to a position and location where the robot system 10 can complete the function it has been commanded to perform.
  • the device subsystem portion 300 utilizes the transport portion 900 to effect this movement of the robot system 10 .
  • the transport portion 900 as shown in FIG. 8 controls various mechanical or electro-mechanical components needed to effect physical movement of the robot system 10 .
  • the transport portion 900 includes a motor 910 used to drive the wheels 920 .
  • the motor 910 may be powered by the battery 746 .
  • various directional devices and sensors may be utilized as is needed or desired.
  • the robot system 10 utilizes data gathered from sonar sensors 711 and laser sensors 712 to continuously localize the position of the robot system 10 , and guide the movement of the robot system 10 along a determined function path.
  • the transport portion 900 may utilize a gyroscope 930 to monitor and control the direction of travel of the robot system 10 .
  • the robot system 10 includes a touch screen 721 , which provides a graphical user interface (GUI).
  • the graphical user interface is a device that utilizes a separate passive process that performs two functions: (1) display of images on the touch screen as controlled by the cognition subsystem portion 500 , and (2) informing the cognition subsystem portion 500 of the location of any touches on the touch screen 721 .
  • all logic is removed from the graphical user interface, i.e. the touch screen 721 .
  • a clean division between the robot system's “brain” and the graphical user interface operation, i.e., the operation of a device, is achieved, as sketched below.
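  • a passive GUI process of this kind might be sketched as follows (the queue-based interfaces are an assumption; the patent does not specify one):

      def gui_process(display, touch_sensor, to_cognition, from_cognition):
          # The GUI holds no logic of its own: it (1) draws whatever image the
          # cognition subsystem sends and (2) reports touch locations back.
          while True:
              if not from_cognition.empty():
                  display.show(from_cognition.get())   # (1) display the commanded image
              touch = touch_sensor.poll()
              if touch is not None:
                  to_cognition.put(touch)              # (2) report the (x, y) of the touch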
  • the robots may be in communication utilizing any suitable wireless technology, for example, such as a gateway.
  • one robot could be washing a first area, while another robot is vacuuming a second area.
  • the robots might communicate with each other to determine if each robot is done with its respective task, thus allowing the robots to switch areas to perform the other functions, for example.
  • a second robot could be dispatched from another area to finish the cleaning solution washing.
  • one robot might communicate with multiple other robots. For example, one robot, while busy performing one function, might be commanded to perform a second function. As a result, that robot might communicate with a fleet of robots in the area to determine which robot in the fleet is available to perform the second function.
  • a gateway might be utilized to route communication between the robots. The gateway might be characterized as a traffic controller or a coordinator between the various robots.
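  • A gateway acting as a traffic controller might be sketched as below; the Gateway and Robot classes and the availability query are hypothetical stand-ins for whatever wireless protocol is actually used.

      # Minimal sketch of gateway-routed coordination; all names are hypothetical.
      class Robot:
          def __init__(self, name, busy=False):
              self.name, self.busy = name, busy

      class Gateway:
          """Traffic controller routing requests between registered robots."""
          def __init__(self):
              self.robots = {}

          def register(self, robot):
              self.robots[robot.name] = robot

          def find_available(self, task):
              # A fuller version would match the task to robot capabilities.
              for robot in self.robots.values():
                  if not robot.busy:
                      return robot.name
              return None

      gateway = Gateway()
      gateway.register(Robot("washer-1", busy=True))   # busy washing a first area
      gateway.register(Robot("vacuum-2", busy=False))
      print("dispatch second task to:", gateway.find_available("vacuum area B"))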
  • a particular robot may be guided by a device, such as a mouse, in a remote location.
  • a camera mounted upon the robot may input visual information and communicate that information to a distant location, where a human operator is monitoring and controlling movement of the robot.
  • the operator may control movement of the robot through a particular area and obtain visual information based on the travels of the robot.
  • the robot system 10 may communicate with the Internet, an Ethernet, other network systems, or other processing systems, utilizing wireless technology.
  • the robot system 10 may use streaming video technology.
  • the body portion 30 is an outer shell.
  • the outer shell may be formed in any of a wide variety of shapes depending on the area and functions in which the robot system is to be used.
  • the robot system may also utilize voice recognition techniques in operations of the robot system 10 .
  • the voice recognition techniques may identify a particular operator, or alternatively, accept a given command to perform a function.
  • a Kalman filter portion 440 may be utilized in the motion sub-system portion 400 .
  • an off-the-shelf Kalman filter may be utilized in accordance with one embodiment of the system and method of the invention.
  • the Kalman filter takes input and then processes that input to generate navigation information.
  • Various sensor inputs may be utilized including sonar information, odometry information, and gyroscope information.
  • the Kalman filter may be utilized to assess the accuracy of the various inputs.
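  • As a rough illustration of how a Kalman filter weighs inputs by their assessed accuracy, the following one-dimensional sketch fuses a heading estimate with gyroscope and sonar readings; the variance values are invented for the example, and an off-the-shelf filter would be multidimensional.

      # One-dimensional Kalman update; variance values are invented for the example.
      def kalman_update(estimate, variance, measurement, meas_variance):
          """Blend a prediction with a measurement; noisier inputs get less weight."""
          gain = variance / (variance + meas_variance)
          new_estimate = estimate + gain * (measurement - estimate)
          new_variance = (1.0 - gain) * variance
          return new_estimate, new_variance

      heading, var = 90.0, 4.0  # heading predicted from odometry (degrees)
      heading, var = kalman_update(heading, var, 94.0, 2.0)  # gyroscope reading
      heading, var = kalman_update(heading, var, 91.0, 9.0)  # sonar-derived fix
      print(f"fused heading: {heading:.1f} deg, variance {var:.2f}")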
  • an encoder device 748 may be utilized in the support system portion 740 as described above.
  • the encoder device may be utilized to control operation of the drive wheels, for example.
  • the processing portion or control portion of the robot system 10 may command the motors to turn the robot system 10 at a certain rate or, alternatively, to travel 4 feet.
  • the motors do not inherently know what 4 feet is; as a result, the motors receive feedback from an encoder mechanism, i.e., a disk or optical reader, which provides the necessary feedback information.
  • a portion of the encoder mechanism is disposed on and spins with the wheels. For example, there may be slots utilized in the encoder, and the control system knows that there are, for example, 1000 slots on the encoder disk and that 4000 slot counts are necessary to travel a distance of 4 feet.
  • an optical encoder may be positioned on the drive shaft of the motor, or alternatively on a wheel, keeping track of wheel rotation. It should further be appreciated that it is not required that the encoder actually be disposed on the driven wheel.
  • the encoder device could be disposed on a free rolling trailing wheel, for example. The rotation of the trailing wheel would provide the necessary feedback to a motor control mechanism to monitor the driven wheel, i.e., the travel of the robot system, as is necessary or desired.
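  • The encoder arithmetic in the example above reduces to simple counting, as the following sketch shows; it assumes the 1000 slots span one wheel revolution, which the 4000-counts-per-4-feet figure then implies is one foot of travel per revolution.

      # Encoder counting from the example above: 1000 slots per assumed wheel
      # revolution, with 4000 counts corresponding to 4 feet of travel.
      SLOTS_PER_REV = 1000
      FEET_PER_REV = 1.0  # implied by 4000 slots over 4 feet

      def distance_traveled_feet(slot_counts):
          return (slot_counts / SLOTS_PER_REV) * FEET_PER_REV

      def counts_needed(feet):
          return int(feet / FEET_PER_REV * SLOTS_PER_REV)

      print(distance_traveled_feet(4000))  # 4.0 feet
      print(counts_needed(4.0))            # 4000 counts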
  • the environment interface portion 710 may include a gyroscope 717 .
  • the gyroscope may be thought of as a rotational compass.
  • various known techniques may be utilized in operation of the gyroscope.
  • appropriate techniques and devices may be utilized to prevent the gyroscope from drifting, and in particular, when less expensive gyroscopes are utilized.
  • a filtering process may be utilized to effectively use data output by the gyroscope. For example, if a controller portion commands the robot to go straight and the wheels are experiencing slippage, the gyroscope will accurately inform the controller of rotation of the robot system 10 . Accordingly, the gyroscope provides angular sensing and input, which is particularly useful when turning the robot.
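  • A filtering step of the kind described might look like the following sketch, in which gyroscope heading feedback exposes wheel slippage while the robot is commanded to drive straight; the threshold and names are assumptions.

      # Minimal sketch of gyroscope-based slip detection; threshold is assumed.
      SLIP_THRESHOLD_DEG = 2.0  # rotation while "straight" suggesting slippage

      def straight_line_correction(gyro_heading_deg, commanded_heading_deg):
          """Return a steering correction if the gyro reports unintended rotation."""
          error = gyro_heading_deg - commanded_heading_deg
          if abs(error) > SLIP_THRESHOLD_DEG:
              return -error  # steer back toward the commanded heading
          return 0.0

      print(straight_line_correction(93.5, 90.0))  # -3.5: slippage, correct back
      print(straight_line_correction(90.4, 90.0))  # 0.0: within tolerance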
  • the robot system 10 may utilize a docking system.
  • a “home position” is provided at which the robot docks in the “docking position.”
  • When positioned in a docking position, the robot system 10 is electrically connected to a recharging source, for example. Accordingly, the robot system 10 may go out onto a floor of an area and work for a number of hours, at which time the robot navigates its way back to the home position. At the home position, the robot system 10 docks itself so as to provide for replenishment.
  • the recharging may, for example, be performed utilizing an inductive-type pickup wherein a plate is positioned in the floor; and the robot system 10 is positioned over that plate so as to provide for charging utilizing inductive techniques.
  • other items may be replenished on the robot system 10 including cleaning solution, wax, water, as well as other exhaustible items.
  • a touch shield or lower shield may be provided.
  • the touch shield provides feedback to the robot such that, if the robot bumps into something, or if something bumps into the robot, the robot can determine where the impact came from.
  • the impact may have come from the left, right, rear or front, for example.
  • the touch shield is a physical element that surrounds a lower portion of the robot.
  • the touch shield may be connected to the robot body using movement sensitive linkages, for example.
  • one embodiment of the robot system 10 may include a touch shield.
  • An illustrative touch shield includes a shell and a joystick sensor device mounted on the body portion of the robot.
  • FIGS. 20-27 illustrate a robot with a touch shield, in accordance with one embodiment of the method and system of the invention described above.
  • FIG. 20 is an isometric view of an illustrative robot with touch shield in accordance with one embodiment of the method and system of the invention.
  • the robot system 10 includes a body portion 40 ; an environment interface portion 710 , as embodied by sensors 711 ; a user interface portion 720 , as embodied by touch screen 721 , user buttons 727 , and key pad 722 ; and a transport portion 900 , as embodied by wheels 920 . It should be appreciated that while other portions of the robot system 10 may not be shown in FIG. 20, these portions and components are incorporated in the embodiment of the robot system 10 .
  • the environment interface portion 710 of the robot system 10 is further embodied by the inclusion of a touch shield device 760 .
  • the touch shield device 760 includes a shell 770 and a joystick sensor device 780 .
  • Shell 770 is supported by at least one shell support member 44 affixed to a base member 42 .
  • the base member 42 may be the body portion 40 of the robot system 10 , or a part thereof, i.e. a robot chassis. Additionally, it should be appreciated that the base member 42 may be another physical element attached to the robot body portion 40 .
  • When assembled, each shell support member 44 must be flexible yet self-centering, such that the shell 770 , which is supported by each shell support member 44 , can translate relative to the base member 42 when an exterior force is applied to the shell 770 .
  • Shell 770 is supported by shell support members 44 , and mounted over base member 42 with a sufficient space between base member 42 and shell 770 , such that shell 770 can move in any direction in a horizontal plane parallel to the base member 42 , in response to the exterior force applied.
  • An exterior force may come from a human touching the shell, or the shell contacting an object while the robot is moving, for example.
  • a shell support member 44 may include a rubber mount or column, a spring, a pneumatic cylinder, a hydraulic cylinder, or an air cylinder, for example. Rubber mounts or columns provide the additional benefit of being self-damping, thus allowing the rubber mount to self-center more easily than other potential support members.
  • a suitable shell support member 44 , or rubber mount must be sufficiently flexible such that when an exterior force is applied to shell 770 , the shell support member 44 bends and shell 770 moves relative to the base member 42 .
  • the shell support member 44 , or rubber mount must be sufficiently sturdy such that the shell support member 44 returns to a neutral vertical alignment after the exterior force is removed from shell 770 .
  • a shell support member 44 may be affixed to the base member 42 , and a fastener may be threaded through the top panel 772 of shell 770 and into the shell support member 44 , securing the shell 770 to the shell support member 44 . Therefore, the placement of the shell 770 on the shell support members 44 is such that the shell 770 can translate relative to the base member 42 when an exterior force is applied to the shell 770 , due to the displacement in the shell support members 44 . Further descriptions of shell 770 are shown in FIGS. 21-23.
  • touch shield device 760 includes a shell 770 and a sensor device 780 .
  • the sensor device 780 includes a base sensor portion 782 and a vertical member 784 .
  • the base sensor portion 782 , which may be a joystick base plate, is affixed to the base member 42 . This may be done in any suitable manner including, for example, by screws, bolts or other fastening means.
  • the vertical member 784 , which may be an armature or pin, is likewise affixed to the shell 770 in any suitable manner.
  • an adjustable centering device may also be utilized to center the vertical member 784 over the center 783 of base sensor portion 782 .
  • Such an adjustable centering device may take the form of a planar member, e.g. a plexiglass disc or sheet, with the vertical member affixed to the planar member, and a plurality of fasteners, e.g. nut, bolt and washer combinations, integrally connected to the planar member and shell.
  • bolts may be threaded through a plurality of clearance holes in the top panel of the shell, and integrally threaded into and affixed to the planar member.
  • the above-described adjustable centering device allows the vertical member to be centered over the base sensor portion, and also allows adjustment of the position of the vertical member once the shell is supported on the shell support members.
  • An additional viewing port in the shell may also be provided to allow the vertical member to be centered over the base sensor portion visually.
  • movement of shell 770 in response to an exterior force applied translates vertical member 784 from over the center 783 of the base sensor portion 782 , i.e. the zero degree position, such that the base sensor portion 782 senses the angular direction and magnitude of the exterior force on the shell 770 .
  • the base sensor portion 782 senses the distance the vertical member 784 is displaced from over center 783 , i.e. the zero degree position, which allows the robot system 10 to determine the magnitude of the exterior force, as well as the angular direction the vertical member 784 is displaced from the center 783 , i.e. the zero degree position, which allows the robot system 10 to determine the direction from which the exterior force was applied to shell 770 .
  • Upon movement of the vertical member 784 , the base sensor portion 782 produces an output to the processor portion signaling the exterior force on the shell 770 .
  • the output to the processor portion signals the direction of the exterior force applied and the degree of the exterior force applied.
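  • The direction and magnitude sensing described above can be illustrated with a short sketch; the displacement inputs and the stiffness scale factor are hypothetical, and a real base sensor portion would report its own calibrated units.

      # Minimal sketch of recovering force direction and magnitude from the
      # displacement of the vertical member; dx, dy and the scale are assumed.
      import math

      def sense_exterior_force(dx, dy, stiffness=1.0):
          """dx, dy: displacement of the vertical member from the zero degree
          position; a larger displacement implies a larger force."""
          angle_deg = math.degrees(math.atan2(dy, dx)) % 360.0
          magnitude = stiffness * math.hypot(dx, dy)
          return angle_deg, magnitude

      angle, force = sense_exterior_force(dx=-3.0, dy=0.0)
      print(f"force applied from {angle:.0f} deg, magnitude {force:.1f}")  # 180 deg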
  • FIG. 21 is an isometric view of an illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention.
  • shell 770 includes a top panel 772 , with a front panel 776 and side panels 778 extending longitudinally downward therefrom.
  • the top panel 772 of shell 770 is formed in a “U” shape to allow the shell 770 to be placed over the base member 42 , and supported on shell support members 44 .
  • the embodiment shown in FIG. 21 may incorporate different designs, shapes, panels, or formations, without departing from the scope of the invention.
  • shell 770 may be formed as a touch shield device shell that covers the base member from the top, front, rear and sides.
  • alternatively, the touch shield device shell may cover the base member from all directions except the side of the robot body portion 40 that is in contact with a floor surface.
  • the shell may be provided with clearance holes for the attachment of an adjustable centering device to the shell, for positioning of the vertical member in relation to the base sensor portion.
  • a viewing portal in the shell may also aid positioning of the vertical member.
  • shell 770 further includes shell mounting holes 773 , whereby the shell 770 is affixed to the shell support members 44 .
  • shell 770 includes upward sensor ports 774 , frontal sensor ports 777 , and side sensor ports 779 , wherein sensors are placed to sense objects, or obstacles in relation to the position of the robot system.
  • Sonar sensors 711 , i.e. ultrasonic transducers, are placed within the upward sensor ports 774 , frontal sensor ports 777 , and side sensor ports 779 .
  • the processor utilizes the input from the sonar sensor to determine the position of walls in the area, location of obstacles, or mapping of an area.
  • these sensors as placed in the upward sensor ports 774 , frontal sensor ports 777 , and side sensor ports 779 , can be arranged in a variety of directions and angles on the shell 770 to gather a full spectrum of information on the location of obstacles in the area.
  • the sonar sensors 711 placed in the upward sensor ports 774 gather information that allows the processor to determine if the shell 770 has moved under an obstacle, i.e. the overhang of a desk or table.
  • Sonar sensors 711 placed in the frontal sensor ports 777 gather information that allows the processor to determine if an obstacle is in front of the robot system 10 .
  • Sonar sensors 711 placed in the side sensor ports 779 gather information that allows the processor to track walls or other obstacles on either side of the robot system 10 , and also provide improved steering of the robot system 10 , and map an area.
  • a plurality of sensors placed in side sensor ports at angles of 80°, 90° (perpendicular to the path of robot system 10 ), and 100° may provide additional steering and mapping capabilities.
  • FIG. 22 shows a planar view of an illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention.
  • the embodiment shown in FIG. 22 illustrates the top panel 772 of the shell 770 .
  • shell 770 includes shell mounting holes 773 , whereby the shell 770 is affixed to the shell support members 44 , and sensor ports, i.e. upward sensor ports 774 , for example.
  • FIG. 23 shows a planar view of the illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention.
  • the embodiment shown in FIG. 23 illustrates the underside of the shell 770 and top panel 772 .
  • the shell 770 includes side panels 778 and front panel 776 .
  • shell 770 also includes shell mounting holes 773 .
  • FIG. 23 also illustrates sensors 711 placed within and filling upward sensor ports 774 , from the underside of top panel 772 .
  • Vertical member 784 is illustrated in FIG. 23 .
  • Vertical member 784 is affixed to the underside of shell 770 , such that in a non-operational condition, the placement of the shell 770 with the vertical member 784 affixed, over the base member 42 with the base sensor portion 782 affixed, positions the vertical member 784 over the center 783 of the base sensor portion 782 , in a zero degree (neutral) position.
  • touch shield device 760 includes a shell 770 and a sensor device 780 .
  • the sensor device 780 includes a base sensor portion 782 and a vertical member 784 .
  • the placement of the sensor device 780 in relation to the shell 770 , and/or base member 42 is shown in FIGS. 24-27.
  • FIG. 24 is an isometric view of an illustrative robot without a touch shield device shell in accordance with one embodiment of the method and system of the invention.
  • shell 770 has been removed to provide a better understanding of the base sensor portion 782 and shell support members 44 , as affixed to base member 42 .
  • base sensor portion 782 is affixed to base member 42 .
  • the base sensor portion 782 may be affixed in any suitable manner to the base member 42 , including, for example, with bolts 788 .
  • shell support members 44 are affixed to base member 42 , and extend vertically therefrom, for attachment to shell 770 .
  • Brush 820 , one component of the cleaning portion 800 of robot system 10 , is also illustrated in FIG. 24 . In this embodiment, brush 820 is affixed to the base member 42 .
  • FIG. 25 is a planar view of the illustrative robot without a touch shield device shell of FIG. 24, in further detail, in accordance with one embodiment of the method and system of the invention.
  • FIG. 25 illustrates the robot system 10 from the planar view, along vertical plane A-A′.
  • Base member 42 extends from underneath the body portion 40 of the robot system 10 .
  • Brush 820 extends outwardly from beneath the base member 42 .
  • Base sensor portion 782 is affixed to base member 42 with bolts 788 .
  • Shell support members 44 are affixed to and extend vertically upward from base member 42 to support shell 770 .
  • Sensor 711 extends outwardly from the face of body portion 40 .
  • To further illustrate the touch shield device, FIGS. 26-27 are provided. It should be appreciated that while FIGS. 26-27 do not illustrate each component or portion of robot system 10 , the embodiments of an illustrative touch shield shown therein may incorporate the descriptions and drawings of the embodiments shown and described in FIGS. 20-25.
  • FIG. 26 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention.
  • FIG. 26 illustrates a non-operational position.
  • a non-operational position includes any point in which the vertical member 784 of joystick sensor device 780 is positioned over the center 783 of the base sensor portion 782 . It should be appreciated that a non-operational position may include when the robot system 10 is operational, and even moving, as long as an exterior force is not being applied to the shell 770 such that the shell 770 would be translated.
  • Shell 770 , with side panel 778 , is supported by shell support members 44 , which are affixed to base member 42 .
  • Brush 820 is also attached to base member 42 .
  • Base sensor portion 782 is bolted to base member 42 , and vertical member 784 is centered over base sensor portion 782 and center 783 .
  • FIG. 27 illustrates the translation of shell 770 , vertical member 784 , and shell support members 44 when an exterior force is applied to shell 770 .
  • FIG. 27 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention.
  • FIG. 27 illustrates the occurrence of an exterior force F, applied to shell 770 .
  • the force F pushes on the shell 770 , forcing the shell support members 44 to flex in response to the force.
  • vertical member 784 moves away from the center 783 , i.e. the zero degree position, of the base sensor portion 782 .
  • the shell support members 44 , which may be rubber mounts, support the shell 770 and flex in a parallelogram fashion under the exterior force F, such that the shell 770 moves in a plane parallel to the base member 42 , in the direction of the applied force. Once the force F is no longer being applied to the shell 770 , the shell support members 44 return to a vertical upright position, and the vertical member 784 returns to a centered non-operational position over center 783 of the base sensor portion 782 .
  • base sensor portion 782 is the joystick base plate of a joystick with position sensors incorporated therein.
  • Vertical member 784 is an elongate rod-like element, or armature or pin, positioned such that vertical member 784 is vertically over center 783 of base sensor portion 782 .
  • the placement of the shell 770 over the base member 42 with the base sensor portion 782 affixed, places the vertical member 784 over the center 783 of the base sensor portion 782 , in a zero degree (neutral) position.
  • movement of shell 770 in response to an exterior force applied translates vertical member 784 from over the center 783 of the base sensor portion 782 , the zero degree position, such that the base sensor portion 782 senses the angular direction and magnitude of the exterior force on the shell 770 .
  • the base sensor portion 782 senses the distance the vertical member 784 is displaced from the zero degree position, which allows the robot system 10 to determine the magnitude of the exterior force, as well as the angular direction the vertical member 784 is displaced from the zero degree position over center 783 .
  • a robot with a touch shield device is commanded to perform a function within an area.
  • the robot system 10 is commanded to perform a cleaning function as shown in FIGS. 9-19 and the accompanying descriptions below.
  • the robot system is continuously localizing its position and searching for obstacles in its path.
  • although the robot system 10 is provided with a plurality of sensors, sonar and laser, for example, not all obstacles can be detected before the robot comes into contact with the obstacle.
  • robot system 10 with touch shield device 760 provides an emergency stop mechanism for ceasing the movement of the robot system 10 upon contact with an exterior force, i.e. an obstacle.
  • When the shell 770 of touch shield device 760 has an exterior force applied to it, shell 770 translates on the deformed shell support members 44 in a plane parallel to base member 42 , in the direction of the force.
  • the shell 770 , which is freely moveable in that it is supported by the flexible support members 44 , can move in any direction depending on the angle of the exterior force applied to it.
  • When translated, the shell 770 causes the affixed vertical member 784 to move in relation to its neutral position over center 783 of base sensor portion 782 .
  • the angle and degree of displacement of the vertical member 784 from over center 783 allow the robot system 10 to determine the direction and the magnitude of the exterior force.
  • the movement of vertical member 784 from over center 783 triggers an interrupt signal to the processor portion 100 , which commands the transport portion 900 to cease movement of the robot system 10 .
  • the processor portion 100 , utilizing the information gathered from the base sensor portion 782 , may command the transport portion 900 to then move the robot system 10 in a direction away from the exterior force, and consequently, away from the obstacle contacted.
  • the robot system 10 then functions similarly to having detected a normal obstacle, and determines a new function path that allows the robot system 10 to continue its commanded function while avoiding the obstacle it previously contacted.
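  • The contact-and-recover flow just described might be sketched as follows; the callbacks standing in for the transport portion 900 and the path planner are hypothetical.

      # Minimal sketch of the touch-shield interrupt flow; the callbacks stand in
      # for the transport portion 900 and the path planner, and are hypothetical.
      def on_touch_shield_interrupt(force_angle_deg, stop, move, replan):
          stop()                                      # cease movement immediately
          exit_angle = (force_angle_deg + 180) % 360  # head away from the force
          move(exit_angle, distance=0.5)              # retreat a short distance
          replan()                                    # treat the contact as an obstacle

      on_touch_shield_interrupt(
          force_angle_deg=90.0,  # bearing from which the force was applied
          stop=lambda: print("transport stopped"),
          move=lambda angle, distance: print(f"moving {distance} m at {angle} deg"),
          replan=lambda: print("new function path computed"),
      )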
  • FIG. 28 is an illustrative flowchart showing a method of utilizing a robot with a touch shield to perform a function in an area in accordance with one embodiment of the method and system of the invention.
  • the process begins in step S 2810 , and then passes to step S 2820 .
  • In step S 2820 , the robot system is commanded to perform a function in an area.
  • After the robot system determines the area layout (in step S 2840 ) and localizes a position in the area (in step S 2860 ), the robot system determines a function path (in step S 2880 ).
  • the robot system then begins, in step S 2900 , to navigate the area and complete at least one function task associated with the robot system's localized position in the area.
  • When the robot system detects an obstacle in its function path by sensing an exterior force applied to the shell of the touch shield device (in step S 2920 ), the robot system ceases navigating the area (in step S 2940 ).
  • the robot system determines at least one of the direction and the magnitude of the exterior force applied to the shell of the touch shield device (in step S 2960 ), and the robot system determines an exit path that moves the robot system in a direction opposite of the exterior force (in step S 2980 ).
  • the robot system determines a new function path (in step S 2990 ) and continues to navigate the area performing a commanded function (in step S 2999 ).
  • the process then ends in step S 3000 .
  • the sensor device 780 may comprise an analog joystick sensor, optical joystick sensor, digital joystick sensor, or electromechanical joystick sensor, for the base sensor portion 782 and accompanying vertical member 784 .
  • an optical, digital or mechanical joystick can be utilized interchangeably as necessary based on the skilled artisan's desired configurations.
  • a suitable optical joystick may be an eight position optical joystick, providing eight octants of sensory output information, such as the Perfect 360°™ joystick manufactured by Happ Controls, Inc. The eight position optical joystick senses movement of the joystick handle member in eight octants, i.e., in 45° segments of the base sensor portion.
  • the zero degree position, over center 783 provides a ninth position sensed by the base sensor portion.
  • the optical joystick provides a digital output: sensors in its base portion make and break an optical link, producing digital light pulses that can be sensed electrically.
  • An analog joystick, utilizing a plurality of capacitors and potentiometers, may provide angular direction measurements within one degree, as well as force magnitude values.
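  • The two sensing styles can be contrasted in a short sketch: an eight position optical joystick resolves displacement into 45° octants (plus the centered ninth position), while an analog joystick resolves the angle itself. The deadband value and function names below are assumptions.

      # Minimal sketch contrasting the two sensing styles; deadband is assumed.
      import math

      def optical_octant(dx, dy, deadband=0.1):
          """Quantize displacement into octants 0-7, or None when centered
          (the ninth, zero degree position)."""
          if math.hypot(dx, dy) < deadband:
              return None
          angle = math.degrees(math.atan2(dy, dx)) % 360.0
          return int(((angle + 22.5) % 360.0) // 45.0)

      def analog_angle_deg(dx, dy):
          """Analog sensing resolves direction to within about one degree."""
          return round(math.degrees(math.atan2(dy, dx)) % 360.0)

      print(optical_octant(1.0, 1.0))    # 1: the 45 degree octant
      print(analog_angle_deg(1.0, 1.0))  # 45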
  • FIGS. 29-31 are provided to illustrate one embodiment of the method and system of the invention utilizing an analog joystick.
  • FIG. 29 is an isometric view of an illustrative robot without a touch shield device shell in accordance with one embodiment of the method and system of the invention.
  • shell 770 has been removed to provide a better understanding of the sensor device 780 , base sensor portion 782 , vertical member 784 , and shell support members 44 , as affixed to base member 42 .
  • base sensor portion 782 is affixed to base member 42 .
  • the vertical member 784 is integrally connected to the base sensor portion 782 in a conventional joystick manner.
  • Vertical member 784 , which may be a handle, armature or other elongate element, extends vertically upward from the center 783 of the base sensor portion 782 , such that movement of the shell causes movement of the vertical member 784 . Accordingly, vertical member 784 may extend through a clearance hole 775 in shell 770 , or other suitable fixture on the shell 770 .
  • the base sensor portion 782 may be affixed in any suitable manner to the base member 42 , including, for example, with bolts 788 .
  • shell support members 44 are affixed to base member 42 , and extend vertically upward for attachment to shell 770 .
  • Brush 820 , one component of the cleaning portion 800 of robot system 10 , is also illustrated in FIG. 29 . In this embodiment, brush 820 is affixed to the base member 42 .
  • FIG. 30 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention.
  • FIG. 30 illustrates a non-operational position.
  • the non-operational position includes any point in which the vertical member 784 of sensor device 780 is positioned in the center 783 of the base sensor portion 782 . It should be appreciated that a non-operational position may include when the robot system 10 is operational, and even moving, as long as an exterior force is not being applied to the shell 770 such that the shell 770 would be translated, and consequently move vertical member 784 extending upwardly through the shell 770 .
  • Shell 770 with side panel 778 , is supported by shell support members 44 , which are affixed to base member 42 .
  • Brush 820 is also attached to base member 42 .
  • Base sensor portion 782 is bolted to base member 42 , and vertical member 784 is centered over base sensor portion 782 and center 783 .
  • FIG. 31 illustrates the translation of shell 770 , vertical member 784 , and shell support members 44 when an exterior force is applied to shell 770 .
  • FIG. 31 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention.
  • FIG. 31 illustrates the occurrence of an exterior force F, applied to shell 770 .
  • the force F pushes on the shell 770 , forcing the shell support members 44 to flex in response to the force.
  • vertical member 784 is bent from a perpendicular vertical position over the center 783 , i.e. the zero degree position, of the base sensor portion 782 .
  • the shell support members 44 , which may be rubber mounts, support the shell 770 and flex in a parallelogram fashion under the exterior force F, such that the shell 770 moves in a plane parallel to the base member 42 , in the direction of the applied force. Once the force F is no longer being applied to the shell 770 , the shell support members 44 return to a vertical upright position, and the vertical member 784 returns to a centered non-operational position over center 783 of the base sensor portion 782 .
  • a touch shield device may be constructed in any suitable manner such that a force applied to the shell causes a movement of the shell relative to a base member. This movement causes a joystick sensor device to move relative to its neutral position, i.e. the zero degree position.
  • the above described embodiments illustrate the movement of a vertical member relative to a base sensor portion mounted on a base member.
  • In the above described embodiments, the base sensor portion 782 and the vertical member 784 comprise the joystick sensor device 780 ; alternatively, however, the base sensor portion 782 may be affixed on the shell 770 while the vertical member 784 is mounted on the base member 42 .
  • the base sensor portion and the vertical member may be mounted on the shell and base member in any suitable fashion, position, or placement wherein the joystick sensor device measures the movement of the shell relative to the base member by a change in position of the vertical member in relation to its neutral position.
  • additional joystick sensor devices may be utilized such that movement of the shell causes the shell to contact and move a vertical member of a joystick sensor device.
  • the vertical member may extend through an opening in the shell, or fit within a space integral with the shell such that contact of the shell would necessarily cause the vertical member to be moved.
  • the robot touch shield device described above may be utilized in combination with additional robot systems and embodiments, such as those described in U.S. patent application Ser. Nos. 09/906,216 and 09/906,159, and in U.S. Pat. Nos. 5,548,511 and 6,124,694, each of which is incorporated by reference in its entirety.
  • a method of utilizing a robot system to perform a function in an area comprising the steps of first commanding the robot system to perform a function in an area, the area having an area layout including at least one area segment.
  • the method further includes accessing by the robot system a stored map of the area layout, the stored map having at least one function task associated with the at least one area segment, localizing a first position of the robot system in the area, and determining a function path from the first position of the robot system for navigation of the area and completion of the at least one function task.
  • the method includes repeatedly performing the steps of continuously localizing a current position of the robot system while navigating the robot system along the function path, and completing the at least one function task that is associated with the current position of the robot system on the stored map of the area.
  • An illustrative method of utilizing a robot system to perform a function in an area is shown in FIGS. 9-10, and described below.
  • FIG. 9 is a flowchart showing a method of utilizing a robot system to perform a function in an area in accordance with an embodiment of the method and system of the invention.
  • the process begins in step S 10 , and then passes to step S 20 , wherein the robot system is commanded to perform a function in an area.
  • the function has at least one function task, while the area has an area layout which includes at least one area segment.
  • the robot system accesses a stored map of the area layout in step S 40 .
  • the area layout has at least one function task associated with its at least one area segment.
  • Then, in step S 60 , the robot system localizes a first position in the area.
  • In step S 80 , the robot system determines a function path, from the first position, for navigation of the area and completion of the at least one function task. Then, in step S 100 , the robot system navigates the area and completes the at least one function task associated with the position of the robot system in the area, while continuously localizing the robot system position in the area. In addition, while the robot system is navigating the area and completing the at least one function task, the robot system is continuously monitoring for obstacles, and determining if an obstacle is in the function path in step S 120 .
  • If an obstacle is detected in step S 120 , the process returns to step S 60 , where the robot system will once again localize a position (in step S 60 ) and recalculate a new function path that avoids the obstacle in the current function path (in step S 80 ).
  • If no obstacle is detected in step S 120 , the process passes to step S 140 , wherein the robot system determines if it has completed the at least one function task in the area. If yes, the process passes to step S 160 , wherein the robot system returns to a non-operating position. Then, the process ends in step S 180 . However, if in step S 140 the robot system has not completed its at least one function task, the process returns to step S 100 and the robot system continues to navigate the area and complete its at least one function task.
  • FIG. 10 shows the method of FIG. 9 in further detail.
  • the robot system moves to the area.
  • the robot system may take appropriate measures to guide itself to the area, or the robot system may be directed to the area by an operator.
  • FIG. 10 is a flowchart showing the “robot system determines a function path” step of FIG. 9 in further detail in accordance with an embodiment of the method and system of the invention.
  • the process begins in step S 80 , and then passes to step S 82 .
  • In step S 82 , the robot system determines whether it has received a new command to perform a function. If the robot system has received a new command, the process passes to step S 84 , where the robot system determines if there is a stored function path associated with the command received. This may include a previously determined function path for a given area based on the layout of the area and associated tasks. If a stored function path exists that is associated with the new command received, the process passes to step S 86 . In step S 86 , the stored function path is identified as the function path for the continuing process. Then, the process passes to step S 99 , wherein the process returns to step S 100 .
  • Alternatively, if in step S 82 the robot system determines that it has not received a new command, the process passes to step S 88 .
  • In step S 88 , the robot system determines that an obstacle has been detected in the current function path.
  • Then, in step S 90 , the robot system develops a new function path and identifies the new function path as the function path for the continuing process. Thereafter, the process passes to step S 99 , wherein the process returns to step S 100 .
  • Returning to step S 84 , if the robot system determines that there is no stored function path associated with the command received, the process passes to step S 90 . Then, in step S 90 , the robot system develops a new function path and identifies the new function path as the function path for the continuing process. Thereafter, the process passes to step S 99 , wherein the process returns to step S 100 .
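  • The stored-path-versus-new-path decision of FIG. 10 might be sketched as follows; the path store and planner callback are hypothetical.

      # Minimal sketch of the FIG. 10 decision: reuse a stored function path when
      # one matches the command, else develop a new one; names are hypothetical.
      stored_function_paths = {
          "clean lobby": ["segment 1051", "segment 1052"],
      }

      def determine_function_path(command, develop_new_path):
          if command in stored_function_paths:      # steps S84/S86
              return stored_function_paths[command]
          return develop_new_path(command)          # step S90

      print(determine_function_path("clean lobby", lambda c: ["newly planned path"]))
      print(determine_function_path("wax hallway", lambda c: ["newly planned path"]))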
  • a stored function path for a given area segment in an area layout may be more efficient than a function path determined by a robot system operating on information gathered from sensors. This may occur because pre-programmed function paths may allow an operator to direct the robot system very close to obstacles, such as walls, and guide the robot system into tight spaces that the robot system's obstacle avoidance systems would otherwise not allow the robot system to navigate within.
  • the robot system may create updated maps of the area layout when obstacles are detected in the area layout. Once detected, the obstacles can be added to an amended map of the area layout that can be utilized. Also, if the obstacle is removed, the robot system can either create another amended map that removes the obstacle, or could return to the previous stored map. It should also be appreciated that the operator may set the updating of a stored map to occur on the detection of an obstacle for a given number of consecutive cleaning cycles through a given area layout. The number of repeated detections of an obstacle before it is added to a map of the area layout can be determined by one of ordinary skill in the art based upon the needs of the given area layout.
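  • The consecutive-detection rule described above might be sketched as follows; the required count and the map structure are operator-chosen assumptions.

      # Minimal sketch of amending the map only after N consecutive detections;
      # N and the map structure are operator-chosen assumptions.
      CONSECUTIVE_DETECTIONS_REQUIRED = 3

      def update_map(area_map, detection_counts, obstacle_id, detected):
          if detected:
              detection_counts[obstacle_id] = detection_counts.get(obstacle_id, 0) + 1
              if detection_counts[obstacle_id] >= CONSECUTIVE_DETECTIONS_REQUIRED:
                  area_map.add(obstacle_id)      # add to the amended map
          else:
              detection_counts[obstacle_id] = 0  # streak broken
              area_map.discard(obstacle_id)      # obstacle removed; revert the map

      area_map, counts = set(), {}
      for cleaning_cycle in range(3):
          update_map(area_map, counts, "pallet near wall", detected=True)
      print(area_map)  # {'pallet near wall'} after three consecutive detections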
  • FIGS. 11-13 are illustrative flowcharts showing a method of mapping an area and assigning function tasks to an area segment in accordance with one embodiment of the method and system of the invention.
  • FIG. 11 is a flowchart showing a method of mapping an area utilizing a robot system in accordance with one embodiment of the method and system of the invention.
  • the process begins in step S 300 , and then passes to step S 310 .
  • In step S 310 , the robot system is commanded to map an area.
  • the area has an area layout which includes at least one area segment.
  • Then, in step S 320 , the robot system determines the area layout utilizing a plurality of sensors. Accordingly, the robot system produces a map of the area layout in step S 330 , and stores the map of the area layout in a memory device in step S 340 .
  • the process then passes to step S 350 , where it ends.
  • FIG. 12 is a flowchart showing a method of mapping an area utilizing a robot system in accordance with yet another embodiment of the method and system of the invention.
  • the process begins in step S 400 , and then passes to step S 410 .
  • In step S 410 , the robot system is commanded to map an area.
  • the area has an area layout which includes at least one area segment.
  • Then, in step S 420 , an operator directs the robot system on a mapping path that traverses the area being mapped.
  • In step S 430 , while moving along the path traversing the area being mapped, the robot system determines the area layout utilizing a plurality of sensors.
  • the robot system produces a map of the area layout in step S 440 .
  • In step S 450 , the robot system determines whether or not the entire area has been mapped. If not, the process returns to step S 420 to continue mapping. However, if the entire area has been mapped, the robot system stores the map of the area layout in a memory device in step S 460 . The process then passes to step S 470 , where it ends.
  • FIG. 13 is a flowchart showing a method of storing a map of an area layout in accordance with one embodiment of the system and method of the invention.
  • the process shown in FIG. 13 may represent the steps of S 340 and S 460 , of FIGS. 11 and 12 respectively, in further detail.
  • In step S 510 , the robot system determines whether to store the map of the area layout in an internal memory device. If yes, the robot system stores the map of the area layout in an internal memory device in step S 520 . Alternatively, if the robot system determines not to store the map of the area layout internally, the robot system stores the map in an external memory device in step S 530 . Once the map of the area layout has been stored in step S 520 or S 530 , the process passes to step S 540 , wherein it ends. It should be appreciated that these illustrative storing processes may be utilized in conjunction with other processes in which other steps may be added or deleted.
  • the operator may direct the robot system along a mapping path that traverses the area being mapped in any suitable manner in which the robot system receives a command from an operator to move from one point to another.
  • an operator may physically guide the robot system from a first point in an area segment to a finishing point of the mapping.
  • the operator may control the movement of the robot system through the use of a wireless keyboard or joystick.
  • any suitable mapping path that allows the robot system to produce a complete map of an area segment may be utilized. Therefore, as stated above, the robot system need only traverse the area being mapped to the extent necessary for mapping the area. It should be appreciated that the robot system may not need to physically move in any respect to produce a map of an area.
  • the robot system performing the mapping of an area may utilize a plurality of different sensors in producing the map.
  • the map may be stored in any suitable memory device. This may include an internal memory store within the robot system, or any external memory device in which the robot system is in communication, or both, for example. Accordingly, the robot system may store the completed map in a central memory device, wherein the map is accessible by one or more alternate robot systems. The alternate robot systems may employ their access to the stored map upon receiving a command to perform a function in an area associated with the stored map.
  • an operator may assign, program, or associate function tasks for a given area segment. Accordingly, an operator could map an area segment and assign one function task to be completed in that area segment once the robot system receives a command to perform a function in the area, for example.
  • FIG. 14 provides a better understanding of how function tasks can be associated with maps of area layouts.
  • FIG. 14 is a flowchart showing a method of associating a function task with an area segment on a map of an area layout in accordance with yet another embodiment of the method and system of the invention.
  • the process begins in step S 600 , and then passes to step S 610 .
  • In step S 610 , the robot system is commanded to map an area.
  • the area has an area layout which includes at least one area segment.
  • Then, in step S 620 , an operator directs the robot system on a mapping path that traverses the area being mapped.
  • In step S 630 , while moving along the path traversing the area being mapped, the robot system determines the area layout utilizing a plurality of sensors.
  • the robot system produces a map of the area layout in step S 640 .
  • the process then passes to step S 650 , wherein an operator associates at least one function task to be completed in the area segment with the map of the area layout, which includes the area segment.
  • In step S 660 , the robot system determines whether or not the entire area has been mapped. If not, the process returns to step S 620 to continue mapping. However, if the entire area has been mapped, the robot system stores the map of the area layout and the at least one function task associated with the at least one area segment in a memory device in step S 670 . The process then passes to step S 680 , where it ends.
  • an operator may associate a certain function task to an area segment before the robot system begins to map the area segment.
  • an operator could command the robot to map the area, whereupon the robot begins sensing the area layout and the operator programs a function task to go along with the area segment before the robot system begins to traverse the area.
  • the operator could press a record button, which tells the robot system to map the area, then push a function task button, which tells the robot to associate the function task with the area segment.
  • the operator could then direct the robot system over the area segment and map and assign tasks simultaneously.
  • an operator may choose to associate several function tasks with one area segment and only one for another.
  • each area segment should have at least one function task.
  • This one function task may be as simple as moving through the area segment and not performing any other function task.
  • one of ordinary skill in the art could prepare several different stored functions within the robot system, which include several commands, functions, area layouts further including numerous area segments, for example. This embodiment of the invention may be further understood with reference to FIG. 15 .
  • FIG. 15 is a diagram of an illustrative area layout in accordance with one embodiment of the method and system of the invention.
  • the robot system is first placed in an unmapped area 1000 at point 1001 .
  • An operator then commands the robot system to begin mapping the area layout in which the robot system is placed.
  • the robot system begins sensing the boundaries, i.e. walls, of the area layout and then moves to point 1002 , while continuously sensing.
  • the robot system produces a map of the first area segment 1051 , which is defined by the walls where the robot system first began sensing at point 1001 , and artificial boundary 1021 , which is the programmed boundary between area segments 1051 and 1052 .
  • the operator or programmer can assign any number of function tasks for completion in this area segment 1051 , the least of which is to travel through it.
  • the operator then directs the robot system from point 1002 to point 1003 .
  • the robot system continues sensing and updating a map of the area layout based on the movement, while recognizing the established boundary 1021 between area segments 1051 and 1052 .
  • area segment 1052 is defined by the walls of the area and space between boundary 1021 and boundary 1022 , the boundary between area segments 1052 and 1053 .
  • the mapping process may continue accordingly through mapping passes from points 1003 - 1014 .
  • the robot system will move from point 1003 to 1004 to 1005 to 1006 to 1007 to 1008 , back to 1009 to 1010 to 1011 to 1012 back to 1009 to 1013 and finishing with point 1014 .
  • the mapping process can map the entire area layout 1000 , while dividing the area layout 1000 into area segments 1051 , 1052 , 1053 , 1054 , 1055 , 1056 , 1057 , and 1058 , separated by artificial boundaries 1021 , 1022 , 1023 , 1024 , 1025 , 1026 and 1027 , as shown in FIG. 15 .
  • the operator possesses a wide range of latitude in determining which function tasks may be assigned to each area segment.
  • the operator may program that area segments 1051 and 1052 should be mopped and dried, area segment 1053 should be passed over because it is carpeted (unless vacuuming is desired), area segments 1054 and 1055 should be scrubbed, mopped, dried and waxed, area segment 1056 should be passed over because it is carpeted, area segment 1057 should be vacuumed, and area segment 1058 should be swept.
  • the robot system's commands may require it to travel over the same section of the area layout several times, and to perform functions and tasks repeatedly on one area segment, in any order or on any schedule. Therefore, one should appreciate that many different combinations of function tasks and area segments can be accomplished based upon the desired programming input into the robot system by the operator.
  • the robot system understands the function tasks it is to perform in different area segments based upon the stored map associated with the commanded function, and tasks assigned to area segments stored along with the map of the area layout.
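  • The association of function tasks with area segments can be illustrated with a simple lookup, using the task assignments from the FIG. 15 example above; the dictionary representation is an assumption.

      # Segment-to-task lookup using the assignments of the FIG. 15 example;
      # the dictionary representation is an assumption.
      function_tasks = {
          "1051": ["mop", "dry"],
          "1052": ["mop", "dry"],
          "1053": ["pass over"],  # carpeted
          "1054": ["scrub", "mop", "dry", "wax"],
          "1055": ["scrub", "mop", "dry", "wax"],
          "1056": ["pass over"],  # carpeted
          "1057": ["vacuum"],
          "1058": ["sweep"],
      }

      def tasks_for_segment(segment_id):
          # Every segment carries at least one task, if only to travel through it.
          return function_tasks.get(segment_id, ["pass over"])

      print(tasks_for_segment("1054"))  # ['scrub', 'mop', 'dry', 'wax']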
  • FIGS. 16 and 17 show illustrative methods of editing a map of an area layout in accordance with further embodiments of the method and system of the invention.
  • FIG. 16 is a flowchart showing a method of editing a map of an area layout in accordance with one embodiment of the system and method of the invention.
  • the process begins in step S 700 , and then passes to step S 710 .
  • In step S 710 , an operator accesses a map of an area layout.
  • In step S 720 , the operator edits the map of the area layout.
  • the operator stores the edited map of the area layout in a memory device, in step S 730 .
  • the process then passes to step S 740 where it ends.
  • the robot system may be programmed to create updated maps of area layouts in which the robot system is operating within based upon the detection of obstacles in the function path, for example. This may include commanding the robot system to create an amended map upon the repeated detection of an obstacle.
  • To further illustrate the editing of a map, FIG. 17 is provided.
  • FIG. 17 is a flowchart showing a method of editing a map of an area layout in accordance with a further embodiment of the method and system of the invention.
  • the process begins in step S 800 , and then passes to step S 810 , wherein an operator accesses a map of an area layout and the function tasks that have been associated with the area layout. Then, in step S 820 , the operator determines whether or not to edit the map of the accessed area layout. If yes, the process passes to step S 830 , in which the operator edits the accessed map of the area layout. Once the map has been edited, the process passes to step S 840 . Alternatively, if in step S 820 the operator decides not to edit the accessed map of the area layout, the process passes to step S 840 .
  • In step S 840 , the operator determines whether or not to edit the function task(s) associated with any area segment in the map of the area layout. In this respect, function tasks can be added or deleted. If yes, the process passes to step S 850 , in which the operator edits or changes the associated function task(s). Then, the process passes to step S 860 . Alternatively, if in step S 840 the operator chooses not to edit the associated function task(s), the process passes to step S 860 . Then, in step S 860 , the operator stores the edited map of the area layout, the edited associated function task(s), or both, in a memory device. The process then passes to step S 870 , where it ends.
  • FIG. 18 a is a diagram of an unedited area layout 1100 in accordance with one embodiment of the method and system of the invention. As shown in FIG. 18 a, area layout 1100 is divided into a first area segment 1110 and second area segment 1120 , which are separated by artificial boundary 1119 . For purposes of this example, consider area layout 1100 to represent a tiled floor area of an office building. However, if changes to the area of the office building are made, an operator can edit the map of this area layout 1100 .
  • FIG. 18 b is a diagram of an edited area layout in accordance with one embodiment of the method and system of the invention.
  • the same area layout 1100 is now subdivided into area segments 1110 , 1130 and 1140 , separated by boundaries 1119 and 1139 , respectively.
  • Area segments 1130 and 1140 make up what was previously area segment 1120 .
  • the office building area has been modified such that multiple pillars 1112 and a receptionist desk 1114 have been added in area segment 1110 , and a glass wall 1132 with glass double doors 1134 has been placed on boundary 1139 .
  • an operator can edit the original stored map of area layout 1100 (as shown in FIG. 18 a ) to include these new features (as shown in FIG. 18 b ) and update a function for a robot system to perform in this area layout 1100 .
  • FIG. 19 a is a diagram of an illustrative area layout in accordance with one embodiment of the method and system of the invention.
  • FIGS. 19 b, 19 c and 19 d are diagrams of the illustrative area layout of FIG. 19 a in further detail in accordance with one embodiment of the method and system of the invention.
  • illustrative area layout 1200 is provided. Area layout 1200 is divided into sections 1210 , 1230 , 1250 and 1270 , respectively, for purposes of this embodiment. As shown in FIGS. 19 b- 19 d:
  • sections 1210 and 1230 can be grouped together to define area segment 1220 .
  • sections 1230 and 1250 can be grouped together to define area segment 1240 .
  • sections 1230 and 1270 can be grouped together to define area segment 1260 .
  • FIGS. 19 a- 19 d have been selected only for purposes of illustrating one embodiment of the system and method of the invention.
  • An operator may define different area segments throughout an area layout in any suitable manner desired for accomplishing the desired function. For example, if a small circular area rug was placed in the center of section 1230 , the operator could choose to make that portion of section 1230 covered by the area rug to define yet another area segment.
  • an operator can utilize several different commands for directing the robot system's functioning within differing area segments.
  • the operator can program the robot system to change area segments with a simple programmed turn of the robot system.
  • the routine could include a programmed 90 degree turn clockwise, after which the robot system would be in the next area segment. The robot system would then move to the tasks commanded for the second area segment upon making the turn.
  • an operator commands a cleaning robot system to perform several different cleaning tasks within area layout 1200 .
  • the operator can dictate a series of tasks to be completed.
  • area layout 1200 is divided into area segments 1220 , 1240 and 1260 , respectively. It should be appreciated, as shown in FIGS. 19 a- 19 d, area segments may overlap and may further be defined in any suitable manner desired.
  • the series of tasks in this embodiment may be scripted such that the robot system first applies cleaning solution to and then mops area segment 1220 . Then, the robot system applies cleaning solution to, scrubs and then mops area segment 1240 . Finally, the robot system simply dust mops area segment 1260 . Although these area segments overlap and are not separated by physical boundaries, the operator may change the area segment in which the robot system is operating by commanding the robot system to turn upon the completion of the assigned tasks for an area segment. Thus, once the robot system finished mopping area segment 1220 , and was situated in section 1230 , a 90 degree clockwise turn could place the robot system in the next area segment. Therefore, upon the 90 degree turn, the robot system would be at its initial operating point for area segment 1240 . However, it should be understood that turns and other physical movements of the robot system may be utilized to effectuate the robot system's assigned tasks, and the differentiation between area segments in an area.
  • commands may be received from a central system via any suitable communication interface, modem, telephone, fax, or other computer connection, the receipt of data input from an IP address given to the robot, or any other suitable connection through which the robot's processor might receive input from an external source.
  • a robot system may receive a command from another robot to perform a function or function task.
  • an illustrative fleet of robots may take commands from a manager robot dispatching commands through interfaces with the other robots in the area.
  • the robot system's ability to perform functions in an area may extend to those periods of time when no operators are present to supervise the robot system.
  • the robot system could perform a maintenance and security function, as well as a conventional cleaning function.
  • the robot may be programmed to handle certain emergency situations, including for example, fire emergencies, burglaries or loss of power in the area in which it is operating. It should be appreciated that once the robot system detects an emergency condition, the robot system may alert all necessary personnel to the emergency condition.
  • In accordance with the robot system's ability to operate autonomously, the robot system will be provided with the necessary programming, tasking and commands to ensure its readiness to perform functions in an area. This may require that the robot system monitor its own diagnostic systems, including its power status and internal components, such that the robot system would understand if it needs to be recharged or serviced to maintain its working condition. The robot system may then alert the necessary personnel that it needs service. For simple service requirements, such as recharging of the robot system's batteries or dumping or refilling tanks, the robot system may deliver itself to a recharging station where it can autonomously recharge its batteries and dump or refill its tanks.
  • The robot systems shown in FIGS. 1-8 and FIGS. 20-27 may incorporate a computer or computer system.
  • The term “computer system” is to be understood to include at least one processor utilizing a memory or memories.
  • The memory stores at least portions of an executable program code at one time or another during operation of the processor.
  • The processor executes various instructions included in that executable program code.
  • An executable program code means a program in machine language or other language that is able to run in a particular computer system environment to perform a particular task.
  • The executable program code processes data in response to commands by a user.
  • The terms “executable program code” and “software” mean substantially the same thing for the purposes of the description as used herein.
  • It is not necessary that the processor, or subportions of the processor, and/or the memory, or subportions of the memory, be physically located in the same place or disposed in the same physical portion of the robot system 10. That is, it should be appreciated that the processor and the memory may each be located in geographically distinct locations and connected so as to communicate in any suitable manner, such as over a wireless communication path, for example. Additionally, it should be appreciated that the processor and/or the memory may each be composed of different physical pieces of equipment. Accordingly, it is not necessary that the processor be one single piece of equipment in one location and that the memory be another single piece of equipment in another location.
  • The processor may be two pieces of equipment in two different physical locations.
  • The two distinct pieces of equipment may be connected in any suitable manner.
  • Each respective portion of the memory described above may include two or more portions of memory in two or more physical locations.
  • The memory could include or utilize memory stores from the Internet, an Intranet, an Extranet, a LAN or some other source, or over some other network, as may be necessary or desired.
  • The invention may illustratively be embodied in the form of a computer or computer operating system. It is to be appreciated that the software that enables the computer operating system to perform the operations described above may be supplied on any of a wide variety of data holding media. Further, it should be appreciated that the implementation and operation of the invention may be in the form of computer code written in any suitable programming language, which provides instructions to the computer.
  • The software code or programming language that is utilized in a computer system to perform the various operations of the above-described invention may be provided in any of a wide variety of forms.
  • The software may be provided in the form of machine language, assembly code, object code, or source language, as well as in other forms.
  • The software may be in the form of compressed or encrypted data utilizing an encryption algorithm.
  • The particular medium utilized may take on any of a variety of physical forms.
  • The medium may be in the form of a compact disk, a DVD, an integrated circuit, a hard disk, a floppy diskette, a magnetic tape, a RAM, a ROM, or a remote transmission, as well as any other medium or source of information that may be read by a computer or other operating system.
  • The software of the method of the invention, which is utilized in operation of the robot system 10, may be provided in the form of a hard disk or be transmitted in some form using a direct wireless telephone connection, the Internet, an Intranet, or a satellite transmission, for example.
  • The programming language enabling the system and method of the invention as described above may be utilized on all of the foregoing and any other medium by which software or executable program code may be communicated to and utilized by a computer or other operating system.
  • The system and method of the invention may utilize an application program, a collection of separate application programs, a module of a program that is designed to handle data, or a portion of a module of a program, for example.
  • The computer language used in the system and method of the invention may be any of a wide variety of programming languages. Further, it is not necessary that a single programming language be utilized in conjunction with the operation of the system and method of the invention. Rather, any number of different programming languages may be utilized as is necessary or desirable.
  • A user interface may be in the form of a dialogue screen, for example.
  • A user interface includes any software, hardware or combination of hardware and software used in an operating system that allows a user to interact with the operating system.
  • A user interface may include any of a touch screen, keyboard, mouse, voice reader, voice recognizer, dialogue screen, menu box, list, checkbox, toggle switch, pushbutton or any other object that allows a user to receive information regarding the operation of the program and/or provide the operating system with information.
  • The user interface is any device that provides communication between a user and a computer.
  • The information provided by the user to the computer through the user interface may be in the form of a command, a selection or data, or other input, for example.
  • A user interface is utilized by an operating system running an application program to process data for a user.
  • A user interface is typically used by a computer for interacting with a user either to convey information or receive information.
  • The user interface of the invention may interact, i.e., convey and receive information, in communication with another operating system or computer, rather than a human user.
  • the user interfaces utilized in the system and method of the invention may interact partially with another operating system while also interacting partially with a human user.
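
Illustratively, the scripted sequence of tasks and segment-changing turns described in the list above may be expressed as a short routine. The following Python sketch is offered for illustration only; the segment numbers follow FIGS. 19a-19d, while the task names and print-based execution are hypothetical stand-ins for the robot system's actual task dispatch.

    # Hypothetical routine for the cleaning sequence of FIGS. 19a-19d.
    ROUTINE = [
        ("area segment 1220", ["apply cleaning solution", "mop"]),
        ("turn 90 degrees clockwise", []),   # turn places robot in segment 1240
        ("area segment 1240", ["apply cleaning solution", "scrub", "mop"]),
        ("turn 90 degrees clockwise", []),   # turn places robot in segment 1260
        ("area segment 1260", ["dust mop"]),
    ]

    def run_routine(routine):
        for step, tasks in routine:
            if not tasks:
                print("movement:", step)      # physical turn between area segments
            else:
                for task in tasks:
                    print(step + ":", task)   # function task within the segment

    run_routine(ROUTINE)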

Abstract

A robot touch shield device comprising a shell supported by at least one shell support member mounted on a base member, and a sensor device for sensing an exterior force applied to the shell, the sensor device having a base sensor portion having a center and a vertical member, the base sensor portion affixed on the base member, the vertical member affixed on the shell, the vertical member positioned over the center of the base sensor portion, wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.

Description

CROSS REFERENCE TO RELATED PATENT APPLICATIONS
This patent application is a Continuation-In-Part application of U.S. patent application Ser. No. 09/928,669, which was filed on Aug. 13, 2001.
FIELD OF THE INVENTION
The system and methods of the invention relate to utilizing a robot system with a touch shield to perform a function. More specifically, the invention relates to a service robot system and a method of utilizing a service robot system to perform a service function in an area.
BACKGROUND OF THE INVENTION
In all facets of today's society, people are relying on computers and robots to accomplish more on a day-to-day basis. Many industrial processes that, in the past, required human workers are now performed by robots controlled by computers. For example, the automotive industry relies heavily on robots in its automated manufacturing processes. With the reliance on computers and robots to perform simple functions, like cleaning or manufacturing, increasing every day, there is a need to make controlling these systems easier and more efficient, so that a computer or robot can operate efficiently and effectively with as little human direction as possible.
This need for efficient and effective mechanisms for controlling computer and robot systems can be seen in several industries. The industrial cleaning industry provides one example. Every night, throughout the country, thousands of janitors and cleaning people enter buildings, plants, airports, hotels and restaurants, for example, to clean these indoor spaces. Many of the cleaning functions provided by these personnel could also be performed by an autonomous robot system if there were an efficient and effective method of controlling it. As a result, many corporations, businesses and retailers could save precious business expenses if a robot system could perform the same functions.
Furthermore, it would be desirable to have a robot system that could operate for extended periods of time autonomously, without the need for extended human supervision. In this respect, a robot system could perform a series of tasks that free the robot system operator to perform other duties. This need can, once again, be seen in the industrial cleaning industry. For example, existing cleaning systems may autonomously clean an area, but then require an operator to move the system to the next area that requires service. This may require transport over areas that do not require any type of cleaning or other service. Thus, there is a need for a method of controlling a cleaning robot system such that the system can be given multiple tasks, in many different areas, wherein the robot system could finish the tasks in each different area without a human operator being required.
However, balanced along with the need for autonomously operating robot systems is the need for safe operation of these machines. Most industrial robots weigh in excess of several hundred pounds, and often possess the ability to exert a force greater than the robot's own weight. For robots that move autonomously, obstacle avoidance is critical to avoid damaging persons and property. Although these robots are often provided with laser and sonar sensors to detect objects that may be obstacles for a moving robot, occasionally these sensors do not detect an object before contact occurs. Accordingly, there is a need for a robot system with an improved device for detecting a force contact, i.e., unintended contact with an obstacle, applied to the robot system, such that the robot system ceases movement and then relieves the contact by moving in a direction away from the sensed force.
Accordingly, there is a need for an efficient and effective system and method for addressing these problems and others with respect to the utilization of robot systems and cleaning robot systems.
BRIEF SUMMARY OF THE INVENTION
In accordance with one embodiment of the method and system of the invention, a robot touch shield device comprising a shell supported by at least one shell support member mounted on a base member, and a sensor device for sensing an exterior force applied to the shell, the sensor device having a base sensor portion having a center and a vertical member, the base sensor portion affixed on the base member, the vertical member affixed on the shell, the vertical member positioned over the center of the base sensor portion, wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
In yet another embodiment of the invention, a robot system with a touch shield device is disclosed. The robot system comprising a processing portion for processing data in the robot system, a memory portion, the processor portion storing data in the memory portion and retrieving data from the memory portion, a transport portion for transporting the robot system from a first location to a second location, a body portion, the body portion containing at least one of the processor portion, the memory portion, and the transport portion, a touch shield device mounted on the body portion, the touch shield device having a shell supported by at least one shell support member mounted on a base member, and a sensor device for sensing an exterior force applied to the shell, the sensor device having a base sensor portion having a center and a vertical member, the base sensor portion affixed on the base member, the vertical member affixed on the shell, the vertical member positioned over the center of the base sensor portion, wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output signal representing at least one of a direction of the exterior force applied and the degree of the exterior force applied, the processor portion monitoring the output signal produced by the base sensor portion and commanding the robot system to cease navigating and maneuver on an exit path away from the exterior force applied to the shell.
In a further embodiment of the invention, a method of utilizing a robot system with a touch shield device comprising the steps of commanding the robot system to perform a function in an area, the function having at least one function task, the area having an area layout including at least one area segment; accessing by the robot system a stored map of the area layout, the stored map having at least one function task associated with the at least one area segment; localizing a first position of the robot system in the area; determining a function path by the robot system from the first position of the robot system for navigation of the area and completion of the at least one function task; repeatedly continuously localizing a current position of the robot system while navigating the robot system along the function path; repeatedly continuously monitoring by the robot system the touch shield device for obstacles in the function path, the touch shield device having a shell supported by at least one shell support member mounted on a base member, and a sensor device for sensing an exterior force applied to the shell, the sensor device having a base sensor portion having a center and a vertical member, the base sensor portion affixed on the base member, the vertical member affixed on the shell, the vertical member positioned over the center of the base sensor portion; wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
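By way of illustration, the direction and degree of the exterior force may be recovered from the sensed X-Y displacement of the vertical member relative to the center of the base sensor portion. The following minimal Python sketch assumes a normalized, joystick-style displacement reading; the deadband value and function name are assumptions made for the example, not details taken from the disclosure above.

    import math

    DEADBAND = 0.02   # hypothetical noise threshold; shell treated as centered below it

    def sense_exterior_force(dx, dy):
        # dx, dy: displacement of the vertical member relative to the center
        # of the base sensor portion, normalized to full-scale deflection.
        magnitude = math.hypot(dx, dy)
        if magnitude < DEADBAND:
            return None                                        # no exterior force applied
        direction = math.degrees(math.atan2(dy, dx)) % 360.0   # direction of the force
        degree = min(magnitude, 1.0)                           # degree of the force, 0 to 1
        return direction, degree

    print(sense_exterior_force(0.3, 0.3))   # contact displacing the shell toward 45 degrees

A processor portion monitoring such an output could, on contact, cease navigating and select an exit heading along the sensed displacement vector, i.e., away from the point of contact.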
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention can be more fully understood by reading the following detailed description of the presently preferred embodiments together with the accompanying drawings, in which like reference indicators are used to designate like elements, and in which:
FIG. 1 is a block diagram showing a robot system in accordance with one embodiment of the system and method of the invention;
FIG. 2 is a block diagram showing the processor portion of FIG. 1 in further detail in accordance with one embodiment of the system and method of the invention;
FIG. 3 is a block diagram showing the device subsystem portion of FIG. 2 in further detail in accordance with one embodiment of the system and method of the invention;
FIG. 4 is a block diagram showing the motion subsystem portion of FIG. 2 in further detail in accordance with one embodiment of the system and method of the invention;
FIG. 5 is a block diagram showing the memory portion of FIG. 1 in further detail in accordance with one embodiment of the system and method of the invention;
FIG. 6 is a block diagram showing the interaction portion of FIG. 1 in further detail in accordance with one embodiment of the system and method of the invention;
FIG. 7 is a block diagram showing the cleaning portion of FIG. 1 in further detail in accordance with one embodiment of the system and method of the invention;
FIG. 8 is a block diagram showing the transport portion of FIG. 1 in further detail in accordance with one embodiment of the system and method of the invention;
FIG. 9 is a flowchart showing a method of utilizing a robot system to perform a function in an area in accordance with an embodiment of the method and system of the invention;
FIG. 10 is a flowchart showing the “robot system determines a function path” step of FIG. 9 in further detail in accordance with an embodiment of the method and system of the invention;
FIG. 11 is a flowchart showing a method of mapping an area utilizing a robot system in accordance with one embodiment of the method and system of the invention;
FIG. 12 is a flowchart showing a method of mapping an area utilizing a robot system in accordance with yet another embodiment of the method and system of the invention;
FIG. 13 is a flowchart showing a method of storing a map of an area layout in accordance with one embodiment of the system and method of the invention;
FIG. 14 is a flowchart showing a method of associating a function task with an area segment on a map of an area layout in accordance with yet another embodiment of the method and system of the invention;
FIG. 15 is a diagram of an illustrative area layout in accordance with one embodiment of the method and system of the invention;
FIG. 16 is a flowchart showing a method of editing a map of an area layout in accordance with one embodiment of the system and method of the invention;
FIG. 17 is a flowchart showing a method of editing a map of an area layout in accordance with a further embodiment of the method and system of the invention;
FIG. 18a is a diagram of an unedited area layout in accordance with one embodiment of the method and system of the invention;
FIG. 18b is a diagram of an edited area layout in accordance with one embodiment of the method and system of the invention;
FIG. 19a is a diagram of an illustrative area layout in accordance with one embodiment of the method and system of the invention;
FIG. 19b is a diagram of the illustrative area layout of FIG. 19a in further detail in accordance with one embodiment of the method and system of the invention;
FIG. 19c is a diagram of the illustrative area layout of FIG. 19a in further detail in accordance with one embodiment of the method and system of the invention;
FIG. 19d is a diagram of the illustrative area layout of FIG. 19a in further detail in accordance with one embodiment of the method and system of the invention;
FIG. 20 is an isometric view of an illustrative robot with touch shield in accordance with one embodiment of the method and system of the invention;
FIG. 21 is an isometric view of an illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention;
FIG. 22 shows a planar view of an illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention;
FIG. 23 shows a planar view of the illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention;
FIG. 24 is an isometric view of an illustrative robot without a touch shield device shell in accordance with one embodiment of the method and system of the invention;
FIG. 25 is a planar view of the illustrative robot without a touch shield device shell of FIG. 24, in further detail, in accordance with one embodiment of the method and system of the invention;
FIG. 26 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention;
FIG. 27 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention;
FIG. 28 is an illustrative flowchart showing a method of utilizing a robot with a touch shield to perform a function in an area in accordance with one embodiment of the method and system of the invention;
FIG. 29 is an isometric view of an illustrative robot without a touch shield device shell in accordance with one embodiment of the method and system of the invention;
FIG. 30 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention; and
FIG. 31 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention.
DETAILED DESCRIPTION OF THE INVENTION
In accordance with one embodiment, the invention provides a method of utilizing a robot system, the method comprising the steps of commanding the robot system to perform a function in an area, the area having an area layout including at least one area segment. The method further includes accessing by the robot system a stored map of the area layout, the stored map having at least one function task associated with the at least one area segment, localizing a first position of the robot system in the area, and determining a function path from the first position of the robot system for navigation of the area and completion of the at least one function task. Lastly, the method includes repeatedly continuously localizing a current position of the robot system while navigating the robot system along the function path, and completing the at least one function task that is associated with the current position of the robot system on the stored map of the area, for example.
Further details of the systems and methods of the invention will hereinafter be described. As used herein, items referred to in the singular may also be in the plural, and items referred to in the plural may also be in the singular.
As used herein, a “robot” or “robot system” or “cleaning robot system” is a stand-alone system, for example, that is mobile, that performs both physical activities and computational activities. The physical activities may be performed using a wide variety of movable parts including cleaning devices and tools, for example. The computational activities may be performed utilizing a suitable processor and memory stores, i.e., a data memory storage device, for example. The computational activities may include processing information input from various sensors or other inputs of the robot system to perform commanded functions; processing the input information, as well as other data in the memory stores of the robot system, to generate a variety of desired information; or outputting information that has been acquired or produced by the robot system to a desired destination, for example.
As used herein, the term “area” is a distinct part or section of an environment, surroundings or space, that is set aside from other parts or sections, for example. An area may include, but not be limited to, a part or section of a store, factory, warehouse, shop, mall, fair, outside market, display area, hospital, law firm, accounting firm, restaurant, commercial office space, convention center, hotel, airport, arena, stadium, outdoor venue or any other space either inside a structure or outside in which boundaries may be provided for the surroundings, for example. An area may describe a two dimensional plot or three dimensional space, for example. Accordingly, an area could be mapped utilizing coordinates in the X and Y axis, or using coordinates in the X, Y and Z axis.
As used herein, the term “area layout” is an arrangement, plan, or structuring of an area, for example. An area layout may define the walls of a structure, different zones within a building, or other structural features of an area, for example. As used herein, the term “area segment” is any portion, part or section of an area layout that can be divided or subdivided. Therefore, an area segment could include a hallway, doorway, staircase, or other section of an area layout, for example. It should be appreciated that an area segment does not have to be separated from other area segments by physical boundaries, need not be contiguous, and may overlap other area segments. In general, an area segment can be defined in any suitable manner, as desired by an operator, for the performance of commanded tasks.
As used herein, the term “function” describes any assigned duty, activity, service, assignment or role, that an operator commands a robot system to perform. For example, the robot system may be commanded to perform a cleaning function, security function, entertainment function, or other services. In addition, as used herein, the term “function task” describes a piece of work assigned or done as part of a robot system's function. It should be appreciated that in order to perform its function, a robot system must complete at least one function task associated with the function.
As used above and herein, the term “cleaning function” describes any function associated with cleaning of an area. For example, a cleaning function may include, but not be limited to, rinsing, wringing, flushing, wiping, mopping, dust mopping, sponging, scouring, abrading, grinding, leveling, swabbing, scrubbing, scraping, stripping, sanding, brushing, washing, drying, laving, laundering, applying detergent to, applying abrasive to, clearing, disinfecting, irradiating, deodorizing, whitewashing, fumigating, applying antimicrobial agents to, sweeping, vacuuming, soaking, removing stains and soil marks from, waxing, buffing, utilizing a squeegee device on, applying cleaning solution to, dusting, bleaching, or shampooing a portion of an area.
Illustratively, an operator may develop a cleaning function, entitled “basic clean,” in which the robot system is programmed to sweep the floor of an area. This “basic clean” function would include two function tasks, in which the robot system would (1) navigate the area it is sweeping and (2) perform the sweeping.
In another example, an operator may associate a function with an area, such that when the robot system receives the command to perform the function, the function is associated with a known area. Illustratively, the operator may utilize a data map of the first floor of a building and the “basic clean” function. The operator could create a new function, entitled “basic clean 1st floor,” in which the robot system accesses a map of the first floor and function tasks associated with each section of the first floor. Moreover, suppose the first floor area is broken down into three sections A, B and C, in which sections A and C have tile floors that must be swept, but section B is carpeted and must be crossed to travel from section A to section C, or vice versa. In this example, the operator could program “basic clean 1st floor,” based upon the stored data map of the first floor, such that the robot would navigate and sweep section A, simply transport over section B without performing any cleaning task, and then navigate and sweep section C.
It should be appreciated from this example, and others described below, that a function may be very broad, and may include more than one function task. Furthermore, functions may be associated with stored maps of area, in which the area can be broken down into smaller segments, each having different function tasks associated with each. These segments do not have to be contiguous, may overlap other area segments, and in general, can be defined in any suitable manner as desired by an operator. Therefore, a robot system may, as part of performing its commanded function, perform one task in three different area segments within an area, or six tasks within one area segment in an area, for example. Accordingly, one of ordinary skill in the art should be able to understand that the steps in the design and programming of functions, function tasks, and the associations with stored maps of areas, area layouts and area segments, can be accomplished in several ways without deviating from the spirit and scope of the present invention.
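Illustratively, the association of the “basic clean 1st floor” function with area segments and their function tasks may be represented by a simple lookup structure. The following Python sketch is a minimal illustration; the segment keys and task names are hypothetical.

    # Hypothetical mapping of area segments to function tasks for the
    # "basic clean 1st floor" example.
    BASIC_CLEAN_1ST_FLOOR = {
        "A": ["navigate", "sweep"],   # tile floor: navigate and sweep
        "B": ["navigate"],            # carpeted: transport over only, no cleaning
        "C": ["navigate", "sweep"],   # tile floor: navigate and sweep
    }

    def tasks_for(segment):
        # Return the function tasks associated with an area segment.
        return BASIC_CLEAN_1ST_FLOOR.get(segment, ["navigate"])

    for segment in ("A", "B", "C"):
        print(segment, tasks_for(segment))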
Robot System
FIG. 1 is a block diagram showing a robot system 10 in accordance with one embodiment of the system and method of the invention. As shown in FIG. 1, the robot system 10 includes a control portion 20. The control portion 20 includes a processor portion 100 and a memory portion 600. The robot system 10 further includes an interaction portion 700, a cleaning portion 800, and a transport portion 900. The processor portion 100, the memory portion 600, the interaction portion 700, the cleaning portion 800 and the transport portion 900 are each connected to and in communication with the others through a data bus 30. However, it should of course be appreciated that any suitable communication interface might be utilized to connect the operating components of the robot system 10.
The components of the robot system 10 as described above perform a wide variety of operations. The processor portion 100 monitors and controls the various operations of the robot system 10 as described in detail below. The memory portion 600 serves as a memory store for a wide variety of data used by the processor portion 100 as well as the other components of the robot system 10. As described below, the interaction portion 700 includes a variety of operational components that are controlled by the processor portion 100. Illustratively, the interaction portion 700 includes components that allow navigation of the robot system 10 and interaction with operators.
The robot system 10 further includes a cleaning portion 800. The cleaning portion 800 also includes a variety of components which are in communication with the processor portion 100 in accordance with some embodiments of the invention. The components contained in the cleaning portion 800 perform a variety of cleaning function tasks in the area in which the robot system is operating.
The robot system 10 further includes a transport portion 900. The transport portion 900 is controlled by the processor portion 100 based on data input to the processor portion 100. The transport portion 900 provides mobile capabilities to the robot system 10. The transport portion 900 may include a mechanical system of wheels or an electromechanical system, for example. Further details of the transport portion 900 are described below.
The block diagram of FIG. 1 illustrates various operating components of the robot system 10. It should be appreciated that the operating components of the robot system 10, or select operating components of the robot system 10, may be encased or enclosed in a suitable body or body portion 40, as illustrated in FIG. 1. Alternatively, it should be appreciated that the operating components of the robot system 10 may simply be suitably disposed on a support framework or structure.
FIG. 2 is a block diagram showing in further detail the processor portion 100. As shown in FIG. 2, the processor portion 100 includes a general operating portion 200, a device subsystem portion 300, a motion subsystem portion 400 and a cognition subsystem portion 500. The cognition subsystem 500 might be characterized as the “brain” of the robot system 10. The components of the processor portion 100 allow the robot system 10 to interact with operators and other robot systems, navigate within an area, and perform function tasks in the area. The general operating portion 200 controls general operations of the processor portion 100 not otherwise handled by the other processor portions. For example, the general operating portion 200 controls system and memory backup operations, virus protection processes and processor multitasking monitoring and control, for example.
The device subsystem portion 300 is responsible for controlling a variety of devices in the interaction portion 700, as well as devices in the cleaning portion 800. Illustratively, such devices in the interaction portion 700 and the cleaning portion 800 may be electrical, electro-mechanical or mechanical devices. As described further below, these devices include, but are not limited to, sonar sensors, laser sensors, a touch shield device, shell, analog, optical or digital joystick sensor, odometry sensors, a gyroscope, a global positioning device (GPS), solution container, cleaning brush, vacuum device, squeegee device, a monitor, joy stick, magnetic strip readers, speakers, touch screens, keypads, a mouse, and motor controllers, for example. It should be appreciated that further devices may be included in the cleaning portion 800, such as a buffer device, waxing device, dryer device, mopping device, or other cleaning devices necessary to effectuate any of the above-described cleaning functions or tasks, for example.
The processor portion 100 also includes a motion subsystem portion 400. The motion subsystem portion 400 monitors and controls various navigational aspects of the robot system 10. Illustratively, the motion subsystem portion 400 determines the position of the robot system 10 in the area (i.e., localization), and controls navigation of the robot to different positions in the area and along the function path. Further aspects of the motion subsystem portion 400 are described below with reference to FIG. 4.
As noted above, the processor portion 100 also includes a cognition subsystem portion 500. The cognition subsystem portion 500 is essentially the brain of the robot system 10. The cognition subsystem portion 500 is responsible for all cognitive tasks performed in the robot system 10 including environment interaction, logic processes, and game logic processes, for example.
FIG. 3 shows the device subsystem portion 300 in further detail. As described above, the device subsystem portion 300 controls a variety of devices utilized in operation of the robot system 10. The device subsystem portion 300 includes an environment interface controller 310, a user interface controller 320, and a system support controller 340.
The various controllers (310, 320, 340) respectively control operational devices in the interaction portion 700. The environment interface controller 310 in general controls devices utilized to input information regarding the area, as well as to navigate within the area. The user interface controller 320 in general controls a variety of devices utilized to input information from operators, and other robot systems, and output responsive information. Further, the system support controller 340 controls a variety of devices, not otherwise controlled by the environment interface controller 310 or the user interface controller 320, that are used in operation of the robot system 10. Further aspects of the controllers (310, 320, 340) will be described with reference to FIG. 6 below.
Hereinafter further aspects of the motion subsystem portion 400 will be described with reference to FIG. 4.
FIG. 4 is a block diagram showing further details of the motion subsystem portion 400. The motion subsystem 400 includes a path planner portion 410, a path tracker portion 420, a localizer portion 430, and a Kalman filter portion 440. As noted above, the motion subsystem portion 400 monitors and controls a variety of navigational aspects of the robot system 10. The localizer portion 430 is responsible for gathering a variety of sensor information. For example, the sensor information may include laser data, sonar data, touch shield device data, shell sensor data, position sensor data (X-Y axis coordinate data) from an analog, optical or digital joystick sensor, odometry data, gyroscope data, global position system (GPS) data, pre-stored maps data, and x-y position system, i.e., grid data, for example. Accordingly, in one embodiment, the localizer portion 430 accesses a stored map of the area layout in which the robot system has been commanded to perform a function. The localizer portion 430 then localizes the robot system's position in the area and associates that current position with an actual position on the stored map. Utilizing the current position and the actual position on the stored map, the path planner portion 410 may determine a function path for the robot system to complete its assigned function and tasks. Furthermore, the localizer portion 430 utilizes the devices contained in the interaction portion 700 shown in FIG. 6. Accordingly, the localizer portion 430 determines the position and heading of the robot system 10 in the area in which the robot system 10 is operating.
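For example, associating the robot system's localized position with an actual position on the stored map may reduce to a nearest-position lookup. The following Python sketch assumes a toy map given as a list of (x, y) nodes; the data layout and function name are hypothetical.

    import math

    def nearest_map_position(pose, map_nodes):
        # Associate a localized pose (x, y) with the closest stored-map node.
        x, y = pose
        return min(map_nodes, key=lambda n: math.hypot(n[0] - x, n[1] - y))

    stored_map = [(0.0, 0.0), (5.0, 0.0), (5.0, 5.0)]    # toy map of an area layout
    print(nearest_map_position((4.2, 0.7), stored_map))  # -> (5.0, 0.0)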
The path planner portion 410, based on a variety of input, generates a desired path of travel from a current position of the robot system 10 to a final position in the area, to be reached once the assigned function and tasks have been completed. A suitable algorithm or other logic may be utilized by the path planner portion 410 to generate such a desired path of travel. According to one embodiment of the invention, the path planner portion 410 may utilize stored pre-determined function paths for given areas to complete assigned function tasks upon localization in the area.
It should also be appreciated that the localizer portion 430 may detect obstacles in the planned path of the robot system 10 while the robot system 10 is moving along the path. When an obstacle is detected, the localizer portion 430 communicates with the path tracker portion 420 to stop the movement of the robot system 10, and with the path planner portion 410 to generate an amended function path, thus avoiding the obstacle.
Once the path of travel is generated by the path planner portion 410, the path tracker portion 420 utilizes this information. Specifically, the path tracker portion 420 uses the path information from the path planner, as well as the position information from the localizer portion 430, to control the robot system 10 to move along the desired function path. The path tracker portion 420 may utilize a suitable logic or algorithm as is necessary or desired. The path tracker portion 420 may further include an obstacle avoidance planner portion which may track the position of obstacles detected in the area and in the function path of the robot system.
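Illustratively, a path tracker of this kind may repeatedly compute the distance and heading error from the current localized pose to the next position on the function path and feed those values to the motor controllers. The following Python sketch shows the geometry only; the pose convention is an assumption for the example.

    import math

    def steer_toward(pose, waypoint):
        # pose: (x, y, theta), theta in radians; waypoint: (x, y) on the function path.
        x, y, theta = pose
        dx, dy = waypoint[0] - x, waypoint[1] - y
        distance = math.hypot(dx, dy)
        bearing = math.atan2(dy, dx)
        # Normalize the heading error to the range [-pi, pi).
        heading_error = (bearing - theta + math.pi) % (2 * math.pi) - math.pi
        return distance, heading_error

    # Robot at the origin facing along +X; waypoint ahead and to the left.
    print(steer_toward((0.0, 0.0, 0.0), (1.0, 1.0)))   # ~ (1.414, 0.785)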
It should further be appreciated that the Kalman filter 440 may be employed by the localizer portion 430 to assist in the prediction of the robot system's position based on current sensor input. It should be appreciated that the various software pieces illustrated in the motion subsystem 400 perform separate tasks, as described above. However, it should be appreciated that in another embodiment of the system of the invention, two or more of these respective tasks may be performed by a single processor, or alternatively, the tasks performed by a particular component may be further broken down into multiple processing components.
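By way of illustration, the predict/update cycle that a Kalman filter performs can be shown in one dimension: odometry advances the position estimate, and a position measurement corrects it in proportion to the Kalman gain. An actual localizer would filter the full planar pose (x, y and heading); the noise values below are arbitrary assumptions.

    def kalman_step(x, p, u, z, q=0.01, r=0.1):
        # x: position estimate, p: estimate variance, u: commanded motion
        # (odometry), z: sensed position, q: process noise, r: measurement noise.
        x_pred = x + u          # predict: advance the estimate by the motion
        p_pred = p + q          # uncertainty grows with motion
        k = p_pred / (p_pred + r)            # Kalman gain
        x_new = x_pred + k * (z - x_pred)    # update: blend in the measurement
        p_new = (1.0 - k) * p_pred
        return x_new, p_new

    x, p = 0.0, 1.0             # initial estimate and variance
    for u, z in [(1.0, 1.1), (1.0, 1.9), (1.0, 3.05)]:
        x, p = kalman_step(x, p, u, z)
    print(round(x, 2))          # estimate near 3.0, with shrinking variance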
As shown in FIG. 1, the control portion 20 in the robot system 10 includes a memory portion 600. The memory portion 600 stores a variety of data utilized in operation of the robot system 10. FIG. 5 is a block diagram showing further details of the memory portion 600.
As shown in FIG. 5, the memory portion 600 includes a general operating memory 610, a device subsystem memory 620, a motion subsystem memory 630, and a cognition subsystem memory 640.
The general operating memory 610 in the memory portion 600 stores a wide variety of data used by the general operating portion 200 in general operations of the robot system 10. Furthermore, the general operating memory 610 contains specialty data stores in accordance with one embodiment of the system of the invention. Specifically, the general operating memory 610 includes a stored area memory 612, a function path mapping memory 614, a function task memory 616 and a function path editing memory 618.
These data stores contain various information related to the particular area within which the robot system 10 is operating. Illustratively, the stored area memory 612 may contain data and/or files relating to area layouts and area segments. The function task memory 616 may contain data and/or files relating to functions, function commands, function tasks, and associations between this information and data on areas, area layouts and area segments. The function path mapping memory 614 may store data and input information acquired by the robot system 10 and utilized to produce maps of different areas. In addition, the function path editing memory 618 may store a wide variety of data related to the editing of stored maps of areas, area layouts or area segments, as is necessary or desired. While the general operating memory 610 of FIG. 5, as well as the other memories (620, 630, 640), illustrates one embodiment of the system of the invention, it should of course be appreciated that the various memory stores are widely adaptable to the particular operating environment in which the robot system 10 is disposed and may be altered accordingly. It should be appreciated that these data stores may contain information on several different areas wherein the robot system may be called upon to perform various functions.
The memory portion 600 also includes the device subsystem memory 620. The device subsystem memory 620 is the memory store utilized by the device subsystem portion 300. Accordingly, the device subsystem memory 620 stores a variety of information related to operation of the devices in the interaction portion 700, which are controlled by the device subsystem portion 300.
The memory portion 600 also includes a motion subsystem memory 630. The motion subsystem memory 630 is the memory store utilized by the components of the motion subsystem portion 400, i.e., the path planner portion 410, the path tracker portion 420, the localizer portion 430 and the Kalman filter 440. In addition to the general memory stores contained in the motion subsystem memory 630, there are also specialty data stores. Specifically, the motion subsystem memory 630 includes a path planner memory 632 and a path tracker memory 634 utilized by the path planner portion 410 and the path tracker portion 420, respectively.
The memory portion 600 also includes a cognition subsystem memory 640. The cognition subsystem memory 640 serves as the memory store for the cognition subsystem portion 500, which is shown in FIG. 2. Accordingly, the cognition subsystem memory 640 contains a wide variety of data utilized in the cognitive operations performed by the cognition subsystem portion 500. Illustratively, the cognition subsystem memory 640 may contain data associating particular observed inputs with desired outputs.
It should be appreciated that the various memory components contained in the memory portion 600 may take on a variety of architectures as is necessary or desired by the particular operating circumstances. Further, the various memory components of the memory portion 600 may exchange data or utilize other memory component data utilizing known techniques such as relational database techniques.
Hereinafter, further details of the device subsystem portion 300, shown in FIG. 3, will be described in conjunction with the interaction portion 700 shown in FIG. 6. As described above, the interaction portion 700 contains various components to collect information and data from the area in which the robot system 10 is operating, as well as to output information and data. The interaction portion 700 includes an environment interface portion 710, a user interface portion 720 and a system support portion 740. The environment interface portion 710 collects various information regarding the area in which the robot system 10 is operating, as well as information regarding travel of the robot system 10 through the area. The user interface portion 720 generally provides operator interaction capabilities. That is, the user interface portion 720 is controlled by the processor portion 100, or components thereof, to interface with an operator or other robot system including inputting commands and outputting data or information relating to the areas and functions. The system support portion 740 contains a variety of operational components of a more general type not contained in either the environment interface portion 710 or the user interface portion 720. Hereinafter, further details of the portions (710, 720, 740) will be described.
The environment interface portion 710 as shown in FIG. 6, contains a variety of components to input information from the area in which the robot system 10 is operating. More specifically, the environment interface portion 710 may include sonar sensors 711, laser sensors 712, an odometry sensor 713, a global position (GPS) device 714, microwave sensors 715, Doppler radar sensors 716, a gyroscope 717, motion sensors 718, and touch shield device 760, as described hereinafter, for example. Further, it should be appreciated that other suitable input devices might be utilized such as an x-y grid device, drive control boards, a gaseous sensor, a heat sensor, a camera, a recording portion, air quality sensors, flame sensors, a metal detector, for example.
Each of the components in the environment interface portion 710 performs respective operations. The various components in the environment interface portion 710 are in general controlled by the environment interface controller 310, which includes sub-processing systems that respectively control some of the components in the environment interface portion 710. More specifically, the map generator controller 311 controls and receives feedback from the sonar sensors 711 and the laser sensors 712. Furthermore, the gyroscope controller 317 controls the gyroscope 717. The interface device controller 318 controls any additional devices in the environment interface portion 710 that are not controlled by the map generator controller 311 or the gyroscope controller 317.
Hereinafter, further aspects of the devices shown in the environment interface portion 710 of FIG. 6 will be described in further detail. As noted above, the environment interface controller 310 monitors and controls the various devices contained in the environment interface portion 710.
The environment interface portion 710 includes sonar sensors 711 and laser sensors 712, which are both responsible for “localization,” or determining the position of the robot system 10 within the area in which the robot system is operating. The sonar sensors 711 and laser sensors 712 can be utilized to determine positions of the robot system 10 in the area, both during navigation of the function path and during completion of tasks. Accordingly, in one embodiment of the invention, the robot system 10 utilizes the sonar sensors 711 and laser sensors 712 to continuously localize the position of the robot system 10 in the area in which the robot system 10 is operating.
Each of the sonar sensors 711 and the laser sensors 712 is controlled by the map generator controller 311. The sonar sensors 711 and the laser sensors 712 provide the map generator controller 311 with various spatial information regarding the surrounding area in which the robot system 10 is operating. Specifically, the sonar sensors 711 transmit and detect reflected acoustic waves to determine surrounding objects. The sonar sensors 711 in accordance with the system of the invention can detect both static objects, i.e., such as a wall, and dynamic objects such as a moving person. The sonar sensors 711 convey gathered information back to the map generator controller 311. Thereafter, the map generator controller 311 outputs the processed information to other components in the robot system 10 as is desired or necessary.
Similarly, the laser sensors 712 gather information utilizing laser technology. For example, the laser sensors 712 may include a device that converts incident electromagnetic radiation of mixed frequencies to coherent visible radiation having one or more discrete frequencies. The reflection of this radiation off surrounding objects may be sensed by the laser sensors 712 so as to determine the surrounding area specifics. The laser sensors 712 in accordance with the system of the invention can also detect both static objects, i.e., such as a wall, and dynamic objects such as a moving person. The laser sensors 712 convey gathered information back to the map generator controller 311. Thereafter, the map generator controller 311 outputs the processed information to other components in the robot system 10 as is desired or necessary.
As described above, the sonar or laser sensors, as well as other sensor input devices, may be utilized in the system of the invention to produce a map of an area in which the robot system will operate. It should be appreciated that the robot system may receive a command to map an area, for example. Accordingly, the map generator controller 311 would utilize the information and data gathered by the sonar and laser sensors to create a map of the area sensed. Once the area layout of an area is determined, for example, an operator could utilize a stored map of an area to develop an increasingly enhanced navigational capability in the area for later navigation by the robot system. Alternatively, the robot system could generate a map of the area for alternate uses such as providing floor plans, emergency exit maps, for example.
Furthermore, it should be appreciated that the robot system may create updated maps of the area layout when obstacles are detected in the area layout. Once detected, the obstacles can be added to an amended map of the area layout that could be utilized from then on. Also, if the robot system continued to operate within the area layout and the obstacle were later removed, the robot system could either create another amended map that removes the obstacle, or return to the previously stored map. In this respect, the robot system could operate efficiently in an environment such as a grocery store where floor displays are constantly being moved and rearranged.
It should also be appreciated that the updating of stored maps by the robot system can be effectuated in several different fashions. For example, the operator may set the updating of a stored map to occur on the detection of an obstacle for the third consecutive cleaning cycle through a given area layout. Therefore, the operator may program the robot system to create an amended map only upon three detections of an obstacle to avoid unnecessary effort. The number of repeated detections of an obstacle before it is added to a map of the area layout can be determined by one of ordinary skill in the art based upon the needs of the given area layout.
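Illustratively, the repeated-detection rule described above may be implemented with a per-cell counter over a grid of the area layout. The following Python sketch uses the three-detection threshold from the example; the data layout and the immediate-removal policy are assumptions made for illustration.

    REQUIRED_DETECTIONS = 3   # consecutive cycles before the stored map is amended

    def update_stored_map(stored_obstacles, streaks, sensed):
        # stored_obstacles: set of (x, y) cells marked as obstacles on the map
        # streaks: dict of (x, y) -> consecutive cycles the cell was sensed occupied
        # sensed: set of (x, y) cells sensed occupied on the current cleaning cycle
        for cell in sensed:
            streaks[cell] = streaks.get(cell, 0) + 1
            if streaks[cell] >= REQUIRED_DETECTIONS:
                stored_obstacles.add(cell)       # amend the map with the obstacle
        for cell in list(streaks):
            if cell not in sensed:
                streaks[cell] = 0                # streak broken; do not amend
                stored_obstacles.discard(cell)   # obstacle gone; revert the map

    obstacles, streaks = set(), {}
    for cycle in range(3):
        update_stored_map(obstacles, streaks, {(2, 3)})
    print(obstacles)   # -> {(2, 3)} only after the third consecutive detection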
Alternatively, it should be appreciated that a robot system in accordance with the system of the invention might be manually taught regarding its surroundings. Accordingly, the robot system could receive data and input relating to obstacles in the environment as well as the location of beacons relative to those objects. In another embodiment, an operator may edit a pre-determined stored map, change the nature of mapped boundaries in the area layout, i.e., increase the size of a room, add a doorway, or place a pillar in a hallway, and the robot system would receive updated information on the area. In addition to a training process or a manual process, other suitable techniques may be utilized to provide improved navigational capabilities to the robot system.
The environment interface portion 710 also includes an odometry sensor 713. The odometry sensor 713 may monitor the distance traveled by the robot system 10 for any of a variety of purposes. For example, the distance traveled by the robot system 10 may be utilized in combination with a stored map of an area to provide an efficient function path for performing assigned tasks. Alternatively, the distance traveled might assist in estimations relating to when replenishment of the robot system 10 will be required. The information gathered by the odometry sensor 713, as well as the information gathered by the other components of the environment interface portion 710, may be stored in the device subsystem memory 620 in accordance with one embodiment of the system of the invention.
The environment interface portion 710 also includes a gyroscope 717. The gyroscope 717 is monitored and controlled by the gyroscope controller 317 in the environment interface controller 310 as shown in FIG. 3. Illustratively, the gyroscope 717 may include a known structure using orientational gyroscope technology, which comprises a spinning mass, the spin axis of which is allowed to rotate between low-friction supports so as to maintain its angular orientation with respect to initial coordinates when the spinning mass is not subjected to external torque. Accordingly, the gyroscope 717 provides feedback to the gyroscope controller 317 indicative of movement of the robot system 10. Alternatively, other gyroscope technology may be utilized. For example, tuning fork or ring laser gyroscope technology might be utilized in conjunction with the system and method of the invention.
As described above, the localizer portion 430 in the motion subsystem portion 400, as shown in FIG. 4, is responsible for gathering sensor information and determining the position and heading of the robot system 10. Accordingly, a purpose of the localizer portion 430 is to assist in navigation of the robot system 10 in its travels through the area in which the robot system 10 is operating, i.e., along a function path. The map generator controller 311 assists the localizer portion 430 in its operations. Specifically, the map generator controller 311 forwards the information it gathers from the sonar sensors 711 and/or the laser sensors 712 to the localizer portion 430.
The user interface portion 720, as shown in FIG. 6, contains a variety of components utilized to interface with an operator in an area. Specifically, the user interface portion 720 includes a touch screen 721, a keypad 722, a mouse 723, a joystick 724, speakers 725, a magnetic strip reader 726, i.e., a card reader, user buttons 727, and a monitor 728, for example. The user interface portion may include additional components including, for example, an armature, a microphone, or a printer.
The various components in the user interface portion 720 are controlled by the user interface controller 320, in general, or alternatively, by a sub-processing system of the user interface controller 320. Also, the user interface controller 320 outputs data to the speakers 725 so as to provide audible messages, automated alert signals for robot operation, or simulated speech and voice generation using the speakers, for example. In addition, the card reader controller 328 in the user interface controller 320 controls and inputs information from the magnetic strip reader 726. Similarly, the touch screen controller 333 in the user interface controller 320 controls and inputs information from the touch screen 721 and key pad 722.
Hereinafter, further aspects of the user interface portion 720 will be described. As set forth above, the user interface portion 720 includes a variety of devices used to operate the robot system in an area to perform a function. The devices in the user interface portion 720 are controlled by the user interface controller 320 or a sub-component thereof.
With further reference to FIG. 6, the user interface portion 720 includes a touch screen 721 and key pad 722 that are controlled by a specialized processing component in the user interface controller 320, which is the touch screen controller 333. An operator may use the touch screen 721 to input information into the robot system 10, i.e., commands to perform a function. In one embodiment, an operator may use the touch screen 721 to command a cleaning robot system to perform a cleaning function in a given area. For example, the touch screen 721 could be used to select a specific hallway, and the operator could specify that the floor of the hallway should be washed and waxed. In addition, an operator could utilize the key pad 722, or any combination of the touch screen and key pad, to give similar commands.
Further, the user interface portion 720 includes a mouse 723, a joystick 724, user buttons 727, and a monitor 728. Each of these additional components may be utilized to input a wide variety of information into the robot system 10. For example, each of these components could be utilized to command the robot system, including changing the function path.
The user interface portion 720 also includes a magnetic strip reader 726, i.e., a card reader. Alternatively, it should be appreciated that barcode or laser scanners might also be utilized. The magnetic strip reader 726 is controlled by the card reader controller 328. In one embodiment, the magnetic strip reader 726 may be utilized to identify an operator, wherein the robot system would not respond to a given command unless the command came from an identified and authorized operator, for example.
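As a hedged illustration of such operator gating, the following Python sketch accepts a command only when the swiped card matches a stored set of authorized identifiers. The AUTHORIZED_IDS set and the read_card_id callable are assumptions for the sketch, not elements of the patent.

    # Sketch of command gating by operator identity; AUTHORIZED_IDS and
    # read_card_id() are illustrative assumptions.
    AUTHORIZED_IDS = {"OP-1001", "OP-1002"}  # hypothetical operator badge IDs

    def accept_command(command, read_card_id):
        """Execute a command only when the swiped card is authorized."""
        operator_id = read_card_id()  # value returned by the magnetic strip reader
        if operator_id in AUTHORIZED_IDS:
            return command()          # command is a zero-argument callable
        return None                   # ignore commands from unauthorized operators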
The system support portion 740 as shown in FIG. 6 includes a variety of components used to support operation of the robot system 10. Specifically, the system support portion 740 includes a communication interface 742, a battery 746, an encoder device 748, and a security portion 749.
The devices in the system support portion 740 are controlled by respective processing components in the system support controller 340. That is, the communication interface controller 342 controls operation of the communication interface 742. The battery interface controller 346 controls operation of the battery 746. The encoder interface controller 348 controls operation of the encoder device 748. Lastly, the security controller 349 controls the security portion 749.
Hereinafter, further aspects of the components in the system support portion 740, as shown in FIG. 6, will be described. The system support portion 740 includes the communication interface 742. The communication interface 742 is controlled by the communication interface controller 342, as shown in FIG. 3. The communication interface 742 provides for transmission of data both to the robot system 10 and from the robot system 10. In accordance with one embodiment of the invention, the communication interface 742 is a wireless device. Various communications techniques may be utilized to provide the wireless transmission both to and from the robot system 10, including radio, spread spectrum, infrared line of sight, cellular, microwave, or satellite techniques, for example. Further, the communication interface 742 may use wire technology, wherein a physical cable runs from the robot system 10 to a desired location, such as a modem, which may then be connected to the Internet, for example. In particular, the wire technology may be utilized where the robot system 10 is operated in a small defined area.
The system support portion 740 also includes a battery 746. The battery 746 is monitored and controlled by the battery interface controller 346. The battery 746 may be any suitable type including lithium polymer, nickel cadmium, nickel hydride, lead acid, lithium ion, zinc air or alkaline, for example. Further, it should be appreciated that a plurality of batteries may be utilized, of the same or different types. This may be preferable in that the various processing systems and operational devices utilized on the robot system 10 may optimally utilize different types of batteries for enhanced performance. The battery interface controller 346 monitors the battery 746, or alternatively the batteries, for possible malfunctions and recharging requirements. Once the battery interface controller 346 determines that a battery 746 requires recharging or replacement, the battery interface controller 346 works in conjunction with the other processing portions and devices to effect travel of the robot system 10 to a recharging station, in accordance with one embodiment of the invention.
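Illustratively, the recharge decision described above might reduce to a threshold check, as in the Python sketch below. The 20% threshold and the callable names are assumptions for illustration only.

    # Illustrative battery supervision check; the 20% threshold and the
    # navigate_to() call are assumptions for the sketch.
    RECHARGE_THRESHOLD = 0.20  # fraction of full charge

    def supervise_battery(read_charge_fraction, navigate_to, docked):
        """Send the robot to the recharging station when charge is low."""
        if read_charge_fraction() < RECHARGE_THRESHOLD and not docked():
            navigate_to("recharging_station")  # hypothetical waypoint name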
The system support portion also includes a security portion 749 that is controlled by the security controller 349 in the system support controller 340. The security portion 749, working in conjunction with the security controller 349, provides various capabilities related to security of both the area in which the robot system 10 is operating and the robot system 10 itself. For example, the security controller 349 may provide theft detection capabilities. Illustratively, the security portion 749 may include a proximity sensor that interacts with a base station, an embodiment of which is described below. As a result, once the robot system 10 is a predetermined distance away from the base station, various operations may be performed, such as sounding an audio alarm, electronically transmitting a signal to close exit ways, or effecting certain mechanical operations such as locking the wheels of the robot system 10 so as to hinder transport.
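A minimal Python sketch of such a proximity-based theft response follows; the 30-meter limit and all callable names are illustrative assumptions, not the patent's implementation.

    # Sketch of a theft-detection check based on distance from the base
    # station; the 30 m limit and the callback names are assumptions.
    MAX_DISTANCE_M = 30.0

    def check_theft(distance_to_base_m, sound_alarm, notify_exits, lock_wheels):
        """Trigger protective actions once the robot strays too far."""
        if distance_to_base_m() > MAX_DISTANCE_M:
            sound_alarm()    # sound an audible alarm
            notify_exits()   # transmit a signal to close exit ways
            lock_wheels()    # lock the wheels to hinder transport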
Further, it should be appreciated that the processing capabilities of the security controller 349 may also utilize input devices controlled by the environment interface controller 310 and the user interface controller 320. For example, the security controller 349 may direct that a communication or message, describing an emergency condition, be dispatched to an emergency services provider, such as the police, fire department or building manager. Accordingly, if the robot system 10 detected smoke in an operation area, the security controller 349 could alert the fire department of the emergency. This monitoring is controlled by the security controller 349, working in conjunction with the other processing systems.
As described above, FIG. 1 includes a cleaning portion 800 in accordance with one embodiment of the system and method of the invention. FIG. 7 is a block diagram showing the cleaning portion 800 in further detail. In general, the cleaning portion 800 includes various operational components providing additional capabilities to the robot system 10, i.e. the ability to perform cleaning functions.
In accordance with this embodiment of the invention, the components in the cleaning portion 800 provide feedback to the device subsystem portion 300, which monitors and controls operation of the components of the cleaning portion 800.
In accordance with one embodiment of the system and method of the invention, the cleaning portion 800 includes a solution container 810, applicator nozzles 812, a cleaning brush 820, a vacuum device 830, and a squeegee device 840.
The solution container 810 is a physical container that is disposed on the robot system 10 in accordance with one embodiment of the invention. For example, the solution container 810 may be disposed on or within the body portion 40 so as to be accessible by an operator, i.e., for filling and re-filling cleaning solution. It should be appreciated that a sensor or sensors may be disposed in the solution container 810 to monitor the quantity of solution disposed therein. Accordingly, once such sensors convey feedback to the device subsystem portion 300 that the quantity of solution is sufficiently diminished, the device subsystem portion 300 may effect a desired action. For example, the device subsystem portion 300 may effect travel of the robot system 10 to a predetermined location such that the solution container 810 may be refilled. Additionally, the cleaning portion 800 also includes applicator nozzles 812 for use in the application of the cleaning solution during operation of the robot system 10.
The cleaning portion 800 also includes a cleaning brush 820. The brush 820 is disposed on the body portion 40 such that it may be utilized to perform a cleaning function upon command. Similarly, a vacuum device 830 is provided in the cleaning portion 800, and disposed on the body portion 40 such that it may be utilized to vacuum an area upon command. Lastly, a squeegee device 840 is provided in the cleaning portion 800, and disposed on the body portion 40 such that it may be utilized to perform a cleaning function upon command.
As described above, the robot system 10 includes a transport portion 900. The transport portion 900 is controlled by the device subsystem portion 300. In summary, the device subsystem portion 300 inputs various information from the interaction portion 700 including operator commands, for example. As a result, the robot system 10 moves to a position and location where the robot system 10 can complete the function it has been commanded to perform. Working in conjunction with the motion subsystem portion 400, the device subsystem portion 300 utilizes the transport portion 900 to effect this movement of the robot system 10.
The transport portion 900 as shown in FIG. 8 controls various mechanical or electromechanical components needed to effect physical movement of the robot system 10. Specifically, the transport portion 900 includes a motor 910 used to drive the wheels 920. The motor 910 may be powered by the battery 746. Further, various directional devices and sensors may be utilized as needed or desired. In one embodiment, the robot system 10 utilizes data gathered from the sonar sensors 711 and laser sensors 712 to continuously localize the position of the robot system 10, and to guide the movement of the robot system 10 along a determined function path. In addition, the transport portion 900 may utilize a gyroscope 930 to monitor and control the direction of travel of the robot system 10.
As described above, the robot system 10 includes a touch screen 721, which provides a graphical user interface (GUI). The graphical user interface is a device that utilizes a separate passive process performing two functions: (1) displaying images on the touch screen as controlled by the cognition subsystem portion 500, and (2) informing the cognition subsystem portion 500 of the location of any touches on the touch screen 721. In accordance with one embodiment, all logic is removed from the graphical user interface, i.e., the touch screen 721. As a result, a clean division between the robot system's “brain” and the graphical user interface operation, i.e., the operation of a device, is achieved.
Hereinafter, additional features in accordance with further embodiments of the method and system of the invention will be described. It should be appreciated that the various features and embodiments described herein may be utilized in combination with a variety of known technology. For example, the above features and embodiments may be used in conjunction with the features described in U.S. Pat. No. 5,548,511, which is incorporated herein by reference in its entirety, and U.S. Pat. No. 6,124,694, which is incorporated herein by reference in its entirety. Further, it should be appreciated that the various embodiments and features described herein may be used in conjunction with features in U.S. patent application Ser. No. 09/906,216, Attorney Docket No. 55274.000014, directed to a system for a retail environment, which is incorporated herein by reference in its entirety, and with features in U.S. patent application Ser. No. 09/906,159, Attorney Docket No. 55274.000018, directed to methods for facilitating a retail environment, which is incorporated herein by reference in its entirety.
It should be appreciated that a wide variety of interactions may be performed between different robots in accordance with the method and system of the invention. The robots may be in communication utilizing any suitable wireless technology, such as a gateway, for example. For example, one robot could be washing a first area while another robot is vacuuming a second area. At a predetermined time, the robots might communicate with each other to determine whether each robot is done with its respective task, thus allowing the robots to switch areas to perform the other function, for example. In addition, if the system determines that one robot is low on cleaning solution, a second robot could be dispatched from another area to finish the washing.
Additionally, it should be appreciated that one robot might communicate with multiple other robots. For example, one robot, while busy performing one function, might be commanded to perform a second function. As a result, that robot might communicate with a fleet of robots in the area to determine which robot in the fleet is available to perform the second function. As noted above, a gateway might be utilized to route communication between the robots. The gateway might be characterized as a traffic controller or a coordinator between the various robots.
In accordance with a further embodiment of the system and method of the invention, it should be appreciated that a particular robot may be guided by a device, such as a mouse, in a remote location. To explain, it should be appreciated that a camera mounted upon the robot may input information using the camera and communicate that visual information to a distant location, where a human operator is monitoring and controlling movement of the robot. Thus, using a mouse or other suitable device, the operator may control movement of the robot through a particular area and obtain visual information based on the travels of the robot.
The robot system 10 may communicate with the Internet, an Ethernet network, other network systems, or other processing systems, utilizing wireless technology. For example, the robot system 10 may use streaming video technology.
In accordance with one embodiment of the robot system 10, the body portion 40, as described above, is an outer shell. The outer shell may be formed in any of a wide variety of shapes depending on the area and functions in which the robot system is to be used.
The robot system 10 may also utilize voice recognition techniques in its operations. The voice recognition techniques may identify a particular operator or, alternatively, accept a given command to perform a function.
As described above, a Kalman filter portion 440 may be utilized in the motion subsystem portion 400. For example, an off-the-shelf Kalman filter may be utilized in accordance with one embodiment of the system and method of the invention. The Kalman filter takes sensor input and processes that input to generate navigation information. Various sensor inputs may be utilized, including sonar information, odometry information, and gyroscope information. The Kalman filter may also be utilized to assess the accuracy of the various inputs.
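For illustration, a scalar Kalman filter fusing a gyroscope rate (as the prediction) with an odometry-derived heading (as the measurement) might be sketched in Python as follows. The noise variances are illustrative assumptions; an actual off-the-shelf filter would typically track multiple states (position and heading) rather than heading alone.

    # Minimal one-dimensional Kalman filter for a heading estimate,
    # fusing a gyroscope rate (prediction) with an odometry-derived
    # heading (measurement). Noise variances are illustrative.
    class HeadingKalman:
        def __init__(self, q=0.01, r=0.5):
            self.x = 0.0   # heading estimate (degrees)
            self.p = 1.0   # estimate variance
            self.q = q     # process noise (gyro drift) per step
            self.r = r     # measurement noise (odometry heading)

        def predict(self, gyro_rate_dps, dt):
            """Propagate the heading forward using the gyroscope rate."""
            self.x += gyro_rate_dps * dt
            self.p += self.q

        def update(self, odometry_heading_deg):
            """Correct the estimate with an odometry heading measurement."""
            k = self.p / (self.p + self.r)           # Kalman gain
            self.x += k * (odometry_heading_deg - self.x)
            self.p *= (1.0 - k)
            return self.x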
Further, an encoder device 748 may be utilized in the system support portion 740 as described above. The encoder device may be utilized to control operation of the drive wheels, for example. Illustratively, the processing portion or control portion of the robot system 10 may command the motors to turn the robot system 10 at a certain rate or, alternatively, to travel 4 feet. However, the motors have no inherent measure of what 4 feet is and, as a result, rely on feedback from an encoder mechanism, i.e., a disk or optical reader, to provide that information. A portion of the encoder mechanism is disposed on and spins with the wheels. For example, the encoder may utilize slots, and the control system knows that there are, for example, 1000 slots per revolution and that 4000 slot counts correspond to a travel distance of 4 feet. An optical encoder may be positioned on the drive shaft of the motor, or alternatively on a wheel, keeping track of wheel rotation. It should further be appreciated that the encoder need not actually be disposed on the driven wheel. For example, the encoder device could be disposed on a free-rolling trailing wheel. The rotation of the trailing wheel would provide the necessary feedback to a motor control mechanism to monitor the driven wheel, i.e., the travel of the robot system, as is necessary or desired.
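The encoder arithmetic in the example above can be made concrete with a short Python sketch. The wheel geometry chosen here (a one-foot circumference) is an assumption that makes 1000 slot counts correspond to one foot, matching the 4000-counts-per-4-feet example.

    # Worked example of the encoder arithmetic: with 1000 slots per
    # wheel revolution and a one-foot wheel circumference, 4000 slot
    # counts correspond to 4 feet of travel. Values are illustrative.
    import math

    SLOTS_PER_REV = 1000
    WHEEL_DIAMETER_FT = 1.0 / math.pi   # yields a 1-foot circumference

    def distance_traveled_ft(slot_count):
        """Convert encoder slot counts to linear travel in feet."""
        circumference_ft = math.pi * WHEEL_DIAMETER_FT
        return slot_count * circumference_ft / SLOTS_PER_REV

    def slots_for_distance(feet):
        """Counts the controller should wait for, e.g. 4000 for 4 ft."""
        circumference_ft = math.pi * WHEEL_DIAMETER_FT
        return round(feet * SLOTS_PER_REV / circumference_ft)

    assert slots_for_distance(4.0) == 4000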
As described above with reference to FIG. 6, the environment interface portion 710 may include a gyroscope 717. The gyroscope may be thought of as a rotational compass. It should be appreciated that various known techniques may be utilized in operation of the gyroscope. For example, appropriate techniques and devices may be utilized to prevent the gyroscope from drifting, particularly when less expensive gyroscopes are utilized. In accordance with one embodiment of the system and method of the invention, a filtering process may be utilized to effectively use data output by the gyroscope. For example, if a controller portion commands the robot to go straight and the wheels are experiencing slippage, the gyroscope will accurately inform the controller of rotation of the robot system 10. Accordingly, the gyroscope provides angular sensing and input, which is particularly useful when turning the robot.
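As a hedged sketch of the slippage scenario above: if the controller commands straight travel but the gyroscope reports a heading change, the difference indicates rotation due to slippage. The threshold and names below are illustrative assumptions.

    # Sketch of the slippage check described above; the threshold is an
    # illustrative assumption.
    SLIP_THRESHOLD_DEG = 2.0

    def detect_slip(commanded_heading_deg, gyro_heading_deg):
        """Return the heading error the gyroscope sees despite a
        'go straight' command; a nonzero result suggests wheel slippage."""
        error = (gyro_heading_deg - commanded_heading_deg + 180.0) % 360.0 - 180.0
        return error if abs(error) > SLIP_THRESHOLD_DEG else 0.0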
It should be appreciated that the robot system 10 may utilize a docking system. A “home position” is provided at which the robot docks in the “docking position.” When positioned in the docking position, the robot system 10 is electrically connected to a recharging source, for example. Accordingly, the robot system 10 may go out onto a floor of an area and work for a number of hours, at which time the robot navigates its way back to the home position. At the home position, the robot system 10 docks itself so as to provide for replenishment. With respect to recharging the batteries, the recharging may, for example, be performed utilizing an inductive-type pickup, wherein a plate is positioned in the floor and the robot system 10 is positioned over that plate so as to provide for charging utilizing inductive techniques. Of course, a variety of other items may be replenished on the robot system 10, including cleaning solution, wax, water, and other exhaustible items.
In accordance with one embodiment of the method and system of the invention, a touch shield or lower shield may be provided. The touch shield provides feedback to the robot such that, if the robot bumps into something, or if something bumps into the robot, the robot can determine where the impact came from, i.e., from the left, right, rear or front, for example. The touch shield is a physical element that surrounds a lower portion of the robot. The touch shield may be connected to the robot body using movement-sensitive linkages, for example.
As stated above, one embodiment of the robot system 10 may include a touch shield. An illustrative touch shield includes a shell and a joystick sensor device mounted on the body portion of the robot. To provide further understanding of an illustrative touch shield, FIGS. 20-27 illustrate a robot with a touch shield, in accordance with one embodiment of the method and system of the invention described above.
FIG. 20 is an isometric view of an illustrative robot with a touch shield in accordance with one embodiment of the method and system of the invention. As shown in FIG. 20, the robot system 10 includes a body portion 40; an environment interface portion 710, as embodied by sensors 711; a user interface portion 720, as embodied by the touch screen 721, user buttons 727, and keypad 722; and a transport portion 900, as embodied by wheels 920. It should be appreciated that while other portions of the robot system 10 may not be shown in FIG. 20, those portions and components are incorporated in this embodiment of the robot system 10.
As shown in FIG. 20, the environment interface portion 710 of the robot system 10 is further embodied by the inclusion of a touch shield device 760. The touch shield device 760 includes a shell 770 and a joystick sensor device 780. Shell 770 is supported by at least one shell support member 44 affixed to a base member 42. The base member 42 may be the body portion 40 of the robot system 10, or a part thereof, i.e., a robot chassis. Additionally, it should be appreciated that the base member 42 may be another physical element attached to the robot body portion 40.
When assembled, each shell support member 44 must be flexible yet self-centering, such that the shell 770, which is supported by each shell support member 44, can translate relative to the base member 42 when an exterior force is applied to the shell 770. Shell 770 is supported by shell support members 44, and mounted over base member 42 with a sufficient space between base member 42 and shell 770, such that shell 770 can move in any direction in a horizontal plane parallel to the base member 42, in response to the exterior force applied. An exterior force may come from a human touching the shell, or the shell contacting an object while the robot is moving, for example.
A shell support member 44 may include a rubber mount or column, a spring, a pneumatic cylinder, a hydraulic cylinder, or an air cylinder, for example. Rubber mounts or columns provide the additional benefit of being self-damping, thus allowing the rubber mount to self-center more easily than other potential support members. A suitable shell support member 44, or rubber mount, must be sufficiently flexible such that when an exterior force is applied to shell 770, the shell support member 44 bends and shell 770 moves relative to the base member 42. Moreover, the shell support member 44, or rubber mount, must be sufficiently sturdy such that the shell support member 44 returns to a neutral vertical alignment after the exterior force is removed from shell 770. A shell support member 44, or rubber mount, may be affixed to the base member 42, and a fastener may be threaded through the top panel 772 of shell 770 and into the shell support member 44, securing the shell 770 to the shell support member 44. Therefore, the placement of the shell 770 on the shell support members 44 is such that the shell 770 can translate relative to the base member 42, due to displacement of the shell support members 44, when an exterior force is applied to the shell 770. Further details of shell 770 are shown in FIGS. 21-23.
As described above, touch shield device 760 includes a shell 770 and a sensor device 780. The sensor device 780 includes a base sensor portion 782 and a vertical member 784. The base sensor portion 782, which may be a joystick base plate, is affixed to the base member 42. This may be done in any suitable manner including, for example, by screws, bolts or other fastening means. The vertical member 784, which may be an armature or pin, is likewise affixed to the shell 770 in any suitable manner. In a non-operational condition, the placement of the shell 770, with the vertical member 784 affixed, over the base member 42, with the base sensor portion 782 affixed, positions the vertical member 784 over the center 783 of the base sensor portion 782, in a zero degree (neutral) position. It should be appreciated that an adjustable centering device may also be utilized to center the vertical member 784 over the center 783 of the base sensor portion 782. Such an adjustable centering device may take the form of a planar member, i.e., a plexiglass disc or sheet, with the vertical member affixed to the planar member, and a plurality of fasteners, i.e., nut, bolt and washer combinations, integrally connected to the planar member and shell. For example, bolts may be threaded through a plurality of clearance holes in the top panel of the shell, and integrally threaded into and affixed to the planar member. The above-described adjustable centering device allows the vertical member to be centered over the base sensor portion, and also allows the position of the vertical member to be adjusted once the shell is supported on the shell support members. An additional viewing port in the shell may also be provided to allow the vertical member to be centered over the base sensor portion visually.
Accordingly, movement of shell 770 in response to an applied exterior force translates vertical member 784 away from the center 783 of the base sensor portion 782, i.e., the zero degree position, such that the base sensor portion 782 senses the angular direction and magnitude of the exterior force on the shell 770. The base sensor portion 782 senses the distance the vertical member 784 is displaced from center 783, i.e., the zero degree position, which allows the robot system 10 to determine the magnitude of the exterior force, as well as the angular direction in which the vertical member 784 is displaced from the center 783, which allows the robot system 10 to determine the direction from which the exterior force was applied to shell 770. Upon movement of the vertical member 784, the base sensor portion 782 produces an output to the processor portion signaling the exterior force on the shell 770. The output to the processor portion signals both the direction and the magnitude of the exterior force applied. The placement of the sensor device 780 in relation to the shell 770 and/or base member 42 is shown in FIGS. 24-27.
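For illustration, recovering the angular direction and magnitude of the exterior force from the planar displacement (dx, dy) of the vertical member relative to the zero degree position might look as follows in Python; the calibration constant is an assumption.

    # Sketch of recovering direction and magnitude of the exterior force
    # from the vertical member's planar displacement (dx, dy) relative
    # to the zero degree position; the scaling constant is assumed.
    import math

    FORCE_PER_UNIT_DISPLACEMENT = 1.0  # hypothetical calibration constant

    def sense_exterior_force(dx, dy):
        """Return (angle_deg, magnitude) of the force applied to the shell."""
        angle_deg = math.degrees(math.atan2(dy, dx)) % 360.0
        magnitude = math.hypot(dx, dy) * FORCE_PER_UNIT_DISPLACEMENT
        return angle_deg, magnitude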
FIG. 21 is an isometric view of an illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention. As shown in FIG. 21, shell 770 includes a top panel 772, with a front panel 776 and side panels 778 extending longitudinally downward therefrom. In this embodiment, the top panel 772 of shell 770 is formed in a “U” shape to allow the shell 770 to be placed over the base member 42 and supported on the shell support members 44.
It should be appreciated that further embodiments of the robot touch shield device shell may incorporate different designs, shapes, panels, or formations, without departing from the scope of the invention. For example, although shell 770, as shown in FIG. 21, does not include a rear panel, further embodiments of the invention may utilize a touch shield device shell that covers the base member from the top, front, rear and sides. Such an embodiment may employ a dome-type shell, which covers the base member from all directions except the side of the robot body portion 40 that is in contact with a floor surface. Additionally, the shell may be provided with clearance holes for the attachment of an adjustable centering device to the shell, for positioning of the vertical member in relation to the base sensor portion. A viewing portal in the shell may also aid positioning of the vertical member.
Returning to FIG. 21, shell 770 further includes shell mounting holes 773, by which the shell 770 is affixed to the shell support members 44. Also, shell 770 includes upward sensor ports 774, frontal sensor ports 777, and side sensor ports 779, in which sensors are placed to sense objects or obstacles in relation to the position of the robot system. Sonar sensors 711, i.e., ultrasonic transducers, are placed within the upward sensor ports 774, frontal sensor ports 777, and side sensor ports 779.
In one embodiment, as shown in FIG. 20, sonar sensors 711, i.e., ultrasonic transducers, are placed within the upward sensor ports 774, frontal sensor ports 777, and side sensor ports 779, such that input gathered by the sonar sensors 711 is output to the processor portion of the robot system 10. The processor utilizes the input from the sonar sensors to determine the position of walls in the area, the location of obstacles, or a mapping of the area. As described above, these sensors, as placed in the upward sensor ports 774, frontal sensor ports 777, and side sensor ports 779, can be arranged in a variety of directions and angles on the shell 770 to gather a full spectrum of information on the location of obstacles in the area. For example, the sonar sensors 711 placed in the upward sensor ports 774 gather information that allows the processor to determine if the shell 770 has moved under an obstacle, i.e., the overhang of a desk or table. Sonar sensors 711 placed in the frontal sensor ports 777 gather information that allows the processor to determine if an obstacle is in front of the robot system 10. Sonar sensors 711 placed in the side sensor ports 779 gather information that allows the processor to track walls or other obstacles on either side of the robot system 10, provide improved steering of the robot system 10, and map an area. For example, a plurality of sensors placed in side sensor ports at angles of 80°, 90° (perpendicular to the path of robot system 10), and 100° may provide additional steering and mapping capabilities.
FIG. 22 shows a planar view of an illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention. The embodiment shown in FIG. 22 illustrates the top panel 772 of the shell 770. As stated above, shell 770 includes shell mounting holes 773, by which the shell 770 is affixed to the shell support members 44, and sensor ports, i.e., upward sensor ports 774, for example.
FIG. 23 shows a planar view of the illustrative touch shield device shell in accordance with one embodiment of the method and system of the invention. The embodiment shown in FIG. 23 illustrates the underside of the shell 770 and top panel 772. The shell 770 includes side panels 778 and front panel 776. As stated above, shell 770 also includes shell mounting holes 773. FIG. 23 also illustrates sensors 711 placed within and filling upward sensor ports 774, from the underside of top panel 772. Vertical member 784 is illustrated in FIG. 23. Vertical member 784 is affixed to the underside of shell 770, such that in a non-operational condition, the placement of the shell 770 with the vertical member 784 affixed, over the base member 42 with the base sensor portion 782 affixed, positions the vertical member 784 over the center 783 of the base sensor portion 782, in a zero degree (neutral) position.
As described above, touch shield device 760 includes a shell 770 and a sensor device 780. The sensor device 780 includes a base sensor portion 782 and a vertical member 784. The placement of the sensor device 780 in relation to the shell 770, and/or base member 42 is shown in FIGS. 24-27.
FIG. 24 is an isometric view of an illustrative robot without the touch shield device shell in accordance with one embodiment of the method and system of the invention. In this view, shell 770 has been removed to provide a better understanding of the base sensor portion 782 and shell support members 44, as affixed to base member 42. As shown in FIG. 24, base sensor portion 782 is affixed to base member 42. The base sensor portion 782 may be affixed in any suitable manner to the base member 42, including, for example, with bolts 788. Also, shell support members 44 are affixed to base member 42, and extend vertically therefrom for attachment to shell 770. Brush 820, one component of the cleaning portion 800 of robot system 10, is also illustrated in FIG. 24. In this embodiment, brush 820 is affixed to the base member 42.
FIG. 25 is a planar view of the illustrative robot without a touch shield device shell of FIG. 24, in further detail, in accordance with one embodiment of the method and system of the invention. FIG. 25 illustrates the robot system 10 from the planar view, along vertical plane A-A′. Base member 42 extends from underneath the body portion 40 of the robot system 10. Brush 820 extends outwardly from beneath the base member 42. Base sensor portion 782 is affixed to base member 42 with bolts 788. Shell support members 44 are affixed to and extend vertically upward from base member 42 to support shell 770. Sensor 711 extends outwardly from the face of body portion 40.
To provide a better understanding of the placement of the sensor device 780 in relation to the shell 770, shell support members 44, and/or base member 42, FIGS. 26-27 are provided. It should be appreciated that while FIGS. 26-27 do not illustrate each component or portion of robot system 10, the embodiments of an illustrative touch shield shown therein may incorporate the descriptions and drawings of the embodiments shown and described in FIGS. 20-25.
FIG. 26 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention. With respect to the touch shield device 760, FIG. 26 illustrates a non-operational position. A non-operational position includes any point at which the vertical member 784 of joystick sensor device 780 is positioned over the center 783 of the base sensor portion 782. It should be appreciated that a non-operational position may include times when the robot system 10 is operational, and even moving, as long as an exterior force is not being applied to the shell 770 such that the shell 770 would be translated. Shell 770, with side panel 778, is supported by shell support members 44, which are affixed to base member 42. Brush 820 is also attached to base member 42. Base sensor portion 782 is bolted to base member 42, and vertical member 784 is centered over base sensor portion 782 and center 783. FIG. 27 illustrates the translation of shell 770, vertical member 784, and shell support members 44 when an exterior force is applied to shell 770.
FIG. 27 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention. With respect to the touch shield device, FIG. 27 illustrates the occurrence of an exterior force F applied to shell 770. The force F pushes on the shell 770, forcing the shell support members 44 to flex in response to the force. Once the shell support members 44 flex from the exterior force, and the shell 770 translates relative to the base member 42, vertical member 784 moves away from the center 783, i.e., the zero degree position, of the base sensor portion 782. Accordingly, the shell support members 44, which may be rubber mounts, support the shell 770 and deflect in a parallelogram fashion, such that the shell 770 moves in a plane parallel to the base member 42, in the direction of the applied force. Once the force F is no longer being applied to the shell 770, the shell support members 44 return to a vertical upright position, and the vertical member 784 returns to a centered non-operational position over center 783 of the base sensor portion 782.
In accordance with one embodiment of the invention, base sensor portion 782 is the joystick base plate of a joystick with position sensors incorporated therein. Vertical member 784 is an elongate rod-like element, or armature or pin, positioned such that vertical member 784 is vertically over center 783 of base sensor portion 782. In a non-operational condition, the placement of the shell 770, over the base member 42 with the base sensor portion 782 affixed, places the vertical member 784 over the center 783 of the base sensor portion 782, in a zero degree (neutral) position. Accordingly, movement of shell 770 in response to an exterior force applied translates vertical member 784 from over the center 783 of the base sensor portion 782, the zero degree position, such that the base sensor portion 782 senses the angular direction and magnitude of the exterior force on the shell 770. The base sensor portion 782 senses the distance the vertical member 784 is displaced from the zero degree position, which allows the robot system 10 to determine the magnitude of the exterior force, as well as the angular direction the vertical member 784 is displaced from the zero degree position over center 783.
In operation, a robot with a touch shield device is commanded to perform a function within an area. Consider, for example, that the robot system 10 is commanded to perform a cleaning function as shown in FIGS. 9-19 and the accompanying descriptions below. Once the robot system begins to navigate within an area, completing its assigned function task, the robot system continuously localizes its position and searches for obstacles in its path. Although the robot system 10 is provided with a plurality of sensors, sonar and laser, for example, not all obstacles can be detected before the robot comes into contact with the obstacle. However, as shown in FIGS. 20-27, the robot system 10 with touch shield device 760 provides an emergency stop mechanism for ceasing the movement of the robot system 10 upon contact with an exterior force, i.e., an obstacle.
When the shell 770 of touch shield device 760 has an exterior force applied to it, shell 770 translates on the deformed shell support members 44 in a plane parallel to base member 42, in the direction of the force. The shell 770, which is freely moveable in that it is supported by the flexible shell support members 44, can move in any direction depending on the angle of the exterior force applied to it. When translated, the shell 770 causes the affixed vertical member 784 to move in relation to its neutral position over center 783 of base sensor portion 782. The angle and degree of displacement of the vertical member 784 from over center 783 allow the robot system 10 to determine the direction and the magnitude of the exterior force. The movement of vertical member 784 from over center 783 triggers an interrupt signal to the processor portion 100, which commands the transport portion 900 to cease movement of the robot system 10. The processor portion 100, utilizing the information gathered from the base sensor portion 782, may then command the transport portion 900 to move the robot system 10 in a direction away from the exterior force and, consequently, away from the obstacle contacted. Once the obstacle is avoided, the robot system 10 functions as if it had detected a normal obstacle, and determines a new function path that allows the robot system 10 to continue its commanded function while avoiding the obstacle it previously contacted.
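A minimal Python sketch of this stop-and-retreat reaction follows. All callable names are hypothetical stand-ins for the transport and planning subsystems, and the sketch assumes the sensed angle points toward the contact.

    # Sketch of the emergency-stop reaction described above; names are
    # illustrative, not the patent's implementation.
    import math

    def on_touch_interrupt(dx, dy, stop_motion, move_heading, replan):
        """Stop, retreat opposite the contact, then replan the function path."""
        stop_motion()                                   # cease navigation
        contact_deg = math.degrees(math.atan2(dy, dx))  # assumed: toward contact
        retreat_deg = (contact_deg + 180.0) % 360.0     # opposite direction
        move_heading(retreat_deg, distance_m=0.3)       # hypothetical back-off
        replan()                                        # new path around obstacle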
Accordingly, a method of utilizing a robot with a touch shield to perform a function in an area is provided. FIG. 28 is an illustrative flowchart showing a method of utilizing a robot with a touch shield to perform a function in an area in accordance with one embodiment of the method and system of the invention. The process begins in step S2810, from which it passes to step S2820. In step S2820, the robot system is commanded to perform a function in an area. Once the robot system determines the area layout (in step S2840) and localizes a position in the area (in step S2860), the robot system determines a function path (in step S2880). The robot system then begins, in step S2900, to navigate the area and complete at least one function task associated with the robot system's localized position in the area. When the robot detects an obstacle in its function path by sensing an exterior force applied to the shell of the touch shield device (in step S2920), the robot system ceases navigating the area (in step S2940). The robot system determines at least one of the angular direction and the magnitude of the exterior force applied to the shell of the touch shield device (in step S2960), and the robot system determines an exit path that moves the robot system in a direction opposite the exterior force (in step S2980). The robot system then determines a new function path (in step S2990) and continues to navigate the area performing the commanded function (in step S2999). The process then ends in step S3000.
With respect to the sensor device 780, in various embodiments of the invention, the sensor device 780 may comprise an analog joystick sensor, an optical joystick sensor, a digital joystick sensor, or an electromechanical joystick sensor for the base sensor portion 782 and accompanying vertical member 784. In these various embodiments, an optical, digital or mechanical joystick can be utilized interchangeably as necessary based on the skilled artisan's desired configuration. A suitable optical joystick may be an eight-position optical joystick, providing eight octants of sensory output information, such as the Perfect 360°™ Joystick manufactured by Happ Controls, Inc. The eight-position optical joystick senses movement of the joystick handle member in eight octants, i.e., in 45° segments of the base sensor portion. The zero degree position, over center 783, provides a ninth position sensed by the base sensor portion. The optical joystick provides a digital output: sensors in the base portion make and break an optical link, producing light pulses that can be sensed electrically.
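By way of illustration, quantizing a sensed direction into the eight 45° octants, with the centered pose as a ninth state, might be sketched in Python as follows; the boundary convention (octant 0 spanning 0° to 45°) is an assumption.

    # Sketch of mapping an eight-position joystick's output to octants:
    # each 45-degree segment is one octant, with the centered (neutral)
    # pose as a ninth state. Purely illustrative.
    def octant(angle_deg, displaced):
        """Return 0-7 for the 45-degree octant, or None when centered."""
        if not displaced:
            return None                      # ninth, zero degree position
        return int((angle_deg % 360.0) // 45.0)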
An analog joystick, utilizing a plurality of capacitors and potentiometers, may provide angular direction measurements within one degree, and force magnitude values. FIGS. 29-31 are provided to illustrate one embodiment of the method and system of the invention utilizing an analog joystick.
FIG. 29 is an isometric view of an illustrative robot without the touch shield device shell in accordance with one embodiment of the method and system of the invention. In this view, shell 770 has been removed to provide a better understanding of the sensor device 780, base sensor portion 782, vertical member 784, and shell support members 44, as affixed to base member 42. As shown in FIG. 29, base sensor portion 782 is affixed to base member 42. For the analog joystick embodiment, the vertical member 784 is integrally connected to the base sensor portion 782 in a conventional joystick manner. Vertical member 784, which may be a handle, armature or other elongate element, extends vertically upward from the center 783 of the base sensor portion 782, such that movement of the shell causes movement of the vertical member 784. Accordingly, vertical member 784 may extend through a clearance hole 775 in shell 770, or other suitable fixture on the shell 770. The base sensor portion 782 may be affixed in any suitable manner to the base member 42, including, for example, with bolts 788. Also, shell support members 44 are affixed to base member 42 and extend vertically upward for attachment to shell 770. Brush 820, one component of the cleaning portion 800 of robot system 10, is also illustrated in FIG. 29. In this embodiment, brush 820 is affixed to the base member 42.
FIG. 30 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention. With respect to the touch shield device 760, FIG. 30 illustrates a non-operational position. For the analog joystick embodiment, the non-operational position includes any point in which the vertical member 784 of sensor device 780 is positioned in the center 783 of the base sensor portion 782. It should be appreciated that a non-operational position may include when the robot system 10 is operational, and even moving, as long as an exterior force is not being applied to the shell 770 such that the shell 770 would be translated, and consequently move vertical member 784 extending upwardly through the shell 770. Shell 770, with side panel 778, is supported by shell support members 44, which are affixed to base member 42. Brush 820 is also attached to base member 42. Base sensor portion 782 is bolted to base member 42, and vertical member 784 is centered over base sensor portion 782, and center 783. FIG. 31 illustrates the translation of shell 770, vertical member 784, and shell support members 44 when an exterior force is applied to shell 770.
FIG. 31 is a side sectional view of an illustrative touch shield device mounted on an illustrative base member in accordance with one embodiment of the method and system of the invention. With respect to the touch shield device, FIG. 31 illustrates the occurrence of an exterior force F applied to shell 770. The force F pushes on the shell 770, forcing the shell support members 44 to flex in response to the force. Once the shell support members 44 flex from the exterior force, and the shell 770 translates relative to the base member 42, vertical member 784 is bent from a perpendicular vertical position over the center 783, i.e., the zero degree position, of the base sensor portion 782. Accordingly, the shell support members 44, which may be rubber mounts, support the shell 770 and deflect in a parallelogram fashion, such that the shell 770 moves in a plane parallel to the base member 42, in the direction of the applied force. Once the force F is no longer being applied to the shell 770, the shell support members 44 return to a vertical upright position, and the vertical member 784 returns to a centered non-operational position over center 783 of the base sensor portion 782.
Various further embodiments of the touch shield device may be employed in a variety of shapes, designs and configurations without departing from the scope of the method and system of the invention. For example, a touch shield device may be constructed in any suitable manner such that a force applied to the shell causes a movement of the shell relative to a base member. This movement causes a joystick sensor device to move relative to its neutral position, i.e., the zero degree position. The above-described embodiments illustrate the movement of a vertical member relative to a base sensor portion mounted on a base member. It should be appreciated that although both the base sensor portion 782 and the vertical member 784 comprise the joystick sensor device 780, in yet another embodiment of the invention, the base sensor portion 782 may be affixed to the shell 770, while the vertical member 784 is mounted on the base member 42. It should be appreciated that the base sensor portion and the vertical member may be mounted on the shell and base member in any suitable fashion, position, or placement wherein the joystick sensor device measures the movement of the shell relative to the base member by the change in position of the vertical member in relation to its neutral position.
Furthermore, additional joystick sensor devices may be utilized such that movement of the shell causes the shell to contact and move a vertical member of a joystick sensor device. In such an embodiment, the vertical member may extend through an opening in the shell, or fit within a space integral with the shell such that contact of the shell would necessarily cause the vertical member to be moved.
It should be appreciated that the robot touch shield device described above may be utilized in combination with additional robot systems and embodiments, such as those incorporated by reference in their entirety: U.S. patent application Ser. Nos. 09/906,216 and 09/906,159, and U.S. Pat. Nos. 5,548,511 and 6,124,694.
Method of Utilizing Robot System to Perform Function
In accordance with further embodiments of the invention, a method of utilizing a robot system to perform a function in an area is provided, comprising the steps of first commanding the robot system to perform a function in an area, the area having an area layout including at least one area segment. The method further includes accessing by the robot system a stored map of the area layout, the stored map having at least one function task associated with the at least one area segment, localizing a first position of the robot system in the area, and determining a function path from the first position of the robot system for navigation of the area and completion of the at least one function task. Lastly, the method includes continuously localizing a current position of the robot system while navigating the robot system along the function path, and completing the at least one function task that is associated with the current position of the robot system on the stored map of the area. An illustrative method of utilizing a robot system to perform a function in an area is shown in FIGS. 9-10, and described below.
FIG. 9 is a flowchart showing a method of utilizing a robot system to perform a function in an area in accordance with an embodiment of the method and system of the invention. The process begins in step S10, and then passes to step S20, wherein the robot system is commanded to perform a function in an area. The function has at least one function task, while the area has an area layout which includes at least one area segment. Then, once the robot system has received a command to perform a function in an area, the robot system accesses a stored map of the area layout in step S40. The area layout has at least one function task associated with its at least one area segment. In step S60, the robot system localizes a first position in the area. Once the first position is determined, in step S80, the robot system determines a function path, from the first position, for navigation of the area and completion of the at least one function task. Then, in step S100, the robot system navigates the area and completes the at least one function task associated with the position of the robot system in the area, while continuously localizing the robot system position in the area. In addition, while the robot system is navigating the area and completing the at least one function task, the robot system is continuously monitoring for obstacles, and determining if an obstacle is in the function path in step S120. If an obstacle is detected in the function path, the process returns to step S60, where the robot system will once again localize a position (in step S60) and recalculate a new function path that avoids the obstacle in the current function path (in step S80).
Alternatively, if the robot system does not detect an obstacle in the function path (in step S120), the process passes to step S140, wherein the robot system determines if it has completed the at least one function task in the area. If yes, the process passes to step S160, wherein the robot system returns to a non-operating position. Then, the process ends in step S180. However, if the robot system has not completed its at least one function task in step S140, the process returns to step S100 and the robot system continues to navigate the area and complete its at least one function task. The process may be further understood by examining FIG. 10, which shows the method of FIG. 9 in further detail.
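The FIG. 9 flow may be summarized, purely for illustration, by the following Python sketch; every named method is a hypothetical stand-in for the subsystem behavior described above (steps S20-S180).

    # High-level sketch of the FIG. 9 flow; all method names are
    # hypothetical stand-ins for the subsystems described in the text.
    def perform_function(robot, area):
        robot.receive_command(area)                 # S20: command function
        layout = robot.load_stored_map(area)        # S40: access stored map
        position = robot.localize(layout)           # S60: localize position
        path = robot.plan_function_path(position)   # S80: determine path
        while not robot.tasks_complete():           # S140: all tasks done?
            robot.navigate_and_work(path)           # S100: navigate, do tasks
            if robot.obstacle_in_path():            # S120: obstacle check
                position = robot.localize(layout)   # return to S60
                path = robot.plan_function_path(position)  # S80 again
        robot.return_to_non_operating_position()    # S160, then end (S180)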
It should be appreciated that if the robot system is not in the area when it receives the command to perform a function, the robot system moves to the area. For example, the robot system may take appropriate measures to guide itself to the area, or the robot system may be directed to the area by an operator.
FIG. 10 is a flowchart showing the “robot system determines a function path” step of FIG. 9 in further detail in accordance with an embodiment of the method and system of the invention. The process begins in step S80, and then passes to step S82. In step S82, the robot system determines whether it has received a new command to perform a function. If the robot system has received a new command, the process passes to step S84, where the robot system determines if there is a stored function path associated with the command received. This may include a previously determined function path for a given area based on the layout of the area and associated tasks. If a stored function path exists that is associated with the new command received, the process passes to step S86. In step S86, the stored function path is identified as the function path for the continuing process. Then, the process passes to step S99, wherein the process returns to step S100.
Alternatively, if in step S82, the robot system determines that it has not received a new command, the process passes to step S88. Then, in step S88, the robot system determines that an obstacle has been detected in the current function path. The process then passes to step S90, where the robot system develops a new function path and identifies the new function path as the function path for the continuing process. Then, the process passes to step S99, wherein the process returns to step S100.
In step S84, if the robot system determines that there is no stored function path associated with the command received, the process passes to step S90. Then, in step S90, the robot system develops a new function path and identifies the new function path as the function path for the continuing process. Thereafter, the process passes to step S99, wherein the process returns to step S100.
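The FIG. 10 decision logic (steps S82-S99) can be illustrated with the following Python sketch; the stored_paths dictionary and plan_new_path callable are assumptions for the sketch.

    # Sketch of the FIG. 10 decision (steps S82-S99); stored_paths is a
    # hypothetical dictionary keyed by command.
    def determine_function_path(new_command, stored_paths, plan_new_path):
        if new_command is not None:                 # S82: new command?
            stored = stored_paths.get(new_command)  # S84: stored path?
            if stored is not None:
                return stored                       # S86: use stored path
            return plan_new_path()                  # S90: develop a new path
        # No new command: an obstacle was detected in the current path (S88).
        return plan_new_path()                      # S90: develop a new path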
It should also be appreciated that a stored function path for a given area segment in an area layout may be more efficient than a function path determined by a robot system operating on information gathered from sensors. This may occur because pre-programmed function paths may allow an operator to direct the robot system very close to obstacles, such as walls, and to guide the robot system into tight spaces that the robot system's obstacle avoidance systems would otherwise not allow the robot system to navigate within.
As stated above, the robot system may create updated maps of the area layout when obstacles are detected in the area layout. Once detected, the obstacles can be added to an amended map of the area layout, which can then be utilized. Also, if the obstacle is removed, the robot system can either create another amended map that removes the obstacle, or return to the previously stored map. It should also be appreciated that the operator may set the updating of a stored map to occur only after an obstacle has been detected over a given number of consecutive cleaning cycles through a given area layout. The number of repeated detections of an obstacle required before it is added to a map of the area layout can be determined by one of ordinary skill in the art based upon the needs of the given area layout.
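A minimal Python sketch of such a persistence rule follows; the threshold of three consecutive detections is an illustrative value that an operator would set.

    # Sketch of deferring a map update until an obstacle is seen on a
    # set number of consecutive cleaning cycles; threshold is operator-set.
    REQUIRED_CONSECUTIVE_DETECTIONS = 3  # illustrative value

    class ObstaclePersistence:
        def __init__(self):
            self._streak = {}  # obstacle location -> consecutive detections

        def observe_cycle(self, detected_locations, amend_map):
            """Record one cleaning cycle's detections; amend the map when
            a location has been seen enough consecutive cycles."""
            for loc in list(self._streak):
                if loc not in detected_locations:
                    self._streak[loc] = 0          # streak broken; reset
            for loc in detected_locations:
                self._streak[loc] = self._streak.get(loc, 0) + 1
                if self._streak[loc] == REQUIRED_CONSECUTIVE_DETECTIONS:
                    amend_map(loc)                 # add obstacle to stored map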
To provide a better understanding of the method of utilizing a robot system to perform a function in an area, FIGS. 11-13 are illustrative flowcharts showing a method of mapping an area and assigning function tasks to an area segment in accordance with one embodiment of the method and system of the invention.
FIG. 11 is a flowchart showing a method of mapping an area utilizing a robot system in accordance with one embodiment of the method and system of the invention. The process begins in step S300, and then passes to step S310. Then, in step S310, the robot system is commanded to map an area. The area has an area layout which includes at least one area segment. Once the robot system receives the command to map an area, in step S320, the robot system determines the area layout utilizing a plurality of sensors. Accordingly, the robot system produces a map of the area layout in step S330, and stores the map of the area layout in a memory device in step S340. The process then passes to step S350.
Similarly, FIG. 12 is a flowchart showing a method of mapping an area utilizing a robot system in accordance with yet another embodiment of the method and system of the invention. The process begins in step S400, and then passes to step S410. Then, in step S410, the robot system is commanded to map an area. The area has an area layout which includes at least one area segment. Once the robot system receives the command to map an area, in step S420, an operator directs the robot system on a mapping path that traverses the area being mapped. Then, in step S430, while moving along the path traversing the area being mapped, the robot system determines the area layout utilizing a plurality of sensors. Accordingly, the robot system produces a map of the area layout in step S440. Then, in step S450, the robot system determines whether or not the entire area has been mapped. If not, the process returns to step S420 to continue mapping. However, if the entire area has been mapped, the robot system stores the map of the area layout in a memory device in step S460. The process then passes to step S470, where it ends.
As stated above, both illustrative embodiments of mapping processes shown in FIGS. 11 and 12 store the completed map of an area layout in a suitable memory device. FIG. 13 is a flowchart showing a method of storing a map of an area layout in accordance with one embodiment of the system and method of the invention. For example, the process shown in FIG. 13 may represent the steps of S340 and S460, of FIGS. 11 and 12 respectively, in further detail.
The illustrative storing process shown in FIG. 13 begins in step S500, and then passes to step S510. In step S510, the robot system determines whether to store the map of the area layout in an internal memory device. If yes, the robot system stores the map of the area layout in an internal memory device in step S520. Alternatively, if the robot system determines not to store the map of the area layout internally, the robot system stores the map in an external memory device in step S530. Once the map of the area layout has been stored in step S520 or step S530, the process passes to step S540, wherein the process ends. It should be appreciated that these illustrative storing processes may be utilized in conjunction with other processes in which other steps may be added or deleted.
It should be appreciated that the operator may direct the robot system along a mapping path that traverses the area being mapped in any suitable manner in which the robot system receives a command from an operator to move from one point to another. For example, an operator may physically guide the robot system from a first point in an area segment to a finishing point of the mapping. In addition, the operator may control the movement of the robot system through the use of a wireless keyboard or joystick. Furthermore, any suitable mapping path that allows the robot system to produce a complete map of an area segment may be utilized. Therefore, as stated above, the robot system need only traverse the area being mapped to the extent necessary for mapping the area. It should be appreciated that the robot system may not need to physically move at all to produce a map of an area.
Furthermore, the robot system performing the mapping of an area may utilize a plurality of different sensors in producing the map. Once the map is complete, the map may be stored in any suitable memory device. This may include an internal memory store within the robot system, or any external memory device with which the robot system is in communication, or both, for example. Accordingly, the robot system may store the completed map in a central memory device, wherein the map is accessible by one or more alternate robot systems. The alternate robot systems may access the stored map upon receiving a command to perform a function in an area associated with the stored map.
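One plausible shape for such a central store is sketched below; CentralMapStore and its methods are assumptions for illustration, not part of the patent.

```python
# Hypothetical central map store shared by a fleet: one robot publishes a
# completed map; alternate robots fetch it when commanded to work that area.

class CentralMapStore:
    def __init__(self):
        self._maps = {}                 # area id -> stored map of the area layout

    def publish(self, area_id, layout_map):
        self._maps[area_id] = layout_map

    def fetch(self, area_id):
        return self._maps.get(area_id)  # None if the area has not been mapped

store = CentralMapStore()
store.publish("lobby", {"segment_1051": "tile"})
shared = store.fetch("lobby")           # an alternate robot retrieves the map
```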
Furthermore, along with the mapping of an area and its accompanying area layout, in accordance with one embodiment of the method and system of the invention, an operator may assign, program, or associate function tasks for a given area segment. Accordingly, an operator could map an area segment and assign one function task to be completed in that area segment once the robot system receives a command to perform a function in the area, for example. FIG. 14 provides a better understanding of how function tasks can be associated with maps of area layouts.
FIG. 14 is a flowchart showing a method of associating a function task with an area segment on a map of an area layout in accordance with yet another embodiment of the method and system of the invention. The process begins in step S600, and then passes to step S610. Then, in step S610, the robot system is commanded to map an area. The area has an area layout which includes at least one area segment. Once the robot system receives the command to map an area, in step S620, an operator directs the robot system on a mapping path that traverses the area being mapped. Then, in step S630, while moving along the path traversing the area being mapped, the robot system determines the area layout utilizing a plurality of sensors. Accordingly, the robot system produces a map of the area layout in step S640. The process then passes to step S650, wherein an operator associates at least one function task to be completed in the area segment with the map of the area layout, which includes the area segment. Then, in step S660, the robot system determines whether or not the entire area has been mapped. If not, the process returns to step S620 to continue mapping. However, if the entire area has been mapped, the robot system stores the map of the area layout and the at least one function task associated with the at least one area segment in a memory device in step S670. The process then passes to step S680, where it ends.
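One way (not specified by the patent) to represent the stored association of step S670 is a structure pairing each area segment with its function tasks:

```python
# Hypothetical stored layout: the map geometry plus, for each area segment,
# the function tasks associated with it (S650), persisted together (S670).

area_layout = {
    "map": {},  # geometry produced by the sensor sweep (S630/S640)
    "segments": {
        "1051": {"tasks": ["mop", "dry"]},
        "1052": {"tasks": ["mop", "dry"]},
    },
}

def associate_task(layout, segment_id, task):
    """S650: associate a function task with an area segment on the map."""
    layout["segments"].setdefault(segment_id, {"tasks": []})["tasks"].append(task)

associate_task(area_layout, "1053", "vacuum")
```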
As stated above, it should be appreciated that the assignment of function tasks to given area segments throughout an area can be accomplished in any suitable manner and in any desired combination. For instance, an operator may associate a certain function task with an area segment before the robot system begins to map the area segment. In this example, an operator could command the robot system to map the area; as the robot system begins sensing the area layout, the operator programs a function task to accompany the area segment before the robot system begins to traverse it. Simply, the operator could press a record button, which tells the robot system to map the area, and then press a function task button, which tells the robot system to associate the function task with the area segment. The operator could then direct the robot system over the area segment, mapping and assigning tasks simultaneously.
In addition, an operator may choose to associate several function tasks with one area segment and only one with another. However, in one embodiment of the invention, when the robot system determines the function tasks it will perform based on a stored map and associated tasks, each area segment should have at least one function task. This one function task may be as simple as moving through the area segment without performing any other function task. In this respect, one of ordinary skill in the art could prepare several different stored functions within the robot system, each including several commands, function tasks, and area layouts that are further divided into numerous area segments, for example. This embodiment of the invention may be further understood with reference to FIG. 15.
Illustratively, FIG. 15 is a diagram of an illustrative area layout in accordance with one embodiment of the method and system of the invention. In this example, the robot system is first placed in an unmapped area 1000 at point 1001. An operator then commands the robot system to begin mapping the area layout in which the robot system is placed. The robot system begins sensing the boundaries, i.e. walls, of the area layout and then moves to point 1002, while continuously sensing. Once at point 1002, the robot system produces a map of the first area segment 1051, which is defined by the walls where the robot system first began sensing at point 1001, and artificial boundary 1021, which is the programmed boundary between area segments 1051 and 1052. Accordingly, the operator or programmer can assign any number of function tasks for completion in this area segment 1051, the least of which is to travel through it. Continuing with this embodiment of the invention, the operator then directs the robot system from point 1002 to point 1003. The robot system continues sensing and updating a map of the area layout based on the movement, while recognizing the established boundary 1021 between area segments 1051 and 1052. In addition, area segment 1052 is defined by the walls of the area and space between boundary 1021 and boundary 1022, the boundary between area segments 1052 and 1053. The mapping process may continue accordingly through mapping passes from points 1003-1014. However, for this example, the robot system will move from point 1003 to 1004 to 1005 to 1006 to 1007 to 1008, back to 1009 to 1010 to 1011 to 1012 back to 1009 to 1013 and finishing with point 1014. In this respect, the mapping process can map the entire area layout 1000, while dividing the area layout 1000 into area segments 1051, 1052, 1053, 1054, 1055, 1056, 1057, and 1058, separated by artificial boundaries 1021, 1022, 1023, 1024, 1025, 1026 and 1027, as shown in FIG. 15.
As stated above, the operator possesses a wide range of latitude in determining which function tasks may be assigned to each area segment. In FIG. 15, the operator may program that area segments 1051 and 1052 should be mopped and dried, area segment 1053 should be passed over because it is carpeted (unless vacuuming is desired), area segments 1054 and 1055 should be scrubbed, mopped, dried and waxed, area segment 1056 should be passed over because it is carpeted, area segment 1057 vacuumed, and area segment 1058 swept. As described above with reference to FIG. 15, the robot system's commands may require it to travel over the same section of the area layout several times, and to perform function tasks repeatedly on one area segment, in any order or on any schedule. Therefore, one should appreciate that many different combinations of function tasks and area segments can be accomplished based upon the desired programming input into the robot system by the operator.
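Written out as a task table, the FIG. 15 assignments might look as follows; the representation is an illustrative assumption, while the task lists come directly from the example above.

```python
# The FIG. 15 example as a hypothetical segment-to-tasks table; an empty list
# means the robot system simply passes over the segment.

segment_tasks = {
    "1051": ["mop", "dry"],
    "1052": ["mop", "dry"],
    "1053": [],            # carpeted: pass over (or ["vacuum"] if desired)
    "1054": ["scrub", "mop", "dry", "wax"],
    "1055": ["scrub", "mop", "dry", "wax"],
    "1056": [],            # carpeted: pass over
    "1057": ["vacuum"],
    "1058": ["sweep"],
}
```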
Furthermore, several functions may be tailored to very specific activities that recur daily, weekly or monthly. Once in operation, the robot system understands the function tasks it is to perform in different area segments based upon the stored map associated with the commanded function and the tasks assigned to the area segments stored along with the map of the area layout.
Additionally, it should be appreciated that any stored map of an area may be edited or amended once produced. FIGS. 16 and 17 show illustrative methods of editing a map of an area layout in accordance with further embodiments of the method and system of the invention.
FIG. 16 is a flowchart showing a method of editing a map of an area layout in accordance with one embodiment of the system and method of the invention. The process begins in step S700, and then passes to step S710. In step S710, an operator accesses a map of an area layout. In step S720, the operator edits the map of the area layout. Once the map of the area layout has been edited, the operator stores the edited map of the area layout in a memory device, in step S730. The process then passes to step S740, where it ends. As stated above, the robot system may be programmed to create updated maps of area layouts in which the robot system is operating, based upon the detection of obstacles in the function path, for example. This may include commanding the robot system to create an amended map upon the repeated detection of an obstacle.
It should be appreciated that not only can the map of the area layout be edited, amended or modified, but the programmed tasks associated with given area segments may also be edited or changed, as illustrated in FIG. 17.
FIG. 17 is a flowchart showing a method of editing a map of an area layout in accordance with a further embodiment of the method and system of the invention. The process begins in step S800, and then passes to step S810, in which an operator accesses a map of an area layout and the function tasks that have been associated with the area layout. Then, in step S820, the operator determines whether or not to edit the map of the accessed area layout. If yes, the process passes to step S830, in which the operator edits the accessed map of the area layout. Once the map has been edited, the process passes to step S840. Alternatively, if in step S820 the operator decides not to edit the accessed map of the area layout, the process passes to step S840.
In step S840, the operator determines whether or not to edit the function task(s) associated with any area segment in the map of the area layout. In this respect, function tasks can be added or deleted. If yes, the process passes to step S850, in which the operator edits or changes the associated function task(s). Then, the process passes to step S860. Alternatively, if in step S840, the operator chooses not to edit the associated function task(s), the process passes to step S860. Then, in step S860, the operator stores the edited map of the area layout or edited associated function task(s) or both in a memory device. The process then passes to step S870 where it ends.
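The FIG. 17 flow might be sketched as below, reusing the hypothetical CentralMapStore from the earlier sketch; edit_map and edit_tasks stand in for whatever editor the operator uses.

```python
# Sketch of the FIG. 17 editing flow; the editors are optional callables.

def edit_layout(store, area_id, edit_map=None, edit_tasks=None):
    layout = store.fetch(area_id)               # S810: access map and tasks
    if edit_map is not None:                    # S820/S830: optionally edit the map
        layout["map"] = edit_map(layout["map"])
    if edit_tasks is not None:                  # S840/S850: optionally edit the tasks
        layout["segments"] = edit_tasks(layout["segments"])
    store.publish(area_id, layout)              # S860: store the edited layout
```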
One embodiment of an editing process may be further understood by reference to FIGS. 18a and 18b. FIG. 18a is a diagram of an unedited area layout 1100 in accordance with one embodiment of the method and system of the invention. As shown in FIG. 18a, area layout 1100 is divided into a first area segment 1110 and a second area segment 1120, which are separated by artificial boundary 1119. For purposes of this example, consider area layout 1100 to represent a tiled floor area of an office building. However, if changes to the area of the office building are made, an operator can edit the map of this area layout 1100.
FIG. 18b is a diagram of an edited area layout in accordance with one embodiment of the method and system of the invention. As shown in FIG. 18b, the same area layout 1100 is now subdivided into area segments 1110, 1130 and 1140, separated by boundaries 1119 and 1139, respectively. It should be appreciated that area segments 1130 and 1140 together make up what was area segment 1120. In this example, the office building area has been modified to add multiple pillars 1112 and a receptionist desk 1114 in area segment 1110, and a glass wall 1132 with glass double doors 1134 placed on boundary 1139. Accordingly, an operator can edit the original stored map of area layout 1100 (as shown in FIG. 18a) to include these new features (as shown in FIG. 18b) and update a function for a robot system to perform in this area layout 1100.
Further methods of controlling an illustrative robot system in accordance with further embodiments of the system and methods of the invention may be understood with reference to FIGS. 19a, 19b, 19c and 19d. FIG. 19a is a diagram of an illustrative area layout in accordance with one embodiment of the method and system of the invention. FIGS. 19b, 19c and 19d are diagrams of the illustrative area layout of FIG. 19a in further detail in accordance with one embodiment of the method and system of the invention. As shown in FIG. 19a, illustrative area layout 1200 is provided. For purposes of this embodiment, area layout 1200 is divided into sections 1210, 1230, 1250 and 1270. As shown in FIG. 19b, sections 1210 and 1230 can be grouped together to define area segment 1220. As shown in FIG. 19c, sections 1230 and 1250 can be grouped together to define area segment 1240. As shown in FIG. 19d, sections 1230 and 1270 can be grouped together to define area segment 1260.
It should be appreciated that the above described sections and area segments, as shown in FIGS. 19a-19d, have been selected only for purposes of illustrating one embodiment of the system and method of the invention. An operator may define different area segments throughout an area layout in any suitable manner desired for accomplishing the desired function. For example, if a small circular area rug were placed in the center of section 1230, the operator could define the portion of section 1230 covered by the rug as yet another area segment.
Moreover, an operator can utilize several different commands for directing the robot system's functioning within differing area segments. For example, the operator can program the robot system to change area segments with a simple programmed turn of the robot system. In this respect, once the robot system finishes performing a task within one area segment, the routine could include a programmed 90-degree clockwise turn, after which the robot system would be in the next area segment. The robot system would then move to the tasks commanded for the second area segment upon making the turn. The following example further illustrates these and other features.
In one embodiment, based on area layout 1200, an operator commands a cleaning robot system to perform several different cleaning tasks within area layout 1200. Utilizing the stored map of area layout 1200, the operator can dictate a series of tasks to be completed. In this embodiment, area layout 1200 is divided into area segments 1220, 1240 and 1260, respectively. It should be appreciated, as shown in FIGS. 19a-19d, that area segments may overlap and may further be defined in any suitable manner desired.
Accordingly, the series of tasks in this embodiment may be scripted such that the robot system first applies cleaning solution to and then mops area segment 1220. Then, the robot system applies cleaning solution to, scrubs and then mops area segment 1240. Finally, the robot system simply dust mops area segment 1260. Although these area segments overlap and are not separated by physical boundaries, the operator may change the area segment in which the robot system is operating by commanding the robot system to turn upon the completion of the assigned tasks for an area segment. Thus, once the robot system finished mopping area segment 1220 and was situated in section 1230, a 90-degree clockwise turn could place the robot system in the next area segment. Therefore, upon the 90-degree turn, the robot system would be at its initial operating point for area segment 1240. It should be understood that turns and other physical movements of the robot system may be utilized to effectuate the robot system's assigned tasks, and the differentiation between area segments in an area.
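This scripted sequence might be encoded as below; turn_degrees and run_tasks are assumed robot interfaces, and the script structure is illustrative only.

```python
# Hypothetical script for the FIG. 19 example: a programmed 90-degree turn
# marks each transition between overlapping area segments.

script = [
    ("1220", ["apply_solution", "mop"]),
    ("turn", 90),                           # clockwise turn in section 1230
    ("1240", ["apply_solution", "scrub", "mop"]),
    ("turn", 90),
    ("1260", ["dust_mop"]),
]

def run_script(robot, script):
    for step, arg in script:
        if step == "turn":
            robot.turn_degrees(arg)         # reorient into the next area segment
        else:
            robot.run_tasks(segment=step, tasks=arg)
```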
It should be appreciated that commands may be received from a central system via any suitable communication interface, modem, telephone, fax, or other computer connection, the receipt of data input from an IP address given to the robot, or any other suitable connection through which the robot's processor might receive input from an external source. In addition, a robot system may receive a command from another robot to perform a function or function task. As such, if a fleet of robots were deployed in an area, and one robot was given a function or function task that it could not address at that time, the robot could send a command to another robot, through any suitable interface, to perform the necessary function or task. Accordingly, an illustrative fleet of robots may take commands from a manager robot dispatching commands through interfaces with the other robots in the area.
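A minimal sketch of such robot-to-robot delegation follows; can_accept and send_command are hypothetical interfaces standing in for whatever communication link the fleet uses.

```python
# Hypothetical fleet dispatch: a robot (or manager robot) forwards a function
# task to the first peer able to accept it.

def dispatch(task, robots):
    for robot in robots:
        if robot.can_accept(task):   # e.g. idle, charged, correct attachments
            robot.send_command(task) # any suitable communication interface
            return robot
    return None                      # no robot available; a manager could queue it
```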
It should further be appreciated that the robot system's ability to perform functions in an area may extend to those periods of time when no operators are present to supervise the robot system. As such, the robot system could perform a maintenance and security function, as well as a conventional cleaning function. In addition, the robot may be programmed to handle certain emergency situations, including for example, fire emergencies, burglaries or loss of power in the area in which it is operating. It should be appreciated that once the robot system detects an emergency condition, the robot system may alert all necessary personnel to the emergency condition.
It should be further understood that, in accordance with the robot system's ability to operate autonomously, the robot system will be provided with the necessary programming, tasking and commands to ensure its readiness to perform functions in an area. This may require that the robot system monitor its own diagnostic system, including its power status and internal components, such that the robot system would understand whether it needs to be recharged or serviced to maintain its working condition. The robot system may then alert the necessary personnel that it needs service. For simple service requirements, such as recharging its batteries or dumping or refilling its tanks, the robot system may deliver itself to a recharging station where it can autonomously recharge its batteries and dump or refill its tanks.
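The self-maintenance loop might look like the sketch below; the thresholds and method names are illustrative assumptions rather than values from the patent.

```python
# Hypothetical self-diagnostic check, run periodically by the robot system.

def self_check(robot, min_battery=0.15, max_tank=0.95):
    if robot.battery_level() < min_battery:
        robot.go_to_station("recharge")       # autonomously recharge batteries
    elif robot.recovery_tank_level() > max_tank:
        robot.go_to_station("dump")           # dump or refill tanks
    elif not robot.diagnostics_ok():
        robot.alert_personnel("service required")
```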
Referring now to the above-described FIGS. 9-19, and the illustrative examples in accordance with the method and system of the invention, it should be appreciated that the steps in the utilization of the robot system to perform a function in an area may be accomplished in several manners.
As described above, one embodiment of the robot system of the invention as shown in FIGS. 1-8, and FIGS. 20-27, may incorporate a computer or computer system. As used herein, the term “computer system” is to be understood to include at least one processor utilizing a memory or memories. The memory stores at least portions of an executable program code at one time or another during operation of the processor. Additionally, the processor executes various instructions included in that executable program code. Executable program code means a program in machine language or other language that is able to run in a particular computer system environment to perform a particular task. The executable program code processes data in response to commands by a user. As used herein, it will be appreciated that the term “executable program code” and the term “software” mean substantially the same thing for the purposes of this description.
Further, it is to be appreciated that to practice the system and method of the invention, it is not necessary that the processor, or subportions of the processor, and/or the memory, or subportions of the memory be physically located in the same place or disposed in the same physical portion of the robot system 10. That is, it should be appreciated that each of the processor and the memory may be located in geographically distinct locations and connected so as to communicate in any suitable manner, such as over a wireless communication path, for example. Additionally, it should be appreciated that each of the processor and/or the memory may be composed of different physical pieces of equipment. Accordingly, it is not necessary that the processor be one single piece of equipment in one location and that the memory be another single piece of equipment in another location. That is, it is contemplated that the processor may be two pieces of equipment in two different physical locations. The two distinct pieces of equipment may be connected in any suitable manner. Additionally, each respective portion of the memory described above may include two or more portions of memory in two or more physical locations. Further, the memory could include or utilize memory stores from the Internet, Intranet, Extranet, LAN or some other source or over some other network, as may be necessary or desired.
As described above, the invention may illustratively be embodied in the form of a computer or computer operating system. It is to be appreciated that the software that enables the computer operating system to perform the operations described above may be supplied on any of a wide variety of data holding media. Further, it should be appreciated that the implementation and operation of the invention may be in the form of computer code written in any suitable programming language, which provides instructions to the computer.
It should further be appreciated that the software code or programming language that is utilized in a computer system to perform the various operations of the above described invention may be provided in any of a wide variety of forms. Illustratively, the software may be provided in the form of machine language, assembly code, object code, or source language, as well as in other forms. Further, the software may be in the form of compressed or encrypted data utilizing an encryption algorithm.
Additionally, it should be appreciated that the particular medium utilized may take on any of a variety of physical forms. Illustratively, the medium may be in the form of a compact disk, a DVD, an integrated circuit, a hard disk, a floppy diskette, a magnetic tape, a RAM, a ROM, or a remote transmission, as well as any other medium or source of information that may be read by a computer or other operating system.
Accordingly, the software of the method of the invention, which is utilized in operation of the robot system 10, may be provided in the form of a hard disk or be transmitted in some form using a direct wireless telephone connection, the Internet, an Intranet, or a satellite transmission, for example. Further, the programming language enabling the system and method of the invention as described above may be utilized on all of the foregoing and any other medium by which software or executable program code may be communicated to and utilized by a computer or other operating system.
As described herein, the system and method of the invention may utilize an application program, a collection of separate application programs, a module of a program that is designed to handle a particular task, or a portion of a module of a program, for example. As noted above, it should be appreciated that the computer language used in the system and method of the invention may be any of a wide variety of programming languages. Further, it is not necessary that a single programming language be utilized in conjunction with the operation of the system and method of the invention. Rather, any number of different programming languages may be utilized as is necessary or desirable.
As described above, in the system and method of the invention, a variety of user interfaces are utilized. A user interface may be in the form of a dialogue screen for example. As used herein, a user interface includes any software, hardware or combination of hardware and software used in an operating system that allows a user to interact with the operating system. A user interface may include any of a touch screen, keyboard, mouse, voice reader, voice recognizer, dialogue screen, menu box, a list, a checkbox, a toggle switch, a pushbutton or any other object that allows a user to receive information regarding the operation of the program and/or provide the operating system with information. Accordingly, the user interface is any device that provides communication between a user and a computer. The information provided by the user to the computer through the user interface may be in the form of a command, a selection or data, or other input, for example.
A user interface is utilized by an operating system running an application program to process data for a user. As should be appreciated, a user interface is typically used by a computer for interacting with a user either to convey information or to receive information. However, it should be appreciated that in accordance with the system and method of the invention, it is not necessary that a human user actually interact with a user interface generated by the operating system of the invention. Rather, it is contemplated that the user interface of the invention may interact, i.e., convey and receive information, with another operating system or computer, rather than a human user. Further, it is contemplated that the user interfaces utilized in the system and method of the invention may interact partially with another operating system while also interacting partially with a human user.
It will be readily understood by those persons skilled in the art that the present invention is susceptible to broad utility and application. Many embodiments and adaptations of the present invention other than those herein described, as well as many variations, modifications and equivalent arrangements, will be apparent from or reasonably suggested by the present invention and foregoing description thereof, without departing from the substance or scope of the invention.
Accordingly, while the present invention has been described here in detail in relation to its preferred embodiment, it is to be understood that this disclosure is only illustrative and exemplary of the present invention and is made merely for the purposes of providing a full and enabling disclosure of the invention. Many modifications to the embodiments described above can be made without departing from the spirit and scope of the invention. Accordingly, the foregoing disclosure is not intended to be construed or to limit the present invention or otherwise to exclude any other such embodiments, adaptations, variations, modifications and equivalent arrangements.

Claims (14)

What is claimed is:
1. A robot touch shield device comprising:
a shell supported by at least one shell support member mounted on a base member, and
a sensor device for sensing an exterior force applied to the shell, the sensor device having
a base sensor portion having a center and
a vertical member, the base sensor portion affixed on the base member, the vertical member affixed on the shell, the vertical member positioned over the center of the base sensor portion;
wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
2. A robot system with a touch shield device comprising:
a processing portion for processing data in the robot system;
a memory portion, the processor portion storing data in the memory portion and retrieving data from the memory portion;
a transport portion for transporting the robot system from a first location to a second location;
a body portion, the body portion containing at least one of the processor portion, the memory portion, and the transport portion;
a touch shield device mounted on the body portion, the touch shield device having
a shell supported by at least one shell support member mounted on a base member, and
a sensor device for sensing an exterior force applied to the shell, the sensor device having
a base sensor portion having a center and
a vertical member, the base sensor portion affixed on the base member, the vertical member affixed on the shell, the vertical member positioned over the center of the base sensor portion;
wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output signal representing at least one of a direction of the exterior force applied and the degree of the exterior force applied, the processor portion monitoring the output signal produced by the base sensor portion and commanding the robot system to cease navigating and maneuver on an exit path away from the exterior force applied to the shell.
3. A method of utilizing a robot system with a touch shield device comprising the steps of:
commanding the robot system to perform a function in an area, the function having at least one function task, the area having an area layout including at least one area segment;
accessing by the robot system a stored map of the area layout, the stored map having at least one function task associated with the at least one area segment;
localizing a first position of the robot system in the area;
determining a function path by the robot system from the first position of the robot system for navigation of the area and completion of the at least one function task;
repeatedly continuously localizing a current position of the robot system while navigating the robot system along the function path;
repeatedly continuously monitoring by the robot system the touch shield device for obstacles in the function path, the touch shield device having
a shell supported by at least one shell support member mounted on a base member, and
a sensor device for sensing an exterior force applied to the shell, the sensor device having
a base sensor portion having a center and
a vertical member, the base sensor portion affixed on the base member, the vertical member affixed on the shell, the vertical member positioned over the center of the base sensor portion;
wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
4. The method of claim 3 further including the step of ceasing by the robot system the navigating the robot system along the function path upon the detection of an obstacle in the function path by the touch shield device, wherein the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
5. The method of claim 3 further including the step of updating the function path upon the detection of an obstacle in the function path by the touch shield device, the base sensor portion sensing a displacement of the vertical member relative to the center of the base sensor portion and producing an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
6. The method of claim 3 wherein the step of commanding the robot system to perform a function in an area includes sending a command from an operator to the robot system via a communication interface, the communication interface being a wireless communication from the operator to the robot system.
7. The method of claim 3 wherein the step of commanding the robot system to perform a function in an area includes sending a command from an operator to the robot system via a communication interface, the communication interface being a wireless communication from the operator to a communication network linked to the robot system.
8. The method of claim 7 wherein the communication interface is a telephone call from the operator to the robot system.
9. The method of claim 3 further including the step of outputting information from the robot system upon performance of the function in the area.
10. A robot touch shield device comprising:
a shell supported by at least one shell support member mounted on a base member, and
a sensor device for sensing an exterior force applied to the shell, the sensor device having
a base sensor portion having a center and
a vertical member, the base sensor portion affixed on the shell, the vertical member affixed on the base member, the vertical member positioned under the center of the base sensor portion;
wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
11. A robot system with a touch shield device comprising:
a processing portion for processing data in the robot system;
a memory portion, the processor portion storing data in the memory portion and retrieving data from the memory portion;
a transport portion for transporting the robot system from a first location to a second location;
a body portion, the body portion containing at least one of the processor portion, the memory portion, and the transport portion;
a touch shield device mounted on the body portion, the touch shield device having
a shell supported by at least one shell support member mounted on a base member, and
a sensor device for sensing an exterior force applied to the shell, the sensor device having
a base sensor portion having a center and
a vertical member, the base sensor portion affixed on the shell, the vertical member affixed on the base member, the vertical member positioned under the center of the base sensor portion;
wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output signal representing at least one of a direction of the exterior force applied and the degree of the exterior force applied, the processor portion monitoring the output signal produced by the base sensor portion and commanding the robot system to cease navigating and maneuver on an exit path away from the exterior force applied to the shell.
12. A method of utilizing a robot system with a touch shield device comprising the steps of:
commanding the robot system to perform a function in an area, the function having at least one function task, the area having an area layout including at least one area segment;
accessing by the robot system a stored map of the area layout, the stored map having at least one function task associated with the at least one area segment;
localizing a first position of the robot system in the area;
determining a function path by the robot system from the first position of the robot system for navigation of the area and completion of the at least one function task;
repeatedly continuously localizing a current position of the robot system while navigating the robot system along the function path;
repeatedly continuously monitoring by the robot system the touch shield device for obstacles in the function path, the touch shield device having
a shell supported by at least one shell support member mounted on a base member, and
a sensor device for sensing an exterior force applied to the shell, the sensor device having
a base sensor portion having a center and
a vertical member, the base sensor portion affixed on the shell, the vertical member affixed on the base member, the vertical member positioned under the center of the base sensor portion;
wherein the exterior force applied to the shell translates the shell relative to the base member, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
13. A robot touch shield device comprising:
a shell supported by at least one shell support member mounted on a base member, and
a sensor device for sensing an exterior force applied to the shell, the sensor device having
a base sensor portion having a center and
a vertical member, the base sensor portion affixed on the base member, the vertical member integrally vertically mounted in the center of the base sensor portion and extending upwardly through an aperture in the shell;
wherein the exterior force applied to the shell translates the shell relative to the base member, the shell contacts the vertical member and displaces the vertical member relative to the center of the base portion, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied.
14. A robot system with a touch shield device comprising:
a processing portion for processing data in the robot system;
a memory portion, the processor portion storing data in the memory portion and retrieving data from the memory portion;
a transport portion for transporting the robot system from a first location to a second location;
a body portion, the body portion containing at least one of the processor portion, the memory portion, and the transport portion;
a touch shield device mounted on the body portion, the touch shield device having
a shell supported by at least one shell support member mounted on a base member, and
a sensor device for sensing an exterior force applied to the shell, the sensor device having
a base sensor portion having a center and
a vertical member, the base sensor portion affixed on the base member, the vertical member integrally vertically mounted in the center of the base sensor portion and extending upwardly through an aperture in the shell;
wherein the exterior force applied to the shell translates the shell relative to the base member, the shell contacts the vertical member and displaces the vertical member relative to the center of the base portion, the base sensor portion senses a displacement of the vertical member relative to the center of the base sensor portion and produces an output representing at least one of a direction of the exterior force applied and the degree of the exterior force applied, the processor portion monitoring the output signal produced by the base sensor portion and commanding the robot system to cease navigating and maneuver on an exit path away from the exterior force applied to the shell.
US09/976,420 2001-08-13 2001-10-13 Robot touch shield Expired - Lifetime US6580246B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/976,420 US6580246B2 (en) 2001-08-13 2001-10-13 Robot touch shield

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/928,669 US6667592B2 (en) 2001-08-13 2001-08-13 Mapped robot system
US09/976,420 US6580246B2 (en) 2001-08-13 2001-10-13 Robot touch shield

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/928,669 Continuation-In-Part US6667592B2 (en) 2001-08-13 2001-08-13 Mapped robot system

Publications (2)

Publication Number Publication Date
US20030030399A1 US20030030399A1 (en) 2003-02-13
US6580246B2 true US6580246B2 (en) 2003-06-17

Family

ID=46280123

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/976,420 Expired - Lifetime US6580246B2 (en) 2001-08-13 2001-10-13 Robot touch shield

Country Status (1)

Country Link
US (1) US6580246B2 (en)

Cited By (157)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040158354A1 (en) * 2002-12-30 2004-08-12 Samsung Electronics Co., Ltd. Robot localization system
US20040158357A1 (en) * 2003-02-06 2004-08-12 Samsung Gwangju Electronics Co., Ltd Robot cleaner system having external recharging apparatus and method for docking robot cleaner with external recharging apparatus
US20040168837A1 (en) * 2002-11-27 2004-09-02 Universite De Sherbrooke Modular robotic platform
US20040187249A1 (en) * 2002-01-03 2004-09-30 Jones Joseph L. Autonomous floor-cleaning robot
US20040199301A1 (en) * 2003-01-23 2004-10-07 Lg Electronics Inc. Position information recognition apparatus for cleaning robot
US20040204804A1 (en) * 2003-04-08 2004-10-14 Samsung Electronics Co., Ltd. Method and apparatus for generating and tracing cleaning trajectory of home cleaning robot
US20040210346A1 (en) * 2003-04-15 2004-10-21 Samsung Electronics Co., Ltd. Method and apparatus for allowing mobile robot to return to docking station
US20040249511A1 (en) * 2001-10-11 2004-12-09 Markus Jager Method, arrangement and computer programme with programme-coding means and computer programme product for allocating a partial surface of a total surface divided into several partial surfaces on one of several mobile units
US20050021181A1 (en) * 2003-07-24 2005-01-27 Samsung Gwangju Electronics Co., Ltd. Robot cleaner
US20050222711A1 (en) * 2004-04-01 2005-10-06 Kabushiki Kaisha Toshiba Robot and a robot control method
US20050217061A1 (en) * 2004-04-02 2005-10-06 Royal Appliance Mfg. Co. Robotic appliance with on-board joystick sensor and associated methods of operation
US20050239594A1 (en) * 2004-04-23 2005-10-27 Alto U.S. Inc. Joystick controlled scrubber
US20050251292A1 (en) * 2000-01-24 2005-11-10 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US20060061657A1 (en) * 2004-09-23 2006-03-23 Lg Electronics Inc. Remote observation system and method thereof
US20060142896A1 (en) * 2004-12-14 2006-06-29 Honda Motor Co., Ltd. System for carrying an item
US20060149419A1 (en) * 2004-11-30 2006-07-06 Kabushiki Kaisha Toshiba Movable robot without falling over
US20060287801A1 (en) * 2005-06-07 2006-12-21 Lg Electronics Inc. Apparatus and method for notifying state of self-moving robot
US20070100496A1 (en) * 2003-05-27 2007-05-03 Stockholmsmassan Robot system, method and computer program product
US20070112461A1 (en) * 2005-10-14 2007-05-17 Aldo Zini Robotic ordering and delivery system software and methods
US20070152619A1 (en) * 2005-12-12 2007-07-05 Honda Motor Co., Ltd. Autonomous mobile robot and goods carrying method of using the same
US20070234492A1 (en) * 2005-12-02 2007-10-11 Irobot Corporation Coverage robot mobility
US7320149B1 (en) 2002-11-22 2008-01-22 Bissell Homecare, Inc. Robotic extraction cleaner with dusting pad
US20080052846A1 (en) * 2006-05-19 2008-03-06 Irobot Corporation Cleaning robot roller processing
US20080086249A1 (en) * 2006-10-05 2008-04-10 Trimble Navigation Limited Farm apparatus having implement sidehill drift compensation
US20080084174A1 (en) * 2001-01-24 2008-04-10 Irobot Corporation Robot Confinement
US20080144978A1 (en) * 2003-02-26 2008-06-19 Silverbrook Research Pty Ltd Mobile Robot For Sensing And Decoding A Surface Coding Pattern On A Surface
US20080150466A1 (en) * 2004-01-28 2008-06-26 Landry Gregg W Debris Sensor for Cleaning Apparatus
US20080214260A1 (en) * 2007-03-02 2008-09-04 National Taiwan University Of Science And Technology Board game system utilizing a robot arm
US20080229885A1 (en) * 2007-03-22 2008-09-25 Mah Pat Y Jar opener
WO2008135978A2 (en) * 2007-05-06 2008-11-13 Wave Group Ltd. A robotic platform
US20090024250A1 (en) * 2007-07-18 2009-01-22 Kabushiki Kaisha Toshiba Mobile Robot and method for controlling mobile robot
US7663333B2 (en) 2001-06-12 2010-02-16 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US7706917B1 (en) 2004-07-07 2010-04-27 Irobot Corporation Celestial navigation system for an autonomous robot
US7761954B2 (en) 2005-02-18 2010-07-27 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US20110050841A1 (en) * 2009-08-26 2011-03-03 Yulun Wang Portable remote presence robot
US20110153081A1 (en) * 2008-04-24 2011-06-23 Nikolai Romanov Robotic Floor Cleaning Apparatus with Shell Connected to the Cleaning Assembly and Suspended over the Drive System
US20110162157A1 (en) * 2010-01-06 2011-07-07 Evolution Robotics, Inc. Apparatus for holding a cleaning sheet in a cleaning implement
US20110202175A1 (en) * 2008-04-24 2011-08-18 Nikolai Romanov Mobile robot for cleaning
US8209051B2 (en) 2002-07-25 2012-06-26 Intouch Technologies, Inc. Medical tele-robotic system
US8239992B2 (en) 2007-05-09 2012-08-14 Irobot Corporation Compact autonomous coverage robot
US8340819B2 (en) 2008-09-18 2012-12-25 Intouch Technologies, Inc. Mobile videoconferencing robot system with network adaptive driving
US8374721B2 (en) 2005-12-02 2013-02-12 Irobot Corporation Robot system
US8380350B2 (en) 2005-12-02 2013-02-19 Irobot Corporation Autonomous coverage robot navigation system
US8382906B2 (en) 2005-02-18 2013-02-26 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8386081B2 (en) 2002-09-13 2013-02-26 Irobot Corporation Navigational control system for a robotic device
US8390251B2 (en) 2004-01-21 2013-03-05 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8396592B2 (en) 2001-06-12 2013-03-12 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US8401275B2 (en) 2004-07-13 2013-03-19 Intouch Technologies, Inc. Mobile robot with a head-based movement mapping scheme
US8417383B2 (en) 2006-05-31 2013-04-09 Irobot Corporation Detecting robot stasis
US8442661B1 (en) * 2008-11-25 2013-05-14 Anybots 2.0, Inc. Remotely controlled self-balancing robot including a stabilized laser pointer
US20130131866A1 (en) * 2003-12-09 2013-05-23 Intouch Technologies, Inc. Protocol for a Remotely Controlled Videoconferencing Robot
US8463435B2 (en) 2008-11-25 2013-06-11 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US8515578B2 (en) 2002-09-13 2013-08-20 Irobot Corporation Navigational control system for a robotic device
US8515577B2 (en) 2002-07-25 2013-08-20 Yulun Wang Medical tele-robotic system with a master remote station with an arbitrator
US8584307B2 (en) 2005-12-02 2013-11-19 Irobot Corporation Modular robot
US8634960B2 (en) 2006-03-17 2014-01-21 Irobot Corporation Lawn care robot
US8670017B2 (en) 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US8688275B1 (en) 2012-01-25 2014-04-01 Adept Technology, Inc. Positive and negative obstacle avoidance system and method for a mobile robot
US20140121876A1 (en) * 2012-10-30 2014-05-01 Agait Technology Corporation Autonomous mobile device and operating method for the same
US8718837B2 (en) 2011-01-28 2014-05-06 Intouch Technologies Interfacing with a mobile telepresence robot
US8739355B2 (en) 2005-02-18 2014-06-03 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8780342B2 (en) 2004-03-29 2014-07-15 Irobot Corporation Methods and apparatus for position estimation using reflected light sources
US8788092B2 (en) 2000-01-24 2014-07-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8786429B2 (en) 2009-03-02 2014-07-22 Diversey, Inc. Hygiene monitoring and management system and method
US8788096B1 (en) 2010-05-17 2014-07-22 Anybots 2.0, Inc. Self-balancing robot having a shaft-mounted head
US8800107B2 (en) 2010-02-16 2014-08-12 Irobot Corporation Vacuum brush
US8836751B2 (en) 2011-11-08 2014-09-16 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
US8849680B2 (en) 2009-01-29 2014-09-30 Intouch Technologies, Inc. Documentation through a remote presence robot
US8849679B2 (en) 2006-06-15 2014-09-30 Intouch Technologies, Inc. Remote controlled robot system that provides medical images
US8861750B2 (en) 2008-04-17 2014-10-14 Intouch Technologies, Inc. Mobile tele-presence system with a microphone system
US8892251B1 (en) 2010-01-06 2014-11-18 Irobot Corporation System and method for autonomous mopping of a floor surface
US8892260B2 (en) 2007-03-20 2014-11-18 Irobot Corporation Mobile robot for telecommunication
US8897920B2 (en) 2009-04-17 2014-11-25 Intouch Technologies, Inc. Tele-presence robot system with software modularity, projector and laser pointer
US8902278B2 (en) 2012-04-11 2014-12-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US8930023B2 (en) 2009-11-06 2015-01-06 Irobot Corporation Localization by learning of wave-signal distributions
US8930019B2 (en) 2010-12-30 2015-01-06 Irobot Corporation Mobile human interface robot
US8935005B2 (en) 2010-05-20 2015-01-13 Irobot Corporation Operating a mobile robot
US8972052B2 (en) 2004-07-07 2015-03-03 Irobot Corporation Celestial navigation system for an autonomous vehicle
US8984685B2 (en) 2012-02-15 2015-03-24 Stryker Corporation Patient support apparatus and controls therefor
US8996165B2 (en) 2008-10-21 2015-03-31 Intouch Technologies, Inc. Telepresence robot with a camera boom
US9008835B2 (en) 2004-06-24 2015-04-14 Irobot Corporation Remote control scheduler and method for autonomous robotic device
US9014848B2 (en) 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US9138891B2 (en) 2008-11-25 2015-09-22 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US9160783B2 (en) 2007-05-09 2015-10-13 Intouch Technologies, Inc. Robot system that operates through a network firewall
US9174342B2 (en) 2012-05-22 2015-11-03 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US9193065B2 (en) 2008-07-10 2015-11-24 Intouch Technologies, Inc. Docking system for a tele-presence robot
US9198728B2 (en) 2005-09-30 2015-12-01 Intouch Technologies, Inc. Multi-camera mobile teleconferencing platform
US9220389B2 (en) 2013-11-12 2015-12-29 Irobot Corporation Cleaning pad
US9241442B2 (en) 2012-10-23 2016-01-26 Daniel A. DIAZDELCASTILLO Autonomous and remote control all purpose machine (ARCAPM)
US9251313B2 (en) 2012-04-11 2016-02-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US9264664B2 (en) 2010-12-03 2016-02-16 Intouch Technologies, Inc. Systems and methods for dynamic bandwidth allocation
US9265396B1 (en) 2015-03-16 2016-02-23 Irobot Corporation Autonomous floor cleaning with removable pad
US9320398B2 (en) 2005-12-02 2016-04-26 Irobot Corporation Autonomous coverage robots
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US9420741B2 (en) 2014-12-15 2016-08-23 Irobot Corporation Robot lawnmower mapping
US9427127B2 (en) 2013-11-12 2016-08-30 Irobot Corporation Autonomous surface cleaning robot
US9498886B2 (en) 2010-05-20 2016-11-22 Irobot Corporation Mobile human interface robot
US9510505B2 (en) 2014-10-10 2016-12-06 Irobot Corporation Autonomous robot localization
US9516806B2 (en) 2014-10-10 2016-12-13 Irobot Corporation Robotic lawn mowing boundary determination
US9538702B2 (en) 2014-12-22 2017-01-10 Irobot Corporation Robotic mowing of separated lawn areas
US9554508B2 (en) 2014-03-31 2017-01-31 Irobot Corporation Autonomous mobile robot
US9568911B2 (en) 2012-11-30 2017-02-14 Tennant Company Dynamic maintenance scheduling system for surface cleaning machines
US9580285B2 (en) 2011-08-26 2017-02-28 Crown Equipment Corporation Method and apparatus for using unique landmarks to locate industrial vehicles at start-up
US9610685B2 (en) 2004-02-26 2017-04-04 Intouch Technologies, Inc. Graphical interface for a remote presence system

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070156372A1 (en) * 2003-07-31 2007-07-05 Thomas Christ Determining distances in a warehouse
US7376487B2 (en) * 2003-11-25 2008-05-20 International Business Machines Corporation Nesting negotiation for self-mobile devices
KR101086092B1 (en) * 2004-01-21 2011-11-25 Irobot Corporation Method of docking an autonomous robot
US11209833B2 (en) 2004-07-07 2021-12-28 Irobot Corporation Celestial navigation system for an autonomous vehicle
US11835343B1 (en) * 2004-08-06 2023-12-05 AI Incorporated Method for constructing a map while performing work
KR100809342B1 (en) * 2004-10-05 2008-03-05 Samsung Electronics Co., Ltd. Apparatus and method for navigation based on intensity of illumination
US7720572B2 (en) * 2005-09-30 2010-05-18 Irobot Corporation Companion robot for personal interaction
US7826926B2 (en) * 2005-11-07 2010-11-02 Samsung Electronics Co., Ltd. Robot and method of localizing the same
KR100791384B1 (en) * 2006-07-05 2008-01-07 Samsung Electronics Co., Ltd. Method for dividing regions by feature points and apparatus thereof and mobile cleaning robot
JP5027735B2 (en) * 2007-05-25 2012-09-19 Sapporo Breweries Ltd. Method for producing sparkling alcoholic beverage
EP2612208A1 (en) * 2010-09-03 2013-07-10 Aldebaran Robotics Mobile robot
US9114440B1 (en) * 2013-05-02 2015-08-25 Michael A. Colucci Outdoor home cleaning robot—system and method
US10518407B2 (en) 2015-01-06 2019-12-31 Discovery Robotics Apparatus and methods for providing a reconfigurable robotic platform
US10328573B2 (en) 2015-01-06 2019-06-25 Discovery Robotics Robotic platform with teach-repeat mode
GB2538231A (en) * 2015-05-07 2016-11-16 Airbus Operations Ltd Method and apparatus for aircraft inspection
JP6445151B2 (en) * 2015-05-22 2018-12-26 Fujifilm Corporation Robot apparatus and movement control method of robot apparatus
US10496262B1 (en) * 2015-09-30 2019-12-03 AI Incorporated Robotic floor-cleaning system manager
US10180733B2 (en) * 2015-12-22 2019-01-15 Kindred Systems Inc. Systems, devices, and methods for foot control of robots
JP6212590B2 (en) * 2016-03-31 2017-10-11 Honda Motor Co., Ltd. Control device for autonomous vehicle
USD869108S1 (en) 2016-07-14 2019-12-03 Discovery Robotics Robot comprising a service module
GB2553327B (en) * 2016-09-01 2021-11-03 M Mover Holdings Ltd An apparatus for transporting a load
GB2553326B (en) * 2016-09-01 2021-11-03 M Mover Holdings Ltd An apparatus for transporting a load
US10422648B2 (en) * 2017-10-17 2019-09-24 AI Incorporated Methods for finding the perimeter of a place using observed coordinates
WO2019203878A1 (en) * 2018-04-20 2019-10-24 Discovery Robotics Apparatus and methods of a service robotic platform
JP7070107B2 (en) * 2018-06-05 2022-05-18 Fujitsu Limited Information processing equipment, work planning program and work planning method
US10835096B2 (en) * 2018-08-30 2020-11-17 Irobot Corporation Map based training and interface for mobile robots
CN117140560B (en) * 2023-11-01 2024-01-23 Taiyuan Avatar Robot Technology Co., Ltd. Intelligent robot with display screen self-cleaning function

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4782550A (en) * 1988-02-12 1988-11-08 Von Schrader Company Automatic surface-treating apparatus
US4968878A (en) * 1989-02-07 1990-11-06 Transitions Research Corporation Dual bumper-light curtain obstacle detection sensor
US5363305A (en) 1990-07-02 1994-11-08 Nec Research Institute, Inc. Navigation system for a mobile robot
US5363535A (en) * 1992-03-30 1994-11-15 Racine Industries, Inc. Carpet cleaning machine with convertible-use feature
US5309592A (en) * 1992-06-23 1994-05-10 Sanyo Electric Co., Ltd. Cleaning robot
US5911767A (en) 1994-10-04 1999-06-15 Garibotto; Giovanni Navigation system for an autonomous mobile robot
US6108597A (en) 1996-03-06 2000-08-22 Gmd-Forschungszentrum Informationstechnik Gmbh Autonomous mobile robot system for sensor-based and map-based navigation in pipe networks
US6124694A (en) * 1999-03-18 2000-09-26 Bancroft; Allen J. Wide area navigation for a robot scrubber
US6459955B1 (en) * 1999-11-18 2002-10-01 The Procter & Gamble Company Home cleaning robot

Cited By (382)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050251292A1 (en) * 2000-01-24 2005-11-10 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8412377B2 (en) 2000-01-24 2013-04-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US9446521B2 (en) 2000-01-24 2016-09-20 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8478442B2 (en) 2000-01-24 2013-07-02 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8565920B2 (en) 2000-01-24 2013-10-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8761935B2 (en) 2000-01-24 2014-06-24 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US8788092B2 (en) 2000-01-24 2014-07-22 Irobot Corporation Obstacle following sensor scheme for a mobile robot
US9144361B2 (en) 2000-04-04 2015-09-29 Irobot Corporation Debris sensor for cleaning apparatus
US9167946B2 (en) 2001-01-24 2015-10-27 Irobot Corporation Autonomous floor cleaning robot
US9038233B2 (en) 2001-01-24 2015-05-26 Irobot Corporation Autonomous floor-cleaning robot
US8659256B2 (en) 2001-01-24 2014-02-25 Irobot Corporation Robot confinement
US8659255B2 (en) 2001-01-24 2014-02-25 Irobot Corporation Robot confinement
US20080084174A1 (en) * 2001-01-24 2008-04-10 Irobot Corporation Robot Confinement
US9582005B2 (en) 2001-01-24 2017-02-28 Irobot Corporation Robot confinement
US8368339B2 (en) 2001-01-24 2013-02-05 Irobot Corporation Robot confinement
US9622635B2 (en) 2001-01-24 2017-04-18 Irobot Corporation Autonomous floor-cleaning robot
US8463438B2 (en) 2001-06-12 2013-06-11 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US7663333B2 (en) 2001-06-12 2010-02-16 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US8838274B2 (en) 2001-06-12 2014-09-16 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US8396592B2 (en) 2001-06-12 2013-03-12 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US9104204B2 (en) 2001-06-12 2015-08-11 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US20100263142A1 (en) * 2001-06-12 2010-10-21 Irobot Corporation Method and system for multi-mode coverage for an autonomous robot
US20040249511A1 (en) * 2001-10-11 2004-12-09 Markus Jager Method, arrangement and computer programme with programme-coding means and computer programme product for allocating a partial surface of a total surface divided into several partial surfaces on one of several mobile units
US7792606B2 (en) * 2001-10-11 2010-09-07 Siemens Aktiengesellschaft System for assigning one of many part domains in a domain to one of many mobile units
US8656550B2 (en) 2002-01-03 2014-02-25 Irobot Corporation Autonomous floor-cleaning robot
US20070266508A1 (en) * 2002-01-03 2007-11-22 Irobot Corporation Autonomous Floor Cleaning Robot
US20080000041A1 (en) * 2002-01-03 2008-01-03 Irobot Corporation Autonomous Floor Cleaning Robot
US20040187249A1 (en) * 2002-01-03 2004-09-30 Jones Joseph L. Autonomous floor-cleaning robot
US8671507B2 (en) 2002-01-03 2014-03-18 Irobot Corporation Autonomous floor-cleaning robot
US8474090B2 (en) 2002-01-03 2013-07-02 Irobot Corporation Autonomous floor-cleaning robot
US8763199B2 (en) 2002-01-03 2014-07-01 Irobot Corporation Autonomous floor-cleaning robot
US8516651B2 (en) 2002-01-03 2013-08-27 Irobot Corporation Autonomous floor-cleaning robot
US20080307590A1 (en) * 2002-01-03 2008-12-18 Irobot Corporation Autonomous Floor-Cleaning Robot
US9128486B2 (en) 2002-01-24 2015-09-08 Irobot Corporation Navigational control system for a robotic device
USRE45870E1 (en) 2002-07-25 2016-01-26 Intouch Technologies, Inc. Apparatus and method for patient rounding with a remote controlled robot
US9849593B2 (en) 2002-07-25 2017-12-26 Intouch Technologies, Inc. Medical tele-robotic system with a master remote station with an arbitrator
US8209051B2 (en) 2002-07-25 2012-06-26 Intouch Technologies, Inc. Medical tele-robotic system
US8515577B2 (en) 2002-07-25 2013-08-20 Yulun Wang Medical tele-robotic system with a master remote station with an arbitrator
US10315312B2 (en) 2002-07-25 2019-06-11 Intouch Technologies, Inc. Medical tele-robotic system with a master remote station with an arbitrator
US8515578B2 (en) 2002-09-13 2013-08-20 Irobot Corporation Navigational control system for a robotic device
US8793020B2 (en) 2002-09-13 2014-07-29 Irobot Corporation Navigational control system for a robotic device
US8386081B2 (en) 2002-09-13 2013-02-26 Irobot Corporation Navigational control system for a robotic device
US9949608B2 (en) 2002-09-13 2018-04-24 Irobot Corporation Navigational control system for a robotic device
US7320149B1 (en) 2002-11-22 2008-01-22 Bissell Homecare, Inc. Robotic extraction cleaner with dusting pad
US20040168837A1 (en) * 2002-11-27 2004-09-02 Universite De Sherbrooke Modular robotic platform
US20040158354A1 (en) * 2002-12-30 2004-08-12 Samsung Electronics Co., Ltd. Robot localization system
US7970491B2 (en) * 2002-12-30 2011-06-28 Samsung Electronics Co., Ltd. Robot localization system
US7103449B2 (en) * 2003-01-23 2006-09-05 Lg Electronics Inc. Position information recognition apparatus for cleaning robot
US20040199301A1 (en) * 2003-01-23 2004-10-07 Lg Electronics Inc. Position information recognition apparatus for cleaning robot
US7031805B2 (en) * 2003-02-06 2006-04-18 Samsung Gwangju Electronics Co., Ltd. Robot cleaner system having external recharging apparatus and method for docking robot cleaner with external recharging apparatus
US20040158357A1 (en) * 2003-02-06 2004-08-12 Samsung Gwangju Electronics Co., Ltd Robot cleaner system having external recharging apparatus and method for docking robot cleaner with external recharging apparatus
US8115439B2 (en) 2003-02-26 2012-02-14 Silverbrook Research Pty Ltd System for moving mobile robots in accordance with predetermined algorithm
US20100013153A1 (en) * 2003-02-26 2010-01-21 Silverbrook Research Pty Ltd Game System With Robotic Game Pieces
US20080144978A1 (en) * 2003-02-26 2008-06-19 Silverbrook Research Pty Ltd Mobile Robot For Sensing And Decoding A Surface Coding Pattern On A Surface
US7605557B2 (en) * 2003-02-26 2009-10-20 Silverbrook Research Pty Ltd Mobile robot for sensing and decoding a surface coding pattern on a surface
US7893646B2 (en) 2003-02-26 2011-02-22 Silverbrook Research Pty Ltd Game system with robotic game pieces
US20040204804A1 (en) * 2003-04-08 2004-10-14 Samsung Electronics Co., Ltd. Method and apparatus for generating and tracing cleaning trajectory of home cleaning robot
US7860608B2 (en) 2003-04-08 2010-12-28 Samsung Electronics Co., Ltd. Method and apparatus for generating and tracing cleaning trajectory of home cleaning robot
US7546179B2 (en) * 2003-04-15 2009-06-09 Samsung Electronics Co., Ltd. Method and apparatus for allowing mobile robot to return to docking station
US20040210346A1 (en) * 2003-04-15 2004-10-21 Samsung Electronics Co., Ltd. Method and apparatus for allowing mobile robot to return to docking station
US20070100496A1 (en) * 2003-05-27 2007-05-03 Stockholmsmässan Robot system, method and computer program product
US7474941B2 (en) * 2003-07-24 2009-01-06 Samsung Gwangju Electronics Co., Ltd. Robot cleaner
US20050021181A1 (en) * 2003-07-24 2005-01-27 Samsung Gwangju Electronics Co., Ltd. Robot cleaner
US9956690B2 (en) 2003-12-09 2018-05-01 Intouch Technologies, Inc. Protocol for a remotely controlled videoconferencing robot
US9375843B2 (en) 2003-12-09 2016-06-28 Intouch Technologies, Inc. Protocol for a remotely controlled videoconferencing robot
US10882190B2 (en) 2003-12-09 2021-01-05 Teladoc Health, Inc. Protocol for a remotely controlled videoconferencing robot
US20130131866A1 (en) * 2003-12-09 2013-05-23 Intouch Technologies, Inc. Protocol for a Remotely Controlled Videoconferencing Robot
US9296107B2 (en) * 2003-12-09 2016-03-29 Intouch Technologies, Inc. Protocol for a remotely controlled videoconferencing robot
US8461803B2 (en) 2004-01-21 2013-06-11 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8390251B2 (en) 2004-01-21 2013-03-05 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US9215957B2 (en) 2004-01-21 2015-12-22 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8749196B2 (en) 2004-01-21 2014-06-10 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8854001B2 (en) 2004-01-21 2014-10-07 Irobot Corporation Autonomous robot auto-docking and energy management systems and methods
US8253368B2 (en) 2004-01-28 2012-08-28 Irobot Corporation Debris sensor for cleaning apparatus
US8378613B2 (en) 2004-01-28 2013-02-19 Irobot Corporation Debris sensor for cleaning apparatus
US8456125B2 (en) 2004-01-28 2013-06-04 Irobot Corporation Debris sensor for cleaning apparatus
US20080150466A1 (en) * 2004-01-28 2008-06-26 Landry Gregg W Debris Sensor for Cleaning Apparatus
US9610685B2 (en) 2004-02-26 2017-04-04 Intouch Technologies, Inc. Graphical interface for a remote presence system
US8780342B2 (en) 2004-03-29 2014-07-15 Irobot Corporation Methods and apparatus for position estimation using reflected light sources
US9360300B2 (en) 2004-03-29 2016-06-07 Irobot Corporation Methods and apparatus for position estimation using reflected light sources
US20050222711A1 (en) * 2004-04-01 2005-10-06 Kabushiki Kaisha Toshiba Robot and a robot control method
US7603744B2 (en) * 2004-04-02 2009-10-20 Royal Appliance Mfg. Co. Robotic appliance with on-board joystick sensor and associated methods of operation
CN1683120B (en) * 2004-04-02 2010-04-28 Royal Appliance Mfg. Co. Robotic appliance with on-board joystick sensor and associated methods of operation
US20050217061A1 (en) * 2004-04-02 2005-10-06 Royal Appliance Mfg. Co. Robotic appliance with on-board joystick sensor and associated methods of operation
US20050239594A1 (en) * 2004-04-23 2005-10-27 Alto U.S. Inc. Joystick controlled scrubber
US7041029B2 (en) 2004-04-23 2006-05-09 Alto U.S. Inc. Joystick controlled scrubber
US9486924B2 (en) 2004-06-24 2016-11-08 Irobot Corporation Remote control scheduler and method for autonomous robotic device
US9008835B2 (en) 2004-06-24 2015-04-14 Irobot Corporation Remote control scheduler and method for autonomous robotic device
US7706917B1 (en) 2004-07-07 2010-04-27 Irobot Corporation Celestial navigation system for an autonomous robot
US8594840B1 (en) 2004-07-07 2013-11-26 Irobot Corporation Celestial navigation system for an autonomous robot
US8972052B2 (en) 2004-07-07 2015-03-03 Irobot Corporation Celestial navigation system for an autonomous vehicle
US9229454B1 (en) 2004-07-07 2016-01-05 Irobot Corporation Autonomous mobile robot system
US11360484B2 (en) * 2004-07-07 2022-06-14 Irobot Corporation Celestial navigation system for an autonomous vehicle
US8634956B1 (en) 2004-07-07 2014-01-21 Irobot Corporation Celestial navigation system for an autonomous robot
US9223749B2 (en) 2004-07-07 2015-12-29 Irobot Corporation Celestial navigation system for an autonomous vehicle
US8874264B1 (en) 2004-07-07 2014-10-28 Irobot Corporation Celestial navigation system for an autonomous robot
US9766624B2 (en) 2004-07-13 2017-09-19 Intouch Technologies, Inc. Mobile robot with a head-based movement mapping scheme
US8401275B2 (en) 2004-07-13 2013-03-19 Intouch Technologies, Inc. Mobile robot with a head-based movement mapping scheme
US8983174B2 (en) 2004-07-13 2015-03-17 Intouch Technologies, Inc. Mobile robot with a head-based movement mapping scheme
US10241507B2 (en) 2004-07-13 2019-03-26 Intouch Technologies, Inc. Mobile robot with a head-based movement mapping scheme
US20060061657A1 (en) * 2004-09-23 2006-03-23 Lg Electronics Inc. Remote observation system and method thereof
US20060149419A1 (en) * 2004-11-30 2006-07-06 Kabushiki Kaisha Toshiba Movable robot without falling over
US7551978B2 (en) * 2004-12-14 2009-06-23 Honda Motor Co., Ltd. System for carrying an item
US20060142896A1 (en) * 2004-12-14 2006-06-29 Honda Motor Co., Ltd. System for carrying an item
US8774966B2 (en) 2005-02-18 2014-07-08 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US10213081B2 (en) 2005-02-18 2019-02-26 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8966707B2 (en) 2005-02-18 2015-03-03 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US9706891B2 (en) 2005-02-18 2017-07-18 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8985127B2 (en) 2005-02-18 2015-03-24 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US9445702B2 (en) 2005-02-18 2016-09-20 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8855813B2 (en) 2005-02-18 2014-10-07 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8382906B2 (en) 2005-02-18 2013-02-26 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US11185204B2 (en) 2005-02-18 2021-11-30 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US10470629B2 (en) 2005-02-18 2019-11-12 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8670866B2 (en) 2005-02-18 2014-03-11 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8392021B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet cleaning
US8387193B2 (en) 2005-02-18 2013-03-05 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US7761954B2 (en) 2005-02-18 2010-07-27 Irobot Corporation Autonomous surface cleaning robot for wet and dry cleaning
US8739355B2 (en) 2005-02-18 2014-06-03 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US8782848B2 (en) 2005-02-18 2014-07-22 Irobot Corporation Autonomous surface cleaning robot for dry cleaning
US20060287801A1 (en) * 2005-06-07 2006-12-21 Lg Electronics Inc. Apparatus and method for notifying state of self-moving robot
US9198728B2 (en) 2005-09-30 2015-12-01 Intouch Technologies, Inc. Multi-camera mobile teleconferencing platform
US10259119B2 (en) 2005-09-30 2019-04-16 Intouch Technologies, Inc. Multi-camera mobile teleconferencing platform
US9020679B2 (en) 2005-10-14 2015-04-28 Aethon, Inc. Robotic ordering and delivery system software and methods
US20070112461A1 (en) * 2005-10-14 2007-05-17 Aldo Zini Robotic ordering and delivery system software and methods
US20110163160A1 (en) * 2005-10-14 2011-07-07 Aethon, Inc. Robotic Ordering and Delivery System Software and Methods
US9679270B2 (en) 2005-10-14 2017-06-13 Aethon, Inc. Robotic ordering and delivery system software and methods
US9563206B2 (en) 2005-10-14 2017-02-07 Aethon, Inc. Robotic ordering and delivery system software and methods
US9026301B2 (en) * 2005-10-14 2015-05-05 Aethon, Inc. Robotic ordering and delivery system software and methods
US8380350B2 (en) 2005-12-02 2013-02-19 Irobot Corporation Autonomous coverage robot navigation system
US8584307B2 (en) 2005-12-02 2013-11-19 Irobot Corporation Modular robot
US9144360B2 (en) 2005-12-02 2015-09-29 Irobot Corporation Autonomous coverage robot navigation system
US8950038B2 (en) 2005-12-02 2015-02-10 Irobot Corporation Modular robot
US8661605B2 (en) 2005-12-02 2014-03-04 Irobot Corporation Coverage robot mobility
US8600553B2 (en) 2005-12-02 2013-12-03 Irobot Corporation Coverage robot mobility
US20070234492A1 (en) * 2005-12-02 2007-10-11 Irobot Corporation Coverage robot mobility
US9392920B2 (en) 2005-12-02 2016-07-19 Irobot Corporation Robot system
US10524629B2 (en) 2005-12-02 2020-01-07 Irobot Corporation Modular Robot
US8606401B2 (en) 2005-12-02 2013-12-10 Irobot Corporation Autonomous coverage robot navigation system
US8374721B2 (en) 2005-12-02 2013-02-12 Irobot Corporation Robot system
US9320398B2 (en) 2005-12-02 2016-04-26 Irobot Corporation Autonomous coverage robots
US8761931B2 (en) 2005-12-02 2014-06-24 Irobot Corporation Robot system
US8954192B2 (en) 2005-12-02 2015-02-10 Irobot Corporation Navigating autonomous coverage robots
US9599990B2 (en) 2005-12-02 2017-03-21 Irobot Corporation Robot system
US8978196B2 (en) 2005-12-02 2015-03-17 Irobot Corporation Coverage robot mobility
US8584305B2 (en) 2005-12-02 2013-11-19 Irobot Corporation Modular robot
US9149170B2 (en) 2005-12-02 2015-10-06 Irobot Corporation Navigating autonomous coverage robots
US7822508B2 (en) * 2005-12-12 2010-10-26 Honda Motor Co., Ltd. Autonomous mobile robot and goods carrying method of using the same
US20070152619A1 (en) * 2005-12-12 2007-07-05 Honda Motor Co., Ltd. Autonomous mobile robot and goods carrying method of using the same
US8868237B2 (en) 2006-03-17 2014-10-21 Irobot Corporation Robot confinement
US8634960B2 (en) 2006-03-17 2014-01-21 Irobot Corporation Lawn care robot
US9713302B2 (en) 2006-03-17 2017-07-25 Irobot Corporation Robot confinement
US9043952B2 (en) 2006-03-17 2015-06-02 Irobot Corporation Lawn care robot
US8781627B2 (en) 2006-03-17 2014-07-15 Irobot Corporation Robot confinement
US8954193B2 (en) 2006-03-17 2015-02-10 Irobot Corporation Lawn care robot
US11194342B2 (en) 2006-03-17 2021-12-07 Irobot Corporation Lawn care robot
US10037038B2 (en) 2006-03-17 2018-07-31 Irobot Corporation Lawn care robot
US9043953B2 (en) 2006-03-17 2015-06-02 Irobot Corporation Lawn care robot
US10244915B2 (en) 2006-05-19 2019-04-02 Irobot Corporation Coverage robots and associated cleaning bins
US9492048B2 (en) 2006-05-19 2016-11-15 Irobot Corporation Removing debris from cleaning robots
US8572799B2 (en) 2006-05-19 2013-11-05 Irobot Corporation Removing debris from cleaning robots
US8528157B2 (en) 2006-05-19 2013-09-10 Irobot Corporation Coverage robots and associated cleaning bins
US8418303B2 (en) 2006-05-19 2013-04-16 Irobot Corporation Cleaning robot roller processing
US8087117B2 (en) 2006-05-19 2012-01-03 Irobot Corporation Cleaning robot roller processing
US20080052846A1 (en) * 2006-05-19 2008-03-06 Irobot Corporation Cleaning robot roller processing
US9955841B2 (en) 2006-05-19 2018-05-01 Irobot Corporation Removing debris from cleaning robots
US9317038B2 (en) 2006-05-31 2016-04-19 Irobot Corporation Detecting robot stasis
US8417383B2 (en) 2006-05-31 2013-04-09 Irobot Corporation Detecting robot stasis
US8849679B2 (en) 2006-06-15 2014-09-30 Intouch Technologies, Inc. Remote controlled robot system that provides medical images
US7844378B2 (en) 2006-10-05 2010-11-30 Trimble Navigation Limited Farm apparatus having implement sidehill drift compensation
US20080086249A1 (en) * 2006-10-05 2008-04-10 Trimble Navigation Limited Farm apparatus having implement sidehill drift compensation
US7780513B2 (en) * 2007-03-02 2010-08-24 National Taiwan University Of Science And Technology Board game system utilizing a robot arm
US20080214260A1 (en) * 2007-03-02 2008-09-04 National Taiwan University Of Science And Technology Board game system utilizing a robot arm
US8892260B2 (en) 2007-03-20 2014-11-18 Irobot Corporation Mobile robot for telecommunication
US9296109B2 (en) 2007-03-20 2016-03-29 Irobot Corporation Mobile robot for telecommunication
US20080229885A1 (en) * 2007-03-22 2008-09-25 Mah Pat Y Jar opener
WO2008135978A3 (en) * 2007-05-06 2010-02-25 Wave Group Ltd. A robotic platform
US20100179691A1 (en) * 2007-05-06 2010-07-15 Wave Group Ltd. Robotic Platform
WO2008135978A2 (en) * 2007-05-06 2008-11-13 Wave Group Ltd. A robotic platform
US9160783B2 (en) 2007-05-09 2015-10-13 Intouch Technologies, Inc. Robot system that operates through a network firewall
US8839477B2 (en) 2007-05-09 2014-09-23 Irobot Corporation Compact autonomous coverage robot
US8726454B2 (en) 2007-05-09 2014-05-20 Irobot Corporation Autonomous coverage robot
US8239992B2 (en) 2007-05-09 2012-08-14 Irobot Corporation Compact autonomous coverage robot
US11498438B2 (en) 2007-05-09 2022-11-15 Irobot Corporation Autonomous coverage robot
US9480381B2 (en) 2007-05-09 2016-11-01 Irobot Corporation Compact autonomous coverage robot
US10299652B2 (en) 2007-05-09 2019-05-28 Irobot Corporation Autonomous coverage robot
US11072250B2 (en) 2007-05-09 2021-07-27 Irobot Corporation Autonomous coverage robot sensing
US10070764B2 (en) 2007-05-09 2018-09-11 Irobot Corporation Compact autonomous coverage robot
US10682763B2 (en) 2007-05-09 2020-06-16 Intouch Technologies, Inc. Robot system that operates through a network firewall
US8438695B2 (en) 2007-05-09 2013-05-14 Irobot Corporation Autonomous coverage robot sensing
US8634955B2 (en) * 2007-07-18 2014-01-21 Kabushiki Kaisha Toshiba Mobile robot and method for controlling mobile robot
US20090024250A1 (en) * 2007-07-18 2009-01-22 Kabushiki Kaisha Toshiba Mobile Robot and method for controlling mobile robot
US10875182B2 (en) 2008-03-20 2020-12-29 Teladoc Health, Inc. Remote presence system mounted to operating room hardware
US11787060B2 (en) 2008-03-20 2023-10-17 Teladoc Health, Inc. Remote presence system mounted to operating room hardware
US10471588B2 (en) 2008-04-14 2019-11-12 Intouch Technologies, Inc. Robotic based health care system
US11472021B2 (en) 2008-04-14 2022-10-18 Teladoc Health, Inc. Robotic based health care system
US8861750B2 (en) 2008-04-17 2014-10-14 Intouch Technologies, Inc. Mobile tele-presence system with a microphone system
US10730397B2 (en) 2008-04-24 2020-08-04 Irobot Corporation Application of localization, positioning and navigation systems for robotic enabled mobile products
US20110202175A1 (en) * 2008-04-24 2011-08-18 Nikolai Romanov Mobile robot for cleaning
US20110160903A1 (en) * 2008-04-24 2011-06-30 Nikolai Romanov Articulated Joint and Three Points of Contact
US9725012B2 (en) 2008-04-24 2017-08-08 Irobot Corporation Articulated joint and three areas of contact
US10766132B2 (en) 2008-04-24 2020-09-08 Irobot Corporation Mobile robot for cleaning
US8961695B2 (en) 2008-04-24 2015-02-24 Irobot Corporation Mobile robot for cleaning
US9725013B2 (en) 2008-04-24 2017-08-08 Irobot Corporation Robotic floor cleaning apparatus with shell connected to the cleaning assembly and suspended over the drive system
US20110153081A1 (en) * 2008-04-24 2011-06-23 Nikolai Romanov Robotic Floor Cleaning Apparatus with Shell Connected to the Cleaning Assembly and Suspended over the Drive System
US8452450B2 (en) 2008-04-24 2013-05-28 Evolution Robotics, Inc. Application of localization, positioning and navigation systems for robotic enabled mobile products
US9193065B2 (en) 2008-07-10 2015-11-24 Intouch Technologies, Inc. Docking system for a tele-presence robot
US10493631B2 (en) 2008-07-10 2019-12-03 Intouch Technologies, Inc. Docking system for a tele-presence robot
US9842192B2 (en) 2008-07-11 2017-12-12 Intouch Technologies, Inc. Tele-presence robot system with multi-cast features
US10878960B2 (en) 2008-07-11 2020-12-29 Teladoc Health, Inc. Tele-presence robot system with multi-cast features
US8340819B2 (en) 2008-09-18 2012-12-25 Intouch Technologies, Inc. Mobile videoconferencing robot system with network adaptive driving
US9429934B2 (en) 2008-09-18 2016-08-30 Intouch Technologies, Inc. Mobile videoconferencing robot system with network adaptive driving
US8996165B2 (en) 2008-10-21 2015-03-31 Intouch Technologies, Inc. Telepresence robot with a camera boom
US10875183B2 (en) 2008-11-25 2020-12-29 Teladoc Health, Inc. Server connectivity control for tele-presence robot
US8463435B2 (en) 2008-11-25 2013-06-11 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US8442661B1 (en) * 2008-11-25 2013-05-14 Anybots 2.0, Inc. Remotely controlled self-balancing robot including a stabilized laser pointer
US9138891B2 (en) 2008-11-25 2015-09-22 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US10059000B2 (en) 2008-11-25 2018-08-28 Intouch Technologies, Inc. Server connectivity control for a tele-presence robot
US8849680B2 (en) 2009-01-29 2014-09-30 Intouch Technologies, Inc. Documentation through a remote presence robot
US11681288B2 (en) 2009-03-02 2023-06-20 Diversey, Inc. Hygiene monitoring and management system and method
US8786429B2 (en) 2009-03-02 2014-07-22 Diversey, Inc. Hygiene monitoring and management system and method
US11181907B2 (en) 2009-03-02 2021-11-23 Diversey, Inc. Hygiene monitoring and management system and method
US9847015B2 (en) 2009-03-02 2017-12-19 Diversey, Inc. Hygiene monitoring and management system and method
US10782682B2 (en) 2009-03-02 2020-09-22 Diversey, Inc. Hygiene monitoring and management system and method
US8897920B2 (en) 2009-04-17 2014-11-25 Intouch Technologies, Inc. Tele-presence robot system with software modularity, projector and laser pointer
US10969766B2 (en) 2009-04-17 2021-04-06 Teladoc Health, Inc. Tele-presence robot system with software modularity, projector and laser pointer
US10404939B2 (en) 2009-08-26 2019-09-03 Intouch Technologies, Inc. Portable remote presence robot
US20110050841A1 (en) * 2009-08-26 2011-03-03 Yulun Wang Portable remote presence robot
US9602765B2 (en) 2009-08-26 2017-03-21 Intouch Technologies, Inc. Portable remote presence robot
US10911715B2 (en) 2009-08-26 2021-02-02 Teladoc Health, Inc. Portable remote presence robot
US8384755B2 (en) 2009-08-26 2013-02-26 Intouch Technologies, Inc. Portable remote presence robot
US11399153B2 (en) 2009-08-26 2022-07-26 Teladoc Health, Inc. Portable telepresence apparatus
US8930023B2 (en) 2009-11-06 2015-01-06 Irobot Corporation Localization by learning of wave-signal distributions
US20110162157A1 (en) * 2010-01-06 2011-07-07 Evolution Robotics, Inc. Apparatus for holding a cleaning sheet in a cleaning implement
US11350810B2 (en) 2010-01-06 2022-06-07 Irobot Corporation System and method for autonomous mopping of a floor surface
US9370290B2 (en) 2010-01-06 2016-06-21 Irobot Corporation System and method for autonomous mopping of a floor surface
US9167947B2 (en) 2010-01-06 2015-10-27 Irobot Corporation System and method for autonomous mopping of a floor surface
US8316499B2 (en) 2010-01-06 2012-11-27 Evolution Robotics, Inc. Apparatus for holding a cleaning sheet in a cleaning implement
US9179813B2 (en) 2010-01-06 2015-11-10 Irobot Corporation System and method for autonomous mopping of a floor surface
US8869338B1 (en) 2010-01-06 2014-10-28 Irobot Corporation Apparatus for holding a cleaning sheet in a cleaning implement
US8892251B1 (en) 2010-01-06 2014-11-18 Irobot Corporation System and method for autonomous mopping of a floor surface
US9801518B2 (en) 2010-01-06 2017-10-31 Irobot Corporation System and method for autonomous mopping of a floor surface
US10258214B2 (en) 2010-01-06 2019-04-16 Irobot Corporation System and method for autonomous mopping of a floor surface
US11154981B2 (en) 2010-02-04 2021-10-26 Teladoc Health, Inc. Robot user interface for telepresence robot system
US8800107B2 (en) 2010-02-16 2014-08-12 Irobot Corporation Vacuum brush
US10314449B2 (en) 2010-02-16 2019-06-11 Irobot Corporation Vacuum brush
US11058271B2 (en) 2010-02-16 2021-07-13 Irobot Corporation Vacuum brush
US10887545B2 (en) 2010-03-04 2021-01-05 Teladoc Health, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US9089972B2 (en) 2010-03-04 2015-07-28 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US11798683B2 (en) 2010-03-04 2023-10-24 Teladoc Health, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US8670017B2 (en) 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US8788096B1 (en) 2010-05-17 2014-07-22 Anybots 2.0, Inc. Self-balancing robot having a shaft-mounted head
US9014848B2 (en) 2010-05-20 2015-04-21 Irobot Corporation Mobile robot system
US8935005B2 (en) 2010-05-20 2015-01-13 Irobot Corporation Operating a mobile robot
US9902069B2 (en) 2010-05-20 2018-02-27 Irobot Corporation Mobile robot system
US9498886B2 (en) 2010-05-20 2016-11-22 Irobot Corporation Mobile human interface robot
US11389962B2 (en) 2010-05-24 2022-07-19 Teladoc Health, Inc. Telepresence robot system that can be accessed by a cellular phone
US10343283B2 (en) 2010-05-24 2019-07-09 Intouch Technologies, Inc. Telepresence robot system that can be accessed by a cellular phone
US10808882B2 (en) 2010-05-26 2020-10-20 Intouch Technologies, Inc. Tele-robotic system with a robot face placed on a chair
US9264664B2 (en) 2010-12-03 2016-02-16 Intouch Technologies, Inc. Systems and methods for dynamic bandwidth allocation
US10218748B2 (en) 2010-12-03 2019-02-26 Intouch Technologies, Inc. Systems and methods for dynamic bandwidth allocation
US8930019B2 (en) 2010-12-30 2015-01-06 Irobot Corporation Mobile human interface robot
US8965579B2 (en) 2011-01-28 2015-02-24 Intouch Technologies Interfacing with a mobile telepresence robot
US8718837B2 (en) 2011-01-28 2014-05-06 Intouch Technologies Interfacing with a mobile telepresence robot
US9785149B2 (en) 2011-01-28 2017-10-10 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US11468983B2 (en) 2011-01-28 2022-10-11 Teladoc Health, Inc. Time-dependent navigation of telepresence robots
US10399223B2 (en) 2011-01-28 2019-09-03 Intouch Technologies, Inc. Interfacing with a mobile telepresence robot
US10591921B2 (en) 2011-01-28 2020-03-17 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US9469030B2 (en) 2011-01-28 2016-10-18 Intouch Technologies Interfacing with a mobile telepresence robot
US11289192B2 (en) 2011-01-28 2022-03-29 Intouch Technologies, Inc. Interfacing with a mobile telepresence robot
US9958873B2 (en) 2011-04-11 2018-05-01 Crown Equipment Corporation System for efficient scheduling for multiple automated non-holonomic vehicles using a coordinated path planner
US10769739B2 (en) 2011-04-25 2020-09-08 Intouch Technologies, Inc. Systems and methods for management of information among medical providers and facilities
US9974612B2 (en) 2011-05-19 2018-05-22 Intouch Technologies, Inc. Enhanced diagnostics for a telepresence robot
US10611613B2 (en) 2011-08-26 2020-04-07 Crown Equipment Corporation Systems and methods for pose development using retrieved position of a pallet or product load to be picked up
US9580285B2 (en) 2011-08-26 2017-02-28 Crown Equipment Corporation Method and apparatus for using unique landmarks to locate industrial vehicles at start-up
US10331323B2 (en) 2011-11-08 2019-06-25 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
US8836751B2 (en) 2011-11-08 2014-09-16 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
US9715337B2 (en) 2011-11-08 2017-07-25 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
US8688275B1 (en) 2012-01-25 2014-04-01 Adept Technology, Inc. Positive and negative obstacle avoidance system and method for a mobile robot
US9592609B2 (en) 2012-01-25 2017-03-14 Omron Adept Technologies, Inc. Autonomous mobile robot for handling job assignments in a physical environment inhabited by stationary and non-stationary obstacles
US10089586B2 (en) 2012-02-08 2018-10-02 Omron Adept Technologies, Inc. Job management system for a fleet of autonomous mobile robots
US8984685B2 (en) 2012-02-15 2015-03-24 Stryker Corporation Patient support apparatus and controls therefor
US10762170B2 (en) 2012-04-11 2020-09-01 Intouch Technologies, Inc. Systems and methods for visualizing patient and telepresence device statistics in a healthcare network
US8902278B2 (en) 2012-04-11 2014-12-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US9251313B2 (en) 2012-04-11 2016-02-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US11205510B2 (en) 2012-04-11 2021-12-21 Teladoc Health, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US10328576B2 (en) 2012-05-22 2019-06-25 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US10658083B2 (en) 2012-05-22 2020-05-19 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US11515049B2 (en) 2012-05-22 2022-11-29 Teladoc Health, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US10061896B2 (en) 2012-05-22 2018-08-28 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US9174342B2 (en) 2012-05-22 2015-11-03 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US10603792B2 (en) 2012-05-22 2020-03-31 Intouch Technologies, Inc. Clinical workflows utilizing autonomous and semiautonomous telemedicine devices
US11628571B2 (en) 2012-05-22 2023-04-18 Teladoc Health, Inc. Social behavior rules for a medical telepresence robot
US9776327B2 (en) 2012-05-22 2017-10-03 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US10780582B2 (en) 2012-05-22 2020-09-22 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US11453126B2 (en) 2012-05-22 2022-09-27 Teladoc Health, Inc. Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
US10892052B2 (en) 2012-05-22 2021-01-12 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US9939529B2 (en) 2012-08-27 2018-04-10 Aktiebolaget Electrolux Robot positioning system
US9241442B2 (en) 2012-10-23 2016-01-26 Daniel A. DIAZDELCASTILLO Autonomous and remote control all purpose machine (ARCAPM)
US20140121876A1 (en) * 2012-10-30 2014-05-01 Agait Technology Corporation Autonomous mobile device and operating method for the same
US8918241B2 (en) * 2012-10-30 2014-12-23 Agait Technology Corporation Autonomous mobile device and operating method for the same
US11910128B2 (en) 2012-11-26 2024-02-20 Teladoc Health, Inc. Enhanced video interaction for a user interface of a telepresence network
US10334205B2 (en) 2012-11-26 2019-06-25 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US10924708B2 (en) 2012-11-26 2021-02-16 Teladoc Health, Inc. Enhanced video interaction for a user interface of a telepresence network
US9568911B2 (en) 2012-11-30 2017-02-14 Tennant Company Dynamic maintenance scheduling system for surface cleaning machines
US10448794B2 (en) 2013-04-15 2019-10-22 Aktiebolaget Electrolux Robotic vacuum cleaner
US10219665B2 (en) 2013-04-15 2019-03-05 Aktiebolaget Electrolux Robotic vacuum cleaner with protruding sidebrush
US10111570B2 (en) 2013-06-03 2018-10-30 Bissell Homecare, Inc. Autonomous floor cleaner
US9775485B2 (en) 2013-06-03 2017-10-03 Bissell Homecare, Inc. Autonomous floor cleaner
US10952584B2 (en) 2013-06-03 2021-03-23 Bissell Inc. Autonomous floor cleaner
US9427127B2 (en) 2013-11-12 2016-08-30 Irobot Corporation Autonomous surface cleaning robot
US9220389B2 (en) 2013-11-12 2015-12-29 Irobot Corporation Cleaning pad
US9615712B2 (en) 2013-11-12 2017-04-11 Irobot Corporation Mobile floor cleaning robot
US10398277B2 (en) 2013-11-12 2019-09-03 Irobot Corporation Floor cleaning robot
US11272822B2 (en) 2013-11-12 2022-03-15 Irobot Corporation Mobile floor cleaning robot with pad holder
US10045675B2 (en) 2013-12-19 2018-08-14 Aktiebolaget Electrolux Robotic vacuum cleaner with side brush moving in spiral pattern
US9811089B2 (en) 2013-12-19 2017-11-07 Aktiebolaget Electrolux Robotic cleaning device with perimeter recording function
US10433697B2 (en) 2013-12-19 2019-10-08 Aktiebolaget Electrolux Adaptive speed control of rotating side brush
US10209080B2 (en) 2013-12-19 2019-02-19 Aktiebolaget Electrolux Robotic cleaning device
US9946263B2 (en) 2013-12-19 2018-04-17 Aktiebolaget Electrolux Prioritizing cleaning areas
US10149589B2 (en) 2013-12-19 2018-12-11 Aktiebolaget Electrolux Sensing climb of obstacle of a robotic cleaning device
US10617271B2 (en) 2013-12-19 2020-04-14 Aktiebolaget Electrolux Robotic cleaning device and method for landmark recognition
US10231591B2 (en) 2013-12-20 2019-03-19 Aktiebolaget Electrolux Dust container
US9554508B2 (en) 2014-03-31 2017-01-31 Irobot Corporation Autonomous mobile robot
US10518416B2 (en) 2014-07-10 2019-12-31 Aktiebolaget Electrolux Method for detecting a measurement error in a robotic cleaning device
US10729297B2 (en) 2014-09-08 2020-08-04 Aktiebolaget Electrolux Robotic vacuum cleaner
US10499778B2 (en) 2014-09-08 2019-12-10 Aktiebolaget Electrolux Robotic vacuum cleaner
US10067232B2 (en) 2014-10-10 2018-09-04 Irobot Corporation Autonomous robot localization
US9510505B2 (en) 2014-10-10 2016-12-06 Irobot Corporation Autonomous robot localization
US9854737B2 (en) 2014-10-10 2018-01-02 Irobot Corporation Robotic lawn mowing boundary determination
US9516806B2 (en) 2014-10-10 2016-12-13 Irobot Corporation Robotic lawn mowing boundary determination
US11452257B2 (en) 2014-10-10 2022-09-27 Irobot Corporation Robotic lawn mowing boundary determination
US10750667B2 (en) 2014-10-10 2020-08-25 Irobot Corporation Robotic lawn mowing boundary determination
US10877484B2 (en) 2014-12-10 2020-12-29 Aktiebolaget Electrolux Using laser sensor for floor type detection
US10874271B2 (en) 2014-12-12 2020-12-29 Aktiebolaget Electrolux Side brush and robotic cleaner
US9420741B2 (en) 2014-12-15 2016-08-23 Irobot Corporation Robot lawnmower mapping
US11231707B2 (en) 2014-12-15 2022-01-25 Irobot Corporation Robot lawnmower mapping
US10274954B2 (en) 2014-12-15 2019-04-30 Irobot Corporation Robot lawnmower mapping
US10534367B2 (en) 2014-12-16 2020-01-14 Aktiebolaget Electrolux Experience-based roadmap for a robotic cleaning device
US10678251B2 (en) 2014-12-16 2020-06-09 Aktiebolaget Electrolux Cleaning method for a robotic cleaning device
US10159180B2 (en) 2014-12-22 2018-12-25 Irobot Corporation Robotic mowing of separated lawn areas
US10874045B2 (en) 2014-12-22 2020-12-29 Irobot Corporation Robotic mowing of separated lawn areas
US11589503B2 (en) 2014-12-22 2023-02-28 Irobot Corporation Robotic mowing of separated lawn areas
US20190141888A1 (en) 2014-12-22 2019-05-16 Irobot Corporation Robotic Mowing of Separated Lawn Areas
US9538702B2 (en) 2014-12-22 2017-01-10 Irobot Corporation Robotic mowing of separated lawn areas
US9826678B2 (en) 2014-12-22 2017-11-28 Irobot Corporation Robotic mowing of separated lawn areas
US10064533B2 (en) 2015-03-16 2018-09-04 Irobot Corporation Autonomous floor cleaning with removable pad
US9565984B2 (en) 2015-03-16 2017-02-14 Irobot Corporation Autonomous floor cleaning with removable pad
US10499783B2 (en) 2015-03-16 2019-12-10 Irobot Corporation Autonomous floor cleaning with a removable pad
US9907449B2 (en) 2015-03-16 2018-03-06 Irobot Corporation Autonomous floor cleaning with a removable pad
US11324376B2 (en) 2015-03-16 2022-05-10 Irobot Corporation Autonomous floor cleaning with a removable pad
US10952585B2 (en) 2015-03-16 2021-03-23 Irobot Corporation Autonomous floor cleaning with removable pad
US9320409B1 (en) 2015-03-16 2016-04-26 Irobot Corporation Autonomous floor cleaning with removable pad
US9265396B1 (en) 2015-03-16 2016-02-23 Irobot Corporation Autonomous floor cleaning with removable pad
US20220257080A1 (en) * 2015-03-16 2022-08-18 Irobot Corporation Autonomous floor cleaning with a removable pad
US11099554B2 (en) 2015-04-17 2021-08-24 Aktiebolaget Electrolux Robotic cleaning device and a method of controlling the robotic cleaning device
US11115798B2 (en) 2015-07-23 2021-09-07 Irobot Corporation Pairing a beacon with a mobile robot
US10785907B2 (en) 2015-07-24 2020-09-29 Irobot Corporation Controlling robotic lawnmowers based on fluctuating weather conditions
US10034421B2 (en) 2015-07-24 2018-07-31 Irobot Corporation Controlling robotic lawnmowers
US10874274B2 (en) 2015-09-03 2020-12-29 Aktiebolaget Electrolux System of robotic cleaning devices
US11712142B2 (en) 2015-09-03 2023-08-01 Aktiebolaget Electrolux System of robotic cleaning devices
US10426083B2 (en) 2016-02-02 2019-10-01 Irobot Corporation Blade assembly for a grass cutting mobile robot
US10021830B2 (en) 2016-02-02 2018-07-17 Irobot Corporation Blade assembly for a grass cutting mobile robot
US10459063B2 (en) 2016-02-16 2019-10-29 Irobot Corporation Ranging and angle of arrival antenna system for a mobile robot
US11169533B2 (en) 2016-03-15 2021-11-09 Aktiebolaget Electrolux Robotic cleaning device and a method at the robotic cleaning device of performing cliff detection
US11122953B2 (en) 2016-05-11 2021-09-21 Aktiebolaget Electrolux Robotic cleaning device
US11862302B2 (en) 2017-04-24 2024-01-02 Teladoc Health, Inc. Automated transcription and documentation of tele-health encounters
US11571104B2 (en) 2017-06-02 2023-02-07 Irobot Corporation Cleaning pad for cleaning robot
US10595698B2 (en) 2017-06-02 2020-03-24 Irobot Corporation Cleaning pad for cleaning robot
US11474533B2 (en) 2017-06-02 2022-10-18 Aktiebolaget Electrolux Method of detecting a difference in level of a surface in front of a robotic cleaning device
US11470774B2 (en) 2017-07-14 2022-10-18 Irobot Corporation Blade assembly for a grass cutting mobile robot
US11742094B2 (en) 2017-07-25 2023-08-29 Teladoc Health, Inc. Modular telehealth cart with thermal imaging and touch screen user interface
US11636944B2 (en) 2017-08-25 2023-04-25 Teladoc Health, Inc. Connectivity infrastructure for a telehealth platform
US11921517B2 (en) 2017-09-26 2024-03-05 Aktiebolaget Electrolux Controlling movement of a robotic cleaning device
US11389064B2 (en) 2018-04-27 2022-07-19 Teladoc Health, Inc. Telehealth cart that supports a removable tablet with seamless audio/video switching
EP4278940A2 (en) 2018-12-12 2023-11-22 Kemaro AG Device for cleaning dirty surfaces
WO2020120462A2 (en) 2018-12-12 2020-06-18 Kemaro Ag Device and method for automatically performing an activity, in particular for cleaning dirty surfaces
US20220022712A1 (en) * 2018-12-12 2022-01-27 Kemaro Ag Device for cleaning dirty surfaces
US11957286B2 (en) * 2022-04-28 2024-04-16 Irobot Corporation Autonomous floor cleaning with a removable pad

Also Published As

Publication number Publication date
US20030030399A1 (en) 2003-02-13

Similar Documents

Publication Publication Date Title
US6580246B2 (en) Robot touch shield
US6667592B2 (en) Mapped robot system
US11400595B2 (en) Robotic platform with area cleaning mode
US10328573B2 (en) Robotic platform with teach-repeat mode
US20200047337A1 (en) Robotic platform with event based mode change
US20180364045A1 (en) Robotic platform with mapping facility
US20200047343A1 (en) Remote planning and locally adaptive service mapping
US20180361585A1 (en) Robotic platform with multi-function service module
US20170312916A1 (en) Apparatus and methods for providing a reconfigurable robotic platform
EP3785093B1 (en) Robot contextualization of map regions
US20210260773A1 (en) Systems and methods to control an autonomous mobile robot
US20230409032A1 (en) Method for controlling an autonomous, mobile robot
US20180361584A1 (en) Robotic platform with long-term learning
US20180361581A1 (en) Robotic platform with following mode
JP6054425B2 (en) How to perform self-location estimation automatically
EP3484678A1 (en) Apparatus and methods for providing a reconfigurable robotic platform
JP2019003630A (en) System including at least two ground processing devices
CN112739244A (en) Mobile robot cleaning system
KR20190087355A (en) Method for driving cleaning robot and cleaning robot which drives using regional human activity data
CN110605713A (en) Robot positioning method, robot, and storage medium
WO2019203878A1 (en) Apparatus and methods of a service robotic platform
JP7423656B2 (en) Control of autonomous mobile robots
WO2020086557A1 (en) Apparatus and method for operations of a robotic platform
US20200397202A1 (en) Floor treatment by means of an autonomous mobile robot
US11737627B2 (en) Methods for setting and programming zoning for use by autonomous modular robots

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTELLIBOT, L.L.C., VIRGINIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JACOBS, STEPHEN;REEL/FRAME:012258/0380

Effective date: 20011013

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: AXXON ROBOTICS, LLC, OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTELLIBOT, LLC;REEL/FRAME:015766/0288

Effective date: 20031120

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12

AS Assignment

Owner name: INTELLIBOT ROBOTICS LLC, OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AXXON ROBOTICS LLC;REEL/FRAME:034638/0974

Effective date: 20150105

AS Assignment

Owner name: DIVERSEY, INC., OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTELLIBOT ROBOTICS, LLC;REEL/FRAME:035401/0158

Effective date: 20150318

AS Assignment

Owner name: CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH, AS COLLATERAL AGENT, NEW YORK

Free format text: SECURITY AGREEMENT;ASSIGNORS:DIVERSEY, INC.;THE BUTCHER COMPANY;REEL/FRAME:045300/0141

Effective date: 20170906

AS Assignment

Owner name: THE BUTCHER COMPANY, NORTH CAROLINA

Free format text: RELEASE OF SECURITY AGREEMENT REEL/FRAME 045300/0141;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:064236/0722

Effective date: 20230705

Owner name: DIVERSEY, INC., NORTH CAROLINA

Free format text: RELEASE OF SECURITY AGREEMENT REEL/FRAME 045300/0141;ASSIGNOR:CREDIT SUISSE AG, CAYMAN ISLANDS BRANCH;REEL/FRAME:064236/0722

Effective date: 20230705