WO1988004081A1 - Node map system and method for vehicle - Google Patents


Info

Publication number
WO1988004081A1
Authority
WO
WIPO (PCT)
Application number
PCT/US1987/001144
Other languages
French (fr)
Inventor
Amy L. Taivalkoski
Mark B. Kadonoff
Original Assignee
Denning Mobile Robotics, Inc.
Priority claimed from US06/936,092 external-priority patent/US4821192A/en
Application filed by Denning Mobile Robotics, Inc. filed Critical Denning Mobile Robotics, Inc.
Publication of WO1988004081A1 publication Critical patent/WO1988004081A1/en


Classifications

    • B25J19/026 Acoustical sensing devices (accessories fitted to manipulators)
    • G05D1/0221 Trajectory definition for land vehicles involving a learning process
    • G05D1/0225 Trajectory definition involving docking at a fixed facility, e.g. base station or loading bay
    • G05D1/0234 Optical position detection using optical markers or beacons
    • G05D1/0242 Optical position detection using non-visible light signals, e.g. IR or UV signals
    • G05D1/0255 Position control using acoustic signals, e.g. ultrasonic signals
    • G05D1/027 Internal positioning comprising inertial navigation means, e.g. azimuth detector
    • G05D1/0272 Internal positioning comprising means for registering the travel distance, e.g. revolutions of wheels
    • G05D1/0274 Internal positioning using mapping information stored in a memory device


Abstract

A mapping system (1500) and method which establish a number of navigation nodes and paths among them for a robot in an environment to be navigated by directing the robot to a selected location in the environment, designating that location as a first node, identifying the initial direction of a path to a second node such as by locating a navigation beacon proximate the second node, and moving the robot in the initial direction from the first node while measuring distance travelled along the path to the second node. The mapping system (1500) comprises a directing module (1502), path identification module (1504), robot movement command system (1506), path measuring assembly (1508) and node define system (1512).

Description

NODE MAP SYSTEM AND METHOD FOR VEHICLE
FIELD OF INVENTION This invention relates to a mapping system and method which establish navigation nodes and paths for a vehicle such as an autonomous robot and more particularly to such a system and method which define each node in relation to a previous node by path distance and direction from the previous node.
RELATED APPLICATIONS This application is a Continuation-in-Part of U.S. Serial No. 864,442, filed May 16, 1986, "Beacon Navigation System and Method for a Vehicle", Kadonoff et al.
The following applications, filed May 16, 1986, are incorporated herein by reference:
Inventors / Title / Serial No.
Maddox et al. / Intrusion Detection System / 864,032
Muller et al. / Ultrasonic Ranging System / 864,002
Benayad-Cherif / Position Locating System for Vehicle / 864,031
Pavlak et al. / Power-Up Sequencing Apparatus / 864,590
Maddox et al. / Beacon Proximity Detection System for Vehicle / 864,292
Kadonoff et al. / Orientation Adjustment System and Robot Using Same / 864,450
Kadonoff et al. / Obstacle Avoidance System / 864,585
George II et al. / Recharge Docking System for Mobile Robot / 864,028
BACKGROUND OF INVENTION
Defining navigation paths through an environment for a vehicle presents a number of difficulties, especially for autonomous robots. Presently there are a number of navigation systems which rigidly define paths the robot must follow.
One system utilizes a cable, tape or other track which is laid across the ground. The robot must follow that track and cannot deviate from it. This presents problems when obstacles appear across the path. Scheduling problems also arise for robots attempting to share the same path.
Another system uses spot markers such as glass bubbles which are placed along the floor along a path. The bubbles are highly reflective and reflect incident light which is tracked by the robot. Again, the robot is confined to the path and becomes lost if it deviates from the path.
Yet other systems follow laser beams or spotlight a target with infrared radiation or a laser beam. The robots then head toward that target. An operator must position the beam on the target.
Some robots utilize inertial reference systems to guide them among compass headings. Inertial reference is typically provided by a gyroscope. Accurate gyroscopes, particularly digitally readable ones, are quite expensive and are relatively delicate.
Navigation can also be accomplished by dead reckoning. Encoders on the drive system record the distance travelled by the wheels or tracks of the robot. This technique, also known as odometry, further includes steering encoders to record changes in orientation. There are several mobile robots for home use which are maneuvered by a joystick between rooms. The robots learn the route using odometry to record their travels; the route can then be approximately retraced by following the odometry measurements. However, accumulated errors rapidly create large deviations: odometry is suited only for navigation over short distances.
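The accumulation of odometry error described above can be sketched with a minimal dead-reckoning update; the pose convention and the per-step drift value are illustrative assumptions, not from the patent.

```python
import math

def dead_reckon(x, y, heading, distance, turn):
    """Advance a dead-reckoned pose: apply the steering change,
    then roll forward. heading and turn are in radians."""
    heading = (heading + turn) % (2 * math.pi)
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading

# A tiny systematic steering error per step compounds over a run,
# which is why the text limits odometry to short distances.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = dead_reckon(*pose, distance=1.0, turn=0.001)  # 0.001 rad of drift
```

After 100 one-unit steps the heading error alone has grown to about 0.1 radian, pulling the estimated track well away from the true straight line.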
Other systems model the surrounding environment by reducing the environment to geometric features. The environment is typically modelled as a series of line segments representing surface features, or as a grid representing the probability of presence or absence of objects within particular locations. Paths such as S-curves are then plotted around the modelled features. The modelling is absolute when a complete, overlapping model describes the entire environment, or is relative when scattered grids or other modelling provides only partial coverage of an environment.
It is known to designate spatial positions as nodes and create links, that is, paths, joining them. The nodes are typically referenced by an absolute coordinate system where an x,y coordinate specifies each node in relation to all others. Each link is a straight-line segment between two nodes. Measuring the location of each node is a difficult, time-consuming task, however, and is prone to accumulated error in distance measurement.
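The prior-art absolute representation criticized here can be sketched as follows; the node names and coordinates are invented for illustration.

```python
import math

# Prior-art style node map: each node carries absolute (x, y)
# coordinates, so every node must be surveyed against a common origin.
nodes = {"A": (0.0, 0.0), "B": (4.0, 3.0), "C": (4.0, 0.0)}
links = [("A", "B"), ("B", "C")]  # straight-line segments between nodes

def link_length(a, b):
    """Length of the straight-line link between two surveyed nodes."""
    (x1, y1), (x2, y2) = nodes[a], nodes[b]
    return math.hypot(x2 - x1, y2 - y1)
```

Any error in surveying a node's coordinates corrupts every link touching it, which is the accumulated-error problem the text notes.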
The guidance system of Daemmer, U.S. Patent No. 4,500,970, combines an odometry system in a robot vehicle with a proximity detection system which detects physical features, e.g., wall corners, and compares at pre-selected increments the actual orientation toward such a checkpoint with the expected orientation. The expected orientation is learned by following an initial pre-selected guidepath through the environment and stopping at pre-selected increments to identify checkpoints. There are numerous limitations to this guidance system, including its reliance on the odometry system alone to navigate between checkpoints and to determine when the next checkpoint is reached: when the vehicle deviates from the guidepath due to wheel slippage or other factors, the deviations can apparently become large enough to cause the robot vehicle to search prematurely for the checkpoint or to search in the wrong location. In either case the robot vehicle could miss the checkpoint entirely or mistake another feature for the intended checkpoint. Further, each checkpoint must apparently exhibit a change in depth to enable the robot to recognize it and to determine if the robot is misaligned with it.
Moreover, the operation of the propulsion system controller, of other control functions, and of the sensors themselves is unclear as described. For example, Daemmer assumes that his range data is perfect. However, sensors typically exhibit a slight but noticeable error, and their accuracy depends heavily on the characteristics of their immediate surroundings. Daemmer does not disclose a system for updating the dead-reckoning system when the position estimations are imperfect.
SUMMARY OF INVENTION It is therefore an object of this invention to provide an improved mapping system and method which readily establish a map having navigation nodes and paths for a vehicle such as a mobile robot.
It is a further object of this invention to provide such a system and method which utilize the robot itself to define the navigation nodes.
It is a further object of this invention to provide such a system and method in which the map is learned by the robot under actual operating conditions.
It is a further object of this invention to provide such a system and method in which the robot can successively monitor its orientation relative to its surroundings or a navigation beacon during the establishment of nodes and interconnecting paths.
It is a further object of this invention to provide such a system and method which are versatile and easily allow new nodes and paths to be established.
A still further object of this invention is to provide such a system and method which enable maps to be rearranged without alterations in the robot hardware or in the surrounding environment.
It is a further object of this invention to provide such a system and method which generate a map that can be transferred among robots.
Yet another object of this invention is to provide a map that, once completed, enables a robot to operate autonomously.
It is a further object of this invention to provide such a map which can instruct the robot to perform certain tasks for specified paths and at specified nodes.
This invention results from the realization that a truly effective navigation capability for a vehicle such as a mobile robot can be achieved by a system and method which establish a number of nodes in an environment to be navigated by learning, for each node, the angular orientation of a path toward it from a previous node and the distance of that path as actually travelled by the robot.
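As a rough sketch of this relative representation, each node below stores only its predecessor, the initial path direction, and the travelled distance; the dictionary layout, node names, and angle convention are illustrative assumptions, not the patent's data format.

```python
import math

# Each node is defined relative to its predecessor by the initial path
# direction (radians) and the distance actually travelled by the robot.
node_map = {
    "dock":  {"prev": None,   "direction": None,        "distance": None},
    "hall":  {"prev": "dock", "direction": 0.0,         "distance": 10.0},
    "lobby": {"prev": "hall", "direction": math.pi / 2, "distance": 4.0},
}

def estimate_xy(name):
    """Chain relative definitions back to the first node to recover an
    approximate absolute position (useful only for visualization)."""
    entry = node_map[name]
    if entry["prev"] is None:
        return 0.0, 0.0
    px, py = estimate_xy(entry["prev"])
    return (px + entry["distance"] * math.cos(entry["direction"]),
            py + entry["distance"] * math.sin(entry["direction"]))
```

Note that no node needs to be surveyed against a global origin: each is learned as the robot actually drives the path.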
This invention features a mapping system for establishing a number of navigation nodes, each proximate a navigation beacon, and interconnecting paths in an environment to be navigated by a vehicle having a drive system. There are means for directing the vehicle to a selected location in the environment and designating that location as a first node, and means for locating the initial direction of a path from said first node to a second node relative to a beacon proximate the second node. There are also means, responsive to the means for locating, for commanding the drive system to move the vehicle in a path along the initial direction from the first node, means for measuring distance travelled by the vehicle along the path from the first to the second node, and means, responsive to the means for measuring, for defining the second node by at least the distance travelled along the path and by the initial direction.
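The sequence of means recited above (direct, locate, command, measure, define) can be sketched as one mapping leg; the robot interface here is a hypothetical stand-in for the drive and beacon hardware, not an API from the patent.

```python
def establish_node(robot, beacon_id):
    """Establish the next node: find the beacon near it, turn to face
    it, drive the path, and record the leg."""
    azimuth = robot.locate_beacon(beacon_id)   # initial path direction
    robot.turn(azimuth)                        # turn by vehicle-beacon azimuth
    distance = robot.drive_until_halt()        # measured by drive encoders
    return {"direction": azimuth, "distance": distance}

class StubRobot:
    """Stand-in for the drive/beacon hardware, for illustration only."""
    def locate_beacon(self, beacon_id):
        return 30.0    # degrees to the beacon near the next node
    def turn(self, azimuth_deg):
        pass
    def drive_until_halt(self):
        return 7.5     # metres travelled, from the drive encoder
```

Repeating this leg from each newly defined node builds up the whole node map.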
In one embodiment, the means for locating may include azimuthal detector means for sensing a navigation beacon and for resolving the azimuthal angle between the vehicle and the beacon. The means for commanding includes means, responsive to the azimuthal detector means, for turning the vehicle by the vehicle-beacon azimuthal angle. The means for locating may further include means for moving the azimuthal detector means to search for the navigation beacon which has a particular identification code.
In another embodiment, the system further includes means for monitoring the orientation of the vehicle relative to the environment and means for substantially continuously resolving, while the vehicle is travelling along the path, the orientation of the vehicle relative to a feature in the environment and for updating the means for monitoring. The means for resolving may define vehicle orientation relative to the beacon. The system may further include means for halting the vehicle at said second node, and the means for halting includes halt input means for operator input of a halt command. Alternatively, the means for halting includes altitudinal detector means for sensing a navigation beacon and for resolving the altitudinal angle between the beacon and the vehicle. The means for halting further includes means for comparing the altitudinal angle with a predetermined angle and for arresting travel of the vehicle when they match.
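The altitudinal-angle halt can be illustrated with simple trigonometry: with the beacon mounted at a known height above the sensor, the elevation angle seen by the detector rises as the vehicle approaches, and travel is arrested when it reaches the angle recorded for the node. The height, halt angle, and tolerance below are invented values for illustration.

```python
import math

BEACON_HEIGHT = 1.5            # metres above the robot's sensor (assumed)
HALT_ANGLE = math.radians(45)  # altitudinal angle recorded for the node
TOLERANCE = math.radians(1)    # sensor resolution allowance (assumed)

def altitudinal_angle(horizontal_distance):
    """Elevation angle from the sensor up to the beacon."""
    return math.atan2(BEACON_HEIGHT, horizontal_distance)

def should_halt(horizontal_distance):
    """Arrest travel once the resolved angle matches the stored one."""
    return altitudinal_angle(horizontal_distance) >= HALT_ANGLE - TOLERANCE
```

With these values the vehicle halts when the beacon is roughly 1.5 m ahead, where the elevation angle reaches 45 degrees.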
In yet another embodiment, the means for directing includes means for selecting the second node as the location from which a third node is to be established and includes input means for operator input of the selected location for the first node. The means for directing further includes guidance input means for operator guidance of the vehicle to the selected location. The system further includes task input means for operator input of task commands defining actions to be performed by the vehicle along that path and at that node. The means for defining includes means for storing the initial path direction and the path distance to that next node.
This invention also features a mapping system for establishing a number of navigation nodes and paths among them in an environment to be navigated by a vehicle having a drive system. There are means for directing the vehicle to a selected location in the environment and designating that location as a first node, means for identifying the initial direction of a path to a second node, and means for commanding the drive system to move the vehicle in the initial direction from the first node. There are also means for measuring distance travelled by the vehicle along the path, and means, responsive to the means for measuring, for defining the second node by at least the distance travelled along the path and by the initial direction.
In one embodiment, the means for identifying includes means for determining the azimuthal angle by which the vehicle must turn to face the initial direction. The means for determining may include angle input means for operator input of the azimuthal angle, and the system may further include means for monitoring the orientation of the vehicle relative to the environment and means for substantially continuously resolving, while the vehicle is travelling along the path, the orientation of the vehicle relative to a feature in the environment and for updating the means for monitoring.
This invention further features a mapping system for establishing a number of navigation nodes and paths including a vehicle having a drive system including a drive motor and a steering system, means for directing the vehicle to the first node, means for identifying the initial direction of a path to a second node, means for commanding the drive system, and means for measuring distance travelled by the vehicle along the path. The system further includes means for defining the second node. The means for measuring distance may include drive encoder means, responsive to the drive motor, for determining distance travelled by the vehicle, and the means for identifying includes means for determining the azimuthal angle by which the vehicle must turn to face the initial direction. The means for determining may include azimuthal detector means, and the means for commanding includes means for turning the vehicle by the vehicle-beacon azimuthal angle. The vehicle may be a mobile robot.
This invention may also be expressed as a mapping system including input means for operator input to designate a selected location in the environment as a first node, to identify the initial direction of a path to a second node, and to halt the vehicle at the second node. The system further includes means for commanding the drive system, means for measuring distance travelled, and means for defining the second node by at least the distance travelled and the initial path direction.
Yet another feature of the invention is a system for navigating a vehicle from node to node utilizing a map, stored in memory in the vehicle, which defines a plurality of navigation nodes in relation to each other, including means for directing the vehicle to a first node recorded in the map, and means for referencing the map to identify a known initial direction of a path from the first node to a second node proximate a navigation beacon and the known distance of the path. There are also detector means for sensing the navigation beacon and for resolving at least the azimuthal angle between the vehicle and the beacon. The system further includes means, responsive to the means for referencing and the detector means, for commanding the vehicle to proceed in the initial direction and to navigate along the path to the second node for the known distance.
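A leg of such map-referenced navigation might be organized as below; the map layout and command vocabulary are illustrative assumptions, not the patent's actual data format.

```python
# Each stored leg records the azimuth to turn through at the start
# node, the known path distance, and the beacon to track on the way.
stored_map = {
    ("dock", "hall"): {"azimuth_deg": 90.0, "distance": 10.0, "beacon_id": 3},
}

def plan_leg(current, goal):
    """Return the command sequence for one leg looked up from the map."""
    leg = stored_map[(current, goal)]
    return [("turn_deg", leg["azimuth_deg"]),
            ("track_beacon", leg["beacon_id"]),
            ("travel", leg["distance"])]
```

A planner designating successive nodes to visit would call this once per leg and hand the commands to the drive system.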
In one embodiment, the means for referencing identifies the initial direction based on a predetermined azimuthal angle from the first node toward the beacon proximate the second node. The detector means also resolves the altitudinal angle between the vehicle and the beacon, and the means for referencing identifies the second node by a predetermined altitudinal angle observable at the second node. The means for commanding includes means, responsive to the detector means, for halting the vehicle when the resolved altitudinal angle matches the predetermined altitudinal angle.
In another embodiment, the beacon includes means for producing energy in a coded signal which uniquely identifies that beacon, and the means for referencing retrieves from the map the identification code for that beacon. The detector means includes a multi-sector sensor for sensing a signal emitted by the beacon, means for generating a code data signal representative of the coded signal, and means for generating an angle data signal representative of the angle from the sensor to the beacon in at least the azimuthal dimension. The system may further include planner means for designating successive nodes to visit and for providing the successive nodes to the means for directing.
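Matching a received beacon code against the identification the map expects might look like the following; the 8-bit code word and the bit-sampling format are assumptions for illustration, not the patent's signal format.

```python
# Code word the map stores for the beacon at the destination node
# (an invented 8-bit value for this sketch).
EXPECTED_ID = 0b10110010

def matches_expected(received_bits):
    """Assemble sampled bits (most significant first) into a code word
    and compare it with the identification the map expects."""
    code = 0
    for bit in received_bits:
        code = (code << 1) | bit
    return code == EXPECTED_ID
```

Only when the code matches would the detector's angle reading be trusted as pointing at the intended beacon.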
This invention further features a method of establishing a number of navigation nodes and paths among them for a vehicle in an environment to be navigated, including directing the vehicle to a selected location in the environment, designating that location as a first node, identifying the initial direction of the path to a second node, and moving the vehicle in the initial direction from the first node while measuring distance travelled along the path. The method further includes defining that node by at least the distance travelled along the path and the initial path direction, and repeatedly identifying, moving, and defining until a desired number of nodes and paths are established.
In one embodiment, the method further includes establishing a return path to the first node and recording the established nodes to form a map of the environment. The method may further include assigning tasks for the vehicle to perform along each path or at each node.
DISCLOSURE OF PREFERRED EMBODIMENT Other objects, features and advantages will occur from the following description of a preferred embodiment and the accompanying drawings, in which:
Fig. 1 is an axonometric view of a robot utilizing the obstacle avoidance system according to this invention;
Fig. 2 is a simplified exploded view with parts removed of the robot of Fig. 1;
Fig. 3 is a block diagram of the electronic modules included in the robot of Figs. 1 and 2;
Fig. 4A is a schematic top plan view of the relationship between the head and the body to each other and relative to the environment;
Fig. 4B is a schematic top plan view of alignment between the head and the body using a position encoder;
Fig. 4C is a schematic top plan view of the sign of angles in relation to the front of the robot;
Fig. 5A is a schematic top plan view of the angles and distances between the robot, a predetermined node, and a beacon;
Fig. 5B is an elevational plan view of the robot and beacon of Fig. 5A relative to each other;
Fig. 5C is a schematic top plan view of the angles and distances between the robot, a selected path, and an offset beacon;
Fig. 6 is a schematic block diagram of the mobile module of Fig. 3;
Fig. 7A is a schematic diagram of a map of a number of nodes established in the environment;
Fig. 7B is a schematic diagram of a global navigation path to a goal node within the map of Fig. 7A;
Fig. 7C is a schematic diagram of local navigation between two nodes of the global path of Fig. 7B;
Fig. 8A is a schematic block diagram of position estimation;
Fig. 8B is a flow chart of the arbiter of Fig. 8A;
Fig. 9 is a flow chart of the X-axis calibration of Fig. 8A;
Figs. 10A and 10B are flow charts of end-path calibration of Fig. 8A;
Figs. 11A and 11B are flow charts of body angle calibration of Fig. 8A;
Fig. 12A is a schematic block diagram of local navigation including an obstacle avoidance system according to this invention;
Fig. 12B is a flow chart of the arbiter of Fig. 12A;
Fig. 13 is a flow chart of the beacon tracking system of Fig. 12A;
Fig. 14 is a schematic top plan view of a robot navigating around an obstacle;
Figs. 15A-15C are flow charts of the obstacle avoider of Fig. 12A;
Fig. 16 is a schematic top plan view of the determination of avoidance distance in a corridor;
Fig. 17 is a schematic top plan view of the computation of side swipe angle around an object;
Fig. 18 is a flow chart of the path follower system of Fig. 12A;
Fig. 19 is a schematic block diagram of the interrelation between the map, global navigation and local navigation;
Fig. 20 is a flow chart of the planner of Fig. 19;
Fig. 21 is a flow chart of the executive of Fig. 19;
Fig. 22 is a block diagram of position locating including beacon sensors and the beacon electronic module;
Fig. 23 is an illustration of the optical burst output of the beacons of Fig. 22;
Fig. 24 is an enlarged detail of a single burst of Fig. 23;
Fig. 25 is a more detailed block diagram of a beacon shown in Fig. 22;
Fig. 26 is a more detailed block diagram of an eye shown in Fig. 22;
Fig. 27 is a more detailed block diagram of the beacon STD-bus interface of Fig. 22;
Figs. 28A and 28B are flow charts of the software utilized in the microprocessor of Fig. 22;
Fig. 29 is a schematic of the photodiode of Fig. 26;
Fig. 30 is a schematic block diagram of a mapping system according to this invention;
Figs. 31A and 31B are flow charts of mapping utilizing different navigation techniques;
Fig. 32 is a schematic top plan view of aisle-centering and wall-following navigation techniques;
Fig. 33 is a flow chart of the operation of a robot upon initial exit from a recharge station;
Fig. 34 is a flow chart of the operation performed on the last path to be established;
Figs. 35A-35C are schematics of the establishment of additional paths between known and unknown nodes; and
Fig. 36 is a flow chart of the operation to add nodes and paths to the map.
There is shown in Fig. 1 a vehicle, robot 10 according to this invention, including a head section 12 and a base 14 movable on three wheels, only two of which, 16, 18, are visible. The wheels are mounted in three steerable trucks, only two of which, 20 and 22, are visible. There are twenty-four ultrasonic transducers 24, such as the electrostatic transducers of the Sell type available from Polaroid, equally spaced at fifteen degrees around the periphery of base 14. One of transducers 24, such as transducer 24a, is designated as sensor zero; its azimuthal heading is utilized as described below. Above that, on reduced neck 26, there are located six passive infrared motion detectors 28, 30, 32, 34, 36, 38, only two of which, 28 and 30, are shown. These detectors are equally spaced at sixty degrees apart and may be DR-321's available from Aritech. Just above that are two conductor bands 50 and 52 which are used to engage a charging arm for recharging the robot's batteries. Head section 12 is mounted to base 14 and rotates with respect to base 14 about a central vertical axis. Head section 12 carries an RF antenna 65 for sending and receiving communication signals to a base location or guard station. Head section 12 also includes an infrared sensor 60 for sensing radiation in the near infrared region, e.g. 880 nm, such as emitted from LED 62 of beacon 64, one or more of which are mounted on the walls in the space to be protected by robot 10 to assist in locating and directing robot 10 in the area in which it is to roam. An ultrasonic transducer 66, similar to one of the transducers 24 used for maneuvering and avoidance, may be provided for ranging. There is also provided a passive infrared sensor 68 similar to sensors 28 and 30. A microwave transmission and reception antenna 70 and a TV camera 72, which may be turned on when an apparent intrusion has occurred, are also included in head 12.
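Since the twenty-four transducers are spaced at fifteen degrees around the base, each transducer's azimuth can be derived from sensor zero's heading by its index; a minimal sketch (the degree convention is an assumption):

```python
# Twenty-four ultrasonic transducers spaced 15 degrees apart around the
# base; transducer 0 ("sensor zero") supplies the reference heading.
SENSOR_COUNT = 24
SPACING_DEG = 360 / SENSOR_COUNT  # 15 degrees between adjacent transducers

def transducer_azimuth(index, sensor_zero_heading_deg=0.0):
    """Azimuth of transducer `index`, offset from sensor zero's heading."""
    return (sensor_zero_heading_deg + index * SPACING_DEG) % 360
```

This lets a range reading from any transducer be tagged with a bearing relative to the base's fixed azimuthal reference.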
Base 14, Fig. 2, includes a main chassis 80 which carries three batteries 82, such as Globe 12V 80AH gel cells, only one of which is shown. When fully charged they will operate the robot for twelve hours or more.
Robot 10 is provided with a synchro-drive locomotion system 83. The wheels are steered simultaneously and driven simultaneously. By steering simultaneously, base 14 optimally maintains an azimuthal angle which is fixed in space and serves as the basic reference of the robot to the outside world. Trucks 20 and 22, with wheels 16 and 18 respectively, are suspended from chassis 80. Each truck, as indicated at truck 20, includes a right-angle drive 84 which receives input from vertical drive shaft 86 and provides output on horizontal drive shaft 88 to operate pulley 90. Pulley 90 through belt 92 drives pulley 94 attached to the axle of wheel 16. Vertical drive shaft 86 and counterpart drive shafts 96 and 98 are driven by their respective sprockets or pulleys 100, 102, 104, which in turn are driven by endless belt 106 powered by pulley 107 on output shaft 108 of drive motor 110 mounted beneath chassis 80. An encoder 111 mounted with motor 110 monitors the velocity of the robot. An idler wheel 112 is provided to maintain proper tension on belt 106.
Three additional shafts, only one of which, 99, is shown, concentric with shafts 86, 96 and 98, respectively, are driven by a second set of pulleys or sprockets 120, 122, 124 engaged with drive belt 126 powered by sprocket 128 driven by steering motor 130 mounted beneath chassis 80. Idler pulley 131 is used to maintain tension on belt 126. An encoder 132 is associated with steering motor 130 to provide outputs indicative of the steering position. The steering motor shaft is connected through pulley 128 to extension shaft 134, the top of which is provided with a flange 136 with a plurality of mounting holes 138. Position sensor 135, such as a Hall Effect device, interacts with a magnetic detent on shaft 134 as described below. Electronic chassis 140 is mounted by means of screws 142 on three shorter standoffs 144. Three holes 146 in electronic chassis 140 accommodate the pass-through of longer standoffs 148, which mount neck 26 by means of screws 150. Electronic chassis 140 contains all of the electronic circuit boards and components such as indicated as items 152 that are contained in the base 14, including the status module described infra.
When electronic chassis 140 and neck 26 are mounted on their respective standoffs, extension shaft 134, flange 136 and the associated structure are accommodated by the central hole 160 in electronic chassis 140 and the opening in neck 26 so that head plate 170 may be mounted by means of screws 172 to threaded holes 138 in flange 136. In this way the entire head rotates in synchronism with the trucks and wheels as they are steered by steering motor 130. Arrow 171 represents the frontal, forward-facing orientation of head 12.
In addition to the primary microwave sensor 70, there are three additional microwave sensors, only one of which, 190, is visible, spaced at ninety degrees about head plate 170 and mounted in housings 192, 194, and 196. One or more additional ultrasonic sensors can also be mounted in head 12, e.g., ultrasonic sensor 195 on housing 192. Housing 194 faces directly to the back of the head, as opposed to primary microwave sensor 70, which faces front. Housing 194 also contains a second infrared sensor, not visible, which is the same as infrared sensor 68.
Head 12 also contains internal reference sensors. Inclinometer 197, such as the Accustar Clinometer Sensor available from Sperry Corp., is mounted on the rear portion of external sensor housing 199. Its use in X-axis calibration is described below.
Cover 200 protects the electronics on head plate 170. All of the electrical interconnections between head 12 and base 14 are made through slip rings contained in slip ring unit 202 mounted about extension shaft 134 in base 14.
Head 12, Fig. 3, includes three electronic portions: beacon module 210, head ultrasonic module 212, and intrusion detection module 214. Beacon module 210 responds to IR sensor 60 to determine the angle of beacon 64 with respect to the robot. That angle is fed on bus 216 through the slip ring unit 202 to the main CPU 218. Beacon module 210 is also responsive to inclinometer 197.
Head ultrasonic module 212 responds to ultrasonic transducer 66 to provide ranging information on bus 216 to CPU 218. Intruder detection module 214 responds to the four microwave sensors, such as sensors 70, 190, and the two IR sensors, such as sensor 68, to provide indications of as yet unconfirmed intrusion events. These events are processed by the alarm confirmation unit 220 in CPU 218 to determine whether a true, confirmed intrusion has occurred.
In the body section 14, there are included status module 222, mobile module 224, body ultrasonic module 226, and CPU 218. Status module 222 responds to the six infrared sensors 28-38 to provide an indication of an intrusion. Status module 222 may also monitor fire and smoke detectors, diagnostic sensors throughout the robot such as inclinometer 197, as well as chemical and odor detectors and other similar sensors. Mobile module 224 operates and monitors the action of drive motor 110 and steering motor 130. The twenty-four ultrasonic transducers 24 provide input to body ultrasonic module 226, which provides digital range information for the robot. Finally, body 14 contains CPU 218, which in addition to the alarm confirmation unit 220 also interconnects with a floppy disk controller, a two-channel serial I/O board, and a reset board which receives inputs from a pushbutton reset and CPU 218 and provides as outputs ultrasonic resets, motor resets, status resets, beacon resets, I/O module resets and head ultrasonic resets. CPU 218 also receives inputs from RF antenna 65 through RF circuit 240.
Several notations describe the orientation of head 12 to body 14, as shown in Fig. 4A. Arrow 300 represents its front, which faces the direction of travel, shown as front arrow 171 in Fig. 2. The azimuthal orientation of body 14, Fig. 4A, is represented by point 302, hereinafter referred to as sensor zero. Sensor zero is a designated, fixed point on body 14 itself, e.g., sensor 24a, Fig. 2. The angle between head front 300 and sensor zero is thetaD. The azimuthal angular distance from global zero to sensor zero is represented by thetaB; arrows 304 represent the designated orientation for the surrounding environment.
Head 12 is realigned with base 14 using position sensor 135 and magnetic detent 306 on shaft 134 as shown in Fig. 4B. This is accomplished at designated homing nodes by rotating head 12, that is, by pivoting the three steerable trucks, about its vertical axis such that magnetic detent 306 is brought into alignment with position sensor 135 of base 14. The head direction angle thetaD is then set to zero; this and other operational parameters of the robot are maintained in updatable memory, hereinafter referred to as the blackboard, in CPU 218, Fig. 3.
The sign of angles is determined as shown in Fig. 4C. Object 290 is directly aligned with robot front 300. Angles in the clockwise direction, indicated by arrow 291 between object 290 and object 292, are positive in value. Angles in the counterclockwise direction, shown by arrow 293 toward object 294, are negative.
The position and orientation of robot 10 relative to beacon 64a are shown in Fig. 5A. ThetaHA is the actual horizontal angle, that is, the azimuthal angle, between head front 300 and beacon 64a. Beacon 64a has a 30° transmitter beam width, indicated by arrow 303, which is detected by sensor 60, Figs. 1 and 2, within its 22° field of view. The expected horizontal angle, thetaHE, is zero when robot 10 is properly following path 308 toward wall 301. Horizontal deviation from path 308 is designated by the variable Y, here shown by arrow 310. The orientation of path 308 is represented by path angle thetaP, which is 180° in relation to coordinate arrows 304.
Dashed line 312 parallels path 308. The angular deviation from the direction of path 308 to front 300 is represented by head-path angle thetaM. Sensor zero path angle thetaSZ is the angle from the path to sensor zero 302. Head direction angle thetaD is equal to the sensor zero path angle thetaSZ subtracted from head-path angle thetaM. Angle thetaSZ is equal to path angle thetaP minus body angle thetaB.
Thus, body angle thetaB serves as the basic reference for robot 10 to the outside world. The actual direction of an intruder, a fire or a path is determined for the robot in relation to sensor zero, whose heading is updated by body angle thetaB.
Body angle calibration, that is, the updating of the angular orientation between sensor zero and global zero, can be conducted at any place along path 308. Calibration of the distance travelled in the X direction, shown by arrow 314, must be conducted at a predetermined location such as node 316. As shown in Fig. 5B, the vertical angle, that is, the altitude angle, between beacon sensor 60 and LED 62a of beacon 64a is classified as actual vertical angle thetaVA. This is obtained while robot 10 is at distance f from wall 301. The expected vertical angle thetaVE is learned by initially placing robot 10 at node 316 to measure the angle.
Inclinometer 197, Fig. 2, is utilized to correct for unevenness in terrain.
As described below, the expected distance fE is measured from node 316 to wall 301 and can be used instead of expected vertical angle thetaVE. In either case, the vertical distance between the height of beacon sensor 60 and LED 62a must be measured; it is represented by vertical distance e and shown by arrow 318. When the distance between wall 301 and vertical axis VA is desired, the robot's radius r is added to the trigonometric calculations.
The actual distance between robot 10 and wall 301 can also be measured by head ultrasonic sensor 66 within its 15° field of view. This distance measurement is denoted as horizontal distance RHA and is shown by dashed arrow 320.
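The Fig. 5B geometry relates the beacon elevation and the measured vertical angle to the robot's horizontal distance from the wall. A minimal sketch of that trigonometry follows; the function name and units are illustrative assumptions, not part of the specification.

```python
import math

def horizontal_distance(e, theta_v_deg, r=0.0):
    """Horizontal distance from the wall implied by beacon elevation e
    and a measured vertical angle (Fig. 5B geometry). e and the result
    share the same units; r is the robot radius, added when the distance
    to the vertical axis VA is wanted. Names are illustrative."""
    return e / math.tan(math.radians(theta_v_deg)) + r
```

For example, an elevation of one foot observed at a 45° vertical angle implies the sensor is one foot from the wall.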
Geometric parameters describing the location of robot 10 along the path and its orientation relative to an offset beacon are shown in Fig. 5C. The path distance XP is represented by line 321 between start node 320 and end node 322. Line 324 represents the distance travelled, XM, and line 326 designates the remaining distance to travel, XP - XM.
If the reference to be observed by robot 10, such as beacon 64b, is not directly aligned with path 308a, the angle between beacon 64b and path 308a is denoted by offset angle thetaoff. Head horizontal angle thetaHA represents the difference in orientation between head front 300 and beacon 64b.
An estimate of distance travelled and of location is maintained by dead reckoning accomplished by X-Y positioner
332 of mobile module 224, Fig. 6. Mobile module 224 includes motor controller 330, which controls drive motor
110 and steering motor 130. Distance travelled is determined by encoder 111 which provides this information to
X-Y positioner 332. Similarly, steering encoder 132 provides change in orientation information to positioner
332, which accordingly outputs path distance XM, path deviation YM, and head orientation thetaM, which denotes the angle of the head to the path. Encoders 111, 132 also provide feedback to motor controller 330.
Navigation of the robot through its environment is described in relation to successive nodes. Map 338, Fig. 7A, includes selected nodes 340 which have a known distance and angle among each other. The robot travels toward successive goal nodes. For example, if the robot is at node 4, the starting node, the robot can be directed to visit goal node 7. The paths required to accomplish this are designated in Fig. 7B as the global path 342. Global path 342 designates consecutive navigation nodes, and a list of paths between these nodes. At each node, the robot must change headings, recalibrate estimated position, and head towards the next node.
Travel along the particular path is represented by path 344, Fig. 7C. Local navigation along path 344 moves the robot from node 4 to node 5.
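Choosing the global path of Fig. 7B amounts to a shortest-route search over the node map of Fig. 7A. A sketch of such a planner follows, assuming a weighted-graph representation and Dijkstra's algorithm; the patent does not name a particular search method, so this is illustrative only.

```python
import heapq

def plan_global_path(edges, start, goal):
    """Shortest node-to-node route over a node map like Fig. 7A.
    edges: {(a, b): distance} for each traversable path, assumed
    traversable in both directions. Returns the list of nodes to visit."""
    graph = {}
    for (a, b), d in edges.items():
        graph.setdefault(a, []).append((b, d))
        graph.setdefault(b, []).append((a, d))
    dist, prev = {start: 0.0}, {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(queue, (nd, nxt))
    # walk predecessors back from the goal node
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

Given the node numbering of Fig. 7A, a call such as `plan_global_path(edges, 4, 7)` would yield the consecutive navigation nodes of the global path.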
The spatial models represented by Figs. 7A-7C simply represent navigable paths through the environment and do not actually describe the surrounding environment. This system requires much less information than other systems utilizing more complicated techniques such as geometric modeling of the environment.
In addition to the identification of the starting node and the list of successive nodes to visit, the predetermined map information also includes a number of characteristics for each path. These include the distance of the path and its angle, thetaP, in relation to global zero. The path information may also include the width of the path and, if a beacon is disposed at the end of the path, the beacon code, distance f, height e, and expected vertical angle thetaVE. The intensity of the beacon can be monitored to confirm proper operation of the beacon if it is an active beacon; expected intensities of passive beacons such as reflectors or bar codes are not required. One or more position calibration techniques can be specified, such as X-axis calibration, ultrasonic end-path calibration, and body-angle calibration.
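The per-path map record described above can be sketched as a simple data structure. All field names and types here are illustrative assumptions, not the patent's storage format.

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class PathRecord:
    """One entry of the predetermined map; names are illustrative.
    Angles in degrees relative to global zero; distances in feet."""
    distance: float                 # path length XP
    angle: float                    # path angle thetaP
    width: Optional[float] = None
    beacon_code: Optional[int] = None          # only for beacon-ended paths
    beacon_distance: Optional[float] = None    # f: node-to-wall distance
    beacon_height: Optional[float] = None      # e: sensor-to-LED elevation
    expected_vertical_angle: Optional[float] = None  # thetaVE
    calibrations: Tuple[str, ...] = ()  # e.g. ("x_axis", "end_path")
```

A dead-reckoning path would simply leave the beacon fields unset.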
Robot 10 uses one or more position calibration systems 399 to estimate its position as shown in Fig. 8A. The position calibration systems 399 are located within CPU 218 and draw upon several modules. X-axis calibration system 400 obtains the vertical deviation phi from inclinometer 197, information from map 402, and actual vertical angle thetaVA from beacon module 210.
End-path calibration system 404 provides an alternate technique of determining the difference in travel along the X-axis and determining adjustments to body angle thetaB. End-path calibration system 404 draws upon map 402 and ultrasonic range data from module 212.
Body-angle calibration system 406 corrects for accumulated error in orientation due to such factors as drift and precession of base 14. System 406 is responsive to map 402, beacon data including thetaHA from beacon module 210, and data from mobile module 224 representing present estimated position and head orientation.
After obtaining one or more calibrations from calibration systems 400, 404 and 406, arbiter 408 updates blackboard 410, which maintains current status information for robot 10 including its actual position. Initializer 411 supplies initial values to blackboard 410, such as the zero generated for body angle thetaB. While the robot is moving, arbiter 408 provides corrected YM and thetaM values to X-Y positioner 332, Fig. 6; while it is stationary, arbiter 408 provides XM, YM and thetaM corrections to motor controller 330, which are implemented as directed.
Arbiter 408 uses end-path calibration from system 404 when nearing the end of the path. Body-angle calibration from system 406 is examined along the path and at specified nodes. X-axis calibration from system 400 is obtained only at selected nodes.
The operation of arbiter 408 is shown in Fig. 8B. Calibration systems continually output calibration values or "NO VALUE" but are sampled to obtain their estimations of position only if enabled from map 402 for the particular path or node the robot presently occupies. Calibration of body angle, X-axis, and end-path position are successively accessed, steps 412, 414 and 416, respectively. Further, the arbiter decides whether to relinquish control of the robot when a calibration system requests motor control.
If map 402 contains the appropriate enable for body angle calibration, step 412, the arbiter observes, step 418, whether calibration system 406 has requested control by setting its control flag. If it has, control is provided, step 420, unless an external override command (not shown) is present. The arbiter waits for body angle calibration to be completed, step 422, and then examines whether the output calibration data is other than "NO VALUE". Valid data is entered into blackboard 410 via step 426, while the operation proceeds directly to step 414 if the data is NO VALUE.
When the control flag is not set, step 418, examination of the output data is immediately made, step 424. Similarly, the outputs of X-axis calibration and end-path calibration are obtained, as represented by loops 430 and 432, respectively. During these calibrations, steps 434 and 436, control remains with the respective calibration systems until the difference between the actual and expected position of the robot along the path becomes zero.
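The arbiter's fixed sampling order can be sketched as follows. Each calibration system is modeled as a callable returning either a correction dictionary or None for "NO VALUE"; the control-flag handshake of steps 418-422 is omitted, and all names are illustrative.

```python
def arbitrate(map_enables, systems, blackboard):
    """Sample each map-enabled calibration system in fixed order
    (body angle, X-axis, end path), posting valid outputs to the
    blackboard, in the spirit of steps 412-436 of Fig. 8B."""
    for name in ("body_angle", "x_axis", "end_path"):
        if name not in map_enables:
            continue          # not enabled for this path or node
        result = systems[name]()
        if result is not None:  # None stands for "NO VALUE"
            blackboard.update(result)
    return blackboard
```

A system whose enable is absent from the map is never sampled, matching the text's description that outputs are examined only when enabled for the current path or node.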
The operation of X-axis calibration system 400 is shown in Fig. 9. If the robot is not stopped, step 440, a distance correction of delta X equal to NO VALUE is provided, step 442. When the robot is in a fixed position the X-axis calibration flag is raised, step 444, and the calibration system waits until control is received, step 446. The beacon is located, step 448, and inclinometer 197 is read until motion of the robot settles, step 450. Once stable, the final inclination reading phi is taken, as is the stable reading of vertical angle thetaVA, step 452. Actual vertical angle thetaVA is corrected by inclination phi, which has a positive or negative value depending on whether robot 10 is inclined toward or away from the beacon, respectively, step 454.
The difference between the expected X distance and the actual X distance is obtained in step 456 as represented by the trigonometric formula
ΔX = e sin(θVE − θVA) / (sin θVE sin θVA)    (1)
where ΔX is the difference in position along the X-axis, θVE is the expected vertical angle, θVA the actual vertical angle, and e is the vertical elevation of the beacon as described in Fig. 5B. The value of ΔX is provided to motor controller 330 through arbiter 408, step 458. The robot moves along distance XM by an amount equal to ΔX, step 459. The operation cycles to step 440, and the robot is readjusted until a ΔX of zero is resolved in step 456.
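Equation (1) can be exercised numerically as follows. The formula is reconstructed from the Fig. 5B geometry (it is equivalent to ΔX = e·cot θVA − e·cot θVE), and the sign convention for the inclinometer correction is an assumption.

```python
import math

def delta_x(e, theta_ve_deg, theta_va_deg, phi_deg=0.0):
    """X-axis correction in the spirit of equation (1): the along-path
    error implied by the difference between expected and actual vertical
    angles to the beacon. The inclinometer reading phi is applied to the
    measured angle first (step 454); the sign convention is assumed."""
    va = math.radians(theta_va_deg + phi_deg)
    ve = math.radians(theta_ve_deg)
    return e * math.sin(ve - va) / (math.sin(ve) * math.sin(va))
```

When the robot stands farther from the beacon than expected, the measured vertical angle is smaller than the expected one and ΔX comes out positive, i.e., a forward correction.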
Other methods of calculating ΔX can be used, for example, where the expected horizontal distance between the robot and the beacon is substituted for expected vertical angle thetaVE. The latter parameter is preferred because it can be obtained empirically in such operations as map making, by locating the robot at the node or other predetermined location and instructing it to observe the beacon to learn the vertical angle. This obviates the need for physically measuring the horizontal distance between the robot and the beacon.
The operation of the end-path calibration system commences with reading of the head ultrasonic range RHA, step 460, Fig. 10A. If the actual range is greater than the expected range, step 462, supplied by map 402, an output of ΔX equal to NO VALUE is generated, step 463.
Otherwise, if the observed range is less than the expected range, the end-path calibration flag is raised, step 464, and the calibration system waits until control is received, step 466. Once control is obtained, the calibration system commands that the drive motor halt, step 468, and commands head ultrasonic 66 of head 12, Figs. 1 and 2, to take a number of ultrasonic readings along a selected arc, such as fifteen readings along a 30° arc, each reading two degrees apart from the others, step 470. The minimum distance is found in step 472.
A direction perpendicular to the wall is computed, step 474, e.g., by using the angle which gave the minimum distance in step 472. The robot is commanded to turn to this perpendicular direction, step 476, Fig. 10B, after which delta thetaB is set to thetaM, step 476. Now that the robot is facing directly toward the object, a final reading of the head ultrasonics is taken to obtain distance RHA, step 478. The deviation in distance traveled along the X-axis is determined by the formula
ΔX = RHE − RHA    (2)
and ΔX and delta thetaB are output to motor controller 330 through arbiter 408, steps 480, 482, respectively. The position XM of the robot is adjusted by ΔX, step 483. The end-path calibration system then cycles to step 460 to resume monitoring.
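The arc-scan portion of end-path calibration (steps 470-478) and equation (2) can be sketched together. Taking the minimum range over the arc as the perpendicular distance to the wall is one simple way to realize step 474; names are illustrative.

```python
def end_path_correction(arc_readings, expected_range):
    """End-path calibration sketch (Figs. 10A-10B).
    arc_readings: list of (angle_deg, range) ultrasonic samples across
    the scanned arc. The minimum range is treated as the perpendicular
    distance R_HA to the wall. Returns (turn toward the perpendicular,
    delta_x) with delta_x = R_HE - R_HA per equation (2)."""
    angle, r_ha = min(arc_readings, key=lambda sample: sample[1])
    return angle, expected_range - r_ha
```

A negative ΔX indicates the robot stopped short of where the map says the wall should be, i.e., it has travelled farther than estimated.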
The body angle calibration system corrects for accumulated error in orientation which is due to the drift or precession of base 14. Path deviation distance YM occurs during obstacle avoidance, for example, and changes in head orientation represented by thetaM can be attributed to rotation of head 12 while tracking a beacon to correct for precession of the body. Body angle calibration system 406 corrects and calibrates for each of these variables.
In operation, the beacon horizontal angle thetaHA is read, step 484, Fig. 11A. Optionally, such as shown in phantom by step 486, the operation first determines whether the robot is stopped at a node and, if it is, raises the body angle calibration flag, step 488, Fig. 11B. Once control is received, step 490, the system determines whether the node is a homing node, step 492. If it is, the robot turns to face sensor zero, step 494, and direction angle thetaD is set to zero. Otherwise, operation proceeds directly to step 496, where path direction thetaP is obtained from map 402. The turn angle thetaturn is set to head direction thetaD subtracted from path orientation thetaP, step 498.
Once turn angle thetaturn is obtained, that value is output to motor controller 330, Fig. 6, step 500. The robot turns to face the expected beacon and hunts for the beacon, steps 502, 504, respectively. The final head orientation thetaM is recorded, operation proceeds to step 484, Fig. 11A, and beacon horizontal angle thetaHA is read. At this stage, the expected deviation distance YM and the head direction thetaM are read from the mobile module, step 504. If the beacon is in view and the robot is on course, meaning that the absolute value of horizontal orientation thetaHA is less than 10° and deviation distance YM is less than one foot, calibration of body angle is obtained by determining the increment delta thetaB as equal to the sum of thetaM and thetaHA, step 510. If either of steps 506 and 508 is not satisfied, meaning that the robot has significantly deviated from the path, no value is output, step 512, and other calibration systems, such as end-path calibration, are tried.
Once the increment delta thetaB is determined, YM is reset by setting delta Y to negative YM, step 514, and delta thetaB and delta Y are provided to the arbiter, step 516.
Optionally, head angle thetaM is reset by setting it equal to minus horizontal angle thetaHA, step 515, shown in phantom. Operation then returns to steps 486, 484 as described above.
In its simplest construction, body angle calibration involves reading the beacon horizontal angle thetaHA, turning to face the beacon, setting delta thetaB to thetaM, and outputting delta thetaB. In blackboard 410, delta thetaB is initially set to zero and is then incremented as needed by either the body angle calibration or end-path calibration systems.
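The on-course test and increment of step 510 can be sketched as follows, using the thresholds stated in the text (10° and one foot); the function name and return shape are illustrative.

```python
def body_angle_increment(theta_m_deg, theta_ha_deg, y_m_ft):
    """Body-angle calibration sketch (Fig. 11A): when the beacon is in
    view and the robot is close to the path, the body-angle increment is
    the sum of head orientation thetaM and beacon horizontal angle
    thetaHA (step 510), and the path deviation is reset via delta Y =
    -YM (step 514). Otherwise the output is None ("NO VALUE")."""
    if abs(theta_ha_deg) < 10.0 and abs(y_m_ft) < 1.0:
        return theta_m_deg + theta_ha_deg, -y_m_ft
    return None
```

When the robot has drifted past either threshold, the None result lets the arbiter fall back to another calibration system, such as end-path calibration.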
Position estimation system 399 ensures that the robot is properly positioned and oriented. Navigation system 520, Fig. 12A, accurately guides the robot along a path toward a destination while avoiding obstacles in the path. Navigation system 520 provides a turn angle thetaturn and a drive velocity VD. Obstacle avoider system 528 according to this invention, beacon tracker 522 and path follower 526 continually output values for thetaturn and VD. However, motor controller 330 does not receive these values unless approved by arbiter 524.
When a beacon is placed as a marker proximate to a destination, the output of beacon tracker 522 is directed by arbiter 524 to motor controller portion 330 of mobile module 224. Beacon tracker 522 utilizes horizontal angle thetaHA from beacon data 516 from beacon module 210, position data 518 from mobile module 224 as updated on blackboard 410, and information from map 402. If the beacon is not within a certain azimuthal angle from the head orientation of the robot, or when no beacons are present, path follower 526 provides turn angle thetaturn and the appropriate drive velocity VD. Path follower 526 performs dead reckoning utilizing position data 518 from mobile module 224 and expected data from map 402.
By providing the robot with obstacle avoider system 528 according to this invention, the robot can operate in a partially unknown environment where obstacles may arise in the path. Obstacle avoider 528 utilizes ultrasonic data 530 from body ultrasonic module 226, head-path angle thetaM, and data from map 402. The local navigation system 520 is initialized at each node as described below.
The operation of arbiter 524 is shown in Fig. 12B. Upon starting local navigation, the parameters XM, YM, and thetaM are set to zero in the X-Y positioner 332. The drive motor is started, step 534, and the navigation system runs until XM is equal to or greater than the path distance XP and the absolute value of the deviation distance YM is less than one foot, step 536. If these conditions are satisfied, local navigation is ended, step 538; otherwise, the navigation system continues to operate until the robot is brought quite close to the desired destination node. The final adjustment in position is provided by position estimation system 399 as described above.
Until these conditions are satisfied, local navigation
520 examines the output first of obstacle avoider 528, then beacon tracker 522, and finally path follower system 526.
As shown in step 540, if avoider system 528 provides a turn angle of greater than zero, its output is provided to motor controller 330 of the mobile module, step 542. If the avoidance turn angle thetaturn is zero and the deviation distance YM is less than one foot, thetaturn and VD of beacon tracker system 522 are provided to the mobile module, steps 544 and 546, respectively. Otherwise, the dead reckoning of the path follower 526 is provided to mobile module 224, step 548.
The operation of the beacon tracker system is shown in Fig. 13. The horizontal angle thetaHA and the distance traveled XM are read from blackboard 410, and the path distance XP and known horizontal distance YB of the beacon from the path are obtained from the map, steps 550 and 552, respectively. Horizontal distance YB represents the offset distance of the beacon from the path as shown above in Fig. 5C. The offset angle thetaoff is determined in step 554 by the formula
θoff = tan⁻¹ [YB / (XP − XM)]    (3)
The turn angle thetaturn is equal to the sum of the horizontal angle thetaHA and offset angle thetaoff, step 556. When the beacons are provided with unique identifying codes, as described below, the correct code must be present, step 558, or velocity VD and turn angle thetaturn are set to NO VALUE, steps 560 and 562, respectively. If the correct code is present and the absolute value of the direction from the front of the head of the robot to the beacon is less than 11°, step 564, velocity VD is set to the patrol velocity Vp and is provided as output along with the turn angle thetaturn, steps 566 and 568, respectively. Operation then returns to step 550.
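Steps 550-556 can be sketched as a single turn-angle computation; equation (3) is reconstructed here from the Fig. 5C geometry, and the function name is illustrative.

```python
import math

def tracker_turn_angle(theta_ha_deg, x_m, x_p, y_b):
    """Beacon-tracker turn angle (Fig. 13): the offset angle of
    equation (3), arctan of the beacon offset Y_B over the remaining
    path distance X_P - X_M, added to the measured horizontal angle
    thetaHA (step 556). Angles in degrees, distances in any one unit."""
    theta_off = math.degrees(math.atan2(y_b, x_p - x_m))
    return theta_ha_deg + theta_off
```

For a beacon aligned with the path (YB = 0), the offset angle vanishes and the tracker simply steers out the measured horizontal angle.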
An important feature of these navigation systems is that they output values continuously. The mobile module can accept new commands during execution of turns and changes of velocity, as directed by arbiter 524, to rapidly adjust its operation and provide for smoother turning and acceleration changes for the robot.
Avoidance and path following of robot 10 in relation to obstacle 570 are shown in Fig. 14. Body ultrasound transducers 24, Figs. 1 and 2, provide a 25-foot range about robot 10, as indicated by circle 572. However, for obstacle avoidance, the range of the sensors is ignored beyond maximum range Rmax, indicated by arrow 574, e.g., three feet. Further, a limited sensor arc 575 of 135°, or 67.5° to either side of the desired direction heading, is used. The field of view of the avoider therefore utilizes the 15° field of view of the sensor oriented toward the desired direction, indicated by sector 576, and four sensors to either side of it, such as sensors in sectors 578, 580. Note that front 300 is oriented with sector 578, and not with sector 576, since face direction 300 is not oriented in the desired direction.
For velocity calculation, however, the field of view, between dashed lines 577, 579, is established relative to the forward facing direction, represented by direction arrow 300. Velocity field of view 577, 579 is a 75° arc which utilizes the forward facing sensor and two sensors on each side of that sensor.
While the desired path is represented by line 582, the actual path traveled during object avoidance is illustrated
by path 584. Once clear of the obstacle 570, in terms of the avoider field of view 575, robot 10 is returned to path
582 along route 586 using the path following system. The path following system attempts to return robot 10 to path
582 within correction distance Xcorr, shown by line 588.
Once returned to the path, robot 10 proceeds to follow a beacon, if present, or to continue path following.
The operation of obstacle avoider system 528 according to this invention is shown in Figs. 15A-15C. The range ri of each sensor i is examined, where i is the index of the sensor. Each range ri is monitored to determine when it is within range Rmax, step 590. As shown in step 592, turn angle thetaturn is set to zero and the drive velocity VD is set to the patrol velocity until an obstacle comes within avoidance range.
To prevent damage to the robot, obstacle avoider system 528 must take into account the width of the robot. As shown in Fig. 16, robot 10 has radius r and is navigating through passage 594 between walls 596, 598. Passage 594 has a width of 2w. The avoidance distance ravoid is the distance by which an object is avoided and is computed according to the formula
ravoid = w − r    (4)
as shown in step 600, Fig. 15A. A maximum avoid distance, max avoid, is imposed when ravoid exceeds max avoid, steps 602, 604, respectively, to prevent robot 10 from deviating excessively from its path during obstacle avoidance. Further, a minimum clearance around objects can be specified, such as a wide avoider clearance, e.g., eight inches, or a narrow avoider clearance, e.g., three inches.
Loop 606 thresholds the range data within the avoidance viewing angle such that for each range element ri, step 608, the threshold set element ti is set to zero when ri is less than or equal to ravoid, and ti is set to one when ri is greater than ravoid, steps 610, 612, respectively.
After the last range element ri is thresholded, that is, set to one or zero, step 614, the side-swipe angle thetaSS is computed, step 616. As shown in Fig. 17, side-swipe angle thetaSS is the minimum angle which robot 10 must turn such that point P will miss obstacle 618. In other words, robot 10 must alter heading 620 to at least heading 622 to avoid collision. The side-swipe angle is computed according to the formula
θSS = sin⁻¹ (r / ravoid)    (5)
The side-swipe angle is utilized, step 616, Fig. 15A, to compute the number of sectors by which the robot must avoid an object. In other words, neighboring sectors of a blocked sector must be set to zero. The zero-neighbor factor Z is computed by dividing the side-swipe angle thetaSS by the beam width thetabw, step 618, and rounding upwards to a whole integer. As stated above, the beam width thetabw of the ultrasound transducers of robot 10 is 15°.
A clear set is constructed in loop 620 by examining each thresholded range element ti; when it equals zero, step 622, its neighbors are zeroed such that the resulting clear set elements Ci within Z sensor sectors of a blocked sector are set to zero, step 624. When the last thresholded range element ti has been processed, step 626, the sensor facing the direction of travel is computed in steps 628, 630. The sensor zero path angle thetaSZ is the angle from the path heading to sensor zero and is determined by subtracting the body angle thetaB from the path angle thetaP. The integer number of the sensor facing the direction of travel, sensor Sd, is determined according to the formula
Sd = (θSZ − θbw/2) / θbw    (6)
as shown in step 630.
Since direction sensor Sd is blocked, the nearest open sensor which, when the robot heads in that direction, will enable the robot to clear the obstacle, is determined by hunting successively clockwise and counterclockwise from direction sensor Sd to find the clear sensor Sc, step 632. The clear sensor Sc is the non-zero element in clear set Ci nearest to the direction sensor Sd. It can be determined according to the formula
i = σk(k+1)/2 + (1 − σk)(n − k/2)  modulo n    (7)
where i is the index of a sensor relative to the forward-facing sensor, n is the number of active sensors, k is a counter constant, and σk is equal to zero when k is even and equal to one when k is odd. Thus, equation (7) successively generates the elements 0, 1, n−1, 2, n−2, 3, ..., by which elements on either side of the desired sensor direction are alternately examined. The first non-zero clear set member is selected as the clear sensor.
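The alternating search order generated by equation (7) can be sketched and checked directly; the function name is illustrative.

```python
def hunt_order(n):
    """Order in which sensor indices are examined around the desired
    direction, per equation (7): 0, 1, n-1, 2, n-2, 3, ...  Index 0 is
    the desired-direction sensor; clockwise and counterclockwise
    neighbors are visited alternately."""
    order = []
    for k in range(n):
        if k % 2:                       # sigma_k = 1 (k odd)
            order.append(((k + 1) // 2) % n)
        else:                           # sigma_k = 0 (k even)
            order.append((n - k // 2) % n)
    return order
```

Scanning the clear set in this order and stopping at the first non-zero element yields the clear sensor Sc nearest the desired direction.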
Before the turn angle is calculated, the head direction angle thetaD is calculated by subtracting the sensor zero path angle thetaSZ from the head-path angle thetaM, step 634.
The index number of clear sensor Sc is converted into degrees by multiplying it by beam width thetabw, from which is subtracted head direction angle thetaD to obtain thetaturn, step 636.
The velocity is determined within the velocity field of view 577, 579, Fig. 14, by determining the minimum range rmin within the forward-facing field of view, that is, around head direction angle thetaD, step 638. Minimum range rmin is utilized in loop 640 to determine the drive velocity VD by dividing the minimum detected range rmin by the maximum avoidance range Rmax and multiplying the result by the patrol velocity, step 642. Thus, the patrol velocity is maintained, steps 644, 646, until rmin drops below Rmax.
The turn value thetat.urn and drive velocity Vα,. are provided to arbiter 524, step 648. Operation then returns to step 590, where the calculations are repeated. Obstacle avoider system 528 thereby provides new commands which are accepted by motor controller 330 during execution of turns and during changes in velocity to provide smoother changes. Once the robot has avoided the obstacle, it returns to the path utilizing path follower 526, whose operation is dep eicted in Fig. 18. The oresent distances Xm, Ym, are read from blackboard 410, step 650. The path distance X is obtained from map 402 in step 652 and is compared to the distance actually travelled, X , to ensure that the correction distance Xcorr_. will not bring the robot bey -~ond the destination node. The correction distance X is described above in relation to Fig. 14. From step 654, Fig.
18, the minimum elapsed distance along the X axis is set to correction distance Xcorr, step 656, unless the robot is close to the node as defined in step 658. The turn angle thetaturn is computed according to the formula
Figure imgf000035_0001
as shown in step 660.
The travel velocity is inversely proportional to the magnitude of turn angle thetaturn as shown in step 662. The drive velocity and turn angle are then provided to arbiter 524, step 664, and operation returns to step 650. As shown in Fig. 12B, the output of beacon tracking system 522 is granted priority over the output of path follower 526
SUBSTITUTE SHEET so that the robot resumes tracking the beacon, if present, once it is back on the path.
The hierarchy of operational commands for robot 10 is shown in Fig. 19. Global navigation system 670 includes planner 672 and executive 674. Planner 672 is responsive to map 676, which contains the expected position estimation data, described above in relation to Figs. 7A-7C, and other data such as the path type. For instance, a path can be designated as an avoidance path where objects are known to be present, a dead-reckoning path where no beacon is present, or a beacon tracking path where a navigation beacon is provided. The path can also be identified as a docking path which returns the robot to a recharge station, or as an intrusion detection path where intrusion detection is to be performed. Other information can be specified, such as the patrol velocity or whether a narrow or wide avoidance clearance is required.
Operator 678 represents a guard or other personnel who provide commands to planner 672 such as the start node, the goal list, or a command to go to a particular node. Operator 678 can command the robot to resume patrol or to return to the charge station.
Planner 672 prepares a path list comprising the executive nodes to visit between each goal node and provides this list to executive 674. Executive 674 identifies the path direction and destination node when a current node is occupied, at which point local navigation 520 operates until the destination node is reached. Local navigation 520 relies upon the blackboard for current status information.
The travel of the robot is interrupted by battery status 680, which monitors voltage to determine when the batteries have discharged below a predetermined amount. Recharge current is monitored while the robot is docked to determine when the batteries are sufficiently charged.
Executive 674 notifies planner 672 on line 682 when the path list is complete or when battery status 680 indicates that the batteries have discharged below a certain level. Planner 672 then calculates the return path to the nearest charge station.
The operation of planner 672 is shown in Fig. 20. The current node is set to the start node, step 690, and loop 692 computes the shortest path from the current node to the goal node for each node in the goal list. The shortest path is computed using Dijkstra's algorithm such as described in Aho, Hopcroft and Ullman, Data Structures and Algorithms (1983), page 205. The path list for those nodes is thereby constructed, step 694, unless the path is not possible, step 696, in which case the next goal node is requested, step 698, and an alternative destination planned.
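The planner's shortest-path computation can be sketched as a standard Dijkstra search over the node map; the adjacency-dictionary representation and function name here are assumptions for illustration:

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm over a node map, as the planner uses to
    build the path list between goal nodes.  `graph` maps each node
    to a dict of {neighbor: path_distance}."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                      # stale heap entry
        for nbr, w in graph.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    if goal not in dist:
        return None                       # path not possible (step 696)
    path = [goal]
    while path[-1] != start:              # walk predecessors back to start
        path.append(prev[path[-1]])
    return path[::-1]
```

A `None` return corresponds to the "path not possible" branch at step 696, after which the next goal node would be requested.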
For each path list, the executive 674 is summoned, step 700, and is provided with the path list. When the executive has completed that path list, step 702, the next goal node is requested, step 698.
The operation of executive 674 is shown in Fig. 21. Local navigation 520 is summoned, step 710, and the path information is provided to it, step 712. Once local navigation has ended, step 714, and the destination node becomes the current node, the current node is updated, step 716, to identify it as the former destination node.
Two low battery conditions are examined. When the battery voltage is at a minimum, step 718, e.g. 11.1 volts, the executive instructs the planner while at the node to return the robot to the nearest charge station, step 720. If the battery voltage is merely low, step 722, e.g. 11.7 volts, the operator is warned, step 724, and the robot resumes patrol. Executive 674 proceeds through the path list, step 726, until it is completed.
Map 676 can be generated by an operator who escorts the robot from node to node in a premises to be patrolled. The
operator directs the robot to the node, a beacon is placed at the next node, and the robot is informed of the code for that distant beacon. The robot then scans, finds the beacon, and proceeds toward the beacon. If a beacon is not placed at the next node, the operator again directs the robot where to travel. The robot, when at the next node, records such information as the vertical angle to the beacon thetavA and the distance to a wall RHA, and the path distance Xp is set to the actual distance travelled, Xm.
The node may be named by the operator so that he can designate it as a goal node in the future. The operator places another beacon at the next node, and the process is continued until the robot has mapped its area of patrol.
In summary, the robot finds each node by operator designation or by beacons previously placed at the node. The robot then defines nodes by noting the angle and distance to and from the beacon, by uniquely identifying the beacon, or by other landmarks including passive beacons such as reflectors and bar codes. The robot can also define a node in terms of grid-based or segment-based geometric modeling.
Position locating system 1350, Fig. 22, includes one or more beacon transmitters 64, 64a, 64b, each having an infrared source 62, 62a, 62b. Also included is an infrared sensor 60 sensitive to the infrared radiation emitted by source 62, and associated with sensor 60 is an eye circuit 1352 whose output is provided on bus 1354. Bus 1354 interconnects with beacon STD-bus interface 1356 in beacon module 210. Interface 1356 communicates with microprocessor 1358 over STD bus 1360. Microprocessor 1358 may be a Z80 and it communicates directly with CPU 218, which may be a 68000.
Beacon transmitter 64 provides an optical burst 1362 of coded signals every 15.6 milliseconds, Fig. 23. Each burst, as shown in greater detail in Fig. 24, has a total burst
time of 244 microseconds which defines an eight-bit word, each bit being 30.5 microseconds wide. The first bit is a start bit; the next seven bits are code bits and represent 128 different possible codes. Each code can uniquely identify a single beacon, so that with this simple arrangement one hundred twenty-eight different beacons can be uniquely identified. When the infrared source is seen, that is considered a logic one. When the infrared source, which may be a light-emitting diode or LED, is off, then the signal is low and is considered a logic zero. The signals shown in Figs. 23 and 24 are generated in beacon transmitter 64 by an oscillator 1364, Fig. 25, which runs continuously at 32.768 KHz. Its output is delivered directly to a register in code generator 1366. Its output is also delivered to a counter 1368, modulo 512, which divides the 32.768 KHz signal to provide the time periods shown in Figs. 23 and 24; that is, a burst of eight bits occurs every 15.6 ms. The eight bits are set to one or zero to produce the unique code for a particular beacon by the setting of the code select keys 1370. When one of the keys 1370 is toggled to ground, the associated stage of the register in 1366 is grounded, thereby placing a logic one in that bit position. Switches that are left toggled to high-level voltage produce a logic zero in the associated stage. The patterns of ones and zeros modulate the infrared radiation produced by LED 62 so that a coded signal is provided which uniquely defines the particular beacon.
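The eight-bit beacon word (start bit plus seven code bits, 128 unique codes) can be sketched as below; the MSB-first bit order and the function names are assumptions, since the patent does not specify the transmission order:

```python
def encode_burst(code):
    """Build the eight-bit beacon word: a start bit followed by seven
    code bits (assumed MSB first), giving 128 unique beacon codes."""
    assert 0 <= code < 128
    bits = [1]                     # start bit: infrared source on = logic one
    for i in range(6, -1, -1):     # seven code bits
        bits.append((code >> i) & 1)
    return bits

def decode_burst(bits):
    """Recover the beacon code from a received eight-bit word."""
    assert len(bits) == 8 and bits[0] == 1
    code = 0
    for b in bits[1:]:
        code = (code << 1) | b
    return code
```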
Sensor 60 in eye circuit 1352, Fig. 26, is a multisector sensor such as a dual-axis lateral effect photodiode. It provides four separate outputs, each indicative of the infrared radiation incident on its particular sector. By analyzing the relative values of the radiation falling on the different sectors, a determination can be made as to the angle of the sensor to the emitting
beacon. Each of the four sector outputs from photodiode 60 is fed to a different channel 1372, 1374, 1376, 1378. Each channel includes an amplifier 1380, high-pass filters 1382, voltage amplifiers 1384, and sample and hold circuits 1386. High-pass filters 1382 pass the coded signal from beacon 64 but block 60-cycle and 120-cycle signals introduced by ambient light conditions. Periodically, on command from microprocessor 1358, a signal on sample and hold line 1388 causes sample and hold circuits 1386 to sample and hold the signal in each channel. Those signals are then multiplexed by analog multiplexer 1392 as directed by a command from microprocessor 1358 on line 1390. The signal from each channel is fed directly to the gain control of amplifier 1394. Finally, the output from each channel is fed to A/D converter 1398, where it stops unless a control signal on line 1400 from microprocessor 1358 requests the angle data signal on line 1402. Microprocessor 1358 also provides a select and enable signal on line 1404 to A/D converter 1398 to indicate the particular eye circuit 1352, 1352a, 1352b or 1352c which is currently being interrogated.
Simultaneously with this, one or more of the outputs from photodiode 60 after passing through amplifiers 1380 are combined in an AC summer 1406 in order to maximize the signal which will be used to detect the identifying code. From summer circuit 1406 the signal is passed to clipper circuit 1408, which limits the output independent of the input amplitude. At this point the signal is constituted by one or more coded pulses riding on an envelope of sixty or one hundred twenty cycle noise. Differentiator circuit 1414 is therefore used to detect only the transitions of the pulses; thus, for every positive-going transition a positive spike appears at the output of differentiator 1414 and for every negative-going transition a negative spike occurs at the output of differentiator 1414. The positive-going spikes pass through amplifier 1416 and set flip-flop 1418 to
define the beginning of a pulse. Negative-going spikes passing through amplifier 1420 reset flip-flop 1418 and define the end of the pulse. In this way the pulses of the received coded signal are reconstituted one at a time to construct the code data signal on line 1422.
The angle data signal on line 1402, Fig. 27, is fed directly through MUX 1424 in beacon STD-bus interface 1356 to STD-bus 1360. The code data signal is fed from MUX 1424 to code verifier circuit 1426. After it is verified it is submitted to a converter 1428 where it is changed from a serial signal to a parallel signal and then provided to STD-bus 1360. Code verifier circuit 1426 may utilize any of a number of techniques for verifying the authenticity of an incoming code. For example, the incoming signal may be sampled at fixed times following a start pulse when pulse transitions would normally be expected in a valid signal. If the transitions occur within narrow windows at the expected times, they are treated as valid code; otherwise they are rejected. The code status is provided on line 1430 to STD-bus 1360.
Under software control, Figs. 28A and 28B, operation may begin with a signal from CPU 218 in step 1440 with the command "Get Eye Data". When microprocessor 1358 receives that signal it selects a particular eye in step 1442. The A/D converter is then commanded to start the conversion in step 1446 and the code data is obtained on line 1422 in step 1448. In step 1450, if the code data is bad the cycle starts again with the beginning of a new conversion in step 1446. If the code is good, the angle information is used, step 1452: the angle data is converted to the azimuth angle and the altitude angle, combined with the code, and directed to CPU 218, step 1454. The azimuth angle needs no further processing. The altitude angle and code are
delivered to CPU 218, which then retrieves the height H of the identified beacon in step 1456; height H can be unique for that beacon, or all beacons can be placed at the same height. That height is used to calculate the distance D to the beacon by triangulation, e.g., by dividing the height by the tangent of the altitude angle, step 1458. Then the distance and direction of the robot relative to the beacon are output in step 1460.
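The triangulation of step 1458 reduces to a one-line computation; the function name and use of degrees are assumptions for illustration:

```python
import math

def beacon_distance(height, altitude_deg):
    """Horizontal distance to a beacon of known height H from the
    resolved altitude angle (step 1458): D = H / tan(theta_altitude)."""
    return height / math.tan(math.radians(altitude_deg))
```

For example, a beacon mounted 2 m above the sensor seen at a 45-degree altitude angle is 2 m away horizontally.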
The calculation in step 1454, Fig. 28A, of the azimuth angle and the altitude angle from the angle data signal is accomplished by determining the X position and the Y position from the dual axis lateral effect photodiode of Fig. 26 shown in more detail in Fig. 29. The X position is calculated according to the expression:
X position = (A - C) / (A + C)    (9)
and the Y position by the expression:
Y position = (B - D) / (B + D)    (10)
The division by A + C and B + D respectively normalizes the signal to reduce its dependence on the incident light level. The angles are those determined by the expression:
Figure imgf000042_0001
where K is a constant dependent on the size of the detector and the focal length of the light-gathering lens, if one is used:

K = D / Fl    (13)

where D is the diameter of the detector and Fl is the focal length of the lens.
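The sector-to-angle computation can be sketched as below. The normalized positions follow equations (9) and (10); since the image containing equations (11) and (12) is not reproduced in the source, the linear scaling by K is an assumption (a small-angle approximation consistent with K = D / Fl in equation (13)):

```python
def sensor_angles(a, b, c, d, k):
    """Approximate azimuth and altitude angles from the four sector
    outputs A, B, C, D of a dual-axis lateral-effect photodiode.
    Division by the sum normalizes out the incident light level
    (eqs. 9-10); the linear scaling by K is an assumed approximation
    of the patent's eqs. 11-12."""
    x_pos = (a - c) / (a + c)   # eq. (9)
    y_pos = (b - d) / (b + d)   # eq. (10)
    return k * x_pos, k * y_pos
```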
Mapping system 1500 according to this invention is shown in Fig. 30. System 1500 establishes a map for a vehicle such as a mobile robot which at a minimum allows the robot to navigate from node to node as described above. The map may further instruct the robot to perform certain tasks along the path, such as intruder detection, beacon tracking, obstacle avoidance, or docking with a recharge station. Similarly, tasks can be assigned to nodes, such as intruder detection or position calibration. Robots using such a map can operate autonomously: in accordance with a path list, the robot determines where to go and what tasks to perform along each path and at each node.
The map is established node by node. Directing module 1502 directs the robot to a selected location in a premises to be patrolled and designates that location as a start node. The initial direction of a path to the next node to be established is provided by path identification module 1504. When the robot is located at the start node, robot movement command system 1506 commands the robot through mobile module 224, Fig. 3, to move in that direction.
The robot proceeds along the path using one or more navigation techniques while orientation update system 1507, shown in phantom, maintains the robot on an accurate, repeatable course. Orientation update system 1507 successively observes the surrounding environment, such as by aisle centering and wall following, which are described below in relation to Fig. 32, or continuously monitors a beacon to update its actual orientation relative to global zero and to the intended heading.
The distance travelled along the path is measured by path measuring assembly 1508, Fig. 30, which includes drive encoder 111, Fig. 6. When the robot reaches the next node it is stopped by halt module 1510, shown in phantom, and the node is defined by node defining system 1512 in terms of at least the initial path direction and the path distance.
In one construction, directing module 1502, identification module 1504 and halt module 1510 are part of input device 1514 such as a terminal connected by cable directly to CPU 218, Fig. 3. An operator accompanies the robot and enters commands to guide it to a start node, identify the initial path direction, and to halt the robot once it reaches the next node.
In another construction, the robot is guided among nodes by navigation beacons. The robot emerges from a selected location, e.g., a recharge station, and approaches the beacon until thetavA matches a selected thetavE as shown in Fig. 5B. As discussed above, all beacons can be set at a fixed height, or the individual height and vertical angle thetavE for each uniquely coded beacon can be provided to the robot. After establishing one node, the robot scans using beacon detector 60, Fig. 1, to locate the next beacon; if a beacon is not in view, the robot rotates in position until the next beacon is located. The initial direction of the path to the next node can be identified by the orientation of the next beacon in relation to the previous path or by the global heading to the next beacon.
The general mapping operation is summarized in Figs. 31A-31B. A start node is designated, step 1520. If the next path to be established is not the final path, step 1522, the type of that path is selected, step 1524. Path types include exiting a recharge station, dead reckoning, beacon path navigation, and aisle centering or wall following, steps 1526, 1528, 1530 and 1532, respectively. Step 1534 indicates that one or more tasks
can be designated for that path. Recharge station exit procedure is shown in Fig. 33.
For dead reckoning, the turn angle thetaturn is entered, step 1535, after which the robot turns and sets path angle thetap in relation to global zero, step 1536. The robot then dead reckons in the direction of the path, step 1538, until it is stopped at the goal node, step 1540, as designated by an operator.
When the path type is a beacon tracking path where each beacon is uniquely coded, the beacon code for the next beacon is entered by the operator, step 1542, or retrieved from memory. The beacon is located, step 1544, and the path angle is set after the robot turns and faces the beacon. The robot tracks the beacon, step 1546, while recalibrating body angle thetaB and head direction angle thetaD as described above. The robot can be halted at the goal node, step 1540, by a command from an operator or when actual vertical angle thetavA is equal to expected vertical angle thetavE.
Two additional forms of navigation, aisle centering and wall following, step 1532, are illustrated in Fig. 32. Robot 10 is shown in passageway 1550 positioned at distances D1 and D2 from walls 1552 and 1554, respectively. Where passageway 1550 is a narrow aisle, robot 10 is instructed to maintain distance D1 equal to distance D2. When wall following is desired, the robot is instructed to maintain a fixed distance from a designated wall, e.g., to maintain distance D2 from wall 1554. Distance keeping can be overridden when an obstacle comes within avoidance field of view 1556.
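The two modes of Fig. 32 can be sketched as a simple proportional steering law; the gain value and the proportional control form are assumptions, since the patent does not specify the control law:

```python
def steering_correction(d1, d2, gain=0.5, wall_setpoint=None):
    """Steering sketch for the two modes of Fig. 32.  Aisle centering
    drives the difference D1 - D2 to zero; wall following holds D2 at
    a fixed setpoint from the designated wall (1554).  Positive output
    means turn toward wall 1554.  The gain and the proportional law
    are illustrative assumptions, not from the patent."""
    if wall_setpoint is None:
        error = d1 - d2              # aisle centering: equalize distances
    else:
        error = wall_setpoint - d2   # wall following on wall 1554
    return gain * error
```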
When the path type is designated as aisle centering or wall following, step 1532, Fig. 31A, a turn angle is entered, step 1560. The path angle is set, step 1562, and the robot proceeds with a combination of dead reckoning and aisle centering or wall following, step 1564.
Once at the goal node, a number is assigned to it, step 1566, as entered by the operator or as incremented from the preceding goal number. While during initial operation the robot will be establishing new, previously unknown goal nodes in the environment, the robot eventually can establish paths to and from known nodes as described below, in which case the goal node number will correspond to a previous number. The path distance Xp is set to the travel distance XM, step 1568, and the position of the goal node is defined, step 1570. The definition can include position estimation using end calibration or X-axis calibration. Further, the actual ultrasonic readings RHA and beacon angle readings thetavA are designated as the expected values RHE and thetavE, respectively. The node can also be defined by geometric modelling such as grid-based or line-segment-based modelling. Further, the location of the node in relation to physical features such as intersections of passageways can be noted. The data defining the position is recorded in the map, step 1572, and operation returns to step 1520 after setting travel distance XM to zero, step 1573.
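The data recorded for a goal node in steps 1566-1572 might be organized as below; the field names are illustrative assumptions, not identifiers from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class NodeRecord:
    """Sketch of the data recorded per goal node (steps 1566-1572).
    Field names are illustrative, not the patent's identifiers."""
    number: int                            # goal node number, step 1566
    path_distance: float                   # Xp, set from travel distance XM
    path_angle: float                      # degrees from global zero
    expected_wall_range: float = 0.0       # RHE, from ultrasonic reading RHA
    expected_vertical_angle: float = 0.0   # thetavE, from thetavA
    tasks: list = field(default_factory=list)  # tasks assigned at the node
```

For example, node 6 of Fig. 35A would carry a path distance of 100.4 feet and a path angle of 90 degrees relative to node 5.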
In a preferred construction, the robot is initially oriented to its environment in relation to the position of its recharge station. The robot's position within the recharge station is designated as the start node, step 1520, and the path type is set as a docking path, step 1524, which disables the obstacle avoidance sensors. The robot, now occupying the base node, aligns its head direction angle thetaD with body sensor zero, step 1579, such as shown in Fig. 4B. The robot then searches, step 1580, Fig. 33, for a navigation beacon which is preferably positioned directly in front of the recharge station. After locating the beacon, that direction is set as global zero, step 1582. Body angle thetaB is set to the
angle between global zero and sensor zero; the angle between forward-facing direction 300 and sensor zero 302, Fig. 4A, is designated as direction angle thetaD.
The robot prepares to exit the station. Its velocity is set to low, step 1586, in accordance with the docking path type. The robot tracks on the beacon, recalibrating thetaB and thetaD, step 1588. The robot is stopped at the goal node which serves as the entry node, step 1590, and operation proceeds to step 1566, Fig. 31B, where the entry node is identified by number.
When the last path is reached, step 1522, Fig. 31A, operation proceeds to step 1602, Fig. 34. The path parameters are set, step 1602: the path type is a docking path, the path angle is 180 degrees from global zero, and the velocity is low, e.g., 50% of the maximum velocity. The beacon code is obtained for the beacon within the recharge station, step 1604, and the robot enters the recharge station while tracking the beacon, step 1606. The robot is stopped at the base node, step 1608, by operator designation or by the vertical angle thetavA, and the path between the entry node and the base node is defined by setting Xp to XM and recording the path distance and the path angle in the map, steps 1612 and 1614, respectively. Steps 1602-1614 ensure that the robot is capable of returning to the recharge station.
After the final node is recorded in the map, additional information is entered to prepare the robot for autonomous patrol. The start node for commencing patrol is entered, step 1622, and the goal list for patrol is selected, step 1624. The final information is combined with the map and saved, step 1626, such as by storing the information on diskette.
Several situations may arise in which the operator desires to establish new paths or new nodes. Expansion of the map
enables the robot to accommodate an enlarged environment or to establish new routes within a mapped environment. A new path 1630 can be established between known nodes, such as between nodes 5 and 6, Fig. 35A, where nodes 0, 1, 2, 3, 4, 5 and 6 were previously established. Node 6 is then defined not only in relation to nodes 3 and 4, but also as a distance, e.g., 100.4 feet, and a direction, e.g., 90 degrees, from node 5.
Alternatively, as shown in Fig. 35B, a path 1632 is established between known node 5 and unknown node 6. Finally, path 1634 can be established between two unknown nodes 5 and 6, Fig. 35C. These contingencies are addressed in Fig. 36. If both the start and the goal nodes exist, as determined at step 1640, the robot is maneuvered to the start node, step 1642, and instructed to proceed from the start node to the goal node, step 1644, by calling steps 1520-1572, Figs. 31A-31B. If there are further additions, step 1646, operation returns to step 1640; otherwise, step 1622, Fig. 34, is entered. If neither node exists, as determined at step 1650, the mapping operation commencing with step 1520, Fig. 31A, is begun. If the start node exists, step 1652, Fig. 36, the robot is positioned at the start node, step 1654, and instructed to proceed to the goal node, step 1656, after proceeding through steps 1520-1572. The operator can be warned that the goal node is not connected to an adjacent node, step 1658, shown in phantom. The new node must be tied in to at least one adjacent node, step 1660, so that the robot is able to complete a circuit.
If the goal node exists but the start node does not, the robot is positioned at a node adjacent to the start node to be established, step 1662, and proceeds from the adjacent node to the start node using steps 1520-1572, indicated as step 1664. The robot then proceeds from the newly established start node to the goal node, indicated as step 1666.
As indicated above, once such a map has been established, the robot can operate autonomously to navigate to designated places and perform designated tasks along each path. By adjusting the goal list, the operator can restrict the robot to a selected portion of the map, e.g., to provide increased surveillance of that area. Further, the goal list can be pseudorandomly generated to provide an unpredictable route for the robot.
Although specific features of the invention are shown in some drawings and not others, this is for convenience only as each feature may be combined with any or all of the other features in accordance with the invention.
Other embodiments will occur to those skilled in the art and are within the following claims.

What is claimed is:

Claims

1. A mapping system for establishing a number of navigation nodes, each proximate a navigation beacon, and interconnecting paths in an environment to be navigated by a vehicle having a drive system, comprising: means for directing the vehicle to a selected location in the environment and designating that location as a first node; means for locating the initial direction of a path from said first node to a second node relative to a beacon proximate said second node; means, responsive to said means for locating, for commanding the drive system to move the vehicle along said path in said initial direction from said first node;
means for measuring distance travelled by the vehicle along said path from said first to said second node; and means, responsive to said means for measuring, for defining said second node by at least the distance travelled along said path between said first and second nodes and by said initial direction.
2. The system of claim 1 in which said means for locating includes azimuthal detector means for sensing a navigation beacon and for resolving the azimuthal angle between a reference direction for the vehicle and the direction of the beacon relative to the vehicle.

3. The system of claim 2 in which said means for commanding includes means, responsive to said azimuthal detector means, for turning said vehicle by said vehicle-beacon azimuthal angle.
4. The system of claim 2 in which said means for locating further includes means for moving said azimuthal detector means to search for the navigation beacon.

5. The system of claim 4 in which said means for moving searches for a navigation beacon having a particular identification code.
6. The system of claim 1 further including means for monitoring the orientation of the vehicle relative to the environment.
7. The system of claim 6 further including means for substantially continuously resolving, while the vehicle is travelling along said path, the orientation of the vehicle relative to a feature in the environment and for updating said means for monitoring.

8. The system of claim 7 in which said means for resolving defines vehicle orientation relative to the beacon.
9. The system of claim 1 further including means for halting the vehicle at said second node.
10. The system of claim 9 in which said means for halting includes halt input means for operator input of a halt command.
11. The system of claim 9 in which said means for halting includes altitudinal detector means for sensing a navigation beacon and for resolving the altitudinal angle between the beacon and the vehicle.
12. The system of claim 11 in which said means for halting further includes means for comparing the altitudinal angle with a predetermined angle and for arresting travel of the vehicle when they match.
13. The system of claim 1 in which said means for directing includes means for selecting said second node as the location from which a third node is to be established.
14. The system of claim 1 in which said means for directing includes input means for operator input of said selected location for said first node.
15. The system of claim 14 in which said means for directing further includes guidance input means for operator guidance of the vehicle to said selected location.
16. The system of claim 1 further including task input means for operator input of task commands defining actions to be performed by the vehicle along that path.

17. The system of claim 1 further including node task means for operator input of task commands defining actions to be performed by the vehicle at said second node.

18. The system of claim 1 in which said means for defining includes means for storing said initial path direction and said path distance.
19. A mapping system for establishing a number of navigation nodes and paths among them in an environment to be navigated by a vehicle having a drive system, comprising: means for directing the vehicle to a selected location in the environment and designating that location as a first node; means for identifying the initial direction of a path from said first node to a second node; means for commanding the drive system to move the vehicle in said initial direction from said first node; means for measuring distance travelled by the vehicle along said path; and means, responsive to said means for measuring, for defining said second node by at least the distance travelled along said path and by said initial direction.

20. The system of claim 19 in which said means for identifying includes means for determining the azimuthal angle by which the vehicle must turn to face said initial direction.
21. The system of claim 20 in which said means for determining includes angle input means for operator input of said azimuthal angle.
22. A mapping system for establishing a number of navigation nodes and paths among them in an environment to be navigated, comprising: a vehicle having a drive system including a drive motor and a steering system; means for directing said vehicle to a selected location in the environment and designating that location as a first node;
means for identifying the initial direction of a path to a second node; means for commanding said drive system to move said vehicle in said initial direction from said first node; means for measuring distance travelled by said vehicle along said path; and means, responsive to said means for measuring, for defining said second node by at least the distance travelled along said path and by said initial direction.
23. The system of claim 22 in which said means for measuring distance includes drive encoder means, responsive to said drive motor, for determining distance travelled by said vehicle.
24. The system of claim 22 in which said means for identifying includes means for determining the azimuthal angle by which said vehicle must turn to face said initial direction.
25. The system of claim 24 in which said means for determining includes azimuthal detector means for sensing a navigation beacon and for resolving the azimuthal angle between said vehicle and the beacon.

26. The system of claim 25 in which said means for commanding includes means, responsive to said azimuthal detector means and said steering system, for turning said vehicle by said vehicle-beacon azimuthal angle.
27. The system of claim 22 in which said vehicle is a mobile robot.
28. A mapping system for establishing a number of navigation nodes and paths among them in an environment to be navigated by a vehicle having a drive system, comprising: input means for operator input to designate a selected location in the environment as a first node, to identify the initial direction of a path to a second node, and to halt the vehicle at said second node; means for commanding the drive system to move the vehicle in said initial direction from said first node; means for measuring distance travelled by the vehicle along said path; and means, responsive to said means for measuring, for defining said second node by at least the distance travelled along said path and by said initial direction.
29. A system for navigating a vehicle from node to node utilizing a map, stored in the vehicle, which defines a plurality of navigation nodes in relation to each other, comprising: means for directing the vehicle to a first node recorded in the map; means for referencing the map to identify the known initial direction of a path from said first node to a second node proximate a navigation beacon and to identify the known distance of said path; detector means for sensing said navigation beacon and for resolving at least the azimuthal angle between a reference direction for the vehicle and the direction of the beacon relative to the vehicle; and means, responsive to said means for referencing and said detector means, for commanding the vehicle to proceed in said initial direction and to navigate along said path to said second node for said known distance.
30. The system of claim 29 in which said means for referencing identifies said initial direction based on a predetermined azimuthal angle from said first node toward said beacon proximate said second node.
31. The system of claim 29 in which said detector means also resolves the altitudinal angle between the vehicle and the beacon.
32. The system of claim 31 in which said means for referencing identifies said second node by a predetermined altitudinal angle observable at said second node.
33. The system of claim 32 in which said means for commanding includes means, responsive to said detector means, for halting the vehicle when the resolved altitudinal angle matches said predetermined altitudinal angle.
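Claims 31-33 halt the vehicle by elevation: a beacon mounted above the path subtends a steeper altitudinal angle as the vehicle closes in, so the node can be declared reached when the resolved angle matches a stored value. A hedged sketch under assumed geometry (the simulated detector `resolve_altitude`, the beacon height, and the step size are illustrative, not from the patent):

```python
import math

def resolve_altitude(distance_to_beacon: float, beacon_height: float) -> float:
    """Simulated altitudinal detector: angle from horizontal up to the beacon."""
    return math.atan2(beacon_height, distance_to_beacon)

def drive_until_node(start_distance: float, beacon_height: float,
                     node_angle: float, step: float = 0.01) -> float:
    """Advance along the path until the resolved altitudinal angle reaches
    the predetermined node angle, as in claim 33. Returns the horizontal
    distance from the beacon at which the vehicle halted."""
    d = start_distance
    while resolve_altitude(d, beacon_height) < node_angle and d > step:
        d -= step  # command the drive system one increment forward
    return d

# Beacon 2 m above the floor; the node is defined where the beacon is seen
# at 45 degrees, i.e. 2 m horizontally from the beacon.
halt = drive_until_node(10.0, 2.0, math.radians(45.0))
print(round(halt, 2))
```

This is why a single overhead beacon suffices to terminate a path: azimuth steers the approach, while the rising altitudinal angle encodes remaining distance.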
34. The system of claim 29 in which said beacon includes means for producing energy in a coded signal which uniquely identifies that beacon, and said means for referencing retrieves from the map the identification code for that beacon.
35. The system of claim 34 in which said detector means includes: a multisector sensor for sensing a signal emitted by said beacon; means, responsive to said sensor, for generating a code data signal representative of said coded signal; and means, responsive to said sensor sectors, for generating an angle data signal representative of the angle from said sensor to said beacon in at least the azimuthal dimension.
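The multisector sensor of claim 35 can resolve azimuth simply from which sector sees the beacon most strongly, each sector covering a fixed wedge of the circle. An illustrative sketch (the 8-sector layout and the intensity readings are assumptions, not specified by the claim):

```python
def sector_azimuth(intensities, sectors=8):
    """Return the azimuth (degrees from the vehicle's reference direction)
    of the centerline of the sector seeing the strongest beacon signal.
    intensities[i] is the reading of sector i; sector 0 is centred on
    the reference direction."""
    width = 360.0 / sectors
    strongest = max(range(sectors), key=lambda i: intensities[i])
    return (strongest * width) % 360.0

# Beacon seen most strongly by sector 2 of 8 -> 90 degrees from the reference.
readings = [0.1, 0.4, 0.9, 0.3, 0.0, 0.0, 0.0, 0.1]
print(sector_azimuth(readings))  # → 90.0
```

Angular resolution here is limited to the sector width; interpolating between adjacent sector intensities would refine it, but that refinement is beyond what the claim itself requires.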
36. The system of claim 29 further including planner means for designating successive nodes to visit and for providing said successive nodes to said means for directing.
37. A method of establishing a number of navigation nodes and paths among them for a vehicle in an environment to be navigated, comprising: directing the vehicle to a selected location in the environment; designating that location as a first node; identifying the initial direction of a path to a second node; moving the vehicle in the initial direction from the first node while measuring distance travelled along the path; defining the second node by at least the distance travelled along the path and by the initial path direction; and repeatedly identifying, moving and defining until a desired number of nodes and paths are established.
38. The method of claim 37 further including establishing a return path to said first node.
39. The method of claim 37 further including recording the established nodes to form a map of the environment.
40. The method of claim 39 further including assigning tasks for the vehicle to perform along each path.
41. The method of claim 39 further including assigning tasks for the vehicle to perform at each node.
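The mapping method of claims 37-41 amounts to a teach loop: repeatedly pick a heading, drive while accumulating encoder distance, and record each resulting node, with an optional task attached, into the map. A minimal illustrative sketch (the function `teach_map`, the tuple map format, and the sample patrol route are assumptions for illustration):

```python
import math

def teach_map(legs):
    """Build a node map from taught legs.
    legs: list of (initial_direction_radians, distance, task_at_node).
    Returns the recorded map as a list of (x, y, task) tuples, starting
    at the designated first node at (0, 0)."""
    x, y = 0.0, 0.0
    node_map = [(x, y, "first node")]
    for direction, distance, task in legs:   # identify, move, define...
        x += distance * math.cos(direction)  # ...repeated per claim 37
        y += distance * math.sin(direction)
        # round for readable coordinates; + 0.0 folds -0.0 into 0.0
        node_map.append((round(x, 6) + 0.0, round(y, 6) + 0.0, task))
    return node_map

# Teach a square patrol route with a task at each node (claim 41),
# ending with a return path to the first node (claim 38).
m = teach_map([(0.0, 4.0, "check door"),
               (math.pi / 2, 4.0, "check window"),
               (math.pi, 4.0, "check desk"),
               (3 * math.pi / 2, 4.0, "return home")])
print(len(m), m[-1][:2])  # → 5 (0.0, 0.0)
```

Recording the taught nodes (claim 39) is what turns a one-off guided drive into a reusable map: the same legs can later be replayed, and per-path or per-node tasks (claims 40-41) keyed to the recorded entries.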
PCT/US1987/001144 1986-11-28 1987-05-14 Node map system and method for vehicle WO1988004081A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US06/936,092 US4821192A (en) 1986-05-16 1986-11-28 Node map system and method for vehicle
US936,092 1986-11-28

Publications (1)

Publication Number Publication Date
WO1988004081A1 true WO1988004081A1 (en) 1988-06-02

Family

ID=25468161

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1987/001144 WO1988004081A1 (en) 1986-11-28 1987-05-14 Node map system and method for vehicle

Country Status (2)

Country Link
AU (1) AU7484287A (en)
WO (1) WO1988004081A1 (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4049961A (en) * 1974-02-01 1977-09-20 Thomson-Csf Automatic guidance system for moving objects
US4328545A (en) * 1978-08-01 1982-05-04 Imperial Chemical Industries Limited Driverless vehicle autoguide by light signals and two directional detectors
US4463821A (en) * 1982-03-01 1984-08-07 Robot Crabtor International Drivable, steerable platform for lawnmower and the like
US4500970A (en) * 1982-01-15 1985-02-19 Richard A. Boulais Robot vehicle guidance system including checkpoint realignment system
US4556940A (en) * 1980-09-05 1985-12-03 Mitsubishi Denki Kabushiki Kaisha Robot vehicle
US4593238A (en) * 1983-09-17 1986-06-03 Tsubakimoto Chain Co. Method and apparatus for controlling travel of an automatic guided vehicle
US4620285A (en) * 1984-04-24 1986-10-28 Heath Company Sonar ranging/light detection system for use in a robot
US4628454A (en) * 1982-07-13 1986-12-09 Kubota, Ltd. Automatic running work vehicle
US4638445A (en) * 1984-06-08 1987-01-20 Mattaboni Paul J Autonomous mobile robot


Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5280431A (en) * 1985-08-30 1994-01-18 Texas Instruments Incorporated Method for controlling the movements of a mobile robot in a multiple node factory
US5367456A (en) * 1985-08-30 1994-11-22 Texas Instruments Incorporated Hierarchical control system for automatically guided vehicles
EP0367526A3 (en) * 1988-10-31 1990-06-06 Texas Instruments Incorporated Closed-loop navigation system for mobile robots
EP0367527A3 (en) * 1988-10-31 1990-06-13 Texas Instruments Incorporated A method for controlling movements of a mobile robot in a multiple node factory
EP0367528A3 (en) * 1988-10-31 1990-06-13 Texas Instruments Incorporated Hierarchical control system for automatically guided vehicles
EP0367527A2 (en) * 1988-10-31 1990-05-09 Texas Instruments Incorporated A method for controlling movements of a mobile robot in a multiple node factory
EP0367526A2 (en) * 1988-10-31 1990-05-09 Texas Instruments Incorporated Closed-loop navigation system for mobile robots
EP0367528A2 (en) * 1988-10-31 1990-05-09 Texas Instruments Incorporated Hierarchical control system for automatically guided vehicles
WO1997033212A1 (en) * 1996-03-06 1997-09-12 Gmd - Forschungszentrum Informationstechnik Gmbh Autonomous mobile robot system for sensor-based and map-based navigation in pipe networks
US6108597A (en) * 1996-03-06 2000-08-22 Gmd-Forschungszentrum Informationstechnik Gmbh Autonomous mobile robot system for sensor-based and map-based navigation in pipe networks
GB2353909A (en) * 1999-08-28 2001-03-07 John Alfred Cawkwell Robot positioning and obstacle sensing
GB2353909B (en) * 1999-08-28 2004-03-17 John Alfred Cawkwell Robot positioning and motion mechanism
US9939529B2 (en) 2012-08-27 2018-04-10 Aktiebolaget Electrolux Robot positioning system
US10448794B2 (en) 2013-04-15 2019-10-22 Aktiebolaget Electrolux Robotic vacuum cleaner
US10219665B2 (en) 2013-04-15 2019-03-05 Aktiebolaget Electrolux Robotic vacuum cleaner with protruding sidebrush
US10149589B2 (en) 2013-12-19 2018-12-11 Aktiebolaget Electrolux Sensing climb of obstacle of a robotic cleaning device
US9811089B2 (en) 2013-12-19 2017-11-07 Aktiebolaget Electrolux Robotic cleaning device with perimeter recording function
US9946263B2 (en) 2013-12-19 2018-04-17 Aktiebolaget Electrolux Prioritizing cleaning areas
US10045675B2 (en) 2013-12-19 2018-08-14 Aktiebolaget Electrolux Robotic vacuum cleaner with side brush moving in spiral pattern
US10433697B2 (en) 2013-12-19 2019-10-08 Aktiebolaget Electrolux Adaptive speed control of rotating side brush
US10209080B2 (en) 2013-12-19 2019-02-19 Aktiebolaget Electrolux Robotic cleaning device
US10617271B2 (en) 2013-12-19 2020-04-14 Aktiebolaget Electrolux Robotic cleaning device and method for landmark recognition
US10231591B2 (en) 2013-12-20 2019-03-19 Aktiebolaget Electrolux Dust container
US10518416B2 (en) 2014-07-10 2019-12-31 Aktiebolaget Electrolux Method for detecting a measurement error in a robotic cleaning device
US10499778B2 (en) 2014-09-08 2019-12-10 Aktiebolaget Electrolux Robotic vacuum cleaner
US10729297B2 (en) 2014-09-08 2020-08-04 Aktiebolaget Electrolux Robotic vacuum cleaner
GB2532601B (en) * 2014-10-28 2021-04-21 Deere & Co Robotic mower navigation system
GB2532601A (en) * 2014-10-28 2016-05-25 Deere & Co Robotic mower navigation system
US9788481B2 (en) 2014-10-28 2017-10-17 Deere & Company Robotic mower navigation system
US10877484B2 (en) 2014-12-10 2020-12-29 Aktiebolaget Electrolux Using laser sensor for floor type detection
US10874271B2 (en) 2014-12-12 2020-12-29 Aktiebolaget Electrolux Side brush and robotic cleaner
US10534367B2 (en) 2014-12-16 2020-01-14 Aktiebolaget Electrolux Experience-based roadmap for a robotic cleaning device
WO2016095965A3 (en) * 2014-12-16 2016-08-25 Aktiebolaget Electrolux Experience-based roadmap for a robotic cleaning device
US10678251B2 (en) 2014-12-16 2020-06-09 Aktiebolaget Electrolux Cleaning method for a robotic cleaning device
US20190004537A1 (en) * 2014-12-16 2019-01-03 Aktiebolaget Electrolux Experience-based roadmap for a robotic cleaning device
CN107003669A (en) * 2014-12-16 2017-08-01 伊莱克斯公司 The road sign based on experience for robotic cleaning device
KR102339531B1 (en) * 2014-12-16 2021-12-16 에이비 엘렉트로룩스 Experience-based roadmap for a robotic cleaning device
KR20170095840A (en) * 2014-12-16 2017-08-23 악티에볼라겟 엘렉트로룩스 Experience-based roadmap for a robotic cleaning device
US11099554B2 (en) 2015-04-17 2021-08-24 Aktiebolaget Electrolux Robotic cleaning device and a method of controlling the robotic cleaning device
US10874274B2 (en) 2015-09-03 2020-12-29 Aktiebolaget Electrolux System of robotic cleaning devices
US11712142B2 (en) 2015-09-03 2023-08-01 Aktiebolaget Electrolux System of robotic cleaning devices
US11169533B2 (en) 2016-03-15 2021-11-09 Aktiebolaget Electrolux Robotic cleaning device and a method at the robotic cleaning device of performing cliff detection
US11122953B2 (en) 2016-05-11 2021-09-21 Aktiebolaget Electrolux Robotic cleaning device
CN107065858A (en) * 2017-02-13 2017-08-18 四川超影科技有限公司 Crusing robot air navigation aid based on ultra wide band
US11474533B2 (en) 2017-06-02 2022-10-18 Aktiebolaget Electrolux Method of detecting a difference in level of a surface in front of a robotic cleaning device
US11921517B2 (en) 2017-09-26 2024-03-05 Aktiebolaget Electrolux Controlling movement of a robotic cleaning device
US11525921B2 (en) 2018-04-03 2022-12-13 Sharkninja Operating Llc Time of flight sensor arrangement for robot navigation and methods of localization using same

Also Published As

Publication number Publication date
AU7484287A (en) 1988-06-16

Similar Documents

Publication Publication Date Title
US4821192A (en) Node map system and method for vehicle
US4829442A (en) Beacon navigation system and method for guiding a vehicle
WO1988004081A1 (en) Node map system and method for vehicle
US4751658A (en) Obstacle avoidance system
US4777416A (en) Recharge docking system for mobile robot
US4815008A (en) Orientation adjustment system and robot using same
US4710020A (en) Beacon proximity detection system for a vehicle
AU2012241780B2 (en) System and method for controlling an unmanned air vehicle
US5276618A (en) Doorway transit navigational referencing system
EP1023610B1 (en) Method and device for association of anonymous reflectors to detected angle positions
CA1236899A (en) Apparatus and method for optical guidance system for automatic guided vehicle
Madhavan et al. Distributed heterogeneous outdoor multi-robot localization
US4847769A (en) Automated vehicle drift correction
FI88655C (en) Ways of navigating a driverless vehicle as well as a vehicle for exercising in the manner
US20080294288A1 (en) Autonomous Mobile Robot
JP2023500082A (en) ROBOT GENERAL CHARGING STAND RETURN CHARGING CONTROL METHOD, CHIP AND ROBOT
KR101049906B1 (en) Autonomous mobile apparatus and method for avoiding collisions of the same
Nishizawa et al. An implementation of on-board position estimation for a mobile robot-ekf based odometry and laser reflector landmarks detection
US20240004389A1 (en) Robot traveling method, and electronic device
Yamauchi The Wayfarer modular navigation payload for intelligent robot infrastructure
Williams et al. A decoupled, distributed AUV control architecture
PL180337B1 (en) Guidance system for a self-driving vehicle with a separate mobile transmission equipment and method of steering such vehicle
Al Arabi et al. Autonomous rover navigation using gps based path planning
Hamner et al. An efficient system for combined route traversal and collision avoidance
CN110398251A (en) A kind of trackless navigation AGV positioning system and its localization method based on Multi-sensor Fusion

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU BR JP KR

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE FR GB IT LU NL SE