US20100030473A1 - Laser ranging process for road and obstacle detection in navigating an autonomous vehicle - Google Patents

Laser ranging process for road and obstacle detection in navigating an autonomous vehicle

Info

Publication number
US20100030473A1
US20100030473A1 (application US12/182,774)
Authority
US
United States
Prior art keywords
autonomous vehicle
current range
range scan
ground plane
scan
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US12/182,774
Other versions
US8755997B2
Inventor
Kwong Wing Au
Jon Schewe
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honeywell International Inc
Original Assignee
Honeywell International Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honeywell International Inc filed Critical Honeywell International Inc
Priority to US12/182,774 priority Critical patent/US8755997B2/en
Assigned to HONEYWELL INTERNATIONAL INC. reassignment HONEYWELL INTERNATIONAL INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AU, KWONG WING, Schewe, Jon
Priority to EP09160948A priority patent/EP2149799A2/en
Publication of US20100030473A1 publication Critical patent/US20100030473A1/en
Application granted granted Critical
Publication of US8755997B2 publication Critical patent/US8755997B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Definitions

  • Unmanned ground vehicles include remote-driven or self-driven land vehicles that can carry cameras, sensors, communications equipment, or other payloads.
  • Self-driven or “autonomous” land vehicles are essentially robotic platforms that are capable of operating outdoors and over a wide variety of terrain.
  • Autonomous land vehicles can travel at various speeds under diverse road constructs.
  • an autonomous land vehicle can travel at the speed limit when traffic is sparse, at low speed during a traffic jam, or can stop at a traffic light.
  • the autonomous land vehicle can also travel at a constant speed, as well as accelerate or decelerate.
  • the road on which the vehicle traverses can be straight, curved, uphill, downhill, or have many undulations.
  • the number of lanes on the road can vary, and there are numerous types of road side constructs such as curbs, lawns, ditches, or pavement.
  • Objects on and off the road such as cars, cycles, and pedestrians add more complexity to the scenario. It is important to accurately classify these road elements in order that the vehicle can navigate safely.
  • a laser detection and ranging (LADAR) sensor is used to measure the range to each point within a scan that sweeps across a horizontal line.
  • On-board global positioning system (GPS) and inertial navigation system (INS) sensors provide the geo-location and dynamics of the vehicle, which includes the position and altitude of the vehicle in world coordinates, as well as the velocity and angular velocity of the vehicle.
  • This navigation system for autonomous land vehicles often has difficulty in processing the LADAR data and combining the GPS/INS data to accurately classify each range measurement in a scan into one of traversable, non-traversable, lane-mark, and obstacle classes. Classification of the range measurements based only on one input scan and its corresponding GPS/INS input is not robust enough with the diversity of vehicle states and road configurations that can be encountered.
  • An alternate navigation system classifies each range measurement in a scan based on the history of recent range scans.
  • a fixed-size history buffer is employed having a size based on a fixed number of range scans. Consequently, the distance covered by the range scans saved in this buffer depends on the speed of the vehicle. When the vehicle travels at high speed, the area coverage in a fixed number of scans is large. When the vehicle travels at slow speed, the area coverage is small. Using the scans in the fixed-size buffer for ground plane estimation causes varying degrees of inaccuracy.
  • the present invention includes a method and system that provide road and obstacle detection in navigating an autonomous vehicle.
  • the method comprises scanning and storing range scans of a fixed size area ahead of the autonomous vehicle, such as with a laser scanner, and obtaining a current range scan and its associated navigation data including dynamics, position, and orientation measurements of the autonomous vehicle.
  • the current range scan is transformed to world coordinates with respect to a reference location based on the navigation data, and when the autonomous vehicle is deemed to be non-stationary, the transformed current range scan is input into a distance-based accumulator, which has a variable size buffer.
  • a ground plane is estimated from the transformed current range scan and prior range scans stored in the variable size buffer.
  • the estimated ground plane is represented as a constrained quadratic surface, based on which the input range scan is classified into one or more of a traversable area, a non-traversable area, or an obstacle area for navigation of the autonomous vehicle.
  • FIG. 1 illustrates a laser scanning system in an autonomous land vehicle according to one embodiment
  • FIGS. 2A and 2B show exemplary laser range scans for an autonomous land vehicle traveling on a road
  • FIG. 3 is a functional block diagram showing a laser ranging process according to the present invention for road and obstacle detection in navigating an autonomous land vehicle;
  • FIG. 4 is a functional block diagram depicting a fixed distance-based accumulator process for use in the laser ranging process of FIG. 3 ;
  • FIGS. 5-7 are three-dimensional graphical representations showing the shape and curvature of quadratic surfaces calculated with various parameter values, indicating the need to constrain those parameters.
  • the present invention is directed to a method and system that apply a laser range processing technique for road and obstacle detection in navigating an autonomous vehicle through varied terrain.
  • the information obtained and used by the present laser range processing technique can be employed to build a world map of what is around the autonomous vehicle to help the vehicle successfully navigate.
  • an autonomous vehicle such as an autonomous land vehicle must know the locations of the roads, the markings on the roads, and any obstacles ahead.
  • One instrument that can be used to acquire this information is a laser scanner such as a laser ranging scanner, which scans a beam across the ground ahead of the vehicle in a back and forth sweeping motion.
  • the scanning rate can be variable, and each sweep of the scanner across the ground produces data to be processed.
  • the range measurements from one scanner sweep are referred to herein as a “range scan.”
  • the present technique processes range scan data, which can be acquired by one or more laser scanners, with navigation data to provide a situation awareness, which includes the detection of traversable areas, non-traversable areas, or obstacles.
  • Traversable areas include, for example, roads, large flat areas, and the like.
  • Non-traversable areas include, for example, road curbs, pedestrian walkways, steep slopes, and the like.
  • Obstacles include objects of a certain size and height that a vehicle cannot traverse over, such as other vehicles, pedestrians, and the like. This situation awareness allows the autonomous vehicle to plan a route for safe navigation to a desired destination.
  • the present method estimates a ground plane from a history of range scan measurements that are cumulated from recent range scans.
  • Many conventional approaches represent the ground plane with a plane surface.
  • a road, however, is often not planar, especially where it transitions into an uphill or downhill grade.
  • the technique of the invention models the road as a quadratic surface with restricted curvature.
  • the present ground plane estimation fits the road measurements into a quadratic surface that has a small curvature.
  • the present method and system employ a variable size history buffer, in which the buffer memory can hold a variable number of range scans, with the range scans being acquired for a fixed distance at a particular scan angle.
  • the number of range scans is not fixed, rather only the distance covered by the range scans is fixed.
  • the amount of memory used in the history buffer is based on the fixed distance covered at a minimum number of range scans.
  • standard techniques store a fixed number of scans in a fixed size buffer.
  • the fixed distance of the range scans used in the present method can be predetermined based on the intended environment where the autonomous vehicle will operate.
  • a fixed distance coverage area based on the range scans is saved in the history buffer.
  • When the vehicle is stationary, range measurements are not added to the history buffer. This allows for a much more consistent estimation of the ground plane based on the fixed size area, regardless of the vehicle speed, and consequently a more accurate classification of the heights of objects in the path of the vehicle. If data accumulated in the history buffer results in a variation from the predetermined fixed distance, older data is removed from the buffer so that the fixed distance can be maintained. For example, when range measurements from an input range scan are updated into the history buffer, any data outliers present in the history buffer are eliminated during each update.
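The fixed-distance history buffer behavior described above can be sketched as follows. This is an illustrative Python implementation (all names are hypothetical, and position bookkeeping is simplified to a scalar distance along the route):

```python
from collections import deque

class DistanceBasedBuffer:
    """History buffer that keeps the range scans covering a fixed distance,
    so the number of stored scans varies with vehicle speed."""

    def __init__(self, fixed_distance):
        self.fixed_distance = fixed_distance  # distance the buffer must cover
        self.scans = deque()  # entries: (position along route, scan data)

    def add(self, position, scan):
        self.scans.append((position, scan))
        # Drop the oldest scans while doing so still leaves the covered
        # distance at or above the target, keeping the span just at or
        # above the fixed distance.
        while len(self.scans) > 1 and \
                self.scans[-1][0] - self.scans[1][0] >= self.fixed_distance:
            self.scans.popleft()

    def covered_distance(self):
        if len(self.scans) < 2:
            return 0.0
        return self.scans[-1][0] - self.scans[0][0]
```

Whether the vehicle moves quickly (few scans over the distance) or slowly (many scans), the buffer always spans approximately the same fixed distance.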
  • FIG. 1 illustrates one embodiment of a laser scanning system in an autonomous land vehicle 100 for road and obstacle detection in navigating vehicle 100 .
  • the laser scanning system includes at least one laser scanner 110 mounted on vehicle 100 .
  • the laser scanner 110 is configured to scan the land ahead of vehicle 100 in a sweeping pattern, and measure ranges and intensities in a scan region 120 , with a fixed size area, ahead of vehicle 100 .
  • the fixed size area is based on a fixed distance d and a scan angle a.
  • the laser scanner 110 can include at least one light detection and ranging (LIDAR) device.
  • the laser scanner 110 is operatively coupled to a processing unit 112 in vehicle 100 .
  • the processing unit 112 can be a computer, a digital signal processor (DSP), or a field programmable gate array (FPGA), which form part of the laser scanning system.
  • the processing unit includes a variable size buffer, which is discussed in further detail hereafter.
  • An on-board navigation unit 114 in vehicle 100 is also operatively coupled to processing unit 112 .
  • the navigation unit 114 can be used to accurately determine a position of vehicle 100 , and can include one or more global positioning system (GPS) sensors, and one or more inertial navigation system (INS) sensors such as one or more inertial measurement units (IMUs).
  • GPS and INS sensors provide data related to the geo-locations and dynamics of vehicle 100 . Such data is used to determine the position and altitude of vehicle 100 in world coordinates, and the velocity and angular velocity of vehicle 100 .
  • laser scanner 110 sweeps a beam 116 across a line segment 122 of scan region 120 and measures ranges at discrete points along line segment 122 .
  • processing unit 112 synchronizes range scan inputs from laser scanner 110 and navigation data from navigation unit 114 , classifies the range scans, and transforms the classification results into world coordinates.
  • Exemplary range scans for an autonomous land vehicle traveling on a road are shown in the diagrams of FIGS. 2A and 2B .
  • FIG. 2A depicts a first range scan 210 along the road, in which the segments a-b 1 and c 1 -d represent a sidewalk on either side of the road, segments b 1 -b 2 and c 1 -c 2 represent a curb adjacent to each sidewalk, and the middle segment b 2 -c 2 represents the road.
  • FIG. 2B depicts a second range scan 220 further along the road, in which the segment e-f, in between the segment b-c, represents an obstacle such as a car on the road in front of the autonomous land vehicle.
  • the beam lines R 0 , R i , and R m extending from an origin O for each of range scans 210 and 220 , represent the distances (ranges) from the laser scanner to the points a, i, and d.
  • the angle θ i is the azimuth angle of the line O-i with respect to the laser scanner reference.
  • the method of the invention builds a three-dimensional road model from cumulated range scans, which are gathered by the laser scanner, and from geo-locations, which are obtained from the navigation unit.
  • This three-dimensional road model which represents a ground plane, is formulated as a constrained quadratic surface.
  • the inputted range scan data after being transformed into world coordinate points of the three-dimensional road model, can then be correctly classified based on heights above the ground plane.
  • FIG. 3 is a functional block diagram showing a laser ranging process 300 according to the present invention for road and obstacle detection.
  • range scans and GPS/INS measurements are input into a World Coordinate Transformation (WCT) unit 310 , which is configured to combine the range measurement data with the navigation data to determine a coordinate transformation.
  • the WCT unit 310 transforms the range measurements in a range scan, which are recorded as the azimuth angles and ranges with respect to a location of the laser scanner, to Universal Transverse Mercator (UTM) world coordinate data with respect to a reference location, e.g., the starting point of the route.
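As a rough sketch of this transformation (a 2D simplification with hypothetical names; the full transform would also apply altitude and the attitude angles from the INS):

```python
import math

def scan_to_world(ranges, azimuths, scanner_pos, heading):
    """Transform one range scan from scanner polar coordinates (range,
    azimuth) into world coordinates (e.g., UTM easting/northing) relative
    to a reference origin.

    scanner_pos: (east, north) of the scanner, from the GPS/INS data.
    heading:     vehicle heading in radians in the world frame.
    """
    east0, north0 = scanner_pos
    points = []
    for r, az in zip(ranges, azimuths):
        theta = heading + az  # rotate the scan azimuth into the world frame
        points.append((east0 + r * math.cos(theta),
                       north0 + r * math.sin(theta)))
    return points
```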
  • a distance-based accumulator 320 is configured to receive the world coordinate data from WCT unit 310 .
  • the accumulator 320 assembles the range scans into a variable size buffer 330 based on a specified fixed distance, which is a variable parameter depending on the expected road type. For example, a smooth road type dictates a shorter fixed distance while a rough and irregular road type requires a longer fixed distance.
  • variable size buffer 330 saves a fixed distance coverage area of the range scans regardless of the speed of the vehicle. For example, as shown in FIG. 1 , the distance covered, d, is kept fixed for the range scans that are saved in the buffer. The time for scanning (t 0 to t n ) and the buffer size vary. The process performed by accumulator 320 is described hereafter in further detail with respect to FIG. 4 .
  • Data from distance-based accumulator 320 and the variable size buffer 330 is fed into a quadratic ground plane estimator 340 , which estimates a ground plane from a history of road measurements that are cumulated from previous range measurement scans.
  • the ground plane estimation fits the road measurements into a quadratic surface with a restricted curvature.
  • the range measurements sometimes can be noisy, and measurements other than that from the road may be added into the ground plane estimation as data outliers. Standard best fit methods, such as least square fit, often yield a biased estimate due to the presence of the outliers.
  • the ground plane estimator 340 utilizes a modified RANSAC (RANdom SAmple Consensus) process, which is immune to outliers, to estimate the ground plane quadratic surface.
  • Data generated by the ground plane estimator 340 is input into a traversability/obstacle assessment module 350 , which also is configured to receive input range scan data transformed to UTM coordinates.
  • the assessment module 350 classifies the range scan data into traversable, non-traversable, and obstacle areas.
  • the heights of the range scan measurements from the estimated ground plane are computed and compared with predetermined height thresholds, such as a curb threshold and an obstacle threshold. For example, heights below the curb threshold are classified as traversable such as for a road, heights above the obstacle threshold are classified as obstacles and thus non-traversable, and heights in between the two thresholds are classified as non-traversable such as for a curb.
  • the height thresholds can be varied depending on the size and type of vehicle. For example, heights in between a curb threshold and an obstacle threshold can be classified as traversable when the vehicle is large enough. The height thresholds can also be ignored if necessary to avoid an obstacle (e.g., drive over a curb to avoid a pedestrian).
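The height-threshold classification described above might look like the following sketch, where the function name and threshold values are illustrative rather than taken from the patent:

```python
def classify_height(height, curb_threshold=0.15, obstacle_threshold=0.5):
    """Classify one range measurement by its height (in meters) above the
    estimated ground plane.  The threshold values are placeholders; the
    patent notes they depend on the size and type of vehicle."""
    if height < curb_threshold:
        return "traversable"      # e.g., the road surface
    if height > obstacle_threshold:
        return "obstacle"         # e.g., a car or pedestrian
    return "non-traversable"      # e.g., a curb
```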
  • Once the range scan measurements are classified into traversable, non-traversable, or obstacle areas for the autonomous vehicle, they are labeled in world coordinates and output to a path planner to determine a route for navigating the autonomous vehicle.
  • Range scan measurements in world coordinates are received by a stationary vehicle assessment module 410 , which determines whether the vehicle is stationary or moving by comparing a current range scan with an immediately preceding range scan stored in variable size buffer 330 . These two range scans will be very similar when the vehicle is stationary. One method to measure the similarity is to check whether the sum of the absolute differences between the two range scans is less than the measurement noise.
  • Another embodiment determines whether the vehicle is stationary based on the INS input, which provides the dynamics of the vehicle, such as its velocity and position. If the vehicle is stationary, a “stationary vehicle” signal 412 is returned and no change is made to buffer 330 . In addition, no new information will be output to the path planner from this stationary range scan.
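The scan-similarity test for a stationary vehicle can be sketched as (names hypothetical):

```python
def is_stationary(current_scan, previous_scan, noise_level):
    """Return True when two consecutive range scans are nearly identical:
    the sum of absolute differences is below the expected measurement
    noise, indicating the vehicle has not moved."""
    sad = sum(abs(a - b) for a, b in zip(current_scan, previous_scan))
    return sad < noise_level
```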
  • the current range scan is added to buffer 330 , and the distance covered (D c ) by the range scans stored in buffer 330 is computed by a computation module 420 .
  • For each range scan added to the buffer, a position in world coordinates is computed. This position can be the first measurement of the range scan. In such a case, no additional computation and storage are required.
  • the mean or median position of the measurement in each range scan is computed.
  • the distance covered in the buffer, D c , can then be computed as the L2 norm of the difference between the positions of the first and last range scans in the buffer.
  • a distance assessment module 430 compares a required fixed distance threshold, D thrs , with the distance covered, D c , in buffer 330 . If D c is less than D thrs , then an insufficient distance signal 432 is returned. If D c is approximately equal to D thrs , then a distance covered signal 434 is returned. When D c is greater than D thrs , a distance too far signal 436 is sent to a range scan removal module 440 , which removes older range scans from buffer 330 until a new distance, D c ′, in buffer 330 is just larger than D thrs . A distance covered signal 442 from range scan removal module 440 is then returned.
  • the quadratic ground plane estimator 340 ( FIG. 3 ) generates a ground plane representation.
  • the present method represents the ground plane, which corresponds to the road surface, as a constrained quadratic surface of the general form z = Ax² + By² + Cxy + Dx + Ey + F (equation 2), where:
  • z is the height of the plane
  • x and y are the locations at the horizontal axes
  • A and B are constrained constants
  • the constrained constants A and B can only have restricted values.
  • the shape and curvature of the quadratic surface is mainly governed by these two constrained constants.
  • FIGS. 5-7 are three-dimensional graphical representations showing the shape and curvature of quadratic surfaces.
  • depending on the signs of the constrained constants, the quadratic surface can be concave, as shown in the diagram of FIG. 5 , or convex, as shown in the diagram of FIG. 6 .
  • the curvature of the quadratic surface can also be determined by the values of constrained constants A and B, as shown in the diagram of FIG. 7 .
  • a road is neither concave nor convex in this way; obviously, a road does not resemble the surfaces of FIGS. 5 and 6 because of their high curvatures.
  • the values of A and B must be constrained in order to properly represent a road surface.
  • the values of constrained constants A and B will be confined to a range of values, during the ground plane estimation process.
  • the range of values for A and B can be determined based on the types and construct of the road, such as the number of lanes, straight or curved road, etc. Without the knowledge of the expected road type and construct, one can restrict the values of constrained constants A and B to be small.
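To make the constrained-constant criterion concrete, the following sketch evaluates a general quadratic surface z = Ax² + By² + Cxy + Dx + Ey + F and checks the constraint on A and B. The exact form of the patent's equation (2) and the numeric bound on the curvature constants are assumptions here:

```python
def quad_surface(x, y, coeffs):
    """Height z of the surface z = A*x^2 + B*y^2 + C*x*y + D*x + E*y + F."""
    a, b, c, d, e, f = coeffs
    return a * x * x + b * y * y + c * x * y + d * x + e * y + f

def curvature_constrained(coeffs, max_curvature=0.01):
    """Constrained-constant criterion: A and B, which govern the curvature,
    must stay within a small range for the surface to plausibly be a road."""
    a, b = coeffs[0], coeffs[1]
    return abs(a) <= max_curvature and abs(b) <= max_curvature
```

A flat road corresponds to A and B near zero; large values of either constant produce the highly curved surfaces of FIGS. 5 and 6 and are rejected.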
  • the ground plane estimator utilizes a modified RANSAC process.
  • Laser measurements are typically noisy, e.g., due to wetness, roughness, scatter, and irregular reflectance of the scanned surfaces, which can be a road, other vehicles, or a sidewalk. Many outliers that are not the elements of a road are measured and collected in the buffer. In this situation, the conventional least square fit method to estimate the quadratic ground plane does not work well. Hence, the method of the invention applies a modified RANSAC process.
  • the standard RANSAC process uses an iterative technique to estimate parameters of a mathematical model from a set of observed data.
  • a randomly sampled population is selected from the entire population, which are the measurements of the range scans in the buffer.
  • a technique, such as least square fit, is used to estimate the quadratic ground plane (all the constants in equation 2).
  • the error of the entire population as fitted into the estimated ground plane is computed. If the error is less than a predefined threshold or a maximum number of iterations is reached, the best ground plane estimate is outputted. If the error exceeds the threshold, the best ground plane estimate so far is kept and another iteration is exercised.
  • a basic assumption of the RANSAC process is that the data contains “inliers” which are data points that can be explained by some set of model parameters, and “outliers” which are data points that do not fit the model.
  • the RANSAC process also assumes that, given a set of inliers, there exists a procedure which can estimate the parameters of a model that optimally explains or fits the data. Further details regarding the RANSAC process are in an article by M. A. Fischler, R. C. Bolles, “Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography,” Comm. of the ACM 24: 381-395 (June 1981), which is incorporated herein by reference.
  • the modified RANSAC process utilized in the present method changes two elements of the standard RANSAC process.
  • the first change is to the definition of the best ground plane estimate.
  • in the standard RANSAC process, the best ground plane estimate is the one that has the minimum error.
  • in the modified process, the best ground plane estimate is the one that has the minimum error and for which the constrained constant criterion is met.
  • the second changed element is the sampling process.
  • the entire population (all range scans) is divided into n number of bins, with each bin having range scans that cover approximately equal distance traveled. Then, range scans from each bin are randomly sampled to form the sample space for the least square fit. This sampling technique assures that the measurements are obtained from the full spectrum of the buffer for each iteration of the RANSAC process.
  • the present method stores in the buffer only those measurements of the range scans that belong to the estimated ground plane. This avoids the sampling of the range scan measurements from the non-ground plane elements, which need to be rejected as outliers during the RANSAC process anyway.
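A sketch of this modified RANSAC loop, combining the binned sampling with the constrained-constant check (function names, thresholds, bin handling, and the least square fit via `numpy` are all illustrative):

```python
import random

import numpy as np

def fit_quadratic(points):
    """Least square fit of z = A*x^2 + B*y^2 + C*x*y + D*x + E*y + F."""
    pts = np.asarray(points, dtype=float)
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    design = np.column_stack([x * x, y * y, x * y, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(design, z, rcond=None)
    return coeffs

def surface_error(points, coeffs):
    """Mean absolute height error of the whole population from the surface."""
    pts = np.asarray(points, dtype=float)
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    a, b, c, d, e, f = coeffs
    z_hat = a * x * x + b * y * y + c * x * y + d * x + e * y + f
    return float(np.mean(np.abs(z - z_hat)))

def modified_ransac(bins, max_curvature=0.01, error_threshold=0.05,
                    max_iterations=50, samples_per_bin=5, rng=None):
    """Modified RANSAC: sample from every distance bin so each fit spans the
    full buffer, and accept an estimate only when the constrained-constant
    criterion on A and B is met.  `bins` is a list of lists of (x, y, z)."""
    rng = rng or random.Random(0)
    population = [p for b in bins for p in b]
    best_coeffs, best_error = None, float("inf")
    for _ in range(max_iterations):
        sample = []
        for b in bins:  # draw from each bin: full spectrum of the buffer
            sample.extend(rng.sample(b, min(samples_per_bin, len(b))))
        coeffs = fit_quadratic(sample)
        if abs(coeffs[0]) > max_curvature or abs(coeffs[1]) > max_curvature:
            continue  # reject surfaces with implausible road curvature
        err = surface_error(population, coeffs)
        if err < best_error:
            best_coeffs, best_error = coeffs, err
        if best_error < error_threshold:
            break  # good enough estimate found
    return best_coeffs, best_error
```

Because outliers influence a single least square fit strongly, repeatedly fitting random samples and keeping the best constrained estimate yields a ground plane that is far less sensitive to non-road measurements.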
  • Instructions for carrying out the various process tasks, calculations, and generation of signals and other data used in the operation of the method and system of the invention can be implemented in software, firmware, or other computer readable instructions. These instructions are typically stored on any appropriate computer readable media used for storage of computer readable instructions or data structures. Such computer readable media can be any available media that can be accessed by a general purpose or special purpose computer or processor, or any programmable logic device.
  • Suitable computer readable media may comprise, for example, non-volatile memory devices including semiconductor memory devices such as EPROM, EEPROM, or flash memory devices; magnetic disks such as internal hard disks or removable disks; magneto-optical disks; CDs, DVDs, or other optical storage disks; nonvolatile ROM, RAM, and other like media; or any other media that can be used to carry or store desired program code in the form of computer executable instructions or data structures. Any of the foregoing may be supplemented by, or incorporated in, specially-designed application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs).
  • the method of the invention can be implemented by computer executable instructions, such as program modules, which are executed by a processor.
  • program modules include routines, programs, objects, data components, data structures, algorithms, and the like, which perform particular tasks or implement particular abstract data types.
  • Computer executable instructions, associated data structures, and program modules represent examples of program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A method and system provide road and obstacle detection in navigating an autonomous vehicle. The method comprises scanning a distance ahead of the autonomous vehicle to obtain a current range scan, and obtaining navigation data, including dynamics, position, and orientation measurements of the autonomous vehicle. The current range scan is transformed to world coordinates with respect to a reference location based on the navigation data, and the transformed current range scan is input into a distance-based accumulator. The transformed current range scan is added to a variable size buffer when the autonomous vehicle is deemed to be non-stationary. A ground plane is estimated from the transformed current range scan and prior range scans stored in the variable size buffer. The estimated ground plane is represented as a constrained quadratic surface, based on which the range scan data is classified into one or more of a traversable area, a non-traversable area, or an obstacle area for navigation of the autonomous vehicle.

Description

    GOVERNMENT LICENSE RIGHTS
  • The U.S. Government may have certain rights in the present invention as provided for by the terms of contract number HR0011-06-9-0011 with DARPA.
  • BACKGROUND
  • Unmanned ground vehicles (UGVs) include remote-driven or self-driven land vehicles that can carry cameras, sensors, communications equipment, or other payloads. Self-driven or “autonomous” land vehicles are essentially robotic platforms that are capable of operating outdoors and over a wide variety of terrain.
  • Autonomous land vehicles can travel at various speeds under diverse road constructs. For example, an autonomous land vehicle can travel at the speed limit when traffic is sparse, at low speed during a traffic jam, or can stop at a traffic light. The autonomous land vehicle can also travel at a constant speed, as well as accelerate or decelerate. The road on which the vehicle traverses can be straight, curved, uphill, downhill, or have many undulations. The number of lanes on the road can vary, and there are numerous types of road side constructs such as curbs, lawns, ditches, or pavement. Objects on and off the road such as cars, cycles, and pedestrians add more complexity to the scenario. It is important to accurately classify these road elements in order that the vehicle can navigate safely.
  • In one navigation system for autonomous land vehicles, a laser detection and ranging (LADAR) sensor is used to measure the range to each point within a scan that sweeps across a horizontal line. On-board global positioning system (GPS) and inertial navigation system (INS) sensors provide the geo-location and dynamics of the vehicle, which includes the position and altitude of the vehicle in world coordinates, as well as the velocity and angular velocity of the vehicle. This navigation system for autonomous land vehicles often has difficulty in processing the LADAR data and combining the GPS/INS data to accurately classify each range measurement in a scan into one of traversable, non-traversable, lane-mark, and obstacle classes. Classification of the range measurements based only on one input scan and its corresponding GPS/INS input is not robust enough with the diversity of vehicle states and road configurations that can be encountered.
  • An alternate navigation system classifies each range measurement in a scan based on the history of recent range scans. A fixed-size history buffer is employed having a size based on a fixed number of range scans. Consequently, the distance covered by the range scans saved in this buffer depends on the speed of the vehicle. When the vehicle travels at high speed, the area coverage in a fixed number of scans is large. When the vehicle travels at slow speed, the area coverage is small. Using the scans in the fixed-size buffer for ground plane estimation causes varying degrees of inaccuracy.
  • SUMMARY
  • The present invention includes a method and system that provide road and obstacle detection in navigating an autonomous vehicle. The method comprises scanning and storing range scans of a fixed size area ahead of the autonomous vehicle, such as with a laser scanner, and obtaining a current range scan and its associated navigation data including dynamics, position, and orientation measurements of the autonomous vehicle. The current range scan is transformed to world coordinates with respect to a reference location based on the navigation data, and when the autonomous vehicle is deemed to be non-stationary, the transformed current range scan is input into a distance-based accumulator, which has a variable size buffer. A ground plane is estimated from the transformed current range scan and prior range scans stored in the variable size buffer. The estimated ground plane is represented as a constrained quadratic surface, based on which the input range scan is classified into one or more of a traversable area, a non-traversable area, or an obstacle area for navigation of the autonomous vehicle.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Features of the present invention will become apparent to those skilled in the art from the following description with reference to the drawings. Understanding that the drawings depict only typical embodiments of the invention and are not therefore to be considered limiting in scope, the invention will be described with additional specificity and detail through the use of the accompanying drawings, in which:
  • FIG. 1 illustrates a laser scanning system in an autonomous land vehicle according to one embodiment;
  • FIGS. 2A and 2B show exemplary laser range scans for an autonomous land vehicle traveling on a road;
  • FIG. 3 is a functional block diagram showing a laser ranging process according to the present invention for road and obstacle detection in navigating an autonomous land vehicle;
  • FIG. 4 is a functional block diagram depicting a fixed distance-based accumulator process for use in the laser ranging process of FIG. 3; and
  • FIGS. 5-7 are three-dimensional graphical representations showing the shape and curvature of quadratic surfaces calculated with various parameter values, indicating the need to constrain those parameters.
  • DETAILED DESCRIPTION
  • In the following detailed description, embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. It is to be understood that other embodiments may be utilized without departing from the scope of the present invention. The following description is, therefore, not to be taken in a limiting sense.
  • The present invention is directed to a method and system that apply a laser range processing technique for road and obstacle detection in navigating an autonomous vehicle through varied terrain. The information obtained and used by the present laser range processing technique can be employed to build a world map of what is around the autonomous vehicle to help the vehicle successfully navigate.
  • In order to implement a plan for reaching a destination safely, an autonomous vehicle such as an autonomous land vehicle must know the locations of the roads, the markings on the roads, and any obstacles ahead. One instrument that can be used to acquire this information is a laser scanner such as a laser ranging scanner, which scans a beam across the ground ahead of the vehicle in a back and forth sweeping motion. The scanning rate can be variable, and each sweep of the scanner across the ground produces data to be processed. The range measurements from one scanner sweep are referred to herein as a "range scan."
  • The present technique processes range scan data, which can be acquired by one or more laser scanners, with navigation data to provide a situation awareness, which includes the detection of traversable areas, non-traversable areas, or obstacles. Traversable areas include, for example, roads, large flat areas, and the like. Non-traversable areas include, for example, road curbs, pedestrian walkways, steep slopes, and the like. Obstacles include objects of a certain size and height that a vehicle cannot traverse over, such as other vehicles, pedestrians, and the like. This situation awareness allows the autonomous vehicle to plan a route for safe navigation to a desired destination.
  • The present method estimates a ground plane from a history of range scan measurements that are cumulated from recent range scans. Many conventional approaches represent the ground plane with a plane surface. A road is often not planar, especially where a turn transitions into uphill or downhill. The technique of the invention models the road as a quadratic surface with restricted curvature. Hence, the present ground plane estimation fits the road measurements into a quadratic surface that has a small curvature.
  • The present method and system employ a variable size history buffer, in which the buffer memory can hold a variable number of range scans, with the range scans being acquired for a fixed distance at a particular scan angle. Thus, the number of range scans is not fixed; rather, only the distance covered by the range scans is fixed. The amount of memory used in the history buffer is based on the fixed distance covered at a minimum number of range scans. In contrast, standard techniques store a fixed number of scans in a fixed size buffer.
  • The fixed distance of the range scans used in the present method can be predetermined based on the intended environment where the autonomous vehicle will operate. When an autonomous vehicle is moving, a fixed distance coverage area based on the range scans is saved in the history buffer. When the autonomous vehicle is stationary, range measurements are not added to the history buffer. This allows for a much more consistent estimation of the ground plane based on the fixed size area, regardless of the vehicle speed, and consequently a more accurate classification of heights of objects in the path of the vehicle. If data accumulated in the history buffer results in a variation from the predetermined fixed distance, older data is removed from the buffer so that the fixed distance can be maintained. For example, when range measurements from an input range scan are updated into the history buffer, any data outliers present in the history buffer are eliminated during each update.
  • The method of the invention can be implemented in software, firmware, or embedded in application specific hardware. Various implementations of the present invention are described in further detail hereafter with respect to the drawings.
  • FIG. 1 illustrates one embodiment of a laser scanning system in an autonomous land vehicle 100 for road and obstacle detection in navigating vehicle 100. The laser scanning system includes at least one laser scanner 110 mounted on vehicle 100. The laser scanner 110 is configured to scan the land ahead of vehicle 100 in a sweeping pattern, and measure ranges and intensities in a scan region 120, with a fixed size area, ahead of vehicle 100. The fixed size area is based on a fixed distance d and a scan angle a. In one embodiment, the laser scanner 110 can include at least one light detection and ranging (LIDAR) device.
  • The laser scanner 110 is operatively coupled to a processing unit 112 in vehicle 100. The processing unit 112 can be a computer, a digital signal processor (DSP), or a field programmable gate array (FPGA), which forms part of the laser scanning system. The processing unit includes a variable size buffer, which is discussed in further detail hereafter.
  • An on-board navigation unit 114 in vehicle 100 is also operatively coupled to processing unit 112. The navigation unit 114 can be used to accurately determine a position of vehicle 100, and can include one or more global positioning system (GPS) sensors, and one or more inertial navigation system (INS) sensors such as one or more inertial measurement units (IMUs). The GPS and INS sensors provide data related to the geo-locations and dynamics of vehicle 100. Such data is used to determine the position and altitude of vehicle 100 in world coordinates, and the velocity and angular velocity of vehicle 100.
  • As vehicle 100 is traveling, laser scanner 110 sweeps a beam 116 across a line segment 122 of scan region 120 and measures ranges at discrete points along line segment 122. As discussed further hereafter, processing unit 112 synchronizes range scan inputs from laser scanner 110 and from navigation unit 114, classifies the range scans, and transforms the classification results into world coordinates.
  • Exemplary range scans for an autonomous land vehicle traveling on a road are shown in the diagrams of FIGS. 2A and 2B. FIG. 2A depicts a first range scan 210 along the road, in which the segments a-b1 and c1-d represent a sidewalk on either side of the road, segments b1-b2 and c1-c2 represent a curb adjacent to each sidewalk, and the middle segment b2-c2 represents the road. FIG. 2B depicts a second range scan 220 further along the road, in which the segment e-f, in between the segment b-c, represents an obstacle such as a car on the road in front of the autonomous land vehicle. In FIGS. 2A and 2B, the beam lines R0, Ri, and Rm, extending from an origin O for each of range scans 210 and 220, represent the distances (ranges) from the laser scanner to the points a, i, and d. The angle αi is the azimuth angle of the line O-i with respect to the laser scanner reference.
  • Due to noise in the range measurements, as well as the configuration and condition of roads and sidewalks, classification of traversable and non-traversable areas based on only one range scan is not reliable and robust. Accordingly, the method of the invention builds a three-dimensional road model from cumulated range scans, which are gathered by the laser scanner, and from geo-locations, which are obtained from the navigation unit. This three-dimensional road model, which represents a ground plane, is formulated as a constrained quadratic surface. The inputted range scan data, after being transformed into world coordinate points of the three-dimensional road model, can then be correctly classified based on heights above the ground plane.
  • FIG. 3 is a functional block diagram showing a laser ranging process 300 according to the present invention for road and obstacle detection. Initially, range scans and GPS/INS measurements are input into a World Coordinate Transformation (WCT) unit 310, which is configured to combine the range measurement data with the navigation data to determine a coordinate transformation. The WCT unit 310 transforms the range measurements in a range scan, which are recorded as the azimuth angles and ranges with respect to a location of the laser scanner, to Universal Transverse Mercator (UTM) world coordinate data with respect to a reference location, e.g., the starting point of the route. This transformation is a standard technique given the vehicle's UTM coordinate, a lever arm from the navigation unit to the laser scanner, and the range scan measurements.
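  • As a simplified illustration of the coordinate transformation performed by the WCT unit, the following sketch rotates and translates one range scan into world coordinates. It is a 2-D approximation only: the patent's WCT unit works in full UTM coordinates and accounts for the lever arm between the navigation unit and the laser scanner, which are omitted here, and all function and parameter names are illustrative.

```python
import math

def scan_to_world(ranges, azimuths, vehicle_pose):
    """Transform one range scan to world coordinates (2-D sketch).

    vehicle_pose is (x, y, heading) of the scanner in world
    coordinates; azimuths are in radians relative to the scanner axis.
    """
    vx, vy, heading = vehicle_pose
    points = []
    for r, a in zip(ranges, azimuths):
        # Scanner-frame point rotated by the vehicle heading,
        # then translated to the vehicle's world position.
        theta = heading + a
        points.append((vx + r * math.cos(theta), vy + r * math.sin(theta)))
    return points
```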
  • A distance-based accumulator 320 is configured to receive the world coordinate data from WCT unit 310. The accumulator 320 assembles the range scans into a variable size buffer 330 based on a specified fixed distance, which is a variable parameter depending on the expected road type. For example, a smooth road type dictates a shorter fixed distance while a rough and irregular road type requires a longer fixed distance.
  • The variable size buffer 330 saves a fixed distance coverage area of the range scans regardless of the speed of the vehicle. For example, as shown in FIG. 1, the distance covered, d, is kept fixed for the range scans that are saved in the buffer. The time for scanning (t0 to tn) and the buffer size vary. The process performed by accumulator 320 is described hereafter in further detail with respect to FIG. 4.
  • Data from distance-based accumulator 320 and the variable size buffer 330 is fed into a quadratic ground plane estimator 340, which estimates a ground plane from a history of road measurements that are cumulated from previous range measurement scans. The ground plane estimation fits the road measurements into a quadratic surface with a restricted curvature. The range measurements sometimes can be noisy, and measurements other than that from the road may be added into the ground plane estimation as data outliers. Standard best fit methods, such as least square fit, often yield a biased estimate due to the presence of the outliers. Thus, the ground plane estimator 340 utilizes a modified RANSAC (RANdom SAmple Consensus) process, which is immune to outliers, to estimate the ground plane quadratic surface. The process performed by ground plane estimator 340 is described hereafter in further detail with respect to FIGS. 5-7.
  • Data generated by the ground plane estimator 340 is input into a traversability/obstacle assessment module 350, which also is configured to receive input range scan data transformed to UTM coordinates. The assessment module 350 classifies the range scan data into traversable, non-traversable, and obstacle areas. The heights of the range scan measurements from the estimated ground plane are computed and compared with predetermined height thresholds, such as a curb threshold and an obstacle threshold. For example, heights below the curb threshold are classified as traversable such as for a road, heights above the obstacle threshold are classified as obstacles and thus non-traversable, and heights in between the two thresholds are classified as non-traversable such as for a curb. The height thresholds can be varied depending on the size and type of vehicle. For example, heights in between a curb threshold and an obstacle threshold can be classified as traversable when the vehicle is large enough. The height thresholds can also be ignored if necessary to avoid an obstacle (e.g., drive over a curb to avoid a pedestrian).
  • Once the range scan measurements are classified into traversable, non-traversable, or obstacle areas for the autonomous vehicle, the range scan measurements are labeled in world coordinates and output to a path planner to determine a route for navigating the autonomous vehicle.
  • Turning to FIG. 4, a process 400 performed by the distance-based accumulator is depicted in further detail. Range scan measurements in world coordinates are received by a stationary vehicle assessment module 410, which determines whether the vehicle is stationary or moving by comparing a current range scan with an immediately preceding range scan stored in variable size buffer 330. These two range scans will be very similar when the vehicle is stationary. One method of measuring the similarity is to check that the sum of the absolute differences between the two range scans is less than the measurement noise. Another embodiment determines whether the vehicle is stationary based on the INS input, which provides the dynamics of the vehicle, such as its velocity and position. If the vehicle is stationary, a "stationary vehicle" signal 412 is returned and no change is made to buffer 330. In addition, no new information is output to the path planner from this stationary range scan.
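  • The sum-of-absolute-differences similarity test for a stationary vehicle might be sketched as follows; the function name and noise parameter are illustrative.

```python
def is_stationary(current_scan, previous_scan, noise_level):
    """Stationary-vehicle test: the sum of absolute differences
    between consecutive range scans falls below the expected
    measurement noise."""
    sad = sum(abs(c - p) for c, p in zip(current_scan, previous_scan))
    return sad < noise_level
```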
  • If the vehicle is deemed non-stationary, the current range scan is added to buffer 330, and the distance covered (Dc) by the range scans stored in buffer 330 is computed by a computation module 420. For each range scan, a position in world coordinates is computed. This position can be the first measurement of the range scan, in which case no additional computation or storage is required. In an alternate approach, the mean or median position of the measurements in each range scan is computed. The distance covered in the buffer, Dc, can then be computed as the L2 norm between the positions of the first and last range scans in the buffer.
  • A distance assessment module 430 compares a required fixed distance threshold, Dthrs, with the distance covered, Dc, in buffer 330. If Dc is less than Dthrs, then an insufficient distance signal 432 is returned. If Dc is approximately equal to Dthrs, then a distance covered signal 434 is returned. When Dc is greater than Dthrs, a distance too far signal 436 is sent to a range scan removal module 440, which removes older range scans from buffer 330 until a new distance, Dc′, in buffer 330 is just larger than Dthrs. A distance covered signal 442 from range scan removal module 440 is then returned.
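  • The accumulator bookkeeping described above, computing the covered distance Dc as an L2 norm between scan positions and removing older scans when Dc exceeds Dthrs, can be sketched as follows. The class and parameter names are illustrative, and each scan's position is taken to be its first measurement in world coordinates, one of the options the text mentions.

```python
from collections import deque

class DistanceBasedBuffer:
    """Variable-size buffer holding range scans that cover a fixed
    distance. Each scan is stored with a representative world-frame
    position; older scans are removed until the covered distance is
    just larger than the fixed threshold (Dthrs)."""

    def __init__(self, fixed_distance):
        self.fixed_distance = fixed_distance   # Dthrs in the text
        self.scans = deque()                   # (position, measurements) pairs

    def add(self, position, measurements):
        self.scans.append((position, measurements))
        # Remove oldest scans while the buffer without them still
        # covers more than the fixed distance.
        while len(self.scans) > 2 and self._covered(skip_oldest=True) > self.fixed_distance:
            self.scans.popleft()

    def _covered(self, skip_oldest=False):
        start = 1 if skip_oldest else 0
        (x0, y0), _ = self.scans[start]
        (x1, y1), _ = self.scans[-1]
        return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5  # L2 norm

    def covered_distance(self):
        return self._covered() if len(self.scans) >= 2 else 0.0
```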
  • When buffer 330 has saved enough range scans that cover the required fixed distance, the quadratic ground plane estimator 340 (FIG. 3) generates a ground plane representation.
  • Most conventional ground plane representations are based on a planar formula:

  • z=Ax+By+C   (1)
  • where
      • z is the height of the plane,
      • x and y are the locations at the horizontal axes, and
      • A, B and C are constants.
        The estimation process then computes the constants, A, B, and C, such that the range scan measurements in the buffer best fit a plane according to equation (1).
  • In reality, most roads are not constructed as plane surfaces, but rather as curved surfaces, such as for drainage purposes. This is particularly true at the transition from a level road to uphill or downhill. Thus, the present method represents the ground plane, which corresponds to the road surface, as a constrained quadratic surface defined by:

  • z=Ax²+By²+Cx+Dy+E   (2)
  • where
      • z is the height of the plane,
      • x and y are the locations at the horizontal axes,
      • A and B are constrained constants, and
      • C, D and E are constants.
  • The constrained constants A and B can only have restricted values. The shape and curvature of the quadratic surface is mainly governed by these two constrained constants. FIGS. 5-7 are three-dimensional graphical representations showing the shape and curvature of quadratic surfaces. When constrained constants A and B are positive, the quadratic surface is concave, as shown in the diagram of FIG. 5. When constrained constants A and B are negative, the quadratic surface is convex, as shown in the diagram of FIG. 6. In addition, the curvature of the quadratic surface can also be determined by the values of constrained constants A and B, as shown in the diagram of FIG. 7. In general, a road is not concave, and a road obviously does not resemble the highly curved surfaces of FIGS. 5 and 6. Hence, the values of A and B must be constrained in order to properly represent a road surface.
  • As most roads are designed for vehicle transportation, the values of constrained constants A and B are confined to a range of values during the ground plane estimation process. The range of values for A and B can be determined based on the type and construct of the road, such as the number of lanes and whether the road is straight or curved. Without knowledge of the expected road type and construct, the values of constrained constants A and B can simply be restricted to small values.
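  • One simple way to realize such a constrained fit, assuming an unconstrained least-squares fit followed by clipping of the curvature constants, is sketched below. The curvature limit is an illustrative placeholder, since the patent only requires that A and B remain small; the patent itself estimates the surface within a modified RANSAC loop rather than with a single fit.

```python
import numpy as np

def fit_constrained_quadratic(points, curvature_limit=0.01):
    """Least-squares fit of z = A*x^2 + B*y^2 + C*x + D*y + E, with
    the curvature constants A and B clipped to a small range.

    points is an iterable of (x, y, z) triples; the curvature limit
    is an illustrative placeholder."""
    pts = np.asarray(points, dtype=float)
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    # Design matrix for the quadratic surface of equation (2).
    M = np.column_stack([x**2, y**2, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(M, z, rcond=None)
    # Constrain the curvature constants A and B to small values.
    coeffs[0] = np.clip(coeffs[0], -curvature_limit, curvature_limit)
    coeffs[1] = np.clip(coeffs[1], -curvature_limit, curvature_limit)
    return coeffs  # A, B, C, D, E
```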
  • As mentioned previously, the ground plane estimator utilizes a modified RANSAC process. Laser measurements are typically noisy, e.g., due to wetness, roughness, scatter, and irregular reflectance of the scanned surfaces, which can be a road, other vehicles, or a sidewalk. Many outliers that are not the elements of a road are measured and collected in the buffer. In this situation, the conventional least square fit method to estimate the quadratic ground plane does not work well. Hence, the method of the invention applies a modified RANSAC process.
  • The standard RANSAC process uses an iterative technique to estimate parameters of a mathematical model from a set of observed data. In each iteration, a randomly sampled population is selected from the entire population, which consists of the measurements of the range scans in the buffer. Then, using only the sampled population, a technique such as least square fit is used to estimate the quadratic ground plane (all the constants in equation 2). The error of the entire population as fitted into the estimated ground plane is computed. If the error is less than a predefined threshold or a maximum number of iterations is reached, the best ground plane estimate is outputted. If the error exceeds the threshold, the best ground plane estimate so far is kept and another iteration is performed.
  • A basic assumption of the RANSAC process is that the data contains “inliers” which are data points that can be explained by some set of model parameters, and “outliers” which are data points that do not fit the model. The RANSAC process also assumes that, given a set of inliers, there exists a procedure which can estimate the parameters of a model that optimally explains or fits the data. Further details regarding the RANSAC process are in an article by M. A. Fischler, R. C. Bolles, “Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography,” Comm. of the ACM 24: 381-395 (June 1981), which is incorporated herein by reference.
  • The modified RANSAC process utilized in the present method changes two elements of the standard RANSAC process. The first change is to the definition of the best ground plane estimate. In most standard approaches, the best ground plane estimate is the one that has the minimum error. In the present approach, the definition of the best ground plane estimate is the one that has the minimum error and for which the constrained constant criterion is met. The second changed element is the sampling process. The entire population (all range scans) is divided into n number of bins, with each bin having range scans that cover approximately equal distance traveled. Then, range scans from each bin are randomly sampled to form the sample space for the least square fit. This sampling technique assures that the measurements are obtained from the full spectrum of the buffer for each iteration of the RANSAC process.
  • Even though the standard RANSAC process, by its random sampling nature, eventually can arrive at a similar sampling coverage, the present modified RANSAC process reaches the optimal solution faster, which is essential for a real-time process.
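  • The modified RANSAC loop, with its binned sampling over approximately equal distances and its constrained-constant acceptance test, might be sketched as follows. The fitting, error, and constraint functions are passed in as parameters here, and all names and default values are illustrative rather than taken from the patent.

```python
import random
import numpy as np

def modified_ransac(scans, fit_fn, error_fn, constraint_fn,
                    n_bins=5, iters=50, error_threshold=0.05):
    """Sketch of the modified RANSAC process.

    scans: list of range scans (arrays of 3-D points) ordered by
    distance traveled; split into n_bins bins of roughly equal
    coverage, with one scan sampled from each bin per iteration so
    the fit spans the whole buffer. fit_fn fits a surface model to
    sampled points, error_fn scores a model over all points, and
    constraint_fn checks the constrained curvature constants."""
    bins = np.array_split(np.arange(len(scans)), n_bins)
    best_model, best_error = None, float("inf")
    all_points = np.vstack(scans)
    for _ in range(iters):
        # One randomly chosen scan from every distance bin.
        sample = np.vstack([scans[random.choice(b)] for b in bins if len(b)])
        model = fit_fn(sample)
        if not constraint_fn(model):
            continue  # reject fits violating the curvature constraint
        err = error_fn(model, all_points)
        if err < best_error:
            best_model, best_error = model, err
        if best_error < error_threshold:
            break  # good enough for early termination
    return best_model, best_error
```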
  • In one implementation, the present method stores in the buffer only those measurements of the range scans that belong to the estimated ground plane. This avoids the sampling of the range scan measurements from the non-ground plane elements, which need to be rejected as outliers during the RANSAC process anyway.
  • Instructions for carrying out the various process tasks, calculations, and generation of signals and other data used in the operation of the method and system of the invention can be implemented in software, firmware, or other computer readable instructions. These instructions are typically stored on any appropriate computer readable media used for storage of computer readable instructions or data structures. Such computer readable media can be any available media that can be accessed by a general purpose or special purpose computer or processor, or any programmable logic device.
  • Suitable computer readable media may comprise, for example, non-volatile memory devices including semiconductor memory devices such as EPROM, EEPROM, or flash memory devices; magnetic disks such as internal hard disks or removable disks; magneto-optical disks; CDs, DVDs, or other optical storage disks; nonvolatile ROM, RAM, and other like media; or any other media that can be used to carry or store desired program code in the form of computer executable instructions or data structures. Any of the foregoing may be supplemented by, or incorporated in, specially-designed application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a computer readable medium. Thus, any such connection is properly termed a computer readable medium. Combinations of the above are also included within the scope of computer readable media.
  • The method of the invention can be implemented by computer executable instructions, such as program modules, which are executed by a processor. Generally, program modules include routines, programs, objects, data components, data structures, algorithms, and the like, which perform particular tasks or implement particular abstract data types. Computer executable instructions, associated data structures, and program modules represent examples of program code means for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
  • The present invention may be embodied in other specific forms without departing from its essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is therefore indicated by the appended claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (20)

1. A method for road and obstacle detection in navigating an autonomous vehicle, the method comprising:
scanning a distance ahead of the autonomous vehicle to obtain a current range scan;
obtaining navigation data, including dynamics, position, and orientation measurements of the autonomous vehicle, during the scanning of the distance ahead;
transforming the current range scan to world coordinates with respect to a reference location based on the navigation data;
inputting the transformed current range scan into a distance-based accumulator;
adding the transformed current range scan to a variable size buffer when the autonomous vehicle is deemed to be non-stationary;
estimating a ground plane from the transformed current range scan and prior range scans stored in the variable size buffer;
representing the estimated ground plane as a constrained quadratic surface; and
classifying the constrained quadratic surface into one or more of a traversable area, a non-traversable area, or an obstacle area for navigation of the autonomous vehicle.
2. The method of claim 1, wherein the distance ahead is scanned by at least one laser scanner mounted on the autonomous vehicle.
3. The method of claim 2, wherein the laser scanner comprises a light detection and ranging device.
4. The method of claim 1, wherein the navigation data is obtained from at least one global positioning system sensor, and at least one inertial navigation system sensor.
5. The method of claim 1, wherein the distance-based accumulator comprises instructions for:
comparing the transformed current range scan with an immediately preceding range scan stored in the variable size buffer to determine whether the autonomous vehicle is stationary or non-stationary;
computing a distance covered by the transformed current range scan and the prior range scans stored in the variable size buffer;
comparing a fixed distance threshold with the distance covered; and
removing one or more of the prior range scans from the variable size buffer when the distance covered is greater than the fixed distance threshold.
6. The method of claim 1, further comprising running a modified RANSAC process to estimate the constrained quadratic surface.
7. The method of claim 1, wherein the estimated ground plane corresponds to a road surface on which the autonomous vehicle is traveling.
8. The method of claim 7, wherein the estimated ground plane is represented by a three-dimensional road model corresponding to the constrained quadratic surface.
9. The method of claim 1, further comprising outputting one or more map labels in world coordinates based on the transformed current range scan and the prior range scans.
10. The method of claim 9, wherein classifying the constrained quadratic surface comprises computing a height of the map labels above the estimated ground plane.
11. The method of claim 10, further comprising comparing the height of the map labels with one or more height thresholds for the traversable area, the non-traversable area, and the obstacle area.
12. A computer program product, comprising:
a computer readable medium having instructions stored thereon for a method according to claim 1.
13. A system for road and obstacle detection in navigating an autonomous vehicle, the system comprising:
at least one laser scanner mounted on an autonomous vehicle, the laser scanner configured to perform multiple range scans of a fixed size area ahead of the autonomous vehicle;
a navigation unit in the autonomous vehicle, the navigation unit configured to obtain navigation data, including dynamics, position, and orientation measurements of the autonomous vehicle, during the range scans of the fixed size area;
at least one processing unit in the autonomous vehicle and in operative communication with the laser scanner and the navigation unit, the processing unit including a variable size buffer and configured to execute instructions to:
synchronize a current range scan from the laser scanner with navigation data obtained during the current range scan;
transform the current range scan to world coordinates with respect to a reference location based on the navigation data;
input the transformed current range scan into a distance-based accumulator;
add the transformed current range scan to the variable size buffer when the autonomous vehicle is deemed to be non-stationary;
estimate a ground plane from the transformed current range scan and prior range scans stored in the variable size buffer;
represent the estimated ground plane as a constrained quadratic surface; and
classify the constrained quadratic surface into one or more of a traversable area, a non-traversable area, or an obstacle area for navigation of the autonomous vehicle.
14. The system of claim 13, wherein the at least one laser scanner comprises a light detection and ranging device.
15. The system of claim 13, wherein the navigation unit comprises one or more global positioning system sensors, and one or more inertial navigation system sensors.
16. The system of claim 13, wherein the processing unit comprises a computer, a digital signal processor, or a field programmable gate array.
17. The system of claim 13, wherein the distance-based accumulator comprises instructions to:
compare the transformed current range scan with an immediately preceding range scan stored in the variable size buffer to determine whether the autonomous vehicle is stationary or non-stationary;
compute a distance covered by the transformed current range scan and the prior range scans stored in the variable size buffer;
compare a fixed distance threshold with the distance covered; and
remove one or more of the prior range scans from the variable size buffer when the distance covered is greater than the fixed distance threshold.
18. The system of claim 13, wherein the constrained quadratic surface is obtained by a modified RANSAC process.
19. The system of claim 13, wherein the ground plane is represented by a three-dimensional road model corresponding to the constrained quadratic surface.
20. The system of claim 13, wherein the constrained quadratic surface is classified by a height of one or more map labels in world coordinates above the ground plane, the map labels based on the transformed current range scan and the prior range scans.
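The classification step of claim 20 labels map locations by their height above the estimated ground surface. A minimal sketch, assuming invented height thresholds (the text states no specific values) and a ground model given as coefficients over basis terms:

```python
# Illustrative height-above-ground classification into traversable,
# non-traversable, and obstacle areas. Thresholds are assumptions.
TRAVERSABLE, NON_TRAVERSABLE, OBSTACLE = "traversable", "non-traversable", "obstacle"

def classify_height(height_above_ground_m,
                    rough_threshold_m=0.10,
                    obstacle_threshold_m=0.40):
    """Map a point's height above the fitted ground surface to a label."""
    h = abs(height_above_ground_m)
    if h < rough_threshold_m:
        return TRAVERSABLE       # close to the ground model: drivable
    if h < obstacle_threshold_m:
        return NON_TRAVERSABLE   # curb-height deviation: avoid if possible
    return OBSTACLE              # tall enough to block the vehicle

def classify_points(points, ground, features):
    """Label (x, y, z) world-coordinate points against a ground model.

    ground   -- coefficients of the fitted ground surface
    features -- callable mapping (x, y) to the surface's basis terms
    """
    labels = []
    for x, y, z in points:
        predicted = sum(c * f for c, f in zip(ground, features(x, y)))
        labels.append(classify_height(z - predicted))
    return labels
```

For a flat ground model at z = 0.5, points 0.02 m, 0.3 m, and 1.5 m above it classify as traversable, non-traversable, and obstacle respectively.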
US12/182,774 2008-07-30 2008-07-30 Laser ranging process for road and obstacle detection in navigating an autonomous vehicle Active 2031-12-08 US8755997B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/182,774 US8755997B2 (en) 2008-07-30 2008-07-30 Laser ranging process for road and obstacle detection in navigating an autonomous vehicle
EP09160948A EP2149799A2 (en) 2008-07-30 2009-05-22 Laser ranging process for road and obstacle detection in navigating an autonomous vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/182,774 US8755997B2 (en) 2008-07-30 2008-07-30 Laser ranging process for road and obstacle detection in navigating an autonomous vehicle

Publications (2)

Publication Number Publication Date
US20100030473A1 true US20100030473A1 (en) 2010-02-04
US8755997B2 US8755997B2 (en) 2014-06-17

Family

ID=41278693

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/182,774 Active 2031-12-08 US8755997B2 (en) 2008-07-30 2008-07-30 Laser ranging process for road and obstacle detection in navigating an autonomous vehicle

Country Status (2)

Country Link
US (1) US8755997B2 (en)
EP (1) EP2149799A2 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9098086B2 (en) * 2012-08-07 2015-08-04 Caterpillar Inc. Method and system for planning a turn path for a machine
SE537265C2 (en) * 2013-03-19 2015-03-17 Scania Cv Ab Control system and method for controlling vehicles when detecting obstacles
CN104849723B (en) * 2015-04-14 2017-12-26 同济大学 A kind of recognition methods of the simulation menology landform based on polynary linear array laser radar
US9969325B2 (en) 2015-09-15 2018-05-15 International Business Machines Corporation Projected surface markings
US10386480B1 (en) * 2016-02-02 2019-08-20 Waymo Llc Radar based mapping and localization for autonomous vehicles
DE102016207463A1 (en) 2016-04-29 2017-11-02 Robert Bosch Gmbh Method and device for operating at least one vehicle with respect to at least one passable object in the vicinity of the at least one vehicle
IT201600114161A1 (en) * 2016-11-11 2018-05-11 Info Solution S P A METHOD AND DEVICE FOR PILOTING A SELF-PROPELLED VEHICLE AND ITS PILOT SYSTEM
SE541527C2 (en) * 2017-01-19 2019-10-29 Scania Cv Ab Method and control unit for avoiding that an autonomous vehicle gets stuck in a soft soil segment
US10766487B2 (en) 2018-08-13 2020-09-08 Denso International America, Inc. Vehicle driving system
GB2578108A (en) 2018-10-15 2020-04-22 Atlantic Inertial Systems Ltd A navigation system
US11560153B2 (en) 2019-03-07 2023-01-24 6 River Systems, Llc Systems and methods for collision avoidance by autonomous vehicles
US11828848B2 (en) * 2019-03-14 2023-11-28 Aeva, Inc. Velocity estimation using doppler per point LiDAR systems

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008068542A1 (en) 2006-12-04 2008-06-12 Nokia Corporation Auto-calibration method for sensors and auto-calibrating sensor arrangement

Patent Citations (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4740085A (en) * 1986-02-18 1988-04-26 Northrop Corporation Scale factor stability control
US5050423A (en) * 1989-12-04 1991-09-24 Motorola, Inc. Multi-variable sensor calibration
US5956250A (en) * 1990-02-05 1999-09-21 Caterpillar Inc. Apparatus and method for autonomous vehicle navigation using absolute data
US5367898A (en) * 1991-01-23 1994-11-29 Sumitomo Electric Industries, Ltd. Method of calculating scale factor of gyro
US5525882A (en) * 1993-10-25 1996-06-11 International Business Machines Corporation Method and system for maneuvering a mobile robot
US20050278098A1 (en) * 1994-05-23 2005-12-15 Automotive Technologies International, Inc. Vehicular impact reactive system and method
US20080046150A1 (en) * 1994-05-23 2008-02-21 Automotive Technologies International, Inc. System and Method for Detecting and Protecting Pedestrians
US5999866A (en) * 1996-11-05 1999-12-07 Carnegie Mellon University Infrastructure independent position determining system
US6002983A (en) * 1997-08-27 1999-12-14 Delphi Technologies, Inc. Angle extent estimation method for a motor vehicle object detection system
US6374191B1 (en) * 1998-04-20 2002-04-16 Nagano Keiki Co., Ltd. Self-calibrating sensor
US6298288B1 (en) * 1998-12-16 2001-10-02 Hughes Electronics Corp. Autonomous gyro scale factor and misalignment calibration
US6754370B1 (en) * 2000-08-14 2004-06-22 The Board Of Trustees Of The Leland Stanford Junior University Real-time structured light range scanning of moving scenes
US20030016161A1 (en) * 2001-07-18 2003-01-23 Hitachi, Ltd. Vehicle control apparatus
US6823261B2 (en) * 2001-11-02 2004-11-23 Fuji Jukogyo Kabushiki Kaisha Monitor system of vehicle outside and the method thereof
US6778924B2 (en) * 2001-11-06 2004-08-17 Honeywell International Inc. Self-calibrating inertial measurement system method and apparatus
US6968281B2 (en) * 2001-11-06 2005-11-22 Honeywell International, Inc. Method for calibrating an inertial measurement unit
US6615117B2 (en) * 2001-11-13 2003-09-02 The Boeing Company Attitude determination system and method with outer-loop gyro scale-factor non-linearity calibration
US6679702B1 (en) * 2001-12-18 2004-01-20 Paul S. Rau Vehicle-based headway distance training system
US6728608B2 (en) * 2002-08-23 2004-04-27 Applied Perception, Inc. System and method for the creation of a terrain density model
US20050024492A1 (en) * 2003-07-03 2005-02-03 Christoph Schaefer Obstacle detection and terrain classification method
US7437244B2 (en) * 2004-01-23 2008-10-14 Kabushiki Kaisha Toshiba Obstacle detection apparatus and a method therefor
US7643966B2 (en) * 2004-03-10 2010-01-05 Leica Geosystems Ag Identification of 3D surface points using context-based hypothesis testing
US7272474B1 (en) * 2004-03-31 2007-09-18 Carnegie Mellon University Method and system for estimating navigability of terrain
US7742841B2 (en) * 2005-02-23 2010-06-22 Panasonic Electric Works Co., Ltd. Autonomous vehicle and planar obstacle recognition method
US20070219720A1 (en) * 2006-03-16 2007-09-20 The Gray Insurance Company Navigation and control system for autonomous vehicles
US8325979B2 (en) * 2006-10-30 2012-12-04 Tomtom Global Content B.V. Method and apparatus for detecting objects from terrestrial based mobile mapping data
US20080223107A1 (en) * 2007-03-15 2008-09-18 Stewart Robert E Self-calibration of scale factor for dual resonator class II coriolis vibratory gyros
US7995055B1 (en) * 2007-05-25 2011-08-09 Google Inc. Classifying objects in a scene
US8446468B1 (en) * 2007-06-19 2013-05-21 University Of Southern California Moving object detection using a mobile infrared camera
US20100106356A1 (en) * 2008-10-24 2010-04-29 The Gray Insurance Company Control and systems for autonomously driven vehicles

Cited By (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7784134B2 (en) * 2007-06-05 2010-08-31 General Dynamics European Land Systems-Germany Gmbh Method for laying a military bridge
US20090064427A1 (en) * 2007-06-05 2009-03-12 Lothar Emrich Method for laying a military bridge
US20090268948A1 (en) * 2008-04-24 2009-10-29 Gm Global Technology Operations, Inc. Pixel-based texture-rich clear path detection
US8452053B2 (en) * 2008-04-24 2013-05-28 GM Global Technology Operations LLC Pixel-based texture-rich clear path detection
US8565958B1 (en) * 2011-06-02 2013-10-22 Google Inc. Removing extraneous objects from maps
US20130006482A1 (en) * 2011-06-30 2013-01-03 Ramadev Burigsay Hukkeri Guidance system for a mobile machine
US8825391B1 (en) 2011-08-04 2014-09-02 Google Inc. Building elevation maps from laser data
US10185324B1 (en) 2011-08-04 2019-01-22 Waymo Llc Building elevation maps from laser data
US9709679B1 (en) 2011-08-04 2017-07-18 Waymo Llc Building elevation maps from laser data
US20130050490A1 (en) * 2011-08-23 2013-02-28 Fujitsu General Limited Drive assisting apparatus
KR101820299B1 (en) * 2011-11-23 2018-03-02 삼성전자주식회사 Stairs recognition method for three dimension data image
US9810540B1 (en) 2012-05-07 2017-11-07 Waymo Llc Map reports from vehicles in the field
US11519739B1 (en) 2012-05-07 2022-12-06 Waymo Llc Map reports from vehicles in the field
US9123152B1 (en) * 2012-05-07 2015-09-01 Google Inc. Map reports from vehicles in the field
US10520323B1 (en) 2012-05-07 2019-12-31 Waymo Llc Map reports from vehicles in the field
US9052721B1 (en) 2012-08-28 2015-06-09 Google Inc. Method for correcting alignment of vehicle mounted laser scans with an elevation map for obstacle detection
US8885151B1 (en) * 2012-09-04 2014-11-11 Google Inc. Condensing sensor data for transmission and processing
US10094670B1 (en) 2012-09-04 2018-10-09 Waymo Llc Condensing sensor data for transmission and processing
JP2014106897A (en) * 2012-11-29 2014-06-09 Toyota Motor Corp Passage propriety determination device
US9251627B2 (en) * 2013-03-05 2016-02-02 Sears Brands, L.L.C. Removable dashboard instrument system
US20140257623A1 (en) * 2013-03-05 2014-09-11 Ross Carl Removable dashboard instrument system
US9129523B2 (en) * 2013-05-22 2015-09-08 Jaybridge Robotics, Inc. Method and system for obstacle detection for vehicles using planar sensor data
US20140350835A1 (en) * 2013-05-22 2014-11-27 Jaybridge Robotics, Inc. Method and system for obstacle detection for vehicles using planar sensor data
US9305241B2 (en) * 2013-07-29 2016-04-05 Google Inc. Systems and methods for reducing a data set
US20150029186A1 (en) * 2013-07-29 2015-01-29 Google Inc. Systems and Methods for Reducing a Data Set
US9285230B1 (en) * 2013-12-20 2016-03-15 Google Inc. Methods and systems for detecting road curbs
US9454156B2 (en) * 2014-07-09 2016-09-27 Korea University Research And Business Foundation Method for extracting curb of road using laser range finder and method for localizing of mobile robot using curb information of road
US20160011594A1 (en) * 2014-07-09 2016-01-14 Korea University Research And Business Foundation Method for extracting curb of road using laser range finder and method for localizing of mobile robot using curb information of road
US20160094808A1 (en) * 2014-09-29 2016-03-31 Vislab S.R.L. All-round view monitoring system for a motor vehicle
US10099615B2 (en) * 2014-09-29 2018-10-16 Ambarella, Inc. All-round view monitoring system for a motor vehicle
US9494093B2 (en) 2014-10-08 2016-11-15 Ford Global Technologies, Llc Detecting and negotiating a climbable obstacle in a vehicle
US11458986B2 (en) 2015-09-17 2022-10-04 Sony Corporation System and method for providing driving assistance to safely overtake a vehicle
JP2017058264A (en) * 2015-09-17 2017-03-23 株式会社東芝 Estimation device, method, and program
US10604161B2 (en) * 2015-09-17 2020-03-31 Sony Corporation System and method for providing driving assistance to safely overtake a vehicle
US20180345992A1 (en) * 2015-09-17 2018-12-06 Sony Corporation System and method for providing driving assistance to safely overtake a vehicle
US11780457B2 (en) 2015-09-17 2023-10-10 Sony Group Corporation System and method for providing driving assistance to safely overtake a vehicle
US9946259B2 (en) 2015-12-18 2018-04-17 Raytheon Company Negative obstacle detector
WO2017123301A2 (en) 2015-12-18 2017-07-20 Raytheon Company Negative obstacle detector
US10486699B2 (en) * 2016-05-04 2019-11-26 Ford Global Technologies, Llc Off-road autonomous driving
US10101746B2 (en) * 2016-08-23 2018-10-16 Delphi Technologies, Inc. Automated vehicle road model definition system
CN107766405A (en) * 2016-08-23 2018-03-06 德尔福技术有限公司 Automotive vehicle road model defines system
US10296010B2 (en) * 2016-10-11 2019-05-21 Mobileye Vision Technologies Ltd. Navigating a vehicle based on a detected barrier
US11669102B2 (en) 2016-10-11 2023-06-06 Mobileye Vision Technologies Ltd. Navigating a vehicle based on a detected barrier
US11029699B2 (en) 2016-10-11 2021-06-08 Mobileye Vision Technologies Ltd. Navigating a vehicle based on a detected barrier
US10649463B2 (en) 2016-10-11 2020-05-12 Mobileye Vision Technologies Ltd. Navigating a vehicle based on a detected barrier
CN107933560A (en) * 2016-10-12 2018-04-20 本田技研工业株式会社 Controller of vehicle
US20180099667A1 (en) * 2016-10-12 2018-04-12 Honda Motor Co., Ltd Vehicle control device
US10696301B2 (en) * 2016-10-12 2020-06-30 Honda Motor Co., Ltd. Vehicle control device
WO2018152441A1 (en) * 2017-02-17 2018-08-23 Essential Products, Inc. Modular light detection and ranging device of a vehicular navigation system
US10598790B2 (en) 2017-02-17 2020-03-24 Essential Products, Inc. Modular light detection and ranging device of a vehicular navigation system
JP7254709B2 (en) 2017-03-15 2023-04-10 ツェットエフ、フリードリッヒスハーフェン、アクチエンゲゼルシャフト Apparatus and method for determining grade signals in vehicles
JP2020512546A (en) * 2017-03-15 2020-04-23 ツェットエフ、フリードリッヒスハーフェン、アクチエンゲゼルシャフトZf Friedrichshafen Ag Apparatus and method for determining a slope signal in a vehicle
US11472415B2 (en) 2017-03-15 2022-10-18 Zf Friedrichshafen Ag Arrangement and method for determining a gradient signal in a vehicle
CN110392845A (en) * 2017-03-15 2019-10-29 Zf 腓德烈斯哈芬股份公司 For determining the facility and method of grade signal in the car
WO2018166718A1 (en) * 2017-03-15 2018-09-20 Zf Friedrichshafen Ag Arrangement and method for determining a gradient signal in a vehicle
US10929695B2 (en) * 2017-03-28 2021-02-23 Denso Corporation Obstacle detection apparatus
US20200019797A1 (en) * 2017-03-28 2020-01-16 Denso Corporation Obstacle detection apparatus
CN109212543A (en) * 2017-07-06 2019-01-15 通用汽车环球科技运作有限责任公司 Calibration verification method for autonomous vehicle operation
US10967789B2 (en) * 2017-09-07 2021-04-06 Hitachi Construction Machinery Co., Ltd. Safe driving assistance device
AU2018329306B2 (en) * 2017-09-07 2022-01-13 Hitachi Construction Machinery Co., Ltd. Safe driving assistance device
US11673586B2 (en) 2017-11-28 2023-06-13 Elta Systems Ltd. Failure detection in an autonomous vehicle
WO2019106664A1 (en) * 2017-11-28 2019-06-06 Israel Aerospace Industries Ltd. Failure detection in an autonomous vehicle
IL260821B1 (en) * 2018-07-26 2023-07-01 Israel Aerospace Ind Ltd Failure detection in an autonomous vehicle
US11393219B2 (en) 2018-09-27 2022-07-19 Apollo Intelligent Driving Technology (Beijing) Co., Ltd. Method and apparatus for detecting obstacle, electronic device, vehicle and storage medium
JP7395301B2 (en) 2018-09-27 2023-12-11 アポロ インテリジェント ドライビング テクノロジー(ペキン)カンパニー リミテッド Obstacle detection method, obstacle detection device, electronic equipment, vehicle and storage medium
JP2020098188A (en) * 2018-09-27 2020-06-25 バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド Obstacle detection method, obstacle detection device, electronic apparatus, vehicle and storage medium
US11616840B2 (en) * 2018-12-17 2023-03-28 Apollo Intelligent Driving Technology (Beijing) Co., Ltd. Method, apparatus and system for processing unmanned vehicle data, and storage medium
WO2021030508A1 (en) * 2019-08-13 2021-02-18 Zoox, Inc. Modifying limits on vehicle dynamics for trajectories
US11914368B2 (en) 2019-08-13 2024-02-27 Zoox, Inc. Modifying limits on vehicle dynamics for trajectories
US11458965B2 (en) 2019-08-13 2022-10-04 Zoox, Inc. Feasibility validation for vehicle trajectory selection
US11407409B2 (en) 2019-08-13 2022-08-09 Zoox, Inc. System and method for trajectory validation
US11397434B2 (en) 2019-08-13 2022-07-26 Zoox, Inc. Consistency validation for vehicle trajectory selection
US11333879B2 (en) 2019-09-20 2022-05-17 Raytheon Company Electronically steered inter-satellite optical communication system with micro-electromechanical (MEM) micromirror array (MMA)
US11584377B2 (en) * 2019-11-21 2023-02-21 Gm Cruise Holdings Llc Lidar based detection of road surface features
US11539131B2 (en) 2020-08-24 2022-12-27 Raytheon Company Optical true time delay (TTD) device using microelectrical-mechanical system (MEMS) micromirror arrays (MMAS) that exhibit tip/tilt/piston (TTP) actuation
US11837840B2 (en) 2020-09-01 2023-12-05 Raytheon Company MEMS micro-mirror array laser beam steerer for simultaneous illumination of multiple tracked targets
US11815676B2 (en) 2020-09-17 2023-11-14 Raytheon Company Active pushbroom imaging system using a micro-electro-mechanical system (MEMS) micro-mirror array (MMA)
US11522331B2 (en) 2020-09-23 2022-12-06 Raytheon Company Coherent optical beam combination using micro-electro-mechanical system (MEMS) micro-mirror arrays (MMAs) that exhibit tip/tilt/piston (TTP) actuation
US20230031096A1 (en) * 2021-07-28 2023-02-02 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for detecting an object in a turning alert zone of a vehicle
US11851051B2 (en) * 2021-07-28 2023-12-26 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for detecting an object in a turning alert zone of a vehicle

Also Published As

Publication number Publication date
EP2149799A2 (en) 2010-02-03
US8755997B2 (en) 2014-06-17

Similar Documents

Publication Publication Date Title
US8755997B2 (en) Laser ranging process for road and obstacle detection in navigating an autonomous vehicle
US8364334B2 (en) System and method for navigating an autonomous vehicle using laser detection and ranging
Sun et al. A 3D LiDAR data-based dedicated road boundary detection algorithm for autonomous vehicles
US20080033645A1 (en) Probabilistic methods for mapping and localization in arbitrary outdoor environments
Hata et al. Feature detection for vehicle localization in urban environments using a multilayer LIDAR
CN108369420B (en) Apparatus and method for autonomous positioning
Levinson et al. Robust vehicle localization in urban environments using probabilistic maps
Levinson et al. Map-based precision vehicle localization in urban environments.
US9989967B2 (en) All weather autonomously driven vehicles
CN114812581B (en) Cross-country environment navigation method based on multi-sensor fusion
Levinson Automatic laser calibration, mapping, and localization for autonomous vehicles
WO2015173034A1 (en) Method and system for determining a position relative to a digital map
CN109387857B (en) Cross-network segment detection method and device in laser radar system
Veronese et al. A light-weight yet accurate localization system for autonomous cars in large-scale and complex environments
WO2018061084A1 (en) Self-position estimation method and self-position estimation device
Hervieu et al. Road side detection and reconstruction using LIDAR sensor
CN112292582A (en) Method and system for generating high definition map
Pannen et al. Hd map change detection with a boosted particle filter
Zhang et al. Robust lidar localization for autonomous driving in rain
US20230194269A1 (en) Vehicle localization system and method
Moras et al. Drivable space characterization using automotive lidar and georeferenced map information
Vora et al. Aerial imagery based lidar localization for autonomous vehicles
Liebner et al. Crowdsourced hd map patches based on road model inference and graph-based slam
Jiménez et al. Improving the lane reference detection for autonomous road vehicle control
Burger et al. Unstructured road slam using map predictive road tracking

Legal Events

Date Code Title Description
AS Assignment

Owner name: HONEYWELL INTERNATIONAL INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AU, KWONG WING;SCHEWE, JON;REEL/FRAME:021318/0749

Effective date: 20080728

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: 7.5 YR SURCHARGE - LATE PMT W/IN 6 MO, LARGE ENTITY (ORIGINAL EVENT CODE: M1555); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8