US20060119473A1 - System and method of avoiding collisions - Google Patents

System and method of avoiding collisions Download PDF

Info

Publication number
US20060119473A1
US20060119473A1 (application US 11/297,273)
Authority
US
United States
Prior art keywords
vehicle
data
sensors
sensor
algorithm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/297,273
Inventor
Richard Gunderson
Michael Parisi
Richard Gorman
Kurtis Melin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Altra Technologies Inc
Original Assignee
Altra Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (Darts-ip: https://patents.darts-ip.com/?family=22443934). "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Altra Technologies Inc
Priority to US 11/297,273
Publication of US20060119473A1
Assigned to ALTRA RECOVERY, LLC (court order and judgment; assignor: ALTRA TECHNOLOGIES, INC.)
Status: Abandoned

Classifications

    • B60Q 9/00: Arrangement or adaptation of signal devices not provided for in main groups B60Q 1/00-B60Q 7/00, e.g. haptic signalling
    • B60Q 9/002: Signal devices for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
    • B60Q 9/004: Signal devices for parking purposes using wave sensors
    • B60Q 9/006: Signal devices for parking purposes using a distance sensor
    • B60Q 9/007: Signal devices for parking purposes providing information about the distance to an obstacle, e.g. varying sound
    • B60Q 9/008: Signal devices for anti-collision purposes
    • G01S 7/003: Transmission of data between radar, sonar or lidar systems and remote stations
    • G01S 13/723: Radar-tracking systems for two-dimensional tracking by using numerical data
    • G01S 13/726: Multiple target tracking
    • G01S 13/878: Combination of several spaced transmitters or receivers of known location for determining the position of a transponder or a reflector
    • G01S 13/931: Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S 2013/9324: Alternative operation using ultrasonic waves
    • G01S 2013/93272: Sensor installation details in the back of the vehicles
    • G01S 2013/93273: Sensor installation details on the top of the vehicles
    • G01S 2013/93275: Sensor installation details in the bumper area
    • G08G 1/161: Anti-collision systems: decentralised systems, e.g. inter-vehicle communication
    • G08G 1/165: Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G08G 1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Definitions

  • Mirrors do not work well when changing lanes, particularly in tractor-trailer rigs: as soon as the rig begins to turn, the mirror that looked down along the side of the vehicle is directed into the side of the trailer, and the driver is blinded to activity on that side of his truck.
  • A final approach taken by a number of companies is the use of sensors to locate objects external to the vehicle.
  • Electronic Controls Company of Boise, Id., sells an ultrasonic sensor system that assists drivers in determining that all is clear before the driver changes lanes, backs up, or docks.
  • the system includes ultrasonic sensors mounted on the back and sides of the vehicle and an alert module mounted in the cab of the vehicle.
  • Each ultrasonic sensor continuously monitors a defined detection zone for objects moving within the zone. When a vehicle enters the detection zone, the sensor measures the time between sending the sound wave and receiving its reflection and sends that measurement to the cab.
  • Sonar Safety Systems of Santa Fe Springs, Calif., has a rear-mounted sensor system which detects objects in three distance zones behind the vehicle; that is, it does not display the distance to the object. Instead, the system provides alarms and audible feedback that tell the driver whether the obstacle is very close (Zone III), somewhat farther out (Zone II), or farther still (Zone I). It looks only up to 8 feet behind the vehicle. The company also offers a single-sensor unit with one sensor mounted on the back.
  • A common problem with rear-mounted sensors to date is that they detect the distance from the sensor to the object, not the perpendicular distance from the vehicle to the object.
  • These systems do not communicate to the driver the transverse location of the object (i.e., whether the object is directly behind the vehicle, off to the side, or far enough to the left or right that the driver will not hit it).
  • Range measurement often does not exist, or is inaccurate.
  • The collision avoidance systems used to date are deficient in other ways as well.
  • The systems provide only partial coverage around the periphery of the vehicle: they either lack a forward-looking detection capability, lack range and range-rate measurement capability, or lack sufficient detection coverage around the periphery of the vehicle to eliminate blind spots.
  • Range measurement often is inaccurate.
  • Those systems which do have forward-looking detection are prone to a high rate of false alarms from the environment or to distracting off-the-road clutter.
  • Armatron International of Melrose, Mass., has a side and rear obstacle detection system which includes wireless communications between the tractor and trailer; however, the sensors are all hard-wired to the trailer. This does not address the common situation in which tractors must pull a multitude of trailers, some owned by different companies, which are not likely to be equipped with any sensors.
  • FIG. 2 shows a more complex embodiment of the collision avoidance system shown in FIG. 1.
  • FIG. 3 is a system block diagram of the collision avoidance system according to FIG. 2.
  • FIGS. 4a-c show operator interface units which can be used with the Control modules of FIGS. 1 and 3.
  • FIGS. 5a-c show the operation of two rear-mounted sensors according to the present invention.
  • FIGS. 6a-c show an alternate embodiment of the operator interface units of FIGS. 4a-c.
  • FIG. 7 illustrates a backup warning system.
  • FIGS. 8a and 8b show wireless portable transducer systems.
  • FIG. 10 shows a high/low detection system.
  • FIG. 11 shows an operator interface unit which can be used with the Control modules of FIGS. 1 and 3.
  • FIG. 12 shows the guided operation of two rear-mounted sensors according to the present invention.
  • FIG. 14 shows one embodiment of a forward looking detector radar module system.
  • FIG. 17 is an illustration of one embodiment of a type C radar module interface board.
  • FIG. 18 is a block diagram of one embodiment of the power distribution plan.
  • FIGS. 20a-c show one embodiment of the pin configurations for the connectors.
  • FIG. 21 is a block diagram of one embodiment of a forward looking detector.
  • FIG. 22 is an illustration of the radar module alignment for a forward looking detector.
  • FIG. 23 illustrates one embodiment of a functional design for a proximity detector.
  • FIG. 24 is a proximity detector interface and operation timing diagram.
  • FIG. 25a is a side view of one embodiment of a type A radar module layout.
  • FIG. 25b is a top view of one embodiment of a type A radar module layout.
  • FIG. 26 is a proximity detector system timing diagram.
  • FIG. 29 illustrates the signal/data processing functions of one embodiment of a control module.
  • FIG. 30 illustrates one embodiment of forward looking detector object data processing.
  • FIG. 31 illustrates one embodiment of track report generator functions.
  • FIG. 34 shows an example of multiple hypotheses through amplitude versus time data for automatic ranging.
  • FIG. 1 shows a collision avoidance system 10 according to the present invention.
  • System 10 includes a Control module 12 and two sensors 14.
  • Each sensor 14 includes a transmitter 16 and a receiver 18.
  • transmitter 16 and receiver 18 are mounted together in a single sensor housing. In another embodiment, transmitter 16 and receiver 18 are mounted in separate housings.
  • Sensors 14 include separate acoustic transducers for each of transmitter 16 and receiver 18.
  • In another embodiment, a single acoustic transducer is used for both transmitting a signal and receiving its echo.
  • Some transducers which would operate in such a system 10 are the 9000 Series Piezo Transducers available from Polaroid OEM Components Group and the KSN 6530 45 KHz transducer available from Motorola.
  • The KSN 6529 45 KHz transducer available from Motorola could be used for receiver 18.
  • sensors 14 are microwave transceiver devices.
  • each transceiver includes a small integrated antenna and electronic interface board.
  • Sensors 14 include both proximity detectors 14.1 and longer range detectors 14.2.
  • the longer range detectors incorporate a larger antenna to operate as a Doppler Radar Forward Looking Detector.
  • An example of one such transducer is the model DRO3000 Microwave Transceiver Module available from Advanced Frequency Products of Andover, Mass.
  • Sensors 14 of system 30 are grouped in detection subsystems 34.
  • The output from each proximity detector subsystem 34 is fed into Control module 12, as shown in FIG. 3.
  • System 30 includes a Control module 12, an operator interface 32, and two sensors 14.
  • Each sensor includes a transmitter 16 and receiver 18.
  • Sensors 14 of system 30 are grouped in detection subsystems: namely, a forward-looking detector subsystem (with 2 sensors), a proximity detector subsystem (with up to 15 sensors), and a rear-guard subsystem (with up to 7 sensors).
  • The output of each sensor in each detection subsystem is fed into control module 12, as shown in FIG. 3.
  • System 30 includes a control module 12, an operator interface 32, and other optional system features.
  • Optional system features include rear warning lights 98 and side warning lights 99.
  • Collision avoidance systems to date typically put transducers on the rear of the vehicle and measure the distance from the sensor to the object. This is not optimal, since those sensors transmit in an arc: they measure the distance from the sensor to the object and back again, which may not be the perpendicular distance from the vehicle to the object. A deficiency of systems to date, therefore, is that they do not communicate to the driver the transverse location of the object.
  • a plurality of sensors 14 are mounted on the back of the vehicle.
  • Control module 12 takes the readings from the rear-mounted sensors 14, determines whether it can triangulate, and calculates an actual perpendicular distance from the truck to the object. This is important because the truck driver does not simply want to know that an object is five feet away at some angle; he wants to know how far he can back up before he hits it. Examples of these measurements are shown in FIGS. 4a-c and 5a-c. In contrast to other approaches, this triangulation procedure makes system 10 a precision distance measuring system. Alternate approaches to the operator interfaces shown in FIGS. 4a-c are shown in FIGS. 6a-c.
  • FIG. 5 a represents the top view of a tractor trailer rig 50 with a post 52 located behind the trailer 54 .
  • The post 52 represents a hazard unless the driver knows its precise distance from the vehicle.
  • Sensor 14 on the right rear of the trailer senses the post 52 at a distance of six (6) feet.
  • Sensor 14 on the left rear of the trailer senses the post at a distance of just over six and one half (6.5) feet.
  • Control module 12 calculates the actual perpendicular distance to the post as 5.2 feet and determines that it is located just to the right of the center of the trailer. The distance is then displayed digitally on the control module 12.
  • the transverse location is displayed, for instance, on the bar graph located just to the right of the digital display and it indicates the location of the post.
  • Perpendicular distance between the rear of a vehicle and external objects is increasingly important the closer the vehicle gets to an external object.
  • the actual perpendicular distance is 2.6 feet.
  • The Precision Measurement System uses the sensor 14 distance readings, together with the known distance between the left and right sensors 14, to calculate the exact perpendicular distance to the post. This is an important aid to the driver in preventing an accident.
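The triangulation described here can be sketched as follows. The 7-foot sensor spacing is an assumed value chosen only so the example roughly reproduces the 5.2-foot figure from FIG. 5a (in the actual system the spacing is a programmed parameter), and the function name is invented for illustration.

```python
import math

def perpendicular_distance(d_left, d_right, baseline):
    """Triangulate an object from two rear-mounted sensor ranges.

    d_left, d_right: measured ranges from the left and right sensors (feet).
    baseline: known distance between the two sensors (feet).
    Returns (perpendicular_distance, transverse_offset), the offset being
    measured from the left sensor toward the right sensor.
    """
    # Place the left sensor at x = 0 and the right sensor at x = baseline.
    # Solve x^2 + y^2 = d_left^2 and (x - baseline)^2 + y^2 = d_right^2.
    x = (d_left**2 - d_right**2 + baseline**2) / (2 * baseline)
    y_squared = d_left**2 - x**2
    if y_squared < 0:
        raise ValueError("inconsistent ranges: arcs do not intersect")
    return math.sqrt(y_squared), x

# With an assumed 7 ft spacing, readings of 6.5 ft (left) and 6.0 ft
# (right) yield a perpendicular distance of roughly 5.2 ft, with the
# object just to the right of the trailer's centerline.
dist, offset = perpendicular_distance(6.5, 6.0, 7.0)
print(round(dist, 1), round(offset, 2))
```

Note that the raw 6.0-foot slant reading overstates the usable backing distance; the perpendicular figure is what the driver needs.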
  • A third sensor 14 is mounted between the right and left sensors 14.
  • the system can determine that the object is a point source (such as a post) as opposed to a wall or large vehicle.
  • the sensor 14 on the right rear of the trailer senses the post at a distance of six (6) feet.
  • the sensor 14 on the left rear of the trailer senses the post at a distance of just over six and one half (6.6) feet.
  • The Control module 12, knowing that the object is a point source, calculates that the actual distance to the post is 5.2 feet and that it is located just to the right of the center of the trailer.
  • the distance is displayed digitally on the Operator Interface and Side Display Modules.
  • the transverse location is displayed in graphic form (e.g. bar graph) on the Operator Interface.
  • FIG. 5 b represents the top view of a tractor trailer rig with a post located far behind the trailer.
  • the post represents a hazard unless the driver has sufficient information to aid in maneuvering around the obstacle.
  • the sensor 14 on the right rear of the trailer senses the post at a distance of 21.0 feet.
  • the sensor 14 on the left rear of the trailer senses the post at a distance of 22.1 feet.
  • the control module 12 calculates that the actual distance to the post is 21.0 feet, and that it is located near the right side of the trailer.
  • the distance is displayed digitally on the operator interface.
  • the transverse location is displayed on the bar graph located just to the right of the digital display and it indicates the location. Precision distance measurement is less of a concern when obstacles are a long distance from the rear of the vehicle.
  • In order to triangulate, the distance between sensors 14 must be known; therefore, it must be controlled. In one embodiment, the distance between sensors 14 is a programmable system parameter. In one such programmable embodiment, a programming device is provided to a dealer or fleet owner so that, once sensors 14 are installed, they can measure the actual distance between the sensors and the distance from the sensors to the sides of the vehicle and program those values into control module 12. Control module 12 can then accurately calculate distance.
  • Control module 12 calculates transverse location and communicates that information via a graphical indicator such as bar graph 22 of FIG. 4a (also shown as 22′ in FIG. 6a).
  • The purpose of the bar graph display is to break the transverse set of distances or locations up into anywhere from 7 to 11 or more segments on the display.
  • Control module 12 lights the segments that indicate where that object is from extreme left to extreme right.
  • transverse location is communicated through another graphic display, (e.g., a Liquid Crystal or other Display).
  • Transverse location can also be displayed on a small video monitor.
  • Operator interface unit 32 (also shown as 32′ in FIG. 6a) displays an area behind the vehicle and places a dot within that area showing the closest object. A representation such as is shown in FIG. 5a would be sufficient.
  • The operator interface unit 32 includes a display 129, speakers 125, push-button switches 47, a system status indicator 127, and switch legends 128.
  • system 10 includes vertical compensation as discussed below.
  • vertical compensation is activated automatically when the front panel switch in FIG. 4 c or the soft key in FIG. 6 c is in the Loading Dock (LD) position.
  • the purpose of this feature is to compensate for the protrusion of loading dock impact bars in cases where the Transducer Assembly is located below the point of impact of the trailer with the loading dock impact bar.
  • FIG. 5 c represents the side view of a tractor-trailer pulling up to a loading dock.
  • the impact bar is the point of contact with the trailer.
  • the depth (i.e., front-to-back) of the impact bar is typically 4.5 inches.
  • the top of the impact bar is typically 48 inches above the ground.
  • the Precision Measurement System will adjust the distance measurement by 4.5 inches if the Transducer Assembly is mounted so low that it cannot detect the impact bar when the trailer is within 12 inches of the impact bar. For example, if the perpendicular distance from the rear of the trailer to the loading dock is 1 foot and the Transducer is 2 feet below the impact bar, the measured distance of 1.0 feet will be corrected to 0.6 feet.
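This vertical compensation can be sketched numerically. The function name is invented, and the engagement condition is simplified from the 12-inch rule described above; only the 4.5-inch impact-bar depth and the 1.0 ft to 0.6 ft example come from the description.

```python
IMPACT_BAR_DEPTH_FT = 4.5 / 12.0  # typical impact bar depth: 4.5 inches

def compensate_for_impact_bar(measured_ft, transducer_below_bar_ft,
                              engage_within_ft=1.0):
    """Correct a dock-approach range when the transducer sits below the bar.

    A low-mounted transducer ranges to the dock face rather than the
    protruding impact bar, so the true clearance is shorter than measured
    by the bar's depth. The correction engages only at close range.
    """
    if transducer_below_bar_ft > 0 and measured_ft <= engage_within_ft:
        return measured_ft - IMPACT_BAR_DEPTH_FT
    return measured_ft

# A 1.0 ft reading with the transducer 2 ft below the bar corrects to ~0.6 ft.
print(round(compensate_for_impact_bar(1.0, 2.0), 1))
```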
  • software running in systems 10 and 30 uses Multi-Hypothesis Ranging to provide an accurate estimate of range to an object.
  • a range estimate will be calculated from the signal strength versus time and closing rate of each tracked object.
  • The signal strength will vary, due to the properties of radar, as one over range to the fourth power, and by a scattering property called scintillation. Combining this property with the distance traveled by the object will yield the starting range and thus the current range to the object.
  • the distance traveled by the object is computed in the system by combining time since the data collection and tracking started, with the individual measured closing rates versus time.
  • The signal strengths versus time are inserted into an algorithm which matches the hypothetical signal strength curves, over range and distance traveled, to a one-over-range-to-the-fourth curve.
  • One hypothesis, a set of points drawn through the returned signal levels over time, will correspond to the correct starting range for the object given the measured distance traveled. This hypothesis will provide the best statistical match to a one-over-range-to-the-fourth curve and will be the range estimate provided.
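The multi-hypothesis matching can be sketched as a least-squares fit over candidate starting ranges. This is a minimal illustration that ignores scintillation; the function name, the candidate grid, and the synthetic data are all invented for the example.

```python
def best_starting_range(amplitudes, distances_traveled, candidate_ranges):
    """Pick the starting-range hypothesis whose 1/R^4 curve best fits
    the measured signal amplitudes.

    amplitudes: measured return strengths over time (arbitrary units).
    distances_traveled: cumulative closing distance at each sample,
        obtained by integrating the measured closing rate over time.
    candidate_ranges: hypothetical starting ranges to test.
    """
    best_r0, best_err = None, float("inf")
    for r0 in candidate_ranges:
        ranges = [r0 - d for d in distances_traveled]
        if min(ranges) <= 0:  # object would already have been reached
            continue
        basis = [1.0 / r**4 for r in ranges]
        # Least-squares fit of the unknown scale k in A(t) = k / R(t)^4.
        k = sum(a * b for a, b in zip(amplitudes, basis)) / sum(b * b for b in basis)
        err = sum((a - k * b) ** 2 for a, b in zip(amplitudes, basis))
        if err < best_err:
            best_r0, best_err = r0, err
    return best_r0

# Synthetic check: an object starting 100 ft away, closing 2 ft per sample.
dists = [2.0 * i for i in range(10)]
amps = [1e8 / (100.0 - d) ** 4 for d in dists]
print(best_starting_range(amps, dists, [60, 80, 100, 120]))
```

In practice the hypothesis grid would be dense and the fit weighted so that scintillation noise does not dominate.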
  • One sonar embodiment of system 10 incorporates temperature compensation, which is needed because sound travels at different speeds through air depending on temperature. Systems 10 and 30 therefore measure the air temperature and compensate for its effect in the distance calculation. In such an embodiment, transverse location detection, triangulation, perpendicular position compensation and temperature compensation cooperate to form a precision measurement system.
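A minimal sketch of the temperature compensation, using the standard linear approximation for the speed of sound in air; the approximation and the function names are not from the patent.

```python
def speed_of_sound_m_s(temp_c):
    """Approximate speed of sound in air versus temperature:
    c ≈ 331.3 + 0.606 * T, with T in degrees Celsius."""
    return 331.3 + 0.606 * temp_c

def echo_distance_m(round_trip_s, temp_c):
    """Convert a measured echo round-trip time into a one-way distance,
    compensating for air temperature."""
    return speed_of_sound_m_s(temp_c) * round_trip_s / 2.0

# The same 10 ms round trip reads ~1.66 m at 0 °C but ~1.72 m at 20 °C;
# an uncompensated system would report the same distance in both cases.
print(round(echo_distance_m(0.010, 0.0), 2), round(echo_distance_m(0.010, 20.0), 2))
```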
  • the systems include automatic sensitivity control.
  • When systems 10 and 30 are trying to sense an object at a far location, it is advantageous to transmit a high burst of energy at maximum power.
  • When systems 10 and 30 transmit such a high burst of energy, they are more likely to get a detectable echo if there is an object at a far distance.
  • The receiver sensitivity should be set very high for that application. But once that far-off object is sensed and the vehicle starts backing toward it, systems 10 and 30 should back off the transmitted energy.
  • The output of transmitter 16 can be reduced and the sensitivity of receiver 18 increased automatically by systems 10 and 30.
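The automatic sensitivity control can be sketched as a simple range-driven schedule. The linear control law, the 30-foot maximum range, and the 10% floors are assumptions for illustration; the patent describes the behavior (full power at long range, reduced transmit power and increased receiver sensitivity up close) but not a specific law.

```python
def transmit_power_and_gain(last_range_ft, max_range_ft=30.0):
    """Return (transmit_power_fraction, receiver_gain_fraction), each 0..1,
    scaled from the most recent range to the tracked object."""
    frac = max(0.0, min(1.0, last_range_ft / max_range_ft))
    power = max(0.1, frac)       # full transmit power for distant objects
    gain = max(0.1, 1.0 - frac)  # higher receiver sensitivity up close
    return power, gain

# Far object: maximum burst energy. Near object: throttled transmitter,
# more sensitive receiver.
print(transmit_power_and_gain(30.0))
print(transmit_power_and_gain(3.0))
```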
  • a backup warning system is provided as shown in FIG. 7 .
  • the intent is to provide immediate feedback to the driver shortly after the vehicle transmission is shifted into reverse.
  • This information includes information on objects in the vicinity of the rear of the vehicle as well as information on objects in the path of the rear of the vehicle.
  • If an object is detected in the vicinity of the vehicle but not in its path, an auditory prompt representing an “alert” is sounded for the driver. If an object is detected in the path of the vehicle, in the range of 5 to 10 feet, the system will categorize that as a hazard situation and an auditory prompt representing a “warning” is sounded for the driver.
  • If an object is detected in the path of the vehicle at close range, the system will categorize that as an emergency situation and an auditory prompt representing an “emergency” is sounded for the driver.
  • the alert, warning, and emergency will have cleared and the system will begin providing range feedback to the driver in the form of distance information, as displayed on the Operator Interface and Side Display Modules, and auditory feedback in the form of pulsed tones.
  • the system will automatically detect a sudden change in range to the object and the “emergency” auditory prompt will be issued to the driver so he/she can take action.
  • When the driver is about to back up, if there is an object within range, one of three scenarios will occur. First, if the system senses a truck or other object very close on either side, systems 10 and 30 give the driver an alert: the system knows there is no collision potential, but alerts him that something is there. In one embodiment, systems 10 and 30 provide one set of tones to the driver for an alert. Second, if there is an object in the range of 5-10 feet as soon as the driver shifts into reverse, systems 10 and 30 sense the object and provide the driver with a different alarm (e.g., a different set of tones or a different flashing light). This alarm is called a hazard alarm.
  • Third, in the emergency situation, systems 10 and 30 provide an emergency alarm (i.e., a third set of tones, or a third flashing light).
  • Systems 10 and 30 therefore provide feedback indicative of the distance to an object behind the driver.
  • Audible or visual feedback tells the driver he is getting closer; the pulses come faster and faster until, when he is within a foot, they are continuous. But if, in the process of backing up, the system automatically detects that the distance suddenly became shorter, it provides the emergency alarm right away so the driver can take action.
  • Systems 10 and 30 sense that and automatically provide the emergency alarm so the driver can take action. As noted above, some existing systems detect zones of distance and provide feedback accordingly. Systems 10 and 30 go beyond that: they detect and differentiate objects outside the area of potential collision from those inside it, and they can detect sudden changes in distance for an emergency alarm.
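The alert/hazard/emergency scheme above can be sketched as a small classifier. The 5-10 foot hazard band comes from the description; the close-range emergency threshold, the 3-foot sudden-change threshold, and the function name are assumptions for illustration.

```python
def classify_backup(range_ft, in_path, prev_range_ft=None,
                    sudden_change_ft=3.0):
    """Classify a detected object while backing up.

    in_path: whether the object lies in the vehicle's reverse path.
    prev_range_ft: previous range sample, used to detect a sudden
        shortening of distance (e.g., something moving in behind).
    """
    if prev_range_ft is not None and prev_range_ft - range_ft > sudden_change_ft:
        return "emergency"  # distance suddenly became shorter
    if not in_path:
        return "alert"      # something nearby, but no collision potential
    if range_ft < 5.0:
        return "emergency"
    if range_ft <= 10.0:
        return "hazard"
    return "clear"          # beyond the hazard band: pulsed range feedback

print(classify_backup(7.0, True))         # in-path object at 7 ft
print(classify_backup(8.0, False))        # object off to the side
print(classify_backup(7.0, True, 12.0))   # range jumped from 12 ft to 7 ft
```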
  • control module 12 is highly programmable and dealers and fleet owners are given the ability to program key parameters that the system can use to more adequately address the needs of that application and that customer.
  • an external programmer is plugged into a connector in the back of control module 12 ; the dealer can then step through a number of fields and change a value (e.g., the distance between the rear-mounted transducers as discussed above), and key that in.
  • the programmer downloads the data, and feeds it back to the control module 12 .
  • Control module 12 then is configured for that vehicle.
  • system 10 includes a security monitor/alarm system coupled to control module 12 .
  • an ultrasonic transmitter and an ultrasonic receiver are placed in the cab of the vehicle.
  • when the driver leaves the vehicle, he turns the alarm system on with a key switch and it automatically scans the cab to determine the distances to the closest objects in the cab. If somebody climbs up into the seat, one of the distances changes and an alarm is set off.
  • the driver has approximately 15 seconds to get in and disable the alarm with his key switch. If it is somebody other than the driver, however, the alarm will sound.
  • the alarm also activates an auto alarm underneath the hood of his vehicle to draw attention to and possibly scare off the intruder.
  • an on-board computer interface is provided.
  • the reason for this is some of the larger tractor-trailer rigs, in particular, have on-board information systems that monitor factors relating to use of the vehicle. They may monitor, for instance, the location of the vehicle, the delivery route, the delivery schedule, things that the driver does along the way, engine performance or things that might be an indication to the fleet owner that there's some service needed.
  • information relating to driver performance that is detected with systems 10 and 30 is captured and downloaded into the on-board computer so that when the fleet owner gets a download from the on-board computer, it contains additional information provided by systems 10 and 30 . So, with an interface through a single cable, systems 10 and 30 can tie into the on-board computer and provide real time information.
  • control module 12 if there is no on-board computer there, data storage is provided in control module 12 so that it can store the data internally. Data can then be downloaded to a fleet computer at a future date.
  • systems 10 and 30 include an accident reconstruction memory installed in the control module. This memory maintains a record, in non-volatile memory, of data pertinent to system operation, vehicle operation, and obstacle detection. Some of these parameters are stored over longer periods of time and some relate to the last 2 or more minutes leading up to an accident.
  • a G-force switch detects the presence of a crash and discontinues the data recording process thus saving data stored prior to the crash.
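The accident-reconstruction behavior can be illustrated with a simple ring buffer. This is a hedged sketch: the class name, sample rate, and buffer length are assumptions, and an in-memory deque stands in for the patent's non-volatile memory.

```python
from collections import deque

# Hypothetical accident-reconstruction recorder: keeps the most recent
# samples in a ring buffer; the G-force switch freezes recording so the
# data captured before the crash is preserved.
class CrashRecorder:
    def __init__(self, seconds=120, rate_hz=10):   # assumed capacity
        self.buffer = deque(maxlen=seconds * rate_hz)
        self.frozen = False

    def record(self, sample):
        if not self.frozen:        # G-force switch has not tripped
            self.buffer.append(sample)

    def g_switch_tripped(self):
        self.frozen = True         # discontinue recording, keep pre-crash data

rec = CrashRecorder(seconds=2, rate_hz=2)          # tiny buffer for the demo
for t in range(10):
    rec.record(("t", t))
rec.g_switch_tripped()
rec.record(("t", 99))              # discarded: the recorder is frozen
```

After the simulated crash, the buffer holds only the last samples recorded before the G-force switch tripped.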
  • a self test capability is provided.
  • Self test addresses several issues. One is when systems 10 and 30 are first turned on (i.e., the driver throws the power switch into an “on” position) the systems will turn all the indicators on so that the driver right away can see that all the indicators are lit.
  • control module 12 tests its internal circuitry to ensure that the system comes up running. The second thing the system does is while it's running, if the micro controller or microprocessor in control module 12 were to fail, systems 10 and 30 then provide a “watch-dog timer” that will detect the failure. Thirdly, the driver can activate self test mode. On doing so, control module 12 flashes all of the indicators of front panel 20 .
  • control panel 20 includes an indicator 24 for each transducer mounted around the vehicle and, on entering self test, transducer indicators 24 begin to flash.
  • the driver then walks around the vehicle and gets back in the cab. Every one of those transducers should detect him; each time they detect him, the transducer indicator 24 associated with the transducer goes off (i.e., quits flashing). If the driver gets back to the cab and there's a transducer still flashing, he knows that something didn't work and he can investigate the problem.
  • systems 10 and 30 automatically and sequentially activate a Built-In Test (BIT) function for each sensor.
  • BIT Built-In Test
  • the Built-In-Test (BIT) function is conducted in two ways: initial power-up and an integrated BIT performed during vehicle motion.
  • control module 12 performs a BIT of control module 12 functions.
  • the BIT function verifies that sensor transmitter 16 , receiver 18 , and the electronics of control module 12 and the rest of systems 10 and 30 are working properly.
  • indicators associated with every element tested will turn off for all sensors that pass the Built-In Test. If a sensor 14 repeatedly fails the BIT, it will automatically be taken out of service and the driver will be alerted of the failure and the need to service that particular sensor 14 .
  • when the vehicle is in motion, the system will perform BIT on all detector modules and integrate the results into the data acquisition process to ensure the integrity of the data being processed. This is accomplished by looking for road clutter signatures from each of the radar modules (i.e., forward-looking, side-looking, and rear-looking detectors). If the radar modules are working properly, they will always detect low-level return signals from the road surface while the vehicle is moving and will transmit information pertaining to these signals back to the control module.
  • the system will continue to function, bypassing the defective sensor. If the BIT detects a catastrophic failure, an error message will be displayed on the operator interface and the system will halt. The date, time, and results of the most recent BIT will be stored in Accident Reconstruction System memory if that option is installed. This integrated approach to BIT does not slow down the data acquisition process, and it ensures the integrity of all sampled data and the data communications from all sensors.
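A minimal sketch of the integrated road-clutter BIT might look like the following. The retry count, noise-floor figure, and function names are assumptions; the text specifies only that a module which repeatedly reports no clutter while moving is taken out of service and the driver is alerted.

```python
FAILS_BEFORE_REMOVAL = 3   # assumed retry count before removal from service

def update_bit(status, clutter_db, vehicle_moving, noise_floor_db=-90):
    """status: {"fails": int, "in_service": bool}; returns a driver message or None."""
    if not vehicle_moving or not status["in_service"]:
        return None
    if clutter_db > noise_floor_db:    # road clutter seen: sensor is alive
        status["fails"] = 0
        return None
    status["fails"] += 1               # no return signal while moving
    if status["fails"] >= FAILS_BEFORE_REMOVAL:
        status["in_service"] = False   # bypass the defective sensor
        return "sensor failed BIT: service required"
    return None

status = {"fails": 0, "in_service": True}
messages = [update_bit(status, -95, True) for _ in range(3)]
# the first two silent misses pass quietly; the third removes the sensor
```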
  • sensors 14 are provided within a wireless portable transducer system 40 .
  • the problem with that is that the number of trailers out there far exceeds the number of truck-tractors, and truck-tractors are constantly moving from trailer to trailer. It could easily reach the point where installing a complete collision avoidance system 10 or 30 on each combination of tractor and trailer would be prohibitively expensive.
  • a system 10 is constructed having a wireless portable system 40 .
  • FIGS. 8 a and 8 b show two embodiments of such portable systems.
  • each box 70 includes an antenna sticking out the side.
  • Each box 70 mounts under the trailer and clamps to the frame of the trailer.
  • Inside each box 70 is an ultrasonic transmitter and receiver, electronic circuitry, and a radio transmitter and receiver.
  • a two wire cable connects the trailer battery to the electronic circuitry to provide power.
  • a cable between each box provides common control signals from the radio transmitter/receiver such that signals from either rear mounted antenna control both Transducer Assemblies.
  • FIG. 8 b there is one long extrusion 72 with an antenna sticking out each side.
  • the extrusion clamps to the frame on the rear of the trailer.
  • the extrusion may be made of one piece, or two pieces (one within another) with a mechanism to adjust the width of the extrusion 72 to the width of the trailer.
  • a Transducer Assembly (transmitter and receiver) is mounted on each end of the extrusion.
  • the electronic circuitry including radio transmitter and receiver are mounted inside the extrusion.
  • a two wire cable connects the trailer battery to the electronic circuitry to provide power.
  • Signals to and from the boxes 70 and 72 are communicated to the control module of the collision avoidance system via the Wireless Communicator to detect, measure, and display distance to objects behind the trailer.
  • System 40 is designed so that it can quickly be disconnected from one trailer and moved to another trailer.
  • a Wireless Portable Transducer System provides for wireless communication between the electronics mounted in the cab of the vehicle and the Portable Transducer Array mounted on the rear of the trailer. Power to operate the Portable Transducer Array is provided by connecting in to existing power wiring provided to the trailer from the truck's electrical system.
  • the Portable Transducer Array could be made to be totally battery operated.
  • the Portable Transducer Array were designed using Micropower Impulse Radar, Doppler Radar or other alternative low-power technologies, the transmitting and receiving functions to measure distance to objects behind the vehicle would be low power and could operate on batteries built into the Portable Transducer Array.
  • the communications between the electronics in the cab of the vehicle and the Portable Transducer Array could also use Micropower Impulse Radar, Doppler Radar, or other alternative low-power technologies, thus enabling portability with built-in battery power. This solution will eliminate the need to tap into the truck's electrical system to power the Portable Transducer Array.
  • multiple sensors are designed into the wireless subsystem 40 to detect obstacles to the rear of the vehicle and on either side of the vehicle. Communication with control module 12 is via wireless digital signals. Control module 12 is designed to sense when the wireless portable sensor subsystem is not installed or is not functioning properly.
  • portable wireless sensor subsystem 40 is a tubular structure with integral electronics, battery pack, and sensors mounted internal or external to the structure. The unit would clamp on the trailer chassis or the underride bumper provided on the rear of many trailers. Antennas would be mounted on one or both sides of the wireless portable sensor subsystem protruding just outside the left and right edges of the trailer.
  • portable wireless sensor subsystem 40 is enclosed in two separate housings mounted at the left rear or right rear of the trailer. Again, quick connect mounting arrangements will be made to secure each unit to the trailer. A cable will interconnect each unit to allow the sharing of one battery pack, one controller, and one wireless transceiver.
  • the sensors on the trailer are hardwired together; however, communication between the sensors and the control module 12 is wireless.
  • a Transceiver Module will be mounted on the tractor and a second unit on the trailer.
  • the Transceiver Module on the trailer will receive its power from the tractor-trailer umbilical electrical cable. Electrical signals will be passed between tractor and trailer just like any non-wireless system with the exception that the signals will be converted to wireless communication and then reconverted back to their electrical form at the other end. This approach provides additional flexibility for the customer's needs.
  • drivers need to be able to detect objects directly in front of, or to the side of, the front of the vehicle.
  • one of the problems that buses have is the number of small children in front of and on the sides of the bus.
  • the only options provided to these drivers are mirrors angled to see the front of the bus. Even the use of angled mirrors, however, has only limited effectiveness.
  • forward-looking proximity detectors are provided in order to detect objects immediately in front of the vehicle (an area that is a blind spot for the driver).
  • Buses also have a problem with children that crawl under the bus to retrieve a dropped toy or ball. Bus drivers cannot always see these areas. To help prevent problems, in one embodiment, side-looking proximity detectors are positioned on the bus to monitor these areas.
  • forward-looking proximity detectors have a problem with clogging due to debris, dirt, ice, etc. accumulated while the vehicle travels down the road.
  • forward-looking transducers are typically needed only when the vehicle is stationary and about to move forward. It would, therefore, be advantageous to expose the forward-looking transducer to the elements in only those situations where they are needed.
  • a forward-looking Transducer with an Environmental Shield solves this problem in situations where the Transducer need not be active while the vehicle is in motion. While the vehicle is in motion, the shield covers the front of the Transducer Assembly, protecting it from contamination. When the vehicle stops, the system using this device will open the front door, thus enabling the Transducer Assembly to detect and measure the distance to all objects in front of the vehicle. Shortly after the vehicle starts to move, the system closes the Environmental Shield to protect the Transducers.
  • FIGS. 9 a - d demonstrate one way of solving this problem.
  • the solution is independent of the type of Transducer technology being used. However, the intended use is with ultrasonic Transducer Assemblies.
  • FIGS. 9 a and 9 b represent a side view and a front view of a mounting bracket with the Transducer Assembly 88 mounted via a Transducer Mounting Bracket 90 to Mounting Bracket Top Plate 92 .
  • Mounting Side Brackets are shown in place. Note the mounting holes in the flanges that protrude beyond the width of the mounting Bracket Side Plates 94 . These mounting holes are used to mount the completed assembly to the underside of the vehicle front bumper or chassis just behind the front bumper. In one embodiment, spacers are used to adjust the actual height of the overall assembly so as to provide an unobstructed opening for the Transducers to work properly.
  • shield 60 replaces the solenoid with a motor.
  • the motor rotates the shield cover out of position when the transducer is operating.
  • an identification can be computed for a tracked object.
  • the signal strength at an estimated range provides an estimate of radar cross section. From this radar cross section an initial categorization as a truck/large vehicle, a car or other can be determined. From scintillation combined with a small radar cross section the categorization as a human/large animal or a sign can be determined.
  • an Emergency Alarm is sounded to alert the driver to take action before damaging the vehicle.
  • an additional indicator on the operator interface 32 flashes to inform the driver that the alarm was caused due to lack of clearance.
  • the motor home is backing under an overhanging building roof 62 .
  • the MicroController in the control module 12 can calculate the distance of the roof overhang above the ground. Based on the required clearance, which is programmed into the Memory of control module 12 , the system can detect whether there is sufficient clearance for the vehicle. If there is not sufficient clearance, the Emergency Alarm will sound.
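The clearance check can be sketched as below, under a geometry the text does not spell out: an upward-tilted rear sensor measures slant range to the roof overhang, and the sensor's mounting height and tilt angle, the vehicle height, and a safety margin are treated as programmed parameters. All of these names and values are assumptions for illustration.

```python
import math

# Assumed geometry: overhang height = sensor height + slant range * sin(tilt).
def overhang_height_ft(slant_range_ft, tilt_deg, sensor_height_ft):
    return sensor_height_ft + slant_range_ft * math.sin(math.radians(tilt_deg))

def clearance_ok(slant_range_ft, tilt_deg, sensor_height_ft,
                 vehicle_height_ft, margin_ft=0.5):
    """True if the overhang clears the vehicle plus a programmed margin."""
    height = overhang_height_ft(slant_range_ft, tilt_deg, sensor_height_ft)
    return height >= vehicle_height_ft + margin_ft

# e.g. a 12 ft slant range at a 45 degree tilt from a sensor mounted 3 ft
# off the ground puts the overhang at roughly 11.5 ft: too low for a 12 ft
# motor home, so the Emergency Alarm would sound.
```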
  • systems 10 or 30 can be mounted on farm trucks. Farm trucks often pull up into close spaces with loading and unloading equipment, grain augers, and the like, and in some cases even have to straddle a grain auger in order to dump a load so that the grain auger can take the load away. That is a tough maneuvering situation.
  • software is provided which not only prevents accidents but also helps guide them into some of those tight maneuvering situations.
  • systems 10 and 30 sense the equipment the vehicle is trying to mate with and guides the driver such that they stay centered on that equipment. Such a system is shown in FIGS. 11 and 12 .
  • a grain auger example is given in FIG. 12 .
  • the example shown is that of a farm truck preparing to dump grain into a grain auger 55 .
  • the driver will activate a TruTrack switch on operator interface 32 .
  • the system will automatically measure the distance to the auger 55 , will calculate the transverse location of the auger 55 , will display this location on the bar graph, and will display the distance on the digital readout on the operator interface 32 .
  • the right rear Transducer has detected the auger 55 at a distance of 6.0 feet.
  • the left rear Transducer has detected the auger 55 at a distance of 6.6 feet.
  • the system will automatically calculate a perpendicular distance of 5.2 feet.
  • the system will also calculate the transverse location and display it on the bar graph as slightly right of center. With this information, the driver can make minor maneuvering corrections to keep the auger 55 centered.
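The TruTrack numbers above are consistent with simple two-range triangulation. The sketch below assumes a 7-foot transducer baseline (the actual spacing between rear transducers is a programmed parameter, as discussed earlier); with that assumption it reproduces the 5.2-foot perpendicular distance and the slightly-right-of-center offset from the 6.6 ft / 6.0 ft readings.

```python
import math

def locate(r_left_ft, r_right_ft, baseline_ft=7.0):   # assumed 7 ft spacing
    """Return (perpendicular distance, transverse offset; + = right of center)."""
    half = baseline_ft / 2.0
    # Left sensor at x = -half, right sensor at x = +half, object at (x, y):
    # r_left^2 - r_right^2 = 2 * baseline * x
    x = (r_left_ft**2 - r_right_ft**2) / (2.0 * baseline_ft)
    y = math.sqrt(r_right_ft**2 - (x - half)**2)
    return y, x

y, x = locate(6.6, 6.0)    # the distances from the example above
# y is about 5.2 ft perpendicular distance; x is about 0.5 ft right of center
```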
  • Side Display Module 36 provides visual feedback to the driver when looking in the direction of either side view mirror.
  • These modules may be mounted on the edge of the side view mirrors, or they may be mounted inside the cab in the approximate line-of-sight as the side view mirrors.
  • the Side Display Modules 36 ( FIG. 13 ) consist of a plastic housing, a small PCB Assembly with five LED indicators, two half-inch high seven segment displays, a cable which runs into the cab and connects to the rear of the Control module, and a clear plastic cover on the front of the module.
  • the Display Module 36 mounted on the left side of the cab is identical to the module mounted on the right side of the cab.
  • the seven-segment display drivers and LED driver will be located in the Control module.
  • the above diagrams show a distance reading of twelve feet ( 12 ′). Distance readings associated with the Forward-Looking Detector Subsystem will not be displayed on the Side Display Modules. Only Backup Mode rear distance readings will be displayed. If an alarm condition exists anywhere around the vehicle, all five LED's will flash. The LEDs are not meant to provide any detector-specific information. Similarly, in one embodiment, the graphics displays shown in FIGS. 6 a - c will flash a visual warning on detection of an alarm condition.
  • Collision avoidance is the primary goal in the application of advanced technology. Collision Avoidance as applied to truck vehicles can be defined in three categories:
  • the FLD 14.2 must operate reliably in a complex environment consisting of:
  • the Primary Mode which is concerned with the potential for accidents directly in the path of the vehicle, described above, and the Secondary Mode which includes the Primary Mode plus detection of objects to the right of a snow plow that could impact a wing plow.
  • the PD is designed to detect objects in the immediate perimeter of a tractor-trailer.
  • Radar modules are mounted in an array around the periphery of the cab and trailer.
  • FIG. 15 shows the location of each radar module and the area of coverage.
  • the proximity detector modules detect objects in the perimeter field and provide the data to the control module for processing. After pertinent data is derived, it is sent to a display where the driver is alerted to take appropriate action to avoid a collision.
  • the front modules will look for small children or objects immediately in front of the vehicle.
  • the right and left side mounted modules will detect vehicles, pedestrians, and objects that may not be clearly visible to the driver.
  • the rear mounted modules will monitor the area directly behind the vehicle.
  • a special case on snow plows requires that the center rear mounted RM be used to measure time-to-impact for vehicles approaching from the rear.
  • the CM will activate a pulsed high intensity light to warn the driver of the oncoming vehicle of the presence of the snowplow.
  • the PD Modules are selectively activated by control signals sent by the CM.
  • the conditions under which they are activated include:
  • Activate the rear, left, and right PD Module groups when the transmission is in reverse.
  • Master Clear initializes all electronics in the proximity detector.
  • the Rear Guard Detector 160 is functionally the same as the Proximity Detector. The main difference is that the RGD 160 covers the peripheral area around the trailer only. It is a portable system which can be moved from trailer to trailer and works in conjunction with the CM in the cab. Being portable, the RGD is self-powered and an RF link has been added to communicate with the CM in the cab. Configuration, location, and area of coverage are shown in FIG. 16 .
  • the functional interface for the RGD is identical to the PD except that the interface uses an RF link to transmit data to the CM rather than a hard-wired connection.
  • the CM sends activation signals as follows:
  • the Radar Modules are a combination of motion sensors available off-the-shelf, an amplifier and a signal processing chip. They come in three configurations: Type A with a motion sensor, an amplifier and a microcontroller; Type B with a motion sensor and an amplifier; Type C with a motion sensor with a big antenna, an amplifier and a microcontroller. A notional diagram of the Type C RM Interface Board 170 is shown in FIG. 17 .
  • the motion sensors are microwave motion sensors that operate in X-Band frequency range. These modules utilize a dielectric resonator oscillator and a microstrip patch antenna to achieve low current consumption, high temperature stability, high reliability, and flat profile.
  • Type A will include the radar, an op-amp circuit, and the RM interface board.
  • Type B will include the radar and an op-amp. Up to two type B RMs can be connected to a Type A.
  • the connection between a Type A and Type B will be a 4-wire cable.
  • the 4-wire cable will carry one wire for +12 volts, two for signal, and one for ground.
  • the housing for the Type A and Type B RMs should be similar or the same.
  • the Type A will have two connectors for the Type B inputs and one connector for connection to a serial port and for power.
  • the Type B will have one connector for output and power.
  • a Type A RM will distribute power to a maximum of three radar motion sensors, the onboard motion sensor and two Type B RMs.
  • the Type A will use up to 10 A/D ports on a microcontroller and sequentially sample data from each attached motion sensor.
  • the Type A will also perform a 64 point FFT on each set of 5 kHz sampled motion sensor data. The first 20 samples from the FFT results will be output via a serial channel.
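The Type A processing chain (5 kHz sampling, 64-point FFT, first 20 bins reported) can be illustrated as follows. A plain DFT stands in for the microcontroller's FFT, and the simulated 781.25 Hz Doppler tone is chosen to land exactly in bin 10 (bin width = 5000/64 = 78.125 Hz); the signal itself is synthetic.

```python
import math, cmath

def dft(samples):
    """Naive DFT, standing in for the microcontroller's fixed-point FFT."""
    n = len(samples)
    return [sum(samples[k] * cmath.exp(-2j * cmath.pi * i * k / n)
                for k in range(n)) for i in range(n)]

FS = 5000.0      # Hz sample rate, per the text
N = 64           # FFT length, per the text

# Simulated motion-sensor output: a Doppler tone at 781.25 Hz (= 10 bins).
samples = [math.cos(2 * math.pi * 781.25 * k / FS) for k in range(N)]

bins = [abs(v) for v in dft(samples)[:20]]   # only the first 20 bins are sent
peak_bin = bins.index(max(bins))             # the tone appears in bin 10
```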
  • the location of all sensors is important to the operation of the data fusion system. A typical installation will only use five Type A's but there is really no reason why all the sensors could not be Type A's. They can be put in any PD or RGD position.
  • the installer will set the CM into installation mode and select on the menu, through the programmer, the position of the first RM, Type A or B. The installer will then approach the selected sensor location and wave his/her hand within one inch in front of the antenna housing until a tone is heard from the CM, stop for five seconds, and then repeat the waving. The installer repeats this intermittent waving until the system gives a three-beep OK response; typically, waving at the sensor twice is sufficient. The installer will then proceed to the next RM. All RM's will be programmed in this fashion. This will allow the CM and Type A modules to coordinate the location of each sensor.
  • the software in the CM will send an initialization serial message to all Type A modules.
  • the software on the microcontroller will look for this message if it has not been assigned an address.
  • the software will perform a 64 point FFT every 300 milliseconds. The first 20 samples out of the FFT will be sent back to the control module if one of these samples crosses a threshold.
  • the CM will use this data to identify the RM which responded to the installer. Once the installer has gotten the three beep OK, the CM will send out an address number (1 to 15) to identify the RM's position (see FIG. 15 ).
  • the CM will also send out the Type A's position along the truck, height, transverse distance from the left front corner of the truck, and distance from the front of the truck.
  • the Type A will accept this data and store it in EPROM.
  • the Type A which is being positioned will then store the port number (connector) on which the signal was being received. This will allow the Type A to respond to this port number's address when polled by the CM.
  • the position of the sensors with respect to the tractor will be communicated to the CM, by the Type A's, upon startup.
  • when the CM initializes the system, the RM's will be polled (1 through 15).
  • Each Type A module will respond when its number or the number of an attached Type B, is polled.
  • the Type A module will send out location and other information about the RM.
  • the Type C RM is similar to a Type A. It contains the radar motion sensor, with a 16×2 pad antenna. The software samples 256 points of data from the onboard sensor. The data is fed to a 256-point FFT. The first 128 samples from the FFT results will be output via a serial channel. To distinguish between left and right Type C, the last pin on the left connector will be shorted to ground. This pin will not be used for anything else (power or signal). The wiring in the FLD enclosure will be fixed such that it cannot be confused and reversed. The FLD uses two Type C RMs.
  • One rear guard RM (#14-14) (the center one) will be configured to search a shorter range to assist in increasing back-up range accuracy.
  • This RM must be a Type A RM.
  • the CM will command this RM to sample either a unity gain op-amp channel or to sample the normal gain op-amp channel (every Type A will be able to do this). This will allow the RM to be used for long-range detection when the vehicle is not in reverse and short-range measurements when the vehicle is in reverse.
  • This same command from the CM will change the sampling rate on the A/D unity gain channel to 2 kHz when in reverse (provides a 2.5 times finer measurement of vehicle speed).
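The 2.5× figure follows directly from FFT bin width scaling with sample rate. The check below assumes a 10.525 GHz X-band carrier (a typical motion-sensor frequency; the text does not state one), though the carrier cancels out of the ratio.

```python
# Speed resolution of a Doppler radar: one FFT bin corresponds to
# delta_f = Fs / N in Doppler frequency, and v = f_d * lambda / 2.
C = 3.0e8            # m/s
F_CARRIER = 10.525e9 # Hz, assumed X-band carrier
N_FFT = 64           # FFT length, per the text

def speed_resolution_mps(sample_rate_hz):
    bin_hz = sample_rate_hz / N_FFT        # Doppler resolution per FFT bin
    wavelength = C / F_CARRIER
    return bin_hz * wavelength / 2.0

# Dropping the A/D rate from 5 kHz to 2 kHz narrows each bin by 5/2,
# i.e. a 2.5 times finer speed measurement, matching the text.
ratio = speed_resolution_mps(5000) / speed_resolution_mps(2000)
```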
  • the power distribution plan 180 is shown in FIG. 18 .
  • the vehicle battery powers the CM.
  • the +12 volts is filtered and fused in the CM.
  • the +12 volts is then supplied to the FLD and any Type A RM on the cab or truck without a trailer.
  • Trucks with a trailer will have power to all cab Type A RMs and to a transceiver.
  • the trailer will use either a set of PD's or an RGD.
  • the PD's will use the trailer's +12 power to supply the Transceiver/power convert module. This module will filter and fuse the +12 volts and convert the power to +3 volts for the transceiver and send the +12 volts out.
  • the +12 volt power will then be sent to all trailer Type A modules.
  • the Type A and B modules will DC to DC regulate the +12 down to +5 volts.
  • the RGD is powered by its own battery and will distribute power from this 12 volt battery the same as the PD.
  • Type B's send audio frequency signals to the Type A's.
  • Type A's send RS-485 at 19,200 Baud to either a transceiver or the CM directly.
  • the RS-485 cabling is T′ed between the Type A modules.
  • Type C's send data over a 400 Kbaud RS-485 interface. All RS-485 interfaces are two-way.
  • the cabling between the Type A and Type B consists of four wires: +12 volts, ground, and two for Signal.
  • the cabling between a Type A and the CM or a transceiver is four wires: two for RS-485, +12 volts, and ground.
  • the cabling between a Type C and the CM is four wires: two for RS-485, +12 volts, and ground.
  • the cabling between a transceiver and the CM is four wires: two for RS-485, +12 volts, and ground.
  • the pin configuration for the connectors is shown in FIG. 20 .
  • the Radar Modules may need to be calibrated.
  • a calibration fixture consisting of a fan permanently mounted to one end of a rectangular tube assembly will be used to program a gain characteristic number into the sensor microcontroller memory. This will be done for Type A and C modules.
  • the coding in the microcontroller will be put in manufacturing mode and will expect a specific return from the test assembly.
  • a number denoting the difference between the expected and the measured value, to the nearest dB, will be stored. This will be sent via the header message to the CM for use in signal processing.
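The stored calibration number might be computed as below. This is a sketch: the function name and the amplitude-ratio formulation are assumptions, since the text says only that the difference between the expected and measured values, to the nearest dB, is stored and sent to the CM.

```python
import math

def gain_offset_db(measured, expected):
    """Difference between measured and expected return, to the nearest dB."""
    return round(20.0 * math.log10(measured / expected))

# a module reading half the expected amplitude stores -6 dB, which the CM
# can then apply as a per-sensor correction during signal processing
offset = gain_offset_db(0.5, 1.0)
```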
  • the microcontroller will perform the following functions in firmware:
  • the FLD consists of two Type C Radar Modules.
  • the Block diagram of the FLD 190 is shown in FIG. 21 .
  • These two sensors are narrow beam motion sensors.
  • the beamwidth is 8.5 degrees at the 3 dB point of the antenna pattern.
  • the two sensors are pointed across each other as shown in FIG. 10 . This results in a 0 to 10 dB antenna pattern change for both of the sensors focused in a 10 foot column 300 feet in front of the truck.
  • the difference in antenna pattern gain will be used to differentiate between objects directly in front of the truck and objects not directly in front of the truck.
  • the RM alignment for the FLD 190 is shown in FIG. 22 . This alignment is such that the antenna gain is 10 dB lower than the peak at the edge of a 10-foot by 300-foot rectangle.
  • the Interface Board is built into the FLD Type C RM and is the primary interface between the RM and the CM.
  • the Interface Board uses chips from MicroChip Development Systems. These MicroChip chips will be used to perform the A/D, FFT/signal processing, and communications formatting for the messages.
  • the messages will either be parallel or serial depending on the most cost-effective method that meets the FLD to CM data rate requirements. These chips are powered by a +5 volt DC source and are programmable in C and assembly language.
  • the Interface Board performs four primary functions:
  • Timing: generates on/off power pulses to the radar modules, either to minimize power consumption or to meet FCC regulations. Timing between the two MicroChip A/D chips is handled by handshaking with the CM. This timing controls the sampling, FFT, and data transfer to the control module. Sample time for each FLD sensor is 25.6 ms for 256 samples of data at 10 kHz. Using two FLD sensors collecting data simultaneously and combining the data in the control module, the overall sensor report data rate would be approximately 50 milliseconds.
  • A/D: digitizes the FLD radar data.
  • the A/D function performs a ten-bit quantization of the incoming analog data.
  • Individual MicroChip A/D processors are used for each FLD sensor. This allows minimal latency and a faster overall sampling rate.
  • Three channels will be used on the A/D. The first channel will sample a high-gain op-amp output, the second channel will sample a low-gain op-amp output, and the third channel will sample the lowest-gain op-amp output.
  • the fourth and fifth channels will be used to set the reference voltage on the A/D. This will provide for 90 dB dynamic range when using a 10-bit A/D.
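The 90 dB figure is consistent with the following arithmetic: a 10-bit converter alone spans roughly 60 dB, and staggering the three op-amp channels by about 15 dB of gain each extends the usable range to about 90 dB. The 15 dB gain step is an assumption; the text states only the channel arrangement and the 90 dB result.

```python
import math

def adc_range_db(bits):
    """Dynamic range of an ideal N-bit converter: 20*log10(2^N)."""
    return 20.0 * math.log10(2 ** bits)

single = adc_range_db(10)            # one 10-bit channel: ~60.2 dB
GAIN_STEP_DB = 15.0                  # assumed separation between op-amp gains
total = single + 2 * GAIN_STEP_DB    # three staggered channels: ~90.2 dB
```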
  • Communications: provides a serial data interface to the Control module.
  • the first 128 samples from the FFT results, from each Type C in the FLD, will be output via a serial channel. This channel will be a two way communications link with the CM.
  • when the data is ready (i.e., the FFT is finished), the chip will wait for the command to send the data.
  • the data (256 bytes) will be transferred in less than 10 milliseconds. This equates to a data rate of 257,000 Baud of unpacked data.
  • Each pair of bytes will contain one 16-bit point of the FFT output.
  • a header message will accompany the data, identifying the RM being sampled. The Interface Design Specification will define this message.
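The transfer described above packs 128 16-bit FFT points into 256 data bytes behind an identifying header. A minimal sketch follows; the one-byte header layout and the unsigned big-endian packing are assumptions, since the actual message format is defined by the Interface Design Specification.

```python
import struct

def pack_fft_report(rm_id, fft_samples):
    """Pack 128 16-bit FFT points into a serial report.

    Hypothetical layout: one RM identification byte, then 256 bytes of
    payload, two big-endian bytes per FFT point.
    """
    if len(fft_samples) != 128:
        raise ValueError("expected the first 128 FFT samples")
    header = bytes([rm_id & 0xFF])
    data = struct.pack(">128H", *fft_samples)  # 256 payload bytes
    return header + data
```

At 257 bytes in under 10 milliseconds, this matches the quoted unpacked data rate of roughly 257,000 Baud.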
  • the purpose of the PD is to monitor the area around the periphery of the vehicle by detecting objects that could potentially be struck by the vehicle if it moved left, right or back and to provide distance information to the CM.
  • FIG. 23 presents the Block diagram for the PD 230 .
  • the PD uses Type A and B Radar Modules.
  • This array of radars will be interconnected in groups of up to three radars to a RM Interface Board which is used to sample all three RMs simultaneously and send the processed data to the CM upon request.
  • the multi-port interface card in the CM will cycle through each device sampling the information.
  • the object signal data from the RMs is digitized and sent by wire link to the CM for processing.
  • the CM will control the sampling.
  • FIG. 24 is a diagram of the timing for a single RM interface board and the associated RM's. The only unknown time is the time for performing the FFT and associated formatting of the data. It is not believed that this time approaches the idle time for the interface board.
  • the individual sensors are switched on for 12.8 milliseconds every 333 milliseconds (more than one sensor group will collect data at the same time). They are sampled at 5 kHz, giving 64 samples of data.
  • the CM multi-port Interface Card sequences through the PD RM interface boards until all fifteen RMs have been sampled. This function is repeated every 333 milliseconds.
  • An installation of the PD on a Cab and Trailer rig will require an RF link between the trailer and the cab.
  • the transceiver at the trailer will contain a power supply to derive +3 volts from the trailer power of +12 volts.
  • Hardwired installations will use a 4-wire cable between the CM and the Type A RMs.
  • the 4-wire cable will carry: +12 volts; power ground; two wires for the two-way serial communication.
  • the MicroChip PIC17C756 series microcontroller chip will be used for the Type A radar Module. This chip requires one oscillator at 4 MHZ. Using a serial EEPROM, the microcontroller will have identification encoded in it to provide RM ID back to the CM and know when to respond to CM commands.
  • the A/D function of the MicroChip will use up to 10 channels, sampling at a rate of 5 kHz for 64 samples.
  • the A/D will be switched on and off via the software in the MicroChip.
  • the collection will be synchronized with the other Type B RMs connected to the Type A (see FIG. 24 ).
  • the 10 channels used on the A/D are three for each Type B motion sensor and four for the on-board Type A motion sensor.
  • the first channel of the three for a motion sensor will sample a high gain op-amp output.
  • the second channel will sample a low gain op-amp output, with the third channel sampling the lowest gain op-amp. This will provide at least 90 dB of dynamic range necessary for close approach of objects.
  • the channels will be examined and when the high gain channel is at its maximum value, the second channel will be used in the signal processing.
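The gain-channel examination above can be sketched as a simple saturation test. The fallback to the third (lowest gain) channel and the use of the 10-bit full-scale count as the saturation test are assumptions.

```python
ADC_FULL_SCALE = 1023  # 10-bit A/D maximum count

def select_gain_channel(high, low, lowest):
    """Prefer the high gain channel; when it sits at its maximum
    (saturated) value, fall back to progressively lower gains."""
    if high < ADC_FULL_SCALE:
        return high, "high"
    if low < ADC_FULL_SCALE:
        return low, "low"
    return lowest, "lowest"
```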
  • When in reverse gear, the center rear facing RM will be set to use the unity gain op-amp only. This sensor will be sampled at a 2 kHz rate to get a more precise measurement of vehicle speed.
  • the software samples 64 points of data. This data is fed to a 64 Point FFT. At this time it is believed that the PIC17C756 series of chip is capable of performing a 64 point FFT in the required time.
  • the first 20 samples from the FFT results, for each attached RM, will be output via a serial channel. This channel will be a two-way communications link with the CM.
  • When the data is ready (the FFT is finished), the chip will wait for the command to send the data.
  • the data (60 bytes) will be transferred in about 30 milliseconds. This equates to a data rate of 19,200 Baud.
  • a header message will accompany the data, identifying the RM being sampled. The Interface Design Specification will define this message.
  • the PD consists of multiple Type A and Type B radar modules. Possible PD RM combinations include at least one Type A and up to two Type B modules.
  • Upon initialization, the CM will poll for each Type A RM.
  • the Type A RM when first powered up will check the two ports for Type B RMs and detect the existence of a RM. This data will be reported back during the CM's initial poll.
  • the CM will build a table in RAM of each RM and its position, for use when performing other detection and tracking functions.
  • the Type A RM's will have a code that will indicate to the CM the location of each RM in its suite.
  • the Type A RM will respond to the CM commands when it receives a message with its address in the header. These messages will be defined in the Interface Design Document.
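The initialization poll and the RAM table of RM positions might be built as in the following sketch. The table fields and location codes are illustrative placeholders; the real message formats live in the Interface Design Document.

```python
def build_rm_table(type_a_modules):
    """Build the CM's RAM table of radar modules.

    type_a_modules: mapping of Type A address -> (location code,
    number of attached Type B RMs reported at power-up).  Each Type A
    checks its two ports for Type B RMs and reports them during the
    CM's initial poll; the CM flattens that into one table for later
    detection and tracking functions."""
    table = []
    for addr, (location, n_type_b) in sorted(type_a_modules.items()):
        table.append({"addr": addr, "type": "A", "location": location})
        for port in range(n_type_b):
            table.append({"addr": addr, "type": "B", "port": port,
                          "location": location})
    return table
```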
  • the Radar Modules will be positioned around the truck according to the diagram in the System Specification.
  • the RMs on the side of the truck should not be more than 25 feet apart and not closer than 10 feet.
  • the RMs on the rear should be spaced such that one is at the center and the others are as far on the edge as possible.
  • the System Timing is shown in FIG. 26 (for 2-Type A's connected to 2-Type B's each).
  • the data collection takes 12.8 milliseconds for each RM.
  • the signal processing takes X milliseconds per RM.
  • the total data collection time and processing time for 15 RMs is X milliseconds.
  • the data transmission time is 160 milliseconds. To conserve time the transmission of data will be going on from one set of RM's while another is collecting and processing data.
  • the timing of the PD RM's will be integrated into the timing of the FLD when the FLD is in operation.
  • the CM will poll the FLD and receive an 8-millisecond burst of serial data.
  • the CM will then poll one Type A PD RM and get up to 30 milliseconds of data.
  • the CM then does signal and data processing for 12 or more milliseconds, processing the downloaded data. 50 milliseconds after the system polled the FLD, it repeats this sequence.
  • the trailer mounted transceiver module in the PD application will also provide the power filtering, fusing and regulation for the trailer mounted PD radar modules.
  • the RGD has a Type A RM located in the center of the three rear facing sensors, one for the pair of radar modules on the right, and one for the pair of radar modules on the left.
  • the Type A RMs output data into a RGD interface/transceiver, which sends the signal to the front cab.
  • a transceiver picks up the signal and converts it to a digital serial input to the CM.
  • a battery will be provided to power the RGD. This battery will be rechargeable and have a 5 Amp Hour capacity for a 25-day interval between recharging.
  • the RGD subsystem will be configured with three, five, or seven sensors. All of the RMs are mounted on one multi-detector array and will be mounted at the rear of the trailer. No electrical connection to the trailer will be required since it has a self-contained battery pack.
  • the MicroChip PIC17C756 series chip will be used for the Type A Radar Module. This chip requires one oscillator at 4 MHZ.
  • the MicroChip will have identification encoded in it to provide RM ID back to the CM and know when to respond to CM commands.
  • the A/D function of the MicroChip will use up to 10 channels, sampling at a rate of 5 kHz for 64 samples.
  • the A/D will be switched on and off via the software in the MicroChip.
  • the collection will be synchronized with the other Type B RMs connected to the Type A (see FIG. 27 ).
  • the 10 channels used on the A/D are three for each Type B motion sensor and four for the on-board Type A motion sensor.
  • the first channel of the three for a motion sensor will sample a high gain op-amp output.
  • the second channel will sample a low gain op-amp output, with the third channel sampling the lowest gain op-amp. This will provide at least 90 dB of dynamic range necessary for close approach of objects.
  • the channels will be examined and when the high gain channel is at its maximum value the second channel will be used in the signal processing.
  • When in reverse gear, the center rear facing RM will be set to use the unity gain op-amp only. This sensor will be sampled at a 2 kHz rate to get a more precise measurement of vehicle speed.
  • the software samples 64 points of data. This data is fed to a 64 Point FFT. At this time it is believed that the PIC17C756 series of chip is capable of performing a 64 point FFT in the required time.
  • the first 20 samples from the FFT results, for each attached RM, will be output via a serial channel. This channel will be a two-way communications link with the CM.
  • When the data is ready (the FFT is finished), the chip will wait for the command to send the data.
  • the data (60 bytes) will be transferred in about 30 milliseconds. This equates to a data rate of 19,200 Baud.
  • This chip performs a 10-bit A/D and 16-bit FFT.
  • a header message will accompany the data, identifying the RM being sampled. The Interface Design Specification will define this message.
  • the radar modules used in the Rear Guard will be the same as the PD.
  • the RGD has a special condition where one Type A module can be installed in the middle of the rear facing mounting bracket and the signal processing in the CM will be set to give longer range performance for a snow plow application.
  • the functional interface for the RGD is identical to the PD, including the use of the MicroChip chip set.
  • the RGD is equipped with the same RF link as is available on the PD.
  • a rechargeable 12-volt battery powers the RGD RF Link.
  • the power distribution from the trailer transceiver is the same as the PD's.
  • the RGD has a sleep mode to conserve battery power. If an activation signal is not received by the RGD for 20 seconds the system will go into standby or sleep mode. The CM will send out an activation command every five seconds when the RGD should be operational.
  • the RF transceiver module on the trailer and the microcontroller in each Type A RM controls the sleep mode.
  • the RF transceiver and microcontroller will have a sleep mode watch dog timer set to two seconds. When in sleep mode the transceiver will activate and search for a receive signal.
  • the CM will command a repeated transmit signal until the sleep mode stop data word is received. This signal will be used if it has been over 20 seconds since the RGD sent data to the CM.
  • the receiver in the trailer transceiver will come on for two milliseconds and search for the transmit signal. If one is received the transceiver will activate a serial message (controlled by the CM) to wake up the microcontrollers. When all microcontrollers have reported back the RGD operation will start. The entire wake up procedure will not take more than four seconds and will usually take less than two seconds.
  • the duty cycle is 1% while in sleep mode for power conservation.
  • the CM is a customized PC. Processor speed, memory, and interface drivers will determine the CM configuration based on a nominal set of performance requirements and hardware/cost tradeoffs.
  • a functional diagram of the CM 280 is shown in FIG. 28 .
  • the CM 280 consists of two primary elements: an Interface Card 282 and the Processor Board 284 .
  • the CM multi-port Interface 282 buffers the incoming data from the FLD 281 , PD 283 and RGD 285 , routes it to the Processor Board 284 , and routes control signals from the Processor Board 284 to the FLD 281 , PD 283 and RGD 285 .
  • the CM Multi-port Interface 282 routes the FLD Doppler spectrum to the Object Data Processing module 288 on the Processor Board 284 . It also routes the PD and RGD Doppler spectrums to the Detection Processing module 286 on the Processor Board 284 .
  • the CM performs the following signal/data processing functions: Object Data Processing (FLD only), Track Report Generator (FLD only), Detection Processing (PD and RGD only), Data Fusion, Situation Report Generator, Display Driver, and System Control.
  • Display Driver and the System Control function are the responsibility of ATI and will not be discussed in this document except where an interface exists with one of the other functions. The remaining functions are shown in FIG. 29 and discussed below.
  • the Object Data Processing module receives the 128 samples of FFT′d data (frequency domain signal) from each radar module. It processes these samples (256 per cycle) and determines the existence of objects for reporting (see FIG. 30 ). A cycle for the FLD is 50 milliseconds.
  • the two forward looking radar modules are treated separately for clutter removal. They are combined in the Multi-Object Detector.
  • the Object Data Processing module will output detections/velocities, associated signal strengths, pulse timing data, clutter estimates and clutter distribution.
  • the frequency domain signal will be analyzed and the clutter removed individually for each sensor.
  • the clutter is removed in four steps. These four steps are discussed in detail below. It should be noted that to perform accurate calculations in this process the height of each RM antenna is required along with the dimensions of the truck and the location of each sensor. This information will be programmed in the CM at the time of the system installation.
  • the first step is to compute a threshold versus frequency (speed) for the received spectrum.
  • This set of numbers (128, one for each sample) is computed from the speed of the truck and the height of the sensor above the road.
  • the road will reflect a certain amount of energy back to the sensor.
  • Each road surface type will reflect a different amount but an average amount of reflection will be used since height above the road is dominant.
  • the frequency spectrum of the clutter is related to the speed of the truck and the distance to the road.
  • the height of the sensor will be the strongest return and it will be at 0 Hz in the spectrum.
  • the 38.75 Hz return (1.25 mph) will be from a distance where the velocity component of the road is 1.25 mph.
  • the bin spacing is 38.75 Hz, thus the first 1.25 mph will appear in the first bin.
  • the next spectral bin will be from 1.25 mph to 2.5 mph and so on.
  • the distance to the road for each frequency will be pre-computed and an equation for clutter will use the resulting values (Equation 2).
  • a threshold multiplier (T m ) will start at 3 dB but will be determined through lab and initial field-testing. The result of this multiplication is then multiplied by the truck-speed-based road surface clutter.
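The threshold-versus-frequency computation described above can be sketched as follows, assuming a flat road, an average reflectivity constant, and inverse fourth-power range loss. The geometry (range to the road patch whose radial velocity matches each Doppler bin, strongest return directly below the sensor at 0 Hz) follows the text; the reflectivity constant and the 3 dB starting value for T m are placeholders pending the field-testing the text mentions.

```python
import math

MPH_PER_BIN = 1.25  # FLD Doppler bin spacing (38.75 Hz) expressed as speed

def clutter_thresholds(truck_mph, sensor_height_ft, t_m_db=3.0,
                       reflectivity_db=-20.0, n_bins=128):
    """One threshold per FFT bin, from truck speed and sensor height.

    For bin i, the road patch whose radial velocity component equals
    i * 1.25 mph satisfies cos(theta) = v_radial / v_truck, so its
    slant range is h / sqrt(1 - cos^2(theta)).  No road clutter is
    possible above the truck's own speed.  truck_mph must be > 0."""
    thresholds = []
    for i in range(n_bins):
        c = (i * MPH_PER_BIN) / truck_mph
        if c >= 1.0:
            thresholds.append(float("-inf"))
            continue
        rng = sensor_height_ft / math.sqrt(1.0 - c * c)
        clutter_db = reflectivity_db - 40.0 * math.log10(rng)
        thresholds.append(clutter_db + t_m_db)
    return thresholds
```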
  • Removal of weather related clutter will be done in steps two and three.
  • Rain clutter produces a distinct pattern in the frequency spectrum.
  • the 128-sample frequency spectrum will be examined for this pattern. If found, the threshold values at the frequencies where rain is present will be adjusted for the presence of rain.
  • the pattern is recognizable over time. Distinct lines at a constant velocity and no real discernable change in signal strength over several seconds will denote rain. This condition will be flagged and the most prevalent frequencies will be marked.
  • Snow will appear as colored noise. Several frequencies may have more noise than others, but in general the average noise will go up throughout the spectrum. The thresholds will be adjusted accordingly.
  • Step four is the search for specific clutter from stationary objects. This will be done by comparing the returns of both sensors at the same frequencies.
  • Coarse Clutter Removal: for all frequency bins i, if S 1,i > S 2,i + 30 dB, then Fr i = TRUE. (4)
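Equation (4) reduces to a per-bin comparison of the two sensors' returns; a direct transcription:

```python
def flag_coarse_clutter(s1_db, s2_db, margin_db=30.0):
    """Equation (4): set Fr_i = True for each frequency bin where
    sensor 1's return exceeds sensor 2's by more than 30 dB, marking
    likely clutter seen strongly by only one sensor."""
    return [a > b + margin_db for a, b in zip(s1_db, s2_db)]
```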
  • Road clutter from a specific object such as a sign or bridge will appear at one frequency. As the truck approaches if the object is on the side of the road the frequency will decrease and the signal strength in one sensor will decrease (with respect to the R 4 curve) while the signal in the other sensor will increase faster than expected.
  • the frequency spectrum, object clutter candidates, and clutter thresholds will be fed into a Multi-Object Detection algorithm.
  • This algorithm will be used to differentiate between multiple returns in the spectrum from objects that are and are not clutter.
  • This algorithm will be designed to offer up candidate detections, which when combined with other sensor and truck data can be used to determine the actual presence of an object.
  • the pair of FLD radar modules will be used in this algorithm to differentiate between clutter and objects not in front of the truck. Three steps are performed to find the candidate detections.
  • the first step is the application of the clutter thresholds (equation 2) to the entire spectrum and the elimination of the colored clutter.
  • C i is one if a threshold crossing was detected at the i-th Doppler bin
  • the second step is to detect the threshold crossings. These crossings will be compared to each other and to the estimated road/object clutter data.
  • the two sensors will be combined in this step. After the initial clutter removal the frequency spectrums of the two sensors will be compared for all threshold crossings. For a candidate detection to be declared, a threshold crossing must have occurred for each radar module at frequencies not more than ±1 FFT bin apart, with no more than 30 dB SNR difference (Fr flag).
  • the CM will tell the FLD when a turn is underway and the direction of the turn.
  • the sensor pointing in the direction of the turn will be allowed to have stronger detections. If the sensor in the opposite direction of the turn has a signal over 20 dB stronger, that crossing will not be accepted for detection.
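The two-sensor candidate test (crossings within ±1 FFT bin and no more than 30 dB apart) plus the turn gating might be sketched as below. The mapping of module 1 to the left side of the truck is an assumption for illustration.

```python
def candidate_detections(cross1, cross2, turn=None):
    """cross1/cross2: lists of (bin, snr_db) threshold crossings from
    the two forward-looking radar modules.  A candidate requires a
    crossing from each module within +/-1 FFT bin and no more than
    30 dB SNR difference; during a turn, a crossing is rejected when
    the module opposite the turn is over 20 dB stronger."""
    out = []
    for b1, s1 in cross1:
        for b2, s2 in cross2:
            if abs(b1 - b2) > 1 or abs(s1 - s2) > 30.0:
                continue
            if turn == "left" and s2 - s1 > 20.0:
                continue  # opposite-side module dominates: reject
            if turn == "right" and s1 - s2 > 20.0:
                continue
            out.append(((b1 + b2) // 2, max(s1, s2)))
    return out
```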
  • the final step in the Multi-Object Detector is to eliminate all but the five best detections.
  • This algorithm is the first stage of a tracker. The detections will be sorted by closing velocity and SNR. The objects that will most likely reach the truck first will be given highest priority. If over five detections exist the pre-tracker will then sort the detections further. The pre-tracker will compare the detection to objects already being tracked from previous sensor cycles. Those detections closest to existing tracks will receive priority within the original sort.
  • detections will be “associated” with the existing tracks. If more than one detection associates with a track the closest detection, in speed and SNR, will be marked as the best association with the track. More than one detection may associate with one track. Up to five detections will be passed on. These detections will be the objects that will reach the truck first and have been around the longest. In all cases the shortest time to impact will be given priority. Longevity will only be used to sort on detections that have closing speeds within 10 mph and SNR's within 20 dB of each other. That is if there are more than five detections, all at about the same speed, the associated detections with the longest track existence time will be output.
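The pre-tracker sort described above — shortest time to impact first, with track longevity breaking near-ties — can be sketched as follows. The field names are illustrative, and time to impact is approximated here by closing speed alone.

```python
def prune_detections(dets, max_out=5):
    """Keep the detections that will reach the truck first.

    dets: dicts with 'closing_mph', 'snr_db', and 'track_age' (cycles
    the associated track has existed; 0 if unassociated).  Near-ties
    with the leader (closing speeds within 10 mph and SNRs within
    20 dB) are resolved in favour of the longest-lived track."""
    if len(dets) <= max_out:
        return sorted(dets, key=lambda d: -d["closing_mph"])
    primary = sorted(dets, key=lambda d: -d["closing_mph"])
    lead = primary[0]
    near = [d for d in primary
            if lead["closing_mph"] - d["closing_mph"] <= 10.0
            and abs(lead["snr_db"] - d["snr_db"]) <= 20.0]
    rest = [d for d in primary if d not in near]
    near.sort(key=lambda d: (-d["track_age"], -d["closing_mph"]))
    return (near + rest)[:max_out]
```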
  • Objects extending over the track and across the road, such as bridges and signs will require special processing. This processing will use the combined SNR and examine stationary objects over time. The characteristics of the SNR over time from flat or non-complex objects such as a bridge or sign will be used to identify these objects.
  • the data collection and detection rate of the FLD is 50 milliseconds.
  • the Data Processing, Tracking and Data Fusion functions will use several cycles of detections to produce the best answers.
  • Third, the higher accuracy of a stable system is only required in computing the larger time to impact numbers.
  • the stable track will allow the system to “track” an object into the alarm area or region.
  • the same approach is used in the PD and RGD systems and is discussed later. These same four points apply to the PD and RGD, except the cycle time is six times longer.
  • an ID can be computed for a tracked object.
  • the signal strength at an estimated range provides an estimate of radar cross section. From this radar cross section an initial categorization as a truck/large vehicle, a car or other can be determined. From scintillation combined with a small radar cross section the categorization as a human/large animal or a sign can be determined.
  • the algorithm will compare the two forward looking radar module signals. This comparison will be a time-based comparison of a track's speed and SNR with the current associated detection. Tracks that are traveling slower than 20 mph will be designated human. All new tracks will be given a truck designator until more than one cycle of data has been gathered. The SNR will be averaged over time for each track.
  • RCS j ∝ ((Savg j )/R j −4 ) 1/2 (6)
  • the Object ID Algorithm will attempt to differentiate between small objects and large objects. This algorithm will help to eliminate false alarms from the side of the road and in front of the truck when approaching and during a turn by identifying the track as not a truck or a car.
  • a probability of correct ID and an associated confidence level will be computed for each ID. These parameters will be set from an equation empirically derived during testing of the system.
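One plausible reading of the Object ID rules above as a decision cascade. The RCS breakpoints are placeholders standing in for the empirically derived parameters the text says will come from system testing.

```python
def classify_track(avg_speed_mph, rcs_estimate, scintillation, cycles):
    """Illustrative ID cascade: new tracks default to 'truck'; slow
    tracks are designated human; a small, scintillating cross section
    suggests a human/large animal or a sign; larger cross sections are
    split between car and truck/large vehicle at a placeholder value."""
    if cycles <= 1:
        return "truck"                      # default until more data
    if avg_speed_mph < 20.0:
        return "human"
    if rcs_estimate < 1.0:                  # placeholder small-RCS cut
        return "human/animal or sign" if scintillation else "other"
    return "truck/large vehicle" if rcs_estimate > 50.0 else "car"
```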
  • the candidate detections will be combined with the object ID data and analyzed for the presence of an object for which a report should be generated.
  • Stable object reports or tracks will be required to achieve the range estimate accuracy desired.
  • the object reports will be of the closest object determined to not be a false object. There will be up to five reports every 50 milliseconds.
  • the object reports will be sent to the Data Fusion algorithm for further processing.
  • the trackfiles generated by the Multi-Object Detector contain kinematics from previous tracks and, for each track, an associated detection (if one was available). The track and the detection need to be merged.
  • the first step in merging the track and detections is the reduction of detections associated with track.
  • the Condition Checking function will eliminate all but the best detection. If more than one detection associates with a track, this function will compare the SNR of the track to the SNR of the Detection and compare the closing rate of the track with the closing rate of the detection. The closest match in closing rate with a reasonable match in SNR will be correlated with the track and the other detections will be made available to create a new track or correlate with another track.
  • the correlated track/detection data will be used to maintain the tracks.
  • a new track will be created from detections not associated with tracks. New tracks will be kept for up to five radar module cycles (50 milliseconds per cycle). If a second detection is associated with a new track before five cycles without an association, the new track is made a hold track. Hold tracks must experience 10 cycles in a row of no detection associations before they are eliminated.
  • the Track Maintenance function will apply these rules and output a set of trackfiles containing new and hold tracks. The trackfiles will be identified as coming from the FLD. (See FIG. 21 , Track Report Generator Functions.)
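The track lifetime rules above (a new track survives up to five cycles without a second association; a hold track is eliminated only after ten consecutive cycles with no association) can be sketched per cycle. The dict-based track representation is an assumption.

```python
NEW_TRACK_MAX_MISSES = 5    # cycles a new track survives unconfirmed
HOLD_TRACK_MAX_MISSES = 10  # consecutive empty cycles before deletion

def update_track(track, detection_associated):
    """One maintenance cycle for a track {'state': 'new'|'hold',
    'misses': int}.  An association promotes the track to 'hold' and
    resets its miss count; returns None once the limits are exceeded."""
    if detection_associated:
        return {"state": "hold", "misses": 0}
    track = dict(track, misses=track["misses"] + 1)
    limit = (NEW_TRACK_MAX_MISSES if track["state"] == "new"
             else HOLD_TRACK_MAX_MISSES)
    return None if track["misses"] >= limit else track
```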
  • the PD and RGD are not as sophisticated as the FLD.
  • the clutter processing will be a simpler version of the FLD processing.
  • the detection process will also be a simpler version of the Multi-Object Detection algorithm used in the FLD.
  • the functional design is shown in FIG. 32 and is discussed below.
  • the Detection Processing module receives the 20 samples of FFT′d data from the Interface boards. There can be up to 15 sets of data. Each set of data will be clutter processed individually. The clutter processing will have the same functions as described earlier for the FLD but the functions will be adapted to the PD or RGD requirements.
  • the clutter will have a different spectral characteristic for each radar module view angle.
  • the forward looking radar modules will have similar clutter to the FLD.
  • the side looking sensors will have clutter which is lower in frequency and for the radar modules closest to and viewing the side of the road the clutter will be stronger.
  • the rear modules will have the most action from objects approaching the truck at low speeds relative to the truck's speed. All of these specific conditions will be addressed in the Threshold Computation Function and the Road/Object Clutter Location Function.
  • the frequency domain signal will be analyzed and the clutter removed individually for each sensor (see Equation 1).
  • the clutter is removed in four steps.
  • the first step is to compute a threshold versus frequency (speed) for the received spectrum. This set of numbers (20, one for each sample) is computed from the speed of the truck and the height of the sensor above the road.
  • the frequency spectrum of the clutter is related to the speed of the truck, the distance to the road and the view angle of the radar module.
  • the height of the sensor will be the strongest return and it will be at 0 Hz in the spectrum.
  • the 31 Hz return (1 MPH) will be from a distance where the velocity component of the road is 1 MPH.
  • the next spectral bin will be from 1 MPH to 2 MPH and so on.
  • the distance to the road for each frequency will be pre-computed and an equation for clutter will use the resulting values (see Equation 2 except the bin spacing is 31 Hz versus 38.75 Hz).
  • the values in this equation are sensor height and truck speed.
  • Snow will appear as colored noise. Several frequencies may have more noise than others, but in general the average noise will go up throughout the spectrum. The thresholds will be adjusted accordingly.
  • Step four is the search for specific clutter from non-moving objects. This will be done by flagging large returns (see Equation 4). Objects that are stationary will appear at specific frequencies in the spectrum. Depending on the angle to the object the frequency and amplitude will change. Objects on the side of the road at close ranges will appear stronger in one sensor and at frequencies lower than the speed of the truck. These objects and their frequencies will be noted for processing later in the data fusion function.
  • After clutter reduction, the frequency spectrum and clutter thresholds will be fed into a Multi-Object Detection algorithm. This algorithm will be used to detect multiple objects in the presence of road, snow and rain clutter. It will be designed to offer up candidate detections which, when combined with other sensor and truck data, can be used to determine the actual presence of an object.
  • the first step is the application of the clutter thresholds to the entire spectrum and the elimination of the colored clutter. If a particular frequency bin exceeds the threshold it will be stored for later processing as a detection candidate. If a certain segment of frequency bins produces an excessive number of detections the thresholds will be raised in that region and the strongest detections will be reported.
  • the second step is to detect the threshold crossings (see equation 5). For a detection to be declared a threshold crossing must have occurred for one radar module. This step will output no more than the two strongest candidate detections.
  • the final step in the Multi-Object Detector is to eliminate all but the 15 best detections.
  • This algorithm is the first stage of a tracker.
  • the detections will be sorted by closing velocity, SNR and radar module of origin. The objects that will most likely reach the truck first will be given highest priority. If over 15 detections exist the pre-tracker will then sort the detections further.
  • the pre-tracker will compare the detection to objects already being tracked from previous sensor cycles. Those detections closest to existing tracks will receive priority within the original sort. These detections will be “associated” with the existing tracks (See the pre-tracker of the Signal Data Processing Functions Section). More than one detection can associate with a track. Up to 15 detections will be passed on. These detections will be the objects that will reach the truck first and have been around the longest. In all cases the shortest time to impact will be given priority. Longevity will only be used to sort on detections that have closing speeds within 10 mph and SNR's within 20 dB of each other.
  • the Detection Processing module will output detections/velocities, associated signal strengths, pulse timing data, clutter estimates and clutter distribution.
  • the Data Fusion algorithm will be designed to take inputs from an N-out-of-M tracker. This Data Fusion algorithm is specifically designed to not require any specific set of sensors; it adapts as sensors are added, using a lookup table of the new sensor parameters and an indication of the number and type of sensors added.
  • the Data Fusion Algorithm can also take into account any data from the host vehicle supplied by various sensors. The absence of data will not cause a problem with the algorithm, however, the more data the better the performance.
  • the purpose of the Data Fusion Algorithm is to reduce all of the tracks and detections down to a small set of object tracks representing the objects surrounding the truck. Each radar module and sensor set may detect the same object. It is the task of the Data Fusion Algorithm to sort this out.
  • the algorithm uses a technique called Deepest Hole to combine the data from multiple sensors and Kinematics Combination to fuse this data together.
  • the Data Fusion functions are shown in FIG. 33 .
  • the purpose of the Data Fusion Algorithm is to reduce all of the tracks and detections down to a small set of object tracks representing the objects surrounding the truck.
  • Each radar module and sensor set (FLD, PD, and RGD) may detect the same object. It is the task of the Data Fusion Algorithm to sort all of this out. This algorithm is described below.
  • the Deepest Hole function “associates” tracks from the sensor sets with existing Fused Tracks. It is assumed that multiple sensor sets may find the same object and that multiple radar modules within a sensor set will often see and report on the same object.
  • the Deepest Hole function will resolve these redundant tracks into a set of fused tracks, one per object.
  • the output of this function is a list of track links linking the tracks from multiple radar modules together for each object.
  • the purpose of this function is to match new sensor data with current tracks (multi-sensor track or MST). Matched sets of MST and sensor data are found by operating on the agreement matrix with a heuristic search algorithm.
  • the agreement matrix contains the normalized distances (referred to as “standard differences”) between shared state variables calculated for every possible combination of sensor and MST track.
  • the “deepest hole” search algorithm finds the set of matches between rows and columns of the agreement matrix to minimize the sum of matrix elements found at the intersection of matched rows and columns.
  • the standard differences are calculated for every possible combination of MST and sensor track.
  • An agreement matrix is built which contains MST tracks as the first index (rows) and sensor tracks as the second index (columns).
  • the standard difference for each MST/sensor pair is put into the appropriate cross-index position.
  • the standard difference is the sum of the squares of the differences in shared state variables, normalized by the sum of the state variances and the number of variables shared by the two tracks. In equation form, SD = (1/N) * sum over i of (X mst,i − X sen,i)^2 / (V mst,i + V sen,i), where:
  • N is the number of shared state variables
  • X mst, X sen are the vectors of shared MST and sensor track state variables
  • V mst is the vector of MST track state variances corresponding to X mst
  • V sen is the vector of sensor track state variances corresponding to X sen
  • The largest value found (i.e., the deepest hole) is 0.5, corresponding to Row 2, or Sensor Detection 2.
  • This row is then examined to find the corresponding column (MST track) which has the minimum standard difference.
  • the minimum is 2.0 corresponding to column 1.
  • 2.0 indicates that MST Track 1 is the closest to sensor detection 2 .
  • MST track 2 is the next closest to sensor track 2 , with a standard difference of 2.5.
  • The larger the gap between the two smallest standard differences, the more likely it is that the actual match is the minimum value found (2.0 in this case); that is why the most probable match is determined from the largest gap between standard differences. Therefore, it is concluded that MST track 1 and sensor detection 2 are a probable match.
  • Row 2 and column 1 are now removed from the matrix (step 6) and the entire procedure repeated on the reduced matrix (step 7).
  • the reduced matrix is:

                      MST Tracks
                       2     3     4
        Sensor    1   1.8   3.1   2.3
        Tracks    3   5.4   1.5   1.9
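The matching procedure above (build the agreement matrix, find the row whose two best matches are furthest apart, pair it with its minimum column, then reduce and repeat) can be sketched as follows. The function name is an assumption, and only row 2's values (2.0 and 2.5) and the reduced matrix come from the example in the text; the remaining matrix entries below are filled in for illustration.

```python
def deepest_hole_match(agreement, row_ids, col_ids):
    """Greedy 'deepest hole' association on an agreement matrix.

    agreement[i][j] holds the standard difference between sensor
    track row_ids[i] and MST track col_ids[j]; lower means closer.
    Repeatedly pick the row whose two smallest entries are furthest
    apart (the 'deepest hole'), match it to its minimum column,
    remove both, and repeat on the reduced matrix.
    """
    matches = []
    rows = list(range(len(row_ids)))   # mutable working copies so the
    cols = list(range(len(col_ids)))   # caller's data is untouched
    while rows and cols:
        best_row, best_gap = None, -1.0
        for i in rows:
            vals = sorted(agreement[i][j] for j in cols)
            # With one column left, treat the gap as infinite.
            gap = vals[1] - vals[0] if len(vals) > 1 else float("inf")
            if gap > best_gap:
                best_row, best_gap = i, gap
        j_min = min(cols, key=lambda j: agreement[best_row][j])
        matches.append((row_ids[best_row], col_ids[j_min]))
        rows.remove(best_row)
        cols.remove(j_min)
    return matches

# Rows: sensor tracks 1-3; columns: MST tracks 1-4.  Entries other
# than row 2's 2.0/2.5 and the reduced matrix are assumed.
A = [[2.2, 1.8, 3.1, 2.3],
     [2.0, 2.5, 3.0, 4.0],
     [3.0, 5.4, 1.5, 1.9]]
print(deepest_hole_match(A, [1, 2, 3], [1, 2, 3, 4]))
# first pairing is sensor detection 2 with MST track 1, as in the text
```

After the first pairing the algorithm continues on exactly the reduced matrix shown above.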
  • the track data from the tracks, which are linked, will be merged in this function.
  • the speeds from each track will be averaged together.
  • the SNRs will be averaged using a weighted average considering radar module antenna gain (the FLDs and potentially one RGD will have 15 to 20 dB more gain than the other radar modules).
  • the range estimate for the new merged track will be handled by the Range Estimator Function.
  • the ID will be merged using the Probability of Correct ID and the ID Confidence.
  • the kinematics merge process consists of multiple passes, one pass for each sensor being processed on a given cycle.
  • the algorithm acts as a sensor track combiner and does not provide additional filtering to the sensor data. Given that only radar sensors with differing beamwidths are being considered, the merge process behaves as follows.
  • Let X(k) represent a state vector at cycle k.
  • X could be any of the vectors (N, Ndot)^T, (E, Edot)^T, (D, Ddot)^T, or (R, Rdot)^T, where the superscript T indicates transposition.
  • X_M(k) = X_M^(1)(k) + W(k)[X_S^(i)(k) − X_M^(1)(k)]   (8), where X_M is the merged (MST) state vector, X_M^(1) is the merged state entering the current pass, X_S^(i) is the state vector from the sensor processed on the current pass, and W(k) is the merge weight at cycle k.
  • the fusion process generates an MST track with lower variances.
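The per-pass merge of equation (8) can be sketched as below. The variance-ratio weight and the fused-variance formula are assumptions (a standard minimum-variance combination, chosen to be consistent with the statement that fusion lowers the MST track variances); the patent does not specify W(k), and the function name is illustrative.

```python
def merge_state(x_m, v_m, x_s, v_s):
    """One pass of the kinematics merge, in the spirit of equation (8).

    x_m, v_m: merged (MST) state vector and its variances so far.
    x_s, v_s: sensor track state vector and its variances.
    The weight W is assumed to be the usual variance ratio, so the
    lower-variance source dominates; the fused variance is then
    always below either input variance.
    """
    merged, variances = [], []
    for xm, vm, xs, vs in zip(x_m, v_m, x_s, v_s):
        w = vm / (vm + vs)                     # assumed weighting scheme
        merged.append(xm + w * (xs - xm))      # X_M = X_M1 + W*(X_S - X_M1)
        variances.append(vm * vs / (vm + vs))  # fused variance is smaller
    return merged, variances

# Merging a (R, Rdot) pair with equal variances reduces to averaging:
x, v = merge_state([100.0, -5.0], [4.0, 1.0], [102.0, -5.5], [4.0, 1.0])
print(x, v)  # [101.0, -5.25] [2.0, 0.5]
```

Note the fused variances (2.0, 0.5) are half the inputs, matching the text's claim that the fusion process produces an MST track with lower variances.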
  • Range Estimation consists of three steps. These steps are designed to achieve maximum range resolution without using time measurement as a tool.
  • the estimator works using the principle behind radar wave propagation: received power falls off with range to the fourth power relative to the transmitted power. That is, the radar signal drops in strength by the square of the range on the way out to the object, and again by the square of the range on the return to the receiver. Thus, when range goes from 300 feet to 100 feet there is an increase in received power of 81 times (3 to the fourth power). This change in power can be measured, and it is greater than the changes due to object size or perceived object size (which is angle dependent).
  • the range to an object can be estimated by following the curve of the received power over time. This is why tracks are formed in previous functions. The tracks give a time history of the received signal which will be used in the range estimate.
  • MADR (Multi-Hypothesis Automatic Difference Ranging) produces the MADR range to the track.
  • the first step in MADR is to apply the SNR history to the radar range curve fit program.
  • An algorithm dubbed “Automatic Ranging” will be used to establish this first range estimate.
  • MADR will estimate the starting range of a track based on the SNR history. This starting range will be added to distance traveled (a negative number for a closing object) and a current range estimate will be computed.
  • the MADR algorithm is discussed in detail below.
  • a range estimate will be calculated from the signal strength versus time and closing rate of each tracked object. Due to the properties of radar, as an object changes its relative position to the host vehicle, the signal strength will vary, by range to the fourth power and by a scattering property called scintillation. Combining this signal strength property with the distance traveled by the object will yield the starting range and thus the current range to the object. The distance traveled by the object is computed in the sensors by combining time since the data collection and tracking started, with the individual measured closing rates versus time. Using multiple hypotheses the signal strengths versus time will be inserted into an algorithm which matches the hypothetical signal strength curve to a (range and distance traveled)/(range to the fourth) curve.
  • FIG. 34 shows an example of multiple hypotheses through amplitude versus time data.
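A minimal sketch of the multi-hypothesis fit described above follows. It assumes the hypotheses are a grid of candidate starting ranges and that the signal model is the one-over-range-to-the-fourth law expressed in dB; the candidate grid, function name, and least-squares scoring are illustrative assumptions, not the patent's exact procedure.

```python
import math

def madr_estimate(snr_db, dist_traveled, r0_candidates):
    """Multi-hypothesis starting-range fit from an SNR time history.

    snr_db[t]: measured SNR in dB at sample t.
    dist_traveled[t]: signed distance the object has moved since
    tracking began (negative for a closing object, per the text).
    For each hypothesized starting range R0, the expected SNR shape
    is C - 40*log10(R0 + d(t)) (the range-to-the-fourth law in dB);
    C is fit as the mean offset, and the hypothesis with the
    smallest squared residual wins.
    """
    best_r0, best_err = None, float("inf")
    for r0 in r0_candidates:
        ranges = [r0 + d for d in dist_traveled]
        if min(ranges) <= 0:
            continue  # hypothesis implies the object passed the sensor
        shape = [-40.0 * math.log10(r) for r in ranges]
        c = sum(m - s for m, s in zip(snr_db, shape)) / len(snr_db)
        err = sum((m - (c + s)) ** 2 for m, s in zip(snr_db, shape))
        if err < best_err:
            best_r0, best_err = r0, err
    # Current range = starting range plus (negative) distance traveled.
    return best_r0, best_r0 + dist_traveled[-1]

# Noiseless example: an object starts 300 ft out and closes 10 ft per
# sample for 20 samples; the fit recovers the starting range.
dist = [-10.0 * t for t in range(21)]
snr = [100.0 - 40.0 * math.log10(300.0 + d) for d in dist]
print(madr_estimate(snr, dist, range(100, 501, 10)))  # → (300, 100.0)
```

Real SNR histories are corrupted by scintillation, which is why the text smooths the SNR curve and keeps a history of range estimates before committing to one.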
  • Automatic Ranging (AR) is a technique first used by the Navy in the late 1970s to passively estimate range to a target; AR's methodology determines where the emitter is. This application is substantially different, but it can still use the same principles.
  • the key to AR is its implementation which uses both known signal strength relationships and the computational power of the digital computer.
  • AR's ranging technique can be applied to any number of problems where a measurable parameter varies in some non-linear manner while other measurable or assumed parameters are linear.
  • AR's basic implementation concept and its application to the anti-collision warning system is described in the following paragraphs.
  • the second method is to assume a constant radar cross section for an object.
  • the RCS will be derived from a look-up table and the track ID.
  • the SNR time history curve will be smoothed.
  • the final step will be to resolve the two range estimates.
  • the resolution will be dependent on the history of range estimates for the subject track, the ID of the track, the quality of the SNR history (noise on the SNR curve) and the quality of the track ID.
  • This algorithm receives trackfiles from the Data Fusion algorithm and range estimates from the Range Estimator. This data is compared to the reporting criteria established by a lookup table in the CM.
  • the lookup table will be mode and RM/sensor system (FLD, PD, RGD) dependent. Depending on the RM(s) reporting and updating the Fused Trackfile, the lookup table will determine whether the track should be formatted and reported. This lookup table will be created and updated by ATI.
  •                                     Alarm Threshold   Warning Threshold
        Range FLD                       <3 feet           <10 feet
        Range front PD                  <3 feet           <6 feet
        Range rear PD/RGD (fwd gear)    —                 <6 feet
        Range rear PD/RGD (rev gear)    <.5 feet          <2 feet
        Range side PD/RGD (turning)     <12 feet          <24 feet
        Time to impact FLD              <3 seconds        <6 seconds
        Time to impact rear PD/RGD      <3 seconds        <6 seconds
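The reporting criteria in the table above could be held in a lookup structure along these lines. The key names and the classification helper are illustrative assumptions; the actual CM lookup table is also mode and RM/sensor dependent, which this sketch omits.

```python
# Thresholds from the table above, keyed by criterion.
# Values are (alarm, warning); None means no alarm threshold applies.
# Units are feet for ranges and seconds for time to impact.
THRESHOLDS = {
    "range_fld":           (3.0, 10.0),
    "range_front_pd":      (3.0, 6.0),
    "range_rear_fwd_gear": (None, 6.0),
    "range_rear_rev_gear": (0.5, 2.0),
    "range_side_turning":  (12.0, 24.0),
    "tti_fld":             (3.0, 6.0),
    "tti_rear":            (3.0, 6.0),
}

def report_level(criterion, value):
    """Classify a measured range or time-to-impact against the table."""
    alarm, warning = THRESHOLDS[criterion]
    if alarm is not None and value < alarm:
        return "alarm"
    if value < warning:
        return "warning"
    return "none"

print(report_level("range_fld", 2.0))           # alarm: under 3 feet
print(report_level("range_rear_fwd_gear", 1.0)) # warning: no alarm band
```

A track that classifies as "none" under every applicable criterion would simply not be formatted and reported.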
  • the Built In Test (BIT) function will be performed in the CM on all of the system's components.
  • the BIT software will be designed to exercise the sensors such that a known preset performance can be measured. If a fault occurs, the faulty component will be recycled and retested. If the fault persists, a permanent record of the condition and the associated component will be stored in flash memory in the CM and the condition routed to the display processor by the BIT software.
  • the other CM functions will always assume a fully functional system unless BIT informs them of a faulty component. This fault detection will be at a 50-millisecond rate for the FLD and a 333-millisecond rate for the PD and RGD.
  • Each RM has a distinct clutter response from the surface and an identifying code in its digitized signal being fed to the CM.
  • BIT initiate will cause the BIT software to poll each RM for the FLD, PD and RGD. If an individual RM is faulty, the BIT software will identify the faulty RM through a comparison of the clutter return to the expected clutter return. If the faulty RM is in the FLD, BIT will inform the Object Data Processing module that specific RM is no longer functional. The Object Data Processing module will then revert to a degraded mode. If the faulty RM is in the PD or RGD, BIT will inform the Detection Processing module, which will revert to a degraded mode.
  • If BIT detects a fault in each of the RM responses connected to a specific interface board, or if no response is received from a specific interface board, then BIT will assume that interface board has failed.
  • the failure of the FLD Interface Board will result in complete loss of the FLD capability and this will be reported to the Object Data Processing module.
  • the failure of a PD or RGD will be reported to the Detection Processing module, which will revert to a degraded mode.
  • BIT will inform the Display Processor of all failures and their severity, so that the operator is aware of the system status.
  • the CM initiates the Master Clear function.
  • the FLD, PD and RGD will reinitialize all functions. All signal processing will cease and be restarted.
  • the A/D and FFT functions will continue to operate.
  • a watchdog timer set to 1 second will be used to detect a reset condition in the RMs.
  • Upon receiving a time out (no serial request from the CM in the last second), the microcontroller will be reset. All message formatting will stop and any existing but unsent messages will be cleared.
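The watchdog behavior described above can be sketched as follows. This is an illustrative model only (the real RM would implement it in microcontroller firmware); the class and method names are assumptions.

```python
import time

class WatchdogTimer:
    """Model of the RM watchdog: if the CM has not polled the radar
    module within the 1-second timeout, the module resets and any
    formatted but unsent messages are cleared."""

    TIMEOUT_S = 1.0

    def __init__(self, now=time.monotonic):
        self._now = now          # injectable clock, useful for testing
        self._last_poll = now()
        self.outbox = []         # formatted but unsent messages

    def on_cm_request(self):
        """Called whenever a serial request arrives from the CM."""
        self._last_poll = self._now()

    def check(self):
        """Periodic check; returns True if a reset was triggered."""
        if self._now() - self._last_poll > self.TIMEOUT_S:
            self.outbox.clear()              # drop unsent messages
            self._last_poll = self._now()    # restart after the reset
            return True
        return False
```

With an injected fake clock, advancing time past one second without a call to `on_cm_request` causes `check()` to report a reset and empty the outbox.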

Abstract

A collision avoidance system including a control module, a first transmitting device connected to the control module, wherein the first transmitting device transmits a signal, a first receiving device connected to the control module, wherein the first receiving device receives a return of the signal transmitted from the first transmitting device and transmits a first return signal representative of the return to the control module, a second transmitting device connected to the control module, wherein the second transmitting device transmits a signal, and a second receiving device connected to the control module, wherein the second receiving device receives a return of the signal transmitted from the second transmitting device and transmits a second return signal representative of the return to the control module, wherein the control module includes measurement circuitry used to measure the first and second return signals and display means for displaying a transverse location of an object as a function of said first and second return signals.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates generally to sensor-based systems, and more particularly to a multi-sensor collision avoidance system which combines data from two or more sensors to provide range, range rate, or location information.
  • 2. Background Information
  • The roads are becoming more and more congested with vehicular traffic. As traffic congestion has increased, the number of accidents has also increased. Some of these accidents can be traced to driver inattentiveness or to the failure of the driver to see another vehicle. What is needed is a system and method for warning drivers of possible problems before the problems result in an accident.
  • Systems for making drivers aware of objects external to their vehicle have been around for a long time. Mirrors, and sometimes combinations of mirrors, are being used to reveal locations hidden to the driver's view (i.e. “blind spots”). Mirrors, however, have a deficiency in that the driver can only look in one spot at any one time. If they look behind the vehicle, see that the way is clear, start looking elsewhere and then a vehicle pulls behind them, they won't see it and may back into the vehicle. There is a similar problem with changing lanes. Mirrors don't work well in changing lanes, particularly in tractor-trailer rigs, since, as soon as the rig begins to turn, the mirror that looked down along the side of the vehicle is directed into the side of the trailer and the driver is blinded to activity on that side of his truck.
  • More recently, trucking and bussing companies have used backup alarms to warn bystanders that the truck is backing. The problem with backup alarms is that a bystander with a hearing problem may not hear them, and an immovable object such as a car or a trash container is not going to move out of the way regardless of the alarm.
  • Companies have also experimented with the use of video systems to view blind spots. For example, garbage pickup trucks for Browning-Ferris are using video systems which have a video camera installed on the back of the truck and a monitor up in the cab. Some recreational vehicle (RV) owners are doing the same thing. The problem with the video system approach is that such systems are expensive (even if you use an inexpensive approach, it would likely cost into the $1,000-$1,500 price range) and video monitors mounted in the cab can distract the driver from what is happening outside his vehicle. Finally, video lenses do not give depth perception. So, when drivers are backing a vehicle, they don't know how close they are to an object they are trying to avoid.
  • A final approach taken by a number of companies is the use of sensors to locate objects external to the vehicle. Electronic Controls Company of Boise, Id. sells an ultrasonic sensor system that assists drivers in determining all is clear before the driver changes lanes, backs up or docks. The system includes ultrasonic sensors mounted on the back and sides of the vehicle and an alert module mounted in the cab of the vehicle. Each ultrasonic sensor continuously monitors a defined detection zone for objects moving within the zone. When a vehicle enters the detection zone, the sensor measures the time between sending the sound wave and receiving its reflection and sends that measurement to the cab.
  • Sonar Safety Systems of Santa Fe Springs, Calif. has a rear-mounted sensor system which detects objects in three distance zones from the rear of the vehicle. That is, it doesn't display distance to the object. Instead, the system provides alarms and audible feedback that inform the driver whether the obstacle is very close (Zone III), somewhat farther out (Zone II), or farther out still (Zone I). It only looks up to 8 feet behind the vehicle. They also offer a single-sensor unit with only one sensor in the back.
  • A common problem with rear-mounted sensors to date is that sensors mounted on the rear of the vehicle detect the distance from the sensor to the object, not the perpendicular distance from the vehicle to the object. In addition, these systems do not communicate to the driver the transverse location of the object (i.e., is the object directly behind the vehicle, off to the side, or far enough to the left or right that the driver will not hit it). Furthermore, range measurement often does not exist, or is inaccurate.
  • The collision avoidance systems used to date are deficient in other ways as well. For instance, the systems provide only partial coverage around the periphery of the vehicle. That is, they either lack a forward-looking detection capability, lack range and range rate measurement capability or they lack sufficient detection capability around the periphery of the vehicle to eliminate blind spots. Furthermore, even if present, range measurement often is inaccurate. Finally, those systems which do have forward-looking detection are prone to a high rate of false alarms from the environment or to distracting off-the-road clutter.
  • Systems to date do not provide an adequate solution for the combination tractor-trailer rig. Armatron International of Melrose, Mass. has a side and rear obstacle detection system which includes wireless communications between the tractor and trailer, however, the sensors are all hard-wired to the trailer. This does not address the need in which tractors often are required to pull a multitude of trailers, some of which are owned by different companies, which are not likely to be equipped with any sensors.
  • Finally, systems to date lack the programmability to address the configuration and installation variables that influence the integrity of the sensor data. In addition, current systems are designed such that changes in the transmitted sensor frequency require a redesign of the software algorithms.
  • What is needed is a collision avoidance system and method which avoids these deficiencies.
  • SUMMARY OF THE INVENTION
  • The present invention is a collision avoidance system. The collision avoidance system includes a control module, a first transmitting device connected to the control module, wherein the first transmitting device transmits a signal, a first receiving device connected to the control module, wherein the first receiving device receives a return of the signal transmitted from the first transmitting device and transmits a first return signal representative of the return to the control module, a second transmitting device connected to the control module, wherein the second transmitting device transmits a signal, and a second receiving device connected to the control module, wherein the second receiving device receives a return of the signal transmitted from the second transmitting device and transmits a second return signal representative of the return to the control module, wherein the control module includes measurement circuitry used to measure the first and second return signals and display means for displaying a transverse location of an object as a function of said first and second return signals.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a collision avoidance system;
  • FIG. 2 shows a more complex embodiment of the collision avoidance system shown in FIG. 1;
  • FIG. 3 is a system block diagram of the collision avoidance system according to FIG. 2;
  • FIGS. 4 a-c show operator interface units which can be used with the Control modules of FIGS. 1 and 3;
  • FIGS. 5 a-c show the operation of two rear-mounted sensors according to the present invention;
  • FIGS. 6 a-c show an alternate embodiment of the operator interface units of FIGS. 4 a-c;
  • FIG. 7 illustrates a backup warning system;
  • FIGS. 8 a and 8 b show wireless portable transducer systems;
  • FIGS. 9 a-d show a forward looking proximity sensor;
  • FIG. 10 shows a high/low detection system;
  • FIG. 11 shows an operator interface unit which can be used with the Control modules of FIGS. 1 and 3;
  • FIG. 12 shows the guided operation of two rear-mounted sensors according to the present invention; and
  • FIG. 13 illustrates a side display module.
  • FIG. 14 shows one embodiment of a forward looking detector radar module system.
  • FIG. 15 illustrates one embodiment of a proximity detector radar module system.
  • FIG. 16 illustrates one embodiment of rear guard detector radar module system.
  • FIG. 17 is an illustration of one embodiment of a type C radar module interface board.
  • FIG. 18 is a block diagram of one embodiment of the power distribution plan.
  • FIG. 19 is a block diagram of one embodiment of the data communications between radar modules.
  • FIGS. 20 a-c show one embodiment of the pin configurations for the connectors.
  • FIG. 21 is a block diagram of one embodiment of a forward looking detector.
  • FIG. 22 is an illustration of the radar module alignment for a forward looking detector.
  • FIG. 23 illustrates one embodiment of a functional design for a proximity detector.
  • FIG. 24 is a proximity detector interface and operation timing diagram.
  • FIG. 25 a is a side view of one embodiment of a type A radar module layout.
  • FIG. 25 b is a top view of one embodiment of a type A radar module layout.
  • FIG. 26 is a proximity detector system timing diagram.
  • FIG. 27 illustrates one embodiment of a functional design for a rear guard detector.
  • FIG. 28 illustrates one embodiment of a functional design for a control module.
  • FIG. 29 illustrates the signal/data processing functions of one embodiment of a control module.
  • FIG. 30 illustrates one embodiment of forward looking detector object data processing.
  • FIG. 31 illustrates one embodiment of track report generator functions.
  • FIG. 32 illustrates one embodiment of proximity detector and rear guard detector detection processing.
  • FIG. 33 illustrates one embodiment of data fusion and range estimation.
  • FIG. 34 shows an example of multiple hypotheses through amplitude versus time data for automatic ranging.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • In the following detailed description of the preferred embodiments, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.
  • FIG. 1 shows a collision avoidance system 10 according to the present invention. System 10 includes a Control module 12 and two sensors 14. Each sensor 14 includes a transmitter 16 and a receiver 18. In one embodiment, transmitter 16 and receiver 18 are mounted together in a single sensor housing. In another embodiment, transmitter 16 and receiver 18 are mounted in separate housings.
  • In one embodiment, sensors 14 include separate acoustic transducers for each of transmitter 16 and receiver 18. In another embodiment, a single acoustic transducer is used for both transmitting a signal and receiving its echo. Some transducers which would operate in such a system 10 are the 9000 Series Piezo Transducers available from Polaroid OEM Components Group and the KSN 6530 45 KHz transducer available from Motorola. In addition, the KSN 6529 45 KHz transducer available from Motorola could be used for receiver 18.
  • In another embodiment, sensors 14 are micropower impulse radar (MIR) devices. In one embodiment, MIR devices such as those described in the white paper entitled “Microwave Impulse Radar (MIR) Technology Overview”, available from Lawrence Livermore National Laboratory, are used. The advantage of such devices are that they are low power and fairly inexpensive. In addition, a single device can be used as both transmitter 16 and receiver 18.
  • In yet another embodiment, sensors 14 are microwave transceiver devices. In one such embodiment, each transceiver includes a small integrated antenna and electronic interface board. In another embodiment, sensors 14 include both proximity detectors 14.1 and longer range detectors 14.2. The longer range detectors incorporate a larger antenna to operate as a Doppler Radar Forward Looking Detector. An example of one such transducer is the model DRO3000 Microwave Transceiver Module available from Advanced Frequency Products of Andover, Mass.
  • In one embodiment, such as is shown in FIG. 2, a collision avoidance system 30 includes up to seventeen sensors 14 mounted around the periphery of a vehicle.
  • In one embodiment, sensors 14 of system 30 are grouped in detection subsystems 34. The output from each proximity detector subsystem 34 is fed into Control module 12, as is shown in FIG. 3. As shown in FIG. 3, system 30 includes a Control module 12, an operator interface 32, and two sensors 14. Each sensor includes a transmitter 16 and receiver 18.
  • In one such embodiment, sensors 14 of system 30 are grouped in detection subsystems: namely, forward-looking detector subsystem (with 2 sensors), proximity detector subsystem (with up to 15 sensors), and a rear-guard subsystem (with up to 7 sensors). The output of each sensor in each detection subsystem is fed into control module 12, as shown in FIG. 3. As shown in FIG. 3, system 30 includes a control module 12, an operator interface 32, and other optional system features. In one embodiment, optional system features include rear warning lights 98 and side warning lights 99. Each sensor includes a transmitter 16 and receiver 18.
  • Collision avoidance systems to date typically put transducers on the rear of the vehicle and measure the distance from the sensor to the object. This is not optimal since those sensors are transmitting in an arc. They are looking at the distance from the sensor to the object and back again. That may not, however, be the perpendicular distance from the vehicle to the object. A deficiency, therefore, of systems to date is that they do not communicate to the driver the transverse location of this object.
  • In one embodiment of the system shown in FIGS. 1 and 2, a plurality of sensors 14 are mounted on the back of the vehicle. Control module 12 takes the readings from the rear-mounted sensors 14, determines whether it can triangulate, and calculates an actual perpendicular distance from the truck to the object. This is important because what the truck driver really wants to know is not that the object is off at some angle five feet away, but how far he can back up before he hits the object. Examples of these measurements are shown in FIGS. 4 a-c and 5 a-c. In contrast to other approaches, this triangulation procedure makes system 10 a precision distance measuring system. Alternate approaches to the operator interfaces shown in FIGS. 4 a-c are shown in FIGS. 6 a-c.
  • FIG. 5 a represents the top view of a tractor trailer rig 50 with a post 52 located behind the trailer 54. The post 52 represents a hazard unless the driver knows its precise distance from the vehicle. Sensor 14 on the right rear of the trailer senses the post 52 at a distance of six (6) feet. Sensor 14 on the left rear of the trailer senses the post at a distance of just over six and one half (6.5) feet. Control module 12 calculates the actual distance to the post as 5.2 feet and determines it is located just to the right of the center of the trailer. The distance is then displayed digitally on the control module 12. The transverse location is displayed, for instance, on the bar graph located just to the right of the digital display, which indicates the location of the post.
  • Perpendicular distance between the rear of a vehicle and external objects is increasingly important the closer the vehicle gets to an external object. In the same example as above, when sensor 14 on the right rear of the trailer senses the post at a distance of four (4) feet and the sensor 14 on the left rear senses the post at a distance of 4.8 feet, the actual perpendicular distance is 2.6 feet. The Precision Measurement System correctly uses the sensor 14 distance readings as well as the known distance between the left and right sensors 14 to calculate the exact perpendicular distance to the post. This is very important as an aid to the driver in the prevention of an accident.
  • In one embodiment, a third sensor 14 is mounted between the right and left sensors 14. With the aid of third sensor 14, the system can determine that the object is a point source (such as a post) as opposed to a wall or large vehicle. The sensor 14 on the right rear of the trailer senses the post at a distance of six (6) feet. The sensor 14 on the left rear of the trailer senses the post at a distance of just over six and one half (6.6) feet. The Control module 12, knowing that the object is a point source, calculates that the actual distance to the post is 5.2 feet and is located just to the right of the center of the trailer. The distance is displayed digitally on the Operator Interface and Side Display Modules. The transverse location is displayed in graphic form (e.g. bar graph) on the Operator Interface.
  • FIG. 5 b represents the top view of a tractor trailer rig with a post located far behind the trailer. The post represents a hazard unless the driver has sufficient information to aid in maneuvering around the obstacle. The sensor 14 on the right rear of the trailer senses the post at a distance of 21.0 feet. The sensor 14 on the left rear of the trailer senses the post at a distance of 22.1 feet. The control module 12 calculates that the actual distance to the post is 21.0 feet, and that it is located near the right side of the trailer. The distance is displayed digitally on the operator interface. The transverse location is displayed on the bar graph located just to the right of the digital display and it indicates the location. Precision distance measurement is less of a concern when obstacles are a long distance from the rear of the vehicle. However, productivity is a concern. With the aid of the transverse location information and the ability of the control module 12 to detect objects up to 25 feet behind the vehicle, the driver of the tractor trailer shown in FIG. 5 b can readily maneuver the vehicle to avoid the obstacle without having to stop, drive forward, and then proceed in backing the vehicle.
  • In order to triangulate, the distance between sensors 14 must be known. Therefore, the distance between sensors 14 must be controlled. In one embodiment, the distance between sensors 14 is a system parameter that can be programmed. In one such programmable embodiment, a programming device is provided to a dealer or a fleet owner such that once sensors 14 are installed, they can measure the actual distance between the sensors and the distance from the sensors to the sides of the vehicle and program that into control module 12. Control module 12 can then accurately calculate distance.
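The triangulation described above can be sketched as follows. The coordinate setup and the 7-foot sensor baseline are assumptions for illustration (the baseline is the programmable inter-sensor distance mentioned above); with that assumed baseline, the 6.0 ft and 6.5 ft readings of the FIG. 5 a example come out near the 5.2-foot perpendicular distance given in the text.

```python
import math

def perpendicular_distance(r_left, r_right, baseline):
    """Triangulate an object behind the vehicle from two rear sensors.

    r_left, r_right: ranges reported by the left and right sensors.
    baseline: programmed distance between the two sensors.
    Returns (d, x), where d is the perpendicular distance from the
    rear of the vehicle and x is the transverse offset of the object
    measured from the right-hand sensor.
    """
    # Right sensor at the origin, left sensor at (baseline, 0),
    # object at (x, d):  r_right^2 = x^2 + d^2
    #                    r_left^2  = (x - baseline)^2 + d^2
    x = (baseline**2 + r_right**2 - r_left**2) / (2.0 * baseline)
    d = math.sqrt(max(r_right**2 - x**2, 0.0))
    return d, x

# FIG. 5a example readings with an assumed 7-foot baseline:
d, x = perpendicular_distance(6.5, 6.0, 7.0)
print(round(d, 1), round(x, 2))  # → 5.2 3.05 (just right of center)
```

The same function reproduces the closer-in example: readings of 4.8 ft and 4.0 ft give a perpendicular distance well under the 4-foot sensor reading, illustrating why the raw arc range overstates the clearance.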
  • For example, when a collision avoidance system 10 or 30 provides a measurement, is the object directly behind the vehicle, or is it off to the left or right? Is it actually far enough off to the left or right that the driver won't hit it but still needs to be aware of it? To provide more accurate information, in one embodiment, control module 12 calculates transverse location and communicates that information via a graphical indicator such as bar graph 22 of FIG. 4 a (also shown as 22′ in FIG. 6 a).
  • In the embodiment shown in FIGS. 4 a-c and 6 a-c, the purpose of the bar graph display is to break the transverse set of distances or locations up into anywhere from 7 to 11 or more segments on the bar graph display. Control module 12 lights the segments that indicate where the object is, from extreme left to extreme right. In another embodiment, transverse location is communicated through another graphic display (e.g., a liquid crystal or other display). In addition, in one embodiment transverse location is displayed through a small video monitor. In one such embodiment, operator interface unit 32 (also shown as 32′ in FIG. 6 a) displays an area behind the vehicle and places a dot within that area showing the closest object. A representation such as is shown in FIG. 5 a would be sufficient. The operator interface unit 32 includes a display 129, speakers 125, push-button switches 47, system status indicator 127 and switch legends 128.
  • Another important issue is the vertical position of the rear-mounted transducers relative to the ground and relative to the impact point with a loading dock. For example, loading docks have an impact plank that protrudes out from the wall. If sensors 14 are mounted too low, they may actually look underneath the impact plank. If so, the truck could hit the plank under power with the driver thinking he or she had another 4-6 inches to go. In one embodiment, system 10 includes vertical compensation as discussed below.
  • In one embodiment, vertical compensation is activated automatically when the front panel switch in FIG. 4 c or the soft key in FIG. 6 c is in the Loading Dock (LD) position. The purpose of this feature is to compensate for the protrusion of loading dock impact bars in cases where the Transducer Assembly is located below the point of impact of the trailer with the loading dock impact bar.
  • FIG. 5 c represents the side view of a tractor-trailer pulling up to a loading dock. The impact bar is the point of contact with the trailer. The depth (i.e., front-to-back) of the impact bar is typically 4.5 inches. The top of the impact bar is typically 48 inches above the ground. When the Transducer Assembly is located below the point of impact of the trailer with the impact bar, the Precision Measurement System will adjust the distance measurement by 4.5 inches if the Transducer Assembly is mounted so low that it cannot detect the impact bar when the trailer is within 12 inches of the impact bar. For example, if the perpendicular distance from the rear of the trailer to the loading dock is 1 foot and the Transducer is 2 feet below the impact bar, the measured distance of 1.0 feet will be corrected to 0.6 feet.
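The adjustment described above amounts to subtracting the impact-bar depth from the measured distance when the Transducer Assembly sits below the bar. A minimal sketch (function and parameter names are illustrative):

```python
IMPACT_BAR_DEPTH_FT = 4.5 / 12.0  # typical impact-bar depth per the text (4.5 in)

def dock_compensated_distance(measured_ft, transducer_below_bar):
    """Correct a rear-looking measurement whose beam passed under the dock
    impact bar. Applied when the transducer is mounted so low it cannot
    detect the bar as the trailer closes to within ~12 inches."""
    if transducer_below_bar:
        return max(measured_ft - IMPACT_BAR_DEPTH_FT, 0.0)
    return measured_ft
```

For the example in the text, a measured 1.0 ft becomes 1.0 − 0.375 ≈ 0.6 ft.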
  • In one radar embodiment, software running in systems 10 and 30 uses Multi-Hypothesis Ranging to provide an accurate estimate of range to an object. A range estimate will be calculated from the signal strength versus time and closing rate of each tracked object. As an object changes its relative position to the host vehicle, the signal strength will vary, due to the properties of radar, by range to the fourth power and by a scattering property called scintillation. Combining this property with the distance traveled by the object will yield the starting range and thus the current range to the object. The distance traveled by the object is computed in the system by combining time since the data collection and tracking started, with the individual measured closing rates versus time. Using multiple hypotheses, the signal strengths versus time will be inserted into an algorithm which matches the hypothetical signal strength curve to a “range and distance traveled one over range to the fourth curve”. One hypothesis, a set of points drawn through the returned signal levels over time, will correspond to the correct starting range for the object given the measured distance traveled. This hypothesis will provide the best statistical match to a one over range to the fourth curve and will be the range estimate provided.
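The hypothesis matching described above can be illustrated as a least-squares search over candidate starting ranges against the expected one-over-range-to-the-fourth curve (−40·log₁₀ R in dB, up to an unknown constant). This is a simplified sketch that ignores scintillation statistics; the function name and candidate-range mechanism are assumptions for illustration.

```python
import numpy as np

def estimate_starting_range(signal_db, distance_traveled, candidate_ranges):
    """Multi-hypothesis sketch: pick the starting range whose predicted
    1/R^4 signal-strength curve best matches the measured history."""
    # cumulative distance closed at each sample (from closing rate * time)
    closed = np.linspace(0.0, distance_traveled, len(signal_db))
    best_range, best_err = None, float("inf")
    for r0 in candidate_ranges:
        ranges = r0 - closed                  # hypothetical range at each sample
        if np.any(ranges <= 0):
            continue                          # object would already be reached
        predicted = -40.0 * np.log10(ranges)  # power ~ 1/R^4 -> -40*log10(R) dB
        # Transmit power and radar cross section are unknown constants, so
        # compare curve *shapes* by removing the mean of each curve.
        residual = (signal_db - signal_db.mean()) - (predicted - predicted.mean())
        err = float(np.sum(residual ** 2))
        if err < best_err:
            best_range, best_err = r0, err
    return best_range
```

The hypothesis that best matches the 1/R⁴ shape yields the starting range, and subtracting the distance traveled gives the current range.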
  • One sonar embodiment of system 10 incorporates temperature compensation. Temperature compensation is needed because sound travels through air at different speeds depending on temperature. Systems 10 and 30 therefore measure the temperature of the air and compensate for its effect in the distance calculation. In such an embodiment, transverse location detection, triangulation, perpendicular position compensation and temperature compensation cooperate to form a precision measurement system.
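A minimal sketch of the compensation, using the standard linear approximation for the speed of sound in air (about 331.3 m/s at 0 °C plus roughly 0.606 m/s per °C); the function names are illustrative:

```python
def speed_of_sound(temp_c):
    """Approximate speed of sound in air (m/s), linear in temperature (deg C)."""
    return 331.3 + 0.606 * temp_c

def echo_distance(round_trip_s, temp_c):
    """One-way distance to an object from a sonar round-trip time,
    compensated for the measured air temperature."""
    return speed_of_sound(temp_c) * round_trip_s / 2.0
```

At 20 °C a 20 ms round trip corresponds to roughly 3.43 m; an uncompensated system assuming 0 °C would report about 3.31 m, an error of several inches.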
  • In another sonar embodiment of systems 10 and 30, the systems include automatic sensitivity control. When systems 10 and 30 are trying to sense an object at a far location, it is advantageous to transmit a high burst of energy; with such a burst, if there is an object at a far distance, systems 10 and 30 are more likely to get an echo they can sense, and receiver sensitivity should be set very high for that application. But once that far-off object is sensed and the vehicle starts backing toward it, systems 10 and 30 should back off the transmitted energy. In addition, it is advantageous to adjust receiver sensitivity. In one embodiment, the output of transmitter 16 can be reduced and the sensitivity of receiver 18 increased automatically by systems 10 and 30.
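One plausible policy for the transmit side is to scale pulse energy with the range of the nearest tracked object: full power while searching, backed off as the object closes. This is a sketch under assumed parameters (the maximum range, floor, and linear scaling are invented for illustration, not from the specification):

```python
def transmit_fraction(nearest_range_ft, max_range_ft=40.0, floor=0.1):
    """Fraction of maximum transmit energy for the next pulse: full power
    when nothing is tracked, reduced as the nearest object closes.
    max_range_ft and floor are illustrative assumptions."""
    if nearest_range_ft is None:   # nothing tracked yet: search at full power
        return 1.0
    return min(max(nearest_range_ft / max_range_ft, floor), 1.0)
```

Receiver gain could be adjusted by an analogous schedule driven by the same tracked range.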
  • In yet another embodiment of systems 10 and 30, a backup warning system is provided as shown in FIG. 7.
  • The intent is to provide immediate feedback to the driver shortly after the vehicle transmission is shifted into reverse. This information includes information on objects in the vicinity of the rear of the vehicle as well as information on objects in the path of the rear of the vehicle. In the case where objects are in close proximity to the rear of the vehicle, but not in the path of the vehicle, an auditory prompt representing an “alert” is sounded for the driver. If an object is detected in the path of the vehicle, in the range of 5 to 10 feet, the system will categorize that as a hazard situation and an auditory prompt representing a “warning” is sounded for the driver. If an object is detected in the path of the vehicle, within a range of 5 feet, the system will categorize that as an emergency situation and an auditory prompt representing an “emergency” is sounded for the driver. After the vehicle has been backing up for two or more seconds, the alert, warning, and emergency will have cleared and the system will begin providing range feedback to the driver in the form of distance information, as displayed on the Operator Interface and Side Display Modules, and auditory feedback in the form of pulsed tones. The closer the vehicle gets to an object, the faster the repetition rate of the pulses until the rear of the vehicle is within one foot at which time the pulses have turned into a continuous tone. In the process of backing up, if a person or vehicle suddenly appeared behind the vehicle, the system will automatically detect a sudden change in range to the object and the “emergency” auditory prompt will be issued to the driver so he/she can take action.
  • In one such embodiment, when the driver is going to back up, if there is an object within range, one of three scenarios will happen. First, if the system senses a truck or other object very close to the vehicle on either side, systems 10 and 30 will give the driver an alert. The system knows that there is no collision potential here, but simply alerts him that something is there. In one embodiment systems 10 and 30 provide one set of tones to the driver for an alert. Second, if there is an object in the range of 5-10 feet as soon as the driver shifts into reverse, systems 10 and 30 sense the object and provide the driver with a different alarm (e.g., a different set of tones or a different flashing light). This alarm is called a hazard alarm; again, its purpose is to alert the driver so he can take action. Third, if there is an object within 5 feet, the driver receives an emergency alarm (i.e., a third set of tones, or a third flashing light). Systems 10 and 30 therefore provide feedback indicative of the distance to an object behind the driver. In one such embodiment, audible or visual feedback tells the driver he is getting closer; the pulses come faster and faster until, when he is within a foot, the pulses are continuous. But if, in the process of backing up, the system automatically detects that the distance suddenly became shorter, it provides the emergency alarm right away so the driver can take action. For example, if somebody drove in behind the driver, or a child ran behind the vehicle, systems 10 and 30 sense that and automatically provide the emergency alarm so the driver can take action. As noted above, some existing systems actually detect zones of distance and provide feedback for that. Systems 10 and 30 go beyond that in that they detect and differentiate objects outside the area of potential collision from those inside and, secondly, they can detect sudden changes in distance for an emergency alarm.
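The three alarm scenarios and the sudden-change check described above can be sketched as follows. The 5 and 10 foot thresholds come from the text; the closing-rate threshold and sample interval are illustrative assumptions.

```python
def rear_alarm(range_ft, in_path, prev_range_ft=None, dt_s=0.1,
               max_closing_fps=15.0):
    """Classify a detected rear object as 'alert', 'warning' (hazard),
    or 'emergency'. max_closing_fps is an assumed threshold."""
    # A sudden shortening of range (e.g., a person stepping behind the
    # vehicle) is an emergency regardless of zone.
    if prev_range_ft is not None:
        closing_rate = (prev_range_ft - range_ft) / dt_s
        if closing_rate > max_closing_fps:
            return "emergency"
    if not in_path:
        return "alert"            # close by, but no collision potential
    if range_ft < 5.0:
        return "emergency"
    if range_ft <= 10.0:
        return "warning"
    return None
```

A separate feedback stage would then map range to a pulse repetition rate, going continuous inside one foot.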
  • In one embodiment, control module 12 is highly programmable and dealers and fleet owners are given an ability to program key parameters that the system can use to more adequately address the needs of that application and that customer. In one such embodiment, an external programmer is plugged into a connector in the back of control module 12; the dealer can then respond to basically the number of fields and change a number (e.g., the distance between the rear-mounted transducers as discussed above), and key that in. When all the information is in, the programmer downloads the data, and feeds it back to the control module 12. Control module 12 then is configured for that vehicle.
  • In yet another embodiment, system 10 includes a security monitor/alarm system coupled to control module 12. In one such embodiment, an ultrasonic transmitter and an ultrasonic receiver are placed in the cab of the vehicle. When the driver leaves the vehicle, he turns the alarm system on with a key switch and it automatically scans the cab to determine the distances to the closest objects in the cab. If somebody climbs up into the seat, one of the distances changes and an alarm is set off. In one such embodiment, the driver has approximately 15 seconds to get in and disable the alarm with his key switch; if it is somebody other than the driver, the alarm goes off. In one embodiment, the alarm also activates an auto alarm underneath the hood of the vehicle to draw attention to and possibly scare off the intruder.
  • In yet another embodiment, an on-board computer interface is provided. The reason for this is that some of the larger tractor-trailer rigs, in particular, have on-board information systems that monitor factors relating to use of the vehicle. They may monitor, for instance, the location of the vehicle, the delivery route, the delivery schedule, things that the driver does along the way, engine performance, or things that might indicate to the fleet owner that service is needed. In one embodiment of systems 10 and 30, information relating to driver performance that is detected with systems 10 and 30 is captured and downloaded into the on-board computer so that when the fleet owner gets a download from the on-board computer, it contains additional information provided by systems 10 and 30. So, with an interface through a single cable, systems 10 and 30 can tie into the on-board computer and provide real-time information.
  • In another embodiment, if there is no on-board computer there, data storage is provided in control module 12 so that it can store the data internally. Data can then be downloaded to a fleet computer at a future date. In one such embodiment, systems 10 and 30 include an accident reconstruction memory installed in the control module. This memory maintains a record, in non-volatile memory, of data pertinent to system operation, vehicle operation, and obstacle detection. Some of these parameters are stored over longer periods of time and some relate to the last 2 or more minutes leading up to an accident. A G-force switch detects the presence of a crash and discontinues the data recording process thus saving data stored prior to the crash.
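The recording behavior described, a rolling window of recent data that the G-force switch freezes at impact, can be sketched as a ring buffer (a simplified, volatile stand-in for the non-volatile memory the text describes):

```python
from collections import deque

class AccidentRecorder:
    """Rolling record of the most recent samples, frozen when the
    G-force switch detects a crash (sketch; real storage is non-volatile)."""
    def __init__(self, window_samples):
        self.samples = deque(maxlen=window_samples)  # oldest data drops off
        self.frozen = False
    def record(self, sample):
        if not self.frozen:          # G-force switch has not tripped
            self.samples.append(sample)
    def crash_detected(self):
        self.frozen = True           # stop recording, preserving pre-crash data
```

Sizing the window to roughly two minutes at the system's sample rate gives the "last 2 or more minutes leading up to an accident" behavior described above.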
  • In one embodiment a self test capability is provided. Self test addresses several issues. First, when systems 10 and 30 are turned on (i.e., the driver throws the power switch into an "on" position), the systems turn all the indicators on so that the driver can see right away that all the indicators are lit. In addition, control module 12 tests its internal circuitry to ensure that the system comes up running. Second, while the system is running, if the microcontroller or microprocessor in control module 12 were to fail, a "watch-dog timer" in systems 10 and 30 will detect the failure. Third, the driver can activate self test mode. On doing so, control module 12 flashes all of the indicators of front panel 20. In one such embodiment, control panel 20 includes an indicator 24 for each transducer mounted around the vehicle and, on entering self test, transducer indicators 24 begin to flash. The driver then walks around the vehicle and gets back in the cab. Every one of those transducers should detect him; each time they detect him, the transducer indicator 24 associated with the transducer goes off (i.e., quits flashing). If the driver gets back to the cab and there's a transducer still flashing, he knows that something didn't work and he can investigate the problem.
  • In another embodiment, systems 10 and 30 automatically and sequentially activate a Built-In Test (BIT) function for each sensor. The Built-In-Test (BIT) function is conducted in two ways: initial power-up and an integrated BIT performed during vehicle motion.
  • During initial power-up, when power is first turned ON, control module 12 performs a BIT of control module 12 functions. The BIT function verifies that sensor transmitter 16, receiver 18, and the electronics of control module 12 and the rest of systems 10 and 30 are working properly. In one embodiment, indicators associated with every element tested will turn off for all sensors that pass the Built-In Test. If a sensor 14 repeatedly fails the BIT, it will automatically be taken out of service and the driver will be alerted of the failure and the need to service that particular sensor 14.
  • When the vehicle is in motion, the system will perform BIT on all sensor modules and integrate the results into the data acquisition process to ensure the integrity of the data being processed. This is accomplished by looking for road clutter signatures from each of the radar modules (i.e., forward-looking, side-looking, and rear-looking detectors). If the radar modules are working properly, they will always detect low level return signals from the road surface while the vehicle is moving and will transmit information pertaining to these signals back to the control module. If a sensor is defective, the system will continue to function, bypassing the defective sensor. If the BIT detects a catastrophic failure, an error message will be displayed on the operator interface and the system will halt. The date, time, and results of the most recent BIT will be stored in Accident Reconstruction System memory if that option is installed. This integrated approach to BIT does not slow down the data acquisition process, and it ensures the integrity of all sampled data and the data communications from all sensors.
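The integrated BIT check amounts to verifying that each radar module keeps reporting low-level road-clutter returns while the vehicle is moving. A sketch (the noise-floor threshold is an assumed illustration):

```python
def module_passes_bit(clutter_levels_db, vehicle_moving, noise_floor_db=-95.0):
    """A healthy radar module always sees some road-clutter return above
    the noise floor while the vehicle is in motion. noise_floor_db is an
    illustrative assumption."""
    if not vehicle_moving:
        return True   # no clutter expected at rest; defer judgment
    return any(level > noise_floor_db for level in clutter_levels_db)
```

A module that repeatedly fails this check would be bypassed and flagged to the driver, as described above.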
  • Wireless Portable Transducer System
  • In one embodiment sensors 14 are provided within a wireless portable transducer system 40. The problem addressed here is that trailers far outnumber truck-tractors, and truck-tractors are constantly moving from trailer to trailer. Installing a complete collision avoidance system 10 or 30 on every combination of tractor and trailer could easily become prohibitively expensive. To better address the needs of fleet owners, a system 10 is constructed having a wireless portable system 40. FIGS. 8 a and 8 b show two embodiments of such portable systems.
  • In FIG. 8 a, two boxes 70 provide the portable transducer function. Each box 70 includes an antenna sticking out the side. Each box 70 mounts under the trailer and clamps to the frame of the trailer. Inside each box 70 is an ultrasonic transmitter and receiver, electronic circuitry, and a radio transmitter and receiver. A two-wire cable connects battery power from the trailer to the electronic circuitry. A cable between the boxes carries common control signals from the radio transmitter/receiver such that signals from either rear-mounted antenna control both Transducer Assemblies.
  • In FIG. 8 b, there is one long extrusion 72 with an antenna sticking out each side. The extrusion clamps to the frame on the rear of the trailer. The extrusion may be made of one piece, or of two pieces (one within another) with a mechanism to adjust the width of the extrusion 72 to the width of the trailer. A Transducer Assembly (transmitter and receiver) is mounted on each end of the extrusion. The electronic circuitry, including the radio transmitter and receiver, is mounted inside the extrusion. In one embodiment, a two-wire cable connects battery power from the trailer to the electronic circuitry.
  • Signals to and from the boxes 70 and 72 are communicated to the control module of the collision avoidance system via the Wireless Communicator to detect, measure, and display distance to objects behind the trailer.
  • System 40 is designed so that it can quickly be disconnected from one trailer and moved to another trailer.
  • In one such embodiment, a Wireless Portable Transducer System provides for wireless communication between the electronics mounted in the cab of the vehicle and the Portable Transducer Array mounted on the rear of the trailer. Power to operate the Portable Transducer Array is provided by connecting in to existing power wiring provided to the trailer from the truck's electrical system.
  • Dependent on the transducer technology used, the Portable Transducer Array could be made to be totally battery operated. For example, if the Portable Transducer Array were designed using Micropower Impulse Radar, Doppler Radar or other alternative low-power technologies, the transmitting and receiving functions to measure distance to objects behind the vehicle would be low power and could operate on batteries built into the Portable Transducer Array. The communications between the electronics in the cab of the vehicle and the Portable Transducer Array could also use Micropower Impulse Radar, Doppler Radar, or other alternative low-power technologies, thus enabling portability with built-in battery power. This solution will eliminate the need to tap into the truck's electrical system to power the Portable Transducer Array.
  • The bulk of the electronics stays with the tractor. In addition, the rear transducer array stays with the tractor (i.e., as the driver goes from trailer to trailer, he simply pulls off system 40 and clamps it on the next trailer). In one such embodiment, a connector arrangement is provided so the driver can connect system 40 to the power that is already on the trailer and quickly get the system up and running.
  • In another embodiment, multiple sensors are designed into the wireless subsystem 40 to detect obstacles to the rear of the vehicle and on either side of the vehicle. Communication with control module 12 is via wireless digital signals. Control module 12 is designed to sense when the wireless portable sensor subsystem is not installed or is not functioning properly.
  • Different quick-connect mounting arrangements might be needed for different style trucks. In one embodiment, as is shown in FIG. 8 b, portable wireless sensor subsystem 40 is a tubular structure with integral electronics, battery pack, and sensors mounted internal or external to the structure. The unit would clamp on the trailer chassis or the underride bumper provided on the rear of many trailers. Antennas would be mounted on one or both sides of the wireless portable sensor subsystem protruding just outside the left and right edges of the trailer. In another embodiment, as is shown in FIG. 8 a, portable wireless sensor subsystem 40 is enclosed in two separate housings mounted at the left rear or right rear of the trailer. Again, quick connect mounting arrangements will be made to secure each unit to the trailer. A cable will interconnect each unit to allow the sharing of one battery pack, one controller, and one wireless transceiver.
  • In another embodiment of system 40, the sensors on the trailer are hardwired together, however, communication between the sensors and the control module 12 is wireless. In this case, a Transceiver Module will be mounted on the tractor and a second unit on the trailer. The Transceiver Module on the trailer will receive its power from the tractor-trailer umbilical electrical cable. Electrical signals will be passed between tractor and trailer just like any non-wireless system with the exception that the signals will be converted to wireless communication and then reconverted back to their electrical form at the other end. This approach provides additional flexibility for the customer's needs.
  • Adding Additional Sensors
  • In certain situations, drivers need to be able to detect objects directly in front of, or to the side of, the front of the vehicle. For example, in the case of a school bus, one of the problems that buses have is the number of small children in front of and on the sides of the bus. There are deaths in school bus accidents in the United States every year; they are generally related to accidents at the front of the bus. To date, the only options provided to these drivers are mirrors angled to see the front of the bus. Even the use of angled mirrors, however, has only limited effectiveness.
  • To address this need, in one embodiment, forward-looking proximity detectors are provided in order to detect objects immediately in front of the vehicle (an area that is a blind spot for the driver).
  • Buses also have a problem with children that crawl under the bus to retrieve a dropped toy or ball. Bus drivers cannot always see these areas. To help prevent problems, in one embodiment, side-looking proximity detectors are positioned on the bus to monitor these areas.
  • Sensor Protection
  • Some forward-looking proximity detectors, however, have a problem with clogging due to debris, dirt, ice, etc. accumulated while the vehicle travels down the road. As noted above, forward-looking transducers are typically needed only when the vehicle is stationary and about to move forward. It would, therefore, be advantageous to expose the forward-looking transducer to the elements in only those situations where they are needed.
  • A forward-looking Transducer with an Environmental Shield solves this problem in situations where the Transducer need not be active while the vehicle is in motion. While the vehicle is in motion, the shield covers the front of the Transducer Assembly, protecting it from contamination. When the vehicle stops, the system using this device will open the front door, thus enabling the Transducer Assembly to detect and measure the distance to all objects in front of the vehicle. Shortly after the vehicle starts to move, the system closes the Environmental Shield to protect the Transducers.
  • FIGS. 9 a-d demonstrate one way of solving this problem. The solution is independent of the type of Transducer technology being used. However, the intended use is with ultrasonic Transducer Assemblies.
  • FIGS. 9 a and 9 b represent a side view and a front view of a mounting bracket with the Transducer Assembly 88 mounted via a Transducer Mounting Bracket 90 to Mounting Bracket Top Plate 92. Mounting Side Brackets are shown in place. Note the mounting holes in the flanges that protrude beyond the width of the mounting Bracket Side Plates 94. These mounting holes are used to mount the completed assembly to the underside of the vehicle front bumper or chassis just behind the front bumper. In one embodiment, spacers are used to adjust the actual height of the overall assembly so as to provide an unobstructed opening for the Transducers to work properly.
  • FIG. 9 c represents a side view of the Face Plate 91 used to protect the front of the Transducer Assembly. The Face Plate 91 is positioned at an angle to deflect air and contaminants down under the overall assembly. A pivot arm 93 is an integral part of the Face Plate 91. Attached to a slot in the pivot arm 93 are an electrically activated solenoid 95 and a return spring 97. The return spring holds the Face Plate 91 closed over the front of the overall assembly when no power is applied to the solenoid 95. The pivot arm 93 has a hole around which the Face Plate 91 pivots when the solenoid 95 is activated.
  • FIG. 9 d represents a front view of the overall assembly. Note that the face plate 91 fits just under the Mounting Bracket Top Plate 92 and over the front of the Mounting Bracket Side Plates 94. This is to minimize moisture from seeping in behind the Face Plate 91. However, there is a gap between the lower edge of the Face Plate 91 and the front edge of the Bottom Plate 96 to allow any moisture that might enter the assembly to drain out. The assembly includes a nut, bolt and bushing 87.
  • Features not shown in the drawing include:
      • A bracket mounting the solenoid 95 to the inside of the Mounting Bracket Side Plate 94;
      • An access hole provided in the Back Plate 98 for the Transducer Cable Assembly which connects with the Transducer Assembly 88; and
      • The other end of the return spring 97, that mounts to the Mounting Bracket Side Plate 94 with a screw and washer.
  • One embodiment of such a forward-facing transducer is shown in FIGS. 9 a-d. Forward-looking systems only need to be activated when the bus comes to a stop and first goes into gear to move forward.
  • In one embodiment, such as is shown in FIGS. 9 a-d, a forward-looking transducer includes an environmental shield. When the vehicle comes to a stop and is in neutral, or in park, that shield drops out of the way. When the driver goes into gear, the forward-looking proximity detectors begin looking and alert the driver with an alarm if there is an object in front of the vehicle. Shortly thereafter, the shield goes back into place and protects the detector from the environment.
  • In one embodiment of shield 60, a motor replaces the solenoid. The motor rotates the shield cover out of position while the transducer is operating.
  • As noted above, it can be advantageous to provide truckers and other long-range drivers with early warning of slow or stopped vehicles in their path. Collisions with such objects may result from poor visibility, driver inattention, or driver distraction. To counter this problem, in one embodiment one or more forward-looking transducers 14.2 are attached to the front of the vehicle as long-range detectors that can see objects well ahead of the vehicle. As noted above, forward-looking devices 14.2 have been used in the past to detect slow or stopped objects in the path of the vehicle. Such attempts have largely failed due to the inability to control the false alarm and off-the-road clutter problems. It is therefore important to control and monitor a broad range of objects and clutter, and to analyze them properly to detect potential accidents with minimal false alarms.
  • In the software embodied in systems 10 and 30, there are several advanced features built into the software to minimize false alarms, including (a) the combination of multiple sensors comparing signals for the same objects and using sensor antenna patterns to derive angular position, (b) the use of an N out of M tracking algorithm, and (c) the fusion of data from multiple sensors.
  • In one embodiment of the Forward-Looking Detector (FLD), the returns of both sensors at the same frequencies can be used to perform False Alarm Rate (FAR) Reduction. Depending on the angle to an object, the frequency and amplitude will change. Stationary objects on the side of the road at close ranges will appear stronger in one sensor and at Doppler frequencies lower than that corresponding to the speed of the vehicle. Road clutter from a specific object such as a sign or bridge will appear at one frequency. As the vehicle approaches, if the object is on the side of the road the frequency will decrease and the signal strength in one sensor will decrease while the signal in the other sensor will increase faster than expected. This denotes that the object being detected is not in front of the host vehicle.
  • The difference in signal strength can be used to remove the majority of the False Alarms resulting from side-of-the-road clutter. However, objects moving virtually in front of the host vehicle may present themselves in a manner consistent with objects directly in front of the host vehicle. An N out of M tracking scheme is used to track these objects; such a tracking scheme uses the statistical properties of scintillation and the FLD antenna patterns to average the signal return and differentiate between the lane in front of and the area off to the side of the host vehicle.
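The core of an N out of M scheme is a sliding window of per-scan detection decisions: an object is confirmed only when at least N of the last M scans detected it. A minimal sketch (the averaging of scintillating returns is reduced here to counting, and N = 3, M = 5 are illustrative values):

```python
from collections import deque

class NOutOfMTracker:
    """Confirm an object only when at least N of the last M scans detected it.
    Suppresses one-off clutter hits while still confirming real objects fast."""
    def __init__(self, n=3, m=5):
        self.n = n
        self.scans = deque(maxlen=m)  # rolling window of detect/no-detect flags
    def update(self, detected):
        self.scans.append(bool(detected))
        return sum(self.scans) >= self.n
```

An isolated clutter spike never accumulates N hits within the window, so it never raises an alarm, while a persistent object is confirmed within a few scans.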
  • The Proximity Detectors and Rear Guard Detectors, being wide beam sensors, require both N out of M tracking and Data Fusion to remove False Alarms. The Data Fusion system receives tracks from each sensor surrounding the host vehicle. This data is fused into one track for each object surrounding the truck with all data from all contributing sensors used to differentiate between an alarming condition and a False Alarm.
  • Another factor in FAR reduction is object identification. With a range estimate from the Multiple Hypothesis Ranging software, the signal strength versus time (scintillation characteristics) and average signal strength at an estimated range, an identification (ID) can be computed for a tracked object. The signal strength at an estimated range provides an estimate of radar cross section. From this radar cross section an initial categorization as a truck/large vehicle, a car or other can be determined. From scintillation combined with a small radar cross section the categorization as a human/large animal or a sign can be determined.
  • Another extremely important software feature for reliable system performance is Data Fusion. The Data Fusion algorithm is designed to take inputs from an N out of M tracker. This Data Fusion algorithm is specifically designed not to require any specific set of sensors and adapts as sensors are added, using a lookup table of the new sensor parameters and an indication of the number and type of sensors added. The Data Fusion Algorithm can also take into account any data from the host vehicle supplied by various sensors. The absence of data will not cause a problem with the algorithm; however, the more data, the better the performance. The purpose of the Data Fusion Algorithm is to reduce all of the tracks and detections down to a small set of object tracks representing the objects surrounding the host vehicle. Each radar module and sensor set may detect the same object. It is the task of the Data Fusion Algorithm to sort this out. The algorithm uses a technique called Deepest Hole to combine the data from multiple sensors and Kinematics Combination to fuse this data together.
  • The Deepest Hole function “associates” tracks from the sensor sets with existing Fused Tracks. It is assumed that multiple sensor sets may find the same object and that multiple radar modules within a sensor set will often see and report on the same object. The Deepest Hole function resolves these redundant tracks into a set of fused tracks, one per object. The output of this function is a list of track links linking the tracks from multiple radar modules together for one object.
  • The track data from the tracks which are linked is merged in this function. The speeds of the linked tracks are averaged together. The Signal to Noise Ratios are averaged using a weighted average considering radar module antenna gain. The range estimate for the new merged track is created using the time- and sensor-averaged signal strength. The track ID is merged using the Probability of Correct ID and the ID Confidence.
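The merge step described above, averaged speeds plus an antenna-gain-weighted SNR average, can be sketched as follows. Field names are illustrative, and the range-estimate and ID merging are omitted for brevity:

```python
def merge_linked_tracks(tracks):
    """Fuse the linked per-module tracks for one object into a single track.
    Each track is a dict: {'speed': m/s, 'snr_db': dB, 'gain_db': antenna
    gain in dB}. Speeds are averaged; SNRs are averaged weighted by the
    (linear) antenna gain of the reporting module."""
    speed = sum(t["speed"] for t in tracks) / len(tracks)
    weights = [10.0 ** (t["gain_db"] / 10.0) for t in tracks]  # dB -> linear
    snr = sum(w * t["snr_db"] for w, t in zip(weights, tracks)) / sum(weights)
    return {"speed": speed, "snr_db": snr}
```

Weighting by antenna gain means a high-gain module's SNR report dominates the fused estimate, since its measurement is the more trustworthy one.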
  • In one embodiment, sensors 14 include transducers placed on the rear of the vehicle. Those transducers are activated when the driver shifts the transmission into reverse. As soon as the driver shifts into reverse, the transducers on the back begin to send out sound energy, which bounces off an object and comes back to a receive transducer. Distance is then calculated as a function of time of return (e.g., acoustic applications) or intensity of the return signal (e.g., radar applications). In one embodiment, a Multiple Hypothesis Ranging algorithm is used to calculate distance. In addition, sensors 14 detect that there is something back there. If there is, systems 10 and 30 can alert the driver immediately, so that he can take action and not back into whatever that object happens to be.
  • In another embodiment, additional sensors 14 are mounted on the side of the vehicle. Sensors on the right side of the vehicle are activated when the right turn signal is active, and sensors on the left side of the vehicle are activated when the left turn signal is active. When the transmission is shifted into Reverse, the sensors on both sides of the vehicle are activated to monitor for potential accidents at the sides of the vehicle when backing up. When a sensor is activated, it begins to send out signals which bounce off any nearby object and come back to a receive transducer. Distance is then calculated as a function of time of return (e.g., acoustic sensors) or intensity of the return signal (e.g., radar applications). In the case of radar, a Multiple Hypothesis Ranging algorithm is then used to calculate distance. Based on the distance to the object, the Control module software can determine whether a valid alarm condition exists and the driver needs to be notified.
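For the acoustic case, the time-of-return calculation reduces to a simple round-trip computation. This is a minimal sketch; the speed-of-sound constant is an assumed nominal value, and temperature compensation is ignored:

```python
SPEED_OF_SOUND_FT_S = 1125.0  # approximate speed of sound in air, ft/s

def acoustic_distance_ft(echo_time_s):
    """Convert round-trip echo time to one-way distance: the pulse
    travels to the object and back, so the distance is half the path."""
    return SPEED_OF_SOUND_FT_S * echo_time_s / 2.0

# An echo returning after 16 ms corresponds to an object about 9 ft away.
distance = acoustic_distance_ft(0.016)
```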
  • In another embodiment of system 30, systems may be equipped with a Side Display Module, such as is shown in FIG. 13. Side Display Module 36 is mounted internal to, or adjacent to, each Side View Mirror. If one or more sensors 14 on the side of the cab detect an object, Side Display Module 36 will flash a forward-directed arrow on that side of the vehicle. In one embodiment, if one or more sensors on the side of the trailer detect an object, Side Display Module 36 flashes a rear-directed arrow on that side of the vehicle. If objects are detected on both the side of the cab and the side of the trailer, Side Display Module 36 flashes an arrow pointed both forward and to the rear. In one such embodiment, Side Display Module 36 also displays the distance between the rear of the vehicle and any object behind the vehicle when the transmission is in reverse.
  • In one such embodiment, Control module 12 includes a driver performance log. This is similar to the system used for the on-board computer interface, in that Control module 12 collects and stores data pertaining to driver performance. For example, if a driver turns on his right turn signal while there is an object on his right-hand side, Control module 12 extracts information from the scenario. If the driver turns off the signal, the result is stored. If, on the other hand, there is an accident, those results are also stored. The fleet owner can then retrieve the stored information and find out what really happened.
  • In yet another embodiment, as is shown in FIG. 10, sensors 14 are mounted toward the top and bottom of the back end of the vehicle. Vehicles such as RVs can have problems when pulling into low-clearance areas, such as under overhangs, trees or large tree branches. This embodiment places additional sensors near the top of the vehicle, so that the system can detect obstructions high above the ground.
  • A system such as is shown in FIG. 10 may be used on a variety of vehicles. Sensors 14 at the top of the vehicle alert the driver of obstacles that may cause damage near the top of the vehicle. These obstacles may be trees, storage shed doors, or other similar objects. Sensors 14 at the bottom of the vehicle are re-oriented to provide a wider dispersion angle from high to low as opposed to their normal wide dispersion from left to right. With this change in orientation, this system is also able to triangulate on objects above the vehicle such that the system can calculate available clearance and compare it with required clearance (which, in one embodiment, is programmed into memory) for the vehicle.
  • Additional indicators on the operator interface 32 communicate to the driver whether an obstacle to the rear of the vehicle was detected by the High Transducer Assemblies or the Low Transducer Assemblies.
  • If the available clearance is less than the required clearance, in one embodiment, an Emergency Alarm is sounded to alert the driver to take action before damaging the vehicle. In this special case, an additional indicator on the operator interface 32 flashes to inform the driver that the alarm was caused due to lack of clearance.
  • In the example shown in FIG. 10, the motor home is backing under an overhanging building roof 62. Based on the known position of the Low Transducer above the ground and the position of the High Transducer above the ground, the MicroController in the control module 12 can calculate the distance of the roof overhang above the ground. Based on the required clearance, which is programmed into the Memory of control module 12, the system can detect whether there is sufficient clearance for the vehicle. If there is not sufficient clearance, the Emergency Alarm will sound.
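One way the two-transducer clearance computation could work is sketched below. The geometry is an illustrative reconstruction, not taken from the specification: each transducer, mounted at a known height, reports slant range to the same overhang edge, and the two ranges are combined to recover the overhang height for comparison with the required clearance.

```python
import math

def overhang_height_ft(r_low, r_high, h_low, h_high):
    """Height of an overhang above the ground, from slant ranges r_low
    and r_high measured by transducers at heights h_low and h_high.
    Derived from r^2 = d^2 + (H - h)^2 for each transducer, where d is
    the (common) horizontal distance to the overhang edge."""
    return ((r_low**2 - r_high**2) / (2.0 * (h_high - h_low))
            + (h_low + h_high) / 2.0)

def sufficient_clearance(overhang_height, required_clearance):
    """Compare against the required clearance programmed into memory."""
    return overhang_height >= required_clearance

# Transducers at 2 ft and 9 ft, overhang edge 8 ft away at 10 ft height:
h = overhang_height_ft(math.sqrt(128.0), math.sqrt(65.0), 2.0, 9.0)
```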
  • In another embodiment, systems 10 or 30 can be mounted on farm trucks. Farm trucks often pull into close quarters with loading and unloading equipment such as grain augers, and in some cases must even straddle a grain auger in order to dump a load so that the auger can carry the load away. That is a difficult maneuvering situation. In one embodiment, software is provided which not only prevents accidents but also helps guide the driver into some of these tight maneuvering situations. Systems 10 and 30 sense the equipment the vehicle is trying to mate with and guide the driver so that the vehicle stays centered on that equipment. Such a system is shown in FIGS. 11 and 12.
  • A grain auger example is given in FIG. 12. The example shown is that of a farm truck preparing to dump grain into a grain auger 55. To assist in guiding the truck up to the grain auger 55, the driver activates a TruTrack switch on operator interface 32. As the vehicle approaches the grain auger 55, the system will automatically measure the distance to the auger 55, calculate the transverse location of the auger 55, display this location on the bar graph, and display the distance on the digital readout on operator interface 32.
  • In this example, the right rear Transducer has detected the auger 55 at a distance of 6.0 feet. The left rear Transducer has detected the auger 55 at a distance of 6.6 feet. The system will automatically calculate a perpendicular distance of 5.2 feet. The system will also calculate the transverse location and display it on the bar graph as slightly right of center. With this information, the driver can make minor maneuvering corrections to keep the auger 55 centered.
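The perpendicular distance and transverse offset in this example follow from two-range triangulation. The transducer separation is not given in the text, so the 7-foot baseline below is an assumption chosen so the sketch reproduces the quoted 5.2-foot result:

```python
import math

def locate_target(d_left, d_right, baseline):
    """Two-range triangulation from the left and right rear transducers,
    separated by `baseline` feet. Returns (perpendicular distance,
    transverse offset from center); a positive offset means the target
    is right of center."""
    offset = (d_left**2 - d_right**2) / (2.0 * baseline)
    perp = math.sqrt(d_right**2 - (offset - baseline / 2.0)**2)
    return perp, offset

# Left transducer reads 6.6 ft, right reads 6.0 ft (as in the example):
perp, offset = locate_target(6.6, 6.0, 7.0)
```

With these readings the target comes out slightly right of center, matching the bar-graph indication described above.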
  • In one embodiment, as is shown in FIG. 13, Side Display Module 36 provides visual feedback to the driver when looking in the direction of either side view mirror. These modules may be mounted on the edge of the side view mirrors, or they may be mounted inside the cab in the approximate line-of-sight as the side view mirrors.
  • The Side Display Modules 36 (FIG. 13) consist of a plastic housing, a small PCB Assembly with five LED indicators, two half-inch high seven segment displays, a cable which runs into the cab and connects to the rear of the Control module, and a clear plastic cover on the front of the module. The Display Module 36 mounted on the left side of the cab is identical to the module mounted on the right side of the cab.
  • The seven-segment display drivers and LED driver will be located in the Control module. The displays in FIG. 13 show a distance reading of twelve feet (12′). Distance readings associated with the Forward-Looking Detector Subsystem will not be displayed on the Side Display Modules; only Backup Mode rear distance readings will be displayed. If an alarm condition exists anywhere around the vehicle, all five LEDs will flash. The LEDs are not meant to provide any detector-specific information. Similarly, in one embodiment, the graphics displays shown in FIGS. 6 a-c will flash a visual warning on detection of an alarm condition.
  • One embodiment of a system 30 for use on vehicles such as commercial trucks is discussed below. Such an embodiment addresses many of the safety and operation issues raised above.
  • System Description
  • The loss of life, personal injury, and property damage are prime motivators when it comes to driver/vehicle safety improvements. This is particularly true in the trucking industry where continuing efforts are under way to improve driving safety through implementation of new technology. Factors which contribute to vehicle accidents are:
    • Speed
    • Visibility: night, fog, rain, and snow
    • Driver Fatigue
    • Highway congestion
  • Recent technology advances are available which could positively influence these factors by providing warnings to the operator and reducing the probability and/or severity of a collision. Collision avoidance is the primary goal in the application of advanced technology. Collision avoidance as applied to truck vehicles can be defined in three categories:
  • Head-on and Rear-End collision warning
  • Backing collision warning
  • Lateral collision warning
  • The purpose of the Collision Warning System is to monitor the area around a large vehicle and warn the operator of the presence or approach of an object in the roadway, such as a vehicle or pedestrian, and of the potential for a collision with that object if action is not taken. The Collision Warning System must also provide the operator with the distance to the object, its speed of approach, and its classification. The Collision Warning System consists of a display, a control module (CM), and a combination of a Forward Looking Detector (FLD), a Proximity Detector (PD) and/or a Rear Guard Detector (RGD). FIG. 1 presents the system concept for the Collision Warning System.
  • Forward Looking Detector
  • The purpose of the forward looking detector is to monitor the area in front of the vehicle, to detect objects in the path of the vehicle that represent potential accidents, and to provide distance and speed information to the CM.
  • Two radar modules 14.2 are mounted on the cab roof or lower (bumper being the lowest mounting position), and aimed in the direction of the truck's forward motion to detect objects in the path of the vehicle. FIG. 14 shows the location of the detectors 14.2 on the cab and the area of coverage. Detector data is provided to the control module where it is processed and the pertinent information is displayed to the operator for his action. Computations are performed on detected objects to determine their position, size, speed and direction of travel. Time to impact will be determined for those objects that are determined to be on a collision course with the vehicle.
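In its simplest form, the time-to-impact computation described above reduces to range over closing rate. This sketch assumes range in feet and closing speed in mph; the actual CM computation may also account for acceleration, which is omitted here:

```python
def time_to_impact_s(range_ft, closing_speed_mph):
    """Seconds until impact for an object on a collision course:
    remaining range divided by closing rate (30 mph = 44 ft/s)."""
    ft_per_s = closing_speed_mph * 5280.0 / 3600.0
    return range_ft / ft_per_s

# An object 220 ft ahead closing at 30 mph will be reached in 5 seconds.
tti = time_to_impact_s(220.0, 30.0)
```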
  • The FLD 14.2 must operate reliably in a complex environment consisting of:
  • Varying rates of speed
  • Rural, urban, and freeway conditions
  • Straight stretches of road as well as curves in the road
  • 2 lane, 4 lane, 6 lane and off-road conditions
  • Divided highways with a median or barricade
  • Level roads, uphill roads, and downhill roads
  • Extreme variations in environmental conditions.
  • Two modes of operation are required: the Primary Mode, which is concerned with the potential for accidents directly in the path of the vehicle, as described above; and the Secondary Mode, which includes the Primary Mode plus detection of objects to the right of a snow plow that could impact a wing plow.
  • Proximity Detector
  • The PD is designed to detect objects in the immediate perimeter of a tractor-trailer. Radar modules are mounted in an array around the periphery of the cab and trailer. FIG. 15 shows the location of each radar module and the area of coverage. The proximity detector modules detect objects in the perimeter field and provide the data to the control module for processing. After pertinent data is derived, it is sent to a display where the driver is alerted to take appropriate action to avoid a collision. The front modules will look for small children or objects immediately in front of the vehicle. The right and left side mounted modules will detect vehicles, pedestrians, and objects that may not be clearly visible to the driver. The rear mounted modules will monitor the area directly behind the vehicle.
  • A special case on snow plows requires that the center rear mounted RM be used to measure time-to-impact for vehicles approaching from the rear. The CM will activate a pulsed high intensity light to warn the driver of the oncoming vehicle of the presence of the snowplow.
  • The PD Modules are selectively activated by control signals sent by the CM. The conditions under which they are activated include:
  • Activate front PD Module group when speed is under five mph.
  • Activate rear, left and right PD Module groups when the transmission is in reverse.
  • Activate left or right PD Module group when left or right turn signal is turned on.
  • Activate right, left, and front PD Module groups when the three way mode is selected.
  • BIT initiation.
  • Master Clear: initializes all electronics in the proximity detector.
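The sensor-state activation conditions above can be summarized as a small decision function. The state names and group labels are illustrative; BIT initiation and Master Clear are omitted, as they are commands rather than vehicle-state conditions:

```python
def active_pd_groups(speed_mph, gear, left_signal, right_signal, three_way):
    """Return the set of PD module groups the CM should activate, per
    the activation conditions listed above."""
    groups = set()
    if speed_mph < 5:
        groups.add("front")            # low-speed front coverage
    if gear == "reverse":
        groups.update({"rear", "left", "right"})
    if left_signal:
        groups.add("left")
    if right_signal:
        groups.add("right")
    if three_way:
        groups.update({"right", "left", "front"})
    return groups
```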
  • Rear Guard Detector
  • The Rear Guard Detector 160 is functionally the same as the Proximity Detector. The main difference is that the RGD 160 covers the peripheral area around the trailer only. It is a portable system which can be moved from trailer to trailer and works in conjunction with the CM in the cab. Being portable, the RGD is self-powered, and an RF link has been added to communicate with the CM in the cab. Configuration, location, and area of coverage are shown in FIG. 16.
  • The functional interface for the RGD is identical to the PD except that the interface uses an RF link to transmit data to the CM rather than a hard-wired connection. The CM sends activation signals as follows:
  • Activate rear, left and right RGD Module groups when the transmission is in reverse.
  • Activate left or right RGD Module group when left or right turn signal is turned on.
  • Activate right, left, and front RGD Module groups when the three way mode is selected.
  • BIT initiation.
  • Master Clear: initializes all electronics in the RGD.
  • Detailed functional descriptions for each of the detector subsystems are provided below.
  • Radar Module
  • The Radar Modules are a combination of off-the-shelf motion sensors, an amplifier and a signal processing chip. They come in three configurations: Type A, with a motion sensor, an amplifier and a microcontroller; Type B, with a motion sensor and an amplifier; and Type C, with a motion sensor with a large antenna, an amplifier and a microcontroller. A notional diagram of the Type C RM Interface Board 170 is shown in FIG. 17. The motion sensors are microwave motion sensors that operate in the X-band frequency range. These modules utilize a dielectric resonator oscillator and a microstrip patch antenna to achieve low current consumption, high temperature stability, high reliability, and a flat profile.
  • The radar modules for the PD and RGD systems will come in two generic types. Type A will include the radar, an op-amp circuit, and the RM interface board. Type B will include the radar and an op-amp. Up to two Type B RMs can be connected to a Type A. The connection between a Type A and a Type B will be a 4-wire cable: one wire for +12 volts, two for signal, and one for ground.
  • The housing for the Type A and Type B RMs should be similar or the same. The Type A will have two connectors for the Type B inputs and one connector for connection to a serial port and for power. The Type B will have one connector for output and power.
  • A Type A RM will distribute power to a maximum of three radar motion sensors: the onboard motion sensor and two Type B RMs. The Type A will use up to 10 A/D ports on a microcontroller and sequentially sample data from each attached motion sensor. The Type A will also perform a 64-point FFT on each set of 5 kHz-sampled motion sensor data. The first 20 samples from the FFT results will be output via a serial channel. The location of all sensors is important to the operation of the data fusion system. A typical installation will use only five Type A's, but all of the sensors could be Type A's; they can be put in any PD or RGD position.
  • At installation, the installer will set the CM into installation mode and select on the menu, through the programmer, the position of the first RM, Type A or B. The installer will then approach the selected sensor location and wave his or her hand within one inch of the front of the antenna housing until a tone is heard from the CM, stop for five seconds, and then repeat the waving. The installer will repeat this intermittent waving until the system gives a three-beep OK response; this will typically require waving at the sensor only twice. The installer will then proceed to the next RM. All RMs will be programmed in this fashion. This will allow the CM and Type A modules to coordinate the location of each sensor.
  • At installation, the software in the CM will send an initialization serial message to all Type A modules. The software on the microcontroller will look for this message if it has not been assigned an address. Upon receiving this message, the software will perform a 64-point FFT every 300 milliseconds. The first 20 samples out of the FFT will be sent back to the control module if one of these samples crosses a threshold. The CM will use this data to identify the RM which responded to the installer. Once the installer has received the three-beep OK, the CM will send out an address number (1 to 15) to identify the RM's position (see FIG. 15). The CM will also send out the Type A's position along the truck: height, transverse distance from the left front corner of the truck, and distance from the front of the truck. The Type A will accept this data and store it in EPROM. The Type A which is being positioned will then store the port number (connector) on which the signal was being received. This will allow the Type A to respond to this port number's address when polled by the CM.
  • The position of the sensors with respect to the tractor will be communicated to the CM, by the Type A's, upon startup. When the CM initializes the system the RM's will be polled (1 through 15). Each Type A module will respond when its number or the number of an attached Type B, is polled. The Type A module will send out location and other information about the RM.
  • The Type C RM is similar to a Type A. It contains the radar motion sensor, with a 16×2 patch antenna. The software samples 256 points of data from the onboard sensor. The data is fed to a 256-point FFT. The first 128 samples from the FFT results will be output via a serial channel. To distinguish between the left and right Type C, the last pin on the left connector will be shorted to ground. This pin will not be used for anything else (power or signal). The wiring in the FLD enclosure will be fixed such that it cannot be confused and reversed. The FLD uses two Type C RMs.
  • One rear guard RM (#14-14) (the center one) will be configured to search a shorter range to assist in increasing back-up range accuracy. This RM must be a Type A RM. The CM will command this RM to sample either a unity-gain op-amp channel or the normal-gain op-amp channel (every Type A will be able to do this). This will allow the RM to be used for long-range detection when the vehicle is not in reverse and for short-range measurements when the vehicle is in reverse. This same command from the CM will change the sampling rate on the A/D unity-gain channel to 2 kHz when in reverse (providing a 2.5-times finer measurement of vehicle speed).
  • Power Distribution
  • The power distribution plan 180 is shown in FIG. 18. The vehicle battery powers the CM. The +12 volts is filtered and fused in the CM. The +12 volts is then supplied to the FLD and any Type A RM on the cab or on a truck without a trailer. Trucks with a trailer will have power to all cab Type A RMs and to a transceiver. The trailer will use either a set of PDs or an RGD. The PDs will use the trailer's +12 volt power to supply the transceiver/power converter module. This module will filter and fuse the +12 volts, convert the power to +3 volts for the transceiver, and pass the +12 volts through. The +12 volt power will then be sent to all trailer Type A modules. The Type A and B modules will regulate the +12 volts down to +5 volts with a DC-to-DC converter. The RGD is powered by its own 12 volt battery and will distribute power from this battery in the same manner as the PD.
  • Cabling
  • The communications signals between the modules are shown in FIG. 19. Type B's send audio frequency signals to the Type A's. Type A's send RS-485 at 19,200 Baud to either a transceiver or the CM directly. The RS-485 cabling is T′ed between the Type A modules. Type C's send data over a 400 KBaud RS-485 interface. All RS-485 interfaces are two-way.
  • The cabling between the Type A and Type B consists of four wires: +12 volts, ground, and two for Signal. The cabling between a Type A and the CM or a transceiver is four wires: two for RS-485, +12 volts, and ground. The cabling between a Type C and the CM is four wires: two for RS-485, +12 volts, and ground. The cabling between a transceiver and the CM is four wires: two for RS-485, +12 volts, and ground. The pin configuration for the connectors is shown in FIG. 20.
  • Calibration
  • At the time of manufacturing testing the Radar Modules may need to be calibrated. A calibration fixture consisting of a fan permanently mounted to one end of a rectangular tube assembly will be used to program a gain characteristic number into the sensor microcontroller memory. This will be done for Type A and C modules. The coding in the microcontroller will be put in manufacturing mode and will expect a specific return from the test assembly. A number denoting the difference between the expected and the measured value, to the nearest dB, will be stored. This will be sent via the header message to the CM for use in signal processing.
  • Microcontroller Firmware
  • The microcontroller will perform the following functions in firmware:
  • 1. Manufacturing test Calibration data storage
  • 2. Installation Initialization
  • 3. 2-way Serial Communications
  • 4. Multi-channel A/D
  • 5. 256 or 64 point FFT
  • 6. Command Logic Processing
  • Forward Looking Detector Functional Description
  • The FLD consists of two Type C Radar Modules. The Block diagram of the FLD 190 is shown in FIG. 21.
  • These two sensors are narrow-beam motion sensors. The beamwidth is 8.5 degrees at the 3 dB point of the antenna pattern. The two sensors are pointed across each other as shown in FIG. 10. This results in a 0 to 10 dB antenna pattern change for both of the sensors, focused in a 10-foot column 300 feet in front of the truck. The difference in antenna pattern gain will be used to differentiate between objects directly in front of the truck and objects not directly in front of the truck.
  • The RM alignment for the FLD 190 is shown in FIG. 22. This alignment is such that the antenna gain is 10 dB lower than the peak at the edge of a 10-foot by 300-foot rectangle.
  • Interface Board
  • The Interface Board is built into the FLD Type C RM and is the primary interface between the RM and the CM. The Interface Board uses chips from MicroChip Development Systems. These MicroChip chips will be used to perform the A/D, FFT/signal processing, and communications formatting for the messages. The messages will either be parallel or serial depending on the most cost-effective method that meets the FLD to CM data rate requirements. These chips are powered by a +5 volt DC source and are programmable in C and assembly language. The Interface Board performs four primary functions:
  • Timing: generate on/off power pulses to the radar modules for either minimization of power consumption or to meet FCC regulations. Timing between the two MicroChip A/D chips is handled by handshaking with the CM. This timing controls the sampling, FFT, and data transfer to the control module. Sample time for each FLD sensor is 25.6 ms for 256 samples of data at 10 kHz. Using two FLD sensors collecting data simultaneously and combining the data in the control module, the overall sensor report data rate would be approximately 50 milliseconds.
  • A/D: Digitizes the FLD radar data. The A/D function performs a ten-bit quantization of the incoming analog data. Individual MicroChip A/D processors are used for each FLD sensor. This allows minimal latency and a faster overall sampling rate. Five channels will be used on the A/D. The first channel will sample a high-gain op-amp output, the second channel will sample a low-gain op-amp output, and the third channel will sample the lowest-gain op-amp output. The fourth and fifth channels will be used to set the reference voltage on the A/D. This will provide a 90 dB dynamic range when using a 10-bit A/D.
  • FFT: Performs a standard 256-point FFT using the 256 samples collected. Only the lower 128 points are returned to the control module (this corresponds to a closing-rate resolution of 1.25 mph per Doppler bin). (This is subject to change based on the speed of the MicroChip processing. If processing is too slow in the MicroChip, the FFT will be performed in the CM and 256 samples of data will be transferred.)
  • Communications with the Control module: Provides a serial data interface.
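The 1.25 mph-per-bin figure quoted above can be checked with a short calculation. The 10.525 GHz carrier frequency below is an assumption (the text says only that the sensors are X-band); the Doppler shift for a closing speed v is 2·v·f/c:

```python
def mph_per_doppler_bin(sample_rate_hz, fft_size, carrier_hz=10.525e9):
    """Closing-rate resolution of one Doppler FFT bin for a CW radar."""
    c = 299_792_458.0          # speed of light, m/s
    mph_to_ms = 0.44704        # 1 mph in m/s
    bin_width_hz = sample_rate_hz / fft_size
    hz_per_mph = 2.0 * carrier_hz * mph_to_ms / c
    return bin_width_hz / hz_per_mph

# The FLD case: a 256-point FFT on 10 kHz samples gives roughly the
# 1.25 mph per bin quoted above.
resolution = mph_per_doppler_bin(10_000, 256)
```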
  • The MicroChip PIC17C756 series chip will be used for the Type C Radar Module. This chip requires one oscillator at 33 MHz.
  • Data Communications
  • The first 128 samples from the FFT results, from each Type C in the FLD, will be output via a serial channel. This channel will be a two-way communications link with the CM. When the data is ready (i.e., the FFT is finished), the chip will wait for the command to send the data. The data (256 bytes) will be transferred in less than 10 milliseconds. This equates to a data rate of 257,000 Baud of unpacked data. Each pair of bytes will contain one 16-bit point of the FFT output. A header message will accompany the data, identifying the RM being sampled. The Interface Design Specification will define this message.
  • Circuit Design
  • The Type C RM consists of two major parts, the off-the-shelf motion sensor and the interface board. The interface board will be manufactured by ATI (Altra Technologies, Inc., 18220 South Shore Lane W., Eden Prairie, Minn.). It is a 4-layer board approximately 4″ by 3″. It contains one MicroChip PIC17C756 chip, an op-amp and various discrete components. It is wired to a five-pin connector on the RM housing.
  • Proximity Detector Functional Description
  • The purpose of the PD is to monitor the area around the periphery of the vehicle by detecting objects that could potentially be struck by the vehicle if it moved left, right or back and to provide distance information to the CM. FIG. 23 presents the Block diagram for the PD 230.
  • The PD uses Type A and B Radar Modules. This array of radars will be interconnected in groups of up to three radars to a RM Interface Board which is used to sample all three RMs simultaneously and send the processed data to the CM upon request. The multi-port interface card in the CM will cycle through each device sampling the information. As in the FLD, the object signal data from the RMs is digitized and sent by wire link to the CM for processing. The CM will control the sampling. FIG. 24 is a diagram of the timing for a single RM interface board and the associated RM's. The only unknown time is the time for performing the FFT and associated formatting of the data. It is not believed that this time approaches the idle time for the interface board.
  • The individual sensors are switched on for 12.8 milliseconds every 333 milliseconds (more than one sensor group will collect data at the same time). They are sampled at 5 kHz, giving 64 samples of data. The CM multi-port Interface Card sequences through the PD RM interface boards until all fifteen RMs have been sampled. This function is repeated every 333 milliseconds.
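The 12.8-millisecond collection window follows directly from the sample count and A/D rate:

```python
def sample_window_ms(n_samples, rate_hz):
    """Duration of one RM's collection window: n samples at the A/D rate."""
    return 1000.0 * n_samples / rate_hz

# 64 samples at 5 kHz gives the 12.8 ms window cited above.
window = sample_window_ms(64, 5000)
```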
  • An installation of the PD on a Cab and Trailer rig will require an RF link between the trailer and the cab. The transceiver at the trailer will contain a power supply to derive +3 volts from the trailer power of +12 volts.
  • Hardwired installations will use a 4-wire cable between the CM and the Type A RMs. The 4-wire cable will carry: +12 volts; power ground; two wires for the two-way serial communication.
  • Type A Radar Module
  • Interface Boards are built into the Type A Radar Module and they are the primary interface between groups of RMs and the CM. The Interface Board uses a MicroChip chip in the same family as described in Section 3.2 and performs similar functions. The physical layout of the Type A module is shown in FIGS. 25A and 25B. FIG. 25A is a side view of the Type A module layout and FIG. 25B is a top view of the Type A module layout. The Interface Board performs the following functions:
  • MicroChip
  • The MicroChip PIC17C756 series microcontroller chip will be used for the Type A Radar Module. This chip requires one oscillator at 4 MHz. Using a serial EEPROM, the microcontroller will have identification encoded in it to provide the RM ID back to the CM and to know when to respond to CM commands.
  • A/D
  • The A/D function of the MicroChip will use up to 10 channels, sampling at a rate of 5 kHz for 64 samples. The A/D will be switched on and off via the software in the MicroChip. The collection will be synchronized with the other Type B RMs connected to the Type A (see FIG. 24). The 10 channels used on the A/D are three for each Type B motion sensor and four for the on-board Type A motion sensor. The first of the three channels for a motion sensor will sample a high-gain op-amp output, the second channel will sample a low-gain op-amp output, and the third channel will sample the lowest-gain op-amp output. This will provide at least 90 dB of the dynamic range necessary for close approach of objects. The channels will be examined, and when the high-gain channel is at its maximum value, the second channel will be used in the signal processing.
  • When in the reverse gear, the center rear facing RM will be set to use the unity gain op-amp only. This sensor will be sampled at a 2 kHz rate to get more precise measurement of vehicle speed.
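The multi-gain channel selection described above can be sketched as follows. The channel names and the saturation test against the 10-bit full-scale value are illustrative assumptions:

```python
def select_channel(samples_by_gain, adc_max=1023):
    """Use the highest-gain op-amp channel that has not saturated the
    10-bit A/D; fall back through the lower-gain channels otherwise."""
    for name in ("high", "low", "lowest"):
        if max(samples_by_gain[name]) < adc_max:
            return name
    return "lowest"
```

For example, if the high-gain channel is pinned at full scale, the low-gain channel's samples are used instead.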
  • FFT
  • The software samples 64 points of data. This data is fed to a 64 Point FFT. At this time it is believed that the PIC17C756 series of chip is capable of performing a 64 point FFT in the required time.
  • Data Communications
  • The first 20 samples from the FFT results, for each attached RM, will be output via a serial channel. This channel will be a two-way communications link with the CM. When the data is ready (i.e., the FFT is finished), the chip will wait for the command to send the data. The data (60 bytes) will be transferred in about 30 milliseconds. This equates to a data rate of 19,200 Baud. A header message will accompany the data, identifying the RM being sampled. The Interface Design Specification will define this message.
  • System Operation
  • Multiple Radar Modules
  • The PD consists of multiple Type A and Type B radar modules. Possible PD RM combinations include at least one Type A and up to two Type B modules. Upon initialization in the CM, the CM will poll for each Type A RM. The Type A RM when first powered up will check the two ports for Type B RMs and detect the existence of a RM. This data will be reported back during the CM's initial poll. The CM will build a table in RAM of each RM and its position, for use when performing other detection and tracking functions.
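The CM's startup poll and RAM table can be sketched as below. The poll callback and record contents are illustrative stand-ins for the serial protocol, which the text leaves to the Interface Design Document:

```python
def build_rm_table(poll):
    """Sketch of the CM's initial poll: query RM addresses 1-15 and
    record each module that answers, along with its reported data."""
    table = {}
    for address in range(1, 16):
        reply = poll(address)
        if reply is not None:
            table[address] = reply   # e.g. {"type": "A", "position": ...}
    return table

# A stand-in bus with modules answering at addresses 1, 2 and 14:
fake_bus = {1: {"type": "A"}, 2: {"type": "B"}, 14: {"type": "A"}}
table = build_rm_table(fake_bus.get)
```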
  • The Type A RM's will have a code that will indicate to the CM the location of each RM in its suite. The Type A RM will respond to the CM commands when it receives a message with its address in the header. These messages will be defined in the Interface Design Document.
  • The Radar Modules will be positioned around the truck according to the diagram in the System Specification. The RMs on the side of the truck should be no more than 25 feet apart and no closer than 10 feet. The RMs on the rear should be spaced such that one is at the center and the others are as far toward the edges as possible.
  • System Timing
  • The System Timing is shown in FIG. 26 (for 2-Type A's connected to 2-Type B's each). The data collection takes 12.8 milliseconds for each RM. The signal processing takes X milliseconds per RM. The total data collection time and processing time for 15 RMs is X milliseconds. The data transmission time is 160 milliseconds. To conserve time the transmission of data will be going on from one set of RM's while another is collecting and processing data.
  • The timing of the PD RM's will be integrated into the timing of the FLD when the FLD is in operation. The CM will poll the FLD and receive an 8-millisecond burst of serial data. The CM will then poll one Type A PD RM and get up to 30 milliseconds of data. The CM then does signal and data processing for 12 or more milliseconds, processing the downloaded data. 50 milliseconds after the system polled the FLD, it repeats this sequence.
  • System Interface
  • PD to Control Module
  • The following data will be transferred at 19,200-Baud or faster.
      • Digital Object Data—Provides digitized object data from each detector for signal processing analysis in the control module. 64 samples are collected over 12.8 milliseconds per RM. 20 samples per RM are sent to the CM every 333 milliseconds. The 333 milliseconds will also be used for control transfer and other data. When used with a trailer, the signals will be transferred via an RF link for the trailer RM interface boards. The three RM interface boards on the trailer will be wired to a transceiver and a transceiver will be installed on the cab that is connected to the CM. The cab RM interface boards will be hardwired to the CM.
      • BIT—Sends PD operational status data to the Control Module.
        RF Link
  • The RF Link to the trailer installation will use an RF transceiver modem. The design of the PD will be set for a 19,200 Baud link (5 kHz sampling and 20 samples transferred). 20 samples at 5 kHz represent 40-mph coverage with a 2-mph resolution.
  • The RF Link consists of a modem and a transceiver function. The system currently under consideration contains an internal battery which will last for years (advertised time with system mostly in receive). It is anticipated that in the PD application this battery will be replaced by a DC to DC 3 volt regulator deriving power for the transceiver from the truck battery.
  • The modem provides two way serial communications. RS485 is the electrical interface for the serial link. The RF Link will require a 6-inch antenna at both the tractor and the trailer. The typical range for the link is 150 feet. If two antennas are mounted on either side of the trailer at the rear or one antenna on the front of the trailer and the top of the cab, the link will be able to handle any size truck.
  • The manufacturer of the Transceiver is Axonn. Their address is:
  • Axonn
  • Suite 202
  • 101 W. Robert E. Lee Blvd.
  • New Orleans, La. 70124
  • Phone (504) 282-8119
• They have a product, the AX-550, which exceeds the requirements for range. A new product is due out soon, which will use less power. Two units per system are needed for either the PD or the RGD.
  • The trailer mounted transceiver module in the PD application will also provide the power filtering, fusing and regulation for the trailer mounted PD radar modules.
  • Rear Guard Detector Functional Description
• The Rear Guard Detector (RGD) is the functional equivalent of the PD system for the trailer only. It is designed to be portable and can be moved from trailer to trailer. Communications with the CM in the cab will be made over a wireless RF data link. The RGD's purpose is to monitor the area around the periphery of the trailer, to detect objects that could potentially be struck by the vehicle as it moves left, right or backward, and to provide object distance information to the CM. The functional design for the RGD 270 is shown in FIG. 27.
• The RGD has a Type A RM located in the center of the three rear facing sensors, one for the pair of radar modules on the right, and one for the pair on the left. The Type A RMs output data into an RGD interface/transceiver, which sends the signal to the front cab. In the tractor a transceiver picks up the signal and converts it to a digital serial input to the CM.
  • A battery will be provided to power the RGD. This battery will be rechargeable and have a 5 Amp Hour capacity for a 25-day interval between recharging.
  • The RGD subsystem will be configured with three, five, or seven sensors. All of the RMs are mounted on one multi-detector array and will be mounted at the rear of the trailer. No electrical connection to the trailer will be required since it has a self-contained battery pack.
  • Interface Board
  • MicroChip
  • The MicroChip PIC17C756 series chip will be used for the Type A Radar Module. This chip requires one oscillator at 4 MHZ. The MicroChip will have identification encoded in it to provide RM ID back to the CM and know when to respond to CM commands.
  • A/D
• The A/D function of the MicroChip will use up to 10 channels, sampling at a rate of 5 kHz for 64 samples. The A/D will be switched on and off via the software in the MicroChip. The collection will be synchronized with the other Type B RMs connected to the Type A (see FIG. 27). The 10 channels used on the A/D are three for each Type B motion sensor and four for the on-board Type A motion sensor. The first channel of the three for a motion sensor will sample a high gain op-amp output. The second channel will sample a low gain op-amp output, with the third channel sampling the lowest gain op-amp. This will provide at least 90 dB of dynamic range necessary for close approach of objects. The channels will be examined, and when the high gain channel is at its maximum value the second channel will be used in the signal processing.
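The gain-channel selection rule described above can be sketched as follows. The 1023 full-scale value assumes the 10-bit A/D mentioned later in this document, and the function and channel names are illustrative, not from the specification.

```python
ADC_MAX = 1023  # full scale, assuming the 10-bit A/D mentioned in the spec

def select_channel(high, low, lowest):
    """Pick the most sensitive unsaturated gain stage for signal processing.

    `high`, `low`, `lowest` are lists of raw samples from the three op-amp
    gain stages of one motion sensor (names are illustrative). When the
    high gain channel hits full scale, fall back to the next stage.
    """
    if max(high) < ADC_MAX:
        return "high", high
    if max(low) < ADC_MAX:
        return "low", low
    return "lowest", lowest

# Example: the high gain stage is unsaturated, so it is selected.
name, data = select_channel([100, 200], [10, 20], [1, 2])
```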
  • When in the reverse gear, the center rear facing RM will be set to use the unity gain op-amp only. This sensor will be sampled at a 2 kHz rate to get more precise measurement of vehicle speed.
  • FFT
• The software samples 64 points of data. This data is fed to a 64-point FFT. At this time it is believed that the PIC17C756 series chip is capable of performing a 64-point FFT in the required time.
  • Data Communications
• The first 20 samples from the FFT results, for each attached RM, will be output via a serial channel. This channel will be a two-way communications link with the CM. When the data is ready (the FFT is finished), the chip will wait for the command to send the data. The data (60 bytes) will be transferred in about 30 milliseconds. This equates to a data rate of 19,200 Baud. This chip performs a 10-bit A/D and 16-bit FFT. A header message will accompany the data, identifying the RM being sampled. The Interface Design Specification will define this message.
  • System Operation
  • Multiple Radar Modules
  • The radar modules used in the Rear Guard will be the same as the PD. The RGD has a special condition where one Type A module can be installed in the middle of the rear facing mounting bracket and the signal processing in the CM will be set to give longer range performance for a snow plow application.
  • System Timing
  • The functional interface for the RGD is identical to the PD, including the use of the MicroChip chip set.
  • System Interface
  • RGD to Control Module
      • Digital Object Data—Provides digitized object data from each detector for signal processing analysis in the control module. 64 samples are collected over 12.8 milliseconds per RM. 20 samples per RM are sent to the CM every 333 milliseconds. The 333 milliseconds will also be used for control transfer and other data. The RGD will use the same RF link as the PD.
      • BIT—Sends RGD operational status data to the CM.
        RF Link
• The RGD is equipped with the same RF link as is available on the PD. A rechargeable 12-volt battery powers the RGD RF Link. The power distribution from the trailer transceiver is the same as the PD's. The RGD has a sleep mode to conserve battery power. If an activation signal is not received by the RGD for 20 seconds the system will go into standby or sleep mode. The CM will send out an activation command every five seconds when the RGD should be operational. The RF transceiver module on the trailer and the microcontroller in each Type A RM control the sleep mode. The RF transceiver and microcontroller will have a sleep mode watchdog timer set to two seconds. When in sleep mode the transceiver will activate and search for a receive signal. The CM will command a repeated transmit signal until the sleep mode stop data word is received. This signal will be used if it has been over 20 seconds since the RGD sent data to the CM. The receiver in the trailer transceiver will come on for two milliseconds and search for the transmit signal. If one is received the transceiver will activate a serial message (controlled by the CM) to wake up the microcontrollers. When all microcontrollers have reported back the RGD operation will start. The entire wake up procedure will not take more than four seconds and will usually take less than two seconds. The duty cycle is 1% while in sleep mode for power conservation.
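The 20-second sleep rule can be sketched as a small controller. This is only an illustrative model of the timing rule, with invented names; it is not the firmware design.

```python
class SleepController:
    """Sketch of the RGD sleep rule: go to sleep after 20 s without an
    activation command (the CM sends one every 5 s while the RGD should
    be operational)."""
    SLEEP_AFTER_S = 20.0

    def __init__(self):
        self.last_activation = 0.0
        self.asleep = False

    def on_activation(self, now_s):
        """An activation command from the CM keeps (or wakes) the RGD."""
        self.last_activation = now_s
        self.asleep = False

    def tick(self, now_s):
        """Periodic check: enter sleep once 20 s pass with no activation."""
        if now_s - self.last_activation >= self.SLEEP_AFTER_S:
            self.asleep = True
        return self.asleep
```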
  • Control Module Functional Description
  • The CM is a customized PC. Processor speed, memory, and interface drivers will determine the CM configuration based on a nominal set of performance requirements and hardware/cost tradeoffs. A functional diagram of the CM 280 is shown in FIG. 28. The CM 280 consists of two primary elements: an Interface Card 282 and the Processor Board 284.
  • CM Multi-Port Interface
  • The CM multi-port Interface 282 buffers the incoming data from the FLD 281, PD 283 and RGD 285, routes it to the Processor Board 284, and routes control signals from the Processor Board 284 to the FLD 281, PD 283 and RGD 285. The CM Multi-port Interface 282 routes the FLD Doppler spectrum to the Object Data Processing module 288 on the Processor Board 284. It also routes the PD and RGD Doppler spectrums to the Detection Processing module 286 on the Processor Board 284.
  • Signal/Data Processing Functions
  • The CM performs the following signal/data processing functions: Object Data Processing (FLD only), Track Report Generator (FLD only), Detection Processing (PD and RGD only), Data Fusion, Situation Report Generator, Display Driver, and System Control. The Display Driver and the System Control function are the responsibility of ATI and will not be discussed in this document except where an interface exists with one of the other functions. The remaining functions are shown in FIG. 29 and discussed below.
  • Object Data Processing
• The Object Data Processing module receives the 128 samples of FFT'd data (frequency domain signal) from each radar module. It processes these samples (256 per cycle) and determines the existence of objects for reporting (see FIG. 30). A cycle for the FLD is 50 milliseconds. The two forward looking radar modules are treated separately for clutter removal. They are combined in the Multi-Object Detector. The Object Data Processing module will output detections/velocities, associated signal strengths, pulse timing data, clutter estimates and clutter distribution.
  • Clutter Reduction
  • The frequency domain signal will be analyzed and the clutter removed individually for each sensor. The clutter is removed in four steps. These four steps are discussed in detail below. It should be noted that to perform accurate calculations in this process the height of each RM antenna is required along with the dimensions of the truck and the location of each sensor. This information will be programmed in the CM at the time of the system installation.
  • Threshold Computation
• The first step is to compute a threshold versus frequency (speed) for the received spectrum. This set of numbers (128, one for each sample) is computed from the speed of the truck and the height of the sensor above the road. The road will reflect a certain amount of energy back to the sensor. Each road surface type will reflect a different amount, but an average amount of reflection will be used since height above the road is dominant. The frequency spectrum of the clutter is related to the speed of the truck and the distance to the road. The return from directly beneath the sensor (at the sensor height) will be the strongest, and it will be at 0 Hz in the spectrum. The 38.75 Hz return (1.25 mph) will be from a distance where the velocity component of the road is 1.25 mph. The bin spacing is 38.75 Hz, thus the first 1.25 mph will appear in the first bin. The next spectral bin will be from 1.25 mph to 2.5 mph, and so on. The distance to the road for each frequency will be pre-computed, and an equation for clutter will use the resulting values. The variables in this equation are sensor height and truck speed.
• The Average Spectral Power is given by:
    P_i = ( Σ_{j = i−N/2}^{i+N/2} S_j ) / N  (1)
    Where
    • P_i is the ith Doppler bin's average power
    • S_j is the jth Doppler bin's power
    • N is the size of the sliding window (Note: at either end of the spectrum, i = 0 to N/2 or 255−N/2 to 255, N/2 points will be used)
  • The threshold for each bin (sample) is
    T_i = T_m · P_i  (2)
• Where (T_m) is the threshold multiplier.
• This value (T_m) will start at 3 dB, but it will be determined through lab and initial field-testing. The result of this multiplication is then multiplied by the truck-speed-based road surface clutter.
• This is given by:
    M_i = ( cos( sin⁻¹[ (i · 38.75) / (V_t · 31) ] ) / h )⁴  (3)
  • Where
    • M_i is the ith Doppler Bin's road clutter multiplier
    • V_t is the truck speed
    • h is the mounting height of the RM
        Thus the threshold is modified to:
    T_i = T_i · M_i
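Equations 1 through 3 can be sketched together as follows. This is an illustrative Python sketch only: the window size, the default threshold multiplier (3 dB, i.e. roughly a factor of 2 in power), and the clamp on the arcsine argument for bins beyond the truck's own Doppler are all assumptions.

```python
import math

BIN_HZ = 38.75     # FLD Doppler bin spacing from the spec
HZ_PER_MPH = 31.0  # Doppler shift per mph of closing rate, per the spec

def avg_power(spectrum, i, n):
    """Equation 1: sliding-window average power around bin i.
    At the spectrum edges the window is truncated to the available bins."""
    lo, hi = max(0, i - n // 2), min(len(spectrum) - 1, i + n // 2)
    window = spectrum[lo:hi + 1]
    return sum(window) / len(window)

def thresholds(spectrum, truck_mph, height_ft, t_mult=2.0, n=8):
    """Equations 1-3: per-bin thresholds with road-clutter shaping.

    t_mult is the threshold multiplier T_m (the spec starts it at 3 dB,
    about a factor of 2 in power). Bins whose Doppler exceeds the truck's
    own Doppler get no road-clutter multiplier; that cutoff is an
    assumption, since sin^-1 is undefined past that point.
    """
    out = []
    for i in range(len(spectrum)):
        t = t_mult * avg_power(spectrum, i, n)
        ratio = (i * BIN_HZ) / max(truck_mph * HZ_PER_MPH, 1e-9)
        if ratio < 1.0:
            m = (math.cos(math.asin(ratio)) / height_ft) ** 4  # Equation 3
            t *= m
        out.append(t)
    return out

spec = [1.0] * 128
th = thresholds(spec, truck_mph=55, height_ft=3.0)
```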
        Rain Clutter Removal
  • Removal of weather related clutter will be done in steps two and three. Rain clutter produces a distinct pattern in the frequency spectrum. The 128-sample frequency spectrum will be examined for this pattern. If found, the threshold values at the frequencies where rain is present will be adjusted for the presence of rain.
  • The pattern is recognizable over time. Distinct lines at a constant velocity and no real discernable change in signal strength over several seconds will denote rain. This condition will be flagged and the most prevalent frequencies will be marked.
  • Snow Clutter Removal
  • Snow will appear as colored noise. Several frequencies may have more noise than others, but in general the average noise will go up throughout the spectrum. The thresholds will be adjusted accordingly.
  • Object Clutter Removal
  • Step four is the search for specific clutter from stationary objects. This will be done by comparing the returns of both sensors at the same frequencies.
    Coarse Clutter Removal — for all frequency bins:
      if S1_i > S2_i + 30 dB then Fr_i = TRUE  (4)
  • Where:
    • Fr_i is the ith flag for probable road clutter
    • S1_i is the left sensor's ith Doppler bin's power
    • S2_i is the right sensor's ith Doppler bin's power
        Depending on the angle to an object the frequency and amplitude will change. Stationary objects on the side of the road at close ranges will appear stronger in one sensor and at frequencies lower than the speed of the truck. These objects and their frequencies will be noted for processing later in the data fusion function.
• Road clutter from a specific object such as a sign or bridge will appear at one frequency. As the truck approaches, if the object is on the side of the road, the frequency will decrease and the signal strength in one sensor will decrease (with respect to the R⁴ curve) while the signal in the other sensor will increase faster than expected.
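Equation 4's coarse flagging can be sketched as follows; the example bin powers are invented for illustration.

```python
def flag_road_clutter(s1, s2, margin_db=30.0):
    """Equation 4: Fr_i is TRUE where the left sensor's bin power (dB)
    exceeds the right sensor's by more than 30 dB, marking probable
    stationary roadside clutter. s1/s2 are per-bin powers in dB."""
    return [a > b + margin_db for a, b in zip(s1, s2)]

# Example: only the middle bin shows a >30 dB left/right imbalance.
flags = flag_road_clutter([40.0, 80.0, 50.0], [35.0, 40.0, 45.0])
# flags -> [False, True, False]
```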
  • Multi-Object Detection
  • After clutter reduction, the frequency spectrum, object clutter candidates, and clutter thresholds will be fed into a Multi-Object Detection algorithm. This algorithm will be used to differentiate between multiple returns in the spectrum from objects that are and are not clutter. This algorithm will be designed to offer up candidate detections, which when combined with other sensor and truck data can be used to determine the actual presence of an object. The pair of FLD radar modules will be used in this algorithm to differentiate between clutter and objects not in front of the truck. Three steps are performed to find the candidate detections.
  • Threshold Application
  • The first step is the application of the clutter thresholds (equation 2) to the entire spectrum and the elimination of the colored clutter.
  • For all Frequency Bins
    C_i = 1 if S_i > T_i  (5)
• Where C_i is one if a threshold crossing was detected at the ith Doppler bin
  • If a particular frequency bin exceeds the threshold it will be stored for later processing as a detection candidate (Ci). If a certain segment of frequency bins produces an excessive number of detections the thresholds will be raised (see equation 1) in that region and the strongest detections will be reported.
  • Detection
• The second step is to detect the threshold crossings. These crossings will be compared to each other and to the estimated road/object clutter data. The two sensors will be combined in this step. After the initial clutter removal the frequency spectrums of the two sensors will be compared for all threshold crossings. For a candidate detection to be declared, a threshold crossing must have occurred for each radar module at frequencies not more than ±1 FFT bin apart, with no more than 30 dB SNR difference (Fr flag).
    For all Frequency Bins
      IF <NOT> Fr_i
        IF (CL_i <AND> CR_(i−1)) <OR> (CL_i <AND> CR_i) <OR> (CL_i <AND> CR_(i+1))
          D_i = 1
    Where D_i is the ith frequency bin detection flag.
  • If either condition is violated a detection will not be declared for that threshold crossing. This step will output no more than 15 candidate detections. The 15 candidates with the highest frequency (fastest closing rate) will be given priority.
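The threshold-application and two-sensor detection steps can be sketched as follows. The example spectra are invented; the ±1-bin neighbor test and the 15-candidate cap follow the text, while the priority rule (highest frequency first) is applied by a simple sort.

```python
def crossings(spectrum, thresh):
    """Equation 5: per-bin threshold-crossing flags (C_i)."""
    return [s > t for s, t in zip(spectrum, thresh)]

def candidate_detections(cl, cr, fr, max_out=15):
    """Declare a candidate where the left sensor crosses and the right
    sensor crosses within +/-1 bin, unless the bin is flagged as probable
    clutter (Fr). Highest-frequency (fastest-closing) bins are kept first."""
    n = len(cl)
    dets = []
    for i in range(n):
        if fr[i] or not cl[i]:
            continue
        neighbors = [cr[j] for j in (i - 1, i, i + 1) if 0 <= j < n]
        if any(neighbors):
            dets.append(i)
    return sorted(dets, reverse=True)[:max_out]

# Tiny 4-bin example: bin 3 is flagged as clutter, bin 1 has no left crossing.
cl = crossings([5.0, 1.0, 5.0, 5.0], [2.0] * 4)   # left-sensor crossings
cr = crossings([1.0, 5.0, 5.0, 1.0], [2.0] * 4)   # right-sensor crossings
fr = [False, False, False, True]
dets = candidate_detections(cl, cr, fr)
```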
  • In the future the CM will tell the FLD when a turn is underway and the direction of the turn. When a turn is detected the sensor pointing in the direction of the turn will be allowed to have stronger detections. If the sensor in the opposite direction of the turn has a signal over 20 dB stronger that crossing will not be accepted for detection.
• If the Secondary Mode is in use this second step will allow the radar module on the left side of the truck to have a mismatch in SNR with the radar module on the right. By allowing the left radar module to have stronger returns it will effectively widen the detection pattern to the right. The values for this imbalance will be determined when the antenna patterns for the two radar modules are provided by the manufacturer.
  • Pre-Tracker
  • The final step in the Multi-Object Detector is to eliminate all but the five best detections. This algorithm is the first stage of a tracker. The detections will be sorted by closing velocity and SNR. The objects that will most likely reach the truck first will be given highest priority. If over five detections exist the pre-tracker will then sort the detections further. The pre-tracker will compare the detection to objects already being tracked from previous sensor cycles. Those detections closest to existing tracks will receive priority within the original sort.
• These detections will be “associated” with the existing tracks. If more than one detection associates with a track, the closest detection, in speed and SNR, will be marked as the best association with the track. More than one detection may associate with one track. Up to five detections will be passed on. These detections will be the objects that will reach the truck first and have been around the longest. In all cases the shortest time to impact will be given priority. Longevity will only be used to sort detections that have closing speeds within 10 mph and SNRs within 20 dB of each other. That is, if there are more than five detections, all at about the same speed, the associated detections with the longest track existence time will be output.
    Association
    for all detections j
      for all tracks L
        if Rdot_j − Gs < Rdot_L < Rdot_j + Gs
          if (SNR_j − SNR_L) < 20 dB
            then Associate
    Where Rdot is the closing rate
      j is the detection counter
      L is the current track
      Gs is the speed gate
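The association gate above can be sketched as follows. The speed-gate value is illustrative (the text leaves Gs unspecified), and the SNR test is taken as symmetric, which is an assumption beyond the one-sided pseudocode.

```python
def associate(detections, tracks, speed_gate=5.0, snr_gate_db=20.0):
    """Associate each detection j with every track L whose closing rate
    lies within the speed gate Gs and whose SNR differs by less than 20 dB.

    Each detection/track is a dict with 'rdot' (closing rate, mph) and
    'snr' (dB). Returns (j, L) index pairs; gate values are illustrative.
    """
    pairs = []
    for j, det in enumerate(detections):
        for l, trk in enumerate(tracks):
            if (det["rdot"] - speed_gate < trk["rdot"] < det["rdot"] + speed_gate
                    and abs(det["snr"] - trk["snr"]) < snr_gate_db):
                pairs.append((j, l))
    return pairs

# Example: the detection gates onto track 0 but not the much faster track 1.
dets = [{"rdot": 30.0, "snr": 40.0}]
trks = [{"rdot": 28.0, "snr": 35.0}, {"rdot": 50.0, "snr": 40.0}]
links = associate(dets, trks)
```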
  • Objects extending over the track and across the road, such as bridges and signs will require special processing. This processing will use the combined SNR and examine stationary objects over time. The characteristics of the SNR over time from flat or non-complex objects such as a bridge or sign will be used to identify these objects.
• NOTE: The data collection and detection rate of the FLD is 50 milliseconds. The Data Processing, Tracking and Data Fusion functions will use several cycles of detections to produce the best answers. Several points need to be made clear here. First, within the first 100 milliseconds the system can produce an alarm condition when the appropriate time to impact is measured. Second, the calculations for a tracking system of this type become stable over 10 to 15 cycles, which corresponds to 0.5 to 0.75 seconds. Third, the higher accuracy of a stable system is only required in computing the larger time to impact numbers. Fourth, in most conditions the stable track will allow the system to “track” an object into the alarm area or region. The same approach is used in the PD and RGD systems and is discussed later. These same four points apply to the PD and RGD, except the cycle time is six times longer. These systems are not measuring events that are as time critical, since velocities in the directions being monitored are not nearly as high.
  • Object ID
  • With a range estimate, the signal strength versus time (scintillation characteristics) and average signal strength at an estimated range, an ID can be computed for a tracked object. The signal strength at an estimated range provides an estimate of radar cross section. From this radar cross section an initial categorization as a truck/large vehicle, a car or other can be determined. From scintillation combined with a small radar cross section the categorization as a human/large animal or a sign can be determined.
• First the algorithm will compare the two forward looking radar module signals. This comparison will be a time-based comparison of a track's speed and SNR with the current associated detection. Tracks that are traveling slower than 20 mph will be designated human. All new tracks will be given a truck designator until more than one cycle of data has been gathered. The SNR will be averaged over time for each track.
    RCS_j = { Savg_j / R_j⁴ }^(1/2)  (6)
  • Where
    • RCS_j is the jth track's radar cross section
    • Savg_j is the time averaged signal strength
    • R_j is the estimated range (initialized to 300 feet for the FLD and 25 feet for the PD/RGD)
• IF RCS_j > XdB
    Then ID_j = TRUCK
• ELSE IF RCS_j > YdB
    Then ID_j = CAR
• ELSE ID_j = HUMAN
  • Where
      • XdB is the expected RCS for a truck in dB meters
      • YdB is the expected RCS for a car
      • IDj is the jth track's ID
  • The Object ID Algorithm will attempt to differentiate between small objects and large objects. This algorithm will help to eliminate false alarms from the side of the road and in front of the truck when approaching and during a turn by identifying the track as not a truck or a car.
  • A probability of correct ID and an associated confidence level will be computed for each ID. These parameters will be set from an equation empirically derived during testing of the system.
• IF ID_j = TRUCK
    PCID_j = 1 − (ZdB − Savg_j)/ZdB
• ELSE IF ID_j = CAR
    PCID_j = 1 − (VdB − Savg_j)/VdB
• ELSE IF ID_j = HUMAN
    PCID_j = 1 − (UdB − Savg_j)/UdB
  • Where
      • ZdB is the expected radar cross section of a truck
      • VdB is the expected radar cross section of a car
      • UdB is the expected radar cross section of a human
      • PCIDj is the jth probability of correct ID for the jth track
• The ID confidence is given by:
    CF_j = ( Σ_{i=1}^{N} PCID_ji ) / N
  • Where
    • CF_j is the confidence factor for the jth track's ID
    • N is the current number of cycles tracked
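The ID and confidence computations above can be sketched as follows. All threshold constants here (the XdB/YdB cuts and the expected-signal table standing in for ZdB/VdB/UdB) are placeholders, since the document leaves them to be set empirically during testing.

```python
import math

# Placeholder expected time-averaged signal strengths (dB), standing in
# for the spec's ZdB / VdB / UdB values, which are to be set during testing.
EXPECTED_DB = {"TRUCK": 50.0, "CAR": 30.0, "HUMAN": 10.0}

def radar_cross_section(savg, range_ft):
    """Equation 6: RCS_j = sqrt(Savg_j / R_j^4), in arbitrary linear units."""
    return math.sqrt(savg / range_ft ** 4)

def identify(rcs, x_thresh=1e-3, y_thresh=1e-5):
    """RCS-threshold ID; the XdB/YdB cut values here are illustrative."""
    if rcs > x_thresh:
        return "TRUCK"
    if rcs > y_thresh:
        return "CAR"
    return "HUMAN"

def p_correct_id(obj_id, savg_db):
    """PCID_j = 1 - (expected - Savg)/expected, per the spec's equations."""
    e = EXPECTED_DB[obj_id]
    return 1.0 - (e - savg_db) / e

def id_confidence(pcids):
    """CF_j: the mean PCID over the N cycles tracked so far."""
    return sum(pcids) / len(pcids)
```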
        Track Report Generator
  • The candidate detections will be combined with the object ID data and analyzed for the presence of an object for which a report should be generated. Stable object reports or tracks will be required to achieve the range estimate accuracy desired. The object reports will be of the closest object determined to not be a false object. There will be up to five reports every 50 milliseconds. The object reports will be sent to the Data Fusion algorithm for further processing.
  • Condition Checking
• The trackfiles generated by the Multi-Object Detector contain kinematics from previous tracks and, for each track, an associated detection (if one was available). The track and the detection need to be merged. The first step in merging the track and detections is the reduction of the detections associated with each track. The Condition Checking function will eliminate all but the best detection. If more than one detection associates with a track, this function will compare the SNR of the track to the SNR of the detection and compare the closing rate of the track with the closing rate of the detection. The closest match in closing rate with a reasonable match in SNR will be correlated with the track, and the other detections will be made available to create a new track or correlate with another track.
    For all Tracks j
      For all Associations k
        IF NA > 1
          IF (Rdot_j − Rdot_k) < (Rdot_j − Rdot_b)
            IF (SNR_j − SNR_k) < (SNR_j − SNR_b)
              b = CRID_j = k
    Where b is the association number for the best correlated detection
      CRID_j is the jth track's best correlating association
      NA is the number of detections associated with the track
• The Condition Checking function will output a set of “correlated” tracks and detections, one detection per track. There will be up to five existing tracks and five new tracks (if all five detections did not associate with any tracks) output from this function. Normally there will be five correlated tracks. NOTE: Truck speed will be used to convert the detection and track closing rate into object speed. This will prevent large changes in truck speed from eliminating the track correlation.
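The best-association selection can be sketched as follows. This is illustrative only; the tie-breaking order (closing-rate match first, then SNR match) follows the text's stated priority, and the dict field names are invented.

```python
def best_association(track, dets):
    """Keep the single detection whose closing rate and SNR best match
    the track; the rest are released to seed new tracks or correlate
    with other tracks. Inputs are dicts with 'rdot' (mph) and 'snr' (dB).
    """
    if not dets:
        return None, []
    best = min(dets, key=lambda d: (abs(track["rdot"] - d["rdot"]),
                                    abs(track["snr"] - d["snr"])))
    rest = [d for d in dets if d is not best]
    return best, rest

# Example: the 29 mph detection matches the 30 mph track more closely.
track = {"rdot": 30.0, "snr": 40.0}
cands = [{"rdot": 25.0, "snr": 40.0}, {"rdot": 29.0, "snr": 20.0}]
best, rest = best_association(track, cands)
```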
  • Track Maintenance
• The correlated track/detection data will be used to maintain the tracks. A new track will be created from detections not associated with tracks. New tracks will be kept for up to five radar module cycles (50 milliseconds per cycle). If a second detection is associated with a new track before five cycles pass without an association, the new track is made a hold track. Hold tracks must experience 10 cycles in a row of no detection associations before they are eliminated. The Track Maintenance function will apply these rules and output a set of trackfiles containing new and hold tracks. The trackfiles will be identified as coming from the FLD. (See FIG. 21, Track Report Generator Functions.)
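The new/hold track life-cycle rules can be sketched as a small state machine; class and field names are invented for illustration.

```python
class Track:
    """Sketch of the new/hold track life-cycle: a new track is promoted to
    hold on its second association, dies after 5 cycles without one; hold
    tracks die after 10 consecutive cycles with no association."""
    NEW_LIMIT = 5    # cycles a new track survives without a second hit
    HOLD_LIMIT = 10  # consecutive misses before a hold track is dropped

    def __init__(self):
        self.state = "NEW"
        self.misses = 0
        self.alive = True

    def update(self, associated):
        """Call once per radar module cycle (50 ms for the FLD)."""
        if associated:
            if self.state == "NEW":
                self.state = "HOLD"   # second association promotes the track
            self.misses = 0
        else:
            self.misses += 1
            limit = self.NEW_LIMIT if self.state == "NEW" else self.HOLD_LIMIT
            if self.misses >= limit:
                self.alive = False
```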
  • Detection Processing PD and RGD
  • The PD and RGD are not as sophisticated as the FLD. The clutter processing will be a simpler version of the FLD processing. The detection process will also be a simpler version of the Multi-Object Detection algorithm used in the FLD. The functional design is shown in FIG. 32 and is discussed below.
  • Clutter Reduction
• The Detection Processing module receives the 20 samples of FFT'd data from the Interface boards. There can be up to 15 sets of data. Each set of data will be clutter processed individually. The clutter processing will have the same functions as described earlier for the FLD but the functions will be adapted to the PD or RGD requirements.
  • Threshold Computation
  • First the main difference is the clutter will have a different spectral characteristic for each radar module view angle. The forward looking radar modules will have similar clutter to the FLD. The side looking sensors will have clutter which is lower in frequency and for the radar modules closest to and viewing the side of the road the clutter will be stronger. The rear modules will have the most action from objects approaching the truck at low speeds relative to the truck's speed. All of these specific conditions will be addressed in the Threshold Computation Function and the Road/Object Clutter Location Function.
• The frequency domain signal will be analyzed and the clutter removed individually for each sensor (see Equation 1). The clutter is removed in four steps. The first step is to compute a threshold versus frequency (speed) for the received spectrum. This set of numbers (20, one for each sample) is computed from the speed of the truck and the height of the sensor above the road. The frequency spectrum of the clutter is related to the speed of the truck, the distance to the road and the view angle of the radar module. The return from the sensor height will be the strongest and it will be at 0 Hz in the spectrum. The 31 Hz return (1 MPH) will be from a distance where the velocity component of the road is 1 MPH. The next spectral bin will be from 1 MPH to 2 MPH, and so on. The distance to the road for each frequency will be pre-computed and an equation for clutter will use the resulting values (see Equation 2, except the bin spacing is 31 Hz versus 38.75 Hz). The values in this equation are sensor height and truck speed.
  • Rain Clutter Removal
• Removal of weather related clutter will be done in steps two and three. Rain clutter produces a distinct pattern in the frequency spectrum. The 20-sample frequency spectrum will be examined for this pattern. If found, the threshold values at the frequencies where rain is present will be adjusted for the presence of rain.
  • Snow Clutter Removal
  • Snow will appear as colored noise. Several frequencies may have more noise than others, but in general the average noise will go up throughout the spectrum. The thresholds will be adjusted accordingly.
  • Object Clutter Detection
  • Step four is the search for specific clutter from non-moving objects. This will be done by flagging large returns (see Equation 4). Objects that are stationary will appear at specific frequencies in the spectrum. Depending on the angle to the object the frequency and amplitude will change. Objects on the side of the road at close ranges will appear stronger in one sensor and at frequencies lower than the speed of the truck. These objects and their frequencies will be noted for processing later in the data fusion function.
  • Multi-Object Detection
  • After clutter reduction, the frequency spectrum, and clutter thresholds will be fed into a Multi-Object Detection algorithm. This algorithm will be used to detect multiple objects in the presence of road, snow and rain clutter. This algorithm will be designed to offer up candidate detections, which when combined with other sensor and truck data can be used to determine the actual presence of an object.
  • Clutter Threshold Application
  • The first step is the application of the clutter thresholds to the entire spectrum and the elimination of the colored clutter. If a particular frequency bin exceeds the threshold it will be stored for later processing as a detection candidate. If a certain segment of frequency bins produces an excessive number of detections the thresholds will be raised in that region and the strongest detections will be reported.
  • Detection
• The second step is to detect the threshold crossings (see equation 5). For a detection to be declared, a threshold crossing must have occurred for one radar module. This step will output no more than the two strongest candidate detections.
  • Pre-Tracker
• The final step in the Multi-Object Detector is to eliminate all but the 15 best detections. This algorithm is the first stage of a tracker. The detections will be sorted by closing velocity, SNR and radar module of origin. The objects that will most likely reach the truck first will be given highest priority. If over 15 detections exist the pre-tracker will then sort the detections further. The pre-tracker will compare the detection to objects already being tracked from previous sensor cycles. Those detections closest to existing tracks will receive priority within the original sort. These detections will be “associated” with the existing tracks (see the pre-tracker of the Signal Data Processing Functions section). More than one detection can associate with a track. Up to 15 detections will be passed on. These detections will be the objects that will reach the truck first and have been around the longest. In all cases the shortest time to impact will be given priority. Longevity will only be used to sort detections that have closing speeds within 10 mph and SNRs within 20 dB of each other.
  • The Detection Processing module will output detections/velocities, associated signal strengths, pulse timing data, clutter estimates and clutter distribution.
  • Data Fusion
  • The Data Fusion algorithm will be designed to take inputs from an N-out-of-M tracker. This algorithm does not require any specific set of sensors and adapts as sensors are added, using a lookup table of the new sensor parameters and an indication of the number and type of sensors added. It can also take into account any data from the host vehicle supplied by various sensors. The absence of data will not cause a problem for the algorithm; however, the more data available, the better the performance. The purpose of the Data Fusion Algorithm is to reduce all of the tracks and detections down to a small set of object tracks representing the objects surrounding the truck. Each radar module and sensor set may detect the same object. It is the task of the Data Fusion Algorithm to sort this out. The algorithm uses a technique called Deepest Hole to associate the data from multiple sensors and Kinematics Combination to fuse this data together.
  • The Data Fusion functions are shown in FIG. 33. The purpose of the Data Fusion Algorithm is to reduce all of the tracks and detections down to a small set of object tracks representing the objects surrounding the truck. Each radar module and sensor set (FLD, PD, and RGD) may detect the same object. It is the task of the Data Fusion Algorithm to sort all of this out. This algorithm is described below.
  • Deepest Hole
  • The Deepest Hole function “associates” tracks from the sensor sets with existing Fused Tracks. It is assumed that multiple sensor sets may find the same object and that multiple radar modules within a sensor set will often see and report on the same object. The Deepest Hole function will resolve these redundant tracks into a set of fused tracks, one per object. The output of this function is a list of track links linking the tracks from multiple radar modules together for each object.
  • The purpose of this function is to match new sensor data with current tracks (multi-sensor track or MST). Matched sets of MST and sensor data are found by operating on the agreement matrix with a heuristic search algorithm. The agreement matrix contains the normalized distances (referred to as “standard differences”) between shared state variables calculated for every possible combination of sensor and MST track. The “deepest hole” search algorithm finds the set of matches between rows and columns of the agreement matrix to minimize the sum of matrix elements found at the intersection of matched rows and columns.
  • The standard differences are calculated for every possible combination of MST and sensor track. An agreement matrix is built which contains MST tracks as the first index (rows) and sensor tracks as the second index (columns). The standard difference for each MST/sensor pair is put into the appropriate cross-index position.
  • The standard difference is the sum of the squares of the differences in shared state variables normalized by the sum of the state variances and the number of variables shared by the two tracks.
    • NOTE: Only the kinematic states NED and Rdot are queried for use in the standard difference calculation; NED is North, East and Down given by the sensor reporting the detection and the pointing angle of the sensor. Down is set to zero.

      DIFF = (1/N) Σi=1..N [Xmst(i) − Xsen(i)]² / [Vmst(i) + Vsen(i)]   (7)
      where:
  • N=number of shared state variables
  • Xmst=vector of MST track state variables shared with sensor
  • Xsen=vector of sensor track state variables shared with MST
  • Vmst=vector of MST track state variances corresponding to Xmst
  • Vsen=vector of sensor track state variances corresponding to Xsen
  • DIFF=standard difference
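For illustration, equation (7) can be translated directly into code (an assumed straightforward rendering; the state vectors are given as plain Python lists):

```python
def standard_difference(x_mst, x_sen, v_mst, v_sen):
    """Equation (7): mean of squared differences of shared state
    variables, each normalized by the sum of the state variances."""
    n = len(x_mst)
    return sum((xm - xs) ** 2 / (vm + vs)
               for xm, xs, vm, vs in zip(x_mst, x_sen, v_mst, v_sen)) / n
```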
  • The most probable matches between new sensor data and current MST tracks are found by searching through the agreement matrix. A simple “deepest hole” heuristic algorithm, which closely reproduces the results of exhaustive search algorithms, is used. “Deepest hole” finds the set of matches between rows (MST tracks) and columns (sensor tracks) in the agreement matrix which minimize the sum of the standard differences residing at the intersection of matched rows and columns. Matches are not allowed for matrix elements (standard differences) greater than a user defined limit.
  • The steps in Deepest Hole are as follows:
    • 1) If any matrix element is greater than MAX_DIFF, multiply this element by MAX_DIFF and place it back into the agreement matrix.
    • 2) If there are more rows than columns in the agreement matrix, transpose the matrix. Note that the matrix needs to be transposed again when processing is completed.
    • 3) Set up linked lists containing all unmatched rows and columns.
    • 4) If there is only one row in the list of rows, loop through the linked list of columns to find the minimum value, and match this row and column. If there are no more rows to process, all possible matches have been made.
    • 5) Loop through the linked list of rows. For each row
      • a) Loop through the linked list of columns to find the minimum and next minimum values for this row.
      • b) Calculate the difference between minimum and next minimum values in this row. Compare it with the largest value found so far in this loop, and save the larger one.
    • 6) Remove the row with the largest difference, found in step 5b, from the linked list of unmatched rows. Match this row to the column in which its minimum value was found. Remove that column from the linked list of unmatched columns.
    • 7) Return to step 4
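The numbered steps above can be sketched as follows (an illustrative implementation, not code from the patent; step 1's MAX_DIFF penalty and step 2's transpose are assumed to have been applied to the matrix beforehand, so rows ≤ columns):

```python
def deepest_hole(matrix):
    """Greedy 'deepest hole' assignment over an agreement matrix with
    rows <= columns: repeatedly commit the row whose gap between its
    minimum and next-minimum element is largest (steps 3-7 above)."""
    rows = list(range(len(matrix)))
    cols = list(range(len(matrix[0])))
    matches = {}
    while rows:
        if len(rows) == 1:                        # step 4: last row left
            r = rows.pop()
            matches[r] = min(cols, key=lambda j: matrix[r][j])
            break
        best_row, best_gap, best_col = None, -1.0, None
        for r in rows:                            # step 5: find widest gap
            lo, nxt = sorted(matrix[r][j] for j in cols)[:2]
            c = min(cols, key=lambda j: matrix[r][j])
            if nxt - lo > best_gap:
                best_row, best_gap, best_col = r, nxt - lo, c
        matches[best_row] = best_col              # step 6: commit the match
        rows.remove(best_row)
        cols.remove(best_col)
    return matches
```

Run on the worked example that follows (sensor detections as rows, MST tracks as columns, 0-based indices), it reproduces the matches sensor 2→MST 1, sensor 1→MST 2, sensor 3→MST 3.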
    EXAMPLE Deepest Hole
  • Suppose the following agreement matrix was generated with MST (rows) and sensor (columns) tracks. All possible standard differences between the MST and sensor data are calculated and placed into the matrix. For this example it will be assumed that all standard differences fall below the limiting value (step 1). Since this matrix contains more rows than columns, the matrix is transposed (step 2).
      Before transpose:

                         Sensor Detections
                          1     2     3
      MST Tracks     1   1.7   2.0   2.7
                     2   1.8   2.5   5.4
                     3   3.1   2.7   1.5
                     4   2.3   6.2   1.9

      After transpose:

                                MST Tracks
                                 1     2     3     4
      Sensor Detections     1   1.7   1.8   3.1   2.3
                            2   2.0   2.5   2.7   6.2
                            3   2.7   5.4   1.5   1.9
  • The difference between the minimum and next-minimum values for each row is then calculated (step 5).

      (next min) − (min):   Row 1: 0.1    Row 2: 0.5    Row 3: 0.4
  • The largest value found (i.e., the deepest hole) is 0.5, corresponding to Row 2 or Sensor Detection 2. This row is then examined to find the corresponding column (MST track) with the minimum standard difference. The minimum is 2.0, corresponding to column 1. The smallest value in row 2 (2.0) indicates that MST track 1 is the closest to sensor detection 2; MST track 2 is the next closest, with a standard difference of 2.5. The larger the distance between the standard differences, the more likely it is that the actual match is the minimum value found (2.0 in this case). That is why the most probable match is determined from the largest distance between standard differences. Therefore, it is concluded that MST track 1 and sensor detection 2 are a probable match. Row 2 and column 1 are now removed from the matrix (step 6) and the entire procedure is repeated on the reduced matrix (step 7). The reduced matrix is:
                                MST Tracks
                                 2     3     4
      Sensor Detections     1   1.8   3.1   2.3
                            3   5.4   1.5   1.9
  • The difference between the minimum and next-minimum values for each row of the reduced agreement matrix is then calculated (step 5).

      (next min) − (min):   Row 1: 0.5    Row 2: 0.4
  • The “deepest hole” is found to be 0.5 corresponding to row 1 or sensor track 1. The minimum standard difference for this row is 1.8 corresponding to MST track 2, the next probable match. This row and column are now removed.
                                MST Tracks
                                 3     4
      Sensor Detections     3   1.5   1.9
  • Now we need only find the most probable match between sensor detection 3 and the remaining MST tracks. The minimum standard difference is 1.5; hence, the last match pairs sensor detection 3 with MST track 3. Note that MST track 4 remained unpaired with any new sensor data. The status of this track would be evaluated and potentially changed. A summary of the resultant matches found in this example is given below.
    MATCHED TRACKS
    MST Sensor
    1 2
    2 1
    3 3

    Kinematics Combination
  • The track data from the tracks which are linked will be merged in this function. The speeds from each track will be averaged together. The SNRs will be averaged using a weighted average considering radar module antenna gain (the FLDs and potentially one RGD will have 15 to 20 dB more gain than the other radar modules). The range estimate for the new merged track will be handled by the Range Estimator function. The ID will be merged using the Probability of Correct ID and the ID Confidence.
  • The kinematics merge process consists of multiple passes, one pass for each sensor being processed on a given cycle. The algorithm acts as a sensor track combiner and does not provide additional filtering to the sensor data. Given that only radar sensors with differing beamwidths are being considered, the merge process behaves as follows.
  • Let X(k) represent a state vector at cycle k. For brevity, vector notation will be used. Thus X could be any of the vectors (N Ndot)T, (E Edot)T, (D Ddot)T, (R Rdot)T where the superscript T indicates transposition. To merge the sensor data the following equation is used:
    XM(k) = XM(1)(k) + W(k)·[XS(E)(k) − XM(1)(k)]   (8)
    where:
  • XM = the merged MST state vector
  • XM(1) = the MST state vector
  • W = weight vector
  • XS(E) = sensor state vector extrapolated to current MST time
  • The weight vector W(k) is computed from the relation:
    W(k) = PM(E)(k)·[PM(E)(k) + PS(E)(k)]⁻¹   (9)
    where
  • PM(E) = extrapolated covariance matrix of the MST track
  • PS(E) = extrapolated covariance matrix of the sensor detection
  • Note that the sensor data affects the MST track in inverse proportion with the size of its errors.
  • The fusion process generates an MST track with lower variances. The fused covariance matrix is given by:
    PM(F)(k) = [I − W(k)]·PM(E)(k)   (10)
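As a hedged illustration, equations (8) through (10) reduce, in the scalar (single state variable) case, to a variance-weighted blend:

```python
def merge_state(x_m, p_m, x_s, p_s):
    """Scalar form of equations (8)-(10): blend an extrapolated sensor
    state into the MST state, weighted by the relative variances."""
    w = p_m / (p_m + p_s)        # eq. (9): weight from the two variances
    x_f = x_m + w * (x_s - x_m)  # eq. (8): merged state
    p_f = (1.0 - w) * p_m        # eq. (10): fused (smaller) variance
    return x_f, p_f
```

Note how a noisier sensor (larger p_s) moves the merged state less, matching the remark that sensor data affects the MST track in inverse proportion to the size of its errors.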
    Range Estimation
  • Range Estimation consists of three steps. These steps are designed to achieve maximum range resolution without using time measurement as a tool. The estimator works using the principle behind radar wave propagation: the received signal strength falls off with range to the fourth power. That is, the radar signal drops in strength by range squared on the way to the object, and again by range squared on the return to the receiver. Thus, when range goes from 300 feet to 100 feet there is an increase in received power of 81 times. This change in power can be measured, and it is greater than the changes due to object size or object perceived size (angle dependent). The range to an object can be estimated by following the curve of the received power over time. This is why tracks are formed in previous functions: the tracks give a time history of the received signal which will be used in the range estimate.
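The fourth-power relationship is easy to check numerically (a trivial sketch):

```python
def power_ratio(r_start, r_end):
    """Two-way radar return scales as 1/R**4, so the received power
    grows by (r_start / r_end)**4 as the object closes."""
    return (r_start / r_end) ** 4
```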
  • Multi-Hypothesis Automatic Difference Ranging (MADR)
  • To obtain range to the track an algorithm called MADR will be used. The first step in MADR is to apply the SNR history to the radar range curve fit program. An algorithm dubbed “Automatic Ranging” will be used to establish this first range estimate. MADR will estimate the starting range of a track based on the SNR history. This starting range will be added to distance traveled (a negative number for a closing object) and a current range estimate will be computed. The MADR algorithm is discussed in detail below.
  • A range estimate will be calculated from the signal strength versus time and closing rate of each tracked object. Due to the properties of radar, as an object changes its relative position to the host vehicle, the signal strength will vary, by range to the fourth power and by a scattering property called scintillation. Combining this signal strength property with the distance traveled by the object will yield the starting range and thus the current range to the object. The distance traveled by the object is computed in the sensors by combining time since the data collection and tracking started, with the individual measured closing rates versus time. Using multiple hypotheses the signal strengths versus time will be inserted into an algorithm which matches the hypothetical signal strength curve to a (range and distance traveled)/(range to the fourth) curve. One hypothesis, a set of points drawn through the returned signal levels over time, will correspond to the correct starting range for the object given the measured distance traveled. This hypothesis will provide the best statistical match to a one over range to the fourth curve and will be the range estimate provided. FIG. 34 shows an example of multiple hypotheses through amplitude versus time data.
  • The multiple hypotheses will be sent into the Automatic Ranging algorithm. Automatic Ranging is a technique first used by the Navy in the late 1970's to passively estimate range to a target. This application is substantially different, but it can still use the same principles.
  • Automatic Ranging (AR) is a technique that was originally developed to passively determine the range and closing rate of an unknown emitter for a fire control radar. Using only a receiver, AR was able to determine the range and closing rate after the emitter's signal strength had changed approximately 1 dB (about an 11% change in range). Its primary application was in determining range to noise jammers whose purpose was to deny the range information required for missile firing equations. In order to do this, AR made two major assumptions: 1) during the time that AR was ranging, the emitter's speed was constant, and 2) the emitter's signal strength did not vary appreciably. The basis of AR's ranging was the fact that the emitter's signal strength varies as range squared (Range²). This non-linearity is exploited in AR's methodology to determine where the emitter is. The key to AR is its implementation, which uses both known signal strength relationships and the computational power of the digital computer. In general, AR's ranging technique can be applied to any number of problems where a measurable parameter varies in some non-linear manner while other measurable or assumed parameters are linear. AR's basic implementation concept and its application to the anti-collision warning system are described in the following paragraphs.
  • The key to AR is converting the non-linear terms of the problem to linear terms and then using the power of the computer to find the correct answer. Converting the radar range equation (either one-way R² or two-way R⁴) to a linear equation merely requires the use of logs (decibels) so that the equation becomes a series of linear operations. In simplified form, the radar equation can be written:
    SdB = (Some Constant)dB − 2·RangedB   (or 4·RangedB for two-way)
    That is the simple part. The hard part is that we have one equation with two unknowns (the constant and the Range). However, that's where the digital computer comes in. What we do know, is that as range changes the constant remains constant and the signal strength increases or decreases (depending on whether you are closing or opening). We also know (assume) that the range change per unit of time is constant. However, this doesn't help because by making the equation linear (taking the log of everything) we can't use the idea of delta range directly. What we need is the actual range. The solution is to assume an initial range and a speed. If we assume the correct initial range and speed, then the equation will remain linear over time (as range changes). The assumed initial range allows us to solve for the “constant” and use that in subsequent calculations. If we assume the wrong range or speed or both, then the equation will become progressively more incorrect over time. This is what AR does. Calculations for a series of initial ranges and a series of speeds are done. By using a linear regression to do a curve fit, only four terms are required to be saved for each range/speed combination. The linear regression allows calculation of the “slope” of the curve fit line. Since we've made the equation linear, over time the correct combination will have a “slope” of 1 and all other combinations will have slopes greater than or less than 1 (as the curve fits become progressively worse).
  • The application of AR to the anti-collision system requires only a one-dimensional solution, since closing rate is known and only the initial range is calculated, using a series of assumed initial ranges. As in all uses of AR, the accuracy depends on a variety of parameters. These parameters include: signal strength measurement accuracy, signal strength variations (due to scintillation, aspect changes, and system non-linearities), the amount of range change, the number of signal strength measurements, and the granularity of the initial guesses which are calculated (every 5 feet, every 10 feet, etc.). It should be noted that while AR solves for initial conditions, calculation of the current position is straightforward, since the time and speed from the initial position are known.
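Under the stated assumptions (known, constant closing rate), a minimal one-dimensional AR sketch might look like the following. Function and parameter names are illustrative, and the "slope of 1" regression test is recast here as a least-squares fit of the predicted dB history against the measured one:

```python
import math

def auto_range(snr_db, dt, closing_speed, candidate_ranges):
    """For each hypothesized initial range, predict the dB history that
    two-way 1/R**4 propagation implies (40*log10 of the range ratio),
    and keep the hypothesis whose prediction best fits the measured
    SNR history."""
    best_r0, best_err = None, float("inf")
    for r0 in candidate_ranges:
        err = 0.0
        for k, measured in enumerate(snr_db):
            r = r0 - closing_speed * dt * k               # assumed range at sample k
            predicted = snr_db[0] - 40.0 * math.log10(r / r0)
            err += (measured - predicted) ** 2
        if err < best_err:
            best_r0, best_err = r0, err
    return best_r0
```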
  • Constant Cross Section Ranging (CCSR)
  • The second method is to assume a constant radar cross section for an object. The RCS will be derived from a look-up table and the track ID. The SNR time history curve will be smoothed. Using the estimated RCS and the measured speed, an estimate of range will be determined. Assuming a constant K for the losses and gains in the radar sensor, the range is given by:
    Rj = [K·RCS²/Sj]^(1/4)   (11)
  • Where
      • Rj is the CCSR range estimate
      • RCS is the guess for radar cross section
      • Sj is the jth track smoothed signal
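As an illustrative sketch, equation (11) can be evaluated directly (the cross-section term is taken squared, as printed; conventional radar-equation forms use the cross section to the first power):

```python
def ccsr_range(k_const, rcs, s_j):
    """Equation (11) as printed: range from an assumed radar cross
    section and the smoothed track signal, with K lumping the radar
    sensor's gains and losses."""
    return (k_const * rcs ** 2 / s_j) ** 0.25
```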
        Range Estimate Resolution
  • The final step will be to resolve the two range estimates. The resolution will be dependent on the history of range estimates for the subject track, the ID of the track, the quality of the SNR history (noise on the SNR curve) and the quality of the track ID.

      α = σ(Rar(t)) / max[σ(Rar(t))]
      β = 1 − α
      Rj = α·Rarj + β·Rccsj
  • Where RCS(t) is the series of Radar Cross Section estimates versus sensor cycles
      • σ is the standard deviation
      • Rar is the range estimate using automatic ranging versus sensor cycle
      • Rccs is the Constant Cross Section range estimate.
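The final blend of the two estimators can be sketched as follows. Because the printed form of α is garbled in the source, its exact computation is left as an upstream input rather than guessed at here:

```python
def resolve_range(r_ar, r_ccs, alpha):
    """Weighted blend of the two range estimators:
    R_j = alpha * R_ar + beta * R_ccs, with beta = 1 - alpha.
    alpha in [0, 1] is derived upstream from the spread of the
    Automatic Ranging history."""
    beta = 1.0 - alpha
    return alpha * r_ar + beta * r_ccs
```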
        Situation Report Generator
  • The final stage of the processing is the Situation Report Generator. This algorithm is the interface to the Display Driver. The output from this algorithm will depend on the mode of the detectors and the detectors installed on the truck. This algorithm will output detected objects in the truck's path as well as objects immediately adjacent to the truck. The design goal for the false alarm rate for reporting to the driver will be less than one per day.
  • This algorithm receives trackfiles from the Data Fusion algorithm and range estimates from the Range Estimator. This data is compared to the reporting criteria established by a lookup table in the CM. The lookup table will be mode and RM/sensor system (FLD, PD, RGD) dependent. Depending on the RM(s) reporting and updating the Fused Trackfile, the lookup table will determine whether the track should be formatted and reported. This lookup table will be created and updated by ATI. The format for the table is shown below:
                                         Alarm Threshold    Warning Threshold
      Range FLD                          <3 feet            <10 feet
      Range front PD                     <3 feet            <6 feet
      Range rear PD/RGD (fwd gear)                          <6 feet
      Range rear PD/RGD (rev gear)       <.5 feet           <2 feet
      Range side PD/RGD (turning)        <12 feet           <24 feet
      Time to impact FLD                 <3 seconds         <6 seconds
      Time to impact rear PD/RGD         <3 seconds         <6 seconds
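The reporting-criteria lookup could be encoded along these lines (hypothetical key names and data structure; the threshold values are taken from the table above, units in feet or seconds):

```python
# Illustrative subset of the CM lookup table; not the patent's format.
THRESHOLDS = {
    "range_fld":          {"alarm": 3.0, "warning": 10.0},
    "range_front_pd":     {"alarm": 3.0, "warning": 6.0},
    "range_rear_rev":     {"alarm": 0.5, "warning": 2.0},
    "time_to_impact_fld": {"alarm": 3.0, "warning": 6.0},
}

def classify(measure, value):
    """Return 'alarm', 'warning', or None for a measured range/time."""
    limits = THRESHOLDS[measure]
    if value < limits["alarm"]:
        return "alarm"
    if value < limits["warning"]:
        return "warning"
    return None
```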
  • When the vehicle is in reverse, the data from the rear of the truck will be used to provide the transverse angle to an object behind the vehicle. If the object is a point source such as a pole the transverse position will be stable and resolvable into eight increments. Wide objects such as a loading dock cannot be located in the transverse direction. The data fusion algorithm will output the data necessary to provide the transverse location.
  • Built In Test
  • The Built In Test (BIT) function will be performed in the CM on all of the system's components. The BIT software will be designed to exercise the sensors such that a known preset performance can be measured. If a fault occurs, the faulty component will be recycled and retested. If the fault persists, a permanent record of the condition and the associated component will be stored in flash memory in the CM and the condition routed to the display processor by the BIT software. The other CM functions will always assume a fully functional system unless BIT informs it of a faulty component. This fault detection will be at a 50-millisecond rate for the FLD and a 333-millisecond rate for the PD and RGD.
  • Each RM has a distinct clutter response from the surface and an identifying code in its digitized signal being fed to the CM. BIT initiate will cause the BIT software to poll each RM for the FLD, PD and RGD. If an individual RM is faulty, the BIT software will identify the faulty RM through a comparison of the clutter return to the expected clutter return. If the faulty RM is in the FLD, BIT will inform the Object Data Processing module that specific RM is no longer functional. The Object Data Processing module will then revert to a degraded mode. If the faulty RM is in the PD or RGD, BIT will inform the Detection Processing module, which will revert to a degraded mode.
  • If BIT detects a fault in each of the RM responses connected to a specific interface board or if no response is received from a specific interface board, then BIT will assume that interface board has failed. The failure of the FLD Interface Board will result in complete loss of the FLD capability and this will be reported to the Object Data Processing module. The failure of a PD or RGD will be reported to the Detection Processing module, which will revert to a degraded mode.
  • BIT will inform the Display Processor of all failures and their severity, so that the operator is aware of the system status.
  • Master Clear
  • The CM initiates the Master Clear function. On receipt of the Master Clear discrete, the FLD, PD and RGD will reinitialize all functions. All signal processing will cease and be restarted. The A/D and FFT functions will continue to operate. A watchdog timer set to 1 second will be used to detect a reset condition in the RMs. Upon receiving a time-out (no serial request from the CM in the last second) the microcontroller will be reset. All message formatting will stop and any existing but unsent messages will be cleared.
  • Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that any arrangement which is calculated to achieve the same purpose may be substituted for the specific embodiments shown. This application is intended to cover any adaptations or variations of the present invention. Therefore, it is intended that this invention be limited only by the claims and the equivalents thereof.

Claims (30)

1. A system comprising:
a plurality of sensors configured to provide a plurality of signals based on energy reflected by an object;
a processor communicatively coupled to the plurality of sensors and configured to generate data corresponding to a transverse location of the object based on an algorithm; and
an operator interface device coupled to the processor and configured to render human perceivable information based on the data.
2. The system of claim 1 wherein the processor is configured to execute a data fusion algorithm.
3. The system of claim 1 wherein the processor is configured to execute a deepest hole algorithm.
4. The system of claim 1 wherein the processor is configured to execute an automatic ranging algorithm.
5. The system of claim 1 wherein the processor is configured to generate tracking data.
6. The system of claim 1 wherein the processor is configured to execute a triangulation algorithm.
7. The system of claim 1 further including a vehicle and wherein the plurality of sensors are disposed about a surface of the vehicle.
8. The system of claim 7 wherein the vehicle includes at least one of an automobile, a truck, a bus, a recreational vehicle and an off-road vehicle.
9. The system of claim 7 wherein the plurality of sensors are responsive to reflected energy originating from within a 360 degree view of the vehicle.
10. The system of claim 7 wherein the operator interface is disposed within a cab of the vehicle.
11. The system of claim 1 wherein the processor is communicatively coupled to at least one of the plurality of sensors by a wireless link.
12. The system of claim 11 wherein the wireless link includes a radio frequency link.
13. The system of claim 12 wherein the processor includes a radio frequency transceiver modem.
14. The system of claim 1 wherein the operator interface and processor are disposed in a first housing and at least one sensor of the plurality of sensors is disposed in a second housing.
15. The system of claim 14 wherein the first housing is coupled to a vehicle power supply.
16. The system of claim 14 wherein the second housing includes a discrete battery.
17. A method comprising:
receiving a plurality of signals based on energy reflected by an object and received at a plurality of sensors, the plurality of sensors disposed about a surface of a vehicle;
processing the plurality of signals to generate positional data corresponding to the object; and
rendering a human perceivable output based on the positional data.
18. The method of claim 17 wherein processing includes manipulating frequency domain data.
19. The method of claim 17 wherein processing includes executing a clutter reduction algorithm.
20. The method of claim 19 wherein executing the clutter reduction algorithm includes reducing at least one of rain clutter, snow clutter, object clutter and course clutter.
21. The method of claim 17 wherein processing includes executing a multi-object detection algorithm.
22. The method of claim 17 wherein processing includes executing a tracker algorithm.
23. The method of claim 17 wherein processing includes executing a data fusion algorithm.
24. The method of claim 17 wherein processing includes executing a deepest hole algorithm.
25. The method of claim 17 wherein processing includes generating a range estimate.
26. The method of claim 17 wherein rendering the human perceivable output includes generating the output as a function of a vehicle height and vehicle speed.
27. A control module comprising:
a processor configured to execute instructions stored in a memory, the instructions configured to generate collision avoidance data for a vehicle;
an interface coupled to the processor and configured to communicate with a plurality of sensors disposed on a surface of the vehicle; and
an output port coupled to the processor, the output port configured to provide a signal for a display based on the collision avoidance data; and
further wherein the instructions are configured to determine a transverse location of an object relative to the vehicle.
28. The control module of claim 27 wherein the processor includes a track report generator.
29. The control module of claim 27 wherein the processor is configured to execute a data fusion algorithm.
30. The control module of claim 27 wherein the processor is configured to execute a built in test.
US11/297,273 1998-08-06 2005-12-08 System and method of avoiding collisions Abandoned US20060119473A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/297,273 US20060119473A1 (en) 1998-08-06 2005-12-08 System and method of avoiding collisions

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US09/130,279 US6268803B1 (en) 1998-08-06 1998-08-06 System and method of avoiding collisions
US58724400A 2000-06-02 2000-06-02
US10/794,794 US20050073433A1 (en) 1998-08-06 2004-03-04 Precision measuring collision avoidance system
US11/297,273 US20060119473A1 (en) 1998-08-06 2005-12-08 System and method of avoiding collisions

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/794,794 Continuation US20050073433A1 (en) 1998-08-06 2004-03-04 Precision measuring collision avoidance system

Publications (1)

Publication Number Publication Date
US20060119473A1 true US20060119473A1 (en) 2006-06-08

Family

ID=22443934

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/130,279 Expired - Lifetime US6268803B1 (en) 1998-08-06 1998-08-06 System and method of avoiding collisions
US10/794,794 Abandoned US20050073433A1 (en) 1998-08-06 2004-03-04 Precision measuring collision avoidance system
US11/297,273 Abandoned US20060119473A1 (en) 1998-08-06 2005-12-08 System and method of avoiding collisions

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US09/130,279 Expired - Lifetime US6268803B1 (en) 1998-08-06 1998-08-06 System and method of avoiding collisions
US10/794,794 Abandoned US20050073433A1 (en) 1998-08-06 2004-03-04 Precision measuring collision avoidance system

Country Status (1)

Country Link
US (3) US6268803B1 (en)

Cited By (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040189448A1 (en) * 2003-03-24 2004-09-30 Helmuth Eggers Video display for a vehicle environment surveillance unit
US20040210364A1 (en) * 2003-04-17 2004-10-21 Fuji Jukogyo Kabushiki Kaisha Vehicle drive assist system
US20050192715A1 (en) * 2003-08-19 2005-09-01 Ho-Kyung Kim Back warning system for vehicle
US20050231339A1 (en) * 2004-02-17 2005-10-20 Fuji Jukogyo Kabushiki Kaisha Outside-vehicle monitoring system
US20070018801A1 (en) * 2005-07-25 2007-01-25 Novotny Steven J Digital voice/visual warning, alert, and status system for vehicles utilizing laser sensors
US20070257783A1 (en) * 2005-01-19 2007-11-08 Toyota Jidosha Kabushiki Kaisha Vehicle Warning Device
US20080030399A1 (en) * 2006-03-23 2008-02-07 Omron Corporation Radar device and radar method
US20080077327A1 (en) * 2006-09-26 2008-03-27 Harris Steven M Radar collison warning system for rooftop mounted cargo
FR2906372A1 (en) * 2006-09-21 2008-03-28 Derisys Sarl Industrial vehicle i.e. semi-trailer, driving assisting device, has control unit detecting discontinuous variation of distance measured by reverse sensors and transmitting sensorial signal alerting driver of vehicle
US20080211644A1 (en) * 2007-02-02 2008-09-04 Buckley Stephen J Dual mode vehicle blind spot system
US20090063053A1 (en) * 2007-09-04 2009-03-05 International Business Machines Corporation Method and system for blind spot identification and warning utilizing visual indicators
US20090128398A1 (en) * 2005-12-27 2009-05-21 Oliver Wieland Method of Calibrating a Sensor System
US20090146863A1 (en) * 2007-12-06 2009-06-11 Ralink Technology Corp. Radar detection method and apparatus using the same
WO2009080491A1 (en) * 2007-12-21 2009-07-02 Hella Kgaa Hueck & Co. Radar sensor arrangement
US20090174536A1 (en) * 2008-01-09 2009-07-09 Rao Manoharprasad K Accident avoidance during vehicle backup
US20090188322A1 (en) * 2007-12-27 2009-07-30 Scott Taillet Sound Measuring Device
NL1035766C2 (en) * 2008-07-29 2009-08-12 Melchior Frederik Leipoldt Sensors for e.g. truck, placed on sides of vehicle and activated or deactivated by actuation of vehicle in specific direction, where signals from sensors are delivered to person or object
US20090259399A1 (en) * 2008-04-15 2009-10-15 Caterpillar Inc. Obstacle detection method and system
EP2127999A2 (en) * 2008-05-30 2009-12-02 System Truck S.r.l. System for assisting a driver while manoeuvring a truck towards a loading or unloading bay
US20090326764A1 (en) * 2008-06-25 2009-12-31 Rao Manoharprasad K Ultrasonic sensor-based side impact sensing system
FR2933221A1 (en) * 2008-06-26 2010-01-01 Renault Sas Obstacle e.g. wall, detection system operating method for motor vehicle, involves processing data relative to obstacles susceptible to be at risk, in priority and controlling acquisition of data at frequency based on risk
US20100225521A1 (en) * 2009-03-05 2010-09-09 Honda Motor Co., Ltd. Object detecting apparatus for vehicle
US20110018737A1 (en) * 2009-07-24 2011-01-27 Automotive Research & Testing Center Vehicle Collision Avoidance System and Method
CN102099226A (en) * 2008-07-22 2011-06-15 罗伯特·博世有限公司 Method and controller for actuating personal protection means for a vehicle
US20110163868A1 (en) * 2008-09-25 2011-07-07 Binar Aktiebolag Warning system
US20110221584A1 (en) * 2008-09-19 2011-09-15 Continental Automotive Gmbh System for Recording Collisions
US20120025964A1 (en) * 2010-07-27 2012-02-02 Beggs Ryan P Methods and apparatus to detect and warn proximate entities of interest
WO2012013305A1 (en) * 2010-07-30 2012-02-02 Wabco Gmbh Monitoring system for monitoring the surrounding area, in particular the area behind motor vehicles
WO2012041414A1 (en) * 2010-10-02 2012-04-05 Wabco Gmbh Sensor mounting for a distance sensor
EP2455779A1 (en) * 2010-11-17 2012-05-23 Robert Bosch GmbH Ultrasound-based orientation detection of objects in the vicinity of a vehicle
US20120158243A1 (en) * 2010-12-21 2012-06-21 Anthony Pupin Vehicle camera system operable in off-road mode and method
US20120274503A1 (en) * 2011-04-29 2012-11-01 Searete Llc Network and personal electronic devices operatively coupled to micro-impulse radars
US20120290146A1 (en) * 2010-07-15 2012-11-15 Dedes George C GPS/IMU/Video/Radar absolute/relative positioning communication/computation sensor platform for automotive safety applications
US20130107044A1 (en) * 2011-10-26 2013-05-02 Anthony Azevedo Blind Spot Camera System
US20130229298A1 (en) * 2012-03-02 2013-09-05 The Mitre Corporation Threaded Track Method, System, and Computer Program Product
US20140022067A1 (en) * 2012-07-20 2014-01-23 Michael J. Dambra Scooter/wheelchair lift platform with back-up sensor and quick disconnect
US8810382B1 (en) * 2014-01-14 2014-08-19 Joseph N. Laurita Method and apparatus for warning vehicle of low overpass height
US8833815B2 (en) * 2012-10-23 2014-09-16 Ford Global Technologies, Llc Bumper integrated forward radar mounting system
US8884809B2 (en) 2011-04-29 2014-11-11 The Invention Science Fund I, Llc Personal electronic device providing enhanced user environmental awareness
US20140343836A1 (en) * 2013-05-17 2014-11-20 Dr. Ing. H.C.F. Porsche Aktiengesellschaft Method for operating a first-party vehicle
US9000973B2 (en) 2011-04-29 2015-04-07 The Invention Science Fund I, Llc Personal electronic device with a micro-impulse radar
US9103899B2 (en) 2011-04-29 2015-08-11 The Invention Science Fund I, Llc Adaptive control of a personal electronic device responsive to a micro-impulse radar
US20150247914A1 (en) * 2012-10-05 2015-09-03 FLARM Technology GmbH Method and device for estimating a distance
US20150274074A1 (en) * 2012-01-30 2015-10-01 Klear-View Camera, Llc System and method for providing front-oriented visual information to vehicle driver
US20150285906A1 (en) * 2012-10-04 2015-10-08 Technology Service Corporation Proximity sensor
US9177477B2 (en) 2010-07-19 2015-11-03 Honda Motor Co., Ltd. Collision warning system using driver intention estimator
US20160117841A1 (en) * 2014-10-22 2016-04-28 Denso Corporation Object detection apparatus
US20160116441A1 (en) * 2014-10-22 2016-04-28 Denso Corporation Object detection apparatus
US20160210861A1 (en) * 2015-01-16 2016-07-21 Texas Instruments Incorporated Integrated fault-tolerant augmented area viewing system
US20160291149A1 (en) * 2015-04-06 2016-10-06 GM Global Technology Operations LLC Fusion method for cross traffic application using radars and camera
US9685086B2 (en) * 2015-05-27 2017-06-20 Cisco Technology, Inc. Power conservation in traffic safety applications
US20170285165A1 (en) * 2014-09-25 2017-10-05 Audi Ag Method for operating a multiplicity of radar sensors in a motor vehicle and motor vehicle
CN107640090A (en) * 2016-07-22 2018-01-30 中兴通讯股份有限公司 Traffic safety control method and device
US10175355B2 (en) 2014-10-22 2019-01-08 Denso Corporation Object detection apparatus
DE102017216791A1 (en) * 2017-09-22 2019-05-02 Zf Friedrichshafen Ag Sensory detection of open spaces under land vehicles
US20190270405A1 (en) * 2016-11-18 2019-09-05 Panasonic Intellectual Property Management Co., Ltd. Notifying device, automatic driving vehicle, notifying method, program, non-transitory recording medium, and notifying system
US10436899B2 (en) 2014-10-22 2019-10-08 Denso Corporation Object detection apparatus
US10436900B2 (en) 2014-10-22 2019-10-08 Denso Corporation Object detection apparatus
US10451734B2 (en) 2014-10-22 2019-10-22 Denso Corporation Object detecting apparatus
US10453343B2 (en) 2014-10-22 2019-10-22 Denso Corporation Object detection apparatus
CN110599800A (en) * 2019-09-24 2019-12-20 江苏集萃智能传感技术研究所有限公司 Parking lot parking space state monitoring system and monitoring method
US20200031276A1 (en) * 2018-07-25 2020-01-30 Mando Corporation Rear-side alarm device and rear-side alarm method thereof
US10578736B2 (en) 2014-10-22 2020-03-03 Denso Corporation Object detection apparatus
EP2814532B2 (en) 2012-02-13 2020-04-15 Integrated Healing Technologies Multi-modal wound treatment apparatus
US10788570B2 (en) 2017-09-29 2020-09-29 The Boeing Company Radar system for mobile platform and method of use
DE102019205504A1 (en) * 2019-04-16 2020-10-22 Zf Friedrichshafen Ag Control device and method as well as computer program product
US11390209B2 (en) * 2020-03-18 2022-07-19 Grote Industries, Llc System and method for adaptive driving beam headlamp
US11760264B2 (en) 2012-01-30 2023-09-19 Klear-View Camera Llc System and method for providing front-oriented visual information to vehicle driver

Families Citing this family (267)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5910854A (en) 1993-02-26 1999-06-08 Donnelly Corporation Electrochromic polymeric solid films, manufacturing electrochromic devices using such solid films, and processes for making such solid films and devices
US5668663A (en) 1994-05-05 1997-09-16 Donnelly Corporation Electrochromic mirrors and devices
US6891563B2 (en) 1996-05-22 2005-05-10 Donnelly Corporation Vehicular vision system
US6172613B1 (en) 1998-02-18 2001-01-09 Donnelly Corporation Rearview mirror assembly incorporating vehicle information display
US6326613B1 (en) 1998-01-07 2001-12-04 Donnelly Corporation Vehicle interior mirror assembly adapted for containing a rain sensor
US8294975B2 (en) 1997-08-25 2012-10-23 Donnelly Corporation Automotive rearview mirror assembly
US6124886A (en) 1997-08-25 2000-09-26 Donnelly Corporation Modular rearview mirror assembly
US8288711B2 (en) 1998-01-07 2012-10-16 Donnelly Corporation Interior rearview mirror system with forwardly-viewing camera and a control
US6445287B1 (en) 2000-02-28 2002-09-03 Donnelly Corporation Tire inflation assistance monitoring system
US6693517B2 (en) 2000-04-21 2004-02-17 Donnelly Corporation Vehicle mirror assembly communicating wirelessly with vehicle accessories and occupants
US6477464B2 (en) 2000-03-09 2002-11-05 Donnelly Corporation Complete mirror-based global-positioning system (GPS) navigation solution
US6329925B1 (en) 1999-11-24 2001-12-11 Donnelly Corporation Rearview mirror assembly with added feature modular display
JP4114292B2 (en) * 1998-12-03 2008-07-09 アイシン・エィ・ダブリュ株式会社 Driving support device
US6690413B1 (en) * 1999-04-21 2004-02-10 Michael S. Moore Tractor-trailer viewing system
US6894608B1 (en) 1999-07-22 2005-05-17 Altra Technologies Incorporated System and method for warning of potential collisions
US6642839B1 (en) * 2000-02-16 2003-11-04 Altra Technologies Incorporated System and method of providing scalable sensor systems based on stand alone sensor modules
US7167796B2 (en) 2000-03-09 2007-01-23 Donnelly Corporation Vehicle navigation system for use with a telematics system
US7370983B2 (en) 2000-03-02 2008-05-13 Donnelly Corporation Interior mirror assembly with display
WO2001064481A2 (en) 2000-03-02 2001-09-07 Donnelly Corporation Video mirror systems incorporating an accessory module
US7855755B2 (en) 2005-11-01 2010-12-21 Donnelly Corporation Interior rearview mirror assembly with display
US6657581B1 (en) 2000-08-16 2003-12-02 Raytheon Company Automotive lane changing aid indicator
US6577269B2 (en) 2000-08-16 2003-06-10 Raytheon Company Radar detection method and apparatus
US6748312B2 (en) 2000-08-16 2004-06-08 Raytheon Company Safe distance algorithm for adaptive cruise control
DE60107692T2 (en) * 2000-08-16 2005-12-15 Raytheon Company, Waltham SYSTEM FOR RECORDING NEARBY OBJECTS
AU2001291299A1 (en) 2000-09-08 2002-03-22 Raytheon Company Path prediction system and method
US7565230B2 (en) * 2000-10-14 2009-07-21 Temic Automotive Of North America, Inc. Method and apparatus for improving vehicle operator performance
US6925425B2 (en) * 2000-10-14 2005-08-02 Motorola, Inc. Method and apparatus for vehicle operator performance assessment and improvement
US20020151297A1 (en) * 2000-10-14 2002-10-17 Donald Remboski Context aware wireless communication device and method
US6909947B2 (en) 2000-10-14 2005-06-21 Motorola, Inc. System and method for driver performance improvement
US7255451B2 (en) 2002-09-20 2007-08-14 Donnelly Corporation Electro-optic mirror cell
DE60220379T2 (en) 2001-01-23 2008-01-24 Donnelly Corp., Holland IMPROVED VEHICLE LIGHTING SYSTEM
US7581859B2 (en) 2005-09-14 2009-09-01 Donnelly Corp. Display device for exterior rearview mirror
DE10110042A1 (en) * 2001-03-02 2002-10-10 Bosch Gmbh Robert Control / evaluation system for a sensor network
US6708100B2 (en) 2001-03-14 2004-03-16 Raytheon Company Safe distance algorithm for adaptive cruise control
DE20106977U1 (en) * 2001-04-23 2002-08-29 Mekra Lang Gmbh & Co Kg Warning device in motor vehicles
JP3759429B2 (en) * 2001-05-23 2006-03-22 株式会社東芝 Obstacle detection apparatus and method
SE519803C2 (en) * 2001-08-06 2003-04-08 Ericsson Telefon Ab L M Method and apparatus for analyzing sensor system performance
DE10153987B4 (en) * 2001-11-06 2018-05-30 Daimler Ag Information system in a vehicle
KR100981267B1 (en) * 2001-12-14 2010-09-10 레이티언 캄파니 Back-up aid indicator
US20030111902A1 (en) * 2001-12-17 2003-06-19 David Thiede Intelligent braking system and method
US6933837B2 (en) * 2002-01-25 2005-08-23 Altra Technologies Incorporated Trailer based collision warning system and method
US6812833B2 (en) * 2002-04-12 2004-11-02 Lear Corporation Turn signal assembly with tactile feedback
US6914521B2 (en) * 2002-04-12 2005-07-05 Lear Corporation Visual display for vehicle
US8718919B2 (en) * 2002-04-23 2014-05-06 Robert Bosch Gmbh Method and apparatus for lane recognition for a vehicle
US6918674B2 (en) 2002-05-03 2005-07-19 Donnelly Corporation Vehicle rearview mirror system
US7329013B2 (en) 2002-06-06 2008-02-12 Donnelly Corporation Interior rearview mirror system with compass
EP1514246A4 (en) 2002-06-06 2008-04-16 Donnelly Corp Interior rearview mirror system with compass
DE10230303A1 (en) * 2002-07-05 2004-01-15 Valeo Schalter Und Sensoren Gmbh Vehicle surroundings sensing system, method and control device for this purpose
US6611227B1 (en) 2002-08-08 2003-08-26 Raytheon Company Automotive side object detection sensor blockage detection system and related techniques
US7113098B1 (en) 2002-08-29 2006-09-26 Melvin Hayes Animal accident reduction systems, methods, and apparatuses
WO2004026633A2 (en) 2002-09-20 2004-04-01 Donnelly Corporation Mirror reflective element assembly
US7310177B2 (en) 2002-09-20 2007-12-18 Donnelly Corporation Electro-optic reflective element assembly
WO2004103772A2 (en) 2003-05-19 2004-12-02 Donnelly Corporation Mirror assembly for vehicle
FR2845331A1 (en) * 2002-10-04 2004-04-09 Erman Sarl Cabinet System for improving the driving security of an automotive vehicle, uses a system of video cameras installed at the front and right side of the vehicle
US7106213B2 (en) * 2002-10-28 2006-09-12 General Motors Corporation Distance detection and display system for use in a vehicle
US6987707B2 (en) * 2002-11-12 2006-01-17 General Dynamics Advanced Information Systems, Inc. Method and system for in-air ultrasonic acoustical detection and characterization
US20040183661A1 (en) * 2002-12-18 2004-09-23 Bowman Timothy D. Overhead obstacle detector for vehicles carrying roof top articles
KR20050014051A (en) * 2003-07-29 2005-02-07 안희태 Distance Measuring Method and Device by Frequency Separation with Ultrasonic
US6950733B2 (en) * 2003-08-06 2005-09-27 Ford Global Technologies, Llc Method of controlling an external object sensor for an automotive vehicle
US7446924B2 (en) 2003-10-02 2008-11-04 Donnelly Corporation Mirror reflective element assembly including electronic component
US7308341B2 (en) 2003-10-14 2007-12-11 Donnelly Corporation Vehicle communication system
JP4449409B2 (en) * 2003-10-27 2010-04-14 日産自動車株式会社 Vehicle occupant protection device
US20050128060A1 (en) * 2003-11-19 2005-06-16 Mark Rennick Universally usable object detection system and method
US7239958B2 (en) * 2003-12-18 2007-07-03 General Motors Corporation Apparatus and method for discerning a driver's intent and for aiding the driver
US20050184859A1 (en) * 2004-02-19 2005-08-25 Shih-Hsiung Li Ultrasonic detector installable on a truck trailer
EP1571040A3 (en) * 2004-03-04 2006-07-26 Parking Angel Ltd Proximity detection system for a vehicle
US20050253693A1 (en) * 2004-05-11 2005-11-17 Mark Rennick Object detection system, apparatus, and method
JP4461920B2 (en) * 2004-06-23 2010-05-12 株式会社デンソー Parking assistance device
JP4346521B2 (en) * 2004-07-28 2009-10-21 株式会社デンソー Obstacle detection device
US20060028351A1 (en) * 2004-08-09 2006-02-09 Lewis James M Docking monitor
DE102004045974A1 (en) * 2004-09-22 2006-03-23 Mekra Lang Gmbh & Co. Kg System for transmitting signals in a motor vehicle
US7324013B2 (en) * 2004-11-02 2008-01-29 Preco Electronics, Inc. Safety alarm system
JP4189858B2 (en) * 2004-11-16 2008-12-03 株式会社ホンダアクセス Obstacle detection device
DE102005018487B4 (en) * 2005-04-21 2008-09-25 Daimler Ag Method for operating a monitoring and alarm device in parked vehicles and monitoring and alarm device
DE102005019550A1 (en) * 2005-04-28 2006-11-09 Daimlerchrysler Ag A distance detection system for a towing vehicle and method for operating a distance detection system
KR100781135B1 (en) * 2005-04-29 2007-11-30 반병철 Vehicle communication device and control method thereof
DE102005021225A1 (en) * 2005-05-09 2006-11-16 Robert Bosch Gmbh Method and device for detecting the surface condition of objects of road traffic or persons
US7626749B2 (en) 2005-05-16 2009-12-01 Donnelly Corporation Vehicle mirror assembly with indicia at reflective element
US20070052703A1 (en) * 2005-09-06 2007-03-08 Denso Corporation Display device
US7864032B2 (en) * 2005-10-06 2011-01-04 Fuji Jukogyo Kabushiki Kaisha Collision determination device and vehicle behavior control device
US7496439B2 (en) * 2005-10-17 2009-02-24 Lang Mekra North America, Llc Multifunction exterior display for a vehicle mirror
US7688187B2 (en) * 2005-11-07 2010-03-30 Caird Andrew J Early detection system and method for exterior vehicle cargo
US7598845B2 (en) * 2005-11-09 2009-10-06 Chrysler Group Llc Towing load detection system
DE602007000509D1 (en) 2006-01-31 2009-03-19 Mekra Lang Gmbh & Co Kg Rear view mirror for vehicles with an electronic display for displaying objects in a monitored area for collision avoidance
CN101401024B (en) 2006-03-09 2016-03-16 Gentex Corporation Vehicle rearview assembly comprising a high intensity display
US7876258B2 (en) * 2006-03-13 2011-01-25 The Boeing Company Aircraft collision sense and avoidance system and method
US9014871B2 (en) * 2006-03-22 2015-04-21 Eaton Corporation Method and system for associating a vehicle trailer to a vehicle
DE102006018075A1 (en) * 2006-04-11 2007-10-18 Valeo Schalter Und Sensoren Gmbh A method for monitoring at least part of a vehicle environment of a vehicle and system therefor
US7567167B2 (en) * 2006-04-24 2009-07-28 Reverse Control, Inc. Wireless signal apparatus for assisting drivers to back large vehicles
US20080042865A1 (en) * 2006-08-09 2008-02-21 Dock Watch, Llc Loading dock monitoring device and method
US9207673B2 (en) * 2008-12-04 2015-12-08 Crown Equipment Corporation Finger-mounted apparatus for remotely controlling a materials handling vehicle
US9122276B2 (en) 2006-09-14 2015-09-01 Crown Equipment Corporation Wearable wireless remote control device for use with a materials handling vehicle
US8452464B2 (en) * 2009-08-18 2013-05-28 Crown Equipment Corporation Steer correction for a remotely operated materials handling vehicle
US9645968B2 (en) * 2006-09-14 2017-05-09 Crown Equipment Corporation Multiple zone sensing for materials handling vehicles
KR101425424B1 (en) * 2006-09-14 2014-08-01 Crown Equipment Corporation Associating a transmitter and a receiver in a supplemental remote control system for materials handling vehicles
US8970363B2 (en) * 2006-09-14 2015-03-03 Crown Equipment Corporation Wrist/arm/hand mounted device for remotely controlling a materials handling vehicle
DE102006047634A1 (en) * 2006-10-09 2008-04-10 Robert Bosch Gmbh Method for detecting an environment of a vehicle
US8311730B2 (en) * 2006-11-29 2012-11-13 Neff Ryan A Vehicle position determination system
US8532862B2 (en) * 2006-11-29 2013-09-10 Ryan A. Neff Driverless vehicle
US20080319688A1 (en) * 2007-02-26 2008-12-25 Hyeung-Yun Kim Usage monitoring system of gas tank
GB2447672B (en) 2007-03-21 2011-12-14 Ford Global Tech Llc Vehicle manoeuvring aids
US8040226B2 (en) * 2007-04-02 2011-10-18 Datachassi Dc Ab Vehicle surveillance and communication system
SE531043C2 (en) * 2007-04-02 2008-11-25 Datachassi Dc Ab Method of monitoring vehicles
WO2008134815A1 (en) * 2007-05-04 2008-11-13 Teledyne Australia Pty Ltd. Collision avoidance system and method
US8013720B2 (en) 2007-11-02 2011-09-06 Reverse Control, Inc. Signal apparatus for facilitating safe backup of vehicles
DE102007053989A1 (en) * 2007-11-13 2009-05-14 Wabco Gmbh Method and arrangement for warning against obstacles with insufficient headroom and / or insufficient passage width
US8996294B2 (en) * 2007-12-19 2015-03-31 Nissan Motor Co., Ltd. Inter-vehicle distance maintenance supporting system and method
US8154418B2 (en) 2008-03-31 2012-04-10 Magna Mirrors Of America, Inc. Interior rearview mirror system
EP2260322A1 (en) * 2008-03-31 2010-12-15 Valeo Radar Systems, Inc. Automotive radar sensor blockage detection apparatus and method
JP4678611B2 (en) * 2008-06-05 2011-04-27 トヨタ自動車株式会社 Obstacle detection device and obstacle detection system
US8212660B2 (en) * 2008-06-23 2012-07-03 Frank Nugent Overhead obstacle avoidance system
US9487144B2 (en) 2008-10-16 2016-11-08 Magna Mirrors Of America, Inc. Interior mirror assembly with display
US9522817B2 (en) 2008-12-04 2016-12-20 Crown Equipment Corporation Sensor configuration for a materials handling vehicle
TWI339627B (en) * 2008-12-30 2011-04-01 Ind Tech Res Inst System and method for detecting surrounding environment
US8654197B2 (en) * 2009-03-04 2014-02-18 Raytheon Company System and method for occupancy detection
DE102009002277A1 (en) * 2009-04-08 2010-10-14 Robert Bosch Gmbh Driver assistance system for warning against obstacles
US8207836B2 (en) 2009-06-23 2012-06-26 Frank Nugent Overhead obstacle avoidance system
DE102009028451A1 (en) * 2009-08-11 2011-02-17 Robert Bosch Gmbh Collision monitoring for a motor vehicle
US8577551B2 (en) 2009-08-18 2013-11-05 Crown Equipment Corporation Steer control maneuvers for materials handling vehicles
AU2009351340B2 (en) * 2009-08-18 2015-06-18 Crown Equipment Corporation Steer correction for a remotely operated materials handling vehicle
US8731777B2 (en) 2009-08-18 2014-05-20 Crown Equipment Corporation Object tracking and steer maneuvers for materials handling vehicles
FR2949567B1 (en) * 2009-09-01 2012-02-24 Thales Sa MULTI-TARGET DATA PROCESSING FOR MULTI-RECEIVER PASSIVE RADARS IN SFN OR MFN MODE
JP5789911B2 (en) * 2009-10-06 2015-10-07 株式会社ジェイテクト Rotation angle detection device and electric power steering device
GB2476060A (en) * 2009-12-09 2011-06-15 Luke Malpass Trailer proximity sensing system
DE102009060169A1 (en) * 2009-12-23 2011-06-30 Volkswagen AG, 38440 Automatic forward parking in head parking spaces
US8354920B2 (en) * 2010-01-22 2013-01-15 John Kole Vehicle overhead clearance detection system and method of operation
US20110211507A1 (en) * 2010-02-23 2011-09-01 Automated Media Services, Inc. System and method for communicating data to electronic displays positioned in a retail establishment
DE102010030466B4 (en) * 2010-06-24 2021-05-20 Robert Bosch Gmbh Procedure for warning a driver of a collision
DE102010041424A1 (en) * 2010-09-27 2012-03-29 Robert Bosch Gmbh Method for detecting an environment of a vehicle
US8403402B1 (en) 2010-10-17 2013-03-26 Mario Placido Portela Magnetic band
US9824600B1 (en) 2010-11-28 2017-11-21 Mario Placido Portela Electromagnetic band and photoelectric cell safety device
JP2012144157A (en) * 2011-01-12 2012-08-02 Toyota Motor Corp Travel support apparatus
EP2484567B1 (en) * 2011-02-08 2017-12-27 Volvo Car Corporation An onboard perception system
US8902054B2 (en) 2011-02-10 2014-12-02 Sitting Man, Llc Methods, systems, and computer program products for managing operation of a portable electronic device
US8773251B2 (en) 2011-02-10 2014-07-08 Sitting Man, Llc Methods, systems, and computer program products for managing operation of an automotive vehicle
US8666603B2 (en) 2011-02-11 2014-03-04 Sitting Man, Llc Methods, systems, and computer program products for providing steering-control feedback to an operator of an automotive vehicle
DE102011011048B9 (en) * 2011-02-11 2021-10-07 Mekra Lang Gmbh & Co. Kg Monitoring of the close range around a commercial vehicle
US9854209B2 (en) 2011-04-19 2017-12-26 Ford Global Technologies, Llc Display system utilizing vehicle and trailer dynamics
US9346396B2 (en) 2011-04-19 2016-05-24 Ford Global Technologies, Llc Supplemental vehicle lighting system for vision based target detection
US9723274B2 (en) 2011-04-19 2017-08-01 Ford Global Technologies, Llc System and method for adjusting an image capture setting
US9248858B2 (en) 2011-04-19 2016-02-02 Ford Global Technologies Trailer backup assist system
US9555832B2 (en) 2011-04-19 2017-01-31 Ford Global Technologies, Llc Display system utilizing vehicle and trailer dynamics
US9374562B2 (en) 2011-04-19 2016-06-21 Ford Global Technologies, Llc System and method for calculating a horizontal camera to target distance
US9164955B2 (en) 2013-02-04 2015-10-20 Ford Global Technologies Trailer active back-up assist with object avoidance
US8825328B2 (en) 2011-04-19 2014-09-02 Ford Global Technologies Detection of and counter-measures for jackknife enabling conditions during trailer backup assist
US9500497B2 (en) 2011-04-19 2016-11-22 Ford Global Technologies, Llc System and method of inputting an intended backing path
US9506774B2 (en) 2011-04-19 2016-11-29 Ford Global Technologies, Llc Method of inputting a path for a vehicle and trailer
US9290204B2 (en) 2011-04-19 2016-03-22 Ford Global Technologies, Llc Hitch angle monitoring system and method
US9969428B2 (en) 2011-04-19 2018-05-15 Ford Global Technologies, Llc Trailer backup assist system with waypoint selection
US9926008B2 (en) 2011-04-19 2018-03-27 Ford Global Technologies, Llc Trailer backup assist system with waypoint selection
US9683848B2 (en) 2011-04-19 2017-06-20 Ford Global Technologies, Llc System for determining hitch angle
JP5722127B2 (en) * 2011-06-07 2015-05-20 株式会社小松製作所 Work vehicle perimeter monitoring device
US8878660B2 (en) 2011-06-28 2014-11-04 Nissan North America, Inc. Vehicle meter cluster
WO2013057849A1 (en) * 2011-10-21 2013-04-25 Panasonic Corporation Wireless communication system, wireless communication center module, wireless communication method, and method and program for relaying wireless communication
JP5757900B2 (en) * 2012-03-07 2015-08-05 日立オートモティブシステムズ株式会社 Vehicle travel control device
JP5667594B2 (en) * 2012-03-15 2015-02-12 株式会社小松製作所 Dump truck with obstacle detection mechanism and obstacle detection method thereof
GB2492435A (en) * 2012-03-29 2013-01-02 Peter Le Masurier Collision warning system which displays a live video image to a driver when a vulnerable vehicle is detected
US8879139B2 (en) 2012-04-24 2014-11-04 Gentex Corporation Display mirror assembly
US8649952B2 (en) 2012-06-13 2014-02-11 Ford Global Technologies, Llc Control of a backing vehicle
US20140010050A1 (en) * 2012-07-03 2014-01-09 Brian DeAngelo Distance detection alarm system
US9989637B2 (en) 2012-08-03 2018-06-05 Safie Holdings LLC Portable collision warning apparatus
CA2880902C (en) * 2012-08-03 2020-07-28 Charles Rashid Portable collision warning apparatus
US8954241B2 (en) * 2012-08-10 2015-02-10 Caterpillar Inc. Mining truck spotting under a shovel
DE102012215350A1 (en) * 2012-08-29 2014-03-06 Continental Automotive Gmbh Multi-sensory attention control
JP5550695B2 (en) * 2012-09-21 2014-07-16 株式会社小松製作所 Work vehicle periphery monitoring system and work vehicle
JP5411976B1 (en) * 2012-09-21 2014-02-12 株式会社小松製作所 Work vehicle periphery monitoring system and work vehicle
US10410071B2 (en) * 2012-12-05 2019-09-10 Florida Institute For Human And Machine Cognition, Inc. User display providing obstacle avoidance
US9415754B2 (en) * 2012-12-05 2016-08-16 Florida Institute For Human And Machine Cognition, Inc. User display providing obstacle avoidance
US9511799B2 (en) 2013-02-04 2016-12-06 Ford Global Technologies, Llc Object avoidance for a trailer backup assist system
US9592851B2 (en) 2013-02-04 2017-03-14 Ford Global Technologies, Llc Control modes for a trailer backup assist system
TWI488764B (en) * 2013-03-15 2015-06-21 Ind Tech Res Inst Vehicle driving assistant system for generating vehicle driving information
KR101881346B1 (en) 2013-03-15 2018-07-24 젠텍스 코포레이션 Display mirror assembly
DE102013210729A1 (en) * 2013-06-10 2014-12-11 Robert Bosch Gmbh Method and device for signaling a visually at least partially hidden traffic object for a driver of a vehicle
DE102013010993A1 (en) * 2013-07-02 2015-01-08 Brose Fahrzeugteile Gmbh & Co. Kommanditgesellschaft, Hallstadt Object detection device for a vehicle
DE102013214383A1 (en) * 2013-07-23 2015-01-29 Robert Bosch Gmbh Method and device for providing a collision signal with regard to a vehicle collision, method and device for managing collision data regarding vehicle collisions, and method and device for controlling at least one collision protection device of a vehicle
US10169821B2 (en) 2013-09-20 2019-01-01 Elwha Llc Systems and methods for insurance based upon status of vehicle software
US9424607B2 (en) * 2013-09-20 2016-08-23 Elwha Llc Systems and methods for insurance based upon status of vehicle software
KR101766635B1 (en) 2013-09-24 2017-08-09 젠텍스 코포레이션 Display mirror assembly
US20150097660A1 (en) * 2013-10-09 2015-04-09 Joel Adell Blind view sensor assembly
US9352777B2 (en) 2013-10-31 2016-05-31 Ford Global Technologies, Llc Methods and systems for configuring of a trailer maneuvering system
US10065562B2 (en) 2013-12-31 2018-09-04 International Business Machines Corporation Vehicle collision avoidance
WO2015116915A1 (en) 2014-01-31 2015-08-06 Gentex Corporation Backlighting assembly for display for reducing cross-hatching
US9233710B2 (en) 2014-03-06 2016-01-12 Ford Global Technologies, Llc Trailer backup assist system using gesture commands and method
US9562773B2 (en) 2014-03-15 2017-02-07 Aurora Flight Sciences Corporation Autonomous vehicle navigation system and method
EP3119643B1 (en) 2014-03-21 2018-05-23 Gentex Corporation Tri-modal display mirror assembly
US9834146B2 (en) 2014-04-01 2017-12-05 Gentex Corporation Automatic display mirror assembly
US9342747B2 (en) * 2014-04-14 2016-05-17 Bendix Commercial Vehicle Systems Llc Vehicle driver assistance apparatus for assisting a vehicle driver in maneuvering the vehicle relative to an object
US9875661B2 (en) 2014-05-10 2018-01-23 Aurora Flight Sciences Corporation Dynamic collision-avoidance system and method
JP6408832B2 (en) * 2014-08-27 2018-10-17 ルネサスエレクトロニクス株式会社 Control system, relay device, and control method
JP6393123B2 (en) * 2014-09-04 2018-09-19 日立建機株式会社 Obstacle detection system and transport vehicle
WO2016035215A1 (en) * 2014-09-05 2016-03-10 横浜ゴム株式会社 Collision avoidance system and collision avoidance method
US10399495B1 (en) * 2014-09-05 2019-09-03 United Services Automobile Association (Usaa) Systems and methods for indicating proximity conditions for a vehicle
US9694751B2 (en) 2014-09-19 2017-07-04 Gentex Corporation Rearview assembly
JP6505839B2 (en) 2014-11-07 2019-04-24 ジェンテックス コーポレイション Full screen display mirror actuator
JP6367486B2 (en) 2014-11-13 2018-08-01 ジェンテックス コーポレイション Rearview mirror system with display device
KR101997815B1 (en) 2014-12-03 2019-07-08 젠텍스 코포레이션 Display mirror assembly
US9522677B2 (en) 2014-12-05 2016-12-20 Ford Global Technologies, Llc Mitigation of input device failure and mode management
US9533683B2 (en) 2014-12-05 2017-01-03 Ford Global Technologies, Llc Sensor failure mitigation system and mode management
USD746744S1 (en) 2014-12-05 2016-01-05 Gentex Corporation Rearview device
US9744907B2 (en) 2014-12-29 2017-08-29 Gentex Corporation Vehicle vision system having adjustable displayed field of view
US9607242B2 (en) 2015-01-16 2017-03-28 Ford Global Technologies, Llc Target monitoring system with lens cleaning device
US9667875B2 (en) * 2015-01-21 2017-05-30 Caterpillar Inc. Vision system and method of monitoring surroundings of machine
US9720278B2 (en) 2015-01-22 2017-08-01 Gentex Corporation Low cost optical film stack
US9827905B1 (en) * 2015-02-05 2017-11-28 Tiffany Nicole Jones Real-time traffic monitoring systems and methods
US9766336B2 (en) * 2015-03-16 2017-09-19 Here Global B.V. Vehicle obstruction detection
US9910151B2 (en) * 2015-03-19 2018-03-06 Delphi Technologies, Inc. Radar object detection system
EP3286038A4 (en) 2015-04-20 2018-04-25 Gentex Corporation Rearview assembly with applique
WO2016187215A1 (en) 2015-05-18 2016-11-24 Gentex Corporation Full display rearview device
US10444337B2 (en) 2015-05-22 2019-10-15 Witricity Corporation Methods and apparatus utilizing time division access of multiple radar modules in living object detection for wireless power transfer applications
WO2016209877A1 (en) 2015-06-22 2016-12-29 Gentex Corporation System and method for processing streamed video images to correct for flicker of amplitude-modulated lights
JP2017044599A (en) * 2015-08-27 2017-03-02 ルネサスエレクトロニクス株式会社 Control system
US9896130B2 (en) 2015-09-11 2018-02-20 Ford Global Technologies, Llc Guidance system for a vehicle reversing a trailer along an intended backing path
US9836060B2 (en) 2015-10-28 2017-12-05 Ford Global Technologies, Llc Trailer backup assist system with target management
EP3368375B1 (en) 2015-10-30 2020-03-04 Gentex Corporation Rearview device
USD797627S1 (en) 2015-10-30 2017-09-19 Gentex Corporation Rearview mirror device
USD798207S1 (en) 2015-10-30 2017-09-26 Gentex Corporation Rearview mirror assembly
CN108349435B (en) 2015-10-30 2021-06-15 金泰克斯公司 Switching board
USD800618S1 (en) 2015-11-02 2017-10-24 Gentex Corporation Toggle paddle for a rear view device
JP6757738B2 (en) * 2015-11-10 2020-09-23 古河電気工業株式会社 Monitoring device and monitoring method
US9610975B1 (en) 2015-12-17 2017-04-04 Ford Global Technologies, Llc Hitch angle detection for trailer backup assist system
US9964642B2 (en) * 2016-02-04 2018-05-08 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicle with system for detecting arrival at cross road and automatically displaying side-front camera image
USD845851S1 (en) 2016-03-31 2019-04-16 Gentex Corporation Rearview device
USD817238S1 (en) 2016-04-29 2018-05-08 Gentex Corporation Rearview device
US10112646B2 (en) 2016-05-05 2018-10-30 Ford Global Technologies, Llc Turn recovery human machine interface for trailer backup assist
US10025138B2 (en) 2016-06-06 2018-07-17 Gentex Corporation Illuminating display with light gathering structure
US11243301B2 (en) 2016-07-12 2022-02-08 Braze Mobility Inc. System, device and method for mobile device environment sensing and user feedback
CA3039675C (en) * 2016-10-07 2022-08-30 Phillips Connect Technologies Llc Smart trailer system
USD809984S1 (en) 2016-12-07 2018-02-13 Gentex Corporation Rearview assembly
USD854473S1 (en) 2016-12-16 2019-07-23 Gentex Corporation Rearview assembly
CN106515728A (en) * 2016-12-22 2017-03-22 深圳市招科智控科技有限公司 System and method for avoiding collision and obstacle for a driverless bus
JP2020505802A (en) 2016-12-30 2020-02-20 ジェンテックス コーポレイション Full screen mirror with on-demand spotter view
AU2018205223B9 (en) 2017-01-06 2023-11-09 Aurora Flight Sciences Corporation Collision-avoidance system and method for unmanned aircraft
US10914401B2 (en) 2017-01-10 2021-02-09 The Heil Co. Fuel monitoring system
US20210264382A1 (en) 2017-03-03 2021-08-26 State Farm Mutual Automobile Insurance Company Systems and methods for updating a loss history blockchain
US10444341B2 (en) * 2017-03-06 2019-10-15 GM Global Technology Operations LLC Road clutter mitigation
EP3595931A1 (en) 2017-03-17 2020-01-22 Gentex Corporation Dual display reverse camera system
US10691135B2 (en) 2017-09-22 2020-06-23 Waymo Llc Detecting motion of an autonomous vehicle using radar technology
US10890919B2 (en) 2017-09-22 2021-01-12 Waymo Llc Calculating velocity of an autonomous vehicle using radar technology
US10393873B2 (en) * 2017-10-02 2019-08-27 Ford Global Technologies, Llc Adaptive mitigation of ultrasonic emission in vehicular object detection systems
JP7039940B2 (en) * 2017-11-09 2022-03-23 トヨタ自動車株式会社 Vehicle control unit
CN109839631B (en) * 2017-11-27 2023-09-19 松下知识产权经营株式会社 Radar apparatus
US10928511B2 (en) 2017-12-07 2021-02-23 Ford Global Technologies, Llc Synchronous short range radars for automatic trailer detection
TWI684021B (en) * 2018-04-10 2020-02-01 為升電裝工業股份有限公司 School bus radar system
CA3097517C (en) * 2018-04-25 2023-04-18 Waymo Llc Underbody radar units
US11119212B2 (en) 2018-08-10 2021-09-14 Aurora Flight Sciences Corporation System and method to reduce DVE effect on lidar return
US11207974B2 (en) 2018-09-21 2021-12-28 The Heil Co. Multiple gas tank assembly with individual pressure monitoring
US11037453B2 (en) 2018-10-12 2021-06-15 Aurora Flight Sciences Corporation Adaptive sense and avoid system
TWI686747B (en) * 2018-11-29 2020-03-01 財團法人金屬工業研究發展中心 Method for avoiding obstacles of mobile vehicles all week
US11641121B2 (en) 2019-02-01 2023-05-02 Crown Equipment Corporation On-board charging station for a remote control device
CA3226839A1 (en) 2019-02-01 2020-08-06 Crown Equipment Corporation On-board charging station for a remote control device
US20210018629A1 (en) * 2019-07-18 2021-01-21 Jacob Kohn Angle Measurement System For Automotive Collision Avoidance Sensors
US11180148B2 (en) * 2019-09-03 2021-11-23 Ford Global Technologies, Llc Detection and response to confined trailer in system-assisted hitch operation
DE102019215393A1 (en) * 2019-10-08 2021-04-08 Robert Bosch Gmbh Method and device for classifying an object, in particular in the vicinity of a motor vehicle
JP7205701B2 (en) * 2019-10-11 2023-01-17 トヨタ自動車株式会社 vehicle alarm
CN112712717B (en) * 2019-10-26 2022-09-23 华为技术有限公司 Information fusion method, device and equipment
JP2021120624A (en) * 2020-01-30 2021-08-19 いすゞ自動車株式会社 Detection device and detection position calculation device
US20210318420A1 (en) * 2020-04-10 2021-10-14 Caterpillar Paving Products Inc. Ultrasonic sensors for work machine obstacle detection
US11543522B2 (en) * 2020-04-10 2023-01-03 Caterpillar Paving Products Inc. Ultrasonic sensors for work machine obstacle detection
AU2021325685B2 (en) 2020-08-11 2024-04-04 Crown Equipment Corporation Remote control device
US11760281B2 (en) 2020-11-17 2023-09-19 Ford Global Technologies, Llc Battery-powered vehicle sensors
US20220230655A1 (en) * 2021-01-15 2022-07-21 Continental Automotive Systems Inc. Method and device for detecting characteristic frequencies with a sensor
US11614513B2 (en) * 2021-03-12 2023-03-28 Ford Global Technologies, Llc Battery-powered vehicle sensors
US11912235B2 (en) 2021-03-12 2024-02-27 Ford Global Technologies, Llc Vehicle object detection
US11916420B2 (en) 2021-03-12 2024-02-27 Ford Global Technologies, Llc Vehicle sensor operation
US20220366793A1 (en) * 2021-05-14 2022-11-17 Heds Up Safety Inc. Vehicle proximity sensor and alert system
EP4163672A1 (en) * 2021-10-06 2023-04-12 KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH Reversing assistance device and method for assisting a vehicle during reversing operation
DE102021130337B3 (en) * 2021-11-19 2023-02-02 Webasto SE Sensor module for attachment to a surface component of a motor vehicle and surface component with such a sensor module

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4403220A (en) * 1980-02-05 1983-09-06 Donovan John S Radar system for collision avoidance
US4674073A (en) * 1985-02-28 1987-06-16 Aisin Seiki Kabushiki Kaisha Reflective object detecting apparatus
US4694295A (en) * 1986-05-15 1987-09-15 Miller Brett A Vehicle blind spot detector
US5303205A (en) * 1990-02-26 1994-04-12 Trend Tec Inc. Vehicular distance measuring system with integral mirror display
US5734336A (en) * 1995-05-01 1998-03-31 Collision Avoidance Systems, Inc. Collision avoidance system

Family Cites Families (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3707708A (en) 1970-12-16 1972-12-26 Multra Guard Inc Muting circuit for a security alarm system providing a sonic alert
US3797309A (en) 1972-07-31 1974-03-19 J Tec Ass Inc Method and apparatus for sensing the relative direction and velocity of movement of a body in a liquid or gas medium
US3978481A (en) 1974-06-17 1976-08-31 Merlin A. Pierson Anti-collision vehicular radar system
USRE31509E (en) 1974-06-26 1984-01-24 Echo location systems
US3891966A (en) * 1974-08-08 1975-06-24 Zoltan G Sztankay Automobile collision avoidance laser system
US4204096A (en) 1974-12-02 1980-05-20 Barcus Lester M Sonic transducer mounting
US4056761A (en) 1975-09-11 1977-11-01 Quintron, Inc. Sonic transducer and drive circuit
JPS5259429A (en) 1975-11-10 1977-05-16 Nissan Motor Co Ltd Apparatus for preventing collision of vehicles
JPS5269131A (en) 1975-12-02 1977-06-08 Nissan Motor Co Ltd Collision preventing and warning apparatus
DE2623643C2 (en) 1976-05-26 1986-11-20 Daimler-Benz Ag, 7000 Stuttgart Method for automatically regulating the safety distance between a vehicle and vehicles in front and a device for carrying out this method
JPS5316230A (en) 1976-07-28 1978-02-15 Nissan Motor Co Ltd Automotive collision preventive device
JPS6045377B2 (en) 1976-08-03 1985-10-09 日産自動車株式会社 Collision prevention device
US4125826A (en) 1977-01-05 1978-11-14 Rasmussen Fred M Ultrasonic vehicle alarm system
US4162488A (en) 1977-03-11 1979-07-24 Emergency Products Corporation Alarm system
US4308536A (en) 1979-02-26 1981-12-29 Collision Avoidance Systems Anti-collision vehicular radar system
US4379497A (en) * 1980-09-02 1983-04-12 Bell & Howell, Company Vehicle collision avoidance system
JPS58217012A (en) 1982-06-11 1983-12-16 Kubota Ltd Traveling vehicle with obstacle detecting sensor
JPS5977517A (en) 1982-10-27 1984-05-04 Kubota Ltd Running vehicle
US4489321A (en) 1983-05-05 1984-12-18 Deere & Company Radar ground speed sensing system
US4759064A (en) 1985-10-07 1988-07-19 Chaum David L Blind unanticipated signature systems
US4759063A (en) 1983-08-22 1988-07-19 Chaum David L Blind signature systems
US4580251A (en) 1983-11-09 1986-04-01 Honeywell Inc. Ultrasonic distance sensor
DE3413769C1 (en) 1984-04-12 1985-04-11 Daimler-Benz Ag, 7000 Stuttgart Vehicle alarm system with acoustic signal delivery via at least one radio speaker installed in the vehicle
US4679175A (en) 1984-12-13 1987-07-07 Honeywell Inc. Ultrasonic distance sensor with dual burst noise rejection
US4681431A (en) 1985-02-27 1987-07-21 Sineco, Inc. Optical ranging anti-collision technique and system
US4737788A (en) 1985-04-04 1988-04-12 Motorola, Inc. Helicopter obstacle detector
US4815046A (en) 1985-04-29 1989-03-21 Xecutek Corporation Ultrasonic sensor system
US4692764A (en) 1986-06-20 1987-09-08 Bonar George D Automatic range finder and remote controller braking system
US4823042A (en) 1986-07-18 1989-04-18 Rich-Mar Corporation Sonic transducer and method for making the same
US4953141A (en) 1987-08-28 1990-08-28 Recurrent Solutions Limited Partnership Sonic distance-measuring device
DE3730105A1 (en) 1987-09-08 1989-03-16 Pietzsch Ibp Gmbh METHOD AND DEVICE FOR SECURING A VEHICLE OR DEVICE MOVING IN SPACE
US5059946A (en) 1989-05-10 1991-10-22 Hollowbush Richard R Ultrasonic obstacle detector
US5029290A (en) 1990-02-02 1991-07-02 Communications Test Design, Inc. Emergency alert system
US5373482A (en) 1990-02-26 1994-12-13 Trend Tec Inc. Distance measuring system arranged to limit false indications of distance measurements
KR930002467B1 (en) 1990-03-28 1993-04-02 박병용 Device detecting something in a vehicle
US5354983A (en) 1990-04-10 1994-10-11 Auto-Sense, Limited Object detector utilizing a threshold detection distance and suppression means for detecting the presence of a motor vehicle
JPH0459449A (en) 1990-06-27 1992-02-26 A C Ii:Kk Automotive alarm device for warning of automobile ahead
JP2847426B2 (en) 1990-08-14 1999-01-20 アスコ株式会社 How to check the operation of the vehicle safety system
EP0519287B1 (en) 1991-06-07 1995-08-30 Honda Giken Kogyo Kabushiki Kaisha Collision preventing system for vehicle
JP3197307B2 (en) 1991-10-14 2001-08-13 マツダ株式会社 Travel control device for mobile vehicles
US5235316A (en) 1991-12-20 1993-08-10 Qualizza Gregory K Vehicle collision avoidance system
US5251188A (en) 1992-04-13 1993-10-05 Recurrent Solutions Limited Partnership Elongated-pattern sonic transducer
FR2690252B1 (en) 1992-04-17 1994-05-27 Thomson Csf METHOD AND SYSTEM FOR DETERMINING THE POSITION AND ORIENTATION OF A MOBILE, AND APPLICATIONS.
US5249163A (en) 1992-06-08 1993-09-28 Erickson Jon W Optical lever for acoustic and ultrasound sensor
DE4232435C1 (en) 1992-09-28 1993-11-25 Telefunken Microelectron Method for operating an alarm system for motor vehicles
US5714928A (en) * 1992-12-18 1998-02-03 Kabushiki Kaisha Komatsu Seisakusho System for preventing collision for vehicle
US5389912A (en) 1993-02-10 1995-02-14 Arvin; Parham P. Truck clearance anti-collision device
DE4303815A1 (en) 1993-02-10 1994-08-11 Bosch Gmbh Robert Reversing and coupling auxiliary device for motor vehicles
US5471215A (en) 1993-06-28 1995-11-28 Nissan Motor Co., Ltd. Radar apparatus
US5483501A (en) 1993-09-14 1996-01-09 The Whitaker Corporation Short distance ultrasonic distance meter
DE4333112A1 (en) 1993-09-29 1995-03-30 Bosch Gmbh Robert Method and device for parking a vehicle
JP2799375B2 (en) 1993-09-30 1998-09-17 本田技研工業株式会社 Anti-collision device
DE4410617A1 (en) 1994-03-26 1995-09-28 Reitter & Schefenacker Gmbh Distance monitoring device for reversing car
US5583162A (en) * 1994-06-06 1996-12-10 Biopore Corporation Polymeric microbeads and method of preparation
US5517197A (en) 1994-10-24 1996-05-14 Rockwell International Corporation Modular radar architecture film (FM/CW or pulse) for automobile collision avoidance applications
IL112981A (en) * 1995-03-13 1999-03-12 Gilon Shmuel Collision avoidance detector
US5767793A (en) * 1995-04-21 1998-06-16 Trw Inc. Compact vehicle based rear and side obstacle detection system including multiple antennae
US5714947A (en) * 1997-01-28 1998-02-03 Northrop Grumman Corporation Vehicle collision avoidance system

Cited By (117)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040189448A1 (en) * 2003-03-24 2004-09-30 Helmuth Eggers Video display for a vehicle environment surveillance unit
US20040210364A1 (en) * 2003-04-17 2004-10-21 Fuji Jukogyo Kabushiki Kaisha Vehicle drive assist system
US7302325B2 (en) * 2003-04-17 2007-11-27 Fuji Jukogyo Kabushiki Kaisha Vehicle drive assist system
US20050192715A1 (en) * 2003-08-19 2005-09-01 Ho-Kyung Kim Back warning system for vehicle
US20050231339A1 (en) * 2004-02-17 2005-10-20 Fuji Jukogyo Kabushiki Kaisha Outside-vehicle monitoring system
US7567687B2 (en) * 2004-02-17 2009-07-28 Fuji Jukogyo Kabushiki Kaisha Outside-vehicle monitoring system
US20070257783A1 (en) * 2005-01-19 2007-11-08 Toyota Jidosha Kabushiki Kaisha Vehicle Warning Device
US8169305B2 (en) * 2005-01-19 2012-05-01 Toyota Jidosha Kabushiki Kaisha Vehicle warning device
US20070018801A1 (en) * 2005-07-25 2007-01-25 Novotny Steven J Digital voice/visual warning, alert, and status system for vehicles utilizing laser sensors
US20090128398A1 (en) * 2005-12-27 2009-05-21 Oliver Wieland Method of Calibrating a Sensor System
US7548182B2 (en) * 2006-03-23 2009-06-16 Omron Corporation Radar device and radar method
US20080030399A1 (en) * 2006-03-23 2008-02-07 Omron Corporation Radar device and radar method
EP1909114A1 (en) * 2006-09-21 2008-04-09 Derisys Device for assistance in driving an industrial vehicle
FR2906372A1 (en) * 2006-09-21 2008-03-28 Derisys Sarl Industrial vehicle i.e. semi-trailer, driving assisting device, has control unit detecting discontinuous variation of distance measured by reverse sensors and transmitting sensorial signal alerting driver of vehicle
US20080077327A1 (en) * 2006-09-26 2008-03-27 Harris Steven M Radar collision warning system for rooftop mounted cargo
US7877209B2 (en) * 2006-09-26 2011-01-25 Harris Steven M Radar collision warning system for rooftop mounted cargo
US20080211644A1 (en) * 2007-02-02 2008-09-04 Buckley Stephen J Dual mode vehicle blind spot system
US7830243B2 (en) * 2007-02-02 2010-11-09 Chrysler Group Llc Dual mode vehicle blind spot system
US20090063053A1 (en) * 2007-09-04 2009-03-05 International Business Machines Corporation Method and system for blind spot identification and warning utilizing visual indicators
US8645001B2 (en) 2007-09-04 2014-02-04 International Business Machines Corporation Method and system for blind spot identification and warning utilizing visual indicators
US20090146863A1 (en) * 2007-12-06 2009-06-11 Ralink Technology Corp. Radar detection method and apparatus using the same
US7948427B2 (en) * 2007-12-06 2011-05-24 Ralink Technology Corp. Radar detection method and apparatus using the same
WO2009080491A1 (en) * 2007-12-21 2009-07-02 Hella Kgaa Hueck & Co. Radar sensor arrangement
US8555721B2 (en) 2007-12-27 2013-10-15 Scott Taillet Sound measuring device
US20090188322A1 (en) * 2007-12-27 2009-07-30 Scott Taillet Sound Measuring Device
US9389118B2 (en) 2007-12-27 2016-07-12 Scott Taillet Sound measuring device
US7772991B2 (en) 2008-01-09 2010-08-10 Ford Global Technologies, Llc Accident avoidance during vehicle backup
US20090174536A1 (en) * 2008-01-09 2009-07-09 Rao Manoharprasad K Accident avoidance during vehicle backup
US20090259399A1 (en) * 2008-04-15 2009-10-15 Caterpillar Inc. Obstacle detection method and system
EP2127999A2 (en) * 2008-05-30 2009-12-02 System Truck S.r.l. System for assisting a driver while manoeuvring a truck towards a loading or unloading bay
EP2127999A3 (en) * 2008-05-30 2011-05-25 System Truck S.r.l. System for assisting a driver while manoeuvring a truck towards a loading or unloading bay
US20090326764A1 (en) * 2008-06-25 2009-12-31 Rao Manoharprasad K Ultrasonic sensor-based side impact sensing system
US8014921B2 (en) * 2008-06-25 2011-09-06 Ford Global Technologies, Llc Ultrasonic sensor-based side impact sensing system
FR2933221A1 (en) * 2008-06-26 2010-01-01 Renault Sas Obstacle e.g. wall, detection system operating method for motor vehicle, involves processing data relative to obstacles susceptible to be at risk, in priority and controlling acquisition of data at frequency based on risk
US20110218710A1 (en) * 2008-07-22 2011-09-08 Robert Bosch Gmbh Method and control device for triggering passenger protection means for a vehicle
CN102099226A (en) * 2008-07-22 2011-06-15 罗伯特·博世有限公司 Method and controller for actuating personal protection means for a vehicle
NL1035766C2 (en) * 2008-07-29 2009-08-12 Melchior Frederik Leipoldt Sensors for e.g. truck, placed on sides of vehicle and activated or deactivated by actuation of vehicle in specific direction, where signals from sensors are delivered to person or object
US20110221584A1 (en) * 2008-09-19 2011-09-15 Continental Automotive Gmbh System for Recording Collisions
US8816841B2 (en) * 2008-09-25 2014-08-26 Binar Aktiebolag Warning system
US20110163868A1 (en) * 2008-09-25 2011-07-07 Binar Aktiebolag Warning system
US8115668B2 (en) * 2009-03-05 2012-02-14 Honda Motor Co., Ltd. Object detecting apparatus for vehicle
US20100225521A1 (en) * 2009-03-05 2010-09-09 Honda Motor Co., Ltd. Object detecting apparatus for vehicle
US20110018737A1 (en) * 2009-07-24 2011-01-27 Automotive Research & Testing Center Vehicle Collision Avoidance System and Method
US8154422B2 (en) * 2009-07-24 2012-04-10 Automotive Research & Testing Center Vehicle collision avoidance system and method
US9099003B2 (en) 2010-07-15 2015-08-04 George C. Dedes GNSS/IMU positioning, communication, and computation platforms for automotive safety applications
US20120290146A1 (en) * 2010-07-15 2012-11-15 Dedes George C GPS/IMU/Video/Radar absolute/relative positioning communication/computation sensor platform for automotive safety applications
US8639426B2 (en) * 2010-07-15 2014-01-28 George C Dedes GPS/IMU/video/radar absolute/relative positioning communication/computation sensor platform for automotive safety applications
US9177477B2 (en) 2010-07-19 2015-11-03 Honda Motor Co., Ltd. Collision warning system using driver intention estimator
US9672713B2 (en) 2010-07-27 2017-06-06 Rite-Hite Holding Corporation Methods and apparatus to detect and warn proximate entities of interest
US9542824B2 (en) 2010-07-27 2017-01-10 Rite-Hite Holding Corporation Methods and apparatus to detect and warn proximate entities of interest
US20120025964A1 (en) * 2010-07-27 2012-02-02 Beggs Ryan P Methods and apparatus to detect and warn proximate entities of interest
US9633537B2 (en) 2010-07-27 2017-04-25 Rite-Hite Holding Corporation Methods and apparatus to detect and warn proximate entities of interest
US9607496B2 (en) 2010-07-27 2017-03-28 Rite-Hite Holding Corporation Methods and apparatus to detect and warn proximate entities of interest
US9230419B2 (en) * 2010-07-27 2016-01-05 Rite-Hite Holding Corporation Methods and apparatus to detect and warn proximate entities of interest
US9547969B2 (en) 2017-01-17 Rite-Hite Holding Corporation Methods and apparatus to detect and warn proximate entities of interest
WO2012013305A1 (en) * 2010-07-30 2012-02-02 Wabco Gmbh Monitoring system for monitoring the surrounding area, in particular the area behind motor vehicles
EP2598375B1 (en) 2010-07-30 2016-03-23 WABCO GmbH Monitoring system for monitoring the surrounding area, in particular the area behind motor vehicles
CN102905939A (en) * 2010-07-30 2013-01-30 威伯科有限公司 Monitoring system for monitoring the surrounding area, in particular the area behind motor vehicles
US9135822B2 (en) 2010-07-30 2015-09-15 Wabco Gmbh Monitoring system for monitoring the surrounding area, in particular the area behind motor vehicles
US9194725B2 (en) 2010-10-02 2015-11-24 Wabco Gmbh Mounting for a distance sensor
WO2012041414A1 (en) * 2010-10-02 2012-04-05 Wabco Gmbh Sensor mounting for a distance sensor
EP2455779A1 (en) * 2010-11-17 2012-05-23 Robert Bosch GmbH Ultrasound-based orientation detection of objects in the vicinity of a vehicle
US20120158243A1 (en) * 2010-12-21 2012-06-21 Anthony Pupin Vehicle camera system operable in off-road mode and method
US8983717B2 (en) * 2010-12-21 2015-03-17 Ford Global Technologies, Llc Vehicle camera system operable in off-road mode and method
US9103899B2 (en) 2011-04-29 2015-08-11 The Invention Science Fund I, Llc Adaptive control of a personal electronic device responsive to a micro-impulse radar
US9164167B2 (en) 2011-04-29 2015-10-20 The Invention Science Fund I, Llc Personal electronic device with a micro-impulse radar
US20120274503A1 (en) * 2011-04-29 2012-11-01 Searete Llc Network and personal electronic devices operatively coupled to micro-impulse radars
US8884809B2 (en) 2011-04-29 2014-11-11 The Invention Science Fund I, Llc Personal electronic device providing enhanced user environmental awareness
US9151834B2 (en) * 2011-04-29 2015-10-06 The Invention Science Fund I, Llc Network and personal electronic devices operatively coupled to micro-impulse radars
US9000973B2 (en) 2011-04-29 2015-04-07 The Invention Science Fund I, Llc Personal electronic device with a micro-impulse radar
US20130107044A1 (en) * 2011-10-26 2013-05-02 Anthony Azevedo Blind Spot Camera System
US20150274074A1 (en) * 2012-01-30 2015-10-01 Klear-View Camera, Llc System and method for providing front-oriented visual information to vehicle driver
US9511711B2 (en) * 2012-01-30 2016-12-06 Klear-View Camera, Llc System and method for providing front-oriented visual information to vehicle driver
US11760264B2 (en) 2012-01-30 2023-09-19 Klear-View Camera Llc System and method for providing front-oriented visual information to vehicle driver
EP2814532B2 (en) 2012-02-13 2020-04-15 Integrated Healing Technologies Multi-modal wound treatment apparatus
US20130229298A1 (en) * 2012-03-02 2013-09-05 The Mitre Corporation Threaded Track Method, System, and Computer Program Product
US20140022067A1 (en) * 2012-07-20 2014-01-23 Michael J. Dambra Scooter/wheelchair lift platform with back-up sensor and quick disconnect
US8902052B2 (en) * 2012-07-20 2014-12-02 Michael J Dambra Scooter/wheelchair lift platform with back-up sensor and quick disconnect
US20140023470A1 (en) * 2012-07-20 2014-01-23 Tyrone Soklaski System and apparatus for improved wheelchair lift
US9039341B2 (en) * 2012-07-20 2015-05-26 Tyrone Soklaski System and apparatus for improved wheelchair lift
US20150285906A1 (en) * 2012-10-04 2015-10-08 Technology Service Corporation Proximity sensor
US20150247914A1 (en) * 2012-10-05 2015-09-03 FLARM Technology GmbH Method and device for estimating a distance
US8833815B2 (en) * 2012-10-23 2014-09-16 Ford Global Technologies, Llc Bumper integrated forward radar mounting system
US20140343836A1 (en) * 2013-05-17 2014-11-20 Dr. Ing. H.C.F. Porsche Aktiengesellschaft Method for operating a first-party vehicle
US8810382B1 (en) * 2014-01-14 2014-08-19 Joseph N. Laurita Method and apparatus for warning vehicle of low overpass height
US10451729B2 (en) * 2014-09-25 2019-10-22 Audi Ag Method for operating a multiplicity of radar sensors in a motor vehicle and motor vehicle
US10877149B2 (en) 2014-09-25 2020-12-29 Audi Ag Method for operating a multiplicity of radar sensors in a motor vehicle and motor vehicle
US20170285165A1 (en) * 2014-09-25 2017-10-05 Audi Ag Method for operating a multiplicity of radar sensors in a motor vehicle and motor vehicle
US10175355B2 (en) 2014-10-22 2019-01-08 Denso Corporation Object detection apparatus
US10578736B2 (en) 2014-10-22 2020-03-03 Denso Corporation Object detection apparatus
US20160116441A1 (en) * 2014-10-22 2016-04-28 Denso Corporation Object detection apparatus
US10175354B2 (en) * 2014-10-22 2019-01-08 Denso Corporation Object detection apparatus
US20160117841A1 (en) * 2014-10-22 2016-04-28 Denso Corporation Object detection apparatus
US10210435B2 (en) * 2014-10-22 2019-02-19 Denso Corporation Object detection apparatus
US10453343B2 (en) 2014-10-22 2019-10-22 Denso Corporation Object detection apparatus
US10451734B2 (en) 2014-10-22 2019-10-22 Denso Corporation Object detecting apparatus
US10436900B2 (en) 2014-10-22 2019-10-08 Denso Corporation Object detection apparatus
US10436899B2 (en) 2014-10-22 2019-10-08 Denso Corporation Object detection apparatus
US10395541B2 (en) * 2015-01-16 2019-08-27 Texas Instruments Incorporated Integrated fault-tolerant augmented area viewing system
US20160210861A1 (en) * 2015-01-16 2016-07-21 Texas Instruments Incorporated Integrated fault-tolerant augmented area viewing system
US20160291149A1 (en) * 2015-04-06 2016-10-06 GM Global Technology Operations LLC Fusion method for cross traffic application using radars and camera
US9599706B2 (en) * 2015-04-06 2017-03-21 GM Global Technology Operations LLC Fusion method for cross traffic application using radars and camera
US9685086B2 (en) * 2015-05-27 2017-06-20 Cisco Technology, Inc. Power conservation in traffic safety applications
CN107640090A (en) * 2016-07-22 2018-01-30 中兴通讯股份有限公司 A kind of traffic safety control method and device
US20190270405A1 (en) * 2016-11-18 2019-09-05 Panasonic Intellectual Property Management Co., Ltd. Notifying device, automatic driving vehicle, notifying method, program, non-transitory recording medium, and notifying system
US11810452B2 (en) 2016-11-18 2023-11-07 Panasonic Intellectual Property Management Co., Ltd. Notifying device and notifying system
US10988078B2 (en) * 2016-11-18 2021-04-27 Panasonic Intellectual Property Management Co., Ltd. Notifying device and notifying system
DE102017216791A1 (en) * 2017-09-22 2019-05-02 Zf Friedrichshafen Ag Sensory detection of open spaces under land vehicles
US10788570B2 (en) 2017-09-29 2020-09-29 The Boeing Company Radar system for mobile platform and method of use
US20200031276A1 (en) * 2018-07-25 2020-01-30 Mando Corporation Rear-side alarm device and rear-side alarm method thereof
US11180081B2 (en) * 2018-07-25 2021-11-23 Mando Corporation Rear-side alarm device and rear-side alarm method thereof
DE102019205504A1 (en) * 2019-04-16 2020-10-22 Zf Friedrichshafen Ag Control device and method as well as computer program product
CN110599800A (en) * 2019-09-24 2019-12-20 江苏集萃智能传感技术研究所有限公司 Parking lot parking space state monitoring system and monitoring method
US11390209B2 (en) * 2020-03-18 2022-07-19 Grote Industries, Llc System and method for adaptive driving beam headlamp
US11485278B2 (en) * 2020-03-18 2022-11-01 Grote Industries, Llc System and method for adaptive driving beam headlamp
US20230010662A1 (en) * 2020-03-18 2023-01-12 Grote Industries, Llc System and method for adaptive driving beam headlamp
US11760254B2 (en) * 2020-03-18 2023-09-19 Grote Industries, Llc System and method for adaptive driving beam headlamp

Also Published As

Publication number Publication date
US20050073433A1 (en) 2005-04-07
US6268803B1 (en) 2001-07-31

Similar Documents

Publication Publication Date Title
US20060119473A1 (en) System and method of avoiding collisions
US6894608B1 (en) System and method for warning of potential collisions
EP0830266B1 (en) Obstacle detection system for vehicles moving in reverse
US6838981B2 (en) Stopped object filtering for side object detection system
KR100803414B1 (en) Near object detection system
US6680689B1 (en) Method for determining object classification from side-looking sensor data
US5229975A (en) Vehicle proximity sensor
US6674394B1 (en) Method for determining object location from side-looking sensor data
US6943726B2 (en) Device for searching a parking space
JP6203862B2 (en) Method for maintaining a warning signal in a vehicle based on the presence of a target in a warning area, in particular a blind spot, a corresponding driver assistance system and a vehicle
JP2991659B2 (en) Rear and side obstacle detection system including multiple antennas for small vehicles
US8212660B2 (en) Overhead obstacle avoidance system
US8207836B2 (en) Overhead obstacle avoidance system
US20100238066A1 (en) Method and system for generating a target alert
WO2001006276A1 (en) Method and apparatus for recognizing stationary objects with a moving side-looking radar
AU2009279093B2 (en) Method and device for assisting the driver of a vehicle, especially an industrial or commercial vehicle, in identifying obstacles nearby with ultrasound sensors
US7119734B2 (en) Target determination apparatus, target judgment apparatus, and determination aid apparatus
CN104973051A (en) Method, device and system for adjusting driving speed of vehicle
Ruff Test results of collision warning systems for surface mining dump trucks
US11198390B2 (en) Overhead obstacle detection and avoidance system for vehicles
Ruff Monitoring Blind Spots: A Major Concern For Haul Trucks-Introduction
CN201272316Y (en) Side distance monitoring and alarming system for automobile
CN210617998U (en) Blind area detection equipment for freight transport and passenger transport vehicles
CN113341414A (en) Chassis scratch prevention system and chassis scratch prevention method based on millimeter wave radar
KR20040028600A (en) Near object detection system

Legal Events

Date Code Title Description
AS Assignment

Owner name: ALTRA RECOVERY, LLC, MINNESOTA

Free format text: COURT ORDER AND JUDGMENT;ASSIGNOR:ALTRA TECHNOLOGIES, INC.;REEL/FRAME:019597/0384

Effective date: 20070718

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION