WO1993019429A1 - Vision apparatus - Google Patents

Vision apparatus

Info

Publication number
WO1993019429A1
WO1993019429A1 PCT/AU1993/000111 AU9300111W WO9319429A1 WO 1993019429 A1 WO1993019429 A1 WO 1993019429A1 AU 9300111 W AU9300111 W AU 9300111W WO 9319429 A1 WO9319429 A1 WO 9319429A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
vision apparatus
cameras
signals
vehicle
Prior art date
Application number
PCT/AU1993/000111
Other languages
French (fr)
Inventor
Robert Ciolli
Stuart A. Mccormack
Denis C. Hitchens
Original Assignee
In-Mar-Tech Australia Pty. Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by In-Mar-Tech Australia Pty. Ltd. filed Critical In-Mar-Tech Australia Pty. Ltd.
Publication of WO1993019429A1 publication Critical patent/WO1993019429A1/en

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/04Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01PMEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/64Devices characterised by the determination of the time taken to traverse a fixed distance
    • G01P3/68Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/015Detecting movement of traffic to be counted or controlled with provision for distinguishing between two or more types of vehicles, e.g. between motor-cars and cycles

Definitions

  • Figure 9A shows a perspective front view of a truck 60 centred in the lane and for a 50 mm object lens.
  • Figure 9B is the resulting image.
  • the configurations of Figures 9C-9I are evident from the annotation on the drawings themselves.
  • Figure 12 shows a flow diagram indicating typical steps in performing all the functions of speed determination, vehicle detection and discrimination, registration number capture and deciphering, and vehicle 'signature' capture.
  • the local processor 52 will analyse that data to determine the actual speed and may also identify and decipher the registration number plate.
  • the registration number plate identification is performed by known software written by the present applicant which is incorporated into a system in use at the Traffic Camera Office in Victoria, Australia, and known under the trade mark ICONISCAN. This information is then transmitted to the host computer 101 where it can be acted on, such as by issuing a fine, or by interfacing over some external bus or communications device (represented as 105) with other records, such as those that might be held by traffic enforcement agencies and which identify the truck's owner and/or driver.
  • the registration number deciphering could equally be performed by the host computer 101.
  • the complete truck image information is also transmitted to the host computer 101 which can call up the relevant record from the mass storage device 103 in relation to the identified registration number plate to determine whether the truck bearing that registration number plate is the same as the one measured at the time of original registration. It may also be possible to raise an alarm at the central control station 104 such that an operator can alert law enforcement agencies who may take whatever action is deemed appropriate.
  • referring to Figures 13 and 14, there is shown a further embodiment which uses only a single-line scan type line scan camera 30.
  • This system is useful for recognising an image signature or image attribute of a passing object.
  • in this system one or other of the speed or the length of the object may be unknown.
  • this system can be used for detecting passing shipping containers 150 which may be lifted from transport vehicles into a ship or other storage area.
  • the line scan camera 30 is directed with its field of view orthogonal to an intended front face of the object - the shipping container 150.
  • the longitudinal extent of the line scan is directed generally horizontally assuming that the shipping container 150 will be lifted vertically.
  • camera 30 views a side face 151 of the shipping container 150 so that a code 153 marked on the face 151 can be viewed line by line as the shipping container 150 passes through the line scan image.
  • the signals provided by the camera 30 of the image of the object are provided to a remote station 50 as in previous embodiments.
  • the remote station 50 may include all the componentry referred to in previous embodiments as well as the componentry at the central station 100 in the previous embodiments. The exact arrangement of components is dictated by the in-use environment. For example, all the processing may be done in the remote unit 50 or there may be transmission of information from the remote unit 50 to the central station 100 as in the previous embodiments. As previously stated, the length of the object may or may not be known.
  • the speed of the object may or may not be known.
  • each single line scan image of the object can be processed relative to another line scan image of the object to form a normalized image of the object over all the line scans.
  • once the image signals have been normalized they can be processed with image recognition means for recognition processing of either the signature of the object or an image attribute of the object - either the shape of the object or the code numbers respectively.
  • These signals when recognised by the recognition means can provide an output representative of a recognised image signature or image attribute.
  • Figure 14 shows a typical software flow diagram which can be used to provide the necessary software for the system.
  • the image attribute such as the code characters can be recognised by the software ICONISCAN previously referred to.
  • the length of the shipping container is, in this case, the height of the shipping container, and the known height can be pre-stored as an appropriate signal.
  • the system described provides an inexpensive, reliable and accurate method of determining the speed of objects utilising readily available equipment and avoiding time consuming and difficult data analysis.
  • the system also has the advantages of allowing other characteristic measurements of objects to be performed, such as an object's positioning, width and an estimate of height and length together with a 'signature' image.

Abstract

Vision apparatus is disclosed which comprises at least one single-line scan type line scan camera (30, 40) mounted for viewing a passing object (60, 150). The cameras (30, 40) are connected with electronic means (50) which includes store means (53) which stores signals indicative of the image of the object throughout line scans. The electronic means (50) includes processing means (52) which processes the signals indicative of the image of the object together with a signal representative of at least one of the speed or length of the object to determine a length of the object between line scans. That determined length between line scans is provided to image recognition means (100) and utilized with signals in the store means (53) and compared against expected image signature or image attribute signals for recognition. If there is recognition an output is provided representative of a recognised image signature or image attribute. Preferably two cameras (30, 40) are utilized so that one views the object (60, 150) at a different position along its path of travel and where a signal representative of the speed of the object is obtained by comparing particular time instants when a part of the object is detected by each camera.

Description

VISION APPARATUS

Field of the Invention

This invention relates to a vision apparatus for recognising a passing object and it relates particularly, but not exclusively, to such apparatus for use in detecting passing objects such as passing vehicles on roadways and passing packaging containers such as shipping containers.

Description of the Prior Art
Hitherto, it has been desired to electronically capture an image of a passing object for subsequent processing. This may be required to control processes where the passing of an object initiates some action in the process.
It is particularly desired to obtain image recognition of the object such as characters or the like which may appear on a surface of the object. In relation to apparatus which is used to view a roadway to observe passing vehicles, it can be necessary to recognise some feature of the object such as a characteristic signature and/or an image attribute. This image signature may comprise the shape of the vehicle, and the image attribute may comprise the registration number or some other characteristic. In the case of shipping containers it may be code information applied to a face of the shipping container.
Hitherto, viewing of passing objects in such environments has required the use of expensive cameras which have the required high image resolution capabilities. Such cameras are typically frame cameras whereby signals frame-by-frame of the passing object can be stored and subsequently analysed to extract the required image signature or image attribute. With whole frame cameras of this type, not only is the initial purchase cost of the cameras themselves high, but the signal processing and the necessary storage media are also relatively expensive to purchase and maintain. On open roads and highways there is a tendency for trucks and motor vehicles to exceed the allowable speed limit. Speeding will often cause accidents and possibly result in loss of life, as well as causing damage to property and the roadways themselves. Speeding can also cause gradual damage to roadways, especially if trucks are overladen, since road surfaces are not normally designed to bear heavy loads travelling at excess speeds.
Many measures have been taken in an attempt to control speeding vehicles, including observing elapsed travel time over a fixed distance to determine the speed of a vehicle. This type of measuring device is commonly referred to as an 'amphometer'. In more recent times speed cameras, radar detectors and the like have been used by law enforcement agencies. Fines or the like have been applied to the vehicle or to the driver if speeding is detected.
An extension to the use of speed detection devices has been to incorporate the facility of allowing identification of a vehicle's registration number. This is achieved by taking either a photographic image or a video recording of a speeding vehicle, then processing that data to decipher the actual registration number from the registration number plate. Examples of such techniques can be found in U.S. Patent No. 4,817,166 assigned to Perceptics Corporation, and U.S. Patent No. 4,878,248 assigned to Industrial Technology Research Institute.
All these systems rely on the technique of capturing an image of a vehicle and storing the image either on photographic film or magnetic media for subsequent processing. There are numerous technical problems associated with converting the stored image data on a frame-by-frame basis to decipher the registration number, and these problems are only solved by complex software and large computer power to identify the field of the registration number plate. Clearly such software and hardware incurs substantial expense.
Another measure which has been adopted in an attempt to reduce the incidence of speeding is to install speed limiting devices on vehicles, and particularly on trucks and buses. Even with speed limiting devices in place there have been instances where vehicles have been recorded travelling at speeds in excess of the limited speed, which suggests that the speed limiters either do not work correctly or have been bypassed. Law enforcement agencies and statutory bodies responsible for the upkeep and regulation of use of roadways have also recently noticed a development amongst operators of trucks which seeks to avoid payment of correct user fees and charges. The practice involves a truck operator registering a new truck with the responsible authority then switching the registration number plates to a truck of the same make but of a larger size. This allows the truck operator to carry a greater load yet only pay fees and charges as if for a smaller truck. This evasion scheme has led to the concept of obtaining unique 'signatures' of trucks. A 'signature' is obtained by measuring certain characteristics of a truck at the time of registration and storing this information in a central registry. A detection system is then required to operate on the roadway, so that as a truck passes a detection station, a measurement of the truck's characteristics is made. A comparison is performed to determine whether the truck is indeed the same truck that was presented at the time of initial registration. One example of such systems in the prior art is the TAG type of system, in which a transponder is fitted to a truck and interrogated by an antenna array mounted at the roadside, typically to read a unique number stored within the transponder. A difficulty with the transponders is that it is not easy to know whether they are, in fact, operational, hence trucks with faulty transponders may unwittingly avoid detection. The system is also expensive and difficult to implement since each truck must be fitted with a transponder.
Statement of the Invention

It is an object of the present invention to provide a reliable and less expensive vision system than in the past. This is achieved by use of single-line scan type line scan cameras. By collecting a series of line scan signals as the object passes, a picture image of the passing object can be obtained. If the passing object is to be recognised then it is necessary to obtain some information concerning the length of the object between line scans - normalization - so that the signals can be appropriately length scaled and the image size normalized.
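The length scaling just described can be illustrated with a short sketch. The following Python fragment is not part of the patent; the function name, parameter names and the 2 mm target row spacing are assumptions for illustration only. It resamples a stack of stored line scans so that each output row spans a fixed ground distance, using a measured speed and the camera line rate.

    import numpy as np

    def normalise_line_scans(lines, speed_mps, line_rate_hz, target_mm_per_row=2.0):
        # One line scan per row, rows ordered in time. The ground distance covered
        # between successive scans follows from the measured speed and the line rate.
        lines = np.asarray(lines, dtype=float)
        mm_per_row = speed_mps * 1000.0 / line_rate_hz
        n_rows = max(1, int(round(lines.shape[0] * mm_per_row / target_mm_per_row)))
        # Resample along the direction of travel so every output row spans the same
        # ground distance, i.e. the image is length-normalised regardless of speed.
        src = np.linspace(0.0, 1.0, lines.shape[0])
        dst = np.linspace(0.0, 1.0, n_rows)
        return np.stack([np.interp(dst, src, lines[:, c]) for c in range(lines.shape[1])], axis=1)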
According to the present invention there is provided vision apparatus for recognising an image attribute of a passing object where either one or other of the speed or the length of the object are unknown, said apparatus comprising at least one single-line scan type line scan camera mounted so the passing object will be viewed, electronic means comprising: store means for storing signals indicative of the image of the object throughout line scans, processing means for determining a length of the object between line scans, said processing means having input means where signals representative of at least one of the speed or length of the object can be inputted to enable said signal representative of the length of the object between line scans to be processed, image recognition means to which the signals stored in said store means and to which said signal representative of the length of the object between line scans and to which expected image signature or image attribute signals are supplied for recognition processing, to obtain an output representative of a recognised image signature or image attribute.
Such apparatus is independent of the direction of approach of the object.
In one embodiment, there may be a second single-line scan type line scan camera which is directed so that the camera views the object, spaced along the direction of travel of the object, so that two sets of signals indicative of the image of the object can be obtained and processed. By directing the fields of view of each of the line scan cameras in this way, it is possible to obtain time difference signals from each camera and to calculate the speed of the object without knowing its length. By using two such cameras and by noting the time of entry and exit from the line scan images of one camera and the entry and exit from the line scans of the other camera it is also possible to calculate acceleration. The distance of spacing of the fields of view of each line scan camera can be relatively close as, for example, one third of a metre. By using a third such camera, spaced from the other two, it is possible to obtain a better indication of the acceleration of the object by determining the difference in speed over the time taken for the object to pass between the camera fields of view.
The apparatus may be duplicated at other locations along the path of the object to provide for interval route timing. In that instance the processors at the two locations may be in communication with each other or with yet another processor at a central location. The distances between the locations are known, so that when the object passes one location its feature such as image signature or image attribute or both will be obtained, and when the same object passes another location its same image signature or image attribute or both will again be obtained, and from the respective times of passing the two locations and the known distance therebetween the average speed can be determined.
The apparatus may be directed to the detection of the position of a vehicle on a roadway. A marking pattern may be laid on the roadway of a known length and position with respect to an edge of the roadway and being coincident with the line scan image field of a camera; the marking pattern comprising a regular pattern which, in the absence of a vehicle, produces a regularly repeating digitised image as determined by the processor, and upon a vehicle entering the image field the digitised image will change to form a resulting digitised image, and by comparison with the said regular digitised image, the width of the vehicle can be determined. In one embodiment the determination can be made from the points in the resulting image where the said regularity ceases and recommences. Further, the position of the vehicle on the roadway can be determined with reference to the sides of the roadway from that point along the line image field where the regularity in the resulting digitised image ceases and recommences. The determined width will indicate what category of vehicle has been detected through a comparison with a store of characteristic widths.
The apparatus may also identify an object in terms of the object's length and height. By recording an image of the whole of the object as it passes through the field of view of a camera a unique 'signature' of the object can be obtained. The height and length of the object can be determined from processing the image data obtained from the surface of the object as it passes through the field of a camera. An estimate of an object's length might be obtained from the sum of the lengths of image segments having straight sides after normalization. The object's height will be approximately the sum of the lengths of image segments having curved sides after having normalised those curved sides to represent true length.

Brief Description of the Drawings

In order that the invention may be more clearly understood, examples of preferred embodiments will now be described with reference to the accompanying drawings, in which:
Figure 1 shows schematic detail of two CCD (Charge Coupled Device) cameras located above a roadway in a system constructed in accordance with the invention for viewing passing vehicles;
Figure 2 shows a top view of the configuration in Figure 1;
Figure 3 shows in schematic block circuit diagram form the components of the system;
Figure 4 shows in side elevation a truck just entering the first line scan image field;
Figure 5 shows a top view of the arrangement in Figure 4;

Figure 6 shows the truck of Figure 4 having moved along the roadway to enter the second line scan image field;
Figure 7 shows the truck yet further progressed along the roadway;

Figure 8 shows a rear view of the truck and the first line scan image field;
Figure 9 shows examples of images obtained over the surface of the truck for varying conditions;

Figure 10 shows an example of a marking pattern applied to the roadway;
Figure 11 shows an embodiment having coverage of two lanes of a roadway; and
Figure 12 shows a flow diagram of typical steps performed by software in operation of a preferred system constructed in accordance with the above embodiments of the invention;
Figure 13 shows schematic detail of a single CCD camera located to one side of path of travel of an object such as a shipping container in a system constructed in accordance with the invention for viewing passing shipping containers and for recognizing code information on a face of the shipping containers;
Figure 14 is a flow diagram of typical steps performed by software in operation of a preferred system in accordance with Figure 13.
Detailed Description of Preferred Embodiments
The embodiments to be described in Figures 1 and 2 relate to a system installed on a public highway for viewing passing vehicles. Referring then to Figures 1, 2 and 3, the system 10 has components located above and beside the roadway 20. Particularly, two CCD cameras 30, 40 are arranged to be above the roadway 20 respectively at heights h1 and h2, and located approximately above the mid-point of the left hand carriageway of roadway 20 (for right hand drive vehicles). The cameras 30,40 are in spaced arrangement. The cameras are charge coupled devices of the single-line scan type line scan cameras. One suitable type of CCD line-scan camera is manufactured by the French company i2R, and sold under type number iVC100. These cameras are less expensive than frame cameras which have a required resolution to detect image attributes such as vehicle registration number plate characters. In the example of Figures 1 and 2 the cameras are sited adjacent one another in the horizontal plane, however the cameras could be mounted in other ways, but it is desired that the angles of the image field between the respective cameras and the perpendicular from the roadway, being α and β respectively, are substantially the same.
These angles are not orthogonal to the direction of travel of the vehicles; the image field of each camera is therefore represented as lines a1 and a2 and these must be regularly spaced at distance x along the whole of their path length. This arrangement facilitates easy processing of image signals such that inexpensive computers, such as P.C. type computers or dedicated computers of similar cost, can be utilized. If the fields of view are not parallel - i.e. angles α and β - then involved mathematical computations must be made to the image signals to equate the image signals of the two cameras to a fixed datum for determining parameters of the image signals which can be appropriately related to one another.
The line-scan cameras 30,40 of the type specified have a resolution of 2048 pixels in each line, which translates to typically 2 mm lateral resolution on the road surface, given that lane width is typically 3.3 m. Best results are obtained for angles α,β in the range 30-60°. For an angle of 45°, since there is one-one correspondence of horizontal speed with vertical speed, any image attribute obtained by the camera of numerals or letters will be 1 to 1 scaled, thereby aiding recognition. There is often a judgement required to be made, in that at an angle near 30° there will be a large 'shadow' behind a vehicle and therefore tail gating vehicles may escape detection. On the other hand, if an angle of 60° is selected tail gating vehicles will be detected, but there is a loss of resolution in capturing a 'signature' image of a vehicle - the shape of the vehicle - and a similar loss of resolution in capturing an image attribute - registration number. As the cameras operate at a line scan frequency of 10,000 lines per second, i.e. 20 MHz, for a vehicle speed of 70 km/h the resolution obtainable is 2 mm at a 45° angle, which is sufficient to allow recognition of vehicle signatures and image attributes such as registration numbers. If a vehicle's speed is, say, 140 km/h, resolution reduces to 4 mm, but it is still possible to recognize the registration number as only the height of the character has been affected, and not the width.
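The along-track resolution figures quoted above for 70 km/h and 140 km/h follow directly from the 10,000 lines-per-second scan rate. A minimal worked check (illustrative Python, not part of the patent):

    def along_track_resolution_mm(speed_kmh, line_rate_hz=10_000):
        # Ground distance the vehicle travels between successive line scans.
        return (speed_kmh / 3.6) * 1000.0 / line_rate_hz

    print(along_track_resolution_mm(70))    # ~1.9 mm, consistent with the quoted 2 mm
    print(along_track_resolution_mm(140))   # ~3.9 mm, consistent with the quoted 4 mm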
The two cameras 30,40 do not necessarily need to have the same resolution, nor do they need to have as high a resolution as frame cameras, as the application and speed of vehicles being detected will dictate an appropriate configuration, particularly if the expected top speed of vehicles is less than 140 km/h.
The cameras 30,40 are mounted off the road by any suitable means such as on a gantry, or on some other structure which may also hold road signs, or even from a bridge. The cameras 30,40 are connected to a remote station 50 which comprises an enclosure housing electronic means utilised in implementing the localised control and processing of the system 10. The remote station 50 may include a power supply 51 which can be used to power the electronic means as well as the cameras 30,40.
Referring in detail to Figure 3, the local processor 52 controls the operation of the localised control/processing at the remote station 50. The processor 52 is in two-way communication with the cameras 30,40. Information received by the local processor 52 from the cameras 30,40 typically relates to signals representative of image data collected, whereas the control lines from the local processor 52 to the cameras 30,40 relate to issuing of control instructions and providing power to the cameras. The local processor 52 can read and write information from a store means in the form of a mass storage device 53. As shown in Figure 3 it is possible to connect further pairs of cameras 30', 40' to the local processor 52, the other pairs being sited above another carriageway.
The local processor 52 will periodically be required to transmit information to a centralised location at a central station 100, for reasons that will become apparent. For this purpose a modem/Tx/Rx device 54 is connected to the local processor 52. The communications from the local processor 52 to the central station 100 can be via land line or radio transmission or the like.
Signals of the image data to be transmitted are likely to be compressed to reduce transmission times. For this purpose the JPEG (Joint Photographic Experts Group) compression technique could be adopted, although many other techniques would be equally suitable. There is also provided a local control station 55 which allows an operator to interrogate or control the operation of the local processor 52. It is likely this control station 55 would only be used for maintenance purposes, and otherwise the remote station 50 would operate automatically.
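As an illustration of the compression step mentioned above, the following sketch assumes a Python environment with the NumPy and Pillow libraries; the patent does not specify an implementation, and the helper name and parameters are hypothetical.

    from io import BytesIO

    import numpy as np
    from PIL import Image

    def compress_scan_image(scan_rows, quality=75):
        # Assemble the stored line scans into a greyscale image and JPEG-compress it
        # before transmission from the remote station to the central station.
        img = Image.fromarray(np.asarray(scan_rows, dtype=np.uint8))
        buf = BytesIO()
        img.save(buf, format="JPEG", quality=quality)
        return buf.getvalue()   # compressed bytes ready for the modem/Tx link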
The central station 100 can support a number of remote stations 50, with a suitable number of modem/Tx/Rx devices 102 being provided. The host computer 101 has access to a mass storage device 103 and a central control station 104. The central control station 104 would enable an operator to monitor the performance of any one of a number of remote stations 50 as well as performing other operational functions.
Turning then to consider the operation of a remote station 50, Figure 4 shows a truck travelling on the roadway 20 having reached a position where its front bumper enters the image field a1. The cameras 30,40 are typically chosen to operate using infra-red light in the range 600-800 nm, hence the light would not normally be visible to a driver of a truck or other motor vehicle. Even so, it is possible to use cameras sensitive to the visible light spectrum.
Normally the cameras 30,40 are in constant operation, hence the system does not require remote triggering from any other sensor to start the image recording. There are particular advantages in not requiring a trip mechanism. First, there is no stray electromagnetic radiation such as at microwave or RF frequencies. There is great concern in the community generally as to excessive man-made electromagnetic radiation and what are perceived as being associated health issues. Second, the system is non-invasive to the roadway as the road need not be disturbed to install trip sensors or the like. This, in turn, avoids subsequent road repair costs. The reason both cameras are operational is because the system is designed to operate for a vehicle travelling in either direction on the roadway, in which case either image field a1 or a2 may be entered first. The camera will therefore usually be taking an image of the road surface to use as a reference. Once the image changes by virtue of the front-most part of the truck passing through the line scan field a1, the local processor 52 will commence storage of signals of digitised image data. As can be seen from Figure 5, only the middle portion of the line scan image will show a change as each side portion is still incident on the roadway 20. As soon as the truck 60 enters the line scan of the first image field a1, it is possible to switch the local processor 52 into a high intensity/high capture mode to enhance the data image.
In another embodiment, the detection of a vehicle may take place by means of processing the image data obtained from the roadway or a roadway marking pattern, as will presently be described with reference to Figure 10, utilising a fast Fourier transform (FFT) program to translate the information into the frequency domain. When a vehicle enters the image field a change in the recorded frequency characteristic occurs, triggering the speed determination and other processes.
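A minimal sketch of such a frequency-domain trigger follows (illustrative Python using NumPy's FFT; the reference spectrum, threshold and comparison are assumptions rather than the patent's specification).

    import numpy as np

    def vehicle_entered(scan_line, empty_road_spectrum, threshold=0.25):
        # Compare the spectrum of the current line scan with a reference spectrum
        # recorded over empty road (the rfft magnitude of an equal-length,
        # mean-subtracted empty-road scan). A large relative change in spectral
        # shape means the regular pattern has been interrupted by a vehicle.
        scan_line = np.asarray(scan_line, dtype=float)
        spectrum = np.abs(np.fft.rfft(scan_line - scan_line.mean()))
        reference = np.asarray(empty_road_spectrum, dtype=float)
        change = np.linalg.norm(spectrum - reference) / (np.linalg.norm(reference) + 1e-9)
        return change > threshold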
Referring then to Figure 6, as the truck 60 progresses along the roadway 20 the front-most portion of the truck then enters the line scan of field a2. This event is signalled by the camera 40 to the local processor 52, in which case the elapsed time, t, between field a1 having been entered and field a2 now having been entered will be known, and by knowing the horizontal separation of the fields, x (given that the same point of the truck will break both line scan beams in the first instance), the speed of the truck can be determined as x/t. This can then be scaled to give an indication of the speed of the truck in units such as km/h. In other words a signal representative of the speed of the truck is determined by comparing a time instant when a part of the truck is at a particular line scan of one of the cameras, and a time instant when the same part of the truck is at the same line scan of the other of the cameras, to obtain a time difference signal.
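In code form the calculation amounts to very little; the following hypothetical helper (illustrative Python, with the field separation in metres and times in seconds) simply mirrors the x/t relationship.

    def speed_kmh(field_separation_m, t_enter_a1_s, t_enter_a2_s):
        # Elapsed time between the same point of the vehicle crossing fields a1 and a2;
        # abs() keeps the result independent of the direction of approach.
        elapsed = abs(t_enter_a2_s - t_enter_a1_s)
        return field_separation_m / elapsed * 3.6

    print(round(speed_kmh(0.33, 0.000, 0.017)))   # ~70 km/h for a 0.33 m field separation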
If it is determined that the speed of the truck is in excess of the permissible speed limit, then the recording of the image of the truck as performed by camera 30 or 40 should contain image information as to an image attribute - in this case the registration number plate - from which the registration number can be deciphered and appropriate action taken. It may be equally applicable to determine the registration number of a vehicle that is not speeding, for example in route interval timing as will be described presently. If the separation x between the fields a1 and a2 is quite small, say 100 mm, then a sufficient amount of image information may not have been recorded by camera 30 in order to include the registration number plate. In this instance, it may be possible to continue the recording for some period of time based on a generalised understanding of the average height of the bonnet of a truck when considered against the speed of the truck. In the alternative, it is possible to continue the recording of the truck over the whole length of the truck until such time as the field a1 passes across the back of the truck and an image of only the roadway 20 is again obtained.
If it is desired to obtain a measure of the acceleration of a vehicle this can be achieved utilising the two camera system. If the time at which the front of the truck enters field a1 is ascribed to be t0, and the time the front of the truck enters field a2 having travelled distance x is t1, then a first velocity V1 is known from x/(t1 - t0). In the same way another velocity can be determined using the rear of the truck as a reference. The times the rear of the truck passes fields a1 and a2 are t2 and t3 respectively. This then determines V2. The acceleration can be determined from (V2 - V1) / ((t1 - t0) - (t3 - t2)). In other words the acceleration is determined by comparing the time difference in the signals indicative of the image of the truck, of the entry and exit instants of the truck within the fields of each camera.
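The expression above can be transcribed directly into a short sketch (illustrative Python; the guard against equal transit times is an added assumption, not part of the text).

    def acceleration_between_fields(x_m, t0, t1, t2, t3):
        # V1 from the front of the truck crossing a1 then a2, V2 from the rear,
        # combined using the expression given in the text.
        v1 = x_m / (t1 - t0)
        v2 = x_m / (t3 - t2)
        dt = (t1 - t0) - (t3 - t2)
        if abs(dt) < 1e-12:
            return 0.0   # equal transit times: no measurable change in speed
        return (v2 - v1) / dt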
A further embodiment to that shown in Figure 2 is to include a third single-line scan type line-scan CCD camera. This third camera will be spaced from the other two cameras either in front of camera 30 or behind camera 40 or between the two cameras, and such that the line field of images a3 is substantially parallel to fields a1 and a2. The separation between field a3 and either a1 or a2 need not be x as between those two. The function of the third camera is to provide another measure of a vehicle's speed some time after, before or during that obtained by the time interval between a1 and a2. In this way a measure of the vehicle's acceleration can also be obtained, through knowing the change in speed and the time between the respective speeds being recorded. It may be important to know the acceleration of a vehicle. One example of the value of knowing this quantity would be where the cameras were mounted near a set of traffic lights, and if a vehicle was determined to be speeding and accelerating it may be possible to cause the state of the traffic lights to be changed and avoid a collision. Alternatively, it can be determined that a driver is engaging in unsafe driving practices by speeding or accelerating through traffic signals.
Thus, in this arrangement, the third camera views the truck at a position along its path of travel different to the two cameras, and where the acceleration of the object is obtained by comparing the time difference in said signals indicative of a part of the truck at a particular line scan image of the truck in each of the cameras.
As has been noted above, the central station 100 can support a number of remote stations 50. Given the instance that the registration number of a vehicle has been obtained at one remote station 50, it is then possible to check the route interval time of the vehicle once it passes another remote station 50 simply by matching registration numbers and 'times of day'. This will give an indication of the average speed of that vehicle.
There may be only one route or a number of routes between remote stations 50. If there is more than one route, a minimum time for the journey by the fastest route would serve to determine whether speeding might have occurred. It is unlikely that trucks would take a longer route between two points unless that route proved shorter in time; in that event the longer route distance-wise is nevertheless the shortest route time-wise, and hence it is chosen as the baseline. It is also possible that the remote stations 50 can be networked across different major routes, and in this way still track vehicles travelling between destinations which span different highways. The necessary processing would be performed by the host computer 101. An adjunct to route interval timing is to allow fleet performance monitoring. The system described is a tamper-proof method of confirming the trip details of individual vehicles in a fleet.
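A minimal sketch of the route interval calculation follows (illustrative Python; the registration-number matching and the record formats used by the host computer 101 are not specified here).

    def route_interval_average_kmh(distance_km, seconds_at_first_station, seconds_at_second_station):
        # Average speed between two remote stations, from matched registration
        # numbers and the recorded times of day at each station.
        hours = (seconds_at_second_station - seconds_at_first_station) / 3600.0
        return distance_km / hours

    # e.g. 45 km by the fastest route covered in 20 minutes implies a 135 km/h average
    print(route_interval_average_kmh(45.0, 0.0, 1200.0))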
A modification of the system permits determining the vehicle parameters of position in a carriageway and its width. This is shown in Figure 10. This modification comprises use of a roadway marking pattern 120 which is laid across the width of the carriageway at a position coincident with fields a1 and a2. The pattern 120 comprises regularly spaced white squares or rectangles which are periodically interposed with alternately oriented triangles. As the cameras 30,40 record the marking pattern 120 they will generate a regularly repeating digitised image based on 256 grey scale levels. The width of the white segments is at least 2 pixels. A white segment will be recorded as a series of pixels (corresponding to the width of each segment according to the resolution) having a grey scale value of, say, 0-10. The roadway segments are black and therefore detected as a series of pixels having a grey scale value of, say, 245-255. When there is no vehicle present a known pattern of grey scaled pixels is continually recorded. There is a direct known correlation between each widthwise position in the carriageway and each pixel in the sequence. When a vehicle enters the image field it will cause a change in the image. By determining where the known pattern ceases and recommences, the sides of the vehicle can be located, and therefore the position of the vehicle in the carriageway. A first estimate of its width can also be determined. Alternatively, it is possible to apply the digitised images to a fast Fourier transform (FFT) to convert the information to the frequency domain. A change in the frequency spectrum will indicate an object entering the image field. The triangular shapes in the pattern 120 serve a special purpose. The cameras 30,40 are subject to the environment, including wind and structural vibration effects. This may result in the image fields a1 and a2 being not precisely normal to the carriageway but translating forward and backward along the roadway, or possibly becoming slightly skewed. This translation or skewing effect will be determinable since the reflected image will vary slightly as more or less of the triangle is imaged depending upon the skew. This allows a correcting adjustment to be made in the processing of image data of a vehicle, hence maintaining the desired accuracy.
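The location of the sides of a vehicle against the marking pattern can be illustrated with the short sketch below. It compares each digitised line scan with the stored no-vehicle line and reports the positions at which the known pattern ceases and recommences; the threshold value and the metres-per-pixel conversion are assumptions for the purpose of the example.

    import numpy as np

    def locate_vehicle(scan, background, metres_per_pixel, threshold=30):
        """Return (left edge, right edge, width) of a vehicle in one line scan.

        scan, background : 1-D sequences of 8-bit grey levels across the
                           carriageway; 'background' is the known pattern
                           recorded with no vehicle present.
        """
        scan = np.asarray(scan, dtype=int)
        background = np.asarray(background, dtype=int)
        changed = np.abs(scan - background) > threshold
        idx = np.flatnonzero(changed)
        if idx.size == 0:
            return None                      # known pattern intact: no vehicle
        left, right = idx[0], idx[-1]        # pattern ceases ... and recommences
        width_m = (right - left + 1) * metres_per_pixel
        return left * metres_per_pixel, right * metres_per_pixel, width_m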
The task of obtaining a 'signature' of a truck is made more difficult where the truck is not travelling down the centre of a carriageway, is straddling lanes or is skewed by virtue of changing lanes. These considerations impact on the system configuration and the software required to generate a total image of the truck. There are four angular measures, as well as the speed and acceleration of the truck, which come into play. These angular measures are the image field inclination (α), the truck centre-line offset, the skew of motion and the camera angular field of view (which is a function of focal length). These considerations are complicated by a trade-off in camera object lens size: longer focal length lenses resolve images better in the distance, i.e. at the extent of the image field such as at bumper bar level, whereas shorter focal length lenses give better resolution over shorter distances, i.e. at windshield level. The choice of object lens will therefore be partly application dependent.
Where there are two adjacent carriageways it is advantageous to have the system replicated for the other carriageway, as is shown in Figure 11. In this instance the degree of coverage of each camera must be such as to partly overlap into the adjacent carriageway, and at least by one half of the width of a registration number plate. The second carriageway could be for traffic travelling in the same or a different direction. Aside from performing the same functions as have been described, the duplicated system can also be used in conjunction with the first for the detection, discrimination, deciphering of the registration number and 'signature' image capture if a vehicle is straddling the lanes. The cameras (30,40 and 70,80) of both carriageways will record respective parts of the whole, including some of the sides of the vehicle. Often there will be information on the sides of a vehicle which will aid identification of the vehicle in addition to the registration number plate. It is possible to confirm that it is the one vehicle straddling the carriageway as the image recorded by both sets of cameras will occur at the same instant in time, and there will be no roadway component at the common edges of the carriageway.
As the truck passes through field a1, the whole top surface of the truck is recorded and stored in the mass storage device 53, and thereby forms a unique 'signature' of the truck. Because the resolution offered by the system is so great (say a 2 mm x 2 mm comb over the truck at 70 km/h and 45° for angles α, β), the very small differences between trucks of an identical make and model may be detected. For example, the length of the exhaust stack may be 5 cm longer or shorter between individual trucks, and this difference could be detected. Features such as the number of driving lights could also be determined. Figure 9 shows a number of instances of a truck passing through the image fields a1 and a2. In particular, the variables considered are offset from the carriageway centre line and two camera object lens sizes. The registration number plate is also shown to give an indication of how the resulting image will appear, thereby dictating the requirements of the character recognition software. Figure 9A shows a perspective front view of a truck 60 centred in the lane and for a 50 mm object lens. Figure 9B is the resulting image. The configurations of Figures 9C-9I are evident from the annotations on the drawings themselves.
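To indicate how a figure of roughly 2 mm at 70 km/h and α = 45° comes about, the sketch below relates the along-track sampling pitch to the vehicle speed, the camera line rate and the inclination of the scan plane. The line rate is not stated in the specification, so it appears here only as an assumed parameter.

    import math

    def scan_pitch_mm(speed_kmh, line_rate_hz, alpha_deg=45.0):
        """Spacing of successive scan lines on the truck, in millimetres.

        On a horizontal surface (bonnet, trailer roof) the pitch is simply the
        distance travelled between scans; on a vertical face the scan plane,
        inclined at alpha to the roadway, climbs by that distance times tan(alpha).
        """
        v = speed_kmh / 3.6                        # vehicle speed in m/s
        horizontal = v / line_rate_hz              # metres per scan, horizontal face
        vertical = horizontal * math.tan(math.radians(alpha_deg))
        return horizontal * 1000.0, vertical * 1000.0

    # e.g. scan_pitch_mm(70.0, 9722.0) -> approximately (2.0, 2.0)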
Figure 12 shows a flow diagram indicating typical steps in performing all the functions of speed determination, vehicle detection and discrimination, registration number capture and deciphering, and vehicle 'signature' acquisition. It would be within the capability of a computer programmer to write software to implement the system following the instructions of this specification and the flow diagram and without requiring inventive input. Two other physical properties of a truck which can be estimated from the overall image of the truck are its height and length. As the truck progresses through field a1, any part which has a vertical component increasing the height of the truck will generally appear as a curve-sided image, since a narrower part of the field a1 is being entered at that greater height. Conversely, any horizontal part of the truck, such as the bonnet or trailer roof, will generally appear as a regularly sided image. By appropriate software processing it is possible to estimate the height of the vehicle from the sum of the curved sections in the image once those curved sections have been normalised to represent true length, whereas the length of the vehicle can be estimated from the sum of the sections of the image which have straight sides.
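The height and length estimation just described could be approached along the lines of the sketch below, which classifies each scan of the assembled image as straight-sided or curve-sided from the change in the outline position between scans, then sums the two classes. The pre-extracted edge positions, the classification threshold and the use of tan(α) to normalise the curved sections are all assumptions made for illustration; they loosely follow the description above rather than reproduce a method given in the specification.

    import numpy as np

    def estimate_length_height(edge_px, speed_mps, line_rate_hz,
                               alpha_deg=45.0, slope_threshold_px=2.0):
        """Rough length and height estimate from the outline of the truck image.

        edge_px : outline (side edge) pixel position of the truck in each scan,
                  assumed to have been extracted beforehand (illustrative input).
        """
        edge = np.asarray(edge_px, dtype=float)
        slope = np.abs(np.diff(edge))                  # change in outline per scan
        step_m = speed_mps / line_rate_hz              # travel between scans
        straight = slope <= slope_threshold_px         # regularly sided sections
        length_m = straight.sum() * step_m             # horizontal parts -> length
        # curve-sided sections come from vertical parts: the inclined scan plane
        # rises by step_m * tan(alpha) per scan on a vertical face
        rise_per_scan = step_m * np.tan(np.radians(alpha_deg))
        height_m = (~straight).sum() * rise_per_scan
        return length_m, height_m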
Once the image of a speeding truck is completed, the local processor 52 will analyse that data to determine the actual speed and may also identify and decipher the registration number plate. The registration number plate identification is performed by known software written by the present applicant, which is incorporated into a system in use at the Traffic Camera Office in Victoria, Australia, and known under the trade mark ICONISCAN. This information is then transmitted to the host computer 101 where it can be acted on, such as by issuing a fine or by interfacing, over some external bus or communications device represented as 105, with other records such as those that might be held by traffic enforcement agencies and which identify the truck's owner and/or driver. The registration number deciphering could equally be performed by the host computer 101. The complete truck image information is also transmitted to the host computer 101, which can call up the relevant record from the mass storage device 103 in relation to the identified registration number plate to determine whether the truck bearing that registration number plate is the same as the one measured at the time of original registration. It may also be possible to raise an alarm at the central control station 104 such that an operator can alert law enforcement agencies who may take whatever action is deemed appropriate.
Referring now to Figure 13 and to Figure 14 there is shown a further embodiment which uses only a single-line scan type line scan camera 30. This system is useful for recognising an image signature or image attribute of a passing object where one or other of the speed or the length of the object may be unknown. In one particular application this system can be used for detecting passing shipping containers 150 which may be lifted from transport vehicles into a ship or other storage area. In this instance, the line scan camera 30 is directed with its field of view orthogonal to an intended front face of the object - the shipping container 150. The longitudinal extent of the line scan is directed generally horizontally, assuming that the shipping container 150 will be lifted vertically. Thus, camera 30 views a side face 151 of the shipping container 150 so that a code 153 marked on the face 151 can be viewed line by line as the shipping container 150 passes through the line scan image. The signals provided by the camera 30 of the image of the object are provided to a remote station 50 as in previous embodiments. The remote station 50 may include all the componentry referred to in previous embodiments as well as the componentry of the central station 100. The exact arrangement of components is dictated by the in-use environment. For example, all the processing may be done in the remote unit 50, or there may be transmission of information from the remote unit 50 to the central station 100 as in the previous embodiments. As previously stated, the length of the object may or may not be known. Similarly, the speed of the object may or may not be known. With this embodiment it is possible, by subsequently ascertaining the length of the object or the speed of the object, to input one or other of signals representative of the speed or length of the object, so as to enable a signal to be generated which is representative of the length of the object between line scans. In this way, each single line scan image of the object can be processed relative to another line scan image of the object to form a normalised image of the object over all the line scans. When the image signals have been normalised they can be processed with image recognition means for recognition processing of either the signature of the object or an image attribute of the object - either the shape of the object or the code numbers respectively. These signals, when recognised by the recognition means, can provide an output representative of a recognised image signature or image attribute. Figure 14 shows a typical software flow diagram which can be used to provide the necessary software for the system. The image attribute such as the code characters can be recognised by the ICONISCAN software previously referred to. In the embodiment of Figure 13 for use specifically with shipping containers, it can be assumed that the length of the shipping container (in this case the height of the shipping container) is pre-known. Thus, such height need not be calculated or obtained at another location for subsequent inputting to obtain a normalised image of the shipping container. The known height (length) can be pre-stored as an appropriate signal.
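A minimal sketch of the normalisation step for the single-camera embodiment is given below. It assumes the object's length (for the shipping container, its pre-known height) and the cross-scan pixel pitch are available, and resamples the stack of line scans so that the image has an undistorted aspect ratio before it is passed to the recognition software; the function and parameter names are illustrative, not taken from the specification.

    import numpy as np

    def normalise_line_scan_image(scans, object_length_m, pixel_pitch_m):
        """Resample a stack of line scans into a length-normalised image.

        scans           : array of shape (n_scans, width), one row per line scan
                          captured while the object passed through the field.
        object_length_m : known (or subsequently input) length of the object in
                          its direction of travel - for the container, its height.
        pixel_pitch_m   : size of one pixel across the scan line, in metres.
        """
        img = np.asarray(scans, dtype=float)
        n_scans = img.shape[0]
        # each scan represents object_length_m / n_scans of travel; choose the
        # number of output rows so one row spacing equals the cross-scan pitch
        target_rows = max(1, int(round(object_length_m / pixel_pitch_m)))
        pos = np.linspace(0.0, n_scans - 1.0, target_rows)
        lo = np.floor(pos).astype(int)
        hi = np.minimum(lo + 1, n_scans - 1)
        frac = (pos - lo)[:, None]
        return (1.0 - frac) * img[lo] + frac * img[hi]   # linear interpolation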
The system described provides an inexpensive, reliable and accurate method of determining the speed of objects utilising readily available equipment and avoiding time-consuming and difficult data analysis. The system also has the advantage of allowing other characteristic measurements of objects to be performed, such as an object's position and width and an estimate of its height and length, together with a 'signature' image.

Claims

CLAIMS:
1. Vision apparatus for recognising an image signature or image attribute of a passing object where either one or other of the speed or the length of the object is unknown, said apparatus comprising at least one single-line scan type line scan camera mounted so the passing object will be viewed, and electronic means comprising: store means for storing signals indicative of the image of the object throughout line scans, processing means for determining a length of the object between line scans, said processing means having input means whereby signals representative of at least one of the speed or length of the object can be inputted to enable said signal representative of the length of the object between line scans to be processed, and image recognition means to which the signals stored in said store means, said signal representative of the length of the object between line scans and expected image signature or image attribute signals are supplied for recognition processing, to obtain an output representative of a recognised image signature or image attribute.
2. Vision apparatus as claimed in claim 1, wherein said processing means also processes the signals indicative of the image of the object to obtain a signal representative of the time taken for the object to pass so that by inputting signals representative of the length of the object, a speed signal representative of the speed of the object can also be obtained.
3. Vision apparatus as claimed in claim 1 wherein the field of view of the camera is directed to view the object orthogonally to the direction of travel of the object.
4. Vision apparatus as claimed in claim 1 including at least two such cameras, one being mounted relative to the other so that one views the object at a different position along its path of travel to the other, and where said signal representative of the speed of the object is determined by comparing a time instant when a part of the object is at a particular line scan of one of the cameras and a time instant when the same part of the object is at a similar line scan of the other of the cameras, to obtain a time difference signal which will result in said signal representative of the speed of the object.
5. Vision apparatus as claimed in claim 4 where a signal representative of an acceleration change of the object is obtained by comparing the time difference in said signals indicative of the image of the object, of the entry and exit instants of the object within the fields of view of each camera.
6. Vision apparatus as claimed in claim 4 or claim 5 including a third such camera mounted to view the object along its path of travel at a position different than the positions of the two cameras, and where an acceleration of the object is obtained by comparing the time difference in said signals indicative of the image of the object at a particular line scan image of the object in each of said cameras.
7. Vision apparatus as claimed in claims 4, 5 or 6 wherein said expected image attributes are characters, to permit recognition of characters carried by said object.
8. Vision apparatus as claimed in claims 4, 5 or 6 wherein the fields of view of each camera are directed substantially parallel to one another.
9. Vision apparatus as claimed in claim 8 wherein the fields of view of each camera are inclined other than being orthogonal relative to the direction of travel of the object, whereby to provide viewing of a leading or trailing face of the object plus viewing of at least one other face of the object.
10. Vision apparatus as claimed in claim 8 or claim 9 wherein signals representative of an image shape of the object are stored and processed whereby to provide signals representative of a signature of the object.
11. Vision apparatus as claimed in claim 10 wherein the signals representative of the signature, processed with said output signals representative of a recognised image attribute, are compared by comparing means with matched signals representative of known signatures and signals representative of known image attributes, to verify that the detected signature and image attribute correspond to the known matched signature and image attribute.
12. Vision apparatus as claimed in claim 9, 10 or 11 and being for specific use in road traffic supervision, said apparatus comprising means for detecting position of the vehicle on a roadway or width of the vehicle, said means comprising a marking pattern applied to the surface of the roadway, coincident with a line scan field of view of at least one of the cameras and extending across the width of the roadway, said marking pattern generating a repeating unchanging image signal each line scan in the absence of a passing vehicle, and in the presence of a passing vehicle generating a different line scan image signal, said processing means processing the resulting image signals to determine either or both the position of the vehicle on the roadway or the width of the vehicle.
13. Vision apparatus as claimed in claim 12 wherein the marking pattern comprises a series of spaced but different images, at least one of the spaced images being an upright triangular image and another being an inverted triangular image, whereby comparisons of the lengths of line scan across both can be made by said processing means to determine if there is skew of the line scan relative to the width of the roadway.
14. Vision apparatus as claimed in claim 12 being interconnected with similar vision apparatus at a remote position along the expected path of travel of the vehicle, each such vision apparatus being electronically connected with processing means to compare whether either or both of the image signature or image attribute recognised by the two vision apparatus are for the same vehicle and to then calculate the time taken for travel between the two vision apparatus.
PCT/AU1993/000111 1992-03-18 1993-03-18 Vision apparatus WO1993019429A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AUPL1403 1992-03-18
AUPL140392 1992-03-18

Publications (1)

Publication Number Publication Date
WO1993019429A1 (en)

Family

ID=3776043

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU1993/000111 WO1993019429A1 (en) 1992-03-18 1993-03-18 Vision apparatus

Country Status (2)

Country Link
CA (1) CA2132346A1 (en)
WO (1) WO1993019429A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4063820A (en) * 1975-11-10 1977-12-20 Rca Corporation Apparatus for measuring a dimension of an object
GB1573188A (en) * 1977-09-23 1980-08-20 British Railways Board Measuring systems
FR2583882A1 (en) * 1985-06-25 1986-12-26 Renault Device for measuring the speed and position of a moving object with respect to the ground
US4948685A (en) * 1987-09-03 1990-08-14 Ricoh Company, Ltd. Sheet-shaped electrode, method of producing the same, and secondary battery using the sheet-shaped electrode
GB2231952A (en) * 1989-05-03 1990-11-28 Serco Limited Vehicle length measurement system

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5809161A (en) * 1992-03-20 1998-09-15 Commonwealth Scientific And Industrial Research Organisation Vehicle monitoring system
EP0674186A2 (en) * 1994-03-23 1995-09-27 Daimler-Benz Aerospace Aktiengesellschaft Apparatus for the detection and classification of aircraft or vehicles, preferably when moving on runways or taxiways
EP0674186A3 (en) * 1994-03-23 1997-05-28 Daimler Benz Aerospace Ag Apparatus for the detection and classification of aircraft or vehicles, preferably when moving on runways or taxiways.
EP0823036A1 (en) * 1995-04-28 1998-02-11 Schwartz Electro-Optics, Inc. Intelligent vehicle highway system sensor and method
EP0823036A4 (en) * 1995-04-28 1999-09-15 Schwartz Electro Optics Inc Intelligent vehicle highway system sensor and method
WO1999018554A1 (en) * 1997-10-08 1999-04-15 Tracon Systems, Ltd. A road-embedded video camera system
EP0978811A2 (en) * 1998-08-07 2000-02-09 Siemens Aktiengesellschaft Method and device to obtain travel times of vehicles
EP0978811A3 (en) * 1998-08-07 2000-08-16 Siemens Aktiengesellschaft Method and device to obtain travel times of vehicles
EP1048961A2 (en) * 1999-04-30 2000-11-02 Siemens Aktiengesellschaft Apparatus and method for simultaneous measurement of the speed and surface characteristics of moving objects
DE19919925A1 (en) * 1999-04-30 2000-11-16 Siemens Ag Arrangement and method for the simultaneous measurement of the speed and the surface shape of moving objects
DE19919925C2 (en) * 1999-04-30 2001-06-13 Siemens Ag Arrangement and method for the simultaneous measurement of the speed and the surface shape of moving objects
EP1048961A3 (en) * 1999-04-30 2003-10-15 Siemens Aktiengesellschaft Apparatus and method for simultaneous measurement of the speed and surface characteristics of moving objects
WO2012152868A1 (en) * 2011-05-11 2012-11-15 Morpho Method and device for the production of a contextual image of a moving object
CN104730280A (en) * 2015-04-10 2015-06-24 苏州大学 Speed measuring method and system for balls
CN109085374A (en) * 2018-07-27 2018-12-25 江苏科技大学 The multiple spot speed measuring device and its speed-measuring method for slow-speed of revolution system based on kinect
CN109085374B (en) * 2018-07-27 2020-06-16 江苏科技大学 Kinect-based multi-point speed measuring device for low-rotating-speed system and speed measuring method thereof
CN111650392A (en) * 2020-07-03 2020-09-11 东北大学 Metal sheet movement speed detection method based on linear array camera stereoscopic vision

Also Published As

Publication number Publication date
CA2132346A1 (en) 1993-09-30

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AT AU BB BG BR CA CH CZ DE DK ES FI GB HU JP KP KR KZ LK LU MG MN MW NL NO NZ PL PT RO RU SD SE SK UA US VN

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
ENP Entry into the national phase

Ref document number: 1994 244067

Country of ref document: US

Date of ref document: 19940516

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2132346

Country of ref document: CA

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase