US20020070342A1 - Method and system for improving camera infrared sensitivity using digital zoom - Google Patents
- Publication number
- US20020070342A1 (application Ser. No. US09/732,429)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/02—Constructional details
Definitions
- the present invention relates generally to a system and method for providing recognition of an approaching object located in a distant no-light environment and, more specifically, to a low-cost near-infrared (IR) imaging system that, by capturing sufficient infrared light photons from the distant object, is capable of increased infrared imaging sensitivity and range.
- Visible and infrared imaging systems are known in the art for their usefulness in mitigating the effects of impaired night vision.
- Impaired night vision is a problematic and potentially dangerous situation caused by a reduced range of vision under conditions of darkness and is all too familiar an experience for automobile drivers, particularly drivers over the age of 40.
- During darkness, 20/20 vision is typically reduced to approximately 20/50, a reduction of a magnitude that can result in the late perception of poorly illuminated obstacles located at a distance from a driver.
- a number of military and commercial approaches that mitigate the effects of impaired night vision have been developed in the art using different light sources, ranging from ultraviolet to infrared, in conjunction with imaging cameras sensitive to light in the range of visible to far infrared.
- One such approach employs the use of low level visible light intensifiers within night vision scope devices and is based on technologies originally developed for military applications.
- Commercial versions of such night vision scope devices, like the Night Vision Pocketscope™ manufactured by ITT Defense & Electronics, amplify visible light using a microchannel plate as an electron multiplier and a photocathode as a detector.
- the night vision scope devices are relatively inexpensive and can provide significant enhancement in range on a clear night and, if used in conjunction with an illuminator, can also provide vision enhancement during overcast conditions.
- night vision scopes of this type are not suitable for the large-scale manufacture required by the automotive industry and other industries that have high-volume production demands.
- As described in the publication “Give Me the Night (Vision),” by K. Jackson, AutoWorld Magazine, October 1998, thermal imaging technology is certainly not new to the military and, in fact, has been used in some form or another for at least the past four decades.
- thermal imaging technology is being commercially exploited.
- General Motors' 2000 Cadillac DeVille uses long wavelength infrared detectors that operate in the one to twelve micron wavelength range and, as a result, can detect thermal energy rather than light photons.
- Instead of detecting an object by sensing the infrared illumination (light photons) that the object reflects, such a system thermally detects a warm object through its black body radiation.
- An advantage of such a system is its ability to detect—even when obstructed by foliage, etc.—objects having thermal emissions, such as humans, deer and automobile engines.
- a system of this type is disadvantaged because of its inability to detect fallen trees or other objects that do not emit thermally.
- A further disadvantage of such a system is its significant expense: thermal imaging systems typically require very expensive uncooled infrared detectors, such as resistive bolometers, that detect the heat energy of objects invisible to the human eye.
- Thermal detection involves focusing the thermal (heat) energy onto the uncooled infrared detector with sensor optics designed to pass IR wavelengths.
- Known approaches to uncooled infrared detectors include a vanadium oxide 2D uncooled infrared detector array manufactured by Boeing Corporation; a yttrium barium copper oxide (YBCO) bolometer demonstrated by MSI Inc.; and Raytheon Corporation's pyroelectric capacitor array, which requires a thermoelectric cooler as well as a chopper wheel and has been employed in the 2000 model GM Cadillac DeVille.
- the lowest cost approach to uncooled infrared detectors is a micro electro mechanical system (MEMS) cantilever beam array.
- the cantilever beam array is a low-mass bimetallic diving board structure similar to an accelerometer where the amount of beam flexure is a function of its temperature and the temperature depends on the amount of incident infrared.
- A night vision imaging system employing a near-infrared sensor with illumination generally consists of an illuminator that illuminates a distant scene and a near-IR camera that generates an image of the distant scene.
- a night vision imaging system developed by Ford Jaguar Inc. uses a charge coupled device (CCD) camera and a near-infrared (NIR) spotlight.
- the Jaguar system works by integrating the NIR spotlight with conventional high-beam lamps.
- By using a 680×500 pixel charge-coupled device (CCD) monochrome digital camera that is sensitive to infrared light not visible to the human eye, the Jaguar system is able to capture an image of an object located in a dark distant scene.
- the Jaguar approach and others like it are perhaps a more practical approach to night vision imaging, mostly due to the availability of low cost components.
- Because of the high sensitivity of conventional CCD detectors to visible illumination, modern CCD and like cameras typically have short exposure times that range from approximately 1/60 to 1/4000 of a second and, as a result, the camera's range is limited.
- Thus, the camera's ability to enhance a driver's visibility of oncoming traffic or upcoming road conditions at speeds of 60 mph or more is limited, since it takes approximately 250 feet for an automobile traveling at 60 mph to come to a complete stop.
- Moreover, for infrared wavelengths above 700 nm, the sensitivity of CCD detectors is considerably reduced, to only approximately 15% to 25% of their peak response; this reduction prevents the camera from recognizing objects more than approximately 200 feet away during cloud cover, fog or after sunset.
- A better approach to near-infrared sensors with illumination, currently used in search and rescue applications and pursued by Daimler-Chrysler Inc., is to use a pulsed laser diode as an illuminator and to gate the CCD camera shutter synchronously with the laser pulses.
- This approach has several advantages, including approximately four times higher peak optical power.
- the gating makes it possible to see through particles to approximately four to five times the range of the human eye and other vision systems.
- And, since the laser is polarized, filters can be used to enhance visibility in rain, fog, snow, etc.
- However, while such gated viewing systems can readily satisfy desired performance requirements, they are also too expensive for the average consumer.
- Other approaches known in the art include millimeter microwave (MMW) imaging and LIDAR; however, both are far more expensive to implement than the approaches previously mentioned.
- Thus, a near-infrared (IR) imaging system that is capable of increased infrared imaging sensitivity and range under conditions of darkness, while providing a low-cost approach that would allow the average consumer to take advantage of enhanced night vision viewing, is highly desirable.
- the preceding and other shortcomings of the prior art are addressed and overcome by the present invention that provides a system for providing recognition of an approaching object located in a distant no-light environment.
- the system includes an illumination source for transmitting light to the distant object and an imaging device for detecting the light radiation reflected from the distant object to generate an image of the distant object corresponding thereto.
- The system also includes an independent digital signal processor for calculating a desired optical magnification of a lens of the imaging device that holds an image of the distant object in a fixed dimension for a period of time sufficient to capture enough light radiation to more clearly identify the approaching distant object.
- the digital signal processor dynamically calculates the desired optical magnification of the imaging device lens as a function of a distance between the imaging device and the distant object.
- the digital signal processor then generates a voltage corresponding to the desired optical magnification, and applies this voltage to the imaging device to adjust a focus of the lens to the desired optical magnification.
- FIG. 1 a is a functional diagram of an embodiment of a system in accordance with the present invention.
- FIG. 1 b is a functional diagram of a distorted focus of an imaged object as a function of relative velocity
- FIG. 1 c is a functional diagram of a controlled focus of an imaged object as a function of relative velocity in accordance with the present invention
- FIG. 2 is a graphical illustration of a known geometric equation that is used in the present invention to determine optical flow as a function of distance in accordance with the present invention.
- FIG. 3 is an isometric diagram of a system for providing recognition of an approaching object located in a distant scene in accordance with an embodiment of the present invention.
- Referring to FIG. 1, there is shown an embodiment of a near-infrared (IR) imaging system 10 in accordance with the present invention.
- As shown in FIG. 1 a, the system 10 includes an IR-sensitive imaging device 14 , an illuminator 12 , and a digital signal processor (DSP) 22 .
- the IR sensitive imaging device 14 comprises a detector element 16 having several hundred pixel elements (not shown), and a lens 18 that is capable of digitally zooming focus in on or pulling focus back from a distant object 20 located in a distant no-light environment.
- the IR sensitive imaging device 14 may be selected from one of the commercially available charge coupled device (CCD), complementary metal-oxide-semiconductor (CMOS) or like IR imaging cameras, such as the digital zoom capable Hi8 model CCD camera manufactured by SONY Corporation.
- However, this is not a necessary limitation of the invention; the imaging device 14 , herein further referenced as a camera, may be any device that is capable of detecting IR radiation and is also capable of electronic zoom.
- the illuminator 12 is provided to illuminate the distant object 20 and is preferably, but not necessarily, an infrared light emitting diode.
- the illuminator 12 is modulated at high frequency to emit an illumination wavelength of approximately 800 nanometers (nm) which is near the peak response of most CCD and CMOS cameras, but invisible to the human eye.
- Light 24 originates from the illuminator 12 , and is reflected from a surface of the object 20 .
- the lens 18 of the camera 14 receives the reflected light 27 and focuses the light 27 onto a focal plane 25 of the detector element 16 , here a CCD array chip, to generate an image (not shown) of the object 20 .
- Because the camera 14 is sensitive to light in the spectral range of approximately 800 nm, which is key to enhancing night vision, the camera 14 is able to generate the image (not shown) of the object 20 during the cover of darkness.
- the independent digital signal processor (DSP) 22 is provided to control the exposure of the camera 14 in such a way that allows the camera 14 to sufficiently integrate the distant object's photon energy 27 without distorting the image of the object 20 . More particularly, in accordance with the preferred embodiment of the present invention, improvement in the sensitivity and range of the camera 14 is achieved by holding a shutter (not shown) of the lens 18 open for an extended period of time that is preferably, but not necessarily, up to approximately one second. By allowing the lens 18 to remain open for a longer period of time, the lens 18 is able to stare at the object 20 longer, which allows the lens 18 to integrate on the object 20 for a longer period of time.
- the lens 18 is able to collect more of the light photons 27 reflected from the object 20 which, based on known optical principles, significantly enhances the signal-to-noise ratio of the camera 14 .
- the camera 14 may be mounted to a vehicle that is moving a significant distance in the direction of the distant object 20 . And this movement, as shown in FIG. 1 b, causes the dimensions of the object 20 to grow increasingly large—a phenomenon known in the art as optical flow.
- the DSP chip 22 applies a digital zoom correction voltage 26 to the lens 18 . As shown in FIG. 1 c, the voltage 26 digitally adjusts the magnification of the lens 18 so that the dimensions of the object 20 remain constant throughout the entire exposure period. This magnification correction is applied uniformly to all pixels of the CCD detector 16 by the camera 14 .
- an arithmetic logic circuit (not shown) of the DSP chip 22 is programmed with an algorithm which, based on the speed of the vehicle and a predetermined viewing range and exposure time of the camera system 14 , uses a known geometric equation to determine an appropriate correction voltage 26 that corrects for the forward motion of the camera 14 during exposure. More particularly, for a given optical focus, the geometric equation relates the distance between the camera lens 18 and a distant object to a magnification in size of the object on the focal plane 25 of the camera 14 .
- Such an equation is, for example, discussed in detail in the publication, “Vision-based Vehicle Guidance,” by Ichiro Masaki, the general concepts of which are included here for reference.
- the Masaki publication generally provides estimates of the distance between a camera and a target using an optical flow equation.
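The optical flow equation itself does not survive in this text. As a generic reconstruction (not necessarily Masaki's exact formulation), the standard pinhole-camera relation underlying FIG. 2, which ties object distance to magnification on the focal plane, can be sketched as:

```latex
h(D) = \frac{f\,H}{D}, \qquad
\frac{dh}{dt} = \frac{f\,H\,v}{D^{2}}
\;\;\Longrightarrow\;\;
D = \frac{h\,v}{\,dh/dt\,}
```

where \(H\) is the true object height, \(h\) its image height at focal length \(f\), \(D\) the distance to the object, and \(v\) the closing speed. Measuring the apparent size \(h\) and its growth rate \(dh/dt\) (the optical flow) thus yields a distance estimate.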
- the lens 18 is calibrated using known triangulation principles that correct for the fact that the camera lens 18 may not be linear.
- For calibration, an object of known height, here six feet, is placed a known distance from the lens 18 .
- the object height appears in the camera's viewfinder as 0.5 inches, meaning 0.5 inches corresponds to a six foot tall object located at 200 feet from the lens 18 .
- If the object is relocated 100 feet from the lens 18 and the lens 18 were linear, the object would appear 1 inch tall in the camera viewfinder; however, this is often not the case.
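As an illustrative sketch of the ideal-lens prediction that this calibration corrects for (the helper name and reference values are assumptions for illustration, not taken from the patent):

```python
# Hypothetical sketch of the ideal (linear) pinhole prediction that the
# triangulation calibration above corrects for; names and defaults are
# illustrative assumptions.

def apparent_height_in(true_height_ft, distance_ft,
                       h_ref_in=0.5, height_ref_ft=6.0, d_ref_ft=200.0):
    """Viewfinder height (inches) predicted by an ideal linear lens,
    calibrated so a 6 ft object at 200 ft appears 0.5 in tall."""
    return h_ref_in * (d_ref_ft / distance_ft) * (true_height_ft / height_ref_ft)

# An ideal lens doubles the apparent height when the distance is halved;
# a real lens generally does not, hence the point-by-point calibration.
print(apparent_height_in(6.0, 200.0))  # 0.5
print(apparent_height_in(6.0, 100.0))  # 1.0
```

A real lens deviates from this 1/D law, which is why the patent calibrates the lens voltage at multiple known distances rather than relying on the linear model.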
- V 1 is a voltage that corresponds to a focus adjustment which is required to bring an object of known height, located a predetermined range from the camera lens 18 , into proper magnification.
- the desired predetermined range of the camera 14 is selected here as 200 feet, meaning the camera 14 is able to detect objects at up to 200 feet from the camera lens 18 .
- the camera 14 can also be calibrated to a voltage (V 2 ) that corresponds to a focus adjustment required to bring the same object, located a shorter distance from the camera lens 18 , into proper magnification.
- For example, at 65 mph the camera 14 can be calibrated to the voltage (V 2 ) that corresponds to a focus adjustment required to bring the object, now located at (200 ft−95.3 ft) from the camera lens 18 , into proper magnification.
- the overall voltage 26 required to correct for the optical flow due to the forward motion of the camera 14 is equal to a voltage change (ΔV), which is determined by the difference between the voltage (V 1 ) at 200 feet and the voltage (V 2 ) at (200 ft−95.3 ft).
- the correction voltage 26 for 65 mph at a range of 200 ft is equal to minus ΔV.
- This correction voltage 26 is applied during the exposure period of the camera 14 , here one second, and is then reset to the voltage V 1 once the period of exposure has expired.
- the algorithm of the present invention is able to use any predetermined viewing range and exposure time of the camera system 14 , as well as the speed of the vehicle to determine an appropriate correction in the magnification of the lens 18 based on how far the vehicle travels during the exposure period.
- the inputs to the DSP chip 22 are the vehicle speed, the desired predetermined range of the camera 14 , and the desired predetermined exposure time of the camera 14 .
- the output of the DSP chip 22 is the digital zoom correction voltage 26 that has been computed based on the speed at which the camera 14 is moving.
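A minimal sketch of that DSP computation, assuming a hypothetical calibration function that maps object distance to lens voltage (the function name and the toy linear calibration are illustrative, not the patent's):

```python
# Sketch of the DSP computation described above. volts_at() stands in
# for the triangulation calibration; the linear lambda below is a
# made-up assumption for illustration only.

def correction_voltage(speed_mph, range_ft, exposure_s, volts_at):
    """Return the delta-V zoom correction for the distance the vehicle
    closes on the object during the exposure period."""
    feet_per_second = speed_mph * 5280.0 / 3600.0   # 65 mph -> ~95.3 ft/s
    travel_ft = feet_per_second * exposure_s
    v1 = volts_at(range_ft)              # V1: proper magnification at full range
    v2 = volts_at(range_ft - travel_ft)  # V2: proper magnification after closing in
    return v1 - v2                       # applied as minus delta-V during exposure

# Example: 65 mph, 200 ft range, 1 s exposure, toy linear calibration.
dv = correction_voltage(65.0, 200.0, 1.0, volts_at=lambda d: 0.01 * d)
print(round(dv, 3))  # 0.953
```

The 95.3 ft travel term matches the (200 ft−95.3 ft) figure used in the calibration discussion above; the voltage values themselves depend entirely on the chosen calibration.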
- FIG. 3 illustrates the system 10 in accordance with the principles of the present invention.
- the camera 14 , having enhanced infrared sensitivity and range in accordance with the principles of the present invention, is integrated into a front grill 29 of the car 28 .
- a pair of infrared LED illuminators 12 may be installed into an existing housing of the car's visible headlamps 33 or integrated with the car's standard visible headlamps 33 .
- a liquid crystal display (LCD) 31 may be included in the interior cabin of the car to display an image of a distant object 20 to the driver.
- the infrared LED headlamps 12 illuminate a distant scene containing the object 20 so that the camera 14 can capture the image of the object 20 .
- the camera 14 captures the image of the object 20 by dynamically correcting the magnification of the object 20 during an extended exposure to compensate for the forward motion of the car 28 .
- the probability of detecting the object 20 is significantly increased.
- the present invention presents a low-cost alternative to other known night vision aid approaches by using low-cost commercially available components which may include commercially available digital imaging cameras, such as CCD or CMOS cameras.
- the present invention improves the infrared sensitivity and range of such cameras by a factor of approximately 60 by increasing their exposure time up to one second and correcting any image distortion that may occur during the exposure time as a result of optical flow.
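The factor of approximately 60 stated above follows directly from the exposure-time ratio; a quick arithmetic check using only the values already given:

```python
# The ~60x figure above is the ratio of the extended exposure to a
# typical short CCD exposure: sixty times the integration time collects
# roughly sixty times the reflected IR photons.

standard_exposure_s = 1.0 / 60.0   # typical short CCD video exposure (1/60 s)
extended_exposure_s = 1.0          # extended exposure used by the system
photon_gain = extended_exposure_s / standard_exposure_s
print(round(photon_gain))  # 60
```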
Abstract
Description
- 1. Field of the Invention
- The present invention relates generally to a system and method for providing recognition of an approaching object located in a distant no-light environment and, more specifically, to a low-cost near-infrared (IR) imaging system that, by capturing sufficient infrared light photons from the distant object, is capable of increased infrared imaging sensitivity and range.
- 2. Description of the Prior Art
- Visible and infrared imaging systems are known in the art for their usefulness in mitigating the effects of impaired night vision. Impaired night vision is a problematic and potentially dangerous situation caused by a reduced range of vision under conditions of darkness and is all too familiar an experience for automobile drivers, particularly drivers over the age of 40. For example, during the cover of
darkness 20/20 vision is typically reduced to approximately 20/50 where a reduction in vision of this magnitude can result in the late perception of poorly illuminated obstacles located at a distance from a driver. - A number of military and commercial approaches that mitigate the effects of impaired night vision have been developed in the art using different light sources, ranging from ultraviolet to infrared, in conjunction with imaging cameras sensitive to light in the range of visible to far infrared. One such approach employs the use of low level visible light intensifiers within night vision scope devices and is based on technologies originally developed for military applications. Commercial versions of such night vision scope devices, like the Night Vision Pocketscope™ manufactured by ITT Defense & Electronics, amplify visible light using a microchannel plate as an electron multiplier and a photocathode as a detector. The night vision scope devices are relatively inexpensive and can provide significant enhancement in range on a clear night and, if used in conjunction with an illuminator, can also provide vision enhancement during overcast conditions. Unfortunately, however, night vision scopes of this type are not suitable for the large-scale manufacture required by the automotive industry and other industries that have high-volume production demands.
- Another approach known in the art for solving the problem of impaired night vision is the use of thermal imaging. As described in the publication “Give Me the Night (Vision),” by K. Jackson, AutoWorld Magazine, October 1998, thermal imaging technology is certainly not new to the military and, in fact, has been used in some form or another for at least the past four decades. However, more and more, thermal imaging technology is being commercially exploited. For example, General Motor's 2000 Cadillac DeVille uses long wavelength infrared detectors that can operate in the one to twelve micron wavelength and, as a result, have the capability to detect thermal energy rather than light photons. In other words, instead of detecting an object by sensing the infrared illumination (light photons) that the object reflects, a warm object is thermally detected through its black body radiation. An advantage of such a system is its ability to detect—even when obstructed by foliage, etc.—objects having thermal emissions, such as humans, deer and automobile engines. However, a system of this type is disadvantaged because of its inability to detect fallen trees or other objects that do not emit thermally. A further disadvantage of such a system is its significant expensive. Thermal imaging systems typically require very expensive uncooled infrared detectors, such as resistive bolometers, that detect the heat energy of objects invisible to the human eye. Thermal detection involves focusing the thermal (heat) energy onto the uncooled infrared detector with sensor optics designed to pass IR wavelengths. 
Known approaches to uncooled infrared detectors include, a vanadium oxide 2D uncooled infrared detector array manufactured by Boeing Corporation; a yttrium barium copper oxide (YBCO) bolometer that has been demonstrated by MSI Inc; and Raytheon Corporation's approach to the uncooled infrared detector, is a pyroelectric capacitor array that requires a thermoelectric cooler as well as a chopper wheel, an approach that has been employed in the 2000 model GM Cadillac DeVille. The lowest cost approach to uncooled infrared detectors, however, is a micro electro mechanical system (MEMS) cantilever beam array. The cantilever beam array is a low-mass bimetallic diving board structure similar to an accelerometer where the amount of beam flexure is a function of its temperature and the temperature depends on the amount of incident infrared.
- Still another approach known in the art for combating the effects of impaired night vision is the use of near-infrared sensors with illumination. A night vision imaging system employing the use of near-infrared sensor with illumination generally consists of an illuminator that illuminates a distant scene and a near-IR camera that generates an image of the distant scene. One such system developed by Ford Jaguar Inc., uses a charge coupled device (CCD) camera and a near-infrared (NIR) spotlight. The Jaguar system works by integrating the NIR spotlight with conventional high-beam lamps. And by using a 680×500 pixel charge-coupled device (CCD) monochrome digital camera that is sensitive to infrared light not visible to the human eye, the Jaguar system is able to capture an image of an object located in a dark distant scene. The Jaguar approach and others like it are perhaps a more practical approach to night vision imaging, mostly due to the availability of low cost components. But, because of the high sensitivity of conventional CCD detectors to visible illumination, modem CCD and like cameras typically have short exposure times that range from approximately {fraction (1/60)} to {fraction (1/4000)} of a second and, as a result, the camera's range is limited. Thus, the camera's ability to enhance a driver's visibility of on-coming traffic or up-coming road conditions, if traveling at speeds of 60 mph or more is limited since it takes approximately 250 feet for an automobile traveling at 60 mph to come to a complete stop. Moreover, for infrared wavelengths above 700 nm, the sensitivity of CCD detectors is considerably reduced to only approximately 15% to 25% of its peak response. This reduction prevents the camera from recognizing objects at more than approximately 200 feet away during cloud cover, fog or after sunset.
- A better approach to near-infrared sensors with illumination, currently being used in search and rescue applications and pursued by Daimler-Chrysler Inc., is to use a pulsed laser diode as an illuminator and to gate the CCD camera shutter synchronously with the laser pulses. This approach has several advantages, including an achievement of 4 times higher peak optical power. The gating makes it possible to see through particles to approximately four to five times the range of the human eye and other vision systems. And, since the laser is polarized, filters can be used to enhance visibility in rain, fog, snow, etc. However, while such gated viewing systems can readily satisfy desired performance requirements, they are also too expensive for the average consumer.
- Finally, other approaches known in the art include millimeter microwave (MMW) imaging and LIDAR, however, both of these approaches are far more expensive to implement than those approaches previously mentioned.
- Thus, a near-infrared (IR) imaging system that is capable of increased infrared imaging sensitivity and range under conditions of darkness while providing a low-cost approach that would allow the average consumer to take advantage of enhanced night vision viewing is highly desirable.
- The preceding and other shortcomings of the prior art are addressed and overcome by the present invention that provides a system for providing recognition of an approaching object located in a distant no-light environment. The system includes an illumination source for transmitting light to the distant object and an imaging device for detecting the light radiation reflected from the distant object to generate an image of the distant object corresponding thereto. The system also includes an independent digital signal processor for calculating a desired optical magnification of a lens of the imaging device that the holds an image of the distant object in a fixed dimension for a period of time sufficient to capture enough light radiation to more clearly identify the approaching distant object. The digital signal processor dynamically calculates the desired optical magnification of the imaging device lens as a function of a distance between the imaging device and the distant object. The digital signal processor then generates a voltage corresponding to the desired optical magnification, and applies this voltage to the imaging device to adjust a focus of the lens to the desired optical magnification.
- Reference is now made to the following description and attached drawings, wherein:
- FIG. 1a is a functional diagram of an embodiment of a system in accordance with the present invention;
- FIG. 1b is a functional diagram of a distorted focus of an imaged object as a function of relative velocity;
- FIG. 1c is a functional diagram of a controlled focus of an imaged object as a function of relative velocity in accordance with the present invention;
- FIG. 2 is a graphical illustration of a known geometric equation that is used in the present invention to determine optical flow as a function of distance in accordance with the present invention; and
- FIG. 3 is an isometric diagram of a system for providing recognition of an approaching object located in a distant scene in accordance with an embodiment of the present invention.
- Referring to FIG. 1a, there is shown an embodiment of a near-infrared (IR)
imaging system 10 in accordance with the present invention. As shown in FIG. 1a, the system 10 includes an IR-sensitive imaging device 14, an illuminator 12 and a digital signal processor (DSP) 22. The IR-sensitive imaging device 14 comprises a detector element 16 having several hundred pixel elements (not shown), and a lens 18 that is capable of digitally zooming in on, or pulling back from, a distant object 20 located in a distant no-light environment. To maintain a low-cost system 10, the IR-sensitive imaging device 14 may be selected from commercially available charge-coupled device (CCD), complementary metal-oxide-semiconductor (CMOS) or similar IR imaging cameras, such as the digital-zoom-capable Hi8 model CCD camera manufactured by SONY Corporation. However, this is not a necessary limitation of the invention; the imaging device 14, herein further referenced as a camera, may be any device that is capable of detecting IR radiation and of electronic zoom. - Referring still to FIG. 1, the
illuminator 12 is provided to illuminate the distant object 20 and is preferably, but not necessarily, an infrared light-emitting diode. The illuminator 12 is modulated at high frequency to emit an illumination wavelength of approximately 800 nanometers (nm), which is near the peak response of most CCD and CMOS cameras but invisible to the human eye. Light 24 originates from the illuminator 12 and is reflected from a surface of the object 20. The lens 18 of the camera 14 receives the reflected light 27 and focuses the light 27 onto a focal plane 25 of the detector element 16, here a CCD array chip, to generate an image (not shown) of the object 20. Because the camera 14 is sensitive to light in the spectral range of approximately 800 nm, which is key to enhancing night vision, the camera 14 is able to generate the image (not shown) of the object 20 under the cover of darkness. - The independent digital signal processor (DSP) 22 is provided to control the exposure of the
camera 14 in such a way that the camera 14 can sufficiently integrate the distant object's photon energy 27 without distorting the image of the object 20. More particularly, in accordance with the preferred embodiment of the present invention, improvement in the sensitivity and range of the camera 14 is achieved by holding a shutter (not shown) of the lens 18 open for an extended period of time, preferably, but not necessarily, up to approximately one second. By remaining open for a longer period of time, the lens 18 is able to stare at, and thus integrate on, the object 20 longer. By integrating on the object 20 for a longer period of time, the lens 18 collects more of the light photons 27 reflected from the object 20, which, based on known optical principles, significantly enhances the signal-to-noise ratio of the camera 14. - Unfortunately, during this extended exposure period the
camera 14 may be mounted to a vehicle that is moving a significant distance in the direction of the distant object 20. This movement, as shown in FIG. 1b, causes the dimensions of the object 20 to grow increasingly large, a phenomenon known in the art as optical flow. To alleviate distortion of the object's image as a result of optical flow, the DSP chip 22 applies a digital zoom correction voltage 26 to the lens 18. As shown in FIG. 1c, the voltage 26 digitally adjusts the magnification of the lens 18 so that the dimensions of the object 20 remain constant throughout the entire exposure period. This magnification correction is applied uniformly to all pixels of the CCD detector 16 by the camera 14. - Referring to FIG. 2, to generate the
correction voltage 26, an arithmetic logic circuit (not shown) of the DSP chip 22 is programmed with an algorithm which, based on the speed of the vehicle and a predetermined viewing range and exposure time of the camera system 14, uses a known geometric equation to determine an appropriate correction voltage 26 that corrects for the forward motion of the camera 14 during exposure. More particularly, for a given optical focus, the geometric equation relates the distance between the camera lens 18 and a distant object to a magnification in size of the object on the focal plane 25 of the camera 14. Such an equation is, for example, discussed in detail in the publication "Vision-based Vehicle Guidance" by Ichiro Masaki, the general concepts of which are included here for reference. The Masaki publication generally estimates the distance between a camera and a target by the following optical flow equation: - u = (dX/dt)/Z − X(dZ/dt)/Z²  (1)
- where P(X, Z) = coordinates of a target point P
- (0, W) = translational components of the motion of the camera
- dZ/dt = −W, a component of the camera motion
- x = X/Z, the coordinate of a point p on the image plane that is the perspective projection of the point P
- u = dx/dt, the optical flow at the point x.
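Equation (1) can be checked numerically. The values below are illustrative choices for this sketch, not figures from the Masaki publication.

```python
def optical_flow(X: float, Z: float, dX_dt: float, dZ_dt: float) -> float:
    """Optical flow u = dx/dt at image point x = X/Z, per equation (1):
    u = (dX/dt)/Z - X*(dZ/dt)/Z**2."""
    return dX_dt / Z - X * dZ_dt / (Z ** 2)

# A stationary point 200 ft ahead and 10 ft off-axis, with the camera
# closing at 95.3 ft/s (dZ/dt = -W = -95.3) and no lateral motion:
u = optical_flow(X=10.0, Z=200.0, dX_dt=0.0, dZ_dt=-95.3)
# u > 0: the point flows outward on the image plane, i.e. the object's
# image grows during the exposure unless the zoom is corrected.
```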
- Thus, if the
camera 14 were moving in the direction of an object 20 at 65 miles per hour, as measured by a vehicle speedometer or a similar speed-measuring device connected to the processor 22, and the shutter exposure time of the camera 14 were set to one second, the camera 14 would have traveled a distance of (65×5280)/3600 = 95.3 feet in one second. And according to equation (1), this distance corresponds to an increased magnification in the size of the object's image, meaning the object will appear 95.3 feet closer to the camera 14. What is desired, therefore, is to reduce the magnification of the lens 18 by applying an appropriate electronic zoom correction voltage 26 to the digital zoom circuitry (not shown) of the camera 14 so that the size of the object's image remains constant throughout the entire exposure period. - To determine the
appropriate correction voltage 26, the lens 18 is calibrated using known triangulation principles that account for the fact that the camera lens 18 may not be linear. In other words, suppose an object of known height, here six feet, is 200 feet from the camera lens 18 and the object appears in the camera's viewfinder as 0.5 inches tall, meaning 0.5 inches corresponds to a six-foot-tall object located 200 feet from the lens 18. Then suppose the object is relocated to 100 feet from the lens 18; if the lens 18 were linear, the object would appear 1 inch tall in the camera viewfinder. However, this is often not the case. - Thus, it is possible to calibrate the
camera 14 at a voltage (V1) that corresponds to the focus adjustment required to bring an object of known height, located a predetermined range from the camera lens 18, into proper magnification. For purposes of illustration only, the desired predetermined range of the camera 14 is selected here as 200 feet, meaning the camera 14 is able to detect objects at up to 200 feet from the camera lens 18. The camera 14 can also be calibrated to a voltage (V2) that corresponds to the focus adjustment required to bring the same object, located a shorter distance from the camera lens 18, into proper magnification. So, as described above, if the camera 14 were moving in the direction of the object at 65 miles per hour and the shutter exposure time of the camera 14 were set to one second, the camera 14 would have traveled a distance of (65×5280)/3600 = 95.3 feet in one second. And knowing the distance that the camera 14 has traveled during exposure, the camera 14 can be calibrated to the voltage (V2) that corresponds to the focus adjustment required to bring the object, now located at (200 ft − 95.3 ft) from the camera lens 18, into proper magnification. In the present example, once voltages V1 and V2 have been determined, the overall voltage 26 required to correct for the optical flow due to the forward motion of the camera 14 is equal to a voltage change (ΔV) determined by the difference between the voltage (V1) at 200 feet and the voltage (V2) at (200 ft − 95.3 ft). Thus, the correction voltage 26 for 65 mph at a range of 200 ft is equal to minus ΔV. This correction voltage 26 is applied during the exposure period of the camera 14, here one second, and the lens is then reset to the voltage V1 once the period of exposure has expired. - It is important to note that since the
correction voltage 26 is linear with respect to the magnification of the lens 18, the algorithm of the present invention is able to use any predetermined viewing range and exposure time of the camera system 14, as well as the speed of the vehicle, to determine an appropriate correction in the magnification of the lens 18 based on how far the vehicle travels during the exposure period. Thus, the inputs to the DSP chip 22 are the vehicle speed, the desired predetermined range of the camera 14, and the desired predetermined exposure time of the camera 14. And the output of the DSP chip 22 is the digital zoom correction voltage 26 that has been computed based on the speed at which the camera 14 is moving. - Referring to FIG. 3, the present invention is thus particularly useful as a night vision driving aid. Using a
car 28 as an example, FIG. 3 illustrates the system 10 in accordance with the principles of the present invention. The camera 14, having enhanced infrared sensitivity and range in accordance with the principles of the present invention, is integrated into a front grill 29 of the car 28. A pair of infrared LED illuminators 12 may be installed into an existing housing of the car's visible headlamps 33 or integrated with the car's standard visible headlamps 33. A liquid crystal display (LCD) 31 may be included in the interior cabin of the car to display an image of a distant object 20 to the driver. And, based on the principles of the present invention, as the car 28 is traveling in the direction of the distant object 20, the infrared LED headlamps 12 illuminate a distant scene containing the object 20 so that the camera 14 can capture the image of the object 20. The camera 14 captures the image of the object 20 by dynamically correcting the magnification of the object 20 during an extended exposure to compensate for the forward motion of the car 28. As a result of holding the object 20 stationary on the focal plane of the camera 14, the probability of detecting the object 20 is significantly increased. - As illustrated in the embodiments of the present invention, the present invention presents a low-cost alternative to other known night vision aid approaches by using low-cost, commercially available components, which may include commercially available digital imaging cameras such as CCD or CMOS cameras. The present invention improves the infrared sensitivity and range of such cameras by a factor of approximately 60 by increasing their exposure time up to one second and correcting any image distortion that may occur during the exposure time as a result of optical flow.
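The worked example and the sensitivity claim above can be reproduced with the patent's numbers. The two calibration voltages V1 and V2 are hypothetical values, and the 1/60 s baseline exposure is an assumption consistent with a standard video field rate, not a figure stated in the patent.

```python
# Distance the vehicle closes during a 1 s exposure at 65 mph
# (matches the text: (65 * 5280) / 3600 = 95.3 ft).
closing_ft = 65.0 * 5280.0 / 3600.0 * 1.0

# Hypothetical calibration: V1 brings a known object at 200 ft into proper
# magnification; V2 does the same at 200 - 95.3 = 104.7 ft.
V1, V2 = 2.00, 1.55                 # volts (illustrative values only)
delta_v = V1 - V2                   # delta-V between the two calibration points
correction_voltage = -delta_v       # applied during exposure, then lens resets to V1

# The ~60x sensitivity claim follows from photon collection scaling with
# integration time: a 1 s exposure versus a ~1/60 s video field exposure.
gain = 1.0 / (1.0 / 60.0)           # exposure ratio
```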
- Obviously, many modifications and variations of the present invention are possible in light of the above teachings. Thus, it is to be understood that, within the scope of the appended claims, the invention may be practiced otherwise than as specifically described above.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/732,429 US6420704B1 (en) | 2000-12-07 | 2000-12-07 | Method and system for improving camera infrared sensitivity using digital zoom |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/732,429 US6420704B1 (en) | 2000-12-07 | 2000-12-07 | Method and system for improving camera infrared sensitivity using digital zoom |
Publications (2)
Publication Number | Publication Date |
---|---|
US20020070342A1 true US20020070342A1 (en) | 2002-06-13 |
US6420704B1 US6420704B1 (en) | 2002-07-16 |
Family
ID=24943481
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/732,429 Expired - Fee Related US6420704B1 (en) | 2000-12-07 | 2000-12-07 | Method and system for improving camera infrared sensitivity using digital zoom |
Country Status (1)
Country | Link |
---|---|
US (1) | US6420704B1 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040008410A1 (en) * | 2002-07-09 | 2004-01-15 | Stam Joseph S. | Vehicle vision system with high dynamic range |
DE10343479A1 (en) * | 2003-09-19 | 2005-04-28 | Bosch Gmbh Robert | Method for improving the visibility in a motor vehicle |
US20060164541A1 (en) * | 2005-01-27 | 2006-07-27 | Olmstead Bryan L | Rolling-reset imager with optical filter |
US20070188633A1 (en) * | 2006-02-15 | 2007-08-16 | Nokia Corporation | Distortion correction of images using hybrid interpolation technique |
WO2008117141A1 (en) * | 2007-03-28 | 2008-10-02 | Sony Ericsson Mobile Communications Ab | Zoom control |
US20090206260A1 (en) * | 2008-02-04 | 2009-08-20 | Brandon Hite | Night vision technology: broad band imaging |
WO2011116375A1 (en) * | 2010-03-19 | 2011-09-22 | Northeastern University | Roaming mobile sensor platform for collecting geo-referenced data and creating thematic maps |
CN103293558A (en) * | 2013-06-05 | 2013-09-11 | 合肥汉翔电子科技有限公司 | Comprehensive photoelectric detection system |
CN103327258A (en) * | 2012-03-21 | 2013-09-25 | 华晶科技股份有限公司 | License plate photographic device and adjusting method of image exposure thereof |
US20140139669A1 (en) * | 2012-01-30 | 2014-05-22 | Steven Petrillo | System and method for providing front-oriented visual information to vehicle driver |
EP2760194A3 (en) * | 2013-01-25 | 2014-08-27 | Shenzhen Protruly Electronics Co., Ltd | An automotive camera system and the data processing method based on its shooting angle changing synchronously with the automotive speed |
US9121703B1 (en) * | 2013-06-13 | 2015-09-01 | Google Inc. | Methods and systems for controlling operation of a laser device |
US20190089889A1 (en) * | 2017-09-20 | 2019-03-21 | Panasonic Intellectual Property Management Co., Ltd. | Night vision imaging apparatus |
US20190364199A1 (en) * | 2018-05-24 | 2019-11-28 | Magna Electronics Inc. | Vehicle vision system with infrared led synchronization |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020067413A1 (en) * | 2000-12-04 | 2002-06-06 | Mcnamara Dennis Patrick | Vehicle night vision system |
US20020171754A1 (en) * | 2001-05-18 | 2002-11-21 | I-Jen Lai | Digital camera with multi-illuminating source |
DE10146959A1 (en) * | 2001-09-24 | 2003-04-30 | Hella Kg Hueck & Co | Night vision device for vehicles |
US6611200B2 (en) * | 2001-10-03 | 2003-08-26 | The United States Of America As Represented By The Secretary Of The Air Force | Method for viewing through tinted windows |
JP3817174B2 (en) * | 2001-12-28 | 2006-08-30 | 矢崎総業株式会社 | Vehicle image correction device and night driving visibility support device |
DE10202163A1 (en) * | 2002-01-22 | 2003-07-31 | Bosch Gmbh Robert | Process and device for image processing and night vision system for motor vehicles |
DE10203413C2 (en) * | 2002-01-28 | 2003-11-27 | Daimler Chrysler Ag | Automobile infrared night vision device |
US6700123B2 (en) * | 2002-01-29 | 2004-03-02 | K. W. Muth Company | Object detection apparatus |
JP3700778B2 (en) * | 2002-03-14 | 2005-09-28 | 三菱電機株式会社 | Infrared imaging device |
US7139411B2 (en) * | 2002-06-14 | 2006-11-21 | Honda Giken Kogyo Kabushiki Kaisha | Pedestrian detection and tracking with night vision |
DE10305010B4 (en) * | 2003-02-07 | 2012-06-28 | Robert Bosch Gmbh | Apparatus and method for image formation |
US7024292B2 (en) * | 2003-08-27 | 2006-04-04 | Ford Motor Company | Active night vision control system |
JP4612635B2 (en) * | 2003-10-09 | 2011-01-12 | 本田技研工業株式会社 | Moving object detection using computer vision adaptable to low illumination depth |
US6967569B2 (en) * | 2003-10-27 | 2005-11-22 | Ford Global Technologies Llc | Active night vision with adaptive imaging |
EP1834312A2 (en) * | 2005-01-03 | 2007-09-19 | Vumii, Inc. | Systems and methods for night time surveillance |
US7764324B2 (en) * | 2007-01-30 | 2010-07-27 | Radiabeam Technologies, Llc | Terahertz camera |
CA2829457C (en) * | 2011-02-28 | 2016-02-02 | Arcelormittal Investigacion Y Desarollo Sl | Method and apparatus for real time video imaging of the snout interior on a hot dip coating line |
US8948449B2 (en) * | 2012-02-06 | 2015-02-03 | GM Global Technology Operations LLC | Selecting visible regions in nighttime images for performing clear path detection |
US10793054B2 (en) | 2016-03-04 | 2020-10-06 | Sean Neal | Vehicle light system |
USD985807S1 (en) | 2017-03-06 | 2023-05-09 | Sean Neal | Emergency vehicle light assembly |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5065024A (en) * | 1990-02-07 | 1991-11-12 | Inframetrics, Inc. | Infrared imaging system with simultaneously variable field of view and resolution and fixed optical magnification |
US5398095A (en) * | 1990-04-09 | 1995-03-14 | Minolta Camera Kabushiki Kaisha | Automatic zooming device |
JP3208492B2 (en) * | 1991-09-26 | 2001-09-10 | 株式会社リコー | Varifocal lens controller |
US5710428A (en) * | 1995-08-10 | 1998-01-20 | Samsung Electronics Co., Ltd. | Infrared focal plane array detecting apparatus having light emitting devices and infrared camera adopting the same |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7683326B2 (en) * | 2002-07-09 | 2010-03-23 | Gentex Corporation | Vehicle vision system with high dynamic range |
US20040008410A1 (en) * | 2002-07-09 | 2004-01-15 | Stam Joseph S. | Vehicle vision system with high dynamic range |
DE10343479A1 (en) * | 2003-09-19 | 2005-04-28 | Bosch Gmbh Robert | Method for improving the visibility in a motor vehicle |
US20060164541A1 (en) * | 2005-01-27 | 2006-07-27 | Olmstead Bryan L | Rolling-reset imager with optical filter |
US7499090B2 (en) * | 2005-01-27 | 2009-03-03 | Datalogic Scanning, Inc. | Rolling-reset imager with optical filter |
US20070188633A1 (en) * | 2006-02-15 | 2007-08-16 | Nokia Corporation | Distortion correction of images using hybrid interpolation technique |
US7881563B2 (en) * | 2006-02-15 | 2011-02-01 | Nokia Corporation | Distortion correction of images using hybrid interpolation technique |
WO2008117141A1 (en) * | 2007-03-28 | 2008-10-02 | Sony Ericsson Mobile Communications Ab | Zoom control |
US7639935B2 (en) | 2007-03-28 | 2009-12-29 | Sony Ericsson Mobile Communications Ab | Zoom control |
US20080240698A1 (en) * | 2007-03-28 | 2008-10-02 | Sony Ericsson Mobile Communications Ab | Zoom control |
US20090206260A1 (en) * | 2008-02-04 | 2009-08-20 | Brandon Hite | Night vision technology: broad band imaging |
WO2011116375A1 (en) * | 2010-03-19 | 2011-09-22 | Northeastern University | Roaming mobile sensor platform for collecting geo-referenced data and creating thematic maps |
US9377528B2 (en) | 2010-03-19 | 2016-06-28 | Northeastern University | Roaming mobile sensor platform for collecting geo-referenced data and creating thematic maps |
US20140139669A1 (en) * | 2012-01-30 | 2014-05-22 | Steven Petrillo | System and method for providing front-oriented visual information to vehicle driver |
CN103327258A (en) * | 2012-03-21 | 2013-09-25 | 华晶科技股份有限公司 | License plate photographic device and adjusting method of image exposure thereof |
EP2760194A3 (en) * | 2013-01-25 | 2014-08-27 | Shenzhen Protruly Electronics Co., Ltd | An automotive camera system and the data processing method based on its shooting angle changing synchronously with the automotive speed |
CN103293558A (en) * | 2013-06-05 | 2013-09-11 | 合肥汉翔电子科技有限公司 | Comprehensive photoelectric detection system |
US9121703B1 (en) * | 2013-06-13 | 2015-09-01 | Google Inc. | Methods and systems for controlling operation of a laser device |
US20190089889A1 (en) * | 2017-09-20 | 2019-03-21 | Panasonic Intellectual Property Management Co., Ltd. | Night vision imaging apparatus |
US10652477B2 (en) * | 2017-09-20 | 2020-05-12 | Panasonic Intellectual Property Management Co., Ltd. | Night vision imaging apparatus |
US20190364199A1 (en) * | 2018-05-24 | 2019-11-28 | Magna Electronics Inc. | Vehicle vision system with infrared led synchronization |
US10958830B2 (en) * | 2018-05-24 | 2021-03-23 | Magna Electronics Inc. | Vehicle vision system with infrared LED synchronization |
US11240427B2 (en) | 2018-05-24 | 2022-02-01 | Magna Electronics Inc. | Vehicular vision system with infrared emitter synchronization |
US11627389B2 (en) | 2018-05-24 | 2023-04-11 | Magna Electronics Inc. | Vehicular vision system with infrared emitter synchronization |
US11849215B2 (en) | 2018-05-24 | 2023-12-19 | Magna Electronics Inc. | Vehicular vision system with camera and near-infrared emitter synchronization |
Also Published As
Publication number | Publication date |
---|---|
US6420704B1 (en) | 2002-07-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6420704B1 (en) | Method and system for improving camera infrared sensitivity using digital zoom | |
US9904859B2 (en) | Object detection enhancement of reflection-based imaging unit | |
US10390004B2 (en) | Stereo gated imaging system and method | |
JP4359121B2 (en) | Multifunction integrated vision system with a matrix of CMOS or CCD technology | |
US10564267B2 (en) | High dynamic range imaging of environment with a high intensity reflecting/transmitting source | |
EP2602640B1 (en) | Vehicle occupancy detection using time-of-flight sensor | |
WO2017151561A1 (en) | Methods and apparatus for an active pulsed 4d camera for image acquisition and analysis | |
WO2015198300A1 (en) | Gated sensor based imaging system with minimized delay time between sensor exposures | |
JP6333238B2 (en) | Raindrop detection on glass surface using camera and lighting | |
KR20150024860A (en) | Gated imaging using an adaptive dapth of field | |
US20180027191A2 (en) | System for controlling pixel array sensor with independently controlled sub pixels | |
EP3428686B1 (en) | A vision system and method for a vehicle | |
EP3227742B1 (en) | Object detection enhancement of reflection-based imaging unit | |
JP2004325202A (en) | Laser radar system | |
US20230152429A1 (en) | Auto-Exposure Occlusion Camera | |
US20230324671A1 (en) | Optical systems with tiltable filters | |
Matsubara et al. | Compact imaging LIDAR with CMOS SPAD | |
Bertozzi et al. | Camera-based automotive systems | |
KR20210050086A (en) | Camera for improved image acquisition in dark environment | |
Sarkar | Vision Sensors in Automobiles: An Indian Perspective | |
BALYASNIKOV et al. | SELECTION OF EQUIPMENT FOR VEHICLE PROXIMITY CONTROL |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TRW INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BERENZ, JOHN J.;MCIVER, GEORGE W.;DUNBRIDGE, BARRY (NMI);REEL/FRAME:011361/0200 Effective date: 20001206 |
AS | Assignment |
Owner name: JPMORGAN CHASE BANK, NEW YORK Free format text: THE US GUARANTEE AND COLLATERAL AGREEMENT;ASSIGNOR:TRW AUTOMOTIVE U.S. LLC;REEL/FRAME:014022/0720 Effective date: 20030228 |
REMI | Maintenance fee reminder mailed | ||
LAPS | Lapse for failure to pay maintenance fees | ||
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20060716 |