WO1993011631A1 - Solid state sensor arrangement for video camera - Google Patents


Publication number: WO1993011631A1
Application number: PCT/GB1992/002260
Authority: WIPO (PCT)
Prior art keywords: lens, sensor, image capture, sensor arrays, spacer
Other languages: French (fr)
Inventor: Peter Brian Denyer
Original Assignee: VLSI Vision Limited
Application filed by VLSI Vision Limited
Priority to GB9409608A priority Critical patent/GB2276512B/en
Publication of WO1993011631A1 publication Critical patent/WO1993011631A1/en


Classifications

    (All under H04N — Pictorial communication, e.g. television)
    • H04N 13/243 — Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H04N 13/257 — Image signal generators: colour aspects
    • H04N 23/16 — Cameras or camera modules with multiple sensors for different wavelengths: optical arrangements associated therewith, e.g. for beam-splitting or for colour correction
    • H04N 23/54 — Constructional details: mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 25/41 — Extracting pixel data from a plurality of image sensors simultaneously picking up an image, e.g. for increasing the field of view by combining the outputs of a plurality of sensors
    • H04N 13/10 — Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/15 — Processing image signals for colour aspects of image signals
    • H04N 13/239 — Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/296 — Synchronisation thereof; control thereof
    • H04N 19/597 — Predictive video coding specially adapted for multi-view video sequence encoding

Definitions

  • In cameras using the 3 primary colours, green is dominant in providing image acuity since it generally dominates the derived luminance. In all cases therefore, the green camera is desirably made as central as possible, and the red and blue cameras are referenced to it. The parallax errors will therefore show up on red and blue only.
  • the expression “substantially direct contact” is used to mean that there should not be any significant interspace containing low refractive index material such as air i.e. no interspace having a thickness resulting in a significant optical effect.
  • this should normally be not more than 500 µm, preferably not more than 100 µm, thick.
  • Where a resin or like material is used between the components adhesively to secure them together and has a refractive index comparable to that of the lens or spacer, it may be considered as an extension of the lens or spacer and thus need not be so restricted in thickness, though preferably its thickness should not be excessive and should be more or less similarly restricted.
  • Preferably there is used a plano-convex (or possibly plano-concave - see below) lens with a substantially plane spacer for manufacturing convenience and economy, but other combinations, e.g. a bi-convex lens and a plano-concave spacer, may also be used.
  • the lens system is secured to said image sensing surface by an optical grade adhesive i.e. a substantially transparent optically uniform adhesive.
  • the spacer has a higher, most preferably a substantially higher, refractive index than the lens. Where the same refractive index is acceptable for both then it will be appreciated that the spacer could be formed integrally with the lens.
  • the radius (or radii) of curvature of the lens element and its refractive index may be varied through a wide range of values depending on the required performance in terms of depth of field, image size, freedom from aberrations etc.
  • solid state image capture devices in the form of photoelectric sensor arrays
  • Desirably a lens of relatively short focal length is used: e.g. for a field of view angle of 80 degrees the (maximum) focal length will not normally exceed 1.19 times the (maximum) image height, and for 60 degrees will not normally exceed 1.73 times the (maximum) image height, the (maximum) image height corresponding to half the sensing surface diameter.
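The focal-length limits above follow from simple trigonometry: the maximum image height h equals f·tan θ for half-field angle θ, so the factor is 1/tan(fov/2). A minimal sketch of that relationship:

```python
import math

def max_focal_length_factor(fov_degrees: float) -> float:
    """Maximum focal length as a multiple of the (maximum) image height,
    for a given full field-of-view angle: f = h / tan(fov/2)."""
    half_angle = math.radians(fov_degrees / 2.0)
    return 1.0 / math.tan(half_angle)

print(round(max_focal_length_factor(80), 2))  # ~1.19 for an 80 degree field
print(round(max_focal_length_factor(60), 2))  # ~1.73 for a 60 degree field
```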
  • the use of a high refractive index spacer and the exclusion of any low refractive index material from the optical path significantly decreases aberration due to Petzval curvature (otherwise known as curvature of field aberration) and limits spherical aberration.
  • the lens system is therefore particularly advantageous in wide field and/or large aperture applications required for low light conditions.
  • Low-dispersion glass such as type BK7 (available from various sources e.g. Schott Glaswerke) is particularly suitable for the lens element.
  • the spacer element may be made of LaK10 glass, which is also readily available.
  • Other materials that may be used for the lens and/or spacer elements comprise plastics materials, although these are generally less preferred in view of their lower resistance to scratching and other damage and the lower refractive indices available. Nevertheless they may be acceptable for certain applications requiring low cost such as consumer door-entry and security cameras.
  • Suitable adhesive materials for use between the spacer and lens elements and between the spacer element and the solid state image capture device include optical grade epoxy resins.
  • the solid state image capture device comprises an integrated circuit image array sensor such as that disclosed in our earlier International patent application No. PCT/GB90/01452 (publication No. W091/04633 the contents of which are hereby incorporated herein by reference thereto) which has on-board signal processing means formed and arranged for directly providing a video signal output.
  • Other image capture devices, such as CCD and MOS sensors, may also be used.
  • Alternatively the image capture device may comprise simply a sensor chip on which only the sensor arrays are provided, with all the electronic circuitry required to detect the response of individual sensor cells to incident radiation, and further processing of the detected response, provided externally of the sensor chip; other arrangements, with a greater or lesser part of this electronic circuitry provided on the chip bearing the sensor arrays, are of course also possible.
  • references to “cameras” herein includes references to apparatus in which substantially the whole of the electronic circuitry required to produce a video output signal is provided on the same chip as the sensor arrays, as well as apparatus in which a greater or lesser part is provided separately.
  • references to camera alignment relate only to alignment of the lenses and sensor arrays (and not to any other components that may be required to produce a video signal output).
  • Fig. 1 is a schematic perspective view of a composite image colour video camera of the invention with three individual camera units;
  • Figs 2(a) to (d) are schematic views showing 4 different 2-D arrangements of the three camera elements relative to each other;
  • Fig. 3 is a schematic illustration of the optical performance of a camera of the invention.
  • Fig. 4 is a block circuit diagram of one possible electronic architecture for a camera of the invention.
  • Fig. 5 is a sectional elevation of another camera of the invention.
  • Fig. 6 is a schematic perspective view of a spacer support element suitable for use in the camera of Fig. 5.
  • Fig. 1 shows a miniature colour video camera system C having three cameras 1, each comprising a lens system 2 mounted directly onto the image sensing surface 3 of a respective solid state image capture device in the form of an integrated circuit image array sensor 4.
  • the sensors 4 are formed as separate sections of a single monolithic VLSI microchip 5 mounted in a suitable housing CH containing a power supply 6 and provided with a video signal output interface 7.
  • the lens system 2 comprises a generally hemispherical lens 8 having a radius of curvature of the order of 0.85 mm, and a cylindrical spacer element 9 of substantially larger diameter (ca. 1.7mm) and a length of 1.59 mm, with an aperture stop 10 therebetween.
  • the aperture stop 10 is of metal, e.g. steel alloy, with a thickness of 0.15 mm and an iris diameter of 0.8 mm, providing an effective lens aperture of f/2.0.
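As a quick sanity check on these figures (using the standard definition N = f/D, where D is the iris diameter), an f/2.0 aperture with a 0.8 mm iris implies an effective focal length of about 1.6 mm; the 1.6 mm value is an inference from the stated numbers, not a figure given in the text:

```python
def f_number(focal_length_mm: float, iris_diameter_mm: float) -> float:
    """Standard photographic f-number: N = f / D."""
    return focal_length_mm / iris_diameter_mm

# An inferred ~1.6 mm effective focal length with the 0.8 mm iris gives f/2.0
print(f_number(1.6, 0.8))  # 2.0
```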
  • the aperture opening is filled with clear epoxy resin 11 which has a refractive index substantially similar to that of the lens 8 and secures the lens 8 and spacer 9 to each other and to the aperture stop 10.
  • an aperture stop of metal or other material could simply be printed onto the spacer or lens e.g. using a photolithographic technique.
  • the R, G, B (red, green and blue) filters 12 for the three respective lenses 8 can also be disposed between the lenses 8 and spacers 9.
  • the lens 8 is of low dispersion glass (BK7) having a refractive index n_d of 1.5168, and the spacer is of LaK10 glass which has a higher refractive index n_d of 1.7200. This combination produces low image blur and large image size (ca. 1.4 mm image height from central axis).
  • the spacer 9 has a length of around 1.59 mm. This lens system has an effective depth of field of from 2 cm to infinity with a field of view angle of 90°, and has an rms blur of around 5 µm, which is within the unit sensor pixel dimensions, thereby providing a reasonably good video signal image output from the video signal output connection 7.
  • the spacer element could be a composite element made up of a plurality of plane components.
  • the lens element could also be composite though this would normally be less preferred due to the significantly increased complexity.
  • the various surfaces of the lens system could moreover be provided with diverse coatings for e.g. reducing undesirable reflections and selective filtration of the incident light rays in generally known manner.
  • the R, G, B filters could be mounted on a suitable support in front of the lenses 8 as further described hereinbelow.
  • Fig. 2 shows some possible alternative layouts for the individual cameras on the single chip.
  • In layout (a) the absolute red-green and blue-green distances are minimised. All of the parallax error is vertical. This may not be optimum for TV applications, as in this case the colour signal is greatly averaged horizontally, but not at all vertically.
  • Layout (b) forces all of the parallax error into the horizontal dimension to take advantage of this.
  • Layout (c) has slightly worse red-green and blue-green parallax than (a), but the red-blue distance is much smaller. Intuitively, this configuration minimises the total parallax error.
  • Layout (d) is as for (c), but pushes most of the r-g, b-g errors into the horizontal axis.
  • the colour camera system of the present invention provides two further potential technical advantages over the alternative known approaches.
  • Axial colour aberration causes the focal plane for blue to be slightly closer to the lens, and for red slightly further from the lens, than green.
  • This aberration may actually be an advantage if we design the lens for green light and the blue and red images become slightly defocussed, thereby also blurring the effect of parallax errors.
  • the blue and red images are blurred by around 1 pixel at the geometries used above. If we wish, the 3-lens approach could be adapted to accommodate this aberration by fine-tuning the focal length of each lens. This is impossible with either of the existing single-lens approaches.
  • Transversal colour aberration is a change in magnification factors at different wavelengths. Again it may be possible to correct for this by modifying slightly the red and blue lens geometries.
  • Fig. 3 shows how the parallax errors between closely adjacent identical cameras can be maintained at or below one pixel for cameras with useful resolutions of several hundred pixels. If the cameras are calibrated to provide image alignment for objects at infinity, then an object at distance O (on the optical axis of one camera for simplicity) is imaged at an offset of e pixels in the second camera. It is obvious that the parallax error is greatest for objects which are closest to the cameras. To help in generalising the result, suppose we wish to image objects at minimum range O mm with a field of view of 2θ degrees, a sensor resolution of P pixels, and a camera separation of s mm. Then by trigonometry, the parallax error e, in pixels, is:

        e = sP / (2 O tan θ)     (1)

  • e in equation (1) is the parallax error for a close object with reference to objects at infinity.
  • the error in this case will be approximately 0.5 pixels.
  • the parallax error is less than one pixel and therefore of the same order as aliasing and interpolation errors in single-chip colour mosaic cameras and accordingly reasonably acceptable.
  • parallax error increases for small object ranges or distances 0 and for larger camera separations s.
  • sensor size P = 240 pixels
  • camera pitch separation s = 3.1 mm
  • field of view 2θ = 45°
  • stereo image capture is feasible at ranges up to 1.8 metres. This range can be increased simply by corresponding increases in the camera separation s.
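The trigonometric relation stated above, e = sP/(2·O·tan θ), can be checked numerically against the worked figures in the text (s = 3.1 mm, P = 240 pixels, 2θ = 45°), a half-pixel offset being reached at about 1.8 m:

```python
import math

def parallax_error_pixels(s_mm: float, P_pixels: int,
                          fov_degrees: float, O_mm: float) -> float:
    """Parallax offset e (in pixels) of an object at range O, for two
    cameras separated by s, calibrated for alignment at infinity:
    e = s * P / (2 * O * tan(fov/2))."""
    half_angle = math.radians(fov_degrees / 2.0)
    return (s_mm * P_pixels) / (2.0 * O_mm * math.tan(half_angle))

# Worked figures from the text: s = 3.1 mm, P = 240 pixels, 45 degree field
print(round(parallax_error_pixels(3.1, 240, 45, 1800), 2))  # ~0.5 px at 1.8 m
```

Note how the error grows as the object approaches (halving the range doubles e) and shrinks toward zero at infinity, matching the qualitative discussion above.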
  • the electronic architecture of the camera is substantially independent of the optical and sensor arrangement described above. Nevertheless, the availability of synchronous, continuous RGB colour signals minimises the required image processing. This results in a simpler and lower-cost electronic implementation than for colour-mosaic cameras. Furthermore, the electronic requirements may be implemented feasibly on the same chip as the sensors where CMOS sensor technology such as that described in our earlier Patent Publication No. WO91/04498, is used.
  • FIG. 4 gives an overview of one possible electronic architecture.
  • Three colour arrays are driven in similar style to a monochrome array, except that the timing of vertical and horizontal control on the red and blue arrays is altered by offset values loaded into the controller from an off-chip PROM (Programmable Read-Only Memory) chip which has been programmed with the offset calibration data.
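The offset-calibrated readout can be sketched in software as follows. This is only an illustration with hypothetical names and data; in the camera itself the shifts are applied by retiming the on-chip scan controllers, not by post-processing:

```python
def shift_rows_cols(frame, dx, dy):
    """Shift a 2D pixel array by (dx, dy), padding the exposed edge with 0 —
    a software stand-in for retiming the sensor's row/column scan start."""
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sy, sx = y + dy, x + dx
            if 0 <= sy < h and 0 <= sx < w:
                out[y][x] = frame[sy][sx]
    return out

def align_rgb(red, green, blue, prom_offsets):
    """Apply per-colour calibration offsets (as loaded from the PROM) so the
    red and blue planes register with the green reference array."""
    r = shift_rows_cols(red, *prom_offsets["red"])
    b = shift_rows_cols(blue, *prom_offsets["blue"])
    return r, green, b

# hypothetical offsets, as measured once at calibration time
prom_offsets = {"red": (1, 0), "blue": (0, -1)}
```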
  • PROM: Programmable Read-Only Memory; AEC: Automatic Exposure Control; AGC: Automatic Gain Control
  • the resulting balanced colour signals are passed through a colour correction matrix, which performs weighted mixing of the three colours (see e.g. D'Luna and Parulski, IEEE JSSC Vol. 26, No. 5, pp. 727-737) followed by gamma correction on each colour. Both these functions are standard requirements for colour cameras intended for TV displays as they correct for known colour and amplitude nonlinearities in the display tubes.
  • the matrix may be implemented in analogue CMOS, either by using switched capacitors or switched current-sources. Either of these can accommodate alterable coefficients in digital form, or they could be fixed in layout. In the former case the coefficients may be stored in the PROM already provided for offset calibration and this may afford a useful degree of flexibility.
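The colour-correction and gamma stages described above can be illustrated as a minimal sketch. The coefficient values here are made up for illustration only (the real coefficients would come from the cited literature or from the calibration PROM), and the hardware implementation is analogue CMOS rather than software:

```python
def colour_correct_and_gamma(rgb, matrix, gamma=2.2):
    """Weighted mixing of the three colour signals through a 3x3 correction
    matrix, followed by per-channel gamma pre-correction for a display tube."""
    mixed = [sum(matrix[i][j] * rgb[j] for j in range(3)) for i in range(3)]
    return [max(0.0, c) ** (1.0 / gamma) for c in mixed]

# illustrative (not standardised) correction coefficients
matrix = [[ 1.2, -0.1, -0.1],
          [-0.1,  1.2, -0.1],
          [-0.1, -0.1,  1.2]]
print(colour_correct_and_gamma([0.5, 0.5, 0.5], matrix))
```

Note that a neutral grey input stays neutral through this matrix (each row sums to 1.0), which is the usual constraint on such coefficients.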
  • the gamma corrected RGB signals are then passed to an appropriate encoder for whatever standard is required. Suitable encoders for e.g. NTSC/PAL are readily available.
  • In Figs. 5 and 6, like parts corresponding to those in Fig. 1 are indicated by like reference numerals.
  • Figs. 5 and 6 illustrate an alternative embodiment using a lens system 20 supported at its edges 21 on spacer elements 22, with a substantial free air space 23 between the three lenses 8 of the lens system and the respective sensors 4.
  • the lens system 20 is a one-piece moulding from a suitable optical grade plastics material, incorporating an array of three lens portions 8 joined edge-to-edge. It will be appreciated that this affords a particularly economic and convenient form of production whilst at the same time simplifying assembly of the camera insofar as the three lenses for the R, G, B components of the image are automatically aligned with each other.
  • the lenses 8 could be manufactured from other materials e.g. glass or any other suitable optical material and/or as discrete individual components. Moreover each lens could comprise more than one element e.g. a doublet.
  • the lenses 8 may conveniently be aspheric or spherical or planar at either surface thereof.
  • the sensors 4 are formed as separate sections of a single monolithic VLSI microchip 5, mounted in a housing CH (only part shown).
  • An optical support sub-housing 24 has various shoulder portions 25-28, for respectively securing the monolithic chip 5 to the housing CH, supporting the lens system 20 on the spacer elements 22 above the chip 5, supporting R, G and B filters 29-31 above the lenses 8, and supporting a protective outer infra-red filter 32 (conveniently of doped glass, e.g. Schott KG3 or BG39).
  • Additional support to the lens system 20 and the filters 29-31 is conveniently provided by spacer walls 33, 34, which have the further advantage of acting as light baffles between the three R, G, B cameras 1 to prevent cross-imaging between each lens 8 and the other sensors 4.
  • the lower spacer walls 33 may conveniently be formed integrally with the support sub-housing 24.

Abstract

The present invention relates to an image capture system suitable for use in an electronic camera system C and comprising a solid state image capture device (1) comprising an integrated circuit (5) having at least two sensor arrays (4), each said array having an image sensing surface (3) and a respective lens system (8) associated therewith.

Description

SOLID STATE SENSOR ARRANGEMENT FOR VIDEO CAMERA

The present invention relates to electronic cameras including electronic colour cameras.
It is well known that colour sensors can be produced by discriminating three images of the primary colours (blue, green, red) of the scene. All colours can be analysed and synthesised via these primaries (or other complementary triples like cyan, magenta, yellow).

Conventional electronic cameras classically use one of two approaches for forming the separate colour images. 3-tube cameras use a single lens followed by a prism which forms three separate R, G, B images. Three sensors are used simultaneously to detect these three images. If the sensors are accurately aligned the resulting picture is of very high quality. However the sensors are separated in space and orientation, and their assembly and alignment with the prism and lens is difficult for a volume manufacturing process. This technique is therefore used exclusively for expensive broadcast-quality equipment.

Colour-mosaic cameras use a single lens and sensor, but the sensor surface is covered with a high-resolution mosaic or grid of colour filters, with the pattern dimension equal to the pixel pitch for a semiconductor CCD or MOS sensor array. Pixels of different colours are demultiplexed at the sensor output and interpolated to form synchronous parallel colour signals. This is well-suited to volume production as the surface colour mosaic can be fabricated as an extension of the semiconductor wafer fabrication process. However, the techniques for mosaic fabrication are restricted to relatively few companies worldwide who supply the colour sensor market, and thus they are not commonly available. Furthermore, associated with this technique there are technical problems concerned with resolution and aliasing. Much work has been done to correct these effects, but usually at some cost in image-processing hardware.
It is an object of the present invention to avoid or minimise one or more of the above disadvantages.
In one of its broadest aspects, the present invention provides an image capture system comprising a solid state image capture device, which device comprises an integrated circuit having at least two sensor arrays, each said array having an image sensing surface and a respective lens system associated therewith.
Thus in effect the present invention provides two or more cameras on one chip, each with its own lens system and sensor array. With such an arrangement the problem of alignment is greatly reduced by the fabrication of the various sensors required on one chip. This ensures that the sensors all lie in the same plane and have the same rotational orientation, and this is an important advantage. Assuming lenses can be accurately assembled in a parallel plane (see below), the only alignment errors which are likely to occur are simple orthogonal translations in the form of vertical and horizontal errors in the centres of the optical axes. It is relatively easy, though, to calibrate these cameras after assembly and electronically correct for these translations. Whilst the inevitable lateral offset between the cameras, at even the closest dispositions of the cameras on the chip, will of course give rise to a degree of parallax error, it has now been found that with a preferred system of the present invention with generally adjacent sensor arrays, the degree of error in producing a single composite image (i.e. a single image produced by the more or less accurately aligned superimposition of two or more corresponding images, e.g. at different wavelengths, of the same scene) can be acceptably small for small camera geometries for low to medium resolution applications. Thus in said one preferred aspect the present invention provides a composite image camera of particularly simple and economic construction.
The present invention also provides in another aspect a stereoscopic image capture system where larger sensor spacings are used to provide a greater parallax differential for producing different images with a more or less accurately defined parallax differential for use in producing stereoscopic image pairs. Again the use of two or more cameras mounted on a single chip helps substantially to minimise alignment problems in producing an accurate stereoscopic view.
Advantageously the lens systems are mounted substantially directly on the image sensing surfaces. Preferably there is used a lens system in accordance with our earlier British Patent Application No. 9103846.3 dated 23rd February 1991 (published in
International Publication No. WO92/15036), which lens system comprises a lens and a spacer in substantially direct contact with each other, said spacer preferably having a refractive index not less than that of said lens, said lens and spacer having refractive indices and being dimensioned so as to form an image of an object in a plane at or in direct proximity to a rear face of said spacer element remote from said lens element, whereby in use of the lens system, with said lens system mounted substantially directly on the image sensing surface of the image capture device, an optical image may be captured thereby. These lens systems have the advantage of physical dimensions which can be made similar to those of the sensor array itself, so that sensors and lenses may be immediately adjacent to each other. Camera separations as low as 2 to 3mm are easily achieved and this helps to minimise the parallax error. The flat surfaces of the cylindrical lens spacer also help to maintain accurate planarity for groups of lenses attached to the same chip substrate.
It is also possible, though, to use more conventional, albeit similarly small, lens systems which are mounted on a suitable support so as to be spaced from the sensor surface with an air gap therebetween. One advantage of such systems is that they allow the use of more conventional and cheaper lens materials without the need for special materials having particular refractive indices.
In general at least one of the individual cameras, constituted by a respective lens system and sensor array, is provided with a filter means for passing a desired wavelength (or wavelength range) of the electromagnetic radiation spectrum, whereby there may be captured a composite image comprised of two or more (depending on the number of individual cameras used) images of the same object differing substantially only in the wavelength thereof.
In one preferred form of the invention the integrated circuit has three sensor arrays provided with respective lens systems and filters for three different wavelengths e.g. red, green, and blue, or cyan, magenta, and yellow for providing a desired composite image e.g. a full-colour image. Where three or more individual cameras are used, it will be appreciated that various different layouts of the sensor arrays relative to each other may be employed including e.g. linear arrangements or generally "circular" or other close-packed arrangements.
In cameras using the 3 primary colours, green is dominant in providing image acuity since it generally dominates the derived luminance. In all cases therefore, the green camera is desirably made as central as possible, and the red and blue cameras are referenced to it. The parallax errors will therefore show up on red and blue only.
With reference to lens systems of our earlier application No. 9103846.3, the expression "substantially direct contact" is used to mean that there should not be any significant interspace containing low refractive index material such as air, i.e. no interspace having a thickness resulting in a significant optical effect. In the case of an air gap this should normally be not more than 500 µm, preferably not more than 100 µm, thick. In the case where resin or like material is used between the components adhesively to secure them together and has a refractive index comparable to that of the lens or spacer, it may be considered as an extension of the lens or spacer and thus need not be so restricted in thickness, though preferably the thickness thereof should not be excessive and should be more or less similarly restricted.
Advantageously there is used a plano-convex (or possibly plano-concave - see below) lens with a substantially plane spacer for manufacturing convenience and economy but other combinations e.g. a bi-convex lens and a plano-concave spacer, may also be used. Preferably the lens system is secured to said image sensing surface by an optical grade adhesive i.e. a substantially transparent optically uniform adhesive. Desirably there is used between the lens and spacer an adhesive having the same refractive index as the lens (or if preferred, as the spacer) and between the spacer and the sensing surface, an adhesive having the same refractive index as the spacer.
Preferably the spacer has a higher, most preferably a substantially higher, refractive index than the lens. Where the same refractive index is acceptable for both then it will be appreciated that the spacer could be formed integrally with the lens.
It will be appreciated that the radius (or radii) of curvature of the lens element and its refractive index may be varied through a wide range of values depending on the required performance in terms of depth of field, image size, freedom from aberrations etc. In general there will desirably be used solid state image capture devices in the form of photoelectric sensor arrays
(wherein photons are used to generate electric current and/or voltage or to change electrical properties such as resistance) which have relatively small size image sensing surfaces, e.g. in the range from 0.1 to 5 cm across. Thus the lens system should in such cases desirably be formed and arranged to provide a similarly small-sized image. Where a wide angle field of view is also required (e.g. in surveillance applications), then a lens of relatively short focal length should be used: e.g. for a field of view angle of 80 degrees the (maximum) focal length will not normally exceed 1.19 times the (maximum) image height, and for 60 degrees will not normally exceed 1.73 times the (maximum) image height, the (maximum) image height corresponding to half the sensing surface diameter. The use of a high refractive index spacer and the exclusion of any low refractive index material from the optical path significantly decreases aberration due to Petzval curvature (otherwise known as curvature of field aberration) and limits spherical aberration. The lens system is therefore particularly advantageous in wide field and/or large aperture applications required for low light conditions. In general there is desirably used, for such wide angle applications, a lens element having a refractive index in the range from 1.45 to 1.65, and a spacer element with a higher refractive index in the range from 1.45 to 1.85.
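The focal-length limits quoted above follow from simple trigonometry: the maximum focal length is h / tan(ø), where h is the (maximum) image height and ø half the field-of-view angle. A short sketch (for illustration only, not part of the patent) reproduces the 1.19 and 1.73 factors:

```python
import math

def max_focal_length_factor(fov_deg):
    """Maximum focal length, expressed as a multiple of the (maximum)
    image height h, for a full field-of-view angle fov_deg:
    f_max = h / tan(fov/2)."""
    return 1.0 / math.tan(math.radians(fov_deg / 2.0))

# Reproduce the factors quoted in the text:
print(round(max_focal_length_factor(80), 2))  # 1.19
print(round(max_focal_length_factor(60), 2))  # 1.73
```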
Various optical grade materials having suitable refractive indices are widely available. Low-dispersion glass such as type BK7 (available from various sources, e.g. Schott Glaswerke) is particularly suitable for the lens element. The spacer element may be made of LaK10 glass, which is also readily available. Other materials that may be used for the lens and/or spacer elements comprise plastics materials, although these are generally less preferred in view of their lower resistance to scratching and other damage and the lower refractive indices available. Nevertheless they may be acceptable for certain applications requiring low cost, such as consumer door-entry and security cameras.
Suitable adhesive materials for use between the spacer and lens elements and between the spacer element and the solid state image capture device include optical grade epoxy resins.
In a preferred image capture system of the present invention the solid state image capture device comprises an integrated circuit image array sensor such as that disclosed in our earlier International patent application No. PCT/GB90/01452 (publication No. WO91/04633, the contents of which are hereby incorporated herein by reference thereto) which has on-board signal processing means formed and arranged for directly providing a video signal output. Naturally though other image capture devices such as CCD and MOS sensors may also be used. Also the image capture device may comprise simply a sensor chip on which only the sensor arrays are provided, with all the electronic circuitry required to detect the response of individual sensor cells to incident radiation, and further processing of the detected response, provided externally of the sensor chip; other arrangements, with a greater or lesser part of this electronic circuitry provided on the chip bearing the sensor arrays, are of course also possible. Accordingly references to "cameras" herein include references to apparatus in which substantially the whole of the electronic circuitry required to produce a video output signal is provided on the same chip as the sensor arrays, as well as apparatus in which a greater or lesser part is provided separately. Thus references to camera alignment relate only to alignment of the lenses and sensor arrays (and not to any other components that may be required to produce a video signal output).
Thus using miniature, chip-mounted lenses, it is possible to fabricate multiple independent cameras on single VLSI chips. These cameras accurately lie in the same plane and are rotationally in substantially perfect alignment. Any remaining alignment errors are primarily translational and can be easily corrected by retiming the readout control sequences. Further preferred features and advantages of the present invention will appear from the following detailed description by way of example of a preferred embodiment illustrated with reference to the accompanying drawings in which:
Fig. 1 is a schematic perspective view of a composite image colour video camera of the invention with three individual camera units;
Figs 2(a) to (d) are schematic views showing 4 different 2-D arrangements of the three camera elements relative to each other;
Fig. 3 is a schematic illustration of the optical performance of a camera of the invention;
Fig. 4 is a block circuit diagram of one possible electronic architecture for a camera of the invention;
Fig. 5 is a sectional elevation of another camera of the invention; and
Fig. 6 is a schematic perspective view of a spacer support element suitable for use in the camera of Fig. 5.
Fig. 1 shows a miniature colour video camera system C having three cameras 1, each comprising a lens system 2 mounted directly onto the image sensing surface 3 of a respective solid state image capture device in the form of an integrated circuit image array sensor 4. The sensors 4 are formed as separate sections of a single monolithic VLSI microchip 5 mounted in a suitable housing CH containing a power supply 6 and provided with a video signal output interface 7.
In more detail the lens system 2 comprises a generally hemispherical lens 8 having a radius of curvature of the order of 0.85 mm, and a cylindrical spacer element 9 of substantially larger diameter (ca. 1.7mm) and a length of 1.59 mm, with an aperture stop 10 therebetween. The aperture stop 10 is of metal e.g. steel alloy with a thickness of 0.15 mm and an iris diameter of 0.8 mm providing an effective lens aperture of f2.0. The aperture opening is filled with clear epoxy resin 11 which has a refractive index substantially similar to that of the lens 8 and secures the lens 8 and spacer 9 to each other and to the aperture stop 10. Alternatively an aperture stop of metal or other material could simply be printed onto the spacer or lens e.g. using a photolithographic technique. The R, G, B (red, green and blue) filters 12 for the three respective lenses 8 can also be disposed between the lenses 8 and spacers 9.
The lens 8 is of low dispersion glass (BK7) having a refractive index of 1.568 and the spacer is of LaK10 glass which has a higher refractive index of 1.720. This combination produces low image blur and large image size (ca. 1.4mm image height from the central axis). The spacer 9 has a length of around 1.59mm. This lens system has an effective depth of field of from 2cm to infinity with a field of view angle of 90° and has an rms blur of around 5 µm, which is within the unit sensor pixel dimensions, thereby providing a reasonably good video signal image output from the video signal output connection 7.
It will be appreciated that various modifications may be made to the above described embodiment without departing from the scope of the present invention. Thus for example the spacer element could be a composite element made up of a plurality of plane components. The lens element could also be composite though this would normally be less preferred due to the significantly increased complexity. The various surfaces of the lens system could moreover be provided with diverse coatings for e.g. reducing undesirable reflections and selective filtration of the incident light rays in generally known manner. Also the R, G, B filters could be mounted on a suitable support in front of the lenses 8 as further described hereinbelow.
Fig. 2 shows some possible alternative layouts for the individual cameras on the single chip. In layout (a) the absolute red-green and blue-green distances are minimised. All of the parallax error is vertical. This may not be optimum for TV applications, as in this case the colour signal is greatly averaged horizontally, but not at all vertically. Layout (b) forces all of the parallax error into the horizontal dimension to take advantage of this. Layout (c) has slightly worse red-green and blue-green parallax than (a), but the red-blue distance is much smaller. Intuitively, this configuration minimises the total parallax error. Layout (d) is as for (c), but pushes most of the r-g, b-g errors into the horizontal axis.
Apart from its substantial simplicity, the colour camera system of the present invention provides two further potential technical advantages over the alternative known approaches. Axial colour aberration causes the focal plane for blue to be slightly closer to the lens, and for red slightly further from the lens, than green. This aberration may actually be an advantage if we design the lens for green light and the blue and red images become slightly defocussed, thereby also blurring the effect of parallax errors. For ordinary glasses, the blue and red images are blurred by around 1 pixel at the geometries used above. If we wish, the 3-lens approach could be adapted to accommodate this aberration by fine-tuning the focal length of each lens. This is impossible with either of the existing single-lens approaches. Transversal colour aberration is a change in magnification factors at different wavelengths. Again it may be possible to correct for this by modifying slightly the red and blue lens geometries.
Fig. 3 shows how the parallax errors between closely adjacent identical cameras can be maintained at or below one pixel for cameras with useful resolutions of several hundred pixels. If the cameras are calibrated to provide image alignment for objects at infinity, then an object at distance O (on the optical axis of one camera for simplicity) is imaged at an offset of e pixels in the second camera. It is obvious that the parallax error is greatest for objects which are closest to the cameras. To help in generalising the result, suppose we wish to image objects at minimum range O mm with a field of view of 2ø degrees and sensor resolution of P pixels. Then by trigonometry, the parallax error e, in pixels, is:-
e = P.s / (2.O.tan(ø)) pixels (1)
where s is the camera separation. This important result is independent of the focal length of the lens and shows that the error is reduced by lowering s, lengthening O or increasing ø.
Actually, e in equation (1) is the parallax error for a close object with reference to objects at infinity. Advantageously we might calibrate the cameras to provide alignment for objects at some mid range and achieve a balance of errors between close and far objects. It is easy to show that if we calibrate on objects at range 2xO then the worst case parallax error for near and far objects is:-
e' = P.s / (4.O.tan(ø)) pixels (2)

Thus in the case of a colour camera for computer vision using a typical standard resolution of e.g. 240 x 320 pixels, then if we require a lens with 52° angle-of-view (ø = 26°), horizontal resolution, P, of 320 pixels, minimum range, O, of 50cm and we achieve a separation, s, of 3mm, e' will be slightly less than one pixel. In the case of a low resolution colour camera for video telephones using the QCIF standard (144 x 176 pixels), the device must work at close range, say down to 25cm. If we use the same angle of view, 52°, and achieve a lens separation of 1.5mm (allowing for the smaller array), then the error in this case will be approximately 0.5 pixels. In both cases, the parallax error is less than one pixel and therefore of the same order as aliasing and interpolation errors in single-chip colour mosaic cameras, and accordingly reasonably acceptable.
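The two worked examples above can be checked directly from equations (1) and (2). A minimal sketch (Python, for illustration; the QCIF horizontal resolution is taken as 176 pixels):

```python
import math

def parallax_error_px(P, s_mm, O_mm, half_fov_deg, mid_range_calibrated=False):
    """Parallax error in pixels per equations (1) and (2):
    e = P.s / (2.O.tan(phi)); calibrating on objects at range 2*O
    halves the worst-case error (equation (2))."""
    e = P * s_mm / (2.0 * O_mm * math.tan(math.radians(half_fov_deg)))
    return e / 2.0 if mid_range_calibrated else e

# Computer-vision camera: P = 320, s = 3 mm, O = 50 cm, phi = 26 deg.
print(round(parallax_error_px(320, 3.0, 500.0, 26.0, True), 2))  # 0.98
# Videophone camera (QCIF): P = 176, s = 1.5 mm, O = 25 cm.
print(round(parallax_error_px(176, 1.5, 250.0, 26.0, True), 2))  # 0.54
```

Both results agree with the text: slightly under one pixel for the computer-vision case and approximately half a pixel for the videophone case.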
Conversely it will be appreciated that where it is desired to capture stereo images the parallax error should be greater than 1. It may be seen from the above equations that parallax error increases for small object ranges or distances O and for larger camera separations s. With a sensor size P of 240 pixels, camera pitch separation s of 3.1 mm, and field of view of 45° (2ø), stereo image capture is feasible at ranges up to 1.8 metres. This range can be increased simply by corresponding increases in the camera separation s.
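Equation (1) can be rearranged to give the maximum object range at which a chosen minimum disparity survives. The sketch below (not part of the patent) uses the figures quoted above; note that the quoted 1.8 metre range is reproduced if a half-pixel disparity is taken as the threshold, which is an assumption made here for illustration:

```python
import math

def max_stereo_range_mm(P, s_mm, half_fov_deg, min_disparity_px):
    """Equation (1) rearranged for the largest object range O at which
    the inter-camera disparity still reaches min_disparity_px."""
    return P * s_mm / (2.0 * math.tan(math.radians(half_fov_deg)) * min_disparity_px)

# P = 240 pixels, s = 3.1 mm, field of view 45 deg (phi = 22.5 deg).
# A half-pixel disparity threshold reproduces the ~1.8 m figure quoted above.
print(round(max_stereo_range_mm(240, 3.1, 22.5, 0.5)))  # ~1796 mm
```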
The electronic architecture of the camera is substantially independent of the optical and sensor arrangement described above. Nevertheless, the availability of synchronous, continuous RGB colour signals minimises the required image processing. This results in a simpler and lower-cost electronic implementation than for colour-mosaic cameras. Furthermore, the electronic requirements may be implemented feasibly on the same chip as the sensors where CMOS sensor technology such as that described in our earlier Patent Publication No. WO91/04498, is used.
Figure 4 gives an overview of one possible electronic architecture. Three colour arrays are driven in similar style to a monochrome array, except that the timing of vertical and horizontal control on the red and blue arrays is altered by offset values loaded into the controller from an off-chip PROM (Programmable Read Only Memory) chip which has been programmed with the offset calibration data.
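In the architecture above the correction is applied in hardware by retiming the readout control sequences; the effect is equivalent to the purely translational shift sketched below in software (a hypothetical helper, not from the patent), where each mis-registered frame is moved by its calibrated (dx, dy) offset:

```python
def apply_offset(frame, dx, dy):
    """Shift a 2-D frame (list of rows) by (dx, dy) pixels, padding
    exposed edges with zeros: out[y][x] = frame[y+dy][x+dx]."""
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sy, sx = y + dy, x + dx
            if 0 <= sy < h and 0 <= sx < w:
                out[y][x] = frame[sy][sx]
    return out

# A blue frame mis-registered one pixel to the right is shifted back:
blue = [[0, 1, 2],
        [0, 3, 4],
        [0, 5, 6]]
print(apply_offset(blue, dx=1, dy=0))  # [[1, 2, 0], [3, 4, 0], [5, 6, 0]]
```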
Automatic Exposure Control (AEC) is provided as for monochrome arrays. The same exposure value is used for all three arrays. Exposure is monitored via the green output alone, or possibly by deriving luminance from a combination of RGB signals.
Automatic Gain Control (AGC) is also provided as for monochrome arrays, but in this case, independent gains can be set for each colour. AGC provides three functions:-
(i) automatic peak level calibration using a saturated reference line in the green array. In normal exposure circumstances, the gain of the green channel is fixed at this value and this forms a nominal gain for the red and blue channels also.
(ii) dynamic colour balancing to correct for variations in the colour of ambient light by adjusting the blue and red gains according to colour analysis of the three channels.
(iii) automatic gain of weak images in low light conditions when AEC has reached maximum exposure. At first dynamic colour balancing can continue, but at the lowest light levels (maximum gain) , flexibility may be lost. This is characteristic of many colour cameras.
The resulting balanced colour signals are passed through a colour correction matrix, which performs weighted mixing of the three colours (see e.g. D'Luna and Parulski, IEEE JSSC Vol. 26, No. 5, pp. 727-737) followed by gamma correction on each colour. Both these functions are standard requirements for colour cameras intended for TV displays as they correct for known colour and amplitude nonlinearities in the display tubes.
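A minimal sketch of this processing step (the matrix coefficients below are made up for illustration, not taken from the patent or the cited paper):

```python
def correct_pixel(rgb, matrix, gamma=2.2):
    """Weighted mixing of the three colours through a 3x3 correction
    matrix, followed by gamma pre-correction for the display tube."""
    mixed = [sum(matrix[i][j] * rgb[j] for j in range(3)) for i in range(3)]
    # Clamp to [0, 1], then apply the inverse-gamma transfer function.
    return [max(0.0, min(1.0, c)) ** (1.0 / gamma) for c in mixed]

# Illustrative matrix: boosts each channel while subtracting a little
# cross-colour leakage; each row sums to 1 so greys pass unchanged.
M = [[ 1.2, -0.1, -0.1],
     [-0.1,  1.2, -0.1],
     [-0.1, -0.1,  1.2]]
print([round(c, 2) for c in correct_pixel([0.5, 0.5, 0.5], M)])  # [0.73, 0.73, 0.73]
```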
The matrix may be implemented in analogue CMOS, either by using switched capacitors or switched current-sources. Either of these can accommodate alterable coefficients in digital form, or they could be fixed in layout. In the former case the coefficients may be stored in the PROM already provided for offset calibration and this may afford a useful degree of flexibility.
The gamma corrected RGB signals are then passed to an appropriate encoder for whatever standard is required. Suitable encoders for e.g. NTSC/PAL are readily available.
In Figs. 5 and 6 like parts corresponding to those in Fig. 1 are indicated by like reference numerals. Figs. 5 and 6 illustrate an alternative embodiment in which is used a lens system 20 supported at its edges 21 on spacer elements 22 with a substantial free air space 23 between the three lenses 8 of the lens system and the respective sensors 4. In more detail, the lens system 20 is a one-piece moulding from a suitable optical grade plastics material incorporating an array of three lens portions 8 joined edge-to-edge. It will be appreciated that this affords a particularly economic and convenient form of production whilst at the same time simplifying assembly of the camera insofar as the three lenses for the R, G, B components of the image are automatically aligned with each other. Nevertheless it will be understood that, if desired, the lenses 8 could be manufactured from other materials, e.g. glass or any other suitable optical material, and/or as discrete individual components. Moreover each lens could comprise more than one element, e.g. a doublet. The lenses 8 may conveniently be aspheric or spherical or planar at either surface thereof.
As in the first embodiment, the sensors 4 are formed as separate sections of a single monolithic VLSI microchip 5, mounted in a housing CH (only part shown). An optical support sub-housing 24 has various shoulder portions 25-28, for respectively securing the monolithic chip 5 to the housing CH, supporting the lens system 20 on the spacer elements 22 above the chip 5, supporting R, G and B filters 29-31 above the lenses 8, and supporting a protective outer Infra Red filter 32 (conveniently of doped glass, e.g. Schott KG3 or BG39). Additional support to the lens system 20 and the filters 29-31 is conveniently provided by spacer walls 33, 34 which have the further advantage of acting as light baffles between the three R, G, B cameras 1 to prevent cross-imaging between each lens 8 and the other sensors 4. As shown in Fig. 6 the lower spacer walls 33 may conveniently be formed integrally with the support sub-housing 24. It will be understood that various modifications may readily be made to the above described embodiment. Thus, for example, the order of the filters 29-31 and 32 may be changed.

Claims

1. An image capture system comprising a solid state image capture device which device comprises an integrated circuit having at least two sensor arrays, each said array having an image sensing surface and a respective lens system associated therewith.
2. A system according to claim 1 in which system at least one of the lens systems is mounted spaced apart from the sensor means with a fluid medium between the lens and the sensor.
3. A system according to claim 1 in which system at least one of the lens systems, comprises a lens in substantially direct contact with a transparent spacer in substantially direct contact with the sensor and extending between said lens and sensor, said lens and spacer having refractive indices and being dimensioned so as to form an image in a plane at or in direct proximity to a rear face of said spacer element remote from said lens element, from an object, whereby in use of the lens system with said lens system mounted substantially directly on the image sensing surface of the image capture device an optical image may be captured thereby.
4. A system according to claim 3 wherein said spacer has a refractive index not less than that of said lens.
5. A system according to any one of claims 1 to 4 wherein at least two said lens systems are formed integrally with each other.
6. A system according to any one of claims 1 to 5 wherein said sensor arrays are photoelectric.
7. A system according to claim 6 wherein each sensor array has a diameter of from 1 to 10mm.
8. A system according to any one of claims 1 to 7 which has a pair of sensor arrays spaced apart so as to provide a stereo image capture system for objects within a predetermined range from the camera.
9. A system according to any one of claims 1 to 7 which has at least three sensor arrays and wherein at least one of the respective lens systems is provided with a wavelength selective filter means, so as to provide a composite image capture system.
10. A system according to claim 9 wherein are provided three sensor arrays and in which the respective lens systems, are provided with red, green, and blue filters, whereby substantially full colour composite image capture may be effected.
11. A system according to claim 9 wherein are provided three sensor arrays and in which the respective lens systems are provided with cyan, magenta and yellow filters, whereby substantially full colour composite image capture may be effected.
12. A system according to any one of claims 9 to 11 wherein said sensor arrays are arranged in a linear array.
13. A system according to any one of claims 9 to 11 wherein said sensor arrays are arranged in a close-packed non-linear array.
14. A system according to any one of claims 9 to 13 wherein said sensor arrays are disposed at a pitch spacing of not more than 5mm.
15. A system according to any one of claims 1 to 14 wherein are used wide angle lens systems having a field of view of at least 60°.
16. A video camera having an image capture system according to claim 8 or any of claims 7 to 15 when dependent on claim 6.
PCT/GB1992/002260 1991-12-06 1992-12-04 Solid state sensor arrangement for video camera WO1993011631A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB9409608A GB2276512B (en) 1991-12-06 1992-12-04 Solid state sensor arrangement for video camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB919125954A GB9125954D0 (en) 1991-12-06 1991-12-06 Electronic camera
GB9125954.9 1991-12-06

Publications (1)

Publication Number Publication Date
WO1993011631A1 true WO1993011631A1 (en) 1993-06-10

Family

ID=10705808

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1992/002260 WO1993011631A1 (en) 1991-12-06 1992-12-04 Solid state sensor arrangement for video camera

Country Status (2)

Country Link
GB (2) GB9125954D0 (en)
WO (1) WO1993011631A1 (en)

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0710039A3 (en) * 1994-10-25 1996-11-13 Toshiba Kk Video camera apparatus
EP0773673A1 (en) * 1995-05-31 1997-05-14 Sony Corporation Image pickup device, method of manufacturing the device, image pickup adapter, signal processor, signal processing method, information processor, and information processing method
US5763943A (en) * 1996-01-29 1998-06-09 International Business Machines Corporation Electronic modules with integral sensor arrays
WO2000067488A1 (en) * 1999-04-30 2000-11-09 Foveon, Inc. Color separation prisms having solid-state imagers mounted thereon
EP1067780A2 (en) * 1999-06-30 2001-01-10 Canon Kabushiki Kaisha Image pickup apparatus
EP1104908A1 (en) * 1998-09-16 2001-06-06 Lsi Card Corporation Optical apparatus, fingerprint pickup apparatus and fingerprint pickup method
EP1173009A1 (en) * 2000-06-22 2002-01-16 Mitsubishi Denki Kabushiki Kaisha Image pick-up apparatus and portable telephone utilizing the same
EP1067802A3 (en) * 1999-06-30 2002-11-27 Canon Kabushiki Kaisha Colour image pickup apparatus
EP1067779A3 (en) * 1999-06-30 2002-11-27 Canon Kabushiki Kaisha Image pickup apparatus
US6833873B1 (en) 1999-06-30 2004-12-21 Canon Kabushiki Kaisha Image pickup apparatus
WO2005018221A1 (en) * 2003-08-06 2005-02-24 Eastman Kodak Company Alignment of lens array images using autocorrelation
US6885404B1 (en) 1999-06-30 2005-04-26 Canon Kabushiki Kaisha Image pickup apparatus
WO2005057922A1 (en) * 2003-12-11 2005-06-23 Nokia Corporation Imaging device
WO2006027405A1 (en) * 2004-09-09 2006-03-16 Nokia Corporation Method of creating colour image, imaging device and imaging module
EP1646249A1 (en) * 2004-10-08 2006-04-12 Dialog Semiconductor GmbH Single chip stereo image pick-up system with dual array design
FR2880137A1 (en) * 2004-12-24 2006-06-30 Atmel Grenoble Soc Par Actions Image sensor for e.g. camera, has optical system projecting image of scene on network of light-sensitive zones which are divided into two matrices and including optical sub-assemblies for projecting scene on matrices
FR2880958A1 (en) * 2005-01-19 2006-07-21 Dxo Labs Sa Digital image`s color sharpness improving method for e.g. digital photo apparatus, involves choosing bold color among colors, and reflecting sharpness of bold color on another improved color
EP1695545A2 (en) * 2003-12-18 2006-08-30 Agilent Technologies, Inc. Color image sensor with imaging elements imaging on respective regions of sensor elements
WO2006095110A2 (en) * 2005-03-07 2006-09-14 Dxo Labs Method of controlling an action, such as a sharpness modification, using a colour digital image
US7235831B2 (en) 1999-02-25 2007-06-26 Canon Kabushiki Kaisha Light-receiving element and photoelectric conversion device
KR100868279B1 (en) * 2007-03-07 2008-11-11 노키아 코포레이션 Method of creating colour image, imaging device and imaging module
US7474349B2 (en) 2002-12-26 2009-01-06 Canon Kabushiki Kaisha Image-taking apparatus
WO2009026064A2 (en) * 2007-08-21 2009-02-26 Micron Technology, Inc. Multi-array sensor with integrated sub-array for parallax detection and photometer functionality
WO2009025959A1 (en) * 2007-08-21 2009-02-26 Micron Technology, Inc. De-parallax method and apparatus for lateral sensor arrays
WO2012024044A1 (en) * 2010-08-17 2012-02-23 Apple Inc. Image capture using luminance and chrominance sensors
WO2012058037A1 (en) * 2010-10-28 2012-05-03 Eastman Kodak Company Camera with sensors having different color patterns
US8405727B2 (en) 2008-05-01 2013-03-26 Apple Inc. Apparatus and method for calibrating image capture devices
US8502926B2 (en) 2009-09-30 2013-08-06 Apple Inc. Display system having coherent and incoherent light sources
US8527908B2 (en) 2008-09-26 2013-09-03 Apple Inc. Computer user interface system and methods
US8538084B2 (en) 2008-09-08 2013-09-17 Apple Inc. Method and apparatus for depth sensing keystoning
US8619128B2 (en) 2009-09-30 2013-12-31 Apple Inc. Systems and methods for an imaging system using multiple image sensors
US9113078B2 (en) 2009-12-22 2015-08-18 Apple Inc. Image capture device having tilt and/or perspective correction
US9171217B2 (en) 2002-05-03 2015-10-27 Magna Electronics Inc. Vision system for vehicle
US9191634B2 (en) 2004-04-15 2015-11-17 Magna Electronics Inc. Vision system for vehicle
US9356061B2 (en) 2013-08-05 2016-05-31 Apple Inc. Image sensor with buried light shield and vertical gate
US9436880B2 (en) 1999-08-12 2016-09-06 Magna Electronics Inc. Vehicle vision system
US9440535B2 (en) 2006-08-11 2016-09-13 Magna Electronics Inc. Vision system for vehicle
US10531067B2 (en) 2017-03-26 2020-01-07 Apple Inc. Enhancing spatial resolution in a stereo camera imaging system
US11951900B2 (en) 2023-04-10 2024-04-09 Magna Electronics Inc. Vehicular forward viewing image capture system

Families Citing this family (5)

Publication number Priority date Publication date Assignee Title
GB2338080B (en) 1998-06-05 2003-05-21 Imco Electro Optics Ltd Imaging arrangement and method
FR2781929B1 (en) 1998-07-28 2002-08-30 St Microelectronics Sa IMAGE SENSOR WITH PHOTODIODE ARRAY
FR2820883B1 (en) 2001-02-12 2003-06-13 St Microelectronics Sa HIGH CAPACITY PHOTODIODE
FR2820882B1 (en) 2001-02-12 2003-06-13 St Microelectronics Sa THREE TRANSISTOR PHOTODETECTOR
FR2824665B1 (en) * 2001-05-09 2004-07-23 St Microelectronics Sa CMOS TYPE PHOTODETECTOR

Citations (4)

Publication number Priority date Publication date Assignee Title
EP0043721A2 (en) * 1980-07-03 1982-01-13 Xerox Corporation Device for scanning coloured originals
EP0248687A1 (en) * 1986-05-02 1987-12-09 AEROSPATIALE Société Nationale Industrielle Charge transfer opto-electronic multifield sensor
GB2240444A (en) * 1985-12-20 1991-07-31 Philips Electronic Associated Imaging array devices and staring array imaging systems
WO1992015036A1 (en) * 1991-02-23 1992-09-03 Vlsi Vision Limited Lens system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS57157683A (en) * 1981-03-25 1982-09-29 Canon Inc Reader for color manuscript
JPS61154390A (en) * 1984-12-27 1986-07-14 Toshiba Corp Solid-state image pick-up device
JPS63107389A (en) * 1986-10-24 1988-05-12 Toshiba Corp Image pickup device for stereoscopic vision
JPH0647560A (en) * 1992-07-24 1994-02-22 Mitsubishi Materials Corp Manufacture of soldering spacer

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 10, no. 356 (E-459)(2412) 29 November 1986 *
PATENT ABSTRACTS OF JAPAN vol. 12, no. 351 (E-660)(3198) 20 September 1988 *
PATENT ABSTRACTS OF JAPAN vol. 6, no. 259 (E-149)(1137) 17 December 1982 *
PATENT ABSTRACTS OF JAPAN vol. 9, no. 178 (E-330) 23 July 1985 *

Cited By (86)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5940126A (en) * 1994-10-25 1999-08-17 Kabushiki Kaisha Toshiba Multiple image video camera apparatus
EP0710039A3 (en) * 1994-10-25 1996-11-13 Toshiba Kk Video camera apparatus
EP1357741A1 (en) * 1995-05-31 2003-10-29 Sony Corporation Image pickup apparatus
EP0773673A1 (en) * 1995-05-31 1997-05-14 Sony Corporation Image pickup device, method of manufacturing the device, image pickup adapter, signal processor, signal processing method, information processor, and information processing method
EP0773673A4 (en) * 1995-05-31 2001-05-23 Sony Corp Image pickup device, method of manufacturing the device, image pickup adapter, signal processor, signal processing method, information processor, and information processing method
US5763943A (en) * 1996-01-29 1998-06-09 International Business Machines Corporation Electronic modules with integral sensor arrays
US5869896A (en) * 1996-01-29 1999-02-09 International Business Machines Corporation Packaged electronic module and integral sensor array
US5907178A (en) * 1996-01-29 1999-05-25 International Business Machines Corporation Multi-view imaging apparatus
EP1104908A1 (en) * 1998-09-16 2001-06-06 Lsi Card Corporation Optical apparatus, fingerprint pickup apparatus and fingerprint pickup method
US7235831B2 (en) 1999-02-25 2007-06-26 Canon Kabushiki Kaisha Light-receiving element and photoelectric conversion device
WO2000067488A1 (en) * 1999-04-30 2000-11-09 Foveon, Inc. Color separation prisms having solid-state imagers mounted thereon
US6614478B1 (en) 1999-04-30 2003-09-02 Foveon, Inc. Color separation prisms having solid-state imagers mounted thereon and camera employing same
US6859229B1 (en) 1999-06-30 2005-02-22 Canon Kabushiki Kaisha Image pickup apparatus
EP1067780A3 (en) * 1999-06-30 2002-11-27 Canon Kabushiki Kaisha Image pickup apparatus
EP1067779A3 (en) * 1999-06-30 2002-11-27 Canon Kabushiki Kaisha Image pickup apparatus
US6833873B1 (en) 1999-06-30 2004-12-21 Canon Kabushiki Kaisha Image pickup apparatus
EP1067802A3 (en) * 1999-06-30 2002-11-27 Canon Kabushiki Kaisha Colour image pickup apparatus
EP1067780A2 (en) * 1999-06-30 2001-01-10 Canon Kabushiki Kaisha Image pickup apparatus
US6882368B1 (en) 1999-06-30 2005-04-19 Canon Kabushiki Kaisha Image pickup apparatus
US6885404B1 (en) 1999-06-30 2005-04-26 Canon Kabushiki Kaisha Image pickup apparatus
US6980248B1 (en) 1999-06-30 2005-12-27 Canon Kabushiki Kaisha Image pickup apparatus
US9436880B2 (en) 1999-08-12 2016-09-06 Magna Electronics Inc. Vehicle vision system
EP1173009A1 (en) * 2000-06-22 2002-01-16 Mitsubishi Denki Kabushiki Kaisha Image pick-up apparatus and portable telephone utilizing the same
US7030926B2 (en) 2000-06-22 2006-04-18 Mitsubishi Denki Kabushiki Kaisha Image pick-up apparatus and portable telephone utilizing the same
US9834216B2 (en) 2002-05-03 2017-12-05 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US11203340B2 (en) 2002-05-03 2021-12-21 Magna Electronics Inc. Vehicular vision system using side-viewing camera
US10683008B2 (en) 2002-05-03 2020-06-16 Magna Electronics Inc. Vehicular driving assist system using forward-viewing camera
US9171217B2 (en) 2002-05-03 2015-10-27 Magna Electronics Inc. Vision system for vehicle
US10351135B2 (en) 2002-05-03 2019-07-16 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US10118618B2 (en) 2002-05-03 2018-11-06 Magna Electronics Inc. Vehicular control system using cameras and radar sensor
US9555803B2 (en) 2002-05-03 2017-01-31 Magna Electronics Inc. Driver assistance system for vehicle
US9643605B2 (en) 2002-05-03 2017-05-09 Magna Electronics Inc. Vision system for vehicle
US7474349B2 (en) 2002-12-26 2009-01-06 Canon Kabushiki Kaisha Image-taking apparatus
WO2005018221A1 (en) * 2003-08-06 2005-02-24 Eastman Kodak Company Alignment of lens array images using autocorrelation
US7593597B2 (en) 2003-08-06 2009-09-22 Eastman Kodak Company Alignment of lens array images using autocorrelation
US7453510B2 (en) * 2003-12-11 2008-11-18 Nokia Corporation Imaging device
WO2005057922A1 (en) * 2003-12-11 2005-06-23 Nokia Corporation Imaging device
EP1695545A4 (en) * 2003-12-18 2009-12-02 Agilent Technologies Inc Color image sensor with imaging elements imaging on respective regions of sensor elements
EP1695545A2 (en) * 2003-12-18 2006-08-30 Agilent Technologies, Inc. Color image sensor with imaging elements imaging on respective regions of sensor elements
US10110860B1 (en) 2004-04-15 2018-10-23 Magna Electronics Inc. Vehicular control system
US10735695B2 (en) 2004-04-15 2020-08-04 Magna Electronics Inc. Vehicular control system with traffic lane detection
US9609289B2 (en) 2004-04-15 2017-03-28 Magna Electronics Inc. Vision system for vehicle
US11847836B2 (en) 2004-04-15 2023-12-19 Magna Electronics Inc. Vehicular control system with road curvature determination
US10015452B1 (en) 2004-04-15 2018-07-03 Magna Electronics Inc. Vehicular control system
US10187615B1 (en) 2004-04-15 2019-01-22 Magna Electronics Inc. Vehicular control system
US11503253B2 (en) 2004-04-15 2022-11-15 Magna Electronics Inc. Vehicular control system with traffic lane detection
US9948904B2 (en) 2004-04-15 2018-04-17 Magna Electronics Inc. Vision system for vehicle
US9428192B2 (en) 2004-04-15 2016-08-30 Magna Electronics Inc. Vision system for vehicle
US10306190B1 (en) 2004-04-15 2019-05-28 Magna Electronics Inc. Vehicular control system
US9736435B2 (en) 2004-04-15 2017-08-15 Magna Electronics Inc. Vision system for vehicle
US9191634B2 (en) 2004-04-15 2015-11-17 Magna Electronics Inc. Vision system for vehicle
US10462426B2 (en) 2004-04-15 2019-10-29 Magna Electronics Inc. Vehicular control system
WO2006027405A1 (en) * 2004-09-09 2006-03-16 Nokia Corporation Method of creating colour image, imaging device and imaging module
EP1646249A1 (en) * 2004-10-08 2006-04-12 Dialog Semiconductor GmbH Single chip stereo image pick-up system with dual array design
FR2880137A1 (en) * 2004-12-24 2006-06-30 Atmel Grenoble Soc Par Actions Image sensor for e.g. camera, has optical system projecting image of scene on network of light-sensitive zones which are divided into two matrices and including optical sub-assemblies for projecting scene on matrices
FR2880958A1 (en) * 2005-01-19 2006-07-21 Dxo Labs Sa Digital image's color sharpness improving method for e.g. digital photo apparatus, involves choosing bold color among colors, and reflecting sharpness of bold color on another improved color
WO2006095110A2 (en) * 2005-03-07 2006-09-14 Dxo Labs Method of controlling an action, such as a sharpness modification, using a colour digital image
US8212889B2 (en) 2005-03-07 2012-07-03 Dxo Labs Method for activating a function, namely an alteration of sharpness, using a colour digital image
US7920172B2 (en) 2005-03-07 2011-04-05 Dxo Labs Method of controlling an action, such as a sharpness modification, using a colour digital image
WO2006095110A3 (en) * 2005-03-07 2006-11-02 Dxo Labs Method of controlling an action, such as a sharpness modification, using a colour digital image
US11396257B2 (en) 2006-08-11 2022-07-26 Magna Electronics Inc. Vehicular forward viewing image capture system
US9440535B2 (en) 2006-08-11 2016-09-13 Magna Electronics Inc. Vision system for vehicle
US10071676B2 (en) 2006-08-11 2018-09-11 Magna Electronics Inc. Vision system for vehicle
US10787116B2 (en) 2006-08-11 2020-09-29 Magna Electronics Inc. Adaptive forward lighting system for vehicle comprising a control that adjusts the headlamp beam in response to processing of image data captured by a camera
US11148583B2 (en) 2006-08-11 2021-10-19 Magna Electronics Inc. Vehicular forward viewing image capture system
US11623559B2 (en) 2006-08-11 2023-04-11 Magna Electronics Inc. Vehicular forward viewing image capture system
KR100868279B1 (en) * 2007-03-07 2008-11-11 노키아 코포레이션 Method of creating colour image, imaging device and imaging module
TWI413408B (en) * 2007-08-21 2013-10-21 Aptina Imaging Corp De-parallax methods and apparatuses for lateral sensor arrays
US7782364B2 (en) 2007-08-21 2010-08-24 Aptina Imaging Corporation Multi-array sensor with integrated sub-array for parallax detection and photometer functionality
WO2009026064A2 (en) * 2007-08-21 2009-02-26 Micron Technology, Inc. Multi-array sensor with integrated sub-array for parallax detection and photometer functionality
WO2009025959A1 (en) * 2007-08-21 2009-02-26 Micron Technology, Inc. De-parallax method and apparatus for lateral sensor arrays
WO2009026064A3 (en) * 2007-08-21 2009-04-23 Micron Technology Inc Multi-array sensor with integrated sub-array for parallax detection and photometer functionality
US8405727B2 (en) 2008-05-01 2013-03-26 Apple Inc. Apparatus and method for calibrating image capture devices
US8538084B2 (en) 2008-09-08 2013-09-17 Apple Inc. Method and apparatus for depth sensing keystoning
US8527908B2 (en) 2008-09-26 2013-09-03 Apple Inc. Computer user interface system and methods
US8619128B2 (en) 2009-09-30 2013-12-31 Apple Inc. Systems and methods for an imaging system using multiple image sensors
US8502926B2 (en) 2009-09-30 2013-08-06 Apple Inc. Display system having coherent and incoherent light sources
US9113078B2 (en) 2009-12-22 2015-08-18 Apple Inc. Image capture device having tilt and/or perspective correction
US9565364B2 (en) 2009-12-22 2017-02-07 Apple Inc. Image capture device having tilt and/or perspective correction
US8497897B2 (en) 2010-08-17 2013-07-30 Apple Inc. Image capture using luminance and chrominance sensors
WO2012024044A1 (en) * 2010-08-17 2012-02-23 Apple Inc. Image capture using luminance and chrominance sensors
WO2012058037A1 (en) * 2010-10-28 2012-05-03 Eastman Kodak Company Camera with sensors having different color patterns
US9356061B2 (en) 2013-08-05 2016-05-31 Apple Inc. Image sensor with buried light shield and vertical gate
US9842875B2 (en) 2013-08-05 2017-12-12 Apple Inc. Image sensor with buried light shield and vertical gate
US10531067B2 (en) 2017-03-26 2020-01-07 Apple Inc. Enhancing spatial resolution in a stereo camera imaging system
US11951900B2 (en) 2023-04-10 2024-04-09 Magna Electronics Inc. Vehicular forward viewing image capture system

Also Published As

Publication number Publication date
GB9125954D0 (en) 1992-02-05
GB2276512A (en) 1994-09-28
GB9409608D0 (en) 1994-07-13
GB2276512B (en) 1995-03-01

Similar Documents

Publication Publication Date Title
WO1993011631A1 (en) Solid state sensor arrangement for video camera
US10708492B2 (en) Array camera configurations incorporating constituent array cameras and constituent cameras
EP1031239B1 (en) Optoelectronic camera and method for image formatting in the same
CN101995758B (en) Imaging device and video recording/reproducing system
EP1987379B1 (en) Intergrated lens system for image sensor and method for manufacturing the same
US6882368B1 (en) Image pickup apparatus
US20170168199A1 (en) Method of Producing a Focal Plane Array for a Multi-Aperture Camera Core
US20120268574A1 (en) Imager integrated circuit and stereoscopic image capture device
US9300885B2 (en) Imaging device, portable information terminal, and imaging system
US20110169918A1 (en) 3d image sensor and stereoscopic camera having the same
KR20100084524A (en) Liquid optics zoom lens and imaging apparatus
GB2488519A (en) Multi-channel image sensor incorporating lenslet array and overlapping fields of view
EP1751600A1 (en) Compact, wide-field-of-view imaging optical system
US7777970B2 (en) Super-wide-angle lens and imaging system having same
US8675043B2 (en) Image recording system providing a panoramic view
US6980248B1 (en) Image pickup apparatus
US6618093B1 (en) Distortion-free imaging device having curved photosensor
US11402727B2 (en) Optical apparatus and imaging system including the same
US9154704B2 (en) Radial FPA based electro-optic imager
US7474349B2 (en) Image-taking apparatus
US20240004167A1 (en) Imaging lens and imaging apparatus

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): GB JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
122 Ep: pct application non-entry in european phase