WO2014049333A1 - Imaging device and method - Google Patents

Imaging device and method

Info

Publication number
WO2014049333A1
WO2014049333A1 (PCT/GB2013/052421)
Authority
WO
WIPO (PCT)
Prior art keywords
array
light
light redirection
imaging device
redirection units
Prior art date
Application number
PCT/GB2013/052421
Other languages
French (fr)
Inventor
Colin Jonathan Hughes
Original Assignee
Sony Computer Entertainment Europe Limited
Priority date
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Limited
Publication of WO2014049333A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/001Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B13/0085Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras employing wafer level optics
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/0816Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements
    • G02B26/0833Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements the reflecting element being a micromechanical device, e.g. a MEMS mirror, DMD

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)

Abstract

An imaging device comprises an array of light sensor pixels, an array of steerable light redirection units, respectively operable to redirect incident light onto at least one sensor pixel of the array of light sensor pixels, and an input operable to receive steering configuration data to configure the respective steering of light redirection units in the array of light redirection units, so as to collectively form a virtual lens. The imaging device may then refocus by reconfiguring respective angles of light redirection units in the array of steerable light redirection units, so as to collectively form a new virtual lens with a different focal length.

Description

IMAGING DEVICE AND METHOD
BACKGROUND OF THE INVENTION
Field of the invention
The present invention relates to an imaging device and method.
Description of the Prior Art
The "background" description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present invention.
Conventional digital cameras utilise a lens or lens array to focus light on to an imaging plane that is typically occupied by a charge coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensor.
With the advent of phones, PDAs and tablets, it has become desirable to include a camera function for taking photographs and/or for making video calls. However, there is a desire for such devices to be very thin, making the available length of the optical path between a lens and a sensor very small. In response, sensors for such portable devices also tend to be proportionately small so that it is still possible to focus an image on them. However, as a result the quality of the captured image tends to be poorer (noisier and/or of lower resolution) than that obtained from conventional digital cameras with larger sensors.
The present invention seeks to address or mitigate this problem.
SUMMARY OF THE INVENTION
The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings. In a first aspect, an imaging device is provided in accordance with claim 1. In another aspect, an electronic device is provided in accordance with claim 6. In another aspect, an imaging method is provided in accordance with claim 10.
Further respective aspects and features of the invention are defined in the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Figure 1A is a schematic diagram of the front side of an electronic device in accordance with an embodiment of the present invention.
Figure 1B is a schematic diagram of the rear side of an electronic device in accordance with an embodiment of the present invention.
Figure 2 is a schematic diagram of an image sensor and optical layer in accordance with an embodiment of the present invention.
Figure 3 is a schematic diagram of a light redirection unit in accordance with an embodiment of the present invention.
Figure 4 is a schematic diagram of a light redirection unit in accordance with an embodiment of the present invention.
Figure 5 is a schematic diagram of a light redirection unit in accordance with an embodiment of the present invention.
Figure 6 is a flow diagram of an imaging method in accordance with an embodiment of the present invention.
DESCRIPTION OF THE EMBODIMENTS
An imaging device and method are disclosed. In the following description, a number of specific details are presented in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to a person skilled in the art that these specific details need not be employed to practise the present invention. Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity where appropriate.
Referring to Figure 1A, in an embodiment of the present invention a portable device 10, such as a tablet computer, PDA, media player, slim-line camera and/or mobile phone, comprises a housing 20 and a display 30. Referring now also to Figure 1B, on the rear surface of the portable device is located an imaging device or camera unit 40 in accordance with an embodiment of the present invention.

Referring now also to Figure 2, in an embodiment of the present invention, the imaging device 40 comprises a sensor array (such as a CCD or CMOS sensor) comprising sensor pixels, and a programmable optical layer. These may be combined within an array of light redirection units 50(1...N). Notably however, the imaging device does not comprise a physical lens arranged to focus light for the whole sensor, or even a substantial part of it.
As noted above, the programmable optical layer may be formed by an array of light redirection units 50, having a similar construction and principle to digital micromirror devices found in digital light processing projectors and displays. In one embodiment of the present invention, the light redirection units refract light through a refraction surface onto a sensor pixel rather than reflecting the light. Accordingly, the devices can be programmably moved to refract or otherwise guide light through themselves at different respective angles at their different respective positions on the sensor array, thereby collectively simulating a virtual lens having the light bending properties of a predetermined conventional lens. Hence whilst the angles are exaggerated in Figure 2 for the purposes of explanation, it will be appreciated that the array of light redirection units can be programmed to focus on a point on the optical axis of the virtual lens at a selected focal distance. Subsequently, a proportionate change in the angles of the light redirection units results in a shift in focal distance.

Referring now also to Figure 3, in an embodiment of the present invention, the light redirection unit comprises a microrefractor 52 mounted on a yoke 54, itself supported by a resilient torsion member or spring 56. Electrostatic pads 58R,L then provide the force to tilt the microrefractor, with the torsion member restoring the microrefractor to a neutral position once the force is removed. Functionally equivalent variants of this arrangement corresponding to functionally equivalent variants of micromirrors are considered to be within the scope of the embodiment.
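By way of a non-limiting numerical sketch (Python; the small-angle, thin-deflector model and all names are illustrative assumptions rather than part of the embodiment), each light redirection unit at radial offset r from the optical axis must deflect incoming axis-parallel light by atan(r/f) for the rays to converge at a focal point at distance f:

```python
import math

def virtual_lens_angles(radii_um, focal_distance_um):
    # Deflection angle (radians) for each unit: a unit at radial offset r
    # must bend axis-parallel light by atan(r / f) to reach the focal point.
    return [math.atan(r / focal_distance_um) for r in radii_um]

# Units 10, 100 and 1000 micrometres from the optical axis, focusing at 1 m.
angles = virtual_lens_angles([10, 100, 1000], 1_000_000)
# Even at the edge of this array the required deflection is under 0.06 degrees,
# comfortably within typical MEMS tilt ranges.
```

As the sketch suggests, the required deflection grows (to first order) linearly with distance from the optical axis, which is why the steering of each unit depends on its radial position in the array.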
The corresponding sensor pixel itself (44) may form some or all of the substrate upon which the microrefractor device is mounted, and may sit either above, below or adjacent to a static RAM cell (42) used to drive the device. To compensate for the obscuring of light by the device, the microrefractor or the substrate may comprise a microlens that focusses the light received over an area of the microrefractor device onto a smaller area of the substrate comprising the sensor pixel. As such, each microrefractor device operates as a steerable waveguide to redirect light, and can be controlled by the portable device to form part of a virtual lens simulated by the light redirection unit array.
Notably, this differs from conventional digital cameras that use a single lens (or lens group) to project light onto the sensor, and also differs from so-called plenoptic or light-field cameras, where a fixed microlens array is positioned in front of the sensor array and used to capture light from a plurality of directions. The captured image is thus not a conventional focussed image, but is used as the input to an algorithm that constructs a focussed image from it. It will be appreciated that if a microrefractor device has one axis of rotation about the torsion member, then in order to form a virtual lens the axis of the torsion member should be tangential to a radial line from the optical axis of the virtual lens. As a result, the torsion members, of the light redirection units, if viewed alone, would lie on the circumferences of concentric circles centred on the optical axis of the virtual lens.
Referring now to Figure 4, in an alternative embodiment of the present invention a different type of steerable microrefractor device 50' may be used. In such an embodiment, the microrefractor is mounted on a resilient post 55 that restores the microrefractor to its neutral position once a force acting upon it has been removed. In this case, two pairs of electrostatic pads, 58XA,B and 58YA,B, are provided to allow for force to be applied in two directions.
As a result, the microrefractor devices can be positioned rectilinearly on the sensor, with the two-dimensional steering of the microrefractor allowing each refractor to tilt toward the optical axis of the virtual lens if desired. This simplifies construction of the combined sensor and microrefractor array. Again, the microrefractor or substrate may also comprise a microlens to concentrate light onto a subsection of the substrate comprising the light sensor.
It will be appreciated that due to the close proximity of the microrefractor to the sensor (typically in the order of micrometres) when compared to the distance to the focal point, the actual refractive angles required to achieve a focal point, for example 1 m from the sensor, are very small and well within the capabilities of such devices.
In another alternative embodiment, a liquid filled microlens (not shown) is distorted by electrostatic forces to similar effect; that is to say, the effective degree and/or direction of refraction through the liquid lens is modified by the electrostatic forces so that an array of such microlenses can again be selectively steered to form a virtual lens focussing at a point F on the optical axis of the virtual lens, in a manner similar to that shown for the microrefractors in Figure 2. Hence such liquid filled microlenses can be treated as functionally equivalent to the microrefractors described previously.
Referring to Figure 5, in an exemplary embodiment, a micromirror is used, rather than a microrefractor. In this case, the micromirror in the light redirection unit 60 is mounted behind the imaging sensor 44 and the imaging sensor is mounted on a transparent substrate 34 and arranged to allow light to pass through selective areas of the sensor, so as to reflect off the mirror 62 and onto the sensor. The mirror may again be mounted on a resilient post 65 on a substrate 64 to allow movement on two axes in the manner described previously for the microrefractor of Figure 4, or may be mounted on a torsion member (not shown) and oriented in the manner described previously for the microrefractor of Figure 3. The sensor 44' may for example comprise red, green and blue sub-pixel sensors of a sensor pixel. Again, one or more microlenses may be affixed to the micromirror or on the substrate to increase the amount of light transferred to the sensor.
Hence more generally, an array of microrefractors or micromirrors may be thought of as a light redirection layer in which the direction of light is changed by a respective amount at each redirection layer pixel (i.e. by each microrefractor or micromirror), either passing through the pixel in the case of a microrefractor, or being reflected from it in the case of a micromirror.

It will be appreciated that the microrefractor and micromirror devices described previously, and more generally referred to herein as light redirection units, are able to change position very rapidly. Hence for example, under command from a processor that itself is under suitable software instruction, the light redirection units shown in Figure 2 could halve their respective angles to the imaging plane simultaneously, and do so in milliseconds. As a result, the focal point F of the new virtual lens would be positioned further from the imaging plane in the same amount of time.
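The refocusing relationship described above can be checked under the same illustrative small-angle model (an assumption for this sketch, not a prescription of the embodiment): since r = f·tan θ, halving every unit's angle moves the focal point to very nearly twice the distance:

```python
import math

def focal_distance(radius_um, angle_rad):
    # Focal distance implied by a unit at the given radial offset
    # deflecting light by the given angle.
    return radius_um / math.tan(angle_rad)

r = 500.0                          # micrometres from the optical axis
theta = math.atan(r / 1_000_000)   # angle that focuses at 1 m
f_new = focal_distance(r, theta / 2)
# Halving the tilt shifts the focal point to almost exactly 2 m.
```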
Consequently, embodiments of the present invention allow for rapid changes in focus without the use of additional optics, by emulating successive lenses that focus at the desired distance. This in turn enables cameras in flat devices such as smartphones and tablets to have selective or controllable focus, and/or autofocus, rather than the conventional fixed focus at infinity.
In addition, embodiments of the present invention improve light sensitivity, as there is no additional aperture between the programmable optical layer and the subject to block light.
It will also be appreciated that in some embodiments of the present invention, the array of light redirection units can be arranged to emulate a wide variety of lenses for purposes other than focal control. For example, the light redirection units may emulate a small lens array to generate lightfield / plenoptic images, or may form two separate optical axes to generate stereoscopic images on a single sensor.
In an embodiment of the present invention, the image generated by the sensor is processed by an image processor (for example the CPU of the device operating under suitable software instruction), in conjunction with the known angles of the light redirection units, to correct for any defect or distortion in the image, in a manner similar to computing an image from a lightfield.
This may be of particular relevance if the image sensor has a higher resolution than the light redirection unit array. Hence for example common light redirection unit array sizes may be 640x480, 800x600, 1280x720, or 1920x1080. Meanwhile the sensor resolution may be 1920x1080 in each case, or even higher. Hence where the sensor resolution is higher than the light redirection unit array resolution, each light redirection unit (and optionally an associated microlens) would act as a local lens for a set of sensor pixels to form a lightfield image. The lightfield image may then be processed to generate the final image in conjunction with information about the angle of each light redirection unit.
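Where the sensor out-resolves the light redirection unit array, the mapping of sensor pixels to units reduces to the ratio of the two grids. A minimal sketch (the 3840x2160 sensor here is a hypothetical instance of the "even higher" resolution mentioned above):

```python
def pixels_per_unit(sensor_res, unit_res):
    # Number of sensor pixels (width, height) served by each light
    # redirection unit, assuming the unit grid tiles the sensor evenly.
    sw, sh = sensor_res
    uw, uh = unit_res
    if sw % uw or sh % uh:
        raise ValueError("sensor grid must tile evenly over the unit grid")
    return sw // uw, sh // uh

# Each unit (with its optional microlens) acts as a local lens for a
# 3x3 tile of sensor pixels, yielding a lightfield image.
block = pixels_per_unit((3840, 2160), (1280, 720))
```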
Other image processing may be provided as appropriate. For example, if a light redirection unit fails, the associated pixel(s) in the image may be incorrect. In this case, a conventional noise reduction algorithm may be used to eliminate isolated pixel(s) of a colour that differs by a threshold amount from those of its neighbours (for example, by determining the variance in the colours of neighbouring pixels, and then estimating whether the current pixel colour exceeds a predetermined number of standard deviations from the mean of the colours of the neighbouring pixels). The granularity of this process would match that of the number of sensor pixels associated with each light redirection unit.
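The isolated-pixel correction outlined above might be sketched as follows (a simplified greyscale version at a granularity of one sensor pixel per unit; the eight-pixel neighbourhood, threshold k, and replacement-by-mean policy are illustrative choices, not specified by the embodiment):

```python
import statistics

def correct_isolated_pixels(img, k=3.0):
    # Replace any interior pixel lying more than k standard deviations
    # from the mean of its eight neighbours with that neighbourhood mean.
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neigh = [img[y + dy][x + dx]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0)]
            mean = statistics.mean(neigh)
            sd = statistics.pstdev(neigh)
            dev = abs(img[y][x] - mean)
            # In a perfectly flat neighbourhood (sd == 0), any deviation
            # marks the pixel as the output of a failed redirection unit.
            if dev > k * sd and dev > 0:
                out[y][x] = mean
    return out

# A flat patch with one stuck pixel from a hypothetical failed unit:
img = [[10] * 5 for _ in range(5)]
img[2][2] = 255
fixed = correct_isolated_pixels(img)
```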
Hence, in a summary embodiment of the present invention, an imaging device (e.g. camera unit 40) comprises an array of light sensor pixels 44 (e.g. as part of the array of light redirection units 50(1...N), or as a separate substrate). In addition it comprises an array of steerable light redirection units 50(1...N) (either separate or incorporating the light sensor pixels), respectively operable to redirect incident light onto at least one sensor pixel of the array of light sensor pixels. In addition, it will have an input (not shown) operable to receive steering configuration data to configure the respective steering of light redirection units in the array of light redirection units, so as to collectively form a virtual lens. The input may be a matrix addressable input to allow each light redirection unit to receive a respective instruction. Optionally, assuming a fixed optical axis and symmetrical virtual lens, only inputs for a quadrant of the array may be input, and these may then be mirrored within the imaging device to generate a full set of data.

In an instance of the summary embodiment, the imaging device is operable to refocus by reconfiguring respective angles of light redirection units in the array of steerable light redirection units, so as to collectively form a new virtual lens with a different focal length. Hence as described with respect to Figure 2 previously herein, by changing the relative angles of the light redirection units with respect to the imaging plane, the effective focal length of the virtual lens is changed.
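The optional quadrant mirroring could be realised as below (the (tilt_x, tilt_y) representation of the steering configuration data is an assumption for illustration; the embodiment does not prescribe a data format). Reflecting the top-left quadrant across the vertical axis negates the x-tilt, and across the horizontal axis negates the y-tilt, so every unit still tilts toward the central optical axis:

```python
def expand_quadrant(quad):
    # Expand the top-left quadrant of (tilt_x, tilt_y) pairs into the full
    # steering array of a symmetric virtual lens by mirroring with sign flips.
    full = []
    for row in quad:
        full.append(row + [(-tx, ty) for tx, ty in reversed(row)])
    for row in reversed(full[:len(quad)]):
        full.append([(tx, -ty) for tx, ty in row])
    return full

# A 2x2 quadrant of tilts expands to a full 4x4 virtual-lens configuration.
quad = [[(1.0, 1.0), (0.5, 1.0)],
        [(1.0, 0.5), (0.5, 0.5)]]
lens = expand_quadrant(quad)
```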
In an instance of the summary embodiment the respective light redirection units refract light through at least a portion of the unit onto at least one sensor pixel, in a similar manner to that described with respect to Figures 3 and 4. In another instance of the summary embodiment the respective light redirection units reflect light onto at least one sensor pixel, in a similar manner to that described with respect to Figure 5.
In an instance of the summary embodiment the light redirection units are arranged to be steered on one axis (e.g. if mounted on a torsion member with one axis of rotation), and this axis is oriented circumferentially about an optical axis of the virtual lens (in other words, the axis of rotation is normal to a radial line extending from the optical axis).
In another instance of the summary embodiment the light redirection units are arranged to be steered on two axes, in a similar manner to that described with respect to Figure 4.
In an instance of the summary embodiment the refractive or reflective surface of the light redirection unit, or the surface comprising the sensor, comprises a microlens to better direct light onto the photoactive region(s) of the sensor.
In an instance of the summary embodiment, the imaging device is incorporated within an electronic device (10) such as a camera body, smartphone or tablet, or a flat-panel TV or other device, which comprises a processor (not shown) operable to transmit steering configuration data to the imaging device to form a virtual lens having a predetermined focal distance.
In an instance of the summary embodiment, the electronic device comprises a memory, and the processor is operable to receive data from the array of light sensor pixels and store the data as an image file in the memory.
In an instance of the summary embodiment, the processor of the electronic device is operable to identify and correct isolated pixel errors in the image, for example using known noise suppression techniques.
In an instance of the summary embodiment, the processor of the electronic device is operable to transmit steering configuration data to the imaging device to form a set of virtual lenses suitable for generating a lightfield on the array of light sensors, and to then compute an output image from lightfield data received from the array of light sensor pixels.

Referring now to Figure 6, an imaging method for an imaging device comprising an array of light sensor pixels and an array of steerable light redirection units respectively operable to refract incident light through at least a portion of the unit onto at least one sensor pixel of the array of light sensor pixels comprises:
- a first step (s10) of steering light redirection units in the array of light redirection units in response to steering configuration data so as to collectively form a virtual lens; and
- optionally a second step (s20) of reconfiguring respective angles of light redirection units in the array of steerable light redirection units, so as to collectively form a new virtual lens with a different focal length.
It will be apparent to a person skilled in the art that variations in the above method corresponding to operation of the various embodiments of the apparatus as described and claimed herein are considered within the scope of the present invention, including but not limited to:
orienting the light redirection units so that an axis of rotation of the refractor or reflector is parallel to the circumference of a circle centred on the assumed optical axis of a virtual lens;
transmitting the steering configuration data to the imaging device;
receiving data from the array of light sensor pixels and storing them;
identifying and correcting isolated pixel errors in the image; and
transmitting steering configuration data to the imaging device to form a set of virtual lenses suitable for generating a lightfield on the array of light sensors, and computing an output image from lightfield data received from the array of light sensor pixels.
It will be appreciated that methods disclosed herein may be carried out on conventional hardware suitably adapted as applicable by software instruction or by the inclusion or substitution of dedicated hardware.
Thus the required adaptation to existing parts of a conventional equivalent device may be implemented in the form of a tangible non-transitory computer program product or similar object of manufacture comprising processor implementable instructions stored on a data carrier such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable to use in adapting the conventional equivalent device. Separately, if applicable the computer program may take the form of a transmission via data signals on a network such as an Ethernet, a wireless network, the Internet, or any combination of these or other networks.
It will be appreciated that numerous modifications and variations of the present disclosure are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.

Claims

1. An imaging device, comprising
an array of light sensor pixels;
an array of steerable light redirection units, respectively operable to refract incident light through at least a portion of the unit onto at least one sensor pixel of the array of light sensor pixels; and
an input operable to receive steering configuration data to configure the respective steering of light redirection units in the array of light redirection units, so as to collectively form a virtual lens.
2. An imaging device according to claim 1 operable to refocus by reconfiguring respective angles of light redirection units in the array of steerable light redirection units, so as to collectively form a new virtual lens with a different focal length.
3. An imaging device according to any one of the preceding claims in which the light redirection units are arranged to be steered on one axis, and this axis is oriented circumferentially about an optical axis of the virtual lens.
4. An imaging device according to claim 1 or claim 2 in which the light redirection units are arranged to be steered on two axes.
5. An imaging device according to any one of the preceding claims in which respective light redirection units comprise a microlens.
6. An electronic device, comprising:
the imaging device of any one of claims 1 to 5; and
a processor; and in which:
the processor is operable to transmit steering configuration data to the imaging device to form a virtual lens having a predetermined focal distance.
7. An electronic device according to claim 6, comprising:
a memory; and in which the processor is operable to receive data from the array of light sensor pixels, and store the data as an image file in the memory.
8. An electronic device according to claim 6 or claim 7, in which:
the processor is operable to identify and correct isolated pixel errors in the image.
9. An electronic device according to any one of claims 6 to 8, in which
the processor is operable to transmit steering configuration data to the imaging device to form a set of virtual lenses suitable for generating a lightfield on the array of light sensors; and the processor is operable to compute an output image from lightfield data received from the array of light sensor pixels.
10. An imaging method for an imaging device comprising an array of light sensor pixels and an array of steerable light redirection units respectively operable to refract incident light through at least a portion of the unit onto at least one sensor pixel of the array of light sensor pixels, the method comprising the steps of:
steering light redirection units in the array of light redirection units in response to steering configuration data so as to collectively form a virtual lens.
11. An imaging method according to claim 10 comprising the step of:
reconfiguring respective angles of light redirection units in the array of steerable light redirection units, so as to collectively form a new virtual lens with a different focal length.
12. An imaging method according to claim 10 or claim 11, comprising the steps of:
transmitting steering configuration data to the imaging device to form a virtual lens having a predetermined focal distance;
receiving data from the array of light sensor pixels; and
storing the data as an image file.
13. A computer program for carrying out the steps of any preceding method claim.
PCT/GB2013/052421 2012-09-28 2013-09-17 Imaging device and method WO2014049333A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1217382.9A GB2506405A (en) 2012-09-28 2012-09-28 Imaging device with steerable light redirection units forming virtual lens
GB1217382.9 2012-09-28

Publications (1)

Publication Number Publication Date
WO2014049333A1 (en) 2014-04-03

Family

ID=47225367

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2013/052421 WO2014049333A1 (en) 2012-09-28 2013-09-17 Imaging device and method

Country Status (2)

Country Link
GB (1) GB2506405A (en)
WO (1) WO2014049333A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016175046A1 (en) * 2015-04-28 2016-11-03 Sony Corporation Image processing device and image processing method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0598546A1 (en) * 1992-11-16 1994-05-25 General Electric Company Lenticular lenses
US6124974A (en) * 1996-01-26 2000-09-26 Proxemics Lenslet array systems and methods
WO2003051189A2 (en) * 2001-12-14 2003-06-26 Technovision Gmbh Gesellschaft Für Die Entwicklung Medizinischer Technologien Improved hartmann-shack wavefront sensor apparatus and method
US20100244165A1 (en) * 2009-03-26 2010-09-30 Micron Technology, Inc. Method and apparatus providing combined spacer and optical lens element
US20120069209A1 (en) * 2010-09-22 2012-03-22 Qualcomm Mems Technologies, Inc. Lensless camera controlled via mems array
US20120119613 * 2010-11-15 2012-05-17 Tessera MEMS Technologies, Inc. Mems actuator device
US20120200829A1 (en) * 2011-02-09 2012-08-09 Alexander Bronstein Imaging and projecting devices and methods

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7339746B2 (en) * 2004-03-22 2008-03-04 Angstrom, Inc. Small and fast zoom system using micromirror array lens


Also Published As

Publication number Publication date
GB2506405A (en) 2014-04-02
GB201217382D0 (en) 2012-11-14


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13766632

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13766632

Country of ref document: EP

Kind code of ref document: A1