WO2004092825A1 - Method for exploring optical parameters of camera - Google Patents

Method for exploring optical parameters of camera

Info

Publication number
WO2004092825A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
projection
symmetric
center
image
Prior art date
Application number
PCT/IB2004/001106
Other languages
French (fr)
Inventor
Gwo-Jen Jan
Chuang-Jan Chang
Original Assignee
Appro Technology Inc.
Priority date
Filing date
Publication date
Application filed by Appro Technology Inc. filed Critical Appro Technology Inc.
Publication of WO2004092825A1 publication Critical patent/WO2004092825A1/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00: Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002: Diagnosis, testing or measuring for television systems or their details for television cameras
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M 11/00: Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M 11/02: Testing optical properties
    • G01M 11/0221: Testing optical properties by determining the optical axis or position of lenses

Definitions

  • FIG. 3 shows an embodiment of the physical central-symmetric pattern (PCP)
  • FIG. 4 cubically shows the 3-D optical paths between the PCP and the fisheye camera
  • FIG. 5A shows a schematic view of multi-collimated optical paths simulated by an
  • FIG. 5B shows a schematic view of the cubically optical paths highlighting a part of FIG. 5A
  • FIG. 6 shows the employed embodiment of the PCP in an experiment based on the
  • FIG. 7 shows the schematic view of a device arrangement performing the task of
  • FIG. 8A shows the imaged schematic view in an experiment imaged by the PCP
  • FIG. 8B shows the signal-intensity curves of the image contours shown in FIG. 8A
  • FIG. 9 shows an imaged schematic view transformed by a polar-coordinate process from FIG. 8A and
  • FIG. 10 shows the approaching curves for seeking the viewpoint in light of three
  • the fisheye lens is a non-linear perspective projection lens, which means its
  • the fisheye lens has the merits of a wide field of
  • the projection mechanism of a camera whose point of origin is termed a principal point.
  • optical axis of the camera. The geometrical optical model is well known to those skilled in the art.
  • PCP symmetric pattern
  • ICP 230 here means the optical axis 21 perpendicularly penetrates both the image center
  • cardinal point (BCP) 243 are both located on the optical axis 21. The position of the
  • optical axis 21 in space can be absolutely determined by referring to the target 22 because
  • the PCP 220 shown in FIG. 3 can be regarded as an arc-laid optical layout imitating
  • the VP can be positioned by referring to the test camera's physical body.
  • which is the angle extending from the optical axis 21 to the incident ray.
  • multicollimator can measure the image element, module or system of any circularly
  • the projection model being put in operation has no limitation on certain close
  • the multicollimator is also suitable in examining the fisheye
  • FIG. 3 shows an embodiment of the PCP 220 having a solid circular center and a plurality of center-symmetric geometric figures (such as the concentric circles therein)
  • FIG. 4 shows the planar target 22 in the 3-
  • fisheye camera wherein the fisheye lens 24 and the image plane 23 stand equivalently for
  • the incident rays cast from the PCP 220 will certainly and essentially achieve a
  • FCP front cardinal point
  • the FCP 242 and BCP 243 are two
  • distance between the two cardinal points 242 and 243 is arbitrary because it is not a
  • the present invention therefore merges the two cardinal
  • FIG. 5A shows the optical paths on two meridional planes comprising
  • FIG. 5A also reveals that α′
  • the camera outer-space projection coordinate system is E(α, β, h), wherein α, β,
  • the object point 221 will be located at a visual angle larger than 180 degrees. Furthermore, the imaged points 231, 302 in the "image projection space" behind the fisheye lens 24 are no longer
  • the image-plane coordinate system of C′(x, y) or P′(p, β) represents the image plane 23 vis-a-vis the Cartesian coordinate system or the polar coordinate system wherein its origin is set at the principal point 235.
  • the pixel coordinate system of I(u,v) represents the image which can be directly observed on a computer screen with a unit of "pixel".
  • the principal point 235 is imaged at the coordinate denoted as I(u_c, v_c) on the computer screen. Basically, the imaged dimensions on the image plane 23, C′(x′, y′) or P′(p′, β′), can correspond to the pixel coordinate system of I(u, v). Therefore, the Cartesian coordinate system of C(u, v) or the polar coordinate system of P(p, β) can represent the pixel coordinate system of I(u, v) as well, where I(u_c, v_c) is the origin.
  • FIG. 5B also shows the orientation-and-location relationship between E(α, β, h) and W(X,Y,Z) once the coordinate systems are set up.
  • the objective in establishing the coordinate systems is to align the Z-axis of W(X,Y,Z) with the optical axis 21 and make them overlap, as shown in FIG. 5B.
  • This figure utilizes a "small sphere" 30, which is a technical term in cartography, identically expressing the optical projection traces in the inner and outer projection spaces of a camera mounting a fisheye lens conforming to the equidistant projection (EDP).
  • EDP equidistant projection
  • FIG. 5A shows an arc boundary of a "large sphere" 40 in order to explain how the
  • PCP 220 on the target 22 imitates the arc-laid point-light sources of the multicollimator
  • optical axis 21 perpendicularly passes through the pattern center 225 of the PCP 220, it
  • planar target 22 is normally secant to the large sphere 40
  • outermost circle of the PCP 220 is regarded as the secant circle (a small circle in geodetic
  • the projected image is expected to be an imaged central-symmetric pattern (ICP) 230 if
  • the optical axis 21 has aligned the pattern center 225; the geometric-symmetric center of the ICP 230 is exactly the principal point 235.
  • the relative position between the target 22 and the test camera ought to be
  • imaged by the pattern center 225 can be regarded as the location of the principal point 235
  • passing perpendicularly through the pattern center 225 can stand for the position of the
  • optical axis 21. The above procedure achieves the function of tracing the optical axis 21;
  • test pattern 220 available in the invention, not just the test pattern 220
  • the PCP 220 is designed as shown in FIG. 6, which is printed with
  • the radii of the concentric circles of the PCP 220 are designed to
  • the radial scales of the concentric circles can refer to the initially imaged contours
  • the target 22 is fixed on an adjusting platform 50 and is
  • the test camera 60 is a CCD B/W camera (Type CV-M50E, by Mechademic
  • the focal length is 1.78 mm and the diagonal FOV is 170 degrees; both the length and height of each CCD cell are 9.8 μm, which is a referred unit while calculating the image height (p) in the pixel coordinate system.
  • the adjusting platform 50 is mainly composed of three rigid axes perpendicular to each other, namely, the X' rigid axis 51, the Y' rigid axis 52 and the Z' rigid axis 53. Every movement of the target 22 will represent the relative offset in the absolute coordinate system of W(X,Y,Z) because the relative position between the target 22 and the three rigid axis bodies 51,52, and 53, which can be precisely controlled by a computer, is firmly fixed.
  • the optical axis 21 in E(α, β, h) has to be parallel with the Z' rigid axis 53 in W(X,Y,Z) in the final adjustment.
  • the camera holder 70 is moved to a proper location based on visual judgment and the PTZ-head 71 is adjusted in order to turn the camera 60 to aim at the target 22, namely, to make the optical axis 21 of the camera 60 appear perpendicular to the target plane.
  • a computer program finely adjusts the absolute coordinate of the target 22 by referring to the displayed image and the symmetric indexes thereof; meanwhile, the orientation of the camera 60 is adjusted as well by the PTZ-head 71 under the camera 60 for seeking the optimum symmetry of the displayed image.
  • the optical axis 21 supposedly aligns with the Z' rigid axis 53 if, ideally, the optical axis 21 is adjusted to pass perpendicularly through the feature coordinate of the pattern center 225 on the target 22.
  • FIG. 8A is the schematic view of the ICP 230 processed by the method of the present
  • border-distance is defined as the length from the sampled border point to the image center.
  • diff_2 = EE - WW
  • diff_3 = NE - SW
  • diff_4 = NW - SE, each of which is expected to
  • the sum of two distance-summations in the opposite directions should have the largest value if the ICP 230 reaches an ideal
  • a computer program conducts the imaged-contour extraction through an image-processing method in the invention.
  • An imaged-border-identified algorithm is created in the invention according to the special characteristics of fisheye images; this algorithm operates automatically in the background to conduct the imaged-contour extraction during the experiment. Due to the rapid radial decay of the radiometric responses of the fisheye images, with reference to FIG. 8B, the peaks of original signal intensity (expressed by the solid signal curves) rapidly decay near the border of the fisheye image so that featured representative signals are hard to recognize in this area. Thus an unsharp-mask processing program is developed in the invention to handle the progressive decay.
  • a histogram equalizing process is performed in order to elevate the signal levels near the border area, manifested by the dashed signal curves.
  • a non-causal low-pass filter is applied to generate the dynamic threshold levels (expressed by horizontal solid lines).
  • the profiles of the dynamic threshold levels feature the edges at the crossing points with the equalized dashed signal curves. These edges are automatically delimited by the processing program and shown as the square waves at the bottom.
  • the fisheye images are
  • fisheye-lens imaging is symmetric to the principal point 235 on the image plane 23.
  • this second symmetric index is a commendable reference no matter that it is
  • optical axis 21 will be regarded as being
  • pattern center 225 is considered the principal point 235; meanwhile, the axis orthogonal
  • axis 21 can be obtained by referring to the absolute position of the PCP 220. This implies
  • the spatial absolute coordinate of the VP 241 on the optical axis 21 can also be
  • the VP 241 is postulated as the origin E(0,0,0) of the E(α, β, h) coordinate system, and the optical axis 21 (denoted as E(0, β, h), where β and h are arbitrary) is postulated to overlap with the Z-axis (denoted as W(0,0,z), where z is a real number) in the absolute coordinate system.
  • E(0, β, h): the optical axis 21
  • W(0, 0, z): the Z-axis
  • the N-th imaged contour is taken as the common reference, namely p_N(D) = f·α_N(D); the relation with the i-th imaged contour is given as:
  • α_i is decided by z and r_i, and the scales of p_i are fixed on the image plane 23 (namely, the p_i(D), which is invariable while the value of z has changed); at least two conjugated coordinates (namely, the (r_i, p_i)), representing
  • the object distance D can be fixed at the
  • equation (2) only refers to two selected concentric circles.
  • a weight function is defined by referring to the increasing range of
  • p_0(D) is a null value and is treated as the radius of the principal point 235.
  • VP 241 on the optical axis 21 is:
  • e(z) = Σ(i=1..N) abs(e_i(z) × w_i(D)) ...(4)
  • Equation (4) is
  • the target 22 is separately moved twice along the positive direction of the Z-axis from the initial coordinate of the target 22 where the alignment of the optical axis 21 is completed, covering 5 mm each time.
  • the position of the camera 60 and the target's coordinate on the X' and Y' rigid axes are fixed during the two more advanced tests.
  • the three experiments are named as "Test1", "Test2" and "Test3" in sequence.
  • Table 1: the parameters and results of the three tests
  • Table 1 lists the inferred values of D, f and the related parameters obtained in the three tests. Each test
  • the postulations selected from the group comprising the EDP, the OGP and the SGP.
  • the specifications may be caused by the manual fabrication of the lens.
  • FIG. 10 shows the error profiles while testing the D-value along the
  • test camera mounting the lens is accordingly parameterized.
  • the method disclosed in the invention has the function of categorizing the real natural
  • the principal point 235, the optical axis 21 and the VP 241 can be
  • the present invention enables the function of tracing the position of the optical
  • the present invention radically simplifies the logic of image transformation so
  • the present invention can accurately locate the single VP 241 in a particular
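The symmetry indexes sketched in the fragments above compare border distances sampled in opposite compass directions, with each opposite-pair difference expected to vanish for a perfectly centred ICP. The following Python sketch illustrates that idea; the direction names, the `symmetry_diffs` helper, and the pairing for diff_1 (N vs. S) are illustrative assumptions, while the pairs EE-WW, NE-SW, and NW-SE come from the fragments.

```python
# Sketch of the opposite-direction symmetry index described above.
# border_dist maps a compass direction to the distance from the assumed
# image center to the sampled border point in that direction.

def symmetry_diffs(border_dist):
    """Return the opposite-pair differences; all are 0 for an ideal ICP."""
    pairs = [("N", "S"), ("E", "W"), ("NE", "SW"), ("NW", "SE")]
    return [border_dist[a] - border_dist[b] for a, b in pairs]

# A perfectly symmetric ICP: all opposite-pair differences are zero.
ideal = {d: 100.0 for d in ("N", "NE", "E", "SE", "S", "SW", "W", "NW")}
print(symmetry_diffs(ideal))   # -> [0.0, 0.0, 0.0, 0.0]

# An off-center ICP shows up as a nonzero E-W difference.
shifted = dict(ideal, E=110.0, W=90.0)
print(symmetry_diffs(shifted))
```

A search routine would then nudge the target (or camera orientation) until all four differences fall below a tolerance, which is the alignment loop the fragments describe.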

Abstract

The present invention is a method for exploring the optical parameters of a camera. Images projected from a planar target with a center-symmetric pattern are utilized to guide the alignment function of the system, which takes advantage of the characteristic that image deformation is symmetric to the principal point owing to the phenomenon that projecting optical paths symmetrically surround the optical axis. The absolute position of a camera is deduced from the spatial absolute coordinates of the calibration points on the target and the imaged coordinates thereof, on the basis of a given projection model and the located optical axis. The simple target can imitate the delicate and complex multicollimator calibration mechanism. The accuracy of the absolute position of the optical axis essentially affects the measurement quality. Two image-processing strategies are created by the present invention in order to analyze the symmetry of the images. These indirect indexes are employed to position the principal point on the image plane and the spatial absolute position of the optical axis. The method of trial-and-error is employed to determine the exact location of the optical projection center, and then the focal length constant is deducible. Referring to the intrinsic and extrinsic parameters obtained, the fisheye images can be transformed with metering accuracy so that the related applications of the fisheye camera are widely expanded.

Description

11280pctF
METHOD FOR EXPLORING OPTICAL PARAMETERS OF CAMERA
BACKGROUND OF THE INVENTION
Field of Invention
The invention relates to a method for exploring the optical parameters of a camera.
Particularly, it is a method of utilizing the center-symmetrical characteristic of camera
image deformation to develop image-process techniques in order to situate the principal
point and analyze the optical parameters of cameras in consideration of various nonlinear
perspective projection models. The analyzable parameters comprise the intrinsic
projection function and the absolute coordinates representing the extrinsic position of the
camera.
Related Art
The camera systems in the field of artificial vision have preferred using lenses with a
narrow field of view (FOV) in order to obtain images approaching an ideal perspective
projection mechanism for precise measurement and easy image processes. The pinhole
model is usually a basis to deduce the camera's parameters. The intrinsic and extrinsic
parameters obtained can be employed in the visual applications in quest of higher
precision, for instance in 3-D cubical inference, stereoscopy, automatic optical inspection,
etc. Regarding image deformation, a polynomial function is used to describe the deviation
between original images and the ideal model, or in conducting the work of calibration.
These applications, however, currently have the common limitations of narrow visual
angles and an insufficient depth of field.
A fisheye camera (also termed a fisheye image sensor) mounted with a fisheye lens,
which focuses deeper and wider, can capture a clear image with a FOV of 180 degrees or
even more, but a severe barrel distortion develops. Because the optical geometry of the
fisheye camera is extremely different from the rectilinear perspective projection model,
the optical parameters are hard to be precisely deduced by those methods used in the
related art for normal cameras. Therefore, technologies developed for the usual visual
disciplines have not resulted in any capability in processing the images of the fisheye
camera (simplified as "fisheye images" hereinafter).
Eventually, the panospherical imaging field has replaced the use of the fisheye image
sensor (also called a dioptric sensor) by alternatively developing various camera systems
with complex reflective optical elements (also called catadioptric sensors) as
compensation. These solutions employed optical components such as reflectors or prisms
to take panoramic views, for instance, the technologies which were disclosed in the US
patents 6,118,474 and 6,288,843 Bl. However, the catadioptric systems often elongate
the ray traces, complicate the image-forming mechanism and attenuate the imaging
signals by indirectly taking the reflective images through the added optical elements. A
blind area will be unavoidable at the center of an image because of the frontal installation
of the reflective element.
To expand the FOV, the camera system with a mechanical pan-tilt-zoom motoring
function is another solution in the related art, which separately captures surrounding
images in a row to achieve a panoramic view, such as, for instance, the technology
disclosed in the US patent 6,256,058 Bl. Or conversely, a number of cameras are
deployed to simultaneously capture images in different directions in order to seam a
panorama together. However, the first method of a rotation type cannot capture an entire
scene in a single shot so that the flaw of asynchronism remains in the results. Furthermore,
the volume of both systems can hardly be shrunk to approach a hidden function or to take
a close-range view, not to mention the heavy weights of the camera bodies which
consume more electricity, or the rotating device which is relatively easily thrown out of
order. In addition to the extra cost of multi-cameras, the sampling and integration of the
images from individual cameras still present many problems. Hence, adopting lenses with
a very wide FOV (such as the fisheye lens or compounded catadioptric sensors) to capture
an entire scene in a single shot is a tendency of this kind of camera systems while
considering many practical requirements in applications.
Owing to the poorly deduced accuracy of the optical parameters of a camera based on
the rectilinear perspective projection model, some alternative solutions were evolved to
tackle the transformation of fisheye images. They involve an image-based algorithm aiming
at a specific camera which is mounted with a specific lens conforming to a specific
projection mechanism so as to deduce the optical parameters based solely on the images
displayed. With reference to FIG. 1A and FIG. 1B, wherein FIG. 1A expresses the
imageable area 1 of a fisheye image in a framed oval/circular region and FIG. 1B is the
hemispherical spatial projecting geometry corresponding to FIG. 1A, both figures note
the zenithal distance of α, which is the angle defined by an incident ray and the optical
axis 21, as well as the azimuthal distance of β, which is the angular vector in the polar
coordinate system whose origin is set at the principal point. Citing the positioning
concept of a globe, β is the angle referring to the mapping domain 13' of the prime
meridian 13 on the equatorial plane in the polar coordinate system, as shown in FIG. 1B.
Thus, π/2-α is regarded as the latitude and β as the longitude. Therefore, if several imaged
points lie on the same radius of the imageable area 1, their corresponding spatial incident
rays would be on the same meridional plane (such as the sector determined by the arc
C'E'G' and two spherical radii); that is, their azimuthal distances (β) are invariant, as with
points D, E, F, and G in FIG. 1A corresponding to points D', E', F', and G' in FIG. 1B.
In addition to the specific projection mechanism, the image-based algorithm further
needs several basic postulates: first, the imageable area 1 of the fisheye image is an
analyzable oval or circle, and the intersection of the major axis 11 and minor axis 12 (or,
rather, of the two diameters) situates the principal point, which is cast by the optical axis
21 as shown in FIG. IB; secondly, the boundary of the image is projected by the light rays
of α=π/2; third, α and p are linearly related, wherein p, termed a principal distance, is the
length between an imaged point (such as point E) and the principal point (point C). For
example, the value of α at the point E is supposed to be π/4 since it is located in the middle
of the radius of the imageable area 1; hence the sight ray corresponding to point E is
destined to pass through point E' in the hemispherical sight space, as shown in FIG. 1B;
the same is true with points C and C', points D and D', points F and F', etc.
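The image-based postulates above can be sketched in a few lines: the zenithal distance α grows linearly with the principal distance, the image border maps to α = π/2, and the azimuth β is invariant along a radius. The function and variable names below are illustrative, not from the patent.

```python
import math

def zenith_azimuth(u, v, radius):
    """Map an imaged point (u, v), in a frame centred at the principal
    point, to the spatial angles (alpha, beta) of its incident ray under
    the linear alpha-rho postulate with the border at alpha = pi/2."""
    rho = math.hypot(u, v)                  # principal distance
    alpha = (rho / radius) * (math.pi / 2)  # linear alpha-rho relation
    beta = math.atan2(v, u)                 # azimuth, constant along a radius
    return alpha, beta

# Point E in FIG. 1A sits at half the radius, so alpha should be pi/4:
alpha, beta = zenith_azimuth(50.0, 0.0, 100.0)
print(alpha, math.pi / 4)   # both 0.7853981633974483
```

Points G and G' on the border itself map to α = π/2, matching the second postulate.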
An imaged point on the image plane can be denoted as C'(u, v) in a Cartesian
coordinate system or as P'(p, β) in a polar coordinate system, both taking the principal
point as their origin. Although the mapping mechanism was not really discussed in the
image-based algorithm, it is actually the equidistant projection (EDP), whose function is
p = kα, where k is a constant and, actually, the focal length constant f.
The US patent 5,185,667 accordingly developed a method to transform fisheye
images conforming to the rectilinear perspective projection model in accordance with the
projection mechanism shown in FIGs. 1A and 1B so as to monitor a hemispherical field of
view (180 degrees by 360 degrees). This patented technology has been applied in
endoscopy, surveillance and remote control as disclosed in US patents 5,313,306,
5,359,363 and 5,384,588. The present invention terms the EDP coupled with a FOV of
180 degrees as "EDPπ". Based on the EDPπ postulation, the focal length constant (f) can
be obtained by dividing the radius of the imageable area 1 by π/2; the spatial angle (α, β)
of the corresponding incident ray can also be analyzed from the planar coordinates C'(u, v)
on the imageable area 1. In light of the known image-analyzing skills, an "ideal EDPπ
image" can be transformed into the image remapped by the rectilinear perspective
projection, referring to any projection line as a datum axis. This image-based algorithm is
easy, and no extra calibration object is needed. However, it is worth noting that these serial
US patents did not concretely demonstrate whether the method is generally suitable for
average fisheye lenses. Thus, the accuracy of the patented technology in transforming images
remains a big question insofar as no specific fisheye lens is used. The current practice has
system-application manufacturers asking for limited-specification fisheye lenses
combined with particular camera bodies and providing exclusive software, and only then
does the patented technology (US patent 5,185,667) have practical and commercial value.
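The EDPπ deduction described above is simple enough to sketch: under p = f·α with a 180-degree FOV, f is the imageable-area radius divided by π/2, and a fisheye point can then be remapped to a rectilinear perspective image taken along the optical axis. The helper names below are illustrative; the remap step uses the standard pinhole relation r = f·tan(α), which is an assumption layered on the patent's description.

```python
import math

def edp_pi_focal(radius):
    """Focal-length constant under EDPpi: imageable radius divided by pi/2."""
    return radius / (math.pi / 2)

def remap_to_rectilinear(u, v, radius):
    """Rectilinear-perspective coordinates of a fisheye pixel (principal
    point at the origin). Points at alpha >= pi/2 have no rectilinear image."""
    f = edp_pi_focal(radius)
    rho = math.hypot(u, v)
    alpha = rho / f                  # EDP: rho = f * alpha
    if alpha >= math.pi / 2:
        return None                  # outside the rectilinear half-space
    if rho == 0.0:
        return (0.0, 0.0)
    r_rect = f * math.tan(alpha)     # pinhole model: r = f * tan(alpha)
    return (u * r_rect / rho, v * r_rect / rho)

print(edp_pi_focal(100.0))                      # f for a 100-pixel radius
print(remap_to_rectilinear(50.0, 0.0, 100.0))   # alpha = pi/4, so r = f
```

Note how the remap diverges toward the border (tan(α) → ∞ as α → π/2), which is one reason a full 180-degree view cannot be flattened into a single rectilinear frame.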
Major parts of the image-based postulates mentioned above, however, are unrealistic
because many essential factors or variations have not been taken into consideration. First,
the EDPπ might just be a special case among possible geometrical projection models
(note: however, it is the most common projection model of the fisheye lens). FIG. 2 shows
three possible and typical projection curves of the fisheye lens, and implies that the
natural projection mechanism of the fisheye lens may lie in the
following projections: the stereographic projection (or SGP, whose projection function is
p = 2f·tan(α/2)) and the orthographic projection (or OGP, whose projection function is
p = f·sin(α)). Moreover, the coverage of the FOV is not constantly equal to π, ranging from
larger to smaller. From the curves in FIG. 2, the differences between the three projection
models respectively are obviously increasing with the growing zenithal distances (α).
Thus, distortions will develop if all projection geometries are locked on the EDPπ and
images are transformed accordingly. Secondly, the FOV of π is difficult to evaluate since the
form of the imageable area 1 is always presented as a circle irrespective of the angular
scale of the FOV. A third factor concerns the errors caused in locating the image border
even if the FOV is certainly equal to π. The radial decay caused by the radiometric
response is an unavoidable phenomenon in a lens, especially when dealing with a larger
FOV. This property will induce a radial decay on the image intensity, especially occurring
with some simple lenses, so that the real boundary is extremely hard to set under that
bordering effect. Perhaps there could even be no actual border feature upon consideration
of the diffraction phenomenon of light. Finally, if the imageable area 1 of a camera is
larger than the sensitive zone of a CCD, only parts of the "boundary" of an image will
show up, and therefore the image transformation cannot be effectively executed.
Consequently, the image-based algorithm depends considerably on the chosen devices
irrespective of whether the lens conforms to the ideal EDPπ postulation or not.
Alternatively, the method will result in poor accuracy, modeling errors, a doubtful
imageable area 1 extracted, an unstable principal point situated, as well as practical
limitations.
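The divergence argument above can be made concrete. The snippet below (an illustration, not part of the patent) evaluates the three projection functions of FIG. 2 and shows that they nearly agree at small zenithal distances but separate strongly as α grows, which is why forcing every lens onto the EDPπ postulate distorts wide-angle content; the 1.78 mm value is the focal length of the test camera mentioned later in the patent.

```python
import math

def edp(f, a):  return f * a                    # equidistant:   p = f*alpha
def sgp(f, a):  return 2 * f * math.tan(a / 2)  # stereographic: p = 2f*tan(alpha/2)
def ogp(f, a):  return f * math.sin(a)          # orthographic:  p = f*sin(alpha)

f = 1.78  # mm, focal-length constant of the experimental camera
for deg in (5, 45, 85):
    a = math.radians(deg)
    print(deg, round(edp(f, a), 3), round(sgp(f, a), 3), round(ogp(f, a), 3))
```

Since tan(α/2) > α/2 and sin(α) < α for α > 0, the SGP curve always lies above the EDP curve and the OGP curve below it, with the gap widening toward the border, exactly as the text argues.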
Moreover, Margaret M. Fleck [Perspective Projection: The Wrong Image Model,
1994] has demonstrated that the projection mechanisms of lenses hardly fit a single ideal
model over the whole angular spectrum in practice; otherwise, optics engineers could
develop lenses with special projection functions, such as the fovea lens, in light of the
different requirements in applications. Thus, imposing the postulation of the EDP on all
fisheye cameras is extremely forced.
Obviously, there has been no discussion in the related art about how to position the
principal point in a real camera system, not to mention the deduction of the extrinsic
parameters (namely, the position of the optical axis and the viewpoint thereon
representing the camera in the absolute coordinate system) and the intrinsic parameters
(namely, the projection function and its coefficients, such as "2", "f" and "α/2" in the
projection function of p = 2f·tan(α/2)). These limitations keep the fisheye lens from
advanced applications. The present invention will carefully look into these issues and free
the procedure of camera parameterization from ideal image-based postulations, such as
the EDPπ and the image boundary, in order to precisely obtain the optical parameters and
to exactly transform fisheye images with fidelity on the basis of the obtained parameters.
Apart from this, visual measurement can also be well developed via the technology
disclosed by the present invention.
SUMMARY OF THE INVENTION
In view of the foregoing, the object of this invention is to provide a camera-
parameterizing method, which aims at the camera mounted with the lens of a non-linear
perspective projection mechanism, simply based on the natural optical projection
phenomenon of the lens.
Another object of this invention is to provide a method absolutely positioning the
principal point and the optical axis based on the characteristic of barrel distortion which is
symmetrical to the principal point owing to the phenomenon that projecting optical paths
symmetrically surround the optical axis. The optical axis is therefore traceable and the
optical parameters of a camera, such as the viewpoint (VP), the focal length constant and
the projection function, are inducible accordingly.
In accordance with the objects described above, the present invention provides a
method for exploring the optical parameters of a camera, which is an advanced research
based on the US patent applications of 09/981,942 and 10/234,258. A physical central-
symmetric pattern (PCP) is designed on a target according to the center-symmetric
characteristic of fisheye image's distortion. The target is placed in the FOV of the fisheye
camera and its position is adjusted until an imaged central-symmetric pattern (ICP)
appears on the image plane. At least one symmetry index is employed to test the ICP's
symmetry. If the ICP's symmetry satisfies an accuracy request, the geometrical center of
the ICP is where the principal point can be located. Furthermore, the sight ray
perpendicularly passing through the geometric center of the PCP will stand for the optical
axis. Thus, the absolute position of the optical axis in space can be deduced by referring to
the given position of the PCP.
Based on the traceable spatial trace of the optical axis of the camera, the method of
trial-and-error is employed at every point on the optical axis with the numerical
limitations composed of the absolute physical radii of the PCP and the measured imaged
radii of the ICP in order to obtain an optical center (or termed the viewpoint, simplified as
the VP) satisfying a specific projection model. The focal length constant of the camera 11280pctF
can also be determined by the mathematical equation of the specific projection model as
the VP is already fixed. The projection model involved in deduction can either be the
equidistant projection (EDP), the stereographic projection (SGP) or the orthographic
projection (OGP), those well-known projection models in the related art, or a particular
projection function provided by lens designers or manufacturers.
The method disclosed by the invention can ascertain the extrinsic and intrinsic optical
parameters of the fisheye camera, so the fisheye images can accordingly be transformed
into ones with various image formats.
Further scope of applicability of the present invention will become apparent from the
detailed description given hereinafter. However, it should be understood that the detailed
description and specific examples, while indicating preferred embodiments of the
invention, are given by way of illustration only, since various changes and modifications
within the spirit and scope of the invention will become apparent to those skilled in the art
from this detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will become more fully understood from the detailed
description given hereinbelow, which is given by way of illustration only and thus is not
limitative of the present invention, and wherein:
FIGs. 1A and 1B show the schematic view of a calibration method based on an
image-based algorithm aiming at the EDPπ of fisheye images in the related art;
FIG. 2 sketches three typical projection functions of the fisheye lens;
FIG. 3 shows an embodiment of the physical central-symmetric pattern (PCP)
according to the spirit of the invention;
FIG. 4 cubically shows the 3-D optical paths between the PCP and the fisheye camera
in the invention;
FIG. 5A shows a schematic view of multi-collimated optical paths simulated by an
aligned PCP on two meridional planes a π-distance from each other, and also a schematic
view of optical paths for interpreting the projection behavior of sight rays through a small
sphere (taking the EDP as an example);
FIG. 5B shows a schematic view of the cubical optical paths highlighting a part of
FIG. 5A;
FIG. 6 shows the employed embodiment of the PCP in an experiment based on the
invention;
FIG. 7 shows the schematic view of a device arrangement performing the task of
adjusting the position between the fisheye camera and the target;
FIG. 8A shows a schematic view of the image formed in an experiment by the PCP
shown in FIG. 6;
FIG. 8B shows the signal-intensity curves of the image contours shown in FIG. 8A
along the four directions of the northeast, southwest, northwest and southeast;
FIG. 9 shows an imaged schematic view transformed by a polar-coordinate
transformation, which takes the principal point as the origin, from the image shown in
FIG. 8A; and
FIG. 10 shows the approaching curves for seeking the viewpoint in light of three
different projection functions in an experiment.
DETAILED DESCRIPTION OF THE INVENTION
The technology disclosed in the present invention is an advanced research based on
the US patent applications of 09/981,942 and 10/234,258.
The fisheye lens is a non-linear perspective projection lens, which means its
projecting behavior cannot be interpreted by the well-known pinhole model when spatial
sight rays pass through this kind of lens. The fisheye lens has the merits of a wide field of
view (FOV) and an infinite depth of field in comparison with other lenses following the
rectilinear projection, but a severe barrel distortion comes along in its images. Namely,
the quantities of distortion throughout an image are distributed with a radial symmetry
whose point of origin is termed a principal point. The projection mechanism of a camera
in space can be described as follows: the incident rays cast from an object in the FOV will
logically converge onto a unique spatial optical center (or termed the viewpoint,
simplified as the VP) and then divergently map on the image plane in light of a projection
function; meanwhile, the optical projection geometry in the FOV symmetrically encircles
the optical axis of the camera. The geometrical optical model is well known to those
skilled in the related art of optical engineering. However, no proper analytical technology
has come up yet; hence only the rectilinear perspective projection model is available as a
foundation for developing a computer vision system. This limitation comes from the huge
quantity of barrel distortion, which has so far resisted analysis. However, the
characteristic of the distortion becomes a key feature in the invention for developing a
method for parameterizing the fisheye camera, a method that works better the more severe
the distortion is.
From a geometrical view, a planar drawing capable of representing an axis-
symmetric geometric arrangement in space can image a center-symmetric image inside a
camera. Therefore, a planar target 22, as shown in FIG. 3, with a physical central-
symmetric pattern (PCP) 220 thereon, is placed in the FOV of a camera. The relative
position of the target 22 and the camera is adjusted in order to obtain an imaged central-
symmetric pattern (ICP) 230 on the image plane 23, as shown in FIG. 4. Obtaining the
ICP 230 here means the optical axis 21 perpendicularly penetrates both the image center
235 and the pattern center 225, and the front cardinal point (FCP) 242 and the back
cardinal point (BCP) 243 are both located on the optical axis 21. The position of the
optical axis 21 in space can be absolutely determined by referring to the target 22 because
its absolute position is man-made and given in advance. Therefore, seeking the ICP 230 is
the core procedure.
The PCP 220 shown in FIG. 3 can be regarded as an arc-laid optical layout imitating
the multicollimator. The quite aged multicollimator metrology has been employed to
calibrate large-scale aerial convex lenses. It utilizes an arc-laid array composed of
independent point-light sources with accurate alignment to generate a bundle of light rays
converging at a specific point whose absolute position in space is given. Through
adjusting the position of a test camera to make its image clearest, the VP of the test
camera will be regarded as coinciding with the preset light-ray converging point. Therefore,
the VP can be positioned by referring to the test camera's physical body. Each light ray
from the point-light sources simulates an incident ray from infinity with a known zenithal
distance of α, which is the angle extending from the optical axis 21 to the incident ray.
Because the coordinate of the imaged point imaged by each light ray can be measured
accurately, the α-to-p (p is the image height) projection profile of the lens can be obtained
from the directly measured data.
As far as operating models are concerned, the physical arrangement of the
multicollimator can measure the image element, module or system of any circularly
axis-symmetric projecting optical paths, and accordingly obtain the projection model
thereof; the projection model being put into operation is not limited to certain closed-form
circular functions. Certainly, the multicollimator is also suitable for examining the fisheye
lens. However, the accurate arc mechanism of the multicollimator is too sophisticated to
be realized in normal labs. The present invention creates a much easier way by employing
a planar drawing to indirectly imitate the multicollimator's arrangement in space.
FIG. 3 shows an embodiment of the PCP 220 having a solid circular center and a plurality of center-symmetric geometric figures (such as the concentric circles therein)
designed in light of the spirit of the invention described above. The
measurement/calibration mechanism of the multicollimator will be used to assist in
describing the basis of the present invention. FIG. 4 shows the planar target 22 in the 3-D
space of the system and the optical projection paths generated thereby in the FOV of the
fisheye camera; wherein the fisheye lens 24 and the image plane 23 stand equivalently for
the fisheye camera. If the projection behavior of a camera conforms to any known
circular-function relationship (meaning the product of a circular function and a focal
length), the incident rays cast from the PCP 220 will certainly and essentially achieve a
collimating mechanism; namely, all incident rays will converge at a logical optical center
of the fisheye lens 24, termed the front cardinal point (FCP) 242, and then refract
divergently onto the image plane 23 (or the optical sensor) from the back cardinal point
(BCP) 243 according to the projection function. The FCP 242 and BCP 243 are two
reference points for the two distinct spaces delimiting the projecting behavior inside and
outside the fisheye camera. Sight rays refer to the FCP 242 and the image plane 23 refers
to the BCP 243 while analyzing the projection mechanism of the fisheye camera. The
distance between the two cardinal points 242 and 243 is arbitrary because it is not a
parameter of the camera system. The present invention therefore merges the two cardinal
points 242 and 243 at a single viewpoint (VP) 241 as shown in FIG. 5A in order to unify
the imaging logic. FIG. 5A shows the optical paths on two meridional planes comprising
the optical axis 21 corresponding to the 3-D model of FIG. 4. FIG. 5A also reveals that α'
is inferred backwards from p. The logical relationship between α and α' is ruled by the
natural projection model of the test lens.
To present the theoretical foundation of the invention clearly, the referred coordinate
systems are defined as follows:
1. The absolute coordinate system of W(X,Y,Z) places its origin at the geometrical
center of the target 22, and defines the positive direction of the Z-axis as the one
pointing perpendicularly away from the target 22.
2. The camera outer-space projection coordinate system is E(α,β,h), wherein α, β,
and h are three defined vectors. This coordinate system imitates the well-known
geodetic coordinate system of G(φ,λ,h), where φ is the latitude, λ the longitude
and h is identical to the one in E(α,β,h), the height. With reference to FIG. 5B,
the three vectors in E(α,β,h) are similar to the three in G(φ,λ,h), except that α
refers to the optical axis 21 and φ refers to the equatorial plane 31. The inner and
outer projection spaces of the camera are therefore demarcated into two
hemispheres by the lens while setting the origin of E(α,β,h) at the VP 241. If h is
positive, it means the object point 221 is located within 180 degrees in an "object
projection space"; if h is negative, the object point 221 will be located at a visual
angle larger than 180 degrees. Furthermore, the imaged points 231, 302 in the "image
projection space" behind the fisheye lens 24 are no longer defined by α and h.
3. The image-plane coordinate system of C'(x,y) or P'(p,β) represents the image plane
23 in the Cartesian coordinate system or the polar coordinate system respectively,
with its origin set at the principal point 235.
4. The pixel coordinate system of I(u,v) represents the image which can be directly
observed on a computer screen with a unit of "pixel". The principal point 235 is
imaged at the coordinate denoted as I(uc,vc) on the computer screen. Basically, the
imaged dimensions on the image plane 23, C'(x',y') or P'(p',β'), can correspond to
the pixel coordinate system of I(u,v). Therefore, the Cartesian coordinate system of
C(u,v) or the polar coordinate system of P(p,β) can represent the pixel coordinate
system of I(u,v) as well, where I(uc,vc) is the origin.
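As an aside for implementers, the relationship among the pixel, Cartesian and polar coordinate systems defined above can be sketched as follows. This is an illustrative sketch only; the function names and the sample principal point (320, 240) are our assumptions, not part of the disclosure.

```python
import math

# Illustrative helpers (names are ours): re-express a pixel coordinate I(u, v)
# in the polar system P(p, beta) whose origin is the principal point I(uc, vc).
def pixel_to_polar(u, v, uc, vc):
    x, y = u - uc, v - vc          # Cartesian C(x, y) about the principal point
    p = math.hypot(x, y)           # image height, in pixels
    beta = math.atan2(y, x)        # azimuthal angle
    return p, beta

def polar_to_pixel(p, beta, uc, vc):
    return uc + p * math.cos(beta), vc + p * math.sin(beta)

# Round trip through a sample point, with an assumed principal point (320, 240):
p, beta = pixel_to_polar(400, 300, 320, 240)
u, v = polar_to_pixel(p, beta, 320, 240)
```

The two functions are inverses of each other, which is the property the polar-coordinate transformation of FIG. 9 relies on.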
FIG. 5B also shows the orientation-and-location relationship between E(α,β,h) and
W(X,Y,Z) once the coordinate systems are set up. The objective in establishing the
coordinate systems is to align the Z-axis of W(X,Y,Z) with the optical axis 21 and make
them overlap, as shown in FIG. 5B. This figure utilizes a "small sphere" 30, a technical
term in cartography, to identically express the optical projection traces in the inner and
outer projection spaces of a camera mounting a fisheye lens conforming to the equidistant
projection (EDP). The concept shown in the figure is suitable for other projection
functions as well; the category of lens usable in the present invention is not limited to the
EDP. Some technical terms of geodesy and
cartography, both highly developed disciplines, will be introduced hereinafter in order to
assist the description of the theoretical foundation and the image-transformed principle of
the present invention.
FIG. 5A shows an arc boundary of a "large sphere" 40 in order to explain how the
PCP 220 on the target 22 imitates the arc-laid point-light sources of the multicollimator,
in addition to the small sphere 30 with the radius of f (the focal length constant). Once the
optical axis 21 perpendicularly passes through the pattern center 225 of the PCP 220, it
means that the planar target 22 is normally secant to the large sphere 40, and the
outermost circle of the PCP 220 is regarded as the secant circle (a small circle in geodetic
terms) on the surface of the large sphere 40.
Naturally, sight rays cast from any points (such as point 221) on the target 22 will
perpendicularly penetrate the surface of the small sphere 30 at incident point 301 and
converge toward the spherical center (namely, the VP 241). This means that every
concentric circle of the PCP 220 constructs a symmetric light cone in the outer projection
space of the camera, and its convergent point is the VP 241, like the cubical optical paths
shown in FIG.4. Logically, the sight rays will be refracted in light of a projection function
while passing through the VP 241, and then project on the image plane 23 to form the
imaged point 231. Based on the spatially axis-symmetric characteristic mentioned above,
the projected image is expected to be an imaged central-symmetric pattern (ICP) 230 if
the optical axis 21 is aligned with the pattern center 225; the geometric-symmetric center
of the ICP 230 is exactly the principal point 235.
Hence, the relative position between the target 22 and the test camera ought to be
properly adjusted until the ICP 230 is obtained (that is, until the symmetry of the
projected image reaches a certain preset accuracy); meanwhile, the feature coordinate
imaged by the pattern center 225 can be regarded as the location of the principal point 235
which is the origin, denoted as C'(0,0) or P'(0,β), on the image plane 23. This location is
denoted as I(uc, vc) in the pixel coordinate system. The sight ray passing through the
principal point 235 and being perpendicular to the image plane 23 would pass
perpendicularly through the pattern center 225 of the PCP 220 as well. Thus, the sight ray
passing perpendicularly through the pattern center 225 can stand for the position of the
optical axis 21. The above procedure achieves the function of tracing the optical axis 21;
this is a breakthrough in exploring the extrinsic parameters of the fisheye camera.
There are many kinds of test pattern 220 available in the invention, not just the
planar concentric circles shown in FIG. 3. Every PCP 220 composed of concentric-and-
symmetric geometric figures is a practicable embodiment. Namely, the concentric
rectangles, the concentric triangles or the concentric hexagons are all applicable in the
invention in addition to the concentric circles. Even the combination of any number of
concentric-and-symmetric circles, rectangles, triangles and/or polygons is a possible
embodiment of the PCP 220 in the invention. Of course, a 3-D calibration target may
achieve the same function if it can symmetrically surround the optical axis 21, but it will
not result in an easier processing procedure.
An embodiment of the invention is presented below in order to concretely
demonstrate the positioning method for the principal point 235 and the optical axis 21. In
a practical experiment, the PCP 220 is designed as shown in FIG. 6, which is printed with
a laser printer on a piece of A3-size paper as an embodiment of the target 22. In
consideration of the severe distortion of the fisheye image rapidly increasing along the
outward radial direction, the radii of the concentric circles of the PCP 220 are designed to
be progressively larger outwards in order to match this optical phenomenon of the fisheye
lens. The radial scales of the concentric circles can refer to the initially imaged contours
projected from a plain target 22 like the one shown in FIG. 3. The object-to-image contour
relationship is obtained first at a proper measured location, and the widths of the physical
concentric circles are then determined accordingly in order to enable the system to clearly
display both the middle and the outer image ranges simultaneously. Besides, the visible
contours of the black and white concentric circles spaced in between will benefit the
following image-processing procedure.
With reference to FIG. 7, the target 22 is fixed on an adjusting platform 50 and is
moved as close to the camera 60 as possible in order to allow the PCP 220 to lie across the
whole FOV of the fisheye lens 24; the projected image will now cover the most sensitive
zone of the CCD. The above arrangement is devised for sampling the image information
at larger visual angles, because this part of the image is mostly able to reflect the specific
projection model of the fisheye lens 24; that is, referring to FIG. 2 again, the differences
between different projection models become more obvious as the angles get larger.
The test camera 60 is a CCD B/W camera (Type CV-M50E, by Mechademic
Company, Japan) mounting a fisheye lens (Type DW9813, by Daiwon Optical Co.,
Korea); this is a pretty simple camera system. The following specifications are offered by
the vendors: the focal length is 1.78 mm and the diagonal FOV is 170 degrees; both the
length and height of each CCD cell are 9.8 μm, which is the unit referred to when
calculating the image height (p) from the pixel coordinate system.
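For illustration, converting an image height counted in pixels into millimetres on the CCD with the vendor's 9.8 μm cell size can be sketched as follows; the sample radius of 150 pixels is invented for illustration and is not a value from the experiment.

```python
# Illustrative conversion from an image height counted in pixels to
# millimetres on the CCD, using the vendor's 9.8 um square cell.
CELL_MM = 9.8e-3          # 9.8 um per CCD cell, horizontally and vertically

def image_height_mm(p_pixels):
    return p_pixels * CELL_MM

# e.g. a contour imaged 150 pixels from the principal point:
p_mm = image_height_mm(150)
```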
The adjusting platform 50 is mainly composed of three rigid axes perpendicular to each other, namely, the X' rigid axis 51, the Y' rigid axis 52 and the Z' rigid axis 53. Every movement of the target 22 will represent the relative offset in the absolute coordinate system of W(X,Y,Z) because the relative position between the target 22 and the three rigid axis bodies 51, 52 and 53, which can be precisely controlled by a computer, is firmly fixed. For the purpose of simplifying description, take the coordinates where the three rigid axis bodies 51, 52 and 53 are located to stand for the absolute coordinates of physical positions, and define the positive direction of the Z-axis as the one along which the test camera 60 moves away from the target 22. Ideally, the optical axis 21 in E(α,β,h) has to be parallel with the Z' rigid axis 53 in W(X,Y,Z) in the final adjustment.
However, in practice there is an initial six-dimension difference between E(α,β,h) and W(X,Y,Z), including three offset variables and three rotation variables; hence the two coordinate systems have to be aligned. First, the camera holder 70 is moved to a proper location based on visual judgment and the PTZ-head 71 is adjusted in order to turn the camera 60 to aim at the target 22, namely, to make the optical axis 21 of the camera 60 appear perpendicular to the target plane. Then, a computer program finely adjusts the absolute coordinate of the target 22 by referring to the displayed image and the symmetric indexes thereof; meanwhile, the orientation of the camera 60 is adjusted as well by the PTZ-head 71 under the camera 60 for seeking the optimum symmetry of the displayed image. Based on this device arrangement, the optical axis 21 supposedly aligns with the Z' rigid axis 53 if, ideally, the optical axis 21 is adjusted to pass perpendicularly through the feature coordinate of the pattern center 225 on the target 22.
Two image-symmetry judging methods are disclosed in the invention for examining
the alignment relationship between E(α,β,h) and W(X,Y,Z) so as to position the principal
point 235 and the optical axis 21. However, this does not signify a limitation in the
invention; any variations or modifications following the same spirit of judging image
symmetry are not to be regarded as a departure from the spirit and scope of the invention.
FIG. 8A is the schematic view of the ICP 230 processed by the method of the present
invention and shown on the computer screen. Taking the image as the reference plane,
and the principal point 235 (note: this actually is the imaged blob of the pattern center 225
in practice) as the datum point, eight radial symmetric directions, including the south,
north, east, west, northeast, southwest, northwest and southeast, are selected as sampled
directions for extracting the contours of the imaged concentric circles. Two kinds of
marks are displayed on the screen as well in order to indicate two different sampled
border points; referring to the image center and moving along the radial lines outwards,
wherein " — " proceeds from black to white and "+" moves from white to black. A
border-distance is defined as the length from the sampled border point to the image center.
All border-distances in the same direction are summed up to a "distance-summation"; that
is to say, eight distance-summations are calculated and separately denoted as "SS", "NN",
"EE", "WW", "NE", "SW", "NW" and "SE". The difference between two distance-
summations in the opposite directions is supposed to be close to the value of zero if the
ICP 230 reaches ideal symmetry; namely, there are four differences - diff_1=NN-SS,
diff_2=EE-WW, diff_3=NE-SW and diff_4=NW-SE - each of which is expected to
achieve the value of zero. Alternatively, the sum of two distance-summations in the opposite directions should have the largest value if the ICP 230 reaches an ideal
symmetry; namely, there are four sums - sum_1=NN+SS, sum_2=EE+WW,
sum_3=NE+SW and sum_4=NW+SE - each of which is expected to attain the maximum
value. Therefore, the four differences or the four sums or both (categorized together as
the first symmetric index in the invention) displayed on the computer screen are the
references for the examination of the orientation of the target 22, and accordingly, the
relative position of the target 22 and the test camera 60 is adjusted in order to reach an
optimal symmetry of the ICP 230.
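The bookkeeping of the first symmetric index described above can be sketched as follows. The variable names follow the text; the sample distance-summations are invented for illustration.

```python
# Sketch of the first symmetric index: from the eight distance-summations,
# form the four differences of opposite directions, which all approach zero
# as the ICP approaches ideal symmetry, and the four sums, which then attain
# their maximum values.
def first_symmetric_index(ds):
    diffs = {"diff_1": ds["NN"] - ds["SS"],
             "diff_2": ds["EE"] - ds["WW"],
             "diff_3": ds["NE"] - ds["SW"],
             "diff_4": ds["NW"] - ds["SE"]}
    sums = {"sum_1": ds["NN"] + ds["SS"],
            "sum_2": ds["EE"] + ds["WW"],
            "sum_3": ds["NE"] + ds["SW"],
            "sum_4": ds["NW"] + ds["SE"]}
    return diffs, sums

# Invented sample values for a perfectly symmetric ICP:
diffs, sums = first_symmetric_index(
    {"NN": 900, "SS": 900, "EE": 900, "WW": 900,
     "NE": 880, "SW": 880, "NW": 880, "SE": 880})
```

For a perfectly symmetric ICP all four differences vanish, which is the condition the adjusting procedure drives toward.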
The techniques for processing fisheye images are currently still rare. A computer program conducts the imaged-contour extraction through an image-processing method in the invention. An imaged-border-identifying algorithm is created in the invention according to the special characteristics of fisheye images; this algorithm operates automatically in the background to conduct the imaged-contour extraction during the experiment. Due to the rapid radial decay of the radiometric responses of the fisheye images, with reference to FIG. 8B, the peaks of original signal intensity (expressed by the solid signal curves) rapidly decay near the border of the fisheye image, so that representative feature signals are hard to recognize in this area. Thus an unsharp-mask processing program is developed in the invention to handle the progressive decay. First, a histogram equalizing process is performed in order to elevate the signal levels near the border area, manifested by the dashed signal curves. Next, a non-causal low-pass filter is applied to generate the dynamic threshold levels (expressed by horizontal solid lines). The profiles of the dynamic threshold levels feature the edges at the crossing points with the equalized dashed signal curves. These edges are automatically delimited by the processing program and shown as the square waves at the bottom. The skills for extracting the coordinates of the imaged contours are an important subject in the discipline of image metering;
different kinds of imaging or photographing techniques should correspond to different
processing methods owing to different energy spectrums. The fisheye images are
possessed of very special phenomena; that is to say, large differences in image quality
appear as the zenithal distance (α) increases. This is a point that requires attention while
processing images of this kind. Other details related to the image processing techniques
are well known to those skilled in the related art, so the details will be skipped here.
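A minimal sketch of the border-identifying idea described above, applied to a 1-D radial intensity profile. The concrete choices here (CDF-based equalization, a centred moving average as the non-causal low-pass filter, the window size) are our assumptions, not the patent's.

```python
import numpy as np

# Minimal sketch of the border-identifying idea on a 1-D radial intensity
# profile: equalize the signal, smooth it into a dynamic threshold, and mark
# border points where the equalized curve crosses that threshold.
def find_borders(profile, window=15):
    profile = np.asarray(profile, dtype=float)
    # histogram equalization: map each grey level to its empirical CDF value
    values, inverse, counts = np.unique(profile, return_inverse=True,
                                        return_counts=True)
    equalized = (np.cumsum(counts) / counts.sum())[inverse]
    # centred (non-causal) moving average serves as the dynamic threshold
    threshold = np.convolve(equalized, np.ones(window) / window, mode="same")
    # border points sit where the equalized curve crosses its threshold
    above = equalized > threshold
    return np.nonzero(above[1:] != above[:-1])[0]

# A step profile yields crossings clustered at the black/white transition
# (plus artifacts at the two ends of the trace from the zero padding).
borders = find_borders([0.0] * 30 + [1.0] * 30, window=5)
```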
The second symmetric index in the invention is also according to the characteristic
that fisheye-lens imaging is symmetric to the principal point 235 on the image plane 23.
Take the concentric circles of the PCP 220 in FIG. 6 as an example. If the optical axis 21
is already perpendicularly aligned to the pattern center 225 on the target 22, transforming
P'(p,β) in FIG. 8A into C(p,β), the Cartesian coordinate system (namely the polar-
coordinate transformation), will turn the circles into horizontal lines as shown in FIG. 9
where its X-axis is β, the Y-axis is p, and its origin corresponds to the principal point 235
in FIG. 8A. The contour linearity of the transformed black/white lines is the second
symmetric index in the invention. The experiment's results show that this second
symmetric index is highly sensitive. Only a slight offset from the correct relative position
between the target 22 and the camera 60 would cause severe bending curves on the screen.
Hence, this second symmetric index is a commendable reference whether it is
identified by computer calculation or just by naked-eye observation. This deduction is also
suitable for other circular symmetric targets.
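The polar-coordinate transformation underlying the second symmetric index can be sketched as a nearest-neighbour resampling about the principal point. The image layout `image[v][u]`, the synthetic ring and all sizes are illustrative assumptions.

```python
import math

# Nearest-neighbour sketch of the polar-coordinate transformation: a circle of
# radius R centred on the principal point (uc, vc) becomes the horizontal row
# p = R of the output, so a symmetric ICP unwraps into straight lines.
def polar_unwrap(image, uc, vc, n_p, n_beta):
    out = [[0] * n_beta for _ in range(n_p)]
    for i in range(n_p):                 # p axis (image height, in pixels)
        for j in range(n_beta):          # beta axis (azimuth)
            beta = 2.0 * math.pi * j / n_beta
            u = int(round(uc + i * math.cos(beta)))
            v = int(round(vc + i * math.sin(beta)))
            if 0 <= v < len(image) and 0 <= u < len(image[0]):
                out[i][j] = image[v][u]
    return out

# Synthetic ring of radius 10 about (20, 20): it unwraps into the row p = 10.
ring = [[1 if abs(math.hypot(u - 20, v - 20) - 10.0) < 0.8 else 0
         for u in range(41)] for v in range(41)]
unwrapped = polar_unwrap(ring, 20, 20, n_p=15, n_beta=8)
```

If the assumed principal point is offset from the true one, the unwrapped rows bend instead of staying horizontal, which is exactly the sensitivity the text reports.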
When the symmetry of the ICP 230 is at the optimum, examined by either the first
or the second symmetry index, or both, the optical axis 21 will be regarded as being
perpendicularly aligned to the pattern center 225, and the imaged point projected from the
pattern center 225 is considered the principal point 235; meanwhile, the axis orthogonal
to the pattern center 225 would pass through the principal point 235 and be perpendicular
to the image plane 23. That is to say, the spatial sight ray representing the orthogonal axis
can absolutely position the optical axis 21 of the fisheye lens 24. Thus it can be seen that
an innovative contribution of the invention is this: the absolute coordinate of the optical
axis 21 can be obtained by referring to the absolute position of the PCP 220. This implies
that the spatial absolute coordinate of the VP 241 on the optical axis 21 can also be
determined by referring to the absolute position of the PCP 220. Hence the issue of posing
the fisheye camera is solved.
Referring to FIG. 5A again, after the exposure of the optical axis 21, the VP 241
must be located on the optical axis 21 in light of the theory of optics. It means that the
possible range of the VP 241 shrinks to a quite limited scope. Under the numerical
limitations of the concentric circles' radii of the PCP 220 and ICP 230, denoted as (ri, pi),
the method of trial-and-error is employed to examine each test point postulated as the VP
241 on the optical axis to find the optimal spot of the VP 241 in conformity with a specific
projection model, following which the focal length constant (denoted as f) of the fisheye
camera is derivable. The details are shown as follows:
If the location of the VP 241 on the optical axis 21 is given, the value of D is
determined by referring to the coordinate of the pattern center 225 of the PCP 220;
accordingly, the zenithal distance αi, defined by the ith concentric circle of the PCP 220,
is determined, that is, αi = tan^-1(ri/D). Further, the imaged radii (pi)
corresponding to the ith imaged contours are derivable from the image plane 23. The
values of f can be figured out by dividing pi by αi in the case of the EDP (p = fα). If the test
camera is an ideal EDP camera, all the f values calculated from the different concentric
circles are supposed to be a constant. Hence, the optical feature of the fisheye camera can
be identified by changing the value of D as well as adopting different projection models,
such as the stereographic projection (SGP, wherein p = 2f·tan(α/2)) or the orthographic
projection (OGP, wherein p = f·sin(α)), until a successful matching level is attained.
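For illustration only, the three projection functions compared here can be evaluated numerically as follows; the function names and the sample angle are ours, not part of the disclosure.

```python
import math

# Illustrative sketch: image height p as a function of the zenithal distance
# alpha (radians) and the focal length constant f, for the three classical
# projection models named above.
def p_edp(alpha, f):
    """Equidistant projection: p = f * alpha."""
    return f * alpha

def p_sgp(alpha, f):
    """Stereographic projection: p = 2f * tan(alpha / 2)."""
    return 2.0 * f * math.tan(alpha / 2.0)

def p_ogp(alpha, f):
    """Orthographic projection: p = f * sin(alpha)."""
    return f * math.sin(alpha)

# At a large visual angle the three models diverge clearly, which is why the
# experiment samples the outer image zone; e.g. alpha = 60 degrees, f = 1.78 mm:
a, f = math.radians(60.0), 1.78
heights = {"EDP": p_edp(a, f), "SGP": p_sgp(a, f), "OGP": p_ogp(a, f)}
```

Since tan(α/2) > α/2 > sin(α)/2 for 0 < α < π, the SGP always predicts the largest image height and the OGP the smallest at a given angle, so the models become easy to tell apart toward the image border.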
For descriptive purposes, the VP 241 is postulated as the origin E(0,0,0) of the
E(α,β,h) coordinate system, and the optical axis 21 (denoted as E(0,β,h), where β and h
are arbitrary) is postulated to overlap with the Z-axis (denoted as W(0,0,z), where z is a
real number) in the absolute coordinate system. Set the radii of the concentric circles of
the PCP 220 as ri and the corresponding image heights as pi. If the distance D between
the VP 241 and the PCP 220 is given, since both pi and αi are functions of D, the
projection function of the EDP is turned into the mathematical form: pi(D) = f·αi(D),
wherein i = 1~N and N is the number of imaged contours on the ICP 230 which
can be processed or sampled. If the Nth imaged contour is taken as the common reference,
namely pN(D) = f·αN(D), the relation with the ith imaged contour is given as:
pi(D)/pN(D) - αi(D)/αN(D) = 0 (1)
However, the value of D cannot be foreseen in advance because the VP 241 is not yet
fixed. A free point (0,0,z) therefore replaces (0,0,D) to formulate the equation (1); a
difference is given as:
ei(z) = pi(D)/pN(D) - αi(z)/αN(z) (2)
Because αi is decided by z and ri through the relation αi(z) = tan^-1(ri/z), and the
scales of pi are fixed on the image plane 23 (namely, the pi(D) are invariable while the
value of z changes), at least two conjugated coordinates (namely, the (ri, pi), representing
the information concerning a pair corresponding to the object point 221 and the imaged
point 231) are needed to be measured in the experiment in order to decide the value of
ei(z). Scanning along the optical axis 21, the object distance D can be fixed at the
minimum of ei(z) according to the equation (2); meanwhile the exact spot of the VP 241 is
consequently fixed.
However, the equation (2) only refers to two selected concentric circles. In order to
cover the overall FOV and investigate the effective range of the test projection function,
multiple traces spanning the image are necessary and ought to preferably reach larger
visual angles. To fairly deal with the contribution of each imaged contour to the test
projection function, a weight function is defined by referring to the increasing range of
each imaged contour, which is:
wi(D) = (pi(D) - pi-1(D))/pN(D) (3)
where p0(D) is a null value and is treated as the radius of the principal point 235.
Thus, the error function, which is practically used in the overall evaluation to search the
VP 241 on the optical axis 21, is:
ε(z) = Σ(i=1 to N) abs(ei(z) × wi(D)) (4)
where z is the distance of a free point on the optical axis 21 from the PCP 220. The
VP 241 is located at the point where the ε(z) is minimum, or null. Equation (4) is
established on the postulation of the EDP; if the postulation is turned into other possible
projection models, such as the SGP (p = 2f·tan(α/2)) or the OGP (p = f·sin(α)), equations
from (1) to (4) have to be derived once again according to their projection functions
separately. Overall, the derivation in accordance with the idea described above is termed
the "ε-algorithm" in the invention.
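A minimal numerical sketch of the ε-algorithm under the EDP postulation, following equations (1) to (4). The synthetic data (f = 1.8, D = 100, and the radii) are invented for illustration and are not experimental values.

```python
import math

# Numerical sketch of the epsilon-algorithm under the EDP postulation.
# r and p hold the conjugated coordinates (ri, pi), i = 1..N.
def epsilon(z, r, p):
    N = len(r)
    alpha = [math.atan2(ri, z) for ri in r]            # alpha_i(z) = atan(ri/z)
    total = 0.0
    for i in range(N):
        e_i = p[i] / p[N - 1] - alpha[i] / alpha[N - 1]    # equation (2)
        p_prev = p[i - 1] if i > 0 else 0.0                # p0 is a null value
        w_i = (p[i] - p_prev) / p[N - 1]                   # equation (3)
        total += abs(e_i * w_i)                            # equation (4)
    return total

def locate_vp(r, p, z_grid):
    """Trial-and-error scan along the optical axis: the z minimising
    epsilon(z) is taken as the object distance D, fixing the VP."""
    return min(z_grid, key=lambda z: epsilon(z, r, p))

# Synthetic ideal-EDP camera: p_i = f * atan(r_i / D).
f_true, D_true = 1.8, 100.0
r = [20.0, 40.0, 60.0, 80.0, 100.0]
p = [f_true * math.atan(ri / D_true) for ri in r]
D_est = locate_vp(r, p, [50.0 + 0.5 * k for k in range(201)])
```

For data generated by an exact EDP camera, ε(z) vanishes at z = D, so the scan recovers the true object distance.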
To obtain the focal length constant f, the measured pi(D) and the respective αi(D) are
used to obtain:
f̄(D) = Σ(i=1 to N) fi(D) × wi(D) (5)
where fi(D) = pi(D)/αi(D). Similarly, fi(D) will be equal to (1/2)·pi(D)/tan(αi(D)/2) if
the postulated projection function becomes the SGP; or fi(D) equals pi(D)/sin(αi(D)) if
the postulated projection function is the OGP. The fi(D) and f̄(D) should be equal to the
inherent focal length constant f of the fisheye camera as long as the postulated projection
function is exact, no error occurs in measurement, and the D value is correctly inferred.
Put into practice, the descriptive-statistics standard deviation of all fi(D) can be the basis
to evaluate the accuracy of the postulated projection model. Namely, the following
equation can qualify the applicability of the postulated projection function, which is
termed the "σ-algorithm" in the invention:
σ(D) = [ Σ_(i=1..N) (f_i(D) - f(D))² / (N-1) ]^(1/2)    (6)
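A matching sketch of the σ-algorithm (equations (5) and (6)), again under the EDP postulate with f_i(D) = p_i(D)/α_i(D), and again with an illustrative function name and data layout:

```python
import math

def sigma_algorithm(r, p, z):
    """Hedged sketch of the sigma-algorithm, eqs. (5)-(6), EDP postulate.

    r -- radii of the concentric circles on the target (mm)
    p -- radii of the corresponding imaged contours (mm)
    z -- trial distance of the test point from the PCP (mm)
    Returns (f_weighted, sigma): the weighted focal-length estimate f(D)
    and the descriptive-statistics standard deviation of the f_i(D).
    """
    n = len(r)
    # Same weight function as eq. (3), with p_0 = 0.
    w = [(p[i] - (p[i - 1] if i > 0 else 0.0)) / p[-1] for i in range(n)]
    alpha = [math.atan2(ri, z) for ri in r]
    f = [p[i] / alpha[i] for i in range(n)]            # f_i(D), EDP case
    f_w = sum(f[i] * w[i] for i in range(n))           # eq. (5)
    sigma = math.sqrt(sum((fi - f_w) ** 2 for fi in f) / (n - 1))  # eq. (6)
    return f_w, sigma
```

Scanning z and keeping the trial value with the smallest σ reproduces, in spirit, the σ-profiles of FIG. 10.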
For an advanced evaluation of the reliability of the test results (including the position of the optical axis 21 and the matching projection model), referring to FIG. 7 again, the target 22 is moved twice along the positive direction of the Z-axis from the initial coordinate of the target 22, where the alignment of the optical axis 21 is completed, covering 5 mm each time. The position of the camera 60 and the target's coordinates on the X' and Y' rigid axes are fixed during the two additional tests. Counting the first test, the three experiments are named "Test1", "Test2" and "Test3" in sequence.
Table 1: The parameters and results of the three tests
[Table 1 appears as an image (imgf000028_0001) in the original publication.]
(unit: mm, except ε and σ, which are unitless.)
Table 1 lists the inferred values of D, f and ε/σ obtained in the three tests. Each test
separately employs the ε-algorithm and the σ-algorithm to handle the measured data under
the postulations selected from the group comprising the EDP, the OGP and the SGP.
Comparing the absolute offsets in the left column, the results show that the test lens is
very close to the EDP type, because the D-values shown in the EDP columns, irrespective
of whether they are inferred by the ε-algorithm or the σ-algorithm, faithfully reflect the
5 mm offset each time; however, a difference of about 0.5 mm persists between the D-values
of the two algorithms in each test. Moreover, the inferred values of f
(1.82 mm/1.85 mm) in the EDP postulation are close to the 1.78 mm given by the
specifications; the difference may be caused by the manual fabrication of the lens. On
the other hand, the results corresponding to the OGP and the SGP are all far from the
given data, namely the absolute offsets and the focal length constant (f). The very small
values of ε/σ shown in the last row demonstrate the excellent accuracy of the two algorithms disclosed in the invention.
FIG. 10 shows the ε-profiles and the σ-profiles while testing the D-value along the
Z-axis in the absolute coordinate system, taking "Test1" as an example. The ε- or σ-
curves all reveal obvious minima under the six test conditions (three projection models
multiplied by two algorithms). The single minimum in each test condition verifies the
existence and location of the VP 241; it also proves the practicability of the invention.
Nevertheless, the location of the VP 241 and the value of the focal length constant
differ when different projection functions are postulated for the same lens; this implies
that a single test is not enough to find the real natural projection function of the test
lens. In practice, a single circular projection function alone is also not enough
to entirely describe the projecting behavior of a single lens.
The method disclosed in the invention, however, is not limited to a specific
projection model, such as the EDP. Any non-linear projection lens with a given projection
function is analyzable and the test camera mounting the lens is accordingly parameterized.
The method disclosed in the invention has the function of categorizing the real natural
projection models of test cameras without the postulation of an exact π-FOV. The
transformation and presentation of fisheye images in the invention is completely based on
the optical models, starting from the principal point 235 and moving outwards, so the
problems of the blurred boundaries and their uncertain visual angles can be ignored.
Consequently, users can designate a user-defined area on the image plane 23, which is the
part effectively complying with a particular projection model, and transform the image
solely within the user-defined area. Thus, the problem of fixing the boundaries is
eliminated, and any precondition that the whole imaging area has to comply with a
particular projection function is removed. In some cases, users can even shorten the
sampling area to meet the required accuracy.
In conclusion, the principal point 235, the optical axis 21 and the VP 241 can be
positioned accurately with the aid of the method in the invention, and the morphologic
fidelities of the images transformed can accordingly be recovered. Hence, the applications
of the fisheye camera will be widely expanded by the present invention.
In general, the invention possesses the following advantages:
1. The present invention enables the function of tracing the position of the optical
axis 21 to be concretely realized in practice so as to be able to further search the
spatial absolute coordinate of the VP 241. This is a breakthrough in
parameterizing cameras.
2. The present invention radically simplifies the logic of image transformation,
making the transformation faster and cheaper, because the necessary optical
parameters, such as the principal point 235 and the focal length constant, can
be precisely inferred by the invention; the transformed images accordingly
recover morphological fidelity.
3. The method of camera-parameterization disclosed in the invention is suitable for
various fisheye cameras with different projection mechanisms, without any
limitation to a particular projection model, e.g. the EDP.
4. The present invention needs no postulation established on an assumed and
uncertain image boundary; hence the problem of the blurred boundary of the
fisheye image is eliminated.
5. The present invention can accurately locate the single VP 241 in a particular
projection model as the optical center for image transformation, so image
metering through the fisheye camera becomes practicable.
6. The high precision of the optical parameters will greatly extend the applicable
visual angles of current visual systems.
The invention being thus described, it will be obvious that the same may be varied in
many ways. Such variations are not to be regarded as a departure from the spirit and scope
of the invention, and all such modifications as would be obvious to one skilled in the art
are intended to be included within the scope of the following claims.

Claims

CLAIMS

What is claimed is:
1. A method for exploring the optical parameters of a camera, the camera having a
non-linear perspective projection lens which conforms to one of a plurality of given
projection models, the method comprising:
providing a target with a physical central-symmetric pattern (PCP)
composed of a pattern center and a plurality of center-symmetric geometric
figures;
placing the target in the field of view (FOV) of the camera to allow the PCP
imaging on an image plane;
adjusting the relative position between the target and the camera in order to
obtain an imaged central-symmetric pattern (ICP), imaged by the PCP, on
the image plane; and
examining the symmetry of the ICP by at least one symmetric index until the
at least one symmetric index meets the requirement for accuracy, whereupon
the feature coordinate of an imaged point projected from the pattern
center is a principal point on the image plane.
2. The method according to claim 1, wherein the pattern center further determines
a spatial sight ray, which perpendicularly passes through the pattern center, to be an optical axis of the camera.
3. The method according to claim 1, wherein the plurality of center-symmetric
geometric figures is selected from the group comprising concentric circles, concentric rectangles, concentric triangles or concentric polygons.
4. The method according to claim 1, wherein the plurality of center-symmetric
geometric figures is a combination of any number of concentric-and-symmetric circles,
rectangles, triangles and/or polygons.
5. The method according to claim 1, wherein the symmetric index is determined by
the steps comprising:
calculating a distance-summation by summing up a plurality of border-
distances lying in the same radial direction, where each border-distance is
defined as the length from the principal point to one of a plurality of imaged
contours projected from the plurality of center-symmetric geometric figures;
and
computing a plurality of differences where each is obtained by subtracting
two distance-summations in the opposite radial directions, the plurality of
differences composing the symmetry index.
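For illustration only (with hypothetical names and a simplified contour representation), the two steps of claim 5 could be sketched as follows; a real implementation would sample the border distances from detected edge pixels rather than from callables.

```python
import math

def symmetry_index(contours, directions):
    """Hedged sketch of the claim-5 symmetry index (differences of
    opposite distance-summations).

    contours   -- one callable per imaged contour; c(theta) returns the
                  border distance from the candidate principal point along
                  polar angle theta (how the border is sampled from edge
                  pixels is left out of this sketch)
    directions -- probe angles in radians; each is compared with theta + pi
    Returns one difference per probe angle; all vanish when the candidate
    principal point is the true centre of a symmetric ICP.
    """
    diffs = []
    for theta in directions:
        s_fwd = sum(c(theta) for c in contours)            # distance-summation
        s_bwd = sum(c(theta + math.pi) for c in contours)  # opposite direction
        diffs.append(s_fwd - s_bwd)
    return diffs
```

When all differences are within the required accuracy, the candidate point qualifies as the principal point of claim 1.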
6. The method according to claim 1, wherein the symmetry index is determined by
the steps comprising:
calculating a distance-summation by summing up a plurality of border-
distances lying in the same radial direction, where each border-distance is
defined as the length from the principal point to one of a plurality of imaged
contours projected from the plurality of center-symmetric geometric figures;
and
computing a plurality of sums where each is obtained by adding two
distance-summations in the opposite radial directions, the plurality of sums
composing the symmetry index.
7. The method according to claim 1, wherein when the plurality of center-
symmetric geometric figures is a plurality of concentric circles, the symmetry index is
determined by the steps comprising:
transforming the ICP by a polar-coordinate transformation so as to turn the
plurality of concentric circles into a plurality of lines; and
examining the linearity of the plurality of lines as comprising the symmetry
index.
8. The method according to claim 2, wherein along the optical axis a VP is further
located by the steps comprising:
selecting one of the plurality of given projection models as a test projection
function;
postulating a test point on the optical axis;
taking the test point as a reference point and deducing at least two zenithal
distances (α) defined by at least two geometric figures selected from the
plurality of center-symmetric geometric figures;
calculating at least two principal distances (p) defined by at least two imaged
contours on the image plane corresponding to the at least two geometric
figures; and
obtaining at least two focal lengths by separately substituting one of the at
least two zenithal distances (α) and the corresponding one of the at least two
principal distances (p), comprising at least two groups of data, into the test
projection function, when the at least two focal lengths are equal to a
constant, the test point is the VP and the test projection function is the natural
projection function of the camera.
9. The method according to claim 8, wherein the spatial absolute coordinate of the
VP is referred to the absolute position of the PCP.
10. The method according to claim 8, wherein the test projection function is
selected from the group comprising an equidistant projection (EDP), an orthographic
projection (OGP) and a stereographic projection (SGP).
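The three candidate projection functions of claim 10, each mapping the zenithal distance α (radians) and focal length constant f to the principal distance p, can be written down directly (a sketch for illustration, not part of the claims):

```python
import math

# The three circular projection models of claim 10, mapping the zenithal
# distance alpha (radians) to the principal distance p on the image plane
# for a focal length constant f.
def edp(f, alpha):
    """Equidistant projection: p = f * alpha."""
    return f * alpha

def sgp(f, alpha):
    """Stereographic projection: p = 2 * f * tan(alpha / 2)."""
    return 2.0 * f * math.tan(alpha / 2.0)

def ogp(f, alpha):
    """Orthographic projection: p = f * sin(alpha)."""
    return f * math.sin(alpha)
```

For small zenithal distances all three models coincide to first order; they diverge toward the rim of the fisheye image, which is why the wide-angle contours discriminate between them.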
11. The method according to claim 8, wherein the at least two groups of data are
further examined by an ε-algorithm to minimize the value of an error function whose
mathematical form is ε(z) = Σ_(i=1..N) abs(e_i(z) x w_i(D)), wherein e_i(z) is an error function
obtained by subtracting a focal length constant from the test projection function, w_i(D) is
a weight function and N is the amount of imaged contours sampled.
12. The method according to claim 8, wherein the at least two groups of data are
further examined by a σ-algorithm to minimize the value of an error function whose
mathematical form is σ(D) = [ Σ_(i=1..N) (f_i(D) - f(D))² / (N-1) ]^(1/2), wherein
f(D) = Σ_(i=1..N) f_i(D) x w_i(D), f_i(D) is a focal length constant corresponding to the i-th imaged
contour based on the test projection function, w_i(D) is a weight function and N is the
amount of imaged contours sampled.
13. The method according to claim 1, wherein the target is mounted on an adjusting
platform possessed of three rigid axes which are perpendicular to each other and capable of adjusting the position of the target.
14. The method according to claim 1, wherein the camera is mounted on a camera
holder comprising a PTZ-head for adjusting the direction of the lens of the camera.
PCT/IB2004/001106 2003-04-18 2004-04-13 Method for exploring optical parameters of camera WO2004092825A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW92109160 2003-04-18
TW92109160A TW565736B (en) 2003-04-18 2003-04-18 Method for determining the optical parameters of a camera

Publications (1)

Publication Number Publication Date
WO2004092825A1 true WO2004092825A1 (en) 2004-10-28

Family

ID=32503979

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2004/001106 WO2004092825A1 (en) 2003-04-18 2004-04-13 Method for exploring optical parameters of camera

Country Status (2)

Country Link
TW (1) TW565736B (en)
WO (1) WO2004092825A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI554754B (en) * 2015-04-02 2016-10-21 財團法人國家實驗研究院 Automated optical inspection system for detecting defect of hollow cylinders and method thereof
TWI555378B (en) * 2015-10-28 2016-10-21 輿圖行動股份有限公司 An image calibration, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
TWI555379B (en) * 2015-11-06 2016-10-21 輿圖行動股份有限公司 An image calibrating, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5185667A (en) * 1991-05-13 1993-02-09 Telerobotics International, Inc. Omniview motionless camera orientation system
US5870135A (en) * 1995-07-27 1999-02-09 Sensormatic Electronics Corporation Image splitting forming and processing device and method for use with no moving parts camera
EP1028389A2 (en) * 1999-02-12 2000-08-16 Advanet, Inc. Arithmetic unit for image transformation
US20030090586A1 (en) * 2001-09-17 2003-05-15 Gwo-Jen Jan Method for exploring viewpoint and focal length of camera


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107024339A (en) * 2017-04-21 2017-08-08 杭州蓝斯特科技有限公司 A kind of test device and method for wearing display device
CN107024339B (en) * 2017-04-21 2023-10-20 小艾帮帮(杭州)科技有限公司 Testing device and method for head-mounted display equipment
CN108257102A (en) * 2018-01-22 2018-07-06 豪威科技(上海)有限公司 Flake corrects system and method
CN110858899A (en) * 2018-08-22 2020-03-03 高新兴科技集团股份有限公司 Method and system for measuring optical axis center and field angle of camera movement
CN109886889A (en) * 2019-02-12 2019-06-14 哈尔滨工程大学 A kind of aerial based on circle center error penalty method plus by oily tapered sleeve precise positioning method
CN110031014A (en) * 2019-03-27 2019-07-19 浙江亚特电器有限公司 Vision positioning method based on pattern identification
CN110031014B (en) * 2019-03-27 2024-01-26 浙江亚特电器股份有限公司 Visual positioning method based on pattern recognition
CN113345033A (en) * 2021-07-14 2021-09-03 云南大学 Method and system for calibrating internal parameters of central catadioptric camera
CN113345033B (en) * 2021-07-14 2022-07-15 云南大学 Method and system for calibrating internal parameters of central catadioptric camera

Also Published As

Publication number Publication date
TW200422755A (en) 2004-11-01
TW565736B (en) 2003-12-11

Similar Documents

Publication Publication Date Title
CN106651942B (en) Three-dimensional rotating detection and rotary shaft localization method based on characteristic point
US6985183B2 (en) Method for exploring viewpoint and focal length of camera
US8107722B2 (en) System and method for automatic stereo measurement of a point of interest in a scene
Habib et al. Automatic calibration of low-cost digital cameras
US7479982B2 (en) Device and method of measuring data for calibration, program for measuring data for calibration, program recording medium readable with computer, and image data processing device
US5699444A (en) Methods and apparatus for using image data to determine camera location and orientation
US20040046888A1 (en) Method for presenting fisheye-camera images
US10542195B2 (en) Pressurized fluid-submerged, internal, close-range photogrammetry system for laboratory testing
JP2016128816A (en) Surface attribute estimation using plenoptic camera
CN111709985B (en) Underwater target ranging method based on binocular vision
CN109883391B (en) Monocular distance measurement method based on digital imaging of microlens array
CN109727291A (en) A kind of high-precision online calibration method of zoom camera
CN109341668A (en) Polyphaser measurement method based on refraction projection model and beam ray tracing method
CN109711400A (en) A kind of electric inspection process method and apparatus identifying simulated pointer formula meter reading
EP1484576A2 (en) Apparatus and method for calibrating zoom lens
CN109974618A (en) The overall calibration method of multisensor vision measurement system
WO2004092825A1 (en) Method for exploring optical parameters of camera
WO2004092826A1 (en) Method and system for obtaining optical parameters of camera
Nakath et al. An Optical Digital Twin for Underwater Photogrammetry: GEODT—A Geometrically Verified Optical Digital Twin for Development, Evaluation, Training, Testing and Tuning of Multi-Media Refractive Algorithms
JPH03500934A (en) Calibration system for output correlation plane of optical correlator
CN107527323B (en) Calibration method and device for lens distortion
CN106643731A (en) System and method for tracking and measuring point target
Habib et al. New approach for calibrating off-the-shelf digital cameras
VATTAI Development and validation of a horizon-based optical navigation test facility
Shafer Automation and calibration for robot vision systems

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase