Camera Calibration

Camera calibration estimates intrinsic and extrinsic camera parameters using correspondences between 3D points on a calibration object and their 2D image projections. There are two main methods: indirect calibration estimates projection matrices from point correspondences, then decomposes them; direct calibration directly estimates parameters. Both require a calibration object with known 3D geometry and position, whose 2D features can be located accurately in images. Parameter estimation involves solving homogeneous systems to relate the 3D-2D point correspondences to the parameters. Accuracy is estimated by re-projecting known 3D points and measuring errors.

Camera Calibration

CS485/685 Computer Vision


Prof. Bebis

Camera Calibration - Goal


Estimate the extrinsic and intrinsic camera parameters.
[Figure: perspective projection model; the effective focal lengths along the image axes are f/sx and f/sy.]

Camera Calibration - How


Using a set of known correspondences between point features in the world, (Xw, Yw, Zw), and their projections on the image, (xim, yim).

Calibration Object
Calibration relies on one or more images of a calibration
object:
(1) A 3D object of known geometry.
(2) Located in a known position in space.
(3) Yields image features which can be located accurately.

Calibration object: example


Two orthogonal grids of
equally spaced black squares.
Assume that the world
reference frame is centered at
the lower left corner of the
right grid, with axes parallel to
the three directions identified by
the calibration pattern.

Calibration pattern: example (contd)


Obtain 3D coordinates (Xw, Yw, Zw)
Given the size of the planes, the number
of squares etc. (i.e., all known by
construction), the coordinates of each
vertex can be computed in the world
reference frame using trigonometry.

Calibration pattern: example (contd)


Obtain 2D coordinates (xim, yim)
The projection of the vertices on
the image can be found by
intersecting the edge lines of the
corresponding square sides (or
through corner detection).

Problem Statement

Compute the extrinsic and intrinsic camera


parameters from N corresponding pairs of points:
(Xw_i, Yw_i, Zw_i) and (xim_i, yim_i), i = 1, ..., N.
This is a very well-studied problem, and many different methods for camera calibration exist.

Methods
(1) Indirect camera calibration
(1.1) Estimate the elements of the projection matrix.
(1.2) If needed, compute the intrinsic/extrinsic camera
parameters from the entries of the projection matrix.

Methods (contd)
(2) Direct camera calibration
Direct recovery of the intrinsic and extrinsic camera
parameters.

Method 1: Indirect Camera Calibration


Review of basic equations

Note: replaced (xim,yim) with (x,y) for simplicity.

(Method 1) Step 1: solve for mijs


Since M is defined only up to an arbitrary scale factor, it has 11 independent entries (e.g., fix the scale by dividing every entry by one of them, such as m11, or by constraining ||m|| = 1).

At least 11 equations are needed to compute M; since each correspondence yields two equations, at least 6 world-image point correspondences are needed.

(Method 1) Step 1: solve for mijs


Each 3D-2D correspondence (Xw_i, Yw_i, Zw_i) ↔ (x_i, y_i) gives rise to two equations:

x_i (m31 Xw_i + m32 Yw_i + m33 Zw_i + m34) = m11 Xw_i + m12 Yw_i + m13 Zw_i + m14
y_i (m31 Xw_i + m32 Yw_i + m33 Zw_i + m34) = m21 Xw_i + m22 Yw_i + m23 Zw_i + m24

(Method 1) Step 1: solve for mijs


This leads to a homogeneous system of equations A m = 0, where A is a 2N x 12 matrix (two rows per correspondence) and m is the 12-vector of the entries of M.

(Method 1) Step 1: solve for mijs
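The content of this step — building A and solving A m = 0 with SVD — can be sketched in NumPy as follows. This is a minimal sketch; the function name and the synthetic camera matrix are made up for illustration:

```python
import numpy as np

def estimate_projection_matrix(Pw, p):
    """Solve A m = 0 for the 12 entries of M via SVD.

    Pw: (N, 3) world points; p: (N, 2) image points; needs N >= 6.
    """
    rows = []
    for (X, Y, Z), (x, y) in zip(Pw, p):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z, -x])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z, -y])
    A = np.array(rows)                       # 2N x 12 coefficient matrix
    # The least-squares solution with ||m|| = 1 is the right singular
    # vector of A associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)

# Synthetic check with a made-up camera matrix M_true:
M_true = np.array([[800., 0., 320., 10.],
                   [0., 800., 240., 20.],
                   [0., 0., 1., 5.]])
rng = np.random.default_rng(0)
Pw = rng.uniform(-1.0, 1.0, (8, 3))
ph = np.c_[Pw, np.ones(8)] @ M_true.T        # homogeneous projections
p = ph[:, :2] / ph[:, 2:]
M_est = estimate_projection_matrix(Pw, p)
M_est *= M_true[2, 3] / M_est[2, 3]          # fix the arbitrary scale (and sign)
```

With noise-free correspondences the matrix is recovered exactly up to the arbitrary scale; with real measurements the same singular vector gives the least-squares estimate.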

(Method 1) Step 2: find intrinsic/extrinsic parameters

(Method 1) Step 2: find intrinsic/extrinsic parameters

Let's define the following vectors:

(Method 1) Step 2: find intrinsic/extrinsic parameters


The solutions are as follows (see the book chapter for details):

The remaining parameters are then easily computed.
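The closed-form decomposition referred to above can be sketched as follows, assuming the convention M = K [R | T] with K = [[fx, 0, ox], [0, fy, oy], [0, 0, 1]] and no skew; the book's derivation uses negative focal-length terms, so individual signs may differ from it. The function name and test values are hypothetical:

```python
import numpy as np

def decompose_projection_matrix(M):
    """Split M = K [R | T] into intrinsic and extrinsic parameters.

    Assumes K = [[fx, 0, ox], [0, fy, oy], [0, 0, 1]] (no skew).
    """
    M = M / np.linalg.norm(M[2, :3])         # fix scale: ||R3|| = 1
    if M[2, 3] < 0:                          # fix sign: Tz > 0 (object in front)
        M = -M
    q1, q2, q3 = M[0, :3], M[1, :3], M[2, :3]
    ox, oy = q1 @ q3, q2 @ q3                # principal point
    fx = np.linalg.norm(q1 - ox * q3)        # effective focal lengths
    fy = np.linalg.norm(q2 - oy * q3)
    R = np.vstack([(q1 - ox * q3) / fx,      # rotation rows R1, R2, R3
                   (q2 - oy * q3) / fy,
                   q3])
    Tz = M[2, 3]
    T = np.array([(M[0, 3] - ox * Tz) / fx,
                  (M[1, 3] - oy * Tz) / fy,
                  Tz])
    return fx, fy, ox, oy, R, T

# Synthetic check (hypothetical parameters):
K = np.array([[800., 0., 320.], [0., 780., 240.], [0., 0., 1.]])
R_true = np.eye(3)
T_true = np.array([0.1, -0.2, 5.0])
M = K @ np.c_[R_true, T_true]
fx, fy, ox, oy, R, T = decompose_projection_matrix(M)
```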

Method 2: Direct Camera Calibration


Review of basic equations
From world coordinates to camera coordinates

For simplicity, we will replace -T with T


Warning: this is NOT the same T as before!

Pc = R Pw + T

Method 2: Direct Camera Calibration (contd)


Review of basic equations
From camera coordinates to pixel coordinates:

Relating world coordinates to pixel coordinates:

Method 2: Direct Parameter Calibration


Intrinsic parameters
Intrinsic parameters f, sx, sy, ox, and oy are not independent.

Define the following four independent parameters:
fx = f/sx, fy = f/sy, ox, oy.

Method 2: Main Steps


(1) Assuming that ox and oy are known, estimate
all other parameters.
(2) Estimate ox and oy

(Method 2) Step 1: estimate fx, fy, R, and T


To simplify notation, set (xim − ox, yim − oy) = (x, y)

Combining the equations above (i.e., same denominator),


we have:

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
Each pair of corresponding points (Xw_i, Yw_i, Zw_i) ↔ (x_i, y_i) must satisfy the previous equation.

Divide by fy and re-arrange terms:

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
we obtain the following equation:

x_i (r21 Xw_i + r22 Yw_i + r23 Zw_i + Ty) = y_i (α r11 Xw_i + α r12 Yw_i + α r13 Zw_i + α Tx)

where α = fx/fy.

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
Assuming N correspondences leads to a homogeneous system A v = 0, where A is an N x 8 matrix (one equation per correspondence) and v = (r21, r22, r23, Ty, α r11, α r12, α r13, α Tx), defined up to scale.
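The construction and SVD solution of this N x 8 system can be sketched as follows, assuming each correspondence satisfies x_i (R2·P_i + Ty) = y_i α (R1·P_i + Tx) with α = fx/fy and unknown vector v = (r21, r22, r23, Ty, α r11, α r12, α r13, α Tx) up to scale. The function name and synthetic camera are made up:

```python
import numpy as np

def solve_direct_system(Pw, p):
    """Solve the N x 8 homogeneous system A v = 0 via SVD and recover the
    aspect ratio alpha = fx/fy from the norms of the two halves of v
    (||(r21, r22, r23)|| = 1 fixes the overall scale, up to sign)."""
    x, y = p[:, 0], p[:, 1]
    Ph = np.c_[Pw, np.ones(len(Pw))]        # (X, Y, Z, 1) per point
    A = np.hstack([x[:, None] * Ph, -y[:, None] * Ph])
    _, _, Vt = np.linalg.svd(A)
    v = Vt[-1]                              # null-space direction
    scale = np.linalg.norm(v[:3])           # = |overall scale|, since ||R2|| = 1
    alpha = np.linalg.norm(v[4:7]) / scale
    return v / scale, alpha                 # v up to a common sign; alpha = fx/fy

# Synthetic check with a made-up camera (rotation about the z-axis):
th = 0.3
R = np.array([[np.cos(th), -np.sin(th), 0.],
              [np.sin(th),  np.cos(th), 0.],
              [0., 0., 1.]])
T = np.array([0.1, -0.2, 5.0])
fx, fy = 800.0, 780.0
rng = np.random.default_rng(1)
Pw = rng.uniform(-1.0, 1.0, (10, 3))
Pc = Pw @ R.T + T                           # world -> camera coordinates
p = np.c_[fx * Pc[:, 0] / Pc[:, 2], fy * Pc[:, 1] / Pc[:, 2]]
v, alpha = solve_direct_system(Pw, p)
```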

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
Determine α and |Ty|

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
Determine r21, r22, r23, r11, r12, r13, Ty, Tx

(up to an unknown common sign)

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
Determine r31, r32, r33
Can be estimated as the cross product of R1 and R2:

The sign of R3 is already fixed (the entries of R3 remain


unchanged if the signs of all the entries of R1 and R2 are
reversed).

We have estimated R; call the estimate R̂.

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
Ensure the orthogonality of R
The computation of R̂ does not explicitly take the orthogonality constraints into account, so the estimate R̂ cannot be expected to be orthogonal.
Enforce orthogonality on R̂ using SVD: R̂ = U D V^T

Replace D with I: R = U V^T
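This SVD fix takes only a few lines; a minimal sketch (the function name is hypothetical):

```python
import numpy as np

def closest_rotation(R_hat):
    """Factor R_hat = U D V^T and replace D with I: the result U V^T is
    the orthogonal matrix closest to R_hat in the Frobenius norm."""
    U, _, Vt = np.linalg.svd(R_hat)
    return U @ Vt

# Example: a slightly non-orthogonal estimate of a rotation
R_hat = np.eye(3) + 0.01 * np.array([[0., -1., 0.],
                                     [1., 0., 0.],
                                     [0., 0., 0.5]])
R = closest_rotation(R_hat)
```

For an estimate close to a true rotation, U V^T also has determinant +1, so the result is a proper rotation and not a reflection.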

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
Determine the sign of Ty (the common sign of the solution)
Consider the following equations again:

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
Determine Tz and fx
Consider the equation:

Let's rewrite it in the form:

x Tz + fx (r11 Xw + r12 Yw + r13 Zw + Tx) = -x (r31 Xw + r32 Yw + r33 Zw)

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
We can obtain Tz and fx by solving a system of equations like
the above, written for N points:

Using SVD, the (least-squares) solution is:
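Stacking one such equation per point gives an N x 2 least-squares problem in the two unknowns (Tz, fx); a sketch under the book's sign convention x = -fx (R1·P + Tx) / (R3·P + Tz), with made-up names and test numbers:

```python
import numpy as np

def estimate_tz_fx(Pw, x, R, Tx):
    """Least-squares solution of x_i Tz + fx (R1.P_i + Tx) = -x_i (R3.P_i),
    one equation per point, for the two unknowns (Tz, fx)."""
    a = Pw @ R[0] + Tx                      # r11 Xw + r12 Yw + r13 Zw + Tx
    c = Pw @ R[2]                           # r31 Xw + r32 Yw + r33 Zw
    A = np.c_[x, a]                         # N x 2 coefficient matrix
    b = -x * c
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[0], sol[1]                   # Tz, fx

# Synthetic check:
rng = np.random.default_rng(2)
Pw = rng.uniform(-1.0, 1.0, (10, 3))
R = np.eye(3)
Tx, Tz_true, fx_true = 0.1, 5.0, 800.0
x = -fx_true * (Pw @ R[0] + Tx) / (Pw @ R[2] + Tz_true)
Tz, fx = estimate_tz_fx(Pw, x, R, Tx)
```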

(Method 2) Step 1: estimate fx, fy, R, and T


(contd)
Determine fy from fx and the ratio α = fx/fy: fy = fx / α.

(Method 2) Step 2: estimate ox and oy


The computation of ox and oy is based on the following
theorem:
Orthocenter Theorem: Let T be the triangle on the image
plane defined by the three vanishing points of three
mutually orthogonal sets of parallel lines in space. Then,
(ox , oy) is the orthocenter of T.

(Method 2) Step 2: estimate ox and oy (contd)


We can use the same calibration pattern to compute
three vanishing points (use three pairs of parallel lines
defined by the sides of the planes).

None of the three mutually orthogonal directions should be nearly parallel to the image plane!
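Given the three vanishing points, the orthocenter is the intersection of two altitudes, each a linear equation in the unknown point. A sketch (the vanishing-point coordinates below are made up):

```python
import numpy as np

def orthocenter(A, B, C):
    """Intersect two altitudes of triangle ABC: the point P satisfying
    (P - A).(C - B) = 0 and (P - B).(C - A) = 0."""
    M = np.array([C - B, C - A])            # altitude normal directions
    b = np.array([A @ (C - B), B @ (C - A)])
    return np.linalg.solve(M, b)

# Hypothetical vanishing points in image coordinates:
v1 = np.array([900., 100.])
v2 = np.array([-500., 200.])
v3 = np.array([300., 2000.])
o = orthocenter(v1, v2, v3)                 # estimate of (ox, oy)
```

The solve fails only if the three vanishing points are collinear, which cannot happen for three mutually orthogonal directions in general position.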

Comments
To improve the accuracy of camera calibration, it
is a good idea to estimate the parameters several
times (i.e., using different images) and average the
results.
Localization errors
The precision of calibration depends on how accurately the
world and image points are located.
Studying how localization errors "propagate" to the
estimates of the camera parameters is very important.

Comments (contd)
In theory, direct and indirect camera calibration should
produce the same results.
In practice, we obtain different solutions due to
different error propagations.
Indirect camera calibration is simpler and should be
preferred when we do not need to compute the
intrinsic/extrinsic camera parameters explicitly.

How should we estimate the accuracy of a calibration algorithm?
Project known 3D points onto the image.
Compare their projections with the measured pixel coordinates of the points.
Repeat for many points and estimate the re-projection error!
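This re-projection check can be sketched as follows (a minimal sketch; the camera matrix is made up):

```python
import numpy as np

def reprojection_error(M, Pw, p):
    """RMS distance (in pixels) between measured image points p and the
    projections of known 3D points Pw under the projection matrix M."""
    ph = np.c_[Pw, np.ones(len(Pw))] @ M.T   # homogeneous projections
    proj = ph[:, :2] / ph[:, 2:]
    return np.sqrt(np.mean(np.sum((proj - p) ** 2, axis=1)))

# A perfect calibration re-projects with (numerically) zero error:
M = np.array([[800., 0., 320., 10.],
              [0., 800., 240., 20.],
              [0., 0., 1., 5.]])
rng = np.random.default_rng(3)
Pw = rng.uniform(-1.0, 1.0, (12, 3))
ph = np.c_[Pw, np.ones(12)] @ M.T
p = ph[:, :2] / ph[:, 2:]
err = reprojection_error(M, Pw, p)
err_noisy = reprojection_error(M, Pw, p + 0.5)   # half-pixel shift per axis
```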
