Hello Q3man,
You did not mention any special optics, so I will assume uniform
pixels for the answer below. For standard lenses, this is a good
assumption (otherwise you would notice distortion in the pictures).
Please make a request for clarification if you need a non-uniform
solution (and identify the type of optics you are using). I also
provide references below to diagrams and code for some non-uniform
lenses.
First, from the information provided, you have "non square" pixels.
The pixels are wider than tall.
Horizontal
46.0000 degrees / 320 pixels = 0.143750000 degree/pixel
or 6.956521739 pixels per degree
Vertical
29.5885 degrees / 240 pixels = 0.123285417 degree/pixel
or 8.111259442 pixels per degree
Using the example you provided
H = 120.0000 + 025 x 0.143750000 = 123.59375 degrees
V = 090.0000 + 100 x 0.123285417 = 102.32854 degrees
Doing the inverse (using the same values)
X = (123.59375 - 120.0000) x 6.95652 = 025 pixels
Y = (102.32854 - 090.0000) x 8.11126 = 100 pixels
[values above rounded to the nearest pixel]
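In code, the conversion above and its inverse can be sketched as follows (a minimal Python sketch; the degree-per-pixel factors and the camera-center angles are taken from the example, and the function names are mine):

```python
# Uniform-pixel conversion between pixel offsets and angles, using the
# degree-per-pixel factors worked out above. The camera-center angles
# (120 horizontal, 90 vertical) are taken from the example.
DEG_PER_PX_H = 0.143750000   # 46.0 degrees / 320 pixels
DEG_PER_PX_V = 0.123285417   # vertical degrees / 240 pixels

def pixel_to_angle(x_px, y_px, h_center=120.0, v_center=90.0):
    """Angles (degrees) of a pixel offset from the camera center."""
    return (h_center + x_px * DEG_PER_PX_H,
            v_center + y_px * DEG_PER_PX_V)

def angle_to_pixel(h_deg, v_deg, h_center=120.0, v_center=90.0):
    """Inverse: pixel offsets, rounded to the nearest pixel."""
    return (round((h_deg - h_center) / DEG_PER_PX_H),
            round((v_deg - v_center) / DEG_PER_PX_V))

h, v = pixel_to_angle(25, 100)   # -> (123.59375, 102.3285417)
x, y = angle_to_pixel(h, v)      # -> (25, 100)
```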
As noted above, some lenses are not uniform, such as the one shown at
http://www.digitalsecrets.net/secrets/FisheyeNikkor10.5.html
Be sure to move the mouse over the image to see the differences
between the rectified image (default view) and the "as captured"
image.
It turns out that the formulas for a fisheye lens (and these can be
adapted for other non-uniform lenses) are quite simple. See
http://local.wasp.uwa.edu.au/~pbourke/projection/fisheye/
for illustrations as well as source code (in C) for a few different
methods of calculating the results.
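To give a concrete flavor of how simple the fisheye math is, here is a sketch of the common equidistant ("f-theta") model, where a pixel's distance from the image center is proportional to the angle off the optical axis (r = f * theta). This particular model and every name below are my assumption for illustration, not taken from the page:

```python
import math

def fisheye_pixel_to_angles(x_px, y_px, cx, cy, fov_deg, radius_px):
    """Off-axis angle and azimuth (degrees) of a fisheye pixel,
    assuming an equidistant (f-theta) projection.

    cx, cy     - image center in pixels
    fov_deg    - full field of view across the image circle
    radius_px  - radius of the image circle in pixels
    """
    dx, dy = x_px - cx, y_px - cy
    r = math.hypot(dx, dy)
    theta = (r / radius_px) * math.radians(fov_deg) / 2.0  # off-axis angle
    phi = math.atan2(dy, dx)                               # azimuth
    return math.degrees(theta), math.degrees(phi)

# A pixel on the edge of a 180-degree image circle is 90 degrees off axis:
theta, phi = fisheye_pixel_to_angles(320, 120, 160, 120, 180.0, 160)
```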
If any part of the answer is unclear or incomplete, please make a
clarification request. I would be glad to add to the answer so you are
completely satisfied.
--Maniac

Request for Answer Clarification by q3man-ga on 31 Jul 2006 20:41 PDT
That's the method that I'm currently using to control the system. The
problem I'm running into is that the degrees per pixel change when
transforming to the global coordinate system.
Think of sitting inside the center of a globe. If you are looking
towards the equator, your degrees/pixel are the same as the
longitudinal lines. As you look up towards the north pole (our angle
V), the longitudinal lines (our angle H) get closer together and the
calculations become inaccurate.
The camera optics can be considered uniform.
Clarification of Answer by maniac-ga on 31 Jul 2006 21:40 PDT
Hello Q3man,
Perhaps the simplest approach to describe is to treat the sphere and
intersecting plane as centered at the "equator", and then use rotations
to map to the correct coordinates. The rotation matrix R(alpha, beta,
gamma) for rotations about the Z, Y, and X axes is the 3 x 3 matrix
[ A, B, C ]
[ D, E, F ]
[ G, H, I ]
A = cos(alpha)*cos(beta)
B = sin(alpha)*cos(beta)
C = -sin(beta)
D = -sin(alpha)*cos(gamma) + cos(alpha)*sin(beta)*sin(gamma)
E = cos(alpha)*cos(gamma) + sin(alpha)*sin(beta)*sin(gamma)
F = cos(beta)*sin(gamma)
G = sin(alpha)*sin(gamma) + cos(alpha)*sin(beta)*cos(gamma)
H = -cos(alpha)*sin(gamma) + sin(alpha)*sin(beta)*cos(gamma)
I = cos(beta)*cos(gamma)
[from "Interactive Computer Graphics" by W.K. Giloi]
For your example, gamma is the "vertical rotation", beta is the
"horizontal rotation", and alpha = 0 appears to be a reasonable
assumption. That gives sin(alpha) = 0 and cos(alpha) = 1, which
simplifies the formulas to
A = cos(beta)
B = 0
C = -sin(beta)
D = sin(beta)*sin(gamma)
E = cos(gamma)
F = cos(beta)*sin(gamma)
G = sin(beta)*cos(gamma)
H = -sin(gamma)
I = cos(beta)*cos(gamma)
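As a quick numeric check, the general matrix and the alpha = 0 simplification can be compared element by element. A small sketch (function names are mine), transcribing the A through I formulas above:

```python
import math

def rotation_matrix(alpha, beta, gamma):
    """3x3 rotation R(alpha, beta, gamma) about the Z, Y, and X axes
    (angles in radians), using the element formulas quoted above."""
    ca, sa = math.cos(alpha), math.sin(alpha)
    cb, sb = math.cos(beta), math.sin(beta)
    cg, sg = math.cos(gamma), math.sin(gamma)
    return [
        [ca * cb,                 sa * cb,                -sb     ],
        [-sa * cg + ca * sb * sg, ca * cg + sa * sb * sg,  cb * sg],
        [sa * sg + ca * sb * cg, -ca * sg + sa * sb * cg,  cb * cg],
    ]

def simplified_matrix(beta, gamma):
    """Same matrix with alpha = 0 (sin(alpha) = 0, cos(alpha) = 1)."""
    cb, sb = math.cos(beta), math.sin(beta)
    cg, sg = math.cos(gamma), math.sin(gamma)
    return [
        [cb,       0.0, -sb     ],
        [sb * sg,  cg,   cb * sg],
        [sb * cg, -sg,   cb * cg],
    ]

# The two agree for any beta, gamma once alpha is fixed at zero:
b, g = math.radians(120.0), math.radians(30.0)
full, simple = rotation_matrix(0.0, b, g), simplified_matrix(b, g)
```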
I still need to confirm this interpretation, convert from XYZ to
lat/long, and write it up in a way that is simpler to explain. It's
getting late at my location - let me work on this some more tomorrow
and I'll provide a more complete answer then (and make sure it works
at / near the poles).
--Maniac
Clarification of Answer by maniac-ga on 01 Aug 2006 20:46 PDT
Hello Q3man,
I have the basic structure in place (as described previously) but am
getting some inconsistent results in my testing. I am still working on
it and putting together a complete answer. In the meantime, here's a
brief explanation of the method I am pursuing.
The steps for the solution (angle / elevation for the point) are:
[1] Using a camera center at 0 horizontal, 90 vertical, compute the
angle / elevation of the XY pixel (at 3.59 horizontal, 102.33
vertical)
[2] Convert the angle / elevation to coordinates in X, Y, and Z
[3] Rotate the frame of reference to the camera's orientation (120
horizontal, 90 vertical); X, Y, and Z are now new values Xr, Yr, and
Zr. [in this example, Y is unchanged, only X & Z change]
[4] Compute the polar coordinates (angle / elevation) of Xr, Yr, and Zr.
Performing the inverse (XY pixel from angle / elevation) is basically:
[1] Compute Xr, Yr, and Zr from the angle / elevation.
[2] Rotate the camera frame of reference to 0 horizontal, 90 vertical;
giving a new X, Y, and Z for the point.
[3] Compute the angle / elevation for that X, Y, and Z
[4] Compute the XY pixel location from the angle / elevation using the
conversion factor
There are several good explanations of the conversion between the X,
Y, Z coordinates (Euclidean) and angle / elevation (polar) such as
http://www.j3d.org/matrix_faq/vectfaq_latest.html#Q17
Note that what this reference calls lat / long is the typical
coordinate system and has to be mapped into the coordinates you've
chosen for angle and elevation.
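A minimal sketch of that Euclidean / polar conversion for a unit vector, assuming a Y-up axis convention (latitude measured from the XZ plane). The axis assignment and function names are mine; match them to whichever convention the rest of your pipeline uses:

```python
import math

def latlong_to_xyz(lat_deg, long_deg):
    """Unit vector for a latitude / longitude pair (Y-up convention)."""
    lat, lon = math.radians(lat_deg), math.radians(long_deg)
    return (math.cos(lat) * math.cos(lon),
            math.sin(lat),
            math.cos(lat) * math.sin(lon))

def xyz_to_latlong(x, y, z):
    """Inverse conversion for a unit vector."""
    lat = math.asin(max(-1.0, min(1.0, y)))   # clamp for rounding error
    lon = math.atan2(z, x)
    return math.degrees(lat), math.degrees(lon)

lat, lon = xyz_to_latlong(*latlong_to_xyz(12.33, 3.59))  # round-trips
```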
The hard part is making sure all the transformations (between
coordinate systems & the rotations) are done in a consistent manner &
converting to / from the "standard" polar coordinates to the angle /
elevation system you've chosen.
--Maniac
Request for Answer Clarification by q3man-ga on 03 Aug 2006 14:00 PDT
Maniac,
Thanks for putting in the effort you have, but I am still unable to
solve the rotations with the information provided. This looks to be a
much tougher problem than I originally thought. Would you mind if I
relisted the problem at a dollar amount more suitable to the
difficulty involved?
Clarification of Answer by maniac-ga on 03 Aug 2006 19:14 PDT
Hello Q3man,
Sorry for the delay. Apparently there are some errors in some of the
material I referenced, and I had to break the problem down into small
steps to work around them.
If you can, please download a copy of
http://homepage.mac.com/mhjohnson/maniac-ga/q3man/q3man2.xls
which has two worksheets:
- Q3Man - works using your coordinate system (0-180 vertical, 0-360 horizontal)
- Normal - works with a more "normal" coordinate system (+/- 90
vertical, +/- 180 horizontal)
Both are commented with the formulas and have some "check values" as
they go along to show the intermediate results are correct. This
(along with the answer below) should have enough explanation to
provide a reference solution for your software programmer.
Let me explain the method used as well as point out the errors I found.
At the top left of the worksheet are values for the "horizontal" and
"vertical" angles, the "right" and "down" (or up in the normal
version) offsets, and the computed pixels per degree for horizontal
and vertical movement.
At the top right are rotation matrices for the horizontal (Y axis)
and vertical (Z axis) rotations plus the negative angle rotations.
When moving from zero latitude and longitude, the rotation is -Z
first, then Y. When moving to zero latitude and longitude, the
rotation is -Y first, then Z.
The description that follows matches the "normal" version - I explain
the additional work needed for your coordinate system at the end of
the answer. Also note that all the sine / cosine calculations need to
be done in radians (unless your sine and cosine functions accept
degrees). The spreadsheet includes the proper radians / degrees
conversions.
Computing Angle / Elevation
[A1] Generate the lat / long for the Right / Down XY value; set the
distance to 1 (unit vector).
(this was already described)
[A2] Convert to XYZ using polar conversion. The polar conversion formula is
X = cos(lat)*cos(long)
Y = sin(lat)
Z = cos(lat)*sin(long)
The reference provided earlier has X and Z swapped (oops).
[A3] Rotate up to the proper latitude (-Z rotation, R_v in the
spreadsheet) using matrix multiplication. The rotation matrix for -Z
is: (in A through I form)
A = cos(-vertical), B = sin(-vertical), C=0
D = -sin(-vertical), E = cos(-vertical), F = 0
G = 0, H = 0, I = 1
[A4] Rotate to the proper longitude (Y rotation, Rh in the
spreadsheet) using matrix multiplication. The rotation matrix for Y
is: (in A through I form)
A = cos(horizontal), B = 0, C = -sin(horizontal)
D = 0, E = 1, F = 0
G = sin(horizontal), H = 0, I = cos(horizontal)
[A5] Convert to lat / long. The formula already referenced at
http://www.j3d.org/matrix_faq/vectfaq_latest.html#Q17
was correct.
In the spreadsheet, I use the computed value from [A5] above as the
starting point for computing the reverse transformation (so you can
confirm the results "match" the original values).
Computing XY pixel from the lat / long value.
[X1] Convert to XYZ using the polar conversion (see [A2] above)
[X2] Rotate to zero longitude using matrix multiplication (-Y
rotation). The matrix is
A = cos(-horizontal), B = 0, C = -sin(-horizontal)
D = 0, E = 1, F = 0
G = sin(-horizontal), H = 0, I = cos(-horizontal)
[X3] Rotate to zero latitude using matrix multiplication (Z rotation). The matrix is
A = cos(vertical), B = sin(vertical), C = 0
D = -sin(vertical), E = cos(vertical), F = 0
G = 0, H = 0, I = 1
[X4] Using the polar conversion, convert back to lat / long.
[X5] Using the degrees per pixel, convert to the XY pixels.
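The [A1]-[A5] and [X1]-[X5] step sequences can be sketched end to end. This is a hedged Python translation, not the spreadsheet itself: it works in the "normal" coordinate system (+/-90 latitude, +/-180 longitude), uses the degree-per-pixel factors from the earlier example, follows the polar conversion of [A2] (Y up), and all helper names are mine:

```python
import math

DEG_PER_PX_H, DEG_PER_PX_V = 0.14375, 0.123285417

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def rot_y(a):   # rotation about the Y axis by a radians, as quoted above
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, -s], [0, 1, 0], [s, 0, c]]

def rot_z(a):   # rotation about the Z axis by a radians, as quoted above
    c, s = math.cos(a), math.sin(a)
    return [[c, s, 0], [-s, c, 0], [0, 0, 1]]

def to_xyz(lat, lon):
    """[A2] polar conversion: X = cos(lat)cos(long), Y = sin(lat), ..."""
    lat, lon = math.radians(lat), math.radians(lon)
    return (math.cos(lat) * math.cos(lon), math.sin(lat),
            math.cos(lat) * math.sin(lon))

def to_latlong(v):
    lat = math.asin(max(-1.0, min(1.0, v[1])))
    return math.degrees(lat), math.degrees(math.atan2(v[2], v[0]))

def pixel_to_global(x_px, y_px, cam_long, cam_lat):
    # [A1] lat / long offset of the pixel from the optical axis
    lat, lon = y_px * DEG_PER_PX_V, x_px * DEG_PER_PX_H
    v = to_xyz(lat, lon)                            # [A2]
    v = mat_vec(rot_z(math.radians(-cam_lat)), v)   # [A3] -Z: up to latitude
    v = mat_vec(rot_y(math.radians(cam_long)), v)   # [A4] Y: to longitude
    return to_latlong(v)                            # [A5]

def global_to_pixel(lat, lon, cam_long, cam_lat):
    v = to_xyz(lat, lon)                            # [X1]
    v = mat_vec(rot_y(math.radians(-cam_long)), v)  # [X2] -Y first
    v = mat_vec(rot_z(math.radians(cam_lat)), v)    # [X3] then Z
    lat0, lon0 = to_latlong(v)                      # [X4]
    return (round(lon0 / DEG_PER_PX_H),             # [X5]
            round(lat0 / DEG_PER_PX_V))
```

With the camera at longitude 120 on the equator, the round trip reproduces the original pixel offsets, which matches the uniform-pixel example for that special case.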
The additional steps for your coordinate system are:
- refer to latitude as 90-vertical (0 -> 90, 90 -> 0, 180 -> -90) and
use the inverse relationship when displaying the final value
- compute longitude with an IF statement like this:
IF horizontal > 180 then
longitude = horizontal-360
else
longitude = horizontal
and use the inverse relationship when displaying final values.
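Those coordinate-system remappings can be written directly; a small sketch (function names are mine):

```python
# Mapping between Q3man's coordinate system (0-180 vertical, 0-360
# horizontal) and the "normal" one (+/-90 latitude, +/-180 longitude).

def vertical_to_latitude(vertical):
    return 90.0 - vertical          # 0 -> 90, 90 -> 0, 180 -> -90

def latitude_to_vertical(latitude):
    return 90.0 - latitude          # inverse, for display

def horizontal_to_longitude(horizontal):
    return horizontal - 360.0 if horizontal > 180.0 else horizontal

def longitude_to_horizontal(longitude):
    return longitude + 360.0 if longitude < 0.0 else longitude
```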
Let me know if you still have problems at this point. I would be glad
to explain further.
To answer your last question - when you are completely satisfied,
please add a tip for what the full and complete answer is worth to
you. I am glad to help out.
--Maniac