Nvdewarper parameters

• Hardware Platform - Jetson
• DeepStream Version - 7
• Issue Type - Question

I need to be able to recreate the math of what nvdewarper does given the various parameters. More precisely, I need to be able to do the inverse (project points from the dewarped image back into the warped image). The reason: I dewarp a fisheye lens into three perspective images and run those through inference. I then take the location of each detected object and need to translate it back into fisheye coordinates. My final step is to project that point onto the ground plane to know where an object is sitting. For the first step I'm assuming the lens follows a typical four-coefficient fisheye model (k1–k4) like the one in OpenCV.
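For context, here is a minimal sketch of the inverse mapping I'm after, in Python with OpenCV's fisheye model. The intrinsics, distortion coefficients, and the rotation R below are hypothetical placeholders, not nvdewarper's actual parameters:

```python
import numpy as np
import cv2

# Hypothetical fisheye intrinsics and k1..k4 distortion (OpenCV fisheye model).
K_fish = np.array([[500.0,   0.0, 640.0],
                   [  0.0, 500.0, 640.0],
                   [  0.0,   0.0,   1.0]])
D_fish = np.array([0.01, -0.002, 0.0003, -0.00001])

def perspective_to_fisheye(u, v, width, height, hfov_deg, R):
    """Map a pixel (u, v) of a virtual perspective view to fisheye pixels.

    R rotates rays from the virtual-camera frame into the fisheye-camera
    frame (the inverse of the rotation used to generate the view).
    """
    # Back-project the perspective pixel to a ray in the virtual camera.
    f = (width / 2.0) / np.tan(np.radians(hfov_deg) / 2.0)
    ray = np.array([u - width / 2.0, v - height / 2.0, f])
    ray = R @ ray / np.linalg.norm(ray)       # rotate into the fisheye frame

    # cv2.fisheye.distortPoints expects normalized coordinates (x/z, y/z)
    # and returns pixel coordinates using K and D. Assumes ray[2] > 0.
    pt = np.array([[[ray[0] / ray[2], ray[1] / ray[2]]]], dtype=np.float64)
    return cv2.fisheye.distortPoints(pt, K_fish, D_fish)[0, 0]

# Example: the center pixel of a 90-degree view looking straight ahead
# lands on the fisheye principal point.
print(perspective_to_fisheye(640, 360, 1280, 720, 90.0, np.eye(3)))
```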

To do that first step of projecting a point from the dewarped image back to fisheye coordinates, I'm first trying to replicate the nvdewarper algorithm itself, so I can verify the forward mapping before implementing the inverse. I'm having some trouble with that. Shader code is below. I also have two functions that let me project into and out of the 'virtual' view and the fisheye view.

Two questions:

  1. Does the algorithm below look similar to what nvdewarper does (for fisheye to perspective), or is the code open sourced so that I can check it directly?
  2. Why is the target dewarped FOV expressed as a top-angle and a bottom-angle? I've never seen this done in CV before. When would you pick a top angle that is not equal to the bottom angle? If I set bottom-angle = 40 degrees and top-angle = 20 degrees, is that the same as a 60-degree FOV with the pitch set 10 degrees lower? (See the sketch after this list.) I want to make sure I'm understanding that correctly and that it isn't a source of my virtual-image calculation differing from nvdewarper's.
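To make question 2 concrete, here is a small numeric sketch of the two constructions I'm comparing (my own reasoning about planar projections, not a claim about what nvdewarper does internally):

```python
import numpy as np

top, bottom, pitch = np.radians(20.0), np.radians(40.0), np.radians(10.0)

# (a) Asymmetric frustum (top-angle 20, bottom-angle 40): on a planar image,
# rows sample the image plane linearly from tan(top) above the optical axis
# down to tan(bottom) below it. The middle row therefore looks at:
elev_a = np.degrees(np.arctan((np.tan(top) - np.tan(bottom)) / 2.0))

# (b) Symmetric 60-degree vertical FOV pitched down 10 degrees: the middle
# row lies on the rotated optical axis.
elev_b = np.degrees(-pitch)

# The edge rays agree in both cases (20 degrees up, 40 degrees down), but:
print(elev_a)  # about -13.4 degrees
print(elev_b)  # exactly -10 degrees
```

If that reasoning is right, an asymmetric planar frustum is an off-axis view (a shifted principal point): it shares the pitched symmetric view's edge rays and is related to it by a homography, but the row-to-ray mapping differs. For angle-linear projections such as cylindrical or equirectangular, the two constructions would coincide.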

```glsl
precision highp float;

varying vec2 vUV;

uniform sampler2D textureSampler;
uniform float u_roll;
uniform float u_pitch;
uniform float u_fov;        // horizontal FOV of the virtual view (radians)
uniform float u_aspect;
uniform float u_fisheyeFOV; // full FOV of the fisheye source (radians)
uniform vec4  u_D;          // distortion coefficients k1..k4

const float PI = 3.14159265359;

void main(void) {
    // Back-project the output pixel to a ray in the virtual perspective camera.
    vec2 screen = (vUV - 0.5) * 2.0;
    screen.x *= u_aspect;
    float f = 1.0 / tan(u_fov * 0.5);
    vec3 ray = normalize(vec3(screen, f));

    // Orient the virtual camera: pitch about X, then roll about Z.
    // (Note: GLSL matrix constructors take arguments in column-major order.)
    float cosPitch = cos(u_pitch);
    float sinPitch = sin(u_pitch);
    mat3 pitchMatrix = mat3(
        1.0, 0.0, 0.0,
        0.0, cosPitch, -sinPitch,
        0.0, sinPitch, cosPitch
    );
    float cosRoll = cos(u_roll);
    float sinRoll = sin(u_roll);
    mat3 rollMatrix = mat3(
        cosRoll, -sinRoll, 0.0,
        sinRoll, cosRoll, 0.0,
        0.0, 0.0, 1.0
    );
    vec3 rotatedRay = rollMatrix * pitchMatrix * ray;

    // Equidistant fisheye mapping: angle from the optical axis -> radius.
    float theta = acos(clamp(rotatedRay.z, -1.0, 1.0));
    if (theta > (u_fisheyeFOV / 2.0)) {
        gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
        return;
    }
    float phi = atan(rotatedRay.y, rotatedRay.x);
    float r = (theta / (u_fisheyeFOV / 2.0)) * 0.5;

    // Radial distortion polynomial in even powers of r (k1..k4).
    float r_corr = r * (1.0 + u_D.x * pow(r, 2.0)
                            + u_D.y * pow(r, 4.0)
                            + u_D.z * pow(r, 6.0)
                            + u_D.w * pow(r, 8.0));

    vec2 texCoords = vec2(0.5 + r_corr * cos(phi), 0.5 + r_corr * sin(phi));
    gl_FragColor = texture2D(textureSampler, texCoords);
}
```

  1. For DeepStream, the gst-nvdewarper plugin is open source. You can find the source code in /opt/nvidia/deepstream/deepstream/sources/gst-plugins/gst-nvdewarper. The algorithm part is not open source.
  2. The dewarper algorithm on the Jetson platform is proprietary. top-angle and bottom-angle are only used with the "PushBroom", "Cylindrical" and "Equirectangular" cases.
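For intuition, here is a sketch of why top-angle and bottom-angle are natural parameters for those angle-linear projections; this illustrates the general idea only, not the proprietary nvdewarper implementation:

```python
def row_to_elevation_deg(v, height, top_deg, bottom_deg):
    """Elevation angle sampled by output row v (0 = top row) in an
    angle-linear projection such as equirectangular or cylindrical."""
    t = v / (height - 1.0)                 # 0 at the top row, 1 at the bottom
    return top_deg - t * (top_deg + bottom_deg)

# Rows sweep linearly from +top-angle down to -bottom-angle, so asymmetric
# angles simply shift the sampled band: top=20, bottom=40 covers the same
# band as a symmetric 60-degree span centered 10 degrees below the horizon.
print(row_to_elevation_deg(0, 100, 20, 40))     # 20.0
print(row_to_elevation_deg(99, 100, 20, 40))    # -40.0
print(row_to_elevation_deg(49.5, 100, 20, 40))  # -10.0
```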

Please double-check your answer to number two. I definitely see different output when using top and bottom angles with fisheye-to-perspective mappings.