Nvdewarper dewarp 360 into one image

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson Xavier AGX)
• DeepStream Version 5.0.0
• JetPack Version 4.5
• TensorRT Version 7.1.3

In the deepstream-dewarper-test, we divide the 360 image into 4 regions as shown below:

Is it possible to dewarp it into just one panoramic image?

Ok, we will check internally and update you ASAP.


In deepstream-dewarper-test, in config_dewarper.txt, keep only the [surface0] section and remove the [surface1], [surface2], and [surface3] groups. Set the [surface0] parameters as needed. Also disable aisle-calibration-file=csv_files/nvaisle_2M.csv and spot-calibration-file=csv_files/nvspot_2M.csv in the config file, as these are for the 4-surface setup.
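A minimal single-surface config_dewarper.txt could look like the sketch below. The projection type, angles, and sizes are illustrative placeholders only, not verified values from this thread; tune them for your camera:

[property]
# 4-surface calibration files disabled
#aisle-calibration-file=csv_files/nvaisle_2M.csv
#spot-calibration-file=csv_files/nvspot_2M.csv
output-width=1920
output-height=1080

[surface0]
# illustrative placeholders -- tune for your camera
projection-type=1
width=1920
height=1080
top-angle=30
bottom-angle=-30
pitch=90
yaw=0
roll=0
focal-length=437

You may also need to set the nvdewarper element's num-batch-buffers property to 1 so that only a single surface is batched per frame (assumption based on the plugin's documented properties).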

In addition, please try to use the latest DS5.1 version if possible.

@bcao If you were able to config the dewarp-test for one output, can you share the example config? That would save a lot of work for me. Thank you!

Sorry, I don’t have a camera locally. Could you try the steps in my last comment and share your configs with us if you still have issues?

@bcao Can you provide a more in-depth explanation of the parameters? I was able to dewarp the image with trial and error, but I can’t fix the orientation. Now I would like to rotate it to get a panoramic image.

My surface config:


Which parameters?
And have you checked the Gst-nvdewarper — DeepStream 5.1 Release documentation first?

Yes. I also checked the dewarp-test example pdf, but it had no information about top-angle, bottom-angle, pitch, yaw, roll. After playing around with these parameters, I am still not sure what exactly they indicate and how to set them so that my resulting image is oriented horizontally.
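For anyone reading later, here is the surface group I am experimenting with, annotated with my current working guesses about each parameter (guesses only; I could not confirm them in the docs I found):

[surface0]
projection-type=1    # guess: PushBroom, flattens a band of the fisheye
top-angle=30         # guess: upper edge of the view band, degrees above horizon
bottom-angle=-30     # guess: lower edge of the view band, degrees below horizon
pitch=90             # guess: tilt of the virtual view (90 for a ceiling mount)
yaw=0                # guess: rotation around the vertical axis (pan direction)
roll=0               # guess: rotation of the output image plane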

OK, I see, will check and reply to you ASAP.


I had issues with the examples too. They are a little lacking in detail for understanding the different ways dewarping can be used and applied. Would love to see more!

@bcao any updates?

@bcao bump

Hi @bcao ! I wonder if it is possible to do this using a pad probe. I tried it with no success. My probe function and its resulting image are appended below. I would expect a 1000x2000 → 500x2000 crop, rotated and scaled to a 1200x300 output, but I still get 1000x2000.

static GstPadProbeReturn
dewarper_src_pad_buffer_probe (GstPad * pad, GstPadProbeInfo * info, gpointer u_data)
{
  GstBuffer *buf = (GstBuffer *) info->data;
  GstMapInfo outmap = GST_MAP_INFO_INIT;
  gst_buffer_map (buf, &outmap, GST_MAP_WRITE);
  NvBufSurface *surface = (NvBufSurface *) outmap.data;

  /* source crop: left half of the dewarped surface */
  NvBufSurfTransformRect src_rect, dst_rect;
  src_rect.top    = 0;
  src_rect.left   = 0;
  src_rect.width  = (guint) surface->surfaceList[0].width / 2;
  src_rect.height = (guint) surface->surfaceList[0].height;

  dst_rect.top    = 0;
  dst_rect.left   = 0;
  dst_rect.width  = 1200;
  dst_rect.height = 300;

  /* intermediate surface to hold the cropped/rotated result */
  NvBufSurface *dst_surface = NULL;
  NvBufSurfaceCreateParams create_params = {0};
  create_params.gpuId       = surface->gpuId;
  create_params.width       = 1200;
  create_params.height      = 300;
  create_params.size        = 0;
  create_params.colorFormat = surface->surfaceList[0].colorFormat;
  create_params.layout      = surface->surfaceList[0].layout;
  create_params.memType     = surface->memType;
  if (NvBufSurfaceCreate (&dst_surface, 1, &create_params) != 0) {
    g_print ("NvBufSurfaceCreate failed\n");
    gst_buffer_unmap (buf, &outmap);
    return GST_PAD_PROBE_DROP;
  }

  NvBufSurfTransformParams transform_params = {0};
  transform_params.src_rect = &src_rect;
  transform_params.dst_rect = &dst_rect;
  transform_params.transform_flag =
      NVBUFSURF_TRANSFORM_CROP_SRC | NVBUFSURF_TRANSFORM_CROP_DST |
      NVBUFSURF_TRANSFORM_FLIP;
  transform_params.transform_flip   = NvBufSurfTransform_Rotate270;
  transform_params.transform_filter = NvBufSurfTransformInter_Default;

  NvBufSurfTransformConfigParams transform_config_params;
  transform_config_params.compute_mode = NvBufSurfTransformCompute_Default;
  transform_config_params.gpu_id       = surface->gpuId;
  transform_config_params.cuda_stream  = NULL;
  NvBufSurfTransform_Error err =
      NvBufSurfTransformSetSessionParams (&transform_config_params);

  /* crop the left half and rotate it 270 degrees into dst_surface */
  err = NvBufSurfTransform (surface, dst_surface, &transform_params);
  if (err != NvBufSurfTransformError_Success) {
    g_print ("NvBufSurfTransform failed with error %d while converting buffer\n", err);
    NvBufSurfaceDestroy (dst_surface);
    gst_buffer_unmap (buf, &outmap);
    return GST_PAD_PROBE_DROP;
  }

  /* copy back to the original surface (no crop/rotate this time) */
  transform_params.transform_flag = 0;
  err = NvBufSurfTransform (dst_surface, surface, &transform_params);
  if (err != NvBufSurfTransformError_Success) {
    g_print ("NvBufSurfTransform failed with error %d while converting buffer\n", err);
    NvBufSurfaceDestroy (dst_surface);
    gst_buffer_unmap (buf, &outmap);
    return GST_PAD_PROBE_DROP;
  }
  surface->surfaceList[0].width  = 1200;
  surface->surfaceList[0].height = 300;

  NvBufSurfaceDestroy (dst_surface);
  gst_buffer_unmap (buf, &outmap);
  return GST_PAD_PROBE_OK;
}

Hey customer,
We have a sample, GitHub - NVIDIA-AI-IOT/Deepstream-Dewarper-App (a project demonstrating how to infer and track from 360 videos using the dewarper plugin). Could you check it and see if it helps?

It seems the updated library is only supported on x86 for now. When it supports Jetson devices, it will definitely help me a lot. Thank you for the update!

The Jetson lib has been uploaded now: Deepstream-Dewarper-App/plugin_libraries/jetson_bin at main · NVIDIA-AI-IOT/Deepstream-Dewarper-App · GitHub