DeepStream TX2 and Basler Cameras

We have written software that uses Basler USB3 cameras and feeds their images into our detection/classification networks, as per the detectNet-style samples. Each image is pre-processed so that it works with the network.

We want to use the DeepStream capabilities, but I am struggling to see how to adapt the sample supplied for the TX2 so that it works with images from a Basler camera. There seems to be little or no documentation on this subject.

Thanks

Hi Polyhedrus,
Can your USB camera work with the gst-nvcamerasrc or gst-v4l2src plugin?
If yes, you only need to change the decode plugin to nvcamerasrc.
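
A quick way to check is something like this (the device node and caps here are only an example and will need adapting to your setup):

    gst-launch-1.0 v4l2src device=/dev/video0 ! \
        'video/x-raw,format=YUY2,width=1920,height=1080' ! \
        nvvidconv ! 'video/x-raw(memory:NVMM)' ! nvoverlaysink

If that shows live video, the camera can be driven directly from the config file.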

If you can only get raw YUV, then you need to write an app. The basic flow should be:
Get the raw YUV and feed it into an nvbuf (please refer to tegra_multimedia_api) → replace the decode plugin with appsrc → feed the nvbuf (converted to an NvMMBuffer) to appsrc.
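
In pseudo-C the flow looks roughly like this (the two helper names are placeholders, not real APIs; the real calls are spelled out later in this thread):

    int fd = -1;
    /* allocate the nvbuf (a dmabuf) via nvbuf_utils.h */
    NvBufferCreate(&fd, width, height, NvBufferLayout_Pitch,
                   NvBufferColorFormat_NV12);
    copy_camera_yuv_into_dmabuf(fd, frame);  /* placeholder: your copy code */
    push_dmabuf_to_appsrc(fd);               /* placeholder: wrap and push */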

If you hit a specific issue, please let me know.

Thanks
wayne zhu

Thanks for the response.

I figured it would need an app.

I got the raw YUV into the nvbuffer, but that was as far as I could get with the details you described.

Can you describe in more detail the steps from replacing the plugin onwards?

What you've written kind of makes sense, but it needs more detail, please.

Thanks

Hi polyhedrus,
Have you tried enabling a camera source in the config file? The default source section looks like this; for a camera you would change the type:

[source0]
enable=1
#Type - 1=CameraCSi 2=CameraV4L2 3=URI
type=3
camera-width=1920
camera-height=1080
camera-fps-n=30
camera-fps-d=1
camera-csi-sensor-id=0
camera-v4l2-dev-node=0
uri=file:///home/nvidia/sample_720p.mp4
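
If the camera can be driven by V4L2, the same section would instead look something like this (the device node is an assumption):

[source0]
enable=1
type=2
camera-width=1920
camera-height=1080
camera-fps-n=30
camera-fps-d=1
camera-v4l2-dev-node=0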

If that does not work, then we can go with the solution in comment #2.

Thanks
wayne zhu

Thanks, but the camera doesn't work with the gst-nvcamerasrc or gst-v4l2src plugin, so we need to provide a custom app, which is what I was asking for guidance with above.

We can only get raw YUV, so:

Can you describe in more detail the steps from replacing the plugin onwards?

What you've written in comment #2 kind of makes sense, but it needs more detail, please.

Hi Polyhedrus:

  1. About how to feed YUV into a dmabuf:
    You can refer to tegra_multimedia_api in JetPack, sample B10, and the following code:
    NvBufferMemMap(fd, Y_INDEX, NvBufferMem_Write, &ptr_y);
    NvBufferMemSyncForCpu(fd, Y_INDEX, &ptr_y);
    ptr_cur = (uint8_t *)ptr_y + par.pitch[Y_INDEX]*START_POS + START_POS;

    // overwrite some pixels to put an 'N' on each Y plane
    // scan array_n to decide which pixel should be overwritten
    for (i = 0; i < FONT_SIZE; i++) {
        for (j = 0; j < FONT_SIZE; j++) {
            a = i >> SHIFT_BITS;
            b = j >> SHIFT_BITS;
            if (array_n[a][b])
                (*ptr_cur) = 0xff; // white color
            ptr_cur++;
        }
        ptr_cur = (uint8_t *)ptr_y + par.pitch[Y_INDEX]*(START_POS + i) + START_POS;
    }
    NvBufferMemSyncForDevice(fd, Y_INDEX, &ptr_y);
    NvBufferMemUnMap(fd, Y_INDEX, &ptr_y);

  2. dmabuf <-> NvMMBuffer:
    Please refer to the API in nvbuf_utils.h. Calling the NvBufferGetParams function gives you the structure NvBufferParams, which contains:
    void *nv_buffer;         // pointer to the NvMMBuffer
    uint32_t nv_buffer_size; // size of the NvMMBuffer
    (See the sketch after this list.)

  3. Create a pipeline with appsrc. The pipeline may look like:
    appsrc (instead of nvcamerasrc) → nvvidconv → nvinfer → OSD → display.

  4. Feed the NvMMBuffer to appsrc in its callback. appsrc is open source and there is plenty of material about it on the internet; a rough sketch of steps 2-4 follows.
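
To make steps 2-4 concrete, here is an untested sketch of how the pieces could fit together. The element names, caps, and the 1920x1080 NV12 settings are assumptions to adapt to your own pipeline, and the per-frame camera copy is left as a comment because it is exactly the mapping code from step 1:

    #include <gst/gst.h>
    #include <gst/app/gstappsrc.h>
    #include "nvbuf_utils.h"

    static int dmabuf_fd = -1;

    /* appsrc asks for data: convert the dmabuf to an NvMMBuffer and push it */
    static void need_data(GstAppSrc *src, guint length, gpointer user_data)
    {
        /* step 1 goes here: map the dmabuf planes and copy the camera's
           YUV rows into them, as in the NvBufferMemMap code above */

        /* step 2: NvBufferGetParams gives the NvMMBuffer pointer and size */
        NvBufferParams par;
        NvBufferGetParams(dmabuf_fd, &par);

        /* step 4: wrap the NvMMBuffer in a GstBuffer (no copy) and push it */
        GstBuffer *buf = gst_buffer_new_wrapped_full(GST_MEMORY_FLAG_READONLY,
            par.nv_buffer, par.nv_buffer_size, 0, par.nv_buffer_size,
            NULL, NULL); /* no free function: we keep owning the dmabuf */
        gst_app_src_push_buffer(src, buf); /* takes ownership of buf */
    }

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        /* step 3: appsrc replaces nvcamerasrc at the head of the pipeline;
           put your nvinfer/OSD elements where your pipeline needs them */
        GstElement *pipeline = gst_parse_launch(
            "appsrc name=src ! nvvidconv ! "
            "video/x-raw(memory:NVMM),format=NV12 ! nvoverlaysink", NULL);
        GstAppSrc *src =
            GST_APP_SRC(gst_bin_get_by_name(GST_BIN(pipeline), "src"));

        /* tell appsrc what it produces */
        GstCaps *caps = gst_caps_from_string("video/x-raw(memory:NVMM), "
            "format=NV12, width=1920, height=1080, framerate=30/1");
        gst_app_src_set_caps(src, caps);
        gst_caps_unref(caps);

        /* the dmabuf that every camera frame is copied into (step 1) */
        NvBufferCreate(&dmabuf_fd, 1920, 1080,
                       NvBufferLayout_Pitch, NvBufferColorFormat_NV12);

        GstAppSrcCallbacks cb = { need_data, NULL, NULL };
        gst_app_src_set_callbacks(src, &cb, NULL, NULL);

        gst_element_set_state(pipeline, GST_STATE_PLAYING);
        g_main_loop_run(g_main_loop_new(NULL, FALSE));
        return 0;
    }

Whether downstream elements accept a buffer wrapped this way depends on the release, so if caps negotiation fails, comparing the appsrc caps against what nvvidconv reports via gst-inspect-1.0 is the first thing to check.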

Please use the rel-28.2 or later release.

Thanks
wayne zhu

Can anyone explain these steps in more detail? I am facing the same problem: we have a Basler camera and want to do the same thing. 1. Is this CUDA code or C++ code? 2. What is YUV and what is dmabuf? 3. What is appsrc? Or is there any straightforward solution for this?
Regards