Deepstream 6.2 sample app giving error when trying to run CSI camera <30 FPS (gstnvarguscamerasrc.cpp, threadExecute:694 NvBufSurfaceFromFd Failed)

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Nvidia Jetson Xavier NX
• DeepStream Version: 6.2
• JetPack Version (valid for Jetson only): 5.1
• TensorRT Version: 8.5.2.2
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs): Bug
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing): Using deepstream-app sample application
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Hi,
We are currently developing a multi-source, multi-inference DeepStream pipeline using 2 CSI (IMX219) cameras, running inference for detection and segmentation.

Our top level pipeline looks something like this:

We were previously using JetPack 4.6 with DeepStream 6.0, and our custom DeepStream pipeline was working fine with 2 cameras configured at 10 FPS.

Due to a change in requirements, we are now migrating to JetPack 5.1 with the latest DeepStream 6.2. With this, we are facing an issue where the pipeline only works when we set the FPS to 30. Any FPS below that does not work and throws the error below:

nvbuf_utils: dmabuf_fd -1 mapped entry NOT found
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadExecute:694 NvBufSurfaceFromFd Failed.
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:247 (propagating)
nvstreammux: Successfully handled EOS for source_id=0

To narrow down the issue and bypass our application code, we tried reproducing it with the sample deepstream-app using the source2_csi_usb_dec_infer_resnet_int8.txt configuration file. I updated the configuration file to use 2 CSI cameras at 1280x720 @ 10 FPS with a file sink.

You can find my configuration file here:
source2_csi_dec_infer_resnet_int8.txt (3.8 KB)
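For reference, the CSI source groups I changed look roughly like this (a sketch using the standard deepstream-app source-group keys; the sensor IDs assume a dual-camera setup):

```ini
[source0]
enable=1
# type=5 selects a CSI camera (nvarguscamerasrc)
type=5
camera-width=1280
camera-height=720
camera-fps-n=10
camera-fps-d=1
camera-csi-sensor-id=0

[source1]
enable=1
type=5
camera-width=1280
camera-height=720
camera-fps-n=10
camera-fps-d=1
camera-csi-sensor-id=1
```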

After running deepstream-app we were able to reproduce the exact same issue with the sample app. It shows the error below:

Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadExecute:694 NvBufSurfaceFromFd Failed.
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, threadFunction:247 (propagating)
H264: Profile = 66, Level = 0
NVMEDIA: Need to set EMC bandwidth : 125333
NVMEDIA: Need to set EMC bandwidth : 125333
NVMEDIA_ENC: bBlitMode is set to TRUE
0:00:08.063552005 110689 0xfffefc029300 WARN v4l2bufferpool gstv4l2bufferpool.c:1533:gst_v4l2_buffer_pool_dqbuf:<sink_sub_bin_encoder1:pool:src> Driver should never set v4l2_buffer.field to ANY
0:00:08.066936093 110689 0xfffefc029300 FIXME basesink gstbasesink.c:3246:gst_base_sink_default_event:<sink_sub_bin_sink1> stream-start event without group-id. Consider implementing group-id handling in the upstream elements
0:00:08.067157727 110689 0xfffefc029300 WARN qtmux gstqtmux.c:2981:gst_qt_mux_start_file:<sink_sub_bin_mux1> Robust muxing requires reserved-moov-update-period to be set
nvstreammux: Successfully handled EOS for source_id=0
**PERF: 0.00 (0.00) 9.86 (9.74)
**PERF: 0.00 (0.00) 10.01 (9.91)

Note: both our application with the custom DeepStream pipeline and the sample deepstream-app work when we set the FPS to 30.

I am looking forward to any valuable inputs in this direction.
Thank you in advance!

hello hardik3,

may I know how you enabled the IMX219 on Xavier NX?
please run Jetson-IO to configure CSI, and you may check Approaches for Validating and Testing the V4L2 Driver for several approaches to test the camera stream.

Hi @JerryChang ,
Thanks for your prompt response.

I have tried using the commands provided in the documentation you shared, but none of the commands at the given link work. I am also not able to run a pipeline using v4l2src.
Using v4l2src gives the following error:

ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.

Also, as I have mentioned, the camera streams work when using the sample app at 30 FPS. One thing to note: when I run a separate pipeline for each of the 2 cameras at 10 FPS it works, but when I try to open both camera streams simultaneously it gives this error.
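To take DeepStream out of the picture, a minimal two-camera capture test with nvarguscamerasrc alone can be run on the Jetson (a sketch; the sensor-id values assume the dual-IMX219 Jetson-IO configuration, and fakesink avoids any display dependency):

```shell
# Two independent 10 FPS capture branches in a single gst-launch process
gst-launch-1.0 \
  nvarguscamerasrc sensor-id=0 ! \
    'video/x-raw(memory:NVMM),width=1280,height=720,framerate=10/1' ! fakesink \
  nvarguscamerasrc sensor-id=1 ! \
    'video/x-raw(memory:NVMM),width=1280,height=720,framerate=10/1' ! fakesink
```

If this reproduces the NvBufSurfaceFromFd failure, the problem is in the camera/Argus stack rather than the DeepStream pipeline itself.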

hello hardik3,

please see also Camera Architecture Stack; you cannot use v4l2src to fetch the camera stream since the IMX219 is a Bayer camera sensor.

please check the sys nodes: have the /dev/video* nodes been created successfully?
please also check the kernel init messages ($ dmesg | grep 219): are there any failures related to sensor registration?
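The checks above can be run as follows (standard V4L2 utilities; v4l2-ctl is provided by the v4l-utils package):

```shell
# Verify the sensor driver created the video device nodes
ls /dev/video*

# List the formats and resolutions the first sensor exposes
v4l2-ctl -d /dev/video0 --list-formats-ext

# Look for IMX219 registration failures in the kernel log
dmesg | grep -i 219
```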

furthermore,
you may have a try to execute Jetson-IO to re-configure the CSI channel to enable IMX219 camera support.

$ sudo /opt/nvidia/jetson-io/jetson-io.py
-> Configure Jetson 24pin CSI Connector 
-> Configure for compatible hardware 
-> Camera IMX219 Dual 
-> Save pin changes 
-> Save and reboot to reconfigure pins

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.