Use of a Basler camera as deepstream-app source

Please provide complete information as applicable to your setup.

• Hardware Platform (dGPU): RTX 3090 Ti
• DeepStream Version: 6.2
• TensorRT Version: 8.5.2.2
• NVIDIA GPU Driver Version: 530.30.02-1
• Issue Type: question
• Requirement details (new requirement; plugin): Basler Pylon (pylonsrc)

Hi, I’m looking to use a Basler USB camera as a source for DeepStream-app.
I’ve opened a topic, Basler camera as deepstream-app source - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums.

In that topic I learned where to modify the source code of deepstream-app.
I have made the modifications to the deepstream-app source code, but I'm having trouble recompiling the DeepStream SDK and getting the changes to work.

Hi @ivan48,

Are you aware of the application example we have with Basler cameras?
https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_Can_Orientation.html

I didn’t know. Does it work for USB cameras?

I was not aware of that one. We use a USB Basler camera and a faster_rcnn TLT model.
Does this application work with those?

Could you elaborate on this?

I made these modifications with support from this forum: Using Basler Camera as Deepstream source - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums

  1. In /opt/nvidia/deepstream/deepstream-6.2/sources/apps/apps-common/includes/deepstream_sources.h, I added a new value, NV_DS_SOURCE_CAMERA_BASLER, to the NvDsSourceType enum.
  2. In the function create_camera_source_bin in /opt/nvidia/deepstream/deepstream-6.2/sources/apps/apps-common/src/deepstream_source_bin.c, I added code to create the pylonsrc element: bin->src_elem = gst_element_factory_make ("pylonsrc", "src_elem"); (see the combined sketch below).
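Put together, the two edits look roughly like this (a sketch of my changes; the existing enum values and the rest of the switch in create_camera_source_bin are unchanged, and the value 9 is only an assumption that has to match the type= setting in the config file):

/* deepstream_sources.h: add a new source type to the NvDsSourceType enum */
NV_DS_SOURCE_CAMERA_BASLER = 9,   /* selected by type=9 in [source0] */

/* deepstream_source_bin.c, in create_camera_source_bin(): create pylonsrc
 * for the new type, alongside the existing v4l2src/nvarguscamerasrc cases */
case NV_DS_SOURCE_CAMERA_BASLER:
  bin->src_elem = gst_element_factory_make ("pylonsrc", "src_elem");
  break;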

After this, I recompiled deepstream-app and changed the configuration in black_fp32.txt to run DeepStream with the Basler camera.

When I ran the command, I received this message:
aliger-agx@agx:$ deepstream-app -c blackfp32.txt

(deepstream-app:164375): GStreamer-CRITICAL **: 09:10:56.669: passed '0' as denominator for `GstFraction'
** ERROR: <create_camera_source_bin:204>: Failed to link 'src_elem' (video/x-raw, format=(string){ GRAY8, RGB, BGR, YUY2, UYVY }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-bayer, format=(string){ rggb, bggr, gbgr, grgb }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]) and 'src_cap_filter' (video/x-raw, format=(string)NV12, width=(int)0, height=(int)0)
** ERROR: <create_camera_source_bin:239>: create_camera_source_bin failed
** ERROR: <create_pipeline:1485>: create_pipeline failed
** ERROR: main:697: Failed to create pipeline
Quitting
App run failed
terminate called without an active exception
Aborted (core dumped)

Please refer to /opt/nvidia/deepstream/deepstream-6.2/samples/configs/deepstream-app/source1_usb_dec_infer_resnet_int8.txt, and set camera-width and camera-height to values the camera supports; width and height should not be 0.
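For example (these values are only placeholders; use a resolution and frame rate your camera actually supports):

[source0]
camera-width=1920
camera-height=1200
camera-fps-n=60
camera-fps-d=1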

I made this modification. I set camera-width and camera-height to 1920x1200 (which the camera supports), and the error "'src_cap_filter' (video/x-raw, format=(string)NV12, width=(int)0, height=(int)0)" remains.

(deepstream-app:164375): GStreamer-CRITICAL **: 09:10:56.669: passed '0' as denominator for `GstFraction'
** ERROR: <create_camera_source_bin:204>: Failed to link 'src_elem' (video/x-raw, format=(string){ GRAY8, RGB, BGR, YUY2, UYVY }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-bayer, format=(string){ rggb, bggr, gbgr, grgb }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]) and 'src_cap_filter' (video/x-raw, format=(string)NV12, width=(int)0, height=(int)0)
** ERROR: <create_camera_source_bin:239>: create_camera_source_bin failed
** ERROR: <create_pipeline:1485>: create_pipeline failed
** ERROR: main:697: Failed to create pipeline
Quitting
App run failed
terminate called without an active exception
Aborted (core dumped)

The error comes from NVGSTDS_LINK_ELEMENT (bin->src_elem, bin->cap_filter1); in create_camera_source_bin. You can add logging to check why width and height are 0.

Hi, how can I add those logs?

You can add printf logging in create_camera_source_bin, then rebuild deepstream-app.
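For example, something like this near the top of create_camera_source_bin (field names as in NvDsSourceConfig in deepstream_sources.h; the casts are just so the format specifiers work regardless of the exact field types):

/* temporary debug: print the resolution/fps parsed from the config file */
g_print ("create_camera_source_bin: width=%d height=%d fps=%d/%d\n",
    (int) config->source_width, (int) config->source_height,
    (int) config->source_fps_n, (int) config->source_fps_d);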

Thank you! Now I managed to get the height and width values through, as you can see below, but the same error remains.

aliger-agx@agx:~/infer$ deepstream-app -c source1_usb_dec_infer_resnet_int8.txt

** ERROR: <create_camera_source_bin:204>: Failed to link 'src_elem' (video/x-raw, format=(string){ GRAY8, RGB, BGR, YUY2, UYVY }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-bayer, format=(string){ rggb, bggr, gbgr, grgb }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]) and 'src_cap_filter' (video/x-raw, format=(string)NV12, width=(int)1920, height=(int)1200, framerate=(fraction)60/1)
** ERROR: <create_camera_source_bin:239>: create_camera_source_bin failed
** ERROR: <create_pipeline:1485>: create_pipeline failed
** ERROR: main:697: Failed to create pipeline
Quitting
App run failed
terminate called without an active exception
Aborted (core dumped)

Below is my infer.txt file:
[application]
enable-perf-measurement=1
perf-measurement-interval-sec=1
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=9 #NV_DS_SOURCE_CAMERA_BASLER
camera-width=1920
camera-height=1200
camera-fps-n=60
camera-fps-d=1
camera-v4l2-dev-node=0
gpu-id=0
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink/nv3dsink(Jetson only) 3=File 4=RTSPStreaming 5=nvdrmvideosink
type=1
sync=0
plane-id=0
width=1920
height=1200
conn-id=1
source-id=1

[osd]
enable=1
gpu-id=0
border-width=3
text-size=6
text-color=0.5;0;0;1;
text-bg-color=0;0;0;0
font=Serif
nvbuf-memory-type=0

[streammux]
##Boolean property to inform muxer that sources are live
live-source=1
gpu-id=0
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000

## Set muxer output width and height
width=1920
height=1200
## If set to TRUE, system timestamp will be attached as ntp timestamp
## If set to FALSE, ntp timestamp from rtspsrc, if available, will be attached
attach-sys-ts-as-ntp=1

enable-padding=2
nvbuf-memory-type=0

[primary-gie]
enable=1
gpu-id=0
gie-unique-id=1
#model-engine-file=../../models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
config-file=//home/newton/Nestle-BD/blackinfer.txt
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-bg-color0=0;0;0;0
bbox-border-color1=1;0;0;1
bbox-bg-color1=0;0;0;0
interval=0
#Required by the app for SGIE, when used along with config-file property
labelfile-path=/home/newton/Nestle-BD/labels.txt
nvbuf-memory-type=0

[tests]
file-loop=0

From the log, src_cap_filter uses the default format NV12, but the camera does not support NV12. You can try one of the formats the camera offers by setting video-format in the configuration file, for example: video-format=YUY2. Please see the explanation of video-format in the linked documentation.
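Conceptually, the caps that src_cap_filter applies are built from those config values, roughly like this (a simplified sketch, not the exact deepstream-app code; the format string must be one the camera actually offers):

GstCaps *caps = gst_caps_new_simple ("video/x-raw",
    "format", G_TYPE_STRING, "YUY2",          /* from video-format=YUY2 */
    "width", G_TYPE_INT, 1920,                /* from camera-width */
    "height", G_TYPE_INT, 1200,               /* from camera-height */
    "framerate", GST_TYPE_FRACTION, 60, 1,    /* from camera-fps-n/d */
    NULL);
g_object_set (G_OBJECT (bin->cap_filter1), "caps", caps, NULL);
gst_caps_unref (caps);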

I modified my config file and changed the format from NV12 to BGR. What is this "Lazy loading" error?

(deepstream-app:48098): GLib-CRITICAL **: 10:22:57.858: g_strrstr: assertion 'haystack != NULL' failed

(deepstream-app:48098): GLib-CRITICAL **: 10:22:57.858: g_strrstr: assertion 'haystack != NULL' failed

(deepstream-app:48098): GLib-GObject-WARNING **: 10:22:57.858: value "TRUE" of type 'gboolean' is invalid or out of range for property 'enable-padding' of type 'gboolean'
WARNING: [TRT]: CUDA lazy loading is not enabled. Enabling it can significantly reduce device memory usage and speed up TensorRT initialization. See "Lazy Loading" section of CUDA documentation CUDA C++ Programming Guide
ERROR: [TRT]: Validation failed: libNamespace == nullptr
plugin/proposalPlugin/proposalPlugin.cpp:515

ERROR: [TRT]: std::exception
ERROR: [TRT]: Validation failed: libNamespace == nullptr
plugin/proposalPlugin/proposalPlugin.cpp:515

ERROR: [TRT]: std::exception
WARNING: [TRT]: CUDA lazy loading is not enabled. Enabling it can significantly reduce device memory usage and speed up TensorRT initialization. See "Lazy Loading" section of CUDA documentation CUDA C++ Programming Guide
0:00:03.948869700 48098 0x55e547d46660 INFO nvinfer gstnvinfer.cpp:680:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1909> [UID = 1]: deserialized trt engine from :/home/newton/Nestle-BD/frcnn_8748_qat_int8.etlt_b1_gpu0_int8.engine
INFO: …/nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_image 3x1200x1920
1 OUTPUT kFLOAT NMS 1x100x7
2 OUTPUT kFLOAT NMS_1 1x1x1

0:00:04.027992692 48098 0x55e547d46660 INFO nvinfer gstnvinfer.cpp:680:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2012> [UID = 1]: Use deserialized engine model: /home/newton/Nestle-BD/frcnn_8748_qat_int8.etlt_b1_gpu0_int8.engine
0:00:04.034437087 48098 0x55e547d46660 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model://home/newton/Nestle-BD/blackinfer.txt sucessfully
** ERROR: main:716: Failed to set pipeline to PAUSED
Quitting
ERROR from src_elem: No URI specified to play from.
Debug info: gsturidecodebin.c(1377): gen_source_element (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstURIDecodeBin:src_elem
App run failed

Please ignore that warning. From the log, why is GstURIDecodeBin being used? pylonsrc should be used to capture raw data from the Basler camera.
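(As an aside, and assuming CUDA 11.7 or newer, the TensorRT lazy-loading warning can be silenced by exporting CUDA_MODULE_LOADING=LAZY in the environment before running deepstream-app; it is unrelated to the pipeline failure here.)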

Sorry, I made a mistake in the code. In create_multi_source_bin I had added a case for the Basler type to switch (configs[i].type) but didn't put any code in it. Now I added this:

case NV_DS_SOURCE_CAMERA_BASLER:
  if (!create_camera_source_bin (&configs[i], &bin->sub_bins[i])) {
    return FALSE;
  }
  break;

The output was:
deepstream-app -c source1_usb_dec_infer_resnet_int8.txt
** ERROR: <link_element_to_streammux_sink_pad:99>: Failed to link 'src_bin_muxer' and 'src_sub_bin0'
** ERROR: <create_multi_source_bin:1594>: source 0 cannot be linked to mux's sink pad 0x558767c72150

** ERROR: <create_multi_source_bin:1618>: create_multi_source_bin failed
** ERROR: <create_pipeline:1485>: create_pipeline failed
** ERROR: main:697: Failed to create pipeline
Quitting
nvstreammux: Successfully handled EOS for source_id=0
App run failed
terminate called without an active exception
Aborted (core dumped)

source1_usb_dec_infer_resnet_int8.txt
[application]
enable-perf-measurement=1
perf-measurement-interval-sec=1
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=9
camera-width=1920
camera-height=1200
camera-fps-n=160
camera-fps-d=1
#camera-basler-sensor-id=40168364
video-format=YUY2
gpu-id=0
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink/nv3dsink(Jetson only) 3=File 4=RTSPStreaming 5=nvdrmvideosink
type=1
sync=0
plane-id=0
width=1920
height=1200
conn-id=1
source-id=0

[osd]
enable=1
gpu-id=0
border-width=3
text-size=6
text-color=0.5;0;0;1;
text-bg-color=0;0;0;0
font=Serif
nvbuf-memory-type=0

[streammux]
##Boolean property to inform muxer that sources are live
live-source=1
gpu-id=0
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000

## Set muxer output width and height
width=1920
height=1200
## If set to TRUE, system timestamp will be attached as ntp timestamp
## If set to FALSE, ntp timestamp from rtspsrc, if available, will be attached
attach-sys-ts-as-ntp=1

enable-padding=1
nvbuf-memory-type=0

[primary-gie]
enable=1
gpu-id=0
gie-unique-id=1
#model-engine-file=../../models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
config-file=//home/newton/Nestle-BD/blackinfer.txt
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-bg-color0=0;0;0;0
bbox-border-color1=1;0;0;1
bbox-bg-color1=0;0;0;0
interval=0
#Required by the app for SGIE, when used along with config-file property
labelfile-path=/home/newton/Nestle-BD/labels.txt
nvbuf-memory-type=0

[tests]
file-loop=0

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

nvstreammux only supports NV12/RGBA input; please refer to the nvstreammux documentation. You can use nvvideoconvert to convert the data to RGBA.
We suggest using gst-launch to debug the pipeline first. Please refer to GitHub - basler/gst-plugin-pylon: The official GStreamer plug-in for Basler cameras
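For example, a minimal standalone pipeline along these lines can confirm that capture and conversion work before involving deepstream-app (the caps are only examples; pick a format, resolution and frame rate your camera reports):

gst-launch-1.0 -v pylonsrc ! "video/x-raw,format=YUY2,width=1920,height=1200,framerate=60/1" ! videoconvert ! nvvideoconvert ! "video/x-raw(memory:NVMM),format=RGBA" ! fakesink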

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.