Can't start TrafficCamNet using deepstream:6.2-samples container

• NVIDIA Jetson Xavier NX Developer Kit
• DeepStream 6.2
• L4T 35.3.1

Hello.

I’m trying to use the latest deepstream-l4t:6.2-samples image to run the TrafficCamNet model.

To do this, I reproduced the steps from the Helm chart.
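
For context, I start the container roughly like this (a minimal sketch; the exact image tag, mounts, and flags used by the Helm chart may differ):

# docker run -it --rm --net=host --runtime nvidia nvcr.io/nvidia/deepstream-l4t:6.2-samples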

Please see the attached output and config file.

deepstream-app output
# deepstream-app -c samples/configs/deepstream-app/run.txt 

(gst-plugin-scanner:2270): GStreamer-WARNING **: 14:18:42.080: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_udp.so': librivermax.so.0: cannot open shared object file: No such file or directory

(gst-plugin-scanner:2270): GStreamer-WARNING **: 14:18:42.147: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_inferserver.so': libtritonserver.so: cannot open shared object file: No such file or directory
** ERROR: <create_udpsink_bin:644>: Failed to create 'sink_sub_bin_encoder1'
** ERROR: <create_udpsink_bin:719>: create_udpsink_bin failed
** ERROR: <create_sink_bin:828>: create_sink_bin failed
** ERROR: <create_processing_instance:884>: create_processing_instance failed
** ERROR: <create_pipeline:1485>: create_pipeline failed
** ERROR: <main:697>: Failed to create pipeline
Quitting
nvstreammux: Successfully handled EOS for source_id=0
App run failed
config file
################################################################################
# Copyright (c) 2019-2022, NVIDIA CORPORATION. All rights reserved.
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.  IN NO EVENT SHALL
# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
################################################################################

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=1
rows=1
columns=1
width=1280
height=720
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=3
uri=file://../../streams/sample_1080p_h264.mp4
num-sources=1
#drop-frame-interval=2
gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink/nv3dsink (Jetson only) 3=File
type=2
sync=1
source-id=0
gpu-id=0
nvbuf-memory-type=0

[sink1]
enable=0
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265
codec=1
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
#iframeinterval=10
bitrate=2000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=0
output-file=out.mp4
source-id=0

[sink2]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=4000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=0
# set below properties in case of RTSPStreaming
rtsp-port=8554
udp-port=5400

[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=0
buffer-pool-size=4
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1920
height=1080
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0
## If set to TRUE, system timestamp will be attached as ntp timestamp
## If set to FALSE, ntp timestamp from rtspsrc, if available, will be attached
# attach-sys-ts-as-ntp=1

# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.
[primary-gie]
enable=1
gpu-id=0
model-engine-file=../../models/Primary_Detector/resnet10.caffemodel_b4_gpu0_int8.engine
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary.txt

[tracker]
enable=1
# For NvDCF and NvDeepSORT tracker, tracker-width and tracker-height must be a multiple of 32, respectively
tracker-width=640
tracker-height=384
ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_nvmultiobjecttracker.so
# ll-config-file required to set different tracker types
# ll-config-file=config_tracker_IOU.yml
# ll-config-file=config_tracker_NvSORT.yml
ll-config-file=config_tracker_NvDCF_perf.yml
# ll-config-file=config_tracker_NvDCF_accuracy.yml
# ll-config-file=config_tracker_NvDeepSORT.yml
gpu-id=0
enable-batch-process=1
enable-past-frame=1
display-tracking-id=1

[secondary-gie0]
enable=1
model-engine-file=../../models/Secondary_VehicleTypes/resnet18.caffemodel_b16_gpu0_int8.engine
gpu-id=0
batch-size=1
gie-unique-id=4
operate-on-gie-id=1
operate-on-class-ids=0;
config-file=config_infer_secondary_vehicletypes.txt

[secondary-gie1]
enable=1
model-engine-file=../../models/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine
batch-size=1
gpu-id=0
gie-unique-id=5
operate-on-gie-id=1
operate-on-class-ids=0;
config-file=config_infer_secondary_carcolor.txt

[secondary-gie2]
enable=1
model-engine-file=../../models/Secondary_CarMake/resnet18.caffemodel_b16_gpu0_int8.engine
batch-size=1
gpu-id=0
gie-unique-id=6
operate-on-gie-id=1
operate-on-class-ids=0;
config-file=config_infer_secondary_carmake.txt

[tests]
file-loop=1


[sink0]
enable=1
type=1
sync=1
codec=1
bitrate=4000000
rtsp-port=8554
udp-port=5400
source-id=0
gpu-id=0
nvbuf-memory-type=0


[sink2]
enable=1
type=4
container=1
codec=1
sync=1
bitrate=2000000
rtsp-port=8554
udp-port=5400
profile=0
output-file=out.mp4
source-id=0

From the log, the app failed to create the encoder. Can you share the results of “gst-inspect-1.0 |grep nvv4l2h264enc” and “gst-inspect-1.0 |grep nvinfer”?

# gst-inspect-1.0 |grep nvv4l2h264enc
# gst-inspect-1.0 |grep nvinfer
nvdsgst_deepstream_bins:  nvinferbin: NvInfer Bin
nvdsgst_deepstream_bins:  nvinferserverbin: NvInferServer Bin
nvdsgst_inferaudio:  nvinferaudio: NvInfer Audio plugin
nvdsgst_infer:  nvinfer: NvInfer plugin

Output for nvv4l2h264enc is empty
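
A quick way to double-check whether the hardware encoder is visible inside the container (the device node names below are my assumption and vary between Jetson generations and L4T releases):

# gst-inspect-1.0 nvv4l2h264enc
# ls -l /dev/nvhost-msenc /dev/nvhost-nvenc* 2>/dev/null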

I have discovered that the problem is caused by enabling the EglSink.
If I turn it off and enable only the FakeSink, everything works.

I don’t need EglSink because my Jetson is headless.
But I need RTSPStreaming, and when I enable it, I encounter the same error as when EglSink is enabled:

** ERROR: <create_udpsink_bin:644>: Failed to create 'sink_sub_bin_encoder1'
** ERROR: <create_udpsink_bin:719>: create_udpsink_bin failed
** ERROR: <create_sink_bin:828>: create_sink_bin failed
** ERROR: <create_processing_instance:884>: create_processing_instance failed
** ERROR: <create_pipeline:1485>: create_pipeline failed
** ERROR: <main:697>: Failed to create pipeline
Quitting
nvstreammux: Successfully handled EOS for source_id=0
App run failed

Okay, it starts if I change enc-type for the RTSPStreaming sink from 0 (Hardware) to 1 (Software).
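
For anyone hitting the same error, a [sink2] section along these lines works for me (a rough sketch based on my config above; the bitrate and ports are just my values):

[sink2]
enable=1
# 4=RTSPStreaming
type=4
# 1=h264
codec=1
# 0=Hardware 1=Software; only the software encoder works for me in this container
enc-type=1
sync=0
bitrate=4000000
rtsp-port=8554
udp-port=5400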

What’s wrong? Is there no HW encoding available in the Docker image?

Instructions on how to start TrafficCamNet correctly: Run TrafficCamNet on latest JetPack - #4 by eugenyshcheglov

I started a separate thread about HW-accelerated encoding for RTSPStreaming: Hardware encoding for RTSP streaming

This thread can be closed.

P.S. Sorry for being blunt, but life would be much easier for many, many developers if you wrote and updated the documentation. And the code for the Helm chart… oh my goodness, I wish I hadn’t seen it.
