Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU)
Xavier NX
• DeepStream Version
5.1
• JetPack Version (valid for Jetson only)
4.5
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
question
• Requirement details (This is for new requirement. Including the module name - for which plugin or for which sample application, the function description)
I am using a FLIR thermal camera that is connected through an encoder at a static IP. We can see the stream directly from that IP while the camera is running, but DeepStream does not run stably with it. If I start DeepStream several times, the video eventually goes through, but the output is very low quality and laggy compared to the original video. The relevant sink section of the configuration file is as follows:
#-------------------------------
[sink2]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
sync=0
bitrate=4000000
#set below properties in case of RTSPStreaming
rtsp-port=8554
udp-port=5400
#---------------------
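With type=4 the sink re-encodes the decoded frames and serves them over RTSP, so the quality and lag seen on the client also depend on the bitrate set above and on how the client plays the stream. As a rough check (the /ds-test mount point is the usual deepstream-app default and the port comes from rtsp-port above; adjust both if your application prints a different path at startup), the sink output can be played from another machine with:
gst-launch-1.0 playbin uri=rtsp://<jetson-ip>:8554/ds-test
Here is the console output when launching DeepStream: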
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.1.168:554/hdmi
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 0
.
.
.
.
(gst-launch-1.0:26558): GLib-ERROR **: 21:29:47.697: Creating pipes for GWakeup: Too many open files
Trace/breakpoint trap (core dumped)
What does this mean? I have another application that is also using GStreamer. Could that application be causing a conflict?
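The "Too many open files" message means the process hit its per-process file-descriptor limit, which can happen when repeated pipeline restarts or other GStreamer applications leave sockets and pipes open. A quick way to check, assuming the PID from the log above (26558) is still running:
# current soft limit on open files in this shell
ulimit -n
# descriptors currently held by the gst-launch process from the log
ls /proc/26558/fd | wc -l
# temporarily raise the soft limit before launching again
ulimit -n 4096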
The DeepStream SDK is based on GStreamer, so you first need a valid URI that can be played successfully with a plain GStreamer command. Currently it looks like the rtspsrc plugin cannot read the source correctly. Please check with the camera vendor to get a valid URI first.
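As a starting point, a minimal playback test the URI should pass (a sketch assuming the encoder outputs H.264; swap in rtph265depay and h265parse if it is H.265):
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.168:554/hdmi latency=200 ! rtph264depay ! h264parse ! nvv4l2decoder ! nvoverlaysink sync=false
If this plays, the same URI should also work as a DeepStream source.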
Hi,
Thanks for your reply.
As suggested by the company, we can connect the camera through an encoder, so we receive the data from the encoder. There are two transport options, TCP and UDP.
We changed the setting to UDP, and from the first command we receive the following:
gst-launch-1.0 uridecodebin uri=rtsp://192.168.1.168:554/hdmi ! nvoverlaysink sync=0
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.1.168:554/hdmi
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
Opening in BLOCKING MODE
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
And from the second command we receive the following:
gst-launch-1.0 rtspsrc location='rtsp://192.168.1.168:554/hdmi' ! fakesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.1.168:554/hdmi
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
Both commands stop after "Progress: (request) Sent PLAY request".
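Stalling right after "Sent PLAY request" usually means the PLAY request was accepted but no RTP data is arriving, which with UDP transport often points to packets being dropped or blocked between the encoder and the Jetson. One way to narrow this down (a sketch, not a fix) is to force TCP-interleaved transport and enable rtspsrc debug output to see whether any data comes in:
GST_DEBUG=rtspsrc:5 gst-launch-1.0 -v rtspsrc location=rtsp://192.168.1.168:554/hdmi protocols=tcp ! fakesink
If data flows over TCP but not over UDP, the encoder's transport setting or the network path between the encoder and the Jetson is the place to look.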