**• Hardware Platform: Jetson AGX Xavier**
**• DeepStream Version: 5.0.1 / GStreamer 1.14.5**
**• JetPack Version: 4.3**
**• TensorRT Version: 7.1.3**
**• NVIDIA GPU Driver Version: 10.2**
Hello,

I am testing DeepStream (Python) + TLT, and I have a problem with DeepStream: the IP camera RTSP input used in a uridecodebin pipeline is not working properly.

We are currently using an IP camera streaming H.264 over RTSP (TCP).

The interesting thing is that while video capture is running in the background using OpenCV, uridecodebin works perfectly in DeepStream (and GStreamer). However, when DeepStream (or GStreamer) is used solely, without any external connection, the RTSP input is not received properly and the app freezes.

This also happens with the DeepStream Python example code and with deepstream-app (sample).

I have no idea why this is happening. Could you please help me?
[log]
** freezing **
```
xavier@xavier-desktop:~/Desktop$ gst-launch-1.0 uridecodebin uri=rtspt:///main ! nvoverlaysink
0:00:00.058084388 17081 0x5573578630 WARN omx gstomx.c:2826:plugin_init: Failed to load configuration file: Valid key file could not be found in search dirs (searched in: /home/xavier/.config:/etc/xdg/xdg-unity:/etc/xdg as per GST_OMX_CONFIG_DIR environment variable, the xdg user config directory (or XDG_CONFIG_HOME) and the system config directory (or XDG_CONFIG_DIRS)
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtspt:///main
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
Opening in BLOCKING MODE
0:00:00.314199828 17081 0x7fac080c00 WARN v4l2 gstv4l2object.c:4435:gst_v4l2_object_probe_caps:nvv4l2decoder0:src Failed to probe pixel aspect ratio with VIDIOC_CROPCAP: Unknown error -1
0:00:00.314292312 17081 0x7fac080c00 WARN v4l2 gstv4l2object.c:2372:gst_v4l2_object_add_interlace_mode:0x7f8405eb00 Failed to determine interlace mode
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
```
** freezing & ctrl + c **
```
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:23.461234930
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
^C
xavier@xavier-desktop:~/Desktop$ ^C
xavier@xavier-desktop:~/Desktop$
```
** when video capture is being done using opencv **
```
xavier@xavier-desktop:~$ gst-launch-1.0 uridecodebin uri=rtspt:///main ! nvoverlaysink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtspt:///main
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
reference in DPB was never decoded
```
** doing well & ctrl + c **
```
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:03.073757789
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
```
Can you explain what "video capture is being done using opencv" means? And what does "deepstream (or gstreamer) is solely used without any external connection" mean? The commands that follow are both "gst-launch" commands.
"Video capture is being done using opencv" means capturing the camera input in the background using the code snippet below.

```python
import cv2 as cv

cap2 = cv.VideoCapture('rtsp:///main')  # it can be rtsp or http
while True:
    ret, frame = cap2.read()
```
I ran "gst-launch" the same way under both conditions (with and without OpenCV running):

```
gst-launch-1.0 uridecodebin uri=rtspt:///ip:port/main ! nvoverlaysink
```

We don't know the mechanism with OpenCV. But when I run the "gst-launch-1.0 uridecodebin uri=rtspt://xxxxx ! nvoverlaysink" command on my Xavier board, it works well.

As far as I know, OpenCV's VideoCapture only supports an appsink pipeline, so its pipeline is different from "gst-launch-1.0 uridecodebin uri=rtspt://xxxxx ! nvoverlaysink".
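For reference, when OpenCV is built with the GStreamer backend, `cv.VideoCapture` can also be handed an explicit appsink pipeline string instead of a bare URL, which makes the difference visible. The sketch below is hypothetical: the element chain (rtspsrc over TCP into the Jetson hardware decoder and an appsink) and the camera URL placeholder are assumptions for illustration, not the pipeline OpenCV builds internally.

```python
# Hypothetical sketch: build an explicit GStreamer appsink pipeline string
# for OpenCV's VideoCapture (GStreamer backend) on a Jetson board.
# The element chain (rtspsrc -> depay -> parse -> nvv4l2decoder -> convert
# -> appsink) is an assumption for illustration.
def gst_appsink_pipeline(uri, latency_ms=200):
    return (
        f"rtspsrc location={uri} protocols=tcp latency={latency_ms} ! "
        "rtph264depay ! h264parse ! nvv4l2decoder ! "
        "nvvidconv ! video/x-raw,format=BGRx ! "
        "videoconvert ! video/x-raw,format=BGR ! "
        "appsink drop=true max-buffers=1"
    )

# Usage (requires OpenCV compiled with GStreamer support):
# import cv2 as cv
# cap = cv.VideoCapture(gst_appsink_pipeline("rtsp://<ip>:<port>/main"),
#                       cv.CAP_GSTREAMER)
```

Passing the pipeline string explicitly, rather than a plain URL, makes the transport (`protocols=tcp`) and buffering (`latency`) choices visible instead of leaving them to uridecodebin's defaults.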