Source disconnection crashes entire pipeline

• Hardware Platform (Jetson / GPU): AGX
• DeepStream Version: 6.2
• JetPack Version (valid for Jetson only): 5.1

Hi,

I’m using the Python SDK to create a pipeline that takes ~10-30 RTSP input streams (over TCP, i.e. rtspt) from IP cameras, at a resolution of 704x576 at 6 FPS. If I run this pipeline with a small number of cameras (1-5) it works as expected. However, if I increase the number of cameras to above 10, I get the following error:

2023-08-14 16:25:06 jetson-2307101433 bus_call[1] ERROR Error: gst-resource-error-quark: Could not read from resource. (9): gstrtspsrc.c(5560): gst_rtspsrc_loop_interleaved (): /GstPipeline:pipeline0/GstBin:source-bin-ip_camera_17/GstURIDecodeBin:uri-decode-bin/GstRTSPSrc:source:
Could not receive message. (Parse error)

Sometimes the error is immediate (no data seems to have been processed at all) and sometimes the pipeline runs for a few minutes and then crashes with the above error. Something else I noticed is that the frame rate degrades as the number of cameras increases; I don’t know whether that is connected or the result of a separate issue.
Additionally, in this case camera 17 failed, but it is a different camera each time, which suggests to me that it’s not a problem with any single source.

I have 2 questions:

  1. Do you know what could be causing this issue?
  2. Is there a way to stop the pipeline from failing as soon as one source becomes disconnected, and to try to reconnect to that source instead? (A rough sketch of what I have in mind is below.)
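
For reference, this is roughly the kind of per-source recovery I’m imagining with the standard Python bindings (only a sketch; find_source_bin, restart_source_bin and the fixed 5-second retry are my own placeholders, not something taken from the DeepStream samples):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

def find_source_bin(element):
    # Walk up from the erroring element (e.g. the rtspsrc inside the
    # uridecodebin) to the enclosing source bin.
    obj = element
    while obj is not None:
        name = obj.get_name() or ""
        if name.startswith("source-bin"):
            return name
        obj = obj.get_parent()
    return None

def bus_call(bus, message, pipeline):
    if message.type == Gst.MessageType.ERROR:
        err, debug = message.parse_error()
        bin_name = find_source_bin(message.src) if message.src else None
        if bin_name is not None:
            # Reset only the failing source bin instead of stopping the loop.
            print(f"{bin_name} failed ({err}); retrying in 5 s")
            GLib.timeout_add_seconds(5, restart_source_bin, pipeline, bin_name)
            return True
        print(f"Fatal error: {err}\n{debug}")
    return True

def restart_source_bin(pipeline, bin_name):
    source_bin = pipeline.get_by_name(bin_name)
    if source_bin is not None:
        # A fuller version would recreate the bin if this state change fails.
        source_bin.set_state(Gst.State.NULL)
        source_bin.set_state(Gst.State.PLAYING)
    return False  # run the timeout only once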

Some info about the model we’re using:

  • yolov5m / yolov5s (same result with both)
  • inference plugin: Gst-nvinfer
  • network mode: FP32

I purposely left out the full list of pipeline plugins we’re using and how the pipeline is set up because I thought that would be too much information, so if there’s anything else I can add here to help, please let me know.

Thanks a lot.
Haydn

Have you monitored the GPU and CPU performance when running with 17 RTSP sources?
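
For example, you can watch both on the Jetson while the pipeline is running with something like the standard JetPack tool (1000 ms sampling interval shown here):

sudo tegrastats --interval 1000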

Yes, the CPU did spike to ~100% while setting up the pipeline (creating and checking the source bins, etc.) but then stabilised at around 20% by the time the pipeline actually started. It was at 25% CPU (and 4% MEM) when it crashed.

OK, I went back and checked the GPU, and it is at 100% a lot of the time, so that definitely looks like an issue. It seems like it can just about handle 9 cameras, but only with the GPU close to saturation (see image).

One thing that could be causing this issue is that the pipeline doesn’t appear to be able to use the DLA:

WARNING: [TRT]: DLA requests all profiles have same min, max, and opt value. All dla layers are falling back to GPU
Building complete

0:09:33.933826675     1     0x30cfc830 INFO                 nvinfer gstnvinfer.cpp:680:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1955> [UID = 1]: serialize cuda engine to file: /workspace/deepstream-ivy/model_b9_gpu0_fp32.engine successfully
INFO: [FullDims Engine Info]: layers num: 4
0   INPUT  kFLOAT input           3x640x640       min: 1x3x640x640     opt: 9x3x640x640     Max: 9x3x640x640     
1   OUTPUT kFLOAT boxes           25200x4         min: 0               opt: 0               Max: 0               
2   OUTPUT kFLOAT scores          25200x1         min: 0               opt: 0               Max: 0               
3   OUTPUT kFLOAT classes         25200x1         min: 0               opt: 0               Max: 0               

0:09:34.401145565     1     0x30cfc830 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary-inference> [UID 1]: Load new model:configs/deepstream_yolo_configs/config_infer_primary_yoloV5s.txt sucessfully

Could this be the root of the issue?

If you want to enable DLA, please make sure the model you are using can run on DLA. There is an “enable-dla” configuration option in gst-nvinfer that enables DLA inferencing. See Gst-nvinfer — DeepStream 6.2 Release documentation.
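
For example, in the [property] section of the gst-nvinfer config file it would look roughly like this (assuming DLA core 0, and assuming the engine can actually be built for DLA):

enable-dla=1
use-dla-core=0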

100% GPU usage may cause the received video frames to wait in the RTSP queue; when the queue is full, it can no longer receive frames, or it has to drop data.

Thanks. How do I know if the model I am running can run in DLA? Is it just that I need to enable it?

It says that the “profiles” must have the same min, max and opt values. How can I ensure this? What actually are the profiles?

Please use the “trtexec” tool to check it, or go to the TensorRT forum for more information: Latest tensorrt topics in GPU-Accelerated Libraries - NVIDIA Developer Forums
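
As a rough example of checking the model with trtexec (the file path and the batch size of 9 are placeholders taken from your engine log; the equal min/opt/max shapes are what the “same min, max, and opt value” warning is asking for, and DLA only runs FP16/INT8, hence --fp16):

/usr/src/tensorrt/bin/trtexec --onnx=yolov5s.onnx \
    --minShapes=input:9x3x640x640 --optShapes=input:9x3x640x640 --maxShapes=input:9x3x640x640 \
    --useDLACore=0 --allowGPUFallback --fp16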

Thanks a lot, I’ll look into that.

Is it normal that 9 RTSP streams at 704x576@6FPS would already result in 100% GPU usage on a Jetson AGX with a 32GB GPU? Do you know of anything else that can result in very high GPU usage? I’ve attached the model config for more info:

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-color-format=0
onnx-file=/workspace/deepstream-ivy/models/yolo/yolov5s.onnx
#int8-calib-file=calib.table
labelfile-path=labels.txt
network-mode=0
num-detected-classes=80
interval=0
gie-unique-id=1
process-mode=1
network-type=0
cluster-mode=2
maintain-aspect-ratio=1
symmetric-padding=1
#force-implicit-batch-dim=1
#workspace-size=1000
parse-bbox-func-name=NvDsInferParseYolo
#parse-bbox-func-name=NvDsInferParseYoloCuda
custom-lib-path=/opt/nvidia/deepstream/deepstream/lib/libnvdsinfer_custom_impl_Yolo.so
engine-create-func-name=NvDsInferYoloCudaEngineGet
filter-out-class-ids= 4;5;6;7;8;9;10;11;12;13;14;15;16;17;18;19;20;21;22;23;24;25;26;27;28;29;30;31;32;33;34;35;36;37;38;39;40;41;42;43;44;45;46;47;48;49;50;51;52;53;54;55;56;57;58;59;60;61;62;63;64;65;66;67;68;69;70;71;72;73;74;75;76;77;78;79

[class-attrs-all]
nms-iou-threshold=0.45
pre-cluster-threshold=0.25
topk=300
classifier-threshold=0.8
threshold=0.8

If I increase the interval to 16, then I am able to run 17 cameras for a few minutes before it crashes with the following error:

2023-08-16 12:04:34 ivy-jetson-2307101433 bus_call[1] ERROR Error: gst-resource-error-quark: Could not read from resource. (9): gstrtspsrc.c(5560): gst_rtspsrc_loop_interleaved (): /GstPipeline:pipeline0/GstBin:source-bin-ip_camera_18/GstURIDecodeBin:uri-decode-bin/GstRTSPSrc:source:
Could not receive message. (Parse error)

Although this time the FPS stays at 6, so the GPU may not be the issue. This could be the same issue I mentioned in the initial post, where the pipeline seems to lose the connection to one camera and that brings the whole pipeline down.

For some more context, if I run
gst-launch-1.0 --gst-debug=rtspsrc:5 rtspsrc location=rtspt://user:pass@x.x.x.x:554/Streaming/Channels/102?transportmode=unicast&profile=Profile_2 debug=1 ! rtph264depay ! fakesink

then I get the following output:

0:00:00.368416383 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:9070:gst_rtspsrc_uri_set_uri:<rtspsrc0> configuring URI
0:00:00.368437727 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:9086:gst_rtspsrc_uri_set_uri:<rtspsrc0> set uri: rtspt://user:pass@x.x.x.x:554/Streaming/Channels/102?transportmode=unicast
0:00:00.368450048 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:9087:gst_rtspsrc_uri_set_uri:<rtspsrc0> request uri is: rtsp://172.16.96.27:554/Streaming/Channels/102?transportmode=unicast
Setting pipeline to PAUSED ...
0:00:00.368713636 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:8836:gst_rtspsrc_start:<rtspsrc0> starting
0:00:00.368879111 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5942:gst_rtspsrc_loop_send_cmd:<rtspsrc0> sending cmd OPEN
0:00:00.368894919 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5979:gst_rtspsrc_loop_send_cmd:<rtspsrc0> not interrupting busy cmd unknown
Pipeline is live and does not need PREROLL ...
0:00:00.369117354 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8783:gst_rtspsrc_thread:<rtspsrc0> got command OPEN
0:00:00.369145323 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 0
0:00:00.369183980 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:4959:gst_rtsp_conninfo_connect:<rtspsrc0> creating connection (rtspt://user:pass@x.x.x.x:554/Streaming/Channels/102?transportmode=unicast)...
0:00:00.369510769 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:4970:gst_rtsp_conninfo_connect:<rtspsrc0> sanitized uri rtsp://172.16.96.27:554/Streaming/Channels/102?transportmode=unicast
0:00:00.369543857 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5004:gst_rtsp_conninfo_connect:<rtspsrc0> connecting (rtspt://user:pass@x.x.x.x:554/Streaming/Channels/102?transportmode=unicast)...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtspt://user:pass@x.x.x.x:554/Streaming/Channels/102?transportmode=unicast

(gst-launch-1.0:238633): dconf-CRITICAL **: 17:02:47.921: unable to create directory '/run/user/1009/dconf': Permission denied.  dconf will not work properly.
[the same dconf-CRITICAL warning is repeated nine more times]
0:00:00.561798292 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:7767:gst_rtspsrc_retrieve_sdp:<rtspsrc0> create options... (async)
0:00:00.561852149 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:7776:gst_rtspsrc_retrieve_sdp:<rtspsrc0> send options...
0:00:00.561978871 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:00.561999127 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:00.562007031 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6371:gst_rtspsrc_try_send:<rtspsrc0> sending message
Progress: (open) Retrieving server options
0:00:01.135615404 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6273:gst_rtsp_src_receive_response:<rtspsrc0> received response message
0:00:01.135675373 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6292:gst_rtsp_src_receive_response:<rtspsrc0> got response message 200
0:00:01.135711246 238633 0xaaaaf1926b60 INFO                 rtspsrc gstrtspsrc.c:7788:gst_rtspsrc_retrieve_sdp:<rtspsrc0> Now using version: 1.0
0:00:01.135736046 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:7796:gst_rtspsrc_retrieve_sdp:<rtspsrc0> create describe...
0:00:01.135752942 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:7813:gst_rtspsrc_retrieve_sdp:<rtspsrc0> send describe...
0:00:01.135864112 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:01.135881680 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:01.135911441 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6371:gst_rtspsrc_try_send:<rtspsrc0> sending message
Progress: (open) Retrieving media info
0:00:01.323650858 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6273:gst_rtsp_src_receive_response:<rtspsrc0> received response message
0:00:01.323702955 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6292:gst_rtsp_src_receive_response:<rtspsrc0> got response message 401
0:00:01.323755467 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6188:gst_rtspsrc_setup_auth:<rtspsrc0> Attempting authentication using credentials from the URL
0:00:01.323772780 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6217:gst_rtspsrc_setup_auth:<rtspsrc0> Attempting Digest authentication
0:00:01.323800332 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:01.323816588 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:01.323829613 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6371:gst_rtspsrc_try_send:<rtspsrc0> sending message
0:00:02.514022259 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6273:gst_rtsp_src_receive_response:<rtspsrc0> received response message
0:00:02.514108917 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6292:gst_rtsp_src_receive_response:<rtspsrc0> got response message 200
0:00:02.514146645 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:7871:gst_rtspsrc_retrieve_sdp:<rtspsrc0> parse SDP...
0:00:02.514415514 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2060:gst_rtspsrc_collect_payloads: mapping sdp session level attributes to caps
0:00:02.514434682 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2062:gst_rtspsrc_collect_payloads: mapping sdp media level attributes to caps
0:00:02.514466619 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2080:gst_rtspsrc_collect_payloads:<rtspsrc0>  looking at 0 pt: 96
0:00:02.514536188 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2219:gst_rtspsrc_create_stream:<rtspsrc0> stream 0, (0xffff90033a80)
0:00:02.514546780 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2220:gst_rtspsrc_create_stream:<rtspsrc0>  port: 0
0:00:02.514553660 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2221:gst_rtspsrc_create_stream:<rtspsrc0>  container: 0
0:00:02.514561948 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2222:gst_rtspsrc_create_stream:<rtspsrc0>  control: rtsp://172.16.96.27:554/Streaming/Channels/102/trackID=1?transportmode=unicast
0:00:02.514569724 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2256:gst_rtspsrc_create_stream:<rtspsrc0>  setup: rtsp://172.16.96.27:554/Streaming/Channels/102/trackID=1?transportmode=unicast
0:00:02.514605437 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:501:default_select_stream:<rtspsrc0> default handler
0:00:02.514626109 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:512:select_stream_accum: accum 1
0:00:02.514640189 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:501:default_select_stream:<rtspsrc0> default handler
0:00:02.514647358 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:512:select_stream_accum: accum 1
0:00:02.514659486 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:7222:gst_rtspsrc_setup_streams_start:<rtspsrc0> doing setup of stream 0xffff90033a80 with rtsp://172.16.96.27:554/Streaming/Channels/102/trackID=1?transportmode=unicast
0:00:02.514677022 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:7237:gst_rtspsrc_setup_streams_start:<rtspsrc0> protocols = 0x4, protocol mask = 0x4
0:00:02.514685854 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6706:gst_rtspsrc_create_transports_string:<rtspsrc0> got transports (NULL)
0:00:02.514697150 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6757:gst_rtspsrc_create_transports_string:<rtspsrc0> adding TCP
0:00:02.514704287 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6763:gst_rtspsrc_create_transports_string:<rtspsrc0> prepared transports RTP/AVP/TCP;unicast;interleaved=%%i1-%%i2
0:00:02.514715743 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:7253:gst_rtspsrc_setup_streams_start:<rtspsrc0> replace ports in RTP/AVP/TCP;unicast;interleaved=%%i1-%%i2
0:00:02.514728415 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:7264:gst_rtspsrc_setup_streams_start:<rtspsrc0> transport is now RTP/AVP/TCP;unicast;interleaved=0-1
0:00:02.514818176 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:02.514829697 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:02.514837025 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6371:gst_rtspsrc_try_send:<rtspsrc0> sending message
Progress: (request) SETUP stream 0
0:00:02.697745803 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6273:gst_rtsp_src_receive_response:<rtspsrc0> received response message
0:00:02.697811596 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6292:gst_rtsp_src_receive_response:<rtspsrc0> got response message 200
0:00:02.697865709 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6955:gst_rtsp_src_setup_stream_from_response:<rtspsrc0> stream 0xffff90033a80 as TCP interleaved
0:00:02.697880845 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:4575:gst_rtspsrc_stream_configure_transport:<rtspsrc0> configuring transport for stream 0xffff90033a80
0:00:02.697892781 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:4584:gst_rtspsrc_stream_configure_transport:<rtspsrc0> setting media type to application/x-rtp
0:00:02.697925646 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3811:gst_rtspsrc_stream_configure_manager:<rtspsrc0> using manager rtpbin
0:00:02.700531384 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3521:set_manager_buffer_mode:<rtspsrc0> auto buffering mode, have clock (NULL)
0:00:02.700565785 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3542:set_manager_buffer_mode:<rtspsrc0> auto buffering mode
0:00:02.700576537 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3547:set_manager_buffer_mode:<rtspsrc0> selected slave
0:00:02.700597273 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3909:gst_rtspsrc_stream_configure_manager:<rtspsrc0> connect to signals on session manager, stream 0xffff90033a80
0:00:02.702927615 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3581:request_rtp_decoder: decoder session 0, stream 0xffff90033a80, 0
0:00:02.703296709 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3294:new_manager_pad:<rtspsrc0> got new manager pad <manager:recv_rtp_sink_0>
0:00:02.703337414 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3367:new_manager_pad:<rtspsrc0> ignoring unknown stream
0:00:02.703563466 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3581:request_rtp_decoder: decoder session 0, stream 0xffff90033a80, 0
0:00:02.703679980 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3294:new_manager_pad:<rtspsrc0> got new manager pad <manager:recv_rtcp_sink_0>
0:00:02.703715884 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3367:new_manager_pad:<rtspsrc0> ignoring unknown stream
0:00:02.703756173 238633 0xaaaaf1926b60 INFO                 rtspsrc gstrtspsrc.c:3953:gst_rtspsrc_stream_configure_manager:<rtspsrc0> configure bandwidth in session 0xffff9004c0e0
0:00:02.703797806 238633 0xaaaaf1926b60 INFO                 rtspsrc gstrtspsrc.c:3958:gst_rtspsrc_stream_configure_manager:<rtspsrc0> setting AS: 5000000.000000
0:00:02.703844526 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:4063:gst_rtspsrc_stream_configure_tcp:<rtspsrc0> stream 0xffff90033a80 on channels 0-1
0:00:02.703859567 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:4086:gst_rtspsrc_stream_configure_tcp:<rtspsrc0> using manager source pad
0:00:02.704007377 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3614:request_rtcp_encoder: decoder session 0, stream 0xffff90033a80, 0
0:00:02.704090322 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3294:new_manager_pad:<rtspsrc0> got new manager pad <manager:send_rtcp_src_0>
0:00:02.704108979 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:3367:new_manager_pad:<rtspsrc0> ignoring unknown stream
0:00:02.704208788 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8783:gst_rtspsrc_thread:<rtspsrc0> got command LOOP
0:00:02.704234485 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 0
0:00:02.704254485 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5483:gst_rtspsrc_loop_interleaved:<rtspsrc0> doing receive with timeout 54 seconds, 810706 usec
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
0:00:02.704479225 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:3521:set_manager_buffer_mode:<rtspsrc0> auto buffering mode, have clock (NULL)
0:00:02.704508537 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:3542:set_manager_buffer_mode:<rtspsrc0> auto buffering mode
0:00:02.704523065 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:3547:set_manager_buffer_mode:<rtspsrc0> selected slave
0:00:02.704554938 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5942:gst_rtspsrc_loop_send_cmd:<rtspsrc0> sending cmd WAIT
0:00:02.704571290 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5967:gst_rtspsrc_loop_send_cmd:<rtspsrc0> cancel previous request LOOP
0:00:02.704584762 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5974:gst_rtspsrc_loop_send_cmd:<rtspsrc0> connection flush busy LOOP
0:00:02.704596347 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 1
0:00:02.704607803 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5096:gst_rtspsrc_connection_flush:<rtspsrc0> connection flush
0:00:02.704703324 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5553:gst_rtspsrc_loop_interleaved:<rtspsrc0> got interrupted
0:00:02.704750301 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6040:gst_rtspsrc_loop:<rtspsrc0> pausing task, reason flushing
0:00:02.704769629 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5942:gst_rtspsrc_loop_send_cmd:<rtspsrc0> sending cmd WAIT
0:00:02.704783006 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5974:gst_rtspsrc_loop_send_cmd:<rtspsrc0> connection flush busy LOOP
0:00:02.704795230 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 1
0:00:02.704898112 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5942:gst_rtspsrc_loop_send_cmd:<rtspsrc0> sending cmd PLAY
0:00:02.704926560 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5979:gst_rtspsrc_loop_send_cmd:<rtspsrc0> not interrupting busy cmd WAIT
0:00:02.704960673 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8783:gst_rtspsrc_thread:<rtspsrc0> got command PLAY
0:00:02.704976065 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 0
0:00:02.704984577 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5096:gst_rtspsrc_connection_flush:<rtspsrc0> connection flush
0:00:02.705014401 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8312:gst_rtspsrc_play:<rtspsrc0> PLAY...
0:00:02.705091747 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:02.705110755 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:02.705132195 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6371:gst_rtspsrc_try_send:<rtspsrc0> sending message
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
0:00:02.963036374 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6273:gst_rtsp_src_receive_response:<rtspsrc0> received response message
0:00:02.963123831 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6292:gst_rtsp_src_receive_response:<rtspsrc0> got response message 200
0:00:02.963178040 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8127:gst_rtspsrc_parse_rtpinfo:<rtspsrc0> parsing RTP-Info url=rtsp://172.16.96.27:554/Streaming/Channels/102/trackID=1?transportmode=unicast;seq=700;rtptime=2282497928
0:00:02.963210073 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8136:gst_rtspsrc_parse_rtpinfo:<rtspsrc0> parsing info url=rtsp://172.16.96.27:554/Streaming/Channels/102/trackID=1?transportmode=unicast;seq=700;rtptime=2282497928
0:00:02.963236089 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8150:gst_rtspsrc_parse_rtpinfo:<rtspsrc0> parsing field url=rtsp://172.16.96.27:554/Streaming/Channels/102/trackID=1?transportmode=unicast
0:00:02.963263578 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8150:gst_rtspsrc_parse_rtpinfo:<rtspsrc0> parsing field seq=700
0:00:02.963283258 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8150:gst_rtspsrc_parse_rtpinfo:<rtspsrc0> parsing field rtptime=2282497928
0:00:02.963308538 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8166:gst_rtspsrc_parse_rtpinfo:<rtspsrc0> found stream 0xffff90033a80, setting: seqbase 700, timebase 2282497928
0:00:02.963326203 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:4769:gst_rtspsrc_configure_caps:<rtspsrc0> configuring stream caps
0:00:02.963488253 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:4806:gst_rtspsrc_configure_caps:<rtspsrc0> stream 0xffff90033a80, pt 96, caps application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H265, sprop-sps=(string)"QgEBAWAAAAMAgAAAAwAAAwB4oAWCAJB/ja7tTd3Jdf+C9AIKtwUFBQQAAA+gAABdwch3uUQD0gARMwB6QAImYg\=\=", sprop-pps=(string)"RAHBcrCULxI\=", a-recvonly=(string)"", x-dimensions=(string)"704\,576", a-Media_header=(string)"MEDIAINFO\=494D4B48010200000400050000000000000000000000000000000000000000000000000000000000\;", a-appversion=(string)1.0, ssrc=(uint)1718908749, clock-base=(uint)2282497928, seqnum-base=(uint)700, npt-start=(guint64)0, play-speed=(double)1, play-scale=(double)1
0:00:02.963521502 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:4817:gst_rtspsrc_configure_caps:<rtspsrc0> clear session
0:00:02.964036102 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8497:gst_rtspsrc_play:<rtspsrc0> mark DISCONT, we did a seek to another position
0:00:02.964099015 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8783:gst_rtspsrc_thread:<rtspsrc0> got command LOOP
0:00:02.964121864 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 0
0:00:02.964146952 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5483:gst_rtspsrc_loop_interleaved:<rtspsrc0> doing receive with timeout 54 seconds, 741121 usec
Progress: (request) Sent PLAY request
0:00:03.100295863 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5494:gst_rtspsrc_loop_interleaved:<rtspsrc0> we received a server message
0:00:03.100388696 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5527:gst_rtspsrc_loop_interleaved:<rtspsrc0> got data message
0:00:03.100456601 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 36 on channel 0
0:00:03.101130212 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:4712:gst_rtspsrc_activate_streams:<rtspsrc0> activating streams
0:00:03.101196741 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5383:gst_rtspsrc_handle_data:<rtspsrc0> first buffer at time 7:23:50.823778547, base 7:23:50.426947012
0:00:03.101266727 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5426:gst_rtspsrc_handle_data:<rtspsrc0> setting timestamp 0:00:00.396831535
0:00:03.101930641 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5483:gst_rtspsrc_loop_interleaved:<rtspsrc0> doing receive with timeout 54 seconds, 603351 usec
0:00:03.101990994 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5494:gst_rtspsrc_loop_interleaved:<rtspsrc0> we received a server message
0:00:03.102014803 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5527:gst_rtspsrc_loop_interleaved:<rtspsrc0> got data message
0:00:03.102042963 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 76 on channel 0
0:00:03.103669838 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2928:gst_rtspsrc_handle_internal_src_event:<'':internalsrc_0> received event reconfigure
0:00:03.103897841 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5483:gst_rtspsrc_loop_interleaved:<rtspsrc0> doing receive with timeout 54 seconds, 601390 usec
0:00:03.103931538 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5494:gst_rtspsrc_loop_interleaved:<rtspsrc0> we received a server message
0:00:03.103942450 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5527:gst_rtspsrc_loop_interleaved:<rtspsrc0> got data message
0:00:03.103957618 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 20 on channel 0
0:00:03.103978611 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5483:gst_rtspsrc_loop_interleaved:<rtspsrc0> doing receive with timeout 54 seconds, 601286 usec
0:00:03.103998931 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5494:gst_rtspsrc_loop_interleaved:<rtspsrc0> we received a server message
0:00:03.104009203 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5527:gst_rtspsrc_loop_interleaved:<rtspsrc0> got data message
0:00:03.103994803 238633 0xffff90006640 DEBUG                rtspsrc gstrtspsrc.c:2928:gst_rtspsrc_handle_internal_src_event:<'':internalsrc_0> received event reconfigure
0:00:03.104038612 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 1440 on channel 0
0:00:03.104070004 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5483:gst_rtspsrc_loop_interleaved:<rtspsrc0> doing receive with timeout 54 seconds, 601196 usec
0:00:03.104064916 238633 0xffff90006640 DEBUG                rtspsrc gstrtspsrc.c:3294:new_manager_pad:<rtspsrc0> got new manager pad <manager:recv_rtp_src_0_1718908749_96>
0:00:03.104097685 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5494:gst_rtspsrc_loop_interleaved:<rtspsrc0> we received a server message
0:00:03.104126965 238633 0xffff90006640 DEBUG                rtspsrc gstrtspsrc.c:3302:new_manager_pad:<rtspsrc0> stream: 0, SSRC 6674774d, PT 96
0:00:03.104129141 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5527:gst_rtspsrc_loop_interleaved:<rtspsrc0> got data message
0:00:03.104146070 238633 0xffff90006640 DEBUG                rtspsrc gstrtspsrc.c:3319:new_manager_pad:<rtspsrc0> stream 0xffff90033a80, container 0, added 1, setup 1
0:00:03.104151286 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 1440 on channel 0
0:00:03.104170518 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5483:gst_rtspsrc_loop_interleaved:<rtspsrc0> doing receive with timeout 54 seconds, 601094 usec
0:00:03.104198902 238633 0xffff90006640 DEBUG                rtspsrc gstrtspsrc.c:2928:gst_rtspsrc_handle_internal_src_event:<'':internalsrc_0> received event reconfigure
0:00:03.104207735 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5494:gst_rtspsrc_loop_interleaved:<rtspsrc0> we received a server message
0:00:03.104242327 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5527:gst_rtspsrc_loop_interleaved:<rtspsrc0> got data message
0:00:03.104267192 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 1440 on channel 0
0:00:03.104278808 238633 0xffff90006640 DEBUG                rtspsrc gstrtspsrc.c:3246:copy_sticky_events:<'':recv_rtp_src_0_1718908749_96> store sticky event stream-start event: 0xffff90049a00, time 99:99:99.999999999, seq-num 49, GstEventStreamStart, stream-id=(string)2f0fdcd59004fe0bea50a479b745fdaacaa10ed02a9dc7fdc760089414a41e77/0, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE, group-id=(uint)1;
0:00:03.104286680 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5483:gst_rtspsrc_loop_interleaved:<rtspsrc0> doing receive with timeout 54 seconds, 600978 usec
0:00:03.104317592 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5494:gst_rtspsrc_loop_interleaved:<rtspsrc0> we received a server message
0:00:03.104325656 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5527:gst_rtspsrc_loop_interleaved:<rtspsrc0> got data message
0:00:03.104333785 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 1440 on channel 0
0:00:03.104350649 238633 0xffff90006640 DEBUG                rtspsrc gstrtspsrc.c:3246:copy_sticky_events:<'':recv_rtp_src_0_1718908749_96> store sticky event caps event: 0xffff90063ab0, time 99:99:99.999999999, seq-num 74, GstEventCaps, caps=(GstCaps)"application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H265\,\ sprop-sps\=\(string\)\"QgEBAWAAAAMAgAAAAwAAAwB4oAWCAJB/ja7tTd3Jdf+C9AIKtwUFBQQAAA+gAABdwch3uUQD0gARMwB6QAImYg\\\=\\\=\"\,\ sprop-pps\=\(string\)\"RAHBcrCULxI\\\=\"\,\ a-recvonly\=\(string\)\"\"\,\ x-dimensions\=\(string\)\"704\\\,576\"\,\ a-Media_header\=\(string\)\"MEDIAINFO\\\=494D4B48010200000400050000000000000000000000000000000000000000000000000000000000\\\;\"\,\ a-appversion\=\(string\)1.0\,\ ssrc\=\(uint\)1718908749\,\ clock-base\=\(uint\)2282497928\,\ seqnum-base\=\(uint\)700\,\ npt-start\=\(guint64\)0\,\ play-speed\=\(double\)1\,\ play-scale\=\(double\)1";
0:00:03.104362041 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5483:gst_rtspsrc_loop_interleaved:<rtspsrc0> doing receive with timeout 54 seconds, 600903 usec
0:00:03.104423706 238633 0xffff90006640 DEBUG                rtspsrc gstrtspsrc.c:3246:copy_sticky_events:<'':recv_rtp_src_0_1718908749_96> store sticky event segment event: 0xffff90049a70, time 99:99:99.999999999, seq-num 52, GstEventSegment, segment=(GstSegment)"GstSegment, flags=(GstSegmentFlags)GST_SEGMENT_FLAG_NONE, rate=(double)1, applied-rate=(double)1, format=(GstFormat)GST_FORMAT_TIME, base=(guint64)0, offset=(guint64)0, start=(guint64)0, stop=(guint64)18446744073709551615, time=(guint64)0, position=(guint64)0, duration=(guint64)18446744073709551615;";
0:00:03.104439098 238633 0xffff90006640 DEBUG                rtspsrc gstrtspsrc.c:3356:new_manager_pad:<rtspsrc0> We added all streams
0:00:03.105554541 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5494:gst_rtspsrc_loop_interleaved:<rtspsrc0> we received a server message
0:00:03.105639566 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5527:gst_rtspsrc_loop_interleaved:<rtspsrc0> got data message
0:00:03.105655918 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 1440 on channel 0
0:00:03.105688719 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5579:gst_rtspsrc_loop_interleaved:<rtspsrc0> could no handle data message
0:00:03.105700015 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6040:gst_rtspsrc_loop:<rtspsrc0> pausing task, reason not-linked
0:00:03.105711023 238633 0xaaaaf1926b60 WARN                 rtspsrc gstrtspsrc.c:6057:gst_rtspsrc_loop:<rtspsrc0> error: Internal data stream error.
0:00:03.105717935 238633 0xaaaaf1926b60 WARN                 rtspsrc gstrtspsrc.c:6057:gst_rtspsrc_loop:<rtspsrc0> error: streaming stopped, reason not-linked (-1)
0:00:03.105887026 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5942:gst_rtspsrc_loop_send_cmd:<rtspsrc0> sending cmd WAIT
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Internal data stream error.
0:00:03.105904914 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5967:gst_rtspsrc_loop_send_cmd:<rtspsrc0> cancel previous request LOOP
0:00:03.105929875 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5974:gst_rtspsrc_loop_send_cmd:<rtspsrc0> connection flush busy LOOP
0:00:03.105937747 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 1
0:00:03.105944723 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5096:gst_rtspsrc_connection_flush:<rtspsrc0> connection flush
Additional debug info:
gstrtspsrc.c(6057): gst_rtspsrc_loop (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
streaming stopped, reason not-linked (-1)
Execution ended after 0:00:00.400863824
Setting pipeline to NULL ...
0:00:03.106069077 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5942:gst_rtspsrc_loop_send_cmd:<rtspsrc0> sending cmd WAIT
0:00:03.106083733 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5979:gst_rtspsrc_loop_send_cmd:<rtspsrc0> not interrupting busy cmd WAIT
0:00:03.106100597 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8783:gst_rtspsrc_thread:<rtspsrc0> got command WAIT
0:00:03.106109494 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 0
0:00:03.106115126 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5096:gst_rtspsrc_connection_flush:<rtspsrc0> connection flush
0:00:03.106162966 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5942:gst_rtspsrc_loop_send_cmd:<rtspsrc0> sending cmd PAUSE
0:00:03.106174487 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5979:gst_rtspsrc_loop_send_cmd:<rtspsrc0> not interrupting busy cmd WAIT
0:00:03.106187799 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8783:gst_rtspsrc_thread:<rtspsrc0> got command PAUSE
0:00:03.106196087 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 0
0:00:03.106203031 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8559:gst_rtspsrc_pause:<rtspsrc0> PAUSE...
0:00:03.106240792 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:03.106251928 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:03.106259640 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6371:gst_rtspsrc_try_send:<rtspsrc0> sending message
0:00:03.106260920 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5942:gst_rtspsrc_loop_send_cmd:<rtspsrc0> sending cmd CLOSE
0:00:03.106282584 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5967:gst_rtspsrc_loop_send_cmd:<rtspsrc0> cancel previous request LOOP
0:00:03.106290041 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5974:gst_rtspsrc_loop_send_cmd:<rtspsrc0> connection flush busy PAUSE
0:00:03.106296857 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 1
0:00:03.106303705 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5096:gst_rtspsrc_connection_flush:<rtspsrc0> connection flush
0:00:03.106330873 238633 0xaaaaf1926b60 WARN                 rtspsrc gstrtspsrc.c:6410:gst_rtspsrc_try_send:<rtspsrc0> send interrupted
0:00:03.106341913 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6527:gst_rtspsrc_send:<rtspsrc0> got error -3
0:00:03.106349914 238633 0xaaaaf1926b60 WARN                 rtspsrc gstrtspsrc.c:8672:gst_rtspsrc_pause:<rtspsrc0> PAUSE interrupted
0:00:03.106364122 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8783:gst_rtspsrc_thread:<rtspsrc0> got command CLOSE
0:00:03.106371290 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 0
0:00:03.106377594 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5096:gst_rtspsrc_connection_flush:<rtspsrc0> connection flush
0:00:03.106388634 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:7994:gst_rtspsrc_close:<rtspsrc0> TEARDOWN...
0:00:03.106742144 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:03.106759520 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:522:default_before_send:<rtspsrc0> default handler
0:00:03.106771296 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6371:gst_rtspsrc_try_send:<rtspsrc0> sending message
0:00:03.106866626 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6277:gst_rtsp_src_receive_response:<rtspsrc0> handle data response message
0:00:03.106881314 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 1440 on channel 0
0:00:03.193241092 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6277:gst_rtsp_src_receive_response:<rtspsrc0> handle data response message
0:00:03.193333638 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 1440 on channel 0
0:00:03.193427271 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6277:gst_rtsp_src_receive_response:<rtspsrc0> handle data response message
0:00:03.193450856 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 1440 on channel 0
0:00:03.206405307 238633 0xaaaaf195c640 WARN                 rtspsrc gstrtspsrc.c:6001:gst_rtspsrc_loop_send_cmd_and_wait:<rtspsrc0> Timed out waiting for TEARDOWN to be processed.
0:00:03.206651679 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:8867:gst_rtspsrc_stop:<rtspsrc0> stopping
0:00:03.206687904 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5942:gst_rtspsrc_loop_send_cmd:<rtspsrc0> sending cmd WAIT
0:00:03.206709888 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5974:gst_rtspsrc_loop_send_cmd:<rtspsrc0> connection flush busy CLOSE
0:00:03.206726784 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5093:gst_rtspsrc_connection_flush:<rtspsrc0> set flushing 1
0:00:03.206749409 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:5096:gst_rtspsrc_connection_flush:<rtspsrc0> connection flush
0:00:03.277235487 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6277:gst_rtsp_src_receive_response:<rtspsrc0> handle data response message
0:00:03.277303872 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5292:gst_rtspsrc_handle_data:<rtspsrc0> pushing data of size 1440 on channel 0
0:00:03.277380130 238633 0xaaaaf1926b60 WARN                 rtspsrc gstrtspsrc.c:6326:gst_rtsp_src_receive_response:<rtspsrc0> receive interrupted
0:00:03.277434883 238633 0xaaaaf1926b60 WARN                 rtspsrc gstrtspsrc.c:6424:gst_rtspsrc_try_send:<rtspsrc0> receive interrupted
0:00:03.277455331 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:6527:gst_rtspsrc_send:<rtspsrc0> got error -3
0:00:03.277475907 238633 0xaaaaf1926b60 WARN                 rtspsrc gstrtspsrc.c:8099:gst_rtspsrc_close:<rtspsrc0> TEARDOWN interrupted
0:00:03.277493444 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:8063:gst_rtspsrc_close:<rtspsrc0> closing connection...
0:00:03.277508452 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5060:gst_rtsp_conninfo_close:<rtspsrc0> closing connection...
0:00:03.277688231 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:5066:gst_rtsp_conninfo_close:<rtspsrc0> freeing connection...
0:00:03.277732967 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2336:gst_rtspsrc_cleanup:<rtspsrc0> cleanup
0:00:03.277785160 238633 0xaaaaf1926b60 DEBUG                rtspsrc gstrtspsrc.c:2272:gst_rtspsrc_stream_free:<rtspsrc0> free stream 0xffff90033a80
0:00:03.278324497 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:7994:gst_rtspsrc_close:<rtspsrc0> TEARDOWN...
0:00:03.278376498 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:7999:gst_rtspsrc_close:<rtspsrc0> not ready, doing cleanup
0:00:03.278406418 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:8063:gst_rtspsrc_close:<rtspsrc0> closing connection...
0:00:03.278427699 238633 0xaaaaf195c640 DEBUG                rtspsrc gstrtspsrc.c:2336:gst_rtspsrc_cleanup:<rtspsrc0> cleanup
Freeing pipeline ...

The GPU load varies between 0 and 100% (roughly 4/5 of the time at 100% and 1/5 at 0%). From this forum post it seems like that may be normal, since it’s just using all available resources, but I’m struggling to see anything other than GPU resources that could be causing the issue.

Please also lock the max clock with the method here: VPI - Vision Programming Interface: Performance Benchmark (nvidia.com)
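
On JetPack this is usually done with something like the following standard Jetson commands (the linked page describes the exact procedure, which may differ slightly):

sudo nvpmodel -m 0     # select the maximum power mode
sudo jetson_clocks     # lock CPU/GPU/EMC clocks at their maximum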

It depends on the model and your settings. I notice that you are running an FP32 model.
You can use the “trtexec” tool to measure the inferencing time of the model.

We have a fleet of Jetsons that we will be using. Do we need to lock the max clock on all of them? From the description in the link it looks like it’s just for testing.

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

No. It should be set when your use case needs the maximum capability of the GPU.
