• Hardware Platform: A100
• DeepStream Version: 5.1
• NVIDIA GPU Driver Version (valid for GPU only): 460
I have seen the two logs below when using an RTSP camera deployed on the same network.
Starting pipeline
0:00:17.046720481 39655 0x2e99ca0 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<secondary1-nvinference-engine> NvDsInferContext[UID 2]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1702> [UID = 2]: deserialized trt engine from :/app/model/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:685 [Implicit Engine Info]: layers num: 2
0 INPUT kFLOAT input_1 3x224x224
1 OUTPUT kFLOAT predictions/Softmax 12x1x1
0:00:17.057234028 39655 0x2e99ca0 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<secondary1-nvinference-engine> NvDsInferContext[UID 2]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1806> [UID = 2]: Use deserialized engine model: /app/model/Secondary_CarColor/resnet18.caffemodel_b16_gpu0_int8.engine
0:00:17.080750611 39655 0x2e99ca0 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<secondary1-nvinference-engine> [UID 2]: Load new model:config/Vehicle/PoC_sgie1_config.txt sucessfully
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream/lib/libnvds_mot_klt.so
gstnvtracker: Optional NvMOT_RemoveStreams not implemented
gstnvtracker: Batch processing is OFF
gstnvtracker: Past frame output is OFF
0:00:17.181690013 39655 0x2e99ca0 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1702> [UID = 1]: deserialized trt engine from :/app/model/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:685 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40
0:00:17.181823236 39655 0x2e99ca0 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1806> [UID = 1]: Use deserialized engine model: /app/model/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
0:00:17.183529523 39655 0x2e99ca0 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary-inference> [UID 1]: Load new model:config/Vehicle/PoC_pgie_config.txt sucessfully
Decodebin child added: source
Warning: gst-resource-error-quark: Could not read from resource. (9): gstrtspsrc.c(5427): gst_rtspsrc_reconnect (): /GstPipeline:pipeline0/GstBin:source-bin-00/GstURIDecodeBin:uri-decode-bin/GstRTSPSrc:source:
Could not receive any UDP packets for 5.0000 seconds, maybe your firewall is blocking it. Retrying using a tcp connection.
and
Decodebin child added: source
Warning: gst-resource-error-quark: Could not read from resource. (9): gstrtspsrc.c(5280): gst_rtspsrc_loop_udp (): /GstPipeline:pipeline0/GstBin:source-bin-00/GstURIDecodeBin:uri-decode-bin/GstRTSPSrc:source:
The server closed the connection.
Decodebin child added: decodebin0
Decodebin child added: rtph264depay0
Decodebin child added: h264parse0
Decodebin child added: capsfilter0
Error: gst-stream-error-quark: GStreamer encountered a general stream error. (1): gstdecodebin2.c(4695): gst_decode_bin_expose (): /GstPipeline:pipeline0/GstBin:source-bin-00/GstURIDecodeBin:uri-decode-bin/GstDecodeBin:decodebin0:
all streams without buffers
Exiting appq flushed
Closing RMQ Connection
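The first warning suggests the camera's RTP/UDP packets never arrive (possibly blocked by a firewall), so rtspsrc only falls back to TCP after the 5 second timeout. One way to avoid that wait is to force TCP transport on the rtspsrc that uridecodebin creates. Below is a minimal sketch of that idea in Python, assuming the pipeline uses uridecodebin; the callback name and URL are placeholders, while the source-setup signal and the rtspsrc protocols property are standard GStreamer:

```python
#!/usr/bin/env python3
# Sketch: force the rtspsrc created inside uridecodebin to use TCP transport,
# so the pipeline does not wait 5 s for UDP packets that a firewall may drop.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

def on_source_setup(uridecodebin, source):
    # rtspsrc exposes a "protocols" flags property; 4 == GST_RTSP_LOWER_TRANS_TCP
    if source.find_property("protocols") is not None:
        source.set_property("protocols", 4)

uri_decode_bin = Gst.ElementFactory.make("uridecodebin", "uri-decode-bin")
uri_decode_bin.set_property("uri", "rtsp://<ip>:554/<stream-path>")  # placeholder URL
uri_decode_bin.connect("source-setup", on_source_setup)
# ... add the bin to the existing source bin / pipeline as before
```

If the app uses deepstream-app style [sourceX] config groups instead, I believe there is an equivalent select-rtp-protocol key that forces TCP, but the snippet above is the generic GStreamer route.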
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Resource not found.
Additional debug info:
gstrtspsrc.c(7460): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
No valid RTSP URL was provided
0:00:00.223163093 276045 0x55939f241140 DEBUG rtspsrc gstrtspsrc.c:4625:gst_rtsp_conninfo_connect:<rtspsrc1> sanitized uri rtsp://<ip>:554/
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
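In the last log the sanitized URI is just rtsp://<ip>:554/ with no stream path, which is presumably why gst_rtspsrc_retrieve_sdp reports that no valid RTSP URL was provided. Before debugging the DeepStream side further, it may help to confirm that the full camera URL (including the mount path and any credentials) plays on its own, outside DeepStream. A minimal standalone check in Python, with a placeholder URL:

```python
#!/usr/bin/env python3
# Sketch: play the RTSP URL on its own to confirm it is reachable and well formed.
# The URL below is a placeholder; substitute the camera's full address,
# including the stream path and credentials if any.
import sys
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

URL = "rtsp://user:password@<ip>:554/<stream-path>"  # placeholder

# protocols=tcp mirrors the TCP fallback seen in the log; decodebin handles depay/parse/decode
pipeline = Gst.parse_launch(
    f'rtspsrc location="{URL}" protocols=tcp ! decodebin ! fakesink sync=false'
)
pipeline.set_state(Gst.State.PLAYING)

bus = pipeline.get_bus()
msg = bus.timed_pop_filtered(
    10 * Gst.SECOND,
    Gst.MessageType.ERROR | Gst.MessageType.EOS | Gst.MessageType.ASYNC_DONE,
)
if msg and msg.type == Gst.MessageType.ERROR:
    err, dbg = msg.parse_error()
    print(f"RTSP test failed: {err} ({dbg})", file=sys.stderr)
else:
    print("RTSP stream negotiated successfully")
pipeline.set_state(Gst.State.NULL)
```

If this standalone test also fails with "Resource not found" or "No valid RTSP URL", the problem is with the URL or the camera/network rather than the DeepStream pipeline.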