VLC error when running Python RTSP app

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 5.0.0
• JetPack Version (valid for Jetson only)
• TensorRT Version 7.1
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs) questions
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

Output of deepstream-app --version-all (listing all DeepStream component versions):
deepstream-app version 5.0.0
DeepStreamSDK 5.0.0
CUDA Driver Version: 11.0
CUDA Runtime Version: 10.2
TensorRT Version: 7.1
cuDNN Version: 8.0
libNVWarp360 Version: 2.0.1d3

when I run python3 deepstream_test1_rtsp_out.py -i '/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_720p.h264'

it prints the output below, so I assume the app is running:

python3 deepstream_test1_rtsp_out.py -i '/opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_720p.h264'
Creating Pipeline

Creating Source

Creating H264Parser

Creating Decoder

Creating H264 Encoder
Creating H264 rtppay
Playing file /opt/nvidia/deepstream/deepstream-5.0/samples/streams/sample_720p.h264
Adding elements to Pipeline

Linking elements in the Pipeline

*** DeepStream: Launched RTSP Streaming at rtsp://localhost:8554/ds-test ***

Starting pipeline

ERROR: …/nvdsinfer/nvdsinfer_func_utils.cpp:33 [TRT]: coreReadArchive.cpp (38) - Serialization Error in verifyHeader: 0 (Version tag does not match)
ERROR: …/nvdsinfer/nvdsinfer_func_utils.cpp:33 [TRT]: INVALID_STATE: std::exception
ERROR: …/nvdsinfer/nvdsinfer_func_utils.cpp:33 [TRT]: INVALID_CONFIG: Deserialize the cuda engine failed.
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:1567 Deserialize engine failed from file: /opt/nvidia/deepstream/deepstream-5.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
0:00:00.772046786 13536 0x2f36520 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1690> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed
0:00:00.772078980 13536 0x2f36520 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1797> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed, try rebuild
0:00:00.772090166 13536 0x2f36520 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 1]: Trying to create engine from model files
INFO: …/nvdsinfer/nvdsinfer_func_utils.cpp:39 [TRT]: Reading Calibration Cache for calibrator: EntropyCalibration2
INFO: …/nvdsinfer/nvdsinfer_func_utils.cpp:39 [TRT]: Generated calibration scales using calibration cache. Make sure that calibration cache has latest scales.
INFO: …/nvdsinfer/nvdsinfer_func_utils.cpp:39 [TRT]: To regenerate calibration cache, please delete the existing one. TensorRT will generate a new calibration cache.
INFO: …/nvdsinfer/nvdsinfer_func_utils.cpp:39 [TRT]: Detected 1 inputs and 2 output network tensors.
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:1495 Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-5.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine opened error
0:00:16.619379778 13536 0x2f36520 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1743> [UID = 1]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-5.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
INFO: …/nvdsinfer/nvdsinfer_model_builder.cpp:685 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40

0:00:16.623874617 13536 0x2f36520 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus: [UID 1]: Load new model:dstest1_pgie_config.txt sucessfully
Frame Number=0 Number of Objects=6 Vehicle_count=4 Person_count=2
Frame Number=1 Number of Objects=6 Vehicle_count=4 Person_count=2
Frame Number=2 Number of Objects=6 Vehicle_count=4 Person_count=2
Frame Number=3 Number of Objects=6 Vehicle_count=4 Person_count=2
Frame Number=4 Number of Objects=7 Vehicle_count=5 Person_count=2
Frame Number=5 Number of Objects=7 Vehicle_count=5 Person_count=2
Frame Number=6 Number of Objects=6 Vehicle_count=4 Person_count=2

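For reference, the "Launched RTSP Streaming at rtsp://localhost:8554/ds-test" line above is printed by the sample itself; as far as I understand, it exposes the pipeline's udpsink output through GstRtspServer roughly as in the sketch below. The UDP port and caps string here are illustrative assumptions, not values copied from the sample.

#!/usr/bin/env python3
# Minimal sketch: expose an RTP-over-UDP feed as an RTSP mount point, the way
# the DeepStream RTSP-out samples do. Port numbers and the caps string are
# assumptions for illustration; check the actual sample for the exact values.
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GstRtspServer, GLib

Gst.init(None)

RTSP_PORT = 8554   # port in rtsp://localhost:8554/ds-test
UDP_PORT = 5400    # assumed UDP port that the pipeline's udpsink sends to

server = GstRtspServer.RTSPServer.new()
server.props.service = str(RTSP_PORT)

# The factory wraps whatever RTP packets arrive on UDP_PORT and serves them
# to RTSP clients that request the /ds-test mount.
factory = GstRtspServer.RTSPMediaFactory.new()
factory.set_launch(
    '( udpsrc name=pay0 port=%d buffer-size=524288 '
    'caps="application/x-rtp, media=video, clock-rate=90000, '
    'encoding-name=H264, payload=96" )' % UDP_PORT
)
factory.set_shared(True)
server.get_mount_points().add_factory("/ds-test", factory)
server.attach(None)

print("Serving rtsp://<host-ip>:%d/ds-test" % RTSP_PORT)
GLib.MainLoop().run()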
But at the same time, when I try to open the RTSP stream in VLC at the suggested address rtsp://localhost:8554/ds-test, VLC reports "Your input can't be opened":

VLC is unable to open the MRL 'rtsp://localhost:8554/ds-test'. Check the log for details.

Does anything else need to be done to configure VLC?

Thanks!

Best regards,

The URI is correct if you run VLC on the same device as DeepStream.
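If VLC runs on a different machine, replace localhost with the IP address of the machine running the DeepStream app and make sure port 8554 is reachable. As a quick sanity check from another host, something like the following hypothetical snippet (assuming OpenCV built with FFmpeg support is installed; <deepstream-host-ip> is a placeholder) should be able to grab a frame:

# Hypothetical check from a remote machine: try to read one frame from the
# RTSP stream published by the DeepStream sample.
import cv2

cap = cv2.VideoCapture("rtsp://<deepstream-host-ip>:8554/ds-test", cv2.CAP_FFMPEG)
if not cap.isOpened():
    raise SystemExit("Cannot open stream: check the IP, that port 8554 is "
                     "reachable, and that no firewall blocks RTSP/RTP traffic")

ok, frame = cap.read()
print("Got first frame:", ok, frame.shape if ok else None)
cap.release()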