Problems running the deepstream python test apps

Hello,

I have some problems running the deepstream-test apps provided with DeepStream 5.0.
I'm quite new to this, so I think it's just a question of how to supply the h264_elementary_stream to deepstream_test_1.py.

When I use the command:
python3 deepstream_test_1.py

It returns:
usage: deepstream_test_1.py <media file or uri>

The README says <h264_elementary_stream>, so I guess RTSP or .mp4 files are not an option, as I always get errors when trying it with:
python3 deepstream_test_1.py rtsp://admin:****@192.168.0.61:554/h264Preview_01_main

Creating Pipeline
Creating Source
Creating H264Parser
Creating Decoder
Creating EGLSink
Playing file rtsp://admin:****@192.168.0.61:554/h264Preview_01_main
Adding elements to Pipeline
Linking elements in the Pipeline
Starting pipeline
Using winsys: x11
Opening in BLOCKING MODE
ERROR: Deserialize engine failed because file path: /mnt/A/DeepStream/DeepStream-5.0-ROOT/deepstream_python_apps/apps/deepstream-test1/…/…/…/…/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine open error
0:00:02.234641898 28831 0x13f7c350 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1690> [UID = 1]: deserialize engine from file :/mnt/A/DeepStream/DeepStream-5.0-ROOT/deepstream_python_apps/apps/deepstream-test1/…/…/…/…/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed
0:00:02.235811959 28831 0x13f7c350 WARN nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger: NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1797> [UID = 1]: deserialize backend context from engine from file :/mnt/A/DeepStream/DeepStream-5.0-ROOT/deepstream_python_apps/apps/deepstream-test1/…/…/…/…/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed, try rebuild
0:00:02.235916635 28831 0x13f7c350 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 1]: Trying to create engine from model files
ERROR: Cannot access prototxt file '/mnt/A/DeepStream/DeepStream-5.0-ROOT/deepstream_python_apps/apps/deepstream-test1/…/…/…/…/samples/models/Primary_Detector/resnet10.prototxt'
ERROR: failed to build network since parsing model errors.
ERROR: failed to build network.
0:00:02.236863231 28831 0x13f7c350 ERROR nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1735> [UID = 1]: build engine file failed
0:00:02.236935490 28831 0x13f7c350 ERROR nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1821> [UID = 1]: build backend context failed
0:00:02.236996740 28831 0x13f7c350 ERROR nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger: NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1148> [UID = 1]: generate backend failed, check config file settings
0:00:02.237082216 28831 0x13f7c350 WARN nvinfer gstnvinfer.cpp:809:gst_nvinfer_start: error: Failed to create NvDsInferContext instance
0:00:02.237132330 28831 0x13f7c350 WARN nvinfer gstnvinfer.cpp:809:gst_nvinfer_start: error: Config file path: dstest1_pgie_config.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Error: gst-resource-error-quark: Failed to create NvDsInferContext instance (1): /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(809): gst_nvinfer_start (): /GstPipeline:pipeline0/GstNvInfer:primary-inference:
Config file path: dstest1_pgie_config.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED

I'm not sure how to supply the H264 stream, if that is my problem.
I have a working RTSP stream, and I could also plug in a USB webcam (not tried yet) or use some media files.

Thanks for helping out.

• Hardware Platform (Jetson / GPU): Xavier NX
• DeepStream Version: 5.0
• TensorRT Version: 7.1.3.0-1+cuda10.2 arm64
• Issue Type: questions

Yes, only an H264 elementary stream is supported by the test1 sample. You could use the test3 sample, which uses uridecodebin and accepts any container format and codec that GStreamer supports.
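
For example, test1 should run against the sample elementary stream shipped with the SDK, and you can also extract a raw .h264 stream from an .mp4 with a plain GStreamer pipeline. The paths and file names below are only illustrative (they assume the samples directory sits under your DeepStream-5.0-ROOT), so adjust them to your install:

python3 deepstream_test_1.py /mnt/A/DeepStream/DeepStream-5.0-ROOT/samples/streams/sample_720p.h264
gst-launch-1.0 filesrc location=input.mp4 ! qtdemux ! h264parse ! video/x-h264,stream-format=byte-stream ! filesink location=input.h264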
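
With test3, the RTSP source from your log should work directly, since uridecodebin takes care of demuxing and decoding; it also takes file URIs (the local file path here is just a placeholder):

python3 deepstream_test_3.py rtsp://admin:****@192.168.0.61:554/h264Preview_01_main
python3 deepstream_test_3.py file:///path/to/video.mp4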
