Cannot run deepstream-test-1 in deepstream_python_apps: Where is the ../../../../samples/ folder?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): Jetson
• DeepStream Version 5.0
• JetPack Version (valid for Jetson only) 4.4
• TensorRT Version 7.1.3.0-1+cuda10.2
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs): bugs
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used and other details for reproducing.)

I have installed the following prereqs according to this link:

  • DeepStream SDK 5.0
  • Python 3.6
  • Gst Python v1.14.5

Then I’ve git cloned the deepstream_python_apps repo.

I am trying now to run deepstream-test-1 but get the following error:

marcus@marcus-desktop:~/dev/deepstream_python_apps/apps/deepstream-test1$ python3 deepstream_test_1.py ~/test.264
2020-10-01 11:59:17.399539: I tensorflow/stream_executor/platform/default/dso_loader.cc:48] Successfully opened dynamic library libcudart.so.10.2
Creating Pipeline 
 
Creating Source 
 
Creating H264Parser 

Creating Decoder 

Creating EGLSink 

Playing file /home/marcus/test.264 
Adding elements to Pipeline 

Linking elements in the Pipeline 

Starting pipeline 


Using winsys: x11 
Opening in BLOCKING MODE 
ERROR: Deserialize engine failed because file path: /home/marcus/dev/deepstream_python_apps/apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine open error
0:00:03.809189017 12874     0x10ec5430 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1690> [UID = 1]: deserialize engine from file :/home/marcus/dev/deepstream_python_apps/apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed
0:00:03.809270010 12874     0x10ec5430 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1797> [UID = 1]: deserialize backend context from engine from file :/home/marcus/dev/deepstream_python_apps/apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed, try rebuild
0:00:03.809302298 12874     0x10ec5430 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 1]: Trying to create engine from model files
ERROR: Cannot access prototxt file '/home/marcus/dev/deepstream_python_apps/apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.prototxt'
ERROR: failed to build network since parsing model errors.
ERROR: failed to build network.
0:00:03.809598940 12874     0x10ec5430 ERROR                nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1735> [UID = 1]: build engine file failed
0:00:03.809627164 12874     0x10ec5430 ERROR                nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1821> [UID = 1]: build backend context failed
0:00:03.809654460 12874     0x10ec5430 ERROR                nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1148> [UID = 1]: generate backend failed, check config file settings
0:00:03.809719708 12874     0x10ec5430 WARN                 nvinfer gstnvinfer.cpp:809:gst_nvinfer_start:<primary-inference> error: Failed to create NvDsInferContext instance
0:00:03.810303008 12874     0x10ec5430 WARN                 nvinfer gstnvinfer.cpp:809:gst_nvinfer_start:<primary-inference> error: Config file path: dstest1_pgie_config.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Error: gst-resource-error-quark: Failed to create NvDsInferContext instance (1): /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(809): gst_nvinfer_start (): /GstPipeline:pipeline0/GstNvInfer:primary-inference:
Config file path: dstest1_pgie_config.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED

I think the reason is that there is no samples/ folder in ../../../../, which resolves to my home directory. But I don’t see any link to download the samples folder.

It should be in /opt/nvidia/deepstream/deepstream/.

Thanks a lot. Should I be editing the config file then?

EDIT: I had a look at the config file and at /opt/nvidia/deepstream/deepstream, and for some reason there is no file called resnet10.caffemodel_b1_gpu0_int8.engine.

This is bad because the config file requires it:

model-engine-file=/opt/nvidia/deepstream/deepstream/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
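
Something like this shows what is actually there (the path assumes the default DeepStream 5.0 install location on Jetson):

# List what ships under the Primary_Detector model directory; the
# *_b1_gpu0_int8.engine file is apparently built by nvinfer at runtime,
# so it only appears after a successful run has serialized it.
ls -l /opt/nvidia/deepstream/deepstream/samples/models/Primary_Detector/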

I checked this link, where apparently it’s a bug with realpath. So I fed it ../../../../../opt/nvidia... and that seemed to work, but I still get the following output:

Opening in BLOCKING MODE 
ERROR: Deserialize engine failed because file path: /home/marcus/deepstream_python_apps/apps/deepstream-test1/../../../../../opt/nvidia/deepstream/deepstream/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine open error
0:00:01.475607739 14394     0x29f84840 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1690> [UID = 1]: deserialize engine from file :/home/marcus/deepstream_python_apps/apps/deepstream-test1/../../../../../opt/nvidia/deepstream/deepstream/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed
0:00:01.475682747 14394     0x29f84840 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1797> [UID = 1]: deserialize backend context from engine from file :/home/marcus/deepstream_python_apps/apps/deepstream-test1/../../../../../opt/nvidia/deepstream/deepstream/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed, try rebuild
0:00:01.475718972 14394     0x29f84840 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 1]: Trying to create engine from model files
INFO: [TRT]: Reading Calibration Cache for calibrator: EntropyCalibration2
INFO: [TRT]: Generated calibration scales using calibration cache. Make sure that calibration cache has latest scales.
INFO: [TRT]: To regenerate calibration cache, please delete the existing one. TensorRT will generate a new calibration cache.
INFO: [TRT]: 
INFO: [TRT]: --------------- Layers running on DLA: 
INFO: [TRT]: 
INFO: [TRT]: --------------- Layers running on GPU: 
INFO: [TRT]: conv1 + activation_1/Relu, block_1a_conv_1 + activation_2/Relu, block_1a_conv_2, block_1a_conv_shortcut + add_1 + activation_3/Relu, block_2a_conv_1 + activation_4/Relu, block_2a_conv_2, block_2a_conv_shortcut + add_2 + activation_5/Relu, block_3a_conv_1 + activation_6/Relu, block_3a_conv_2, block_3a_conv_shortcut + add_3 + activation_7/Relu, block_4a_conv_1 + activation_8/Relu, block_4a_conv_2, block_4a_conv_shortcut + add_4 + activation_9/Relu, conv2d_cov, conv2d_cov/Sigmoid, conv2d_bbox, 
INFO: [TRT]: Detected 1 inputs and 2 output network tensors.
ERROR: Serialize engine failed because of file path: /opt/nvidia/deepstream/deepstream-5.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine opened error
0:00:21.523116420 14394     0x29f84840 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1743> [UID = 1]: failed to serialize cude engine to file: /opt/nvidia/deepstream/deepstream-5.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
INFO: [Implicit Engine Info]: layers num: 3
0   INPUT  kFLOAT input_1         3x368x640       
1   OUTPUT kFLOAT conv2d_bbox     16x23x40        
2   OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40         

0:00:21.542280137 14394     0x29f84840 INFO                 nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary-inference> [UID 1]: Load new model:dstest1_pgie_config.txt sucessfully
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
NVMEDIA: NVMEDIABufferProcessing: 1504: NvMediaParserParse Unsupported Codec 

  1. According to “Running Sample Applications” in your link, you should have cloned the git repo inside the sources directory. The sample config paths are relative, so they assume everything is located there. Cloning it into the suggested path should work right away, without needing to check and change every relative path.

  2. If you insist on running it elsewhere, check the logs:

ERROR: Deserialize engine failed because file path: /home/marcus/dev/deepstream_python_apps/apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine open error
0:00:03.809189017 12874     0x10ec5430 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1690> [UID = 1]: deserialize engine from file :/home/marcus/dev/deepstream_python_apps/apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed
0:00:03.809270010 12874     0x10ec5430 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1797> [UID = 1]: deserialize backend context from engine from file :/home/marcus/dev/deepstream_python_apps/apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed, try rebuild
0:00:03.809302298 12874     0x10ec5430 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary-inference> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 1]: Trying to create engine from model files

Here it says it was not able to find the “serialized” model-engine-file, so it will try to build it from the raw model. For this it needs the proto-file, model-file, etc. (and the int8-calib-file if network-mode is set to int8).

ERROR: Cannot access prototxt file '/home/marcus/dev/deepstream_python_apps/apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.prototxt'
ERROR: failed to build network since parsing model errors.
ERROR: failed to build network

Here it says it could not find the proto-file.
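
As a quick sketch (the key names below are the usual nvinfer file-path entries; check what your dstest1_pgie_config.txt actually contains), you can list every path the config will try to open:

# Print the file-path entries from the nvinfer config so each one
# can be checked against the filesystem.
grep -nE '^(model-file|proto-file|model-engine-file|labelfile-path|int8-calib-file)=' dstest1_pgie_config.txt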

In summary, make sure to check every relative path in the config file you’re using. To do this, you can cd into the folder where you’re running the app and check whether the files are reachable.
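
For example (assuming the repo is still under ~/dev as in the logs above), something like this will tell you whether the relative paths actually resolve:

cd ~/dev/deepstream_python_apps/apps/deepstream-test1
# realpath -e only prints the resolved path if every component exists;
# otherwise it reports an error for that file
realpath -e ../../../../samples/models/Primary_Detector/resnet10.prototxt
realpath -e ../../../../samples/models/Primary_Detector/resnet10.caffemodel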

But I strongly suggest you clone it into the intended location:

git clone https://github.com/NVIDIA-AI-IOT/deepstream_python_apps /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps

(you’ll probably need sudo for that)

then cd:

cd /opt/nvidia/deepstream/deepstream/sources/deepstream_python_apps/apps/deepstream-test1

then run your test:

python3 deepstream_test_1.py ~/test.264