DeepStream on RTX 3000 series cards

• Hardware Platform: GPU RTX 3070, Ryzen 5 3600X, 16 GB RAM
• OS: Ubuntu 20.04
• DeepStream Version: nvcr.io/nvidia/deepstream:5.0-20.08-devel-a100
• TensorRT Version: 7.0.0
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type: questions

Hello, I’m wondering whether anyone has managed to run DeepStream 5.0 on an RTX 3070.

I tried nvcr.io/nvidia/deepstream:5.0-20.08-devel-a100 because I thought the A100 has the same architecture as the RTX 3070 (Ampere). Here is the error I got; it seems the bundled TensorRT does not support the current SM, 86 (i.e. compute capability 8.6).

root@adddf63f3acf:/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test1# ./deepstream-test1-app ../../../../samples/streams/sample_1080p_h264.mp4
Now playing: ../../../../samples/streams/sample_1080p_h264.mp4
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:1523 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine open error
0:00:00.365906486   113 0x55e070ee3260 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1690> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed
0:00:00.365933236   113 0x55e070ee3260 WARN                 nvinfer gstnvinfer.cpp:616:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1797> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test1/../../../../samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed, try rebuild
0:00:00.365941486   113 0x55e070ee3260 INFO                 nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1715> [UID = 1]: Trying to create engine from model files
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1277 INT8 not supported by platform. Trying FP16 mode.
WARNING: ../nvdsinfer/nvdsinfer_model_builder.cpp:1291 FP16 not supported by platform. Using FP32 mode.
ERROR: ../nvdsinfer/nvdsinfer_func_utils.cpp:33 [TRT]: ../rtSafe/cuda/caskUtils.cpp (98) - Assertion Error in trtSmToCask: 0 (Unsupported SM.)
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:1186 Build engine failed from config file
ERROR: ../nvdsinfer/nvdsinfer_model_builder.cpp:884 failed to build trt engine.
0:00:01.083725847   113 0x55e070ee3260 ERROR                nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1735> [UID = 1]: build engine file failed
0:00:01.083865436   113 0x55e070ee3260 ERROR                nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1821> [UID = 1]: build backend context failed
0:00:01.083873126   113 0x55e070ee3260 ERROR                nvinfer gstnvinfer.cpp:613:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1148> [UID = 1]: generate backend failed, check config file settings
0:00:01.083905645   113 0x55e070ee3260 WARN                 nvinfer gstnvinfer.cpp:809:gst_nvinfer_start:<primary-nvinference-engine> error: Failed to create NvDsInferContext instance
0:00:01.083910035   113 0x55e070ee3260 WARN                 nvinfer gstnvinfer.cpp:809:gst_nvinfer_start:<primary-nvinference-engine> error: Config file path: dstest1_pgie_config.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Running...
ERROR from element primary-nvinference-engine: Failed to create NvDsInferContext instance
Error details: gstnvinfer.cpp(809): gst_nvinfer_start (): /GstPipeline:dstest1-pipeline/GstNvInfer:primary-nvinference-engine:
Config file path: dstest1_pgie_config.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Returned, stopping playback
Deleting pipeline
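
For context, the "Unsupported SM" assertion refers to the GPU's compute capability: the RTX 3070 is SM 8.6, which the TensorRT build inside this 5.0 container cannot generate engines for. Below is a minimal sketch for confirming what the card reports, assuming nvcc is available inside the devel image (this check is not part of the DeepStream samples):

cat > sm_check.cu <<'EOF'
#include <cstdio>
#include <cuda_runtime.h>
int main() {
    int n = 0;
    if (cudaGetDeviceCount(&n) != cudaSuccess) return 1;  // no visible CUDA device
    for (int i = 0; i < n; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);   // fills device name and compute capability
        printf("GPU %d: %s, SM %d.%d\n", i, prop.name, prop.major, prop.minor);
    }
    return 0;
}
EOF
nvcc sm_check.cu -o sm_check && ./sm_check   # an RTX 3070 should print SM 8.6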

Thank you and best regards,


This seems to be a TensorRT issue. Can you run any TensorRT samples on the RTX 3070?
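
One quick check is to build the same Caffe model with trtexec directly, outside of DeepStream. This is a sketch: the trtexec path assumes the standard TensorRT package layout, and the output blob names are taken from the sample pgie config, so adjust them if yours differ. Inside the 5.0-20.08-devel-a100 container it would be expected to hit the same "Unsupported SM" assertion:

cd /opt/nvidia/deepstream/deepstream-5.0/samples/models/Primary_Detector
/usr/src/tensorrt/bin/trtexec \
    --deploy=resnet10.prototxt \
    --model=resnet10.caffemodel \
    --output=conv2d_bbox \
    --output=conv2d_cov/Sigmoid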

The issue is gone with DeepStream 5.1.
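
For anyone else landing here: DeepStream 5.1 moved to a newer TensorRT (7.2.x) that supports Ampere GeForce GPUs (SM 8.6). A sketch of switching to the 5.1 container follows; the image tag 5.1-21.02-devel is an assumption, so verify it on NGC before pulling:

docker pull nvcr.io/nvidia/deepstream:5.1-21.02-devel
docker run --gpus all -it --rm \
    -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=$DISPLAY \
    nvcr.io/nvidia/deepstream:5.1-21.02-devel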

Thank you