Testing deepstream-test3-app with peoplenet model

The original dstest3_pgie_config.txt runs the sample with the resnet10 Caffe model, but I changed it to test with the PeopleNet model downloaded from TLT.

The config file was changed as follows.
[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-file=../../../../samples/models/tlt_peoplenet/resnet34_peoplenet_pruned.etlt
#proto-file=../../../../samples/models/tlt_peoplenet/resnet10.prototxt
model-engine-file=../../../../samples/models/tlt_peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_fp16.engine
labelfile-path=../../../../samples/models/tlt_peoplenet/labels.txt
int8-calib-file=../../../../samples/models/tlt_peoplenet/cal_trt.bin
force-implicit-batch-dim=1
batch-size=1
process-mode=1
model-color-format=0
network-mode=1
num-detected-classes=4
interval=0
gie-unique-id=1
output-blob-names=conv2d_bbox;conv2d_cov/Sigmoid

I get the following error when running the app.

Using winsys: x11 
ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test3/../../../../samples/models/tlt_peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_fp16.engine open error
0:00:01.451818911 25100 0x55858cec00 WARN                 nvinfer gstnvinfer.cpp:599:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1566> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test3/../../../../samples/models/tlt_peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_fp16.engine failed
0:00:01.452338103 25100 0x55858cec00 WARN                 nvinfer gstnvinfer.cpp:599:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1673> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test3/../../../../samples/models/tlt_peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_fp16.engine failed, try rebuild
0:00:01.452661734 25100 0x55858cec00 INFO                 nvinfer gstnvinfer.cpp:602:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1591> [UID = 1]: Trying to create engine from model files
ERROR: failed to build network since there is no model file matched.
ERROR: failed to build network.
0:00:01.453307844 25100 0x55858cec00 ERROR                nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1611> [UID = 1]: build engine file failed
0:00:01.453623315 25100 0x55858cec00 ERROR                nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1697> [UID = 1]: build backend context failed
0:00:01.453849149 25100 0x55858cec00 ERROR                nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1024> [UID = 1]: generate backend failed, check config file settings
0:00:01.454093864 25100 0x55858cec00 WARN                 nvinfer gstnvinfer.cpp:781:gst_nvinfer_start:<primary-nvinference-engine> error: Failed to create NvDsInferContext instance
0:00:01.454355573 25100 0x55858cec00 WARN                 nvinfer gstnvinfer.cpp:781:gst_nvinfer_start:<primary-nvinference-engine> error: Config file path: dstest3_pgie_config.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Running...
ERROR from element primary-nvinference-engine: Failed to create NvDsInferContext instance
Error details: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(781): gst_nvinfer_start (): /GstPipeline:dstest3-pipeline/GstNvInfer:primary-nvinference-engine:
Config file path: dstest3_pgie_config.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Returned, stopping playback
Deleting pipeline

Why can't it create the engine?

I then changed model-file to tlt-encoded-model as follows.
[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
tlt-encoded-model=../../../../samples/models/tlt_peoplenet/resnet34_peoplenet_pruned.etlt
model-engine-file=../../../../samples/models/tlt_peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_fp16.engine
labelfile-path=../../../../samples/models/tlt_peoplenet/labels.txt
int8-calib-file=../../../../samples/models/tlt_peoplenet/cal_trt.bin
force-implicit-batch-dim=1
batch-size=1
process-mode=1
model-color-format=0
network-mode=1
num-detected-classes=4
interval=0
gie-unique-id=1
output-blob-names=conv2d_bbox;conv2d_cov/Sigmoid
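For an .etlt model, nvinfer also needs some TLT-specific keys that this config is still missing; the "Uff input blob name is empty" error later in the thread points the same way. The values below (the tlt_encode model key, blob names, and input dims) are the ones NVIDIA's published PeopleNet sample configs use, so treat them as a sketch and verify them against the model card for the version you downloaded. Note also that the output-blob-names line above (conv2d_bbox;conv2d_cov/Sigmoid) comes from the resnet10 Caffe model and would need to be replaced for PeopleNet.

tlt-model-key=tlt_encode
uff-input-blob-name=input_1
uff-input-dims=3;544;960;0
output-blob-names=output_bbox/BiasAdd;output_cov/Sigmoid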

But I still get an error.

Running from the bin folder:

xaiver@xaiver-desktop:/opt/nvidia/deepstream/deepstream/bin$ ./deepstream-test3-app file:///opt/nvidia/deepstream/deepstream-5.0/samples/streams/S20704.mp4
Failed to load config file: No such file or directory
** ERROR: <gst_nvinfer_parse_config_file:1158>: failed
Now playing: file:///opt/nvidia/deepstream/deepstream-5.0/samples/streams/S20704.mp4,

Using winsys: x11 
0:00:00.195799095 23383     0x1fb07ed0 WARN                 nvinfer gstnvinfer.cpp:747:gst_nvinfer_start:<primary-nvinference-engine> error: Configuration file parsing failed
0:00:00.195854362 23383     0x1fb07ed0 WARN                 nvinfer gstnvinfer.cpp:747:gst_nvinfer_start:<primary-nvinference-engine> error: Config file path: dstest3_pgie_config.txt
Running...
ERROR from element primary-nvinference-engine: Configuration file parsing failed
Error details: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(747): gst_nvinfer_start (): /GstPipeline:dstest3-pipeline/GstNvInfer:primary-nvinference-engine:
Config file path: dstest3_pgie_config.txt
Returned, stopping playback
Deleting pipeline
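As far as I can tell, deepstream-test3-app opens dstest3_pgie_config.txt relative to the current working directory (the relative filename is hard-coded in the sample source), so running from bin/ cannot find it. Either copy the config next to the binary, or run from the app's own folder:

cd /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test3
./deepstream-test3-app file:///opt/nvidia/deepstream/deepstream-5.0/samples/streams/S20704.mp4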

Running from the sources/apps/sample_apps/deepstream-test3 folder:

xaiver@xaiver-desktop:/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-test3$ ./deepstream-test3-app file:///opt/nvidia/deepstream/deepstream-5.0/samples/streams/S20704.mp4
Now playing: file:///opt/nvidia/deepstream/deepstream-5.0/samples/streams/S20704.mp4,

Using winsys: x11 
ERROR: Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test3/../../../../samples/models/tlt_peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_fp16.engine open error
0:00:01.312286203 23598   0x5595070ed0 WARN                 nvinfer gstnvinfer.cpp:599:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1566> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test3/../../../../samples/models/tlt_peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_fp16.engine failed
0:00:01.312525478 23598   0x5595070ed0 WARN                 nvinfer gstnvinfer.cpp:599:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1673> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-test3/../../../../samples/models/tlt_peoplenet/resnet34_peoplenet_pruned.etlt_b1_gpu0_fp16.engine failed, try rebuild
0:00:01.312622667 23598   0x5595070ed0 INFO                 nvinfer gstnvinfer.cpp:602:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1591> [UID = 1]: Trying to create engine from model files
WARNING: INT8 calibration file not specified/accessible. INT8 calibration can be done through setDynamicRange API in 'NvDsInferCreateNetwork' implementation
ERROR: Uff input blob name is empty
ERROR: Failed to create network using custom network creation function
ERROR: Failed to get cuda engine from custom library API
0:00:01.313467251 23598   0x5595070ed0 ERROR                nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1611> [UID = 1]: build engine file failed
0:00:01.313555127 23598   0x5595070ed0 ERROR                nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1697> [UID = 1]: build backend context failed
0:00:01.313706591 23598   0x5595070ed0 ERROR                nvinfer gstnvinfer.cpp:596:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::initialize() <nvdsinfer_context_impl.cpp:1024> [UID = 1]: generate backend failed, check config file settings
0:00:01.313812100 23598   0x5595070ed0 WARN                 nvinfer gstnvinfer.cpp:781:gst_nvinfer_start:<primary-nvinference-engine> error: Failed to create NvDsInferContext instance
0:00:01.313939498 23598   0x5595070ed0 WARN                 nvinfer gstnvinfer.cpp:781:gst_nvinfer_start:<primary-nvinference-engine> error: Config file path: dstest3_pgie_config.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Running...
ERROR from element primary-nvinference-engine: Failed to create NvDsInferContext instance
Error details: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(781): gst_nvinfer_start (): /GstPipeline:dstest3-pipeline/GstNvInfer:primary-nvinference-engine:
Config file path: dstest3_pgie_config.txt, NvDsInfer Error: NVDSINFER_CONFIG_FAILED
Returned, stopping playback
Deleting pipeline

Hi @
Where did you get the calibration file? In my opinion, that calibration file isn't for the resnet34-peoplenet model, because the unpruned zip file for this model doesn't contain it; there are only .txt, .tlt, and .etlt files.

The calibration file is created by TensorRT, and you need to provide images for calibration. I'm not sure about the settings in DeepStream; I did it in TensorRT.

@edit_or
Will that file be generated during the tlt-converter step? If so, that tool only generates the .engine file and accepts a calibration file as input, right?

Yes, in my TensorRT experience the file is generated during calibration, at the path you define.
That is only for INT8; FP16 doesn't need calibration. For INT8 you need to provide calibration images.
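One more thing worth checking in the configs posted above: in nvinfer, network-mode selects the inference precision (0 = FP32, 1 = INT8, 2 = FP16, if I read the plugin docs correctly). Those configs set network-mode=1 (INT8) while pointing at an _fp16.engine file. For FP16 no calibration file is needed at all, so the relevant lines would be roughly:

network-mode=2
# int8-calib-file is only needed when network-mode=1 (INT8)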

I've calibrated images for INT8, but I've run into a problem, as shown below:

Here is my config: