Back-to-back detector, TensorRT cache

Hi,

The back-to-back-detectors sample creates the TensorRT engine cache every time we restart the application instead of reading the existing cache file.

Could you provide sample code?

0:00:00.764166565 12586   0x5599b47890 WARN                 nvinfer gstnvinfer.cpp:515:gst_nvinfer_logger:<primary-nvinference-engine3> NvDsInferContext[UID 3]:useEngineFile(): Failed to read from model engine file
0:00:00.764275947 12586   0x5599b47890 INFO                 nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger:<primary-nvinference-engine3> NvDsInferContext[UID 3]:initialize(): Trying to create engine from model files

Hi,

Please modify the config with a valid engine cache file path:
https://github.com/NVIDIA-AI-IOT/deepstream_reference_apps/blob/master/back-to-back-detectors/primary_detector_config.txt#L60

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-file=../../../../samples/models/Primary_Detector/resnet10.caffemodel
proto-file=../../../../samples/models/Primary_Detector/resnet10.prototxt
model-engine-file=../../../../samples/models/Primary_Detector/resnet10.caffemodel_b8_fp16.engine
... 
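One point worth noting: nvinfer resolves relative paths in the config against the config file's directory, so the `model-engine-file` entry must point to an engine file that exists from that location. A minimal sketch of a pre-flight check (the config filename and `grep`-based parsing are assumptions for illustration):

```shell
#!/bin/sh
# Sketch: verify the engine cache referenced by the nvinfer config exists
# before launching the pipeline, so we know whether TensorRT will rebuild it.
CONFIG=primary_detector_config.txt   # assumed config location
# Pull the engine path out of the [property] section of the config.
ENGINE=$(grep '^model-engine-file=' "$CONFIG" 2>/dev/null | cut -d= -f2)
# Relative paths in the config are resolved from the config's directory.
CFG_DIR=$(dirname "$CONFIG")
if [ -n "$ENGINE" ] && [ -f "$CFG_DIR/$ENGINE" ]; then
  echo "engine cache found: $CFG_DIR/$ENGINE"
else
  echo "engine cache missing - nvinfer will rebuild it at startup"
fi
```

If the file is missing (or the path is wrong), nvinfer logs the "Failed to read from model engine file" warning shown above and rebuilds the engine from the model files on every start.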

Thanks.

Ah, thanks, it was an issue with the path.