Engine file location

I’m running the following pipeline:
gst-launch-1.0 v4l2src ! capsfilter caps=video/x-h264,width=(int)1920,height=(int)1080,framerate=30/1 ! h264parse ! nvv4l2decoder ! nvstreammux0.sink_0 nvstreammux name=nvstreammux0 live-source=true width=1920 batch-size=1 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream-6.1/samples/configs/deepstream-app/config_infer_primary.txt ! nvvideoconvert ! nvdsosd display-clock=true ! nvvideoconvert ! videoconvert ! nvv4l2h264enc ! h264parse ! mpegtsmux ! hlssink playlist-root=http://10.0.0.2:8080/loopback target-duration=1 max-files=30 location=/tmp/video/hls/%05d.ts playlist-location=/tmp/video/hls/loopback.m3u8

It works correctly; however, it regenerates the engine file on every run, which costs me a lot of time each time.
From what I'm reading on the forum, the compiled engine file is supposed to be cached somewhere, but I could not find it.
I checked the run directory and the path that the nvinfer config points to, but still found nothing. Am I doing something wrong?
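(Side note for anyone searching later: the serialized engine is an ordinary file, by default named `<model>_b<batch>_gpu<id>_<precision>.engine` next to the model, so `find` can confirm whether one was actually written. The sketch below uses a scratch directory in /tmp as a stand-in for the real DeepStream install tree, since install paths may differ.)

```shell
# Demo: TensorRT engines are plain files named <model>_b<batch>_gpu<id>_<precision>.engine.
# A scratch directory in /tmp stands in for the DeepStream install tree here.
mkdir -p /tmp/ds-demo/models/Primary_Detector
touch /tmp/ds-demo/models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine

# Search the tree for any serialized engine file:
find /tmp/ds-demo -name '*.engine'
```

On a real system you would point `find` at `/opt/nvidia/deepstream` (or wherever DeepStream is installed) instead of the scratch directory.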

The config file:

################################################################################
# Copyright (c) 2018-2021, NVIDIA CORPORATION. All rights reserved.
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.  IN NO EVENT SHALL
# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
################################################################################

# Following properties are mandatory when engine files are not specified:
#   int8-calib-file(Only in INT8)
#   Caffemodel mandatory properties: model-file, proto-file, output-blob-names
#   UFF: uff-file, input-dims, uff-input-blob-name, output-blob-names
#   ONNX: onnx-file
#
# Mandatory properties for detectors:
#   num-detected-classes
#
# Optional properties for detectors:
#   cluster-mode(Default=Group Rectangles), interval(Primary mode only, Default=0)
#   custom-lib-path,
#   parse-bbox-func-name
#
# Mandatory properties for classifiers:
#   classifier-threshold, is-classifier
#
# Optional properties for classifiers:
#   classifier-async-mode(Secondary mode only, Default=false)
#
# Optional properties in secondary mode:
#   operate-on-gie-id(Default=0), operate-on-class-ids(Defaults to all classes),
#   input-object-min-width, input-object-min-height, input-object-max-width,
#   input-object-max-height
#
# Following properties are always recommended:
#   batch-size(Default=1)
#
# Other optional properties:
#   net-scale-factor(Default=1), network-mode(Default=0 i.e FP32),
#   model-color-format(Default=0 i.e. RGB) model-engine-file, labelfile-path,
#   mean-file, gie-unique-id(Default=0), offsets, process-mode (Default=1 i.e. primary),
#   custom-lib-path, network-mode(Default=0 i.e FP32)
#
# The values in the config file are overridden by values set through GObject
# properties.

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-file=../../models/Primary_Detector/resnet10.caffemodel
proto-file=../../models/Primary_Detector/resnet10.prototxt
#model-engine-file=../../models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine
model-engine-file=/opt/nvidia/deepstream/deepstream/samples/models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine
labelfile-path=../../models/Primary_Detector/labels.txt
int8-calib-file=../../models/Primary_Detector/cal_trt.bin
batch-size=30
process-mode=1
model-color-format=0
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=1
num-detected-classes=4
interval=0
gie-unique-id=1
output-blob-names=conv2d_bbox;conv2d_cov/Sigmoid
force-implicit-batch-dim=1
#parse-bbox-func-name=NvDsInferParseCustomResnet
#custom-lib-path=/path/to/libnvdsparsebbox.so
## 1=DBSCAN, 2=NMS, 3= DBSCAN+NMS Hybrid, 4 = None(No clustering)
cluster-mode=2
#scaling-filter=0
#scaling-compute-hw=0

#Use the config params below for dbscan clustering mode
#[class-attrs-all]
#detected-min-w=4
#detected-min-h=4
#minBoxes=3

#Use the config params below for NMS clustering mode
[class-attrs-all]
topk=20
nms-iou-threshold=0.5
pre-cluster-threshold=0.2

## Per class configurations
[class-attrs-0]
topk=20
nms-iou-threshold=0.5
pre-cluster-threshold=0.4

#[class-attrs-1]
#pre-cluster-threshold=0.05
#eps=0.7
#dbscan-min-score=0.5

#[class-attrs-2]
#pre-cluster-threshold=0.1
#eps=0.6
#dbscan-min-score=0.95

#[class-attrs-3]
#pre-cluster-threshold=0.05
#eps=0.7
#dbscan-min-score=0.5

Hi,
Writing under /opt/nvidia requires root permission. Please run with sudo the first time, and the log will show the path of the saved engine file, like:

0:00:03.796290048  9610   0x55c98c6460 WARN                 nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<nvinfer0> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1889> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.1/samples/configs/deepstream-app/../../models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine failed
0:00:03.830752928  9610   0x55c98c6460 WARN                 nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<nvinfer0> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1996> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.1/samples/configs/deepstream-app/../../models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine failed, try rebuild
0:00:03.830944096  9610   0x55c98c6460 INFO                 nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<nvinfer0> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1914> [UID = 1]: Trying to create engine from model files
0:03:49.651701888  9610   0x55c98c6460 INFO                 nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<nvinfer0> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1947> [UID = 1]: serialize cuda engine to file: /opt/nvidia/deepstream/deepstream-6.1/samples/models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine successfully
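A related note: an engine written during a sudo run is owned by root, so a later non-root run may not be able to read it. A one-time permission change on the engine file avoids needing sudo again. The sketch below demonstrates the chmod on a throwaway file in /tmp; for the real engine you would run `sudo chmod a+r` on the path shown in the log.

```shell
# Stand-in for a root-owned engine file.
# Real fix would be: sudo chmod a+r <path-to-engine-from-log>
f=/tmp/demo.engine
touch "$f"
chmod 600 "$f"   # simulate restrictive permissions left by the sudo run
chmod a+r "$f"   # make the engine readable by every user
ls -l "$f"       # mode should now show read for group and others (644)
```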

If I change the engine path, will the new path be used to write the engine file?
That way I could use a path that does not require sudo.

Sorry for the late response. Is this still an issue that needs support? Thanks

In truth, I'm still having issues.
I changed the path to one in my user folder, but still nothing.
However, if I run it with sudo, the engine file is generated, but it is not readable unless I relaunch the pipeline a second time with sudo.

Hi,
On the second run, it should work without sudo. It is a bit strange that you still hit the permission issue. You may try changing these paths:

model-file=../../models/Primary_Detector/resnet10.caffemodel
proto-file=../../models/Primary_Detector/resnet10.prototxt
model-engine-file=/opt/nvidia/deepstream/deepstream/samples/models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine
labelfile-path=../../models/Primary_Detector/labels.txt
int8-calib-file=../../models/Primary_Detector/cal_trt.bin

to absolute paths under /tmp/ and see if it works without sudo. /tmp/ should be a folder without many restrictions. Please give it a try for testing purposes.
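As a concrete example, the relevant `[property]` entries would then look like this (paths are illustrative; this assumes the model files were first copied into /tmp/Primary_Detector, and the engine will be regenerated there on the first run):

```ini
# Assumes: cp -r /opt/nvidia/deepstream/deepstream/samples/models/Primary_Detector /tmp/
model-file=/tmp/Primary_Detector/resnet10.caffemodel
proto-file=/tmp/Primary_Detector/resnet10.prototxt
model-engine-file=/tmp/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine
labelfile-path=/tmp/Primary_Detector/labels.txt
int8-calib-file=/tmp/Primary_Detector/cal_trt.bin
```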

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.