• Hardware Platform: GPU (RTX 2080)
• DeepStream Version: 5.0
• TensorRT Version: 7.0.0.11
• NVIDIA GPU Driver Version: 440.59
I'm facing a problem with the YOLO model engine serialization path. I'm using the nvcr.io/nvidia/deepstream:5.0-dp-20.04-base Docker image. I have the following folder structure:
/opt/app/
|______ 20200428-000000.mp4
|______ src/
|______ models/yolov3-tiny.cfg, yolov3-tiny.weights
|______ configs/config.txt
I want to run my application from the src folder and save the .engine file to the models/ folder. config.txt contains the following:
net-scale-factor=0.0039215697906911373
model-file=../models/yolov3-tiny.weights
custom-network-config=../models/yolov3-tiny.cfg
model-engine-file=../models/model_b1_gpu0_fp32.engine
batch-size=1
process-mode=1
# 0=RGB, 1=BGR
model-color-format=0
## 0=FP32, 1=INT8, 2=FP16 mode
network-mode=0
num-detected-classes=80
gie-unique-id=1
network-type=0
## 0=Group Rectangles, 1=DBSCAN, 2=NMS, 3=DBSCAN+NMS Hybrid, 4=None(No clustering)
cluster-mode=2
maintain-aspect-ratio=1
parse-bbox-func-name=NvDsInferParseCustomYoloV3Tiny
custom-lib-path=../libs/yolo/libnvdsinfer_custom_impl_Yolo.so
engine-create-func-name=NvDsInferYoloCudaEngineGet
#scaling-filter=0
#scaling-compute-hw=0
[class-attrs-all]
nms-iou-threshold=1.
threshold=1.1
## Person class configuration
[class-attrs-0]
nms-iou-threshold=0.3
#threshold=0.7
I'm running the following command from the src folder:
gst-launch-1.0 uridecodebin uri=file:///opt/app/20200428-000000.mp4 ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! queue ! nvinfer config-file-path=/opt/app/configs/config.txt ! fakesink
but model_b1_gpu0_fp32.engine is being saved to the src folder, not to the models folder as I specified in the config. Could you advise what I'm doing wrong, or is this a bug?
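To make the mismatch concrete, here is a small path-resolution sketch using the paths from my setup above. My assumption (which may be wrong, and is part of the question) is that relative paths inside the nvinfer config resolve against the config file's directory, while the serialized engine is being written as a bare filename into the process's current working directory:

```python
import os

# Assumption: relative paths in the nvinfer config resolve against the
# directory containing config.txt, so model-engine-file should point here:
config_dir = "/opt/app/configs"
expected = os.path.normpath(
    os.path.join(config_dir, "../models/model_b1_gpu0_fp32.engine"))
print(expected)  # /opt/app/models/model_b1_gpu0_fp32.engine

# But the engine actually shows up as a bare filename in the directory
# I launch gst-launch-1.0 from (the process's current working directory):
cwd = "/opt/app/src"
actual = os.path.normpath(os.path.join(cwd, "model_b1_gpu0_fp32.engine"))
print(actual)  # /opt/app/src/model_b1_gpu0_fp32.engine
```

So the path I configured and the path the engine lands at differ only in which base directory is used for resolution.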