The FP16 engine cannot be generated — "Segmentation fault (core dumped)" occurs, while FP32 works fine

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): GPU 2080 Ti, Ubuntu 18.04
• DeepStream Version: DeepStream 6.0.1
• JetPack Version (valid for Jetson only): N/A
• TensorRT Version: TensorRT 8.0.1
• NVIDIA GPU Driver Version (valid for GPU only): 470.63.01
• Issue Type (questions, new requirements, bugs): questions

I am using DeepStream 6.0.1 to run a YOLOv5 5.0 model. I can generate the FP32 engine file normally and inference works fine, but when I switch to FP16 an error occurs.

The configuration file is as follows:

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-color-format=0
custom-network-config=yolov5x.cfg
model-file=yolov5x.wts
#model-engine-file=model_b10_gpu0_fp32.engine
#int8-calib-file=calib.table
labelfile-path=labels.txt
batch-size=10
network-mode=0
num-detected-classes=1
interval=0
gie-unique-id=1
process-mode=1
network-type=2
cluster-mode=2
maintain-aspect-ratio=1
symmetric-padding=1
parse-bbox-func-name=NvDsInferParseYolo
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so
engine-create-func-name=NvDsInferYoloCudaEngineGet

[class-attrs-all]
nms-iou-threshold=0.45
pre-cluster-threshold=0.65
topk=300
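
For reference (to the best of my understanding of the Gst-nvinfer plugin documentation), the two similarly named keys in [property] control different things:

# network-mode selects the inference precision:
#   0 = FP32, 1 = INT8, 2 = FP16
# network-type selects the kind of network / post-processing:
#   0 = Detector, 1 = Classifier, 2 = Segmentation, 3 = Instance Segmentation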

The running log is as follows:

deepstream-app -c deepstream_app_config.txt
0:00:00.376821032 39843 0x55a9b6906e60 INFO                 nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1914> [UID = 1]: Trying to create engine from model files
Segmentation fault (core dumped)

But when I simply changed

network-type=2    # fp16

to

network-type=0  #fp32

the engine file is generated normally and inference runs as expected.


I have solved this problem.

It is network-mode that should be changed, not network-type.
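
For anyone who hits the same issue, here is a minimal sketch of the fix, assuming the same yolov5x configuration posted above (the FP16 engine file name below is only the typical naming convention and may differ on your setup): the precision is switched with network-mode, while network-type stays 0 (Detector) for this YOLO model.

[property]
# precision: 0 = FP32, 1 = INT8, 2 = FP16
network-mode=2
# network role, not precision: 0 = Detector
network-type=0
# Optional: reuse the serialized engine on later runs (assumed file name)
#model-engine-file=model_b10_gpu0_fp16.engine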
