Segmentation fault occurs at deserializeCudaEngine

OS: Ubuntu 16.04
GPU type: 1080 Ti
NVIDIA driver version: 390.59
CUDA version: 9
cuDNN version: 7.0.5
TensorRT version: TensorRT 5 GA

Hi.

I use the ONNX parser to convert an ONNX model into a TRT model file, and my application then loads that TRT model directly. I wrote this process with reference to the following:

GitHub - dusty-nv/jetson-inference (specifically ‘tensorNet.cpp’): Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.

However, a segmentation fault occurs at the following location, and the TRT model cannot be loaded:

trt_engine = trt_infer->deserializeCudaEngine(modelMem, modelSize, NULL);
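
For context, the surrounding loading code looks roughly like this. This is a simplified sketch: the Logger class and the loadEngine helper are illustrative names, and only the deserializeCudaEngine call is taken from my actual program.

#include <NvInfer.h>
#include <cstdio>
#include <fstream>
#include <vector>

// Minimal logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity <= Severity::kWARNING)
            std::printf("%s\n", msg);
    }
} gLogger;

// Read the serialized engine from disk and deserialize it.
nvinfer1::ICudaEngine* loadEngine(const char* path)
{
    std::ifstream file(path, std::ios::binary | std::ios::ate);
    if (!file)
        return nullptr;

    std::size_t modelSize = file.tellg();   // size of the serialized engine
    file.seekg(0, std::ios::beg);
    std::vector<char> modelMem(modelSize);
    file.read(modelMem.data(), modelSize);

    nvinfer1::IRuntime* trt_infer = nvinfer1::createInferRuntime(gLogger);
    // The segmentation fault occurs inside this call:
    return trt_infer->deserializeCudaEngine(modelMem.data(), modelSize, NULL);
}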

MNIST, VGG, etc. load successfully, but my custom model does not. On investigation, the “Upsample” layer in the custom model appears to be the cause: loading succeeds when I delete this layer.

Incidentally, when I skip deserializeCudaEngine (i.e., do not load from the TRT model) and instead build the engine from the ONNX model with the following call, inference succeeds:

trt_engine = trt_builder->buildCudaEngine(*trt_network);
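
The build path, for comparison, looks roughly like this (a sketch assuming the TensorRT 5 ONNX parser API; gLogger is the logger shown earlier, and the batch size and workspace size are placeholder values):

#include <NvInfer.h>
#include <NvOnnxParser.h>

// Build an engine directly from the ONNX model (works, but slow).
nvinfer1::ICudaEngine* buildFromOnnx(const char* onnxPath)
{
    nvinfer1::IBuilder* trt_builder = nvinfer1::createInferBuilder(gLogger);
    nvinfer1::INetworkDefinition* trt_network = trt_builder->createNetwork();

    nvonnxparser::IParser* parser = nvonnxparser::createParser(*trt_network, gLogger);
    if (!parser->parseFromFile(onnxPath, /*verbosity=*/1))
        return nullptr;

    trt_builder->setMaxBatchSize(1);            // placeholder value
    trt_builder->setMaxWorkspaceSize(1 << 28);  // placeholder: 256 MB

    return trt_builder->buildCudaEngine(*trt_network);
}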

But building the engine this way takes too long, so I would like to load the serialized TRT model directly.
Is there a solution?

Thanks.

Hello,

Can you share the traceback from the segmentation fault? Can you share a repro of your ONNX model that exhibits the seg fault? You can DM me directly if you’d like.
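
A backtrace captured with something like the following would help (./app stands in for your actual binary):

gdb --args ./app model_up.trt
(gdb) run
(gdb) bt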

Hi

Thank you for your reply.

I have attached the gdb log and the models together.

model.onnx is an ONNX model without an upsample layer.
model.trt is a TRT model converted from model.onnx.
This model loads without error at deserializeCudaEngine.

model_up.onnx is an ONNX model with an upsample layer.
model_up.trt is a TRT model converted from model_up.onnx.
This model causes a segmentation fault at deserializeCudaEngine.

Please check.

Thanks.
temp.zip (4.72 MB)

Hello,

Can you share how you are saving your model?

Our engineers are able to parse model_up.onnx, generate a TRT engine locally, and reload it just fine with deserializeCudaEngine, but loading your model_up.trt produces an error.

Regards,
NVES

Hello.
Thank you for your reply.

Conversion to the TRT model was done with onnx2trt. I ran the following command:

onnx2trt model_up.onnx -o model_up.trt
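
For what it’s worth, my understanding is that this command does roughly the equivalent of the following in the C++ API (a sketch; serialize() and IHostMemory are standard TensorRT calls, and buildFromOnnx is the hypothetical build helper sketched earlier):

#include <fstream>

// Build once, then serialize the engine to a .trt file for later reuse.
void saveEngine(const char* onnxPath, const char* trtPath)
{
    nvinfer1::ICudaEngine* engine = buildFromOnnx(onnxPath);
    nvinfer1::IHostMemory* blob = engine->serialize();

    std::ofstream out(trtPath, std::ios::binary);
    out.write(static_cast<const char*>(blob->data()), blob->size());

    blob->destroy();
    engine->destroy();
}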

So there is a way to load it!
Could you tell me how?

Thanks.

Hello.

This problem has been solved.
The cause was that I was using a “libnvonnxparser” that I had built myself from the site below.

Loading now works when using the “libnvonnxparser” included in TensorRT 5.
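
In other words, the fix was to link against the libnvonnxparser.so that ships with the TensorRT 5 package instead of a self-built copy. With a standard .deb install the link line would look something like this (the paths and file names here are assumptions for a typical x86_64 install):

g++ app.cpp -o app -lnvinfer -lnvonnxparser -lcudart -L/usr/lib/x86_64-linux-gnu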
Thank you for your confirmation.

Thanks.

Hello j-kim & NVES,
Could you give more detail about “libnvonnxparser”?
I am hitting the same error.

Looking forward to your reply.