Padding error

Description

I’m trying to convert the ‘Yet Another EfficientDet’ PyTorch model to a TensorRT engine to run on a Jetson NX.

Relevant Files

[Error files] [sample onnx]

Steps To Reproduce

I downloaded the pre-trained model and repository from Yet Another EfficientDet. After that, I tried to convert it into an ONNX file following this link (note: step 4 is not required) and using torch.onnx.export:

torch.onnx.export(self.model, 
            x, 
            "./efficientdet-d4_final.onnx", 
            opset_version=11)
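
For context, a minimal end-to-end sketch of this export (the EfficientDetBackbone class, its constructor arguments, the checkpoint file name, and the 1024×1024 d4 input size are assumptions based on the Yet-Another-EfficientDet-Pytorch repo and may need adjusting):

import torch
from backbone import EfficientDetBackbone  # from the Yet-Another-EfficientDet-Pytorch repo

# Build the d4 model and load the pre-trained weights
model = EfficientDetBackbone(num_classes=90, compound_coef=4)
model.load_state_dict(torch.load("efficientdet-d4.pth", map_location="cpu"))
model.eval()

# Dummy input matching the assumed d4 resolution (1024x1024)
x = torch.randn(1, 3, 1024, 1024)

torch.onnx.export(model,
                  x,
                  "./efficientdet-d4_final.onnx",
                  opset_version=11)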

This gave an ONNX file. I kept this copy and also folded the constants in the ONNX graph using:
polygraphy surgeon sanitize model.onnx --fold-constants --output model_folded.onnx
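
As a quick sanity check that both files still parse, a minimal onnx snippet along these lines can be used (file names are just the ones from the command above):

import onnx

for path in ("model.onnx", "model_folded.onnx"):
    m = onnx.load(path)
    onnx.checker.check_model(m)  # raises if the graph is structurally invalid
    print(path, "nodes:", len(m.graph.node))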

After getting the ONNX file, I ran the command:
trtexec --onnx=name.onnx --fp16=enable --workspace=5500 --batch=1 --saveEngine=model.trt --verbose
This gave me the error relating to padding.
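
To see which node actually triggers the padding error, the ONNX parser can also be driven directly through the TensorRT Python API; a minimal sketch (the file name is an example):

import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

# Parse the ONNX file and print every parser error, including the failing node
with open("model_folded.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))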

I ran the whole process for:

  • opset = 10, 11
  • ONNX = folded and not folded
  • torch version = 1.5, 1.8

The errors and the --verbose output of trtexec are in the attached folder.

TensorRT Version: 7.1.3-1+cuda10.2
NVIDIA GPU: NVIDIA Tegra Xavier (nvgpu)/integrated
NVIDIA Driver Version: 32.4.4 Release Build
CUDA Version: 10.2
Operating System: Ubuntu 18.04
Python Version (if applicable): 3.6.9
PyTorch Version (if applicable): 1.5/1.8
Processor: ARMv8 Processor rev 0 (v8l) × 6
ONNX Version: 1.9.0
tf2onnx Version: 1.8.5/50049d

Hi,

Thanks. We are checking this issue internally and will share more information with you later.


Hi,

Could you help validate the ONNX model first?

We can reproduce the dimension issue from the Add operator with TensorRT, so we also tried to deploy the model with onnxruntime. However, that fails due to the size issue below. Have you run inference on this model with any framework before? (A minimal onnxruntime check is sketched after the traceback.)

Traceback (most recent call last):
  File "test_onnxruntime.py", line 13, in <module>
    sess = onnxruntime.InferenceSession(model)
  File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 280, in __init__
    self._create_inference_session(providers, provider_options)
  File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 307, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from efficientdet-d4_op10_torch1-5.onnx failed:Node (Conv_6) Op (Conv) [ShapeInferenceError] Attribute strides has incorrect size
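
For reference, a minimal validation sketch along the lines of the test_onnxruntime.py above (the model path and the 1024×1024 input shape are assumptions and need to match your export):

import numpy as np
import onnx
import onnxruntime

model_path = "efficientdet-d4_op10_torch1-5.onnx"

# full_check runs shape inference, which should surface errors like the Conv strides issue above
onnx.checker.check_model(onnx.load(model_path), full_check=True)

# If the check passes, try a dummy forward pass with onnxruntime
sess = onnxruntime.InferenceSession(model_path)
inp = sess.get_inputs()[0]
dummy = np.random.rand(1, 3, 1024, 1024).astype(np.float32)
out = sess.run(None, {inp.name: dummy})
print([o.shape for o in out])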

Thanks.
