I got this error on an ONNX model which works perfectly on TensorRT 7.
After upgrading to TensorRT 8.0 or 8.2, I always hit this error.
My model is DETR, a transformer ONNX model which can be accessed from the facebookresearch repo.
Hi @LucasJin , welcome back to the NVIDIA Developer forums!
Judging from how you describe your issue being connected to TensorRT, I have moved this topic to that specific category.
I hope you get it resolved soon.
Markus
Hi @LucasJin,
Could you please share the ONNX model that reproduces the issue and the trtexec --verbose
logs so we can help better?
Thank you.
Hi,
Request you to share the ONNX model and the script if not shared already so that we can assist you better.
Alongside, you can try a few things:
1) Validate the model with the ONNX checker:
import onnx

filename = "your_model.onnx"  # replace with the path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)
2) Try running your model with the trtexec command:
https://github.com/NVIDIA/TensorRT/tree/master/samples/opensource/trtexec
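For step 2, a typical invocation looks like the sketch below. The model path is a placeholder; --onnx and --verbose are standard trtexec flags, and the tee is just one way to capture the verbose log the team asked for.

trtexec --onnx=your_model.onnx --verbose 2>&1 | tee trtexec_verbose.log

If the ONNX parser rejects a node, the verbose log will name the failing layer, which is usually the fastest way to narrow down a TensorRT 7 to 8 regression.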
In case you are still facing the issue, request you to share the trtexec "--verbose" log for further debugging.
Thanks!
Oh, after several years I just got this message.
However, it seems I had already gotten DETR inference working via TensorRT a long time ago.