Hi, I exported a model to ONNX with keras2onnx and tried to load it into TensorRT with Python.
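This is roughly the loading code (a minimal sketch using the tensorrt Python API; the exact builder/parser setup in my script may differ slightly):

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)

# Parse the ONNX file into a TensorRT network (explicit batch is required for ONNX)
builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        # Print parser errors if the ONNX graph cannot be converted
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("Failed to parse the ONNX model")

# Build a serialized engine
config = builder.create_builder_config()
config.max_workspace_size = 1 << 30  # 1 GiB workspace
engine_bytes = builder.build_serialized_network(network, config)
```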
I have received the following error:
Does that mean TensorRT does not support the node/operator that Keras produced?
What can I do to solve it?
By the way, I am using:
TensorRT 8.2
CUDA 10.2
cuDNN 8.2
JetPack 4.6 on a Jetson Nano
Here is my ONNX model. I have already checked the model with the onnx checker.
model.zip (333.2 KB)
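For reference, the check was just this (a minimal sketch using onnx.checker):

```python
import onnx

# Load the exported model and validate its graph structure
model = onnx.load("model.onnx")
onnx.checker.check_model(model)  # raises an exception if the model is invalid
print("ONNX model check passed")
```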
Here is the log from running ./trtexec --onnx=model.onnx --saveEngine=model.trt --workspace=30 --verbose
onnx2trt_log.txt (32.0 KB)