Internal Error (Could not find any implementation for node {ForeignNode[W1...Softmax]}.)

Hi, I exported a model to ONNX with keras2onnx and tried to load it into TensorRT with Python.
I received the error shown in the topic title.

Does this mean TensorRT does not have an implementation for the Keras node?
What can I do to solve this?
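
For reference, this is roughly how I parse the ONNX model and build the engine in Python (a minimal sketch of the standard TensorRT 8.x workflow; the file names and workspace size are placeholders, not my exact script):

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)

# ONNX models in TensorRT 8.x require an explicit-batch network
builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("model.onnx", "rb") as f:  # placeholder path
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("Failed to parse the ONNX model")

config = builder.create_builder_config()
config.max_workspace_size = 1 << 30  # 1 GiB; tune for the Jetson Nano

# build_serialized_network returns a serialized engine in TensorRT 8.x
serialized_engine = builder.build_serialized_network(network, config)
with open("model.trt", "wb") as f:
    f.write(serialized_engine)
```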
By the way, my environment is:
TensorRT 8.2,
CUDA 10.2,
cuDNN 8.2,
JetPack 4.6 on a Jetson Nano

Here is my ONNX model; I have already checked it with the onnx package.
model.zip (333.2 KB)
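
The check was just the standard onnx validator, something like this (a minimal sketch; model.onnx is a placeholder path):

```python
import onnx

model = onnx.load("model.onnx")
onnx.checker.check_model(model)  # raises an exception if the graph is invalid
print(onnx.helper.printable_graph(model.graph))  # optional: inspect the nodes
```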

Here is the log from running ./trtexec --onnx=model.onnx --saveEngine=model.trt --workspace=30 --verbose:
onnx2trt_log.txt (32.0 KB)

OK, I managed to save the engine with trtexec after all. The earlier save failed because trtexec was writing into my bin folder, where ordinary users do not have write permission.
I also found the cause of the error when building the engine from Python code: the input shapes were too large.
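
In case it helps anyone else, the input shapes can be double-checked straight from the ONNX file before building (a small sketch; model.onnx is a placeholder path):

```python
import onnx

model = onnx.load("model.onnx")
for inp in model.graph.input:
    # dim_value is 0 for symbolic dimensions, so fall back to dim_param
    dims = [d.dim_value if d.dim_value > 0 else d.dim_param
            for d in inp.type.tensor_type.shape.dim]
    print(inp.name, dims)
```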
