Description
I have had issues using PyTorch’s InstanceNorm2d in my neural network. When exporting to ONNX, I originally hit the error ‘ONNX export of instance_norm for unknown batch size.’ Note that it mentions batch size; everywhere else I have seen this error reported, it was about channel size. I worked around that by constructing the InstanceNorm2d layers with track_running_stats=False, and onnx.checker.check_model no longer raises an error either (a rough sketch of my export path is below).
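For reference, this is roughly how I export the model. The Block module here is just a hypothetical stand-in for my actual network; the relevant parts are the InstanceNorm2d constructed with track_running_stats=False and the dynamic batch axis:

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    """Hypothetical stand-in for the real network; only the norm layer matters here."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 64, 3, padding=1)
        # track_running_stats=False is the workaround for the
        # "ONNX export of instance_norm for unknown batch size" error
        self.norm = nn.InstanceNorm2d(64, affine=True, track_running_stats=False)

    def forward(self, x):
        return torch.relu(self.norm(self.conv(x)))

model = Block().eval()
dummy = torch.randn(1, 3, 224, 224)
torch.onnx.export(
    model,
    dummy,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
    opset_version=17,
)
```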
So now I can export the model, but when I try to build the engine I get the following error:
context = engine.create_execution_context()
AttributeError: 'NoneType' object has no attribute 'create_execution_context'
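For reference, this is roughly the build path that fails. It is a minimal sketch, not my exact script: the filename is a placeholder and the parse-error / None checks are added here for clarity. In my case the engine comes back as None, which is where the AttributeError above originates:

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        # If parsing fails, the build step below returns None,
        # which is what produces the AttributeError above.
        for i in range(parser.num_errors):
            print(parser.get_error(i))

config = builder.create_builder_config()
serialized_engine = builder.build_serialized_network(network, config)
if serialized_engine is not None:
    runtime = trt.Runtime(logger)
    engine = runtime.deserialize_cuda_engine(serialized_engine)
    context = engine.create_execution_context()
```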
I’ve tried using trtexec instead, and I get this error:
Assertion failed: inputs.at(2).is_weights() && "The bias tensor is required to be an initializer."
again relating to InstanceNorm. However, when I use onnx-simplifier to simplify the ONNX graph first (roughly as shown below), I am able to build the engine as expected.
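For completeness, this is roughly how I run onnx-simplifier (a sketch using its Python API; the filenames are placeholders). I assume its constant folding is what turns the InstanceNorm bias input into an initializer:

```python
import onnx
from onnxsim import simplify

model = onnx.load("model.onnx")
# simplify() performs constant folding and shape inference;
# I assume this is what makes the bias tensor an initializer.
model_simp, ok = simplify(model)
assert ok, "onnx-simplifier could not validate the simplified model"
onnx.save(model_simp, "model_simplified.onnx")
```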
Could someone help me understand where this error comes from?
Environment
TensorRT Version: 8.6.1.6
GPU Type: RTX A5000
Nvidia Driver Version: 525.147.05
CUDA Version: 12.0
CUDNN Version: 8.9.2.26
Operating System + Version: Ubuntu 23.10
Python Version (if applicable): 3.8.19
TensorFlow Version (if applicable):
PyTorch Version (if applicable): 2.2.0
Baremetal or Container (if container which image + tag):
Relevant Files
Please attach or include links to any models, data, files, or scripts necessary to reproduce your issue. (Github repo, Google Drive, Dropbox, etc.)
Steps To Reproduce
Please include:
- Exact steps/commands to build your repro
- Exact steps/commands to run your repro
- Full traceback of errors encountered