Could not parse ONNX model (2)

Hello!

Unfortunately this still-unsolved post seems to have dropped off your radar, but it is of great importance to us.

In short:
When trying to convert an ONNX file into a TensorRT engine file on a Jetson Nano, an error occurred (a minimal sketch of the conversion path I am attempting follows the questions below). You suggested that I try the newest TensorRT version, 8.6.1, but:

Q1) Does the rule that the engine must be built on the same machine type where inference will run (the Nano in this case) still apply to 8.6.1?

Q2) If the answer above is yes: AFAIK TensorRT 8.6.1 is incompatible with the Nano, so how can I run it there, perhaps in Docker?

Q3) If the answer above is yes: could you please suggest a Docker image for TensorRT 8.6.1? And must the Nano be the host, or is the host another machine with Nano L4T as the container?
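For reference, here is roughly the conversion path I am attempting. This is a minimal sketch assuming the TensorRT 8.0 Python API that ships with JetPack 4.6; `model.onnx` and `model.engine` are placeholder filenames, not my actual files:

```python
import tensorrt as trt

# Minimal sketch of the ONNX -> TensorRT engine conversion attempted on the Nano.
# Assumes the TensorRT 8.0 Python API from JetPack 4.6; filenames are placeholders.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        # This is the stage where "Could not parse ONNX model" errors surface.
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parse failed")

config = builder.create_builder_config()
config.max_workspace_size = 1 << 28  # 256 MiB; TRT 8.0 API, deprecated in later versions

serialized = builder.build_serialized_network(network, config)
with open("model.engine", "wb") as f:
    f.write(serialized)
```

The equivalent command-line route is `trtexec --onnx=model.onnx --saveEngine=model.engine`; given the thread title, the failure surfaces at the ONNX parser stage either way.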

Thanks!

Environment

TensorRT Version: 8.0.1
GPU Type: Jetson Nano devkit (Maxwell)
Nvidia Driver Version:
CUDA Version: 10.2.300
CUDNN Version: 8.2.1
Operating System + Version: Ubuntu 18.04 (JetPack 4.6.0)
PyTorch Version (if applicable): 1.12.1
