Please provide the following info (check/uncheck the boxes after creating this topic):

**Software Version**
- [ ] DRIVE OS Linux 5.2.6
- [ ] DRIVE OS Linux 5.2.6 and DriveWorks 4.0
- [x] DRIVE OS Linux 5.2.0
- [ ] DRIVE OS Linux 5.2.0 and DriveWorks 3.5
- [ ] NVIDIA DRIVE™ Software 10.0 (Linux)
- [ ] NVIDIA DRIVE™ Software 9.0 (Linux)
- [ ] other DRIVE OS version
- [ ] other

**Target Operating System**
- [x] Linux
- [ ] QNX
- [ ] other

**Hardware Platform**
- [x] NVIDIA DRIVE™ AGX Xavier DevKit (E3550)
- [ ] NVIDIA DRIVE™ AGX Pegasus DevKit (E3550)
- [ ] other

**SDK Manager Version**
- [ ] 1.7.0.8846
- [ ] other

**Host Machine Version**
- [ ] native Ubuntu 18.04
- [ ] other
Hi,
I am working on deploying an ONNX model to a Xavier machine. Since our original model contains some custom operators, we need to implement plugins to run it. We have already tested this in an x86 environment with the TensorRT 6.0 open-source software: by implementing plugins in the ONNX parser (onnx2tensorrt) and recompiling it, we managed to convert our ONNX model into an x86 TensorRT engine.
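For reference, the plugin path we used looks roughly like the skeleton below. This is only a sketch, not our actual implementation: the op name `MyCustomOp` and all class names are placeholders, and the plugin method bodies are elided. The idea is that registering an `IPluginCreator` with `REGISTER_TENSORRT_PLUGIN` makes the plugin discoverable through `getPluginRegistry()`, so the parser's op importer can look it up by name instead of hard-coding the implementation.

```cpp
// Sketch only -- illustrative names, most overrides elided.
#include <cstring>
#include <string>
#include <NvInfer.h>
#include <NvInferPlugin.h>

using namespace nvinfer1;

class MyCustomOpPlugin : public IPluginV2DynamicExt {
    // ... implement getOutputDimensions, enqueue, serialize, clone, etc. ...
};

class MyCustomOpCreator : public IPluginCreator {
public:
    // The name the ONNX op importer uses to find this plugin.
    const char* getPluginName() const override { return "MyCustomOp"; }
    const char* getPluginVersion() const override { return "1"; }
    const PluginFieldCollection* getFieldNames() override { return &mFC; }

    IPluginV2* createPlugin(const char* name,
                            const PluginFieldCollection* fc) override {
        return new MyCustomOpPlugin(/* parse attributes from fc */);
    }
    IPluginV2* deserializePlugin(const char* name, const void* data,
                                 size_t length) override {
        return new MyCustomOpPlugin(/* data, length */);
    }
    void setPluginNamespace(const char* ns) override { mNamespace = ns; }
    const char* getPluginNamespace() const override { return mNamespace.c_str(); }

private:
    PluginFieldCollection mFC{};
    std::string mNamespace;
};

// Registers the creator in the global plugin registry at load time.
REGISTER_TENSORRT_PLUGIN(MyCustomOpCreator);
```

On TensorRT 6.x we still had to add a small importer entry in the parser's `builtin_op_importers.cpp` to map the ONNX node onto this registered plugin; as far as I understand, newer parser versions can fall back to the plugin registry automatically for unknown ops.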
However, we cannot find a suitable onnx2tensorrt for the arm64 TensorRT 6.3.1 that ships with DRIVE OS 5.2.0. We are now modifying the open-source onnx2tensorrt and cross-compiling it for use on the Xavier machine, but we have run into some problems.
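For context, our cross-compile setup looks roughly like the following. All paths are placeholders (the location of the aarch64 sysroot and the TensorRT libraries depends on how the DRIVE OS PDK was installed), and the `TENSORRT_ROOT` variable is what the project's CMakeLists appears to use — please correct us if there is an official toolchain file for this:

```shell
# Sketch: cross-compiling the open-source onnx-tensorrt for aarch64.
# All /path/to/... locations below are assumptions -- adjust to your install.
cat > aarch64-toolchain.cmake <<'EOF'
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR aarch64)
set(CMAKE_C_COMPILER aarch64-linux-gnu-gcc)
set(CMAKE_CXX_COMPILER aarch64-linux-gnu-g++)
# Point at the target sysroot containing the aarch64 TensorRT/CUDA libraries.
set(CMAKE_FIND_ROOT_PATH /path/to/drive-linux/targetfs)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
EOF

cd onnx-tensorrt && mkdir -p build && cd build
cmake .. -DCMAKE_TOOLCHAIN_FILE=../../aarch64-toolchain.cmake \
         -DTENSORRT_ROOT=/path/to/aarch64/TensorRT
make -j"$(nproc)"
```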
I'm wondering: is there an open-source onnx2tensorrt release that matches TensorRT 6.3.1? And is there a more elegant way to handle custom ops in an ONNX model?
Thank you.