trtexec Internal Error (Symbolic relation a.z >= 0 is always false)


I am unable to convert an ONNX model to a TensorRT engine using either the trtexec command or the Python package. I would love to give TensorRT a try but have been unable to use it for quite some time now. The model architecture is called “Segformer”; it’s a vision transformer for image segmentation.

I previously created another thread about building the model with trtexec on a cloud GPU, because I thought I simply needed more memory. Now that I know that was not the issue, I am attempting to build directly on the Jetson device, especially since the developer guide explains that serialized engines are not transferable between GPUs. I understand that this problem may have been resolved in the latest version of TensorRT. Is there any way I can use those new features on my Jetson Xavier NX, or am I stuck waiting for the next update to JetPack? If I am stuck waiting, is there a roadmap I can access somewhere? Is it a matter of weeks, months, or years until the new version of TensorRT can be used on my NVIDIA Jetson device? Thank you.


TensorRT Version: 8.4.3
GPU Type: Jetson Xavier NX
Nvidia Driver Version:
CUDA Version: 11.4
CUDNN Version:
Operating System + Version: NVIDIA Jetpack 5.0.2
Python Version (if applicable): 3.8.10
TensorFlow Version (if applicable): N/A
PyTorch Version (if applicable): 1.11.0
Baremetal or Container (if container which image + tag): N/A

Relevant Files

sidewalk3.onnx - Google Drive
sidewalk4.onnx - Google Drive

Steps To Reproduce

(Based on this portion of the developer guide: Developer Guide :: NVIDIA Deep Learning TensorRT Documentation)

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
# Explicit batch is required when parsing ONNX models
network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
success = parser.parse_from_file(model_path)  # model_path: path to the .onnx file
config = builder.create_builder_config()
serialized_engine = builder.build_serialized_network(network, config)
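The snippet above succeeds silently even when parsing fails (success is never checked). A hedged sketch of the same flow with parser-error reporting and engine serialization added; the build_engine helper and file paths here are illustrative, not part of the original report:

```python
# Sketch only: builds a TensorRT engine from an ONNX file, printing
# parser errors and writing the serialized engine to disk on success.
# Guarded so it degrades gracefully where TensorRT is not installed.
try:
    import tensorrt as trt
except ImportError:
    trt = None


def build_engine(model_path: str, plan_path: str) -> bool:
    """Return True if an engine was built and saved, else False."""
    if trt is None:
        return False
    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    # Explicit batch is required when parsing ONNX models
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    if not parser.parse_from_file(model_path):
        # Surface the parser's own diagnostics instead of failing silently
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        return False
    config = builder.create_builder_config()
    serialized = builder.build_serialized_network(network, config)
    if serialized is None:
        return False  # build failed; details are in the logger output
    with open(plan_path, "wb") as f:
        f.write(bytes(serialized))
    return True
```

Checking the parser's return value is what surfaces errors like the "Symbolic relation ... is always false" failure instead of producing an unusable engine.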

Please share the ONNX model and the script, if not shared already, so that we can assist you better.
In the meantime, you can try a few things:

  1. Validate your model with the below snippet:

import onnx

filename = "model.onnx"  # path to your ONNX model
model = onnx.load(filename)
onnx.checker.check_model(model)  # raises an exception if the model is invalid
  2. Try running your model with the trtexec command.
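A sketch of the trtexec invocation for step 2, with the verbose log captured for debugging; the file names are illustrative, and the trtexec location assumes a JetPack install:

```shell
# On JetPack, trtexec typically lives under /usr/src/tensorrt/bin
ONNX="sidewalk3.onnx"
if command -v trtexec >/dev/null 2>&1; then
  # Build an engine from the ONNX model and save the full verbose log
  trtexec --onnx="$ONNX" --saveEngine=sidewalk3.plan --verbose > build.log 2>&1
fi
```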

In case you are still facing the issue, please share the trtexec --verbose log for further debugging.

Yes, the models pass the checker without error.


We are moving this post to the Jetson Xavier NX forum to get better help.

This issue is fixed as part of the TensorRT 8.5.1 release. In case the latest version is not available on JetPack, you can also try the TensorRT NGC container.
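A sketch of trying a newer TensorRT via an NGC container on a Jetson; the image tag below is illustrative, so check the NGC catalog for a tag matching your JetPack/L4T release:

```shell
# Illustrative tag; pick one that matches your L4T version from the NGC catalog
IMAGE="nvcr.io/nvidia/l4t-tensorrt:r8.5.2-runtime"
if command -v docker >/dev/null 2>&1; then
  docker pull "$IMAGE"
  # --runtime nvidia exposes the Jetson GPU inside the container
  docker run --rm --runtime nvidia "$IMAGE" trtexec --help
fi
```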

Thank you.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.