Description
I am trying to run inference with a TensorRT engine, but when allocating buffers the output bindings show up with negative sizes, which prevents me from running the engine. What could cause this?
Here is the snippet used to get the sizes:
for binding in self.engine:
    print(binding)
    size = trt.volume(self.engine.get_tensor_shape(binding))
    print(size)
Output:
scores
-1
labels
-1
boxes
-4
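For context on the numbers above: `trt.volume` is just the product of the dimensions, and TensorRT reports dynamic dimensions as -1 in the engine-level shape, so any unresolved dynamic axis makes the computed volume negative (e.g. a `(-1, 4)` boxes shape yields -4). A minimal sketch of that arithmetic, without TensorRT, assuming a hypothetical `volume` helper that mimics `trt.volume`:

```python
from math import prod

def volume(shape):
    # Mimics trt.volume: plain product of the dimensions,
    # with no special handling of -1 (dynamic) axes.
    return prod(shape)

# Shapes as engine.get_tensor_shape() might report them before
# the dynamic batch dimension is resolved (hypothetical values):
print(volume((-1,)))     # scores -> -1
print(volume((-1, 4)))   # boxes  -> -4
```

This is why buffer allocation based on the engine-level shape fails: the actual size is only known once the input shapes are set on the execution context.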
Environment
TensorRT Version : 8.5
GPU Type : Jetson Xavier
Nvidia Driver Version :
CUDA Version : 11.4
CUDNN Version :
Operating System + Version : Ubuntu 20
Python Version (if applicable) : 3.8.10
TensorFlow Version (if applicable) :
PyTorch Version (if applicable) : 1.13
Baremetal or Container (if container which image + tag) :