Yolo v3 output boxes are "nan" both in Python and C++


I converted a detection model trained in TensorFlow to ONNX and then to a TensorRT engine file. When I run inference on the deserialized engine, I get an output tensor in which all the box coordinate values are NaN. The rest of the values, those that don't correspond to the box coordinates (scores / class id / objectness), are valid numbers.
This happens both with the Python API and with the C++ API, and whether the engine's precision is float32 or float16.

When I ran inference on the ONNX model directly, the box outputs were valid.


TensorRT Version : 7.1.2
CUDA Version : 11.0
Operating System + Version : Ubuntu 18.04 for both the Python and the C++ inference; these are two different environments, and each TRT engine was built separately in its own environment.
Python Version (if applicable) : 3.6
C++ version: 14
Cmake version: 3.13
TensorFlow Version (if applicable) : The model was trained on tf 1.15, converted to onnx, and then converted to tensorRT engine.
The conversion to TRT engine was done in

I can't share the model (maybe a "dummy" model in private later if necessary). Any advice on why this might happen would be much appreciated!

Hi @weissrael,
Could you please confirm whether your model is YOLOv3 (as mentioned in the title) or a TensorFlow model (as mentioned in the description)?
Also, if you can provide a verbose build log, that would be helpful.

Found the issue (a bug that somehow appeared in both the C++ and the Python code): the model has several inputs, and by mistake the memory copies to the bindings (just before inference) were not directed at each input's corresponding binding index. With the copies going to the right indices, the outputs are valid.
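For anyone hitting the same symptom: the safe pattern is to look up each input's binding index by name instead of assuming the order you iterate your inputs in matches the engine's binding order. Below is a minimal runnable sketch of the bug and the fix using plain numpy buffers as stand-ins for device bindings; the `binding_index` mapping and the tensor names (`image`, `image_shape`) are hypothetical — in real TensorRT 7 code the index would come from `engine.get_binding_index(name)`.

```python
import numpy as np

# Stand-in for the engine's binding layout. In real TensorRT code you would
# build this map with: {name: engine.get_binding_index(name) for name in names}
binding_index = {"image": 0, "image_shape": 1, "boxes_out": 2}

# One host buffer per binding (stand-in for the device pointers passed to
# execute_v2 / enqueueV2).
bindings = [np.zeros(4, dtype=np.float32) for _ in range(3)]

# Two inputs, deliberately listed in an order that does NOT match binding order.
inputs = {
    "image_shape": np.array([416.0, 416.0, 0.0, 0.0], dtype=np.float32),
    "image":       np.full(4, 0.5, dtype=np.float32),
}

# Buggy version (what caused the NaN boxes): copy inputs in iteration order,
# assuming it matches the binding order -- "image_shape" would land in
# binding 0 and "image" in binding 1, feeding garbage to the network.
#
# Correct version: resolve each input's binding index by name before copying.
for name, data in inputs.items():
    np.copyto(bindings[binding_index[name]], data)

assert bindings[0][0] == 0.5    # "image" ended up in binding 0
assert bindings[1][0] == 416.0  # "image_shape" ended up in binding 1
```

With the wrong indices the network still runs, which is why the scores and class ids can look plausible while the decoded box coordinates come out as NaN.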
