How to run inference with TensorRT on the Jetson Nano?

Hello, @AastaLLL

I trained YOLOv3 (PyTorch) on a custom dataset and obtained a weights file.
I converted this to best.onnx using the method provided by the project.

I converted best.onnx to best.trt using trtexec.

I don’t know how to run inference on the Jetson Nano using this best.trt.

Can you give me any advice?

The ONNX model obtained after training is in the following form.


Since TensorRT engines are not portable across devices, you will need to generate the engine from the ONNX model directly on the Nano.
An example of creating an engine from ONNX and running inference can be found in the comment below:
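As a minimal sketch of the two steps above (build an engine from the ONNX file on the Nano, then run inference with it): this assumes the TensorRT 7.x Python API that ships with JetPack 4.x plus pycuda, the file name best.onnx comes from this thread, and the helper names are mine, not from any sample.

```python
def gib(n):
    """Bytes in n gibibytes (used for the builder workspace size)."""
    return n * (1 << 30)

def build_engine(onnx_path, fp16=True):
    # Build a TensorRT engine from an ONNX file (TensorRT 7.x-style API).
    import tensorrt as trt
    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    # YOLOv3 ONNX exports normally use an explicit batch dimension.
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("ONNX parse failed")
    builder.max_workspace_size = gib(1)   # the Nano has limited memory
    builder.fp16_mode = fp16              # FP16 is much faster on the Nano
    return builder.build_cuda_engine(network)

def infer(engine, input_array):
    # Copy input to the GPU, execute, and copy the outputs back.
    import numpy as np
    import pycuda.autoinit  # noqa: F401  (creates a CUDA context)
    import pycuda.driver as cuda
    import tensorrt as trt
    stream = cuda.Stream()
    bindings, outputs = [], []
    for binding in engine:
        size = trt.volume(engine.get_binding_shape(binding))
        dtype = trt.nptype(engine.get_binding_dtype(binding))
        host_buf = cuda.pagelocked_empty(size, dtype)
        dev_buf = cuda.mem_alloc(host_buf.nbytes)
        bindings.append(int(dev_buf))
        if engine.binding_is_input(binding):
            np.copyto(host_buf, input_array.ravel())
            cuda.memcpy_htod_async(dev_buf, host_buf, stream)
        else:
            outputs.append((host_buf, dev_buf))
    context = engine.create_execution_context()
    context.execute_async_v2(bindings=bindings, stream_handle=stream.handle)
    for host_buf, dev_buf in outputs:
        cuda.memcpy_dtoh_async(host_buf, dev_buf, stream)
    stream.synchronize()
    return [host_buf for host_buf, _ in outputs]

if __name__ == "__main__":
    engine = build_engine("best.onnx")
    # outputs = infer(engine, preprocessed_image)  # CHW float32 input
```

You would still need the sample's pre/post-processing (resize to the network resolution, then decode the raw outputs into boxes), which is what the yolov3_onnx sample provides.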




I converted the model trained on my custom dataset to yolov3.onnx,
copied it into the /usr/src/tensorrt/samples/python/yolov3_onnx directory, and
then ran sudo python3, but the following error appears.
Could you tell me how to fix it?

jetson7@jetson7-desktop:/usr/src/tensorrt/samples/python/yolov3_onnx$ sudo python3
Loading ONNX file from path yolov3.onnx…
Beginning ONNX file parsing
[TensorRT] WARNING: onnx2trt_utils.cpp:220: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
Completed parsing of ONNX file
Building an engine from file yolov3.onnx; this may take a while…
Completed creating Engine
Running inference on image sy01_val_0001.jpg…
Traceback (most recent call last):
File “”, line 187, in <module>
File “”, line 167, in main
trt_outputs = [output.reshape(shape) for output, shape in zip(trt_outputs, output_shapes)]
File “”, line 167, in <listcomp>
trt_outputs = [output.reshape(shape) for output, shape in zip(trt_outputs, output_shapes)]
ValueError: cannot reshape array of size 463524 into shape (1,255,19,19)

Thank you.


Have you updated the following configuration based on your model?

The error indicates that some dimensions are incompatible in the output layer,
so it’s recommended to first check that all of the configuration is correct.

postprocessor_args = {"yolo_masks": [(6, 7, 8), (3, 4, 5), (0, 1, 2)],                    # A list of 3 three-dimensional tuples for the YOLO masks
                      "yolo_anchors": [(10, 13), (16, 30), (33, 23), (30, 61), (62, 45),  # A list of 9 two-dimensional tuples for the YOLO anchors
                                       (59, 119), (116, 90), (156, 198), (373, 326)],
                      "obj_threshold": 0.6,                                               # Threshold for object coverage, float value between 0 and 1
                      "nms_threshold": 0.5,                                               # Threshold for non-max suppression algorithm, float value between 0 and 1
                      "yolo_input_resolution": input_resolution_yolov3_HW}

