Custom trained model on Jetson Nano

Thanks for the reply.
I successfully converted my model to ONNX format, but running the inference script (Custom ResNet Jetson Xavier - #3 by AastaLLL) throws the same error as this: https://imgur.com/OD8laNl.
Here's my recent model: model.onnx (2.9 MB)
Do I have to change the inference script, or do I need to configure the ONNX converter?
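In case it helps with debugging, this is roughly the kind of export call I'm using — just a minimal sketch assuming a PyTorch ResNet; the actual input shape, tensor names, and opset in my script may differ:

```python
import torch
import torchvision

# Minimal sketch of the ONNX export, assuming a PyTorch ResNet backbone.
# The input shape, tensor names, and opset below are placeholders, not
# necessarily the exact values used to produce model.onnx.
model = torchvision.models.resnet18(num_classes=10)  # stand-in for my custom model
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)  # NCHW; adjust to the trained input size

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=11,  # pick an opset supported by the TensorRT version on your JetPack
)
```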