Model Performance Request

Hello,

Sorry if this type of post is not permitted, but I wanted to ask whether anyone with an NVIDIA Jetson Orin Nano 8GB would be willing to run inference on an image-classification model I have.

I’m willing to pay for your time; I mostly wanted to see how well the model performs before I bite the bullet and purchase the dev kit.

Cheers,
Isaac

Would anyone be interested in this?

Hi,

Sorry for the late update.
You can benchmark the model with trtexec directly.

$ sudo nvpmodel -m 0
$ sudo jetson_clocks
$ /usr/src/tensorrt/bin/trtexec --onnx=[model]
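For what it's worth, trtexec also takes flags for reduced-precision runs, which usually matter on Orin-class GPUs. A minimal sketch of the steps above plus an FP16 comparison; `model.onnx` and the engine filename are placeholders for your own files:

```shell
# Select the maximum power mode and lock clocks so the numbers are repeatable
sudo nvpmodel -m 0
sudo jetson_clocks

# Baseline benchmark from the ONNX file (model.onnx is a placeholder)
/usr/src/tensorrt/bin/trtexec --onnx=model.onnx

# FP16 run, typically much faster on Orin; also saves the built engine for reuse
/usr/src/tensorrt/bin/trtexec --onnx=model.onnx --fp16 --saveEngine=model_fp16.engine
```

At the end of a run trtexec prints throughput and latency percentiles, which is what you'd compare against your target frame rate.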

Thanks.
