Jetson Inference (Imagenet)

I have a trained model in MATLAB. If I convert it to an ONNX model, would I be able to test it and get results on an NVIDIA Jetson Nano through a live camera feed? My Jetson has a camera mounted on it.
I previously trained ResNet-18 with PyTorch, converted it into ONNX format, and was able to get results on a live feed on the Jetson Nano.
The command I used is as follows:
imagenet --model=models/resnet18.onnx --labels=data/labels.txt --input_blob=input --output_blob=output --video-viewer csi://0
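
For reference, this is roughly what that command does through the jetson-inference Python API (a minimal sketch, not verified on-device; it assumes the jetson_inference and jetson_utils Python bindings are installed and uses the same paths and blob names as the command above):

```python
# Rough Python equivalent of the imagenet command above (sketch only).
from jetson_inference import imageNet
from jetson_utils import videoSource, videoOutput

# Load the custom ONNX classifier; --input_blob/--output_blob must match
# the tensor names stored inside the ONNX file.
net = imageNet(argv=["--model=models/resnet18.onnx",
                     "--labels=data/labels.txt",
                     "--input_blob=input",
                     "--output_blob=output"])

camera = videoSource("csi://0")        # MIPI CSI camera
display = videoOutput("display://0")   # on-screen window

while display.IsStreaming():
    img = camera.Capture()
    if img is None:                    # capture timeout, try again
        continue
    class_id, confidence = net.Classify(img)
    display.Render(img)
    display.SetStatus("{:s} ({:.1f}%)".format(net.GetClassDesc(class_id),
                                              confidence * 100))
```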

My question now: I have trained an Inception-v3 model in MATLAB and want to export it to ONNX, put it on the Jetson Nano, and get results the same way. However, the page below suggests that jetson-inference (imagenet) only supports Inception v4 and not Inception v3 (link and screenshot attached).
jetson-inference/docs/imagenet-console-2.md at master · dusty-nv/jetson-inference · GitHub.

Is there a way I can use an Inception-v3 model in ONNX format to get results on the live feed?
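
One detail I assume matters here: --input_blob and --output_blob have to match the tensor names that MATLAB's exportONNXNetwork writes into the file, and those may differ from the "input"/"output" names of my PyTorch export. A quick sketch for checking them with the onnx Python package ("inceptionv3.onnx" is just a placeholder file name):

```python
# Print the input/output tensor names of an exported ONNX model.
# Requires `pip install onnx`; the file name below is a placeholder.
import onnx

model = onnx.load("inceptionv3.onnx")
print("inputs: ", [t.name for t in model.graph.input])
print("outputs:", [t.name for t in model.graph.output])
```

Those names would then be what I pass as --input_blob and --output_blob to imagenet.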

Please guide me.

Hi,

Please find the instructions below for using jetson-inference with a custom network:

Thanks.

Thank you, AastaLLL.
