Training Mask RCNN Model for use on the AGX Orin Development Kit

Hi, I originally asked this question on a different forum but was redirected here, so I'll repost my question.

I currently have an AGX Orin Developer Kit and am trying to evaluate my options and figure out what's best for my situation. I apologize in advance for any misunderstandings; I am new to deep learning and trying to learn as I go.

My current plan is to train a Mask RCNN model and use it to run inference for a specific case. I don't want to dive into any specifics, but the model will run inference on a new image every 5 seconds (for example), and the inference output from each image will then be used for another task.
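The "new image every 5 seconds" part can be sketched with the standard library alone, independent of whichever inference runtime ends up being used. In this sketch, `infer` and `handle_output` are placeholder names for the model call and the downstream task, not part of any real API:

```python
import time

def run_periodic_inference(infer, handle_output, interval_s=5.0, max_iterations=None):
    """Call infer() every interval_s seconds and pass its result to handle_output().

    The deadline advances by a fixed interval each cycle, so time spent inside
    infer() does not make the schedule drift.
    """
    iterations = 0
    next_deadline = time.monotonic()
    while max_iterations is None or iterations < max_iterations:
        result = infer()       # placeholder: run the Mask RCNN model on the newest image
        handle_output(result)  # placeholder: feed the output to the other task
        iterations += 1
        next_deadline += interval_s
        sleep_for = next_deadline - time.monotonic()
        if sleep_for > 0:
            time.sleep(sleep_for)
```

The fixed-deadline scheduling (rather than a plain `time.sleep(5)` at the end of each loop) keeps the cadence stable even when a single inference call runs long.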

To train this model, I plan to use a cloud service such as Google Cloud to run the TAO Toolkit. I've created a VM and have been able to train a model successfully (just by running through the Jupyter notebooks provided). But I was also wondering: can I train a Mask RCNN model on my AGX Orin without using TAO? From what I understand, TAO cannot be run on the AGX Orin Developer Kit. So am I able to use the developer kit and run Python code to train a Mask RCNN model? I'd prefer TAO, but I have been running into availability issues with the Google Cloud GPUs recently… I may try another VM service.
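For context, the notebook workflow boils down to a couple of TAO launcher commands. This is a hedged sketch: subcommand and flag names have changed between TAO releases, and `$KEY`, `$SPECS_DIR`, and `$RESULTS_DIR` are placeholders you define yourself, so verify against the docs for the version you installed:

```shell
# Train on the cloud VM (requires an NVIDIA dGPU, which is why this
# step does not run on the Orin):
tao mask_rcnn train -e $SPECS_DIR/maskrcnn_train_resnet50.txt \
                    -d $RESULTS_DIR -k $KEY --gpus 1

# Export the trained checkpoint so it can be deployed on the Jetson:
tao mask_rcnn export -m $RESULTS_DIR/model.tlt \
                     -e $SPECS_DIR/maskrcnn_train_resnet50.txt -k $KEY
```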

I am going to write this next question assuming I have trained the model with the TAO Toolkit. I now believe I can deploy the model. Do I have to use DeepStream to deploy it? I used the SDK Manager to flash my development kit with the latest version of JetPack and DeepStream. Aside from DeepStream, are there other ways to deploy the model and use it for inference in the manner described above?

I once again apologize as these may be basic questions. I am just trying to get my footing with training and deploying models.

Thanks,
Andrew

No, TAO currently does not support training on Jetson devices. But after training, you can run inference on Jetson devices.

A TAO model can be deployed on Jetson Orin. As mentioned above, though, TAO does not support training on Jetson devices.

Aside from DeepStream, there are other ways. For example, you can leverage the TAO source code to write your own standalone inference code that runs against the .hdf5 file or the TensorRT engine.
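Whichever route you take for standalone inference (TensorRT engine via its Python API, or code adapted from the TAO sources), you will need your own post-processing around the raw network outputs. A minimal pure-Python sketch of typical Mask RCNN post-processing is below; the function name, input layout, and thresholds are illustrative assumptions, not part of any NVIDIA API:

```python
def postprocess_detections(boxes, scores, masks,
                           score_threshold=0.5, mask_threshold=0.5):
    """Filter raw Mask RCNN outputs.

    boxes:  list of [x1, y1, x2, y2] per detection
    scores: list of confidence values per detection
    masks:  list of 2D float mask arrays (nested lists), one per detection

    Returns the detections whose score clears score_threshold, with each
    mask binarized at mask_threshold.
    """
    kept = []
    for box, score, mask in zip(boxes, scores, masks):
        if score < score_threshold:
            continue  # drop low-confidence detections
        binary_mask = [[1 if p >= mask_threshold else 0 for p in row]
                       for row in mask]
        kept.append({"box": box, "score": score, "mask": binary_mask})
    return kept
```

In a real pipeline the mask arrays would come from the engine's output buffers (e.g. as NumPy arrays), but the filtering and binarization logic is the same.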