How to run a model generated through TLT/TAO on CPU

Hi @Morganh

I have trained a model using TLT and I want to run it on a CPU. How can I run that model on a CPU? Is there any Python script available for this?

Thanks.

Sorry, TLT/TAO does not support it. At least one GPU is required to run any task.

Okay thanks.

Is there any way to convert a .tlt model into a TensorFlow model and run inference with TensorFlow on a CPU?

Refer to Frequently Asked Questions - NVIDIA Docs

Is it possible to export a custom trained .tlt (or .etlt) model to a conventional TensorFlow (TF) frozen inference graph (.pb) to make inferences with traditional TF tools?

No, this is currently not supported.

Okay, thanks.