Convert SSD-mobilenetv2 to int8

I have a custom-trained, single-class SSD-MobileNetV2 detector (in .uff format) trained using TensorFlow. How do I run it in INT8 mode? I cannot use tlt-converter, as I did not use TLT to train the model.

Hi sivaishere96,
The tlt-converter tool is only compatible with TLT. Please see the TLT user guide.

You would also need to follow the full TLT process: train a TLT model, then convert it to an INT8 engine with tlt-converter, and so on.

As I mentioned above, I am using SSD-MobileNetV2, which is not supported by TLT, so I had to train the model with TensorFlow. Now I need to convert the trained model to INT8. How do I do that?

Do you mean you want to convert the UFF model into a TensorRT INT8 engine? That sounds like a TensorRT topic.

Yes, I want to convert the UFF model to an INT8 engine. It is meant to be deployed in DeepStream, which is why I asked the question here.
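For reference, the general shape of the TensorRT Python workflow for this is: parse the UFF file, attach an INT8 entropy calibrator that feeds preprocessed batches, and serialize the engine. Below is a minimal sketch assuming TensorRT 7.x Python bindings plus pycuda; the tensor names ("Input", "NMS"), the 3x300x300 shape, and the file paths are placeholders that must match your own frozen graph, not values taken from this thread:

```python
# Sketch: build a TensorRT INT8 engine from a UFF model with entropy calibration.
# TensorRT/pycuda imports are kept inside the builder function so the pure-Python
# batching helper below can be reused (and tested) without a GPU.

def chunk_batches(samples, batch_size):
    """Group preprocessed calibration samples into full batches.
    A partial trailing batch is dropped, since calibration expects
    a fixed batch size on every get_batch() call."""
    full = len(samples) // batch_size
    return [samples[i * batch_size:(i + 1) * batch_size] for i in range(full)]


def build_int8_engine(uff_path, calib_samples, batch_size=8):
    import numpy as np
    import pycuda.autoinit  # noqa: F401 -- initializes the CUDA context
    import pycuda.driver as cuda
    import tensorrt as trt

    class SSDEntropyCalibrator(trt.IInt8EntropyCalibrator2):
        def __init__(self, batches, cache_file="calibration.cache"):
            super().__init__()
            self.batches = iter(batches)
            self.cache_file = cache_file
            # One device buffer sized for a full batch of CHW float32 images.
            self.d_input = cuda.mem_alloc(
                batch_size * 3 * 300 * 300 * np.float32().nbytes)

        def get_batch_size(self):
            return batch_size

        def get_batch(self, names):
            try:
                batch = np.ascontiguousarray(
                    np.stack(next(self.batches)).astype(np.float32))
            except StopIteration:
                return None  # signals TensorRT that calibration data is exhausted
            cuda.memcpy_htod(self.d_input, batch)
            return [int(self.d_input)]

        def read_calibration_cache(self):
            try:
                with open(self.cache_file, "rb") as f:
                    return f.read()
            except FileNotFoundError:
                return None

        def write_calibration_cache(self, cache):
            with open(self.cache_file, "wb") as f:
                f.write(cache)

    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    network = builder.create_network()
    parser = trt.UffParser()
    # Input/output names and dims must match your frozen graph (placeholders).
    parser.register_input("Input", (3, 300, 300))
    parser.register_output("NMS")
    parser.parse(uff_path, network)

    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 30
    config.set_flag(trt.BuilderFlag.INT8)
    config.int8_calibrator = SSDEntropyCalibrator(
        chunk_batches(calib_samples, batch_size))
    builder.max_batch_size = batch_size

    engine = builder.build_engine(network, config)
    with open("ssd_mobilenetv2_int8.engine", "wb") as f:
        f.write(engine.serialize())
    return engine
```

The calibration cache written by `write_calibration_cache` is worth keeping: later builds (or DeepStream itself) can reuse it instead of re-running calibration.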

You can search the TensorRT forum for some useful topics.
For example,

Can the sampleUffSSD repository be used for calibrating SSD-Mobilenet-v2 to INT8? From what I have seen, it runs inference on the SSD-Inception network.

Please refer to the two samples below.
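Since the end target is DeepStream: once a calibration cache has been produced (for example by a sampleUffSSD-style calibration run), the gst-nvinfer element can build the INT8 engine from the .uff file itself. A sketch of the relevant config keys, assuming DeepStream's nvinfer property names and with all file names and blob names as placeholders:

```
[property]
uff-file=ssd_mobilenetv2.uff
uff-input-blob-name=Input
# channels;height;width;input-order (0 = NCHW)
uff-input-dims=3;300;300;0
output-blob-names=NMS
int8-calib-file=calibration.cache
# network-mode: 0=FP32, 1=INT8, 2=FP16
network-mode=1
```

The key point is that the calibration cache, not the full calibration dataset, is what DeepStream needs at deployment time.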