How to load a model into Graph Composer?

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 7.0
• JetPack Version (valid for Jetson only)
• TensorRT Version 8.5
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs) question

I have trained a ResNet on my data and have a .pb file with its weights. I want to know if there is a way to use that model in Graph Composer the same way I can use the sample models from DeepStream test apps 1 and 2. I know that I probably have to convert it to ONNX format first.
I have already seen similar questions on this forum, but I still have no idea how to do that. Most of them referred to the PeopleNet TAO model or something similar.
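For the conversion step, a common route is the tf2onnx tool. The sketch below assumes a frozen graph; the file names and the tensor names (`input:0`, `predictions:0`) are placeholders that must be replaced with the actual input/output tensor names of your ResNet graph:

```shell
pip install tf2onnx

# Convert a frozen TensorFlow graph (.pb) to ONNX.
# --inputs/--outputs take the graph's tensor names, which you can
# inspect with a tool such as Netron.
python -m tf2onnx.convert \
    --graphdef resnet.pb \
    --inputs input:0 \
    --outputs predictions:0 \
    --output resnet.onnx

# If you have a TensorFlow SavedModel directory instead of a frozen
# graph, use --saved-model (no tensor names needed):
# python -m tf2onnx.convert --saved-model ./saved_model --output resnet.onnx
```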

ONNX models are supported.
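Once you have the ONNX file, you point the Gst-nvinfer plugin at it through its configuration file. A minimal sketch for an image classifier is below; the file names, threshold, and batch size are placeholders for your setup:

```ini
# Minimal nvinfer config sketch for an ONNX classifier model.
[property]
gpu-id=0
onnx-file=resnet.onnx
# nvinfer builds a TensorRT engine from the ONNX file on first run
# and caches it at this path for later runs:
model-engine-file=resnet.onnx_b1_gpu0_fp16.engine
labelfile-path=labels.txt
batch-size=1
network-mode=2            # 0=FP32, 1=INT8, 2=FP16
network-type=1            # 1=classifier
process-mode=1            # 1=full frame, 2=detected objects
classifier-threshold=0.5
```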

What files do I have to export, and where do they go? What do I have to do to see the new model in Graph Composer, just like I can see the sample models? I want my trained model to show up there.

You can use the NvDsInferVideo extension; see NvDsInferenceExt in the DeepStream 6.4 documentation.
Set the correct nvinfer configuration file path in the component's properties (the original reply included a screenshot with the relevant field circled in red).
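In the graph file that Graph Composer saves, this ends up as a parameter on the NvDsInferVideo component. A hypothetical fragment is sketched below; the component name and the exact parameter key should be verified against the NvDsInferenceExt reference for your DeepStream version:

```yaml
# Hypothetical Graph Composer graph fragment: an inference component
# pointing at the nvinfer configuration file for the custom ResNet.
- name: infer
  type: nvidia::deepstream::NvDsInferVideo
  parameters:
    # Path and parameter name are assumptions for illustration:
    config-file-path: ./config_infer_resnet.txt
```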
