Port Resnet50.pb to be used in Deepstream 5.0?

Hi, I have a Resnet50.pb. Kindly give me the easiest steps to port it to the DeepStream SDK, so that I can test it in a sample pipeline with the nvinfer plugin.

• Hardware Platform (Jetson / GPU): T4
• DeepStream Version: 5.0
• JetPack Version (valid for Jetson only):
• TensorRT Version: 7.0
• NVIDIA GPU Driver Version (valid for GPU only): 450.36.06

@GalibaSashi

I think I have already answered the same question here: SSD_Mobilenet_V2.pb

Are you planning to build a pipeline with SSD_Mobilenet_V2.pb as a detector and Resnet50.pb as a classifier?

Hi @ersheng,
Yes, I am planning to build a pipeline with SSD_Mobilenet_V2.pb as the detector and Resnet50.pb as the classifier, but first I would like to try them standalone. Also, is conversion to ONNX the best method for deploying in DeepStream? If so, can you tell me the steps to convert the .pb file to a supported ONNX file?
Thanks in advance,
GokuDDG

@GalibaSashi

ONNX is becoming a bridge between different DL frameworks and inference targets. Accordingly, TensorRT is following this trend and focusing on enhancing and maintaining the ONNX-to-TensorRT engine conversion path.

See the following links for more detailed info about ONNX conversion:
https://elinux.org/TensorRT/ONNX
https://github.com/onnx/tensorflow-onnx

```shell
python -m tf2onnx.convert \
  --graphdef $WORK/$MODEL/frozen_inference_graph.pb \
  --output $WORK/$MODEL.frozen.onnx \
  --fold_const --opset 10 \
  --inputs image_tensor:0 \
  --outputs num_detections:0,detection_boxes:0,detection_scores:0,detection_classes:0
```
How do I set the correct `--fold_const` and `--opset` values while converting to ONNX format?

Hi @kayccc @ersheng
Kindly do help me out.

You can enable `--fold_const` and use `--opset 11`, as TRT 7.0 supports opset 11 - https://github.com/onnx/onnx-tensorrt/blob/7.0/operators.md
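Putting that together, the earlier conversion command with opset 11 would look like this (a sketch; `$WORK` and `$MODEL` are placeholders for your own paths, as in the command above):

```shell
python -m tf2onnx.convert \
  --graphdef $WORK/$MODEL/frozen_inference_graph.pb \
  --output $WORK/$MODEL.frozen.onnx \
  --fold_const --opset 11 \
  --inputs image_tensor:0 \
  --outputs num_detections:0,detection_boxes:0,detection_scores:0,detection_classes:0
```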

Hi @mchi,
I am not able to parse the ONNX model correctly and no bounding boxes are shown in the video. I have shared the files with you privately. Kindly do help.

What have you done so far? We would encourage users to try what they can on their own first.

Thanks!

Hi @mchi,
Actually, I converted the models to ONNX and used a custom bbox parser .cpp file to try to get the proper output, but unfortunately I failed. Can you please try it from your end?
Thanks. Kindly note these models are not the standard available ones; they are a bit custom.

Hi @mchi,
Kindly do help

Hi @GalibaSashi
Please try what you can do first, e.g. verify your Resnet50.pb with TF, print the bbox values in the parser, and check the input to your DeepStream pipeline.
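One quick check you can do yourself is to confirm that TensorRT can parse the ONNX file at all, independently of DeepStream. `trtexec` ships with TensorRT; a sketch (the binary location and the model filename are assumptions, adjust to your install and paths):

```shell
# Try building a TensorRT engine directly from the ONNX file.
# If this fails, the problem is in the ONNX conversion, not in the
# DeepStream config or the bbox parser.
/usr/src/tensorrt/bin/trtexec --onnx=model.frozen.onnx --explicitBatch
```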

Actually I meant SSD_Mobilenet_V2.pb; I have tried it on my end and it works in plain TensorFlow. Resnet50.pb is a classification model. The changing values when reading the buffer also appear with a VGG model and with the SSD_Mobilenet_V2 I shared. In the buffer output of the layers I am getting negative values. Can you point out what I did wrong? I have done everything suggested in the forums. If possible, can you try it from your end?
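Regarding the negative values: negative numbers in a detection head's raw output are not necessarily wrong (box offsets and logits can be negative), but the results will be garbage if nvinfer's preprocessing does not match what the model saw during training. nvinfer scales each input pixel as y = net-scale-factor * (x - offsets). For an SSD MobileNet trained with inputs normalized to [-1, 1], the relevant part of the config would look like this (a sketch; the exact scale, offsets, and color format depend on how your custom model was trained):

```
[property]
onnx-file=model.frozen.onnx
# y = net-scale-factor * (x - offsets); this maps [0, 255] to [-1, 1]
net-scale-factor=0.00784313725490196
offsets=127.5;127.5;127.5
# 0 = RGB, 1 = BGR; must match the training pipeline
model-color-format=0
```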