deepstream : config for using deepstream-test1 for tensorflow

Dear All,

Can someone help me with creating a config for using a TensorFlow frozen graph in the
deepstream_sdk_on_jetson/sources/apps/sample_apps/deepstream-test1 sample app?

The current dstest1_pgie_config.txt file is written for a Caffe model.

Thanks in advance.

Hi,

This involves two steps:

1. Please convert your TensorFlow model into a UFF file first:
https://github.com/NVIDIA-AI-IOT/tf_to_trt_image_classification/blob/master/scripts/convert_plan.py#L18

2. Run DeepStream with the converted .uff model:
You can find a sample config in ${deepstream_sdk_on_jetson}/sources/objectDetector_SSD.
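
For reference, a primary-inference (pgie) config for a UFF model looks roughly like the fragment below. This is only a sketch modeled on the objectDetector_SSD sample: the file names, input dimensions, blob names, and class count are placeholders that must be replaced with your own model's values.

```
[property]
gpu-id=0
net-scale-factor=0.0078431372
model-color-format=0
# UFF-specific keys (replacing the model-file/proto-file keys used for Caffe):
uff-file=sample_model.uff
uff-input-dims=3;300;300;0
uff-input-blob-name=Input
output-blob-names=MarkOutput_0
model-engine-file=sample_model.uff_b1_fp32.engine
labelfile-path=labels.txt
batch-size=1
network-mode=0
num-detected-classes=91
```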

Thanks.

Hi AastaLLL,

For step 1, I see that I need to use this command:

python scripts/convert_plan.py data/frozen_graphs/inception_v1.pb data/plans/inception_v1.plan input 224 224 InceptionV1/Logits/SpatialSqueeze 1 0 float
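
As I understand it, the positional arguments map as sketched below. This is an assumption based on the argument order in the linked tf_to_trt_image_classification script; the dict and its key names are purely illustrative, so double-check against the script's source.

```python
# Sketch (assumed mapping): what each positional argument to
# convert_plan.py means, in the order they appear on the command line.
CONVERT_PLAN_ARGS = {
    "frozen_graph": "data/frozen_graphs/inception_v1.pb",  # input frozen .pb
    "plan_file": "data/plans/inception_v1.plan",           # output TensorRT plan
    "input_name": "input",                                  # input node name
    "input_height": 224,
    "input_width": 224,
    "output_name": "InceptionV1/Logits/SpatialSqueeze",     # output node name
    "max_batch_size": 1,
    "max_workspace_size": 0,                                # builder workspace, bytes
    "data_type": "float",                                   # "float" or "half"
}
```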

I am confused about the output node names. In my application, I use https://github.com/tensorflow/models/tree/master/research/object_detection

My model has multiple output node names.

How will the conversion work in that case?

Hi,

Sorry for the late reply.

You can specify multiple output layers as a list, like this:

import uff  # from NVIDIA's TensorRT Python bindings

# List every output node name in output_nodes:
uff_model = uff.from_tensorflow_frozen_model(
    frozen_file=frozen_graph_filename,
    output_nodes=[output_name1, output_name2, ...],
    output_filename=TMP_UFF_FILENAME,
    text=False,
)
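
For TF Object Detection API frozen graphs, the outputs are usually the four detection tensors below; a small sketch of collecting them into the keyword arguments (the helper function name is mine, and you should verify the node names against your own graph, e.g. with TensorFlow's summarize_graph tool):

```python
# Usual output node names of TF Object Detection API frozen graphs
# (an assumption; confirm against your specific exported model).
OUTPUT_NODES = [
    "num_detections",
    "detection_boxes",
    "detection_scores",
    "detection_classes",
]

def uff_conversion_kwargs(frozen_graph, uff_out):
    """Hypothetical helper: gather the keyword arguments that would be
    passed to uff.from_tensorflow_frozen_model()."""
    return dict(
        frozen_file=frozen_graph,
        output_nodes=OUTPUT_NODES,
        output_filename=uff_out,
        text=False,
    )
```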

Thanks.