Run TensorFlow SavedModel on nvinferserver in DS

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Orin
• DeepStream Version 6.1
• JetPack Version (valid for Jetson only) 5.0
• TensorRT Version 8.
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs) questions
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)
I am trying to run a TensorFlow SavedModel within a DeepStream Python application. I first tried the model with a standalone Triton Inference Server and a gRPC client, and it worked as expected.
When migrating the model to nvinferserver, transposing the same model config keeps giving me a dimension error, even though the inputs are given in the correct shape. Can you please give me some insight into how to debug this?
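The standalone gRPC test mentioned above can be sketched roughly as follows; the input/output tensor names (`input_1`, `conv2d_out`) and the endpoint are assumptions, not taken from the actual model:

```python
# Minimal sketch of a standalone Triton gRPC test for the U-Net SavedModel.
# Tensor names "input_1"/"conv2d_out" and the URL are hypothetical placeholders.
import numpy as np

def preprocess(frame: np.ndarray) -> np.ndarray:
    """HWC uint8 RGB frame -> NHWC float32 batch of 1.

    Matches the nvinferserver preprocess above: scale_factor 1.0, zero offsets.
    """
    return frame.astype(np.float32)[np.newaxis, ...]

def infer(frame: np.ndarray, url: str = "localhost:8001", model: str = "unet"):
    import tritonclient.grpc as grpcclient  # pip install tritonclient[grpc]

    client = grpcclient.InferenceServerClient(url=url)
    batch = preprocess(frame)
    # Input name must match the SavedModel signature -- hypothetical here.
    inp = grpcclient.InferInput("input_1", batch.shape, "FP32")
    inp.set_data_from_numpy(batch)
    out = grpcclient.InferRequestedOutput("conv2d_out")  # hypothetical name
    result = client.infer(model_name=model, inputs=[inp], outputs=[out])
    return result.as_numpy("conv2d_out")
```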
The model I use is a TF2 U-Net model with an input shape of [-1, -1, 3] and an output shape of [-1, -1, 1]. My infer_config:
infer_config {
  unique_id: 5
  gpu_ids: [0]
  max_batch_size: 1
  backend {
    triton {
      model_name: "unet"
      version: -1
      model_repo {
        root: "models"
        strict_model_config: false
        tf_gpu_memory_fraction: 0.0
        tf_disable_soft_placement: 0
      }
    }
  }

  preprocess {
    network_format: IMAGE_FORMAT_RGB
    tensor_order: TENSOR_ORDER_NHWC
    maintain_aspect_ratio: 0
    frame_scaling_hw: FRAME_SCALING_HW_DEFAULT
    frame_scaling_filter: 1
    normalize {
      scale_factor: 1.0
      channel_offsets: [0, 0, 0]
    }
  }

  postprocess {
    segmentation {
    }
  }

  extra {
    copy_input_to_host_buffers: false
    output_buffer_pool_size: 2
  }
}
input_control {
  interval: 0
}
Why wouldn't this work if TRTIS is supposed to generate the proper config file automatically when one is not specified?
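One way to see what that auto-generation actually produced is to query the running Triton server for the model's config and compare its dims against the shapes nvinferserver sends. A sketch, where the model name and URL are assumptions:

```python
# Sketch: fetch the config Triton auto-generated (strict_model_config: false)
# and compare its dims with a concrete input shape. "unet" and the URL are
# assumptions matching the config above, not verified values.

def dims_match(dims, shape):
    """True if a concrete tensor shape satisfies Triton dims (-1 is a wildcard)."""
    return len(dims) == len(shape) and all(d in (-1, s) for d, s in zip(dims, shape))

def show_auto_config(url="localhost:8001", model="unet"):
    import tritonclient.grpc as grpcclient  # pip install tritonclient[grpc]

    client = grpcclient.InferenceServerClient(url=url)
    cfg = client.get_model_config(model, as_json=True)["config"]
    for tensor in cfg["input"] + cfg["output"]:
        print(tensor["name"], tensor["data_type"], tensor["dims"])
    return cfg
```

For a [-1, -1, 3] input, `dims_match([-1, -1, 3], (480, 640, 3))` should hold for any frame size, so a dimension error usually points at a rank or layout mismatch (e.g. an extra batch dimension, or NCHW vs NHWC) rather than the H/W values themselves.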

Can you provide the model?

Here is the model - Google Drive

The link is not accessible.

Sorry just updated it.

Still not accessible.

I hope this one works.

I got the model now. It will take some time to investigate the problem; we will get back to you once we reach a conclusion.


Can you provide the “config.pbtxt” you use?

config.pbtxt (1.5 KB)
Yes, here it is. I am not sure if the model name is correct, but this is the file.
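To confirm that the tensor names and shapes in config.pbtxt match the model, you can inspect the SavedModel's serving signature directly; a sketch, where the repository path is an assumption:

```python
# Sketch: print the SavedModel's serving signature to confirm the input/output
# names and dynamic dims that config.pbtxt must match. The path follows the
# usual Triton layout ("models/unet/1/model.savedmodel") and is an assumption.

def to_triton_dims(tf_shape):
    """TF shape (batch dim first) -> config.pbtxt dims: None -> -1, batch dropped."""
    return [-1 if d is None else int(d) for d in list(tf_shape)[1:]]

def show_signature(path="models/unet/1/model.savedmodel"):
    import tensorflow as tf  # imported lazily; requires a TF install

    model = tf.saved_model.load(path)
    sig = model.signatures["serving_default"]
    for name, spec in sig.structured_input_signature[1].items():
        print("input ", name, spec.dtype.name, to_triton_dims(spec.shape))
    for name, spec in sig.structured_outputs.items():
        print("output", name, spec.dtype.name, to_triton_dims(spec.shape))
```

A (None, None, None, 3) input signature maps to `dims: [-1, -1, 3]` in config.pbtxt, since Triton keeps the batch dimension out of `dims` when `max_batch_size` is set.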