Triton Error: UNAVAILABLE: Invalid argument: unable to load model 'pose_classifier_tensorrt', configuration expects 2 inputs, model provides 1

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU): A10
• Triton Version: 2.24.0
• Issue Type (questions, new requirements, bugs): Question
• How to reproduce the issue? (This is for bugs. Include which sample app is being used and the configuration files.)

I am not sure if this is a bug, a typo, or something else. When Triton attempts to load our model with the config.pbtxt below, it reports that the configuration contains 2 inputs, as shown in the following error.

| Model | Version | Status |
| --- | --- | --- |
| pose_classifier_tensorrt | 2 | UNAVAILABLE: Invalid argument: unable to load model 'pose_classifier_tensorrt', configuration expects 2 inputs, model provides 1 |

As the error correctly states, the model provides only 1 input. We cannot figure out where Triton is finding a second input in this file.

name: "pose_classifier_tensorrt"
platform: "tensorflow_savedmodel"
max_batch_size: 0
input [
  {
    name: "input_0"
    data_type: TYPE_FP32
    format: FORMAT_NCHW
    dims: [ 3, 224, 224 ]
    reshape { shape: [ 1, 224, 224, 3 ] }
  }
]
output [
  {
    name: "predictions"
    data_type: TYPE_FP32
    dims: [ -1, 3 ]
    reshape { shape: [ -1, 3 ] }
  }
]
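
One way to narrow this down is to inspect what the SavedModel itself actually exposes and compare that against the config above. The sketch below is a minimal diagnostic, assuming the standard Triton repository layout; the model_repository/pose_classifier_tensorrt/2/model.savedmodel path is a guess based on the version shown in the status table, so adjust it to your setup.

```python
# Minimal diagnostic sketch: print the SavedModel's serving signature so it
# can be compared against what config.pbtxt declares. The directory path is
# an assumption based on the standard Triton repository layout.
import tensorflow as tf

SAVED_MODEL_DIR = "model_repository/pose_classifier_tensorrt/2/model.savedmodel"

model = tf.saved_model.load(SAVED_MODEL_DIR)
sig = model.signatures["serving_default"]

# structured_input_signature is (args, kwargs); the kwargs dict maps each
# named input to a TensorSpec. Each entry is one input the model provides.
print("Inputs:")
for name, spec in sig.structured_input_signature[1].items():
    print(f"  {name}: dtype={spec.dtype.name}, shape={spec.shape}")

print("Outputs:")
for name, spec in sig.structured_outputs.items():
    print(f"  {name}: dtype={spec.dtype.name}, shape={spec.shape}")
```

Equivalently, TensorFlow's `saved_model_cli show --dir <dir> --all` prints the same signature information from the shell. If the SavedModel really does expose a single input, the mismatch would point at how Triton is parsing the configuration, which supports raising it on the Triton issue tracker as suggested below.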

This issue would be outside of DeepStream. Triton is open source; you might ask in the Triton issue channel: Issues · triton-inference-server/server · GitHub

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.