I am trying to run the Triton Inference Server, but when I run the client-side script I hit the error below.
tritonclient.utils.InferenceServerException: Input must set only one of the following fields: 'data', 'binary_data_size' in 'parameters', 'shared_memory_region' in 'parameters'. But no field is set
Full error:
File "/workspace/mmpose/mmpose/apis/inference.py", line 301, in _inference_single_pose_model
model_version=model_version, inputs=[input0], outputs=[output0])
File "/root/anaconda3/envs/mmdeploy/lib/python3.7/site-packages/tritonclient/http/__init__.py", line 1418, in infer
_raise_if_error(response)
File "/root/anaconda3/envs/mmdeploy/lib/python3.7/site-packages/tritonclient/http/__init__.py", line 65, in _raise_if_error
raise error
tritonclient.utils.InferenceServerException: Input must set only one of the following fields: 'data', 'binary_data_size' in 'parameters', 'shared_memory_region' in 'parameters'. But no field is set
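For context, the failing call boils down to the standard tritonclient.http request pattern. Below is a simplified sketch of it, not my exact code: a zero tensor stands in for the real preprocessed image, and the server URL localhost:8000 is an assumption.

import numpy as np
import tritonclient.http as httpclient

# Assumed server address; the real script reads it from the mmpose deploy config.
client = httpclient.InferenceServerClient(url="localhost:8000")

# Build the request; in the real code a preprocessed image crop is fed here,
# a zero tensor is used only for this sketch.
input0 = httpclient.InferInput("input", [3, 256, 192], "FP32")
input0.set_data_from_numpy(np.zeros((3, 256, 192), dtype=np.float32))
output0 = httpclient.InferRequestedOutput("output")

result = client.infer(model_name="mmpose_mobilenet_onnx",
                      inputs=[input0],
                      outputs=[output0])
heatmaps = result.as_numpy("output")

As far as I can tell, set_data_from_numpy is what should populate the 'data' / 'binary_data_size' field that the error says is missing.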
This is what my model config file looks like:
name: "mmpose_mobilenet_onnx"
platform: "onnxruntime_onnx"
max_batch_size: 0
input [
  {
    name: "input"
    data_type: TYPE_FP32
    format: FORMAT_NCHW
    dims: [ 3, 256, 192 ]
    reshape { shape: [ 1, 3, 256, 192 ] }
  }
]
output [
  {
    name: "output"
    data_type: TYPE_FP32
    dims: [ 17, 64, 48 ]
    reshape { shape: [ 1, 17, 64, 48 ] }
    label_filename: "labels.txt"
  }
]
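In case it is relevant, the configuration and metadata the server actually reports for the model can be queried like this (sketch; it assumes the default HTTP port 8000):

import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")
# Fetch the effective config and metadata of the deployed model to
# compare against the config file above.
print(client.get_model_config("mmpose_mobilenet_onnx"))
print(client.get_model_metadata("mmpose_mobilenet_onnx"))

The reported input/output names and shapes match the file above, so I am not sure where the missing-data error comes from. Any pointers would be appreciated.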