I’m trying to build a DALI preprocessing pipeline very similar to the inception example, but I can’t figure out how to format the data correctly on the client side. Here is the code I currently have on the client side:
import numpy as np
import tritonclient.grpc as grpcclient

def prepare_batches(data_loader):
    batches = []
    for dataiter in data_loader:
        images, labels = dataiter['image'], dataiter['label']
        # images is a list of encoded images as strings
        images_np = np.array(images, dtype=object)
        batches.append((images_np, labels))
    return batches

def infer_batch(client, batch):
    images_np, labels = batch
    inputs = grpcclient.InferInput("x", [len(images_np)], datatype="BYTES")
    inputs.set_data_from_numpy(images_np)
    try:
        detection_response = client.infer(model_name="simple_ensemble", inputs=[inputs])
        predictions = detection_response.as_numpy('classifier')
        return predictions, labels.numpy()
    except Exception as exc:
        print(f"Exception during inference: {exc}")
        return None, None
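For reference, the DALI pipeline I serialized is essentially the one from the inception example; a rough sketch of it is below (parameter values are approximate and only meant to show the setup, my actual pipeline follows the example):

from nvidia.dali import pipeline_def, fn, types

@pipeline_def(batch_size=256, num_threads=4, device_id=0)
def preprocessing_pipe():
    # Encoded images arrive from Triton through the external source
    images = fn.external_source(device="cpu", name="DALI_INPUT_0")
    # Decode on GPU, then resize and normalize as in the inception example
    images = fn.decoders.image(images, device="mixed", output_type=types.RGB)
    images = fn.resize(images, resize_x=299, resize_y=299)
    images = fn.crop_mirror_normalize(
        images,
        dtype=types.FLOAT,
        output_layout="CHW",
        mean=[0.485 * 255, 0.456 * 255, 0.406 * 255],
        std=[0.229 * 255, 0.224 * 255, 0.225 * 255],
    )
    return images

# Serialized and placed in the DALI model's version directory
preprocessing_pipe().serialize(filename="model.dali")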
However, the client code above doesn’t work with the DALI pipeline config file from the example:
max_batch_size: 256
input [
  {
    name: "DALI_INPUT_0"
    data_type: TYPE_UINT8
    dims: [ -1 ]
  }
]
because Triton is expecting input with shape [-1, -1] but is getting data with shape [-1]:
unexpected shape for input 'x' for model 'simple_ensemble'. Expected [-1,-1], got [4].
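My current guess, based on that error, is that each encoded image has to be sent as a flat uint8 array and the whole batch padded to a common length so it forms a [batch_size, max_len] tensor. A sketch of what I mean is below (the helper names are just illustrative, and switching the input from BYTES to UINT8 is an unverified assumption on my part):

import numpy as np
import tritonclient.grpc as grpcclient

def encoded_batch_to_uint8(images):
    # Each encoded image (assumed to be raw bytes) becomes a flat uint8 array
    arrays = [np.frombuffer(img, dtype=np.uint8) for img in images]
    # Pad every sample to the longest one so the batch stacks into [batch_size, max_len]
    max_len = max(a.size for a in arrays)
    padded = np.zeros((len(arrays), max_len), dtype=np.uint8)
    for i, a in enumerate(arrays):
        padded[i, :a.size] = a
    return padded

def infer_batch_uint8(client, batch):
    images, labels = batch
    images_np = encoded_batch_to_uint8(images)
    inputs = grpcclient.InferInput("x", list(images_np.shape), datatype="UINT8")
    inputs.set_data_from_numpy(images_np)
    response = client.infer(model_name="simple_ensemble", inputs=[inputs])
    return response.as_numpy('classifier'), labels.numpy()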
What would be the correct way to format the data on the client side?