Does Triton's config.pbtxt only accept 3-dim input layers?

The input of my PyTorch model (DenseNet-121) is [10, 3, 224, 224] for a single batch: [patch, color channel, height, width]. The output is [10, 3], a three-class prediction for each patch. What should my config.pbtxt look like? The issue I'm facing (whether I use ONNX or torch.jit) is that I have more than three dimensions, so I cannot use image_client.py. Thank you!
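For reference, here is a sketch of the kind of config I have been experimenting with. The model name is a placeholder, the tensor names assume the libtorch `INPUT__0`/`OUTPUT__0` naming convention, and `max_batch_size: 0` keeps the full 4-dim shape as a single input:

```
name: "densenet121_patch"
platform: "pytorch_libtorch"
# 0 disables Triton's batching; the full 4-dim tensor is one request
max_batch_size: 0
input [
  {
    name: "INPUT__0"       # libtorch naming convention (assumed)
    data_type: TYPE_FP32
    dims: [ 10, 3, 224, 224 ]
  }
]
output [
  {
    name: "OUTPUT__0"      # libtorch naming convention (assumed)
    data_type: TYPE_FP32
    dims: [ 10, 3 ]
  }
]
```

With this layout the input has four dimensions, which is what image_client.py rejects.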

So I tried:

As per my previous post, I get a "dim != 3" error in image_client.py, which I'm using as a base.

Moving to the Inference Server forum so that the Triton Inference Server team can take a look.