Need help authoring a model configuration for PyTorch MNIST

Hi, I am trying to host my first model in Triton.

I ran the MNIST model from main.py in the pytorch/examples repository on GitHub.

I have saved the PyTorch model in TorchScript format. I tried the following config, but I get an error saying the mnist model failed (bad request):

platform: "pytorch_libtorch"
max_batch_size: 0
input [
  {
    name: "INPUT__0"
    data_type: TYPE_FP32
    dims: [ 28,28 ]
  }
]
output [
  {
    name: "output__0"
    data_type: TYPE_FP32
    dims: [ 10 ]
  }
]
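
In case the export step matters: I saved the TorchScript file with something like this (a minimal sketch; it assumes the Net class from the example and the mnist_cnn.pt checkpoint that main.py --save-model writes):

import os
import torch
from main import Net  # the Net module defined in the pytorch/examples MNIST script

# Rebuild the model and load the trained weights (mnist_cnn.pt is what
# main.py --save-model writes; adjust the name if yours differs).
model = Net()
model.load_state_dict(torch.load("mnist_cnn.pt", map_location="cpu"))
model.eval()

# Trace with a dummy input of the shape the model expects: [batch, 1, 28, 28].
example = torch.randn(1, 1, 28, 28)
traced = torch.jit.trace(model, example)

# Triton looks for the file at <model_repository>/<model_name>/<version>/model.pt.
os.makedirs("model_repository/mnist/1", exist_ok=True)
traced.save("model_repository/mnist/1/model.pt")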

What am I missing?

Any suggestions?

Please re-post your question on the Triton Inference Server project on GitHub; the NVIDIA team and others will be able to help you there.
Sorry for the inconvenience, and thanks for your patience.
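
For anyone who finds this later: the dims are the likely culprit. With max_batch_size: 0, batching is disabled and dims must describe the full tensor the model receives, but the Net in the pytorch/examples MNIST script expects a 4-D FP32 input of shape [N, 1, 28, 28], not [28, 28]. Below is a sketch that enables batching instead (assumptions: the model directory is named mnist and the network is the unmodified example Net; with max_batch_size > 0, Triton adds the batch dimension itself, so dims describe a single sample; the INPUT__0/OUTPUT__0 names follow the libtorch backend's <name>__<index> convention):

name: "mnist"
platform: "pytorch_libtorch"
max_batch_size: 8
input [
  {
    name: "INPUT__0"
    data_type: TYPE_FP32
    dims: [ 1, 28, 28 ]
  }
]
output [
  {
    name: "OUTPUT__0"
    data_type: TYPE_FP32
    dims: [ 10 ]
  }
]

If batching is not wanted, keeping max_batch_size: 0 should also work with the full shapes spelled out, e.g. dims: [ 1, 1, 28, 28 ] for the input and [ 1, 10 ] for the output.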