Issue while serving a custom model using Triton Server

Hi Nvidia Team,

I am trying to serve a custom model using Triton Server on my PC (RTX 2060 GPU).
I am using the Triton container for serving.
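For context, my model repository is laid out roughly like this, with a minimal config.pbtxt (the model name and tensor names below are placeholders, not my actual ones):

```
model_repository/
└── my_model/
    ├── config.pbtxt
    └── 1/
        └── model.plan

# config.pbtxt
name: "my_model"
platform: "tensorrt_plan"
max_batch_size: 1
input [
  { name: "input", data_type: TYPE_FP32, dims: [ 3, 224, 224 ] }
]
output [
  { name: "output", data_type: TYPE_FP32, dims: [ 1000 ] }
]
```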

While serving, I am getting an error (see the attached image below).

I used the TensorRT container image to convert the model from ONNX to TensorRT.
May I know where I am going wrong? And how can I avoid this error?
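Roughly, the commands I ran look like the following sketch (paths and the container tag are placeholders, and I may be misremembering some flags):

```shell
# Convert ONNX to a TensorRT engine inside the TensorRT container
# (the engine must be built on the same GPU / TensorRT version used for serving)
trtexec --onnx=model.onnx --saveEngine=model.plan

# Launch Triton with the model repository mounted
docker run --gpus=1 --rm \
  -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v /path/to/model_repository:/models \
  nvcr.io/nvidia/tritonserver:<xx.yy>-py3 \
  tritonserver --model-repository=/models
```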

Thanks in advance