State custom initialization on Triton Inference Server

Description

I’d like to initialize the state of my model with a custom tensor on Triton Inference Server.
I don’t know the structure of the data file used for state initialization; also, my model is a TensorRT engine.
Please help me.

Environment

TensorRT Version: 8.6.1
GPU Type: A100 40G
Nvidia Driver Version:
CUDA Version: 12.2
CUDNN Version: 8.9
Operating System + Version: Debian 11
Python Version (if applicable): 3.9
PyTorch Version (if applicable): 2.2.0

Please refer to the implicit state management section of the Triton model configuration documentation, which may help you.
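
As a rough illustration only (not an official recipe), the sketch below shows one common way to prepare the initial-state data file: the config.pbtxt declares the state inside sequence_batching with an initial_state block that points at a data_file, and the file itself holds the raw tensor bytes in row-major order, matching the dtype and dims declared there. All names, shapes, paths, and the exact on-disk location of the file are assumptions; please confirm them against the implicit state management documentation before relying on them.

```python
# Sketch: writing a custom initial-state data file for Triton's implicit
# state management. Assumptions (verify against the Triton docs): the file
# contains the raw tensor bytes in row-major order, its dtype/shape must
# match the initial_state block in config.pbtxt, and it is typically placed
# in an "initial_state" sub-directory of the model's repository directory.
#
# Corresponding config.pbtxt fragment (all names below are placeholders):
#
#   sequence_batching {
#     state [
#       {
#         input_name: "INPUT_STATE"
#         output_name: "OUTPUT_STATE"
#         data_type: TYPE_FP32
#         dims: [ 128 ]
#         initial_state: {
#           data_type: TYPE_FP32
#           dims: [ 128 ]
#           data_file: "custom_state.bin"
#           name: "custom initial state"
#         }
#       }
#     ]
#   }

import os
import numpy as np

MODEL_DIR = "model_repository/my_tensorrt_model"  # hypothetical model directory
STATE_SHAPE = (128,)                               # must match dims above
STATE_DTYPE = np.float32                           # must match data_type above

# Build the custom initial-state tensor (arbitrary example values here).
initial_state = np.linspace(0.0, 1.0, num=STATE_SHAPE[0], dtype=STATE_DTYPE)

# Write the raw bytes (C/row-major order, no header) to the data file.
out_dir = os.path.join(MODEL_DIR, "initial_state")
os.makedirs(out_dir, exist_ok=True)
initial_state.tofile(os.path.join(out_dir, "custom_state.bin"))
```

If this does not match your setup, the model configuration reference in the triton-inference-server/server repository documents the exact fields and file placement.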

Please open an issue at Issues · triton-inference-server/server · GitHub if you need any further help.

Thank you.

Thanks for your reply. I have just reported the issue on GitHub.