Kubeflow + TensorRT

On the Kubeflow website there is no longer any reference to TensorRT or Triton.
How can I deploy an inference server that uses NVIDIA Triton/TensorRT in Kubeflow?
Is there any updated documentation from NVIDIA, perhaps internal?

I was trying to deploy RIVA with Kubeflow.
Some people have done RAPIDS with Kubeflow, but that's a different thing.

I hope the following documents help you.
If you encounter any problems, please reach out via the GitHub issues section.

Thank you.

The closest I got was this: Tensorflow - KServe Documentation Website

All the documentation and resources Triton had relating to Kubeflow are outdated or no longer exist.
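
Since KServe is the model-serving layer that ships with recent Kubeflow releases, one route is to deploy Triton through a KServe `InferenceService` rather than looking for Kubeflow-specific Triton docs. A minimal sketch, assuming a cluster with KServe installed and its default `kserve-tritonserver` ServingRuntime available; the name, namespace, and `storageUri` below are placeholders, not values from this thread:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: triton-example              # placeholder service name
spec:
  predictor:
    model:
      modelFormat:
        name: tensorrt              # a model format the Triton runtime handles
      runtime: kserve-tritonserver  # KServe's bundled Triton ServingRuntime
      storageUri: gs://your-bucket/models/your-model  # placeholder model repository
      resources:
        limits:
          nvidia.com/gpu: "1"       # request a GPU for TensorRT inference
```

Applied with `kubectl apply -f`, this asks KServe to stand up a Triton server pod pointing at the given model repository. Whether it works under a particular Kubeflow release depends on the KServe version that release bundles, so check the KServe docs matching your installed version.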