How to run a TAO YOLOv4 model in Triton Inference Server

I am training a TAO YOLOv4 model and deploying it on the Triton Inference Server, but there is no Python Triton client available for it.
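
For reference, this is roughly the kind of request I was hoping to send with the generic `tritonclient` Python package. It is only a minimal sketch: the model name `yolov4_tao`, the input tensor name `Input`, the input shape, and the output name `BatchedNMS` are placeholders I assumed, not values confirmed from my actual export or config.

```python
# Minimal sketch of a gRPC inference request with the generic Triton Python client.
# Assumptions: model served as "yolov4_tao", input tensor "Input" (FP32, NCHW 3x384x1248),
# output tensor "BatchedNMS", server listening on localhost:8001.
import numpy as np
import tritonclient.grpc as grpcclient

client = grpcclient.InferenceServerClient(url="localhost:8001")

# Dummy image batch in NCHW float32, matching the assumed input shape.
image = np.random.rand(1, 3, 384, 1248).astype(np.float32)

# Describe the input tensor and attach the data.
infer_input = grpcclient.InferInput("Input", image.shape, "FP32")
infer_input.set_data_from_numpy(image)

# Output names depend on how the TAO model was exported; "BatchedNMS" is a placeholder.
outputs = [grpcclient.InferRequestedOutput("BatchedNMS")]

response = client.infer(model_name="yolov4_tao",
                        inputs=[infer_input],
                        outputs=outputs)
print(response.as_numpy("BatchedNMS"))
```

Is something along these lines the right approach for a TAO YOLOv4 model, or is there a dedicated client/sample I should be using instead?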

Can you help?