How to compile Triton Inference Server on ppc64le

I’d like to use Triton Inference Server on ppc64le with TensorFlow support.

The only available option seems to be building it with CMake. Am I right?

Configuring with:
cmake -DTRTIS_ENABLE_TENSORRT=ON -DTRTIS_ENABLE_TENSORFLOW=ON -DTRTIS_EXTRA_LIB_PATHS="/home/localuser/tensorflow-lib" ../build
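For completeness, the full sequence I'm running is below. This assumes the v1.x source layout, where the CMakeLists.txt lives in the repo's build/ subdirectory, and that "trtis" is still the aggregate make target from the build docs:

cd tensorrt-inference-server
mkdir builddir && cd builddir
# Point TRTIS_EXTRA_LIB_PATHS at the prebuilt TensorFlow libraries
cmake -DTRTIS_ENABLE_TENSORRT=ON -DTRTIS_ENABLE_TENSORFLOW=ON \
      -DTRTIS_EXTRA_LIB_PATHS="/home/localuser/tensorflow-lib" ../build
make -j$(nproc) trtis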

It looks like I need a TensorFlow library in order to build libtrtserver.so.
(I will put it in the directory "/home/localuser/tensorflow-lib".)
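My assumption (not confirmed anywhere in the docs) is that this directory only needs to hold the TensorFlow shared objects the linker should find, something like:

mkdir -p /home/localuser/tensorflow-lib
# Library names are a guess; the exact set depends on how TensorFlow was built
cp bazel-bin/tensorflow/libtensorflow_cc.so* /home/localuser/tensorflow-lib/
cp bazel-bin/tensorflow/libtensorflow_framework.so* /home/localuser/tensorflow-lib/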

So do I need to build TensorFlow on ppc64le too?
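If so, I'd expect something like the standard upstream C++ library build to be the starting point (untested on ppc64le on my side; the targets below are the stock ones from the TensorFlow repo):

git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
./configure   # answer the prompts (CUDA support, compiler paths, etc.)
# Build the C++ shared libraries that libtrtserver.so would link against
bazel build --config=opt //tensorflow:libtensorflow_cc.so //tensorflow:libtensorflow_framework.so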

I found a Docker image on Docker Hub with a TensorFlow version for ppc64le,
but I’m not sure Docker is an option for me.
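If running containers is the blocker (rather than pulling images), one workaround I'm considering is extracting the prebuilt libraries from the image on a machine that does have Docker. The image name and the library path inside the image below are guesses on my part:

docker pull ibmcom/tensorflow-ppc64le:latest
docker create --name tf-tmp ibmcom/tensorflow-ppc64le:latest
# Copy the shared libraries out of the stopped container (path is a guess)
docker cp tf-tmp:/usr/local/lib/. /home/localuser/tensorflow-lib/
docker rm tf-tmp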

Is there any special option for building TensorFlow on ppc64le for use with Triton Inference Server?
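The only ppc64le-specific knob I've found so far is not Triton-specific at all: setting POWER-appropriate optimization flags when configuring TensorFlow, since the usual x86 defaults (AVX and friends) don't apply. For example, assuming a POWER8 host (CC_OPT_FLAGS is read by TensorFlow's configure script):

# Hypothetical optimization flags for a POWER8/POWER9 host
CC_OPT_FLAGS="-mcpu=power8 -mtune=power8" ./configure
bazel build --config=opt //tensorflow:libtensorflow_cc.so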

Regards,

Hi,
@lazydogc0tvl, have you succeeded in compiling Triton Inference Server on ppc64le?

Rgds,
FM