I am using the latest Deepstream-triton container.
I’d like to run a custom Python model in Triton Server from Deepstream.
Following the official instructions at GitHub - triton-inference-server/python_backend (Triton backend that enables pre-processing, post-processing and other logic to be implemented in Python), it seems that I need to build the Python backend for Triton, since I get the error ModuleNotFoundError: No module named 'triton_python_backend_utils'.
The setup instructions give the following cmake command:
cmake -DTRITON_ENABLE_GPU=ON -DTRITON_BACKEND_REPO_TAG=<GIT_BRANCH_NAME> -DTRITON_COMMON_REPO_TAG=<GIT_BRANCH_NAME> -DTRITON_CORE_REPO_TAG=<GIT_BRANCH_NAME> -DCMAKE_INSTALL_PREFIX:PATH=`pwd`/install ..
What values should I use for these parameters?
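From what I understand, the three `*_REPO_TAG` values should all point to the same release branch, and that branch should match the Triton release shipped inside the DeepStream-Triton container. As an illustration only (r21.08 is an assumed release tag, not necessarily the one in your container), the full build might look like this:

```shell
# Hypothetical sketch: replace r21.08 with the release branch matching the
# Triton version inside your DeepStream-Triton container. In Triton containers
# the version can usually be found in /opt/tritonserver/TRITON_VERSION.
git clone https://github.com/triton-inference-server/python_backend -b r21.08
cd python_backend
mkdir build && cd build
cmake -DTRITON_ENABLE_GPU=ON \
      -DTRITON_BACKEND_REPO_TAG=r21.08 \
      -DTRITON_COMMON_REPO_TAG=r21.08 \
      -DTRITON_CORE_REPO_TAG=r21.08 \
      -DCMAKE_INSTALL_PREFIX:PATH=`pwd`/install ..
make install
```

The key point seems to be that all three tags agree with each other and with the server version, since a mismatch between the backend and the running Triton server is a common source of load errors.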