How do I start the Triton server after building a custom Windows tritonserver image?

Building the Windows-based Triton server image.

Building Dockerfile.win10.min for Triton server version 22.11 did not work, because the base image required to build the server image was not available for download.

To build the image, I downgraded the Triton server version to 22.10. I also needed to download the appropriate cuDNN and TensorRT versions for the build.

The base image for the 22.10 server version built successfully with the command below.

docker build -t win10-py3-min -f Dockerfile.win10.min .

Once the base image was built, I used the command below to build the Triton server image, with the appropriate container tag and the required backends.

python build.py --cmake-dir=<path/to/repo>/build --build-dir=/tmp/citritonbuild --no-container-pull --image=base,win10-py3-min --enable-logging --enable-stats --enable-tracing --enable-gpu --endpoint=grpc --endpoint=http --repo-tag=common: --repo-tag=core: --repo-tag=backend: --repo-tag=thirdparty: --backend=ensemble --backend=tensorrt: --backend=onnxruntime: --backend=openvino:

I tried different arguments with this command, but building the Triton server image was unsuccessful due to issues related to CMake and RapidJSON.

Some issues on GitHub suggested trying the latest stable release, 22.12. Its base image required the same OS as 22.11, so to build the base image I used the Dockerfile.win10.min file from the 22.10 version, which had worked earlier.

After trying different arguments, I was able to build the tritonserver image. However, the initially built images did not contain tritonserver.exe, which is required to start the server. Eventually I was able to build an image that included the required tritonserver.exe.

I built that image with the command below.

python build.py -v --no-container-pull --image=base,win10-py3-min --enable-logging --enable-stats --enable-tracing --enable-gpu --endpoint=grpc --endpoint=http --repo-tag=common:r22.12 --repo-tag=core:r22.12 --repo-tag=backend:r22.12 --repo-tag=thirdparty:r22.12 --backend=ensemble

To start the Triton server, the local model repository needs to be mounted into the container. The command to start the server is below.

docker run -it -v C:/Users/Desktop/model_repository:C:/opt/tritonserver/models tritonserver:latest bin/tritonserver.exe --model-repository=C:/opt/tritonserver/models
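A common reason the server fails to start is a model repository that does not follow the layout Triton expects (each model in its own directory, with at least one numeric version subdirectory containing the model artifact). Below is a minimal sketch, not part of the build procedure above, that checks a local repository before mounting it; the function name and the checks performed are my own illustration of the documented layout.

```python
# Sketch: sanity-check a local model repository against Triton's documented
# layout (<repo>/<model_name>/<version>/<artifact>) before mounting it into
# the container. The function name and error strings are illustrative.
from pathlib import Path

def check_model_repository(repo: str) -> list[str]:
    """Return a list of layout problems found; an empty list means it looks OK."""
    root = Path(repo)
    if not root.is_dir():
        return [f"{repo} is not a directory"]
    problems = []
    models = [d for d in root.iterdir() if d.is_dir()]
    if not models:
        problems.append("repository contains no model directories")
    for model in models:
        # Each model needs at least one numeric version subdirectory, e.g. "1".
        versions = [d for d in model.iterdir() if d.is_dir() and d.name.isdigit()]
        if not versions:
            problems.append(f"{model.name}: no numeric version directory (e.g. '1')")
        for version in versions:
            if not any(version.iterdir()):
                problems.append(f"{model.name}/{version.name}: version directory is empty")
    return problems
```

Running this against `C:/Users/Desktop/model_repository` before the `docker run` above can rule out an empty or mis-structured repository as the cause.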

While starting the Triton server, I get the following error:

failed to resize tty, using default size
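In my understanding, "failed to resize tty" is often a message from Docker's terminal handling rather than a fatal Triton error, so the server may still be coming up behind it. One way to check, assuming Triton's default HTTP port 8000 was published (e.g. with `-p 8000:8000`, which the `docker run` above does not yet do), is to poll the standard readiness endpoint; this is a sketch, and the host/port values are assumptions.

```python
# Sketch: poll Triton's HTTP readiness endpoint to see whether the server
# actually started despite the tty message. Assumes the default HTTP port
# 8000 is published from the container; adjust host/port as needed.
from urllib.request import urlopen
from urllib.error import URLError

def ready_url(host: str = "localhost", port: int = 8000) -> str:
    # /v2/health/ready is Triton's standard readiness endpoint (KServe v2 API).
    return f"http://{host}:{port}/v2/health/ready"

def is_server_ready(host: str = "localhost", port: int = 8000,
                    timeout: float = 2.0) -> bool:
    """Return True only if the readiness endpoint answers with HTTP 200."""
    try:
        with urlopen(ready_url(host, port), timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        # Connection refused / timeout: server not reachable (yet).
        return False
```

If this returns True, the tty message can likely be ignored; if it stays False, the real failure reason should be in the container logs (`docker logs <container>`).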