I have been running the TensorRT Inference Server successfully with the following command:
nvidia-docker run --rm --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p9000:8001 -p8002:8002 -v /my/model/repo:/models nvcr.io/nvidia/tensorrtserver:19.05-py3 trtserver --model-store=/models
Now I am trying to run it using Docker Compose. My docker-compose.yml follows:
version: '2.3'
services:
  inference_server:
    image: nvcr.io/nvidia/tensorrtserver:19.05-py3
    runtime: nvidia
    volumes:
      - /my/model/repo:/models
    ports:
      - 8000:8000
      - 8002:8002
    command: ["trtserver --model-store=/models"]
    shm_size: 1g
    ulimits:
      memlock: -1
      stack: 67108864
When I try to run it with docker-compose up, I get the following output:
Recreating processing_host_inference_server_1 ... done
Attaching to processing_host_inference_server_1
inference_server_1 |
inference_server_1 | ===============================
inference_server_1 | == TensorRT Inference Server ==
inference_server_1 | ===============================
inference_server_1 |
inference_server_1 | NVIDIA Release 19.06 (build 6791108)
inference_server_1 |
inference_server_1 | Copyright (c) 2018-2019, NVIDIA CORPORATION. All rights reserved.
inference_server_1 | Copyright 2019 The TensorFlow Authors. All rights reserved.
inference_server_1 | Copyright 2019 The TensorFlow Serving Authors. All rights reserved.
inference_server_1 | Copyright (c) 2016-present, Facebook Inc. All rights reserved.
inference_server_1 |
inference_server_1 | Various files include modifications (c) NVIDIA CORPORATION. All rights reserved.
inference_server_1 | NVIDIA modifications are covered by the license terms that apply to the underlying
inference_server_1 | project or file.
inference_server_1 |
inference_server_1 | /opt/tensorrtserver/nvidia_entrypoint.sh: line 88: /opt/tensorrtserver/trtserver --model-store=/models/: No such file or directory
processing_host_inference_server_1 exited with code 1
I have accessed the container's shell using docker-compose run inference_server sh, and the model repository is mounted at /models and contains the correct files.
Is there something I am overlooking that is causing this error?
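One detail I noticed in the error message: the whole string trtserver --model-store=/models shows up as a single path, as if it were being passed as one argv entry. A minimal Python sketch of the difference between the two list shapes (purely illustrative; this is not Compose's actual parsing code):

```python
# Exec (list) form with one element: argv[0] is the entire string,
# so something would look up a program literally named
# "trtserver --model-store=/models".
one_element = ["trtserver --model-store=/models"]
assert len(one_element) == 1
assert one_element[0] == "trtserver --model-store=/models"

# Splitting into separate elements gives the usual argv layout:
# argv[0] is the executable, argv[1] the flag.
split_elements = ["trtserver", "--model-store=/models"]
assert split_elements[0] == "trtserver"
assert split_elements[1] == "--model-store=/models"
```

I am not sure whether this distinction is the cause here, but it matches the shape of the "No such file or directory" message above.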
My system specs follow:
Operating system: Ubuntu 18.04
RAM: 32GB
Docker version: Docker version 19.03.1, build 74b1e89
Docker compose version: docker-compose version 1.24.1, build 4667896b