/usr/local/bin/nvidia_entrypoint.sh: line 33: exec: trtserver: not found

Hello, I want to use the TensorRT inference server.
First, since my server's OS does not have an NVIDIA driver newer than 410, I pulled the 18.08 image:

docker pull nvcr.io/nvidia/tensorrtserver:18.08-py2

Then I ran:

nvidia-docker run --rm --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v/xxx/model_repository:/models nvcr.io/nvidia/tensorrt:18.08-py2 trtserver --model-store=/models

but it shows this error:

=====================
== NVIDIA TensorRT ==
=====================

NVIDIA Release 18.08 (build 601781)

NVIDIA TensorRT 4.0.1 (c) 2016-2018, NVIDIA CORPORATION.  All rights reserved.
Container image (c) 2018, NVIDIA CORPORATION.  All rights reserved.

https://developer.nvidia.com/tensorrt

To install python sample dependencies, run /opt/tensorrt/python/python_setup.sh

/usr/local/bin/nvidia_entrypoint.sh: line 33: exec: trtserver: not found

I cannot find nvidia_entrypoint.sh under /usr/local/bin either.

How can I fix this?
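In case it helps with debugging, here is roughly what I also checked on my side (a sketch assuming the standard docker CLI; the image tag is the one from my run command above):

# list the NVIDIA images that are actually present locally
docker images | grep nvcr.io/nvidia

# check whether a trtserver binary exists inside the image I started
docker run --rm nvcr.io/nvidia/tensorrt:18.08-py2 sh -c 'command -v trtserver || echo "trtserver is not in this image"'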

Linux distro and version:

LSB Version:	:core-4.1-amd64:core-4.1-noarch
Distributor ID:	CentOS
Description:	CentOS Linux release 7.4.1708 (Core)
Release:	7.4.1708
Codename:	Core

Other environment details:

GPU type: Tesla V100
NVIDIA driver version: 396.44
CUDA version: 9.0
cuDNN version: 7.3.0
Python version: 2.7
TensorRT version: 5.0.2.6
GCC: > 5.3 (lib64)

Sorry, I made a mistake: I pulled the tensorrtserver image but started the tensorrt image.
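For reference, this is presumably the command I should have run instead (same options, only the image name changed to the tensorrtserver image I actually pulled):

nvidia-docker run --rm --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v/xxx/model_repository:/models nvcr.io/nvidia/tensorrtserver:18.08-py2 trtserver --model-store=/models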