tensorrtserver: Detected NVIDIA GeForce 940MX GPU, which is not supported by this container

When I run:

nvidia-docker run --rm -it --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v /dl-inference-server/examples/models:/models nvcr.io/nvidia/tensorrtserver:18.10-py3 trtserver --model-store=/models

or

nvidia-docker run --rm -it --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v /dl-inference-server/examples/models:/models nvcr.io/nvidia/tensorrtserver:18.09-py3 trtserver --model-store=/models

I get the following errors:
ERROR: Detected NVIDIA GeForce 940MX GPU, which is not supported by this container
ERROR: No supported GPU(s) detected to run this container


But when I run the 18.08 inferenceserver container instead, the error does not appear:

nvidia-docker run --rm -it --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v /dl-inference-server/examples/models:/models nvcr.io/nvidia/inferenceserver:18.08-py3 inference_server --model-store=/models
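
If the 18.08 server does start, one quick way to confirm it is actually serving is to poll its HTTP status endpoint on the port mapped above. A minimal sketch, assuming the server listens on localhost:8000 and that this release exposes the /api/status path (adjust the path if your version differs):

```python
# Minimal sketch: confirm the inference server is up by polling its HTTP
# status endpoint. Assumes the server is reachable on localhost:8000 (the
# port mapped in the command above) and that this release exposes
# /api/status; adjust the URL if your version differs.
import urllib.request

url = "http://localhost:8000/api/status"
try:
    with urllib.request.urlopen(url, timeout=5) as resp:
        print(resp.status, resp.read().decode())
except OSError as err:
    print(f"Server not reachable at {url}: {err}")
```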

Can someone help me or show me the right way?

Host software environment:
Ubuntu 18.04
NVIDIA GeForce 940MX
NVIDIA driver: 410.73
cuda_10.0.130_410.48_linux.run
cudnn-10.0-linux-x64-v7.3.0.29.tgz
Python 3.6
TensorRT-5.0.2.6.Ubuntu-18.04.1.x86_64-gnu.cuda-10.0.cudnn7.3.tar.gz

dl-inference-server is from https://github.com/NVIDIA/dl-inference-server, branch 18.10.

Hello,

The GPU-accelerated deep learning containers are tuned, tested, and certified by NVIDIA to run on NVIDIA TITAN V, TITAN Xp, TITAN X (Pascal), NVIDIA Quadro GV100, GP100, and P6000, and NVIDIA DGX Systems.

Please see https://www.nvidia.com/en-us/gpu-cloud/deep-learning-containers/ for more information.
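
For reference, the 18.09+ containers run a startup check that rejects GPUs they are not certified for, and the GeForce 940MX is a Maxwell-generation notebook GPU (compute capability 5.0) that is not on the list above. To see what your host GPU reports, here is a minimal sketch using PyCUDA (an assumption — any CUDA binding will do; install with pip install pycuda):

```python
# Minimal sketch: report the name and compute capability of each visible
# GPU, so you can compare against the container's supported list. Assumes
# PyCUDA is installed on the host alongside the CUDA toolkit.
import pycuda.driver as cuda

cuda.init()
for i in range(cuda.Device.count()):
    dev = cuda.Device(i)
    major, minor = dev.compute_capability()
    print(f"GPU {i}: {dev.name()} (compute capability {major}.{minor})")
```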

OK, thanks.