Docker container is not built against TensorRT 21.02 (7.2.2)

Looking through the different containers, I’ve noticed that “” is not actually built against TensorRT 21.02 (7.2.2); it’s using TensorRT 7.2.1.

The other containers “Anyscale” and “” are built against 21.02 (7.2.2).


root@4d3b64ad616c:/# ls -al /usr/lib/x86_64-linux-gnu/*
lrwxrwxrwx 1 root root        19 Oct  6  2020 /usr/lib/x86_64-linux-gnu/ ->
lrwxrwxrwx 1 root root        19 Oct  6  2020 /usr/lib/x86_64-linux-gnu/ ->
-rw-r--r-- 1 root root 625274176 Oct  6  2020 /usr/lib/x86_64-linux-gnu/

The -base and -devel images:

root@76d24f0909e3:/# ls -al /usr/lib/x86_64-linux-gnu/*
lrwxrwxrwx 1 root root        19 Dec 11  2020 /usr/lib/x86_64-linux-gnu/ ->
lrwxrwxrwx 1 root root        19 Dec 11  2020 /usr/lib/x86_64-linux-gnu/ ->
-rw-r--r-- 1 root root 625245504 Dec 11  2020 /usr/lib/x86_64-linux-gnu/
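For reference, a quick way to confirm which TensorRT build a container actually ships is to resolve the library symlink and cross-check the package database. This is a sketch; the `libnvinfer.so.7` path and the `nvinfer` package name are assumptions and may differ per image:

```shell
# Resolve the TensorRT runtime symlink to the concrete versioned file
# (path is an assumption based on the listings above).
readlink -f /usr/lib/x86_64-linux-gnu/libnvinfer.so.7

# Cross-check against the package database (package naming is an assumption).
dpkg -l | grep -i nvinfer
```

The resolved filename carries the full major.minor.patch version, which is what the engine plan check compares against.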

This is causing my converted models to fail on the triton image:

ERROR: …/nvdsinfer/nvdsinfer_func_utils.cpp:33 [TRT]: INVALID_CONFIG: The engine plan file is not compatible with this version of TensorRT, expecting library version 7.2.1 got 7.2.2, please rebuild.

Hi @sandberg,
Yes, you are right! We will check internally why this happened.
But since it has already been released, I’m afraid we cannot change it in the short term.

Could you rebuild the engine in the corresponding docker?
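For example, rebuilding from an ONNX model with `trtexec` inside the target container could look like the following. This is a sketch: the image tag and model paths are placeholders, not the actual container name, and the GPU flags assume a recent Docker with the NVIDIA runtime:

```shell
# Placeholder image tag and model paths; substitute the container whose
# TensorRT version matches your deployment target.
docker run --gpus all --rm -v "$(pwd)/models:/models" \
  <deepstream-triton-image> \
  trtexec --onnx=/models/model.onnx --saveEngine=/models/model.plan
```

Because engine plans are tied to the exact TensorRT version that serialized them, the rebuild must happen in the same image (here, the one shipping 7.2.1) that will later deserialize the plan.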


The docker name is a bit confusing. The name doesn’t mean it’s built on TensorRT 21.02 or Triton 21.02; it just means DeepStream 21.02 with Triton.
Actually the docker is built on top of Triton 20.11 and keeps the compute stack unchanged, so the TensorRT version being TRT 7.2.1 is expected.

Thanks for the info. Yes, the name is a bit confusing since it doesn’t align with the dependencies. I guess it’s even more confusing that it doesn’t use the same TensorRT version as the DeepStream base and devel images.