Can Stable Diffusion run on a lower TensorRT version (like 7.2.3) and a lower driver?

Description

We have some servers with different NVIDIA graphics cards such as the M40, T4, and V100, but with an older NVIDIA CUDA driver (e.g. Driver Version: 455.32.00, CUDA Version: 11.1).

I have already tested my code from [TensorRT demo/Diffusion](https://github.com/NVIDIA/TensorRT/tree/release/8.6/demo/Diffusion) on my own 3090 with TensorRT 8.6.1.

I wonder if we can run it on a lower version (of TensorRT and its driver); if not, we will need to upgrade these machines' drivers.

I have tested trtexec on TensorRT 7.2.3; here is part of the log. Are there some ops that are not supported in the lower version?

[11/10/2023-16:30:45] [V] [TRT] ImporterContext.hpp:120: Registering tensor: /Gather_output_0 for ONNX tensor: /Gather_output_0
[11/10/2023-16:30:45] [V] [TRT] ModelImporter.cpp:179: /Gather [Gather] outputs: [/Gather_output_0 -> ()], 
[11/10/2023-16:30:45] [V] [TRT] ModelImporter.cpp:103: Parsing node: /Constant_1 [Constant]
[11/10/2023-16:30:45] [V] [TRT] ModelImporter.cpp:125: /Constant_1 [Constant] inputs: 
[11/10/2023-16:30:45] [V] [TRT] ModelImporter.cpp:179: /Constant_1 [Constant] outputs: [/Constant_1_output_0 -> (1)], 
[11/10/2023-16:30:45] [V] [TRT] ModelImporter.cpp:103: Parsing node: /Unsqueeze [Unsqueeze]
[11/10/2023-16:30:45] [V] [TRT] ModelImporter.cpp:119: Searching for input: /Gather_output_0
[11/10/2023-16:30:45] [V] [TRT] ModelImporter.cpp:119: Searching for input: /Constant_1_output_0
[11/10/2023-16:30:45] [V] [TRT] ModelImporter.cpp:125: /Unsqueeze [Unsqueeze] inputs: [/Gather_output_0 -> ()], [/Constant_1_output_0 -> (1)], 
terminate called after throwing an instance of 'std::out_of_range'
  what():  Attribute not found: axes
Aborted
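For reference, the `Attribute not found: axes` failure on the `Unsqueeze` node is consistent with the ONNX opset 13 change that moved `axes` for `Squeeze`/`Unsqueeze` from a node attribute to an input, which the older ONNX parser shipped with TensorRT 7.2 may not handle. A minimal sketch (the file name `unet.onnx` and the target opset are illustrative assumptions, not taken from the demo) for inspecting the model's opset and attempting to convert it down with the `onnx` package:

```python
import onnx
from onnx import version_converter

# Hypothetical file name; replace with the ONNX file exported by demo/Diffusion.
model = onnx.load("unet.onnx")

# Print the opset(s) the model was exported with.
for opset in model.opset_import:
    print(opset.domain or "ai.onnx", opset.version)

# Attempt to convert down to opset 12, where Unsqueeze still takes `axes`
# as an attribute. This may fail for models that depend on newer operators.
converted = version_converter.convert_version(model, 12)
onnx.checker.check_model(converted)
onnx.save(converted, "unet_opset12.onnx")
```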

Hi @Bing00000 ,
For the unsupported ops, we recommend implementing a custom plugin to support those operations.

However, it is always recommended to use the latest TensorRT version to get better results.
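When the parser fails less abruptly than in the log above, it reports which nodes it could not import; a minimal sketch using the TensorRT Python API to list those errors on the installed version (the model path is an illustrative assumption):

```python
import tensorrt as trt

logger = trt.Logger(trt.Logger.VERBOSE)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, logger)

print("TensorRT version:", trt.__version__)

# Hypothetical path; replace with the ONNX file you are trying to build.
with open("unet.onnx", "rb") as f:
    if not parser.parse(f.read()):
        # Report every node the parser could not import.
        for i in range(parser.num_errors):
            print(parser.get_error(i))
```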

Thanks


Thanks!